Bash
Grep
Find multiple items
grep -e "files to consider" -e "^wrote:" -e "^sent" /var/log/rsync.log
Find which files contain the item
grep -e "mysql" -s /var/www/html/iserver/python/*
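For searching a whole tree, grep can recurse by itself; a minimal sketch using a throwaway directory (the pattern and file names are examples):

```shell
# Build a small demo tree, then search it recursively.
dir=$(mktemp -d)
echo "connect to mysql" > "$dir/config.php"
echo "static page"      > "$dir/index.html"

# -r recurse, -I skip binary files, -l print matching file names only
grep -rIl "mysql" "$dir"
```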
Find
Find large files
find / -size +20000 -exec ls -hl {} \; | awk '{ print $9, $5}'
Find and Delete tmp files
find /home -type f -name '*.tmp' -exec rm {} \;
find /var/spool/mqueue.in/ -size 942c -exec rm {} \;
find ./ -type f -mtime 6 -exec rm {} \;
Find a file with a specific date / time
find /home -type f -name '*.tmp'
Find files with a modified date
Files whose modification date is within the last 5 days:
find ./ -type f -mtime -5 -exec ls -la {} \;
find . -name index.php -type f
Find symbolic links
ls -lR /path/to/folder | grep ^l
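find can also report symlinks directly, which avoids parsing ls output; a small self-contained demo:

```shell
# Create a real file and a symlink to it, then find only the links.
dir=$(mktemp -d)
touch "$dir/real.txt"
ln -s real.txt "$dir/link.txt"

find "$dir" -type l    # prints only the symlink
```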
Dealing with processes
(Find what is running on a particular port)
lsof -i :993
This showed that rpc.rquotad (part of nfs) was using the port.
netstat -tnlp
then kill the process by its PID:
kill -9 ####
nfs uses random ports unless you tell it to use fixed ports. Indeed you can configure your own port numbers in your /etc/sysconfig/nfs (your nfs config file)
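On RHEL-style systems the fixed ports are set as variables in /etc/sysconfig/nfs; a sketch with example port numbers (the exact variable names depend on your distribution and nfs-utils version):

```shell
# /etc/sysconfig/nfs -- example fixed-port assignments
RQUOTAD_PORT=875
LOCKD_TCPPORT=32803
LOCKD_UDPPORT=32769
MOUNTD_PORT=892
STATD_PORT=662
```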
tcpdump
tcpdump -pnvi eth1 host 202.174.163.68
tcpdump -nnvvXSpni eth0 host 202.174.163.68 and port 25
Rsync
rsync -avutP 192.168.20.3::iserver /var/www/html/iserver/
rsync -avutP /var/www/html /backup/html
rsync -avutP --delete /var/www/html /backup/html
Nmap
yum install nmap nmap-frontend.noarch
The GUI, Zenmap, runs under 'System Tools'.
nmap -T4 -A -v 192.168.7.*
nmap -sP 192.168.20.0/24
Tar & other compressions
TO Create
tar czf "filename".tar.gz "/directory/to/compress"
TO Extract
tar -xvf file.tar
tar zxvf file.tar.gz
tar xzf "filename".tar.gz -C "/file/extraction/location"
Quick reference (including append)
tar tf archive.tar : List archive contents
tar tzf archive.tar.Z : List compressed archive contents
tar xvvf archive.tar : Extract files from archive with really verbose output
tar xzpvf archive.tar.gz : Extract files from compressed archive and retain permissions
tar xzpvf archive.tar.gz "filename" : Extract filename
tar xzpvf archive.tar.gz "directory" : Extract directory and files
tar cf /dev/fd0 directory : Create archive on device
tar cf archive.tar directory : Create archive on file
tar czf archive.tar.gz directory : Create compressed archive
tar cf - directory | ( cd /newdir; tar xvpf - ) : Move a directory
tar cf archive.tar `find /directory -print` : Create an archive from a filelist
tar rPf archive.tar "filename" : Append a file to an archive
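The tar-pipe "move a directory" entry above is easiest to see with a throwaway tree:

```shell
# Copy a directory tree between two locations with the tar pipe.
src=$(mktemp -d); dst=$(mktemp -d)
mkdir -p "$src/site/css"
echo "body {}" > "$src/site/css/main.css"

( cd "$src" && tar cf - site ) | ( cd "$dst" && tar xvpf - )
```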
http://www.ozetechnology.com/howtos/compress.shtml
http://www.smcoe.k12.ca.us/ssfusd/as/linux/zipfiles.html
Dealing with Directories
find . -maxdepth 1 -type d | while read f; do echo "$f"; done
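A variant that handles spaces (though not newlines) in directory names; a sketch on a throwaway tree:

```shell
# List immediate subdirectories, reading whole lines so spaces survive.
dir=$(mktemp -d)
mkdir "$dir/sub one" "$dir/sub two"

find "$dir" -mindepth 1 -maxdepth 1 -type d |
while IFS= read -r d; do
    echo "$d"
done
```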
convert
Create a thumbnail of a PDF. First get rid of names with spaces and names containing '
for f in *\ *; do mv "$f" "${f// /_}"; done
for f in *\'*; do mv "$f" "${f//\'/_}"; done
Now reduce everything to lower case
ls | while read upName; do loName=`echo "${upName}" | tr '[:upper:]' '[:lower:]'`; mv "$upName" "$loName"; done
Now create the thumbnails:
for f in *.pdf; do convert -thumbnail x200 -background white -alpha remove "$f"[0] "${f%.pdf}.png"; done
Create a thumbnail of a JPG.
convert rose.jpg -resize 10% rose10.png
for f in *.jpg; do convert -thumbnail x200 -background white -alpha remove "$f"[0] "${f%.jpg}.png"; done
for f in *.JPG; do convert -thumbnail x100 -background white -alpha remove "$f"[0] "${f%.JPG}.png"; done
Convert all files to a preset width
for f in *.jpg; do convert -resize 1000 "$f"[0] "${f%.jpg}-1000.jpg"; done
rename extensions only to lower case
find . -name '*.*' -exec sh -c '
a=$(echo "$0" | sed -r "s/([^.]*)\$/\L\1/");
[ "$a" != "$0" ] && mv "$0" "$a" ' {} \;
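A POSIX-shell variant that lower-cases only the extension with parameter expansion and tr; a sketch on a throwaway directory:

```shell
# Demo: lower-case only the extension of each file.
dir=$(mktemp -d)
touch "$dir/photo.JPG" "$dir/Readme.TXT"

for f in "$dir"/*.*; do
    base=${f%.*}                                       # path without extension
    ext=$(printf '%s' "${f##*.}" | tr '[:upper:]' '[:lower:]')
    if [ "$f" != "$base.$ext" ]; then
        mv "$f" "$base.$ext"
    fi
done
ls "$dir"
```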
wget
wget -r --no-parent http://mysite.com/configs/.vim/
tr
Find and replace every space in a file name with _.
for file in *\ *; do mv "$file" "$(echo "$file" | tr ' ' '_')"; done
chmod
http://en.wikipedia.org/wiki/Chmod Making sudo usable.
ls -ltr /usr/bin/sudo
---x--x--x 2 root root 164360 Jan 7 2007 /usr/bin/sudo
You cannot run sudo because the setuid bit is missing. Fix it:
chmod u+s /usr/bin/sudo
ls -ltr /usr/bin/sudo
---s--x--x 2 root root 164360 Jan 7 2007 /usr/bin/sudo
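The effect of u+s can be demonstrated on any ordinary file:

```shell
# Demo: set the setuid bit and watch the mode string change.
f=$(mktemp)
chmod 0711 "$f"
ls -l "$f" | cut -c1-10     # -rwx--x--x
chmod u+s "$f"
ls -l "$f" | cut -c1-10     # -rws--x--x  (x -> s in the owner slot)
```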
dd
from: http://www.debianhelp.co.uk/ddcommand.htm full hard disk copy
dd if=/dev/hdx of=/dev/hdy
dd if=/dev/hdx of=/path/to/image
dd if=/dev/hdx | gzip > /path/to/image.gz
hdx could be hda, hdb, etc. In the third example gzip is used to compress the image if it is really just a backup.
Restore a backup of a hard disk copy
dd if=/path/to/image of=/dev/hdx
gzip -dc /path/to/image.gz | dd of=/dev/hdx
MBR backup
In order to backup only the first few bytes containing the MBR and the partition table you can use dd as well.
dd if=/dev/hdx of=/path/to/image count=1 bs=512
MBR restore
dd if=/path/to/image of=/dev/hdx
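The count/bs arithmetic is easy to try on an ordinary file instead of a disk:

```shell
# Demo: copy only the first 512-byte "sector" of a file, MBR-style.
src=$(mktemp); img=$(mktemp)
dd if=/dev/zero of="$src" bs=512 count=4 2>/dev/null   # a 2048-byte "disk"
dd if="$src"    of="$img" bs=512 count=1 2>/dev/null   # first sector only
wc -c < "$img"    # 512
```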
Add "count=1 bs=446" to exclude the partition table from being written to disk. You can then restore the table manually.
wget
wget is a utility that pulls files from an FTP server, either from a public area or from a private area using a username and password.
Syntax
wget -c ftp://www.yoursite.co.nz/file/to/get
-c will continue from where the last session broke off
wget --passive-ftp ftp://202.202.202.202/file/to/get
--passive-ftp because of firewalls and other issues some sites need to have passive turned on to be able to get the files needed.
wget ftp://user:passwd@www.yoursite.co.nz/file/to/get/*
wget accepts wild cards.
You can also do this to get a file from a website that requires a username and password:
cd /files/My\ Documents\Vendors/price\ lists/
rm -f pricelist.hlp
wget http://user:passwd@www.vendor.co.nz/computer/dealerfiles/pricelist.hlp
This script runs automatically each morning to fetch a vendor's price list from their private website, which requires a username and password.
Email attachment from command line
mutt -s "Test mail" -a /tmp/file.tar.gz -- vivek@nixcraft.co.in < /tmp/mailmessage.txt
Linux ftp
This is a script that can be run from a cron to automatically log on to a remote site and upload or download files.
ftp -n yoursite.co.nz << end
user username password
cd scripts
binary
get file.tar.gz
bye
end
sed
- A list of files
- *.html
- need to replace contact.html with http://ziggys.nz/contact-us/email-us.html
- Because the replacement string has the delimiter / in it, the normal sed command would fail.
Therefore:
find . -type f -print0 | xargs -0 sed -i 's|contact.html|http://ziggys.nz/contact-us/email-us.html|g'
find *.html -type f -print0 | xargs -0 sed -i 's|ziggy@xtra.co.nz|simon@ziggys.co.nz|g'
find *.html -type f -print0 | xargs -0 sed -i 's|mailto:simon@ziggys.co.nz|http://ziggys.nz/contact-us/email-us.html|g'
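It can be safer to preview the substitution before editing in place; a self-contained demo of the |-delimited form:

```shell
# Demo: | as the sed delimiter, previewed with -n/p before the in-place run.
f=$(mktemp)
echo '<a href="contact.html">mail</a>' > "$f"

# preview: print only the lines the substitution would change
sed -n 's|contact.html|http://ziggys.nz/contact-us/email-us.html|gp' "$f"

# then apply it in place
sed -i 's|contact.html|http://ziggys.nz/contact-us/email-us.html|g' "$f"
cat "$f"
```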
mysql
Find the number of records in every table of a database.
SELECT table_name, table_rows FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_SCHEMA = 'konatech_wiki';
backup: to one file
mysqldump -u root -p[root_password] [database_name] > dumpfilename.sql
restore: from one file
mysql -u root -p[root_password] [database_name] < dumpfilename.sql
backup to one file per table (must be to the tmp dir because of permissions)
mkdir /tmp/mysql
chown -R mysql.mysql /tmp/mysql
mysqldump --user=dbuser --password --tab=/tmp/mysql dbname