==== find ====
 
Sample: find multiple folders by name in one run:
<pre>
find /path -type d \( -name "foldername1" -o -name "foldername2" \)
</pre>
Find and remove on the fly:
<pre>
find /path -name "filename-to-delete" -type f -exec rm -f {} \;
</pre>
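A variant of the same idea, assuming a find with the built-in "-delete" action (GNU find has it): it saves spawning one rm process per match.
<pre>
find /path -name "filename-to-delete" -type f -delete   #same effect, no external rm needed
</pre>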
Find files over 500 MB and list them in a text file:
<pre>
find / -size +500M -type f > /big-files.txt
</pre>
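If the list should show human-readable sizes instead of bare paths, a small sketch using "-exec" with "ls -lh" (the output file is just the example name from above):
<pre>
find / -size +500M -type f -exec ls -lh {} \; > /big-files.txt   #one "ls -lh" line per file, size included
</pre>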
  
Find files ending in ".sh" whose header (the shebang) calls "#!/bin/bash", e.g. to migrate scripts from Linux to Unix:
<pre>
find / -name '*.sh' -print -exec grep '^#!/bin/bash' {} \;
</pre>
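If you only need the file names for a migration list, grep's "-l" switch prints each matching file once; a minimal sketch (the output file name is made up here):
<pre>
find / -name '*.sh' -exec grep -l '^#!/bin/bash' {} \; > /bash-scripts.txt   #list every .sh file containing a line that starts with "#!/bin/bash"
</pre>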
  
 
==== grep ====
 
Sample: by default, Linux and Unix config files are written with comments to help you, but sometimes it is too much!
Comments mostly start with "#".
To filter them out, do:
<pre>
grep -v '^#' config
</pre>
The system now echoes only the active settings; that makes it easy to check for bugs.
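The comment-stripped output often still contains many empty lines; a sketch that drops comments and blank lines in one go (extended regex, hence "-E"):
<pre>
grep -Ev '^(#|$)' config   #drop lines starting with "#" and completely empty lines
</pre>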
  
  
 
==== tar ====
 
To create a tar archive, do:
<pre>
tar -cvf backup.tar /home/
</pre>
To speed up tar in cron jobs, remove the "-v" switch, as in the sketch below.
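For example, a cron-friendly call could look like this; without "-v" tar prints nothing per file, so the job runs quietly and a bit faster:
<pre>
tar -cf backup.tar /home/   #same archive as above, but silent: good for cron jobs
</pre>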
On a single-core server, compress with tar's built-in gzip support:
<pre>
tar -czvf backup.tar.gz /home/
</pre>
  
==== pigz ====
On multi-CPU servers you can compress tar output with pigz; unlike tar's built-in compressor, it uses all cores.<br>
This saves time, energy and space!
Enter:
<pre>
tar -cf - /home | pigz -9 > backup.tgz   #stream /home as a tar archive, then compress it with all available CPUs at level 9 (high)
</pre>
In the htop process viewer you can watch the CPU usage.
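Unpacking works the same way in reverse; a minimal sketch, assuming the "backup.tgz" from above ("-d" decompresses, "-c" writes to stdout):
<pre>
pigz -dc backup.tgz | tar -xf -   #decompress the archive and unpack the tar stream
</pre>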
  
 
==== bash scripts ====
 
Header:

<pre>
#!/bin/bash                        #set interpreter
date=`/bin/date +%Y%m%d-%H%M%S`    #set date-time variable for $date
</pre>
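A minimal sketch of how $date is typically used, here combined with the tar backup from above (the target path is just an example):
<pre>
#!/bin/bash                        #set interpreter
date=`/bin/date +%Y%m%d-%H%M%S`    #set date-time variable for $date
tar -cf /backup/home-$date.tar /home/   #timestamped archive, e.g. /backup/home-20170614-074300.tar
</pre>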
==== rsync ====
Sync paths, drives and mounted network shares.
Sync with human-readable output, which can also be written to a log (see the sketch after the example):
<pre>
rsync -av /source/ /destination
</pre>
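To actually keep that output, redirect it into a file; a sketch assuming a log path of /var/log/rsync.log:
<pre>
rsync -av /source/ /destination >> /var/log/rsync.log 2>&1   #append output and errors to the log
</pre>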
To boost I/O, do NOT use compression ("-z"): it makes the system pause the transfer while it compresses!
In cron jobs remove the "-v"; that boosts rsync too:
<pre>
rsync -a /source/ /destination  #fastest, and preserves file attributes and permissions
</pre>
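The same switches work across the network; a sketch over SSH (user, host and paths are placeholders):
<pre>
rsync -a /source/ user@remotehost:/destination   #same sync, but to a remote machine over SSH
</pre>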
==== wget ====
Network pull tool to copy websites or data:
<pre>
wget -O LocalImageName.jpg "http://domainname.info/image.jpg" --tries=2 --timeout=10 #try the download only twice, and give up after a 10-second timeout
</pre>
To mirror a complete webspace, if allowed by robots.txt:
<pre>
wget -m http://domainname.info/ --tries=2 --timeout=10
</pre>
This command creates a local offline copy of a site, including images and links.
