xargs deals badly with special characters in file names (such as space, ' and "), for example a file called '12" record'. GNU Parallel (https://savannah.nongnu.org/projects/parallel/) does not have this problem. Both solutions break down when the list of files exceeds the shell's maximum command-line length.
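A minimal sketch of the safe alternative: NUL-delimited file names survive spaces and quotes, where plain xargs would mis-parse them. The '12" record' file name comes from the comment above; the rest is illustration.

```shell
# Create a file whose name contains a space and a double quote
touch '12" record'

# Plain `ls | xargs rm` would split and mis-quote this name.
# A NUL-delimited pipeline handles any filename safely:
find . -maxdepth 1 -type f -name '*record*' -print0 | xargs -0 rm --

# GNU Parallel quotes file names correctly on its own:
# ls | parallel rm -- {}
```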
Compresses each file individually, creating a $filename.tar.gz, and removes the uncompressed version. Useful if you have lots of files and don't want one huge archive containing them all. You could replace ls with ls *.pdf to perform the action on PDFs only, for example.
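A sketch of the idea, written as a glob loop rather than parsing ls output (parsing ls breaks on odd file names; the loop form is an assumption, not the original command):

```shell
# Gzip each regular file into its own $f.tar.gz, then remove the
# original (tar keeps the source file by default, hence the rm)
for f in *; do
    [ -f "$f" ] || continue          # skip directories
    tar -czf "$f.tar.gz" "$f" && rm -- "$f"
done
# Restrict to PDFs instead:  for f in *.pdf; do ... done
```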
tar command options:
-z : Uncompress the resulting archive with gzip.
-x : Extract to disk from the archive.
-v : Produce verbose output, i.e. show progress and file names while extracting.
-f backup.tgz : Read the archive from the specified file called backup.tgz.
-C /tmp/data : Unpack/extract files into /tmp/data instead of the default current directory.
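The options above combine into a single invocation; the sample archive is built first so the sketch is runnable (backup.tgz and its contents are assumptions here):

```shell
# Build a small sample archive so the example can run
mkdir -p src && echo hi > src/file.txt
tar -czf backup.tgz -C src file.txt

# Decompress (-z), extract (-x), verbosely (-v),
# from backup.tgz (-f), into /tmp/data (-C)
mkdir -p /tmp/data
tar -zxvf backup.tgz -C /tmp/data
```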
Simple command to tar a path.
Very simple and useful; you need to replace the word "directory" with your directory.
This deals nicely with files that have special characters in the file name (space, ' or "). Parallel is from https://savannah.nongnu.org/projects/parallel/
If I need to get only script.sh from "folder/script.sh".
This is how I've done it in the past
Doesn't create a file. Make sure to list the files/directories in the same order every time.
Removes annoying, improperly packaged files that untar into the wrong directory. For example, when you untar an archive and it extracts hundreds of files into the current directory... bleh.
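A common way to contain such a "tar bomb" before it makes a mess: extract into a directory of its own. The archive and member names below are assumptions for illustration.

```shell
# Build a sample "tar bomb": members sit at the archive's top level
touch a.conf b.conf
tar -cf messy.tar a.conf b.conf
rm a.conf b.conf

# Give the archive its own directory before extracting
mkdir messy && tar -xf messy.tar -C messy

# GNU tar (>= 1.28) can do the same in one step:
# tar -xf messy.tar --one-top-level=messy
```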
Simplicity tends to win out on commandlinefu.com. Also, why type multiple filenames when range operators work too? It saves finger abuse and time, and reduces the chance of mistakes.
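One reading of "range operators" is bash brace expansion, which generates the file list before tar ever runs. The file names here are assumptions for illustration.

```shell
# Brace expansion turns chapter{1..9}.txt into
# chapter1.txt chapter2.txt ... chapter9.txt (bash)
touch chapter{1..9}.txt
tar -cf chapters.tar chapter{1..9}.txt
```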
Just copy and paste the code into your terminal. Note: sudo apt-get is for Debian-based versions; change as per your requirements. Source: www.h3manth.com
Where foodir is the directory you want to zip up.
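The command itself isn't shown above; a likely form, assuming a gzip-compressed tarball named after the directory (both names are placeholders):

```shell
# Sample directory so the sketch is runnable
mkdir -p foodir && echo data > foodir/notes.txt

# Archive the whole directory into foodir.tar.gz
tar -czf foodir.tar.gz foodir/
```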
I find the output of ls -lR unsatisfying (why is the path data up there?) and find's syntax awkward. Running 'du -a' means you will likely have to trim off the file-size data before feeding the filenames to the next step in the pipe.
The f is for file and - is stdout; this way it's a little shorter. I like the copy-directory function. It does the job but looks like SH**; it doesn't understand folders with whitespace and can only handle full paths, but is otherwise fine:

copy-directory () {
    FrDir="$(echo $1 | sed 's:/: :g' | awk '/ / {print $NF}')"
    SiZe="$(du -sb $1 | awk '{print $1}')"
    ( cd $1; cd ..; tar c "$FrDir"/ ) | pv -s "$SiZe" | ( cd $2; tar x )
}
commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for: