Commands using xargs (769)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

See system users
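No description accompanies this one, but a minimal sketch of a command that lists system users, assuming the standard /etc/passwd layout, might be:
    cut -d: -f1 /etc/passwd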

autossh + ssh + screen = super rad perma-sessions
Only useful for really flaky connections (though I'm stuck with one for now). If you're in this situation, I've found this to be a good way to run autossh, and it does a pretty good job of detecting when the session is down and restarting. Combined with the -t option and screen, this pops you back into your working session lickety-split with as few headaches as possible. And if autossh is a bit slow at detecting the downed ssh connection, run this in another tab/terminal window to tell autossh that it should drop it and start over (basically for when polling is too slow): kill -SIGUSR1 `pgrep autossh`
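The command itself isn't reproduced here; a sketch of the pattern described, assuming a hypothetical host example.com and autossh's port-based monitoring, might look like:
    autossh -M 20000 -t user@example.com 'screen -DR'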

resolve short urls
Since most URL shorteners respond with a header containing Location: followed by the target URL, this works with most common shorteners.
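The command is missing above; a sketch of the Location:-header approach with curl (the short URL is a placeholder):
    curl -sI http://bit.ly/example | grep -i '^location:'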

Find usb device in realtime
Using this command you can track the moment a USB device is attached.
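The command isn't shown here; one possible sketch, assuming a Linux system with udev, is:
    udevadm monitor --udev --subsystem-match=usb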

Convert CSV to JSON
Replace 'csv_file.csv' with your filename.
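The command itself is omitted; a one-liner sketch using Python's standard csv and json modules, with the filename from the description:
    python3 -c 'import csv,json; print(json.dumps(list(csv.DictReader(open("csv_file.csv")))))'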

Calculate days on which Friday the 13th occurs (inspired by the work of the user justsomeguy)
Friday is the 5th day of the week and Monday is the 1st. Output may be affected by locale.
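The command isn't reproduced here; a sketch using GNU date's %u format (Monday=1, Friday=5), with 2024 as an arbitrary example year:
    for m in $(seq -w 1 12); do d="2024-$m-13"; [ "$(date -d "$d" +%u)" -eq 5 ] && date -d "$d" '+%A, %F'; done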

bash/ksh function: given a file, cd to the directory it lives
fcd : file change directory. A bash function that takes a fully qualified file path and cd's into the directory where it lives. Useful on the command line when you have a file name in a variable and you'd like to cd to its directory to RCS check it in or look at other files associated with it. Will run on any ksh or bash, likely sh, maybe zsh.
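The function body isn't shown above; a minimal version matching the description might be:
    fcd() { cd "$(dirname "$1")"; }
e.g. fcd /etc/passwd leaves you in /etc.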

Synchronise a file from a remote server
You will be prompted for a password unless you have your public keys set up.
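The command is omitted above; a typical sketch with rsync (host and paths are placeholders):
    rsync -avz user@example.com:/remote/path/file.txt /local/path/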

Find files that are older than x days
Find files that are older than x days in the working directory and list them. This recurses into all the sub-directories of the working directory. By changing the value for -mtime you can adjust the time, and by replacing the ls command with, say, rm, you can remove those files if you wish to.
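A sketch of the described command, using 30 days as the example value; swap ls -l for rm to delete instead:
    find . -type f -mtime +30 -exec ls -l {} +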

defragment files
Thanks to flatcap for optimizing this command. It takes advantage of the ext4 filesystem's resistance to fragmentation: files that were previously fragmented are copied, deleted, and pasted back, essentially giving the filesystem another chance at saving the file contiguously (unlike FAT/NTFS, *nix filesystems always try to save a file without fragmenting it). My command only affects the home directory, and only those files with your R/W (read/write) permissions.

There are two issues with this command:
1. It really won't help much. It works, but Linux doesn't suffer much (if any) fragmentation, and even fragmented files have fast I/O.
2. It doesn't discriminate between fragmented and non-fragmented files, so a large ~/ directory with no fragments will take almost as long as an equally sized fragmented ~/ directory.

The benefits I managed to work into the command:
1. It only defragments files under 16 MB, because a large file with fragments isn't as noticeable as a small file that's fragmented, and copy/delete/paste of large files would take too long.
2. It gives a nice countdown in the terminal so you know how much progress is being made, and just like other defragmenters you can stop at any time (use ctrl+c).
3. Fast! I can defrag my ~/ directory in 11 seconds thanks to the ramdrive powering the command's temporary storage.

Bottom line:
1. It's only an experiment. It's safe (I've used it several times for testing), but probably not very effective unless you somehow have a fragmentation problem on Linux. Might be a placebo for recent Windows converts who are looking for a defrag utility on Linux and won't accept no for an answer.
2. It's my first commandlinefu command.
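The actual command isn't included above; a rough, experimental sketch of the copy/delete/paste idea it describes, assuming /dev/shm as the ramdrive and leaving out the countdown:
    find ~ -type f -writable -size -16M -print0 |
      while IFS= read -r -d '' f; do
        cp -p "$f" /dev/shm/defrag.tmp && rm -- "$f" && cp -p /dev/shm/defrag.tmp "$f"
      done; rm -f /dev/shm/defrag.tmp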


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as feeds for virtually every other subset (users, tags, functions, …).
