Commands using xargs (769)

  • This dup finder saves time by comparing file sizes first and running md5sum only on the size collisions. It doesn't delete anything; it just lists the duplicates (an annotated sketch follows the entry).


    82
    find -not -empty -type f -printf "%s\n" | sort -rn | uniq -d | xargs -I{} -n1 find -type f -size {}c -print0 | xargs -0 md5sum | sort | uniq -w32 --all-repeated=separate
    grokskookum · 2009-09-21 00:24:14 57
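
    A minimal annotated sketch of the same pipeline, assuming GNU findutils and coreutils (needed for -printf, -size {}c and uniq -w); like the original it only prints the duplicate groups and changes nothing:

      find . -not -empty -type f -printf "%s\n" |     # size of every non-empty file
        sort -rn | uniq -d |                          # keep sizes that occur more than once
        xargs -I{} find . -type f -size {}c -print0 | # re-find the files having those sizes
        xargs -0 md5sum |                             # hash only those candidates
        sort | uniq -w32 --all-repeated=separate      # group identical hashes together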

  • 41
    tar -tf <file.tar.gz> | xargs rm -r
    prayer · 2009-07-06 22:23:11 19

  • 36
    ssh root@remote.host "rpm -qa" | xargs yum -y install
    BoxingOctopus · 2012-01-06 17:10:56 14
  • This one-liner will *delete*, without any further confirmation, all 100% duplicates but one, based on their md5 hash, in the current directory tree (i.e. including files in its subdirectories). Good for cleaning up collections of mp3 files or pictures of your dog|cat|kids|wife present in a gazillion incarnations on the hard disk. md5sum can be substituted with sha1sum without problems. The actual filename is not taken into account; only the hash is used. Whatever sort considers the first filename is kept. It is assumed that filenames do not contain 0x00. As per the good suggestion in the first comment, this variant creates hard links instead of deleting: find . -xdev -type f -print0 | xargs -0 md5sum | sort | perl -ne 'chomp; $ph=$h; ($h,$f)=split(/\s+/,$_,2); if ($h ne $ph) { $k = $f; } else { unlink($f); link($k, $f); }' (an annotated breakdown of the deleting version follows the entry)


    19
    find . -type f -print0|xargs -0 md5sum|sort|perl -ne 'chomp;$ph=$h;($h,$f)=split(/\s+/,$_,2);print "$f"."\x00" if ($h eq $ph)'|xargs -0 rm -v --
    masterofdisaster · 2009-06-07 03:14:06 15
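
    The same deleting logic spread across lines with comments (a sketch; it assumes filenames contain neither newlines nor 0x00):

      find . -type f -print0 | xargs -0 md5sum | sort |
        perl -ne 'chomp;
                  $ph = $h;                        # remember the previous hash
                  ($h, $f) = split(/\s+/, $_, 2);  # current hash and filename
                  print "$f\x00" if $h eq $ph;' |  # repeated hash: print the name, NUL-terminated
        xargs -0 rm -v --                          # so every duplicate after the first is removed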

  • 16
    find . -type d -name '.svn' -print0 | xargs -0 rm -rdf
    blue64 · 2009-02-05 17:47:03 29
  • I'm working on a group project at the moment and am annoyed at the lack of output from my teammates. Wanting hard metrics of how awesome I am and how awesome they aren't, I wrote this command. It prints a full repository listing of all files, removes the directories (which confuse blame), runs svn blame on each individual file, and tallies the resulting line counts per author. It can be quite slow, depending on your repository location, because blame must hit the server for each individual file. You can remove the -R in the first part to print the tallies for just the current directory (an annotated version follows the entry).


    16
    svn ls -R | egrep -v -e "\/$" | xargs svn blame | awk '{print $2}' | sort | uniq -c | sort -r
    askedrelic · 2009-07-29 02:10:45 15
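
    The same pipeline with each stage commented (a sketch; note it uses sort -rn at the end so the per-author counts sort numerically rather than lexically):

      svn ls -R |             # every file and directory in the repository, recursively
        egrep -v -e "\/$" |   # drop directories (their listed names end in "/")
        xargs svn blame |     # blame each file; the second column is the author
        awk '{print $2}' |    # keep only the author names
        sort | uniq -c |      # count lines per author
        sort -rn              # most prolific authors first
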
  • This is a very simple and lightweight way to play DI.FM stations. It parses the HTML returned from http://www.di.fm and displays all the radio stations in a nice graphical menu; once a station is chosen, its URL is passed to mplayer and the music starts. Dependencies: X11 with a GTK environment; zenity, a simple app for displaying GTK menus (sudo apt-get install zenity on Ubuntu); mplayer, a simple audio player (sudo apt-get install mplayer on Ubuntu). For a more complete version with proper strings in the menu (it couldn't fit in the command field above), try: zenity --list --width 500 --height 500 --title 'DI.FM' --text 'Pick a Radio' --column 'radio' --column 'url' --print-column 2 $(curl -s http://www.di.fm/ | awk -F '"' '/href="http:.*\.pls.*96k/ {print $2}' | sort | awk -F '/|\.' '{print $(NF-1) " " $0}') | xargs mplayer


    16
    zenity --list --width 500 --height 500 --column 'radio' --column 'url' --print-column 2 $(curl -s http://www.di.fm/ | awk -F '"' '/href="http:.*\.pls.*96k/ {print $2}' | sort | awk -F '/|\.' '{print $(NF-1) " " $0}') | xargs mplayer
    polaco · 2010-04-28 23:45:35 14
  • URL-decodes its input, e.g. echo "http%3A%2F%2Fwww.google.com" | sed -e's/%\([0-9A-F][0-9A-F]\)/\\\\\x\1/g' | xargs echo -e prints http://www.google.com. Works under bash on Linux; on other systems, replace the '-e' option with whatever your echo needs to interpret escape characters correctly (a bash-only alternative follows the entry).


    14
    sed -e's/%\([0-9A-F][0-9A-F]\)/\\\\\x\1/g' | xargs echo -e
    mohan43u · 2009-05-25 05:37:44 39
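
    A bash-only alternative sketch that needs neither sed nor xargs: replace every % with \x and let the printf builtin expand the escapes (the url variable here is just an example input):

      url='http%3A%2F%2Fwww.google.com'
      printf '%b\n' "${url//%/\\x}"    # prints http://www.google.com
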
  • Checks which files are not under version control, fetches their names and runs them through "svn add". WARNING: doesn't work with whitespace in filenames (a more tolerant sketch follows the entry).


    13
    svn status |grep '\?' |awk '{print $2}'| xargs svn add
    xsawyerx · 2009-01-29 10:33:22 81
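
    A sketch that tolerates spaces (though not newlines) in filenames, assuming GNU xargs; older svn clients start the filename at column 8 rather than 9, so adjust the cut offset to match your version:

      svn status | grep '^?' | cut -c9- | tr '\n' '\000' | xargs -0 svn add
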
  • Using xargs is better than find /path/to/dir -type f -exec rm -f {} \; because the -exec \; form spawns a separate rm process for every single file. xargs splits the streamed file list into more manageable subsets, so far fewer processes are required (a find-only alternative follows the entry).


    12
    find /path/to/dir -type f -print0 | xargs -0 rm
    root · 2009-01-26 11:30:47 55
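
    Modern find can batch the arguments itself: terminating -exec with + instead of \; passes as many filenames as fit on one command line, much like xargs, with no pipe needed:

      find /path/to/dir -type f -exec rm -f {} +
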
  • Create a tgz archive of all the files containing local changes relative to a Subversion repository. Useful if you are not able to commit yet but want to create a quick backup of your work; of course, if you find yourself needing this it's probably a sign you should be using a branch, patches or distributed version control (git, mercurial, etc.). To include only files that are under version control, add the '-q' option: svn st -q | cut -c 8- | sed 's/^/\"/;s/$/\"/' | xargs tar -czvf ../backup.tgz


    12
    svn st | cut -c 8- | sed 's/^/\"/;s/$/\"/' | xargs tar -czvf ../backup.tgz
    chrisdrew · 2009-02-09 11:24:31 14
  • This helped me find a botnet that had made it into my system. Of course, this is not a foolproof or guaranteed way to find all of them, or even most of them, but it helped me find this one (an annotated variant follows the entry).


    12
    cat /var/lib/dpkg/info/*.list > /tmp/listin ; ls /proc/*/exe |xargs -l readlink | grep -xvFf /tmp/listin; rm /tmp/listin
    kamathln · 2009-09-09 18:09:14 14
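
    The same idea with a private temporary file and comments (Debian/Ubuntu paths assumed; reading other users' /proc/*/exe links generally requires root):

      list=$(mktemp)
      cat /var/lib/dpkg/info/*.list > "$list"      # every path shipped by an installed package
      ls /proc/*/exe | xargs -l readlink 2>/dev/null |
        grep -xvFf "$list"                         # running executables not owned by any package
      rm -f "$list"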

  • 12
    find /proc -user myuser -maxdepth 1 -type d -mtime +7 -exec basename {} \; | xargs kill -9
    sharfah · 2009-10-05 14:49:51 9
  • no loop, only one call of grep, scrollable ("less is more", more or less...)


    12
    ls /usr/bin | xargs whatis | grep -v nothing | less
    michelsberg · 2010-01-26 12:59:47 31
  • xargs can be used in this manner to download multiple files at a time: here xargs runs 10 wget processes at once and starts a new one whenever the number running falls below 10 (an annotated version follows the entry).


    11
    wget -nv http://en.wikipedia.org/wiki/Linux -O- | egrep -o "http://[^[:space:]]*.jpg" | xargs -P 10 -r -n 1 wget -nv
    grokskookum · 2009-08-31 18:37:33 52
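
    The same command split across lines with the relevant xargs options commented (GNU wget and xargs assumed):

      wget -nv http://en.wikipedia.org/wiki/Linux -O- |   # fetch the page to stdout
        egrep -o "http://[^[:space:]]*.jpg" |             # pull out the .jpg URLs
        xargs -P 10 -r -n 1 wget -nv                      # -P 10: up to ten downloads in parallel
                                                          # -n 1:  one URL per wget invocation
                                                          # -r:    do nothing if no URLs were found
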
  • Search for files and list the 20 largest. find . -type f gives us a list of files, recursively, starting from here (.). -print0 | xargs -0 du -h separates the names with NUL characters, so we're not confused by spaces, and has xargs run du to get each size in human-readable form (64M, not 64123456). | sort -hr uses sort to arrange the list in size order; sort -h knows that 1M is bigger than 9K. | head -20 finally selects only the top twenty.


    11
    find . -type f -print0 | xargs -0 du -h | sort -hr | head -20
    flatcap · 2012-03-30 10:21:12 9
  • This command will find the biggest files recursively under a certain directory, no matter how many there are. If you try the usual commands ("find -type f -exec ls -laSr {} +" or "find -type f -print0 | xargs -0 ls -laSr") the sorting won't be correct, because the command-line argument limit splits the file list across several ls invocations and each one sorts only its own batch. This command doesn't rely on command-line arguments for the sorting, so the list comes out correctly ordered.


    10
    find . -type f -printf '%20s %p\n' | sort -n | cut -b22- | tr '\n' '\000' | xargs -0 ls -laSr
    fsilveira · 2009-08-13 13:13:33 14
  • Change the *.avi to whatever you want to match; you can remove it altogether if you want to check all files.


    10
    find -type f -name "*.avi" -print0 | xargs -0 mplayer -vo dummy -ao dummy -identify 2>/dev/null | perl -nle '/ID_LENGTH=([0-9\.]+)/ && ($t +=$1) && printf "%02d:%02d:%02d\n",$t/3600,$t/60%60,$t%60' | tail -n 1
    grokskookum · 2009-09-24 15:50:39 31
  • I needed a way to search all files in a web directory that contained a certain string, and replace that string with another string. In the example, I am searching for "askapache" and replacing it with "htaccess". I wanted this to happen as a cron job, and it was important that it ran as fast as possible while not hogging the CPU, since the machine is a server. So this script uses the nice command to run the shell at priority 19, meaning it won't hog CPU time. The -P5 option to xargs runs 5 separate grep and sed processes simultaneously, which is much faster than a single grep or sed; you can use -P0 (unlimited) if you aren't worried about spawning too many processes or don't have to deal with process killers in the background. Also, the -m1 option to grep stops scanning a file after its first match, which saves time as well (a commented sketch follows the entry).


    10
    sh -c 'S=askapache R=htaccess; find . -mount -type f|xargs -P5 -iFF grep -l -m1 "$S" FF|xargs -P5 -iFF sed -i -e "s%${S}%${R}%g" FF'
    AskApache · 2009-10-02 05:03:10 7
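
    A commented sketch of the same approach, with the nice prefix that the description mentions and the standard -I spelling of xargs' replace option (S and R are the search and replacement strings; GNU xargs and sed assumed):

      nice -n19 sh -c 'S=askapache R=htaccess
        find . -mount -type f |
          xargs -P5 -I{} grep -l -m1 "$S" {} |            # files containing $S, 5 greps at a time
          xargs -P5 -I{} sed -i -e "s%${S}%${R}%g" {}'    # rewrite them in place, 5 seds at a time
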
  • This command is useful when you want to know which process is responsible for a certain GUI application and what command you need to issue to launch it in a terminal.


    9
    xprop | awk '/PID/ {print $3}' | xargs ps h -o pid,cmd
    jackhab · 2009-02-16 07:55:19 83
  • A very useful pipeline for finding out when your file system was created.


    9
    df / | awk '{print $1}' | grep dev | xargs tune2fs -l | grep create
    Kaio · 2009-02-16 18:45:03 832
  • Run this in the directory you store your music in. mp3gain and vorbisgain applies the ReplayGain normalization routine to mp3 and ogg files (respectively) in a reversible way. ReplayGain uses psychoacoustic analysis to make all files sound about the same loudness, so you don't get knocked out of your chair by loud songs after cranking up the volume on quieter ones.


    9
    find . -iname \*.mp3 -print0 | xargs -0 mp3gain -krd 6 && vorbisgain -rfs .
    Viaken · 2009-03-09 18:11:35 11
  • 1. Find files greater than 10 MB; 2. pipe them to xargs; 3. xargs passes them as arguments to ls.


    9
    find ./ -size +10M -type f -print0 | xargs -0 ls -Ssh1 --color
    eastwind · 2009-08-25 18:40:47 8
  • Adapted using your useful comments!


    9
    grep -e `date +%Y-%m-%d` /var/log/dpkg.log | awk '/install / {print $4}' | uniq | xargs apt-get -y remove
    skygreg · 2010-01-12 09:42:22 11
  • Maybe simpler, but again, it doesn't cope with spaces in filenames (a slightly more tolerant sketch follows the entry).


    9
    xargs -n 2 mv < file_with_colums_of_names
    Juluan · 2010-12-27 18:06:15 6
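
    A sketch that assumes one source/destination pair per line and copes with spaces in the destination name (the source must still be space-free); file_with_colums_of_names is the original's placeholder filename:

      while read -r src dst; do
        mv -- "$src" "$dst"      # quoting keeps spaces in the destination intact
      done < file_with_colums_of_names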