All commands (14,187)

  • Strips trailing whitespace (spaces and tabs) from every line of file.txt, editing it in place. Useful when you are programming, for example.


    -1
    sed -i 's/[ \t]\+$//g' file.txt
    elder · 2011-09-07 01:47:44 9
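
    For instance, to verify the result (a sketch assuming GNU sed for the \+ operator; cat -A marks each line end with $):

    printf 'int main()   \t\n' > file.txt
    sed -i 's/[ \t]\+$//g' file.txt
    cat -A file.txt    # prints "int main()$": the trailing blanks are gone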

  • -1
    net user USERNAME /domain
    shawn_abdushakur · 2014-01-02 20:22:46 7
  • Just an alternative using a saved HTML file of all of my bookmarks. Works well, although it takes a while.


    -1
    wget -r --wait=5 --quota=5000m --tries=3 --directory-prefix=/home/erin/Documents/erins_webpages --limit-rate=20k --level=1 -k -p -erobots=off -np -N --exclude-domains=del.icio.us,doubleclick.net -F -i ./delicious-20090629.htm
    bbelt16ag · 2009-07-02 01:46:21 7
  • It works on every Linux box.


    -1
    cat /proc/cpuinfo
    magicjohnson_ · 2010-09-24 09:27:58 3
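
    A couple of common follow-ups, for example (field names as found on typical x86 Linux systems):

    grep -c '^processor' /proc/cpuinfo     # count logical CPUs
    grep -m1 'model name' /proc/cpuinfo    # show the CPU model once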
  • http://github.com/c3w/ash: a Ruby SSH helper script. It reads a JSON config file for host, FQDN, user, port and tunnel options, and changes OSX Terminal profiles based on host 'type'. Usage: put the 'ash' ruby script in your PATH, modify and copy ashrc-dist to ~/.ashrc, configure OSX Terminal profiles such as "webserver" or "development", then run "ash myhostname" and away you go! v.2 will re-attach to a 'screen' named in your ~/.ashrc.


    -1
    ash prod<tab>
    c3w · 2012-05-12 19:51:02 8
  • ls


    -1
    ls
    yingkailiang · 2013-03-14 01:28:01 5
  • This command queries the delicious API, runs the XML through xml2, grabs the URLs, cuts off the leading field with cut, passes the list through uniq to remove any duplicates, and then hands it to linkchecker, which checks the links. The links go to the blacklist in ~/.linkchecker/blacklist. Please see the manual pages for further info. It took me a few days to figure this one out; I hope you enjoy it. Also, don't hit the API more than once every few seconds or you can get banned by delicious; see their site for info. Updated to be non-recursive.


    -1
    curl -k https://Username:Password@api.del.icio.us/v1/posts/all?red=api | xml2| \grep '@href' | cut -d\= -f 2- | sort | uniq | linkchecker -r0 --stdin --complete -v -t 50 -F blacklist
    bbelt16ag · 2013-05-04 17:43:21 8
  • This is very similar to the first example except that it employs the 'exec' argument of the find command rather than piping the result to xargs. The second example is nice and tidy but different *NIXs may not have as capable a grep command.


    -1
    find . -name "*.php" -exec grep -il searchphrase {} \;
    unixmonkey7797 · 2010-01-16 05:09:30 4
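
    The tidier xargs version mentioned above might look like this (a sketch assuming GNU find and xargs for -print0/-0):

    find . -name "*.php" -print0 | xargs -0 grep -il searchphrase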
  • Shows the ten biggest files and directories in the current directory.


    -1
    du -s * | sort -nr | head
    chenge · 2010-05-13 12:21:22 4
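
    With GNU coreutils, a human-readable variant is possible, since sort -h understands the size suffixes that du -h prints:

    du -sh * | sort -rh | head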

  • -1
    ffmpeg -r 12 -i img%03d.jpg -sameq -s hd720 -vcodec libx264 -crf 25 OUTPUT.MP4
    brainstorm · 2013-05-04 18:46:36 9
  • The while loop is overkill; it would be simpler to prevent the file from being modified. That said, none of the proposed solutions actually do that: a real one would go to the source of the problem.


    -1
    chmod -w /etc/resolv.conf
    ntropia · 2018-05-14 16:25:47 163
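
    If the goal is to genuinely prevent modification, as the comment suggests, one option on ext* filesystems is the immutable flag (assumes chattr from e2fsprogs):

    sudo chattr +i /etc/resolv.conf    # no process may change the file, not even root
    sudo chattr -i /etc/resolv.conf    # clear the flag when you need to edit it again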
  • Makes sure that find does not touch anything other than regular files, and handles non-standard characters in the filenames it passes to xargs.


    -1
    find . -type f -exec grep -qi 'foo' {} \; -print0 | xargs -0 vim
    arcege · 2009-09-03 17:55:26 7
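
    Note that vim will complain that input is not from a terminal when started at the end of a pipe like this; with a sufficiently recent GNU xargs, the -o flag reopens stdin on the controlling tty:

    find . -type f -exec grep -qi 'foo' {} \; -print0 | xargs -0 -o vim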
  • Useful since "export http_proxy=blahblah:8080" doesn't seem to work with pear.


    -1
    pear config-set http_proxy http://myusername:mypassword@corporateproxy:8080
    KoRoVaMiLK · 2010-05-13 14:44:03 30
  • Output: "Version: 3.2-0" (for example, if you run: aptitude show bash | grep Vers). Note that this depends on the language of your distribution, because the word "Version" may differ in other languages.


    -1
    aptitude show $PROGRAM | grep Vers
    aabilio · 2009-02-27 23:24:37 8
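
    Because the output of aptitude is localized, a locale-independent alternative on Debian-based systems is to query dpkg directly, or to force the C locale:

    dpkg-query -W -f='${Version}\n' bash
    LC_ALL=C aptitude show bash | grep Vers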

  • -1
    xrandr -q | grep -w Screen
    hemanth · 2010-02-14 15:38:49 3
  • Splits a PostScript file into multiple PostScript files; for each page of the input file one output file will be generated. The output files will be numbered, for example 1_orig.ps, 2_orig.ps and so on. The psselect command is part of the psutils package.


    -1
    file=orig.ps; for i in $(seq `grep "Pages:" $file | sed 's/%%Pages: //g'`); do psselect $i $file $i\_$file; done
    damncool · 2010-09-24 19:44:32 4
  • This command shows a high-level overview of system memory and usage, refreshed every 10 seconds. Change -n 10 to your desired refresh interval.


    -1
    watch -n 10 free -m
    Darkstar · 2014-01-04 10:10:15 12
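
    watch can also highlight differences between refreshes with -d, for example:

    watch -d -n 10 free -m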
  • Uses the pid to get the full path of the process. Useful when you do not know which command got picked up from the PATH.


    -1
    readlink -f /proc/<pid>/exe
    naseer · 2009-05-26 10:09:03 23
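
    For example, combined with pgrep to resolve the binary of the oldest matching process (sshd is just an illustrative name):

    readlink -f /proc/$(pgrep -o sshd)/exe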
  • This got a bit complicated, because I had to introduce an additional dot at the end that has to be removed again later.


    -1
    for each in *; do file="$each."; name=${file%%.*}; suffix=${file#*.}; mv "$each" "$(echo $name | rot13)${suffix:+.}${suffix%.}"; done
    hfs · 2010-03-20 16:11:12 6
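
    Note that rot13 is not a standard utility; a minimal stdin filter the loop above can use, defined with tr:

    rot13 () { tr 'A-Za-z' 'N-ZA-Mn-za-m'; }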

  • -1
    ls --color=never -1| grep -E "[0-9]{4}"|sed -re "s/^(.*)([0-9]{4})(.*)$/\2 \1\2\3/" | sort -r
    ysangkok · 2014-01-04 20:50:12 9

  • -1
    netstat -4tnape
    gnuyoga · 2009-05-26 11:50:52 5
  • Combines a few repetitive tasks when compiling source code. Especially useful when a hyphen in a file name breaks tab completion. 1) wget source.tar.gz 2) tar xzvf source.tar.gz 3) cd source 4) ls. From there you can run ./configure, make, etc.


    -1
    wtzc () { wget "$@"; foo=`echo "$@" | sed 's:.*/::'`; tar xzvf $foo; blah=`echo $foo | sed 's:.*/::'`; bar=`echo $blah | sed -e 's/\(.*\)\..*/\1/' -e 's/\(.*\)\..*/\1/'`; cd $bar; ls; }
    oshazard · 2010-01-17 11:25:47 3
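
    Usage, with a hypothetical tarball URL:

    wtzc http://example.com/source-1.0.tar.gz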
  • This is just a little snippet to split a large file into smaller chunks (4MB in this example) and then send the chunks off to (e)mail for archival using mutt. I usually encrypt the file before splitting it, using openssl: openssl des3 -salt -k <password> -in file.tgz -out file.tgz.des3. To restore, simply save the attachments and rejoin them using: cat file.tgz.* > output_name.tgz. If encrypted, decrypt using: openssl des3 -d -salt -k <password> -in file.tgz.des3 -out file.tgz. Edit: changed "g" to "e" for political correctness.


    -1
    split -b4m file.tgz file.tgz. ; for i in file.tgz.*; do SUBJ="Backup Archive"; MSG="Archive File Attached"; echo "$MSG" | mutt -a "$i" -s "$SUBJ" 'YourEmail@(E)mail.com'; done
    tboulay · 2010-03-20 16:49:19 8

  • -1
    if [ -e /etc/*-release ]; then cat /etc/*-release ; else cat /etc/*-version ; fi
    hugoeustaquio · 2011-06-22 14:09:24 3
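
    Depending on the distribution, the same information may be available more uniformly via the LSB tools or, on systemd-era systems, /etc/os-release:

    lsb_release -a
    cat /etc/os-release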
  • Transfer files with rsync over ssh on a non-standard port, showing a progress bar and resuming partial transfers.


    -1
    rsync -P -e 'ssh -p PORT' SRC DEST
    vickio · 2011-10-13 08:59:07 4
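
    For example, resuming a pull from a hypothetical host whose sshd listens on port 2222:

    rsync -P -e 'ssh -p 2222' user@host:/srv/backup.tar.gz ./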

