All commands (14,187)

  • Strips trailing spaces and tabs from the end of every line of file.txt, editing it in place. This is useful when you are programming, for example.


    -1
    sed -i 's/[ \t]\+$//g' file.txt
    elder · 2011-09-07 01:47:44 9
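    To preview which lines would change without touching the file (a variant of the command above, not part of the original entry):
    sed 's/[ \t]\+$//' file.txt | diff file.txt -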

  • -1
    net user USERNAME /domain
    shawn_abdushakur · 2014-01-02 20:22:46 7
  • Just an alternative using a saved HTML file of all of my bookmarks. It works well, although it takes a while.


    -1
    wget -r --wait=5 --quota=5000m --tries=3 --directory-prefix=/home/erin/Documents/erins_webpages --limit-rate=20k --level=1 -k -p -erobots=off -np -N --exclude-domains=del.icio.us,doubleclick.net -F -i ./delicious-20090629.htm
    bbelt16ag · 2009-07-02 01:46:21 7
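    To sanity-check which links wget will crawl before starting the full download, you can list the hrefs in the saved bookmarks file first (a sketch reusing the file name from the command above):
    grep -o 'href="[^"]*"' ./delicious-20090629.htm | cut -d'"' -f2 | sort -u | head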
  • It works on every Linux box.


    -1
    cat /proc/cpuinfo
    magicjohnson_ · 2010-09-24 09:27:58 3
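    Two common follow-ups, assuming a standard /proc layout and GNU grep:
    grep -c '^processor' /proc/cpuinfo
    grep -m1 'model name' /proc/cpuinfo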
  • http://github.com/c3w/ash is a Ruby SSH helper script. It reads a JSON config file for the host, FQDN, user, port and tunnel options, and changes OS X Terminal profiles based on the host 'type'. Usage: put the 'ash' Ruby script in your PATH, modify and copy ashrc-dist to ~/.ashrc, configure OS X Terminal profiles such as "webserver" and "development", then run "ash myhostname" and away you go! v.2 will re-attach to a 'screen' named in your ~/.ashrc.


    -1
    ash prod<tab>
    c3w · 2012-05-12 19:51:02 8
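    The setup steps from the description, written out as shell commands (the destination paths are assumptions; see the repository's ashrc-dist and README for the real details):
    cp ash ~/bin/
    cp ashrc-dist ~/.ashrc
    ash myhostname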
  • ls


    -1
    ls
    yingkailiang · 2013-03-14 01:28:01 5
  • This command queries the delicious API, runs the XML through xml2, grabs the URLs, cuts out the first two columns, passes the result through uniq to remove any duplicates, and then feeds it to linkchecker, which checks the links. Broken links go to the blacklist in ~/.linkchecker/blacklist. Please see the manual pages for further info. It took me a few days to figure this one out; I hope you enjoy it. Also, don't hit the API more than once every few seconds or you can get banned by delicious; see their site for info. ~updated for no recursion


    -1
    curl -k https://Username:Password@api.del.icio.us/v1/posts/all?red=api | xml2| \grep '@href' | cut -d\= -f 2- | sort | uniq | linkchecker -r0 --stdin --complete -v -t 50 -F blacklist
    bbelt16ag · 2013-05-04 17:43:21 8
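    The same pipeline broken into stages, which is easier to debug (a sketch; the placeholder credentials and flags are taken from the command above):
    curl -k 'https://Username:Password@api.del.icio.us/v1/posts/all?red=api' -o posts.xml
    xml2 < posts.xml | grep '@href' | cut -d= -f2- | sort -u > urls.txt
    linkchecker -r0 --complete -v -t 50 -F blacklist --stdin < urls.txt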
  • This is very similar to the first example, except that it employs the '-exec' argument of the find command rather than piping the result to xargs. The second example is nice and tidy, but different *NIXes may not have as capable a grep command.


    -1
    find . -name "*.php" -exec grep -il searchphrase {} \;
    unixmonkey7797 · 2010-01-16 05:09:30 4
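    For comparison, a typical xargs form of the same search (a sketch, not necessarily the exact "first example" the author refers to):
    find . -name "*.php" -print0 | xargs -0 grep -il searchphrase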
  • Shows the ten largest files and directories in the current directory.


    -1
    du -s * | sort -nr | head
    chenge · 2010-05-13 12:21:22 4
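    A human-readable variant, assuming GNU coreutils for sort -h:
    du -sh * | sort -hr | head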

  • -1
    ffmpeg -r 12 -i img%03d.jpg -sameq -s hd720 -vcodec libx264 -crf 25 OUTPUT.MP4
    brainstorm · 2013-05-04 18:46:36 9
  • The while loop is overkill; it would be simpler to prevent the file from being modified. That said, none of the proposed solutions do that: a real one would go to the source of the problem.


    -1
    chmod -w /etc/resolv.conf
    ntropia · 2018-05-14 16:25:47 165
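    If the goal really is to keep the file from being modified, making it immutable is a stronger lock than removing write permission (requires root and a filesystem supported by chattr); chattr -i undoes it:
    chattr +i /etc/resolv.conf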
  • Make sure that find does not touch anything other than regular files, and handles non-standard characters in filenames while passing to xargs.


    -1
    find . -type f -exec grep -qi 'foo' {} \; -print0 | xargs -0 vim
    arcege · 2009-09-03 17:55:26 7
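    Note that vim may complain that its input is not from a terminal when started at the end of a pipe; with a BSD xargs or a recent GNU xargs (an assumption about your version), the -o flag reopens the tty:
    find . -type f -exec grep -qi 'foo' {} \; -print0 | xargs -0 -o vim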
  • Useful since "export http_proxy=blahblah:8080" doesn't seem to work with pear.


    -1
    pear config-set http_proxy http://myusername:mypassword@corporateproxy:8080
    KoRoVaMiLK · 2010-05-13 14:44:03 30
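    To verify the setting afterwards:
    pear config-get http_proxy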
  • Output: Version 3.2-0 (for example, if you type # aptitude show bash | grep Vers). The result depends on the language of your distribution, because the word "Version" may be spelled differently in other languages.


    -1
    aptitude show $PROGRAM | grep Vers
    aabilio · 2009-02-27 23:24:37 8
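    Two locale-independent alternatives (the dpkg-query field name does not depend on the UI language, and LC_ALL=C forces English output):
    dpkg-query -W -f='${Version}\n' bash
    LC_ALL=C aptitude show bash | grep '^Version'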

  • -1
    xrandr -q | grep -w Screen
    hemanth · 2010-02-14 15:38:49 3
  • Splits a PostScript file into multiple PostScript files; for each page of the input file, one output file is generated. The files are numbered, for example 1_orig.ps, 2_orig.ps and so on. The psselect command is part of the psutils package.


    -1
    file=orig.ps; for i in $(seq `grep "Pages:" $file | sed 's/%%Pages: //g'`); do psselect $i $file $i\_$file; done
    damncool · 2010-09-24 19:44:32 4
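    A variant that zero-pads the page numbers so the output files sort in page order (same psselect call pattern, assuming the %%Pages: comment holds the page count, as the original does):
    file=orig.ps; for i in $(seq $(grep -m1 '%%Pages:' $file | awk '{print $2}')); do psselect $i $file $(printf '%03d' $i)_$file; done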
  • This command shows a high-level overview of system memory and usage, refreshed every 10 seconds. Change -n 10 to your desired refresh interval.


    -1
    watch -n 10 free -m
    Darkstar · 2014-01-04 10:10:15 12
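    Adding -d makes watch highlight the values that changed between refreshes:
    watch -d -n 10 free -m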
  • Uses the pid to get the full path of the process. Useful when you do not know which command got picked from the path.


    -1
    readlink -f /proc/<pid>/exe
    naseer · 2009-05-26 10:09:03 23
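    Example with a concrete pid, looked up with pgrep (sshd is only an illustration):
    readlink -f /proc/$(pgrep -o sshd)/exe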
  • This got a bit complicated, because I had to introduce an additional dot at the end that has to be removed again later.


    -1
    for each in *; do file="$each."; name=${file%%.*}; suffix=${file#*.}; mv "$each" "$(echo $name | rot13)${suffix:+.}${suffix%.}"; done
    hfs · 2010-03-20 16:11:12 6
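    rot13 is not a standard utility; if your system lacks it, a small tr-based shell function behaves the same way (an assumption about what the author's rot13 does):
    rot13 () { tr 'A-Za-z' 'N-ZA-Mn-za-m'; }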

  • -1
    ls --color=never -1| grep -E "[0-9]{4}"|sed -re "s/^(.*)([0-9]{4})(.*)$/\2 \1\2\3/" | sort -r
    ysangkok · 2014-01-04 20:50:12 9

  • -1
    netstat -4tnape
    gnuyoga · 2009-05-26 11:50:52 5
  • Combines a few repetitive tasks when compiling source code. Especially useful when a hyphen in a file name breaks tab completion. 1) wget source.tar.gz 2) tar xzvf source.tar.gz 3) cd source 4) ls. From there you can run ./configure, make and so on.


    -1
    wtzc () { wget "$@"; foo=`echo "$@" | sed 's:.*/::'`; tar xzvf $foo; blah=`echo $foo | sed 's:,*/::'`; bar=`echo $blah | sed -e 's/\(.*\)\..*/\1/' -e 's/\(.*\)\..*/\1/'`; cd $bar; ls; }
    oshazard · 2010-01-17 11:25:47 3
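    Example invocation (the URL is hypothetical):
    wtzc http://example.com/source-1.2.3.tar.gz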
  • This is just a little snippet to split a large file into smaller chunks (4 MB in this example) and then send the chunks off to (e)mail for archival using mutt. I usually encrypt the file before splitting it using openssl: openssl des3 -salt -k <password> -in file.tgz -out file.tgz.des3 To restore, simply save the attachments and rejoin them using: cat file.tgz.* > output_name.tgz and, if encrypted, decrypt using: openssl des3 -d -salt -k <password> -in file.tgz.des3 -out file.tgz edit: (changed "g" to "e" for political correctness)


    -1
    split -b4m file.tgz file.tgz. ; for i in file.tgz.*; do SUBJ="Backup Archive"; MSG="Archive File Attached"; echo $MSG | mutt -a $i -s "$SUBJ" YourEmail@(E)mail.com; done
    tboulay · 2010-03-20 16:49:19 8
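    An optional integrity check around the split-and-rejoin cycle (not part of the original recipe):
    md5sum file.tgz                  # note the checksum before splitting
    cat file.tgz.* > restored.tgz    # after saving the attachments
    md5sum restored.tgz              # should match the original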

  • -1
    if [ -f /etc/*-release ]; then cat /etc/*-release ; else cat /etc/*-version ; fi
    hugoeustaquio · 2011-06-22 14:09:24 3
  • Transfer files with rsync over ssh on a non-standard port, showing a progress bar and resuming partial transfers.


    -1
    rsync -P -e 'ssh -p PORT' SRC DEST
    vickio · 2011-10-13 08:59:07 4
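    A concrete example (host, port and paths are hypothetical):
    rsync -P -e 'ssh -p 2222' ./backup.tar.gz user@example.com:/srv/backups/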

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


Check These Out

Remove specific versions of old kernels (Ubuntu/Debian)
If, for example, you want to remove all kernels and headers but the last three versions, you can't use one of those magic all-in-one "remove old stuff" commands. With this simple but elegant command you can remove a range of versions, or a list of versions with e.g. {14,16,20}.
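A sketch of the idea on Ubuntu/Debian (the 4.4.0 version strings are hypothetical; check dpkg -l 'linux-*' for your own):
    sudo apt-get purge linux-image-4.4.0-{14,16,20}-generic linux-headers-4.4.0-{14,16,20}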

Google text-to-speech in local language or language of choice
Google text-to-speech in your local language or in a language of choice via a language code switch (ISO 639-1).

List detailed information about a ZIP archive
Lists zipfile info in long Unix "ls -l" format.
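The command itself is not shown on this page; zipinfo from the Info-ZIP package produces exactly this kind of listing:
    zipinfo archive.zip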

Display packages and versions on Debian/Ubuntu distrib
Needs admin rights to run dpkg-query.
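A typical invocation (the original command is not shown on this page):
    dpkg-query -W -f='${Package} ${Version}\n'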

Create a bash script from last commands
In order to write bash-scripts, I often do the task manually to see how it works. I type ### at the start of my session. The function fetches the commands from the last occurrence of '###', excluding the function call. You could prefix this with a here-document to have a proper script-header. Delete some lines, add a few variables and a loop, and you're ready to go. This function could probably be much shorter...
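A rough interactive sketch of the idea, not the author's function (assumes bash and GNU coreutils; it prints everything typed since the last line starting with ###, excluding the pipeline itself):
    history | awk '{ $1=""; sub(/^ +/, ""); print }' | tac | tail -n +2 | sed -n '/^###/q;p' | tac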

Show this month's calendar, with today's date highlighted
Explanation: * The date command evaluates to today's date, blank-padded on the left if it is a single digit. * The grep command searches for and highlights today's date. * The --before-context and --after-context flags display up to 6 lines before and after the line containing today's date, which completes the calendar. I have tested this command on Mac OS X Leopard and Xubuntu 8.10.
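A sketch of the technique described above (GNU date and grep assumed; not necessarily the author's exact command):
    cal | grep --color=auto -B6 -A6 -w "$(date +%-d)"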

copy root to new device
Clone a root partition. The reason for double-mounting the root device is to avoid any filesystem overlay issues. This is particularly important for /dev. Also, note the importance of the trailing slashes on the paths when using rsync (search the man page for "slash" for more details). rsync and bash add several subtle nuances to path handling; using trailing slashes will effectively mean "clone this directory", even when run multiple times. For example: run once to get an initial copy, and then run again in single user mode just before rebooting into the new disk. Using file globs (which miss dot-files) or leaving off the trailing slash with rsync (which will create /mnt/target/root) are traps that are easy to fall into.
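A sketch of the procedure (device names and mount points are hypothetical; mounting the root device a second time gives a view without the /dev and /proc overlays):
    mount /dev/sda2 /mnt/source
    mount /dev/sdb2 /mnt/target
    rsync -aHAX /mnt/source/ /mnt/target/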

list block devices
Shows all block devices in a tree with descriptions of what they are.
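On modern Linux systems this is typically lsblk from util-linux:
    lsblk -o NAME,SIZE,TYPE,MOUNTPOINT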

View non-printing characters with cat
Useful for detecting the number of tabs in an empty line, or DOS newlines (carriage return + newline). A tool that can help you understand why your parsing is not working.
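With GNU cat, -A (shorthand for -vET) shows tabs as ^I and marks each line end with $:
    cat -A file.txt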

Find files modified since a specific date
This command uses -newerXY to show you the files that are modified since a specific date. I recommend looking for "-newerXY" on the manpage to get the specifics.
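A typical use of -newerXY with Y=t, comparing modification time against a date string (GNU find assumed; the date is just an example):
    find . -type f -newermt "2020-01-01"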

