All commands (14,187)

  • Strips trailing whitespace (spaces and tabs) from each line of file.txt, in place. This is useful when you are programming, for example. (A more portable variant is sketched below.)


    -1
    sed -i 's/[ \t]\+$//g' file.txt
    elder · 2011-09-07 01:47:44 9
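    A more portable take on the same idea (a sketch; [[:space:]] covers spaces and tabs, and the /g flag is unnecessary since the anchored pattern can match at most once per line):

        sed -i 's/[[:space:]]\+$//' file.txt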

  • -1
    net user USERNAME /domain
    shawn_abdushakur · 2014-01-02 20:22:46 7
  • Just an alternative using a saved HTML file of all of my bookmarks. Works well, although it takes a while.


    -1
    wget -r --wait=5 --quota=5000m --tries=3 --directory-prefix=/home/erin/Documents/erins_webpages --limit-rate=20k --level=1 -k -p -erobots=off -np -N --exclude-domains=del.icio.us,doubleclick.net -F -i ./delicious-20090629.htm
    bbelt16ag · 2009-07-02 01:46:21 7
  • It works on every Linux box.


    -1
    cat /proc/cpuinfo
    magicjohnson_ · 2010-09-24 09:27:58 3
  • http://github.com/c3w/ash: a Ruby SSH helper script. It reads a JSON config file for host, FQDN, user, port and tunnel options, and changes OSX Terminal profiles based on the host 'type'. USAGE: put the 'ash' ruby script in your PATH, modify and copy ashrc-dist to ~/.ashrc, configure OSX Terminal profiles such as "webserver" and "development", then run "ash myhostname" and away you go! v.2 will re-attach to a 'screen' named in your ~/.ashrc.


    -1
    ash prod<tab>
    c3w · 2012-05-12 19:51:02 8
  • ls


    -1
    ls
    yingkailiang · 2013-03-14 01:28:01 5
  • This command queries the delicious API, runs the XML through xml2, grabs the URLs, cuts out the first two columns, passes the result through uniq to remove any duplicates, and then feeds it into linkchecker, which checks the links. The links go to the blacklist in ~/.linkchecker/blacklist; please see the manual pages for further info, peeps. It took me a few days to figure this one out; I hope you enjoy it. Also, don't hit the API more than once every few seconds or you can get banned by delicious; see their site for info. Updated: no longer recursive.


    -1
    curl -k https://Username:Password@api.del.icio.us/v1/posts/all?red=api | xml2| \grep '@href' | cut -d\= -f 2- | sort | uniq | linkchecker -r0 --stdin --complete -v -t 50 -F blacklist
    bbelt16ag · 2013-05-04 17:43:21 8
  • This is very similar to the first example except that it employs the 'exec' argument of the find command rather than piping the result to xargs. The second example is nice and tidy but different *NIXs may not have as capable a grep command.


    -1
    find . -name "*.php" -exec grep -il searchphrase {} \;
    unixmonkey7797 · 2010-01-16 05:09:30 4
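    For comparison, the xargs form mentioned in the comment might look like this (a sketch assuming GNU find and xargs; the NUL separators survive spaces and other odd characters in filenames):

        find . -name "*.php" -print0 | xargs -0 grep -il searchphrase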
  • Lists the 10 biggest files and directories under the current directory. (A human-readable variant is sketched below.)


    -1
    du -s * | sort -nr | head
    chenge · 2010-05-13 12:21:22 4
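    A variant with human-readable sizes (a sketch assuming GNU coreutils, where sort -h understands the K/M/G suffixes that du -h emits):

        du -sh * | sort -rh | head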

  • -1
    ffmpeg -r 12 -i img%03d.jpg -sameq -s hd720 -vcodec libx264 -crf 25 OUTPUT.MP4
    brainstorm · 2013-05-04 18:46:36 9
  • The while loop is overkill; it would be simpler to prevent the file from being modified. That said, none of the proposed solutions do that: a real one would go to the source of the problem. (One way to lock the file down is sketched below.)


    -1
    chmod -w /etc/resolv.conf
    ntropia · 2018-05-14 16:25:47 163
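    One way to actually lock the file down, as the comment suggests (a sketch assuming an ext-family filesystem and root privileges; the immutable flag blocks writes, renames and deletion until it is cleared):

        chattr +i /etc/resolv.conf
        chattr -i /etc/resolv.conf   # undo when you need to edit the file again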
  • Make sure that find does not touch anything other than regular files, and handles non-standard characters in filenames while passing to xargs.


    -1
    find . -type f -exec grep -qi 'foo' {} \; -print0 | xargs -0 vim
    arcege · 2009-09-03 17:55:26 7
  • Useful since "export http_proxy=blahblah:8080" doesn't seem to work with pear.


    -1
    pear config-set http_proxy http://myusername:mypassword@corporateproxy:8080
    KoRoVaMiLK · 2010-05-13 14:44:03 30
  • Output: "Version: 3.2-0", for example, if you type "aptitude show bash | grep Vers". This depends on the language of your distribution, because the word "Version" may be different in other languages. (A locale-independent alternative is sketched below.)


    -1
    aptitude show $PROGRAM | grep Vers
    aabilio · 2009-02-27 23:24:37 8
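    A locale-independent alternative that sidesteps the translation problem (a sketch for Debian-based systems; dpkg-query prints the bare version field, so no grep is needed):

        dpkg-query -W -f='${Version}\n' bash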

  • -1
    xrandr -q | grep -w Screen
    hemanth · 2010-02-14 15:38:49 3
  • Splits a postscript file into multiple postscript files. For each page of the input file one output file will be generated. The output files are numbered, for example 1_orig.ps, 2_orig.ps, and so on. The psselect command is part of the psutils package.


    -1
    file=orig.ps; for i in $(seq `grep "Pages:" $file | sed 's/%%Pages: //g'`); do psselect $i $file $i\_$file; done
    damncool · 2010-09-24 19:44:32 4
  • This command shows a high-level overview of system memory and usage, refreshed every 10 seconds. Change -n 10 to your desired refresh interval in seconds.


    -1
    watch -n 10 free -m
    Darkstar · 2014-01-04 10:10:15 12
  • Uses the pid to get the full path of the process. Useful when you do not know which command got picked up from the PATH. (An example follows below.)


    -1
    readlink -f /proc/<pid>/exe
    naseer · 2009-05-26 10:09:03 23
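    For example, resolving the binary of a running process by name (a sketch; sshd is just an illustrative process name, and pgrep -o picks the oldest matching pid):

        readlink -f /proc/$(pgrep -o sshd)/exe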
  • This got a bit complicated, because I had to introduce an additional dot at the end that has to be removed again later.


    -1
    for each in *; do file="$each."; name=${file%%.*}; suffix=${file#*.}; mv "$each" "$(echo $name | rot13)${suffix:+.}${suffix%.}"; done
    hfs · 2010-03-20 16:11:12 6
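    Note that rot13 is not a standard utility; a minimal sketch of it as a shell function, assuming a POSIX tr:

        rot13() { tr 'A-Za-z' 'N-ZA-Mn-za-m'; }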

  • -1
    ls --color=never -1| grep -E "[0-9]{4}"|sed -re "s/^(.*)([0-9]{4})(.*)$/\2 \1\2\3/" | sort -r
    ysangkok · 2014-01-04 20:50:12 9

  • -1
    netstat -4tnape
    gnuyoga · 2009-05-26 11:50:52 5
  • Combines a few repetitive tasks when compiling source code. Especially useful when a hyphen in a file-name breaks tab completion. 1) wget source.tar.gz; 2) tar xzvf source.tar.gz; 3) cd source; 4) ls. From there you can run ./configure, make, etc. (A usage example follows below.)


    -1
    wtzc () { wget "$@"; foo=`echo "$@" | sed 's:.*/::'`; tar xzvf $foo; blah=`echo $foo | sed 's:.*/::'`; bar=`echo $blah | sed -e 's/\(.*\)\..*/\1/' -e 's/\(.*\)\..*/\1/'`; cd $bar; ls; }
    oshazard · 2010-01-17 11:25:47 3
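    Usage might look like this (hypothetical URL; the function downloads the tarball, extracts it, cds into the resulting directory and lists it):

        wtzc http://example.com/some-source-1.0.tar.gz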
  • This is just a little snippet to split a large file into smaller chunks (4MB in this example) and then send the chunks off to (e)mail for archival using mutt. I usually encrypt the file before splitting it using openssl:

        openssl des3 -salt -k <password> -in file.tgz -out file.tgz.des3

    To restore, simply save the attachments and rejoin them using:

        cat file.tgz.* > output_name.tgz

    and, if encrypted, decrypt using:

        openssl des3 -d -salt -k <password> -in file.tgz.des3 -out file.tgz

    edit: (changed "g" to "e" for political correctness)


    -1
    split -b4m file.tgz file.tgz. ; for i in file.tgz.*; do SUBJ="Backup Archive"; MSG="Archive File Attached"; echo "$MSG" | mutt -a "$i" -s "$SUBJ" 'YourEmail@(E)mail.com'; done
    tboulay · 2010-03-20 16:49:19 8

  • -1
    if [ -x /etc/*-release ]; then cat /etc/*-release ; else cat /etc/*-version ; fi
    hugoeustaquio · 2011-06-22 14:09:24 3
  • Transfer files with rsync over ssh on a non-standard port, showing a progress bar and resuming partial transfers.


    -1
    rsync -P -e 'ssh -p PORT' SRC DEST
    vickio · 2011-10-13 08:59:07 4
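    Filled in, that might look like this (hypothetical port, host and paths; -P combines --partial and --progress, so interrupted transfers can be resumed):

        rsync -P -e 'ssh -p 2222' user@example.com:/srv/backup/ ~/backup/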