All commands (14,187)

  • This is a handy way to circumvent the "Maximum line length of 2048 exceeded" grep error. Once you have defined the function below (or put it in your .bashrc), files can be searched using: lgrep search-string /file/to/search (a usage sketch follows this entry).


    1
    lgrep() { string=$1; file=$2; awk -v String=${string} '$0 ~ String' ${file}; }
    dopeman · 2010-01-19 09:42:19 3
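
    A quick usage sketch for the lgrep function above; the search string and log file are made up for illustration:

    lgrep ERROR /var/log/syslog    # awk prints every line of the file containing ERROR
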
  • To also move the db backup to another location, you could pipe the output over ssh to the dd command on the remote machine instead of writing it to a local file: mysqldump -u user -h host -ppwd -B dbname | bzip2 -zc9 | ssh usr@server "dd of=db_dump" (a restore sketch follows this entry).


    1
    mysqldump -u user -h host -ppwd -B dbname | bzip2 -zc9 > dbname.sql.bz2
    olaseni · 2010-01-19 07:34:21 3
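
    A minimal sketch of restoring a dump produced by the command above; user, host and password are placeholders, and the -B dump already contains the CREATE DATABASE statement:

    bunzip2 -c dbname.sql.bz2 | mysql -u user -h host -ppwd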

  • 1
    perl -i~ -0777pe's/^/\!\#\/usr\/bin\/ksh\n/' testing
    azil · 2010-01-19 06:49:10 4
  • This command defragments the SQLite databases found in the home folder of the current Windows user, which is useful for speeding up Firefox startup. The executable sqlite3.exe must be in the PATH or in the current folder. In a batch script, double the percent signs: for /f "delims==" %%a in (' dir "%USERPROFILE%\*.sqlite" /s/b ') do echo vacuum;|"sqlite3.exe" "%%a"


    -3
    for /f "delims==" %a in (' dir "%USERPROFILE%\*.sqlite" /s/b ') do echo vacuum;|"sqlite3.exe" "%a"
    vutcovici · 2010-01-18 20:56:00 6
  • It's very common to have cron jobs that send emails as their output, but the From: address is whatever account the cron job is running under, which is often not the address you want replies to go to. Here's a way to change the From: address right on the command line. What's happening here is that the "--" separates the options for the mail client from the options for the sendmail backend, so the -f and -F get passed through to sendmail and interpreted there. This works even on a system where postfix is the active mailer - it looks like postfix supports the same options. I think it's also possible to customize the From: address using mutt as a command-line mailer, but most servers don't have mutt preinstalled. (A crontab sketch follows this entry.)


    10
    mail -s "subject" user@todomain.com <emailbody.txt -- -f customfrom@fromdomain.com -F 'From Display Name'
    dmmst19 · 2010-01-18 19:55:27 30
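
    A sketch of how this might look as a crontab entry; the schedule, addresses and report path are placeholders:

    0 6 * * * mail -s "nightly report" admin@todomain.com < /var/reports/nightly.txt -- -f reports@fromdomain.com -F 'Nightly Reports'
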
  • If Firefox is running, the database is locked, so you need to copy the places.sqlite file somewhere temporary to be able to query it (a copy-then-query sketch follows this entry).


    2
    sqlite3 -list /home/$USER/.mozilla/firefox/*.default/places.sqlite 'select url from moz_places ;' | grep http
    bubo · 2010-01-18 15:25:00 3
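
    A copy-then-query sketch for when Firefox has the database locked; the temporary path is arbitrary:

    cp /home/$USER/.mozilla/firefox/*.default/places.sqlite /tmp/places-copy.sqlite
    sqlite3 -list /tmp/places-copy.sqlite 'select url from moz_places;' | grep http
    rm /tmp/places-copy.sqlite
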
  • http://www.mplayerhq.hu/DOCS/HTML/en/menc-feat-mpeg.html MEncoder can create MPEG (MPEG-PS) format output files. Usually, when you are using MPEG-1 or MPEG-2 video, it is because you are encoding for a constrained format such as SVCD, VCD, or DVD. To change MEncoder's output file format, use the -of mpeg option. Creating an MPEG-1 file suitable to be played on systems with minimal multimedia support, such as default Windows installs: mencoder input.avi -of mpeg -mpegopts format=mpeg1:tsaf:muxrate=2000 -o output.mpg -oac lavc -lavcopts acodec=mp2:abitrate=224 -ovc lavc -lavcopts vcodec=mpeg1video:vbitrate=1152:keyint=15:mbd=2:aspect=4/3


    1
    mencoder input.avi -of mpeg -ovc lavc -lavcopts vcodec=mpeg1video -oac copy other_options -o output.mpg
    slishan · 2010-01-18 13:12:03 3
  • Also look at xload


    1
    tload -s 10
    chinmaya · 2010-01-18 08:14:06 3

  • 0
    purple-remote "setstatus?status=Available&message=Checking libpurple"
    spsneo · 2010-01-17 23:48:17 4
  • Note: you'll want to set up pub-key ssh auth (a setup sketch follows this entry). Gives you a quick means of changing volume/tracks/etc. for rhythmbox on a remote machine. E.g.: rc --next (play next track); rc --print-playing (grab the name); rc --volume-down; rc --help


    9
    alias rc='ssh ${MEDIAPCHOSTNAME} env DISPLAY=:0.0 rhythmbox-client --no-start'
    rhythmx · 2010-01-17 19:43:43 6
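
    A minimal sketch of the pub-key setup the note refers to; the remote user and host name are placeholders:

    ssh-keygen -t rsa                        # generate a key pair if you don't already have one
    ssh-copy-id user@mediapc                 # install the public key on the remote machine
    export MEDIAPCHOSTNAME=user@mediapc      # the alias above expects this variable to be set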

  • -1
    watch -n 7 -d 'uptime | sed s/.*users?, //'
    matthewbauer · 2010-01-17 18:45:52 3
  • cat - concatenate MP3 files and save the result...


    -4
    # cat file1.mp3 file2.mp3 > file3.mp3
    svnlabs · 2010-01-17 13:18:34 5
  • CHANGELOG

    Version 1.1 (bug fix: folders with spaces)
    removedir () { echo "You are about to delete the current directory $PWD Are you sure?"; read human; if [[ "$human" = "yes" ]]; then blah=$(echo "$PWD" | sed 's/ /\\ /g'); foo=$(basename "$blah"); rm -Rf ../$foo/ && cd ..; else echo "I'm watching you" | pv -qL 10; fi; }

    Version 1.0 (bug fix: hidden directories, e.g. .dotdirectory)
    removedir () { echo "You are about to delete the current directory $PWD Are you sure?"; read human; if [[ "$human" = "yes" ]]; then blah=`basename $PWD`; rm -Rf ../$blah/ && cd ..; else echo "I'm watching you" | pv -qL 10; fi; }

    Version 0.9
    rmdir () { echo "You are about to delete the current directory $PWD. Are you sure?"; read human; if [[ "$human" = "yes" ]]; then blah=`basename $PWD`; rm -Rf ../$blah/ && cd ..; else echo "I'm watching you" | pv -qL 10; fi; }

    Removes the current directory with recursive and force flags plus a basic human check. When prompted, type yes.

    Sample session:
    1. [user@host ~]$ ls
       foo bar
    2. [user@host ~]$ cd foo
    3. [user@host foo]$ removedir
    4. yes
    5. rm -Rf foo/
    6. [user@host ~]$
    7. [user@host ~]$ ls
       bar


    -2
    removedir () { echo "Deleting the current directory $PWD Are you sure?"; read human; if [[ "$human" = "yes" ]]; then blah=$(echo "$PWD" | sed 's/ /\\ /g'); foo=$(basename "$blah"); rm -Rf ../$foo/ && cd ..; else echo "I'm watching you" | pv -qL 10; fi; }
    oshazard · 2010-01-17 11:34:38 31
  • Combines a few repetitive tasks when compiling source code. Especially useful when a hyphen in a file name breaks tab completion. 1.) wget source.tar.gz 2.) tar xzvf source.tar.gz 3.) cd source 4.) ls From there you can run ./configure, make and so on. (A usage sketch follows this entry.)


    -1
    wtzc () { wget "$@"; foo=`echo "$@" | sed 's:.*/::'`; tar xzvf $foo; blah=`echo $foo | sed 's:,*/::'`; bar=`echo $blah | sed -e 's/\(.*\)\..*/\1/' -e 's/\(.*\)\..*/\1/'`; cd $bar; ls; }
    oshazard · 2010-01-17 11:25:47 3
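
    A usage sketch for the function above; the URL is made up, and the comment assumes the tarball extracts into a directory matching its base name:

    wtzc http://example.org/hello-world-2.4.tar.gz    # fetches, unpacks, cds into hello-world-2.4 and lists its contents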

  • 2
    echo -e "swap=me\n1=2"|sed 's/\(.*\)=\(.*\)/\2=\1/g'
    axelabs · 2010-01-16 22:01:37 3
  • Will find all files containing "sample" in the current directory and in the directories below.


    -9
    find . -exec grep -l "sample" {} \;
    whoami · 2010-01-16 13:12:52 4

  • 6
    mwiki () { blah=`echo $@ | sed -e 's/ /_/g'`; dig +short txt $blah.wp.dg.cx; }
    oshazard · 2010-01-16 07:13:43 3
  • This is very similar to the first example except that it employs the 'exec' argument of the find command rather than piping the result to xargs. The second example is nice and tidy but different *NIXs may not have as capable a grep command.


    -1
    find . -name "*.php" -exec grep -il searchphrase {} \;
    unixmonkey7797 · 2010-01-16 05:09:30 4
  • I use this command (PS1) to show a list of the bash prompt's special characters. I tested it on a flavor of Red Hat Linux and on Mac OS X.


    3
    alias PS1="man bash | sed -n '/ASCII bell/,/end a sequence/p'"
    haivu · 2010-01-15 23:39:28 3

  • -2
    hdid somefile.dmg
    rnoyfb · 2010-01-15 12:00:48 5
  • If you really _must_ use a loop, this is better than parsing the output of 'ps': PID=$!; while kill -0 $PID &>/dev/null; do sleep 1; done. Here kill -0 $PID returns 0 if the process still exists; otherwise 1. (A short wait example follows this entry.)


    0
    wait
    bhepple · 2010-01-15 04:03:11 5
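
    A minimal sketch of the plain wait approach; the background job is a placeholder:

    sleep 30 &      # start some long-running job in the background
    PID=$!
    wait $PID       # block until that job exits, no polling loop needed
    echo "job finished"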

  • -4
    dd if=/dev/zero of=/tmp/bigfile bs=1024k count=100
    wincus · 2010-01-15 00:44:44 4
  • shorter :p


    2
    grep -rHi searchphrase *.php
    psybermonkey · 2010-01-15 00:23:25 5
  • This command will recursively find all files containing the phrase entered, represented here by "searchphrase". This particular command searches in all php files, but you could change that to just html files or just log files, etc.


    2
    find . -name "*.php" | xargs grep -il searchphrase
    refrax · 2010-01-14 22:42:36 5
  • This will output the characters at 10 per second.


    124
    echo "You can simulate on-screen typing just like in the movies" | pv -qL 10
    dennisw · 2010-01-14 20:17:44 882
