All commands (14,187)

  • This is a handy way to circumvent the "Maximum line length of 2048 exceeded" grep error. Once you have run the above command (or put it in your .bashrc), files can be searched using: lgrep search-string /file/to/search (a usage sketch follows this entry)


    1
    lgrep() { string="$1"; file="$2"; awk -v String="$string" '$0 ~ String' "$file"; }
    dopeman · 2010-01-19 09:42:19 3
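    A minimal usage sketch, assuming the lgrep function above has already been defined (for example in your .bashrc); the test file and search string here are hypothetical:
    printf 'needle in a very, very long line\n' > /tmp/haystack.txt
    lgrep needle /tmp/haystack.txt    # prints every line of the file matching "needle", with no grep line-length limit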
  • To also move the db backup to another location, you could pipe the output over ssh to the dd command on the remote machine instead of writing it to a local file: mysqldump -u user -h host -ppwd -B dbname | bzip2 -zc9 | ssh usr@server "dd of=db_dump" (a restore sketch follows this entry)


    1
    mysqldump -u user -h host -ppwd -B dbname | bzip2 -zc9 > dbname.sql.bz2
    olaseni · 2010-01-19 07:34:21 3
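    A hedged sketch of restoring such a remotely stored dump later by reversing the pipeline; the user, server and file names are placeholders and not part of the original entry (the -B/--databases dump already contains the CREATE DATABASE and USE statements):
    ssh usr@server "dd if=db_dump" | bunzip2 -c | mysql -u user -h host -ppwd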

  • 1
    perl -i~ -0777pe's/^/#!\/usr\/bin\/ksh\n/' testing
    azil · 2010-01-19 06:49:10 4
  • This command defragments the SQLite databases found in the home folder of the current Windows user. This is useful for speeding up Firefox startup. The executable sqlite3.exe must be located in the PATH or in the current folder. In a script use: for /f "delims==" %%a in (' dir "%USERPROFILE%\*.sqlite" /s/b ') do echo vacuum;|"sqlite3.exe" "%%a"


    -3
    for /f "delims==" %a in (' dir "%USERPROFILE%\*.sqlite" /s/b ') do echo vacuum;|"sqlite3.exe" "%a"
    vutcovici · 2010-01-18 20:56:00 6
  • It's very common to have cron jobs that send emails as their output, but the From: address is whatever account the cron job is running under, which is often not the address you want replies to go to. Here's a way to change the From: address right on the command line (a crontab sketch follows this entry). What's happening here is that the "--" separates the options to the mail client from options for the sendmail backend, so the -f and -F get passed through to sendmail and interpreted there. This works even on a system where postfix is the active mailer - it looks like postfix supports the same options. I think it's possible to customize the From: address using mutt as a command-line mailer too, but most servers don't have mutt preinstalled.


    10
    mail -s "subject" user@todomain.com <emailbody.txt -- -f customfrom@fromdomain.com -F 'From Display Name'
    dmmst19 · 2010-01-18 19:55:27 30
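    A minimal crontab sketch of using this From: override in a cron job, as described above; the script path, addresses, display name and schedule are hypothetical:
    0 6 * * * /usr/local/bin/nightly-report.sh > /tmp/report.txt 2>&1; mail -s "Nightly report" ops@example.com < /tmp/report.txt -- -f reports@example.com -F 'Report Robot'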
  • If Firefox is running, the database is locked, so you need to copy the places.sqlite file somewhere temporarily to be able to query it (a copy-and-query sketch follows this entry).


    2
    sqlite3 -list /home/$USER/.mozilla/firefox/*.default/places.sqlite 'select url from moz_places ;' | grep http
    bubo · 2010-01-18 15:25:00 3
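    A minimal sketch of the copy-first approach mentioned above, for when Firefox is running and holding the lock; the temporary path is arbitrary and a single *.default profile is assumed:
    cp /home/$USER/.mozilla/firefox/*.default/places.sqlite /tmp/places-copy.sqlite
    sqlite3 -list /tmp/places-copy.sqlite 'select url from moz_places;' | grep http
    rm /tmp/places-copy.sqlite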
  • http://www.mplayerhq.hu/DOCS/HTML/en/menc-feat-mpeg.html MEncoder can create MPEG (MPEG-PS) format output files. Usually, when you are using MPEG-1 or MPEG-2 video, it is because you are encoding for a constrained format such as SVCD, VCD, or DVD. To change MEncoder's output file format, use the -of mpeg option. Creating an MPEG-1 file suitable to be played on systems with minimal multimedia support, such as default Windows installs: mencoder input.avi -of mpeg -mpegopts format=mpeg1:tsaf:muxrate=2000 -o output.mpg -oac lavc -lavcopts acodec=mp2:abitrate=224 -ovc lavc -lavcopts vcodec=mpeg1video:vbitrate=1152:keyint=15:mbd=2:aspect=4/3


    1
    mencoder input.avi -of mpeg -ovc lavc -lavcopts vcodec=mpeg1video -oac copy other_options -o output.mpg
    slishan · 2010-01-18 13:12:03 3
  • Also look at xload


    1
    tload -s 10
    chinmaya · 2010-01-18 08:14:06 3

  • 0
    purple-remote "setstatus?status=Available&message=Checking libpurple"
    spsneo · 2010-01-17 23:48:17 4
  • Note: you'll want to set up pub-key ssh auth. Gives you a quick means of changing volume/tracks/etc for rhythmbox on a remote machine. E.g.:
    rc --next            # Play next track
    rc --print-playing   # Grab the name
    rc --volume-down
    rc --help


    9
    alias rc='ssh ${MEDIAPCHOSTNAME} env DISPLAY=:0.0 rhythmbox-client --no-start'
    rhythmx · 2010-01-17 19:43:43 6

  • -1
    watch -n 7 -d 'uptime | sed "s/.*users*, //"'
    matthewbauer · 2010-01-17 18:45:52 3
  • cat - concatenate MP3 files and save the result to a new file...


    -4
    # cat file1.mp3 file2.mp3 > file3.mp3
    svnlabs · 2010-01-17 13:18:34 5
  • CHANGELOG
    Version 1.1
    removedir () { echo "You are about to delete the current directory $PWD Are you sure?"; read human; if [[ "$human" = "yes" ]]; then blah=$(echo "$PWD" | sed 's/ /\\ /g'); foo=$(basename "$blah"); rm -Rf ../$foo/ && cd ..; else echo "I'm watching you" | pv -qL 10; fi; }
    BUG FIX: Folders with spaces
    Version 1.0
    removedir () { echo "You are about to delete the current directory $PWD Are you sure?"; read human; if [[ "$human" = "yes" ]]; then blah=`basename $PWD`; rm -Rf ../$blah/ && cd ..; else echo "I'm watching you" | pv -qL 10; fi; }
    BUG FIX: Hidden directories (.dotdirectory)
    Version 0.9
    rmdir () { echo "You are about to delete the current directory $PWD. Are you sure?"; read human; if [[ "$human" = "yes" ]]; then blah=`basename $PWD`; rm -Rf ../$blah/ && cd ..; else echo "I'm watching you" | pv -qL 10; fi; }
    Removes current directory with recursive and force flags plus basic human check. When prompted type yes
    1. [user@host ~]$ ls foo bar
    2. [user@host ~]$ cd foo
    3. [user@host foo]$ removedir
    4. yes
    5. rm -Rf foo/
    6. [user@host ~]$
    7. [user@host ~]$ ls bar


    -2
    removedir () { echo "Deleting the current directory $PWD Are you sure?"; read human; if [[ "$human" = "yes" ]]; then blah=$(echo "$PWD" | sed 's/ /\\ /g'); foo=$(basename "$blah"); rm -Rf ../$foo/ && cd ..; else echo "I'm watching you" | pv -qL 10; fi; }
    oshazard · 2010-01-17 11:34:38 31
  • Combines a few repetitive tasks when compiling source code. Especially useful when a hyphen in a file name breaks tab completion. 1.) wget source.tar.gz 2.) tar xzvf source.tar.gz 3.) cd source 4.) ls From there you can run ./configure, make, etc. (an example invocation follows this entry)


    -1
    wtzc () { wget "$@"; foo=`echo "$@" | sed 's:.*/::'`; tar xzvf $foo; blah=`echo $foo | sed 's:,*/::'`; bar=`echo $blah | sed -e 's/\(.*\)\..*/\1/' -e 's/\(.*\)\..*/\1/'`; cd $bar; ls; }
    oshazard · 2010-01-17 11:25:47 3
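    A hypothetical invocation, assuming the wtzc function above has been defined; the URL is a placeholder:
    wtzc http://example.com/source-1.0.tar.gz    # downloads the tarball, extracts it, cds into source-1.0 and lists its contents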

  • 2
    echo -e "swap=me\n1=2"|sed 's/\(.*\)=\(.*\)/\2=\1/g'
    axelabs · 2010-01-16 22:01:37 3
  • Will find all files containing "sample" in the current directory and in the directories below.


    -9
    find . -exec grep -l "sample" {} \;
    whoami · 2010-01-16 13:12:52 4

  • 6
    mwiki () { blah=`echo $@ | sed -e 's/ /_/g'`; dig +short txt $blah.wp.dg.cx; }
    oshazard · 2010-01-16 07:13:43 3
  • This is very similar to the first example except that it employs the 'exec' argument of the find command rather than piping the result to xargs. The second example is nice and tidy but different *NIXs may not have as capable a grep command.


    -1
    find . -name "*.php" -exec grep -il searchphrase {} \;
    unixmonkey7797 · 2010-01-16 05:09:30 4
  • I use this command (PS1) to show a list of the bash prompt's special characters. I tested it against a flavor of Red Hat Linux and Mac OS X.


    3
    alias PS1="man bash | sed -n '/ASCII bell/,/end a sequence/p'"
    haivu · 2010-01-15 23:39:28 3

  • -2
    hdid somefile.dmg
    rnoyfb · 2010-01-15 12:00:48 5
  • If you really _must_ use a loop, this is better than parsing the output of 'ps': PID=$!; while kill -0 $PID &>/dev/null; do sleep 1; done - kill -0 $PID returns 0 if the process still exists, and 1 otherwise (a sketch contrasting the two approaches follows this entry).


    0
    wait
    bhepple · 2010-01-15 04:03:11 5
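    A small self-contained sketch contrasting the plain wait builtin with the kill -0 polling loop described above; the sleep commands just stand in for real background work:
    sleep 30 &                 # some long-running background job
    PID=$!
    wait $PID                  # simplest: block until that specific job exits

    sleep 30 &                 # same job, but polled instead of waited on
    PID=$!
    while kill -0 $PID 2>/dev/null; do sleep 1; done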

  • -4
    dd if=/dev/zero of=/tmp/bigfile bs=1024k count=100
    wincus · 2010-01-15 00:44:44 4
  • shorter :p


    2
    grep -rHi searchphrase *.php
    psybermonkey · 2010-01-15 00:23:25 5
  • This command will find all files recursively containing the phrase entered, represented here by "searchphrase". This particular command searches in all php files, but you could change that to just html files or just log files, etc.


    2
    find . -name "*.php" | xargs grep -il searchphrase
    refrax · 2010-01-14 22:42:36 5
  • This will output the characters at 10 per second.


    124
    echo "You can simulate on-screen typing just like in the movies" | pv -qL 10
    dennisw · 2010-01-14 20:17:44 869
