All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Share Your Commands


Check These Out

Remote Serial connection redirected over network using SSH
Requires software found at: http://lpccomp.bc.ca/remserial/

Remote [A] (with the physical serial port connected to the device):
$ ./remserial -d -p 23000 -s "115200 raw" /dev/ttyS0 &

Local [B] (running the program that needs to connect to the serial device):
Create an SSH tunnel to the remote server:
$ ssh -N -L 23000:localhost:23000 user@hostwithphysicalserialport
Then use the locally tunnelled port to connect the local virtual serial port to the remote physical port:
$ ./remserial -d -r localhost -p 23000 -l /dev/remser1 /dev/ptmx &

Example: Running minicom on machine B using serial /dev/remser1 will actually connect you to whatever device is plugged into machine A's serial port /dev/ttyS0.

split a multi-page PDF into separate files
You have to do this once per output file, because if the device is 'pdfwrite', 'gs' only creates a single output file even when it sees '%d' in the OutputFile. Embed it in a simple shell script if you want to split a document into one file per page.
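For instance, a loop along these lines would do it; this is only a sketch, with input.pdf as a placeholder and pdfinfo used for the page count (an assumption, not part of the original tip):

pages=$(pdfinfo input.pdf | awk '/^Pages:/ {print $2}')
for i in $(seq 1 "$pages"); do
    # one gs run per page: extract page $i into its own PDF
    gs -q -dNOPAUSE -dBATCH -sDEVICE=pdfwrite \
       -dFirstPage="$i" -dLastPage="$i" \
       -sOutputFile="page_$i.pdf" input.pdf
done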

livehttpheaders (firefox addon) replacement
The default output file is "liveh.txt". This uses only BRE, in case you're using an older version of sed(1) that doesn't support ERE. With a modern sed(1), to reduce false-positive matches, you might do something like:

liveh(){ tcpdump -lnnAs512 -i ${1-} tcp |sed 's/.*GET /GET /;s/.*Host: /Host: /;s/.*POST /POST /;/GET |Host: |POST /!d;/[\"'"'"]/d;/\.\./d;w '"${2-liveh.txt}"'' >/dev/null ;}

Anyway, it's easy to clean up the output file with sed(1) afterwards.

restoring some data from a corrupted text file
See man tac. When there is a bad block in the middle of your file, you can see its beginning with `cat' and its end with `tac', but both commands terminate with an error. So this sequence rebuilds a new file without the bad block.
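A minimal sketch of such a sequence (the file names here are made up, not from the original command):

cat damaged.txt > restored.txt         # salvages everything before the bad block, then errors out
tac damaged.txt | tac >> restored.txt  # salvages everything after the bad block, re-reversed into original order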

Print a cron formatted time for 2 minutes in the future (for crontab testing)
Another function to stick into your .bashrc. This spits out the time two minutes in the future, already formatted for pasting into your crontab file, so you can test without any thought required on your part. Frequently things don't work the way you expect inside a crontab job, and you probably want to find out now that your $PATH is completely different inside of cron, or that other global variables aren't defined. So this generates a date you can use for testing now; later you can change the entry to run at 5:37 am on a Sunday evening.
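A sketch of such a function, assuming GNU date (the function name is made up; on BSD/macOS you would use date -v+2M instead of -d '+2 minutes'):

cron_soon() {
    # minute hour day-of-month month day-of-week, ready to paste into crontab
    date -d '+2 minutes' '+%M %H %d %m %w'
}

Paste its output in front of the job you want to test in crontab -e, then move the entry to its real schedule once it works.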

Make sure a script is run in a terminal.
Exit with error if script is not run in a terminal
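One common way to do this, as a sketch:

# abort unless standard input is attached to a terminal
if [ ! -t 0 ]; then
    echo "This script must be run from a terminal." >&2
    exit 1
fi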

create an incremental backup of a directory using hard links
dname is a directory named something like 20090803 for Aug 3, 2009. lastbackup is a soft link to the last backup made - say 20090802. $folder is the folder being backed up. Because this uses hard linking, files that already exist and haven't changed take up almost no space, yet each date directory holds a kind of "snapshot" of that day's files. Naturally, lastbackup needs to be updated after this operation. I must say that I can't take credit for this gem; I picked it up from somewhere on the net so long ago I don't remember where from anymore. Ah, well... Systems that are only somewhat slicker than this cost hundreds or even thousands of dollars - but we're HACKERS! We don't need no steenkin' commercial software... :)
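The exact command isn't shown above, but a minimal sketch of the same idea using rsync's --link-dest (variable names follow the description) looks like this:

dname=$(date +%Y%m%d)
# unchanged files become hard links into the previous snapshot; changed files are copied
rsync -a --link-dest="$PWD/lastbackup" "$folder/" "$dname/"
# point lastbackup at the new snapshot, as noted above
rm -f lastbackup && ln -s "$dname" lastbackup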

Read null character separated fields from a file
Handles badly named files containing ", ', \n, \b, \t, ` etc. Store the file names as a null-separated list:

find . -print0 > name.lst

and retrieve them using read -r -d "". E.g.:

cat name.lst | while IFS="" read -r -d "" file; do
    ls -l "$file"
done

Use color grep by default
Alias the grep command to show colored results by default.
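The usual form of the alias (the exact original isn't shown here) is:

alias grep='grep --color=auto'

Drop it into your .bashrc or equivalent so it applies to every interactive shell.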

Get all these commands in a text file with description.
I tried this out on my Mac, using jot to generate the sequence (0, 25, 50, ...); on Linux you can use 'seq' to generate the numbers instead. You need curl installed on the machine, then it rocks. @Satya
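A sketch of the idea using seq and the site's plaintext browse pages (the URL pattern and the upper bound here are assumptions, not the original command):

# each page returns 25 commands in plain text; append them all to one file
for i in $(seq 0 25 100); do
    curl -s "http://www.commandlinefu.com/commands/browse/sort-by-votes/plaintext/$i"
done > commandlinefu.txt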


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).
