All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Network Folder Copy with Monitoring ( tar + nc + pv )
Transfer a tar stream through nc with pv monitoring. Taken from: http://www.catonmat.net/blog/unix-utilities-pipe-viewer/
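
A minimal sketch of the pattern (host, port and path are placeholders; traditional netcat takes -p for the listening port, the BSD variant drops it):

On the receiving machine, listen and unpack:
$ nc -l -p 7777 | pv | tar -xf -

On the sending machine, pack and send:
$ tar -cf - /path/to/dir | pv | nc receiver_host 7777

pv sits in the middle of each pipe and shows throughput and total bytes as the copy runs.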

Convert CSV to JSON
Replace 'csv_file.csv' with your filename.
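
One minimal way to do the conversion from the shell is with Python's bundled csv and json modules; this sketch emits a JSON array of row arrays rather than objects keyed by header:

$ python -c "import csv,json;print(json.dumps(list(csv.reader(open('csv_file.csv')))))"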

Use mplayer to save video streams to a file
I use this command to save RTSP video streams overnight from one of our national TV stations, so I won't have to squeeze the data through my slow internet connection when I want to watch it the next day. For ease of use, you might want to put this in a file:

#!/bin/bash
FILE="$(basename "$1")"
mplayer -dumpstream -dumpfile "$FILE" -playlist "$1"
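
Saved as an executable script (dumpstream.sh is just a hypothetical name), it would be invoked with the playlist URL as its only argument:

$ ./dumpstream.sh "rtsp://example.com/some/stream"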

Show me a histogram of the busiest minutes in a log file:
Busiest seconds:
$ cat /var/log/secure.log | awk '{print substr($0,1,15)}' | uniq -c | sort -nr | awk '{printf("\n%s ",$0); for (i = 0; i < $1; i++) printf("*")}'

The first awk keeps only the timestamp of each line, uniq -c counts the repeats, sort -nr ranks them, and the second awk draws one asterisk per hit.
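
For the busiest minutes, as in the title, shorten the substring so the seconds are dropped; assuming the standard syslog "Mon DD HH:MM:SS" timestamp, that means keeping the first 12 characters:

$ cat /var/log/secure.log | awk '{print substr($0,1,12)}' | uniq -c | sort -nr | awk '{printf("\n%s ",$0); for (i = 0; i < $1; i++) printf("*")}'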

FAST Search and Replace for Strings in all Files in Directory
I needed a way to search all files in a web directory for a certain string and replace that string with another. In the example, I am searching for "askapache" and replacing it with "htaccess". I wanted this to run as a cron job, and it was important that it ran as fast as possible while not hogging the CPU, since the machine is a server. So the script uses the nice command to run the shell with priority 19, meaning it won't hog CPU time. The -P5 option to xargs runs 5 separate grep and sed processes simultaneously, which is much faster than a single grep or sed; you may want -P0 (unlimited) if you aren't worried about spawning too many processes or don't have to deal with background process killers. The -m1 option to grep stops searching a file after the first match, which also saves time.
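
A minimal sketch of the approach described (not necessarily the author's exact command; the /var/www path and the two strings are placeholders):

$ find /var/www -type f | nice -n 19 xargs -P5 -I{} sh -c 'grep -l -m1 "askapache" "{}" && sed -i "s/askapache/htaccess/g" "{}"'

grep -l exits as soon as a file matches, and the && ensures sed only rewrites files that actually contain the string.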

Apache CLF access log format to CSV converter
- Excel-compatible date, with a separate hour field
- a fixed 1 added for easier request-counter aggregation
- URL split into directory, filename, file extension and query
- used with a Tomcat valve, with response bytes replaced by elapsed time
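
The converter itself isn't reproduced here, but a minimal awk sketch of the core idea (splitting the CLF timestamp into date and hour and appending the fixed 1; the URL split into directory/filename/extension/query is omitted) might look like:

$ awk '{ gsub(/\[/, "", $4); split($4, dt, ":"); print dt[1] "," dt[2] "," $1 "," $7 "," $9 "," $10 ",1" }' access.log

Here dt[1] is the date, dt[2] the hour, $1 the client host, $7 the request URL, $9 the status and $10 the response bytes (or elapsed time, with the Tomcat valve mentioned above).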

delay execution of a command that needs lots of memory and CPU time until the resources are available
[ 2000 -ge "$(free -m | awk '/buffers.cache:/ {print $4}')" ] returns true if less than 2000 MB of RAM are available, so adjust this number to your needs. [ $(echo "$(uptime | awk '{print $10}' | sed -e 's/,$//' -e 's/,/./') >= $(grep -c ^processor /proc/cpuinfo)" | bc) -eq 1 ] returns true if the current machine load is at least equal to the number of CPUs. If either test returns true, we wait 10 seconds and check again. If both return false, i.e. 2 GB are available and the machine load has fallen below the number of CPUs, we start our command and save its output in a text file. The ( ( ... ) & ) construct lets the command keep running in the background even if we log out. See http://www.commandlinefu.com/commands/view/3115/ .
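
Assembled from the pieces above into a single runnable line (your_command and output.txt are placeholders; note that recent versions of free dropped the "buffers/cache" line, so the memory test may need adjusting there):

$ while [ 2000 -ge "$(free -m | awk '/buffers.cache:/ {print $4}')" ] || [ $(echo "$(uptime | awk '{print $10}' | sed -e 's/,$//' -e 's/,/./') >= $(grep -c ^processor /proc/cpuinfo)" | bc) -eq 1 ]; do sleep 10; done; ( ( your_command > output.txt ) & )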

Have netcat listening on your ports and use telnet to test connection
This will start a netcat process listening on port 666. If you are able to connect to your server, netcat will receive the data being sent and spit it out to the screen (it may look like random garbage, so you might want to redirect it to a file).
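
A minimal sketch (port and host are placeholders; traditional netcat takes -p, the BSD variant uses plain nc -l 666):

On the server:
$ nc -l -p 666

From another machine:
$ telnet your_server 666

Anything typed into the telnet session shows up on the netcat side, confirming the port is reachable.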

batch crop images with ImageMagick
Just starting to fall in love with mogrify.
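
A minimal sketch of a batch crop (the geometry and glob are placeholders; note that mogrify overwrites files in place, so work on copies):

$ mogrify -crop 640x480+10+10 +repage *.jpg

+repage discards the original page geometry so the cropped images don't keep an offset.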

Compress blank lines
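
One common way is cat's squeeze-blank option, which collapses every run of blank lines into a single one (file is a placeholder):

$ cat -s file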


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
