
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):


News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!
Terminal - All commands - 12,144 results
while [ "$(ls -l --full-time TargetFile)" != "$a" ] ; do a=$(ls -l --full-time TargetFile); sleep 10; done
2015-05-09 03:19:49
User: dmmst19
Functions: ls sleep
0

Here's a way to wait for a file (a download, a logfile, etc) to stop changing, then do something. As written it will just return to the prompt, but you could add a "; echo DONE" or whatever at the end.

This just compares the full output of "ls" every 10 seconds, and keeps looping as long as that output has changed since the last interval. If the file is being appended to, its size will change; if it is being modified without growing, the timestamp from the --full-time option will have changed. The output of plain "ls -l" isn't sufficient, since by default it shows only minutes, not seconds.

Waiting for a file to stop changing is not a very elegant or reliable way to measure that some process is finished - if you know the process ID there are much better ways. This method will also give a false positive if the changes to the target file are delayed longer than the sleep interval for any reason (network timeouts, etc). But sometimes the process that is writing the file doesn't exit, rather it continues on doing something else, so this approach can be useful if you understand its limitations.
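As a sketch, the same idea can be wrapped in a reusable function with a configurable poll interval. The function name, the interval argument, and the DONE message are illustrative additions, not part of the original command:

```shell
# Poll a file's "ls --full-time" output (GNU ls) until it stops
# changing, then announce completion. wait_for_stable and its
# arguments are illustrative names.
wait_for_stable() {
  local target=$1 interval=${2:-10} a=""
  while [ "$(ls -l --full-time "$target")" != "$a" ]; do
    a=$(ls -l --full-time "$target")
    sleep "$interval"
  done
  echo "DONE: $target"
}
```

For example, `wait_for_stable download.iso 10 && tar xf download.iso` runs the next step only once the file has been quiet for a full interval.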

tr -s ' ' | cut -d' ' -f2-
du -ks .[^.]* * | sort -n
2015-05-08 12:26:34
User: rdc
Functions: du sort
Tags: du usage disk
0

This command summarizes the disk usage across the files and folders in a given directory, including hidden files and folders beginning with ".", but excluding the directories "." and "..".

It produces a sorted list with the largest files and folders at the bottom.
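A variant, assuming GNU coreutils: du -h with sort -h gives the same ranking with human-readable sizes, since sort -h understands the K/M/G suffixes that du -h emits:

```shell
# Same summary with human-readable sizes; -s keeps one total per entry,
# and sort -h (GNU) orders the suffixed numbers correctly. stderr is
# silenced in case no dotfiles match the .[^.]* glob.
du -sh .[^.]* * 2>/dev/null | sort -h
```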

while kill -0 0; do timeout 5 bash -c 'spinner=( Ooooo oOooo ooOoo oooOo ooooO oooOo ooOoo oOooo); while true; do for i in ${spinner[@]}; do for _ in $(seq 0 ${#i}); do echo -en "\b\b"; done; echo -ne "${i}"; sleep 0.2; done; done'; done
2015-05-07 19:13:08
User: anapsix
Functions: bash echo kill seq sleep
1

alternatively, run the spinner for 5 seconds:

timeout 5 bash -c 'spinner=( Ooooo oOooo ooOoo oooOo ooooO oooOo ooOoo oOooo); while true; do for i in ${spinner[@]}; do for j in $(seq 0 ${#i}); do echo -en "\b\b"; done; echo -ne "${i}"; sleep 0.2; done; done'

i=in.swf; dump-gnash -1 -j 1280 -k 720 -D "${i%.*}".bgra@12 -A "${i%.*}".wav "${i}"
2015-05-06 23:52:39
User: mhs
0

This will dump a raw BGRA pixel stream and WAV which must then be converted to video:

ffmpeg -f rawvideo -c:v rawvideo -s 1280x720 -r 12 -pix_fmt bgra -i "${i%.*}".bgra -i "${i%.*}".wav -c:v libx264 -preset veryslow -qp 0 -movflags +faststart -c:a libfdk_aac -b:a 384k "${i%.*}".mp4 ; rm "${i%.*}".bgra "${i%.*}".wav

Our example generates an x264/720p/12fps/AAC best-quality MP4.

To get dump-gnash, first install the build-dependencies for gnash (this step is OS-specific). Then:

git clone http://git.savannah.gnu.org/r/gnash.git ; cd gnash ; ./autogen.sh ; ./configure --enable-renderer=agg --enable-gui=dump --disable-menus --enable-media=ffmpeg --disable-jemalloc ; make
awk '{out="";for(i=2;i<=NF;i++){out=out" "$i};sub(/ /, "", out);print out}'
2015-05-06 22:26:28
User: endix
Functions: awk
Tags: awk
0

Increase "2" in "i=2" to drop more columns.
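For comparison, the tr/cut one-liner listed above does the same job once runs of spaces are squeezed; a minimal demo of both approaches (the sample text is made up):

```shell
# Drop the first whitespace-separated column, two ways.
echo "one  two three" | awk '{out="";for(i=2;i<=NF;i++){out=out" "$i};sub(/ /, "", out);print out}'   # → two three
echo "one  two three" | tr -s ' ' | cut -d' ' -f2-   # → two three
```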

wget -q -O- https://access.redhat.com/documentation/en-US/Red_Hat_Enterprise_Linux/ | grep Linux/7/pdf | cut -d \" -f 2 | awk '{print "https://access.redhat.com"$1}' | xargs wget
tail -f /var/squid/logs/access.log | perl -pe 's/(\d+)/localtime($1)/e'
sudo mysql -sNe 'show tables like "PREFIX_%"' DBNAME | xargs sudo mysqldump DBNAME > /tmp/dump.sql
echo -e ''$_{1..80}'\b+'
2015-05-05 22:13:33
User: knoppix5
Functions: echo
3

(prints the character '+' repeated 80 times)

Sometimes useful as a separator line, e.g. to underline a script's title.
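How it works: brace expansion turns $_{1..80} into eighty references to (empty) variables, each followed by the literal \b+; echo -e then interprets each \b as a backspace that erases the space separating the words, leaving eighty plus signs. A more readable equivalent, assuming bash:

```shell
# Print a character repeated N times without the backspace trick:
# the format %.0s consumes each brace-expansion argument while
# printing nothing, so only the literal '+' is emitted per argument.
printf '+%.0s' {1..80}; echo
```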

zenity --info --text "You're welcome! Lunch?" --display=:0
clear; while sleep 1; do d=$(date +"%H:%M:%S"); e=$(echo "toilet -t -f mono12 $d");tput setaf 1 cup 0; eval $e; tput setaf 4 cup 8; eval "$e -F flop";tput cup 0; done
find . -path "*/any_depth/*" -exec grep "needle" {} +
mysms='xxx0001234@messaging.sprintpcs.com' ; expect -c "log_user 0 ; set timeout -1 ; spawn usbmon -i usb0 ; expect -re \"C.*Ii.*-2:128\" { spawn sendmail $mysms ; send \"Smart Home Sensor Triggered\n.\n\" ; expect }"
4

An old USB A/B cable is all you need to make your own Smart Home hardware!

Cut off and discard the B-portion of the USB cable. On the A side, connect the RED (+) and WHITE (D-) wires via a 1 kiloohm resistor.

Picture:

http://imgur.com/dJGVlAU

Now plug the cable into a USB port on your Linux computer. Your hardware is ready!

Run the above command after changing variable mysms to your personal email-to-SMS gateway info as required by your cellular service provider.

The command uses the amazing usbmon tool (see link below) to detect the cable.

For the curious, to view the raw usbmon output, run this command: (Also see the sample output)

usbmon -i usb0

How does it work? When the red and white wires are connected (via the 1 kiloohm resistor), the USB hardware is tricked into thinking that a new USB device is trying to start up.

We then use the usbmon utility to capture the host USB events as it tries to talk to the cable.

The expect utility watches the usbmon stream and waits for the disconnect text "-2:128" before sending the SMS message.

Finally, the sendmail tool is used to email the SMS message to your smartphone via your cellular provider's SMS-to-email gateway.

As a result, when the electrical connection between the red and white wire is interrupted, or the USB cable is unplugged from your computer, you get an SMS notification of the disconnect event on your smartphone.

Could this be the cheapest smart home gadget ever? What are YOU going to sense with it?

Please let me know in the comments and please don't forget to click it up!

Links:

http://www.linuxcertif.com/man/8/usbmon/

http://en.wikipedia.org/wiki/USB#Pinouts

http://imgur.com/dJGVlAU

function summaryIP() { < $1 awk '{print $1}' | while read ip ; do verifyIP ${ip} && echo ${ip}; done | awk '{ip_array[$1]++} END { for (ip in ip_array) printf("%5d\t%s\n", ip_array[ip], ip)}' | sort -rn; }
2015-05-01 16:45:05
User: mpb
Functions: awk echo read sort
1

When working with lists of IP addresses, it is sometimes useful to summarize a count of how many times each IP address appears in the file.

This example, summaryIP, uses another function, "verifyIP" (previously defined on commandlinefu.com), to ensure only valid IP addresses get counted. The summary list is presented in descending order of count.
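A standalone sketch bundling summaryIP with the verifyIP helper it relies on (both are defined on this page; requires bash for the [[ =~ ]] test). The log file name and its contents are made up for illustration:

```shell
# verifyIP and summaryIP as defined on this page, repeated so the
# snippet runs on its own.
verifyIP() { octet="(25[0-5]|2[0-4][0-9]|[01]?[0-9]?[0-9])"; ip4="^$octet\.$octet\.$octet\.$octet$"; [[ ${1} =~ $ip4 ]] && return 0 || return 1; }
function summaryIP() { < $1 awk '{print $1}' | while read ip ; do verifyIP ${ip} && echo ${ip}; done | awk '{ip_array[$1]++} END { for (ip in ip_array) printf("%5d\t%s\n", ip_array[ip], ip)}' | sort -rn; }

# Made-up input: two hits for one address, one for another, one junk line.
printf '10.0.0.1 GET /\n10.0.0.1 GET /img\nnot-an-ip GET /\n192.168.1.5 GET /\n' > /tmp/clfu_demo.log
summaryIP /tmp/clfu_demo.log   # prints count, tab, address; highest count first
```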

echo "DISPLAY=$DISPLAY xmessage call the client" | at 10:00
2015-05-01 14:57:15
User: op4
Functions: at echo
Tags: echo at xmessage
1

This command will create a popup reminder window at the scheduled time (10:00 here) to assist in remembering tasks.

http://i.imgur.com/2n7viiA.png is how it looks when created

function verifyIP() { octet="(25[0-5]|2[0-4][0-9]|[01]?[0-9]?[0-9])"; ip4="^$octet\.$octet\.$octet\.$octet$"; [[ ${1} =~ $ip4 ]] && return 0 || return 1; }
2015-05-01 12:22:57
User: mpb
Functions: return
1

When processing IP addresses in the shell (or a shell script) it is useful to be able to verify that a value is a valid IP address, and not some random string or nonsensical IP address.
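A quick check of the regex in action (the function is repeated here so the snippet runs on its own; requires bash's [[ =~ ]] operator):

```shell
# verifyIP as defined on this page.
verifyIP() { octet="(25[0-5]|2[0-4][0-9]|[01]?[0-9]?[0-9])"; ip4="^$octet\.$octet\.$octet\.$octet$"; [[ ${1} =~ $ip4 ]] && return 0 || return 1; }

verifyIP 192.168.0.1 && echo "valid"      # → valid
verifyIP 256.1.1.1   || echo "rejected"   # → rejected (octet > 255)
```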

openssl x509 -enddate -noout -in file.pem
git for-each-ref --sort=-committerdate refs/heads/
openssl rsa -in key.priv -pubout > key.pub
2015-04-28 19:10:17
User: krizzo
2

This will derive the public key from the private key using openssl (an RSA private key contains all the information needed to reconstruct its public key).
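A round-trip sketch, assuming the openssl CLI is installed; the key file names are illustrative:

```shell
# Generate a throwaway 2048-bit RSA private key, then derive its
# public half with the command above.
openssl genrsa -out key.priv 2048 2>/dev/null
openssl rsa -in key.priv -pubout > key.pub 2>/dev/null
head -1 key.pub   # → -----BEGIN PUBLIC KEY-----
```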

egrep 'word1.*word2' --color /path/file.log |more
2015-04-28 15:09:45
User: alissonf
Functions: egrep
0

grep for 2 words existing on the same line (word1 appearing before word2)
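Note that 'word1.*word2' only matches when word1 precedes word2 on the line; to catch both orders, alternate the pattern. The sample input below is made up:

```shell
# Count lines containing both words in either order.
printf 'error then timeout\ntimeout then error\nneither word\n' \
  | grep -cE 'error.*timeout|timeout.*error'   # → 2
```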

for a in $(ls /usr/sbin /usr/bin); do ps -fC $a;done|grep -v PPID
2015-04-27 18:15:56
User: knoppix5
Functions: grep ls ps
-2

Thanks to pooderbill for the idea :-)

find . -type f -name '*' -exec md5sum '{}' + > hashes.txt
input=a.pdf ; pages=`pdftk $input dump_data | grep -i numberofpages | cut -d" " -f 2`; pdftk A=$input shuffle A1-$[$pages/2] A$pages-$[$pages/2+1] output "${input%.*}.rearranged.${input##*.}"
2015-04-26 20:05:20
User: kobayashison
Functions: cut grep
0

Rearrange a PDF document coming from a simplex document-feed scanner, fed first with the odd pages, then with the even pages in reverse order. Needs pdftk >1.44 with shuffle support.

Similar to http://www.commandlinefu.com/commands/view/7965/pdf-simplex-to-duplex-merge where there are 2 separate documents, odd and even

perl -e 'for(;;sleep 1){printf"\r"."%.4b "x6,split"",`date +%H%M%S`}'