Commands using head (303)

  • Find random strings within /dev/urandom: use grep to filter to just alphanumeric characters, then print the first 30 and strip the line feeds.


    51
    strings /dev/urandom | grep -o '[[:alnum:]]' | head -n 30 | tr -d '\n'; echo
    jbcurtis · 2009-02-16 00:39:28 10
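
    Since grep -o '[[:alnum:]]' prints one matching character per line, head -n 30 keeps 30 characters and tr -d '\n' joins them. A shorter equivalent (an editor's sketch, not part of the original entry) reads the device directly:

    tr -dc '[:alnum:]' < /dev/urandom | head -c 30; echo
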
  • You have an external USB drive or key. Run this command (using the file path of anything on your device) and it will simulate unplugging that device. If you just want the port, just type: echo $(sudo lshw -businfo | grep -B 1 -m 1 $(df "/path/to/file" | tail -1 | awk '{print $1}' | cut -c 6-8) | head -n 1 | awk '{print $1}' | cut -c 5- | tr ":" "-")


    30
    echo $(sudo lshw -businfo | grep -B 1 -m 1 $(df "/path/to/file" | tail -1 | awk '{print $1}' | cut -c 6-8) | head -n 1 | awk '{print $1}' | cut -c 5- | tr ":" "-") | sudo tee /sys/bus/usb/drivers/usb/unbind
    tweet78 · 2014-04-06 12:06:29 9
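
    To re-attach the device later, write the same bus ID back to the driver's bind file (a sketch; '2-1.6' is a hypothetical bus ID, use whatever the command above printed):

    echo '2-1.6' | sudo tee /sys/bus/usb/drivers/usb/bind
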
  • This makes an alias for a command named 'busy'. The 'busy' command opens a random file in /usr/include at a random line with vim. Drop this in your .bash_aliases and make sure that file is sourced from your .bashrc.


    23
    alias busy='my_file=$(find /usr/include -type f | sort -R | head -n 1); my_len=$(wc -l < $my_file); let "r = $RANDOM % $my_len" 2>/dev/null; vim +$r $my_file'
    busybee · 2010-03-09 21:48:41 8
  • Print out a list of all branches with the last commit date on each, including relative time since commit and color coding.


    16
    for k in `git branch|perl -pe s/^..//`;do echo -e `git show --pretty=format:"%Cgreen%ci %Cblue%cr%Creset" $k|head -n 1`\\t$k;done|sort -r
    brunost · 2009-06-03 08:25:00 3
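
    An aside not in the original entry: modern git can produce a similar listing natively, already sorted:

    git for-each-ref --sort=-committerdate --format='%(committerdate:relative) %(refname:short)' refs/heads/
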
  • This will give you the Dell Service tag number associated with your machine. Incredibly useful when you need that number for tech support or downloads.


    15
    sudo dmidecode | grep Serial\ Number | head -n1
    nlinux · 2009-02-18 14:54:28 3
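
    An aside: dmidecode can also be asked for the single string directly, skipping the grep:

    sudo dmidecode -s system-serial-number
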
  • Plot your most used commands with gnuplot.


    14
    history | awk '{a[$2]++}END{for(i in a){print a[i] " " i}}' | sort -rn | head > /tmp/cmds; gnuplot -persist <(echo 'plot "/tmp/cmds" using 1:xticlabels(2) with boxes')
    sthrs · 2010-06-13 23:35:13 2
  • This one uses dictionary.com


    13
    pronounce(){ wget -qO- $(wget -qO- "http://dictionary.reference.com/browse/$@" | grep 'soundUrl' | head -n 1 | sed 's|.*soundUrl=\([^&]*\)&.*|\1|' | sed 's/%3A/:/g;s/%2F/\//g') | mpg123 -; }
    matthewbauer · 2010-03-13 04:23:56 4
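
    Usage, once the function is defined (assumes mpg123 is installed; the word is an arbitrary example):

    pronounce hello
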
  • If you want a password longer than 6 characters, change -c6 to -c8 to get 8 random characters instead of 6. To end up with a line feed, wrap it in echo: # echo `< /dev/urandom tr -dc _A-Z-a-z-0-9 | head -c6`


    11
    < /dev/urandom tr -dc _A-Z-a-z-0-9 | head -c6
    Blackbit · 2009-02-24 09:43:40 4
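
    A parameterized variant (a sketch; the genpw name is hypothetical, and the length defaults to 6):

    genpw() { tr -dc '_A-Z-a-z-0-9' < /dev/urandom | head -c "${1:-6}"; echo; }
    genpw 12   # prints a 12-character password
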
  • Using cat WAR_AND_PEACE_By_LeoTolstoi.txt | tr -cs "[:alnum:]" "\n" | tr "[:lower:]" "[:upper:]" | sort -S16M | uniq -c | sort -nr | cat -n | head -n 30 will also do the job (the -S option sets sort's memory buffer; GNU/Linux sort only), but it has some drawbacks for bigger files, caused by the space/time complexity of sorting.


    11
    cat WAR_AND_PEACE_By_LeoTolstoi.txt | tr -cs "[:alnum:]" "\n"| tr "[:lower:]" "[:upper:]" | awk '{h[$1]++}END{for (i in h){print h[i]" "i}}'|sort -nr | cat -n | head -n 30
    cp · 2010-07-05 06:39:20 5
  • Pump up the chatter: run this script on a regular basis to listen to your Twitter timeline. This is a rough first cut using several CLI clips I have spotted around. There is no facility to skip the things already read to you. This could also easily be put in a loop for a timed onslaught from the chatterverse, though I think it might violate several points of the Geneva Convention. UPDATE: added a loop; it only reads the first 6 twits, and does this every 5 minutes.


    10
    while true; do curl -s -u username:password http://twitter.com/statuses/friends_timeline.rss | grep title | sed -ne 's/<\/*title>//gp' | head -n 6 | festival --tts; sleep 300; done
    tomwsmf · 2009-02-20 20:20:21 3
  • This Anti-TarBomb function makes it easy to unpack a .tar.gz without worrying that it will "explode" into your current directory. I always used to create a temporary folder, extract the tarball there, and reorganize the files afterwards; this function does that for you. Add it to your .zshrc / .bashrc and use it like this:

    atb arch1.tar.gz

    It creates a folder for the extracted files if they aren't already contained in a single folder. This only works for .tar.gz, but it's easy to edit the function to suit your needs if you want to extract .tgz, .tar.bz2 or just .tar. More info about tarbombs at http://www.linfo.org/tarbomb.html. Tested in zsh and bash.

    UPDATE: This variant works for .tar.gz, .tar.bz2, .tgz, .tbz and .tar in zsh (not working in bash):

    atb() { l=$(tar tf $1); if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep $(echo "$l" | head -n1) | wc -l) ]; then tar xf $1; else mkdir ${1%.t(ar.gz||ar.bz2||gz||bz||ar)} && tar xf $1 -C ${1%.t(ar.gz||ar.bz2||gz||bz||ar)}; fi ;}

    UPDATE2: From the comments, bepaald contributed a variant that works for the same extensions in bash:

    atb() { shopt -s extglob; l=$(tar tf $1); if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep $(echo "$l" | head -n1) | wc -l) ]; then tar xf $1; else mkdir ${1%.t@(ar.gz|ar.bz2|gz|bz|ar)} && tar xf $1 -C ${1%.t@(ar.gz|ar.bz2|gz|bz|ar)}; fi; shopt -u extglob; }


    10
    atb() { l=$(tar tf $1); if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep $(echo "$l" | head -n1) | wc -l) ]; then tar xf $1; else mkdir ${1%.tar.gz} && tar xf $1 -C ${1%.tar.gz}; fi ;}
    elfreak · 2010-10-16 05:50:32 5
  • Avoiding a for loop brought this time down to less than 3 seconds on my old machine. And just to be clear, 33554432 = 8192 * 4096, so this creates 4096 dummy files of 8192 bytes each.


    10
    base64 /dev/urandom | head -c 33554432 | split -b 8192 -da 4 - dummy.
    pdxdoughnut · 2013-11-12 17:56:23 1
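
    To check the result afterwards (an aside, not part of the original entry):

    ls dummy.* | wc -l   # should print 4096
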
  • Watches /dev/urandom for the byte sequence "ca fe". Reading 100 lines at a time and sleeping between reads makes it not as taxing on the CPU.


    9
    while true; do head -n 100 /dev/urandom; sleep .1; done | hexdump -C | grep "ca fe"
    campassi · 2010-10-05 16:23:31 1

  • 9
    cat /dev/urandom | tr -dc A-Za-z0-9 | head -c 32
    noqqe · 2011-11-20 17:29:45 0
  • Search for files and list the 20 largest.
    find . -type f
      gives us a list of files, recursively, starting from here (.)
    -print0 | xargs -0 du -h
      separates the names with NUL characters, so we're not confused by spaces; xargs then runs du to find their sizes in human-readable form (64M, not 64123456)
    | sort -hr
      uses sort to arrange the list in size order; sort -h knows that 1M is bigger than 9K
    | head -20
      finally selects the top twenty from the list


    9
    find . -type f -print0 | xargs -0 du -h | sort -hr | head -20
    flatcap · 2012-03-30 10:21:12 3
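
    An equivalent sketch that drops xargs in favour of find's own -exec ... + batching:

    find . -type f -exec du -h {} + | sort -hr | head -20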

  • 9
    find . -type f -print0 | xargs -0 du -h | sort -hr | head -10
    netaxiz · 2012-06-30 10:03:31 1
  • /dev/urandom is cryptographically secure, and practically indistinguishable from true random: it gathers data from external sources, influenced by human timing interactions with computers, to fill the entropy pool, and hashes the input with SHA-1. As such, this is a quick way to do a "true random" fair six-sided dice roll. Using this method, you could easily create passphrases with Diceware http://diceware.com. Change the head(1) byte count to roll more or fewer dice.


    9
    tr -cd '1-6' < /dev/urandom | head -c 1; echo
    atoponce · 2012-09-21 02:16:42 3
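
    Diceware needs five dice per word; bumping the byte count rolls them all at once (a direct tweak of the entry's own command):

    tr -cd '1-6' < /dev/urandom | head -c 5; echo
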
  • 64 elements max: 16 rows by 4 columns. GNU Barcode will automagically adapt the width and height of your elements to fill the page. The standard output format is PostScript.


    8
    ls /home | head -64 | barcode -t 4x16 | lpr
    flux · 2009-04-21 22:54:45 1
  • Suppose you made a backup of your hard disk with dd: dd if=/dev/sda of=/mnt/disk/backup.img. This command enables you to mount a partition from inside that image, so you can access your files directly. Substitute PARTITION=1 with the number of the partition you want to mount (as returned by sfdisk -d yourfile.img).


    7
    INFILE=/path/to/your/backup.img; MOUNTPT=/mnt/foo; PARTITION=1; mount "$INFILE" "$MOUNTPT" -o loop,offset=$[ `/sbin/sfdisk -d "$INFILE" | grep "start=" | head -n $PARTITION | tail -n1 | sed 's/.*start=[ ]*//' | sed 's/,.*//'` * 512 ]
    Alanceil · 2009-03-06 21:29:13 3
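
    A modern alternative sketch, assuming a util-linux losetup that supports partition scanning (reuses the entry's INFILE and MOUNTPT variables; the p1 suffix means the first partition):

    LOOP=$(sudo losetup --partscan --find --show "$INFILE"); sudo mount "${LOOP}p1" "$MOUNTPT"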

  • 7
    vim $( ls -t | head -n1 )
    salamando · 2009-03-11 00:07:49 1
  • bash.org is a collection of funny quotes from IRC. WARNING: some of the quotes contain "adult" jokes... may be embarrassing if your boss sees them... Thanks to Chen for the idea and initial version! This script downloads a page of random quotes, filters the HTML to keep just the one-liner quotes, and outputs the first one. Just barely under the required 255 chars :)

    Improvement: you can replace the head -1 at the end with:

    awk 'length($0)>0 {printf( $0 "\n%%\n" )}' > bash_quotes.txt

    which separates the quotes with a "%" and writes them to the file. Then:

    strfile bash_quotes.txt

    makes the file ready for the fortune command, so that:

    fortune bash_quotes.txt

    gives you a random quote from the downloaded file. I download a file periodically and call fortune from .bashrc, so I see a funny quote every time I open a terminal.


    7
    curl -s http://bash.org/?random1|grep -oE "<p class=\"quote\">.*</p>.*</p>"|grep -oE "<p class=\"qt.*?</p>"|sed -e 's/<\/p>/\n/g' -e 's/<p class=\"qt\">//g' -e 's/<p class=\"qt\">//g'|perl -ne 'use HTML::Entities;print decode_entities($_),"\n"'|head -1
    Iftah · 2009-05-07 13:13:21 6
  • A fancy command-line ncdu clone.


    7
    for i in `du --max-depth=1 $HOME | sort -n -r | awk '{print $1 ":" $2}'`; do size=`echo $i | awk -F: '{print $1}'`; dir=`echo $i | awk -F: '{print $NF}'`; size2=$(($size/1024)); echo "$size2 MB used by $dir"; done | head -n 10
    tuxifier · 2009-06-02 21:27:48 2
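
    With GNU coreutils, a shorter sketch gets similar output without the unit arithmetic:

    du -h --max-depth=1 "$HOME" | sort -hr | head -n 10
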
  • Order the files by modification time (thanks stanishjohnd), one file per output line, and output only the first 10.


    7
    ls -1t | head -n10
    wires · 2009-06-23 12:15:12 3
  • Next time you are leeching off of someone else's wifi, use this command before you start your bittorrent... for legitimate files only, of course. It creates a hexadecimal string using md5sum from the first few lines of /dev/urandom and splices it into the proper MAC address format. Then it changes your MAC and resets your wireless (wlan0:0).


    7
    ran=$(head /dev/urandom | md5sum); MAC=00:07:${ran:0:2}:${ran:3:2}:${ran:5:2}:${ran:7:2}; sudo ifconfig wlan0 down hw ether $MAC; sudo ifconfig wlan0 up; echo ifconfig wlan0:0
    workingsmart · 2009-07-16 16:21:44 2
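
    An iproute2 equivalent for the interface part (a sketch; the MAC shown is a hypothetical example, substitute the generated $MAC):

    sudo ip link set wlan0 down; sudo ip link set wlan0 address 00:07:12:34:56:78; sudo ip link set wlan0 up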

  • 7
    echo $(shuf -i 1-49 | head -n6 | sort -n)
    twfcc · 2009-10-22 06:48:20 0