Commands using head (314)

  • On OSX you will want to "sudo -s" your way to happiness, otherwise the command throws a few "Permission denied" errors before finally spitting out the results. Point it at the /Users directory and it sums each home directory recursively, listing the ten largest in megabytes (a human-readable variant is sketched after the command).

    sudo -s du -sm /Users/* | sort -nr | head -n 10
    mematron · 2012-09-13 10:15:23 4
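
    A human-readable variant of the same idea (a sketch, assuming a sort that understands -h, i.e. GNU coreutils or a recent BSD sort):

      sudo du -sh /Users/* | sort -rh | head -n 10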

  • echo $(</dev/urandom tr -dc 1-6 | head -c1)
    unixmonkey40000 · 2012-09-21 08:38:51 7
  • Generate an 18-character password from the character set a-zA-Z0-9 using /dev/urandom, then pipe it to Python, which prints the password on standard out and in crypt SHA-512 form (a Python 3 sketch follows the command).

    cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 18 | head -1 | python -c "import sys,crypt; stdin=sys.stdin.readline().rstrip('\n'); print stdin;print crypt.crypt(stdin)"
    cnyg · 2012-11-09 00:40:22 4
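
    A rough Python 3 sketch of the same pipeline, assuming the standard crypt module is still available (it was removed in Python 3.13) and that crypt.METHOD_SHA512 is supported on the platform:

      cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 18 | head -1 | python3 -c "import sys,crypt; pw=sys.stdin.readline().rstrip('\n'); print(pw); print(crypt.crypt(pw, crypt.METHOD_SHA512))"
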
  • Replaces hexdump with the more succinct xxd; the sed in the original was unnecessarily complex.

    xxd -p /dev/urandom |fold -60|head -30|sed 's/\(..\)/\1 /g'
    psifertex · 2013-02-19 22:18:52 4

  • tr -dc 'A-Za-z0-9!@#$%^&*' < /dev/urandom | fold -w 12 | head -n 1
    opexxx · 2013-03-15 13:20:32 81
  • Interesting to see which packages are larger than the kernel package. Useful for spotting RPMs that might be candidates for removal when drive space is tight (a follow-up query is sketched after the command).

    rpm -qa --queryformat '%{size} %{name}-%{version}-%{release}\n' | sort -k 1,1 -rn | nl | head -16
    mpb · 2013-03-19 21:10:54 6
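
    To put the sizes in context, the kernel package itself can be queried the same way (the package name 'kernel' is the usual one on Red Hat-style systems, but may differ):

      rpm -q --queryformat '%{size} %{name}-%{version}-%{release}\n' kernel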

  • awk '{print $1}' ~/.bash_history | sort | uniq -c | sort -rn | head -n 10
    nesses · 2013-05-03 16:24:30 6
  • I'm not sure how reliable this command is, but it works for my needs. Here is a variant using grep: nslookup www.example.com | grep "^Address: " | awk '{print $2}'

    nslookup www.example.com | tail -2 | head -1 | awk '{print $2}'
    wsams · 2013-09-05 20:26:45 11
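
    If dig is installed, a variant that is less sensitive to nslookup's output layout (note this swaps in a different tool rather than fixing the original):

      dig +short www.example.com | head -n 1
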
  • Sorts PDF files by latest modification time, looking at the current directory and all subdirectories (a GNU find variant is sketched after the command).

    find . -name '*pdf*' -print0 | xargs -0 ls -lt | head -20
    fuats · 2013-10-03 21:58:51 9
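
    With many matches, xargs may split the list across several ls invocations and the overall ordering is no longer guaranteed. A sketch that avoids the problem, assuming GNU find's -printf:

      find . -name '*pdf*' -printf '%T@ %p\n' | sort -rn | head -20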

  • for i in `seq 1 4096`; do tr -dc A-Za-z0-9 </dev/urandom | head -c8192 > dummy$i.rnd; done
    BoxingOctopus · 2013-11-11 21:27:15 8
  • Measured with the 'time' command: running this with 'tr' took 28 seconds (and change) each time, while using base64 took only 8 seconds (and change). If the files don't have to be viewable, pulling straight from urandom with head took only 6 seconds (and change); see the sketch after the command.

    for i in {1..4096}; do base64 /dev/urandom | head -c 8192 > dummy$i.rnd ; done
    pdxdoughnut · 2013-11-12 00:36:10 9
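
    The "straight from urandom" timing mentioned above presumably refers to something like this (raw binary data rather than printable text):

      for i in {1..4096}; do head -c 8192 /dev/urandom > dummy$i.rnd; done
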
  • Download the latest NVIDIA GeForce x64 Windows 7-8 driver from NVIDIA's website. Pulls the latest download version (which includes beta). This is the "English" version; the following variant adds a 'sed' step to replace "english" with "international" if needed. You can also replace the leading subdomain with "eu.", "uk." and others. Enjoy this one-liner, one character under the max :) wget "us.download.nvidia.com$(wget -qO- "$(wget -qO- "nvidia.com/Download/processFind.aspx?psid=95&pfid=695&osid=19&lid=1&lang=en-us" | awk '/driverResults.aspx/ {print $4}' | cut -d "'" -f2 | head -n 1)" | awk '/url=/ {print $2}' | sed -e "s/english/international/" | cut -d '=' -f3 | cut -d '&' -f1)"

    wget "us.download.nvidia.com$(wget -qO- "$(wget -qO- "nvidia.com/Download/processFind.aspx?psid=95&pfid=695&osid=19&lid=1&lang=en-us"|awk '/driverResults.aspx/ {print $4}'|cut -d "'" -f2|head -n 1)"|awk '/url=/ {print $2}'|cut -d '=' -f3|cut -d '&' -f1)"
    lowjax · 2013-11-21 03:04:59 11
  • Specific to OSX (a conversion sketch follows the command).

    sysctl -a | grep boottime | head -n 1
    lgarron · 2014-01-24 13:03:48 7
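
    kern.boottime holds the boot time as epoch seconds; a sketch for turning it into a readable date with BSD date -r (the awk field position assumes the usual "{ sec = NNN, usec = 0 } ..." output format):

      date -r "$(sysctl -n kern.boottime | awk -F'[ ,]+' '{print $4}')"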

  • git verify-pack -v .git/objects/pack/pack-*.idx | grep blob | sort -k3nr | head | while read s x b x; do git rev-list --all --objects | grep $s | awk '{print "'"$b"'",$0;}'; done
    qdrizh · 2014-06-25 07:37:24 6
  • Print the IP address and the MAC address on the same line (a locale-independent sketch follows the command).

    ifconfig | head -n 2 | tr -d '\n' | sed -n 's/.*\(00:[^ ]*\).*\(adr:[^ ]*\).*/mac:\1 - \2/p'
    Koobiac · 2014-09-03 14:35:27 18
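
    The sed pattern above depends on a localized ifconfig layout (the "adr:" label). A locale-independent sketch using iproute2 and sysfs; the interface name eth0 is an assumption:

      echo "mac:$(cat /sys/class/net/eth0/address) - adr:$(ip -o -4 addr show eth0 | awk '{print $4}')"
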
  • Top 30 commands from history, with a histogram display.

    history|awk '{print $2}'|sort|uniq -c|sort -rn|head -30|awk '!max{max=$1;}{r="";i=s=100*$1/max;while(i-->0)r=r"#";printf "%50s %5d %s %s",$2,$1,r,"\n";}'
    injez · 2014-09-29 12:40:43 9
  • This checks the system load every second and, if it is over a certain threshold (.8 in this example), spits out the date, the load averages and the top 4 processes sorted by CPU. Additionally, the \a in the first echo creates an audible bell. An unrolled version is sketched after the command.

    while sleep 1; do if [ $(echo "$(cat /proc/loadavg | cut -d' ' -f1) > .8 " | bc) -gt 0 ]; then echo -e "\n\a"$(date)" \e[5m"$(cat /proc/loadavg)"\e[0m"; ps aux --sort=-%cpu|head -n 5; fi; done
    tyzbit · 2014-12-08 15:44:40 8
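
    The same loop, unrolled for readability (a functionally equivalent sketch of the one-liner above):

      while sleep 1; do
          load=$(cut -d' ' -f1 /proc/loadavg)
          if [ "$(echo "$load > .8" | bc)" -gt 0 ]; then
              echo -e "\n\a$(date) \e[5m$(cat /proc/loadavg)\e[0m"   # bell plus blinking load averages
              ps aux --sort=-%cpu | head -n 5                        # header plus the top 4 CPU consumers
          fi
      done
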
  • Finds the date of the first commit in a git repository branch (an alternative using git log is sketched after the command).

    git rev-list --all|tail -n1|xargs git show|grep -v diff|head -n1|cut -f1-3 -d' '
    binaryten · 2015-02-04 19:35:16 11
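
    An alternative sketch that asks git log for the oldest commit date directly (assumes the standard --reverse flag and %ad placeholder):

      git log --all --reverse --format='%ad' --date=short | head -n 1
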
  • I copied this (let's be honest) from somewhere on the internet and just wrapped it in a function, ready to be used as an alias. It shows the 10 most used commands from history. This seems to be just another "most used commands from history", but hey.. this is a function!!! :D

    mosth() { history | awk '{CMD[$2]++;count++;}END { for (a in CMD)print CMD[a] " " CMD[a]/count*100 "% " a;}' | grep -v "./" | column -c3 -s " " -t | sort -nr | nl | head -n10; }
    nnsense · 2015-05-11 17:41:55 19
  • Useful for identifying field numbers in big CSV files with a large number of fields. The printed index is the reference to use when processing with commands like 'cut' or 'awk' (see the example after the command).

    head -1 file.csv | tr ',' '\n' | tr -d " " | awk '{print NR,$0}'
    neomefistox · 2015-08-26 05:46:15 18
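
    Typical follow-up: once the listing shows which field you need (say it turned out to be field 5; that column number is only an example), feed the index to cut:

      cut -d, -f5 file.csv | head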

  • curl -sL http://goo.gl/3sA3iW | head -16 | tail -14
    cadejscroggins · 2015-09-19 07:19:49 10
  • You might want to secure your AWS operations by requiring an MFA token. But then, to use the API or other tools, you need to pass credentials generated with that MFA token. This command asks you for the MFA code and retrieves temporary credentials using the AWS CLI. To print the exports, you can pipe the output through: awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }' You must adapt the command line to set $MFA_ID (the ARN of the virtual MFA device, or the serial number of the physical one) and the TTL for the credentials. The combined form is sketched after the command.

    head -n1 | xargs -I {} aws sts get-session-token --serial-number $MFA_ID --duration-seconds 900 --token-code {} --output text --query [Credentials.AccessKeyId,Credentials.SecretAccessKey,Credentials.SessionToken]
    keymon · 2016-04-12 10:57:00 45
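
    Putting the pieces from the description together: read the MFA code from stdin, fetch temporary credentials, and print export statements ready to eval in your shell. $MFA_ID and the 900-second TTL are placeholders you must adapt, as noted above:

      head -n1 | xargs -I {} aws sts get-session-token --serial-number "$MFA_ID" --duration-seconds 900 \
          --token-code {} --output text \
          --query '[Credentials.AccessKeyId,Credentials.SecretAccessKey,Credentials.SessionToken]' \
        | awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\nexport AWS_SECRET_ACCESS_KEY=\"" $2 "\"\nexport AWS_SESSION_TOKEN=\"" $3 "\"" }'

    Run it, type the MFA code, then eval the printed exports in the current shell.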

  • du -a /var | sort -n -r | head -n 10
    zluyuer · 2016-05-27 04:05:08 12
  • sort -R randomizes the list; head -n1 takes the first entry.

    links `lynx -dump -listonly "http://news.google.com" | grep -Eo "(http|https)://[a-zA-Z0-9./?=_-]*" | grep -v "google.com" | sort -R | uniq | head -n1`
    mogoh · 2016-07-26 12:54:53 15

  • find / -path /proc -prune -o -type f -printf '%TY-%Tm-%Td %TT %p\n' | sort -r | head -50
    sidneycrestani · 2016-11-20 02:45:01 13