Commands using head (314)

  • On OS X you have to "sudo -s" your way to happiness, since the command throws a few "Permission denied" errors before finally spitting out the results. The path has to start at the /Users directory; the operation then runs recursively from there.

    sudo -s du -sm /Users/* | sort -nr | head -n 10
    mematron · 2012-09-13 10:15:23 4

  • echo $(</dev/urandom tr -dc 1-6 | head -c1)
    unixmonkey40000 · 2012-09-21 08:38:51 7
  • Generate an 18-character password from the character set a-zA-Z0-9 using /dev/urandom, and pipe it to Python, which prints the password on standard output both in plain text and in crypt SHA-512 form. (A Python 3 variant is sketched after this entry.)

    cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 18 | head -1 | python -c "import sys,crypt; stdin=sys.stdin.readline().rstrip('\n'); print stdin;print crypt.crypt(stdin)"
    cnyg · 2012-11-09 00:40:22 4
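
    The Python snippet above uses Python 2 print syntax. A rough Python 3 equivalent (a sketch, not part of the original submission; it assumes a Python version that still ships the crypt module and uses crypt.mksalt to request an explicit SHA-512 salt) would be:

    cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 18 | head -1 | python3 -c "import sys,crypt; s=sys.stdin.readline().rstrip('\n'); print(s); print(crypt.crypt(s, crypt.mksalt(crypt.METHOD_SHA512)))"
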
  • Replaces hexdump with the more succinct xxd; the sed in the original was unnecessarily complex.

    xxd -p /dev/urandom |fold -60|head -30|sed 's/\(..\)/\1 /g'
    psifertex · 2013-02-19 22:18:52 4

  • tr -dc 'A-Za-z0-9!@#$%^&*' < /dev/urandom | fold -w 12 | head -n 1
    opexxx · 2013-03-15 13:20:32 81
  • Interesting to see which packages are larger than the kernel package. Useful for working out which RPMs might be candidates for removal when drive space is restricted.

    rpm -qa --queryformat '%{size} %{name}-%{version}-%{release}\n' | sort -k 1,1 -rn | nl | head -16
    mpb · 2013-03-19 21:10:54 6

  • awk '{print $1}' ~/.bash_history | sort | uniq -c | sort -rn | head -n 10
    nesses · 2013-05-03 16:24:30 6
  • I'm not sure how reliable this command is, but it works for my needs. Here is a variant using grep: nslookup www.example.com | grep "^Address: " | awk '{print $2}' (a dig-based alternative is sketched after this entry).

    nslookup www.example.com | tail -2 | head -1 | awk '{print $2}'
    wsams · 2013-09-05 20:26:45 11
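
    If dig is available, a variant that avoids parsing nslookup's output entirely (an alternative sketch, not from the original submission; the grep keeps only IPv4 addresses in case the name resolves through a CNAME):

    dig +short www.example.com | grep -E '^[0-9.]+$' | head -n 1
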
  • Lists the matching files sorted by modification time (newest first), looking in the current directory and all subdirectories.

    find . -name '*pdf*' -print0 | xargs -0 ls -lt | head -20
    fuats · 2013-10-03 21:58:51 9

  • for i in `seq 1 4096`; do tr -dc A-Za-z0-9 </dev/urandom | head -c8192 > dummy$i.rnd; done
    BoxingOctopus · 2013-11-11 21:27:15 8
  • Using the 'time' command, running this with 'tr' took 28 seconds (and change) each time, but using base64 took only 8 seconds (and change). If the file doesn't have to be viewable, pulling straight from urandom with head took only 6 seconds (and change); that variant is sketched after this entry.

    for i in {1..4096}; do base64 /dev/urandom | head -c 8192 > dummy$i.rnd ; done
    pdxdoughnut · 2013-11-12 00:36:10 9
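
    A minimal sketch of the "straight from urandom" variant mentioned above (raw binary output, so the files are not viewable as text):

    for i in {1..4096}; do head -c 8192 /dev/urandom > dummy$i.rnd; done
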
  • Download the latest NVIDIA GeForce x64 Windows 7/8 driver from NVIDIA's website. Pulls the latest download version (which includes betas). This is the "English" version; the following variant includes a 'sed' expression to replace "english" with "international" if needed: wget "us.download.nvidia.com$(wget -qO- "$(wget -qO- "nvidia.com/Download/processFind.aspx?psid=95&pfid=695&osid=19&lid=1&lang=en-us" | awk '/driverResults.aspx/ {print $4}' | cut -d "'" -f2 | head -n 1)" | awk '/url=/ {print $2}' | sed -e "s/english/international/" | cut -d '=' -f3 | cut -d '&' -f1)" You can also replace the leading subdomain with "eu.", "uk." and others. Enjoy this one-liner! One character under the max :)

    wget "us.download.nvidia.com$(wget -qO- "$(wget -qO- "nvidia.com/Download/processFind.aspx?psid=95&pfid=695&osid=19&lid=1&lang=en-us"|awk '/driverResults.aspx/ {print $4}'|cut -d "'" -f2|head -n 1)"|awk '/url=/ {print $2}'|cut -d '=' -f3|cut -d '&' -f1)"
    lowjax · 2013-11-21 03:04:59 11
  • Specific to OS X.

    sysctl -a | grep boottime | head -n 1
    lgarron · 2014-01-24 13:03:48 7

  • git verify-pack -v .git/objects/pack/pack-*.idx | grep blob | sort -k3nr | head | while read s x b x; do git rev-list --all --objects | grep $s | awk '{print "'"$b"'",$0;}'; done
    qdrizh · 2014-06-25 07:37:24 6
  • Prints the IP address and the MAC address on the same line.

    ifconfig | head -n 2 | tr -d '\n' | sed -n 's/.*\(00:[^ ]*\).*\(adr:[^ ]*\).*/mac:\1 - \2/p'
    Koobiac · 2014-09-03 14:35:27 18
  • Top 30 commands from history, with a histogram display.

    history|awk '{print $2}'|sort|uniq -c|sort -rn|head -30|awk '!max{max=$1;}{r="";i=s=100*$1/max;while(i-->0)r=r"#";printf "%50s %5d %s %s",$2,$1,r,"\n";}'
    injez · 2014-09-29 12:40:43 9
  • This checks the system load every second and, if it's over a certain threshold (.8 in this example), spits out the date, the system loads and the top 4 processes sorted by CPU. Additionally, the \a in the first echo creates an audible bell.

    while sleep 1; do if [ $(echo "$(cat /proc/loadavg | cut -d' ' -f1) > .8 " | bc) -gt 0 ]; then echo -e "\n\a"$(date)" \e[5m"$(cat /proc/loadavg)"\e[0m"; ps aux --sort=-%cpu|head -n 5; fi; done
    tyzbit · 2014-12-08 15:44:40 8
  • Finds the date of the first commit in a git repository branch. (A shorter alternative is sketched after this entry.)

    git rev-list --all|tail -n1|xargs git show|grep -v diff|head -n1|cut -f1-3 -d' '
    binaryten · 2015-02-04 19:35:16 11
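
    A shorter alternative that asks git log for the oldest commit's date directly (an assumption about available git options, not part of the original submission):

    git log --reverse --format=%ad | head -n 1
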
  • I copied this (let's be honest) from somewhere on the internet and just wrapped it in a function, ready to be used as an alias. It shows the 10 most used commands from history. This seems to be just another "most used commands from history", but hey... this is a function! :D

    mosth() { history | awk '{CMD[$2]++;count++;}END { for (a in CMD)print CMD[a] " " CMD[a]/count*100 "% " a;}' | grep -v "./" | column -c3 -s " " -t | sort -nr | nl | head -n10; }
    nnsense · 2015-05-11 17:41:55 19
  • Useful for identifying the field number in big CSV files with a large number of fields. The index is the reference to use when processing the file with commands like 'cut' or 'awk' (see the example after this entry).

    head -1 file.csv | tr ',' '\n' | tr -d " " | awk '{print NR,$0}'
    neomefistox · 2015-08-26 05:46:15 18
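
    For example, if the listing shows the column of interest at index 7 (a made-up index for illustration), that field can then be extracted with:

    cut -d, -f7 file.csv | head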

  • curl -sL http://goo.gl/3sA3iW | head -16 | tail -14
    cadejscroggins · 2015-09-19 07:19:49 10
  • You might want to secure your AWS operations by requiring an MFA token. To use the API or CLI tools, you then need to pass credentials generated with that MFA token. This command asks you for the MFA code and retrieves temporary credentials using the AWS CLI. To print them as shell exports, pipe the output to: awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }' You must adapt the command line to include: $MFA_ID, the ARN of the virtual MFA device (or the serial number of a physical one), and the TTL for the credentials. (The combined pipeline is sketched after this entry.)

    head -n1 | xargs -I {} aws sts get-session-token --serial-number $MFA_ID --duration-seconds 900 --token-code {} --output text --query [Credentials.AccessKeyId,Credentials.SecretAccessKey,Credentials.SessionToken]
    keymon · 2016-04-12 10:57:00 46
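
    A sketch of the full pipeline, assuming $MFA_ID is set and using the awk snippet above to print ready-to-eval exports:

    head -n1 | xargs -I {} aws sts get-session-token --serial-number $MFA_ID --duration-seconds 900 --token-code {} --output text --query [Credentials.AccessKeyId,Credentials.SecretAccessKey,Credentials.SessionToken] | awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'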

  • du -a /var | sort -n -r | head -n 10
    zluyuer · 2016-05-27 04:05:08 12
  • sort -R randomizes the list; head -n1 takes the first entry. (A shuf-based alternative is sketched after this entry.)

    links `lynx -dump -listonly "http://news.google.com" | grep -Eo "(http|https)://[a-zA-Z0-9./?=_-]*" | grep -v "google.com" | sort -R | uniq | head -n1`
    mogoh · 2016-07-26 12:54:53 15
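
    With GNU coreutils, shuf -n 1 is a common stand-in for the sort -R | uniq | head -n1 step (an alternative sketch, not part of the original submission):

    links `lynx -dump -listonly "http://news.google.com" | grep -Eo "(http|https)://[a-zA-Z0-9./?=_-]*" | grep -v "google.com" | sort -u | shuf -n 1`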

  • find / -path /proc -prune -o -type f -printf '%TY-%Tm-%Td %TT %p\n' | sort -r | head -50
    sidneycrestani · 2016-11-20 02:45:01 13

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

Subscribe to the feed for: