Commands using head (314)

  • Get an approximation of who the workstation is assigned to. You can wrap it in su - "$(...)" if you want to log into a shell as that user.


    0
    last | grep -i console | grep -iv 'root' | cut -f 1 -d ' ' | sort | uniq -c | sort -nr | awk '{print $2}' | head -1
    phyxia · 2015-07-20 18:07:40 10
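    For example, to open a shell as that user (a sketch; assumes the pipeline returns a non-empty login name):
    su - "$(last | grep -i console | grep -iv 'root' | cut -f 1 -d ' ' | sort | uniq -c | sort -nr | awk '{print $2}' | head -1)"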
  • Useful when you need to generate a password or a random hash string. If you need a longer string, adjust the "head -c 20" parameter.


    0
    dd if=/dev/urandom bs=1k count=1 2>/dev/null|LC_CTYPE=C tr -dc 'abcdefghijklmnopqrstuvwxyz0123456789!@#%^&*(-_=+)'|head -c 20
    nitrogear · 2015-07-25 21:24:28 9
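    For a 32-character string, for example, raise the head limit:
    dd if=/dev/urandom bs=1k count=1 2>/dev/null|LC_CTYPE=C tr -dc 'abcdefghijklmnopqrstuvwxyz0123456789!@#%^&*(-_=+)'|head -c 32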

  • 0
    ls -1 /proc/$(ps ax | grep <Process Name> | head -n 1 | awk '{print $1;}')/task | tail -n +2
    happymarmoset · 2015-10-06 07:44:48 10

  • 0
    cut -f1 -d" " ~/.bash_history | sort | uniq -c | sort -nr | head -n 30
    kenorb · 2015-10-09 16:11:37 10
  • Quick way to get the URL of the most recent audio file out of a podcast XML feed without any fancy XML parsing tools. Just curl, grep and head.


    0
    mpc add `curl -s http://link.to/podcast/feed.xml | grep -o 'https*://[^"]*mp3' | head -1`
    tbon3r · 2015-10-11 09:17:35 9
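    To just print the URL instead of queueing it in mpc (the feed address is the placeholder from the entry above):
    curl -s http://link.to/podcast/feed.xml | grep -o 'https*://[^"]*mp3' | head -1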
  • This one includes special characters. Note that some characters may be disallowed on Windows systems; *nix will allow pretty much any character in a password except a carriage return. You do not want non-printing characters in your password, so this is limited to the printable characters on a keyboard, less space and return.


    0
    tr -dc '[:print:]' < /dev/urandom | fold -w10 |head -n1 |sed 's/ //g'
    wr250 · 2016-01-09 13:43:16 12
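    For a longer password, widen the fold (a sketch; the sed still strips any spaces, so the result can come out slightly shorter):
    tr -dc '[:print:]' < /dev/urandom | fold -w20 | head -n1 | sed 's/ //g'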
  • Returns a list, with attributes (think `ls -l`), in reverse chronological order. N is a single numeric parameter. Robust against unfriendly filenames and directory structures.


    0
    nmf() { find . -type f -printf '%T@ ' -print0 -printf '\n' | sort -rn | head -"$1" | cut -f2- -d" " | tr -d "\0" | tr "\n" "\0" | xargs -0 ls -Ulh; }
    incidentnormal · 2016-03-04 14:53:14 12
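    Usage example: list the 10 most recently modified files under the current directory:
    nmf 10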
  • Display the top processes sorted by memory usage. This is mostly useful because it's easy to remember and can give a quick 'top' view of a group of servers when used over pssh (though I'd recommend | head -10 to minimize the output).


    0
    top -b -o +%MEM |head -17
    dak1n1 · 2016-03-16 22:14:34 12
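    A sketch of the pssh usage mentioned above (hosts.txt is an assumed hosts file; -n 1 limits top to a single snapshot):
    pssh -h hosts.txt -i 'top -b -n 1 -o +%MEM | head -10'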
  • Grabs the first JSON file in the directory, reads its keys, and prints them as TSV, then prints all the JSON files' values as TSV rows. Nested objects appear as JSON. Unhappy times if your JSON has literal tabs in it.


    0
    jq -r 'keys | join("\t")' $(ls -f *.json | head -1) && jq -Sr 'to_entries | [ .[] | .value | tostring ] | join("\t")' *.json
    drjeats · 2016-04-08 23:30:30 12
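    To see the output shape, two hypothetical sample files like these:
    printf '{"name":"alpha","size":1}\n' > a.json
    printf '{"name":"beta","size":2}\n' > b.json
    produce a name<TAB>size header line followed by one tab-separated row of values per file.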
  • This command finds files matching the name pattern anywhere under / that were modified between two dates, then lists their metadata in long, human-readable format. The 't' flag on the ls command sorts the files by modification time. Finally, head -5 lists the first 5, which can be changed.


    0
    ls -laht `find / -name "*.*" -type f -newermt "2016-04-05" ! -newermt "2016-04-10"`|head -5
    ubercoo · 2016-04-19 14:26:23 10
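    A sketch of the same idea that tolerates spaces in filenames (note that ls may be invoked in batches if the match list is very long, so the time sort is only per batch):
    find / -name "*.*" -type f -newermt "2016-04-05" ! -newermt "2016-04-10" -exec ls -laht {} + | head -5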
  • Lists the most popular commands in your history, with the percentage of total use for each.


    0
    history | awk '{CMD[$2]++;count++;}END { for (a in CMD)print CMD[a] " " CMD[a]/count*100 "% " a;}' | grep -v "./" | column -c3 -s " " -t | sort -nr | nl | head -n10
    turrtle13 · 2016-04-24 17:21:35 9
  • This server can be accessed by a browser or by another remote terminal with ncat. The test $? -gt 128 && break is needed so Ctrl-C can close the loop.


    0
    while [ 1 ]; do cat /dev/urandom | tr -dc ' -~' | head -c 10 | ncat -l 8080 &> /dev/null; test $? -gt 128 && break; done
    xxjcaxx · 2016-05-04 14:36:47 10
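    To fetch one of the random strings from a client (a sketch; assumes the loop above is running on the same host; press Ctrl-C to close the connection):
    ncat localhost 8080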

  • 0
    cat access.log | awk '{print $1}' | sort -n | uniq -c | sort -nr | head -20
    prees · 2016-05-05 20:52:03 13
  • This is an alternate command I like to use instead of top or htop to see which processes are taking up the most memory on a system. It shows the username, process ID, CPU usage, memory usage, thread ID, number of threads associated with the parent process, resident set size, virtual memory size, start time of the process, and command arguments. It is then sorted by memory, with head showing the top 10. This can of course be changed to suit your needs. (I have a small system, which is why Firefox is taking so many resources.)


    0
    watch -n .8 'ps -eaLo uname,pid,pcpu,pmem,lwp,nlwp,rss,vsz,start_time,args --sort -pmem| head -10'
    ubercoo · 2016-05-11 01:05:53 11
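    The same view sorted by CPU instead of memory, for example:
    watch -n .8 'ps -eaLo uname,pid,pcpu,pmem,lwp,nlwp,rss,vsz,start_time,args --sort -pcpu | head -10'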

  • 0
    find /var -type f -exec du -h {} \; | sort -rh | head -10
    jiananmail · 2016-05-31 00:21:32 10
  • Finds the login id of the user that owns the console. I use it to reset my touchpad after resume from suspend in /etc/pm/sleep.d/s99local


    0
    who | grep :0 | head -1 | cut -d " " -f 1
    mikef5410 · 2016-06-22 17:33:38 12
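    A sketch of the suspend hook mentioned above (the touchpad-reset command is a placeholder; adapt it to your hardware and X session setup):
    #!/bin/sh
    # Sketch of /etc/pm/sleep.d/s99local
    case "$1" in
        resume|thaw)
            CONSOLE_USER=$(who | grep :0 | head -1 | cut -d " " -f 1)
            su "$CONSOLE_USER" -c "DISPLAY=:0 synclient TouchpadOff=0"   # placeholder touchpad reset
            ;;
    esac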

  • 0
    tr "\|\;" "\n" < ~/.bash_history | sed -e "s/^ //g" | cut -d " " -f 1 | sort | uniq -c | sort -rn | head -20
    turrtle13 · 2016-07-01 19:27:12 10

  • 0
    ps aux | awk '{if ($5 != 0 ) print $2,$5,$6,$11}' | sort -k2rn | head -10 | column -t
    turrtle13 · 2016-07-01 19:40:56 12
  • When bundle install sucks: this runs isuckat_ruby.rb and, when stderr matches "find gem '", gem installs whatever is missing.


    0
    gem install `ruby ./isuckat_ruby.rb 2>&1 | sed -e 's/.*find gem .//g' -e 's/ .*//g' | head -n 1`
    operat0r · 2016-08-03 19:41:27 13
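    A sketch that keeps installing missing gems until the script runs cleanly (it runs the script twice per iteration, and will loop forever if a gem cannot be installed):
    until ruby ./isuckat_ruby.rb > /dev/null 2>&1; do gem install `ruby ./isuckat_ruby.rb 2>&1 | sed -e 's/.*find gem .//g' -e 's/ .*//g' | head -n 1`; done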

  • 0
    last -x | grep shutdown | head -1
    creepyjones · 2016-08-10 21:51:48 12
  • Simulates simultaneous connections to a specific server address, for penetration testing.


    0
    for i in {0..60}; do (curl -Is http://46.101.214.181:10101 | head -n1 &) 2>/dev/null; sleep 1; done;
    aysadk · 2017-01-15 14:32:02 16
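    To fire the 60 requests concurrently instead of one per second, a sketch using GNU xargs:
    seq 1 60 | xargs -P 60 -I{} curl -Is -o /dev/null http://46.101.214.181:10101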

  • 0
    for i in {0..60}; do (curl -Is http://<domain/ip> | head -n1 &) 2>/dev/null; sleep 1; done;
    aysadk · 2017-01-24 02:47:13 13
  • To allow recursion: find -type f -exec md5sum '{}' ';' | sort | uniq -c -w 33 | sort -gr | head -n 5 | cut -c1-7,41-
    To display only filenames: find -maxdepth 1 -type f -exec md5sum '{}' ';' | sort | uniq -c -w 33 | sort -gr | head -n 5 | cut -c43-


    0
    find -maxdepth 1 -type f -exec md5sum '{}' ';' | sort | uniq -c -w 33 | sort -gr | head -n 5 | cut -c1-7,41-
    MaDCOw · 2017-02-09 11:36:31 18

  • 0
    head filename.txt | column -t -s $'\t'
    NLKNguyen · 2017-02-10 23:26:10 20
  • alex@alex-box:~$ sl
    The program 'sl' is currently not installed. You can install it by typing:
    sudo apt-get install sl
    alex@alex-box:~$ dolast
    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    The following NEW packages will be installed:
      sl
    0 upgraded, 1 newly installed, 0 to remove and 0 not upgraded.


    0
    alias dolast='$( $(history 2| head -n 1| sed "s/.* //") 2>&1 | tail -n 1)'
    aheinous · 2017-02-25 01:26:49 19