Commands tagged function (131)

  • Defines a handy function for quick calculations from the command line. Once it is defined, you can type e.g. ? 10*2+3 (a short usage sketch follows this entry).


    58
    ? () { echo "$*" | bc -l; }
    fizz · 2009-06-28 20:15:30 27
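
    A hedged usage sketch, not part of the original entry; the results assume GNU bc and that no
    file in the current directory happens to match the unquoted expression as a glob:

      ? 10*2+3              # echoes "10*2+3" into bc -l and prints 23
      ? "scale=4; 7/3"      # quoting is safer for expressions containing * or ?; prints 2.3333
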
  • How often do you make a directory (or series of directories) and then change into it to do whatever? 99% of the time that is what I do. This Bash function 'md' makes the directory path and then immediately changes into the new directory. Thanks to the 'mkdir -p' switch, intermediate directories are created as well if they do not exist. (A variant that handles multiple arguments is sketched after this entry.)


    32
    md () { mkdir -p "$@" && cd "$@"; }
    drewk · 2009-09-24 16:09:19 20
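
    A hedged variant, my assumption rather than part of the original entry: with several
    arguments, cd "$@" fails because cd takes a single directory, so changing into the last
    argument keeps the multi-directory behaviour of mkdir -p intact:

      md() { mkdir -p "$@" && cd "${@: -1}"; }
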
  • This function displays the latest comic from xkcd.com. One of the best things about xkcd is the title text when you hover over the comic, so this function also displays that after you close the comic. To get a random xkcd comic, I also use the following: xkcdrandom(){ wget -qO- dynamic.xkcd.com/comic/random|tee >(feh $(grep -Po '(?<=")http://imgs[^/]+/comics/[^"]+\.\w{3}'))|grep -Po '(?<=(\w{3})" title=").*(?=" alt)';}


    24
    xkcd(){ wget -qO- http://xkcd.com/|tee >(feh $(grep -Po '(?<=")http://imgs[^/]+/comics/[^"]+\.\w{3}'))|grep -Po '(?<=(\w{3})" title=").*(?=" alt)';}
    eightmillion · 2009-11-27 09:11:47 22
  • This command lets you select from 10 different BBC stations. When one is chosen, it streams it with mplayer. Requires: mplayer with wma support.


    23
    bbcradio() { local s PS3="Select a station: ";select s in 1 1x 2 3 4 5 6 7 "Asian Network an" "Nations & Local lcl";do break;done;s=($s);mplayer -playlist "http://www.bbc.co.uk/radio/listen/live/r"${s[@]: -1}".asx";}
    eightmillion · 2009-11-14 08:17:03 29
  • This function takes a word or a phrase as arguments and then fetches definitions using Google's "define" syntax. The "nl" and perl portion isn't strictly necessary; it just makes the output a bit more readable, but this also works: define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Po '(?<=<li>)[^<]+';} If your version of grep doesn't have Perl-compatible regex support, then you can use this version: define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Eo '<li>[^<]+'|sed 's/<li>//g'|nl|perl -MHTML::Entities -pe 'decode_entities($_)' 2>/dev/null;}


    18
    define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Po '(?<=<li>)[^<]+'|nl|perl -MHTML::Entities -pe 'decode_entities($_)' 2>/dev/null;}
    eightmillion · 2010-01-29 05:01:11 18
  • A function for streaming YouTube to MPlayer. The "-g" option tells youtube-dl to output the direct video URL instead of downloading the video. "-fs" tells MPlayer to go fullscreen, and "-quiet" makes it less verbose. Requires: youtube-dl ( http://bitbucket.org/rg3/youtube-dl/ ). (Tested in zsh; a bash-compatible variant is sketched after this entry.)


    17
    yt () mplayer -fs -quiet $(youtube-dl -g "$1")
    elfreak · 2010-09-29 18:48:19 20
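
    A hedged bash-compatible sketch, my assumption rather than part of the original entry: the
    one-liner above uses the ksh/zsh form where a simple command can serve as the function body,
    which bash normally rejects. Quoting the command substitution assumes youtube-dl -g returns a
    single URL:

      yt() { mplayer -fs -quiet "$(youtube-dl -g "$1")"; }
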
  • Very useful in shell scripts because you can run a task nicely in the background using job control and output progress until it completes. Here's an example of how I use it in backup scripts to run gpg in the background to encrypt an archive file (which I create in this same way). $! is the process ID of the last run command, which is saved here as the variable PI; then sleeper is called with the process ID of the gpg task (PI) and is told to output ':' instead of the default '.' every 3 seconds instead of the default 1 second. So a shorter version would be just sleeper $!. The wait is also used here, though it may not be needed on your system. echo ">>> ENCRYPTING SQL BACKUP" gpg --output archive.tgz.asc --encrypt archive.tgz 1>/dev/null & PI=$!; sleeper $PI ":" 3; wait $PI && rm archive.tgz &>/dev/null (this example is reformatted as a sketch after the entry). Previously, to get around $! not always being available, I would instead check for the existence of the process ID by checking whether the directory /proc/$PID existed, but not everyone uses proc anymore. That version is currently the one at http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html but I plan on upgrading to this new version soon.


    14
    sleeper(){ while `ps -p $1 &>/dev/null`; do echo -n "${2:-.}"; sleep ${3:-1}; done; }; export -f sleeper
    AskApache · 2009-09-21 07:36:25 8
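
    The gpg example from the description above, reformatted here as a runnable sketch (filenames
    and gpg options are exactly as in the author's text):

      echo ">>> ENCRYPTING SQL BACKUP"
      gpg --output archive.tgz.asc --encrypt archive.tgz 1>/dev/null &
      PI=$!               # PID of the backgrounded gpg job
      sleeper $PI ":" 3   # print ':' every 3 seconds while gpg is still running
      wait $PI && rm archive.tgz &>/dev/null
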
  • ps and grep are a dangerous combination: grep tries to match everything on each line (hence the all-too-common grep -v grep hack). ps -C doesn't use grep; it uses the process table for an exact match. Thus, you'll get an accurate list with ps -fC sh rather than finding every process with sh somewhere on the line. (A short comparison is sketched after this entry.)


    14
    ps -fC PROCESSNAME
    pooderbill · 2015-04-20 13:09:44 17
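
    A hedged comparison sketch (sshd is just an example process name, not part of the original
    entry):

      ps -fC sshd            # full-format listing of processes whose name is exactly "sshd"
      ps -ef | grep [s]shd   # the usual grep workaround; the [s] trick keeps grep itself out
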
  • This is a command that I find myself using all the time. It works like regular grep, but returns the whole paragraph containing the search pattern instead of just the line. It operates on files or standard input: grepp <PATTERN> <FILE> or <SOMECOMMAND> | grepp <PATTERN> (a usage sketch follows this entry).


    13
    grepp() { [ $# -eq 1 ] && perl -00ne "print if /$1/i" || perl -00ne "print if /$1/i" < "$2";}
    eightmillion · 2010-01-12 04:30:15 13
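
    A hedged usage sketch (the file path is a placeholder; matching is always case-insensitive
    because of the /i in the perl one-liner):

      grepp ServerName /etc/apache2/apache2.conf   # print each paragraph containing "ServerName"
      dmesg | grepp usb                            # also works as a filter on standard input
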

  • 13
    compgen -A function
    patrick2000 · 2010-04-30 14:27:16 6
  • This Anti-TarBomb function makes it easy to unpack a .tar.gz without worrying about the possibility that it will "explode" in your current directory. I used to always create a temporary folder in which I extracted the tarball first, but I got tired of having to reorganize the files afterwards. Just add this function to your .zshrc / .bashrc and use it like this: atb arch1.tar.gz and it will create a folder for the extracted files, if they aren't already in a single folder. This only works for .tar.gz, but it's very easy to edit the function to suit your needs, if you want to extract .tgz, .tar.bz2 or just .tar. More info about tarbombs at http://www.linfo.org/tarbomb.html Tested in zsh and bash. UPDATE: This function works for .tar.gz, .tar.bz2, .tgz, .tbz and .tar in zsh (not working in bash): atb() { l=$(tar tf $1); if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep $(echo "$l" | head -n1) | wc -l) ]; then tar xf $1; else mkdir ${1%.t(ar.gz||ar.bz2||gz||bz||ar)} && tar xf $1 -C ${1%.t(ar.gz||ar.bz2||gz||bz||ar)}; fi ;} UPDATE 2: From the comments, bepaald came up with a variant that works for .tar.gz, .tar.bz2, .tgz, .tbz and .tar in bash (reformatted after this entry): atb() {shopt -s extglob ; l=$(tar tf $1); if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep $(echo "$l" | head -n1) | wc -l) ]; then tar xf $1; else mkdir ${1%.t@(ar.gz|ar.bz2|gz|bz|ar)} && tar xf $1 -C ${1%.t@(ar.gz|ar.bz2|gz|bz|ar)}; fi ; shopt -u extglob}


    10
    atb() { l=$(tar tf $1); if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep $(echo "$l" | head -n1) | wc -l) ]; then tar xf $1; else mkdir ${1%.tar.gz} && tar xf $1 -C ${1%.tar.gz}; fi ;}
    elfreak · 2010-10-16 05:50:32 8
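
    The bash variant from the comments (UPDATE 2), reformatted here only for readability; the
    logic is unchanged apart from whitespace, and it relies on extglob for the @(...) suffix
    pattern:

      atb() {
        shopt -s extglob
        l=$(tar tf $1)
        if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep $(echo "$l" | head -n1) | wc -l) ]; then
          tar xf $1
        else
          mkdir ${1%.t@(ar.gz|ar.bz2|gz|bz|ar)} && tar xf $1 -C ${1%.t@(ar.gz|ar.bz2|gz|bz|ar)}
        fi
        shopt -u extglob
      }
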
  • I love this function because it tells me everything I want to know about files, more than stat, more than ls. It's very useful and infinitely expandable. find $PWD -maxdepth 1 -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n' | sort -rgbS 50% gives output like: 00761 drwxrw---x askapache:askapache 777:666 [06/10/10 | 06/10/10 | 06/10/10] [d] /web/cg/tmp The key is the format string: -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n' which, believe it or not, took hundreds of tweaks before I was happy with the output. You can easily use this within a function to do whatever you want. This simple function works recursively if you call it with -r as an argument, and sorts by file permissions: lsl(){ O="-maxdepth 1";sed -n '/-r/!Q1'<<<$@ &&O=;find $PWD $O -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n'|sort -rgbS 50%; } Personally I'm using this function: lll () { local a KS="1 -r -g"; sed -n '/-sort=/!Q1' <<< $@ && KS=`sed 's/.*-sort=\(.*\)/\1/g'<<<$@`; find $PWD -maxdepth 1 -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n'|sort -k$KS -bS 50%; } # I can sort by user: lll -sort=3 # or sort by group, reversed: lll -sort=4 -r # and sort by modification time: lll -sort=6 If anyone wants to help me make this function handle multiple dirs/files like ls, go for it and I would appreciate it. Something very minimal would be awesome, maybe like: for a; do lll $a; done Note this uses the latest version of GNU find built from source; it is easy to build from the GNU FTP tarball. Taken from my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html


    8
    find $PWD -maxdepth 1 -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n'
    AskApache · 2010-06-10 22:03:08 8
  • Uses GNU Parallel to time the same DNS query against several public resolvers at once (a usage sketch follows this entry).


    8
    timeDNS() { parallel -j0 --tag dig @{} "$*" ::: 208.67.222.222 208.67.220.220 198.153.192.1 198.153.194.1 156.154.70.1 156.154.71.1 8.8.8.8 8.8.4.4 | grep Query | sort -nk5; }
    unixmonkey74668 · 2015-04-26 08:22:32 31
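
    A hedged usage sketch (the domain is a placeholder): the function tags each dig result with
    the resolver's address, keeps only the "Query time" lines, and sorts them numerically on the
    millisecond value (field 5), so the fastest resolver is listed first:

      timeDNS example.com
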
  • This bash function uses albumart.org to find the cover for an album. It returns an amazon.com URL to the image. Usage: albumart [artist] [album] The arguments can be reversed, and if the album name is distinct enough, it may be possible to omit the artist. The command can be extended with wget to automatically download the matching image like this: albumart(){ local x y="$@";x=$(awk '/View larger image/{gsub(/^.*largeImagePopup\(.|., .*$/,"");print;exit}' <(curl -s 'http://www.albumart.org/index.php?srchkey='${y// /+}'&itempage=1&newsearch=1&searchindex=Music'));[ -z "$x" ]&&echo "Not found."||wget "$x" -O "${y}.${x##*.}";}


    7
    albumart(){ local y="$@";awk '/View larger image/{gsub(/^.*largeImagePopup\(.|., .*$/,"");print;exit}' <(curl -s 'http://www.albumart.org/index.php?srchkey='${y// /+}'&itempage=1&newsearch=1&searchindex=Music');}
    eightmillion · 2009-11-15 19:54:16 11
  • This version works on a Mac (it avoids grep -P, adding a sed step instead, and invokes /usr/bin/perl with the full path in case you have another one installed). It still requires that you install the Perl module HTML::Entities; here's how: http://www.perlmonks.org/?node_id=640489


    7
    define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Eo '<li>[^<]+'|sed 's/^<li>//g'|nl|/usr/bin/perl -MHTML::Entities -pe 'decode_entities($_)';}
    gthb · 2010-01-30 13:08:03 14
  • This shell function grabs the weather forecast for the next 24 to 48 hours from wunderground.com. Replace <YOURZIPORLOCATION> with your zip code or your "city, state" or "city, country"; then calling the function without any arguments returns the weather for that location. Calling the function with a zip code or place name as an argument returns the weather for that location instead of your default (see the calls sketched after this entry). To add a bit of color formatting to the output, use the following instead: weather(){ curl -s "http://api.wunderground.com/auto/wui/geo/ForecastXML/index.xml?query=${@:-<YOURZIPORLOCATION>}"|perl -ne '/<title>([^<]+)/&&printf "\x1B[0;34m%s\x1B[0m: ",$1;/<fcttext>([^<]+)/&&print $1,"\n"';} Requires: perl, curl


    7
    weather(){ curl -s "http://api.wunderground.com/auto/wui/geo/ForecastXML/index.xml?query=${@:-<YOURZIPORLOCATION>}"|perl -ne '/<title>([^<]+)/&&printf "%s: ",$1;/<fcttext>([^<]+)/&&print $1,"\n"';}
    eightmillion · 2010-02-10 01:23:39 16
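
    Hypothetical calls, not from the original entry (the argument overrides whatever default
    location you hard-coded into the function):

      weather                  # forecast for the default <YOURZIPORLOCATION>
      weather 10001            # forecast for a ZIP code
      weather "Portland, OR"   # forecast for a named place
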

  • 7
    whatinstalled() { which "$@" | xargs -r readlink -f | xargs -r dpkg -S ;}
    pdxdoughnut · 2016-11-08 20:59:25 23
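
    A hedged example for Debian/Ubuntu systems (vi is just an example command): which finds the
    binary, readlink -f resolves any /etc/alternatives symlink chain, and dpkg -S reports the
    owning package in "package: path" form:

      whatinstalled vi
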
  • Usage: cat mod_log_config.c | shmore or shmore < mod_log_config.c. Most pagers like less, more, most, and others require additional processes to be loaded and additional CPU time, and, if that wasn't bad enough, most of them modify the output in ways that can be undesirable. What I wanted was a "more" pager that was basically the same as running cat file, without modifying the output and without additional processes being created, CPU used, etc. Normally, if I want to scroll the output of cat file without modifying the output, I have to scroll back my terminal or screen buffer, because less modifies the output. After looking over many examples, ranging from builtin cat functions created for csh, zsh, ksh, sh, and bash in the 80's and 90's to more recent examples shipped with bash 4, and after much trial and error, I finally came up with something that satisfied my objective. It automatically adjusts to the size of your terminal window by using the LINES variable (or 80 lines if that is empty). This is a great function that will work as long as your shell works, so it will work just fine if you are booted in single-user mode and your /usr/bin directory is missing (where less and other pagers can be). Using builtins like this is fantastic and is comparable to how busybox works; as long as your shell works, this will work. One caveat/note: I always have access to a color terminal, and I always set up both the termcap and the terminfo packages for color terminals (and/or ncurses and slang), so for that reason I stuck the tput setab 4; tput setaf 7 command at the beginning of the function, so it only runs one time, and that causes the -- SHMore -- prompt to have a blue background and bright white text. This is one of hundreds of functions I have in my .bash_profile at AskApache.com ( http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html ), but it actually won't be included till the next update. If you can improve this in any way at all, please let me know; I would be very grateful! (Like, one thing I want is to be able to continue to the next screen by pressing any key instead of having to press enter to continue.)


    6
    shmore(){ local l L M="`echo;tput setab 4&&tput setaf 7` --- SHMore --- `tput sgr0`";L=2;while read l;do echo "${l}";((L++));[[ "$L" == "${LINES:-80}" ]]&&{ L=2;read -p"$M" -u1;echo;};done;}
    AskApache · 2010-04-21 00:40:37 30

  • 6
    eog `curl -s http://xkcd.com/ | sed -n 's/<h3>Image URL.*: \(.*\)<\/h3>/\1/p'`
    bluesman · 2010-08-31 13:23:21 5
  • A bitcoin "brainwallet" is a secret passphrase you carry in your brain. The Bitcoin Brainwallet Private Key Base58 Encoder is the third of three functions needed to calculate a bitcoin PRIVATE key from your "brainwallet" passphrase. This base58 encoder uses the obase parameter of the amazing bc utility to convert from ASCII-hex to base58. Tech note: bc inserts line-continuation backslashes, but the "read -a s" command automatically strips them out. I hope that one day base58 will, like base64, be added to the amazing openssl utility. (A small usage example follows this entry.)


    6
    function b58encode () { local b58_lookup_table=({1..9} {A..H} {J..N} {P..Z} {a..k} {m..z}); bc<<<"obase=58;ibase=16;${1^^}"|(read -a s; for b58_index in "${s[@]}" ; do printf %s ${b58_lookup_table[ 10#"$b58_index" ]}; done); }
    nixnax · 2014-02-18 02:29:30 13
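
    A hedged usage example, not from the original entry; the argument is an ASCII-hex string
    (case does not matter because of the ${1^^} uppercasing):

      b58encode ff    # 0xFF = 255 = 4*58 + 23, i.e. base58 digits 4 and 23, so it prints 5Q
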
  • If you omit the function name, the command displays the definitions of all functions in the current shell (examples after this entry).


    5
    declare -f [ function_name ]
    haivu · 2009-10-22 17:52:47 6
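
    Hypothetical calls (my_func is a placeholder name):

      declare -f              # print every function definition known to the current shell
      declare -f my_func      # print just one; the exit status is non-zero if it is not defined
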

  • 5
    declare -F | cut -d ' ' -f 3
    wilmoore · 2010-05-29 08:19:47 4
  • A bitcoin "brainwallet" is a secret passphrase you carry in your brain. The Bitcoin Brainwallet Exponent Calculator is the second of three functions needed to calculate a bitcoin PRIVATE key. Roughly, the checksum is the first 8 hex digits of sha256(sha256(0x80+sha256(passphrase))). Note that this is a bash function, which means you have to type its name to invoke it.


    5
    function brainwallet_checksum () { (o='openssl sha256 -binary'; p='printf';($p %b "\x80";$p %s "$1"|$o)|$o|sha256sum|cut -b1-8); }
    nixnax · 2014-02-18 02:07:02 33
  • More recent versions of the date command can decode a Unix epoch timestamp into a human-readable date. This function makes that feature quick to use (example after this entry).


    4
    utime() { date -d @$1; }
    deltaray · 2010-05-12 12:21:15 7
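
    A hedged example (the exact wording of the output depends on your locale and TZ settings):

      utime 1234567890    # -> Fri Feb 13 23:31:30 UTC 2009
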
  • Just add this function to your .zshrc / .bashrc, and by typing "shout *URL*" you get a randomly chosen English word that ShoutKey.com uses to shorten your URL. You may then go to shoutkey.com/*output_word* and get redirected. The URL will be valid for 5 minutes. (I've never used sed before, so I'll be quite glad if someone could straighten up the sed commands and combine them (perhaps also removing the whitespace). If so, I'll update it right away ;) )


    4
    shout () { curl -s "http://shoutkey.com/new?url=$1" | sed -n 's/\<h1\>/\&/p' | sed 's/<[^>]*>//g;/</N;//b' ;}
    elfreak · 2010-10-04 23:50:54 3