
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).


News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!
Terminal - All commands - 12,420 results
wbinfo - Get all users' group memberships, with primary group starred (Read description for full command)
2014-06-20 20:45:52
User: jaimerosario
Functions: users
Tags: samba wbinfo
0

###

for ADUSER in $(wbinfo -u --domain="$(wbinfo --own-domain)" | sort); do WBSEP=$(wbinfo --separator); ADUNAME=$(wbinfo -i "$ADUSER" | cut -d ":" -f5); UINFO=$(wbinfo -i "$ADUSER" | cut -d ":" -f4); SIDG=$(wbinfo -G "$UINFO"); GROUPID=$(wbinfo -s "$SIDG" | sed 's/.\{1\}$//' | cut -d "$WBSEP" -f2); echo -e "$ADUSER ($ADUNAME)\n$(printf '%.s-' {1..32})\n\t[*] $GROUPID"; for GID in $(wbinfo -r "$ADUSER"); do SID=$(wbinfo -G "$GID"); GROUP=$(wbinfo -s "$SID" | cut -d " " -f1,2); echo -e "\t[ ] $(echo -e "${GROUP/%?/}" | cut -d "$WBSEP" -f2)"; done | sed '1d'; echo -e "$(printf '%.s=' {1..32})\n"; done

###

echo "I am $BASH_SUBSHELL levels nested";
php -v
echo $(ifconfig) | egrep -o "en.*?inet [^ ]* " | sed 's/.*inet \(.*\)$/\1/' | tail -n +2
for file in *.pdf; do convert -verbose -colorspace RGB -resize 800 -interlace none -density 300 -quality 80 "$file" "${file//.pdf/.jpg}"; done
2014-06-19 15:52:42
User: malathion
Functions: file
Tags: pdf convert
3

Without the bashisms and unnecessary sed dependency. Substitutions quoted so that filenames with whitespace will be handled correctly.
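Strictly speaking, "${file//.pdf/.jpg}" is still a bash pattern substitution; a sketch of a fully POSIX-sh variant would use suffix removal instead:

for file in *.pdf; do convert -verbose -colorspace RGB -resize 800 -interlace none -density 300 -quality 80 "$file" "${file%.pdf}.jpg"; done  # ${file%.pdf} is POSIX parameter expansion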

function google { Q="$@";GOOG_URL='https://www.google.com/search?tbs=li:1&q=';AGENT="Mozilla/4.0";stream=$(curl -A "$AGENT" -skLm 10 "${GOOG_URL}${Q//\ /+}");echo "$stream" | grep -o "href=\"/url[^\&]*&" | sed 's/href=".url.q=\([^\&]*\).*/\1/';}
aws ec2 describe-instances --query 'Reservations[*].Instances[*].[InstanceId,LaunchTime]' --output text | sort -n -k 2
2014-06-16 21:51:51
User: hakamadare
Functions: sort
Tags: aws jq
2

You can do the filtering natively in the aws cli, without using jq (although jq is awesome!)
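For comparison, a sketch of the jq approach the description alludes to, producing the same two columns by filtering client-side:

aws ec2 describe-instances --output json | jq -r '.Reservations[].Instances[] | [.InstanceId, .LaunchTime] | @tsv' | sort -n -k 2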

env PS4=' ${BASH_SOURCE:-$0}:${LINENO}(${FUNCNAME[0]}) ' sh -x /etc/profile
wget -q -O "quote" https://www.goodreads.com/quotes_of_the_day;notify-send "$(echo "Quote of the Day";cat quote | grep '&ldquo;\|/author/show' | sed -e 's/<[a-zA-Z\/][^>]*>//g' | sed 's/&ldquo;//g' | sed 's/&rdquo;//g')"; rm -f quote
2014-06-15 03:17:19
User: nowhereman88
Functions: rm wget
0

Just pulls a quote for each day and displays it in a notification bubble...

or you can change it a bit and just have it run in the terminal

wget -q -O "quote" https://www.goodreads.com/quotes_of_the_day;echo "Quote of the Day";cat quote | grep '&ldquo;\|/author/show' | sed -e 's/<[a-zA-Z\/][^>]*>//g' | sed 's/&ldquo;//g' | sed 's/&rdquo;//g'; rm -f quote
find -type f -exec bash -c 'if ffmpeg -i "{}" 2>&1 | grep -qi h264 ; then echo "{}"; fi' \;
wget -r -P ./dl/ -A jpg,jpeg http://captivates.com
for file in ./data/message-snapshots/*.jpg; do cp "$file" /data/digitalcandy/ml/images/; done
2014-06-14 17:26:21
User: ferdous
Functions: cp file
Tags: cp ARG_MAX
0

Helpful when a glob expansion exceeds ARG_MAX and you see something like this:

zsh: argument list too long: cp
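The loop sidesteps the limit because the shell never hands the whole file list to cp at once. Another common workaround (a sketch, assuming GNU cp for the -t option) is to let find batch the arguments:

find ./data/message-snapshots -maxdepth 1 -name '*.jpg' -exec cp -t /data/digitalcandy/ml/images/ {} +  # -t is GNU cp; {} + batches files under ARG_MAX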

alias echourl="wget -qO -"
2014-06-14 00:23:07
User: Sepero
Functions: alias
Tags: wget
1

Directly send the content of a URL to standard out. This command is most convenient for sending the output of a download directly to another command.
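For example, to unpack a remote tarball without saving it to disk first (the URL is illustrative):

echourl http://example.com/archive.tar.gz | tar xz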

for DOMAIN in $(wbinfo -m); do WBSEP=$(wbinfo --separator); ADSERVER=$(wbinfo ... (Read description for full command)
2014-06-13 21:03:23
User: jaimerosario
0

###

for ADUSER in $(wbinfo -u --domain="$(wbinfo --own-domain)" | sort); do WBSEP=$(wbinfo --separator); ADUNAME=$(wbinfo -i "$ADUSER" | cut -d ":" -f5); UINFO=$(wbinfo -i "$ADUSER" | cut -d ":" -f3); GINFO=$(wbinfo -i "$ADUSER" | cut -d ":" -f4); SIDU=$(wbinfo -U "$UINFO"); SIDG=$(wbinfo -G "$GINFO"); USERID=$(wbinfo -s "$SIDU" | sed 's/.\{1\}$//' | cut -d "$WBSEP" -f2); GROUPID=$(wbinfo -s "$SIDG" | sed 's/.\{1\}$//' | cut -d "$WBSEP" -f2); echo -e "$ADUSER:$USERID:$ADUNAME:$GROUPID"; done | column -tx -s:

###

wakeonlan 00:00:DE:AD:BE:EF
2014-06-13 16:16:33
User: tyzbit
0

Wakes up a computer on your LAN with a Wake-on-LAN magic packet. The MAC address must match the NIC's MAC, and the target computer must have WOL enabled in its BIOS.
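To check whether the NIC on the target machine will honour the packet (a sketch, assuming a Linux target with ethtool; replace eth0 with your interface):

sudo ethtool eth0 | grep Wake-on  # "Wake-on: g" means magic-packet wake is active
sudo ethtool -s eth0 wol g        # enable magic-packet wake until the next reboot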

cvlc /path/to/file.avi -V caca
2014-06-13 16:10:36
User: tyzbit
Tags: video vlc
0

Use this command to watch video files in the terminal; the caca output module renders the video as colour ASCII art.

Prerequisite: VLC (the cvlc console wrapper ships with it, so there is no separate cvlc package):

sudo apt-get install vlc

^Z <...> % &
2014-06-13 15:35:52
User: Dev_NIX
Tags: job control
0

This lets you resume a suspended job in the background, so it no longer blocks your terminal.
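The same flow spelled out (a sketch; "% &" resumes the current job in the background, and bg is the classic equivalent):

sleep 100  # press Ctrl-Z while it runs; the shell reports stopped job [1]
bg %1      # resume job 1 in the background
jobs       # list jobs; fg %1 would bring it back to the foreground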

netstat -lptu | grep -E "22.*ESTABLISHED" | cut -s -d ':' -f2 | awk '{print $2}'
2014-06-13 08:38:16
User: DarkXDroid
Functions: awk cut grep netstat
0

Show whether someone is connected to the Android device on port 22 (SSH) and get their IP address.
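On newer systems where net-tools is deprecated, ss can express the same filter natively (a sketch):

ss -tn state established '( sport = :22 )'  # established TCP connections with local port 22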

rename 's/result_([0-9]+)_([0-9]+)_([0-9]+)\.json\.txt/sprintf("%d%02d%02d.txt",$3,$2,$1)/ge' result_*.txt
2014-06-13 07:34:32
User: sucotronic
Functions: rename
Tags: perl rename
0

Given a bunch of files named with the date in day_month_year order (e.g. result_31_12_2013.json.txt), it renames them to a sortable YYYYMMDD.txt format (e.g. 20131231.txt).
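This relies on the Perl rename (sometimes installed as prename), since the /e flag evaluates the sprintf. Its -n switch previews the renames without touching anything:

rename -n 's/result_([0-9]+)_([0-9]+)_([0-9]+)\.json\.txt/sprintf("%d%02d%02d.txt",$3,$2,$1)/ge' result_*.txt  # -n: dry run, print what would be renamed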

alias alive='(while true; do ping -c 4 192.168.1.1 > /dev/null 2>&1 ; sleep 300 ; done) &'
2014-06-13 06:13:57
User: DarkXDroid
Functions: alias ping sleep
0

Every time you run bash, this starts the ping loop in the background and keeps it running forever. It is useful on Android to avoid getting disconnected from ADB or other services like SSH after long periods of inactivity. In my case I get full bash support only through ADB, plus a decent Python interpreter using Python for Android.

youtube-dl -c -o "%(title)s" -f 18 https://www.youtube.com/watch?v=5qSCKUCjdKg
2014-06-12 23:31:55
User: tg3793
0

Before you use this command, replace everything after "https:" with the URL of the video you want to download. This invocation uses youtube-dl to download the YouTube video into the directory it is called from, naming the output file after the video's YouTube title.
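Format 18 is typically YouTube's 360p MP4 stream; to list the formats a given video actually offers before picking one, use -F (URL as in the example above):

youtube-dl -F https://www.youtube.com/watch?v=5qSCKUCjdKg  # prints available format codes for the -f switch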

watch -n10 --no-title "w3m http://www.livescore.com/ |egrep 'live [0-9H]+[^ ]'"
2014-06-12 21:44:26
Functions: watch
Tags: bash livescore
1

Live score of the ongoing World Cup match.

An alternative that shows the live score together with the match statistics:

watch -n10 --no-title "w3m http://www.livescore.com/ |awk '/live [0-9H]+[^ ]/,/red cards/'"
x11vnc -many -rfbauth ~/.vnc_passwd -forever -nevershared
x11vnc -storepasswd "password" ~/my_vnc_pass
mktemp!() { mktemp "$TMPDIR$1.XXXXXXXXXX"; }