Quicker way to search a command's man page for a keyword.
GNU grep lets you restrict the search to files whose names match a given pattern. It also lets you exclude files.
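For example, with GNU grep's --include and --exclude glob options (the directory and file names here are hypothetical):

```shell
# Search only .c files under src/, skipping test files.
# --include and --exclude take shell glob patterns (GNU grep).
grep -r --include='*.c' --exclude='*_test.c' 'TODO' src/
```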
If we want files with more than one extension, like .tar.gz, to be listed only under their last extension (.gz):
ls -Xp /path/to/dir | grep -Eo "\.[^./]+$" | uniq
Recursive find and replace. The important parts are grep -Z and xargs -0, which add a zero byte after each file name so sed works even with file names containing spaces.
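A minimal sketch of that pipeline (assuming GNU sed for in-place -i; "foo" and "bar" are placeholders):

```shell
# grep -lZ prints matching file names NUL-terminated; xargs -0
# splits on NUL, so names containing spaces survive intact.
grep -rlZ 'foo' . | xargs -0 sed -i 's/foo/bar/g'
```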
Multiline grep in Perl regexp syntax with pcregrep.
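A small sketch: pcregrep's -M flag lets a pattern span line boundaries (the pattern and input here are illustrative):

```shell
# Match "BEGIN" at the end of one line followed by "END" at the
# start of the next; -M enables multiline matching.
printf 'x BEGIN\nEND y\n' | pcregrep -M 'BEGIN\nEND'
```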
I used 110 as the port number in the examples for clarity. \< (or \b) marks the edge of a word.
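For instance, with grep's word-boundary anchor (illustrative input):

```shell
# \b anchors the match at word edges, so 110 matches as a
# standalone number but not inside 1100.
printf 'port 110\nport 1100\n' | grep '\b110\b'
```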
Context search (-A prints lines after each match, -B lines before), with VCS directories excluded.
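A sketch of such a search; in bash the repeated flags can be shortened to --exclude-dir={.git,.svn,CVS}, and 'pattern' is a placeholder:

```shell
# One line of context before (-B) and after (-A) each match,
# searching recursively but skipping VCS metadata directories.
grep -rn -B1 -A1 --exclude-dir=.git --exclude-dir=.svn --exclude-dir=CVS 'pattern' .
```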
Some source packages have many 'README'-style files among many other regular files and directories. This command can be useful when you want to list only the 'README'-style files in a jungle of other files. (I came across this situation after downloading the source for module-init-tools.) Warning: this command would miss a file like README.1 (or one with spaces in it). Corrections welcome.
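The entry's own command isn't reproduced in this snippet; as one illustrative alternative, find's case-insensitive -iname glob would also catch variants such as README.1:

```shell
# Case-insensitive match on names starting with "readme";
# unlike a strict README pattern, this also finds README.1.
find . -type f -iname 'readme*'
```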
Same as:
grep -lL "foo" $(grep -l bar *cl*.log)
Bash method to remove all files except "abc". In Zsh it would be 'rm *~abc'.
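A sketch of the Bash side, which relies on the extglob option (!(pattern) matches every name except the pattern):

```shell
# Delete every file in the current directory except "abc".
# -O extglob enables extended globbing before the command
# string is parsed, so !(abc) is recognized.
bash -O extglob -c 'rm !(abc)'
```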
Changed wget to curl, so it no longer creates a file.
yt2mp3(){ for j in `seq 1 301`;do i=`curl -s gdata.youtube.com/feeds/api/users/$1/uploads\?start-index=$j\&max-results=1|grep -o "watch[^&]*"`;ffmpeg -i `wget youtube.com/$i -qO-|grep -o 'url_map"[^,]*'|sed -n '1{s_.*|__;s_\\\__g;p}'` -vn -ab 128k "`youtube-dl -e ${i#*=}`.mp3";done;}
Squeezed the monster (and nifty ☺) command from 7776 down from 531 characters to 284, but I don't see a way to get it below 255. This is definitely a kludge!
Easier to remember.
Why use grep and awk?
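The point usually being that awk can filter on a regex itself, so the grep stage is redundant. An illustrative comparison (the input is made up):

```shell
# Instead of piping grep into awk...
printf 'alice 101\nbob 202\n' | grep 'bob' | awk '{print $2}'
# ...awk can match and print in one step:
printf 'alice 101\nbob 202\n' | awk '/bob/ {print $2}'
```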
If your version of curl does not support the --compressed option, use
curl -s http://funnyjunk.com | gunzip
instead of
curl -s --compressed http://funnyjunk.com
Vim 7 required.
Nothing special.
No final count, but clean and simple output.
Expand a URL, i.e. do a HEAD request and get the final URL, then copy that value to the clipboard.
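A hedged sketch with curl (the entry's exact command isn't shown here; -w '%{url_effective}' prints the final URL after following redirects, and the short URL is a placeholder):

```shell
# Follow redirects (-L), discard the body (-o /dev/null), and
# print only the final, expanded URL.
curl -sL -o /dev/null -w '%{url_effective}' 'http://example.com/short'
```

Pipe the output into xclip -selection clipboard (X11) or pbcopy (macOS) to put it on the clipboard.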
commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for: