
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).


News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Terminal - Commands using ls - 462 results
lsli() { ls -l --color "$@" | awk '{ for(i=9;i<NF;i++){ printf("%s ",$i) } printf("%s\n",$NF) }'; }
2010-02-23 15:05:28
User: quigybo
Functions: awk ls
2

Displays only the file names from the output of ls -l, stripping the permissions, ownership, size, and date columns. Pretty simple but useful.

ls -RAx | grep "svn:$" | sed -e "s/svn:/svn/" | xargs rm -fr
ls *.wav | while read f; do lame "$f" -o "$(echo $f | cut -d'.' -f1)".mp3; done;
ls *.jpg | grep -n "" | sed 's,.*,0000&,' | sed 's,0*\(...\):\(.*\).jpg,mv "\2.jpg" "image-\1.jpg",' | sh
ls . | xargs file | grep text | sed "s/\(.*\):.*/\1/" | xargs gedit
newest () { find ${1:-\.} -type f |xargs ls -lrt ; }
ls -l | sed -e 's/--x/1/g' -e 's/-w-/2/g' -e 's/-wx/3/g' -e 's/r--/4/g' -e 's/r-x/5/g' -e 's/rw-/6/g' -e 's/rwx/7/g' -e 's/---/0/g'
ls -t1 | sed 1d | parallel -X rm
2010-01-28 12:28:18
Functions: ls sed
-1

xargs deals badly with special characters (such as space, ' and "). To see the problem try this:

touch important_file

touch 'not important_file'

ls not* | xargs rm

Parallel https://savannah.nongnu.org/projects/parallel/ does not have this problem.
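
If GNU Parallel isn't installed, the usual safe route is NUL-delimited input; for instance, deleting only 'not important_file' from the example above:

find . -maxdepth 1 -name 'not*' -print0 | xargs -0 rm

find -print0 and xargs -0 pass each name as a single NUL-terminated record, so spaces and quotes can't split it.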

ls -Q * | xargs -p rm
2010-01-27 02:46:49
User: temp_reg
Functions: ls xargs
1

ls -Q shows the filenames in quotes, and xargs -p rm echoes the rm command built from those names and asks for confirmation before deleting the files.

Without the -Q switch, names containing spaces get split into separate arguments, so rm is handed the wrong paths.

ls /usr/bin | xargs whatis | grep -v nothing | less
2010-01-26 12:59:47
User: michelsberg
Functions: grep ls whatis xargs
12

No loop, only one call of grep, and the output is scrollable ("less is more", more or less...).

for i in $(ls /usr/bin); do whatis $i | grep -v nothing; done | more
find / \( -local -o -prune \) \( -perm -4000 -o -perm -2000 \) -type f -exec ls -l {} \;
xdg-open $(ls . | dmenu)
2010-01-08 17:16:28
User: matthewbauer
Functions: ls
1

Simple file browser with dmenu, ls, and xdg-open.
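
Quoting the command substitution keeps the chooser working for names with spaces; dmenu's -l flag simply makes the menu a vertical list:

xdg-open "$(ls . | dmenu -l 10)"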

ls / | sed -e :a -e 's/^.\{1,15\}$/&_/;ta'
2010-01-06 17:22:01
User: glaudiston
Functions: ls sed
0

The SQL functions lpad and rpad, implemented with sed: the substitution loop keeps appending a pad character until the line reaches 16 characters.

For lpad, invert the &_ to _&:

ls / | sed -e :a -e 's/^.\{1,15\}$/_&/;ta'
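
A quick sanity check of the rpad loop on a three-character input:

printf 'bin\n' | sed -e :a -e 's/^.\{1,15\}$/&_/;ta'

prints bin_____________ (padded to 16 characters with underscores).
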
ls | sed "/^/=" | sed "N;s/\n/. /"
ls | sed -n '1h;2,$H;${g;s/\n/,/g;p}'
2010-01-06 15:56:16
User: glaudiston
Functions: ls sed
0

Searching for a way to build a CSV with sed, I found this solution from Mr. Stolz at http://funarg.nfshost.com/r2/notes/sed-return-comma.html

You can also use:

tr "\n" "," ;

But I was looking for a sed way =)
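
A quick comparison on a three-line input: the sed version joins without a trailing comma, while tr converts every newline, including the last one:

printf 'a\nb\nc\n' | sed -n '1h;2,$H;${g;s/\n/,/g;p}'

printf 'a\nb\nc\n' | tr "\n" ","

The first prints a,b,c and the second prints a,b,c, (note the trailing comma).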

ls -lah --color=always | most
2010-01-04 22:21:13
User: Code_Bleu
Functions: ls
-2

Even though --color is an option for ls, the colors will not display when paging with 'ls -lah --color=always | less'. To page through a colorized directory listing, replace less with most.

To install most if not installed, run:

sudo apt-get install most
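
If you would rather keep less, telling it to pass ANSI color escapes through works too:

ls -lah --color=always | less -R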

BACKUP_FILE_SIZE=`eval ls -l ${BACKUP_FILE} | awk {'print $5'}`; if [ $BACKUP_FILE_SIZE -le 20 ]; then echo "its empty"; else echo "its not empty"; fi
2009-12-29 08:34:37
User: Redrocket
Functions: awk echo ls
-2

If you gzip an empty file it becomes 20 bytes. Some backup checks I do test whether the file is greater than zero size (the -s flag), but that is no good here. I'm sure someone has a better check than this? Note there is no check that the file exists before its size is read.
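
One possible tightening (a sketch using GNU stat; the 20-byte threshold is the empty-gzip size noted above) that also checks the file exists and avoids parsing ls:

if [ -f "$BACKUP_FILE" ] && [ "$(stat -c %s "$BACKUP_FILE")" -gt 20 ]; then echo "its not empty"; else echo "its empty or missing"; fi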

find dir -size -1024k -type f | xargs -d $'\n' -n1 ls -l | cut -d ' ' -f 5 | sed -e '2,$s/$/+/' -e '$ap' | dc
2009-12-28 04:23:01
User: zhangweiwu
Functions: cut dir find ls sed xargs
Tags: size sum
1

The command gives the combined size of all files smaller than 1024k; this information, together with disk usage, can help determine file system parameters (e.g. block size) or the choice of storage device (e.g. SSD vs. HDD).

Note that if you use awk instead of "cut | dc", you can easily breach the maximum allowed number of records in awk.
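
The sed/dc tail of the pipeline is just an RPN adder: sed appends + to every size after the first and a final p (print), so dc sums the column. A three-number illustration:

printf '10\n20\n30\n' | sed -e '2,$s/$/+/' -e '$ap' | dc

prints 60.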

find . -name '*png' -printf '%h\0' | xargs -0 ls -l --hide=*.png | grep -ZB1 ' 0$'
sudo ls -RFal / | gzip > all_files_list.txt.gz
2009-12-14 21:40:56
User: roryokane
Functions: gzip ls sudo
2

This command is meant to be used to make a lightweight backup, for when you want to know which files might be missing or changed, but you don't care about their contents (because you have some way to recover them).

Explanation of parts:

"ls -RFal /" lists all files in and below the root directory, along with their permissions and some other metadata.

I think sudo is necessary to allow ls to read the metadata of certain files.

"| gzip" compresses the result, from 177 MB to 16 MB in my case.

"> all_files_list.txt.gz" saves the result to a file in the current directory called all_files_list.txt.gz. This name can be changed, of course.

dir=$(pwd); while [ ! -z "$dir" ]; do ls -ld "$dir"; dir=${dir%/*}; done; ls -ld /
2009-12-14 14:38:11
User: hfs
Functions: dir ls
2

Useful if a different user cannot access some directory and you want to know which directory along the path is missing the x (execute/search) bit.
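
On systems with util-linux, namei performs the same walk in a single call (the path here is just an example):

namei -l /var/www/html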

perl -lne 'print for /url":"\K[^"]+/g' $(ls -t ~/.mozilla/firefox/*/sessionstore.js | sed q)
2009-12-14 00:51:54
User: sputnick
Functions: ls perl sed
0

If you want all the URLs from all the sessions, you can use :

perl -lne 'print for /url":"\K[^"]+/g' ~/.mozilla/firefox/*/sessionstore.js

Thanks to tybalt89 ( idea of the "for" statement ).

For Perl purists, there are the JSON and File::Slurp modules, but those aren't installed by default.

ls | xargs du -sh
grep -oP '"url":"\K[^"]+' $(ls -t ~/.mozilla/firefox/*/sessionstore.js | sed q)
2009-12-09 20:34:32
User: sputnick
Functions: grep ls sed
0

Require "grep -P" ( pcre ).

If you don't have grep -P, use that :

grep -Eo '"url":"[^"]+' $(ls -t ~/.mozilla/firefox/*/sessionstore.js | sed q) | cut -d'"' -f4
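
The \K in the PCRE pattern discards everything matched before it, so only the URL itself is printed. A self-contained check:

echo '{"url":"http://example.com/a"}' | grep -oP '"url":"\K[^"]+'

prints http://example.com/a.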