
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands using ls - 448 results
ls | sed "/^/=" | sed "N;s/\n/. /"
ls | sed -n '1h;2,$H;${g;s/\n/,/g;p}'
2010-01-06 15:56:16
User: glaudiston
Functions: ls sed
0

Searching for a sed way to make a CSV, I found the solution from Mr. Stolz at http://funarg.nfshost.com/r2/notes/sed-return-comma.html

You can also use:

tr "\n" "," ;

But I was looking for a sed way =)
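
Another variant that seems to work, assuming a paste that accepts the combined -sd, form (GNU and most POSIX implementations do), joins the listing with commas in a single step:

ls | paste -sd, -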

ls -lah --color=always | most
2010-01-04 22:21:13
User: Code_Bleu
Functions: ls
-2

Even though --color is an option for 'ls', the listing will not display in color when you do 'ls -lah --color=always | less'. To get color output when piping a directory listing through a pager, replace less with most.

To install most if not installed, run:

sudo apt-get install most
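
If most is not available, a possible alternative (assuming your less supports the -R flag) is to let less pass the ANSI colour codes through:

ls -lah --color=always | less -R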

BACKUP_FILE_SIZE=`eval ls -l ${BACKUP_FILE} | awk {'print $5'}`; if [ $BACKUP_FILE_SIZE -le 20 ]; then echo "its empty"; else echo "its not empty"; fi
2009-12-29 08:34:37
User: Redrocket
Functions: awk echo ls
-2

If you gzip an empty file it becomes 20 bytes. Some backup checks I do test whether the file is greater than zero size (the -s flag), but that is no good here. I'm sure someone has a better check than this? Note that there is no check that the file exists before checking its size.
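
A minimal sketch of a variant that also checks the file exists first, assuming GNU stat and the same 20-byte threshold for an empty gzip:

if [ -f "$BACKUP_FILE" ] && [ "$(stat -c %s "$BACKUP_FILE")" -gt 20 ]; then echo "its not empty"; else echo "its missing or empty"; fi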

find dir -size -1024k -type f | xargs -d $'\n' -n1 ls -l | cut -d ' ' -f 5 | sed -e '2,$s/$/+/' -e '$ap' | dc
2009-12-28 04:23:01
User: zhangweiwu
Functions: cut dir find ls sed xargs
Tags: size sum
1

The command gives the combined size of all files smaller than 1024k. This information, together with disk usage, can help determine file system parameters (e.g. block size) or the choice of storage device (e.g. SSD vs. HDD).

Note that if you use awk instead of "cut | dc", you can easily breach the maximum number of records allowed in awk.
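
A hedged variant of the same idea, assuming GNU find, that prints the sizes directly instead of going through ls and cut:

find dir -size -1024k -type f -printf '%s\n' | sed -e '2,$s/$/+/' -e '$ap' | dc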

find . -name '*png' -printf '%h\0' | xargs -0 ls -l --hide=*.png | grep -ZB1 ' 0$'
sudo ls -RFal / | gzip > all_files_list.txt.gz
2009-12-14 21:40:56
User: roryokane
Functions: gzip ls sudo
2

This command is meant to be used to make a lightweight backup, for when you want to know which files might be missing or changed, but you don't care about their contents (because you have some way to recover them).

Explanation of parts:

"ls -RFal /" lists all files in and below the root directory, along with their permissions and some other metadata.

I think sudo is necessary to allow ls to read the metadata of certain files.

"| gzip" compresses the result, from 177 MB to 16 MB in my case.

"> all_files_list.txt.gz" saves the result to a file in the current directory called all_files_list.txt.gz. This name can be changed, of course.

dir=$(pwd); while [ ! -z "$dir" ]; do ls -ld "$dir"; dir=${dir%/*}; done; ls -ld /
2009-12-14 14:38:11
User: hfs
Functions: dir ls
2

Useful if a different user cannot access some directory and you want to know which directory along the way is missing the x bit.
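
If util-linux is available, namei gives a similar per-component view in one command:

namei -l "$(pwd)"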

perl -lne 'print for /url":"\K[^"]+/g' $(ls -t ~/.mozilla/firefox/*/sessionstore.js | sed q)
2009-12-14 00:51:54
User: sputnick
Functions: ls perl sed
0

If you want all the URLs from all the sessions, you can use:

perl -lne 'print for /url":"\K[^"]+/g' ~/.mozilla/firefox/*/sessionstore.js

Thanks to tybalt89 (the idea of the "for" statement).

For Perl purists, there are the JSON and File::Slurp modules, but those are not installed by default.

ls | xargs du -sh
grep -oP '"url":"\K[^"]+' $(ls -t ~/.mozilla/firefox/*/sessionstore.js | sed q)
2009-12-09 20:34:32
User: sputnick
Functions: grep ls sed
0

Require "grep -P" ( pcre ).

If you don't have grep -P, use that :

grep -Eo '"url":"[^"]+' $(ls -t ~/.mozilla/firefox/*/sessionstore.js | sed q) | cut -d'"' -f4
ls -l `locate your_search_here`
2009-11-27 05:53:46
User: tjcertified
Functions: ls
-5

This command lists extended information about files, i.e. whether each result is a regular file or a link, who owns it, etc., without having to run 'ls' from the specific directory. If you know the filename but not the location, this helps with finding other information about the file. It can be shortened by creating an alias for 'ls -l'. The sample output shows the difference between plain locate and ls + locate.
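
A hedged variant, assuming GNU xargs, that also copes with paths containing spaces:

locate your_search_here | xargs -d '\n' ls -l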

ls -la /dev/disk/by-id/usb-*
2009-11-25 16:02:06
User: casidiablo
Functions: ls
2

This command lists the names of your connected USB devices and which file in /dev each one is using. It's pretty useful if you don't have an automount option in your desktop, or don't have any graphical environment.
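
A hedged follow-up, assuming GNU readlink, that resolves each by-id name to the /dev node it points at:

for d in /dev/disk/by-id/usb-*; do echo "$d -> $(readlink -f "$d")"; done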

ls *[^p][^a][^t]* ; # or shopt -s extglob; ls !(*pattern*)
ls | grep -vi pattern
ls -F | sed -n 's/@$//p'
ls -l `ls -l |awk '/^l/ {print $8}'`
2009-11-23 16:02:18
User: yooreck
Functions: awk ls
-3

The output of ls -l may vary depending on the operating system, so "print $8" may have to be changed.
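
A hedged alternative that avoids parsing ls output at all, assuming the goal is a long listing of the symlinks in the current directory:

find . -maxdepth 1 -type l -exec ls -l {} +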

ls -d .*
ls -l | awk '$5 > 1000000' | sort -k5n
for f in $(ls *.xml.skippy); do mv $f `echo $f | sed 's|.skippy||'`; done
2009-11-19 21:36:26
User: argherna
Functions: ls mv sed
Tags: sed ls mv for
-2

For this example, all files in the current directory that end in '.xml.skippy' will have the '.skippy' removed from their names.
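
A hedged variant using only bash parameter expansion, which also handles file names containing spaces:

for f in *.xml.skippy; do mv "$f" "${f%.skippy}"; done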

video=$(ls /tmp | grep -e Flash\w*); ffmpeg -i /tmp/$video -f mp3 -ab 192k ~/ytaudio.mp3
function lsless() { ls "$@" | less; }
2009-11-13 17:28:06
User: argherna
Functions: ls
Tags: less ls function
-2

This is useful for paging through long directories, multiple directories, etc. I put this in my ~/.bash_aliases file and alias 'lsl' to it.
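
A hedged sketch of the corresponding ~/.bash_aliases entries, using the 'lsl' alias name mentioned above:

function lsless() { ls "$@" | less; }

alias lsl=lsless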

slice(){ cut -c$((${#1}+1))-; }; ls -l | slice "-rw-r--r--"
slice="-rw-r--r-- "; ls -l | cut -c $(echo "$slice" | wc -c)-
ls -al
2009-11-12 12:27:32
User: eastwind
Functions: ls
-11

It provides much more information: the owner, the group, the size in bytes, and the time each file or directory was last modified.

ls -al: list all files, including hidden ones, in long format