Commands using ls (517)


  • 1
    find / -perm +6000 -type f -exec ls -ld {} \;
    aguslr · 2011-10-14 22:19:58 5
  • This lists the files in a directory and zips each one individually under its original filename: video1.wmv -> video1.zip, video2.wmv -> video2.zip. It was used to zip up large numbers of video files for upload from a Windows machine. (A loop-based variant that also handles spaces in filenames is sketched after this entry.)


    1
    ls -1 | awk ' { print "zip "$1".zip " $1 } ' | sh
    kaywhydub · 2011-12-14 20:30:56 6
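    Piping ls through awk into sh breaks on filenames containing spaces or shell metacharacters. A minimal loop-based sketch with the same effect, assuming the .wmv files from the description (adjust the glob as needed):

    for f in *.wmv; do zip "${f%.*}.zip" "$f"; done

    Here ${f%.*} strips the extension, so video1.wmv becomes video1.zip, matching the original behaviour.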

  • 1
    ls -d $PWD/*
    putnamhill · 2011-12-16 19:12:55 3

  • 1
    ls -d1 $PWD/{.*,*}
    bunam · 2011-12-17 12:25:15 2

  • 1
    ls -d1 $PWD/*
    www · 2011-12-31 14:46:41 3

  • 1
    find <directory> -type f -printf "%T@\t%p\n"|sort -n|cut -f2|xargs ls -lrt
    rik · 2012-03-02 12:51:06 3
  • Here's an annotated version of the command, using full names instead of aliases. It is exactly equivalent to the short-hand version.

        # Recursively list all the files in the current directory.
        Get-ChildItem -Recurse |
            # Filter out the sub-directories themselves.
            Where-Object { return -not $_.PsIsContainer; } |
            # Group the resulting files by their extensions.
            Group-Object Extension |
            # Pluck the Name and Count properties of each group and define
            # a custom expression that calculates the average of the sizes
            # of the files in that group.
            # The back-tick is a line-continuation character.
            Select-Object `
                Name,
                Count,
                @{
                    Name = 'Average';
                    Expression = {
                        # Average the Length (sizes) of the files in the current group.
                        return ($_.Group | Measure-Object -Average Length).Average;
                    }
                } |
            # Format the results in a tabular view, automatically adjusted to
            # widths of the values in the columns.
            Format-Table -AutoSize `
                @{
                    # Rename the Name property to something more sensible.
                    Name = 'Extension';
                    Expression = { return $_.Name; }
                },
                Count,
                @{
                    # Format the Average property to display KB instead of bytes
                    # and use a formatting string to show it rounded to two decimals.
                    Name = 'Average Size (KB)';
                    # The "1KB" is a built-in constant which is equal to 1024.
                    Expression = { return $_.Average / 1KB };
                    FormatString = '{0:N2}'
                }


    1
    ls -r | ?{-not $_.psiscontainer} | group extension | select name, count, @{n='average'; e={($_.group | measure -a length).average}} | ft -a @{n='Extension'; e={$_.name}}, count, @{n='Average Size (KB)'; e={$_.average/1kb}; f='{0:N2}'}
    brianpeiris · 2012-03-13 17:58:10 9
  • This will generate the same output without changing the current directory; the file named filepath is written relative to the current directory. Note: it will (still) fail if your iTunes library is in a non-standard location (see the quoting note after this entry).


    1
    ls "~/Music/iTunes/iTunes Media/Mobile Applications" > filepath
    minnmass · 2012-05-04 09:51:59 50
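    As written, the tilde sits inside double quotes, so bash will not expand it and the command looks for a literal directory named ~. A working variant keeps the tilde outside the quotes (still assuming the stock iTunes library location):

    ls ~/"Music/iTunes/iTunes Media/Mobile Applications" > filepath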
  • Nothing too magical here: it just runs pngcrush to losslessly compress all of your PNGs. (A space-safe variant is sketched after this entry.)


    1
    ls *.png | while read line; do pngcrush -brute $line compressed/$line; done
    waffleboi9 · 2012-07-17 20:20:49 5
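    The unquoted $line breaks on filenames with spaces, and pngcrush expects the compressed/ output directory to already exist. A rough, more robust sketch:

    mkdir -p compressed && for f in *.png; do pngcrush -brute "$f" "compressed/$f"; done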
  • Sometimes I would like to see hidden files (those prefixed with a period), but there are some files or folders I never want to see (and really wish I could just remove altogether). A variant that resets the option list on each invocation is sketched after this entry.


    1
    alias ls='if [[ -f .hidden ]]; then while read l; do opts+=(--hide="$l"); done < .hidden; fi; ls --color=auto "${opts[@]}"'
    expelledboy · 2012-08-12 13:10:23 5
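    Because opts is never reset, the --hide patterns accumulate across every directory visited in the same shell session. A sketch that clears the array on each run and reads lines verbatim (assumes GNU ls for --hide):

    alias ls='opts=(); if [[ -f .hidden ]]; then while read -r l; do opts+=(--hide="$l"); done < .hidden; fi; command ls --color=auto "${opts[@]}"'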

  • 1
    ln -s /base/* /target && ls -l /target
    mattcen · 2012-08-22 11:27:40 5
  • Show the UUID-based alternate device names of ZEVO-related partitions on Darwin/OS X. Adapted from the lines by dbrady at http://zevo.getgreenbytes.com/forum/viewtopic.php?p=700#p700 and following the disk device naming scheme at http://zevo.getgreenbytes.com/wiki/pmwiki.php?n=Site.DiskDeviceNames


    1
    ls /dev/disk* | xargs -n 1 -t sudo zdb -l | grep GPTE_
    grahamperrin · 2012-10-06 20:19:45 5
  • A substitute for #11720. Can probably be made even shorter and simpler.


    1
    ls -l /dev/disk/by-id/ | grep '/sda$' | grep -o 'ata[^ ]*'
    michelsberg · 2013-01-16 17:28:11 7
  • I find it useful, when deleting unwanted files to free up space, to list them in size order so I can delete the largest first. Note that "q" shows files with non-printing characters in their names. In the sample output, I found two copies of the same iso file, both of which were immediate "delete candidates" for me. A companion du pass for subdirectories is sketched after this entry.


    1
    ls -qahlSr # list all files in size order - largest last
    mpb · 2013-03-13 09:52:07 29
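    If the space hogs are buried in subdirectories rather than sitting in the current directory, a similar pass with du can surface them (assumes GNU sort for -h):

    du -ah . | sort -h | tail -20   # 20 largest files and directories under the current tree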
  • zsh: list files larger than 100 MB, sorted by size, and show the top 5. '**/*' is recursive, and the glob qualifiers are '.' = regular file and 'L' = size, followed by 'm' = megabytes and '+100' = more than 100. A qualifier-only variant is sketched after this entry.


    1
    ls -Sh **/*(.Lm+100) | tail -5
    khayyam · 2013-03-21 20:22:11 4
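    The glob qualifiers can also do the sorting themselves, which avoids handing an over-long argument list to ls. A zsh-only sketch (OL orders matches by size, largest first; it prints names only, not sizes):

    print -rl -- **/*(.Lm+100OL) | head -5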
  • Makes this usable on OS X with filenames containing spaces. Note: it will still break if filenames contain newlines... possible, but who does that?!


    1
    svn ls -R | egrep -v -e "\/$" | tr '\n' '\0' | xargs -0 svn blame | awk '{print $2}' | sort | uniq -c | sort -nr
    rymo · 2013-04-10 19:37:53 5
  • Like top, but for files


    1
    watch -d -n 2 'df; ls -FlAt;'
    G2G · 2013-09-17 05:44:47 6
  • Sorts files by latest modification time, looking at the current directory and all subdirectories. A variant that avoids the xargs batching caveat is sketched after this entry.


    1
    find . -name '*pdf*' -print0 | xargs -0 ls -lt | head -20
    fuats · 2013-10-03 21:58:51 9
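    When find matches more files than fit on one command line, xargs runs ls several times and each batch is sorted separately. A sketch that sorts everything in a single pass (assumes GNU find for -printf):

    find . -name '*pdf*' -printf '%T@ %p\n' | sort -rn | head -20 | cut -d' ' -f2-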
  • Displays a list of all file extensions in the current directory and how many files there are of each extension, in ascending order of count (case insensitive). A slightly more robust variant is sketched after this entry.


    1
    ls | tr [:upper:] [:lower:] | grep -oP '\.[^\.]+$' | sort | uniq -c | sort
    icefyre · 2014-01-30 11:37:27 10
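    The unquoted character classes can be mangled by pathname expansion if the directory contains matching single-letter filenames, and the final sort compares counts as text. A sketch with the tr arguments quoted and a numeric sort (grep -oP requires GNU grep):

    ls | tr '[:upper:]' '[:lower:]' | grep -oP '\.[^.]+$' | sort | uniq -c | sort -n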

  • 1
    npm ls -g|grep "^[├└]\(.\+\)\?[┬─] "
    lucasmezencio · 2014-02-03 21:50:39 7
  • Very quick! Based only on the content sizes and the character counts of filenames. If both numbers are equal, then two (or more) directories are most likely identical. If in doubt, apply: diff -rq path_to_dir1 path_to_dir2. The AWK function is taken from here: http://stackoverflow.com/questions/2912224/find-duplicates-lines-based-on-some-delimited-fileds-on-line


    1
    find . -type d| while read i; do echo $(ls -1 "$i"|wc -m) $(du -s "$i"); done|sort -s -n -k1,1 -k2,2 |awk -F'[ \t]+' '{ idx=$1$2; if (array[idx] == 1) {print} else if (array[idx]) {print array[idx]; print; array[idx]=1} else {array[idx]=$0}}'
    knoppix5 · 2014-02-25 22:50:09 27
  • Tested with GNU and BSD ls.


    1
    ls -la | grep ^l
    gatopan · 2014-08-11 03:06:48 8
  • With this version, you can list all symlinks in the current directory (no subdirectories) and see both the link and its target. A space-safe variant is sketched after this entry.


    1
    ls -l `find ~ -maxdepth 1 -type l -print`
    skittleys · 2015-01-04 02:36:47 7
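    The command substitution splits filenames on whitespace, so links whose names contain spaces break the listing. A sketch that hands the names to ls directly:

    find ~ -maxdepth 1 -type l -exec ls -l {} +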
  • List all .txt files ordered by time, newest first (with --time=atime the sort key is the access time rather than the modification time).


    1
    ls -lt --time=atime *.txt
    miccaman · 2015-05-21 21:03:44 10
  • Adds a course-name prefix to lecture PDFs; a glob-based variant is sketched after this entry.


    1
    ls *.pdf | while read file; do newfile="CS749__${file}"; mv "${file}" "${newfile}"; done;
    programmer · 2016-04-19 11:04:47 16
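    Parsing ls output is unnecessary here and trips over leading or trailing whitespace in filenames. A sketch using the glob directly (the CS749__ prefix is from the original):

    for file in *.pdf; do mv "$file" "CS749__${file}"; done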


