Commands using ls (517)

  • Returns a list of the N most recently modified files, with attributes (think `ls -l`), in reverse chronological order; N is the single numeric argument. Robust against unfriendly filenames and directory structures. A usage sketch follows this entry.


    0
    nmf() { find . -type f -printf '%T@ ' -print0 -printf '\n' | sort -rn | head -"$1" | cut -f2- -d" " | tr -d "\0" | tr "\n" "\0" | xargs -0 ls -Ulh; }
    incidentnormal · 2016-03-04 14:53:14 12
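
    A minimal usage sketch, assuming the nmf function above has been loaded into the current shell; 10 is just an example count:
    nmf 10    # long-list the 10 most recently modified files under the current directory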
  • I've shortened it to `lsc PATH | l` by adding `alias lsc="ls --color"` and `alias l="less -R"` to my ~/.bashrc file (sketched after this entry).


    0
    ls --color PATH | less -R
    kevjonesin · 2016-03-07 13:46:02 13
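
    A sketch of the ~/.bashrc lines described above (alias names taken from the comment):
    alias lsc="ls --color"
    alias l="less -R"
    # usage: lsc PATH | l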
  • Grabs the first JSON file in the directory, reads its keys, prints them as TSV, then prints all the JSON files' values as TSV. Nested objects appear as JSON. Unhappy times if your JSON has literal tabs in it. A sketch that writes the result to a file follows this entry.


    0
    jq -r 'keys | join("\t")' $(ls -f *.json | head -1) && jq -Sr 'to_entries | [ .[] | .value | tostring ] | join("\t")' *.json
    drjeats · 2016-04-08 23:30:30 12
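
    A sketch of the same pipeline writing its output to a file; report.tsv is a hypothetical name and the *.json glob is assumed to match in the current directory:
    { jq -r 'keys | join("\t")' "$(ls -f *.json | head -1)"; jq -Sr 'to_entries | [ .[] | .value | tostring ] | join("\t")' *.json; } > report.tsv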
  • Prints a compact `ls -la`-style listing with the directories at the beginning.
    --almost-all - do not list implied . and ..
    --group-directories-first - group directories before files
    --color - colorize the output
    --no-group - in a long listing, don't print group names
    --human-readable - print human-readable sizes (e.g., 1K 234M 2G)
    --classify - append indicator (one of */=>@|) to entries
    If you want to see the owner (which `ls -g` omits), use `ls -l` instead: ls -l --almost-all --group-directories-first --color --no-group --human-readable --classify
    An alias sketch follows this entry.


    0
    ls -g --almost-all --group-directories-first --color --no-group --human-readable --classify
    icatalina · 2016-04-15 17:12:35 13
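
    A sketch for keeping this handy; the alias name ll is hypothetical and would go in ~/.bashrc:
    alias ll='ls -g --almost-all --group-directories-first --color --no-group --human-readable --classify'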
  • This command finds any files with an extension (name matching *.*) under / modified between the two dates, then lists their metadata in long, human-readable form. The 't' flag on ls sorts the files by modification time, and head -5 limits the output to the first 5 entries (adjust as needed). A whitespace-safe variant is sketched after this entry.


    0
    ls -laht `find / -name "*.*" -type f -newermt "2016-04-05" ! -newermt "2016-04-10"`|head -5
    ubercoo · 2016-04-19 14:26:23 10
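
    A sketch of a variant that avoids the backtick substitution so filenames with spaces survive; note that with -exec ... + the listing may be split into batches, so the time sort only holds within each batch:
    find / -name "*.*" -type f -newermt "2016-04-05" ! -newermt "2016-04-10" -exec ls -laht {} + | head -5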

  • 0
    ls -l --color | less -R
    Sparkette · 2016-05-07 17:19:56 11

  • 0
    man $(ls /bin | shuf -n1)
    jubnzv · 2016-06-28 18:34:46 12
  • Pipes the output of ls to espeak. Also works nicely with fortune: fortune | espeak


    0
    ls | espeak
    BigZ · 2016-08-02 17:54:39 13
  • I seem to do this compulsively every time I change directories, sometimes even when I don't, even if I know exactly what I need to do. (Don't worry, the sample output is just an exaggeration. :)


    0
    grep -cx ls ~/.bash_history
    Sparkette · 2016-09-14 17:06:59 17
  • ls -l pipes a long listing of the files to awk, which keeps the lines that have an executable bit set (-x.) but excludes (!) the lines that have the directory bit set (drw), then prints the result to the screen. A find-based alternative is sketched after this entry.


    0
    ls -l | awk '/-x./ && !/drw/ {print}'
    PCnetMD · 2016-09-21 14:42:10 15
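
    A sketch of an alternative that tests the permission bits directly instead of parsing ls output (GNU find assumed; -perm /111 matches any execute bit):
    find . -maxdepth 1 -type f -perm /111 -ls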

  • 0
    find . -mtime +30 -exec ls -all "{}" \; | awk '{COUNTER+=$5} END {SIZE=COUNTER/1024/1024; print "size sum of found files is: " SIZE "MB"}'
    breign · 2016-10-28 08:05:57 16
  • Resume incomplete youtube-dl downloads from their .part files. Assumes mp4 format here.


    0
    ls *.part | sed 's/^.*-\(.\{11,11\}\)\.mp4\.part$/\1/g' - | youtube-dl -i -f mp4 -a -
    agp · 2017-02-28 23:31:55 21
  • All the other examples fail in a folder containing too many files, because the * glob overflows the argument list. This command does not use *, so I can run it in one folder containing over 300000 audio files. To cope with that many files, GNU parallel spawns as many processes as there are cores, which speeds things up tremendously. A filename-safe variant is sketched after this entry.


    0
    ls|grep ".wav"|parallel -j$(nproc) soxi -D {}|awk '{SUM += $1} END { printf "%d:%d:%d\n",SUM/3600,SUM%3600/60,SUM%60}'
    jupiter126 · 2017-05-02 21:37:24 20
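
    A sketch of a filename-safe variant of the same idea, assuming GNU find and GNU parallel; -print0/-0 keep odd filenames intact:
    find . -maxdepth 1 -name '*.wav' -print0 | parallel -0 -j"$(nproc)" soxi -D {} | awk '{SUM += $1} END { printf "%d:%02d:%02d\n", SUM/3600, SUM%3600/60, SUM%60 }'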

  • 0
    ls -t /mcdata/archive/learn/backup-moodle2-course-* | tail -n +11 | xargs -I {} rm {}
    tlezotte · 2017-05-04 13:50:02 17
  • Show the file count in each directory. Useful when you are trying to find huge directories that drive up system CPU (vmstat -> sy). A whitespace-safe variant is sketched after this entry.


    0
    find / -type d | while read i; do ls $i | wc -l | tr -d \\n; echo " -> $i"; done | sort -n
    Zort · 2017-05-12 00:02:43 19
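
    A sketch of a whitespace-safe variant of the same loop (bash assumed):
    find / -type d -print0 | while IFS= read -r -d '' i; do printf '%s -> %s\n' "$(ls "$i" | wc -l | tr -d ' ')" "$i"; done | sort -n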
  • It works extremely fast because it calculates md5sum only for the files that have the same size and name. But there is nothing for free - it won't find duplicates with different names. A size-only variant is sketched after this entry.


    0
    find -type f -printf '%20s\t%100f\t%p\n' | sort -n | uniq -Dw121 | awk -F'\t' '{print $3}' | xargs -d '\n' md5sum | uniq -Dw32 | cut -b 35- | xargs -d '\n' ls -lU
    ant7 · 2017-05-21 02:26:16 16
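
    A sketch of a variant that drops the same-name requirement so renamed duplicates are found too; more md5sums get computed, and the extra sort makes equal hashes adjacent for uniq:
    find -type f -printf '%20s\t%p\n' | sort -n | uniq -Dw20 | cut -f2- | xargs -d '\n' md5sum | sort | uniq -Dw32 | cut -b 35- | xargs -d '\n' ls -lU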

  • 0
    find . -name '*.log' | xargs ls -hlt > /tmp/logs.txt && vi /tmp/logs.txt
    zluyuer · 2017-07-07 05:13:51 20

  • 0
    AWS_DEFAULT_REGION="sa-east-1" jungle ec2 ls | grep midas | sort | cut -f4 | xargs -I {} ssh ubuntu@{} sudo apt-get install ntp -y
    xymor · 2017-11-22 19:20:08 19
  • To HUNT for all the important stuff. TRUST EL TRAPPER. Works every time!


    0
    ls -ltrapR
    K33st · 2018-03-19 18:15:51 25

  • 0
    ls -tr ~/Downloads/*.pdf|tail -1
    masroor · 2018-05-14 14:01:55 159
  • I couldn't find the movie library in any of the SQLite Stremio databases, but in ~/.config/stremio/backgrounds2 the background image filenames correspond to IMDB URLs. So I loop over the files, wget the HTML title of each movie, and save it to a file. This retrieves all movie names, not just the Library.


    0
    time for movie in $(ls -1 $HOME/.config/stremio/backgrounds2 | sort -u);do echo "https://www.imdb.com/title/$movie/" | wget -qO- -O- -i- --header="Accept-Language: en" | hxclean | hxselect -s '\n' -c 'title' 2>/dev/null | tee -a ~/movie-list.txt ; done
    pabloab · 2018-08-16 06:11:41 317
  • On Linux, use: watch -n 1 ls path/to/dir (H/t: https://stackoverflow.com/a/9574123/805405)


    0
    while :; do clear; ls path/to/dir | wc -l; sleep 1; done
    minademian · 2018-12-13 17:48:24 231
  • Find all files of 20 MB or more across every filesystem; change the size and starting point to your liking (a tweaked sketch follows this entry).


    0
    find / -type f -size +20000k -exec ls -lh {} \; 2> /dev/null | awk '{ print $NF ": " $5 }' | sort -nrk 2,2
    Marius · 2019-07-08 21:04:09 37
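
    A sketch of the same idea with the knobs turned: /home and +100M are example values, -xdev stays on one filesystem, and sort -h orders the human-readable sizes correctly:
    find /home -xdev -type f -size +100M -exec ls -lh {} \; 2> /dev/null | awk '{ print $NF ": " $5 }' | sort -hrk 2,2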

  • -1
    cleartool co -nc `cleartool ls -recurse | grep "hijacked" | sed s/\@\@.*// | xargs`
    ultrahax · 2009-02-06 00:03:51 51
  • In my music directory, I create a variable that contains all the mp3 files, then play them with mpg123. The -C option enables terminal control keys: s to stop, p to pause, f to skip to the next song. A whitespace-safe variant is sketched after this entry.


    -1
    PLAYLIST=$(ls -1) ; mpg123 -C $PLAYLIST
    servermanaged · 2009-03-19 17:20:28 12
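
    A sketch of a variant that survives spaces in filenames, assuming the mp3s sit in the current directory:
    mpg123 -C ./*.mp3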
