Commands using ls (517)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

create a progress bar...
A simple way to do a progress bar like wget's.
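The original command isn't reproduced here, but a minimal bash sketch of the idea (all names are illustrative) could look like this:

    progress() {  # usage: progress CURRENT TOTAL
        local cur=$1 total=$2 width=40
        local filled=$(( cur * width / total ))
        # \r returns the cursor to the start of the line so the bar redraws in place
        printf '\r[%-*s] %3d%%' "$width" "$(printf '%*s' "$filled" '' | tr ' ' '#')" $(( cur * 100 / total ))
    }
    for i in {1..100}; do progress "$i" 100; sleep 0.05; done; echo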

Check your unread Gmail from the command line
Just an alternative with more advanced formatting for readability. It now uses colors (too many for my taste, but it's a kind of proof of concept) and adjusts columns.
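For reference, the classic approach polls Gmail's Atom feed with curl; assuming the account allows basic auth with an app password (an assumption, not part of the original command), something like this prints the unread count: $ curl -su 'user@gmail.com:app-password' https://mail.google.com/mail/feed/atom | grep -oPm1 '(?<=<fullcount>)\d+'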

Decode base64-encoded file in one line of Perl
Another option is openssl.
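For example (file names are placeholders): $ perl -MMIME::Base64 -0777 -ne 'print decode_base64($_)' encoded.b64 > decoded.bin, or with openssl: $ openssl base64 -d -in encoded.b64 -out decoded.bin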

Test load balancers
With the "--resolve" switch, you can avoid DNS lookups or editing the /etc/hosts file by providing the IP address for a domain directly. Useful if you have many servers with different IP addresses behind a load balancer. Of course, you would loop over them: $ for IP in 10.11.0.{1..10}; do curl --resolve subdomain.example.com:80:$IP subdomain.example.com -I -s; done

Anti DDOS
Takes IPs from web logs and pipes them to iptables; use grep to whitelist IPs. Use this if a particular file is being requested by many different addresses. Sure, the requests are already down the pipe and your bandwidth may suffer, but that isn't the concern: this one-liner saved me from all that traffic hitting the server a second time. Reconfigure things (e.g. rename the targeted page to something like blog-post-1.php) so legitimate users can continue working while the botnet kills itself.
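The exact one-liner isn't shown here, but a sketch of the idea (the log path and whitelist file are assumptions): $ grep 'blog-post-1.php' /var/log/apache2/access.log | awk '{print $1}' | sort -u | grep -vFf whitelist.txt | while read -r ip; do iptables -I INPUT -s "$ip" -j DROP; done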

Which processes are listening on a specific port (e.g. port 80)
Swap out "80" for your port of interest. You can use a port number or a named port, e.g. "http".
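For instance, with lsof or ss (run as root to see other users' processes): $ lsof -iTCP:80 -sTCP:LISTEN or $ ss -ltnp '( sport = :80 )'; named ports work too: $ lsof -i :http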

list folders containing less than 2 MB of data
This command searches all subfolders of the current directory and lists the names of the folders that contain less than 2 MB of data. I use it to clean up my mp3 archive; to delete the folders it finds, redirect the output to a text file and run: $ while read -r line; do rm -Rv "$line"; done < textfile
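The command itself isn't reproduced above; one way to implement it (GNU du reports kilobytes by default, so 2 MB is 2048): $ find . -mindepth 1 -type d -exec du -s {} + | awk -F'\t' '$1 < 2048 {print $2}' > textfile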

Listing directory content of a directory with a lot of entries
Ever wanted to get the directory content with 'ls' or 'find' and had to wait minutes until anything was printed? Perl to the rescue. The one-liner above (redirected to a file) took less than five seconds to run in a directory with more than 2 million files. You can adapt it to, e.g., delete files that match a certain pattern.
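The one-liner isn't reproduced above, but the trick is readdir(), which streams entries without the sorting and per-file stat() calls that make ls slow on huge directories. A sketch along those lines: $ perl -e 'opendir my $dh, "." or die $!; print "$_\n" while readdir $dh' > listing.txt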

Convert seconds to [DD:][HH:]MM:SS
Converts any number of seconds into days, hours, minutes and seconds. (The original printed hours only when non-zero, which is ambiguous once days appear; this version pads hours whenever days are shown.)

    sec2dhms() {
        declare -i SS="$1"
        local D=$(( SS / 86400 )) H=$(( SS % 86400 / 3600 ))
        local M=$(( SS % 3600 / 60 )) S=$(( SS % 60 ))
        if [ "$D" -gt 0 ]; then
            printf '%d:%02d:' "$D" "$H"   # days unpadded; hours always shown with days
        elif [ "$H" -gt 0 ]; then
            printf '%02d:' "$H"
        fi
        printf '%02d:%02d\n' "$M" "$S"
    }
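For example, 100000 seconds is 1 day, 3 hours, 46 minutes and 40 seconds: $ sec2dhms 100000 prints 1:03:46:40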

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).
