All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Convert a date to timestamp
Simple way to get a timestamp from a date
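The command itself isn't reproduced in this listing; a minimal sketch using GNU date (the example date string is arbitrary, and -u pins the conversion to UTC):
date -u -d '2009-02-13 23:31:30' +%s    # prints 1234567890
date -u -d @1234567890                  # and back to a human-readable date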

Nmap: list the IPs in a network and save them to a txt file
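The original command isn't shown here; one common sketch, assuming a /24 range (a placeholder) and grepable output (older nmap releases use -sP instead of -sn for the ping scan):
nmap -sn 192.168.1.0/24 -oG - | awk '/Up$/{print $2}' > ips.txt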

Add a static ARP entry for the default gateway (ARP poisoning protection)
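The command itself is missing from this listing; a sketch for Linux, assuming the net-tools arp utility is available (the MAC address below is a placeholder and must be verified out of band):
gw=$(ip route | awk '/^default/ {print $3; exit}')
sudo arp -s "$gw" 00:11:22:33:44:55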

Get your external IP address
curl ifconfig.me/ip -> IP Address; curl ifconfig.me/host -> Remote Host; curl ifconfig.me/ua -> User Agent; curl ifconfig.me/port -> Port. Thanks to http://ifconfig.me/
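A small usage example (the -s flag just silences curl's progress output):
myip=$(curl -s ifconfig.me/ip)
echo "External IP: $myip"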

Robust expansion of bash variables with a typo (i.e. crash instead of continuing)
By default bash expands an unbound variable to an empty string. This can be dangerous if a critical variable name (a path prefix, for example) has a typo. The -u option causes bash to treat an unbound variable as an error, and the -e option causes it to exit in case of an error. These two together will make your scripts a lot safer against typos. The default behaviour can be explicitly requested using the ${NAME:-} syntax. A (less explicit) variation: #!/bin/bash -eu
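A short sketch of the behaviour described above:
#!/bin/bash
set -eu                      # equivalent to #!/bin/bash -eu
prefix=/tmp/backup
echo "$prefxi/data"          # typo: with -u this aborts instead of printing "/data"
echo "${OPTIONAL_VAR:-}"     # explicitly allow the empty-string fallback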

Compress and store the image of a disk over the network
Create an image of "device" and send it to another machine over the network ("target" and "port" set the IP and port the stream is sent to), while showing a progress bar. On the machine that will receive, compress and store the file, use: $ nc -l -p <port> | 7z a -si -m0=lzma2 -mx=9 -ms=on Optionally, add the -v4g switch at the end of the line to split the file every 4 gigabytes (or set another size: accepted suffixes are k, m and g). The file is compressed in 7z format with the lzma2 algorithm, maximum compression level and solid archive enabled. The compression stage runs on the machine that stores the image; it was planned this way because the processor on that machine was faster and, being on a gigabit network, transferring the uncompressed image wasn't much of a problem.
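Neither side's full command line survives in this listing; a sketch of both ends, where the device, hostname, port and archive name are placeholders (traditional netcat syntax; some netcat variants drop -p in listen mode):
receiver$ nc -l -p 19000 | 7z a -si -m0=lzma2 -mx=9 -ms=on disk.img.7z
sender$   dd if=/dev/sdb bs=1M | pv | nc receiver.example.com 19000
pv supplies the progress bar mentioned above; install it or drop it from the pipe.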

Set history file length
Set how many commands to keep in history (the default is 500). History is saved in /home/$USER/.bash_history. Add this to /home/$USER/.bashrc: HISTFILESIZE=1000000000 HISTSIZE=1000000
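As a concrete sketch, the two lines to append to /home/$USER/.bashrc:
HISTFILESIZE=1000000000      # lines kept in ~/.bash_history on disk
HISTSIZE=1000000             # lines kept in the in-memory history of a session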

A fun thing to do with RAM is actually open it up and take a peek. This command will show you all the string (plain text) values in RAM
cat? dd? RTFM
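The command is only hinted at above; a sketch, assuming root access and a kernel that still exposes /dev/mem (many distributions restrict it via CONFIG_STRICT_DEVMEM):
sudo dd if=/dev/mem bs=1M count=256 2>/dev/null | strings | less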

Get a funny one-liner from www.onelinerz.net
Put this command in .bashrc and every time you open a new terminal a random quote will be downloaded and printed from onelinerz.net. By altering the URL in the w3m statement you can change the output: 1 to 10 lines - http://www.onelinerz.net/random-one-liners/(number)/ ; 20 newest lines - http://www.onelinerz.net/latest-one-liners/ ; top 10 lines - http://www.onelinerz.net/top-100-funny-one-liners/ (updated daily).
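The exact command isn't reproduced here; a minimal sketch of the .bashrc line, using w3m as the description says (the head filter is an assumption, the real command likely trims the page differently):
w3m -dump http://www.onelinerz.net/random-one-liners/1/ | head -n 20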

Prints the latest modified files in a directory tree recursively
Sorts files by latest modification time, looking at the current directory and all subdirectories
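The command itself isn't shown; one common sketch with GNU find (the head count is arbitrary):
find . -type f -printf '%TY-%Tm-%Td %TT %p\n' | sort -r | head -n 20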


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…).