All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.



Check These Out

Nginx - print all optional modules before compilation
wget http://nginx.org/download/nginx-1.15.3.tar.gz && tar -xzf nginx-1.15.3.tar.gz && cd nginx-1.15.3
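The line above only downloads and unpacks the source; to actually print the optional modules you would still query the configure script. A minimal follow-up sketch (the grep pattern is an assumption, not part of the original command):

./configure --help | grep _module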

Avoids ssh timeouts by sending a keep alive message to the server every 60 seconds
/etc/ssh/ssh_config is the system-wide configuration file for ssh. For per-user configuration, which allows different settings for each host, use ~/.ssh/config instead (the per-user path is the same on Linux and OS X):
$ echo 'ServerAliveInterval 60' >> ~/.ssh/config
To apply the setting system-wide, add the same line to /etc/ssh/ssh_config.
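If the keep-alive is only needed for a single session rather than permanently, the same option can be passed on the command line; a minimal sketch (host name is a placeholder):

ssh -o ServerAliveInterval=60 user@example.com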

using scanner device from command line
You have to replace "mustek_usb" with the device name reported by `scanimage -L` (capital L lists the available scanners).
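The command itself is not reproduced here; a minimal sketch of scanning from the shell with SANE, where the device name and output file are placeholders:

scanimage -d mustek_usb --format=tiff > scan.tiff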

Write comments to your history.
A null operation with the name 'comment', allowing comments to be written to HISTFILE. Prepending '#' to a command will *not* write the command to the history file, although it will be available for the current session; thus '#' is not useful for keeping track of comments past the current session.
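The command itself is not shown above; a minimal sketch of such a null operation, assuming a bash function named 'comment' as the description suggests:

comment() { :; }
comment the backup cron job on this box was disabled on purpose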

cpuinfo
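No command or description survives for this entry; presumably it refers to inspecting /proc/cpuinfo, e.g.:

cat /proc/cpuinfo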

Create AUTH PLAIN string to test SMTP AUTH session
I use this as an alias:
alias authplain "printf '\!:1\0\!:1\0\!:2' | mmencode | tr -d '\n' | sed 's/^/AUTH PLAIN /'"
then:
# authplain someuser@somedomain.com secretpassword
AUTH PLAIN c29tZXVzZXJAc29tZWRvbWFpbi5jb20Ac29tZXVzZXJAc29tZWRvbWFpbi5jb20Ac2VjcmV0cGFzc3dvcmQ=
#
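The alias above is csh-style and relies on mmencode; a rough bash equivalent, assuming base64 is acceptable in place of mmencode (the function name is illustrative, not from the original):

authplain() { printf 'AUTH PLAIN %s\n' "$(printf '%s\0%s\0%s' "$1" "$1" "$2" | base64 | tr -d '\n')"; }
authplain someuser@somedomain.com secretpassword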

ensure your ssh tunnel will always be up (add in crontab)
I find it ugly and sexy at the same time, don't you?
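The one-liner itself is not shown above; a rough sketch of the idea, with cron restarting the tunnel whenever it is no longer running (the port, host and pgrep pattern are placeholders):

*/5 * * * * pgrep -f 'ssh -f -N -L 3306:localhost:3306' > /dev/null || ssh -f -N -L 3306:localhost:3306 user@remotehost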

Easily decode unix-time (function)
More recent versions of the date command finally have the ability to decode the unix epoch time into a human readable date. This function makes it simple to utilize this feature quickly.
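The function itself is not reproduced here; a minimal sketch, assuming GNU date (BSD and OS X use `date -r` instead of `-d @`):

epoch() { date -d "@$1"; }   # prints the corresponding human-readable local time
epoch 1234567890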

Kill any process with one command using program name
This will kill all processes matching the given program name, e.g. killall firefox.

Fastest segmented parallel sync of a remote directory over ssh
Mirror a remote directory using some tricks to maximize network speed.
lftp: coolest file transfer tool ever
-u: username and password (the password is merely a placeholder if you have ~/.ssh/id_rsa)
-e: execute internal lftp commands
set sftp:connect-program: use a specific command instead of plain ssh
ssh -a -x -T: disable agent forwarding, X11 forwarding and pseudo-tty allocation, none of which are needed here
-c arcfour: use the most efficient cipher specification
-o Compression=no: disable compression to save CPU
mirror: copy the remote directory subtree to a local directory
-v: be verbose (cool progress bar and speed meter, one for each file transferred in parallel)
-c: continue interrupted file transfers if possible
--loop: repeat the mirror until no differences are found
--use-pget-n=3: transfer each file with 3 independent parallel TCP connections
-P 2: transfer 2 files in parallel (6 TCP connections in total)
sftp://remotehost:22: use the sftp protocol on port 22 (any other port can be given if appropriate)
You can play with the values of --use-pget-n and/or -P to achieve maximum speed on a particular network. If the files are compressible, removing "-o Compression=no" can be beneficial. Better to create an alias for the command.
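The one-liner itself is not reproduced above; reconstructed from that breakdown it would look roughly like this (user, host and directory names are placeholders):

lftp -u user,dummypass -e "set sftp:connect-program 'ssh -a -x -T -c arcfour -o Compression=no'; mirror -v -c --loop --use-pget-n=3 -P 2 /remote/dir /local/dir; quit" sftp://remotehost:22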


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
