Commands using ssh (347)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

See how many more processes are allowed, awesome!
There is a limit to how many processes each user can run at the same time, especially with web hosts. If the maximum number of processes for your user is 200 and few are currently running, the following sets OPTIMUM_P to roughly 100:

$ OPTIMUM_P=$(( (`ulimit -u` - `find /proc -maxdepth 1 \( -user $USER -o -group $GROUPNAME \) -type d|wc -l`) / 2 ))

This is very useful in scripts because it is a fast, low-resource way (compared to ps, who, lsof, etc.) to determine how many processes are currently running for a given user. The number of currently running processes is subtracted from the limit set up for the account (see limits.conf, pam, initscript).

An easy-to-understand example: this searches the current directory for shell scripts, running up to 100 'file' commands at the same time, which greatly speeds up the command.

$ find . -type f | xargs -P $OPTIMUM_P -iFNAME file FNAME | sed -n '/shell script text/p'

I am using it in my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html especially for the xargs command. xargs has a -P option that lets you specify how many processes to run at the same time. For instance, if you have 1000 URLs in a text file and want to download all of them quickly with curl, you could download 100 at a time (check ps output on a separate [pt]ty for proof) like this:

$ cat url-list.txt | xargs -I '{}' -P $OPTIMUM_P curl -O '{}'

I like to do things as fast as possible on my servers. I have several types of servers and hosting environments, some with very restrictive jail shells with a 20-process limit, some with 200, some with 8000, so for the jailed shells a hard-coded xargs -P10 would kill my shell or dump core. Using the above I can set the -P value dynamically, so xargs always works. If you were building a process-killer (very common on cheap hosting) this would also be handy. Note that if you are only allowed 20 or so processes, you should just use -P1 with xargs.
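As a hedged sketch of how this can be wired into a script (it drops the $GROUPNAME clause for simplicity, assumes ulimit -u reports a numeric limit, and falls back to a single process when the arithmetic leaves no headroom, per the note above; url-list.txt is just the example file from the description):

# compute a conservative parallelism level: half of the remaining process headroom
OPTIMUM_P=$(( ( $(ulimit -u) - $(find /proc -maxdepth 1 -user "$USER" -type d | wc -l) ) / 2 ))
# on tightly jailed shells the result can be 0 or negative; fall back to one process
(( OPTIMUM_P < 1 )) && OPTIMUM_P=1
# download the URLs in url-list.txt, $OPTIMUM_P at a time
xargs -I '{}' -P "$OPTIMUM_P" curl -O '{}' < url-list.txt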

Set all CPU cores' CPU frequency scaling governor to maximum performance
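The entry's command itself isn't reproduced here; a common way to do this on Linux, assuming the cpufreq sysfs interface is available and you have root, is to write "performance" into each core's governor file (if the cpupower tool is installed, sudo cpupower frequency-set -g performance does the same):

$ echo performance | sudo tee /sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_governor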

Get absolute path to your bash script
Another way of doing it that's a bit clearer. I'm a fan of readable code.
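The command itself isn't shown above; one widely used, readable way to get the directory of the running script (an assumption about what the entry does, not necessarily its exact one-liner) is:

$ DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

On systems with GNU coreutils, readlink -f "$0" is a shorter alternative that also resolves symlinks.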

Matrix Style
Unlike other alternatives, this command relies only on bash builtins and should also work on Windows platforms with the bash executable. Sparseness corresponds to the number 128 and can be adjusted. To print all possible digits instead of only 0 and 1, replace RANDOM%2 with RANDOM%10, or with RANDOM%16 to add the letters [A-F].
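The one-liner itself isn't reproduced here; the sketch below follows the description (bash builtins only, RANDOM%2 for the digits, a modulus of 128 controlling sparseness) but is an illustrative reconstruction, not the original command. Press Ctrl-C to stop it:

while :; do
  line=""
  for (( i = 0; i < ${COLUMNS:-80}; i++ )); do
    # most cells stay blank; lowering 128 makes the "rain" denser
    if (( RANDOM % 128 < 8 )); then
      line+=$(( RANDOM % 2 ))   # RANDOM % 10, or printf '%X' with RANDOM % 16, gives more symbols
    else
      line+=" "
    fi
  done
  printf '\e[32m%s\e[0m\n' "$line"
done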

Upgrading packages
Pacman can update all packages on the system with just one command. This could take quite a while, depending on how up to date the system is. This command synchronizes the repository databases and updates the system's packages.
Warning: Instead of updating immediately whenever updates are available, users must recognize that, due to the nature of Arch's rolling-release approach, an update may have unforeseen consequences. This means it is not wise to update if, for example, one is about to deliver an important presentation; rather, update during free time and be prepared to deal with any problems that may arise. Pacman is a powerful package-management tool, but it does not attempt to handle all corner cases. Read The Arch Way if this causes confusion. Users must be vigilant and take responsibility for maintaining their own system.

When performing a system update, it is essential that users read all information output by pacman and use common sense. If a user-modified configuration file needs to be upgraded for a new version of a package, a .pacnew file will be created to avoid overwriting settings modified by the user; pacman will prompt the user to merge them. These files require manual intervention from the user, and it is good practice to handle them right after every package upgrade or removal. See Pacnew and Pacsave Files for more info.

Tip: Remember that pacman's output is logged in /var/log/pacman.log.
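The "one command" this entry refers to is presumably pacman's combined database sync and full system upgrade, run with root privileges:

$ sudo pacman -Syu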

Which processes are listening on a specific port (e.g. port 80)
Swap out "80" for your port of interest. You can use a port number or a named port, e.g. "http".
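The entry's exact command isn't reproduced here; one common way to do this (assuming lsof is installed, with sudo so that other users' processes are visible) is:

$ sudo lsof -iTCP:80 -sTCP:LISTEN

Named ports work the same way, e.g. sudo lsof -iTCP:http -sTCP:LISTEN.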

Convert CSV to JSON
Replace 'csv_file.csv' with your filename.
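The entry's command isn't reproduced here; a dependency-light way to do the conversion from the shell using Python's standard library (an illustration, not necessarily the original one-liner) is:

$ python3 -c 'import csv,json,sys; print(json.dumps(list(csv.DictReader(open(sys.argv[1]))), indent=2))' csv_file.csv

Each CSV row becomes a JSON object keyed by the values in the header row.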

Insert the last command without the last argument (bash)
$ /usr/sbin/ab2 -f TLS1 -S -n 1000 -c 100 -t 2 http://www.google.com/
then
$ !:- http://www.commandlinefu.com/
is the same as
$ /usr/sbin/ab2 -f TLS1 -S -n 1000 -c 100 -t 2 http://www.commandlinefu.com/

execute your commands and avoid history records
$ secret_command;export HISTCONTROL=
This will make "secret_command" not appear in the "history" list.
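A related, widely used approach (an assumption about the intent, not part of the entry itself) is to tell bash to skip recording any command typed with a leading space:

$ export HISTCONTROL=ignorespace
$  secret_command    # note the leading space before the command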

Calculate days on which Friday the 13th occurs (inspired by the work of the user justsomeguy)
Friday is the 5th day of the week, Monday the 1st. Output may be affected by the locale.
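The entry's command isn't reproduced here; a straightforward way to list the dates, assuming GNU date (whose %u field numbers the weekdays exactly as described, Monday=1 through Friday=5), is:

$ for y in {2024..2026}; do for m in {01..12}; do [ "$(date -d "$y-$m-13" +%u)" = 5 ] && date -d "$y-$m-13" '+%A %F'; done; done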


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).
