Commands using ssh (347)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Forwards connections to your local port 2000 to port 22 on a remote host via an SSH tunnel
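The entry's command isn't reproduced here; a minimal sketch, with user and host as placeholders:
$ ssh -f -N -L 2000:localhost:22 user@remote.example.com
Connections to local port 2000 are carried over the SSH session and delivered to port 22 on the remote host; -N skips running a remote command and -f puts the session in the background.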

Which processes are listening on a specific port (e.g. port 80)
Swap out "80" for the port you are interested in. You can use a port number or a named port, e.g. "http".
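The original command isn't shown here; a common way to get such a listing, assuming lsof is installed:
$ sudo lsof -i :80
The -i :80 filter limits the output to sockets using port 80; a named port also works, e.g. $ sudo lsof -i :http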

Insert the last command without the last argument (bash)
$ /usr/sbin/ab2 -f TLS1 -S -n 1000 -c 100 -t 2 http://www.google.com/
then
$ !:- http://www.commandlinefu.com/
is the same as
$ /usr/sbin/ab2 -f TLS1 -S -n 1000 -c 100 -t 2 http://www.commandlinefu.com/

Advanced python tracing
Simultaneously trace Python statement execution and the syscalls it invokes.
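A rough sketch of combining the two traces (script.py and syscalls.log are placeholders, and this may differ from the original entry):
$ strace -f -o syscalls.log python -m trace --trace script.py
Python's trace module prints each executed statement to stdout while strace writes the interpreter's syscalls (and those of its children, via -f) to syscalls.log.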

execute your commands hiding secret bits from history records
$ wget --user=username --password="$password" http://example.org/
Instead of hiding commands entirely from history, I prefer to use "read" to put the password into a variable, and then use that variable in the commands instead of the password. Without the "-e" and "-s" it should work in any Bourne-type shell, but the -s is what makes sure the password doesn't get echoed to the screen at all. (-e makes editing work a bit better.)
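A minimal sketch of that approach (the prompt text is purely illustrative):
$ read -e -s -p "Password: " password
$ wget --user=username --password="$password" http://example.org/
The history then records the literal string "$password" rather than the secret itself, and -s keeps the password from being echoed while you type.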

Show the command line for a PID with ps
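A minimal sketch (1234 is a placeholder PID):
$ ps -p 1234 -o args=
The trailing = suppresses the header line, leaving only the full command line of that process.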

Watch TCP, UDP open ports in real time with socket summary.
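The original one-liner isn't reproduced on this page; a sketch using ss from iproute2 (the interval is arbitrary) would be:
$ watch -n 1 'ss -s; ss -tuln'
ss -s prints the socket summary, ss -tuln lists listening TCP and UDP ports, and watch refreshes the output every second.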

Fake system time before running a command
Fake system time before running any command.
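A minimal sketch, assuming the faketime wrapper from libfaketime is installed (timestamp and command are placeholders):
$ faketime '2008-12-24 08:15:42' date
The wrapped command sees the faked clock while the real system time stays untouched.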

Start a VNC server for another user
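The entry's command isn't shown here; one possible sketch, assuming a vncserver script is installed and with "alice" and display :1 as placeholders:
$ sudo -H -u alice vncserver :1
The -H flag sets HOME to the target user's home directory so vncserver can find (or create) that user's ~/.vnc configuration.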

To print a specific line from a file
Just one character longer than the sed version ('FNR==5' versus -n 5p). On my system, without using "exit" or "q", the awk version is over four times faster on a ~900K file, using the following timing comparison:
$ testfile="testfile"; for cmd in "awk 'FNR==20'" "sed -n '20p'"; do echo; echo $cmd; eval "$cmd $testfile"; for i in {1..3}; do time for j in {1..100}; do eval "$cmd $testfile" >/dev/null; done; done; done
Adding "exit" or "q" made the difference between awk and sed negligible and produced a four-fold improvement over the awk timing without the "exit". For long files, an exit can speed things up:
$ awk 'FNR==5{print;exit}'
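For reference, minimal sketches of the two forms being compared (file name and line number are placeholders):
$ awk 'FNR==5' file.txt
$ sed -n '5p' file.txt
Both print only the fifth line of file.txt.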


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively, so that only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …):

Subscribe to the feed for: