This improves on #9892 by compressing the directory on the remote machine, so much less data is transferred over the network. The command uses ssh(1) to reach the remote host, uses tar(1) to archive and compress the remote directory, and prints the result to STDOUT, which is written to a local file. In other words, we are archiving and compressing a remote directory down to our local box.
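A minimal sketch of the idea (user, host, paths and the gzip compressor are placeholders/assumptions):

ssh user@remotehost 'tar czf - /path/to/dir' > dir.tar.gz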
Keeps an SSH session open forever; great on laptops that lose Internet connectivity when switching Wi-Fi spots.
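The original one-liner isn't shown here, but a common way to get a self-reconnecting session is autossh (assumed; host and option values are placeholders):

autossh -M 0 -o "ServerAliveInterval 30" -o "ServerAliveCountMax 3" user@host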
This command sequence allows simple setup of (gasp!) password-less SSH logins. Be careful: if you already have an SSH keypair in your ~/.ssh directory on the local machine, ssh-keygen may overwrite it. ssh-copy-id copies the public key to the remote host and appends it to the remote account's ~/.ssh/authorized_keys file. When you then try ssh, if you used no passphrase for your key, the remote shell appears right after you invoke ssh user@host.
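A typical sequence looks like this (key type, user and host are assumptions; press Enter at the passphrase prompt for a password-less key):

ssh-keygen -t ed25519
ssh-copy-id user@host
ssh user@host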
We force IPv4, compress the stream, and specify Blowfish as the cipher. I suppose you could use aes256-ctr as the cipher spec as well. I'm of course leaving out things like master control sessions and such, as those may not be available in your shell, although they would speed things up as well.
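Whether the original used ssh or scp isn't shown; with scp the flags would look roughly like this (file and host are placeholders, and note that blowfish-cbc has been dropped from recent OpenSSH releases, so aes256-ctr is the safer choice today):

scp -4 -C -c blowfish-cbc user@host:/path/to/file .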
Have you ever had to scp a file to your work machine just to copy its contents into a mail? xclip can help you with that. It copies its stdin to the X11 selection buffer, so all you have to do is middle-click to paste the contents of that looong file :)
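With xclip the scp step can be skipped entirely (user, host and path are placeholders):

ssh user@workhost cat /path/to/looong-file | xclip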
The command copies a file from a remote SSH host on port 8322 with a bandwidth limit of 100 KB/sec:
--progress shows a progress bar
--partial turns partial download on, so you can resume the process if something goes wrong
--bwlimit limits bandwidth to the specified KB/sec
--ipv4 selects IPv4 as preferred
I find it useful to create the following alias:
alias myscp='rsync --progress --partial --rsh="ssh -p 8322" --bwlimit=100 --ipv4'
in ~/.bash_aliases, ~/.bash_profile, ~/.bash_login or ~/.bashrc, wherever appropriate.
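With the alias in place, usage mirrors scp (host and paths are placeholders):

myscp user@host:/path/to/file ./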
Good if only you have access to host1 and host2, but they have no access to your host (so ncat won't work) and no direct access to each other.
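One way to move data in that situation is to relay it through your own machine over two SSH connections (hosts and paths are placeholders):

ssh user@host1 'cat /path/to/file' | ssh user@host2 'cat > /path/to/file'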
I have this on a daily cronjob to backup the commandlinefu.com database from NearlyFreeSpeech.net (awesome hosts by the way) to my local drive. Note that (on my Ubuntu system at least) you need to escape the % signs in the crontab.
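A hedged sketch of such a crontab entry (host, database name and paths are made up; note the escaped % signs in the date format):

0 3 * * * ssh user@nfshost 'mysqldump dbname' | gzip > $HOME/backups/dbname-$(date +\%Y\%m\%d).sql.gz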
Remove a host from the ~/.ssh/known_hosts file.
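If the original one-liner isn't to hand, ssh-keygen can do this directly (hostname is a placeholder):

ssh-keygen -R hostname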
The pee command is in the moreutils package.
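pee works like tee for pipelines, sending a copy of its stdin to each command it is given (the commands here are just illustrative):

ps aux | pee 'grep sshd' 'wc -l'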
ssh -X example.org xeyes
The SSH server configuration requires:
X11Forwarding yes # this is default in Debian
And it's convenient too:
Compression delayed
It sometimes happens that you remember using a particular command a short while ago and want to check it again. With this command you can type just the beginning of the command and bash will look it up for you and print it back safely, without executing it.
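Assuming the entry refers to bash history expansion, the :p modifier prints the most recent matching command without running it (ssh is just an example prefix):

!ssh:p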
It compresses the files and folders to stdout, secure-copies that to the server's stdin, and runs tar there to extract the input to whatever destination is given with -C. If you omit "-C /destination", it will extract to the home folder of the user, much like `scp file user@server:`. The "v" in the tar command can be removed for no verbosity.
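A sketch of the pipeline (user, server and paths are placeholders):

tar czvf - files/ | ssh user@server 'tar xzvf - -C /destination'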
This can be much faster than downloading one or both trees to a common server and comparing the files there. Afterwards, only the differing files need to be copied down for deeper comparison if needed.
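One way to do such a comparison without pulling the trees down is to diff remote checksum listings (hosts, path and the choice of md5sum are assumptions):

diff <(ssh user@host1 'cd /path && find . -type f -exec md5sum {} +' | sort -k2) <(ssh user@host2 'cd /path && find . -type f -exec md5sum {} +' | sort -k2)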
Copy files to an SSH server with gzip compression.
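The original command isn't shown here; one common form uses rsync's -z (zlib/gzip-style) compression over ssh (paths and host are placeholders):

rsync -avz /local/dir/ user@server:/remote/dir/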
Put it into your shell startup script (I use alias scpresume='rsync --partial --progress --rsh=ssh' in bash). When a file transfer via scp has aborted, just use scpresume instead of scp and rsync will copy only the parts of the file that haven't been transmitted yet.
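Usage mirrors scp (host and paths are placeholders):

scpresume user@host:/path/to/bigfile ./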
Just run the command, type your password, and that's the last time you need to enter your password for that server. This assumes that the server supports public-key authentication. Also make sure the permissions on your home dir are 755 and on your .ssh dir 700 (both local and remote).
This is also handy for taking a look at resource usage of a remote box.
ssh -t remotebox top
When using tcpdump, specify the -U option to prevent buffering.
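This matters most when piping a live remote capture into a local analyzer, for example (user, host, interface and filter are assumptions):

ssh user@remotebox 'tcpdump -U -w - -i eth0 not port 22' | wireshark -k -i -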
Execute it from the source host, where the files you wish to back up reside. With the minus '-', tar delivers the compressed output to standard output and, through the ssh session, to the remote host. On the other end, the backup host receives the stream, reads it from standard input and writes it to /path/to/backup/backupfile.tar.bz2.
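The pipeline would look roughly like this (user, host and paths are placeholders; -j matches the .tar.bz2 name):

tar cjf - /path/to/files | ssh user@backuphost 'cat > /path/to/backup/backupfile.tar.bz2'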
A variation of a script I found on this site, slimmed down to just use awk. It displays all users who have attempted to log in to the box over SSH and failed. Pipe it to the sort command to see which usernames have the most failed logins.
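A hedged sketch of such an awk one-liner, with the optional sort added to rank usernames (the log path varies by distribution: /var/log/auth.log on Debian/Ubuntu, /var/log/secure on Red Hat):

awk '/sshd.*Failed password/ {for(i=1;i<=NF;i++) if($i=="for"){u=$(i+1); if(u=="invalid") u=$(i+3); print u}}' /var/log/auth.log | sort | uniq -c | sort -rn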