commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 and of 10 votes; that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime-time just yet. It's being hosted over at UpGuard and you are more than welcome to give it a shot.
Mirror a remote directory using some tricks to maximize network speed.
lftp: the coolest file transfer tool ever. Each flag is explained below, and the full command is pieced back together after the list.
-u: username and password (the password is merely a placeholder if you authenticate with ~/.ssh/id_rsa)
-e: execute internal lftp commands
set sftp:connect-program: use some specific command instead of plain ssh
-a -x -T: disable agent forwarding, X11 forwarding and pseudo-terminal allocation, none of which are needed for a file transfer
-c arcfour: use a fast, CPU-light cipher (note: arcfour is cryptographically weak and has been removed from recent OpenSSH releases, so substitute a modern cipher if it is unavailable)
-o Compression=no: disable compression to save CPU
mirror: copy remote dir subtree to local dir
-v: be verbose (cool progress bar and speed meter, one for each file in parallel)
-c: continue interrupted file transfers if possible
--loop: repeat mirror until no differences found
--use-pget-n=3: transfer each file with 3 independent parallel TCP connections
-P 2: transfer 2 files in parallel (totalling 6 TCP connections)
sftp://remotehost:22: use sftp protocol on port 22 (you can give any other port if appropriate)
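Putting the flags described above together, the full invocation looks like this (user, pwd, remotehost and both directory paths are placeholders for your own values):

lftp -u user,pwd -e "set sftp:connect-program 'ssh -a -x -T -c arcfour -o Compression=no'; mirror -v -c --loop --use-pget-n=3 -P 2 /remote/dir/ /local/dir/; quit" sftp://remotehost:22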
You can play with values for --use-pget-n and/or -P to achieve maximum speed depending on the particular network.
If the files are compressible, removing "-o Compression=no" can be beneficial.
Since the command line is this long, it's best to create an alias for it, as sketched below.
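Because of the nested quoting, a shell function is easier to get right than a literal alias; a minimal sketch, assuming the same placeholder host, credentials and paths as above (the function name is illustrative):

fastmirror() {
    lftp -u user,pwd -e "set sftp:connect-program 'ssh -a -x -T -c arcfour -o Compression=no'; mirror -v -c --loop --use-pget-n=3 -P 2 /remote/dir/ /local/dir/; quit" sftp://remotehost:22
}

Drop it into ~/.bashrc and run fastmirror whenever you want to resync.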
Creates a file whose contents look like the output of `du -a`, only for the remote server's filesystem hierarchy. Very useful for grepping later without a remote connection.
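One way to build such a listing is lftp's built-in find command, which recurses through the remote tree and prints every path; redirect the output to a local file (host and credentials are the same placeholders as above):

lftp -u user,pwd -e "find /remote/dir/; quit" sftp://remotehost:22 > remote-files.txt
grep -i backup remote-files.txt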
Make sure the file you use in your test is > 50 MB to get good results.
sudo apt-get install lftp iperf
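iperf gives you a baseline of raw network throughput to compare the lftp transfer against; start the server on the remote end, then point the client at it from the local machine (classic iperf syntax, remotehost is a placeholder):

iperf -s               # on remotehost
iperf -c remotehost    # on the local machine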
This is for single files only; for directories, 'mirror' has to be used.
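For a single file, the segmented download can be done with pget directly; an illustrative one-shot invocation with placeholder names:

lftp -u user,pwd -e "pget -n 5 /remote/dir/bigfile.iso; quit" sftp://remotehost:22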
It works best if you first log in and then do the fetch:
lftp -u user,pass ftp://site.com/
mirror -c --parallel=3 --use-pget-n=5 "Some folder"
wget/curl/friends are not good at mirroring files off websites, especially those with Apache-generated directory listings. Those tools waste endless time downloading useless index HTML pages. lftp's mirror command does a better job without the mess.
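lftp speaks HTTP as well and parses the index pages itself rather than saving them to disk; an illustrative one-liner against a hypothetical listing URL:

lftp -e "mirror -c /pub/ ./pub/; quit" http://example.com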