commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that receive a minimum of 3 and of 10 votes respectively, so that only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).
Creates a file with contents like those of `du -a`, except that it lists a remote server's filesystem hierarchy. Very useful for grepping later without a remote connection.
Requires lftp and this script to work (adapt the path and credentials as needed):
echo $'#!/bin/bash\n# coded by sputnick under GPL 20101007\nlftp -u user,passwd firstname.lastname@example.org -e "$(cat)"' > /PATH/TO/ftp-latest; chmod +x !$
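The listing command itself isn't shown above; as a minimal usage sketch, assuming the helper is saved at /PATH/TO/ftp-latest and reads lftp commands on stdin as written (the output file name is arbitrary), you could use lftp's own du command:

echo 'du -a' | /PATH/TO/ftp-latest > remote-du.txt
grep -i backup remote-du.txt    # search the hierarchy locally, no remote connection needed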
I must monitor a couple of FTP servers every morning WITHOUT using a port scanner.
Instead of manually FTPing to 100 servers to test their status, I use this loop.
It might be adaptable to other services, though some may require a 'logout' string instead of 'quit'.
The file ftps.txt contains the full list of FTP servers to monitor.
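The loop itself isn't reproduced here; a minimal sketch of the idea, assuming the stock ftp client and one hostname per line in ftps.txt, might look like this:

while read -r host; do
    # connect, send 'quit' immediately, and check for the FTP '220' welcome banner
    if echo quit | ftp -nv "$host" 2>/dev/null | grep -q '^220'; then
        echo "$host: up"
    else
        echo "$host: DOWN"
    fi
done < ftps.txt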
`aria2c` (from the aria2 project) allows segmented, multi-connection downloads. Change -s 4 to an arbitrary number of segments to control the number of concurrent connections. It is also possible to provide multiple URLs to the same content (potentially over multiple protocols) to download the file concurrently from multiple hosts.
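The command line itself is missing above; a sketch with placeholder URLs and file name:

aria2c -s 4 http://example.com/big.iso
# the same file fetched concurrently from two mirrors, over two protocols
aria2c http://mirror1.example.com/big.iso ftp://mirror2.example.com/big.iso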
Connect to a remote server over the FTP protocol via a FUSE filesystem, rsync the remote folder to a local one, and then unmount the remote FTP server (FUSE FS).
It can be split into 3 separate commands, as sketched below; curlftpfs and rsync must be installed.
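A minimal sketch of those three steps, with placeholder host, credentials, and paths:

mkdir -p /tmp/remote-ftp                                       # one-off: create the mount point
curlftpfs ftp://user:password@ftp.example.com /tmp/remote-ftp  # 1. mount the FTP server via FUSE
rsync -av /tmp/remote-ftp/some/dir/ /local/mirror/             # 2. sync the remote folder locally
fusermount -u /tmp/remote-ftp                                  # 3. unmount the FUSE filesystem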