wget, curl, and friends are not good at mirroring files off websites, especially those with Apache-generated directory listings: they endlessly waste time downloading useless index HTML pages. lftp's mirror command does a better job without the mess.
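A minimal sketch of the idea (host and paths are placeholders):
lftp -e 'mirror --verbose /pub/dir /local/dir; quit' http://example.com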
Connect to a remote server using the FTP protocol over a FUSE file system, then rsync the remote folder to a local one, and finally unmount the remote FTP server (the FUSE FS). It can be divided into 3 different commands, and you need curlftpfs and rsync installed.
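The three steps might look like this (host, credentials, and paths are placeholders):
curlftpfs ftp://user:pass@ftp.example.com /mnt/ftp
rsync -av /mnt/ftp/ /local/dir/
fusermount -u /mnt/ftp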
If the username includes an @ you can use this one: wget -r --user=username_here --password=pass_here ftp://ftp.example.com
While `sshfs $REMOTE_HOST:$REMOTE_PATH $LOCAL_PATH` "pulls" a directory from the remote server to the local host, the command described here does the reverse and "pushes" a directory from the local host to the remote server. It makes use of the "slave" option of sshfs, which instructs it to communicate over plain stdin/stdout, and the `dpipe` tool from vde2 to connect the sftp-server stdout to the sshfs stdin and vice versa.
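The command itself is not shown above; a sketch of the described pattern, assuming the sftp-server binary lives at /usr/lib/openssh/sftp-server:
dpipe /usr/lib/openssh/sftp-server = ssh $REMOTE_HOST sshfs :$LOCAL_PATH $REMOTE_PATH -o slave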
Mirror a remote directory using some tricks to maximize network speed.
lftp: coolest file transfer tool ever
-u: username and password (the password is merely a placeholder if you have ~/.ssh/id_rsa)
-e: execute internal lftp commands
set sftp:connect-program: use a specific command instead of plain ssh
ssh -a -x -T: disable useless things (agent forwarding, X11 forwarding, pseudo-tty)
-c arcfour: use the most efficient cipher specification
-o Compression=no: disable compression to save CPU
mirror: copy a remote dir subtree to a local dir
-v: be verbose (cool progress bar and speed meter, one for each file in parallel)
-c: continue interrupted file transfers if possible
--loop: repeat mirror until no differences are found
--use-pget-n=3: transfer each file with 3 independent parallel TCP connections
-P 2: transfer 2 files in parallel (totalling 6 TCP connections)
sftp://remotehost:22: use the sftp protocol on port 22 (you can give any other port if appropriate)
You can play with the values of --use-pget-n and/or -P to achieve maximum speed depending on the particular network. If the files are compressible, removing "-o Compression=no" can be beneficial. Better to create an alias for the command.
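Assembled from the options above (user, host, and directories are placeholders):
lftp -u user,pwd -e 'set sftp:connect-program "ssh -a -x -T -c arcfour -o Compression=no"; mirror -v -c --loop --use-pget-n=3 -P 2 /remote/dir /local/dir; quit' sftp://remotehost:22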
To download the latest version of "util"; maybe insert a sort if the files aren't listed in the right order. curl lists all files on the mirror, grep filters for your util, tail -1 takes the one listed at the bottom, and wget fetches it.
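A sketch of the pipeline, with the sort included (mirror URL and file naming scheme are assumptions):
url=ftp://mirror.example.com/pub/
file=$(curl -s "$url" | awk '{print $NF}' | grep '^util-' | sort -V | tail -1)
wget "$url$file"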
I use this to connect via sftp to a server listening on a non-default SSH port.
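The command itself isn't shown; a typical form (port and host are placeholders):
sftp -oPort=2222 user@host.example.com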
seq allows you to define printf-like formatting, specified with -f; %03g tells seq to print three digits, filling the blank digits with 0, over the range from 176 to 240.
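For example, to fetch a zero-padded numbered series of files (the URL is hypothetical):
for i in $(seq -f "%03g" 176 240); do wget "http://example.com/img$i.jpg"; done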
This is for files only; for directories, 'mirror' has to be used.
This command will download $file via the server. I've used this when FTP was broken at the office and I needed to download some software packages.
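The original command isn't shown; one common pattern for routing a download through an intermediate host you can SSH to (assuming $url holds the download link and the hostnames are placeholders):
ssh user@server "wget -qO- '$url'" > "$file"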
I use this in my .bashrc file. This will also do auto-completion for scp and sftp.
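The snippet itself isn't shown; a common approach is to complete hostnames from ~/.ssh/config (a sketch, not necessarily the original):
complete -o default -W "$(awk '/^Host [^*]/ {print $2}' ~/.ssh/config 2>/dev/null)" ssh scp sftp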
Creates a file with contents like those of `du -a`, only for the remote server's filesystem hierarchy. Very useful afterwards for grepping without a remote connection.
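The command isn't shown; one way to build such an index, assuming shell access over SSH (host and filename are placeholders):
ssh user@host 'du -a /' > remote-du.txt
grep pattern remote-du.txt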
Simple tcpdump grepping for common unsafe protocols (HTTP, POP3, SMTP, FTP).
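A sketch of such a capture filter on the default ports of those protocols (the interface name is a placeholder):
sudo tcpdump -i eth0 -n 'tcp port 80 or tcp port 110 or tcp port 25 or tcp port 21'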
Usage:
sftp-cp * | sftp user@host:/dir
This is useful if there is a process on the remote machine waiting for files in an incoming directory. This way it won't see half-transmitted files, provided it ignores hidden files.
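The sftp-cp script itself isn't shown; a minimal sketch of what it might do, emitting sftp batch commands that upload each file under a hidden temporary name and then rename it into place:
sftp-cp() {
  local f
  for f in "$@"; do
    printf 'put "%s" ".%s"\n' "$f" "$f"
    printf 'rename ".%s" "%s"\n' "$f" "$f"
  done
}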
Prompts the user for a username and password, which are then exported to http_proxy for use by wget, yum, etc. Default user, web proxy, and port are used. Using this script prevents the cleartext user and password from appearing in your bash_history and on-screen.
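The script isn't shown; a minimal sketch, assuming a hypothetical proxy host and port:
read -p "Proxy username [$USER]: " u; u=${u:-$USER}
read -s -p "Proxy password: " p; echo
export http_proxy="http://$u:$p@webproxy.example.com:8080"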
I must monitor a couple of FTP servers every morning, WITHOUT a port scanner. Instead of FTP'ing to 100 FTP servers manually to test their status, I use this loop. It might be adaptable to other services; however, it may require a 'logout' string instead of 'quit'. The file ftps.txt contains the full list of FTP servers to monitor.
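The loop itself isn't shown; a sketch of the idea (checking for the 220 greeting in the banner is an assumption):
while read -r host; do
  echo quit | ftp -n "$host" 2>&1 | grep -q '220' && echo "$host up" || echo "$host DOWN"
done < ftps.txt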
By putting ! in front of a command, we are able to run it locally from within an ftp session.
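For example, at the ftp prompt:
ftp> !ls
runs ls in the local shell without leaving the session.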