Commands tagged lftp (4)

  • Mirror a remote directory using some tricks to maximize network speed.
    lftp: coolest file transfer tool ever
      -u: username and password (the password is merely a placeholder if you have ~/.ssh/id_rsa)
      -e: execute internal lftp commands
      set sftp:connect-program: use a specific command instead of plain ssh
    ssh:
      -a -x -T: disable agent forwarding, X11 forwarding and pseudo-terminal allocation, none of which are needed here
      -c arcfour: use a fast, low-overhead cipher (note: arcfour is cryptographically weak and disabled in modern OpenSSH, so pick another fast cipher there)
      -o Compression=no: disable compression to save CPU
    mirror: copy the remote directory subtree to a local directory
      -v: be verbose (cool progress bar and speed meter, one for each file in parallel)
      -c: continue interrupted file transfers if possible
      --loop: repeat mirror until no differences are found
      --use-pget-n=3: transfer each file with 3 independent parallel TCP connections
      -P 2: transfer 2 files in parallel (6 TCP connections in total)
    sftp://remotehost:22: use the sftp protocol on port 22 (give any other port if appropriate)
    You can play with the values of --use-pget-n and/or -P to achieve maximum speed on a particular network. If the files are compressible, removing "-o Compression=no" can be beneficial. Better to create an alias or wrapper for the command; see the sketch after this entry.


    lftp -u user,pwd -e "set sftp:connect-program 'ssh -a -x -T -c arcfour -o Compression=no'; mirror -v -c --loop --use-pget-n=3 -P 2 /remote/dir/ /local/dir/; quit" sftp://remotehost:22
    colemar · 2014-10-17 00:29:34
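
    Since the one-liner is long, the suggested alias can also be a small shell function. A minimal sketch, assuming a hypothetical name fastmirror and argument order (not part of the original command); arcfour is dropped because modern OpenSSH no longer enables it:

    # Hypothetical wrapper; adapt the name, arguments and cipher to taste.
    # Usage: fastmirror user remotehost /remote/dir/ /local/dir/
    fastmirror() {
        local user=$1 host=$2 remote=$3 dest=$4
        lftp -u "$user,placeholder" -e "set sftp:connect-program 'ssh -a -x -T -o Compression=no'; mirror -v -c --loop --use-pget-n=3 -P 2 $remote $dest; quit" "sftp://$host:22"
    }
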
  • Creates a file with contents like `du -a`, except it captures a remote server's filesystem hierarchy. Very useful later for grepping without opening a remote connection.


    lftp -u<<credentials>> <<server>> -e "du -a;exit" > server-listing.txt
    danbst · 2013-09-24 08:23:44
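
    Once the listing exists locally, later searches need no remote connection at all. A usage sketch (server-listing.txt is the file created above; the pattern is just an example):

    # du -a prints "size<TAB>path"; keep only the paths that match.
    grep '\.conf$' server-listing.txt | awk -F'\t' '{print $2}'
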
  • Requires lftp and this helper script to work (adapt the path and credentials as needed; the script is spelled out below): echo $'#!/bin/bash\n# coded by sputnick under GPL 20101007\nlftp -u user,passwd remoteuser@host.tld -e "$(cat)"' > /PATH/TO/ftp-latest; chmod +x !$


    ftp-latest <<< "cd /; cls -1 | tail -1 | xargs -I% echo get % | /PATH/TO/ftp-latest"
    sputnick · 2010-10-07 17:59:45
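
    The echo one-liner writes a three-line helper that reads lftp commands from stdin; spelled out, /PATH/TO/ftp-latest contains:

    #!/bin/bash
    # coded by sputnick under GPL 20101007
    lftp -u user,passwd remoteuser@host.tld -e "$(cat)"

    The command above then feeds it "cd /; cls -1 | tail -1 | xargs -I% echo get %", so lftp lists the directory, the external pipeline turns the last entry into a get command, and that command is piped into a second invocation of the same script to download the latest file.
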
  • Sets the option ftp:ssl-allow false and excludes some files I don't want to synchronize.


    lftp -u user,pass -e "set ftp:ssl-allow false; mirror --exclude settings.php --exclude .htaccess serverdir devdir" serverhost
    aufumy · 2015-09-08 21:25:41
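
    To preview what mirror would change before it touches devdir, lftp's mirror accepts a dry-run flag (assuming a reasonably recent lftp); a sketch reusing the same placeholders:

    # --dry-run makes mirror print the planned actions instead of performing them.
    lftp -u user,pass -e "set ftp:ssl-allow false; mirror --dry-run --exclude settings.php --exclude .htaccess serverdir devdir; quit" serverhost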



