Commands tagged bandwidth (7)

  • Limits the bandwidth used by apt-get; in this example the transfer is capped at 30 KB/s ;) It should work for most apt-get actions (install, update, upgrade, dist-upgrade, etc.); a quick sketch for other actions follows this entry.


    18
    sudo apt-get -o Acquire::http::Dl-Limit=30 upgrade
    alemani · 2010-03-22 01:29:44 13
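
    A sketch of the same option applied to other apt-get actions, assuming a Debian-based system (somepackage is just a placeholder):

    sudo apt-get -o Acquire::http::Dl-Limit=30 update
    sudo apt-get -o Acquire::http::Dl-Limit=30 install somepackage
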
  • Copies a file from a remote SSH host listening on port 8322, with a bandwidth limit of 100 KB/s. --progress shows a progress bar; --partial keeps partially transferred files, so you can resume the transfer if something goes wrong; --bwlimit caps bandwidth at the given KB/s; --ipv4 prefers IPv4. I find it useful to create the alias myscp='rsync --progress --partial --rsh="ssh -p 8322" --bwlimit=100 --ipv4' in ~/.bash_aliases, ~/.bash_profile, ~/.bash_login or ~/.bashrc, whichever is appropriate (a short sketch of the alias in use follows this entry).


    17
    rsync --progress --partial --rsh="ssh -p 8322" --bwlimit=100 --ipv4 user@domain.com:~/file.tgz .
    ruslan · 2011-02-10 14:25:22 11
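
    A sketch of the alias suggested above and how it might be invoked; user@domain.com and file.tgz are simply the example values from the command:

    alias myscp='rsync --progress --partial --rsh="ssh -p 8322" --bwlimit=100 --ipv4'
    myscp user@domain.com:~/file.tgz .
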
  • The command is obvious, I know, but maybe not everyone knows that the -l parameter lets you limit the bandwidth scp uses. In this example all files in the directory zutaniddu are fetched and copied locally at only 10 Kbit/s (scp's -l takes its value in Kbit/s); see the example below.


    11
    scp -l10 pippo@serverciccio:/home/zutaniddu/* .
    0disse0 · 2010-02-19 16:44:24 14
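
    Since -l is specified in Kbit/s, multiply the desired KB/s by roughly 8; a sketch capping a recursive copy at about 100 KB/s, reusing the host and path from the entry:

    scp -r -l 800 pippo@serverciccio:/home/zutaniddu/ .
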
  • On the machine acting as the server, run: iperf -s. On the machine acting as the client, run: iperf -c ip.add.re.ss, where ip.add.re.ss is the IP address or hostname of the server. A slightly fuller example follows this entry.


    8
    iperf -s
    forcefsck · 2011-01-24 07:58:38 20
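
    A sketch of a typical test run, assuming iperf (version 2) is installed on both hosts; -t sets the test length in seconds, -P runs parallel streams, and 192.168.1.10 is a made-up server address:

    # on the server
    iperf -s
    # on the client
    iperf -c 192.168.1.10 -t 30 -P 4
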
  • Nethogs groups bandwidth usage by process; a sketch with a couple of extra options follows this entry.


    5
    sudo nethogs eth0
    totti · 2013-01-25 08:20:44 6
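
    Nethogs can also watch several interfaces at once and refresh at a chosen interval with -d (seconds); a sketch, assuming both eth0 and wlan0 exist on the machine:

    sudo nethogs -d 5 eth0 wlan0
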
  • On Debian-based systems apt-get can be limited to a specified bandwidth, in kilobytes, using the apt configuration options (man 5 apt.conf, man apt-get). To quote man 5 apt.conf: "The used bandwidth can be limited with Acquire::http::Dl-Limit which accepts integer values in kilobytes. The default value is 0, which deactivates the limit and tries to use as much as possible of the bandwidth..." and, for HTTPS URIs: "Cache-control, Timeout, AllowRedirect, Dl-Limit and proxy options are the same as for http...". A sketch of a persistent configuration follows this entry.


    2
    sudo apt-get -o Acquire::http::Dl-Limit=20 -o Acquire::https::Dl-Limit=20 upgrade -y
    ruslan · 2011-02-14 05:24:49 4
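
    To make the limit permanent instead of passing -o every time, the same options can go in a file under /etc/apt/apt.conf.d/; a minimal sketch (the file name 75download-limit is arbitrary):

    // /etc/apt/apt.conf.d/75download-limit
    Acquire::http::Dl-Limit "20";
    Acquire::https::Dl-Limit "20";
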
  • Trickle is a simple bandwidth limiter; it lives at http://monkey.org/~marius/pages/?page=trickle. A usage sketch follows this entry.


    1
    trickle sudo apt-get update -y
    mrman · 2011-02-15 02:05:37 4
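
    Trickle's rate flags are -d (download) and -u (upload), both in KB/s. It works through an LD_PRELOAD shim, so it only affects dynamically linked programs, and since sudo normally strips LD_* variables from the environment, placing trickle after sudo is generally the more reliable order. A hedged sketch (the URL is a placeholder):

    sudo trickle -d 50 -u 20 apt-get update
    trickle -d 50 -u 20 wget http://example.com/file.iso
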

