Commands tagged bandwidth (7)

  • Limits apt-get's bandwidth usage; in this example the download is capped at 30 kB/s ;) It should work for most apt-get actions (install, update, upgrade, dist-upgrade, etc.); a sketch applying the same option to other actions follows the list.


    18
    sudo apt-get -o Acquire::http::Dl-Limit=30 upgrade
    alemani · 2010-03-22 01:29:44 12
  • Copies a file from a remote SSH host listening on port 8322, with the bandwidth limited to 100 KB/s. --progress shows a progress bar; --partial keeps partially transferred files, so the transfer can be resumed if something goes wrong; --bwlimit caps the bandwidth at the given KB/s; --ipv4 prefers IPv4. I find it useful to create the alias myscp='rsync --progress --partial --rsh="ssh -p 8322" --bwlimit=100 --ipv4' in ~/.bash_aliases, ~/.bash_profile, ~/.bash_login or ~/.bashrc, wherever appropriate (a sketch of the alias follows the list).


    17
    rsync --progress --partial --rsh="ssh -p 8322" --bwlimit=100 --ipv4 user@domain.com:~/file.tgz .
    ruslan · 2011-02-10 14:25:22 9
  • The command itself is obvious, I know, but maybe not everyone knows that scp's -l parameter limits the bandwidth it uses. In this example I fetch all files from the directory zutaniddu and copy them locally at only 10 Kbit/s (see the unit-conversion sketch after the list).


    11
    scp -l10 pippo@serverciccio:/home/zutaniddu/* .
    0disse0 · 2010-02-19 16:44:24 13
  • On the machine acting as the server, run: iperf -s. On the machine acting as the client, run: iperf -c ip.add.re.ss, where ip.add.re.ss is the IP address or hostname of the server. A slightly more verbose client invocation is sketched after the list.


    8
    iperf -s
    forcefsck · 2011-01-24 07:58:38 19
  • Nethogs groups bandwidth usage by process; a sketch with a couple of common options follows the list.


    5
    sudo nethogs eth0
    totti · 2013-01-25 08:20:44 6
  • On Debian-based systems apt-get can be limited to a specified bandwidth in kilobytes using apt's configuration options (man 5 apt.conf, man apt-get). To quote man 5 apt.conf: "The used bandwidth can be limited with Acquire::http::Dl-Limit which accepts integer values in kilobytes. The default value is 0 which deactivates the limit and tries to use as much of the bandwidth as possible..." and, for HTTPS URIs: "Cache-control, Timeout, AllowRedirect, Dl-Limit and proxy options are the same as for http..." A sketch for making the limit permanent follows the list.


    2
    sudo apt-get -o Acquire::http::Dl-Limit=20 -o Acquire::https::Dl-Limit=20 upgrade -y
    ruslan · 2011-02-14 05:24:49 3
  • Trickle is a simple bandwidth limiter; it lives at http://monkey.org/~marius/pages/?page=trickle (its rate flags are sketched after the list).


    1
    trickle sudo apt-get update -y
    mrman · 2011-02-15 02:05:37 3
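
The first apt-get entry notes that Acquire::http::Dl-Limit works for most apt-get actions; a minimal sketch applying the same 30 kB/s cap to update and install (the package name is only a placeholder):

    # same option, different apt-get actions
    sudo apt-get -o Acquire::http::Dl-Limit=30 update
    sudo apt-get -o Acquire::http::Dl-Limit=30 install some-package   # placeholder name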
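
For the rsync entry, a sketch of the suggested alias dropped into ~/.bash_aliases; the port (8322), the 100 KB/s limit and the remote path are the entry's own example values:

    # ~/.bash_aliases
    alias myscp='rsync --progress --partial --rsh="ssh -p 8322" --bwlimit=100 --ipv4'

    # usage: thanks to --partial, an interrupted transfer resumes when the same command is re-run
    myscp user@domain.com:~/file.tgz .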
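
For the scp entry, note that -l takes its limit in Kbit/s, so -l10 is roughly 1.25 KB/s; a quick conversion sketch using the entry's host and path:

    # -l is in Kbit/s: multiply the desired KB/s by 8
    scp -l 80  pippo@serverciccio:/home/zutaniddu/* .    # ~10 KB/s
    scp -l 800 pippo@serverciccio:/home/zutaniddu/* .    # ~100 KB/s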
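
For the iperf entry, a slightly longer client run; -t (test duration in seconds) and -i (seconds between interim reports) are standard iperf options, and ip.add.re.ss is still the server's address or hostname:

    # server side
    iperf -s

    # client side: run for 30 seconds, printing a report every 5 seconds
    iperf -c ip.add.re.ss -t 30 -i 5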
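
For the nethogs entry, a sketch assuming your build accepts a refresh delay (-d) and more than one interface on the command line (check nethogs -h); eth0 and wlan0 are placeholder interface names:

    # watch two interfaces, refreshing every 5 seconds
    sudo nethogs -d 5 eth0 wlan0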
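
For the apt.conf entry, a sketch that makes the limit permanent via a drop-in file instead of passing -o on every invocation; the file name is arbitrary and 20 is the entry's example value in kilobytes:

    printf '%s\n' 'Acquire::http::Dl-Limit "20";' 'Acquire::https::Dl-Limit "20";' \
      | sudo tee /etc/apt/apt.conf.d/99download-limit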
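
For the trickle entry, a sketch using trickle's own rate flags (-d download, -u upload, both in KB/s). Trickle works by LD_PRELOADing a shim library, so wrapping a setuid binary such as sudo may silently have no effect; putting sudo first is the safer order:

    # cap apt-get at 50 KB/s down / 20 KB/s up (limits are arbitrary example values)
    sudo trickle -d 50 -u 20 apt-get update -y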
