commandlinefu.com is the place to record those command-line gems that you return to again and again.
Nethogs groups bandwidth usage by process; point it at the interface you want to watch, e.g. sudo nethogs eth0.
On Debian-based systems, apt-get can be limited to a specified bandwidth in kilobytes per second using the apt configuration options (see man 5 apt.conf and man apt-get). To quote man 5 apt.conf:
"The used bandwidth can be limited with Acquire::http::Dl-Limit which accepts integer values in kilobyte. The default value is 0 which deactivates the limit and tries uses as much as possible of the bandwidth..."
"HTTPS URIs. Cache-control, Timeout, AllowRedirect, Dl-Limit and proxy options are the same as for http..."
The command copies a file from a remote SSH host on port 8322 with a bandwidth limit of 100 KB/s:
--progress shows a progress bar
--partial keeps partially transferred files, so you can resume the process if something goes wrong
--bwlimit limits bandwidth to the specified rate in KB/s
--ipv4 prefers IPv4
I find it useful to create the following alias:
alias myscp='rsync --progress --partial --rsh="ssh -p 8322" --bwlimit=100 --ipv4'
in ~/.bash_aliases, ~/.bash_profile, ~/.bash_login or ~/.bashrc where appropriate.
On the machine acting as the server, run:
iperf -s
On the machine acting as the client, run:
iperf -c ip.add.re.ss
where ip.add.re.ss is the IP address or hostname of the server.
This limits the bandwidth used by apt-get; in this example the download rate is capped at 30 kB/s, e.g. apt-get -o Acquire::http::Dl-Limit=30 upgrade ;)
It should work for most apt-get actions (install, update, upgrade, dist-upgrade, etc.).
The command is obvious, I know, but maybe not everyone knows that the "-l" parameter lets you limit the bandwidth scp uses.
In this example, all files in the directory zutaniddu are fetched and copied locally using only 10 Kb/s.
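A sketch of such a transfer (the user, host, and directory below are placeholders; note that scp's -l takes the limit in kilobits per second, so -l 10 is roughly 1.25 KB/s):

```shell
# Hypothetical remote fetch, throttled with -l (limit in kilobits per second):
#   scp -l 10 user@example.com:zutaniddu/* .

# scp also accepts two local paths, so the option can be demonstrated locally:
printf 'scp demo' > /tmp/scp_src.txt
scp -l 10 /tmp/scp_src.txt /tmp/scp_dst.txt
cat /tmp/scp_dst.txt
```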