Commands tagged benchmark (9)

  • Run ApacheBench against a local web server. -n 9000 sets the number of requests to perform for the benchmarking session; -c 900 sets the number of requests to issue concurrently.


    10
    ab -n 9000 -c 900 localhost:8080/index.php
    amaymon · 2009-08-07 07:19:40
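
A hedged sketch building on the entry above: rerun ab at increasing concurrency levels and pull the mean requests-per-second figure out of each report. The URL and the concurrency steps are placeholders; adjust them to your server.

    # Sweep concurrency and extract mean requests/second from ab's report.
    URL="http://localhost:8080/index.php"
    for c in 10 50 100 300 900; do
      rps=$(ab -n 9000 -c "$c" "$URL" 2>/dev/null | awk '/Requests per second/ {print $4}')
      echo "concurrency=$c req/s=$rps"
    done
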
  • See http://imgur.com/JgjK2.png for an example plot. Do some serious benchmarking from the command line: for each compressor, this writes a file containing the time it took to compress n bytes, with n increasing by 1 up to 256. Then run gnuplot -persist <(echo "plot 'lzma' with lines, 'gzip' with lines, 'bzip2' with lines") to see the results in graph form.


    3
    for a in bzip2 lzma gzip;do echo -n>$a;for b in $(seq 1 256);do dd if=/dev/zero of=$b.zero bs=$b count=1;c=$(date +%s%N);$a $b.zero;d=$(date +%s%N);total=$(echo $d-$c|bc);echo $total>>$a;rm -f $b.zero *.bz2 *.lzma *.gz;done;done
    matthewbauer · 2009-10-20 01:00:51
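
For readability, here is a multi-line equivalent of the one-liner above, under the same assumptions (GNU date with %N, and bzip2/lzma/gzip on PATH); run it in an empty scratch directory, then plot with the gnuplot call from the description.

    # Time each compressor on zero-filled files of 1..256 bytes; append
    # the elapsed nanoseconds for each size to a file named after the tool.
    for tool in bzip2 lzma gzip; do
      : > "$tool"
      for size in $(seq 1 256); do
        dd if=/dev/zero of="$size.zero" bs="$size" count=1 2>/dev/null
        start=$(date +%s%N)
        "$tool" "$size.zero"
        end=$(date +%s%N)
        echo $((end - start)) >> "$tool"
        rm -f "$size.zero" *.bz2 *.lzma *.gz
      done
    done
    gnuplot -persist <(echo "plot 'lzma' with lines, 'gzip' with lines, 'bzip2' with lines")
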

  • 1
    sync; time dd if=/dev/cciss/c0d1p1 of=/dev/null bs=1M count=10240
    w00binda · 2009-11-19 10:34:13
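
This times a 10 GB sequential read from a raw device, so a warm page cache can inflate a repeat run. A sketch for a colder measurement, assuming Linux and root; /dev/sda1 stands in for your device:

    # Flush dirty pages and drop the page cache so dd hits the disk, not RAM.
    sync
    echo 3 | sudo tee /proc/sys/vm/drop_caches >/dev/null
    time sudo dd if=/dev/sda1 of=/dev/null bs=1M count=10240
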
  • Usage: CPUBENCH 4 2500 runs the benchmark on 4 cores with 2500 pi digits each. Every core will run at 100% CPU, and you can see how fast they calculate it. With 50000 digits or more it can take hours or days.


    1
    CPUBENCH() { local CPU="${1:-1}"; local SCALE="${2:-5000}"; { for LOOP in `seq 1 $CPU`; do { time echo "scale=${SCALE}; 4*a(1)" | bc -l -q | grep -v ^"[0-9]" & } ; done }; echo "Cores: $CPU"; echo "Digit: $SCALE" ;}
    emphazer · 2018-05-14 17:30:37
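
A usage sketch, assuming the function above is defined in the current shell and GNU coreutils' nproc is available, so the benchmark sizes itself to the machine:

    # One bc worker per available core, 2500 digits of pi each.
    CPUBENCH "$(nproc)" 2500
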

  • 0
    sync; time (dd if=/dev/zero of=bigfile bs=1M count=2048 && sync)
    w00binda · 2009-11-19 10:29:03
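
The trailing sync matters: without it the 2 GB would land in the page cache and the figure would measure RAM. An alternative sketch using dd's oflag=direct (GNU dd, on a filesystem that supports O_DIRECT) to bypass the cache entirely:

    # O_DIRECT writes skip the page cache; remove the test file afterwards.
    time dd if=/dev/zero of=bigfile bs=1M count=2048 oflag=direct
    rm -f bigfile
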
  • Iozone with a 2 GB file and a 64 KB record size, running the write/rewrite and read/re-read tests with a single thread.


    0
    iozone -s 2g -r 64 -i 0 -i 1 -t 1
    w00binda · 2009-11-19 10:43:54
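
A variant sketch for a multi-threaded run that also keeps a record: -t 4 uses four worker threads and -b writes the results as a spreadsheet (the filename is a placeholder).

    # Same write/rewrite and read/re-read tests, four threads, results saved.
    iozone -s 2g -r 64 -i 0 -i 1 -t 4 -b results.xls
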
  • Benchmark every URL in a list with ApacheBench, one parallel job per line. The same works with the URLs given inline: (echo "https://example.com/"; echo "https://example.com/"; echo "https://example.com/"; echo "https://example.com/") | parallel -k 'ab -n 10000 -c 15 {}'


    0
    cat url_list.txt | parallel -k 'ab -n 10000 -c 15 {}'
    emphazer · 2018-05-17 11:23:28
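
A minimal end-to-end sketch; url_list.txt and the example.com URLs are placeholders. parallel's -k keeps the reports in input order so they stay readable.

    # Build a URL list, then run one ApacheBench job per line.
    printf '%s\n' https://example.com/ https://example.com/about > url_list.txt
    parallel -k 'ab -n 10000 -c 15 {}' < url_list.txt
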
  • You could have that little benchmark run on all cores in parallel, as a multi-core benchmark or stress test. First find the number of cores, then have parallel iterate over that in, well, parallel.


    -1
    time cat /proc/cpuinfo |grep proc|wc -l|xargs seq|parallel -N 0 echo "2^2^20" '|' bc
    kostis · 2018-12-06 05:36:55
  • Broken into two parts: first get the number of cores with cat /proc/cpuinfo | grep proc | wc -l and create an integer sequence of that length (xargs seq), then have GNU parallel loop that many times over the given command. A cleaner nproc-based equivalent is sketched after this entry. Cheers!


    -2
    time cat /proc/cpuinfo |grep proc|wc -l|xargs seq|parallel -N 0 echo "scale=4000\; a\(1\)\*4" '|' bc -l
    kostis · 2018-12-06 05:15:24
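
As noted above, a cleaner equivalent of both stress tests using nproc (GNU coreutils) instead of parsing /proc/cpuinfo; -N 0 makes parallel run the command once per input line without inserting the line itself.

    # Integer stress test: one bc per core computing 2^2^20.
    time seq "$(nproc)" | parallel -N 0 'echo "2^2^20" | bc'
    # Floating-point variant: 4000 digits of pi per core.
    time seq "$(nproc)" | parallel -N 0 'echo "scale=4000; a(1)*4" | bc -l'
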
