Commands by kostis (4)

  • This is a standard procedure for me whenever I set up a new Raspberry Pi system. Because the default user is "pi", I quickly replace it with my own (e.g. "kostis"), but before deleting the default account I first have to add my new user to all of pi's groups. xargs helps a lot with that in a single line, while avoiding boring "for" loops; see the breakdown below. For anything trickier, there's always "parallel" :)


    Votes: 3
    groups pi | xargs -n 1 | tail -n +4 | xargs -n 1 sudo adduser kostis
    kostis · 2022-01-25 07:20:09 447
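
    A commented breakdown of the pipeline (on Debian-based systems such as Raspberry Pi OS, "groups pi" prints a single line like "pi : pi adm dialout ...", so the first three whitespace-separated tokens are the user name, the colon separator and the primary group):

    # groups pi    -> "pi : pi adm dialout sudo ..." on one line
    # xargs -n 1   -> split that line into one token per line
    # tail -n +4   -> drop the first three tokens: "pi", ":" and the primary group "pi"
    # xargs -n 1 sudo adduser kostis -> run "sudo adduser kostis <group>" per remaining group
    groups pi | xargs -n 1 | tail -n +4 | xargs -n 1 sudo adduser kostis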
  • As an alternative to the above command, this one ditches the unnecessary and complicated "for" loop in favor of a much faster multi-core approach. The task is more CPU- than I/O-intensive, which makes it a perfect fit for GNU parallel; a comparison with the serial loop follows below.


    Votes: 4
    parallel cwebp -q 80 {} -o {.}.webp ::: *.png
    kostis · 2018-12-07 23:37:24 32
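
    For comparison, a minimal sketch of the kind of serial loop this replaces (the "above command" referenced in the description is not shown in this excerpt, so the loop here is an assumption; cwebp comes from the libwebp package, and parallel's "{.}" placeholder is the input file name minus its extension):

    # serial: convert one PNG at a time on a single core
    for f in *.png; do cwebp -q 80 "$f" -o "${f%.png}.webp"; done

    # parallel: one cwebp job per CPU core by default
    parallel cwebp -q 80 {} -o {.}.webp ::: *.png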
  • You could have that little benchmark run on all cores in parallel, as a multi-core benchmark or stress test. First find the number of cores, then have parallel iterate over that in, well, parallel.


    Votes: -1
    time cat /proc/cpuinfo |grep proc|wc -l|xargs seq|parallel -N 0 echo "2^2^20" '|' bc
    kostis · 2018-12-06 05:36:55 1059
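
    A slightly cleaner equivalent of the same idea (assuming GNU coreutils' nproc is available; parallel's -N 0 runs the command once per input line without inserting the line as an argument):

    # one bc job per core, each computing 2^2^20 (a number of over 315,000 decimal digits)
    time seq "$(nproc)" | parallel -N 0 'echo "2^2^20" | bc'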
  • Broken into two parts: first get the number of cores with "cat /proc/cpuinfo | grep proc | wc -l" and turn that count into an integer sequence (xargs seq), then have GNU parallel loop that many times over the given command. Cheers!


    Votes: -2
    time cat /proc/cpuinfo |grep proc|wc -l|xargs seq|parallel -N 0 echo "scale=4000\; a\(1\)\*4" '|' bc -l
    kostis · 2018-12-06 05:15:24 674
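
    In bc -l, a(1) is arctan(1) = pi/4, so each job computes pi to 4000 decimal places. The same command with the core count taken from nproc instead (an assumption; the original counts the "processor" lines in /proc/cpuinfo):

    # one bc job per core, each computing pi = 4 * arctan(1) at scale=4000
    time seq "$(nproc)" | parallel -N 0 'echo "scale=4000; a(1)*4" | bc -l'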



Check These Out

Monitor open connections for httpd, including listening sockets, counted and sorted per IP
It's not my code, but I found it useful for checking how many open connections per requesting IP I have on a machine, so I can debug connections without opening another HTTP connection to do it. You can also sort the output differently than the way it appears here.
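
The entry's actual command is not reproduced in this excerpt; a hypothetical reconstruction along the lines the title suggests (assumes httpd listening on port 80 and the net-tools netstat):

    # list TCP sockets, keep those on local port 80, then count and sort per remote IP
    netstat -tan | awk '$4 ~ /:80$/ {split($5, a, ":"); print a[1]}' | sort | uniq -c | sort -rn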

Convert seconds to [DD:][HH:]MM:SS
Converts any number of seconds into days, hours, minutes and seconds.

    sec2dhms() {
        declare -i SS="$1" \
            D=$(( SS / 86400 )) \
            H=$(( SS % 86400 / 3600 )) \
            M=$(( SS % 3600 / 60 )) \
            S=$(( SS % 60 ))
        [ "$D" -gt 0 ] && echo -n "${D}:"
        [ "$H" -gt 0 ] && printf "%02g:" "$H"
        printf "%02g:%02g\n" "$M" "$S"
    }
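
Example usage (days and hours are only printed when nonzero):

    sec2dhms 75       # -> 01:15
    sec2dhms 3661     # -> 01:01:01
    sec2dhms 90061    # -> 1:01:01:01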

Decode base64-encoded file in one line of Perl
Another option is openssl.
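
Neither one-liner is reproduced in this excerpt; plausible versions of both (an assumption; MIME::Base64 ships with core Perl):

    # decode a base64-encoded file with Perl
    perl -MMIME::Base64 -ne 'print decode_base64($_)' < encoded.txt > decoded.bin

    # the openssl alternative mentioned above
    openssl base64 -d -in encoded.txt -out decoded.bin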

Find the 10 biggest files in the current directory and subdirectories, sorted by file size

Get AWS temporary credentials ready to export, based on a virtual MFA device
You might want to secure your AWS operations by requiring an MFA token. But then, to use the API or CLI tools, you need to pass credentials generated with that MFA token. This command asks you for the MFA code and retrieves the temporary credentials using the AWS CLI. To print the exports, you can use:

    awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'

You must adapt the command line to include:
* $MFA_ID: the ARN of the virtual MFA device, or the serial number of the physical one
* the TTL for the credentials
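
A sketch of the kind of command the description refers to (not the original; it assumes the AWS CLI and an $MFA_ID variable holding the device ARN or serial number, with a one-hour TTL):

    # prompt for the current MFA code
    read -p "MFA code: " MFA_CODE
    # fetch temporary credentials as three tab-separated fields,
    # in the order the awk snippet above expects ($1 $2 $3)
    aws sts get-session-token \
        --serial-number "$MFA_ID" \
        --token-code "$MFA_CODE" \
        --duration-seconds 3600 \
        --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' \
        --output text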

Purge configuration files of removed packages on Debian-based systems

Decode base64-encoded file in one line of Perl
If you are in an environment where you don't have the base64 executable or MIME tools available, this can be very handy for salvaging email attachments when the headers are mangled but the encoded document itself is intact.

old man's advice

Temporarily ignore known SSH hosts
You may also create an alias, which I did ;-)

    alias sshu="ssh -o UserKnownHostsFile=/dev/null "
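
Note that with UserKnownHostsFile alone, ssh still asks you to confirm each host key; adding StrictHostKeyChecking=no (my addition, not part of the original alias) suppresses that prompt as well:

    # accept any host key without prompting and without recording it
    alias sshu='ssh -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no'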

dd with progress bar and statistics to gzipped image

