Commands by particleflux (2)

  • Benchmark a SQL query against MySQL Server. The example runs the query 10 times and reports the average runtime in the output. To ensure that the query does not get cached, put `RESET QUERY CACHE;` at the top of the query file (a sample query file is sketched after this entry).


    perf stat -r 10 sh -c "mysql > /dev/null < query.sql"
    particleflux · 2018-05-03 12:20:03
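    For illustration, a query.sql along those lines might look like this; the table name and filter are made up for the sketch, not taken from the original entry:

    RESET QUERY CACHE;
    SELECT COUNT(*) FROM orders WHERE created_at > '2018-01-01';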
  • Pipe any JSON to jq, select the array with the appropriate expression, and pipe it to `length` to count its elements (a filtered variant is sketched below).


    echo '[1,2,3]' | jq '. | length'
    particleflux · 2018-05-03 06:34:05
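    A slightly richer sketch of the same idea, counting only the array elements that match a filter (the input here is made up):

    echo '[{"ok":true},{"ok":false},{"ok":true}]' | jq '[.[] | select(.ok)] | length'

    This prints 2.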

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.



Check These Out

Convert CSV to JSON
Replace 'csv_file.csv' with your filename.
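The command itself is not reproduced on this page; one way to do the conversion, sketched with Python's standard library rather than whatever the original entry used:
$ python3 -c 'import csv,json; print(json.dumps(list(csv.DictReader(open("csv_file.csv")))))'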

execute your commands hiding secret bits from history records
$ wget --user=username --password="$password" http://example.org/
Instead of hiding commands entirely from history, I prefer to use `read` to put the password into a variable, and then use that variable in commands instead of the password itself. Without the -e and -s it should work in any Bourne-type shell, but -s is what makes sure the password doesn't get echoed to the screen at all (-e makes editing work a bit better).
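Putting that together, a sketch of the full sequence (the URL and username are the example's placeholders):
$ read -e -s -p "Password: " password; echo
$ wget --user=username --password="$password" http://example.org/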

ssh to machine behind shared NAT
Useful for getting network access to a machine behind a shared-IP NAT. Assumes you have an accessible jump host and physical console or DRAC/iLO/LOM access to run the command. Run the command on the host behind the NAT, then ssh to your jump host on port 2222; that connection to the jump host will be forwarded to the hidden machine. Note: some older versions of ssh do not honour the bind address (0.0.0.0 in the example) and will only listen on the loopback address.
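A minimal sketch of what is being described, with made-up user and host names; run it on the hidden machine:
$ ssh -f -N -R 0.0.0.0:2222:localhost:22 user@jumphost
Afterwards, ssh -p 2222 hiddenuser@jumphost reaches the hidden machine's sshd, provided the jump host's sshd honours the bind address (GatewayPorts).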

FizzBuzz in one line of Bash
The (in)famous "FizzBuzz" programming challenge, answered in a single line of Bash code. The "|column" part at the end merely formats the output a bit, so if "column" is not installed on your machine you can simply omit that part. Without "|column", the solution only uses 75 characters. The version below is expanded to multiple lines, with comments added.

for i in {1..100}  # Use i to loop from "1" to "100", inclusive.
do  ((i % 3)) &&   # If i is not divisible by 3...
      x= ||        # ...blank out x (yes, "x= " does that). Otherwise,...
      x=Fizz       # ...set x to the string "Fizz".
    ((i % 5)) ||   # If i is not divisible by 5, skip (there's no "&&")...
      x+=Buzz      # ...Otherwise, append (not set) the string "Buzz" to x.
    echo ${x:-$i}  # Print x unless it is blanked out. Otherwise, print i.
done | column      # Wrap output into columns (not part of the test).

Show me a histogram of the busiest minutes in a log file:
Busiest seconds:
$ cat /var/log/secure.log | awk '{print substr($0,0,15)}' | uniq -c | sort -nr | awk '{printf("\n%s ", $0); for (i = 0; i < $1; i++) printf("*")}'

Convert (almost) any video file into webm format for online html5 streaming
If you're using the experimental vorbis encoder (homebrew version of libffmpeg)
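The entry's own command is not shown on this page; a generic sketch of such a conversion with ffmpeg, using placeholder filenames:
$ ffmpeg -i input.mp4 -c:v libvpx -b:v 1M -c:a libvorbis output.webm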

Get AWS temporary credentials ready to export based on a MFA virtual appliance
You might want to secure your AWS operations by requiring an MFA token. But then, to use the API or CLI tools, you need to pass credentials generated with that MFA token. This command asks you for the MFA code and retrieves the temporary credentials using the AWS CLI. To print the exports, you can use: `awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'` You must adapt the command line to include:
* $MFA_ID, the ARN of the virtual MFA device or the serial number of the physical one
* the TTL for the credentials
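A sketch of the retrieval step with the AWS CLI, producing the three whitespace-separated fields the awk above expects ($MFA_ID and the 3600-second TTL are the parts to adapt):
$ read -p "MFA code: " MFA_CODE
$ aws sts get-session-token --serial-number "$MFA_ID" --token-code "$MFA_CODE" --duration-seconds 3600 --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' --output text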

dd with progress bar and statistics
Uses the pv utility to show progress of the data transfer and an ETA until completion. You can install pv via Homebrew on macOS.
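The command itself is not reproduced here; a typical sketch of the pattern, with placeholder paths:
$ pv /path/to/source.img | dd of=/path/to/copy.img bs=1M
pv prints throughput, elapsed time, percentage and ETA when it can determine the input size.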

Get all links from commandlinefu front page
You need to install the WWW::Mechanize Perl module with # cpan -i WWW::Mechanize, or find it in your package manager (e.g. searching mechanize | grep perl). With this command you can get forms, images and headers too.
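The actual command is not shown on this page; one way to pull the links with the mech-dump tool that ships with WWW::Mechanize:
$ mech-dump --links http://www.commandlinefu.com/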

remove OSX resource forks ._ files
From the dot_clean man page: for each dir, dot_clean recursively merges all ._* files with their corresponding native files according to the rules specified with the given arguments. By default, if there is an attribute on the native file that is also present in the ._ file, the most recent attribute will be used. If no operands are given, a usage message is output. If more than one directory is given, directories are merged in the order in which they are specified.
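A minimal usage sketch, with a placeholder volume path:
$ dot_clean /Volumes/MyUSB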

