Commands by sairon (7)

  • The pwgen program generates passwords which are designed to be easily memorized by humans, while being as secure as possible. Human-memorable passwords are never going to be as secure as completely random passwords. [from the pwgen man page] (A fully random variant is sketched below the command.)


    6
    pwgen 30 1
    sairon · 2011-07-24 19:43:48 5
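
    If you prefer fully random, non-memorable passwords instead, pwgen's -s flag does that; a minimal sketch (the -y/-n flags shown are standard pwgen options, not part of the original command):

    # one fully random (secure, non-memorable) 30-character password
    pwgen -s 30 1
    # same, but also require at least one symbol and one numeral
    pwgen -sy -n 30 1
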
  • This one-liner greps the first 30 direct URLs to .torrent files matching your search query, ordered by number of seeds, descending (the sort order is set by the second number after your query, in this case 7; for other options just check the site in your favorite web browser). You don't have to grep for the torrent names as well, because they are already included in the .torrent URL (spaces and some other characters are replaced by underscores, but the names remain human-readable). Be sure to have an http://isup.me/ macro handy (someone often kicks the ethernet cables out of their servers ;) ). I've also written a more user-friendly ash script (should be Bash compatible) that also lists the total download size and the number of seeds/peers, available at http://saironiq.blogspot.com/2011/04/my-shell-scripts-4-thepiratebayorg.html (it may need some tweaking, as it was written for a router running OpenWrt and transmission). A small wrapper function is sketched below the command. Happy downloading!


    4
    wget -U Mozilla -qO - "http://thepiratebay.org/search/your_querry_here/0/7/0" | grep -o 'http\:\/\/torrents\.thepiratebay\.org\/.*\.torrent'
    sairon · 2011-04-15 15:01:16 30
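
    A minimal wrapper around the one-liner above, assuming the same thepiratebay.org URL layout; the function name tpb_search, the hard-coded sort code 7 (seeds, descending) and the percent-encoding of spaces are my own illustrative choices:

    # usage: tpb_search "your query here"
    tpb_search() {
        local query="${1// /%20}"    # assumes the search path accepts percent-encoded spaces
        wget -U Mozilla -qO - "http://thepiratebay.org/search/${query}/0/7/0" \
            | grep -o 'http\:\/\/torrents\.thepiratebay\.org\/.*\.torrent'
    }
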
  • The first part, wget -qO - "http://www.google.com/dictionary/json?callback=dict_api.callbacks.id100&q=steering+wheel&sl=en&tl=en&restrict=pr,de&client=te", does the actual Google Dictionary query and returns a JSON string wrapped in a callback tag. The first sed, 's/dict_api\.callbacks.id100.//', removes the beginning of that tag, and the second, 's/,200,null)//', removes its end. The output also contains some special characters that may cause problems with certain JSON parsers, so if you get errors, this is probably the cause (sed is your friend). I also like to trim the "webDefinitions" part, because it (sometimes) contains misleading information: sed 's/\,\"webDefinitions.*//' (but remember to append a "}" at the end, because otherwise the JSON string will be invalid; the full pipeline is sketched below the command). The output also contains links to mp3 files with pronunciation. As of now, this is only usable for English; if you choose a language other than English, you will only get webDefinitions (which are crap).


    1
    wget -qO - "http://www.google.com/dictionary/json?callback=dict_api.callbacks.id100&q=steering+wheel&sl=en&tl=en&restrict=pr,de&client=te" | sed 's/dict_api\.callbacks.id100.//' | sed 's/,200,null)//'
    sairon · 2011-03-08 15:00:39 16
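
    Putting the optional webDefinitions trim and the re-appended closing brace together, a sketch of the full pipeline (the query word and callback id are just the ones from the example above; this dictionary API has since been shut down, so the sketch is illustrative only):

    wget -qO - "http://www.google.com/dictionary/json?callback=dict_api.callbacks.id100&q=steering+wheel&sl=en&tl=en&restrict=pr,de&client=te" \
        | sed 's/dict_api\.callbacks.id100.//' \
        | sed 's/,200,null)//' \
        | sed 's/,"webDefinitions.*//' \
        | sed 's/$/}/'    # re-append the closing brace removed by the webDefinitions trim
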
  • Records audio from your mic in FLAC (Free Lossless Audio Codec) format; recording starts only after it detects at least 0.1 seconds of noise and stops after 1 second of silence. You can adjust the percent values (sensitivity) to best fit your microphone and voice (0.1% if you have a great-quality mic, higher if you don't; 0% does not trim anything). An annotated copy of the command follows below. Useful for speech recognition in conjunction with my previous command titled 'Google voice recognition "API"' (http://www.commandlinefu.com/commands/view/8043/google-voice-recognition-api).


    1
    sox -t alsa default ./recording.flac silence 1 0.1 5% 1 1.0 5%
    sairon · 2011-03-08 14:36:39 4
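
    The same command with the silence arguments spelled out, as I read SoX's silence effect (period/duration/threshold triples for the start and stop conditions):

    # silence 1 0.1 5%  -> start keeping audio once 0.1 s of sound rises above the 5% threshold
    # ... 1 1.0 5%      -> stop once 1.0 s of sound stays below the 5% threshold
    sox -t alsa default ./recording.flac silence 1 0.1 5% 1 1.0 5%
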
  • EDIT: command updated to support accented characters! Works in any of the 58 languages Google supports (some sound like crap; English is the best, IMO). You get an mp3 file containing your query as spoken language. There is a limit of 100 characters for the "q" parameter, so be careful (a small wrapper that checks this and encodes spaces is sketched below). The "tl" parameter selects the target language.


    37
    wget -q -U Mozilla -O output.mp3 "http://translate.google.com/translate_tts?ie=UTF-8&tl=en&q=hello+world"
    sairon · 2011-03-08 14:05:36 23
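
    A hedged wrapper around the same endpoint; the function name say_it is my own, and only spaces are encoded (other special characters would need full URL-encoding):

    # usage: say_it en "hello world"   -> writes output.mp3
    say_it() {
        local tl="$1" text="$2"
        if [ "${#text}" -gt 100 ]; then
            echo "query exceeds the 100-character limit" >&2
            return 1
        fi
        wget -q -U Mozilla -O output.mp3 \
            "http://translate.google.com/translate_tts?ie=UTF-8&tl=${tl}&q=${text// /+}"
    }
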
  • The FLAC audio must be encoded at a 16000 Hz sampling rate (SoX is your friend; a conversion sketch follows below). Outputs a short JSON string; the actual speech is in hypotheses->utterance and the accuracy is stored in hypotheses->confidence (ranging from 0 to 1). Google also accepts audio in a special Speex format (audio/x-speex-with-header-byte), which is much smaller than lossless FLAC, but I haven't been able to encode such a sample.


    3
    wget -q -U "Mozilla/5.0" --post-file speech.flac --header="Content-Type: audio/x-flac; rate=16000" -O - "http://www.google.com/speech-api/v1/recognize?lang=en-us&client=chromium"
    sairon · 2011-03-08 13:39:01 4
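
    One way to turn an arbitrary recording into the required mono 16000 Hz FLAC before posting it; the input file name recording.wav is just an example:

    # resample/convert to mono 16 kHz FLAC (SoX does the work)
    sox recording.wav speech.flac rate 16000 channels 1
    # then send it to the recognizer exactly as above
    wget -q -U "Mozilla/5.0" --post-file speech.flac \
        --header="Content-Type: audio/x-flac; rate=16000" -O - \
        "http://www.google.com/speech-api/v1/recognize?lang=en-us&client=chromium"
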
  • Substitute "example" with the desired string; tl = target language (en, fr, de, hu, ...); you can leave the sl parameter as-is (auto-detection works fine). A tiny wrapper function is sketched below the command.


    5
    wget -U "Mozilla/5.0" -qO - "http://translate.google.com/translate_a/t?client=t&text=translation+example&sl=auto&tl=fr" | sed 's/\[\[\[\"//' | cut -d \" -f 1
    sairon · 2011-03-06 13:46:16 109
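
    Wrapped into a tiny function for convenience; the name translate and the + substitution for spaces are my own additions, while the fragile sed/cut parsing is unchanged from the command above:

    # usage: translate fr "translation example"
    translate() {
        local tl="$1" text="${2// /+}"    # spaces become + in the query string
        wget -U "Mozilla/5.0" -qO - \
            "http://translate.google.com/translate_a/t?client=t&text=${text}&sl=auto&tl=${tl}" \
            | sed 's/\[\[\[\"//' | cut -d \" -f 1
    }
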

Check These Out

back up your commandlinefu contributed commands
Use `zless` to read the content of your *rss.gz file: $ zless commandlinefu-contribs-backup-2009-08-10-07.40.39.rss.gz

Convert seconds to [DD:][HH:]MM:SS
Converts any number of seconds into days, hours, minutes and seconds.

    sec2dhms() {
        declare -i SS="$1" D=$(( SS / 86400 )) H=$(( SS % 86400 / 3600 )) M=$(( SS % 3600 / 60 )) S=$(( SS % 60 ))
        [ "$D" -gt 0 ] && echo -n "${D}:"
        [ "$H" -gt 0 ] && printf "%02g:" "$H"
        printf "%02g:%02g\n" "$M" "$S"
    }

Let's make screen and ssh-agent friends
When you start screen as `ssh-agent screen`, the agent dies after you detach. If you don't want to manage the files holding the agent's pid/socket/etc. yourself, use this command.

Which processes are listening on a specific port (e.g. port 80)
Swap out "80" for the port you're interested in. You can use a port number or a named port, e.g. "http".
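
The linked command itself isn't reproduced here; one common way to do this (a sketch, not necessarily the original submission) is:

    # processes listening on TCP port 80 (root needed to see other users' processes)
    sudo lsof -iTCP:80 -sTCP:LISTEN
    # or, with iproute2:
    sudo ss -ltnp | grep ':80 '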

Find the package that installed a command

Get the size of all the directories in current directory (Sorted Human Readable)
This allows the output to be sorted from largest to smallest in human readable format.
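
One way to do this (a sketch assuming GNU sort, which provides the -h human-numeric option; not necessarily the original submission):

    # size of each directory under the current one, largest first
    du -sh -- */ | sort -rh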

Remove spaces from filenames - through a whole directory tree.
An example of zsh glob qualifiers.

Recursively grep for string and format output for vi(m)
This is a big time saver for me. I often grep source code and need to edit the findings. A single highlight of the mouse and middle mouse click (in gnome terminal) and I'm editing the exact line I just found. The color highlighting helps interpret the data.

Unite pdf files
pdfunite is part of poppler-utils. The poppler-utils package is only 150KB, while the alternative, pdftk, is 14MB! Install poppler-utils if you need simple PDF operations such as unite, separate, info, and text/HTML conversion.
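
A minimal usage sketch (the file names are placeholders): pdfunite takes the input PDFs first and the output file last.

    pdfunite first.pdf second.pdf merged.pdf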

