http://public-dns.info gives a list of online DNS servers. You need to replace the country code in the URL (br in this URL) with your own country code. This command needs some time to ping every IP in the list. Show Sample Output
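A minimal sketch of the idea (it assumes the site publishes per-country plain-text lists at /nameserver/<cc>.txt; check the site for the current URL scheme):
# fetch the Brazilian list and report which servers answer a single ping
curl -s http://public-dns.info/nameserver/br.txt | while read -r ip; do
    ping -c 1 -W 1 "$ip" > /dev/null 2>&1 && echo "$ip is up"
done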
Random text of length "$1" without the useless cat command.
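One common formulation (an assumption, since the original command isn't shown here):
# read random bytes, keep only alphanumerics, stop after $1 characters
tr -dc 'A-Za-z0-9' < /dev/urandom | head -c "$1"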
A little bit smarter, and I only want .de domains, so..... Show Sample Output
The platform-agnostic version of https://www.commandlinefu.com/commands/view/25276/compute-newest-kernel-version-from-makefile-on-torvalds-git-repository, because macOS doesn't ship with wget.
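A hedged sketch of the approach with curl (the VERSION, PATCHLEVEL, and SUBLEVEL fields sit at the top of the kernel Makefile; the awk reassembly is an assumption about how the original did it):
curl -s https://raw.githubusercontent.com/torvalds/linux/master/Makefile |
awk '/^(VERSION|PATCHLEVEL|SUBLEVEL) =/ { v[$1] = $3 }
     END { print v["VERSION"] "." v["PATCHLEVEL"] "." v["SUBLEVEL"] }'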
Run it as a shell script:
#!/bin/sh
# loop forever (1 never equals 6)
while [ 1 -ne 6 ]; do
    pid=`ps -ef | grep -v "grep" | grep "trans_gzdy" | cut -c10-17`
    ps gv $pid | head -2
    sleep 1
done
Watch for changes in RSS (resident set size). Show Sample Output
Coming back to a project directory after some time elsewhere? Need to know what the most recently modified files are? This little function "t" is one of my most frequent commands. I have a tcsh alias for it as well: alias t 'ls -ltch \!* | head -20' Show Sample Output
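A plausible bash equivalent of that alias (an assumption; the original function definition isn't shown):
# newest 20 entries first, human-readable sizes, sorted by change time
t() { ls -ltch "$@" | head -20; }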
This little command (function) prints a CSV file's header fields (the comma-separated field names on the first line) as a numbered list, making each field and its position easy to see. Show Sample Output
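A common way to do this (a sketch; the original function isn't shown, and the file name is taken as "$1"):
# first line only, one field per line, numbered
csvheader() { head -1 "$1" | tr ',' '\n' | nl; }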
Strangely enough, tail has no --lines=[negative] option like head's, so we have to use sed, which is very short and clear, as you can see. Stranger still, skipping lines at the bottom with sed is neither short nor clear. From the sed one-liners collection:
# delete the last 10 lines of a file
$ sed -e :a -e '$d;N;2,10ba' -e 'P;D'    # method 1
$ sed -n -e :a -e '1,10!{P;N;D;};N;ba'   # method 2
Show Sample Output
List files and sizes
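For instance, a minimal version of the idea (an assumption; the original command isn't shown):
# size of every entry in the current directory, smallest to largest
du -sh * | sort -h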
This is an extension of a previous command by satyavvd from 2009-07-23 12:04:02, but this one grabs the whole archive. Hard-coded numbers in the previous script capped the number of commands that could be fetched; this one grabs them all, regardless of how big the archive gets. Show Sample Output
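A hedged sketch of the pagination loop (the plaintext browse URL and the 25-per-page step reflect commandlinefu's API at the time; verify before relying on it):
i=0
while chunk=$(curl -sf "http://www.commandlinefu.com/commands/browse/sort-by-votes/plaintext/$i") && [ -n "$chunk" ]; do
    echo "$chunk"
    i=$((i + 25))   # next page of results
    sleep 1         # be polite to the server
done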
Makes use of the shell's $RANDOM variable.
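For example, a random integer between 0 and 99 in bash:
echo $(( RANDOM % 100 ))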
Similar, but using mediainfo instead of totem-something.
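For example, printing a file's duration with mediainfo (the file name is a placeholder):
mediainfo --Inform="General;%Duration/String%" movie.mkv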
** Replace the ... in the URLs with: www.census.gov/genealogy/www/data/1990surnames (the full URL couldn't fit in the 256-character limit). Created on Ubuntu 9.10, but there's nothing out of the ordinary; it should work anywhere with a little tweaking. 5163 is the number of unique first names you get when you combine the male and female first-name files from http://www.census.gov/genealogy/www/data/1990surnames/names_files.html Show Sample Output
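A hedged sketch of combining those lists (dist.male.first and dist.female.first are the file names that census page used; the random-name command itself is an assumption):
# names are the first field of each line; dedupe, then pick one at random
cat dist.male.first dist.female.first | awk '{print $1}' | sort -u | shuf -n 1
Counting instead of sampling (sort -u | wc -l) should reproduce the 5163 figure.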
first off, if you just want a random UUID, here's the actual command to use:
uuidgen
Your chances of finding a duplicate after running this nonstop for a year are about the same as being hit by a meteorite before finishing this sentence.
The reason for my command is that it's more provably unique than the one uuidgen creates. uuidgen creates a random one by default, or an unencrypted one based on time and network address if you give it the -t option.
Mine uses the MAC address of the Ethernet interface, the process ID of the caller, and the system time down to nanosecond resolution, which is provably unique over all computers past, present, and future, subject to collisions in the cryptographic hash used and the uniqueness of your MAC address.
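A hypothetical reconstruction of that recipe (the interface name, the hash choice, and the text rather than binary encoding are all assumptions; the original command isn't shown):
# MAC address + caller PID + nanosecond timestamp, hashed
{ cat /sys/class/net/eth0/address; echo $$; date +%s%N; } | sha256sum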
Warning: feel free to experiment, but be warned that the stdin of the hash is binary data at that point, which may mess up your terminal if you don't pipe it into something. If it does mess up though, just type
reset
Show Sample Output
Ever wanted to stream your favorite podcast across the network? Well, now you can. This command will parse an iTunes-enabled podcast feed and stream the latest episode across the network through an encrypted SSH connection. Show Sample Output
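A hedged sketch of the idea (the feed URL, remote host, and player are placeholders; the original command isn't shown):
# grab the newest enclosure URL from the feed, fetch it on the remote host,
# and pipe the audio back over SSH into a local player
url=$(curl -s http://example.com/podcast.rss | grep -o 'url="[^"]*\.mp3"' | head -1 | cut -d'"' -f2)
ssh user@remotehost "curl -s '$url'" | mpg123 -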
Gets the latest episode from your favorite podcast. Uses curl and xmlstarlet. Make sure you change out the items between brackets.
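For example, extracting the newest enclosure URL from a feed with xmlstarlet (the feed URL is a placeholder):
curl -s http://example.com/podcast.rss | xmlstarlet sel -t -v '//item[1]/enclosure/@url'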
Simple way of having random mrxvt backgrounds. Add this to your .bashrc and change the path names for the pictures.
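One way this might look (the picture directory is a placeholder, and it assumes your mrxvt build accepts -pixmap):
# pick a random background each time mrxvt starts
alias mrxvt='command mrxvt -pixmap "$(find ~/Pictures/backgrounds -type f | shuf -n 1)"'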
commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):