Query Wikipedia by issuing a DNS query for a TXT record. The TXT record will also include a short URL to the complete corresponding Wikipedia entry. You can also write a little shell script like:
$ cat wikisole.sh
#!/bin/sh
dig +short txt ${1}.wp.dg.cx
and run it like
./wikisole.sh unix
where your first argument ($1) will be used as the search term.
Instead of opening your browser and googling "whatismyip"... Also useful in scripts. dig can be found in the dnsutils package.
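The command behind this entry is not quoted above; it is most likely the OpenDNS trick, where asking one of their resolvers for myip.opendns.com returns your own public address:
dig +short myip.opendns.com @resolver1.opendns.com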
It's somewhat common for ISPs to intercept DNS queries on port 53 and resolve them with their own servers. To check whether your ISP is intercepting your DNS queries, just type this command in the terminal. An answer like "#.abc" is OK, but if you get something like "I am not an OpenDNS resolver.", then yes, you are being cheated by your ISP.
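The check in question is presumably the OpenDNS self-test: query the TXT record which.opendns.com directly against one of their resolvers:
dig txt which.opendns.com @208.67.222.222 +short
If the query really reaches an OpenDNS server you get back a short site code; if an intercepting resolver answered instead, it looks the record up normally and you see "I am not an OpenDNS resolver."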
Shorter version, works with multiple words.
Uses GNU Parallel.
The +short option should make dig less chatty.
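For comparison, a quick illustration (example.com is just a placeholder):
dig example.com          # full response with header, flags, and statistics
dig +short example.com   # prints only the answer data, one record per line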
Performs a reverse DNS lookup, variants include:
nslookup 74.125.45.100
or:
host 74.125.45.100
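The dig form (not quoted above, but presumably the entry itself) uses the -x switch, which builds the in-addr.arpa PTR query for you:
dig -x 74.125.45.100 +short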
Simple command to trace a DNS query from the root all the way to the authoritative servers.
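That is dig's +trace option; a minimal example (the domain is a placeholder):
dig +trace www.example.com
Starting from the root servers, dig follows each referral itself instead of relying on a recursive resolver, printing every delegation step down to the authoritative answer.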
Google has added 2 more netblocks...
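This is probably about the TXT records in which Google publishes its SPF netblocks; whether it matches the original entry is a guess, but the record names themselves are ones Google publishes:
dig +short txt _netblocks.google.com
dig +short txt _netblocks2.google.com
dig +short txt _netblocks3.google.com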
A function that takes a domain name as an argument.
Change the $domain variable to whichever domain you wish to query. Works with the majority of whois info; for some that won't, you may have to compromise:
domain=google.com; for a in $(whois $domain | grep "Domain servers in listed order:" --after 3 | grep -v "Domain servers in listed order:"); do echo ">>> Nameservers for $domain from $a
Note that this doesn't work as well as the first one; if they have more than 3 nameservers, it won't hit them all. As the summary states, this can be useful for making sure the whois nameservers for a domain match the nameserver records (NS records) returned by the nameservers themselves.
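A rough sketch of the complete comparison, assuming a registry whose whois output lists nameservers as "Name Server: ..." (the awk pattern and field position are assumptions and vary between registries):
domain=google.com
for ns in $(whois $domain | awk '/Name Server:/ {print tolower($NF)}' | sort -u); do
    echo ">>> NS records for $domain according to $ns <<<"
    dig @"$ns" "$domain" ns +short | sort
done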
Quick shortcut if you know the hostname and want to save yourself the extra step of looking up the IP address separately.
Invoke from the command line as:
timeDNS commandlinefu.com
This isn't terribly practical, but it is a good code example of using subshells to run the queries in parallel and of using an "anonymous function" (a.k.a. "inline group") to group I/O.
I'm assuming you have already defined your local DNS cache as ${local_DNS} (here, it's 192.168.0.1).
You do need to install `moreutils` to get `sponge`.
If you're willing to wait, a slower version without sponge (and without sorting) is this:
DNS () { for x in "192.168.0.1" "208.67.222.222" "208.67.220.220" "198.153.192.1" "198.153.194.1" "156.154.70.1" "156.154.71.1" "8.8.8.8" "8.8.4.4"; do (echo -n "$x "; dig @"$x" "$*"|grep Query) ; done ; }
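The timeDNS function itself is not quoted above; here is a rough reconstruction from the description (the resolver list, the ${local_DNS} fallback, and the sort field are assumptions):
timeDNS () {
    { for x in "${local_DNS:-192.168.0.1}" "208.67.222.222" "208.67.220.220" "8.8.8.8" "8.8.4.4"; do
          ( echo -n "$x "; dig @"$x" "$*" | grep "Query time" ) &   # one backgrounded subshell per resolver
      done
      wait
    } | sponge | sort -n -k5   # sponge (moreutils) soaks up every line before sorting by the msec field
}
The outer { ...; } is the "inline group" the description mentions: it lets the combined output of all the parallel subshells be piped through sponge and sorted as a single stream.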
Mostly for Norwegians, but easily adaptable to others. Very handy if you are brainstorming for a new domain name; it will only display the available ones. You can usually do this better with dig, but if you don't have dig, or the TLD only has an online service to check with, this will be useful.
Works for multiple hosts (such as www.google.com) and/or hosts that don't resolve.
This removes the enclosing quotation marks (") and stitches the different packets together.
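A sketch of the idea (the record actually queried isn't shown here, so example.com stands in): TXT data longer than 255 bytes comes back as several quoted strings, and stripping the quotes while joining the pieces looks something like:
dig +short txt example.com | sed 's/" "//g; s/"//g'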
Short and concise output, appropriate for scripts.