Commands by jrdbz (2)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Share Your Commands


Check These Out

Extracts PDF pages as images
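The submitted one-liner isn't reproduced here; a common way to do this is with pdftoppm from poppler-utils (an assumed substitute, not necessarily the original command):

pdftoppm -png -r 150 input.pdf page   # renders each page of input.pdf as a 150 dpi PNG named page-1.png, page-2.png, ...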

Remove ( color / special / escape / ANSI ) codes, from text, with sed
Credit to the original folks whose command I copied. The difference is in the final character class of the sed expression. Theirs: [m|K], which is meant to remove \E[NUMBERS;NUMBERS followed by m or K. That is incorrect in two ways: 1. m and K are only two of the 20+ letters that can terminate these escape sequences. 2. Inside [ ]'s, OR is already implied, so their class also matches sequences ending with a literal |, which is not intended. This version: [a-zA-Z], which resolves the OR issue and covers every sequence, since they all end with a lower- or upper-case letter. This ensures 100% of any escape-code 'mess' is removed.
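A minimal sketch of the described approach, using GNU sed (the exact submitted one-liner is not reproduced above):

# Strip escape sequences, accepting any terminating letter rather than just m or K (\x1b is a GNU sed escape)
sed 's/\x1b\[[0-9;]*[a-zA-Z]//g' colored.log > plain.log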

Network Proxy to dump the application level forward traffic in plain text in the console and in a file.
If you have a client that connects to a server via a plain-text protocol such as HTTP or FTP, this command lets you monitor the messages the client sends to the server. The application-level text stream is dumped on the command line and also saved in a file called proxy.txt. Change 8080 to the local port you want your client to connect to, 192.168.0.1 to the IP address of the destination server, and 80 to the destination server's port. Then simply point your client at localhost 8080 (or whatever you changed it to); the traffic will be forwarded to host 192.168.0.1 on port 80 (or whatever you changed them to). Any requests from the client to the server are dumped on the console as well as in the file "proxy.txt". Unfortunately, the responses from the server are not dumped.
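A rough sketch of such a proxy built from a named pipe and netcat (an assumption about the implementation; netcat flags vary between versions, e.g. some builds want 'nc -l 8080' without -p):

mkfifo backpipe
# client -> server traffic passes through tee (printed and appended to proxy.txt);
# server -> client responses flow back through the fifo without being logged
nc -l -p 8080 < backpipe | tee -a proxy.txt | nc 192.168.0.1 80 > backpipe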

Prevent replacing an existing file by mistake
The noclobber shell option stops > redirection from overwriting an existing file. Use set +o noclobber and you will be able to replace files again.
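For context, a short illustration of the option in bash:

set -o noclobber             # refuse to overwrite existing files with >
echo hi > existing.txt       # now fails with "cannot overwrite existing file"
echo hi >| existing.txt      # >| overrides noclobber for a single redirection
set +o noclobber             # restore the default behaviour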

Get all URLs from webpage via Regular Expression
Get all URLs from a website via a regular expression. You must have lynx installed on your computer to execute the command: lynx --dump "" | egrep -o "". Substitute the first empty string with the path of the website you want to extract URLs from, and the second with the regular expression you want to filter them by.
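For example, with a placeholder target site and a deliberately simple URL pattern (both are assumptions, not the author's values):

lynx --dump "http://example.com" | egrep -o "https?://[^ ]+"   # prints every http(s) link lynx finds on the page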

Calculate days on which Friday the 13th occurs (inspired from the work of the user justsomeguy)
Friday is the 5th day of the week and Monday is the 1st. Output may be affected by locale.
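One possible approach with GNU date and seq (a sketch, not necessarily the submitted command), relying on %u where Monday is 1 and Friday is 5:

for m in $(seq -w 1 12); do
  d="2025-$m-13"                                  # the 13th of each month in 2025
  [ "$(date -d "$d" +%u)" = 5 ] && echo "$d"      # print it when it falls on a Friday
done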

Summarize Apache Extended server-status to show longest running requests
Ever need to know why Apache is bogging down *right now*? Hate scanning Apache's Extended server-status for the longest running requests? Me too. That's why I use this one-liner to quickly find suspect web scripts that might need review. Assuming the Extended server-status is reachable at the target URL, this one-liner parses the output through elinks (rendering the HTML) and shows a list of active requests, sorted with the longest-running request at the bottom. I include the following fields (as noted in the header line):
Seconds: how long the request has been alive
PID: process ID of the request handler
State: state of the request, limited to what I think are the relevant ones (GCRK_.)
IP: remote host IP making the request
Domain: virtual host target (HTTP/1.1 Host: header), important on virtual-hosting servers
TYPE: HTTP verb
URL: requested URL being served
Putting this in a script that runs when triggered by high load average can be quite revealing. It can also capture "forgotten" scripts being exploited, such as "formmail.pl".
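A rough sketch of the idea (this is not the submitted one-liner, and the awk column numbers are assumptions that differ between Apache versions): render the status page with elinks, keep rows whose worker state is one of GCRK_., and sort numerically by the seconds column.

# assumed columns: $2=PID, $4=state, $6=SS (seconds), $12=client IP, $13=vhost, $14=verb, $15=URL
elinks -dump "http://localhost/server-status" \
  | awk '$4 ~ /^[GCRK_.]$/ { print $6, $2, $4, $12, $13, $14, $15 }' \
  | sort -n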

List files with full path
This version is a bit more portable, although it isn't as easily extended with '-type f' etc. On AIX the find command doesn't have -maxdepth or an equivalent.
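A hedged guess at a portable variant (the submitted command itself is not shown here):

ls -d "$PWD"/*   # prints every entry in the current directory prefixed with its absolute path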

Go (cd) directly into a new temp folder
This command creates a new temporary directory with mktemp (to avoid name collisions) and changes the current working directory to the newly created directory.
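A minimal sketch of the idea (the submitted one-liner itself is not shown above):

cd "$(mktemp -d)"   # mktemp -d creates a unique temporary directory and prints its path; cd moves into it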

Which processes are listening on a specific port (e.g. port 80)
swap out "80" for your port of interest. Can use port number or named ports e.g. "http"


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
