Commands by epistemenical (1)

  • Written on OS X after `brew install unrar coreutils`; presumably works on other Unices with minimal modifications. Didn't test rars that actually have paths in them, just "flat" files. Won't include files in the rar starting with a dot (a variant that does is sketched below).


    function rar2zip { rar="$(grealpath "$1")"; zip="$(grealpath "${2:-$(basename "$rar" .rar).zip}")"; d=$(mktemp -d /tmp/rar2zip.XXXXXX); cd "$d"; unrar x "$rar"; zip -r "$zip" *; cd -; rm -r "$d"; }
    epistemenical · 2014-05-28 07:51:17
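A hedged sketch of a variant that also picks up dot-files: zipping the extraction directory as `.` instead of globbing `*` includes hidden entries. Same grealpath/unrar/zip assumptions as the original; the subshell replaces the cd/cd - dance.

    function rar2zip {
        rar="$(grealpath "$1")"
        zip="$(grealpath "${2:-$(basename "$rar" .rar).zip}")"
        d=$(mktemp -d /tmp/rar2zip.XXXXXX) || return 1
        # subshell: no need to cd back afterwards
        ( cd "$d" && unrar x "$rar" && zip -r "$zip" . )
        rm -r "$d"
    }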

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

autossh + ssh + screen = super rad perma-sessions
Only useful for really flaky connections (but I'm stuck with one for now). If you're in this situation, I've found this to be a good way to run autossh: it does a pretty good job of detecting when the session is down and restarting it. Combined with -t and a screen command, this pops you back into your working session lickety-split, with as few headaches as possible. And if autossh is a bit slow at detecting the downed ssh connection, just run this in another tab/terminal window to notify autossh that it should drop it and start over (basically for when polling is too slow): kill -SIGUSR1 `pgrep autossh`
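A hedged sketch of the kind of invocation described (user, host, and session name are placeholders; -M 0 disables autossh's monitoring port and relies on ssh's own keep-alives):

    $ autossh -M 0 -t user@host 'screen -DR mysession'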

shell function to underline a given string.
underline() will print $1, followed by a series of '=' characters the width of $1. An optional second argument can be used to replace '=' with a given character. This function is useful for breaking lots of data emitted in a for loop into sections which are easier to parse visually. Let's say that 'xxxx' is a very common pattern occurring in a group of CSV files. You could run $ grep xxxx *.csv but this would print the name of each CSV file before each matching line, and the output would be hard to parse visually. $ for i in *.csv; do printf "\n"; underline $i; grep "xxxx" $i; done will break the output into sections separated by each file's name, underlined.
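The function body isn't shown in this teaser; a minimal sketch matching the description:

    underline() {
        local s="$1" c="${2:-=}"                   # c defaults to '='
        printf '%s\n' "$s"
        printf '%*s\n' "${#s}" '' | tr ' ' "$c"    # a line of c, same width as s
    }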

shutdown pc in 4 hours without needing to keep terminal open / user logged in.
From the 'disown' man page: disown prevents the current shell from sending a HUP signal to each of the given jobs when the current shell terminates a login session.
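The command itself isn't reproduced in this teaser; a hedged sketch of the pattern described (assumes shutdown can run without a password prompt, e.g. as root):

    $ sleep 14400 && shutdown -h now & disown    # 14400 s = 4 h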

List latest 5 modified files recursively
The output format is given by the -printf parameter: %T@ is the modification time in seconds since Jan 1, 1970, 00:00 GMT, with fractional part (mandatory, and trimmed off at the end); %TY-%Tm-%Td %TH:%TM:%.2TS is the modification time as YYYY-MM-DD HH:MM:SS (optional); %p is the file path. Refer to http://linux.die.net/man/1/find for more about -printf formatting. sort -nr sorts numerically in reverse, so higher values (the most recent timestamps) come first; head -n 5 keeps only the first 5 lines (change 5 to whatever you want); cut -f2- -d" " trims the first field (the timestamp, used only for sorting). Very useful for building scripts that detect malicious file uploads and malware injections.
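The full pipeline isn't shown in this teaser; reassembling it from the description gives something like:

    $ find . -type f -printf '%T@ %TY-%Tm-%Td %TH:%TM:%.2TS %p\n' | sort -nr | head -n 5 | cut -f2- -d" "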

Find PNG images and reduce their size, 8x in parallel, without losing quality, via optipng
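No command body appears in this teaser; a hedged sketch of the idea using xargs to run 8 optipng jobs in parallel (the -o7 optimization level is an assumption):

    $ find . -name '*.png' -print0 | xargs -0 -P 8 -n 1 optipng -o7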

Go (cd) directly into a new temp folder
This command creates a new temp directory using mktemp (to avoid collisions) and changes the current working directory to the created directory.
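A minimal sketch of the pattern (both GNU and BSD mktemp support -d):

    $ cd "$(mktemp -d)"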

Convert unix timestamp to date
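No command body is shown in this teaser; hedged sketches for the two common date implementations (the timestamp is just an example):

    $ date -d @1234567890    # GNU date (Linux)
    $ date -r 1234567890     # BSD date (OS X)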

finding cr-lf files aka dos files with ^M characters
Looking for carriage returns would also identify files with legacy Mac line endings. To fix both types: $ perl -i -pe 's/\r\n?/\n/g' $(find . -type f -exec fgrep -l $'\r' "{}" \;)

back up your commandlinefu contributed commands
Use `zless` to read the content of your *rss.gz file: $ zless commandlinefu-contribs-backup-2009-08-10-07.40.39.rss.gz

Run remote web page, but don't save the results
I have a remote PHP file that I want to run once an hour, so I set up cron to run this wget. I don't really care what's in the file and don't want to save the results, so I use -O to send the output to /dev/null.
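A hedged sketch of the crontab entry described (the URL is a placeholder):

    0 * * * * wget -q -O /dev/null http://example.com/job.php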

