All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.



Check These Out

Repeat a portrait eight times so it can be cut out from a 6"x4" photo and used for visa or passport photos
Yes, you could do it in the GIMP or even use Inkscape to auto-align the clones, but the command line is so much easier.

NOTE: The +clone and -clone options are just to shorten the command line instead of typing the same filename eight times. It might also speed up the montage by only processing the image once, but I'm not sure. "+clone" duplicates the previous image; the following two "-clone"s duplicate the first two and then the first four images.

NOTE2: The -frame option is just so that I have some lines to cut along.

BUG: I haven't bothered to calculate the exact geometry (width and height) of each image, since that was not critical for the visa photos I needed. If it matters to you, it should be easy enough to set with the -geometry flag near the end of the command. For example, if your DPI is set to 600, you could use "-geometry 800x1200!" to make each subimage 1⅓ x 2 inches. You may also want to use ImageMagick's "-density 600" option to put a flag in the JPEG file cuing the printer that it is a 600 DPI image.

BUG2: ImageMagick does not auto-rotate images based on the EXIF information. Since the portrait was taken with the camera sideways, I rotated the JPEG with jhead like so: jhead -autorot 2007-08-25-3685.jpg
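A sketch reconstructed from the notes above (the 4x2 tiling, the frame width and the output name are assumptions, not the author's exact values):

$ montage 2007-08-25-3685.jpg +clone -clone 0,1 -clone 0-3 -tile 4x2 -frame 5 -density 600 -geometry '800x1200!' visa-sheet.jpg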

Open a Remote Desktop (RDP) session with a custom resolution.
Using a widescreen monitor, I often get annoyed that the RDP window is too tall or too narrow for what I want to display. In this example, I'm on a 1680 x 1050 display.
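The one-liner itself isn't shown here; one way to do it with rdesktop (hostname and geometry are placeholders):

$ rdesktop -g 1600x900 remote-host.example.com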

find duplicate processes
This command lets you search for duplicate processes and sort them by their run count. Note that if the same process is run by different users, you'll see only one user in the result line, so you'll need to run: $ ps aux | grep <process name> to see all users running that command.
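A sketch of the idea (not necessarily the original one-liner): count each command name and list the most frequent first.

$ ps -e -o comm= | sort | uniq -c | sort -rn | head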

create thumbnail of pdf
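One way to do this, assuming ImageMagick with Ghostscript support (filenames are placeholders); [0] selects the first page.

$ convert input.pdf[0] -thumbnail x300 thumbnail.png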

Easily decode unix-time (function)
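A minimal sketch of such a function, assuming GNU date (the function name is made up; on BSD/macOS use date -r "$1" instead):

$ utime() { date -d "@$1"; }
$ utime 1234567890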

Allow any local (non-network) connection to running X server
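One way to do this with xhost (access control stays in effect for network connections):

$ xhost +local: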

Export MS Access mdb files to csv
-H suppresses headers, -I produces INSERT statements instead of CSV, and -R ';' makes ';' the row delimiter. You can probably concatenate each line with a ';' while importing into the database.
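A sketch using mdbtools that dumps every table to its own CSV file (the database name is a placeholder; the -H/-I/-R flags described above are for the INSERT-statement variant):

$ for t in $(mdb-tables -1 database.mdb); do mdb-export database.mdb "$t" > "$t.csv"; done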

copy timestamps of files from one location to another - useful when file contents are already synced but timestamps are wrong.
Sometimes when copying files from one place to another, the timestamps get lost. Maybe you forgot to add a flag to preserve timestamps in your copy command. You're sure the files are exactly the same in both locations, but the timestamps of the files in the new home are wrong and you need them to match the source.

Using this command, you will get a shell script (/tmp/retime.sh) that you can move to the new location and just execute - it will change the timestamps on all the files and directories to their previous values. Make sure you're in the right directory when you launch it, otherwise all the touch commands will create new zero-length files with those names. Since find's output includes ".", it will also change the timestamp of the current directory.

Ideally rsync would be the way to handle this - since it only sends changes by default, there would be relatively little network traffic. But rsync has to read the entire file contents on both sides to be sure no bytes have changed, potentially causing a huge amount of local disk I/O on each side. This could be a problem if your files are large. My approach avoids all the comparison I/O. I've seen comments that rsync with the "--size-only" and "--times" options should do this too, but it didn't seem to do what I wanted in my test.

With my approach you can review/edit the output commands before running them, so you know exactly what will happen. The "tee" command both displays the output on the screen for your review AND saves it to the file /tmp/retime.sh.

Credit: got this idea from Stone's answer at http://serverfault.com/questions/344731/rsync-copying-over-timestamps-only?rq=1 and combined it into one line.
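A sketch of the idea, assuming GNU find/stat/touch (not necessarily the original one-liner): print a touch command per file and tee it into the script.

$ find . | while read -r f; do echo "touch -d \"$(stat -c %y "$f")\" \"$f\""; done | tee /tmp/retime.sh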

Get AWS temporary credentials ready to export based on a MFA virtual appliance
You might want to secure your AWS operations by requiring an MFA token. But then, to use the API or CLI tools, you need to pass credentials generated with that MFA token. This command asks you for the MFA code and retrieves the temporary credentials using the AWS CLI. To print the exports, you can use: `awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'` You must adapt the command line to include:
* $MFA_ID - the ARN of the virtual MFA device, or the serial number of the physical one
* the TTL for the credentials
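A sketch reconstructed from the description above (the 43200-second TTL is an assumption; $MFA_ID must be set as noted):

$ read -p "MFA code: " CODE; aws sts get-session-token --serial-number "$MFA_ID" --token-code "$CODE" \
    --duration-seconds 43200 --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' --output text \
    | awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'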

Change user, assume environment, stay in current dir
I've used this a number of times when troubleshooting user permissions. Instead of just 'su - user', you can throw in another hyphen and stay in the original directory.
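The original one-liner isn't shown here; as an assumed rough equivalent, this loads the target user's login environment and then drops you back into the current directory (the username is a placeholder):

$ su - someuser -c "cd '$PWD' && exec bash"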


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …):
