Commands by ludwig (1)

  • Another step to bring CLI and GUI closer together: gnome-open. It opens a path with the default (GUI) application for its MIME type. I would recommend a shorter alias, like alias o=gnome-open. More examples:

    gnome-open .                             [opens the current folder in nautilus / your default file browser]
    gnome-open some.pdf                      [opens some.pdf in evince / your default PDF viewer]
    gnome-open trash://                      [opens the trash with nautilus]
    gnome-open http://www.commandlinefu.com  [opens commandlinefu in your default web browser]


    gnome-open [path]
    ludwig · 2010-06-30 07:20:05

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.



Check These Out

List the files any process is using
List the files a process is using.
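
The command itself isn't preserved in this listing; the usual tool for the job is lsof, e.g. for process ID 1234 (the PID is a placeholder):

$ lsof -p 1234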

Short one line while loop that outputs parameterized content from one file to another
The above is an example of grabbing only the first column. You can define the start and end points specifically by character position using the following command:

$ while read l; do echo ${l:10:40}; done < three-column-list.txt > column-c10-c40.txt

Of course, it doesn't have to be a column, or extraction; it can be replacement:

$ while read l; do echo ${l/foo/bar}; done < list-with-foo.txt > list-with-bar.txt

Read more about parameter expansion here: http://wiki.bash-hackers.org/syntax/pe

Think of this as an alternative to awk or sed for file operations.
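
The "above" command this description refers to did not survive in this listing; a plausible reconstruction in the same parameter-expansion style (filenames are illustrative) strips everything from the first space onward, leaving just the first column:

$ while read l; do echo ${l%% *}; done < three-column-list.txt > column1.txt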

Add a progress counter to loop (see sample output)
For this hack you need the following function:

$ finit() { count=$#; current=1; for i in "$@" ; do echo $current $count; echo $i; current=$((current + 1)); done; }

and alias:

$ alias fnext='read cur total && echo -n "[$cur/$total] " && read'

Inspired by CMake progress counters.
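
The entry doesn't show the two pieces in action; one way to combine them (filenames are illustrative) is to pipe finit's output into a loop that calls fnext each iteration. The first read in fnext consumes the counter pair and prints it, and the bare read leaves the item itself in $REPLY for the loop body, so this should print something like:

$ finit a.txt b.txt c.txt | while fnext; do echo "compressing $REPLY"; done
[1/3] compressing a.txt
[2/3] compressing b.txt
[3/3] compressing c.txt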

Copy data using gtar
It copies the entire current working directory to the destination directory with compression enabled.
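
The command itself is not preserved here; a classic tar-pipe matching the description, assuming /destination/dir is the target (the z flag enables gzip compression for the stream):

$ gtar czf - . | (cd /destination/dir && gtar xzf -)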

Parse compressed apache error log file and show top errors
Credit for the non-gzipped version goes to: https://gist.github.com/marcanuy/a08d5f2d9c19ba621399
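
The command is missing from this listing; one plausible equivalent, assuming a gzipped Apache 2.2-style error log where the message follows the third bracketed field (the filename and field split are assumptions):

$ zcat error.log.gz | cut -d']' -f4- | sort | uniq -c | sort -rn | head -n 10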

Repeat a portrait eight times so it can be cut out from a 6"x4" photo and used for visa or passport photos
Yes, you could do it in the GIMP or even use Inkscape to auto-align the clones, but the command line is so much easier.

NOTE: The +clone and -clone options are just to shorten the command line instead of typing the same filename eight times. It might also speed up the montage by only processing the image once, but I'm not sure. "+clone" duplicates the previous image; the following two "-clone"s duplicate the first two and then the first four images.

NOTE2: The -frame option is just so that I have some lines to cut along.

BUG: I haven't bothered to calculate the exact geometry (width and height) of each image, since that was not critical for the visa photos I need. If it matters for you, it should be easy enough to set using the -geometry flag near the end of the command. For example, if you have your DPI set to 600, you could use "-geometry 800x1200!" to make each subimage 1⅓ x 2 inches. You may want to use ImageMagick's "-density 600" option to put a flag in the JPEG file cuing the printer that it is a 600 DPI image.

BUG2: ImageMagick does not autorotate images based on the EXIF information. Since the portrait photo was taken with the camera sideways, I rotated the JPEG using jhead like so:

$ jhead -autorot 2007-08-25-3685.jpg
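
The montage invocation itself is missing from this listing; a reconstruction from the options the notes mention (+clone/-clone to get eight copies, -frame for cut lines; the tile layout, frame width and output name are assumptions):

$ montage 2007-08-25-3685.jpg +clone -clone 0-1 -clone 0-3 -frame 5 -tile 4x2 -geometry +2+2 visa-sheet.jpg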

Lists all usernames in alphabetical order
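
No command or description survived for this entry; the usual one-liner, assuming local accounts in /etc/passwd:

$ cut -d: -f1 /etc/passwd | sort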

Rename all files in lower case
rename is a really powerful tool to, as its name suggests, rename files.
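
As an illustration (the entry's original command is not preserved), lower-casing with the Perl flavour of rename, which accepts a tr-style expression; note that the util-linux rename uses a different syntax:

$ rename 'y/A-Z/a-z/' *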

Command to logout all the users in one command
It logs out all other users except "root".
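
The command is not preserved here; one way to get the described effect, assuming it is run as root (pkill -KILL -u terminates every process owned by a user, ending their sessions):

$ for u in $(who | awk '{print $1}' | sort -u | grep -v '^root$'); do pkill -KILL -u "$u"; done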

See how many more processes are allowed, awesome!
There is a limit to how many processes each user can run at the same time, especially with web hosts. If the maximum number of processes for your user is 200, then the following sets OPTIMUM_P to 100:

$ OPTIMUM_P=$(( (`ulimit -u` - `find /proc -maxdepth 1 \( -user $USER -o -group $GROUPNAME \) -type d|wc -l`) / 2 ))

This is very useful in scripts because it is a fast, low-resource way (compared to ps, who, lsof, etc.) to determine how many processes are currently running for whichever user. The number of currently running processes is subtracted from the high limit set up for the account (see limits.conf, pam, initscript).

An easy-to-understand example: this searches the current directory for shell scripts and runs up to 100 'file' commands at the same time, greatly speeding up the command:

$ find . -type f | xargs -P $OPTIMUM_P -iFNAME file FNAME | sed -n '/shell script text/p'

I am using it in my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html, especially for the xargs command. xargs has a -P option that lets you specify how many processes to run at the same time. For instance, if you have 1000 URLs in a text file and wanted to download all of them fast with curl, you could download 100 at a time (check ps output on a separate [pt]ty for proof) like this:

$ cat url-list.txt | xargs -I '{}' -P $OPTIMUM_P curl -O '{}'

I like to do things as fast as possible on my servers. I have several types of servers and hosting environments, some with very restrictive jail shells with a 20-process limit, some with 200, some with 8000, so for the jailed shells my xargs -P10 would kill my shell or dump core. Using the above, I can set the -P value dynamically so xargs always works.

If you were building a process killer (very common for cheap hosting) this would also be handy. Note that if you are only allowed 20 or so processes, you should just use -P1 with xargs.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
