Commands by icco (2)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Upload folder to imageshack.us (forum)
Each file in the current folder is uploaded to imageshack.us. If the folder contains other file types, change $for files in * to $for files in *.jpg (to upload ONLY .jpg files). Additionally you can try (results may vary): $for files in *.jpg *.png. The output URL is encased in BB image tags for use in a forum.
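
A sketch of the loop's shape; the imageshack.us endpoint, form field and response parsing below are assumptions for illustration, not verified details of the service:

    for files in *; do
        # upload one file (URL and field name are assumed, not verified)
        url=$(curl -s -F "fileupload=@$files" http://www.imageshack.us/upload_api.php | grep -o 'http://[^"<]*' | head -n 1)
        echo "[IMG]${url}[/IMG]"   # wrap the returned link in BB image tags
    done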

Clone IDE Hard Disk
This command clones the first partition of the primary master IDE drive onto the second partition of the primary slave IDE drive (!!! back up all data before trying anything like this !!!).
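
Assuming classic Linux IDE device naming (primary master = /dev/hda, primary slave = /dev/hdb), the command presumably looks like this; the destination partition must be at least as large as the source:

    $ dd if=/dev/hda1 of=/dev/hdb2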

Dump dvd from a different machine onto this one.
You can use this to dump a DVD directly from machine A (with a DVD drive) to machine B (without one). I used this to copy a DVD to my netbook via a friend's machine. The command is issued on machine B. Advantages: 1) no time wasted dumping to machine A first and then copying to machine B; 2) no disk space used on machine A. In fact, this works even when machine A doesn't have enough hdd space to hold the DVD image. Use ssh's -C option on slow networks (it enables compression). You can replace "dd if=/dev/dvd" with any ripping command, as long as it writes the iso to stdout.
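
A sketch of the idea as described, with hypothetical hostnames, run on machine B:

    $ ssh -C user@machineA 'dd if=/dev/dvd' > dvd.iso   # -C compresses; drop it on fast LANs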

Exclude inserting a table from a sql import
Starting with a large MySQL dump file (*.sql), remove any lines that contain inserts for the specified table. Sometimes one or two tables are very large and unneeded, e.g. log tables. To exclude multiple tables you can get fancy with sed, or just run the command again on the subsequently generated file.
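
The description implies a simple line filter; a sketch with a hypothetical table name, assuming each INSERT sits on its own line in the dump:

    $ grep -v 'INSERT INTO `log_table`' dump.sql > dump-trimmed.sql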

Convert seconds to [DD:][HH:]MM:SS
Converts any number of seconds into days, hours, minutes and seconds.

    sec2dhms() {
        declare -i SS="$1"
        D=$(( SS / 86400 ))
        H=$(( SS % 86400 / 3600 ))
        M=$(( SS % 3600 / 60 ))
        S=$(( SS % 60 ))
        [ "$D" -gt 0 ] && echo -n "${D}:"
        [ "$H" -gt 0 ] && printf "%02g:" "$H"
        printf "%02g:%02g\n" "$M" "$S"
    }
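
For example:

    $ sec2dhms 100000
    1:03:46:40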

Write comments to your history.
A null operation named 'comment', allowing comments to be written to HISTFILE. Prepending '#' to a command will *not* write the command to the history file, although it will be available for the current session; '#' is therefore not useful for keeping track of comments beyond the current session.
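
One way to define such a null operation (the exact definition isn't shown, so treat this as an assumption):

    comment() { :; }        # ':' is the shell's built-in no-op
    comment this note does nothing but ends up in HISTFILE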

Get the IP address of a machine. Just the IP, no junk.
Why use many different utilities all piped together, when you only need two?
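
The two utilities aren't named here; one plausible pair, assuming a Linux box with classic ifconfig output and an eth0 interface:

    $ ifconfig eth0 | awk '/inet addr/ { sub(/addr:/, "", $2); print $2 }'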

monitor a tail -f command with multiple processes
When using named pipes, only one reader gets the output by default. Also, grep buffers its output when writing into a pipe, holding everything back until tail -f finishes, which is not convenient. Here, a combination of tee, sub-processes (process substitution) and grep's --line-buffered switch works around the problem.
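
A sketch of that combination, with a hypothetical log file and patterns; each grep runs as a sub-process fed by tee:

    tail -f /var/log/syslog | tee \
        >(grep --line-buffered 'error' > errors.out) \
        >(grep --line-buffered 'warning' > warnings.out) \
        > /dev/null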

Get the dir listing of an executable without knowing its location
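No command or description is shown for this entry; the usual idiom combines which with ls, for example (firefox is just a stand-in command name):

    $ ls -l "$(which firefox)"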

defragment files
Thanks to flatcap for optimizing this command.

This command takes advantage of the ext4 filesystem's resistance to fragmentation. Files that were previously fragmented are copied / deleted / pasted, essentially giving the filesystem another chance to save each file contiguously (unlike FAT / NTFS, *nix filesystems always try to save a file without fragmenting it). My command only affects the home directory, and only those files you have R/W (read / write) permissions on.

There are two issues with this command:
1. It really won't help much: it works, but Linux doesn't suffer much (if any) fragmentation, and even fragmented files have fast I/O.
2. It doesn't discriminate between fragmented and non-fragmented files, so a large ~/ directory with no fragments will take almost as long as an equally sized fragmented one.

The benefits I managed to work into the command:
1. It only defragments files under 16 MB, because a large file with fragments isn't as noticeable as a small fragmented file, and copy / delete / paste of large files would take too long.
2. It gives a nice countdown in the terminal, so you know how much progress is being made, and just like other defragmenters you can stop at any time (use Ctrl+C).
3. It's fast! I can defrag my ~/ directory in 11 seconds, thanks to the ramdrive powering the command's temporary storage.

Bottom line:
1. It's only an experiment: safe (I've used it several times for testing), but probably not very effective unless you somehow have a fragmentation problem on Linux. It might be a placebo for recent Windows converts looking for a defrag utility on Linux who won't accept no for an answer.
2. It's my first commandlinefu command.
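
The command itself isn't reproduced on this page, so here is a minimal sketch of the copy / delete / paste idea under the stated constraints (regular files under 16 MB with write permission, /dev/shm as the ramdrive scratch space); the countdown is omitted. This is an illustration of the technique, not the author's exact command, and since it rewrites files in place you should back up first:

    # walk ~/ for writable regular files under 16 MB (GNU find options)
    find ~ -xdev -type f -size -16M -writable -print0 |
    while IFS= read -r -d '' f; do
        tmp=/dev/shm/defrag.$$             # scratch copy lives on the ramdrive
        cp --preserve=all "$f" "$tmp" &&   # copy out,
        rm "$f" &&                         # delete the original,
        cp --preserve=all "$tmp" "$f"      # paste back, freshly allocated
    done
    rm -f /dev/shm/defrag.$$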


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes; that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).
