
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively; that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):


News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:

  • The open beta is running a copy of the database that will not carry over to the final version. Don't post anything you don't mind losing.
  • If you wish to use your user account, you will probably need to reset your password.
Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Commands using mv
Terminal - Commands using mv - 186 results
for x in *.dat;do sort -k 3 $x >tmp && mv -f tmp $x;done
2010-07-07 07:57:37
User: rajarshi
Functions: mv sort
Tags: sorting
-2

We normally get tasks in which one has to sort a data file according to some column. For a single file, say foo, we would use

sort -k 3 foo >tmp && mv tmp foo

The for loop is useful when we have to do it on a number of files.

find ~ -maxdepth 20 -type f -size -16M -print > t; for ((i=$(wc -l < t); i>0; i--)) do a=$(sed -n ${i}p < t); mv "$a" /dev/shm/d; mv /dev/shm/d "$a"; echo $i; done; echo DONE; rm t
2010-07-07 04:29:22
User: LinuxMan
Functions: echo find mv rm sed wc
2

Thanks to flatcap for optimizing this command.

This command takes advantage of the ext4 filesystem's resistance to fragmentation.

By using this command, files that were previously fragmented will be copied / deleted / pasted, essentially giving the filesystem another chance at saving the file contiguously. (Unlike FAT/NTFS, *nix filesystems always try to save a file without fragmenting it.)

My command only affects the home directory, and only those files you have read/write permissions for.

There are two issues with this command:

1. It really won't help: it works, but Linux doesn't suffer much (if any) fragmentation, and even fragmented files have fast I/O.

2. It doesn't discriminate between fragmented and non-fragmented files, so a large ~/ directory with no fragments will take almost as long as an equally sized fragmented ~/ directory.

The benefits I managed to work into the command:

1. It only defragments files under 16 MB, because a large file with fragments isn't as noticeable as a small file that's fragmented, and copy/delete/paste of large files would take too long.

2. It gives a nice countdown in the terminal so you know how much progress is being made, and just like other defragmenters you can stop at any time (use Ctrl+C).

3. Fast! I can defrag my ~/ directory in 11 seconds, thanks to the ramdrive powering the command's temporary storage.

Bottom line:

1. It's only an experiment: safe (I've used it several times for testing), but probably not very effective (unless you somehow have a fragmentation problem on Linux). It might be a placebo for recent Windows converts looking for a defrag utility on Linux who won't accept no for an answer.

2. It's my first commandlinefu command.

HTMLTEXT=$( curl -s http://www.page.de/test.html > /tmp/new.html ; diff /tmp/new.html /tmp/old.html ); if [ "x$HTMLTEXT" != x ] ; then echo $HTMLTEXT | mail -s "Page has changed." [email protected] ; fi ; mv /tmp/new.html /tmp/old.html
2010-07-04 21:45:37
User: Emzy
Functions: diff echo mail mv
2

Checks if a web page has changed. Put it into cron to check periodically.

Change http://www.page.de/test.html and [email protected] to suit your needs.
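To run it periodically, the one-liner can be dropped into a crontab. A minimal sketch, assuming the command has been saved as a script at /usr/local/bin/check-page.sh (the path and the 30-minute interval are arbitrary choices, not part of the original):

```shell
# Edit your crontab with: crontab -e
# m    h  dom mon dow   command
*/30 * * * * /usr/local/bin/check-page.sh
```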

sudo find . -maxdepth 1 -cnewer olderFilesNameToMove -and ! -cnewer newerFileNameToMove -exec mv -v {} /newDirectory/ \;
2010-06-30 20:40:30
User: javamaniac
Functions: find mv sudo
2

In a folder with many files and folders, this moves all files whose change time is newer than that of olderFilesNameToMove and not newer than that of newerFileNameToMove.

for n in * ; do mv $n `echo $n | tr '[:lower:]' '[:upper:]'`; done
2010-06-25 19:20:04
User: max_allan
Functions: mv tr
1

Simple bash/ksh/sh command to rename all files from lower to upper case. For other transformations, change the tr command to sed or awk, and/or change mv to cp.

for i in somefiles*.png ; do echo "$i" ; N=$(stat -c %Y $i); mv -i $i $N.png; done
2010-06-01 19:28:05
User: sufoo
Functions: echo mv stat
0

This renames a pattern-matched batch of files by their last-modified time.

Variants: rename by timestamp, by time created, by time modified.
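The variants map onto different stat(1) format sequences. A sketch using GNU coreutils stat; note that %W (birth time) needs filesystem and kernel support, and prints 0 when it is unavailable:

```shell
# GNU stat timestamp formats:
#   %Y  last modification (what the original command uses)
#   %Z  last status change
#   %W  creation/birth time (prints 0 if the filesystem cannot report it)
for i in somefiles*.png; do
    N=$(stat -c %Z "$i")     # swap %Z for %Y or %W as needed
    mv -i "$i" "$N.png"
done
```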

lynx -dump -listonly 'url' | grep -oe 'http://.*\.ogg' > 11 ; vlc 11 ; mv 11 /dev/null
shopt -s extglob; for f in *.ttf *.TTF; do g=$(showttf "$f" 2>/dev/null | grep -A1 "language=0.*FullName" | tail -1 | rev | cut -f1 | rev); g=${g##+( )}; mv -i "$f" "$g".ttf; done
2

Just a quick hack to give reasonable filenames to TrueType and OpenType fonts.

I'd accumulated a big bunch of bizarrely and inconsistently named font files in my ~/.fonts directory. I wanted to copy some, but not all, of them over to my new machine, but I had no idea what many of them were. This script renames .ttf files based on the name embedded inside the font. It will also work for .otf files, but make sure you change the mv part so it gives them the proper extension.

REQUIREMENTS: Bash (for extended pattern globbing), showttf (Debian has it in the fontforge-extras package), GNU grep (for context), and rev (because it's hilarious).

BUGS: Well, like I said, this is a quick hack. It grew piece by piece on the command line. I only needed to do this once and spent hardly any time on it, so it's a bit goofy. For example, I find 'rev | cut -f1 | rev' pleasantly amusing --- it seems so clearly wrong, and yet it works to print the last argument. I think flexibility in expressiveness like this is part of the beauty of Unix shell scripting. One-off tasks can be be written quickly, built-up as a person is "thinking aloud" at the command line. That's why Unix is such a huge boost to productivity: it allows each person to think their own way instead of enforcing some "right way".
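The 'rev | cut -f1 | rev' idiom can be demonstrated on its own; the sample line below is invented, not real showttf output:

```shell
# rev reverses each line character by character, cut -f1 keeps the first
# tab-separated field, and the second rev restores the order: the net
# effect is to print the LAST tab-separated field of each line.
printf 'Family\tSubfamily\tDejaVu Sans\n' | rev | cut -f1 | rev
# → DejaVu Sans
```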

On a tangent: One of the things I wish commandlinefu would show is the command line HISTORY of the person as they developed the script. I think it's that conversation between programmer and computer, as the pipeline is built piece-by-piece, that is the more valuable lesson than any canned script.

mv ubuntu-10.04-rc-desktop-amd64.iso ubuntu-10.04-desktop-amd64.iso; i=http://releases.ubuntu.com/10.04/ubuntu-10.04-desktop-amd64.iso.zsync; while true; do if wget $i; then zsync $i; date; break; else sleep 30; fi; done
2010-04-29 15:49:43
Functions: mv sleep wget
4

You need to have the RC ISO pre-downloaded before running the command.

inplace() { eval F=\"\$$#\"; "$@" > "$F".new && mv -f "$F".new "$F"; }
2010-04-09 11:36:31
User: inof
Functions: eval mv
1

Some commands (such as sed and perl) have options to support in-place editing of files, but many commands do not. This shell function enables any command to change files in place. See the sample output for many examples.

The function uses plain sh syntax and works with any POSIX shell or derivative, including zsh and bash.
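For example (the file name list.txt is just an illustration):

```shell
inplace() { eval F=\"\$$#\"; "$@" > "$F".new && mv -f "$F".new "$F"; }

printf 'banana\napple\ncherry\n' > list.txt
# The last argument ("list.txt") is taken as the file to rewrite:
inplace sort list.txt
cat list.txt
# → apple, banana, cherry (one per line)
```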

for each in *; do file="$each."; name=${file%%.*}; suffix=${file#*.}; mv "$each" "$(echo $name | rot13)${suffix:+.}${suffix%.}"; done
2010-03-20 16:11:12
User: hfs
Functions: mv
-1

This got a bit complicated, because I had to introduce an additional dot at the end that has to be removed again later.
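Note that rot13 is not a standard utility; the command assumes a helper is defined, for example this tr-based stand-in:

```shell
# Minimal rot13 stand-in: rotate each letter 13 places, preserving case.
rot13() { tr 'A-Za-z' 'N-ZA-Mn-za-m'; }

echo "secret" | rot13
# → frperg
```

Because rotating by 13 twice returns the original letters, the same function also decodes.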

find -type f -exec mv {} . \;
2010-03-02 07:09:45
User: and3k
Functions: find mv
11

Find every file and move it to current directory.

ls -d */* | sed -e 's/^/\"/g' -e 's/$/\"/g' | xargs mv -t $(pwd)
2010-03-01 23:43:26
User: leovailati
Functions: ls mv sed xargs
-1

You WILL have problems if the files have the same name.

Use cases: consolidate music library and unify photos (especially if your camera separates images by dates).

After running the command and verifying that there were no name issues, you can use

ls -d */ | sed -e 's/^/\"/g' -e 's/$/\"/g' | xargs rm -r

to remove now empty subdirectories.

wget -r --no-parent http://codeigniter.com/user_guide/ ; mv codeigniter.com/user_guide/* . ; rm -rf codeigniter.com
2010-03-01 02:37:26
Functions: mv rm wget
-1

I constantly need to work on my local computer, so I needed a way to download the CodeIgniter user guide; this is the wget way I figured out.

for file in $(find . -iname "FILENAME"); do sed "s/SEARCH_STRING/REPLACE_STRING/" "$file" > "$file.tmp"; mv "$file.tmp" "$file"; done
for i in */*/*\(1\)*; do mv -f "$i" "${i/ (1)}"; done
2010-01-30 03:11:55
User: magenine
Functions: mv
1

Renames duplicates from MusicBrainz Picard, so you get the latest copy and not a bunch of duplicates.

for f in *;do mv "$f" "${f// /_}";done
2010-01-29 19:57:16
User: ethanmiller
Functions: mv
Tags: bash
9

I realize there are a few of these out there, but none exactly in this form, which seems the cleanest to me.

find . -maxdepth 1 -type f| xargs sha1sum | sed 's/^\(\w*\)\s*\(.*\)/\2 \1/' | while read LINE; do mv $LINE; done
sudo find /etc/rc{1..5}.d -name S99myservice -type l -exec sh -c 'NEWFN=`echo {} | sed 's/S99/K99/'` ; mv -v {} $NEWFN' \;
2010-01-03 00:56:57
User: zoomgarden
Functions: find mv sed sh sudo
0

Change run control links from start "S" to stop "K" (kill) for whatever run levels in curly braces for a service called "myservice". NEWFN variable is for the new filename stored in the in-line shell. Use different list of run levels (rc*.d, rc{1,3,5}.d, etc.) and/or swap S with K in the command to change function of run control links.

for i in *\ *; do if [ -f "$i" ]; then mv "$i" ${i// /_}; fi; done
2010-01-02 16:30:45
User: auriza
Functions: mv
1

This command will replace spaces in filename with underscore, for all file in directory that contain spaces.
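The ${i// /_} expansion used here is bash/ksh parameter substitution: with a doubled slash every match is replaced, with a single slash only the first. A quick illustration (the filename is invented):

```shell
f="my holiday photo.jpg"
echo "${f// /_}"   # all spaces → my_holiday_photo.jpg
echo "${f/ /_}"    # first space only → my_holiday photo.jpg
```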

git clean -n | sed 's/Would remove //; /Would not remove/d;' | xargs mv -t stuff/
for i in $(seq 100); do mkdir f$i; touch myfile$i; mv myfile$i f$i; done
for f in $(ls *.xml.skippy); do mv $f `echo $f | sed 's|.skippy||'`; done
2009-11-19 21:36:26
User: argherna
Functions: ls mv sed
Tags: sed ls mv for
-2

For this example, all files in the current directory that end in '.xml.skippy' will have the '.skippy' removed from their names.

find /home/dir -mtime +1 -print -exec gzip -9 {} \; -exec mv {}.gz {}_`date +%F`.gz \;
ls | while read f; do mv "$f" "${f// /_}";done