
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as for virtually every other subset (users, tags, functions, …).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - Commands using mv - 180 results
mkdir -p temp && for f in *.pdf ; do qpdf --password=YOURPASSWORDHERE --decrypt "$f" "temp/$f"; done && mv temp/* . && rm -rf temp
2013-06-25 18:41:51
Functions: mkdir mv rm
-1

Replace YOURPASSWORDHERE with the pdf password. [qpdf needed]

ls -1 | while read file; do new_file=$(echo $file | sed s/\ /_/g); mv "$file" "$new_file"; done
split -l 12000 -a 5 database.sql splited_file; i=1; for file in splited_file*; do mv $file database_${i}.sql; i=$(( i + 1 )); done
2013-05-15 18:17:47
User: doczine
Functions: file mv split
0

For some reason split will not let you add an extension to the files it creates. Just add this to a .sh script and run it with bash or sh: it splits your text file into 12000-line chunks and then renames each chunk with a .sql extension.
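If you have a reasonably recent GNU coreutils (an assumption; check split --version), the rename loop isn't needed at all, since split can number the chunks and append the suffix itself:

split -l 12000 -d --additional-suffix=.sql database.sql database_

which writes database_00.sql, database_01.sql, and so on.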

svg2png(){ png="${1%.*}.png"; inkscape --export-png="$png" --without-gui "$1" && pngcrush -brute -rem alla -rem text "$png" "$png.new" && mv "$png.new" "$png";}
2013-05-08 15:21:52
Functions: mv
0

Convert an SVG to PNG and then crush the filesize brutally with pngcrush. Good for icons and website junk that you want to keep small, especially before base64 encoding.

Uses inkscape, not imagemagick, as IM doesn't always handle gradients well. This way also seems to sometimes save some file size (e.g. 619 with Inkscape compared to 695 with IM).

IM can do general images:

img2png(){ png="${1%.*}.png"; convert -background none "$1" "$png" && pngcrush -brute -rem alla -rem text "$png" "$png.new" && mv "$png.new" "$png"; }
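Usage for both helpers is just the source file as the only argument (the filenames below are only examples); the PNG is written next to the source with the same basename and then crushed in place:

svg2png logo.svg

img2png photo.tiff
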
mv data.{json,yaml}
2013-04-25 07:47:32
User: bunam
Functions: mv
Tags: json yaml
-2

since Mozai said that JSON is a subset of YAML ;)
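The rename works because the shell's brace expansion rewrites the argument pair before mv ever runs; prefixing echo shows exactly what will be executed:

echo mv data.{json,yaml}

which prints "mv data.json data.yaml".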

alias rn='mkdir -p ~/.rm`pwd`; mv -v -f --backup=t -t ~/.rm`pwd` "$@"'
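No description was posted for this alias. It looks like a "soft delete": it recreates the current directory path under ~/.rm and moves the named files there with numbered backups (GNU mv options) instead of deleting them. Since aliases don't take parameters, the literal "$@" is effectively a no-op and the arguments are simply appended at the end; a function form (a sketch, not the author's version) is less surprising:

rn() { mkdir -p ~/.rm"$PWD" && mv -v -f --backup=t -t ~/.rm"$PWD" "$@"; }
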
sudo apt-get install git gcc make libx11-dev libxtst-dev pkg-config -y && git clone https://github.com/hanschen/ksuperkey.git && cd ksuperkey && make && sudo mv ksuperkey /usr/bin/ksuperkey && cd ~ && rm -rf ksuperkey
2013-04-17 07:12:46
User: FadeMind
Functions: cd gcc install make mv rm sudo
0

Installs ksuperkey with one command in Kubuntu.

You must manually add ksuperkey to autostart in KDE's System Settings.

count='1'; for i in *.jpg; do mv $i $(printf '%01d'.jpg $count); (( count++ )); done
2013-02-20 06:38:25
User: lalanza808
Functions: mv printf
0

The '1' in '%01d' sets the minimum number of digits in the integer, e.g. '%01d' gives 1 while '%04d' gives 0001.
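A quick way to see what the width field does (the values are arbitrary):

printf '%01d\n' 7     # prints 7
printf '%04d\n' 7     # prints 0007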

for i in $(seq -w 0 100) ; do mv prefix$(( 10#$i )).jpg prefix${i}.jpg ; done
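No description was posted for this one: seq -w zero-pads its output, and the 10#$i forces base-10 arithmetic so that a value like 008 isn't rejected as invalid octal. The loop therefore renames unpadded names (prefix8.jpg) to padded ones (prefix008.jpg). The base-10 trick in isolation:

i=008; echo $(( 10#$i ))     # prints 8; plain $(( $i )) would choke on 008
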
for i in `find -name '*_test.rb'` ; do mv $i ${i%%_test.rb}_spec.rb ; done
2012-10-09 14:08:38
User: olopopo
Functions: mv
0

Renames all files ending in "_test.rb" to "_spec.rb"
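The renaming is plain suffix removal with parameter expansion; it can be checked on a single name (a made-up example) before trusting the loop:

f=models/user_test.rb; echo "${f%%_test.rb}_spec.rb"     # models/user_spec.rb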

rename(){ txtToReplace=${1} ; replacementTxt=${2} ; shift 2 ; files=${@} ; for file in $files ; do mv ${file} ${file/${txtToReplace}/${replacementTxt}} ; done ; }
2012-10-03 17:03:29
Functions: file mv rename shift
2

An implementation of `rename` for systems where I don't have access to the real one.
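Typical usage (illustrative names): the first two arguments are the text to replace and its replacement, everything after that is the file list:

rename _test _spec *_test.rb
rename .jpeg .jpg *.jpeg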

for f in *; do fn=`echo $f | sed 's/\(.*\)\.\([^.]*\)$/\1\n\2/;s/\./-/g;s/\n/./g'`; mv $f $fn; done
2012-09-29 02:10:00
Functions: mv sed
0

This command renames all files in a folder, replacing every dot in the filename with a dash while keeping the final dot before the extension.
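A worked example (GNU sed, made-up filename): the final dot is temporarily turned into a newline, every remaining dot becomes a dash, and the newline is restored to a dot:

echo my.backup.2012.tar.gz | sed 's/\(.*\)\.\([^.]*\)$/\1\n\2/;s/\./-/g;s/\n/./g'

which prints my-backup-2012-tar.gz.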

touch -t 201208211200 first ; touch -t 201208220100 last ; find /path/to/files/ -newer first ! -newer last | xargs -ifile mv -fv file /path/to/destination/ ; rm first; rm last;
2012-08-22 09:51:40
User: ktopaz
Functions: file find last mv rm touch xargs
0

touch -t 201208211200 first ; touch -t 201208220100 last ;

creates 2 files: first & last, with timestamps that the find command should look between:

201208211200 = 2012-08-21 12:00

201208220100 = 2012-08-22 01:00

then we run the find command with the "-newer" switch, which selects files by comparing their timestamps against a reference file:

find /path/to/files/ -newer first ! -newer last

meaning: find any files in /path/to/files that are newer than file "first" and not newer than file "last"

pipe the output of this find command through xargs to a move command:

| xargs -ifile mv -fv file /path/to/destination/

and finally, remove the reference files we created for this operation:

rm first; rm last;
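With GNU find the reference files aren't strictly necessary, because -newermt accepts the timestamp directly on the command line; a rough equivalent of the whole pipeline (assuming GNU findutils and GNU xargs) is:

find /path/to/files/ -newermt '2012-08-21 12:00' ! -newermt '2012-08-22 01:00' -print0 | xargs -0 -I{} mv -fv {} /path/to/destination/

Note also that the -ifile form of xargs is deprecated; -I{} is the modern spelling.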

ls | grep -Ze ".*rar" | xargs -d '\n' -I {} mv {} backup-folder
2012-08-06 09:07:03
User: crisboot
Functions: grep ls mv xargs
0

In this example, suppose we want to move all *.rar files in the current folder to backup-folder.
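A caveat not from the original poster: parsing ls output is fragile for unusual filenames (embedded newlines in particular). For this particular job a plain glob does the same thing, and find copes with arbitrarily large file sets (mv -t is a GNU option):

mv -- *.rar backup-folder/

find . -maxdepth 1 -name '*.rar' -exec mv -t backup-folder/ {} +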

for i in `seq 1 12 | tac` ; do mv access_log.{$i,$((i+1))}.gz ; done
2012-06-13 17:46:37
User: fobriste
Functions: mv
0

Edit as necessary: the name should match your logs and the seq range should run from the lowest to the highest rotation number. Iterating in reverse (via tac) renames the highest-numbered log first, so nothing gets overwritten.

function rjust_file_nums() { for i in *.ogg; do mv "$i" "$(ruby -e "print ARGV.first.gsub(/\d+/){|d| d.rjust($1,'0')}" "$i")"; done; }
2012-05-19 15:41:06
User: timrand
Functions: mv
2

Each number in a file name gets zero-padded to the number of digits given as the first argument to rjust_file_nums. Put the function in the .bashrc file. Be sure to $ source ~/.bashrc so that the function will be accessible from bash.
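Usage example (made-up file names), padding every number to three digits so the files sort naturally:

rjust_file_nums 3     # track7.ogg -> track007.ogg, track42.ogg -> track042.ogg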


for i in [0-9].ogg; do mv {,0}$i; done
2012-05-18 18:02:26
User: zoke
Functions: mv
0

This only matches single-digit filenames such as 1.ogg, prepending a 0 to each (so 1.ogg becomes 01.ogg).

zeros=3; from=1; to=15; for foo in $(seq $from $to); do echo mv "front${foo}back" "front$(printf "%0${zeros}d\n" $foo)back"; done
2012-05-17 10:54:45
Functions: echo mv seq
0

This command takes a few changes to match your file naming, but once you have that, you're good to go. Set the zeros, from and to variables, then change the text "front" and "back" to whatever your files start and end with. You'll end up with easily sortable files.
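As written the command is a dry run: it only echoes the mv invocations. With zeros=3 the first lines of output look like the following, and deleting the echo performs the renames:

mv front1back front001back
mv front2back front002back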

for i in ?.ogg; do mv $i 0$i; done
2012-05-15 02:52:52
User: Bonster
Functions: mv
18

from: 1.ogg, 2.ogg, 3.ogg, 10.ogg, 11.ogg

to: 01.ogg, 02.ogg, 03.ogg, 10.ogg, 11.ogg

for file in * ; do mv "$file" `echo "$file" | tr ' ' '_' | tr '[A-Z]' '[a-z]'`; done
2012-05-06 17:54:06
User: cengztr
Functions: file mv tr
0

All files in the directory will be renamed, replacing every space in the filename with "_" (underscore) and converting upper-case characters to lower case.

e.g. Foo Bar.txt --> foo_bar.txt

for i in *.jpg; do dst=$(exif -t 0x9003 -m $i ) && dst_esc=$(echo $dst | sed 's/ /-/g' ) && echo mv $i $dst_esc.jpg ; done
2012-05-02 07:23:38
User: klisanor
Functions: echo mv sed
Tags: exif date rename
1

The command renames all files in a directory to their creation date taken from EXIF data. If you're working with JPGs that contain EXIF data (i.e. from a digital camera), you can use this to get the creation date instead of stat.

* Since not every file has EXIF data, we want to check that dst is valid before running the rest of the commands.

* The output from exif contains a space, which is a PITA for filenames. Use sed to replace it with '-'.

* Note that I use 'echo' before the mv to test out my scripts. When you're confident that it's doing the right thing, then you can remove the 'echo'... you don't want to end up like the guy that got all the files blown away.

Credits: http://stackoverflow.com/questions/4710753/rename-files-according-to-date-created
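For reference (my reading, not the poster's): tag 0x9003 is DateTimeOriginal, normally formatted like "2012:05:02 14:07:31", so after the space-to-dash substitution the echo would print something like (sample values):

mv IMG_1234.jpg 2012:05:02-14:07:31.jpg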

for file in "* *"; do mv "${file}" "${file// /_}"; done
for i in *.txt; do j=`mktemp | awk -F. '{print $2".txt"}'`; mv "$i" "$j"; done
2012-04-17 17:13:32
User: yepitsken
Functions: awk mv
0

A simple way to rename a set of files to unique, randomized file names.
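One thing to be aware of (my note, not the poster's): every mktemp call here actually creates a file under /tmp that is left behind. GNU mktemp's -u flag generates a name without creating anything, so a variant along these lines avoids the litter:

for i in *.txt; do mv "$i" "$(mktemp -u XXXXXXXXXX).txt"; done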