
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign-in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they will appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - Commands using rename - 61 results
find . -type d -print0 | while read -d $'\0' dir; do cd "$dir"; echo " process $dir"; find . -maxdepth 1 -name "*.ogg.mp3" -exec rename 's/.ogg.mp3/.mp3/' {} \; ; cd -; done
2014-08-25 11:28:43
Functions: cd echo find read rename
2

This is probably overkill, but I have some issues when the directories have spaces in their names.

The

find . -type d -print0 | while read -d $'\0' dir; do xxx; done

loops over all the subdirectories in this place, handling white space in their names (to some extent).

cd "$dir"; echo " process $dir"; cd -;

goes to the directory and back. It also prints some info to check the progress.

find . -maxdepth 1 -name "*.ogg.mp3" -exec rename 's/.ogg.mp3/.mp3/' {} \;

renames the file within the current directory.

The whole thing should work with directory and file names that include white space.
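
The same effect can be sketched in plain bash, without cd-ing around or depending on perl's rename(1). The directory and file names below are invented for illustration:

```shell
# Minimal sketch: null-delimited loop plus mv, assuming bash.
tmp=$(mktemp -d)
mkdir -p "$tmp/album one"
touch "$tmp/album one/track.ogg.mp3"

# IFS= read -r -d '' is the robust way to walk directories whose
# names contain spaces or other odd characters.
find "$tmp" -type d -print0 | while IFS= read -r -d '' dir; do
    for f in "$dir"/*.ogg.mp3; do
        [ -e "$f" ] || continue            # skip dirs with no matches
        mv -- "$f" "${f%.ogg.mp3}.mp3"     # strip the double extension
    done
done
```

The `${f%.ogg.mp3}` expansion only touches the end of the name, which is usually what you want for extensions.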

dir=${PWD##*/}; rename "s/`ls -b1 | head -n1 | sed 's/.\{4\}$//'`/$dir/" -v *
2014-07-08 03:20:04
User: codycook
Functions: dir rename
0

I use this on Debian to rename files that exist in directories but do not have the year in the file name. The directory has the year but the files inside don't.

How it works:

The dir variable grabs the name of the folder.

Using rename, substitute the name of the first file and remove the extension, then rename it to the directory name.

To test this before you run it, change -v to -vn.
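
For reference, a rough plain-bash equivalent that avoids parsing the output of ls (the folder and file names below are invented, and the stem is assumed not to contain glob characters):

```shell
tmp=$(mktemp -d)
mkdir "$tmp/2014"
cd "$tmp/2014" || exit
touch show.avi show.srt

dir=${PWD##*/}          # "2014", the folder name
set -- *                # first glob match instead of ls | head
stem=${1%.*}            # "show": first file's name minus its extension
for f in *; do
    mv -- "$f" "${f/$stem/$dir}"   # substitute the stem with the folder name
done
```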

rename 's/result_([0-9]+)_([0-9]+)_([0-9]+)\.json\.txt/sprintf("%d%02d%02d.txt",$3,$2,$1)/ge' result_*.txt
2014-06-13 07:34:32
User: sucotronic
Functions: rename
Tags: perl rename
0

Given a bunch of files with "wrong" date naming, it renames them in a "good" format.
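
The same day/month/year reordering can be sketched in plain bash with BASH_REMATCH instead of perl's /e substitution (the file name below is an invented example):

```shell
tmp=$(mktemp -d)
cd "$tmp" || exit
touch result_13_06_2014.json.txt

re='result_([0-9]+)_([0-9]+)_([0-9]+)\.json\.txt'
for f in result_*.txt; do
    [[ $f =~ $re ]] || continue
    # groups are day, month, year; 10# guards against octal ("08", "09")
    printf -v new '%d%02d%02d.txt' \
        "$((10#${BASH_REMATCH[3]}))" "$((10#${BASH_REMATCH[2]}))" "$((10#${BASH_REMATCH[1]}))"
    mv -- "$f" "$new"
done
```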

find . -exec rename 's/_/\ /g' {} +
2014-05-05 02:47:19
User: KlfJoat
Functions: find rename
1

Everyone wants to take spaces out of filenames. Forget that. I want to put them back in. We've got tools and filesystems that support spaces, they look better, so I'm going to use them.

Because of how find traverses the tree, I need to run this multiple times when it's renaming subdirectories. But it can be re-run without issues.

I got this version of the command from a comment in this underscore-generating command. http://www.commandlinefu.com/commands/view/760/find-recursively-from-current-directory-down-files-and-directories-whose-names-contain-single-or-multiple-whitespaces-and-replace-each-such-occurrence-with-a-single-underscore. All I did was change the regex.
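
One way to avoid the multiple runs is to rename depth-first: with -depth, find lists children before their parents, so renaming a directory never invalidates a path still waiting in the stream. A sketch in plain bash (all names below are invented):

```shell
tmp=$(mktemp -d)
cd "$tmp" || exit
mkdir -p my_photos/summer_2014
touch "my_photos/summer_2014/beach_day.jpg"

find . -depth -name '*_*' -print0 | while IFS= read -r -d '' p; do
    base=${p##*/}
    mv -- "$p" "${p%/*}/${base//_/ }"   # underscores back to spaces
done
```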

rename 's/\.sh//' ./*
2014-04-02 16:33:25
User: abhikeny
Functions: rename
0

Passing a substitution like 's/\.sh//' as the first argument to 'rename' removes the specified extension from the matched filenames.
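
A plain-bash variant of the same idea; note that `${f%.sh}` strips only a trailing ".sh", whereas the s/\.sh// substitution would also hit ".sh" in the middle of a name (e.g. "run.sh.bak"). The file names below are invented:

```shell
tmp=$(mktemp -d)
cd "$tmp" || exit
touch build.sh deploy.sh
for f in ./*.sh; do
    mv -- "$f" "${f%.sh}"   # drop the trailing .sh extension
done
```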

rename *.JPG *.jpg
2014-03-05 14:54:33
User: gtoal
Functions: rename
Tags: batch rename
-1

# Limited and very hacky wildcard rename

# works for rename *.ext *.other

# and for rename file.* other.*

# but fails for rename file*ext other*other and many more

# Might be good to merge this technique with mmv command...

mv-helper() {
  argv="`history 1 | perl -pe 's/^ *[0-9]+ +[^ ]+ //'`"
  files="`echo \"$argv\"|sed -e \"s/ .*//\"`"
  str="`history 1 | perl -pe 's/^ *[0-9]+ +[^ ]+ //' | tr -d \*`"
  set -- $str
  for file in $files
  do
    echo mv $file `echo $file|sed -e "s/$1/$2/"`
    mv $file `echo $file|sed -e "s/$1/$2/"`
  done
}

alias rename='mv-helper #'

rename s/ .php/ .html/ *.html
find ./ -name '*:*' -exec rename 's/:/_/g' {} +
rename 's/.xls/.ods/g' *.xls
rename "s/ /_/g" * .*
export l=$1; shift; rename 'my $l=$ENV{'l'}; my $z="0" x $l; s/\d+/substr("$z$&",-$l,$l)/e' "$@"
2013-03-13 15:14:20
User: hgrupht13
Functions: export rename
0

Use it as a bash script.

The first positional parameter specifies the fixed length of the numerical index.

Further params specify the files to manipulate.
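
The same zero-padding can be sketched in plain bash, without exporting through the environment ($width and the file names below are invented):

```shell
# Pad the first run of digits in each filename to a fixed width.
tmp=$(mktemp -d)
cd "$tmp" || exit
touch img1.png img23.png

width=4
for f in *; do
    [[ $f =~ [0-9]+ ]] || continue
    n=${BASH_REMATCH[0]}                         # first run of digits
    printf -v pad '%0*d' "$width" "$((10#$n))"   # zero-pad to $width
    mv -- "$f" "${f/$n/$pad}"                    # replace first occurrence
done
```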

rename 's/\d+/sprintf("%02d",$&)/e' -- $@
2013-02-14 18:29:18
User: Vilemirth
Functions: rename
5

Uses 'rename' to pad zeros in front of the first existing number in each filename. The "--" is not required, but it prevents errors on filenames which start with "-". You can change the "2d" to any width you want, equaling the total number of digits in the output: 4d pads to four digits, 8d to eight, etc.

I setup a handful of handy functions to this effect (because I couldn't figure out how to insert a var for the value) in the form of 'padnum?', such as:

padnum5 () {
  /usr/bin/rename 's/\d+/sprintf("%05d",$&)/e' -- "$@"
}

Which would change a file "foo-1.txt" to "foo-00001.txt"

rename(){ txtToReplace=${1} ; replacementTxt=${2} ; shift 2 ; for file in "$@" ; do mv -- "${file}" "${file/${txtToReplace}/${replacementTxt}}" ; done ; }
2012-10-03 17:03:29
Functions: file mv rename shift
2

Implementation of `rename` for systems on which I don't have access to it.
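
A usage sketch of the same idea, written out with quoting tightened so file names containing spaces survive (the file names below are invented):

```shell
# Pure-bash stand-in for rename: substitutes the first occurrence
# of $from with $to in each given file name.
rename() {
    local from=$1 to=$2
    shift 2
    for file in "$@"; do
        mv -- "$file" "${file/$from/$to}"
    done
}

tmp=$(mktemp -d)
cd "$tmp" || exit
touch draft_v1.txt notes_v1.txt
rename v1 v2 ./*.txt
```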

find . -type f -name '*.MP3' -print0 | xargs -0 -i rename .MP3 .mp3 {}
2012-04-29 02:44:44
User: pibarnas
Functions: find rename xargs
0

Using find and xargs, rename all files with the .MP3 extension to .mp3. (The rename here takes plain from/to arguments, i.e. the util-linux version, not the Perl one.)

rename 's/ /_/g' *
2012-04-25 10:32:02
Functions: rename
0

rename is often an alias to prename, bundled with perl.

perl -e 'for (<*.mp3>) { $old = $_; s/ /-/g; rename $old, $_ }'
rename 's/ /-/g' *.mp3
rename ' ' '_' *
2012-01-28 21:31:08
User: toadsted
Functions: rename
0

This was the only form that worked for me.

rename s/^/./ *
find . -type f -print0 | xargs -0 rename 's/\ //g'
2011-10-25 21:46:36
Functions: find rename xargs
1

Deletes spaces from file names.

The rename here is the Perl version.

rename.ul "" 00 ?.jpg; rename "" 0 ??.jpg;
find / -name "*.xls" -print0 | xargs -0 rename .xls .ods {}
2011-07-18 12:48:49
User: jagjit
Functions: find rename xargs
0

This command renames all files with the .xls extension (in this case) to .ods. It can be adapted for other file extensions.

rename foo bar filename
rename foo bar directory/filename
2011-05-04 22:29:11
User: hexram
Functions: rename
-2

The rename command on my system (Fuduntu, running the 2.6.38 Linux kernel) is an ELF 64-bit LSB executable, not a Perl script. The man page for that rename shows the syntax as "rename from to where" (or something like that), so I am doing just what I have been told...