Create a bunch of files filled with random binary content. Essentially, dd copies random chunks of your hard disk into files named random-file*.
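A minimal sketch of the same idea that reads from /dev/urandom instead of the raw disk (no root needed, and nothing sensitive leaks into the files); the file count of 5 and the 1 KiB size are arbitrary choices, not what the original used:

```shell
# Create 5 files, random-file.0 .. random-file.4, each holding
# 1 KiB of random bytes read from /dev/urandom.
for i in 0 1 2 3 4; do
  dd if=/dev/urandom of="random-file.$i" bs=1024 count=1 2>/dev/null
done
```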
Find files without extensions; both ASCII and UTF-8 files are reported as "text/plain".
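One way to sketch this, assuming GNU find and file (the demo2 directory and sample files are made up for illustration):

```shell
# Among files with no extension, print those whose MIME type is
# text/plain (ASCII and UTF-8 text both report as text/plain).
mkdir -p demo2
echo 'hello' > demo2/notes          # plain text, no extension
printf '\000\001\002' > demo2/blob  # binary, no extension
find demo2 -type f ! -name '*.*' -exec sh -c \
  'file --mime-type -b "$1" | grep -qx "text/plain" && echo "$1"' _ {} \;
```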
"&&" runs sed if and only if the backup completed and /bin/cp exited cleanly. Works for multiple files; just specify multiple filenames (or glob). Use -v switch for cp to play it safe.
List all text files in the current directory.
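A sketch of one common way to do it, letting `file` classify everything and keeping the "text" lines (the demo4 directory and its files are invented for the example):

```shell
# Classify every file, keep those described as text, print the names.
mkdir -p demo4
echo 'hi' > demo4/readme
printf '\000\001' > demo4/blob
( cd demo4 && file * | grep text | cut -d: -f1 )
```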
cat -n file: numbers all lines. cat -b file: numbers only non-empty lines. See man cat.
I know this has been beaten to death, but finding video files by MIME type and printing the "hours of video" for each directory is (IMHO) easier to parse than a single total. Output is in minutes. Among the other niceties, it omits non-video files and folders. PS: Barely managed to fit it within the 255-character limit :D
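A longer, readable sketch of the same idea, unconstrained by the character limit. Detection is by MIME type (video/*); durations come from ffprobe (part of ffmpeg), which is an assumption here, not necessarily what the original one-liner used. It also assumes filenames without embedded newlines; the demo6 directory is invented:

```shell
# Per-directory video runtime, printed in minutes.
mkdir -p demo6/clips && echo 'not a video' > demo6/clips/note.txt
for d in demo6/*/; do
  find "$d" -type f | {
    total=0
    while IFS= read -r f; do
      # keep only files whose MIME type starts with video/
      file -b --mime-type "$f" | grep -q '^video/' || continue
      sec=$(ffprobe -v error -show_entries format=duration \
            -of default=noprint_wrappers=1:nokey=1 "$f")
      total=$(awk -v a="$total" -v b="$sec" 'BEGIN{print a+b}')
    done
    # print only directories that actually contain video
    awk -v d="$d" -v t="$total" 'BEGIN{ if (t > 0) printf "%s %.1f min\n", d, t/60 }'
  }
done
```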
Finds all (not just adjacent) repeated lines in a file.
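The classic form of this trick (demo7.txt is a made-up sample file):

```shell
# sort groups duplicates together; uniq -d then prints one copy of
# each line that appears more than once, adjacent or not.
printf 'alpha\nbeta\nalpha\ngamma\nbeta\n' > demo7.txt
sort demo7.txt | uniq -d
```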
If you used to do `vlc /tmp/Flash*` but no longer can, this is for you.
If you made a mess (like I did) and removed all the executable permissions in a directory tree (or set executable permissions on everything), this can help.
It supports spaces and other special characters in file paths, but it only works with bash, GNU find, and GNU egrep.
You can complement it with these two commands:
1. add executable permission to directories:
find . -type d -print0 | xargs -0 chmod +x
2. remove it from files:
find . -type f -print0 | xargs -0 chmod -x
Or, in the same loop:
while IFS= read -r -u3 -d $'\0' file; do
    case $(file "$file" | cut -f 2- -d :) in
        *executable*|*ELF*|*directory*)
            chmod +x "$file"
            ;;
        *)
            chmod -x "$file"
            ;;
    esac || break
done 3< <(find . -print0)
Ideas stolen from Greg's wiki: http://mywiki.wooledge.org/BashFAQ/020
Rename all mp4 files to include their CRC32 checksum.
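A sketch of the renaming step. The checksum is computed here with python3/zlib as a portable stand-in for a dedicated crc32 utility, so that part is an assumption, not the original command; the demo28 directory and sample file are invented:

```shell
# Insert each file's CRC32 (as [XXXXXXXX]) before the .mp4 extension.
mkdir -p demo28 && printf 'hello' > demo28/clip.mp4
for f in demo28/*.mp4; do
  crc=$(python3 -c 'import sys, zlib
data = open(sys.argv[1], "rb").read()
print(format(zlib.crc32(data) & 0xffffffff, "08X"))' "$f")
  mv -v "$f" "${f%.mp4} [$crc].mp4"
done
```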
You need pxz for the actual work (http://jnovy.fedorapeople.org/pxz/). The function could be improved with better multi-file and stdin/stdout support.
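A hypothetical wrapper function in the same spirit (the name pxzf, the fallback chain, and the demo29 files are all made up, not the original function):

```shell
# Compress each argument, preferring pxz, then xz; gzip is a last
# resort so the sketch still runs where neither is installed.
pxzf() {
  for f in "$@"; do
    if command -v pxz >/dev/null 2>&1; then pxz -9 "$f"
    elif command -v xz >/dev/null 2>&1; then xz -9 "$f"
    else gzip -9 "$f"
    fi
  done
}
mkdir -p demo29 && echo 'some data' > demo29/log.txt
pxzf demo29/log.txt
```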
Helps to fix permissions when a user clobbers them in their home directory or elsewhere. Does not rely on file extension, but uses the `file` command for context.
Converts flac files to mp3 with the same file names in the same directory.
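The renaming half of the trick is the parameter expansion ${f%.flac}.mp3, which strips .flac and appends .mp3. The echo below makes this a dry run that only prints the commands; drop it to actually convert. ffmpeg is assumed here, the original may have used flac plus lame instead; demo31 is an invented sample:

```shell
mkdir -p demo31 && touch demo31/song.flac
for f in demo31/*.flac; do
  # dry run: print the conversion command for each file
  echo ffmpeg -i "$f" -qscale:a 2 "${f%.flac}.mp3"
done
```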
All files in the directory will be renamed, replacing every space in the filename with "_" (underscore) and converting upper-case characters to lower case, e.g. Foo Bar.txt --> foo_bar.txt
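One way to sketch the rename with tr (the demo32 directory and sample file are made up for illustration):

```shell
# Spaces become underscores, upper case becomes lower case.
mkdir -p demo32 && touch 'demo32/Foo Bar.txt'
( cd demo32
  for f in *; do
    n=$(printf '%s' "$f" | tr ' A-Z' '_a-z')
    [ "$f" = "$n" ] || mv -- "$f" "$n"
  done )
```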
Changes the execute bit of 'shell'-compatible files even when their names don't end in *.sh.
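A sketch of the idea using `file` to spot scripts regardless of extension (the demo33 directory and its contents are invented):

```shell
# chmod +x every file that `file` identifies as a shell script,
# whatever its name.
mkdir -p demo33
printf '#!/bin/sh\necho hi\n' > demo33/runme   # script without .sh
printf 'just notes\n' > demo33/notes           # plain text
find demo33 -type f -exec sh -c \
  'file -b "$1" | grep -q "shell script" && chmod +x "$1"' _ {} \;
```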
touch -t 201208211200 first ; touch -t 201208220100 last
creates 2 files, first and last, with timestamps that the find command should look between:
201208211200 = 2012-08-21 12:00
201208220100 = 2012-08-22 01:00
Then we run the find command with the "-newer" switch, which finds files by comparing timestamps against a reference file:
find /path/to/files/ -newer first ! -newer last
meaning: find any files in /path/to/files that are newer than file "first" and not newer than file "last".
Pipe the output of this find command through xargs to a move command:
| xargs -ifile mv -fv file /path/to/destination/
and finally, remove the reference files we created for this operation:
rm first; rm last
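With GNU find the two reference files are unnecessary: -newermt compares against a timestamp string directly. A self-contained demo (the demo34 directories and files are invented; the window matches the example above):

```shell
mkdir -p demo34/src demo34/dst
touch -t 201208211300 demo34/src/in_range   # 2012-08-21 13:00, inside the window
touch -t 201208210800 demo34/src/too_old    # 2012-08-21 08:00, before the window
find demo34/src -type f -newermt '2012-08-21 12:00' \
  ! -newermt '2012-08-22 01:00' -exec mv -fv {} demo34/dst/ \;
```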
commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for: