Note the space before the command; that prevents your history-clearing command itself from being recorded. ' history -c && rm -f ~/.bash_history' Both steps are needed: 'history -c' clears the history held in memory (what the history command shows), and 'rm -f ~/.bash_history' deletes the history file in your home directory.
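The leading-space trick only works when HISTCONTROL includes ignorespace (or ignoreboth), which many distributions already set by default; a minimal sketch:
export HISTCONTROL=ignoreboth
 history -c && rm -f ~/.bash_history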
This uses mpg123 to convert the files to wav before burning, but you can use mplayer or mencoder or ffmpeg or lame with the --decode option, or whatever you like.
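For a single file, the decode step with two of those tools looks roughly like this (foo.mp3 and foo.wav are placeholder names, not from the original command):
mpg123 -w foo.wav foo.mp3
lame --decode foo.mp3 foo.wav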
Add permanent line numbers to a file without creating a temp file. The rm command deletes file10 while the nl command keeps working on the already-open file descriptor for file10, writing its output to a new file that is again named file10. The new file10 ends up in the same directory with the same name and the now-numbered content, but it is in fact a new file; using 'ls -i' to show its inode number will prove this.
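A minimal sketch of the trick (not necessarily the exact original; the stdin redirection opens file10 before rm unlinks the name):
{ rm file10 && nl > file10; } < file10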
With the plus instead of the semicolon, find builds the command (e.g. rm) the way xargs does, invoking as few extra processes as possible.
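For example (the '*.tmp' pattern is just a placeholder):
find . -name '*.tmp' -exec rm {} \;   # runs one rm per file
find . -name '*.tmp' -exec rm {} +    # batches many files per rm, like xargs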
This will stop all running containers, then remove all containers. **This isn't for selectively handling containers, it removes everything.**
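With the Docker CLI that typically looks like this (a sketch, not necessarily the exact original; be sure you really want every container gone):
docker stop $(docker ps -q) && docker rm $(docker ps -aq)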
I had to compress it a bit to meet the 255-character limit; see the sample for the full command (274). Usage: ffgif foo.ext. It supports 3 optional arguments: ffgif filename seek_time duration scale. 'ffgif foo 10 5 320' will seek 10 seconds in and convert 5 seconds of video at a 320 scale. By default it converts the whole video to a GIF at 320 scale. Inspiration - http://superuser.com/questions/556029/how-do-i-convert-a-video-to-gif-using-ffmpeg-with-reasonable-quality/556031#556031
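The core ffmpeg call behind such a function is roughly this (a sketch, not the exact ffgif internals; the numbers match the example above):
ffmpeg -ss 10 -t 5 -i foo.ext -vf scale=320:-1 foo.gif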
I often need to send screenshots to other people to explain settings and whatever, so I created this one-liner which creates the screenshot with ImageMagick, uploads it via scp to my server, and then opens a Firefox tab with the screenshot. The screenshot can be a region or a window. You just have to replace the parts beginning with YOUR.
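A sketch of what such a one-liner can look like (the YOUR… parts, the filename, and the web-root path are placeholders, not taken from the original):
import shot.png && scp shot.png YOURUSER@YOURSERVER:/var/www/html/ && firefox http://YOURSERVER/shot.png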
Remove a file that has sensitive info safely: it overwrites the file 33 times with zeros.
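One way to approximate this with shred (a sketch, not necessarily the original command: shred's 33 passes use random data, -z adds a final zeroing pass, and -u removes the file afterwards):
shred -n 33 -z -u sensitive.txt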
Inspired by http://www.commandlinefu.com/commands/view/2573/remove-all-files-previously-extracted-from-a-tar.gz-file ... but for zip files.
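A sketch of the zip variant (archive.zip is a placeholder; directories created by the extraction are left behind and rm will just complain about them):
unzip -Z1 archive.zip | xargs -I{} rm -v "{}"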
A common mistake in Bash is to write a command line where a command reads a file and its output is redirected to that same file. It can easily be avoided thanks to: 1) warnings such as "-bash: file.txt: cannot overwrite existing file" and 2) options (often "-i") that let the command directly modify the file. But I like to have this small function that does the trick by waiting for the first command to finish before trying to write into the file. Lots of things could probably be done in a better way; if you know one...
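A minimal sketch of such a helper (the name 'towrite' and its calling convention are hypothetical, not the original function): it captures the command's complete output first and only then overwrites the file.
towrite() { local f=$1; shift; local out; out=$("$@" < "$f") && printf '%s\n' "$out" > "$f"; }
# usage: towrite file.txt sort     (sorts file.txt "in place")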
Check out the usage of 'trap'; you may not have seen this one much. This command provides a way to schedule commands at certain times by running them after sleep finishes sleeping. In the example, 'sleep 2h' sleeps for 2 hours. What is cool about this command is that it uses the 'trap' bash builtin to ignore SIGHUP, the signal that normally terminates all processes started by the shell upon logout. The 'trap 1' command then restores the normal SIGHUP behaviour. It also uses 'nice -n 19', which runs the sleep process at minimal CPU priority. Further, it runs all the commands within the second set of parentheses in the background. This is sweet because you can fire off as many of these as you want. Very helpful for shell scripts.
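One way the pattern can be written (a sketch that may differ from the original; 'your_command' is a placeholder):
trap '' 1; (nice -n 19 sleep 2h && your_command) & trap 1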
The command first deletes any old playlist called playlist.tmp under /tmp. After that it recursively searches all directories under ~/mp3 and stores the result in /tmp/playlist.tmp. Having created the playlist, the command executes mplayer, which shuffles through the playlist. In my ~/.bashrc this command is aliased to m: `rm -rf /tmp/playlist.tmp && find ~/mp3 -name *.mp3 > /tmp/playlist.tmp && mplayer -playlist /tmp/playlist.tmp -shuffle -loop 0 | grep Playing'
This will search all directories and ignore the CVS ones. Then it will search all files in the resulting directories and act on them.
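A sketch of the pattern (here grep -l foo stands in for whatever action you want to run on the files):
find . -name CVS -prune -o -type f -exec grep -l foo {} +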
This function does a batch edit of all OOO3 Writer files in the current directory. It uses sed to search for a FOO pattern in the body text of each file, then replaces it with the foo pattern (only the first match). I did it because I have some hundreds of OOO3 Writer files where I needed to edit one word in each of them, and opening each file in the OOO3 GUI wasn't an option. Usage: bsro3 FOO foo
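A rough sketch of how such a function might look (hypothetical, not the original bsro3; it assumes the Writer files are .odt zip containers whose body text lives in content.xml):
bsro3() {
  local from=$1 to=$2 f
  for f in *.odt; do
    # extract the body, replace the first match per line, then write it back into the archive
    unzip -p "$f" content.xml | sed "s/$from/$to/" > content.xml &&
      zip -q "$f" content.xml && rm -f content.xml
  done
}
# usage: bsro3 FOO foo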
My variation on an audio burning command from commandlinefu - this one doesn't crap out if you want to burn a CD from a directory whose permissions don't allow it, and instead rips everything to /tmp. If you mount your music partition over Samba like I do, you probably don't have write permission inside that file system to create the temporary directory that other audio burning commands here use. Not a bad idea to add cdrom to your groups, and to allow /bin/eject via visudo.
Thanks to flatcap for optimizing this command. This command takes advantage of the ext4 filesystem's resistance to fragmentation. By using this command, files that were previously fragmented will be copied / deleted / pasted, essentially giving the filesystem another chance at saving the file contiguously (unlike FAT / NTFS, *nix filesystems always try to save a file without fragmenting it). My command only affects the home directory and only those files with your R/W (read / write) permissions.
There are two issues with this command:
1. it really won't help much; it works, but Linux doesn't suffer much (if any) fragmentation and even fragmented files have fast I/O
2. it doesn't discriminate between fragmented and non-fragmented files, so a large ~/ directory with no fragments will take almost as long as an equally sized fragmented ~/ directory
The benefits I managed to work into the command:
1. it only defragments files under 16MB, because a large file with fragments isn't as noticeable as a small file that's fragmented, and copy / delete / paste of large files would take too long
2. it gives a nice countdown in the terminal so you know how much progress is being made, and just like other defragmenters you can stop at any time (use ctrl+c)
3. fast! I can defrag my ~/ directory in 11 seconds thanks to the ramdrive powering the command's temporary storage
Bottom line:
1. it's only an experiment, safe (I've used it several times for testing), but probably not very effective (unless you somehow have a fragmentation problem on Linux); it might be a placebo for recent Windows converts looking for a defrag utility on Linux who won't accept no for an answer
2. it's my first commandlinefu command
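A rough sketch of the copy / delete / paste idea (hypothetical, not the original command; it assumes /dev/shm as the RAM-backed temporary storage and only touches writable files under 16 MB):
find ~ -type f -writable -size -16M -print0 |
while IFS= read -r -d '' f; do
  # experiment only: if interrupted between rm and the copy back, the file exists only in /dev/shm
  cp -p "$f" /dev/shm/defrag.tmp && rm -f "$f" && cp -p /dev/shm/defrag.tmp "$f"
done
rm -f /dev/shm/defrag.tmp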
Maybe you first want to check which files will be deleted:
find $HOME -name '*.sol' -exec echo rm {} \;
I think this is less resource-consuming than the previous examples.
Multi-argument version, but with VIM loveliness :D
Recursively deletes all broken symlinks using zsh globbing syntax.
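The usual zsh idiom for this looks like the following (a sketch; add the N qualifier inside the parentheses if you want the pattern to expand to nothing when there are no broken links):
rm -- **/*(-@)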