commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
You can sign-in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).
Description of each pipe-delimited (|) segment:
1. List all git branches
2. Exclude master
3. Trim output and remove display elements such as * next to current branch
4. Repeat branch name after a space (output on each line: branch_name branch_name)
5. Prepend each line with the git tag command
6. Execute the output with bash
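The original one-liner isn't reproduced here, but the steps above map onto a pipeline along these lines (a sketch, not the exact command):

git branch | grep -v master | sed 's/^[* ]*//' | sed 's/.*/& &/' | sed 's/^/git tag /' | bash

Each surviving branch name ends up as "git tag branch_name branch_name", which tags every non-master branch with a tag of the same name.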
Download video files from a bunch of sites (here is a list https://rg3.github.io/youtube-dl/supportedsites.html).
The options say: base the filename on the title, ignore errors, and continue partial downloads. It also stores some metadata in a .json file.
Paste in YouTube users and playlists for extra fun.
Protip: git-annex loves these files
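The exact command isn't shown here, but with youtube-dl the options described map onto something like this (the URL is just an example):

youtube-dl -o '%(title)s.%(ext)s' --ignore-errors --continue --write-info-json 'https://www.youtube.com/watch?v=EXAMPLE'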
Compress a directory into an xz-compressed archive when your tar doesn't have the -J option (OS X tar, for example, lacks -J).
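In that case you can pipe tar through xz yourself; a minimal sketch (directory and archive names are placeholders):

tar cf - somedir/ | xz > somedir.tar.xz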
Compress files or a directory to xz format. xz compresses better and faster than bzip2 in most cases, and a tar.xz beats the 7zip format because tar preserves file permissions and other metadata.
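With a tar that does support -J, the same thing is a one-liner (names are placeholders):

tar cJf somedir.tar.xz somedir/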
This magic line will extract almost any archive in the current folder into its own folder. Don't forget to change the USER name in the sudo command. sed is used to build the folder names from the archive names without the extension. You can test the sed expression used in this command:
arg='war.lan.net' ; x=$(echo $arg|sed 's/\(.*\)\..*/\1/') ; echo $x
If some archives can't be extracted, install packages:
apt-get install p7zip-full p7zip-rar
Hope this will save a lot of your time. Enjoy.
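The original one-liner isn't reproduced here, but a plausible reconstruction along the lines described (7z handles most formats once the packages above are installed; USER is a placeholder for your own user name):

for f in *.7z *.zip *.rar; do
  d=$(echo "$f" | sed 's/\(.*\)\..*/\1/')   # folder name = archive name without extension
  mkdir -p "$d" && 7z x -o"$d" "$f"
  sudo chown -R USER "$d"                    # change USER to your user name
done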
This can be used to delete or archive old mails. For archiving it's a bit different: you first archive the mails with a tool such as archivemail, and then delete them (if you want!).
Here we use -path ".*/cur/*" to avoid bash's limit on the number of globbed files and to search in every inbox (e.g. .mymail, .spam, .whatever).
! -newermt "1 week ago" can be read as: all files older than "1 week ago"; adapt it to your needs.
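Assuming a Maildir layout and GNU find, the command might look roughly like this (start with -print and switch to -delete once the listing looks right):

cd ~/Maildir && find . -type f -path '*/cur/*' ! -newermt "1 week ago" -print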
This is just a little snippet to split a large file into smaller chunks (4 MB in this example) and then send the chunks off by (e)mail for archival using mutt.
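The exact original isn't shown here, but the split-and-mail step could look roughly like this (the address, chunk prefix and 4 MB size are placeholders):

split -b 4m file.tgz file.tgz.part_
for f in file.tgz.part_*; do echo "archive chunk $f" | mutt -s "$f" -a "$f" -- you@example.com; done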
I usually encrypt the file before splitting it using openssl:
openssl des3 -salt -k <password> -in file.tgz -out file.tgz.des3
To restore, simply save attachments and rejoin them using:
cat file.tgz.* > output_name.tgz
and if encrypted, decrypt using:
openssl des3 -d -salt -k <password> -in file.tgz.des3 -out file.tgz
edit: (changed "g" to "e" for political correctness)
Create a tar file in multiple parts if it's too large for a single disk, your filesystem, etc.
Rejoin later with `cat .tar.*|tar xf -`
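Creating the parts in the first place can be done by piping tar through split; a sketch with placeholder names and a 1 GB part size:

tar cf - somedir/ | split -b 1G - somedir.tar.
cat somedir.tar.* | tar xf -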
This will unarchive the entire working directory. Good for torrents (I don't know why they put each file into a separate archive).
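For the common one-rar-per-file torrent layout, one plausible form is a loop over the archives (assuming unrar is installed):

for f in *.rar; do unrar x "$f"; done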
Sometimes it is handy to be able to list the contents of a tar file inside a compressed archive, such as a 7Zip archive in this instance, without having to extract the archive first. This is especially helpful when dealing with large files.
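One way to do this with 7z is to stream the inner tar to stdout and let tar list it (the archive name is a placeholder):

7z x -so backup.tar.7z | tar tvf -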
Using 7z to create archives is OK, but when you use tar, you preserve all file-specific information such as ownership, perms, etc. If that's important to you, this is a better way to do it.
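A sketch of that approach: build the tar yourself and let 7z read it from stdin (names are placeholders):

tar cf - somedir/ | 7z a -si somedir.tar.7z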