Commands by kumarrav90 (2)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Convert the contents of a directory listing into a colon-separated environment variable
Useful for making a CLASSPATH out of a list of JAR files, for example. Also: export CLASSPATH=.:$(find ./lib -name '*.jar' -printf '%p:')
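The entry's own command isn't reproduced above; a minimal sketch of the idea, joining a directory listing into a colon-separated variable (the ./lib path and the CLASSPATH name are just illustrative), might be:

$ export CLASSPATH=$(ls ./lib/*.jar | tr '\n' ':')   # turn the newline-separated listing into a colon-separated value

The trailing ':' this leaves is usually harmless for CLASSPATH; the find/-printf variant quoted above avoids parsing ls output.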

See what % of your memory Firefox is using
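The command itself isn't shown above; one hedged approach on Linux is to sum the %MEM column that ps reports for every firefox process:

$ ps -o %mem= -C firefox | awk '{sum += $1} END {print sum "%"}'   # -C selects by command name, %mem= suppresses the header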

Install pip with Proxy
Installs pip packages through a proxy.
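The exact command isn't included above, but pip does take a --proxy option, so a sketch (proxy host, port, credentials and package name are all placeholders) would be:

$ pip install --proxy http://user:password@proxy.example.com:8080 somepackage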

Create a tar archive using xz compression
Compresses a directory into an xz archive when tar doesn't have the -J option (OS X's tar doesn't have -J).
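The listed command isn't reproduced here; a sketch that works even without -J is to pipe a plain tar archive through a standalone xz (directory and archive names are placeholders):

$ tar -cf - somedir | xz > somedir.tar.xz   # uncompressed tar on stdout, compressed by xz

Where tar does support -J, the single-command equivalent is tar -cJf somedir.tar.xz somedir.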

Use Vim to convert text to HTML.
``vimhtml somefile.txt`` will open vim for the HTML conversion and close it immediately after it's done, leaving you with somefile.html, which you can later use on your website or wherever.
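The vimhtml function itself isn't reproduced above; a sketch of such a shell function, built on Vim's built-in :TOhtml command (the function name follows the description, the rest is an assumption), could look like:

vimhtml() {
    # open the file, convert the buffer with :TOhtml, write everything and quit
    vim -n "$1" -c 'TOhtml' -c 'wqa'
}

By default :TOhtml writes the HTML copy alongside the input file.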

Commandline document conversion with Libreoffice
In this example, the docx gets converted to Open Document .odt format. For other formats, you'll need to specify the correct filter (Hint: see "Comments" link below for a nice list).
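The command itself isn't shown above; LibreOffice's headless conversion is typically invoked like this (the file name is a placeholder):

$ libreoffice --headless --convert-to odt somefile.docx   # writes somefile.odt in the current directory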

generate file list modified since last commit and export to tar file
################################################################################
# get all modified files since last commit and zip them to upload to live server
################################################################################

# delete previous tar output file
rm mytarfile.tar -rf
#rm c:/tarOutput/*.* -rf

# get last commit id and store in variable
declare RESULT=$(git log --format="%H" | head -n1)

# generate file list and export to tar file
git diff-tree -r --no-commit-id --name-only --diff-filter=ACMRT $RESULT | xargs tar -rf mytarfile.tar

# extract tar files to specified location
tar -xf mytarfile.tar -C c:/tarOutput

easily find megabyte eating files or directories
This is easy to type if you are looking for a few (hundred) "missing" megabytes (and don't mind the occasional K slipping in). A variation without false positives that also finds gigabytes (but, depending on your keyboard setup, more painful to type):

$ du -hs * | grep -P '^(\d|,)+(M|G)' | sort -n

(NOTE: you might want to replace the ',' according to your locale!) Don't forget that you can modify the globbing as needed, e.g. '.[^\.]* *' to include hidden files and directories (with bash). At its core this is similar to: http://www.commandlinefu.com/commands/view/706/show-sorted-list-of-files-with-sizes-more-than-1mb-in-the-current-dir
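The entry's own "easy to type" command isn't shown above; going by the description, it is presumably just a grep for an M in du's human-readable output, something like:

$ du -hs * | grep M | sort -n   # quick and dirty: also matches an M in a file name, hence the occasional K slipping in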

Find all active IPs in a subnet
nmap for Windows and other platforms is available on the developer's site: http://nmap.org/download.html. nmap is a robust tool with many options and various output modes; it is (imho) the best tool out there. From the nmap 5.21 man page: -oN/-oX/-oS/-oG <file>: Output scan in normal, XML, s|<rIpt kIddi3, and Grepable format, respectively, to the given filename.
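The scan command itself isn't included above; the usual way to list live hosts in a subnet with nmap is a ping scan (the address range is only an example):

$ nmap -sn 192.168.1.0/24   # ping scan, no port scan; older releases spell this -sP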

How many files in the current directory?
A simple "ls" lists files *and* directories, so we need to "find" only the files (type 'f'). As "find" is recursive by default, we restrict it to the current directory by adding a maximum depth of 1. If you are using zsh, you can use the dot (.) glob qualifier to denote plain files: zsh> ls *(.) | wc -l. For more information, see the zsh manual on expansion and substitution ("man zshexpn").

