Pipe the byte count into | sed "s/$/\/(1024\*1024\*1024)/" | bc to get the size in GB.
This command lists all the directories in SEARCHPATH by size, displaying their sizes in a human-readable format.
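Since the command itself isn't shown, here is one likely shape for it (hedged; GNU du and sort assumed, with a throwaway demo/ tree standing in for SEARCHPATH):

```shell
# Build two directories of different sizes to demonstrate the sort
mkdir -p demo/small demo/big
head -c 1024   /dev/zero > demo/small/f
head -c 204800 /dev/zero > demo/big/f
# -s summarizes each directory, -h prints human-readable sizes,
# sort -h orders those human-readable values smallest first
du -sh demo/*/ | sort -h
```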
The command uses find to locate all files recursively and chmod each one.
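A minimal sketch of the pattern on a hypothetical tree: find visits every regular file recursively (directories are skipped by -type f) and -exec runs chmod on each match.

```shell
# Set up a small tree with restrictive permissions
mkdir -p proj/sub
touch proj/a.txt proj/sub/b.txt
chmod 600 proj/a.txt proj/sub/b.txt
# Recursively chmod regular files only, leaving directories alone
find proj -type f -exec chmod 644 {} \;
```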
This is a more concise answer to http://blog.commandlinekungfu.com/2011/09/episode-158-old-switcheroo.html in my opinion.
cd /
find `pwd` -name '.*' -prune -o \( -name '*.h' -o -name '*.cpp' \) -print | cscope -bi-
export CSCOPE_DB=/cscope.out
vim +'set cst'
A different approach to the problem: maintain a small sorted list, print the largest files as we go, then the top 10 at the end. The find and sort often take a long time, and a large file might appear near the start of the find; printing as we go gives better feedback. The sort used here will be much slower on Perls older than 5.8.
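For contrast, the conventional pipeline this approach improves on can be sketched as follows (GNU find assumed; the files here are hypothetical):

```shell
# Create two files of very different sizes
mkdir -p sizes
head -c 100  /dev/zero > sizes/small
head -c 9000 /dev/zero > sizes/large
# sort emits nothing until find has listed every file -- that up-front
# delay is exactly what the incremental perl version avoids
find sizes -type f -printf '%s %p\n' | sort -rn | head -n 10
```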
Be careful where you execute this from. Do a 'sudo ls' beforehand to prime sudo so it doesn't ask for your password partway through.
When you download files on a Mac, Apple adds the extended attribute com.apple.quarantine. Often this prevents you from even running a ./configure. This command removes the quarantine attribute from all files in the current directory.
Using xargs is usually much quicker, since it does not have to execute chmod once for every file.
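A sketch of the xargs variant (GNU find/xargs assumed; the scripts/ files are hypothetical): xargs batches many file names into a single chmod invocation instead of forking chmod per file, and -print0/-0 keeps names with spaces intact.

```shell
mkdir -p scripts
touch scripts/one.sh scripts/two.sh
# -r (GNU xargs) skips running chmod entirely when nothing matches
find scripts -type f -name '*.sh' -print0 | xargs -0 -r chmod +x
```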
This is useful when you are importing svn project files into a new git repo.
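Since the command itself isn't shown, a hedged sketch of the usual cleanup step: prune the leftover .svn metadata directories before importing the tree into git (the repo/ paths are hypothetical).

```shell
# Simulate a checked-out svn working copy
mkdir -p repo/.svn repo/src/.svn
touch repo/.svn/entries repo/src/keep.c
# -prune stops find descending into each .svn before rm deletes it
find repo -type d -name .svn -prune -exec rm -rf {} +
```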
Finds files modified today (since 00:00), strips the ugly "./" prefix from every filename, and sorts them. *EDITED* following the advice from flatcap (thanks!)
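One way the pieces could fit together (GNU find assumed; -maxdepth 1 is added here just to keep the demo output small): -daystart anchors -mtime 0 to midnight rather than "24 hours ago", sed strips the "./" prefix, and sort orders the names.

```shell
# A file modified right now certainly counts as "modified today"
touch today.txt
find . -maxdepth 1 -type f -daystart -mtime 0 | sed 's|^\./||' | sort
```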
This solution is similar to [1], except that it has no dependency on GNU Parallel. It also tries to minimize the impact on the running system by using ionice and nice. [1] http://www.commandlinefu.com/commands/view/7009/recompress-all-.gz-files-in-current-directory-using-bzip2-running-1-job-per-cpu-core-in-parallel
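A hedged sketch of the idea, since the command itself isn't shown: xargs -P runs one recompression job per core, and nice lowers CPU priority (GNU find/xargs and nproc assumed; on Linux an "ionice -c3" prefix would additionally lower I/O priority, omitted here for portability).

```shell
# Create a sample .gz to recompress (gzip replaces sample.txt with sample.txt.gz)
echo "hello recompress" > sample.txt
gzip sample.txt
# -n1 hands one file to each job, -P parallelizes across cores,
# -r skips the run if no .gz files exist; the original is removed on success
find . -maxdepth 1 -name '*.gz' -print0 |
  xargs -0 -r -n1 -P"$(nproc)" nice -n 19 sh -c \
    'gunzip -c "$1" | bzip2 -9 > "${1%.gz}.bz2" && rm -- "$1"' _
```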
First, find locates all the images ending in jpg or JPG in the current directory and all its children, and pipes them to xargs. The -I{} option makes spaces in filenames harmless. The '1024>' geometry resizes only images wider than 1024 pixels down to a width of 1024, keeping the aspect ratio for the height. It then sets the image quality to 40. Piping through xargs avoids the argument-count limit, so you could run this on your entire filesystem if you wanted.
Use find's built-in -exec option to avoid having to do any weirdness with quoting.
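A sketch of why -exec sidesteps the quoting trouble (the messy/ file is hypothetical): each matched path is passed to the command as a single argument, so the space in the name needs no escaping at all.

```shell
mkdir -p messy
printf 'TODO: fix\n' > 'messy/a file.txt'
# grep -l prints the names of matching files; {} + batches many
# paths into one grep invocation, each path kept as one argument
find messy -type f -name '*.txt' -exec grep -l TODO {} +
```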
This command locates all pom.xml files, changes into each file's directory, and runs mvn clean there. I recommend disabling your network interfaces first so Maven doesn't download dependency packages, which makes it faster.
This begins recursively looking at dot files starting from "./path_to_dir". Then it prints out the names of those files. If you are satisfied with the list of files discovered then you can delete them like so `find ./path_to_dir -type f -name '.*' -exec rm '{}' \;` which executes the removal program against each of those names previously printed. This is useful when you want to remove thumbnail files on Mac OSX/Windows or simply want to reset an app's configuration on Linux.