commandlinefu.com is the place to record those command-line gems that you return to again and again.
`pwd` prints the current working directory
`grep -o` prints each slash on its own line
perl generates the sequence of parent paths: './.', './../.', ...
`readlink` canonicalizes the paths (which makes the result easier to read)
`xargs -tn1` applies chmod to each of them; thanks to -t, each command is printed to STDERR as it runs.
This version also attaches to new processes forked by the parent apache process. That way you can trace all current and *future* apache processes.
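A hedged sketch of that idea: build a `-p PID` argument for every running apache process and attach strace with `-f`, which follows forks so children spawned later are traced too. The process name `apache2` is an assumption (it is `httpd` on Red Hat-style systems):

```shell
# Collect "-p PID" arguments for every apache2 process.
pids=$(pgrep apache2 | sed 's/^/-p /')
if [ -n "$pids" ]; then
  # -f follows forks, so future children are traced as well
  strace -f $pids
else
  echo "no apache2 processes running"
fi
```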
Completely remove packages on Debian/Ubuntu that dpkg marks with "rc" (removed, but with config files left behind) and that traditional tools don't purge.
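A hedged sketch of the purge described above: list packages dpkg reports in "rc" state and purge them. The `echo` makes this a dry run; drop it (and run as root) to actually purge:

```shell
# "rc" in dpkg -l means removed but config files remain.
# Guarded so this is a no-op on non-Debian systems; echo = dry run.
if command -v dpkg >/dev/null; then
  dpkg -l | awk '/^rc/ {print $2}' | xargs -r echo dpkg --purge
else
  echo "dpkg not available on this system"
fi
```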
This functionality seems to be missing from commands like dpkg. Ideally, I want to duplicate the behavior of rpm --verify, but it seems difficult to do this in one relatively short command pipeline.
Find all the occurrences in the git repo of 'foo' and replace with 'bar'
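A hedged sketch of one common way to do this (not necessarily the OP's exact command), demonstrated in a throwaway repo so nothing real is modified:

```shell
# Scratch repo with one tracked file containing "foo".
tmp=$(mktemp -d) && cd "$tmp" && git init -q
printf 'foo baz\n' > demo.txt && git add demo.txt
# List tracked files containing "foo", then edit them in place.
git grep -l foo | xargs sed -i 's/foo/bar/g'
cat demo.txt   # -> bar baz
```

`git grep -l` only touches tracked files, which is what keeps this from rewriting things like build artifacts in the working tree.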
This checks JPEG data and metadata; the output should be grepped as needed, maybe with a `-B1 Warning` for the first part, and a `-E "WARNING|ERROR"` for the second part....
Executing pfiles returns a list of all descriptors used by the process.
We are interested in the S_IFREG entries, since they usually point to regular files.
Each such line contains the inode number of the file, which we use to find the filename.
The only drawback is that, to avoid searching all the way from /, you need a guess at where the file might live.
Improvements are more than welcome.
lsof was not available in my case.
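A hedged reconstruction of the procedure above. pfiles is Solaris-specific, so that step is guarded here; the inode-to-filename lookup works on any Unix (the PID and the inode are placeholders):

```shell
pid=1234   # placeholder PID of the process being inspected
if command -v pfiles >/dev/null; then
  # Each S_IFREG entry carries an ino: number identifying the file.
  pfiles "$pid" | grep S_IFREG
else
  echo "pfiles is Solaris-only; skipping"
fi
# Resolving an inode number back to a filename with find -inum,
# demonstrated on a freshly created temp file:
f=$(mktemp)
find "$(dirname "$f")" -xdev -inum "$(stat -c %i "$f")" 2>/dev/null
```

`-xdev` keeps find on one filesystem, which matters because inode numbers are only unique per filesystem.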
This is a modified version of the OP, wrapped into a bash function.
This version handles newlines and other whitespace correctly; the original has problems with the (thankfully rare) case of newlines in file names.
It also allows checking an arbitrary number of directories against each other, which is nice when the directories that you think might have duplicates don't have a convenient common ancestor directory.
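The OP's function is not reproduced in this excerpt, but a minimal sketch with the same properties (whitespace-safe, any number of directories) could look like:

```shell
# Hash every file under the given directories and print groups of files
# whose contents are identical. find -print0 / xargs -0 pass file names
# safely even when they contain newlines or other whitespace.
dupes() {
  find "$@" -type f -print0 \
    | xargs -0 -r md5sum \
    | sort \
    | uniq -w32 --all-repeated=separate
}
```

Usage: `dupes dir1 dir2 dir3` checks all three trees against each other; `uniq -w32` groups lines by the 32-character hash prefix.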
You can simply run "largest" to list the top 10 files/directories in ./, or pass two parameters: the first being the directory, the second being the number of entries to display.
Best off putting this in your .bashrc or .bash_profile file.
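The function itself is not shown in this excerpt; a minimal du-based sketch matching the description (name and defaults taken from the text) might be:

```shell
# List the biggest files/directories under a directory, largest first.
# $1 = directory (default .), $2 = how many entries to show (default 10).
largest() {
  du -a "${1:-.}" 2>/dev/null | sort -rn | head -n "${2:-10}"
}
```

Usage: `largest` for the top 10 under ./, or e.g. `largest /var/log 20`.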
Did some research and found the previous command wrong: we don't kill a zombie but its parent. Just made some modifications to khashmeshab's command.
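The modified command is not shown here; a hedged sketch of the idea is to signal the *parents* of zombie processes so they reap their children (zombies themselves cannot be killed). The `echo` makes this a dry run; remove it to actually send the signal:

```shell
# Find the PPID of every zombie (process state Z) and HUP those parents
# so they wait() on their dead children. -r keeps xargs quiet when the
# list is empty; echo = dry run.
ps -A -o stat,ppid | awk '$1 ~ /^[Zz]/ {print $2}' | sort -u | xargs -r echo kill -HUP
```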
do 1000 at a time so that if your doodoo is deep you can avoid the "Argument list too long" error
find . = sets up your recursive search. You can narrow the search to certain files by adding -name "*.ext", or exclude matches by adding -prune to the same test, like -name "*.ext" -prune
xargs = builds a command line out of each file that find finds and invokes the next command, which is perl.
perl = invoke perl
-p wraps the script in a while loop that prints each line after processing
-i edits in place, and the .bak suffix creates a backup file like filename.ext.bak
-e execute the following....
's/ / /;' your basic search and replace.