commandlinefu.com is the place to record those command-line gems that you return to again and again.
The number on the far right is the ratio of comments to code, expressed as a percentage. For the rest of the Yardstick documentation, see https://github.com/calmh/yardstick/blob/master/README.md#reported-metrics
xargs is a more elegant approach to executing a command on find results than -exec, since -exec is meant as a filtering flag.
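A minimal illustration of the two styles; the temp directory and `.log` file names are just placeholders:

```shell
d=$(mktemp -d); touch "$d/a.log" "$d/b.log"   # sample files (placeholders)

# xargs batches results into command invocations (forced to one argument
# per invocation here with -n1, mirroring the -exec form below):
find "$d" -name '*.log' | xargs -n1 echo

# The -exec ... \; form spawns one process per result:
find "$d" -name '*.log' -exec echo {} \;
```

Both pipelines print the same file list; with large result sets, plain `xargs` (without `-n1`) needs far fewer process spawns than `-exec ... \;`.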
Returns the version of the kernel module specified as "MODULENAME", when available.
`pwd` prints the current path
`grep -o` prints each slash on a new line
perl generates the path sequence: './.', './../.', ...
`readlink` canonicalizes the paths (making things more transparent)
`xargs -tn1` applies chmod to each of them; each command is printed to STDERR as it runs.
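A sketch of that pipeline; the mode `o+x`, the exact perl one-liner, and the `echo` dry-run are assumptions, not necessarily the original's:

```shell
# One '/' per path component of $PWD; perl turns line N into a path
# climbing N-1 levels up ('./.', './../.', ...); readlink -f resolves
# each to an absolute path; xargs -tn1 runs chmod once per path,
# echoing each command. 'echo' keeps this a dry run.
pwd | grep -o / \
  | perl -ne 'print "./", "../" x ($. - 1), ".\n"' \
  | xargs -n1 readlink -f \
  | xargs -tn1 echo chmod o+x
```

Drop the `echo` (and pick the mode you actually want) to apply the chmod to the current directory and each of its ancestors.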
This version also attaches to new processes forked by the parent apache process. That way you can trace all current and *future* apache processes.
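A hedged sketch of the attach step; the process name `apache2` is an assumption (on some distros it is `httpd`), and the final command is shown as a dry run:

```shell
# pgrep -o picks the oldest matching process, i.e. the parent.
# strace -f follows children forked *after* attaching, which is what
# lets it trace future worker processes too.
pid=$(pgrep -o apache2 || true)
echo "would run: sudo strace -f -p ${pid:-<apache-parent-pid>}"   # dry run
```

Remove the `echo` dry-run wrapper to actually attach (root usually required).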
Completely remove packages on Debian/Ubuntu that were removed but left files behind (marked "rc" by dpkg) and are not purged by the traditional tools.
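The usual shape of this one-liner (a sketch; the `echo` makes it a dry run) selects packages whose dpkg status begins with `rc` — removed, config files remaining — and purges them:

```shell
# 'rc' in dpkg -l's status column means: removed, config files remain.
# xargs -r (GNU) skips the command entirely when there is nothing to purge.
# Drop the 'echo' to really purge.
dpkg -l | awk '/^rc/ { print $2 }' | xargs -r echo sudo dpkg --purge
```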
This functionality seems to be missing from commands like dpkg. Ideally, I want to duplicate the behavior of rpm --verify, but it seems difficult to do this in one relatively short command pipeline.
Find all occurrences of 'foo' in the git repo and replace them with 'bar'.
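A common way to do this (a sketch, shown inside a throwaway repo; GNU `sed -i` and `xargs -r` assumed):

```shell
# Throwaway repo so the demo is self-contained:
d=$(mktemp -d); cd "$d"
git init -q .
printf 'foo baz\n' > file.txt
git add file.txt

# The actual one-liner: list tracked files containing 'foo',
# then edit each in place.
git grep -l 'foo' | xargs -r sed -i 's/foo/bar/g'

cat file.txt   # now contains 'bar baz'
```

`git grep -l` restricts the search to tracked files, so build artifacts and `.git` internals are never touched.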
This checks JPEG data and metadata; the output should be grepped as needed, maybe with `-B1 Warning` for the first part and `-E "WARNING|ERROR"` for the second part....
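The JPEG checker itself isn't named in the snippet, so sample diagnostic text stands in for its output below, just to show the suggested grep filter:

```shell
# Stand-in for the checker's output (hypothetical file names/messages):
out='photo1.jpg: OK
photo2.jpg: WARNING extraneous bytes before marker
photo3.jpg: ERROR corrupt data segment'

# Keep only the problem lines, as suggested above:
printf '%s\n' "$out" | grep -E "WARNING|ERROR"   # prints only lines 2 and 3
```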
Executing pfiles returns a list of all descriptors used by the process.
We are interested in the S_IFREG entries, since these usually point to files.
Each such line contains the inode number of the file, which we use to find the filename.
The only bad thing is that, to avoid searching from /, you have to guess where the file might be.
Improvements more than welcome.
lsof was not available in my case
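`pfiles` is Solaris-specific, but the inode-to-filename step can be sketched portably; the temporary file below stands in for the file behind a descriptor:

```shell
# On Solaris the inode numbers would come from:
#   pfiles "$PID" | grep S_IFREG
# Here a temp file demonstrates mapping an inode back to a filename:
d=$(mktemp -d)
f="$d/target"; : > "$f"                # stand-in for the open file
ino=$(ls -i "$f" | awk '{print $1}')   # its inode number
find "$d" -inum "$ino"                 # prints the path of $f
```

In practice the first argument to `find` is the directory you suspect contains the file, since inode numbers are only unique per filesystem and searching from / is slow.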
This is a modified version of the OP, wrapped into a bash function.
This version handles newlines and other whitespace correctly; the original has problems with the thankfully rare case of newlines in the file names.
It also allows checking an arbitrary number of directories against each other, which is nice when the directories that you think might have duplicates don't have a convenient common ancestor directory.
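Since the function body itself isn't shown here, this is a hedged reconstruction of the idea: hash every file under the given directories, then keep only hashes that occur more than once (the name `dupes` and GNU `md5sum`/`uniq` are assumptions):

```shell
dupes() {
  # Hash all regular files under the given directories. -print0/-0 keep
  # whitespace (and newlines) in names intact; sorting groups identical
  # hashes; uniq -w32 compares only the 32-char md5 and prints each
  # group of duplicates separated by a blank line.
  find "$@" -type f -print0 \
    | xargs -0 md5sum \
    | sort \
    | uniq -w32 --all-repeated=separate
}
```

Usage: `dupes dir1 dir2 dir3 ...` — any number of directories, with no common ancestor required.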
You can simply run "largest" to list the top 10 files/directories in ./, or pass two parameters: the first being the directory, the second being the number of entries to display.
Best to put this in your .bashrc or .bash_profile file.
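A sketch matching that description (the function body isn't shown above, so this is a reconstruction; GNU `sort -h` assumed):

```shell
largest() {
  # $1: directory to inspect (default .); $2: how many entries (default 10).
  # du -sh sizes each entry, sort -rh orders human-readable sizes
  # largest-first, head trims the list.
  du -sh "${1:-.}"/* 2>/dev/null | sort -rh | head -n "${2:-10}"
}
```

Usage: `largest` for the top 10 under ./, or e.g. `largest /var/log 5`.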
Did some research and found the previous command wrong: we don't kill a zombie but its parent. Just made some modifications to khashmeshab's command.
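A hedged sketch of that idea: find processes in state Z and signal their parents. It is shown as a dry run, since whether HUP (or any signal) is appropriate depends on the parent process:

```shell
# List state and parent PID for every process; zombies have state 'Z'.
# Printing the kill commands instead of running them keeps this safe.
ps -A -o stat=,ppid= | awk '$1 ~ /^[Zz]/ { print "kill -HUP", $2 }'
```

Pipe the output to `sh` only once you have checked which parents it targets.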