Check These Out
Since the original command (#1873) didn't work on FreeBSD, whose stat lacks the "-c" switch, I wrote an alternative that does. This command also shows the fourth digit of the octal permissions, which yields the sticky-bit information.
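A minimal sketch of the BSD-stat approach, assuming stat's "%OMp" (setuid/setgid/sticky digit, forced to octal) and "%OLp" (the rwx digits, forced to octal) format fields:
$ stat -f '%OMp%OLp %N' *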
Computes factorials.
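For instance, a minimal sketch using seq and bc: build the product 1*2*...*n and let bc evaluate it.
$ seq -s '*' 1 10 | bc
3628800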
Ever need to know why Apache is bogging down *right now*? Hate scanning Apache's Extended server-status for the longest-running requests? Me, too. That's why I use this one-liner to quickly find suspect web scripts that might need review.
Assuming the Extended server-status is reachable at the desired target URL, this one-liner parses the output through elinks (rendering the HTML) and shows a list of active requests, sorted so the longest-running request is at the bottom of the list. I include the following fields (as noted in the header line):
Seconds: How long the request has been alive
PID: Process ID of the request handler
State: State of the request, limited to the ones I think are relevant (GCRK_.)
IP: Remote host IP making the request
Domain: Virtual host target (the HTTP/1.1 Host: header); important for virtual-hosting servers
TYPE: HTTP verb
URL: Requested URL being served
Putting this in a script that runs when triggered by a high load average can be quite revealing. It can also catch "forgotten" scripts being exploited, such as "formmail.pl", etc.
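A minimal sketch of the idea, assuming elinks is installed and a default-layout Extended server-status page at http://localhost/server-status; the awk column positions are assumptions that vary by Apache version, so adjust them to your table layout:
$ echo "Seconds PID State IP Domain TYPE URL"; \
  elinks -dump http://localhost/server-status \
  | awk '$4 ~ /^[GCRK_.]$/ { print $6, $2, $4, $11, $12, $13, $14 }' \
  | sort -n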
If the PDF/DVI/etc. documentation for a LaTeX package is already part of your local texmf tree, then texdoc will find and display it for you. If the documentation is not available on your system, it will bring up the package's web page at CTAN to help you investigate.
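For example (geometry is just an illustrative package name):
$ texdoc geometry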
Get the disk usage of files (in this case log files in /var/log) modified during the last n days:
$ sudo find /var/log/ -mtime -n -type f | xargs du -ch
n -> last modified n*24 hours ago
Numeric arguments can be specified as:
+n for greater than n,
-n for less than n,
n for exactly n.
=> so "modified within the last 7*24 hours (about 7 days)" is -7; piping through tail -n1 keeps only du's grand-total line:
$ sudo find /var/log/ -mtime -7 -type f | xargs du -ch | tail -n1
You might want to secure your AWS operations by requiring an MFA token. But then, to use the API or other tools, you need to pass credentials generated with that MFA token.
This command asks you for the MFA code and retrieves temporary credentials using the AWS CLI. To print the exports, you can use:
`awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'`
You must adapt the command line to include:
* $MFA_ID, the ARN of the virtual MFA device or the serial number of the physical one
* the TTL for the credentials
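A minimal sketch of the whole pipeline, assuming $MFA_ID is already set and a one-hour TTL:
$ read -p "MFA code: " MFA_CODE; \
  aws sts get-session-token --serial-number "$MFA_ID" --token-code "$MFA_CODE" \
    --duration-seconds 3600 --output text \
    --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' \
  | awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'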
Put the positive clauses after the '-o' option.
From the cwd, recursively find all rar files, extracting each into the directory where it was found rather than into the cwd.
A nice time saver if you've used wget or similar to mirror something where each subdirectory contains a rar archive.
It's likely this can be tuned to work with multi-part archives where all parts use the ambiguous .rar extension, but I didn't test this. Perhaps unrar would handle it gracefully anyway?
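One way to express this (a sketch, assuming find's -execdir switch, which runs the command from each matched file's own directory; unrar's -o- switch skips files that already exist):
$ find . -name '*.rar' -execdir unrar x -o- '{}' \;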
I have a remote PHP file that I want to run once an hour, so I set up cron to run this wget. I don't really care about what's in the file, though, and I don't want to save the results, so I use -O to send the output to /dev/null.
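A sketch of the crontab entry, with a hypothetical URL; -q also keeps cron from mailing wget's progress output:
0 * * * * wget -q -O /dev/null http://example.com/script.php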