Check These Out
It's also possible to delay the extraction (echo "unrar e ... fi" |at now+20 minutes), which is really convenient!
This will lower the quality of mp3 files, but is necessary to play them on some mobile devices.
Changing newline to spaces using just echo
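A minimal sketch of the trick, with the file name as a placeholder: leaving the command substitution unquoted lets word splitting collapse the newlines into single spaces.
$ echo $(cat file.txt)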
This is an alternative to cron which allows a one-off task to be scheduled for a certain time.
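For example, to queue a one-off job (the command and the time here are just placeholders):
$ echo "tar czf /tmp/home.tar.gz /home/user" | at 02:00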
There was another line that was dependent on having unnamed screen sessions. This just wouldn't do. This one works no matter what the name is. A possible improvement would be removing the perl dependence, but that doesn't affect me.
The (in)famous "FizzBuzz" programming challenge, answered in a single line of Bash code. The "|column" part at the end merely formats the output a bit, so if "column" is not installed on your machine you can simply omit that part. Without "|column", the solution only uses 75 characters.
The version below is expanded to multiple lines, with comments added.
for i in {1..100}    # Use i to loop from "1" to "100", inclusive.
do  ((i % 3)) &&     # If i is not divisible by 3...
      x= ||          # ...blank out x (yes, "x= " does that). Otherwise,...
      x=Fizz         # ...set x to the string "Fizz".
    ((i % 5)) ||     # If i is not divisible by 5, skip (there's no "&&")...
      x+=Buzz        # ...Otherwise, append (not set) the string "Buzz" to x.
    echo ${x:-$i}    # Print x unless it is blanked out. Otherwise, print i.
done | column        # Wrap output into columns (not part of the test).
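Collapsed back onto a single line (reconstructed from the expanded version above; the original may differ slightly in spacing):
for i in {1..100}; do ((i % 3)) && x= || x=Fizz; ((i % 5)) || x+=Buzz; echo ${x:-$i}; done | column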
Swap out "80" for your port of interest. You can use a port number or a named port, e.g. "http".
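Purely as an illustration (lsof is my assumption; the original one-liner may use a different tool), a port filter typically looks like:
$ lsof -i :80
$ lsof -i :http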
You might want to secure your AWS operations by requiring an MFA token. But then, to use the API or other tools, you need to pass credentials generated with that MFA token.
This command asks you for the MFA code and retrieves those credentials using the AWS CLI. To print the exports, you can use:
`awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'`
You must adapt the command line to include (see the sketch below):
* $MFA_ID is the ARN of the virtual MFA device or the serial number of the physical one
* the TTL for the credentials
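A minimal sketch of how the pieces could fit together (MFA_CODE and the 3600-second TTL are placeholders, and the --query ordering is my assumption, not the original one-liner):
aws sts get-session-token \
    --serial-number "$MFA_ID" \
    --token-code "$MFA_CODE" \
    --duration-seconds 3600 \
    --output text \
    --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' |
  awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'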
You can just use one awk script to parse the RSS feed. There's no need to pipe together so many awks and seds; it's ugly and inefficient.
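A hypothetical illustration of the idea (the feed URL, and the assumption that each <title> element sits on its own line, are mine):
$ curl -s https://example.com/feed.rss | awk -F'[<>]' '/<title>/ { print $3 }'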
We all know...
$ nice -n19
for low CPU priority.
$ ionice -c3
for low I/O priority.
nocache can be useful in related scenarios, when we operate on very large files just a single time, e.g. a backup job. It advises the kernel that no caching is required for the files involved, so the existing file cache is not evicted, which would otherwise hurt performance of other, more typical file I/O, e.g. on a desktop. A combined example follows the links below.
http://askubuntu.com/questions/122857
https://github.com/Feh/nocache
http://packages.debian.org/search?keywords=nocache
http://packages.ubuntu.com/search?keywords=nocache
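All three can be combined for a one-off bulk job; a sketch, with the source and destination paths as placeholders:
$ nice -n19 ionice -c3 nocache cp -r /home/user /mnt/backup/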
To undo caching of a single file after the fact, you can do
$ cachedel <file>
To check the cache status of a file, do
$ cachestats <file>