Check These Out
Installs pip packages through an HTTP proxy.
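A minimal sketch of the idea; the proxy host, credentials, and package name below are placeholders, not values from the original entry:

```shell
# Install a package through an authenticated HTTP proxy
# (proxy.example.com:8080, user/password and "requests" are placeholders)
python3 -m pip install --proxy http://user:password@proxy.example.com:8080 requests
```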
The value of the sort command's -k argument is the column of the CSV file to sort on; in this example, it sorts on the second column. You must use some form of sort for uniq to work properly, since uniq only removes adjacent duplicate lines.
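A sketch with a throwaway CSV (the file name and data are made up; `-t,` tells sort the fields are comma-separated):

```shell
# Build a small CSV, sort it on column 2, then drop adjacent duplicates
printf 'a,2\nb,1\na,2\nc,3\n' > data.csv
sort -t, -k2 data.csv | uniq
# → b,1  a,2  c,3 (one line each; the duplicate a,2 is gone)
```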
You might want to secure your AWS operations by requiring an MFA token. But then, to use the API or CLI tools, you need to pass credentials generated with that MFA token.
This command asks you for the MFA code and retrieves temporary credentials using the AWS CLI. To print the exports, you can use:
`awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'`
You must adapt the command line to include:
* $MFA_ID, the ARN of the virtual MFA device (or the serial number of a physical one)
* the TTL for the credentials
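A possible end-to-end sketch combining `aws sts get-session-token` with the awk above; the token code is read interactively, and the 3600-second TTL is just one example value:

```shell
# Prompt for the current MFA code, then fetch temporary credentials
# ($MFA_ID must already hold the device ARN or serial number)
read -p "MFA code: " MFA_CODE
aws sts get-session-token \
    --serial-number "$MFA_ID" \
    --token-code "$MFA_CODE" \
    --duration-seconds 3600 \
    --output text \
    --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' |
awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'
```

Eval the output (or paste it into your shell) to make the temporary credentials active.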
An easy way to set up an "internet radio station". Prerequisite: an account on an Icecast server; in this example, an account was created beforehand at giss.tv.
Replace the word password with the real password you created on the server.
Make sure you have rec, oggenc, oggfwd and tee installed.
I have a mixer connected to line-in, so I can mix music and a microphone.
This will also produce a local recording of the session, called "streamdump.ogg".
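One plausible pipeline under these assumptions (the port 8000 and mount point /mystream.ogg are placeholders for your account's actual details, and may differ from the original entry):

```shell
# Record stereo 44.1 kHz audio from the sound card, encode it to Ogg,
# keep a local dump via tee, and forward the stream to the Icecast server
rec -c 2 -r 44100 -t wav - | oggenc -Q - | tee streamdump.ogg | oggfwd giss.tv 8000 password /mystream.ogg
```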
$ python3 - <<'EOF'
import struct

DEV = '/dev/input/event4'   # if event4 doesn't work, try event0, event1, etc.
FMT = 'llHHi'               # struct input_event: time (2 longs), type, code, value
SIZE = struct.calcsize(FMT)

def interpret(keycode, state):
    if state == 0:
        print('%i up' % keycode)
    elif state == 1:
        print('%i down' % keycode)
    elif state == 2:
        print('%i repeat' % keycode)

with open(DEV, 'rb') as fo:        # reading /dev/input usually requires root
    while True:
        event = fo.read(SIZE)
        etype, code, value = struct.unpack(FMT, event)[2:]
        if etype == 1:             # EV_KEY: key press, release or repeat
            interpret(code, value)
EOF
Lists all open sockets (not only listening ones) with no DNS resolution (so it's fast), along with the process ID and the user holding each socket.
The previous samples were limited to TCP; this one also lists UDP listeners.
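One way to get such a listing is with ss from iproute2 (a sketch, not necessarily the exact command behind this entry):

```shell
# -t TCP, -u UDP, -n numeric (no DNS), -a all sockets (not just listeners),
# -p owning process; add -e for extended info including the owning user's uid
ss -tunap
```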
Splits the file "my_file" every 500 lines, creating files called xx00, xx01, and so on. You can change the prefix with the -f option. This comes in handy for splitting logfiles: I use it to feed a logfile parser smaller files instead of one big file, for performance reasons.
Finds the processes with the highest memory usage and reports them in human-readable format.
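One way to do this with ps (a sketch; the exact command behind this entry may differ):

```shell
# Show processes sorted by memory usage, highest first,
# keeping the header plus the top 10
ps aux --sort=-%mem | head -n 11
```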