Commands by random_bob (2)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

escape any command aliases
e.g. if rm is aliased to 'rm -i', you can escape the alias by prepending a backslash:
$ rm [file]    # WILL prompt for confirmation, per the alias
$ \rm [file]   # will NOT prompt, per the command's default behavior
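To check what a name currently expands to, or to bypass an alias without the backslash, the standard bash builtins type and command do the same job:
$ type rm            # reports whether rm is an alias, function, builtin or file
$ command rm [file]  # also runs the real rm, ignoring any alias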

Pause and Resume Processes
Add that and the following cont() function to your bash profile, and then use the pair to pause and resume processes safely:
cont () { kill -SIGCONT $(ps -ec | grep "$1" | awk '{print $1}'); }
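The command this entry refers to ("that") isn't shown above; presumably it is a matching stop() function sending SIGSTOP. A minimal sketch of the pair, using pgrep to match process names more robustly than ps | grep:
stop () { kill -SIGSTOP $(pgrep "$1"); }  # pause every process matching the name
cont () { kill -SIGCONT $(pgrep "$1"); }  # resume them
$ stop firefox
$ cont firefox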

Get AWS temporary credentials ready to export based on a MFA virtual appliance
You might want to secure your AWS operations by requiring an MFA token. But to use the API or other tools, you then need to pass credentials generated with that MFA token. This command asks you for the MFA code and retrieves the temporary credentials using the AWS CLI. To print the exports, you can use:
`awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'`
You must adapt the command line to include:
* $MFA_ID - the ARN of the virtual MFA device, or the serial number of a physical one
* the TTL for the credentials
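The command itself isn't reproduced above; a plausible reconstruction with the AWS CLI, assuming a one-hour TTL ($MFA_ID and the duration are the values you must adapt):
$ read -p "MFA code: " MFA_CODE
$ aws sts get-session-token \
    --serial-number "$MFA_ID" \
    --token-code "$MFA_CODE" \
    --duration-seconds 3600 \
    --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' \
    --output text |
  awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'
With --output text the three credential fields come back tab-separated on one line, which is exactly what the awk above expects.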

Test file system performance
You need the bonnie++ package for this. It gives more detail than a simple hdparm -t /dev/sda would. The -d option is the directory where it performs its writes and reads; for example, I use /tmp/scratch with 777 permissions. Bonnie++ benchmarks three things: data read and write speed, the number of seeks that can be performed per second, and the number of file metadata operations that can be performed per second.
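The command itself isn't shown above; a typical invocation, assuming the /tmp/scratch directory from the description (-u sets the user to run the benchmark as, which bonnie++ insists on when started as root):
$ bonnie++ -d /tmp/scratch -u nobody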

restart apache only if config works
Making lots of configuration changes to Apache and restarting the server only to find it broken just plain sucks.
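The command itself isn't shown above; the usual pattern chains the syntax check and the restart with &&, so the restart only happens when the check succeeds (apachectl subcommands are standard, though your distribution may use apache2ctl):
$ apachectl configtest && apachectl graceful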

Keep track of diff progress
You're running a program that reads LOTS of files and takes a long time, but it doesn't tell you about its progress. First, run a command in the background, e.g.:
$ find /usr/share/doc -type f -exec cat {} + > output_file.txt &
Then run the watch command; watch -d highlights the changes as they happen. In bash, $! is the process id (pid) of the last command run in the background. You can change this to $(pidof my_command) to watch something in particular.
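The watch command itself isn't shown above; a plausible reconstruction lists the open file descriptors of the background job via /proc, so you can see which file it is currently reading:
$ watch -d "ls -l /proc/$!/fd"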

Short one line while loop that outputs parameterized content from one file to another
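No description or command survives for this entry; a generic sketch of the pattern the title suggests, with a purely illustrative echo template:
$ while read -r line; do echo "param: $line"; done < input.txt > output.txt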

list files recursively by size
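The command itself isn't shown; one common way to do it with GNU find (the path and head count are illustrative):
$ find /path -type f -printf '%s %p\n' | sort -rn | head -20   # size in bytes, largest first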

Record a webcam output into a video file.
The -an option disables audio recording, -f forces the use of video4linux for the input, -s sets the video size to 320x240, -b sets the recording bitrate, -r sets the frame rate to 15 fps, -i gives the input device, and -vcodec sets the output format. Press Q to stop recording, or specify the recording time with the -t option, e.g. -t 00:01:30.
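The command itself isn't shown above; a reconstruction following the options in the order the description lists them (the device path, bitrate value, codec and output filename are assumptions, and newer ffmpeg builds spell the input format v4l2 rather than video4linux):
$ ffmpeg -an -f video4linux -s 320x240 -b 800k -r 15 -i /dev/video0 -vcodec mpeg4 webcam.avi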

Migrate existing Ext3 filesystems to Ext4
Before doing this, back up all data on any ext3 partitions that are to be converted to ext4. After running the previous command you MUST run fsck; it is needed to return the filesystem to a consistent state:
$ fsck -pDf /dev/yourpartition
Then edit /etc/fstab and change the 'type' from ext3 to ext4 for any partitions that were converted.
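The conversion command itself ("the previous command") isn't shown above; the standard tune2fs invocation for an ext3-to-ext4 migration enables the ext4 on-disk features on the unmounted partition:
$ tune2fs -O extents,uninit_bg,dir_index /dev/yourpartition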

