Commands using sudo (537)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.



Check These Out

sort monthwise
The sort command can sort month-wise (by the first three letters of each month name). See the sample output for clarification. Is the sort stable? No, so take note if that matters to you: the sample output suggests that sort performs an unstable sort (see the relative order of the two 'feb' entries).
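Not necessarily the exact command behind this entry, but a minimal sketch assuming GNU sort (-M sorts by month-name abbreviation; add -s if you need a stable sort):
  printf 'mar\nfeb x\njan\nfeb a\n' | sort -M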

find out how many days since given date
Exactly the same number of characters, exactly the same results, but with bc
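The command itself isn't reproduced here, but the usual approach is to subtract two epoch timestamps and divide by 86400; a sketch of the bc variant, assuming GNU date and a placeholder date:
  echo "($(date +%s)-$(date -d '2009-03-01' +%s))/86400" | bc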

Display which distro is installed
Works on Ubuntu
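The command itself isn't shown here; on Ubuntu (and other LSB-compliant distros) a likely candidate is:
  lsb_release -a
or, on newer systems, reading /etc/os-release.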

Exclude inserting a table from a sql import
Starting with a large MySQL dump file (*.sql), remove any lines that contain inserts for the specified table. Sometimes one or two tables are very large and unneeded, e.g. log tables. To exclude multiple tables you can get fancy with sed, or just run the command again on each subsequently generated file.
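A sketch of the idea using grep -v (not necessarily the exact command; the table and file names are placeholders):
  grep -v 'INSERT INTO `log_table`' dump.sql > dump_without_log_table.sql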

Copy a folder tree through ssh using compression (no temporary files)
This command will copy a folder tree (keeping the parent folders) through ssh. It will:
- compress the data
- stream the compressed data through ssh
- decompress the data into the local folder
It takes no additional space on the host machine (there is no need to create a compressed tar file, transfer it and then delete it on the host). There are situations (like mirroring a remote machine) where you simply can't wait for a hugely time-consuming scp, or can't compress the data into a tarball on the host because of file-system space limitations, so this command can do the job quite well. It performs best when a lot of data is involved; if you are copying a small amount of data, use scp instead (it's easier to type).
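A sketch of the pattern being described (host and path are placeholders):
  ssh user@remotehost 'tar czf - /path/to/folder' | tar xzf -
GNU tar strips the leading slash, so path/to/folder is recreated under the current local directory.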

Find Duplicate Files (based on size first, then MD5 hash)
If you have the fdupes command, you'll save a lot of typing. It can do recursive searches (-r,-R) and it allows you to interactively select which of the duplicate files found you wish to keep or delete.
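For example, a recursive, interactive cleanup might look like this (the path is a placeholder):
  fdupes -r -d /path/to/dir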

prints the parameter you used on the previous command
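The exact entry isn't shown here, but in an interactive bash session the history expansion !$ (the last argument of the previous command) is the classic trick:
  mkdir -p /tmp/some/deep/dir
  cd !$      # !$ expands to /tmp/some/deep/dir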

Calculates the number of physical cores considering HyperThreading in AWK
Checks whether hyperthreading is enabled or not. A better solution than nproc, as it should work on any OS that has awk.
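Not necessarily the exact command behind this entry, but a sketch of the usual approach on Linux, counting unique (physical id, core id) pairs in /proc/cpuinfo (assumes GNU awk for length() on an array):
  awk -F: '/^physical id/ {p=$2} /^core id/ {seen[p":"$2]=1} END {print length(seen)}' /proc/cpuinfo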

chroot, bind mount without root privilege/setup
PRoot is a user-space implementation of chroot, mount --bind, and binfmt_misc. This means that users don't need any privileges or setup to do things like using an arbitrary directory as the new root filesystem, making files accessible somewhere else in the filesystem hierarchy, or executing programs built for another CPU architecture transparently through QEMU user-mode. Also, developers can use PRoot as a generic Linux process instrumentation engine thanks to its extension mechanism; see CARE for an example. Technically, PRoot relies on ptrace, an unprivileged system call available in every Linux kernel. https://github.com/cedric-vincent/PRoot
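A minimal usage sketch (the rootfs path is a placeholder):
  proot -r /path/to/alternate/rootfs -b /proc -b /dev /bin/sh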

Execute a command with a timeout
I like the Perl solution much more, but this works without using Perl. It launches a background process that will kill the command if it runs for too long. A bigger function:

check_with_timeout() {
    [ "$DEBUG" ] && set -x
    COMMAND=$1
    TIMEOUT=$2
    RET=0
    # Launch command in background
    [ ! "$DEBUG" ] && exec 6>&2          # Link file descriptor #6 with stderr.
    [ ! "$DEBUG" ] && exec 2> /dev/null  # Send stderr to null (avoid the Terminated messages)
    $COMMAND 2>&1 >/dev/null &
    COMMAND_PID=$!
    [ "$DEBUG" ] && echo "Background command pid $COMMAND_PID, parent pid $$"
    # Timer that will kill the command if it times out
    sleep $TIMEOUT && ps -p $COMMAND_PID -o pid,ppid | grep $$ | awk '{print $1}' | xargs kill &
    KILLER_PID=$!
    [ "$DEBUG" ] && echo "Killer command pid $KILLER_PID, parent pid $$"
    wait $COMMAND_PID
    RET=$?
    # Kill the killer timer
    [ "$DEBUG" ] && ps -e -o pid,ppid | grep $KILLER_PID | awk '{print $1}' | xargs echo "Killing processes: "
    ps -e -o pid,ppid | grep -v PID | grep $KILLER_PID | awk '{print $1}' | xargs kill
    wait
    sleep 1
    [ ! "$DEBUG" ] && exec 2>&6 6>&-     # Restore stderr and close file descriptor #6.
    return $RET
}
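A usage sketch, assuming the function above has been sourced into a bash session:
  check_with_timeout "sleep 60" 5    # the 5-second timer kills the sleep and the function returns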


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as for virtually every other subset (users, tags, functions, …).
