All commands (14,187)

  • If you're going to use od, here's how to suppress the address labels at the beginning of each line. It doesn't output the \x prefixes either, hence the sed command at the end; remove that if you want space-separated hex values instead. A worked example follows this entry.


    5
    echo -n "text" | od -A n -t x1 |sed 's/ /\\x/g'
    camocrazed · 2010-07-14 15:31:36 8
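
    A quick sketch of what the pipeline produces (my own run-through, not the author's posted sample output): od prints space-separated hex bytes with no address, and the sed pass turns each separating space into \x.

    echo -n "text" | od -A n -t x1                       # ->  74 65 78 74
    echo -n "text" | od -A n -t x1 | sed 's/ /\\x/g'     # -> \x74\x65\x78\x74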

  • 1
    find . -name '*.?pp' -exec grep -H "string" {} \;
    rthemocap · 2010-07-14 15:10:23 5
  • Ssh to host1, host2, and host3, executing the command on each host and saving the output in {host}.log. I don't have the 'parallel' command installed, but it sounds like an interesting and less cryptic alternative; see the sketch after this entry.


    1
    for host in host1 host2 host3; do ssh -n user@$host <command> > $host.log & done; wait
    cout · 2010-07-14 14:55:31 3
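
    For reference, a rough GNU parallel equivalent of the same loop. This is an untested sketch that assumes GNU parallel is installed; user, the host names and <command> are the same placeholders as above.

    # one job per host, each writing its own log file
    parallel 'ssh -n user@{} "<command>" > {}.log' ::: host1 host2 host3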
  • Just use "od" and it can also dump in decimal or octal. (use -t x1 and not just -x or it confuses the byte order) There is a load of other formatting options, I'm not sure if you can turn off the address at the start of the line. Show Sample Output


    0
    echo "text" | od -t x1
    max_allan · 2010-07-14 14:53:25 3
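
    On the address question: od's -A n option suppresses it (the \x recipe at the top of this page uses exactly that), and -t d1 / -t o1 select decimal or octal. A small sketch:

    echo -n "text" | od -A n -t x1    # hex bytes, no address column
    echo -n "text" | od -A n -t d1    # the same bytes in decimal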
  • I like this better than some of the alternatives using -exec, because if I want to change the string, it's right there at the end of the command line. That means less editing effort and more time to drink coffee. A whitespace-safe variant is sketched after this entry.


    2
    find . -name '*.?pp' | xargs grep -H "string"
    cout · 2010-07-14 14:41:07 5
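
    If any of the paths may contain spaces or other whitespace, a null-delimited version of the same idea is safer (a sketch, assuming GNU find and xargs):

    find . -name '*.?pp' -print0 | xargs -0 grep -H "string"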
  • First off, if you just want a random UUID, here's the actual command to use: uuidgen. Your chances of finding a duplicate after running it nonstop for a year are about the same as being hit by a meteorite before finishing this sentence. The point of the command below is that it's more provably unique than the one uuidgen creates: uuidgen produces a random one by default, or an unencrypted one based on time and network address if you give it the -t option. Mine uses the MAC address of the ethernet interface, the process ID of the caller, and the system time down to nanosecond resolution, which is provably unique across all computers past, present, and future, subject to collisions in the cryptographic hash used and the uniqueness of your MAC address. Warning: feel free to experiment, but be warned that the stdin of the hash is binary data at that point, which may mess up your terminal if you don't pipe it into something. If it does get messed up, just type reset.


    0
    printf $(( echo "obase=16;$(echo $$$(date +%s%N))"|bc; ip link show|sed -n '/eth/ {N; p}'|grep -o -E '([[:xdigit:]]{1,2}:){5}[[:xdigit:]]{1,2}'|head -c 17 )|tr -d [:space:][:punct:] |sed 's/[[:xdigit:]]\{2\}/\\x&/g')|sha1sum|head -c 32; echo
    camocrazed · 2010-07-14 14:04:53 10

  • -1
    grep -i '^DocumentRoot' /etc/httpd/conf/httpd.conf | cut -f2 -d'"'
    hm2k · 2010-07-14 13:30:36 3
  • Here's a version that uses perl. If you'd like a trailing newline: perl -pe 's/(.)/sprintf("\\x%x", ord($1))/eg; END {print "\n"}'. A round-trip (encode, then decode) sketch follows this entry.


    1
    echo -n 'text' | perl -pe 's/(.)/sprintf("\\x%x", ord($1))/eg'
    putnamhill · 2010-07-14 12:20:42 3
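
    A hedged round-trip sketch using the same 'text' example: encode with the perl one-liner above, then decode back with bash's printf '%b', which expands \xHH escapes the same way echo -e does.

    enc=$(echo -n 'text' | perl -pe 's/(.)/sprintf("\\x%x", ord($1))/eg')
    printf '%b\n' "$enc"    # prints: text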

  • 9
    indent -kr hello.c
    cp · 2010-07-14 07:16:15 4

  • 1
    tcpdump -i eth0 port 80 -w -
    freestyler · 2010-07-14 04:17:18 7
  • If shell escaping of the command is problematic, you can write the command to a file first: batch <somefile. Or read it in: read -re && echo "$REPLY" | batch. Or, if your shell supports it, eliminate the echo: read -re && batch <<<$REPLY. ("man batch" lists 1.5 for me, but I don't know how widely it differs.) Another quoting-safe variant is sketched after this entry.


    9
    echo 'some command' | batch
    kniht · 2010-07-14 03:08:31 3
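
    One more hedged option along the same lines: single-quote the whole job once and hand it to batch on stdin with printf, so the calling shell expands nothing. The tar job and /tmp paths here are made-up placeholders.

    printf '%s\n' 'tar czf /tmp/home.tgz "$HOME" > /tmp/backup.log 2>&1' | batch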
  • As a user, deletes all your posts from a MyBB board (provided you have the search page listings of all your posts saved into the same directory this command is run from). Full command: for i in *; do cat $i | grep pid | sed -e 's/;/\ /g' -e 's/#/\ /g' -e 's/pid=/\ /g' | awk -F ' ' '{print $2}' >> posts.txt; done; for c in `cat posts.txt`; do curl --cookie name= --data-urlencode name=my_post_key=\&delete=1\&submit=Delete+Now\&action=deletepost\&pid=$c --user-agent Firefox\ 3.5 --url http://url/editpost.php?my_post_key=\&delete=1\&submit=Delete+Now\&action=deletepost\&pid=$c; sleep 2s; done; echo


    0
    curl --cookie name=<cookie_value> --data-urlencode name=my_post_key=<post_key>\&delete=1\&submit=Delete+Now\&action=deletepost\&pid=$c --user-agent Firefox\ 3.5 --url http://url/editpost.php?my_post_key=<post_key>\&delete=1\&submit=Delete+Now\&action=dele
    mrlockfs · 2010-07-14 01:50:48 5
  • Change the number to change the number of spaces; leaving it out defaults to 8. Leaving out the filename reads from stdin. To go the other way, use the unexpand command; both directions are sketched after this entry.


    -1
    expand -t 2 <filename>
    camocrazed · 2010-07-13 23:04:57 3
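
    A small sketch of both directions; the file names are hypothetical, and -a makes unexpand convert spaces everywhere rather than only in leading blanks.

    expand -t 2 code.c > code.spaces.c        # tabs -> runs of 2 spaces
    unexpand -a -t 2 code.spaces.c > code.c   # and back again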
  • Thanks to John_W for suggesting the fix allowing ~/ to be used when saving to a directory. Directions: type in a URL; it shows a preview of what the file will look like when saved, then asks whether you want to save the preview and where to save it. Great for grabbing the latest commandlinefu commands without a full web browser or even a GUI. Requires: w3m. A non-interactive sketch follows this entry.


    0
    read -p "enter url:" a ; w3m -dump $a > /dev/shm/e1q ; less /dev/shm/e1q ; read -p "save file as text (y/n)?" b ; if [ $b = "y" ] ; then read -p "enter path with filename:" c && touch $(eval echo "$c") ; mv /dev/shm/e1q $(eval echo "$c") ; fi ; echo DONE
    LinuxMan · 2010-07-13 22:36:38 5
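
    If you don't need the interactive prompts, the core of it is just w3m -dump. A minimal sketch; the URL and output path are only examples.

    w3m -dump http://www.commandlinefu.com/commands/browse > ~/clf.txt
    less ~/clf.txt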
  • Same as another one I saw, just with a cleaner sed command. Edit: updated the sed command to use the [[:xdigit:]] character class, which is more portable between locales. Note that it will have a newline inserted after every 32 characters of input, due to the line wrapping of xxd; a workaround is sketched after this entry.


    0
    echo -n 'text' | xxd -ps | sed 's/[[:xdigit:]]\{2\}/\\x&/g'
    camocrazed · 2010-07-13 21:46:30 3
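
    If the wrapping bothers you, stripping the newlines before the sed pass gives one continuous string. A sketch of the same pipeline with that change:

    echo -n 'text' | xxd -ps | tr -d '\n' | sed 's/[[:xdigit:]]\{2\}/\\x&/g'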
  • Renders a sample of all installed figlet/toilet fonts, printing each font's name in that font. A figlet-only variant is sketched after this entry.


    3
    find /usr/share/figlet -name *.?lf -exec basename {} \; | sed -e "s/\..lf$//" | xargs -I{} toilet -f {} {}
    unixmonkey3987 · 2010-07-13 20:12:54 5
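
    A hedged figlet-only variant of the same idea, assuming the fonts live in the standard /usr/share/figlet directory.

    # render each installed figlet font's name in its own font
    find /usr/share/figlet -name '*.flf' -exec basename {} .flf \; | xargs -I{} figlet -f {} {}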
  • [ 2000 -ge "$(free -m | awk '/buffers.cache:/ {print $4}')" ] returns true if less than 2000 MB of RAM are available, so adjust this number to your needs. [ $(echo "$(uptime | awk '{print $10}' | sed -e 's/,$//' -e 's/,/./') >= $(grep -c ^processor /proc/cpuinfo)" | bc) -eq 1 ] returns true if the current machine load is at least equal to the number of CPUs. If either test returns true, we wait 10 seconds and check again. If both tests return false, i.e. 2 GB are available and the machine load falls below the number of CPUs, we start our command and save its output in a text file. The ( ( ... ) & ) construct lets the command keep running in the background even if we log out. See http://www.commandlinefu.com/commands/view/3115/ . A multi-line version is sketched after this entry.


    4
    ( ( while [ 2000 -ge "$(free -m | awk '/buffers.cache:/ {print $4}')" ] || [ $(echo "$(uptime | awk '{print $10}' | sed -e 's/,$//' -e 's/,/./') >= $(grep -c ^processor /proc/cpuinfo)" | bc) -eq 1 ]; do sleep 10; done; my-command > output.txt ) & )
    michelsberg · 2010-07-13 09:12:11 3
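
    The same logic spread over several lines, which may be easier to tweak; as with the one-liner, the awk field numbers for free and uptime vary between versions.

    # wait until enough RAM is free and the load drops below the CPU count, then run
    ( (
        while [ 2000 -ge "$(free -m | awk '/buffers.cache:/ {print $4}')" ] ||
              [ "$(echo "$(uptime | awk '{print $10}' | sed -e 's/,$//' -e 's/,/./') >= $(grep -c ^processor /proc/cpuinfo)" | bc)" -eq 1 ]
        do
            sleep 10
        done
        my-command > output.txt
    ) & )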
  • This command logs the space used in the postgres (current) directory every 2 seconds. A simpler way to just watch the size is sketched after this entry.


    -5
    while (( 1==1 )); do du -c . >> output.log; sleep 2; done; tail -f output.log
    aceiro · 2010-07-12 17:23:45 4
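
    If you only want to watch the size change rather than keep a log, watch is simpler. A sketch; the postgres data path is a guess, adjust it to yours.

    watch -n 2 du -sh /var/lib/postgresql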
  • Full command: google contacts list name,name,email|perl -pne 's%^((?!N\/A)(.+?)),((?!N\/A)(.+?)),([a-z0-9\._-]+\@([a-z0-9][a-z0-9-]*[a-z0-9]\.)+([a-z]+\.)?([a-z]+))%${1}:${3} <${5}>%imx'|grep -oP '^((?!N\/A)(.+?)) <[a-z0-9\._-]+\@([a-z0-9][a-z0-9-]*[a-z0-9]\.)+([a-z]+\.)?([a-z]+)>' | sort. You'll need googlecl and python-gdata. First set up google cl via: google. Then give your PC access: google contacts list name,email. Then run the command and save its output, or use this one to dump it straight into the cone-address.txt file in your home dir: google contacts list name,name,email | perl -p -n -e 's%^((?!N\/A)(.+?)),((?!N\/A)(.+?)),([a-z0-9\._-]+\@([a-z0-9][a-z0-9-]*[a-z0-9]\.)+([a-z]+\.)?([a-z]+))%${1}:${3} <${5}>%imx' | grep -o -P '^((?!N\/A)(.+?)) <[a-z0-9\._-]+\@([a-z0-9][a-z0-9-]*[a-z0-9]\.)+([a-z]+\.)?([a-z]+)>' | sort > ~/cone-adress.txt. Then import into cone. It filters out multiple emails, and contacts with no email that show N/A (Picasa photo persons without email, for example).


    1
    google contacts list name,name,email|perl -pne 's%^((?!N\/A)(.+?)),((?!N\/A)(.+?)),([a-z0-9\._-]+\@([a-z0-9][a-z0-9-]*[a-z0-9]\.)+([a-z]+\.)?([a-z]+))%${1}:${3} <${5}>%imx' #see below for full command
    Raymii · 2010-07-12 16:50:44 148

  • -1
    wget --load-cookies <cookie-file> -c -i <list-of-urls>
    alriode · 2010-07-12 02:35:21 3
  • I have this as a file called deletekey in my ~/bin. Makes life a little easier. A sketch of the full script is below this entry.


    -13
    echo "${1}" | egrep '^[[:digit:]]*$' ; if [ "$?" -eq 0 ] ; then sed -i "${1}"d $HOME/.ssh/known_hosts ; else printf "\tYou must enter a number!\n\n" ; exit 1 ; fi
    DaveQB · 2010-07-11 23:09:11 3
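
    A fleshed-out sketch of how such a ~/bin/deletekey script might look: same logic, just laid out as a file (an untested guess, not the author's actual script).

    #!/bin/bash
    # deletekey: remove line number $1 from ~/.ssh/known_hosts
    if ! echo "$1" | egrep -q '^[[:digit:]]+$' ; then
        printf "\tYou must enter a number!\n\n"
        exit 1
    fi
    sed -i "${1}d" "$HOME/.ssh/known_hosts"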
  • In this case it's better to use the dedicated tool; typical usage is sketched after this entry.


    45
    ssh-keygen -R <the_offending_host>
    bunam · 2010-07-11 19:37:24 16
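
    Typical usage, with the hostname and IP as placeholders:

    ssh-keygen -R server.example.com                  # remove by hostname
    ssh-keygen -R 192.0.2.10 -f ~/.ssh/known_hosts    # remove by IP, naming the file explicitly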
  • Disable address space randomisation (ASLR). A sysctl equivalent is sketched after this entry.


    -1
    echo 0 > /proc/sys/kernel/randomize_va_space
    gunslinger_ · 2010-07-11 16:42:42 3
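
    The sysctl interface does the same thing and is a little more self-documenting; 2 restores full randomisation, which is the usual default.

    sudo sysctl -w kernel.randomize_va_space=0   # disable ASLR
    sudo sysctl -w kernel.randomize_va_space=2   # re-enable (default)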
  • Flood-pings the target with large packets; sending bigger or faster floods requires root. A gentler non-root variant is sketched after this entry.


    -3
    sudo ping -f -c 999 -s 4500 target.com
    gunslinger_ · 2010-07-11 16:38:44 13
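
    A gentler sketch that doesn't need root, using an explicit interval instead of flood mode (target.com kept as the placeholder):

    ping -c 999 -s 1400 -i 0.2 target.com   # 999 packets of 1400 bytes, one every 0.2 s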
  • Print "Art of hacking..." 100 times by perl or you can this tools : http://packetstormsecurity.org/shellcode/shellcodeencdec.py.txt Show Sample Output


    -15
    perl -e 'print "\x41\x72\x74\x20\x6f\x66\x20\x68\x61\x63\x6b\x69\x6e\x67\x2e\x2e\x2e\n" x 100'
    gunslinger_ · 2010-07-11 16:32:00 7
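
    To produce that \x-encoded string from plain text in the first place, the od recipe at the top of this page works. A sketch:

    echo -n "Art of hacking..." | od -A n -t x1 | sed 's/ /\\x/g'
    # reproduces the \x41\x72\x74... byte string used above (od wraps its output every 16 bytes)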