All commands (14,187)

  • Felt like I needed to win the lottery, so I wrote this command to train and develop my guessing abilities.


    13
    A=1;B=100;X=0;C=0;N=$[$RANDOM%$B+1];until [ $X -eq $N ];do read -p "N between $A and $B. Guess? " X;C=$(($C+1));A=$(($X<$N?$X:$A));B=$(($X>$N?$X:$B));done;echo "Took you $C tries, Einstein";
    rodolfoap · 2009-12-16 13:24:23 140
  • Check the general system error report on AIX


    1
    errpt -a | more
    marousan · 2009-12-16 12:07:16 3
  • I wanted to create a copy of my whole laptop disk on an LVM volume of the same size. First I created the logical volume: lvcreate -L120G -nlaptop mylvms. SOURCE: dd if=/dev/sda bs=16065b | netcat ip-target 1234 TARGET: nc -l -p 1234 | dd of=/dev/mapper/mylvms-laptop bs=16065b STATS: to follow its progress, issue the following on the target in a different terminal: watch -n60 -- kill -USR1 $(pgrep dd) (see http://www.commandlinefu.com/commands/view/4356/output-stats-from-a-running-dd-command-to-see-its-progress). The whole flow is sketched step by step after this entry.


    7
    SOURCE: dd if=/dev/sda bs=16065b | netcat ip-target 1234 TARGET: netcat -l -p 1234 | dd of=/dev/mapper/laptop bs=16065b STATS on target: watch -n60 -- kill -USR1 $(pgrep dd)
    bw · 2009-12-16 10:51:06 10
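    A step-by-step sketch of the same workflow, with placeholders: /dev/sda is the source disk, mylvms the volume group, and 192.168.1.10 the target host.
        # on the target: create a matching LV and start listening
        lvcreate -L 120G -n laptop mylvms
        nc -l -p 1234 | dd of=/dev/mapper/mylvms-laptop bs=16065b
        # on the source: stream the whole disk across
        dd if=/dev/sda bs=16065b | nc 192.168.1.10 1234
        # on the target, in another terminal: make dd report its stats every 60 seconds
        watch -n60 -- kill -USR1 $(pgrep dd)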
  • If you started a large dd and forgot about statistics, but you still wonder what the progress is, this command in ANOTHER terminal will show you the way. NOTE: the watch command by itself will not output anything. NOTE: the kill command will not kill the dd process; it only makes dd print its statistics.


    2
    watch -n60 -- kill -USR1 $(pgrep dd)
    bw · 2009-12-16 10:35:28 4
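    With a newer GNU coreutils dd (roughly version 8.24 onwards) you can skip the signal trick entirely and have dd report progress itself; a minimal example with placeholder paths:
        dd if=/dev/sda of=/dev/null bs=1M status=progress
    The kill -USR1 approach still works wherever dd honours SIGUSR1; on BSD the equivalent signal is SIGINFO.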
  • If you give tar a list of filenames, it will not add the directories, so if you don't care about directory ownership or permissions, you can save some space. Tar will create directories as necessary when extracting. This command is limited by the maximum supported size of the argument list, so if you are trying to tar up the whole OS for instance, you may just get "Argument list too long".


    3
    tar -cvzf arch.tgz $(find /path/dir -not -type d)
    pysquared · 2009-12-15 13:46:54 6
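    If the file list is too long for a single argument list, a hedged alternative with GNU find and GNU tar is to feed the names in over stdin instead (same /path/dir placeholder as above):
        find /path/dir -not -type d -print0 | tar -czvf arch.tgz --null -T -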
  • Tar: compress a directory while excluding certain folders.


    -1
    tar -cvf /path/dir.tar /path/dir* --exclude "/path/dir/name" --exclude "/path/dir/opt"
    sandeepverma · 2009-12-15 09:48:41 3

  • 8
    COL=$(( $(tput cols) / 2 )); clear; tput setaf 2; while :; do tput cup $((RANDOM%COL)) $((RANDOM%COL)); printf "%$((RANDOM%COL))s" $((RANDOM%2)); done
    sputnick · 2009-12-15 02:48:28 10
  • This shows all the LS_COLORS entries, each with a colored demo of how it renders.


    8
    echo $LS_COLORS | sed 's/:/\n/g' | awk -F= '!/^$/{printf("%s \x1b[%smdemo\x1b[0m\n",$0,$2)}'
    bones7456 · 2009-12-15 01:17:46 7
  • We force IPv4, compress the stream, and specify Blowfish as the cipher. I suppose you could use aes256-ctr as well for the cipher spec. I'm of course leaving out things like master control sessions and such, as they may not be available in your shell, although they would speed things up as well (see the sketch after this entry).


    18
    ssh -4 -C -c blowfish-cbc
    vxbinaca · 2009-12-15 00:30:53 34
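    The master control sessions mentioned above can be enabled with a few lines in ~/.ssh/config; a sketch (the socket path is just an example):
        Host *
            ControlMaster auto
            ControlPath ~/.ssh/cm-%r@%h:%p
            ControlPersist 10m
    Subsequent connections to the same host then reuse the already-established session, which speeds up repeated logins. Note that recent OpenSSH releases have removed the blowfish-cbc cipher, so on a modern system you may need a different -c value, or none at all.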
  • Not everyone reads manpages. Aliasing this command will help with the task of doing audits with RKhunter. It will check for the latest version, update the definitions and then run a check on the system. Hint: alias it in your .bashrc (an example follows this entry) to make life easier for your fingers.


    2
    rkhunter --versioncheck --update --propupd --check
    vxbinaca · 2009-12-15 00:23:45 4
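    A possible alias for your .bashrc, as hinted above (the name rkcheck is arbitrary):
        alias rkcheck='sudo rkhunter --versioncheck --update --propupd --check'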
  • This command is a great way to check whether ACPI is doing damage to your disk by aggressively parking the read arm and wearing down its life. As you can see, mine has lost half its life. I'm sure this could be shortened somehow (one attempt at shortening it follows this entry). It uses smartctl to dump the stats and then greps out just the temperature and load cycles for the disk (a load cycle is when the read arm comes out of park and wears on the drive).


    2
    watch -d 'sudo smartctl -a /dev/sda | grep Load_Cycle_Count ; sudo smartctl -a /dev/sda | grep Temp'
    vxbinaca · 2009-12-15 00:15:24 5
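    A slightly shorter variant that folds the two smartctl calls into one and lets grep -E match both attributes (a sketch, same /dev/sda placeholder):
        watch -d 'sudo smartctl -a /dev/sda | grep -E "Load_Cycle_Count|Temp"'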
  • FLAC's built-in integrity checks are far more useful than devising a scheme to use MD5 sum files. This will check all the FLAC files in a directory and output only errors. Remove the "s" after the "t" and it will be somewhat verbose in the check.


    1
    flac -ts *.flac
    vxbinaca · 2009-12-15 00:07:51 5
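    To check a whole collection recursively rather than a single directory, something along these lines should work (a sketch using find):
        find . -name '*.flac' -exec flac -ts {} +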
  • This command is meant to be used to make a lightweight backup, for when you want to know which files might be missing or changed, but you don't care about their contents (because you have some way to recover them). Explanation of parts: "ls -RFal /" lists all files in and below the root directory, along with their permissions and some other metadata. I think sudo is necessary to allow ls to read the metadata of certain files. "| gzip" compresses the result, from 177 MB to 16 MB in my case. "> all_files_list.txt.gz" saves the result to a file in the current directory called all_files_list.txt.gz. This name can be changed, of course.


    2
    sudo ls -RFal / | gzip > all_files_list.txt.gz
    roryokane · 2009-12-14 21:40:56 3
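    Two such snapshots taken at different times can then be compared without unpacking them; for example, with placeholder file names:
        diff <(zcat old_files_list.txt.gz) <(zcat new_files_list.txt.gz)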
  • Remove security from PDF document using this very simple command on Linux and OSX. You need ghostscript for this baby to work.


    47
    gs -q -dNOPAUSE -dBATCH -sDEVICE=pdfwrite -sOutputFile=OUTPUT.pdf -c .setpdfwrite -f INPUT.pdf
    deijmaster · 2009-12-14 21:30:22 27
  • The listed URL contains over 6700 common ad-serving hostnames. The command simply appends these 127.* entries to your /etc/hosts.


    7
    wget -q -O - http://someonewhocares.org/hosts/ | grep ^127 >> /etc/hosts
    torrid · 2009-12-14 17:11:16 8
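    The append into /etc/hosts needs root; if you are not already in a root shell, a sudo tee variant avoids the "permission denied" surprise:
        wget -q -O - http://someonewhocares.org/hosts/ | grep ^127 | sudo tee -a /etc/hosts > /dev/null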
  • Useful if another user cannot access some directory and you want to know which directory along the path is missing the x bit.


    2
    dir=$(pwd); while [ ! -z "$dir" ]; do ls -ld "$dir"; dir=${dir%/*}; done; ls -ld /
    hfs · 2009-12-14 14:38:11 3
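    On systems with util-linux, namei can do roughly the same walk in one shot (shown as an alternative, not part of the original command; the path is a placeholder):
        namei -l /path/to/some/dir
    It prints the owner and permissions of every path component from / downwards, which makes a missing x bit easy to spot.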
  • Displays a scrolling banner which loops until you hit Ctrl-C to terminate it. Make sure you finish your banner message with a space so it will loop nicely.


    11
    while [ 1 ]; do banner 'ze missiles, zey are coming! ' | while IFS="\n" read l; do echo "$l"; sleep 0.01; done; done
    craigds · 2009-12-14 07:40:07 10
  • If you want all the URLs from all the sessions, you can use: perl -lne 'print for /url":"\K[^"]+/g' ~/.mozilla/firefox/*/sessionstore.js Thanks to tybalt89 (idea of the "for" statement). For Perl purists, there are the JSON and File::Slurp modules, but those are not installed by default.


    0
    perl -lne 'print for /url":"\K[^"]+/g' $(ls -t ~/.mozilla/firefox/*/sessionstore.js | sed q)
    sputnick · 2009-12-14 00:51:54 2
  • blue and yellow colored bash prompt for a Hanukkah celebration on your box


    3
    export PS1="\e[0;34m[\u\e[0;34m@\h[\e[0;33m\w\e[0m\e[0m\e[0;34m]#\e[0m "
    decept · 2009-12-13 18:35:06 8
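    As written, the colour escapes are not wrapped in \[ ... \], so bash may miscount the prompt length and garble long command lines; a hedged variant with the non-printing parts bracketed:
        export PS1="\[\e[0;34m\][\u@\h \[\e[0;33m\]\w\[\e[0;34m\]]#\[\e[0m\] "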
  • This command, taken from play's manual page, plays a synthesized guitar tone for each of the strings on a standard tuned guitar. The command "play" is a part of the package "sox".


    18
    for n in E2 A2 D3 G3 B3 E4;do play -n synth 4 pluck $n repeat 2;done
    eightmillion · 2009-12-13 06:57:26 32
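    The same loop works for other tunings by changing the notes; for drop D, for example (assuming the same sox synth settings):
        for n in D2 A2 D3 G3 B3 E4;do play -n synth 4 pluck $n repeat 2;done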

  • 11
    cpan -r
    sputnick · 2009-12-13 02:54:22 9
  • Please comment on whether this works for you or not. If you have already run that snippet, or you know you already have the IO::Interface::Simple Perl module, you can type only the last command: perl -e 'use IO::Interface::Simple; my $ip=IO::Interface::Simple->new($ARGV[0]); print $ip->address,$/;' <INTERFACE> (The first perl command will install the module if it's not there already.)


    1
    x=IO::Interface::Simple; perl -e 'use '$x';' &>/dev/null || cpan -i "$x"; perl -e 'use '$x'; my $ip='$x'->new($ARGV[0]); print $ip->address,$/;' <INTERFACE>
    sputnick · 2009-12-13 02:23:40 36
  • Knowing when a filesystem was created, you can deduce when the operating system was installed. Find the filesystem's device (/dev/...) by looking at /etc/fstab (cat /etc/fstab).


    7
    dumpe2fs -h /dev/DEVICE | grep 'created'
    eastwind · 2009-12-12 14:47:33 14
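    tune2fs, from the same e2fsprogs package, reports the same "Filesystem created" field if dumpe2fs is not at hand (/dev/sda1 is a placeholder):
        tune2fs -l /dev/sda1 | grep -i created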

  • 4
    du -sch ./*
    enderst · 2009-12-12 05:10:40 3

  • 1
    ffmpeg -i Your_video_file -s 320x240 FILE.flv
    eastwind · 2009-12-12 00:28:10 4

Check These Out

Take a screenshot of the window the user clicks on and name the file the same as the window title
In general, this is actually not better than the "scrot -d4" command I'm listing it as an alternative to, so please don't vote it down for that. I'm adding this command because xwd (X window dumper) comes with X11, so it is already installed on your machine, whereas scrot probably is not. I've found xwd handy on boxen that I don't want to (or am not allowed to) install packages on. NOTE: The dd junk for renaming the file is completely optional. I just did that for fun and because it's interesting that xwd embeds the window title in its metadata. I probably should have just parsed the output from file(1) instead of cutting it out with dd(1), but this was more fun and less error prone. NOTE2: Many programs don't know what to do with an xwd format image file. You can convert it to something normal using NetPBM's xwdtopnm(1) or ImageMagick's convert(1). For example, this would work: "xwd | convert fd:0 foo.jpg". Of course, if you have ImageMagick already installed, you'd probably use import(1) instead of xwd. NOTE3: Xwd files can be viewed using the X Window UnDumper: "xwud <foo.xwd". ImageMagick and The GIMP can also read .xwd files. Strangely, eog(1) cannot. NOTE4: The sleep is not strictly necessary, I put it in there so that one has time to raise the window above any others before clicking on it.

follow the content of all files in a directory
The `-q' arg forces tail to not output the name of the current file
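A minimal example of the idea (assuming /var/log as the directory; -q suppresses the per-file headers):
    tail -q -f /var/log/*.log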

sort a JSON blob
For situations where you keep JSON in a VCS and you want your diffs to be sane, such as within a Chef configuration repo.
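One hedged way to achieve this, assuming jq is available (config.json is a placeholder file name):
    jq -S . config.json > config.sorted.json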

Change prompt to MS-DOS one (joke)

pass the output of some command to a new email in the default email client
This depends on 'stripansi' and 'urlencode' commands, which exist on my system as these aliases: $ alias stripansi='perl -ple "s/\033\[(?:\d*(?:;\d+)*)*m//g;"' $ alias urlencode='perl -MURI::Escape -ne "\$/=\"\"; print uri_escape \$_"' The `open` command handles URLs on a Mac. Substitute the equivalent for your system (perhaps gnome-open). I don't use system `mail`, so I have this aliased as `mail` and use it this way: $ git show head | mail

github push-ing behind draconian proxies!
If you are behind a restrictive proxy/firewall that blocks port 22 connections but allows SSL on 443 (like most do) then you can still push changes to your github repository. Your .ssh/config file should contain:
    Host *
      ForwardX11 no
      TCPKeepAlive yes
      ProtocolKeepAlives 30
      ProxyCommand /usr/local/bin/proxytunnel -v -p -d %h:443
    Host
      User git
      Hostname ssh.github.com
      ChallengeResponseAuthentication yes
      IdentityFile ~/.ssh/id_rsa
      IdentitiesOnly yes
Basically proxytunnel "tunnels" your ssh connection through port 443. You could also use corkscrew or some other tunneling program that is available in your distro's repository. PS: I generally use "github.com" as the SSH-HOST so that urls of the kind git@github.com:USER/REPO.git work transparently :)

Add a GPG key to the aptitude package manager on an Ubuntu system
When we add a new package source to aptitude (the Debian package manager), we need to add its GPG key as well; otherwise it will show a warning/error about the missing key.
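A hedged example of the classic pattern (the key URL is a placeholder; note that apt-key is deprecated on newer releases):
    wget -q -O - https://example.org/archive.key | sudo apt-key add -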

Prepare a commandlinefu command.
This command will format your alias or function to a single line, trimming duplicate white space and newlines and inserting delimiter semi-colons, so it continues to work on a single line.

Which processes are listening on a specific port (e.g. port 80)
swap out "80" for your port of interest. Can use port number or named ports e.g. "http"

Shortcut to find files with ease.
It looks for files whose names contain the given word; the match is case-insensitive.
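A hedged sketch of the idea using find's case-insensitive -iname (word is a placeholder):
    find . -iname '*word*'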

