All commands (14,187)

  • ls -al: list all files in long format. It provides much more information than a plain ls: the owner, the group, the size in bytes, and the last time each file or directory was modified.


    -11
    ls -al
    eastwind · 2009-11-12 12:27:32 6
  • This is a reference to Antoine de Saint-Exupéry's "The Little Prince".


    6
    aptitude moo
    eastwind · 2009-11-12 12:24:01 7
  • Watch a TiVo file on your computer. The $MAK, $tivo and $filename variables it relies on are sketched below the command.


    0
    curl -s -c /tmp/cookie -k -u tivo:$MAK --digest http://$tivo/download/$filename | tivodecode -m $MAK -- - | mplayer - -cache-min 50 -cache 65536
    matthewbauer · 2009-11-11 23:32:23 3
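    A minimal sketch, with hypothetical values, of the variables the command above expects:
      MAK=0123456789              # hypothetical Media Access Key from your TiVo account settings
      tivo=192.168.1.50           # hypothetical hostname or IP of the TiVo
      filename='My%20Show.TiVo'   # hypothetical (URL-encoded) name of the recording to fetch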

  • 9
    setterm -powersave off -blank 0
    unixmonkey6999 · 2009-11-11 22:39:50 4
  • If you're trying to copy all your dotfiles from one location to another, this may help (one way to feed the list to cp is sketched below).


    -2
    ls -a | egrep "^\.\w"
    kulor · 2009-11-11 18:19:56 12
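    One hedged way to feed that list to cp for the actual copy (the destination path is hypothetical, and this simple form assumes dotfile names without spaces):
      cp -a $(ls -a | egrep "^\.\w") /path/to/new/home/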
  • cd to the folder containing the wav files and convert them all to Ogg format. In my sample output I use the -a and -l flags to set the author and album title (see the sketch below). To get the oggenc program in Ubuntu Linux, run: sudo apt-get install oggenc


    2
    oggenc *.wav
    nickleus · 2009-11-11 14:26:01 6
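    A sketch of the tagging flags mentioned above (artist and album values are placeholders):
      oggenc -a "Some Artist" -l "Some Album" *.wav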
  • cd to the folder containing the wav files, then convert them all to FLAC. Yeah baby! In Ubuntu, to get the flac program just run: sudo apt-get install flac. Supported input formats for flac are wav, aiff, raw, flac, oga and ogg.


    3
    flac --best *.wav
    nickleus · 2009-11-11 14:17:24 9

  • -1
    for i in `cat /etc/passwd | awk -F : '{ print $1 }';`; do passwd -e $i; done
    irraz · 2009-11-11 13:01:22 3
  • Create a tar file in multiple parts if it's too large for a single disk, your filesystem, etc. Rejoin later with `cat <name>.tar.* | tar xf -` (a worked example follows below the command).


    17
    tar cf - <dir>|split -b<max_size>M - <name>.tar.
    dinomite · 2009-11-11 01:53:33 5
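    A worked sketch with hypothetical names, splitting a directory into 1024 MB pieces and rejoining them later:
      tar cf - bigdir | split -b 1024M - bigdir.tar.   # produces bigdir.tar.aa, bigdir.tar.ab, ...
      cat bigdir.tar.* | tar xf -                      # rejoin and extract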
  • The magic is performed by the -t parameter, which lists the contents of each archive instead of extracting it.


    -2
    for F in $(find ./ -name "*.tgz") ; do tar -tvzf $F ; done
    alchandia · 2009-11-11 00:50:52 3
  • The edit_command_line widget from zsh-users opens the current command line in $EDITOR (a binding sketch follows below the command):
    # from zsh-users
    edit_command_line () {
        # edit current line in $EDITOR
        local tmpfile=${TMPPREFIX:-/tmp/zsh}ecl$$
        print -R - "$PREBUFFER$BUFFER" >$tmpfile
        exec </dev/tty
        ${VISUAL:-${EDITOR:-vi}} $tmpfile
        zle kill-buffer
        BUFFER=${"$(<$tmpfile)"%$'\n'}
        CURSOR=$#BUFFER
        command rm -f $tmpfile
        zle redisplay
    }
    zle -N edit_command_line


    -2
    zsh$ M-v
    bucciarati · 2009-11-10 23:02:56 11
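    A minimal sketch of binding the widget (registered with zle -N above) to M-v in ~/.zshrc, assuming an emacs-style keymap:
      bindkey '\ev' edit_command_line   # M-v now opens the current command line in $EDITOR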

  • -3
    dd if=/dev/<device location> | gzip -c > /<path to backup location>/<disk image name>.img.gz
    awjrichards · 2009-11-10 22:57:51 9
  • The pstack command prints a stack trace of a running process without needing to attach a debugger, but what about core files? The answer, of course, is to use this command. Usage: gdbbt program corefile (a worked sketch follows below).


    3
    alias gdbbt="gdb -q -n -ex bt -batch"
    TeacherTiger · 2009-11-10 22:56:59 649
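    A hedged end-to-end sketch; the program and core file names are hypothetical:
      ulimit -c unlimited    # allow core dumps in this shell
      ./myprog               # suppose it crashes and leaves ./core
      gdbbt ./myprog core    # print the backtrace from the core file and exit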
  • I don't know if you've used sqsh before, but it has a handy feature that lets you switch into vim to finish editing whatever complicated SQL statement you are trying to run. But I got to thinking -- why doesn't bash have that? Well, it does. It's called '|'! Jk. Seriously, I'm pretty sure this flow of commands will revolutionize how I administer files. And because everything is a file on *nix-based distros, well, it's handy. First, if your ls is aliased to ls --color=auto, then create another alias in your .bashrc: alias lsp='ls --color=none'. Now, let's say you want to rename all files that begin with the prefix 'ras' to files that begin with a 'raster' prefix. You could do it with some bash substitution. But who remembers that? I remember vim macros because I can remember to press 'qa' and how to move around in vim. Plus, it's more incremental. You can check things along the way. That is the secret to development and probably the universe. So type something like: lsp | grep ras. Are those all the files you need to move? If not, modify and re-grep. If so, pipe it to vim: lsp | grep ras | vim -. Now run your vim macro to modify the first line. Assuming you use 'w' and 'b' to move around, etc., it should work for all lines. Hold down '@@', etc., until your list of files has been modified from
        ras_a.h
        ras_a.cpp
        ras_b.h
        ras_b.cpp
    to:
        mv ras_a.h raster_a.h
        mv ras_a.cpp raster_a.cpp
        mv ras_b.h raster_b.h
        mv ras_b.cpp raster_b.cpp
    Then run :%!bash, then run :q!, then be like, whaaaaa? as you realize your workflow got a little more continuous. Maybe. YMMV. (A condensed sketch of the flow follows below the command.)


    -3
    vim -
    tmsh · 2009-11-10 22:25:36 12
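    A condensed sketch of the flow described above; the 'ras'/'raster' names are the author's example, and the last line is the pure-bash alternative the description alludes to:
      alias lsp='ls --color=none'   # plain listing that is safe to pipe
      lsp | grep ras | vim -        # edit the matching file list inside vim
      # in vim: record a macro turning "ras_a.h" into "mv ras_a.h raster_a.h",
      # replay it on every line, then run  :%!bash  and  :q!
      for f in ras_*; do mv "$f" "raster_${f#ras_}"; done   # bash-substitution equivalent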
  • This function creates date-based backups of the given files. It copies each file to the same place as the original, but with an additional extension that is the timestamp of the copy, in the format YearMonthDay-HourMinuteSecond (a usage sketch follows below).


    6
    backup() { for i in "$@"; do cp -va $i $i.$(date +%Y%m%d-%H%M%S); done }
    polaco · 2009-11-10 20:59:45 6
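    A usage sketch (the file names and resulting timestamps are illustrative):
      backup ~/.bashrc notes.txt
      # copies to e.g. ~/.bashrc.20091110-205945 and ./notes.txt.20091110-205945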
  • This command lists all the files inside the tarballs found in any folder or subfolder of the provided path. The while loop echoes the name of each tarball before listing its files, so the tarball can be identified.


    -2
    find <path> -name "*.tgz" -or -name "*.tar.gz" | while read file; do echo "$file: "; tar -tzf $file; done
    polaco · 2009-11-10 20:39:04 36
  • This command will copy a folder tree (keeping the parent folders) through ssh. It will:
    - compress the data
    - stream the compressed data through ssh
    - decompress the data into the local folder
    It takes no additional space on the host machine (no need to create a compressed tar file, transfer it and then delete it on the host). In some situations (like mirroring a remote machine) you simply can't wait for a huge scp command to finish, or can't compress the data to a tarball on the host because of file system space limitations, so this command does the job quite well. It performs best when a lot of data is involved; if you're copying a small amount of data, use scp instead (easier to type). A usage sketch, including the same trick in the push direction, follows below the command.


    12
    ssh <host> 'tar -cz /<folder>/<subfolder>' | tar -xvz
    polaco · 2009-11-10 20:06:47 8
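    A hedged usage sketch with a hypothetical host and paths, including the same idea in the push direction:
      ssh web01 'tar -cz /var/www/html' | tar -xvz         # pull: recreates ./var/www/html locally
      tar -cz ./mydata | ssh web01 'tar -xvz -C /backup'   # push: extracts under /backup on the host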

  • 0
    egrep -v "^[[:blank:]]*($|#|//|/\*| \*|\*/)" somefile
    sdadh01 · 2009-11-10 18:49:19 5
  • Recursively find files that were updated in the last hour, ignoring SVN files and folders. Useful in case you do a full svn up by accident.


    2
    find . -mmin -60 -not -path "*svn*" -print|more
    bloodykis · 2009-11-10 18:34:53 7
  • Strips comments from at least bash and PHP scripts: normal # and // comments as well as PHP block comments. It removes all of the following: empty/blank lines, lines beginning with #, lines beginning with //, lines beginning with /*, lines beginning with a space and then *, and lines beginning with */. It also deletes those lines when there is whitespace before any of the above. To use it from .bashrc, add an alias like this (a usage sketch follows below the command): alias stripcomments="sed -e '/^[[:blank:]]*#/d; s/[[:blank:]][[:blank:]]*#.*//' -e '/^$/d' -e '/^\/\/.*/d' -e '/^\/\*/d;/^ \* /d;/^ \*\//d'"


    -3
    sed -e '/^[[:blank:]]*#/d; s/[[:blank:]][[:blank:]]*#.*//' -e '/^$/d' -e '/^\/\/.*/d' -e '/^\/\*/d;/^ \* /d;/^ \*\//d' /a/file/with/comments
    unixmonkey6951 · 2009-11-10 17:47:22 10
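    A usage sketch once the alias is in place (file names are hypothetical):
      stripcomments myscript.sh > myscript.stripped.sh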
  • The legend in the first column: i = installed, p = installable.


    -6
    aptitude search NAME
    CafeNinja · 2009-11-10 11:23:18 5
  • The command as given creates the file "/result_path/result.tar.gz" with the contents of the target folder, preserving permissions and sub-folder structure (an extraction sketch follows below).


    0
    tar pzcvf /result_path/result.tar.gz /target_path/target_folder
    CafeNinja · 2009-11-10 11:17:00 5
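    A hedged sketch of restoring the archive elsewhere (paths reuse the entry's placeholders; p preserves permissions on extraction and -C picks the destination):
      tar pzxvf /result_path/result.tar.gz -C /restore_path/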
  • Decodes a MIME message. Useful when you receive an email with a file attachment that can't be read otherwise.


    3
    munpack file.txt
    Diceroll · 2009-11-10 10:53:49 4
  • Search Ubuntu's remote package repositories for a specific program to see which package contains it (a one-time setup sketch follows below).


    7
    apt-file find bin/programname
    nickleus · 2009-11-10 10:21:45 6
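    apt-file is not installed by default on Ubuntu; a sketch of the one-time setup before searching:
      sudo apt-get install apt-file   # install the tool
      sudo apt-file update            # download the package contents index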
  • Requires the pdftk package; the command concatenates 1.pdf, 2.pdf and 3.pdf into a single 123.pdf.


    8
    pdftk 1.pdf 2.pdf 3.pdf cat output 123.pdf
    eastwind · 2009-11-10 10:07:37 4