All commands (14,187)

  • ls -al: list all files in long format. Unlike a bare ls, it provides much more information: the owner, the group, the size in bytes, and the last time each file or directory was modified.


    -11
    ls -al
    eastwind · 2009-11-12 12:27:32 6
  • This is a reference to Antoine de Saint-Exupéry's "The Little Prince".


    6
    aptitude moo
    eastwind · 2009-11-12 12:24:01 7
  • Watch a TiVo file on your computer.


    0
    curl -s -c /tmp/cookie -k -u tivo:$MAK --digest http://$tivo/download/$filename | tivodecode -m $MAK -- - | mplayer - -cache-min 50 -cache 65536
    matthewbauer · 2009-11-11 23:32:23 3

  • 9
    setterm -powersave off -blank 0
    unixmonkey6999 · 2009-11-11 22:39:50 4
  • If you're trying to copy all your dotfiles from one location to another, this may help.


    -2
    ls -a | egrep "^\.\w"
    kulor · 2009-11-11 18:19:56 12
  • cd to the folder containing the wav files and convert them all to Ogg format. In my sample output I use the -a and -l flags to set the author and album title. To get the oggenc program in Ubuntu Linux, run: sudo apt-get install vorbis-tools (the package that provides oggenc)


    2
    oggenc *.wav
    nickleus · 2009-11-11 14:26:01 6
  • cd to the folder containing the wav files, then convert them all to FLAC. Yeah baby! In Ubuntu, to get the flac program just: sudo apt-get install flac. flac input formats are wav, aiff, raw, flac, oga and ogg.


    3
    flac --best *.wav
    nickleus · 2009-11-11 14:17:24 9

  • -1
    for i in $(awk -F: '{ print $1 }' /etc/passwd); do passwd -e $i; done
    irraz · 2009-11-11 13:01:22 3
  • Create a tar file in multiple parts if it's too large for a single disk, your filesystem, etc. Rejoin later with `cat <name>.tar.*|tar xf -`. A concrete sketch follows below.


    17
    tar cf - <dir>|split -b<max_size>M - <name>.tar.
    dinomite · 2009-11-11 01:53:33 5
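
    A concrete usage sketch with the placeholders filled in (the directory name and part size are assumptions, not from the original entry):

    tar cf - /home/user/photos | split -b 1024M - photos.tar.
    # later, after moving the parts to the destination:
    cat photos.tar.* | tar xf -
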
  • The magic is performed by the -t parameter, which lists an archive's contents without extracting it.


    -2
    for F in $(find ./ -name "*.tgz") ; do tar -tvzf $F ; done
    alchandia · 2009-11-11 00:50:52 3
  • # from zsh-users
    edit_command_line () {
        # edit current line in $EDITOR
        local tmpfile=${TMPPREFIX:-/tmp/zsh}ecl$$

        print -R - "$PREBUFFER$BUFFER" >$tmpfile
        exec </dev/tty
        ${VISUAL:-${EDITOR:-vi}} $tmpfile
        zle kill-buffer
        BUFFER=${"$(<$tmpfile)"}
        CURSOR=$#BUFFER

        command rm -f $tmpfile
        zle redisplay
    }
    zle -N edit_command_line


    -2
    zsh$ M-v
    bucciarati · 2009-11-10 23:02:56 11

  • -3
    dd if=/dev/<device location> | gzip -c > /<path to backup location>/<disk image name>.img.gz
    awjrichards · 2009-11-10 22:57:51 9
  • The pstack command prints a stack trace of running processes without needing to attach a debugger, but what about core files? The answer, of course, is to use this command. Usage: gdbbt program corefile


    3
    alias gdbbt="gdb -q -n -ex bt -batch"
    TeacherTiger · 2009-11-10 22:56:59 658
  • I don't know if you've used sqsh before, but it has a handy feature that lets you switch into vim to finish editing whatever complicated SQL statement you are trying to run. But I got to thinking -- why doesn't bash have that? Well, it does. It's called '|'! Jk. Seriously, I'm pretty sure this flow of commands will revolutionize how I administer files. And because everything is a file on *nix-based distros, it's handy.

    First, if your ls is aliased to ls --color=auto, create another alias in your .bashrc: alias lsp='ls --color=none'

    Now, say you want to rename all files that begin with the prefix 'ras' so they begin with a 'raster' prefix instead. You could do it with some bash substitution, but who remembers that? I remember vim macros, because I can remember to press 'qa' and how to move around in vim. Plus, it's more incremental: you can check things along the way. That is the secret to development, and probably the universe.

    So type something like: lsp | grep ras

    Are those all the files you need to move? If not, modify the pattern and re-grep. If so, pipe the list into vim: lsp | grep ras | vim -

    Now run your vim macro to modify the first line. Assuming you use 'w' and 'b' to move around, etc., it should work for all lines. Hold down '@@' until your list of files has been modified from

    ras_a.h
    ras_a.cpp
    ras_b.h
    ras_b.cpp

    to:

    mv ras_a.h raster_a.h
    mv ras_a.cpp raster_a.cpp
    mv ras_b.h raster_b.h
    mv ras_b.cpp raster_b.cpp

    Then run :%!bash, then :q!, then be like, whaaaaa? as you realize your workflow got a little more continuous. Maybe. YMMV.


    -3
    vim -
    tmsh · 2009-11-10 22:25:36 12
  • This function creates date-based backups of the given files. It copies each file to the same place as the original, but with an additional extension: the timestamp of the copy, in the format YearMonthDay-HourMinuteSecond. A usage sketch follows below.


    6
    backup() { for i in "$@"; do cp -va $i $i.$(date +%Y%m%d-%H%M%S); done }
    polaco · 2009-11-10 20:59:45 6
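
    Usage sketch (the file names are examples only):

    backup .bashrc notes.txt
    # produces copies such as .bashrc.20091110-205945 and notes.txt.20091110-205945
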
  • This command lists all the files inside the tarballs found in any folder or subfolder of the provided path. The while loop echoes the name of each tarball before listing its files, so the tarball can be identified.


    -2
    find <path> -name "*.tgz" -or -name "*.tar.gz" | while read file; do echo "$file: "; tar -tzf "$file"; done
    polaco · 2009-11-10 20:39:04 36
  • This command copies a folder tree (keeping the parent folders) through ssh. It will:
    - compress the data
    - stream the compressed data through ssh
    - decompress the data into the local folder
    It takes no additional space on the host machine (no need to create a compressed tar file, transfer it, and then delete it on the host). In some situations (like mirroring a remote machine) you simply can't wait for a long-running scp, or can't compress the data to a tarball on the host because of filesystem space limitations, so this command does the job quite well. It performs best when a lot of data is involved; if you are copying a small amount of data, use scp instead (easier to type). A concrete sketch follows below.


    12
    ssh <host> 'tar -cz /<folder>/<subfolder>' | tar -xvz
    polaco · 2009-11-10 20:06:47 8
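
    A concrete sketch with the placeholders filled in (the host and folder names are assumptions):

    ssh user@remote.example.com 'tar -cz /var/www/html' | tar -xvz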

  • 0
    egrep -v "^[[:blank:]]*($|#|//|/\*| \*|\*/)" somefile
    sdadh01 · 2009-11-10 18:49:19 5
  • Find files recursively that were updated in the last hour, ignoring SVN files and folders. Useful in case you do a full svn up by accident.


    2
    find . -mmin -60 -not -path "*svn*" -print|more
    bloodykis · 2009-11-10 18:34:53 7
  • Strips comments from at least bash and PHP scripts: normal # and // comments as well as PHP block comments. It removes all of the following, even when there is whitespace before them:
    empty/blank lines
    lines beginning with #
    lines beginning with //
    lines beginning with /*
    lines beginning with a space and then *
    lines beginning with */
    To reuse it, add an alias to .bashrc like this: alias stripcomments="sed -e '/^[[:blank:]]*#/d; s/[[:blank:]][[:blank:]]*#.*//' -e '/^$/d' -e '/^\/\/.*/d' -e '/^\/\*/d;/^ \* /d;/^ \*\//d'"


    -3
    sed -e '/^[[:blank:]]*#/d; s/[[:blank:]][[:blank:]]*#.*//' -e '/^$/d' -e '/^\/\/.*/d' -e '/^\/\*/d;/^ \* /d;/^ \*\//d' /a/file/with/comments
    unixmonkey6951 · 2009-11-10 17:47:22 10
  • The legend in the first column: i = installed, p = installable.


    -6
    aptitude search NAME
    CafeNinja · 2009-11-10 11:23:18 5
  • The command as given creates the file /result_path/result.tar.gz with the contents of the target folder, preserving permissions and subfolder structure.


    0
    tar pzcvf /result_path/result.tar.gz /target_path/target_folder
    CafeNinja · 2009-11-10 11:17:00 5
  • Decodes a MIME message. Useful when you receive an email with a file attachment that can't be read.


    3
    munpack file.txt
    Diceroll · 2009-11-10 10:53:49 4
  • Search Ubuntu's remote package repositories for a specific program, to see which package contains it.


    7
    apt-file find bin/programname
    nickleus · 2009-11-10 10:21:45 6
  • Requires the pdftk package.


    8
    pdftk 1.pdf 2.pdf 3.pdf cat output 123.pdf
    eastwind · 2009-11-10 10:07:37 4

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.



Check These Out

List all files in a directory, sorted in reverse order by modification time, using file descriptors.
It's both silly and infinitely useful, especially in logfile directories where you want to know which file is being updated while troubleshooting.

Convert seconds to [DD:][HH:]MM:SS
Converts any number of seconds into days, hours, minutes and seconds.

    sec2dhms() {
        declare -i SS="$1"
        D=$(( SS / 86400 ))
        H=$(( SS % 86400 / 3600 ))
        M=$(( SS % 3600 / 60 ))
        S=$(( SS % 60 ))
        [ "$D" -gt 0 ] && echo -n "${D}:"
        [ "$H" -gt 0 ] && printf "%02g:" "$H"
        printf "%02g:%02g\n" "$M" "$S"
    }
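A quick usage check (the input value is just an example):

    $ sec2dhms 93784
    1:02:03:04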

BASH: Print shell variable into AWK
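No command text survives for this entry; a minimal sketch of the usual approach, using awk's -v option (the variable names and file name are placeholders):

    $ threshold=100
    $ awk -v max="$threshold" '$1 > max { print }' data.txt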

See how many more processes are allowed, awesome!
There is a limit to how many processes you can run at the same time for each user, especially with web hosts. If the maximum number of processes for your user is 200, then the following sets OPTIMUM_P to 100:

$ OPTIMUM_P=$(( (`ulimit -u` - `find /proc -maxdepth 1 \( -user $USER -o -group $GROUPNAME \) -type d|wc -l`) / 2 ))

This is very useful in scripts because it is such a fast, low-resource-intensive way (compared to ps, who, lsof, etc.) to determine how many processes are currently running for whichever user. The number of currently running processes is subtracted from the high limit set up for the account (see limits.conf, pam, initscript).

An easy-to-understand example: this searches the current directory for shell scripts and runs up to 100 'file' commands at the same time, greatly speeding up the command.

$ find . -type f | xargs -P $OPTIMUM_P -iFNAME file FNAME | sed -n '/shell script text/p'

I am using it in my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html especially for the xargs command. xargs has a -P option that lets you specify how many processes to run at the same time. For instance, if you have 1000 URLs in a text file and want to download all of them fast with curl, you could download 100 at a time (check ps output on a separate [pt]ty for proof) like this:

$ cat url-list.txt | xargs -I '{}' -P $OPTIMUM_P curl -O '{}'

I like to do things as fast as possible on my servers. I have several types of servers and hosting environments, some with very restrictive jail shells with a 20-process limit, some with 200, some with 8000, so for the jailed shells my xargs -P10 would kill my shell or dump core. Using the above, I can set the -P value dynamically so xargs always works.

If you were building a process-killer (very common for cheap hosting) this would also be handy. Note that if you are only allowed 20 or so processes, you should just use -P1 with xargs.

Count the total number of files in each immediate subdirectory
Counts the total (recursive) number of files in the immediate (depth 1) subdirectories, as well as the current one, and displays them sorted. Fixed, as per ashawley's comment.
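The command itself isn't reproduced here; one possible sketch of the idea (assumes GNU find):

    $ find . -maxdepth 1 -type d | while read -r dir; do printf '%d\t%s\n' "$(find "$dir" -type f | wc -l)" "$dir"; done | sort -n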

Working around "Argument list too long" errors from *

    $ grep ERROR *.log
    -bash: /bin/grep: Argument list too long
    $ echo *.log | xargs grep ERROR /dev/null
    20090119.00011.log:DANGEROUS ERROR

get total of inodes of root partition
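No command is shown for this entry; a minimal sketch using df's inode mode:

    $ df -i /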

A signal trap that logs when your script was killed and what other processes were running at that time
trap is the bash builtin that lets you execute commands when the current script receives a particular signal. This one uses $0 for the script name, $$ for the script PID, tee to write to STDOUT as well as a log file, and ps to log other running processes.
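The original one-liner isn't reproduced here; a minimal sketch along the lines the description suggests (the log path and signal list are assumptions):

    $ trap 'echo "$(date): $0 ($$) caught a signal" | tee -a /tmp/script.log; ps -ef | tee -a /tmp/script.log' INT TERM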

Take a look at a command before acting
Add |sh when you agree with the list; I often use this method to prevent typos in dangerous or long operations.
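A minimal sketch of the pattern (the rename example is an assumption, not the entry's original command). Preview the generated commands first, then append |sh once the list looks right:

    $ ls *.old | sed 's/\(.*\)\.old/mv & \1.new/'
    $ ls *.old | sed 's/\(.*\)\.old/mv & \1.new/' | sh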

Get gzip compressed web page using wget.
Like the original command, but the -f allows this one to succeed even if the website returns uncompressed data. From gzip(1) on the -f flag: if the input data is not in a format recognized by gzip, and if --stdout is also given, copy the input data without change to the standard output: let zcat behave as cat.
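The command itself isn't shown here; a minimal sketch consistent with the description (the URL is a placeholder):

    $ wget -q -O - --header='Accept-Encoding: gzip' http://example.com/ | gzip -dcf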


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
