Commands tagged cat (49)

  • If you have a bunch of small files that you want to read, you can cat each one alone (boring), do a cat * and lose track of which line belongs to which file, or do a grep . *. The "." matches any non-empty line, and grep in multi-file mode puts a $filename: before each matched line. It works recursively too (see the sketch below)!! Show Sample Output


    18
    grep . *
    theist · 2011-09-01 09:16:04 4
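
    The recursive case needs grep's -r switch; a minimal sketch, assuming GNU grep (conf.d/ is just a placeholder directory name):
    grep -r . conf.d/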
  • This command lets you see and scroll through all of the strings that are stored in RAM at any given time. Press the space bar to scroll through to see more pages (or use the arrow keys, etc.). Sometimes, if you didn't save a file you were working on, or want to get back something you closed, it can be found floating around in here! The awk command only shows lines that are longer than 20 characters (to avoid seeing lots of junk that probably isn't "human readable"). If you want to dump the whole thing to a file, replace the final '| less' with '> memorydump'. This is great for searching through many times (and with the added bonus that it doesn't overwrite any memory...). Here's a neat example that shows up conversations that were had in pidgin (it will probably work even after pidgin has been closed); the full pipeline is written out just below the command. Depending on sudo settings, it might be best to run sudo su first to get to a # prompt.


    15
    sudo cat /proc/kcore | strings | awk 'length > 20' | less
    nesquick · 2009-03-09 02:19:47 5
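
    The pidgin example from the description, written out as a single pipeline:
    sudo cat /proc/kcore | strings | grep '([0-9]\{2\}:[0-9]\{2\}:[0-9]\{2\})'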
  • Files containing ASCII art (e.g. with a .nfo extension) are typically not reproduced correctly at the command line when using cat. With iconv one can easily write a wrapper to solve this; the script is written out just below the command.


    8
    iconv -f437 -tutf8 asciiart.nfo
    speaker · 2009-07-11 23:50:05 1
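
    The wrapper script from the description, laid out as a runnable file (the only change is testing "$1" instead of the original "$@", which breaks when several files are passed):
    #!/bin/bash
    # convert code page 437 (DOS) to UTF-8 so ASCII art renders correctly
    if [ -z "$1" ]; then
        echo "Usage: $(basename $0) file [file] ..."
    else
        iconv -f437 -tutf8 "$@"
    fi
    exit 0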
  • The format is JJJJJ YR-MO-DA HH:MM:SS TT L DUT1 msADV UTC(NIST) OTM and is explained more fully here: http://tf.nist.gov/service/acts.htm Show Sample Output


    8
    cat </dev/tcp/time.nist.gov/13
    drewk · 2009-12-03 21:40:14 6
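
    A sketch for pulling out just the date and time fields, assuming the field layout above ($2 is YR-MO-DA, $3 is HH:MM:SS; the NF test skips any blank lines the server sends):
    cat </dev/tcp/time.nist.gov/13 | awk 'NF {print $2, $3}'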
  • SHMore: a pager built from shell builtins. Usage: cat mod_log_config.c | shmore, or shmore < mod_log_config.c. Most pagers like less, more, most, and others require additional processes to be loaded and additional CPU time, and, as if that weren't bad enough, most of them modify the output in ways that can be undesirable. What I wanted was a "more" pager that was basically the same as running cat file, without modifying the output and without additional processes being created, CPU used, etc. Normally, to scroll the output of cat file without modifying the output, I would have to scroll back my terminal or screen buffer, because less modifies the output. After looking over many examples, ranging from builtin cat functions created for csh, zsh, ksh, sh, and bash from the 80s and 90s to more recent examples shipped with bash 4, and after much trial and error, I finally came up with something that satisfied my objective. It automatically adjusts to the size of your terminal window by using the LINES variable (or 80 lines if that is empty). This is a great function that will work as long as your shell works, so it will work just fine if you are booted in single user mode and your /usr/bin directory is missing (where less and other pagers can be). Using builtins like this is fantastic and is comparable to how busybox works: as long as your shell works, this will work. One caveat/note: I always have access to a color terminal, and I always set up both the termcap and the terminfo packages for color terminals (and/or ncurses and slang), so I stuck the tput setab 4; tput setaf 7 commands at the beginning of the function, so they only run one time, and that causes the -- SHMore -- prompt to have a blue background and bright white text. This is one of hundreds of functions in my .bash_profile at AskApache.com (http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html), but it actually won't be included till the next update. If you can improve this in any way at all please let me know, I would be very grateful! (One thing I want is to be able to continue to the next screen by pressing any key instead of having to press enter.) A commented layout of the function follows the one-liner below. Show Sample Output


    6
    shmore(){ local l L M="`echo;tput setab 4&&tput setaf 7` --- SHMore --- `tput sgr0`";L=2;while read l;do echo "${l}";((L++));[[ "$L" == "${LINES:-80}" ]]&&{ L=2;read -p"$M" -u1;echo;};done;}
    AskApache · 2010-04-21 00:40:37 1
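
    The same one-liner laid out and commented for readability (behaviour unchanged):
    shmore()
    {
        local l L M
        # build the pager prompt once: blue background, bright white text
        M="`echo; tput setab 4 && tput setaf 7` --- SHMore --- `tput sgr0`"
        L=2
        while read l; do
            echo "${l}"        # pass each line through unmodified
            ((L++))
            # after a screenful (LINES, or 80 if unset), pause until Enter is pressed
            [[ "$L" == "${LINES:-80}" ]] && { L=2; read -p "$M" -u1; echo; }
        done
    }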
  • The tee command does fine with file names, but not so much with file descriptors such as &2 (stderr). This uses process substitution to tee to the specified descriptor. In the sample output it is being used to tee to stderr, which is connected to the terminal, and to wc -l, which also outputs to the terminal. The result is the output of bash --version followed by the line count (the complete pipeline is written out below). Show Sample Output


    5
    tee >(cat - >&2)
    camocrazed · 2010-07-20 17:22:31 3
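
    The usage described above, as a complete pipeline:
    bash --version | tee >(cat - >&2) | wc -l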

  • 5
    cat -n file.txt
    ztank1013 · 2011-09-14 20:38:41 0
  • Useful for detecting tabs in a seemingly empty line, or a DOS newline (carriage return + newline). A tool that can help you understand why your parsing is not working (a quick demonstration follows below). Show Sample Output


    4
    cat -v -t -e
    alperyilmaz · 2009-03-24 19:29:03 1
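
    A quick demonstration (the printf input is just an example): the tab shows up as ^I, the DOS carriage return as ^M, and each line end as $:
    printf 'a\tb\r\nplain\n' | cat -v -t -e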

  • 4
    cat -n
    putnamhill · 2009-12-08 16:35:55 2
  • This takes a webcam picture every time the mouse is moved (waiting 10 seconds between checks for movement) and stores the picture wherever you want it. Ideas: use it in conjunction with a Dropbox-type application to see who is using your computer; use /dev/input/mice if /dev/input/mouse* doesn't work; use the bones of this to make a simple screensaver.


    4
    while true; do sudo cat /dev/input/mouse0|read -n1;streamer -q -o /tmp/cam.jpeg -s 640x480 > /dev/null 2>&1; sleep 10;done
    SQUIIDUX · 2012-04-22 01:51:30 1

  • 3
    cat /proc/net/ip_conntrack | grep ESTABLISHED | grep -c -v ^#
    flamarion · 2009-07-29 20:21:25 2
  • Pipe any output to "grep ." and blank lines will not be printed.


    3
    cat filename | grep .
    fraktil · 2009-08-09 01:00:59 1
  • random(6) - random lines from a file or random numbers


    3
    random -f <file>
    haplo · 2009-09-24 19:15:58 4
  • This just reads in a local file and sends it via email. Works with text or binary. *Requires* local mail server.


    2
    cat filename | mail -s "Email subject" user@example.com
    topher1kenobe · 2009-09-20 01:38:23 0
  • This uses some tricks I found while reading the bash man page to enumerate and display all current environment variables, including those not listed by the 'env' command, which according to the bash docs are more for internal use by bash. The main trick is the way bash will list all environment variable names when performing expansion on ${!A*}. Then the eval builtin makes it work in a loop. I created a function for this (aae) and use it instead of env, by aliasing env. Given any parameters, it lists the variables that start with them: 'aae B' lists all env variables starting with B, 'aae {A..Z} {a..z}' lists all variables starting with any letter of the alphabet, and 'aae TERM' lists all variables starting with TERM. The aae function and the printenv-replacement alias are both written out just below the command. From: http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html Show Sample Output


    2
    for _a in {A..Z} {a..z};do _z=\${!${_a}*};for _i in `eval echo "${_z}"`;do echo -e "$_i: ${!_i}";done;done|cat -Tsv
    AskApache · 2010-10-27 07:16:54 0
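
    The aae function and the env alias from the description, laid out for readability:
    aae() {
        local __a __i __z
        for __a in "$@"; do
            __z=\${!${__a}*}                        # e.g. TERM -> the string ${!TERM*}
            for __i in `eval echo "${__z}"`; do     # expand to all variable names starting with $__a
                echo -e "$__i: ${!__i}"             # print NAME: value via indirect expansion
            done
        done
    }
    alias env='aae {A..Z} {a..z} "_" | sort | cat -v 2>&1 | sed "s/\\^\\[/\\\\033/g"'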
  • Randomizes a file. The opposite of sort is sort -R!


    2
    sort -R
    RyanM · 2011-07-15 15:35:27 0
  • This will open a new tab in Firefox for every line in a file. The sleep is removable, but I found that with a large list of URLs (50+) and no sleep it will try to open all the URLs at once, which makes them all load a lot slower; depending on your system's RAM, the sleep also gives you a chance to close tabs before they overload it. Removing the '& 2>/dev/null' part will yield unpredictable results. A slightly more robust while-read variant is sketched below.


    2
    for line in `cat $file`; do firefox -new-tab "$line" & 2>/dev/null; sleep 1; done
    hamsolo474 · 2011-11-12 13:47:24 0
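
    A minimal sketch of the more robust variant mentioned above, reading the file line by line and putting the redirection before the & (not the original submission; $file is the same variable as above):
    while read -r line; do firefox -new-tab "$line" 2>/dev/null & sleep 1; done < "$file"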
  • Merge Multiple PDFs In Alphabetical Order


    2
    pdftk *.pdf cat output merged.pdf
    o0110o · 2014-03-02 01:53:37 0
  • Takes input from the connected terminal and dumps it to the specified file. Stop writing and close file with control + D or the end of line character. Useful for copying+pasting large blobs of text over SSH to a new machine. Show Sample Output


    1
    cat /dev/tty > FILE
    Jo · 2009-02-25 01:43:47 5
  • Count the lines of your source and header files. This ignores blank lines, C++-style comments, and single-line C-style comments. It will not ignore blank lines containing tabs or multi-line C-style comments. A variant with the -name tests grouped in parentheses is sketched below.


    1
    find /usr/include/ -name '*.[c|h]pp' -o -name '*.[ch]' -print0 | xargs -0 cat | grep -v "^ *$" | grep -v "^ *//" | grep -v "^ */\*.*\*/" | wc -l
    unixmonkey44446 · 2013-06-17 08:37:37 0
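
    A sketch with the -name tests grouped in parentheses; without them, find's -o precedence means -print0 only applies to the last -name, so the .cpp/.hpp files are never counted (the bracket pattern is also spelled out, since [c|h] is a character class, not alternation):
    find /usr/include/ \( -name '*.cpp' -o -name '*.hpp' -o -name '*.[ch]' \) -print0 | xargs -0 cat | grep -v "^ *$" | grep -v "^ *//" | grep -v "^ */\*.*\*/" | wc -l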
  • Prepend text to a file. It doesn't need temporary files, ed or sed (though writing the result back to the same file does; see the note below).


    1
    echo "text to prepend" | cat - file
    leni536 · 2013-12-18 15:54:17 3
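
    To write the result back to the original file, a common follow-up is to go through a temporary file (file.tmp is an arbitrary name):
    echo "text to prepend" | cat - file > file.tmp && mv file.tmp file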
  • On some distros you have to replace "BogoMIPS" with "bogomips"; a case-insensitive variant is shown below. Show Sample Output


    1
    cat /proc/cpuinfo | grep BogoMIPS | uniq | sed 's/^.*://g' | awk '{print($1 / 4) }'
    derpat · 2014-03-01 03:44:03 0
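
    A case-insensitive variant that sidesteps the BogoMIPS/bogomips spelling difference (the divide-by-4 is kept from the original):
    grep -i bogomips /proc/cpuinfo | uniq | sed 's/^.*://g' | awk '{print($1 / 4) }'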
  • Save the script as: sort_file. Usage: sort_file < sort_me.csv > out_file.csv. This script was originally posted by Admiral Beotch in the Linux-Software forum on LinuxQuestions.org; I modified it to make it more portable. Show Sample Output


    0
    infile=$1
    for i in $(cat $infile)
    do
        echo $i | tr "," "\n" | sort -n | tr "\n" "," | sed "s/,$//"
        echo
    done
    iframe · 2009-07-12 21:23:37 0
  • uuencode the file to appear as an attachment


    0
    cat filename | uuencode filename | mail -s "Email subject" user@example.com
    amaymon · 2009-09-21 04:13:50 6
  • The sort utility is well used, but sometimes you want a little chaos. This will randomize the lines of a text file. BTW, on OS X there is no 'sort -R' option, and there is no 'shuf' either; these are only in the newer GNU coreutils. This is also faster than the awk alternative written out below. Show Sample Output


    0
    cat ~/SortedFile.txt | perl -wnl -e '@f=<>; END{ foreach $i (reverse 0 .. $#f) { $r=int rand ($i+1); @f[$i, $r]=@f[$r,$i] unless ($i==$r); } chomp @f; foreach $line (@f){ print $line; }}'
    drewk · 2009-09-24 15:42:43 2
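
    The slower awk alternative mentioned above, written out as a complete pipeline:
    cat ~/SortedFile.txt | awk 'BEGIN { srand() } { print rand() "\t" $0 }' | sort -n | cut -f2-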

