Commands using cat (514)

  • Plain old `unzip` can't unzip data arriving on STDIN: the ZIP file format includes a directory (index) at the end of the archive. This directory records where, within the archive, each file is located, and thus allows quick random access without reading the entire archive. That poses a problem when reading a ZIP archive through a pipe: the index is not seen until the very end, so individual members cannot be correctly extracted until the file has been entirely read and is no longer available. Unsurprisingly, most ZIP decompressors simply fail when the archive is supplied through a pipe. However, the directory at the end of the archive is not the only place where file metadata is stored: for redundancy, each entry also carries this information in a local file header. From the `jar` manpage: > The jar command is a general-purpose archiving and compression tool, based on ZIP and the ZLIB compression format. `jar` is smart enough to fall back on these local file headers when the index is unavailable because it is reading from a pipe. (Most of this explanation is taken from https://serverfault.com/a/589528/314226 , which recommends `bsdtar` instead, but that is not always available; see the sketch after this entry.)


    0
    cat foo.zip | jar xv
    bbbco · 2019-01-14 22:08:19 33
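
    For comparison, here is the `bsdtar` variant the description mentions; it also reads the local file headers and so copes with a pipe. The URL is just a placeholder:

        curl -s https://example.com/archive.zip | bsdtar -xvf -
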
  • Overwrites the remote file without asking! Uses an HTTPS proxy that supports CONNECT. Note that it actually uses SSH, not SFTP, to upload the file.


    0
    cat myFile.json | ssh root@remoteSftpServer -o "ProxyCommand=nc.openbsd -X connect -x proxyhost:proxyport %h %p" 'cat > myFile.json'
    casueps · 2020-01-22 11:00:20 107
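
    The ProxyCommand can also live in ~/.ssh/config, which keeps the one-liner short. A sketch using the same openbsd netcat (host and proxy names are placeholders from the command above):

        # ~/.ssh/config
        Host remoteSftpServer
            ProxyCommand nc.openbsd -X connect -x proxyhost:proxyport %h %p

        cat myFile.json | ssh root@remoteSftpServer 'cat > myFile.json'
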
  • Especially good for exported IPython files; `cat -s` squeezes the leftover runs of blank lines (see the demo after this entry).


    0
    grep -v '^# In' viz.txt | cat -s > out.txt
    shantanuo · 2022-06-08 04:01:11 509
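
    A quick way to see what `cat -s` does on its own:

        printf 'a\n\n\n\nb\n' | cat -s    # prints a, one blank line, then b
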
  • This command works only if a "DROP TABLE IF EXISTS" line exists for every table in the mysqldump file. It acts like a small state machine: each DROP TABLE line sets or clears the skip flag, which then suppresses or passes everything up to the next one.


    0
    cat db_dump.sql | awk '/DROP TABLE IF EXISTS/ { skip = $5 ~ /table1|table2/ } !skip { print $0 }' > db_dump_filtered.sql
    stf42 · 2022-10-30 16:58:57 801
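
    The same skip-flag idiom generalizes to any block-structured dump; the table names here are hypothetical, and `!skip` with no action is equivalent to the original's `{ print $0 }`:

        awk '/DROP TABLE IF EXISTS/ { skip = $5 ~ /logs|cache/ } !skip' db_dump.sql > db_dump_filtered.sql
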
  • `cat -n` tells you the number of lines in the file; then tail the last 100 lines (or however many are messed up) to inspect the damage. Take the total line count, subtract the lines you DON'T want, then head -n that number and redirect it to new_file.db. An arithmetic version is sketched after this entry.


    -1
    cat -n "$file" | tail -n 100 && head -n number-of-lines-you-want-to-keep "$file" > newfile.db
    bbelt16ag · 2009-02-15 01:02:10 9
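
    A sketch that does the subtraction for you, assuming file.db is the damaged file and the last 100 lines are the bad ones:

        head -n "$(( $(wc -l < file.db) - 100 ))" file.db > newfile.db
        head -n -100 file.db > newfile.db    # GNU head shortcut for the same thing
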
  • In July 2008, there was an uproar over Foxconn motherboards feeding Linux installs incorrect ACPI information (http://ubuntu-virginia.ubuntuforums.org/showthread.php?t=869249). Foxconn has gladly corrected their mistake, but make sure it's not happening on your motherboard! After running the command, just view the 'dsdt.dsl' in any editor you like.


    -1
    sudo aptitude -y install iasl && sudo cat /sys/firmware/acpi/tables/DSDT > dsdt.dat && iasl -d dsdt.dat
    brettalton · 2009-02-15 23:13:50 13
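
    If the acpica-tools package is available, acpidump can do the extraction step without reading /sys directly; a hedged alternative, assuming your acpidump build supports -b (dump tables to binary files):

        sudo acpidump -b && iasl -d dsdt.dat
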
  • Be aware that the --password argument exposes your password in plain text on the screen (and in the process list). Use the -p argument instead; it will prompt you to enter your password without echoing it (see the variant after this entry).


    -1
    cat schema.sql data.sql test_data.sql | mysql -u user --password=pass dbname
    tristan_ph · 2009-03-24 08:39:40 6
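
    The prompting variant the description recommends looks like this:

        cat schema.sql data.sql test_data.sql | mysql -u user -p dbname
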
  • I'm sure almost everybody knows this by now. This command will pull the password for the admin login of any Plesk machine.


    -1
    cat /etc/psa/.psa.shadow
    jigglebilly · 2009-04-30 18:08:12 4
  • This is useful for displaying a portion of a FILE that contains an error at line NUMBER (a concrete run follows this entry).


    -1
    cat -n FILE | grep -C3 "^[[:blank:]]\{1,5\}NUMBER[[:blank:]]"
    lv4tech · 2009-05-17 18:19:55 8
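
    A concrete run of the same pattern, with a hypothetical log file and line number:

        cat -n app.log | grep -C3 "^[[:blank:]]\{1,5\}42[[:blank:]]"    # 3 lines of context around line 42
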
  • Maybe you have a CSV file with addresses, where you have to process each contact (one per line) and write each value to its own variable. Of course you could define every variable by hand, but this way is simpler and faster to write. VARNAMES holds the variable names; expanded for readability, the one-liner below reads:

        VARNAMES='ID FORENAME LASTNAME ADDRESS CITY PHONE MOBILE MAIL ...'
        cat customer.csv | while read LINE ; do
            COUNT=1
            for VAR in $VARNAMES ; do
                eval "${VAR}=`echo $LINE | /usr/bin/awk {'print $'$COUNT''}`"
                let COUNT=COUNT+1
            done
        done

    Pay attention: the number of names in VARNAMES has to match the number of fields in the CSV file. If the CSV is not whitespace-separated, set the separator after the awk binary, e.g. -F";". An eval-free sketch follows this entry.


    -1
    VARNAMES='ID FORENAME LASTNAME ADDRESS CITY PHONE MOBILE MAIL' ; cat customer.csv | while read LINE ; do COUNT=1 ; for VAR in $VARNAMES ; do eval "${VAR}=`echo $LINE | /usr/bin/awk {'print $'$COUNT''}`" ; let COUNT=COUNT+1 ; done ; done
    GeckoDH · 2009-05-19 11:23:00 4
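
    A sketch of the same idea without eval, using read's own field splitting; it assumes a ';'-separated file and the field names from above:

        while IFS=';' read -r ID FORENAME LASTNAME ADDRESS CITY PHONE MOBILE MAIL; do
            echo "$FORENAME $LASTNAME <$MAIL>"
        done < customer.csv
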
  • Avoid mouse abuse and the constant struggle of balancing scroll velocity... not to mention that burning sensation in your upper right shoulder.


    -1
    cat large.xml | xclip
    copremesis · 2009-07-08 16:30:07 8
  • If you are downloading a big file (or even a small one) and the connection breaks or times out, use this command to RESUME the download where it failed, instead of having to start from the beginning. This is a real win for downloading debian ISO images over a buggy DSL modem. Take the partially downloaded file and cat it into the STDIN of curl, as shown; then use the "-C -" option followed by the URL of the file you were originally downloading. (A modern equivalent is sketched after this entry.)


    -1
    cat file-that-failed-to-download.zip | curl -C - http://www.somewhere.com/file-I-want-to-download.zip >successfully-downloaded.zip
    linuxrawkstar · 2009-08-05 13:33:06 16
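
    Current curl can resume on its own when given the output filename with -o, so the cat trick is usually unnecessary; a hedged equivalent using the same names as above:

        curl -C - -o file-I-want-to-download.zip http://www.somewhere.com/file-I-want-to-download.zip
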

  • -1
    echo capitalize | { dd bs=1 count=1 conv=ucase 2> /dev/null; cat ;}
    twfcc · 2009-09-05 01:49:53 40
  • Some malicious programs append an iframe or script tag to your web pages on the server; use this command to clean them in batch. (A GNU sed one-liner that edits in place follows this entry.)


    -1
    for f in *.html; do head -n -1 "$f" > temp; cat temp > "$f"; rm temp; done
    Sunng · 2009-10-12 12:49:18 5
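
    With GNU sed the last line can be dropped in place, no temp file needed; a sketch:

        sed -i '$d' *.html
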
  • Yep, it's harder, but using a pipe this way is far more flexible.


    -1
    cat infile | while read str; do echo "$((++i)) - $str" ; done;
    glaudiston · 2009-12-09 14:05:09 3
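
    If you don't need the flexibility, the standard tools number lines on their own:

        nl -ba infile     # number all lines, blanks included
        cat -n infile     # same idea
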
  • Get memory info.


    -1
    cat /proc/meminfo
    svnlabs · 2010-01-22 16:48:03 3
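
    To pull just the headline figures (the MemAvailable field assumes a reasonably recent kernel):

        grep -E 'MemTotal|MemAvailable' /proc/meminfo
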
  • You don't need to create an intermediate file; just pipe the output directly to the tar command and use stdin as the file (put a dash after the f flag).


    -1
    cat 1.tar.gz 2.tar.gz | tar zxvif -
    psychopenguin · 2010-05-09 03:50:00 5
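
    GNU tar can also concatenate archives natively with -A, though only uncompressed ones; a sketch, assuming 1.tar and 2.tar are plain tar archives:

        tar -Af 1.tar 2.tar    # appends the members of 2.tar onto 1.tar
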
  • -r enables extended regex; ^ anchors the match at the beginning of the line; | is alternation: match 100, or one or two digits (0-9), i.e. the integers 0-100. (The grep equivalent follows this entry.)


    -1
    cat file | sed -n -r '/^100$|^[0-9]{1,2}$/p'
    voyeg3r · 2010-05-15 19:15:56 5
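
    The grep equivalent of the same 0-100 match:

        grep -E '^(100|[0-9]{1,2})$' file
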
  • thx Montecristo, thx hckhckhck


    -1
    cat > filename    # type your text, then press ^D to save (or ^C to abort)
    sphere64 · 2010-06-03 09:02:12 3
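
    A related idiom for scripted (non-interactive) use is a here-document, which avoids the interactive ^D; a minimal sketch with a placeholder filename:

        cat > notes.txt <<'EOF'
        first line of text
        second line of text
        EOF
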
  • It works on every Linux box.


    -1
    cat /proc/cpuinfo
    magicjohnson_ · 2010-09-24 09:27:58 3
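
    A common follow-up is counting the logical CPUs:

        grep -c '^processor' /proc/cpuinfo
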
  • A simple script for streaming a movie over the network.


    -1
    cat video.ogg | nc -l -p 4232 & wget http://users.bshellz.net/~bazza/?nombre=name -O - & sleep 10; mplayer http://users.bshellz.net/~bazza/datos/name.ogg
    el_bazza · 2010-11-29 03:34:31 5
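
    Stripped of the web relay, the same idea is just netcat on both ends; a sketch with a hypothetical receiver host (the -l -p syntax assumes traditional/GNU netcat; openbsd nc drops the -p):

        cat video.ogg | nc -l -p 4232      # on the sender
        nc sender-host 4232 | mplayer -    # on the receiver
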
  • This command deletes the newline chars, so its output may be unusable :)


    -1
    cat file | tr -d "\n"
    uzsolt · 2010-12-02 09:22:02 3
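
    The cat is unnecessary here; tr can read the file itself via redirection:

        tr -d '\n' < file
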
  • Change the drive letter as you wish. Uses the pv command to measure read speed; you must install pv first.


    -1
    cat /dev/sda | pv -r > /dev/null
    kerim · 2011-01-23 22:58:56 5
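
    Without pv, GNU dd (coreutils 8.24 or later) can report progress itself; a rough equivalent:

        sudo dd if=/dev/sda of=/dev/null bs=1M status=progress
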

  • -1
    grabtweets() { curl -s "twitter.com/$1" | grep entry-content | sed -e :loop -e 's/<[^>]*>//g;/</N;//bloop' | sed 's/^[ \t]*//'; }
    gl101 · 2011-05-04 21:49:08 5
  • Tired of opening tabs and filling in search forms by hand? Just pipe the search terms you need into this surfraw loop. You can use any browser you have installed, but a graphical browser with a tabbed interface will come in handy. surfraw can be found here: http://surfraw.alioth.debian.org (A more robust quoting variant follows this entry.)


    -1
    cat search_items.txt | while read i; do surfraw google -browser=firefox $i; done
    bubo · 2011-05-12 09:27:08 2
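
    A slightly more robust variant quotes the term (so multi-word searches stay together) and uses read -r so backslashes survive:

        while read -r i; do surfraw google -browser=firefox "$i"; done < search_items.txt
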