Commands using cat (514)

  • Install json-to-js as an npm global package.


    0
    cat data.json | json-to-js | pbcopy
    minademian · 2018-12-14 15:55:41 42
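    The global install the description refers to would typically be done like this (assuming the package is published on npm under the same name, json-to-js):
        npm install -g json-to-js
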
  • Plain old `unzip` won't unzip data arriving over a pipe: the ZIP file format includes a directory (index) at the end of the archive. This directory says where, within the archive, each file is located and thus allows for quick random access without reading the entire archive. This would appear to pose a problem when reading a ZIP archive through a pipe, in that the index is not accessed until the very end, so individual members cannot be correctly extracted until after the file has been entirely read and is no longer available. As such, it is unsurprising that most ZIP decompressors simply fail when the archive is supplied through a pipe. However, the directory at the end of the archive is not the only place where file metadata is stored: for redundancy, each individual entry also carries this information in a local file header. From the `jar` manpage: > The jar command is a general-purpose archiving and compression tool, based on ZIP and the ZLIB compression format. jar is smart enough to use these local file headers when the index is unavailable because it is reading from a pipe. (Most of this explanation is taken from https://serverfault.com/a/589528/314226 , which recommends `bsdtar` instead, but that is not always available; see the note after this entry.)


    0
    cat foo.zip | jar xv
    bbbco · 2019-01-14 22:08:19 33
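    Where it is available, the bsdtar alternative mentioned in the description handles piped archives the same way by reading the local file headers; a rough equivalent:
        cat foo.zip | bsdtar -xvf -
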
  • Overwrites the remote file without asking! Uses an HTTPS proxy that supports CONNECT. Note that this actually uses SSH, not SFTP, to upload the file.


    0
    cat myFile.json | ssh root@remoteSftpServer -o "ProxyCommand=nc.openbsd -X connect -x proxyhost:proxyport %h %p" 'cat > myFile.json'
    casueps · 2020-01-22 11:00:20 106
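    The same ProxyCommand trick works in the other direction; a sketch (same hypothetical proxy and host names) for pulling the file back down instead of uploading it:
        ssh root@remoteSftpServer -o "ProxyCommand=nc.openbsd -X connect -x proxyhost:proxyport %h %p" 'cat myFile.json' > myFile.json
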
  • Especially good for exported IPython files.


    0
    grep -v '^# In' viz.txt | cat -s > out.txt
    shantanuo · 2022-06-08 04:01:11 498
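    The -s flag is what squeezes the runs of blank lines left behind once the "# In" markers are removed; a minimal demonstration:
        printf 'a\n\n\n\nb\n' | cat -s    # prints 'a', one blank line, then 'b'
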
  • This command works only if a "DROP TABLE IF EXISTS" line exists for every table in the mysqldump file. The awk script acts like a state machine.


    0
    cat db_dump.sql | awk '/DROP TABLE IF EXISTS/ { skip = $5 ~ /table1|table2/ } !skip { print $0 }' > db_dump_filtered.sql
    stf42 · 2022-10-30 16:58:57 800
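    The same awk program spelled out with comments (logic unchanged, just formatted for readability):
        awk '
          /DROP TABLE IF EXISTS/ { skip = ($5 ~ /table1|table2/) }  # update the skip state at each table boundary
          !skip { print $0 }                                        # print lines only while not skipping
        ' db_dump.sql > db_dump_filtered.sql
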
  • Numbers the lines in said file and tails the last 100 (or however many are messed up); then take the total number of lines, subtract the 100 or so lines you DON'T want, do a head -n $new_number, and redirect that to a new file.db (a corrected sketch follows this entry).


    -1
    cat -n $file | tail -n 100 && head -n number-of-lines-you-want-to-keep > newfile
    bbelt16ag · 2009-02-15 01:02:10 9
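    As written, the head after "&&" never reads the file. A working sketch of the idea described above, assuming the goal is to keep everything except the last 100 lines of a hypothetical file.db:
        total=$(wc -l < file.db)                       # count lines in the file
        head -n $((total - 100)) file.db > newfile.db  # keep all but the last 100 lines
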
  • In July 2008, there was an uproar over Foxconn motherboards feeding Linux installs incorrect ACPI information (http://ubuntu-virginia.ubuntuforums.org/showthread.php?t=869249). Foxconn has gladly corrected their mistake, but make sure it's not happening on your motherboard! After running the command, just view the 'dsdt.dsl' in any editor you like.


    -1
    sudo aptitude -y install iasl && sudo cat /sys/firmware/acpi/tables/DSDT > dsdt.dat && iasl -d dsdt.dat
    brettalton · 2009-02-15 23:13:50 13
  • Be aware that using the --password argument will show your password in plain text on the screen. Use the -p argument instead; it will prompt you to enter your password without echoing it.


    -1
    cat schema.sql data.sql test_data.sql | mysql -u user --password=pass dbname
    tristan_ph · 2009-03-24 08:39:40 6
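    The safer -p form suggested in the description, which prompts for the password instead of putting it on the command line:
        cat schema.sql data.sql test_data.sql | mysql -u user -p dbname
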
  • I'm sure almost everybody knows this by now. This command will pull the password for the admin login of any Plesk machine.


    -1
    cat /etc/psa/.psa.shadow
    jigglebilly · 2009-04-30 18:08:12 4
  • This is useful for displaying the portion of FILE surrounding an error at line NUMBER.


    -1
    cat -n FILE | grep -C3 "^[[:blank:]]\{1,5\}NUMBER[[:blank:]]"
    lv4tech · 2009-05-17 18:19:55 8
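    For a concrete line number the same context can also be printed with sed; a rough equivalent for line 42 with three lines of context on each side:
        sed -n '39,45p' FILE
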
  • Maybe you have a CSV file with addresses, where you have to process each contact (one per line) and write each field to its own variable. Of course you can define every variable by hand, but this way is simpler and faster to write. A more readable layout of the command:

    VARNAMES='ID FORENAME LASTNAME ADDRESS CITY PHONE MOBILE MAIL ...'
    cat customer.csv | while read LINE ; do
        COUNT=1
        for VAR in $VARNAMES ; do
            eval "${VAR}=`echo $LINE | /usr/bin/awk {'print $'$COUNT''}`"
            let COUNT=COUNT+1
        done
    done

    VARNAMES holds the variable names. Pay attention: the number of names in VARNAMES has to match the number of fields in the CSV file. If the CSV is separated by something other than whitespace, you can set the separator after the awk binary, with -F"_" for example.


    -1
    VARNAMES='ID FORENAME LASTNAME ADDRESS CITY PHONE MOBILE MAIL' ; cat customer.csv | while read LINE ; do COUNT=1 ; for VAR in $VARNAMES ; do eval "${VAR}=`echo $LINE | /usr/bin/awk {'print $'$COUNT''}`" ; let COUNT=COUNT+1 ; done ; done
    GeckoDH · 2009-05-19 11:23:00 4
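    A simpler sketch of the same idea using read's own field splitting, assuming a semicolon-separated file whose column order matches the variable names:
        while IFS=';' read -r ID FORENAME LASTNAME ADDRESS CITY PHONE MOBILE MAIL ; do
            echo "$ID: $FORENAME $LASTNAME"    # process each contact here
        done < customer.csv
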
  • avoid mouse abuse and the constant struggle of balancing scroll velocity ... not to mention that burning sensation in your upper right shoulder ....


    -1
    cat large.xml | xclip
    copremesis · 2009-07-08 16:30:07 8
  • If you are downloading a big file (or even a small one) and the connection breaks or times out, use this command to RESUME the download where it failed, instead of having to start downloading from the beginning. This is a real win when downloading Debian ISO images over a buggy DSL modem. Take the partially downloaded file and cat it into the STDIN of curl, as shown, then use the "-C -" option followed by the URL of the file you were originally downloading.


    -1
    cat file-that-failed-to-download.zip | curl -C - http://www.somewhere.com/file-I-want-to-download.zip >successfully-downloaded.zip
    linuxrawkstar · 2009-08-05 13:33:06 15
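    If the partial download is still on disk under its final name, curl can also resume in place without the cat; the more common form is:
        curl -C - -O http://www.somewhere.com/file-I-want-to-download.zip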

  • -1
    echo capitalize | { dd bs=1 count=1 conv=ucase 2> /dev/null; cat ;}
    twfcc · 2009-09-05 01:49:53 40
  • Some malicious programs append an iframe or script tag to your web pages on a server; use this command to clean them in batch.


    -1
    for f in *.html; do head -n -1 $f > temp; cat temp > $f; rm temp; done
    Sunng · 2009-10-12 12:49:18 5
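    Where GNU sed is available, the temp file can be skipped by deleting the last line in place; an alternative sketch:
        for f in *.html; do sed -i '$d' "$f"; done
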
  • Yep, it's harder, but using a pipe this way is a lot more flexible.


    -1
    cat infile | while read str; do echo "$((++i)) - $str" ; done;
    glaudiston · 2009-12-09 14:05:09 3
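    If all you need is the numbering, nl or cat -n produce similar output (the separator differs slightly) without a shell loop; the loop form is what you reach for when each line needs more processing:
        nl -ba infile    # or: cat -n infile
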
  • Get memory info.


    -1
    cat /proc/meminfo
    svnlabs · 2010-01-22 16:48:03 3
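    To pull out just the headline figures rather than the whole file, a small variation:
        grep -E 'MemTotal|MemFree|MemAvailable' /proc/meminfo
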
  • You don't need to create an intermediate file; just pipe the output directly to the tar command and use stdin as the file (put a dash after the f flag).


    -1
    cat 1.tar.gz 2.tar.gz | tar zxvif -
    psychopenguin · 2010-05-09 03:50:00 5
  • -r enables extended regex; ^ anchors the start of the line; | is alternation: match 100, or a digit 0-9 one or two times.


    -1
    cat file | sed -n -r '/^100$|^[0-9]{1,2}$/p'
    voyeg3r · 2010-05-15 19:15:56 5
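    The same filter expressed with grep for comparison (extended regex, printing only lines that are 100 or a one- or two-digit number):
        grep -E '^100$|^[0-9]{1,2}$' file
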
  • thx Montecristo, thx hckhckhck


    -1
    cat > {filename} {your text} [^C | ^D]
    sphere64 · 2010-06-03 09:02:12 3
  • It works on every Linux box.


    -1
    cat /proc/cpuinfo
    magicjohnson_ · 2010-09-24 09:27:58 3
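    Commonly combined with grep to answer a specific question, e.g. counting logical CPUs:
        grep -c '^processor' /proc/cpuinfo
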
  • A simple script for streaming a movie.


    -1
    cat video.ogg | nc -l -p 4232 & wget http://users.bshellz.net/~bazza/?nombre=name -O - & sleep 10; mplayer http://users.bshellz.net/~bazza/datos/name.ogg
    el_bazza · 2010-11-29 03:34:31 5
  • This command deletes the newline characters, so its output may be unusable :)


    -1
    cat file | tr -d "\n"
    uzsolt · 2010-12-02 09:22:02 3
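    If you want the lines joined but still readable, translating the newlines to spaces instead of deleting them is a small variation:
        tr '\n' ' ' < file
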
  • Change the drive as you wish. Uses the pv command to measure read speed; you must install pv first. http://www.bayner.com/ kerim@bayner.com


    -1
    cat /dev/sda | pv -r > /dev/null
    kerim · 2011-01-23 22:58:56 5
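    pv can also read the device directly, which avoids the extra cat; an equivalent sketch:
        pv -r /dev/sda > /dev/null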

  • -1
    grabtweets() { curl -s -o $GT_TMP twitter.com/$1 | cat $GT_TMP | grep entry-content | sed -e :loop -e 's/<[^>]*>//g;/</N;//bloop' | sed 's/^[ \t]*//'; }
    gl101 · 2011-05-04 21:49:08 5