Commands using cat (514)

  • Install json-to-js as a global npm package.


    0
    cat data.json | json-to-js | pbcopy
    minademian · 2018-12-14 15:55:41 42
  • Plain old `unzip` won't unpack a ZIP arriving through a pipe: the ZIP file format includes a directory (index) at the end of the archive. This directory says where, within the archive, each file is located, and thus allows quick, random access without reading the entire archive. That poses a problem when reading a ZIP archive through a pipe, in that the index is not reached until the very end, so individual members cannot be correctly extracted until the file has been entirely read and is no longer available. It is thus unsurprising that most ZIP decompressors simply fail when the archive is supplied through a pipe. However, the directory at the end of the archive is not the only place where file meta information is stored: for redundancy, each individual entry also carries this information in a local file header. From the `jar` manpage: > The jar command is a general-purpose archiving and compression tool, based on ZIP and the ZLIB compression format. jar is smart enough to use these local file headers when the index is unavailable, as when reading through a pipe. (Most of this explanation is taken from https://serverfault.com/a/589528/314226 , which recommends `bsdtar` instead, though that is not always available on systems; see the sketch below.)


    0
    cat foo.zip | jar xv
    bbbco · 2019-01-14 22:08:19 33
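
    A sketch of the `bsdtar` alternative mentioned above; bsdtar (from
    libarchive) streams ZIPs using those same local file headers, though as
    noted it is not always installed:

        # assumes bsdtar is present; reads the archive from the pipe
        cat foo.zip | bsdtar -xvf -
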
  • Overwrites the remote file without asking! Uses an HTTPS proxy that supports CONNECT. Note that it actually uses SSH, not SFTP, to upload the file.


    0
    cat myFile.json | ssh root@remoteSftpServer -o "ProxyCommand=nc.openbsd -X connect -x proxyhost:proxyport %h %p" 'cat > myFile.json'
    casueps · 2020-01-22 11:00:20 107
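
    The same proxy hop can be made permanent in ~/.ssh/config; a sketch
    using the example host and proxy names from the command above:

        # ~/.ssh/config -- remoteSftpServer/proxyhost/proxyport are placeholders
        Host remoteSftpServer
            ProxyCommand nc.openbsd -X connect -x proxyhost:proxyport %h %p

    after which the upload shortens to:

        cat myFile.json | ssh root@remoteSftpServer 'cat > myFile.json'
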
  • Especially good for exported IPython files.


    0
    grep -v '^# In' viz.txt | cat -s > out.txt
    shantanuo · 2022-06-08 04:01:11 507
  • This command works only if a "DROP TABLE IF EXISTS" line exists for every table in the mysqldump file. It acts like a state machine; see the annotated sketch below.


    0
    cat db_dump.sql | awk '/DROP TABLE IF EXISTS/ { skip = $5 ~ /table1|table2/ } !skip { print $0 }' > db_dump_filtered.sql
    stf42 · 2022-10-30 16:58:57 801
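
    How the state machine works, as an annotated sketch (table1/table2 are
    the example names): each DROP TABLE line re-evaluates the skip flag from
    field 5 (the backquoted table name), and the flag holds until the next
    DROP TABLE line; "!skip" with no action block means "print the line".

        awk '/DROP TABLE IF EXISTS/ { skip = ($5 ~ /table1|table2/) }
             !skip' db_dump.sql > db_dump_filtered.sql
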
  • Tells you the number of lines in said file, then tails the last 100 lines (or however many are messed up). Then take the total number of lines, subtract the 100 or so lines you DON'T want, do a head -n $new_number, and redirect it to the new file.db. A one-step variant is sketched below.


    -1
    cat -n "$file" | tail -n 100 && head -n number-of-lines-you-want-to-keep "$file" > newfile
    bbelt16ag · 2009-02-15 01:02:10 9
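
    The one-step variant, as a sketch (assumes GNU coreutils, where head
    accepts a negative count):

        # keep everything except the last 100 lines
        head -n -100 file.db > newfile.db
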
  • In July 2008, there was an uproar over Foxconn motherboards feeding Linux installs incorrect ACPI information (http://ubuntu-virginia.ubuntuforums.org/showthread.php?t=869249). Foxconn has gladly corrected their mistake, but make sure it's not happening on your motherboard! After running the command, just view the 'dsdt.dsl' in any editor you like.


    -1
    sudo aptitude -y install iasl && sudo cat /sys/firmware/acpi/tables/DSDT > dsdt.dat && iasl -d dsdt.dat
    brettalton · 2009-02-15 23:13:50 13
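
    On current distributions the compiler typically comes from the
    acpica-tools package (package name is an assumption; it varies by
    distro), whose acpidump can pull the tables without reading /sys by
    hand; a sketch:

        # -b dumps each ACPI table to a binary file (dsdt.dat among them)
        sudo acpidump -b && iasl -d dsdt.dat
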
  • Be aware that the --password argument exposes your password in plain text on the screen. Use the -p argument instead; it prompts you to enter the password without echoing it (see the sketch below).


    -1
    cat schema.sql data.sql test_data.sql | mysql -u user --password=pass dbname
    tristan_ph · 2009-03-24 08:39:40 6
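
    The prompting variant, as a sketch; mysql asks for the password on the
    terminal, so the pipe on stdin still carries the SQL:

        cat schema.sql data.sql test_data.sql | mysql -u user -p dbname
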
  • I'm sure almost everybody knows this by now. This command will pull the password for the admin login of any Plesk machine.


    -1
    cat /etc/psa/.psa.shadow
    jigglebilly · 2009-04-30 18:08:12 4
  • This is useful for displaying a portion of a FILE that contains an error at line NUMBER.


    -1
    cat -n FILE | grep -C3 "^[[:blank:]]\{1,5\}NUMBER[[:blank:]]"
    lv4tech · 2009-05-17 18:19:55 8
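
    The same idea with a concrete value, as a sketch: for an error at line
    42, print it with three numbered lines of context on either side.

        cat -n FILE | sed -n '39,45p'
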
  • VARNAMES='ID FORENAME LASTNAME ADDRESS CITY PHONE MOBILE MAIL ...'
    cat customer.csv | while read LINE ; do
        COUNT=1
        for VAR in $VARNAMES ; do
            eval "${VAR}=`echo $LINE | /usr/bin/awk {'print $'$COUNT''}`"
            let COUNT=COUNT+1
        done
    done

    Maybe you have a CSV file with addresses, where you have to process each contact (one per line) and write each value into its own variable. Of course you can define every variable by hand, but this way is simpler and faster (to write). VARNAMES holds the variable names. Pay attention: the number of names in VARNAMES has to match the number of fields in the CSV file. If the CSV is not separated with ";", you can set the separator after the awk binary, with -F"_" for example. A sketch without eval follows the entry.


    -1
    VARNAMES='ID FORENAME LASTNAME ADDRESS CITY PHONE MOBILE MAIL' ; cat customer.csv | while read LINE ; do COUNT=1 ; for VAR in $VARNAMES ; do eval "${VAR}=`echo $LINE | /usr/bin/awk {'print $'$COUNT''}`" ; let COUNT=COUNT+1 ; done ; done
    GeckoDH · 2009-05-19 11:23:00 4
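
    A sketch of the same per-line assignment without eval or awk: read can
    split each line into the named variables itself (';' separator assumed
    here):

        while IFS=';' read -r ID FORENAME LASTNAME ADDRESS CITY PHONE MOBILE MAIL; do
            echo "$ID: $FORENAME $LASTNAME"   # each field now sits in its own variable
        done < customer.csv
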
  • avoid mouse abuse and the constant struggle of balancing scroll velocity ... not to mention that burning sensation in your upper right shoulder ....


    -1
    cat large.xml | xclip
    copremesis · 2009-07-08 16:30:07 8
  • If you are downloading a big file (or even a small one) and the connection breaks or times out, use this command to RESUME the download where it failed, instead of having to start downloading from the beginning. This is a real win for downloading Debian ISO images over a buggy DSL modem. Take the partially downloaded file and cat it into the STDIN of curl, as shown. Then use the "-C -" option followed by the URL of the file you were originally downloading.


    -1
    cat file-that-failed-to-download.zip | curl -C - http://www.somewhere.com/file-I-want-to-download.zip >successfully-downloaded.zip
    linuxrawkstar · 2009-08-05 13:33:06 16
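
    curl can also manage the resume by itself; a sketch reusing the example
    URL: with "-C -" and a named output file, curl checks the partial
    file's size and continues from that offset.

        curl -C - -o file-I-want-to-download.zip http://www.somewhere.com/file-I-want-to-download.zip
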

  • -1
    echo capitalize | { dd bs=1 count=1 conv=ucase 2> /dev/null; cat ;}
    twfcc · 2009-09-05 01:49:53 40
  • Some malicious programs append an iframe or script tag to your web pages on compromised servers; use this command to clean them in batch. A temp-file-free variant is sketched below.


    -1
    for f in *.html; do head -n -1 "$f" > temp; cat temp > "$f"; rm temp; done
    Sunng · 2009-10-12 12:49:18 5
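
    The temp-file-free variant, as a sketch (assumes GNU sed): '$d' deletes
    the last line of each file in place.

        sed -i '$d' *.html
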
  • Yep, it's harder, but using a pipe is way more flexible.


    -1
    cat infile | while read str; do echo "$((++i)) - $str" ; done;
    glaudiston · 2009-12-09 14:05:09 3
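
    For comparison, a sketch of the standard tools that produce the same
    numbering:

        cat -n infile    # number all lines
        nl -ba infile    # same, with more formatting control
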
  • Get memory info.


    -1
    cat /proc/meminfo
    svnlabs · 2010-01-22 16:48:03 3
  • You don't need to create an intermediate file: just pipe the output directly to the tar command and use stdin as the file (put a dash after the f flag).


    -1
    cat 1.tar.gz 2.tar.gz | tar zxvif -
    psychopenguin · 2010-05-09 03:50:00 5
  • -r enables extended regex; ^ anchors the beginning of the line; | is alternation: match 100, or one or two digits 0-9.


    -1
    cat file | sed -n -r '/^100$|^[0-9]{1,2}$/p'
    voyeg3r · 2010-05-15 19:15:56 5
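
    A grep equivalent, as a sketch: -x anchors the whole line, so this too
    prints only the values 0 through 100.

        grep -Ex '100|[0-9]{1,2}' file
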
  • thx Montecristo, thx hckhckhck


    -1
    cat > {filename} {your text} [^C | ^D]
    sphere64 · 2010-06-03 09:02:12 3
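
    A usage sketch (notes.txt is an example name): everything typed after
    the redirect is written to the file; Ctrl-D on an empty line ends input
    cleanly, while Ctrl-C aborts and loses any partially typed line.

        cat > notes.txt
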
  • It works on every Linux box.


    -1
    cat /proc/cpuinfo
    magicjohnson_ · 2010-09-24 09:27:58 3
  • A simple script for streaming a movie.


    -1
    cat video.ogg | nc -l -p 4232 & wget http://users.bshellz.net/~bazza/?nombre=name -O - & sleep 10; mplayer http://users.bshellz.net/~bazza/datos/name.ogg
    el_bazza · 2010-11-29 03:34:31 5
  • This command deletes the "newline" chars, so its output may be unusable :)


    -1
    cat file | tr -d "\n"
    uzsolt · 2010-12-02 09:22:02 3
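
    If the output should stay readable, a sketch: swap each newline for a
    space instead of deleting it outright.

        tr '\n' ' ' < file
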
  • Change the drive letter as you wish. Uses the pv command to detect read speed. First of all you must install pv. http://www.bayner.com/ kerim@bayner.com


    -1
    cat /dev/sda | pv -r > /dev/null
    kerim · 2011-01-23 22:58:56 5
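
    pv can read the device directly, dropping the cat; a sketch (-r shows
    the current read rate):

        sudo pv -r /dev/sda > /dev/null
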

  • -1
    grabtweets() { curl -s -o $GT_TMP twitter.com/$1; cat $GT_TMP | grep entry-content | sed -e :loop -e 's/<[^>]*>//g;/</N;//bloop' | sed 's/^[ \t]*//'; }
    gl101 · 2011-05-04 21:49:08 5