Append the content of a file to itself

cat file | tee >> file
The straightforward `cat file >> file` fails with the error `cat: file: input file is output file`. `tee` is a nice workaround that avoids a temporary file.
Sample Output
echo "Just one line" >> file
cat file
> Just one line
cat file | tee >> file
cat file
> Just one line
> Just one line

0
GeckoDH · 2009-07-30 07:34:03
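As the comments below note, the `tee` trick can race: if the shell writes new data before `cat` reaches end-of-file, the file may keep growing. A sketch of a deterministic alternative that snapshots the file through a temporary copy first (file names here are illustrative):

```shell
# Duplicate a file's contents onto itself via a temporary snapshot.
printf 'Just one line\n' > file
tmp=$(mktemp)            # unique temporary path
cp file "$tmp"           # snapshot the current contents
cat "$tmp" >> file       # append the snapshot; no self-read race
rm -f "$tmp"
cat file                 # prints the line twice
```

The copy fixes the amount of data to append before any write happens, so the result is always exactly one duplication.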

These Might Interest You

  • If you set noclobber on, bash won't allow redirection to overwrite existing files. `set -o noclobber` turns the option on (it is off by default). You can still append to a file, but you cannot overwrite it; to turn it back off, use `set +o noclobber`. I use it because I once overwrote a file by accident and realised afterwards that its content was very important. Creating one more file means nothing to my hard disk (we are no longer in the 64 KB memory era), but the content of a file is far more important. That's what we call experience :(


    1
    set -o noclobber
    eastwind · 2010-01-08 21:06:44 1
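    A short demonstration of the noclobber behaviour described above (`demo.txt` is just an example name; `>|` is the standard escape hatch that overrides the option):

    ```shell
    # noclobber demo: '>' refuses to overwrite, '>>' and '>|' still work.
    set -o noclobber
    echo first > demo.txt
    if echo second 2>/dev/null > demo.txt; then
      echo "overwrite succeeded (unexpected)"
    else
      echo "overwrite blocked"     # noclobber rejects the '>' redirection
    fi
    echo appended >> demo.txt      # appending is still allowed
    echo forced >| demo.txt        # '>|' explicitly overrides noclobber
    set +o noclobber
    ```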

  • 1
    convert -size 32x32 \( xc:red xc:green +append \) \( xc:yellow xc:blue +append \) -append output.png
    kev · 2011-08-23 07:44:07 0
  • If you just want to write or append some text to a file without having to run a text editor, run this command. After running it, start typing away; to exit, type `.` on a line by itself. Replacing the `>>` with a single `>` will let you overwrite your file.


    4
    cat <<.>> somefilename
    tomlouie · 2009-07-10 17:45:42 3
  • OpenDocument files from OpenOffice.org, LibreOffice and other applications are actually ZIP archives, and the useful information in them is stored as XML. Like it or not, those XML files have the unfortunate tendency not to be indented, and for good reason: each one consists of a single line! To edit the content comfortably in a proper editor, I proceed as follows (requires xmlindent). To put the edited file back into the document, you can use `zip document.odt content.xml`. And it works with vi instead of nano!


    0
    unzip document.odt content.xml && xmlindent -w content.xml && nano content.xml
    arthurdent · 2012-12-01 17:05:28 0

What Others Think

Could you please tell us at least one reason to duplicate a file into itself?
CodSpirit · 464 weeks ago
it looks like you meant to use > rather than >> here, because depending on when cat takes a break reading "file" and when the shell writes the incoming data, you could end up with a file that just keeeeeeepppps groooooowing.
bwoodacre · 463 weeks and 5 days ago
For me it's nice if I need to generate a test file with, say, 10 chunks of 100,000 identical lines in it. It's quicker to append the file to itself 10 times than to write out 100,000 lines 10 times. But (ironically) this way didn't work for me (the file kept growing), so an intermediate file is not a bad alternative. Maybe it's because the file is so gigantic, but oh well...
tefrak · 328 weeks and 3 days ago
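tefrak's doubling approach can be made deterministic with an intermediate file. A sketch (file names and sizes are just an illustration):

```shell
# Build a large test file by repeated doubling through a temp copy.
seq 1000 > big            # 1,000-line seed
for i in 1 2 3; do        # 3 doublings -> 8,000 lines
  cp big big.tmp          # snapshot before appending
  cat big.tmp >> big
done
rm -f big.tmp
wc -l < big               # 8000
```

Each pass appends a fixed snapshot rather than the live file, so the size exactly doubles every iteration instead of growing unboundedly.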
