Output a specific line of a very big file and exit afterwards

awk 'FNR==100 {print;exit}' file
The exit stops awk from reading the rest of the file, which saves parsing time on very big files.
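A sed equivalent is worth noting alongside the awk version; both stop reading as soon as the line is printed (the filename below is a stand-in):

```shell
# Build a sample 200-line file (illustrative)
seq 200 > bigfile.txt

# awk: print line 100, then stop reading the file
awk 'FNR==100 {print; exit}' bigfile.txt    # prints: 100

# sed equivalent: print line 100 and quit immediately
# (the trailing ';' before '}' keeps BSD sed happy)
sed -n '100{p;q;}' bigfile.txt              # prints: 100
```

Either way, the lines after 100 are never parsed.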

0 · 2012-03-04 20:25:57

These Might Interest You

  • Just one character longer than the sed version ('FNR==5' versus -n 5p). On my system, without using "exit" or "q", the awk version is over four times faster on a ~900K file using the following timing comparison:

        testfile="testfile"
        for cmd in "awk 'FNR==20'" "sed -n '20p'"; do
            echo; echo $cmd; eval "$cmd $testfile"
            for i in {1..3}; do
                time for j in {1..100}; do eval "$cmd $testfile" >/dev/null; done
            done
        done

    Adding "exit" or "q" made the difference between awk and sed negligible, and produced a four-fold improvement over the awk timing without the "exit". For long files, an exit can speed things up:

        awk 'FNR==5{print;exit}' <file>


    1
    awk 'FNR==5' <file>
    dennisw · 2009-10-20 22:52:41 0
  • Using sed to extract lines in a text file

    If you write bash scripts a lot, you are bound to run into a situation where you want to extract some lines from a file. Yesterday, I needed to extract the first line of a file, say named somefile.txt:

        cat somefile.txt
        Line 1
        Line 2
        Line 3
        Line 4

    This specific task can be easily done with:

        head -1 somefile.txt
        Line 1

    For a more complicated task, like extracting the second to third lines of a file, head is inadequate. So, let's try extracting lines using sed, the stream editor. My first attempt uses the p sed command (for print):

        sed 1p somefile.txt
        Line 1
        Line 1
        Line 2
        Line 3
        Line 4

    Note that it prints the whole file, with the first line printed twice. Why? The default output behavior is to print every line of the input file stream; the explicit 1p command just tells it to print the first line .... again. To fix it, you need to suppress the default output (using -n), making explicit prints the only way to print to default output:

        sed -n 1p somefile.txt
        Line 1

    Alternatively, you can tell sed to delete all but the first line:

        sed '1!d' somefile.txt
        Line 1

    '1!d' means: if a line is not (!) the first line, delete it. Note that the single quotes are necessary; otherwise, the !d will bring back the last command you executed that starts with the letter d.

    To extract a range of lines, say lines 2 to 4, you can execute either of the following:

        sed -n 2,4p somefile.txt
        sed '2,4!d' somefile.txt

    Note that the comma specifies a range (from the line before the comma to the line after). What if the lines you want to extract are not in sequence, say lines 1 to 2, and line 4?

        sed -n -e 1,2p -e 4p somefile.txt
        Line 1
        Line 2
        Line 4


    0
    sed -n -e 1186,1210p A-small-practice.in
    evandrix · 2011-06-04 10:53:46 0
  • This command is more for demonstrating piping to vim and jumping to a specific line than anything else. Exit vim with :q!. The +23 jumps to line 23, and the trailing - makes vim read the data from the pipe.


    1
    zcat /usr/share/doc/vim-common/README.gz | vim -g +23 -
    int19h · 2009-02-16 01:25:37 2
  • Doesn't display the matching line. If you want that behaviour, you need to add "print && " before the 'exit'.


    0
    <command> | perl -pe '/<regex>/ && exit;'
    intuited · 2009-12-22 15:05:49 0
