Commands tagged create file (4)

  • If you want to quickly create a very large file for testing purposes and do not care about its content, this command creates a file of arbitrary size in well under a second. The file's content reads as all zero bytes. The trick is that nothing is actually written to disk: the operating system and file system simply record the new size and allocate blocks lazily, only when they are first written to (a sparse file), which is what makes creation so fast. Instead of '1G' as in the example, you can use other suffixes: 200K for kilobytes (1024 bytes), 500M for megabytes (1024 * 1024 bytes), 20G for gigabytes (1024^3 bytes), 30T for terabytes (1024^4 bytes), P for petabytes (1024^5 bytes), and so on. Command tested under Linux. A quick way to confirm that no disk space is used is sketched after this entry.


    3
    truncate --size 1G bigfile.txt
    ynedelchev · 2015-02-26 11:56:27 0
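
    A minimal sketch (assuming GNU coreutils on Linux and the hypothetical file name bigfile.txt) of how to confirm that the file occupies no disk space until it is written to:

        truncate --size 1G bigfile.txt                # create a 1G sparse file
        ls -lh bigfile.txt                            # apparent size: 1.0G
        du -h bigfile.txt                             # actual disk usage: 0, no blocks allocated yet
        stat -c '%s bytes, %b blocks' bigfile.txt     # large size, (almost) no allocated blocks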
  • Create a bash script that touches each file listed in 'files.txt', in order, so that their modification times end up in the same order as the lines of 'files.txt'. The file name of the generated script is given by the variable 'scriptName'; the script is made executable once writing to it has completed. A sketch of the generated script is shown after this entry.


    1
    scriptName="reorder_files.sh"; echo -e '#!/bin/sh\n' > "${scriptName}"; cat files.txt | while read -r file; do echo "touch \"${file}\"; sleep 0.5;" >> "${scriptName}"; done; chmod +x "${scriptName}";
    programmer · 2016-04-19 11:52:00 3
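
    As a rough sketch, assuming files.txt contains the hypothetical entries a.txt, b.txt and c.txt, the generated reorder_files.sh would look like this (each file is touched half a second after the previous one, so the modification times follow the list order):

        #!/bin/sh

        touch "a.txt"; sleep 0.5;
        touch "b.txt"; sleep 0.5;
        touch "c.txt"; sleep 0.5;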
  • Create an empty file; if the file already exists, it is truncated to zero bytes. A quick check is sketched after this entry.


    0
    > filename
    TIT · 2012-03-08 23:17:05 0
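
    A minimal sketch (hypothetical file names): the redirection creates the file when it is missing and empties it when it already exists.

        > newfile.txt                 # creates an empty (0-byte) file
        ls -l newfile.txt             # size column shows 0
        echo data > existing.txt      # a file with some content
        > existing.txt                # the same redirection truncates it back to 0 bytes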
  • When run on an existing file, updates its access and modification times to the current time (touch does not change a file's creation date); when run on a path that does not exist, it creates an empty file. See the sketch after this entry.


    0
    touch /path/to/file
    lolssl · 2015-09-22 16:49:22 0
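
    A short sketch of what touch actually changes (/path/to/file is a placeholder; the -d option assumes GNU touch):

        touch /path/to/file                          # set access and modification times to now
        stat /path/to/file                           # the Access/Modify lines show the current time
        touch -d '2001-02-03 04:05' /path/to/file    # set an explicit timestamp instead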

