Commands tagged create file (4)

  • If you want to quickly create a very big file for testing purposes and do not care about its content, this command creates a file of arbitrary size in well under a second. The content reads as all zero bytes. The trick is that the content is never actually written to disk: the file is created as a sparse file, so the space is only reserved at the file-system level, and blocks are allocated when they are first written. That is what makes the creation near-instant. Instead of '1G' as in the example, you can use other suffixes: 200K for kilobytes (1024 bytes), 500M for megabytes (1024 * 1024 bytes), 20G for gigabytes (1024^3 bytes), 30T for terabytes (1024^4 bytes), P for petabytes (1024^5 bytes), and so on. Command tested under Linux. A quick way to verify the sparse allocation is sketched after this entry.


    3
    truncate --size 1G bigfile.txt
    ynedelchev · 2015-02-26 11:56:27 0
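
    A minimal check of the sparse allocation described above (assumes GNU coreutils; 'bigfile.txt' is the example name from the entry):

        truncate --size 1G bigfile.txt
        ls -lh bigfile.txt   # apparent size: 1.0G
        du -h bigfile.txt    # blocks actually allocated: 0 on file systems that support sparse files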
  • Create a bash script that changes the modification time of each file listed in 'files.txt' so that their timestamps end up in the same order as the lines of 'files.txt'. The name of the generated script is taken from the variable 'scriptName', and the script is made executable once it has been written. A usage sketch follows the entry.


    1
    scriptName="reorder_files.sh"; echo '#!/bin/sh' > "${scriptName}"; while IFS= read -r file; do echo "touch \"${file}\"; sleep 0.5;" >> "${scriptName}"; done < files.txt; chmod +x "${scriptName}"
    programmer · 2016-04-19 11:52:00 3
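
    A usage sketch with hypothetical file names (a.txt, b.txt and c.txt are illustrative, not from the original entry):

        printf '%s\n' c.txt a.txt b.txt > files.txt   # the desired timestamp order
        touch a.txt b.txt c.txt                       # make sure the files exist
        # ...now run the one-liner above to generate reorder_files.sh...
        ./reorder_files.sh                            # touches each file 0.5s apart
        ls -rt a.txt b.txt c.txt                      # oldest first: c.txt a.txt b.txt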
  • Create empty file. A note on files that already exist follows the entry.


    0
    > filename
    TIT · 2012-03-08 23:17:05 0
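
    A small follow-up sketch: the same redirection also truncates a file that already exists, and ': > filename' is a more explicit form often used in scripts:

        > filename        # creates an empty file, or truncates an existing one to 0 bytes
        ls -l filename    # size is 0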
  • When run on an existing file, this updates the file's access and modification times (not its creation date); if the file does not exist, it is created empty. A sketch of setting an explicit timestamp follows the entry.


    0
    touch /path/to/file
    lolssl · 2015-09-22 16:49:22 0
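
    A short sketch of setting an explicit timestamp instead of "now" ('/path/to/file' is the placeholder from the entry; the date value is an arbitrary example):

        touch -t 202001010000 /path/to/file          # POSIX form: [[CC]YY]MMDDhhmm[.SS]
        touch -d '2020-01-01 00:00' /path/to/file    # GNU coreutils form, human-readable date
        stat /path/to/file                           # inspect the resulting access/modify times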

