Commands using shred (11)

  • This command securely erases all the unused blocks on a partition. The unused blocks are the "free space" on the partition; some of them will still contain data from previously deleted files. You might want to use this if you are given access to an old computer and you do not know its provenance. The command can be run while booted from a LiveCD to clear the free space on an old hard drive. On modern Linux LiveCDs, the ntfs-3g driver provides read/write access to NTFS partitions, so this method also works on Windows drives (see the sketch after the command below). NB: depending on the size of the partition, this command can take a while to complete.


    8
    # cd $partition; dd if=/dev/zero of=ShredUnusedBlocks bs=512M; shred -vzu ShredUnusedBlocks
    mpb · 2009-06-21 14:17:22 6
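
    A minimal sketch of the full LiveCD workflow for an NTFS partition, assuming a hypothetical device /dev/sda1 and mount point /mnt/windisk:

    ntfs-3g /dev/sda1 /mnt/windisk                 # mount the NTFS partition read/write
    cd /mnt/windisk
    dd if=/dev/zero of=ShredUnusedBlocks bs=512M   # fill the free space with zeros; dd stops when the partition is full
    shred -vzu ShredUnusedBlocks                   # overwrite the fill file, then remove it
    cd / && umount /mnt/windisk
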
  • Instead, install the secure-delete package (apt-get install secure-delete) and you can use: srm to securely delete files and directories on the hard disk, smem to wipe data from RAM, sfill to wipe the "free space" on the hard disk, and sswap to wipe all data from swap. See the sketch after the command below.


    6
    shred -u -z -n 17 rubricasegreta.txt
    0disse0 · 2010-01-31 15:24:54 0
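
    A rough sketch of how the secure-delete tools are invoked (the paths and the /dev/sdaX swap device are placeholders; on Debian-derived systems the RAM wiper may be installed as sdmem rather than smem):

    sudo apt-get install secure-delete
    srm -v secret.txt                            # securely remove a single file
    srm -r -v old-project/                       # securely remove a directory tree
    sudo sfill -v /home                          # wipe the free space on the filesystem holding /home
    sudo swapoff -a && sudo sswap -v /dev/sdaX   # wipe a (disabled) swap partition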

  • 6
    sudo shred -vz -n 0 /dev/sdb
    bugmenot · 2012-08-06 22:37:44 0
  • GNU shred is provided by the coreutils package on most Linux distributions (meaning you probably have it installed already), and is capable of wiping a device to DoD standards. You can give shred any file to destroy, be it your shell history or a block device file (/dev/hdX for IDE hard drive X, for example). Shred will overwrite the target 25 times by default, but 3 passes are enough to prevent most recovery, and 7 passes are enough for the US Department of Defense. Use the -n flag to specify the number of passes (see the examples after the command below), and man shred for even more secure erasing fun. Note that shredding your shell history may not be terribly effective on devices with journaling filesystems, RAID copies or snapshot copies, but if you're wiping a single disk, none of that is a concern. Also, it takes quite a while :)


    5
    shred targetfile
    sud0er · 2009-04-28 19:57:43 5
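
    As a sketch, controlling the number of passes with -n might look like this (the device name is an assumption; triple-check it before running):

    sudo shred -v -n 7 /dev/sdX        # 7 overwrite passes on the whole device
    sudo shred -v -n 3 -z /dev/sdX     # 3 passes, then a final pass of zeros to hide the wiping
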
  • Remove a file that holds sensitive info safely: overwrite it 33 times (plus a final pass of zeros from -z), then delete it.


    2
    shred -n33 -zx file; rm file
    copremesis · 2009-05-08 19:15:41 3
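
    Since shred can also truncate and remove the file itself when given -u, a roughly equivalent one-liner without the separate rm would be:

    shred -u -n33 -zx file
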
  • This command removes a file from your filesystem like the normal rm command, but instead of deleting only the inode information it also overwrites the data that was stored in the file's blocks. /!\ Warning: this may take a long time for large files.


    1
    function rrm(){ for i in "$@"; do if [ -f "$i" ]; then echo "rrm - Processing $i"; shred --force --remove --zero --verbose "$i"; else echo "Can't process $i"; type=$(stat -c %F "$i"); echo "File $i is $type"; fi; done; }
    thelan · 2010-06-10 22:40:27 0
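
    Hypothetical usage, once the function has been defined in your shell:

    rrm passwords.txt old-keys/*.pem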

  • 0
    shred -vzu /tmp/junk-file-to-be-shredded
    mpb · 2009-06-18 12:00:19 2
  • Make sure the file contents can't be retrieved if anyone gets hold of your physical hard drive. With a hard drive partition: gpg --default-recipient-self -o /path/to/encrypted_backup.gpg -e /dev/sdb1 && shred -z /dev/sdb1 WARNING/disclaimer: Be sure you... F&%k it, just don't try this.


    0
    gpg -e --default-recipient-self <SENSITIVE_FILE> && shred -zu "$_"
    h3xx · 2011-07-24 05:51:47 0
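
    Here "$_" expands to the last argument of the previous command, i.e. the file that was just encrypted. With a concrete (hypothetical) filename the one-liner is equivalent to:

    gpg -e --default-recipient-self secret.txt && shred -zu secret.txt
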
  • Instead of zeroing the filesystem, this command overwrites the disk contents N times (the default is 3), making data recovery much harder. The command accepts many more options; see man shred. A plain zeroing pass, for comparison, is sketched after the command below.


    0
    shred --iterations=N /dev/sdaX
    bibe · 2012-01-23 20:40:36 0
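
    For comparison, a single zeroing pass with dd (the approach the description contrasts against) might look like this; the partition name is a placeholder and status=progress needs a reasonably recent GNU dd:

    sudo dd if=/dev/zero of=/dev/sdaX bs=4M status=progress
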
  • Shred can be used to shred a given partition or a complete disk. This should ensure that no data is left on the disk (a single-partition variant is sketched after the command below).


    -2
    sudo shred -zn10 /dev/sda
    dcabanis · 2009-04-30 13:02:43 3
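
    The same invocation can target a single partition instead of the whole disk; for example (the partition number is an assumption):

    sudo shred -zn10 /dev/sda3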

  • -5
    shred -v filename
    techie · 2013-05-07 14:58:17 0
