How does a 7- or 35-pass erase work? Why would one use these methods?


I'd never heard of the 35-pass erase: http://en.wikipedia.org/wiki/Gutmann_method

The Gutmann method is an algorithm for securely erasing the contents of computer hard drives, such as files. Devised by Peter Gutmann and Colin Plumb, it does so by writing a series of 35 patterns over the region to be erased. The selection of patterns assumes that the user doesn't know the encoding mechanism used by the drive, and so includes patterns designed specifically for three different types of drives. A user who knows which type of encoding the drive uses can choose only those patterns intended for their drive. A drive with a different encoding mechanism would need different patterns. Most of the patterns in the Gutmann method were designed for older MFM/RLL encoded disks. Relatively modern drives no longer use the older encoding techniques, making many of the patterns specified by Gutmann superfluous.[1]
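To make that pass structure concrete, here is a minimal Python sketch of how a Gutmann-style schedule is assembled: random passes bracketing fixed patterns aimed at the MFM/RLL encodings mentioned above. The fixed patterns shown are only an illustrative subset of the 27 in the published table, and the helper names are my own:

    import os

    # Illustrative subset of Gutmann's 27 fixed patterns (passes 5-31).
    # The full table also steps through more byte values and additional
    # three-byte RLL sequences; this list is deliberately incomplete.
    FIXED_PATTERNS = [
        b"\x55",          # 01010101... (targets MFM encoding)
        b"\xAA",          # 10101010... (targets MFM encoding)
        b"\x92\x49\x24",  # rotating three-byte cycles (target (2,7)RLL)
        b"\x49\x24\x92",
        b"\x24\x92\x49",
    ]

    def gutmann_schedule(random_lead=4, random_tail=4):
        """Pass schedule: random passes bracket the fixed patterns."""
        return [None] * random_lead + FIXED_PATTERNS + [None] * random_tail

    def pattern_block(pattern, size=4096):
        """Expand a short pattern (None = random) to a full write block."""
        if pattern is None:
            return os.urandom(size)
        reps = size // len(pattern) + 1
        return (pattern * reps)[:size]

A user who knows the drive's encoding would keep only the matching entries of FIXED_PATTERNS, which is the pattern-selection point the quote makes.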

Also interesting:

One standard way to recover data that has been overwritten on a hard drive is to capture the analog signal which is read by the drive head prior to being decoded. This analog signal will be close to an ideal digital signal, but the differences are what is important. By calculating the ideal digital signal and then subtracting it from the actual analog signal, it is possible to ignore the last information written, amplify the remaining signal, and see what was written before.
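Here is a toy NumPy model of that subtraction; the single coupling factor alpha and the noise-free channel are my assumptions, and real read channels are far messier:

    import numpy as np

    rng = np.random.default_rng(0)
    old_bits = rng.integers(0, 2, 64) * 2 - 1   # previously written data, as +/-1
    new_bits = rng.integers(0, 2, 64) * 2 - 1   # data overwritten on top

    # Toy read-channel model: the analog head signal is dominated by the
    # new data but keeps a faint trace (alpha) of what was there before.
    alpha = 0.05
    analog = new_bits + alpha * old_bits

    ideal = np.sign(analog)        # what the decoder reports: the new data
    residual = analog - ideal      # subtract the ideal digital signal...
    recovered = np.sign(residual)  # ...what remains points at the old data

    print((recovered == old_bits).mean())   # 1.0 in this noise-free toy

In this idealized model the residual is exactly alpha times the old data, so recovery is perfect; on real hardware the residual competes with noise.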

A single pass of zeros doesn't completely erase the magnetic artifacts on a disk, so it is still possible to recover the data from the drive. A 7-pass erasure using random data does a fairly complete job of preventing reconstruction of the data on the drive.

Wikipedia has a number of different articles relating to this topic.

http://en.wikipedia.org/wiki/Data_remanence

http://en.wikipedia.org/wiki/Computer_forensics

http://en.wikipedia.org/wiki/Data_erasure

As mentioned before, magnetic artifacts are present from the previous data on the platter.

In a recent issue of MaximumPC they put this to the test: they took a drive, ran a single pass of all zeros over it, and hired a data recovery firm to try to recover what they could. Answer: not one bit was recovered. Their conclusion was that unless you expect the NSA to try, a zero pass is probably enough.

Personally, I'd run an alternating pattern or two across it.

One random pass is enough for plausible deniability, since the lost data would have to be mostly "reconstructed" with a margin of error that grows with the length of the data being recovered, as well as with whether or not the data is contiguous (in most cases, it's not).

For the insanely paranoid, three passes are good: 0xAA (10101010), 0x55 (01010101), and then random. The first two grey out residual bits; the final random pass obliterates any "residual residual" bits.
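A minimal sketch of that three-pass recipe for a single file, assuming an ordinary POSIX filesystem (the function name is mine; note the caveat about single-file shredding two paragraphs down):

    import os

    def three_pass_wipe(path):
        """Overwrite a file with 0xAA, 0x55, then random bytes.
        Caveat: journaling filesystems and flash wear-levelling may
        leave the original physical blocks untouched."""
        size = os.path.getsize(path)
        for pattern in (b"\xAA", b"\x55", None):        # None = random pass
            with open(path, "r+b") as f:
                f.write(os.urandom(size) if pattern is None
                        else pattern * size)
                f.flush()
                os.fsync(f.fileno())                    # force it to the device
        os.remove(path)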

Never do passes with zeros. Under magnetic microscopy the data is still there; it's just "faded".

Never trust "single file shredding", especially on solid-state media like flash drives. If you need to "shred" a file, "delete" it and fill your drive with random data files until it runs out of space. Then, next time, think twice about housing shred-worthy data on the same medium as "low-clearance" stuff.
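A rough sketch of that fill-the-drive approach; the directory, file naming, and per-file size cap are arbitrary choices of mine:

    import errno
    import os

    def fill_free_space(directory, chunk=1 << 20, file_cap=1 << 30):
        """Write random filler files until the filesystem is full,
        then delete them. Capping file size avoids large-file limits."""
        filler, n = [], 0
        try:
            while True:
                path = os.path.join(directory, "filler_%d.bin" % n)
                filler.append(path)
                with open(path, "wb") as f:
                    written = 0
                    while written < file_cap:
                        f.write(os.urandom(chunk))
                        written += chunk
                    f.flush()
                    os.fsync(f.fileno())
                n += 1
        except OSError as e:
            if e.errno != errno.ENOSPC:     # anything but "disk full"
                raise
        finally:
            for path in filler:
                if os.path.exists(path):
                    os.remove(path)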

The Gutmann method is based on tin-foil-hat speculation. It does various things to get drives to degauss themselves, which is admirable in an artistic sense, but pragmatically it's overkill. No private organisation to date has successfully recovered data from even a single random pass. As for Big Brother: if the DoD considers it gone, then you know it's gone. The military-industrial complex gets all the big bucks to try to do exactly what Gutmann claims is possible, and believe you me, if they had the tech to do so it would already have been leaked to the private sector, since they're all in bed with each other. However, if you want to use Gutmann in spite of this, check out the secure-delete package for Linux.
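The secure-delete package ships command-line tools (srm for files, sfill for a filesystem's free space); a minimal sketch of driving them from Python, assuming the package is installed and with placeholder paths:

    import subprocess

    # srm overwrites and then unlinks a file; by default it runs the
    # full Gutmann-style pass set, and -v just makes it verbose.
    subprocess.run(["srm", "-v", "secret.dat"], check=True)

    # sfill wipes the free space of the filesystem containing the path.
    subprocess.run(["sfill", "-v", "/home"], check=True)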

7-pass and 35-pass wipes would take forever to finish. HIPAA only requires a DoD 3-pass overwrite, and I am not certain why the DoD even has a 7-pass overwrite, as it seems they simply shred the disks before disposing of machines anyway. Theoretically, you could recover data off the outer edges of each track (using a scanning electron microscope or a microscopic magnetic probe), but in practice you would need the resources of a disk drive maker or one of the three-letter government agencies to do this.

The reason to perform multipass writes is to take advantage of slight errors in head positioning, so that the edges of the track are overwritten as well, making recovery far less likely.

Most drive recovery companies can't recover a drive whose data has been overwritten even once. They typically take advantage of the fact that Windows doesn't zero out the data blocks when a file is deleted; it just changes the directory entry to mark the space free. They simply 'undelete' the file and make it visible again.

If you don't believe me, call them up and ask whether they can recover a disk that has been dd'ed over... they will typically tell you no, and if they do agree to try, it will cost serious $$$ to get it back...
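For reference, a rough Python equivalent of that dd pass; /dev/sdX is a placeholder, and running this irreversibly destroys everything on the target device:

    import os

    def zero_device(device="/dev/sdX", block=1 << 20):
        """Single-pass zero overwrite, roughly
        `dd if=/dev/zero of=/dev/sdX bs=1M`. Destroys all data."""
        zeros = b"\x00" * block
        with open(device, "wb", buffering=0) as f:   # unbuffered writes
            try:
                while True:
                    f.write(zeros)
            except OSError:                          # end of device
                pass
            os.fsync(f.fileno())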

A DoD 3-pass wipe followed by a zero overwrite should be more than sufficient for most (i.e. non-TOP SECRET) folks.

DBAN (and its commercially supported descendant, EBAN) does all this cleanly... I would recommend these.

Advanced recovery tools can easily recover files deleted with a single pass. And they are expensive too (e.g. http://accessdata.com/).

A visual GUI for Gutmann passes from http://sourceforge.net/projects/gutmannmethod/ shows it has 8 semi-random passes. I have never seen proof that files deleted with Gutmann have been recovered.

Overkill, maybe, but still far better than Windows' soft delete.

Regarding the second part of the question, some of the answers here actually contradict real research on that exact topic. According to the "Number of overwrites needed" section of Wikipedia's Data erasure article, erasing modern drives with more than one pass is redundant:

"ATA disk drives manufactured after 2001 (over 15 GB) clearing by overwriting the media once is adequate to protect the media from both keyboard and laboratory attack." (citation)

Also, Infosec wrote a nice article entitled "The Urban Legend of Multipass Hard Disk Overwrite" on this entire subject, covering the old US government erasure standards, among others, and how the multi-pass myth established itself in the industry.

"Fortunately, several security researchers presented a paper [WRIG08] at the Fourth International Conference on Information Systems Security (ICISS 2008) that declares the “great wiping controversy” about how many passes of overwriting with various data values to be settled: their research demonstrates that a single overwrite using an arbitrary data value will render the original data irretrievable even if MFM and STM techniques are employed.

The researchers found that the probability of recovering a single bit from a previously used HDD was only slightly better than a coin toss, and that the probability of recovering more bits decreases exponentially so that it quickly becomes close to zero.

Therefore, a single pass overwrite with any arbitrary value (randomly chosen or not) is sufficient to render the original HDD data effectively irretrievable."
