(Image by Philipp Stössel/ETH Zurich)
A fascinating project at Zurich’s Swiss Federal Institute of Technology aims to preserve digital data for the long haul by repurposing deoxyribonucleic acid (DNA) as a storage medium.
Nature’s own time-tested data storage mechanism, DNA, has been carrying significant amounts of pretty important information around for millions of years … the genetic data that undergirds all life on Earth. With that sort of track record, researchers have been working to repurpose the molecule to safely store arbitrary digital data and carry human knowledge on through the centuries.
The code itself is binary, with the bases A and C each read as a “0” and G and T as a “1.” Error correction is baked into the encoding in case of any damage to the DNA.
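As a rough illustration of that mapping (and only the mapping; this is not the ETH team’s actual pipeline, which layers proper error-correcting codes on top), a toy encoder in Python might look like the following. The alternation between the two possible bases for each bit value is an illustrative choice, not the published scheme:

```python
# Toy sketch of the bit-to-base mapping described above: 0 -> A or C, 1 -> G or T.
# This is NOT the ETH Zurich pipeline; real systems add robust error correction
# and avoid problematic sequences such as long runs of one base.

ZERO_BASES = "AC"   # either base decodes back to 0
ONE_BASES = "GT"    # either base decodes back to 1

def encode(bits: str) -> str:
    """Map a bit string to DNA bases, alternating bases to break up long runs."""
    dna = []
    for i, bit in enumerate(bits):
        pool = ZERO_BASES if bit == "0" else ONE_BASES
        dna.append(pool[i % 2])          # alternate within A/C or G/T (illustrative)
    return "".join(dna)

def decode(dna: str) -> str:
    """Map bases back to bits: A or C -> 0, G or T -> 1."""
    return "".join("0" if base in ZERO_BASES else "1" for base in dna)

if __name__ == "__main__":
    message = "01101000"                 # 'h' in ASCII, as an example payload
    strand = encode(message)
    print(strand)                        # -> ATGCGCAC
    assert decode(strand) == message
```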
The team tested the durability of the storage medium by encoding 83 KB of data (the Swiss federal charter and the Archimedes Palimpsest) and storing the glass spheres containing the DNA at temperatures between 60 and 70 degrees C for a week, an accelerated-aging test meant to simulate a much longer passage of time. At the end of the tests, the data remained readable without any errors.
(Image by Zephyrus, CC licensed, some rights reserved)
Reports from New Scientist suggest that data stored in this form could last around 2,000 years if kept at 10 degrees C, or a whopping 2 million years at -18 degrees C.
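The leap from a one-week oven test to multi-millennium estimates presumably rests on standard accelerated-aging reasoning: measure how fast the DNA decays at elevated temperatures, fit an Arrhenius model, then extrapolate the decay rate down to storage temperatures. Here is a minimal sketch of that kind of extrapolation; the activation energy and reference decay rate are hypothetical placeholders, not the team’s published figures:

```python
import math

# Hedged sketch of Arrhenius-style extrapolation used in accelerated-aging studies.
# EA and K_REF below are HYPOTHETICAL placeholders, not values from the ETH study.

R = 8.314                         # gas constant, J/(mol*K)
EA = 1.5e5                        # activation energy in J/mol (assumed for illustration)
K_REF = 1.0 / (7 * 24 * 3600)     # assumed first-order decay rate at the reference temp (1/s)
T_REF = 70 + 273.15               # reference temperature from the oven test, in kelvin

def decay_rate(temp_c: float) -> float:
    """Extrapolate the first-order decay rate constant to another temperature."""
    temp_k = temp_c + 273.15
    return K_REF * math.exp(-EA / R * (1 / temp_k - 1 / T_REF))

def half_life_years(temp_c: float) -> float:
    """Half-life implied by the extrapolated rate, in years."""
    return math.log(2) / decay_rate(temp_c) / (3600 * 24 * 365)

for t in (10, -18):
    print(f"{t:>4} degC -> ~{half_life_years(t):,.0f} years")
```

With realistic fitted parameters in place of the placeholders, the same calculation yields longevity estimates of the kind quoted above; the point of the sketch is only to show how a short, hot experiment can be translated into cold-storage lifetimes.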
Binary encoding using DNA has been something of a fad for several years now; scientists at the J. Craig Venter Institute inserted a marker code with their names and a number of quotations into a synthetic genome used to demonstrate the viability of synthetic cell creation, and researchers at Harvard stored a novel, several JPG images, and a JavaScript program (all together totaling about 650 kilobytes) in sequences of DNA. A biochemist at UCLA is currently working with the band OK Go to put out their next studio album on DNA (it’s not clear exactly how playback might work).
The process isn’t cheap, however. The Venter Institute experiment cost around $40 million, and the Swiss experiments were estimated to cost about $18 per kilobyte to encode, to say nothing of the storage and retrieval costs. It’s going to take a long time to compete with the industry-standard price of roughly $0.03 per gigabyte for magnetic and optical media … a cost that is still dropping even today.
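To put that gap in perspective, the back-of-the-envelope arithmetic using the two figures quoted above works out like this:

```python
# Back-of-the-envelope cost comparison using the figures quoted above.
DNA_COST_PER_KB = 18.0          # dollars per kilobyte to encode into DNA
DISK_COST_PER_GB = 0.03         # dollars per gigabyte for magnetic/optical media
KB_PER_GB = 1_000_000           # using decimal (SI) units for simplicity

dna_cost_per_gb = DNA_COST_PER_KB * KB_PER_GB
print(f"DNA encoding:  ${dna_cost_per_gb:,.0f} per GB")
print(f"Conventional:  ${DISK_COST_PER_GB} per GB")
print(f"Ratio: roughly {dna_cost_per_gb / DISK_COST_PER_GB:,.0f}x more expensive")
```

Even with generous rounding, the DNA route comes out hundreds of millions of times more expensive per gigabyte.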
Long-term storage of information has been a pressing problem through the years. The oldest known example of recorded information is on a piece of wood, the Dispilio tablet, thought to date back to 5260 BC. The tablet’s survival was a bit of a fluke, however, having been preserved only by being buried in mud and water for much of its history. It was protected there from deterioration, as it was kept from oxygen exposure and temperature extremes.
More pressing, perhaps, is the translatability of the encoding itself, regardless of the medium: the text preserved on the Dispilio tablet remains incomprehensible to modern scholars despite the survival of the writing itself.