
to be exact
It's actually 10,457,355,285 years, 151 days, 1 hour and 4 minutes, give or take a few leap seconds.
US researchers have demonstrated a form of nanotube archival memory that can store a memory bit for a billion years, and has a theoretical trillion bits/square inch density. The researchers at the U.S. Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) and the University of California (UC) Berkeley …
Dave, check out Neal Stephenson's novel "The Diamond Age" for ideas on that score. In the book, engineering has gone nano-scale, with devices that look like ultra-small versions of old-school Victorian engineering - all pushrods and valves and stuff like that.
Incidentally, there are several things that the article avoids mentioning, hence the other reason for the "fiction" in the title. Yeah, this thing can hold its state for umpty-tum zillion years, and that's great. But it takes about 3s for a bit to change state (according to their data), which seriously limits its usefulness for any purpose except offline backups. They also don't mention anything about how many times it can change state over its lifespan, which is a *very* important issue for a mechanical system. Nor have they checked anything about this thing's stability in an environment with electric fields created by other devices, which is a bit like saying "this amazing ice cube will last for a trillion years without melting" and leaving out the disclaimer "... if I keep it stored in a freezer for a trillion years".
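For a sense of scale on that 3s-per-bit figure, here's a rough back-of-the-envelope sketch (assuming purely serial writes; real hardware would presumably flip lots of cells in parallel, so treat it as a worst case):

# Rough throughput estimate for a memory cell that takes ~3 s to flip a bit.
# Assumes purely serial writes; a real device would presumably write many
# cells in parallel, so this is a worst-case illustration only.

SECONDS_PER_BIT = 3
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def serial_write_time_years(num_bytes):
    """Time to write num_bytes one bit at a time, in years."""
    bits = num_bytes * 8
    return bits * SECONDS_PER_BIT / SECONDS_PER_YEAR

for label, size in [("1 KB", 1024), ("1 MB", 1024**2), ("1 GB", 1024**3)]:
    print(f"{label}: ~{serial_write_time_years(size):,.3f} years to write serially")

Roughly 800 years for a single gigabyte written serially, which is why "offline backups" is about the kindest use case you can name for it.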
"The nano-structure was created in a single step by pyrolysis of ferrocene in argon at 1,000 degreees C. The created nanotube elements are dispersed in isopropanol ultrasonically and deposited on a substrate with electrical contacts applied to the ends of the nanotube."
Straight forward enough! Thanks for making that so clearly understandable!
Yes, lots of things, like the space elevator for a start.
I'm frankly embarrassed that physicists working on these kinds of projects are so poorly trained that they don't understand some of the most fundamental concepts of materials engineering (which almost any physics student in this country will tell you is nothing more than physics with all the hard maths taken out).
Fundamental concepts like... the equilibrium concentration of thermal defects, fracture mechanics, sp2/sp3 hybridization and all those other 'little' problems affecting how the achievable mechanical properties of carbon nanotubes scale. When it's 2nm long it's all well and good at 1/3 of the theoretical strength, but when the strongest carbon structure past 2mm is weaker than the majority of low-grade steels and still hellishly difficult to make... well, draw your own conclusions (or see the rough numbers at the end of this comment).
Have a quick read of the 'Gigatubes' section of this, if you want a much better scientist's explanation:
http://www.msm.cam.ac.uk/phase-trans/2005/MST7118.pdf
I don't dispute that materials on the sub micro-scale have behaviours that are exciting and unusual, but the very nature of the beast means that they don't scale very well. You can have one end or the other. Not both.
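To put a toy number on the thermal-defect point: the equilibrium fraction of defective sites goes roughly as exp(-E_f/kT), and a tube is only as strong as its weakest segment, so the odds of a long tube being defect-free collapse with length. The formation energy and site spacing below are illustrative placeholders, not measured nanotube values:

import math

# Toy weakest-link estimate: the equilibrium fraction of defective lattice
# sites is ~exp(-E_f / kT); a tube is only as strong as its worst segment,
# so the chance of being defect-free falls off exponentially with length.
# E_FORMATION and SITE_SPACING_NM are illustrative placeholders only.

K_B = 8.617e-5          # Boltzmann constant, eV/K
E_FORMATION = 0.9       # assumed defect formation energy, eV (placeholder)
T = 1273.0              # synthesis temperature, K (~1,000 C, as in the article)
SITE_SPACING_NM = 0.25  # assumed spacing between candidate defect sites, nm

p_defect = math.exp(-E_FORMATION / (K_B * T))  # per-site defect probability

for length_nm in (2, 2e3, 2e6):  # 2 nm, 2 um, 2 mm
    sites = length_nm / SITE_SPACING_NM
    p_clean = (1.0 - p_defect) ** sites
    print(f"{length_nm:>10.0f} nm: P(defect-free) ~ {p_clean:.3g}")

With those made-up but not unreasonable numbers, a 2nm tube is almost certainly pristine while a 2mm one is essentially guaranteed to contain defects somewhere along its length, which is the whole problem with scaling the headline strength figures.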
""The nano-structure was created in a single step by pyrolysis of ferrocene in argon at 1,000 degreees C. The created nanotube elements are dispersed in isopropanol ultrasonically and deposited on a substrate with electrical contacts applied to the ends of the nanotube."
Straight forward enough! Thanks for making that so clearly understandable!" .... By Steve Swann Posted Tuesday 9th June 2009 06:57 GMT
Sounds like Python on Sticky Speed and Sweet Ruby Red Wine, Steve.
And QuITe Perfect for whenever the Future is Steered by what we do Today with what we know of Tomorrow, with the Past only a Never to Return Proxy Memory Occupying the Minds of the Slow-Witted as they are EduTained Sublimely to a Higher Beta Operating Standard.
Here's a link to a CERN paper describing research they have done on the stability of their (disc) storage systems. They reckon their system has silent bit errors in the 10^-7 range. That sounds small, but at that rate a terabyte of disc may contain 3 corrupted files. What is worse is that you won't know which files are corrupt - all the built-in error detection has been defeated by the sheer scale of the storage (the usual answer is to keep your own checksums and scrub them, as sketched at the end of this post).
http://indico.cern.ch/getFile.py/access?contribId=3&sessionId=0&resId=1&materialId=paper&confId=13797
Less scientific review of the paper:
http://storagemojo.com/2007/09/19/cerns-data-corruption-research/
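Since the drive's own error detection clearly isn't enough at this scale, the practical defence is to keep application-level checksums and re-verify ("scrub") them periodically. A minimal sketch - the manifest filename and command-line handling here are made up for illustration:

import hashlib
import json
import sys
from pathlib import Path

# Minimal "scrub" sketch: record a SHA-256 per file once, then re-verify later
# to catch silent corruption that the disk's own error detection missed.
# The manifest filename and directory layout are illustrative only.

def sha256_of(path, chunk_size=1 << 20):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(root, manifest="checksums.json"):
    sums = {str(p): sha256_of(p) for p in Path(root).rglob("*") if p.is_file()}
    Path(manifest).write_text(json.dumps(sums, indent=2))

def verify_manifest(manifest="checksums.json"):
    sums = json.loads(Path(manifest).read_text())
    return [p for p, digest in sums.items() if sha256_of(p) != digest]

if __name__ == "__main__":
    if sys.argv[1:2] == ["build"]:
        build_manifest(sys.argv[2] if len(sys.argv) > 2 else ".")
    else:
        for p in verify_manifest():
            print(f"silent corruption (or modification) detected: {p}")

This is essentially what filesystems like ZFS do for you automatically, with block-level checksums and scheduled scrubs.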
I think you've missed the standard caveat that this is *years* and *years* away from being practical.
I agree that this is meant for long-term storage, and to answer your question about stability: you could always shield the chip to limit its exposure to electrical noise from other devices, apart from the currents you actually want to apply.
If they can shrink the length of the tube and increase the sensitivity of their measurements, it will be faster.
Imagine if they can, when writing, 'shoot' the metal particle to one end of the tube or the other. Then you have your 1/0 state, easily read. My guess is that it's the length of the tube that helps give it the really long data lifespan. If they shorten it, will it at least last 1000 years? If so, and Moore's law kind of holds, they can probably figure out how to make it small, fast and able to last a billion years within a couple of decades.
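That "couple of decades" guess is easy to sanity-check with a toy Moore's-law extrapolation (the 18-month doubling period, and the idea that the ~3s switching time is the thing that improves, are purely my assumptions):

# Toy Moore's-law extrapolation: how much improvement do steady doublings
# buy you over a couple of decades? The 18-month doubling period and the
# starting 3 s/bit switching time are assumptions, not measurements.

DOUBLING_PERIOD_YEARS = 1.5
START_SECONDS_PER_BIT = 3.0

for years in (10, 20, 30):
    doublings = years / DOUBLING_PERIOD_YEARS
    speedup = 2 ** doublings
    print(f"after {years} yr: ~{speedup:,.0f}x -> {START_SECONDS_PER_BIT / speedup:.2e} s/bit")

Twenty years of doublings gets you from 3 seconds per bit to a fraction of a millisecond - still slow by memory standards, but no longer a joke for archival use.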
Too bad anyone trying to read it in a billion years will probably lack the technology...