Core memory
So all they've done is use the concept of core memory from the 60s and shrink it down?
A team of scientists in Germany and the US have developed a new kind of logic gate that could crack the size problems haunting the processor industry. The problem is that the conventional CMOS method of producing chips requires ever-smaller transistors, but once you get to working in measurements of single-figure nanometers, …
Ah, core memory....
I had a job interview at a company that was still making core memory in '85, and I know it was still common in newly deployed military hardware through the late '80s. Around 1988 I was working on a design that included a couple of US Navy AN/UYK-44 16-bit mini-computers, and some of the gold-braid types were nervous because we were proposing the use of brand new SRAM and EEPROM cards instead of core.
One reason they were nervous was that core stores have power-off retention and RAM (back there and back then) did not. If there's been a power outage - not unknown in military applications - then when the power comes back on, a core store based computer can continue running its program where it left off, with no lost or corrupted data.
My computer science instructor was a punch card tech with one of the biggest central personnel computers in Vietnam, when his hooch got hit with a rocket. He said it landed right where his head normally would have been and blew up his wall locker. Even the nerds got shot at over there!
This is not like Core, which is a memory bit set and reset with current in the wires passing through the magnetic donut. Also what you see has no good way to link to a second device, i.e. it is not "integrated."
The three inputs can be used to make this work like either a NOR gate or a NAND gate where I1 & I2 are the inputs and O is the output.
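One plausible way a three-input gate can be "programmed" like this is as a majority gate with an inverted output, which is the usual scheme in nanomagnet logic; the article doesn't spell out the exact mechanism, so treat this as a sketch of the idea, not the device's actual physics. Fixing the third input I3 selects whether the gate behaves as NAND or NOR:

```python
def inverted_majority(i1: int, i2: int, i3: int) -> int:
    """Output is the complement of the majority vote of the three inputs."""
    return 0 if (i1 + i2 + i3) >= 2 else 1

# Fixing the "programming" input I3 selects the logic function:
def nand(a: int, b: int) -> int:
    return inverted_majority(a, b, 0)   # I3 = 0: majority reduces to AND, inverted -> NAND

def nor(a: int, b: int) -> int:
    return inverted_majority(a, b, 1)   # I3 = 1: majority reduces to OR, inverted -> NOR

# Truth tables for both configurations:
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "NAND:", nand(a, b), "NOR:", nor(a, b))
```

Since NAND (or NOR) alone is functionally complete, a single programmable gate like this is in principle enough to build any logic circuit.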
Right now the input/output states of the device are measured by taking an image that reveals the polarization of the thin magnetic metal films (it makes them appear either bright or dark).
This demo is a large scale device and not really "nano" in size. The "60 nm" label and measurement bar you see in the photo are simply the thickness of the underlying metal film designated "I3" (which stands for "input #3"). It is not a gap, and not the actual size of a magnetic bit.
They are "programming" magnetic settings with a Focused Ion Beam (FIB, as noted in the upper right picture). A FIB is like a scanning electron microscope only it uses ionized atoms (usually Gallium) instead of electrons. The atoms are massive enough they can drill holes into the wafer, but that isn't being done here as part of device operation. (They may have used the FIB to cut into the device in order to see the cross section shown in the lower right. Most semiconductor fabs have a FIB just for this purpose. It is, however, a destructive operation.) A FIB is roughly the size of a billiard table, so that's not how a final magnetic circuit would be run.
I recall reading of an attempt in the 1940s to use neon based crosspoint switches as an alternative; it turns out that very small amounts of helium, and possibly 63Ni to help with hysteresis, were used to achieve similar results circa 1943.
As it turned out they could never make the neons switch reliably - too many thermal effects due to variations in processing.
Ironically, the same technique was later used (with silicon based HV switching) as the basis for the very first plasma screens.
Rumours of the death of Moore's Law have been around since Gordon himself said that it wouldn't be possible to design a chip with more than a quarter-million transistors. We're at several billion and counting, and I fearlessly predict that single-digit nanometre technology is not far off. IBM and others have been building switches atom-by-atom for several years now.