A lab is a place where the impossible can be made possible, and lab folks are typically kept behind closed doors, where they can engage in their imagining unmolested. But HP’s tagline is “Invent,” so it gave attendees at its Discover conference in Las Vegas last week a peek behind the curtain at a project that, the company said, represents the future of technology.
We have been using the same fundamental computing architecture for sixty years, noted HP CEO Meg Whitman, and 90 per cent of the time and energy used by a current-generation computer goes into moving data from one tier of memory to another, via copper wire. That takes time, creates heat, and wastes energy. HP, she said, is building a new way to compute, from the ground up, and it will be available by the end of the decade.
It’s called “The Machine.”
“The Machine is not a server, or a laptop, or a mainframe,” she said. “It’s a continuum. It’s all those things and more. A new approach to managing a distributed information landscape.”
Martin Fink, HP’s CTO and head of HP Labs, explained that, rather than using general purpose processors like today’s servers, The Machine will have specialized processors customized for the computing it’s doing. That allows them to use just the amount of energy necessary for the task. Those processors will be connected to a large single pool of “universal memory” (memory used for both RAM and storage), not with copper wire, but using photonics – yup, that’s light. With photonics, said Mr. Fink, a fibre smaller than a human hair should be capable of transferring 6 terabytes of data per second. This will allow The Machine to deal with massive datasets using what HP says is orders of magnitude less energy. It will be able to address any one byte of data in a 160 petabyte (a petabyte is one million gigabytes) data store in under 250 nanoseconds (a nanosecond is one billionth of a second).
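To put those numbers in perspective, here is a quick back-of-the-envelope sketch in Python. The 6 terabytes-per-second and 160-petabyte figures come from HP’s claims above; the light-travel framing is our own illustration, not anything HP stated:

```python
# Back-of-the-envelope arithmetic on HP's quoted figures.

TB = 10**12   # terabyte, in bytes
PB = 10**15   # petabyte, in bytes (one million gigabytes)

link_rate = 6 * TB     # 6 terabytes per second over one photonic fibre
pool_size = 160 * PB   # the 160-petabyte universal memory pool

# How long would one fibre take to stream the entire pool?
full_scan_seconds = pool_size / link_rate
print(f"Full scan over one fibre: {full_scan_seconds:,.0f} s "
      f"(~{full_scan_seconds / 3600:.1f} hours)")

# Light covers roughly 0.3 metres per nanosecond in a vacuum (less in
# fibre), so the 250 ns access window also bounds how physically large
# the memory fabric can be.
light_m_per_ns = 0.3
print(f"Distance light travels in 250 ns: ~{250 * light_m_per_ns:.0f} m")
```

Even at 6 terabytes per second, a single fibre would need roughly seven and a half hours to read the whole pool once – which is why the design leans on many fibres addressing the pool in parallel, each byte reachable in under 250 nanoseconds.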
Storage will be courtesy of a technology called memristors, a low-cost form of high performance non-volatile memory (when you turn off the power, non-volatile memory doesn’t lose its data as conventional memory does) that proponents claim could ultimately replace not only flash memory and SSDs, but several types of RAM as well. HP initially previewed its work with memristors late last year at HP Discover in Barcelona.
Each module of The Machine is tiny – a mock-up presented at Discover fits easily into a man’s shirt pocket – and HP foresees them being used to build a distributed mesh network to serve the Internet of Things, as well as ending up in PCs, servers, supercomputers, and smartphones.
“We want you to be able to store your entire life – think of 100 terabytes – on your smartphone,” said Mr. Fink.
This new architecture requires an operating system built to serve its unique needs, which is one reason HP chose to reveal it now: it wants the new OS to be open source, and that means the community has to start learning about the technology. But, to hedge its bets, HP engineers are also developing a Linux environment for The Machine, and are working on a version of Android optimized for non-volatile memory systems. There’s yet another team working on management tools.
And when is all this coming together? HP’s roadmap has memristors (which have been a challenge to manufacture, from all reports) available in small numbers next year, along with the Machine OS software development kit. The Machine OS is to be in beta testing in 2017 and released in 2018, and commercial versions of The Machine available in 2019.
Optimistic? We’ll see. There are a lot of pieces that have to coalesce, and a lot of partners who have to buy in to make all this work. But if it does work, it will turn computing on its ear.
And why call it “The Machine”?
Said Mr. Fink, “We called it that because HP Labs does not have a marketing department.”