HP Works on 'The Machine', a New Server Architecture
Hewlett-Packard has announced 'The Machine', an ambitious project that aims to reinvent the basic architecture of computers and could replace today's energy-hungry data centers.
Today's digital universe holds about four zettabytes of data. By the end of the decade we will be starting to use a unit that few people have ever heard of: the brontobyte - a billion exabytes, or roughly two quadrillion years of music.
At an enterprise level, IT planners are looking out a decade and recognize that the current mix of technologies will have trouble keeping pace with the exponential growth of big data.
For Hewlett-Packard (HP), the solution to this challenge is 'The Machine.'
The Machine program aims to address this by coordinating and advancing four emerging technologies in parallel, before the rising flood of data overwhelms conventional legacy systems.
It starts with replacing general-purpose processors with special-purpose cores that integrate memory and networking into a single chip package (SoC). Building on the foundations laid by HP's Moonshot microservers, this promises to slash the energy that conventional microprocessors require and to chomp through huge amounts of data much more rapidly.
The new computer architecture is also based on memristors, a new form of computer memory that could replace both SRAM and DRAM. Memristor memory is "nonvolatile" - meaning that no electricity is needed to maintain the data. This massively reduces the energy required to store data.
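The practical difference is easiest to see in a toy model. The sketch below is purely illustrative - not HP's design - and contrasts DRAM-style memory, which loses its contents the moment refresh power stops, with memristor-style memory, which retains state with no standby power:

```python
# Toy model (illustrative only, not HP's actual design): volatile DRAM
# loses its contents on power-off; nonvolatile memristor-style memory
# keeps them without power.

class VolatileMemory:
    def __init__(self):
        self.cells = {}

    def write(self, addr, value):
        self.cells[addr] = value

    def read(self, addr):
        return self.cells.get(addr)

    def power_cycle(self):
        self.cells.clear()  # DRAM contents vanish without refresh power

class NonvolatileMemory(VolatileMemory):
    def power_cycle(self):
        pass  # memristor cells retain state at zero standby energy

dram, nvm = VolatileMemory(), NonvolatileMemory()
for mem in (dram, nvm):
    mem.write(0x10, "result")
    mem.power_cycle()

print(dram.read(0x10))  # None: data lost on power loss
print(nvm.read(0x10))   # "result": data survives
```

In a real system this means the machine's working data could survive a reboot without ever being written out to disk.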
'The Machine' also uses light beams to move data around at high speeds. HP is pushing ahead with optical links that relay bits via photons rather than electrons. This eliminates copper wires as the conduit, and with them a big source of energy and space inefficiency. HP envisions high-speed photonic fabrics - a term for the web of connections between processor cores - that would let unprecedented storage and computational resources be marshalled under a simplified programming model, moving data between hundreds of thousands of optimized computing cores and exabytes of memristor storage.
And finally, the fourth piece of HP's vision is a new operating system that orchestrates the flow of data between these hardware upgrades. HP says coders will be able to create applications that can manipulate and extract meaning from vastly larger data sets than is possible today.
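Programmers can already approximate one aspect of that model today by memory-mapping a file, so that "storage" is read and written like ordinary memory with no explicit serialization step. The sketch below is only an analogy for the unified memory/storage tier HP describes, using standard Python `mmap`:

```python
import mmap
import os
import struct

# Illustrative analogy only: memory-map a file so it behaves like
# directly addressable persistent memory. The OS HP describes would
# make all main memory behave this way, with no separate storage tier.
path = "state.bin"
with open(path, "wb") as f:
    f.write(b"\x00" * 8)  # reserve 8 bytes of "persistent memory"

# Write an integer directly into the mapped region, as if to RAM.
with open(path, "r+b") as f:
    with mmap.mmap(f.fileno(), 8) as mem:
        mem[:8] = struct.pack("<q", 42)

# A later mapping reads the same bytes back - no explicit file I/O
# or deserialization in the application logic.
with open(path, "r+b") as f:
    with mmap.mmap(f.fileno(), 8) as mem:
        value, = struct.unpack("<q", mem[:8])

print(value)  # 42

os.remove(path)  # clean up the demo file
```

The promise of The Machine's operating system is that this kind of direct, persistent access would be the default for all data, at memory speed rather than disk speed.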
HP hopes that it will be able to deliver on these technologies by 2020.
Commenting on HP's vision for the future, rival Dell described the attempt to come up with a new architecture for computers as "laughable".
"The notion that you can reach some magical state by rearchitecting an OS is laughable on the face of it," John Swainson, head of Dell's software business, told reporters in San Francisco on Thursday when asked to comment on the work.