A prototype computer with 160TB of memory has been unveiled by Hewlett Packard Enterprise.
Designed to work on big data, it could analyse the equivalent of 160 million books at the same time, HPE said.
The device, called The Machine, had a Linux-based operating system and prioritised memory rather than processing power, the company said.
HPE said its Memory Driven Computing research project could eventually lead to a “near-limitless” memory pool.
“The secrets to the next great scientific breakthrough, industry-changing innovation or life-altering technology hide in plain sight behind the mountains of data we create every day,” said HPE boss Meg Whitman.
“To realise this promise, we can’t rely on the technologies of the past, we need a computer built for the big data era.”
Prof Les Carr, of the University of Southampton, told the BBC The Machine would be fast but big data faced other challenges.
“The ultimate way to speed things up is to make sure you have all the data present in your computer as close to the processing as possible so this is a different way of trying to speed things up,” he said.
“However, we need to make our processing… not just faster but more insightful and business relevant.”
“There are many areas in life where quicker is not necessarily better.”