Chocolate and peanut butter, tea and scones, gin and tonic: they’re all great combinations, and today we have a new binary mixture — Quantum and AI. Do they actually mix well together? Quadrant, a new spin-out from D-Wave Systems, certainly seems to think so. D-Wave has been …
In a broad sense, the history of computing is the constant search for the ideal system architecture. Over the last few decades system architects have continually shifted back and forth from centralized configurations where computational resources are located far from the user to …
In the long run, networking chip giant and one-time server chip wannabe Broadcom might regret selling off its “Vulcan” 64-bit Arm chip business to Cavium, soon to be part of Marvell. The ThunderX2 processors based on the Vulcan designs have been tweaked by Cavium and have been en …
It is fair to say that containers in HPC are a big deal. Nothing more clearly shows the critical nature of any technology than watching the community reaction when a new security issue is discovered and released. In a recent announcement from the team over at Sylabs, they stated …
A team at Intel, in collaboration with QuTech in the Netherlands, is researching the possibilities of quantum computing to better understand how practical quantum computers can be programmed to impact our lives. Given the research nature and current limitations of quantum compute …
As enterprises continue to spread their workloads around – keeping some in their core datacenters while placing others in either private clouds or sprinkling them among disparate public clouds – the portability, visibility and management of those applications becomes an issue. Th …
In the first article outlining some of the results from our AI survey, we discussed how most customers are just beginning their journey into AI and that very few have actual AI applications in production. In this article, we are going to talk about the whats and whys behind AI. I …
Cloud computing has become an essential infrastructure strategy for nearly every business. Last year Gartner predicted that demand for infrastructure as a service would increase by 36.8 percent. A 2018 McAfee survey found that 97 percent of organizations are using cloud services from …
VMware’s $1.26 billion acquisition of network virtualization startup Nicira in 2012 sent ripples through the tech world. Through the deal, VMware, which had made its name as a pioneer of server virtualization technology, planted a flag in the burgeoning software-defined networkin …
To give hungry customers a high-quality, gourmet AI experience, new and exotic recipes are being constructed in a race to dream up ever more exciting and tasty concoctions from traditional software and hardware staples. Over at Intel, AI is clearly the highlight of its current tas …
The growing amounts of data that are being generated due to such trends as the Internet of Things (IoT) and cloud computing have naturally begotten the need for data scientists who can collect, analyze and, most importantly, interpret these massive stockpiles of complex information …
It is hard to make a profit selling hardware to supercomputing centers, hyperscalers, and cloud builders, all of whom demand the highest performance at the lowest prices. But in the first quarter of this year, network chip, adapter, switch, and cable supplier Mellanox Technologie …
Without splitting a lot of hairs on definitions, it is safe to say that machine learning in its myriad forms is absolutely shaking up data processing. The techniques for training neural networks to chew through mountains of labeled data and make inferences against new data are se …
It has been more than a decade since AMD was a force in computing in the datacenter. For that reason, we have not wasted a lot of time going over the ins and outs of its quarterly financials. But now that the Epyc CPUs and Radeon Instinct GPU accelerators are getting traction amo …
The demand for compute is so strong among the hyperscalers and cloud builders that nothing seems to be slowing down Intel’s datacenter business. Not delays in processor rollouts due to the difficulties in ramping 14 nanometer and 10 nanometer processes as the pace of Moore’s Law …