Back in the late 1970s and early 1980s, big iron in datacenters started out with water cooling, which was a pain in the neck in terms of the system and facilities engineering. And it was a big deal – and a big competitive threat – when former IBM System/360 architect Gene Amdahl … | Continue reading
Lustre has been an essential component of HPC systems for a decade and a half, and has experienced a somewhat turbulent history of shifting ownership followed by uncertain support from various backers as an open source project. Now, with DataDirect Networks acquiring Intel’s Lus … | Continue reading
It has been difficult for the supercomputing community to watch the tools they have honed over many years get snatched up by a more commercially oriented community like AI and popularized. Without tireless work at supercomputing sites around the world, much of what AI hinges upon … | Continue reading
A lot of money and time is being thrown at quantum computing by vendors, including IBM, Google, Microsoft, and Intel, and there is the normal competitiveness between the United States and China and Europe as well as work in Japan. We are at the early stages of quantum computer de … | Continue reading
HPC luminary Jack Dongarra (University Distinguished Professor, University of Tennessee) presented a new direction for math libraries at the International Supercomputing Conference (ISC) in Frankfurt, Germany, in his presentation “Numerical Linear Algebra for Future Extreme-Scale Sy … | Continue reading
The irony, of course, is that there is never a summit when it comes to supercomputing. The tectonic forces of Moore’s Law, voracious appetites for compute, and government budgets that somehow pay the ever-larger bill keep pushing up the mountain ranges, higher and higher, startin … | Continue reading
If everything had played out as planned, then the original “Aurora” supercomputer planned by Intel and built by Cray for Argonne National Laboratory under contract from the US Department of Energy would probably have been at or near the top of the Top 500 charts this week at the … | Continue reading
Edge computing can mean different things to different people, as is the case with any new phenomenon in the IT sector. We like the definition that an Intel executive once gave us, which is that the edge, as we like to call it here at The Next Platform, is basically the transformat … | Continue reading
The shenanigans with the Top 500 rankings of the world’s most powerful supercomputers continues, but there are a bunch of real supercomputers that were added to the list for the June 2018 rankings, and we are thankful, as always, to gain the insight we can glean from the Top 500 … | Continue reading
It has been four years since Kirk Bresniker, HPE Fellow, vice president, and chief architect at Hewlett Packard Labs, stood before a crowd of journalists and analysts at the company’s Discover show and announced plans to create a new computing architecture that puts the focus on … | Continue reading
Japanese computer maker Fujitsu, which has four different processors under development at the same time aimed at different workloads in the datacenter – five if you count its digital annealer quantum chip – has unveiled some of the details about the future Arm processor, as yet u … | Continue reading
The incumbent switch makers of the world could learn a thing or two from the server racket. Well, they actually are, but it is because the hyperscalers and cloud builders of the world have been schooling them about disaggregating the components of the switch to open up their arch … | Continue reading
The server market has been spoiling for a fight for so long that it is hard to remember a time when there was intense competition across multiple processor vendors and architectures. But we have long memories, as many of you do, as well as long time horizons looking out into the … | Continue reading
Pathology laboratories are big data environments. However, these big data are often hidden behind expert humans who manually and with great care visually parse large, complex, and detailed datasets to provide critical diagnoses. Humans, it turns out, are amazingly detailed and accur … | Continue reading
If the ecosystem for Arm processors is going to grow in the HPC arena, as many think it can, then someone has to make the initial investments in prototype hardware and help cultivate the software stack that will run on current and future Arm platforms. Sandia National Laboratorie … | Continue reading
Linux has gradually grown in importance along with the Internet and now the hyperscalers that define the next generation of experience on that global network. Most of the software running at the hyperscalers – with the exception of Microsoft, of course – is built upon Linux and ot … | Continue reading
There is no shortage of data in the life sciences. Every talk about bioinformatics and genomics has to include the now ubiquitous hockey stick growth graph of digital DNA. It just isn't a real life-science talk if there isn't one of these graphs. These graphs have basically becom … | Continue reading
The Barcelona Supercomputing Center is one of the flagship HPC facilities in Europe and it arguably has the most elegant and spiritual of datacenters in the world, located inside of Torre Girona Chapel. While BSC has been a strong proponent of IBM’s Power processors in past syste … | Continue reading
When it comes to deep learning chip startups, hype moves fast but crossing the finish line to real production silicon takes an incredibly long time. There are several incumbents on the custom hardware side aiming for the AI training and inference market but outside of Google’s TP … | Continue reading
This year at the GPU Technology Conference (GTC18) our roaming camera crew was lucky enough to catch Dolly Wu, Inspur’s VP & GM of their Datacenter/Cloud division. Inspur has been the fastest growing server vendor over the past couple of years, according to Gartner, and we expect … | Continue reading
The challenge with many of the complex modern technologies that are coming into datacenters is making them easy and cheap enough for enterprises to use at their own scale, which is much more limited than that of hyperscalers and cloud builders, and employing their own skillsets … | Continue reading
With Intel having significant difficulties in ramping up its 10 nanometer manufacturing processes and not really talking much about its plans for 7 nanometers, there has never been a better time for its few remaining rivals in chip manufacturing to give their respective CPU and G … | Continue reading
Believe it or not, Cisco Systems has a bunch of customers for its UCS blade and rack servers that are in the gaming industry, which has its share of near-hyperscale players who have widely geographically distributed clusters spread around the globe so players can get very low lat … | Continue reading
Like any emerging technology, artificial intelligence and various components like machine learning and deep learning are getting a lot of hype, with a continuous flow of analyst reports and news stories detailing how they all will change how business is done, research is conducte … | Continue reading
One cloud was never going to be enough, no matter how much Amazon Web Services wants it to be otherwise. It is an increasingly multicloud world, and enterprises want to know that they can run their applications and services on any of the major public clouds as well as virtualized … | Continue reading
There are several competing processor efforts targeting deep learning training and inference but even for these specialized devices, the old performance ghosts found in other areas haunt machine learning as well. Some believe that the way around the specter of Moore’s Law as well … | Continue reading
On the face of it, if you just look at the top level numbers, the server market is booming like we have not seen since the recovery in the wake of the Great Recession for a few quarters here and there between late 2009 and early 2011. That recovery was a bit spikey, but this one … | Continue reading
Nvidia got a little taste of hardware, and the company’s top brass have decided that they like having a lot of iron in their financial diet. And to that end, the company is becoming more involved in the way system components for GPU compute are manufactured and is itself providin … | Continue reading
Last week at the Fujitsu Forum in Tokyo, Lisa Spelman, who is general manager of Xeon products and Data Center Marketing at Intel, did a soft announcement of the hybrid Xeon CPU-Arria 10 FPGA chip that the company has been talking about for years and that is now available … | Continue reading
It has been a long time since the Japan Meteorological Agency has deployed the kind of supercomputing oomph for weather forecasting that the island nation | Continue reading
Any new and powerful technology always cuts both ways. The rapid rise of the machine learning flavor of artificial intelligence is due to the fact that, un | Continue reading
In the United Kingdom, there is a topical BBC Radio 4 comedy panel show called I’m Sorry I Haven’t A Clue. On this show, they often host a game segment whe | Continue reading
“Death and taxes” is a phrase that is usually attributed to Benjamin Franklin from a quote in a 1789 letter: “In this world nothing can be said to be certain, except death and taxes.” Public cloud computing providers didn’t exist back in the days of Franklin, but if they did, the … | Continue reading
Success can be its own kind of punishment in this world. Since the dawn of modern computing 130 years ago with tabulating machines derived from looms, there have always been issues of scale when it comes to compute and storage. While all modern businesses worry about the IT infra … | Continue reading
Way back in the early days of the commercial Internet, when we all logged into what seemed to be new but what was actually a quite old service used by academic institutions and government agencies that rode on the backbones of the telecommunications network, there were many, many … | Continue reading
One of the most important lessons in marketing is that you don’t change something that is working, but that you also have to be able to carefully and cautiously innovate to protect against changing tastes or practices that might also spell doom for the business. Two and a half de … | Continue reading
Any processor that hopes to displace the Xeon as the engine of choice for general purpose compute has to do one of two things, and we would argue both: It has to be a relatively seamless replacement for a Xeon processor inside of existing systems, much as the Opteron was back in … | Continue reading
Building custom processors and systems to annotate chunks of DNA is not a new phenomenon but given the increasing complexity of genomics as well as explosion in demand, this trend is being revived. Those that have been around in this area in the last couple of decades will recall … | Continue reading
Sometimes, to appreciate a new technology or technique, we have to get into the weeds a bit. As such, this article is somewhat more technical than usual. But the key message is that new libraries called ExaFMM and HiCMA give researchers the ability to operate on billion by billion … | Continue reading
Even if Nvidia had not pursued a GPU compute strategy in the datacenter a decade and a half ago, the company would have turned in one of the best periods in its history as the first quarter of fiscal 2019 came to a close on April 29. As it turns out, though, the company has a fas … | Continue reading
Broadcom may not have wanted to be in the Arm server chip business any more, but its machinations since it was acquired by Avago Technology two years ago have certainly sent ripples through that nascent market. It did it in the wake of buying Broadcom, and now it looks like it is … | Continue reading
Google did its best to impress this week at its annual I/O conference. While Google rolled out a bunch of benchmarks that were run on its current Cloud TPU instances, based on TPUv2 chips, the company divulged a few skimpy details about its next generation TPU chip and its systems … | Continue reading
Hitachi is a massive multi-national conglomerate that has more than 300,000 employees and 950 subsidiaries and a reach that extends into a wide array of industries, from aircraft and automotive systems to telecommunications, construction, defense and financial services. It also i … | Continue reading
While there is a battle of sorts going on between hyperconverged architectures and disaggregated ones, it is probably safe to assume that at the scale that most enterprises run, they couldn't care less about which one they choose so long as either architecture does what they need to … | Continue reading
Back in the early 1990s, when IBM was having its near-death experience as the mainframe business faltered, Unix systems were making huge inroads into the datacenter, and client/server computing was pulling work off central systems and onto PCs, the company was on the ropes and pr … | Continue reading
The world of AI software is quickly evolving. New applications are coming on the scene on almost a daily basis, and now is a good time to try to get a handle on what people are really doing with machine learning and other AI techniques and where they might be headed. In our first … | Continue reading
One of the most common misconceptions about machine learning is that success is solely due to its dynamic algorithms. In reality, the learning potential of those algorithms and their models are driven by the data preparation, staging and delivery. When suitably fed, machine learn … | Continue reading
Dell EMC has long been a vocal proponent of NVM-Express, the up-and-coming protocol that cuts out the CPU jib-jab with PCI-Express peripherals and that boosts throughput and drops latency for flash and other non-volatile memory. For the past two years, Dell, like other system make … | Continue reading