A new equation emerges at Brown University

ADDING UP: Jill Pipher is director of Brown's Institute for Computational and Experimental Research in Mathematics, which hosted a conference on the next generation of high-speed computers earlier this month. / PBN PHOTO/RUPERT WHITELEY

The design of the next generation of high-speed computers may have its roots, in the form of algorithms, in a gathering of world-class mathematicians earlier this month at Brown University.
More than 50 of the nation’s top mathematicians from industry, academia and national research laboratories gathered at Brown from Jan. 9-13 to wrestle with potential solutions to the design and economics of the next generation of exascale supercomputers.
The discussions took place at the headquarters of Brown University’s new Institute for Computational and Experimental Research in Mathematics (ICERM), a remodeled space formerly occupied by a law firm that school officials expect will become a magnet for creative undergraduate talent in information technology.
The common language of the discussions was algorithms, the step-by-step mathematical procedures used to tackle the challenges of the architecture and speed of future supercomputers. Such algorithms could be found sprouting up on the numerous whiteboards that dominate the wall space of the math research center.
The conference, sponsored by the U.S. Department of Energy, was entitled “Synchronization-Reducing and Communication-Reducing Algorithms and Programming Models for Large-scale Simulations.”
The conference’s challenge, explained ICERM Director Jill Pipher, was “to develop computers 1,000 times more powerful than the ones we have today, as measured in number of calculations that you can do per second,” while at the same time minimizing power consumption. The government’s deadline is 2018, and meeting it, Pipher continued, will require that a new series of algorithms be created. “These challenges are, at heart, mathematical,” she said.
More than an intellectual problem, the algorithmic challenges of exascale computing involve an economic equation rooted in physics: the costs of, and trade-offs among, energy use, the speed of computations and the distance data must travel to and from memory.
At stake, according to Jan S. Hesthaven, professor of applied mathematics at Brown and deputy director of ICERM, is the distinct economic advantage of supercomputing. “If you have it, and you know how to use it, it gives you an economic advantage for discovery and innovation.”
In designing the next generation of computational architecture, Hesthaven continued, power consumption is a big problem. “If you were to take what we have now, and scale it up, make it bigger, it simply wouldn’t work,” he said. “The amount of energy that it would take would be hundreds of millions of dollars; it would be too expensive.” The only way to do it, Hesthaven continued, is to think about it in a “radically different way that pushes mathematicians – that’s what we’re here for.”
David Keyes, a professor of applied physics and applied mathematics at Columbia University, said the answer could lie in revisiting algorithms that were considered in the past but were not well-adapted to the machines in use for the last 25 years.
“The main revolution between what we’ve been doing for the last 25 years and what we need to start doing, is to optimize computations,” Keyes said. “We’ve been counting arithmetic operations, but now, those cost much less than the energy costs of moving the data.
“We have to restructure algorithms that move the data around less, finding how to use less memory and less data motion,” he said.
And, Keyes added, “when you’re talking about a machine with a billion processors, using algorithms that synchronize – which is almost all of the algorithms in use today – they become much less optimal and much less interesting” in the design of future super computers.
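The shift Keyes describes can be sketched with a toy cost model (an illustration added for readers, not material from the conference; the energy figures, and the function names sweep_cost and fused_cost, are assumptions chosen only to show the qualitative effect). Both routines below perform the same arithmetic on an array of n values; they differ only in how many times each value is moved through the memory system.

    # Toy model: arithmetic is cheap, data motion is expensive.
    # The energy numbers are illustrative assumptions, not measurements.
    FLOP_ENERGY = 1.0      # assumed energy units per arithmetic operation
    MOVE_ENERGY = 100.0    # assumed energy units per value moved from memory

    def sweep_cost(n, k):
        """k separate passes over the data: k*n operations and k*n memory moves."""
        return k * n * FLOP_ENERGY + k * n * MOVE_ENERGY

    def fused_cost(n, k):
        """One pass that reuses each loaded value k times: k*n operations, n moves."""
        return k * n * FLOP_ENERGY + n * MOVE_ENERGY

    if __name__ == "__main__":
        n, k = 10**9, 10
        print(f"separate sweeps:   {sweep_cost(n, k):.3e} energy units")
        print(f"fused single pass: {fused_cost(n, k):.3e} energy units")
        # Same arithmetic in both cases; the fused version wins only because
        # it moves each value through the memory system once instead of k times.

Under these assumed costs, the single-pass version uses roughly a tenth of the energy of the repeated sweeps even though it performs exactly the same arithmetic: less data motion for the same computation, the kind of restructuring Keyes says the next generation of machines will demand.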
That Brown hosted the conference is a feather in the cap of the university and its new center.
The five-month-old, $15.5 million national research center is funded by the National Science Foundation – one of eight such centers and the only one in New England.
In its first four months of operation, between August and December, ICERM had a significant impact on the local economy. It generated about $165,000 in lodging revenue and about $506,000 in food revenue, and accounted for 7,480 hotel nights in Providence, according to Ruth Crane, ICERM’s assistant director. During the next two years, Crane continued, ICERM is projected to bring in between $1.5 million and $2 million in new revenue.
ICERM plans to employ a full-time staff of nine, up to 40 resident researchers, and up to 12 postdoctoral researchers annually. To date, in its first months of operation, it has hosted six conferences, attracting participants from around the globe.
One of the biggest impacts of ICERM is its role in attracting top students to Brown.
“The best undergraduates are being drawn to IT again,” Keyes said. “It’s not just law and business anymore.” •
