Sunday, April 19, 2015

"Happy Birthday to Moore’s Law" (plus party pooper Vaclav Smil)

First up, the Washington Post, Apr. 16:
Few revolutions can be said to have lasted for half a century, or to have wrought disruptive change at a predictable pace.

But that’s exactly the case with the digital revolution, which has seen computing get dramatically faster, cheaper and smaller every few years since the 1950’s.

The remarkable prophecy that anticipated that phenomenon is known as Moore’s Law, which turns 50 on April 19. In a four-page article for Electronics magazine, long-time Intel chief executive Gordon Moore (then head of R&D at Fairchild Semiconductor) made his famous prediction that, for the foreseeable future, the number of components on semiconductors or “chips” would continue to double every twelve to eighteen months even as the cost per chip would hold constant.

Moore originally thought his prediction would hold for a decade, but half a century later it’s still going strong. Computing power — and related components of the digital revolution including memory, displays, sensors, digital cameras, software and communications bandwidth — continue to get faster, cheaper, and smaller roughly at the pace Moore anticipated.

Moore’s Law is driven, as Moore explained, largely by economies of scale in producing chips, improvements in design, and the relentless miniaturization of component parts. The smaller the chip, the cheaper the raw materials. Transistors, the building blocks for chips, have fallen in price from $30 each 50 years ago to a nanodollar today, roughly $0.000000001. That low price encourages more uses, which raises production and lowers costs in a virtuous cycle.

Miniaturization also means a shorter distance that electricity has to travel to activate software instructions. Smaller, denser chips are consequently not only cheaper to make, they use less power and perform better. Much better. With each cycle of Moore’s Law, computing power doubles, even as price holds constant. It is the prime example of what Paul Nunes of Accenture and I call an “exponential technology.”

It’s hard to get your head around the impact of a core commodity whose price and performance have improved by a factor of two every two years for half a century. (Compare that to commodities such as oil or meat, which get worse and more expensive.) One example I use to help make Moore’s Law concrete is to compare the performance, cost and size of the Univac I, sold in the mid-1950s, with devices available now....MORE
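To make the scale of that claim concrete, here is a quick back-of-the-envelope sketch using only the figures quoted above (the factor-of-two-every-two-years pace and the $30-to-a-nanodollar price drop); the endpoints are the article's, not mine:

```python
# 25 doublings in 50 years at "a factor of two every two years"
doublings = 50 / 2
print(f"{2 ** doublings:,.0f}x performance gain")   # 33,554,432x

# Price per transistor: from $30 to roughly a nanodollar ($1e-9)
print(f"{30 / 1e-9:,.0f}-fold price decline")        # 30,000,000,000-fold
```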
From Medium, a deep dive:

How Gordon Moore Made “Moore’s Law”

The definitive story behind the rule that explained why our world changed — and is still changing — at a rate that’s still too awesome to grasp
On April 19, 1965, chemist Gordon Moore published an article in Electronics magazine that would codify a phenomenon that would shape our world. At its core was a non-intuitive, and incredibly ballsy, prediction that with the advent of microelectronics, computing power would grow dramatically, accompanied by an equally dramatic decrease in cost. Over a period of years and decades, the exponential effect of what would be known as “Moore’s Law” would be the reason why, for instance, all of us carry in our pockets a supercomputer that only years earlier would have cost billions of dollars and filled many rooms. We call them “phones.”

In a new and definitive biography of Moore — called, naturally, Moore’s Law —  authors Arnold Thackray, David Brock and Rachel Jones provide a thorough look at the man and his times. But perhaps its key section, printed below, tells the story behind the eponymous breakthrough that epitomizes the digital age — that fateful publication that still resonates a half century later.

Moore first began to develop his insight in 1959 when he worked for Fairchild Semiconductor, but he did not discuss the idea publicly for several years. In 1962 and 1963 he contributed some of his thoughts to, respectively, a science yearbook and a microelectronics textbook. But it was not until 1965, in that historic Electronics piece, that the world would see what became known as Moore’s Law: a regular doubling of computer power and halving of its cost.

Here is how Gordon Moore shared his “law” with the world. — Steven Levy


In February 1965, Gordon found his opportunity to engage directly with the wider electronics community: a letter from Lewis Young, editor of the weekly trade journal Electronics, asking for an in-depth piece about the future of microcircuitry. Electronics was well established and widely read, with a mix of news reports, corporate announcements, and substantial articles in which industry researchers outlined their recent accomplishments. It covered developments both within the semiconductor industry and in electronics more broadly, giving technology and business perspectives.

Young was planning a thirty-fifth anniversary issue, including a series titled “The Experts Look at the Future.” As the sole microchip expert in the issue, Gordon’s words would reach sixty-five thousand subscribers. It was the moment that he had been waiting for. He made a giant asterisk mark with his pencil at the top of Young’s invitation and underlined an exhortation to himself: “GO-GO.” Answering Young, he admitted, “I find the opportunity to predict the future in this area irresistible and will, accordingly, be happy to prepare a contribution.” Within a month he had drafted his manuscript: “The Future of Integrated Electronics.”

The piece reiterated much of what Moore had already written, but sought to be more engaging. Gordon’s confidence and comfort in his expert position shone through in his subtle use of dry humor and a clear, low-key style. His conscious attempt at warmth was designed to persuade readers both to buy into the future he foresaw and to help create it. Included, for the first time, were several explicit numerical predictions. He telegraphed the gist of his argument in a brief summary for the Fairchild lawyer who would review his draft: “The promise of integrated electronics is extrapolated into the wild blue yonder, to show that integrated electronics will pervade all of electronics in the future. A curve is shown to suggest that the most economical way to make electronic systems in ten years will be of the order of 65,000 components per integrated circuit.”

The claim was nothing if not bold. Sixty-five thousand transistors per silicon microchip (up from sixty-four in 1965) would be a remarkable level of complexity. These microchips with sixty-five thousand transistors would represent the most economical way to make electronic products. Gordon’s message was simple and stunning. Silicon microchips made better and cheaper electronics. Applications would widen throughout industry, technology, and society, and possibilities would emerge for computers to develop unprecedented capabilities.

In his opening paragraph, Moore set the tone: “The future of integrated electronics is the future of electronics itself.” Since the actual future lay beyond his reach, he aimed not “to anticipate these extended applications, but rather to predict for the next ten years the development of the integrated electronics technology on which they will depend.” Silicon microchips were now “an established technique.” Nowhere was this truer than in military systems, where reliability, size, and weight requirements were “achievable only with integration,” making silicon microchips mandatory. Beyond this, the use of microchips in mainframes was already surpassing conventional electronics in both cost and performance. Complex microchips of high quality would “make electronic techniques more generally available throughout all of society,” enabling the smooth operation of “many functions that are done inadequately by other techniques or not at all.” Existing technologies would be refashioned or replaced by electronics-based approaches, providing fresh technical, social, and economic functions....MUCH MORE
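Worth pausing on that 65,000 figure. It is just an annual doubling compounded over the article's ten-year horizon, starting from the sixty-four components of 1965 (my arithmetic, not the book's):

\[
64 \times 2^{10} = 65{,}536 \approx 65{,}000
\]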
Finally, Vaclav Smil, writing for the brainiacs at IEEE Spectrum:

Moore’s Curse

There is a dark side to the revolution in electronics: unjustified technological expectations
In 1965, having watched the number of components on a microchip double every year, Gordon Moore predicted [pdf] that “certainly over the short term this rate can be expected to continue.” In 1975 he revised [pdf] the doubling rate to two years; later, it settled down at about 18 months, or an exponential growth rate of 46 percent a year. This is Moore’s Law.
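(A quick interpretive note on that 46 percent: it follows if you read an 18-month doubling time as a continuous exponential rate, which I am assuming is how Smil computed it:

\[
r = \frac{\ln 2}{T} = \frac{0.693}{1.5\ \text{yr}} \approx 0.46\ \text{per year}
\]

Compounded annually instead, an 18-month doubling works out to roughly 59 percent a year.)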

As components have gotten smaller, denser, faster, and cheaper, they have increased the power and cut the costs of many products and services, notably computers and digital cameras but also light-emitting diodes and photovoltaic cells. The result has been a revolution in electronics, lighting, and photovoltaics.

But the revolution has been both a blessing and a curse, for it has had the unintended effect of raising expectations for technical progress. We are assured that rapid progress will soon bring self-driving electric cars, hypersonic airplanes, individually tailored cancer cures, and instant three-dimensional printing of hearts and kidneys. We are even told it will pave the world’s transition from fossil fuels to renewable energies.

But the doubling time for transistor density is no guide to technical progress generally. Modern life depends on many processes that improve rather slowly, not least the production of food and energy and the transportation of people and goods. There is no shortage of historical data to illustrate this reality, and I have calculated representative rates for the decades coinciding with the development of transistors (the first commercial application was in hearing aids in 1952) and microprocessors, as well as the rates for the entire 20th century, or even longer.

Corn, America’s leading crop, has seen its average yields rise by 2 percent a year since 1950. The efficiency with which steam turbogenerators convert thermal energy to electricity rose by about 1.5 percent annually during the 20th century; if you instead compare the steam turbogenerators of 1900 with the combined-cycle power plants of 2000 (which mate gas turbines to steam boilers), that annual rate increases to 1.8 percent. Advances in lighting have been more impressive than in any other sector of electricity conversion, but between 1881 and 2014 light efficacy (lumens per watt) rose by just 2.6 percent a year for indoor lights and by 3.1 percent for outdoor lighting (topped by the best low-pressure sodium lamps).

The speed of intercontinental travel rose from about 35 kilometers per hour for large ocean liners in 1900 to 885 km/h for the Boeing 707 in 1958, an average rise of 5.6 percent a year. But that speed has remained essentially constant ever since—the Boeing 787 cruises just a few percent faster than the 707. Between 1973 and 2014, the fuel-conversion efficiency of new U.S. passenger cars (even after excluding monstrous SUVs and pickups) rose at an annual rate of just 2.5 percent, from 13.5 to 37 miles per gallon (that’s from 17.4 liters per 100 kilometers to 6.4 L/100 km). And finally, the energy cost of steel (coke, natural gas, electricity), our civilization’s most essential metal, fell from about 50 gigajoules per metric ton to less than 20 between 1950 and 2010—an annual rate of about –1.7 percent....MORE
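Smil doesn't show the arithmetic, but the rates above are consistent with a continuous growth-rate calculation from the endpoint values. A minimal sketch, using the endpoints quoted in the excerpt (the steel figure assumes roughly 18 GJ/t as a stand-in for "less than 20"):

```python
import math

def annual_rate(v_start, v_end, years):
    """Continuous exponential growth rate implied by two endpoint values."""
    return math.log(v_end / v_start) / years

# Intercontinental travel speed, 1900-1958 (km/h)
print(f"{annual_rate(35, 885, 58):.1%}")      # ~5.6% a year

# New U.S. passenger-car fuel economy, 1973-2014 (miles per gallon)
print(f"{annual_rate(13.5, 37, 41):.1%}")     # ~2.5% a year

# Energy cost of steel, 1950-2010 (GJ per metric ton; 18 is an assumption)
print(f"{annual_rate(50, 18, 60):.1%}")       # ~ -1.7% a year
```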
All of which, having been properly absorbed via cosmosis -- knowledge gained across a semi-permeable cosmic membrane, or something -- led to last Wednesday's declaration by yours truly:
We've been tracking the progress of the science for a couple decades now and can report, from hard won experience, there ain't no Moore's Law for batteries....