I don't normally post on technical topics, and I am not a
professional programmer, though I do spend a fair bit of time writing
programs for artistic purposes. Professional programmers won't find
anything of technical interest here.
Recently I came across two articles by Herb Sutter, entitled "The
Free Lunch Is Over", from 2004
(http://www.gotw.ca/publications/concurrency-ddj.htm), and "Welcome
to the Jungle", from 2011
(http://herbsutter.com/welcome-to-the-jungle). Together they
chart fundamental changes in the way that computer hardware is
organised, and the effect that this is having on computer programs
and computer programmers. Sutter is a programming guru who
works for Microsoft, and he is particularly interested in changes to
programming techniques.
In "The Free Lunch Is Over", Sutter presciently pointed out that the
era of ever faster and more powerful computer processors was
ending. The free lunch was the continual increase in computer
processor speeds, sustained over a very long period (Sutter says
roughly 1975 to 2005, but 1975 is an approximate starting date for
desktop computers; for bigger computers it surely extends further
back). This meant that software developers didn't have to
worry too much about inefficient software; it might be a bit slow
today, but tomorrow's machines would run it fast enough.
Sutter's article, which first appeared in 2004, pointed out that
processor clock speed had started to level out. Since then,
there has been almost no increase in clock speed, which has
stagnated at something under 4 gigahertz; the obstacle is the amount
of heat generated in the small space of the chip. Sutter's
first era, then, is the free-lunch era of ever-increasing processor
speeds.
It is still possible to pack ever more transistors into a chip, so
since 2005 there has been a proliferation of multi-core chips, where
each "core" is equivalent to the whole processor of an earlier
machine. Today typical desktop machines have four cores, and even
phones and tablets are beginning to have two cores. Different
programs can run at the same time on different cores, but to really
exploit the hardware a single program has to spread its own work
across several cores simultaneously. This requires a big change on
the part of programmers, who need to acquire new tools and a new
mindset. Approaches to what is variously called parallel
programming, concurrency or multi-threading have been around for a
long time, but they have now suddenly become central.
Sutter's second era is "multi-core", that of machines with a
relatively small number of powerful cores. The first article
takes us to this point.
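To give a concrete flavour of what "spreading work across several
cores" looks like, here is a minimal sketch in C++11 (my own
illustration, not taken from Sutter's articles): it splits a simple
summation across however many hardware threads the machine reports.

#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    // Ask the runtime how many hardware threads (roughly, cores) are available.
    unsigned n = std::thread::hardware_concurrency();
    if (n == 0) n = 4;  // the runtime may report 0; fall back to a guess

    std::vector<double> data(10000000, 1.0);
    std::vector<double> partial(n, 0.0);
    std::vector<std::thread> workers;

    std::size_t chunk = data.size() / n;
    for (unsigned t = 0; t < n; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end = (t + 1 == n) ? data.size() : begin + chunk;
        // Each thread sums its own slice into its own slot: no shared state, no locks.
        workers.emplace_back([&, t, begin, end] {
            partial[t] = std::accumulate(data.begin() + begin,
                                         data.begin() + end, 0.0);
        });
    }
    for (auto& w : workers) w.join();  // wait for every core's share to finish

    std::cout << "sum = "
              << std::accumulate(partial.begin(), partial.end(), 0.0) << "\n";
}

The details matter less than the shift in mindset: the work has to be
divided up explicitly, each piece must avoid trampling on the others'
data, and the results have to be combined at the end. That,
multiplied across a whole program, is the new discipline Sutter is
talking about.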
In the second article, Sutter considers that the "multi-core" era is
already ending even before we have learnt to cope with it. The
third era is that of "hetero-core", the era of heterogeneous cores,
which according to Sutter started in 2011. As far as the
actual hardware is concerned, the third era arrived when powerful
graphics cards started to be fitted to home computers for computer
games. These graphics cards contain a large number (for
example 100) of very small specialised cores, originally only
capable of processing pixels for display. These small cores
have gradually become more general-purpose, and there has been
considerable interest in scientific computing circles in harnessing
their power for general-purpose computation, not just
graphics. This interest is now going mainstream, but it brings with
it yet more challenges for programmers: on top of the already
difficult challenge of adapting a program to make use of multiple
cores, different parts of the one program may now be running on
cores with very different capabilities.
Sutter has the "hetero-core" era ending some time in the 2020s
because he thinks that is when Moore's Law (that the number of
transistors on a chip doubles every two years) will finally
end. At that point our desktop and laptop and pocket computing
devices will have as much power as they are going to get.
Sutter thinks that by then another trend will already have taken
over: the availability of "hardware as a service", enormous clusters
of computers that anyone can use over the Internet for a fee. This
presents still another challenge for programmers: a program will run
partly on the 1,000 or more heterogeneous cores that the user's local
machine (desktop, laptop, tablet or phone) will by then contain, and
partly on a much bigger collection of cores available at the other
end of a wi-fi link. Sutter considers that building
larger and larger networks of computers will be, for the foreseeable
future, much easier than cramming more and more transistors into a
single chip or box, so growth in computing power will take place
less in individual machines and more in the availability of networks
of computers. As Sutter points out, Amazon and others already offer
large clusters of computers for hire; he gives the example of a
cluster with 30,000 virtual cores that was (virtually) put together
for a pharmaceutical company that hired it for one day, at a
cost of under $1500 per hour. The calculations would have
taken years on a desktop computer.
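A rough back-of-envelope check of that last claim (my own figures,
not Sutter's): a desktop of the time had perhaps 4 cores, so 30,000
virtual cores is roughly 7,500 desktops' worth; one day on the
cluster therefore corresponds to something like 7,500 days, or about
20 years, on a single desktop, even before allowing for any
differences in the individual cores.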
Interesting times!
Thursday, April 19, 2012