Gordon Moore's 45-year-old prediction about processor development has proven to

be amazingly accurate. Mike Bedford investigates the science behind our next

generation of CPUs.

In 1965, when state-of-the-art chips contained a mere 50 transistors, Intel co-founder Gordon Moore observed that the number of transistors on an integrated

circuit had doubled every year. He then predicted that the trend would continue

for another 10 years. What an understatement this turned out to be. Moore's Law,

as his prediction came to be known, is still on track after 45 years.

Moore tweaked the theory after the initial 10 years was up, speculating that the

transistor count per chip would continue to double every two years. Today,

Intel's latest eight-core processor has a massive 2.3 billion transistors, which

isn't far short of the figure that Moore's Law predicted all those years ago.
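As a rough sanity check on that figure, we can run the doubling arithmetic ourselves. This is purely illustrative Python, starting from the 4004's 2,300 transistors of 1971 and Moore's revised two-year doubling period:

```python
# Back-of-the-envelope Moore's Law projection (illustrative only).
def moores_law_count(start_count, start_year, year, doubling_years=2):
    """Transistor count projected from a doubling every `doubling_years`."""
    return start_count * 2 ** ((year - start_year) / doubling_years)

# From the 4004's 2,300 transistors in 1971 to 2010:
projected = moores_law_count(2_300, 1971, 2010)
print(f"{projected:,.0f}")  # about 1.7 billion -- not far short of 2.3 billion
```

Nineteen and a half doublings take 2,300 transistors to roughly 1.7 billion, which is indeed in the same ballpark as today's chips.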

However, according to some technology analysts, Moore's Law might be approaching

the end of the road. Since the number of transistors in a processor has a major

bearing on its performance, this may just mean that the exponential speed

improvements that we've come to expect could soon be a thing of the past too.

Fortunately, in research establishments across the globe, scientists are intent

on ensuring that this prophecy doesn't come to pass.

So what's the problem?

In this article we're going to investigate some of the pioneering technologies

that could shape processors of the future. Before we start, we need to

understand why some pundits think that Moore's Law will be doomed without a

radical new approach to microprocessor technology. After all, it's not the first

time that such a prediction has been made. On all previous occasions,

semiconductor manufacturers found ways to combat looming disaster.

Cramming ever more transistors on to a chip without making those transistors

smaller would soon result in some seriously large chips. As an extreme example,

let's take Intel's 4004, the world's first microprocessor, which made its debut

in 1971.

It had 2,300 transistors and its die (that's the actual piece of silicon)

measured about 3x4mm. If transistors had not shrunk in the intervening period,

today's latest and greatest processors would measure 3x4m. Their clock speed

would also be measured in kilohertz rather than gigahertz, but that's another

story.
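The scale of that thought experiment is easy to verify: a million-fold jump in transistor count means a million-fold jump in area if the transistors themselves don't shrink, and each side of the die grows with the square root of the area. A quick illustrative check:

```python
# Illustrative check of the die-size thought experiment.
count_4004 = 2_300            # transistors in Intel's 4004 (1971)
count_today = 2_300_000_000   # transistors in a current eight-core chip

area_factor = count_today / count_4004  # 1,000,000x the area at fixed transistor size
side_factor = area_factor ** 0.5        # each side grows by the square root
print(side_factor)  # 1000.0 -> the 3x4mm die of 1971 becomes 3x4m
```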

It goes without saying, then, that continual shrinking of a processor's minimum feature size is an integral part of keeping Moore's Law on

track, at least for the foreseeable future. To date, the roadblocks that have

had to be overcome have mostly been technological ones.

Semiconductor manufacturing involves photographic processes where the pattern of

the chip's microscopic features is projected on to photo-sensitive layers on the

surface of the chip. The size at which an optical image can be focused depends

on the wavelength of the light. Initially, white light was used, but this soon

gave way to monochromatic light.

Today, deep ultraviolet (DUV) is used, but even this technique is nearing the

end of its usefulness. The roadmap requires the perfection of extreme

ultraviolet (EUV) lithography within the next few years. It would be wrong to belittle the developments to date, but the shift to EUV comes with phenomenal

challenges, not least of which is the fact that the air absorbs the rays at this

wavelength.

To achieve even greater miniaturisation would require a transition to X-rays,

which opens up a whole new set of problems. The fact that there's no known way

of focusing X-rays hints at the difficulties ahead. Next we have the matter of heat generation and how to disperse it. Unless it's removed, the chip will fry.

This has already resulted in one trend coming to an end: that of increasing clock speeds, which hit a plateau of just over 3GHz several years back. This is one

problem that will return to haunt us as feature sizes continue to fall. Due to

clever techniques such as selectively switching off or slowing down parts of a

processor that are not being used, power consumption (and thus heat generation)

is slightly down from its peak of a few years ago, despite continued increases

in performance.

A chip's thermal density (the amount of heat generated per unit area) is already

similar to that of an electric hotplate. It's not hard to imagine the challenges

of removing the excess heat as chips continue to shrink.

Then we have the stranger laws of physics with which to contend. In short,

normal electronic behaviour depends on the flow of lots of electrons, but once

we get to atomic dimensions we are concerned with a small number of particles.

We're now into the realm of quantum effects where a single electron is defined

in terms of its probability of behaving in a particular way. There is, for

example, a probability that a single electron can pass straight through a thin

insulating barrier, so insulators could become conductors. There's even the

tantalising possibility of an electron being in two places at once.

It doesn't take a PhD in quantum physics to appreciate the impact all of this

could have on the workings of electronic circuits. It genuinely does appear that

there are some formidable challenges on the horizon.

Future technologies

There are a variety of technologies, currently in the research labs, that might

provide a solution if and when today's technology gets to the point where it

can't be stretched out any further. Some of these can be considered evolutionary

whereas others are most definitely revolutionary. Few, if any, will solve all

the problems that lie ahead, but these initiatives offer us hope that processor

development has a long and healthy future.

The major benefit of making transistors smaller is being able to cram more of

them on to a silicon chip without it becoming huge. However, this ignores the fact that today's semiconductors are flat. By moving from two to three dimensions, this hurdle could be overcome. Engineering two layers of transistors on a single chip would immediately double the transistor count while barely increasing the thickness of the chips. Research into 3D chips is progressing

apace, and there are more benefits on offer than just increasing the packing

density.

IBM researchers are developing techniques for stacking memory such as cache on

top of processors and, ultimately, for stacking processor cores on top of each

other. When compared with mounting all these elements side by side, as they are

on today's chips, this technique drastically shortens the distance that signals

need to travel.

Pathways can be a thousand times shorter than on ordinary 2D chips and a hundred

times more data channels can be accommodated, and both these factors will have a

dramatic effect on performance.

All this has its drawbacks, though. Heat removal from a 3D chip is far more

difficult than from the surface of a 2D chip. As much as a kilowatt of heat

could be generated in a volume of just half a cubic centimetre, which, according

to IBM, is a heat density 10 times greater than that of any other human-made device, including a nuclear

reactor.

Scientists at IBM's Research Lab in Zurich, in collaboration with the Fraunhofer

Institute in Berlin, have turned to water cooling to get around this problem.

Heat is removed by piping water through cooling structures as thin as a human

hair.

However, according to IBM, this doesn't even hint at the complexity involved:

"The complexity of such a system resembles that of a human brain, wherein

millions of nerves and neurons for signal transmissions are intermixed, but do

not interfere with tens of thousands of blood vessels for cooling and energy

supply, all within the same volume.

The fabrication of the individual layers was accomplished with existing 3D

packaging fabrication methods to etch or drill the holes for signal transmission

from one layer to the next. To insulate these 'nerves', scientists left a silicon

wall around each interconnect (also called through-silicon vias) and added a

fine layer of silicon oxide to insulate the electrical interconnects from the

water."

Impressive as IBM's breakthrough into the third dimension is, the essential

building block of all integrated circuits, namely the transistor, is still

constrained in two dimensions. However, recent developments by Japan's Unisantis Electronics Ltd and the Institute of Microelectronics in Singapore are set to

bring us the world's first three-dimensional transistor, known as a surround

gate transistor or SGT.

According to Professor Fujio Masuoka, Unisantis's chief technology officer and long-time researcher into SGT technology, the transistor has its three constituent parts (the source, gate and drain) arranged vertically rather than being flat.

This arrangement greatly reduces the distance that electrons must travel within

the SGT, which means less heat is generated. That also means it's possible to

ramp up the clock speed without risking overheating.

A tenfold speed increase has been suggested, giving us the tantalising prospect

of processors with a clock frequency of between 20GHz and 50GHz. "The SGT allows

further improvements in silicon-based semiconductors, in terms of transistor

size and processing speed, for at least 30 more years before the theoretical

limits are reached," claimed Masuoka.

Quantum leap

We've seen that things get tricky when the dimensions reduce to the point where

we're dealing with very few electrons. However, one current strand of research

concerns the single-electron transistor (or SET). Here, developers have exploited quantum mechanical tunnelling, the effect that can cause

electrons to pass through an insulating barrier.

To fully understand the workings of an SET, you'd have to be well-versed in quantum physics, but its general principles are easy to grasp. Like an ordinary MOSFET (metal-oxide-semiconductor field-effect transistor), an SET uses the voltage on its gate to control the current between the source and the drain.

In an SET, the same flow of current takes place via a tiny structure called a quantum dot, which is insulated from both the source and the drain. Electrons

can migrate, one at a time, through the insulating barrier by quantum mechanical

tunnelling from the source to the quantum dot, controlled by the voltage on the

gate. From here, again by tunnelling, electrons can pass to the drain, thereby

causing the flow of current.

We shouldn't lose sight of the fact that the flow of an electrical current takes

place through two insulating barriers. In high-voltage electronics, insulators

can break down, which means that the high voltage punches a hole through the

barrier. That's not what's happening here, though. First, we're not dealing with

high enough voltages, and second, the insulating barrier is still intact at the

end of the process.

Quantum mechanical tunnelling has been described as similar to throwing a rubber

ball against a wall. Most of the time it just bounces back, but there's a

possibility that it could appear at the other side of the wall, without having

made a hole through it.

The initial work on SETs was carried out using silicon, but despite the academic

interest they seemed to be a flash in the pan. However, recent work at

Manchester University might change all that. This research involves a totally new form of carbon called graphene, a layer of carbon atoms just one atom thick. We spoke to its co-discoverer, Dr Kostya Novoselov of the University's Condensed Matter Physics Group in the School of Physics & Astronomy, about what graphene offers the SET.

"So far, silicon SETs haven't worked at room temperature," he told us, referring to the fact that early SETs had to operate at liquid helium temperatures. "Graphene SETs operate at room temperature, which means that we can make them very small." Previous attempts at elevating the operating temperature of SETs

involved making them large, which defeats the object.

If switching a transistor involves moving single electrons instead of a whole

stream of them, the power consumed by an SET is far less than that of an ordinary transistor, which is why they're often referred to as ultra-low-power devices. Using graphene in circuit-board design could reduce power requirements

even further.

"The SET is only one possible route for graphene electronic devices; there are other options and it's not yet clear if SETs do represent the future," explains Dr Novoselov. "A lot of the power consumed by processors is not from transistors

themselves, but from the interconnects. Graphene can help here, whatever type of

transistor is used, because it has a much lower resistance than silicon or any

other material used for this purpose."

Manchester University isn't the only place where researchers are considering a

world beyond silicon. IBM is researching graphene applications, too, and has

demonstrated a transistor with a more conventional architecture that can run at

100GHz. Other scientists are probing the potential of another recently

discovered form of carbon, the carbon nanotube, which is basically a sheet of

graphene rolled into a cylinder.

However, while individual carbon nanotube transistors have been shown to work,

aligning them into workable circuits has proved a nightmare. Even so, as an

example of the innovative techniques that could make up a future beyond silicon,

researchers at the California Institute of Technology (Caltech) have used

strands of DNA as the scaffolding on to which carbon nanotubes assemble

themselves into electronic circuits.

Compound materials

Carbon derivatives, with their inherent difficulties, aren't the only

alternative to silicon for making semiconductors. Some of the other

possibilities are more compatible with today's manufacturing processes.

So-called compound semiconductors are made from a combination of elements from

groups III and V of the periodic table, a range of elements including boron,

aluminium, gallium, indium, nitrogen, phosphorus, arsenic, antimony and bismuth.

According to Paolo Gargini, Intel's director of technology strategy, electrons

can pass through indium antimonide 50 times faster than through silicon. As a

result, devices can be both faster and consume less power. Earlier this year, at the Industry Strategy Symposium Europe, Gargini suggested that by 2015 compound

semiconductors could deliver a transistor with three times the performance of

silicon at the same power consumption, or silicon-level performance at a tenth

of the power consumption.

Inevitably, building circuits out of compound semiconductors also has its

challenges. Whereas silicon can be made into wafers 300mm in diameter, compound

semiconductor wafers can't be nearly as large. This might be a fundamental

limitation rather than one imposed by today's technology.

Intel is experimenting with depositing islands of compound semiconductor on a silicon wafer, after which chips can be created using much the same photolithography techniques that were developed for silicon.

So we've seen 3D circuits and transistors that operate on single electrons, and

we've discovered an array of materials that could be used to build those

circuits. Even so, the story so far has been the familiar one of electronic

circuits that use an electrical charge to represent binary ones and zeros.

Other scientists are experimenting with technologies that are as different from

those used in today's computers as a modern PC is from a slide rule. Now for the

science bit.

Electron spin

Given that quantum effects might be responsible for preventing miniaturisation

of conventional transistors beyond a certain point, it would be fitting if they

also offered an alternative. We're talking about something far more bizarre than

the single-electron transistor and quantum mechanical tunnelling.

As an example of this weird quantum behaviour, think about the electron.

Electrons have a property called spin, which can be in one of two states that

are known as up and down. By giving the electron sufficient energy from a laser

beam, the spin can be flipped from up to down, or vice versa.

However, if you limit the amount of energy so that there's only a 50:50 chance

of the spin flipping, something extraordinary happens. In the realm of

sub-atomic particles, quantum theory suggests that whenever there's a

probability of two different outcomes, both outcomes happen at the same time. In

the case of our electron, therefore, the spin becomes both up and down

simultaneously.

However, this is something you'll never observe, because as soon as you try to

determine the electron's spin, it will actually adopt one of its two possible

states.

If the spin of an electron is used to represent a binary one or zero, an

electron in a state of superposition (as the strange 'both spins at once' state

is called) represents a zero and a one simultaneously.

A bit with this property is called a quantum bit, or qubit. Eight qubits would

make up a quantum byte, which could represent all the possible values of eight

bits (0-255) at once. As the number of qubits increases, so does the number of values that can be represented at the same time. A 64-qubit register, for example,

could hold all the numbers from 0 to 18.4 billion billion at once.
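The qubit arithmetic here is simply powers of two. A minimal sketch (this counts only how many values a register in superposition spans, and says nothing about the far harder problem of reading them back out):

```python
# How many values an n-qubit register in superposition spans: 2**n.
def values_spanned(n_qubits):
    return 2 ** n_qubits

print(values_spanned(8))   # 256 -> a "quantum byte" covers 0-255
print(values_spanned(64))  # 18446744073709551616, about 18.4 billion billion
```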

What's more, if you build a processor to perform operations on that register,

the operation would be carried out on all those values simultaneously. A quantum

computer is, therefore, a massively parallel computer. Each time an additional qubit is added, it's like doubling the number of cores in a regular CPU.

Perhaps the major difficulty with quantum computing is that the process of

observing a qubit also destroys its state of superposition. The more qubits you use, the more likely this becomes, so superposition doesn't last long. Still, research groups have reported working quantum computers with up to a dozen qubits.

Undeterred, scientists are turning to a weird and wonderful mixture of

technologies to bring quantum computers to fruition. Some of these involve ions

trapped in a magnetic field, diamonds illuminated by lasers, or even test tubes

full of a fluorocarbon compound.

Many pundits reckon quantum computers will end up as silicon chips. Even if the

future of Moore's Law turns out to be quantum-powered, it looks like some things

never change.

Light fantastic

From the abacus to the crank-handle adding machine, mechanical computing aids

have ruled the roost for over 4,000 years. Then, in the 1940s, electronic

computers made their appearance. In just 70 years they've transformed virtually

every part of our lives.

Given such a short lifespan when compared with its mechanical predecessor, it

seems highly unlikely that the electronic computer is close to being replaced

any time soon. However, it isn't the only option for the future of computation.

Here, and in the other two boxes, we look at three wildcards that might just

have a part to play in the computers of tomorrow.

Whenever alternatives to electronic computers are discussed, attention

invariably turns to optical computers. After all, beams of light travel a lot

faster than electrons. Even so, and despite numerous research projects over the

years, progress in producing an optical computer has been slow.

But an announcement by Swiss research establishment ETH Zurich may just be a

sign that things are changing. The breakthrough involved creating an optical

transistor based on a single molecule that is manipulated using laser beams.

It's not going to take the world by storm any time soon, though. "Comparing the

state of this technology with that of electronics, we are closer to the vacuum

tube amplifiers that were around in the 1950s than we are to today's integrated

circuits," said Professor Vahid Sandoghdar of the Laboratory of Physical

Chemistry.

Computation in a test tube

In the 1990s, scientists proved that DNA, the double helix molecule that is the

blueprint for life, can be used to perform calculations. This is possible

because the sequence of molecular groupings along the double helix can represent

data, which can then be processed using chemical reactions. The strength of the

technique relied on the fact that just a few grams of DNA contain an enormous number of molecules, so massively parallel computations are possible.

The DNA computer solved the notorious travelling salesman problem, whereby it

had to calculate the shortest route between Germany's 15 largest cities. This

computer looked like a chemistry experiment, though: multicoloured test tubes and

all.

Fast-forward to the 21st century and William Grover at Berkeley has automated the process: "We've developed an integrated circuit for DNA computing, a device like a microprocessor in a conventional computer, but for moving around and operating upon tiny amounts of DNA instead of electronic ones and zeros. Like a microprocessor, our microfluidic DNA computer is programmable, a single device

that can perform many computations."

The mechanical alternative

It might sound like a return to the Dark Ages but scientists at the University

of Wisconsin-Madison have turned their attention to developing mechanical

computers. The work was inspired by historical machines such as Charles

Babbage's Difference Engine, which was designed between 1847 and 1849.

However, unlike Babbage's machine, this new device won't be three metres tall

and weigh 2 1/2 tonnes. Just like electronic circuits, mechanical devices work

faster as they get smaller, so the Wisconsin-Madison model is being designed as

a nano-scale device.

It will depend purely on moving parts to create the switches, logic gates and

memory units that we find in electronic computers. These mechanical processors

will use far less power than their electronic counterparts, so overheating wouldn't be as much of a problem. Apparently these nano-mechanical computers could operate at a temperature of 500°C.

According to project leader Professor Robert Blick, a key consideration has been

to develop devices that can be fabricated using the same methods of

photo-lithography used in manufacturing integrated circuits.

