Intel is set to disclose some of its plans in nanotechnology, sure to be key to the company’s chips for decades to come. As reported by CNET’s News.com, Sunlin Chou, senior VP of technology and manufacturing, will discuss some of the plans next week at the Intel Developer Forum in San Jose. Topping the topics likely to be covered: Carbon nanotubes and multigate transistors. Nanotubes are strings of carbon atoms tightly bonded together that show promise in manufacturing everything from tennis rackets to electronics. In computer chips, they can theoretically be used to replace the wispy metal wires that now define a chip’s circuitry. That could make processors smaller and cheaper. Multigate transistors, meanwhile, are a way of addressing the conundrum faced by all chipmakers: The more powerful processors become, the more electricity must flow through them. But as chips shrink in size, the extremely small transistors that control this flow are growing overloaded, something like hooking up a fire hose to a Waterpik nozzle, as CNET puts it. One way around that is to give each transistor more than one gate, an approach that IBM is using in some of its products already. Although analysts say they doubt Intel will copy this entirely, the company likely has a similar approach up its sleeve.
Proving again that clever sloth trumps dull industriousness, a mischievous group of transistors at a British university has spontaneously converted itself into — of all things — a radio receiver.
No word yet on whether the transistors are next planning to materialize as headphones or a graphic equalizer. New Scientist reports that the re-creation of century-old technology occurred at the University of Sussex in Brighton during an experiment that was unusual in its own right. Researchers took transistors, added an evolutionary computer program and expected to end up with an oscillator: a repeating sine-wave signal.
Instead of forming their own waves, though, the transistors utilized a part on a nearby circuit board as an antenna and began receiving the oscillations from an adjacent computer. Somewhere out in the ether, Guglielmo Marconi ought to be proud. And slackers everywhere, too.
Apple in recent years has sought to close the megahertz gap with Intel and Advanced Micro Devices by selling high-end machines that come with two processors instead of one. Thus, a dual-processor 1.2GHz G4 can reasonably claim to compete with a 2.5GHz Intel box. But Intel reckons there’s more than one way to build a two-brained beast and is experimenting with putting two processor cores on a single piece of silicon, an approach that, CNET’s News.com says, will improve performance and reduce power consumption over the next decade. One advantage is the ability to better dissipate heat. When a series of calculations has Core A chugging along at full steam, its transistors can hit temperatures that start to degrade performance. When that happens, the chip can “hop” some of the work over to Core B for an overall performance improvement. Another approach is to have Core A and Core B specialize in different things, similarly spreading the number-crunching responsibility around.
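The core-hopping idea can be sketched as a toy scheduler that migrates work whenever the busy core crosses a thermal limit. Everything here (the core names, temperatures and heating rates) is made up for illustration; it is not Intel's actual design:

```python
# Toy model of thermal-aware "core hopping" on a dual-core chip.
# All temperatures, thresholds and rates are illustrative values only.

HOT_LIMIT = 85.0   # degrees C at which performance starts to degrade
HEAT_RATE = 8.0    # degrees the active core gains per unit of work
COOL_RATE = 5.0    # degrees the idle core sheds per unit of work

def run_workload(units, start_temp=40.0):
    """Run `units` of work, hopping between Core A and Core B
    whenever the active core overheats. Returns the hop count."""
    temps = {"A": start_temp, "B": start_temp}
    active = "A"
    hops = 0
    for _ in range(units):
        other = "B" if active == "A" else "A"
        # Hop only if the active core is hot AND the other core is cooler.
        if temps[active] >= HOT_LIMIT and temps[other] < temps[active]:
            active, other = other, active
            hops += 1
        temps[active] += HEAT_RATE               # working core heats up
        temps[other] = max(start_temp, temps[other] - COOL_RATE)  # idle core cools
    return hops

print(run_workload(100))  # several hops over 100 units of work
print(run_workload(0))    # 0 — no work, no hopping
```

The point of the sketch is the trade-off in the article: by bouncing the load before either core stays hot enough to degrade, total throughput stays higher than if one core ran flat out.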
Computer chips have been shrinking for years. But who stops to consider that that’s only been possible because the stuff on the chips, like circuits, transistors and memory, has shrunk too? To keep the trend going, Germany’s Infineon has joined Advanced Micro Devices and United Microelectronics Corp. to develop technology to produce the tiny structures needed inside chips. As the number of elements on a chip doubles approximately every year, “chipmakers are under pressure to develop new microelements to fit on (them),” Reuters reports. Currently, the size of the smallest element on a chip is 130nm. The three-way alliance will focus on developing 65nm and 45nm manufacturing processes.
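A rough sense of why those node sizes matter: the area each element occupies scales with the square of the feature size, so density scales with its inverse square. As a back-of-the-envelope estimate (ignoring real-world layout overheads):

```python
# Back-of-the-envelope density gain from shrinking feature size.
# Area per element ~ (feature size)^2, so density scales as 1/size^2.

def density_gain(old_nm, new_nm):
    """Relative number of elements that fit in the same chip area."""
    return (old_nm / new_nm) ** 2

print(density_gain(130, 65))            # 4.0  — 65nm packs ~4x the elements
print(round(density_gain(130, 45), 1))  # 8.3  — 45nm packs ~8x
```

Halving the feature size, from 130nm to 65nm, quadruples how much fits in the same silicon, which is why process shrinks, not just bigger dies, keep the doubling trend alive.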