San Francisco (CA) – Intel traditionally closes its Developer Forum with a visionary keynote that offers a glimpse into Intel’s labs and into how the company’s engineers believe technology could shape the world in five, ten or – in this case – 40 years.
The fact that the Intel Developer Forum turned just (or already, depending on your view) 11 years old today makes it a bit difficult to establish a track record of how well Intel’s engineers can predict the future of technology, but if my memory serves me right, we haven’t seen too many of the company’s ideas make it into production yet. In fact, most of them disappeared, and no one really noticed or cared.
One reason for that may be that many of these ideas are a bit wacky and would be, in a best-case scenario, a bit ahead of their time. For example, about ten years ago I was fascinated by a proposal to combine audio and visuals for almost perfect voice recognition – like HAL in 2001: A Space Odyssey, a PC camera would track your lips and correlate the images with audio data to improve the rate of correctly recognized words. I never heard about the idea again, but I believe that stream processors may revive it at some point in the future.
This year, Intel was not short of big ideas for the future either, throwing out some predictions I personally do not look forward to and others that are easy to imagine as technologies that could truly enhance our everyday lives. For example, Intel CTO Justin Rattner believes that the reasoning gap between machines and humans could close by the year 2050 and that “machines could even overtake humans in their ability to reason in the not so distant future.”
Rattner said that Intel's research labs are already looking at human-machine interfaces and examining future implications for computing, with “some promising changes coming much sooner than expected.” This vision seems to fit into a timeline that Intel began sketching out a few years ago with ideas such as user-aware computing. While the concept of user-aware computing – computers that better understand certain actions and intentions of a user – is a great approach, it appears rather controversial to pitch computers that are superior to humans in their reasoning abilities.
A technology that is much more within reach, however, is wireless power. Its concepts, possibilities and dangers have been discussed numerous times in the past, and there is certainly no shortage of research projects dedicated to the topic. Intel said it is working on wireless power as well, using principles developed by MIT scientists as a foundation. Rattner demonstrated a Wireless Resonant Energy Link (WREL) powering a 60-watt light bulb without a plug or wire of any kind – more power than a typical laptop needs.
WREL technology employs strongly coupled resonators, a principle similar to the way a trained singer can shatter a glass using her voice, Intel said. At the receiving resonator's natural frequency, energy is absorbed efficiently, just as a glass absorbs acoustic energy at its natural frequency. With this technology enabled in a laptop, for example, batteries could be recharged when the laptop gets within several feet of the transmit resonator, the company explained.
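The resonance effect Intel describes can be sketched with a bit of textbook physics: a resonator absorbs power most efficiently when driven at its natural frequency, and absorption falls off sharply as the drive frequency detunes. The Lorentzian model, frequencies and Q factor below are illustrative assumptions for this sketch, not figures from Intel's WREL demonstration.

```python
def absorbed_power_fraction(drive_freq_hz, natural_freq_hz, q_factor):
    """Fraction of peak power absorbed by a resonator driven off-resonance.

    Simple Lorentzian model: absorption peaks when the drive frequency
    matches the natural frequency, and drops faster for higher Q
    (i.e., more sharply tuned) resonators.
    """
    detuning = (drive_freq_hz - natural_freq_hz) / natural_freq_hz
    return 1.0 / (1.0 + (2.0 * q_factor * detuning) ** 2)

# On resonance, absorption is maximal ...
print(absorbed_power_fraction(10e6, 10e6, q_factor=100))   # 1.0
# ... while a mere 1% detuning at Q = 100 cuts it to a fifth
print(absorbed_power_fraction(10.1e6, 10e6, q_factor=100))  # ~0.2
```

This sharp frequency selectivity is what the singer-and-glass analogy captures: the glass only shatters when the note sits right on its natural frequency.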
Wireless power is far from being ready for prime time and “many engineering challenges remain,” Intel said. But the company hopes “to find a way to cut the last cord in mobile devices and someday enable wireless power in Intel-based platforms.”
A bit more realistic, at least for the 10 or 15 years ahead of us, is Intel’s claim that Moore’s Law, which predicts a doubling of the transistor count on an integrated circuit every 18-24 months, is alive and well “through the next decade and beyond.” Rattner again brought up the idea of a transition from planar to 3D (stacked) transistors (he first talked about this concept back in 2005) and the possibility of using compound semiconductors to replace silicon in the transistor channel. Looking further out, Intel said it is looking “into a variety of non-charge-based technologies that could one day replace CMOS altogether.”
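To put the "next decade" claim in perspective, the doubling rule is easy to run forward. The starting count of one billion transistors and the 24-month doubling period below are illustrative assumptions (the article quotes a range of 18-24 months), not Intel roadmap figures.

```python
def moores_law_count(base_count, years, doubling_period_years=2.0):
    """Projected transistor count after `years` under Moore's Law,
    doubling every `doubling_period_years` (here the 24-month end
    of the quoted 18-24 month range)."""
    return base_count * 2 ** (years / doubling_period_years)

# Roughly one billion transistors as a late-2000s starting point,
# projected one decade out: five doublings, a 32x increase.
print(f"{moores_law_count(1e9, 10):.1e}")  # 3.2e+10
```

At the faster 18-month pace the same decade yields about 100x instead of 32x, which is why the 18-24 month spread matters so much over long horizons.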