TED -- or Technology, Entertainment and Design -- is a yearly conference held in Monterey, Calif., that brings together more than 1,000 thought leaders in the computing field. Its mantra is to act as a sort of think tank, sharing ideas that can help advance the field.
What comes out of it is truly extraordinary. The first TED, held in 1984, helped unveil the first Macintosh computer and the Sony compact disc, as well as the idea of AI, or artificial intelligence. Look how far that's come!
Since its creation, TED has attracted a growing and influential audience from many different disciplines, including mathematics, business, science and the arts. What unites them is curiosity and out-of-the-box thinking. Everything they do is for the greater good of the community; in fact, they have never spent any money to advertise or promote their cause.
The speakers who attend the conference come from all walks of life. Past years have seen people like Microsoft maven Bill Gates; world-renowned primatologist Jane Goodall; musician Herbie Hancock; and even Li Lu, a key organizer of the Tiananmen Square student protest.
This year, the most interesting ideas related to personal computing came from Jeff Han, a research scientist for New York University's department of computer science. He and his staff have been working on the advancement of multitouch sensing, similar to the touch-screen kiosks you see at airports or movie theaters.
This new type of computing interface builds on the premise of single-finger touch screens, but takes it to a whole new level. Touch-screen systems today allow you to use only one finger to press a button, type a word or click a point on the screen. Han's new system, which came out of the laboratory just days before the TED conference, allows users to touch and maneuver multiple points on the screen with all 10 fingers simultaneously.
It'll let you use even more if a second person is using the system at the same time.
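The key bookkeeping behind a system like this is tracking each finger as an independent contact with its own identifier, so ten fingers -- or two users' twenty -- never get confused with one another. Here is a minimal sketch of that idea; the class and method names are illustrative assumptions, not Han's actual code.

```python
# Illustrative sketch: track each finger contact by a unique id,
# so many simultaneous touches can be followed independently.

class TouchTracker:
    def __init__(self):
        self.points = {}  # touch id -> (x, y) screen position

    def touch_down(self, touch_id, x, y):
        # A new finger makes contact with the screen.
        self.points[touch_id] = (x, y)

    def touch_move(self, touch_id, x, y):
        # An existing finger slides to a new position.
        if touch_id in self.points:
            self.points[touch_id] = (x, y)

    def touch_up(self, touch_id):
        # The finger lifts off; forget its contact point.
        self.points.pop(touch_id, None)

    def active_count(self):
        return len(self.points)

tracker = TouchTracker()
tracker.touch_down(1, 100, 200)  # first finger
tracker.touch_down(2, 300, 220)  # second finger, tracked separately
tracker.touch_up(1)              # first finger lifts; second remains
```

Because every contact carries its own id, adding an eleventh touch from a second user is no different from adding a second finger from the first.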
This high-resolution, low-cost multitouch sensing system is displayed on a drafting-table-sized screen, and can run all sorts of applications that are controlled by a user's fingertips. The highly intuitive software applications, like a three-dimensional mapping program developed by NASA, allow the user to rotate, zoom in or zoom out on a point on a map just by moving their fingers across the screen -- squeeze your pointer finger and thumb together on the screen and you zoom out; spread them apart and you zoom in. Move your pointer finger around in a circle, and you can rotate the entire canvas.
The screen is also pressure sensitive. Han's team created a very basic lava-lamp application, in which a lava-like blob slowly drifts around the screen. Press your finger on the screen and you can heat up the lava and pull it apart like play dough.
Lift your finger from that same point and the lava instantly cools, retains its shape and continues to float around the screen.
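One way such a pressure response could work is to treat finger pressure as a heat source and let the blob cool back toward the room's ambient temperature once contact ends. The constants and update rule below are assumptions made purely for illustration, not details of the NYU application.

```python
# Illustrative sketch (assumed, not the NYU code): finger pressure
# heats the lava blob; with no contact it cools toward ambient,
# "freezing" whatever shape it was left in.

AMBIENT = 20.0    # assumed ambient temperature
HEAT_RATE = 5.0   # assumed degrees added per unit pressure per step
COOL_RATE = 0.5   # assumed fraction of excess heat lost per step

def step_temperature(temp, pressure):
    """Advance the blob's temperature one simulation step."""
    if pressure > 0:
        # A pressing finger pumps heat in, proportional to pressure.
        return temp + HEAT_RATE * pressure
    # No contact: decay exponentially back toward ambient.
    return AMBIENT + (temp - AMBIENT) * (1 - COOL_RATE)
```

A renderer could then map temperature to softness, so hot lava stretches apart under the fingers while cooled lava holds its shape, matching the behavior described above.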
His idea of multitouch sensing is very scalable, and can be applied to almost any computer application. In the future, users will be able to use their fingers to operate a computer in the same way that they use a keyboard and mouse today.
Even though this program is still in its infancy, you can expect multitouch interaction research to take a giant leap forward in the years to come.
For more information and a great demonstration of how it all works, log on to video.google.com/ted.html.