Ten years ago, we would have been blown away by a cell phone with far more computing power and memory than the average PC had in 1999, along with a built-in camera and programs to manage every aspect of our lives. Ten years from now, the iPhone and its ilk will be antiques.
Over the next decade, the evolution of computing and the Internet will produce faster, increasingly intelligent devices. More of our possessions will contain sensors and computers that log our activities, building digital dossiers that augment our memories, help us make decisions and tame information overload.
Such ideas may sound futuristic and excessive today. And technological predictions are notoriously off-base. Short-term forecasts tend to assume too much change, and long-term forecasts underestimate the possibility of sudden, major shifts.
Even so, this vision of interconnected devices that produce and filter massive amounts of data in the 2010s is a logical progression of the Web, computers and gadgetry that emerged in the 2000s. Moore's Law, the observation that the number of transistors on a chip, and with it computing power, doubles roughly every two years at no extra cost, still rules.
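The arithmetic behind that doubling is simple enough to sketch. The snippet below is illustrative only; the function name and two-year period are assumptions drawn from the common statement of the law, not a formal model.

```python
def moores_law_factor(years, doubling_period=2):
    """Projected growth in computing power after `years`,
    assuming one doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Five doublings over a decade means roughly 32 times the power.
print(moores_law_factor(10))  # 32.0
```

By this rough math, the gadgets of 2019 should pack about 32 times the punch of their 2009 counterparts.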
Recall the personal computer, circa 2000. It likely had a "clock speed," a rough measure of how fast it could execute instructions, about one-sixth that of many computers today.
In the next decade, as conjured by Forrester Research analyst James McQuivey, all that information will be available instantaneously, anywhere. He imagines spotting an acquaintance at a conference and having at his fingertips links to the person's most recent research, plus a reminder of her husband's name.
Software will remember everything McQuivey buys, reads online and watches on TV. A "smart filter" will use his past choices to suggest the next book or show, or even what he should eat for dinner. It's a more powerful version of the way Amazon.com and Netflix make book or movie recommendations.
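The core idea behind such a filter can be sketched in a few lines: score unseen items by how much they resemble what the user already chose. This is a toy content-overlap recommender with made-up titles and tags, not the actual method Amazon.com or Netflix uses.

```python
from collections import Counter

def recommend(history, catalog, top_n=3):
    """Rank unseen catalog items by how many tags they share
    with items the user has already chosen (hypothetical data)."""
    liked_tags = Counter(tag for item in history for tag in item["tags"])
    seen = {item["title"] for item in history}
    scored = [
        (sum(liked_tags[t] for t in item["tags"]), item["title"])
        for item in catalog if item["title"] not in seen
    ]
    # Highest overlap with past choices comes first.
    return [title for score, title in sorted(scored, reverse=True)[:top_n]]

history = [{"title": "Cosmos", "tags": ["science", "space"]}]
catalog = [
    {"title": "Contact", "tags": ["science", "drama"]},
    {"title": "Beach Romance", "tags": ["romance"]},
]
print(recommend(history, catalog))  # ['Contact', 'Beach Romance']
```

Real systems layer on ratings, purchase data and the behavior of millions of other users, but the principle is the same: past choices predict future ones.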
He also thinks we'll all use this technology just to keep up with everyone else. He likens the situation to calculators in math class. At first teachers banned them, but now they're required. Leaving yours at home on test day would be a big disadvantage.
Craig Mundie, Microsoft Corp.'s chief research and strategy officer, believes we are near a long-wished-for era of computers that respond to speech, gestures and handwriting.
In Mundie's vision, "digital assistant" programs will help us solve specific problems. Imagine you're moving to a new city and need to find a house. "Relocation assistant" software would listen as you brainstorm out loud about whether you want to drive to work or take the bus, about school preferences and the market value of your current house. As you converse with it, the program scouts real estate listings and plots the best on a map.
Our smaller devices will also benefit from speedy connections to "the cloud," powerful networks of computers that perform services remotely. Manny Vara, chief evangelist for Intel Labs, imagines that in a decade he'll tap the power of the cloud on trips to foreign countries, speaking into his phone and seeing a translation on his screen within seconds.
In another scenario, Vara imagines we will each wear a tiny camera. It could snap a photo of the cutie next to you in the bar and send it up into the cloud for analysis. If it matches your friend's nasty ex, a voice could whisper into your earpiece that it's time to move on. Your portable devices don't have to be powerful enough to run facial recognition software; they just need a connection to the cloud.
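The division of labor Vara describes can be sketched as a thin-client pattern: the device only captures and uploads, while all heavy computation runs in the cloud. Everything below is a stand-in; the `hash` call is a placeholder for a real face-embedding service, and the names are invented for illustration.

```python
def cloud_face_match(photo_bytes, known_faces):
    """Stand-in for a compute-heavy cloud service that compares
    an uploaded photo against a database of known faces."""
    signature = hash(photo_bytes)  # placeholder for a real face embedding
    return known_faces.get(signature)

def device_alert(photo_bytes, known_faces):
    """The phone's only jobs: send the photo up, whisper the answer back."""
    match = cloud_face_match(photo_bytes, known_faces)
    return f"Heads up: that's {match}." if match else None

# Hypothetical database built from one previously seen photo.
known = {hash(b"ex-photo"): "your friend's ex"}
print(device_alert(b"ex-photo", known))
```

The point of the sketch is architectural: the phone holds no recognition software and no database, just a camera and a network connection.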