As a bit of a technophile it's a fairly constant pleasure to see what can be done with consumer electronics these days. Advances appear particularly dramatic in the field of computing, since most kinds of progress there are exponential. This does have the side effect of making the whole thing seem a little pointless, because if you buy your kit this year then next year's processor will be twice as fast and use less power to boot. And that's not all, not by a long shot.
The expansion slots in your motherboard double in bandwidth every two years, seven months. The connectors feeding your hard drive double in bandwidth every two years, five months. External connections like USB and their ilk are a little more erratic, but conservatively they double in bandwidth every two and a half years as well. I haven't done the sums, but it'd be smart money to bet that RAM and CPU bandwidth are following a similar trend. If we project these trends to 2020 (which, let's face it, is only nine years away), we can expect a colossal 2 Tbps over whatever graphics bus we're using then. Engineers are working to that goal now, you may depend on it. You can also expect a good 10 Gbps over the cable going to your external device, though, as today, whether you ever achieve the theoretical maximum may be a matter of "never". To put that into perspective, that's the kind of data rate the very latest graphics cards need to push those trillions of numbers around so the pretty lights happen on your screen in CoD.
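If you want to play with the projection yourself, the sums above amount to simple compound doubling. The doubling periods come from the figures quoted; the 2011 starting rate is my own rough assumption (a PCIe 3.0 x16 slot at about 128 Gbps), so treat the output as illustrative rather than gospel.

```python
def project(rate_gbps, doubling_years, years_ahead):
    """Extrapolate a bandwidth figure, assuming steady exponential growth."""
    return rate_gbps * 2 ** (years_ahead / doubling_years)

# Nine years out (2011 -> 2020), doubling every two years seven months.
# Starting point of 128 Gbps is an assumption, not a figure from this article.
gpu_bus_2020 = project(128, 2 + 7 / 12, 9)

print(f"Projected graphics bus in 2020: {gpu_bus_2020 / 1000:.1f} Tbps")
```

With that starting point the answer lands around 1.4 Tbps, which is at least the same order of magnitude as the 2 Tbps guess; pick a slightly more generous baseline and you get there exactly.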
It all seems rather excessive. And speaking of excess - yes, newer technology does take less power to do the same tasks as the silicon of yesteryear, but don't forget that our ability to write code has always been ahead of our ability to run that code. The best example to come from the game programmers is Crysis. When it was released in 2007 it had the singular distinction of being so catastrophically demanding at full settings that it was totally unplayable even if you had the best there was. My computer is three years younger than it and by any average person's standards it is a desktop supercomputer (the graphics cards can do about 4.1 TFLOPS), but wouldn't you know it, I can't turn everything up to maximum either, not if I want to actually kill those Koreans. And as a result of all this, the total power consumption at maximum load is also going up quite a lot. I've just seen a 1.5 kW power supply. That could comfortably supply four PCs like mine at full chat, or in all likelihood about seven of yours.
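The four-versus-seven claim is just division, but for the sake of showing my working: the per-machine wattages below are my own guesses at full-load draw (a high-end gaming rig versus an ordinary desktop), not measured figures.

```python
PSU_WATTS = 1500  # the 1.5 kW supply mentioned above

def machines_supported(psu_watts, draw_watts):
    """How many machines of a given full-load draw one supply could feed."""
    return psu_watts // draw_watts

gaming_rig = 350   # assumed full-load draw of a high-end rig (a guess)
typical_pc = 200   # assumed full-load draw of an ordinary desktop (a guess)

print(machines_supported(PSU_WATTS, gaming_rig))  # 4
print(machines_supported(PSU_WATTS, typical_pc))  # 7
```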
I do wonder, though, whether we will actually get to those kinds of speeds in consumer stuff. After decades of development we are reaching the point where the desktop computer can run photorealistic games at playable framerates. The trend is less towards more power from each chip than it is towards more chips. That's why the new AMD Phenom and the Intel i7 each have six processor cores. It's also why for some time now you could plug in two graphics cards and share the work between them.

Of course it is easy to imagine things that need ever more code and ever more processing as a result, but the returns are diminishing. In other words, there's not that much difference between "nearly photorealistic" and actually photorealistic, but there's a huge difference in terms of the processing required. If you don't believe me, consider that I can actually run Crysis quite nicely, but it took a 40,000-processor supercomputer some months to stitch together the complex physics and visuals of Avatar.
I guess the real question is whether gamers of the future will demand things like proper fluid dynamics and fully realistic destructible environments. Hardware tessellation is the showpiece of DX11, so perhaps the notoriously computationally intensive simulations like hair, fluids, cloth and refraction will be in the revisions to come. If they are, though, I really do hope we don't end up with something like an extremely furry Wookiee wearing Jedi clothes and swimming to Otoh Gunga. 'Cos if we do, we'll have gone backwards. Let's not fall into the trap of too eagerly computing things just because we can.