I've posted about remote access lots of times over the past dozen years or so, because it absolutely fascinates me. I'm talking about technologies like Microsoft's Remote Desktop, VNC, remote X sessions and the like, but using a tablet or mobile device as the client. And every time I see it in action, I think to myself that this is it, now is the time it will take off and thin clients will become common, only to be surprised when it doesn't happen. Then I watch as, a year or so later, the concept is born yet again in another context, but with the same result.
There's always been a push and pull between thick and thin clients throughout the history of computing - from the days of shared mainframe access through to the failed Network Computer push of the '90s. But where it got really interesting to me was back in 2002, when Microsoft showed off its Mira Smart Display tablets. The tablet itself was a relatively dumb touchscreen display, with all the processing kept in the PC it was connected to. The system was fundamentally flawed in many ways, but Mira introduced several interesting ideas: first, it was an actual portable tablet device, years before the iPad; second, it relied on a more powerful computer to do the heavy lifting; and third, that more powerful computer was a local desktop PC, not some remote shared system. These are fantastic ideas - the client can be simple and portable, yet have all the processing power, storage and networking speed of a desktop PC. Tablets, of course, have since taken off like crazy - but that remote access stuff? It just never seemed to catch hold, despite being functional, usable and useful for literally a decade.
I remember working at Yahoo! and having an external meeting with Nokia when they were first launching the Nokia 770. They had a few demos showing off Yahoo! service integration - one of which only ran on a Windows PC, but they used a remote access client on the 770 so they could show what it would look like. I was absolutely fascinated by this. I had only used VNC, which has a lot of lag and artifacts, yet the demo was seamless and had sound! This was because they were using Windows Remote Desktop (aka Terminal Services), a far superior technology that transmits display data more efficiently, letting clients render windows and the pointer with virtually no lag (especially compared to VNC), and even includes remote sound. I remember actually asking if it was some sort of custom VNC version, and being told with a smile that it was something similar - it wasn't until I got a 770 for myself that I realized what they were doing. I took a break from blogging around that time, so I didn't get to post about the experience for a year or so, but here's a post from 2007 in which I wax lyrical about how amazing remote desktop is and how I was convinced it was mobile's killer app. Whoops. Obviously not, but back then the technology was even more impressive than it is today because of how underpowered mobile devices were - it seemed like a no-brainer.
OK, so fast forward a few years and remote access pops up once again, this time in the context of video games with OnLive's server-powered gaming service. I was so fascinated that I bought the console simply to see if it actually worked as advertised - and it did! They even had tablet and mobile versions of their client, though unfortunately Apple never approved the iOS version (which was probably their undoing in the end). Later, I hoped OnLive had a hit on their hands when they announced their Desktop product for the iPad, which essentially gave you a full Windows desktop in the cloud (on the same game servers that sat under-utilized during the day). Again, it didn't take off (though I can understand why, as Windows 7 is definitely awkward to use in many ways from an iPad). Still, OnLive proved conclusively that a system where you utilize a powerful computer in the cloud to power a portable thin client was both possible and practical, even over the open Internet.
There are, of course, fundamental limits to OnLive's system, based on the fact that light only goes so fast and the Internet is only so reliable. Even though OnLive's system worked well, there could be perceptible lag in a lot of games. This led many to believe that a remote access gaming client would never be responsive enough for "real games", even without the Internet in the way. That idea was pretty much cast aside last year when Nintendo launched their Wii U gaming system with its integrated GamePad tablet controller, which talks to the console using a custom, low-latency version of the 802.11n WiFi spec. The result is absolutely amazing - the screen is big, beautiful and responsive, and the games react instantly: it feels like you have an incredibly powerful gaming system directly in your hands - like a Nintendo DS on steroids. Unless you happen to stray too far from the console, it really feels like all the power and functionality is contained right on the tablet itself.
Since then, Nvidia launched Shield, their Android-powered portable gaming console, which can stream games in a similar manner, but from PCs that use Nvidia's high-end GPUs. Like the Wii U, it relies on a high-bandwidth WiFi connection to reduce latency (directly to the console for the Wii U, and via specially approved streaming routers for the Shield). What's interesting about both solutions is that, like Microsoft's 2002 Mira tablet concept, these devices don't access centralized servers in the cloud, but rely on the power of a device you have locally, on the assumption that there are other places around your house where you'd like to play your games rather than being stuck directly in front of the box that happens to be generating the graphics for you. As long as you're within range, you have all the power of a console or PC, but contained in a small portable screen.
But that's the rub, of course: you have to be at home (and within WiFi range) for this to work - what if you could take the GamePad or Shield with you and somehow remotely access your games?
Well, that's exactly what Sony has enabled with their PS Vita portable game console and its integrated PS3/PS4 Remote Play functionality. Back when OnLive was launching their consumer product, a competitor named Gaikai was creating a similar system, but as a white-label game streaming service that could be used by cable companies, tablet manufacturers, etc. Sony bought them in 2012 and announced they were integrating the technology into the PlayStation consoles. I had read about this and thought it was only about providing backwards compatibility for the PS4 (which can't play PS3 games), but it turns out that when Sony said they were integrating Gaikai technology into the various PlayStation consoles, they meant *into* the consoles! The PS4 can stream games locally to the PS Vita directly (like a Wii U), *and* it can stream remotely as well, as long as the right ports are open to your PS4. This is really, really, really cool!
I had no idea about this until just a few days ago - but it appears to have given a whole new life to the PS Vita, which has been mostly ignored, if not outright forgotten, by most gamers and developers since its launch last year. Thanks to the PS4 and Remote Play, the PS Vita is now a hot item - and as more people find out what it can do, I bet it becomes even hotter. I purchased a used one from GameStop the other day, and the guy behind the counter called it his favorite thing about the new generation of consoles - a key differentiator from the Xbox One specifically - and the cause of many lost hours as he plays his favorite console games wherever he happens to be (which, I gathered, meant at work).
But it's not just a great feature for gamers. If you think about it, Sony has successfully transitioned their PlayStation consoles from entertainment consoles to entertainment *servers*. And they'll be in millions of people's homes, connected to broadband Internet lines, ready to serve up not only games, but video and music as well - and maybe other types of apps in the future. The PS4 is, after all, simply a dedicated x86 computer with (currently) pretty top-of-the-line specs, and it could handle all sorts of interesting tasks as a home server of sorts. There are a ton of opportunities - and now that the service has been established and has a killer app (the games), there could really be some interesting things that happen as a result.
That said, I'm not sure how important this really is. I mean, is there any sort of fundamental shift going on here? Something akin to the iPhone launch or cloud computing in general? Probably not. The fact is that it's not 2005 any more, and our tablets and mobiles already have incredible horsepower and great bandwidth via now-ubiquitous WiFi and 4G LTE cellular. The need for remote access just isn't as strong as it once was. Mobile devices used to be really, really underpowered, but now they're less than an order of magnitude slower than current desktop computers and consoles - and honestly, that's not very much. (The rule of thumb is that a new paradigm in technology needs a 10x improvement, right?) Dell's new Venue 8 Pro is actually a full-on Windows 8.1 computer in the form factor of a normal 8" tablet - why would anyone even *need* remote access? Additionally, companies like Amazon (my employer), Microsoft and others are quickly offering more and more functionality in the cloud (including both streaming games *and* Windows desktops, in fact), so the incentive to buy, install and manage a personal home server is pretty much non-existent, or at least shrinking very, very quickly. I don't know about you, but all I have left in my house are laptops and backup hard drives for the most part - well, except for the stacks of mobile devices and the cacophony of devices plugged into my HDTV, of course.
Still, there's something there, I think. I'm just not sure what it is - it just seems like something important is happening. That Sony, Nintendo, Nvidia and others are all converging on the idea of pairing lower-powered handhelds with higher-powered computers, and that they've all launched within the past year, is more than a coincidence - it's a trend. Maybe it's just that the costs are now reasonable, maybe it's because the infrastructure is in place, maybe it's simply that WiFi is now fast enough to be practical for low-latency apps and games. Who knows? (BTW, if you want more details about the latter, here's a great, in-depth article comparing Nintendo, Nvidia and Sony's remote access play solutions, measuring latency, frame rate, etc.) I think it'll depend on the next big player who jumps in. If Microsoft starts selling all their mobile phones with an Azure license for Windows in the cloud? That'd be a big sign. If Microsoft takes Sony's lead and adds local/remote game streaming to the Xbox One, that could be a big deal. If Google adds some sort of remote-access functionality to Android or Chrome at a core level, accessing streaming services on their nascent Compute Engine, that'd be something big as well. Maybe even Apple might add some sort of integrated iOS/Macintosh functionality in their next release. I can see practical reasons for all those companies - and others - to do these sorts of things.
Or maybe 2014 will come and go, and nothing else will come of this trend. Nvidia's and Nintendo's efforts will slowly fade from relevancy as they focus on new products, and Sony's Remote Play will become simply an interesting capability cherished by the most hardcore fans, but considered an oddity by everyone else, slowly fading from memory as time goes by, with the service eventually dropped due to lack of interest from developers or gamers. I can totally see this happening. Then once again, I'll wonder why technology like this never seems to take off. Only time will tell.