Brain Dump 2013


Almost a year ago I wrote here that I had been using OneNote (and later EverNote) to keep track of random thoughts that popped into my head while reading news, and that I was going to start posting them more regularly to my blog as shorter posts. Well, I never did. Every time I went to start posting, I'd think, "Ooh, I should expand on that a bit more" - this is how some of the longer posts I've written over the past six months or so came to be. But there are lots and lots more ideas that have been just sitting on my hard drive for a year or more now, so I figured I'd finally put them all into one giant, 10,000-word blog post and be done with it. Some are only slightly longer than an average tweet, but others are several paragraphs long. In all, they capture quite a few of my thoughts on various tech topics.

The title is actually a misnomer, as most of these thoughts were had during 2012 - basically when OneNote launched for both the iPad and the Windows Phone 7 that I was using at the time. It worked really well. I then moved to EverNote when OneNote became unstable for some reason on Android, and used that for a while, but they kept on messing with their user interface, which started to mess with my habits, and I finally stopped, which is too bad. This is pretty much why I hate using anyone else's services besides my own - I get into a routine, then some moron breaks something and it messes with me until I lose the habit altogether. I'm sure there are many, many Google Reader users who are feeling very similar right now.

Who the hell is going to read thousands of words of random thoughts? Probably not a lot of readers - I imagine there'll be a few who'll skim the article, but just going through now and cleaning it up a bit (taking out references that don't make sense any more, getting rid of half-sentences, etc.) I was barely able to make it through the whole thing, and I wrote it. Generally my blog over the years has served as a touch point for me to go back and read what I was thinking about at the time - so I can say "See? Knew it!" or, just as many times, "What the hell was I thinking?" So that's why I wanted to post this all before it gets lost or becomes completely irrelevant.

I really should start to post more often though. Writing down your thoughts is always a great way to clarify ideas, expand on them, and by publishing them, get valuable feedback from others. Even if the ideas aren't always fleshed out into full posts, it's good to get them down.


- Russ

Device programming for power users - There's an opportunity to provide tools that let power users take advantage of their more powerful devices, just as VBA, Apple's Automator, and AppleScript do for desktops, or Yahoo Pipes and Microsoft's Mayhem and on{x} projects do for the web. Devices are more powerful, more complex and more personal now - we should be able to give them more complex instructions. Why should the Nest thermostat be the only device learning patterns and helping make our lives easier?
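As a sketch of what I mean - this is not on{x}'s actual API, just a made-up rule engine in JavaScript to show the shape of "when X happens on my device, do Y" programming:

```javascript
// A toy rule engine: users register "when X happens, do Y" rules.
// The event names and actions below are hypothetical examples.
function createRuleEngine() {
  const rules = [];
  return {
    // e.g. when('arrived-home', (e) => ...)
    when(event, action) { rules.push({ event, action }); },
    // The device would call this as its sensors fire events;
    // it returns the result of every matching rule's action.
    fire(event, data) {
      return rules
        .filter((r) => r.event === event)
        .map((r) => r.action(data));
    },
  };
}

const device = createRuleEngine();
device.when('arrived-home', (e) => `set thermostat to ${e.temp} degrees`);
device.when('arrived-home', () => 'turn on wifi sync');
console.log(device.fire('arrived-home', { temp: 21 }));
```

The point isn't this particular API - it's that a rules-plus-events model is simple enough for power users, the way Automator is on the desktop.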

Cloud Desktops - OnLive's Desktop app and other services have shown that having a desktop in the cloud is possible and functional - how long before Microsoft offers a free Windows 8 desktop hosted on Azure with every new Windows Tablet? Also, Apple is behind in this area - there's no virtual Mac OS hosting service.

Display Ads suck - There's a continuum from helpful to innocuous to annoying/intrusive. The best ads are on the left (like Google AdWords for the most part) and any sort of display ads are on the right:

welcome - useful - helpful - acceptable - innocuous - bothersome - annoying - intrusive - malicious

QR Codes need to die - They were invented almost two decades ago for older, much less powerful phones, they hide the data they're trying to transfer, and they've never really been adopted by the West. Time to get rid of them and replace them with markers around text so an OCR program can accurately detect which section of an image to pull information from. NFC tags can be reserved for larger bits of information without processing, and information that you'd want to transfer secretly (your credit card numbers, etc.).

Apps vs. Web - I'd be more willing to use and rely on custom magazine-style apps if there was reliable linking from one to another. URL schemes need to be adapted to include apps, so that you can link to anywhere within a particular app, and be able to link *out* of any particular app as well.

Why I dislike Windows - It's not Unix, basically. If Windows came with a bash-based shell, included standard Unix utilities such as ssh, grep, etc., and switched to Unix-style file and folder names, it would go a long way toward making the platform usable. There's a reason Unix continues to thrive 30 years after its creation, and reasons why Windows, which was originally built on top of a cruddy OS clone of a cruddy copy of Unix, continues to suck. Examples include things like the file system with the backslashes and C: drives. And the fact that plugging in a USB peripheral still generates balloons, alert sounds, install notices, and long waits - 10 seconds to 10 minutes - for something that Linux or OSX just do instantly. Other problems are things like built-in apps suddenly not responding, or apps taking time to do something while the OS assumes they're not responding. The admin tools are a mess - there are so many 'enterprise' options and plugins and crazy logging options that it's impossible to actually *manage* anything in the management apps. The registry continues to be a nightmare and PowerShell should never have been invented. It's just a horrible OS.

Why I can't get excited about 'apps' - First, I'm still holding a grudge from the first year, when the iPhone didn't allow them and people illegally cracked the security on their phones (better known by the euphemism 'jailbreaking'). But also, have there been any great companies started as app companies? No, because companies need to control their own destiny, not ask permission of another company to sell their wares while giving them 30% of every sale. Apps seem to be limited somehow. Adobe still makes most of its money selling software for use on PCs. I can't imagine another company having the same sort of size and influence making only apps. Especially with Apple's 30% tax.

Twitter/Facebook data mining in Silicon Valley - Imagine if, instead of data mining everyone's tweets looking for buy/sell patterns, you just looked at employees of tech companies? Aggregate their tweets, look for hidden patterns, and see how those are reflected in earnings.

Wall-sized touch screen displays - One of the cooler things I did at NRC was help create a project we initially called Lava, then Expo, which uses the web to create HDTV-sized information touch screens using HTML. This is going to become more common, and there's a lot to learn from it.

Communication is hard to monetize - Though it's one of the most essential functions of technology - from the telegraph to the telephone to email and the web - the content it produces has the least value outside the person and moment a particular message was meant for. We know how to monetize information and entertainment via direct payment (ebooks, music), sponsorships or added value (AdWords), but the very act of interrupting a communication stream or creating a distraction in order to derive value is ultimately destructive or futile - it derives value by extortion. Some communication services have tiered levels of functionality. Others try to distract you with obnoxious display ads (Hotmail, Gmail, Yahoo mail, Facebook). At one point we paid by the minute to communicate - I ran up hundreds of dollars of phone bills during college by simply talking. And later, AOL charged me by the hour to connect to the internet. I've only just recently stopped paying per text message I receive. Really, how much is a single IM, text, email, comment, tweet or 'like' really worth? And then how do you translate that worth into actual money? Meebo started as a chat client and eventually became some sort of ad thing before it got bought. Lingr, tiny-chat and a million other chatrooms and chat sites have faded as their inability to derive actual value from communications continues.

Also, a correlation (new rule?): the closer web content gets to communication, the less value it has. However, if the communication facilitates some other purpose - like eBay - then the lock-in is high and the value is huge. 'Markets' online are simply services that facilitate communication between buyers and sellers. In order for Facebook to actually make money from their service, they need to figure out how to convert their communication service into facilitating some other goal.

Twitter's old enough now to expand the number of characters it supports, no? Why not? Ironically, this thought could almost fit in a tweet...

Tablet website design - I love how some sites (MIT Tech Review) get the idea of a site designed for tablets, where others don't get it at all - like the various Wordpress plugins and services that format pages for tablets, and end up making the site harder to use in every way.

'No I don't want to download your shitty iPad app' - Someone should start a Tumblr showing all the various obnoxious full-screen ads that publishers inflict on readers. Ahh, someone did. Good.

Portrait vs. Landscape tablet design and usage - I use my tablet almost exclusively in portrait mode, as that's the most comfortable and natural position. It seems silly to me to use it in landscape - do we read magazines in landscape position? Yet both Android and Windows tablets are regularly shown in a landscape position and pretty much made to be held that way. So crazy.

Karma points idea - Donate money and get karma points, then you can give those points to others, or post 'challenges' online where people can earn karma points from you. Then those points can be used for discounts/prizes. I've got to do something interesting with this domain I own...

Conversational UI idea - Create a new web site which uses the 'conversational UI' that Siri uses. Rather than a 'one box' at the top with a page of results below, the UI would be more like an IM conversation, with each response box having rich content: video, HTML, etc. And it'd all be archived. I created a prototype of this back in 2004 for Yahoo! Question: aren't most searches fleeting and not worth archiving?

iPad audio - The iPad needs better speakers! Isn't anyone at Apple an audiophile? The screen is great now, but the speakers suck. Where's the Retina innovation for our ears? Why does Apple ignore sound? Wasn't Jobs an audiophile? Why is there so little effort put into innovative solutions for bigger, better sound from the iPad and MacBooks? For a company that sells so many iPods - devices whose sole purpose is to deliver sound and media - it seems ludicrous that Apple basically ignores that part of the device. There's got to be something they can do to improve it.

Wireless charging - We have the technology, so why aren't more of my gadgets using wireless (inductive) charging? When the hell is Apple going to get on board with this? I want to be able to set the tablet down as I would a magazine, and be confident that when I pick it up again, it's fully charged. Same thing for my phone, ereader, music player, etc. You know how many times I have to remind my son to charge his mobile phone?

Hosted server-side 'apps' - The idea of individuals having their own hosted server used to be big business. Hosting was huge for blogs in the early days - accounting for the success of Movable Type and Wordpress, and PHP as well. Though you can still get quick and cheap hosting from places like GoDaddy, it's generally gone out of vogue. Turn-key solutions and services are slicker and easier to use -'s hosted service is more popular than's downloadable software by far. But today virtualization has become a key-press activity - look at Microsoft Azure, Amazon, Heroku and others. So what if you made installing server-hosted applications as easy as installing a new app on your phone or tablet? What sort of server-side apps are there besides web servers, storage and databases? How about scrapers and logic? What about actual desktop apps or console games running in the cloud and served up like OnLive? Should each phone/tablet come with a mirror in the cloud?

Back to intelligence in the clouds - Could server-side JavaScript and some standardization facilitate Telescript-like functionality, where code moves from server to server, running in place? Is this even needed, since you can usually move the data around pretty easily? And when you can't (the entire Twitter archive, for example), the process would probably take too long anyway, and another host wouldn't want to run your code. What if your script was badly written and just ran an endless loop or did some other wasteful thing? Would it help if the scripts got charged for their resource usage (like how Amazon charges per CPU hour today)? Again, is there any practical reason for Telescript?

Solar powered generators and minimizing electric use a little at a time per household - I'm not an electrical engineer, so I'm clueless about the realities of this. You know the gasoline-powered electric generators? Can they be converted to work with steam power? Ever see that video of the huge solar magnifying glass that can melt rock and metal? It seems like there's a *lot* of energy there that could be used to, say, create steam, which would then turn a generator. If you did that for, say, 6 hours a day, how much energy would you produce, and how does that compare to the same amount of space taken up by solar panels? Is the problem simply storage? My old boss installed panels which feed out to the grid, and he gets a credit for when he uses electricity later (solving the storage problem). But what if we thought smaller? Take all your electric devices that need to be charged up, and have them use a dedicated plug that's powered 100% by a single solar panel. Run a wire through a window, and you're done.
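Back-of-envelope, since I'm curious - all the numbers below are rough assumptions (full-sun insolation, a cheap ~15% efficient panel, typical 2013-era battery sizes), not measurements:

```javascript
// Rough calculation: can one window-mounted solar panel keep the gadgets charged?
// Every constant here is an assumption for illustration, not a measured value.
const panelAreaM2 = 0.5;        // a smallish panel
const insolationWPerM2 = 1000;  // full sun, the usual rough figure
const panelEfficiency = 0.15;   // a cheap panel, ~15%
const sunHours = 6;

const whPerDay = panelAreaM2 * insolationWPerM2 * panelEfficiency * sunHours;

const phoneBatteryWh = 5;       // ~1400 mAh at 3.7 V, a 2013-era phone
const tabletBatteryWh = 25;     // roughly an iPad-class battery

console.log(`~${whPerDay} Wh per day`);                                   // 450
console.log(`${Math.floor(whPerDay / phoneBatteryWh)} phone charges`);    // 90
console.log(`${Math.floor(whPerDay / tabletBatteryWh)} tablet charges`);  // 18
```

Under those assumptions a single half-square-meter panel more than covers a household's small gadgets - the dedicated-plug idea isn't crazy, at least on paper.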

Smart TVs are coming - Thumb-drive sized 'computers' with HDMI plugs for playing video on the TV are becoming more and more common. They're all cheap too - $78 will get you HD playback. This price, to me, means that TV manufacturers will soon simply be building them into the TV itself. But this presents a problem - how best to interact with this device? Remote control? Touch screen? Voice commands? Motion detection (using sensors like Kinect, image detection like the EyeToy, or controllers like the Wii's or the PS3's lollipop controllers)?

Windows branding - The thing that I can't understand is Microsoft's attachment to the Windows brand. They use it everywhere, even if it has little to do with the product itself. The rationale, I'm sure, is that all of MS's power and profits derive from its desktop near-monopoly, so all products and services should be created and marketed with an eye towards enhancing that monopoly. However, every brand has a lifecycle, and Windows has really run its course. It seems that it's time for them to branch out. The XBox is a great example of a new brand that has zero Windows tie-in, and that's good. Microsoft is a giant, nameless, scary corporation. Windows is that unhip OS you use at work that has all those viruses. XBox, though, is cool and popular. Microsoft stumbled upon a great potential brand name with Metro, in my opinion. I understand that name recognition is important and super valuable, but labeling their phone and tablet OS as Microsoft Metro OS would have been a better way to refresh their brand and generate enthusiasm for something 'new'. Those devices are consumer products like the XBox, so it'd be a perfect place to launch a new brand. Windows Azure is another boneheaded label... There aren't any 'windows' on a headless server running hosted OSs like Linux. Nor are windows used in the tablet-focused Metro UI. Windows RT could actually really suffer as a result of this - consumers are going to get pissed off once they learn they're not really getting 'real' Windows. Apple didn't call iOS 'OSX mobile' or something similar, and as a result it's clear that applications you buy for one platform won't work on the other, however close their lineage. Microsoft's plan of letting Metro apps run on both is interesting, but ultimately confusing, as it's a one-way street.

Windows RT thoughts - A tablet running Windows RT is pretty much like a tablet running Linux, Android or something like ChromeOS. Microsoft is a huge player, and Office is included, which is definitely a big deal, but every other app has to be created from scratch by developers. This to me makes the whole thing a really questionable platform. It's a result of Intel's failure to match ARM when it comes to low-power hardware. RT devices simply didn't sell as expected last Christmas, so the public is going to simply wait until next year (or the year after) when Intel finally gets its chips to work like they should for low-power devices. Then MS will kill (or de-emphasize) the RT label and move back to Intel-based hardware as their main focus. That said, ARM is so huge now, Microsoft may be forced to keep plugging away at it, keeping both their ARM and x86 versions of Windows going. But something about that seems untenable. Remember the Christmas before, when all those Android tablets sat, unloved and unpurchased, in Best Buy while Apple's iPad sold in record numbers? Just because you've got a tablet with a nice-looking OS and a retail presence doesn't mean squat, as Google can attest. I can totally see a bunch of Windows RT tablets succumbing to the same fate 5 months from now.

Microsoft Surface Pro is the MacPad I wrote about when the iPad first launched. The hardware looks good - lots of MS hardware is actually pretty good. I love my Arc keyboard and my wireless MS mice (with BlueTrack), and the XBox 360 and its controllers are astoundingly well-designed pieces of hardware that have truly withstood the test of time (ignoring, of course, the original red-ring-of-death problem that occurred due to lack of testing). But the problem is, sadly, it's running Windows, and Windows simply sucks.

Is Apple's iPad a success because of its simplicity, or in spite of it? You can't just throw a WIMP interface like OSX or Windows on a tablet and call it done - Microsoft tried that already and it didn't work. But going to the opposite extreme and saying that tablets don't need to be as powerful, versatile and useful as full-fledged PCs seems wrong. But that's pretty much what's been successful - not a desktop OS, but a consumer-oriented system instead.

Microsoft wants RT to be as locked down as iOS, but there's no technical or historical reasons for this, so it's really just insanity to think they'll be able to pull it off. If you don't care about a closed platform, you'll go with an iPad, which is established and has more apps than god. If you want a more open platform, you'll go Android. Why would anyone in their right mind choose a lobotomized Windows tablet?

Cloud comparison - There needs to be a graph of tech companies that compares and contrasts their various systems and services:

Apple - Amazon - Google - Microsoft - Facebook

Makes hardware - Makes OS - Makes applications - Expert online services - Provides email - Has a marketplace - Has your credit card info

Linus seems to have made nice with Google - I had the sense before that the Linux community and the Android guys (read: Google) didn't get along very well. But now things seem to have turned around. The Android and Linux kernels have been reunified, Linus is giving Google praise for the ChromeBook in his speech at Aalto as well as pointing out the vast number of Android phones activated daily, and he seems to have started blogging semi-regularly using Google+. This is an interesting change - precipitated, I think, by Android's ongoing success. Linus said it bugs him that he first designed Linux for the desktop, and yet that's the one place it still doesn't dominate in computers. But I assume it must feel nice to have a consumer-oriented, mass-market device in his pocket that is running Linux, yet isn't some oddity - in fact, it's considered one of the most powerful, promising platforms. Whether it's some Samsung or a Moto Droid, Linus can pull it out and get nods of appreciation from your average consumer.

Better HTML than HTML? - If you were to start from scratch, could you make a better HTML and browser? Rather than relying on HTML plus CSS plus JavaScript, what if you could make up your own type of page-rendering markup? Apps are so popular - what if you could provide standard templates into which data was poured automatically? Or maybe the first thing a site delivers is the template - which would contain what is essentially an app - and then the rest is all a data stream of some sort? Or maybe the app (like on an iPad) has a bunch of pre-installed templates that can be chosen by the user or the site? Someone must have thought of this before, and at first it seems too limiting, but there's something there. Rather than continually munging data and interface together, they would be permanently separated. Instead of a browser, it'd be a 'reader' - a data reader. This is what a lot of client-side sites are doing anyway; it'd be interesting to formalize it.
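The core of the idea is tiny - something like this JavaScript sketch, where the 'template' ships once and everything after it is pure data (the placeholder syntax is just made up for illustration):

```javascript
// Template and data stay permanently separated: the client holds a fixed
// template, and the site only ever ships a data stream to pour into it.
function render(template, data) {
  // replace {{field}} placeholders with values from the data stream;
  // unknown fields render as empty rather than leaking markup
  return template.replace(/\{\{(\w+)\}\}/g, (_, key) =>
    key in data ? String(data[key]) : '');
}

// The 'app' part: delivered once, cached forever.
const articleTemplate = '<h1>{{title}}</h1><p>{{body}}</p>';

// The ongoing stream: no markup, just data.
const dataStream = { title: 'Hello', body: 'Just data, no interface.' };

console.log(render(articleTemplate, dataStream));
```

A 'reader' in this model would just be a template cache plus a data fetcher - which is pretty close to what heavy client-side sites already are.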

The new browser - Is there a market for a new type of browser, one level above the current crop? Just using off-the-shelf WebKit, etc., but then adding in specialized features... Not just email integration, but full-on data munging, like GreaseMonkey on steroids - where every page is subject to manipulation, re-organization, data-mining, etc. There have been a few custom browser startups that failed each time, but there's something there.

Web Proxy as a service - Imagine always browsing through an intelligent web proxy that analyzed and assisted your daily web usage. Who says that all the new features in a browser need to be embedded in the client?

Trusting online services - As time goes by I'm more willing to trust parts of my daily computing experience to online services. Google now handles my email - and I am so happy they do, as the amount of spam I used to get was simply overwhelming my ability to tweak SpamAssassin to get rid of it. I've maybe seen a half dozen pieces of spam filter through in the past year or so of using Gmail. And though I rarely use it nowadays, Google also handles my Jabber IM [Update: This has been closed now as well, of course], which is nice, and my online calendar (again, hardly used). Beyond Google, I'm starting to use SkyDrive quite a bit. At first it was just because it was integrated with Windows Phone 7, which I used for over a year until switching to Android. But I still use it - OneNote and EverNote are great for keeping track of odd bits of info or ideas (like this post), online Office is handy for cross-platform documents, and the photo and file integration is great for automatically collecting all the photos I take with my handset into one place, and then being able to share those pics with my parents. I've also continued to use Twitter and Facebook to post random thoughts and links, which may not seem like real 'third-party services', but I used to put all that stuff on my blog. I'd never trust my personal blog or news reader to a third party though (and with the closing of both Posterous and Google Reader, the reasons are self-explanatory).

Unfinished games - There are sooooo many unfinished video games that I own, and I feel somewhat guilty about each one. They're not like a novel, though, which you can pick up the next time you get a long weekend and power through - you'll probably never see the end of a game you stop playing. Usually it's because they just get too hard. Which is annoying, as you're effectively only getting 1/2 or even 1/3rd of the game you paid for. So developers spend years creating some of these games, only for most of their customers to never see that content. It's odd. And these are just the 40 or so games for the XBox, Wii and Nintendo DS that I've purchased over the past 5 years. There are literally thousands more for each platform, plus other platforms I don't own. Millions of man-hours go into creating amazingly detailed worlds and adventures, much of which is forgotten soon thereafter. Sure, there are classics that everyone plays again and again - there are people who still time themselves playing 8 hours straight of Zelda games from the mid-90s. But walk down the aisle of any Best Buy and there are so many games that I'll never get the chance to play, let alone finish.

The Total Perspective Vortex machine from Hitchhiker's Guide is real, and it's my XBox. Check out your personal video game rankings, and you'll quickly see that you're not a unique snowflake. It's amazing how you can play a game, do the best you can, then see when your game is compared online how much you actually *suck* at it - how there are literally HUNDREDS of THOUSANDS of people out there who have better hand-eye coordination than you, or who are vastly more intelligent or better at puzzles than you. And those are just the ones who happen to be playing the same thing you are.

BigCo Money - When I first started working at Nokia, I was amazed at how much money a huge corporation has to make just to keep one office running, let alone a multinational organization. Coming from a startup where I had to pay for everything, my mind boggled at the sheer amount of stuff and people. Everything cost so much money. Think about each person - in Silicon Valley that's easily $100k or more per year in salary, benefits and other costs. How much was the rent? Each chair and table and stapler and computer and monitor and mouse, etc., etc. And yet, in order for me personally to live, I need much less than all that. With my knowledge of all the various ways you can generate revenue on the internet or via apps, how is it I can't seem to earn just that relatively small amount to live without working at a big company?

Tablets are the Swiss Army Knives of knowledge - Pretty much every book, novel, website, movie, and piece of music is available. For kids it can be a dictionary, an encyclopedia, a math tutor, and a language guide. It can be used to create reports, learn how to play music, or create movies and stop-motion animation. You can use it to draw, or play games, or browse the web. And the more people get used to it as an 'always on, instantly usable platform', the more educational tools will be developed for it - say, a way to lock down all the iPads during a test, or multiple-choice tests with instant feedback to the teacher.

Touch screens can look like anything, so they're incredibly flexible. Unlike platforms of the past, there's almost no learning curve for many apps and games, because what you see is what you do. WYSIWYD. Ever see someone learning how to use a mouse for the first time? Or a modern game controller with two analog sticks, a digital pad and 12 different buttons (start, menu, a, b, y, z, rb, lb, rt, lt, rs, ls)? They're lost.

5-6 years of iPhone - After a half decade of being in consumers' pockets, it's amazing how little the original iPhone has changed, and yet how it's still the platform to beat for most people. But more than that, the cultural change the iPhone precipitated is incredible. It's no longer surprising to hear about the sort of power and capabilities everyday people have in their pockets. What was once cutting-edge and high-end only is now contained in the hand-me-down phones parents give their children.

The fifth anniversary of the iPhone means the smartphone revolution has been in full effect for half a decade. It's no longer new or interesting to think of the device in your pocket as 'just a phone', and it's been years since people have said stupid things like, 'most people will never want to do X on a device.' The iPhone brought smart devices to the masses. All the good ideas of every platform that preceded it were instantly ported over, usually with more success on the iPhone. I remember soon after the App Store opened, I talked to another parent who was a doctor, and she had loaded up an expensive medical dictionary that used to be available only on the Palm. To her, the iPhone was now as indispensable as any other tool she had. And as iPhones got more popular, more developers had friends and family who owned the same exact, incredibly powerful and easy-to-use platform, and they started making apps like crazy.

I'm actually amazed at the shift to what was eventually called iOS programming. So many developers learned a whole new programming language (Objective-C), and a whole new set of APIs, for a whole new OS, using a new user interface paradigm (multi-touch screens). It's mind-boggling to me that it happened, and so quickly. But back to users, who are now jaded. There are many soccer moms out there who've had this insanely great device in their pocket for half a decade now. There are children entering kindergarten who've never known what life was like without an instant-on information and entertainment device always within easy reach. The key point is that the iPhone era now represents a solid half of the smartphone era, which began with the sale of the Nokia 7650 back in 2002. Apple may not have been first, but like the IBM PC, which entered the 'micro-computer' market years after the first Altair was sold, the iPhone now dominates and defines the era.

Blogs - I've noticed a return of the personal, unadorned, no-ad blogs. It seems they're coming back into vogue. Thank goodness.

Apple's delaying lawsuits - At one point in their history, Apple pinned all their hopes on a lawsuit against Microsoft to recover from bad business decisions and to compete against a nimble competitor, and almost disappeared as a result. Apple is now suing competitors again, but this time the strategy in my mind is different. There are three outcomes - first, if they do actually win any of their suits, they obviously damage a competitor in serious ways and that's great. But that's just the best outcome. If they can simply delay their competitors from entering a market for just a few months, that's huge. Especially because of the number of devices that are being sold, and the inherent lock-in that comes with them. Once a customer joins the Apple ecosystem and starts buying Apps from the App store, they're going to be less likely to switch. Every month Apple can delay a viable competitor with a lawsuit, means millions of new customers who are locked in for a much longer time. And if they can win an injunction over a holiday period? Well that's massive.

How hard is it to create an open Twitter? - I wonder why nothing has gotten off the ground. Is everyone over-thinking it? Are the developers who could do such a thing all too cool to use PHP and just host the thing? Why not just Keep It Simple, Stupid: use PHP for the back end, single-user, with some sort of JSON API for 'feeds' and client access? Single-user keeps the resource needs pretty simple - just a SQLite db and that's it. Customization? Also keep it simple. Then make sure the system includes an aggregator as well (with PubSubHubbub support for instant updates) and poof, all done? I guess this is just thinking like it's still circa 2000 and everyone's willing to set up a Blogger-like server to host their personal weblog. Why bother when Twitter and Facebook already exist, right?

Why isn't there a JavaScript version of PHP? - Developers have been bitching about PHP since it started, but its core functionality has yet to be matched. The shared-nothing, start/stop-per-request model is appealing in that it scales horizontally really well, and is super easy for a beginner to get started on. So what's the problem? Node.js and any other 'stateful' server solution (like Ruby or Python server code) really doesn't do the same thing. The script's lifecycle has to start, run, and stop with each request. Also, HTTP should be a first-class citizen, with little in the way of getting to the pure data being sent/received - aka no abstractions. At one point there was a project called mod_v8, but the oxygen supply for that got sucked out by Node. It'd be great to see it come back in some form. PHP is easy to start using, easy to scale to a pretty decent size without major issues, and has great docs and community. But when writing modern web apps which use a lot of JavaScript inside the browser, the swap back and forth is a mental pain. Ignoring library re-use and all that, it would be nice to have to remember only one set of oddities (every language has them) and not two.
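To illustrate what 'start, run, and stop with each request' means, here's a toy sketch in plain JavaScript - not mod_v8 or any real project, just the lifecycle idea:

```javascript
// The PHP property being described: every request gets a brand-new scope
// that is thrown away afterwards, so nothing leaks between requests.
function handleRequest(makeHandler, request) {
  const handler = makeHandler();   // fresh 'page' per request: start...
  const response = handler(request); // run
  return response;                 // ...stop - nothing survives
}

// A PHP-style 'page': any state it creates lives only for one request.
const pageFactory = () => {
  let hits = 0; // in PHP this would reset on every request - and here too
  return (req) => { hits += 1; return `hits=${hits} path=${req.path}`; };
};

console.log(handleRequest(pageFactory, { path: '/a' })); // hits=1 path=/a
console.log(handleRequest(pageFactory, { path: '/b' })); // hits=1 path=/b
```

In stock Node the factory would normally be called once and reused, which is exactly the stateful behavior the paragraph is complaining about - the hypothetical 'JavaScript PHP' would enforce the fresh-scope version.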

Old computers and video game consoles are going to get increasingly harder to hook up, now that TVs aren't using analog signals any more. I recently got an old univision Pong clone off of eBay and was excited to hook it up to my Toshiba or LG flatscreen TV, but was surprised when it wouldn't work - luckily, I still had a TV from 2007, from before the digital switchover, that could handle it. I assume there will always be gadgets you can buy that translate the RF signals coming from the old computers or Ataris into something modern TVs can understand, but the days of finding your old TRS-80 CoCo in the closet and hooking it up easily are going away.

Being Actualized - What separates the people who become President or superstars or business tycoons seems to be that they are self-actualized in a way that I can't seem to grasp. I'm not talking about those who have a flash of genius once in a while and then muddle through the rest of the time. The truly great people get up every day and set to work on their plans with efficiency and skill. The normal stuff that knocks the rest of us off our stride just doesn't seem to apply to them. They always seem like they just woke up from a great night's sleep. They're smart *and* capable. I've met a lot of people who might be one or the other, or a little bit of both, but I'm always amazed when I run into that person who's in that next league of functionality. There are 7 billion people on Earth, and even among that huge number, these are the most capable. They just function really, really well. They don't have the same penchant for procrastinating, or a host of other mental hangups. Maybe they're healthy, maybe they're not - but they don't seem to care either way. They're just super *functional*.

Many of us *could* be amazing, if only we could function at the top of our abilities all the time - we might have the intelligence or wisdom (though more likely than not we actually don't and just think we do) to lead important projects and efforts. But most of us aren't like that. We put things off just because. We want to sit on the couch on a Saturday, or avoid making some effort at some point just because we don't feel like it. Or we get afraid of our success and sabotage it. Or we're afraid to even try. Or a million other ways that lead most people into lives of working 9-5 and spending the other 8 hours before bed eating or watching TV.

I'd love to become truly actualized before I die, but I know it's not going to happen, which is sort of sad.

Family gadgets - We really need better control of multiple accounts for apps: centralized billing and permissions, and the ability to group families together, now that each member has a mobile. There has to be more effort given to family-level command and control features for the ever-increasing number of gadgets in our lives.

Apple desperately needs a way to manage accounts for the whole family. On one hand, their system is great, because I can buy an app once and both my son and I can use it - he actually has an old iPod Touch as well that can download it too. But I don't give him my username and password, because he could spend a thousand dollars on apps in a blink. And since you need those credentials to update apps, I'm constantly having to grab his iPad and enter them in manually. It'd be really great if I could just manage his account as a sub-account. That said, it definitely should NOT be done like Microsoft's XBox Live system. What a giant pain in the ass. It's so confusing, I don't even understand how it works even today. I started with the basic 'this is a child' settings, and then we could never seem to do anything together. So I gave up and just set the account to 'adult'. But even then, there are tons of games from the arcade that - even though they're installed on the same physical box - can't be used by his ID because he didn't purchase them. Ugh.

Family tech management is going to be a thing. But it's sort of a niche thing, as you only need it for a few years. My son is just now getting old enough to start exploring the Internet - YouTube specifically seems to be a big draw for him and his friends - yet he's still too young for a full dose of the real world. So his time online needs to be managed and monitored, but not in a limited, sheltered way. Same thing for the apps he might want to buy, etc. But in a few years he'll be old enough to make intelligent decisions (I hope) and smart enough to get around any draconian restrictions, so those controls will be moot. It seems there will never be enough demand from such a fleeting target customer (parents of 9 to 14 year olds) for any real innovation to come in this area, and it's sorta sad.

We get used to UI limitations - When the iPhone first came out, something I thought might be an Achilles heel for the platform, specifically when it came to gaming, was the lack of buttons. I was convinced that there would be add-ons (like the gamepad that only seems to have arrived this year - 5 years later) to add buttons. My reasoning for this was Tetris. Playing Tetris using virtual buttons on a touch screen just wasn't as enjoyable, I thought, as using actual buttons. Now, years later, now that I'm used to using a touch screen to interact with all my phones and my iPad, playing Tetris with virtual buttons isn't so bad. I still think actual buttons are better, as I hate when my fingers move ever so slightly and I start mis-tapping the screen, but it's not the deal-killer I once thought it was.

As someone who spent a lot of time in the early 1990s using desktop publishing software like Aldus PageMaker and Quark XPress, I remember being thoroughly unimpressed with the layout and design of web pages, and thinking there needed to be something better. I wasn't alone - way before CSS was common, there were programs out there that would let you design pages like you were using a desktop publishing app, while behind the scenes the HTML was being created using thousands of tables or who knows what. NetObjects Fusion comes to mind as one of the most evil of these programs, but Microsoft's FrontPage was just as bad. For many people who had been doing desktop publishing since the mid-80s, the idea of not being able to control the kerning of fonts precisely was just unacceptable. The mistake was thinking that this meant the entire new system was a failure. Just because one part of a new technology doesn't work as well as the similar part in the past doesn't mean the entirety is hopeless, nor should you focus on 'providing that missing piece' for those holdouts. How many times have I seen the phrase, "this new tech is great except for x, which isn't as good as before, so it won't really get popular until that's fixed"? And how many times has that been wrong? I've done this myself even over the past few years - I've been completely shocked at the uptake of mobile apps, for example, assuming people would balk at the inherent limitations of an app vs. a web page. I'm not talking about games, or apps that do things that web pages can't do yet, like multimedia, etc. I'm talking about the magazine apps, and the news apps, and aggregators like Flipboard. All that stuff can be done better via website, yet many people - maybe even a majority - still prefer to use the apps instead.

Twitter really has reached its limits of usefulness - I find myself using Facebook to post random thoughts more and more, simply because I'm able to use more than 140 characters to express myself. Maybe for the vast majority of people out there, making simple statements on the web, or just sharing a link, is enough to express everything they have in their brain about a subject. But I need more than that. I like being able to write and explain myself, to use turns of phrase, or simply to use a little more space to fully get everything in my head out. I may not need an entire blog post to do so - many times what I have to say is simple enough to fit in just a few sentences, or maybe a couple paragraphs, which would make a sorry-ass blog post, almost not enough content to warrant the effort to load the page. But in a system like Facebook or Twitter, that's a great amount of content. Yet years after the service was created using SMS as its backbone, they still haven't expanded the number of characters you can use. At first the limit was novel. Then the insistence on it seemed like a bold decision. Now it's just stupid. They've kept what are now essentially artificial limits on their system for years - in fact tweets are so small, they barely contain enough contextual information to make them useful in the aggregate (for search engine ranking, say, or opinion analysis).

They've kept other artifacts around as well, like the completely useless single-tweet page, rather than showing a day's worth of tweets or letting users bundle tweets in some way (per hour, per day, whatever). Their profile pages also suck - being almost useless, and for little reason. Facebook really has the right idea - being able to pop in a link (which expands out to a snippet and thumbnail thanks to the Open Graph API), and then a paragraph or two to explain your thoughts, is great. No more is really needed for most people and most thoughts. But no less is good either. What's most annoying, though, is that Twitter now seems able to support this functionality, but they're parceling it out to select partners only. That's stupid. Facebook's ability to have friends comment directly on a post is great, as is letting them simply note their appreciation for something, or acknowledge they saw it, using the like button. This is useful stuff - and though not all of it applies to a more public system like Twitter (where friendship - and thus trust - is asymmetric), they've relied too long on the same simple posting mechanic with little-to-no innovation, let alone acknowledging other, more useful, ways of doing the same thing.

Since writing the above paragraph, I've of course shifted back to using Twitter more. No idea why, really.

Why would the Sailfish guys focus on Meego? - It makes no sense. With their skill set they could expand on the base Android Linux platform, adding in features and additions that could be sold to the numerous Android-using manufacturers to help differentiate their products. I was really surprised at how well Samsung was able to promote the handful of add-ons to Android in their Samsung Galaxy SIII launch. They took what could have been a boring me-too platform and made it their own, without spooking end users into thinking they'd be locked in to Samsung because of incompatibilities. A new startup focusing on the OEMs as customers, tweaking the core Android OS in various ways, would be a valuable startup with a real chance of making money and probably getting acquired. But Meego is dead. It's as gone as WebOS. Can you imagine if a group of ex-WebOS guys decided to create a startup focused on that platform? Insane. Same thing for Meego. Why dedicate more of your life to writing code and expending effort on a black hole of a platform that no one will ever use?

Personal servers in the cloud - Everyone should have a dedicated server with high bandwidth and decent processing power. I've thought this for years, and I've continually paid for my own hosted server for over a decade - even though my blog isn't close to the traffic it once had, the idea of not having a dedicated island in the cloud just for me is unacceptable. That said, I don't use it nearly as much as I could, and in fact I've been slowly moving things away from the server - for example, my email is now hosted by Google because they're able to cut spam 1000% better than I was ever able to. But I envision a server with intelligent agents we could give tasks to: monitor web sites and news feeds; do BitTorrent downloads for us, connecting to thousands of peers over high-bandwidth connections; serve as a web proxy, so all our clients are routed through a single point where we could put instructions to filter ads, or feed a running filter of things we're interested in. We could even host our desktops or game consoles in the cloud like OnLive does, but with each person having their own. That last part might be crossing the line between what is an appropriate function for a server vs. what is better to have local - the weak link, of course, being the reliance on constant high-speed access. But that's not what I think is most valuable anyway - it's the idea of an intelligent agent looking out for us on a dedicated server that I find interesting.
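The core of that 'intelligent agent' idea is small enough to sketch. Here's a toy version (names and shape are my own; a real agent would wrap this with the actual fetching, scheduling and notification): a watcher that, given successive snapshots of a feed, remembers what it has already seen and surfaces only the new items.

```javascript
// Create a watcher that tracks which item ids it has already reported.
function makeWatcher() {
  const seen = new Set();
  return function check(items) {
    // Keep only items the owner hasn't been told about yet.
    const fresh = items.filter(item => !seen.has(item.id));
    fresh.forEach(item => seen.add(item.id));
    return fresh;
  };
}

// First poll: everything is new. Second poll: only the new item is reported.
const watch = makeWatcher();
console.log(watch([{ id: 1, title: "A" }, { id: 2, title: "B" }]).length); // 2
console.log(watch([{ id: 2, title: "B" }, { id: 3, title: "C" }]).length); // 1
```

Run that on a loop against a few dozen feeds on a dedicated box and you have the skeleton of the always-on agent described above.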

Home servers - I have a media PC in my living room which is attached to a 2 terabyte drive on which I put my movies and music, ebooks and backups. I wonder what less techy people do? And if we're entering the post-PC era, will people just use services in the cloud rather than have this stuff stored locally?

My iPad kills my productivity - Using the iPad can be incredibly unproductive. Forcing myself to turn it off when I'm at home has an incredible effect on how much I get done on the computer. The reason doesn't have to do with the capabilities of the iPad so much as with where and how I use it. If I were sitting at a desk with only my iPad to use, I could actually still get quite a lot done. The problem is that when I grab my iPad to go read my news or email, etc., I generally take it to places where I can sit and relax while doing that stuff. I'm in a passive mode immediately. It's more of a bad habit than anything intrinsic to the platform, I think. Whereas with a laptop - which, yes, enables more powerful applications, faster input and multi-tasking - I'm also sitting upright at a desk or table, or even on the couch, in a way that is not so passive. So when I turn my iPad off now, instead of just putting it to sleep, those few seconds are a way of reminding myself: "hey, you've got stuff to do - don't go into passive mode right now." I may go to the computer and check Facebook, but I'm also in a position to actively engage in other stuff as well. I wonder if there's a way to get out of the bad habit with the tablet form factor, so I can lounge *and* be productive.

Habits - As I get older, I find myself less able to break bad habits or, more importantly, to start new good ones. I can remember over the years starting various new activities and then having them slowly fade from my routine until I stopped completely. And I also have a sense that, over the years, the time from starting to stopping has shortened quite a bit - even as time itself seems to be moving faster. It's already mid-July, yet the year seems like it started just a little while ago. One would think that if I started running regularly now, *bam*, six months would flash by and I wouldn't have noticed that I'd been exercising regularly all that time, right? Habits should be *easier* to start as you get older, since time is zipping by so fast. Yet it seems that I start new habits and they're done within a couple weeks, if that. Not only that, but sometimes I even forget I was starting something new, and then remember 3 months later: "Oh, yeah - I was going to go to that thing every week..." I have to assume these two issues are related.

Web revenues - How much traffic is enough to generate significant ad revenue nowadays? I guess it's the same as it's always been for pay-per-click: it's all about the types of ads, making content relevant to the most lucrative ads, and then attracting enough traffic (SEO and otherwise) that the minuscule portion of people who actually click on those ads adds up. Having used various versions of ad-block for years, it always amazes me that there's a type of person who'd click on an ad. Even Google... I use it every day, but when's the last time you clicked on a sponsored ad? Rarely, if ever. Their whole business model is predicated on taking advantage of unsophisticated people who don't know any better - and on there being so many millions of those people that the numbers add up.
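To put rough numbers on how minuscule that portion is, here's the back-of-the-envelope math - every figure (traffic, click rate, cost per click, revenue share) is a made-up illustrative assumption, not a real rate:

```javascript
// Monthly revenue = pageviews x click-through rate x cost per click
// x the share of the ad price the publisher actually keeps.
function monthlyAdRevenue(pageviews, clickThroughRate, costPerClick, publisherShare) {
  return pageviews * clickThroughRate * costPerClick * publisherShare;
}

// Say 100,000 pageviews a month, a 0.2% click rate, $0.50 per click,
// and roughly 68% of that going to the publisher:
const revenue = monthlyAdRevenue(100000, 0.002, 0.50, 0.68);
console.log("$" + revenue.toFixed(2) + "/month"); // "$68.00/month"
```

In other words, six figures of monthly traffic for dinner money - which is exactly why the whole game is chasing lucrative niches and ever more volume.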

Apps are actually a service, rather than a product - I've known this for a while, but it's still an odd thing to really grok. I have a bunch of apps I bought from Apple, yet none are actually, truly mine. I can't sell you my copy of Angry Birds because I'm tired of it, can I? So really, it's not so much a product as an active sort of service. Same thing for the games I bought from XBox Live, except there you have to pay a yearly service fee. Imagine if Apple decided to charge a yearly fee for iCloud and app updates? Imagine if iCloud worked more like XBox Live, and posting an update to your app cost $40,000?

Net neutrality needs to start in the cloud - Silicon Valley loves to point at the carriers and say they can't cherry-pick traffic, in the name of net neutrality, and they're right. But net neutrality also needs to extend to the services in the cloud - from both a legal and a corporate-policy perspective. My space in the cloud needs to be considered as private as my personal computer (and thus protected from unwarranted search and seizure). If I am using a service such as Twitter, Facebook, WordPress, Apple, etc. for publishing, that publication needs to be protected from *corporate* interests as much as it is protected from government. Right now, any service can cancel your account for any reason, without recourse at all. This is unfair on every level. There is no level playing field between an individual and a multinational corporation. If Twitter doesn't like what you're tweeting, it can and will take down your account, and your recourse is minimal. There needs to be a cloud fairness act which ensures that companies providing data services - on which we rely for both entertainment and livelihood - can't impede your access to or use of those services without just cause.

There are two levels to this. The first is to prevent companies from making purposeful, high-level decisions to not provide, or to cancel, services for some purpose - examples would be Apple not allowing any browser technology in the App Store, nor allowing OnLive to publish their streaming game client (even though it breaks no rules). Then there's the random customer service rep who does something stupid, with no way to get it fixed because corporations are slow-moving behemoths filled with people who are simply trying to cover their own ass. Examples: Microsoft recently canceled a non-public SkyDrive account because, out of thousands of files backed up, there was a single picture of a naked woman, which it considered porn. Apple recently told a woman she couldn't even *mention* Amazon's services in any eBook published on iBooks - at first she needed to remove live links, then it was the content itself.

On both a macro and micro scale, big companies simply don't have the incentive to be fair. Think about it - customers outnumber employees by literally millions to one. Support staff are overwhelmed by requests, so their priority is simply to get *most* of them right. If they don't get most right, they'll definitely start to lose customers, and that's not good. But what about the one or two that fall through the cracks? What about the one or two people who got screwed for no reason? Well, there's no alternative for them. Unless their issues are so egregious they can drum up some bad publicity to get attention focused on their problem, there is no recompense for them.

A cloud neutrality law would focus on this aspect of numbers. If your company offers services to a certain number of people, or to a certain multiple of your employee count, or if your company acts as a 'marketplace' for buyers and sellers, then that company should be forced into being a neutral party. We've seen this work (somewhat) when it comes to piracy - in order to not be liable for pirated content stored on their servers, service providers need to be neutral in their handling of that content and respond to takedown requests, etc. The incentive was there for them, and suddenly the cost of support was worth it.

Cloud services need to have similar restrictions placed on them as well - with hefty fines for unjustifiably restricting access to a service, product or marketplace. Apple should be forced to open up the App Store, as they are providing a marketplace. Twitter should pay hefty fines if they take down an account for any reason other than actual criminal activity.

Mozilla's documentation needs serious help - There's *so* much of it, with various versions of the same content, that it's almost impossible to figure out what you're looking for and what you're looking at. The good thing is there's lots of information. The bad thing is there's lots of information. I know they've started working on this with a new CMS, but it still needs a lot of help. And what's going on with that effort? Seems like it's been forgotten. The only way that site could *ever* survive is if Google, Microsoft, Apple and Firefox all agreed to close their own web doc sites and use it instead.

Android - I'm grudgingly liking and supporting Android - you have to give Google credit for continually pushing it. It started as a system geared around making Nokia E65 clones, but quickly morphed into an OS that competes with iOS. Year after year, more resources have poured into it, and the result is really the OS everyone was hoping for a decade ago - Java for app and UI development, on top of a standard Linux stack. The two alternatives (iOS and Windows Phone) are locked down and need proprietary dev tools: want to create an iOS app? Get a Mac and Xcode. Want to publish to the WP7 store? Get a Windows PC and Visual Studio. Android apps can be developed using any OS (including Android itself). Sure, some carriers lock down Android devices, but alternatives have been regularly available for a while now.

Android everywhere? - Android has become the de facto operating system for any piece of consumer electronics not produced by Apple. This makes sense, as underneath the GUI and main app APIs is Linux. Rather than considering Android its own OS, maybe it should be thought of as the Gnome Desktop for gadgets. Actually, as Chrome OS comes along, maybe for desktops as well. Will Microsoft be able to get Windows on TVs? In the cut-throat business of consumer electronics, where every penny counts, I sorta doubt it. I think it's interesting that Linus criticized Apple and Microsoft for having to split their OSes into server, desktop and gadget versions (calling their claimed unification 'bullshit' - which may be wrong in the case of MinWin).

I actually agreed with Nokia using WP7, as it seemed like a good way to differentiate themselves and get the backing of a sleeping giant. What I didn't foresee was the ability of companies like Samsung to launch compatible Android devices yet still add custom features to differentiate their devices - features that weren't just cruddy shells on top of the stock OS. The voice, messaging and imaging enhancements on their Galaxy phones really do make those devices better than competing models with the same hardware specs and version of Android, yet it's still Android, and your previous purchase of Angry Birds is still valid and compatible. I honestly didn't see that working, but it does - mostly because Android is continually improving, so the base functionality is good, and adding features on top of that functionality is just icing. Rather than being faceless clones, HTC, Samsung and Sony, for example, have been able to differentiate while also keeping compatibility. The end result is that any third option (Windows Phone, or something else) is going to face the impossible task of getting traction, because it will be different (just like all the various Android devices), but without the benefit of also being compatible.

Android has won - In my mind, the world is going to continue to move towards Android with increasing speed and ferocity.

It's open. It's Linux. It's widely supported. It's the only thing comparable to iOS. It's easy to develop for. It's generally good looking and functional. It's fostering massive competition and driving down prices. And Google isn't letting up, continuing to pour resources into each version of the OS, adding features and improving the UI.

Apple is the 1,000-pound gorilla, Microsoft is close behind in terms of influence over the general computing market, and it's hard to tell what longer-term effect Apple's dominance or Microsoft's control of the traditional PC will have on "post-PC" devices. But Android right now seems to be doing nothing but growing. Maybe this era is different, and Apple will just continue to dominate the market while everyone else fights for scraps. Maybe Microsoft's pull in the enterprise and the living room (XBox) will surprise everyone in the long run and they'll become a real player. I'm not sure.

But what I am sure of is that every other manufacturer on the planet is creating devices using Android. If it's a consumer product with a screen? It's going to be running Android. This is pretty huge... I think until just a year ago, I was really wondering if Google was going to be able to make that happen, or if another variant of Linux would somehow jump in and take over. But for as many flaws as Android has, all the rest were even worse (or vapor), and now the industry seems to have settled on Android. If you want all the fancy features, you license that stuff from Google. If not, you go your own way and use a custom version, like Amazon is doing with their Fire tablets and countless OEM white-labels are doing throughout Asia. That's a pretty great strategy, and one that is hard for anyone besides Apple to counter right now - and it'll only get harder as time goes by.

I wonder if we won't start seeing 'general computing' platforms (aka PCs) with Android on them soon?

My move to Mac - I hate to admit it, but I am vastly more productive now that I've finally switched from Linux and settled on using a Mac at home. There are so many options in terms of apps, and the support is fantastic, as there's already such a rabid community of developers and users out there on Macs.

That said, the Unix underpinnings of the Mac are just sad in so many ways. It's really difficult to understand unless you've used a Linux desktop for a while, but having the best command line tools at your fingertips, and the ability to install or upgrade any dev tool or server using an integrated package manager, really makes using Linux nice. But for everything else - whether it's using iMovie to arrange videos, iPhoto to manage my pictures, or the countless commercial apps that are available, from small guys like Pixelmator to any of the Adobe or Microsoft products - using a Mac makes many small headaches just disappear.

Home grown industry - In Silicon Valley the locals do startups and technology. It's what we do here. Other places might specialize in making movies, or in producing stockbrokers or lawyers. Here, we start tech companies, work at startups, wish we were working at a startup, live down the street from a guy who just sold his startup, talk about startups, read about startups. It's hard not to be constantly amazed at how much influence the companies in this area have on the entire world - from Intel to Apple, Nvidia to Google, Facebook and Twitter, Oracle, etc. Even though it seems that tech and information services should be a completely open playing field, with no inherent advantages for any particular geographic region, Silicon Valley consistently produces generation after generation of incredible technology and high-tech services. Every once in a while I look around and really see how crazy it is here. My kid's friends' parents all work at one tech company or another. It's what we talk about at barbecues, while we're waiting in line somewhere, chit-chatting at the dentist. It's just the local industry, but one that has such a huge effect on the rest of the world, it's nutty.
