Thursday, December 16, 2010

High Hopes for ChromeOS

Google held an event last week that provided more detail on their efforts surrounding ChromeOS and announced a pilot program. There seems to be much confusion around two questions:
1) What is it?
2) Why should I care?

I can address #1 somewhat. ChromeOS is an operating system Google is making available to partner computer manufacturers that boots straight into the Chrome web browser and restricts itself almost entirely to the web. The concept is that people using netbooks, and even full-power laptops and desktops, often spend the majority of their time in the web browser. Google has improved the Chrome web browser, incentivized web app developers by adding a web app store, simplified the operating system to running this single application, baked in the latest security capabilities in a user-friendly fashion, and dictated inclusion of up-to-date WiFi and 3G radios in the hardware so that users can be almost always connected.

So, why should you care? Google partially answers that question here:
http://www.youtube.com/watch?v=lm-Vnx58UYo
For my part, I expect manufacturers to be able to turn this platform into cheap, simple-to-use, quick-access computers with long battery life that do nearly all the things people do on today's computers and more. Powerful applications can run on powerful servers instead of requiring you to lug around beefier hardware. Simple applications stay simple, are always up to date, and don't expose your data to loss or theft when, inevitably, the computer is damaged or lost. The frustrating days of paying thousands of dollars for a laptop, then hundreds or thousands more for applications, loading it up with private info, personally owned media, and precious memories, only to leave it in the airport security line and lose everything, may soon be behind us.

Google's CR-48 pilot program demonstrates all these benefits for a lucky few, but access for the larger marketplace remains in the future: the pilot shows what can be done, but it's up to other parties to carry it forward. Manufacturers need to come out with compelling hardware at a reasonable price, soon. Developers need to leverage the power of HTML5 in their web apps to make them perform like locally installed apps. More development of private cloud technology is necessary so that people and businesses have a secure place to keep their info that doesn't belong to a third party. Finally, wireless carriers need to get on board. Nobody wants yet another two-year, $60/month contract tied to a single, limited device. I really don't want to pay by the byte, either (would you want to pay for your electricity by the electron?), but that appears to be the way things are headed. The free 100MB/month Verizon includes with Google's CR-48 pilot program is encouraging, but it remains to be seen what will be offered with real products. Here's hoping it all comes together.

Wednesday, December 1, 2010

Why Is My Home WiFi Flaky?

I spent probably way too much on an Apple Time Capsule a couple years ago. In an attempt to get my money's worth, I have been trying to get a lot of use out of it. It's supposed to do a lot: dual-band (2.4 and 5 GHz) WiFi, Gigabit Ethernet, a built-in 1 TB network-shared hard drive, and a host of other Appley features. Since nearly every WiFi device is compatible with 2.4 GHz, I've left it in that mode for those couple years. The coverage is flaky, however. Videos stutter, even when I'm playing them from a share on another computer in the house, and sometimes connections just drop. In an attempt to find out why, I read up on and downloaded a free program called InSSIDer, which produces a handy graph of all the WiFi networks in range and what channels they're on. Here's what I found:
2.4 GHz Networks

5 GHz Networks
(Network names have been obfuscated to protect the innocent.) There are multiple overlapping networks in range on every available channel in the 2.4 GHz spectrum. While the WiFi specs have algorithms in place to continue operating in spite of interference, they're pushed to the limit by all the overlapping networks. By contrast, there are far more channels in the 5 GHz spectrum and only a single network in range. A 5 GHz network in my house could easily operate without any interference from neighborhood WiFi networks.
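To see why "every available channel" ends up contested, consider that 2.4 GHz channels are spaced only 5 MHz apart while each network's signal is roughly 20 MHz wide, so any two networks on channels fewer than five apart interfere. Here's a back-of-the-envelope sketch in Python (the neighbor channel list is made up for illustration; only the channel spacing rule comes from the spec):

    # 2.4 GHz WiFi channels sit 5 MHz apart, but a network occupies
    # ~20 MHz, so channels fewer than 5 apart overlap. That's why
    # 1, 6, and 11 are the only non-overlapping trio in the US.
    def overlaps(ch_a, ch_b):
        return abs(ch_a - ch_b) < 5

    neighbors = [1, 2, 4, 6, 6, 9, 10, 11]  # hypothetical nearby networks
    for ch in range(1, 12):
        rivals = sum(overlaps(ch, n) for n in neighbors)
        print(f"channel {ch:2d}: {rivals} interfering networks in range")

Run it and every channel from 1 to 11 reports at least one interferer, which matches what the InSSIDer graph showed in my neighborhood.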

Luckily, or so I thought, the Time Capsule supports 5 GHz WiFi as well. As I found when I switched it over to 5 GHz mode, however, the performance was dismal. Despite having no other networks interfering with it, speeds and reliability in 5 GHz mode were, in some cases, even lower than in the overcrowded 2.4 GHz mode. It was actually worse when my laptop was in the same room as the Time Capsule, so range isn't the cause. After using jperf to take some actual throughput measurements at a few places in the house and comparing them to what competing wireless routers can do (2-7x faster in online reviews), I've decided to retire the Apple Time Capsule. Since I've now got the Windows Home Server handling backup of both PCs and Macs in the house, even its use as a Time Machine backup target is no longer relevant. What a colossal waste of money.
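For anyone who wants to reproduce that kind of measurement without jperf (which is just a GUI front-end for iperf), here's a bare-bones sketch of the same idea in Python; the port number and transfer size are arbitrary choices for the illustration:

    # Minimal throughput test: run receive() on one machine, then
    # send("other.machines.address") on a second, and read off Mbit/s.
    import socket
    import time

    PORT = 5001          # arbitrary
    CHUNK = 64 * 1024    # 64 KB per send/recv

    def receive():
        srv = socket.socket()
        srv.bind(("", PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        total, start = 0, time.time()
        while True:
            data = conn.recv(CHUNK)
            if not data:
                break
            total += len(data)
        elapsed = time.time() - start
        print(f"{total * 8 / elapsed / 1e6:.1f} Mbit/s over {elapsed:.1f} s")

    def send(host, megabytes=100):
        cli = socket.socket()
        cli.connect((host, PORT))
        payload = b"\x00" * CHUNK
        for _ in range(megabytes * 16):  # 16 x 64 KB = 1 MB
            cli.sendall(payload)
        cli.close()

Move the receiving laptop from room to room and the numbers tell the story a signal-strength icon never will.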

Don't Get Me an Internet TV Device for Christmas

A startling and confusing array of Internet TV devices has launched in recent months, all competing to be your gadget of choice for getting internet content onto your TV. Even more confusing to most folks is why they would want such a device, since none of them gives us access to the full web we see on our computers. I was in the middle of drafting another diatribe on the topic when I came across this series of posts from Tim Higgins over at Smallnetbuilder.com: http://www.smallnetbuilder.com/multimedia-voip/multimedia-voip-howto/31346-diary-of-my-switch-to-internet-tv-part-10-internet-tv-boxes-bah-humbug

That link is to his most recent post, where he comes to the same conclusion I have: it's all a big waste unless deals are cut with the content companies. He also concludes that the best device for watching internet content on your TV is still the computer. Apparently, content license owners are coming to the same conclusion and exerting more control over what they make available online. Just tonight, my wife and I sat down to watch back episodes of a show a good friend turned us on to - The Big Bang Theory - and that we've become hooked on. It's not on Hulu, Netflix, or CBS (who broadcasts it), and though it shows up in Amazon's VOD service, you get this annoying message:
Amazon licensing error message

It's a not-so-subtle reminder of who, exactly, is in charge of what we watch, and when. Like it or not, our options are to wait for the license owners to come around or to work around them and be labelled a "pirate." As nifty as the Internet TV devices are, technologically, I don't want to be in tonight's situation of searching all over, site by site, only to be thwarted by someone else's "licensing agreements." And, as Tim Higgins points out, even the best of the available devices still have bugs to be worked out.

Thursday, September 23, 2010

Windows Home Server and Apple Time Capsule

I just got an HP MediaSmart Windows Home Server and was digging through some forums about another issue when I came across these very handy, detailed instructions on homeserverhacks.com for how to get the remote access features working with my Apple-brand router.

I'm in the middle of backing up all my systems (two Windows laptops, an older desktop, and a Mac Mini) to the server at the moment, so I haven't messed with the remote access setup yet. So far, the system has performed well. A few warnings, though:
1) The Media Collector that comes with the HP-branded models works great for automatically collecting and organizing music from iTunes. Some sites claim it does the same thing for videos from iTunes and photos from iPhoto. That hasn't been the case in my first 24 hours of experience. iPhoto's approach of storing everything together along with a database only it can manage doesn't lend itself to syncing with any other system, and HP's software is no exception. Since we switched a while ago to Picasa, which leaves image files where they are and detects new ones as they arrive, our more recent photos have synced over but the old ones from iPhoto have not. I'm not sure what to think about the iTunes movies. It's amazing what the MediaSmart Server does, but so far it doesn't quite meet the expectations I had.
2) The point of the Media Collector is gathering all your media together from different systems. What you get out of this process, however, depends on what you put into it in terms of organizing your media before the collector goes to work. Again, in contrast to its nice handling of music, photos and movies are collected into the server shares by username. Unless you happen to use the same account on all your systems, with similar folder structures and everything in its default location (e.g. all the pictures you want collected sitting in the Windows "My Pictures" folder), Collector makes an equivalent mess out of your files in its shares. One option I found out about too late is enabling the Guest account on the server, which should allow everything to be collected under that single username.

Friday, August 6, 2010

WSJ and re-reporters miss the point on IE InPrivate Filter

The Wall Street Journal recently published an article titled "Microsoft Quashed Effort to Boost Online Privacy" on Microsoft's internal struggles about how to implement their InPrivate Filter. Tech news sites picked up and re-reported the story - a few examples are
http://arstechnica.com/microsoft/news/2010/08/microsofts-internal-internet-explorer-privacy-battles.ars and http://wiki.twit.tv/wiki/Tech_News_Today_44 - and failed to add the important point that other browsers released the same or similar features first. According to Wikipedia, Safari's private browsing feature was released several years in advance of IE8, while Google's own Chrome browser had the feature at version 1.0 in late 2008. Microsoft didn't release the feature with IE8 until March 2009, though it had been in the prior beta releases.

Despite the WSJ's extensive documentation of Microsoft's internal wrangling, this story about Microsoft "quashing" their InPrivate Filter sounds trumped-up, and the criticism is hypocritical coming from the other browser companies. IE8's InPrivate Filtering is comparable to Google Chrome's Incognito mode (lovingly known as "porn mode"). Neither of them has the setting stick - you have to manually enable it with every launch, as with every browser on the market (Mozilla's Firefox followed suit with version 3.5 later in 2009).

MS enables essentially the same feature in the same way - late to the market, as usual - and suddenly, over a year later, the WSJ is crying foul because Microsoft also does online advertising on the side. Did everyone forget that Google is the world's biggest online advertising company? The only news here is how long after Apple and Google released private browsing it took MS to release the same feature in IE8. MS has certainly played catch-up with IE8, but they have no greater conflict of interest than their competitors. In fact, I'd say Google has a far greater conflict of interest with Chrome since they purchased DoubleClick, the world's largest online ad tracking service at the time. Google passed up the same opportunity to do something potentially good for the consumer by making private browsing the default.

Now that the timeline is far enough along that most of us have forgotten who came first, the WSJ releases an article critical solely of MS, mentioning Chrome and Firefox but not that each has similar features. I'm no Microsoft apologist, but I don't think all this FUD is warranted by their implementation of what is essentially a catch-up feature. I would have hoped that either the "real journalists" at the WSJ or the tech-focussed blogosphere would have picked up on the discrepancy before passing along the story with such a conspiratorial bent.

Monday, July 19, 2010

Apple Apologizes Reluctantly for iPhone 4 Antenna Problems, Blames Media, No One Explains Problem Plainly

Steve Jobs and the Apple spin machine are at it again with their latest press conference on the iPhone 4's antenna problems: http://events.apple.com.edgesuite.net/100716iab73asc/event/index.html. Steve's presentation makes it clear, both implicitly (with his body language and dramatic pauses) and explicitly (with what he says), that he thinks there's no real problem here. He expresses his view that the only reason all this hubbub occurred is that the media is blowing the problem out of proportion.

Apple's successful manipulation of the media has led to some of the best free advertising any tech company has ever enjoyed. Now, the combination of media uncertainty about how to properly explain a legitimate technical performance issue with the iPhone 4 and a lack of clear technical explanation from Apple has led to negative coverage. Boohoo. Welcome to the real world. The media giveth and the media taketh away. They especially take away when customers report a real problem and the company responsible tries to claim everything but responsibility for it.

You can't have it both ways, Apple. This performance issue can't be both the result of an Apple design choice to expose the antenna and its "weak spot" on the outside of the phone, and somehow not your fault. The extremely limited data Apple provides in its presentation show an increase in dropped calls from the iPhone 3GS to the iPhone 4. Given that the "revolutionary" antenna design was supposed to improve wireless performance, seeing poorer performance is clearly an issue. Again, you can't credibly present data showing an increase in dropped calls and then claim there is no technical issue.

As someone who knows just enough about radio signals and electronics to be dangerous, and who is willing to speculate based on the coverage I've seen so far, I think the first failure is that no one - not Apple, the popular media, the tech media, or any experts brought in to comment - has succinctly explained the problem and its likely causes. Anandtech did a great job of proving quantitatively, as best you can with something as complex as RF devices and without specialized test equipment, that there is a problem in their article: iPhone 4 Thoroughly Reviewed. Where they didn't do so well is in explaining it plainly, and that poor explanation was repeated in follow-up interviews such as CNet's Reporter's Roundtable and TWiT (This Week in Tech) episodes. I can give the journalists, podcasters, and bloggers some leeway because they're not all experienced engineers, and this poor performance - and its inconsistency among users - is likely due to a complex combination of technical issues.

As I've said before, it's easy to criticize and harder to do better. Here's my attempt at doing better.
Caveats: Modern radios and the antennas that work with them are extremely complex. I understand antenna design is hard and explaining it is even harder. It's not possible to explain something so complex plainly without making some gross generalizations.

Background: Every cell phone manufacturer takes into account as many of the complex factors determining connection performance as it can in its phone design. It's a fact of life in the mobile wireless industry that something nasty like a hand (nasty in terms of causing RF signal attenuation) will be covering your antenna and blocking its ideal signal path to the tower. I'll call this Problem A. Manufacturers have handled this problem by making ever more sensitive receivers, building better amplifiers, and moving antennas around inside their phones. Since modern handsets are about the size of a hand, the insides of the case provide limited options for improving performance by moving the antenna. The wireless standards bodies have addressed these realities by including all sorts of digital encoding and processing schemes in the communications specs, such that weaker and weaker signals still result in a successful connection.

The simple part: Apple thought they could break the mold and do something revolutionary by putting their antennas around the outside of the phone, thereby reducing the effect of Problem A. Instead, they added a second problem, Problem B, which occurs only when the phone is held a certain way. When a user physically touches the two antennas running around the outside and electrically bridges them, they drastically change the RF properties of the antenna-hand system and make the antenna less sensitive at the frequencies it uses to connect. Problem B reduces signal at the receiver, which compounds with Problem A to make the signal drop even more sharply. Problem B simply cannot occur when the antenna is insulated inside the handset's case.

To put some numbers on these problems, Anandtech measured the compounded drop (A and B together) to be 24 dB, or about 250x less signal than is otherwise received. They further measured the attenuation with the phone in a case, essentially isolating Problem A, to be 7.2 dB, or about 5x less signal. So, Problem B alone knocks the signal down roughly (250/5 =) 50 times! Whereas internal antenna designs only have to deal with Problem A, Apple's iPhone 4 antenna design has to fight the compounded result of both Problems A and B.
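If you want to check that arithmetic, the conversion from decibels to a linear power ratio is just ratio = 10^(dB/10). A couple of lines of Python (my own sanity check on Anandtech's published figures):

    # Convert the measured attenuation from dB to linear signal ratios.
    a_and_b = 10 ** (24.0 / 10)  # hand bridging the antennas: ~251x weaker
    a_only = 10 ** (7.2 / 10)    # phone in a case (Problem A only): ~5.2x
    print(round(a_and_b), round(a_only, 1), round(a_and_b / a_only))
    # -> 251 5.2 48: Problem B alone costs roughly 50x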

The complex part: So, why don't all customers see the same increased call-drop or poor connection quality behavior? Three reasons:
1) Not all users experience Problem B. They either don't hold the phone that way or they have a case which prevents it, so they only experience Problem A. Therefore, they don't see the extra ~50x drop in signal strength.

2) Digital encoding technology hides the issue in strong-signal areas. As previously mentioned, and as explained in Anandtech's article, a modern digital handset can maintain its connection at the same level of quality in the presence of an extremely low signal that is still above threshold as it can in the presence of a strong signal. A signal weakened 250 times can still be above threshold. With digital technology, as long as your signal is "good enough," the connection stays good and the user notices no ill effects. When the signal level drops ever so slightly below the system's threshold, however, the connection just drops. There's very little in-between where the connection sounds fuzzy or distorted like it did with older analog technologies. This effectively hides from users whether they're in good or bad signal areas for all situations above threshold. (A small sketch after this list puts numbers on that cliff.)

3) Wireless phone users have become accustomed to dropped calls and intermittent connections. There are enough other problems wireless handset manufacturers and carriers face, not described here, to take up the rest of the alphabet and then some. AT&T's coverage and service have been heavily criticized of late, and I think there is some merit in that criticism. Regardless of AT&T's specific issues, all carriers face these problems; all drop calls, all drop connections. If you hear that AT&T's service is poor in your area and buy an iPhone anyway, then dropped calls and intermittent data connections are something you're expecting. What are a few more of something that was never quantified before?
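To put a toy model behind reason #2's cliff (the threshold number here is arbitrary, chosen only for illustration; real receivers are vastly more complicated):

    # Digital "cliff": perceived quality is flat above the receiver's
    # threshold, and the call simply drops below it.
    THRESHOLD_DBM = -110  # arbitrary illustrative value

    def call_state(signal_dbm):
        return "connected, sounds fine" if signal_dbm >= THRESHOLD_DBM else "dropped"

    # The same ~24 dB hit is invisible from -80 dBm but fatal from -90 dBm.
    for before in (-80, -90):
        print(before, call_state(before), "->", before - 24, call_state(before - 24))

The same grip costs every user the same ~24 dB; whether the call survives depends entirely on which side of the threshold you land.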

Thursday, June 10, 2010

User-focussed mobile data plan model

If you look at some of my older posts, you'll see that I'm no fan of the data plan model currently in place in the mobile wireless industry. Flat fees for "unlimited" plans that are, in fact, quite limited, with zero accountability for the quality of service provided, and inordinately large and increasing termination fees that hamper consumer mobility just aren't good enough. The recent changes announced by AT&T, and rumored to be coming soon to Verizon, where users are charged for "buckets" of bytes, don't make sense to me, either. These plans create a false sense of scarcity for something that, once "enough" infrastructure is in place, costs the carrier nothing: the byte. In fact, the majority of a wireless carrier's costs are incurred to install and maintain a given capacity, and those costs remain nearly static no matter how much the network is used. The cost per byte actually drops the more bytes customers consume. This by-the-byte approach arrives at the same time as mobile ad platforms (which use data, too) are rolling out. How much are you willing to pay to see ads? Since no system is in place to inform users of how much data they're currently using until they exceed their arbitrary limit and find out from their increased bill, metered usage doesn't make sense yet, and it still fails to incentivize carriers to provide higher capacity and availability. This approach continues to put all the information, and therefore power, in the hands of the carrier, while all the responsibility for managing usage falls to the customer. There are many other criticisms that could and should be applied to the wireless industry (not the least of which concerns the spinning of the word "bandwidth," but that's a whole other rant).

It is easy to criticize, however, and hard to propose a more reasonable alternative. The fact is that carriers are saying they will provide feedback to users about how much data they have "used," along with warnings when they're about to exceed the plan amount. Also, bytes, like minutes on phone plans (as in, minutes used for talking), are fairly easy for users to understand and for carriers to monitor. The more data you send and receive, the more you pay. Carriers, for their part, need to track usage as part of normal operations to ensure their network is balanced anyway, so why not report those data to users? It's simple in concept. How else should usage be tracked, if not by the byte?

So, what if we came up with a way to meter the bill based on something that's focussed on user experience? The primary complaints customers have, and the primary challenges for a carrier, are capacity and availability. If there are great connections all over a city but poor connections outside it, what happens when you're driving or taking the train home for the evening? Your call gets dropped and your files start downloading slowly. So, why should you pay the same price while you're in a bad area as you do in a good one? Reversing the scenario, what load (and therefore cost to the carrier) do you put on the network when you're in an office building the signal doesn't penetrate? How about when your phone is powered off? None. The carrier allocates zero wireless capacity to you when you're off the grid. So, why should you be paying at all for the significant portion of the day when you're not able to use the service? I suggest billing based on two factors: 1) the carrier's quality of service to the user (i.e. speed and latency) and 2) the portion of time that service is available to the user (availability). The better the job the carrier does, the greater the proportion of the plan fee they can charge their customer at the end of the month.

As with any idea, the implementation details determine everything. I'm not advocating making the wireless business so volatile that carriers can hardly stay in business. Reasonable limits should be in place such that customers have a minimum fee to pay, even if they choose to turn their phone off all month. There are some costs to having the network available to you, even if you don't choose to use it. Likewise, reasonable limits should be in place on how much the carrier can charge if they achieve 99.99% coverage, quality, and reliability. That maximum amount should be known and agreed to when the plan is purchased. Carriers should publish their quality and reliability numbers, by area, every month with history reported for at least the length of the longest contract offered, so that customers can make informed choices when selecting their carrier.

This approach does determine your bill from several numbers, which adds complexity, but there are many ways to simplify it. Speed and latency thresholds could be set in the plan such that the sole variable determining the bill is the amount of time service was available at better than both of those thresholds. In contrast to AT&T's recent announcement of $15 for 2GB with $10/GB after that, perhaps you'd pay $15 as a minimum for the month plus $15 times the percentage of the month your phone had a connection of the agreed quality available to it (as determined by periodic connection tests from phone to internet firewall) - see the sketch below. I'm sure many methods of implementation would be thought up and tested in the crucible of the market. The point is that customers would pay more for a better, more consistent experience and less for a worse one. Right now, you get what you get and you pay the same or even more, regardless.
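To make that concrete, here's a sketch of the bill calculation using those hypothetical numbers (all values illustrative, not a real plan):

    # Availability-based billing: a fixed minimum plus a fee scaled by
    # how much of the month the carrier delivered a connection meeting
    # the agreed speed/latency thresholds.
    BASE_FEE = 15.00           # owed even if the phone stays off all month
    AVAILABILITY_FEE = 15.00   # the maximum extra the carrier can earn

    def monthly_bill(fraction_available):
        # fraction_available: share of the month service met the agreed
        # thresholds (0.0 to 1.0), per periodic phone-to-firewall tests.
        return BASE_FEE + AVAILABILITY_FEE * fraction_available

    for avail in (0.50, 0.90, 0.999):
        print(f"{avail:.1%} available -> ${monthly_bill(avail):.2f}")

A carrier that kept you connected 99.9% of the month would collect nearly the full $30; one that managed only half the month would collect $22.50.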

This approach has merits for the customer in that it re-balances power, putting more in the hands of customers, and it gives carriers an immediate and continuing incentive to improve the customer experience. It also gets the industry away from billing customers for a commodity that has no direct link to cost - bytes - and which customers have little to no control over (web pages don't tell you how big they are before you load them!). Instead, the industry moves toward billing for the primary cost drivers: capacity and availability. Carriers who provide a better experience by expanding and balancing their networks make more money. Customers pay for availability at an agreed quality level.

So what's wrong with this idea? First of all, carriers would never willingly agree to give up so much power. Gov't regulation would be required, and that's tough to get right. Secondly, it's complex. The only way to ensure lawmakers, who don't really understand the issues, get those laws right is to make things crystal clear, black and white, such that it's obvious they're doing the right thing and gathering the approval of constituents (who often don't really understand the issues, either) by voting for it. The industry has already convinced us that paying roughly $1,500/MB for text messaging (that's about what 20 cents per message works out to) makes enough sense that it's turned into a multi-billion dollar per year revenue stream. Their ability to cloud issues and shape regulation has been a big part of their success. This kind of change needs a champion inside Congress to carry it forward and a bevy of champions in the regulatory agencies to ensure it is properly enforced.
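In case that $1,500/MB figure sounds invented, the arithmetic is simple: an SMS payload is at most 140 bytes (160 seven-bit characters), so at 20 cents per message:

    # 20 cents per 140-byte message, scaled up to one megabyte
    per_mb = 0.20 / 140 * 1024 * 1024
    print(f"${per_mb:,.2f} per MB")  # -> $1,497.97 per MB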

Tuesday, June 8, 2010

Apple WWDC 2010 Keynote Video available as a Podcast, too

I went to watch the latest Stevenote tonight from Apple's streaming website - http://events.apple.com.edgesuite.net/1006ad9g4hjk/event/index.html - and found that, no matter what quality I clicked on, I got a low-quality stream (224 lines). Since I bothered to get an HDTV that works with my Mac Mini, and I'd read that part of the keynote demonstrated the new iPhone's higher-res display, I was pretty disappointed in how it looked. If you look for the "Apple Keynotes" podcast in iTunes, you can find the keynote as a free video download (at 360 lines), along with video from past Apple events going back to the unveiling of the original iPhone in 2007. While 360 lines is not exactly HD, and not as high a resolution as the "High" stream on their website, it certainly looks better on my big screen than the low-res stream. Hopefully, Apple will have their site fixed soon so it won't be an issue. It's still tough to see some of the differences Steve demonstrates between images and text on the old and new displays, since the new iPhone display has significantly higher resolution (960 lines in portrait mode) than the video it is shown in.

Friday, June 4, 2010

Friendship isn't binary

Another thought on what turns me off about Facebook: it treats friendship as if it's binary. You are either friends with someone or you are not. No matter how much time you spend messing with your privacy settings, even at its most locked down, anyone marked as a friend sees everything. What about family, acquaintances, work buddies, neighbors, or those true friends who help you move (bodies)? They're all lumped together into an on-or-off, up-or-down, in-or-out world. So, users must either constantly filter what they say for the current (and future) mix of ins or, as I'd guess everyone with more than 10 or so friends does, post whatever and forget about the consequences. Without any sense of groups of friends, you may as well make everything public. This approach simply doesn't fit my life.

Thursday, June 3, 2010

Mac Malware in the wild, finally?

After hearing much speculation about proof-of-concept malware, or malware you can only get through illegal warez download sites, here's the first instance I've heard of malware affecting Macs in the wild:
http://news.cnet.com/8301-27080_3-20006502-245.html?part=rss&tag=feed&subj=News-Security

It appears to have been caught very quickly at the sites where it was available for download, and its spread was limited. It also appears to be highly adaptable. This is nowhere near as virulent as the horror stories of infection on Windows PCs by simply opening an e-mail (without opening the attachment) or visiting a website (without knowingly downloading anything). With Apple touting "no viruses and spyware" as a feature for the past several years, however, Mac users may be less suspicious and less prepared to deal with this type of threat. In this case, a user has to download, from a trusted download site, and install a program advertised for a different and quite useful purpose. The software then downloads and installs the truly malicious code without the user's knowledge. The truly weak links here are the sites (Softpedia, MacUpdate, and VersionTracker) that allowed the malicious download to be made available. If you can't trust these sites, some of the biggest names in free software downloads, who can you trust?

Friday, May 14, 2010

Not impressed with the quality of this BlackBox Server Enclosure

Who knew putting a server-grade UPS in our lab rack at the office would turn into a shop project? Even though APC, who shipped rack rails with the UPS, and BlackBox, who makes the rack enclosure, both claim to use #10-32 screws, I was only able to make APC's screws fit in about 1 in 10 holes in BlackBox's rack. Other screws of this size from lab supply didn't fare any better. Luckily, we have a whole pile of BlackBox screws that came with the enclosure. Their own screws fit in about half the threaded holes in their own rack. It took a powered driver to get them all in, too, sometimes taking multiple tries: get part of the way in, back out and blow off all the metal shavings, then go a little farther.

Like I said, I'm not impressed, BlackBox.

I am Deactivating then Deleting my Facebook Account

I just posted the following message to my Facebook account. Since it will be going away soon, I figured I'd repeat it here.

Goodbye, Facebook. The evidence is overwhelming. Facebook has proven itself unworthy of my trust with even the little personal information I have chosen to share with it (basically a name, picture, and e-mail address). As soon as I figure out how to do it, I will be deleting my account. Seriously, they don't make it easy. There are instructions here: http://www.wikihow.com/Permanently-Delete-a-Facebook-Account

I'm tired of blocking/ignoring app requests because Facebook passes my contact details to whoever owns the app (they aren't identified), not just to my friends. Who wants more spam? I'm tired of digging through the mounting pile of privacy settings just to do what I came here to do: communicate with you - my friends and family - and only you.

Here's a timeline of Facebook terms of service changes relating to Privacy up to Dec 2009: http://www.eff.org/deeplinks/2010/04/facebook-timeline/
Here's a more visual representation of how the default settings have made more and more of our personal information public: http://www.huffingtonpost.com/2010/05/07/facebook-privacy-changes_n_568345.html

The more recent changes in April extend to following me around the web and sharing even more of what I do and say, and of what you say about me (which may reveal more personal info than I had intended), with third-party companies. Facebook's response to user questions and complaints in an interview with the New York Times - http://bits.blogs.nytimes.com/2010/05/11/facebook-executive-answers-reader-questions/ - unapologetically says "Everything is opt-in on Facebook. Participating in the service is a choice." That's very interesting spin. It means that, by having a Facebook account, I "opt in" by default to all the poorly explained personal information sharing that Facebook wants to do and may want to expand in the future. Details about their recent changes are all over the web, but they are also so confusing that untangling them is not worth the time for the limited functionality I get from Facebook. So, I'm taking their advice and opting out the only feasible way Facebook now offers.

If you would like to stay connected with me, you can find me at my personal blog - http://agcrazylegs.blogspot.com/ - where I know what the rules are and you probably already have my e-mail address, chat logons, twitter account, or some other way to reach me.

If not, take care. It was nice catching up. Perhaps the next Livejournal/Friendster/MySpace/Facebook iteration will be better. Perhaps open, federated standards for identity online will become both available and useful one day soon. Perhaps we'll meet again.

Friday, January 29, 2010

User Comfort with a Limited Device

A discussion going on over at Technologizer - started in the comments here http://technologizer.com/2010/01/28/microsoft-the-ipad-is-humorous/ and followed up in an article here http://technologizer.com/2010/01/29/the-ipad-isnt-just-for-us-its-for-aunt-bettys-too/ - points out an interesting possibility concerning user comfort with a device such as the iPad. As pointed out by commenter Bouke Timbermont, and later by the article's author David Worthington, perhaps the people Apple is after with the iPad are people who feel embarrassed when they get a multi-functional computer and don't feel knowledgeable about it. To a middle-aged techy like myself, that may be the equivalent of getting schooled by a 10-year-old in an online game, repeatedly, for the life of the device. Perhaps a restricted device like the iPad is more comfortable because what it can and can't do is much more straightforward. For example, there are no USB ports, whose purpose is so universal (hence the U) that people still don't know how many they want or need on their computers. Every app you add to an iPhone OS-based device, since it can't really interact with other apps, adds precisely the capability you paid for, or so Apple's review process would have us believe. While an engineer like myself tends to focus on the lost potential of capable hardware limited by a locked-down OS, perhaps a more common point of view is that it's a tool refined for a more understandable set of purposes.

Comfort considerations aside, I'm still not sure the set of purposes described so far for the iPad contains any "killer apps" that justify its size and price. It still looks to me like a coffee table device, as expressed in my last post. Perhaps more software capabilities, hardware add-ons, or content deals will be revealed in the ~2 months before launch that make the iPad more compelling. Perhaps I'm right and Apple really is gambling on the coffee table market. Perhaps I'm wrong and my engineer's perspective limits me in ways the market's imagination doesn't. I doubt it. We'll soon see.

Note: post updated at 11:05 EST to include link to second article at Technologizer.

Thursday, January 28, 2010

iPad - It's the coffee-table book of computers

Steve Jobs and others are trying to compare Apple's new "breakthrough" device - the iPad - to a netbook to show how much better it is. At its $500 minimum price, however, it's really in the same price class as fully capable laptops. Sure, laptops are a much more mature market, so you'd expect better value, but take a look at what you can get at street prices right now for $500 and under: Newegg laptops under $500. Of course, then you have to live with Windows 7, and how awful is that (where did I put that sarcmark, again)?

Steve was clever in announcing the price last, and possibly even more clever if Apple purposefully leaked price points near $1,000 to help manage expectations. If you thought going into his presentation that $1,000 was too much, which everyone did, $500 would sound like a breakthrough. It's excellent salesmanship, as we've all come to expect from Steve.

A few illustrative use cases:
1) You've tapped out a document in iPad iWork. Now what? iPhone OS doesn't have a user-accessible file system, so you can't save the document for use or sharing in another app, nor save it to a memory card or USB flash drive; you can't send it to a fileshare (AFP or SMB); and since iPhone OS doesn't support printers, you can't print it. These are all options on a netbook, with even more available on the $500 laptops above (e.g. burn to optical disc). All you can do is e-mail it to yourself or, presumably, sync it over to your $500 computer using iTunes (???) in order to share, burn, copy, or print it. How does that make sense? This use case assumes you've either adjusted to the no-tactile-feedback touch keyboard, which your hands do a great job of hiding, or invested another $69 in the iPad Keyboard Dock.
2) You've finger-painted and touch-tapped a presentation in iPad Keynote. Now what? In addition to all the limitations in use case #1, your only display option on this highly mobile device is plugging it into a fat analog VGA cable via a dock-connector adapter. If you have to stand in the back, tethered to the projector, you really lose out on the mobile capabilities of this device. If only there were a technology that could do this wirelessly... oh wait, didn't Intel release one just a couple weeks ago at CES? But of course, that wireless display only works on full-blown laptops with Core i-series processors, the kind you'll soon be finding in laptops, eventually in this price class.
3) A tablet is a great form factor for taking and sharing hand-written or hand-drawn notes and diagrams. Writing everything down on paper and having it stuck in your notebook is so last century. Currently, there's no app for that. There's not even the necessary handwriting recognition in the OS for turning such notes into anything but big, ugly images (unless you're an artist, in which case they're big, artsy images). Since iPhone OS can't currently multi-task, there's little to no hope of a third-party handwriting recognition app integrating with a separate productivity app like iWork. Best case, somebody who's good at handwriting recognition will release an app that can turn your scribbles into text and then... do what with them? See use case #1.

Basically, for all the use cases I can think of where a tablet would be innovative or useful in a work or school environment, I find that this device eschews the necessary features. It's really an entertainment device with a bit of a self-image problem. It's too big to fit into your pocket like an iPod/iPhone and so low-featured compared to comparably priced laptops that I don't think it's going traveling much. If you owned all three (iPod/iPhone, iPad, MacBook), like a good Apple customer, and you were about to leave the house, what would you bring with you? The iPod/iPhone, of course, because it fits in your pocket, so why not? If you thought you'd need to get something done while you're out (something you can't do on your iPhone), would you bring the iPad over the MacBook? Doubtful. Perhaps if you didn't have a MacBook? Without a computer to sync to, it's crippled. No, for most customers, this thing is never leaving the coffee table.

That's what it is: the coffee-table book of computing. You'll have it there for guests to ooh and aah over. You'll flip through a few photos to show them your trip to wherever. Since it doesn't sync wirelessly, there won't be many photos on it. Maybe you'll read a few pages of an ebook on it. Then you'll put it down and forget about it for a month while the battery slowly drains.