Monthly Archive for August, 2008

Little known facts about Sarah Palin

A twittermeme popped up last night featuring little-known facts about Sarah Palin. Here’s an example:

Little Known Fact: It’s not raining in DC. Those are God’s tears of joy that McCain picked Sarah Palin. (@MichaelTurk)

Started, in fact, by the above Michael Turk, who was a campaign director for the 2004 Bush campaign, the template was copied and spread across Twitter to amusing effect. The results were interesting, varied, and popular enough for Sunil Garg and me to throw together a quick single-serving site: Little Known Facts about Sarah Palin.

All it does is poll Twitter every so often for tweets that match the pattern; when you visit the site, you get a random specimen to read and hopefully enjoy. There are already 3,200 items in the database.
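For the curious, the mechanics behind the site are close to trivial. Here’s a minimal sketch of the general idea in Python (not the site’s actual code; the search endpoint and JSON fields shown are assumptions based on the Twitter search API of the time):

    # Hypothetical sketch of a "poll Twitter, serve a random tweet" site.
    # The search URL and response shape are assumptions based on the
    # 2008-era Twitter search API; the real site's code may differ entirely.
    import json
    import sqlite3
    import urllib.parse
    import urllib.request

    db = sqlite3.connect("facts.db")
    db.execute("CREATE TABLE IF NOT EXISTS facts (id INTEGER PRIMARY KEY, text TEXT UNIQUE)")

    def poll_twitter():
        """Fetch recent tweets matching the meme's pattern and store any new ones."""
        query = urllib.parse.quote('"Little Known Fact" Palin')
        url = "http://search.twitter.com/search.json?q=" + query
        with urllib.request.urlopen(url) as resp:
            tweets = json.load(resp).get("results", [])
        for tweet in tweets:
            db.execute("INSERT OR IGNORE INTO facts (text) VALUES (?)", (tweet["text"],))
        db.commit()

    def random_fact():
        """What a page view does: pull one random specimen out of the database."""
        row = db.execute("SELECT text FROM facts ORDER BY RANDOM() LIMIT 1").fetchone()
        return row[0] if row else "No facts yet."

Run poll_twitter() on a timer every few minutes and wire random_fact() up to the front page, and that’s essentially the whole site.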

More on this as the site spreads, hopefully.

NBC Olympics web strategy: a loser?

I’m a big fan of Hulu. I think it represents a huge landmark in forward thinking by the major television networks, and that they should be applauded for their work with that particular platform thus far. The video quality is more than reasonable (better than I get off my standard-definition Comcast pipe, even!), everything is on demand, and it’s almost impossibly light on ads. I can scrub through video and replay any particularly hilarious bits I choose, without needing to buy an expensive TiVo or spend the time to set up Media Center or MythTV.

Actually, after several hours of trying I still can’t get MythTV to work, so maybe it’s just me.

But no one gets it perfect right off the bat. Hulu had its issues at first, and Hulu has its issues now; slowly, one by one, they are being addressed. You can pause during commercials now, and turn the volume down a bit, since they’re always compressed beyond belief. But while the library continues to grow, NBC and Fox have made the incomprehensible decision to limit the number of episodes available online, even before DVD sets for those shows are available. It’s the web; why limit content and revenue?

So, when I heard about NBC’s Olympics initiative, I was fairly excited. A wide selection of streaming video, hours of replays and clips, and other on-demand content were the plan, and given how well they pulled off Hulu, I was looking forward to seeing it in action. Of course, I was in Amsterdam the whole time, so I never got to actually see it, instead relying on what other people and blogs said about it.

TechCrunch reported a couple of days ago that the NBC Olympics web initiative was an abysmal failure. A “loser”.

Of course, this was largely stated from a fiscal perspective, but it was stated nonetheless. While I didn’t get to see NBC Olympics in action, I did get to see the Dutch equivalent, which was incidentally also Silverlight-only. It featured ten streaming channels, one of which mirrored Nederland 1; the rest were simply raw feeds from various sporting events. There was a schedule for each channel, and every sport got very good representation in the lineup. I could pick whatever I wanted to watch, tune in to the relevant channel, and even channel-surf between events when things got slow between heats.

When I went to a friend’s house in Den Haag, we decided to tune into the Olympics there as well, on a real television set! And I felt frustratingly limited. We got to watch one of three channels, which meant we got to watch one of three events. And all of it without the blissful lack of commentary that could be found on the streaming channels (incidentally, one of the few technological complaints TechCrunch leveled at NBCOlympics was its lack of commentary; I happen to think this was a feature).

So the Dutch equivalent was marvelous. And it was a live-only solution at that; there were hardly any event clips put up after the fact. And from what I’ve seen since getting back, NBC’s version was at least as good, if not orders of magnitude better.

So how was the project a loser?

Even by TechCrunch’s own books, the whole thing was still in the black. NBC tested out an entirely new way to broadcast, one that represents how everything should be done in the future, and users got unprecedented choice in what they watched. Yes, there were mistakes and misjudgments in planning and execution, but nothing is done perfectly the first time, as Hulu well shows.

I would never call such a forward-thinking attempt to innovate a loser.

DRM: The games industry *gets* it

Surprisingly, I somehow haven’t discussed DRM on this blog at all just yet, despite feeling rather strongly on the subject. I suppose this is largely because there isn’t much to be said about DRM that hasn’t already been said by others, and in a far more thoughtful fashion than I possibly could. However, there is one particular belief I hold that seems to be relatively rare, and which I think was validated recently.

As the title of this post suggests, I am of the opinion that the games industry gets it.

Of the various stolen goods you’ll often find up for grabs on shady websites, the most plentiful items are always music and movies, but close behind you’ll always find professional software and PC games. Protecting content that is ultimately supposed to end up on a computer is inherently difficult. SecuROM and SafeDisc are fairly well-known quantities to hackers at this point; they won’t blink twice while breaking the CD/DVD protection on these games. StarForce baffled people for a good long while, but eventually the black hats broke it, and once word got out that it had the unpleasant side effect of bricking the optical drives of customers, legitimate and illicit alike, developers finally began to shy away from it.

The point is that piracy’s a pretty big problem in the games industry.

The key, though, is how you deal with piracy. Crytek, the makers of the fairly extravagant Crysis, recently announced that they would no longer develop PC-exclusive games, as they weren’t making any money there due to piracy. Incidentally, this move should, as a side effect, solve their real problem, which was that no one had a computer that could run Crysis.

Valve, on the other hand, created Steam, which deals with the problem in an entirely different way: digital distribution. This is a forward-thinking approach not only technologically, but also socially. By creating a consistent platform for PC games, one that encompasses every type of game and allows for a pervasive community, Valve has built an entire economic ecosystem for itself, along with loyal fans. I, for one, now refuse to buy any PC game that isn’t on Steam, out of principle. Bionic Commando Rearmed and Sins of a Solar Empire, I’m looking at you.

But how does Steam address the piracy issue? First, its DRM approach is incredibly sensible. Once you buy a game, you own it. You can log into any computer on Earth with an Internet connection and kick off a download of any game you own. If you don’t intend to play multiplayer online, you can even run your copy of any game on as many computers as you want at once. Second, it makes buying games legally even easier than it ever was to pirate them. Click on the game you want, type in a couple of digits, and you’re done. No need to run to the store, no need to fuss about with physical media. It’s just that easy.

Traditionally, this has been my argument for why the games industry gets it. But former Xbox head, current EA Sports president, and general practitioner of awesome Peter Moore recently said some excellent things on the subject, which made me rather happy to hear.

I’m not a huge fan of trying to punish your consumer. Albeit these people have clearly stolen intellectual property, I think there are better ways of resolving this within our power as developers and publishers. Yes, we’ve got to find solutions. We absolutely should crack down on piracy. People put a lot of blood, sweat and tears into their content and deserve to get paid for it. It’s absolutely wrong, it is stealing. But at the same time I think there are better solutions than chasing people for money. I’m not sure what they are, other than to build game experiences that make it more difficult for there to be any value in pirating games. (eurogamer.net)

Exactly. Piracy is wrong, and piracy is a problem. But the industry needs to find compelling, reasonable ways to deal with the issue at its root cause, not sue its own customers to oblivion. Done, and done.

Now, Mr. Moore, get your company to publish its games on Steam and we’ll call it good.

Web, meet Ubiquity.


Today, Mozilla Labs announced yet another new product to add to its long list of experiments and prototypes. First things first — let us pray that this experiment will fare better than its predecessors (Weave and The Coop, I’m looking at you…).

With that out of the way, let us examine precisely what it is that we so fervently wish to preserve.

To hear Aza Raskin of Mozilla explain it, you would come away with the impression that Ubiquity is essentially a dream, and that dream is to make natural language processing a reality in the service of bringing web mashups to the masses. The following example of the ultimate goal sums the project up fairly well:

Book a flight to Boston next Monday to Thursday, no red-eyes, the cheapest. Then email my Boston friends the itinerary, and add it to my calendar.

To which the system responds:

Leaving from SF to Chicago on March 20th at 9am. Returning on March 24th at 7pm. Itinerary will be sent to Andrew, Margaret, and Josh.

Elegant, efficient, and if done right, revolutionary. Essentially, Mozilla Labs wants to make that old Apple Newton web ad a reality.

However, the current prototype does not reflect this goal. Instead, it exists today as a launcher: a necessary menagerie of smaller plugins that connect to various web applications and services, whose composite whole may yet someday form the natural language beast that Aza and his team have envisioned. This shouldn’t faze anyone, however; in fact, I’m here to argue that the prototype is brilliant as is. First, some background.

Inarguably the most powerful utility on Mac OS X is Quicksilver. To most people, Quicksilver is simply a faster alternative to Spotlight for application launching purposes: if you need to load Word, just hit Ctrl+Spacebar to pop up Quicksilver, type “Word”, and hit enter. Much faster than going to the dock, and definitely better than the half second lag that Spotlight suffers from for the identical operation. However, Quicksilver is much more powerful than that, and represents in fact an entire philosophy, which creator Nicholas Jitkoff once detailed in a Google Tech Talk.

In the standard operating system shell paradigm, the goal of the OS browsing interface is to get you to the application. From there, you’re on your own. Thus, browsing the filesystem is the key, and the model on top of which most operating system shell interfaces are built: Windows Explorer, Nautilus, Finder, and the like are all designed to let you browse through your hierarchy of files and eventually select a file or application to execute.

The key philosophy behind Quicksilver is that this barrier is an artificial construction, one that need not exist. Sure, Quicksilver will let you browse the filesystem and launch applications faster than anything else on the market, but the real beauty behind Quicksilver is how it lets you step past the filesystem. There is no need, for instance, to stop once you reach “iTunes.app”: you can, within Quicksilver, navigate straight into iTunes and browse your Library as if iTunes were merely a folder and you were still browsing the filesystem. This far-reaching mentality is what makes Quicksilver truly powerful and flexible, and is where most power users spend their time with the utility.

The filesystem-application barrier is artificial and need not exist.

Now, let us at last take a look at the current incarnation of Ubiquity. As demonstrated in the screencast, the plugin is essentially a launch bar from which contextual actions can be invoked. You can, for instance, highlight an address, call up Ubiquity, and tell it to “map,” which will not only load Google Maps but also let you drop the resulting map into an email you’re writing. Similar functionality exists for finding things on Yelp and other web services. It also lets you act on pages directly: highlight foreign-language text and ask Ubiquity to translate it in-line, TinyURL a link, or tweet about things you see around the web.

Thus, I would argue that Ubiquity is currently Quicksilver for the web. And perhaps it wouldn’t be so bad to keep it that way. Essentially, Ubiquity allows Firefox to become more than a web browser, in a nonobtrusive way: it becomes an active component of the web. You can execute actions on any webpage through the browser to any supported web service.

Essentially then, the core philosophy [at the moment] is that the barrier between the browser and web applications and services is artificial and need not exist.

Ubiquity is tons of fun to play around with, and will probably become a core part of my Firefox experience before long. But does it need to be anything more? Quicksilver for the web is already an ambitious goal, and while natural language processing would be nice, this set of features and this paradigm are here now. And I think the web is ready for them.

Amsterdam: “Oh my God, everything is Helvetica!”

On 27 July 2008, I left Seattle for Amsterdam and a month-long study abroad program hosted by the University of Washington Honors Program, the International School for Humanities and Social Sciences at the Universiteit van Amsterdam, and the Virtual Knowledge Studio. I now sit at Amsterdam-Schiphol airport, typing a small series of articles detailing some of the more interesting points of the trip. I will refrain from speaking about the program itself, however; on that subject, suffice it to say that it was by turns exciting, interesting, frustrating, tiring, and confusing. With that said: Amsterdam!

Upon touching down at Amsterdam-Schiphol nearly a month ago, my immediate thought was “oh my God, everything is Helvetica!” Schiphol is a very impressive airport, even if it lacks the huge glass façades of Sea-Tac or the immense scale of O’Hare; it’s quite simply very modern, with a reasonable layout and cozy lounge areas. And everything is in Helvetica.

Not just Helvetica the typeface, either; the airport’s overall design and aesthetic strongly reflect the Helvetica mentality: bold, vibrant, and modern, but not forceful. Cheerful yellow signs point you around the rather inviting lounge areas, which were substantial even in the international terminal alone. And that wasn’t the only thing that was cheerful: the customs official let me through within ten seconds. After buying a ticket, I wandered downstairs for a roughly six-minute wait for a train to Amsterdam Centraal Station. The train was similarly nice; the sneltrains are almost all fairly new, and run smoothly and almost completely silently.

And along the way, that same Helvetica impression held. Building after building was modern, with shameless “look at me!” type architecture-for-architecture’s-sake. Cubes on top of cubes at ridiculous angles, glass panelling, and a curious combination of unique buildings juxtaposed with lines of identical condominium towers proceeded to interest, and almost even impress me. Sadly, I’m not a terribly huge fan of architecture that doesn’t have a point, and so “almost even” was about as close as it got.

As an aside, that train ride was also the first point at which I became very annoyed at tourists –– and my own home country. I had the distinct pleasure of sitting in front of a woman on the train, who absolutely could not cease babbling about how incredibly terrible and disgusting that honestly nice and clean train was. Her husband sat across the aisle –– I gave him what I hoped was a sympathetic glance.

Ah, but at last I arrived at glorious Amsterdam Centraal… and proceeded to walk out the back exit by accident.

Let me tell you about this back exit.

It’s bad. There used to be doors. Now, there very clearly aren’t, and the exit opens onto a wide concrete path flanked on either side by chain-link fence struggling to hold in abandoned construction, and topped by a crumbling overpass. I can see why they’re redoing Oosterdokseiland. I ventured out into the semi-putrid air, past the homeless people staggering about, for half a minute or so before determining that something was amiss and wandering back into the station, through the doors that were striving so hard to be.

Somewhere along the line, Helvetica wandered off and committed a sad, silent suicide.

But not to fear! After wandering back through the train station, I found the main exit. Happily, excitedly, I stepped out the sliding glass doors and into fresh ai––

––into a huge whiff of marijuana smoke?

“That bad?”

I should qualify my use of the word “bad.” I have absolutely no problems with pot: stoned people generally don’t get into cars and kill people (and themselves), and are also usually quite a bit quieter. But the first bit of proper air I breathe in Amsterdam and it’s a huge whiff of it? That’s a bit unexpected, for sure. I’m now fairly convinced that someone just stands in front of Centraal and smokes weed just to catch people like me off guard… this was pretty much the only such occurrence.

I then walked the mile and a bit to our dorms, failed to locate Albert Heijn to buy food (which I hadn’t consumed in about 14 hours), and collapsed. Go international travel.

More to follow…

The proper way to deal with bugs

Electronic Arts has taken a lot of flak in the past half decade or so for being the Huge Corporate Conglomerate of the gaming market: buying out countless licenses, releasing a torrential flood of games with questionable quality assurance standards, and just not caring in general.

Well, everyone else can eat their words today –– EA wins.

This is the only proper way to deal with bugs.

Collaborative Work for the Future: A Followup

I had an interesting conversation with teacher and friend Clifford Tatum on the subject of my previous post, Collaborative Work for the Future, largely as a direct result of having just written and published it. While a large part of the conversation revolved around the difficulty of uniting communities, technologies, and needs (among other things), we raised many more questions than we answered, so I would like to start by pointing out a few examples I now realize fall under the wing of non-software-development collaborative platforms.

First is Microsoft’s SharePoint. While it certainly provides a collaborative platform with revision and user tracking, plus the added benefit of a useful, rich, and familiar working environment (Microsoft Office), it has more than its share of significant shortcomings. One is the sheer mass of technology involved: dedicated servers are needed to power the platform, with enterprise-grade database (MS-SQL) and web (IIS+ASP.NET) services. The amount of setup work is remarkably prohibitive, upgrading the software components is a nightmare, and on top of that the entire platform is built to function mostly in a trusted intranet environment, not for worldwide collaboration. In addition, the whole package, which requires not only the SharePoint software but also the aforementioned Windows Server, MS-SQL, and ASP.NET licenses, tallies up to a rather frightening price tag, on top of the maintenance and server upkeep costs. Clearly, this solution is aimed at medium to large businesses, not the average user or researcher.

A similar product to SharePoint is Alfresco. I haven’t personally used it, but it’s built entirely on an open-source software stack, and is free to use. It has yet to make any major waves on the market, and since they don’t appear to offer fully-hosted services based on their software, installation is again a key factor. However, it might be interesting to keep an eye on them in the future.

Another example that came to mind after the fact was Google Documents. What began life as Writely and Google Spreadsheets has slowly evolved into a usable, albeit limited, office suite. And, due to its origins, collaboration was built into the platform from minute one. It’s free, fully hosted, and ready to use the moment you have a Google account, offering comprehensive live editing, sharing, security, and revision support. It’s exceptional at what it does. What it does, however, is the issue: once again, even though Google Docs wants to be a fully fledged office suite someday, it simply isn’t there yet. All the features it supports are on a me-too level, and JavaScript in browsers is simply too slow and glitchy to be relied upon just yet. In the end, the platform still ends up being a web-medium lock-in, much like the wiki solution is. It will be interesting to see, however, how the product evolves in the future.

But what do researchers need? What do non-profit organizations need? Do comprehensive project management features need to be built into the document collaboration platform? What is the key ingredient missing at the moment? This difficulty in uniting communities with technologies and addressing their needs head-on has traditionally (one would assume) been a barrier to the advancement of these technologies, and needs to be addressed.

Perhaps now that I have a small handful of research projects under my belt, finding out what researchers and small organizations want is my next step.

Collaborative Work for the Future

Communication technology has long been the most-touted invention of the modern era: first telecommunications, then the Internet, spawned a society in which people are not only able to communicate instantaneously, but able to do so with complete ease and near ubiquity. Services like Facebook, Twitter, and the various instant messaging protocols connect us to each other at nearly every breathing minute.

A byproduct of communication, and the one I’d like to focus on today, is collaboration. While communication and communication technologies provide the inroads to facilitate collaboration, the ability to transmit data of any form to one another instantaneously is not enough to genuinely collaborate. As network technologies, then web technologies, then rich media technologies have grown, however, we have seen increasingly frequent attempts to provide a complete system for collaboration. Videoconferencing packages, for instance, provide unique features such as shared whiteboards or screens, allowing work to happen across the globe in ways never before imaginable. However, this is still a fundamentally communication-oriented development, which, while immensely beneficial to collaborative efforts, doesn’t necessarily address the ultimate goal of building a single product, paper, or project.

So, how do we better use technology to facilitate direct collaboration?

There are several pieces of software that attempt to address this issue head-on, but being written by developers, they largely address developers’ own needs; the rest of the world hasn’t necessarily woken up to technology’s potential in this regard, and so very little attention and effort have been directed towards pushing these projects in other directions.

These pieces of software are known as VCSs, or version control systems. Several prominent examples are CVS, SVN, Git, and TFS. Three-letter names aside, they all tout a number of core features: the ability to keep track of revisions and who made them, the ability to view or roll back to any of those revisions, and the ability to merge two versions of a file if, say, they were both being worked on at once. While very efficient, useful, and relatively simple for people working on software, these systems are on the difficult side for even moderately technologically proficient users, and setting them up is a nearly insurmountable task, one even seasoned experts tend to dread.
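To make those three features concrete, here’s a toy sketch in Python. It is purely illustrative and not how any real VCS (CVS, SVN, Git, or TFS) works internally; real systems store deltas and do far smarter three-way merges. It simply shows tracked revisions with authors, rollback, and a deliberately naive merge:

    # Toy model of a version-controlled file: commit, checkout/rollback, merge.
    from dataclasses import dataclass, field

    @dataclass
    class Revision:
        author: str
        content: str

    @dataclass
    class ToyRepo:
        history: list = field(default_factory=list)

        def commit(self, author: str, content: str) -> int:
            """Record a new revision and return its revision number."""
            self.history.append(Revision(author, content))
            return len(self.history) - 1

        def checkout(self, rev: int) -> str:
            """View (or roll back to) any past revision."""
            return self.history[rev].content

        def merge(self, rev_a: int, rev_b: int, base: int) -> str:
            """Naive line-by-line merge: keep whichever side changed each line."""
            merged = []
            for old, a, b in zip(self.checkout(base).splitlines(),
                                 self.checkout(rev_a).splitlines(),
                                 self.checkout(rev_b).splitlines()):
                merged.append(a if a != old else b)
            return "\n".join(merged)

    repo = ToyRepo()
    base = repo.commit("alice", "line one\nline two")
    a = repo.commit("alice", "line ONE\nline two")   # Alice edits the first line
    b = repo.commit("bob", "line one\nline TWO")     # Bob edits the second line
    print(repo.merge(a, b, base))                    # prints: line ONE / line TWO

The point of a real VCS is that this bookkeeping happens automatically for every file in a project, which is exactly the convenience non-developers are currently missing out on.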

So, what’s out there that’s easier for the general public to use? The solution my Amsterdam study abroad class appears to have chosen is to repurpose a wiki for the task. At first glance, it appears to be a fitting choice: wikis generally feature user and revision tracking, and at least a rudimentary form of diff merging. However, they are also a very restrictive medium. One couldn’t build a trifold brochure or a technical manual on them with any sort of practicality: while it may be possible to coax the wiki into looking right for these purposes, such things tend to be done in real desktop software, with real formatting tools and rich output. Adobe has a solution for its Creative Suite that’s slowly evolving, but what of the rest of the business and academic market?