Wednesday, March 04, 2009

The curious case of the Real-Time Web

eConsultancy has a really intriguing article disputing the need for a real-time web. I very much enjoyed it, and the author's stance is certainly valid and undoubtedly shared by many. It's led to several passionate responses around the blogosphere.

Ever the contrarian, I'd like to speak in defense of real-time online experiences (meaning those involving the Web, desktop applications, mobile devices, e-mail, RIAs, texting, instant messaging, Internet-aware appliances and so on), and shed some light on why the direction we all know content delivery is inevitably heading is so critical. I'm speaking specifically from the perspective of the broadcast news industry, as ours is a perfect case study for the worth of such systems; other lines of work might then follow suit with their own respective implementations.

Very necessary
I'm highly in support of ushering in push-based methods of delivering content to users at a near-real-time pulse. I'll admit my selfishness: working in a multiplatform/multimedia/multidevice business that's increasingly predicated on responsiveness over authoritativeness, credibility has taken a backseat to time-to-market. So if someone else out there has stuff to tell people - a direct rival news source, a citizen journalist with a devout following, or simply a (micro)blogger - they're taking away eyeballs that I could have. They're cutting into my attention space.

If I've got content to distribute, it's not enough anymore to update my site and expect people to browse to it manually, or to send out news alerts and wait for them to flock over whenever they next check their messaging accounts or their feedreaders refresh their subscriptions. I've got stuff, and I need to let you know as quickly as I can, lest you find out from someone else.

Such is the sense of urgency with which we in the news game have to operate, because it's better for competition, better for the user and better for the technological landscape overall.

Why the lag?
To date there's always been a tolerated disconnect in communicating online, so much so that latency has sadly become the accepted norm. This was largely due to a lack of supporting infrastructure over which to send and receive data - available bandwidth, affordable consumer technology, meager public awareness, and the Internet being a medium confined to desktop-only access - in addition to the request/response nature of HTTP, where clients must explicitly ask for every update.

Even the notification mechanisms of "evolving platforms" like SMS and e-mail are still subject to a lag that the unwashed masses tolerate. Think about the Answering Machine Analogy: how inconvenient and difficult would it be if every phone conversation you engaged in saw you and your distant-end party exchanging messages exclusively over voicemail? The staggered nature of such communications is ridiculous, and yet we're happy doing just that via e-mail.

Today we're getting close - very close - to mirroring the experience of actual human conversation. And this needs to be seen by society as (1) an option augmenting the static information-retrieval process we're used to, and (2) a good and positive thing.

The game's a-changin'
As a user, if I can be informed about a developing event, the source matters less to me than in years past. I don't put so much value anymore on whether whoever authored a report, or posted a thought in 140 characters or fewer, has won an Emmy or a Pulitzer Prize. It's the state of being informed that I'm after. If I want the full details of a story, I'll seek out the offering from an established leader.

CNN no longer holds a monopoly on controlling what people know about the world. And more importantly, when they know it.

To the inevitable challenge posed by user-generated content: there's always going to be a market for well-produced information products...assuming the mainstream media gets its act together and stays proactive enough to compete with grassroots applications while leveraging the quality composition and credibility it's known for. To date, that's largely not been the case. Corporate media powerhouses, newspapers in particular, have traditionally displayed a hubris toward emerging technologies that's let them fall several lengths behind the rest of the field.

! GEEK ALERT !
Asynchronously pushing content to subscribed users - an evolution of the traditional interval-based polling technique used across online properties - is THE pattern developers are salivating over these days. XMPP is a well-defined, established, flexible technology stack that avoids the inefficiency of repeated, unnecessary roundtrips, which are at best highly wasteful of network resources and at worst an emulated DDoS attack. Expect a lot of mainstream traction in this space.
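To make the contrast concrete, here's a toy sketch in Python - not real XMPP, just a model of the two delivery patterns (the class and function names are my own invention):

```python
# Interval polling: the client asks on a schedule, whether or not
# anything has changed, so most round trips come back empty.
class PolledServer:
    def __init__(self, updates_at):
        self.updates_at = dict(updates_at)  # poll round -> new item
        self.round = 0

    def check(self):
        item = self.updates_at.get(self.round)
        self.round += 1
        return item


def poll_for_updates(server, rounds):
    """Make `rounds` polls; return (updates seen, requests made)."""
    seen, requests = [], 0
    for _ in range(rounds):
        requests += 1                 # one full round trip per poll
        item = server.check()
        if item is not None:
            seen.append(item)
    return seen, requests


# Push: the server contacts subscribers only when something new exists,
# so traffic is proportional to actual updates, not to elapsed time.
class PushServer:
    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, item):
        for notify in self.subscribers:
            notify(item)
```

Ten polling rounds that catch a single story cost ten requests; the push model costs exactly one message for the same story. Multiply that by millions of clients and the DDoS comparison stops sounding like hyperbole.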

Granted, filtering posts through the Twitter Search API via a client-side track bot is a technical process that's not exactly easy for the layman to grasp, much less set up. So there's much work ahead in making these tools easy enough to deliver that experience. We faced the same challenge five years ago when podcasting first took off among the technically inclined, and a solution was needed to make single-click subscription to RSS feeds a reality and open the platform up to the greater worldwide community.
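For the curious, a minimal track bot might look something like this (a sketch only - the `search.twitter.com` endpoint and JSON shape are the 2009-era ones, and the helper names are mine):

```python
import json
import urllib.parse
import urllib.request

SEARCH_ENDPOINT = "http://search.twitter.com/search.json"  # 2009-era endpoint


def build_search_url(query, since_id=None):
    """Build a Search API URL; since_id limits results to new tweets only."""
    params = {"q": query}
    if since_id is not None:
        params["since_id"] = str(since_id)
    return SEARCH_ENDPOINT + "?" + urllib.parse.urlencode(params)


def track_once(query, since_id=None):
    """One pass of the bot: fetch, parse, return (tweets, newest id seen)."""
    with urllib.request.urlopen(build_search_url(query, since_id)) as resp:
        data = json.load(resp)
    return data.get("results", []), data.get("max_id", since_id)
```

A real bot would loop over `track_once` on a timer - which, of course, is exactly the polling pattern push is meant to replace. That's the part that's neither layman-friendly nor efficient.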

But the groundwork's been laid and the technology exists to make it possible.

I also see people retrofitting real-time delivery models onto existing platforms beyond the World Wide Web. The perfect candidate for such migration is e-mail, that tried-and-true killer app. Once you've drunk from the real-time well, waiting around for messages to be aggregated and downloaded is a nuisance, as you're now cognizant that you're not truly reading notes as they're being sent to you. I've spoken to a developer who's planning to embed the open source Jabber engine within a custom SMTP transport for delivery of electronic mail, sans delay.
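I don't know the details of his implementation, but the shape of the idea is simple enough to sketch (the `transport` interface below is hypothetical, standing in for a real Jabber connection):

```python
# Toy model: the SMTP layer calls the notifier the instant a message is
# accepted, so the recipient hears about mail without polling POP/IMAP.
class NewMailNotifier:
    def __init__(self, transport):
        self.transport = transport  # anything with send(recipient, body)

    def on_message_accepted(self, recipient, sender, subject):
        body = "New mail from {}: {}".format(sender, subject)
        self.transport.send(recipient, body)
```

The notification rides on the delivery event itself rather than on a client-side timer - the same push-over-poll shift, applied to the inbox.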

The demand for real-time mail clients, in addition to a call for XMPP support within browsers, is going to thrive.

How about the Mothership?
Another perspective to consider is that of Google, the bastion of all things Internet-related. Their main search index has been chastised for not being "real-time friendly", only listing stored documents in the historical Web. But consider what the company did years ago when launching Google News.

The pilot project crawled a distinct subset of networks, affiliates and news services so rapidly it was able to provide results mere minutes after its network of publishers updated their sites with new content. The trend we're now seeing is that the company is starting to leverage that innovation towards the Web at-large, timestamping new pages it crawls not too long after they've been added online.

Google could also conceivably incorporate real-time notifications into Gmail, building out an XMPP architecture supporting pushed alerts triggered by the presence of newly-arrived messages. And a new Firefox add-on is all the rage at the moment, displaying tweets related to Google search queries in quasi-real-time. So the demand for real-time information is certainly there.

Each of these concepts, applied at Google's scale, has huge implications.

And my point is...?
In short, the Web is becoming what it's never been before - a helluva lot faster. This is meaningful because it could be the first major form of media to achieve multi-format interactivity at a near-synchronized rate. That's a larger achievement than most realize or are willing to accept.

So here's the long and the short of it all: if real-time experiences aren't your cup of tea, that's fine. They'll function as a valuable extension of the current method of disseminating information. We in the development community have our work cut out for us in building tools that filter the data coming across the wire, so people get only what they want - and people not interested in such processes at all won't be bothered. The World Wide Web will continue to exist as society's largest repository of historical data, replete with tools to search, filter, discover and create content.

The Web's IP-based cousins will continue to support their own methods of managing the data you crave so desperately, with IM getting a particular share of the limelight. It's nabbing items as they come down the pike and doing neat things with them that's driving so much of the interest here. Moving towards delivering content in near-real-time is a natural evolution of the Web as we know it.

We all knew it was heading this way. So integrate it, love it, oppose it, ignore it, or fall head over heels for it - just don't deny it a place at the table.


**UPDATE**
Just hours after posting this article, Facebook announced its low-latency web update capabilities, and big revelations were made at DrupalCon regarding XMPP support. Now with major players behind them, more people will hopefully be exposed to the benefits of instantaneous data, and begin to get it.


Comments:
I like it and agree. The need for real-time information is critical and in high demand. The key to its growth is to get the technology and bandwidth as affordable and accessible as possible. But I believe that we are well on our way. I can't go to a meeting, function or party without seeing BlackBerrys popping into hands every couple of seconds all over the place. Keep up the good work Jason.
Senator Matt Rector
 
I disagree, for the most part. First, I don't think as a media organization your success is predicated on responsiveness over authoritativeness...there's no way you can beat citizen bloggers (or Twitterers) in terms of responsiveness. Maybe you can right now, on Guam, but eventually you will lose, because you aren't everywhere and citizens are. I think your advantage is authoritativeness...people can turn to you when they want the real truth, rather than what Joe Cruz said on Twitter. This isn't so much an issue on Guam right now because most people are behind in technology, but in the next decade that will change.

Second, I think for most people instant information is a bad idea. Some people (like journalists perhaps) need to be constantly up-to-date, but most people don't. For many people, having news updates constantly pushed on us is a bad thing -- it's a distraction, and will threaten to overwhelm us, to stop us from working on the important things. This is partly what Zen Habits and The Power of Less is about -- finding a way to limit technology so you can take advantage of its power without being overwhelmed by it.

Leo
 
There are other, more subtle benefits of the "real-time" Web - a reduction in wasteful Web communication. Today individuals, companies, and governments waste enormous amounts of data = hardware = energy sending empty request (and response) headers around the world. Some of these real-time Web technologies enable a "Power of Less", although not in the context of what Leo refers to. With a real-time Web we will be able to reduce the amount of information sent and received, thus saving bandwidth, and ultimately energy. Sure, if you are a small shop it may not impact your energy bill that much, but in the bigger picture it will make a difference.

The Web as it is designed today is like having children riding in the back seat of the car on the way to their grandparents: Are we there? Are we there? Are we there? - a thousand times. And the only answer is: No, not yet. No, not yet. You see the picture. The real-time Web would allow the driver to answer differently: I'll let you know when we get there. End of discussion. This may not make any difference to the user, but for whoever runs the networks it will.

As a technology geek there is also the possibility of creating new cool stuff for the Web.

Cheers, Jonas
 
Good thoughts, Jonas. I previously talked about the inefficiencies of repeatedly polling a web server with The Brainy Smurf Analogy. Ironically, though, there are some concerns at the architectural level about XMPP-based technologies such as BOSH and the additional traffic they impose on servers when run over HTTP, due to large payload sizes.
 
Jonas, you might want to look at this post on different methods of HTTP push. I think you'll enjoy it.
 
Jason, I completely agree with your Smurf analogy, and another side-effect is that you also send less data around e.g. less request and response headers.

I would recommend that you have a look at this site (http://www.kaazing.org), since this solution takes your real-time Web one step further. Why limit yourself to push when you can have full-duplex communication (or streaming) to ANY TCP-based backend service, without the requirement for plug-ins ;).

I love the speed with which the Web is moving, and the potential of new inventions it provides.

Btw, thanks for the link to Zac's post.

Cheers,
Jonas

Disclaimer: I work at Kaazing :)
 
