Firefox Extension Developer Tips

Just a couple of tips for Firefox extension developers, hard earned after many hours of head scratching. Not adhering to either tip will confuse Firefox, and your XPCOM component will fail to load.

XPCOM components get loaded before chrome is loaded.

[Update: The most common symptom of this is a Components.utils.import call failing during launch with an NS_ERROR_FAILURE exception. To fix it, wait until the app-startup notification is received before importing JavaScript modules.]

This means anything defined in chrome.manifest won’t be available until the “app-startup” event is observed. Note that the resource:// URI scheme introduced in Firefox 3 uses resource directives in chrome.manifest, which means you should defer Components.utils.import calls until “app-startup”.

XPCOM components implemented in JavaScript should be defined as a plain object, not a function.

So it should look something like this:

var MyServiceModule = {
  registerSelf: function (compMgr, fileSpec, location, type) {
    // component registration logic goes here
  }
  // ...other nsIModule methods (getClassObject, canUnload, etc.)
};
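To tie the two tips together, here is a minimal sketch of the deferred-import approach. The observer shape follows nsIObserver; the import line is commented out because Components exists only inside Firefox, and the module URL is made up for illustration:

```javascript
// Sketch: defer module imports until the "app-startup" notification
// is observed, since chrome.manifest entries (including resource://
// directives) are not registered before then.
var MyStartupObserver = {
  started: false,
  observe: function (subject, topic, data) {
    if (topic === "app-startup") {
      // chrome.manifest entries are registered by now, so this is safe:
      // Components.utils.import("resource://myext/mymodule.jsm");
      this.started = true;
    }
  }
};
```

Register this observer in the component's registration code and the import will no longer throw NS_ERROR_FAILURE during launch.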

Real-Time State of Mind

I need to get back to blogging more often. Having to type more than 140 characters feels weird. 😉

Given that I’ll be attending TechCrunch’s Real-Time Stream CrunchUp this Friday, I thought a blog post on a key real-time stream problem would help me into a real-time state of mind.

Real-time streams have many technical problems to overcome, many of which are thankfully being resolved by advances in technology and infrastructure, but the problem that interests me the most is the user experience problem:

Information, real-time or otherwise, is meaningless if users are drowned within it.

Typical Twitter users see only a fraction of tweets from people they follow. The notion of Top Friends (related to my social radar diagram from 8 years ago) will help, but at the cost of additional chores users have to do to separate the greens from the weeds.

The financial industry has used real-time streams for a long time, so there is a lot to learn there technically. But, when it comes to user experience, they haven’t cracked the nut either, forcing traders to use a bewildering number of charts and numbers on multiple displays and input devices to trade. So we emerging consumer real-time stream developers will have to break new ground ourselves.

Fixed Aptana RadRails GEM_LIB issue on m…

Fixed the Aptana RadRails GEM_LIB issue on Mac by linking ‘/Users/{user}/.gem/ruby/1.8/gems’ to ‘/usr/local/lib/ruby/gems/1.8/gems’. I can’t blame Aptana for this since it was me who chose to use a tool built by a company that spread itself too thin. I doubt they have more than a couple of engineers working on RadRails, which is not enough to provide the necessary quality across the range of environments Aptana is unfortunately being asked to support.

HTML5 Microdata Fantasy

I haven’t been tracking HTML5 design efforts lately, but what’s being proposed for microdata (see posts by Sam Ruby and Shelley Powers) yucked me sufficiently to revisit an old fantasy of mine about HTML (man, what a boring life I have). My fantasy was to add a general element/structure definition facility to HTML. It could easily be extended to support microdata as well.

The way I envisioned it being used is like this:

<street>123 ABC St.</street>

which sure is preferable to:

<div item>
  <span itemtype="street">123 ABC St.</span>
  <span itemtype="city">Foobar</span>
  <span itemtype="state">CA</span>
  <span itemtype="zip">94065</span>
</div>

As to how semantic structures and syntactic sugar can be defined, one very arbitrary way could be:

<def name="address" package=""
    params="{{street city state zip}}">
  <!-- structure for street, city, state, and zip goes here -->
</def>

I don’t have any illusions that this fantasy has even a tiny chance of coming true though. Besides, it’s like a beggar asking for caviar when any kind of microdata support will satiate our hunger.

Boss! Boss! The Plane. The Plane!


Here is a more elaborate version of the def element for the bored:

<def name="name" package=""
  attrs="$$first last$$">
  <span>$$first$$ $$middle$$ $$last$$</span>
</def>

which could be used like this:

<name first="Don" last="Park"/>

There are lots of holes in this sketch, which is why it’s a fantasy.
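For the similarly bored, here is a hypothetical sketch of how a browser (or a script shim) might expand such a template. The expandTemplate function and its handling of missing parameters are my invention, not part of any proposal; only the $$param$$ placeholder syntax comes from the sketch above:

```javascript
// Hypothetical sketch: expand a <def>-style template by substituting
// $$param$$ placeholders with attribute values. Parameters not
// supplied (like $$middle$$) become empty strings, and the leftover
// whitespace is collapsed.
function expandTemplate(template, attrs) {
  var filled = template.replace(/\$\$(\w+)\$\$/g, function (match, name) {
    return attrs.hasOwnProperty(name) ? attrs[name] : "";
  });
  return filled.replace(/\s+/g, " ").trim();
}
```

With this, expanding the span template above with { first: "Don", last: "Park" } yields "<span>Don Park</span>".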

Smiley Profile Image Set

I wish I could use a set of profile images instead of just one and have the appropriate one displayed based on text content, so that if I put a smiley like 🙂 or 😉 in the text, a photo of me smiling or winking will show.

It doesn’t have to be a face; it could be topic/category images. And I don’t see why tweet-specific images couldn’t be displayed, since Twitter already sends out an image URL with each tweet (inside ‘user’).
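The selection logic would be trivial. A sketch of the idea, where the emoticon strings and image-set field names are made up (Twitter of course offers no such feature today):

```javascript
// Hypothetical sketch: choose a profile image variant based on
// emoticons found in the tweet text, falling back to the normal image.
function pickProfileImage(text, images) {
  if (text.indexOf(";)") !== -1) return images.winking;
  if (text.indexOf(":)") !== -1) return images.smiling;
  return images.normal;
}
```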

Why wasn’t OAuth Vulnerability found earlier?

According to the OAuth about page, it was Blaine Cook who initiated the birth of the standard while working at Twitter in Nov. 2006. Blaine mobilized the initiative by getting Chris Messina involved, which attracted others at CitizenSpace to join the effort (an excellent demonstration of the benefits co-working social environments offer). By April 2007, the initiative began to formalize and, by October 2007, the OAuth Core 1.0 spec was finalized. The question of interest to me is: why did it take a year and a half to uncover the first vulnerability?

It’s puzzling because OAuth was well known and popularized, attracted a large body of developers, many of whom I presume read the spec, and was implemented by many, including some very large companies. I’ve read the spec as well and discussed it with peers and partners in the security and payment industries on several occasions.

I think the right answer might be that our collective perspective in dealing with the standard was focused on implementation, application, and hype while wrongly assuming that the standard was secure. Recollecting my thoughts when I was reading the spec for the first time, I now realize that it was the safety in numbers and the lure of promising applications that influenced me to focus only on implementation.

The good news is that I think OAuth will be given the proper shake it needs to get any remaining kinks out. The bad news is that we are likely to repeat the mistake when the next popular grassroots standard emerges in a hurry. The relatively fast pace of community/grassroots standard initiatives is a concern unless mass appeal can be effectively leveraged to shine an intensive searchlight on all aspects of the standard.

On Twitter’s OAuth Fix

While the OAuth team is working on addressing the OAuth session fixation vulnerability at the spec level, Twitter made the following changes to reduce the exposure window:

  • Shorter Request Token timeout – This is good practice in general. Developers tend to be too generous and, all too often, forget to enforce or verify enforcement.
  • Ignore oauth_callback, in favor of the URL set at registration time – this prevents hackers from intercepting the callback.
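The first change can be sketched like this; the five-minute timeout and the token shape are illustrative values, not Twitter’s actual implementation:

```javascript
// Sketch: reject request tokens older than a short timeout, rather
// than the generous windows developers tend to default to. Five
// minutes is an illustrative value, not Twitter's actual setting.
var REQUEST_TOKEN_TIMEOUT_MS = 5 * 60 * 1000;

function isRequestTokenFresh(token, nowMs) {
  return (nowMs - token.issuedAtMs) <= REQUEST_TOKEN_TIMEOUT_MS;
}
```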

Double-callback is still possible, though, which means Twitter OAuth Consumers will have to detect extraneous callbacks and invalidate access for everyone involved, because they have no way of telling who is who.

The remaining exposure to the vulnerability is when the hacker’s simulated callback arrives before the user’s. We are talking a temporal exposure of a couple of seconds at most which, given current Twitter use-cases, is not that big a deal. I wouldn’t do banking over Twitter though. 😉

On OAuth Vulnerability

Twitter’s OAuth problem turned out to be a general problem affecting other OAuth Service Providers as well as Consumers using the ‘3-legged’ OAuth use-case. For details, you should read not only the relevant advisory but also Eran Hammer-Lahav’s post Explaining the OAuth Session Fixation Attack.

The first hint of the vulnerability surfaced last November as a CSRF attack at the Clickset Social Blog, which was initially diagnosed as an implementation-level issue. Well, it turned out to be a design flaw requiring some changes to the protocol.

There are actually two flaws.

The first flaw is that parameters of the HTTP redirects used in OAuth can be tampered with or replayed.

This flaw allows hackers to capture, replay, and mediate conversations between the OAuth Consumer and Service Provider flowing over the surface of the User’s browser between the User, Consumer, and Service Provider.

I think the easiest general remedy for this flaw is to include a hash of the HTTP redirect parameters and some shared secret, like the consumer secret. A more general solution like evolving tokens could work as well but would be inappropriate as a quick remedy.

This flaw should not affect OAuth service providers that manage and monitor callback URLs rigorously.

The second and more serious flaw is that the User talking to the Consumer may *not* be the same User talking to the Service Provider.

This means that a hacker can start a session with the Consumer, then phish someone into authorizing at Twitter, to gain access as that someone without stealing a password or session-jacking.

Solving the first flaw simplifies the solution to the second flaw by reducing the possibility of the hacker intercepting the callback from Service Provider to Consumer, which is not supposed to carry any sensitive information, though some implementations might include it. Wire sniffing is a concern if HTTPS is not used, but the relevant concerns for this flaw are integrity and identity, not secrecy, which is an application factor.

Removing the possibility of callback URL tampering leaves the double callback, meaning that the hacker starts things off, tricks someone into authorizing without intercepting the callback, then simulates a callback to the Consumer. Note that the Consumer would have started an HTTP session with the hacker, a session associated with the RequestToken in the callback. Even if the HTTP session is not created until the callback is received, there is no way for the Consumer to tell who is who.

I think the Service Provider has to send back a verifiable token, like a hash of the RequestToken and the consumer secret, so the hacker can’t simulate the callback.

Regardless of which solutions the OAuth guys decide on, one thing is clear: it will take time, many weeks at least, if not months. That’s going to put quite a damper on developers on the Consumer side of OAuth as well as the Service Provider side.

Value of Journalism

Will newspapers survive? I think the physical form will survive for another 10 years at least, at a much lower valuation, then eventually break into niche market fragments. The profession of journalism will, however, not only continue on but become more respected than before.

This is why I think so. When we are short of something we consume, like water in the desert, we put value in availability. As we approach ubiquitous availability of the same, we shift value to quality.

In a sea filled with unverified and biased news and information, we will rediscover the value of journalism. We will see memes as what they really are, mental viruses, and know the danger of careless consumption. As we have become more health conscious, we will also become more mental health conscious.

We’ll see products of journalism like bottled water, avoid reading/eating things off the ground, and see eaters of biased or mutated news as inbred rednecks. Those who can afford to pay, that is.

As usual, I am exaggerating. Not quite hyperbole but enough force to kickstart pointless thinking.