The Standards Wars and the Sausage Factory


Standards-making is like sausage-making: you need the product, but the process is ugly. It is the necessary evil behind every technology we rely on.

On August 6, 1890, William Kemmler became a victim of an early technology standards war.

In the 1880s, the standards war of the day was between Thomas Edison, the primary supporter of Direct Current (DC) for electrical transmission, and his arch-rival George Westinghouse, who supported Alternating Current (AC). In a last-ditch effort to show why DC should become the standard, Edison killed animals with AC-powered devices. He then persuaded the State of New York that an AC-powered device, the electric chair, would be a more humane way to execute condemned prisoners. Kemmler was the first.

It was all in vain. Alternating Current was the more efficient technology, and today our homes and offices are powered by it.

“The wonderful thing about standards is that there are so many of them to choose from.” –Admiral Grace Hopper.

Today no one dies from standards wars, not that you’d know it from Internet comments. But years, millions of dollars, and endless arguments are spent on standards. The reasons for our fights aren’t any different from those that drove Edison and Westinghouse: It’s all about who benefits – and profits – from a standard.

I know, I know. Some of you are convinced that standards are determined by which technology is “best.” You are ready to trot out famed battles such as VHS vs. Betamax video tape, or WiMAX vs. Long Term Evolution (LTE) for 4G. You will bring up more current wars, such as Google’s SPDY vs. Microsoft’s HTTP Speed+Mobility (over how to speed up HTTP data transmission) and the hot-blooded fist fights over what will replace the X Window System in the Unix and Linux graphics stacks: Red Hat and friends’ Wayland or Canonical/Ubuntu’s Mir.

Sometimes the standard is driven by technical excellence. Usually, it’s not.



First, as the famous xkcd cartoon indicates, there is no ideal “best” standard that covers everyone’s use case. To borrow Eric S. Raymond’s open-source truism, “Every good work of software starts by scratching a developer’s personal itch.” Standards are the same. What scratches your developers’ itch does not (necessarily) scratch other developers’ itches.

So, since everyone wants to have things their own way, we – as businesspeople and as technologists – attempt to find compromises through organizations such as the IEEE, IEC, ISO, and IETF. In theory, these Standards Development Organizations (SDOs), according to the IEEE Standards Association, “offer time-tested platforms, rules, governance, methodologies, and even facilitation services that objectively address the standards development life-cycle, and help facilitate the development, distribution, and maintenance of standards.”


Excuse me. I had to pick myself up off the floor, I was laughing so hard. You see, as a journalist I’ve covered how standards are actually made. It’s a lot like sausage-making: an ugly, painful process that you hope produces a product everyone finds tasty.

Take, for example, the long, hard road of the now-universal IEEE 802.11n Wi-Fi standard. There was nothing new about the multiple-in, multiple-out (MIMO) and channel-bonding techniques when companies started moving from 802.11g to 802.11n in 2003. Yet it wasn’t until 2009 that the standard became official.

What took so long? At the start, four major groups fought to decide 802.11n’s fate. Two groups’ proposals – one from Mitsubishi and Motorola, and another from Qualcomm – quickly lost support. The other Wi-Fi networking companies quickly united into two competing groups: Task Group ‘n’ Synchronization (TGn Sync), with Intel, Atheros, and Nortel; and World-Wide Spectrum Efficiency (WWiSE), led by Airgo Networks. Airgo also had the advantage of being first to deliver MIMO-capable chipsets.

This kind of consolidation between rival companies or groups in a standards war is common. Few technology companies can afford to set their own technology standards and expect to survive in the marketplace. (It can happen. For decades, Microsoft could set desktop standards by dint of “We set the standard since we own this market segment.”) Apple manages to get its own way of doing things – from Advanced Audio Coding (AAC) for music formats to Apple Thunderbolt for high-speed I/O – and get away with it because within the walled garden of the Apple development ecosystem there are no competitors. Companies such as these are the exception to the rule, however.

However, the first-mover advantage often isn’t that important in the long run. AC came after DC, VHS came after Betamax, and RCA’s color-TV technology came long after CBS’s now-forgotten color-TV tech. That proved to be the case with Airgo’s Wi-Fi experience as well.

For two years, TGn Sync and WWiSE fought it out in standards committee meetings, with neither gaining the required 75% supermajority. In late 2005, it looked as though the two sides had finally come to an agreement under the Enhanced Wireless Consortium. But while its allies might have been ready to throw in the towel, Airgo wasn’t.

Airgo fought on with such tactics as adding more than 12,000 comments (count them, twelve thousand) into the “final” 2005 Wi-Fi standard draft. As Bill McFarland, Atheros’ CTO and one of the draft’s editors and writers, said at the time, “There were a lot of duplicate comments, and three people filed comments for each and every blank line in the document. The physical process of dealing with so many comments is tedious and time-consuming.”

What finally brought this stage of the fight to an end was Qualcomm buying Airgo in December 2006. With Airgo’s management out of the picture by 2008, a true unified standard was ready for approval.

It would be smooth sailing from here, right? Wrong.

Before the IEEE approves a given standard, everyone with a patent that touches it must sign a Letter of Assurance (LoA), stating that the patent holder won’t sue anyone using its patent in a standard-compliant device. All it takes is one holdout: the Commonwealth Scientific and Industrial Research Organisation (CSIRO), an Australian government research agency that held a key wireless LAN patent, refused to sign the 802.11n LoA.

Cue a patent war. Apple, Dell, Microsoft, and 11 other companies tried to get CSIRO’s patents overturned. They failed. In April 2009, the tech giants and 802.11n companies surrendered and signed a patent agreement.

Finally, on September 11, 2009, 802.11n was approved. It had taken “only” six years for a tech standard that everyone agreed was of vital importance.

The Standard Sausage Factory

Why did it take so long? Because the stakes are so high. As Carl Shapiro and Hal Varian wrote in The Art of Standards Wars, “The outcome of a standards war can determine the very survival of the companies involved.”

That’s why, ultimately, technology wars are not about technology. They are about business. Yes, you want a great technology that delivers the goods. But even if your tech is the best, if you can’t turn it into a standard, your innovation is unlikely to make it to market or succeed once it gets there.

Because the stakes are so high, the players can, and do, fight over every tiny issue. Each side seeks an advantage to make sure the resulting software or hardware works best with its “version” of the proposed standard. (For example, Microsoft wanted its finger in the XML pie, and used Microsoft Office formats to try to control it. So today we have two popular office document standards: Microsoft’s Office Open XML (OOXML) and the OpenDocument Format (ODF). In the end, Microsoft quietly came to support ODF as well.) The arguments are conducted in technical details, but money is the real driver.

These standards wars are painful, ugly, and can be incredibly petty from the outside looking in. Each participant wants the biggest possible pie.

These fights can be expensive in both engineering and legal costs, so some companies are moving away from standards wars, having realized the virtue of compromise. That lesson isn’t new: Sony and Philips realized in 1982 that fighting over CD formats would do neither company any good.

More recently, we’ve been seeing an interesting blend of open-source development and standards. The Linux Foundation has brought fierce rivals together to work on technologies and standards in such consortiums as the AllSeen Alliance for the Internet of Things; OpenBEL for open-source biological research; OpenDaylight, for almost all the Software-Defined Networking (SDN) companies; and the Open Virtualization Alliance and the Xen Project for KVM and Xen virtualization. Perhaps it’s the collaborative nature of open-source projects to which we can credit such successes. Facebook’s Open Compute Project brought open-source methodology to the data center. Apache continues to bring competitors together on projects such as Big Data’s Hadoop and Solr for search.

Open-source software and its business and development methodologies have shown that working together to create common software and standards is more affordable. In short, rather than take a chance on one small, late to market, and expensive pie, it’s better to get a share of one bigger, timely, and affordable pie.

Maybe, thanks to open source, the sausage days of standard making will be behind us. I hope so.




  1. Maybe, thanks to open source, the sausage days of standard making will be behind us. I hope so.

    Ah-ha-ha, hum, h-a ah-aahhaha. Maybe you should drop by some of these committees where we merry open something folks participate. I can confirm, it’s still all sausage factory, all the way down.

    • Steven Vaughan-Nichols says:

      Oh, I know open source all too well, and even Debian food-fights aren’t as bad as most of the standards wars I’ve seen. The key difference is that at the end of the day in open source you have to have working code. I’ve seen finalized standards that described almost totally fictional technologies.

  2. Great read Stephen, standards really are hard work and rely on countless hours from dedicated volunteers who all have day jobs. Alas, the principles of open-source development don’t seem to translate as easily to hardware technologies. The good news is that engineers of all ilks seem willing to embrace community models to move their ideas forward, and standards organizations benefit from this underlying trait.

  3. Sam Johnson says:

    “Apple manages to get it own way of doing things – from Advanced Audio Coding (AAC) for music format”

    Except that AAC is an MPEG standard that was not created by Apple. AAC was developed by Bell Labs, Dolby, Fraunhofer, Sony, and Nokia, and declared a standard in 1997, six years before Apple started selling music in the format.

  4. @Steve:
    You can’t possibly be so naive, so I think you just wanted to have a tidy sentence to conclude the article.

    You already mentioned Wayland vs. Mir. I think that will resolve tidily: Only Canonical is supporting Mir, so that will live if Ubuntu Phone takes off, or it will die if Shuttleworth runs out of money.

    The bigger fight is systemd vs. upstart. Again, that’s a fight of Canonical vs. the world, but this time it’s dragging in Debian. Just look at the process the Technical Committee is using to determine which init system to recommend for the next version of Debian. Clearly, by technical merits, systemd would win. But some members of the committee are current or former Canonical employees, and Ian Jackson in particular is not going down without a fight.

    • Steven Vaughan-Nichols says:

      If you look really closely, Mir vs. Wayland, it’s really Red Hat vs. Canonical. Like so many other standard wars before it, it’s really a business-driven one. Systemd v. upstart is also Red Hat vs. Canonical. Lots of open-source people only focus on the technology, with a lot of “We hate Ubuntu” thrown in, but, once more, it’s really a business clash in technical clothing.

  5. Stephen Lemelin says:

    Stephen, your point on how business, not technology, drives standards is so true. You mentioned many different fields, from electricity to electronics. I believe there was a big issue with train rail standards in the US for a while due to the same business issues.
    My concern is how this will translate into things like the auto and/or connected car. I think it may have been a good idea for the government to get involved and help dictate the standard. Normally I’m not a fan of the government getting involved, but in this case it may help.

    • Bill Wade says:

      Speaking as someone involved in a standards body – no. You don’t want government involvement. All that means is the parties to the standards process take their arguments to their respective representatives (along with a hefty campaign donation) and then the argument is moved to DC, wherein the technical and market merits of the proposed solutions are shoved to the wayside.

      • Stephen Lemelin says:


        Great point. After the rulings from the Supreme Court that allowed unlimited money in politics and treated corporations as people, it makes sense. Government cannot be counted on as a neutral party.

      • Steven Vaughan-Nichols says:

        I have to agree. NIST and the like are fine for cleaning up… after the work is done, but you don’t want the govt. in early. You really, really don’t.

    • I was the author of a standard initiated by a government agency. It took years off my life (or so it seemed). The process was years in the making, with involvement from a lot of folks who wanted to be vendors to the government agencies using the standard. Then there was the problem of folks who missed a meeting and, at the next one, would argue vigorously against changes made at the meeting they’d missed – and get agreement. I finally quit after the document had been changed back to previously approved text, and then changed yet again, one time too many.

      When I looked at the version that was finally approved, I didn’t recognize it at all. I’m quite surprised that it even succeeded. Perhaps my rant when I quit had some effect because the standard was approved only three months after I quit.

  6. Rob Grainger says:

    I’m still clueless as to why XML is regarded as a good interchange format for anything.
    I mean, why say something once when you can needlessly repeat it?

    • Bill Wade says:

      I held that same position ten or so years ago. But the ability to delegate much of the error checking on inbound data to a validating reader saves quite a bit of coding. And it’s much harder to argue the semantics of a well-written, well-documented schema than it is to argue over a prose specification.
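The delegation Bill describes can be sketched in a few lines of Python. This is a hypothetical, stdlib-only toy – the element names (`name`, `port`) and the schema shape are invented for illustration; real-world code would declare an XSD and hand the work to a validating parser such as lxml – but it shows the principle: declare the rules once, and checking inbound data becomes mechanical.

```python
import xml.etree.ElementTree as ET

# A tiny hand-rolled "schema": each required child tag maps to a type
# its text content must satisfy. (Hypothetical example fields.)
SCHEMA = {
    "name": str,
    "port": int,
}

def validate(xml_text, schema=SCHEMA):
    """Return a list of error strings; an empty list means valid."""
    errors = []
    root = ET.fromstring(xml_text)
    for tag, kind in schema.items():
        node = root.find(tag)
        if node is None:
            errors.append(f"missing required element <{tag}>")
            continue
        try:
            kind(node.text)  # type check by attempted conversion
        except (TypeError, ValueError):
            errors.append(f"<{tag}> is not a valid {kind.__name__}")
    return errors

print(validate("<cfg><name>db</name><port>5432</port></cfg>"))  # []
print(validate("<cfg><name>db</name><port>http</port></cfg>"))  # one error about <port>
```

Every consumer of the data gets the same checks for free, which is exactly the coding that a prose specification would otherwise force each of them to rewrite (and argue about).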

      • Rob Grainger says:

        I get that, I’ve used XSD extensively, but still dislike the redundancy of information. JSON is better, but could do with some kind of schema standard.
