Could Code Review Have Helped?


We’ve heard a lot about the lack of time allocated for testing with the new site and there is no doubt that’s true. The raw fact that most software professionals will attest to is that there is rarely enough time allocated for testing. Testing is a tough thing to schedule – how long will it take? Well, we can predict the “happy path” – here’s how long it will take to test the product if it all works correctly. But once something starts to go wrong – how can a tester accurately predict the time it takes to chase down a bug and then later ensure the fix works? It’s the friction that exists on every project.

This is one of the major reasons why so much attention has been paid to moving quality upstream. Everybody on the project is responsible for ensuring the quality of their deliverables – from requirements to deploy scripts. The burden of assuring quality should not rest solely on the shoulders of the testers, as most of us in the industry know. So let’s look upstream a little to see where some of the issues in the site could have been avoided.

Code review is one of those quality measures that many teams do but many don’t do well. It’s harder than it seems on the surface – to do it well, the reviewer must be diplomatic, unbiased and clear while the coder must be receptive and unemotional. That’s one reason why many people eschew “over the shoulder” code reviews in favor of tool-assisted code reviews, which provide not only a dispassionate environment for giving and receiving feedback but also a record of the transactions.

Here are a few of the problems that have surfaced in the last couple of weeks that could easily have been found with effective code review procedures:

Sloppy Code

Okay, that’s a broad category, but what I mean is general sloppiness – typos, dummy files, inappropriate or obtuse comments, poorly named variables… the kind of thing that makes your source code incomprehensible. If you care at all about maintainability and extensibility (and you should), then keeping your source code clean and understandable is important. In this case, dummy files were released into the repository, bloating the code that had to be released and maintained. Additionally, when the source code was made available on GitHub, crowd reviewers noticed that there was still Lorem Ipsum text, typos, and editorial comments from developers that should never have made it into the released code base. More importantly, there were some rookie mistakes that a quick glance would have surfaced, like hard-coding the number of state choices rather than using the array itself to determine it. These are the kind of top-level, brain-dead corrections that a code review would have enforced.
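
The hard-coded count mistake is worth a quick illustration. This is a hypothetical sketch, not the site’s actual code – the list contents and names (`states`, `stateCount`) are assumptions made for the example:

```typescript
// Hypothetical illustration of the hard-coded count mistake described above.

const states: string[] = ["Alabama", "Alaska", "Arizona" /* ... */];

// Brittle: this silently goes stale the moment the list changes.
const STATE_COUNT = 3;

// Better: derive the count from the array itself, so there is
// one source of truth and nothing to forget to update.
function stateCount(list: string[]): number {
  return list.length;
}
```

A reviewer glancing at a literal like `STATE_COUNT = 3` next to a list would flag it immediately – exactly the kind of thing a review catches in seconds.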

Performance Issues

The biggest complaint raised by consumers (and one that happily fed many celebrities with comic fodder) was the performance of the site, which was often so slow that users couldn’t initiate or complete transactions. While there has been a lot of noise about the lack of in-depth load testing and, even worse, the lack of response to the failures reported during the light load testing that was done, many potential performance issues can be caught before they ever hit the load test lab.

For example, according to one analysis, just hitting the Apply button causes an inordinate number of client-side files to load and unnecessary data transfers to occur between the client and server. A good code review would question the necessity of each of those actions in an effort to keep the code as lean as possible, especially because it is easy to overwhelm a system with too many simultaneous communications between client and server. Another Web Programming 101 error that seems to have been made is the lack of efficient caching. Caching can help minimize both browser performance issues as well as unnecessary data retrievals that will eventually overwhelm the servers as more users come online.
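
To make the caching point concrete, here is a minimal sketch of the idea, assuming a fetch-like loader function; the names (`withCache`, `ttlMs`) are illustrative and not taken from the site’s code:

```typescript
// Wrap any async loader so repeated requests for the same URL within
// a time window are served from memory instead of hitting the server.

type Loader<T> = (url: string) => Promise<T>;

function withCache<T>(load: Loader<T>, ttlMs: number): Loader<T> {
  const cache = new Map<string, { value: T; expires: number }>();
  return async (url: string) => {
    const hit = cache.get(url);
    if (hit && hit.expires > Date.now()) {
      return hit.value; // served from cache: no round trip
    }
    const value = await load(url); // one real round trip
    cache.set(url, { value, expires: Date.now() + ttlMs });
    return value;
  };
}
```

Even a crude layer like this turns N identical client–server exchanges into one, which is precisely the kind of unnecessary data transfer a reviewer should be questioning.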

EDI Issues

Perhaps even worse from a consumer and insurance provider point of view are the data anomalies that are being reported. Even if you manage to get past the performance issues and submit your application, something on the back end is producing applications that the insurance companies can’t process because they are improperly formatted (which then causes the insurance company systems to error out).

Because the insurance industry is regulated, there are rigorous controls around the data transfers that occur between systems. New enrollments for healthcare applications are transferred via 834s, which have thousands of permutations based on the enrollee’s information. Obviously, the testing for this would be extremely complex and would need to be planned carefully. An equally careful code review to ensure that the 834s are being handled properly by the code itself would minimize the risk that an error would occur in the transfer.
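
As a greatly simplified illustration of what such a review might check: an X12 transaction set opens with an ST segment and closes with an SE segment, and ST01 = “834” identifies a benefit enrollment transaction. The sketch below assumes “~” segment terminators and “*” element separators; real 834 validation involves far more rules, so treat this as an illustration, not a compliant parser:

```typescript
// Very rough structural sanity check on an X12 834 transaction set.
// Real validation covers segment counts, loops, and code sets too.

function looksLikeValid834(transaction: string): boolean {
  const segments = transaction
    .split("~")
    .map((s) => s.trim())
    .filter(Boolean);
  if (segments.length === 0) return false;
  const first = segments[0].split("*");
  const last = segments[segments.length - 1].split("*");
  // Opens with ST, identifies itself as an 834, and closes with SE.
  return first[0] === "ST" && first[1] === "834" && last[0] === "SE";
}
```

Structural checks like this are cheap to write and cheap to review; a reviewer asking “where do we verify the envelope before sending?” would have surfaced the formatting failures long before an insurer’s system errored out.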

Security Problems

And on a par with the severity of the EDI problems are the security breaches that have been reported by analysts of the site. Simple techniques around authentication and the sharing of personal information are known by most developers, and they are especially important for any federally regulated industry. In fact, when security issues began surfacing, the team jumped on them first to try to prevent any further breaches. Ben Simo has been instrumental in finding many of the security breaches that the contractors have tried to fix but, as he points out, it’s possibly more concerning to think about the security issues that we can’t see. (I won’t even insert a comment here about why we outsourced this work to a Canadian company when so much of it relies on US regulations).
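
One of those simple, well-known techniques is worth sketching: a password-reset endpoint should return the same response whether or not the account exists, so an attacker can’t enumerate valid users. The names here (`requestReset`, `knownUsers`) are illustrative assumptions, not the site’s actual API:

```typescript
// Return an identical response for known and unknown addresses,
// so the endpoint leaks nothing about which accounts exist.

const knownUsers = new Set<string>(["alice@example.com"]);

function requestReset(email: string): string {
  if (knownUsers.has(email)) {
    // Queue the actual reset email here (omitted in this sketch).
  }
  // Same message either way: no account-existence information leaks.
  return "If an account exists for that address, a reset link has been sent.";
}
```

It’s a one-line habit, and exactly the kind of thing a security-minded reviewer checks for on every authentication-related change.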

Software Quality Matters

It’s an interesting dynamic happening around us now – software and its complications used to be conversation fodder within technical circles only. Now it’s a conversation happening in the primetime news and around dinner tables all over America, where the intricacies of building and testing software are not clearly understood. As software professionals, we have a responsibility to not only deliver quality software but also to be measured and accurate in our responses to these types of situations. The problems with the site are not a reflection on the program or the politics behind it; they are pure technical anomalies that could have been avoided by taking some simple and well-understood steps that are essential for building quality software.

And software quality matters.



  1. Hi Lorinda,

    Some people see the glass half-full and some see it half-empty. I don’t want to say that I’m among the former group of people, but, from my experience, the code of any application is not 100% perfect. It’s like everything else in life, beautiful on the outside, but not that beautiful on the inside.

    If there are comments here and there in the code, typos, lorem ipsums, etc… that doesn’t mean the code is bad or the whole thing is bad.

    Additionally, one of the very few things in life that get better with age is applications. When you first release it sometimes it’s horrible, but then it gets better and better because of the feedback, until it reaches full stability in a couple of years. Yes – it’s a lengthy process, but it happens to every single piece of software. Ask Microsoft.
    A disclaimer: I’m not involved in politics whatsoever and I’m not defending anything or anyone, this is just my technical opinion based on my experience.
