While we all know that most of our world is powered by software and our personal data is stored in multiple databases around the world, we rarely think about how often that data is passed around and manipulated between various disparate systems. Rest assured that your financial and government data is securely managed with encryption and authorization techniques, and that the software vendors who develop those systems must adhere to a set of regulations. But don’t be overly assured.
Here’s where things get dicey – while regulatory bodies have the best of intentions when they design their oversight standards, most of those standards relate to process. A software company that serves the government must adhere to rules about testing procedures, code review, and requirements definition, in an effort to ensure quality processes exist. But there is no way to regulate quality itself, just the process that promotes it.
So, when the Indiana Family and Social Services Administration announced last week that they inadvertently disclosed people’s private information (and lots of people – like almost 188,000 of them), it was not the actual process that was to blame. Despite the regulations about process, a bug slipped through, as bugs are known to do, and caused RCR’s document management system to direct documents to the wrong recipients.
Everybody in the software industry knows that nothing is 100% bug-free. There may be no *known* bugs in a system, but nobody can claim that all bugs have been found and fixed. However, there are a few details about this case that could easily have been avoided – and hopefully will never be repeated.
The most alarming thing to me was this statement and the fact that it was made on July 1, 2013:
“The programming error was made on April 6, 2013, and affected correspondence sent between April 6, 2013, and May 21, 2013. The error was discovered on May 10, 2013. RCR determined the root cause of the programming error and it was corrected on May 21, 2013.”
This is a fairly serious bug – and one that seems like it could have been easy to find. While none of the involved parties disclosed the root cause of the issue, one would hope that RCR conducts rigorous testing on any document management system that handles private information, especially financial information and social security numbers. At the very least, I'm sure we can rely on RCR to add an automated regression test for this condition to their suite so it never happens again. Right?
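To make that concrete: a regression test for this class of bug can be as simple as asserting that a document is only ever routed to the client it belongs to. Everything below is hypothetical – the root cause was never disclosed, and nothing is publicly known about RCR's actual system – but it sketches the shape of the check:

```python
# Hypothetical sketch of a routing regression test. The Client/Document
# models and route_document() are invented for illustration; they are
# not based on RCR's real document management system.
from dataclasses import dataclass

@dataclass
class Client:
    client_id: str
    mailing_address: str

@dataclass
class Document:
    client_id: str      # the client this document belongs to
    contains_pii: bool  # whether it holds private information

def route_document(doc: Document, clients: dict) -> str:
    """Return the mailing address for the document's own client."""
    return clients[doc.client_id].mailing_address

def test_document_goes_to_its_own_client():
    clients = {
        "A": Client("A", "1 Elm St"),
        "B": Client("B", "2 Oak Ave"),
    }
    doc = Document(client_id="A", contains_pii=True)
    # The whole point: a document must never be addressed to a
    # different client's address.
    assert route_document(doc, clients) == clients["A"].mailing_address
```

A test this small would not guarantee the bug's absence, of course, but once the root cause is known, encoding it as an automated check is what keeps it from recurring.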
Here’s a truly curious fact: The statement indicates that the error was discovered on May 10 (I am assuming a recipient alerted them that something was amiss) but it continued to affect mailings for another 11 days, until May 21. Why? Risk mitigation is an important part of managing people’s secure information and it seems implausible to me that there was no way to halt additional mailings while the issue was investigated. I could not find a breakdown of how many of the 188,000 people affected were impacted during those final 11 days, but obviously some portion of them were.
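The kind of risk-mitigation control I have in mind is sometimes called a halt switch or circuit breaker: a way to hold all outbound mail the moment an incident is suspected, without waiting for a fix. A minimal sketch, entirely hypothetical and not based on RCR's architecture:

```python
# Sketch of a "halt switch" on an outbound mailing queue -- the sort of
# control that could have stopped mailings on May 10 while the root
# cause was investigated. All names are invented for illustration.
class MailingQueue:
    def __init__(self):
        self.halted = False
        self.pending = []  # documents held during an incident

    def halt(self):
        """Stop all outbound mail while an incident is investigated."""
        self.halted = True

    def resume(self):
        """Re-enable mailing once the issue is resolved."""
        self.halted = False

    def send(self, document) -> bool:
        if self.halted:
            self.pending.append(document)  # hold, don't mail
            return False
        # In a real system this would hand off to the actual mailer.
        return True
```

With a control like this in place, the eleven days between discovery and correction would have added delay, not additional disclosures.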
And, last but definitely not least, the issue was corrected on May 21 – a full six weeks before the announcement was made to the public. Even then, the July 1 statement indicates that "the FSSA says it's in the process of notifying clients who are at risk." This seems like an inordinately long delay in disclosing something that will seriously impact individuals who are most likely in difficult situations already.
What can we learn?
Obviously regulations are intended to reduce the occurrence of this kind of glitch, and I believe they do. We need to acknowledge as a society that systems are built by humans and therefore are inherently imperfect. It is how we deal with the imperfections that matters. In this case, so many privacy regulations were violated, including HIPAA, that it was incumbent on the FSSA and RCR to react quickly, decisively, and with transparency. One can only hope there were reasons for the lack of action (or perhaps there was more action than we have been told about), but that is the whole point of transparency. In this case, the transparency isn't transparent enough.