Newspaper reports from the mid-1960s reveal Johnson’s plan to have the federal government develop a computing center (covertly, of course; it was the middle of the Cold War, after all) that would serve as a central records depository, its data accessible simultaneously to government agencies across the country.
Startup costs were estimated at about $2 million and operating costs would eventually top out around $12 million.
With millions of records already captured on magnetic tape, including those of the IRS, Department of Defense, FBI, Census Bureau, Office of Education, and Social Security Administration, the shared-service center could have launched with a sizeable database from the get-go. And the ability to cross-reference information would have ended the siloing of vital records and reduced redundancy in federal operations.
So, what happened to our “One Nation Under Cloud” foray into distributed data? Well, somehow information about the project leaked to the press and the public. And since this was a generation that valued privacy and feared Big Brother (an intrusive government, not a CBS television hit series), the notion of Uncle Sam compiling terabytes (a term not even coined back then) of data on its citizens met with a storm of resistance before such a thing was even possible.
Worries over illegal wiretapping, covert psychological testing, and a panoply of other privacy concerns ultimately killed the project and led to the passage of the Privacy Act of 1974 (although another presidential scandal, Nixon’s Watergate, likely helped push that one through Congress).
The Federal Cloud Today
The federal government’s IT operation has come a long way since LBJ (and so has its budget, now around $80 billion a year). But today’s White House administration is still gung-ho on leveraging the cloud to foster efficiencies in spending and productivity.
Under its ‘Cloud First’ policy, the White House is asking IT staffers to consider a cloud strategy first, giving it priority over traditional options when evaluating future projects. Also in the mix are mobile technologies and, of course, addressing security concerns in both arenas.
The flexibility to rapidly provision IT resources has appeal for federal CIOs looking to shed decades of criticism over IT’s historically slow response to demand for new and improved services. Others, such as the scientists in the Energy Department, are eyeing cloud computing models to expand their computing capabilities as well, although theirs may be an internal, proprietary cloud, since the cost of private services presently exceeds the cost of operating their own supercomputers.
Perhaps the biggest challenges facing federal CIOs who want to implement cloud services are contractual rather than technical. The notion of using private sector infrastructure to house government data doesn’t square with federal procurement policy for whole sectors of data, notably defense and financial data.
Still, the cost savings are substantial, an estimated $5 billion per year, so the agencies, their partners in procurement, and the private sector will have to find ways to overcome those hurdles. Whether by choice or by necessity, it seems LBJ will eventually get his way: There will be a cloud behind much of the government’s IT operations.
What cloud-related challenges has your organization overcome recently, and how did it get past them? Please share your lessons learned in the comments.