
Parts of the federal IT infrastructure look like a vintage computer museum, with equipment that dates back to the Nixon administration. The Department of Defense's Strategic Automated Command and Control System runs on an IBM Series/1 computer from the 1970s. Those storied “nuclear codes” may live on 8-inch floppy disks stashed somewhere in the Pentagon.

Federal lawmakers appear ready to revive the Modernizing Government Technology Act (MGT), a bipartisan proposal to update legacy IT in federal agencies that stalled at the close of 2016. Last September the MGT passed the House, but the $9 billion score from the Congressional Budget Office (CBO) was too rich to pass in the Senate. Both House and Senate staffers say legislators are trying to work out a fix.

Rep. Will Hurd (R-Texas) and Rep. Gerry Connolly (D-Va.) are driving the bill forward again in hopes that a new Congress and executive administration will proceed with a sense of urgency to replace outdated systems.

How outdated is Federal IT?

Nearly three-quarters of the $80 billion federal IT budget, roughly $60 billion a year, gets funneled into maintaining legacy systems, according to a U.S. Government Accountability Office (GAO) report. A 2016 Dell study found that 70 percent of federal IT managers reported working with outdated IT systems.

Several departments have legacy applications dating back to the 1950s that are hosted on 1990s IBM mainframes. The Department of Veterans Affairs employee time-clock system is one example; it’s written in COBOL, a business applications programming language developed in 1959.

Congressional leaders have identified 24 federal agencies most in need of new IT equipment. Citing a major data breach at the Office of Personnel Management in 2015 that exposed 18 million federal employees’ personal information, Representatives Hurd and Connolly won quick support from their peers on both sides of the aisle in the House. But the bill’s $3.1 billion funding component ultimately fell short of what the CBO estimated the job would cost.

MGT borrowed ideas from the MOVE IT Act, which Hurd introduced with Rep. Steny Hoyer (D-Md.) and which proposed setting up individual funds for the 24 agencies. It also included elements of the successful Federal IT Acquisition Reform Act (FITARA), penned by Rep. Darrell Issa (R-Calif.) and Rep. Connolly in 2014, which proposed a revolving shared fund that funnels cost savings from new infrastructure into other departments’ upgrades.

“We think we have a potential fix, and we’re going to sample that around with all the relevant staff and members and hopefully get this reintroduced fairly soon,” Rep. Hurd told The Hill reporters.

It’s so hard to say goodbye to yesterday’s mainframes

MGT’s sponsors understand that the longer agencies wait, the harder the upgrade becomes. Sticking with something known to work reliably, even if it is inefficient to support, carries powerful organizational inertia.

Any mainframe computer technician will tell you, enthusiastically, how remarkable the technology is despite its age. Those old IBMs have triple- and quadruple-redundant hardware failover mechanisms, several different electrical inputs for backup power, end-to-end encryption, and layers of advanced virtualization. Thanks to that redundancy, they are built to run 40-plus years at 100 percent uptime.

IBM System/360 in use at Volkswagen, 1973

She’s so heavy

The major drawback is that mainframe upgrades are done with a forklift and cost millions of dollars. Also, COBOL programmers are nearing retirement age and dropping out of the workforce. Younger programmers are taught Java, not COBOL, in school, and COBOL is an unwieldy, inaccessible language to learn on your own.

Then there’s the risk involved in migrating. Massive software projects often end in miserable failure in organizations where software isn’t the core business, as in government. Moving volumes of data from pre-2000 mainframe computers to Unix-based client-server workstations is problematic because the two environments’ network connectivity and data formats are incompatible. The workaround involves converting volumes of files onto streams of LTO tape, then reading and unpacking those volumes onto a file server. That process takes considerable time and effort for a project of this scale.
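One concrete incompatibility in such a migration is character encoding: IBM mainframes store text in EBCDIC, while Unix servers expect ASCII, and mainframe files typically use fixed-length records rather than newline-delimited lines. A minimal sketch of that one conversion step is below; the record length and sample field layout are hypothetical (real jobs get the layout from a COBOL copybook), but the EBCDIC code page 037 decoder is standard in Python:

```python
# Sketch: unpack fixed-length EBCDIC records (a common mainframe dump
# format) into ASCII strings. RECORD_LEN and the sample record content
# are illustrative assumptions, not details from any federal system.
RECORD_LEN = 80  # hypothetical fixed record length

def unpack_records(raw: bytes, record_len: int = RECORD_LEN):
    """Split a byte stream into fixed-length records and decode each
    from EBCDIC (code page 037) to a regular Python string."""
    records = []
    usable = len(raw) - len(raw) % record_len  # ignore a trailing partial record
    for offset in range(0, usable, record_len):
        chunk = raw[offset:offset + record_len]
        records.append(chunk.decode("cp037").rstrip())  # strip pad spaces
    return records

# Tiny demo: round-trip a made-up personnel record through EBCDIC.
sample = "EMPLOYEE 00042 ACTIVE".ljust(RECORD_LEN).encode("cp037")
print(unpack_records(sample * 3))
```

At federal scale the same logic would run as a streaming job over terabytes read back from LTO tape, which is where the time and effort the article describes comes in.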

Will federal IT work in the cloud?

Migrating federal data to the cloud is likelier than moving new x86 datacenter hardware onsite. IBM offers extensive solutions directed at federal agencies for getting non-critical data into the cloud with its z-series mainframes. Taking an application database from mainframe to cloud is still cutting edge: only in the past few years has the banking industry, which predominantly runs its bookkeeping on mainframes, begun migrating to the public cloud.

IBM Watson z13

The newest IBM z13 mainframe computer (above) is designed for hybrid cloud infrastructure and is the heir apparent for many federal agencies. IBM claims that organizations can lower total cost of ownership (TCO) by a third year over year using the hybrid storage model, which gives federal IT a way to fund its own refresh over the span of three to four years.
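To make the arithmetic behind that claim concrete, here is a rough sketch of how a one-third annual TCO reduction compounds into a refresh budget. The $100M baseline annual spend is an illustrative assumption, not an IBM or agency figure:

```python
# Sketch: if hybrid storage cuts TCO by a third each year, cumulative
# savings versus the old flat budget can bankroll the refresh.
# The $100M baseline is a made-up example figure.
legacy_annual_tco = 100.0  # hypothetical legacy annual spend, $M
reduction = 1 / 3          # the claimed year-over-year TCO cut

tco = legacy_annual_tco
savings = 0.0
for year in range(1, 5):
    tco *= 1 - reduction                    # TCO falls by a third each year
    savings += legacy_annual_tco - tco      # saved versus the flat legacy budget
    print(f"Year {year}: TCO ${tco:.1f}M, cumulative savings ${savings:.1f}M")
```

On those assumptions, four years of compounding savings exceed two full years of the original legacy budget, which is roughly the self-funded refresh window the article describes.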

Taking that longer view of total costs and savings, Hurd and the congressional leaders pushing for new federal IT are urging the CBO to factor those savings into its estimate for the job.

“The good news is that the momentum is there to pass legislation modernizing federal IT systems, and lawmakers are telling us that this will be a priority,” said Trey Hodkins, SVP at the IT Alliance for Public Sector, an IT consultancy for government agencies and an IBM partner. Hurd and company are also considering enlisting private-sector cybersecurity talent to serve one or two weekends a year for additional support.


Author Adam Lovinus

A tech writer and Raspberry Pi enthusiast from Orange County, California.


3 Comments

  • I am naive, but while I understand that the maintenance costs for older equipment are high, and I get that COBOL etc. are old languages (most financial institutions and hospitals still run on various “old” languages too), what does migrating to the “cloud” really mean? Isn’t it just storing the data on other servers (they are all just computers of whatever configuration), usually somewhere else? If the current (old) systems work well, then keep them for now, and start new systems from today forward, so that in effect you would be storing on both the old systems and the new systems. Then in a few years, switch over completely as the older data is no longer needed. Also, while 8-inch floppies are ancient, I agree (I have some somewhere still), are they more secure from being cracked than USB drives and cloud storage? It seems to me that putting all the data on the cloud (I truly dislike that term) is only setting the data up to be stolen at worst or corrupted at best, as net security seems to be a total joke everywhere. Sometimes I feel that techies love tech for its own sake (I know my son does). Yes, faster and smaller it is. But I rarely find it more durable or less buggy, and it is definitely not more secure. Connecting everything together just increases the risks.

    • Adam Lovinus says:

      Those IBM mainframes are in service because they are great computers. You’re correct about that. With dual- and triple-redundancy at the hardware level, they are the battle tanks of computing. They work in the sense that they’re not broken. But when 75 percent of the IT budget goes to maintaining old infrastructure, you can’t expect the government to utilize the power of modern IT beyond break-fix survival. I’m OK with outsourcing data storage to third-party companies that exist solely to secure data against intrusion and loss. It’s good for business, too: federal contracts for data migration and hybrid architecture are up for grabs.

  • Samir says:

    You said it well. As the article states, those IBM mainframes were built to work 40 years at 100% uptime without failing. You or I probably couldn’t name a single piece of equipment made today that could dream of that reliability.

    The current iPhone has more processing power than the Space Shuttle’s five computers combined. And yet it’s the Space Shuttle’s computers that reliably helped people get to space and back for decades, whereas iPhones last a few years at best.

    The Voyager missions had memory in the kilobits and yet they have traveled beyond the edge of our solar system and are still running. It’s this type of reliability that is just not there in modern computers and can’t be because the designs are flawed from the get-go.

    I have no problem with every important thing being stored on ancient systems that no one can hack into–that sure keeps them away from the data.
