2011 Predictions for IT in the Federal, State, and Local Public Sector
As this is the first week of the first year of a new decade, predictions are in demand. Below are my predictions for Information Technology in the Federal, State, and Local public sector. I promise to revisit these predictions at the end of December to test their accuracy. Wink.
Cloud Computing: Off and running, finally. In the last few months the Departments of Interior and Agriculture and the General Services Administration have committed to the cloud for office productivity applications such as Google Apps and Microsoft’s Business Productivity Online Suite. Major cities, including Los Angeles and New York, have also decided to adopt cloud services. The latest buzz is that FedRAMP will be ready for prime time by April.
The lack of an accepted security authorization process has been a major issue slowing adoption of cloud services as envisioned by OMB in its Cloud First strategy. With FedRAMP operational, I believe there will be a flurry of procurement announcements for cloud services during the remainder of FY 2011, for implementation with FY 2012 funds. That is, if the major vendors do not bog down the procurement process with protests, and if the initial implementations go over reasonably well with end users.
Data Center Consolidation: Evolution rather than revolution. The FY 2012 budget submittal for Information Technology is expected to incorporate agency budgets for the Data Center Consolidation Initiative. Planning for that initiative occupied much of senior IT leaders’ time in 2010 and identified many more data centers than had been acknowledged in the FY 2009 baseline.
Given the atmosphere of cost containment in Congress, it appears unlikely that funding will be made available for shiny new mega data centers. As efficient as they might be in the long run, centralized data centers offering shared services for multiple agencies would eliminate jobs in the small 200- to 500-square-foot data centers across the country. These small centers are mostly limited to email servers, office productivity applications, and a few locally required database applications. It is more likely that consolidation will be accomplished in phases that roll the smaller centers up into mid-sized intermediate configurations, which can evolve toward a more centralized, efficient architecture over five years or so. Meanwhile, at their primary centers, agencies are taking advantage of the opportunity to consolidate workloads onto integrated stacks of hardware and software (e.g., Oracle Exadata), improving performance on mission-critical systems while reducing hardware footprint, staffing, and energy costs.
Open Source Software: Nimble, robust, and in demand. Linux has broken through the fog of traditional commercial software marketing to become a sought-after mainstream alternative in many workhorse environments. Red Hat’s growth in Linux sales during the economic downturn is a clue to the reason: reliable, scalable, and secure core software with a compelling array of the component modules required by the data centers of today and tomorrow, including virtualization, cloud services, and a complete middleware suite in JBoss. I predict that this growth in Linux adoption will accelerate over the next year, taking share from some entrenched Microsoft server farms as more and more IT managers come to understand the value of the Red Hat Linux support model.
Mobile Apps: Here to stay, will never go away. If you have doubts about the government’s enthusiasm for apps, just take a look at those available at no cost on USA.gov: calculate your BMI, check on airport delay times, scan the crowd for one of the FBI’s Most Wanted – all from your smartphone. And these are just the ‘Apps for Citizens’.
Over the past year, a wide range of federal agencies have awarded contracts for adapting or building custom mobile apps for employees, including apps that interact with servers at HQ to provide recommendations on crop pest control, build maintenance tasking assignments from digital photos and GIS locations of facility and equipment problems, or augment formal training with reference material in the field that ‘matches’ the problem in the digital photo. The major hurdle to wide adoption will be the effort required to incorporate mobile devices that exchange data with internal servers into the enterprise network security model. When this security integration process becomes the norm, we may find that more government employees are in the field than in Washington.
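To make that field-to-HQ data exchange concrete, here is a minimal sketch in Python using the widely available requests library. The endpoint URL, token, and field names are hypothetical placeholders invented for illustration, not any agency’s actual API; a production app would use whatever HTTP client and credentialing the device platform and enterprise security model dictate.

```python
# Minimal sketch of a field app uploading a photo plus GIS location to an
# HQ server over HTTPS. Endpoint, token, and field names are hypothetical.
import requests

HQ_ENDPOINT = "https://hq.example.gov/api/maintenance-reports"  # hypothetical
AUTH_TOKEN = "replace-with-agency-issued-token"                 # hypothetical

def submit_field_report(photo_path, latitude, longitude, notes):
    """Post a digital photo and its GIS coordinates to the HQ server."""
    with open(photo_path, "rb") as photo:
        response = requests.post(
            HQ_ENDPOINT,
            headers={"Authorization": "Bearer " + AUTH_TOKEN},
            data={"lat": latitude, "lon": longitude, "notes": notes},
            files={"photo": photo},
            timeout=30,
            verify=True,  # enforce TLS certificate validation
        )
    response.raise_for_status()
    return response.json()  # e.g., a recommended maintenance tasking

# Example call from a field inspection:
# submit_field_report("pump_station.jpg", 38.8977, -77.0365, "corroded valve")
```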
Linked Data/Semantic Web: The light bulb will go on. What do DBpedia, the DoJ/DHS National Information Exchange Model (NIEM), the NIH BioPortal, the IARPA Blackbook program, and Best Buy have in common? They are among the early adopters who have demonstrated the high ROI that can be achieved by applying the recent World Wide Web Consortium specifications for data interchange. Leading software vendors including Oracle (Oracle Database 11g Semantic Technologies), IBM, and HP have developed semantic technology solutions, and scores of companies are offering early-stage tools to support application development.
The Administration is serious about opening up unclassified government data for broad use and putting in place a framework for data interchange among all levels of government without requiring reengineering of existing databases. I predict that by the end of 2011 there will be a range of implementations across government, and you will begin to see the terms RDF, OWL, and SPARQL as frequently as you see XML, Schema, and SQL.
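For readers who have not yet met those terms in the wild, here is a minimal sketch using the open-source Python rdflib library. The agency, grant, and namespace below are made up purely for illustration, but the triple-and-query pattern is what RDF data and a SPARQL query actually look like alongside the familiar rows-and-SQL world.

```python
# A small taste of RDF and SPARQL, sketched with the open-source rdflib
# library. The grant data and namespace below are invented for illustration.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

EX = Namespace("http://example.gov/data#")  # hypothetical namespace

g = Graph()
g.add((EX.grant42, RDF.type, EX.Grant))
g.add((EX.grant42, EX.awardedBy, EX.DeptOfExample))
g.add((EX.grant42, EX.amount, Literal(250000)))

# SPARQL plays the role SQL plays for relational data: pattern matching
# over subject-predicate-object triples instead of rows and columns.
query = """
PREFIX ex: <http://example.gov/data#>
SELECT ?grant ?amount
WHERE {
    ?grant a ex:Grant ;
           ex:awardedBy ex:DeptOfExample ;
           ex:amount ?amount .
}
"""
for row in g.query(query):
    print(row[0], row[1])
```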