From streaming movies to checking email 24/7, we take “always-on” data applications for granted. But in the public sector, the always-on mindset has been slower to catch on.
From cyber-attacks and power failures to data center and application outages, IT downtime is an ongoing problem for the U.S. government.
A 2015 MeriTalk report found that 70% of federal agencies had experienced downtime of 30 minutes or more in a recent one-month period; 42% of those incidents were attributed to network outages and 29% to connectivity loss.
But high availability in government isn’t just about convenience; it’s also essential to national security, service delivery, and compliance with federal mandates such as FISMA and FedRAMP that set high data availability targets. This matters all the more as agencies’ use of cloud applications grows.
To achieve always-on data and application availability, agencies should look for several key characteristics, defined by their own needs and priorities. Who better to define what “always-on” means to the enterprise than those responsible for the data applications themselves? To that end, DLT partner DataStax surveyed architects, developers, and enterprise leaders about the key features they look for in a go-to, always-on data management platform for cloud applications. Here’s what they discovered (summarized from this white paper):
1. It Must be Scalable – Database technology needs to support the scale requirements of government data, and 53% of respondents listed scalability as their main concern. Scaling vertically (piling on RAM, CPU, storage, etc.) can be costly; instead, IT professionals should consider scaling horizontally, adding commodity hardware to grow capacity linearly and absorb huge increases in demand without reconfiguration or downtime (illustrated in the first sketch after this list).
2. It Needs to be Always-On – No great revelation here. “Maintaining 100% uptime” came in as the second priority. But DataStax stresses not just the importance of high availability but the criticality of continuous availability: “You want complete protection against any failure for core capabilities, including your entire environment of business-critical applications.” Consider the 2015 AWS outage and how DataStax customer Netflix kept running through it via multi-region, active-active replication, in which all critical data is replicated between AWS regions to allow rapid failover (also shown in the first sketch after this list).
3. It Must be Able to Handle Mixed Workloads in Real-Time – Today’s users want instant analytics and lightning-fast search, and the only way to deliver that is to analyze and search operational data in real time. But this has to be achieved without degrading operational performance, something only a modern architecture that handles real-time transactional and analytical/search workloads together can provide (see the workload-separation sketch after this list).
4. It Must be Fast – 46% of respondents named “ensuring consistently fast response times for business applications” as the biggest technology issue for web and cloud application users. The solution is a data management platform whose architecture supports predictable, linear-scale performance no matter how much data or how many users are involved.
5. It Must be Secure – Ensuring data security is a big concern for 44% of those surveyed. Database breaches can be devastating, and Gartner lists advanced security as a must-have for production database environments, including data encryption, authentication, and internal and external activity auditing (a client-side sketch follows this list).
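To make points 1 and 2 concrete, here is a minimal sketch of cross-region, active-active replication using the open-source DataStax Python driver (cassandra-driver) against a Cassandra-based platform such as DataStax Enterprise. The contact points, data center names (us_east, us_west), keyspace, and table are hypothetical placeholders; real replication settings come from an agency’s own cluster topology.

```python
# Minimal sketch: active-active replication across two regions with the
# DataStax Python driver (pip install cassandra-driver).
# Contact points, data center names, keyspace, and table are placeholders.
import uuid

from cassandra import ConsistencyLevel
from cassandra.cluster import Cluster
from cassandra.query import SimpleStatement

# Connect to a couple of local nodes; the driver discovers the rest of the
# cluster automatically, so scaling out later requires no application change.
cluster = Cluster(contact_points=["10.0.1.10", "10.0.1.11"])
session = cluster.connect()

# Keep three replicas of every row in each (hypothetical) data center, so
# either region can continue serving traffic if the other one fails.
session.execute("""
    CREATE KEYSPACE IF NOT EXISTS agency_data
    WITH replication = {'class': 'NetworkTopologyStrategy',
                        'us_east': 3, 'us_west': 3}
""")
session.execute("""
    CREATE TABLE IF NOT EXISTS agency_data.events (
        id uuid PRIMARY KEY,
        payload text
    )
""")

# LOCAL_QUORUM acknowledges writes within the local data center for low
# latency, while replication to the remote region proceeds asynchronously.
insert = SimpleStatement(
    "INSERT INTO agency_data.events (id, payload) VALUES (%s, %s)",
    consistency_level=ConsistencyLevel.LOCAL_QUORUM,
)
session.execute(insert, (uuid.uuid4(), "citizen service request"))
```

This is the same pattern behind the Netflix example above: because every region holds a full copy of the critical data, clients can fail over rapidly to the surviving region when one goes down.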
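For point 3, one common pattern on Cassandra/DataStax clusters is workload separation: analytics or search nodes live in their own logical data center so long-running queries never compete with operational traffic. The sketch below, again with the Python driver, routes queries by execution profile; the data center names (“transactional”, “analytics”) and the queries are hypothetical.

```python
# Minimal sketch: pin heavy analytical queries to a separate (hypothetical)
# logical data center so transactional latency stays predictable.
import uuid

from cassandra.cluster import Cluster, ExecutionProfile, EXEC_PROFILE_DEFAULT
from cassandra.policies import DCAwareRoundRobinPolicy

profiles = {
    # Default profile: operational reads and writes stay in the local
    # transactional data center.
    EXEC_PROFILE_DEFAULT: ExecutionProfile(
        load_balancing_policy=DCAwareRoundRobinPolicy(local_dc="transactional"),
    ),
    # Analytics profile: long-running scans are sent to analytics nodes and
    # given a much longer timeout.
    "analytics": ExecutionProfile(
        load_balancing_policy=DCAwareRoundRobinPolicy(local_dc="analytics"),
        request_timeout=120,
    ),
}

cluster = Cluster(contact_points=["10.0.1.10"], execution_profiles=profiles)
session = cluster.connect()

# Fast operational lookup uses the default profile.
event_id = uuid.uuid4()  # placeholder key
session.execute("SELECT payload FROM agency_data.events WHERE id = %s", (event_id,))

# Heavier analytical query is routed to the analytics data center.
session.execute("SELECT count(*) FROM agency_data.events",
                execution_profile="analytics")
```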
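For point 5, the client-side half of the story is authenticating and encrypting every connection. The sketch below assumes a cluster that already has password authentication and client-to-node TLS enabled; the credentials, environment variable, and CA path are placeholders. Activity auditing, by contrast, is configured on the database servers rather than in application code.

```python
# Minimal sketch: authenticated, TLS-encrypted connection with the
# DataStax Python driver. Credentials and certificate path are placeholders;
# the cluster itself must have authentication and client encryption enabled.
import os
import ssl

from cassandra.auth import PlainTextAuthProvider
from cassandra.cluster import Cluster

# Trust only certificates signed by the agency's CA.
ssl_ctx = ssl.create_default_context(cafile="/etc/pki/agency_ca.pem")
ssl_ctx.check_hostname = False  # hostname verification policy is deployment-specific

cluster = Cluster(
    contact_points=["10.0.1.10"],
    auth_provider=PlainTextAuthProvider(
        username="app_user",
        password=os.environ["DB_PASSWORD"],  # never hard-code credentials
    ),
    ssl_context=ssl_ctx,  # requires a driver version that supports ssl_context (3.17+)
)
session = cluster.connect()
```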
Read more about the survey and how NoSQL technologies, whose strength is the ability to support scale-out architectures for next-gen apps and insights, are the way to go.