Best Practices for Achieving Migration to a Cloud Model
The following is a transcription of the Best Practices for Achieving Migration to a Cloud Model webcast that was held on February 23, 2011, and hosted by i360Gov.com. You may also view the archived version by registering online.
Moderator:
Good afternoon, and welcome to today's webinar, Best Practices for Achieving Migration to a Cloud Model, brought to you by i360.gov, DLT Solutions, Red Hat and NetApp. We have a great line-up of speakers today. Our first speaker is Van Ristau, Chief Technology Officer with DLT Solutions. Our second speaker today is Dawn Leaf, NIST Senior Executive for Cloud Computing and Senior Advisor, Information Technology Laboratory, at NIST. Our third speaker is Greg Potter, Research Analyst at In-Stat.
Before we get started, I just want to go over a few housekeeping items. Anytime during the next hour, if you'd like to submit a question, just look for the "Ask a Question" console, and we'll field your questions at the end of the presentation. If you have any technical difficulties during the webinar, click on the "Help" button located below the slide window, and you'll receive technical assistance. And finally, after the session is complete, we'll be emailing you a link to the archived version of this webinar, so you can view it again or share it with a colleague.
And now, I'd like to hand it over to Van Ristau – Van.
Van Ristau:
Good afternoon. I'm going to talk for a few minutes today about practical considerations in migrating existing applications and storage to an off-premise cloud environment. Let's take, for example, the considerations you have to make with application migration. First of all, we need to assess which applications should be candidates for replacement versus migration. For example, if you're planning to make a change from one email application to another, now would be a good time to consider shifting to a cloud email software as a service application. Likewise, if you have a widely-used client-server application that needs to be modernized for the web to reach a wider end-user base, now would be a good time to investigate what software as a service products might meet your needs, or consider building a replacement on a provider's cloud platform. In general, if you're planning to retire or replace an application, it would not be a good candidate for migration.
Secondly, I would suggest that you consider using some form of multi-criteria decision analysis methodology to assist in making your migration decisions. A couple of reasons for that: It's really going to be important for you to document the decision process and the criteria you use in selecting or rejecting applications for cloud migration. An objective process will help you to defend your decisions, and later, to revisit decisions when the selection criteria change. Considerations might include ease of integration, ease of migration, what technology stack you need, and the application design, including dependencies on web services or other databases in the enterprise.
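To make that concrete, here is a minimal sketch of what such a weighted-scoring analysis might look like. The criteria, weights, and scores below are purely illustrative, not a recommended rubric; substitute your agency's own criteria and keep the results with your records so decisions can be defended and revisited later.

```python
# A minimal weighted-scoring sketch of a multi-criteria decision analysis.
# All criteria, weights, and scores are illustrative placeholders.

CRITERIA = {                      # criterion -> weight (weights sum to 1.0)
    "ease_of_integration": 0.25,
    "ease_of_migration":   0.25,
    "stack_compatibility": 0.20,
    "design_dependencies": 0.30,  # web services, shared databases, etc.
}

def migration_score(scores: dict) -> float:
    """Weighted sum of 1-5 scores for one application."""
    return sum(CRITERIA[c] * scores[c] for c in CRITERIA)

apps = {
    "email":         {"ease_of_integration": 5, "ease_of_migration": 5,
                      "stack_compatibility": 4, "design_dependencies": 5},
    "legacy_ledger": {"ease_of_integration": 2, "ease_of_migration": 1,
                      "stack_compatibility": 2, "design_dependencies": 1},
}

# Rank candidates; a higher score suggests a better migration candidate.
for name, scores in sorted(apps.items(), key=lambda kv: -migration_score(kv[1])):
    print(f"{name}: {migration_score(scores):.2f}")
```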
Then, application by application, determine if a complete migration is possible, or even desirable. Does the cost to migrate outweigh the application's suitability for the cloud? For example, applications with low variability in resource requirements wouldn't take advantage of the cloud's elasticity. If that's important for your particular scenario, that might be an application you would want to keep in-house, if you can maintain the resource requirements at a steady state. Secondly, are you going to need web services, for example, for mainframe legacy applications that might support the migrated cloud application? If so, it might be a little complicated, and you might want to hold off on something like that until you get a couple of simpler applications under your belt.
Another consideration: Is the application going to be re-hosted on an off-premise cloud infrastructure as a single-tenant application with little or no impact on the software code, or is it going to be a multi-tenant application with a single code base, for example, serving multiple agencies in sort of a shared services environment? And last under this section, I suggest that you pilot or build a proof of concept for all or part of the application, particularly those critical pieces, to ensure that it will migrate well.
Next, determine if the target cloud environment will support the level of security and confidentiality of the data the application collects and stores. Clearly, candidates for early cloud migration or new application development are those that provide public access to public data – the Open Government Initiative is a case in point. There's little risk in that.
Identity and access management considerations are also as important in cloud environments as they are in in-house applications. If you're familiar with NIST 800-53 Revision 3, that document, available through NIST, provides guidance on selecting the impact level of your application. And if you've been following the GSA's infrastructure as a service BPA competition – or if you haven't, that was recently awarded to 11 teams – each one of those providers is required to have an infrastructure that's certified through the FedRAMP process at the moderate level.
Next, assess the compute resources you need to meet anticipated peak demand, and order what you need. You'll be required to make some commitments over time as to what you need in compute resources for this application: storage, processor, connectivity. You don't want to order more than you need, but if you have a seasonal or a periodic peak requirement that doesn't fall within those constraints, check to see if your cloud provider offers a burst capability for additional resources for short periods, so that the peak doesn't drive up your recurring cost.
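The arithmetic behind that advice might look something like the following sketch; the rates and the demand profile are invented for illustration, not drawn from any provider's price list.

```python
# Back-of-the-envelope comparison: reserve for the steady state and burst for
# the seasonal peak, versus paying for peak capacity year-round. All rates
# and the demand profile are invented for illustration only.

HOURS_PER_MONTH = 730
steady_instances = 10        # baseline compute need
peak_instances   = 40        # needed one month per year (e.g., a filing season)
reserved_rate    = 0.10      # $/instance-hour, committed
burst_rate       = 0.25      # $/instance-hour, on-demand premium

# Option A: reserve for the peak all year
cost_a = peak_instances * reserved_rate * HOURS_PER_MONTH * 12

# Option B: reserve the steady state, burst for the one peak month
cost_b = (steady_instances * reserved_rate * HOURS_PER_MONTH * 12
          + (peak_instances - steady_instances) * burst_rate * HOURS_PER_MONTH)

print(f"reserve-for-peak: ${cost_a:,.0f}/yr")   # $35,040
print(f"steady + burst:   ${cost_b:,.0f}/yr")   # $14,235
```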
Next suggestion: Determine what application performance monitoring tools are available from the cloud services provider. These cloud services providers vary widely in what tools they provide, or don't provide, for application performance monitoring. You've got to give consideration to this aspect of your migration strategy, and plan for licensing and installation of your own application performance monitoring tools if they're not provided by the cloud services provider, or if what the provider offers is insufficient for your needs.
Determine any one-time professional services required, and identify resources and cost. For example, at DLT, both of our vendor partners Quest Software and Red Hat have professional services teams dedicated to cloud computing for their products and cloud services. A number of other cloud providers have similar services, and systems integrators are also very active in this regard. So, if you anticipate needing services in the migration process, start that search early.
Then after migration, tune your applications to optimize performance. It probably goes without saying, but applications in the cloud, beyond multitenant software as a service products, will require monitoring and tuning if you're going to deliver the performance that end users expect.
While the cloud services provider assumes much of the workload for maintaining the infrastructure, you still have the responsibility for the application in a single-tenant environment.
Let's talk a little bit about storage. Determine what databases the vendor supports. Primarily here we're talking about platform as a service, where you might build a new application using the cloud service provider's tools and databases. But vendors vary widely in what databases they provide support for. If the vendor doesn't support the database you're currently using, determine what the impact is of shifting to another database. It may be low, or you may need additional training, licensing and so forth.
Understand what options the vendor offers. Does the vendor offer clustering and backup services as part of the package? Assess trade-offs with respect to latency, data security and availability compared to in-house before you make a commitment to migrate a particular application.
A deficiency in any one of these three can be reason to keep an application or a database in-house. You need to assess not only the service level agreements, but to actually pilot and stress-test a migration candidate before shifting to the cloud for production.
Understand how you’ll accomplish the initial load, periodic data transfers and synchronization required. Identify what software and tools are required for migration. Most of the major database vendors are now offering migration tools that support cloud migrations, either separately or as part of a combined package. Identify any termination costs, including data transfer. Understand how you will reserve for the cost to transfer data to a new cloud provider, should a decision be made to change. And last, but not least, identify how your cloud databases fit into your disaster recovery plan.
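As a rough illustration of the periodic synchronization piece, a hash-based change-detection sketch might look like the following; real migrations would lean on the database vendor's migration tooling, and the table contents here are hypothetical.

```python
# Schematic sketch of the "initial load plus periodic synchronization"
# pattern: fingerprint each row, then transfer only rows whose fingerprints
# differ from (or are absent in) the cloud copy. Data is hypothetical.

import hashlib

def row_digest(row: tuple) -> str:
    """Stable fingerprint of a row, used to detect changes between syncs."""
    return hashlib.sha256(repr(row).encode()).hexdigest()

def changed_rows(local: dict, cloud_digests: dict) -> list:
    """Keys of rows that still need to be transferred to the cloud."""
    return [key for key, row in local.items()
            if cloud_digests.get(key) != row_digest(row)]

local_db = {1: ("alice", "2011-02-23"), 2: ("bob", "2011-02-20")}
cloud    = {1: row_digest(("alice", "2011-02-23"))}   # row 2 not yet loaded

print(changed_rows(local_db, cloud))  # [2] -> only row 2 needs transfer
```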
The options are quite substantial. You'll notice on the viewgraph in front of you that we have a wide range of choices, both in enabling technologies and complete cloud solutions for an internal cloud deployment, infrastructure as a service, and software as a service, with an ever-increasing range of capabilities in software as a service.
One example that I'm familiar with, and I've been using it for about ten months now, is Google Apps. This is what we call a multitenant application. Everyone uses the same code base. So, when it changes, you get the change and everybody else gets the change at the same time. These are very cost-effective for applications that have a wide range of users with very similar requirements.
Another multitenant concept or model is platform as a service. In this case, you're using multitenant development tools. Everybody's using the same version of Java, for example, that the Google App Engine provides, and any changes to that are reflected in everyone's development platform.
And where are we headed? Everyone that I've talked with over the past year or so, and that I work with in cloud migration strategies and in supporting our customers, fully expects to be in a multi-cloud environment within three to four years. That is, you'll be using an in-house internal cloud, perhaps a community cloud with other agencies, and public clouds for multitenant email and that sort of thing. And tools are starting to be developed for this. For example, if you take a look at Red Hat's Cloud Foundations, Edition One, you'll find a very good description of how to build your own multi-cloud management environment.
And I’ll turn it over to Dawn Leaf.
Moderator:
Great. Thanks Van. Once again, before we hand it over to Dawn, I just want to encourage our audience to ask as many questions as you like; we'll be addressing them at the end of the presentation. Now, it's my privilege to introduce Dawn Leaf – Dawn.
Dawn Leaf:
Well, good afternoon, and thank you for the opportunity to join you here in the webinar. What I'd like to do today is to talk from the perspective of the NIST cloud computing program, but focusing on those strategies and those products that I think are useful to adopters, and to really focus on best practices, which is of course, the goal of the session.
One of the things that we find is the greatest challenge in cloud computing is the sheer volume of work and services and information and opportunities to collaborate that exist around cloud, because it's an emerging technology model, and obviously, there is a strong interest in applying cloud to reduce costs and improve services. So what we've done with the cloud program at NIST is to try to identify how we can prioritize our work to make sure that we are not only doing good work, but working on the right things. And what I'd like to do today is talk a little bit about the rationale behind that, and then to focus on some very specific working groups, and again, products that might be useful for adopters.
So I think that there are a large number of those who've been following the emergence of the cloud computing model in information technology who are familiar with the NIST definition of cloud computing, but there's a broad spectrum of work beyond that. Our program is really based on a request from the United States Chief Information Officer, Vivek Kundra, who has asked NIST to assume a leadership role, from a technology perspective, in helping U.S. government agencies to securely and effectively adopt cloud computing to support their missions. And if you are familiar with the U.S. government's Federal Cloud Computing Strategy that was released just last week, you'll notice that there is a $20 billion target, out of the overall $80 billion that U.S. government agencies spend each year on information technology, that is targeted toward cloud computing.
And the NIST portion of the strategy, or our part of the strategy and our program, is really to help identify those interoperability, portability and security requirements that U.S. government agencies need to satisfy in order to implement cloud computing, and to help develop or advance the standards, the security guidance, and the research that are necessary to support those requirements. An underlying principle of all of our work at NIST is to rely on collaboration with the private sector, not just with federal, state and local governments and international governments, but with industry consortia, academia, and standards development organizations.
So of all the slides in the deck, this one, the cloud program concept and rationale, is probably the most important one, because this is the way that we are trying to make sure that we are focusing our work priorities on those, again, interoperability, portability and security requirements that are of greatest importance to U.S. government agencies, and considered to be the highest priorities from an industry perspective. The slide in today's show is pretty busy. Normally, when I present this, I use the animation function to break it down piece-by-piece, but I'll try to describe it succinctly.
The NIST Cloud Computing Program is really broken into two parts: There's a strategic program and a tactical program. The goal of the strategic program, again, is to help us to identify those priorities. And the mechanism that we are using to define and to communicate the priorities is referred to as the U.S. Government Cloud Computing Standards, or Cloud Computing Technology, Roadmap. The reason I wanted to spend a moment on this today, or a couple of minutes, is because I think that this approach is usable more broadly than just by U.S. government agencies, really by any cloud adopter, in terms of determining what your highest priorities are, and what you really need to do to get beyond the generic discussion about security, about being able to move your workload between cloud providers, and about being able to distribute your workload, and into the specifics.
So, in this strategy, the first process is really working with U.S. government agencies to identify very specific target business or mission use cases for cloud computing. These are different than the case studies that you see presented as success stories by GSA or OMB, because these are targets. These are opportunities for cloud where the agency feels that there are requirements or issues that need to be satisfied in order to go forward. So an example that I often use would be NOAA. If NOAA wanted to use the cloud computing model to be able to handle spikes in demand for emergency warnings, for weather warnings, they might like to use cloud computing. But they would be sensitive to the fact that people make life-and-death decisions based on NOAA warnings, and to the fact that they have high and moderate security impact components in the systems.
So the goal in designing the target use cases is to use technically oriented scenarios that are end-to-end operational threads in how to implement cloud computing as a model, or how to operate under the cloud computing model, to support the mission space. This is owned by the agencies, and the NIST role is to interpret these mission requirements, again, into very specific technical requirements, not just the mantra of interoperability, portability and security, but reliability and maintainability as well.
The second part in the process is defining a neutral cloud computing reference architecture, and this was actually recommended to NIST by industry. The goal here is to develop a logical and physical model of cloud computing that is more detailed than the cloud computing definition, but is not tied to a particular set of products or a particular provider solution. The goal is to be able to categorize cloud services generically, so that government agencies can compare apples to apples.
And then the third process is exercising the scenarios against that reference architecture, in order to identify gaps, that is, requirements that are not met. This list of gaps for interoperability, portability and security basically is the roadmap. The roadmap is then used to prioritize our tactical projects at NIST, which include guidance, standards and research.
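Schematically, that gap-analysis step can be pictured as a simple set difference between what a use case requires and what the reference architecture covers; the requirement names below are invented placeholders, not NIST's actual taxonomy.

```python
# Schematic illustration of the gap-analysis step: exercise a use case's
# requirements against what the reference architecture covers; whatever is
# left over feeds the roadmap. Requirement names are invented placeholders.

use_case_requirements = {
    "bulk data export", "identity federation",
    "workload portability", "continuous monitoring",
}
reference_architecture_covers = {
    "bulk data export", "identity federation",
}

gaps = use_case_requirements - reference_architecture_covers
print(sorted(gaps))  # ['continuous monitoring', 'workload portability']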
So what I'd like to do now is move into some of the specifics for cloud adopters. But first, I wanted to just provide a timeline for context. And you can see that the strategy was developed between the May and November 2010 cloud computing forums. We then launched working groups in November, and we are planning in April to present the results or status of those working groups, and to complete the first draft U.S. Government Cloud Computing Technology Roadmap, or standards roadmap, at the end of this fiscal year.
So in terms of very specific products, and I think we'll be able to go through this very quickly, I'm not intending to read these slides; the goal is really to provide a reference for those who are participating today in the webinar. The first resource is really the forum and workshops themselves. They are attended very broadly, as you can see by this list, and there is an opportunity to hear different perspectives from various stakeholders in cloud computing. Everyone's invited, the next one is in April, it's free, and it's going to be held in Gaithersburg.
Another tool that I wanted to draw your attention to is the NIST cloud computing collaboration site. Each of those strategic efforts that I mentioned earlier, and each of those tactical projects, is for NIST both a project with internal work and a working group that is open publicly. And that's how we keep our work in sync with those broader sets of collaboration efforts and partnerships. So each of those projects has a working group with somewhere between 300 and 500 registered members. In actuality, the groups meet weekly, with maybe only 30 or 40 calling in, and we use collaboration tools to support those meetings. One of the advantages is that you get exposure to a really broad set of stakeholders in the cloud computing arena, and you can directly participate and contribute and review the materials.
All of the products from the working groups, however, are available on the NIST website to anyone who publicly logs in, so you don't have to be a registered member to see those.
A particular reference that I wanted to talk about, and I know that there's some interest in the CIO Council – and this gets back to the target business use cases – is that NIST works not only as a collaborator with these public working groups, but we are also formally identified as an advisor to the Federal CIO Council. And we chair the Standards Working Group, which is where government agencies provide information for their target business use cases. And then of course, we transfer that information through, again, the public working group. So it's really two lines of participation or communication: one that's formal within the government, and another that is broad and publicly available.
One of the things we're working on that I think may be of special interest is a comparison of reference architectures. We've listed 10. And what you find with reference architectures is that these are very often developed from the eye of the beholder. So if you're a security vendor, you tend to think of a reference architecture that's based on security. If you are a storage vendor, you think of storage. Some organizations think of a business model. The goal here, I think one of the advantages is that in looking at all of these models in this summary, which we've presented and provided on our website, you really get insight into how different providers, and for that matter, adopters, look at cloud computing, because once you get below the definition, there really is a lot of interpretation, as far as what cloud computing really means and how it's implemented.
Some work that we will be posting on our website is an analysis of resource allocation algorithms for public infrastructure as a service providers. And some work that has gotten a lot of interest, and again, all these projects are supported by public working groups as well, is the SAJACC project. SAJACC focuses on low-level technical use cases, like how do I, as a consumer, get my data into or out of a cloud. We are now developing test cases and procedures, really drivers that are publicly available, that can be used for testing various functions, again, such as moving your data between cloud providers.
And then this is really the last product slide that I wanted to cover today, and I think this relates very much to what was discussed earlier, to some of the subjects that Van covered. I think that there is, and we have observed that there is, a desire to have a rulebook for cloud computing, so that you can follow the rules and not make mistakes. But the thing is that cloud computing is an emerging technology model, even though it's based on some fundamental technologies that have been in place for a long time, and there isn't a large experience base to draw from. So it's really important, not only important but essential, for an adopter of cloud computing to really focus on your respective mission and your requirements, and interpret that.
There's guidance available, and NIST provides guidance, and I've listed that here. But it comes down to really thinking about your specific mission and requirements. One key guidance element, if you will, is ensuring that there's an exit strategy, making sure that you consider the ability to negotiate service level agreements, and remembering that just because you're delegating some of your workload or functions to a cloud provider, that is not one and the same with abdicating responsibility, because cloud computing is just another model.
And this last slide really just tries to put the whole program together, to provide a perspective, and to come full circle. So I hope this is helpful. And I'm happy to answer questions.
Moderator:
Great. Thank you so much, Dawn. Really good information, thanks for sharing best practices. For our audience, please feel free to leverage Dawn's experience today on the webinar, and to ask as many questions as you like.
Our next speaker is Greg Potter from In-Stat. Greg's a Research Analyst, and I'll hand it over to you, Greg.
Greg Potter:
Thanks, Doug. Hi, everyone. My name's Greg Potter. I'm a Research Analyst here at In-Stat, and I cover, amongst other things, public cloud computing. So I'm going to talk briefly today about cloud computing in the government sector, and where we at In-Stat think that is headed.
So first let me talk about the recent research we've done here at In-Stat and how we've come to our overall spending numbers. Our numbers come from three different areas that converge and shed light on this topic. First, we conducted a survey of over 2,000 IT managers across 20 vertical markets, including government, on this topic. We combined this information with information from government institutions and with analysis and interviews of IT personnel, and then we came to our general consensus on spending on public cloud services.
So with the first slide here, let's discuss the advantages and disadvantages of public clouds. The most obvious distinction is the fact that the underlying infrastructure is owned by the cloud provider itself. This is both good and bad for government organizations. For smaller governments, not owning the infrastructure is a big plus: it reduces IT expenditures on servers and IT personnel, among other things, and it also provides greater flexibility in times of increased demand. However, large governments, of course, have increased security needs, and they already have experience running data centers, so that's where the downside comes in. The move to the public cloud for these large governments will be more deliberate, and the public cloud will only be used for certain applications. In the end, In-Stat believes that the larger governments will use a mix of public and private cloud services, in sort of a hybrid cloud system.
So on this slide here, we're just highlighting our forecast for government spending on public software as a service. For 2010, we estimate spending at $75 million, and that grows to $150 million by 2014. On this next slide, we have government spending on infrastructure as a service; that's roughly $50 million for 2010, and it goes to $100 million by 2014. On this next slide here, we have the total government spending on public cloud services; we have that at $150 million for 2010, and it goes to roughly $300 million by 2014.
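For context, those figures imply that each segment roughly doubles over four years. A quick calculation of the implied compound annual growth rate, using only the dollar figures cited above, might look like this; the calculation itself is ours, not In-Stat's.

```python
# CAGR implied by the cited forecast figures: each segment roughly doubles
# from 2010 to 2014, about 19% compound annual growth. Only the dollar
# figures come from the talk; the arithmetic is illustrative.

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate from start to end over the given years."""
    return (end / start) ** (1 / years) - 1

for segment, (v2010, v2014) in {
    "SaaS":  (75e6, 150e6),
    "IaaS":  (50e6, 100e6),
    "Total": (150e6, 300e6),
}.items():
    print(f"{segment}: {cagr(v2010, v2014, 4):.1%} CAGR")  # ~18.9% each
```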
So currently, In-Stat believes that small governments, those with fewer than 1,000 employees, are leading the adoption of public cloud services. These small governments are more accustomed to outsourcing IT functions, and this is why they have been leading adoption of these public cloud services. Across all government organizations, as well as businesses, the major deciding factor in all of this is primarily security. Smaller governments obviously have far fewer security needs than the state and federal government agencies. This means that they're able to outsource far more applications and processes to the public cloud. The larger government organizations are taking a bit more time, and generally looking to outsource only certain applications and processes to a public cloud service.
So I'm sure we've all seen announcements by the various government agencies about switching to Google Apps, so here's a small list of the smaller governments that have moved to Google Apps. And on this next slide, we have a list of the larger governments, and you can see that's quite a large number of users. So In-Stat believes that the larger governments are primarily using Google Mail and Google Docs in their usage of Google Apps, while the smaller governments are also using some of the third-party apps available in the Google Apps Marketplace. We estimate the market will see a great uptick in the use of software as a service by government organizations over the next five years. One of the key features we believe is driving government adoption of these services is the guarantee of 99.9 percent uptime in the Google SLAs. These guarantees are doing a great job of convincing most of the smaller governments to adopt these types of cloud services.
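To put that 99.9 percent figure in concrete terms, the allowed downtime works out as follows; the arithmetic is ours, and the second tier shown is just a common comparison point, not a quote from Google's contract.

```python
# What a 99.9% uptime SLA actually allows, in hours and minutes per year.
# Straight arithmetic; the 99.99% tier is shown only for comparison.

HOURS_PER_YEAR = 24 * 365

for sla in (0.999, 0.9999):
    downtime_hours = (1 - sla) * HOURS_PER_YEAR
    print(f"{sla:.2%} uptime -> up to {downtime_hours:.1f} hours "
          f"({downtime_hours * 60:.0f} minutes) of downtime per year")
    # 99.90% -> ~8.8 hours (526 minutes); 99.99% -> ~0.9 hours (53 minutes)
```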
So that's all I have for today. Doug?
Moderator:
Great. Well, thanks a lot, Greg. That was really good. We're going to jump into the question and answer session right now. I just want to let you know you can still ask questions, so please fire away, and I'll let you jump right into it.
Our first question is for Dawn. It says, Dawn, what is NIST's biggest challenge in getting cloud platforms used across the entire organization?
Dawn Leaf:
Okay. What I can share with you is what we are observing as we work across multiple government agencies, and for that matter, with the international community, and state and local governments. I think that there is general recognition and acceptance that commodity-type computing services – email, office automation, applications – are a straightforward target for cloud computing. I think there is recognition that the private cloud computing model is a natural transition point for organizations that already have outsourced data centers.
The challenges are, first of all, understanding and moving past the current security model, which is based on physical security or boundaries, and the ability to directly inspect the premises of a data center and/or the employees who support it.
And in the broadest sense of cloud computing, if you think of it as what's often called the fungible cloud, then the security focus needs to move to what I think is commonly recognized: a data security focus. So that's one. The other challenge is in those core mission applications, because there are some logistics. For example, if you have legacy applications with very specific configuration requirements, when you migrate those to a cloud, you may lose the cost advantage that you would get with just a commodity-type application, which relies on a very consistent, homogeneous configuration. And the other is that the work has to be broken down piece by piece. We often get the question, why don't you just lock people up in a room and let them work out the cloud issues? But it really is a painstaking, step-by-step process. So the challenge is for those organizations, those individual agencies, actually working through their specific mission requirements and moving to the cloud.
And I hope that answered your question.
Moderator:
Great, thanks Dawn. Here's a good one for Greg. Do you think that smaller government organizations will ever utilize hybrid clouds on a significant basis?
Greg Potter:
You know, that's a good question. I would say that it would not be on a significant basis, because I'm mostly talking about governments with fewer than 1,000 employees, and they're generally not going to be running their own data centers. They're going to want to reduce their infrastructure costs as much as possible by going to a public cloud model.
Moderator:
Okay, great. Dawn, here's actually a good question, and it's multi-part, so bear with me as I get through this one. What is the status of the FedRAMP Program? Are they currently doing certifications; and if so, do we have estimates of how long it will take to go through the process?
Dawn Leaf:
Okay, that is a multi-part question, and I'll try to answer it succinctly. The FedRAMP Program, which is managed out of the General Services Administration now that FedRAMP is going operational to support the cloud computing effort, has just completed an initial comment period on a set of requirements to satisfy controls.
Big picture: The way FedRAMP came about, the goal was to identify a common set of requirements for cloud service providers and agencies to ensure that, for low- and moderate-security-impact systems, those solutions would meet the FISMA 800-53 and 800-53A requirements. So there's been one round of that. A lot of comments were received, and now the program is looking at how to not only reduce cost by defining common requirements for controls, but how to take it to the next level; in other words, how to identify continuous monitoring controls that could be applied that would actually improve the security posture for cloud computing. So that's technically where it is.
GSA, in its initial cloud offerings, used requirements for the certification and accreditation that are consistent with the FedRAMP controls requirements. And in fact, the way the requirements were created is that the Chief Information Security Officers across the government, including DOD and those agencies with high-security-impact systems, got together to identify the interpretation of requirements. I want to clarify, FedRAMP is not yet moving to high-security-impact systems. All I'm saying is that those requirements were considered.
So, the bottom-line answer of how long it will take a vendor to get through the process: it's equivalent to what it has taken to date with the existing GSA cloud services, and that range is roughly several months, depending on the offering and the impact level. I think that's it, I think I've pretty much covered it.
Moderator:
Great, thank you, and that was a long question, but a really good answer, and I think you captured it very well. Another question here for Greg. Where do you feel the greatest growth rates will be seen – SaaS, PaaS, or IaaS?
Greg Potter:
So, where I see it going is that SaaS, or software as a service, is definitely going to garner the majority of spending, especially public cloud computing spending by governments as well as businesses. Infrastructure as a service will also be growing pretty rapidly, but not quite as rapidly as software as a service; and PaaS, or platform as a service, is generally going to be significantly lower than both SaaS and IaaS.
Moderator:
Great. Another question. Dawn, health care is heavily regulated. So when do you see this sector moving to the cloud, and what types of data do you think are most suited to the cloud?
Dawn Leaf:
Okay, actually this is a very interesting question. Here at NIST, we also have a health care program that is led by B.J. Leighty, and we have looked at this a bit. I think there is general agreement that there are tremendous advantages for health care in using cloud services, because health care providers who could not ordinarily afford higher-end medical support systems or records management systems, or those who are in inaccessible locations, can really benefit from the advantages of cloud computing, not just in records management, but actually in supporting clinical care.
Obviously, static data, again, records, is a really good candidate for cloud, because as you work through those requirements for performance and access, you have an area of, I want to say, fact-finding and proof of concept before you really focus on clinical applications that are more critical, just as in the other government areas, where you tend to focus on your new applications or your non-mission-critical applications first.
The very last point I will make is that the same issues you face in health care around personal privacy information, we face across the board for moderate-security-impact systems, along with the requirements for ensuring that access to the data is controlled by the owner of the data, so there's really a lot of synergy there.
Moderator:
Great, thank you. Dawn, someone had a really good question as a follow-up to your previous statement. Dawn indicated that security is a challenge. How are small governments overcoming that argument or challenge?
Dawn Leaf:
Well, I think what you're seeing, and you've heard it all through the sessions today, is that it's natural for organizations and individuals to start with their known base of operations first. That's why there's a lot of focus on infrastructure as a service, and again, the commodity applications, and again, the private cloud computing model, or community model, because you are starting from that known base of traditional physical and logical security boundaries.
So I think that the smaller government agencies, and for that matter, the international governments as well, to be direct, they're starting with what's safe first. So they're starting where they have the most control in the model, which tends to be infrastructure, and they're starting with a private model, and then moving out.
But the point I would make here, and it's one that Vivek Kundra makes regularly, is that to really capture the benefit of cloud, we have to develop our security technology, and move past that reliance on physical boundaries and those commodity-type applications to really meet the targets and get the full cost benefits.
Moderator:
Great, thank you. Here's a question for any of our panelists today, and it's a question we've seen a lot at cloud-based discussions. What are the implications of cloud storage for records management and electronic discovery? And I'll open it to anybody that wants to step into this territory.
Van Ristau:
This is Van. I'll attempt that. Interestingly enough, when one of our vendor partners first presented their cloud strategy to me about two and a half years ago, that was my first question. You've got, for the federal government anyway, the National Archives and Records Administration's Electronic Records Archive (ERA), and surrounding that, a whole host of regulations related to the storage of public records. Then with respect to electronic discovery, there's a whole body of law and case law, especially the First Circuit here in the D.C. area, around electronic discovery. So it's a real issue, not so much losing the data as being able to search it, tag it and define it as, for example, a record, and then handle record retention: how long should it be retained, and when should it go to NARA to become part of the archive.
And electronic discovery presents a bit of a problem, in that you would want to use some in-cloud tools to do discovery. You don't want to bring all that data back every time you have to go and search and see what a particular undersecretary said about a topic when the IG wants to know, or when you get a FOIA request. You've got to have the ability to do what we would term electronic discovery; I mean, it's not just in large litigation cases, it's constant for some of these agencies.
So you've got the issues of searching encrypted data at rest, will your tool handle that – the tool that you're going to use for electronic discovery? And that is being worked through, there are some good solutions, but I would say it's something to look at very carefully when you pick the specific storage provider, or infrastructure as a service provider where you're, for example, storing email or storing other documents, is how you can handle that in accordance with your own agency's records management policies, which might have to be adapted in order to accommodate that cloud storage requirement.
I hope that helps somewhat. But it is a very important issue.
Moderator:
Thanks, Van. That's really helpful. Here's another question for Dawn. It says, there were probably some sessions prior to this, but could you give a basic overview of how this could benefit a small Parish, and how it was used?
Dawn Leaf:
Yes. I actually wanted to take this question, because one of the things that we have seen is that state organizations and county organizations really can benefit from the cloud computing model in cases where they're facing budget cuts, or they need to refresh and can't afford to refresh their data centers and their infrastructure, because it's often more straightforward to quickly implement a new application, a service that you'd like to provide, through a cloud computing service provider. I'm not trying to endorse any particular one, so I hesitate to name names. But for example, in Maryland, the Maryland Transportation Department used cloud computing to implement new services, new apps, and they worked very closely with the cloud provider to define the application and to make information available, especially public information that you want to make available anyway.
You can take advantage of the tools that the provider offers, and I'm trying to be generic and not name a particular firm again. But I would point very specifically to the National Association of State CIOs for particular cloud computing cases; and also, I know that within the State of Maryland, Anne Arundel County and the Department of Transportation have some very specific examples of how they used cloud computing to reduce cost.
Moderator:
Great, thank you. Here's kind of a tactical question, but I think a really good one. It says, how is data, such as scanned images, stored using the cloud, and is there a cost to do so? And that's open to anybody.
Van Ristau:
Well, this is Van. I think I can answer that relatively quickly. Usually, it's stored as a BLOB, a binary large object, and there'd be a pointer in your primary relational database to the location of that BLOB. That way, you don't affect performance that much. I hope that answers the question.
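A minimal sketch of that pointer pattern, with a hypothetical URI scheme and table layout, might look like this:

```python
# Sketch of the pattern Van describes: the relational database keeps only a
# pointer to the image, and the binary object lives in cloud object storage.
# The URI scheme, bucket name, and table layout are hypothetical.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE scanned_documents (
        doc_id    INTEGER PRIMARY KEY,
        title     TEXT NOT NULL,
        blob_uri  TEXT NOT NULL   -- pointer to the BLOB in object storage
    )
""")
conn.execute(
    "INSERT INTO scanned_documents VALUES (?, ?, ?)",
    (1, "FY2011 budget scan", "cloudstore://agency-bucket/scans/fy2011.tiff"),
)

# Queries stay fast because the database carries only the pointer; the bytes
# themselves are fetched from object storage (typically a metered, per-GB
# cost) only when a document is actually opened.
uri = conn.execute(
    "SELECT blob_uri FROM scanned_documents WHERE doc_id = 1"
).fetchone()[0]
print(uri)
```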
Moderator:
Great. I think we have time for one more question here, so let's see if we can find a good one. Here's a good question. It says, on behalf of a government organization that has highly sensitive information, it is recommended to wait until a certain rulebook is established before moving to a cloud. In the instance of highly sensitive information, would there be literal insurance around data loss, and specifically around intrusion from others? Is this an additional cost that has yet to be identified?
Dawn Leaf:
Well, this is Dawn again. I asked to take this question, although I do feel like I've monopolized the session a little bit. Obviously, as a former CIO for the Bureau of Industry and Security, which has high-security-impact data, export control data, I feel very close to this question. On the rulebook, to specifically answer your question, I don't think there is a rulebook that will be established. However, there are controls, and requirements to satisfy controls, that are being defined currently by, for example, the Federal Information Security Management Council, and with DHS, and with NSA. For instance, DOD is planning to establish a cloud computing environment to support their high-security-impact systems.
So I think that there will be some very specific guidance, and guidance here means something like a special publication, to be clear, documents that will explain how the controls can be met for high-security-impact systems. And you will see those, for example, from the Federal government through NIST. You will see those posted through the Security Working Group that comes out of the Federal CIO Council. As far as insurance, real quick, from the federal perspective, we've established that we can't delegate responsibility, as you sometimes can in the private sector. I mean, ultimately, the government is responsible. But that is a concept in the private sector: to use the insurance paradigm to help ensure responsibility regarding security requirements and/or a conformity assessment laboratory.
Moderator:
Great. Well, this was a great session. We're running out of time, so I just want to take a step back and thank our speakers for a wonderful presentation, and thank our audience for their participation, especially in the Q&A session; there were some really good questions, and I hope we answered all the questions you had. I'd also like to thank i360.gov, DLT Solutions, Red Hat and NetApp for putting this on today. I want to remind everyone on the call that you'll receive an email tomorrow with a link to the archived version of this webinar for you to view again or pass along to your colleagues. On behalf of all the speakers and myself, thank you for attending today, and this concludes today's webinar.
Additional Resources:
- Time to Get Rolling on Cloud Computing - Whitepaper
- Best Practices for Achieving Migration to a Cloud Model – Archived Webcast
- Budgetary Benefits of the Cloud – Archived Webcast