by Jim Warner, Vice President and Head of TM Forum's Digital Media, Advertising & Cloud Computing Program
Cloud computing has reached a fever pitch in 2009, with more and more companies attracted to the notion of outsourcing their storage, processing and applications for a fraction of the cost of upgrading an entire data center.
The economics of the cloud model are (in theory) a no-brainer, but what's less clear is how to deal with some very specific challenges: security, reliability and portability.
If you're putting your data on a server somewhere that you don't physically control, that's an obvious red flag. Most security today is based on some sort of physical control: your hardware and software sit in a data center that your employees can keep an eye on. You might even outsource these functions, but even then your data sits on a dedicated server in a data center and you're paying someone to watch over it.
With a cloud computing model, your data could quite literally be sitting on the same server as your competitor's. That is fundamentally different from the physical control most enterprises are used to having over their data. Once you lose that physical control, and your data is traversing a network into a data center in some unknown location, it's a whole new environment that naturally raises huge security issues.
That explains why, even though over half of all enterprises have cloud computing in their plans, only 2 percent are actually implementing it. There's a lot of trepidation, especially on the part of government and larger organizations.
But let me say upfront and for the record, despite numerous concerns and barriers that must be addressed, I firmly believe cloud services will become mainstream.
One large cloud deployment that caught many by surprise was the city of Los Angeles' recent announcement that it would get rid of Microsoft Office entirely and move everything to Google Apps, including Gmail and Google Docs. Obviously this was done for financial reasons--the economic woes of the state of California are no secret--but it raises all sorts of issues, too, not the least of which is that the second largest city in the U.S. by population could now be fair game for hackers.
That may sound harsh, but think about it. What types of information do cities and municipalities keep? Financial records, police records, public safety information. And the email alone that is going into the cloud makes it even more of a concern. It's one thing for you as an individual to be on Gmail; an entire city infrastructure using it is another matter entirely.
I have nothing against Gmail per se, but I do think the jury is still out on exactly how secure and robust it is. This year alone, there have been a couple of very high-profile crashes of Google's mail system, the most recent in September. And I'm not suggesting this is because Google is lax in any way; given its position in the marketplace, it is one of the most tempting targets for hackers around the world. But facts are facts, and this tells me there are still serious concerns for any enterprise looking at ditching its existing 'controlled' environment and moving to the cloud.
Reliability and portability
Security is most definitely the top issue for anyone considering the cloud, but coming in at a close second is reliability. This gets into the whole concept of quality of service, service level agreements, response time and latency. If you have your data sitting down the hall from you or on a virtual private network, reliability isn't going to be a huge concern for you. But if your sensitive information is on the public Internet, which as we all know is a best effort network, all bets are off.
Imagine an airline moving its reservation system to the cloud, only to find it takes forever to make changes for a customer through its website. There could be a significant impact if you move your customer care system to the cloud and response times lag because it's running off a server in another country.
And going back to the Gmail outage a couple of months ago, it's not the only such app to crash. Skype, which is used by millions of people around the world, had a major outage in 2007, and other high-profile applications have had problems as well. In October, T-Mobile's Sidekick service crashed, causing the loss of data for many users. The company has been giving out $100 vouchers to those affected, but when you've lost your information, that amount is cold comfort.
And on the quality side of the reliability issue, you can always turn to the adage "You get what you pay for." You may be saving money by outsourcing, but if down the road the service is lousy and you have outages or security breaches, at some point the savings become secondary when your users and executives are equally annoyed and frustrated with the service.
You have to remember that outsourcing doesn't mean you've washed your hands of responsibility. You still have to manage things. I'm not trying to downplay the cloud, but the truth is it will have trouble living up to the hype in the long term.
The third area, portability, ties in closely with reliability. Right now every cloud operates differently, and you can't easily move data from one provider to another or even back in-house. These are very proprietary silos, and as a user your hands are tied once you've committed to a provider. It's the services version of vendor lock-in.
There's a lot of interest in being able to use different cloud providers for different applications but then being able to pull it all together somehow, which is not possible today. Some companies may also want to have a public cloud providing redundancy or backup for a private cloud, or a public cloud providing this for another public cloud.
Today this flexibility simply doesn't exist, which severely limits companies' options.
Getting from hype to reality
Today, cloud computing is more hype than reality, and we hope to turn that around. TM Forum has always been very good at bringing buyers and sellers--and anyone else in the value chain--together to work through barriers to adoption and other requirements.
For most of our existence working with value chain partners, the buyer has been the telecom operator; today with cloud computing it's actually the enterprise users who sit in the buyer's seat. Our ecosystem turns things a bit on their head: service providers, along with technology suppliers, are in the seller role, and enterprise customers are leading the way. They are essentially saying that cloud computing services look very promising, but until barriers to adoption--such as security, reliability and portability--are adequately addressed, they aren't going to jump in with both feet.
Our role at TM Forum is as the facilitator to ensure that the barriers to widespread adoption of cloud computing get removed and that the service becomes a successful commercial reality.
At Management World Americas in Orlando, we'll be featuring an all-day Cloud Stream on Wednesday, Dec. 9 that will explore the possibilities and benefits companies have within the different cloud service models (IaaS, PaaS and SaaS), security, open standards, and the role communications service providers can play in ensuring the ecosystem meets customers' needs and expectations. Attendees will also hear from enterprise buyers about the cost savings, cost avoidance and growth opportunities that cloud-based services can deliver, and the resulting enterprise strategies for the new millennium.
Join us for these discussions as well as a Catalyst demonstration of cloud computing in action, an executive cloud roundtable and many more sessions and activities that will address the reality of cloud computing.
Jim Warner is Vice President and Head of the Digital Media, Advertising & Cloud Computing Program for TM Forum.