Saturday, October 24, 2009

Windows Azure and Cloud Computing Posts for 10/21/2009+

Windows Azure, Azure Data Services, SQL Azure Database and related cloud computing topics now appear in this weekly series.

•• Update 10/24 and 10/25/2009: David Linthicum: Cloud data integration issues; Aaron Skonnard: Building a cloud app podcast; Steve Nagy: Overcoming objections to developing with Azure; Kevin Hoffman: AtomPub’s high overhead; James Hamilton: Networks are in my Way; Patric McElroy: SQL Azure performance and SLA; Bob Sutor: Who is the user for cloud computing?; and a few more. 
• Update 10/23/2009: David Lemphers: Designing high-performance Windows Azure Storage; Steve Marx: Wants help improving http://dev.windowsazure.com; Msdev.com: 36 Windows Azure videos for developers; Chris Hoff: Can we secure the cloud?; Lori MacVittie: The Cloud Is Not A Synonym For Cloud Computing; Bill Kallio: Setting Queue Visibility Timeouts; Thomas Claburn: Intel CIO predicts PHR to sell PCs; Ramaprasanna: Get started with Windows Azure quickly; and many more.

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

  • Azure Blob, Table and Queue Services
  • SQL Azure Database (SADB, formerly SDS and SSDS)
  • .NET Services: Access Control, Service Bus and Workflow
  • Live Windows Azure Apps, Tools and Test Harnesses
  • Windows Azure Infrastructure
  • Cloud Security and Governance
  • Cloud Computing Events

To use the above links, first click the post’s title to display the post as a single article; the links then navigate to the section you want.

Cloud Computing with the Windows Azure Platform was published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.

You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:

  • Chapter 12: “Managing SQL Azure Accounts, Databases, and DataHubs*”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available from the book's Code Download page.
* Content for managing DataHubs will be added when Microsoft releases a CTP of the technology.

Off topic: OakLeaf Blog Joins Technorati’s “Top 100 InfoTech” List on 10/24/2009.

Azure Blob, Table and Queue Services

• Bill Kallio explains how to use Azure Queues and set visibility timeouts during polling in his Azure - Setting Visibility Timeout For Polling Queues post of 10/23/2009:

… This past week my team and I ran into an interesting issue with the Windows Azure StorageClient that I'd like to bring up in case others run into it in the future. Perhaps I can save someone some time. Here is some quick background:

The MessageQueue class is an abstract class included in the StorageClient sample project for interacting with Azure Storage Queues. Additionally, the QueueRest class inherits from the abstract MessageQueue class and provides the implementation used when working with Azure Storage Queues.

One of the key methods in the QueueRest class is GetMessage. This method pulls a message off the queue, and optionally allows the developer to specify a visibility timeout for the message. The visibility timeout just lets the queue know how long this message should be unavailable to other processes requesting messages from the queue.

Additionally, the QueueRest class implements an easy-to-use toolset for polling a queue for messages. This works well with the Azure architecture, where you have a set of Worker Roles (you can think of them as Windows Services) waiting for work to arrive in the queue. When a message enters the queue, the worker will pick it up and process it. This architecture takes advantage of the scalability of Azure. Do you have a lot of work backing up in your queue? Just spawn more worker instances to process the work! …

Here is a fancy chart I spent hours on to better visualize this process:

Bill provides sample code and descriptions for setting the visibility timeout during polling.
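
Until you get to Bill’s post, here is a minimal polling sketch in the spirit of his sample, written against the 2009 StorageClient sample library. The configuration helper, the GetMessage(int) overload, and the queue name are assumptions based on that sample’s API rather than code from Bill’s post, so verify them against your copy:

```csharp
// Polling sketch against the StorageClient sample library
// (Microsoft.Samples.ServiceHosting.StorageClient). Member names such as
// GetDefaultQueueStorageAccountFromConfiguration and GetMessage(int) are
// assumptions based on the 2009 sample; check them before relying on this.
using System;
using System.Threading;
using Microsoft.Samples.ServiceHosting.StorageClient;

class QueuePoller
{
    static void Main()
    {
        StorageAccountInfo account =
            StorageAccountInfo.GetDefaultQueueStorageAccountFromConfiguration();
        QueueStorage storage = QueueStorage.Create(account);
        MessageQueue queue = storage.GetQueue("workitems");
        queue.CreateQueue(); // no-op if the queue already exists

        while (true)
        {
            // Request a 30-second visibility timeout: the message stays
            // invisible to other consumers while this worker processes it.
            Message msg = queue.GetMessage(30);
            if (msg == null)
            {
                Thread.Sleep(TimeSpan.FromSeconds(5)); // back off when idle
                continue;
            }

            ProcessMessage(msg);      // must complete within the timeout,
            queue.DeleteMessage(msg); // or the message becomes visible again
        }
    }

    static void ProcessMessage(Message msg)
    {
        // Application-specific work goes here.
    }
}
```

The point Bill’s post drives home is matching the visibility timeout to your worst-case processing time: too short and another worker grabs a message you’re still processing; too long and a crashed worker delays redelivery.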

David Lemphers’ Designing a High Performance Windows Azure [Storage] Service! post of 10/23/2009 recounts:

One thing that popped up this week [in a trip to the UK] was the problem case of:

“I have a service that has millions of users, and each of these users needs to store data in my service, do I just create a new storage account for each user?”

It’s a great question, and the answer is critical to designing a solution that exploits the design of our Storage Service.

So first of all, the best way to think about a storage service account is not in terms of a “user” account, but more as a storage endpoint. What I mean by that is, creating a new storage account per user is similar to creating a new database server per user in RDBMS-type systems, rather than partitioning your data inside the data store.

So what design pattern do you apply to this kind of problem?

Firstly, the Storage Service has an optimum account size per service of ~10.

So say I’m building a system that needs to store photos for millions of customers. I would first ensure my front end had the ability to uniquely identify each customer, so I would use something like LiveID, forms based auth, or pretty much anything where you can get a customer’s unique ID.

David then goes on to recommend how to set up backend storage endpoints such that:

[Your] application is not only partitioning the data at the blob container level for each user, but you[‘re] distributing the number of customers across a collection of accounts, which gives you the best performance design.
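
To make that pattern concrete, here is a hypothetical sketch: hash each customer’s unique ID across a small pool of storage accounts (David cites an optimum of ~10 per service) and give each customer a blob container of their own. The account names and helper class are invented for illustration, not taken from David’s post:

```csharp
// Hypothetical mapping of customers to storage endpoints: ~10 accounts
// per service, one blob container per customer inside the chosen account.
using System;

static class StorageEndpointMap
{
    static readonly string[] Accounts =
    {
        "photos01", "photos02", "photos03", "photos04", "photos05",
        "photos06", "photos07", "photos08", "photos09", "photos10"
    };

    // Deterministically assign a customer (LiveID, forms-auth name, etc.)
    // to one account. A production system should use a stable hash such as
    // MD5/SHA-1 of the ID, because String.GetHashCode can vary by runtime.
    public static string AccountFor(string customerId)
    {
        int bucket = (customerId.GetHashCode() & 0x7FFFFFFF) % Accounts.Length;
        return Accounts[bucket];
    }

    // Partition at the container level: container names must be lowercase
    // (and otherwise DNS-safe) in Azure blob storage.
    public static string ContainerFor(string customerId)
    {
        return customerId.ToLowerInvariant();
    }
}
```

Each customer then maps to the same account for the life of the system, and load spreads across all ten endpoints instead of funneling through one.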

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

•• Kevin Hoffman recommends using Binary Serialization and Azure Web Applications in this 10/25/2009 post:

You might be thinking, pfft, I'm never going to need to use Binary Serialization...that's old school. And you might be right, but think about this: Azure Storage charges you by how much you're storing and some aspects of Azure also charge you based on the bandwidth consumed. Do you want to store/transmit a big-ass bloated pile of XML or do you want to store/transmit a condensed binary serialization of your object graph?

I'm using Blob and Queue storage for several things and I've actually got a couple of projects going right now where I'm using binary serialization for both Blobs and Queue messages. The problem shows up when you try and use the BinaryFormatter class' Serialize method. This method requires the Security privilege, which your code doesn't have when it's running in the default Azure configuration.

Kevin shows you how to enable full trust with a minor change to your app’s service definition file. I have the same issue as Kevin about the big-time bandwidth charges and performance hit resulting from wrapping data in the RESTful AtomPub XML format.
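
Kevin’s change amounts to requesting full trust in the service definition; in the CTP-era schema this was the enableNativeCodeExecution="true" attribute on the role element of ServiceDefinition.csdef (verify against your SDK version). With that in place, a binary round-trip for blob or queue payloads looks something like the sketch below; the OrderBatch type is invented for illustration, and keep in mind that queue messages were capped at 8 KB in the 2009 CTP:

```csharp
// Sketch of binary-serializing an object graph for Azure blob or queue
// payloads. Requires full trust: BinaryFormatter.Serialize demands a
// security permission that the default Azure trust level does not grant.
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

[Serializable]
class OrderBatch // illustrative payload type, not from Kevin's post
{
    public Guid BatchId;
    public decimal[] LineTotals;
}

static class BinaryPayload
{
    // Returns a compact byte[] to store in a blob or enqueue as a message,
    // in place of a much larger XML (e.g., AtomPub-wrapped) representation.
    public static byte[] ToBytes(object graph)
    {
        using (MemoryStream ms = new MemoryStream())
        {
            new BinaryFormatter().Serialize(ms, graph);
            return ms.ToArray();
        }
    }

    public static object FromBytes(byte[] payload)
    {
        using (MemoryStream ms = new MemoryStream(payload))
        {
            return new BinaryFormatter().Deserialize(ms);
        }
    }
}
```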

David Linthicum warns that Data Integration is a Huge Issue for Cloud Computing Migration in his 10/24/2009 post to ebizQ’s Leveraging Information and Intelligence blog:

Loraine Lawson has done a great job in providing further analysis of Kevin Fogarty’s Computerworld article on cloud computing reality checks.

Loraine points out some of the key issues of the article, including:

"Cloud platforms are all different, so migration, support, cost and capacity issues also will be different according to platform."

This is what I think surprises most people about integration between the enterprise and a cloud provider. The APIs, or other points of integration that the cloud computing providers offer are all very different in terms of functionality and quality of the service.

"You may find existing integration points break when you try to move one end point into the cloud - and odds are 10 to 1 your legacy systems are deeply integrated with many, many other on-premise systems."

Integration is almost always a complex problem. Indeed, I wrote a book on the topic, and if you think this is about linking your CRM system with a cloud provider and you're done, you have another thing coming. This is about understanding the data you're integrating, including data latency, data governance, semantic mediation, etc. In for a penny, in for a pound. …

Dave continues with an analysis of licensing and EULA issues.

Murty Eranki’s What type of performance is guaranteed with SQL Azure? thread of 10/22/2009 asked:

Does a database instance with 10 GB … not require compute power? What type of performance is guaranteed with SQL Azure?

Microsoft’s Patric McElroy replied on 10/24/2009 with the following details of current and future Azure SLAs:

We currently do not provide a performance SLA for SQL Azure although this is something that we are looking into. 

Although we run on "commodity hardware" in the data center, it is pretty powerful and the throughput and response rate for most customers has been very good.  As mentioned on another thread, we continue to refine our throttling mechanisms to make sure we are providing the highest level of performance while still being fair across tenants and protecting the health of the overall system.  This is a delicate balancing act and I'm sure we'll continue to refine this based on feedback from customers for some time to come.

Early adopters have also found that SQL Azure allows them to provide a data tier to their applications that leverages scale-out - of relational storage and query processing.  For certain applications, this can have dramatic effects as you harness the power of 10's of CPUs in parallel to service the needs of your app.  For an internal partner running on this same infrastructure they were able to take end user queries down from 10's of minutes to 10's of seconds - quite dramatic for that app.

• David Nichols presented Life on the Farm: Using SQL for Fun and Profit in Windows Live to The 3rd ACM SIGOPS International Workshop on Large Scale Distributed Systems and Middleware (LADIS 2009) on 10/11/2009. The slides for his presentation are here.

Dave’s first slide asks “What is Windows Live?” His answer:

• [It’s] Not:
– Bing – search
– Azure – cloud computing platform
– MSN – News and entertainment portal
• Windows Live is
– Mail (Hotmail)
– Instant messaging
– Photo and file sharing
– Calendar
– Social networking
– Blogging
– File sync

He continues with an analysis of how the Windows Live group addresses issues with using SQL Server for applications requiring high availability and elasticity. Many of these issues (and their solutions) are common to SADB.

Dave “is a software developer in the Windows Live group at Microsoft, working on large-scale storage systems. He came to Microsoft by the acquisition of PlaceWare, Inc. where he was a co-founder and principal architect. The PlaceWare web conferencing product became Microsoft Office Live Meeting.”

My illustrated SQL Azure October 2009 CTP Signup Clarification post of 10/22/2009 explains how to subscribe to the October CTP without copying your invitation code GUID somewhere.

Salvatore Genovese announces SplendidCRM for Microsoft Windows Azure in this 10/20/2009 press release, which includes a link to a live Azure demo:

SplendidCRM, which provides CRM solutions for open-source use, has entered the cloud with its release of SplendidCRM 4.0 Community Edition. This release has been specifically updated to run in Microsoft's Windows Azure Platform (http://www.microsoft.com/azure/). SplendidCRM can be installed to run just the database in SQL Azure, with the web application running locally, or to run the database in SQL Azure and the web application in Windows Azure. With this new capability our customers will be able to minimize the cost while maximizing the reliability of their customer data. A live Azure demo is available at http://splendidcrm.cloudapp.net.

SplendidCRM continues to evolve with the incorporation of the Silverlight 3 Toolkit to replace the previous flash-based and hand-made charts. “We continue to pioneer the use of XAML-only graphics as a means to promote rapid application development”, noted Paul Rony, President and Founder of SplendidCRM. “Implementation of a dashlets model will allow each user to customize their home page experience with the new Silverlight charts.”

“In addition, the SplendidCRM query engine has been optimized to handle millions of records using our custom paging system. This custom paging also allows SplendidCRM to be very bandwidth efficient, which is especially important in the SQL Azure environment because of the bandwidth related charges.”

Microguru Corporation announces their free Community Edition license for the Gem Query Tool for SQL Azure:

Gem Query Tool is a lightweight SQL query tool specifically designed for use with SQL Azure, Microsoft's new database offering in the Cloud. Gem Query provides an intuitive user interface to connect to and work with SQL Azure databases. Gem Query Tool supports execution of any DDL and DML script supported by SQL Azure. To facilitate authoring of SQL queries, Gem Query Tool for SQL Azure displays tables and columns in your database.

Required Runtime: Microsoft .NET Framework 3.5 SP1

Gem Query Tool for SQL Azure - Community Edition License: Free to use for any legal purpose. This software may not be reverse engineered and may not be used as a basis for creating a similar product. The user assumes full responsibility when using this product. No warranties, explicit or implicit, are made with respect to this product.

Gem Query Tool - Main Window

<Return to section navigation list> 

.NET Services: Access Control, Service Bus and Workflow

No significant new posts on this topic today.

<Return to section navigation list> 

Live Windows Azure Apps, Tools and Test Harnesses

Aaron Skonnard contributed .NET Rocks’ show #492, Aaron Skonnard Builds a Real Cloud App, on 10/22/2009:

Aaron Skonnard talks about his experiences building a real application in the cloud.

• Thomas Claburn reports “Paul Otellini sees businesses moving to replace their aging hardware and promise in moving to a more distributed approach to healthcare” in his Web 2.0 Summit: Intel CEO Expects PC Sales Surge post of 10/22/2009 to InformationWeek’s Healthcare blog:

… In an interview with Web 2.0 Summit program chair John Battelle, Otellini said that PC sales are looking up. …

Asked what he made of Microsoft (NSDQ: MSFT)'s shift from loathing to loving cloud computing, Otellini said that from a hardware perspective, cloud computing isn't much of a change. "I like [Larry] Ellison's definition of the cloud," he said. "He said there's nothing new here. You still have servers, networks, and clients. What's different is the use model." …

Otellini also expressed confidence in the market for healthcare IT, noting that Intel has partnered with GE to focus on home healthcare. "Let's keep people [in need of medical treatment] at home longer," he said, noting that home care represents the lowest cost to society. "We're developing a family of devices to allow that," he said, citing video conferencing and intelligent medication systems as examples.

Healthcare, he said, needs to shift from a centralized model to a distributed one. …

Steve Marx asks What Makes a Great Developer Website? Help Me Improve http://dev.windowsazure.com! in this 10/22/2009 post to his Cloud Development blog:

Did you see the new http://windowsazure.com?  I think the site has a much cleaner look and makes it easier to find what you’re looking for.

http://dev.windowsazure.com takes you directly to the Windows Azure developer site, which is the place developers go for information about Windows Azure.

My responsibility is to play curator for that site.  I’m going to try to find the best content available and organize it in a way that developers can find the information they need as quickly and easily as possible.  I’d like your help doing that.

Three questions every product website should answer

When I’m investigating a new development technology, these are the questions I immediately ask (and the time I’m willing to spend answering each):

  1. What is this? (30 seconds)
  2. Is it useful to me? (5 minutes)
  3. How can I get started? (15 minutes)

I’d like http://dev.windowsazure.com to deliver great answers to these questions for a developer audience and then provide organized, deeper content for people who decide to invest their time learning the platform.

I’d like to see more live Windows Azure and SQL Azure demo applications along the lines of those produced by Jim Nakashima (Cloudy in Seattle) during Windows Azure’s early days.

The Msdev.com site offers 36 current and future Windows Azure Training Videos for developers. Future segments are scheduled for release in November and December 2009.

Robert Rowley, MD reports that EHR use and Care Coordination improve health outcomes for Kaiser Permanente in this 10/22/2009 post to the PracticeFusion blog:

A recent report out of the Kaiser system shows how EHR use, combined with Care Coordination, improves chronic disease management. The 5-year project (published in the British Medical Journal) showed that specialists (nephrologists) can improve the health outcomes of high-risk kidney disease patients by using their EHR to identify patients at-risk for deterioration, and proactively consulting on these cases remotely, making recommendations to their primary care physicians for medication management or requesting referrals. Chronic kidney disease contributes to a significantly higher cost of health care, and good management is a PQRI quality metric.

Can this experience be generalized outside the Kaiser system? What lessons can be learned here? One of the healthcare-delivery system characteristics of Kaiser is that it is a very large, closed, multi-specialty group practice, with a finite well-defined patient population assigned to this group. Patients “belong” to the entire team – unsolicited consultation from specialists using EHR tools can occur without requiring specific patient permission for care from “outside” providers. Within the group, there is no accusation that such unsolicited consultation is simply “trolling for additional business,” which is the claim in an uncoordinated fee-for-service private setting outside their system.

Clearly, then, reproducing the Kaiser experiment of using EHR tools to proactively consult on at-risk patients remotely would be difficult to achieve outside a coordinated-network setting. Large group practices, and at-risk IPAs that are delegated to manage the care of their insurance-assigned patient population – in short, accountable care organizations (ACOs) – are about the only settings that come to mind where such interventions are feasible. …

Dr. Rowley’s EHRs Fare Well in Battle of QI Strategies post of the same date notes that electronic health records are effective Quality Improvement (QI) tools:

… The scientists [Mark Friedberg and colleagues from Brigham and Women’s Hospital and Blue Cross/Blue Shield of Massachusetts] looked at these QI techniques, among others: providing feedback to physicians regarding their performance, distributing reminders to patients and physicians about needed services, making interpreter services available, extending office hours to weekends and evenings, and using multifunctional EHRs.

Multifunctional EHRs were defined as those which included alerts and reminder systems, as well as decision support at the point of care.

The scientists found that practices which routinely used multifunctional EHRs scored significantly higher on 5 HEDIS measures, including screening for breast cancer, colorectal cancer and Chlamydia, and 2 in diabetes care. The improved performance ranged from 3.1% to 7.6%. …

• Alice Lipowicz asserts HHS faces hurdles on electronic exchange of medical lab results in this 10/20/2009 FederalComputerWeek post:

Federal health authorities will face several problems implementing the electronic exchange of patient lab results as part of an electronic health record (EHR) system, members of a federal advisory workgroup said at a meeting today.

The workgroup is a task force of the Health IT Policy Committee that advises the Health and Human Services Department’s (HHS) Office of the National Coordinator for Health information technology. HHS plans to release regulations later this year on how to distribute $19 billion in economic stimulus funding to doctors and hospitals that buy and "meaningfully use" certified EHR systems.

Roughly 8,000 hospital labs and 6,000 independent clinical labs perform three quarters of the lab testing in the United States and some of those facilities have installed interfaces that enable standardized electronic delivery of the results, Jonah Frohlich, deputy secretary of health information technology at California's Health and Human Services Agency, testified before the Information Exchange Workgroup.

Without interfaces in place, many lab results that could be sent electronically are scanned and faxed to physicians, he said. “While approximately one-quarter of physicians nationally have an EHR, many still receive faxed lab results that are either manually entered or scanned into the patient record. This is a limitation of both the lab and EHR industry,” Frohlich said. …

The HealthVault team announces the first issue of HealthVault News for You (consumers) in this 10/21/2009 post:

Each month we'll feature applications that can help you take charge of your health, highlight health tools and tips, and help you get the most out of your HealthVault experience. …

<Return to section navigation list> 

Windows Azure Infrastructure

•• Ping Li asserts The Future Is Big Data in the Cloud in this 10/25/2009 post to GigaOm:

While when it comes to cloud computing, no one has entirely sorted out what’s hype and what isn’t, nor exactly how it will be used by the enterprise, what is becoming increasingly clear is that Big Data is the future of IT. To that end, tackling Big Data will determine the winners and losers in the next wave of cloud computing innovation.

Data is everywhere (be it from users, applications or machines) and as we get propelled into the “Exabyte Era” (PDF), is growing exponentially; no vertical or industry is being spared. The result is that IT organizations everywhere are being forced to grapple with storing, managing and extracting value from every piece of it -– as cheaply as possible. And so the race to cloud computing has begun.

This isn’t the first time IT architectures have been reinvented in order to remain competitive. The shift from mainframe to client-server was fueled by disruptive innovation in computing horsepower that enabled distributed microprocessing environments. The subsequent shift to web applications/web services during the last decade was enabled by the open networking of applications and services through the Internet buildout. While cloud computing will leverage these prior waves of technology –- computing and networking –- it will also embrace deep innovations in storage/data management to tackle Big Data. …

Ping Li is a partner with Accel.

•• Steve Nagy’s “Why Would I Use Windows Azure?” or “Developer Evolution” essay begins as follows:

Foreword: Apologies for the title, I’m still not sure (after completing the entry) what it should be called.

Why would I use the Windows Azure Platform? It’s a good question, one that I’ve had a lot of internal discussions on lately (fellow consultants from Readify). For most it’s quite a paradigm shift and therefore difficult to grok. Many believe that you can only build Azure apps from scratch and that it would be a lot of work to convert an existing product to use the Azure platform. I don’t want to say this is a false statement, but Azure is pretty big, bigger than most people realise.

I Can’t Use Azure Because…

Most of us are only familiar with the Windows Azure component which allows you to host either services or web pages. Most custom applications require a data store and for Microsoft developers, this tends to be SQL Server. Unfortunately there is no one-for-one mapping to easily port your database over to the data stores in the cloud (Windows Azure Storage, SQL Azure, etc). This means people feel resistance when they do consider the Azure services and write-off the whole platform.

When SQL Azure is presented as an option, people tend to pick on the little features that are missing, expecting a direct cloud equivalent for their data. But some things just don’t make sense for SQL Azure. People lose context: SQL Azure is a service. It should be treated like any other service within your architecture. You don’t need to be implementing the next service related buzzword (SOA, SaaS, S+S, etc) to get the benefits of the abstractions provided by services. If you build your services in an autonomous fashion, SQL Azure will fit right in with your story.

Steve continues to deflate other arguments about cloud computing in general and Azure in particular.

• Bob Sutor asks Who is the user for cloud computing? in this 10/24/2009 blog post, which opens:

I think many of the discussions of cloud computing focus too much on the implementation side and not enough on who the potential users are and what will be their needs. Many users don’t have or need a very precise definition of “cloud computing.” Indeed, I think that for many people it simply matters whether their applications and data live on their machines or devices, or if they are run through a browser or reside somewhere out on the network, respectively.

Following are abbreviations of Bob’s six initial categories:

    1. A user of a virtualized desktop on a thin or fat client
    2. A non-technical end user who accesses services through a browser or via applications such as disk backup to remote storage
    3. A “cloud choreographer” who strings together cloud-based services to implement business processes
    4. A service provider who needs to handle peak load demands
    5. A developer who employs dynamic resource allocation in clouds to speed application or solution creation
    6. An IT system administrator who does not build clouds but deploys onto them, probably in addition to traditional managed systems

Bob is Vice President for Open Source and Linux, IBM Software Group, IBM Corporation. He was very active in the development of SOAP and WS-* standards for XML Web services.

• Ramaprasanna offers on 10/23/2009 the shortest How to Get started with Windows Azure post yet:

It is getting simpler.

Firstly you need to have a live ID.

If you don’t have one get one here.

To get a Windows Azure token, all you need to do is to click here and complete the application. It would take approximately a week to get a token. 

Once you receive a token you can redeem it at windows.azure.com.

You can now use the Windows Azure deployment portal to deploy your Azure application in the Microsoft Windows Azure cloud.

The Cloud Computing Use Case Discussion Group posted the 42-page Version 2.0, Draft 2 of its Cloud Computing Use Cases white paper on 10/23/2009. Here’s the TOC:

Table of Contents
1 Introduction
2 Definitions and Taxonomy
   2.1 Definitions of Cloud Computing Concepts
   2.2 Taxonomy
   2.3 Relationships Between Standards and Taxonomies
   2.4 Application Programming Interfaces (APIs)
3 Use Case Scenarios
   3.1 End User to Cloud
   3.2 Enterprise to Cloud to End User
   3.3 Enterprise to Cloud
   3.4 Enterprise to Cloud to Enterprise
   3.5 Private Cloud
   3.6 Changing Cloud Vendors
   3.7 Hybrid Cloud
   3.8 Cross-Reference: Requirements and Use Cases
4 Customer Scenarios
   4.1 Customer Scenario: Payroll Processing in the Cloud
   4.2 Customer Scenario: Logistics and Project Management in the Cloud
   4.3 Customer Scenario: Central Government Services in the Cloud
   4.4 Customer Scenario: Local Government Services in a Hybrid Cloud
   4.5 Customer Scenario: Astronomic Data Processing
5 Developer Requirements
6 Conclusions and Recommendations

Following is a typical Use Case Scenario illustration for what appears to me to be a so-called hybrid cloud:

• Mary Hayes Weier defines Alternative IT in this 10/22/2009 InformationWeek post:

Cloud Computing. SaaS. They're such over-used marketing words that they've become the butt of jokes (Larry Ellison on YouTube, anyone?). But hopefully the hype machine hasn't generated too much noise to drown out the fact that there have been some significant, permanent changes in how CIOs view software. At InformationWeek, we call it Alternative IT.

And our hope is that Alternative IT doesn't become another shallow term that can mean just about anything. (For example, I recently got a press release in my inbox with "SaaS" in the headline, and it turned out to be a Web-based service for storing digital photos. I mean, where will it end?) But we know, from talking to the CIOs who spend billions of dollars a year on IT, that a grinding recession, paired with new choices in terms of online software, mobile computing, outsourcing, open source, and more, has opened the door to alternatives in IT.

In particular, CIOs are rethinking significant parts of their software strategies, considering alternatives to conventional licenses, maintenance, and fee structures, as well as alternatives to lengthy internal development cycles, complex customization, and long global rollouts and upgrades.

This isn't trendy, it's reality. In fact, Bill Louv, CIO at pharmaceutical company GlaxoSmithKline, bristles at the idea that his company is chasing the cloud trend. "The evolution here isn't, 'Gee, let's do something in the cloud or SaaS,'" Louv told me in an interview. "Our Lotus Notes platform was getting to end of life. The question came up innocently that, given we'll have to spend a lot of money here, is there something we can do that's smarter?" What he decided is to move all 115,000 employees worldwide to the online, monthly subscription Exchange and SharePoint offerings.

Jim Miller’s Avanade finds growing Enterprise enthusiasm for the Cloud (see below) analysis of 10/22/2009 begins:

I covered Avanade’s Global Cloud Computing Survey for CloudAve back in February, and took a closer look at the security concerns it highlighted in a related post for this blog.

Avanade re-commissioned the same research firm, Kelton Research, to undertake some follow-up work between 26 August and 11 September, and the responses from over 500 C-level executives show a healthy dose of pragmatism with respect to the Cloud and its associated hype.

In amongst the pragmatism, it was interesting (and pleasing) to see a “320% increase” in respondents planning some sort of deployment. Whilst it’s worth noting that this increase only took the ‘planning to use’ contingent up to the dizzying heights of some 10% of respondents, the figure was a more respectable 23% in the USA. In the same data, companies reporting ‘no plans’ to adopt Cloud tools had fallen globally from 54% to 37%. That’s interesting.

Also interesting was the relatively small impact of the economic situation upon Cloud adoption, with only 13% suggesting it had ‘helped’ adoption plans and 58% reporting ‘no effect.’ In my conversations with Nick Carr and others, there’s been an underlying presumption (on my part, as well as theirs) that cost-saving arguments with respect to Cloud Computing would prove persuasive and compelling. It would appear not. This would suggest, of course, that Enterprise adopters are taking to the Cloud for reasons other than the budget sheet… which is hopefully one more nail in the coffin for IDC’s recent ‘advice’.

B. Guptill and M. West deliver Saugatuck’s Microsoft’s Q409 Rollout: How Will This Impact IT Spending in 2010? Research Alert of 10/22/2009. The What Is Happening section concludes:

Few IT vendors would attempt to roll out such a broad range of significant changes in core business, organization and technology in such a short period of time. Saugatuck believes the impact could easily reach well beyond Microsoft, its offerings, and its partners. In the end, it may trigger a significant change in IT spending, and as a result, catalyze and accelerate major change in how IT is bought, used and paid for.

The scale of that investment may be great enough to tip user organizations toward a much more rapid move to Cloud-based IT, including desktop virtualization, investment in netbooks and other new form factors, and a rapid move to SaaS, Cloud infrastructure services, and related Cloud Computing.

For partners, that investment is almost certain to include more and faster moves to expand their business to include SaaS and Cloud, which will require change well beyond adding a new offering or line of business, to include new business organizations, relationships, business models, etc. …

Rich Miller reports Demand Remains Strong in Key Markets for data center real estate in this 10/21/2009 post:

Data center demand is outpacing supply across the United States and pricing remains strong in key markets, according to the latest analysis by commercial real estate specialist Grubb & Ellis.

In a presentation Tuesday at DataCenterDynamics Chicago, Jim Kerrigan of Grubb & Ellis said more than 20 megawatts of data center critical load was leased in the third quarter, the strongest activity thus far in 2009. Kerrigan, the director of the data center practice at Grubb & Ellis, said demand is outpacing supply by “three-fold.”

Chicago is Hottest Colo Market
Kerrigan said downtown Chicago is the hottest colocation market in the country, while northern Virginia is seeing the strongest activity in leasing of wholesale data center space.

Kerrigan noted that the Chicago market is really two markets, with strong demand and limited supply in downtown, while larger blocks of space are available in the suburban market due to new construction. Driven by strong demand from financial trading firms, data center occupancy in downtown Chicago is pushing 95 percent.

In northern Virginia, supply is limited through the remainder of 2009, but several new projects will come online in early 2010, including new data center space from Digital Realty Trust, Power Loft, CoreSite and IT Server. …

Lori MacVittie emphasizes The Cloud Is Not A Synonym For Cloud Computing in this 10/21/2009 post:

… Thanks to the nearly constant misapplication of the phrase “The Cloud” and the lack of agreement on a clear definition from technical quarters I must announce that “The Cloud” is no longer a synonym for “Cloud Computing”. It can’t be. Do not be misled into trying, it will only cause you heartache and headaches. The two no longer refer to the same thing (if they ever really did) and there should be no implied – or inferred - relationship between them. “The Cloud” has, unfortunately, devolved into little more than a trendy reference for any consumer-facing application delivered over the Internet.

Cloud computing, on the other hand, specifically speaks to an architectural model; a means of deploying applications that abstracts compute, storage, network, and application network resources in order to provide uniform, on-demand scalability and reliability of application delivery. …

Jay Fry compares Scientists v. Cowboys: How cloud computing looks from Europe in this 10/21/2009 post:

Is Europe following the U.S. on cloud computing...or vice versa?
While I was over in Berlin for a chunk of the summer, I had a chance to connect up with some of the discussions going on in Europe around cloud computing. It's true, high tech information these days knows no international boundaries. Articles that originally run in North American IT pubs are picked up wholesale by their European counterparts. New York Times articles run everywhere. Tweets fly across oceans. And a lot of technology news is read directly from the U.S. sources, websites, communities, and the like.


However, homegrown European publications are brimming with cloud computing, too. I found references to cloud in the Basel airport newsrack and the Berlin U-Bahn newsstands, all from local European information sources (and some of their reporters are excellent). European-based and -focused bloggers are taking on the topic as well; take a look at blogs like http://www.saasmania.com/ and http://www.nubeblog.com/. Even http://www.virtualization.info/, one of the best news sources on (you guessed it) virtualization, is run by Alessandro Perilli out of Italy. And, of course, there are big analyst contingents from the 451 Group (hello, William Fellows), Gartner, Forrester, and many others in various European enclaves. …

Marketwire discusses Avanade’s Global Study: Recession Has Little Impact on Cloud Computing Adoption; C-Level Executives, IT Decision Makers Report More Than 300 Percent Increase in Planned Use in this 10/21/2009 press release subtitled “Companies Choosing Hybrid Path to Cloud Adoption; U.S. Adoption Faster Than Global Counterparts”:

Cloud computing is no longer just a buzzword. A recent study commissioned by Avanade, a business technology services provider, shows a 320 percent increase over the past nine months in respondents reporting that they are testing or planning to implement cloud computing. This is the first data that indicates a global embrace of cloud computing in the enterprise.

The study also found that while companies are moving toward cloud computing, there is little support for cloud-only models (just 5 percent of respondents utilize only cloud computing). Rather, most companies are using a combination of cloud and internally owned systems, or hybrid approach.

"For very large organizations, the hybrid approach is logical and prudent," said Tyson Hartman, global chief technology officer at Avanade. "No one is going to rip and replace decades of legacy systems and move them to the cloud, nor should they. Additionally, at this stage of cloud computing maturity, not every computing system is appropriate for the cloud." …

<Return to section navigation list> 

Cloud Security and Governance

Tim Green reports on a semi-annual security survey in his Trust the Cloud? Americans Say No Way article of 10/24/2009 for PC World’s Business Center:

Americans don't trust cloud storage for their confidential data, with identity theft ranking as their top security concern, according to a twice-yearly survey by network security consulting firm Unisys.

Asked what they felt about personal data being stored on third-parties' remote computers, 64% say they don't want their data kept by a third party, according to the latest installment of "Unisys Security Index: United States.” …

The U.S. Security Index is based on a telephone survey of 1,005 people 18 and older.

Jaikumar Vijayan’s Microsoft wants ISO security certification for its cloud services post of 10/23/2009 to ComputerWorld’s security section begins:

Microsoft Corp. wants to get its suite of hosted messaging and collaboration products certified to the ISO 27001 international information security standard in an effort to reassure customers about the security of its cloud computing services. The move comes at a time of broad and continuing doubts about the ability of cloud vendors in general to properly secure their services.

Google Inc., which has made no secret of its ambitions in the cloud computing arena, is currently working on getting its services certified to the government's Federal Information Security Management Act (FISMA) standards for much the same reason.

It's unclear how much value customers of either company will attach to the certifications, particularly because the specifications were not designed specifically to audit cloud computing environments. Even so, the external validation offered by the standards is likely to put both companies in a better position to sell to the U.S. government market.

Windows Azure already has ISO/IEC 27001:2005 certification. See my Microsoft Cloud Services Gain SAS 70 Attestation and ISO/IEC thread of 5/29/2009 in the Windows Azure forum regarding Windows Azure ISO/IEC 27001:2005 certification:

Charlie McNerney's Securing Microsoft’s Cloud Infrastructure post announces:

Independent, third-party validation of OSSC’s approach includes Microsoft’s cloud infrastructure achieving both SAS 70 Type I and Type II attestations and ISO/IEC 27001:2005 certification. We are proud to be one of the first major online service providers to achieve ISO 27001 certification for our infrastructure. We have also gone beyond the ISO standard, which includes some 150 security controls. We have developed 291 security controls to date to account for the unique challenges of the cloud infrastructure and what it takes to mitigate some of the risks involved [Emphasis added].

Charlie is GM, Business & Risk Management, Microsoft Global Foundation Services.

• Rafe Needleman explains in his Reporters' Roundtable: The Dangers of cloud computing post of 10/23/2009 to CNet News:

This week we are covering the dangers of cloud computing. Get it? With the major loss of consumer data for the Sidekick smartphone users -- the Sidekick is made by Danger, a Microsoft company -- the whole idea of "cloud" safety was brought front and center for consumers. Businesses, likewise, are wondering if they are exposed to similar risks when they put their apps and data in the cloud.

Can we trust the cloud?

Our guests to discuss this topic are Stephen Shankland, CNET Senior Writer and author of our Deep Tech blog, and our special expert guest is Christofer Hoff, author of the Rational Survivability Blog, which is about this very topic. Hoff is director of cloud and emerging solutions at Cisco, so has a vested interest in keeping the cloud safe and profitable. [Emphasis added.]

• John Pescatore asserts Risk Is Just Like Obscenity in his 10/23/2009 post from the Gartner Symposium in Orlando, FL:

Yesterday at our last security session at Gartner’s annual Symposium, I chaired a debate called “Is Government Regulation Required to Increase Cybersecurity?” The panelists were Gartner analysts French Caldwell, Paul Proctor and Earl Perkins. Basically, I was against government regulation and those three were for it.

Essentially, French felt regulation done right was needed and would increase cybersecurity. Earl said that capitalism had no conscience and regulation is always needed to inject that, security no different. Paul’s position was that regulation was needed to get management to pay attention.

My position is that regulation around cybersecurity can’t be done right, hasn’t and won’t inject security, and only causes management to pay attention to compliance not security. The difference is critical – government regulations can only work when something is stable enough for slow moving legislators to write regulations that can lead to some audit against some stable standard. Information technology is definitely not stable – software engineering is still an oxymoron. Most everyone agreed, and said that’s why the focus of legislation should be around “risk” not technology mandates.

I left the conference audience with my prediction: risk is pretty much like obscenity. It is impossible to define, but we all know it when we see it. But we all see it differently. Legislation around obscenity has a long torturous history of failing – especially where technology is involved. And technology is at the heart of the cybersecurity issue – that’s the cyber part.

My prediction is that any legislation in the next 5 years trying to mandate cybersecurity levels will be as completely ineffective and money wasting as the V-Chip legislation was in the US in trying to deal with inappropriate content over televisions. I’ve used this analogy before – back in 2001 when the browser industry was trying to claim the use of Platform for Privacy Preferences technology would solve web privacy issues, I wrote a Gartner research note “P3P Will Be the V-Chip of the Internet.” That proved to be pretty dead on.

Chris Hoff (@Beaker) asks Can We Secure Cloud Computing? Can We Afford Not To? in this 10/22/2009 “re-post from the Microsoft (TechNet) blog I did as a lead-up to my Cloudifornication presentation at Bluehat v9 [that] I’ll be posting after I deliver the revised edition tomorrow”:

There have been many disruptive innovations in the history of modern computing, each of them in some way impacting how we create, interact with, deliver, and consume information. The platforms and mechanisms used to process, transport, and store our information likewise endure change, some in subtle ways and others profoundly.

Cloud computing is one such disruption whose impact is rippling across the many dimensions of our computing experience. Cloud – in its various forms and guises — represents the potential cauterization of wounds which run deep in IT; self-afflicted injuries of inflexibility, inefficiency, cost inequity, and poor responsiveness.

But cost savings, lessening the environmental footprint, and increased agility aren’t the only things cited as benefits. Some argue that cloud computing offers the potential for not only equaling what we have for security today, but bettering it. It’s an interesting argument, really, and one that deserves some attention.

To address it, it requires a shift in perspective relative to the status quo. …

Beaker’s Bluehat v9 Session was “Cloudifornication: Indiscriminate Information Intercourse Involving Internet Infrastructure,” described as follows:

What was in is now out.

This metaphor holds true not only as an accurate analysis of adoption trends of disruptive technology and innovation in the enterprise, but also parallels the amazing velocity of how our data centers are being re-perimeterized and quite literally turned inside out thanks to cloud computing and virtualization.

One of the really scary things that is happening with the massive convergence of virtualization and cloud computing is its effect on security models and the information they are designed to protect. Where and how our data is created, processed, accessed, stored, backed up and destroyed in what is sure to become massively overlaid cloud-based services – and by whom and using whose infrastructure – yields significant concerns related to security, privacy, compliance, and survivability.

Further, the "stacked turtle" problem becomes incredibly scary as the notion of nested clouds becomes reality: cloud SaaS providers depending on cloud IaaS providers which rely on cloud network providers. It's a house of, well, turtles.

We will show multiple cascading levels of failure associated with relying on cloud-on-cloud infrastructure and services, including exposing flawed assumptions and untested theories as they relate to security, privacy, and confidentiality in the cloud, with some unique attack vectors.

I’ll update this post when Chris makes his updated version available.

• Eric Chabrow reports NIST Suspends IT Lab Reorganization and asks Should Computer Security Division Become NIST's 11th Lab? in this 10/22/2009 post:

As the National Institute of Standards and Technology placed a hold on a proposed reorganization of its Information Technology Laboratory (ITL), critics of the plan proposed making the lab's Computer Security Division (CSD) a lab itself.

"We have received expressions of both support and concern from various stakeholders," IT Lab Director Cita Furlani said Thursday in testimony before the House Science and Technology Subcommittee on Technology and Innovation. "We are seriously considering this input and plan to re-evaluate how to ensure that our structure is as flexible and efficient as possible in meeting the many challenges and opportunities ahead.."

Under the proposed reorganization, unveiled in August, the director of the lab's Computer Security Division would have been elevated to a position within the IT Lab director's office, serving as ITL's cybersecurity adviser. The proposal would have encouraged more multidisciplinary collaboration among NIST units in developing cybersecurity programs and guidance, a move some critics saw as weakening the CSD brand. …

• David Navetta’s Legal Implications of Cloud Computing -- Part Three (Relationships in the Cloud) article of 10/21/2009 on the Information Lawgroup site advises:

In the legal world, some take the position that Cloud is no different than “outsourcing”.    Unfortunately, making that comparison reveals a misunderstanding of the Cloud and its implications.  It is sort of like saying that running is no different than running shoes. Like “running,” outsourcing is a general term describing an activity. In this case the activity involves organizations offloading certain business processes to third parties. Cloud computing (like “running shoes”) is a “new” method for leveraging existing technologies (and technological improvements that have occurred in the past 20 years) that can be used by outsourcers to provide their services more effectively and cheaply (as running shoes represents a technology that can be used to achieve the activity of running more efficiently).  In other words, one can outsource utilizing a Cloud architecture provided by a third party, or by using a more traditional dedicated third party hosted technology solution. Both are different technologies or methods for achieving the same activity: outsourcing of business processes.

For lawyers analyzing outsourcing to the Cloud the question is whether the technology, operational aspects and various relationships of a given Cloud transaction create new legal issues or exacerbate known legal problems. To illuminate this question, this post explores the relationships that exist between organizations outsourcing in the Cloud (“Cloud Users”) and those providing services in the Cloud. Coincidentally (or maybe not so much) understanding these relationships is crucial for attorneys that need to address legal compliance risk and draft contracts to protect clients entering into the Cloud. …

David’s earlier installments in the Legal Implications of Cloud Computing series are Parts One and Two.

Links to these articles in Part three are broken.

Barbara Darrow lists (and expands on) these Top five IT channel lessons for the quarter in her 10/21/2009 post to the IT Knowledge Exchange:

    1. Fear the cloud
    2. While you’re at it, fear Google
    3. Watch Cisco like a hawk
    4. Keep your eye on M&A
    5. Don’t equate small with easy

<Return to section navigation list> 

Cloud Computing Events

•• James Hamilton recounts his presentation to the Stanford Clean Slate CTO Summit in this 10/24/2009 post:

I attended the Stanford Clean Slate CTO Summit last week. It was a great event organized by Guru Parulkar. … I presented Networks are in my Way. My basic premise is that networks are both expensive and poor power/performers. But, much more important, they are in the way of other even more important optimizations. Specifically, because most networks are heavily oversubscribed, the server workload placement problem ends up being seriously over-constrained. Server workloads need to be near storage, near app tiers, distant from redundant instances, near customer, and often on a specific subnet due to load balancer or VM migration restrictions. Getting networks out of the way so that servers can be even slightly better utilized will have considerably more impact than many direct gains achieved by optimizing networks.

Providing cheap 10Gbps to the host gets networks out of the way by enabling the hosting of many data intensive workloads such as data warehousing, analytics, commercial HPC, and MapReduce workloads. Simply providing more and cheaper bandwidth could potentially have more impact than many direct networking innovations. …

David Pallman will join Aaron Skonnard, Simon Guest, Michael Stiefel, and Chris Pels as hosts of PDC 2009’s Birds of a Feather (BOF) @Lunch Cloud Computing Session on Wed 11/18 according to his Will Cloud Computing Change Your Life? at PDC post of 10/23/2009.

• Randy Bias “gave a short, 5-minute ‘lightning talk’ at Cloud Camp in the Clouds” according to his State of the Cloud – Cloud Camp in the Clouds post of 10/23/2009:

Here is a recreation of the talk including my original deck, which has my fonts (the perfectionist in me, I can’t help it), and a professional (sort of) quality voice-over.  I’d also like to thank the Slideshare folks whose audio synchronization tool was a cinch to use.

• Eric Norlin’s 19 Days to Getting Ahead of the Curve - Free Tix Here post of 10/23/2009 promotes the Defrag 2009 Conference being held on 11/11 to 11/12/2009 in Denver, CO:

Defrag is now officially less than three weeks away, and I’m starting to feel like a kid on Christmas eve.

But not so much the kid that I miss interesting developments - like this piece from the Gartner Symposium about IT falling “behind the curve.” Gartner argues that there’s a “fundamental mismatch” between what IT is good at and what is happening on the internet. The result is IT departments everywhere losing a grip on what’s actually going on. The “digital natives” are on iPhones, google, facebook, twitter, etc — whether you even KNOW it or not (forget liking). You no longer have 18 months to roll out a collaboration suite. People are routing around the steady pace of the IT department and “rolling their own.”

This is all really just a continuation of what happened when Salesforce.com first launched. IT people everywhere were simply STUNNED by the idea that Billy Joe down in business unit X had simply *bought himself* a license, uploaded sales data, and started using a service that no one had vetted OR approved. Guess what? That’s now everything. And it’s exploding. And it’s only gonna get worse.

The big question in the SaaS world for the last few years has been “how to keep control” - where control means compliance, security, auditing, etc. You can just feel that wave of compliance/audit/security sweeping toward all things in the social computing world (it may be the thing that pushes us into the trough). Look for it next year.

See, and that right there is why if you’re on the fence about Defrag, you should hop off and come join us. Because if you’re anywhere CLOSE to thinking about these issues (and how they intersect with so many other technology touch points), then you need to start seeing what’s coming next. …

• Joe McKendrick reports Anne Thomas Manes: SOA can be resurrected, here's how for ZDNet on 10/23/2009 from this year’s International SOA Symposium in Rotterdam:

…[T]he prevailing theme is “Next-Gen” SOA, in which we see service-orientation emerge from its bout with the skeptics to take a stronger role within the enterprise.

Thomas Erl, the conference organizer and prolific author on SOA topics, launched the event, noting that we are moving into a period of Next-Generation SOA, with the foundation of principles and practices being laid within many enterprises.

Next up was Anne Thomas Manes of Burton Group, who declared in a post at the beginning of the year that “SOA” — at least as we knew it — was “dead.” However, the second part of Anne’s post was “Long Live Services,” which is the theme that she picked up on in her keynote address.

“Business wasn’t really interested in buying something called ‘SOA,’” she declared, adding that in her own research, fewer than 10% of companies have seen significant business value in their efforts.

However, that is not to diminish the importance of service oriented architecture. “The average IT organization is in a mess,” she says. “The average business has 20 to 30 core capabilities. Why do they need 2,000 applications to support those 20-30 capabilities?” …

Joe and Anne are signatories of the SOA Manifesto; its almost-final version and Guiding Principles appeared on 10/23/2009.

Barton George reviews Gartner’s Bittman: Private Cloud’s value is as Stepping Stone presentation at Gartner’s IT Symposium (see below) in this 10/23/2009 post:

Yesterday Gartner distinguished analyst Tom Bittman, who covers cloud computing and virtualization, posted some thoughts and observations from the Gartner Symposium in Orlando.

Based on Tom’s observations, private cloud (however defined) seems to have captured the hearts and minds of IT. Before he began his talk on virtualization he did a quick poll asking how many in the audience considered private cloud computing to be a core strategy of theirs. 75% raised their hands. While not overly scientific, that’s a pretty big number.

… Tom sees private cloud’s value as a means to an end and concludes his post by saying

“The challenge with private cloud computing, of course, is to dispel the vendor hype and the IT protectionism that is hiding there, and to ensure the concept is being used in the right way – as a stepping-stone to public cloud… [italics mine]”

This is where I disagree. I believe that while private cloud can be a path to the public cloud, it can also be an end unto itself. Unfortunately (or fortunately), we will always have heterogeneous environments, and in the future that will mean a mixture of traditional IT, virtualized resources, private clouds and public clouds. In some cases workloads will migrate from virtualization out to the public cloud, but in other cases they will stop along the way and decide to stay.

IT will become more efficient and more agile as the cloud evolves, but there will be no Big Switch; IT will need to manage a portfolio of computing models.

Thomas Bittman’s Talk of Clouds (and Virtualization) in Orlando post of 10/22/2009 from Gartner’s IT Symposium emphasizes attendee preference for private over public clouds:

At this week’s Gartner Symposium in Orlando, there was a noticeable shift in the end user discussions regarding virtualization and cloud computing, and a few surprises:

1) In my presentation on server virtualization on Monday, before I started, I asked the audience how many of them considered private cloud computing to be a core strategy of theirs. 75% raised their hands (I expected maybe one-third). Clearly, everyone has a different idea of what private cloud computing means (or doesn’t), but the fact that so many people have glommed onto the term is very interesting. …

2) My one-on-ones shifted heavily from virtualization toward cloud computing and private cloud computing. I had 18 one-on-ones that discussed server virtualization, and 26 that discussed cloud and private cloud. …

Tom concludes:

The challenge with private cloud computing, of course, is to dispel the vendor hype and the IT protectionism that is hiding there, and to ensure the concept is being used in the right way – as a stepping-stone to public cloud, based on a timing window, the lack of a mature public cloud alternative and a good business case to invest internally.

Andrea Di Maio’s A Day in the Clouds post from the Gartner Symposium on 10/21/2009 begins:

I spent a good part of today at Gartner Symposium in Orlando talking to government clients about cloud computing, moderating a vendor panel and finally running an analyst-user roundtable, all on the topic of cloud computing in government.

Two main take-aways for me:

  • Government clients are confused as to (1) whether vendor offerings really meet their security requirements, (2) which workloads could be easily moved to a public cloud and (3) what’s the roadmap from public to private/community clouds.
  • Some vendors have great confidence that their existing solutions already meet most of the security requirements and claim that there is already a lot of government stuff in the cloud.

Therefore, either vendors are not marketing their success stories effectively or convincingly, or government clients use “security” as a blanket reason not to move to the cloud.

My current reading though – and I know this won’t be welcomed by some – is that most of these conversations are really about alternative service delivery models.

Except in one or two cases, all conversations focused on “cloud computing can save me money” rather than “I need scalability or elasticity”. …

Guy Rosen made a Presentation: Measuring the Clouds to an IGT workshop on 10/21/2009 and includes a link to a copy of his slides:

Following up on the cloud research I’ve been conducting and publishing here, yesterday I presented the topic at an IGT workshop. There was a lot of great discussion on the findings as well as ideas for new angles and fresh approaches to looking at the data. Thanks to IGT’s Avner Algom for hosting the session!

I’ve SlideShare’d the presentation below for my readers’ enjoyment. If anyone is interested in discussing it, feel free to reach out.

IGT is holding its primary cloud computing event of the year, IGT2009 – World Summit of Cloud Computing, on December 2-3. A ton of folks from the industry are attending and I’ll definitely be there. I look forward to meeting fellow cloud-ers. …

You can review his slides here.

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

•• Steve Lesem contends that Salesforce and Box.net: SaaS and Cloud Storage will Transform Enterprise IT in this breathless post of 10/25/2009:

The announcement that Salesforce is integrating directly with cloud-storage provider Box.net is the tip of the iceberg when it comes to the future of the cloud.
TechCrunch explains what Box.net is thinking:

“CEO Aaron Levie says that this is the first step in Box.net's plan to give businesses a secure way to share their files across multiple services on the web. He says that many of the cloud services geared toward the enterprise don't work well together -- oftentimes you'll have to reupload the same content to multiple sites to share or edit it. Box.net wants to help unify these services by serving as the central hub for your uploaded files, which you can then access from these other web-based services. Levie hints that we'll be seeing more integrations with other services in the near future.”

What we are witnessing is the future of enterprise IT infrastructure. We have been talking about programmatic access through RESTful APIs for some time now. This move by Salesforce is an evolutionary step in how enterprise IT will manage its infrastructure - it will be a cross-cloud platform, with applications and open access to the storage cloud of your choice.

Security is not an issue, and the future is about cross-cloud collaboration.

I bet the Salesforce.com/Box.net integration won’t qualify as even an “evolutionary step,” let alone transformative. All cloud storage players offer more or less RESTful APIs, regardless of the overhead cost.
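To make Lesem’s “programmatic access through RESTful APIs” point concrete, here’s a minimal sketch of the kind of REST-style upload call most cloud storage services expose. The endpoint, path layout and auth header are hypothetical placeholders, not Box.net’s (or any vendor’s) actual API:

```python
# Minimal sketch of a REST-style cloud-storage upload.
# Host, path layout and auth scheme are hypothetical placeholders.
import urllib.request

def upload_blob(host, container, name, data, token):
    """PUT a blob to a REST-style storage endpoint; return the HTTP status."""
    url = "https://%s/%s/%s" % (host, container, name)
    req = urllib.request.Request(url, data=data, method="PUT")
    req.add_header("Authorization", "Bearer " + token)  # hypothetical auth scheme
    req.add_header("Content-Type", "application/octet-stream")
    with urllib.request.urlopen(req) as resp:
        return resp.status  # typically 200 or 201 on success

# Example (hypothetical endpoint):
# upload_blob("storage.example.com", "sales", "q3.csv",
#             open("q3.csv", "rb").read(), "ACCESS-TOKEN")
```

The point of Roger’s rejoinder stands either way: calls of roughly this shape are table stakes for every cloud storage player, which is why the integration alone isn’t transformative.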

• Bob Evans reports “Two offshore-services execs and a prominent American CIO plan to open IT-services centers across the U.S. as low-risk and price-competitive alternatives to offshore providers” in his New Tech Firm Hiring Hundreds In U.S. To Take On The World post of 10/22/2009 to InformationWeek’s Global CIO blog:

… Called Systems In Motion, the company's plan is to create a handful of U.S.-based centers offering world-class IT services at competitive prices and with less risk, fewer regulatory obstacles, and a level of flexibility and nimbleness that today's high-change global business environment requires [Systems In Motion link added].

The startup company expects to get up to speed quickly by utilizing leading-edge technologies and infrastructure such as Salesforce.com's Force.com, Amazon Web Services, and the Zephyr Cloud Platform to keep costs low and project cycles short, and to position itself on the front edge of transformative approaches that many enterprises are just beginning to fully comprehend. [Zephyr link added] …

• Jim Liddle reported that GigaSpaces finds a place in the Cloud on 10/23/2009:

[A] new report from analysts The 451 Group outlines the success to date that GigaSpaces has had in the cloud sector. The report notes that GigaSpaces now has 76 customers using its software on cloud-computing platforms, up from 25 on Amazon’s EC2 in February. GigaSpaces has moved its cloud strategy forward in recent weeks, announcing support for deployments on GoGrid and, more recently, tighter integration with VMware that lets GigaSpaces dynamically manage and scale VMware instances so they can participate in the scaling of GigaSpaces-hosted applications.

GigaSpaces also has a number of hybrid deployments, in which the application stack is hosted in the cloud while the data or services remain on premises; several of these have had notable successes.

The GigaSpaces product provides a strong cloud middleware stack that keeps logic, data, services and messages in memory, underpinned by real-time Service Level Agreement (SLA) enforcement at the application level that lets the stack scale up and out in real time based on SLAs set by the business. Because everything is held in memory, it is faster than alternative ways of building enterprise-scale applications in the cloud, and it offers synchronization services that support asynchronous (or synchronous) persistence of data to a database or other persistent store.
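As a rough illustration of the SLA-driven scaling Liddle describes, here’s a minimal sketch of an enforcement loop. The grid interface, method names and thresholds are hypothetical placeholders, not GigaSpaces’ actual API:

```python
# Conceptual sketch of SLA-driven scale-out; NOT GigaSpaces' actual API.
# The `grid` object and its methods are hypothetical placeholders.
import time

SLA_MAX_LATENCY_MS = 50   # business-defined SLA ceiling (hypothetical)
SLA_MIN_LATENCY_MS = 10   # scale back in when comfortably under target

def enforce_sla(grid, poll_seconds=5):
    """Poll observed latency and add/remove in-memory processing units."""
    while True:
        latency = grid.observed_latency_ms()     # hypothetical monitoring call
        if latency > SLA_MAX_LATENCY_MS:
            grid.add_instance()                  # scale out to restore the SLA
        elif latency < SLA_MIN_LATENCY_MS and grid.instance_count() > 1:
            grid.remove_instance()               # scale back in to save cost
        time.sleep(poll_seconds)
```

The design point is that the scaling decision is driven by an application-level metric against a business-set SLA, rather than by raw VM utilization.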

• Tom Lounibos’s What Happens When Vendors Repackage Old Technology and Call it a Cloud Service? post of 10/23/2009 takes on HP and its LoadRunner product:

In an effort to remain relevant, some software vendors take marketing advantage of the newest hot technology fad at the expense of their own customers. Cloud computing (the newest hot trend) has definitely been defined and positioned by some traditional software and hardware vendors’ marketing organizations to meet their specific agendas, which usually means extending the life of an existing product or product line. It has become a virtual “cloud rush” as to how many times “cloud computing” can be mentioned in product and marketing collateral … including collateral for products that were first developed back when Bill Clinton was president!

A great example of this “cloud rush” is Hewlett-Packard with its LoadRunner product. LoadRunner was developed in the early ’90s to help corporate development teams test client/server applications. Over time it became the de facto standard testing tool for most enterprise companies and was priced accordingly: entry-level pricing began at $100,000, and if you needed to simulate thousands of users the cost quickly skyrocketed into the millions of dollars.

Today, HP is attempting to “perfume the pig,” so to speak, by repackaging LoadRunner into a new cloud-based service called Elastic Test. To the uneducated observer it simply looks like a new cloud service for testing web applications. The problem is that it’s the same LoadRunner product built almost 20 years ago to test client/server applications, and it carries a lot of technology baggage with it. Consequently, HP passes much of this baggage along, in the form of costs, to its customer base. For example, an entry-level HP virtual test will take weeks to develop and carries a “starting” cost of $45,000. That hardly lives up to cloud computing’s value proposition of on-demand services that provide ease, speed and affordability.

Tom is CEO of SOASTA, one of the first cloud-based testing services.
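For context, the core of on-demand web load testing is simply driving many concurrent HTTP clients against a target from cloud instances and measuring response times. Here’s a toy sketch of the idea, assuming a placeholder URL; it is not SOASTA’s product or HP’s Elastic Test:

```python
# Toy concurrent load generator illustrating the basic idea behind
# cloud-based web testing; not any vendor's actual tooling.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET = "http://example.com/"   # placeholder target URL
WORKERS = 50                     # simulated concurrent users
REQUESTS_PER_WORKER = 20

def worker(_):
    """Issue a series of GETs and record per-request latency."""
    timings = []
    for _ in range(REQUESTS_PER_WORKER):
        start = time.time()
        with urllib.request.urlopen(TARGET) as resp:
            resp.read()
        timings.append(time.time() - start)
    return timings

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=WORKERS) as pool:
        results = [t for ts in pool.map(worker, range(WORKERS)) for t in ts]
    print("requests: %d  avg: %.3fs  max: %.3fs" %
          (len(results), sum(results) / len(results), max(results)))
```

Spinning such generators up on rented cloud instances, on demand, is what makes per-test pricing possible; that is the economic contrast Tom draws with LoadRunner-style licensing.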

• HP claims a “Quicker Path To Infrastructure and Application Modernization” in its Infrastructure Services Executive Overview publication:

Too many applications. Too much customization. Underutilized server capacity. Operations costs that are spiraling upward. Increases in power and cooling costs.

And that's just the beginning of the challenges.

The HP Adaptive Infrastructure (AI) Services offering can help you overcome these challenges with a new approach to outsourcing that speeds the delivery of next-generation infrastructure benefits. HP Adaptive Infrastructure Services offers a prebuilt infrastructure with a design based on HP Adaptive Infrastructure innovations, delivered through highly automated processes and procedures. You will realize quicker infrastructure and application modernization at reduced risk and cost. Because all assets are owned and managed by HP, you will be able to convert the capital investment associated with an infrastructure buildout into an ongoing operating expense.

This sounds like conventional hosting to me, but the overview doesn’t mention “cloud.”

Steve Lesem asks BMC's Tideway Acquisition: A Stairway to the Cloud? in this 10/21/2009 post:

BMC Software's announcement that it has entered into a definitive agreement to acquire privately held Tideway Systems Limited (Tideway), a provider of IT discovery solutions, can be interpreted as an extension of BMC's commitment to cloud computing.
Here are two important statements from the press release:

1. BMC will deliver unmatched visibility into the data center and rapidly reduce the time and resources required to model, manage and maintain applications and services. This is critical for IT organizations that are transitioning applications and services to cloud computing environments.

2. With the acquisition of Tideway, BMC adds the industry's leading application discovery and dependency mapping capabilities to manage and maintain complex data center environments including distributed, virtual and mainframe IT platforms and further extends its leadership in business service management.

So let's see what this could mean. 

It gives BMC the critical capability to discover and map complex data environments, both physical and cloud-based.

This acquisition also puts BMC in a strong position to build a cloud-based CMDB. While that might not happen right away, it is clearly now a key capability should BMC decide to pursue it. It also allows BMC to build a federated CMDB and to manage the hybrid cloud, private and public, across enterprise and hosted data centers.

The evolution towards cloud-based ITIL continues.

<Return to section navigation list> 
