Tuesday, February 23, 2010

Windows Azure and Cloud Computing Posts for 2/22/2010+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.

 
Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

  • Azure Blob, Table and Queue Services
  • SQL Azure Database (SADB, formerly SDS and SSDS)
  • AppFabric: Access Control, Service Bus and Workflow
  • Live Windows Azure Apps, APIs, Tools and Test Harnesses
  • Windows Azure Infrastructure
  • Cloud Security and Governance
  • Cloud Computing Events
  • Other Cloud Computing Platforms and Services

To use the above links, first click the post’s title to display the post as a single article; the section links will then jump to the topic you want.

Cloud Computing with the Windows Azure Platform was published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.

You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:

  • Chapter 12: “Managing SQL Azure Accounts and Databases”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated in February 2010 for the January 4, 2010 commercial release.

Azure Blob, Table and Queue Services

William Vambenepe’s Square peg, REST hole post of 2/22/2010 observes that the Representational State Transfer (REST) data model isn’t well suited for many IT/cloud management interaction models:

For all its goodness, REST sometimes feels like trying to fit a square peg in the proverbial round hole. Some interaction patterns just don’t lend themselves well to the REST approach. Here are a few [abbreviated] examples, taken from the field of IT/Cloud management:

  • Long-lived operations
  • Query
  • Events
  • Enumeration
  • Filtering
  • Collections
  • Afterlife [of Deletions]

See William’s post for detailed explanations of the examples.

I am not saying that these patterns cannot be supported in a RESTful way. In fact, the problem is that they can. A crafty engineer can come up with carefully-defined resources that would support all such usages. But at the cost of polluting the resource model with artifacts that have little to do with the business at hand and a lot more with the limitations of the access mechanism.

Now if we move from trying to do things in “the REST way” to doing them in “a way that is as simple as possible and uses HTTP smartly where appropriate” then we’re in a better situation as we don’t have to contort ourselves. It doesn’t mean that the problems above go away. Events, for example, are challenging to support even outside of any REST constraint. It just means we’re not tying one hand behind our back.

The risk of course is to lose out on many of the important benefits of REST (simplicity, robustness of links, flexibility…), which is why it’s not a matter of using REST or not but a matter of using ideas from REST in a practical way.

With WS-*, on the other hand, we get a square peg to fit in a square hole. The problem there is that the peg is twice as wide as the hole…

Amen on the WS-* issues, which I call “SOAP Header Soup.”
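William’s first example, long-lived operations, is worth a concrete illustration. The usual RESTful workaround is to return 202 Accepted plus a status resource that the client polls: exactly the kind of artifact that exists only because of the access mechanism. A minimal sketch in Python (the endpoint and JSON fields are hypothetical):

```python
import time

import requests  # third-party HTTP client: pip install requests

BASE = "https://management.example.com"  # hypothetical management API

# Start a long-lived operation. The API can't block for minutes over HTTP,
# so it answers 202 Accepted and points at a status resource instead.
resp = requests.post(f"{BASE}/deployments", json={"image": "web-role-v2"})
assert resp.status_code == 202
status_url = resp.headers["Location"]  # assumed relative, e.g. /operations/1234

# Poll a resource that exists only to track progress -- the "pollution"
# of the resource model that William's post complains about.
while True:
    operation = requests.get(f"{BASE}{status_url}").json()
    if operation["state"] in ("succeeded", "failed"):
        break
    time.sleep(5)

print("final state:", operation["state"])
```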

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

Brendan Cournoyer interviews Brent Ozar and Kevin Kline for his SQL Azure and what it means to performance post of 2/22/2010 to SearchWindowsServer.com:

With Microsoft’s Azure platform now in full effect, one of the questions facing DBAs and developers is “How does it change what I do?”

I spoke with SQL Server MVPs Brent Ozar and Kevin Kline of Quest Software earlier about some of the performance implications for putting databases in the cloud with SQL Azure. The two are set to co-host a free, all-day virtual training session covering general SQL Server performance tuning and troubleshooting on Mar. 3.

“There’s a certain set of skills out there in the marketplace that are ‘evergreen’, and every year there are people that come into the business that haven’t learned those skills. And performance tuning and troubleshooting are at the top of the list,” said Kline about the session. He added that the all-day event gives them time to not only go over the general process points for getting started, but also the specific commands and techniques needed for proper troubleshooting and tuning.

But what about the cloud? Surely, throwing SQL Azure and other cloud-databases into the mix is going to have some implications in regards to how people think about performance. …

Brendan continues with “… some of the key points that database pros should keep in mind.”

Cory Fowler’s Reach for the Sky… Script a Database for SQL Azure post of 2/22/2010 summarizes the scripting process for creating SQL Azure databases:

There are many different ways to get your database up and running in the cloud; for one, you could read up on the SQL Azure Sync Framework in my post titled “Synchronizing a Local Database with the Cloud using SQL Azure Sync Framework”.

However, nothing feels more comfortable to a developer than something that’s familiar. After a little bit of investigating while preparing for my talk at Confoo on SQL Azure, I managed to find a post on the MSDN website that explains what is needed in order to use the Generate Scripts Wizard in SQL Server Management Studio.

Create the Transact-SQL Script
  1. In Object Explorer, right-click the database, point to Tasks, and select Generate Scripts.

  2. In the Script Wizard dialog box, click Next to get to the Select Database step. Select School, select Script all objects in the selected database, and then click Next.

  3. In Choose Script Options, set the following options:

    • Convert UDDTs to Base Types = True
    • Script Extended Properties = False
    • Script Logins = False
    • Script USE DATABASE = False
    • Script Data = True

    SQL Azure does not support user-defined data types, extended properties, Windows authentication, or the USE statement.

  4. Click Next, click Next, and then click Finish. The Script Wizard generates the script. Click Close when the script is completed.

  5. In the generated script, delete all instances of "SET ANSI_NULLS ON".

  6. Each CREATE TABLE statement includes a "WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]" clause. Delete all instances of that clause.

  7. Each CREATE TABLE statement includes the "ON [PRIMARY]" clause. Delete all instances of that clause.

The reason you need to apply these changes to the script is that SQL Azure currently doesn’t support all the features of the current release of SQL Server 2008. There are plans to incorporate some of the features in this outline, including the USE statement. …
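Steps 5 through 7 are mechanical enough to script. Here's a rough Python sketch (my own, not from Cory's or the MSDN post) that applies the same three deletions to a wizard-generated script:

```python
import re

def clean_for_sql_azure(script: str) -> str:
    """Apply the three manual edits from the walkthrough above."""
    # Step 5: delete every SET ANSI_NULLS ON statement.
    script = re.sub(r"^\s*SET ANSI_NULLS ON\s*$", "", script, flags=re.MULTILINE)
    # Step 6: delete the WITH (PAD_INDEX = ...) ON [PRIMARY] index-option
    # clause; a regex copes with whitespace and line breaks inside it.
    script = re.sub(r"WITH\s*\(PAD_INDEX[^)]*\)\s*ON\s*\[PRIMARY\]", "", script)
    # Step 7: delete any remaining ON [PRIMARY] filegroup clauses.
    script = script.replace("ON [PRIMARY]", "")
    return script

# SSMS saves generated scripts as Unicode by default.
with open("School.sql", encoding="utf-16") as src:
    cleaned = clean_for_sql_azure(src.read())
with open("School-azure.sql", "w", encoding="utf-16") as dst:
    dst.write(cleaned)
```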

My Synchronizing On-Premises and SQL Azure Northwind Sample Databases with SQL Azure Data Sync, Using the SQL Azure Migration Wizard v3.1.3/3.1.4 with the AdventureWorksLT2008R2 Sample Database, and SSMS 2008 R2 11/2009 CTP Has Scripting Problems with SQL Azure Selected as the Target Data Base Engine Type posts from January 2010 cover the same or similar ground.

<Return to section navigation list> 

AppFabric: Access Control, Service Bus and Workflow

No significant articles today.

<Return to section navigation list>

Live Windows Azure Apps, APIs, Tools and Test Harnesses

Eric Nelson’s Visual Studio 2010 Conversion Wizard and Azure solutions – do not change target to .NET 4 post of 2/23/2010 warns about a premature change from .NET 3.5 SP1 to .NET 4:

I am currently (Feb 2010) moving lots of Azure samples originally built in Visual Studio 2008 to Visual Studio 2010 RC. I just realised there is a simple mistake folks can fall into. Easy to do, easy to avoid.

When opening an Azure solution originally built in Visual Studio 2008, you will see the Conversion Wizard.

[Screenshot: the Visual Studio 2010 Conversion Wizard]

Once you click Finish you will see this dialogue box with a default of Yes. Do not click Yes! Instead click No.

[Screenshot: the dialogue box asking whether to upgrade the project to .NET 4]

The reason is simple. As of Feb 2010 the Windows Azure Fabric in Microsoft Data Centres is not running the .NET 4.0 RC; it is instead running .NET 3.5 SP1. Therefore you want to stick with .NET 3.5 SP1 for your projects to get the equivalent experience with the development fabric – which means you won’t have access (just yet) to some of the really good stuff such as Entity Framework 4.

End result is a nicely converted solution. …

David Linthicum’s When Your SOA Using Cloud is Too Distributed and Complex post of 2/23/2010 to ebizQ’s Where SOA Meets Cloud blog begins:

Your SOA using cloud computing deals with very fine-grained architectural components, including services, processes, and data, so there is the problem of going a bit too crazy. Take the case of a single application where you might scatter services across four cloud computing platforms, and perhaps leave a few services on-premise. While this seems like something you would not attempt, going forward there will be many instances where the data may live on an infrastructure-as-a-service provider, such as Amazon EC2, while the user interface lives on a remote Web server, and the core services reside on-premise. This is the world we are moving toward.

However, you could find that your architecture has performance and reliability issues because you leveraged too many platforms to host those services. The problem comes into play when you consider that the application is dependent upon every service functioning in order to continue processing, and thus a single platform that's down could stop the entire application from working, depending on how it's designed. In essence, your reliability is really a function of the reliability of all systems, on-premise or cloud-based, that are supporting that application. Consider the number of links in the chain. The more you distribute the application, the more likely you are to have reliability issues. That's just the reality of distributed computing.

Furthermore, you should consider performance. It's well known that SOA services that are too fine-grained have a tendency to cause performance problems, since each service needs to talk to other services through a single interface. The more services, the more communications that go on, and the more the network and the processors get saturated. There are no hard and fast rules around granularity for SOA, or for SOA leveraging cloud computing; it's simply a matter of understanding how your architecture will be affected by more services rather than fewer, and the functional need to break down the services into smaller, more fine-grained services versus more coarse-grained services.

Does this seem logical?
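It does, and the reliability point is just serial-availability arithmetic: when every service must be up for the application to work, the platform availabilities multiply. A quick illustration with made-up numbers:

```python
# Assumed availability for each platform hosting a required service.
# The figures are illustrative, not measurements of any real provider.
availabilities = {
    "IaaS provider (data)": 0.999,
    "remote web server (UI)": 0.995,
    "on-premise core services": 0.999,
    "fourth platform": 0.997,
}

combined = 1.0
for platform, a in availabilities.items():
    combined *= a  # every link in the chain must hold

hours_down = (1 - combined) * 365 * 24
print(f"combined availability: {combined:.2%}")           # ~99.00%
print(f"expected downtime: {hours_down:.0f} hours/year")  # ~87 hours
```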

John McClelland interviews Chris Rolon in an 00:11:04 Neudesic Migrates Quark Promote to Windows Azure video segment of 2/22/2010:

In this interview, Chris Rolon, Architect Evangelist and Employee #1 at Neudesic, talks with Microsoft Partner Evangelist John McClelland, about his latest passion, Microsoft’s Azure offering for cloud computing. Chris discusses his experience migrating Quark Promote to the Azure Services Platform.

Chris describes how Azure storage solved two major design limitations, universal access to Quark Promote custom templates and scalability. Chris also talks about Neudesic's decision to become an early adopter of Azure and his recommendations for maximizing partner relationships with Microsoft.

<Return to section navigation list>

Windows Azure Infrastructure

Eric Beehler claims “The idea of the private cloud provides a lot of promise, but the technology is not cheap. Here's what vendors can actually offer today -- and how much it will cost” in the deck of his Building a Private Cloud article for Redmond Magazine’s March 2010 issue:

The buzz about cloud computing has been deafening for the past few months. Every new startup seems to be using these services, and now cloud computing is even the focus of television commercials during major sporting events. The power of the cloud is undeniable. It has the ability to provide on-demand computing with little up-front investment, but it can be extremely disruptive as well. How does an IT manager approach the cloud for infrastructure and applications?

Although some companies have been moving full-steam into the cloud, most mature IT organizations are just starting to look at the cloud for possible solutions. In a tight economy, IT needs to find new ways to save money and be innovative at the same time. With the entrance of big players like IBM Corp. and Microsoft, the market for cloud technologies now has several competitors vying for space in companies' networks.

Outside the realm of customer- and Internet-facing applications, cloud solutions either extend internal infrastructure into an environment where IT professionals don't have traditional management capabilities, or move the cloud to an internal infrastructure.

IT departments have spent years building rock-solid processes, procedures and expertise to make their data centers and the systems in them reliable and manageable. Now, promises of lower costs and better utilization are calling from the hyper-machine. Can IT departments take the core of cloud computing and provide nimble, real-time responses to the needs of their businesses using the technology they already have? IT pros should take a rational look at the options, cut through the hype and ultimately deploy the cloud effectively in their organizations' infrastructures.

Eric continues with an analysis of “The Private Cloud Today,” “Vendor Assessment: Amazon,” and “Vendor Assessment: IBM.” He writes in his “Vendor Assessment: Microsoft” section:

Microsoft is charging ahead in the cloud-computing space with the rollout of Windows Azure, which is a platform as well as a way to host Windows and SQL Servers in the public cloud. Microsoft made Windows Azure "available free to evaluate" through January, and on Feb. 1 Redmond launched the platform as a paid service.

The current Windows Azure offerings are all public cloud for now, but Microsoft recently announced private-cloud technology based on Windows Azure. For example, "Project Sydney" will enable enterprises to tie together data centers and the public Windows Azure services as part of a hybrid infrastructure, and AppFabric will allow developers to harness cloud services regardless of whether those services are on-premises or hosted in the public cloud.

As of yet, Microsoft hasn't integrated Windows Azure into this hybrid cloud model. AppFabric is already in developer preview, with Project Sydney expected to make its beta debut by the end of 2010.

The Dynamic Data Center Toolkit for Enterprises (DDCT-E) is scheduled to be released during the first half of 2010. It aims to address servers, networks and storage as a single set of available resources in the same pool, thereby reducing idle time. Automation through batch creation and provisioning of VMs is the key to the automation of the Microsoft environment. The focus is on automation of provisioning and proper tracking of those deployed resources. Microsoft provides a self-service portal along with role-based access control. Tracking and chargeback reports are also available. DDCT-E will be free, but it integrates with Windows Server 2008 R2 Hyper-V and Microsoft System Center Virtual Machine Manager 2008, which carry steep price tags. Microsoft finds itself on the cusp of big developments in the private cloud, as well as in hybrid cloud models, but it hasn't done much yet in the way of shipping products.

and continues with “Vendor Assessment: VMware.”

Full disclosure: I am a contributing editor for Visual Studio Magazine, which is a sister publication of Redmond Magazine.

Joe Panettieri reports Microsoft Quietly Researches Hosting Market on 2/23/2010 at the Parallels Summit 2010 conference in Miami, FL:

Microsoft is quietly conducting some informal SaaS and hosting research at the Parallels Summit in Miami. The cloud-centric conference has attracted several hundred service providers and VARs focused on recurring revenue opportunities. Clearly, Microsoft is working hard to connect more closely with summit attendees — many of whom are checking out the Google Apps reseller program. Here’s some perspective.

When I sat for this morning’s conference keynote, I noticed a one-page Microsoft survey on my seat. The survey asks attendees which media sites they use to gather information about the hosting world. As part of the survey, Microsoft highlights three destinations (a website, blog and Twitter feed) to help attendees learn more about Microsoft’s hosting community efforts.

No doubt, Microsoft wants to get the message out regarding Windows Azure (Windows applications in the cloud), SQL Azure (databases in the cloud) and Business Productivity Online Suite (BPOS, including Exchange Online and SharePoint Online, among other options). I don’t have any firm stats measuring Microsoft’s cloud progress, but the company’s decision to spend time surveying the Parallels Summit audience speaks volumes about the software industry.

As Microsoft gathers info, Google is working the crowd. The Google Apps reseller team has prime booth space here at the Parallels Summit. And Google this morning announced that the Google Apps Reseller program has attracted 1,000 partners.

Microsoft, apparently aware of Google’s progress, cut BPOS prices in November 2009.

Are more moves coming? I’ll be looking for more answers here at the Parallels Summit.

“Soma” Somasegar’s Key Software Development Trends post of 2/23/2010 cites “Cloud Computing” as the first of his seven key development trends:

More than ever before, today’s developers are open to considering and using multiple technologies to enable them to build solutions smoothly and deliver them to their customers quickly. There are an increasing number of choices available for developers in terms of programming styles. Our goal is to provide fantastic support for all programming styles within our tools to enable our customers to build great software.

Several trends are emerging within the area of software development. Below are some of the most important trends I’ve been thinking about recently. This list isn’t comprehensive of all software trends, but each one represents an area that Microsoft is currently or will be investing in to bring to our customers.

Cloud Computing

Cloud computing allows companies to leverage just the computing resources they need today, scale up to handle peak loads, and avoid the overhead of managing hardware. Cloud computing levels the playing field for small companies to compete against large, established companies at a reasonable and predictable cost. Windows Server, Windows Azure, SQL Azure, and services such as Windows Live, Office, and Xbox Live are now live in the cloud. Microsoft has committed to bringing the best cloud computing platform and services to the Windows ecosystem. The cloud is just one example of a virtualized computing platform, and the next generation of developer tools must enable developers to build software that deploys and performs well in cloud and other virtual environments. …

Soma is senior vice president of the Developer Division at Microsoft.

John Fontana reports “Microsoft plans to invest heavily in its cloud platform but expects to see little revenue for two to three years, Bob Muglia, the president of the server and tools business, said Tuesday” as a preface to his Microsoft: Cloud revenue to hit in a couple years post of 2/23/2010 to the IDG News Service:

Microsoft plans to invest heavily in its cloud platform but expects to see little revenue for two to three years, Bob Muglia, the president of the server and tools business, said Tuesday.

Muglia also said Microsoft is still waiting for businesses to resume spending on client and server software, and he took a number of swipes at VMware, which Microsoft is battling in the virtualization and cloud markets. He made his remarks during a Webcast from the Goldman Sachs Technology and Internet Conference in San Francisco.

"From the perspective of investment internally, interest from customers and engagement clearly the cloud will be an area of focus," Muglia said. "But in the next two to three years that is not what will drive financial growth in server and tools. It is essentially zero percent of our current operating revenue." …

Jo Maitland delivers more details of Muglia’s presentation in her Microsoft's Muglia brings 'Cloud Computing for Dummies' on stage post of 2/23/2010 to SearchCloudComputing.com:

Bob Muglia, president of the server and tools business at Microsoft, pulled out a Cloud Computing for Dummies book during his keynote at the Goldman Sachs technology conference today, having found it on his admin's desk. He said it was an indicator that the market was growing, but that it still has a ways to go.

"Cloud has a lot of focus right now but will not drive revenue growth over the next two to three years … Windows Server and SQL Server … are the big dogs really driving it," he said, of Microsoft's outlook for 2010.

Cloud not material to revenue for next several years

Muglia was bullish on cloud computing transforming the IT industry over the long term, but said that it will not be financially material (meaning more than $1 billion in revenue) to Microsoft's business for several years.

He said cloud computing is changing the way companies build and buy hardware and write applications. More computers will be bought en masse to run discrete systems, he said. Microsoft used to buy full, pre-configured racks of servers, but it's now buying entire shipping containers of two thousand servers and petabytes of storage at a time. This is rolled into the data center, where the company adds power, cooling and Ethernet, and it's ready to go.

"This is going to have a major impact on the way servers are built and shipped over the long term," he said.

On the software side, Muglia expects to see a large number of applications built on Windows Azure over the next few years, although he didn't give any numbers. According to Goldman's latest report on cloud, 10% of small businesses using cloud development platforms are using Azure, versus Amazon, Force.com or Google.

"We are a complete platform, whereas Amazon is just a raw virtual machine," he said.

That may be so, but Amazon Web Services has 70% of the market overall, according to Goldman's numbers.

Dana Gardner reports Survey: IT executives experimenting with mostly 'private' cloud architectures in this 2/23/2010 post to ZDNet’s Briefings Direct blog:

If you want a realistic view of cloud computing adoption – along with an understanding of what motivates IT executives to invest in the cloud, what concerns remain, and what initiatives are planned – you can’t limit your frame to a single industry. The full picture only becomes clear through a cross section of research, manufacturing, government and education fields.

That’s the approach Platform Computing took at a recent supercomputing conference. The company late last year surveyed 95 IT executives across a number of fields to offer insight into how organizations are experimenting with cloud computing and how they view the value of private clouds. [Disclosure: Platform Computing is a sponsor of BriefingsDirect podcasts.]

The results: Nearly 85 percent intend to keep their cloud initiatives within their own firewall.

“When deploying a private cloud, organizations will need a management framework that can leverage existing hardware and software investments and support key business applications,” says Peter Nichol, general manager of the HPC Business Unit at Platform Computing. “This survey reaffirms the benefits that private clouds offer – a more flexible and dynamic infrastructure with greater levels of self-service and enterprise application support.”

Most organizations surveyed are experimenting with cloud computing – and experimenting is the key word. Eighty-two percent don’t foresee cloud bursting initiatives any time soon. This suggests an appreciation for private cloud management platforms that are independent of location and ownership, and can provide the needed security in a world of strict regulations around transparency and privacy.

Security is chief concern

Forty-nine percent cite security as a chief concern with cloud computing. Another 31 percent pointed to the complexity of managing clouds, while only 15 percent said cost was an issue. Indeed, security concerns are a force driving many IT execs toward private rather than public clouds. Forty-five percent of organizations are considering establishing private clouds as they experiment with ways to improve efficiency, increase their resource pool and build a more flexible infrastructure.

Dana continues with a discussion of respondents’ naïveté about cloud computing.

Lori MacVittie prefaces her WILS: The Many Faces of Compression post of 2/23/2010 with “There’s compression, and then there’s compression:”

One of the most common means of improving application performance is to reduce the size of the data being exchanged as redress for inherent network protocol behavior that can cause excessive delays in delivery of application data. Compression is often enabled to achieve this goal, and because most data being delivered to applications is text-based (XML, HTML, JSON) this technique generally works quite well. Depending on the architecture of the application delivery network, however, there may be other “types” of compression that can be used in addition to the “compression” typically associated with web-application data.

  • Asymmetric Compression
    Asymmetric Compression is the compression used by most web servers as well as intermediaries claiming the title “application acceleration.” Asymmetric compression is one-sided; it is applied at the server only and uses industry-standard algorithms (deflate, gzip) to compress data before delivery.
    Asymmetric compression is good for web application data because the pattern of data exchanges is heavily weighted on the egress (outbound, response) side of the data exchange. HTTP requests are generally small while the responses are often large and thus the asymmetric nature of this compression is well-suited to these applications.
  • Symmetric Compression
    Symmetric Compression is what is usually implemented by WOCs (WAN Optimization Controllers) and primarily works at the network layer using data de-duplication technologies. These algorithms attempt to reduce the size of data by finding commonly repeated chunks of data and transmitting an “index” or “pointer” to that data rather than the entire data set. One WOC removes the data, the other replaces it upon receipt. This reduces the amount of data exchanged over the WAN but requires two devices, one at each side of the WAN.
    Symmetric Compression is well-suited for large, repetitive data set exchanges across WAN links, such as commonly shared files/virtual images/etc… shared by multiple users at a remote location.
  • Adaptive Symmetric Compression
    Adaptive Symmetric Compression combines industry standard compression with the network-layer compression traditionally implemented by a WOC and applies it intelligently. Rather than always using a specific algorithm, this capability chooses the algorithm on-demand based on the type of network-link being used and the type of data being exchanged. This means compression will generally not be used on data that does not compress well, such as images (JPG, for example, is already compressed and such files do not benefit from additional compression), and the “best fit” algorithm will be used for other data.
    Adaptive Symmetric Compression is well-suited for environments in which both web-application data and large file sets are being exchanged over the WAN, and excels in environments where more than one WAN link is used. …

Lori continues with a detailed “Comparison of Compression Methods” table. Note: WILS is an acronym for “Write It Like Seth” (as in Seth Godin).
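The asymmetric case is easy to demonstrate: industry-standard gzip shrinks text-based payloads dramatically but does nothing for already-compressed data, which is why adaptive schemes skip JPGs. A quick sketch using Python's standard library:

```python
import gzip
import json
import os

# A text-based payload (JSON) compresses very well...
text_payload = json.dumps(
    [{"id": i, "status": "running"} for i in range(500)]
).encode()
gz = gzip.compress(text_payload)
print(f"JSON: {len(text_payload):,} -> {len(gz):,} bytes "
      f"({len(gz) / len(text_payload):.0%} of original)")

# ...but data that is already compressed (a JPG, say) does not benefit;
# random bytes stand in for it here, and gzip may even enlarge them.
jpg_like = os.urandom(len(text_payload))
print(f"JPG-like: {len(jpg_like):,} -> {len(gzip.compress(jpg_like)):,} bytes")
```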

The Windows Azure Team announced West Europe and East Asia sub-regions for Windows Azure deployment in an Additional Deployment Options Now Available for Windows Azure Developers post of 2/22/2010:

Starting today, February 22, 2010, Windows Azure developers have the option of deploying their applications in two more regions. These two new sub-regions should provide developers with additional resources and flexibility in addressing customer needs. With the two new locations, ‘West Europe’ and ‘East Asia’, a total of six deployment options are now available:

  • Europe: West Europe and North Europe
  • Asia Pacific: East Asia and Southeast Asia
  • North America: South Central US and North Central US … 

A specific list of countries included in, and data center locations for, the new regions would have been helpful. Note that my Windows Azure and Cloud Computing Posts for 1/8/2010+ post observed:

David Robinson reported SQL Azure North Europe is Online on 1/8/2010:

“It’s been great to see the enthusiasm of people looking forward to using the entire Windows Azure Platform in Europe.

“I’m happy to announce that SQL Azure is now available in North Europe.

“Starting today, when creating a new SQL Azure server there will now be three options in the region drop down South Central US, East Asia, and North Europe.”

Dave didn’t state the location, but I’m betting that it’s the Dublin, not the Amsterdam, data center.

On second thought, North Europe is probably the Amsterdam data center and West Europe the Dublin data center.

The Windows Azure Team’s Windows Azure Platform Billing Overview post of 2/22/2010 provides additional insight into the innards of the Microsoft Online Customer Service Portal (MOCP):

There have been a lot of questions about how Windows Azure Platform billing works so we thought we'd provide some additional explanations and definitions.  Feel free to contact Microsoft Online Services customer support or post a comment here if you have any additional questions.  You can also review the Windows Azure Platform Offers Comparison Table to help you decide which of the current Windows Azure Platform Offers best suits your needs.

The Microsoft Online Customer Service Portal (MOCP) limits each MOCP account to one Account Owner Windows Live ID (WLID). The Account Owner can create and manage subscriptions, view billing and usage data, and specify the Service Administrator for each subscription. Large companies may need to create multiple subscriptions in order to design an effective account structure that supports and reflects their go-to-market strategy. The Service Administrator (Service Admin WLID) manages services and deployments but cannot create subscriptions.

For each MOCP account, the Account Owner can create one or more subscriptions. For each subscription, the Account Owner can specify a different WLID as the Service Administrator. The Service Administrator WLID can be the same as or different from the Account Owner’s, and is the persona who actually uses the Windows Azure Platform. The creation of a subscription in the Microsoft Online Customer Service Portal (MOCP) results in a Project in the Windows Azure portal. …
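The account structure boils down to a one-to-many hierarchy. Here's how I read it, sketched as plain Python (the class and field names are mine, not Microsoft's):

```python
from dataclasses import dataclass, field

@dataclass
class Subscription:
    name: str
    service_admin_wlid: str  # manages services/deployments; cannot create subscriptions
    azure_project: str       # each subscription surfaces as a Project in the Azure portal

@dataclass
class MocpAccount:
    account_owner_wlid: str  # exactly one Account Owner WLID per MOCP account
    subscriptions: list = field(default_factory=list)

# The Account Owner creates subscriptions and may name a different WLID
# as Service Administrator for each one (or reuse their own).
account = MocpAccount("owner@example.com")
account.subscriptions.append(
    Subscription("Production", "ops@example.com", "ProdProject"))
account.subscriptions.append(
    Subscription("Dev/Test", "owner@example.com", "DevProject"))
```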

The Yankee Group claims “Forty-three percent of enterprises cite cost savings as their top reason for moving their infrastructure to cloud computing” and then asks “Does today’s tight economy mean 2010 will be the year enterprises fully embrace the cloud?” in its Cloud Reality Check: Vendors Will Grow Mindshare, not Market Share in 2010 press release of 2/22/2010 for a new research report:

Probably not. As Yankee Group finds in its new report, “Clouds in 2010: Vendor Optimism Meets Enterprise Realities,” fully 75 percent of enterprises are earmarking no more than a third of their 2010 IT budget to the cloud. While the 26 thought leaders Yankee Group interviewed for the report—including executives from AT&T, Akamai, Cisco, Citrix, IBM, Salesforce.com and Verizon—all have high hopes for cloud computing this year, enterprises cite key stumbling blocks, including security, performance, standards and interoperability issues. And Yankee Group sees those issues festering well into 2010 and beyond.

“Although most vendors emphasize their commitment to ‘cloud openness,’ the jury is out as to which vendors can or will deliver on that promise,” says Agatha Poon, senior analyst at Yankee Group and co-author of the report. “Industry players will struggle to come to terms with cloud openness in 2010, and many will find it difficult to unlock their platforms—and their customers.”

Other key findings include:

  • Security SLAs are a must-have. Security and availability are the top two barriers preventing enterprise usage of cloud computing, even in private cloud scenarios.
  • Established firms have a leg up on the cloud competition. Enterprises say their go-to vendors for infrastructure-as-a service offerings in 2010 include Cisco, IBM and AT&T, while Microsoft, IBM and Sun are the prime choices for platform as a service.

Join the complimentary webinar, Executive Commentary: Cloud Computing Milestones for 2010, on Feb. 24, 2010, at 11:00 a.m. EST to learn more. Host Zeus Kerravala, Yankee Group senior VP, will be joined by a panel of experts from Microsoft, Sybase and Arista. Register at https://www2.gotomeeting.com/register/334599002

Lori MacVittie claims “There’s a difference between automation and orchestration, and knowing which one you’re really doing is half the battle in achieving a truly dynamic data center” in her Knowing is Half the Battle post of 2/22/2010:

Randy Heffner on CIO.com wrote an excellent article on SOA and its value, “SOA: Think Business Transformation, Not Code Reuse.” The problem I had with the article was not in any way related to its advice, conclusions, or suggestions. The problem I had was that I kept thinking about how perfectly much of his article could be applied to data center orchestration, operational transformation, and automation. Simply replace “SOA” with “orchestration”, “software reuse” with “automation”, and “business” with “operational” and you’ve pretty much got what needs to be said. Here, I’ll show you what I mean (the replaced originals appear in brackets):

The worst [CIO] CTO misunderstanding about [service-oriented architecture (SOA)] orchestration is thinking of it as only another technical initiative for [software reuse] automation. Although [SOA’s reuse] orchestration’s automation potential is real and good, its [business] operational impact goes much further: In Forrester surveys, 38 percent of Global 2000 [SOA] orchestration users say they are using it for strategic [business] operational transformation. [SOA’s] Orchestration’s true source of power is in its [business] operational design models, not its technology — and this means that [SOA] orchestration provides a broad foundation for a much larger shift in [business] operational technology (BT) architecture that goes far beyond [SOA] orchestration itself. By correctly understanding [SOA] orchestration, [CIOs] CTOs can lead their organizations on a solid and well-managed path toward a strategic technology future and greater business value.

This is true for SOA, and it’s true for cloud computing, where it is the orchestration of the data center infrastructure that brings the value to the table through making more efficient the operational processes codified to automate and integrate systems. …

Lori continues with detailed

  • AUTOMATION is HOW, ORCHESTRATION is WHY …
  • ORCHESTRATION is BUSINESS, AUTOMATION is OPERATIONAL …
  • DOES IT REALLY MATTER WHAT I CALL IT? …

topics.

<Return to section navigation list> 

Cloud Security and Governance

David Linthicum claims “Although enterprises are getting a better understanding of cloud computing, huge mistakes are still being made” in his 2/23/2010 The top 3 cloud computing mistakes post to InfoWorld’s Cloud Computing blog:

One would think that we're getting pretty good at cloud computing, considering the hype that's driving some quick projects -- and some quick failures, for that matter. The fact is that this has been -- and will continue to be -- a learning experience, at least for the next few years.

Considering this, I came up with the three most common blunders I've seen when organizations move to the cloud. Perhaps you can learn from these mistakes; perhaps some of them sound familiar:

    1. Ignoring governance and security until the end of the project …
    2. Leveraging "big consulting" …
    3. Falling in love with the technology, not the solution …

Read David’s post for the details of the three mistakes.

Ellen Messmer reports “Good security pros hard to find, survey respondents say” in her Security of virtualization, cloud computing divides IT and security pros post of 2/22/2010 to NetworkWorld:

Is moving to virtualization and cloud computing making network security easier or harder? When some 2,100 top IT and security managers in 27 countries were asked, the response revealed a profound lack of consensus, showing how divided attitudes are within the enterprise.

The "2010 State of Enterprise Security Survey - Global Data" report shows that about one-third believe virtualization and cloud computing make security "harder," while one-third said it was "more or less the same," and the remainder said it was "easier." The telephone survey was done by Applied Research last month on behalf of Symantec, and it covered 120 questions about technology use -- organizations remain overwhelmingly Microsoft Windows-based -- and cyberattacks on organizations.

To explain such different perceptions about the security impact of virtualization and cloud computing, Matthew Steele, Symantec director of strategic technology, said the best way to understand these answers is to know that "if they had a real security background, they immediately got concerned. But if they care for IT operations, they were thinking about it from an IT optimization standpoint." And the middle-of-the-road responses -- it's all "more or less the same" -- tended to originate from those with budget responsibilities. "If the business is still moving, things are OK," Steele remarked. …

See the Yankee Group’s Cloud Reality Check: Vendors Will Grow Mindshare, not Market Share in 2010 press release of 2/22/2010 in the Windows Azure Infrastructure section.

K. Scott Morrison reported on 2/22/2010 that You Can Help the Cloud Security Alliance Classify the Top Threats in the Cloud:

The Cloud Security Alliance (CSA) needs your help to better understand the risk associated with cloud threats. Earlier this year, the CSA convened a working group with the mandate to identify the top threats in the cloud. This group brought together a diverse set of security and cloud experts, including myself representing Layer 7. Our group identified 7 major threats that exist in the cloud, but now we would like to gauge how the community as a whole perceives the risk these threats pose.

I would like to invite you to participate in a short survey so we can get your input. This should only take you about 5 minutes to complete. We intend to work the results of this survey into the CSA Top Threats to Cloud Computing document. This will be formally unveiled at the Cloud Security Alliance Summit, which is part of next week’s RSA conference in San Francisco.

Help us to make the cloud a safer place by identifying and characterizing its greatest threats. Share this survey link with your colleagues. The more participation we can get, the better our results will be, and the stronger the work will become. …

You will find our survey here.

The Secretary of the US Department of Health and Human Services (HHS) posted on 2/23/2010 a Web page listing Breaches Affecting 500 or More Individuals:

As required by section 13402(e)(4) of the HITECH Act, the Secretary must post a list of breaches of unsecured protected health information affecting 500 or more individuals. …

The 37 breaches affected as few as 501 to as many as 500,000 individuals.

<Return to section navigation list> 

Cloud Computing Events

Info-Tech Research Group invites you to attend its Strategies to Optimize the Cloud Webinar on 2/24/2010 at 11:00 AM PT, 19:00 GMT:

Virtualization enables a journey that begins with the capital savings of server consolidation and ends in an agile and responsive utility infrastructure. Such a flexible and scalable infrastructure is increasingly being called an internal cloud. However, for the internal cloud to work, the focus has to be on capacity planning and measurement as well as a physical infrastructure that optimizes VM performance. The webinar will cover:

  • The status of virtualization in the market today
  • Opportunities companies are facing with virtualization
  • How to focus on capacity planning and measurements
  • What others are doing to optimize VM performance


The webinar will feature perspectives from John Sloan, Senior Research Analyst with Info-Tech Research Group, as well as a featured case study.

Details:
Date: Wednesday, February 24, 2010
Time: 2 PM ET (presentation plus Q&A)
Speaker: John Sloan, Info-Tech analyst

See the Yankee Group’s Cloud Reality Check: Vendors Will Grow Mindshare, not Market Share in 2010 press release of 2/22/2010 in the Windows Azure Infrastructure section for a:

… Complimentary webinar, Executive Commentary: Cloud Computing Milestones for 2010, on Feb. 24, 2010, at 11:00 a.m. EST to learn more. Host Zeus Kerravala, Yankee Group senior VP, will be joined by a panel of experts from Microsoft, Sybase and Arista. Register at https://www2.gotomeeting.com/register/334599002

Ben Kepes announced CloudCamp Melbourne, Apr 1, 2010 in this 2/22/2010 post:

CloudCamp is an unconference where early adopters of Cloud Computing technologies exchange ideas. With the rapid change occurring in the industry, we need a place where we can meet to share our experiences, challenges and solutions. At CloudCamp, you are encouraged to share your thoughts in several open discussions, as we strive for the advancement of Cloud Computing. End users, IT professionals and vendors are all encouraged to participate.

Location: Ether, 265-281 Little Bourke St, Melbourne, Australia. Register here.

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

Jeff Barr’s Amazon EC2 Reserved Instances with Windows post of 2/23/2010 to the Amazon Web Services blog begins:

It seems to me that every time we release a new service or feature, our customers come up with ideas and requests for at least two more! As they begin to "think cloud" and to get their minds around all that they can do with AWS, their imaginations start to run wild and they aren't shy about sharing their requests with us. We do our best to listen and to adjust our plans accordingly.

This has definitely been the case with EC2's Reserved Instances. The Reserved Instances allow you to make a one-time payment to reserve an instance of a particular type for a period of one or three years. Once reserved, hourly usage for that instance is billed at a price that is significantly reduced from the On-Demand price for the same instance type. As soon as we released Reserved Instances with Linux and OpenSolaris, our users started asking for us to provide Reserved Instances with Microsoft Windows. Later, when we released Amazon RDS, they asked for RDS Reserved Instances (we've already committed to providing this feature)!

I'm happy to inform you that we are now supporting Reserved Instances with Windows. Purchases can be made using the AWS Management Console, ElasticFox, the EC2 Command Line (API) tools, or the EC2 APIs.

As always, we will automatically optimize your pricing when we compute your AWS bill. We'll charge you the lower Reserved Instance rate where applicable, to make sure that you always pay the lowest amount.

You can also estimate your monthly and one-time costs using the AWS Simple Monthly Calculator. …
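The reserved-versus-on-demand decision is a simple break-even calculation: the one-time fee pays off once you've run enough discounted hours. A sketch with hypothetical prices (use the AWS calculator for real ones):

```python
# Hypothetical prices for a Windows instance -- not current AWS rates.
on_demand_hourly = 0.12   # $/hour, pay as you go
reserved_hourly = 0.05    # $/hour after the one-time payment
one_time_fee = 227.50     # $ for a one-year term

break_even = one_time_fee / (on_demand_hourly - reserved_hourly)
print(f"break-even: {break_even:,.0f} hours (~{break_even / 730:.1f} months 24x7)")

# A full year of 24x7 use under each plan:
hours = 365 * 24
print(f"on-demand: ${on_demand_hourly * hours:,.2f}")
print(f"reserved:  ${one_time_fee + reserved_hourly * hours:,.2f}")
```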

Rich Miller offers a 00:00:35 First Look: Apple’s Massive iDataCenter video in his 2/23/2010 post to the Data Center Knowledge blog:

How big is Apple’s new iDataCenter in Maiden, North Carolina? It’s plenty big, as illustrated by this aerial video posted to YouTube (apparently taken by an area realtor) of the 500,000 square foot facility. The new $1 billion data center will be nearly five times the size of Apple’s existing 109,000 square foot Newark, Calif. facility, and is seen as a key component of Apple’s cloud computing strategy. The video is brief (about 35 seconds), but provides an interesting perspective on the new facility. …

Apple’s data center in Maiden is expected to provide the back-end for a larger move into cloud computing, with most speculation focusing on a shift of iTunes user libraries from user desktops to online storage. For those just joining this story, here’s a summary of our reporting on Apple’s new facility:

Can this video tell us anything interesting about Apple’s data center design and what’s happening inside the facility? Have a look at the video and share your insights and theories in the comments.

Mary-Jo Foley reported Amazon becomes the latest company using Linux to pay Microsoft for patent deal on 2/22/2010:

Amazon.com has joined a host of other companies using Linux to pay Microsoft as part of a patent cross-licensing arrangement.

Not surprisingly, the wording of the February 22 announcement by Microsoft regarding its latest IP licensing deal doesn’t claim that Amazon or Linux infringes (or even potentially infringes) any Microsoft patents. Microsoft execs learned their lesson about doing that after CEO Steve Ballmer’s remarks contradicted claims by Novell execs in a patent-licensing arrangement a few years ago — right around the time Microsoft officials said free and open-source software violated 235 Microsoft patents.

But like other similar patent agreements Microsoft has struck with companies ranging from TomTom and Melco/Buffalo to Samsung and Fuji Xerox, the deal with Amazon does cover open-source and Linux-based technologies — including the Kindle e-reader, which runs Linux. …

Dilip Tinnelvelly claims “Open Source Rivals Novell and Red Hat march to different tunes as they take their Linux strategies to the cloud” in his Novell and Red Hat: Taking Linux to the Cloud post of 2/21/2010:

It is interesting to see how open source rivals Red Hat and Novell have transferred their Linux warfare to the cloud. As both companies seek to use their open source history to advantage on the web platform by spouting standard mantras about avoiding vendor lock-in and low cost, they have also taken different approaches on other counts when taking their cloud services to market.

Red Hat is full of idealistic enthusiasm about innovation, progress and being open through and through. It has always been a vociferous proponent of open standards and open content that is a hundred percent free from licensing or royalty payments, and has been singing the same tunes on cloud. It has been up in arms with Microsoft on many issues, including opposing the approval of Microsoft Office Open XML by the International Organization for Standardization. Novell, on the other hand, has leveraged its Microsoft partnership beautifully for many years. It claims to hold an advantage in terms of vendor interoperability in mixed-source environments comprising both open source and proprietary platforms. This is mainly thanks, of course, to its Microsoft alliance, which allows Novell’s technology to integrate seamlessly with Windows servers and related networks. The partnership has also worked in Novell’s favor to reduce cloud deployment costs while Microsoft wages its proxy war against Red Hat through Novell’s SUSE Linux. Novell has more recently earned headlines making noises about cloud security.

Novell won big last week. Following the UK Government’s recently revamped IT policy promoting open source usage and cloud computing, the National Health Service, the publicly funded health care system in England, has extended a networking deal worth around £6 million with Novell encompassing IT security, storage and remote workload management. Novell has been propagating the idea of unified security and IT management service over physical, virtual and cloud environments since December and calls it Intelligent Workload Management. The NHS agreement is a good indication that its new service realignment is working. …

Dilip continues with additional Novell and Red Hat history.

<Return to section navigation list> 
