Thursday, December 17, 2009

Windows Azure and Cloud Computing Posts for 12/16/2009+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.

 
• Update 12/17/2009: Lori MacVittie: The Cloud Computing - Application Acceleration Connection; Quosal, Inc.: Quosal Announces Immediate Availability, Support for Microsoft SQL Azure Cloud for Its On-Demand Sales Quote and Proposal Platform; David Ramel: Acing Azure and Dabbling in Dallas; ADO.NET Data Services Team: Data Services Update for .NET 3.5 SP1 – Now Available for Download; Martin Schneider: Why I Like Microsoft In The Clouds; Institute for Defense & Government Advancement (IDGA): Microsoft to participate in the Cloud Computing for DoD & Government Summit; Cloud Security Alliance: Version Two of Guidance Identifying Key Practices for Secure Adoption of Cloud Computing; Christofer Löf: Introducing ActiveRecord for Azure; and others.

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

To use the above links, first click the post’s title to display the single article you want to navigate.

Cloud Computing with the Windows Azure Platform was published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.

You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:

  • Chapter 12: “Managing SQL Azure Accounts, Databases, and DataHubs*”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated for the November CTP in January 2010. 
* Content for managing DataHubs will be added as Microsoft releases more details on data synchronization services for SQL Azure and Windows Azure.

Off-Topic: OakLeaf Blog Joins Technorati’s “Top 100 InfoTech” List on 10/24/2009.

Azure Blob, Table and Queue Services

Alex James’ Getting Started with the Data Services Update for .NET 3.5 SP1 – Part 1 post of 12/17/2009 provides a fully illustrated tutorial for what was ADO.NET Data Services v1.5:

Yesterday we released the Data Services Update for .NET 3.5 SP1 that basically brings the functionality available in .NET 4.0 to .NET 3.5 too.

To help you get started with the Update, in this post we will:

  • Install the Update
  • Create a database
  • Create a Web Application containing:
    • An Entity Data Model (EDM) to expose the database
    • A configured Data Service to expose the EDM using V2 of the OData protocol

And in Part 2 we will create a WPF application to use the Data Service.
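
The end state of Part 1 is a data service class wired for V2. Here’s a minimal sketch, assuming a hypothetical Entity Data Model container named NorthwindEntities (substitute whatever the EDM wizard generates for your database); the MaxProtocolVersion setting is what opts the service into V2 of the OData protocol:

    using System.Data.Services;
    using System.Data.Services.Common;

    // NorthwindEntities is a hypothetical EDM ObjectContext name.
    public class NorthwindService : DataService<NorthwindEntities>
    {
        public static void InitializeService(DataServiceConfiguration config)
        {
            // Read-only access to all entity sets for the walkthrough.
            config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);

            // Opt in to V2 of the OData protocol, the headline feature the
            // Update adds to .NET 3.5 SP1.
            config.DataServiceBehavior.MaxProtocolVersion =
                DataServiceProtocolVersion.V2;
        }
    }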

The ADO.NET Data Services Team’s Data Services Update for .NET 3.5 SP1 – Now Available for Download post of 12/17/2009 announces the final version of what was earlier called ADO.NET Data Services v1.5 and provides a detailed description of its contents:

We’re very excited to announce that the “Data Services Update for .NET Framework 3.5 SP1” (formerly known as “ADO.NET Data Services v1.5”) has been released and is available for download. If your target is Windows 7 or Windows Server 2008 R2 you can pick it up here. For all other OS versions you can get the release from here. This release targets the .NET Framework 3.5 SP1 platform, provides new client and server side features for data service developers and will enable a number of new integration scenarios such as programming against SharePoint Lists.

As noted in the release plan update post, this release is a redistributable, in-place update to the data services assemblies (System.Data.Services.*.dll) which shipped as part of the .NET Framework 3.5 SP1.  Since this is a .NET Framework update, this release does not include an updated Silverlight client library, however, we are actively working on an updated Silverlight client to enable creating SL apps that take full advantage of the new server features shipped in this release.  We hope to have the updated SL client available shortly into the new year. …

Since this release is an update to the .NET Framework 3.5 SP1 we kept the name consistent with what was used in the 3.5 SP1 timeframe so that printed documentation, etc. is consistent.  Going forward in .NET 4 and onwards you’ll see us use WCF Data Services. …    

Sebastian Gomez offers the source code for his Windows Azure Web Storage Explorer in a post of 12/16/2009:

A few days ago I posted about my first useful Azure application, which is hosted here: http://storageexplorer.cloudapp.net. Today I just wanted to announce that I uploaded the code to a Google Code project called windowsazurewebstorageexplorer :)

It is a good (I think it is good) example of both programming for the Windows Azure platform and programming against the Windows Azure Storage client API.

So feel free to download it and let me know what you think. (Make sure you download the Windows Azure tools for Visual Studio 2008 first.) …
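
A minimal sketch of what StorageClient code in a project like this looks like, assuming made-up account credentials; it lists every container and blob in a storage account with the managed library from the November 2009 SDK:

    using System;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    class ListStorageDemo
    {
        static void Main()
        {
            // Hypothetical credentials; substitute your own storage account.
            CloudStorageAccount account = CloudStorageAccount.Parse(
                "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey");
            CloudBlobClient client = account.CreateCloudBlobClient();

            // Walk every container and print the URI of each blob in it.
            foreach (CloudBlobContainer container in client.ListContainers())
            {
                Console.WriteLine(container.Name);
                foreach (IListBlobItem blob in container.ListBlobs())
                    Console.WriteLine("  " + blob.Uri);
            }
        }
    }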

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

• Eric Lai asserts “Microsoft, IBM, Oracle: Three vendors, three different paths on dealing with the Hadoop open-source data processing technology” in his Big three relational database vendors diverge on Hadoop post of 12/17/2009 to InfoWorld’s Data Explosion blog:

The three leaders of the relational database market are responding to the sudden mania for the data processing technology Hadoop in three very different ways.

While startups and established data warehousing vendors such as Sybase and Teradata are embracing Hadoop and its Google-developed progenitor, MapReduce, Microsoft is resisting it.

"We'd never bring Hadoop code into one of our products," said Microsoft technical fellow and University of Wisconsin-Madison professor David J. DeWitt.

DeWitt's lack of interest is not surprising. DeWitt is an academic expert in parallel SQL databases, having co-invented three of them. He co-authored a paper this spring that argued that SQL databases still beat MapReduce at most tasks. He hasn't changed his mind.

"Every database vendor wants to claim that they're doing Hadoop because it's the popular thing," he said. "There's too much FUD. SQL databases still work pretty well."

DeWitt leads a database research lab at Madison that is helping Microsoft with R&D for its upcoming Parallel Data Warehouse version of SQL Server 2008 R2, formerly known as Project Madison.

As such, he said that the new edition of SQL Server will add some analytic functions that roughly mimic some of the features of MapReduce/Hadoop.

Liam Cavanagh’s Looking Forward - Offline Capable Applications Using Silverlight and Sync Framework post of 12/14/2009 describes how SyncFx supports clients that don’t have a SyncFx runtime:

In order to truly take advantage of Silverlight's cross platform capability we had to come up with a solution where we could still support clients that do not have a Sync Framework runtime. To do this what we did was build the architecture such that all of the core Sync Framework logic resides on the service (in this case a Windows Azure Webrole), and that is where all of the heavy lifting of the synchronization process is executed. From the service we exposed a simple HTTP-based synchronization protocol that the client consumed, allowing any client to participate in synchronization. Currently the best offline store within Silverlight is isolated storage, and as you can see in Mark's demonstration of filtering data, this is sufficient to build a rich and very quick data-access application. [Emphasis added.]

Over the next year you will see more of how we plan to enable this scenario and we will provide additional details on what you can do to get started to build these offline-capable applications using Silverlight.
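
The post doesn’t publish the HTTP sync protocol itself, but the client half of the pattern is straightforward to picture: cache whatever the sync service returns in isolated storage so the application keeps working offline. A minimal Silverlight sketch; the key names and string payloads are purely illustrative:

    using System.IO;
    using System.IO.IsolatedStorage;

    // Hypothetical offline cache for payloads downloaded from a sync service.
    public static class OfflineCache
    {
        public static void Save(string key, string payload)
        {
            using (var store = IsolatedStorageFile.GetUserStoreForApplication())
            using (var stream = store.OpenFile(key, FileMode.Create))
            using (var writer = new StreamWriter(stream))
                writer.Write(payload);
        }

        public static string Load(string key)
        {
            using (var store = IsolatedStorageFile.GetUserStoreForApplication())
            {
                if (!store.FileExists(key)) return null;
                using (var reader = new StreamReader(store.OpenFile(key, FileMode.Open)))
                    return reader.ReadToEnd();
            }
        }
    }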

Jim Nakashima has received (and answered) several comments to his ASP.NET Provider Scripts for SQL Azure post of 11/24/2009 about the use of the *.cscfg file instead of Web.config to store connection strings:

You can use this method to get Azure providers to read from the *.cscfg file, but it doesn’t work for SQL Azure/SQL Server providers.
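
The root of the limitation is that the built-in SQL membership/role/profile providers resolve their connectionStringName against Web.config through System.Configuration, while *.cscfg values are visible only through the role environment. A sketch of the two lookup paths (the setting name is illustrative):

    using System.Configuration;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public static class ConnectionStringReader
    {
        public static string Get()
        {
            // Values in ServiceConfiguration.cscfg are read via the role
            // environment, which custom code and Azure-aware providers call...
            if (RoleEnvironment.IsAvailable)
                return RoleEnvironment.GetConfigurationSettingValue("SqlConnectionString");

            // ...but the stock SQL providers only ever look here, in Web.config,
            // which is why the *.cscfg trick doesn't reach them.
            return ConfigurationManager
                .ConnectionStrings["SqlConnectionString"].ConnectionString;
        }
    }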

George Huey demonstrates his SQL Azure Server Migration Wizard v3.0 in this 00:08:12 video posted 12/16/2009:

The SQL Azure Migration Wizard helps you migrate your local SQL Server 2005 / 2008 databases into SQL Azure. The wizard walks you through the selection of your SQL objects, creates SQL scripts suitable for SQL Azure, and allows you to migrate your data. The SQL Azure Migration Wizard (SQLAzureMW) gives you options to analyze, generate scripts, and migrate data (via BCP) from 1) SQL Server ...

Stephen Forte explains Using PowerPivot with SQL Azure in this 12/16/2009 post:

Far and away the number one Business Intelligence client in the world is Microsoft Excel. While there are tons of data visualization tools out there, Excel is hands down the leader since it is both familiar to users and very powerful. Developers have built tons of business intelligence (BI) apps in Excel using connections to data warehouses (or cubes) and letting the users go crazy with pivot tables and charts.

Things are about to get even better with Project Gemini, now Microsoft SQL Server PowerPivot (we rebel and just call it PowerPivot). PowerPivot is an add-in for Excel 2010 or SharePoint. PowerPivot for Excel is a data analysis tool that allows you to connect to a database, download data and store it in a local data engine (VertiPaq) so you can slice and dice it to your heart’s content using familiar Excel tools such as pivot tables and charts. (You can then send it up to SharePoint if you like, but let’s just focus on Excel for now.) PowerPivot works in-memory using your local PC’s processing power and is optimized to handle millions of rows in memory on cheap commodity PCs. The VertiPaq OLAP engine that is part of PowerPivot compresses and manages millions of rows of data in memory for you.

There are many awesome features of PowerPivot, but something I learned reading the team’s blog on data importing was that PowerPivot supports SQL Azure natively. This scenario is great since you can download your SQL Azure data, store it locally and slice and dice offline.
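
“Supports SQL Azure natively” largely means PowerPivot’s import wizard accepts a standard SQL Azure connection string. The same string works anywhere ADO.NET does, as in this sketch with hypothetical server, database and credentials:

    using System.Data.SqlClient;

    class SqlAzureConnectionDemo
    {
        static void Main()
        {
            // Hypothetical names; SQL Azure expects the user@server login form
            // and an encrypted connection.
            const string connectionString =
                "Server=tcp:myserver.database.windows.net;Database=MyDatabase;" +
                "User ID=myuser@myserver;Password=mypassword;Encrypt=True;";

            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                // Pull rows down here, e.g. with a SqlDataAdapter; this is the
                // same data PowerPivot would import and hold in VertiPaq.
            }
        }
    }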

bonniefe announces AdventureWorks Community Sample Databases available for SQL Azure on CodePlex in this 12/16/2009 post:

It is a delight to announce the availability of the AdventureWorks Community Sample Databases for SQL Azure.  You can download the scripts and data files from CodePlex.  To install the databases on your SQL Azure server just follow the directions on the release page at CodePlex or in the ReadMe.htm file which is included in the zip file.

The scripts and data files in the zip file can be used to create the data warehouse (DW) and light (LT) sample databases in a SQL Azure server without SQL Server or Visual Studio being installed on the machine performing the installation. Currently the AdventureWorks OLTP database is not supported for SQL Azure and is not included in the zip file.

<Return to section navigation list> 

AppFabric: Access Control, Service Bus and Workflow

Dave Kearns claims “Sentillion found its niche concentrating solely on IAM [Identity and Access Management] in the healthcare industry” in his Microsoft's acquisition of Sentillion stands out article of 12/15/2009 for NetworkWorld:

It took until December, but what could be the biggest acquisition story of the year broke last week. At least until Oracle's buyout of Sun gets EU approval, Microsoft's acquisition of Sentillion will be the big one.

Sentillion was, perhaps, the biggest vendor concentrating solely on IAM in the healthcare industry. The 11-year-old company was started at the same time as Business Layers and Oblix, the pioneers of user provisioning -- but it's the only one of the three that's still around. They show the wisdom of developing a niche and sticking to it (as long as it's growing).

A couple of years ago I sat down with Sentillion CEO Rob Seliger and tried to get him to admit an interest in branching out beyond healthcare. Nothing too exotic; perhaps an allied market such as pharmaceuticals? But he wouldn't be baited. He claimed Sentillion knows the market well -- the company was spun off from HP's Medical Products Group nine years ago --  and wants to leverage its expertise to do healthcare identity better than anyone else. According to the Gartner Group that might be so. In announcing their Magic Quadrant for user provisioning earlier this year, Gartner said: "Sentillion remains the vendor to beat in healthcare, providing increasingly innovative approaches to deal with a unique industry segment's needs, and responding to the increased attention it is receiving in the U.S." …

It will be interesting to see if Microsoft adopts Sentillion’s authentication/authorization products for Amalga only or adapts them for general purpose use with other platforms, such as Azure and HealthVault.

Bill Lodin (@wlodin) published four new Web seminars on 12/14/2009 about the Azure AppFabric to the Everything You Need to Know About Azure as a Developer msdev.com site:

  1. Windows Azure Platform: AppFabric Overview. In this high-level overview, attendees will learn what the Windows Azure platform AppFabric is and what it offers Microsoft’s cloud customers.
  2. Windows Azure Platform: AppFabric Fundamentals. This session is designed to introduce the Windows Azure platform AppFabric from a developer’s point of view. Through a series of coding examples attendees will see how to take a simple …
  3. Windows Azure Platform: Introducing the Service Bus. The Service Bus is one of the two main components of the Windows Azure platform AppFabric. In this demonstration-heavy session, attendees will see multiple ways in which the Service Bus …
  4. Windows Azure Platform: The Access Control Service. The Access Control Service is one of the two main components of the Windows Azure platform AppFabric. In this demonstration-heavy session, attendees will examine the setup and code …

<Return to section navigation list>

Live Windows Azure Apps, Tools and Test Harnesses

My Codename “Dallas” Developer Portal Walkthrough post of 12/16/2009, which shows you how to obtain an account and use SQL Azure’s new dataset subscription service, was updated on 12/17/2009 to include a screen capture of and downloadable code for a C# service proxy for the AP Online dataset:

• Christofer Löf’s Introducing ActiveRecord for Azure post of 12/17/2009:

The Windows Azure Storage system is very capable and will probably fit you and your application’s needs in a lot of scenarios. I feel that Windows Azure Tables, especially, are far too often forgotten when considering object persistence in the cloud. This post was supposed to show off Windows Azure Tables together with the WCF Data Services client in order to try to change the perception of "SQL Azure is the only option".

But the accompanying code examples evolved, so..

Please say welcome to ActiveRecord for Azure

ActiveRecord for Azure (or just AR4A) is what its name implies: an ActiveRecord implementation for the Windows Azure Storage system. Its main objective is to make Windows Azure data access really easy, yet powerful.

AR4A is built on top of the WCF Data Services client and the managed Windows Azure Storage API - utilizing the great work that's already been put into those libraries. AR4A just adds more simplicity to them.

To get started you inherit your entities from the ActiveRecord base class followed by telling the ActiveRecord environment to create the required tables. AR4A then handles the persistence of entities, retrieval of entities, continuation tokens, table partitions, among other things for you.
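
Chris doesn’t show the API in this excerpt, so the following is only a guessed-at sketch of the pattern he describes (every type and member name below is a hypothetical reconstruction, not AR4A’s published API; see the CodePlex source for the real names):

    // Hypothetical: an entity inherits from the ActiveRecord base class.
    public class Customer : ActiveRecord<Customer>
    {
        public string Name { get; set; }
    }

    public class Program
    {
        public static void Main()
        {
            // "Telling the ActiveRecord environment to create the required tables."
            ActiveRecordEnvironment.CreateTables();

            // Persistence and retrieval are handled by the base class, along
            // with continuation tokens and partition management.
            var customer = new Customer { Name = "Contoso" };
            customer.Save();

            foreach (Customer c in Customer.FindAll())
                System.Console.WriteLine(c.Name);
        }
    }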

Feature overview (at the point of writing this)

  • CRUD operations
  • Partition management
  • Table generation
  • Id generation
  • Stubs

The source is available now on CodePlex for preview.

He continues with a “revamped RdChat application that comes with the Windows Azure Training Kit using AR4A.” Chris is a Senior Consultant @ Microsoft Services Sweden.

• Patty Enrado’s The Sunny von Bülow bill post of 12/17/2009 to the EHRWatch.com blog comments on Jonathan Bush’s interview by Wall Street Journal writer Joseph Rago (see Windows Azure and Cloud Computing Posts for 12/11/2009+):

Jonathan Bush, chairman and CEO of Athenahealth, was interviewed in the Wall Street Journal last weekend and spoke his colorful mind about the state of the healthcare industry and the political forces that are trying to reshape it.

My favorite quote was referring to the HITECH Act's billions of dollars for health IT adoption. It bears repeating and reflection (after the chuckling):

"It is kind of too bad that all these software companies that we're really close to putting out of business, these terrible legacy companies, with code that was written in the '70s, are going to get life support. That's why I call it the Sunny von Bülow bill. What it is, basically, is a federally sponsored sale on old-fashioned software."

I actually agree that there are creaky legacy companies, and a handful of them will likely get life support. But a number of them will get the plug pulled - stimulus funds or not. Why? The market has changed dramatically. Users and potential users are more demanding and vocal about their demands. Technology continues to evolve, or rather it's moving at warp speed. Open source, cloud computing, software as a service. These are the emerging technologies that will make EHRs and EMRs more affordable, accessible, user-friendly. The legacy companies, with their documented failed implementations, can't operate in this new world. They will have to transform themselves as surely as, say, Madonna continually transforms herself with the changing times and new generations to stay interesting and still popular. [Emphasis added.]

It's a small world when it comes to successes and failures in health IT implementations - even more so these days because of the infusion of federal funds. So make no mistake that it's a new world we're operating in and if you can't change, if you can't be nimble, you're not Sunny von Bülow, you're a dinosaur. Period. [Link added.]

Ben Riga interviews Thuzi’s Jim Zimmerman in Azure Lessons Learned: Outback Steakhouse Facebook Application, a Channel9 video posted 12/16/2009:

I’m back with another episode of Azure Lessons Learned.  In this one I discuss building apps that need to be able to scale very highly very quickly.  Jim Zimmerman, the CTO and Lead Developer for Thuzi, walks me through the Facebook application his team built for Outback Steakhouse.

Thuzi specializes in building social media applications for large organizations to connect with their customers on sites like Facebook.  The concern with these types of applications is the possibility that the offer will go viral as it gets passed from customer to customer and potentially swamp the servers to the point that they are not responsive and ultimately disappoint customers.  Thuzi chose to use Windows Azure since the Windows Azure platform has ample capacity and they could scale up or down the solution based on the current demand of the market.  The Outback Steakhouse offer in this case was a free Bloomin’ Onion at any of the thousands of Outback Steakhouse restaurants.

Facebook apps don’t actually run on Facebook.  They are embedded using an iframe into a Facebook page.  The app that is running in that iframe must be hosted somewhere else.  In this case, Thuzi hosts that in Windows Azure. They actively monitor the campaign and turn on or off web and worker role instances as required.  In order to scale they used Windows Azure table storage and queues.  They also use SQL Azure to perform reporting and analytics on the results of the campaign.
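
The scale pattern described here, web roles enqueue work and worker roles drain it, looks roughly like the following with the StorageClient queue API (the queue name and message contents are invented):

    using System.Threading;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    public class QueuePatternSketch
    {
        public static void Run(CloudStorageAccount account)
        {
            CloudQueueClient client = account.CreateCloudQueueClient();
            CloudQueue queue = client.GetQueueReference("coupon-requests");
            queue.CreateIfNotExist();

            // Web role side: enqueue each incoming request and return at once.
            queue.AddMessage(new CloudQueueMessage("user:12345"));

            // Worker role side: poll, process, delete; adding worker instances
            // drains the queue faster when the campaign spikes.
            while (true)
            {
                CloudQueueMessage message = queue.GetMessage();
                if (message == null) { Thread.Sleep(1000); continue; }
                // ... process the request, write results to table storage ...
                queue.DeleteMessage(message);
            }
        }
    }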

• David Ramel describes Acing Azure and Dabbling in Dallas in this 12/16/2009 post to Visual Studio Magazine’s Data Driver blog:

Here's a ringing endorsement for the simplicity of the Windows Azure platform: I was able to migrate a database into a SQL Azure project and display its data in an ASP.NET Web page. What's more, I actually developed a Windows Forms application that displayed some of the vast store of public data accessible via Microsoft's Dallas project.

And believe me, if I can do it, anybody who knows what a connection string is can do it. It's wicked easy, as they say here in the Boston area.

I've long been fascinated by the cloud. In fact, almost exactly three years ago I commissioned an article on the nascent technology, having identified it as the future of... well, just about everything. For me, a coding dilettante, there's just something cool about the novelty of being in the cloud. For IT pros, it must be exciting having no worries about hardware, the nitty gritty of administration minutiae, and so on.

So if you haven't yet, you should check out the cloud. Microsoft's Windows Azure platform services are free until Feb. 1.

Here's how I did it. I used a laptop running Windows 7, with Visual Studio 2010 Beta 2, SQL Server Management Studio, SQL Server Express, the Windows Azure SDK, the Windows Azure Toolkit for Visual Studio 2010, and the Windows Azure Platform Training Kit. …

• Quosal, Inc.’s Quosal Announces Immediate Availability, Support for Microsoft SQL Azure Cloud for Its On-Demand Sales Quote and Proposal Platform press release of 12/17/2009 says the port “Provides Sales Teams With Turn-Key, Secure and Cost-Effective Way to Host Quosal”:

Quosal Inc. today announced full support and immediate availability for Microsoft® SQL Azure™ Database cloud for its on-demand Quote and Proposal platform. With SQL Azure support, Quosal offers sales teams a cost-effective, highly secure and turn-key way to host and back up their Quosal data and access the application from multiple locations, desktop and/or device.

Quosal, which can be deployed as an installed or on-demand application, is the ultimate software tool for the preparation, delivery and management of both simple form-based quotes and complex, multi-section proposals. With Quosal users can create timely, accurate, high-quality and attractive quotes and proposals with tight CRM integration and up-to-the-minute pricing, availability, promotions, product specs, images and more -- in a fraction of the time it normally takes.

Karsten Januszewski and Tim Aidlin present a Channel9 Introducing Incarnate video on 12/16/2009 to describe their new Azure-hosted Avatar service:

Our friends and neighbors at Mix Online have just released the latest version of their site which includes a new lab offering: Incarnate. Incarnate is a REST-based service that uses people’s usernames to find their avatars on the web. To do this, Incarnate queries Facebook, MySpace, Twitter, Xbox Live and YouTube. Karsten Januszewski and Tim Aidlin are the masterminds behind Incarnate so I decided to take a walk down the hallway of our building to find out from them the what, why and how behind Incarnate. Tune in. This is Old School 9.

You can find out about Incarnate and download its code here.

Ben Riga’s Azure Lessons Learned: Outback Steakhouse Facebook App post of 12/16/2009 covers the same Thuzi Facebook application for Outback Steakhouse described in his Channel9 interview above and links to the video: Channel 9: Azure Lessons Learned: Outback Steakhouse Facebook App.

Harish Ranganathan details Azure Training Kit workarounds in his FIX for Unable to find “Microsoft.ServiceHosting.ServiceRuntime” Windows Azure Training Kit Nov 09 post of 12/15/2009:

Ok, I am playing with the Windows Azure Training Kit November 2009 release and the first sample I wanted to try was “Migrating web applications to Windows Azure”.  I believe a whole bunch of people moving to Azure aren’t just going to create new web apps but rather try to move their existing web apps, which is why I thought this exercise is more important.

After following the initial few steps, I came to the place where we manage state providers and one of the requirement is to the StorageClient library available as a part of the training kit.  Now when you add reference to this library (project) and try to build, you may hit the above error i.e. unable to find “Microsoft” or  “Microsoft.ServiceHosting.ServiceRuntime” which is one of the primary assemblies used in the “StorageAccountInfo.cs” file. 

I went through various searches and found that this has moved to Microsoft.WindowsAzure.ServiceRuntime.  What followed was a series of build errors in the same file pointing to various references.  So the idea behind this post is to help folks get through this hurdle. …
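
For anyone hitting the same wall, the fix boils down to swapping the old namespace for the November 2009 one and rebuilding; a minimal before/after sketch with an illustrative setting name:

    // Before (older CTP bits referenced by StorageAccountInfo.cs):
    // using Microsoft.ServiceHosting.ServiceRuntime;

    // After (Windows Azure SDK November 2009; the runtime types moved here):
    using Microsoft.WindowsAzure.ServiceRuntime;

    public static class SettingReader
    {
        // The setting must be declared in ServiceDefinition.csdef and given
        // a value in ServiceConfiguration.cscfg.
        public static string Read(string name)
        {
            return RoleEnvironment.GetConfigurationSettingValue(name);
        }
    }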

Jared Bienz interviews Markus Egger in a Software Escrow as a Service with EPS, Silverlight and Azure Channel9 video posted 12/15/2009:

In this video we visit with Markus Egger of EPS Software and Code Magazine. Markus talks about how they created a unique software escrow service (Tower48.com) using Silverlight and how they architected their solution to run on Windows Azure.

The Windows Azure Team reiterates December 2009 Windows Azure-Related Developer Challenges on 12/15/2009:

There are two new December 2009 Windows Azure-related challenges underway.*

Elance has just launched a way for you to get paid to play with Windows Azure.  You will need to have an Elance account to participate.  You can sign up for a free account here.

There are two ways to get paid by Elance for creating a Windows Azure based application:

  • Earn $50 from Elance for each accepted application you build on Windows Azure and submit through Elance by December 31, 2009.  Any type of Windows Azure based application from simple samples to extremely complex solutions will qualify.
  • Win up to $10,000 from Elance for the top application and $1,000 each for the top five runner-up applications.

Click here to take the challenge offered by Elance.

The Code Project has a Windows Azure challenge too, where you can win an Amazon Kindle.

Click here to take the challenge offered by The Code Project.

*Please note: These challenges from Elance and Code Project are not Microsoft offerings and are being independently brought to you by third party companies (who are not affiliated with  Microsoft).

<Return to section navigation list> 

Windows Azure Infrastructure

Martin Schneider’s Why I Like Microsoft In The Clouds post of 12/17/2009 to Business Computing World (UK) explains why Windows Azure appeals to him and his team:

It is odd to see a company filled with open source wonks excited about working closely with Microsoft. However, the guys in Redmond have a pretty strong vision for cloud computing in Azure – one that I personally feel is different than anyone making a major play for the cloud.

Why? Well, for a number of reasons. For one, Microsoft is looking at the cloud as less of a compute power play as many of the early cloud players saw it, and more of a distributed business stack, available at any time and with scale. What I mean is, Microsoft is probably the most application focused cloud player out there. (Note: I don’t believe anything Salesforce.com does is really cloud computing.)

Microsoft owns, operates and understands a full service stack. This is a big benefit when porting to a cloud environment. While it may seem like a limitation, it is actually in my opinion a strong point. Microsoft is able to fully control the OS, database and web servers etc. as it creates a cloud environment. Sure, there is a proprietary aspect – but most of the apps running on top of Azure will not touch anything other than the database in any profound manner.

Other cloud providers are working with a few distributions of Linux and other open source back end components – which is great for economy and scale – but these providers do not “own” the system entirely. It’s just a “nice to have” Microsoft has in its favor.

Another big factor is that Microsoft has been playing more of an interoperability game with Azure than it ever has – as far as I’ve seen. I mean, they are working closely with us, a PHP-based application, to make sure Azure can support apps written in as many languages as possible. This is a major development (in a good way) from even a couple years ago when IIS was not the ideal web server for running PHP apps (to put it lightly).

All told, when it comes to applications in the cloud, it will be the large-scale providers – not the small, limited vendor-hosted SaaS providers – that will realize the potential of running your business in the cloud. Vendor-hosted SaaS is great for some companies and a lot of different business needs, but for truly cloud-based operations these guys fall short. It is great to see companies like Microsoft supporting the notion of the Open Cloud.

Martin is Director of Product Marketing at SugarCRM.

• Lori MacVittie makes The Cloud Computing - Application Acceleration Connection in this 12/17/2009 post:

…  Consider, if you will, that an organization chooses to deploy an application which will be used on a daily basis by employees in “the cloud.” In previous incarnations that application would have likely been deployed in the local data center, accessible to employees over the local LAN. High speed. Low latency. The only real obstacle to astounding application performance would have been hardware and software limitations imposed by the choice of server hardware, web/application server software, and the database. Now move that application to “the cloud” and consider the potential obstacles to application performance that are introduced: higher latency, lower speed, less control. What’s potentially true is that moving to “the cloud” mitigates the expense associated with higher performing servers so bottlenecks that may have occurred due to limitations imposed by the server hardware are gone, but they are replaced by the inevitable degradation of performance that comes with delivery over the Internet – higher latency, because it’s farther away and there are more users out there than “in here” and lower speed because it’s rare that an organization has LAN-like speeds even to the Internet.

So what we have is a most cost-effective method of deploying applications that’s farther away from its users. The potential – and I’d say inevitability – is there that performance will be impacted and not in a good way. The solution is to (1) keep the application deployed locally, (2) tell the users to deal with it, or (3) employ the use of application acceleration/optimization solutions to provide consistent performance of an acceptable level to users no matter where they might end up.

There are well-known, proven solutions to addressing the core problem of distance on application performance: caching, compression, more efficient connection management, etc… All of which fall under the “application acceleration” umbrella.  As cloud computing experiences higher adoption rates it is inevitable that performance will be raised as an issue and will need to be addressed. Hence it makes perfect sense that growing cloud computing adoption will provide growth opportunities for application acceleration solution vendors’ as well, which will positively impact that market. …

• Ry Crozier reports Microsoft talks up private cloud toolkit to [Australian] partners and “Updates channel on server virtualisation plans” in a 12/14/2009 post to CRN.com.au:

Microsoft Australia has met with systems integrators, web hosts and customers to bring them up to speed on its dynamic data centre toolkit due for release next year.

The Redmond giant's director of virtualisation Zane Adams and his team met with business and channel partners, service providers, systems integrators and hosters, as well as customers.

"My discussion has been on what their next deployment is and on giving them information on the dynamic data centre toolkit for enterprises to deploy a private cloud for their data centres," Adams said.

The free toolkit would be released in March, he said [Emphasis added].

He also revealed that Microsoft would add "automation tools and features" into the next revision of its System Center management suite, potentially including some form of auto-provisioning for virtual machines.

That revision was due out "in the next 180 days", Adams said [Emphasis added].

It’s not often that you hear a Microsoft exec provide a specific date for a recently announced product.

James Governor offers 20 RedMonk Predictions for 2010 in this 12/16/2009 post. Here are those most related to cloud computing:

1. Cloud API proliferation will become a serious problem

7. NoSQL will bid for mainstream acceptance

12. Hybrid Cloud and On Premise models for the enterprise – the Big Cloud Backlash will be in full effect in 2010, after all the hype in 2009.

Salvatore Genovese asserts “Google and Microsoft Double Down on Cloud in the Enterprise; Traditional Vendors Play Catch-Up Through M&A Activity” in his Microsoft Azure Will Cannibalize a Global Account –Appirio post of 12/16/2009:

Appirio, a cloud solution provider, highlighted 10 predictions for how cloud computing will impact enterprises in 2010.

Appirio predicts that innovation from cloud ecosystem next year will remove many of the remaining barriers to enterprise adoption of cloud. Industry analysts Gartner and IDC concur, placing cloud computing at or near the top of their own 2010 predictions. …

The prediction Sal refers to is:

7. Microsoft lets Azure cannibalize a global account. Microsoft has shown that it's serious about Azure at this year's Professional Developers Conference. We predict that Azure will cannibalize Microsoft's on-premise footprint at a global account.

Geva Perry’s Application Lifecycle in the Cloud: Update post of 12/16/2009 quotes Daryl Taft’s recent eWeek article:

Back in November I posted Application Lifecycle in the Cloud, making the argument that more and more of the phases of the application lifecycle (dev, testing, monitoring, etc.) are moving to the cloud.

Yesterday, Daryl Taft at eWeek published a nice piece about Atlassian, which in many ways validates what I was saying in the blog post. I actually mentioned Atlassian as a "pioneer" in this area, and it looks like the hosted versions of their developer products -- JIRA and Bamboo -- are gaining significant traction.

Here's a quote from the article:

“Development teams who were previously paranoid about having their source code hosted are moving increasingly towards SaaS, he said. Even large teams with 100-plus developers are now confident to put their "behind the firewall" tools and most critical intellectual property into SaaS-based applications. Github is a good example of this, allowing developers to host their code online, he said.

“Atlassian has noticed this trend with its hosted development product, JIRA Studio.  "When we started, there was no such thing as Github," said Michael Knighten, director of hosted services at Atlassian. "We thought the market would go in that direction, but it was an educated guess, at best. Now the idea of SaaS development collaboration is starting to gain traction."

“One of the interesting trends we have seen is that JIRA Studio is one of our fastest growing products, and we see this as an indication that a lot of development teams are looking to SaaS and hosted solutions,” Gibbs said.”

Geva and James Urquhart have returned to producing “Overcast Show” podcasts on cloud computing topics. Their latest one, “Focus on PaaS,” is here.

Enrique Castro-Leon’s Cloud Computing and the Hype Cycle post of 12/15/2009 analyzes the cloud-computing hype:

It is undeniable that cloud computing activities have come to the forefront in the IT industry to the point that Gartner declares “The levels of hype around cloud computing in the IT industry are deafening, with every vendor expounding its cloud strategy and variations, such as private cloud computing and hybrid approaches, compounding the hype.” As such, Gartner has added cloud computing to this year’s Hype Cycle report and placed the technology right at the Peak of Inflated Expectations.

Michael Sheehan in his GoGrid blog analyzed search trends in Google Trends as indicators of technologies’ mindshare in the industry. Interest in cloud computing seems to appear out of nowhere in 2007, and interest in the subject keeps increasing as of the end of 2009.

Also worth noting is the trend of virtualization, one of the foundational technologies for cloud computing. Interest in virtualization increased through 2007 and reached a plateau in 2008. Likewise, the trend in terms of news reference volume has remained constant in the past two years.

  • Blue line: cloud computing
  • Red line: grid computing
  • Orange line: virtualization

Figure 1. Google Trends graph of search volume index and news reference volume for cloud and grid computing and virtualization. …

J. Nicholas Hoover asserts “The software vendor is using SharePoint, Bing, SQL Azure, and other tools to seize a chunk of the open government market” in his Microsoft Taps Into Open Government Market post of 12/15/2009 to InformationWeek’s Government blog:

With the open government movement in full swing and the Obama administration's Open Government Directive finally in federal agency hands, vendors such as Microsoft are looking to offer up their help.

Earlier this year, Microsoft, Google, and Amazon began offering to host public data on their cloud services, and the competition will likely only heat up. Microsoft has touted the fact that SharePoint is the front-end platform for stimulus-tracking Website Recovery.gov, and clearly has a few other ideas up its sleeves.

For example, Microsoft recently worked with NASA to develop a Website called Be A Martian, part of which could be developed into a crowdsourced discussion platform like Google Moderator, Microsoft federal CTO Susie Adams said in an interview. This feature, which Microsoft and NASA call Town Hall, would allow users to ask questions, vote on them, read responses, earn a reputation, and sort questions by category and statistics like number of votes. The White House earlier this year used Google Moderator to crowdsource questions for a Presidential press conference.

Another Microsoft effort that could be helpful for open government projects is an effort codenamed Dallas. Through Dallas, Microsoft helps customers store strategic data sets on Microsoft's SQL Azure cloud database platform and then "curates" that data by adding an open API to allow developer access and a front-end search feature to query relational data.

If you’re interested in “Open Government,” a.k.a. “Government 2.0,” be sure to subscribe to Andrea DiMaio’s Gartner blog.

<Return to section navigation list> 

Cloud Security and Governance

The Cloud Security Alliance announces Version Two of Guidance Identifying Key Practices for Secure Adoption of Cloud Computing in this press release dated 12/17/2009:

The Cloud Security Alliance (CSA) today issued the second version of its “Guidance for Critical Areas of Focus in Cloud Computing”, now available on the Cloud Security Alliance website.

The Cloud Security Alliance is a not-for-profit organization with a mission to promote the use of best practices for providing security assurance within Cloud Computing, and to provide education on the uses of Cloud Computing to help secure all other forms of computing. The whitepaper, “Guidance for Critical Areas of Focus in Cloud Computing – Version 2.1”, outlines key issues and provides advice for both Cloud Computing customers and providers within 13 strategic domains. Version 2.1 provides more concise and actionable guidance across all domains, and encompasses knowledge gained from real world deployments over the past six months in this fast moving area.

The second version of the guidance tops off a strong inaugural year for the CSA, in which it first published its guidance, grew to 23 corporate members, and joined forces with numerous leading industry groups (such as ISACA, ENISA, the DMTF and the Jericho Forum) to help advance the goal of cloud security.

Chris Hoff (@Beaker) offers his commentary about “Guidance for Critical Areas of Focus v2” in his Cloud Security Alliance v2.1 Security Guidance for Critical Areas of Focus in Cloud Computing Available post of the same date.

• HIPAA.com continues its Exploring HIPAA and HITECH Act Definitions series of posts with Part 13 of 12/16/2009:

[T]hrough December, HIPAA.com is providing a run through of HIPAA transaction & code set, privacy, and security definitions, along with relevant HITECH Act definitions pertaining to breach notification, securing of protected health information, and electronic health record (EHR) standards development and adoption. These definitions are key to understanding the referenced HIPAA and HITECH Act enabling regulations that are effective now and that will require compliance by covered entities and business associates now or in the months ahead, as indicated in HIPAA.com’s timeline. Each posting will contain three definitions, with a date reference to the Federal Register, Code of Federal Regulations (CFR), or statute, as appropriate.

Dana Gardner claims “New HP offerings enable telcos to deliver more safe cloud services fast” in his HP Makes Cloud Computing Safe post of 12/16/2009:

… At Software Universe in Hamburg, Germany, HP today announced three new offerings designed to enable cloud providers and enterprises to securely lower barriers to adoption and accelerate the time-to-benefit of cloud-delivered services. …

Among the new offerings:

  • HP Operations Orchestration, which will automate the provisioning of services within the existing infrastructure, allowing businesses to seamlessly increase capacity through integration with such things as Amazon Elastic Compute Cloud. Look for other public cloud providers to offer this as well.
  • HP Communications as a Service (CaaS), a cloud program that will enable service providers to offer small and mid-size businesses services delivered on an outsourced basis with utility pricing. CaaS includes an aggregation platform, four integrated communications services from HP and third parties, as well as the flexibility to offer other on-demand services.
  • HP Cloud Assure for Cost Control, designed to help companies optimize cloud costs and gain predictability in budgeting by ensuring that they right-size their compute footprints. …

HP is a sponsor of Dana’s BriefingsDirect podcasts.

Ellen Rubin recommends that you “Determine your cloud objectives: what are you trying to accomplish?” in her Five Things to Do Before Moving to Cloud Computing post of 12/15/2009:

Before moving an enterprise application to the cloud, you need to be sure that your expectations are realistic and your objectives match what the cloud can deliver.

In this post, I’d like to share what we’ve learned from working with our beta customers, from their initial exploration of cloud possibilities to going live with a specific application they’ve migrated to the cloud.

The following [abbreviated] steps can help guide the thought process when considering a cloud deployment, and provide a starting point for moving forward.

  1. Determine your cloud objectives. …
  2. Pick an application that makes sense. …
  3. Involve the CSO/risk management team from the beginning. …
  4. Decide which cloud(s) are acceptable. …
  5. Create a sandbox where people can experiment. …

Of course, she fills out her five steps with relevant details. Ellen is the founder and VP for products at CloudSwitch.

Chris Hoff (@Beaker) announced that he’s Speaking at the 2009 Federal Identity Management & Cybersecurity Conference in this 12/15/2009 post:

The (first annual) 2009 Federal Identity Management & Cyber Security Conference is being held in Washington on December 15-16th.  I’m speaking on day two on a panel moderated by Earl Crane of DHS on “Innovation and security in Cloud Computing.”

The Information Security and Identity Management Committee (ISIMC) of the Federal CIO Council is taking steps to deliver on the President’s pledge for cybersecurity. ISIMC will discuss strategies and tactics for securing and defending federal IT systems and networks for trusted and reliable global communication.

The objectives of this conference are awareness, education, and alignment toward a common vision for cyber defense within the federal community. This conference will focus on protecting the nation against cyber aggression, while preserving and protecting the personal privacy and civil liberties that are the core of [A]merican values.

Sorry for the short notice.

Sophos offers a free download of the Ponemon Institute’s The State of Privacy and Data Security Compliance white paper, which Sophos sponsored:

With new privacy and data security regulations increasing, organizations are asking questions. Do the new regulations help or hinder the ability to protect sensitive and confidential information? With these new regulations on the march, how can you remain competitive in the global marketplace? This report provides answers and examines how compliance efforts can impact a company's bottom line.

Download The State of Privacy and Data Security Compliance to learn:

  • The value of compliance to the organization
  • Security practices differences between compliant and non-compliant companies

The State of Privacy and Data Security Compliance study—conducted by Ponemon Institute and sponsored by Sophos—examines whether compliance efforts improve an organization’s relationship with key business partners, help secure more funding for IT security, and improve a company's overall security posture.

<Return to section navigation list> 

Cloud Computing Events

The Institute for Defense & Government Advancement (IDGA) announces Microsoft to participate in the Cloud Computing for DoD & Government Summit in this 12/16/2009 post:

IDGA announced today that Microsoft will participate in the Cloud Computing for DoD & Government Summit, scheduled for February 22-24, 2010 at the Hilton Old Town Alexandria.

The event, chaired by QinetiQ North America, will address the fast-growing IT infrastructure sector of cloud computing.

Speakers from the National Institute of Standards & Technology, Defense Information Systems Agency, Department of Energy, Department of Interior, United States Air Force Reserves, United States Navy and many more will discuss current programs and future plans for cloud computing activities.

While cloud computing provides vast computing power, reliable off-site data storage, wide availability, and lower investment costs, potential security risks remain top-of-mind to IT professionals. Cloud computing eliminates the need to buy hardware and pay to manage it, and offers more flexibility, which is why it appeals to government agencies.

For more information on attending the Cloud Computing for DoD & Government event please visit http://www.CloudComputingEvent.com or contact Alexa Deaton at alexa.deaton@idga.org.

See Chris Hoff’s announcement that he’s Speaking at the 2009 Federal Identity Management & Cybersecurity Conference in the Cloud Security and Governance section.

TechWeb offers a 40% discount to attendees who register for their Cloud Connect conference by 12/31/2009. The conference will be held at the Santa Clara Convention Center from 3/16 to 3/18/2010. Tracks/topics include:

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

OpSource claims its “Cloud Is the First Cloud Computing Solution to Include Microsoft Windows Server 2008 as a Standard Option” in its OpSource Brings Enterprise Class Windows to the Cloud press release of 12/16/2009:

OpSource™, the leader in Cloud operations, today announced that OpSource Cloud™ is the first cloud solution to include Microsoft Windows Server 2008 as a standard server option. Together with Microsoft Corp., OpSource is bringing enterprise class Windows to the cloud through its support of Windows Server 2008. Now, developers and enterprise IT departments that have already standardized on Windows Server 2008 will be able to develop using the same technology in the cloud. OpSource Cloud’s use of enterprise standard technology speeds up development, eliminates the need to learn new systems and increases enterprise confidence in the cloud. The Windows Server 2008 option is available immediately from www.OpSourceCloud.net – and both Windows Server 2008, with all its powerful new features, and Windows Server 2003 are included in the standard price of the OpSource Cloud offering.

“Customers are beginning to demand more enterprise class solutions in the cloud,” said John Zanni, general manager of the Worldwide Software Plus Services Industry for Microsoft Communications Sector, Microsoft. “Microsoft is working with partners like OpSource to meet that customer demand by bringing Windows Server 2008 to the cloud.”

OpSource Cloud is the first cloud to bring together the availability, flexibility and community of the public Cloud with the security, performance and control the enterprise requires. For companies standardizing on Windows Server 2008, this integration makes the move to cloud computing simple and dependable. Windows Server 2003 support is also available on OpSource Cloud. All license fees for Windows Server 2008 and Windows Server 2003 are included in standard OpSource Cloud service fees.

Amazon Web Services’ Amazon EC2 Now Offers Windows Server 2008 announcement of 12/9/2009, as reported in my Other Cloud Computing Platforms and Services post, seems to me to be a week earlier than OpSource’s claim:

Starting today, Amazon EC2 now offers Microsoft Windows Server 2008 and Microsoft SQL Server® Standard 2008 instances in all Amazon EC2 Regions. This new announcement extends Amazon EC2’s existing Microsoft-based offerings that include Windows Server 2003 and SQL Server 2005 instances. Like all services offered by AWS, Amazon EC2 running Windows Server or SQL Server offers a low-cost, pay-as-you-go model with no long-term commitments and no minimum fees. Please visit the AWS website for more information on using Amazon EC2 running Windows.

Or, perhaps OpSource doesn’t consider AWS to be a “cloud computing solution.”

Research and Markets announced a new PaaS Remains on the Edge report in a 12/16/2009 press release:

… How Force.com, Workday, NetSuite and Intuit fit into the emerging enterprise application platform battle for corporate and ISV developers

PaaS remains on the Edge - Why independent software vendors are reluctant to embrace Force.com and why Workday and NetSuite hold promise for corporate developers

At its user conference DreamForce'09, Salesforce.com released some impressive statistics on the traction that its Force.com platform has been gathering. The company claims 135,000 custom applications and 10,000 sites are built on Force.com. Already, 55% of the HTTPS transactions the company processes come through the API (i.e., from partner applications) versus only 45% coming from Salesforce's own applications.

What drives that adoption and how does Force.com stack up against the alternatives?

New research by analyst firm Tech Strategy Partners contrasts the strengths and weaknesses of Force.com vs. platforms provided by NetSuite, Workday and Intuit. Tech Strategy Partners finds that PaaS is gaining most traction with corporate developers, not independent software vendors (ISVs). …

James Hamilton says it’s been a Big Week at Amazon Web Services in this 12/16/2009 post:

There were three big announcements this week at Amazon Web Services. All three announcements are important but the first is the one I’m most excited about in that it is a fundamental innovation in how computation is sold.

The original EC2 pricing model was on-demand pricing. This is the now-familiar pay-as-you-go and pay-as-you-grow pricing model that has driven much of the success of EC2. Subsequently, reserved instances were introduced. In the reserved instance pricing model, customers have the option of paying an up-front charge to reserve a server. There is still no obligation to use that instance, but it is guaranteed to be available if needed by the customer. Much like a server you have purchased but turned off: it’s not consuming additional resources, but it is available when you need it. Drawing an analogy from the power production world, reserved instances are best for base load. This is capacity that is needed most of the time.

On-demand instances are ideal for Peak Load. This is capacity that is needed to meet peak demand over the constant base load demand. Spot instances are a new, potentially very low cost instance type ideal for computing capacity that can be run with some time flexibility. This instance type will often allow workloads with soft deadline requirements to be run at very low cost. What makes Spot particularly interesting is the Spot instance price fluctuates with the market demand. When demand is low, the spot instance price is low. When demand is higher, the price will increase exactly as the energy spot market functions.
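
A toy model of the rule James describes, with invented prices: the instance keeps running while the fluctuating market price stays at or below your maximum bid, and you are billed the market price, not the bid, for the hours you run:

    using System;

    class SpotMarketToy
    {
        static void Main()
        {
            const double bid = 0.05;                          // max $/hour you will pay
            double[] spotPrices = { 0.03, 0.04, 0.06, 0.04 }; // invented hourly prices

            foreach (double price in spotPrices)
            {
                if (price <= bid)
                    Console.WriteLine("spot ${0:0.00}/hr -> running, billed at spot", price);
                else
                    Console.WriteLine("spot ${0:0.00}/hr -> interrupted", price);
            }
        }
    }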

Also announced this week were the Virtual Private Cloud unlimited beta and CloudFront streaming support. …

James continues with an analysis of the significance of the three announcements.

Maureen O’Gara asserts that “VPC is Amazon’s way of creating hybrid cloud computing” in her Amazon’s Virtual Private Cloud Computing Floats into Beta post of 12/16/2009:

… Amazon Web Services (AWS) sent its enterprise-directed Virtual Private Cloud (VPC) widgetry into full public beta Monday. The thing’s been in limited public beta since the summer and before that it was in private beta.

VPC is Amazon’s way of creating hybrid clouds by letting an enterprise connect its existing infrastructure to a set of isolated AWS compute resources via a virtual private network (VPN) – a bog standard encrypted IPsec tunnel – and use its own existing security services, firewalls and intrusion detection systems for the EC2 instances and traffic. Ditto whatever third-party management software it’s using.

Amazon imagines VPC being used for added capacity, disaster recovery and corporate applications such as e-mail systems, financial systems, trouble ticketing systems and CRM apps to save on TCO. It’s expected to be quite popular. …

<Return to section navigation list> 
