Wednesday, December 30, 2009

Windows Azure and Cloud Computing Posts for 12/28/2009+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.

• Update 12/30/2009: Lorraine Lawson: Judith Hurwitz Cloud Computing Interview parts I and II: M. Kramer: Gentlemen, Start Your Cloud Predictions; Hovhannes Avoyan: The Mass Adoption of Cloud Computing Is Coming; Neil MacKenzie: Entities in Azure Tables; Jayaram Krishnaswamy: Windows Azure Platform Training Kit; Mamoon Yunus: The Guillotine Effect of Cloud Computing; Mike Leach: Coding With Azure: "Up or Down" Service Monitor App; Ben Riga: Windows Azure Lessons Learned: Active Web Solutions; and more.

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

To use the above links, first click the post’s title to display the single article you want to navigate.

Cloud Computing with the Windows Azure Platform was published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.

You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:

  • Chapter 12: “Managing SQL Azure Accounts, Databases, and DataHubs*”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated for the November CTP in January 2010. 
* Content for managing DataHubs will be added as Microsoft releases more details on data synchronization services for SQL Azure and Windows Azure.

Off-Topic: AOL Greets 48-Hour CompuServe Classic POP3 and Webmail Outage with Silence on 12/28/2009; OakLeaf Blog Joins Technorati’s “Top 100 InfoTech” List on 10/24/2009.

Azure Blob, Table and Queue Services

• Neil MacKenzie explains how to work with the WritingEntity and ReadingEntity events exposed by the DataServiceContext class in his Entities in Azure Tables post of 12/30/2009. Neil provides extensive source code for the following operations:

    • Writing Data to an Azure Storage Table
    • Handling WritingEntity Events
    • Removing a Property from an Atom Entry
    • Reading Data from an Azure Storage Table
    • Initializing a Property
    • Serializing Generic Properties
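Neil's post has the authoritative code; as a rough illustration of the pattern, a WritingEntity handler that removes a property from the outgoing Atom entry looks something like the sketch below. The wrapper class name and the "Timestamp" property are hypothetical; the event and namespaces are those of ADO.NET Data Services in .NET 3.5 SP1:

```csharp
using System.Data.Services.Client;
using System.Linq;
using System.Xml.Linq;

public static class WritingEntityExample
{
    // Namespaces used in the Atom wire format of ADO.NET Data Services.
    static readonly XNamespace MetadataNs =
        "http://schemas.microsoft.com/ado/2007/08/dataservices/metadata";
    static readonly XNamespace DataNs =
        "http://schemas.microsoft.com/ado/2007/08/dataservices";

    // Wires up a WritingEntity handler that strips a property from the
    // serialized Atom entry before it is sent to the table service.
    public static void Attach(DataServiceContext context)
    {
        context.WritingEntity += (sender, e) =>
        {
            // e.Data is the XElement holding the serialized Atom entry.
            XElement properties =
                e.Data.Descendants(MetadataNs + "properties").FirstOrDefault();
            if (properties != null)
            {
                XElement property = properties.Element(DataNs + "Timestamp");
                if (property != null)
                {
                    property.Remove();
                }
            }
        };
    }
}
```

ReadingEntity works the same way in the other direction, giving you the raw Atom entry before the context materializes the entity.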

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

Stephen Forte’s PowerPivot Demo Part II post of 12/28/2009 demonstrates how to build a more advanced pivot table against the Northwind sample database’s Orders, Order Details, Products, Categories and Employees tables:

About a week ago, I showed a simple demo on my blog of SQL Server PowerPivot for Excel, or just PowerPivot for short. I used SQL Azure as my backend and just did a mapping to a few Northwind tables and then built a pivot table using the Order Details table.

Today I will take that example one small step further and write a custom TSQL query and build a more advanced pivot table against that data source.  In a future post I will build a PowerPivot example against a true OLAP Cube, but that will have to wait for the new year. …

This gives the user an extremely powerful pivot table and the ability to do some pretty sophisticated drill downs and filters. In addition, it is pretty easy to add/remove even more data (Customers and Employees for example) to this pivot table.


[George] Huey and [Wade] Wegner will “Migrate Us to SQL Azure!” on 12/31/2009 when .NET Rocks releases Show #512:

George Huey and Wade Wegner from Microsoft talk to Carl and Richard about George's creation, the SQL Azure Migration Wizard, a free tool that will save you countless hours when migrating SQL Server databases to SQL Azure.

<Return to section navigation list> 

AppFabric: Access Control, Service Bus and Workflow

Scott Gellock reports Windows Azure AppFabric Java and Ruby SDK’s updated for v1 in this 12/28/2009 post:

Over the past couple of days updated versions of the Java and Ruby SDK’s for AppFabric have been posted.  These updated SDK’s account for the changes made in the v1 service release of Service Bus and Access Control.  You can get the updated Java SDK here and the updated Ruby SDK here.

Bruce Kyle announces a new “guide that provides step-by-step instructions on how to use Windows Identity Foundation in Windows Azure solutions” in his Guide Explains How to Build On-Premises, Cloud Identity Aware Apps post of 12/29/2009:

Windows Identity Foundation (previously known as “Geneva” Framework) makes federation and claims-based identity first-class citizens in the .NET Framework. Developers can now rely upon a single, consistent model for creating identity-aware applications on ASP.NET and WCF.

We just published a guide that provides step-by-step instructions on how to use Windows Identity Foundation in Windows Azure solutions, so that on-premises or partner identities can be seamlessly used regardless of where your resources are hosted, on-premises or in the cloud.

The guide also provides explanations of the tradeoffs that are necessary at this point for making the two pre-releases to work together, highlighting the attention points and the workarounds, and will be updated regularly as we move toward commercial availability. We invite you to download the guide and experiment with the simple but powerful scenario it enables: stay tuned for updates and for more scenarios of claims-based identity in the cloud.
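On the application side, the claims-based model the guide describes largely reduces to inspecting the claims WIF attaches to the authenticated principal. A minimal sketch against the pre-release Microsoft.IdentityModel API follows; the email claim lookup is just an illustration, not a scenario from the guide:

```csharp
using System.Linq;
using System.Threading;
using Microsoft.IdentityModel.Claims;

public static class ClaimsExample
{
    // Call inside an ASP.NET page or WCF operation after WIF has
    // authenticated the caller and set the current principal.
    public static string GetCallerEmail()
    {
        IClaimsIdentity identity =
            (IClaimsIdentity)Thread.CurrentPrincipal.Identity;

        // Each claim carries a type, a value, and the issuer that asserted it;
        // the application no longer cares whether the issuer was on-premises
        // ADFS or a cloud-hosted STS.
        return identity.Claims
            .Where(c => c.ClaimType == ClaimTypes.Email)
            .Select(c => c.Value)
            .FirstOrDefault();
    }
}
```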

<Return to section navigation list>

Live Windows Azure Apps, Tools and Test Harnesses

The Centers for Medicare & Medicaid Services (CMS) and the Office of the National Coordinator for Health Information Technology (ONC) announce CMS and ONC Issue Regulations Proposing a Definition of ‘Meaningful Use’ and Setting Standards for Electronic Health Record Incentive Program in a 12/30/2009 press release:

The Centers for Medicare & Medicaid Services (CMS) and the Office of the National Coordinator for Health Information Technology (ONC) encourage public comment on two regulations issued today that lay a foundation for improving quality, efficiency and safety through meaningful use of certified electronic health record (EHR) technology. The regulations will help implement the EHR incentive programs enacted under the American Recovery and Reinvestment Act of 2009 (Recovery Act).

A proposed rule issued by CMS outlines proposed provisions governing the EHR incentive programs, including defining the central concept of “meaningful use” of EHR technology. An interim final regulation (IFR) issued by ONC sets initial standards, implementation specifications, and certification criteria for EHR technology.  Both regulations are open to public comment. …

The IFR issued by ONC describes the standards that must be met by certified EHR technology to exchange healthcare information among providers and between providers and patients. This initial set of standards begins to define a common language to ensure accurate and secure health information exchange across different EHR systems.  The IFR describes standard formats for clinical summaries and prescriptions; standard terms to describe clinical problems, procedures, laboratory tests, medications and allergies; and standards for the secure transportation of this information using the Internet.

The IFR calls for the industry to standardize the way in which EHR information is exchanged between organizations, and sets forth criteria required for an EHR technology to be certified. These standards will support meaningful use and data exchange among providers who must use certified EHR technology to qualify for the Medicare and Medicaid incentives.

Under the statute, HHS is required to adopt an initial set of standards for EHR technology by Dec. 31, 2009.  The IFR will go into effect 30 days after publication, with an opportunity for public comment and refinement over the next 60 days.  A final rule will be issued in 2010.  “We strongly encourage stakeholders to provide comments on these standards and specifications,” Dr. Blumenthal said.

The Recovery Act established programs to provide incentive payments to eligible professionals and eligible hospitals participating in Medicare and Medicaid that adopt and make “meaningful use” of certified EHR technology.  Incentive payments may begin as soon as October 2010 to eligible hospitals.  Incentive payments to other eligible providers may begin in January 2011. …

The CMS proposed rule and fact sheets may be viewed at

ONC’s interim final rule may be viewed at

In early 2010, ONC intends to issue a notice of proposed rulemaking related to the certification of health information technology.

The 556-page PDF of the proposed “Medicare and Medicaid Programs; Electronic Health Record Incentive Program” rule (CMS-0033-P, RIN 0938-AP78), which affects 42 CFR Parts 412, 413, 422, and 495, as it appears in the Federal Register is available for online review here. Following is a summary:

This proposed rule would implement the provisions of the American Recovery and Reinvestment Act of 2009 (ARRA) (Pub. L. 111-5) that provide incentive payments to eligible professionals (EPs) and eligible hospitals participating in Medicare and Medicaid programs that adopt and meaningfully use certified electronic health record (EHR) technology. The proposed rule would specify the-- initial criteria an EP and eligible hospital must meet in order to qualify for the incentive payment; calculation of the incentive payment amounts; payment adjustments under Medicare for covered professional services and inpatient hospital services provided by EPs and eligible hospitals failing to meaningfully use certified EHR technology; and other program participation requirements. Also, as required by ARRA the Office of the National Coordinator for Health Information Technology (ONC) will be issuing a closely related interim final rule that specifies the Secretary’s adoption of an initial set of standards, implementation, specifications, and certification criteria for electronic health records. ONC will also be issuing a notice of proposed rulemaking on the process for organizations to conduct the certification of EHR technology.

According to a tweet from Practice Fusion’s Glenn Laffel, MD, “Proposed measures for each of the Meaningful Use criteria begin on page 65.” Dr. Laffel’s earlier posts detail elements of the telephonic briefing described below.

Emily at Practice Fusion posted an ALERT: Meaningful Use Announcement Today on 12/30/2009:

ONC and CMS are holding a press conference this afternoon and will likely announce revisions to the “Meaningful Use” criteria for electronic health record (EHR) systems. To learn more about Meaningful Use and the HITECH Act, visit our Stimulus Center.

The briefing will be at 5:15 p.m. ET, Toll-Free Dial: (800) 837-1935, Conference ID: 49047605, Pass Code: HITECH

Practice Fusion’s physicians and EHR experts will be available for immediate comment on the announcement. Follow along with Dr. Laffel’s thoughts on Twitter or contact Practice Fusion's press team for more details.

Practice Fusion’s blog post continues with the full text of HHS’s CMS and ONC to Discuss Next Steps in Electronic Health Records Programs press release.

The Azure Support Team’s Tom posted Getting SSL certificates to work on 12/30/2009. His article observes:

There are a number of useful informational articles out on the web on how to deal with SSL Certificates with Windows Azure.  The first few places to start are:

There may be times, depending on the certificate you are trying to use, when these steps won’t be enough.  You may see an error like:

At least one certificate specified in your service definition is not found.
Please upload these certificate(s), and then upload your application package again.
- Dr. Watson Diagnostic ID: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

Tom suggests this resolution:

If you see something like this, and you are using a certificate from a 3rd party, you may need to get the intermediate certificates installed as well.  One problem that happens with some 3rd parties is that they will not give you a .pfx file.  Since that is the only file type you can upload to Azure for certificates, you have to convert them.  All that is needed is to create a .NET application with the following code:

using System.IO;
using System.Security.Cryptography.X509Certificates;

// The path to the certificate.
string certificate = @"c:\test.cer";

// Load the certificate into an X509Certificate object.
X509Certificate cert = new X509Certificate(certificate);

// Export the certificate as password-protected PKCS #12 (.pfx) data.
byte[] certData = cert.Export(X509ContentType.Pkcs12, "Password");

File.WriteAllBytes(@"c:\test.pfx", certData);

Or you can do the same in PowerShell:

$c = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("c:\test.cer")
$bytes = $c.Export("Pfx", "Password")
[System.IO.File]::WriteAllBytes("c:\test.pfx", $bytes)

Running this will allow you to create .pfx files for each certificate and then import them into Azure.  Just be sure to change “Password” and enter the same password on the Azure dashboard when you upload each certificate.

• Jayaram Krishnaswamy questions why Windows 7 and Visual Studio 2010 were omitted as operating systems and prerequisites for the latest Windows Azure Platform Training Kit in this 12/30/2009 post.

Just an oversight?

• Mike Leach provides the source code and a video walkthrough of his Coding With Azure: "Up or Down" Service Monitor App in this 12/29/2009 post:

Every year during the holiday break I look forward to digging into a fun software project and learning something new. This year I developed a simple application called "Up or Down" using the Microsoft Windows Azure cloud platform. Windows Azure represents a huge bottom-up R&D effort by Microsoft to support highly scalable Internet-connected applications across a number of devices and platforms.

Jump to the 5 minute video walkthrough or Download the source code from Codeplex.

Mike continues with a detailed description of Up or Down’s architecture.

• Ben Riga describes the move of CCH’s sales tax calculation service from an in-house to cloud application in his Windows Azure Lessons Learned: CCH post of 12/30/2009:

In this episode of Lessons Learned we talk with Jones Pavan and Gurleen Randhawa of CCH [formerly Commerce Clearing House] about tax and accounting!  No wait stick around, that stuff can be exciting too.  Yes, really!  :)

The good folks at CCH (a Wolters Kluwer company) have built an interesting service on Windows Azure.  The solution we discuss here is a sales tax calculation service which they offer to other accounting firms.  This is an existing on-premises product that they are now moving to the cloud.

Channel 9: Windows Azure Lessons Learned: CCH

The existing product was a stateless web service that was designed to live behind the firewall.  The service is meant to be called directly via a plug-in in an accounting firm’s ERP system (for example, Dynamics AX).  To move that to the cloud CCH wrapped the web services in Windows Communication Foundation (WCF).

They had been using another third party RAD development tool called CA Plex.  The Plex runtime was added to the project and copied out to the cloud.  One of the things they quickly learned is that the nature of the cloud app is to be stateless and that required special consideration when moving on-premises apps (for example the Plex tool was caching db connections behind the scenes).

Another important consideration was security.  They were not ready to move to ACS, so for the initial release they used X.509 cert[ificate]s, ADFS and message-based security to establish trust relationships with the server.

BTW, the Windows Azure marketing folks have already published a case study on the CCH solution (available here).

Ben Riga’s Windows Azure Lessons Learned: Active Web Solutions post of 12/29/2009 links to a Channel9 video, which describes an Azure-based tracking system for fishing vessels:

There are not many solutions that can claim to have saved lives.  In this episode of Lessons Learned I chat with Richard Prodger of Active Web Solutions about the Windows Azure project they’ve been working on that tracks fishermen in real time. It monitors not only their location but also their status so as to immediately raise the alarm if help is needed (e.g. fallen off the side of a boat or pressed a panic button). This solution is already credited with saving the lives of 9 fishermen.

Channel 9: Windows Azure Lessons Learned: Active Web Solutions

Electronics on the fishing vessels communicate directly via satellite to the Windows Azure solution. Those messages are processed via Windows Azure worker roles and routed using the Windows Azure AppFabric Service Bus to various on-premises systems for review and action. The desktop client overlays marine charts onto Bing Maps so that the coast guard gets a visual representation of the exact location of boats that have raised alarms.

The good folks at Active Web Solutions have published some of the source code that they developed to “automatically bridge arbitrary TCP endpoints, handling any intermediate firewall traversal.”  The code is available on CodePlex as the SocketShifter project:

If this is interesting, you should also have a look at Port Bridge, published by Clemens Vasters on his blog.  Clemens describes it as “Socketshifter’s older brother.”

Robert Rowley, MD’s 2009 – the Year in Review for Health IT post of 12/29/2009 describes the primary impediments to implementing fully functional Electronic Health Records (EHRs) as envisioned by “ARRA and its HITECH section, which earmarked $19.2 billion intended to increase the use of Electronic Health Records (EHR) systems by physicians and hospitals, unprecedented attention has been focused on EHR systems.” Dr. Rowley continues:

The “hook” to try to incentivize physicians to adopt EHRs (given that the adoption rate as reported in the New England Journal of Medicine was 4% for fully-functional EHRs and 13% for basic systems) revolved around offering bonus incentives through Medicare and Medicaid payments for demonstrating “meaningful use” of “certified” EHRs. Such payments would begin in 2011 and extend through 2015, and thereafter the “stick” of payment reductions for failure to adopt EHRs would start to kick in.

As a result, the topic of EHR adoption has been on the minds and lips of physicians all year. And the state of things might best be summed up by an anecdotal comment I heard recently: “we finally adopted an EHR in our clinic, and it’s ok I guess, but the real problem is that they all need to talk to each other.”

The difficulty in making that happen – in getting EHR systems to talk to each other – has been a much harder nut to crack than one might have imagined. While the Office of the National Coordinator (ONC) has been working furiously all year developing a framework for defining “meaningful use” and is still detailing how to define “certification”, one of the big areas of attention has been on exactly this concern – interoperability between systems. …

Microsoft’s Public Sector DPE Team describes a Government Mapping solution built with Silverlight 3 and Windows Azure + SQL Azure in this detailed 12/28/2009 post:

Almost every single government entity is looking to deliver contextual information on maps, so this reusable framework built by our partner iLink Systems can help deliver a great experience to the end-users.

Reusable Framework for GIS Mapping Applications/Solutions

Partner iLink Systems has come up with a great framework and solution built with Silverlight 3 hosted on Windows Azure platform, consumes data stored in SQL Azure (cloud database) and uses Bing Maps for the visualization.

You can access the sample solution built using their framework at. The sample may be available only for a month or so, but feel free to send a note to to learn more about their framework and discuss how your state/agency can utilize it to deliver a ton of GIS mapping-based solutions for one or multiple departments.

The sample mapping solution depicts a fictitious county/city, showing contextual information (e.g., offices/locations where government services are offered) on Bing Maps, but their framework can be used to create any GIS mapping application pretty easily. The solution not only has a rich user experience and is accessible in all modern browsers (via the Silverlight plug-in), but the end user can also install a shortcut to the application on the desktop and/or Start menu of Windows (Windows XP, Windows Vista and Windows 7) simply by right-clicking the application’s canvas/surface.

The partner chose the Windows Azure platform simply because the data shown on the maps is public; however, a solution/application built using their framework can be hosted on your on-premises infrastructure too.  Once you get hold of the framework, you will be pleased that the application can talk to an on-premises SQL Server database or a cloud-resident SQL Azure database with a simple change to the database connection string (stored in the web configuration file).
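The connection-string swap described above amounts to something like the following in web.config; the server, database, and credential values here are placeholders, not iLink’s actual settings:

```xml
<!-- On-premises SQL Server -->
<connectionStrings>
  <add name="GisDb"
       connectionString="Data Source=localhost;Initial Catalog=GisMapping;Integrated Security=True" />
</connectionStrings>

<!-- Same application pointed at SQL Azure: only the connection string changes.
     SQL Azure requires SQL authentication (user@server) and an encrypted
     connection; Windows integrated security is not supported. -->
<connectionStrings>
  <add name="GisDb"
       connectionString="Server=tcp:myserver.database.windows.net;Database=GisMapping;User ID=user@myserver;Password=YourPassword;Encrypt=True" />
</connectionStrings>
```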


Bob Familiar describes how he ported his personal music Web site to Windows Azure, SQL Azure, Silverlight 3 and Expression Blend 3 with the beta of Visual Studio 2010 in his Migrating Applications to Windows Azure post of 12/28/2009:

SFPic2… Over the past three years I have been using my personal music website as a sandbox for experimenting with the latest Microsoft Application Platform features for building service oriented web applications. The last version of the site featured WCF SOAP services and a Silverlight 2 UI. The site was hosted at DiscountASP and leveraged SQL Server 2005 for the data store. I posted the details of this development effort back in August of 2008.

It has been a year and a half since I have done any work on the site. With the recent release of Silverlight 3, Expression Blend 3, Windows Azure, SQL Azure and the beta of Visual Studio 2010 I felt it was time. One of my primary goals with this migration effort was to reuse as much of my existing code as possible. Hopefully you will find my experience useful as you embark on your own migration efforts. …

The details of the Windows Azure migration project can be found in the following six posts

  1. The Architecture of a Service Oriented Rich Internet Application
  2. Setting up your Azure Account and Development Environment
  3. Migrating your SQL Server database to SQL Azure
  4. Configuring your Visual Studio 2010 Windows Azure Project
  5. Migrating WCF SOAP services to WCF RESTful services
  6. Using Silverlight 3 to create a rich user interface that can run in or out of the browser
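Step 5, moving WCF SOAP services to RESTful ones, largely comes down to swapping the binding from basicHttpBinding to webHttpBinding and annotating operations with the Web HTTP attributes. A minimal sketch follows; the service and type names are hypothetical, not Bob's actual contracts:

```csharp
using System.Runtime.Serialization;
using System.ServiceModel;
using System.ServiceModel.Web;

[ServiceContract]
public interface ISongService
{
    // The SOAP version needs only [OperationContract] over basicHttpBinding.
    // Adding [WebGet] and hosting over webHttpBinding (with the webHttp
    // endpoint behavior) makes the same operation addressable as a REST URI.
    [OperationContract]
    [WebGet(UriTemplate = "songs/{id}", ResponseFormat = WebMessageFormat.Xml)]
    Song GetSong(string id);
}

[DataContract]
public class Song
{
    [DataMember] public string Id { get; set; }
    [DataMember] public string Title { get; set; }
}
```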

<Return to section navigation list> 

Windows Azure Infrastructure

• Matt Asay claims An application war is brewing in the cloud in this 12/30/2009 post to C|Net’s Open Road blog:

Today's cloud-computing vendors focus on infrastructure, but that won't be the case for long. It can't be. As competing vendors seek to differentiate themselves, they're going to move "up the stack" into applications.

It's like the history of enterprise computing, played out in months and years instead of decades.

Oracle arguably set this strategy in motion when it acquired its way to a complete infrastructure-plus-applications portfolio to lower customer acquisition costs and improve its competitive differentiation for CIOs. IBM and Microsoft also went that route, though to differing degrees and in different ways.

Cloud-computing platform vendors are going to have to do the same thing, except they don't have the luxury of waiting. …

Matt is vice president of business development at Alfresco, a company that develops open-source software for content management. His earlier 2010 the year of cloud-computing...M&A post of 12/29/2009 points out that:

… Gartner expects the cloud-related SaaS market to top $8 billion in 2009, which suggests that real customers are paying real money.

They may not be paying enough, however, to support the mushrooming cloud vendor marketplace. Not yet.

Industry insiders are predicting a shakeout as pre-recession venture funding runs out for many of the early cloud vendors, forcing them into the arms of buyers or bankruptcy courts. …

• Ted Schadler anoints Consumer broadband [a]s the workforce technology of the decade in his 12/30/2009 post to ZDNet’s “The View from Forrester Research” blog:

That call may surprise you. You might have put storage or Gigabit ethernet or the Internet itself at the top of the list. But when I think about what’s different in the life of your average information worker as the decade comes to a close, it’s the instant-on access to just about everything that the adoption of consumer broadband has fueled.

From our Consumer Technographics® survey of over 50,000 consumers every year for the last 12 years, between 2000 and 2009, consumer broadband soared from 2% to 63% of US households. For context, home PC adoption grew from 51% to 77%.

But why is consumer broadband the workforce technology of the decade? …

Ted details three main reasons:

    • Telecommuting has become a way of life for millions of information workers …
    • Broadband-enabled markets have triggered massive IT innovation …
    • Consumers master new technologies at home — and expect the same at work …

• Hovhannes Avoyan asserts The Mass Adoption of Cloud Computing Is Coming in this 12/30/2009 post, which quotes Bob Rudy, CIO of Avago, about moving its internal applications to the cloud:

I’m really loving a phrase that I read recently about cloud computing. It came from the CIO of Avago Technologies, a San Jose, CA-based semiconductor maker, which is gradually migrating its data and apps to the cloud from its internal servers – including recruiting, human resources, e-mail and web security.

According to Bob Rudy, CIO at Avago, migration has saved the company millions of dollars by eliminating hardware needs and software licenses and improving security, speed and storage.

Moving to the cloud has also freed up employees from annoying and trivial tasks like managing their e-mail, enabling them to focus more on their core jobs.

But Bob phrased a simple description about the pull of cloud computing that I’d like to share: “The days of owning software are coming to an end.”

Bob was featured in a recent story in the San Francisco Gate about the rise of cloud computing, which called Bob’s statement “an increasingly common sentiment.” …

• M. Kramer’s Gentlemen, Start Your Cloud Predictions post of 12/29/2009 offers brief, tongue-in-cheek prognostications from B and L Associates about:

  • Consolidation of cloud computing vendors
  • Adoption of cloud interoperability standards
  • Cloud insurance and cloud brokers
  • “Cloud” becoming a verb

in 2010. Definitely worth reading.

• Mamoon Yunus suggests “Try asking your SaaS partner to put an agent in their container - good luck!” in his The Guillotine Effect of Cloud Computing post of 12/30/2009, which seconds Dave Linthicum’s assertion that cloud computing will kill design-time service governance (see below).

David Linthicum asserts “Every technology wave drowns something” in his Cloud computing will kill these 3 technologies post of 12/29/2009 to his InfoWorld Cloud Computing blog. According to Dave, the three technologies awaiting their demise are:

    1. Design-time service governance
    2. Older and smaller clouds
    3. Tier 2 enterprise software

Of course, Dave elaborates on his choices.

Arthur Cole recommends WAN Optimization for a Better Cloud in this 12/29/2009 post to ITBusinessEdge:

Amid all the predictions about 2010 being a banner year for virtualization and cloud computing, there is little consideration given to the impact these technologies will have on wide area networking.

But as the enterprise begins to port more applications and data onto both public and private cloud infrastructures, it will become increasingly obvious that optimizing the WAN will be a key factor in ensuring the kinds of performance levels that users have come to expect from enterprise IT.

That's part of the reason market analysts are expecting WAN optimization to kick into high gear in the coming year, along with the rest of the networking segment. Infonetics Research noted that sales of optimization equipment jumped a healthy 12 percent in the third quarter, breaking a string of poor results. Top vendors like Blue Coat, Cisco and Riverbed led the rebound, which coincided with single-digit growth in both the router and Ethernet switch markets.

According to a recent survey by Expand Networks, virtualization and cloud computing will be the key drivers for WAN optimization in the coming year. The company reports that a recent survey of IT executives found that 75 percent are planning new virtualization deployments next year, with two-thirds of those saying WAN optimization would improve system performance. A slightly smaller figure was planning on cloud deployments, but the vast majority would combine it with WAN optimization services if the price was right. …

Patrick Thibodeau lists 10 big cloud trends for 2010 in this 12/29/2009 ComputerWorld article posted to ITWorld:

    1. Commodity cloud price slashing continues
    2. A move to simpler cloud pricing models
    3. Enterprise application vendors embrace metering
    4. Cloud providers increasingly offer enterprise-caliber SLAs
    5. New technologies will improve cloud use and performance
    6. Cloud providers address security concerns
    7. Performance monitoring will become ubiquitous
    8. Open standards for cloud computing advance
    9. Politics will drive decisions
    10. The cloud will decentralize IT decision-making

Phil Wainwright describes Cloud delusions at the turn of the decade with an emphasis on Windows Azure in his 12/28/2009 post to ZDNet’s Software as Services blog:

Continuing my series of posts about the big themes on this blog over the past year, I now turn to the topic of cloud computing. My views on how to implement cloud and SaaS have hardened considerably over the course of 2009. Halfway through the year, I took a conscious decision to promote multi-tenancy as the only acceptable architecture for cloud and SaaS infrastructures. You might expect that from someone who makes a living consulting for multi-tenant vendors. But I’ve deliberately chosen a hardline and controversial stance, intended as a counterpoint to the many siren voices that argue for a more hybrid approach.

I still see migration to the cloud as a journey, but I’m concerned that too many people, especially with the advent of platforms like Windows Azure, have decided they can achieve all the benefits by going just some of the distance. This is a risky self-delusion, and the more people fool themselves this way, the more the cloud model will be discredited, not because of inherent weaknesses, but through implicit association with the disasters and disappointments these half-hearted implementations will bring in their wake. There are several different cloud delusions to beware of. [Emphasis added.] …

Phil continues with links to earlier articles about “cloud delusions.”

<Return to section navigation list> 

Cloud Security and Governance

• Lorraine Lawson interviews Judith Hurwitz in her The Best Cloud Case: What Functions You Should – and Shouldn't – Consider Moving (Part II) post of 12/30/2009:

Lorraine Lawson recently interviewed Judith Hurwitz, president of business consulting firm Hurwitz & Associates and co-author of the recently released Cloud Computing for Dummies. In part one of the interview, Hurwitz explained why an integration strategy is important for moving to the cloud. In this second half, she discusses integration with the cloud, what you should and shouldn't move to the cloud and a prediction for 2010.

Part I of the interview is Baby Steps and Integration Strategy Key When Moving to the Cloud of 12/23/2009.

Mamoon Yunus’s Understanding Cloud Taxonomies and Security post of 12/29/2009 reports:

OWASP AppSec DC 2009 had a compelling session that defined cloud taxonomies and the security implications associated with cloud computing. The three taxonomies that have become part of our vernacular are:

  1. Infrastructure as a Service (IaaS)
  2. Platform as a Service (PaaS)
  3. Software as a Service (SaaS)

Mamoon’s post includes a link to Dennis Hurst’s Cloud Security and its Affect on Application Security presentation. Dennis is a Security Engineer for HP and a Founding member of the Cloud Security Alliance.

Mamoon predicted Cloud Reliability Will Be Bigger than Cloud Security for 2010-11 in this earlier (12/23/2009) post:

We have all the tools for securing information in a Cloud: establishing trust through identity, data privacy through encryption, and content integrity through signatures. We are overly focused on Cloud Security issues and less on reliability. This is all about to change. Following the outages experience[d] by Amazon EC2 in 2009, another premiere cloud provide[r], Rackspace, suffered an outage on December 18. Using technology such as Forum Systems XML/Cloud gateways is essential for establishing multi-cloud reliability and fault tolerance. …
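The content-integrity leg of that toolkit can be sketched with a short example. The key and payload below are invented placeholders; this is a generic illustration of signature-based tamper detection, not a reproduction of a Forum Systems gateway.

```python
import hashlib
import hmac

def sign(key: bytes, message: bytes) -> str:
    """Return an HMAC-SHA256 signature over the payload."""
    return hmac.new(key, message, hashlib.sha256).hexdigest()

# Placeholder key and payload, not a real gateway configuration.
key, msg = b"shared-secret", b"<order id='42'/>"
sig = sign(key, msg)

# An unmodified payload verifies; a tampered one does not.
print(hmac.compare_digest(sig, sign(key, msg)))                   # True
print(hmac.compare_digest(sig, sign(key, b"<order id='43'/>")))   # False
```

A gateway appliance applies the same principle at the network edge, so individual applications never handle the keys themselves.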

Mamoon is the CEO of Forum Systems, which manufactures the Forum Sentry, a cloud gateway that he describes in his Why is a Cloud Gateway Required? post of 12/30/2009.

Mike Vizard analyzes Mamoon’s XML security appliance approach in a Reducing the Complexity of Application Security post of 12/21/2009:

As business-to-business interactions over the Web become more pervasive, so too does the complexity associated with securing those transactions.

Unfortunately, all that complexity serves only to dissuade businesses from integrating business processes across the Web at a time when we want to encourage that behavior. So the challenge facing chief technologists is to find a way to make it simpler to integrate business processes without having to introduce complex layers of security.

Forum Systems CEO Mamoon Yunus thinks his company has the answer in the form of an identity broker appliance that sits at the edge of the corporate network. Instead of trying to layer security software into every application, Yunus is arguing that all the security related to XML schemas associated with service oriented architecture (SOA) applications should be handled via an appliance. …

Michael Krigsman quotes Dana Gardner in his SOA adoption: The human face of governance post of 12/28/2009 to the Enterprise Irregulars blog:

Human and organizational factors such as politics, information silos, and change management generally underlie IT success and failure.

During a recent conversation, IT analyst and fellow ZDNet blogger, Dana Gardner, affirmed the extent to which this view applies to service-oriented architecture (SOA) projects.

I asked Dana to elaborate on the importance of organizational dynamics with respect to SOA governance and adoption:

“In the many discussions we’ve had with analysts, users and suppliers over the advancement of SOA and governance, we always came back to the human element as a weak or missing link. Successful adoption and efficiency of SOA hinges on acceptance, trust, collaboration and new ways for people to work together. SOA also depends on people and groups that had not in the past worked closely to begin doing so.

“When I heard more about Asuret and Pragmatic Enterprise 2.0 [my professional involvements], I saw a unique way to bring the human element of SOA into focus. The process of gathering perceptions across teams and any affected people — with trend lines over time for those — really strikes me as essential to understand how and why SOA is working — or not.

“I think that SOA efforts will gain significant tools for how to gauge and assuage the progress of SOA adoption, and the best means of governance over long periods of time once SOA activities gain traction.” …

Dana also highlighted the gap between technical and human governance issues:

“SOA governance does a great job at tracking artifacts and defining and automating rules and policies for artifacts. The role of IT infrastructure can be managed via SOA governance, and the composition and business process alignment of services are also well served by governance. But how people really feel about how the processes are developed, implemented and refined is a bit of a black hole when SOA governance is employed. Without a strong view of the perceptions and their change patterns, SOA and governance are obtuse rather than precise instruments.”

Following our discussion, Dana brought together a group of top SOA analysts to examine these issues in depth.

This post reprints, in its entirety, Dana’s article on that analyst session. You can listen to the podcast by clicking the player at top of this post. …

Hovhannes Avoyan claims “Think giants of commerce and names like Amazon, Walmart and Expedia come up” in his Commercial Giants Held Hostage to Denial of Service post of 12/28/2009:

Think giants of commerce and names like Amazon, Walmart and Expedia come up.

Now, think how much those giants depend on the huge cloud computing infrastructure to be secure and reliable to keep their businesses running and in shape.

So, if you’re an IT person, you can imagine how serious a Distributed Denial of Service (DDoS) attack is to online commerce.

Now, add the fact that it happened on the day before Christmas eve to those giants of commerce.

It appears that the attack was aimed at the servers of Neustar, which offers DNS services to many major companies under the name UltraDNS.

The attack started at 4:45 p.m. PST and lasted for about an hour. It was compounded by the fact that it also affected Amazon’s S3 and EC2 cloud services. A lot of web services rely on Amazon’s cloud infrastructure.

To Neustar’s credit, it reacted quickly and contained the damage to the Northern California area. …

“Forrester Research SOA expert Randy Heffner discusses how to establish an iterative design process for evolving your SOA security architecture that considers your current and future security requirements, emerging industry specifications, overlaps in product functionality for SOA security, and possibilities for custom security integration” in his SOA Security: Good Enough and Getting Better post of 8/18/2009 to the CIO blog:

Security is not a reason to stay away from SOA. Although full SOA security maturity is yet to come, 30 percent of organizations now use SOA for external integration with customers and partners. For standard Web services using SOAP, WS-Security has achieved critical mass as a foundational standard. On the other hand, advanced SOA security — involving federation among partners, nonrepudiation, and propagation of user identities across multiple layers of service implementations — is in its early days. To navigate the path from what's practical today to the future of advanced SOA security, establish an iterative design process for evolving your SOA security architecture that considers your current and future security requirements, emerging industry specifications, overlaps in product functionality for SOA security, and possibilities for custom security integration.

<Return to section navigation list> 

Cloud Computing Events

Ben Day’s Beantown .NET Meeting on Thursday, 1/7/2010: Jason Haley, Windows Azure SDK post of 12/29/2009 announces:

Beantown .NET is going to be meeting on Thursday, 1/7/2010. This month we have Jason Haley presenting “Get Started with the Windows Azure SDK”.

As always, our meeting is open to everyone so bring your friends and co-workers – better yet, bring your boss. It is not required to RSVP for our meetings but if you know you’re coming, please RSVP by 3pm on the day of the meeting to help speed your way through building security and to give us an idea how much pizza to order. Click here to RSVP.

See the [George] Huey and Wade [Wegner] “Migrate Us to SQL Azure!” item in the SQL Azure Database (SADB) section; .NET Rocks releases Show #512 on 12/31/2009.

Bruce Kyle recommends that you Mark Your Calendars for Microsoft BI Conference, which is colocated with Microsoft Tech*Ed North America in New Orleans during the week of 6/7/2010:

The conference will focus on the following categories:

  • Empower Your Users – Microsoft Excel (including PowerPivot for Excel), SSRS (Report Builder), Microsoft Visio, Search/FAST
  • Improve Organizational Effectiveness - SharePoint Server 2010 (including PerformancePoint Services, Excel Services, Visio Services)
  • Increase IT and Developer Efficiency - SSAS, SSRS, SSIS (including best practices for implementation), MDS, PowerPivot for SharePoint
  • Partner Services and Solutions Delivery – Partner and ISV methodology and solution offerings that augment our BI solution, Integration with other products (e.g., CRM)
  • Customer Business Value – Learn how customers are using our products in the field with in-depth sit downs with a TDM angle

Registration and demand generation efforts will kick-off in early January.

Hopefully, SQL Azure users will finally learn what BI features will be offered in the cloud. However, Eric Lai reports “Business intelligence as a service remains too scary for most enterprises, analyst says” in his Startups keep on-demand BI faith, but big vendors wait and see article of 12/22/2009 for InfoWorld’s Cloud Computing blog:

When a tech vendor fails, rivals usually rejoice. Not in the nascent BI (business intelligence)-on-demand space, after well-funded startup LucidEra folded in June.

The company had raised almost $21 million in funding from high-profile venture capitalists. Though competitors immediately began wooing LucidEra's customers, they also felt compelled to put out public statements saying that its failure was an outlier, not the beginning of a trend. …

Not only have customers failed to adopt BI-on-demand the way they have other kinds of apps, such as CRM or productivity suites (Google Apps), but it has also received little validation from the big players.

IBM, for one, is only starting to "research" a Web version of its Cognos BI software.

"The data model differs from company to company, which is why you just can't put it into a multi-tenant environment," said Rob Ashe, general manager for IBM's BI and performance management division.

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

Randy Bias describes his Infrastructure-as-a-Service Builder's Guide v1.0 white paper of 12/2009:

This paper is targeted at anyone building public or private clouds who wants to understand clouds, cloud computing, and Infrastructure-as-a-Service. It highlights some of the important areas to think about when planning and designing your infrastructure cloud.

His earlier Cloud Standards are Misunderstood and Subscription Modeling & Cloud Performance white papers, as well as a brief biography, are available here.

Daniel Roth summarizes his Time Your Attack: Oracle’s Lost Revolution feature article for Wired Magazine’s December 2009 issue: “In 1995, Larry Ellison began his network computer crusade. The effort failed, but the concept would spark what became cloud computing.” I would add “… and netbooks, too.” Roth continues on page 3:

“Ellison is often time-dyslexic — right about the fundamental trend but wrong on timing,” says David Roux, a partner at private equity firm Silver Lake and a former Oracle executive vice president. “It’s hard to look at a $299 netbook and not see the NC vision come to life.”

<Return to section navigation list> 

AOL Greets 60-Hour+ CompuServe Classic POP3 and Webmail Outage with Silence

Update 12/30/2009 12:40 PM PST: Changed “48-Hour” to “60-Hour+.” The @aolmail folks finally acknowledged that they were having problems with e-mail in this tweet of Wednesday morning:

It’s about time.

Update 12/29/2009 8:35 AM PST: Received the following message from Marcel Brewer, the AOL Mail technical representative for the CompuServe Classic service who is mentioned in posts to the CompuServe Help Community forum:

Please accept my apologies. I have been out of the office and have just been made aware of the issue. I am working with the entire team to ensure that this will not happen again. This is clearly not the way we want to offer service to our users.

The issue should have been solved as of yesterday evening. …

There are still no replies on Twitter to tweets about the problem; the last @aolmail tweet remains dated 12/18/2009. I believe a commercial presence on Twitter obliges the organization to respond to technical service requests in a timely fashion.

One reader of this post, who wishes to remain anonymous, mentioned that his CompuServe Classic mail service was down for five days.

For those of you too young to remember CompuServe in its heyday, Ken Gagne and Matt Lake wrote a CompuServe, Prodigy et al.: What Web 2.0 can learn from Online 1.0 article for ComputerWorld on 7/15/2009, shortly after CompuServe ceased being an ISP. The article includes a brief history of CompuServe, as well as other early online services.

Update 12/28/2009 4:00 PM PST: POP3 and Webmail service finally was restored this afternoon. I can’t find any explanation from @aolmail on Twitter (their last tweet is still 12/18/2009 at 9:24 AM). Nor is there a post about the problem to the AOL Mail blog (last post was 12/17/2009). Apparently, AOL would like CompuServe Classic users to forget the problem occurred.

Update 12/28/2009 11:20 AM PST: I’ll update this post when I receive a response from the AOL Mail folks.

I’ve had a CompuServe mail account for more than 20 years. At the time I began using an online service, CompuServe was the only appropriate choice for serious PC users and software developers.

Early beta testers of Microsoft products interacted in CompuServe forums where Microsoft paid the hourly charge. (In those days, CompuServe charged by the hour for dial-up connections.) I can remember only an occasional brief problem or two with CompuServe until June 2009, when CompuServe ceased their ISP operations and moved CompuServe Classic mail users to POP3/SMTP and Webmail access provided by America Online (Aol Mail). 

My email address (roger_jennings[at] is in the Introduction to more than 1.25 million English copies of my 30+ books; countless articles in magazines such as Visual Basic Programmers Journal, Visual Studio Magazine, and Redmond Developer News; and 712 Blog posts. Thousands of my contacts only know my CompuServe address, so it’s not practical to substitute aliases for other mailboxes. AOL doesn’t appear to offer a relay service to other email aliases.

Early Saturday morning, 12/26/2009, Outlook began requesting updated credentials for my CompuServe POP3 account and reporting the following error:

To test whether my Outlook Account details were at fault, I logged into AOL’s Classic CompuServe Webmail and was greeted by the following message:

I tested AOL’s SMTP service with Outlook and found I could send mail to my other aliases without difficulty. SMTP transmission requires logging on with the same credentials as the POP3 server.

However, mail sent to my CompuServe alias returns an “Undelivered Mail Returned to Sender” message with the following details (emphasis added and @ replaced by [at]):

Reporting-MTA: dns;
X-Internet-Inbound-Queue-ID: D7B333800025E
X-Internet-Inbound-Sender: rfc822; rogerj[at]
Arrival-Date: Sun, 27 Dec 2009 12:03:52 -0500 (EST)

Final-Recipient: rfc822; roger_jennings[at]
Original-Recipient: rfc822;roger_jennings[at]
Action: failed
Status: 4.2.1
Remote-MTA: dns;
Diagnostic-Code: smtp; 450 4.2.1 Mailbox disabled, not accepting messages
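The Status: 4.2.1 line in the bounce above is an RFC 3463 enhanced status code; a leading 4 marks the failure as transient (a leading 5 would be permanent), which is consistent with the "Mailbox disabled" condition clearing once AOL restored service. A minimal classifier, just to make the code classes concrete:

```python
def dsn_failure_class(status: str) -> str:
    """Classify an RFC 3463 enhanced status code such as '4.2.1'.

    The first digit is the code class: 2 = success, 4 = persistent
    transient failure, 5 = permanent failure.
    """
    return {"2": "success",
            "4": "transient failure",
            "5": "permanent failure"}.get(status.split(".")[0], "unknown")

print(dsn_failure_class("4.2.1"))  # transient failure
```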

I then began posting messages to @aolmail on Twitter and used another alias to send a message to an AOL representative who had assisted me with a previous problem on 12/17/2009. As of 9:30 AM PST on 12/28/2009 I had received no reply from any AOL representative, either directly or in response to my tweets. (Click here to see my tweets to @aolmail.)

To determine whether the problem was specific to my mailbox, I began searching AOL forums for assistance requests with Classic CompuServe POP3 or Webmail. Classic CompuServe forums appear to have disappeared after the June 2009 ISP shutdown. However, I did find a lengthy thread in the CompuServe Help Community forum about problems with Classic CompuServe mail, Can’t access mailbox, beginning on 12/26/2009. The CompuServe Help Community forum is for CompuServe 2000 users only, so little help was forthcoming.

The AOL Mail Team posted on 12/10/2009 A Renewed Focus on YOU, a self-congratulatory paean about AOL that began with:

A new day dawns here at AOL. As you may or may not know, today AOL again became an independent company. We couldn't be more excited as this provides us, the Mail team, an opportunity to get back to our roots.

It is fair to say that some of the changes made to AOL Mail in the past didn't always have our users' best interests top of mind. No more. That stopped yesterday.

Today we take a stand. A passion for providing the best email product possible is our promise to you.

But no post had appeared as of 10:00 AM PST about the CompuServe Webmail/POP3 outage. It didn’t take AOL very long to renege on their “promise.”

Michael Arrington wrote an AOL’s Deteriorating Fundamentals Not A Hit With Analysts essay for TechCrunch on 12/23/2009, which includes the following commentary:

Aol is telling a good story, but Citi analyst Mark Mahaney isn’t buying it. AOL is probably the toughest Internet turnaround story, he says in a report today, citing “28% Y/Y decline in its Subscriber base and 38% Y/Y decline in its EBITDA.” He recommends people buy Yahoo, which “will almost surely revert to growth before AOL.”

Mahaney also notes that Aol was the only top 5 web property in the U.S. to have year over year declines in visitors. …

Barclays analyst Douglas Anmuth was similarly bearish on AOL a couple of weeks ago.

It seems to me that lack of attention to the infrastructure failure and customer service reported here is an egregious example of why AOL has encountered a “28% Y/Y decline in its Subscriber base.”

Sunday, December 27, 2009

Windows Azure and Cloud Computing Posts for 12/24/2009+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.


Off-Topic: OakLeaf Blog Joins Technorati’s “Top 100 InfoTech” List on 10/24/2009.

Azure Blob, Table and Queue Services

Cory Fowler’s Working with Windows Azure Development Storage post of 12/23/2009 explains how to set up the local SQL Server database for developing Azure applications in the Development Fabric:

During the Development Cycle it is necessary to connect to a database to ensure that your data is getting stored in Windows Azure Storage Services. Microsoft was nice enough to give us this functionality out of the box so we don’t actually need a Windows Azure account before beginning development on an application. Of course you will need to Initialize your Development Storage Service on your development machine before you get going. 

Once you’re done setting up the Development Storage Service you will need to configure the Development Storage in the ServiceConfiguration.cscfg file.  You will need to add the … ConfigurationSettings to each Role element in the ServiceConfiguration.cscfg. …

Once you have these configuration settings in place you will be ready to interact with the Windows Azure Development Storage Service on your development machine. Stay tuned for my next blog series, which will describe the differences between Blob Storage, Queue Storage, and Table Storage and how you go about interacting with the different storage spaces.
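The configuration step Cory describes looks roughly like the sketch below. The service, role, and setting names are placeholders, though `UseDevelopmentStorage=true` is the shortcut connection value the SDK recognizes for local Development Storage:

```xml
<?xml version="1.0" encoding="utf-8"?>
<ServiceConfiguration serviceName="MyAzureService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WebRole">
    <Instances count="1" />
    <ConfigurationSettings>
      <!-- Points the StorageClient library at local Development Storage -->
      <Setting name="DataConnectionString" value="UseDevelopmentStorage=true" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>
```

Each `<Setting>` used here must also be declared in the corresponding Role element of ServiceDefinition.csdef.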

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

André van de Graaf provides a solution to the SQL Azure Migration Wizard. BCP upload process failed: SQLState = 37000, NativeError = 40531 error in this 12/27/2009 post:

This error can be solved by specifying the login in the form login@servername (the login name followed by @ and the server name) instead of the login name alone:

Thanks, André.
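André's fix in sketch form: SQL Azure expects the login as user@server, where server is the first label of yourserver.database.windows.net. The login, server, and password below are placeholders; the helper just assembles the bcp authentication switches in that shape.

```python
def bcp_auth_args(user: str, server: str, password: str) -> list:
    """Build bcp authentication switches for SQL Azure.

    All values are placeholders; substitute your own login, server,
    and password. Note the user@server form of the -U argument.
    """
    return ["-U", f"{user}@{server}",
            "-P", password,
            "-S", f"tcp:{server}.database.windows.net"]

print(bcp_auth_args("mylogin", "myserver", "secret")[1])  # mylogin@myserver
```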

Alex Handy’s Embarcadero taps into SQL Azure beta article for SDTimes, copied by Synergistics India on 12/24/2009, describes Embarcadero’s free (for a 90-day trial) DBArtisan v8.7.2 tool for SQL Azure:

Microsoft's cloud-enabled database platform, codenamed SQL Azure, isn't yet ready for prime time. But that hasn't stopped developers and database administrators from signing on to try it out.

To enable developers and administrators to better experiment with the SQL Azure beta, Embarcadero Technologies has partnered with Microsoft to release a free version of DBArtisan specifically tailored to help with the move to Azure.

SQL Azure is based upon some SQL Server technologies, but it is being specially crafted for the cloud. Scott Walz, senior director of product management at Embarcadero, said that both developers and Microsoft have questions to be worked out about how a cloud-based database should work.

As such, DBArtisan 8.7.2 does not include optimization features, nor does it offer deep monitoring capabilities. But that is because even Microsoft hasn't yet figured out how to give that type of information or power to users, said Walz.

DBArtisan focuses on migration and query tools. Because this is a free version limited to a 90-day trial, it is also only able to migrate Microsoft SQL Azure databases to Windows Azure. Walz said that the eventual commercial version of DBArtisan for SQL Azure will include migration tools for many different types of databases, but because the Azure platform is not complete, the decision was made to include only Microsoft-specific tools this time. …

Embarcadero will need to increase DBArtisan’s feature set considerably to justify a license fee. George Huey’s free SQL Azure Migration Wizard v3.1.1 handles SQL Server <-> SQL Azure migrations nicely.

Sheila Molnar quotes Slavik Markovich, CTO and founder of the database security company Sentrigo, in her Staying Abreast of SQL Server Database Trends in 2010 article of 12/22/2009 for SQL Server Magazine. Under the topic, “2010: The Enterprise Moves Data to the Cloud,” Sheila writes:

… According to Slavik, the “biggest push in 2010 is the move to cloud-based services. Microsoft will push the Azure cloud platform and SQL Azure database services.” A major hurdle will be “how do you protect the data in the cloud environment?” Organizations need to protect data from attacks both from outside and “from your own data administrators, plus your cloud administrators or administrators from the hosting company.” The questions are: “How do you trust them? Or trust but verify that your data is not being accessed or breached?” And “how do you monitor access to the information while it’s kept in the cloud?”

Slavik notes that DBAs have been slow to move data to the cloud because the market hasn’t been ready. “There weren’t good services out there that offered real SQL Server hosting. What you got from Amazon [for example] for their cloud was just basically the platform. And Google of course provided its own database. Smaller companies provided the SQL Server environment, but didn’t provide the whole vision thing. Whereas Microsoft with Azure provides a really strong platform that offers both platform services and higher-level services—SQL Server web services and a path between them.” For more on SQL Azure database services, see Mike Otey’s “7 Facts about SQL Azure,” InstantDoc ID 102766. …

<Return to section navigation list> 

AppFabric: Access Control, Service Bus and Workflow

No significant articles yet.

<Return to section navigation list>

Live Windows Azure Apps, Tools and Test Harnesses

Geoff Nairn claims Siemens signs up for Microsoft cloud in this brief report in the “What’s New” section of the Financial Times for 12/27/2009:

Siemens IT Solutions and Services aims to be among the first service providers to sign up for Azure. It has struck a deal with Microsoft to remotely distribute software updates to its business customers using the Azure cloud computing platform. The service is part of Siemens’ Common Remote Service Platform, which provides remote maintenance of customers’ IT systems.

The preceding item follows an earlier Microsoft unwraps Windows Azure item of the same date:

Microsoft has finally taken the wraps off its Windows Azure cloud computing platform, which was first announced a year ago. From January 1, Azure will go live as a commercial offering although Microsoft will not start charging customers until February. Being based on Windows, Microsoft argues that Azure is easier to manage than existing cloud platform offerings from the likes of Google or Amazon.

Is Windows Azure becoming a mainstream news item? Nick Eaton included an item entitled “Cloud computing takes center stage” in his 10 biggest Microsoft stories of 2009 article in the SeattlePI blog:

Microsoft unveiled Windows Azure at PDC08 and announced its commercial availability at PDC09. But what the heck is it? An operating system in the cloud, that exists nowhere and everywhere? And why the heck do we need it? The answer to all those questions is simple: the future.

Microsoft won't start charging for the Azure Platform until February, but it's already got hundreds of developers using the service. With Windows Azure, companies and individuals can create and manage cloud-based applications that people access via the Web. It lets clients scale up and scale back their server use as needed, and can slash a company's IT costs.

Abel Avram reports Information Can Be Sold and Bought in “Dallas” in this 12/24/2009 review of Codename “Dallas” and Microsoft Pinpoint for InfoQ:

Microsoft’s service codename “Dallas” is an information marketplace that brings together data, imagery, and service providers and their consumers, facilitating information exchange through a single point of access.

“Dallas” has been built with and on top of the Windows Azure platform, with the service consisting of three main components:

  • Information Discovery – discovering and consuming structured and blob datasets available to any application on any platform.
  • Brokerage – facilitates partnership between information providers and organizations interested in consuming it.
  • Analytics and Reporting – provides data analysis and reporting.

Data can be accessed through a REST-based API, but C# proxy classes provide an object model facilitating access from within any .NET program. Services provide their data as ATOM feeds or in a tabular form. Data can also be loaded into Excel through PowerPivot. “Dallas” will soon provide data through SQL Server and SQL Azure queries.
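A minimal sketch of consuming such an ATOM response follows; the feed below is an invented stand-in rather than actual “Dallas” output, but the Atom namespace and element names are the standard ones any Atom-producing service would return.

```python
import xml.etree.ElementTree as ET

# Standard Atom namespace, in ElementTree's {uri}tag form.
ATOM_NS = "{http://www.w3.org/2005/Atom}"

# Made-up sample feed standing in for a dataset response.
sample_feed = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Sample dataset</title>
  <entry><title>Row 1</title></entry>
  <entry><title>Row 2</title></entry>
</feed>"""

def entry_titles(feed_xml):
    """Return the title of each entry in an Atom feed document."""
    root = ET.fromstring(feed_xml)
    return [e.findtext(f"{ATOM_NS}title") for e in root.findall(f"{ATOM_NS}entry")]

print(entry_titles(sample_feed))  # ['Row 1', 'Row 2']
```

In practice the feed body would arrive over an authenticated HTTPS GET rather than a string literal.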

Eric Nelson’s SyncToBlog #7 Windows Azure Platform links post of 12/24/2009 is a list of resources for Windows Azure developers in the following categories:

    • Recent Windows Azure articles of interest to developers
    • A few case studies
    • Plus some of the “must remember to read” Azure Blogs

The OakLeaf blog isn’t (yet) on Eric’s “must remember to read” list.

John Savageau predicts A Cloudy Future for Networks and Data Centers in 2010 in this 12/24/2009 post:

The message from the VC community is clear – “don’t waste our seed money on network and server equipment.” The message from the US Government CIO was clear – the US Government will consolidate data centers and start moving towards cloud computing. The message from the software and hardware vendors is clear – there is an enormous investment in cloud computing technologies and services.

If nothing else, the economic woes of the past two years have taught us we need to be a lot smarter on how we allocate limited CAPEX and OPEX budgets. Whether we choose to implement our IT architecture in a public cloud, enterprise cloud, or not at all – we still must consider the alternatives. Those alternatives must include careful consideration of cloud computing. …

John is President of Pacific-Tier Communications.

David Aiken announces a New [Azure] Training kit for the Holidays in this detailed post of 12/23/2009, which includes a catalog of its:

    • Hands On Labs
    • Presentations and Videos
    • Demos

in the Windows Azure Platform Training Kit – December (2009) Update.

Lynn Langit (@llangit, a.k.a. SoCalDevGal)’s Developing Windows Azure applications – Setup post of 12/23/2009 explains how to get started with Windows Azure projects in Visual Studio 2008 SP1 or the Visual Studio 2010 November 2009 CTP:

I’ve been working with Windows Azure (post PDC09 build) to take a look at the basic mechanics of coding and deploying applications.  To that end, I wanted to share what I’ve learned so far in this post.

First, I’ll talk about what you need to download and install to get started.  There are several categories of items to get in order for you to start developing.  Of course what you’ll need is dependent on what you intend to build.

I initially wanted to build ASP.NET C# applications that either used Windows Azure storage (i.e. table, blob or queue) or that used SQL Azure storage (i.e. RD[B]MS-like tables).  I’ll remind you that other application development languages are supported, such as PHP.  Also if you were simply using SQL Azure as storage, there is no requirement that the front-end application actually be a Window Azure application. …

Lynn continues with detailed descriptions of the tools required to create Azure projects, the account (token) acquisition process, and creating a sample to do application.

Schematic deployed on 12/18/2009 TwittZURE, a Windows Azure/Silverlight 3 sample application that displays Twitter search results and timelines by taking advantage of the new Twitter API:

Clicking the Install Now button performs a ClickOnce installation of the out-of-browser version.

Schematic is an “interactive agency that creates branded experiences” from offices in Los Angeles, New York, Atlanta, Austin, Minneapolis, San Francisco, San Jose (Costa Rica), and London (UK).

The Silverlight Team Blog posted on 12/23/2009 a Case Study - TwittZure Silverlight Twitter Client on Azure, which describes the project:

… Schematic used the engaging rich internet application of Silverlight 3 and the responsiveness of Windows Azure cloud platform to build a cutting edge Twitter application, the recently released Beta version of Twittzure. 

TwittZure enables users to access key Twitter features by providing an engaging and sleek user interface to interact with friends and colleagues through the Twitter public APIs. TwittZure takes advantage of a highly available and scalable cloud server platform by using Windows Azure to serve the application and as a bridge between the application and the Twitter APIs. …

Why another Twitter client if there are so many around? TwittZure is also a unique showcase and proof of concept for several emerging technologies and platforms that are integrated into the application and are relevant for Schematic and our clients. TwittZure integrates not only the Twitter user and search REST public APIs, but also uses Windows Azure, Microsoft’s cloud platform, to host the application, providing a highly available and scalable bridge between the application and Twitter’s REST services. …

Anthony Baker’s TwittZure Showcased at post of 12/18/2009 offers a brief history of TwittZure’s development. According to the case study, Anthony is a Software Architect in the Microsoft Platforms Group and was TwittZURE’s Project Lead, Application Architect, and Developer.

<Return to section navigation list> 

Windows Azure Infrastructure

My OakLeaf Blog Analytics for November 2009 will be of more interest to me than to anyone else, but I intend to post statistics monthly to archive readership trends.

Dion Hinchcliffe explains How Cloud Commoditization Will Transform Enterprise Computing in this 12/23/2009 post to the Enterprise Irregulars blog:

The announcement last week of Amazon’s new EC2 Spot Instances was more than just another move considerably ahead of the rest of the industry by that forward-looking cloud computing leader. Spot Instances also heralds the beginnings of a real trading market for cloud computing resources of all kinds, whether that is storage, support services, real compute power, or even human capital (as in on-demand crowdsourcing.)

Up until now — indeed, still today except for Amazon — you basically had to pay fixed “retail” amounts according to the publicly posted prices from a bulk vendor or refer to the rates listed in your negotiated contract with a private supplier. Now in the new spot market (live price ticker here) you can just look at the current price of unused cloud capacity and if your bid is the highest, it’s all yours. Of course, this is only available in Amazon’s cloud at the moment and it’s just for EC2, their compute cloud. But you can count on this expanding to their other services as well and for competitors to respond, which is where it gets a lot more interesting.

Admittedly, the days of the fixed price retail cloud are far from over. It will take a while to grow broader interest in treating the unused and available portions of the cloud as a commodity. It might even take longer for buyers to start thinking of compute resources in all their forms as an instantaneous product to consume, something that up until now still looked more like long-term fixed investments in 3rd party pooled resources than rapidly fluctuating units of exchange with highly dynamic value and price. …

Cloud Computing Cost Models: Long Term Contract, Retail, and Spot Market (Commodity)

Click the graphic above to read the original eBizQ post.
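Hinchcliffe’s spot-market mechanics — bid on unused capacity, highest bids win — can be sketched as a toy clearing function. This is a deliberate simplification (the requester names, prices, and the pay-the-lowest-accepted-bid rule are illustrative; Amazon sets a single fluctuating spot price per instance type rather than running a one-shot auction like this):

```python
def clear_spot_market(bids, capacity):
    """Toy uniform-price auction: the highest bidders win the available
    capacity, and every winner pays the lowest accepted bid (the 'spot
    price').  `bids` maps requester name -> bid in $/instance-hour."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winners = ranked[:capacity]
    if not winners:
        return [], None
    spot_price = winners[-1][1]  # lowest bid that still cleared
    return [name for name, _ in winners], spot_price

# Four bidders chasing two spare instances.
bids = {"alice": 0.12, "bob": 0.05, "carol": 0.08, "dave": 0.03}
winners, price = clear_spot_market(bids, capacity=2)
print(winners, price)  # alice and carol win; the spot price clears at 0.08
```

The interesting economic property Hinchcliffe points to falls out even of this toy: the price is set by demand against idle supply, not by the vendor’s retail rate card.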

Frances Karamouzis predicts “By 2012, India-centric IT services companies will represent 20% of the leading cloud aggregators in the market (through cloud service offerings)” in her initial (and untitled) Gartner blog post of 12/23/2009:

As we approach the end of the year, I thought I’d take the opportunity to launch this blog with a question and, hopefully, an engaging discussion about the extensively hyped topic of Cloud. As a Gartner analyst, clearly our organization has put forth lots and lots of research regarding Cloud Computing. However, I want to focus on “IT Services” and the Cloud.

Within Gartner, we are referring to this as Cloud enabled Services or Cloud enabled Outsourcing — or just Cloud Services for short. Regardless of the name — the question on the table is how much money, marketshare, channel mastery, and (MOST importantly in the first 18 months) how much MINDshare will be commanded by IT services companies, whether it’s some of the traditional providers (Accenture, Capgemini, ACS, CSC), some of the Indian vendors (TCS, Wipro, Infosys, Cognizant, HCL, and the other 40+ Indian vendors), other offshore vendors (Softtek, Neoris, EPAM, CPM Braxis, iSoftstone, VanceInfo, etc.), or some of the smaller emerging providers (AppsLabs, Appirio, Tory Harris, etc.). [Emphasis by the author.]

Please weigh in with your thoughts, comments, challenges …

BCS, the UK’s Chartered Institute for IT asserts Cloud computing 'will dominate focus in 2010' in this 12/24/2009 post:

Cloud computing will dominate the focus of the industry in the new year, a new study has revealed.

In a survey of the chief information officers and chief technology officers of several leading companies by Logicalis, the main technology trends of 2010 were found to be Web 2.0 and Cloud computing.

The latter, which was the second most popular term mentioned by the respondents after Web 2.0, was seen as important and popular due to its flexible and scalable architecture.

Commenting on the findings, Adam Bosnian, vice president of products, strategy and sales at security management firm Cyber-Ark, said: 'Almost any size of organisation can use public or private cloud resources and enjoy significantly enhanced economies of scale.'

'Even if an organisation uses private cloud resources … where the server storage environment is effectively outsourced to the service provider - there are still economies of scale to be had.'

<Return to section navigation list> 

Cloud Security and Governance

Chris Hoff (@Beaker) issues a challenge to the “select few who ignore issues brought to light and seem to suggest that Cloud providers are at a state of maturity wherein they not only offer parity, but offer better security than the ‘average’ IT shop” in his The Great Cloud Security Challenge: I Triple-Dog-Dare You… post of 12/27/2009:

There’s an awful lot of hyperbole being flung back and forth about the general state of security and Cloud-based services.

I’ve spent enough time highlighting both the practical and hypothetical (many of which actually have been realized) security issues created and exacerbated by Cloud up and down the stack, from IaaS to SaaS.

It seems, however, that there are a select few who ignore issues brought to light and seem to suggest that Cloud providers are at a state of maturity wherein they not only offer parity, but offer better security than the “average” IT shop.

What’s missing is context.  What’s missing is the very risk assessment methodologies they reference in their tales of fancy.  What’s missing is that in the cases they suggest that security is not an obstacle to Cloud, there’s usually not much sensitive data or applications involved. (Author’s emphasis.) …

Chris details his challenge and concludes:

I’m all for evangelism, but generalizing about the state of security (in Cloud or otherwise) is a complete waste of electrons.

Chris Hoff (@Beaker)’s How Many Open Letters To Howard Schmidt Do We Need? Just One post of 12/23/2009 includes a brief message to the newly appointed “Cyber-Security Czar:”

Dear Howard:

I’ll keep it short.

Let me know how we can help you be successful; it’s a two-way street. No preaching here.



and ends with an offer to volunteer his services:

If Howard called me tomorrow and asked me to quit my job and make sacrifices in order to join up and help achieve the lofty tasks before him for the betterment of all, I would. [Emphasis Beaker’s.]

If I were Howard, I’d take Beaker up on his offer.

Eric Chabrow reports “Lawmakers Seek to Give More Power to White House Infosec Adviser” in his Cybersecurity "Czar" Hubbub Continues post of 12/23/2009:

Don't expect the hullabaloo surrounding the cybersecurity "czar" to vanish despite the appointment Tuesday of Howard Schmidt as the White House cybersecurity coordinator.

Since President Obama announced in late May he would appoint a cybersecurity coordinator, much of the hubbub focused on who that person would be and - as the months rolled by - when the appointment would be made. That's been settled.

But the fact that Schmidt reports not to the president, but to National Security Adviser James Jones, and that the post doesn't require Senate confirmation, bothers some influential lawmakers who believe the job should be situated higher on the White House organizational chart.

While praising the naming of Schmidt, Sen. Joseph Lieberman said he will introduce legislation early next year to require the White House cybersecurity adviser be confirmed by the Senate. The Connecticut Independent Democrat who chairs the Senate Homeland Security and Governmental Affairs Committee, in his statement, did not indicate what powers his bill would give the cybersecurity adviser. …

With friends like Sen. Lieberman, you don’t need any enemies.

David Talbot asserts “Information technology's next grand challenge will be to secure the cloud--and prove we can trust it” in his Security in the Ether cover article for the January/February 2010 issue of MIT’s Technology Review magazine:

… Computer security researchers had previously shown that when two programs are running simultaneously on the same operating system, an attacker can steal data by using an eavesdropping program to analyze the way those programs share memory space. They posited that the same kinds of attacks might also work in clouds when different virtual machines run on the same server.

In the immensity of a cloud setting, the possibility that a hacker could even find the intended prey on a specific server seemed remote. This year, however, three computer scientists at the University of California, San Diego, and one at MIT went ahead and did it (see "Snooping Inside Amazon's Cloud" in above image slideshow). They hired some virtual machines to serve as targets and others to serve as attackers--and tried to get both groups hosted on the same servers at Amazon's data centers.

In the end, they succeeded in placing malicious virtual machines on the same servers as targets 40 percent of the time, all for a few dollars. While they didn't actually steal data, the researchers said that such theft was theoretically possible. And they demonstrated how the very advantages of cloud computing--ease of access, affordability, centralization, and flexibility--could give rise to new kinds of insecurity. …
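The researchers’ 40 percent hit rate is ultimately a numbers game: the more probe VMs an attacker launches, the better the odds that at least one lands beside the target. A toy Monte Carlo (server count, probe counts, and the uniform random-placement assumption are all invented for illustration; real schedulers are not uniform) shows the shape of that curve:

```python
import random

def coresidency_probability(servers, probes, trials=20_000, seed=1):
    """Estimate the chance that at least one of `probes` attacker VMs is
    assigned to the same (uniformly random) server as the target VM."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        target = rng.randrange(servers)
        if any(rng.randrange(servers) == target for _ in range(probes)):
            hits += 1
    return hits / trials

# Probability of co-residency rises quickly with the number of probes.
for probes in (1, 10, 50):
    print(probes, round(coresidency_probability(servers=100, probes=probes), 3))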

Vinton Cerf claims “The next step in cloud computing is to link different systems” in his Integrating the Clouds article for the January/February 2010 issue of MIT’s Technology Review magazine:

At Google, we operate many data centers around the world, each of which contains a large number of computers linked to one another in clusters. In turn, the data centers are linked through a high-speed private network. These data centers support applications and services that users can access over the public Internet to tap into virtually unlimited computing power on demand, a process known as cloud computing (see "Security in the Ether"). Amazon, IBM, Microsoft, and others are implementing and experimenting with similar systems. Currently, these clouds operate in isolation, communicating only with users. But I think we need to start developing interfaces so that clouds can communicate directly among themselves.

An integrated cloud would have a number of advantages. Users may wish to move data from one cloud to another without having to download all their data and then upload it again. Or users may want to store the same data in multiple clouds for backup. In this case, reliable mechanisms for synchronizing data across different clouds would be useful. Some may wish to do coördinated computation in multiple clouds.

How can a program running in one cloud reference data in another? If one cloud puts restrictions on access to data, how can those controls be replicated in a second cloud? What protocols, data structures, and formats will allow clouds to interact at users' direction and in accordance with their requirements? …

Vinton Cerf is vice president and chief Internet evangelist at Google. In the 1970s and '80s he worked at DARPA, where he is widely credited with developing the Internet.

Graphic Credit: Paddy Mills
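Cerf’s synchronization question — keeping the same data consistent across two clouds — reduces, at its simplest, to diffing content fingerprints between providers. A minimal sketch, with each “cloud” modeled as a plain dict of object name to bytes (the real inter-cloud protocols and formats are exactly what Cerf says remain to be designed):

```python
import hashlib

def fingerprints(bucket):
    """Map each object name to a SHA-256 digest of its content."""
    return {name: hashlib.sha256(data).hexdigest()
            for name, data in bucket.items()}

def sync_plan(source, dest):
    """Return the object names to copy from source to dest so that dest
    mirrors source (objects that are missing or changed in dest)."""
    src, dst = fingerprints(source), fingerprints(dest)
    return sorted(name for name, h in src.items() if dst.get(name) != h)

cloud_a = {"report.doc": b"v2", "logo.png": b"png-bytes"}
cloud_b = {"report.doc": b"v1"}
print(sync_plan(cloud_a, cloud_b))  # ['logo.png', 'report.doc']
```

Exchanging digests instead of data is what makes Cerf’s “without having to download all their data and then upload it again” plausible: only the objects that actually differ cross the wire.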

<Return to section navigation list> 

Cloud Computing Events

No significant articles yet.

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

See Dion Hinchcliffe’s How Cloud Commoditization Will Transform Enterprise Computing article about Amazon Web Service’s spot pricing in the Windows Azure Infrastructure section.

<Return to section navigation list>