Wednesday, December 30, 2009

Windows Azure and Cloud Computing Posts for 12/28/2009+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.

• Update 12/30/2009: Lorraine Lawson: Judith Hurwitz Cloud Computing Interview, Parts I and II; M. Kramer: Gentlemen, Start Your Cloud Predictions; Hovhannes Avoyan: The Mass Adoption of Cloud Computing Is Coming; Neil MacKenzie: Entities in Azure Tables; Jayaram Krishnaswamy: Windows Azure Platform Training Kit; Mamoon Yunus: The Guillotine Effect of Cloud Computing; Mike Leach: Coding With Azure: "Up or Down" Service Monitor App; Ben Riga: Windows Azure Lessons Learned: Active Web Solutions; and more.

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

  • Azure Blob, Table and Queue Services
  • SQL Azure Database (SADB)
  • AppFabric: Access Control, Service Bus and Workflow
  • Live Windows Azure Apps, Tools and Test Harnesses
  • Windows Azure Infrastructure
  • Cloud Security and Governance
  • Cloud Computing Events
  • Other Cloud Computing Platforms and Services

To use the section links above, first click the post’s title to display it as a single article, then jump to the section you want.

Cloud Computing with the Windows Azure Platform was published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage,” here on 9/29/2009.

You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:

  • Chapter 12: “Managing SQL Azure Accounts, Databases, and DataHubs*”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated for the November CTP in January 2010. 
* Content for managing DataHubs will be added as Microsoft releases more details on data synchronization services for SQL Azure and Windows Azure.

Off-Topic: AOL Greets 48-Hour CompuServe Classic POP3 and Webmail Outage with Silence on 12/28/2009; OakLeaf Blog Joins Technorati’s “Top 100 InfoTech” List on 10/24/2009.

Azure Blob, Table and Queue Services

• Neil MacKenzie explains how to work with the WritingEntity and ReadingEntity events exposed by the DataServiceContext class in his Entities in Azure Tables post of 12/30/2009. Neil provides extensive source code for the following operations (a minimal event-handler sketch follows the list):

    • Writing Data to an Azure Storage Table
    • Handling WritingEntity Events
    • Removing a Property from an Atom Entry
    • Reading Data from an Azure Storage Table
    • Initializing a Property
    • Serializing Generic Properties
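
Neil’s post is the authoritative treatment; purely as a rough illustration of the pattern he describes, the sketch below hooks the WritingEntity and ReadingEntity events on a TableServiceContext to strip one property from the outgoing Atom entry and to initialize it when entities are read back. The entity type, table name and the “Transient” property are illustrative assumptions, not Neil’s code.

using System;
using System.Data.Services.Client;
using System.Linq;
using System.Xml.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public class SampleEntity : TableServiceEntity
{
    public string Data { get; set; }
    public string Transient { get; set; }   // a property we don't want persisted
}

public class TableEventsSample
{
    // Data Services namespace used for property elements in the Atom entry.
    static readonly XNamespace d = "http://schemas.microsoft.com/ado/2007/08/dataservices";

    public static void Run()
    {
        CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
        TableServiceContext context = new TableServiceContext(
            account.TableEndpoint.ToString(), account.Credentials);

        // WritingEntity fires after the entity is serialized to an Atom entry but
        // before it is sent; e.Data is the XElement for the <entry> element.
        context.WritingEntity += (sender, e) =>
        {
            XElement transient = e.Data.Descendants(d + "Transient").FirstOrDefault();
            if (transient != null)
            {
                transient.Remove();   // drop the property from the payload
            }
        };

        // ReadingEntity fires after a response entry is parsed, so properties that
        // were never stored can be initialized here.
        context.ReadingEntity += (sender, e) =>
        {
            SampleEntity entity = e.Entity as SampleEntity;
            if (entity != null && entity.Transient == null)
            {
                entity.Transient = string.Empty;
            }
        };

        context.AddObject("SampleTable", new SampleEntity
        {
            PartitionKey = "p1",
            RowKey = Guid.NewGuid().ToString(),
            Data = "hello",
            Transient = "not stored"
        });
        context.SaveChangesWithRetries();
    }
}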

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

Stephen Forte’s PowerPivot Demo Part II post of 12/28/2009 demonstrates how to build a more advanced pivot table against the Northwind sample database’s Orders, Order Details, Products, Categories and Employees tables:

About a week ago, I showed a simple demo on my blog of SQL Server PowerPivot for Excel, or just PowerPivot for short. I used SQL Azure as my backend and just did a mapping to a few Northwind tables and then built a pivot table using the Order Details table.

Today I will take that example one small step further and write a custom TSQL query and build a more advanced pivot table against that data source.  In a future post I will build a PowerPivot example against a true OLAP Cube, but that will have to wait for the new year. …

This gives the user an extremely powerful pivot table and the ability to do some pretty sophisticated drill downs and filters. In addition, it is pretty easy to add/remove even more data (Customers and Employees for example) to this pivot table.


[George] Huey and [Wade] Wegner will “Migrate Us to SQL Azure!” on 12/31/2009 when .NET Rocks releases Show #512:

George Huey and Wade Wegner from Microsoft talk to Carl and Richard about George's creation, the SQL Azure Migration Wizard, a free tool that will save you countless hours when migrating SQL Server databases to SQL Azure.

<Return to section navigation list> 

AppFabric: Access Control, Service Bus and Workflow

Scott Gellock reports Windows Azure AppFabric Java and Ruby SDK’s updated for v1 in this 12/28/2009 post:

Over the past couple of days updated versions of the Java and Ruby SDK’s for AppFabric have been posted.  These updated SDK’s account for the changes made in the v1 service release of Service Bus and Access Control.  You can get the updated Java SDK here and the updated Ruby SDK here.

Bruce Kyle announces a new “guide that provides step-by-step instructions on how to use Windows Identity Foundation in Windows Azure solutions” in his Guide Explains How to Build On-Premises, Cloud Identity Aware Apps post of 12/29/2009:

Windows Identity Foundation (previously known as “Geneva” Framework) makes federation and claims-based identity first-class citizens in the .NET Framework. Developers can now rely upon a single, consistent model for creating identity-aware applications on ASP.NET and WCF.

We just published a guide that provides step-by-step instructions on how to use Windows Identity Foundation in Windows Azure solutions, so that on-premises or partner identities can be seamlessly used regardless of where your resources are hosted, on-premises or in the cloud.

The guide also provides explanations of the tradeoffs that are necessary at this point for making the two pre-releases to work together, highlighting the attention points and the workarounds, and will be updated regularly as we move toward commercial availability. We invite you to download the guide and experiment with the simple but powerful scenario it enables: stay tuned for updates and for more scenarios of claims-based identity in the cloud.
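
The guide itself is the reference; purely as a hedged illustration of what claims-aware code looks like with Windows Identity Foundation, the fragment below enumerates the claims WIF attaches to the authenticated user (assuming the WIF 1.0 Microsoft.IdentityModel assembly; the trace output is incidental).

using System.Threading;
using Microsoft.IdentityModel.Claims;   // Windows Identity Foundation ("Geneva" Framework)

// Minimal sketch: list the claims issued for the current user.
IClaimsIdentity identity = Thread.CurrentPrincipal.Identity as IClaimsIdentity;
if (identity != null)
{
    foreach (Claim claim in identity.Claims)
    {
        System.Diagnostics.Trace.WriteLine(claim.ClaimType + " = " + claim.Value);
    }
}

The same code runs unchanged whether the token was issued by an on-premises STS or by a partner identity provider, which is the portability the guide describes.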

<Return to section navigation list>

Live Windows Azure Apps, Tools and Test Harnesses

The Centers for Medicare & Medicaid Services (CMS) and the Office of the National Coordinator for Health Information Technology (ONC) announce CMS and ONC Issue Regulations Proposing a Definition of ‘Meaningful Use’ and Setting Standards for Electronic Health Record Incentive Program in a 12/30/2009 press release:

The Centers for Medicare & Medicaid Services (CMS) and the Office of the National Coordinator for Health Information Technology (ONC) encourage public comment on two regulations issued today that lay a foundation for improving quality, efficiency and safety through meaningful use of certified electronic health record (EHR) technology. The regulations will help implement the EHR incentive programs enacted under the American Recovery and Reinvestment Act of 2009 (Recovery Act).

A proposed rule issued by CMS outlines proposed provisions governing the EHR incentive programs, including defining the central concept of “meaningful use” of EHR technology. An interim final regulation (IFR) issued by ONC sets initial standards, implementation specifications, and certification criteria for EHR technology.  Both regulations are open to public comment. …

The IFR issued by ONC describes the standards that must be met by certified EHR technology to exchange healthcare information among providers and between providers and patients. This initial set of standards begins to define a common language to ensure accurate and secure health information exchange across different EHR systems.  The IFR describes standard formats for clinical summaries and prescriptions; standard terms to describe clinical problems, procedures, laboratory tests, medications and allergies; and standards for the secure transportation of this information using the Internet.

The IFR calls for the industry to standardize the way in which EHR information is exchanged between organizations, and sets forth criteria required for an EHR technology to be certified. These standards will support meaningful use and data exchange among providers who must use certified EHR technology to qualify for the Medicare and Medicaid incentives.

Under the statute, HHS is required to adopt an initial set of standards for EHR technology by Dec. 31, 2009.  The IFR will go into effect 30 days after publication, with an opportunity for public comment and refinement over the next 60 days.  A final rule will be issued in 2010.  “We strongly encourage stakeholders to provide comments on these standards and specifications,” Dr. Blumenthal said.

The Recovery Act established programs to provide incentive payments to eligible professionals and eligible hospitals participating in Medicare and Medicaid that adopt and make “meaningful use” of certified EHR technology.  Incentive payments may begin as soon as October 2010 to eligible hospitals.  Incentive payments to other eligible providers may begin in January 2011. …

The CMS proposed rule and fact sheets may be viewed at http://www.cms.hhs.gov/Recovery/11_HealthIT.asp

ONC’s interim final rule may be viewed at http://healthit.hhs.gov/standardsandcertification. In early 2010 ONC intends to issue a notice of proposed rulemaking related to the certification of health information technology.

The 556-page PDF of the proposed “Medicare and Medicaid Programs; Electronic Health Record Incentive Program” rule (CMS-0033-P, RIN 0938-AP78), which affects 42 CFR Parts 412, 413, 422, and 495, as it appears in the Federal Register is available for online review here. Following is a summary:

This proposed rule would implement the provisions of the American Recovery and Reinvestment Act of 2009 (ARRA) (Pub. L. 111-5) that provide incentive payments to eligible professionals (EPs) and eligible hospitals participating in Medicare and Medicaid programs that adopt and meaningfully use certified electronic health record (EHR) technology. The proposed rule would specify the initial criteria an EP and eligible hospital must meet in order to qualify for the incentive payment; calculation of the incentive payment amounts; payment adjustments under Medicare for covered professional services and inpatient hospital services provided by EPs and eligible hospitals failing to meaningfully use certified EHR technology; and other program participation requirements. Also, as required by ARRA, the Office of the National Coordinator for Health Information Technology (ONC) will be issuing a closely related interim final rule that specifies the Secretary’s adoption of an initial set of standards, implementation specifications, and certification criteria for electronic health records. ONC will also be issuing a notice of proposed rulemaking on the process for organizations to conduct the certification of EHR technology.

According to a tweet from Practice Fusion’s Glenn Laffel, MD, “Proposed measures for each of the Meaningful Use criteria begin on page 65.” Dr. Laffel’s earlier posts detail elements of the telephonic briefing described below.

Emily at Practice Fusion posted an ALERT: Meaningful Use Announcement Today on 12/30/2009:

ONC and CMS are holding a press conference this afternoon and will likely announce revisions to the “Meaningful Use” criteria for electronic health record (EHR) systems. To learn more about Meaningful Use and the HITECH Act, visit our Stimulus Center.

The briefing will be at 5:15 p.m. ET, Toll-Free Dial: (800) 837-1935, Conference ID: 49047605, Pass Code: HITECH

Practice Fusion’s physicians and EHR experts will be available for immediate comment on the announcement. Follow along with Dr. Laffel’s thoughts on Twitter or contact Practice Fusion's press team for more details.

Practice Fusion’s blog post continues with the full text of HHS’s CMS and ONC to Discuss Next Steps in Electronic Health Records Programs press release.

The Azure Support Team’s Tom posted Getting SSL certificates to work on 12/30/2009. His article observes:

There are a number of useful informational articles out on the web on how to deal with SSL Certificates with Windows Azure.  The first few places to start are:

http://blogs.msdn.com/davethompson/archive/2009/11/24/add-ssl-security-to-your-azure-webrole.aspx
http://blogs.msdn.com/jnak/archive/2009/12/01/how-to-add-an-https-endpoint-to-a-windows-azure-cloud-service.aspx

There may be times, depending on the certificate you are trying to use, when these steps won’t be enough. You may see an error like:

At least one certificate specified in your service definition is not found.
Please upload these certificate(s), and then upload your application package again.
- Dr. Watson Diagnostic ID: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

Tom suggests this resolution:

If you see something like this, and you are using a certificate from a 3rd party, you may need to get the intermediate certificates installed as well.  One problem that happens with some 3rd parties is that they will not give you a .pfx file.  Since that is the only file type you can upload to Azure for certificates, you have to convert them.  All that is needed is to create a .NET application with the following code:

using System.Security.Cryptography.X509Certificates;

// The path to the certificate.
string certificate = @"c:\test.cer";

// Load the certificate into an X509Certificate object.
X509Certificate cert = new X509Certificate(certificate);

// Re-export the certificate as password-protected PKCS #12 (.pfx), the only
// format the Azure portal accepts.
byte[] certData = cert.Export(X509ContentType.Pkcs12, "Password");

System.IO.File.WriteAllBytes(@"c:\test.pfx", certData);

Or you can do the same in PowerShell:

$c = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("c:\test.cer")
$bytes = $c.Export("Pfx", "Password")
[System.IO.File]::WriteAllBytes("c:\test.pfx", $bytes)

Running this will allow you to create .pfx files for each certificate and then import them into Azure. Just be sure to change the “Password” and enter the same one on the Azure dashboard when you upload the certificate.

• Jayaram Krishnaswamy questions why Windows 7 and Visual Studio 2010 were omitted as operating systems and prerequisites for the latest Windows Azure Platform Training Kit in this 12/30/2009 post.

Just an oversight?

• Mike Leach provides the source code and a video walkthrough of his Coding With Azure: "Up or Down" Service Monitor App in this 12/29/2009 post:

Every year during the holiday break I look forward to digging into a fun software project and learning something new. This year I developed a simple application called "Up or Down" using the Microsoft Windows Azure cloud platform. Windows Azure represents a huge bottom-up R&D effort by Microsoft to support highly scalable Internet-connected applications across a number of devices and platforms.

Jump to the 5 minute video walkthrough or Download the source code from Codeplex.

Mike continues with a detailed description of Up or Down’s architecture.

• Ben Riga describes the move of CCH’s sales tax calculation service from an in-house application to the cloud in his Windows Azure Lessons Learned: CCH post of 12/30/2009:

In this episode of Lessons Learned we talk with Jones Pavan and Gurleen Randhawa of CCH [formerly Commerce Clearing House] about tax and accounting!  No wait stick around, that stuff can be exciting too.  Yes, really!  :)

The good folks at CCH (a Wolters Kluwer company) have built an interesting service on Windows Azure.  The solution we discuss here is a sales tax calculation service which they offer to other accounting firms.  This is an existing on-premises product that they are now moving to the cloud.

Channel 9: Windows Azure Lessons Learned: CCH

The existing product was a stateless web service that was designed to live behind the firewall.  The service is meant to be called directly via a plug-in in an accounting firm’s ERP system (for example, Dynamics AX).  To move that to the cloud CCH wrapped the web services in Windows Communication Foundation (WCF).

They had been using another third party RAD development tool called CA Plex.  The Plex runtime was added to the project and copied out to the cloud.  One of the things they quickly learned is that the nature of the cloud app is to be stateless and that required special consideration when moving on-premises apps (for example the Plex tool was caching db connections behind the scenes).

Another important consideration was security. They were not ready to move to ACS, so for the initial release they used X.509 cert[ificate]s, ADFS and message-based security to establish trust relationships with the server.

BTW, the Windows Azure marketing folks have already published a case study on the CCH solution (available here).

Ben Riga’s Windows Azure Lessons Learned: Active Web Solutions post of 12/29/2009 links to a Channel9 video, which describes an Azure-based tracking system for fishing vessels:

There are not many solutions that can claim to have saved lives.  In this episode of Lessons Learned I chat with Richard Prodger of Active Web Solutions about the Windows Azure project they’ve been working on that tracks fishermen in real time. It monitors not only their location but also their status so as to immediately raise the alarm if help is needed (e.g. fallen off the side of a boat or pressed a panic button). This solution is already credited with saving the lives of 9 fishermen.

Channel 9: Windows Azure Lessons Learned: Active Web Solutions

Electronics on the fishing vessels communicate directly via satellite to the Windows Azure solution. Those messages are processed by Windows Azure worker roles and routed using the Windows Azure AppFabric Service Bus to various on-premises systems for review and action. The desktop client overlays marine charts onto Bing Maps so that the coast guard gets a visual representation of the exact location of boats that have raised alarms.

The good folks at Active Web Solutions have published some of the source code that they developed to “automatically bridge arbitrary TCP endpoints, handling any intermediate firewall traversal.”  The code is available on CodePlex as the SocketShifter project:

http://socketshifter.codeplex.com/

If this is interesting, you should also have a look at Port Bridge, published by Clemens Vasters on his blog. Clemens describes it as “Socketshifter’s older brother”:

http://blogs.msdn.com/clemensv/archive/2009/11/18/port-bridge.aspx
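
SocketShifter and Port Bridge are the projects to study; purely as a hint of the relay mechanism they build on, the sketch below exposes a trivial WCF endpoint through the AppFabric Service Bus with NetTcpRelayBinding. The contract, service namespace and path are illustrative assumptions, not code from either project, and the Access Control (shared secret) credential setup is omitted.

using System;
using System.ServiceModel;
using Microsoft.ServiceBus;   // AppFabric Service Bus SDK

[ServiceContract]
public interface IEchoService
{
    [OperationContract]
    string Echo(string text);
}

public class EchoService : IEchoService
{
    public string Echo(string text) { return text; }
}

public class RelayHost
{
    public static void Main()
    {
        // Address in your Service Bus namespace; the relay makes the endpoint
        // reachable from outside the firewall without opening inbound ports.
        Uri address = ServiceBusEnvironment.CreateServiceUri("sb", "yournamespace", "echo");

        ServiceHost host = new ServiceHost(typeof(EchoService));
        host.AddServiceEndpoint(typeof(IEchoService), new NetTcpRelayBinding(), address);

        // The endpoint also needs a TransportClientEndpointBehavior carrying your
        // Access Control credentials before Open() will succeed; omitted here.
        host.Open();
        Console.WriteLine("Listening on " + address);
        Console.ReadLine();
        host.Close();
    }
}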

Robert Rowley, MD’s 2009 – the Year in Review for Health IT post of 12/29/2009 describes the primary impediments to implementing fully functional Electronic Health Records (EHRs) as envisioned by “ARRA and its HITECH section, which earmarked $19.2 billion intended to increase the use of Electronic Health Records (EHR) systems by physicians and hospitals, unprecedented attention has been focused on EHR systems.” Dr. Rowley continues:

The “hook” to try to incentivize physicians to adopt EHRs (given that the adoption rate as reported in the New England Journal of Medicine was 4% for fully-functional EHRs and 13% for basic systems) revolved around offering bonus incentives through Medicare and Medicaid payments for demonstrating “meaningful use” of “certified” EHRs. Such payments would begin in 2011 and extend through 2015, and thereafter the “stick” of payment reductions for failure to adopt EHRs would start to kick in.

As a result, the topic of EHR adoption has been on the minds and lips of physicians all year. And the state of things might best be summed up by an anecdotal comment I heard recently: “we finally adopted an EHR in our clinic, and it’s ok I guess, but the real problem is that they all need to talk to each other.”

The difficulty in making that happen – in getting EHR systems to talk to each other – has been a much harder nut to crack than one might have imagined. While the Office of the National Coordinator (ONC) has been working furiously all year developing a framework for defining “meaningful use” and is still detailing how to define “certification”, one of the big areas of attention has been on exactly this concern – interoperability between systems. …

Microsoft’s Public Sector DPE Team describes a Government Mapping solution built with Silverlight 3 and Windows Azure + SQL Azure in this detailed 12/28/2009 post:

Almost every single government entity is looking to deliver contextual information on maps, so this reusable framework built by our partner iLink Systems can help deliver a great experience to the end-users.

Reusable Framework for GIS Mapping Applications/Solutions

Partner iLink Systems has come up with a great framework and solution built with Silverlight 3 that is hosted on the Windows Azure platform, consumes data stored in SQL Azure (the cloud database) and uses Bing Maps for visualization.

You can access the sample solution built using their framework at http://mapapp.cloudapp.net. The sample may be available only for a month or so, but feel free to send a note to info@ilink-systems.com to learn more about their framework and discuss how your state/agency can use it to deliver a ton of GIS mapping-based solutions for one or multiple departments.

The sample mapping solution at http://mapapp.cloudapp.net depicts a fictitious county/city, showing contextual information (e.g., offices and locations where government services are offered) on Bing Maps, but the framework can be used to create any GIS mapping application fairly easily. Not only does the solution have a rich user experience and run in all modern browsers (via the Silverlight plug-in), but the end user can also install a shortcut to the application on the desktop and/or Start menu of Windows (Windows XP, Windows Vista and Windows 7) simply by right-clicking the application’s canvas/surface.

The partner chose the Windows Azure platform simply because the data shown on the maps is public; however, a solution built using their framework can be hosted on your on-premises infrastructure too. Once you get hold of the framework, you will be pleased that the application can talk to an on-premises SQL Server database or a cloud-resident SQL Azure database with a simple change to the database connection string stored in the web configuration file (sketched below).
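
As a rough illustration of that swap (server, database and credential names below are placeholders, not iLink’s actual configuration), the web.config connection string moves from an on-premises SQL Server entry to a SQL Azure entry of the same name:

<connectionStrings>
  <!-- On-premises SQL Server
  <add name="MapData"
       connectionString="Data Source=SQLSERVER01;Initial Catalog=MapData;Integrated Security=True"
       providerName="System.Data.SqlClient" /> -->
  <!-- SQL Azure: same provider, cloud server, SQL authentication, encryption required -->
  <add name="MapData"
       connectionString="Server=tcp:yourserver.database.windows.net;Database=MapData;User ID=user@yourserver;Password=YourPassword;Trusted_Connection=False;Encrypt=True;"
       providerName="System.Data.SqlClient" />
</connectionStrings>

No code changes are required as long as the application reads the connection string by name from configuration.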

 

Bob Familiar describes how he ported his personal music Web site to Windows Azure, SQL Azure, Silverlight 3 and Expression Blend 3 with the beta of Visual Studio 2010 in his Migrating Applications to Windows Azure post of 12/28/2009:

… Over the past three years I have been using my personal music website as a sandbox for experimenting with the latest Microsoft Application Platform features for building service oriented web applications. The last version of the site featured WCF SOAP services and a Silverlight 2 UI. The site was hosted at DiscountASP and leveraged SQL Server 2005 for the data store. I posted the details of this development effort back in August of 2008.

It has been a year and a half since I have done any work on the site. With the recent release of Silverlight 3, Expression Blend 3, Windows Azure, SQL Azure and the beta of Visual Studio 2010 I felt it was time. One of my primary goals with this migration effort was to reuse as much of my existing code as possible. Hopefully you will find my experience useful as you embark on your own migration efforts. …

The details of the Windows Azure migration project can be found in the following six posts (a short WCF REST sketch follows the list):

  1. The Architecture of a Service Oriented Rich Internet Application
  2. Setting up your Azure Account and Development Environment
  3. Migrating your SQL Server database to SQL Azure
  4. Configuring your Visual Studio 2010 Windows Azure Project
  5. Migrating WCF SOAP services to WCF RESTful services
  6. Using Silverlight 3 to create a rich user interface that can run in or out of the browser
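
Post 5 covers the SOAP-to-REST move; as a generic sketch of that technique (not Bob’s actual contract), a WCF RESTful service replaces SOAP-only operations with WebGet/WebInvoke-attributed methods served over webHttpBinding, so the Silverlight client can call them with plain HTTP GETs:

using System.Runtime.Serialization;
using System.ServiceModel;
using System.ServiceModel.Web;

// Hypothetical data contract for illustration only.
[DataContract]
public class Song
{
    [DataMember] public string Id { get; set; }
    [DataMember] public string Title { get; set; }
}

[ServiceContract]
public interface ISongCatalog
{
    // SOAP version: reached by posting an envelope to the endpoint.
    // REST version: reached with an HTTP GET such as .../songs/42 and returns JSON.
    [OperationContract]
    [WebGet(UriTemplate = "songs/{id}", ResponseFormat = WebMessageFormat.Json)]
    Song GetSong(string id);
}

Hosting the implementation with WebServiceHostFactory (or an explicit webHttpBinding endpoint plus the webHttp behavior) is what turns the contract into a RESTful endpoint.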

<Return to section navigation list> 

Windows Azure Infrastructure

• Matt Asay claims An application war is brewing in the cloud in this 12/30/2009 post to C|Net’s Open Road blog:

Today's cloud-computing vendors focus on infrastructure, but that won't be the case for long. It can't be. As competing vendors seek to differentiate themselves, they're going to move "up the stack" into applications.

It's like the history of enterprise computing, played out in months and years instead of decades.

Oracle arguably set this strategy in motion when it acquired its way to a complete infrastructure-plus-applications portfolio to lower customer acquisition costs and improve its competitive differentiation for CIOs. IBM and Microsoft also went that route, though to differing degrees and in different ways.

Cloud-computing platform vendors are going to have to do the same thing, except they don't have the luxury of waiting. …

Matt is vice president of business development at Alfresco, a company that develops open-source software for content management. His earlier 2010 the year of cloud-computing...M&A post of 12/29/2009 points out that:

… Gartner expects the cloud-related SaaS market to top $8 billion in 2009, which suggests that real customers are paying real money.

They may not be paying enough, however, to support the mushrooming cloud vendor marketplace. Not yet.

Industry insiders are predicting a shakeout as pre-recession venture funding runs out for many of the early cloud vendors, forcing them into the arms of buyers or bankruptcy courts. …

• Ted Schadler anoints Consumer broadband [a]s the workforce technology of the decade in his 12/30/2009 post to ZDNet’s “The View from Forrester Research” blog:

That call may surprise you. You might have put storage or Gigabit ethernet or the Internet itself at the top of the list. But when I think about what’s different in the life of your average information worker as the decade comes to a close, it’s the instant-on access to just about everything that the adoption of consumer broadband has fueled.

From our Consumer Technographics® survey of over 50,000 consumers every year for the last 12 years, between 2000 and 2009, consumer broadband soared from 2% to 63% of US households. For context, home PC adoption grew from 51% to 77%.

But why is consumer broadband the workforce technology of the decade? …

Ted details three main reasons:

    • Telecommuting has become a way of life for millions of information workers …
    • Broadband-enabled markets have triggered massive IT innovation …
    • Consumers master new technologies at home — and expect the same at work …

• Hovhannes Avoyan asserts The Mass Adoption of Cloud Computing Is Coming in this 12/30/2009 post, which quotes Bob Rudy, CIO of Avago, about moving its internal applications to the cloud:

I’m really loving a phrase that I read recently about cloud computing. It came from the CIO of Avago Technologies, a San Jose, CA-based semiconductor maker, which is gradually migrating its data and apps to the cloud from its internal servers – including recruiting, human resources, e-mail and web security.

According to Bob Rudy, CIO at Avago, migration has saved the company millions of dollars by eliminating hardware needs and software licenses and improving security, speed and storage.

Moving to the cloud has also freed up employees from annoying and trivial tasks like managing their e-mail, enabling them to focus more on their core jobs.

But Bob phrased a simple description about the pull of cloud computing that I’d like to share: “The days of owning software are coming to an end.”

Bob was featured in a recent story in the San Francisco Gate about the rise of cloud computing, which called Bob’s statement “an increasingly common sentiment.” …

• M. Kramer’s Gentlemen, Start Your Cloud Predictions post of 12/29/2009 offers brief, tongue-in-cheek prognostications from B and L Associates about:

  • Consolidation of cloud computing vendors
  • Adoption of cloud interoperability standards
  • Cloud insurance and cloud brokers
  • “Cloud” becoming a verb

in 2010. Definitely worth reading.

• Mamoon Yunus suggests “Try asking your SaaS partner to put an agent in their container - good luck!” in his The Guillotine Effect of Cloud Computing post of 12/30/2009, which seconds Dave Linthicum’s assertion that cloud computing will kill design-time service governance (see below).

David Linthicum asserts “Every technology wave drowns something” in his Cloud computing will kill these 3 technologies post of 12/29/2009 to his InfoWorld Cloud Computing blog. According to Dave, the three technologies awaiting their demise are:

    1. Design-time service governance
    2. Older and smaller clouds
    3. Tier 2 enterprise software

Of course, Dave elaborates on his choices.

Arthur Cole recommends WAN Optimization for a Better Cloud in this 12/29/2009 post to ITBusinessEdge:

Amid all the predictions about 2010 being a banner year for virtualization and cloud computing, there is little consideration given to the impact these technologies will have on wide area networking.

But as the enterprise begins to port more applications and data onto both public and private cloud infrastructures, it will become increasingly obvious that optimizing the WAN will be a key factor in ensuring the kinds of performance levels that users have come to expect from enterprise IT.

That's part of the reason market analysts are expecting WAN optimization to kick into high gear in the coming year, along with the rest of the networking segment. Infonetics Research noted that sales of optimization equipment jumped a healthy 12 percent in the third quarter, breaking a string of poor results. Top vendors like Blue Coat, Cisco and Riverbed led the rebound, which coincided with single-digit growth in both the router and Ethernet switch markets.

According to a recent survey by Expand Networks, virtualization and cloud computing will be the key drivers for WAN optimization in the coming year. The company reports that a recent survey of IT executives found that 75 percent are planning new virtualization deployments next year, with two-thirds of those saying WAN optimization would improve system performance. A slightly smaller figure was planning on cloud deployments, but the vast majority would combine it with WAN optimization services if the price was right. …

Patrick Thibodeau lists 10 big cloud trends for 2010 in this 12/29/2009 ComputerWorld article posted to ITWorld:

    1. Commodity cloud price slashing continues
    2. A move to simpler cloud pricing models
    3. Enterprise application vendors embrace metering
    4. Cloud providers increasingly offer enterprise-caliber SLAs
    5. New technologies will improve cloud use and performance
    6. Cloud providers address security concerns
    7. Performance monitoring will become ubiquitous
    8. Open standards for cloud computing advance
    9. Politics will drive decisions
    10. The cloud will decentralize IT decision-making

Phil Wainewright describes Cloud delusions at the turn of the decade with an emphasis on Windows Azure in his 12/28/2009 post to ZDNet’s Software as Services blog:

Continuing my series of posts about the big themes on this blog over the past year, I now turn to the topic of cloud computing. My views on how to implement cloud and SaaS have hardened considerably over the course of 2009. Halfway through the year, I took a conscious decision to promote multi-tenancy as the only acceptable architecture for cloud and SaaS infrastructures. You might expect that from someone who makes a living consulting for multi-tenant vendors. But I’ve deliberately chosen a hardline and controversial stance, intended as a counterpoint to the many siren voices that argue for a more hybrid approach.

I still see migration to the cloud as a journey, but I’m concerned that too many people, especially with the advent of platforms like Windows Azure, have decided they can achieve all the benefits by going just some of the distance. This is a risky self-delusion, and the more people fool themselves this way, the more the cloud model will be discredited, not because of inherent weaknesses, but through implicit association with the disasters and disappointments these half-hearted implementations will bring in their wake. There are several different cloud delusions to beware of. [Emphasis added.] …

Phil continues with links to earlier articles about “cloud delusions.”

<Return to section navigation list> 

Cloud Security and Governance

• Lorraine Lawson interviews Judith Hurwitz in her The Best Cloud Case: What Functions You Should – and Shouldn't – Consider Moving (Part II) post of 12/30/2009:

Loraine Lawson recently interviewed Judith Hurwitz, president of business consulting firm Hurwitz & Associates and co-author of the recently released Cloud Computing for Dummies. In part one of the interview, Hurwitz explained why an integration strategy is important for moving to the cloud. In this second half, she discusses integration with the cloud, what you should and shouldn't move to the cloud and a prediction for 2010.

Part I of the interview is Baby Steps and Integration Strategy Key When Moving to the Cloud of 12/23/2009.

Mamoon Yunus’s Understanding Cloud Taxonomies and Security post of 12/29/2009 reports:

OWASP AppSec DC 2009 had a compelling session that defined cloud taxonomies and the security implications associated with the cloud computing.  The three taxonomies that have become part of our vernacular are:

  1. Infrastructure as a Service (IaaS)
  2. Platform as a Service (PaaS)
  3. Software as a Service (SaaS)

Mamoon’s post includes a link to Dennis Hurst’s Cloud Security and its Affect on Application Security presentation. Dennis is a Security Engineer for HP and a Founding member of the Cloud Security Alliance.

Mamoon predicted Cloud Reliability Will Be Bigger than Cloud Security for 2010-11 in this earlier (12/23/2009) post:

We have all the tools for securing information in a Cloud: establishing trust through identity, data privacy through encryption, and content integrity through signatures. We are overly focused on Cloud Security issues and less on reliability.  This is all about to change. Following the outages experience[d] by Amazon EC2 in 2009, another premiere cloud provide[r], Rackspace, suffered an outage on December 18. Using technology such as Forum Systems XML/ Cloud gateways is essential for establishing multi-cloud reliability and fault tolerance. …

Mamoon is the CEO of Forum Systems, which manufactures the Forum Sentry, a cloud gateway that he describes in his Why is a Cloud Gateway Required? post of 12/30/2009.

Mike Vizard analyzes Mamoon’s XML security appliance approach in a Reducing the Complexity of Application Security post of 12/21/2009:

As business-to-business interactions over the Web become more pervasive, so too does the complexity associated with securing those transactions.

Unfortunately, all that complexity serves only to dissuade businesses from integrating business processes across the Web at a time when we want to encourage that behavior. So the challenge facing chief technologists is to find a way to make it simpler to integrate business processes without having to introduce complex layers of security.

Forum Systems CEO Mamoon Yunus thinks his company has the answer in the form of an identity broker appliance that sits at the edge of the corporate network. Instead of trying to layer security software into every application, Yunus is arguing that all the security related to XML schemas associated with service oriented architecture (SOA) applications should be handled via an appliance. …

Michael Krigsman quotes Dana Gardner in his SOA adoption: The human face of governance post of 12/28/2009 to the Enterprise Irregulars blog:

Human and organizational factors such as politics, information silos, and change management generally underlie IT success and failure.

During a recent conversation, IT analyst and fellow ZDNet blogger, Dana Gardner, affirmed the extent to which this view applies to service-oriented architecture (SOA) projects.

I asked Dana to elaborate on the importance of organizational dynamics with respect to SOA governance and adoption:

“In the many discussions we’ve had with analysts, users and suppliers over the advancement of SOA and governance, we always came back to the human element as a weak or missing link. Successful adoption and efficiency of SOA hinges on acceptance, trust, collaboration and new ways for people to work together. SOA also depends on people and groups that had not in the past worked closely to begin doing so.

“When I heard more about Asuret and Pragmatic Enterprise 2.0 [my professional involvements], I saw a unique way to bring the human element of SOA into focus. The process of gathering perceptions across teams and any affected people — with trend lines over time for those — really strikes me as essential to understand how and why SOA is working — or not.

“I think that SOA efforts will gain significant tools for how to gauge and assuage the progress of SOA adoption, and the best means of governance over long periods of time once SOA activities gain traction.” …

Dana also highlighted the gap between technical and human governance issues:

“SOA governance does a great job at tracking artifacts and defining and automating rules and policies for artifacts. The role of IT infrastructure can be managed via SOA governance, and the composition and business process alignment of services are also well served by governance. But how people really feel about how the processes are developed, implemented and refined is a bit of a black hole when SOA governance is employed. Without a strong view of the perceptions and their change patterns, SOA and governance are obtuse rather than precise instruments.”

Following our discussion, Dana brought together a group of top SOA analysts to examine these issues in depth.

This post reprints, in its entirety, Dana’s article on that analyst session. You can listen to the podcast by clicking the player at the top of this post. …

Hovhannes Avoyan claims “Think giants of commerce and names like Amazon, Walmart and Expedia come up” in his Commercial Giants Held Hostage to Denial of Service post of 12/28/2009:

Think giants of commerce and names like Amazon, Walmart and Expedia come up.

Now, think how much those giants depend on the huge cloud computing infrastructure to be secure and reliable to keep their businesses running and in shape.

So, if you’re an IT person, you can imagine how serious a Distributed Denial of Service (DDoS) attack is to online commerce.

Now, add the fact that it happened on the day before Christmas eve to those giants of commerce.

It appears that the attack was aimed at the servers of Neustar, which offers DNS services to many major companies under the name UltraDNS.

The attack started at 4:45 p.m. PST and lasted for about an hour. It was compounded by the fact that it also affected Amazon’s S3 and EC2 cloud services. A lot of web services rely on Amazon’s cloud infrastructure.

To Neustar’s credit, it reacted quickly and contained the damage to the Northern California area. …

“Forrester Research SOA expert Randy Heffner discusses how to establish an iterative design process for evolving your SOA security architecture that considers your current and future security requirements, emerging industry specifications, overlaps in product functionality for SOA security, and possibilities for custom security integration” in his SOA Security: Good Enough and Getting Better post of 8/18/2009 to the CIO blog:

Security is not a reason to stay away from SOA. Although full SOA security maturity is yet to come, 30 percent of organizations now use SOA for external integration with customers and partners. For standard Web services using SOAP, WS-Security has achieved critical mass as a foundational standard. On the other hand, advanced SOA security — involving federation among partners, nonrepudiation, and propagation of user identities across multiple layers of service implementations — is in its early days. To navigate the path from what's practical today to the future of advanced SOA security, establish an iterative design process for evolving your SOA security architecture that considers your current and future security requirements, emerging industry specifications, overlaps in product functionality for SOA security, and possibilities for custom security integration.

<Return to section navigation list> 

Cloud Computing Events

Ben Day’s Beantown .NET Meeting on Thursday, 1/7/2010: Jason Haley, Windows Azure SDK post of 12/29/2009 announces:

Beantown .NET is going to be meeting on Thursday, 1/7/2010. This month we have Jason Haley presenting “Get Started with the Windows Azure SDK”.

As always, our meeting is open to everyone so bring your friends and co-workers – better yet, bring your boss. It is not required to RSVP for our meetings but if you know you’re coming, please RSVP by 3pm on the day of the meeting to help speed your way through building security and to give us an idea how much pizza to order. Click here to RSVP.

See [George] Huey and Wade [Wegner] will “Migrate Us to SQL Azure!” on 12/31/2009 when .NET Rocks releases Show #512 in the SQL Azure Database (SADB) section.

Bruce Kyle recommends that you Mark Your Calendars for Microsoft BI Conference, which is colocated with Microsoft Tech*Ed North America in New Orleans during the week of 6/7/2010:

The conference will focus on the following categories:

  • Empower Your Users – Microsoft Excel (including PowerPivot for Excel), SSRS (Report Builder), Microsoft Visio, Search/FAST
  • Improve Organizational Effectiveness - SharePoint Server 2010 (including PerformancePoint Services, Excel Services, Visio Services)
  • Increase IT and Developer Efficiency - SSAS, SSRS, SSIS (including best practices for implementation), MDS, PowerPivot for SharePoint
  • Partner Services and Solutions Delivery – Partner and ISV methodology and solution offerings that augment our BI solution, Integration with other products (e.g., CRM)
  • Customer Business Value – Learn how customers are using our products in the field with in-depth sit downs with a TDM angle

Registration and demand generation efforts will kick-off in early January.

Hopefully, SQL Azure users will finally learn what BI features will be offered in the cloud. However, Eric Lai reports “Business intelligence as a service remains too scary for most enterprises, analyst says” in his Startups keep on-demand BI faith, but big vendors wait and see article of 12/22/2009 for InfoWorld’s Cloud Computing blog:

When a tech vendor fails, rivals usually rejoice. Not in the nascent BI (business intelligence)-on-demand space, after well-funded startup LucidEra folded in June.

The company had raised almost $21 million in funding from high-profile venture capitalists. Though competitors immediately began wooing LucidEra's customers, they also felt compelled to put out public statements saying that its failure was an outlier, not the beginning of a trend. …

Not only have customers failed to adopt BI-on-demand the way they have other kinds of apps such as CRM (Salesforce.com, for example) or productivity (Google Apps), but it has also received little validation from the big players.

IBM, for one, is only starting to "research" a Web version of its Cognos BI software.

"The data model differs from company to company, which is why you just can't put it into a multi-tenant environment," said Rob Ashe, general manager for IBM's BI and performance management division.

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

Randy Bias describes his Infrastructure-as-a-Service Builder's Guide v1.0 white paper of 12/2009:

This paper is targeted at anyone building public or private clouds who want to understand clouds, cloud computing, and Infrastructure-as-a-Service. It highlights some of the important areas to think about when planning and designing your infrastructure cloud.

His earlier Cloud Standards are Misunderstood and Subscription Modeling & Cloud Performance white papers, as well as a brief biography, are available here.

Daniel Roth summarizes his Time Your Attack: Oracle’s Lost Revolution feature article for Wired Magazine’s December 2009 issue: “In 1995, Larry Ellison began his network computer crusade. The effort failed, but the concept would spark what became cloud computing.” I would add “… and netbooks, too.” Roth continues on page 3:

“Ellison is often time-dyslexic — right about the fundamental trend but wrong on timing,” says David Roux, a partner at private equity firm Silver Lake and a former Oracle executive vice president. “It’s hard to look at a $299 netbook and not see the NC vision come to life.”

<Return to section navigation list> 
