Wednesday, March 17, 2010

Windows Azure and Cloud Computing Posts for 3/15/2010+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.

 
Note: This post is updated daily or more frequently, depending on the availability of new articles in its sections.

To use the section links, first click the post’s title to display the single post you want to navigate.

Cloud Computing with the Windows Azure Platform was published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.

You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:

  • Chapter 12: “Managing SQL Azure Accounts and Databases”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated for the January 4, 2010 commercial release in February 2010. 

Azure Blob, Table and Queue Services

Kevin Kell’s Windows Azure Storage – Part 2 post of 3/17/2010 begins:

In contrast to Blobs, Azure tables offer structured storage. These “tables”, however, are not to be confused with tables that might exist in the context of a relational database. In fact Azure tables are more like a typical “object database” where each table contains multiple objects, or “entities”. An entity in Azure can be up to 1 MB in size. …

and continues with StorageClient code examples. His Windows Azure Storage – Part 1 of 3/3/2010 covered Azure blobs. Kevin also published Application Architectures in Windows Azure – Part 1 and Application Architectures in Windows Azure – Part 2 in February 2010.
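
If you haven’t used the table API yet, the following is a minimal sketch in the spirit of Kevin’s examples, written against the v1.x Microsoft.WindowsAzure.StorageClient library; the entity, table, and setting names here are mine, not Kevin’s:

    using System;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    // A minimal entity; PartitionKey + RowKey identify it, and the entity as a
    // whole (all properties combined) can be up to 1 MB.
    public class ContactEntity : TableServiceEntity
    {
        public ContactEntity() { } // required for serialization

        public ContactEntity(string company, string email) : base(company, email) { }

        public string Name { get; set; }
        public string Phone { get; set; }
    }

    class Program
    {
        static void Main()
        {
            // Development storage locally; swap in CloudStorageAccount.FromConfigurationSetting
            // (plus a configuration setting publisher) when running in a role.
            CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;

            CloudTableClient tableClient = account.CreateCloudTableClient();
            tableClient.CreateTableIfNotExist("Contacts");

            TableServiceContext context = tableClient.GetDataServiceContext();
            context.AddObject("Contacts", new ContactEntity("Contoso", "kim@contoso.com")
            {
                Name = "Kim Akers",
                Phone = "555-0100"
            });
            context.SaveChangesWithRetries();
            Console.WriteLine("Entity inserted.");
        }
    }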

Kevin is currently involved as Technical Editor in several Learning Tree courses.

The OData Team posted Announcing the OData SDK on 3/16/2010:

Last November at PDC 09 we announced the Open Data Protocol (OData), providing a way to unlock your data and free it from silos that exist in applications today. OData makes it easy for data to be shared in a manner that follows the philosophy of Open Data and enables the creation of REST-based data services. These services allow resources, identified using Uniform Resource Identifiers (URIs) and defined in an abstract data model, to be published and edited by Web clients using simple HTTP messages. OData enables a new level of data integration across a broad range of clients, servers, services, and tools.

This morning during the Keynote at Mix 2010, Doug Purdy announced the re-launch of OData.org and the release of the OData SDK.

The OData SDK brings together a wealth of resources to help developers participate in the OData Eco-system including:

  • Sample OData online services (northwind, etc) - open a browser and test out an OData Service.
  • OData client libraries
    • Windows Phone 7 series
    • iPhone
    • AJAX/JavaScript
    • PHP
    • Java
    • .NET
    • Silverlight
  • Online OData explorer (Source code also available for download from odata.org)
  • Data Service Provider toolkit: Whitepaper and sample WCF Data Services provider implementation to demonstrate how to create a data service over *any* data source
  • OData validation tool: A test harness and a few sample validation tests to make it easy to validate your OData endpoint.  The harness is designed to be easily extended allowing anyone to easily add new tests.

Also announced today at Mix, there are some new OData services available publicly:

  • Netflix has exposed their catalog of movies via OData at http://odata.netflix.com
  • Microsoft codename ‘Dallas’ exposes datasets in the cloud and allows developers to access and monetize them using OData
  • SQL Azure now features an “OData easy button” - a one-click experience to get your SQL Azure database exposed as an OData feed

How can you get involved in the OData ecosystem? Check out OData.org – Expose OData – and the next time you’re developing an app ask yourself, is there a feed for that? 
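
If OData is new to you, “simple HTTP messages” really does mean a plain GET against a URI built from a service root, an entity set, and the standard query options. Here’s a minimal sketch against the public Northwind sample service mentioned above (URI conventions per the OData documentation; the exact sample-service address is assumed and may change):

    using System;
    using System.Net;

    class ODataQuerySample
    {
        static void Main()
        {
            // Entity set + query options: the first three German customers, as an AtomPub feed.
            string uri = "http://services.odata.org/Northwind/Northwind.svc/" +
                         "Customers?$filter=Country eq 'Germany'&$top=3";

            using (var client = new WebClient())
            {
                client.Headers[HttpRequestHeader.Accept] = "application/atom+xml";
                Console.WriteLine(client.DownloadString(uri));
            }
        }
    }

Asking for application/json in the Accept header (or using one of the SDK’s client libraries) returns the same data as JSON instead of Atom.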

See also Mary Jo Foley’s Microsoft delivers updates on OData, Houston, Dallas post of the same date in the SQL Azure Database (SADB) section below.

Pablo Castro’s offline video of OData: There's a Feed for That MIX10 presentation is now live in MP4 Video, Windows Media Video and Windows Media Video (High) formats. You can also watch Mike Flasko’s Implementing OData: How to Create a Feed for That complementary session.

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

David Linthicum asserts “The data processing requirements of cloud computing are causing many to leave SQL and relational databases behind” in his SQL and relational databases: They're not right for the cloud post of 3/17/2010:

I'm at the Cloud Connect 2010 conference in Santa Clara, Calif., one of the first major gatherings of the year on cloud computing. One of the larger topics that has come up thus far is not using relational databases for data persistence. Called the "NoSQL" movement, it is about leveraging more efficient databases that are perhaps able to handle larger data sets more effectively. I've already written about the "big data" efforts that are emerging around cloud, but this is a more fundamental movement to drive data back to more primitive, but perhaps some more efficient models and physical storage approaches.

NoSQL systems typically work with data in memory or upload chunks of data from many disks in parallel. The issue is that "traditional" relational databases don't provide the same models and, thus, the same performance. While this was fine in the days of databases with a few gigabytes of data, many cloud computing databases are blowing past a terabyte, and we'll see huge databases supporting cloud-based systems going forward. Relational databases are contraindicated for operations on large data sets because SQL queries tend to consume many CPU cycles and thrash the disk as they process data.

Dave appears to have bought into the NoSQL movement.

Mary Jo Foley reports from MIX10 on 3/16/2010: Microsoft delivers updates on OData, Houston, Dallas:

Company officials announced at the Mix 10 conference that Microsoft is making available for download on March 16 a second Community Technology Preview (CTP) of Dallas which adds support for 25 new data sets. The company also made available an OData SDK for download today, and committed to delivering a preview of another related technology, codenamed “Houston,” later this spring.

OData, the Open Data Protocol, is Microsoft’s alternative to Google’s GData. If you’ve heard Microsoft use the codename “Astoria” or talk about ADO.Net Data Services in the past, these two codenames are now under the larger OData umbrella. Microsoft defines OData as a protocol that builds on top of HTTP, Atom Publishing Protocol (AtomPub) and JSON “to provide access to information from a variety of applications, services, and stores.” Microsoft is building OData support into a number of its products, including SharePoint Server 2010, Excel 2010, Dallas, its Dynamics products, and its MediaRoom IPTV offerings.

The OData SDK, announced today, bundles Microsoft’s various OData clients — for Java, PHP, PalmOS, .Net, and (as of today), the iPhone — into a single package. Microsoft officials also said during today’s Mix keynote that Microsoft is open-sourcing the .Net OData client, under an Apache license.

Dallas is a new service built on top of Windows Azure and SQL Azure that will provide users with access to free and paid collections of public and commercial data sets that they can use in developing applications. The datasets are available via Microsoft’s PinPoint partner/ISV site. Microsoft is planning another Dallas CTP in the next couple of months and plans to announce Dallas pricing at the Worldwide Partner Conference in July, officials said. [Emphasis added.]

Houston is a browser-based Silverlight control that allows developers to interact with SQL Azure directly. Houston allows for the rapid creation of tables, views, procedures, and is targeted at those wanting to do rapid database development in the cloud. Microsoft first demonstrated Dallas and Houston at the Professional Developers Conference 2009 late last year. …

Moe Khosravy’s Web cast of his Microsoft Project Code Name "Dallas": Data For Your Apps MIX10 session is now available.

Jamie Thomson provides more OData background in his OData.org updated, gives clues about future SQL Azure enhancements post of 3/16/2010:

The OData website at http://www.odata.org/home has been updated today to provide a much more engaging page than the previous sterile attempt. Moreover it’s now chock-full of information about the progress of OData including a blog, a list of products that produce OData feeds plus some live OData feeds that you can hit up today, a list of OData-compliant clients and an FAQ. Most interestingly, SQL Azure is listed as a producer of OData feeds:

If you have a SQL Azure database account you can easily expose an OData feed through a simple configuration portal. You can select authenticated or anonymous access and expose different OData views according to permissions granted to the specified SQL Azure database user.

A preview of this upcoming SQL Azure service is available at https://www.sqlazurelabs.com/OData.aspx and it enables you to select one of your existing SQL Azure databases and, with a few clicks, turn it into an OData feed. It looks as though SQL Azure will soon be added to the stable of products that natively support OData, good news indeed.

Cory Fowler rings in with A Big O for OData- Mix Keynote Day Two on 3/16/2010:

… Today Microsoft announced the Open Data Protocol [Read more at odata.org] at the MIX Conference. The Open Data Protocol is an extension of the AtomPub format. The OData information can be represented in ATOM or JSON format.

One of the key features of OData is that each element contains a datatype, so the data can be consumed by a number of different platforms including Java, JavaScript, Python, Ruby, PHP, and .NET without running into type safety issues or misrepresenting the data in a string format. …

Pablo Castro’s offline video of OData: There's a Feed for That MIX10 presentation is now live in MP4 Video, Windows Media Video and Windows Media Video (High) formats. You can also watch Mike Flasko’s Implementing OData: How to Create a Feed for That complementary session. (repeated from above.)

Liam Cavanagh reports on 3/16/2010 the availability of a MIX10 Webcast - Building Offline Web Apps Using Microsoft Sync Framework:

This week a number of the people from our team are at the MIX conference in Las Vegas.  Yesterday, Mike Clark (the Group Manager for our team) presented a session to kick off the conference on the topic of building offline web applications.  In this session Mike presented a look at the work we are doing to enable users to build offline Silverlight applications that enable bi-directional synchronization of data from offline Silverlight applications and a central data store like SQL Azure and SQL Server. [Emphasis added.]

The offline Silverlight support is based on a new set of capabilities we are creating called the "Asymmetric Protocol" that allows us to extend sync framework capabilities to virtually any device such as Windows Mobile, Windows Phone 7 Series, iPhone and Symbian, even if those devices do not actually have support for the Sync Framework runtime.

Mike went through a number of demonstrations that were based on the MIX scheduling application built using this new Asymmetric protocol.  He showed how we enable users to take the conference agenda and session information offline in their Desktop, Windows Phone 7 Series device and iPhone devices allowing people to register for sessions and then sync and share this information amongst all of their devices via the central conference database.

The online version of this session is now available here.

Reuven Cohen seconds Microsoft’s investment in OData and SQL Azure in his Data is the New Oil post of 3/15/2010:

I'm sitting in my hotel room in Santa Clara at the Cloud Connect event. A conference focused on the future of business computing -- so what better a setting to discover that Facebook has passed Google as most-viewed site in US in past week. An amazing feat to say the least, but why? How did this come to be? How did in a little over six years an upstart manage to surpass Google?

To understand, you really need to think about what the PC era has done to information. In effect the PC revolution started what the Internet has super-charged - Information Creation. Think about it: more information is now being created in the time it takes me to write this post than was probably created between the time humans first figured out how to write up until the birth of the Internet.

But for the most part the majority of the information humankind has created has not been accessible. Most of this raw data or knowledge has been sitting in various silos -- be it a library, a single desktop, a server, database or even data center. But recently something changed: the most successful companies of the last decade have discovered how to tap into this raw data. These companies are better at analyzing, mining and using this mountain of data sitting "out there" -- turning a useless raw resource into something much more useful, Information.

Before you say anything, Yes I know I'm not the first to say this. In a 2006 post Michael Palmer wrote "Data is the new oil!" declaring "Data is just like crude. It’s valuable, but if unrefined it cannot really be used. It has to be changed into gas, plastic, chemicals, etc to create a valuable entity that drives profitable activity; so must data be broken down, analyzed for it to have value." …

Ruv continues with further support for the Data as the new oil theory.

Manas Ranjan Sahu explains How to connect to SQL Azure using SQL Server Reporting Services 2008 R2 in this 3/15/2010 post:

Problem
We know that SQL Azure is the database offering on the Windows Azure cloud computing platform, and it goes without saying that all the technologies that plug in to databases need to start exercising and adapting to this flavor of databases along with the regular approach of database access. In this tip we learn how to use the SQL Server Reporting Services 2008 R2 (SSRS 2008 R2 hereafter) November CTP to connect to SQL Azure.

Solution
Two major providers can be used in SSRS 2008 R2 to connect to SQL Azure: "Microsoft SQL Server" and "OLE DB". Using these providers, a report developer can continue to develop the report in much the same way as any locally or network installed database. The only thing that one should take care of is that, when SQL Azure goes RTM (official release), users will be charged for accessing SQL Azure. So initial development and prototyping can be done on a locally installed database, and when the report is developed to a considerable extent, testing and validation can be started against SQL Azure.

Again, this is not the only option; it really depends upon the pricing policy opted for. It may also be that part of the data is hosted on SQL Azure and part is hosted locally. In this tip we will focus on how to connect to SQL Azure using SSRS 2008 R2, and this tip assumes that the reader has some basic working knowledge of SSRS.

To get the full exercise click here
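
Whichever provider you pick, the data source boils down to a standard SQL Azure connection string. Here’s a minimal C# sketch of the same connection an SSRS 2008 R2 data source would use (server, database, and login names are placeholders; SQL Azure requires encryption and the login@server user format):

    using System;
    using System.Data.SqlClient;

    class SqlAzureConnectionTest
    {
        static void Main()
        {
            var builder = new SqlConnectionStringBuilder
            {
                DataSource = "tcp:yourserver.database.windows.net", // placeholder server
                InitialCatalog = "YourDatabase",                    // placeholder database
                UserID = "yourlogin@yourserver",                    // SQL Azure login format
                Password = "...",
                Encrypt = true,
                TrustServerCertificate = false
            };

            // The same connection string can be pasted into the SSRS data source dialog.
            using (var conn = new SqlConnection(builder.ConnectionString))
            {
                conn.Open();
                Console.WriteLine("Connected; server version " + conn.ServerVersion);
            }
        }
    }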

<Return to section navigation list> 

AppFabric: Access Control and Service Bus

The Windows Azure platform AppFabric Team posted A Calculator for the Windows Azure platform AppFabric Billing Preview on 3/16/2010:

Last week we released a Billing Preview that provided a usage summary to help customers prepare for when billing begins in April. Usage statistics in this billing preview are consistent with our new Connection-based billing for the AppFabric Service Bus that we announced in January.  A key feature of this model allows customers to purchase Connections individually on a pay-as-you-go basis, or in Connection Packs if they anticipate needing larger quantities.

To provide greater insight into the Billing Preview and the options available to customers, a member of our team created an Excel-based calculator that converts usage numbers into costs.  The calculator also allows customers to see how much their usage would have cost with each of the different reserved pack options.

While we hope customers find this tool quite useful, please do note that it is being provided only as a courtesy and is not guaranteed against possible bugs. The calculator also does not guarantee the amount of future bills.

You will find the calculator attachment [here].

Vittorio Bertocci announced the Identity Developer Training Kit March 2010 update: WIF+Silverlight, WIF+WCF on Windows Azure on 3/16/2010:


Are you ready to roll? I have just flipped the publish bit on the March 2010 update of the Identity Developer Training Kit!

The new release contains various fixes and improvements, but above all it contains two brand-new labs about two scenarios you asked for loud and clear: Web Services and Identity in Windows Azure  and Developing Identity-Driven Silverlight Applications. …

Microsoft’s Azure AppFabric Team released the Windows Azure platform AppFabric Labs SDK on 3/15/2010. From the ReleaseNotes.htm file:

The AppFabric Labs environment will be used to showcase some early bits and get feedback from the community. Usage for this environment will not be billed.

In this release of the LABS environment, we’re shipping two features:

  • Silverlight support: we’ve added the ability for Silverlight clients to make cross-domain calls to the Service Bus and Access Control Services.
  • Multicast with Message Buffers: we’ve added the ability for Message Buffers to attach to a multicast group. A message sent to the multicast group is delivered to every Message Buffer that is attached to it.
To get started with Labs:
  • Go to https://portal.appfabriclabs.com/,
  • Sign up using your Live ID,
  • Create your Labs project and namespace, and
  • Download Labs samples from here to learn more about these new features.

To provide feedback and to learn what the community is building using the Labs technology, please visit the Windows Azure platform AppFabric forum and connect with the AppFabric team.

Running AppFabric V1 Samples Against the Labs Environment

To run the AppFabric V1 SDK samples against the Labs environment, you'll have to rename the ServiceBus.config.txt file found at the AppFabric Labs SDK page to ServiceBus.config and place it in your  .NET Framework CONFIG directory. The CONFIG directory is located at:

  1. C:\Windows\Microsoft.NET\Framework\v2.0.50727\CONFIG on x86 systems, and

  2. C:\Windows\Microsoft.NET\Framework64\v2.0.50727\CONFIG on x64 systems.

The AppFabric V1 SDKs Windows Azure sample will not work against the Labs environment. This is because to run V1 samples against Labs you need to place the ServiceBus.config file in your .NET Framework CONFIG directory and Windows Azure doesn't allow that.

Known Issues
  1. When uploading CrossDomain.XML to the Service Bus root, please leave out the <!DOCTYPE> schema declaration.

  2. Silverlight clients convert every Service Bus operation error to a generic HTTP NotFound error.

Microsoft’s U-Prove Team announces U-Prove CTP now available in this 3/15/2010 post to the “Geneva” Team Blog:

As you might have heard from Kim, Vittorio, Mike, or from the Identity TechNet or End to End trust sites, we recently released the U-Prove Community Technology Preview (see the announcement in Scott Charney’s RSA keynote).

U-Prove is an innovative cryptographic technology that enables the issuance of claims in a manner that provides multi-party security: issuing organizations, users, and relying parties can protect themselves not just against outsider attacks but also against attacks originating from each other. At the same time, the U-Prove technology enables any desired degree of privacy (including authenticated anonymity and pseudonymity) without contravening multi-party security.

Given these user-centric aspects, it comes as no surprise that we have integrated the technology into the identity metasystem, and in particular, using information cards. Users can now obtain information cards protected by U-Prove and present them 1) with higher privacy guarantees, and 2) without online connectivity to the identity provider when interacting with relying parties. The U-Prove technology helps realize the vision set forth by the laws of identity.

To encourage experimentation and gather feedback on the technology, the following software components are made available as part of the U-Prove CTP:

· Windows Identity Foundation Extension (U-Prove CTP): an extension to WIF that provides the ability to build a custom Security Token Service (STS) for U-Prove token issuance (for identity providers), and the ability to verify U-Prove token presentations (for relying parties).

· Active Directory Federation Services 2.0 (U-Prove CTP): a U-Prove enabled version of AD FS 2.0 that has the ability to issue an information card that supports U-Prove; and that can act both as a U-Prove identity provider (IP-STS) and a relying party (RP-STS).

· Windows CardSpace 2.0 (U-Prove CTP): a U-Prove enabled version of Windows CardSpace 2.0 that has the ability to obtain, store, and present U-Prove tokens associated with an information card.

Try it out, and let us know what you think!

<Return to section navigation list>

Live Windows Azure Apps, APIs, Tools and Test Harnesses

David Makogon’s RIA Services RC, Deployment Guide, and Azure post of 3/17/2010 offers this advice:

Saurabh Pant has a great blog post about deploying RIA Services to your server. This post specifically targets .NET 4, Silverlight 4, and Visual Studio 2010, and even announces some hosting companies that now provide RIA Services RC support.

While I won't rehash all of the deployment details, I wanted to draw attention to Azure deployments. Saurabh points out that RIA Services RC only supports .NET 4. Currently, Azure only supports .NET 3.5. This means the “server” side of your RIA Services app cannot yet be used in Azure (although you can develop it locally and run with the Development Fabric).

I haven’t heard or seen any official statement about .NET 4 support on Azure, but my gut feeling (read: educated guess) is that we’ll see an Azure Virtual Machine upgrade at the same time .NET 4 is RTM, currently slated for April 12. Hopefully this will all be cleared up in the next month.

If you need to deploy a RIA Services application to Azure today, continue working with the RIA Services Beta which was announced at PDC in November. The Beta works with both .NET 3.5 and .NET 4. [Emphasis added.]

Scott Densmore reports Enterprise Library 5 on Azure on 3/16/2010:

We spent some time in our first few weeks of the project getting Enterprise Library 5 (Beta 2) working on Azure. The first thing we did was take the reference implementation (MusicStore) from the Web Client Developer Guidance and add components of Enterprise Library to it. We also wrote up a short Technical Note on our findings. The good news is that most everything just works. There are some gotchas that have to do with the Azure Platform itself (nothing bad). You can read the paper on our CodePlex site here. We should have the code to post in the next couple of days.

Dhananjay Kumar explains Hosting and Debugging Silverlight in Windows Azure in this 3/15/2010 post to the C#Corner blog:

This article gives a pictorial representation of how a Silverlight application can be hosted and debugged on the local development fabric of Windows Azure.

A Silverlight application can now be hosted in a Web role, and the application can also be debugged on the local development fabric.

Ron Crumbaker announced on 3/15/2010 that his new Silverlight-enhanced, Windows Azure-based "cloud" search project is live on the Windows Azure Services platform.

Unfortunately, Ron doesn’t provide documentation for his project, which appears to feature Microsoft System Center. I haven’t been able to successfully search for “cloud” in any tabbed location.

Dan Kasun suggests Write an Azure app to win a NetBook! in this 3/15/2010 post:

Well... at least be entered into a contest to win one of three Dell Minis.

I really like it when I find interesting projects like this: a contest for Azure apps that is being done by an independent organization, not affiliated with Microsoft.  Just a great developer community engaging their developers in an interesting way.

The folks at CodeProject have very clear instructions on setting up accounts, developing applications, deploying applications, and then de-provisioning them (hey! with a link to my blog... which is actually how I found out about the contest... suddenly got a spike in views in Feb that came from CodeProject).  So there's another thing I like: unexpected links that drive a bunch of traffic to my blog :)

Go ahead, give it a shot... http://www.codeproject.com/Competitions/396/Try-Windows-Azure-in-March-and-Win-a-Netbook.aspx

As I've mentioned before - Cloud platforms, like Windows Azure, are a fantastic choice for Open Government solutions, so for those of you who are in the Public Sector space, consider this an opportunity to kill two birds with one app.

Also - if anyone has any good Government-oriented apps, let me know about them... would love to showcase them on my blog and on the Public Sector Developer portal: http://www.microsoft.com/government/developer.

Dan is Microsoft’s Senior Director, US Public Sector Developer & Platform Evangelism.

Sreeni.NET explains Developing and Hosting Application in Windows Azure in a detailed, fully illustrated tutorial of 3/15/2010.

Larry Gregory interviews Mike Hoskins of Pervasive Software Inc. in Channel9’s Talking with Pervasive Software Inc. about electronic document exchange and Azure Webcast of 3/12/2010:

Pervasive Business Exchange is a hosted service for trading business documents with trading partners. While initially seen as a horizontal solution, Larry and Mike talk about how this could target vertical markets such as healthcare. In addition Mike explains some of the issues which led Pervasive to work with the Microsoft Azure Platform.

SoCalTech.com reports Citysourced Links With Microsoft, Eyes Award on 2/24/2010, which I missed when posted:

Los Angeles-based CitySourced, the developer of a mobile tool for helping involve citizens in local government headed by Jason Kiesel and Kurt Daradics, has linked up with Microsoft and its Windows Azure platform, the firms said Tuesday. CitySourced is an angel-funded firm which develops a smart phone application which allows citizens to report potholes, graffiti, and other issues directly to local governments. The firm is using Microsoft Windows Azure for its application infrastructure. [Emphasis added.]

In unrelated news, CitySourced said it has also been nominated for the World Economic Forum Davos Pioneer 2011 awards, an effort to spotlight companies developing technology with long-term impact on business and society. Daradics is well known for his organization of the MOTM and Digital Family Reunion events in the Los Angeles area. CitySourced has received angel funding from Dale Okuno, most recently the founder of E-Z Data.

<Return to section navigation list> 

Windows Azure Infrastructure

Eric Nelson’s Windows Azure guidance from the Patterns and Practices team post of 3/17/2010 reports:

The P&P team have started to share guidance on the Windows Azure Platform.  They plan to group their efforts around:

1. Moving to the Cloud
2. Integrating with the Cloud
3. Leveraging the Cloud

First up is a document which explains the capabilities and limitations of Enterprise Library 5.0 Beta 2 in terms of use within .NET applications designed to run with the Windows Azure platform. You can download it here.

Eric’s post includes several related links.

Eric also posted links to his Slides and links from Cloud Computing Congress session on Windows Azure Platform on 3/16/2010.

Steve Marx offers My “Slides” from MIX10: Lap Around the Windows Azure Platform in this 3/16/2010 post:

For my MIX10 presentation this year, I cooked up a Windows Azure application that creates zoomable presentations (a bit like prezi or pptPlex) using Silverlight.  I’ll blog more when I get a chance about how the application works, but for those of you who are curious about the slides, you can take a look at them now at http://www.onebigslide.com/slides/play/smarx-mix10.  (The full recording of my talk should be somewhere on visitmix.com within the next day or so.)

Here’s one of the “slides,” which shows the architecture of http://onebigslide.com.

onebigslide.com architecture

Eugenio Pace’s Windows Azure Guidance – a-Expense “before” on CodePlex post of 3/15/2010 reports:

First build of our samples is now available on CodePlex. This initial version is the “before the cloud” baseline application, so you won’t find anything related to Windows Azure here.

This week we will take this simple baseline and start moving it to the cloud.

Goals for this next iteration are to:

  1. Claims-enable the application to keep SSO experience for users. We will use WIF for this.
  2. Remove dependency with AD for querying the user Manager and Cost Center. This will be done by sending this information as claims as opposed to having the application querying AD. We want to avoid having to call back into Adatum from a-Expense.
  3. Deploy an Identity Provider on Adatum (e.g ADFS). This is the issuer of security tokens for a-Expense. It will be configured to issue the claim set a-Expense needs (e.g. employee, Cost Center, employee manager)
  4. Move database to SQL Azure. This is straight forward. We may add some connection retry logic to the data access layer to increase resiliency. But it should “just work”.
  5. Move the Profile storage to Azure Table storage. This database is fairly small and has a simple data model. There’s really no need for full relational support.
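
The connection retry logic mentioned in goal 4 might look something like this minimal sketch (a hypothetical helper of my own, not code from the p&p guidance, which may structure it quite differently):

    using System;
    using System.Data.SqlClient;
    using System.Threading;

    // Hypothetical helper illustrating simple retry-on-transient-failure logic
    // for SQL Azure connections.
    static class SqlRetry
    {
        public static T Execute<T>(string connectionString, Func<SqlConnection, T> work)
        {
            const int maxAttempts = 3;
            for (int attempt = 1; ; attempt++)
            {
                try
                {
                    using (var conn = new SqlConnection(connectionString))
                    {
                        conn.Open();
                        return work(conn);
                    }
                }
                catch (SqlException)
                {
                    if (attempt == maxAttempts) throw;
                    Thread.Sleep(TimeSpan.FromSeconds(attempt * 2)); // simple back-off
                }
            }
        }
    }

A call site would pass the connection string plus a delegate that runs the actual query against the opened connection.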

Eugenio continues with a detailed “How it works” section:

In a nutshell:

[Eugenio’s post includes an architecture diagram here.]

We are trying to keep things as simple as possible.

Jody Gilbert claims “You don’t have to know everything about cloud computing, but a familiarity with the terminology will help you follow the trends and industry developments. This glossary offers a rundown of the terms you’re likely to come across” as an introduction to her Mini-glossary: Cloud computing terms you should know post of 3/16/2010 to TechRepublic’s Servers and Storage blog:

Cloud computing is one of the hottest topics in IT these days, with Microsoft, Google, Amazon, and other big players joining in the fray. However, the technology brings with it new terminology that can be confusing. Here are some common cloud-related terms and their meanings.

Note: This glossary is also available as a PDF download.

Architecture Update, the Microsoft Belux Architect Newsletter, provides a detailed description of Event driven architecture onto the Azure Services Platform with this summary:

Thanks to the Azure Services Platform each and every architect has an almost infinite amount of storage and compute power at his disposal without any large upfront investments. Together with these major advantages however, also come a lot of design challenges that will change the way we design software.

In this article, I will guide you through this new environment and point out some of these design challenges that the cloud presents to us. I will also propose an architectural style, and some additional guidance, that can be used to overcome many of these challenges. Furthermore I'll give you an overview of the tools offered by the Azure cloud platform that can be used to implement such a system. …


I was surprised to find no mention of the new Reactive Extensions (Rx) for .NET in the discussion.

Lori MacVittie explains “In this case “baby” is load balancing and the corner is cloud computing” in her Nobody Puts Baby in a Corner post of 3/15/2010:

SocialCloudNow recently wrote up a pretty darn accurate (which is hard to find these days) description of “cloud computing” by walking through the components required. The author did an excellent job – especially where he dove into the relationship between orchestration and cloud computing. Loved that a lot – most folks ignore that piece of cloud computing even though it’s very, very important. But I was a bit put off (okay, a lot put off) at one statement:


An honorable mention goes out to the Load balancer – which does the obvious.

Honorable mention? It’s an afterthought that certainly one of the key enabling technologies of cloud computing does not deserve. Shortly after reading the post and debating this point with Paul Richards (the author) I came to the realization that he was looking at cloud computing from the view point of the consumer, i.e. the organization, the customer, an administrator/developer looking for a cloud in which to deploy applications. That made his statement make a lot more sense. If you’re looking at cloud services offered and trying to decide which one to jump on then perhaps a load balancer isn’t your primary concern at all (although that makes me want to say, “Inconceivable!”). But from the perspective of the definition of cloud computing and the folks who are implementing (internal/external, public/private) such environments, a load balancer is certainly a lot more than just window dressing.

So I will say that as far as cloud services go, load balancing may be – based solely on consumer need, or perception of need – worthy of only honorable mention. But as far as implementing a cloud computing environment goes, it’s a requirement. …

Lori continues with a discussion titled “LOAD BALANCING is in the CLOUD DNA: FROM CPU to NETWORK to APPLICATION to DATA CENTER.”

“Agile Intuition” writes Azure: Do the Maths in this 3/15/2010 post based on his take-away from Chris Auld’s Monday afternoon MIX 10 Building Cloud Services with Windows Azure Platform session:

As technical director, I’m obviously concerned with technological progress and making sure we invest money only where the ROI justifies it. I’ve been using EC2 since its inception, mostly for making sure I can scale easily without incurring actual expenses on projects that might realize a profit. Basic equation: nothing to write mom about. But after attending Chris Auld’s presentation, I realize I did not get the Cloud right. Coding for high volume and high availability is not enough on the cloud. Partitioning and denormalizing for performance as one does in a non-cloud environment is not only not enough but will bite you big time when your next invoice comes.

Therefore, while the economic aspects of cloud do relate to scaling and managing upfront costs, they also relate to managing the architecture so as to leverage the possibilities of cloud computing and to work with the “pay as you go” nature of that beast. Cloud changes the way one thinks of scaling as well as the way one thinks of one's data. For example, denormalization and duplication need to be considered at least as much as the parallel and asynchronous dimensions of the cloud application. Batch jobs, queues, blobs and CSS sprites are all tools one needs to consider in one's architecture when thinking of cloud. What you store and where you store it is also a consideration. For example, binaries in SQL Server can make sense, but they don't in SQL Azure as we pay by the GB of space.

So what I take from yesterday's workshop is that cloud is about maths: performance will not be an issue as long as you apply sound development methods. Success, though, will come from figuring out the economic aspects of the beast and adjusting the application architecture accordingly. In the end, I think one could feel justified in thinking of Cloud as a game changer. It will not be so much a divide between those who are on the cloud and those who are not, but between those who can leverage the cloud and those who get bitten by it.

Jim Nakashima’s Mix ‘10: Building and Deploying Windows Azure-Based Applications with Microsoft Visual Studio 2010 post of 3/16/2010 supplements his MIX10 presentation with several links:

Thank you to all of you who attended my session at Mix ‘10 on Building and Deploying Windows Azure-Based Applications with Microsoft Visual Studio 2010.  The video will be available within the next couple of days.

Here are some of the key takeaways and links from the session:

Getting Started

The Web Platform Installer automates a number of the steps to install the Windows Azure Tools for VS 2008 or to install IIS prior to installing the Windows Azure Tools for VS 2010.

Get the patches - http://msdn.microsoft.com/en-us/azure/cc974146.aspx

Samples

http://code.msdn.microsoft.com/windowsazuresamples

Using WCF on Windows Azure

http://code.msdn.microsoft.com/wcfazure

ASP.NET Web Roles vs ASP.NET Web Applications

The 3 differences are:

  • References to the Windows Azure specific assemblies: Microsoft.WindowsAzure.Diagnostics, Microsoft.WindowsAzure.ServiceRuntime, and Microsoft.WindowsAzure.StorageClient
  • Bootstrap code in the WebRole.cs/vb file that starts the DiagnosticMonitor as well as defines a default behavior of recycling the role when a configuration setting change occurs.
  • The addition of a trace listener in the web.config file: Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener.
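
The bootstrap code in the second bullet is only a few lines; here’s a minimal sketch of what the WebRole.cs file from the VS 2010 template looks like (the template’s exact text may differ slightly):

    using System.Linq;
    using Microsoft.WindowsAzure.Diagnostics;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public class WebRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            // Start the diagnostic monitor using the storage account named in the
            // service configuration.
            DiagnosticMonitor.Start("DiagnosticsConnectionString");

            // Default behavior: recycle the role instance when a configuration
            // setting changes.
            RoleEnvironment.Changing += RoleEnvironmentChanging;
            return base.OnStart();
        }

        private void RoleEnvironmentChanging(object sender, RoleEnvironmentChangingEventArgs e)
        {
            if (e.Changes.Any(change => change is RoleEnvironmentConfigurationSettingChange))
            {
                e.Cancel = true; // cancelling the change forces the instance to restart
            }
        }
    }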

NerdDinner

The NerdDinner sample code can be found at: http://nerddinner.codeplex.com/

ASP.NET Provider scripts for SQL Azure

To use the ASP.NET providers with SQL Azure, you can use these scripts: http://support.microsoft.com/default.aspx/kb/2006191 to setup the database.

Unzipping the Service Package

To be able to unzip the Service Package the environment variable you need is _CSPACK_FORCE_NOENCRYPT_. See this post for more information.

<Return to section navigation list> 

Cloud Security and Governance

Ellen Messmer reports “Cloud providers are entitled to post the McAfee Cloud Secure mark when they've successfully completed daily vulnerability-assessment scans” in her McAfee scans cloud environments for security vulnerabilities article of 3/16/2010 for NetworkWorld:

McAfee Tuesday announced [at the Cloud Connect conference, #ccevent] a vulnerability-assessment scanning service that's aimed at giving cloud-computing service providers a way to provide security assurances to their customers.

Called the McAfee Cloud Secure Program, the daily scanning service is directed from the Internet into the cloud service provider to probe for any weaknesses in the network infrastructure, perimeter and applications, says Marc Olesen, senior vice president and general manager for McAfee's software-as-a-service business unit. 

The McAfee Cloud Secure Program, which is likely to be expanded to include other security services as well, "is intended to give customers more confidence in their cloud providers," Olesen says. SuccessFactors.com is among the first to participate in the McAfee program.

The charges for McAfee Cloud Secure are based on number of IP addresses, among other factors, and start at less than $5,000 per month, according to Olesen.

Cloud providers are entitled to post the McAfee Cloud Secure mark when they've successfully completed daily vulnerability-assessment scans. The program is modeled to some extent after the McAfee Secure Trustmark program for e-commerce Web sites that undergo security scans.

CloudTweaks reviews the Cloud Security Alliance’s latest activities in a 3/15/2010 post:

Security, and VP WW Engineering, reached out to me and after a rousing game of calendar alignment, we spoke about the Cloud Security Alliance, its goals and how it plans to go about achieving those goals. Novell’s products, by the way, have been powerful tools to help organizations achieve the goals of high levels of security and its expertise will certainly help the alliance. …

K. Scott Morrison explains Why Intermediaries Matter in SOA (and Layer 7 SecureSpan intermediates, in particular) in this 3/15/2010 post, which begins:

Last week Joe McKendrick from ZDNet asked the question are SOA anti-principles more important than success principles? The idea of anti-principles came from Steve Jones, who a few years back did some nice work documenting SOA anti-patterns. In a post published last fall, Steve builds on his ideas, observing:

“The problem is that there is another concept that is rarely listed, what are your anti-principles?”

which is one of those good questions that should give you pause.

Steve continues:

“In the same way as Anti-Patterns give you pointers when its all gone wrong then Anti-Principles are the things that you will actively aim to avoid during the programme.”

I found this interesting because one of the anti-principles the post lists is direct calling. Steve describes this bad practice as follows:

This anti-principle is all about where people just get a WSDL and consume it directly without any proxy or intermediary. It's programme suicide and it shouldn't be done.

Naturally, because I’m in the business of building intermediaries, this seemed perfectly reasonable to me. But on reflection, I think that the argument as to why direct calling is an anti-principle needs further explanation. …

and ends:

The task of implementing this security model now falls under the jurisdiction of a professional security administrator, not the developers of each separate application. In fact, no code or configuration needs to change on foo, bar, or any of my services. The security model is decoupled from the application, taken out of the hands of each developer and centralized. This is the basic value proposition of intermediaries in SOA, and this value is never realized effectively if you allow direct connections between clients and servers. This is why architectural patterns are sometimes necessary to allow us to be consistent with our principles—or our anti-principles, as the case may be.

Interested in trying an intermediary? You can get a Layer 7 SecureSpan virtual appliance to try out at http://www.layer7tech.com. Alternatively, do your evaluation completely in the cloud. Check out the SecureSpan virtual appliance gateway on the Amazon marketplace. This virtual appliance AMI runs in the EC2 cloud on Amazon Web services. It is the first and only SOA gateway to run in the cloud.

<Return to section navigation list> 

Cloud Computing Events

Nuno Godinho’s MIX 10 – Day Three Microsoft Silverlight and Windows Azure – A Match Made for the Web post of 3/17/2010 provides an outline of Matt Kerner’s SVC06 presentation of the same date:

Design Patterns

  • Hosting Silverlight in the Cloud
    • ASP.NET website hosted in Windows Azure
    • XAP file hosted on the website, used in the browser
      • Like an on-premises website, but has the flexibility of the Cloud
  • Shared Access Signature for Windows Azure Storage
    • Use the FromConfigurationSetting static method from the CloudStorageAccount class
    • Create a SharedAccessPolicy to define the policy for how the key will be shared.

Tips and Tricks

  • Silverlight
    • Use .NET Framework 3.5 in the Cloud
    • ASP.NET MVC is recommended as the WebRole
    • Make sure you Install Static Content support in IIS
    • Smooth Streaming not supported yet on Windows Azure
  • WCF
    • For load-balanced considerations
      • Load balanced endpoint flags
      • Consider connections timeouts (1 minute)
        • Asynchronous patterns works well
      • PollingDuplexHttpBinding expects affinity
        • In this case you’ll need to store the client state yourself
    • Different port mappings in the cloud
      • Quick: Patch port mappings into auto-generated service reference
  • Other Windows Azure Platform Services
    • SQL Azure
      • SQL Database in the cloud
      • Provisioned on-demand with high-availability
      • No physical administration required
    • Windows Azure Platform AppFabric
      • Service Bus
        • Connect Asynchronously between on-premise services and cloud services
          • Firewall traversal with message rendezvous
      • Access Control
        • STS in the Cloud

http://dev.windowsazure.com

The outline is useful, especially in view of the lack of an abstract for this session.
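
The Shared Access Signature bullet in the Design Patterns list maps to just a couple of StorageClient calls. Here’s a minimal sketch (the setting, container, and blob names are hypothetical, and the configuration setting publisher is normally wired up once in the role’s OnStart):

    using System;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.ServiceRuntime;
    using Microsoft.WindowsAzure.StorageClient;

    class SharedAccessSample
    {
        static void Main()
        {
            // FromConfigurationSetting requires a setting publisher; inside a role
            // this reads the value from the service configuration.
            CloudStorageAccount.SetConfigurationSettingPublisher((name, setter) =>
                setter(RoleEnvironment.GetConfigurationSettingValue(name)));

            var account = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
            var container = account.CreateCloudBlobClient().GetContainerReference("slides");

            // Grant read access for 30 minutes without handing the Silverlight
            // client the storage account key.
            var policy = new SharedAccessPolicy
            {
                Permissions = SharedAccessPermissions.Read,
                SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(30)
            };

            string sas = container.GetSharedAccessSignature(policy);
            Console.WriteLine(container.Uri.AbsoluteUri + "/deck.xap" + sas);
        }
    }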

Webcasts of several Azure-related MIX10 sessions were available as of 3/17/2010, 3:00 PM PDT.

Jim O’Neill announces that he’ll be presenting at the Windows Azure at Cloud Services Meetup on 3/17/2010 (today):

The February Meetup was cancelled due to the ‘winter storm that wasn’t’, so I’ll be presenting on Windows Azure this Wednesday, March 17th.

The second presentation of the evening will be by Gil Rapaport, Co-founder and President of Viewfinity, which provides a SaaS solution for managing the support and control of desktops, laptops and Windows servers, regardless of worker location.

The meetings are hosted at Aprigo, 460 Totten Pond Road, Suite 660, Waltham, MA, by the Meetup founder Tsahy Shapsa, and begin at 6:30 p.m.

Please RSVP at the Boston Cloud Services Meetup site, so they have an accurate count for pizza and drinks.

AzureBootCamp.com has posted its complete Schedule for all Azure Boot Camps from March through June 2010, but many cities’ camps are still marked TBD.

AzureBootCamp.com’s Announcing Windows Azure Boot Camp in Houston. May 24-25, 2010 post of 3/15/2010 explains:

What is a Windows Azure Boot Camp?

Windows Azure Boot Camp is a two day deep dive class to get you up to speed on developing for Windows Azure. The class includes a trainer with deep real world experience with Azure, as well as a series of labs so you can practice what you just learned. ABC is more than just a class, it is also an event in a box. If you don't see a class near you, then throw your own. We provide all of the materials and training you need to host your own class. This can be for your company, your customers, your friends, or even your family. Please let us know so we can give you all of the details.

Awesome. How much does it cost?

Thanks to all of our fantabulous sponsors, this two day training event is FREE! We will provide drinks and snacks, but you will be on your own for lunch on both days. This is a training class after all.

How do I attend one?

Just find a city and date on the schedule that works for you. Click through to see the details for that class, and then register. Keep in mind you will most likely need to bring your own laptop to do the labs.

What do I need to bring?

For most boot camps, you will need to bring your own laptop, and have it preloaded with the software listed here. An extension cord would help as well.

There isn't one near me? Now what?

Throw your own! We will help you put on your own boot camp. Contact us for more details

Dan Kasun’s Slides from my FedScoop presentation last week post of 3/13/2010 offers his slide deck from his Building Solutions with Windows Azure presentation at FedScoop’s Microsoft-sponsored Roadmap to the Cloud event on 3/9/2010.

Dan is Microsoft’s Senior Director, US Public Sector Developer & Platform Evangelism.

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

Jeffrey Schwartz reports IBM Launches Public Cloud Service in this 3/17/2010 article for Visual Studio Magazine.

Extending upon the private cloud offerings launched last year, IBM is the latest major player to launch a commercial hosted service.

Like Microsoft's recently-released Azure services, the new IBM Cloud, released Tuesday, is targeted at developers and testers. However like Microsoft, Amazon and Google, IBM is clearly looking to extend its portfolio of products and services to the public cloud over time.

"You will see IBM continuing to release this set of work-based cloud computing environments," said Daniel Kloud, director of cloud computing in IBM's Rational business group.

"IBM has been talking a good cloud game for the last year or so," notes Forrester Research analyst James Staten in a blog posting. "But its public cloud efforts, outside of application hosting have been a bit of wait and see. Well, the company is clearly getting its act together in the public cloud space with today’s announcement."

While IBM does offer targeted hosted services such as Lotus Live, the company's new IBM Cloud service brings some key components of Big Blue's platform to the commercial cloud such as its WebSphere suite of application servers and its DB2 and Informix databases.

"What IBM is offering customers is not only the infrastructure to put development and test environments in place, but we also provide software images," Kloud said, in an interview.

A customer or partner is presented with a catalog of images or they can have IBM provision their own images, Kloud said. "Basically you can get your development and test teams up and running in a matter of minutes because they avoid basically acquiring the hardware and configuring the system and software," he said.

According to IBM, 50 percent of an organization's IT infrastructure is used for development and test, while 90 percent of it is idle at any given time.

"Certainly any IaaS [infrastructure as a service] can be used for test and development purposes so IBM isn’t breaking new ground here," notes Forrester's Staten. However he says IBM is launching a storing offering with support from test and development partners such as SOASTA, VMLogix, AppFirst and Trinity Software.

IBM's new commercial cloud service currently only supports hosting of Linux systems – the company did not disclose plans for offering Windows Server images other than to say it will be expanding on its stack.

The public IBM Cloud infrastructure is based on Red Hat Enterprise Virtualization (RHEV) stack based on the Kernel-Based Virtual Machine (KVM). Red Hat acquired the technology from Qumranet in 2008.

Red Hat called the choice of RHEV over virtualization technology from VMware a coup for its hypervisor stack. "It's a big milestone," said Scott Crenshaw, vice president and general manager of Red Hat's cloud business, in an interview. Crenshaw argued that the key advantage of its RHEV stack released in November is its support for multi-tenant data architectures. "It has a lot of advantages in areas like reliability, scalability and security," he said.

As part of its launch, IBM released Rational Software Delivery Services for Cloud Computing v1.0, which includes components of the company's Rational development and testing suite. IBM is not publishing pricing for its service.

If you don’t publish public prices for cloud services, how does a customer know if his company is receiving “most-favored purchaser” pricing?

Brenda Michelson’s Cloud Connect Related Announcements post of 3/17/2010 begins:

As … discussed on Twitter last week, there are a dizzying number of cloud computing announcements coinciding with Cloud Connect this week. …

I had the pleasure of meeting Brenda at Tuesday night’s San Francisco Cloud Computing Club meet-up that was colocated with Cloud Connect at the Santa Clara Convention Center.

Brenda’s @ Cloud Connect: Public, Private, or Hybrid: Where’s the Value Today and Where’s It Going? post provides detailed coverage of Cloud Connect’s Public, Private, or Hybrid: Where’s the Value Today and Where’s It Going? panel discussion moderated by Vanessa Alvarez, Industry Analyst, Enterprise Infrastructure, Frost & Sullivan:

There’s no doubt that virtualization, automation, and service-centric architectures lead to cost efficiency and more agile information technology. But there are many ways to deploy clouds: Privately, atop on-premise hardware behind enterprise firewalls; publicly, through third-party service providers; or in a hybrid, blended model that leverages the best of both worlds. Which of these is right today? Why, and will this change? Join this panel for a look at the sweet spot of clouds and how utility computing will evolve in coming years.

Brenda continues with a list of panelists and their opening statements. Vanessa also attended the San Francisco Cloud Computing Club meet-up.

Mike Kirkwood asks Rulers of the Cloud: Will Amazon's Computing Fabric Become a New Economy? in this 3/16/2010 post:

This is the third entry in our exploratory series "Will One Company Dominate the Cloud". Today we're blinking twice after reviewing the innovation engine at Amazon.

The Amazon AWS product is all about services. While others are marketing the cloud with an exclamation point, the cloud leader is focused on the raw building blocks. This includes everything from storage to people. Amazon is learning how to find new ways to optimize connections and monetize them in increments of time.

Amazon, the Verb: Motion

When thinking of Amazon as a verb, one word stands out, motion. When Amazon was first introduced as the Internet bookstore, it immediately created a change in the landscape.

It seemed like the writing was on the wall for brick and mortar retail, and to a large degree, it was. In a mere 15 years, it has disrupted the entire book vertical with an end-to-end digital system. Amazon is now in the position to completely automate the flow of content bits from upstream to downstream. …

Shane O’Neill’s Inside MS Cloud Model for Productivity Apps of 3/15/2010 reports:

Enterprises of all shapes and sizes are catching on to the value in moving e-mail and other productivity apps to the cloud where they can be delivered and managed by vendors like Microsoft, Google or Cisco.

The major appeal of cloud is that it saves money, says Ron Markezich, Corporate VP of Microsoft Online Services.

But the benefits of moving apps out of your data center and into a cloud environment such as Microsoft's BPOS (business productivity online suite) go deeper than cost cutting, says Markezich. A cloud environment can speed up workflow simply by allowing workers to access e-mail from any Internet connection. It can get top brass using wikis and blogging to improve communication at a company. And it can take the burden of managing servers off the IT department and free them up to work on more business critical projects.

Yet the cloud still has certain stigmas that have kept some CIOs from allowing any data to leave the data center.

In this interview with CIO.com's Shane O'Neill, Microsoft's Markezich discusses the joys and potential pains of the cloud model for productivity apps. …

Read the interview here. Also, see Shane’s Google's Big New Cloud Play: Should Microsoft Be Afraid? of 3/11/2010.

Chirag Mehta’s Emergent Cloud Computing Business Models of 3/15/2010 begins:

Last year I wrote quite a few posts on the business models around SaaS and cloud computing, including SaaS 2.0, disruptive early stage cloud computing start-ups, and branding on the cloud. This year people have started asking me: we have seen PaaS, IaaS, and SaaS, but what do you think are some of the emergent cloud computing business models that are likely to go mainstream in coming years? I spent some time thinking about it and here they are [abbreviated]:

  • Computing arbitrage …
  • Gaming-as-a-service …
  • App-driven and content-driven clouds …

Chirag is Technology, Design, and Innovation Strategist with the Office of the CEO[s] at SAP.

<Return to section navigation list> 
