Thursday, March 24, 2011

Windows Azure and Cloud Computing Posts for 3/22/2011+

A compendium of Windows Azure, Windows Azure Platform Appliance, SQL Azure Database, AppFabric and other cloud-computing articles.


Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

  • Azure Blob, Drive, Table and Queue Services
  • SQL Azure Database and Reporting
  • MarketPlace DataMarket and OData
  • Windows Azure AppFabric: Access Control, WIF and Service Bus
  • Windows Azure VM Role, Virtual Network, Connect, RDP and CDN
  • Live Windows Azure Apps, APIs, Tools and Test Harnesses
  • Visual Studio LightSwitch and Entity Framework v4+
  • Windows Azure Infrastructure and DevOps
  • Windows Azure Platform Appliance (WAPA), Hyper-V and Private/Hybrid Clouds
  • Cloud Security and Governance
  • Cloud Computing Events
  • Other Cloud Computing Platforms and Services

To use the above links, first click the post’s title to display the single article you want to navigate.


Azure Blob, Drive, Table and Queue Services

Alan Smith announced the availability of his 00:23:38 Introduction to Azure Storage Webcast in a 3/23/2011 post:

Whilst enjoying my enforced 8-month Swedish paternity leave I have found a bit of time to continue looking at Windows Azure and AppFabric (both Server and Azure), and have started another series of Azure webcasts with the first one on Azure Storage (please excuse my daughter having a tantrum half-way through recording). I plan to look at the existing functionality present in Windows Azure, and then look at the emerging technologies in Windows Azure AppFabric.

The first in the series is “An Introduction to Azure Storage”; more will follow.



<Return to section navigation list> 

SQL Azure Database and Reporting

No significant articles today.


<Return to section navigation list> 

MarketPlace DataMarket and OData

No significant articles today.


<Return to section navigation list> 

Windows Azure AppFabric: Access Control, WIF and Service Bus

Paul Mehner explained Programmatically Adding Google or Yahoo as an Identity Provider to the Windows Azure AppFabric Labs v2.0 Access Control Service in a 3/22/2011 post to the Wintellect blog:

This blog post assumes that the reader knows the basics of Identity Providers and Security Token Services. Its purpose is to illustrate how to programmatically add Google or Yahoo as an Identity Provider, because there isn’t much information available on how to do this. For further information about using the ManagementService proxy, I suggest downloading the Codeplex ACS Management examples from http://acs.codeplex.com/releases/view/57595.

We manage the Windows Azure AppFabric Access Control Service v2.0 through code using the ManagementService proxy and data types, which are generated when we add a service reference to the ACS metadata endpoint located at https://{yournamespace}.accesscontrol.appfabriclabs.com/v2/mgmt/service. You can do this using either the Visual Studio “Add Service Reference” menu option, or manually using the svcutil.exe utility. There are examples of this in the code samples mentioned above.

To begin, we will use the management service proxy to retrieve a list of the IdentityProviders that have already been installed for the targeted namespace. By default, Windows Live ID will already be present and cannot be removed. The management service API requires that all requests be accompanied by a SWT token, which is also covered in the previously mentioned code samples.
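As a rough illustration of that first step, here is a minimal sketch (the namespace URL is a placeholder, ms is the generated ManagementService proxy, and swtToken is assumed to have been obtained as shown in the Codeplex samples):

      // Minimal sketch, not production code: point the generated proxy
      // at your namespace's management endpoint.
      var ms = new ManagementService(
          new Uri("https://yournamespace.accesscontrol.appfabriclabs.com/v2/mgmt/service/"));

      // Attach the SWT token to every outgoing management request; the exact
      // Authorization header format is shown in the ACS samples (WRAP form assumed here).
      ms.SendingRequest += (sender, args) =>
      {
          args.RequestHeaders["Authorization"] = "WRAP access_token=\"" + swtToken + "\"";
      };

      // List the identity providers already configured for the namespace;
      // Windows Live ID is present by default and cannot be removed.
      foreach (IdentityProvider idp in ms.IdentityProviders)
      {
          Console.WriteLine(idp.DisplayName);
      }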

To create a new IdentityProvider, we need to establish an Issuer for tokens coming from that identity. To do this, we create a new instance of the “Issuer” type and initialize its Name property to “Google”. This “friendly name” will appear in the ACS Management portal UI. We can then add that instance to the management service’s Issuers collection and save our changes, which generates a new Id for the Issuer. We can then create an instance of IdentityProvider, set the DisplayName and Description to appropriate values for display in the ACS Management Portal, and set the WebSSOProtocolType to “OpenId” and the IssuerId to the Id property of the Issuer that we just created and saved.

      // ms is an instance of the ManagementService proxy
      Issuer issuer = new Issuer { Name = "Google" };
      ms.AddToIssuers(issuer);
      ms.SaveChanges(SaveChangesOptions.Batch);

      // Create Identity Provider
      IdentityProvider identityProvider = new IdentityProvider {
            DisplayName = "Google",
            Description = "Google",
            WebSSOProtocolType = "OpenId",
            IssuerId = issuer.Id
      };
      ms.AddObject("IdentityProviders", identityProvider);

We need a means for the token requestor and consuming applications to verify the authenticity of tokens issued by the STS. The STS publishes the base64 encoded public key of the certificate that it will use to digitally sign its tokens in the metadata exchange document. We will set the appropriate IdentityProviderKey properties to the certificate values and then we’ll add the IdentityProviderKey object to our object graph and associate it with the IdentityProvider that will use it as shown in the following code:

       // *** Create the Identity Provider key used to validate
       // the signature of IDP-signed tokens. Signing certificates
       // can be found in a WS-Federation IDP's metadata.
       IdentityProviderKey identityProviderKey = new IdentityProviderKey {
              DisplayName = "GoogleIdentityProviderKeyDisplayName",
              Type = "X509Certificate",
              Usage = "Signing",
              Value = Convert.FromBase64String("MIIB9DCCAWGgAwI…"),
              IdentityProvider = identityProvider,
              StartDate = DateTime.UtcNow,
              EndDate = DateTime.UtcNow.AddYears(1)
       };

       ms.AddRelatedObject(identityProvider, "IdentityProviderKeys", identityProviderKey);

Our new Google or Yahoo IdentityProvider will need to have an endpoint address associated with it. We can do this by creating an instance of the IdentityProviderAddress class, adding it to the entity data model and then saving our changes. There are two properties on this class with values that are less than obvious (or even discoverable). For Google, the Address property of the endpoint address instance must be set to https://www.google.com/accounts/o8/ud and the EndpointType must be set to “SignIn”.  For Yahoo, set the Address property to https://open.login.yahooapis.com/openid/op/auth and the EndpointType to “SignIn”.

       IdentityProviderAddress googleRealm = new IdentityProviderAddress() {
              Address = "https://www.google.com/accounts/o8/ud",
              EndpointType = "SignIn",
              IdentityProvider = identityProvider
       };

       ms.AddRelatedObject(identityProvider, "IdentityProviderAddresses", googleRealm);
       ms.SaveChanges(SaveChangesOptions.Batch);

We now need to associate our new Google IdentityProvider with the relying party applications that will depend upon it. In our case, this is every RelyingParty defined (other than the AccessControlManagement relying party), so we simply loop through them as the following code demonstrates:

// Make this IDP available to relying parties
// (except for the Management RP)
       foreach (RelyingParty rp in ms.RelyingParties) {
              // Skip the built-in management RP
              if (rp.Name != "AccessControlManagement") {
                     ms.AddToRelyingPartyIdentityProviders(new RelyingPartyIdentityProvider {
                           IdentityProviderId = identityProvider.Id,
                           RelyingPartyId = rp.Id
                     });
              }
       }
       ms.SaveChanges(SaveChangesOptions.Batch);

This should be enough to supplement your knowledge of using the Windows Azure AppFabric Labs v2.0 Access Control Service Management API to programmatically set up Google (or Yahoo) as an Identity Provider for your relying party applications.


The Windows Azure AppFabric Team posted a Windows Azure AppFabric Scheduled Maintenance Notification (March 31, 2011) on 3/22/2011:

Due to upgrades and enhancements we are making to Windows Azure AppFabric, the AppFabric Portal, located at http://appfabric.azure.com, will be locked for updates for a few hours.
During that time you will not be able to create, update or delete any namespaces.

There will be no disturbance to the services themselves, nor will there be any breaking changes as a result of the upgrades.

When:

  • START: March 31, 2011, 9 am PST
  • END: March 31, 2011, 9 pm PST

Impact Alert: You will not be able to create, update or delete any namespaces in the AppFabric Portal.

Action Required: None

If you experience any issues or have any questions please visit our Windows Azure Platform Forums.

We apologize in advance for any inconvenience this might cause.


An MS Tech Talk persona asked and answered Why use Windows Azure AppFabric caching service instead of Azure Storage/SQL Azure? in a thread in the Windows Azure AppFabric CTP forum on 3/21/2011:

Q: I’m trying to understand how to position the Windows Azure AppFabric caching service.

Here is my main question/concern:
If the cache is managed in the cloud, how much performance benefit does an app really get compared to just accessing Windows Azure Storage or SQL Azure again?

I would expect a big performance benefit to justify setting things up on the portal, introducing DLL dependencies on my project, learning an API & integrating the use of it into my code, and then the to-be-determined pricing.


A: The Windows Azure AppFabric Caching service is a distributed in-memory application cache that is sharable between processes/services. The core difference is that with the Caching service you’re only talking over the network and the data is loaded in memory (hence no disk I/O), whereas storage and SQL Azure hit disk (in most cases). As a result of it being in-memory, you get better performance.

There are demo apps such as http://cachedemo.cloudapp.net/ that showcase the improved performance. You can read more about this demo here.

It is pretty easy and straightforward to provision and use. If you’ve used the on-premise version of the Caching that is part of Windows Server AppFabric you will see it has the same development model and API. Here is a useful Hands-on-Lab that you can use to get started.
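As a rough sketch of that shared development model, the snippet below shows the basic put/get pattern. It assumes your configuration file’s dataCacheClient section already points at your Caching service namespace, and the cache key and value are placeholders:

using System;
using Microsoft.ApplicationServer.Caching;

class CacheQuickStart
{
    static void Main()
    {
        // DataCacheFactory reads the dataCacheClient section of
        // app.config/web.config to locate the Caching service endpoint.
        var factory = new DataCacheFactory();
        DataCache cache = factory.GetDefaultCache();

        // Put an object into the distributed in-memory cache
        // with a ten-minute timeout...
        cache.Put("greeting", "Hello from the cache", TimeSpan.FromMinutes(10));

        // ...and read it back from any client sharing the cache.
        string greeting = (string)cache.Get("greeting");
        Console.WriteLine(greeting);
    }
}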

You can also use the Caching service to cache ASP.NET Session State data without changing a single line of code; you only need to make a minor change in the configuration file. Take a look here.

I sincerely doubt that the icon above is a registered trademark; it’s the Windows trademark that’s registered.


<Return to section navigation list> 

Windows Azure VM Role, Virtual Network, Connect, RDP and CDN

No significant articles today.


<Return to section navigation list> 

Live Windows Azure Apps, APIs, Tools and Test Harnesses

Avkash Chauhan announced the release of the Windows Azure Toolkit for Windows Phone 7 on 3/23/2011:

Cloud services and phone applications are a powerful combination, and our goal is to make it as easy as we can for you to use Windows Azure to provide storage and services for your phone applications. Everything we’ve built in this toolkit is to simplify the experience and optimize your time. For example, rather than require you to learn any new semantics around storage or put in the time to build out membership services to provide authentication and authorization for your phone applications, we’ve done it for you and provide you with a sample demonstrating the necessary steps.

The toolkit contains the following pieces:

  • Binaries – These are libraries we’ve written that you can use in your Windows Phone 7 applications to make it easier to work with Windows Azure (e.g. a full storage client library for blobs and tables). You can literally add these libraries to your existing Windows Phone 7 applications and immediately start leveraging services such as Windows Azure storage.
  • Docs – We’ve provided documentation that covers setup and configuration, a review of the toolkit content, getting started, and some troubleshooting tips.
  • Dependency Checker – As you’ve come to expect and love, we provide a full dependency checker to ensure that you have all the bits required in order to successfully use the toolkit.
  • Project Templates – We have built VSIX (which is the unit of deployment for a Visual Studio 2010 Extension) files that create project templates that make it easy for you to build brand new applications.
  • Samples – We have a sample application that fully leverages the toolkit, both available in C# and VB.NET. The sample application is also built into one of the two project templates created by the toolkit.

What you can do immediately:

  1. Create a Cloud Project for Windows Phone 7 and include desired Role
  2. Connect to Azure Storage very quickly from the application (see the sketch after this list)
  3. Use WCF endpoints for communication with other components
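To give a feel for the plumbing the toolkit’s storage client library handles for you, here is a hedged sketch of reading a publicly readable text blob over the raw storage REST endpoint from a phone application. The account, container and blob names are hypothetical, and the toolkit also covers authenticated access, which this sketch omits:

using System;
using System.Net;

public class PublicBlobReader
{
    // Downloads a publicly readable blob; "myaccount", "notes" and
    // "readme.txt" are placeholder names for illustration only.
    public void ReadBlob()
    {
        var client = new WebClient();
        client.DownloadStringCompleted += (sender, e) =>
        {
            if (e.Error == null)
            {
                string blobText = e.Result; // the blob's content
            }
        };

        // Windows Phone 7 networking is asynchronous-only, hence the callback.
        client.DownloadStringAsync(new Uri(
            "http://myaccount.blob.core.windows.net/notes/readme.txt"));
    }
}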

Get It from http://watoolkitwp7.codeplex.com/.

Documentation: http://watoolkitwp7.codeplex.com/wikipage?title=Getting%20Started&referringTitle=Documentation.


Jason Haley (@haleyjason) explained How to Add MVC 3 Project Types to the Azure Cloud Service Wizard in a 3/22/2011 post:

Last night Julie Lerman was asking for something like this … I hope this helps you out Julie.

This should save some time if you are using MVC 3 applications with Azure web roles.

The approach I took was to adapt the existing MVC 2 Cloud Service Item Template to MVC 3, and it seems to work fine – except you have to add the test project manually (which I did some work on too – more on that below).

I created the following Item Templates and MVC Test Project Template (linked for you to download the ones you want):

NOTE: All are currently C#

After you follow the steps below, you’ll get a dialog with the new web roles available when you create a new Cloud project, as shown here:

[Screenshot: the New Cloud Service Project dialog listing the new MVC 3 web roles]

Step 0:  Get the files

Download the templates you need from my site:

Step 1:  Put the Item Templates where VS will use them

Copy the zip files you downloaded to the following directory:  C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\ItemTemplates\CloudService\NETFramework4\Web Role\Visual C#

My directory has all the templates and looks like this afterwards:

[Screenshot: the Web Role\Visual C# item-template directory containing all the template zip files]

Step 2:  Put the Test Project Template where VS will use it

Copy the MvcWebApplicationTestProjectTemplatev3.0.cs.zip file to the following directory:

C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\ProjectTemplates\CSharp\Test\1033

My test directory has the following contents after I copy the zip into it:

[Screenshot: the CSharp\Test\1033 project-template directory after copying the zip]

Step 3:  (Optional) Delete the ItemTemplate and ProjectTemplate cache

This step may be optional, but I did it – feel free to skip it. If you find the templates are not showing up in the dialog, you can come back and do Steps 3 and 4 at that time.

Find the ItemTemplatesCache directory (C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\ItemTemplatesCache) and delete all the contents in it.

Find the ProjectTemplatesCache directory (C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\ProjectTemplatesCache) and delete all the contents in it.

Step 4:  Rebuild Visual Studio’s template cache

Open a VS Command Prompt (found under Microsoft Visual Studio 2010 –> Visual Studio Tools on your Start menu).

Run the command: devenv.exe /installvstemplates

This will take a while – on my machine it takes between 10 and 15 minutes.

About the Test Project

After adding the templates as outlined above, you’ll be able to create a web role using the different MVC 3.0 project templates – however you don’t get the chance to add a test project.  That is the reason for step 2 above.

After you have created the web role, you get a Solution Explorer that looks something like the image to the left. 

In order to add a test project you need to do the following:

  1. Right click on the Solution node
  2. Choose Add –> New Project…
  3. In the template listing on the left, choose Visual C# –> Test
  4. This will give you the option to add an ASP.NET MVC3 Test Project

[Screenshot: the Add New Project dialog with the ASP.NET MVC 3 Test Project template selected]

Once the Test Project is added, you will need to add a reference to your MVC project and fix the namespaces in the Controller Tests (if you keep them). 

I hope this saves someone some time, let me know if you have any problems with it.

<Return to section navigation list> 

Visual Studio LightSwitch and Entity Framework v4+

Jeff Derstadt posted Using WCF Data Services with Entity Framework 4.1 and Code First to the ADO.NET Team blog on 3/21/2011:

In this post you’ll see how Code First and the DbContext class in Entity Framework 4.1 RC will work with WCF Data Services going forward. WCF Data Services in .NET 4.0 was released prior to the Entity Framework 4.1 RC and so does not natively know about the DbContext class. There are numerous posts already on how to get DbContext to work with WCF Data Services in .NET 4.0, including this one by Rowan Miller. These posts talk about how to write some extra code to initialize the DataService class. This is not how we envisioned these technologies working together. However, it is possible to see how these technologies were meant to work together by using EF 4.1 RC along with the Microsoft WCF Data Services 2011 CTP2. In this walk through we’ll build a simple WCF Data Service using Code First with the latest and greatest releases. This post does assume some knowledge of Code First, DbContext, and WCF Data Services.

Getting Started

1. Download and install Microsoft WCF Data Services 2011 CTP2.

2. Open Visual Studio 2010 and create an ASP.NET Empty Web Application called “HospitalWeb”

3. Using the NuGet package manager, add the Entity Framework 4.1 RC to your project by installing the “EntityFramework” package.
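If you prefer the Package Manager Console to the dialog, the equivalent command is Install-Package EntityFramework (run against the HospitalWeb project).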

Alternatively, you can download and install the full Entity Framework 4.1 RC setup. If you do this, you should add a reference to “EntityFramework.dll” in your project.

Create your data model

We’ll be using the Code First approach to build our data model and so will be using the DbContext API to help surface that. Here is what you need to do to your project:

1. Add a new code file to your project called “Model.cs”.

2. Add the following entity classes to your file:

using System.Collections.Generic;            // for ICollection<T>
using System.ComponentModel.DataAnnotations; // for [MaxLength]

public class Patient
{
    public int Id { get; set; }
    [MaxLength(64)]
    public string Name { get; set; }
    public virtual ICollection<LabResult> LabResults { get; set; }
}

public class LabResult
{
    public int Id { get; set; }
    public string Result { get; set; }
}

3. Add a DbContext class to your project. In this case, I’ve called mine HospitalContext.

public class HospitalContext : DbContext
{
    public DbSet<Patient> Patients { get; set; }
    public DbSet<LabResult> LabResults { get; set; }
}

4. You can also optionally add a Database initializer to your HospitalContext. One of the nice things about writing new applications with DbContext is that it can optionally create a database for you based on your .NET classes or entity data model, and also optionally pre-populate that database with seed data. Here is the initializer I am using:

public class HospitalContextInitializer :
                 DropCreateDatabaseIfModelChanges<HospitalContext>
{
    protected override void Seed(HospitalContext context)
    {
        context.Patients.Add(new Patient { Name = "Fred Peters" } );
        context.Patients.Add(new Patient { Name = "John Smith" } );
        context.Patients.Add(new Patient { Name = "Karen Fredricks" } );
    }
}

To have your HospitalContext use this initializer, I added a static constructor to my HospitalContext with the method call to start using the initializer:

static HospitalContext()
{
    Database.SetInitializer(new HospitalContextInitializer());
}

Your data model and context are now ready to be used with WCF Data Services. The first time a HospitalContext is created by our service, a database will be created for us and our seed data will be inserted.

Create your data service

The Microsoft WCF Data Services 2011 CTP2 did not include updated Visual Studio item templates, so I’ll show you how to use the existing item templates and fix them up to use the CTP:

1. Add a new project item to your HospitalWeb project, choosing “Web” and then “WCF Data Service” in the Visual Studio Add Project Item dialog. I called my service “HospitalService.svc”.

2. The WCF Data Service that was added to your project refers to .NET 4.0 only, not to CTP2, so we’ll need to make a few changes:

a. Remove the references to “System.Data.Services” and “System.Data.Services.Client” from your project.

b. Add references to “Microsoft.Data.Services” and “Microsoft.Data.Services.Client” to your project. Note that these assemblies are not installed in the GAC as part of the WCF Data Services 2011 CTP2 installation, so you’ll need to browse to the installation directory to find them.

c. In your “HospitalService.svc.cs” file, change the service version from V2 to V3:

config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V3;

Use your data model with your data service

Now you are ready to use your HospitalContext with your WCF Data Service. Here is all you need to do:

1. In your “HospitalService.svc.cs” file, change your HospitalService class to be a DataService<HospitalContext>:

public class HospitalService : DataService<HospitalContext>

2. Change the access permissions to the two sets of data we are interested in, Patients and LabResults:

public static void InitializeService(DataServiceConfiguration config)
{
    config.SetEntitySetAccessRule("Patients", EntitySetRights.AllRead);
    config.SetEntitySetAccessRule("LabResults", EntitySetRights.AllRead);
    config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V3;
}

3. You can also optionally add a service operation to your service to run custom queries and business logic using your HospitalContext. Below is a service operation method to retrieve all patients by name:

[WebGet]
public IQueryable<Patient> GetPatientsByName(string name)
{
    return CurrentDataSource.Patients.Where(p => p.Name.Contains(name));
}

Notice that CurrentDataSource is strongly typed to your HospitalContext, which is a DbContext instance from Entity Framework 4.1.

4. Finally, give permissions to your service operation in the InitializeService method:

config.SetServiceOperationAccessRule("GetPatientsByName", ServiceOperationRights.AllRead);

Run your service

You are now ready to run things. Go ahead and press F5. A bunch of things are going to happen:

1. Code First is going to build your data model using the Patient and LabResult classes.

2. The DbContext class is going to use that data model to create a database for you and populate it with our seed data.

3. A WCF Data Service is going to start up and display the metadata for the service.

You can now try out a few WCF Data Services queries. Here are a few I played with:

1. Return all patients:

http://localhost:55051/HospitalService.svc/Patients/

2. Call the service operation GetPatientsByName and search for “Fred”:

http://localhost:55051/HospitalService.svc/GetPatientsByName?name='Fred'
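
3. Try a standard OData query option against an entity set; for example, this (untested, using the same placeholder port) filter should return patients whose names start with “K”:

http://localhost:55051/HospitalService.svc/Patients?$filter=startswith(Name,'K') eq true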

As you can see, we’ve tried to make using the DbContext class and Code First seamless with WCF Data Services. Both of the releases used in this post are in pre-release form, so if you find any issues with using them, using them together, or with the experience, please let us know!



<Return to section navigation list> 

Windows Azure Infrastructure and DevOps

Nubifer reported Microsoft Outlines Plans for Integration-as-a-Service on Windows Azure on 3/21/2011:

Although Microsoft officials have been discussing plans for the successor to the company’s BizTalk Server 2010 product for some time, the cloud angle of Microsoft’s plans for its BizTalk integration server didn’t become clear until late October 2010, at the Professional Developers Conference (PDC). 

When looking at a BizTalk Server Team blog post, it appears as if Microsoft is thinking about BizTalk vNext transforming into something akin to Windows Azure and SQL Azure—at least in concept—a “BizTalk Azure.”

An excerpt from the blog says, “Our plans to deliver a true Integration service—a multi-tenant, highly scalable cloud service built on AppFabric and running on Windows Azure—will be an important and game changing step for BizTalk Server, giving customers a way to consume integration easily without having to deploy extensive infrastructure and systems integration.”

The latest news from Microsoft reveals that there will be an on-premise version of BizTalk vNext as well—and the final version is slated to arrive in 2012. A Microsoft spokesperson said, “We will deliver new cloud-based integration capabilities both on Windows Azure (as outlined in the blog) as well as continuing to deliver the same capability on-premises. This leverages our AppFabric strategy of providing a consistent underlying architecture foundation across both services and server. This will be available to customers in the 2 year cadence that is consistent with previous major releases of BizTalk Server and other Microsoft enterprise server products.”

In September 2010, Microsoft released the latest on-premises software version of BizTalk (BizTalk Server 2010), which is a minor release of Microsoft’s integration server that supports Visual Studio 2010, SQL Server 2008 R2, Windows Server AppFabric and Windows Server 2008 R2.

There are currently over 10,000 BizTalk Server customers—paying a hefty price for the product—and thus Microsoft officials are being careful in their positioning of BizTalk Azure. Microsoft will ensure that existing customers are able to move to the Azure version “only at their own pace and on their own terms.” Microsoft plans on providing side-by-side support for BizTalk Server 2010 and BizTalk Azure to make sure apps don’t break, and will also offer “enhanced integration between BizTalk and AppFabric (both Windows Server AppFabric and Windows Azure AppFabric).”

Microsoft recently rolled out the first CTP (Community Technology Preview) of the Patterns and Practices Composite Application Guidance for using BizTalk Server 2010, Windows Server AppFabric and Windows Azure AppFabric together as part of an overall composite application solution. Additionally, Microsoft previewed a number of future enhancements to Windows Azure AppFabric.


<Return to section navigation list> 

Windows Azure Platform Appliance (WAPA), Hyper-V and Private/Hybrid Clouds

Kurt Mackie posted Microsoft Unveils System Center 2012 Products to the RedmondMag blog on 3/22/2011:

Microsoft today announced its System Center 2012 product family at the Microsoft Management Summit, which is ongoing this week in Las Vegas.

The new product family succeeds System Center 2007, and includes solutions such as System Center Virtual Machine Manager 2012 (SCVMM 2012), which is available today as a beta release at this site. Other new products to come include System Center Operations Manager 2012, System Center Service Manager 2012 and System Center Data Protection Manager 2012.

New Capabilities
Microsoft added enhancements to the products in the System Center 2012 family, but it also added some new capabilities.

One of the new capabilities is called System Center Orchestrator, an IT process automation solution for datacenter management that coordinates services, which is based on Microsoft's Opalis acquisition. Microsoft updated Opalis to version 6.3 in November and announced last year that it would be available to customers who purchased Microsoft's server management suite enterprise license or server management suite datacenter license with Software Assurance. This month, Microsoft announced a Technology Adoption Program to sign up for its next-generation Opalis release.


Another new capability is System Center Advisor, which is the service originally code-named "Atlanta" that can be used to actively detect server configuration problems. System Center Advisor is currently available as a release candidate version at this Microsoft portal. The technology was first announced at last year's SQL PASS event, but it's not just for SQL Server, according to Microsoft officials.

The third and last new capability is System Center Project, code-named "Concero." System Center Project is the successor to Microsoft System Center Virtual Machine Manager Self-Service Portal, according to Amy Barzdukas, Microsoft's general manager of server and tools communications. It allows workflows to be created, such as assigning a cloud to the finance department within an organization, even while IT administrators maintain overall control.

Microsoft is enabling self-service portals that will allow service owners to configure, manage and deploy services "without having to deal with things like Virtual Machine Manager and spinning up virtual machines," explained Don Retallack, an analyst with the Directions on Microsoft consultancy. He added that Concero is part of this effort but that SCVMM 2012 will also enable the capability.

"Concero looks very much like the VMM self-service portal but with some other features -- a graphical view of the service, for example -- and it will be tied to all of the System Center products at some point, including Operations Manager," Retallack explained.

Enhancements
As for the enhanced capabilities, System Center Operations Manager 2012 now fully integrates Microsoft's AVIcode acquisition, which helps pinpoint flaws in applications built on Microsoft .NET and J2EE platforms. System Center Service Manager 2012 now enables self-service requests from business managers to request cloud resources. System Center Data Protection Manager 2012 adds "enterprise-class" centralized backup and protection, de-duplication support, and SharePoint integration functionality.

New in SCVMM 2012 is its beefed-up hypervisor support. The virtual machine management product currently works with Microsoft Hyper-V and VMware vSphere 4.1 solutions, and it now includes support for Citrix's XenServer. IT pros can manage multiple clouds running different hypervisors, explained Kenon Owens, technical product manager for Microsoft integrated virtualization, during a press demo. Owens generally noted that Microsoft has done a lot of work to meet its customer requirements for managing private clouds with this release of SCVMM 2012.

Another new feature in SCVMM 2012 described by Owens is called "dynamic optimization." It allows IT pros to allocate virtual machine workloads on the fly. He said that this capability allows IT pros to set how the workload is balanced and they can also manage capabilities such as power optimization with it. Users can also create collections of virtual machines in a new "service template." Owens said that the nice thing about using service templates is that versions can be set for them, which can be useful for IT pros when they update services. Microsoft also added storage capability based on the SMI-S storage protocol.

As noted above, some parts of the System Center 2012 product family are available today as test versions. Microsoft plans to deliver final System Center 2012 product releases sometime this year, but the details weren't disclosed. However, Retallack said that most of the System Center 2012 products will be released by Microsoft in the second half of this year.

Microsoft also issued a visionary statement about enabling private cloud computing. Brad Anderson, corporate vice president at Microsoft's Management and Security Division, noted that IT is moving on from just consolidating servers through virtualization into a "new computing paradigm." That paradigm will focus more on managing applications and tapping cloud computing, he explained in a blog post.

Kurt is online news editor, Enterprise Group, at 1105 Media Inc.

Full disclosure: I’m a contributing editor for Visual Studio Magazine, which 1105 Media publishes.


<Return to section navigation list> 

Cloud Security and Governance


No significant articles today.


<Return to section navigation list> 

Cloud Computing Events

No significant articles today.


<Return to section navigation list> 

Other Cloud Computing Platforms and Services

Matt explained how to Build a Cluster Computing Environment in Under 10 minutes in a 3/22/2011 post to the Amazon Web Services blog:

We've created a new video tutorial, which describes how to set up a cluster of high performance compute nodes in under 10 minutes. Follow along with the tutorial to get a feel for how to provision high performance systems with Amazon EC2 - we'll even cover the cost of the resources you use, through a $20 free service credit.

Why HPC?

Data is at the heart of many modern businesses. The tools and products that we create in turn generate complex datasets which are increasing in size, scope and importance. Whether we are looking for meaning within the bases of our genomes, performing risk assessments on the markets or reporting on click-through traffic from our websites, these data hold valuable information which can drive the state of the art forward.

Constraints are everywhere when dealing with data and its associated analysis, but few are as restrictive as the time and effort it takes to procure, provision and maintain the high performance compute servers which drive that analysis.

The cluster compute instance sizes available on Amazon EC2 can greatly reduce this constraint, and give you the freedom to run high-specification analyses on demand, as and when you need them. Amazon EC2 takes care of provisioning and monitoring your compute cluster and storage, leaving you more time to dive into your data.

A guided tour

To demonstrate the agility this approach provides, I made a short video tutorial which guides you through how to provision, configure and run a tightly coupled molecular dynamics simulation using cluster compute instances. The whole cluster is up and running in under 10 minutes.

Start the tutorial!

To help get a feel for this environment, we're also providing $20 of service credits (enough to cover the cost of the demo), so you can follow along with this tutorial for free. To register for your free credits, just follow the link on the tutorial page.

In addition to getting up and running quickly, each cluster compute instance is no slouch either. They use hardware virtualisation to allow your code to get closer to the dual quad-core Nehalem processors, and full-bisection 10 Gbps networking for high speed communication between instances. Multi-core GPUs are also available - a perfect fit for large scale computational simulation or rendering.

Just as in other fields, cloud infrastructure can help reduce the 'muck' and greatly lower the barrier of entry associated with working with high performance computing. We hope this short video will give you a flavour for things.

Get in touch

Feel free to drop me a line if you have any questions, or you can follow along on Twitter. I also made a longer form video, which includes a wider discussion on high performance computing with Amazon EC2.


<Return to section navigation list> 

 
