Friday, April 15, 2011

Windows Azure and Cloud Computing Posts for 4/13/2011+

A compendium of Windows Azure, Windows Azure Platform Appliance, SQL Azure Database, AppFabric and other cloud-computing articles.


• Updated 4/15/2011 at home from MIX 11 with articles marked from the Windows Azure Team, Channel9, MIX 11, Michael Bridge, OData Wiki, Don Kieley, and Guru Prasad.

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

To use the above links, first click the post’s title to display the single article you want to navigate.


Azure Blob, Drive, Table and Queue Services

No significant articles today.


<Return to section navigation list> 

SQL Azure Database and Reporting

Steve Yi reported the availability of a MSDN Article: How to Connect to SQL Azure using PHP on 4/13/2011:

For developers creating an application using PHP, you can connect to Microsoft SQL Azure Database. MSDN has written a walkthrough article on how to do this. Check out the link below to see how you can do it. The great thing about this is that you can run your PHP application on your own server, with a hoster, or within Windows Azure. In any one of those deployment choices, you can access SQL Azure database without any changes.

How to: Connect to SQL Azure Using PHP.

 


<Return to section navigation list> 

MarketPlace DataMarket and OData

Turker Keskinpala announced the availability of an OData Service Validation Tool in a 4/13/2011 post to the Open Data Protocol (OData) Wiki:

OData is released under the Microsoft Open Specification Promise. This allows anyone to create OData services that implement the specification and to interoperate freely with OData implementations.

Currently there are several OData service producers and server libraries, including the .NET Framework, Java and Rails, and several client libraries across a range of platforms such as Objective-C, JavaScript, PHP, and Java. The fact that an OData service can be consumed by such a wide range of applications and libraries makes interoperability a key requirement.

Today we are announcing the OData Service Validation Tool to address that requirement. The goal of this tool is to enable OData service authors to ensure that their services interoperate well with any OData client. Consumers of the OData protocol can also benefit by running the tool against the OData service implementations they are building an experience for, to pinpoint potential issues.

The tool can currently validate the following types of OData endpoints (example URIs follow the list):

  • Service document
  • Metadata document
  • A feed/collection
  • An Entry
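
For illustration, the four endpoint types map to URIs like these. The examples below use the well-known read-only Northwind sample service; substitute your own service root:

  http://services.odata.org/Northwind/Northwind.svc/               (service document)
  http://services.odata.org/Northwind/Northwind.svc/$metadata      (metadata document)
  http://services.odata.org/Northwind/Northwind.svc/Products       (a feed/collection)
  http://services.odata.org/Northwind/Northwind.svc/Products(1)    (an entry)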

The following URI constructs are currently not supported:

  • Select, Expand and Format system query options in the endpoint URIs
  • Endpoint URIs pointing to payloads larger than 1 MB
  • Authenticated feeds

The screenshot below shows validation of a Metadata document. Seven rules related to the Metadata document were executed, and the document passed all of these checks.

[Screenshot: Metadata document validation results]

Did you notice that the payload that was validated is displayed in the Source section? This is especially useful when validation results in errors or warnings, since the offending line in the payload is highlighted in those cases. The screenshots below demonstrate this using a service whose metadata document has errors:

[Screenshots: validation errors, with the offending lines highlighted in the Source section]

The current (alpha) release of this tool runs a small representative set of rules derived from the specification. You can see a current list of rules here.

We are actively working on adding more rules to increase the coverage across the specification so that compliance can be determined with a much more exhaustive set of rules. The goal is to add new rules to the service on a regular cadence.

Please go ahead and try the tool either with a service of your own or using one of many public OData services.

Let us know if you have any questions or comments about this service, or if there is a feature you would like to see added. This tool is in the very early stages of development, and you can help shape its evolution by providing feedback. Please join the OData mailing list to provide feedback or report issues.


Guru Prasad explained how to Retrieve using ODATA and JSON in CRM 2011 in a 4/13/2011 post:

In this example I am trying to retrieve Account entity information in a form load script using OData and JSON in CRM 2011.

To use the OData service you need two resource files: the JSON and jQuery libraries. You can download these popular libraries from the web.

Now write the following JavaScript in the new web resource file:

function init()
{
    // Write the required OData query; the GUID below is an example Account id.
    var odataSelect = "http://server/orgname/XRMServices/2011/OrganizationData.svc/AccountSet(guid'6C2EFF37-BD39-E011-91D1-000C2971AF13')";

    $.ajax({
        type: "GET",
        contentType: "application/json; charset=utf-8",
        dataType: "json",
        url: odataSelect,
        beforeSend: function (XMLHttpRequest) { XMLHttpRequest.setRequestHeader("Accept", "application/json"); },
        success: function (data, textStatus, XmlHttpRequest)
        {
            // Use this method for a selection that returns a single entity
            RetrieveEntityData(data.d);

            // Use this method for a selection that may return multiple entities
            RetrieveMultipleEntities(data.d.results);
        },
        error: function (XmlHttpRequest, textStatus, errorThrown) { alert('OData Select Failed: ' + odataSelect); }
    });
}

function RetrieveEntityData(Entity)
{
    // Get the fields from the Entity object
    var accountNumber = Entity.AccountNumber;
    var accountName = Entity.Name;
    alert(Entity.Name);
}

function RetrieveMultipleEntities(ManyEntities)
{
    for (var i = 0; i < ManyEntities.length; i++)
    {
        // Get the fields from each Entity object
        var Entity = ManyEntities[i];
        var accountNumberAttribute = Entity.AccountId;
        var accountName = Entity.Name;
        alert(Entity.Name);
    }
}

Just call the init method from the CRM form events, but make sure to add the JSON and jQuery web resources to the event library before adding this web resource. They must load in the sequence JSON, jQuery, and then the custom web resource.

To get the OData set for the required entity, use the following URL (new_entityset is a placeholder for your entity set's name):

http://server/XRMServices/2011/OrganizationData.svc/new_entityset


<Return to section navigation list> 

Windows Azure AppFabric: Access Control, WIF and Service Bus

Vittorio Bertocci (@vibronet) reported The new ACS ships! on 4/11/2011:

Last week we shipped a new version of the Windows Azure AppFabric Access Control service, and MIX is all abuzz about it!

The new ACS includes a plethora of new features that you have been asking for with enthusiasm: single sign-on from business and web identity providers, easy integration with our development tools, support for both enterprise-grade and web-friendly protocols, out-of-the-box integration with Facebook, Windows Live ID, Google and Yahoo, and many others. If that weren't enough to get your attention: as part of a special promotion, customers and partners will be able to use ACS free of charge until at least January 2012. Pretty neat, eh?

In order to help you hit the ground running, today we are releasing a series of new assets (videos, samples and live sites) that complement the existing documentation and samples that come with the service. Below you can find a brief description of, and links for, each of the assets.

New Code Resources
Identity Training Kit


All the ACS labs in the new Identity Developer Training Kit and the Windows Azure Platform Training Kit released today have been updated to point to the ACS production service.

Furthermore, we added one new lab which demonstrates how to take advantage of ACS for handling user authentication in your WP7 applications and how to use OAuth2 for securing calls to OData services from the phone.

Both training kits are also available unbundled as MSDN courses here and here.

ACS Extensions for Umbraco


With over 85,000 websites running it, including Wired, Heinz and Vogue, Umbraco is the most widely adopted CMS on the .NET platform.

In the last few weeks we worked on some extensions which seamlessly integrate key ACS features into Umbraco's administrative UI.

Today we are releasing the ACS Extensions for Umbraco, both in source code and NuGet package forms.

Niels Hartvig, Founder of Umbraco, is quoted saying:

We're excited about the very diverse integration scenarios the ACS (Access Control service) extension for Umbraco allows. The ACS Extension for Umbraco is one great example of what is possible with Windows Azure and the Microsoft Web Platform.

Visit http://umbracoacsextensions.codeplex.com/ to download the source package and the comprehensive documentation, or install the extension directly from Visual Studio. All the main tasks are demonstrated by three screencasts on Channel9 (see below).

ACS Plugin for Wordpress


ACS is entirely based on open standards: as such, it can be used by applications running on any platform, and we have proof. Aaron Smalser from the ACS team wrote a PHP plugin for WordPress which integrates with ACS. You can get it from http://wordpress.org/extend/plugins/acs-plugin-for-wordpress/.
From the readme:

The ACS WordPress Plugin allows WordPress hosts to enable federated login for their WordPress site using Windows Azure AppFabric Access Control Service (ACS) 2.0.
WordPress administrators can use ACS to create trust relationships between their site and identity providers such as Windows Live ID, Facebook, Google, Yahoo!, and custom identity providers such as Microsoft Active Directory Federation Services 2.0. The ACS WordPress Plugin then renders a custom login page based on the ACS configuration, and enables end users to log in to the WordPress site using an identity provider of their choice.

myTODO AppFabric


Many of our samples aim to demonstrate realistic end-to-end cloud scenarios, where identity is one of many aspects developers and architects need to consider. This has always been well received; however, you asked us to highlight some key tasks in simpler examples as well. MyTodo AppFabric is one such sample: it shows how to use ACS for brokering authentication to web identity providers, how to host a custom HDR page and how to handle token-based registration in a very, very simple scenario. You can download it from here, and see it in action on a live instance running in Windows Azure (see below).

FabrikamShipping SaaS


FabrikamShipping SaaS, our most comprehensive cloud sample to date, is being updated to the ACS production environment.

New Videos


This is the news that so many have been waiting for: the new version of the Access Control Service has finally hit the RTW stage!

Stuart Kwan, Principal Group Program Manager on the Cloud Identity Platform team and recurrent guest on the IdElement, gives a four-minute introduction to the service and touches on the pricing model. For example, did you know that you can use ACS in production free of charge until at least January 2012? Jump to http://windows.azure.com and get started NOW!

http://channel9.msdn.com/Shows/Identity/The-Access-Control-Service-20-Ships


If you want to understand what the Access Control Service is really about, look no further: this is the interview you want to watch.
Justin Smith, Principal Program Manager Lead for the Windows Azure AppFabric Access Control Service, worked on ACS from its very first version. From that vantage point, Justin looks back at the roots of the problem that ACS is meant to solve, retraces the trajectory that the service has been following from its 1.0 version to the new 2.0 release, and touches on some of the most important scenarios it addresses.
http://channel9.msdn.com/Shows/Identity/Justin-Smith-on-the-Release-of-Access-Control-Service-20


Have you ever tried to handle authentication for a mobile app, regardless of the platform? Every provider has its own protocol, which forces you to write and maintain a lot of different implementations. Writing protocol code on devices might not always be easy, and the fact that web protocols are moving targets which change every few months doesn't help.

Nobody knows this better than Caleb Baker, Senior Program Manager on the ACS team. Caleb has been working on making it really easy to outsource your mobile authentication woes to ACS: his solution is the basis of the new ACS+WP7 hands-on lab in the Identity Developer Training Kit.

In this quick interview Caleb examines in detail the authentication flow of his solution, from the Silverlight control which wraps most of the ACS integration to the way in which the phone app uses OAuth2 to secure calls to an OData service.

Caleb also worked on improving the way in which errors are handled in federated scenarios, and drove interesting features in ACS which can really help with that: thanks to his explanation here, you'll be able to use those features in just minutes. Folks, don't miss this interview!
http://channel9.msdn.com/Shows/Identity/Caleb-Baker-on-Using-ACS-in-Windows-Phone-7-apps-and-ACS-Error-Management


ACS may be a PaaS service, but the programmatic route is not the only way to its heart: there are many situations in which developers, administrators and users interact directly with it.

The new release of the Access Control Service features a management portal you can use for managing your access control policies, from which identity providers you want to engage with (you have a choice of social providers, such as Windows Live ID, Facebook, Yahoo, Google and any OpenID or OAuth2 provider, and business providers, such as Active Directory Federation Services instances or any other WS-Federation/WS-Trust provider) to the transformation rules which decide what claims will be available to your application.
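
As context for how an application consumes all of this: once the namespace is configured in the portal, a WS-Federation relying party typically just points WIF at the ACS endpoint. Below is a minimal, hypothetical web.config fragment using standard WIF configuration elements; the namespace yournamespace, the realm and the paths are placeholders, not values from this post:

<microsoft.identityModel>
  <service>
    <audienceUris>
      <add value="http://localhost/myapp/" />
    </audienceUris>
    <federatedAuthentication>
      <!-- Send unauthenticated users to the ACS-hosted sign-in page -->
      <wsFederation passiveRedirectEnabled="true"
                    issuer="https://yournamespace.accesscontrol.windows.net/v2/wsfederation"
                    realm="http://localhost/myapp/" />
      <cookieHandler requireSsl="false" />
    </federatedAuthentication>
  </service>
</microsoft.identityModel>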

Furthermore, ACS now provides various features aimed at solving the home realm discovery problem (HDR): in practical terms, features which make it easy for developers and end users to always pick the right identity provider.
The man behind those features is Aaron Smalser, Program Manager on the ACS team: in this 20-minute interview Aaron discusses the user interaction aspects of the service from his unique perspective.

http://channel9.msdn.com/Shows/Identity/Aaron-Smalser-on-the-ACS-20-Portal-and-HDR-Features


The Access Control Service (ACS) Extensions for Umbraco code sample is an extension to Umbraco 4.7 which enables you to authenticate users from Facebook, Windows Live ID, Google, Yahoo, Active Directory and other identity providers. Setup, user management and handling of authorization policies are all seamlessly integrated in the Umbraco UI. Download the ACS Extensions for Umbraco here.

All the screencasts in the series:

  • 1 Setup
    This screencast shows you how to set up Umbraco 4.7 and install & configure the Access Control Service (ACS) Extensions for Umbraco.
  • 2 SignIn and Authorization for Social
    In this screencast you will learn how to use the ACS Extensions to add sign-in, sign-up and authorization features to your web site. Furthermore, you will learn how to invite users from Facebook, Windows Live ID, Google and Yahoo to your web site and manage their access level via roles.
  • 3 ADFS2 Integration
    In this screencast you will learn how to use the ACS Extensions to grant access to users coming from business identity providers (like ADFS2) to your Umbraco web site.


All work and no play makes your PaaS dull!

In this lightning-fast screencast you'll see how ACS helped the guys at www.angrytoyfactory.com handle their authentication needs without compromising the stunning visuals of their latest creation, the online strategy game at www.AtlantisOnline.com.

http://channel9.msdn.com/Shows/Identity/ACS-and-AtlantisOnline

2nd Edition of “A Guide to Claims based Identity”


This new edition of the patterns & practices "Guide to Claims-based Identity" has been extended with 5 new chapters covering the recently released Windows Azure Access Control Service, Windows Phone 7 and SharePoint. After an introduction to the core concepts and principles of claims-based identity, the book provides guidance on common scenarios such as web SSO, federation, claims-based authentication with SOAP and REST web services, and claims-enabling SharePoint. It also includes extensive downloadable code samples that demonstrate each of the described scenarios. Get it at http://claimsid.codeplex.com

New Live Web Sites
myTODO Live Instance


At http://mytodoappfabric.cloudapp.net/ you can find a live instance of the myTodo AppFabric sample described above. The application is entirely self-service: anybody can sign up and create new lists. That makes it a very good asset for quickly experiencing basic concepts in action, such as federated sign-in, custom HDR pages, and sign-up and registration flows.

ACS Extensions Live Instance


At https://umbracosample.cloudapp.net/ you can find an instance of Umbraco with the ACS Extensions enabled. Social members need to be registered with the web site, hence you can't use web providers there; however, you can use it to experience ADFS integration with the AdventureWorks SelfSTS utility found in the FabrikamShipping SaaS companion (available for download at www.fabrikamshipping.com).

FabrikamShipping SaaS


The live instance of FabrikamShipping SaaS at www.fabrikamshipping.com proved to be a valuable asset for customers, who use it to provision new tenants daily, as a booth demo at many important events (PDC, RSA, TechEd) and even as a keynote demo (TechEd Europe, TechEd China). The ACS labs environment is still up, hence the transition to ACS production should be smooth.

AtlantisOnline


www.AtlantisOnline.com will start accepting beta participants this spring: its ACS integration is a real beauty, and makes for a gorgeous demo without requiring you to install anything on your machine.

If you have feedback on those samples, please do not hesitate to drop me a line. We are committed to empowering you to use our services in the most effective way.

Please join me in congratulating the ACS team for their incredible work!


<Return to section navigation list> 

Windows Azure VM Role, Virtual Network, Connect, RDP and CDN

No significant articles today.


<Return to section navigation list> 

Live Windows Azure Apps, APIs, Tools and Test Harnesses

Pedro Ardila posted MVC + Code-First + Azure Part 2: [Cloud] Deployment to the ADO.NET Team Blog on 4/14/2011:

In the previous article, you learned how to create a basic MVC application using Code First. In the second and final section you will learn how to prepare and deploy your application to Windows Azure using Visual Studio.

Deploying to Windows Azure
Preparing Application

Having tested our app locally, it is time to set up Azure to deploy our app. Although this process may seem long, all the steps except for deployment only have to be done once per application.

The first thing we must do before deploying is ensure our project has access to all of the assemblies we are referencing. Windows Azure machines run .NET 4.0, so we have to copy locally any newer assemblies we may be using. We do this by going to the Solution Explorer in Visual Studio and expanding the References folder. Now we right-click on EntityFramework and select Properties from the context menu. Next, in the Properties window, we set Copy Local to 'True' (a sketch of the resulting project-file entry follows the list below).

Repeat this process for:

  • Microsoft.WindowsAzure.Diagnostics
  • Microsoft.WindowsAzure.StorageClient
  • System.ComponentModel.DataAnnotations
  • System.Web.Mvc
  • System.Web.Routing
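
For what it's worth, Copy Local is simply the <Private> flag on the assembly reference in the project file. A minimal, hypothetical .csproj fragment (your version and hint path will differ):

<Reference Include="EntityFramework">
  <!-- Copy Local = True: the assembly is copied into bin\ and therefore into the Azure package -->
  <Private>True</Private>
</Reference>
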
New Storage Account

We need to create a storage account through the Management Portal. Go to the ‘Hosted Services, Storage Accounts, & CDN’ section, and click on Storage Accounts. Select New Storage Account on the Ribbon, and on the dialog box enter a URL name. Make this name unique for your app. In this example we will use codefirstazure.

Now create an affinity group. We do this so that both the storage account and the hosted service reside within the same data center, which is good for application performance. Name the affinity group WorkoutsAffinityGroup, and select a location near you.

Note the properties pane contains some information about the storage account. We will use some of this information later to deploy our application.

New Hosted Service

Creating Hosted Service
On the Azure portal, click on 'Hosted Services, Storage Accounts & CDN' and then click on New Hosted Service. Enter a name for your service in the new window. We will call ours WorkoutsAzureApp. Enter workoutsazureapp in the URL Prefix textbox. Now click on the radio button to 'Create or choose an affinity group' and select the affinity group we created in the previous step. Under Development options, select 'Do not deploy', then hit OK.

Publishing Application
In Visual Studio, right-click WorkoutsAzureApp and click Publish… Under Credentials, hit <Add…>.

Now, on the new dialog, create a new certificate for authentication by clicking on the dropdown and selecting <Create…>. Enter a friendly name for the certificate. We will call ours WorkoutsCert. Click on ‘Copy the full path’ on the authentication window, then go back to the Management Portal and upload the certificate. To do this, click on Hosted Services, Storage Accounts & CDN. Then click on Management Certificates on the left hand menu, and click on Add a Certificate on the Ribbon.

On the new dialog window, click browse, then paste the path currently on your clipboard onto the ‘File name’ textbox. Now click Open, then click Done.

Now, on the Management Portal, copy the subscription ID that shows up on the properties pane on the left. Then go back to Visual Studio and paste it into the textbox asking for the subscription ID in the Windows Azure Project Management Authentication dialog. Name the credentials 'WindowsAzureAccount', then click OK.

At this stage, the Deploy Windows Azure project should be all filled in. Click OK to begin deployment.

The app will begin deployment. You can see the progress on the Windows Azure Activity Log.

When deployment completes, go to the Windows Azure Portal and click on the Deployment node. Next, click on the link under DNS name. This will bring up our application:

Maintaining the App

Development won't stop after you deploy the first time. If you would like to make more changes to the application, the fastest way to do so is by upgrading the deployment, rather than redeploying from scratch. To do so, right-click on WorkoutsAzureApp in Visual Studio and select Publish…. Now select 'Create Service Package Only' and click OK. A Windows Explorer window will open, showing you a file named 'WorkoutsAzureApp.cspkg' and another named 'ServiceConfiguration.cscfg'. Now go back to the Management Portal, go to the 'Hosted Services, Storage Accounts & CDN' section, and then click on the Hosted Services folder. Now select your deployment, and click Upgrade on the Ribbon. On the new pop-up, click Browse Locally… next to the Package location textbox, and browse to the location of the two files created by publishing. Select the WorkoutsAzureApp file, and then repeat this task for the Configuration file. With the two files selected, click OK and the deployment will begin. When it finishes, your application should be ready to be used again.

Conclusion

In this exercise, you have learned how to build an MVC web app that is hosted in Windows Azure and SQL Azure, using Code First. We covered the basics of Code First, and showed you the simplest way to deploy your applications to the cloud. At this point, you should be able to write and deploy your own apps to Azure from Visual Studio. We hope that you find this article helpful. Please be sure to leave your feedback in the comments section.

At MIX 11, the ASP.NET MVC team mentioned that it hopes to automate this process with a Visual Studio 2011 template in the near future.


Pedro Ardila posted MVC + Code-First + Azure Part 1: Local Development to the ADO.NET Team Blog on 4/12/2011:

This entry will guide you through the process of creating a simple workout tracker application using ASP.NET MVC and the Code First development approach. MVC stands for Model View Controller; you can find more information on MVC here. The application will reside in Windows Azure, and it will be backed by a SQL Azure database. By the end of the article, we hope you understand how to use Code First with MVC, and how to test and deploy an application using MVC, Code First, SQL Azure, and Windows Azure.

Here is an overview of what we will cover:

  • How to develop an MVC application using Code First
  • Seeing some of the Code First basics at work
  • How to seamlessly use a SQL Azure database
  • Testing the application locally using the Windows Azure Tools for Visual Studio
  • Deploying the app to a Windows Azure staging Deployment using Visual Studio
  • Moving the application to a Production Deployment through the Azure Portal

Each topic will not be covered in depth in this article. However, you will find links wherever further explanation is required. Also, be aware that the MVC application we are building is for demonstration purposes only, so we have taken the liberty of breaking conventions for the sake of brevity. We hope you find this post useful, and encourage you to leave any questions or comments at the bottom of the post!

Entity Framework and Azure

Entity Framework 4.1 and Azure work quite nicely together, as long as the configuration settings are correct. Here are some of the key things to keep in mind while deploying a Code First app to Azure. If any of this is not clear to you, please continue reading, as each bullet is explained in the appropriate context below:

  • Add PersistSecurityInfo=True to the connection string to allow Code First to create the database in SQL Azure. Make sure to remove PersistSecurityInfo from the connection string after the database is created.
  • Ensure that any assembly referenced in your project that is newer than .NET 4.0 has Copy Local = true
  • Make sure all third-party assemblies are signed
  • Uploading the security certificate is key for Visual Studio to communicate with Windows Azure
  • Make sure to set customErrors in Web.config to RemoteOnly so that detailed errors are not shown to remote clients.
  • Use the 'Upgrade' option on the Windows Azure Management Portal whenever you make changes to the application. It is faster than re-deploying the app from Visual Studio.
  • System.Transactions transactions cannot be used with SQL Azure. Please see this article for general SQL Azure guidelines and limitations.
  • You may run into connection retry issues. Check out this blog post for troubleshooting options.
Pre-Requirements

To complete this exercise, you must:

  1. Download and Install the Windows Azure SDK and Windows Azure Tools for Microsoft Visual Studio.
  2. Download and install the latest version of the NuGet Package Manager.
  3. Steps 1 and 2 will allow you to create and deploy a Windows Azure app locally. To deploy the application to the cloud, you must obtain a Windows Azure account. Click here to sign up for Windows Azure or here to obtain a free trial (free trial is only available through June 30th, 2011).
  4. Create a SQL Azure account. For that, you can follow the steps here.
Getting Started: Creating a Windows Azure Project

After fulfilling the list above, we can get started with development. The first step is to create a new Windows Azure Project in Visual Studio. To do so, press Ctrl + Shift + N and on the New Project window, click on Cloud under the Visual C# node, then select Windows Azure Project. Name your project WorkoutsAzureApp, and click OK.

A window called New Windows Azure Project will come up. We will use the window to add the MVC Role to our application. Double-click on ASP.NET MVC 2 Web Role, then click OK. Note that we can also use MVC3, however, we are using MVC2 since it is the latest version offered through the ‘New Windows Azure Project’ dialog at the time of writing.

A new window will come up asking you if you would like to create unit tests. Select No to skip the creation of unit tests, and then click OK. We skip creation of unit tests for the sake of simplicity, however, you should strongly consider using unit tests in your application. After these steps, we will get a project that we can immediately begin working with.

Before we can use code first, we must bring in the Entity Framework assembly. We can get it from NuGet by right clicking the references node in the Solution Explorer, and selecting Add Library Package Reference… When the window opens, select Online from the menu on the left, then select EFCodeFirst from the list, and click Install. A dialog will show up asking to accept the License terms. Click ‘I Accept’ and then close the Add Library Package Reference window.

Creating the Model

We will begin by creating the model, which is where we will have most of our interaction with code first. We will create a new model by right-clicking on the Models folder on the Solution Explorer, then going to Add, and selecting New Item… On the new screen, create a new class called WorkoutsModel.cs

We can start creating the model for our workouts. Within the namespace, we will have two classes that represent our entities, named Workout and Gear. Feel free to delete the WorkoutsModel class as we will not use it. We will also have a class named WorkoutsContext which inherits from DbContext and helps us keep track of our entities. Note that it would be best practice to have your POCO classes and your context in different files; we are keeping them in the same file to keep our walkthrough short.

Here is what our classes will look like:

using System;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
using System.Data.Entity;
using System.Data.Entity.Database;

namespace MvcWebRole1.Models
{
    public class Workout
    {
        public Workout()
        {
            Gear = new List<Gear>();
        }

        public int Id { get; set; }

        [StringLength(50, MinimumLength = 3)]
        public string Name { get; set; }

        public TimeSpan? Duration { get; set; }
        public decimal? Distance { get; set; }

        public virtual ICollection<Gear> Gear { get; set; }
    }

    public class Gear
    {
        public Gear()
        {
            Workouts = new List<Workout>();
        }

        public int Id { get; set; }
        public string Brand { get; set; }
        public string Name { get; set; }

        public virtual ICollection<Workout> Workouts { get; set; }
    }

    public class WorkoutsContext : DbContext
    {
        public DbSet<Workout> Workouts { get; set; }
        public DbSet<Gear> Gear { get; set; }

        protected override void OnModelCreating(
            System.Data.Entity.ModelConfiguration.ModelBuilder modelBuilder)
        {
            modelBuilder.Entity<Workout>().ToTable("Workouts");
            modelBuilder.Entity<Gear>().ToTable("Gear");
            modelBuilder.Entity<Workout>().Property(c => c.Name).IsRequired();
        }
    }
}

The first class will be for workouts. Each workout will have an ID, name, duration, distance, and a collection of gear associated with the workout. The second class is called Gear and it contains an Id and two strings for the brand and name respectively. Observe some of the conventions at work. For instance, Code First will use the Id property in Workout as the primary key, based on convention. Additionally, data annotations are used to shape the data; the StringLength annotation on the Workout's Name property is an example of this.

Creating the Controller

Create a controller by right-clicking on the Controllers node in the Solution Explorer, selecting Add…, and clicking on Controller…. Enter WorkoutsController in the textbox, and select the checkbox to add action methods for all CRUD scenarios. Now, press Add.

By selecting the checkbox above, MVC scaffolding provides us with a set of methods we can fill in to complete our controller. Some of the noteworthy methods will look like the following:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.Mvc;
using MvcWebRole1.Models;

namespace MvcWebRole1.Controllers
{
    public class WorkoutsController : Controller
    {
        WorkoutsContext db = new WorkoutsContext();

        //
        // GET: /Workouts/
        public ActionResult Index()
        {
            return View(db.Workouts.ToList());
        }

        //
        // GET: /Workouts/Details/5
        public ActionResult Details(int id)
        {
            Workout w = db.Workouts.Find(id);
            return View(w);
        }

        //
        // POST: /Workouts/Create
        [HttpPost]
        public ActionResult Create(Workout w)
        {
            try
            {
                if (ModelState.IsValid)
                {
                    db.Workouts.Add(w);
                    db.SaveChanges();
                    return RedirectToAction("Index");
                }
                return View(w);
            }
            catch
            {
                return View();
            }
        }

        //
        // GET: /Workouts/Edit/5
        public ActionResult Edit(int id)
        {
            Workout w = db.Workouts.Find(id);
            return View(w);
        }

        //
        // POST: /Workouts/Edit/5
        [HttpPost]
        public ActionResult Edit(Workout w)
        {
            try
            {
                if (ModelState.IsValid)
                {
                    var workout = db.Workouts.Find(w.Id);
                    UpdateModel(workout);
                    db.SaveChanges();
                    return RedirectToAction("Index");
                }
                return View(w);
            }
            catch
            {
                return View();
            }
        }
    }
}

Note that at the top we created an instance of WorkoutsContext named db. We read from it when we query for workouts on the Index and Details methods, and we change it when we create or edit our workouts.

Creating Views

We need to build the project to be able to create strongly typed views from our controller methods. You can build the project by pressing Ctrl+Shift+B. We create views for each method by right-clicking on the method name in the code itself and selecting Add View. In the new dialog, select 'Create a strongly-typed view' and pick MvcWebRole1.Models.Workout from the View Data Class dropdown. We will use the same class for all of our views, as they will all be for workouts. The view content will be unique for each view: for the Index view, select List from the View Content dropdown. For the Details method, select Details. Follow this pattern for the Create and Edit methods.

We will only need to slightly alter the views. We will make sure to pass in the item keys to the ActionLink methods. Here is what they will look like:

<td>
<%: Html.ActionLink("Edit", "Edit", new { id=item.Id }) %> |
<%: Html.ActionLink("Details", "Details", new { id = item.Id })%>
</td>

In the above example, we have removed the 'Delete' action as we won't be implementing it in this walkthrough.

Database Initializer

Code First allows us to plant seed data into our database by using a database initializer. We will add a new class to our Workouts Model for the sake of simplicity. Alternatively you could create this in a separate file under the Models folder. Here is what the initializer looks like:

public class WorkoutsInitializer : DropCreateDatabaseAlways<WorkoutsContext>
{
    protected override void Seed(WorkoutsContext context)
    {
        Gear Bicycle = new Gear { Name = "P2", Brand = "Cervelo" };
        Gear Wetsuit = new Gear { Name = "Torpedo", Brand = "TYR" };
        Gear Watch = new Gear { Name = "310xt", Brand = "Garmin" };

        Workout swim = new Workout { Name = "Swim", Distance = 3800, Duration = new TimeSpan(1, 10, 0), Gear = new List<Gear> { Wetsuit } };
        Workout bike = new Workout { Name = "Bike Ride", Distance = 112, Duration = new TimeSpan(6, 15, 0) };
        Workout run = new Workout { Name = "Run", Distance = 26, Duration = new TimeSpan(4, 20, 0) };

        bike.Gear = new List<Gear> { Bicycle, Watch };
        run.Gear = new List<Gear> { Watch };

        context.Gear.Add(Bicycle);
        context.Gear.Add(Wetsuit);
        context.Workouts.Add(swim);
        context.Workouts.Add(bike);
        context.Workouts.Add(run);
        context.SaveChanges();
    }
}

We can ensure the initializer gets called by adding the following setting to Web.config, within the <configuration> node:

<appSettings>
  <add key="DatabaseInitializerForType MvcWebRole1.Models.WorkoutsContext, MvcWebRole1" value="MvcWebRole1.Models.WorkoutsInitializer, MvcWebRole1" />
</appSettings>
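
As an aside, if you prefer code to configuration, the initializer can also be registered at application startup. A sketch assuming EF 4.1 RTW, where the call lives on the static Database class in System.Data.Entity:

// In Global.asax.cs
protected void Application_Start()
{
    // Equivalent to the appSettings entry above
    System.Data.Entity.Database.SetInitializer(new MvcWebRole1.Models.WorkoutsInitializer());

    AreaRegistration.RegisterAllAreas();
    RegisterRoutes(RouteTable.Routes);
}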

Before we move to local deployment, we must edit the Global.asax to ensure our application immediately reroutes to our Workouts page. This is done by editing routes.MapRoute() so that it uses the Workouts controller as opposed to the Home controller. The final result is:

routes.MapRoute(
    "Default",                                                                    // Route name
    "{controller}/{action}/{id}",                                                 // URL with parameters
    new { controller = "Workouts", action = "Index", id = UrlParameter.Optional } // Parameter defaults
);

Deploying Locally Using SQL Azure

Creating a Valid Connection String

It will only take a few more steps to deploy our workouts application locally using SQL Azure as the backend. To leverage SQL Azure, all we have to do is change the database connection string to target our SQL Azure database. A valid connection string can be obtained through the Windows Azure Portal, under the Database section. Once there, expand the subscriptions tree until you can see the 'master' database. Click on master in the tree on the left, and then click on View… under 'Connection Strings' in the Properties pane on the right. This will bring up a dialog with valid connection strings for ADO.NET, ODBC, and PHP. The string will look like the following:

Server=tcp:<YourServer>.database.windows.net,1433;Database=master;User ID=<YourUserID>;Password=myPassword;Trusted_Connection=False;Encrypt=True;
 

Copy this string to your clipboard and go back to Visual Studio, then open Web.config under 'MvcWebRole1'. Once there, look for the <connectionStrings> node and create a new connection string named WorkoutsContext. Set the connection string equal to the one obtained from the Windows Azure Portal, then replace the placeholders with your real username, password, and desired database name.

Lastly, in order to allow Code First to create the database on SQL Azure, we must append PersistSecurityInfo=True; to our connection string. The final connection string will be similar to the one immediately below:

<add name="WorkoutsContext" connectionString="Server=tcp:<YourServer>.database.windows.net,1433;Database=WorkoutsDB;User ID=<YourUserID>;Password=myPassword;Trusted_Connection=False;Encrypt=True;PersistSecurityInfo=True;" providerName="System.Data.SqlClient" />

Remember to remove PersistSecurityInfo=True once the database has been created to ensure your credentials are not left in memory.

Changes To Web.config
We will make some changes to our app configuration so that it runs smoothly on Windows Azure. Remove the following nodes from <system.web>:

  • <authentication>
  • <membership>
  • <profile>
  • <roleManager>

While testing locally, we will use impersonation to make sure we can create the database. This will only work if you have admin access to your computer. To use impersonation, type <identity impersonate="true" /> inside the <system.web> node.

Now, press F5 to deploy your application. Your default browser should show your application:

We can verify that Code First created the database using SQL Server Management Studio or the Windows Azure Portal. You can test the app by pressing 'Create New' and adding a new workout. Make sure to remove the identity impersonation before continuing to part 2 of the series.

Conclusion

At this point, our MVC application is running perfectly on our machine thanks to the Windows Azure SDK. In the second and final part of this series you will find out how to deploy your application to Windows Azure. Please let us know if you found this article helpful by leaving a comment below!

Pedro Ardila
Program Manager - Entity Framework


<Return to section navigation list> 

Visual Studio LightSwitch and Entity Framework 4.1

Michael Bridge posted Visual Studio LightSwitch Hosting :: Data Storage in Visual Studio LightSwitch on 4/14/2011:

With the recent release of Visual Studio LightSwitch Beta 1 to MSDN Subscribers, I have gotten a few questions. As questions come in, I’ll do my best to answer them here. One of the first questions I got was about how LightSwitch applications manage and store data source connection information – aka connection strings.

Question: In a Visual Studio LightSwitch application, where is the connection string to the database being stored?
LightSwitch applications can work with several types of data, including “local” application data, external SQL Server or SQL Azure data (or any other database supported by the .NET Framework), SharePoint data, and other external data exposed through WCF RIA Services. If you choose to get data from an external source, the Attach Data Source Wizard shows you your options.


On the other hand, the “local” application data is a reference to a SQL Server Express database that is created if you choose the “Create new table” option as shown here.


This application data is defined and stored in a local SQL Server Express database. For example, in the Vision Clinic demo we have used, the first step was to create a new table to store Patient data. In fact, in the demo we are defining an entity model, and creating a Patients (plural) table in the local application database using SQL Server Express. Notice in the following image that in the Solution Explorer there is a Data Sources node containing an ApplicationData node containing a Patients node. Patients is the table that was created in the ApplicationData database.


The ApplicationData.mdf file is a SQL Server Express file that is created and stored in the $ApplicationRoot\bin\Data directory. When you connect to it with Server Explorer, you can see that the Patients table is there, along with the ASP.NET membership tables.


So where is the connection string to this database stored? In a Web.config file that is created in the $ApplicationRoot\bin\debug directory.

<connectionStrings>
  <add name="_IntrinsicData" connectionString="Data Source=.\SQLEXPRESS;AttachDbFilename='e:\my documents\visual studio 2010\Projects\Application2\Application2\Bin\Data\ApplicationDatabase.mdf';Integrated Security=True;Connect Timeout=30;User Instance=True;MultipleActiveResultSets=True" />
</connectionStrings>
When the application is published (Build | Publish menu), the Publish Application Wizard asks if you want to publish the database directly to an existing database, or if you want to generate a script file to install and configure the database.

If you choose to publish to a database, you are then prompted to provide the database information.

If you choose to create an install script file, you are prompted to describe the database that will be scripted.

All in all, this is the standard approach to managing database connections in a multi-tiered application. The Web.config in the application tier contains the database connection information. As with all Web.config files, you may encrypt the connection information.
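
For example, the connectionStrings section can be encrypted with the aspnet_regiis tool that ships with the .NET Framework; a sketch, assuming the application is deployed at the hypothetical virtual path /MyLightSwitchApp:

REM aspnet_regiis.exe lives under %windir%\Microsoft.NET\Framework\v4.0.30319\
aspnet_regiis.exe -pe "connectionStrings" -app "/MyLightSwitchApp"

REM To decrypt for editing:
aspnet_regiis.exe -pd "connectionStrings" -app "/MyLightSwitchApp"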

But what about the connections to other data sources, like SQL Azure or SharePoint?
A connection string is a connection string is a connection string. They all get stored in the Web.config file.

<connectionStrings>
  <add name="_IntrinsicData" connectionString="Data Source=.\SQLEXPRESS;AttachDbFilename='e:\my documents\visual studio 2010\Projects\Application2\Application2\Bin\Data\ApplicationDatabase.mdf';Integrated Security=True;Connect Timeout=30;User Instance=True;MultipleActiveResultSets=True" />
  <add name="PrescriptionContosoData" connectionString="Data Source=server01.data.int.mscds.com;Initial Catalog=PrescriptionContoso;User ID=admin01@server01;Password=[removed]" />
</connectionStrings>

Of course, all of the data in a LightSwitch application is represented by entities. The entities are defined in the ApplicationDefinition.lsml file (stored in $ApplicationRoot\Data). This is simply a reference to the data source(s) represented by the entities in the application. Each data source is described in this file, and the entities representing the data objects are described here as well.

<EntityContainerGroupProperty EntityContainer="Application2:ApplicationData"
    Name="ApplicationData" />
<EntityContainerGroupProperty EntityContainer="Application2:PrescriptionContosoData"
    Name="PrescriptionContosoData" />
</EntityContainerGroup>

...

<DataService DataProvider="EntityFrameworkDataProvider"
    EntityContainer=":PrescriptionContosoData" Name="PrescriptionContosoDataDataService">
    <DataService.ConnectionProperties>
        <ConnectionProperty Name="DataProviderName" Value="91510608-8809-4020-8897-fba057e22d54" />
        <ConnectionProperty Name="DataSourceName" Value="067ea0d9-ba62-43f7-9106-34930c60c528" />
            <ConnectionProperty Name="ProviderInvariantName" Value="System.Data.SqlClient" />
            <ConnectionProperty Name="SafeConnectionString" Value="Data Source=server01.data.int.mscds.com;Initial Catalog=PrescriptionContoso;User ID=admin01@server01" />
            <ConnectionProperty Name="ConnectionStringGuid" Value="1e9905dc-b519-4003-9387-1272a768b256" />
            <ConnectionProperty Name="ProviderManifestToken" Value="2008" />
        </DataService.ConnectionProperties>
    ...
</DataService>

Summary

All in all, LightSwitch applications are built using standard best practices. In the case of connection strings, they are stored in Web.config files in the application tier, and have all the support of ASP.NET configuration files, including encryption.

It’s my understanding that the best practice for a Web role connection string is to place it in the ServiceConfiguration.cscfg file because you can edit the connection string in this file in the Windows Azure Portal without redeploying the project.
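
For illustration, that pattern looks roughly like the sketch below; the setting and role names are hypothetical, and each setting must also be declared in ServiceDefinition.csdef:

<!-- ServiceConfiguration.cscfg -->
<ServiceConfiguration serviceName="MyService" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="MyWebRole">
    <Instances count="1" />
    <ConfigurationSettings>
      <Setting name="WorkoutsConnectionString" value="Server=tcp:myserver.database.windows.net,1433;Database=WorkoutsDB;..." />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>

At runtime the role code can read the setting with RoleEnvironment.GetConfigurationSettingValue("WorkoutsConnectionString") from the Microsoft.WindowsAzure.ServiceRuntime assembly.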


Don Kieley asserted “What Visual Studio LightSwitch is good for: quickly building data-centric apps” as a deck for his Visual Studio LightSwitch: A Useful Rapid Development Tool for Building Data Applications article of 4/5/2011 for DevProConnections (missed when posted):

Microsoft recently released Beta 2 of its upcoming Visual Studio LightSwitch development tool, and I've been working with it a lot for an upcoming project. I wasn't sure that the world needed another Microsoft development environment, but the more I've worked with LightSwitch the more I've grown to appreciate its value for a certain class of applications.

LightSwitch is a Visual Studio-based development environment and application framework for quickly building data-centric applications. It provides a rich three-tier application infrastructure that lets the developer focus on the custom business logic and data design, minimizing the amount of code required. The idea is that you can quickly design, build, test, and deploy applications for small business or departmental units, getting them into the hands of users that need them today. The developer is freed from designing an application architecture and selecting the right technologies, because for the most part LightSwitch has made all those decisions for you. If you can live with the choices and need the application done fast, LightSwitch can be a surprisingly viable option.

A LightSwitch application is built using Silverlight 4.0, Windows Communication Foundation (WCF) RIA Services, Entity Framework, and other proven .NET-based technologies, accessing data stored in SQL Server, SQL Server Express, Azure, or SharePoint. It uses well-known and widely used patterns and best practices like a three-tier application architecture and MVVM (Model View ViewModel). Microsoft seems to have been careful not to invent any new development technologies for LightSwitch, instead preferring to go with the tried and the true. For the most part, LightSwitch restricts your direct access to its underlying technologies, so you can't muck around with them even if you want to. But it does provide some APIs to the underlying data objects and has extensive support for extensibility at almost every level when you want to go beyond LightSwitch's out-of-the-box capabilities.

Because LightSwitch is built on Silverlight, you can easily deploy the final application either to the desktop or to a web server to run in a browser. The application loses a bit of functionality when run in the browser, since it runs in the Silverlight sandbox and can't access things like COM automation or get unlimited access to the local file system. But if you can live with those limitations, there is currently no easier or quicker way to quickly build a browser-based Silverlight application.

Using LightSwitch
You can install LightSwitch either as a standalone dedicated Visual Studio instance or, if you install it on top of an existing Visual Studio installation, as an additional set of Visual Studio project templates. When you create a new LightSwitch project, you start by selecting whether you want to use Visual Basic or C#, then specify whether you want to access an existing data store or create a new one. If you use an existing SQL Server database, for example, LightSwitch will import the schema for the tables and views you select and create entity objects you can access in the application to handle all data access. If you select to create a new data store, it will create a design-time SQL Server Express database of your design, along with data entities over it, and deploy the resulting schema with the application. Then you can begin creating screens using various templates that let the user interact with the data. You don't even have to customize the screens from the defaults, and the result will be a serviceable if not entirely satisfying user experience.

From there, you have extensive options to customize the application. You'll almost certainly want to customize the screens, add validation at the entity or screen level, and add the touches that make for an easy-to-use, robust, data-centric application. The important point is that the basic application is all there and running, and all you have to do is the customizations necessary to implement logic and features most important to your users.



The ADO.NET Team Blog announced EF 4.1 Released on 4/11/2011 (missed when posted):

We are excited to announce the final Release to Web (RTW) of Microsoft ADO.NET Entity Framework 4.1 (EF 4.1). This is a fully supported, go-live release.

What’s in EF 4.1?

ADO.NET Entity Framework 4.1 introduces two new features:

  • The DbContext API is a simplified abstraction over ObjectContext and a number of other types that were included in previous releases of the ADO.NET Entity Framework. The DbContext API surface is optimized for common tasks and coding patterns. DbContext can be used with Database First, Model First and Code First development.
  • Code First is a new development pattern for the ADO.NET Entity Framework and provides an alternative to the existing Database First and Model First patterns. Code First is focused around defining your model using C#/VB.NET classes; these classes can then be mapped to an existing database or be used to generate a database schema. Additional configuration can be supplied using Data Annotations or via a fluent API. (A minimal sketch follows this list.)
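
To make those two bullets concrete, here is a minimal, hypothetical Code First model using the DbContext API; the Blog class and its properties are invented for illustration:

using System.Data.Entity;

public class Blog
{
    public int Id { get; set; }             // becomes the primary key by convention
    public string Title { get; set; }
}

public class BlogContext : DbContext
{
    public DbSet<Blog> Blogs { get; set; }  // maps to a Blogs table by convention
}

Instantiating BlogContext, adding a Blog and calling SaveChanges is enough for Code First to create the database on first use.
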
Getting EF 4.1

ADO.NET Entity Framework 4.1 is available in a couple of places: as a standalone installer from the Microsoft Download Center, and as the 'EntityFramework' NuGet package.

Getting Started

There are a number of resources to help you get started with EF 4.1:

Non-English Releases?

This initial release only includes US English IntelliSense, exception messages and Visual Studio item templates. In approximately a month we will also be releasing a series of ‘Language Packs’ that will add localized versions of these resources to an existing EF 4.1 install. These language packs will be available for the same language set as Visual Studio 2010.

Support

This release can be used in a live operating environment subject to the terms in the License Terms. The ADO.NET Entity Framework Forum can be used for questions relating to this release.

What Changed Since EF 4.1 Release Candidate?

The new features in ADO.NET Entity Framework 4.1 were previously available in a Release Candidate. The changes between the RC and RTW are mostly bug fixes, with one exception:

  • Change of the default length for non-key string and binary columns from '128' to 'Max'. SQL Compact does not support 'Max' columns, so when running against SQL Compact an additional Code First convention sets a default length of 4000. There are more details about the change in a recent blog post. (A sketch of opting back into an explicit length follows.)
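
If your code relied on the old 128-character default, you can opt back in explicitly; a minimal sketch using the EF 4.1 fluent API (the entity and property names are placeholders):

protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    // Restore an explicit maximum length instead of relying on the new 'Max' default
    modelBuilder.Entity<Post>().Property(p => p.Title).HasMaxLength(128);
}
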
What’s Not in EF 4.1?

There are a number of commonly requested features that did not make it into EF 4.1. We appreciate that these are really important to you, and our team has started work on a number of them already; we will be reaching out for your feedback on these features soon:

  • Enum support
  • Spatial data type support
  • Stored Procedure support in Code First
  • Migration support in Code First
  • Customizable conventions in Code First
Thank You

It has been great to have so much community involvement helping us drive the new features in EF 4.1. We thank you for giving us your valuable input and look forward to working together on the next release.


<Return to section navigation list> 

Windows Azure Infrastructure and DevOps

The Windows Azure Team sent the following “Changes to the Windows Azure platform Offer for Visual Studio with MSDN Subscribers” message to MSDN subscribers on 4/14/2011:

We are pleased to share with you the new Visual Studio with MSDN Professional, Premium and Ultimate offers, as well as changes to your current MSDN offer.  Subscribers with an active Visual Studio with MSDN subscription are now entitled to the monthly benefits listed below, based on their subscription level:

[Table: monthly Windows Azure platform benefits by Visual Studio with MSDN subscription level]

For customers, like you, with a Windows Azure Platform MSDN subscription, we have additional good news: as of April 12, 2011, your monthly benefits will be automatically upgraded to the following levels:

[Table: upgraded monthly benefit levels for Windows Azure Platform MSDN subscribers]

In exchange for these increased benefit levels, we are eliminating the 5% discount on compute, SQL Azure database, Access Control transactions and Service Bus connections for billing periods beginning on or after June 1, 2011. 

If you’re looking for help getting started with the Windows Azure platform, click here to find helpful resources, such as SDKs, training kits, management tools, and technical documentation. To help spark your creativity, also be sure to check out what other developers have built using the Windows Azure platform.  Thanks again for your interest in the Windows Azure platform and happy developing!

1,500 hours of a small compute instance lets you run two roles instead of the previous one. My present MSDN subscription provides three 1-GB Web databases. I (and probably most other folks) would much rather have five 1-GB databases, which cost the same.


David Linthicum asserted “Microsoft's silly 'we'll save you 80 percent' claims show how too many businesses focus on the wrong cloud benefit” as a deck for his What you're missing in your cloud ROI calculations article of 4/13/2011 for InfoWorld’s Cloud Computing blog:

A recent Microsoft "study" claims an 80 percent savings by using the cloud. This news hit the blogosphere with a big splash -- it's another positive outlook for cloud computing, and it provides some pretty compelling reasons to move to the cloud.

Not.

First of all, if I went into someone's office and claimed that I could save 80 percent on IT costs just by using the cloud, I'd get laughed out of the building. When you create studies like this, you tend to look at technology shifts in perfect worlds, and the perfect world does not exist.

However, this study misses the core reason to use the cloud. It's not the ability to share infrastructure in a multitenant environment, thus reducing infrastructure costs significantly -- we all get that. It's the ability to create an IT infrastructure and sets of systems that have the ability to change. The ability for cloud computing to bring agility to enterprises provides far more value than the mere sharing of hardware and software in some public cloud.

Most businesses ignore the value of agility when they do business cases for cloud computing because it's much more difficult to calculate the number of moving parts, and because "agility" is often associated with SOA (service-oriented architecture). Many who advocate cloud computing have no clue how SOA fits into the mix, so they miss a huge part of the puzzle. But if you want the complete picture to emerge from the puzzle pieces, you can take the business case from two angles.

First, there is the angle that using 10,000 servers shared with 500 companies is far cheaper than purchasing your own hardware and software and placing it in a rented or purchased data center. We've all heard the condescending power plant analogy: that it's better to pay an electric company than to run your own generator. This argument is beginning to seem both silly and redundant -- we get it, so move on.

Second, there is the angle that the true value of all this technology is the ability to quickly adjust your IT assets to take advantage of new business opportunities or to make adjustments to capture new markets. This means leveraging the cloud to allocate the resources you need when you need them, scaling up and down as the business requirements change. The value of doing this just one time typically outweighs any value around sharing resources.

Just think -- you can legitimately get both angles into your business case if you go cloud.

<Return to section navigation list> 


Windows Azure Platform Appliance (WAPA), App-V, Hyper-V and Private/Hybrid Clouds

No significant articles today.


<Return to section navigation list> 

Cloud Security and Governance

No significant articles today.


<Return to section navigation list> 

Cloud Computing Events

Channel9 was starting to post archived MIX 11 session content on 4/15/2011. There doesn't appear to be a filter for sessions with active archive segments, and some sessions weren't recorded.


Mary Jo Foley (@maryjofoley) reported Microsoft rolls out new Azure cloud services and features in a 4/12/2011 post to her All About Microsoft blog for ZDNet News:

image Microsoft announced several new Windows Azure features and services on April 12, the first day of its Mix ‘11 conference for developers and designers.

Among the new Azure platform services:

  • An update to the Windows Azure SDK (software development kit), due out later today, that includes a Web Deployment Tool to simplify the migration, management and deployment of IIS Web servers, Web applications and Web sites. The tool integrates with Visual Studio 2010 and the Web Platform Installer.
  • Updates to the Windows Azure AppFabric Access Control service, which provides a single-sign-on experience to Windows Azure applications by integrating with enterprise directories and Web identities from Microsoft, Facebook and Google.
  • The release of the Windows Azure AppFabric Caching service — some time in the next 30 days — which the Softies said will accelerate the performance of Windows Azure and SQL Azure applications.
  • A community technology preview (CTP) of Windows Azure Traffic Manager, a new service that allows Windows Azure customers to balance application performance across multiple geographies.
  • A preview of the Windows Azure Content Delivery Network (CDN) for Internet Information Services (IIS) Smooth Streaming capabilities, which allows developers to upload IIS Smooth Streaming-encoded video to a Windows Azure Storage account and deliver that video to Silverlight, iOS and Android Honeycomb clients.

There are also some new offers for Azure customers, explained in a new Windows Azure Team blog post.


<Return to section navigation list> 

Other Cloud Computing Platforms and Services

No significant articles today.


<Return to section navigation list> 
