Wednesday, April 20, 2011

Windows Azure and Cloud Computing Posts for 4/18/2011+

A compendium of Windows Azure, Windows Azure Platform Appliance, SQL Azure Database, AppFabric and other cloud-computing articles.


Update 4/20/2011: Updates will be less frequent this week as I prepare for my Linking Access tables to on-premise SQL Server 2008 R2 Express or SQL Azure in the cloud Webcast of 4/26/2011.

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

To use the above links, first click the post’s title to display the single article you want to navigate.


Azure Blob, Drive, Table and Queue Services

Avkash Chauhan described Uploading Crash Dump from Azure VM to Windows Azure Storage using Diagnostics Monitor Agent in a 4/18/2011 post:

Using the Windows Azure SDK 1.4, you can use the following code in a Web Role or Worker Role to enable crash dump collection and scheduled diagnostics transfers:

public override bool OnStart()
{
    // Set the maximum number of concurrent connections
    ServicePointManager.DefaultConnectionLimit = 12;

    // Get the default initial diagnostics configuration
    var config = DiagnosticMonitor.GetDefaultInitialConfiguration();

    // Windows Azure Logs. Table: WADLogsTable
    config.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
    config.Logs.ScheduledTransferPeriod = TimeSpan.FromSeconds(5);

    // Performance counters. Table: WADPerformanceCountersTable
    config.PerformanceCounters.DataSources.Add(
        new PerformanceCounterConfiguration
        {
            CounterSpecifier = @"\Processor(*)\*",
            SampleRate = TimeSpan.FromSeconds(1)
        });
    config.PerformanceCounters.ScheduledTransferPeriod = TimeSpan.FromSeconds(5);
    // You can add more performance counters using the link below:
    // http://blogs.msdn.com/b/avkashchauhan/archive/2011/04/01/list-of-performance-counters-for-windows-azure-web-roles.aspx

    // Windows Azure Diagnostic Infrastructure Logs. Table: WADDiagnosticInfrastructureLogsTable
    config.DiagnosticInfrastructureLogs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
    config.DiagnosticInfrastructureLogs.ScheduledTransferPeriod = TimeSpan.FromSeconds(5);

    // Windows Event Logs. Table: WADWindowsEventLogsTable
    config.WindowsEventLog.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
    config.WindowsEventLog.DataSources.Add("Application!*");
    config.WindowsEventLog.DataSources.Add("System!*");
    config.WindowsEventLog.ScheduledTransferPeriod = TimeSpan.FromSeconds(5);

    // Crash dumps. Blob container: wad-crash-dumps
    CrashDumps.EnableCollection(true);

    // Custom error logs.
    // "ErrorStorageFolder" is a Local Storage resource you have to add to your role.
    var localResource = RoleEnvironment.GetLocalResource("ErrorStorageFolder");
    var directoryConfiguration = new DirectoryConfiguration
    {
        Container = "wad-custom-log-container",
        DirectoryQuotaInMB = localResource.MaximumSizeInMegabytes,
        Path = localResource.RootPath
    };
    config.Directories.DataSources.Add(directoryConfiguration);
    config.Directories.BufferQuotaInMB = 1024;
    // Transfer period for crash dumps and custom error logs: 1 minute
    config.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);

    // Start the Diagnostics Monitor with the new configuration
    DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", config);

    // Give crash dumps from a previous run 10 seconds to upload.
    // Find a suitable value by testing how long your uploads take.
    Thread.Sleep(10000);

    return base.OnStart();
}

Verification on the Windows Azure VM:

You can verify that the infrastructure logs are created on the Windows Azure VM at the location below:

C:\Resources\directory\<Deployment_ID>.<Role_Name>.<StorageName>\

The crash dumps will be located in the folder below:

C:\Resources\directory\<Deployment_ID>.<Role_Name>.<StorageName>\CrashDumps

You can also look at the Monitoring Host configuration file for the role and verify the transfer time is set as below:

<Directories>
  <BufferQuotaInMB>1</BufferQuotaInMB>
  <ScheduledTransferPeriodInMinutes>1</ScheduledTransferPeriodInMinutes>
  <Subscriptions>
    <DirectoryConfiguration>
      <Path>********** Role Path ****************</Path>
      <Container>wad-crash-dumps</Container>
      <DirectoryQuotaInMB>1024</DirectoryQuotaInMB>
    </DirectoryConfiguration>
  </Subscriptions>
</Directories>
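
If you also want to confirm that the dumps actually reached blob storage rather than just the local folder, a short sketch along these lines will list the contents of the wad-crash-dumps container. It assumes the SDK 1.4 storage client (Microsoft.WindowsAzure and Microsoft.WindowsAzure.StorageClient namespaces) and reuses the diagnostics connection string name passed to DiagnosticMonitor.Start above:

// Minimal sketch: enumerate uploaded crash dumps in the wad-crash-dumps container
var account = CloudStorageAccount.Parse(
    RoleEnvironment.GetConfigurationSettingValue(
        "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"));
var container = account.CreateCloudBlobClient().GetContainerReference("wad-crash-dumps");
foreach (var item in container.ListBlobs(new BlobRequestOptions { UseFlatBlobListing = true }))
{
    System.Diagnostics.Trace.WriteLine(item.Uri); // each blob is one uploaded dump file
}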

Further Analysis:

Once a crash occurs, the role dies and the upload is scheduled for the next role startup. In other words, the crash dump is uploaded the next time your role starts, and the upload takes some time. If your role only runs for a short time after the crash, Diagnostics may not have enough time to complete the upload. That's why we added Thread.Sleep() in the code above: to give Diagnostics enough time to upload crash dumps when they are created.

Here is how it works:

  1. The role starts and then a crash happens for some reason during startup
  2. The crash dump is created immediately
  3. The crash causes the application to exit
  4. When the App Agent starts the role again, the crash dump upload begins; however, if the role runs for only a short time, the upload is interrupted and the crash dumps cannot be uploaded
  5. Adding some sleep time in the same thread solves this crash dump upload issue


<Return to section navigation list> 

SQL Azure Database and Reporting

Steve Yi posted TechNet Wiki- Connection Management in SQL Azure on 4/18/2011:

Developers are always interested in how to get the best performance out of applications that connect to SQL Azure. This TechNet wiki article provides a detailed introduction to SQL Azure and its network topology. It goes deeper by explaining some of the reasons for connection losses or throttling, and provides guidelines and coding best practices for managing the connection life cycle in SQL Azure.

Click here to view the article.

What connection scenarios have you experienced with SQL Azure? Let me know in the comments, as I'd love to discuss them.


Mark Kromer (@mssqldude) posted Microsoft Cloud BI part II on 4/17/2011:

Head over to the SQL Server Magazine BI Blog where Derek Comingore and I blog about Microsoft BI here. Today, I added part 2 of my series on creating Microsoft Cloud BI solutions using SQL Azure, PowerPivot and Reporting Services. Part 1 is also available from that blog site, where I created a SQL Azure data mart in the cloud from my on-premises SQL Server AdventureWorks database.

If you are interested in Silverlight BI for mobile or other devices, I'm going to get to that in part 4 and will use some of the techniques that you can find here on the MSSQLDUDE blog, like those that I talked about at Code Camp and SQL Saturday.

From the introduction:

At this point in the project (click here for part 1), as I perform ad-hoc self-service business analysis using cloud data in my SQL Azure database, let me just point out to you that I (IMO) see this as a hybrid approach. As it stands today, I cannot create a PowerPivot cube or an SSAS cube in the cloud. But I can use Excel 2010 with the PowerPivot add-in to run powerful analysis on that data which is in the cloud and do so without needing any local on-premises infrastructure. No data marts, data warehouse, database of any kind. I am going to use a direct connection to SQL Azure from PowerPivot for Excel 2010.


Cihan Biyikoglu explained Considerations When Building Database Schema with Federations in SQL Azure in a 4/16/2011 post:

When working with federations, database schema requires a special consideration.

With federations, you take parts of your schema and scale them out. With each federation in the root database, a subset of objects is scaled out. You may create multiple federations because distribution characteristics and scalability requirements can vary across sets of tables. For example, an ecommerce app may have both a large customer-orders set of tables and a very large product catalog with completely different distribution requirements.

All the schema artifacts in the root and in federation members are scoped to the database, meaning that objects in the root database are only visible when connected to the root, and objects in the members are only visible in the member. There is no schema enforcement across members of a federation to have exactly matching schema elements, so federation members 1 and 2 may have completely different schemas.

The root database provides the containment boundary for all information required to access the data the app cares about – the root knows about all federations, all federation members, their distribution scheme, and the current distribution layout. This is all exposed through the federation metadata: new system tables such as sys.federations and sys.federation_members give you this information in the root and in federation members. The root's containment boundary also serves as a control point for authentication and account management across all federations and their members. The following figure represents the setup described above.

To express all this, surprisingly few annotations are required in the federated schema. Objects other than tables do not require any special annotations: stored procedures, indexes, functions and triggers work within the scope of the database they are created in, regardless of where the object lives – in the root or in any federation member. The only annotations required are on the federated tables. Let's take a step back and look at the types of tables in databases with federations. Federations introduce three types of tables: federated tables, reference tables and central tables.

Federated Tables:
Refer to tables that contain data that is distributed by the federation. Federated tables are created in federation members, and the CREATE TABLE syntax contains a federation distribution key annotated with the FEDERATED ON (federation_distribution_key = column_name) clause.

CREATE TABLE t1(…) FEDERATED ON (distribution_key=column_name) 

In the figure above, federated tables are marked light blue. Federated tables contain part of the scaled-out data. There can be many federated tables in a federation member. When a SPLIT operation is issued, the federated table schema is fully copied to the destination members. However, federated table data is filtered, based on the federation key and the value of the split point, into the destination federation members.

The first thing to remember is that the column specified as the distribution key in a federated table must exactly match the data type of the federation scheme's data type definition, which is specified as part of the CREATE FEDERATION statement for the federation.

Another important aspect is that the federation key column must be present in each federated table. For some tables, this may require denormalization. For example, the order_details table may not contain customer_id, but to annotate the table as part of the classic orders & order_details schema, customer_id needs to be added to the order_details table as well as the orders table.
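
To make the classic orders & order_details example concrete, here is a T-SQL sketch based on the annotations described in this post. Since federations had not shipped even as a CTP at the time of writing (see the note at the end of this article), treat the exact syntax as illustrative:

-- Create a federation distributed on a BIGINT key named cust_id (pre-CTP syntax, subject to change)
CREATE FEDERATION Orders_Federation (cust_id BIGINT RANGE)
GO
-- Connect to the federation member covering a given key value
USE FEDERATION Orders_Federation (cust_id = 0) WITH RESET, FILTERING = OFF
GO
-- Both tables carry the distribution key; order_details is denormalized
-- to include customer_id, as described above
CREATE TABLE orders(
    customer_id BIGINT NOT NULL,
    order_id BIGINT NOT NULL,
    order_date DATETIME,
    -- the federation key must be part of every unique key
    PRIMARY KEY (customer_id, order_id)
) FEDERATED ON (cust_id = customer_id)
GO
CREATE TABLE order_details(
    customer_id BIGINT NOT NULL, -- denormalized federation key
    order_id BIGINT NOT NULL,
    product_id INT NOT NULL,
    quantity INT,
    PRIMARY KEY (customer_id, order_id, product_id)
) FEDERATED ON (cust_id = customer_id)
GO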

There are a few other requirements to remember:

  • The federation key column can only be one of the following types in v1: INT, BIGINT, UNIQUEIDENTIFIER, VARBINARY(900). In future iterations, you can expect the list to expand.
  • The federation key column value in the table cannot be nullable and cannot be updated.
  • The federation distribution key is required to be part of every unique key.
  • Foreign key relationships between federated tables are required to include the federation key at the same ordinal. This restriction does not apply to foreign key relationships to reference tables.
  • In v1, the federation key column cannot be a computed column.
  • Due to repartitioning operations, database-scoped functions are challenging for federation members: federation members do not support the identity property on tables.
  • Due to similar limitations, the timestamp data type is not supported in federation members.
  • Schema for the federated tables can only be changed over an UNFILTERING connection.

Reference Tables:
Refer to tables that contain reference information to optimize lookup queries in federations. Reference tables are created in federation members, and no special annotation is required when creating them: the absence of the FEDERATED ON clause is all that is needed. Reference tables typically contain small lookup information useful for query processing, such as zip codes, that is cloned to each federation member.

In the figure above, reference tables are marked green. When a SPLIT operation is issued, reference table schema and data are cloned to both destination members.

A few things to remember with reference tables:

  • The same identity and timestamp limitations apply to reference tables as well.
  • Updating reference table schema and data can only be done over an UNFILTERING connection.
  • Reference tables cannot have foreign key relationships referring to federated tables.

Central Tables:
Refer to tables that are created in the federation root, typically for low-traffic objects such as application metadata. No special annotation is required for these. In the figure above, central tables are marked orange. Central tables exist only in the root database context and are accessible only when you are in the root database. Some of the limitations listed above do not apply to the root database and tables created in it. Since central tables live in the root database only, they do not participate in any repartitioning operations. As noted above, one thing to remember is that central tables are only visible when connected to the root and are not accessible when connected to the member databases.

Other Considerations for Federation Member Schema

Finally, a few other limitations exist in v1 for the federation member schema.

  • This is implied above, but to be explicit: object_ids for objects with the same name are independent between federation members.
  • Indexed views are not supported in federation members.
  • All changes that impact the global view of the atomic unit require connections that have turned off filtering with FILTERING=OFF.
  • Schema updates to the federation members can only be done with unfiltering connections.
  • Updates to the reference table data also require unfiltering connections.
  • Much like the login and user relationship that exists today between the master database and user databases, user accounts in federation members have to match the user account name at the root at create time.

The above summarizes the schema requirements. As always, if you have questions, feel free to reach out through the blog.

Cihan is posting a lot of information about a product without even a CTP. For more background on SQL Azure Federation, see my Build Big-Data Apps in SQL Azure with Federation article in Visual Studio Magazine’s March 2011 issue.


<Return to section navigation list> 

MarketPlace DataMarket and OData

Marcelo Lopez Ruiz reported datajs intro video now online in a 4/18/2011 post:

The MIX11 session on datajs is now online - enjoy!

From the abstract for the Data in an HTML5 World session:

Come and learn about 'datajs'. datajs is a new cross-browser JavaScript library that enables better data-centric web applications by leveraging HTML5 browser features and modern protocols such as OData. It's designed to be small and fast, and to provide functionality for structured queries, data modification, and interaction with various cloud services, including Windows Azure.
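
For the curious, here is a tiny hedged sketch of the library's flavor: datajs exposes an OData.read entry point taking a URL (or request object) plus success and error callbacks. The service URL below is hypothetical:

// Read an OData feed; data.results holds the returned entries
OData.read("/service.svc/Categories",
    function (data, response) { console.log(data.results); },
    function (err) { console.log(err.message); });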


<Return to section navigation list> 

Windows Azure AppFabric: Access Control, WIF and Service Bus

Vittorio Bertocci (@vibronet) continued his Access Control series with ACS Extensions for Umbraco - Part II: Social Providers on 4/18/2011:

Welcome to the second installment of the series on the ACS Extensions for Umbraco, part of the big ACS launch wave!

This time I am going to assume that you successfully installed the Extensions on your Umbraco instance, and you want to start accepting members from social & web identity providers. You'll see how easy it is to stitch together the sign-up and sign-in machinery (I'd estimate 3 minutes max), and how we integrated the classic email verification flow into our invitation engine.

As usual, I am going to paste here what we wrote in the docs. However, remember that one of the big success factors for Umbraco is its great flexibility: in the same spirit, feel free to tinker with the Extensions to add any extra features you may need. A few suggestions: bulk invitations, self-registering pages, etc.

Access Control Service (ACS) Extensions for Umbraco

Click here for a video walkthrough of this tutorial.

One of the main features of the ACS Extensions is the ability to add members to your Umbraco web site directly from social providers: people will be able to sign up to your web site without having to create a new account and remember a new password, and you will not need to maintain credentials, field calls for lost passwords, and so on.

The ACS Extensions accomplish this by providing you with a workflow for inviting users via a simple email verification system. The ACS Extensions also provide UI macros which you can use to create sign-in and sign-up pages in just a few clicks. The sign-in and sign-up mechanisms are integrated with the role-based security supported natively by Umbraco: the member groups.

In the section below we will walk you through the process of enabling classic sign-in and sign-up authentication features in your web site: this will help you to understand how the various components of the ACS Extensions work, so that you will be able to devise your own flow.

Create Some Sample Members Groups

In order to add authentication capabilities to your Umbraco web site, you need to subdivide your users into member groups.

  1. Navigate to the Umbraco back-end and login as the user account you created during the configuration wizard.
  2. Go to Members section and right-click on Members->Members Groups node and select Create.
  3. Enter the Name of the member group and then click on the Create button. For this tutorial we will need the “Power Members” group.


    Figure 22 - Create Member Group popup dialog

  4. Repeat the process and create a new member group called “Common Members”. In this tutorial we will assign “Common Members” to everybody; “Power Members” will be reserved for those members who need access to restricted pages.
Create Login and Error Pages

In order to enable members to sign in using social providers, you need to add to your web site a login page that displays the appropriate choices, that is, the list of social providers you selected when you set up the ACS Extensions. Thanks to the macro provided by the extensions, this task is very straightforward.

  1. From the Content section of the management UI, click on the “Simple Web Site” (or other “Homepage”) node at the top of the Content tree.
  2. Right-click on the Homepage node and select Create to add a new page in your web site.
  3. Write “Login” in the Name field and click on Create button.


    Figure 23 - Create Content popup dialog

  4. Go to the right panel to edit the page.
  5. In the Content tab add a welcome sentence (e.g. “Welcome back! Please sign in with the provider you used for registering among the ones below”)
  6. Click on Insert Macro button from toolbar.


    Figure 24 - Insert Macro button

  7. Select the acsExtensions.loginMacro. That macro contains the necessary logic for rendering links to all the social providers (and other identity providers, more about that later), so that the user can click on the provider of choice and be redirected to authenticate there. acsExtensions.LoginMacro provides username and password fields as well, in case you want to support the traditional Umbraco member management side by side with the new federated authentication features.


    Figure 25 - Insert macro popup dialog

  8. Go to the properties tab and check Hide in navigation, to prevent the page from appearing in the navigation bar.
  9. Click on the Save and Publish button.


    Figure 26 - Login Page Created

  10. Now that you have a login page, you need an error page to display in case something goes wrong. Right-click on the Homepage node and select Create once again.
  11. Enter “Unauthorized” in the Name field and click on Create.
  12. Go to the right panel to edit the page.
  13. In the Content tab add a suitable error message (e.g. “You are not authorized to access the page you requested.”)
  14. Go to the properties tab and check Hide in navigation.
  15. Click on the Save and Publish button.

    You now have all the necessary assets to authenticate members: the next step is to wire them up to the content you want to protect.

Restrict Access to Content

Umbraco supports role-based authorization as the access control method for its content pages. When you elect to use role-based security with a given page, Umbraco will ask you to specify one login page and one error page. The ACS Extensions snap into that mechanism by using a login page containing their macro, just like the page you created in the former task. In the steps below we will restrict access to one page in order to demonstrate the process.

  1. From the Content section right-click on the area (or page) that you want to secure and select Public Access. In this tutorial we are protecting the page “Go Further”.


    Figure 27 - Public access menu item

  2. Choose the Role based protection option and click on Select button.


    Figure 28 - Public access popup dialog | Choose how to restrict access to this page

  3. You’ll be presented with a dialog which allows you to choose which member groups should have access to this page. Select Power Members and use the right arrow button to move it to the list on the right.
  4. In the Login Page area click Choose… to specify the page that Umbraco will display when an unauthenticated user attempts to access the protected page. Pick the Login page you created earlier.


    Figure 29 - Public access popup dialog | Role based protection

  5. Repeat the process for the Error Page area. This time, pick the Unauthorized page created earlier.


    Figure 30 - The end result in the public access popup dialog | Role based protection

  6. Click Update and close the dialog.
  7. At this point we are ready to test the sign-in page. Navigate to your Umbraco web site.


    Figure 31 - The Umbraco web site home page

  8. Click on Go further in the navigation bar. You will be redirected to the login page, as shown below. The list of the social providers is automatically obtained from ACS; the username and password fields are provided for compatibility with the native Umbraco membership management system.


    Figure 32 - The login page

    The login mechanism works, but at this point you don’t have any registered members yet. In the next task you will create the activation page that invited members will use for signing up to your Umbraco instance.

Create Activation Pages for Signing Up Social Members

In order to register new members from social providers, you need to create some pages that will allow them to sign up to your web site using their account of choice. In a later task you will learn how to invite users; in the steps below you will see how to build the assets that will support the member activation experience.

The infrastructure is pretty simple: you need one Activation page, which constitutes the landing page for new members who have just been invited to join the site, and one Signup page, which they will use to associate their account from the social provider of choice with the account on your web site.

  1. Using the procedure you have learned in the former tasks, add a new page in the content tree and call it Sign Up.
  2. Go to the right panel to edit the page.
  3. In the Content tab add a sentence which will clarify the purpose of the page (e.g. “Please choose from the list below which social provider account you want to use for registering with this web site”)
  4. Click on Insert Macro button from toolbar.
  5. Select the acsExtensions.federatedloginMacro. This macro behaves in the same way as the acsExtensions.loginMacro you used when creating the sign-in page, with the only difference that the username and password UI elements are omitted: in order to sign up with a social provider account, those elements are not needed.


    Figure 33 - Insert the federated login macro to list all configured social providers login links

  6. Go to the properties tab and check Hide in navigation, to prevent the page from appearing in the navigation bar.
  7. Click on the Save and Publish button.
  8. The Sign Up page will be used to secure the invitations landing page. Create a new page and call it activation.

    Note:

    The default name and path for the activation page is “~/activation”. If you keep all the defaults, it is important that you call the page “activation”, or the invitation mechanism will fail (see the next task).
    If you want to use a page name other than “activation”, you need to update the web.config file of your Umbraco web site (appSettings -> ActivationPage); see the configuration sketch after this list.

  9. Go to the right panel to edit the page.
  10. In the Content tab write an activation confirmation text like: “Your account was activated successfully!”


    Figure 34 - Edit activation page

  11. Go to the properties tab and check Hide in navigation, to prevent the page from appearing in the navigation bar.
  12. Click on the Save and Publish button.
  13. Using the procedure you’ve learned in the Restrict Access to Content task, set the public access for the activation page to Role based protection. Assign Common Members as the member group authorized to see the page, set the Sign Up page as the login page and the Unauthorized page as the Error page.


    Figure 35 - Permissions for the activation page

  14. Click on Save or Publish button.

    At this point you have everything you need to start inviting members from social providers.
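
As the note in the steps above mentions, renaming the activation page means updating the web.config of your Umbraco site so the ACS Extensions can find the page. Here is a minimal sketch of that appSettings entry, where the key name comes from the note and “member-activation” is just a hypothetical page name:

<appSettings>
  <!-- Hypothetical renamed activation page; the default is ~/activation -->
  <add key="ActivationPage" value="~/member-activation" />
</appSettings>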

Add and Manage Social Members

With the ACS Extensions you can add members directly from social providers such as Facebook, Windows Live ID, Google and Yahoo. The sign-in and sign-up pages you created in the two former steps provide the user experience elements to support the registration and authentication flow. In this task you will learn how to use the administrative features of the ACS Extensions to invite one new member. You will discover that the process is practically the same as creating traditional members, minus the pain of managing credentials.

  1. Go to Members section and right-click on Members->Members node and select Create.


    Figure 36 - Create Member

  2. Choose the Social Member type, a special type added by the ACS Extensions. Complete the login name and email fields and click on the Create button. Make sure to use a working, actual email address, as the member will need access to it in order to sign up.


    Figure 37 - The Create Member popup dialog

    Note:

    As soon as you hit the Create button, the ACS Extensions create one invitation mail and send it to the user (using the SMTP server you configured in the setup). That invitation contains a specially crafted activation link, which the member can use to associate his social account of choice with your web site, using the activation assets you created in the former step.
    If you want to modify the template of the invitation email, you can do so when creating a member from within the Create Member dialog.
    The newly created member has some extra properties with respect to the default Umbraco member: one flag reporting whether the invitation was sent, another reporting whether the account has been activated (false at the beginning; it flips when the user successfully signs up), and one recording which identity provider he is coming from (it gets a value at activation time, according to the member’s choice).

  3. In the right panel you can assign the newly created member to the web site’s member groups. For the purposes of this tutorial, add the member to the Common Members group.


    Figure 38 - Edit Member panel

  4. Save the changes.

Let’s step for a moment into the shoes of the new member and see what the invitation and sign-up experience looks like.

The flow begins when the prospective member receives the invitation mail, as shown below.


Figure 39 - The default invitation mail

As mentioned above, the invitation email template can be configured to fit your preferences.

The URL has been constructed to trigger the activation process; the ticketnumber parameter is used for associating the member defined in Umbraco with this specific email verification transaction.

Following the link, the prospective member will land on the activation page, which in turn will redirect to the Sign Up page.

Figure 40 - The Sign Up Page

There the user can select the identity provider he wants to use for signing in to the web site. That account will be mapped to the Social Member created in Umbraco, with all its properties (including the assigned roles). For the purpose of this tutorial, let’s assume that the user picks Windows Live ID and successfully authenticates.


Figure 41 - Activation Completed

As a result, Umbraco can finally serve the activation page.

Now, you may remember from the earlier steps that the current member has been assigned to the Common Members group; that’s not enough for accessing the Go further page, which you configured to be visible only to Power Members. If you click on Go further at this point, you will get redirected to the error page.


Figure 42 - Unauthorized

  1. Let’s change the member group settings for the member so that he can access the Go further page. Go back to the Member details in the back-end and refresh the page. Notice that the Is Activated and Identity Provider fields have been updated.


    Figure 43 - Updated Member Details

  2. Add the user to the Power Members group, save the changes and hit F5: access to the restricted page will now be granted.


    Figure 44 - Access Granted.

There you have it: your Umbraco web site is now integrated with Windows Live ID, Facebook, Google, Yahoo and any other social provider the Access Control Service will support in the future! It takes much longer to explain how to do it than to do it: the average time for walking through this tutorial is consistently under 20 minutes.


<Return to section navigation list> 

Windows Azure VM Role, Virtual Network, Connect, RDP and CDN

No significant articles today.


<Return to section navigation list> 

Live Windows Azure Apps, APIs, Tools and Test Harnesses

The Windows Azure Team posted Real World Windows Azure: Polycom Chooses Windows Azure To Power its Cloud-Based Media Services on 4/18/2011:

Tell us about Accordent and Polycom.

Polycom recently acquired Accordent Technologies, a developer of award-winning enterprise video platforms used by large organizations to produce and deliver video-based assets for communications, training and educational purposes. This week, Polycom will launch Accordent Media Services - Powered by Windows Azure, the first enterprise media management system hosted on the Windows Azure platform.

What is Polycom's Accordent Media Services?

Accordent Media Services - Powered by Windows Azure enables organizations to deliver "YouTube for the Enterprise" via an automated software-based video creation, management, publishing and delivery platform. Based 100% on Microsoft technologies, the platform enables organizations to leverage their Microsoft infrastructure investments to deliver highly bandwidth-efficient, live and on-demand video for virtually any business scenario.

How does Windows Azure benefit your customers?

Windows Azure extends the capabilities of the on-premises Accordent solution by enabling customers to experience the benefits of centralized media management on a subscription basis, without incurring the up-front network and storage hardware costs and the complexities of deploying and maintaining an on-premises solution.

What does the service include?

The service includes an installed, branded, and fully functional Accordent Media Management System, hosted within Windows Azure datacenters, which delivers content in various formats to widely distributed audiences via the Windows Azure Content Delivery Network (CDN). Customers will be able to submit content created from remote Accordent capture products across the cloud leveraging Microsoft Expression Encoder technology, video conferencing integrations, and desktop Microsoft Lync clients. The service will provide full enterprise video content management capabilities (categorizing, indexing, retention management, etc.), publishing to customer Microsoft SharePoint solutions, and scalable delivery of live and on-demand content over the Windows Azure CDN.

Why did you choose to build on the Windows Azure platform?

The choice of the Windows Azure platform was a natural one for Polycom, and an extension of its long-standing strategic partnership with Microsoft. The Windows Azure platform provides an open, standards-based environment to give customers exceptional flexibility and scalability. The solution also included all the elements required to deliver a world class, cloud-based media management solution, including the Windows Azure Virtual Machine (VM) Application Hosting, Windows Azure Virtual Storage, and Windows Azure Content Delivery Network Services.

Click here to learn more about Accordent.  Click here to read more about how other companies are using the Windows Azure platform.


<Return to section navigation list> 

Visual Studio LightSwitch

The LightSwitch Team explained How to Create Composed and Scalar Queries (Ravi Eda) on 4/18/2011:

This article presents a scenario from a sales team that shows the benefit of using composed and scalar queries. It presents a detailed walkthrough for creating composed and scalar queries in LightSwitch, and for creating screens to display the query results.

A composed query is a query that operates on the results of another query. A scalar query is a query that returns one and only one row.

Business Case

Consider a business application designed for a sales department at a medium-sized organization. This application should equip the sales teams to slice and dice orders based on various parameters: customers in a certain geographic area, orders from a particular date range, products that sell at a high price, and large orders. The sales team would also want to know the customer who spent the most on a single order.

LightSwitch Application for the Sales Team

Let us create a LightSwitch application that can help the sales team. To follow along with this example you will need the Northwind database in addition to LightSwitch.

Create a project, attach to the Northwind database and import the Order Details, Orders, Products and Customers tables. We want to create a filter that can display all the orders from customers who live in a certain country. To do this, add a query under the Order_Details table. Name the query “OrderDetailsForCustomersInCountry”.

On the query designer surface, click “+ Add Filter”. Select “Order.Customer.Country” for the filter’s left operand, “=” (is equal to) as the operator, and then choose “@ Parameter” for the value type. For the filter’s right operand, click “Add New”. This will create a parameter named “Country”. Make this parameter optional: open the parameter’s properties window and check “Is Optional”. The query designer should now resemble Figure 1.


Figure 1: Query Designer with “OrderDetailsForCustomersInCountry” Query

The “Country” parameter is marked optional because its value is set to a default based on certain logic. For example, a sales team member in the United States would be more interested in orders from customers living in the United States. If the parameter were not optional, the team member would have to enter its value each time the filter executes. The PreprocessQuery() method makes it possible to write code that sets the default value of “Country”.

Click the “Edit Additional Query Code” link in the properties window and add the following lines of code to the PreprocessQuery() method. This code first checks whether the parameter was passed and, if not, filters the query to customers in the USA:

Visual Basic:

        Private Sub OrderDetailsForCustomersInCountry_PreprocessQuery(
            Country As String,
            ByRef query As System.Linq.IQueryable(Of LightSwitchApplication.Order_Detail))

            If String.IsNullOrEmpty(Country) Then
                query = From detail In query
                        Where detail.Order.Customer.Country = "USA"
            End If
        End Sub

C#:

partial void OrderDetailsForCustomersInCountry_PreprocessQuery
    (string Country, 
    ref IQueryable<LightSwitchApplication.Order_Detail> query)
{
    if (String.IsNullOrEmpty(Country))
    {
        query = from details in query
                where details.Order.Customer.Country == "USA"
                select details;
    }
}
Search Screens

Let us build a screen to display results of “OrderDetailsForCustomersInCountry”. Open “OrderDetailsForCustomersInCountry” in the Query designer. Click “Add Screen” on the designer’s command bar. Choose “Search Data Screen” template. Enter “CustomersByCountry” as the screen name. Choose “OrderDetailsForCustomersInCountry” as the screen data. See Figure 2.


Figure 2: Add New Screen Dialog for CustomerByCountry Search Screen

In the Screen designer, select “Order_DetailCountry” and open its properties. Enter “Filter by Country:” for the display name. Check “Is Parameter”. Uncheck “Is Required”. See Figure 3.


Figure 3: Parameterized Search Screen

Run (F5) the application. The runtime should look as shown in Figure 4. Notice that “Filter by Country:” is blank and the number of pages returned is 8. The logic in the PreprocessQuery() method filtered and returned only orders from customers in the USA. Without this filtering, the number of pages returned would be greater than 8.


Figure 4: Orders Details from Customers in USA

Suppose the sales team members want to look at orders from customers in some other country, say Austria. In that case, they enter “Austria” in the “Filter by Country:” field and hit the Enter or Tab key. The filtered results will look as shown in Figure 5. Notice that the number of pages returned is 3.


Figure 5: Order Details from Customers in Austria

Composed Queries

There are some benefits to using a query as the source of another query. The composed query reuses the logic built in the designer and the code written in the PreprocessQuery() method of the source query. In addition, composition helps maintainability: it is easier to update a single query than to make changes across multiple queries.

The data source for “OrderDetailsForCustomersInCountry” is Order_Details. In LightSwitch, it is possible to choose a query as a source for another query. The composed query operates on the results obtained from the source query.

Let us build a composed query that uses the results from “OrderDetailsForCustomersInCountry”. Start by adding a new query under Order_Details. Name the query “OrderDetailsForDateRange”. From the Query designer’s command bar, change the Source of the query from Order_Details to “OrderDetailsForCustomersInCountry”. See Figure 6.


Figure 6: Choose the Source for “OrderDetailsForDateRange” Query

Notice that parameters from the source query, in this case Country, are available in the composed query. These inherited parameters will not be editable on the designer surface. However, the parameters will be available for the PreprocessQuery() method and on the designer surface, specifically as a filter’s right-operand.

“OrderDetailsForDateRange” filters orders whose order date falls within a certain date range. The start and end dates are to be provided by the user. On the query designer surface, click “+ Add Filter”. Select “Order.OrderDate” for the left operand of the filter. For the operator, choose “is between”. There will be two right operands for the “is between” operator. Add a “StartDate” parameter as the first right operand and an “EndDate” parameter as the second. The query designer should now look as shown in Figure 7.


Figure 7: Query Designer with “OrderDetailsForDateRange” Query

Hierarchy of Composed Queries

This section shows how a query can be a source for multiple queries. It also shows that a composed query can be a source for another query or queries.

The sales team needs to track products that sell at a high price. Each member of the sales team is interested in looking at only the products he/she is responsible for tracking. A high-price product is one that has a unit price greater than $25. In addition, the product should be available in the market, i.e., the “Discontinued” field is set to false.

Add a new query under Order_Details. Name this query “OrderDetailsForHighPriceProducts”. Change the source of the query to “OrderDetailsForCustomersInCountry”. On the query designer surface, add two filters as shown in Figure 8.


Figure 8: Query Designer with “OrderDetailsForHighPriceProducts” Query

Multi-Level Composition

Now let us create another level of composition on “OrderDetailsForHighPriceProducts”. Among these high-priced products, we want to find large purchases. The sales team defines a purchase of $2,000 or greater as a large purchase.

Add a new query under Order_Details. Name this query “OrderDetailsForLargePurchase”. Change the source of the query to “OrderDetailsForHighPriceProducts”. The designer surface does not provide support for logic that determines the purchase price, because it is a calculation over multiple fields. Hence, add the following code to the PreprocessQuery() method.

Visual Basic:

        Private Sub OrderDetailsForLargePurchase_PreprocessQuery(
            ByRef query As System.Linq.IQueryable(Of LightSwitchApplication.Order_Detail))

            query = From detail In query
                    Where (detail.UnitPrice * detail.Quantity) > 2000
        End Sub

C#:

partial void OrderDetailsForLargePurchase_PreprocessQuery(ref IQueryable<Order_Detail> query)
{
    query = from details in query
            where (details.UnitPrice * details.Quantity) > 2000
            select details;
}

“OrderDetailsForLargePurchase” is composed on “OrderDetailsForHighPriceProducts”, and “OrderDetailsForHighPriceProducts” is composed on “OrderDetailsForCustomersInCountry”. This demonstrates multi-level composition of queries.

“OrderDetailsForCustomersInCountry” is the source for “OrderDetailsForHighPriceProducts” and “OrderDetailsForDateRange”. This demonstrates that multiple queries can use a query as the data source.

You should notice that the source dropdown on the Query designer command bar lists only valid data sources. A valid data source is the parent entity or any other query under the same entity that will not cause a cyclical reference. A cyclical reference occurs when queries become data sources for each other: if Query A is the source of Query B, then Query B cannot be the source of Query A. In the sales team queries, “OrderDetailsForLargePurchase” cannot be the source of “OrderDetailsForHighPriceProducts”. However, both queries can use Order_Details, “OrderDetailsForCustomersInCountry” or “OrderDetailsForDateRange” as their source.

Scalar Queries

The sales team wants to know their most valuable customer, the customer who spent the most on an order. To locate this one and only one customer we can create a query.

Add a query under Order_Details. Name it “BiggestPocketQuery”. Open the query’s Properties window. Change the “Number of Results Returned:” property from Many to One. Observe that the “Add Sort” button becomes disabled, since the number of results returned for this query is one and sorting a single result is not applicable. The query should look as shown in Figure 9.


Figure 9: Set “BiggestPocketQuery” Query as Scalar

Please note that a scalar query cannot be a source for another query or a screen.

Add the following code in PreprocessQuery() method to compute the purchase price for each order and return the largest.

Visual Basic:

        Private Sub BiggestPocketQuery_PreprocessQuery(
            ByRef query As System.Linq.IQueryable(Of LightSwitchApplication.Order_Detail))

            query = From detail In query
                    Order By (detail.UnitPrice * detail.Quantity) Descending
                    Take 1
        End Sub

C#:

partial void BiggestPocketQuery_PreprocessQuery(ref IQueryable<Order_Detail> query)
{
    query = query.OrderByDescending(od => od.UnitPrice * od.Quantity).Take(1);
}
Screen for Scalar Query

This section will show how to build a screen that can display the result from a scalar query.

Open one of the non-scalar queries in the designer. On the Query designer’s command bar, click “Add Screen”. In the “Add New Screen” dialog, choose the “Search Data Screen” template. Enter “MostValuableCustomer” as the Screen Name. Leave Screen Data as “(None)”. Click OK. See Figure 10.


Figure 10: Add New Screen Dialog for Most Valuable Customer Screen

In the Screen designer, click “Add Data Item…” button on the command bar. See Figure 11.


Figure 11: Add Data Item

On “Add Data Item” dialog, choose “NorthwindData.BiggestPocketQuery”. Click OK. See Figure 12.


Figure 12: Add the Scalar Query to the Screen

In the Screen designer, drag the query onto the screen layout section as shown in Figure 13.


Figure 13: Add the Scalar Query to the Screen Layout

Run the application (F5) and open the “Most Valuable Customer” screen. It should look as shown in Figure 14.


Figure 14: Most Valuable Customer Screen

Conclusion

The Query designer in LightSwitch makes it simple for the developer to create composed and scalar queries. The PreprocessQuery() method equips the developer to write additional logic to filter the results. The Screen designer allows you to create screens that can display the results from the queries. Have fun building your own queries in LightSwitch!


Julie Lerman (@julielerman) posted Accessing ObjectContext Features from EF 4.1 DbContext on 4/17/2011:

The Entity Framework 4.1 DbContext is a lightweight version of the EF ObjectContext. It's simpler to work with thanks to a streamlined surface of properties and methods. It may be just what you've always been looking for.

But....

Once in a while you might want to use a method of the ObjectContext that is not available on the DbContext.

All is not lost. The EF team built in a hook for you so that you can actually get to the ObjectContext from the DbContext. It's not as simple as a property, however; it requires a bit of casting and more.

When I know I will want the occasional benefit of the ObjectContext, I simply create a helper method in my DbContext class so it's easier to get to.

Here's what it looks like:

    // Requires using System.Data.Objects and System.Data.Entity.Infrastructure
    public class MyContext : DbContext
    {
        public DbSet<Blog> Blogs { get; set; }
        // other DbSets, ctor, etc.

        public ObjectContext ObjectContext()
        {
            // DbContext implements IObjectContextAdapter explicitly,
            // which exposes the underlying ObjectContext
            return (this as IObjectContextAdapter).ObjectContext;
        }
    }

Now when I have an instance of the DbContext, I can use features of the ObjectContext on the fly:

 db.ObjectContext().....
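
For instance, here is a minimal usage sketch that reaches Refresh(), an ObjectContext-only method; the Blog entity and the key value are assumptions carried over from the class above:

using System.Data.Objects; // for RefreshMode

using (var db = new MyContext())
{
    var blog = db.Blogs.Find(1); // hypothetical entity and key

    // Refresh is not exposed by DbContext: overwrite the in-memory
    // entity with the current values from the database
    db.ObjectContext().Refresh(RefreshMode.StoreWins, blog);
}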


Return to section navigation list> 

Windows Azure Infrastructure and DevOps

Wes Yanaga announced the opening of Microsoft Virtual Academy for IT Professionals–Cloud Computing on 4/18/2011:

Microsoft Corp. has launched a new virtual training portal aimed at IT professionals who want to learn new skills and enhance existing ones for the cloud computing environment. The Microsoft Virtual Academy (MVA) offers no-cost training for IT professionals with 100- and 200-level content. It takes you right to the heart of the latest advances in cloud computing and guides you through real-life deployment scenarios and the latest technologies and tools. By selecting the courses that match your needs, you can get up to speed quickly without wasting time on areas you are not interested in.

MVA modules include:

  • Introduction to SCVMM, Architecture & setup
  • Creating VMs, Template & Resources in VMM
  • Managing Windows Azure
  • SQL Azure Security
  • Identity & Access
  • Data security and cryptography

Each module also includes a self-assessment to help reinforce the materials and allow you to check your progress.

As you take modules and assessments you gain points, qualifying you for future offers and benefits. You can also opt into the league tables and compare your points tally to others in the Academy and your locale. Register now for free.

Here are some courses:

  • Introduction To SQL Azure
  • Introduction To Windows Azure
  • Windows Azure Security Overview
  • Planning, Building and Managing a Private Cloud


Michael Coté (@cote) posted Eating the full cloud pie – highlights from Randy Bias’ guest appearance on 4/18/2011:

There is no half-steppin’ in cloud, guest Randy Bias of Cloudscaling, IT Management and Cloud Podcast #087 – Transcript


Going full-tilt on cloud is a lot different than just installing some cloud products and stacks. That’s the take-away from reviewing a conversation I had recently with Cloudscaling’s Randy Bias; the full transcript is in the above PDF (or go to the original IT Management & Cloud podcast show-notes for the plain-text transcript).

Here are some highlights from that conversation (all from Randy):

On “Enterprise Clouds”

I have had this kind of like rant about the enterprise cloud lately is because, I really figured out lately that this whole approach to building sort of these “enterprise clouds” is fundamentally broken from the ROI point of view.

I mean, you have sort of got this weird disconnect or you have got the larger service providers, I don’t want to name anybody’s name, they are pretty obvious when you go out there and look at them, they have got these big enterprise spaces and they are saying, “hey, enterprises don’t want what Amazon has got, they want something different, they need to support all these legacy applications.”

So they are trying to build these very complex, very expensive clouds that are not going to be anywhere near cross-competitor with Amazon. And at the same time you look at the centralized IT department and they are making a decision. They are like, “well, are we going to outsource all these legacy apps and our jobs go away, or do we just build our own internal private cloud?” Most of them are choosing to go down building that internal private cloud route.

So you have got centralized IT going to the enterprise vendors to build an infrastructure that looks exactly the way these external public enterprise clouds look, same people, same technology, same management processes. And I don’t understand how there’s — I don’t see success in the future for either of those paths, and they are inherently competing with each other as well, and Amazon has kind of run away.

On security

Security is sort of a nonstarter. You can build a cloud to be as secure as you want, doesn’t matter what techniques you use. I just pretty much ignore that.

On what cloud operations looks like

Any kind of infrastructure cloud is basically going to look a lot like Amazon. Your CAPEX costs are going to be reduced by something like 75%. Your operational costs are going to be reduced similarly, at least for the infrastructure side. And you will probably see a change of a factor of 10 or a 100x in the number of infrastructure people you need to run a successful private cloud.

Any kind of IT that provides simply basic services to the business probably shouldn’t be run by the internal IT department. The internal IT department should be focused on those parts of the business that are fundamentally differentiating and that should be what your private cloud is focused on.

On the need to be transformative, not just install things

[T]he things there that people are still looking at this as sort of a product problem instead of a transformation problem, and I have literally had senior enterprise people say to me, “wow, we are buying this new automation software, we are going to put it in our data center and we are going to turn our data center into a cloud,” and I just tragically fell off my chair laughing it was like, there is no software you can buy to automate your data center and turn it into a cloud, if there was somebody would have been successful with all the attempts that have happened over the past 30 years to automate data centers. I mean, that’s not what’s happening.

…people look at it as sort of being solved by products, and I don’t think it can be solved by products, it has to be solved by a combination of products, architecture, and cultural change.

On standardization, simplifying IT

How has Amazon got 400 engineers and data center techs basically running 80,000 plus physical servers? I mean, it’s because they are doing things very differently [than traditional IT]….

And part of the economies of scale is like very homogenous environments. Like Google is reputed to have five hardware configurations across one to two million servers, whereas in a typical enterprise environment I have seen hundreds of hardware configurations across a much smaller footprint.

If you liked the above, check out the full episode, it’s chock-full of nice cloud commentary.



<Return to section navigation list> 

Windows Azure Platform Appliance (WAPA), Hyper-V and Private/Hybrid Clouds


No significant articles today.


<Return to section navigation list> 

Cloud Security and Governance


No significant articles today.


<Return to section navigation list> 

Cloud Computing Events

Vittorio Bertocci (@vibronet) claimed it’s Time to be a spreker again: TechDays Belgium & DevDays Nederlands:


I am back from sunny Vegas, where we finally announced the RTW of the new ACS, to the less sunny Redmond; but not for long. Next Saturday I’m scheduled to fly back to the EU, where I’ll have the honor and privilege to blabber about my favorite topics at TechDays Belgium and Nederland’s DevDays.

TechDays Belgium

This will be the third time that I speak at the BeLux TechDays. The last two times I had a lot of fun, and I’ve been impressed with the excellent work that Katrien and the rest of the Belgian crew put together; I am sure that this year they’ll deliver just as well.

Note: for scheduling reasons I ended up getting a travel plan that will bring me to Belgium well ahead of schedule, on the 24th; hence, if your company wants to meet and chat about Identity and/or Windows Azure, or even grab a Duvel, please get in touch with me.

The first session I am doing is a pretty standard intro to identity and the cloud: if you already know about claims, that’s not the session for you.

Identity and Access Control in the Cloud with Windows Azure

Wednesday the 27th

13:00-14:15

If you don’t yet know what claims-based identity is, it’s time to get busy. Signing users in and granting them access is a core function of almost every cloud-based application, and claims-based identity is the best way to take care of that. In this session we will show you how to simplify your user experience by enabling users to sign in with an existing account such as a Windows Live ID, Google, Yahoo, Facebook, or on-premises Active Directory account, implement access control, and make secure connections between applications. You will learn how the AppFabric Access Control Service, Windows Identity Foundation, and Active Directory Federation Services use a claims-based identity architecture to help you to take advantage of the shift toward the cloud while still fully leveraging your on-premises investments.

The second session is, surprise surprise, not exclusively focused on identity: it’s going to be about architecting and developing SaaS applications on the Windows Azure platform. I’ll share in detail my experience in leading the www.fabrikamshipping.com project.

Developing SaaS Solutions with the Windows Azure Platform

Thursday the 28th

14:30-15:45

Come to this session to learn about how to take advantage of the Windows Azure platform for developing and running your subscription based applications. Discover, through concrete examples, how to onboard customers, dynamically provision application instances, handle single sign-on, and offer self-service authorization. Explore the patterns and the tradeoffs you need to consider in order to meet the needs of a wide variety of customers, all the while maintaining control over your resources and the way you run your business.
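
As a purely hypothetical illustration of the onboarding step the abstract mentions (this is not code from the session or from FabrikamShipping; the “Tenants” table name and entity shape are invented for the sketch), a sign-up page might persist a new tenant record to table storage for a worker role to provision later:

using System;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;

// Invented entity shape: one row per tenant, keyed by tenant name.
public class TenantEntity : TableServiceEntity
{
    public TenantEntity() { }

    public TenantEntity(string tenantName)
        : base("tenant", tenantName) // partition key, row key
    {
        SignUpUtc = DateTime.UtcNow;
        Provisioned = false; // a worker role could poll for unprovisioned rows
    }

    public DateTime SignUpUtc { get; set; }
    public bool Provisioned { get; set; }
}

public static class Onboarding
{
    public static void RegisterTenant(string tenantName)
    {
        var account = CloudStorageAccount.Parse(
            RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));
        var tables = account.CreateCloudTableClient();
        tables.CreateTableIfNotExist("Tenants");

        var context = tables.GetDataServiceContext();
        context.AddObject("Tenants", new TenantEntity(tenantName));
        context.SaveChanges();
    }
}

Decoupling sign-up from provisioning this way keeps the web role responsive while the slower instance-provisioning work happens in the background.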

Right after the SaaS session I’ll jump on a train (bus?) and head down to Amsterdam for the DevDays.

DevDays Nederland

I’ve been to Amsterdam many times for customer meetings, but never for an event: that’s why I am so grateful to Arie for inviting me to this year’s DevDays.

In fact, rather than a usual speaking engagement this will be a bit of a vocal-cords marathon: all 3 of my sessions are on the 29th. Luckily there is some buffer between time slots, and hopefully by then the jet lag will be under control; and of course I’ve done much worse, for example at the WIF Workshops, where I motormouthed for 2 days in a row in each of the 6 cities I visited. But I digress. Here are the times and abstracts.

The first session is the same standard intro I mentioned above. The title lacks the “with Windows Azure” but – trust me – that’s what I’ll use whenever I need the cloud.

Identity and Access Control in the Cloud

Friday the 29th

09:15-10:30

If you don’t yet know what claims-based identity is, it’s time to get busy. Signing users in and granting them access is a core function of almost every cloud-based application, and claims-based identity is the best way to take care of that. In this session we will show you how to simplify your user experience by enabling users to sign in with an existing account such as a Windows Live ID, Google, Yahoo, Facebook, or on-premises Active Directory account, implement access control, and make secure connections between applications. You will learn how the AppFabric Access Control Service, Windows Identity Foundation, and Active Directory Federation Services use a claims-based identity architecture to help you to take advantage of the shift toward the cloud while still fully leveraging your on-premises investments.

The second session is a deep dive into ACS. Although ACS plays a very important part in the intro, I never have the time there to really dig into the features and specific scenarios that ACS enables. The idea is that with one session exclusively on ACS I should be able to; but it largely depends on YOU: the number of features and advanced scenarios I’ll be able to cover depends on how much need there is for introductory content, and I’ll gauge that by observing how you react as we go along.

Windows Azure AppFabric Access Control Service: Deep Dive

Friday the 29th

13:15-14:30

The Windows Azure AppFabric Access Control Service (ACS) offers you a great way of offloading authentication and access control woes away from your applications, giving you more time and resources to focus on your business goals.
In this session we will take a detailed look at how ACS can help you to broker authentication between your application and many different providers; how you can use ACS to address browser-based, rich client, Web site to Web site and mobile application scenarios; and how you can take advantage of the rich management OData API to seamlessly integrate the ACS capabilities in your own solutions.
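
For the curious, the management API the abstract refers to is a plain OData service guarded by a WRAP token. The rough sketch below reflects my reading of the ACS 2.0 documentation; treat the exact URLs, the “ManagementClient” identity name, and the response parsing as assumptions to verify rather than a definitive recipe:

using System;
using System.Collections.Specialized;
using System.Net;
using System.Text;

public static class AcsManagementSketch
{
    const string Ns = "yournamespace"; // hypothetical ACS namespace

    public static string ListIdentityProviders(string managementKey)
    {
        string baseAddress = "https://" + Ns + ".accesscontrol.windows.net";

        // 1. Trade the ManagementClient password for a WRAP access token.
        string token;
        using (var web = new WebClient())
        {
            var form = new NameValueCollection
            {
                { "wrap_name", "ManagementClient" },
                { "wrap_password", managementKey },
                { "wrap_scope", baseAddress + "/v2/mgmt/service/" }
            };
            string body = Encoding.UTF8.GetString(
                web.UploadValues(baseAddress + "/WRAPv0.9/", form));
            // Expected shape: wrap_access_token=...&wrap_access_token_expires_in=...
            token = Uri.UnescapeDataString(body.Split('&')[0].Split('=')[1]);
        }

        // 2. Query the management OData feed, passing the token in the header.
        using (var web = new WebClient())
        {
            web.Headers[HttpRequestHeader.Authorization] =
                "WRAP access_token=\"" + token + "\"";
            return web.DownloadString(baseAddress + "/v2/mgmt/service/IdentityProviders");
        }
    }
}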

The last session is the SaaS one I described for Belgium. A tough topic for the end of the day, but a very, very important one.

Developing SaaS Solutions with the Windows Azure Platform

Friday the 29th

16:15-17:30

Come to this session to learn about how to take advantage of the Windows Azure platform for developing and running your subscription based applications. Discover, through concrete examples, how to onboard customers, dynamically provision application instances, handle single sign-on, and offer self-service authorization. Explore the patterns and the tradeoffs you need to consider in order to meet the needs of a wide variety of customers, all the while maintaining control over your resources and the way you run your business.

That’s it! Looking forward to meeting you in Antwerp & Amsterdam… now I’d better start working on the decks.


Jonathan Rozenblit reported on 4/17/2011 AzureFest Meets the East Coast at Moncton, Halifax and Fredericton, New Brunswick and Nova Scotia on 5/6 through 5/8/2011:

East Coast, after many hours of logistics discussions and preparations, AzureFest is coming your way! If you haven’t yet heard of AzureFest, check out this post, where it’s described in full.

Remember, AzureFest is a hands-on event. This means that you’ll be following along on your own laptop and actually deploying your solution during the event. To get the most out of the experience, make sure to bring your laptop, its power cable, and a credit card. Don’t worry, nothing will be charged to your credit card during AzureFest; it’s just required for activating your Windows Azure account.

Here’s the information for the cities on the AzureFest East Coast Tour. If you want to see for yourself how easy it is to move your existing application to the cloud, this is an event you don’t want to miss. Register early as space is limited.

Moncton
Greater Moncton Chamber of Commerce Board Room – First Floor
1273 Main Street, Suite 200, Moncton, NB
Friday, May 6, 2011 6:00 PM - 10:00 PM
Click here to register

Presenter: Cory Fowler (@SyntaxC4)

Fredericton
UNB Campus
Room 317, ITC
Saturday, May 7, 2011 9:00 AM – 12:00 PM
Click here to register

Presenter: Cory Fowler (@SyntaxC4)

Halifax
The Hub
1673 Barrington St., 2nd Floor, Halifax, NS
Sunday, May 8, 2011 1:30 PM – 4:30 PM
Click here to register

Presenter: Cory Fowler (@SyntaxC4)

We’re always looking to hear your thoughts and suggestions on the things that we do. If you have any feedback, we’d really appreciate it if you would share it with us here. We’re starting to think of the next wave of AzureFests – what would you like to see us cover in the next hands-on event? Post any and all suggestions you may have here. We’ll take everyone’s input and design AzureFest 2.0 accordingly. Looking forward to hearing from you.

Moncton, Halifax, and Fredericton developers, enjoy AzureFest!


<Return to section navigation list> 

Other Cloud Computing Platforms and Services

Alex Williams asked Cloud Poll: How Disruptive Is Cloud Foundry to the Platform Market? on 4/16/2011:

VMware surprised a lot of people this past week with the unveiling of Cloud Foundry. The project had been shrouded in secrecy, and the announcement left competitors a bit flat-footed as they took in the news.

It’s a significant development. Cloud Foundry has the potential to disrupt the platform market. Why? If nothing else, the price is right. You can download the software and set up your own cloud platform if you like. Plus, there’s no lock-in like you see with most platforms.

CloudBzz frames the issue this way:

But PaaS before this week meant lock-in. Developers, and the people who pay them, don’t like to be locked into specific vendor solutions. If you write for Azure, the fear (warranted or not) is that you can only run on Azure. Given that Microsoft has totally fumbled the opportunity to make Azure a partner-centric platform play, that means you need to run your Azure apps on Microsoft’s cloud. Force.com is even worse - with its own language, data model, etc. there’s not even the chance that you can run your code elsewhere without major rework. Force.com got traction primarily from people building extensions to Salesforce’s SFA and CRM offerings - though some people did do more with it. VMforce (Spring on Force.com) was supposed to change the openness issue by providing a framework for any Java apps to run. Google AppEngine is also proprietary in many respects, and when it launched with just a single language (Python!), a lot of developers shrugged. Even the proprietary PaaS components of AWS have been a problem. I could not get my developers to use SimpleDB back in 2008 because, as they rightly pointed out, we’d be stuck if we wanted to move off of EC2 at some point.

It’s fair to say that many believe VMware has created something that is open and viable for the enterprise. It allows for the flexibility that developers demand but has the controls in place to satisfy the enterprise gatekeepers.

What does Cloud Foundry represent? Do you think it will be a disruptive force in the market?

How Disruptive Is Cloud Foundry to the Platform Market?



<Return to section navigation list> 
