Wednesday, February 06, 2013

Windows Azure and Cloud Computing Posts for 2/4/2013+

A compendium of Windows Azure, Service Bus, EAI & EDI, Access Control, Connect, SQL Azure Database, and other cloud-computing articles.

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:


Azure Blob, Drive, Table, Queue, HDInsight and Media Services

Carl Nolan described Submitting Hadoop MapReduce Jobs using PowerShell in a 2/5/2013 post:

As always here is a link to the “Generics based Framework for .Net Hadoop MapReduce Job Submission” code.

In all the samples I have shown so far I have always used the command-line console. However, this does not need to be the case; PowerShell can be used. The console application used to submit the MapReduce jobs calls a .Net Submission API. As such, one can call the .Net API directly from within PowerShell, as I will now demonstrate.

The key types one needs to be concerned with are:

  • MSDN.Hadoop.Submission.Api.SubmissionContext – The type containing the job submission options
  • MSDN.Hadoop.Submission.Api.SubmissionApi – The type used for submitting the job

To use the .Net API one firstly has to create the two required objects:

$SubmitterApi = $BasePath + "\Release\MSDN.Hadoop.Submission.Api.dll"
Add-Type -Path $SubmitterApi
$context = New-Object -TypeName MSDN.Hadoop.Submission.Api.SubmissionContext
$submitter = New-Object -TypeName MSDN.Hadoop.Submission.Api.SubmissionApi

After this one just has to define the context with the necessary job submission properties:

[string[]]$inputs = @("mobile/data")
[string[]]$files = @($BasePath + "\Sample\MSDN.Hadoop.MapReduceCSharp.dll")

$config = New-Object 'Tuple[string,string]'("DictionaryCapacity", "1000")
$configs = @($config)

$context.InputPaths = $inputs
$context.OutputPath = "mobile/querytimes"
$context.MapperType = "MSDN.Hadoop.MapReduceCSharp.MobilePhoneRangeMapper, MSDN.Hadoop.MapReduceCSharp"
$context.ReducerType = "MSDN.Hadoop.MapReduceCSharp.MobilePhoneRangeReducer, MSDN.Hadoop.MapReduceCSharp"
$context.Files = $files
$context.ExeConfigurations = $configs

One just has to remember that the input and files specifications are defined as string arrays.

In a recent build I added support for adding user-defined key-value pairs to the application configuration file. The ExeConfigurations property expects an array of Tuple<string, string> types, hence the object definition for the $config value.

Optionally one can also set the Data and Output format types:

$context.DataFormat = [MSDN.Hadoop.Submission.Api.DataFormat]::Text
$context.OutputFormat = [MSDN.Hadoop.Submission.Api.OutputFormat]::Text

However, this is not necessary if one is using the default Text values.

Once the context has been defined one just has to run the job:

$submitter.RunContext($context)

To call the PowerShell script from the Hadoop command line one can use:

powershell -ExecutionPolicy unrestricted -File %BASEPATH%\SampleScripts\hadoopcstextrangesubmit.ps1

All in all a simple process.


Haddy El-Haggan (@hhaggan) listed Windows Azure Table Storage, all the Predefined Functions (.Net) in a 1/4/2013 post:

Following the last 2 posts of the Live in a Cloudy World Paper, the blob storage and the queue storage, this document is for the table storage. It contains all the necessary explanation to understand the Windows Azure Table Storage and its predefined functions.

Waiting for your feedback.

 


Haddy El-Haggan (@hhaggan) listed Queue Storage all the predefined Functions (.Net) in a 1/4/2013 post:

Following a previous blog post on how to develop on Windows Azure, Microsoft’s cloud computing platform, I have written a document that I hope might help you with your development with all the predefined functions of the blob storage. This type of storage is most likely used for storing unstructured data on the cloud. This is mainly all about the Microsoft.WindowsAzure.StorageClient.

Hope you like them, waiting for your feedback



<Return to section navigation list>

Windows Azure SQL Database, Federations and Reporting, Mobile Services

The SQL Server Team (@SQLServer) announced the availability of a Windows Azure SQL Database and SQL Server -- Performance and Scalability Compared and Contrasted white paper on 2/5/2013:

Curious about the differences in approaching performance and scalability in SQL Server vs. Windows Azure SQL Database (formerly SQL Azure)? Check out this new paper that brings together insights from the Microsoft SQL Engineering and Customer Advisory Teams (CAT) to detail the differences between on-premises SQL Server and Azure SQL Database tuning and monitoring techniques and best practices for database performance testing. If you’re an on-premises SQL Server whiz and currently integrating the cloud or considering a move to Platform as a Service, this paper is a must read!

Paper Overview

While SQL Server and Windows Azure SQL Databases have large and important similarities, they are not identical, and while the differences are relatively small, they affect the way that applications perform on SQL Database compared to SQL Server. As a result, the application architecture and performance evaluation techniques for each platform also differ.

This document explains these performance differences and their causes and includes real-world customer wisdom from experience troubleshooting performance on production customer SQL Databases. This document also examines common SQL Server performance evaluation techniques that do not work on SQL Database.

Read the full paper.


Nick Harris (@cloudnick) described Building Mobile Apps with Windows Azure Content from BUILD 2012 on 2/4/2013:

//BUILD 2012 was an awesome event! This post is a little late, but the content is still extremely relevant if you are building Connected Mobile Apps. Here are a couple of sessions you should watch:

Keynote Demo of Windows Azure Mobile Services

During //BUILD 2012 I was fortunate enough to be on point for delivering the day 2 Mobile Services keynote demo app Event Buddy. If you have not watched this keynote demo I would recommend you check it out – the Mobile Services demo starts about 10 minutes 30 seconds in.

Direct Video link on Channel9. Event Buddy is now also available as a Code Sample that you can download here.

Developing Mobile Solutions on Windows Azure – Part I

Watch Josh take a Windows Phone 8 + Windows Store application and light it up with cloud services to handle data, authentication and push notifications – right before your eyes with Windows Azure Mobile Services. Almost all demo and no slides, this session is designed to take you from zero to Mobile Services here in 60 minutes.

You can also watch this video directly on Channel 9 here.

Developing Mobile Solutions on Windows Azure – Part II

In addition to this I also presented with fellow baldy – Chris Risner. In this session we took the output of Part I from Josh and demonstrated how you could extend your existing applications to support common scenarios such as geo-location, media, and cloud to device messaging using services from Windows Azure. Here is a summary of the content of this presentation that I humbly grabbed directly from a prior post by Chris:

  • We took pictures and uploaded them to Blob Storage. For this we used a web service layer, running in Windows Azure Websites, to get a SAS (Shared Access Signature) which allowed us to securely upload to blob storage (a rough sketch of this step follows the list).
  • We then got the location of the device and used that information to geo-tag the pictures we just uploaded.
  • We added a web page to our web service which allowed a user to select a geographical area on a map and request a push notification be sent to anyone that had taken a picture inside of it (this used the Bing Maps API and Queues from Windows Azure Storage)
  • We deployed a worker role to Windows Azure Cloud Services which would check the queue and then figure out who should be notified (using Entity Framework’s geospatial support) and sent out the actual push notifications to both the Windows 8 and Windows Phone 8 clients.
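To make the first bullet above a bit more concrete, here is a minimal sketch of that upload step. The service URL, container and blob names below are hypothetical placeholders (this is not the actual service from the session); the idea is simply that the client asks a web service for a SAS URI and then PUTs the photo bytes straight to blob storage, so the storage account key never ships with the client:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public static class PhotoUploader
{
    // Uploads a photo to blob storage using a SAS URI obtained from a web service layer.
    public static async Task UploadPhotoAsync(byte[] photoBytes)
    {
        using (var httpClient = new HttpClient())
        {
            // Hypothetical endpoint that returns a SAS URI for a new blob in the "photos" container.
            string sasUri = await httpClient.GetStringAsync(
                "https://myphotoservice.azurewebsites.net/api/sas?container=photos&blob=photo1.jpg");

            // PUT the photo bytes directly to blob storage; the SAS query string authorizes the write.
            var request = new HttpRequestMessage(HttpMethod.Put, sasUri)
            {
                Content = new ByteArrayContent(photoBytes)
            };
            request.Content.Headers.ContentType = new MediaTypeHeaderValue("image/jpeg");
            request.Headers.Add("x-ms-blob-type", "BlockBlob"); // required by the Put Blob REST operation

            var response = await httpClient.SendAsync(request);
            response.EnsureSuccessStatusCode();
        }
    }
}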

Here is the direct link to watch on Channel 9 and Chris has also made the code available here


Carlos Figueira (@carlos_figueira) explained Enabling Single Sign-On for Windows 8 Azure Mobile Apps in a 2/4/2013 post:

One of the common complaints about the authentication support in the Windows 8 SDK for Azure Mobile Services is that regardless of whether the user selects the “save password” / “remember me” checkbox when entering their credentials, every time the application starts and the user needs to authenticate, they have to enter their credentials all over again. Until now, the only way to enable single sign-on (SSO) for Win8 apps was to use the Live SDK (or some native SDK from the other providers, such as Facebook or Google) and then only authenticate with Azure Mobile Services with the token received from that SDK.

We recently released an update of the Azure Mobile Services client SDK for Windows 8 (along with an update to the service runtime), and this feature is now available. To enable single sign-on, we need to use a new overload of the LoginAsync method (or of login in the JavaScript client), and we also need to set the package SID of the Windows 8 application from the store. Let’s see how I can change my application to use this new feature. This is my current app. It currently uses Facebook authentication (a simple app which gets data from the graph API about the logged-in user), but if I were using the other three authentication providers the steps to enable SSO would be the same.


The code for the application (in one method) is the following:

private async void btnGetData_Click_1(object sender, RoutedEventArgs e)
{
    try
    {
        var user = await MobileService.LoginAsync(MobileServiceAuthenticationProvider.Facebook);
        AddToDebug("Logged in as {0}", user.UserId);
        var table = MobileService.GetTable("UserIdentity");
        var item = new JsonObject();
        await table.InsertAsync(item);
        AddToDebug("Item: {0}", item.Stringify());
        var fbAccessCode = item["identities"].GetObject()["facebook"].GetObject()["accessToken"].GetString();
        var httpClient = new HttpClient();
        var resp = await httpClient.GetAsync("https://graph.facebook.com/me?access_token=" + fbAccessCode);
        var body = await resp.Content.ReadAsStringAsync();
        JsonObject bodyJson = JsonObject.Parse(body);
        foreach (var key in bodyJson.Keys)
        {
            AddToDebug("{0}: {1}", key, bodyJson[key].Stringify());
        }
    }
    catch (Exception ex)
    {
        AddToDebug("Error: {0}", ex);
    }
}

First, the change in the client, which is the simple one: just use the new overload of the LoginAsync method – see the change below. Everything else remains exactly the same.

var user = await MobileService.LoginAsync(MobileServiceAuthenticationProvider.Facebook, true);
AddToDebug("Logged in as {0}", user.UserId);
var table = MobileService.GetTable("UserIdentity");

Now, for the required server change. To enable this feature, we actually need to have a package ID for the application in the Windows 8 store. If you still haven’t created an application on the Windows Store, you can follow the instructions in the tutorial for enabling push notifications. So far I hadn’t needed to do that, although once my app was ready for publishing to the store I’d have had to do it anyway. So let me create one now:

[Screenshot: creating a new application in the Windows Store dashboard]

After the application is created (I’ll call mine “blogpost20130204”) we can associate the VS project with the application in the store:

[Screenshot: associating the Visual Studio project with the Store application]

Next, we need to find the package SID for the store application. I actually find the dashboard for the Live Connect Developer Center easier to navigate than the Windows Store dashboard, so I’ll go from there:

[Screenshot: the Live Connect Developer Center dashboard]

After selecting the application, select “Edit Settings” and “API Settings” and it should show the package SID for my app:

[Screenshot: the package SID shown under API Settings]

Now that we have that value, we need to set it on the Azure Mobile Services portal for our application. Currently the only place where we can set it is on the “PUSH” tab, so that’s where we go, even though we’ll use it for authentication, and not push (in the future I’m sure there will be an update to the portal and that will be in its proper place).

[Screenshot: the PUSH tab in the Windows Azure Mobile Services portal]

Ok, now we have both the client and the server changes made. And that should be it. The username / password information should now be remembered by the client application!

More info / known issues

The old behavior and the new behavior are a byproduct of our usage of the Web Authentication Broker control in our Windows 8 SDK. Over the next couple of days I’ll publish a post with the internal details of our authentication, and how it relates to that control.

There’s currently one issue that we’re aware of: if you try to use both the Live SDK to authenticate the user to the app (a.k.a. client-side auth flow) and the single sign-on for Microsoft accounts (a.k.a. server-side auth flow), then the server-side flow won’t work. Notice that this isn’t something that one would normally do, as both do the same thing. Just something to be aware of.



<Return to section navigation list>

Marketplace DataMarket, Cloud Numerics, Big Data and OData

No significant articles today


<Return to section navigation list>

Windows Azure Service Bus, Caching, Access Control, Active Directory, Identity and Workflow

Alex Simons explained Simple, responsive sign-in to Microsoft services driven by Windows Azure AD in a 2/5/2013 post:

This winter, the Windows Azure AD team will launch a new sign-in experience across Office 365, Windows Azure and other Microsoft services used by organizations around the world. Our redesign goals were to create a simple experience that’s optimized for modern devices, reduces the number of times users need to sign in, and provides the best possible experience across the many devices that you want to use.

Whether we’re accessing our email, collaborating on SharePoint or managing our services on Azure, it usually starts by signing in. Let’s check it out.

Responsive Design

The new sign-in experience automatically adapts to the screen resolution and capabilities of different devices, OSes and browsers. It looks fantastic on recent browsers, tablets and mobile devices. It’s optimized for touch and also works great with mouse or keyboard. Regardless of the device you access from, your sign-in experience is consistent and predictable.

Latency Optimizations

We’ve optimized the way the page is structured, deferring the download of illustrations and “prefetching” partners’ content. This allows users to start signing in almost immediately, while the rest of the content downloads in the background. On smaller screens, such as on a smartphone, the large illustration is usually not downloaded, saving time and bandwidth.

Enhanced Federated Sign-In

Many large organizations enable on-premises-to-cloud Single Sign On by federating their on-premises identity infrastructure, such as Windows Server Active Directory and AD FS, to Windows Azure AD. The new sign-in UX greatly enhances the experience for employees of these organizations.

And because Windows Azure AD works seamlessly across services, you can easily navigate between Office 365, Windows Intune, the Windows Azure Management Portal or any 1st or 3rd party service that uses Windows Azure AD without having to sign in a second time. For more information about federating, please see Alex Simons’ previous post.

Simplified subsequent sign-ins

We’ve made it simpler for users to remain signed in and save time on subsequent sign-ins. Checking “Keep me signed in” will keep you connected until you explicitly sign out, and allows a user tile to be shown on the sign-in page on subsequent sign-ins. Clicking “forget this account” removes the tile.

Preview

During this preview period, the old sign-in UX is shown by default but you can opt-in to try out the new one. The preview UX is reserved for testing and is not supported as a production service.

To turn on the new UX, please visit http://login.microsoftonline.com/optin.srf and click the opt-in button. You’ll need to visit this page weekly and opt-in from each browser and device where you want to see the new UX. To turn it off, simply visit this page again and click the blue button.

Tell us what you think!

You can provide feedback on the new sign-in experience and report issues on the Windows Azure AD forum.


Haishi Bai (@HaishiBai2010) described New features in Service Bus Preview Library (January 2013) – Epilogue: Message Browse in a 2/4/2013 post:

[This post series is based on preview features that are subject to change]

This is an epilogue of my 3-part blog series on the recently released Service Bus preview features.

Why an epilogue?

I’ve missed another feature in the preview library – message browse, which allows you to iterate through the messages without blocking any receivers. But, before I get to that, I have to clarify one thing regarding what I said in the Message Pump post. When you call the Client.Receive() method, you can specify a timeout of any duration to wait for a message to arrive. This is not long-polling, but it does allow you to specify a timeout period before the method returns. So, the price comparison in the original post isn’t that relevant after all. However, the gist of that post was never about price reduction, but about how Message Pump enables an event-driven programming model, which is really natural for client applications, especially rich clients. With that said, let’s move on to the main topic: Message Browse.

Message Browse

Message browse can be useful in scenarios where you need to provide monitoring or tracing capabilities without affecting existing clients. For example, you can examine the top 10 messages in the dead-letter queue. Or, you can provide live monitoring of pending jobs (such as whether a job related to a specific customer has appeared on the queue). Here I’ll just present a very simple Console application as an example:

static void Main(string[] args)
{
    var conString = "[SB Connection String]";
    var queueName = "workqueue";
    QueueClient sender = QueueClient.CreateFromConnectionString(conString, queueName);
    for (int i = 0; i < 10; i++)
        sender.Send(new BrokeredMessage(string.Format("Message {0}", i+1)));
    QueueClient receiver = QueueClient.CreateFromConnectionString(conString, queueName, 
        ReceiveMode.ReceiveAndDelete);

    BrokeredMessage message = null;
    do
    {
        message = receiver.Peek();
        if (message != null)
            Console.WriteLine("Message found: " + message.GetBody<string>());
    } while (message != null);

    Console.ReadLine();

    message = null;

    do
    {
        message = receiver.Receive();
        if (message != null)
            Console.WriteLine("Message received: " + message.GetBody<string>());
    } while (message != null);

    Console.ReadLine();
}

The above code sends 10 messages to a queue, browses the messages by using the Peek() method (highlighted yellow), and then retrieves the messages using the Receive() method (highlighted green). Because the Peek() method doesn’t remove any messages from the queue, Receive() calls are not affected afterwards and all the messages can be received.
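The same Peek() method also works against a queue’s dead-letter sub-queue, which is what the “examine the top 10 messages in the dead-letter queue” scenario mentioned earlier would look like. Here is a minimal sketch that reuses the conString variable from the sample above (the queue name is the sample’s workqueue):

string deadLetterPath = QueueClient.FormatDeadLetterPath("workqueue");
QueueClient deadLetterClient = QueueClient.CreateFromConnectionString(conString, deadLetterPath);

for (int i = 0; i < 10; i++)
{
    // Peek() advances an internal cursor but removes nothing, so normal receivers are unaffected.
    BrokeredMessage deadLettered = deadLetterClient.Peek();
    if (deadLettered == null)
        break;
    Console.WriteLine("Dead-lettered: " + deadLettered.GetBody<string>());
}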

Send us your feedback!

I hope you’ve learned some useful information from this blog series. And I know the product team is eager to hear from you. Are the features useful in your scenarios? What are we missing? Where do we need to improve? The whole idea of the preview library is to get your feedback so that we can make the final product better. So, if you have any feedback, either positive or negative, please don’t hesitate to send it my way, and I’ll make sure it is heard by the team. You can either leave comments on this blog, or send tweets to @HaishiBai2010.



<Return to section navigation list>

Windows Azure Virtual Machines, Virtual Networks, Web Sites, Connect, RDP and CDN

Craig Kitterman (@craigkitterman) posted Windows Azure and VM Depot – Better Together to the Windows Azure blog on 2/4/2013:

It has been just a few weeks since Microsoft Open Technologies, Inc. announced the availability of VM Depot - the community-driven catalog of open source virtual machine images. Since then, the community has been participating in this new and open approach by using and publishing virtual machine images through VM Depot. VM Depot launch partners, including Alt Linux, Basho, Bitnami and Hupstream have contributed a number of packages and the number of images published on the site has nearly tripled with new images being uploaded every day.

Today I would like to share with you that VM Depot is now integrated into the Windows Azure management portal to make using these images even easier. Users can now browse for open source stacks provided by the community - based on supported Linux distributions - and provision them as their personal images directly within the Windows Azure portal.

When you select the "Virtual Machines" tab, you will now see an option to "BROWSE VMDEPOT":

After clicking "BROWSE VMDEPOT" you are presented with a full list of available images, with the ability to filter by OS distro:


Please join the community and help us make VM Depot the best place for open source communities to work together and build shared images for the cloud. You can also get timely alerts on new images uploaded on Twitter @vmdepot. See you there!


Brady Gaster (@bradygaster) described nopCommerce and Windows Azure Web Sites in a 1/5/2013 post:

This week we announced support for nopCommerce in the Windows Azure Web Sites application gallery. Using the Windows Azure portal, and requiring zero lines of code, you can set up nopCommerce on Web Sites and get your online store up in minutes. You’ll have your very own products database, shopping cart, order history – the works. On the nopCommerce web site you can learn a lot more about the features nopCommerce offers. In this blog post, I’ll show you how to get your own store up and running on Windows Azure Web Sites.

I walked through this process today. As with other entries in the Web Sites application gallery, you really do have to do very little digging to figure out where to go to get started. It’s pretty much “start, new site, from gallery,” as you’ll see from the picture below. It shows you exactly which menu item to click in the Windows Azure portal to get started.


The next step should be pretty self-explanatory. Select nopCommerce from the list of available applications.


The next step will ask you for some database connection information. This is what will be set in your nopCommerce installation once the process is complete. I’m going to create a new database solely for use with my nopCommerce site I’m creating for this demonstration.


The next screen is one that appears in a few other application gallery entries, too. It’s the “where do you want to store your data today?” screen. I’m creating a new SQL Server database in this screen so I need to provide the database name, specify that a new server should be created, and provide the database username and password. Don’t bother writing this down, there’s an app screen for that later in this post.


Once I click OK here, my site is created. First, the portal tells me it’s creating the site:


Once the site is up and running, Windows Azure lets me know:


If I select my nopCommerce site and click the “Browse” button in the Windows Azure portal, the site will open up in a new browser instance and let me specify the database connection string it’ll use.


Now, I’ll go back to the Windows Azure portal’s dashboard for my nopCommerce demo site. In that dashboard page I’ll click the link labeled “View connection strings,” and a dialog will open. In that dialog I’ll see the connection string for my database. I can copy that from the dialog…


… and paste it into the nopCommerce setup window.


Of course, I’ve blocked out my site’s real connection string in this picture, but the idea is – it doesn’t get much easier. Once I click the “Install” button in the nopCommerce setup page, the site and database schema, as well as some sample data points, will be installed automatically and the site configured to access the database. Once the setup process is complete, I’ll be redirected to my very own store site.


In the navigation bar I’ll click on the “My Account” link, login, and then, at the very tip-top of my browser I’ll see a link to get to the Administration panel of my new nopCommerce store site.


The administration portal for the nopCommerce product promises to give me just about everything I’d need to sell some stuff, know how my sales are doing, and so on. I can pretty much do whatever I need to do using their rich, extensive administration functionality.


If you’ve been thinking of setting up a store, with a shopping cart, online, or you’ve been asked to do so and are more interested in getting it up and running quickly than you are with re-inventing the wheel by writing custom code, check out nopCommerce. Get your free Windows Azure trial – which comes with 10 free web sites – right here, then set up your own nopCommerce site and have your products selling in your store.



<Return to section navigation list>

Live Windows Azure Apps, APIs, Tools and Test Harnesses

Gaurav Mantri (@gmantri) posted a Workaround for IIS Express Crashing When Running Windows Azure Cloud Service Web Role with Multiple Instances in Windows Azure SDK 1.8 Compute Emulator on 2/4/2013:

I ran into this weird issue when developing a very simple Windows Azure Cloud Service Web Role using Windows Azure SDK 1.8. I started off with a basic ASP.Net MVC 4 Web Role with no changes whatsoever. When I tried to debug it with just one instance running, everything worked great. The moment I increased the number of instances to 2 (or more than 1) and tried to debug it, IIS Express crashed with the following entries in event logs:


Faulting application name: iisexpress.exe, version: 8.0.8418.0, time stamp: 0x4fbae3d6
Faulting module name: ntdll.dll, version: 6.2.9200.16420, time stamp: 0x505ab405
Exception code: 0xc0000008
Fault offset: 0x0000000000004c39
Faulting process id: 0x54bc
Faulting application start time: 0x01cdfeb70c1e060e
Faulting application path: C:\Program Files\IIS Express\iisexpress.exe
Faulting module path: C:\Windows\SYSTEM32\ntdll.dll
Report Id: 4a8f54b2-6aaa-11e2-be81-9cb70d025a13
Faulting package full name:
Faulting package-relative application ID:


Fault bucket 79240842, type 4
Event Name: APPCRASH
Response: Not available
Cab Id: 0

Problem signature:
P1: iisexpress.exe
P2: 8.0.8418.0
P3: 4fbae3d6
P4: ntdll.dll
P5: 6.2.9200.16420
P6: 505ab405
P7: c0000008
P8: 0000000000004c39
P9:
P10:

Attached files:
C:\Users\Gaurav.Mantri\AppData\Local\Temp\WER4720.tmp.WERInternalMetadata.xml

These files may be available here:
C:\ProgramData\Microsoft\Windows\WER\ReportArchive\AppCrash_iisexpress.exe_1263888cd189f3357758cdc998d7afecf3f28_2dae4eb2

Analysis symbol:
Rechecking for solution: 0
Report Id: 4a8f54b2-6aaa-11e2-be81-9cb70d025a13
Report Status: 17
Hashed bucket: b615b24f7933990317ae65713d680dab


This issue does not happen on all machines. I asked a few of my friends to try on their computers and some of them were able to recreate this problem while others couldn’t. My development environment was Windows 8, Visual Studio 2012 and Windows Azure SDK 1.8.

Since I was facing this problem and a few of my friends were also facing the same problem, I thought there may be many more folks facing it as well, and hence this blog post.

The Windows Azure product team helped me find the cause of the problem and gave me a workaround that fixed it. I’m also told that this issue will be fixed in a coming version of the SDK, but until the new SDK arrives, the workaround recommended by the team can be used to address this problem.

Cause

From what I have been told, this is related to an issue with Windows Azure SDK 1.8, and it happens because IIS Express is trying to bind the same address and port to multiple instances. Here are the entries from the applicationHost.config file for the 2 instances in which I was trying to run my application, which led the product team to that conclusion:

Instance 0:

<site name="deployment18(110).WindowsAzureService3.WindowsAzureService3WebRole_IN_0_Web" id="1273337584"> 
                <application path="/" applicationPool="b31dc554-ad4b-4a18-8a62-098b5079f26d" enabledProtocols="http"> 
                    <virtualDirectory path="/" physicalPath="D:\Projects\WindowsAzureService3\WindowsAzureService3WebRole\" /> 
                </application> 
                <bindings> 
                    <binding protocol="http" bindingInformation="*:82:" /> 
                </bindings> 
                <logFile logExtFileFlags="Date, Time, ClientIP, UserName, SiteName, ComputerName, ServerIP, Method, UriStem, UriQuery, HttpStatus, Win32Status, BytesSent, BytesRecv, TimeTaken, ServerPort, UserAgent, Cookie, Referer, ProtocolVersion, Host, HttpSubStatus" logFormat="W3C" directory="C:\Users\Gaurav.Mantri\AppData\Local\dftmp\Resources\40fae03f-8519-4c66-a36b-da8e2de73e9d\directory\DiagnosticStore\LogFiles\Web" period="Hourly" /> 
                <traceFailedRequestsLogging enabled="true" directory="C:\Users\Gaurav.Mantri\AppData\Local\dftmp\Resources\40fae03f-8519-4c66-a36b-da8e2de73e9d\directory\DiagnosticStore\FailedReqLogFiles\Web" maxLogFiles="1000" /> 
            </site> 

Instance 1:

<site name="deployment18(110).WindowsAzureService3.WindowsAzureService3WebRole_IN_1_Web" id="1273337584"> 
                <application path="/" applicationPool="92214b82-0516-4502-8e38-4d94c3cdf949" enabledProtocols="http"> 
                    <virtualDirectory path="/" physicalPath="D:\Projects\WindowsAzureService3\WindowsAzureService3WebRole\" /> 
                </application> 
                <bindings> 
                    <binding protocol="http" bindingInformation="*:82:" /> 
                </bindings> 
                <logFile logExtFileFlags="Date, Time, ClientIP, UserName, SiteName, ComputerName, ServerIP, Method, UriStem, UriQuery, HttpStatus, Win32Status, BytesSent, BytesRecv, TimeTaken, ServerPort, UserAgent, Cookie, Referer, ProtocolVersion, Host, HttpSubStatus" logFormat="W3C" directory="C:\Users\Gaurav.Mantri\AppData\Local\dftmp\Resources\3c7c31cb-a2fb-4809-aadb-9238acb2f092\directory\DiagnosticStore\LogFiles\Web" period="Hourly" /> 
                <traceFailedRequestsLogging enabled="true" directory="C:\Users\Gaurav.Mantri\AppData\Local\dftmp\Resources\3c7c31cb-a2fb-4809-aadb-9238acb2f092\directory\DiagnosticStore\FailedReqLogFiles\Web" maxLogFiles="1000" /> 
            </site>

As you can see from above in the “binding” section, both instances were trying to bind to port 82 and hence the error.

Workaround

There were two workarounds recommended by the Windows Azure SDK team:

Install Visual Studio 2010 SP1

If you’re using Visual Studio 2010, it is recommended that you install SP1 for that. Seemingly that takes care of this problem. Since I don’t have VS 2010 on my machine, I did not try it as it was not applicable to me.

Add an Environment Variable

The other recommended workaround was to add an environment variable called “_CSRUN_DISABLE_WORKAROUNDS” and set its value to “1”. To set an environment variable, search for “Control Panel” and then go to “System” –> “Advanced system settings” –> “Advanced” tab –> “Environment Variables…” button –> “New…” button.

I’ve also added a screenshot for your convenience.


After you do that you would need to restart IIS Express and Compute/Storage Emulator.

I tried this approach and it worked for me.



<Return to section navigation list>

Visual Studio LightSwitch and Entity Framework 4.1+

Kostas Christodoulou (@kchristo71) answered Can I delete this? in a 2/5/2013 post:

In a previous post I suggested a way to override the default LightSwitch add/edit behavior. In this post I will suggest an override of the default delete behavior.

It has happened to me many times: I delete entries, only to get an error message when trying to save, informing me that reference constraints are being violated and the action cannot be completed. This is not much of a problem when you try to delete one record (apart from the fact that this message is ugly and not one that I would allow my end users to deal with, for that matter). But when trying to delete many records you have no clue which record (or records) cannot be deleted. In this case all you can do is delete and save, one record at a time, until you find the record that cannot be deleted. And when you find it, you have to refresh your screen or undo the changes to the current record (a future article will suggest a way to do this).

What I needed was a good generic way to set the Delete_CanExecute result. This is my suggestion using (what else) extension methods, interfaces (Yann will love me for this) and an attribute.

First the interface to help decide if an instance can be deleted or not:

public interface IDependencyCheck : IEntityObject
{
  bool CanDelete { get; }
}

Quite simple, you must admit. Keep in mind that interfaces you want to use in your common project have to be accessible by both the Server and Client projects. If anyone here is thinking of EntityName_CanDelete, remember that it is a server-side hook.

Now an attribute to help us decide which of the dependencies an entity might have are strong ones, meaning ones for which the referential integrity of the database (or your business logic, if you don’t use database constraints) would not allow an instance to be deleted.

[AttributeUsage(AttributeTargets.Class)]
public class StrongDependencyAttribute : Attribute
{
  public StrongDependencyAttribute(string dependencies) {
    this.dependencies = dependencies.ToStringArray();
  }

  public string[] Dependencies {
    get { return dependencies; }
  }

  private readonly string[] dependencies;
}

This attribute takes a list of strong reference property names as a comma-delimited string (or whatever delimiter you like, for that matter) using these simple extensions:
public static string[] ToStringArray(this string strings) {
   return strings.ToStringArray(',');
}

public static string[] ToStringArray(this string strings, char delimiter) {
   return (from string item in strings.Split(new char[] { delimiter }, StringSplitOptions.RemoveEmptyEntries)
               select item.Trim()).ToArray();
}
If you want to use another delimiter just replace
this.dependencies = dependencies.ToStringArray();
in the constructor with (for example):
this.dependencies = dependencies.ToStringArray(';');

Or if you want, you can have one instance of the attribute for each property you want to check and avoid delimited strings altogether. Plenty of choices….

Keep in mind that if you declare no instance of this attribute but you implement IDependencyCheck then ALL dependencies will be considered as strong ones and checked for integrity.

This attribute needs to be accessible by the Common project only.

Now, all that said, there are two extension methods that will help us do the job.
The first one needs to be accessible by the Client project only:

public static bool CanDeleteSelection(this IScreenObject screen, string collectionName) {
   if (!screen.HasSelection(collectionName))
     return false;
   IVisualCollection collection = screen.Details.Properties[collectionName].Value as IVisualCollection;
   if (collection.SelectedItem is IDependencyCheck)
     return (collection.SelectedItem as IDependencyCheck).CanDelete;
   return true;
}

Please note I am using the IScreenObject.HasSelection extension that was already introduced in a previous post. That’s why I am not checking whether the cast to IVisualCollection is successful (not null).

The second one has to be accessible by the Common project only:

    public static bool CanDelete(this IEntityObject entity) {
      IEnumerable<IEntityCollectionProperty> collectionProperties = 
        entity.Details.Properties.All()
        .Where(p => p.GetType().GetInterfaces()
          .Where(t => t.Name.Equals("IEntityCollectionProperty"))
          .FirstOrDefault() != null)
        .Cast<IEntityCollectionProperty>();
      if (collectionProperties == null)
        return true;
      List<string> strongDependencies = new List<string>();
      IEnumerable<StrongDependencyAttribute> dependencies = 
        entity.GetType()
        .GetCustomAttributes(typeof(StrongDependencyAttribute), false)
        .Cast<StrongDependencyAttribute>();
      foreach (StrongDependencyAttribute dependency in dependencies)
        strongDependencies.AddRange(dependency.Dependencies);
      bool hasDependencies = strongDependencies.FirstOrDefault() != null;
      bool canDelete = true;
      foreach (IEntityCollectionProperty property in collectionProperties) {
        if (hasDependencies && strongDependencies.FirstOrDefault(d => d.Equals(property.Name)) == null)
          continue;
        IEnumerable value = entity.GetType()
          .GetProperty(string.Format("{0}Query", property.Name))
          .GetValue(entity, null) as IEnumerable;
        try {
          if (value != null && value.GetEnumerator().MoveNext()) {
            canDelete = false;
            break;
          }
        }
        catch {
          continue;
        }
      }
      return canDelete;
    }


Although it’s obvious at first glance what the code does, I will give a brief explanation:

If there is not a StrongDependencyAttribute defined for the entity, then all reference properties are checked, and if at least one has members then the entity cannot be deleted. If a StrongDependencyAttribute is defined for the entity, then only reference properties included in the attribute are checked. That’s all…

If you manage to read the code (I am not very proud of the absence of comments) you will notice that only one-to-many and many-to-many references are handled. In my world one-to-one references mean inheritance, and in this case both objects should be deleted. But what if the base object can be deleted (has no direct references) and the derived object does have references? Again in my world, if you are trying to delete the base object you are already doing it wrong! Anyway, if someone lives in a world other than mine (I am a very democratic guy) and wants to support one-to-one relations, all he/she has to do is find where the IEntityCollectionProperty definition is and look for the respective property type (I believe it is IEntityReferenceProperty but I am not quite sure).

And for the end an example so that anyone can see what all of the above end up to:

Suppose you have a Customer entity, and this Customer entity has a collection of Orders. The property of the Customer entity that holds these orders is called CustomerOrders. In your Datasource you right-click the Customer entity and select View Table Code. A partial class implementation file is created (if it does not already exist). Modify the definition of your class as follows:

[StrongDependency("CustomerOrders")]
public partial class Customer : IDependencyCheck {
...

  #region IDependencyCheck members 
  public bool CanDelete {
    get { return this.CanDelete(); }
  }
  #endregion IDependencyCheck members
}

Remember to reference (using) the namespace where your extension method (CanDelete) is declared.

Please note that IDependencyCheck gives you the ability to write whatever other hardcoded (or not) check you want in your CanDelete property implementation. In the code above I just call the extension method I introduced earlier. But you can do whatever you want. You can even skip the suggested dependencies mechanism altogether. The client-side extension will still work.
So, in the screen where you have your list of Customers, right-click the Delete command of the list or grid, and in CustomersDelete_CanExecute just write:

partial void CustomersDelete_CanExecute(ref bool result){
  result = this.CanDeleteSelection("Customers");
}

As partial implementation of Execute write:
partial void CustomersDelete_Execute(){
  this.Customers.DeleteSelected();
}
I know some of you have already noticed the overhead of potentially loading the dependent objects the first time you select an item in your list or grid. I cannot argue with that, except for the fact that my approach is suggested for intranet implementations (I am not sure I would do something like that over the web) and the fact that the time under these circumstances is an acceptable price to pay in order to avoid the annoying referential integrity message. At least in my world.


Julie Lerman (@julielerman) answered What’s Best for Unit Testing in EF? It depends, dude! in a 2/4/2013 post:

This tweet stopped me in my tracks because I couldn’t reply in a tweet:

Stephen Coulson @sdcoulson

        @julielerman In your opinion what is the best solution for unit testing
        #EF dependent code? #MockingFramework? #SQLCE? or other?

It’s a loaded question, so I’ll just answer briefly.

First: it depends! :) It always depends.

Do you mean unit tests or integration tests? A unit test shouldn’t be hitting a database.

FAKING FOR UNIT TESTS

I use fakes to avoid hitting a database when I truly want to do a unit test on code that’s trying to do data access. I (and many others) have written extensively about faking with EF. I have some old blog posts and book chapters that do this with EF4 but not using code first or dbcontext.

In my DbContext book I have some examples of faking with DbContext (ala EF4.1 & EF5).

I have a module in one of my Pluralsight courses on testing with EF (Entity Framework in the Enterprise).
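For readers who haven’t seen the faking pattern before, here is a bare-bones sketch of the general idea (the Order types below are hypothetical and this is not the code from the book or course): the class under test depends on an abstraction, and the unit test supplies an in-memory implementation so EF and the database are never touched.

using System.Collections.Generic;
using System.Linq;

public class Order   // hypothetical entity, just for the sketch
{
    public int Id { get; set; }
    public string CustomerId { get; set; }
}

public interface IOrderRepository
{
    IQueryable<Order> Orders { get; }
    void Add(Order order);
}

// In-memory fake used only by unit tests; the real implementation would wrap a DbContext.
public class FakeOrderRepository : IOrderRepository
{
    private readonly List<Order> _orders = new List<Order>();
    public IQueryable<Order> Orders { get { return _orders.AsQueryable(); } }
    public void Add(Order order) { _orders.Add(order); }
}

A test then news up the fake, hands it to the class under test, and asserts against the in-memory list.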

MOCKING FRAMEWORKS FOR UNIT TEST

I’ve had a hard time finding good current examples of using mocking frameworks with EF. Typemock has one that’s terribly outdated and not useful. I found one person who’s written a little about using Moq with EF.

SQL CE FOR INTEGRATION TESTS

Sure you can use SQL CE. It’s a perfectly good way to go. Here’s a blog post by Eric Hexter to get you started:

Using sql compact for integration tests with entity framework

CODE FIRST DATABASE INITIALIZATION FOR INTEGRATION TESTS

This is what I use. One of the initializers is “DropCreateDatabaseAlways”. I use that combined with some seed data to refresh my database for each integration test. This is one of the methods I show in my Pluralsight course.
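As a rough sketch of what that looks like (OrderContext and Customer are hypothetical types here, and this is my reading of the approach rather than the exact code from the course), you derive an initializer from DropCreateDatabaseAlways<TContext>, override Seed, and force initialization before the tests run:

using System.Data.Entity;

// Drops, recreates and reseeds the database every time it initializes.
public class TestDbInitializer : DropCreateDatabaseAlways<OrderContext>
{
    protected override void Seed(OrderContext context)
    {
        context.Customers.Add(new Customer { Name = "Test Customer" });
        context.SaveChanges();
    }
}

// In the integration-test setup:
Database.SetInitializer(new TestDbInitializer());
using (var context = new OrderContext())
{
    // Force the drop/create/seed to run now, so every test starts from a known state.
    context.Database.Initialize(force: true);
}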

More More More!

There are plenty of other ways to go as well. For example, Lynn Langit has pointed me via twitter to ApprovalTests.

And just watch the comments fill up with other suggestions.

Hope this helps, Stephen! :)


<Return to section navigation list>

Windows Azure Infrastructure and DevOps

Avkash Chauhan (@avkashchauhan) posted Powershell script to detect the Windows Azure SDK in Azure Cloud Service VM and fix the Powershell execution issue on 2/5/2013:

When you try to run PowerShell commands on a Windows Azure Cloud Service (PaaS) VM you may have seen an exception like the one below:

PS D:\Users\avkash> Add-PSSnapin Microsoft.WindowsAzure.ServiceRuntime
Add-PSSnapin : Cannot add Windows PowerShell snap-in Microsoft.WindowsAzure.ServiceRuntime because it is already added.
Verify the name of the snap-in and try again.
At line:1 char:13
+ Add-PSSnapin <<<< Microsoft.WindowsAzure.ServiceRuntime
+ CategoryInfo : InvalidArgument: (Microsoft.WindowsAzure.ServiceRuntime:String) [Add-PSSnapin], PSArgume ntException
+ FullyQualifiedErrorId : AddPSSnapInRead,Microsoft.PowerShell.Commands.AddPSSnapinCommand

I have created a PowerShell script which auto-detects the Windows Azure SDK version in the Azure VM, fixes this problem in the registry, and then sets the role status to Busy, as below:

[xml] $RoleModelXml = Get-Content "E:\RoleModel.xml"
$sdkVersion = $RoleModelXml.RoleModel.Version
Write-Output ("---------------------------------------")
Write-Output("Your SDK Version is: " + $sdkVersion)
Write-Output("---------------------------------------")

Write-Output("")
Write-Output("Now Setting SDK Version in Registry: " + $sdkVersion.substring(0,3))
Write-Output("---------------------------------------")

$finalSDKVersion = $sdkVersion.substring(0,3)
$fixedAssemblyName = "Microsoft.WindowsAzure.ServiceRuntime.Commands, Version=" + $finalSDKVersion + ".0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"

$Registry_Key = "HKLM:\Software\Microsoft\PowerShell\1\PowerShellSnapIns\Microsoft.WindowsAzure.ServiceRuntime\"
Set-ItemProperty -path $Registry_Key -name Version -value $finalSDKVersion
Set-ItemProperty -path $Registry_Key -name PowerShellVersion -value 2.0
Set-ItemProperty -path $Registry_Key -name AssemblyName -value $fixedAssemblyName


#Validate the registry
Write-Output("*** --- Validating the registry change *** ----")
Get-ItemProperty -path $Registry_Key -name Version
Get-ItemProperty -path $Registry_Key -name PowerShellVersion
Get-ItemProperty -path $Registry_Key -name AssemblyName


$AzureSnapIn = "Microsoft.WindowsAzure.ServiceRuntime"

if ( (Get-PSSnapin -Name $AzureSnapIn -ErrorAction SilentlyContinue) -eq $null )
{

Add-PsSnapin $AzureSnapIn
}
else
{
$message = "PsSnapin " + $AzureSnapIn + " already loaded!!"
Write-Output($message);
}


#Settings the Role busy
Write-Output("*** --- Setting Role Busy *** ----")
Set-RoleInstanceStatus -Busy

<Return to section navigation list>

Windows Azure Platform Appliance (WAPA), Hyper-V and Private/Hybrid Clouds

No significant articles today


<Return to section navigation list>

Cloud Security, Compliance and Governance

David Linthicum (@DavidLinthicum) asserted “Cloud providers face increasing number of DDoS attacks, as private data centers already deal with today” in a deck for his As cloud use grows, so will rate of DDoS attacks post of 2/5/2013 for InfoWorld’s Cloud Computing blog:

The eighth annual Worldwide Infrastructure Security Report, from security provider Arbor Networks, reveals how both cloud service providers and traditional data centers are under attack. The report examined a 12-month period and asked 200 security-based questions of 130 enterprise and network operations professionals. The key findings follow:

  • 94 percent of data center managers reported some type of security attacks
  • 76 percent had to deal with distributed denial-of-service (DDoS) attacks on their customers
  • 43 percent had partial or total infrastructure outages due to DDoS
  • 14 percent had to deal with attacks targeting a cloud service

The report concluded that cloud services are very tempting for DDoS attackers, who now focus mainly on private data centers. It's safe to assume that, as more cloud services come into use, DDoS attacks on them will become more commonplace.

Arbor Networks is not the only company that cites the rise of DDoS attacks on cloud computing. Stratsec, in a report published last year, stated that some cloud providers are being infiltrated in botnet-style attacks.

This should not surprise anyone. In my days as CTO and CEO of cloud providers, these kinds of attacks were commonplace. Indeed, it became a game of whack-a-mole to keep them at bay, which was also the case at other cloud providers that suffered daily attacks.

The bitter reality is that for cloud computing to be useful, it has to be exposed on public networks. Moreover, cloud services' presence is advertised and the interfaces well-defined. You can count on unauthorized parties to access those services, with ensuing shenanigans.

The only defense is to use automated tools to spot and defend the core cloud services from such attacks. Over time, the approaches and tools will become better, hopefully to a point where the attacks are more of a nuisance than a threat.

The larger cloud providers, such as Amazon Web Services, Hewlett-Packard, Microsoft, and Rackspace, already have good practices and technology in place to lower the risk that these attacks will hinder customer production. However, the smaller cloud providers may not have the resources to mount a suitable defense. Unfortunately, I suspect attackers will make them the primary targets.


<Return to section navigation list>

Cloud Computing Events

Rick Blaisdell listed Major Cloud Computing Events In 2013 in a 2/5/2013 post:

Cloud Computing is a vibrant technical environment and 2013 is full of many conferences, workshops, exhibitions and learning programs dedicated to cloud technologies. Here are some of the most important events in 2013 you should keep a close eye on and consider attending:

Cloud Connect 2013 – April 2-5, Silicon Valley, California

Cloud Connect gathers application developers, senior IT professionals, infrastructure and service providers, and cloud computing innovators to share deep cloud insights. Hosted and led by the industry’s top experts, Cloud Connect delivers the latest business, technology and cloud regulatory insights.

Conference tracks I recommend:

  • Application Design & Architecture;

  • Cloud Economics & Strategy;

  • Infrastructure;

  • Performance & Availability;

  • Private & Hybrid Clouds;

  • Risk Management & Security;

Cloud & Big Data Conference & Expo / 3PX|TechCon – April 16-17, Bellevue, Washington

Cloud & Big Data Conference & Expo brings together business leaders and technology experts to explore the “third platform” (Social, Mobile & Mobility, Big Data & Analytics and Cloud Services) and its opportunities to streamline operations, mitigate risk, enhance security and increase profits.

Conference topics I find most interesting:

  • Software for Application Defined Networking: Secure Network Virtualization for M2M and Cloud Computing;

  • How In-Memory Computing Can Boost Cloud Performance for Big Data;

  • Enabling Widespread Enterprise Cloud Adoption;

  • How to Simplify Application On-boarding and Optimize Price-performance on the Cloud

  • Cloud-Based BI and Big Data Analytics for Strategic Business Advantage

  • The Distributed and Decentralized Cloud

Virtualization & Cloud Computing Summit 2013 – May 30-Jun 6, Anaheim, California

The summit gathers security managers and analysts, auditors, compliance analysts who want to discuss and understand the risks of virtualization and cloud computing models and scenarios.

Topics from the agenda that I recommend:

  • Extending Your Private Cloud into Public Clouds

  • Emerging Architectures, Emerging Threats

  • Tools You Can Use: The Virtualization and Cloud Toolkit

CloudCon Expo & Conference – May 14-15, San Francisco, California

The event brings IT professionals and decision makers together, to discuss best practices and strategies for Cloud Deployment. It aims to help them understand how cloud technologies implementation can bring benefits such as reliability, adaptability and cost reduction.

Interesting topics on the event agenda:

  • Challenges of Cloud Computing in Highly Scalable and Secure enterprises

  • PaaS on Software Defined Data Centers (SDDC)

  • Cloud Storage & Backup Platforms

The 12th International Cloud Expo – June 10-13, New York

The event brings together decision makers in SMBs and enterprises to discover the latest innovations in Cloud Computing that can help them minimize cost, improve scalability and maximize performance. I don’t have the full details yet, but the Call for Papers is open until March, 17th.


<Return to section navigation list>

Other Cloud Computing Platforms and Services

Datanami (@datanami) reported Dell Enters into Acquisition Agreement on 2/5/2013:

Dell Inc. announced on Feb. 5th, 2013 it has signed a definitive merger agreement under which Michael Dell, Dell’s Founder, Chairman and Chief Executive Officer, in partnership with global technology investment firm Silver Lake, will acquire Dell.

Under the terms of the agreement, Dell stockholders will receive $13.65 in cash for each share of Dell common stock they hold, in a transaction valued at approximately $24.4 billion. The price represents a premium of 25 percent over Dell’s closing share price of $10.88 on Jan. 11, 2013, the last trading day before rumors of a possible going-private transaction were first published; and a premium of approximately 37 percent over the average closing share price during the previous 90 calendar days ending Jan. 11, 2013. The buyers will acquire for cash all of the outstanding shares of Dell not held by Mr. Dell and certain other members of management.

The Dell Board of Directors acting on the recommendation of a special committee of independent directors unanimously approved a merger agreement under which Michael Dell and Silver Lake Partners will acquire Dell and take the company private subject to a number of conditions, including a vote of the unaffiliated stockholders. Mr. Dell recused himself from all Board discussions and from the Board vote regarding the transaction.

The transaction is subject to other customary conditions, including receipt of required regulatory approvals, in addition to the Dell stockholder approvals described above. The transaction is expected to close before the end of the second quarter of Dell’s FY2014.

Dell has stated that the delivery of its solutions, services and experiences will not be interrupted or altered by the merger.

Alex Mandl, lead director of Dell’s Board of Directors, said of the acquisition, “The Special Committee and its advisors conducted a disciplined and independent process intended to ensure the best outcome for shareholders. Importantly, the go-shop process provides a real opportunity to determine if there are alternatives superior to the present offer from Mr. Dell and Silver Lake.”

Datanami’s article failed to mention that Microsoft loaned $2 billion to aid Michael Dell and Silver Lake Partners in financing the transaction. It’s unknown what effect Microsoft’s participation will have on the fate of the Wyse/Dell “Project Ophelia” MiniPC as described in my First Look at the CozySwan UG007 Android 4.1 MiniPC Device article updated 2/6/2013 or a revival of interest in adopting the Windows Azure Platform Appliance (WAPA).


Jeff Barr (@jeffbarr) reported Relational Database Service - Now With Event Subscriptions in a 2/4/2013 post to his Amazon Web Services blog:

You can now elect to receive notifications via the Amazon Simple Notification Service (SNS) for a wide variety of events associated with each of your Relational Database Service (RDS) instances.

Why Notify?
If you are a database administrator (DBA), you can now use these "push" notifications to arrange for notification when your RDS DB Instances are low on storage or have recovered from a failure.

If your application (or your management tools) call the AWS APIs, they need to track the state of the AWS objects that they manage. By using notifications instead of polling (repeatedly calling the "Describe" functions) you can reduce the number of API calls you make while also simplifying your application architecture.

The Details
You can elect to set up notifications for any of your RDS DB Instances by creating an Event Subscription. The notifications will be delivered to the Amazon SNS topic of your choice when certain events occur. Over 40 types of notifications are available, grouped in to the following categories:

  • Availability - Database shutdown or restart.
  • Backup - Backup started or finished.
  • Configuration Change - Security group modified, instance scaling started or finished, password changed, and more.
  • Creation - Instance or snapshot created or deleted.
  • Failover - Failover (for Multi-AZ Instance) started or completed.
  • Low Storage - Allocated storage has been exhausted.
  • Maintenance - Going offline or returning online for patch installation.
  • Recovery - Recovering a database instance.
  • Restoration - Restoring a database instance to a point in time or from a snapshot.

In conjunction with the Simple Notification Service, you can arrange to receive notifications as email messages.

You can manage the notifications by using the RDS APIs or the AWS Management Console. Here is how you manage notifications from the console. The RDS console's navigation pane contains a new item for DB Event Subscriptions:

You can begin the process of creating a new subscription by clicking this button:

From there you can fill in the following form:

What do you think? What kinds of interesting uses can you imagine for this feature?


<Return to section navigation list>
