Saturday, April 03, 2010

Windows Azure and Cloud Computing Posts for 4/2/2010+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.

 
• Update 4/3/2010: March 2010 Uptime Reports for OakLeaf Azure Test Harness Running in the South Central US Data Center: Pingdom reports 10 minutes of downtime (see the Live Windows Azure Apps, APIs, Tools and Test Harnesses section).

Generate Identity / Sequence Values from Azure Storage, a 4/1/2010 tutorial (see the Azure Blob, Table and Queue Services section).

Developing and Deploying Cloud Apps in Visual Studio 2010 for MSDN Magazine’s April 2010 issue (see the Live Windows Azure Apps, APIs, Tools and Test Harnesses section).

Note: This post is updated daily or more frequently, depending on the availability of new articles in the sections below.

To use the section links, first click the post’s title to display it as a single article, then navigate to the section you want.

Cloud Computing with the Windows Azure Platform was published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage,” here on 9/29/2009.

You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:

  • Chapter 12: “Managing SQL Azure Accounts and Databases”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated in April 2010 for the January 4, 2010 commercial release.

Azure Blob, Table and Queue Services

Benjamin Day explains how to Generate Identity / Sequence Values from Azure Storage in this 4/1/2010 tutorial (missed when posted):

If you’re developing a business application, you frequently need to use sequential integers to provide friendly IDs for things.  For example, if you are writing a billing and invoicing application, you’ll probably want to create something like “Invoice Number”.  If you’re using SQL Server or Oracle, this is relatively straightforward – you’ll either use an Identity column (@@IDENTITY, SCOPE_IDENTITY()) or, in Oracle, a SEQUENCE.  We’re so used to just having these available and working “automagically,” but these structures do quite a bit.

Here are a handful of requirements for IDENTITYs/SEQUENCEs:

  • Can’t return the same value twice.  The values have to keep getting incremented by 1 and the value never gets re-used.
  • A sequence is shared by all callers to the database and must be thread-safe.  Funky behavior isn’t allowed just because the database is under load. 
  • Fast. Got to be fast.  Get the value.  Return.  Done.  Minimal locking/blocking calls to the sequence from other connections.
  • Persistent.  If the database is shut down or goes down, the value of the sequence doesn’t go back to zero.

Now, if you’re writing an application with Azure Storage (not SQL Azure), you don’t have any comparable functionality at the moment.  If you want to create that “Invoice Number” or “Order Number” how are you going to do this?
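
The core of any Azure Storage-based answer is an optimistic-concurrency loop: read the current value along with a concurrency token (a blob’s ETag, for instance), write the incremented value conditioned on that token, and retry on conflict. Here’s a minimal C# sketch of that pattern only, with the storage calls hidden behind a hypothetical interface; Ben’s actual design, shown in the diagrams below, instead centralizes the counters in a worker role:

    using System.Threading;

    // Hypothetical storage abstraction: a blob-backed implementation would
    // return the blob's ETag as the token and write with an If-Match header.
    public interface ICounterStore
    {
        long Read(string counterName, out string token);
        bool TryWrite(string counterName, long newValue, string token);
    }

    public static class Sequence
    {
        // Never returns the same value twice, survives restarts (the store is
        // durable), and stays correct under load: a stale token makes
        // TryWrite return false, so the loop re-reads and retries.
        public static long Next(ICounterStore store, string counterName)
        {
            while (true)
            {
                string token;
                long current = store.Read(counterName, out token);
                if (store.TryWrite(counterName, current + 1, token))
                    return current + 1;
                Thread.Sleep(50); // brief back-off before retrying
            }
        }
    }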

Ben offers these diagrams and the code:

Here’s a Visual Studio 2010 Activity Diagram that shows the process for some client code getting the next value from a sequence:

[Activity diagram]

Here’s a Visual Studio 2010 Activity Diagram that shows how the sequences are managed from the Worker Role:

[Activity diagram]

The only problem I see with this approach is the cost of $80+/month for a worker role (times two, for the two instances needed to qualify for the SLA). It’s less expensive to buy a 1GB SQL Azure instance and INSERT records with an identity primary key.
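
For comparison, the SQL Azure route needs nothing beyond an IDENTITY primary key. A minimal ADO.NET sketch, in which the table, column names and connection string are all assumptions:

    using System;
    using System.Data.SqlClient;

    class InvoiceNumbers
    {
        // Assumed connection string and schema:
        // CREATE TABLE InvoiceNumber
        //   (Id bigint IDENTITY(1,1) PRIMARY KEY, CreatedUtc datetime NOT NULL);
        const string Cs = "Server=tcp:yourserver.database.windows.net;" +
            "Database=Billing;User ID=user@yourserver;Password=...;Encrypt=True;";

        public static long Next()
        {
            using (var conn = new SqlConnection(Cs))
            using (var cmd = new SqlCommand(
                "INSERT INTO InvoiceNumber (CreatedUtc) VALUES (GETUTCDATE());" +
                " SELECT CAST(SCOPE_IDENTITY() AS bigint);", conn))
            {
                conn.Open();
                // SCOPE_IDENTITY() meets all four requirements above: unique,
                // thread-safe, fast, and persistent across restarts.
                return (long)cmd.ExecuteScalar();
            }
        }
    }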

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS) and OData

David Aiken’s Cloud Cover Episode 7 – Dallas post of 4/2/2010 reports:

Cloud Cover Episode 7 is now available on C9 [here].

In this episode [Ryan] Dunn & [Steve] Marx cover:

  • [T]he new information marketplace for producers and consumers, called Codename "Dallas"
  • A PowerShell one-liner for monitoring your instances in Windows Azure.
  • How to build and deploy to the cloud from Visual Studio using MSBuild tasks – all automated!

Marcelo Lopez Ruiz reports WCF Data Services supports Accept-Charset in this 4/1/2010 post:

I'm not sure if the documentation states this explicitly somewhere, but WCF Data Services supports the Accept-Charset header, and will try to pick a character set that works for the client that is making the request. .NET developers are probably more familiar with the term 'Encoding', which is the type name that encapsulates how we translate back and forth between bytes and strings.

The support has been around for a while. In fact, it gave us some grief a couple of years ago, but it's all better now - the client actually leverages this to make sure it understands the server's response.

If you want to play with this for a bit, try the following: start up Fiddler, then navigate to http://services.odata.org/OData/OData.svc/Products(0). You can then drag the request you get into the Request Builder tab and tweak it a little bit. For example, add a line that reads as follows.

Accept-Charset: utf-16

And then Execute the request. If you look at the response using the Hex View tab, you'll see that characters now take up two bytes. You can try other encodings, like us-ascii, although the data is relatively simple and you won't see many changes.

If you have localized data for which UTF-8 isn't an appropriate encoding, you can use this capability to choose a different one, perhaps reducing the payload size or making it easier for a client or server to consume the data.
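
If you’d rather run the same experiment from code instead of Fiddler, a short C# sketch like this sends the header and shows which charset the server chose (assuming the public sample service is still reachable):

    using System;
    using System.IO;
    using System.Net;
    using System.Text;

    class AcceptCharsetDemo
    {
        static void Main()
        {
            var request = (HttpWebRequest)WebRequest.Create(
                "http://services.odata.org/OData/OData.svc/Products(0)");
            // Ask the server to encode its response as UTF-16.
            request.Headers["Accept-Charset"] = "utf-16";

            using (var response = (HttpWebResponse)request.GetResponse())
            {
                // The charset parameter here reports the encoding the server
                // actually picked, e.g. "application/atom+xml;charset=utf-16".
                Console.WriteLine(response.ContentType);

                using (var reader = new StreamReader(
                    response.GetResponseStream(), Encoding.Unicode))
                {
                    Console.WriteLine(reader.ReadToEnd());
                }
            }
        }
    }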

Kingsley Idehen explains Creating an OData Publishing Endpoint using Virtuoso's ADO.NET Data Provider in this very detailed and fully illustrated but undated tutorial on the Virtuoso Open-Source Wiki:

… How are Virtuoso and OData Connected?

Virtuoso is a hybrid data server with native data management and integration capabilities covering RDF via SPARQL, ODBC or JDBC accessible Relational Data Sources; XML via XQuery/XPath; Web Services; and other HTTP-accessible Hypermedia and non-Hypermedia resources. Virtuoso includes a high-performance native ADO.NET data provider that provides access to its Relational-, RDF Graph-, and Document-model based storage engines, and thereby also makes data from these engines accessible to OData-based data consumers.

How Do I Publish OData using OpenLink Virtuoso?

There are two ways to produce OData from Virtuoso: one approach uses Virtuoso's built-in support of the lightweight and platform-agnostic OData protocol; the other uses Virtuoso's ADO.NET provider. This guide covers use of the ADO.NET provider.

To create an OData service endpoint using Virtuoso and its ADO.NET provider, you will need the following in place:

  • Microsoft Visual Studio 2010
  • The ADO.NET Entity Framework 4.0 runtime and associated tools (included in Visual Studio 2010, but not installed by default) …
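
Once the Entity Data Model is generated, the OData endpoint itself is only a few lines of WCF Data Services boilerplate. In this sketch, VirtuosoEntities is an assumed name standing in for whatever ObjectContext the EF designer generates over the Virtuoso ADO.NET provider:

    using System.Data.Services;
    using System.Data.Services.Common;

    // Exposes every entity set in the model as a read-only OData feed.
    public class VirtuosoODataService : DataService<VirtuosoEntities>
    {
        public static void InitializeService(DataServiceConfiguration config)
        {
            // "*" opens all entity sets for reading; tighten per-set as needed.
            config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
            config.DataServiceBehavior.MaxProtocolVersion =
                DataServiceProtocolVersion.V2;
        }
    }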

Kirsten Jones’ OData Catalog Updates post of 3/26/2010 to the Netflix blog reports:

After working with the OData system for several days, we have made some improvements to the system - some of them will break existing code, but this should be our last update of that nature (although it remains in "preview" mode, and so could change in the future).

Changes:

  • The “Catalog” prefix has been removed from all entity sets. Therefore, “CatalogTitles” is now just “Titles”, “CatalogTitleGenres” is now just “TitleGenres”, and so on. This mirrors Netflix’s existing APIs, and also removes a bit of redundancy from the URL, since the root of the service is already “/Catalog/”.
  • The Title entity’s ID is no longer a GUID, but rather a generated string that comes from Netflix’s API.
  • The Person entity’s ID is no longer a GUID, but rather a generated number that comes from Netflix’s API.
  • The catalog data now includes whether a title is available for instant watch in HD (“Catalog/Titles?$filter=Instant/HighDefinitionAvailable”), which Netflix is beginning to provide more often.
  • The catalog data is now updated nightly to stay in sync with current Netflix data.

These changes should make the OData API more useful and consistent for developers to use.  All changes have been documented on the OData Catalog API page.
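
To see the renamed entity sets and the new HD flag in action, a throwaway request like this C# sketch will do (the preview endpoint lived at odata.netflix.com and may have changed since):

    using System;
    using System.Net;

    class NetflixCatalogDemo
    {
        static void Main()
        {
            // "Titles" (formerly "CatalogTitles"), filtered to instant-watch
            // titles available in HD; $top keeps the response small.
            const string url = "http://odata.netflix.com/Catalog/Titles" +
                "?$filter=Instant/HighDefinitionAvailable&$top=5";

            using (var client = new WebClient())
            {
                Console.WriteLine(client.DownloadString(url));
            }
        }
    }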

<Return to section navigation list> 

AppFabric: Access Control and Service Bus

Eugenio Pace’s Federated Identity interoperability samples post of 4/1/2010 adds more details about the a-Expense project:

Customers frequently ask me about interoperability with their non-Microsoft products, and identity is not an exception. There are roughly two sets of scenarios that come up often:

  1. A relying party (an application) built on the Microsoft stack trusting a non-Microsoft Identity Provider.
  2. A non-Microsoft application trusting ADFS.

Good news in the identity world is that there are quite a few standards that we have all agreed to implement (WS-Federation, SAML, WS-Trust, etc.). A few weeks ago I reached out to my colleague Claudio Caldato, who is a member of our Interoperability Labs and has spent quite some time figuring out all the implementation details of these scenarios.

Claudio was kind enough to allow us to use his lab, so we’ve been busy deploying one of our apps from the Guide there and learning from the process.

More good news: everything works with zero changes to the app :-).

Here’s a screenshot of a-Expense with a user authenticated on an OpenSSO identity provider (you can only tell from the username):

[Screenshot: a-Expense showing an OpenSSO-authenticated user]

Claudio’s lab is quite comprehensive and we have this running against IBM Tivoli Manager, CA SiteMinder and Oracle’s identity products too.

Stay tuned, we’ll post more information soon.

Eugenio’s Windows Azure Guidance – Development Process post of the same date discusses the process’s ALM:

One frequent question we get is around “process guidance”. Also known by the more modern and fancy acronym “ALM”: Application Lifecycle Management, which replaced the old SDLC term, which in turn (and only if you are old enough like me) meant something completely different, but I digress….

Application lifecycle management on Windows Azure has some special considerations. The three things that most clearly impact ALM, from my perspective, are the following:

  1. Deploying an application on Windows Azure is a very specific set of tasks. You don’t really have a lot of options.
  2. Any resource that you acquire on Windows Azure costs money.
  3. The notion of an “environment” in the cloud (think “production”, “test”, “beta testing”) is an illusion.

He continues with the details of the preceding three points. …

… Our own current environment looks more or less like this:

  • Our dev team typically runs against the local dev fabric.
  • All code is checked-in on TFS (our source code management tool).
  • The build server runs tests locally too.
  • Packaging and deployment to Windows Azure is automated (scripts are included in the last drop; see Scott’s post for more detail).
  • The Test team runs tests both locally and on the “live” environment.
  • We all share a single Admin LiveID that we use to (seldom) access the portal (see the Service Management API sketch after this list).
  • We all use the MMC.
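
One way to keep that shared LiveID out of day-to-day scripting is the Service Management REST API behind the portal, which authenticates with an uploaded management certificate instead. A hedged C# sketch; the subscription ID, certificate file and API version are all assumptions to adapt:

    using System;
    using System.IO;
    using System.Net;
    using System.Security.Cryptography.X509Certificates;

    class ListHostedServices
    {
        static void Main()
        {
            // Replace <subscription-id>; the .pfx must match a management
            // certificate already uploaded to the Windows Azure portal.
            var request = (HttpWebRequest)WebRequest.Create(
                "https://management.core.windows.net/<subscription-id>" +
                "/services/hostedservices");
            request.Headers["x-ms-version"] = "2009-10-01";
            request.ClientCertificates.Add(
                new X509Certificate2("management.pfx", "pfx-password"));

            using (var response = request.GetResponse())
            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                Console.WriteLine(reader.ReadToEnd()); // XML list of services
            }
        }
    }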

It is not difficult to imagine more complex requirements. For example, we don’t really have a “production” service since nobody uses our sample application (a-Expense). But extending the above architecture is not that complex.

Here’s a simple expansion considering two environments and an additional role that manages deployments to “production”:

[Diagram]

Notice that “production” could potentially be another subscription (with its own separate Admin LiveID / Certificates).

<Return to section navigation list>

Live Windows Azure Apps, APIs, Tools and Test Harnesses

Updated My March 2010 Uptime Reports for OakLeaf Azure Test Harness Running in the South Central US Data Center on 4/3/2010 to account for 10 minutes of downtime in March that Pingdom reported on 4/3/2010 but PingdomAlert emails hadn’t previously disclosed.

Jim Nakashima, Hani Atassi and Danny Thorpe co-wrote Developing and Deploying Cloud Apps in Visual Studio 2010 for MSDN Magazine’s April 2010 issue:

There are many reasons to deploy an application or services onto Windows Azure, the Microsoft cloud services platform. These include reducing operation and hardware costs by paying for just what you use, building applications that are able to scale almost infinitely, enormous storage capacity, geo-location ... the list goes on and on.

Yet a platform is intellectually interesting only when developers can actually use it. Developers are the heart and soul of any platform release—the very definition of a successful release is the large number of developers deploying applications and services on it. Microsoft has always focused on providing the best development experience for a range of platforms—whether established or emerging—with Visual Studio, and that continues for cloud computing. Microsoft added direct support for building Windows Azure applications to Visual Studio 2010 and Visual Web Developer 2010 Express.

This article will walk you through using Visual Studio 2010 for the entirety of the Windows Azure application development lifecycle. Note that even if you aren’t a Visual Studio user today, you can still evaluate Windows Azure development for free, using the Windows Azure support in Visual Web Developer 2010 Express. …

Figure 1 Adding Roles to a New Cloud Service Project …

Note that Anson Horton, Hani Atassi, Danny Thorpe, and Jim Nakashima cowrote Cloud Development in Visual Studio 2010 for Visual Studio Magazine’s April 2010 issue, which appears to be almost identical to the preceding article, but Anson Horton is missing from the MSDN Magazine article’s byline. Anson (and Gus Perez) get technical reviewer credit at the end of the article.

1105 Media publishes both MSDN Magazine and VSM, which might make this duplication less than coincidental.

(See Windows Azure and Cloud Computing Posts for 3/29/2010+’s Live Windows Azure Apps, APIs, Tools and Test Harnesses section for the VSM version.)

Ian Tomlin asserts “The announcement that Encanvas will be on Microsoft Azure later this month is likely to forever change the role of business analysts” in his Encanvas Enterprise Mashups on Azure changes the role of Business Analyst article of 4/2/2010 for the ebizQ blog:

At one time business analysts were almost seen as the triage people for IT development teams. They were the people that could understand business and make sense of processes; people with one foot in the business and the other in IT, able to bridge the impermeable divide that exists in the enterprise between business and IT. They were also the 'owners of the problem' that developers were charged to fix with their software developments. So in many organizations, business analysts became the well-paid, 'good communicating' bag carriers of the slightly higher-paid software developers who could sit in their back rooms and 'just code'.

This narrowly scoped role of the business analyst is not likely to be sustained. In an agile business era there is no room for the heavy price of 'discovery and learning' in software development. Everyone knows that any learning exercise necessarily requires mistakes to be made (that is, after all, how people learn), but the cost of learning and the economics of failure are often so grim that organizations would rather pay a premium for 'off-the-shelf' solutions than risk project over-runs. Neither is there sufficient budget to pay for teams of 2, 4, 6, 8 or 10+ people to build database-centric portal applications that employ a combination of forms, databases, reports, dashboards, maps, workflow and logic components of varying degrees.

On the face of it, the choice seems to be (1) 'outsource' and risk losing control over your information systems or (2) purchase 'shrink-wrapped solutions' that sound ideal but promise the IT leader a legacy of integration and data management issues down the line. But Web 2.0 Rich Internet Application platforms like Encanvas Secure&Live present a third way. …

Ian continues with a description of the “third way.” (He’s President of Encanvas Inc.) His earlier What makes Squork better than Google Wave and open source social networking media? (and other good questions) of 2/26/2010 begins:

The YouTube videos on Squork are getting some attention and the first few bits of feedback we've received ask some very pointy questions on what makes Squork better than Google Wave and free source software.

Rather than answer the same questions a hundred times a day, I thought it probably a good idea to answer them here ;-). My thanks to Aravind for summarizing the key questions!

QUESTION - "What is Squork?"

Squork is a secure and live business social operating system. It enables communities to develop and operate virtual social network operating spaces for their business or community of interest.

QUESTION - "Where has Squork come from?"

ANSWER - Squork is a deployment of the Encanvas Secure&Live Integrated Software Platform (which itself can take some explaining). With Encanvas people can design, deploy and operate their business applications using a single integrated platform without needing programming or scripting skills. It obviates the need for traditional enterprise portal platforms and means people can publish directly to their favored cloud computing platform or web server (although at the moment we're only supporting Microsoft Azure). Encanvas produces ASP.NET web spaces. [Emphasis added.]

Because Squork is built on Encanvas, organizations benefit from what we call 'frictionless IT' - i.e., once they've deployed the Encanvas platform they don't need to buy any more applications, as it becomes easier and more cost-effective to build applications than purchase them. Using Encanvas also means organizations no longer have to suffer upgrade costs.

The fact that Encanvas underpins Squork means that organizations (and communities) can use Squork to extend their networks and processes beyond the boundaries of the enterprise without compromising security.

QUESTION - "I could understand that you deliver data security, aggregating information, making everyone work on the same page...but my question is how is that different from existing stand alone usage of social media. What advantage will it provide me as a customer ahead of ones who aren't using squork?"

ANSWER - While most social media tools are 'applications', Squork is a complete Social Operating System.

Ian continues with more Squork details, which explain Squork better than the company’s web site. At this writing Squork says “we haven't got the facility online to register users yet. To register over the phone please call +44 1280 700 535.”

I’m surprised Microsoft’s PR folks haven’t done a case study on Squork. Steve Marx, are you listening?

Bob Familiar’s CSC Uses Windows Azure Platform to Offer Managed Hosted Software as a Service post of 4/2/2010 reports:

Hong Choing has written an article that details how CSC, a global leader in providing technology-enabled solutions and services, is leveraging Windows Azure to provide its Legal Solutions Suite® Software as a Service.

Legal Solutions Suite creates a collaborative electronic workspace for budgeting, planning and legal strategy. Developed by legal professionals, Legal Solutions Suite helps organizations create a highly efficient, end-to-end legal management process.

Brian Boruff, VP of Cloud Computing Services at CSC, was interviewed at PDC09 about CSC’s cloud initiatives.


CSC’s Legal Solutions Suite is listed in the Microsoft PinPoint marketplace.


For more information:

  1. Global Systems Integrator Whitepaper (Windows Azure SI White paper.pdf)
  2. Brian Boruff’s interview with the Economist on Cloud Computing. (Take me there)

Why Windows Azure Platform?

  • CSC believes that moving to Windows Azure will give it a significant competitive advantage with its Legal Solutions Services product.
  • Local applications now behave more like job schedulers, handing off compute-intensive tasks to cloud-based worker services hosted on the virtual machines in the Windows Azure platform and brought online and offline as needed.

<Return to section navigation list> 

Windows Azure Infrastructure

James Urquhart continues his Devops series with Understanding cloud and 'devops'--part 3 of 4/2/2010 posted to CNet’s The Wisdom of Clouds blog:

The first two parts of this series laid out a case for why cloud computing is driving an applications focus in operations instead of a server focus, and why that applications focus forces a change in the core responsibilities of IT operations personnel, respectively. Those posts triggered a very lively discussion on Twitter and in what I call the "cloud-o-sphere" about the consequences--and limits--of "devops."

I know I promised some examples of devops in action, but that will have to wait until a later post. For now, let me lay out some of the more interesting observations--and debates--that my review of devops triggered.

James continues with excerpts from comments about his Devops series to date. (Graphics Credit: CNET)

SearchCloudComputing’s Daily Cloud blog questions whether Cloud computing to hit $222.5 billion by 2015? on 4/2/2010:

Global Industry Analysts, an 800-man international research firm, will awaken new hope in the hearts of weary cloud pitchmen and boosters who see the magic fairy dust wearing off of cloud computing. GIA announced that the cloud market, which it says comprises application services, business process services and infrastructure services, will hit [$]222.5 billion by 2015.

Some of the companies profiled as major players in cloud include Dell, Novell and Oracle…so there you are. Rosy days appear to be ahead for those worried that cloud was peaking in the hype cycle and due for the "trough of disillusionment." It'll also be a wake-up call for those sticks in the mud at IDC who somehow believe that cloud spending will be a paltry $70 million in 2015.

Readers can do their part for cloud markets by purchasing GIA's report online, delivered from the cloud, for only $4,400.

Stephen O’Grady offers an Event Report: “Solutions for the Virtual Era” or Dell and the Cloud on 4/2/2010:

If I told you that the most interesting thing I heard from Dell at their analyst event last week was the fact that their design point for their hardware is the medium sized business, would you consider that an indictment? Well, you shouldn’t.

Dell, like a lot of larger enterprises these days, is bullish on virtualization, billing the present as “The Virtual Era.” And while it might seem illogical for a maker of physical hardware to be heavily touting the benefits of virtual hardware, the mismatch is not so great as you might suppose. In spite of the ever improving ability of virtualization providers to cram more VMs onto a given piece of physical infrastructure, VMs need to sit on hardware, somewhere. Virtualization simply makes that hardware easier to utilize. Lowering the barriers to consumption, in turn, generally has one effect: it increases consumption. Think about it. If you lower the price of gas, do you sell more or less gas? So goes the thinking from purveyors of infrastructure, and that thinking is correct.

Part and parcel of virtualization in the “Virtual Era,” of course, is cloud computing. Dell had that angle well covered, calling themselves the #1 provider of cloud infrastructure today. And with customers like Facebook and Microsoft (Azure), and to a lesser extent Ask.com, that claim may be correct. Even if it is not, however, and Google’s sprawling datacenters housing their homemade hardware outweigh Dell’s deployed volume, Dell is for many the de facto hardware platform of the web. Why? Because Dell, unlike the larger systems players, is able to sell to small and medium-sized businesses. The kinds of small businesses that grow up to be Facebook, in fact. Cue Joyent’s tale of woe with HP and Sun versus their success with Dell. …

Dell is a major supplier of servers for Microsoft’s data centers. Stephen O’Grady is an industry analyst with RedMonk.

David Linthicum claims “A recent report by Greenpeace calls cloud computing out for its carbon footprint. Here's why it's wrong” while asking Is cloud computing really killing the planet? on the first page of his 4/2/2010 article for InfoWorld’s Cloud Computing blog:

A recent report from Greenpeace states that the proliferation of devices that use the "cloud," such as the new Apple iPad, could be killing the planet. "The report finds that at current growth rates, data centers and telecommunication networks, the two key components of the cloud, will consume about 1,963 billion kilowatt-hours of electricity in 2020, more than triple their current consumption and over half the current electricity consumption of the United States -- or more than France, Germany, Canada, and Brazil combined."

Greenpeace states that it wants people to think about where all of this cloud stuff is leading: "It points to the use of dirty energy in the IT sector, namely by Facebook, which recently announced the construction of a data center that will run primarily on coal."

OK, I was told cloud computing was green, so what gives? The fact of the matter is that reports such as this fail to consider the shift in processing from on-premises systems to shared computing centers. Indeed, Greenpeace just focused on the impact of the new data centers being built to support the cloud, and not the end result of moving some existing enterprise processing to shared public clouds. Moreover, Greenpeace didn't consider the impact of efficiencies driven by private cloud computing. …

David continues his article on page 2.

Guy Rosen offers an online version of his Presentation at CloudConnect and “Hacking Cloud Adoption” slides in this 4/1/2010 post:

A couple of weeks ago I gave a keynote talk at CloudConnect to share some of the findings that I’ve been publishing here on the blog. CloudConnect turned out to be a great event which really brought together many people from the cloud community – this is a good opportunity to congratulate and thank the organizers for a job well done.

Bill Zack’s Windows Azure Design Patterns for the Cloud post of 4/1/2010 reports:

Most presentations about Windows Azure start by presenting you with a bewildering set of features offered by Windows Azure, Windows Azure AppFabric and SQL Azure. Although these groupings of features may have been developed by different groups within Microsoft and spliced together, as architects we are more interested in the business problems that can be solved by utilizing these features where they are appropriate.

I recently did a Webcast that took another, more solution-centric approach. It presented a set of application scenario contexts, Azure features and solution examples. It is unique in its approach and in the fact that it includes the use of features from all components of the Windows Azure Platform, including the Windows Azure Operating System, Windows Azure AppFabric and SQL Azure.

Michael Armbrust, Armando Fox, Rean Griffith, Anthony D. Joseph, Randy Katz, Andy Konwinski, Gunho Lee, David Patterson, Ariel Rabkin, Ion Stoica, and Matei Zaharia coauthored the A View of Cloud Computing article for the April 2010 issue of the Association for Computing Machinery’s Communications of the ACM:

Clearing the clouds away from the true potential and obstacles posed by this computing capability.


Cloud computing, the long-held dream of computing as a utility, has the potential to transform a large part of the IT industry, making software even more attractive as a service and shaping the way IT hardware is designed and purchased. Developers with innovative ideas for new Internet services no longer require the large capital outlays in hardware to deploy their service or the human expense to operate it. They need not be concerned about overprovisioning for a service whose popularity does not meet their predictions, thus wasting costly resources, or underprovisioning for one that becomes wildly popular, thus missing potential customers and revenue. Moreover, companies with large batch-oriented tasks can get results as quickly as their programs can scale, since using 1,000 servers for one hour costs no more than using one server for 1,000 hours. This elasticity of resources, without paying a premium for large scale, is unprecedented in the history of IT.

As a result, cloud computing is a popular topic for blogging and white papers and has been featured in the title of workshops, conferences, and even magazines. Nevertheless, confusion remains about exactly what it is and when it's useful, causing Oracle's CEO Larry Ellison to vent his frustration: "The interesting thing about cloud computing is that we've redefined cloud computing to include everything that we already do.... I don't understand what we would do differently in the light of cloud computing other than change the wording of some of our ads."

Our goal in this article is to reduce that confusion by clarifying terms, providing simple figures to quantify comparisons between cloud and conventional computing, and identifying the top technical and non-technical obstacles and opportunities of cloud computing. (Armbrust et al. is a more detailed version of this article.)

The Armbrust reference is to the [in]famous Above the Clouds: A Berkeley View of Cloud Computing white paper of 2/10/2009 from UC Berkeley’s Electrical Engineering and Computer Sciences department by most of the authors of this article. Graphic credit: Jon Han

<Return to section navigation list> 

Cloud Security and Governance

Dave Kearns claims “Trend exemplified by Courion's move into governance and SailPoint's entry into provisioning” in his Identity management companies move beyond 'single-issue vending' post to NetworkWorld’s Security blog:

Every time I try to consign something to the "plumbing" layer, or say that it's been commoditized, my words tend to come back to bite me. It was just two months ago that I said (about user provisioning): "In fact, we almost take it for granted today." (See "User provisioning: right access to the right people".) Lately, though, there's been enough new ideas in provisioning that we really can't take it for granted.

First, though, I should properly credit the title of that newsletter ("The right access for the right people") where it belongs -- to Courion since it paraphrases their mantra ("User provisioning ensures that only the right people have the right access to the right resources"). And it's Courion I want to talk about today.

Last week it announced a technology partnership with Symantec to facilitate the integration of Courion's Access Assurance Suite 8.0 with Symantec Data Loss Prevention 10. According to Courion's Chris Sullivan (vice president of customer solutions), this will "create the industry's most comprehensive content-aware identity and access management solution." Adds Kurt Johnson (vice president of corporate development and strategy), "[T]his solution represents a huge step forward in the ability to monitor and secure sensitive information."

So it seems that Courion wants to get more involved in the governance aspects of identity. This coming right on the heels of SailPoint's announcement that the Austin-based governance company is getting into the provisioning space. …

<Return to section navigation list> 

Cloud Computing Events

Best Two Cloud-Related April Fool Stories of 2010:

  1. Jeff Barr’s Introducing QC2 - the Quantum Compute Cloud (Amazon Web Services)
  2. Werner Vogels’ AWS Import/Export launches support for Legacy Storage Systems (Legacy = punched cards)

Add your favorite cloudy April Fools stories to the comments.

Maarten Balliauw’s Put your existing application in the cloud! post of 4/1/2010 has a link to his slide deck from TechDays 2010:

As promised during my talk, here's the slide deck for "Put your existing application in the cloud!".

Abstract: "Leverage the highly scalable Windows Azure platform and deploy your existing ASP.NET application to a new home in the clouds. This demo filled session will guide you in how to make successful use of Windows Azure’s hosting and storage platform as well as SQL Azure, the relational database in the cloud, by moving an existing ASP.NET application to a higher level."

Put Your Existing Application On Windows Azure

Thanks for joining TechDays 2010 and my session!

ICS Solutions presents Windows Azure: the next big thing in Cloud computing from Microsoft on June 29, 2010, 5:00 PM GMT at ICS Solutions Ltd, .NET Solutions Campus, Skippetts House, Skippetts Lane, Basingstoke, Hampshire RG21 3HP, United Kingdom:

… The seminar will help you to evaluate the Azure Platform and take the first step to Cloud computing, providing you with a base knowledge of Azure, information on Azure adoption and proof-of-concept offerings, and details of your next steps.

  • Introduction to Windows Azure
  • Understanding Windows Azure and the Azure platform
  • Windows Azure demonstration
  • Partner presentation - Real world Azure solutions (Dot Net Solutions)
  • Azure pricing and Proof of Concept offerings
  • Next steps in your Azure adoption
  • Azure resources available to you
  • Q&A session

ICS Solutions’ reputation as ‘The Microsoft Gurus’ has evolved over more than 12 years of working exclusively in the application of Microsoft technologies. ICS is one of the most highly accredited Microsoft Gold Partners in the UK, providing Information Worker, SOA and Business Process and Custom Development Solutions. …

Please book at http://www.ics.net/microsoft-online-services-seminars.aspx, email events@ics.net, or call (01256) 403800 and speak to our events team to book.

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

William Vambenepe’s Enterprise application integration patterns for IT management: a blast from the past or from the future? post of 4/2/2010 analyzes CA’s approach to enterprise application integration patterns:

In a recent blog post, Don Ferguson (CTO at CA) describes CA Catalyst, a major architectural overhaul that “applies enterprise application integration patterns to the problem of integrating IT management systems”. Reading this was fascinating to me. Not because the content was some kind of revelation, but exactly for the opposite reason. Because it is so familiar. …

While most readers might not share my historical connection with this work, this is still relevant and important to anyone who cares about IT management in the enterprise. If you’re planning to be at CA World, go listen to Don. Web services may have a bad name, but the technical problems of IT management integration remain. There are only a few routes to IT management automation (I count seven, the one taken by CA is #2). You can throw away SOAP if you want, you still need to deal with protocol compatibility, model alignment and instance reconciliation. You need to centralize or orchestrate the management operations performed. You need to be able to integrate with complementary products or at the very least to effectively incorporate your acquisitions. It’s hard stuff.

Bonus point to Don for not forcing a “Cloud” angle for extra sparkle. This is core IT management.

Dustin Amrhein describes “Requirements for application platforms in the cloud” in his Building Cloud-Based Application Platforms post of 4/1/2010:

It seems that cloud computing conversations are slowly beginning to march up the stack. Much of the initial focus in the space has been on cloud-based infrastructures such as servers, storage, networks, etc. To some degree, cloud-based approaches to this layer of IT are becoming more of a norm and less of an exception. As this happens, the industry begins to look at the next layer of the IT stack in the context of cloud computing: application platforms.

Cloud-based application platforms are intriguing in that they promise to shift the focus of cloud computing to what is important, the application. In its ideal form, users describe their applications, the application's dependencies, service level agreements, and then they hit the easy deploy button. The cloud platform renders the underlying infrastructure and provides a robust, dynamic runtime for the application.

If charged with building and delivering a cloud application platform, my end goal would be to render both the infrastructure and application platform software as a black box to end users. That is much, much easier said than done.

Recently I chatted with a colleague about this very same topic, and I realized just how much would go into delivering such a solution to end-users. I left that conversation with several different requirements, but to give you an idea I'll share three with you here [abbreviated]:

  1. Robust stable of platform software
  2. Operational and runtime management services for applications
  3. Extensible foundation for runtime service delivery

Dustin (@damrhein) is a technical evangelist for IBM emerging technologies in the WebSphere portfolio.

<Return to section navigation list> 
