Tuesday, December 9, 2008

Asynchronous Programming and Power Threading

Many of us who've had experience with asynchronous development know how difficult such code can be to write. Asynchronous code is typically non-linear, and jumps from one portion of a program to another. It is difficult to debug, and is difficult to tame if errors occur.
To understand the difficulties inherent in asynchronous development it helps to first consider a simple example.

Suppose you begin an I/O operation of some kind, perhaps the download of a large file. The download is going to take several minutes. To avoid locking up your program during the download, you start the operation on a secondary thread and register a callback method to be executed when the operation completes. Because the download runs on a secondary thread, the main thread of your program remains responsive during the download and can interact with the user. When the task completes, the callback executes, announcing the termination of the download. You might then want to begin a new asynchronous task, such as processing the downloaded file and adding portions of its content to a database. Again, this task is going to take some time, so you start another thread, providing another callback method to be executed when the task completes.
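The callback model just described can be sketched as follows. The method names and the simulated work (Thread.Sleep standing in for the download and the database insert) are invented for illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// A sketch of the callback-chaining style described above. The "work" is
// simulated with Thread.Sleep; all names here are invented for illustration.
public static class CallbackStyle
{
    public static List<string> RunPipeline()
    {
        var log = new List<string>();
        var done = new ManualResetEvent(false);

        // Step 1: "download" on a worker thread, then fire a callback.
        Action<Action> beginDownload = onComplete =>
            ThreadPool.QueueUserWorkItem(_ => { Thread.Sleep(10); log.Add("downloaded"); onComplete(); });

        // Step 2: "database" work on another worker thread, then another callback.
        Action<Action> beginStore = onComplete =>
            ThreadPool.QueueUserWorkItem(_ => { Thread.Sleep(10); log.Add("stored"); onComplete(); });

        // Control jumps from here into one callback, then into the next.
        beginDownload(() => beginStore(() => done.Set()));
        done.WaitOne();        // the calling thread stays free until this point
        return log;
    }

    public static void Main()
    {
        Console.WriteLine(string.Join(", ", RunPipeline()));  // downloaded, stored
    }
}
```

Even in this toy version, the actual sequence of steps is scattered across two nested callbacks rather than reading top to bottom, which is exactly the readability problem described below.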

The model outlined in the previous paragraph is common, but awkward. The problems inherent in this scenario are numerous, but two of the worst problems can be summed up as follows:

1) The code does not execute in serial fashion, but instead jumps from one callback to another, thereby making it difficult to debug. Someone new to the code might find it hard to understand which callback will execute next, or which thread is currently active.

2) If something goes wrong during the execution of the code, it can be very difficult to clean up the current operation and exit the process smoothly. Operations are occurring on multiple threads, or inside some seemingly random callback. Allocations, open files, and initialized variables are hard to clean up, and it is difficult to define which code should execute next after you enter an error condition.

Setting up a try..catch block is difficult at best, and sometimes impossible. The result can be a mass of spaghetti code that is difficult for the original developer to understand, and nearly incomprehensible to others who are assigned the unfortunate task of maintaining it.
All of these problems are commonly encountered by developers who create asynchronous code.

Jeffrey Richter has written a library that allows us to write asynchronous code in a synchronous style, as if each operation were occurring in a linear, or serial, sequence. In other words, you can write a single method in which the file is first downloaded, then parsed, and data is then inserted in a database. The code looks like synchronous code, and appears to execute in a linear fashion. Behind the scenes, however, the code is actually asynchronous, and uses multiple threads.

The library is built around C# Iterators, which bear the weight of handling the multiple threads that are spawned during your asynchronous operations.
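A minimal sketch of the idea follows. The names below are invented for illustration and are not the real PowerThreading API; the point is that the workflow is written as a C# iterator, and each yield return marks a point where pending asynchronous operations would complete before the method resumes, so the method reads top to bottom like synchronous code:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch of the iterator-driven pattern (not the actual
// PowerThreading API). The workflow method is an ordinary C# iterator.
public static class IteratorStyle
{
    public static readonly List<string> Log = new List<string>();

    // Reads like linear code: download, then parse, then store.
    public static IEnumerable<int> DownloadParseStore()
    {
        Log.Add("begin download");
        yield return 1;            // "resume me after 1 pending operation completes"
        Log.Add("parse file");
        yield return 1;            // "resume me after parsing completes"
        Log.Add("store in database");
    }

    // A real driver would wire each yield to asynchronous completion
    // callbacks; this toy driver simply steps the iterator to show the
    // control flow.
    public static void Drive(IEnumerable<int> workflow)
    {
        foreach (int pending in workflow)
        {
            // In the real library, async completions would land here.
        }
    }

    public static void Main()
    {
        Drive(DownloadParseStore());
        Console.WriteLine(string.Join(", ", Log));
        // begin download, parse file, store in database
    }
}
```

Because the whole sequence lives in one method body, an ordinary try/finally around the yields can handle cleanup, which is precisely what the callback style makes so awkward.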

Take a look at the video to learn exactly how it works, and then download the free library to try it yourself. Not only does Richter show a simple way to write asynchronous code, but he also does a great job of explaining exactly how C# iterators are put together.

Download or view the video: http://channel9.msdn.com/posts/Charles/Jeffrey-Richter-and-his-AsyncEnumerator/

The library is available at: http://wintellect.com/PowerThreading.aspx

Wednesday, November 26, 2008

Red Dog and more!!!

Check out this nice article about the history and the man behind Red Dog (Windows Azure):

Ray Ozzie Wants to Push Microsoft Back Into Startup Mode

Monday, November 24, 2008

The "Geneva" Identity Framework

The "Geneva Framework" is a framework for building identity-aware applications. It contains functionality for incorporating Information Cards into an ASP.NET web site. The framework abstracts the WS-Trust and WS-Federation protocols and presents to developers an API for building security token services and identity providers. Applications can use the framework to process tokens issued from security token services and make identity-based decisions at the web application or web service.

Major Features

Build claims-aware applications
“Geneva” Framework helps developers build claims-aware applications. In addition to providing a new claims model, it gives applications a rich set of APIs for making user access decisions based on claims.
“Geneva” Framework also provides developers with a consistent programming experience whether they choose to build their applications in ASP.NET or in WCF environments.
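As a sketch of what a claims-based access decision looks like, the snippet below uses a stand-in Claim type and a made-up claim-type URI so that it is self-contained; real "Geneva" code would use the Microsoft.IdentityModel types (for example, IClaimsPrincipal and its claims collection) instead:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Stand-in for the framework's claim type, so the example runs on its own.
public class Claim
{
    public string ClaimType { get; set; }
    public string Value { get; set; }
}

public static class ExpenseGate
{
    // Hypothetical claim-type URI, invented for this example.
    const string PurchaseLimitClaim = "http://example.org/claims/purchaselimit";

    // The decision is driven by what the security token says about the user,
    // not by group lookups the application performs itself.
    public static bool CanApprove(IEnumerable<Claim> claims, decimal amount)
    {
        var limit = claims.FirstOrDefault(c => c.ClaimType == PurchaseLimitClaim);
        return limit != null && decimal.Parse(limit.Value) >= amount;
    }
}
```

The design point is that the issuing STS, not the application, is responsible for establishing facts about the user; the application only evaluates the claims it is handed.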

ASP.NET Controls
ASP.NET controls simplify the development of ASP.NET pages for building claims-aware web applications, as well as passive STSs.

Translate between claims and NT tokens
“Geneva” Framework includes a Windows service, named the “Geneva” Claims to NT Token Service, that acts as a bridge between claims-aware applications and NT token-based applications. It gives developers an easy way to convert claims to an NT token identity, making it possible for a claims-aware application to access resources that require an NT token-based identity.

Issue managed information cards
“Geneva” Framework offers an InformationCard control that makes it easier to enable Information Card (for example, Windows CardSpace “Geneva”) login in existing ASP.NET applications.

Easy provisioning of claims-aware applications with an STS
“Geneva” Framework provides a utility, named FedUtil, to allow easy provisioning of claims-aware applications with an STS (for example, the “Geneva” Server STS or the LiveID STS).

Build identity delegation support into claims-aware applications
“Geneva” Framework offers the capability, referred to as ActAs functionality, of maintaining the identities of original requestors across service boundaries. This gives developers the ability to add identity delegation support to their claims-aware applications.

Build custom security token services (STS)
“Geneva” Framework makes it substantially easier to build a custom security token service (STS) that supports the WS-Trust protocol. Such an STS is also referred to as an active STS.
In addition, the framework provides support for building STSs that support WS-Federation to enable web browser clients. Such an STS is also referred to as a passive STS.

Major Scenarios
· Federation
“Geneva” Framework makes it possible to build federation between two or more partners. Its support for building claims-aware applications (relying parties, or RPs) and custom security token services (STSs) helps developers achieve this scenario.

· Identity Delegation
“Geneva” Framework makes it easy to maintain identities across service boundaries so that developers can implement identity delegation scenarios.

· Step-up Authentication
Authentication requirements for accessing different resources within an application may vary. “Geneva” Framework gives developers the ability to build applications that impose incremental authentication requirements (for example, an initial login with username/password authentication, followed by a step-up to smart card authentication).

Thursday, November 20, 2008

Introducing Windows Azure

Check this video out. Manuvir Das explains Windows Azure.
Manuvir Das: Introducing Windows Azure

Wednesday, November 19, 2008

The "Oslo" Modeling Platform

Model-driven development is a term that is often used to indicate a development process that revolves around building applications by using models of applications and data as specifications.

Using a model-driven approach means a development process and platform that enables:
  • Using abstraction to view structure at the important level of detail and hiding complexity until it is needed.
  • Using models – that is, logical data types and relationships – as the core of the development experience.
  • Implementing the model-aware components such that they follow the requirements of the modeled application or business process.
  • Associating models and model instances at various development stages so that model-driven development can move back and forth in the development lifecycle and maintain those relationships.
  • Automation of particular application environments and artifacts so that users can more easily make use of them in the preceding ways.

The preceding points are complex ways of describing a development approach in which the real feature is more robust support of efficient and manageable complex application development. It is this feature that the “Oslo” modeling platform aims at: The goal of code name “Oslo” is to reduce the gap between the intention of the developer and the software components that get developed, deployed, and executed inside of complex, widely-distributed, database-driven applications. Modeling the application means moving more of the definition of an application into the world of data, where the platform (and you) can more easily make queries as to the developer’s original intent. Microsoft technologies have been moving in this direction for over a decade now; for example, things like COM type libraries, .NET metadata attributes, and XAML have all moved increasingly toward “writing things down” directly as data and away from encoding them into a lower-level form, such as x86 or IL instructions. The “Oslo” modeling platform continues this progression.

In short, the “Oslo” modeling platform:

  • Makes it easier for people to write things down in ways that make sense for the problem domain they are working in—a common term for this is modeling.
  • Makes the things that people wrote down accessible to platform components during program execution.

The “Oslo” modeling platform makes this possible by providing:

  • A visual design tool (Microsoft code name “Quadrant”) that enables people to design business processes with well-understood, flowchart-like graphics; enables developers to design applications and components that comply with the requirements of those processes; and lets both move back and forth between views to observe the effect that changes in either place have on the overall validity of the application or business process.
  • A modeling language (Microsoft code name “M”) that makes it natural to extend system-provided models (such as Windows Communication Foundation (WCF) or Windows Workflow Foundation (WF) models) or create your own models for use on the “Oslo” modeling platform.
  • A SQL Server database (the code name “Oslo” repository) that stores models as SQL Server schema objects and model instance data as rows in the tables that implement the schema. This data is available to “Quadrant” and any other tool or data-driven application that can make use of it (and that has the appropriate permissions to do so). Whether models and model instance data are created visually, in “M”, or through any SQL data access API (for example, ADO.NET, the Entity Data Model, or OLE DB), storing them in the “Oslo” repository enables future applications to examine and manipulate not only the data structures used by applications but – because the applications are modeled – the applications themselves, as they run. If a data-driven application has enough detailed model information, it can run without recourse to static compilation.

Whether you create or modify model data visually, textually, or using a SQL data access technology, all of the modeling information is available in a relational database (the code name “Oslo” repository) at runtime. Some platform components are part of the system-provided models, which enable you to write a service or an application by populating that database with the definition of that service or application. In addition, because the data is captured in the “Oslo” repository, it is available to all kinds of tools that specialize in structured data, whether design tools like “Quadrant” or third-party tools that can sift, search, and filter the data to surface information that is very difficult to get at with current tools.

This is the essence of the "Oslo" platform.

Source: http://www.msdn.com

Monday, November 17, 2008

Application Architecture Guide

Microsoft's patterns & practices group has published Application Architecture Guide 2.0 Beta 1, a book containing principles, patterns, and practices for designing the architecture of applications built on the .NET Framework. The intended audience is solution architects and development leads.

Raining Cloud Compute

The last 12-18 months have seen the emergence of multiple software companies coming out with their own versions and models of what each perceives cloud computing to be. To list a few famous ones:

1) Google App Engine with its Python development environment
2) Amazon EC2 - Elastic Compute Cloud
3) Microsoft Windows Azure
4) VMware, announcing its virtualized OS for the cloud
5) IBM Blue Cloud

What is amazing is that each one has its own strategy and model, trying to address the needs of different markets, with the obvious goal of extending or protecting its existing user base. A further drill-down leads me to believe that the cloud computing each one provides as a service differs from the others as apples do from oranges.

I see three broad classifications in the way the cloud ecosystem is evolving:

1) Cloud Application as a Service.
E.g., Google Docs, Salesforce, Hosted Exchange, etc.

2) Infrastructure as a Service.
E.g., Amazon EC2, which enables complete host deployments, including support for Windows, Linux, etc., as well as a variety of databases.

3) Platform as a Service (PaaS).
E.g., Microsoft Windows Azure Services and SQL Data Services: this suite of products aims at giving developers their familiar .NET development environment and a highly scalable database with ADO.NET support, for building and deploying enterprise services.

Read Microsoft's detailed strategy here:

The million (or should I say billion) dollar question remains: which strategy will deliver the thunder, and how will all this affect the computing world as a whole? Computing is definitely in the midst of an epoch no less significant than the PC revolution. Only time will tell...

Thursday, November 13, 2008

Windows Application Server Code Named "Dublin"

Windows Server: Application Server Demands of Today’s Agile Businesses

Windows Server delivers a platform for deploying and running custom applications built with the Microsoft .NET Framework and includes key application server functionality directly in the operating system.

As companies increasingly adopt service-oriented architecture (SOA) principles and embrace composite applications, they reuse services and compose new applications quickly and easily. New requirements arise for the application server:

1. Composite applications are typically more complex for IT to deploy, manage and evolve. This creates a need for developers to write more complex infrastructure code and for more sophisticated operations, deployment and management capabilities on the application server than exist today.
2. Composite applications present new challenges around scalability, performance and reliability. The tried-and-true strategies for optimizing traditional applications do not suffice in the more complex environment of composite applications.

To address these requirements, composite applications must adopt more sophisticated application architectures, including management of highly asynchronous transactions, automation of long-running durable workflows, coordination of processes across heterogeneous environments, and seamless interoperability across platforms using standards. To manage this complexity, customers prefer to leverage new tools and techniques alongside traditional approaches in a single application server design and runtime environment.

.NET Framework 4.0 and “Dublin” Meet the Needs

To address these new requirements, Microsoft is enhancing Windows Server including key components in the .NET Framework 4.0 release by adding significant functionality to the next version of Windows Communication Foundation and Windows Workflow Foundation. It is also introducing a set of enhanced Windows Server application server capabilities code-named “Dublin,” which offer greater scalability and easier manageability, and will extend Internet Information Services (IIS) to provide a standard host for applications that use workflow or communications.

Taken together, these enhancements to the Windows Server application server will simplify the deployment, configuration, management and scalability of composite applications, while allowing developers to use their existing skills with Visual Studio, the .NET Framework and IIS. This new application server capability will be delivered as a separate release of technologies that can be downloaded and used by Windows Server customers. The first preview was available at Microsoft’s Professional Developers Conference, Oct. 27–30, 2008, and the exact timing of beta and release-to-market will be based on customer and partner feedback from this community technology preview (CTP).


Q: What application server technologies are coming in Windows Server and .NET Framework 4.0?

Windows Communication Foundation 4.0

Representational state transfer (REST) enhancements
· Simplified building of RESTful services
· Templates to accelerate building Singleton & Collection Services, Atom Feed and Publishing Protocol Services, and HTTP Plain XML Services
Messaging enhancements
· Protocols: WS-Discovery, WS-I BP 1.2
· Duplex durable messaging
Correlation enhancements
· Content- and context-driven, one-way support
Declarative workflow services
· Seamless integration between Windows Workflow Foundation and Windows Communication Foundation and unified Extensible Application Markup Language (XAML) model
· Ability to build an entire application in XAML, from presentation to data to services to workflow

Windows Workflow Foundation 4.0

Significant improvements in performance and scalability
· Performance gains in all aspects of Windows Workflow Foundation at design time and runtime
· At least a tenfold improvement in performance
· Improvements in serialization performance and size needs
New workflow flow-control models and prebuilt activities
· New flowchart control model
· Expanded built-in activities: Windows PowerShell, database, messaging, etc.
Enhancements in workflow modeling
· Persistence control, transaction flow, compensation support, data binding and variable/argument scoping
Updated visual designer
· Easier to use by end users
· Easier to rehost by independent software vendors (ISVs)
· Ability to debug XAML

Windows Server “Dublin” technologies

Provides standard host for Windows Workflow Foundation and Windows Communication Foundation applications
Prebuilt developer services

· Message-based correlation
· Content-based message forwarding service
· Visual Studio templates
Greater scalability and easier manageability
· Enables scale-out of stateful workflow applications
· Enhanced management and monitoring functions
· Tracking store for workflow events
Supports a set of Microsoft’s forthcoming modeling technologies currently code-named “Oslo”

Q: How will “Dublin” be packaged and made available for customers to use?
A: “Dublin” will initially be made available for download and use by Windows Server customers; later, “Dublin” will be included in future releases of Windows Server. “Dublin” will be fully supported; customers with current support contracts, such as those available through Microsoft Software Assurance rights, will be able to take advantage of “Dublin” support under their existing contracts. “Dublin” will first become available after the release of the .NET Framework 4.0 and Visual Studio 2010. Thereafter, “Dublin” will have incremental releases roughly in line with the .NET Framework.

Q: Will “Dublin” support existing applications built on the .NET Framework? What should customers and partners do today to prepare?
A: Yes. “Dublin” will continue to provide backward compatibility for existing Windows Workflow Foundation and Windows Communication Foundation applications. Customers can confidently begin building applications on top of both Windows Server 2008 and .NET Framework 3.5 today, with assurances that those applications will enjoy the benefits of “Dublin” when it becomes available.

Q: What are the customer benefits of the using Windows Communication Foundation and Windows Workflow Foundation with “Dublin”?
A: The 4.0 release of .NET Framework represents the second generation of the Windows Communication Foundation and Windows Workflow Foundation technologies. For the .NET developer, the 4.0 enhancements include these:
- Simplified coordination of work
- Ability to express applications and services in a way that makes sense to individual teams and businesses
- A framework for durable, long-running applications and services
Taken together in 4.0, Windows Communication Foundation and Windows Workflow Foundation integrate much more naturally, allowing developers to better model complex communication patterns in a full-declarative fashion. Together, they ease the development of distributed applications that cross service boundaries.
With “Dublin,” .NET developers can use the technologies they are already familiar with to build applications. They can use the powerful hosting capabilities of “Dublin” as a deployment vehicle on Windows Server. When .NET 4.0 applications are deployed onto “Dublin,” these enhancements to the application server in Windows Server will simplify the deployment, configuration, management and scale-out of composite applications.

Q: What is the Windows Communication Foundation REST Starter Kit?
A: The Windows Communication Foundation REST Starter Kit CTP is a set of features, Visual Studio templates, samples and guidance that enables users to create REST-style services using Windows Communication Foundation. The CTP provides new features that enable or simplify various aspects of using the HTTP capabilities in Windows Communication Foundation, such as caching, security, error handling, help page support, conditional PUT, push-style streaming, type-based dispatch and semistructured XML support. Visual Studio templates simplify creating REST-style services such as an Atom Feed Service, a REST-RPC hybrid service, Singleton and Collection Services and an Atom Publishing Protocol Service. We also provide a rich set of samples that illustrate how to use each new feature and template.
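For context, the plain WCF HTTP programming model that the Starter Kit builds on looks roughly like this; the service name and URI template below are illustrative, not part of the Starter Kit itself:

```csharp
using System;
using System.ServiceModel;
using System.ServiceModel.Web;

// A sketch of a REST-style WCF service using the standard .NET 3.5
// System.ServiceModel.Web attributes (the Starter Kit layers templates
// and helpers on top of this model).
[ServiceContract]
public class ArticleService
{
    [OperationContract]
    [WebGet(UriTemplate = "articles/{id}")]
    public string GetArticle(string id)
    {
        // A real service would return a data contract serialized as XML or Atom.
        return "article " + id;
    }
}

public static class HostProgram
{
    public static void Main()
    {
        // WebServiceHost wires up the HTTP/REST binding automatically,
        // so GET http://localhost:8000/articles/42 reaches GetArticle("42").
        using (var host = new WebServiceHost(typeof(ArticleService),
                                             new Uri("http://localhost:8000/")))
        {
            host.Open();
            Console.WriteLine("Service running; press Enter to stop.");
            Console.ReadLine();
        }
    }
}
```

The Starter Kit's templates generate variations of this shape (Atom feeds, collection services, and so on) along with the caching, security, and error-handling plumbing listed above.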

Q: How will developers learn more about the Windows Communication Foundation REST Starter Kit?
A: There will be a page under the Microsoft Developer Network (MSDN) Windows Communication Foundation Developer Center (http://www.msdn.com/wcf/rest) with documentation, videos, white papers and a link to the CodePlex site for downloading the kit. This site will go live on Oct. 27.

Q: Will “Dublin” work with the “Oslo” modeling platform technologies?
A: Yes. “Dublin” will be the first Microsoft server product to deliver support for the “Oslo” modeling platform. “Dublin” does not require “Oslo” to operate and provide benefits of hosting .NET applications; however, administrators will be able to deploy applications from the “Oslo” repository directly to the “Dublin” application server. “Dublin” provides model-driven “Oslo” applications with a powerful runtime environment out of the box.

Q: Will “Dublin” work with Microsoft BizTalk Server’s enterprise connectivity services?
A: Yes. The integration server and application server workloads are distinct but complementary; customers want to be able to deploy them separately as needed to support their distinct requirements. For example, customers that don’t need the rich line-of-business (LOB) or business-to-business (B2B) connectivity provided by an integration server will deploy the Windows Server application server to host and manage middle-tier applications. Likewise, customers that need to connect heterogeneous systems across an enterprise, but don’t need to develop and run custom application logic, will deploy BizTalk Server. When customers need both capabilities, “Dublin” and BizTalk Server will work together nicely.

Q: What plans does Microsoft or third-party ISVs have for offering products that support the .NET Framework 4.0 and “Dublin” technologies?
A: Among the first product groups to announce plans to support “Dublin” is Microsoft Dynamics, with future versions of both the Microsoft Dynamics AX and Microsoft Dynamics CRM applications leveraging the .NET Framework 4.0 and “Dublin.” In particular, the next version of Microsoft Dynamics AX is being specifically designed to take full advantage of the enhanced capability and scale delivered in Windows Server by the enhanced “Dublin” application server technologies. Among third-party ISVs, line of business applications producers, including Dataract Pty. Ltd., Eclipsys Corp., Epicor Software Corp., RedPrairie Corp. and Telerik Inc., and software infrastructure providers, including AmberPoint SOA Management, SOA Software Inc., Frends Technology and Global360 Inc., are some of the first to already announce plans to leverage the .NET Framework 4.0 and “Dublin” technologies.

Q: How do I get more information on the .NET Framework 4.0 and Windows Server “Dublin” efforts? Is there a Microsoft Technology Adoption Program (TAP) that I can sign up for?
A: For now, the best way to get more information is to visit our Web site at http://www.microsoft.com/net. There, we’ll provide updates, previews of the technology as they become available, and information regarding the TAP.

Wednesday, November 5, 2008

C# Interfaces v/s Concrete Classes

When designing architectures in .NET, we frequently use interfaces for parameter types in our method signatures. This post will help to explain why we should choose to do this and the benefits of coding in this manner.

Let's say that you have the following two methods implemented in your data-access layer. The first calls the database and returns a result set through a SqlDataReader. The second method fills a list of articles by iterating through the result set in the SqlDataReader and adding an article to the list for each row. Let's assume it looked like this:

public IList<Article> GetAllArticles()
{
    using (SqlConnection connection = new SqlConnection(_connectionString))
    {
        SqlCommand command = new SqlCommand();
        command.Connection = connection;
        command.CommandType = CommandType.StoredProcedure;
        command.CommandText = "GetAllArticles";
        connection.Open();
        SqlDataReader reader = command.ExecuteReader(CommandBehavior.SingleResult);
        return FillArticles(reader);
    }
}

private IList<Article> FillArticles(SqlDataReader reader)
{
    IList<Article> articles = new List<Article>();
    while (reader.Read())
    {
        Article article = new Article();
        article.ArticleID = (int)reader["ArticleID"];
        article.Title = (string)reader["Title"];
        article.Body = (string)reader["Body"];
        article.Published = (DateTime)reader["Published"];
        articles.Add(article);
    }
    return articles;
}

As you can see, the FillArticles method expects a SqlDataReader (a concrete class). Now let's assume that you are told articles will no longer be stored in the database, but rather in XML files. To make this change, you would refactor GetAllArticles() to handle XML access and then pass your XML-based reader to the FillArticles() method. Unfortunately, you will get a compile-time error, because FillArticles is expecting a SqlDataReader.

How do we fix this? Well, in short, SqlDataReader implements an interface called IDataReader, which defines members such as Read(), NextResult(), Close(), and RecordsAffected. By changing the parameter type from SqlDataReader to IDataReader, you can still call Read(); however, you can now pass in any concrete class that implements IDataReader, including a reader you write yourself over the XML files. (Note that the built-in XmlReader does not implement IDataReader, so for the XML case you would wrap your XML access in your own IDataReader implementation.) Here is what the refactored code looks like:

private IList<Article> FillArticles(IDataReader reader)
{
    IList<Article> articles = new List<Article>();
    while (reader.Read())
    {
        Article article = new Article();
        article.ArticleID = (int)reader["ArticleID"];
        article.Title = (string)reader["Title"];
        article.Body = (string)reader["Body"];
        article.Published = (DateTime)reader["Published"];
        articles.Add(article);
    }
    return articles;
}
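One practical payoff of the IDataReader signature: DataTableReader (returned by DataTable.CreateDataReader) also implements IDataReader, so FillArticles can be exercised from a unit test without a database. The Article class and FillArticles are repeated below so the snippet stands alone:

```csharp
using System;
using System.Collections.Generic;
using System.Data;

public class Article
{
    public int ArticleID { get; set; }
    public string Title { get; set; }
    public string Body { get; set; }
    public DateTime Published { get; set; }
}

public static class ArticleData
{
    // Same method as in the post, accepting the interface.
    public static IList<Article> FillArticles(IDataReader reader)
    {
        IList<Article> articles = new List<Article>();
        while (reader.Read())
        {
            Article article = new Article();
            article.ArticleID = (int)reader["ArticleID"];
            article.Title = (string)reader["Title"];
            article.Body = (string)reader["Body"];
            article.Published = (DateTime)reader["Published"];
            articles.Add(article);
        }
        return articles;
    }

    public static IList<Article> Demo()
    {
        // Build an in-memory table; CreateDataReader returns a
        // DataTableReader, which implements IDataReader.
        var table = new DataTable();
        table.Columns.Add("ArticleID", typeof(int));
        table.Columns.Add("Title", typeof(string));
        table.Columns.Add("Body", typeof(string));
        table.Columns.Add("Published", typeof(DateTime));
        table.Rows.Add(1, "Hello", "World", new DateTime(2008, 11, 5));

        return FillArticles(table.CreateDataReader());
    }
}
```

No SQL Server, no connection string: the same FillArticles body runs unchanged against the in-memory reader.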

There are many ways to discover which common base types are available to use as parameters. One option is the Object Browser in Visual Studio (Ctrl + Alt + J): search for the concrete class and expand its Base Types folder to see what it implements. Additionally, ReSharper will tell you when you can weaken a parameter type based on which members you actually use. Finally, Lutz Roeder's .NET Reflector will let you find the base types of each class.

Thinking about future issues and maintenance problems while developing projects will save lots of heartache when design changes are made late in the process. I hope this made sense. If I can clarify anything, please let me know.

Source: http://lowrymedia.com/blogs/technical/


Tuesday, November 4, 2008

Windows Azure - What is it?

Last week's PDC 2008 in Los Angeles created a lot of buzz, primarily around two products from the Microsoft stable:

1) Windows Azure - Microsoft's foray into the Cloud computing world
2) Windows 7 - Successor to Windows Vista

I will take up the first product here, as it is, in the words of Ray Ozzie, going to define the next computing platform for enterprises.

I am not going to dwell too much on the nitty-gritty of the Azure platform here; that I will spare for another post. Here I attempt to put forward a bird's-eye view of what Azure is. Excerpts from the Microsoft site follow.

Azure is a flexible platform that can be used as a whole or in part. You can run entire applications in the Windows runtime environment or utilize individual services.

Write Applications to Run On Windows Azure

Developers can start by writing applications for Windows Azure™ by using the Microsoft® .NET Framework and Microsoft Visual Studio®. Write web or mobile applications, or author web services. In the future there will be support for both Microsoft and non-Microsoft programming languages and development environments.
Once you’re done coding the application, deploy it to the cloud, run it in Windows Azure, and make it available via the Internet to your end users. Scale compute capacity up or down based on traffic.

Use Azure Services In Online and On-Premises Applications
Take your cloud application to the next level by adding new functionality using additional Azure services. Use Live Services to reach over 460 million Live users; Microsoft .NET Services for workflow, access control, or service bus functionality; or the Microsoft SQL Services cloud database. Developers can also write applications and web services that can be consumed by business partners or consumers.
Additionally, Azure services can also be used to augment an existing application that runs on a PC or a server to give on-premises software cloud capabilities. The services use industry standard SOAP, REST and XML protocols so using them won’t be a problem regardless of the operating system or programming language you’re using.

Bring It All Together

The Azure Services Platform is a cloud operating system and collection of services that can deliver web, mobile, or hybrid software-plus-services applications to users. Existing software can utilize the services to add cloud capabilities, and developers can easily write applications for the cloud to be used by end users, or write services that can be consumed within other applications.


This is a new beginning!!!
A beginning which took a long time coming.
As with many things in life, it is customary to associate a "name" with an entity. Choosing the name can be quite a task.

A lot of deliberation and some lookups later, I came up with the name "Memetic-thoughts".

Memetic: (from meme)

A meme (pronounced /miːm/): consists of any idea or behavior that can pass from one person to another by discovery, learning or imitation. Examples include thoughts, ideas, theories, gestures, practices, fashions, habits, songs, and dances. Memes propagate themselves and can move through the cultural sociosphere in a manner similar to a contagious behaviour.

Memento: A behavioural Design Pattern in Software.

The above definitions sum up pretty much what I aim to achieve through this blog. Any ideas, thoughts, solutions, or designs (the list is endless) that fascinate or intrigue me will find a place here.

Let the thoughts flow !!!