A basic walkthrough of RabbitMQ using C#.NET examples

In my last post, “A trivial message loop using RabbitMQ in C#.NET”, I demonstrated a simple concept of a circular message queue using RabbitMQ. I chose not to complicate that post with explanations of some fundamentals of RabbitMQ, leaving that for this post.

As of writing the version of RabbitMQ I am using is 2.4.1.

The concepts I introduced in the last post are listed below; I’ll break them down in what is hopefully a logical flow, as steps to take.

  1. IConnection
  2. IModel
  3. ExchangeDeclare()
  4. QueueHelper.EnsureQueue()
  5. Subscription
  6. SimpleRpcClient
  7. SimpleRpcClient.Cast()
  8. Your own SimpleRpcServer
  9. HandleSimpleCast

First, if you’re looking to get set up on Windows, head on over to the RabbitMQ official documentation and follow the steps.

Also, before you continue reading, I urge you to go have a look at Mike Hadlow’s EasyNetQ API and accompanying blog posts.

Some other good blog posts I stumbled upon after starting to write my version of this come from Simon Dixon; at the time of writing he had a 3 part series (1, 2, 3).

On to my take. I’ll be explaining RabbitMQ along the lines of how my simple demonstration application functions; as a reminder, it’s available on BitBucket – https://bitbucket.org/NickJosevski/rabbitloop

Step 1 – The Connection

I am using ConnectionFactory with a local machine address to create the connection; the client and server will use it to build the next required component.

var factory = new ConnectionFactory { Address = "localhost" };
IConnection connection = factory.CreateConnection();

Step 2 – The Model

Using the connection, create a model; this is used to configure the rest of the details of how communication will take place between the client and server.

IModel model = connection.CreateModel();

Step 3 – ExchangeDeclare()

Declaring an exchange sets up a named path for communication. The second parameter offers options on how messages are to be delivered; in this example I’m sticking with the simple choice of ‘Direct’.

var exchange = "my.exchange";
model.ExchangeDeclare(exchange, ExchangeType.Direct);

Step 4 – QueueHelper.EnsureQueue()

This is my own helper extension method that performs a few tasks: creating a queueId, using that queueId to declare a queue on the model, then binding the queue and exchange together, and finally returning the newly created queueName.

public static class QueueHelper
{
    public static string EnsureQueue(this IModel ch, String exchangeName)
    {
        var queueId = String.Format("{0}.reply", exchangeName);
        var queueName = ch.QueueDeclare(queueId, false, false, false, null);
        ch.QueueBind(queueName, exchangeName, "", null);
        return queueName;
    }
}

Step 5 – Subscription

The subscription is supplied to the server logic, and is created using the model and the queueName:

var sub = new Subscription(model, queueName);

Step 6 – SimpleRpcClient

Creating the client is just as simple, and we’re ready to send a message.

var client = new SimpleRpcClient(model, queueName);

Step 7 – SimpleRpcClient.Cast()

The Cast() method on the SimpleRpcClient is the asynchronous way to send a message. I want to keep this post straightforward, so I am excluding the message class and how to serialize it, but those steps are very basic.

var myMsg = new SerializableClassYouHaveCreated().Serialize();

client.Cast(new BasicProperties(), myMsg);

Step 8 – NicksSimpleRpcServer

This is where things start to get a little more involved, but it’s relatively straightforward. In this approach I create my own RpcServer that extends SimpleRpcServer.

public class NicksSimpleRpcServer : SimpleRpcServer
{
    private readonly SimpleRpcClient rpcClient;

    public NicksSimpleRpcServer(Subscription subscription, SimpleRpcClient client)
        : base(subscription)
    {
        rpcClient = client;
    }
}

Step 9 – HandleSimpleCast

All I do now is override the method that handles asynchronous messages (it also needs to de-serialize the content from the byte[] body). To follow along from my sample application, the server also has a client private member so it can continue to forward a modified message onwards.

//as an override method inside NicksSimpleRpcServer:
public override void HandleSimpleCast(Boolean isRedelivered, IBasicProperties requestProperties, byte[] body)
{
    //deal with the incoming message, create a new message to forward
    rpcClient.Cast(new BasicProperties(), msgToForward);
}

This is where our story comes to an end… As a final step we need to get our server running…

Using the Task Parallel Library, create a Task that kicks off another method of the RpcServer – MainLoop(). MainLoop simply sits there (blocks) and accepts incoming messages; each time an asynchronous message arrives, HandleSimpleCast will fire and the message will be processed.

var server = new NicksSimpleRpcServer(sub, client);

new Task(server.MainLoop).Start();
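To tie the nine steps together, here’s a rough end-to-end sketch. It assumes a broker running on localhost, reuses the QueueHelper and NicksSimpleRpcServer from above, and still elides the message class and its serialization (along with some namespaces), so treat it as a sketch rather than a compile-ready listing:

```csharp
using System;
using System.Threading.Tasks;
using RabbitMQ.Client;
using RabbitMQ.Client.MessagePatterns;

class Walkthrough
{
    static void Main()
    {
        //Steps 1 & 2 – connection and model
        var factory = new ConnectionFactory { Address = "localhost" };
        using (IConnection connection = factory.CreateConnection())
        using (IModel model = connection.CreateModel())
        {
            //Step 3 – declare the exchange
            var exchange = "my.exchange";
            model.ExchangeDeclare(exchange, ExchangeType.Direct);

            //Step 4 – declare and bind a queue (QueueHelper from above)
            var queueName = model.EnsureQueue(exchange);

            //Steps 5 & 6 – subscription and client
            var sub = new Subscription(model, queueName);
            var client = new SimpleRpcClient(model, queueName);

            //Steps 8 & 9 – the server, running its MainLoop on a Task
            var server = new NicksSimpleRpcServer(sub, client);
            new Task(server.MainLoop).Start();

            //Step 7 – send an asynchronous message
            var myMsg = new SerializableClassYouHaveCreated().Serialize();
            client.Cast(new BasicProperties(), myMsg);
        }
    }
}
```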

Hope this is easy to follow.

A trivial message loop using RabbitMQ in C#.NET

During the May 2011 meeting of the Melbourne ALT.NET group, 3 presenters, each with their chosen functional language, tackled a basic problem of transmitting a message along a chain, with the objective of all nodes contributing to a final output. Here’s some of their code in the languages of Erlang, Scala and F#.

As an introductory post to RabbitMQ for my blog, I thought I would cover how you go about setting up a simple RabbitMQ server-client in an asynchronous fashion. In this post I’m only going to cover the abstract concept and introduce some terms; I’ll go into the specific technical details of using RabbitMQ in another post. I presented this concept as a very quick 10 minute lightning talk at DDD Melbourne 2011.

So stay tuned for more RabbitMQ based posts (links will appear here when they’re complete).

The objective:

  1. Start with a message that contains a letter ‘A’
  2. Send this message off to a node (N)
  3. Have that node increment the letter, e.g. first node makes it ‘B’
  4. Then that node sends it off, and so on, looping for M number of messages.
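The increment in step 3 is the only “work” a node does; a minimal sketch of it, assuming the message boils down to a single char, could be:

```csharp
using System;

public static class LetterHelper
{
    //'A' -> 'B', 'B' -> 'C', ... with 'Z' wrapping back around to 'A'
    public static char IncrementLetter(char c)
    {
        return c == 'Z' ? 'A' : (char)(c + 1);
    }
}
```

So `LetterHelper.IncrementLetter('A')` yields `'B'`, and the wrap-around keeps the loop running past ‘Z’.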

A Node:

Node Structure

The Algorithm:
My RabbitMQ and C# based solution (simplest version). This list contains some RabbitMQ specific concepts that I’ll describe later in the post. This is the algorithm to create, wire up and kick off processing.

  1. Create the N clients SimpleRpcClient.
  2. Keep a reference to the 1st client, this is what we use to begin messaging.
  3. Create N Subscription elements.
  4. Create N SimpleRpcServer elements; these are the actual nodes.
  5. Supply each node with its subscription, and with the next client along (the second client onwards).
  6. Create a new Threading.Tasks.Task() for each node.
  7. To complete the loop, wire up the first client, to the last node.

Node Communication (click for larger view):
Each node houses a client, to continue on sending the message.

Node Communication

The Code:

IConnection connection;
IModel model;

char[] letters; //A-Z, repeating

for (int x = 0; x < totalToCreate; x++)
{
    var nextLetter = letters[x];
    //this builds up a string in the format of comm.0.a, comm.1.b, etc
    var exchange = String.Format("comm.{0}.{1}", x, nextLetter);
    model.ExchangeDeclare(exchange, ExchangeType.Direct);
    var queueName = model.EnsureQueue(exchange);
    subscriptions.Add(new Subscription(model, queueName));

    clients.Add(new SimpleRpcClient(model, queueName));
}

for (int x = 0; x < totalToCreate; x++)
{
    //note the use of [(x + 1) % totalToCreate] on the clients,
    //so the final node wraps around to the first client, closing the loop
    server = new SimpleRpcServer(subscriptions[x], clients[(x + 1) % totalToCreate]);

    new Task(server.MainLoop).Start(); //MainLoop is a RabbitMQ concept
}

//Inside RpcServer
public override void HandleSimpleCast(bool isRedelivered, IBasicProperties requestProperties, byte[] body)
{
    var messageToSend = body.Deserialize().IncrementLetter();

    rpcClient.Cast(new BasicProperties(), messageToSend);
}

The code above has been simplified a bit more just to demonstrate the concept; please review the project code for the further intricacies of how it needs to operate.

Working demo code on BitBucket – https://bitbucket.org/NickJosevski/rabbitloop

*Note: you’ll have to excuse some of the roughness of the code as it stands at 29th of May, I haven’t had a chance to refactor it to be more elegant.

There is a long list of RabbitMQ concepts that are beyond the scope of this blog post, a new one will follow soon with my explanations of how they work:

  • IConnection
  • IModel
  • QueueHelper.EnsureQueue()
  • HandleSimpleCast
  • rpcClient.Cast()
  • Subscription
  • ExchangeDeclare()

Also check back I may have completed a Prezi on the subject.

Migrating a Legacy Web Forms Application to Web Forms MVP

Since I put “Legacy” in the post title to begin with, I would like to share some quotes from the often-quoted Michael Feathers, author of Working Effectively with Legacy Code. If you haven’t read it, you should – it comes highly recommended by many developers, including Scott Hanselman on this Hanselminutes episode. But, full disclosure, I myself have only read a few random sections of it here and there.

Working Effectively with Legacy Code

What do you think about when you hear the term legacy code? If you are at all like me, you think of tangled, unintelligible structure, code that you have to change but don’t really understand. You think of sleepless nights trying to add in features that should be easy to add, and you think of demoralization, the sense that everyone on the team is so sick of a code base that it seems beyond care, the sort of code that you just wish would die.

This I strongly agree with. Not to say that legacy code is always “unintelligible” or that you want the code “to die”, but very often it requires a much larger unit of time to change when compared to more desirable (maintainable) code, which leads to this quote:

In the industry, legacy code is often used as a slang term for difficult-to-change code that we don’t understand.

This can be remedied in part by unit-tests…

Code without tests is bad code. It doesn’t matter how well written it is; it doesn’t matter how pretty or object-oriented or well-encapsulated it is. With tests, we can change the behavior of our code quickly and verifiably. Without them, we really don’t know if our code is getting better or worse.

That last quote from Feathers helps to outline my motives…

At my current place of employment we have a flagship application that was originally built using ASP.NET 1.1. It was then slowly migrated to the .NET Framework 2.0, and then immediately converted for the .NET Framework 3.5. That task itself was not trivial, and now there is a new challenge: to make the code more maintainable.

The legacy system was written in … but it worked, and it worked great … It was legacy because it wasn’t done with new and shiny…

Hanselman makes a good point, even though I have paraphrased so it’s a more general statement. This applies to the application I’m working on improving. The application was in fact working “fine” in its .NET 1.1 form, but making changes/improvements was difficult. When I say difficult, in this case I mean in particular for new developers not familiar with the code base and application. Being the newest developer to work on this, I wanted the code base to improve so we could all make changes and improvements with confidence going forward.

This is the first in yet another series of posts from me. This series will focus on how we’ve gone about undertaking this migration. I’ll attempt to make each post stand reasonably alone to demonstrate migrating a specific application component from Web Forms to the Model-View-Presenter (MVP) pattern and in particular the Web Forms MVP framework.

Some of the objectives that we’re trying to achieve from this framework migration include:

blog.fossmo.net MVP diagram

This also leads down the path of improved development practices for the team:


So then what…

Web Forms by design does not lend itself to achieving all the objectives listed above and it is hoped that the use of the Web Forms MVP framework will take us down a better path. I say this with full understanding that there is no such thing as a Silver Bullet in software development, and that the adoption of a pattern and a supporting framework are just steps we are taking.

We’ve come this far in the post and I haven’t exactly outlined what Web Forms MVP is and why it’s our choice. So first, how about the obligatory pasting of the front page quote from the official project site:

The ASP.NET Web Forms MVP project is about bringing the love back to Web Forms through a renewed approach to using it – an approach that facilitates separation of concerns and testability whilst maintaining the rapid development that Web Forms was built to deliver.

The choice to use Web Forms MVP as the framework to help us achieve the MVP pattern in our application comes from the knowledge of successful implementations by Readify on large scale sites such as Grays Online, along with an active community of developers working on the framework, and its acceptance into the CodePlex Foundation.
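To give a flavour of the pattern before the detailed posts, a Web Forms MVP presenter roughly takes the following shape. The names IGreetingView and GreetingPresenter are hypothetical, and I’m glossing over how the view gets registered with the framework:

```csharp
using System;
using WebFormsMvp;

//the contract the code-behind (the view) will implement
public interface IGreetingView : IView
{
    event EventHandler LoadGreeting;
    string Greeting { get; set; }
}

//the presenter holds the logic we want out of the code-behind,
//and is what our unit tests will exercise
public class GreetingPresenter : Presenter<IGreetingView>
{
    public GreetingPresenter(IGreetingView view)
        : base(view)
    {
        View.LoadGreeting += (sender, e) => View.Greeting = "Hello from the presenter";
    }
}
```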

Examples coming up…

Now that we’ve got the background and I’ve publicly committed to “better code” and “testable code”, the next post in this series will be about moving your OnLoad & Page_Load methods out of your code-behind files and some of the challenges.

That’s Me!lbourne .NET User Groups

A small list of Melbourne, Australia .NET and related User Groups.

Microsoft .NET

I had a very busy week just past with various User Groups, so I thought it would be handy for myself, and possibly others, to make a list. This is by no means an exhaustive list, and I hope to expand it as I discover others.

It’s great to catch up with fellow developers, to share what you’re currently working on in your day job, what you’re working on at home, or even those potential “when I get some time” ideas. With all of us sharing our development frustrations on Twitter when we get stuck, it’s great to meet in person those who offer helpful suggestions, and even those who poke fun at the problem to lighten up your day.

If you’re not already attending a user group, I strongly advise you to make some time at least every few months to attend and interact with like-minded developers.

My apologies upfront if I have missed any that should be on here, let me know (leave a comment, @twitter me) and I’ll update.

The Obvious
These groups have been around for a while and are well known.

  • Victoria.NET – various .NET presentations. On average every 6-10 weeks, key contact: @MaheshKrishnan, meetings at Microsoft offices in Freshwater Place.

  • Silverlight Designer Developer Network (SDDN) – Silverlight, UX, and design. On average every 6-8 weeks, key contact: @Jakkaj, meetings at Microsoft offices in Freshwater Place.

  • SQL PASS – Microsoft SQL Server related topics. On average every 4-5 weeks, key contact email point: info at sqldownunder dot com.

The Up and Coming
Seed growing in hand
These groups are relatively new, and may not yet be well known.

  • xRM – Microsoft CRM related, newly started, key contact: @Ceibner. This is a new group; I haven’t yet had the pleasure of attending.

  • Doing .NET Days – also a new group, having their second meeting in March ’10; the kick-off focus revolves around nServiceBus. This is a weekend event, on Saturdays in Melbourne’s west. Key contact: @SimonSegal.

  • Dev Evening – another recently formed group, which I plan to attend. They are run in Richmond during weekday evenings. On average every 4-6 weeks, key contact: @alexjmackey.

  • Melbourne ALT.NET – starting on 28th April 2010, held on the last Wednesday of the month, key contact: @abienert (see my event summary post for more details).

The Not So Obvious
Ace up your sleeve
These groups are not focussed on .NET, but sometimes cover .NET topics.

  • Melbourne Patterns & Practices Group – discussion of general software design practices, and their practical application in software development, key contact: @ChrisBushellOz.

  • Melbourne X User Group – very wide-ranging software and technology topics.

Side Note: “That’s Me!lbourne” is a (as of 2009-2010) promotional branding for The City of Melbourne.

Update: For an extensive list of Melbourne software/tech related user group meetings, check out Geoff Burns’ blog, where he has a Google Calendar widget; this is very useful.

A Quick Visit to the Clouds

This evening (23 Feb 2010) I attended a Melbourne .NET session presented by David Lemphers, who is a Senior Program Manager on the Windows Azure team.

David presented an introductory-level discussion of how Azure works; we were run through the basics of setting up a hello world application and running it locally on a simulated cloud. A side note here: currently the simulated cloud can only be accessed on a single developer machine – meaning you can’t yet host a simulation cloud as a networked machine to test your application. Such a feature should be coming in the future.

We were also taken through what it takes to push it up to the real cloud. The walkthrough continued to cover how the Azure system works, with Roles – a web role representing IIS in the cloud, and a worker role representing a Windows service in the cloud – and Storage Accounts representing containers for data, message queues and blobs.

Windows Azure

Windows Azure is a cloud services operating system that serves as the development, service hosting and service management environment for the Windows Azure platform. Windows Azure provides developers with on-demand compute and storage to host, scale, and manage web applications on the internet through Microsoft datacentres.

Now that just means it’s supposed to be easier to get your applications hosted on servers you don’t need to own or administer, for a fee. From what I have seen so far it is in fact simple in terms of developer difficulty, but there’s a whole host of other considerations before pushing a business-critical application up into the cloud. A quick list of concerns would begin with security, availability, privacy and confidentiality, but that’s a discussion I would rather not kick off here, at least not right now.

There was a brief discussion of load handling and scaling in terms of pricing, so I asked an AllYourClouds.com question about it here. There was also mention of possible programmatic control of active instances, but probably only within the range of what’s already part of the billing plan.

From an Australian perspective the Windows Azure service is lacking for the moment, with prices significantly more expensive in the Asian data centres, combined with no locally (Australia) based data centres being available. The lack of data centres on Australian soil ties back to concerns about whether certain types of data are able to be stored up in the cloud.

Some take away notes:

  • In the Visual Studio 2010 Release Candidate, to use the Azure SDK you need the February release
  • For VS 2010 Beta 2 and prior, the older Azure SDK is fine
  • Microsoft is dogfooding the Azure platform with the billing system for Azure
  • Also, services like Sea Dragon (the Deep Zoom tech) use a similar approach to Azure’s core functionality
  • You currently cannot run .NET 4 Framework applications on Azure; support will come when .NET 4 reaches RTM
  • Expect a 6 month release cycle, with a road map coming soon
  • The storage capacity of Azure based X Drive for cloud storage
  • Data caching and concurrency is still a problem to be solved in development; an example suggestion for caching is memcached
  • The Azure team is currently investigating SMTP in the cloud

Additional Resources:

Coming up: Melbourne Cloud Code Camp, April 1st 2010.

Victoria.NET January 2010 Session

I just got home from a slightly longer than usual Melbourne Vic.NET session. It was a very intense night; so intense I’ve got a second post lined up just to cover the second presentation. But first, there were a few announcements:

  • David Burela reminded us of the April Cloud Camp.
  • Mahesh reminded us of the Silverlight Code Camp weekend at the end of January. I’m confirmed to be attending this.
  • Also that the User Group is looking for company sponsorship, as the current budget is shrinking.

The two topics of the evening were:

  1. An overview and walk-through of some of the features in ASP.NET MVC 2, presented by Malcolm Sheridan and
  2. Command Query Responsibility Segregation, presented by Udi Dahan, which I discuss in greater detail here [link coming soon].

The ASP.NET MVC 2 talk was a quick walk-through with tips. The take-away notes were:

  • Areas are useful in particular the ability to have them in a separate project (though that’s currently not functional in the RC). MSDN Link.
  • When using areas be careful on how your routes are impacted
  • Improved validation and custom validation options, through the use of the ValidationAttribute class
  • Validation re-use, in conjunction with Dynamic Data.
  • Other miscellaneous improvements, such as shorter AcceptVerbs; Get/Post.

Contractual Obligations

Last night, Wednesday 2nd of December, I attended a presentation at the Melbourne Patterns & Practices User Group. After the Gang of Four pattern discussion (which was Chain of Responsibility) there was a presentation on .NET Code Contracts.

Code Contracts are a Microsoft Research project that now has a beta release.

Code Contracts provide a language-agnostic way to express coding assumptions in .NET programs. The contracts take the form of preconditions, postconditions, and object invariants.

At their simplest level of application in a code base, Code Contracts help group guard conditions for functions, and also easily support exit guard conditions for when a public method completes.

public List<Markers> ExtractGeneticMarkers(List<BioSample> samples)
{
    //Pre conditions use:
    Contract.Requires(samples != null);
    //a lambda expression to perform the contract check on all elements
    Contract.Requires(samples.All(s => s.geneticData != null));

    //Post conditions use:
    Contract.Ensures(Contract.Result<List<Markers>>() != null);

    var markers = new List<Markers>();
    //function logic
    return markers;
}

A summary of some of the benefits:

  • Compile time contract validation and error(/warning) output.
  • Runtime contract validation and exceptions thrown.
  • Toggling the contracts per assembly [Full / Pre & Post / Pre / ReleaseRequires / None].
  • Inheritance of contracts, even from interfaces.
  • Outputting documentation from the contracts, for accurate reflection of the state of the code.
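Contract inheritance from an interface (the last-but-one point above) deserves a small sketch. ICalculator and its “buddy” class below are hypothetical names; the attribute pairing is the standard Code Contracts mechanism:

```csharp
using System.Diagnostics.Contracts;

[ContractClass(typeof(CalculatorContracts))]
public interface ICalculator
{
    int Divide(int numerator, int denominator);
}

//a "buddy" class that holds the contracts for the interface;
//every implementation of ICalculator then inherits this precondition
[ContractClassFor(typeof(ICalculator))]
internal abstract class CalculatorContracts : ICalculator
{
    public int Divide(int numerator, int denominator)
    {
        Contract.Requires(denominator != 0);
        return default(int); //dummy return, never executed
    }
}
```

The binary rewriter (ccrewrite) injects the Requires into each implementation at build time, so implementers don’t repeat the guard.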

It’s quite an extensive discussion to cover all its potential applications and what can be achieved, so for more info check out some other posts on the topic too:

How much memory are you currently using?

Following on from my previous post about a demo PLINQ application, I had some small discoveries about memory usage and wanted to blog them.

As I was loading large chunks of data into memory and started monitoring my little demo application’s RAM footprint, I discovered that even though the garbage collection system had already cleaned up my now out-of-scope variables, the RAM utilisation as reported to Windows did not necessarily drop.

This, it turns out, is actually quite a logical thing to have happen, and it works this way for good reason; I just hadn’t thought about it. I was simply expecting an explicit free-up of memory to instantly translate into freed memory in Windows.

The process was just holding onto a reserved amount of memory, with the .NET runtime assuming I would soon need that space again; and it was correct in its assumption, as I did need that memory very soon.

Use More Memory!

To prove this, subsequent button clicks to load more data (roughly the same volume) didn’t make the application’s memory footprint grow even larger. Another interesting thing I noted, with the Task Manager window open all the time, was that when I loaded additional applications and my demo app didn’t need the memory, it would free up the few hundred MB it had hold of but wasn’t currently making use of.
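You can watch a similar effect from inside the process with GC.GetTotalMemory. Note this reports the managed heap, not the working set that Task Manager shows, but the shape of the numbers tells the same story. A small sketch (the 50 MB figure is arbitrary):

```csharp
using System;

static class MemoryDemo
{
    //returns the managed heap size before a large allocation, during it,
    //and after the reference is dropped and a collection is forced
    public static long[] Run()
    {
        long before = GC.GetTotalMemory(false);

        var chunk = new byte[50 * 1024 * 1024]; //~50 MB
        chunk[0] = 1; //touch it so the allocation is real
        long during = GC.GetTotalMemory(false);

        chunk = null; //drop our only reference
        long after = GC.GetTotalMemory(true); //true = force a collection first

        return new[] { before, during, after };
    }
}
```

Running it shows the heap grow by roughly 50 MB, then shrink again once the reference is gone and a collection runs, even while the process’s working set reported to Windows may stay higher.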

Demo Application Memory Usage

LINQ Basics (Part 2) – LINQ to SQL in .NET 4.0

Continuing my 2 part series on LINQ Basics, here I will attempt to discuss a few improvements in LINQ to SQL as part of the .NET 4.0 release, scheduled soon. A key post on the 4.0 changes is by Damien Guard, who works on LINQ (among other products) at Microsoft. This will allow me to discuss some of the changes, but also to go into a bit more detail, extending the “basics discussion” started in the previous post.

Note: In October 2008 there was a storm of discussion about the future of LINQ to SQL, in response to a focus shift from Microsoft towards the Entity Framework; the key reference post for this is again one by Damien G.

The upcoming 4.0 release doesn’t cause damage to the capabilities of LINQ to SQL, as people may have worried (i.e. disabling/deactivating it in light of the discussions of October 2008). There are actually a reasonable number of improvements; but, as always, there is a great deal of potential improvement for which there was no capacity/desire, for a technology set that is not a major focus. LINQ to SQL is still capable enough to be used for business systems.

First up, there are some performance improvements, in particular surrounding caching of lookups and query plans in its interaction with SQL Server. There are a few, but not all, desired improvements in the class designer subsystem, including fixes for some flaws with data types, precision, foreign key associations and general UI behaviour.

There is a discussion in Damien’s post about 2 potential breaking changes due to bug fixes.

  1. A multiple foreign key association issue – which doesn’t seem to be a common occurrence (in any business systems I’ve been involved in).
  2. A call to .Skip() with a zero input: .Skip(0) is no longer treated as a special case.

A note on .Skip(): such a call is translated into a subquery with the SQL NOT EXISTS clause. This comes in handy with the .Take() function to achieve a paging effect on your data set. There seems to be some discussion of potential performance issues on large data sets, but nothing conclusive came up in my searches; this post by Magnus Paulsson was an interesting investigation, though.
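The paging shape is easy to sketch in LINQ to Objects, where the same operators apply. GetPage here is my own hypothetical helper, not part of the framework:

```csharp
using System.Collections.Generic;
using System.Linq;

static class PagingDemo
{
    //fetch one "page" of results: skip over the earlier pages,
    //then take one page's worth of items
    public static List<int> GetPage(IEnumerable<int> source, int pageIndex, int pageSize)
    {
        return source.Skip(pageIndex * pageSize)
                     .Take(pageSize)
                     .ToList();
    }
}
```

For example, `PagingDemo.GetPage(Enumerable.Range(1, 10), 1, 3)` returns the second page: 4, 5, 6.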

As for the bug: it does seem logical to have treated .Skip(0) as a special case, as it possibly helped simplify code that might be passing a zero (or an equivalent null set). But if it was an invalid case for .Skip() to be applied to, then with a value greater than zero it will now fail, forcing you to improve the query or its surrounding logic.

Just for the sake of an example, here are .Skip() & .Take(). Both methods have an alternate use pattern too: SkipWhile() and TakeWhile(). The While variants take a predicate as input to perform the subquery, instead of a count.

var sched = (from p in db.Procedures
             where p.Scheduled == DateTime.Today
             select p)
            .Skip(20)  //e.g. skip the first two pages
            .Take(10); //page size of 10

There are also improvements to SQL Metal, which is the command line code generation tool. As a side note, to use SQL Metal to generate a DBML file for a SQL Server database, execute a command such as this in a Visual Studio command prompt:

    sqlmetal /server:SomeServer /database:SomeDB /dbml:someDBmeta.dbml

LINQ Basics

As part of the preparation work I’m doing for a presentation to the Melbourne Patterns & Practices group in October on LINQ and PLINQ, I thought I would cover off the basics of LINQ in .NET 3.5 and 3.5 SP1 in this post, the changes in .NET 4.0 in the next post, and then a discussion of PLINQ in a third post.

O.k., so let’s sum up the basics. The core concept of LINQ is to easily query any kind of data you have access to, in a type-safe fashion; be it SQL Server stored data, a collection (i.e. something implementing IEnumerable), or an XML data structure. A further addition to this power is the ability, through C# 3.0, to create projections onto new anonymous structural types on the fly. See the first code sample below for a projection; in that simple example it’s the creation of an anonymous type that has 2 attributes.

Continuing on with my “medical theme” used for the WCF posts, here is a simple schema layout of the system, consisting of a patient and a medical treatment/procedure hierarchy. This is given the title of ‘MedicalDataContext’ to be used in our LINQ queries.

Medical System Basic Schema

These items have been dragged onto a new ‘LINQ to SQL Classes’ diagram from the Server Explorer > Data Connections view of a database.

Server Explorer Window

To create the ‘LINQ to SQL Classes’ diagram simply add …

Add New Item

a new …
Linq to SQL Classes

Back to the logic. We have a Patient who’s undergoing a certain type of treatment, and that treatment has associated procedures. To obtain a collection of procedures for today, and the name of the patient who will be attending we simply build up a new query as such:

var db = new MedicalDataContext();

var sched = from p in db.Procedures
            where p.Scheduled == DateTime.Today
            //Name and Patient.FullName come from the sample schema (see note below)
            select new { Procedure = p.Name, Patient = p.Patient.FullName };

Note: The patient table structure doesn’t have a field called ‘FullName’; I make use of a partial class, extending its properties to add a read-only representation. Check out this post by Chris Sainty for more info on making use of partial classes with LINQ.
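The partial class trick looks something like this; the property names on the generated half are assumed for illustration:

```csharp
using System;

//hypothetical shape of the designer-generated half of the class;
//LINQ to SQL generates its classes as partial
public partial class Patient
{
    public string FirstName { get; set; }
    public string Surname { get; set; }
}

//our half, kept in a separate file from the generated code,
//adding the read-only FullName the queries use
public partial class Patient
{
    public string FullName
    {
        get { return String.Format("{0} {1}", FirstName, Surname); }
    }
}
```

Because both halves compile into one class, regenerating the designer file never clobbers our addition.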

At this point we can now iterate over each item in our ‘sched’ (scheduled procedures) collection.

foreach (var procedure in sched)

This brings me to another key point: ‘Delayed Execution’ (or ‘Deferred Execution’). Check out Derik Whittaker’s ‘Linq and Delayed execution’ blog post for a more detailed walk-through.

Basically, the query we defined earlier is only a representation of the possible result set. Therefore, when you first make a call to operate on the variable representing the query results, that’s when execution will occur.

So it becomes a program flow decision: whether to always execute the query live when it’s being processed (i.e. the most up-to-date data), or to force a single execution and then make use of that data in the current process flow. A forced execution can be achieved several ways; the simplest choice is to just create a new list object via ToList(), which executes the fetching of the data.

var allProcedures = sched.ToList();
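Deferred execution is easy to see against a plain in-memory list, away from any database; a small sketch:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class DeferredDemo
{
    public static void Run(out int liveCount, out int snapshotCount)
    {
        var numbers = new List<int> { 1, 2, 3 };

        var evens = numbers.Where(n => n % 2 == 0); //no execution yet, just a definition

        numbers.Add(4); //mutate the source after defining the query

        liveCount = evens.Count(); //executes now, against current contents: sees 2 and 4

        var snapshot = evens.ToList(); //forced, one-off execution
        numbers.Add(6);
        snapshotCount = snapshot.Count; //still 2; the list doesn't track later changes
    }
}
```

Both counts come out as 2: the live query saw the 4 added after it was defined, while the ToList() snapshot ignored the 6 added after it was materialised.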

So far this has all revolved around accessing data in a SQL Server database (the setup of a LINQ to SQL class), but LINQ’s purpose is to be able to query any form of collection of data.

Now let’s say we wanted to obtain some information through recursion of a class using reflection.

Note: a business case for the use of reflection is often tied very deeply to some limitation, special case, etc., so a more specific example would take us well away from the topic of LINQ. I will keep this example trivial; it could just as easily manipulate a more useful class object to interrogate it.

var instanceMethods =
    from m in typeof(string).GetMethods()
    where !m.IsStatic
    orderby m.Name
    group m by m.Name into g
    select new
    {
        Method = g.Key,
        Overloads = g.Count()
    };

Which, output element by element, will generate this:

  { Method = Clone, Overloads = 1 }
  { Method = CompareTo, Overloads = 2 }
  { Method = Contains, Overloads = 1 }
  { Method = CopyTo, Overloads = 1 }
  { Method = EndsWith, Overloads = 3 }
  . . .

For an off-topic discussion, check out Vance Morrison’s post titled “Drilling into .NET Runtime microbenchmarks: ‘typeof’ optimizations”, as a discussion of the use of ‘typeof’ in my above query.

For a more typical business case example, here is a LINQ statement to fetch a collection of controls and operate on them (in this case, disable them). Here we’re operating on a Control collection.

Panel rootControl;

private void DisableVisibleButtons(Control root)
{
    var controls = from Control r in root.Controls
                   where r.Visible
                   select r;

    foreach (var c in controls)
    {
        if (c is Button) c.Enabled = false;
        DisableVisibleButtons(c); //recursive call
    }
}

//kick off the recursion:
DisableVisibleButtons(rootControl);

  • LINQ to SQL generates classes that map directly to your database.
  • LINQ helps you manipulate strongly typed results (including intellisense support).
  • “LINQ to SQL” is basically “LINQ to SQL Server” as that is the only connection type it supports.
  • There is a large set of extension methods out of the box check out for samples: 101 LINQ Samples.