Sketching an actual solution.

It’s been a while since I’ve posted; project commitments have been drowning me. But the tide has now receded.

Last night (Tue 30th of June) I attended the Melbourne Silverlight Designer and Developer Network (SDDN) event. It’s always great fun and interesting, and there’s always free pizza (if you get there early).

The evening opened with a presentation from Mahesh Krishnan (blogesh.wordpress.com) about the new features in Expression Blend 3, the most interesting to developers of business systems being sample data generation and IntelliSense support.

That was followed by a great demo of SketchFlow, a rapid prototyping tool (that works), by Shane Morris, a User eXperience (UX) guru from Microsoft.

SketchFlow looks to fill a gap well: designing something that can actually be played with and fine-tuned by designers, users and business analysts. Having been involved in projects where UIs left under the control of developers resulted in lots of rework once clients began to use them as part of UAT, I’d say such a tool, used properly, should improve the process greatly.

Some of the great uses for SketchFlow are quickly exploring possible flow paths in an application, designing screen layouts, and actually beginning to set up associated actions such as animations and transitions. The deal-maker is its ability to create actual, usable code without the designer/business analyst writing any; this code can then be “harvested” out when development begins. It also includes a feature designed for testers and other lovers of long documents: it can output a Word document of the screen designs, annotations and possibly feedback elements. The feedback element of the system allows a redistributable (via a Silverlight package or a central web location) for users to navigate around the application screens, and to draw and add notes. The most typical example of a “simple request”: *big circle around logo* with the text “can we make our company logo bigger?!”.

There should be a Beta release coming soon, so get a hold of it to witness its potential. It’s worth checking out the AU Remix 09 videos, including the keynote where SketchFlow is mentioned.

I won a copy of ‘Sketching User Experiences: Getting the Design Right and the Right Design’ (Interactive Technologies), available at Amazon. I hope to get a chance soon to start reading parts of it, and may make a post out of it.

I won the book by raising a question about how one could go about using SketchFlow to help design a “Version 2.0” of an existing system: incorporating real-world-looking components from the operational system side by side with new additions, including mixing and matching modifications/overhauls of existing screens. The flexibility of the tool seems to lend itself to a variety of scenarios and project types.

Update (10 Jul 2009): Download Blend 3 RC & SketchFlow Release Candidate here.
Update 2: That link isn’t a wise choice… Stay tuned to: seethelight.com


Adventures in the land of SSIS

I had to whip up a solution to a data migration requirement and had no choice but to use SQL Server Integration Services (SSIS). It is marketed as a fast and flexible tool for data extraction; I have no idea about the “fast”, and its user interface and error/warning messages make using it far from flexible. A lot of the time I found myself in a battle to achieve the simplest task without being supported by the tool. I admit that this is partly because I have no prior experience with any of the data control objects. What made matters worse was that the interface wasn’t very helpful in the names of controls or the descriptions in tooltips. Note this is my experience with Visual Studio 2005 and SSIS; it may have improved in VS 2008 or the upcoming VS 2010.

I had two objectives to achieve: joining data from two tables, and using the last generated ID in a subsequent query. It appears the latter was not even considered in the design of the tool; you would think that a data-writing control would have more outputs than just exceptions.

Having “successfully” met the basic requirements of “migrating data”, I thought I’d share the approach I took. It may not be the optimal approach, but it works, and in this scenario performance isn’t a concern.

The data is being merged from one location (for example, a legacy system) to a new system with a different data representation model. I’ve put it in the context of the “Medical System” which is the theme of my posts. In this post I introduce the concept of related patients. Simply put, a patient can be related to another patient in the system; examples of relationship types are ‘single’, ‘couple’, ‘sibling’, etc. There are other representation complexities here, but they are not relevant to the post or the SSIS discussion. The ER model is as follows:

Basic structure: many-to-many table structure
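Since the diagram itself doesn’t carry across here, below is a minimal sketch of how I picture the three tables. The exact column names and types are assumptions, inferred from the stored procedure shown further down:

CREATE TABLE dbo.Patient
(
   PatientId int IDENTITY(1,1) PRIMARY KEY
   -- other patient columns omitted
)

CREATE TABLE dbo.Relationship
(
   RelationshipId int IDENTITY(1,1) PRIMARY KEY,
   RelationshipType int NOT NULL -- e.g. 1 = 'Single', 2 = 'Couple', 3 = 'Sibling'
)

-- Linking table that provides the many-to-many between patients and relationships
CREATE TABLE dbo.PatientRelationship
(
   PatientId int NOT NULL REFERENCES dbo.Patient (PatientId),
   RelationshipId int NOT NULL REFERENCES dbo.Relationship (RelationshipId),
   PRIMARY KEY (PatientId, RelationshipId)
)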

As a requirement at the point of the data merge, every patient must be created with a default ‘single’ relationship entry. This is where SSIS doesn’t make things easy. Because existing patient IDs must be maintained as part of the merge, an identity insert is performed by SSIS into the patient table, and then a new ‘single’ relationship type record must be created in the relationship table. Next comes the task SSIS doesn’t support: creating a new entry in the linking table (PatientRelationship) using the newly created ID of the ‘single’ relationship record. This leads to the need for the inbuilt database function SCOPE_IDENTITY() or its alternatives such as @@IDENTITY. I could not find a supported approach in SSIS to obtain this value via an output without the use of a stored procedure.
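As a quick aside on why I reach for SCOPE_IDENTITY() rather than @@IDENTITY: SCOPE_IDENTITY() returns the last identity value generated in the current scope, while @@IDENTITY returns the last one generated in the session regardless of scope, so a trigger that inserts into another table can silently change what @@IDENTITY reports. A minimal sketch (reusing the table names from the model above):

INSERT INTO dbo.Relationship (RelationshipType)
VALUES (1)

-- The identity generated by the INSERT above, in this scope
SELECT SCOPE_IDENTITY() AS NewRelationshipId

-- The last identity in this session, in ANY scope; if dbo.Relationship
-- had a trigger inserting into an audit table, this would return the
-- audit row's ID instead
SELECT @@IDENTITY AS LastSessionIdentity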

At this point all the material I found online suggested making use of a SQL stored procedure with an OUTPUT parameter to obtain the value directly from an insert statement. This is fine if you need to make use of it back in SSIS, but in this case all that was required was a follow-up insert statement. So I embedded the initial insert and the subsequent statement in one stored procedure, taking the PatientID of the record currently being processed by the SSIS package as the input:

CREATE PROCEDURE dbo.SetupRelationships
(   @PatientId int   )
AS
BEGIN
   INSERT INTO dbo.Relationship 
   (
      RelationshipType
   ) 
   VALUES 
   (   1   ) --Note: the ID 1 is the Foreign Key for the type 'Single'

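   -- Link the patient to the relationship row just created;
   -- SCOPE_IDENTITY() returns the RelationshipId generated by the insert above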
   INSERT INTO dbo.PatientRelationship 
   (
      PatientId, RelationshipId
   )
   VALUES 
   (  @PatientId, SCOPE_IDENTITY() )
END
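To sanity-check the procedure outside the package, it can be run directly in Management Studio (the patient ID 42 is just a made-up example value):

EXEC dbo.SetupRelationships @PatientId = 42

-- Verify the linking row was created
SELECT * FROM dbo.PatientRelationship WHERE PatientId = 42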

As a quick side note, I asked a question on Stack Overflow about mapping hard-coded values inside SSIS; the answer was to use a “Derived Column” transformation. Here is the Stack Overflow question and answer that has the tips for data formatting. Another option was to create default values on the database schema that housed the source data for the migration.

Once the stored procedure was created, making use of it in SSIS required another “Data Flow Task”, and inside that task an “OLE DB Command” to call the procedure via:

EXEC dbo.SetupRelationships ?

The question mark represents a parameter; if you had a procedure taking three input parameters and one output parameter, it would look like this:

EXEC dbo.AnotherProc ?, ?, ?, ? OUTPUT
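For reference, the matching procedure signature would look something like the sketch below; dbo.AnotherProc and its parameter names are hypothetical, purely to show how the positional question marks line up:

CREATE PROCEDURE dbo.AnotherProc
(
   @InputA int,           -- maps to the 1st ?
   @InputB int,           -- maps to the 2nd ?
   @InputC int,           -- maps to the 3rd ?
   @Result int OUTPUT     -- maps to the 4th ?, marked OUTPUT
)
AS
BEGIN
   -- placeholder body: return a value through the OUTPUT parameter
   SET @Result = @InputA + @InputB + @InputC
END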

The SSIS “Data Flow Task” now looks like this:

SSIS Data Flow Task

With the Advanced Editor properties dialog looking like this (click on the image to see the full-sized screenshot):

OLEDB command setup

The final step is to create the column mapping to supply the Patient ID into the stored procedure on the Column Mappings tab (again, click for a larger image):

OLEDB Column Mapping

That was it: “Execute Package”, and the data migrated, meeting our requirements.

Pex – Assisted White Box Testing

On a brief side note: after attending Mitch Denny’s (from Readify) presentation tonight on “Tools for Improving Software Quality”, I was introduced to a great little tool for Visual Studio 2008 and beyond. “Pex (Program EXploration) produces a traditional unit test suite with high code coverage.”

It’s a tool that writes the unit test for you!

This may seem like a contradiction of the purpose of unit testing, and a slap in the face to those promoting Test Driven Development (TDD), but that’s incorrect. The purpose of Pex is to assist in catching missed branches (especially the mundane and tedious ones from switch statements). It is to be used to polish the unit tests after the best-practice approach is followed: writing the core of the unit test before coding of the actual function begins.

Given a parameterized unit test written in a .NET language, Pex automatically produces a small unit test suite with high code and assertion coverage. To do so, Pex performs a systematic white box program analysis.

Download it here at the Research.Microsoft.com site; available for “Commercial Evaluation” and “Academic Use” as it’s still in Beta.

The tool integrates directly into Visual Studio and assists with MSTest tests. You right-click on a method and it explores the code paths, determining input variables to test areas of the code your logic currently doesn’t cater for, thereby allowing you to further expand the function itself to cater for more “expected exceptions”.

Here are some more screen shots on the official Pex site, also check out the Channel 9 video on it.