Don’t go (Entity Framework) code first

Code first (Entity Framework) allows developers to forget about SQL, deploy schemas and keep everything under source control. As long as you don’t think about the future, all is fine and dandy. But the life cycle of a solution, and especially its database, is only starting when it is first developed. The new Fluent Migrations framework and SQL Server Data Tools have got me thinking about which tooling will, in the end, retain its value later in the life cycle.

Using code first, getting started is easy and the initial deployment is close to zero effort. Unfortunately this is where the sunshine ends and the dark clouds appear. As time goes by, the production database fills up with masses of business data and other systems start relying on the views and schema. Some Monday morning your boss asks you to add a couple of columns, remove some deprecated ones and maybe do some renaming while you’re at it. To be able to test your changes you of course start by copying part of the production data to the test server. Now the fun starts! You first edit your (code first) entity classes, then fire up SQL Server Management Studio to edit the database and finally give the changes a thorough testing. To finish off you prepare SQL scripts to get the production database updated. As this is all manual work, you find yourself thinking that life could be easier – and so it can!

Imagine coding your schema first with Fluent Migrations, then letting standard Entity Framework generate your entities based on the database. Now, no matter what changes come, you can always make them first in code, then let Visual Studio update the entities and finally deploy your changes without leaving your IDE. All the promises of code first, without code first. Oh happy days!
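
As a minimal sketch of the idea, a schema change expressed with FluentMigrator could look something like this (the version number, table and column names are made up for illustration):

    using FluentMigrator;

    // Illustrative migration: adds an Email column and drops a deprecated Fax column
    [Migration(201204020001)]
    public class AddEmailDropFax : Migration
    {
        public override void Up()
        {
            Alter.Table("Customer").AddColumn("Email").AsString(255).Nullable();
            Delete.Column("Fax").FromTable("Customer");
        }

        public override void Down()
        {
            Delete.Column("Email").FromTable("Customer");
            Alter.Table("Customer").AddColumn("Fax").AsString(50).Nullable();
        }
    }

Running the migrations against the test and production databases is then just a matter of pointing the migration runner at the right connection string.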

Dilemma of code first

Make your Windows 7 a little bit safer

Virus scanners and firewalls are like vitamins: they help you stay healthier but they don’t prevent you from getting sick. The other day I found the Antivirus virus rampaging on my laptop. Vulnerabilities in browser add-ins are the most common way for viruses to get onto your computer. In my case there were suspicious files in both the Java and browser caches.

After removing the troublemaker with the scanner I had, and running some online scanners just to make sure, I did what I should have done a long time ago to make my Windows a little bit safer:

  • First I made a separate admin account and removed the admin rights from my standard account. Windows will now prompt me for the admin password when necessary. Experts claim 80% of the known viruses will be stopped just by doing this. I had done it on my XP machine (never had a virus scanner on that old computer) but forgot about it when I upgraded.
  • I updated my Java to the latest version and then disabled Java both from the Java console and in the browsers. I also disabled the Java cache just to be sure. The best option would of course be to remove the whole thing, but I was not sure whether I would need it later.
  • Then I installed Chrome as it’s claimed to be the safest browser and set it as default.
  • As a final touch I disabled the JavaScript support in my PDF viewer and disabled Chrome’s internal viewer, as I could not find any option to do the same there. I then used some safe PDF scripting examples available on the internet to test my configuration.

The evolution of web development

Over 13 years ago I worked as a professional web developer. In those times JavaScript was just something used to validate forms, and anything else (e.g. toggling the visibility of an element) was considered fancy and labeled as DHTML. A few weeks ago I got an email from a co-worker having problems including inline JavaScript in an ASP.NET Razor page – this sent me down memory lane.

Back in the day, the code to open a popup window could look like this:

<p><a href="javascript:window.open('http://www.google.com','mypopup')">Open google!</a></p>

This not only displays as a scary link in the browser’s status bar but is also useless in browsers not running JavaScript. So quite quickly developers shaped up and started putting the code in a separate JS file and using the available events:

function popitup(url) {  window.open(url,'mypopup'); }
<a href="http://www.google.com" onclick="popitup(this.href);return false;">Open google!</a>

This was fine for many years. But then it became modern to write unobtrusive code (keeping the scripting and the visuals separate). Even though this in practice requires a lot more code, frameworks like jQuery make it easy.

window.onload = function() {
  var myPopupLink = document.getElementById('myPopupLink');
  if (myPopupLink) myPopupLink.onclick = function() { popitup(myPopupLink.href); return false; };
};
<a id="myPopupLink" href="http://www.google.com">Open google!</a>

What next?
 

Working with legacy web services and WCF

It is quite common that WCF has problems working with old and non-.NET web services. Usually the old “web reference” (ASMX) technology works better in these kinds of situations, but this one time I was determined to solve the challenge using WCF.

After having Visual Studio generate the client classes for me, I wrote a unit test to see if I could call the web service successfully. It turned out that the call succeeded and the function returned a response. Unfortunately the response object only contained the result code and description, while the data property was null.

Usually in these kinds of situations I first turn to Wireshark or some similar network packet analyzer to see what actually gets sent and returned. This time I had to come up with an alternative, as the web service only allowed me to use a secure HTTPS address and all the traffic was therefore encrypted. As the calls returned a valid object with part of the expected data, I knew the authentication was working and there was nothing wrong with the message headers. This meant it was enough for me to see the message content, and writing this simple message inspector did the trick.

    // Captures the raw SOAP request and response so they can be inspected even when the traffic is encrypted
    public class SoapMessageInspector : IClientMessageInspector, IEndpointBehavior
    {
        public string LastRequest { get; private set; }
        public string LastResponse { get; private set; }

        #region IClientMessageInspector Members

        public void AfterReceiveReply(ref System.ServiceModel.Channels.Message reply, object correlationState)
        {
            LastResponse = reply.ToString();
        }

        public object BeforeSendRequest(ref System.ServiceModel.Channels.Message request, IClientChannel channel)
        {
            LastRequest = request.ToString();
            return null;
        }

        #endregion

        #region IEndpointBehavior Members

        public void ApplyClientBehavior(ServiceEndpoint endpoint, ClientRuntime clientRuntime)
        {
            clientRuntime.MessageInspectors.Add(this);
        }

        public void AddBindingParameters(ServiceEndpoint endpoint, System.ServiceModel.Channels.BindingParameterCollection bindingParameters) { }
        public void ApplyDispatchBehavior(ServiceEndpoint endpoint, EndpointDispatcher endpointDispatcher) { }
        public void Validate(ServiceEndpoint endpoint) { }

        #endregion
    }

Usage

            var inspector = new SoapMessageInspector();
            factory.Endpoint.Behaviors.Add(inspector);
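
To give a bit more context, the whole client setup could look roughly like this, assuming a ChannelFactory based proxy (the contract name, endpoint configuration and request object are placeholders):

            // Hypothetical setup: ILegacyService and "LegacyEndpoint" stand in for the real contract and configuration
            var factory = new ChannelFactory<ILegacyService>("LegacyEndpoint");
            var inspector = new SoapMessageInspector();
            factory.Endpoint.Behaviors.Add(inspector);

            var client = factory.CreateChannel();
            var response = client.ExecuteXMLCommand(request);

            // The raw SOAP messages are now available for debugging
            Console.WriteLine(inspector.LastRequest);
            Console.WriteLine(inspector.LastResponse);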

The message inspector revealed that the call was sent correctly and that the data returned by the server was also fine. It was the WCF framework that failed to properly deserialize the response.

The property for the result data was called Any in the response class, so I took a look at the WSDL provided by the server.

<s:complexType name="response">
  <s:sequence>
    <s:element type="s0:statusType" name="status" maxOccurs="1" minOccurs="1"/>
    <s:any maxOccurs="unbounded" minOccurs="0" processContents="skip" namespace="targetNamespace"/>
  </s:sequence>
</s:complexType>

The any WSDL element leaves the structure of the content undefined. WCF translates this to an Any property of type XmlElement. WCF's failure to process the response correctly was probably caused by this in combination with the minOccurs value.
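
For reference, the relevant part of the generated response class looked roughly like this (simplified and reconstructed, so treat everything except the Any property as illustrative):

        // Simplified sketch of the generated proxy type; the <s:any> content is mapped to a raw XmlElement
        public partial class response
        {
            public statusType status { get; set; }

            [System.Xml.Serialization.XmlAnyElement]
            public System.Xml.XmlElement Any { get; set; }
        }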

After trying to edit some of the attributes of the generated classes without success, I decided to take over the parsing of the response using a custom response formatter.

        public class MyResponseBehaviour : IOperationBehavior
        {
            public void ApplyClientBehavior(OperationDescription operationDescription, ClientOperation clientOperation)
            {
                clientOperation.Formatter = new MyResponseFormatter(clientOperation.Formatter);
            }

            public void AddBindingParameters(OperationDescription operationDescription, System.ServiceModel.Channels.BindingParameterCollection bindingParameters) { }
            public void ApplyDispatchBehavior(OperationDescription operationDescription, DispatchOperation dispatchOperation) { }
            public void Validate(OperationDescription operationDescription) { }

        }

        // Delegates request serialization to the original formatter but handles deserialization of the reply itself
        public class MyResponseFormatter : IClientMessageFormatter
        {
            private const string XmlNameSpace = "http://www.eroom.com/eRoomXML/2003/700";

            private IClientMessageFormatter _InnerFormatter;
            public MyResponseFormatter(IClientMessageFormatter innerFormatter)
            {
                _InnerFormatter = innerFormatter;
            }

            #region IClientMessageFormatter Members

            public object DeserializeReply( System.ServiceModel.Channels.Message message, object[] parameters )
            {
                XPathDocument document = new XPathDocument(message.GetReaderAtBodyContents());
                XPathNavigator navigator = document.CreateNavigator();

                XmlNamespaceManager manager = new XmlNamespaceManager(navigator.NameTable);
                manager.AddNamespace("er", XmlNameSpace);

                if (navigator.MoveToFollowing("response", XmlNameSpace))
                {
                    ExecuteXMLCommandResponse commandResponse = new ExecuteXMLCommandResponse();
                    // and some XPath calls...
                    return commandResponse;
                }
                else
                {
                    throw new NotSupportedException("Failed to parse response");
                }
            }

            public System.ServiceModel.Channels.Message SerializeRequest( System.ServiceModel.Channels.MessageVersion messageVersion, object[] parameters )
            {
                return _InnerFormatter.SerializeRequest( messageVersion, parameters );
            }

            #endregion
        }

As the web service only provided one function returning a fairly simple response object, writing the formatter only required a couple of lines of code to parse the response data using XPath. As soon as I had replaced the default formatter with my own, things started working perfectly.

            factory.Endpoint.Contract.Operations.Find("ExecuteXMLCommand").Behaviors.Add(new MyResponseBehaviour());

I found out later that the message inspector I had written earlier also provided me with a way to throw exceptions with meaningful messages, as the server always included an error description in the SOAP fault envelope that WCF did not reveal.
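
A minimal sketch of that idea, assuming the description lives in the standard SOAP faultstring element (the helper name is made up and the element name may need adjusting for the actual fault layout):

        // Hypothetical helper: digs the fault description out of the raw SOAP captured by the inspector
        public static string GetFaultDescription(string soapResponse)
        {
            var doc = new System.Xml.XmlDocument();
            doc.LoadXml(soapResponse);

            // faultstring is the standard SOAP 1.1 element; a service specific detail element could be read the same way
            var nodes = doc.GetElementsByTagName("faultstring");
            return nodes.Count > 0 ? nodes[0].InnerText : "Unknown SOAP fault";
        }

The client wrapper can then do something like throw new Exception(GetFaultDescription(inspector.LastResponse)); when a call fails.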

Combining SharePoint Workflows and Calculated fields

In one of my projects I wanted to allow teams to work privately with their documents but also allow the rest of the organization to access everything considered final. The two-level versioning feature let me do exactly this easily, but the challenge was to also find a way to encourage the teams to finally “publish” their items. The solution was to have a special column indicate the state of the item.

To make things more interesting, I decided to only use the dialogs of SharePoint Designer and the web interface to accomplish this. The easiest way to insert HTML into a list is by using a calculated column. Unfortunately, as I quite quickly found out, calculated columns do not update on version and approval changes. The only workaround was to create a workflow that reacts to these changes and then use the calculated column to generate the output.

This is what the list column setup looked like

The IsPublic column gets updated by the workflow and the Visibility column generates the HTML based on the value. There are many posts describing how to use a calculated column together with JavaScript to print out HTML, but all you actually need is to set the result type of the column to integer!
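
As an illustration (the icon paths and the exact condition are made up), the Visibility column could be defined with a formula along these lines, with the result type set to a number:

    =IF([IsPublic],"<img src='/_layouts/images/public.gif' title='Published'/>","<img src='/_layouts/images/private.gif' title='Team only'/>")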

Using SharePoint designer I created the following workflow for the list

The problem with document libraries is that the file might not have been completely uploaded before the workflow kicks in. This is why the workflow needs to first make sure it can access the item before editing it.

There are probably many ways to check whether a major version is available; I decided to convert the version number (which is actually a string) to an int and see if it is higher than 0. After some trial and error I found out that the only way to convert a string representing a decimal value (yes, the version column is a text field) is to first convert it to a double.

The final task was to hide the IsPublic field from the list forms and find some nice icons to indicate the visibility.

QlikView vs. Tableau

A while ago I was asked by the CIO of my company to give a report on the Microsoft BI tool set. While gathering information I constantly found QlikView and Tableau being named as the new top dogs. Having used QlikView for a couple of years now, I got interested in knowing more about the rival.

Just by reading blog posts and articles it already became clear that Tableau fits into the category of data analysis tools. QlikView emphasizes data discovery, so would it be fair to compare them against the same criteria?

Planning to buy myself a new car, I needed to analyse the balance history of my bank account. This gave me a perfect scenario to implement with both tools, using the data I already had ready in an Excel workbook.

QlikView

In QlikView, building a solution always starts with configuring the load script that defines how the data gets imported. QlikView is never directly connected to a data source; instead the information is always first imported, processed and compressed.

I had to add two data transformations: converting the date information to a proper date format and changing the numbers with a trailing negative or positive sign to the more traditional format. To help with displaying the values on a monthly and weekly timeline I also had the script generate some additional information.

QlikView load script

When I was ready with the importing, setting up the chart and table objects was easy. As I only had data from the beginning of this year, I configured the accumulation “manually” by defining it in the chart expression, which enabled me to include the initial balance.

QlikView solution

The final result looked nice, and as QlikView by default already supports filtering by clicking on any visible information, I did not see a need for any additional filter boxes.

Tableau

My plan was to use Tableau Desktop, but as my trial license had expired I had to turn to Tableau Public, which is offered free for publishing dashboards online.

Configuring my Excel files as a data source was easy using the built-in wizard. Unlike QlikView, Tableau in general links directly to the data source and the transformations are processed on the fly.

As the data required some transformation as described earlier, I created two additional calculated dimension fields for this. The script language was quite close to what I had used in QlikView, so I could copy-paste most of what I had written earlier.

Creating the list was easy and done by drag and drop, just like when working with pivots in Excel. Configuring the chart was more of a challenge, as I needed to create some additional calculated measure fields to get the same result I had in my QlikView solution. In contrast to QlikView, I did not have to create any week and month fields, as Tableau could do the grouping directly on the available date field by selecting the desired option from the menu.

Tableau workbook

In Tableau the visual elements are defined in worksheets, and these are then composed on a dashboard. Setting up the dashboard was easy and done by placing the previously created worksheets onto a grid. As there is no standard drill-down and filtering available, the presentation ended up being quite static.

Tableau dashboard

Summary

Using QlikView, writing the import script and configuring the objects on the canvas requires some degree of knowledge, but it also allows the user to set up complex data handling and gives full freedom to define the visual appearance of the solution. QlikView enables not only setting up dashboards but also has potential as a platform for creating complete BI tools.

Despite the fact that I did not spend any time reading manuals or watching video tutorials, getting started with Tableau was easy. Tableau also offered much of the functionality through menu options that required scripting in QlikView. Unfortunately, when working with dashboards what you see is what you get, leaving less room for discovery.

I would imagine Tableau is closer to Microsoft PowerPivot in its user experience and approach than to QlikView. As these tools are different in nature, only comparing them against specific requirements will yield a truthful conclusion.