Control Builder C development using Sublime Text and GCC

Recently I started a project using ABB Control Builder to program an AC500 PM573 PLC.

I need to write a collection of C functions that can be used in CoDeSys as functions and function blocks, so the end user (the PLC programmer) doesn't have to write what would be complicated logic in Structured Text (ST), a Pascal-like language.

Control Builder doesn’t provide a C editor, so I chose something light as I don’t need a full IDE – Sublime Text 2. This is the first time I’ve used it.

Control Builder is also a bit dated and doesn’t seem to have keyboard shortcuts for compiling, so I created my own build system for AC500 C development. This is the sublime-build file I used, thanks to sublimetext.info:

C.sublime-build

{
    "cmd" : [
        "C:\\GCC\\4.7.0\\bin\\powerpc-elf-eabi-gcc.exe",
        "-I", "C:\\Program Files (x86)\\Common Files\\CAA-Targets\\ABB_AC500\\AC500_FWAPI",
        "-mcpu=860",
        "-fno-common",
        "-msdata=none",
        "-fno-jump-tables",
        "-fno-section-anchors",
        "-fno-merge-constants",
        "-fno-builtin",
        "-nostdlib",
        "-Werror-implicit-function-declaration",
        "-Wconversion",
        "-std=c99",
        "-c", "C_Code_App_Shell.c",
        "-o", "C_Code_App.obj"
    ],
    "shell" : true,
    "working_dir" : "$file_path",
    "path" : "C:\\GCC\\4.7.0\\bin"
}

You’ll notice the following things about this build definition:

  • It won’t work with the “automatic” setting as I don’t want it picking up any other C projects
  • It compiles one specific file only – C_Code_App_Shell.c – which is the same file compiled by Control Builder, so your other included files should be picked up properly
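
A handy addition – "file_regex" is a standard Sublime Text build-system key, nothing AC500-specific – is an error-matching pattern so you can jump between compiler errors with F4. The pattern commonly used for GCC-style file:line:column output should work with powerpc-elf-eabi-gcc too:

    "file_regex": "^(..[^:]*):([0-9]+):?([0-9]+)?:? (.*)$"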

Use at your own risk: I recommend using Control Builder for the final build when your code is ready for testing and production, but this is a handy way of using a modern editor in the meantime.

WorkflowDesigner.aspx with your custom system masterpage – a better way

If you have customised your system master page and you work with Nintex workflows, I am pretty sure you have come across the problem described on this page:

3-step fix Nintex WorkflowDesigner.aspx with your custom system masterpage

Basically, due to the customisations of the system master page, the layout of the Nintex workflow designer might not render properly, sometimes making it impossible to create or edit workflows (my case).

Big kudos to the author for investigating this problem and coming up with solutions for it; although they do work, they are either hard or not supported by Nintex. In the author's words:

Nintex has given me a stern wagging of their fingers and I need to tell you that Nintex does not support this modification. That’s why you need to keep the old Nintex file around and throw it back in if you need to talk to Nintex support.

Both solutions revolve around ways to replace the master page with the standard v4 master page just for the workflow designer.

The suggested solutions pointed me in the right direction. What I propose instead is an HTTP module, deployed as a web application feature, that intercepts the request, checks whether the requested page is WorkflowDesigner.aspx, and replaces its master page with the standard v4 master page.

Here is the implementation:

using System;
using System.Web;
using System.Web.UI;

public class MasterReplaceModule : IHttpModule
{
    public void Init(HttpApplication context)
    {
        context.PreRequestHandlerExecute += context_PreRequestHandlerExecute;
    }

    void context_PreRequestHandlerExecute(object sender, EventArgs e)
    {
        // Only page handlers have a master page to swap.
        Page page = HttpContext.Current.CurrentHandler as Page;
        if (page != null)
        {
            page.PreInit += page_PreInit;
        }
    }

    void page_PreInit(object sender, EventArgs e)
    {
        // PreInit is the last point at which MasterPageFile can still be changed.
        Page page = sender as Page;
        if (page != null && System.IO.Path.GetFileName(page.Request.PhysicalPath) == "WorkflowDesigner.aspx")
        {
            page.MasterPageFile = "../v4.master";
        }
    }

    public void Dispose()
    {
    }
}

If you are not familiar with the best way to register a module in a SharePoint solution: you can either edit the web.config by hand to include the module (a very big no-no!!!), or you can use SPWebConfigModification to do it.

You can apply SPWebConfigModification objects from a feature event receiver's FeatureActivated event, which will inject the necessary XML into the web.config for you. Don't forget to remove them on FeatureDeactivating – see the sketch after the example below.

Here is an example of an SPWebConfigModification object which injects a module declaration (IIS 7):

var moduleModification = new SPWebConfigModification()
{
    Owner = "ApplicationName",
    Name = "add[@name='MyAwesomeModule']",
    Type = SPWebConfigModification.SPWebConfigModificationType.EnsureChildNode,
    Path = "configuration/system.webServer/modules",
    Sequence = 1,
    // The XML to insert as a child node; make sure the names used match the Name selector
    Value = "<add name='MyAwesomeModule' type='FullNameSpace.MasterReplaceModule, AssemblyName, Version=1.0.0.0, Culture=neutral, PublicKeyToken=7be6bd052a28e02f' />"
};
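
To ground that, here is a minimal sketch of the feature receiver side, for a web application-scoped feature. The names are hypothetical, BuildModification() simply recreates the object shown above, and you'll need using System.Linq; and using Microsoft.SharePoint.Administration;:

public class WebConfigModuleFeatureReceiver : SPFeatureReceiver
{
    public override void FeatureActivated(SPFeatureReceiverProperties properties)
    {
        var webApp = (SPWebApplication)properties.Feature.Parent;
        webApp.WebConfigModifications.Add(BuildModification());
        webApp.Update();
        // Pushes the registered modifications out to the web.config files.
        webApp.WebService.ApplyWebConfigModifications();
    }

    public override void FeatureDeactivating(SPFeatureReceiverProperties properties)
    {
        var webApp = (SPWebApplication)properties.Feature.Parent;
        // Remove only the entries this application owns.
        foreach (var mod in webApp.WebConfigModifications.Where(m => m.Owner == "ApplicationName").ToList())
        {
            webApp.WebConfigModifications.Remove(mod);
        }
        webApp.Update();
        webApp.WebService.ApplyWebConfigModifications();
    }

    private static SPWebConfigModification BuildModification()
    {
        // Recreates the moduleModification object from the example above.
        return new SPWebConfigModification
        {
            Owner = "ApplicationName",
            Name = "add[@name='MyAwesomeModule']",
            Type = SPWebConfigModification.SPWebConfigModificationType.EnsureChildNode,
            Path = "configuration/system.webServer/modules",
            Sequence = 1,
            Value = "<add name='MyAwesomeModule' type='FullNameSpace.MasterReplaceModule, AssemblyName, Version=1.0.0.0, Culture=neutral, PublicKeyToken=7be6bd052a28e02f' />"
        };
    }
}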

If you need more help applying the above in a feature receiver, check this article:

http://panvega.wordpress.com/2009/09/02/using-spwebconfigmodification-within-a-feature-receiver/

Columns associated with mappings have been deleted/renamed

If you are using LINQ to SharePoint, chances are you have seen this error before. It usually means that either you are using the wrong list name or the wrong web URL in your DataContext constructor, or that there is a mismatch between the SPMetal mappings and the actual internal field names in the list. Among other causes is something I blogged about a while back.

The problem with this error is that it gives no clue as to which field is the culprit, which for long lists is painful: you have to check each field's internal name against the fields mapped by SPMetal (the Name property of Microsoft.SharePoint.Linq.ColumnAttribute).

So, using Reflector to inspect the LINQ to SharePoint API, I developed my own mapping check that tells me which field the mapping is failing on. Here is the code:

    using System;
    using System.Collections;
    using System.Collections.Generic;
    using System.Linq;
    using System.Reflection;
    using Microsoft.SharePoint;
    using Microsoft.SharePoint.Linq;

    public static class EntityListExtensions
    {
        public static void ValidateMappings<T>(this EntityList<T> list)
        {
            var mit = Type.GetType("Microsoft.SharePoint.Linq.SPItemMappingInfo, Microsoft.SharePoint.Linq, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c");
            var pm = Type.GetType("Microsoft.SharePoint.Linq.PropertyMap, Microsoft.SharePoint.Linq, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c");

            ConstructorInfo ctor = mit.GetConstructor(BindingFlags.Instance | BindingFlags.NonPublic, null, new[] { typeof(Type) }, null);
            object instance = ctor.Invoke(new object[] { typeof(T) });
            var properties = (IEnumerable)mit.GetProperty("Properties").GetValue(instance, new object[] { });

            var iName = pm.GetProperty("InternalName");
            var aType = pm.GetProperty("AssociationType");
            var fType = pm.GetProperty("FieldType");

            var entityPropertyList = new List<PropertyInfo>();

            foreach (var property in properties)
            {
                var type = (AssociationType)aType.GetValue(property, new object[] { });
                if (type != AssociationType.Backward)
                {
                    var prop = (string)iName.GetValue(property, new object[] { });
                    var ft = fType.GetValue(property, new object[] { });

                    entityPropertyList.Add(new PropertyInfo { InternalName = prop, Type = ft.ToString() });
                }
            }

            var listPropertyList = new List<PropertyInfo>();
            var spDataList = list.GetType().GetField("list", BindingFlags.NonPublic | BindingFlags.Instance).GetValue(list);

            // Check that the list exists in the web

            var dc = (DataContext)list.GetType().GetField("dc", BindingFlags.NonPublic | BindingFlags.Instance).GetValue(list);
            var listName = spDataList.GetType().GetProperty("Name").GetValue(spDataList, new object[] { });

            using (SPSite site = new SPSite(dc.Web))
            {
                using (SPWeb web = site.OpenWeb())
                {
                    if (!web.Lists.Cast<SPList>().Any(l => Equals(l.Title, listName)))
                    {
                        throw new InvalidOperationException("Could not find List '" + listName + "' in " + web.Url);
                    }
                }
            }

            IEnumerable dataFields = (IEnumerable)spDataList.GetType().GetProperty("Fields").GetValue(spDataList, new object[] { });
            foreach (var dataField in dataFields)
            {
                var prop = dataField.GetType().GetProperty("InternalName").GetValue(dataField, new object[] { }).ToString();
                var ft = dataField.GetType().GetProperty("FieldType").GetValue(dataField, new object[] { }).ToString();
                listPropertyList.Add(new PropertyInfo { InternalName = prop, Type = ft });
            }

            // Check the mappings:

            foreach (PropertyInfo sp in entityPropertyList)
            {
                var tp = listPropertyList.FirstOrDefault(i => i.InternalName == sp.InternalName);

                if (tp == null)
                {
                    throw new InvalidOperationException("Could not find Property '" + sp.InternalName + "' in list '" + listName + "'");
                }

                if (tp.Type != sp.Type)
                {
                    throw new InvalidOperationException("Property '" + sp.InternalName + "' is " + sp.Type + " but the column in the list is set as " + tp.Type);
                }
            }
        }

        public class PropertyInfo
        {
            public string InternalName { get; set; }
            public string Type { get; set; }
        }
    }

To use it:

using (var context = new ModelDataContext( Web_Url_Here ))
{
    context.EntityList_Name_Here.ValidateMappings();
}

NOTE: The code above uses reflection heavily, so it is reaching inside Microsoft's black box; I cannot guarantee that future updates won't break it. It is also more than likely that it won't work with SharePoint 2013 (I haven't tested it).

The Speed of a Delegate

A delegate in C# is a type, but one whose instances are created from methods: any method that matches the delegate's signature can be assigned to it.

I need to run a dynamically loaded method from a DLL, so that the user can choose the DLL or method at run time, as long as it matches the signature. I also need to run it 1E06 to 3E08 times in a row, so speed is essential: even a millisecond saved per call would reduce the total run time substantially.

In a separate project I have the following method compiled in a DLL:

public class Controller
{
    public double Control(double a, double b, double c, double d)
    {
        // variable names have been changed to protect their identity!
        double setP = Math.Max(0, b - c);
        return Math.Min(setP, d);
    }
}

In my main project (which doesn’t reference the previous project), I define a Delegate that matches the signature of the Control method:

public delegate double Controller(double a, double b, double c, double d);

Then I load the delegate from the DLL (note the Controller here is the Delegate, not the class above):

private Delegate LoadDelegate()
{
    Assembly assembly = Assembly.LoadFrom(@"path\to\My.dll");
    Type controllerType = assembly.GetTypes().First(t => t.IsClass && t.Name.Equals("Controller"));
    object controller = Activator.CreateInstance(controllerType);
    MethodInfo handler = controller.GetType().GetMethod("Control", BindingFlags.Public | BindingFlags.Instance);
    return Delegate.CreateDelegate(typeof(Controller), controller, handler);
}

Now I have an Actor class with a Run() function that I call in a loop.

I wire the delegate into my Run() loop through the actor's constructor:

public Actor(Delegate controller)
{
    _controller = (Controller)controller;
}

and then call the delegate like so:

public void Run()
{
    ...
    double setP = _controller(a1, b1, c1, d1);
    ...
}

The loop is simply:

Actor actor = new Actor(LoadDelegate());
for (int i = 0; i < 1000000; i++)
{
    actor.Run();
}
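
For reference, the timings quoted below were captured with a Stopwatch around that loop – a sketch, since the real Run() bodies contain much more work than shown here:

int iterations = 1000000;
var sw = System.Diagnostics.Stopwatch.StartNew();
for (int i = 0; i < iterations; i++)
{
    actor.Run();
}
sw.Stop();
Console.WriteLine("inner loop took {0}s", sw.Elapsed.TotalSeconds);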

Now the big question: the results! With that timing around the for loop, the results are:

No delegate, code verbatim in the Run() function

Run 10000 iterations...
inner loop took 0.3710212s

Run 100000 iterations...
inner loop took 3.7302133s

Run 1000000 iterations...
inner loop took 39.9192832s

Identical code, moved into a delegate and called from the Run() function

Run 10000 iterations...
inner loop took 0.3810218s

Run 100000 iterations...
inner loop took 4.5762617s

Run 1000000 iterations...
inner loop took 41.1173518s

Identical code but in a private instance method in the Actor() class:

Run 10000 iterations...
inner loop took 0.3650209s

Run 100000 iterations...
inner loop took 3.7172126s

Run 1000000 iterations...
inner loop took 39.9982877s

In all cases, the time is directly proportional to the number of iterations. The slight variability is due to random computer activity, which would average out over lots of test runs.

It appears that calling a function through a delegate is slightly slower, but not noticeably so until you run at least a million iterations. Even then, the difference is within the variability that could be caused simply by other processes using the CPU at the time.

Add choices to SPFieldChoice programmatically in a Sandboxed solution

The other day I stumbled upon a weird behavior when developing a sandboxed solution: I was trying to add choices programmatically to an SPFieldChoice field using the object model. After hours spent debugging the problem, looking at the ULS logs, etc., I learned that it is actually a known bug. There is a workaround, though: you can modify the XML schema of the field to modify or add your choices.
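
For reference, this is the straightforward object-model approach – the code that, because of the bug, silently failed to persist the choices from my sandboxed solution (it works fine in a farm solution):

var category = (SPFieldChoice)rootWeb.Fields["Category"];
category.Choices.Add("category one");
category.Choices.Add("category two");
category.Update(true);

And here is the schema-based workaround: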


var category = (SPFieldChoice)rootWeb.Fields["Category"];
var choices = new[] { "category one", "category two"};

var doc = new XmlDocument();
doc.LoadXml(category.SchemaXml);
var fieldNode = doc.SelectSingleNode("/Field");
if (fieldNode != null)
{
    var choicesnode = doc.SelectSingleNode("/Field/CHOICES");
    if (choicesnode != null)
    {
        fieldNode.RemoveChild(choicesnode);
    }

    var choicesElement = doc.CreateElement("CHOICES");

    foreach (string choice in choices)
    {
        var choiceElement = doc.CreateElement("CHOICE");
        choiceElement.InnerText = choice;
        choicesElement.AppendChild(choiceElement);
    }

    fieldNode.AppendChild(choicesElement);
    category.SchemaXml = doc.OuterXml;
    category.Update(true);
}

You have to love SharePoint

Disable “Task has been Changed” Notifications

In SharePoint 2010, when you have notifications turned on, users receive notifications by default both when tasks are created and when tasks change.

Depending on your workflow requirements, this can result in users being flooded with notifications. To disable the “Change” notification but keep the original assignment notification, you can follow the instructions in this post.

If you want to script it, so you can deploy it as part of a solution package or as a PowerShell script, you can also use the object model:

SPList workflowtasks = web.Lists["Tasks"];
workflowtasks.EnableAssignToEmail = true;
workflowtasks.Update();

// Disable "Changed" notifications
var alert = web.Alerts.Cast<SPAlert>().FirstOrDefault(i => i.List.ID == workflowtasks.ID && i.EventType == SPEventType.All);
if (alert != null)
{
    alert.EventType = SPEventType.Add;
    alert.Update(false);
}

Is IHttpHandler ProcessRequest thread safe or not?

short answer: “Yes” with an “If,” long answer: “No” — with a “But.”
– Reverend Lovejoy

I was debugging an HttpHandler and came across many examples stating that ProcessRequest() is not thread safe – unless, of course, you're doing something very simple, like the following:

public void ProcessRequest(HttpContext context)
{
    context.Response.Write("hello");
}

If, however, you’re doing anything with local variables (the examples say), you probably have a problem:

public void ProcessRequest(HttpContext context)
{
    var someValue = someFunction();
    context.Response.Write(someValue);
}

However, local variables are thread safe! So the above example is perfectly ok, even if someFunction() returns a different value each call, so long as it is, in itself, thread safe.

The obviously broken example would be:

public class AttachmentHandler : IHttpHandler
{
    private static int _callCnt = 0;

    public bool IsReusable
    {
        get { return true; }
    }

    public void ProcessRequest(HttpContext context)
    {
        context.Response.Write(_callCnt++);
    }
}

because all static variables (except thread-static ones) are shared amongst all threads.
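
To illustrate that exception: marking the field [ThreadStatic] gives each thread its own copy, so the increment no longer races. This is a contrived sketch – per-thread counters are rarely what you actually want in a handler:

public class PerThreadCountHandler : IHttpHandler
{
    // Each thread sees its own copy of this field, so there is no shared state to race on.
    [ThreadStatic]
    private static int _callCnt;

    public bool IsReusable
    {
        get { return true; }
    }

    public void ProcessRequest(HttpContext context)
    {
        context.Response.Write(_callCnt++);
    }
}

The counts will look odd because ASP.NET reuses thread-pool threads between requests, but at least there is no data race.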

To demonstrate a working example, here's an attachment handler which imposes a 10-second wait. Call it multiple times within 10 seconds and you should get a different value in the file each time:

public class AttachmentHandler : IHttpHandler
{
    public bool IsReusable
    {
        get { return true; }
    }

    public void ProcessRequest(HttpContext context)
    {
        Random r = new Random();
        double foo = r.Next();
        Thread.Sleep(10000);

        context.Response.ContentType = "text/plain";
        context.Response.AddHeader("Content-disposition", "attachment; filename=\"foo.txt\"");
        context.Response.BinaryWrite(Encoding.ASCII.GetBytes(string.Format("hello, universe {0}!", foo)));
        context.Response.Flush();
        HttpContext.Current.ApplicationInstance.CompleteRequest();
    }
}

So yes, ProcessRequest() is thread safe unless you reference (and maybe not even then) any thread-unsafe objects, such as session state, static variables, etc.

SPMetal fields returning null, internal field name exceeds 32 chars

SPMetal is a great tool. If you don't know it, SPMetal is an ORM tool that maps SharePoint objects to entity classes; it abstracts away all of that CAML ugliness behind a nice LINQ provider, and it sort of works nicely. But as with almost everything in SharePoint, it has its glitches.

I just came across a pretty nasty one. I have an entity where, for some reason, one of the fields was always returning null – no custom mappings, completely auto-generated like the rest of my entities.

I checked the generated CAML and the field is part of the ViewFields parameter as expected. After a bit of debugging I realised that the internal field name didn't match the one SPMetal was using to build the CAML. The reason is that SharePoint truncates the internal field name if its length exceeds 32 characters. That by itself would be fine, but the SPMetal developers apparently decided to hardcode the display-name-to-internal-name conversion logic, instead of querying the actual internal name when generating the entities. So a field named “Tools and Websites Link” is stored internally as “Tools_x0020_and_x0020_Websites_x1”, while SPMetal generates it as “Tools_x0020_and_x0020_Websites_x0020_Link”. Yep, nasty.

There are a couple of solutions here. The most obvious one is to make your field name shorter; bear in mind that if you have spaces in your field name, SharePoint replaces each of them with _x0020_, so if you have a lot of spaces you are practically guaranteed to hit this limit.

Another workaround is to manually set the internal field name, as shown in this post.
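
Alternatively, you can hand-edit the Name in the generated ColumnAttribute so it matches the truncated internal name SharePoint actually stores. This is a sketch with hypothetical property, storage name and field type – and note that SPMetal will overwrite the edit the next time you regenerate the entities:

private string _toolsAndWebsitesLink;

// Name hand-edited to the truncated internal name; SPMetal had generated the full
// "Tools_x0020_and_x0020_Websites_x0020_Link", which does not exist in the list.
[Microsoft.SharePoint.Linq.ColumnAttribute(Name = "Tools_x0020_and_x0020_Websites_x1",
    Storage = "_toolsAndWebsitesLink", FieldType = "Text")]
public string ToolsAndWebsitesLink
{
    get { return _toolsAndWebsitesLink; }
    set { _toolsAndWebsitesLink = value; }
}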

SharePoint Continuous Integration – The type or namespace name ‘xxx’ does not exist in the namespace ‘Microsoft.SharePoint’

There are multiple posts about setting up a SharePoint project build on a continuous integration server. One of the challenges – assuming SharePoint is not installed on the build server (as it shouldn't be) – is providing the SharePoint assemblies the build needs. This is where our problems began.

We decided to copy the libraries from 14\ISAPI and keep them under source control for better portability. We could have installed them in the GAC instead, which would have allowed us to run multiple SharePoint project builds without having to provide all the necessary DLLs every time.

After getting all the assemblies and changing the references in the project to point to our local versions, we got several compilation errors building the project:

The type or namespace name 'Office' does not exist in the namespace 'Microsoft' (are you missing an assembly reference?)

The type or namespace name 'Publishing' does not exist in the namespace 'Microsoft.SharePoint' (are you missing an assembly reference?)

The type or namespace name 'Taxonomy' does not exist in the namespace 'Microsoft.SharePoint' (are you missing an assembly reference?)

What stumped us is that we were including all of those references. After spending an entire day trying different things, we came across the problem: Publishing has a dependency on

System.Web.DataVisualization

Nothing in the error messages refers to this assembly. So, after including the following library:

C:\Program Files (x86)\Microsoft Chart Controls\Assemblies\System.Web.DataVisualization.dll

The project compiled as expected.