Monday I start at Excellence in Motivation. They make marketing and performance improvement programs. They seem to be headed in a good direction and are much closer to home. More than 50 miles closer, so I really couldn't turn that combo down now that my son is old enough to start asking where I am all the time. There is also a pretty good chance at this place I can keep what most people would consider a normal schedule for the first time in my career.
Thanks to everyone who sent leads, and if any of you are currently looking I have a few leads I didn't even get a chance to interview for that I can pass along.
20110611
20110511
even better type conversion between similar objects
My last post showed how to use compiled LINQ expressions as delegates to make a type converter. I noted that an MSIL solution would be faster. I am not an MSIL expert by any means, but here is my best attempt at using DynamicMethod and MSIL to create a delegate converter.
It does the same thing as the LINQ version but requires separate assign blocks for each combination of field and property.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection.Emit;

namespace typeoverridedemostrater
{
    static class SuperConverter<T1, T2> where T2 : new()
    {
        private static Dictionary<Tuple<System.Type, System.Type>, MethodInvoker> converters =
            new Dictionary<Tuple<System.Type, System.Type>, MethodInvoker>();

        delegate T2 MethodInvoker(T1 inpt);

        static public T2 Convert(T1 i)
        {
            var key = new Tuple<System.Type, System.Type>(typeof(T2), typeof(T1));
            if (!converters.ContainsKey(key))
            {
                DynamicMethod dm = new DynamicMethod("xmethod", typeof(T2), new Type[] { typeof(T1) }, typeof(T2));
                ILGenerator il = dm.GetILGenerator();
                il.DeclareLocal(typeof(T2));
                il.Emit(OpCodes.Newobj, typeof(T2).GetConstructor(System.Type.EmptyTypes));
                il.Emit(OpCodes.Stloc_0);

                // field -> field
                (from z in typeof(T1).GetFields().Select(a => new { name = a.Name, type = a.FieldType })
                 join y in typeof(T2).GetFields().Select(a => new { name = a.Name, type = a.FieldType })
                     on z equals y
                 select z.name).ToList().ForEach(a =>
                {
                    il.Emit(OpCodes.Ldloc_0);
                    il.Emit(OpCodes.Ldarg_0);
                    il.Emit(OpCodes.Ldfld, typeof(T1).GetField(a));
                    il.Emit(OpCodes.Stfld, typeof(T2).GetField(a));
                });

                // property -> property
                (from z in typeof(T1).GetProperties().Select(a => new { name = a.Name, type = a.PropertyType })
                 join y in typeof(T2).GetProperties().Where(b => b.CanWrite).Select(a => new { name = a.Name, type = a.PropertyType })
                     on z equals y
                 select z.name).ToList().ForEach(a =>
                {
                    il.Emit(OpCodes.Ldloc_0);
                    il.Emit(OpCodes.Ldarg_0);
                    il.Emit(OpCodes.Callvirt, typeof(T1).GetProperty(a).GetGetMethod());
                    il.Emit(OpCodes.Callvirt, typeof(T2).GetProperty(a).GetSetMethod());
                });

                // field -> property
                (from z in typeof(T1).GetFields().Select(a => new { name = a.Name, type = a.FieldType })
                 join y in typeof(T2).GetProperties().Where(b => b.CanWrite).Select(a => new { name = a.Name, type = a.PropertyType })
                     on z equals y
                 select z.name).ToList().ForEach(a =>
                {
                    il.Emit(OpCodes.Ldloc_0);
                    il.Emit(OpCodes.Ldarg_0);
                    il.Emit(OpCodes.Ldfld, typeof(T1).GetField(a));
                    il.Emit(OpCodes.Callvirt, typeof(T2).GetProperty(a).GetSetMethod());
                });

                // property -> field
                (from z in typeof(T1).GetProperties().Select(a => new { name = a.Name, type = a.PropertyType })
                 join y in typeof(T2).GetFields().Select(a => new { name = a.Name, type = a.FieldType })
                     on z equals y
                 select z.name).ToList().ForEach(a =>
                {
                    il.Emit(OpCodes.Ldloc_0);
                    il.Emit(OpCodes.Ldarg_0);
                    il.Emit(OpCodes.Callvirt, typeof(T1).GetProperty(a).GetGetMethod());
                    il.Emit(OpCodes.Stfld, typeof(T2).GetField(a));
                });

                il.Emit(OpCodes.Ldloc_0);
                il.Emit(OpCodes.Ret);
                var dg = (MethodInvoker)dm.CreateDelegate(typeof(MethodInvoker));
                converters.Add(key, dg);
            }
            return converters[key](i);
        }
    }
}
Compared to the Linq version at 100 million repetitions (at lower numbers the results are less predictable):
Linq: 94.579 seconds vs MSIL: 93.423 seconds
Some of that time is consumed dealing with the delegates, so you can instead transform this into a full dynamic assembly, where you can have base classes and real instance references, like this:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;
using System.Reflection.Emit;
using System.Threading;

namespace typeoverridedemostrater
{
    static class SuperConverterII<T1, T2> where T2 : new()
    {
        private static Dictionary<Tuple<System.Type, System.Type>, convertbase<T1, T2>> converters =
            new Dictionary<Tuple<System.Type, System.Type>, convertbase<T1, T2>>();

        static public T2 Convert(T1 i)
        {
            var key = new Tuple<System.Type, System.Type>(typeof(T2), typeof(T1));
            if (!converters.ContainsKey(key))
            {
                var xassemblybuilder = Thread.GetDomain().DefineDynamicAssembly(
                    new AssemblyName("xassembly"), System.Reflection.Emit.AssemblyBuilderAccess.Run);
                var xtype = xassemblybuilder.DefineDynamicModule("xmodule")
                    .DefineType("xtype", TypeAttributes.Public, typeof(convertbase<T1, T2>));
                var xfunction = xtype.DefineMethod("convert", MethodAttributes.Public | MethodAttributes.Virtual,
                    typeof(T2), new System.Type[] { typeof(T1) });
                xtype.DefineMethodOverride(xfunction, typeof(convertbase<T1, T2>).GetMethod("convert"));

                var il = xfunction.GetILGenerator();
                il.DeclareLocal(typeof(T2));
                il.Emit(OpCodes.Newobj, typeof(T2).GetConstructor(System.Type.EmptyTypes));
                il.Emit(OpCodes.Stloc_0);

                // field -> field (Ldarg_1 now: arg 0 is "this" on an instance method)
                (from z in typeof(T1).GetFields().Select(a => new { name = a.Name, type = a.FieldType })
                 join y in typeof(T2).GetFields().Select(a => new { name = a.Name, type = a.FieldType })
                     on z equals y
                 select z.name).ToList().ForEach(a =>
                {
                    il.Emit(OpCodes.Ldloc_0);
                    il.Emit(OpCodes.Ldarg_1);
                    il.Emit(OpCodes.Ldfld, typeof(T1).GetField(a));
                    il.Emit(OpCodes.Stfld, typeof(T2).GetField(a));
                });

                // property -> property
                (from z in typeof(T1).GetProperties().Select(a => new { name = a.Name, type = a.PropertyType })
                 join y in typeof(T2).GetProperties().Where(b => b.CanWrite).Select(a => new { name = a.Name, type = a.PropertyType })
                     on z equals y
                 select z.name).ToList().ForEach(a =>
                {
                    il.Emit(OpCodes.Ldloc_0);
                    il.Emit(OpCodes.Ldarg_1);
                    il.Emit(OpCodes.Callvirt, typeof(T1).GetProperty(a).GetGetMethod());
                    il.Emit(OpCodes.Callvirt, typeof(T2).GetProperty(a).GetSetMethod());
                });

                // field -> property
                (from z in typeof(T1).GetFields().Select(a => new { name = a.Name, type = a.FieldType })
                 join y in typeof(T2).GetProperties().Where(b => b.CanWrite).Select(a => new { name = a.Name, type = a.PropertyType })
                     on z equals y
                 select z.name).ToList().ForEach(a =>
                {
                    il.Emit(OpCodes.Ldloc_0);
                    il.Emit(OpCodes.Ldarg_1);
                    il.Emit(OpCodes.Ldfld, typeof(T1).GetField(a));
                    il.Emit(OpCodes.Callvirt, typeof(T2).GetProperty(a).GetSetMethod());
                });

                // property -> field
                (from z in typeof(T1).GetProperties().Select(a => new { name = a.Name, type = a.PropertyType })
                 join y in typeof(T2).GetFields().Select(a => new { name = a.Name, type = a.FieldType })
                     on z equals y
                 select z.name).ToList().ForEach(a =>
                {
                    il.Emit(OpCodes.Ldloc_0);
                    il.Emit(OpCodes.Ldarg_1);
                    il.Emit(OpCodes.Callvirt, typeof(T1).GetProperty(a).GetGetMethod());
                    il.Emit(OpCodes.Stfld, typeof(T2).GetField(a));
                });

                il.Emit(OpCodes.Ldloc_0);
                il.Emit(OpCodes.Ret);
                xtype.CreateType();
                converters.Add(key, (convertbase<T1, T2>)xassemblybuilder.CreateInstance("xtype"));
            }
            return converters[key].convert(i);
        }
    }

    public abstract class convertbase<T1, T2>
    {
        public abstract T2 convert(T1 i);
    }
}
Compared to the original MSIL version at 100 million repetitions:
Delegate MSIL: 93.423 seconds vs Base Class MSIL: 91.814 seconds
At REALLY high volumes that is significant, and I am sure someone better at MSIL than I am could make it a bit faster still, but for my purposes the LINQ version is more than sufficient, as I find the MSIL versions a little hard to work with for the level of benefit they return.
20110509
better type conversion between similar objects
Friday I posted what is about the simplest possible way to automagically convert between two objects that have mostly the same fields and properties but no implicit conversion, common interface, or base type. It uses reflection and a little LINQ, but it uses reflection every single time, so it does not perform very well under load. You can see that post here.
I wrote that as a quick hack to solve a specific, temporary problem a few weeks ago and I knew it wasn't going to be very efficient, but it didn't matter in that instance. The best solution to this sort of problem is MSIL, assuming you cannot just change the classes to share dependencies, but MSIL is not something very many folks can read, and even fewer can write.
In .NET 4 you get some interesting additions to (and better documentation of) LINQ that show you how to build expressions on the fly, so I wanted to take a stab at rewriting the converter that way.
The general idea is the same: use reflection to look for fields and writable properties, then generate some action to sync them up. But there are several differences:
- reflection is only used the first time a pair of types is converted
- the actions are generated as Linq that is then compiled before storing in the converters set
- added the ability to set a property to a field or field to property
- added type checking
using System;
using System.Collections.Generic;
using System.Linq;
using System.Linq.Expressions;

namespace typeoverridedemostrater
{
    static class SuperConverter<T1, T2> where T2 : new()
    {
        private static Dictionary<Tuple<System.Type, System.Type>, Func<T1, T2>> converters =
            new Dictionary<Tuple<System.Type, System.Type>, Func<T1, T2>>();

        static public T2 Convert(T1 i)
        {
            var key = new Tuple<System.Type, System.Type>(typeof(T2), typeof(T1));
            if (!converters.ContainsKey(key))
            {
                ParameterExpression value = Expression.Parameter(typeof(T1), "value");
                ParameterExpression result = Expression.Parameter(typeof(T2), "result");
                var exprs = new List<Expression>();

                // result = new T2()
                exprs.Add(Expression.Assign(result, Expression.New(typeof(T2))));

                // one assignment per property/field name that matches between the types
                exprs.AddRange((
                    from z in typeof(T2).GetProperties().Where(a => a.CanWrite)
                                  .Select(b => new { name = b.Name, type = b.PropertyType })
                                  .Union(typeof(T2).GetFields().Select(c => new { name = c.Name, type = c.FieldType }))
                    join y in typeof(T1).GetProperties().Where(a => a.CanWrite)
                                  .Select(b => new { name = b.Name, type = b.PropertyType })
                                  .Union(typeof(T1).GetFields().Select(c => new { name = c.Name, type = c.FieldType }))
                        on z equals y
                    select Expression.Assign(Expression.PropertyOrField(result, z.name),
                                             Expression.PropertyOrField(value, z.name))
                    ).ToArray());

                // the block's last expression is its return value
                exprs.Add(result);
                BlockExpression block = Expression.Block(variables: new[] { result }, expressions: exprs.ToArray());
                converters.Add(key, Expression.Lambda<Func<T1, T2>>(block, value).Compile());
            }
            return converters[key].Invoke(i);
        }
    }
}
The general flow is:
- if you don't already have a good converter
- create an input value parameter and an output result variable
- create an action to assign a new instance to the result variable
- add an assign action for every property/field that has a match between the types
- add an action for returning the result
- turn those into an expression block
- turn the block into a lambda, compile it to a delegate, and store it in the converters dictionary
- run the delegate from the dictionary
For 100,000 conversions the before/after is:
TotalMilliseconds = 1609.4059 vs TotalMilliseconds = 93.7518
20110506
Cheap type conversion for very similar objects
Ever get stuck between two APIs and have to keep converting between two almost identical sets of data objects because neither API can be modded at the moment? If you can afford some performance penalty, this is a quick way to get fields and properties with the same names synced.
First you add some implicit type operators:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace typeoverridedemostrater
{
    class Class2
    {
        public string string1;
        public string string2;
        public string string3;

        public static implicit operator Class2(Class1 initialData)
        {
            return new SuperConverter<Class1, Class2>().Convert(initialData);
        }

        public static implicit operator Class2(Class3 initialData)
        {
            return new SuperConverter<Class3, Class2>().Convert(initialData);
        }
    }
}
After you call the converter, you can add whatever specific conversions you need.
(EDIT - as Brooke pointed out, the real version of this was an extension to an existing class, not a class with an internal operator; the code above was from the demo project I bashed out to explain the concept to another dev)
Then you use some reflection and LINQ to do the assignments in a separate class that you can re-use in as many type converters as you like:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace typeoverridedemostrater
{
    class SuperConverter<T1, T2> where T2 : new()
    {
        public T2 Convert(T1 i)
        {
            var temp = new T2();
            typeof(T1).GetProperties().Where(a => a.CanWrite).ToList()
                .ForEach(b =>
                {
                    var z = temp.GetType().GetProperty(b.Name);
                    if (z != null) z.SetValue(temp, b.GetValue(i, null), null);
                });
            typeof(T1).GetFields().ToList()
                .ForEach(b =>
                {
                    var z = temp.GetType().GetField(b.Name);
                    if (z != null) z.SetValue(temp, b.GetValue(i));
                });
            return temp;
        }
    }
}
Obviously this only works in some cases, and if your types do not match precisely you would have to make it a little more complex by checking for compatible types, or handling failure more gracefully, but it can save a ton of time in some cases.
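One way to do the compatible-type check suggested above is to compare the existing value's type before assigning and quietly skip mismatches. A hypothetical sketch of that idea in Python (the classes and the compatibility rule are illustrative assumptions, not the C# code):

```python
def safe_copy(src, dst):
    """Copy same-named attributes from src to dst, but only when the
    destination's current value has a compatible type; mismatches are
    skipped instead of raising -- the 'graceful failure' option."""
    for name, value in vars(src).items():
        if not hasattr(dst, name):
            continue                      # no matching member on dst
        current = getattr(dst, name)
        if current is None or isinstance(value, type(current)):
            setattr(dst, name, value)     # types line up (or dst doesn't care)
        # else: incompatible types -- skip the member and keep going
    return dst
```

The same shape works in the C# version by comparing `PropertyType`/`FieldType` (or checking `IsAssignableFrom`) before emitting the assignment.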
20110504
Adding a WCF function to a service, in code, at runtime
Working on something last week I thought it might be a good idea to have an attribute that added an extra endpoint to a service at runtime. I coded for a bit making decent progress, got stuck, and when I started to google I saw several discussions of it not being possible, and one discussion about how to do it by modifying the host (http://zamd.net/2010/02/05/adding-dynamic-methods-to-a-wcf-service/). It turns out that after the service is running, some of the things you would need to pull this trick off seem to be locked down.
Remembering some similar issues in HttpModules/HttpApplications I started looking for a point to do the injection of a method while the service is in the middle of loading and I found ServiceBehavior.AddBindingParameters. The other two methods don't seem to be at the right point in the sequence for the injection to work correctly.
WARNING: lots of trial & error based code ahead, I doubt this is correct in all situations, it does work on a REST POX service fairly well. I am mostly writing this so I remember it, and so others can see it probably can be done.
First you have to make an attribute that you will use later to decorate your service implementation. In this example the AddUI sub looks at the first operation for the current contract, copies a lot of its general settings, then adds the messages to an operation, and the operation to the contract.
Imports System.ServiceModel.Description

Public Class ExternalInvalidatorAttribute
    Inherits Attribute
    Implements IServiceBehavior

    Public Sub AddBindingParameters(ByVal serviceDescription As System.ServiceModel.Description.ServiceDescription,
                                    ByVal serviceHostBase As System.ServiceModel.ServiceHostBase,
                                    ByVal endpoints As System.Collections.ObjectModel.Collection(Of System.ServiceModel.Description.ServiceEndpoint),
                                    ByVal bindingParameters As System.ServiceModel.Channels.BindingParameterCollection) _
                                    Implements System.ServiceModel.Description.IServiceBehavior.AddBindingParameters
        For Each endpt In endpoints
            AddUI(endpt)
        Next
    End Sub

    Private Sub AddUI(ByRef endpt As ServiceEndpoint)
        Dim cd = endpt.Contract.Operations(0).DeclaringContract
        Dim od = New OperationDescription("invalidatorUI", cd)

        'input message with a single string parameter
        Dim inputMsg = New MessageDescription(cd.Namespace + cd.Name + "/invalidatorUI", MessageDirection.Input)
        Dim mpd = New MessagePartDescription("a", endpt.Contract.Namespace)
        mpd.Index = 0
        mpd.MemberInfo = Nothing
        mpd.Multiple = False
        mpd.ProtectionLevel = Net.Security.ProtectionLevel.None
        mpd.Type = GetType(System.String)
        inputMsg.Body.Parts.Add(mpd)
        od.Messages.Add(inputMsg)

        'output message with a string return value
        Dim outputMsg = New MessageDescription(cd.Namespace + cd.Name + "/invalidatorUIResponse", MessageDirection.Output)
        outputMsg.Body.ReturnValue = New MessagePartDescription("invalidatorUIResult", cd.Namespace) With {.Type = GetType(System.String)}
        od.Messages.Add(outputMsg)

        od.Behaviors.Add(New DataContractSerializerOperationBehavior(od))
        od.Behaviors.Add(New System.ServiceModel.Web.WebGetAttribute() With {.UriTemplate = "/invalidator/{a}"})
        od.Behaviors.Add(New System.ServiceModel.OperationBehaviorAttribute())

        Dim dc = New DasOP()
        od.Behaviors.Add(dc)
        cd.Operations.Add(od)
    End Sub

#Region "not needed"
    Public Sub ApplyDispatchBehavior(ByVal serviceDescription As System.ServiceModel.Description.ServiceDescription,
                                     ByVal serviceHostBase As System.ServiceModel.ServiceHostBase) _
                                     Implements System.ServiceModel.Description.IServiceBehavior.ApplyDispatchBehavior
    End Sub

    Public Sub Validate(ByVal serviceDescription As System.ServiceModel.Description.ServiceDescription,
                        ByVal serviceHostBase As System.ServiceModel.ServiceHostBase) _
                        Implements System.ServiceModel.Description.IServiceBehavior.Validate
    End Sub
#End Region
End Class
Since there is no function in the service implementation, the default invoker call would error with nothing to act upon, so the DasOP custom operation behavior is substituted for the default. Looking at its ApplyDispatchBehavior sub below, you can see it is just a crutch to reach a custom invoker. You don't have to do this, but if you don't, you have to deal with the message objects by hand, and they are not really that friendly.
Imports System.ServiceModel.Description

Public Class DasOp
    Inherits Attribute 'this makes it a decorator
    Implements IOperationBehavior 'this makes it get called for applybehaviour

    Public Sub ApplyDispatchBehavior(ByVal operationDescription As System.ServiceModel.Description.OperationDescription,
                                     ByVal dispatchOperation As System.ServiceModel.Dispatcher.DispatchOperation) _
                                     Implements System.ServiceModel.Description.IOperationBehavior.ApplyDispatchBehavior
        'this invoker actually does the work, it needs a reference to the other
        'cache objects so it can meddle in their cache arrays
        dispatchOperation.Invoker = New InvalidatorInvoker()
    End Sub

#Region "not used"
    Public Sub AddBindingParameters(ByVal operationDescription As System.ServiceModel.Description.OperationDescription,
                                    ByVal bindingParameters As System.ServiceModel.Channels.BindingParameterCollection) _
                                    Implements System.ServiceModel.Description.IOperationBehavior.AddBindingParameters
        'not needed
    End Sub

    Public Sub ApplyClientBehavior(ByVal operationDescription As System.ServiceModel.Description.OperationDescription,
                                   ByVal clientOperation As System.ServiceModel.Dispatcher.ClientOperation) _
                                   Implements System.ServiceModel.Description.IOperationBehavior.ApplyClientBehavior
        'not needed
    End Sub

    Public Sub Validate(ByVal operationDescription As System.ServiceModel.Description.OperationDescription) _
                        Implements System.ServiceModel.Description.IOperationBehavior.Validate
        'not needed
    End Sub
#End Region
End Class
Finally you make the invoker. It doesn't really invoke anything, since nothing actually exists to invoke, but it allocates an input and returns a result as if there were one, so the rest of WCF seems not to notice the difference.
Imports System.ServiceModel.Dispatcher
Imports System.Xml.Linq
Imports System.Text.RegularExpressions

Public Class InvalidatorInvoker
    Implements IOperationInvoker

    Public Function AllocateInputs() As Object() Implements System.ServiceModel.Dispatcher.IOperationInvoker.AllocateInputs
        Return {Nothing} 'reserve a spot for some input
    End Function

    Public Function Invoke(ByVal instance As Object, ByVal inputs() As Object, ByRef outputs() As Object) As Object _
                           Implements System.ServiceModel.Dispatcher.IOperationInvoker.Invoke
        outputs = New Object(-1) {} 'return an empty array here, MSDN does not elaborate as to why
        Return "Result"
    End Function

#Region "not needed"
    Public Function InvokeBegin(ByVal instance As Object, ByVal inputs() As Object, ByVal callback As System.AsyncCallback,
                                ByVal state As Object) As System.IAsyncResult _
                                Implements System.ServiceModel.Dispatcher.IOperationInvoker.InvokeBegin
        Return Nothing
    End Function

    Public Function InvokeEnd(ByVal instance As Object, ByRef outputs() As Object, ByVal result As System.IAsyncResult) As Object _
                              Implements System.ServiceModel.Dispatcher.IOperationInvoker.InvokeEnd
        Return Nothing
    End Function
#End Region

    Public ReadOnly Property IsSynchronous As Boolean Implements System.ServiceModel.Dispatcher.IOperationInvoker.IsSynchronous
        Get
            Return True 'disable async
        End Get
    End Property
End Class
It isn't pretty, but it works.
20110328
Microsoft.Web.Infrastructure + System.Web.PreApplicationStartMethod
What follows is interesting & a neat trick, but I do not claim that it is a good idea to do this in production code. It may be useful, it may not.
I have been playing around with a few of the new bits that are available for .NET web development recently and I came across a neat trick. I had a project that I wanted to debug, but I didn't have access to the source directly and the app itself was handling the error and in the process eating some of the detail I needed.
I made an HttpModule decorated with System.Web.PreApplicationStartMethod. This is a Framework 4 bit that allows an HttpModule to run a shared sub BEFORE the whole stack gets built and locked in place.
I then used reflection to load and invoke Microsoft.Web.Infrastructure.DynamicModuleHelper.DynamicModuleUtility.RegisterModule. This allows the HttpModule to place itself in the modules collection without altering the web.config. Loading it through reflection lets me just drag the dll into the bin folder along with the HttpModule dll, which is handy because Web.Infrastructure ships with the MVC framework and the problem I was having was on a non-MVC machine.
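That load-by-name, walk-to-a-member, invoke pattern is the same in any reflective runtime. A hypothetical sketch of the shape in Python, just for illustration (`importlib` standing in for assembly loading, and `math.sqrt` as an arbitrary stand-in target, not the real RegisterModule call):

```python
import importlib

def invoke_by_name(module_name, attr_path, *args):
    """Load a module by name, walk a dotted attribute path to a callable,
    and invoke it -- the same shape as loading an assembly dropped in a
    folder and calling a registration method through reflection, with no
    compile-time reference to the library."""
    obj = importlib.import_module(module_name)
    for part in attr_path.split("."):
        obj = getattr(obj, part)
    return obj(*args)
```

The benefit is the same as in the post: the calling code carries no hard dependency, so the library only needs to be present at runtime on the machines that have it.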
A little poking about in Reflector and I came up with this function to get the existing event handlers:
Public Function GetEventSubscribers(ByVal target As Object, ByVal eventName As String) As [Delegate]()
    Dim t As Type = target.[GetType]()
    Dim w = CType(t.GetField("_events", BindingFlags.Instance Or BindingFlags.Static Or BindingFlags.NonPublic).
                  GetValue(target), System.ComponentModel.EventHandlerList)
    Dim k = t.GetFields(BindingFlags.[Static] Or BindingFlags.Instance Or BindingFlags.NonPublic).
            Where(Function(x) x.Name.StartsWith("Event" & eventName)).Select(Function(x) x.GetValue(target)).ToList()
    Dim d() As [Delegate] = k.SelectMany(Function(x)
                                             If w(x) Is Nothing Then
                                                 Return New [Delegate]() {}
                                             Else
                                                 Return w(x).GetInvocationList()
                                             End If
                                         End Function).ToArray
    Return d
End Function
If you pass the HttpApplication instance to it with an event name you get all the registered handler delegates, which will allow you to call .RemoveEventHandler() on each of them.
If you do that to the Error event, add your own handler, then re-add the pre-existing delegates in the original order, your handler fires first, before any of the other handlers has a chance to mangle the event state, and the rest of the application seems to be none the wiser as long as you don't alter the event state yourself.
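The snapshot/remove/insert-yours/re-add dance itself is easy to see with a toy model. A hypothetical sketch in Python (a plain ordered handler list standing in for the HttpApplication event, nothing ASP.NET specific):

```python
class Event:
    """Toy event with an ordered handler list (stands in for the
    application Error event discussed above)."""
    def __init__(self):
        self.handlers = []

    def add(self, h):
        self.handlers.append(h)

    def remove(self, h):
        self.handlers.remove(h)

    def fire(self, state):
        for h in list(self.handlers):
            h(state)

def hook_first(event, my_handler):
    """Snapshot the existing handlers, remove them all, add ours,
    then re-add the originals in their original order -- so ours
    fires first without disturbing the rest."""
    existing = list(event.handlers)
    for h in existing:
        event.remove(h)
    event.add(my_handler)
    for h in existing:
        event.add(h)
```

The key detail, as in the post, is preserving the original order when re-adding: the other subscribers see exactly the sequence they registered, just with your tracer in front.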
In my example I just put a trace in that wrote to a text file and after a few passes at the broken bits I got a nice clear picture of what was going on.
The approach seems to work in ASPX, MVC, WCF and SharePoint, and the only alteration to the original project is copying a couple of dlls into the bin directory.
Again, I haven't used this too much yet, so your mileage may vary, but even if it doesn't always work I think it is an interesting approach to spy on a misbehaving build without changing too many variables.
20110213
Get Well Mike Jennings
A good friend of the family is in ICU with a brain tumor. I have set up a Facebook page for people to update / comment / share news, as Facebook is something simple for most to use and I am away for work most of the week.
If you do not use Facebook, the short version of the story is that he didn't know he had it until they found it after he had been rushed to the hospital Thursday. He has had surgery and was stable the last I heard, yesterday evening.
20110130
20110116
Processes stopping innovation
A friend of mine from work linked to A culture of testing a few days ago and it got me thinking...
Agile and TDD don't only discourage people from trying things they cannot test in their preferred manner; they reinforce the idea that some solutions are not worth trying.
When Apple started to add mice to their systems they had no way to do this in a testable, or even loosely coupled, manner. They didn't have the infrastructure to make it a modular part of the system the way later PCs would, yet it was one of their defining moves in the early days. (if you are a hardware geek, check out the story here)
Most people understand that there are limitations to what can be tested when you get to the hardware level, but if you back up a bit in the history of the mouse there is a much more painful lesson to learn.
When Apple put out the Mac there was a thought that can best be summarized with this quote:
San Francisco Examiner, John C. Dvorak, 19 February 1984: The nature of the personal computer is simply not fully understood by companies like Apple (or anyone else for that matter). Apple makes the arrogant assumption of thinking that it knows what you want and need. It, unfortunately, leaves the "why" out of the equation - as in "why would I want this?" The Macintosh uses an experimental pointing device called a 'mouse'. There is no evidence that people want to use these things.
Apple didn't even invent the mouse: the root idea dates from the early 1950s, the first identifiable mice came in around the mid 1960s, and the mouse was even a common accessory on some German computers of that era.
At the time, I would say Dvorak was right. There wasn't much software that really made good use of the mouse, and there wouldn't be for a while, even on the Macs. You can debate whether this was a chicken/egg problem, whether Apple was just paving the way to the future, or whether they should have sold the mouse as an optional accessory, kept the arrow keys, and prevented WASD navigation from ever being introduced, but none of that is my point.
My point is that at some point between then and now, in some manner, was the right time to introduce the mouse as a mainstream computer input device and at no point between then and now would users that have never done input except with a keyboard have ever thought to have asked for anything remotely like it.
Letting the customer drive the product is a limiting and shortsighted approach from the point of view of innovation. Sometimes having the forethought to include a feature that the users don't understand, or don't need initially is what makes a product truly great. Even if you didn't like the mice and missed your arrow keys and kept the mouse put away on your first mac eventually you were glad you had one and glad that the base operating system knew what to do with one.
No user ever asked for that.
Agile and TDD don't just discourage people from trying things they cannot test in their preferred manner; they also reinforce the idea that some solutions are not worth trying.
When Apple started adding mice to their systems, they had no way to do so in a testable, or even loosely coupled, manner. They didn't have the infrastructure to make the mouse a modular part of the system the way later PCs would, yet it was one of their defining moves in the early days. (If you are a hardware geek, check out the story here.)
Most people understand that there are limitations to what can be tested when you get to the hardware level, but if you back up a bit in the history of the mouse there is a much more painful lesson to learn.
When Apple put out the Mac there was a thought that can best be summarized with this quote:
John C. Dvorak, San Francisco Examiner, 19 February 1984: The nature of the personal computer is simply not fully understood by companies like Apple (or anyone else for that matter). Apple makes the arrogant assumption of thinking that it knows what you want and need. It, unfortunately, leaves the “why” out of the equation - as in “why would I want this?” The Macintosh uses an experimental pointing device called a ‘mouse’. There is no evidence that people want to use these things.
Apple didn't even invent the mouse: the root idea dates to the early 1950s, the first recognizable mice appeared around the mid-1960s, and by then the mouse was even a common accessory on some German computers.
At the time, I would say Dvorak was right. There wasn't much software that really made good use of the mouse, and there wouldn't be for a while, even on the Macs. You can debate whether this was a chicken-and-egg problem, whether Apple was just paving the way to the future, or whether they should have sold the mouse as an optional accessory, kept the arrow keys, and prevented WASD navigation from being introduced, but none of that is my point.
My point is that at some point between then and now was the right time to introduce the mouse as a mainstream computer input device, and at no point between then and now would users who had never used anything but a keyboard for input ever have thought to ask for anything remotely like it.
Letting the customer drive the product is a limiting and shortsighted approach from the point of view of innovation. Sometimes having the forethought to include a feature the users don't understand, or don't need initially, is what makes a product truly great. Even if you didn't like the mouse, missed your arrow keys, and kept the mouse put away on your first Mac, eventually you were glad you had one, and glad that the base operating system knew what to do with it.
No user ever asked for that.
20110109
So this happened the other day...
My son is 2; he is not getting a hydraulic power armor suit just yet. He is dangerous enough unassisted.