November 2006 - Posts
Working with WCF in general is quite nice. No longer are you troubled by the idiosyncrasies of the various communication protocols, as they are bundled into the same API. Of course that doesn’t make the different protocols identical and there are still some differences to be handled. One such thing is the netMsmqBinding and poison messages.
In general it isn’t too hard to handle poison messages. Using Vista you can even reject them or move them to a poison queue for later inspection. Not so much luck on Windows XP though. Basically you have two options: either Fault (the default) or Drop the message.
Now dropping the message is not very nice as the information gets lost, so faulting and then removing and logging the message from an IErrorHandler is a better solution. And in fact an MsmqPoisonMessageException is raised when a poison message is detected. It even contains the MessageLookupId to retrieve the message itself. But how about the queue name? That is missing. And the samples on MSDN show how to do this by adding the queue names twice, once in the WCF configuration and a second time just for the purpose of removing the poison message. Looks like some kind of oversight if you ask me. I guess I need to hook into the message pipeline somewhere, watch messages coming in via the queues and record the LookupId and queue used. Only problem is that I haven't found the point to do so yet :-( Any hints are appreciated!
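For illustration, a minimal sketch of such an IErrorHandler along the lines of the MSDN samples. The class name and the hard-coded queue path are my own; as described above, the path has to be duplicated from the WCF configuration by hand:

```csharp
using System;
using System.Messaging;
using System.ServiceModel;
using System.ServiceModel.Channels;
using System.ServiceModel.Dispatcher;

// Sketch: remove and log a poison message when WCF reports one.
class PoisonErrorHandler : IErrorHandler
{
    // Duplicated by hand from the netMsmqBinding endpoint address :-(
    private readonly string queuePath;

    public PoisonErrorHandler(string queuePath)
    {
        this.queuePath = queuePath;
    }

    public bool HandleError(Exception error)
    {
        MsmqPoisonMessageException poison = error as MsmqPoisonMessageException;
        if (poison == null)
            return false;

        // The exception tells us which message, but not which queue.
        using (MessageQueue queue = new MessageQueue(queuePath))
        {
            System.Messaging.Message message =
                queue.ReceiveByLookupId(poison.MessageLookupId);
            // Log message.BodyStream somewhere for later inspection.
        }
        return true;
    }

    public void ProvideFault(Exception error, MessageVersion version,
                             ref System.ServiceModel.Channels.Message fault)
    {
        // Nothing to do for a queued channel; the client gets no fault anyway.
    }
}
```

The handler would be attached to the dispatcher via a service behavior, which is where the duplicated queue path has to be passed in.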
If you are serious about Office and based in the Netherlands the "Nationale Office Dag" is the thing to go to. Next Thursday, the 30th of November, the next one is held in Ede. And if you are going, come by and say hi as I will be there to present a session on VSTO 2005 SE.
SQL Server 2005 introduced some nice shortcuts for working with queries and identity or GUID key columns. These functions are especially useful if you want to write generic functions without worrying about the key names.
Given a table Article with an identity column the following query will retrieve all data and an extra column __pkey__ containing the primary key, whatever the name is.
SELECT $identity AS __pkey__, * FROM Article
Replace $identity with $rowguid if you are using GUID primary keys.
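For example, assuming a hypothetical Customer table with a uniqueidentifier column marked as ROWGUIDCOL:

```sql
-- $rowguid resolves to the ROWGUIDCOL column, whatever its actual name
SELECT $rowguid AS __pkey__, * FROM Customer
```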
With the release of the .NET 3.0 framework last week the WF team has also released a large set of samples. All samples, or at least all I have looked at so far, are both in VB and C#, always good to see :-)
Getting a hot fix has always been a bit of a painful experience. It involved calling Microsoft support, and lots of people thought hot fixes should really be just a simple download.
If you ask me it’s a step in the right direction :-)
So finally back home after a week of Tech-Ed and time to reflect on the week.
Overall it was a great week with very few downsides. The only real downside was the hotel, my own choice, and the fact that La Rambla was pretty much a brothel at night. But not much of a problem and I was pretty busy anyway so it only bothered me a bit on the way back at the end of the day.
On average the sessions I went to were good, some very good, so no complaints there. I would like to see some more in-depth coverage of subjects, but then being an MVP probably means I am ahead of the pack and many things really are that easy. Yeah right: I had an interesting discussion about deploying a VSTO solution to 150 workstations and that is certainly not easy.
Being on the ATE panel was nice too. Except for the first day we got plenty of questions about a whole range of subjects. Not surprisingly LINQ is a major source of questions, even if it is still a year away. Interestingly, wearing the ATE badge even resulted in questions while in the subway, no rest :-)
The food and drink, while plentiful, were not all that good this year. Especially the coffee sucked big time :-( But then the waffles were great :-)
All in all I had a great time and hope to do it again next year!
So the last day of Tech-Ed has started. That is both good and bad. Guess it's good because people are getting tired, I know I am, and suffering from information overload. The bad part is that there is so much more I want to hear about, guess I will be waiting for the DVD with recorded sessions to arrive.
Duty at the ATE booth yesterday was interesting again. Lots of people with questions, a good thing. Like before a lot of people came with questions about LINQ, guess it’s the kind of thing that excites a lot of developers. And rightly so if you ask me. The most challenging question came from a Dutch student who was working on a managed C++ app that had to pass callback function pointers back into an unmanaged video stream library. Guess being a VB guy doesn't really qualify me to debug C++ issues, but as there were no C++ people around at the time I gave it a try. And guess what: we remote desktopped into his PC at home and found and fixed the problem. Not bad for someone who hasn't done any C++ for over 10 years :-)
At the moment I am listening to a session about service factories. Nice presentation and interesting subject.
Well that’s it for now, more later.
Another day has started at Tech-Ed. Last night we had the country parties and the effect is visible on the people, they are getting tired and were slow to start this morning. But maybe it's more than just the parties, as there are loads of information and people are busy absorbing it.
First session this morning was about LINQ for SQL by Anders Hejlsberg. He went over a lot of the details of LINQ for SQL. LINQ for Entities was mentioned but unfortunately he didn't go into when each would be the most appropriate choice. LINQ for SQL is pretty capable, with lots of options: for example, you can specify the mapping in attributes or in a mapping file, and use tables, views, sprocs or table functions as the source of the data. Table functions look especially interesting compared to sprocs: when you select data from a sproc all data is returned to the client and filtered there, but in the case of a table function the filtering is actually done in the SQL Server database, much more efficient :-) As usual another great session by Anders!
I will try to get some more data about when to use LINQ for Entities versus LINQ for SQL and post about that later.
At the moment I am listening to Steve Lasker talking about the smart client offline caching and sync framework. Another interesting session, more later.
So finally some more sessions. I actually got to see two sessions about Windows Workflow Foundation. The first was by Matt Winkler. He gave a nice overview session about WF and how to use it. The session included some nice performance charts. Of course the performance statistics are very general and he warned about taking them too literally. Anyway the figures, and a lot more, are soon to be released in a WF performance whitepaper. The room was almost completely full.
At the moment Paul Andrew is explaining how to create custom activities. Another nice session with a packed room. It's clear a lot of people are interested in WF and how to develop using it. Nice to hear, guess that means there will be a lot of people who can benefit from, or contribute to for that matter, the Windows Workflow Foundation Wiki.
Well it is said that shoes are a vital part of making a first impression, remember the film Sneakers, so it seems we are getting a new MVP perk in the way of shoes. Karen Young is walking around Tech-Ed in the new prototype.
Did another ATE booth duty this morning and it was quite interesting. Yesterday there weren’t too many questions, but this morning we were quite busy with all the people stopping by. Maybe the fact that we added beer to the list of possible questions answered helped, but the questions were still all about .NET. Unfortunately we couldn’t help everyone; some questions about compiler errors using C++ were a bit hard as there was no one around with C++ skills. Fortunately Karen Liu from Microsoft was there and she is going to make sure the product team follows up. Other than that we were able to help most people, which is a nice thing :-)
One thing that came up a lot was LINQ. People like it and want it :-) The only problem with LINQ is that it will be another year before we can start to use it for real :-(
So I didn't get to see any complete sessions this morning, but just now went to Steve Lasker's session about SQL Everywhere, oops that should be SQLCE. It was an interesting session where he demoed some of the new sync capabilities that they are working on.
Before the ATE duty started I did get a chance to see the first half of Aaron Skonnard's session about BizTalk. He did a nice job and the first half was pretty interesting, too bad I had to leave halfway through, but that is a session I will replay on the DVD for certain.
Well time for a drink and another session to go to :-)
At the moment I am watching Anders Hejlsberg talking about LINQ. As you may have noticed before, I am fascinated by LINQ and the power it offers. Searching in any collection will be a lot easier, but especially database and even more so XML queries will become so much easier to do. Based on the details given in the session it's hard to believe that it will still be another year before we can actually start to use this for real. But that is just the way it is; it's part of Orcas, Visual Studio 2007, something that will be released at the end of next year. Too bad, I could use it :-)
The keynote has just finished here in Barcelona. Not a lot of really spectacular announcements. Some nice demos about how Vista is going to allow a lot of new functionality, but that isn't really news anymore. The main announcements are that Office 2007 and .NET 3.0 are RTM. Not sure if they are already available on MSDN downloads, but if not they will be real soon now.
The first session I wanted to go to was about SharePoint Services v3. Guess a lot of other people did the same, as the session was already full by the time I got there. No biggie as there are plenty of other interesting sessions to go to :-)
This afternoon I will be on duty at the ATE stand for Visual Studio 2005.
Well, almost time to leave for Tech-Ed! Guess I better start packing my bags :-) Not that I can take a lot of luggage as my back is still giving me trouble.
I am really looking forward to it, lots of good sessions to go to. This year I will be at the Ask The Experts booth, to be exact booth 26, the Visual Studio booth, for part of the time. The plan is for me to be there at least on Tuesday afternoon, Wednesday morning and again on Thursday afternoon. So come by and say hi if you are going to Barcelona too.
Recently I needed to find and fix a memory leak in an application I have been working on. Now the application is quite big and I am by no means the only developer, so I didn't exactly have an obvious starting point. Fortunately I do have a copy of the ANTS Profiler, see http://www.red-gate.com/, and it turned out to be more than just useful, it’s a real life saver in these kinds of circumstances.
Turns out there were multiple memory leaks.
Lesson 1: never think you are done after fixing a problem, always make sure there isn't a second one.
Using the ANTS Profiler to create a number of snapshots of the running application, I noticed that there were numerous copies of an object that was supposed to be pretty much a singleton. I say pretty much because a second copy can be around under specific circumstances, but it should not be around very long. It turned out that they were kept in memory by a delegate.
Lesson 2: Delegates hold a reference to their target object and that will keep it in memory.
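A minimal sketch of the lesson (Publisher and Subscriber are made-up names, not the classes from our application):

```csharp
using System;

class Publisher
{
    public event EventHandler Changed;   // holds delegates, which hold targets
}

class Subscriber
{
    public void OnChanged(object sender, EventArgs e) { }
}

class Program
{
    static void Main()
    {
        Publisher publisher = new Publisher();    // long-lived object
        Subscriber subscriber = new Subscriber(); // meant to be short-lived

        // The delegate's Target property now references subscriber,
        // so subscriber stays reachable as long as publisher does...
        publisher.Changed += subscriber.OnChanged;

        subscriber = null; // ...even after we drop our own reference.

        // The cure is to unsubscribe when the object is done:
        // publisher.Changed -= subscriber.OnChanged;
    }
}
```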
So there where too many in memory and I knew what was keeping them there but what caused them to be there in the first place? It turned out to be because of cloning. One of the things we do in this application is allow the user to undo and redo pretty much everything. So we track state by creating clones of the original objects. Works great as long as these objects aren't too large, something that was supposed to be the case. We actually use a generic CloneObject function that will clone every serializable object. This works as follows:
public static T CloneObject<T>(object obj)
{
    MemoryStream ms = new MemoryStream();
    BinaryFormatter serializer = new BinaryFormatter();
    serializer.Serialize(ms, obj);
    ms.Position = 0;
    T newObject = (T)serializer.Deserialize(ms);
    return newObject;
}
Yes it’s a C# project :-)
Now this does a deep clone, meaning it will not just serialize the object itself but also everything it points to. Very useful if you want to clone a set of related objects, like a header with a collection of details, but if someone adds a reference to another object that will get cloned too! And that was exactly what was happening in our case.
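A contrived sketch of how that happens (the class names are made up): one innocent-looking field, and every undo snapshot suddenly drags a large object graph along.

```csharp
using System;

[Serializable]
class Document
{
    public string Title;
    // Added later by someone else: now CloneObject copies the
    // whole cache with every undo snapshot of the document.
    public Cache SharedCache;
}

[Serializable]
class Cache
{
    public byte[] Data = new byte[10 * 1024 * 1024]; // 10 MB per clone
}
```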
Lesson 3: Even simple code can cause difficult problems.
So tracking this down using the ANTS Profiler was not too hard; in fact I can't even imagine doing it without. But even a useful tool like the ANTS Profiler could do with some improvements. The two things I missed most were:
- Seeing how an object is rooted.
- Seeing which objects were ready for garbage collection but not collected yet.
The first is needed because the .NET garbage collector removes all objects that cannot be reached from a root object. So if an object is still there, I need to know how it is rooted. Now I needed to track this down using the object hierarchy, but because that leads to numerous paths it is time consuming to find them all.
The second is needed because some objects might already be ready for garbage collection, making it pointless to track them. In fact the ANTS Profiler does a GC.Collect() before taking a snapshot, but any object with a finalizer is only finalized at that point and not actually collected until the next garbage collection cycle. Due to this behavior they will still be in the snapshot.
Now as it turns out Red Gate is working on a new version and they are already planning to address both issues. In fact they have an ongoing survey where people can vote on or suggest features. I suggest you go to http://www.surveymonkey.com/s.asp?u=537852784382 and participate, it will only take a few minutes and provides them with valuable feedback.
When the .NET runtime needs to create an object it first needs to load the assembly the type is stored in. The basics of that process are not all that complex to understand and are a must know for every .NET developer. If the type is stored in the same assembly everything is nice and cozy, as no searching for assemblies has to be done. The process gets interesting when the runtime has to go and search for another assembly.
One thing that makes quite a bit of difference is whether the assembly in question has a strong name and is deployed in the GAC. To use side by side deployment an assembly needs to be deployed to the GAC, and for that a strong name is required.
The first step for the runtime is to check if the assembly has been redirected using the app.config <bindingRedirect> and/or <codeBase> elements. Of course the <bindingRedirect> element can only be used if the assembly has a strong name; if not, the version is pretty much ignored. Note that besides the app.config the same checks are done in the assembly's publisher configuration file, unless explicitly disabled, and in machine.config.
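As a sketch, such a redirect looks like this in app.config (the assembly name, public key token and versions are made up):

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="MyLib" publicKeyToken="32ab4ba45e0a69a1" culture="neutral" />
        <!-- Send all 1.x requests to the 2.0 build -->
        <bindingRedirect oldVersion="1.0.0.0-1.9.9.9" newVersion="2.0.0.0" />
        <!-- Optionally point the runtime straight at the file -->
        <codeBase version="2.0.0.0" href="file:///C:/Shared/MyLib.dll" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```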
Now we know the exact name, including version, of the assembly to load. Next the runtime goes through the list of already loaded assemblies to check if the assembly is already loaded. Keep in mind that the version number is only used for assemblies with a strong name; if there is no strong name, only the actual assembly name is used.
So if the assembly wasn’t loaded and it has a strong name, the GAC is checked. Note that this takes precedence over the local directory, so the local copy of the assembly will be ignored if there is a version with the identical version number in the GAC.
If no suitable assembly was found in the GAC the runtime starts looking in other places. It uses the <codeBase> information found in the config files. If this is specified the runtime loads the file indicated. If not the runtime starts probing a number of directories. The first directory tried is the application installation directory. Next a subdirectory with the assembly name is tried, after that the culture and a combination of culture and assembly name. If still not found the runtime tries the binpath, first by itself and then combined with the culture and assembly name.
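To make the order concrete, here is a sketch of the probing sequence for a culture-neutral assembly called MyLib with a binpath of bin (the names are made up; the exact rules are in the runtime documentation):

```
appbase\MyLib.dll
appbase\MyLib\MyLib.dll
appbase\bin\MyLib.dll
appbase\bin\MyLib\MyLib.dll
```

For an assembly with a culture, each of these steps gets an extra culture subdirectory, which is where the culture combinations mentioned above come from.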
So that is quite a list of places the .NET runtime checks for a specific assembly, no wonder loading new assemblies can be somewhat slow.
Now the most important thing to remember is that only the GAC allows for side by side installation of multiple versions of an assembly, and in order to do so the assembly needs to be strongly named. Giving an assembly a strong name is simple: just open the project's properties, go to the Signing tab and check the Sign the assembly checkbox. Next specify the strong name key file, either by creating a new one or choosing an existing file. Then build the project to create the assembly.
Deploying the assembly to the GAC is done using GACUtil, a command line utility found in the .NET Framework SDK. Use GACUTIL –i <assembly file> to install the assembly into the GAC. To check which versions are currently in the GAC you can use GACUTIL –l <assembly name>.
So people were just getting used to the name SQL Everywhere and now Microsoft has decided to rename it to SQL Server Compact Edition. For one it helps prevent confusion with SQL Anywhere (a Sybase product) and SQL Express.
So what does this mean for the product? Well, very little if anything has changed from the announcements of SQL Everywhere. It's still targeted as a local data store for WinForm/smart client applications and is certainly not restricted to the .NET Compact Framework. So I still think this is a pretty cool product :-)