Daily Dribblings of a Demented Developer

A software developer who noticed one day that all software developers were active bloggers, and was wondering if he was missing out on something…

Updated: 2017-08-11T22:04:10.199+10:00


Getting the correct versions of .Net Framework for WinDbg


To debug a minidump of a .Net application created by Windows Error Reporting, it is essential that you have the correct versions of the following 3 dll's:

sos.dll
mscorwks.dll
mscordacwks.dll

The correct version is the version of the .Net framework that the user was running at the time the minidump was generated. For example, if the minidump was created using framework 2.0.50727.4971, then you need the 2.0.50727.4971 versions of the dll's mentioned above to debug it. Chances are this is NOT the version you currently have on your machine; it may not even be a valid version for your OS (XP, Vista, 7, 8) or architecture (x86, x64). As far as I'm aware, there is no central repository where you can go and grab the correct versions of these libraries, so one has to find them the hard way.

The .Net framework is initially deployed with our application; however, once installed, it is regularly updated by Windows Update to ensure that security vulnerabilities and major defects are patched. If your application is only installed on a single server (or server farm), this is not really a challenge; however, if your application is installed as a desktop application on nearly a quarter of a million PCs world wide, finding the correct version of the framework requires some sleuthing skills, and a bit of luck.

The first step is to try and find the Microsoft support knowledge base article that describes the update. To do this, I've found that typing the following query into Google usually gets the result you want:

.Net framework 2.0.50727.4971

It is usually the first hit, but it may be further down. Validate this by searching the resulting page (Ctrl+F) for the framework version (e.g. 2.0.50727.4971) and make sure SOS.dll is one of the binaries with this version number. Go to the download information section of this page and click on the provided link.

This should take you to the Download Center page, where you should see the download options. Click to download the architecture you require. This will download the .msu file for the update. You can use the command-line Expand utility to list the files in this update package:

expand.exe -D .\Windows6.1-KB2604114-x86.msu

This will list all the files in the package; in this case there are 4, including Windows6.1-KB2604114-x86-pkgProperties.txt and Windows6.1-KB2604114-x86.xml. You want to extract the files; to do this, use:

expand.exe .\Windows6.1-KB2604114-x86.msu .\

You can then use expand.exe to interrogate and extract the contents of the contained .cab file. To extract the files you want, use:

expand.exe -F:sos.dll .\ .\
expand.exe -F:mscorwks.dll .\ .\
expand.exe -F:mscordacwks.dll .\ .\

Each of these statements will usually result in 2 folders, named something like the following:

x86_netfx-sos_dll_b03f5f7f11d50a3a_6.1.7600.16992_none_e878512ab7239f09

You'll need to check both dll's and make sure you get the version you are after. Simply copy these into a common folder, and you can then use the following WinDbg commands to set the exepath and load the correct sos.dll:

.exepath+ C:\MyFrameworkFolder\fwx86_2.0.50727.4927
.load C:\MyFrameworkFolder\fwx86_2.0.50727.4927\sos.dll

Happy debugging. [...]
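Once the matching sos.dll is loaded, a session typically continues with the standard SOS commands. A sketch of a typical sequence (commands only; this assumes the 2.0-era SOS that ships alongside mscorwks):

```
0:000> !threads          * list the managed threads in the dump
0:000> !pe               * print the managed exception on the current thread
0:000> !clrstack         * show the managed call stack
0:000> !dumpheap -stat   * summarise the objects on the managed heap
```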

Telstra’s Transparent Proxying


I am finding Telstra's transparent proxying a nightmare at the moment. I am trying to configure a Windows Azure SQL DB instance I have set up on a trial account I got from TechEd. Azure SQL Database security is tied to your IP address. You can click "Manage" from the portal window to add your current IP address to the firewall rules, but then, thanks to Telstra's transparent proxying, when you connect with SQL Server Management Studio you present a different IP address, and consequently can't access SQL Server.

Anyone know of a solution to this, short of getting a fixed IP address and opening up the firewall to the world?

Thoughts on Gina’s $2 a day comment


I think there is a point where the ruling class becomes so arrogant and full of its own self-importance that the "plebs" see their only logical course of action as full-scale revolution. I don't think we're there yet, and as history has taught us, revolutions rarely succeed in doing more than replacing the head on the coin, but with Gina Rinehart's latest comments, I feel we have inched one step closer to it.

Indeed there are people in West Africa who will work for $2 a day. Many West African nations are struggling economically, and desperate people will do desperate things. Perhaps the global economy does need some re-adjustment, and maybe, just maybe, the "invisible hand" of the market can do something to address the great injustices of the world, but for Gina Rinehart to infer that Australians need to take severe pay cuts just because her precious mining business is in danger of slowing down makes it clear just how detached from reality she has become. You can almost hear "let them eat cake".

It is not the responsibility of the working class to fund the ruling class's lifestyle. If you can't make your business profitable while being ethical to your workers and paying them a liveable wage that takes into account cultural and social standards, then you don't deserve to be running a business. Perhaps if Gina were to spend time in West Africa working for $2 a day, she might have a little more sympathy. Even if she were forced for a few months to exist on an Australian unskilled labourer's wage, she might get the picture.

The Slippery Slide


I was raised as a fundamentalist Christian. Until the age of about 25, I believed the following facts to be immutable. There is a metaphysical, all-knowing and perfect being who exists outside of time and space, who for want of a better name is called God. This God being created the entire observable universe, including a species of cognisant beings in his image who at some point in their "time based" existence started referring to themselves as homo sapiens. These beings were created with a mortal physical presence and an immortal soul. This God being spoke, throughout this time based history, to some of these cognisant beings who were deemed worthy to accurately record all of his various thoughts and actions over the course of a few thousand years. These writings were then collected around 300 CE by a committee of these cognisant beings who were doing the will of this God being. This collection of works went by the name "The Holy Bible". This God being made up some elaborate rules for these cognisant beings to gain this God's favour and be eligible to co-exist with this God being, after their mortal physical presence had expired, in a really nice place. The alternative to this really nice place was a pretty horrible place, and those who didn't follow these elaborate rules would be sent there. These elaborate rules were spelled out in this Holy Bible book. Oh, and this God being has a penis… (a freakin' huge one).

I chose the word "fundamentalist" in my opening sentence very specifically. There is a quintessential difference between someone who believes the Bible is the written word of God and infallible, and someone who believes that the Bible is a book of collected works written by humans who were fallible, but had certain insights into the divine at a particular point in history. There are many aspects in which this distinction plays itself out.

One we see commonly is the biblical literalists' insistence on attacking the scientific theory of evolution, and insisting, against all evidence to the contrary, that the earth is less than 10,000 years old. The reason biblical literalists cannot, and will not, accept that the creation story is anything more than a myth for the benefit of an illiterate desert tribe to take solace in, is that they argue this denial of the literal truth of the opening chapter of the Bible creates a slippery slide that will hold the rest of the Bible's message up to critique, and will ultimately invalidate (in their minds) the undeniable truth of the word of God. It has very little to do with the pseudo-science of intelligent design.

People who fall into the second category, however (those I would refer to as liberal Christians), see no problem with juxtaposing the creation myth, as a beautiful and metaphorically elegant story of the supernatural origins of the universe, against the scientific theory of evolution. I myself went through a phase of believing that the Bible wasn't meant to be a modern scientific dissertation, but rather that the creation story was meant to provide sufficient information for people in a pre-scientific era to satisfy the ontological question, and move beyond it to more important aspects of how to live a good life. While I was still able to accept the rest of the dogmas of the particular brand of Christianity I was involved with at the time, this realisation marked my slow progression from fundamentalist Christian to liberal Christian, which eventually culminated in my current position of atheism.

But there was much more to this progression than simply taking the creation story metaphorically. I've spoken with a number of ex-fundamentalists. Some have, like me, gone to almost the polar opposite of atheism. Others have chosen the less extreme agnostic route. Some have found solace in some form of deism, while others prefer to hold to their theistic understanding of the world but choose a more liberal approach to their Christianity. Man[...]

.Net Performance and Scalability Tuning Tip #2


The Database Bottleneck

Why is my database so flaming slow? Have you heard this before? I have. I've also heard it given as a good reason not to use relational databases. There are a number of reasons why a particular database could be slow, but let's take a step back and try to understand what a relational database is actually trying to do. Relational databases are designed to be a mathematically structured, transactionally consistent and optimized way to store and retrieve large amounts of "relational" data. If you are not storing relational data, then don't use a relational database; it is not optimized for your usage scenario. Problem solved… well, perhaps not.

Let's assume you are storing relational data, but the database still seems slow. Well, I could go into detail about database tuning, but I'm not a DBA, and there are people out there who are far better at that than I am. Database tuning also tends to get very vendor-specific at its deepest levels. Besides, that's not what I really want to focus on here. Maybe you could look at the performance of your network, or the disks and other hardware your database server is running on, but again, I'm not a network or hardware engineer, I'm a software engineer. So let's fix it in software… or at least, to the best of our ability. I'm going to assume, then, that your database is well tuned and maintained by someone who is experienced in doing so, and who knows all the different ways this database is going to be used in your organization. Once we make these assumptions, we can no longer say "the database is slow". I would go further and say that any modern database is actually very efficient at what it does. Still, I have seen developers in this situation trying to blame the database for the poor performance of the application.

Most often the database resides on an entirely different box to the one your application is running on. This is where the problem begins. To actually get data from a database you have to create a database connection, which opens a network port, negotiates a conversation over a protocol, and asks the database server for some data. The database engine then does some processing to find your requested data or perform your operation, and finally returns the results as data packets over the network back to your application. Now that mightn't seem like much, but compared to grabbing that same data from your local memory, or even a file on your local disk, we are talking many orders of magnitude difference. Even if you are using a database engine on your local machine, it will almost certainly be running in a separate process, which means that although you don't have a physical network to navigate, you still have inter-process communication protocols to deal with, and this can still be a few orders of magnitude slower than getting the data directly from memory.

The most important thing about developing against a database is to have respect for the overhead required when querying it. Based on this respect we can establish a few basic rules about how to interact with a database.

1. Don't keep asking the database for the same thing over and over again. If it's something you need regularly, cache it. Caching comes with its own problems, the biggest of them being when to invalidate it, but the one thing you can be certain of is that it won't be slow.

2. Try to ask the database as few times as possible for information during any one operation. Ideally it would be great if you could simply have a single SQL statement that gets everything you need to perform the desired operation, and it is worth investing a few (thousand) CPU cycles figuring out exactly what data you are going to need for a particular operation before hitting the database. It is also usually better to return a little more data than you might need, than to find you have to hit the database again because you forgot something. Of[...]
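Rule 1 can be sketched in a few lines using .Net's System.Runtime.Caching.MemoryCache (available from .Net 4.0 onwards). The ProductCache class and its lookup below are hypothetical stand-ins for whatever query your application keeps repeating, not code from any particular product:

```csharp
using System;
using System.Runtime.Caching;

static class ProductCache
{
    // Shared in-memory cache; entries expire so stale data is eventually re-read.
    private static readonly MemoryCache Cache = MemoryCache.Default;

    public static string GetProductName(int productId)
    {
        string key = "ProductName:" + productId;
        var cached = Cache.Get(key) as string;
        if (cached != null)
            return cached; // served from memory, no round trip to the database

        string name = LoadProductNameFromDatabase(productId); // the expensive hit
        Cache.Set(key, name, new CacheItemPolicy
        {
            // Invalidate after five minutes; tune to how stale you can tolerate.
            AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(5)
        });
        return name;
    }

    private static string LoadProductNameFromDatabase(int productId)
    {
        // Placeholder for the real ADO.Net query.
        return "Product " + productId;
    }
}
```

The expiration policy is exactly the "when to invalidate" trade-off mentioned above: a shorter window means fresher data but more database hits.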

Travelling Again


I am currently travelling overseas, partly for work and partly for pleasure… actually, mostly for pleasure. As with other trips, I find that travelling to foreign countries creates a lot to reflect on, and consequently write about. My first stop is California. I am currently in a hotel in Los Angeles; on Monday I will be driving to Anaheim, where I will be attending the Microsoft Build Windows conference, then I am planning to head to Santa Barbara for a bit of R & R. After that I will be visiting Buenos Aires. I will use this, my personal blog, essentially as a travel diary, and my geek blog as a means of recording my thoughts on the conference.

Microsoft doesn’t care about feedback from Australians


I’m attending the Build Windows 2011 conference in Anaheim this coming week, and as part of this Microsoft sent me an email (presumably to all attendees, or at least all who indicated that they were developers) inviting us to share feedback in special limited-numbers discussion groups. First, though, they require you to fill out a quick survey. So I click on the provided link, and I’m presented with the following question.


OK, fine, I’m not too proud to be lumped in with “the rest of the world” team, so I click “Somewhere Else”, then “Next”. I am then presented with this

Thank you for your interest in our survey. Those are all the questions we have today. We hope that you will consider participating in future surveys.

OK, so basically, if you are not from one of these 8 countries, then your opinion is completely irrelevant to Microsoft. Regardless of the fact that you are the technical architect for the Windows/PC version of a product that has over 400,000 user licenses world wide. Regardless of the fact that you are fostering a product that is responsible for direct license sales of SQL Server 2008 R2. The message is loud and clear: if you’re not in Microsoft’s G8, then your opinion is worthless. I wonder if our Mac technical architect (when he is eventually appointed) will get the same kind of treatment from Apple?

Really, the least they could have done is let me continue with the survey and then disregard the information anyway; I mean, it really doesn’t hurt to store a few extra bytes.

Today’s Random Grumble


I absolutely love OneNote, and use it all the time, but there are some things that really infuriate me at times… I mean, how difficult is it really to get URL parsing and highlighting working properly?


.Net Performance and Scalability Tuning Tip #1


Measure twice, cut as many times as you like.

As promised, I am starting a series of blog posts on .Net performance and scalability tuning. I have been doing this in my current role at QSR International, and have been amazed at just how many possibilities I have found to improve the performance of the application. Each time I get to a point where I think I can’t get any more performance out of it, I find something else, or have another idea. Not all of these tips are specifically .Net related; many of them will be applicable to any programming language, but my focus for the past 7 years of my career has been specifically on .Net, so naturally I will be focussing fairly heavily on that. These tips will also be fairly heavily “Rich Client” based, as that’s where I have spent most of my career; however, some of them will also apply to .Net code running anywhere.

My first tip is simple: measure what you are trying to tune. You’ll probably see many bugs in your bug tracking system (you do use a bug tracking system, right? If not, I suggest you stop reading this article right now and go find yourself one ASAP) that read “Application is slow when I do XYZ” or “Opening form foo takes forever”. The first step is to get this quantified: exactly how many seconds does it take to do ‘X’, where ‘X’ is repeatable. Testers will probably have a set of test data, and you need to get your hands on this. Often performance issues surface because the data is structured in a specific way, and if you have differently structured data you may well not spot the problem (have a free “Works on My Machine” certification). Once your testers have measured it, you might want to start a discussion around what is acceptable, and what you should be aiming for in the tuning.

Now, I said “measure twice”. After the testers have got some basic timings that you can use for comparisons at the completion of your tuning, it’s your turn to measure. There are a number of tools that will analyse your code and assist in highlighting the areas that require the most attention. Visual Studio comes with a performance wizard under the “Analyze” menu. This is a reasonably good starting point, but personally I prefer tools with a nicer UI. My current favourite is “ANTS Performance Profiler” from Redgate, but there are others out there (feel free to leave a comment if you have another preference). For me, the visualizations provided by ANTS Performance Profiler allow me to very quickly and effectively focus my attention. Without this knowledge, you can spend a lot of time optimizing bits of code that are called so infrequently that, even fully optimized, they won’t make any noticeable difference to the overall performance.

Once you’ve picked a measuring tool, learn it and master it. I usually like to take a series of measurements as I go: as you improve the performance of one problematic area, others will start to rear their ugly heads. A series of good visualizations can also make for some good discussions with management if they want to know how you’re going, and it is a good education tool for other developers in your team; you can show them exactly why you should, or shouldn’t, do particular things.

Now that you have measured twice: unlike carpentry, where the proverb “measure twice, cut once” comes from, as coders we can cut as many times as we need. Assuming you are using a decent source control system (and by the way, if you’re not, stop reading this article and go get yourself one NOW), you can confidently try different options to your heart’s content. Keep in mind that sometimes release code behaves slightly differently to debug code, and as such, your final sign-off should come from the testers testing the release version of your build. Sta[...]
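For the simplest possible "measure it yourself" step, System.Diagnostics.Stopwatch is enough before you reach for a profiler. This is just a sketch; DoOperation stands in for whatever scenario the testers reported as slow:

```csharp
using System;
using System.Diagnostics;

class TimingHarness
{
    static void Main()
    {
        // Warm up once so JIT compilation doesn't pollute the numbers.
        DoOperation();

        var watch = Stopwatch.StartNew();
        for (int i = 0; i < 10; i++)   // repeat to average out noise
            DoOperation();
        watch.Stop();

        Console.WriteLine("Average: {0} ms over 10 runs",
            watch.ElapsedMilliseconds / 10.0);
    }

    static void DoOperation()
    {
        // Stand-in for the "slow" scenario reported by the testers.
        System.Threading.Thread.Sleep(10);
    }
}
```

Numbers from a harness like this are only comparable if you run them against the same data and hardware each time, which is another reason getting your hands on the testers' data set matters.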

Long Time No Blog


For various reasons, I haven’t blogged much over the past few years, but now I feel inspired to get back into it. A lot has changed in my professional life since I last put pen to paper (or rather ascii to screen), but a lot still stays the same.

My last post “Enjoying my iPhone” is an indication of just how long it has been. I now use a Windows Phone 7 device, and I am really liking it. To me the iPhone UX was starting to feel a little stale, and while it was revolutionary for its time, I think MS have mounted a real challenge with their “Metro-themed” UX. Having said that, I still keep my old iPhone around just in case. My biggest gripe with my HTC HD7 is that it only has 16GB of storage (my iPhone 3GS had 32GB), and as such I can’t store my entire music collection on it.

The biggest news, however, is that I no longer work for Readify. As much as I love Readify, and felt honoured working alongside some of Australia’s and even the world’s best geeks, I got to a point where I really needed something different. I now work for an ISV called QSR International. They write an awesome piece of software for qualitative data research called NVivo that is used in universities and research institutes all around the world. I have been working with QSR since April this year, and so far I am really enjoying it. It’s partly the work I have been doing with them that has inspired me to post a series of new blog articles on .Net performance tuning, so keep a look out for that.

In other non-geek related news, the inadequacies of the Melbourne public transport system have finally caused me to give up on my hippy car-free lifestyle, and join the masses who are forced to own a car… well almost.


I am now the proud owner of a Honda CBR 600 F4i (named Xena), and absolutely love her. Although I do still catch the bus a fair bit as I believe that it increases significantly my chances of surviving the journey.

Keep an eye out for my series on .Net performance tuning… coming soon… I promise.

Enjoying my iPhone


I have known for a long while now that there were significant deficiencies in the Windows Mobile approach to smart devices. This had been made clear by Apple's paradigm-shifting release of the iPhone. However, the fact that I could develop applications for my Windows Mobile device using tools I was familiar with kept me holding firmly to the inferior device, until about a month ago, when frustrations with my HTC Touch HD (WM 6.1) reached critical mass, and it happened to coincide with a colleague of mine upgrading to the iPhone 4 and selling his old (almost unused) iPhone 3GS at a very reasonable price.
Since then, I have to say I have really been enjoying my iPhone, although, fear not dear reader, I am not about to become yet another Mac fanboi... At least not that easily.
There are some really nice things about the iPhone, but at the same time, some things that could be improved.
The Good
The focus on user experience is really where the iPhone (and indeed Apple) shines. It is quite obvious that this is a device designed to be used while travelling on a train with one hand full of groceries and your texting arm wrapped around a grab rail for support. The different context-sensitive keyboards, the size of the keyboards, the auto-word prompt when you mistype a word, and the ability for applications to switch orientation all add up to a great user experience when inputting and reading text, as is evident from the fact that I typed up this whole blog post using my iPhone. The concentration on more natural user interfaces is very welcome.
The hardware is fast becoming the minimum spec for smart phone manufacturers that want to compete in the modern market, which I think signifies that Apple really nailed it.
The proliferation of iPhone apps has seen the platform become an instant hit, and overall I am extremely happy with it, to the point where I will not be going back to a Windows Mobile 6.* device. The overall consistency of "most" applications in the App Store, and the new device-friendly controls that Apple have encouraged developers to use, are on the whole really good; however, that could become a double-edged sword.
The Bad
Battery life sucks. Also, coming from a Palm OS background, I do miss Graffiti as a form of input, but I guess there were so few people who bothered to learn it that I'm probably in the minority here. The 20MB download limit when not on a wifi network I see as an affront to my free-market right to purchase the bandwidth I want and how I want it delivered.

I will be getting a Windows Phone 7 device when they come out, because I really want to get into the WP7 development space. I'll have to wait and see how well WP7 performs before I make a choice as to which device I want to use on a day-to-day basis… may the best phone win.

You know your system is in trouble when…


Tried to use my favourite shortcut this morning (Windows Key + E), and I got this.


Speaking at Victoria.Net SIG


Just a quick note to let everyone know that I will be speaking at the Victoria .Net SIG on the Microsoft Sync Framework this coming Tuesday.

Hope to see you there.

More bad luck


As my attempts to learn Sync Framework continue, I find even more issues along the way, this time with the SQLCE runtime. The basic scenario boils down to attempting to fill a Detail view from a dataset linked to a SQL Compact Edition datasource. When the application is run, an error message to the effect of

Unable to load DLL 'sqlceme35.dll': The specified module could not be found. (Exception from HRESULT: 0x8007007E).

is displayed, and the application crashes. After trying many different things to solve the problem, a simple Google search on the exact error yielded the solution. Basically it is a problem on Vista and Win7 x64 environments, and simply requires you to install SQL Compact Edition SP1, found here.

I guess I was just surprised that it was not included as part of Visual Studio 2008 SP1.

A colleague of mine Mitch Denny has told me that “If I was going to pick a person to clear a minefield you would be that person”, I replied that I was hoping it meant he believed I had special unique skills for finding and defusing mines, and not just that I am expendable.

TFS 2010 Console Application and the .Net 4.0 Client Profile


One little frustration that has caught me twice now (and so it’s time to blog it so I remember it): when you create a Console Application and then start adding references to any of the TFS API libraries (such as Microsoft.TeamFoundation.Client.dll), if you just accept all the defaults for a Console Application in Visual Studio 2010 Beta 2, you get this warning

Warning    1    The referenced assembly "Microsoft.TeamFoundation.Client, Version=, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL" could not be resolved because it has a dependency on "System.Web, Version=, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" which is not in the currently targeted framework ".NETFramework,Version=v4.0,Profile=Client".    BasicEventListener


and the application will not compile.

The issue is the .Net 4.0 Client Profile does not contain the reference to System.Web, which is required by the TFS API. This is simple to fix, just go to the Properties of your project, and for the Target Framework, select “.Net Framework 4” instead of the “.Net Framework 4 Client Profile”.
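For reference, the same change can be made directly in the .csproj. This is a sketch of the relevant fragment as Visual Studio 2010 writes it; removing the TargetFrameworkProfile element targets the full framework:

```xml
<PropertyGroup>
  <TargetFrameworkVersion>v4.0</TargetFrameworkVersion>
  <!-- Remove this element to target the full .Net Framework 4
       instead of the Client Profile -->
  <TargetFrameworkProfile>Client</TargetFrameworkProfile>
</PropertyGroup>
```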

The unluckiest developer alive


Sometimes I think I should be a tester for developer tools. For some reason I have the uncanny knack of stumbling onto the most obscure bugs in Visual Studio. I am currently trying to learn Microsoft Sync Framework, and decided (after watching a good video on how to do it) to set up an ADO.Net Sync Services Local Database Cache. I went through the process step by step; the only 2 differences between my environment and the one demonstrated in the video were that I was running SQL Server 2008, and I was using the AdventureWorks2008 database. The procedure I followed was this:

Create a new Winforms Application
Add a new Local Database Cache (this starts the wizard)
Select the AdventureWorks database
Select the Production.Products, Production.ProductDescription and Production.ProductPhoto tables to add to the cache
Click OK

Everything else I left as the default configuration. The result: So I figured it was something to do with my environment. I spent about half an hour Googling and looking at forums before I decided that I must be the first person ever to hit this bug. Next I tried to see if it was because of something in my environment. I checked my setup: Windows 7 64 bit; SQL Server 2008 SP1; Visual Studio 2008 SP1; Sync Framework 2.0 x64. The only thing I could think might be an issue was the Sync Framework 2.0 x64, so I uninstalled it and installed the x86 version. Still the same problem. I then opened SQL Profiler and tried to see if anything was obvious there, but concluded that it was all OK (I later discovered that I didn't look closely enough at this). I then cracked open the wizard dlls in Reflector to see if there was any help there, but alas, that was pretty much a dead end too. I even posted an article on the Sync Framework Forum. Frustrated, I decided that it had to be my own environment that was causing the problem, so I set about creating a virtual environment on GoGrid.

Once I had managed to get everything installed, I tried just the Production.Products table this time, and got the exact same error. I then tried creating a completely new table in a completely new database, and, surprise surprise, it actually worked… well, almost. It got past the wizard, but the code it generated would not compile. It appears that I had stumbled on yet another bug in the Sync Framework wizard: if you use a table called "SyncTable", the generated code is broken. Encouraged by the "success", I pressed on undeterred. I finally got everything working by creating a table called "MyTable". Not satisfied that I had made it work (finally), I decided to try to track down the bug. After about an hour or so of playing around and not having any success, I finally cracked open SQL Profiler again and found that the last statement executed before it died, when trying to create a local database cache for the Production.Product table, was the following:

SELECT * FROM [Product]ion.[Product]

Obviously the code that works out the schema and table name needs some reworking.

Technorati Tags: ADO.Net, ADO.Net Sync Services, Sync Framework [...]
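For what it's worth, the broken statement looks like the result of splitting the combined name on the first occurrence of the table string instead of bracketing the parts separately. A sketch of how the quoting should behave (a hypothetical helper, not the wizard's actual code):

```csharp
using System;

static class SqlName
{
    // Bracket schema and table independently (doubling any closing
    // brackets), rather than doing string surgery on "Production.Product".
    public static string Quote(string schema, string table)
    {
        return "[" + schema.Replace("]", "]]") + "]." +
               "[" + table.Replace("]", "]]") + "]";
    }
}

// SqlName.Quote("Production", "Product") yields [Production].[Product],
// not the [Product]ion.[Product] the wizard produced.
```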

Using the TFS 2010 SDK (Beta 2)


With significant changes to the way TFS 2010 works, especially in relation to Team Build and the new concept of Project Collections, the API has necessarily adapted to account for these changes. For developers wanting to use these new APIs, there is currently very little in the way of documentation. The TFS SDK code gallery site has a small number of examples, as well as the beginnings of the API documentation, which, although sparse, actually offers users some guidance when attempting to write integration software for TFS 2010, thanks to reasonably clear naming conventions. There are also a few other bloggers around who have posted code examples. I have been looking at re-writing TFS Dependency Replicator for TFS 2010, and in so doing have created a very simple application to enumerate some of the objects I am interested in. In the interests of helping other developers, here is my source code. You can also download the full solution from here. To run it, you must have .Net 4.0 Beta 2 and TFS 2010 Explorer Beta 2 installed on your machine.
class Program
{
    static void Main(string[] args)
    {
        // Get a list of all registered Project Collections
        Console.WriteLine("Project Collections");
        var collections = RegisteredInstances.GetProjectCollections();
        foreach (var collection in collections)
        {
            Console.WriteLine("Connecting to TFS Server {0} at {1}", collection.Name, collection.Uri.ToString());
            // connect to tfs
            using (var tfs = new TeamFoundationServer(collection.Uri))
            {
                tfs.EnsureAuthenticated();
                Console.WriteLine("Successfully connected to {0}", tfs.Uri.ToString());
                // Get items of interest from TFS
                EnumerateServer(tfs);
            }
        }
        Console.ReadLine();
    }

    private static void EnumerateServer(TeamFoundationServer tfs)
    {
        // Get all of the Team Projects
        var structureService = tfs.GetService<ICommonStructureService>();
        var teamProjects = structureService.ListAllProjects();
        Console.WriteLine("Team Projects");
        foreach (var teamProject in teamProjects)
        {
            Console.WriteLine("\tName: {0}", teamProject.Name);
        }

        // extract all the work items
        // NOTE: this uses wiql to query the work item store
        Console.WriteLine("Work Items");
        var workItemStore = tfs.GetService<WorkItemStore>();
        var workItems = workItemStore.Query("SELECT [ID], [Title] FROM WorkItems");
        foreach (WorkItem workItem in workItems)
        {
            Console.WriteLine("\tID: {0}\tTitle: {1}", workItem.Id, workItem.Title);
        }

        // Get the build servers
        Console.WriteLine("Build Servers");
        var buildServer = tfs.GetService<IBuildServer>();
        // Get all the build controllers
        var buildControllers = buildServer.QueryBuildControllers(true);
        foreach (var buildController in buildControllers)
        {
            Console.WriteLine("\tController: {0}", buildController.Name);
            // Get all the build agents
            foreach (var agent in buildController.Agents)
            {
                Console.WriteLine("\t\tAgent: {0}, Team Project: {1}", agent.Name, agent.TeamProject);
            }
        }

        foreach (var teamProject in teamProjects)
        {
            // Get each build definition for this Team Project
            var buildDefinitions = buildServer.QueryBuildDefinitions(t[...]

Broken Angels book review


In his first book, "Altered Carbon", Richard Morgan sets up a universe in which human consciousness has been digitized, and can be transported over inter-planetary distances, re-sleeved (placed into another human body), and backed up. Broken Angels follows the main character, Takeshi Kovacs, into another "subjective" life on another colonized planet.

The exploration of the digitized consciousness concept is great; I particularly love the graphic description of the "Soul Market". He also does a reasonable job of describing the not-so-subtle links between greedy corporations and wars, though I feel he could have gone further. This time aliens make an appearance (albeit long extinct), and there are some interesting explorations of the academic attempts to understand alien culture; the parallels with our attempts to understand ancient Earth cultures are fairly obvious.

The development of some of the characters was good enough to understand their motives, while still allowing for some surprises at the end. Other characters, I feel, weren't developed as well as they should have been.

I feel that the book's main problems revolve around the central character, Kovacs. At times it can feel very much like a "one man takes on the world", "alpha male" stereotype; I think this was the first book's main problem as well. It also uses the created universe of digitized consciousness and technological advances to allow for more graphic violence, but I think that is an acceptable exploration of the ideas the universe is based on.

As with the first book, the general outlook is bleak, basically accepting that technology will never be a panacea for human ills, but it does leave the reader with the slightest glimmer of hope at least for some of the characters you have met along the way.

I will definitely be reading the third in the series.


Linq2SQL Bug with derived classes and Calculated fields


Today I discovered a bug in Linq2SQL. It occurs when you define a class hierarchy in the dbml, deriving a class from a base class, where both use the same underlying table and a discriminator property, as shown below:


If you then wish to expose a calculated field in the derived class, (and trust me I have come across situations where this is a requirement), you will find that Linq2SQL throws an exception when you attempt to insert or update items of the derived class to the database.

The exception is as follows

Test method DerivedClassTest.DerivedClassTest.TestInsert threw exception:  System.ArgumentException: Property 'System.String CalculatedField' is not defined for type 'DerivedClassTest.BaseCLass'.

This bug occurs deep within the Linq2SQL logic, and is to do with the AutoSync functionality where Linq2SQL attempts to create a select statement to return the values of AutoSync fields at the same time as it performs the insert.

You may be able to work around this by setting the AutoSync property on the Column Attribute to “Never” as shown below

[Column(Storage="_CalculatedField", AutoSync=AutoSync.Never, DbType="NVarChar(61)", IsDbGenerated=true, UpdateCheck=UpdateCheck.Never)]
        public string CalculatedField

Note that setting this in the designer doesn’t seem to work either; it just removes the AutoSync parameter altogether, and the insert still fails. You have to set it manually in the generated code (another bug, perhaps?).

Of course this means that you won’t get the new value of the field after an insert or update to the table, and you’ll have to re-query the table if you need it.
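If you do need the database-generated value back after an insert, one option is to ask the DataContext to refresh the entity from the database. A minimal sketch, where MyDataContext and DerivedEntity are hypothetical names standing in for your generated context and the derived class from the dbml:

```
using System;
using System.Data.Linq;

// Insert an instance of the derived class, then explicitly re-fetch it.
using (var db = new MyDataContext())
{
    var entity = new DerivedEntity();
    db.DerivedEntities.InsertOnSubmit(entity);
    db.SubmitChanges();

    // With AutoSync.Never the calculated field is stale after the insert,
    // so pull the current database values back into the entity.
    db.Refresh(RefreshMode.OverwriteCurrentValues, entity);
    Console.WriteLine(entity.CalculatedField);
}
```

This costs an extra round trip per entity, so it is only worth doing when you actually need the calculated value straight away.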

The alternative workaround is to ensure that all calculated fields are placed on the base class.

I have joined the e-Book Revolution


I have finally bitten the bullet and purchased a kindle. I had been tossing up between the kindle and the nook, and even though in many respects the nook is a superior device, there were really 2 “features” that persuaded me in the end. The first was Text To Speech, the second was availability.

The Nook does not have Text To Speech, and I could not find any evidence that they were thinking of adding it in a firmware upgrade. Also, I wanted it before Christmas, and the earliest anyone can get their hands on a Nook is the 15th of January.

I’ve had my kindle for 2 days now, and so far I am really happy with my purchase, but there are certainly some negatives, so here is my review.

The positives.

  • Great looking device, thin and light.
  • Reasonable 3G coverage (in most major Australian cities).
  • Text to speech rocks.
  • The latest firmware upgrade (2.3) adds native support for PDFs.
  • Good reading experience (eInk is cool).
  • Support for converting files of many different formats such as html, mobi pocket, rtf, etc…, although native support for these document types would be better.
  • Can sync your personal documents via your own PC.

The negatives

  • No Wifi. I have been amazed at how Amazon have actually sold this as a feature ("You don’t have to hunt around for a wifi hotspot"). Actually, Amazon, I have one of those at home, as do many people, and many of the cafés I would go to read in have them as well.
  • No Australian news content. This really infuriates me, I would love to get The Age delivered to my kindle, but alas I’ll have to go on killing trees to get my news.
  • Proprietary DRM format. It really annoys me that if I do eventually decide to go for another ebook reading device, I will potentially lose my entire book collection.
  • Not all books come with Text To Speech switched on. I got caught out on this on my first purchase, not checking the information thoroughly enough. I feel a bit cheated, as I feel the publisher has no right to determine how I choose to consume their content. Are they going to dictate that I cannot have another human being read it to me? No, then why draw the line at a computer?

My particular reason for wanting the Text To Speech feature is that I am a very slow reader, and Text To Speech would drastically increase the number of books I could “read”. In future I will be making purchasing decisions based heavily on whether or not Text To Speech has been allowed by the publisher. Having said that, I wouldn’t be surprised if the problem with Text To Speech rights lies more with audio book retailers than with publishers, but that’s one for the conspiracy theorists to argue about.

Visual Studio Help Integration Wizard and Continuous Integration


As .Net developers we are constantly using MSDN and the Visual Studio help system. One of the most powerful features of Visual Studio is the ability to hit the F1 key on a class, property or method and have it whisk you away to the correct MSDN entry for the item of interest. When using third-party libraries, it is great when they can provide you with the same ability to search API documentation from within Visual Studio. This is all possible but, as I recently discovered, not necessarily trivial, and there is a really big gotcha if you want to add it to your Continuous Integration process.

To start with, you need to ensure that all developers of your API are meticulous about putting XML summary comments on ALL public-facing entities. This is a good practice whether or not you intend to distribute the API.

The second step is to use a tool like Sandcastle to compile the XML comments into a usable format. Sandcastle has a number of different outputs, including the classic .chm (compiled HTML help) output that is easy to read and can be distributed with your project; however, if Visual Studio integration is what you are really after, then you need the Html Help 2 output (*.hxs). As there are a lot of different settings for Sandcastle, it is good to use the Sandcastle Help File Builder, which provides a nice settings-based interface as well as a command line utility to aid you in building your help files.

The third step is to create an installation package (.msi or .msm) to distribute your documentation to the millions of developers who will be using your API to write the next killer application. There are a few ways of achieving this; to do it from scratch you could follow these instructions, however, in the Visual Studio 2008 SDK there is a project type called the Help Integration Wizard that attempts to automate this process for you.
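For reference, the XML summary comments that Sandcastle consumes look like this (the method itself is a made-up example); remember to enable the XML documentation file output in the project's build properties so the compiler writes the comments out alongside the assembly:

```
/// <summary>
/// Calculates the total price of the order, including tax.
/// </summary>
/// <param name="taxRate">The tax rate as a fraction, e.g. 0.1 for 10%.</param>
/// <returns>The order total, including tax.</returns>
public decimal CalculateTotal(decimal taxRate)
{
    return this.Subtotal * (1 + taxRate); // Subtotal is a made-up property
}
```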
This very simple project template works great right up to the point where you try to integrate it into your Continuous Integration process. The first problem is that the solution created contains two vdproj files, which means that you can’t build them with MSBuild, and are forced to pollute your build server with Visual Studio and compile them using the devenv command line.

Even after you have taken this hit, the pain doesn’t stop there, because of two files that are included surreptitiously in the project path but not added to the solution: the executable FixRegTables.exe, and a merge module called MSHelp2_RegTables__RTL_---_---.msm, both of which get added to the CollectionFiles folder of the merge module project that is created. If you fail to check these in to your source control system, your CI build will break with an error about a post-build step. However, if you do add them to your source control repository, then chances are that when you perform a “Get Latest” operation these files will be marked as “Read Only”, as this is standard practice for many source control systems to ensure that source files are not modified during a build. In this case the build will work as expected and produce an msi file; however, the resultant msi will not install anything when run. The issue is that the post-build step mentioned above needs to modify the merge module.

The solution is either to check the two files in to your source control system and make the merge module writable as part of a pre-build step, or to copy the two files from their location in the Visual Studio SDK into the CollectionFiles folder as a pre-build step. Tech[...]
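A pre-build event along these lines clears the read-only flag before the post-build step runs. This is a sketch, not tested against the wizard's output: the paths are illustrative, and if your setup project's pre-build event doesn't expand the $(ProjectDir) macro you may need an absolute path instead:

```
rem Pre-build step for the merge module project: make the checked-in
rem files writable so the post-build step can modify the merge module.
attrib -r "$(ProjectDir)CollectionFiles\MSHelp2_RegTables__RTL_---_---.msm"
attrib -r "$(ProjectDir)CollectionFiles\FixRegTables.exe"
```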

Bragging time


On Friday, I successfully passed Microsoft Certification exam 70-561. This now makes me a MCTS: .Net 3.5, ADO .Net Applications.

I am going to do a few more exams over the coming year, but I’ve yet to decide what the next one will be.

One small gripe about my laptop


I absolutely love my new laptop, especially since I put an SSD in it. I have one small gripe, however, and that is with the keyboard, or rather with the lack of one key that I have become very reliant on: the “Right Mouse Key”.

I am often too lazy to plug in my mouse, or often (like now) not in a position to do so (reclining in a comfortable chair with my feet up and the laptop on my lap), and I have become quite used to the keyboard shortcuts for all the applications I use, which is why I have always loved the right mouse key. Alas, on the Dell Precision M4400 laptop, no such key exists.

Having said that, just in doing my research for this rant, I have in fact discovered that Shift+F10 does the same thing. Problem solved! Still, it would be nice to have a single button for this.

Laptops, SSD’s, vhd’s and Hyper-V


I recently splurged and purchased one of these for my new laptop. I was finding it hard to justify the extra expense for the 128GB over a 64GB SSD, knowing full well that if I was careful I could run an OS and all the applications I normally use in 64GB, but I thought… what the hell. This weekend I discovered the “boot from vhd” feature of Windows 7, and I can say I am so glad I spent the extra money.

I have successfully created a dual boot for Windows Server 2008 R2 running on a 25GB vhd stored on my SSD. It is a bit of work, mucking around with DISKPART and BCDEDIT, but once it’s set up, it just works a treat, and thanks to my SSD, the speed is awesome. My main reason to do this was to play with Hyper-V as I have been charged with setting up a build server and test server at the client I am currently engaged to.
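For anyone wanting to try it, the rough shape of the DISKPART and BCDEDIT steps is below. This is a sketch from memory: the file names and sizes are illustrative, {guid} stands for the identifier that the bcdedit /copy command prints, and everything must be run from an elevated command prompt:

```
rem Inside DISKPART: create and attach a 25GB expandable VHD.
diskpart
  create vdisk file=C:\vhds\w2k8r2.vhd maximum=25600 type=expandable
  select vdisk file=C:\vhds\w2k8r2.vhd
  attach vdisk
  exit

rem After installing Windows Server 2008 R2 into the VHD,
rem clone the current boot entry and point it at the VHD.
bcdedit /copy {current} /d "Windows Server 2008 R2 (VHD)"
bcdedit /set {guid} device vhd=[C:]\vhds\w2k8r2.vhd
bcdedit /set {guid} osdevice vhd=[C:]\vhds\w2k8r2.vhd
bcdedit /set {guid} detecthal on
```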

New Laptop


Having recently gone back to work after a year in Buenos Aires, I have been tooled up. One of those tools, as you may already have seen from my previous post, is a new laptop, and just by way of showing off, I thought I’d post the specs up.


  • Mobile Workstation AW-M4400n - Dell Precision M4400n-series Base
  • Intel(R) Core(TM)2 Duo Processor T9800 (2.93GHz, 6M L2 Cache, 1066MHz FSB)
  • 8GB (2x4GB) 800MHz DDR2 SDRAM
  • 15.4" UltraSharp(TM) WUXGA (1920x1200) RGBLED Display with TrueLife(TM)
  • NVIDIA Quadro FX 770M, 512MB dedicated memory
  • 8X DVD+/-RW Drive
  • 2nd Hard Drive 120GB SATA (5400RPM) Hard Drive (External)
  • Integrated 0.3 Megapixel VGA Webcam with Single Digital Mic for WUXGA LCD
  • Wireless Network Card - Intel WiFi Link 5300 (802.11 a/g/n 3x3) MiniCard

I have also made the decision (as many other Readify people have) to install Windows 7 RC on it. So far there have been no serious problems, but there are a few things I have found and I will blog about them as I find and (hopefully) fix them.

The first one was Google Chrome not working, though there are many posts about this. Even after the fix, Chrome does seem to be a little unstable from time to time, with the end result that I spend half my time in IE8, which I have no real problem with.

The second problem was the bluetooth issue I talked about in my previous post.

I do get a little instability in Visual Studio 2008 from time to time. I think it has something to do with the Team Member Presence information in VSTS Team Explorer, so I turned off this feature and it now seems to be a lot more stable.
