
Andru's WebLog

Comments about technology and software architecture, software development and other nerd stuff! :)


Watch my Microsoft BOT Framework Introduction Webinar

Mon, 08 Aug 2016 16:35:23 GMT

Hi everyone!  In case you missed the live streaming of this interesting webinar, you can still watch it on our Lagash YouTube channel: just click here to watch the Spanish version, MS Bot Framework Intro (Spanish), or the English version here, MS Bot Framework Intro (English).

Feel free to comment on and like the video on our channel, ask for more, or propose topics to discuss. Of course, you can also reach me here or on Twitter: @andresvettori



New "Assessing Data Store Capabilities for Polyglot Solutions" guide published by Microsoft Patterns & Practices

Tue, 01 Dec 2015 17:47:00 GMT

Hi everyone!

The folks from Microsoft Patterns & Practices have finally published the very good "Assessing Data Store Capabilities for Polyglot Solutions" guidance document after A LOT of hard work.

You can find it on GitHub, where you can review it, leave comments, and contribute if you can.

This guide is quite comprehensive, and I'm happy to have been part of the review process along with a lot of other very smart people.



Slides for my "Docker Introduction" talk on Nov 3rd in MUG (Microsoft Users Group) in Buenos Aires

Wed, 04 Nov 2015 18:13:58 GMT

Hello everybody, I'm posting a link to the slides I used in my introductory talk about Docker and its integration with the Microsoft platform and tools.

Feel free to grab them from here:

Enjoy!  As always, comments are very welcome.


Playing with the new Azure B2B Collaboration Service Preview

Wed, 14 Oct 2015 18:12:02 GMT

Well, I've had the luck of dedicating some time to helping with a little spike here at Microsoft P&P that involves using the new Azure B2B Collaboration service.

While it's a brand-new service still in preview, there is some information and documentation available about how it works and how to use it.

For example, you can start giving these links a look:

Well, as part of the spike we followed the steps described above and found a couple of issues, so I'm documenting them now in the hope that they help somebody else.

So, I'll summarize our findings below:

  1. The steps involve creating a new Azure Directory domain that will host the application shared with some partners.
    1. Tip: this new domain HAS to be linked to an Azure subscription, because you need an admin user from THAT new domain to log in to the Azure management portal in order to create the invitations for partner users.
  2. You need to create a CSV file as described above to invite partner users to use your shared application. You have two options here: the invited person already has an Azure identity, or doesn't and will create a new one when accepting the invitation.
    1. Tip: if partner user accounts already exist, then the CSV "Email" column has to match the "User Name" attribute of the existing account, not its email.
    2. Tip: if the invited partner user account is not email-enabled, then you need to include an undocumented "CCEmail" column in the CSV with a working email address. This is especially useful for demo and testing scenarios (like our own spike here).
    3. Tip: in the CSV you need to include a value for the "InviteAppId" column, but that column refers to the "Application Principal ID", not the "AppID".
  3. The Azure B2B Collaboration service is focused on sharing internal corporate resources with external partners without federating or duplicating user accounts.
    1. Tip: the service in its current state doesn't support well the multi-tenant application scenario, where you share an application with multiple external partners (tenants) and want their users to be properly identified as coming from an external tenant, so the application can use that information to control data access and apply different restrictions. Currently all users appear to come from the same Azure tenant, and that's why we couldn't reliably detect where a user is coming from.
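Putting the CSV tips above together, a minimal invitation file might look like the sketch below. All names, addresses and GUIDs are hypothetical placeholders; "InviteAppId" must hold the shared app's Application Principal ID, and "CCEmail" is the undocumented column for accounts that aren't email-enabled.

```csv
Email,DisplayName,InviteAppId,CCEmail
jdoe@partner-tenant.example,John Doe,11111111-2222-3333-4444-555555555555,demo-inbox@contoso.example
```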

That's all I have for now. I'll keep an eye on this, since the new B2B service does address a very valid scenario and does simplify the process of integrating identity management with an external partner.


PS: Thanks Rohit for your time!

.NET Conf.UY 2015 Cross Platform .NET Slides

Wed, 14 Oct 2015 16:05:00 GMT

Hi everybody!  I'm just posting a link to view and download the PPT slides I used for my presentation at the awesome .NET Conf UY 2015 conference, about cross-platform development in .NET with the new CoreCLR.

The slides are here:

Enjoy, and use them in any way you see fit. As always, comments are most welcome.



Project SharpHSQL active again!

Thu, 30 Apr 2015 21:36:00 GMT

Hi everyone!

Just a quick note to let you know that I'm bringing the SharpHSQL project back to life: my old port (from 2005...) of the Java version of Hypersonic DB v1.4 to fully native C#.

For now it's hosted on CodePlex, so feel free to come by, ask questions, or contribute anything you can. I'm in the early stages of the new release, and the focus will be getting a stable library that's usable in production.

I'll keep you posted!

Thanks and Enjoy!


Published slides for the Azure Search & ML session at MUG

Fri, 21 Nov 2014 12:42:43 GMT

Yesterday I participated (together with Rodolfo Finochietti & Mariano Sánchez) in the free MUG (Microsoft Users Group Argentina) event named "News about Azure and ASP.NET", where we talked about the new ASP.NET v6 stack and Visual Studio 2015, Azure Search, Azure ML, and Visual Studio Online.

I've just uploaded the PPT slides for the Azure talk to my personal OneDrive, so you can download them directly from here.

I will be publishing a post about those topics soon, stay tuned!


Asynchronous Streaming in ASP.NET WebApi

Wed, 12 Dec 2012 15:13:00 GMT

Hi everyone! If you use the cool MVC4 WebApi you might find yourself in a common situation where you need to return a rather large amount of data (most probably from a database) and you want to accomplish two things:

  1. Use streaming, so the client fetches the data as needed, which directly correlates to more fetching on the server side (from our database, for example) without consuming large amounts of memory.
  2. Leverage the new MVC4 WebApi and the .NET 4.5 async/await asynchronous execution model to free ASP.NET thread-pool threads (if possible).

Now, #1 and #2 are not directly related to each other, and we could implement our code fulfilling one, the other, or both. The main point of #1 is that we want our method to immediately return a stream to the caller, with that client-side stream backed by a server-side stream that gets written (and its related database fetch executed) only when needed. For this we need some form of "state machine" that keeps running on the server and "knows" what to fetch next into the output stream when the client asks for more content.

This technique is generally called a "continuation", and it's nothing new in .NET; in fact, using an IEnumerable<> interface and the "yield return" keyword does exactly that. So our first impulse might be to write our WebApi method more or less like this:

    public IEnumerable Get([FromUri] int accountId)
    {
        // Execute the command and get a reader
        using (var reader = GetMetadataListReader(accountId))
        {
            // Yield each mapped record to the caller as it is requested
            while (reader.Read())
            {
                yield return MapRecord(reader);
            }
        }
    }

While the above method works, unfortunately it doesn't accomplish our objective of returning immediately to the caller. That's because the MVC WebApi infrastructure doesn't recognize our intentions: when it finds an IEnumerable return value, it enumerates it completely before returning its values to the client. To prove the point, I can code a test method that calls this method, for example:

    [TestMethod]
    public void StreamedDownload()
    {
        var baseUrl = @"http://localhost:57771/api/metadata/1";
        var client = new HttpClient();
        var sw = Stopwatch.StartNew();
        var stream = client.GetStreamAsync(baseUrl).Result;
        [...]
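For comparison, one way to get genuine streaming out of Web API is PushStreamContent, which hands your code the output stream and writes (and therefore fetches) only as the client reads. This is only a sketch under the post's assumptions: GetMetadataListReader and MapRecord are the post's own helpers (signatures assumed), the Task-based constructor overload is from Web API 2, and Json.NET is used for serialization.

```csharp
using System.Data.Common;
using System.IO;
using System.Net.Http;
using System.Web.Http;
using Newtonsoft.Json;

public class MetadataController : ApiController
{
    public HttpResponseMessage Get([FromUri] int accountId)
    {
        var response = Request.CreateResponse();

        // The lambda runs while the response body is being written, so rows
        // are pulled from the database only as the client consumes them.
        response.Content = new PushStreamContent(async (stream, content, context) =>
        {
            using (var writer = new StreamWriter(stream))
            using (DbDataReader reader = GetMetadataListReader(accountId))
            {
                while (await reader.ReadAsync()) // .NET 4.5 asynchronous read
                {
                    await writer.WriteLineAsync(JsonConvert.SerializeObject(MapRecord(reader)));
                }
            } // disposing the stream ends the response
        }, "application/json");

        return response;
    }

    // Hypothetical helpers from the post (not shown there either).
    private DbDataReader GetMetadataListReader(int accountId) { throw new System.NotImplementedException(); }
    private object MapRecord(DbDataReader reader) { throw new System.NotImplementedException(); }
}
```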

MongoDB usage best practices

Wed, 24 Oct 2012 14:14:00 GMT

The project I'm working on uses MongoDB for some stuff, so I'm creating some documents to help developers speed up the learning curve, avoid mistakes, and write clean & reliable code. This is my first version, so I'm pretty sure I will be adding more stuff to it; stay tuned!

C# official driver notes

The 10gen official MongoDB driver should always be referenced in projects by using NuGet. Do not manually download and reference assemblies in any project.

Reference links

  • C# driver quickstart guide:
  • C# Language Center:
  • MongoDB Server Documentation:
  • MongoDB Server Downloads:
  • MongoDB client drivers download:
  • MongoDB Community content:

Tutorials

  • Tutorial MongoDB con ASP.NET MVC - Ejemplo Práctico (Spanish):
  • MongoDB and C#:
  • C# driver LINQ tutorial:
  • C# driver reference:

Safe mode connection

The C# driver supports two connection modes: safe and unsafe. Safe mode only applies to methods that modify data in a database (inserts, deletes and updates). While the current driver defaults to unsafe mode (safeMode == false), it's recommended to always enable safe mode, and to force unsafe mode only for specific things we know aren't critical. When safe mode is enabled, the driver's internal code calls the MongoDB "getLastError" function to ensure the last operation completed before returning control to the caller. For more information on using safe mode and its implications for performance and data reliability, see the links above.

If safe mode is not enabled, all data-modification calls to the database are executed asynchronously (fire & forget), without waiting for the result of the operation. This mode can be useful for creating or updating non-critical data like performance counters, usage logging and so on. It's important to know that not using safe mode implies that data loss can occur without any notification to the caller. As with any wait operation, enabling safe mode also implies dealing with timeouts.

The safe mode configuration can be specified at different levels:

  • Connection string: mongodb://hostname/?safe=true
  • Database: when obtaining a database instance using the server.GetDatabase(name, safeMode) method
  • Collection: when obtaining a collection instance using the database.GetCollection(name, safeMode) method
  • Operation: for example, when executing the collection.Insert(document, safeMode) method

Exception handling

When using safe mode, the driver ensures that an exception will be thrown if something goes wrong (as said above, when not using safe mode no exception is thrown, no matter the outcome of the operation). As explained in the mongodb-user group thread (!topic/mongodb-user/mS6jIq5FUiM), there is no need to check any value returned from a driver method when inserting data. With updates the situation is similar to any other relational database: if an update command doesn't affect any records, the call will succeed anyway (no exception thrown) and you manually have [...]
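The configuration levels above can be sketched with the legacy 10gen 1.x driver API the post describes; the server address, database and collection names here are made up for illustration.

```csharp
using MongoDB.Bson;
using MongoDB.Driver;

class SafeModeSketch
{
    static void Main()
    {
        // Safe mode via the connection string (applies by default).
        var server = MongoServer.Create("mongodb://localhost/?safe=true");

        // ...or per database / collection when obtaining the instance.
        var db = server.GetDatabase("Diagnostics", SafeMode.True);
        var counters = db.GetCollection<BsonDocument>("counters");

        // Per operation: this Insert waits for getLastError and throws on failure.
        counters.Insert(new BsonDocument { { "event", "login" } }, SafeMode.True);

        // Fire & forget for non-critical data: no confirmation, and silent
        // data loss is possible, exactly as the post warns.
        counters.Insert(new BsonDocument { { "perfSample", 42 } }, SafeMode.False);
    }
}
```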

Argentina Microsoft Users Group Software Architecture Day

Mon, 31 Oct 2011 12:38:00 GMT

Hi guys, last Friday I was invited to speak at a very nice software architecture event organized by the Argentina Microsoft Users Group.

The event (in Spanish) was named "Jornada de Arquitectura" (something like "Software Architecture Day") and included very notable local speakers in the software architecture field: Martín Salías, Diego Gonzalez, Hernán Wilkinson, Diego Fontdevila, Roberto Schatz and, of course, me. If you want more information about the event, click here.

The slides and videos for the other presenters should be uploaded to the MUG site shortly, and to the Code & Beyond site as well, so check for them in a couple of days if you're interested.

Just for the impatient, I'm uploading my slides here, on SkyDrive, for your convenience.

Best regards,

Andrés G Vettori, VMBC, CTO

Dynamic (runtime) Generation of a WCF Service contract

Thu, 01 Sep 2011 17:53:00 GMT

In my last post I talked about how to register a WCF instance dynamically (at runtime) without needing an existing .svc file or an entry in the configuration file ().

In this post we go deeper into the "dynamic" domain: we would like not only to host our service dynamically, but also to create the actual service contracts and service operations (and the metadata to support WSDL generation) dynamically, at runtime.

You might be asking yourself: why on earth would anyone want to create a WCF service (with contracts and methods) at runtime?

Well, there is a big chance you'll never need to do something like this (I have lived without it so far, and our current projects are running in production very well), but we are building the next version of our development platform (the elusive project codename "E2" I talked about in my previous post), and in that context we need a way to generate the API that our "Business Models, Modules and Processes" define. Those artifacts are created entirely by our Business Analyst users and don't require ANY CODING at all, so why would we settle for less for the exposed APIs of those things?

In the sample code you can download below there are not one but two different approaches to this:

  • Using Reflection.Emit to create, at runtime, a service class that implements the desired operations (methods).
  • Using the "ContractDescription" class to create and inject, at runtime, the service metadata that describes the desired operations (methods).

Both approaches solve the service metadata creation: the first creates a class and lets the WCF runtime generate the service description metadata in the normal way, and the other builds and injects this service description metadata from scratch. Both methods work equally well, but I tend to prefer the second, because generating and loading a dynamic type in the running process (or AppDomain) has the drawback that it's difficult to unload those generated Types and replace them with new versions when the business metadata changes (remember that Business Analysts are doing that, and they are free to change anything they need).

Having said that, I know it would be possible to unload a generated Type by means of custom AppDomains, but it's more work and there are security and performance issues associated with it. A metadata-only approach eliminates these problems altogether.
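As an illustration of the metadata-only approach, here is a minimal sketch of describing a one-operation contract by hand. All names ("IDynamicService", "Echo") are invented for illustration, and the actual invoker wiring lives in the downloadable sample, not here.

```csharp
using System;
using System.ServiceModel.Channels;
using System.ServiceModel.Description;

static class DynamicContractFactory
{
    // Builds a contract with a single Echo(string) : string operation
    // entirely from metadata, with no CLR interface behind it.
    public static ContractDescription BuildEchoContract()
    {
        const string ns = "http://tempuri.org/";
        var contract = new ContractDescription("IDynamicService", ns);
        var operation = new OperationDescription("Echo", contract);

        // Request message: wrapped body with a single string parameter.
        var input = new MessageDescription(ns + "IDynamicService/Echo", MessageDirection.Input);
        input.Body.WrapperName = "Echo";
        input.Body.WrapperNamespace = ns;
        input.Body.Parts.Add(new MessagePartDescription("text", ns) { Type = typeof(string), Index = 0 });
        operation.Messages.Add(input);

        // Response message: wrapped body with a string return value.
        var output = new MessageDescription(ns + "IDynamicService/EchoResponse", MessageDirection.Output);
        output.Body.WrapperName = "EchoResponse";
        output.Body.WrapperNamespace = ns;
        output.Body.ReturnValue = new MessagePartDescription("EchoResult", ns) { Type = typeof(string) };
        operation.Messages.Add(output);

        contract.Operations.Add(operation);

        // The sample's IOperationBehavior (which installs the custom
        // IOperationInvoker) would be added to operation.Behaviors here.
        return contract;
    }
}
```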

So far we have resolved the metadata generation part, but the actual execution of those pesky "Business Processes" hasn't been mentioned anywhere, and that's because it is the easy part!  :)

WCF already provides all the extensibility you need to intercept the execution of service operations and do whatever you need, and that's exactly what I do in both examples. I created an "OperationInvoker" class implementing some WCF interfaces (IOperationInvoker, and IOperationBehavior for the injection part), so feel free to explore them; they are pretty simple. When constructing the metadata we inject it into the "OperationDescription" behaviors collection; check the "CreateOperationDescription" method in the "Service5.cs" project file.

Well, enjoy the code and let me know if you have any comments.

Source code download from here:

Best regards,

Andrés G Vettori, CTO, VMBC

Registering a file-less WCF Service Dynamically

Mon, 29 Aug 2011 10:50:00 GMT

I know...  I know...  it has been a while since my last post, but I never forgot about it; I was just having a blast doing some fun things. Well, some of that stuff wasn't really THAT fun, but that's part of being a CTO: there is always room for improvement in some processes and structures in the company, and so... after a lot of hard work, we are now in a much better position to actually start doing some fun stuff.

And the fun part has begun, in the form of a couple of new and REALLY interesting projects, where I participate not only as CTO but also fulfill the role of Chief Architect, overseeing general architecture from the development and infrastructure standpoints and, more importantly, doing a LOT of research and proofs of concept to hand over to our Architect Team.

So, for this new project (let's say, codename "E2") I'm in charge of researching some stuff, and here I will present the results of one of those topics: dynamic (runtime) registration of WCF services.

Before jumping into today's topic, let me talk a little about the other topics I'm researching, just to paint a broader picture and set the stage for future posts. Our E2 project is a little ambitious in some aspects, and its main motto is "Configurable Dynamic", so we are exploring ways to make this happen. The list of things to explore first:

  • Dynamic registration of WCF services (today's topic)
  • Dynamic Data Access (ORM without Entities)
  • Dynamic Business Rules (or logic, if you want)
  • Dynamic Business Processes (workflows are a possibility here, but not the only one)
  • Performance of all of the above (mainly IIS, ASP.NET, WCF) and how to optimize the platform
  • Performance techniques to use: caching, profiling, monitoring, etc.

Of course there are other topics (like security, scalability, fault tolerance, etc.) but we will advance over those in time. For today's topic, let's explain a little what I'm talking about.
For WCF services there are two ways to let the runtime environment know that we have a service class we want to expose to the world:

  • The plain old .svc file approach: we need a file with the .svc extension, and this file contains the Type information needed to activate the service.
  • The new CBA (configuration-based activation) approach: this is new in .NET 4, and makes it possible to create a WCF service WITHOUT the .svc file, using only a section in the web.config file ().

While this second option is very interesting, we cannot do it at runtime, and so the idea for this post was born. We can register HttpModules at runtime (the MVC3 project does that to register the HttpModule that handles Razor views), so we can try to do something similar. That functionality (dynamically registering an HttpModule) can be found in the "Microsoft.Web.Infrastructure" assembly, in the "RegisterModule" method of the "DynamicModuleUtility" class.

We are going to take a similar approach to this helper method and use reflection to inject our service configuration somewhere so the runtime thinks it has a CBA (file-less) service and can activate it like any normal service. This is achieved in the "DynamicServiceHelper" class in the attached sample project, using the "RegisterService" method.

This is the source code for the "RegisterService" class (some lines were removed for brevity):

    namespace System.ServiceModel
    {
        public static class DynamicServiceHelper
        {
            static object _syncRoot = new object();
            static FieldInfo hostingManagerField;
            [...]
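For reference, the static CBA registration that the helper mimics at runtime looks roughly like this in web.config under .NET 4; the service type and relative address here are hypothetical:

```xml
<system.serviceModel>
  <serviceHostingEnvironment>
    <serviceActivations>
      <!-- File-less activation: no physical .svc file is needed. -->
      <add relativeAddress="Dynamic.svc"
           service="MyApp.Services.DynamicService, MyApp" />
    </serviceActivations>
  </serviceHostingEnvironment>
</system.serviceModel>
```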

TFS 2010 and the missing Area & Iterations (stale data) Issue

Tue, 30 Mar 2010 15:32:00 GMT

The symptom is this: you change an area or iteration in a TFS project, but the change is not reflected (or updated) in VS or any other TFS client.

Well, it happens that TFS now has some clever caching mechanisms that need to be updated when you make a change like this, and those changes are propagated by scheduled jobs TFS continuously runs on the application tier.

So, if you get this behavior, please check (and possibly restart) the "Visual Studio Team Foundation Background Job Agent" service. In my case, this service was logging a very odd "Object Reference Not Set" error into the Windows event log, and a simple restart fixed it.
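If you want to script the restart (from an elevated prompt on the application tier), a simple sketch using the service's display name:

```shell
net stop "Visual Studio Team Foundation Background Job Agent"
net start "Visual Studio Team Foundation Background Job Agent"
```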

Hope this is fixed by RTM...   (we are using the RC version).

And by the way, if the job agent is broken, some other things stop working too, like email notifications.

Best regards,

Andrés G Vettori, CTO, VMBC


Very good post about solving Sharepoint problems with TFS 2010

Tue, 01 Dec 2009 15:15:00 GMT

The Visual Studio WIT tools team has published a very nice post covering the most common issues with SharePoint and TFS 2010.

Check it out at


Andres G Vettori, VMBC, CTO 



IDFX -> Zermatt -> Geneva -> WIF RTM

Tue, 01 Dec 2009 12:01:00 GMT

At the PDC 09 Microsoft announced the Release to Manufacturing (RTM) of the Windows Identity Foundation, previously known as "Geneva", "Zermatt" before that, and "IDFX" before that.

Grab the latest bytes from

Best regards,

Andres G Vettori, VMBC, CTO

VM Prep Tool for Visual Studio Team Lab Management 2010

Mon, 09 Nov 2009 02:26:00 GMT

Microsoft has released the first version of the Virtual Machine Preparation Tool for Visual Studio Team Lab Management 2010. What a mouthful! Try saying that three times in a row...

Well, the tool's function is to prepare existing VMs to be compatible with VS 2010 Lab Management requirements, and believe me, there are a few. Configuring an existing VM by hand is a tedious and VERY error-prone task, and so this tool was born.

Download it here; this version is prepared to work with VSTS 2010 Beta 2 and Windows Server 2008 x86 SP2 VMs. They will be adding more options as soon as they finish testing different versions (and flavors) of Windows. Perhaps R2 is in the pipeline?

Best regards,

Andres G Vettori, VMBC, CTO


Build Silverlight 2.0 or 3.0 projects with an x64 TFS 2010 Build Agent

Thu, 05 Nov 2009 15:00:00 GMT

I was trying to build our biggest solution after the (test) migration and found that Silverlight projects wouldn't compile. The first error we received was "The Silverlight 2 SDK is not installed". I found a post on the Silverlight forum about this and managed to fix the error, but then a second error appeared:

"The "ValidateXaml" task failed unexpectedly ... System.IO.FileLoadException: Could not load file or assembly 'PresentationCore, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The given assembly name or codebase was invalid. (Exception from HRESULT: 0x80131047)"

According to the people in the Silverlight forum, this is a Silverlight BUG: it won't compile in an x64 environment, so the general workaround is to use an x86 build agent.

Fortunately, I found a MUCH BETTER workaround for TFS 2010, which only involves changing the build definition configuration and setting "MSBuild Platform" to "X86" (it was "Auto" before).

After this, my Silverlight projects compile again, even in an x64 Build agent. Nice!

Just for reference, the Silverlight post is here:

Andres G Vettori, VMBC, CTO

Enabled new ALM features after migration of TFS2008 to TFS2010

Thu, 05 Nov 2009 14:48:00 GMT

The test migration of our TFS 2008 went extremely well: the import command processed nearly 7 GB of data from our TFS 2008 in less than half an hour, all running on my notebook (not some kickass server...).

Today I executed the script published by Hakan Eskici to upgrade the process template from v4.2 to v5, by simply running a BAT file. The process is very simple and fast, and after that we now have the updated process template with all the TFS 2010 goodies enabled.

Check out the process and download the script from

After the upgrade I created a new task and found the new "Original Estimate" field, and the now separate "Remaining Work" and "Completed Work" fields, and I almost cried with emotion...  :)

Bugs now also have the new "Repro Steps" and "Test Cases" tabs, which are extremely welcome. There are more changes than these (for example, the new Test Case work item), so give it a try!

Best regards,

Andres G Vettori, VMBC, CTO

Step by step TFS 2010 configuration

Mon, 02 Nov 2009 13:27:00 GMT

Here's a good post I found about how to set up TFS 2010 (and everything else).

TFS 2010 is easier to set up than previous versions, but I cannot say the same for SharePoint...


Andres G Vettori, VMBC, CTO