Inside Microsoft CRM

An insider's view of Microsoft CRM. Rants, raves, and some advanced (read that: unsupported) tricks.

Updated: 2013-02-26T12:04:01.892-08:00


Moving on


Well, sort of moving on. I'm moving my blog over to MSDN to be a bit closer to the rest of the development team. I'm not abandoning the Inside MS-CRM "column" though; I just thought it would be a little easier if I were closer to "home". So, join me over at MSDN and watch the RSS feed for more entries.

I'm moving


I've made the decision to move over to the official Microsoft world of bloggers. For now I'm going to assume that I can pretty much say whatever I want over there, but if things get weird I'm coming back.

Over time I'll try to figure out how to move my past posts over so they're not lost in the shuffle, but for now it'll look like I'm starting over. The added benefit is that I'll be closer to Jason and Aaron, two other members of the MS-CRM team.

I know I have a backlog of articles to write, but don't worry, I am writing them.

Where's Mj


I'm still around, but I've been kinda busy. It's funny - I set out to write this because I had a ton of stuff on the top of my head that I wanted to unload on the MS-CRM community. The problem is that there's just so much stuff and so little time that I've found it harder to prioritize the list. I'm getting closer though, and will start writing more over the next few weeks while we're ramping to alpha for V2 (no, I can't give you a date, and no, I won't give you a feature list, sorry). I've received a handful of great suggestions both publicly and privately, and I've been scouring the newsgroups for more of those nasty requests that keep popping up.

As I mentioned previously, I've finally got an instance of 1.2 installed and running, but I had to break the rules to get it to work. The biggest rule I broke, and I still haven't decided if I want to get around it or not, is that I installed to a named SQL instance. While the core product just doesn't care, the replication story for the Outlook client has all kinds of issues with named instances. But, the good thing is that I've got an off-campus installation running and I can start breaking things in ways that should help the community get on with configuring and supporting the 1.2 product while we busily work on the next release.

So, hang in there, I'll be writing more soon. Keep the ideas coming in too because they really help me prioritize what information is most useful. There's a ton of stuff to write about and I'm looking to you folks to guide that work. Look for a more complete serialization example, a forms XML example that adds a custom tab, and a more detailed callout sample.

More fun with XSD


Torsten Schuster asked about the XML serializer, but unfortunately the question was pretty buried. I saw it, but I'm not sure anyone else did. It sounds to me like another case of old CRM XSD and the VS XSD tool, but I'm not really sure.

From what I gather, the platform and APIs are behaving correctly. For example, on a sales order retrieve, a column set along the lines of the sketch below can be passed in.
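Something like this, for illustration (I'm assuming the usual columnset/column shape; the attribute list here is made up, not the one from the original question):

<columnset>
   <column>salesorderid</column>
   <column>customerid</column>
   <column>name</column>
</columnset>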


Which should (and in this case, does) result in XML shaped roughly like this.
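Again an illustrative sketch only - the IDs and names are invented, and the exact attributes the platform puts on customerid may differ:

<salesorder>
   <salesorderid>{11111111-2222-3333-4444-555555555555}</salesorderid>
   <customerid name="A. Datum Corporation" type="1">{66666666-7777-8888-9999-000000000000}</customerid>
   <name>Sample order</name>
</salesorder>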


The question is about the "customerid" element and why the serializer isn't pulling it into the hydrated object. I can only guess, but it sounds like the XSD doesn't have a correct definition for "salesorder" nor does it have a definition for "customerid".

Ideally, the sales order XSD should have a "customerid" element of type "tns:customerType", where customerType is defined along the lines of the sketch below.
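My guess at the shape of that type (an assumption about the schema, not the shipped definition):

<xs:complexType name="customerType">
   <xs:simpleContent>
      <xs:extension base="xs:string">
         <xs:attribute name="name" type="xs:string" />
         <xs:attribute name="type" type="xs:string" />
      </xs:extension>
   </xs:simpleContent>
</xs:complexType>

and, on the salesorder type, an element declaration such as <xs:element name="customerid" type="tns:customerType" minOccurs="0" />.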

I can't guarantee that the VS XSD tool will cope well with the XSD that I talked about earlier. Although making the XSD tool deal with it is fairly trivial, I still prefer the other code generator.

Like I said a while back, I can't support any of this stuff, but I can lend guidance on occasion. Hopefully this information is enough to get things moving again. If not, well, maybe someone else can chime in.

Entity size is always a problem


Running into the customization ceiling when adding attributes? I feel your pain. I really do. The team down the hall from me is working quite hard on making some of this pain go away, and they've done a bunch of work in the query processor layer in the platform. There's a reason the limitation exists in V1.x and there's a reason it wasn't "fixed" earlier. The original COLA (contact, opportunity, lead, and account) definitions were quite small and left a ton of room for extensions.

One of the things we looked at was to allow customizations of the type where one could store everything one wanted in an XML document in the database. There were way too many problems with that approach (although there are some great upsides too). Simply put, any search and display is going to be a problem with the property bag approach. There really aren't any great mechanisms for telling the platform about the semantics of an attribute stored that way. It knows all about the entities, attributes, and relationships, but that's where its knowledge stops. The application, and most other display patterns (except reporting), would work fairly well because it's all just XML and XSLT, and writing another XPATH expression to reach into the bag and pull out the rabbit is a well understood problem.

The second approach was to allow physical attributes to be added to entities in the form of database columns. There are some problems with this as well, particularly around name collisions and upgrade scenarios, but none of those couldn't be overcome with some decent engineering work.

A little history lesson may help. This is an excerpt from a whitepaper I wrote when we first started looking at how to create an extensible product. This is really ancient history at this point so there's really no reason I can think of to not share it.

Several proposals are on the table to allow application developers to customize the storage characteristics (the tables and fields).

1) The approach taken for ClearLead 1.x (bCentral Customer Manager). Each interesting object (business, user, prospect, event) has a set of developer-defined named properties. This approach was an attempt at solving the problems inherent in approach 3. However, it quickly caused two severe problems. First, performance was horrible: each query required multiple outer joins to gather all the detail-level information. Secondly, the data stored rapidly exploded. Where it would have been possible to store a single inbound email event record in a single row using an ntext blob, the CL model took the approach that all large data be broken into 2000-character chunks and stored individually. This required that any time this information was read or written, the data had to be reconstructed.

2) Expose a single, opaque, application-specific blob field on every interesting object. This has some appeal since it leaves all the interpretation to the application and puts the burden on the developer to manage and render this information as necessary. The drawback here is that the blob isn't quickly searchable and can't be indexed (full-text indexing is an option, but isn't quite mature enough to be relied upon). Another drawback with this format is that simple queries against the data are difficult to construct and very expensive to run. For example, how would a query be constructed which found all contacts who brought a cat into a vet clinic in May and were serviced by Dr. Smothers?
If this data is 'stuffed' into a single XML blob, the format isn't controllable by the platform, so a generic query like this wouldn't be possible to construct. A secondary problem with this approach is the opaqueness of the data: neither the application nor the platform has any knowledge of the document structure. The platform would need to be written with the document structure in mind to make any reasonable use of the data, in which case the extensibility mechanism is defeated. The application on the other hand may have knowledge of the stru[...]

Third time's a charm


Well, I'm back for a while. I just finished up my M2 features for our next release. Time to hand it off to the Test team for a while and let them beat on it. But, more on the V2 stuff later, after alpha, or beta, or something where we've made the feature set public. I don't want to say anything for fear of getting everyone excited about something and then having it cut at the last minute.

But, anyway, what's the title got to do with that? I've been trying to install the 1.2 MSDN release. Normally that wouldn't be much of a problem, I've probably installed it at least a few dozen times (and now that I'm focused on dev work for V2 I'm installing V2 sometimes three times a day - and yes, the new setup is incredibly cool and easier to use). This time was different. I was trying to install on a VPC running Windows 2003 Server. Nothing particularly special about that VPC: it's a domain member and the database server will be running somewhere outside of a VPC. What was different was that I had installed the Whidbey community preview, which dropped the 2.0 .NET Framework, and I had forgotten to tell Windows that it was going to be just fine if ASP.NET was enabled. But that's just the start of the problems...

The first time I ran the install I went through the usual stuff. Fill out the page, click next, read the complaint about Windows feature X not being installed, install feature X, click next, and try again. It didn't take long to work through that since I knew what to expect after the first complaint. What I didn't expect was the flat-out failure while creating the databases. Since I wrote most of the V1 database install code I knew where it was when it failed, but I sure couldn't figure out why from the message. I mean, come on, the step it was working through was pretty simple. Turns out that when setup is rebuilding the foreign key constraints in the database it simply walks all of the ones it finds in the current database...

Well, I turned on verbose setup logging (/l*v logfile if you haven't seen it yet) and tried again. Sure enough, it was failing at step '03 fknfr'. I stared at it for a while, cleaned everything up, fired up SQL Profiler, and tried my third install of the night. Profiler showed that, yes, it was failing there. So I grabbed the script off the installation CD and tried to run it manually (but not before turning off the actual exec calls and replacing them with print) so I could see why things were failing.

Well, that was a weird experience. I ran the script and the table names that were scrolling by were only vaguely related to the ones I would normally find in CRM. Oh, don't get me wrong, the expected tables were all there, but so were a bunch of other tables that I wasn't really expecting. To say the least I was getting nervous, because the table names shown were actually tables from another, very unrelated database on the same server, and I started thinking that somehow setup had changed to an incorrect database context and trashed something.

To make this a little less painful let's just say I spent quite a while staring at things until I remembered that my 'model' database on that server was actually set up to create a completely different type of table structure for each new database. Since CRM creates three databases during setup, it got seeded with three complete databases before it even started installing its own stuff. The moral of the story is simple - make sure you have an empty 'model' database or very bad things might happen.
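If you want to check for this up front, a quick sanity check against the model database (SQL 2000-era syntax) will show what every newly created database is going to inherit:

USE model
GO
-- list the user tables that will be stamped into each new database
SELECT name FROM sysobjects WHERE xtype = 'U' ORDER BY name
GO

If anything shows up there that you don't recognize, the three databases CRM creates during setup will inherit it too.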
By the way, even after switching database servers, I was still unsuccessful installing the product. Sure, things seemed to be working well, but then I ran into some weirdness with importing the default data (aka roles, privileges, and other goodies). Even after what seemed like a complete install things didn't work, which I'm attributing to the Whidbey bits cluttering up that server. I'm also assuming that switching from a 'nor[...]

Code generators


I opened my inbox this morning to find a link to a brand-new version of the XSD-to-code generator that I've been talking about. Just wanted to share the good news. There was a strong feeling that taking a dependency on a code generator from an outside team would be a bad thing because a) it wasn't built "here" and b) it was a code generator. Well, I guess none of that matters now...

If you're doing any work with the platform XSD (not the as-shipped ones, we know they're not friendly), then grab this tool and give it a try. I use it every day in unit test development - no more building XML strings for this developer.

Programming models


Some days I really wonder what's right for the ISV / VAR community when it comes to programming models. I know what I'd like to see if I were writing code against MS-CRM. The problem is that I don't know what you want. We have this great infrastructure at MSFT that allows us to stay in touch with the community. It works well when the contacts are asked the right questions and when the PM knows what to listen for in an answer. It breaks down when the questions are too generic, too specific, or flat out wrong. That's why I don't really know what you guys want in a programming model.

In MS-CRM there's only one way to go after the platform and that's using the SOAP proxy. Sure, it's possible to use the WSDL and generate client-side code directly, but it's a pain because of name collisions and all the other goo that happens when you add a web reference. There are also the well-known problems with the interfaces as they stand today, which I've commented on before. We've spent a lot of time talking about the ideal interfaces, programming model, and interaction model for the next releases. The problem is that we're talking about it but your voices aren't being represented anywhere. So, I'd like to see if 1) anyone's interested, 2) anyone's listening, and 3) anyone wants to comment.

Option 1 - we leave things the way they are. I won't go into this because everyone understands how things work. You get to keep IntelliSense on the API signatures but everything else is a string.

Option 2 - clean up the interfaces by choosing a better naming convention, getting rid of the XML strings, and removing some of the 'extra' parameters. You get IntelliSense on the API signatures here too, but no more strings (and no IntelliSense on the entities themselves). The one saving grace here is that the "objects" are really extensible. Because they're nothing more than a property bag it's very easy to add new attributes without breaking things (this is one of the reasons we have XML strings today).

Option 3 - get really radical and move to a type-safe, SOA-based model with only 6 methods but with a pile of messages. You get full IntelliSense here, including type-safe entities and interfaces. The price you pay is dealing with the extensibility problem (i.e. what happens to your entities when another customer modifies the entity schema - this might be a recompilation or a property bag over the extra attributes).

Option 2 might look a lot like the interfaces do today, with the exception that they'll take an "object" instead of XML. This object simply wraps a property bag of strings. You'd get run-time type checking like you do today but at the expense of either having a client-side copy of the metadata or round-tripping to the server. The code might be something like this (after adding a reference to the metadata, entity, and service assemblies):

AccountWebService accountService = new AccountWebService();
Account account = new Account();      // or BusinessEntity
account["ownerid"] = ownerid;
account["owneridtype"] = 8;
account["name"] = 64;                 // wrong type on purpose - you'd only find out at run time
account["customertypecode"] = "21";   // is this a string?
string accountId = accountService.Create(account);

Principal p = new Principal();
p.Type = sptUser;
p.Id = someOtherUserId;
uint rights = 1;                      // READ?
accountService.GrantAccess(accountId, p, rights);

As you can see there's an account "object" on the client, but it's really just a name. The real class is BusinessEntity, which is a thin wrapper around a NameValueCollection.
The third option would look structurally the same, but there would be a few key differences. This is after adding a web reference to the CRMWebService (which would get the interfaces and schemas):

CRMWebService crmService = new CRMWebService();

// get a new account instance with default values
Account account = crmService.CreateNew(typeof(Account));
account.name = "account name";   // (assumed property)
ac[...]

Dealing with broken XSD


Ok, I've been dealing with this broken XSD issue for so long now that I just can't stand it. The platform has ways to give you back XSD, but it's not really XSD, and it's not really friendly about doing so. Given the current v1.x APIs, which aren't exactly friendly to deal with, and their insistence on using strings of XML for everything, I put this script together to start hiding the complexities.

Now, one thing you'll notice is that this script doesn't generate XSD in a flavor that the VS .NET XSD.EXE tool likes (it's not the greatest either, but it does usually work). The cool thing is that there is a great tool available from the SD West Conference. This tool will happily eat the resulting XSD that this script generates and will create some very powerful client-side classes that should make everyone's life much easier.

You'll still need to serialize the platform XML into a class, but that's pretty simple. The following C# snippet should do just fine (you'll have to grab some of the XML serialization references, IO, and Text, but that's left as an exercise for the reader). The third code snip is the SQL script you've been waiting for.

I recommend always generating the whole pile; you'll end up with about 3,000 lines of XSD and about 30,000 lines of C# when you're done, but it's worth it. Remember though, as shown, this script will not generate XSD that the XSD.EXE tool likes. Don't ask me why, it just doesn't (and that goes for generating typed DataSets too). There are ways to make it work, but would you want to when XSDObjectGen does all the right things in terms of creating real platform XML and dealing well with minimal update sets? Oh yeah, the classes work well with the ColumnSetXml parameter on Retrieve methods. I've even created XSD that represents collections of platform entities and serialized those properly.

As usual, nothing you've read here is remotely supported, and if you call support they'll likely have no idea what you're talking about. They might even ask why you're running SQL scripts in the metadata. I won't support this either. So don't ask. I'm making this available because it's something that needed to happen and never did. The bug with the bad XSD was found the day we RTM'd v1.0 and we never looked back (who'd ever want the schemas anyway, aren't XML strings self-describing...)
using System;
using System.IO;
using System.Text;
using System.Xml.Serialization;

public static string ToString(object o)
{
    StringBuilder sb = new StringBuilder();
    XmlSerializer serializer = new XmlSerializer(o.GetType());
    XmlSerializerNamespaces ns = new XmlSerializerNamespaces();
    ns.Add("", "");
    TextWriter writer = new StringWriter(sb);
    serializer.Serialize(writer, o, ns);
    return sb.ToString();
}

public static object ToObject(Type t, string s)
{
    XmlSerializer serializer = new XmlSerializer(t);
    TextReader reader = new StringReader(s);
    object o = null;
    try
    {
        o = serializer.Deserialize(reader);
    }
    catch (Exception e)
    {
        throw new Exception("Failed to convert XML to object", e);
    }
    return o;
}

void fooBar()
{
    // create an account object in "client-space" and set some properties
    Microsoft.Crm.WebServices.account a1 = new Microsoft.Crm.WebServices.account();
    a1.accountcategorycode.Value = 6;
    a1.accountcategorycode.name = "Corporate";    // display name (assumed member name)
    a1.creditlimit.Value = 123456.0F;
    a1.creditlimit.value = "$123,456.00";
    a1.creditonhold.Value = true;
    a1.creditonhold.name = "Yes";                 // display name (assumed member name)
    a1.createdon.type = 8;
    a1.createdon.Value = userAuth.UserId;         // userAuth obtained elsewhere
    a1.name = "This is account 1";                // (assumed member name)
    a1.description = "This is my sample account....";

    // turn it into XML
    string xml1 = ToString(a1);

    Microsoft.Crm.Platform.Proxy.CRMAccount accountService = new Microsoft.Crm.Platform.Proxy.CRMAccount();

    // stuff it into the platform and get the new one back
    string xml2 = accountService.[...]
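And going the other direction, a guess at where ToObject fits (the Retrieve call and its parameter order here are assumptions, not verified against the 1.x proxy):

// retrieve a limited column set and hydrate the generated class from the returned XML
string columns = "<columnset><column>accountid</column><column>name</column></columnset>";
string resultXml = accountService.Retrieve(userAuth, accountId, columns);
Microsoft.Crm.WebServices.account a2 =
    (Microsoft.Crm.WebServices.account)ToObject(typeof(Microsoft.Crm.WebServices.account), resultXml);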

Setting default values


So, I was just thinking about the picklist / boolean problem with default values. Let's just say I ran across something that looked suspicious to me in the newsgroup. For example, let's say I've added a pair of custom picklists, one to opportunity and one to lead, and let's say that I've set default values for those picklists. Now, one might suspect that running a "convert lead to opportunity" process would automatically set the default value on the picklist. If you launch the lead in the UI, sure enough, you find the picklist with the correct value *displayed*. That's the catch. The application is being really smart here and has noticed that the platform didn't give it a value in the document, so it painted the picklist the best way it could - with the default value, even though nothing was ever actually stored.

So, how can we get around this problem? There's the supported way of creating a mapping between the attributes on opportunity and lead. That works if you're trying to carry that data forward to the opportunity. The unsupported way is to tell the metadata what the default value should be, regenerate the views and triggers for that entity, and then start saving or updating things.

So, how does one do that, you might ask. Well, if you know the value of the picklist item that you want to set as the default, update the Attribute table's DefaultValue column with that value. The only thing is that you need to get the metadata format correct. For numbers the format is "(number)" and for strings the value is "('string')". Look at some of the other default values that are set in the metadata for an example. By the way, this might work for booleans too since they're really just numbers in the database. Create a default of 0 for false, or 1 for true, and see what happens.
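A rough, unsupported sketch of what that update might look like (the attribute name is made up, and I'm assuming the Attribute table can be filtered by its Name column):

UPDATE Attribute
SET DefaultValue = '(2)'          -- picklist item 2 becomes the default
WHERE Name = 'cfpcategorycode'    -- hypothetical custom picklist attribute

If more than one entity has an attribute with that name, narrow the WHERE clause down to the entity you actually mean before running anything.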

You do need to regenerate the views and triggers. The schema manager will do this for you when you add a new attribute. Unfortunately, it won't do it any other time. But, if you run p_genSpecificViewAndTriggers with the base table name (say, LeadBase, for leads) as the parameter, you'll get the SQL necessary to recreate the bits. You need to run the sproc, copy the output, paste it into a new SQL Query Analyzer window, and run it in the orgname_MSCRM database.
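For example (assuming the procedure takes just the base table name, as described above):

-- run this in the metabase
EXEC p_genSpecificViewAndTriggers 'LeadBase'
-- then copy the generated script into a new Query Analyzer window
-- and execute it against the orgname_MSCRM database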

This is off the top of my head. It's 100% unsupported. And it may not work. I suggest creating a backup before mucking about in the databases because if you break it following something I say here, the support team will not help you.

Feeling guilty


I've been meaning to add the next bit of somewhat useful information here, but I've just been slammed recently. The team has just entered M2 and we're cranking right along. I'm working on the customization bits that we missed in V1. But then again, I've been working on them for a long time now, we're just officially getting them into the build. I've also been working with the PM team and some of the partners on programming models. Eventually we'll come to an agreement about what we should build, how we should build it, and when we'll unleash it. Until then, I'll keep building prototypes and talking to partners to see what we can do better. And no, this isn't a slam on the PM team, they're really involved in the product this time around.

Oh yeah, the reason I'm slammed has nothing to do with MSFT. I'm wrapping up Spring quarter at Seattle U and working on my final project for the Information Assurance track. That's why I really haven't had much time. I have been reading the newsgroup and looking for opportunities to set misunderstandings straight. But, I try to keep a low profile over there... that's for the support team to keep under control. I just use it for ideas about the product and things to think about.

Removing unwanted attributes


First, let me start off by saying that this is a completely unsupported process. There are a number of things that can go wrong and probably will go wrong, and if you break it there's nothing the support team can do to help you. Second, if you're using the Outlook client and have replication set up, this process gets a lot more complicated, and I'm not going to be able to help out there either.

So, let's say you've gone into the schema manager and you've added an attribute, and let's assume you've done something you didn't want to do (like create it with the wrong type, or the wrong size, or the wrong name...). You probably want to remove that attribute. I can imagine there are other scenarios where removing an attribute might be a really helpful thing to do. Well, removing attributes is a fairly straightforward thing to do.

First, you need to make sure you're not using the attribute anywhere. If you are, and you remove it from the metadata, then the product will appear to break when whatever tool is using the attribute tries to access it. Attributes can be consumed in a number of places - the web application uses them, obviously; reporting will use them sometimes; integration might use them; and the Outlook client might too. Make sure you've removed the attribute from the forms first, that way you can test out what you've done. Use the tools supplied by the product; they're really good and can do the right thing.

Here comes the part where things might break. You need to edit the Attribute table in the metadata database. Find the attribute you want to remove (make sure it's the right one and bound to the entity you're expecting it to be bound to). Keep the attribute id handy because you're going to need it. You need to remove all the references to the attribute first - look in AttributeMap for references to the attribute on both sides and remove those references (preferably do this from the mapping tool in schema manager - it knows the right incantations, and if this step breaks you're still in supported territory, I think). I'm also assuming that you're not doing anything with an attribute used in any relationship because that will flat out break stuff that you can't fix. This means that AttributeOf, SortAttribute, TypeAttribute, and AggregateOf will all be NULL for the attribute you're removing. It also means that you won't find the attribute used in KeyAttributes anywhere.

Once you've got all that cleaned up, you can delete the row from Attribute. There are some limitations to what you can delete, but since you're only deleting attributes you added (right?), you shouldn't have a problem. However, it might be educational to talk about some attribute characteristics that describe attributes that just can't be removed. Any attribute with RequiredForGrid, IsPKAttribute, or RequiresPlatformAuthorization == 1 has to stay. Any attribute with IsNullable == 0 has to stay as well - they are required by the product. By the way, if you find your attribute in KeyAttributes or referenced by name in GeneratedJoins (or JoinAttributes), then don't do this - that attribute is being used in a relationship and removing it will change the shape of the entity graph - and you're going to break the software.

Once you've deleted the attribute reference from the metadata, you're almost done. You just need to ALTER TABLE on the *Base table to drop the column (if you don't know how to do that, you shouldn't be trying any of this stuff...) and then regenerate the views and triggers.
If you look at the stored procedures in the metabase you'll find one called p_genSpecificViewAndTrigger. It takes the *Base name of the table that holds the entity as a parameter and generates a script that you can use to recreate the views and triggers. It won't recreat[...]
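Stitching those last couple of steps together, a minimal sketch might look like this (completely unsupported, and the column and table names are hypothetical):

-- drop the now-unreferenced column from the entity's base table
ALTER TABLE AccountBase DROP COLUMN CFNMyBadAttribute   -- hypothetical custom column

-- then, from the metabase, generate the script that rebuilds the views and triggers
EXEC p_genSpecificViewAndTrigger 'AccountBase'
-- copy the output and run it against the orgname_MSCRM database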

A little disclaimer


In case it's not clear to everyone, the opinions here are really my own and don't reflect my employer's opinions. Sometimes you might read something that sounds like it's a done deal; it probably isn't. The other thing to keep in mind, especially for folks who are investing in the product, is that we are going out of our way to maintain backwards compatibility. There are a few exceptions that we have publicly talked about (the activity platform interfaces) that I can say may not be available in 2.0. But, like I said, we are going out of our way to make sure your 1.x investment continues to work.

That said, I will continue to make what sound like pretty bold statements, but that's just because, even after being involved with this product since the beginning, I'm still pretty excited and passionate about getting it right. I think we did a damned good job for a V1. Sure, the product has some warts, but if we waited to build something perfect we would never ship. Yeah, I trash the product, but that's because it's not quite how I think it should be - and you should probably read that as "we're doing everything we can to make the best CRM product ever".

Why MS-CRM SOAP interfaces suck


I woke up this morning to an email from Matt Powell asking me to clarify why the platform interfaces suck. Well, "suck" wasn't really the word he used, it was more of an echo from Simon Fell about the interfaces because they "blow chunks". Nice way to start the morning, but not unusual. Great timing though, because I had a 9am meeting about those very interfaces and what we're going to do about them. I can't go into exact detail because this stuff is still something we're just batting around, but we are looking at a very different flavor. The idea is to trim the endpoints down dramatically, make them very chunky (in a different way though, Simon), bake in some WS-* stuff, and generally make the programming model more programmer friendly.

Well, back to the topic at hand. The V1.x platform interfaces are kinda ugly to work with. No, let's be honest, they do suck to work with. So, as a client I need to build up an XML string without any help from the tools, make sure it's really XML and not just a string, slam it at a platform interface that's got a weird name, returns more string stuff, and takes other not-useful parameters (CUserAuth is something that just doesn't make any sense any more; it's a left-over from a very early design that we couldn't take the time to rip out without breaking everything... trust me, I wish it had gone away)... anyway, the APIs suck.

But, if we look at the tools that were available at the time we started designing and implementing V1, it becomes pretty clear that we really didn't have a choice. The .NET Framework was in early betas without a clear ship date, the CLR hadn't been used for anything in the business applications space yet, and we were looking at hosted scenarios in bCentral (read: this was supposed to be fast and at the time the framework didn't look that way). So we did some looking around and found a project called Manta over in the ATL world. Well, Manta went on to become ATL Server and sure looked like it would work for us.

[Oh yeah, and it was really simple for our soon-to-be-built application framework to build up those magic XML strings right in the browser, POST them to the application server, and have them forwarded untouched to the platform. Talk about few transforms... there were none. The XML was POSTed directly as an XML document, the application server reached into the document and pulled out what it needed, if anything, and the document was handed to the platform as a string. Pretty much the same thing on the way from the platform to the application and on to the browser - the platform handed the string to the application layer, which ran an XSLT over it, and HTML went out to the browser.]

The only problem was that we had a hard requirement that our entity definitions had to support change once deployed. That is, our customers typically customize CRM solutions to meet their business needs. That's what CRM is all about. Well, ATL Server only supported C++/COM types. So, the problem became how do we provide a SOAP interface ('cause damn it, we were going to support SOAP and web services even if it killed us) that wasn't too hard to program against, was cross-platform friendly, supported WSDL, and supported extensible types. We chose the XML-as-strings path for V1.x. To make some of the pain go away we provided a client-side proxy so developers wouldn't need to spend a lot of cycles trying to make the WSDL-based client-side classes behave properly (can you say namespace collisions?).
For V1.2 we spent a lot of time looking at ways to wrap the platform API set in something friendlier but the closest we came (because of the extensibility issues) was to use an "object" called a property bag. Well, that sucks more than XML s[...]

Some topics under consideration


I just started thinking... what will I discuss around here and in what order? I'm not sure about the order yet, but I've got some starter topics that might be of interest.

Adding a custom page to a detail view using an IFRAME
How to remove your added attributes
What's that SecurityDescriptor column all about
Advanced callouts, why aren't there any "pre"-callouts, transactions, locking, and how to make it all work with MSMQ
Scripting workflow without the workflow editor
Using the control framework (in a completely unsupported way)
What's the deal with SRF files
How to write to the platform using web services
What's this XSDObjectGen thing and how would it help
How to convince MS-CRM to give real XSD schemas for entities
How to build a live entity browser like the static one in MSDN
How dates and times work
What are these XML attributes all about and how do we make complex elements behave well
What is and why can't I use T-SQL
How can I get aggregate SQL functions using

I'm sure there are a lot more topics that are interesting to people "in the trenches". This is just a list off the top of my head from taking a quick scan through the newsgroup.

Just getting started


So this is something I've wanted to do for a long time. As one of the founding team members behind Microsoft CRM I've had an interesting viewpoint into the development process and the product itself. I've been reading the CRM public newsgroup since its start and have really noticed a number of places where we could have done better (and, by the way, we do know about a lot of the issues and are trying to address them in upcoming releases, but we had to get this thing out the door).

Sure, the product has some issues, but overall it's pretty cool. I'm not going to play it up all the time; there are just too many holes that need to be patched. But for a V1 product with a small team building on untried technology I think we did a pretty damned good job.

I suppose I should give a little background on myself so people might think I know what I'm talking about. I was over in Carpoint when the DP team spun off to create a productized version of DP 5.0 called MerchantPoint (we liked to point at things in those days). The only relevance to CRM, I suppose, was that I was responsible for taking the MP v1 code and making it work for one of our larger automotive partners. What a pain in the ass that was.

When the company finally decided to get serious about CRM, the MP team was completely spun out of MSN and sent over to bCentral. I went with them as a software architect and vowed to dump the entire code base and start from scratch. ...and that's what we did. MS-CRM was designed from the ground up to follow a web service model and to be hosted on bCentral (along with any other ASP willing to pay for the honor of running the platform).

Depending on my mood I might go into the culture clash that happened when what was then known as ClearLead and bCentral Customer Manager merged with a small team from Office-land called SBCM...

Well, as many people know, MS-CRM doesn't really work all that well in a hosted environment. There are a number of reasons, but the biggest one was that during the MBS acquisition days (GPSI and Navision) MS-CRM got a reset and we decided it would be an on-premise solution. So, here we are building a product that was originally designed for data centers - hence the heavy MSFT stack and prerequisites - but tweaking bits here and there to run on a smaller scale. Oh yeah, and by smaller we actually meant to make both the web-based client and the Outlook-based client use not only the same code but the same binaries. It's some of those tweaks that are so painful to get around today (security descriptors in the database... seemed like a brilliant idea at the time, and if we could have made it really performant and transparent it would have been great, but...)

Well, anyway, welcome to what might be an interesting combination of history, lessons learned, and an insider's look at Microsoft CRM (and yes, I'll try to address some of those issues like removing attributes from shipped entities).