Subscribe: CodeClimber
http://feeds.feedburner.com/codeclimber

CodeClimber



Climbing the Cliffs of C#



 



Free eBook on ASP.NET Core is available for download

Thu, 20 Apr 2017 12:55:41 Z

The eBook I wrote together with Ugo Lattanzi is finally available as a free download from the Free Ebooks section of the Syncfusion Tech Portal.

This project had a long development, as we had to wait until all the latest bits of the ASP.NET Core tooling (which only became final with Visual Studio 2017) were available, so we could deliver a complete and up-to-date picture of how to develop an ASP.NET Core application. But now it's finally out, and you can get it for free.

The table of contents of the book is:

  • About ASP.NET Core Succinctly
  • Introduction to ASP.NET Core
  • What are .NET Core and ASP.NET Core?
  • A Brief History of the Microsoft Web Stack
  • Getting started with .NET Core
  • ASP.NET Core Basics
  • Beyond the Basics: Application Frameworks
  • How to Deploy ASP.NET Core Apps
  • Tools Used to Develop ASP.NET Core Apps
  • A Look at the Future

The book also briefly covers OWIN, which was the "inspiration" for ASP.NET Core; if you want to know more about it, we also wrote OWIN Succinctly, likewise published by Syncfusion.

A big "thank you" goes to both the editors at Syncfusion and to technical editor James McCaffrey whose comments helped improve the quality of the book.




How to get the URL of custom Umbraco backoffice API controllers with the GetApiUrl method

Wed, 21 Dec 2016 13:35:47 Z

Today my article on extending Umbraco was published on the Umbraco Advent Calendar site, and in a comment Ismail asked whether one could have used the umbRequestHelper.getApiUrl method to build the URL of the backoffice API instead of hardcoding it into the $http request. Since it would take quite a long explanation, which wouldn't fit in a comment, I'm posting it here instead.

The umbRequestHelper.getApiUrl method

This method simplifies the creation of URLs to the backoffice Web API: it builds them based on a JavaScript dictionary, Umbraco.Sys.ServerVariables["umbracoUrls"], that is populated on the server, directly using the MVC routing engine to get the correct URL for each controller. Once a custom controller has been registered in the umbracoUrls dictionary, using the method is pretty easy: you just call it specifying the apiName (the key with which you registered your controller in the dictionary) and the actionName within the controller (plus a query parameter if needed). For example, if you want the URL of the GetLanguages action of the NuTranslation API you just do:

    umbRequestHelper.getApiUrl("nuTranslation", "GetLanguages")

Setting the key in the umbracoUrls dictionary

But for this to work you first need to set the URL in the umbracoUrls dictionary, and here comes the tricky bit: it has to be done inside a handler for the ServerVariablesParser.Parsing event, where you ask the UrlHelper for the URL of a given controller. Here is a simplified version of the code:

    public class NuTranslationApplication : ApplicationEventHandler
    {
        protected override void ApplicationStarted(UmbracoApplicationBase umbracoApplication, ApplicationContext applicationContext)
        {
            ServerVariablesParser.Parsing += ServerVariablesParser_Parsing;
        }

        private void ServerVariablesParser_Parsing(object sender, Dictionary<string, object> serverVars)
        {
            if (HttpContext.Current == null) throw new InvalidOperationException("HttpContext is null");

            var urlHelper = new UrlHelper(new RequestContext(new HttpContextWrapper(HttpContext.Current), new RouteData()));
            var umbracoUrls = (Dictionary<string, object>)serverVars["umbracoUrls"];
            umbracoUrls["nuTranslation"] = urlHelper.GetUmbracoApiServiceBaseUrl<NuTranslationApiController>("Index");
        }
    }

To make sure it all goes well, in a production-level solution you should probably add some more error checking. You can see the complete version in the NuTranslation GitHub repo. If you now call getApiUrl("nuTranslation", "GetLanguages") it will return /umbraco/BackOffice/Api/NuTranslation/GetLanguages. And if you later rename the controller you won't have to update the frontend as well, and you won't have to remember the exact route to API controllers. [...]
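As a usage sketch, here is how a backoffice AngularJS resource could consume the registration above (the service itself is hypothetical; getApiUrl and resourcePromise are the real umbRequestHelper helpers):

    function nuTranslationResource($http, umbRequestHelper) {
        return {
            getLanguages: function () {
                // build the URL from the registered key instead of hardcoding it
                var url = umbRequestHelper.getApiUrl("nuTranslation", "GetLanguages");
                return umbRequestHelper.resourcePromise(
                    $http.get(url),
                    "Failed retrieving the list of languages");
            }
        };
    }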



Ignoring files on git, but just for a while with assume-unchanged

Mon, 19 Dec 2016 14:09:12 Z

One of the most common issues we all face when versioning code in source repositories is needing to make changes to files that are tracked, but without committing them. A typical example is the web.config file in Umbraco projects: the version in source control should be the default "uninstalled" version, while each developer will have their own, with their local configuration.

Some of these scenarios could probably be solved with other approaches, like writing .user config files, but a solution that covers all of them is git's update-index command.

You can type git update-index --assume-unchanged <file> to temporarily exclude the file from any tracking, pretending it never changed. And when you actually need to commit some real change to the file, you can use the opposite command, git update-index --no-assume-unchanged <file>.
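Using the Umbraco web.config example, the round trip looks like this (the file name is just an example):

    # pretend web.config never changes, so local tweaks stay out of commits
    $ git update-index --assume-unchanged web.config

    # later, when a real configuration change must be committed
    $ git update-index --no-assume-unchanged web.config

    # files currently marked this way show up with a lowercase
    # status letter in the ls-files output
    $ git ls-files -v | grep "^[a-z]"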

The only annoying problem is that there is no dedicated command to list which files are currently marked (the ls-files trick above is the closest I've found). But it nicely solves the problem of making configuration changes that we don't want to commit, in a way that is transparent for the solution being developed.

Have you ever used this approach before? What do you think? Let me know in the comments.




10 years of blogging... and blog migrated to Articulate on Umbraco

Sun, 11 Dec 2016 16:37:43 Z

Exactly 10 years ago, on the 11th of December 2006, I wrote my first post on this blog, just a few days before quitting my first ever job, at Esperia, and a few weeks before moving to New Zealand.

A lot of time has passed since. I moved to New Zealand, went back to Italy, quit my job at Calcium, started working at Avanade, got to know Umbraco, moved here to Belgium and started working for the EU, took up triathlon (and went through many other changes as well).

Blog migration

Now, 10 years after my first blog post, and as I announced a few days ago, I have also changed blogging platform, moving from the obsolete Subtext to Articulate, which is more modern, easier to update and closer to my current interests.

As you can see, the skin is very simple, and some of the features I had in my previous skin are not here yet, like categories, tags, monthly archives, code highlighting and a lightbox for images: I wanted to make sure all content was moved correctly first.

One new feature is that comments are now managed off-site, by Disqus.

How I migrated from Subtext to Articulate

The migration was not a difficult one, but a few customizations were needed to keep the old URL structure, and to put the right URL rewriting in place for the routes I couldn't keep the same.

I'll write some posts in the next weeks explaining everything I've done.

Contributions to Articulate

Apart from some customizations specific to my scenario, I also made a lot of changes to Articulate in order to correctly import my BlogML content. I'll be sending a few PRs to be incorporated into the main code-base.




Blog Migration to Articulate

Thu, 08 Dec 2016 09:52:54 Z

I’m in the process of migrating my blog from Subtext to Articulate.

There are various reasons for this. First, not being an MVP anymore I don't have a free Azure subscription, so I'd have to migrate anyway. Then, Subtext is very old software, based on pre-NuGet libraries, and I cannot even get it to run on my own computer, so I cannot update it with the things I'd like in a modern blogging engine. And finally, being more and more into the Umbraco community, I wanted to host my blog on Umbraco as well.

I have already set up a blog on Umbraco and Articulate: TriathlonGeek, which is about my triathlon experience and how I'm preparing for my first full-distance Ironman in 2018. It's pretty clean and easy, but I'm planning to add some more features to it.

Now, these are the issues I have to face when migrating my blog to Articulate:

  1. First I need to export all posts from Subtext and import them into Articulate: this is easy, as Subtext exports to BlogML and Articulate imports BlogML.
  2. Then I need to copy over all the pictures and make sure references are still ok. There are a few options I'm considering for this.
  3. Articulate doesn't natively support comments, but it can be configured to use either Disqus or Facebook. I'll go with Disqus, as it allows me to import my old comments from BlogML.
  4. Finally I need to keep the same URLs. Now I have /archive/yyyy/mm/dd/title.aspx, while Articulate by default has /archive/title: no date and no .aspx at the end. With a mix of 301 redirects and a custom UrlProvider to generate custom URLs (see the sketch after this list) I should have this solved.
  5. I also need to import all tags manually, as the BlogML export from Subtext doesn't provide them.
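As a rough idea of the UrlProvider part, here is a minimal sketch, assuming Umbraco 7's IUrlProvider interface; the class name, document type alias and date logic are all hypothetical:

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using Umbraco.Web;
    using Umbraco.Web.Routing;

    public class LegacyDatedUrlProvider : IUrlProvider
    {
        public string GetUrl(UmbracoContext umbracoContext, int id, Uri current, UrlProviderMode mode)
        {
            var content = umbracoContext.ContentCache.GetById(id);
            if (content == null || content.DocumentTypeAlias != "ArticulatePost")
                return null; // let the default providers handle everything else

            // rebuild the legacy /archive/yyyy/mm/dd/title.aspx structure
            return string.Format("/archive/{0:yyyy/MM/dd}/{1}.aspx", content.CreateDate, content.UrlName);
        }

        public IEnumerable<string> GetOtherUrls(UmbracoContext umbracoContext, int id, Uri current)
        {
            return Enumerable.Empty<string>();
        }
    }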

So while recovering from surgery I'm doing all these changes, and I hope to have them done by the time my MSDN subscription expires, at the beginning of next week. If not, my blog will be down for a couple of days before starting its new life with Articulate on Umbraco Cloud.




All about Connect() 2016

Thu, 17 Nov 2016 12:55:54 Z

Lots of announcements at the Connect(); 2016 event around the new wave of tools from Microsoft.

TL;DR: Microsoft released Visual Studio for Mac, .NET Core 1.1 (with ASP.NET Core and EF Core) and Visual Studio 2017 RC… and many other Azure-related and mobile-related features.

Here is a selection of posts from the various official blogs about what happened recently:

If you want to go a bit deeper into some of the features, here is some more in-depth coverage:

The conference is not over yet, so make sure you tune in at https://connectevent.microsoft.com/ for more in-depth videos about the new line of Visual Studio tools and the future of .NET.




Extending the dotnet core cli: introducing dotnet-prop

Wed, 16 Nov 2016 21:56:20 Z

Last week, on the last day of the MVP Summit, I participated in the .NET Hackathon. There were many different topics, ranging from ASP.NET Core to Minecraft and HoloLens, but I was interested in better understanding the .NET Core tooling, so I joined the table of the dotnet-cli team.

With the move from the easily editable project.json file to the old .csproj format, it became more difficult to manually edit the project file, so I built a command for the dotnet CLI to make it easier to edit some properties without messing with the XML-based format of the MSBuild file.

The usage of the tool is very simple. First you have to add the tool to the project you want to edit, by manually editing the csproj file. I know this sounds a bit weird, but unfortunately there is no way to add global tools to the dotnet-cli, only tools specific to a project. I sent a pull request to the dotnet-cli project, so this step won't be needed if they include this tool as a built-in command. To add the reference to the tool you have to add a CLI tool reference to the csproj:

    <ItemGroup>
      <DotNetCliToolReference Include="dotnet-prop" Version="0.1.0-*" />
    </ItemGroup>

Then restore the dependencies, and finally just call the dotnet prop verb from the root of the project. The available commands are:

  • add, to add a property (or update one that already exists)
  • del, to remove a property
  • list, to list all the properties of the csproj file

For the moment I only implemented the possibility to add/update the following three properties:

  • version number (-v|--version), which sets the VersionPrefix property in the MSBuild file
  • target framework (-f|--framework), which sets the TargetFramework property
  • supported runtimes (-r|--runtime), which sets the RuntimeIdentifiers; since one project can be built for many runtimes, this option can be specified more than once

You can find the code on my github repository: https://github.com/simonech/dotnet-prop/ The tool is available on NuGet and will be automatically downloaded when restoring the packages of your project.

This was just a test and it's a work in progress, so please let me know what you think of it, and which other properties should be added. I'm also undecided on the right commands and options to use. Which of the following patterns do you think is better:

  • command for the operation (add, delete or list) and option for the property (like now): dotnet prop add --version
  • command for the property and option for the operation: dotnet prop version --add|--del

Tags: dotnet-cli, netcore [...]
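A quick usage sketch, assuming the current operation-as-command syntax (the values are just examples):

    # restore the packages so the per-project tool gets downloaded
    $ dotnet restore

    # set the version prefix of the project
    $ dotnet prop add --version 1.0.0

    # set the target framework and two runtime identifiers
    $ dotnet prop add --framework netcoreapp1.0 --runtime win10-x64 --runtime ubuntu.16.04-x64

    # list all the properties currently in the csproj
    $ dotnet prop list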



The time I was top link on Hacker News and Azure killed my site: lessons learned

Thu, 03 Nov 2016 14:23:03 Z

On Monday morning (USA time), the 31st of October 2016, my post about "Surface Book vs MacBook Pro" reached the first position on Hacker News. It triggered a good amount of discussion, and luckily the tech-savvy people commenting managed to go on using either the Google cache or reading directly from the RSS feed (which is distributed via FeedBurner, and thus static).

What happened? How did I notice? Someone wrote to me on Facebook that my site was down. Following the page linked from the error, Azure SQL Database resource limits, it looked like the problem was that I had reached the maximum number of workers allowed by my SQL Basic tier (30). The DTUs were maxed out as well, according to the resource chart in the Azure portal. All this was due to the huge number of requests arriving from Hacker News, more than 40k per hour… with more than half of the responses returning HTTP errors. Normally I get fewer than 500 per hour, so a huge peak.

I'm running on Azure thanks to my MSDN subscription, which entitles me to 130 euro per month to spend on Azure. So I had a Standard tier for the web app and a Basic tier for the SQL database, and at the time I set them up this almost maxed out my budget. I realized lately that I was using less than 40€ per month, so I upgraded the SQL database to the Standard tier, which has 60 workers and 10 DTUs. Unfortunately it takes 3-6 hours to scale up a database (even one as small as mine, less than half a gigabyte), so I guess I lost the traction of the first visits from Hacker News. I would have expected Azure to allow sporadic peaks (I'm usually at half a DTU all year long, and for the 3 hours I'm above the limit they shut me down?).

What did I learn? The short-term solution, thanks to suggestions from friends: I set up my site behind CloudFlare, a kind of CDN which caches my website and serves it to the world. By default it caches only static files (js, css, images) and not dynamic pages, so I'll have to work out how to configure it to cache those too, while keeping the backend excluded from the cache and updating pages after comments have been added.

The longer-term solution: move away from Subtext and from Azure. This was already something I was planning to do, since my MSDN license was a benefit of being an MVP. And not making any money out of my blog (apart from the sporadic 5-10 USD a month I get from ads), I can't afford to spend 50€ a month on Azure fees. I'm evaluating some options for the new home of my blog: one is making everything static and hosting it on GitHub Pages; another is migrating to Articulate, a blog engine built on Umbraco, and running it on Umbraco Cloud.

Tags: azure, hackernews, cloudflare, traffic peak [...]



MacBookPro vs Surface Book [based on specs]

Fri, 28 Oct 2016 15:25:00 Z

Now is an exciting moment to buy a new laptop, with Microsoft having released its new lineup of Surface devices and Apple having released its new MacBook Pro with Touch Bar. I have a super-old MacBook Pro, “early 2009” Core 2 Duo, that is still working pretty ok, after having boosted the RAM to 8Gb and upgraded the disk to an SSD. With a lifetime of seven and a half years, having kept each of my previous laptops three years at most, this is the longest-living computer I have ever had, by more than a factor of two. But with the model going out of support and not being compatible with new macOS versions, it’s time for a new laptop to enter my home office. If there were no Surface laptop, the decision would have been a no-brainer, but now it gets more complicated. So let’s look at the specs and features of the two models.

Note: since neither of the two laptops is available on the market yet, this comparison is just based on specs and personal opinions.

Microsoft Surface Book

The Surface Book is a hybrid laptop, with a 13” touch screen and pen, and a detachable keyboard that also contains an additional battery and a discrete graphics processor (NVIDIA GeForce GTX 965M 2GB). The new version, called “Surface Book with Performance Base”, comes in 3 configurations, all based on an i7 Intel CPU: 8Gb RAM with 256Gb SSD, and 16Gb RAM with either 512Gb or 1Tb of SSD. The old Surface Book is still available and comes in 7 configurations: i5 or i7, with or without a discrete graphics processor (still NVIDIA GeForce), 8Gb or 16Gb RAM, and 128Gb to 1Tb SSD.

Apple MacBook Pro

The MacBook Pro is a standard laptop, with a new Touch Bar strip on top of the keyboard, available with both 13” and 15” displays, the latter with a quad-core i7 and a discrete graphics processor (Radeon Pro 450 2GB or 460 4GB). In addition to these, there is also a revamped MacBook Pro 13” without the Touch Bar. In total there are more than 30 possible configurations, as you can mix and match what you want. There are even 2 different clock speeds available, and the SSD goes up to 2Tb, with the most powerful configuration being a 15” with a 2.9Ghz quad-core i7, 16Gb of RAM, 2Tb SSD and a Radeon Pro 460 4GB.

Comparing the two beasts

Moving on from the factual information, let’s compare the products, focusing on the features that do not come just from the specs.

Processing Power

It is difficult to make a fair comparison here, especially when comparing prices later, because while for the MacBook Pros we know exactly their clock speed and whether they are dual-core or quad-core, there are no such specs for the new Surface Book. I read somewhere that they use the same processors as the previous model, so a dual-core i5 2.4Ghz and a dual-core i7 2.6Ghz. That is more than the revamped MBP, whose i5 is 2.0Ghz and i7 is 2.4Ghz, but (much) less than the new models, which have an i5 at 2.9Ghz, an i7 up to 3.3Ghz, and even a quad-core i7 2.9Ghz for the 15”.

Update: in the comments someone wrote that “processors are not benchmarked by GHz these days”. While there is more to CPU performance than GHz, clock speed still plays an important part. So I looked up the model numbers of the CPUs and compared their scores on the PassMark benchmarks.

    CPU                     Laptop                  Mark
    i7-6920HQ @ 2.90GHz     MBP 15” (optional)      9588
    i7-6820HQ @ 2.70GHz     MBP 15” (standard)      8697
    i7-6700HQ @ 2.60GHz     MBP 15” (standard)      8029
    i7-6567U @ 3.30GHz      MBP 13” (optional)      5479
    i5-6267U @ 2.90GHz      MBP 13” (standard)      4882
    i7-6600U @ 2.60GHz      Surface Book (all)      4751

So, as you can see, the i7 used by the Surface Book has an even lower score than the i5 used as the standard option in [...]



I’m no longer an MVP… and that’s OK

Fri, 30 Sep 2016 13:02:06 Z

This is the time of the year when members of the Microsoft community usually post about becoming a Microsoft MVP or having their status renewed for another year. I did that too, for 8 years: first on my blog, when I was first awarded in 2008 and again in 2009, then when I became a Belgian MVP in 2010, and later on social media, when my blogging became slower. But this year, instead of announcing my renewal as MVP, I have to announce that I'm not being renewed.

The reason is simple… during the last year, apart from presenting ASP.NET Core at the Umbraco CodeGarden, I mainly worked on my upcoming book, which is not released yet. So, no real tangible contribution. Also, last year they merged many expertise areas into one, so from being an MVP in ASP.NET I became an MVP in Visual Studio and Development Technologies, together with MVPs from .NET, ALM, Security, IE and C++: basically the "generic" Microsoft developers group for those who do not focus on Azure or devices. This meant more people to be compared with, in categories that are usually more prolific in community contributions.

At first I was kind of upset and annoyed, but after some hours of introspection I remembered that being an MVP, for me, was a way to get into closer contact with the product groups I care about, which are ASP.NET and Web Tooling. But I'm an ASP Insider, so I can still do that. The only thing that really affects me is that I'll lose my Azure credits, so I'll move my blog away from Subtext and from Azure (the use I make of it wouldn't justify the 30-40 euro per month that I'd pay).

So, you'll still see me this year at the MVP Summit, and probably also in the years to come.

Tags: MVP, MVP Summit [...]



From iPhone to Windows Phone and back: why?

Thu, 29 Sep 2016 15:07:01 Z

I just bought myself a new iPhone 7. This was a long overdue change: for a few months my Microsoft Lumia had been acting weird, and with no clear future path for Windows Phone, I didn't want to buy something that would be useless in a few months.

Back to #iphone after seeing the #smartphone #dream of #Microsoft rise and fall #iphone7
(A photo posted by Simone Chiaretta (@simonech) on Sep 21, 2016 at 5:31am PDT)

A bit of history

But let's go back in time and look at my phones. I bought my first smartphone, an iPhone 3G, in June 2008, as soon as the first version of the iPhone working in Europe was announced. I went on using the iPhone for a few years, upgrading to the iPhone 4 in 2010. Then Microsoft announced its smartphone operating system, Windows Phone 7, and I almost immediately bought one to test how it worked, with the idea of building some apps. After some time I started using WP as my main phone, and never looked back for a few years, passing from the Samsung Omnia to a Lumia 800, and then to a Lumia 930 in July 2014. I loved the UI, with dynamic tiles that displayed information at a glance. I also developed a newsreader for Windows Phone 7.

Then Windows Phone 8 came and changed the way apps had to be built. And then Windows Phone 10, with yet another change in how apps have to be built. While this is not a problem per se for end users, it is a problem for developers, who had to rebuild their apps in order to be compatible with the new versions of the phones: sometimes small changes, sometimes more fundamental ones. This, over time, alienated developers, which in the end meant fewer applications available for end users, which caused fewer users to buy WP phones, which didn't incentivize developers to spend time updating apps, and so on. Only the biggest apps and the companies that couldn't afford to lose that small 5%-10% of the market (in the EU; in the USA I think it never grew beyond 2-3%) made apps for Windows Phone. Which IMHO was enough for most users: email, calendar, Facebook, Twitter, Instagram, WhatsApp, Snapchat, weather, maps, banking, and the occasional game and fitness app.

Why move back to the iPhone? The fitness niche and no future

So why did I move away from Windows Phone to an iPhone? Because recently I started being more involved in sport: I started training for triathlon. And none of the big fitness companies that produce sport watches or devices have apps for Windows Phone. This is due to the fact that Windows Phone 10 lacks some support for connecting to modern BLE devices (it doesn't support "code-less" BLE pairing and cannot act as a BLE client to devices). So I couldn't sync my sport watch with my phone. Same for cyclocomputers and indoor trainers. And since these big players do not support WP, none of the popular fitness apps like Strava, TrainingPeaks and the rest support it either.

Another reason is my growing interest in connected devices, most of which come from startups in the USA. And given the aforementioned reasons (low market share), they obviously don't spend time making a Windows Phone app.

Yet another, more fundamental, reason is the (non-existent) roadmap for the future of Windows Phone. Microsoft sold the featurephone division at the beginning of 2016, and it was hinted that they will not make new Lumia phones and will even stop selling what they have in stock. They might produce a Surface Phone, or anything else, but this level of uncertainty doesn't help keep the few users they still have.

Why not an Android?
I have owned an Android phone from Sony for 2 years already, and recently I also had a Galaxy Express for a few weeks when I was in the USA and both my other phones were dead (the Lumia with an unresponsive touch-screen and the Sony with broken glass). In g[...]



Umbraco and Me

Tue, 05 Jul 2016 15:12:09 Z

If I had to summarize my main field of expertise in one word, it would be CMS: since I started working in 1996 I have always built public websites based on a CMS. When working at Esperia, around the year 2000, I developed our internal CMS, which was used to power lots of very popular websites with lots of traffic (at least for the time), like the sites of all the top Italian soccer teams, and winter sport and soccer portals. At the time, around 2003-2006, I was also using DotNetNuke to develop some sites for small businesses. It was ok for simple sites, but customizing it and making a site look like the designer envisioned was almost impossible.

Then in 2006 I stopped working professionally with CMSs, as I spent almost a year building emailing apps in New Zealand and then another two and a half years doing "IT consultancy" at Avanade. But on the side I was still working on a smaller CMS, the once famous Subtext blogging engine, which still powers this blog.

Then, a few months after my ASP.NET MVC v1 book was released, I received an email from Niels Hartvig (CEO of Umbraco) asking me if I could go to Copenhagen and give him and the core team a quick start on ASP.NET MVC, because they wanted to rebuild Umbraco using ASP.NET MVC. At that point I had never used Umbraco before, just evaluated it a bit some years earlier. Obviously I said yes: I went there, delivered the course, and immediately felt like I had known these guys forever. I was also fascinated by how they were working with the community to bring an edge to the product. I immediately became engaged with the community, and gave two talks at CodeGarden 10 about ASP.NET MVC.

Coincidentally, as soon as I started my new job after moving to Belgium, I was surprised to find that Umbraco was used for one of the main public sites of the organization I joined. So I also started working with it professionally. Unfortunately I couldn't always work with it, as my job requires me to juggle many different hats and various projects, but nevertheless I stayed involved in the community as much as I could: I attended various conferences of the Belgian Umbraco User Group (or BUUG) and went again to two more CodeGardens, in 2014 and 2015.

Then the Big Bang happened… and a new project came by. For the next years, Umbraco will be the main product I'll be working with, as we'll be rebuilding all our online presence using this amazing CMS, and most of the things we'll customize will be given back to the community, both as packages and as PRs to the core.

At first, coming from my background as a "purist" .NET developer, I didn't much like the mixed approach that required developers to configure the system using the backoffice, as it prevented proper code versioning and CI. But with the help of the great people in the community I solved most of the issues. And now, with Umbraco 7.4, almost all of these issues are solved, thanks to strongly typed models and some tools that help with the versioning of the stuff that is still configured in the backoffice.

Now that I'll be working full-time on Umbraco, expect to see something more coming out of me in the Umbraco community and at conferences in the future. And if you missed it, I also just gave a talk about ASP.NET Core at CodeGarden 16 (slides and demo are available). And hopefully soon I'll be moving my blog from this totally dead Subtext to Articulate on Umbraco. #h5yr

Tags: umbraco, community [...]



Slides and demo for my ASP.NET Core talk at Umbraco CodeGarden 2016

Fri, 17 Jun 2016 10:40:42 Z

Yesterday I had the pleasure of introducing ASP.NET Core to a very crowded and interested room at Umbraco CodeGarden. I really liked the conference and the amazing, best-ever OSS community, and got even more hooked on Umbraco, if that's even possible. Now I just want to list the links and resources I mentioned during my talk.

  • Slides and the very basic sample apps: https://github.com/simonech/MeetASPNETCoreAtCG16
  • A similar presentation I did a few months earlier, but on ASP.NET v5 and still with DNX instead of the dotnet CLI, recorded as a webcast: Introduction to ASP.NET Core 1.0 video
  • A blog post I wrote which explains how to enable debugging of .NET Core RC2 apps inside Visual Studio Code: How to debug .NET Core RC2 app with Visual Studio Code on Windows
  • My OWIN book: My new free eBook is out: OWIN Succinctly by Syncfusion
  • My upcoming "Front-end development with ASP.NET Core MVC 1, AngularJS and Bootstrap" book: Front-end Development with ASP.NET MVC 6, AngularJS, and Bootstrap

If you attended my talk, I'd love it if you could comment or tweet me (@simonech) and tell me what you thought of it, both about the topic itself and about my presentation. Apparently the sessions were recorded, so I'll post a link when the videos are online.

Tags: ASP.NET Core, .NET Core, Umbraco, CodeGarden [...]



How to debug .NET Core RC2 app with Visual Studio Code on Windows

Fri, 20 May 2016 10:33:40 Z

So, you installed .NET Core RC2, you followed the getting started tutorial, and you got your "Hello World!" printed on the command prompt just by using the CLI. Then you went one step further and tried Visual Studio Code and the C# extension to edit the application outside of Visual Studio. And finally you wanted to debug and set a breakpoint inside the application, but you encountered some problems and nothing worked. Here is how to make it work.

Specify the launch configuration

Visual Studio Code needs to know how to launch your application, and this is specified in a launch.json file inside the .vscode folder. From the debug window, click the "gear" icon and Code will create it for you: just choose the right environment, ".NET Core". Then you must specify the path to your executable in the program property. In the standard hwapp sample app, replace

    "program": "${workspaceRoot}/bin/Debug/<target-framework>/<project-name>.dll",

with

    "program": "${workspaceRoot}/bin/Debug/netcoreapp1.0/hwapp.dll",

There is much more you can specify in the launch.json file. To see all the options, have a look at the official doc: Debugging in Visual Studio Code.

Specify the task runner

If you try to debug now, you'll get another warning: "No task runner configured". This is because, in order to launch the app, VS Code has to build the project, and this is done via a task. But no worries: just click the "Configure Task Runner" button in the info box, choose which task runner you want to use, in this case ".NET Core", and the tasks.json file will be created for you. More info on task runners in VS Code can be found in the official documentation: Tasks in Visual Studio Code.

Running and debugging

Now you can click the "Start Debugging" button or hit F5, and the application runs. Cool… Now you set a breakpoint and the execution stops where you set it, doesn't it? Well… if you are on Mac or Linux it does. But it doesn't stop if you are on Windows, and the Debug Console says something like:

    WARNING: Could not load symbols for 'hwapp.dll'.
    '...\hwapp\bin\Debug\netcoreapp1.0\hwapp.pdb' is a Windows PDB.
    These are not supported by the cross-platform .NET Core debugger.

Introducing Portable PDBs

In order to be able to debug cross-platform, .NET Core now has a "portable PDB" format, and the newly introduced .NET Core debugger for Visual Studio Code only supports this format. Unfortunately, by default on Windows the .NET Core build generates standard "Windows PDBs", which are not supported. But the fix is easy: you just have to tell the compiler to generate portable PDBs, by setting the debugType to portable in the buildOptions of project.json:

    {
        "buildOptions": {
            "debugType": "portable"
        },
        ...
    }

And voilà! Breakpoints are hit!

Tags: vscode, netcore, debugging [...]
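For reference, the tasks.json generated by the ".NET Core" option looks roughly like the stock template below; treat it as a sketch, since the exact content depends on your tooling version:

    {
        "version": "0.1.0",
        "command": "dotnet",
        "isShellCommand": true,
        "args": [],
        "tasks": [
            {
                "taskName": "build",
                "args": ["${workspaceRoot}/project.json"],
                "isBuildCommand": true,
                "problemMatcher": "$msCompile"
            }
        ]
    }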



The .NET Core RC2 stack has been released, and a new platform download site

Wed, 18 May 2016 14:18:49 Z

Finally, after some months of delay due to the re-platforming of DNX on top of the new .NET Core CLI, at the beginning of the week all things RC2 were released. There is already a ton of documentation on how to get started, both on the ASP.NET Core Documentation and .NET Core Documentation sites, but in this post I just want to collect all the announcements.

Announcements

The three main pieces of the puzzle, .NET Core, ASP.NET Core and Entity Framework Core, all at RC2:

  • Announcing .NET Core RC2 and .NET Core SDK Preview 1
  • Announcing ASP.NET Core RC2
  • Announcing EF Core RC2

Then there is the tooling, at Preview 1: Announcing Web Tooling for ASP.NET Core RC2.

It's important to understand why one thing is RC2 and the other is a preview. The libraries and runtime are RC2, and will be RTM at the end of June: they are a real RC2, having been worked on for more than 2 years. The tooling, that is, the CLI and the support inside Visual Studio and Visual Studio Code, is still a preview: work on it, especially on the web tooling part, only started at the end of last year, and it will become RTM only with the next version of Visual Studio, "15".

Changes

A lot changed between RC1 and RC2, but do not worry too much: the changes are mainly in the hosting and runtime parts of apps. No major change in the common APIs… well, maybe some renaming and moving of namespaces. Here are links to what changed in .NET Core and ASP.NET Core between RC1/DNX and RC2:

  • Changes from DNX to .NET Core CLI
  • Changes from ASP.NET Core RC1 to RC2

New website

But there is more to it. All things .NET can now be downloaded from the, IMHO, super-cool new URL: http://dot.net. From there you can download the standard framework, .NET Core, and the mobile development tools for Xamarin.

Tags: netcore, aspnetcore, aspnet [...]



How to access Council of EU data on votes on legislation using SPARQL and AngularJS

Thu, 28 Apr 2016 09:02:18 Z

One of the areas I've been focusing on lately is the so-called "Semantic Web", in particular Open Data as a way to make governments more transparent and provide data to citizens. From a technical point of view, these data are redistributed using the RDF/LD format. I'm particularly excited to have worked on the release of what I think is a very important dataset, one that helps understand how decisions are taken in the Council of the European Union.

The Council of the European Union has published how member states have voted since 2010

In April 2015, the Council of the European Union released as open data how Member States vote on legislative acts. In other words, when the Council votes to adopt a legislative act (i.e. a regulation or a directive), the votes of each country are stored and made publicly visible. This means that you can see how your country voted when a given law was adopted, or you can get more aggregate data on trends and voting patterns. Recently, the Council has also released two additional open datasets, containing the metadata of all Council documents and the metadata on requests for Council documents.

DiploHack, Open Data Hackathon

Tomorrow, the 29th and 30th of April, the Council, together with the Dutch Presidency, is also organising DiploHack, a hackathon about open data, in Brussels. The goal of the hackathon is to make use of the Council's open data sets, linking them with all the other datasets available from other EU institutions, and to build something useful for citizens. You can still register for the hackathon.

This post will show you how to access the votes using SPARQL, which is a query language for data published in RDF format, and how to access those data using AngularJS.

A brief introduction to RDF/LD and SPARQL

In the context of the Semantic Web, entities and relations between entities are represented in triples, which are serialized in a format called "Turtle", in RDF/XML (which is what is usually referred to as RDF), or in many other formats. You can imagine a "triple" as a database with 3 columns: subject, predicate, object, where each of them is represented by a URI. This is a very flexible format that can be used to represent anything. For example, you can say that the author of this blog is myself (uniquely identified by my GitHub account URL and with the name "Simone Chiaretta") and that the topic of this blog is web development. The corresponding serialization in Turtle (using the simple notation) of these three pieces of information would be along these lines (the exact predicate URIs depend on the vocabularies chosen):

    <http://codeclimber.net.nz/> <http://purl.org/dc/elements/1.1/creator> <https://github.com/simonech> .
    <http://codeclimber.net.nz/> <http://purl.org/dc/elements/1.1/subject> "Web Development" .
    <https://github.com/simonech> <http://xmlns.com/foaf/0.1/name> "Simone Chiaretta" .

Notice the use of URIs to represent entities, which gives them a unique identifier. In this case http://purl.org/dc/elements refers to a URI defined by the Dublin Core Metadata Terms. Another possible way to represent the topic could have been to refer to another URI coming from a managed taxonomy: this way it would have been possible to make "links" with other datasets.

But how do we query these data? We use SPARQL. SPARQL uses a syntax very similar to Turtle, with SQL-like keywords like SELECT and WHERE. Using the bibliographic example, one could query for all publications written by Simone Chiaretta. The syntax would be:

    SELECT ?publication
    WHERE {
        ?publication <http://purl.org/dc/elements/1.1/creator> <https://github.com/simonech> .
    }

[...]
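To give an idea of the AngularJS side, here is a minimal sketch of sending that query to a SPARQL endpoint with $http; the endpoint URL is a placeholder, and the exact way of requesting JSON results varies per endpoint:

    // placeholder endpoint: use the real SPARQL endpoint of the dataset
    var endpoint = "http://example.org/sparql";
    var query = "SELECT ?publication WHERE { ?publication " +
        "<http://purl.org/dc/elements/1.1/creator> <https://github.com/simonech> . }";

    $http.get(endpoint, {
        params: { query: query },
        headers: { Accept: "application/sparql-results+json" }
    }).then(function (response) {
        // in the SPARQL JSON results format, each solution of the
        // SELECT ends up as an entry in results.bindings
        console.log(response.data.results.bindings);
    });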



Introduction to ASP.NET Core 1.0 video

Fri, 04 Mar 2016 16:42:31 Z

Actually still called Introduction to ASP.NET 5 (I recorded it before the name change from ASP.NET 5 to ASP.NET Core), a few days ago Microsoft TechRewards published the video I produced for Syncfusion about the new open-source web framework by Microsoft.

In the video I go through a quick introduction, followed by the installation procedures, and then show how to create command line tools and simple websites using ASP.NET Core v1.0, with both Visual Studio Code and Visual Studio 2015.

You can read more about the content of my video in the post Video Review: Introduction to ASP.NET 5 with Simone Chiaretta and, of course, watch the video (and take the quiz at the end).


Hope you like it, and let me know what you think about it in the comments.




Two Razor view errors you might be making too

Fri, 19 Feb 2016 13:02:24 Z

Lately I went back to developing web sites with ASP.NET MVC (after quite some time spent on SPAs and Web API), and I struggled for some time with some strange Razor view behaviours I couldn't understand. Here are some of them; hopefully this post will save you some time in case you hit the same problems.

Using generics in Razor views

Generics' syntax has a peculiarity that might interfere when writing inline code inside HTML tags: the use of angle brackets. This confuses the Razor interpreter so much that it thinks there is a missing closing tag. For example, when trying to write

    @Model.GetPropertyValue<DateTime>("date")

you'll get an error, and Visual Studio will show some squiggles warning that the <DateTime> "tag" was never closed: basically it thinks it is an HTML tag and wants you to close it. The solution is pretty simple: just put everything inside parentheses, like

    @(Model.GetPropertyValue<DateTime>("date"))

Order of execution of body and layout views

I wanted to set the current UI culture of my pages on every request, so I wrote a partial view that I included at the top of my layout view: all text in the layout was correctly translated, while the text coming from the body was not. After some digging I realized that the execution of a Razor view starts with the view itself (which renders the body) and then goes on with the layout, so my UICulture was set only after the body had already been rendered. I had to move the partial view that was setting the culture to the top of the "main" view. If you have many views, just put all the initialization code inside a view called _ViewStart.cshtml: this way the code is executed before the body is rendered, for every view, and you don't have to add it to each view manually.

That's all for now.

Tags: razor, generics, partial view, ViewStart [...]
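A minimal _ViewStart.cshtml for the culture case could look like this; the fixed culture is just an example, as you would normally derive it from the request or the content:

    @using System.Globalization
    @using System.Threading
    @{
        // runs before the body of every view is rendered
        var culture = new CultureInfo("en-US");
        Thread.CurrentThread.CurrentCulture = culture;
        Thread.CurrentThread.CurrentUICulture = culture;
    }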



ASP.NET 5 is now ASP.NET Core 1.0

Thu, 21 Jan 2016 11:42:02 Z

A few months before the RTM of the new version of ASP.NET, Microsoft changed its name: what was originally referred to as ASP.NET vNext, and later as ASP.NET 5, is now called ASP.NET Core 1.0. All the related libraries change name as well:

  • .NET Core 5 becomes .NET Core 1.0
  • ASP.NET MVC 6 becomes ASP.NET Core MVC 1.0
  • Entity Framework 7 becomes Entity Framework Core 1.0

I personally think this is a great move, as the old names were causing a lot of confusion for people who were just looking at the whole thing from time to time and not following the whole evolution.

Why this is a good move

Calling the next versions v5, v6 and v7 (respectively for ASP.NET, MVC and EF) would have led people to think that they were actually the next versions of the various libraries and frameworks. But they were not:

  • ASP.NET 5 would not have been a replacement for ASP.NET 4.6, because it lacks a lot of its features (WebForms above all)
  • ASP.NET MVC 6 was not a replacement for MVC 5, because you couldn't run it on top of ASP.NET 4.6

So it's a good move to reboot the version number to 1.0 and start a new product from scratch, because this is indeed what ASP.NET 5 was: a completely new product, written from scratch, without backward compatibility, and running on a different runtime. Calling it 1.0 also opens the way to a future ASP.NET 5 running on the full framework and still supporting WebForms, for example. Calling everything 1.0 also clears up the versioning mess of all the libraries that ran on top of ASP.NET: MVC 5, Web API 2, SignalR, Web Pages 2. Now they'll all be part of the Core family, will all go back to 1.0, and will evolve together with the Core family.

Why I don't like it that much

But naming and versioning are hard, and this naming has its faults too: you can still run ASP.NET Core 1.0 on top of the "full" .NET Framework 4.6, and the same goes for EF Core 1.0. Will this lead to some confusion? I'm pretty sure it will. Also, if you search on Google for ASP.NET MVC 1.0, you'll have to make sure the v1.0 you are reading about is the "Core" one and not the old version of the "full" ASP.NET MVC. Personally I'd have gone even further and called it something completely different: Foo 1.0. But this would have had its pros and cons too:

  • the main point in favour is that we'd finally get rid of the legacy of "Active Server Pages" and lose the bad connotation that ASP.NET WebForms has in other communities; also, any name would be better and more appealing than "ASP.NET Core 1.0 MVC", which is getting very close to the long names we got from Microsoft in the past
  • the disadvantage of a new name is that they'd lose all the ASP branding that has been built up over 20 years

How all the new parts stack up after the name change

Let's try to clear things up a bit. At the bottom level we'll have:

  • the "full" .NET Framework 4.6, which provides the base class library and execution runtime for Windows
  • .NET Core v1, which provides the base class library and many of the other classes; from RC2 it also provides the execution runtime and all related tools (packages, build, etc.), everything that was before in DNX, and it runs on all OSes

Then at the base web framework level:

  • ASP.NET 4.6, which runs on top of the "full" .NET 4.6
  • ASP.NET Core v1, which runs on top of .NET Core v1 and on top of the "full" .NET 4.6

Then at the higher web libraries level:

  • ASP.NET MVC 5, WebForms, and so on, which run on top of ASP.NET 4.6
  • ASP.NET Core v1 MVC, which runs on top of ASP.NET Core v1 (and in RC2 loses the exec[...]



Automatically applying styles to a Word document with search and replace

Mon, 14 Dec 2015 16:24:23 Z

Word as an end-user tool is a very strange topic for me to blog about, but I just discovered a tip that would have saved me countless hours, so I thought I'd share it.

At the moment I'm writing a book (yeah, another one): for my personal convenience I write it in Markdown, so that I can easily push it to GitHub and work on it from different devices, even from a tablet when travelling. I've synced my private repository to GitBook so that I can easily read it online or export it to PDF or Word, but unfortunately I cannot rely on these features to send the chapters to my publisher. In fact, book publishers have very strict rules when it comes to styles in Word documents. For example, if I want a bullet list, I cannot just click the bullet list button in the toolbar: I have to apply a "bulletlist" style. The same goes for all the other standard styles.

For most of the styles it's not a big deal: I just select the lines I need to re-style, and in 15-20 minutes a 20-page chapter is formatted. The problem arrives when formatting "inline code": in Markdown, inline code is delimited with back-ticks (`), so each time I need to show something as inline code I have to remove the trailing and leading ticks and then apply the "inlinecode" Word style. This process alone, in a typical chapter, takes away at least a few hours.

After a few chapters and hours of frustration I asked my girlfriend for help: working in language translation, she uses Word as her main working tool all day, and she had a solution for this problem, so I'm sharing it in case other fellow technical writers need it.

First open the Advanced Find dialog and switch to the Replace tab. In Find you put a kind of simplified regular expression: (`)(*)(`). This means: find any string which starts with a back-tick and ends with a back-tick. In Replace you put \2, which means: replace the match with the content of the second "match group". Also specify the style you want applied, in my case "InlineCode". And remember to check the Use wildcards box, otherwise this won't work.

Seen in action on some lines from my upcoming book: once the Markdown is pasted into Word (with the basic styling applied), there is back-ticked text everywhere; I then apply the magic find & replace, and voilà! In a few seconds, 20 pages of Word document are correctly updated, removing the ticks around inline code and applying the correct style.

It's not my typical content, but I hope you've learnt something you didn't know. To see all you can do with wildcards: How to Use Wildcards When Searching in Word 2013. The next step in automating this process would be writing some code that formats everything properly in one go.

Tags: word, book, find&replace, markdown, gitbook [...]
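For the curious, here is the same capture-group trick expressed as a .NET regex, just to illustrate what Word's (`)(*)(`) pattern and \2 replacement are doing (this is not actual Word automation):

    using System;
    using System.Text.RegularExpressions;

    class BackTickDemo
    {
        static void Main()
        {
            var input = "Use the `dotnet` CLI and run `dotnet restore` first.";
            // `(.+?)` captures the text between two back-ticks, like (`)(*)(`) in Word;
            // "$1" keeps only the captured group, like \2 in Word's Replace box
            var output = Regex.Replace(input, "`(.+?)`", "$1");
            Console.WriteLine(output); // Use the dotnet CLI and run dotnet restore first.
        }
    }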



Web European Conference registration opens 1st July 12:00 CET

Tue, 30 Jun 2015 09:36:48 Z

The moment has finally come: tomorrow at midday, Central European Time, it will be possible to start registering for the 2nd Web European Conference. In the previous edition of the conference we sold out all the tickets available at the time (170) in the first few hours after opening. This year we'll have 400 seats, but just to be sure, remember to set an alarm and get to the registration page on time, so you don't lose the chance to take part in the conference.

Register for the Web European Conference

Speakers and sessions

Tomorrow we'll also close the Call for Presenters, and we'll ask for your opinion on which sessions to include in the conference: you can already see all the proposals on our GitHub repository, and from tomorrow you'll be able to vote for your favorite sessions. We already have our two top speakers, though: Scott Hanselman and Dino Esposito.

Sponsors

A final word on our sponsors and partners, without whom this conference would not be possible.

Tags: webnextconf, conference [...]



CodeGarden 2015: recap of day 1

Wed, 10 Jun 2015 19:51:04 Z

Here I am again, for the third time at Umbraco CodeGarden. For those who do not know what it is: it's the yearly Umbraco developer conference, this year celebrating its 10th anniversary. Before going to sleep after a long day, I just wanted to post my recap of the day.

The Keynote

Some numbers on the "size" of the community:

  • almost 200k active developers on the community site
  • almost 300k active public installations of Umbraco
  • over 200k installations of Umbraco v7 in the last year

In addition to giving all these figures, Niels also highlighted some popular packages contributed by the community (Vorto for 1-to-1 translations, NuPicker and Nested Content for an enhanced editing experience, LePainter, a visual grid editor, and BookShelf, which provides inline contextual help in the backoffice). Other announcements included the features coming with v7.3 (automatic load balancing, a new API library as a first step towards getting rid of the legacy API, and authentication based on ASP.NET Identity, which enables Twitter, Google and Active Directory logins and two-factor authentication via Google) and future features that are currently being experimented with, like the new cache layer, a new content type editor and a full-fledged REST API based on the HAL standard.

Roadmap panel

Immediately after the keynote, 5 members of the core dev team answered questions on specific pain points that users would like addressed in future (v8) releases, and also unveiled HQ's priorities:

  • improving the UX
  • a fresh start on the code (getting rid of the decennial original legacy API)
  • bringing many features of Umbraco.com (the SaaS platform) to on-premises installations (like migrations between environments, synchronization of content and so on)
  • segmentation, segment-based content variations and personalization

Contributing to the core

After the usual organic lunch, the afternoon started with some Git tips to better contribute to the core of Umbraco and make the maintainers' lives easier. First, squash all commits into one, making sure no typo or "missed file" kinds of commits are sent in the pull request; the suggestion was to use the git rebase --interactive command. Then make sure your pull request is based on a pretty recent version of the repository, using the following process:

  • track upstream: git remote add upstream ...
  • fetch upstream: git fetch upstream
  • rebase your commits on top of the latest version of the repo: git rebase upstream/dev-7
  • and finally, resolve all the conflicts that might arise before making the pull request

Make Editors Happy

As last year, one of the main tenets of the conference is reminding us developers that content editors also deserve love, and with Umbraco 7 it's very easy to craft data editors tailored to custom editing expectations and flows. But even without going down the path of customization with AngularJS, many things can be done with the core editors and a few selected packages: group properties in tabs, remove from the RTE everything that editors do not need, provide contextual help (maybe consider the uEditorNotes package) and, finally, use NuPicker and Nested Content to provide a better experience when choosing nodes from the tree and when creating lists of items.

How to sell Umbraco

The day ended with an amazing talk by Theo Paraskevopoulos, with tips on how to sell Umbraco as a platform when doing projects. Unfortunately the slides are not published yet, but I will update the post as soon as they are. Some impressive facts I didn't know about: the NFL uses Umbraco for one of their sub-sites (http://[...]



My new free eBook is out: OWIN Succinctly by Syncfusion

Mon, 16 Mar 2015 11:13:33 Z

I'm happy to announce that my latest book, OWIN Succinctly, has just been released by Syncfusion within their "Succinctly" series of eBooks. I've written this book together with my friend and co-organizer of the 2nd Web European Conference, Ugo Lattanzi, with whom I also gave a talk in Paris last May, also about OWIN.

OWIN is a big inspiration for the new ASP.NET 5 stack, so we decided to write this book both to show how you can use this "philosophy" with the current version of ASP.NET, and to give you an idea of how it could be in the future with ASP.NET 5. The book covers all aspects of OWIN, starting with a description of the OWIN specification, then moving on to how Katana, Microsoft's implementation of the specs, works. Later we also show how to use Katana with various web frameworks, how to use authentication and, finally, how to write custom middleware.

The table of contents is:

  • OWIN
  • Katana
  • Using Katana with Other Web Frameworks
  • Building Custom Middleware
  • Authentication with Katana
  • Appendix

OWIN and the new ASP.NET will be big actors in the 2nd Web European Conference in Milano on the 26th of September, so if you want to know more about those technologies, consider taking part in the conference.

A big "thank you" goes to Syncfusion, for giving us the possibility to reach their audience, and to our technical reviewer Robert Muehsig, whose comments helped make the book even better. If you have comments or feedback on the book, do not hesitate to write a comment on this post or to contact me on Twitter @simonech.

Tags: owin, book, aspnet vNext [...]



Using Entity Framework within an Owin-hosted Web API with Unity

Fri, 20 Feb 2015 14:05:56 Z

After quite a long time writing applications without direct interaction with databases, lately I've been working on a pretty simple ASP.NET Web API project that needs to save data to a database. Despite the simplicity of the application, I faced some interesting problems, which I'm going to write about in a few blog posts over the next weeks. The first of these problems, covered in this post, is how to configure an ASP.NET Web API application to run within OWIN, have its dependencies resolved with Unity, and have the Entity Framework DbContext injected via IoC/DI.

Setting up ASP.NET Web API with OWIN

The first thing to do is get the right packages:

  • first create a new ASP.NET Web Application, choose the Empty template, and tick the Web API option under "Add folders and core references for": this will install all the NuGet packages needed for a Web API project and set up the folder structure;
  • then you need to install the OWIN packages and the OWIN-Web API "bridge": by installing Microsoft.AspNet.WebApi.Owin you'll get everything you need;
  • finally, depending on where/how you want to run the Web API project, you also need the NuGet package for the OWIN server you want: download Microsoft.Owin.Host.SystemWeb for starters if you want your app to run within IIS.

Once all the core dependencies are ready, you have to configure the OWIN Startup class to fire up Web API: just add an OWIN Startup class from the Visual Studio contextual menu and add the right configuration for Web API to the Configuration method.

    public class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            HttpConfiguration config = new HttpConfiguration();
            // ... configure your Web API routes
            app.UseWebApi(config);
        }
    }

And voilà! You have Web API running with an OWIN host.

Adding Unity to Web API within OWIN

The next step is adding Unity and configuring it to correctly resolve dependencies in a Web API application. Just add the Unity.AspNet.WebApi NuGet package, and all the needed packages and bootstrapping code will be added to the project. In particular it adds two important files:

  • the UnityConfig class, where the configuration of the Unity container should go
  • the UnityWebApiActivator class, fired up using WebActivator, which registers the Unity dependency resolver for Web API (by saving it into the GlobalConfiguration.Configuration object)

Unfortunately, if you run your application now (and you already have some dependencies injected into your controllers via IoC/DI), nothing will be injected, simply because the DependencyResolver is still empty, despite being set by the Start method of the UnityWebApiActivator. This works fine in a normal Web API application, but not with OWIN, because of the sequence in which the various services are instantiated. The solution to the problem is pretty easy: just delete the UnityWebApiActivator class and put the same code into the OWIN Configuration method:

    public class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            HttpConfiguration config = new HttpConfiguration();
            // ... configure your Web API routes
            config.DependencyResolver = new UnityDependencyResolver(UnityConfig.GetConfiguredContainer());
            app.UseWebApi(config);
        }
    }

For reference, the UnityConfig.GetConfiguredContainer is a static method exposed by the UnityCo[...]
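To complete the picture, here is a minimal sketch of what could go into UnityConfig; MyDbContext is a hypothetical Entity Framework context, and HierarchicalLifetimeManager is the usual way to get one DbContext per request in Web API:

    public static class UnityConfig
    {
        private static readonly Lazy<IUnityContainer> Container =
            new Lazy<IUnityContainer>(() =>
            {
                var container = new UnityContainer();
                RegisterTypes(container);
                return container;
            });

        public static IUnityContainer GetConfiguredContainer()
        {
            return Container.Value;
        }

        public static void RegisterTypes(IUnityContainer container)
        {
            // one DbContext instance per resolution scope (i.e. per request in Web API)
            container.RegisterType<MyDbContext>(new HierarchicalLifetimeManager());
        }
    }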



How to unset a proxy for a specific git repository or remote

Tue, 20 Jan 2015 10:16:17 Z

In this post I’ll show something that I just discovered and solved a problem I had once we introduced a in-house git repository: how to have many git repositories using proxies but have one that connects directly without proxy. Lately we moved our source code repository, from a “standard” TFS repo to the git-based TFS repository that has been introduced with TFS 2013. Besides working with github repositories, now I had to connect also to some a repository hosted inside the local network and authenticate using the local domain credentials. All went well from within Visual Studio, but since you cannot do everything from VS, I also needed to connect to the internal repository via the git command line tools. The problem is that it didn’t connect. After a bit of troubleshooting I realized that the problem was the proxy: I’m behind a corporate firewall, so I had to configure a proxy to connect to github. Unfortunately the proxy was not recognizing my connection as local, so was trying to resolve it on the internet, and of course it failed. I had to remove the proxy configuration, and I could connect to my local git-based TFS repository, but I couldn’t connect to the other repositories unless I specified the proxy on each of the repositories that needed it, which was kind of tedious since I need proxy for all repos except one. Looking through the git-config documentation I found the solution: Set to the empty string to disable proxying for that remote. This not only work when specifying a proxy for a specific remote, but also for the whole repository. Without further ado, here are the command for this configuration. First you specify your global proxy configuration $ git config --global --add http.proxy "http://proxy.example.com" Then you move to the repository for which you want to unset the proxy and add an "empty" proxy. $ git config --local --add http.proxy "" And in case you need to specify an empty proxy only for a specific remote $ git config --local --add remote..proxy "" It took me a day to understand the cause of the problem, hope this post will help other people in a similar situation. Tags: git, proxy, TFS [...]