Subscribe: Geekswithblogs.net
http://www.geekswithblogs.net/MainFeed.aspx
Beginning Azure Machine Learning

Mon, 13 Nov 2017 18:20:57 GMT

Originally posted on: http://geekswithblogs.net/GinoAbraham/archive/2017/11/14/beginning-azure-machine-learning.aspx

Free version for Azure ML Studio for Learning

https://studio.azureml.net

Azure ML Cheat Sheet for Selecting ML Algorithm for your Experiments

https://docs.microsoft.com/en-us/azure/machine-learning/studio/algorithm-cheat-sheet

(image)
(image)



Announcing Enzo Online: IoT and Mobile Development Made Easier

Mon, 13 Nov 2017 08:30:56 GMT

Originally posted on: http://geekswithblogs.net/hroggero/archive/2017/11/13/announcing-enzo-online-iot-and-mobile-development-made-easier.aspx

If you are a software developer and would like to build a Mobile application, or an IoT system, it is likely that you will experience a steep learning curve for two reasons: development languages, and lack of SDKs.  Indeed, every platform has its own programming language variation: Objective-C on iOS, C++ on Arduino boards, Python on Raspberry Pis (or other supported languages), .NET/JavaScript for MVC Web applications, PowerShell for DevOps... and so forth. The second learning curve is around the limited (or complex) support for Software Development Kits (SDKs) on certain platforms, such as Microsoft Azure, or technologies that have no formal SDK at all (such as sending a text message from an Arduino board), or that introduce breaking changes when upgrading.

To simplify an already complex ecosystem of languages and platforms, I created a new kind of cloud technology: Enzo Online. Enzo Online is an HTTP Protocol Bridge that makes it very easy to access other services. With Enzo Online, you can easily configure access to some of your key cloud services, and call them from your Mobile apps and IoT devices without the need to download an SDK. In other words, Enzo Online allows you to configure your services once, and reuse from any language/platform through HTTPS calls.

During the Preview phase of Enzo Online, you can query SQL Server/Azure and MySQL databases, send SMS messages, Emails, and access other Microsoft Azure services (Service Bus, Azure Storage, Azure Key Vault) by sending simple HTTPS commands.
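To illustrate the idea of an SDK-free HTTPS call, here is a minimal Python sketch. The endpoint path, payload shape, and the authToken header name are invented for the example (consult the Enzo portal for the real API); the point is that only the standard library is needed:

```python
import json
import urllib.request

def build_enzo_request(base_url, service, payload, auth_token):
    """Assemble a plain HTTPS request; no platform-specific SDK involved."""
    data = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/{service}",
        data=data,
        headers={"Content-Type": "application/json",
                 "authToken": auth_token},  # hypothetical header name
        method="POST",
    )

# Hypothetical endpoint and token, for illustration only.
req = build_enzo_request("https://example.enzounified.com", "sms/send",
                         {"to": "+15551234567", "message": "hello"}, "my-token")
# urllib.request.urlopen(req) would send it from any device that speaks HTTPS.
```

Because the request is plain HTTPS, the same call could be issued from an Arduino HTTP client, a mobile app, or a shell script with no SDK to install.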

At the time of this writing, Enzo Online is in preview, and many more services will be added over time.

Visit https://portal.enzounified.com for more information.

This is a cross post from https://www.herveroggero.com/single-post/2017/11/07/IoT-and-Mobile-Development-Made-Easier

About Herve Roggero

Herve Roggero, Microsoft Azure MVP, @hroggero, is the founder of Enzo Unified (http://www.enzounified.com/). Herve's experience includes software development, architecture, database administration and senior management with both global corporations and startup companies. Herve holds multiple certifications, including an MCDBA, MCSE, MCSD. He also holds a Master's degree in Business Administration from Indiana University. Herve is the co-author of "PRO SQL Azure" and “PRO SQL Server 2012 Practices” from Apress, a PluralSight author, and runs the Azure Florida Association.




Octopus Deploy : Format exception unpacking Windows NuGet package onto Linux box

Fri, 10 Nov 2017 11:00:15 GMT

Originally posted on: http://geekswithblogs.net/alexhildyard/archive/2017/11/10/octopus-deploy--format-exception-unpacking-windows-nuget-package-onto-again.aspx

There appears to be an issue with OD's "Deploy NuGet package" task (tested in Octopus Deploy 3.1.5). We encountered the problem while unpacking a .NET Core 2.0 application on a Windows tentacle and copying the resultant assets to an Ubuntu box. We noticed that if the assets were extracted using OD's in-built "Deploy package" task, the application would fail to launch on the Linux box with an "incorrect format" exception. If we unpacked the NuGet package some other way, however (e.g. manually extracted it), the application would run without error.

As a workaround, there are quite a few options. We have tried both of the following successfully:

1. Share the Octopus Packages folder, and then replace the "Deploy NuGet Package" step with a call to the "[System.IO.Compression.ZipFile]::ExtractToDirectory()" method, using the Octopus.Project.Name and Octopus.Release.Number properties to identify the package to extract and the Octopus.Tentacle.Agent.ApplicationDirectoryPath and Octopus.Environment.Name properties to identify the desired extraction path.

2. Install mono on the Linux box in question and unpack the package there using the "Deploy NuGet Package" step.
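For option 1, the extraction itself is trivial because a .nupkg is just a zip archive. A minimal Python sketch of the same idea (the paths here are throwaway placeholders; in Octopus they would come from the variables mentioned above, such as Octopus.Project.Name and Octopus.Release.Number):

```python
import os
import tempfile
import zipfile

def extract_nupkg(package_path, destination):
    """Extract a .nupkg (which is just a zip archive) and return its entry names."""
    os.makedirs(destination, exist_ok=True)
    with zipfile.ZipFile(package_path) as pkg:
        pkg.extractall(destination)
        return pkg.namelist()

# Demo with a throwaway package built on the fly.
workdir = tempfile.mkdtemp()
nupkg = os.path.join(workdir, "MyApp.1.0.0.nupkg")
with zipfile.ZipFile(nupkg, "w") as z:
    z.writestr("MyApp.dll", b"binary payload")

names = extract_nupkg(nupkg, os.path.join(workdir, "out"))
```

Extracting this way sidesteps whatever the "Deploy NuGet Package" task does differently, which is the essence of the workaround.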

I have raised a Support ticket with Octopus Deploy:








Starting An Umbraco Project

Fri, 10 Nov 2017 07:08:50 GMT

Originally posted on: http://geekswithblogs.net/tmurphy/archive/2017/11/10/starting-an-umbraco-project.aspx

(image)

As I have been documenting Umbraco development I realized that people need a starting point.  This post will cover how to start an Umbraco project using an approach suitable for ALM development processes.

The criteria I feel a maintainable solution must meet include a customizable development project that can easily be kept in source control, along with a robust and replicable database.  Of course this has to fall within the options available with Umbraco.  For me this means an ASP.NET web application and a SQL Server database.  Let’s take a look at the steps required to get started with this architecture.

Create The Database

I prefer a standard SQL Server database instance over SQL Server Express due to its manageability.  For each Umbraco instance we need to create an empty database, and then a SQL Server login and a user with permissions to alter the database structure.  You will need the login credentials when you first start your site.

Create The Solution

This is the easiest part of an Umbraco project.  The base of each Umbraco solution I create starts with an empty ASP.NET Web Application.  Once that is created, open the NuGet package manager and install the UmbracoCms package.  After that it is simply a matter of building and executing the application.

Finish Installation

As the ASP.NET application starts it will present the installation settings.  The first prompt you will get is to create your admin credentials as shown below.  Fill these fields in but don’t press any buttons.

(image)

The key is to be sure to click the Customize button before the Install button, as the installer doesn’t ask whether you want to use an existing database before running; it will simply create a SQL Server Express instance on its own.  Pressing the Customize button will show the configuration screen shown below.  Fill in your SQL Server connection information and click Continue.

(image)

Conclusion

Once you start the install sit back and relax.  In a few minutes you will have an environment that is ready for your Umbraco development.  This will be the starting point for other future posts.  Stay tuned.




Relating Umbraco Content With the Content Picker

Wed, 08 Nov 2017 14:36:09 GMT

Originally posted on: http://geekswithblogs.net/tmurphy/archive/2017/11/08/relating-umbraco-content-with-the-content-picker.aspx

After addressing Umbraco team development in my previous post, I want to explore maintaining relationships between pieces of content in Umbraco and accessing them programmatically.  For those of us who have a natural tendency to think in terms of data entities and their relationships, working within a CMS hierarchy can be challenging.  Add to that the fact that users don’t only want to query within that hierarchy, and things get even more challenging.  Fortunately, as we will see here, by adding the Content Picker to your document type definition and a little bit of LINQ to your template, you can deliver on all these scenarios.

Content Picker

Adding the Content Picker to your document type definition is the easiest part of the process, but make sure that you use the new version and not the one that is marked as obsolete.  You will then be presented with a content tree that allows you to navigate to and select any node in your site.

Querying Associated Content

The field in your content will return the ID of the content instance you associated using the Content Picker.  Unfortunately it actually returns it as an HtmlString, so you need to call ToString before using it or you will get unexpected results and compile errors.

In the example below I am looking for the single piece of content selected in the Content Picker.  The LINQ query shows a more complicated approach, but it also gives you an idea of how you could get a list of all nodes of a certain content type and use a lambda expression to filter it.  It requires that you first walk back up to the ancestors of the content you are displaying and find the root.  The easier way is to use the Content or TypedContent methods of the UmbracoHelper.  In future posts I will show alternate methods for finding the root node as well.
var contentId = Umbraco.Field("contentField");
var associatedContent = Model.Content.Ancestors().FirstOrDefault().Children()
    .Where(x => x.Id == int.Parse(contentId.ToString())).FirstOrDefault();

Conclusion

While the Umbraco team needs to create some better documentation for this feature, it is extremely useful for building and using relationships between content in your Umbraco site.









How Did I Become An IT Consultant Curmudgeon?

Thu, 02 Nov 2017 01:25:29 GMT

Originally posted on: http://geekswithblogs.net/tmurphy/archive/2017/11/02/how-did-i-become-an-it-consultant-curmudgeon.aspx

(image)

I have been accused of being a curmudgeon by more than one co-worker.  The short, pithy answer to the question of how I got to this point would be “experience” or “it comes with age”.  But what is the real reason, and does it have any benefit?

Firstly, I was raised in an Irish-German family which by default makes me surly and sarcastic.  At almost half a century of this habit I don’t see any change coming there.  I also find that most developers have similar traits along with a dry sense of humor.

The main thing that earned me the label of curmudgeon is a knack for identifying issues that could adversely affect a project: things like scope creep and knowledge voids that could push a project beyond its budget and deadlines.  This tendency comes from 20 years of being a technical lead and architect.  It is an attribute that has served me well.

The place where this becomes a thorn in some team members’ sides is that with the years I think I have become more blunt with my assessments.  I am still capable of tact, but probably need to employ it a little more liberally.

Ultimately, I think being a self-aware curmudgeon is a good thing.  As long as we continue to learn and strive to work with people, a little surliness is just the spice of life.




Umbraco Team Development

Fri, 27 Oct 2017 05:20:40 GMT

Originally posted on: http://geekswithblogs.net/tmurphy/archive/2017/10/27/umbraco-team-development.aspx

(image)

The Umbraco CMS platform gives you the ability to create a content-managed site with the familiar development process of ASP.NET MVC.  If you are the only developer, things don’t get too complicated, but the moment you are sharing your solution with a team you gain a few wrinkles that have to be addressed.

Syncing Content and Document Types

Umbraco saves its content partially to the file system and partially to the database.  This complicates sharing document types, templates and content between developers.  While Courier allows you to sync these elements between your local machine and your stage and production servers, it doesn’t do well between two localhost instances.

We addressed this problem using the uSync package, which is available on the Umbraco packages page under the developer section of your site.  It allows you to automatically record changes every time you make a save in the back office.  They are saved to files that can then be transferred to another system, where they will automatically be imported when the application pool restarts.

Other Special Cases

Another area that you will have to address is the content indexing files and the location of the auto-generated classes.  Files like “all.dll.path” will need to be ignored in your source control, since they contain a fully qualified directory location which will cause problems as you pull source code to multiple machines.

Of course you will also need to manage your web.config files as you would with any ASP.NET based solution to make sure that you don’t step on each developer’s local settings.

Summary

If you follow these couple of guidelines you will overcome some of the more annoying aspects of developing an Umbraco solution.  Ultimately it will make the life of your developers much easier.




.NET Core–Push Nuget Package After Build

Thu, 26 Oct 2017 07:38:56 GMT

Originally posted on: http://geekswithblogs.net/SoftwareDoneRight/archive/2017/10/26/.net-corendashpush-nuget-package-after-build.aspx

You can configure .NET Core to automatically push your nuget package to the package server of your choice by adding a Target to your project file.

1) If your package server requires an api key, you can set it by calling

nuget.exe SetApiKey <your-api-key>

2) Add the following target to your csproj file.  This sample is configured to fire only on Release builds.  (The XML below is reconstructed; the markup was stripped from the original post.)

  <Target Name="PushPackage" AfterTargets="Pack" Condition="'$(Configuration)' == 'Release'">
    <Exec Command="nuget.exe push &quot;$(PackageOutputPath)$(PackageId).$(PackageVersion).nupkg&quot; -Source https://www.nuget.org/api/v2/package" />
  </Target>

OR

Here’s a version that will ensure releases with a .0 revision number are properly pushed (again reconstructed from the stripped markup):

  <Target Name="PushPackage" AfterTargets="Pack" Condition="'$(Configuration)' == 'Release'">
    <GetAssemblyIdentity AssemblyFiles="$(TargetPath)">
      <Output TaskParameter="Assemblies" ItemName="AssemblyVersion" />
    </GetAssemblyIdentity>
    <PropertyGroup>
      <vMajor>$([System.Version]::Parse(%(AssemblyVersion.Version)).Major)</vMajor>
      <vMinor>$([System.Version]::Parse(%(AssemblyVersion.Version)).Minor)</vMinor>
      <vBuild>$([System.Version]::Parse(%(AssemblyVersion.Version)).Build)</vBuild>
      <vRevision>$([System.Version]::Parse(%(AssemblyVersion.Version)).Revision)</vRevision>
    </PropertyGroup>
    <Exec Command="nuget.exe push &quot;$(PackageOutputPath)$(PackageId).$(vMajor).$(vMinor).$(vBuild).nupkg&quot; -Source https://www.nuget.org/api/v2/package" />
  </Target>

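The reason the second target exists is that NuGet normalizes package version numbers: a four-part version whose fourth (revision) part is 0, such as 1.2.3.0, is written into the package file name as 1.2.3. A small Python sketch of that naming rule (my own illustration of the behavior, not NuGet's code):

```python
def package_file_name(package_id, version):
    """Mimic NuGet's normalization: a fourth ('revision') part of 0 is dropped."""
    parts = [int(p) for p in version.split(".")]
    if len(parts) == 4 and parts[3] == 0:
        parts = parts[:3]  # 1.2.3.0 -> 1.2.3
    return f"{package_id}.{'.'.join(str(p) for p in parts)}.nupkg"

# A push step that blindly appends the raw assembly version would look for
# MyLib.1.2.3.0.nupkg, but the file on disk is named:
name = package_file_name("MyLib", "1.2.3.0")
```

This is why the target above splits the version into its parts and assembles the file name itself.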



ETH security feature distribution

Mon, 23 Oct 2017 15:40:34 GMT

Originally posted on: http://geekswithblogs.net/foxjazz/archive/2017/10/23/eth-security-feature-distribution.aspx

Ethereum sidechain feature...

ETH nodes would have a private key to decrypt the code they run.  This code is copied x times based on the keys that would have access to the data: one encryption for the sender, one for the receiver, one for the computer, and more for any 3rd parties listed.
The way to seed this process would be hardware access for the secret key owners; nodes would have to carry a hardware wallet copy for all running nodes, which ETH would use to decrypt and process code.
This ensures ultimate privacy of the contract.

Currently contracts are open, and not private, they may contain things like listing of properties or trade in volume.
 



NET USE in WSL

Wed, 18 Oct 2017 07:03:23 GMT

Originally posted on: http://geekswithblogs.net/WinAZ/archive/2017/10/18/net-use-in-wsl.aspx

Still getting into Windows Subsystem for Linux (WSL). I was using PowerShell to execute NET USE commands to access remote shares. Having tried this on WSL, it wasn’t immediately obvious how to get it to work, though I knew it should be possible. For example, here was my first try:

$net use \\sharename password /USER:domain\username
  

Which resulted in:

Invalid command: net use
 
  Usage: 
  net rpc             Run functions using RPC transport
  net rap             Run functions using RAP transport 
  …
  net help            Print usage information

  

Clearly, WSL is not impressed. After some trial and error, I landed on the following command line:

$net.exe use '\\sharename' password '/USER:domain\username'

First, notice that I used net.exe, instead of just net. That’s because we’re talking about two different tools. The net.exe is a Windows tool for working with users, groups, file shares, and more, but net is a Linux tool for working with Samba and CIFS servers (type man net in WSL for more details).

Next, I added quotes around the options with back-slashes. This keeps from needing to escape the back-slashes. Alternatively, you could write:

$net.exe use \\\\sharename password /USER:domain\\username

That explicitly escapes the back-slashes. Now, I’m able to access secure file shares through WSL.
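The same quoting headache disappears entirely if you script the call instead of typing it into a shell. A hypothetical Python wrapper (not part of the original post) that builds the argument list for subprocess: because no shell parses the strings, the backslashes need no escaping at all beyond normal Python raw strings:

```python
import subprocess  # subprocess.run(cmd) would invoke it on a Windows host

def build_net_use(share, password, user):
    """Build the net.exe argument list; list form means no shell, no escaping."""
    return ["net.exe", "use", share, password, f"/USER:{user}"]

cmd = build_net_use(r"\\sharename", "password", r"domain\username")
```

Passing a list to subprocess.run bypasses shell word-splitting, which is exactly what the single quotes accomplish in the interactive WSL command above.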

@JoeMayo on Twitter




I'm back!

Sat, 14 Oct 2017 08:31:26 GMT

Originally posted on: http://geekswithblogs.net/paul/archive/2017/10/15/244610.aspx

Wow... just wow... the posts before this are from when I was at TAFE and just beginning my studies. Oh how far I have come since then. I completed my studies and got my Bachelor’s in Game Design/Computer Science, worked for a small council initiative doing some PHP stuff, and now I am a .NET developer at a stockbroking firm working on in-house products. Anyway, I am back and not going to post cringe-worthy stuff like I have previously, haha!

The reason for being back is that I am exploring .NET Core 2.0 and React. I am working on a little project that I thought up a long time ago, and I plan on blogging about my problems, journey and experiences on here. The project could really just be a WordPress site, which would be much easier. But then I wouldn’t get to do what I love: learn, and actually make it scalable, as I have bigger plans for it later, including (but not limited to) a mobile phone app that will make calls to the database.

Anyway tomorrow will be Day 1 and here is my list:
- Create VS2017 project and install React
- Create work item cards and mockups for home page
- Research local database setup

If anyone has any tips or helpful links regarding that list, please feel free to link in comments :D



Fix for slow iPhone 5s after iOS 11.0.1/11.0.2

Thu, 12 Oct 2017 05:00:43 GMT

Originally posted on: http://geekswithblogs.net/BlueProbe/archive/2017/10/12/244608.aspx

Love my iPhone 5s, but it was brutal after the iOS 11 upgrade. I couldn’t even swipe to answer the phone. If you’re a fan-boy you can probably do this yourself, but I took it into the Apple Store and they restored it back to base 11.0.3 using the iTunes app on a Mac, wiping everything. And bingo, we’re back in business. Every couple of years, going back to metal on my laptop has been a good thing too.




Installing a Kubernetes Cluster on CentOS 7

Thu, 12 Oct 2017 02:05:12 GMT

Originally posted on: http://geekswithblogs.net/alexhildyard/archive/2017/10/12/installing-a-kubernetes-cluster-on-centos-7.aspx


Having played around with the GKE and MiniKube "one stop" clusters, I wanted to build a multi-node K8 cluster from a bunch of CentOS VMs. The experience was pleasantly straightforward, following the instructions at https://kubernetes.io/docs/setup/independent/install-kubeadm/ with one or two caveats.

First of all, after the initial K8 installation, you will need to disable swap and remove the certificate keys on all nodes before you can either initialise the cluster on the master node or join a slave node to the cluster:

swapoff -a
service kubelet stop
rm -rf /var/lib/kubelet/pki

Also pay attention to the final output from "kubeadm init"; you will not be able to install a pod network until you have manually copied and permissioned the admin.conf file as described; instead, at least with K8 1.8, you'll get confusing W102 warnings about K8 falling back on localhost:8080:

scp root@:/etc/kubernetes/admin.conf ~/.kube/config

And you'll need to do the same thing on the slave nodes if you want to be able to run cluster commands from them in addition. 

Finally, if you want to schedule pods on your master node, check that it's taint-free (the command returns no taint) with:

kubectl describe node | grep -i taint

So far as pod networks go, I installed Calico without any issues. You can then continue with the Sock Shop sample installation, and check all the expected services are running with:

kubectl get services --all-namespaces




DAX Studio 2.7.0 Released

Mon, 02 Oct 2017 03:57:10 GMT

Originally posted on: http://geekswithblogs.net/darrengosbell/archive/2017/10/02/dax-studio-2.7.0-released.aspx

The major change in this version is to the tracing engine. We’ve introduced a new trace type, made some changes to the way the tracing windows operate, and incorporated some enhancements to crash reporting and enabling logging. We've also finished moving off our old CodePlex home onto http://daxstudio.org

Changes to the way trace windows work

Previously, when you clicked on a trace button the window opened and the trace started, and when you switched off the trace the window closed. The running of the trace and the visibility of the window were closely linked. In v2.7 we have removed that tight linkage: when you click on a trace button the window opens and the trace still starts as it used to, but when you switch off the trace the window now remains open. The states below show the 2 new states (marked **) that trace windows now have.

v2.6 and earlier: Window Visible – Trace Running; Window Closed – Trace Stopped
v2.7 or later: Window Visible – Trace Running; Window Visible – Trace Paused **; Window Visible – Trace Stopped **; Window Closed – Trace Stopped

All trace windows now have a number of additional controls in their title area: one starts a paused or stopped trace, one pauses a running trace, one stops a running trace, and one clears any information captured in the current trace window. The tabs for the traces now also have an indicator to show their state, so that you can see the state of a given trace at a glance. In the image below you can see that the All Queries trace is stopped, while the Query Plan trace is running and the Server Timings trace is paused.

Note that while a trace is paused the server-side trace is still active; it’s just the DAX Studio UI that is paused, so expensive trace events like Query Plans can still have an impact on the server.

The other side effect of this change is that if a .dax file is saved while a trace window is open, when that file is re-opened the trace window will also re-open with the saved trace information, but now the trace will be in a stopped state (previously the trace would open and re-start). This prevents accidentally overwriting the saved information, and also means that the saved trace information will open even if you cancel the connection dialog (which would not happen in v2.6 or earlier; cancelling the connection would cause the saved trace information not to open).

The “All Queries” trace

The new trace type is called “All Queries”. It captures all queries against the current connection, regardless of the client application. This is useful for capturing queries from other client tools so that you can examine them. When the trace is active it will capture all events from any client tool. The screenshot below shows a capture session that was running against a Power BI Desktop file. When you hover over the queries, the tooltip shows you a larger preview of the query, and double clicking on the query text copies it to the editor.

The “All Queries” trace has a few additional buttons in the title bar area. One button in the trace window title will copy all the queries matching the current filter to the editor pane. The Filter button shows and hides the filter controls, and the clear filter button will clear any filter criteria from the filter controls. Filters can be set for a specific type of query (DAX/MDX/SQL/DMX), for a duration range, username, database or query. The filters all do a “contains” style search that matches if the text you type is anywhere in the field. Note: You cannot have the "All Queries"[...]



Huge XML document to CSV

Wed, 27 Sep 2017 02:35:48 GMT

Originally posted on: http://geekswithblogs.net/bconlon/archive/2017/09/27/huge-xml-document-to-csv.aspx

I have a huge XML document (over 2GB), exported from an automotive company’s system, from which I wanted to export data to a CSV file to import into a legacy reporting system. Yes, I know this is a bit back to front, but a job is a job.

My first thought was to write some C# code to extract the data, but this quickly became difficult as the XML document’s schema was non-trivial due to its use of substitution groups. I may have been able to use XSLT, but this is not a strong point for me, and I think I would have the same complexities with the XSD.

So instead I looked at using the Liquid Data Mapper which is part of Liquid Studio 2017. There is a video explaining how to go from CSV to XML, so I just reversed the Source and Target and this seemed to work OK.

In fact it was quite easy as a lot of the field names matched so the automatic ‘Connect Child Nodes’ connection matching function connected many of these for me.

However, when I clicked the run button I received an ‘Out of Memory’ error. Not so good.

I contacted the Liquid Technologies support team and they responded very quickly suggesting that I try running the generated transform .exe file from the command line rather than from inside Liquid Studio as shown in the video ‘Generate an Executable (.exe) Data Mapping Transform’.

The size of file that can be processed is ultimately dependent on the memory of the PC, but they say they have tested with files > 5GB on a 32GB PC and it works OK.

Anyway, I followed the instructions on the video and it worked straight away. This is very impressive and I would highly recommend using this approach for mapping data.
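For anyone who prefers to stay in code, the usual trick for a file this size is streaming: parse one record element at a time and clear it, so the whole document is never held in memory. A minimal sketch with hypothetical element and field names (the real schema in the post was far more complex, with substitution groups):

```python
import csv
import io
import xml.etree.ElementTree as ET

def xml_to_csv(xml_source, csv_file, record_tag, fields):
    """Stream <record_tag> elements from xml_source and write fields to CSV."""
    writer = csv.writer(csv_file)
    writer.writerow(fields)
    count = 0
    for event, elem in ET.iterparse(xml_source, events=("end",)):
        if elem.tag == record_tag:
            writer.writerow([elem.findtext(f, default="") for f in fields])
            elem.clear()  # release the element; this is what keeps memory flat
            count += 1
    return count

# Tiny in-memory demo; the real input would be the multi-GB export file.
sample = io.BytesIO(b"<cars><car><make>Ford</make><model>Focus</model></car>"
                    b"<car><make>Audi</make><model>A4</model></car></cars>")
out = io.StringIO()
rows = xml_to_csv(sample, out, "car", ["make", "model"])
```

This approach handles flat record layouts well; a schema with substitution groups, as described above, is exactly where a mapping tool earns its keep.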





Mesh WiFi 101

Sat, 23 Sep 2017 04:03:06 GMT

Originally posted on: http://geekswithblogs.net/MikeParks/archive/2017/09/23/244603.aspx

If you're tired of dealing with WiFi connectivity headaches, dead zones, and weak signals from your old outdated traditional router, upgrading to Mesh WiFi for your home network is worth checking out.

What is Mesh WiFi?

The best home network upgrade I've ever made. I immediately gained:
- Max WiFi signal across the entire house
- No more WiFi dead zones
- No more range extenders
- Gigabit WiFi speed capability
- Automatic security updates
- Parental controls to pause WiFi connections and filter content

The very first speed test I ran from my iPhone gave me this (I pay for 200Mbps through Spectrum):

(video)

What are my options?

A few of the top options for Mesh WiFi on the market right now are Eero, Google Wifi, and Orbi.

How easy is it to set up?

It took me 10 minutes to replace my old traditional router.

(video) (video) (video)

What do people think about it?

They love it. Check the reviews.

(video) (video) (video)

What do tech experts think about it?

Eero - https://www.tomsguide.com/us/eero-mesh-wifi-router,review-4302.html
Google Wifi - https://www.tomsguide.com/us/google-wifi,review-4307.html
Orbi - https://www.tomsguide.com/us/netgear-orbi,review-4263.html

Where can I buy it?

Amazon carries all three:
- eero Home WiFi System (1 eero + 2 eero Beacons) - TrueMesh Network Technology, Gigabit Speed, WPA2 Encryption, Replaces Wireless Router, Works with Alexa (2nd Gen.)
- Google Wifi system (set of 3) - Router replacement for whole home coverage
- NETGEAR Orbi Home WiFi System: AC3000 Tri Band Home Network with Router & Satellite Extender for up to 5,000sqft of WiFi coverage (RBK50), Works with Amazon Alexa

Hope that helps! Enjoy!

- Mike



Simplify WMI

Fri, 22 Sep 2017 04:14:27 GMT

Originally posted on: http://geekswithblogs.net/hroggero/archive/2017/09/22/simplify-wmi.aspx

Windows Management Instrumentation (WMI) is a key component of any Windows-based infrastructure. WMI helps companies prepare for disaster recovery, audit patch compliance, help with security management and also with general server inventory. However using WMI can be challenging for many reasons.

My blog has moved! Check out the rest of this article here: https://www.herveroggero.com/single-post/2017/09/21/Simplify-WMI

Thank you!




PowerShell: A curse in disguise

Fri, 22 Sep 2017 04:12:57 GMT

Originally posted on: http://geekswithblogs.net/hroggero/archive/2017/09/22/powershell-a-curse-in-disguise.aspx

I rarely think of technology as a problem, as most of the time people or processes are usually the root of organizational concerns. Many times will developers blame technology for being badly documented, or testers blame developers for not writing proper code, or database administrators blame IT for not having enough memory on a server. But there are exceptions; some technologies can become severe burdens on an organization, and PowerShell, in my opinion, could be one of them. But as you will see later, perhaps there is nothing wrong with the technology itself.

My blog has moved! To continue reading this post, please visit:  https://www.herveroggero.com/single-post/2017/09/22/PowerShell-A-Curse-in-Disguise 

Thank you




Database Enums

Wed, 20 Sep 2017 16:38:39 GMT

Originally posted on: http://geekswithblogs.net/TimothyK/archive/2017/09/20/database-enums.aspxSo is it better to store enumerated types in a database as a string/varchar or integer?  Well it depends, but in general as a string is your best bet.  In this post I explore the pros and cons of each. Lists verse Enums Before we get to that let’s first be clear that I’m talking about enums here, not lists.  Let me explain the difference. For example let’s say to have a list of a available weight units: pounds, kilograms, grams, short tons, metric tons, long tons, stones, ounces, and etcetera.  You might be able to design your database so that the list of possible weight units is in a table. Your application should not have any advanced knowledge of the weight units that are defined in this table.  The application reads this dynamic list at run time.  Any information needed about these weight units must come from the database.  This includes conversion factors, display names with multilingual support, flags to indicate in which situations it is appropriate to offer these units as a choice, mapping rules to external systems, and anything else the application may need.  There should be no requirements that pounds must exist as a row or that kilograms must always be ID #1. If you can soft code every behaviour of the weight unit required by applications in the database then the best design is to soft code this in a table.  Installers may add or remove rows from this table as needed.  This is a “List” of available values, which is defined in a database table. If the available weight units is hard coded into the application as an enum or strongly typed enum then this is an “enum” not a “list”.  The available values cannot change without changes to the programs.  Therefore the available values should not be a database table.  Adding support for new weight units requires code changes, so it is to the code that you must go to make this change.  
Having a database table falsely implies that the values are easy to change.

Lookup Times

The main argument for storing enum values as strings instead of integers is human readability. People looking at the database can easily see what the values mean. Which of these two tables is easier to read?

OrderID  Weight  WeightUnits
1        14      kg
2        23      lb
3        25      kg
4        11      lb
5        18      kg

OrderID  Weight  WeightUnitID
1        14      1
2        23      2
3        25      1
4        11      2
5        18      1

Storing the values as strings makes it immediately obvious to anyone looking at the database what the units are. There is no need to have external lookup tables, whether as database tables, database functions, or external documentation. It simply makes the system faster and easier to use.

The principle is similar to how applications should be developed so that users can get around with a minimum number of clicks. Adding just a few hundred milliseconds to the load time of a web page can cost thousands of dollars in lost sales. Even if the only people looking at the database are your own employees (i.e. a "captive audience"), the lookup time is still burdensome. Even if users have th[...]
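To make the distinction concrete, here is a minimal C# sketch of the enum-as-string approach; the WeightUnit type and its members are illustrative, not from any real schema:

```csharp
using System;

// Hard coded in the application: an enum, not a database-driven list.
public enum WeightUnit
{
    kg,
    lb
}

public static class Program
{
    // Reading: map the stored varchar value back to the strongly typed enum.
    // Enum.TryParse fails gracefully if the column contains an unexpected value.
    public static WeightUnit ParseUnit(string stored)
    {
        WeightUnit unit;
        if (Enum.TryParse(stored, out unit))
            return unit;
        throw new ArgumentException("Unknown weight unit: " + stored);
    }

    public static void Main()
    {
        // Writing: the enum's name is exactly what lands in the varchar column.
        Console.WriteLine(WeightUnit.kg);                     // prints "kg"

        // Round trip: the stored string maps straight back to the enum.
        Console.WriteLine(ParseUnit("lb") == WeightUnit.lb);  // prints "True"
    }
}
```

The round trip is lossless as long as the enum member names match the stored strings, which is the invariant this design depends on.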



Azure App Service Tools VSCode Extension

Tue, 19 Sep 2017 23:47:07 GMT

Originally posted on: http://geekswithblogs.net/KyleBurns/archive/2017/09/20/azure-app-service-tools-vscode-extension.aspx

Microsoft made their new Azure App Service Tools extension available today in the Visual Studio Marketplace. I had the opportunity to preview this extension and was very pleased. The process of provisioning and deploying the app service from VSCode was quite intuitive. I was able to "guess" my way through the process with my only wrong guess being how to start. I also very much appreciated that it generated reusable scripts (and opened them to make sure that you discovered them) as part of the process because I rarely work with projects where manual deployment from an IDE is desired. 

The one thing I would like to see changed is that creating the website did not subsequently deploy, or ask me if I wanted to do so. Since I triggered the tool in the context of working with a project in VSCode, it's not likely that I had decided to create an app service unrelated to that project; yet after going through the wizard and being offered a link to the provisioned site, I was presented with default content instead of my deployed app. I would like to have seen the wizard generate both the provisioning script and a deploy script, and then execute both. With only one thing that I would prefer to see implemented differently, I'd say: overall great job, and thank you, Microsoft, for continuing to make my job easier!



Fix VS2008 DPI issues on 4K display

Wed, 13 Sep 2017 08:47:50 GMT

Originally posted on: http://geekswithblogs.net/DougMoore/archive/2017/09/13/fix-vs2008-dpi-issues-on-4k-display.aspx

I don't know if you use 4K displays for your dev work (I sure do), but anyway...
...using VS2008 on a 4K display was driving me crazy!

Every time I attached to the target to download an image, the VS window and dialogs shrank to a size where the text was readable but extremely small and, for me, very painful to read.

Here is the fix, just in case you've encountered this as well.

Regards,
Doug

---

Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers]
"C:\\Program Files (x86)\\Microsoft Visual Studio 9.0\\Common7\\IDE\\devenv.exe"="~ DPIUNAWARE"





How to use dapper for MySQL in C#

Fri, 08 Sep 2017 08:24:25 GMT

Originally posted on: http://geekswithblogs.net/anirugu/archive/2017/09/08/how-to-use-dapper-for-mysql-in-c.aspx

A few days ago I was looking for a way to save time writing CRUD code, and I found one. It's called Dapper.

To generate C# classes from your database, you can use this code:

https://gist.github.com/anirugu/9fb82ce773c45578f42f7a6d899f3221

Then add Dapper to your project, along with Dapper.Contrib.

Now you don't need to open a DataReader and write the same reading and writing code again and again.

Dapper.Contrib gives you some cool functionality like Insert and Update. You still need to write your SQL queries, but the code becomes better and easier to maintain. Last year I was working on a C# project that became a full mess of this kind of code: one line of SELECT, INSERT, or UPDATE followed by a hundred lines just to read the results from a DataReader.

Dapper can save you a lot of time on those repeated chores, and it does so pretty well.

https://github.com/StackExchange/Dapper

https://stackoverflow.com/questions/tagged/dapper
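For reference, here is a minimal sketch of what the Dapper plus Dapper.Contrib workflow looks like. It assumes the Dapper, Dapper.Contrib, and MySql.Data NuGet packages; the users table, User class, and connection string are all hypothetical stand-ins:

```csharp
using System.Collections.Generic;
using System.Data;
using Dapper;
using Dapper.Contrib.Extensions;
using MySql.Data.MySqlClient;

// Dapper.Contrib attributes map the class to a table and mark the identity key.
[Table("users")]
public class User
{
    [Key]
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class Demo
{
    public static void Main()
    {
        using (IDbConnection db = new MySqlConnection(
            "Server=localhost;Database=test;Uid=root;Pwd=secret;"))
        {
            // Dapper.Contrib generates the INSERT statement for you
            // and returns the new identity value.
            long id = db.Insert(new User { Name = "Anirudha" });

            // Plain Dapper maps the result set straight to objects;
            // no manual DataReader loop needed.
            IEnumerable<User> users = db.Query<User>(
                "SELECT Id, Name FROM users WHERE Name = @Name",
                new { Name = "Anirudha" });

            // Dapper.Contrib can also generate the UPDATE for an entity.
            db.Update(new User { Id = (int)id, Name = "Anirudha G" });
        }
    }
}
```

The mapping is by column name, so as long as the query's column aliases match the property names, the hydration is automatic.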

Happy Coding!




Scaffolding failed, failed to build the project

Wed, 06 Sep 2017 11:03:25 GMT

Originally posted on: http://geekswithblogs.net/anirugu/archive/2017/09/06/scaffolding-failed-failed-to-build-the-project.aspx

If you are adding a view and you see this error, your project is not in a state where it can be compiled; a build would fail for the same reason.

To fix the issue, resolve the errors shown in the Error List of your current MVC project, then add the view again. It will work.

Happy coding!




Responsive Select2 in HTML

Tue, 05 Sep 2017 12:07:24 GMT

Originally posted on: http://geekswithblogs.net/anirugu/archive/2017/09/05/responsive-select2-in-html.aspx

For the last few years I have used select2 to build effective dropdowns in Bootstrap. Recently I have been trying to make it responsive, but out of the box it doesn't perform so well.

Here is a nice thread to make it work.

https://github.com/select2/select2/issues/3278

Using these tricks you can make it responsive, which is quite awesome. Still, it's missing something.

In my implementation I needed the control to fit in a small space, but to use the full available width when someone interacts with it.

So I looked at the HTML generated by the plugin. select2 generates a div just after the select element. If you try to set the width on the select element after select2 has been applied (or before), things will not work well.

The solution is to write CSS against the generated div. For example, if you write:

.select2-container {
    width: 90px;
}

it will make the select2 control 90px wide. That's how we modify the CSS of the generated HTML. But wait: what if I click on the select2 and want it to use more of the width available on the screen? Inspecting in more detail, I found that the plugin also generates the div that shows the search bar and result list you see in select2.

The dropdown that you see on the HTML page has these classes:

select2-dropdown select2-dropdown--below

So if you want it to have more width when someone opens the select, you need to set the width on this dropdown container div, for example:

.select2-dropdown {
    width: 180px;
}

So in this implementation I have a select2 that is 90px wide, but that uses 180px when someone opens it to type or select something.

Here is a quick demo for the post. https://codepen.io/anirugu/pen/YxMaqR

Thanks for reading my post.

Happy coding!




Code Search - Visual Studio Text Editor Extension - v1.1 Update

Fri, 01 Sep 2017 04:07:32 GMT

Originally posted on: http://geekswithblogs.net/MikeParks/archive/2017/09/01/244592.aspx

It's been about a month since I blogged about and released the Code Search - Visual Studio Text Editor Extension. The original release of this extension allowed you to use Microsoft's Code Search directly from the Text Editor in Visual Studio by highlighting text and clicking the Code Search context menu option. I've received a ton of positive feedback across the board for bridging the gap between the two platforms! I even got a couple of blog shoutouts from the editor in chief of Visual Studio Magazine and from Channel 9! (Much appreciated, guys!)

I made some minor updates to the code base for v1.1 to refactor the error handling, and added a backup call using LibGit2Sharp to grab the source control URL if the Team Explorer reflection hack fails. While making these minor updates, I remembered reading that (just like TFS/VSTS) GitHub also powers the back end of its code search engine with ElasticSearch. So, just for fun, I went ahead and added a little extra code to support GitHub's Repository Code Search as well.

Now, if the repository you're working from in Visual Studio is hosted on and connected to GitHub.com, you should be able to use GitHub's Code Search directly from the Text Editor in Visual Studio by highlighting text and clicking the same Code Search menu option.

Side note: with the original release there was a little confusion, from the description of the extension, as to what it actually does. I originally made this to work specifically with Microsoft's Code Search, which runs as an extension in either Team Foundation Server or Visual Studio Team Services. A couple of engineers stated they were trying to connect using Bitbucket and other Git providers, but Microsoft's Code Search is only built to run in TFS or VSTS. The search technology varies across source control platforms, as does the way the search URLs are constructed, so those exceptions are expected. You can read more about Microsoft's Code Search (powered by ElasticSearch) here. I updated the description of the extension to hopefully eliminate some of the confusion.

The extension is still all open source here on GitHub if you'd like to contribute, or if you need to step through any errors; feel free to check it out. Thanks again for all the positive feedback from all my friends out there! Also, never stop upgrading! Check these out:

Microsoft Surface Pro 4 (128 GB, 4 GB RAM, Intel Core i5)
Logitech G502 Proteus Spectrum RGB Tunable Gaming Mouse, 12,000 DPI On-The-Fly DPI Shifting
Google Wifi system (set of 3) - Router replacement for whole home coverage
Gaming Keyboard, UtechSmart Saturn RGB Visual Effect Wired Gaming Keyboard with Rainbow LED Backlit
Samsung 32GB BAR (METAL) USB 3.0 Flash Drive (MUF-32BA/AM)
HP Pavilion 22cwa 21.5-inch IPS LED Backlit Monitor
Mancro Business Water Resistant Polyester Laptop Backpack with USB Charging Port and Lock Fits Under 17-Inch Laptop and Notebook, Grey
Rexing V1 Car Dash Cam 2.4" LCD FHD 1080p 170 Degree Wide Angle Dashboard Camera Recorder with Sony Exmor Video Sensor, G-Sensor, WDR, Loop Recording [...]



Sql Server: How do I get the filegroup, data file name, size and path of a database?

Tue, 29 Aug 2017 13:54:23 GMT

Originally posted on: http://geekswithblogs.net/AskPaula/archive/2017/08/29/sql-server-how-do-i-get-the-filegroup-data-file.aspx

-- start with this:
SELECT
dbfile.name AS DatabaseFileName,
dbfile.size/128 AS FileSizeInMB,
sysFG.name AS FileGroupName,
dbfile.physical_name AS DatabaseFilePath
FROM
sys.database_files AS dbfile
INNER JOIN
sys.filegroups AS sysFG
ON
dbfile.data_space_id = sysFG.data_space_id

-- for a more general look by filegroup, try this:

with fileConfig as
(SELECT
dbfile.name AS DatabaseFileName,
(dbfile.size/128)  AS FileSizeInMB,
sysFG.name AS FileGroupName,
dbfile.physical_name AS DatabaseFilePath
FROM
sys.database_files AS dbfile
INNER JOIN
sys.filegroups AS sysFG
ON
dbfile.data_space_id = sysFG.data_space_id
)
select FileGroupName,
       sum(FileSizeInMB) as TotalFilegroupInMB
 from fileConfig
  group by FileGroupName
  order by FileGroupName



Microsoft Dynamics GP Stuck Batches

Mon, 28 Aug 2017 19:54:47 GMT

Originally posted on: http://geekswithblogs.net/RyanMcBee/archive/2017/08/28/244590.aspx

In all financial systems, Dynamics GP included, batches will from time to time get stuck during posting. I have experienced this problem in every financial system I have supported, and I've supported a few. In most systems, this is a problem you need to contact technical support to resolve.





SharePoint Saturday Charlotte–2017 Edition

Sun, 27 Aug 2017 06:56:11 GMT

Originally posted on: http://geekswithblogs.net/kjones/archive/2017/08/27/244588.aspx

Another SharePoint Saturday Charlotte happened yesterday (8/26/2017) and, IMHO, it was a great event.  A thank-you goes out to all of the speakers, sponsors, and event organizers!  A lot of behind-the-scenes work went into pulling it off.

As promised to those who attended my session…

How Office 365 has transformed Carolinas HealthCare System

Level: 200

Track: IT Pro, Business

Carolinas HealthCare System (CHS) is one of the largest non-profit healthcare systems in the US, with over 60,000 employees. In the last four years, CHS has upgraded Exchange and SharePoint to Office 365, which has introduced changes for both end users and the IT department. This session will cover the CHS upgrade / migration, how governance changed, and what operational changes have occurred along the way. Attendees will walk away from this session with both specific governance tactics they can implement, as well as, the reasoning behind them.

…here’s my PowerPoint slide deck.  I had way too much content, and will be reorganizing and focusing this presentation for the next conference where I’m delivering it, the SharePoint Engage conference in Raleigh.

If anyone has questions about our Office 365 (Exchange / SharePoint Online / OneDrive / Yammer) experience, whether around migration, adoption, or operations, please feel free to reach out to me via Twitter, LinkedIn, or email (kdjones74@gmail.com).




Visual Studio 2017 Version 15.4 Preview

Fri, 25 Aug 2017 07:34:52 GMT

Originally posted on: http://geekswithblogs.net/anirugu/archive/2017/08/25/visual-studio-2017-version-15.4-preview.aspx

This article is copied from the Visual Studio blog; it has been removed from its original location. I have no affiliation with Microsoft, so words like "I'm" and "we" below refer to the original poster on the Visual Studio blog, not to me.

We are looking to improve your experience on the Visual Studio Blog. It would be very helpful if you could share your feedback via this short survey that should take less than 2 minutes. Thanks!

I'm happy to announce that the first Preview of Visual Studio 2017 version 15.4 is now available! You can either download it from here, or, if you already have Preview installed, you'll receive a notification that the update is available. This latest Preview contains new tools and features in several key workloads such as Universal Windows Platform (UWP) development, .NET desktop development, and Mobile and Game development. It also continues our drive to improve and polish the fundamentals, such as productivity and reliability, and to address customer-reported bugs. Read the feature highlight summary below, and check out the Visual Studio 2017 version 15.4 Preview release notes for a detailed description of the new functionality contained in this Preview.

Universal Windows Platform Development – Windows Fall Creators Update

First, Visual Studio 2017 version 15.4 brings first-class support for UWP developers targeting the upcoming Windows Fall Creators Update. To start building apps against this new Windows update, first make sure you are enrolled in the Windows Insider Program. Once you are enrolled, install the latest pre-release version of the Windows Insider Preview SDK.

.NET Standard 2.0 Support

With the release of the Windows Fall Creators Update, you will be able to leverage the power of .NET Standard 2.0 when building UWP applications.
.NET Standard 2.0 brings an additional 20,000+ .NET APIs to Windows 10 UWP developers – many of which will be familiar to Windows Desktop (WPF, Windows Forms, etc.) developers. .NET Standard 2.0 also allows for easier sharing of code between various .NET project types, as project-to-project references or as NuGet packages. We are starting to see a variety of NuGet packages show up on NuGet.org with support for .NET Standard 2.0, all of which will be available for consumption inside UWP projects. To build UWP apps using the new .NET Standard 2.0 APIs, make sure you have the Windows Fall Creators Update Insider SDK Preview installed, and set the minimum version of your project to this version of the SDK.

Windows Application Packaging Project

In Visual Studio 2017 version 15.4 Preview, you will get a first peek at a new project template that enables Classic Windows Desktop apps created with .NET or C++ to be packaged inside an .appx package for easier distribution via side-loading or submission to the Windows Store. These templates work both for new Classic Windows Desktop projects and for existing projects.

XAML Edit & Continue Improvements

You can edit or remove XAML resources using XAML Edit & Continue. In addition, you can also add ControlTemplates to your XAML while using XAML Edit & Contin[...]



SQL Server: Why is it taking so long to take a database offline?

Fri, 11 Aug 2017 19:21:11 GMT

Originally posted on: http://geekswithblogs.net/AskPaula/archive/2017/08/11/sql-server-why-is-it-taking-so-long-to-take-again.aspx

There are probably open sessions on the database you are attempting to take offline. SQL Server is trying to roll back any in-flight workloads for that database.

Issue the sp_who2 command from a new connection (in the master database) and view what's active. If you see activity, let it complete; or, if you don't want the sessions to complete for whatever reason, issue the KILL command for the relevant SPID(s).

In the future, use this command:

ALTER DATABASE yourDBName SET OFFLINE WITH ROLLBACK IMMEDIATE;

To bring the db back online:

ALTER DATABASE yourDBName SET ONLINE




Azure Functions Visual Studio 2017 Development

Thu, 10 Aug 2017 02:15:50 GMT

Originally posted on: http://geekswithblogs.net/tmurphy/archive/2017/08/10/azure-functions-visual-studio-2017-development.aspx

The development tools and processes for Azure Functions are ever changing.  We started out only being able to create a function through the portal, which I did a series on.  We then got a template in VS2015, but it really didn't work very well.  Since then we have been able to create functions as Web Application libraries, and now we are close to the release of a VS2017 template.  This post will walk through the basics of using the VS2017 Preview with the Visual Studio Tools for Azure Functions, which you can download here.

Create New Project

To create the initial solution, open the New Project dialog, find the Azure Function project type, name your project, and click OK.

Create New Function

To add a function to your project, right-click the project and select New Item.  In the New Item dialog select Azure Function, provide a name for the class, and click Add.  The next dialog to appear is the New Azure Function dialog.  Here you select the function trigger type and its parameters.  In the example below, a timer trigger has been selected and a Cron schedule definition is automatically defined to execute every 5 minutes.  Also in this dialog you can set the name of the function.  When you compile, a folder with that name will be created in your bin directory, which is used later for deployment.

Add Bindings

With each generation of Azure Function development, the way you initially define bindings changes (even if they stay the same behind the scenes).  Initially you had to use the portal's Integrate page.  This had its advantages: it would visually prompt you for the type of binding and the parameters for that binding.  With the Visual Studio template, you instead add attributes to the Run method of your function class.
This requires that you know what the attribute names are, what parameters are available, and their proper values.  You can find a list of the main binding attributes here.  At compile time the attributes are used to generate a function.json file with your trigger and bindings definition.

Add NuGet Packages

If you are building functions in the portal, you have to create a project.json file that defines the packages you want to include, which requires that you know the format of the file.  Thankfully, with the Visual Studio template you can use the normal NuGet Package Manager.

Deploying

There are a couple of ways to deploy your solution.  In the end, a Function App is a specialized App Service.  This means you have the same deployment options: Visual Studio, PowerShell, or VSTS continuous deployment.  The main difference is that you don't have a web.config file, so you have to manage your app settings and connection strings through the portal.  These can be reached by following the Application Settings link under the Configured Features section of the Function App Overview page.

Summary

While creating Azure Functions still isn't a WYSIWYG turnkey process, the latest incarnation gives us an ALM-capable solution[...]
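To give a feel for the attribute-based binding style described above, here is a sketch of a timer-triggered function in the shape the 2017-era tooling produced (the class and function names are illustrative, and the Microsoft.NET.Sdk.Functions package is assumed):

```csharp
using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static class CleanupFunction
{
    // [FunctionName] sets the folder name emitted under bin\ at compile
    // time; the [TimerTrigger] attribute carries the Cron expression
    // (here: every 5 minutes), mirroring what the New Azure Function
    // dialog generates. At compile time these attributes are turned
    // into the function.json trigger/bindings definition.
    [FunctionName("CleanupFunction")]
    public static void Run(
        [TimerTrigger("0 */5 * * * *")] TimerInfo myTimer,
        TraceWriter log)
    {
        log.Info($"Cleanup ran at {DateTime.Now}");
    }
}
```

Additional input and output bindings would be added as further attributed parameters on Run, which is exactly the knowledge burden the post describes: you have to know each attribute's name and parameters up front.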