Mesh WiFi 101

Sat, 23 Sep 2017 04:03:06 GMT

Originally posted on:

If you're tired of dealing with WiFi connectivity headaches, dead zones, and weak signals from your old, outdated traditional router, upgrading to Mesh WiFi for your home network is worth checking out.

What is Mesh WiFi?

The best home network upgrade I've ever made. I immediately gained:

- Max WiFi signal across the entire house
- No more WiFi dead zones
- No more range extenders
- Gigabit WiFi speed capability
- Automatic security updates
- Parental controls to pause WiFi connections and filter content

The very first speed test I ran from my iPhone gave me this (I pay for 200Mbps through Spectrum):

What are my options?

A few of the top options for Mesh WiFi on the market right now are Eero, Google Wifi, and Orbi.

How easy is it to set up?

It took me 10 minutes to replace my old traditional router.

What do people think about it?

They love it. Check the reviews for Eero, Google Wifi, and Orbi.

What do tech experts think about it?

Eero -,review-4302.html
Google Wifi -,review-4307.html
Orbi -,review-4263.html

Where can I buy it?

Here's all the Amazon Mesh Wifi search results goodness:

- eero Home WiFi System (1 eero + 2 eero Beacons) - TrueMesh Network Technology, Gigabit Speed, WPA2 Encryption, Replaces Wireless Router, Works with Alexa (2nd Gen.)
- Google Wifi system (set of 3) - Router replacement for whole home coverage
- NETGEAR Orbi Home WiFi System: AC3000 Tri Band Home Network with Router & Satellite Extender for up to 5,000 sq ft of WiFi coverage (RBK50), Works with Amazon Alexa

Hope that helps! Enjoy!

- Mike [...]

Simplify WMI

Fri, 22 Sep 2017 04:14:27 GMT

Originally posted on:

Windows Management Instrumentation (WMI) is a key component of any Windows-based infrastructure. WMI helps companies prepare for disaster recovery, audit patch compliance, assist with security management, and maintain a general server inventory. However, using WMI can be challenging for many reasons.
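For a taste of the kind of inventory work the paragraph above describes, here is a minimal hypothetical C# sketch using the System.Management WMI classes (Windows-only; the query and class names are illustrative, not from the original article):

```csharp
using System;
using System.Management; // reference System.Management.dll (Windows only)

class WmiInventory
{
    static void Main()
    {
        // Query basic OS details from the local machine for a server inventory.
        var searcher = new ManagementObjectSearcher(
            "SELECT Caption, Version, LastBootUpTime FROM Win32_OperatingSystem");

        foreach (ManagementObject os in searcher.Get())
            Console.WriteLine($"{os["Caption"]} (build {os["Version"]})");
    }
}
```

The same pattern works against remote machines by passing a ManagementScope for the target host, which is what makes WMI useful for fleet-wide auditing.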

My blog has moved! Check out the rest of this article here:

Thank you!


PowerShell: A curse in disguise

Fri, 22 Sep 2017 04:12:57 GMT

Originally posted on:

I rarely think of technology as a problem, as people or processes are usually the root of organizational concerns. Developers often blame technology for being badly documented, testers blame developers for not writing proper code, and database administrators blame IT for not having enough memory on a server. But there are exceptions; some technologies can become severe burdens on an organization, and PowerShell, in my opinion, could be one of them. But as you will see later, perhaps there is nothing wrong with the technology itself.

My blog has moved! To continue reading this post, please visit: 

Thank you


Database Enums

Wed, 20 Sep 2017 16:38:39 GMT

Originally posted on:

Is it better to store enumerated types in a database as a string/varchar or as an integer? Well, it depends, but in general a string is your best bet. In this post I explore the pros and cons of each.

Lists versus Enums

Before we get to that, let's first be clear that I'm talking about enums here, not lists. Let me explain the difference. For example, let's say you have a list of available weight units: pounds, kilograms, grams, short tons, metric tons, long tons, stones, ounces, etc. You might be able to design your database so that the list of possible weight units is in a table. Your application should not have any advance knowledge of the weight units defined in this table. The application reads this dynamic list at run time. Any information needed about these weight units must come from the database. This includes conversion factors, display names with multilingual support, flags to indicate in which situations it is appropriate to offer these units as a choice, mapping rules to external systems, and anything else the application may need. There should be no requirement that pounds must exist as a row or that kilograms must always be ID #1.

If you can soft-code every behaviour of the weight unit required by applications in the database, then the best design is to soft-code this in a table. Installers may add or remove rows from this table as needed. This is a "list" of available values, defined in a database table.

If the available weight units are hard-coded into the application as an enum or strongly typed enum, then this is an "enum", not a "list". The available values cannot change without changes to the programs. Therefore the available values should not be a database table. Adding support for new weight units requires code changes, so it is to the code that you must go to make this change. Having a database table falsely implies that the values are easy to change.
Lookup Times

The main argument for storing enum values as strings instead of integers is human readability. People looking at the database can easily see what the values mean. Which of these two tables is easier to read?

OrderID  Weight  WeightUnits
1        14      kg
2        23      lb
3        25      kg
4        11      lb
5        18      kg

OrderID  Weight  WeightUnitID
1        14      1
2        23      2
3        25      1
4        11      2
5        18      1

Storing the values as a string makes it immediately obvious to anyone looking at the database what the units are. There is no need to have external lookup tables, whether as database tables, database functions, or external documentation. It simply makes the system faster and easier to use.

The principle is similar to how applications should be developed so that users can get around with a minimum number of clicks. Adding just a few hundred milliseconds to the load time of a web page can cost thousands of dollars in lost sales. Even if the only people looking at the database are your own employees (i.e. a "captive audience"), the lookup time is still burdensome. Even if users have the IDs memorized, there is still a non-zero amount of time it takes for the mental jumping jacks to make this conversion. There is a real cost to this lookup. Systems should be designed for human readability, not computer readability.

Validation

String enum columns usually offer better validation and data quality. If someone enters "jf" instead of "kg" for the units, it is very obvious that a mistake has been made. Whereas fat fingers on the number pad entering pres[...]
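To make the enum-versus-string trade-off concrete, here is a small hypothetical C# sketch (not from the original post) that maps a hard-coded enum onto the string codes stored in a WeightUnits column; note how a typo such as "jf" fails loudly instead of silently storing bad data:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical enum and codes for illustration ("kg"/"lb" as in the post's table).
public enum WeightUnit { Kilogram, Pound }

public static class WeightUnitDb
{
    private static readonly Dictionary<WeightUnit, string> ToCode =
        new Dictionary<WeightUnit, string>
        {
            { WeightUnit.Kilogram, "kg" },
            { WeightUnit.Pound,    "lb" },
        };

    // Enum -> string code written to the WeightUnits column.
    public static string ToDbValue(WeightUnit unit) => ToCode[unit];

    // String code read from the column -> enum, with validation.
    public static WeightUnit FromDbValue(string code)
    {
        foreach (var pair in ToCode)
            if (pair.Value == code) return pair.Key;
        // A fat-fingered "jf" throws here rather than becoming bad data.
        throw new ArgumentException($"Unknown weight unit code: {code}");
    }
}

public static class Demo
{
    public static void Main()
    {
        Console.WriteLine(WeightUnitDb.ToDbValue(WeightUnit.Kilogram)); // kg
        Console.WriteLine(WeightUnitDb.FromDbValue("lb"));              // Pound
    }
}
```

The integer-ID design offers no such guard: any small integer is a "valid" unit as far as the column is concerned.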

Azure App Service Tools VSCode Extension

Tue, 19 Sep 2017 23:47:07 GMT

Originally posted on:

Microsoft made their new Azure App Service Tools extension available today in the Visual Studio Marketplace. I had the opportunity to preview this extension and was very pleased. The process of provisioning and deploying the app service from VSCode was quite intuitive. I was able to "guess" my way through the process with my only wrong guess being how to start. I also very much appreciated that it generated reusable scripts (and opened them to make sure that you discovered them) as part of the process because I rarely work with projects where manual deployment from an IDE is desired. 

The one thing I would like to see changed is that creation of the website did not subsequently deploy, or ask me if I wanted to do so. Since I triggered the tool in the context of working with a project in VSCode, it's not likely that I had decided to create an app service unrelated to that project, but after going through the wizard and being offered a link to the provisioned site, I was presented with default content instead of my deployed app. I would like to have seen the wizard generate both a provisioning script and a deploy script, and then execute both. With only one thing I would prefer to see implemented differently, I'd say: overall, great job, and thank you Microsoft for continuing to make my job easier!

Fix VS2008 DPI issues on 4K display

Wed, 13 Sep 2017 08:47:50 GMT

Originally posted on:

I don't know if you guys use 4K displays for your dev work (I sure do), but anyways...
...using VS2008 on a 4K display was driving me crazy!

Every time I attached to the target to download an image, the VS window and dialogs shrunk to a size where the text was readable but extremely small and very painful [for me] to read.

Here is the fix, just in case you've encountered this as well.



Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers]
"C:\\Program Files (x86)\\Microsoft Visual Studio 9.0\\Common7\\IDE\\devenv.exe"="~ DPIUNAWARE"


How to use dapper for MySQL in C#

Fri, 08 Sep 2017 08:24:25 GMT

Originally posted on:

A few days ago I was looking for a way to save time writing CRUD code. I found a solution: it's called Dapper.

For creating C# classes from the database you can use this code.

Later, add Dapper to your project, along with Dapper.Contrib.

Now you don't need to open a DataReader and write the same reading and writing code again and again.

Dapper.Contrib gives you some cool functionality like Insert and Update. You still need to write your SQL queries, but the result is cleaner and easier to maintain. Last year I was working on a C# project that became a full mess of this kind of code: one line of SELECT, INSERT, or UPDATE, followed by a hundred lines just to read the results from a DataReader.

Dapper can save you a lot of time doing those same repeated things, and it does them pretty well.
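As a rough illustration of that workflow, here is a hedged C# sketch; the connection string, the Product class, and the table name are hypothetical, and it assumes the Dapper, Dapper.Contrib, and MySql.Data NuGet packages are installed:

```csharp
using System.Linq;
using Dapper;
using Dapper.Contrib.Extensions;
using MySql.Data.MySqlClient;

// Hypothetical table mapping for illustration only.
[Table("product")]
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
}

class Demo
{
    static void Main()
    {
        // Connection string is a placeholder; point it at your MySQL server.
        using (var conn = new MySqlConnection("server=localhost;database=shop;uid=user;pwd=pass"))
        {
            // Dapper maps the result set straight onto Product: no DataReader loop.
            var cheap = conn.Query<Product>(
                "SELECT Id, Name, Price FROM product WHERE Price < @max",
                new { max = 10m }).ToList();

            // Dapper.Contrib generates the INSERT for you.
            conn.Insert(new Product { Name = "Widget", Price = 5m });
        }
    }
}
```

You still own the SELECT statements, but the hundred lines of reader code collapse into one Query<T> call.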

Happy Coding!


Scaffolding failed, failed to build the project

Wed, 06 Sep 2017 11:03:25 GMT

Originally posted on:

If you are adding a view and you see this error, your project is not in a state where it can be compiled. If you compile your software, it will fail.

To fix the issue, fix the errors in the Error List of your current MVC project, then add the view; it will work.

Happy coding!


Responsive Select2 in HTML

Tue, 05 Sep 2017 12:07:24 GMT

Originally posted on:

For the last few years I have used select2 to build effective dropdowns in Bootstrap. Recently I have been trying to make it responsive, but it doesn't perform so well.

Here is a nice thread to make it work.

Using these tricks you can make it responsive, which is quite awesome. Still, it's missing something.

In my implementation I need the control to fit in a small space, but use the full available width when someone interacts with it.

So I looked at the HTML generated by the plugin. The select2 plugin generates some HTML divs just after the select. If you try to set the width on the select after generation, or before applying select2, things will not work well.

The solution is to write CSS for the generated div. For example, if you write




it will make the select2 90px wide. That's the way we modify the CSS of the generated HTML. But wait: what about when I click on the select2 and want to use more of the width available on the screen? I inspected it in more detail and found that the plugin generates the div that shows the search bar and list you see in select2.

The dropdown that you see on the HTML page has these classes:

select2-dropdown select2-dropdown--below

so if you want to give it more width (to be shown when someone opens the select), you need to set the width on these dropdown container divs, for example




So in this implementation I have a select2 that is 90px wide but uses 180px of width when someone types into it or makes a selection.
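The two stripped CSS snippets above can be sketched roughly as follows; the .select2-container selector is an assumption about the wrapper div the plugin generates, while the dropdown classes are the ones named in the post:

```css
/* Collapsed width for the closed control (the post's 90px example). */
.select2-container {
    width: 90px !important;
}

/* Wider open dropdown (the post's 180px example). */
.select2-dropdown.select2-dropdown--below {
    width: 180px !important;
}
```

Because the dropdown div is generated outside the select, its width can be set independently of the collapsed control.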

Here is a quick demo for the post.

Thanks for reading my post.

Happy coding!


Code Search - Visual Studio Text Editor Extension - v1.1 Update

Fri, 01 Sep 2017 04:07:32 GMT

Originally posted on:

It's been about a month since I blogged and released the Code Search - Visual Studio Text Editor Extension. The original release of this extension allowed you to use Microsoft's Code Search directly from the Text Editor in Visual Studio by highlighting text and clicking the Code Search context menu option. I've received a ton of positive feedback across the board for bridging the gap between the two platforms! I even got a couple blog shoutouts from the editor in chief of Visual Studio Magazine and Channel 9! (Much appreciated, guys!)

I made some minor updates to the code base for v1.1 to refactor the error handling, and added a backup call using LibGit2Sharp to grab the source control URL if the Team Explorer reflection hack fails. While making these minor updates, I remembered reading about how (just like TFS/VSTS) GitHub also powers the back end of their code search engine with Elasticsearch. So, just for fun, I went ahead and added a little extra code to support GitHub's Repository Code Search as well.

Now, if the repository you're working from in Visual Studio is hosted on and connected to GitHub, you should be able to use GitHub's Code Search directly from the Text Editor in Visual Studio by highlighting text and clicking on the same Code Search menu option. Looks like this:

Side note: With the original release there was a little confusion from the description of the extension as to what it actually does. I originally made this to work specifically for Microsoft's Code Search, which runs as an extension either in Team Foundation Server or Visual Studio Team Services. A couple of engineers stated they were trying to connect using BitBucket and other Git providers, but Microsoft's Code Search is an extension that is only built to run in TFS or VSTS. The search technology varies across source control platforms, as does the way the search URLs are constructed, so the exceptions are expected.
You can read more about Microsoft's Code Search (powered by Elasticsearch) here. I updated the description of the extension to hopefully eliminate some of the confusion. The extension is still all open source on GitHub if you'd like to contribute, or if you need to step through any errors; feel free to check it out. Thanks again for all the positive feedback from all my friends out there! Also, never stop upgrading! Check these out:

- Microsoft Surface Pro 4 (128 GB, 4 GB RAM, Intel Core i5)
- Logitech G502 Proteus Spectrum RGB Tunable Gaming Mouse, 12,000 DPI On-The-Fly DPI Shifting
- Google Wifi system (set of 3) - Router replacement for whole home coverage
- Gaming Keyboard, UtechSmart Saturn RGB Visual Effect Wired Gaming Keyboard with Rainbow LED Backlit
- Samsung 32GB BAR (METAL) USB 3.0 Flash Drive (MUF-32BA/AM)
- HP Pavilion 22cwa 21.5-inch IPS LED Backlit Monitor
- Mancro Business Water Resistant Polyester Laptop Backpack with USB Charging Port and Lock, Fits Under 17-Inch Laptop and Notebook, Grey
- Rexing V1 Car Dash Cam 2.4" LCD FHD 1080p 170 Degree Wide Angle Dashboard Camera Recorder with Sony Exmor Video Sensor, G-Sensor, WDR, Loop Recording [...]

Sql Server: How do I get the filegroup, data file name, size and path of a database?

Tue, 29 Aug 2017 13:54:23 GMT

Originally posted on:

-- start with this:
SELECT AS DatabaseFileName,
       dbfile.size/128 AS FileSizeInMB, AS FileGroupName,
       dbfile.physical_name AS DatabaseFilePath
  FROM sys.database_files AS dbfile
  JOIN sys.filegroups AS sysFG
    ON dbfile.data_space_id = sysFG.data_space_id

-- for a more general look by filegroup, try this:

with fileConfig as
(SELECT AS DatabaseFileName,
        (dbfile.size/128) AS FileSizeInMB, AS FileGroupName,
        dbfile.physical_name AS DatabaseFilePath
   FROM sys.database_files AS dbfile
   JOIN sys.filegroups AS sysFG
     ON dbfile.data_space_id = sysFG.data_space_id)
select FileGroupName,
       sum(FileSizeInMB) as TotalFilegroupInMB
  from fileConfig
 group by FileGroupName
 order by FileGroupName

Microsoft Dynamics GP Stuck Batches

Mon, 28 Aug 2017 19:54:47 GMT

Originally posted on:

In all financial systems, Dynamics GP included, from time-to-time batches will get stuck during posting. I have experienced this problem in every financial system I have supported, and I’ve supported a few. In most systems, this problem is something you need to contact technical support to resolve.


SharePoint Saturday Charlotte–2017 Edition

Sun, 27 Aug 2017 06:56:11 GMT

Originally posted on:

Another SharePoint Saturday Charlotte happened yesterday (8/26/2017) and, IMHO, it was a great event. A big thank you goes out to all of the speakers, sponsors, and event organizers! A lot of behind-the-scenes work went into pulling it off.

As promised to those who attended my session…

How Office 365 has transformed Carolinas HealthCare System

Level: 200

Track: IT Pro, Business

Carolinas HealthCare System (CHS) is one of the largest non-profit healthcare systems in the US, with over 60,000 employees. In the last four years, CHS has upgraded Exchange and SharePoint to Office 365, which has introduced changes for both end users and the IT department. This session will cover the CHS upgrade/migration, how governance changed, and what operational changes have occurred along the way. Attendees will walk away from this session with specific governance tactics they can implement, as well as the reasoning behind them.

…here’s my PowerPoint slide deck. I had way too much content, and will be reorganizing/focusing this presentation for the next conference where I’m delivering it: the SharePoint Engage conference in Raleigh.

If anyone has questions about our Office 365 (Exchange / SharePoint Online / OneDrive / Yammer) experience, whether around migration, adoption, or operations, please feel free to reach out to me via Twitter, LinkedIn, or email (


Visual Studio 2017 Version 15.4 Preview

Fri, 25 Aug 2017 07:34:52 GMT

Originally posted on:

This article is copied from the Visual Studio blog; it has been removed from its original location. I have no affiliation with Microsoft, so words like "I'm" and "we" below do not refer to me; they refer to the original poster on the Visual Studio blog.

We are looking to improve your experience on the Visual Studio Blog. It would be very helpful if you could share your feedback via this short survey that should take less than 2 minutes. Thanks!

I'm happy to announce that the first Preview of Visual Studio 2017 version 15.4 is now available! You can either download it from here, or if you already have Preview installed, you'll receive a notification that the update is available. This latest Preview contains new tools and features in several key workloads such as Universal Windows Platform (UWP) development, .NET desktop development, and mobile and game development. It also continues our drive to improve and polish the fundamentals such as productivity and reliability, and to address customer-reported bugs. Read the feature highlight summary below, and check out the Visual Studio 2017 version 15.4 Preview release notes for a detailed description of the new functionality contained in this Preview.

Universal Windows Platform Development - Windows Fall Creators Update

First, Visual Studio 2017 version 15.4 brings first-class support for UWP developers targeting the upcoming Windows Fall Creators Update. To start building apps against this new Windows update, first make sure you are enrolled in the Windows Insider Program. Once you are enrolled, install the latest pre-release version of the Windows Insider Preview SDK.

.NET Standard 2.0 Support

With the release of the Windows Fall Creators Update, you will be able to leverage the power of .NET Standard 2.0 when building UWP applications. .NET Standard 2.0 brings an additional 20,000+ .NET APIs to Windows 10 UWP developers, many of which will be familiar to Windows desktop (WPF, Windows Forms, etc.) developers.
.NET Standard 2.0 also allows for easier sharing of code between various .NET project types as project-to-project references or as NuGet packages. We are starting to see a variety of NuGet packages show up on with support for .NET Standard 2.0, all of which will be available for consumption inside UWP projects. To build UWP apps using the new .NET Standard 2.0 APIs, make sure you have the Windows Fall Creators Update Insider SDK Preview installed, and set the minimum version of your project to this version of the SDK.

Windows Application Packaging Project

In Visual Studio 2017 version 15.4 Preview, you will get a first peek at a new project template that enables classic Windows desktop apps created with .NET or C++ to be packaged inside an .appx package for easier distribution via side-loading or submission to the Windows Store. These templates work for both new classic Windows desktop projects and existing projects.

XAML Edit & Continue Improvements

You can edit or remove XAML resources using XAML Edit & Continue. In addition, you can also add ControlTemplates to your XAML while using XAML Edit & Continue. To leverage these new features, make sure you are running the Windows Fall Creators Update Preview.

Mobile and Game Development

Unity

In Visual Studio 2017 version 15.4 Preview we have made improvements and bug fixes in the tooling for Unity. There is better support for the latest released Unity 2017.1 runtime. This Preview also supports user-defined managed assemblies, a feature coming in Unity 2017.2 that helps to drastically minimize script compilation times of projects. To help with[...]

SQL Server: Why is it taking so long to take a database offline?

Fri, 11 Aug 2017 19:21:11 GMT

Originally posted on:

There are probably open sessions on the database you are attempting to bring offline. SQL Server is trying to roll back any existing workloads in-flight for that database. 

Issue the sp_who2 command from a new connection (master db) and view what's active. If you see activity, let it complete--or if you don't want the sessions to complete for whatever reason, issue the kill command for the spid(s).  

In the future, use this command:

ALTER DATABASE [YourDatabase] SET OFFLINE WITH ROLLBACK IMMEDIATE;  -- rolls back open transactions instead of waiting

To bring the db back online:

ALTER DATABASE [YourDatabase] SET ONLINE;

Azure Functions Visual Studio 2017 Development

Thu, 10 Aug 2017 02:15:50 GMT

Originally posted on:

The development tools and processes for Azure Functions are ever changing. We started out only being able to create a function through the portal, which I did a series on. We then got a template in VS2015, but it really didn't work very well. They have since made it possible to create functions as Web Application libraries, and now we are close to the release of a VS2017 template. This post will walk through the basics of using the VS2017 Preview with the Visual Studio Tools for Azure Functions, which you can download here.

Create New Project

To create the initial solution, open up the New Project dialog, find the Azure Function project type, name your project, and click OK.

Create New Function

To add a function to your project, right-click the project and select New Item. In the New Item dialog select Azure Function, provide a name for the class, and click Add. The next dialog to appear is the New Azure Function dialog. Here you select the function trigger type and its parameters. In the example below, a timer trigger has been selected and a cron schedule definition is automatically defined to execute every 5 minutes. Also in this dialog you can set the name of the function. When you compile, a folder will be created with that name in your bin directory, which will be used later for deployment.

Add Bindings

With each generation of Azure Function development, the way you initially define bindings changes (even if they stay the same behind the scenes). Initially you had to use the portal's Integrate page. This had its advantages: it would visually prompt you for the type of binding and the parameters for that binding. With the Visual Studio template you have to add attributes to the Run method of your function class. This requires that you know what the attribute names are, what parameters are available, and their proper values. You can find a list of the main binding attributes here.
At compile time the attributes will be used to generate a function.json file with your trigger and bindings definition.

Add NuGet Packages

If you are building functions in the portal, you have to create a project.json file that defines the packages you want to include. This requires that you know the format of the file. Thankfully, with the Visual Studio template you can use the normal NuGet Package Manager.

Deploying

There are a couple of ways to deploy your solution. In the end, a Function App is a specialized App Service. This means you have the same deployment options of Visual Studio, PowerShell, or VSTS continuous deployment. The main difference is that you don't have a web.config file and have to manage your app settings and connection strings through the portal. These can be reached by following the Application Settings link under the Configured Features section of the Function App Overview page.

Summary

While creating Azure Functions still isn't a WYSIWYG turnkey process, the latest incarnation gives us an ALM-capable solution. I believe this is the development approach that will stabilize for the foreseeable future, and anyone who is creating Functions should invest in learning it. [...]
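As a hedged sketch of what a generated timer-triggered function class looked like in this era of the tooling (the class name and the every-5-minutes cron schedule are illustrative; the attributes are what get compiled into function.json):

```csharp
using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static class TimerPoc
{
    // [FunctionName] sets the folder name created under bin at compile time;
    // [TimerTrigger] holds the cron schedule chosen in the New Azure Function dialog.
    [FunctionName("TimerPoc")]
    public static void Run([TimerTrigger("0 */5 * * * *")] TimerInfo timer, TraceWriter log)
    {
        log.Info($"Timer trigger fired at {DateTime.UtcNow:o}");
    }
}
```

Swapping the TimerTrigger attribute for, say, a QueueTrigger or HttpTrigger attribute is how you change the binding without ever touching function.json directly.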

SQL Server: How can I get a distinct count(*) with multiple columns?

Wed, 09 Aug 2017 11:45:07 GMT

Originally posted on:

To get a count(*) of distinct column combinations, do the count(*) over a distinct select statement.

SELECT count(*)
  FROM (SELECT DISTINCT Column1, Column2   -- substitute your own column list
          FROM YourTable
       ) x

Query Application Insights REST API To Create Custom Notifications

Fri, 04 Aug 2017 00:35:52 GMT

Originally posted on:

Application Insights is one of those tools that has been around for a number of years now, but is finally getting understood as more companies move to Azure as a cloud solution. It has become an amazing tool for monitoring the performance of your application, but it can also work as a general logging platform, as I have posted before.

Now that you are capturing all this information, how can you leverage it? Going to the Azure portal whenever you want an answer is time consuming. It would be great if you could automate this process. Of course there are a number of metrics that you can create alerts for directly via the portal, but what if you want a non-standard metric, or want to do something besides just send an alert?

Fortunately, Microsoft has a REST API in beta for Application Insights. It allows you to check standard metrics as well as run custom queries as you do in the Analytics portal. Let's explore how to use this API.

This post will show how to create a demo that implements an Azure Function which calls the Application Insights REST API and then sends the results out using SendGrid. I created them with the VS2017 Preview and the new Azure Functions templates.

Generate Custom Events

First we need some data to work with. The simplest way is to leverage the TrackEvent and TrackException methods of the Application Insights API. In order to do this you first need to set up a TelemetryClient. The code below I have as part of the class-level variables.
private static string appInsightsKey = System.Environment.GetEnvironmentVariable("AppInsightKey", EnvironmentVariableTarget.Process);
private static TelemetryClient telemetry = new TelemetryClient();
private static string key = TelemetryConfiguration.Active.InstrumentationKey = appInsightsKey; //System.Environment.GetEnvironmentVariable("AN:InsightKey", EnvironmentVariableTarget.Process);

After that it is simple to call the TrackEvent method on the TelemetryClient object to log an activity in your code (be aware it may take 5 minutes for an event to show up in Application Insights).

telemetry.TrackEvent($"This is a POC event");

Create a VS2017 Function Application

I will have another post on the details in the future, but if you have Visual Studio 2017 Preview 15.3.0 installed you will be able to create an Azure Functions project. Right-click the project, select the New Item context menu option, and select Azure Function as shown below. On the New Azure Function dialog select TimerTrigger and leave the remaining options as default.

Call Application Insights REST API

Once there are events in the customEvents collection we can write a query and execute it against the Application Insights REST API. To accomplish this the example uses a simple HttpClient call. The API page for Application Insights can be found here and contains the URLs and formats for each call type. We will be using the Query API scenario, which is set up with a couple of variables.

private const string URL = "{0}/query?query={1}";
private const string query = "customEvents | where timestamp >= ago(20m) and name contains \"This is a POC event\" | count";

The call to the service is a common pattern using the HttpClient as shown below. Add this to the Run method of your new function.

HttpClient client = new Htt[...]
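Since the post's HttpClient call is cut off above, here is a hedged, self-contained C# sketch of the kind of Query API request it describes; the endpoint URL and the x-api-key header are assumptions based on the Application Insights Query API, and appId/apiKey stand in for values created in the portal, not code from the original post:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

static class AppInsightsQuery
{
    // Assumed Query API endpoint format; the post's own URL constant was truncated.
    private const string UrlFormat =
        "{0}/query?query={1}";

    public static async Task<string> RunQueryAsync(string appId, string apiKey, string query)
    {
        using (var client = new HttpClient())
        {
            // The Query API authenticates with an API key header created in the portal.
            client.DefaultRequestHeaders.Add("x-api-key", apiKey);

            var url = string.Format(UrlFormat, appId, Uri.EscapeDataString(query));
            var response = await client.GetAsync(url);
            response.EnsureSuccessStatusCode();

            // JSON result table, ready to format and hand to SendGrid.
            return await response.Content.ReadAsStringAsync();
        }
    }
}
```

The returned JSON contains a tables array whose rows hold the query result, which the function can then format into an email body.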

New Visual Studio 2017 Extension - Code Search (VS Text Editor)

Thu, 03 Aug 2017 12:55:22 GMT

Originally posted on:

It's been quite some time since I've cracked open Visual Studio to create an extension freebie for all my fellow engineers. If you're keeping up with the latest and greatest in the Microsoft engineering world, you've probably already heard about Code Search, which delivers a one-stop solution for all your code exploration needs by allowing you to search across multiple projects and repositories at an unmatchable, incredibly fast speed (shoutout to Elasticsearch, which supports the back end; amazing job, guys).

I've been using this tool non-stop at work lately to explore the company code base while integrating new features. The value that it provides is extraordinary! The only downside I stumbled upon was the interruption to my coding flow while working inside Visual Studio: opening a new Chrome tab, navigating to the Code Search web page, then copying/pasting or typing in the keyword I wanted to search for, which usually already existed in the code base I was working on in the Visual Studio Text Editor.

To work around this, I created the Code Search - Visual Studio Text Editor Extension. This extension allows you to use Microsoft's Code Search directly from the Text Editor in Visual Studio by highlighting text and clicking the Code Search context menu option. It's been a pretty handy time saver for me and has minimized interruption in my code flow. I passed the Debug version around to a few co-workers and everyone's been enjoying it so far. If you're running Code Search + Visual Studio, feel free to check it out if you'd like to link the two platforms together! One thing to note: I only wired up the context menu GUIDs to add the option to the most common text editors. The code base is published on GitHub, so if you'd like to make any contributions, let me know!
Code Search - Install:
Code Search - Visual Studio Text Editor - Install:
Repo:

Thanks everyone!

Also, if you know me, I like to try out everything from Boosted Boards to drones to Snapchat Spectacles to basically any new tech I can hook into. I've been really into home automation lately; might blog about some of that soon! Check out some of my hand-picked recent Amazon goodies, just for my engineering friends. Enjoy:

- (47% off right now) TeckNet Gryphon Pro LED Illuminated Programmable Gaming Keyboard and Mouse set, Water-Resistant Design, US layout
- (60% off right now) Anker 60W 10-Port USB Wall Charger, PowerPort 10 for iPhone 7 / 6s / Plus, iPad Pro / Air 2 / mini, Galaxy S7 / S6 / Edge / Plus, Note 5 / 4, LG, Nexus, HTC and More
- iRobot Roomba 690 Wi-Fi Connected Robotic Vacuum Cleaner, Works with Amazon Alexa
- Philips Hue 464479 60W Equivalent White and Color Ambiance A19 Starter Kit, 3rd Generation, Works with Amazon Alexa
- TP-Link Smart Wi-Fi Light Switch, No Hub Required, Single Pole, Control Your Fixtures From Anywhere, Works with Alexa (HS200)
- Philips Hue Lightstrip Plus Dimmable LED Smart Light (Compatible with Amazon Alexa, Apple HomeKit, and Google Assistant)
- TP-Link 8-Port Gigabit Ethernet Plastic Desktop Switch ([...]

Ok it has been about 7 years since my last post

Wed, 02 Aug 2017 07:40:40 GMT

Originally posted on:

Ok, it has been about 7 years since my last post. I have moved 3 times so far (the current one is due to family issues). I worked for almost 2 years with Wal-Mart, plus various part-time jobs (I am currently working 2 very-part-time jobs, averaging about 8 hours a week). I'm in the middle of a divorce (not my idea), due mostly to not being able to find a well-paying job of any type (in my fields or out of them). I may be down, I may be older, BUT I am not out of it.

My blog has moved.

Mon, 31 Jul 2017 00:28:36 GMT

Originally posted on:

You can find my new blog site over here:


Worrying conclusions of Google's latest analysis of programmatic advertising

Thu, 27 Jul 2017 06:57:26 GMT

Originally posted on:

Fraud has become one of the issues generating tension within the online advertising industry, especially where advertising is purchased programmatically. When an advertiser buys ad space from an online publisher directly, the margin of potential error is much lower than when buying on a marketplace, and the risk of deception is much lower too. It is almost impossible for ads bought to appear in one space to end up appearing in another, because there is greater control of the situation.

This does not happen in programmatic advertising. Advertisers have no real, effective control over where their ads end up and how they will be shown, because ultimately the decision maker is an algorithm. They ask the algorithm for a type of audience; what the algorithm does with that request is another matter, as the YouTube scandal made plain (when brands discovered their ads were being shown alongside content where they did not want, under any circumstances, to appear).

But the truth is that the problem is not only that ads can appear in media or alongside content that is less than desirable: where the ad space itself can be invented, the fraud is invented with it. Fraud in programmatic advertising is a problem, and a serious one. Not only have networks of cybercriminals been found creating fake media networks to sell advertising that no one was actually seeing, there are also other discrepancies. For example, advertisers may be buying one type of media and appearing in others. They may be paying for premium media and not actually appearing on it. And that is the industry's great fear. For advertisers it is a problem because they feel they are throwing money away on ads that are not the ones they want.

For the media it is also a complicated situation, because they are watching the market lose value as someone else takes money that does not belong to them. In the midst of all this, Google has just carried out an in-depth study to detect what happens and how the ads fail. The study has not been made public, but a leak to Business Insider has revealed its conclusions. The objective of the study was to understand how the market works in order to make decisions that could end spoofing: the practice of selling advertising space while promising it will appear on premium sites, and then not delivering.

The results of the Google study

Google ran different tests in association with some media giants. Among its partners in the study were publishers such as the American NBCU, CBS and The New York Times. What did they do? The associated media turned off the tap on programmatic advertising for a while. They stopped selling their ads through these platforms for short periods of time, such as 10- to 15-minute intervals, to study what happened. The results were not positive. What the Google study found was that, even though programmatic advertising had been blocked, advertisers were still able to keep buying those ads. On the different exchanges, thousands of advertising spaces, both video and display, continued to be offered from the publishers that were in[...]

How Game of Thrones measures emotions to decide what to publish on social networks

Thu, 27 Jul 2017 06:24:01 GMT

Originally posted on:

HBO is employing emotion-measuring technology to establish the emotional reactions generated while viewers watch Game of Thrones.

If there is one element that has become a constant in talk of strategy, of what companies should (or should not) do in the future, and of connecting with consumers, it is emotion. Everyone, at one time or another, has thrown themselves into talking about emotions, how things are changing, and the impact emotions have on the decisions consumers make. Emotions are also a key element in understanding the behavior of social network users and in establishing why some content manages to connect with them far better than other content can. The emotional bond established with a thing goes a long way toward explaining its success.

It is not surprising, then, that a closer link is being established between emotions and business decisions, and that the reactions things generate are being studied in ever greater detail in order to position brands better with consumers. Companies measure the emotions their products generate and watch them closely when making future decisions. Everyone does it, and with more and more goals in mind. Measuring emotions is used not only to understand how things work and to connect with consumers, but also to refine social network strategy.

That is what HBO does. The American cable network, whose series reach half the world via VoD and are enormously influential, employs sentiment-measuring technology to establish what that technology's vocabulary calls emotional reactions (ERs): the reactions generated as viewers watch new Game of Thrones episodes. From the results, HBO can build a better social network strategy, as Mashable reports.

In the case of last season's premiere, for example, the main emotional reaction was love, which accounted for 24.3% of all emotional reactions. It was followed by excitement (19.9%), enjoyment (9.2%) and craziness (8.9%). How does this tool work, and how does HBO measure what its viewers feel? The information obtained supports conclusions in two different areas. On the one hand, it helps evaluate how the audience is receiving the new episodes. On the other, it helps determine which content will have the best impact if promoted on social networks in the days following broadcast.

How do they get the information? Information about consumers and their emotions comes from the social networks themselves. The television network uses the services of a specialized company, Canvs, which measures the reactions generated on Facebook, Twitter, Instagram and YouTube in real time, creating a kind of graph of emotions. What matters is therefore not how or where viewers watch the content, but what they share about it. As one of the firm's directors points out to Mashable, the platform takes the nuances of language into account. "We do not speak properly, we invent words, we use emoticons," he says. The system has been building a kind of database of the language of modern life to fully understand what [...]

Replicating SharePoint Online Lists to SQL Server

Thu, 27 Jul 2017 03:06:19 GMT

Originally posted on:

We all know that accessing SharePoint data can be challenging, specifically for SharePoint Online implementations. The SharePoint SDK requires advanced software development knowledge, and SharePoint Online itself lacks the ability to provide advanced reporting. As a result, many organizations are looking for a way to replicate SharePoint data to SQL Server on a regular basis, so that business analysts can build advanced reports and to facilitate general systems integration efforts. A webinar showing how this works in more detail will take place on August 9, 2017 at noon (US Eastern Time). To register for this webinar, go to:

Enzo Unified allows you to easily set up a replication job that will create a duplicate copy of your SharePoint data in SQL Server, and optionally keep changes synchronized on a regular basis (for example, every minute). This allows SQL Server to keep a near-real-time copy of SharePoint Online data for reporting and integration purposes. The steps involved in setting up replication with Enzo Unified have been simplified to the point where anyone with basic knowledge of SQL Server can set up replication and be up and running quickly. The diagram below shows how users (or systems) can communicate with SharePoint Online directly through Enzo Unified by issuing SQL commands. The SharePoint adapter, hosted by Enzo, allows this direct communication to take place. The Integration adapter uses the SharePoint adapter to communicate with SharePoint Online and create a replica of SharePoint lists. This architecture allows a similar implementation with other hosted systems, such as Salesforce tables, for example.
For an overview of how this works and further information, visit this page and take a look at the short video:

About Herve Roggero

Herve Roggero, Microsoft Azure MVP, @hroggero, is the founder of Enzo Unified ( Herve's experience includes software development, architecture, database administration and senior management with both global corporations and startup companies. Herve holds multiple certifications, including MCDBA, MCSE and MCSD. He also holds a Master's degree in Business Administration from Indiana University. Herve is the co-author of "PRO SQL Azure" and "PRO SQL Server 2012 Practices" from Apress, a Pluralsight author, and runs the Azure Florida Association. [...]

Can't Start NiFi from Ambari, How do I spin up NiFi service in Hortonworks sandbox?

Wed, 26 Jul 2017 15:43:10 GMT

Originally posted on:

If you've followed the instructions to spin up NiFi from Ambari but the service still does not appear to have started (read: Ambari shows the service as "red" or stopped; the http URL does not bring up the canvas), follow these steps from the root account:

1. Go to the NiFi install directory (usually under /opt).
2. Find the shell script that runs it and execute it with the install switch ( ./nifi-x.x.x/bin/ install).
3. Type: service nifi start

Ambari will still show NiFi as down; however, you will be able to work with the canvas from the URL.

If you don't have NiFi installed at all (just do a find in Linux for it), follow these steps:

*Note: At some point the Georgia Tech URL may no longer be a mirror, and of course the version number will be different. Look up the current version of NiFi on one of the available sources ahead of time and substitute its URL in the wget step below.

From a root shell (e.g. after sudo -i):

1. yum install -y git wget
2. cd /opt
3. wget
4. tar -xzvf ./nifi-1.1.2-bin.tar.gz
5. nifi-1.1.2/bin/ install
6. service nifi start [...]
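The install steps above can be sketched as one script. Assumptions worth flagging: the mirror URL below is a placeholder (the post's URL was stripped), the version number will differ, and the distribution's control script is bin/nifi.sh in stock Apache NiFi (the script name was also dropped from the post, so treat it as my assumption here). The script only prints each command (dry run) so you can review before running for real:

```shell
# Dry-run sketch of the NiFi install steps; NIFI_MIRROR is a placeholder.
NIFI_VERSION="1.1.2"
NIFI_MIRROR="https://archive.example.org/nifi"   # substitute a real mirror

# Prints each command instead of running it; replace the body with "$@"
# to actually execute after reviewing the output.
run() { echo "+ $*"; }

run cd /opt
run wget "${NIFI_MIRROR}/nifi-${NIFI_VERSION}-bin.tar.gz"
run tar -xzvf "./nifi-${NIFI_VERSION}-bin.tar.gz"
run "./nifi-${NIFI_VERSION}/bin/nifi.sh" install
run service nifi start
```

Parameterizing the version keeps the note above honest: when the mirror moves or the version bumps, only the two variables change.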

How to study effectively?

Tue, 18 Jul 2017 05:06:50 GMT

Originally posted on:


mysqli_connect(): (HY000/1045): Access denied for user (using password: YES)

Sun, 16 Jul 2017 04:58:18 GMT

Originally posted on:

Here is the solution for the error:

mysqli_connect(): (HY000/1045): Access denied for user (using password: YES)

Solution: Make sure that your password doesn't contain special characters; just keep a plain password (for example: 12345) and it will work. This is the strangest thing I have ever seen. I spent about two hours figuring it out.

Note: 12345 below stands for the plain password you would like to set for the username.

GRANT ALL PRIVILEGES ON dbname.* TO 'yourusername'@'%' IDENTIFIED BY '12345';
FLUSH PRIVILEGES; [...]
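When "removing special characters" fixes this error, the real culprit is often quoting somewhere between you and the server rather than MySQL itself: a password containing $ or ! can be silently mangled by a shell script or config generator before it is ever sent. A hypothetical demo (the password is made up; word123 is just the variable name the $ accidentally creates):

```shell
# Same intended password, two quoting styles. In double quotes the shell
# expands $word123 (an unset variable), silently shortening the password.
unset -v word123                     # ensure the demo variable is unset
PASSWORD_LITERAL='pa$word123'        # single quotes: kept verbatim
PASSWORD_MANGLED="pa$word123"        # double quotes: $word123 expands to nothing
echo "$PASSWORD_LITERAL"             # pa$word123
echo "$PASSWORD_MANGLED"             # pa
```

So before stripping special characters from a production password, it is worth checking how the password travels from your config or script to mysqli_connect().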

Pi Zero W - Streamer - Gains A Lego Case

Fri, 07 Jul 2017 13:26:14 GMT

Originally posted on:

So I’ve added a Lego case to my music streamer.




It's working great. Our boys can wander in with their tablets/phones and just wirelessly play what they like. Music should just be like this: accessible and easy, and it sounds awesome too.


Sum of parts: Raspberry Pi Zero W, HiFiBerry DAC board, some Lego, Cambridge Audio amp and speakers... + great kids.


Legacy Projects, Technical Debt and NDepend

Thu, 29 Jun 2017 13:44:51 GMT

Originally posted on:

Unless every project you've worked on has been green field and/or built with no time pressure, you'll have found yourself working on a legacy project at some point. Unwieldy methods, mystery sections of code, ancient technologies, wholesale duplication... it's not much fun, but it's a large percentage of the code that's out there. Projects to replace or rewrite these systems are commonplace, but where do you begin? What if you want to make a case to the business that such a system needs to be replaced? Technical debt can be a useful metaphor for making that case, but while it's easy to explain in the abstract, it's difficult to come up with anything concrete to justify the expense of an update to someone with an eye on their bottom line.

Thankfully, the folks at NDepend have now built technical debt computation on top of their code analysis tools, giving you a much easier way to have these sorts of discussions. This is doubly powerful: as well as putting a concrete cost on choosing not to refactor, the data it presents has the authority of having been produced by a tool. Tools don't lobby for nice tidy-up projects for academic reasons; they impartially detect problems in code. Someone (I think Erik Dietrich, but I can't find the blog post) recently pointed out the advantage of an automated critique of this sort: there are no politics or personal opinions involved, so everyone automatically takes it more seriously.

A Real-World Example

I'm currently working with a legacy project, so when I heard about NDepend's new technical debt capabilities, I was eager to fire it up and see what it said. With all the default settings, it said this! The main takeaways are:

Based on the number of lines of code, the project took an estimated 2,536 days of development.
The code had 19,486 issues (!) of various severity; 2,736 were Major issues or worse.
Based on the number and types of issues, the project's technical debt will take 944 development days to fix; i.e. we are currently 944 days in the hole if we are going to sort this out completely. That's approximately 3.5 developers for a year!
The debt ratio was 37.23% (944 technical debt days / 2,536 development days); i.e. 37.23% of the total cost of developing the software now exists as technical debt. Sad face.

As it was a legacy project, it predictably had no automated tests, which would have enabled NDepend to more precisely calculate the total annual interest incurred by the debt. Double sad face. You can still see NDepend's total interest calculation in the Debt and Issues explorer, though (see below): it was 481 development days, i.e. an additional 481 days of development time needed for every year the issues in the code base go unfixed. That's about 2 whole developers!

These numbers make a powerful financial argument for refactoring and cleaning up the code. Which is exactly what we're doing :)

But there's more - Debt and Issues

As usual with NDepend, you can explore the issues it finds in great detail. Selecting from the Explore debt menu, you can check out Debt and Issues on a rule-by-rule basis. The main offenders here ar[...]
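The headline percentages are simple arithmetic on the two day counts, so they are easy to sanity-check. The day counts are the article's; the 250-working-days-per-developer-year figure is my assumption for the "developers" conversions:

```shell
# Recompute the NDepend post's ratios from its own numbers.
debt_days=944
dev_days=2536
interest_days=481

# Debt ratio: debt days as a percentage of estimated development days.
ratio=$(awk -v d="$debt_days" -v t="$dev_days" 'BEGIN { printf "%.2f", 100 * d / t }')
echo "debt ratio: ${ratio}%"   # 37.22%, vs the article's 37.23% (rounding)

# Assumed ~250 working days per developer-year for the headcount figures.
devs_to_fix=$(awk -v d="$debt_days" 'BEGIN { printf "%.1f", d / 250 }')
devs_interest=$(awk -v i="$interest_days" 'BEGIN { printf "%.1f", i / 250 }')
echo "developer-years to fix debt: ${devs_to_fix}"       # 3.8 (article: ~3.5)
echo "developers of annual interest: ${devs_interest}"   # 1.9 (article: ~2)
```

The small gaps against the article's "3.5 developers" suggest it assumes a slightly longer working year, but the order of magnitude, which is what the business case rests on, is the same.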

SQL Azure as part of availability group, but not really doing any real work

Thu, 29 Jun 2017 04:38:30 GMT

Originally posted on:

Wouldn't it be nice to have SQL Azure as part of a high availability group, but not really doing any real work? I'm looking for a setup of Active/ReadOnly/DR-Azure. So far I have had no luck finding information on how to implement this.

SSMS 17.1 missing features

Thu, 29 Jun 2017 03:30:24 GMT

Originally posted on:

Just noticed that SSMS 17.1 (the current version) does not support drag and drop of a SQL file. I reverted to SSMS 17 RC3 and that feature is back. Going to check on other features.

How Do I Open a Port in Windows 10 (refused connection) ?

Tue, 27 Jun 2017 11:00:49 GMT

Originally posted on:

If you are seeing the "connection refused" message when attempting to set up a localhost access port, chances are good that the port is blocked from allowing connections through Windows. To open the port, follow these instructions:

1. Navigate to Control Panel, System and Security, and Windows Firewall.
2. Select Advanced settings and highlight Inbound Rules in the left pane.
3. Right-click Inbound Rules and select New Rule.
4. Choose Port as the rule type and click Next.
5. Add the protocol (TCP or UDP) and the port number in the next window and click Next.
6. Select Allow the connection in the next window and hit Next.
7. Select the network types as you see fit and click Next.
8. Name the rule something meaningful and click Finish.

You can also use netsh (example):

netsh advfirewall firewall add rule name="Open Port 80" dir=in action=allow protocol=TCP localport=80 [...]
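The netsh one-liner generalizes to any port. A small sketch that builds the command for a given port and protocol; the port value and rule name here are examples, and the command is only printed, not executed (it would need an elevated Windows prompt to run):

```shell
# Compose (but do not run) the netsh rule command for an arbitrary port.
port=8080                      # example port
proto=TCP                      # or UDP
rule_name="Open Port ${port}"
cmd="netsh advfirewall firewall add rule name=\"${rule_name}\" dir=in action=allow protocol=${proto} localport=${port}"
echo "$cmd"
```

Keeping the rule name derived from the port makes it easy to find and delete the rule later (netsh advfirewall firewall delete rule name="...").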

WordPress 101 - Plugins!

Wed, 21 Jun 2017 02:28:23 GMT

Originally posted on:

This blog series covers just a few of the many features of WordPress from a developer's perspective.

WordPress 101 - Setup/Configuration!
WordPress 101 - Plugins!
WordPress 101 - Themes!

"WordPress powers over 25% of the internet..."

If you've never heard of WordPress, you can think of it as the king of content management systems (CMS), allowing individuals and businesses to build and maintain robust websites with relative ease and little to no developer experience required. WordPress has a very flexible framework that allows 3rd-party developers to create "themes" and "plugins" that website owners can download and install, most of them freely available. Themes let you completely change the look and feel of your website with the click of a button, and plugins add functionality ranging from full-blown online stores to hooking into Google Analytics... really just about anything you can think of.

This article will focus on creating a simple plugin that will display a modal dialog when a user first visits your WordPress site.

NOTE: You may want to download the full sample to be able to follow along more easily.

What is a WordPress plugin?

"A WordPress Plugin is a program or a set of one or more functions written in the PHP scripting language, that adds a specific set of features or services to the WordPress site. You can seamlessly integrate a plugin with the site using access points and methods provided by the WordPress Plugin API."

From a traditional developer's view, what this really means is that we get the ability to hook into the framework and extend the base functionality to do nearly anything we want. So, for example, say you have a client that needs their Google Analytics data displayed in an admin area of their website for the regional managers to review. Lucky for you, there's a wonderful plugin already created for that, and all you have to do is click a few buttons to install and configure it, and now you're looking like a boss!

Or say your client tells you they want to capture more leads by asking site visitors to register for their newsletter when they first come to the site. I'm sure there's already a plugin that does exactly that; however, we'll use this requirement as the basis for our example (minus the newsletter part).

If you have downloaded the sample plugin, let's go ahead and get it installed on your development instance of WordPress; otherwise you can skip ahead to the "Files and Locations" section.

In WordPress, click on the "Plugins" option, or hover and then select "Add New". Then click "Choose File", navigate to where you downloaded the sample .zip file, and click "Install Now". After a few moments, you'll see the following message... let's go ahead and activate the plugin while we're here. Now you'll see our new plugin listed.

Files and Locations

All WordPress plugins and associated files are located in the wp-content/plugins/ folder. For our example, you'll find our files in wp-content\plugins\holisticsTG. In there you'll find one file and[...]

Creating a SharePoint DataLake with SQL Server using Enzo Unified

Mon, 19 Jun 2017 09:16:37 GMT

Originally posted on:

In this blog post I will show how you can easily copy a SharePoint list to a SQL Server table and keep the data updated on a specific frequency, allowing you to easily create a DataLake for your SharePoint lists. This will work with SharePoint 2013 and higher, and with SharePoint Online. While you could spend a large amount of time learning the SharePoint APIs and their many subtleties, it is far more efficient to configure simple replication jobs that will work under most scenarios. The information provided in this post will help you get started in setting up replication of SharePoint lists to a SQL Server database, so that you can query the local SQL Server database from Excel or reporting tools, or even query the database directly. You should also note that Enzo Unified provides direct real-time access to SharePoint lists through native SQL commands, so you can view, manage and update SharePoint list items.

Installing Enzo Unified

To try the steps provided in this lab, you will need the latest version of Enzo Unified (1.7 or higher), provided here: The download page also contains installation instructions.

Enzo Unified Configuration

Once Enzo Unified has been installed, start Enzo Manager (located in the Enzo Manager directory where Enzo was installed). Click on File –> Connect and enter the local Enzo connection information. NOTE: Enzo Unified is a Windows Service that looks like SQL Server; you must connect Enzo Manager to Enzo Unified, which by default is running on port 9550. The password should be the one you specified during the installation steps. The following screen shows typical connection settings for Enzo Unified.

Create Connection Strings

Next, you will need to create "Central Connection Strings" so that Enzo will know how to connect to the source system (SharePoint) and the destination database (SQL Server). You manage connection strings from the Configuration –> Manage Connection Strings menu. In the screen below, you can see that a few connection strings have been created. The first one is actually a connection string to Enzo Unified itself, which we will need later.

The next step is to configure the SharePoint adapter by specifying the credentials used by Enzo Unified. Configuring the SharePoint adapter is trivial; three parameters are needed: a SharePoint login name, the password for the login, and the URL of your SharePoint site. Make sure the login has enough rights to access SharePoint lists and SharePoint fields. Once the configuration of the SharePoint site is complete, you can execute commands against Enzo Unified using SQL Server Management Studio.

Fetch records from SharePoint using SQL Server Management Studio

To try the above configuration, open SQL Server Management Studio and connect to Enzo Unified (not SQL Server). From the same machine where Enzo is running, a typical connection screen looks like this: Once you are connected to Enzo Unif[...]
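Since the post says Enzo Unified "looks like SQL Server" and listens on port 9550, any TDS client should in principle be able to connect, not just SSMS. A hypothetical sketch of what that connection string would look like with sqlcmd (host, user and the placeholder password are all assumptions; the command is only printed here):

```shell
# Hypothetical sqlcmd invocation against Enzo Unified's SQL Server-like
# endpoint on port 9550. <password> is deliberately left as a placeholder.
enzo_host="localhost"
enzo_port=9550
conn="sqlcmd -S ${enzo_host},${enzo_port} -U sa -P <password>"
echo "$conn"
```

Note the comma between host and port: that is sqlcmd's syntax for a non-default port, which is exactly the situation here since Enzo is not on SQL Server's usual 1433.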