Subscribe: Rockford Lhotka
http://www.lhotka.net/WeBlog/SyndicationService.asmx/GetRss

Rockford Lhotka



Creator of the CSLA .NET framework



Last Build Date: Fri, 07 Apr 2017 16:02:32 GMT

Copyright: Marimer LLC
 



Exactly what is blockchain?

Fri, 07 Apr 2017 16:02:32 GMT

Trying to figure out the core "meat" behind blockchain is really difficult. I'm going to try to tease out all the hype, and all the references to specific use cases, to get down to the actual technology at play here. (This is my second blockchain post; in my previous post I compare blockchain today to XML in the year 2000.)

There are, of course, tons of articles about blockchain out there. Nearly all of them talk about the technology only in the context of specific use cases, most commonly bitcoin and distributed ledgers. But blockchain the technology is neither a currency nor a distributed ledger: it is a tool used to implement those two types of application or use case.

There's precious little content out there about the technology involved, at least at the level of the content you can find for things like SQL Server, MongoDb, and other types of data store. And the thing is, blockchain is basically just a specialized type of data store that offers a defined set of behaviors: different in the specifics from the behaviors of an RDBMS or document database, but comparable at a conceptual level.

I suspect the lack of "consumable" technical information is because blockchain is very immature at the moment. Blockchain seems to be where RDBMS technology was around 1990. It exists, and uber-geeks are playing with it, and lots of business people see $$$ in their future with it. But it will take years for the technology to settle down and become tractable for mainstream DBA or app dev people.

Today what you can find are discussions about extremely low-level mathematics, cryptography, and computer science. Which is fine; that's also what you find if you dig deep enough into Oracle's database, SQL Server, and lots of other building-block technologies on top of which nearly everything we do is created. In other words, only hard-core database geeks really care about how blockchain is implemented - just like only hard-core geeks really care about how an RDBMS is implemented. Obviously we need a small number of people to live and breathe that deep technology, so the rest of us can build cool stuff using that technology.

So what is blockchain? From what I can gather, it is this: a distributed, immutable, persistent, append-only linked list. Breaking that down a bit:

  • A linked list where each node contains data
  • Immutable
    • Each new node is cryptographically linked to the previous node
    • The list and the data in each node are therefore immutable; tampering breaks the cryptography
  • Append-only
    • New nodes can be added to the list, though existing nodes can't be altered
  • Persistent
    • Hence it is a data store - the list and nodes of data are persisted
  • Distributed
    • Copies of the list exist on many physical devices/servers
    • Failure of 1+ physical devices has no impact on the integrity of the data
    • The physical devices form a type of networked cluster and work together
    • New nodes are only appended to the list if some quorum of physical devices agree with the cryptography and validity of the node, via consistent algorithms running on all devices
    • This is why blockchain is often described as a "trusted third party": the cluster is self-policing

Terminology-wise, where I say "node" you can read "block". And where I say "data" a lot of the literature uses the term "transaction" or "block of transactions". But from what I've been able to discover, the underlying technology at play here doesn't really care if each block contains "transactions" or other arbitrary blobs of data.
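To make the hash-linking concrete, here is a minimal sketch of the idea - purely illustrative, not any real blockchain implementation. The Block and MiniChain names are invented for the example, and it assumes .NET 5+ (for SHA256.HashData and Convert.ToHexString). It covers only the immutable/append-only linked list; distribution and quorum consensus are out of scope.

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Security.Cryptography;
    using System.Text;

    // Each node ("block") stores arbitrary data plus the hash of the
    // previous block, so altering any historical block invalidates the
    // cryptographic chain that follows it.
    public record Block(int Index, string Data, string PreviousHash, string Hash);

    public static class MiniChain
    {
        public static string ComputeHash(int index, string data, string previousHash) =>
            Convert.ToHexString(
                SHA256.HashData(Encoding.UTF8.GetBytes($"{index}|{data}|{previousHash}")));

        // Every chain starts from a well-known genesis block.
        public static List<Block> NewChain() =>
            new() { new Block(0, "genesis", "", ComputeHash(0, "genesis", "")) };

        // Append-only: new blocks go on the end; nothing is edited in place.
        public static void Append(List<Block> chain, string data)
        {
            var prev = chain[^1];
            int index = prev.Index + 1;
            chain.Add(new Block(index, data, prev.Hash,
                ComputeHash(index, data, prev.Hash)));
        }

        // Integrity check: recompute every hash and confirm each block
        // links to its predecessor. Any tampering is detectable here.
        public static bool IsValid(IReadOnlyList<Block> chain) =>
            chain.Skip(1).Zip(chain, (cur, prev) =>
                cur.PreviousHash == prev.Hash &&
                cur.Hash == ComputeHash(cur.Index, cur.Data, cur.PreviousHash))
            .All(ok => ok);
    }

Tamper with any block's data and IsValid returns false: either the block's stored hash no longer matches the recomputed one, or (if the hash is recomputed too) the next block's PreviousHash link breaks. A real chain replicates this list across many machines, with a quorum of them validating each append.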
What we build on top of this technology then becomes the question. Thus far what we've seen are distributed ledgers for cryptocurrencies (e.g. bitcoin) and proof of concept ledgers for banking or other financial scenarios. Maybe that's all this is good for - and if so it is clearly still very valuable. But I strongly suspect that, as a low level foundational technology, blockchain will ultimately be used for other things as well. I'm also c[...]



Blockchain is today what XML was in 1998

Wed, 05 Apr 2017 17:40:44 GMT

It seems to me that blockchain today is where XML was at the beginning. A low level building block on which people are constructing massive hopes and dreams, mentally bypassing the massive amounts of work necessary to get from there to the goals. What I mean by this is perhaps best illustrated by a work environment I was in just prior to XML coming on the scene.

The business was in the bio-chemical agriculture sector, so they dealt with all the major chemical manufacturers/providers in the world. They'd been part of an industry working group composed of these manufacturers and various competitors for many years at that point. The purpose of the working group was to develop a standard way of describing the products, components, parts, and other aspects of the various "products" being manufactured, purchased, resold, and applied to farm fields.

You'll note that I used the word "product" twice, and put it in quotes. This is because, after all those years, the working group never did figure out a common definition for the word "product".

One more detail is relevant: everyone had agreed to transfer data via COBOL-defined file structures. I suppose that dated back to when they traded reels of magnetic tape, but it carried forward to transferring files via ftp, and subsequently the web.

Along comes XML, offering (to some) a miracle solution. Of course XML only solved the part of the problem that these people had already solved, which was how to devise a common data transfer language. Was XML better than COBOL headers? Probably. Did it solve the actual problem of what the word "product" meant? Not at all.

I think blockchain is in the same position today. It is a distributed, append-only, reliable database. It doesn't define what goes in the database, just that whatever you put in there can't be altered or removed. So in that regard it is a lot like XML, which defined an encoding structure for data, but didn't define any semantics around that data.
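As a hypothetical illustration of that gap (the product fields below are invented for the example): both fragments are perfectly well-formed XML, and both claim to describe a "product", yet they agree on nothing. Settling which of these fields a "product" actually has was the hard, unsolved part, and XML offered no help there.

    <!-- one manufacturer's idea of a product (invented example) -->
    <product sku="10442">
      <name>GrowFast 20-10-5</name>
      <formulation>liquid</formulation>
    </product>

    <!-- a reseller's idea of a product (invented example) -->
    <product>
      <code>GF-205</code>
      <description>GrowFast fertilizer, 55 gallon drum</description>
      <unitOfSale>drum</unitOfSale>
    </product>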

The concept of XML then, and blockchain today, is enough to inspire people's imaginations in some amazing ways.

Given amazing amounts of work (mostly not technical work either, but at a business level) over many, many years, XML became something useful via various XML languages (e.g. XAML). And a lot of the low-level technical benefits of XML have been superseded by JSON, but that's really kind of irrelevant, since all the hard work of devising standardized data definitions applies to JSON as well as XML.
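To see why the encoding really is the replaceable part, here is the same made-up record in both syntaxes; any agreed-upon data definitions carry over from one to the other untouched:

    <product sku="10442">
      <name>GrowFast 20-10-5</name>
    </product>

    { "product": { "sku": "10442", "name": "GrowFast 20-10-5" } }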

I won't be at all surprised if blockchain follows the same general path. We're at the start of years and years of hard non-technical work to devise ways to use the concept of a distributed, append-only, reliable database. Along the way the underlying technology will become standardized and will merge into existing platforms like .NET, Java, AWS, Azure, etc. And I won't be surprised if some better technical solution is discovered (like JSON was) along the way, but that better technical solution probably won't really matter, because the hard work is in figuring out the business-level models and data structures necessary to make use of this underlying database concept.




VS 2017 and netstandard projects

Thu, 30 Mar 2017 22:39:30 GMT

I really like the new VS 2017 tooling.

However, it has some real problems – it is far from stable for netstandard.

I have a netstandard class library project. Here are issues I’m facing.

  1. Every time I edit the compiler directives in the Build tab in project properties, it adds YET ANOTHER set of compiler constants for RELEASE;NETSTANDARD1_6 – those duplicates add up fast! (See the hypothetical csproj sketch after this list.)
  2. The output path acts really odd – it always insists on appending something like \netstandard1.5\ to the end of the output path, even if the output path already ends with \netstandard1.5\. In NO case can I get it to use the path I actually want!! This should act like normal projects, imo – not arbitrarily appending crap to my path!
  3. I have one netstandard class library referencing another via a project reference, and this doesn't seem to be working at all – none of the types from the first class library project are available in the second.
  4. The Add References dialog doesn't show existing references to Shared Projects – the only way to know that a reference is already there is to look at the csproj file in a text editor.
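For context, here is roughly where those values live in an SDK-style csproj. The project content is invented to mirror issues 1 and 2 above; DefineConstants, OutputPath, and AppendTargetFrameworkToOutputPath are real MSBuild names.

    <Project Sdk="Microsoft.NET.Sdk">
      <PropertyGroup>
        <TargetFramework>netstandard1.6</TargetFramework>
      </PropertyGroup>
      <PropertyGroup Condition="'$(Configuration)' == 'Release'">
        <!-- issue 1: each visit to the Build tab can append another copy -->
        <DefineConstants>RELEASE;NETSTANDARD1_6;RELEASE;NETSTANDARD1_6</DefineConstants>
        <!-- issue 2: the tooling appends the target framework to this path again -->
        <OutputPath>bin\Release\netstandard1.5\</OutputPath>
        <!-- setting AppendTargetFrameworkToOutputPath to false may help with issue 2 -->
      </PropertyGroup>
    </Project>

Until the tooling settles down, hand-editing the csproj in a text editor is one way to clean up the duplicated constants.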

We're going in what I think is a good direction with the tooling, but right now it is hard or impossible to integrate netstandard projects into a normal workflow because the tooling is pretty buggy.




A Period of Stability

Mon, 20 Feb 2017 19:29:39 GMT


According to this article, the smartphone boom is over, and the "next big thing" isn't really here yet. I would argue that's good. We need a breather to catch up with all the changes from the past several years.

In a sense, there have been two periods in my career that were really fun from the perspective of solving business problems (as opposed to other points that were equally fun from the perspective of learning new tech).

One was a couple years before and after 1990, when the minicomputer ecosystem was generally stable (HP 3000, Unix, VAX were common options). The other period was the six years when VB6 was dominant, while .NET was still nascent, VB had matured, and Windows was the de facto target for all client software.

In both those cases there was a 5-6 year window when the platforms were slow-changing, the dev tools were mature, and disruption was around the fringes, not in the mainstream. From a "learn new tech" perspective those were probably pretty boring periods of time. But from a "solve big business problems" perspective they were amazing periods of time, because everyone felt pretty comfortable using the platforms/tools at hand to actually do something useful for end users.

The iPad turned the world on its ear, and we're just now back to a point where it is clear that the platform is .NET/Java on the server and Angular on the client (regardless of the client OS). The server tooling has been fine for years, but I think we can see real stability for client development in the near future - whew!

So if the chaos we've been suffering through for the past several years (decade?) is coming to an end, and there's no clear "next big thing", then with any luck we'll find ourselves in a nice period of actual productivity for a little while. And I think that'd be refreshing.




Microsoft identities - argh!

Wed, 08 Feb 2017 18:51:54 GMT

The concept of identity with Microsoft services is a mess, something I probably don't have to tell any Microsoft developer.

Some services can only be attached to a personal Microsoft Account (MSA), and other services can only be used from an AD account. For example, MSDN can only be directly associated with an MSA, while Office 365 can only be associated with an AD account. Some things, like VSTS, can be associated with either one depending on the scenario.

I used to have the following:

  • r___y@lhotka.net - MSA with my MSDN associated
  • r___y@magenic.com - Magenic AD account
  • r___y@magenic.com - MSA with nothing attached (I created this long ago and forgot about it)

That was a total pain when I started using O365 and an AD-linked VSTS site with my r___y@magenic.com AD account, because Microsoft couldn't automatically distinguish between my AD and MSA accounts, both named r___y@magenic.com. As a result, every time I tried to log into one of these sites I'd be asked whether this was a personal or work/school account.

Fortunately you can rename an MSA to a different email address. I renamed my (essentially unused) r___y@magenic.com account to a dummy email address so now I really just have two identities:

  • r___y@lhotka.net - MSA with my MSDN associated
  • r___y@magenic.com - Magenic AD account

This way Microsoft automatically knows that when I use my AD login it is a work/school account, and I don't have to mess with that confusion.

There's still the issue of having MSDN attached to an MSA, and also needing to have some connection from my AD account to my MSDN subscription. This is required because we have VSTS sites associated with Magenic's AD, so I need to log in with my AD account, but still need to ensure VSTS knows I'm a valid MSDN user.

Here's info on how to link your work account to your MSDN/MSA account.

At the end of the day, if I'd never created that r___y@magenic.com MSA account (many years ago), my life would have been much simpler from the start. Fortunately the solution to that problem is to rename the MSA email to something else and remove the confusion between AD and MSA.

The MSDN linking makes sense, given that you need an MSA for MSDN, and many of us need corporate AD credentials for all our work sites.




Case study in unpredictability being bad

Thu, 05 Jan 2017 22:04:55 GMT

I was reading an HBR article about Why Being Unpredictable Is a Bad Strategy and all I could think about was the Windows 8 debacle.

Leading up to the development and release of Windows 8, Microsoft switched from an open and predictable model to a very closed and secretive model. Sure, they'd waffled back and forth in the years prior to Windows 8, but it wasn't until that point in their history that they went "entirely dark" about something as important as Windows itself.

Personally I think they were copying Apple, because at that point in time Apple was ascendant with the iPad and Microsoft was worried. The thing is, a secretive model works for Apple because nobody relies on their long-term vision for stability. Their target is consumers, who like fun stuff and care little if things break every couple of years.

Microsoft's primary customer base is small, medium, and large enterprises who spend millions or billions on IT. They don't like fun, they like predictable roadmaps that minimize cost and risk. The last thing a business wants is a version of Windows that comes out of the blue and breaks all their software, or requires complete retraining of their entire user base.

Worse yet, with Windows 8 Microsoft not only increased risk for all of its business customers, but also totally cut off all avenues for feedback and improvement of the product until after it was released - after it was too late to address the numerous major issues with the new OS.

Fortunately Windows 10 has been a whole other story. Microsoft not only returned to their original open communication model, but they've actually become more open than they've ever been in their history. And it shows, in that the business world now has a predictable roadmap, and Windows has never been so closely shaped by real-world customer feedback.

The result is that Windows 10 adoption is proceeding at a rapid pace, and Microsoft is (ever so) slowly rebuilding trust with its customers.




Frameworks have value

Tue, 03 Jan 2017 16:53:41 GMT

We're having a conversation on Magenic's internal forum where we're discussing the current JavaScript community reaction to all these frameworks. Some people in the industry are looking at the chaos of frameworks and libraries and deciding to just write everything in vanilla js - eschewing the use of external dependencies.

And I get that, I really do. However, I'm also an advocate of using frameworks - which shouldn't be surprising coming from the author of CSLA .NET.

Many years ago I spoke at a Java conference (they were trying to expand into the .NET space too).

At lunch I listened to a conversation between some other folks at the table; they were discussing the use of Spring (which was fairly new at the time).

Their conclusion was that although Spring did a ton of useful and powerful things, it was too big/complex and so they'd rather not use it and solve all those problems themselves (the problems solved by Spring).

I see the same thing all the time with CSLA .NET. People look at it and see something that is big and complex, and think "those problems can't be that hard to solve", so they end up rewriting (usually poorly) large parts of CSLA.

I say "usually poorly" because their job isn't to create a well-tested and reusable framework. Their job is to solve some business problem. So they solve some subset of each problem that Spring or CSLA solves in-depth, and then wonder why their resulting app is unreliable, or performs badly, or whatever.

As the author of a widely used OSS framework, my job is to create a framework that solves and abstracts away key problems that business developers would otherwise encounter. Because of this, I'm able to solve those problems in a broader and deeper way than a business developer, whose goal is to put as little effort as possible into solving the lower-level problem, because it is just a distraction from solving the actual business problem.

So yeah, I do understand that some of these frameworks, like Angular, Spring, CSLA .NET, etc., are complex, and they have their own learning curve. But they exist because they solve a bunch of lower-level, non-business-related problems that you would otherwise have to solve yourself. And the time you spend solving those problems provides zero business value, and ultimately adds to the long-term maintenance cost of your resulting business software.

There's no perfect answer here, to be sure. But for my part, I like to think that the massive amounts of time and energy spent by framework authors to truly understand and solve those hard non-business problems is time well spent, allowing business developers to stay focused on solving the problems they are actually paid to address.




Coding Standards resolution (part 2)

Wed, 28 Dec 2016 20:05:51 GMT

In a previous blog post I related a coding standards horror story from early in my career. A couple of commenters asked for part 2 of the story, mostly to see how my boss, Mark, dealt with the chaos we found at the company that acquired us.

There are two fortunate things that relate to this story.

First, they bought our company because they wanted our software and because they wanted Mark. It is quite possible that nobody else in the world both understood the vertical industry and had the software dev chops that Mark provided, so he had a lot of personal value.

Second, before the acquisition I'd been tasked with writing tooling to enable globalization support for our software. Keep in mind that this was VT-terminal-style software, and all the human-readable text shown anywhere on the screen came from text literals or strings generated by our code. The concept of a resx file like we have in Windows didn't (to our knowledge) exist, and certainly wasn't used in our code. Coming right out of university, I had the concepts of lexical parsing and compiler construction fresh in my mind, so my solution was to write a relatively simplistic parser that found all the text in the code and externalized it into what today we'd call a resource file.
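As a rough sketch of that idea - in C# rather than the VAX Basic of the story, vastly simplified, and with invented names (including the hypothetical GetText lookup helper): walk each source line, lift quoted string literals out into a key/value resource table, and rewrite each literal as a lookup by key.

    using System.Collections.Generic;
    using System.Text;

    public static class StringExternalizer
    {
        // Scans source lines, replaces every "quoted literal" with a call to
        // a (hypothetical) GetText resource lookup, and collects the extracted
        // text in a key/value table - what today we'd call a resource file.
        public static (List<string> Rewritten, Dictionary<string, string> Resources)
            Externalize(IEnumerable<string> sourceLines)
        {
            var resources = new Dictionary<string, string>();
            var rewritten = new List<string>();
            int nextId = 1;

            foreach (var line in sourceLines)
            {
                var sb = new StringBuilder();
                int i = 0;
                while (i < line.Length)
                {
                    if (line[i] == '"')
                    {
                        int end = line.IndexOf('"', i + 1);
                        if (end < 0) { sb.Append(line[i..]); break; } // unterminated literal
                        string key = $"TXT_{nextId++:D4}";
                        resources[key] = line[(i + 1)..end];  // externalized text
                        sb.Append($"GetText(\"{key}\")");     // lookup by key
                        i = end + 1;
                    }
                    else
                    {
                        sb.Append(line[i++]);
                    }
                }
                rewritten.Add(sb.ToString());
            }
            return (rewritten, resources);
        }
    }

A real tool would also have to skip comments and handle escaped quotes; the original handled far more edge cases than this.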

That project was one of the most fun things I've probably ever done. And one of the few times in my career where a university compiler design course directly applied to anything in the real world.

Because Mark was so well regarded by the new company, he ended up in charge of the entire new development team. As such, he had the authority to impose his coding standards on the whole group, including the team of chaos-embracing developers. Not that they were happy; this was the late 1980s and jobs weren't plentiful, and my recollection is that they grumbled about it, and about the fact that it was a "damn Yankee" imposing his will on the righteous people of The South. But they went along with the change.

However, that still left massive amounts of pre-existing code that was essentially unreadable and unmaintainable. To resolve that, Mark took my parser as a base and wrote tools (I don't remember if it was one tool, or one per coding "style") that automatically reformatted all that existing code into Mark's style. That was followed by a ton of manual work to fix edge cases and retest the code to make sure it worked. In total I think this process took around 2 months, certainly not longer.

I wasn't directly involved in any of that fix-up process, as I had been assigned to do the biggest project yet in my young career: building support for an entire new product line in a related vertical to the focus of our original software. Talk about a rush for someone just a year out of university!




Dealing with Automation

Tue, 27 Dec 2016 18:50:33 GMT

I've always been a fan of speculative fiction, in particular the sub-genre of cyberpunk and what is now often called dark space opera (which usually has cyberpunk aspects). As most people have become aware over the past few decades, good science fiction explores possible futures that come about due to technological advancements. The focus is usually on the changes to society or mankind, with the technology being just a driver for the change. If you weren't aware that this is the core of good SF, then I'm happy to let you know that you should be reading this sort of fiction, because it will help you be more prepared for changes as they occur.

Among the key themes inherent in most of these speculative futures is the idea of automation: that computers, robots, and machines will automate away some (or nearly all) jobs that humans have done in the past, or that they do today. A couple of decades ago this was pure fiction; today we can see that this is an almost unavoidable future.

Personally I find this interesting because my entire career has been in the software industry. Most software is all about automating away people's jobs. Not that we usually frame it that way, but the reality is that corporations used to have massive numbers of accountants; now they have a small handful, because computers do the work of those many, many thousands of accountants from the past. And software drives robotics, and machines, and all kinds of automation. My career is all about driving toward a future of automation, and so I tend to think about what that means for society.

For example, I was just reading that driverless cars will eliminate over 200 categories of jobs. We already know that nearly any factory work can be automated; it is just a matter of whether the automation is cheaper than offshore labor. There's essentially no way US labor can be cheap enough to avoid the work from being automated, so bringing jobs back from offshore is entirely unrealistic. This article from a Nobel economist sums up how robotics threatens jobs rather nicely, and explains why worries about outsourcing jobs, or thoughts of trying to "bring them back", are not really important.

Capitalism and the free market drive companies to find the lowest-cost way to provide the minimum viable product that makes the most profit. That's brutal, but it is true. Current US and European trends toward right-wing thinking tend to focus a lot on removing barriers so corporations can better pursue capitalistic and free market policies. So companies will either find super-cheap labor somewhere in the world, or if that's too hard or expensive then they'll automate those tasks so they don't require large numbers of humans at all. Whatever costs less in the long run will win, and that will not involve human labor.

Assuming we're going to stick with capitalism, corporatism, and free market concepts (and I think that's a safe bet), the question isn't whether most people on the planet will become unemployed. The question is how humanity and society will deal with most people being unemployed. One common trope in speculative fiction, and in reality, is the idea of a basic income provided for unemployable people. This article against universal basic income (make sure to click through to the author's original article with details) makes some good points about the risks of a basic income.
Sadly, even after you read the author's original (and often good) points, I think it is clear that he maintains unrealistic hopes about keeping most people employed in some manner. I don't have the answer. I don't know what society looks like when factories that required thousands of workers now require a few hundred technicians to keep the machines running. Can we retrain those thousands of unemployed piece workers for another factory? And what stops tha[...]



Coding standards horror story

Thu, 22 Dec 2016 18:30:01 GMT

Early in my career, actually at my first "real" job, I worked for a guy named Mark. Mark was an amazingly smart and driven programmer and I learned a lot from him. But he and I would butt heads constantly over coding standards and coding style.

Our platform was the DEC VAX and our code was written in VAX Basic. For this story to make sense, you must realize that VAX Basic wasn't really BASIC the way you typically think about it. The DEC compiler team had started with Basic syntax and then merged in all the goodness of FORTRAN and Modula II and Pascal. For example, even back in the late 1980s the language had structured error/exception handling.

It is also important to understand that in the late 1980s there weren't code editors like Visual Studio. We used something called TPU (Text Processing Utility) that was more powerful than Notepad by far, but nothing compared to modern editors of today. It competed with Emacs and vi. As a result, it was up to the developer to style their code, not the editing tool.

Mark had defined a strict set of coding standards and a style guide, and he'd dial back into work from home at night to review our code (yes, via a 1200 baud modem at the time). It was not uncommon to come into work the next day and have a meeting in Mark's office where he'd go through the styling mistakes in my code so I could go fix them.

He defined things like: every line would start with a tab, then 2 spaces. Or two tabs, then 2 spaces. Why the 2 spaces? I still have no idea, but that was the standard. He had a list of language keywords that were off limits; roughly half the keywords were forbidden, in fact. He defined where the Then would go after an If: on the next line down, indented.

I chafed at all of this. The 2-space thing was particularly stupid in my view, as was making all those fun keywords off limits. I can't say I cared about the Then statement one way or the other, except that it forced me to type yet another line with the stupid 2-space indent. But what are you going to do? Mark was the boss, and this was my first job out of university in the late-1980s Reagan-era recession. In a job environment where thousands of experienced computer programmers were being laid off in Minnesota, I wasn't going to risk my job. Which isn't to say that I didn't argue - anyone who knows me knows that I can't keep my opinions to myself :)

The thing is, after a period of time it seems like any standard becomes second nature. I just internalized Mark's rules and coded happily away. In fact, what I found is that my "artistic expression" wasn't really crippled by the guidelines; I just had to learn how to be artistic within them. Knowing a few artists, I'm now aware that putting boundaries around what you do is key to creating art. Artists pretty much always limit themselves (often artificially) so they have a context in which to work. Not that I think code is 100% art, but I do think that there's art in good code.

Several months after I started working with Mark, and absorbing this rigorous standards-based approach to coding, our company was purchased by another company. Mark, myself, and a couple other employees survived this process. And we all moved from Minnesota to Birmingham, AL; which is another story entirely. Let me just say here that regional culture differences within the US are easily comparable to the difference between the US and many European countries.

The company that bought my employer was also a software company, with a dev team of comparable size to what we'd had.
But they had no coding standard or guideline. They did use the DEC VAX, and they did use VAX Basic. But each developer did whatever they wanted without regard for anyone else. One of the developers cut his teeth on FORTRAN. It turns out that VAX Basic cou[...]