Dan Bricklin's Log

VisiCalc co-creator Dan Bricklin chronicles his life in the computer world with pictures, text, and commentary.

Last Build Date: Wed, 11 Jan 2017 17:01:57 GMT


My talk on

Wed, 11 Jan 2017 17:01:35 GMT

I'm excited that a TEDx Talk I gave last November has been posted as a video. The talk is a short retelling of the VisiCalc origin story. It's a much-condensed version of a talk I've been giving for many years, updated and enhanced. I'm very happy with how it came out.

Watch the video: "Meet the inventor of the electronic spreadsheet".

TEDxBeaconStreet, where I gave the talk, is part of the TEDx program. It is an independently-organized TED-like event, named after "...Boston’s longest street, uniting towns and cities, neighborhoods and businesses, schools and families along the way." After they are given, TED picks up some of the talks at TEDx events and posts video from those talks on its own site.

As I've often done in the past, I wrote an essay for the Writings Section of my website chronicling the process I went through to turn my long, rambling slide show into a tight, 12-minute TEDx-style talk. Viewers of the video may find the essay of interest. It was really interesting how helpful experience with crafting understandable, 140-character tweets on Twitter was for writing a smooth, short talk. I also found the ease and non-obtrusiveness of using popular recording devices (specifically, a GoPro Hero Session and the Voice Recorder app on my iPhone) for capturing feedback when practicing in front of others to be important.

Read "The story behind my TEDx Talk" in the Writings Section.

The Frankston Challenge

Tue, 10 Nov 2015 23:32:42 GMT

This morning I went for a normal checkup with a healthcare provider I hadn't visited in a few years. The receptionist handed me a clipboard with paper forms to fill out to "catch up" -- nothing pre-filled out. Luckily, I was given a very sharp pen -- the fields were often way too small (when you live in "Newton Highlands" you need a lot of room for "City"). When I got to the section on "Pharmacy" I remembered that that office had sent a prescription a previous time to the wrong location (same street and company, wrong city and address). To make sure this time, I needed the address and phone number. I took out my cell phone to look it up. One bar. Very slow and dropped connectivity. I found the address but gave up on the phone number. I saw across the room a little sign with "WiFi" written on it. I guess I could have gotten up and followed the info to connect to their wireless router, if I trusted it, but I was too lazy. I left the field blank.

This was a reminder of what I call "The Frankston Challenge". As Bob Frankston has written repeatedly over the years, we need to have ambient connectivity that works everywhere without a prior relationship or human intervention to get past sign-in screens. (For example, see: Understanding Ambient Connectivity.) Yes, there was Internet connectivity where I was (in a normal office building in a major suburb on a busy street right near I-95/Rt 128), but it wasn't good enough or easy enough to answer a simple question quickly. For the laptops that the doctors and nurses used it was no problem -- they had already keyed in the special codes. For a new mobile device walking into the office, it was trouble like waiting for an old dialup modem to connect, or nothing.

In the world of the Internet of Mobile Things (IoMT) you want to be able to just connect. Mobile Things include smartphones, tablets, wearables, automobiles, and so many coming things. They move with (or on) you, or on their own. The connectivity environment around them changes. The Frankston Challenge is very real for them today. For example, you can't, as Bob points out, require a pacemaker or a wearable reporting info to doctors and other systems to be pre-authorized with wireless carriers and WiFi access points everywhere the person might go. You don't want to have to pull out a keyboard (and maybe a credit card) to authorize connectivity when you feel tightness in your chest. This problem won't be solved overnight. Bob has been working constantly to bring this to people's attention.

For me, though, I'm in the world of making systems for business people to build apps that run on tablets and phones that move around with the users. We can't wait. This means that we won't have reliable connectivity everywhere, even where you would expect it. We have to build our systems to tolerate loss of connectivity. A consultant may be visiting a client's retail site or factory, or a repair person may be out in the field fixing a pump or transformer. The apps they run that replace the paper clipboard with something better and more tied into the flow of their tasks must be able to run offline. We at Alpha Software have been working steadily to make disconnection-capable apps easier to build so that can be the default configuration. This is especially true for data capture applications. Not necessarily ignoring connectivity when available, but not becoming useless when it isn't.

A final irony: The office building where my doctor worked was the same one where my old company, Software Arts, had rented some space back in the early 1980s. It was across a parking lot from our main building. We used a centralized computer for timesharing to run our whole business. We had a very early Ethernet system installed to give "high-speed" connectivity to the terminals on everyone's desk. There was coax cable everywhere. (This was very early in the history of Ethernet and the IBM XT was just coming out.) To connect the "remote" office hundreds of feet away, we had a trench dug under the parking lot (and then paved over) to run [...]
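The "tolerate loss of connectivity" approach above can be sketched in a few lines of JavaScript. This is an illustrative pattern, not Alpha Software's implementation: captured records go into a local queue (an array here, standing in for persistent storage such as IndexedDB), so data capture keeps working with no connection at all, and the queue is flushed whenever connectivity returns.

```javascript
// Minimal offline-first data-capture sketch (illustrative names, not a real API).
class OfflineQueue {
  constructor(sendFn) {
    this.sendFn = sendFn;   // uploads one record to the server; throws when offline
    this.pending = [];      // stand-in for durable local storage (e.g., IndexedDB)
  }
  capture(record) {
    // Capturing always succeeds locally, regardless of connectivity.
    this.pending.push(record);
  }
  async flush() {
    // Try to upload everything; keep whatever fails for the next attempt.
    const failed = [];
    for (const record of this.pending) {
      try { await this.sendFn(record); }
      catch (e) { failed.push(record); }
    }
    this.pending = failed;
    return this.pending.length === 0;  // true when fully synced
  }
}
```

In a browser app, `flush()` would typically be hooked to the `online` event and retried on a timer; the point is that the clipboard-replacing app never becomes useless just because the upload can't happen right now.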

Long time no blog

Tue, 10 Nov 2015 23:30:54 GMT

It's been a long time since I posted here on my blog. I've written a couple of essays that appeared on the main web site, and I've tweeted a lot (as @DanB), and even did some podcasting as part of the Adventures in Alpha Land podcast series, but not much here. My day job at Alpha Software has continued to keep me very busy with other things.

You may have missed my Alpha WatchBench app for Apple Watch. See "How Alpha WatchBench Came About" on my main web site.

Essay about issues and techniques related to disconnected mobile apps

Tue, 15 Jul 2014 19:22:56 GMT

We're spending a lot of time at Alpha Software working on making it easier to create mobile business apps that support sometimes- or frequently-disconnected operation. As part of that work, I've learned a lot about how this issue is an impediment to widespread mobile app deployment in business and also about many of the areas that must be addressed in order to do it well. I've just posted an essay that covers a lot of what I've learned.

You can read "Dealing with Disconnected Operation in a Mobile Business Application: Issues and Techniques for Supporting Offline Usage" in the Writings section of my web site.
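One of the issues any treatment of offline usage has to address is what happens when an edit made offline collides with a change made elsewhere while the device was disconnected. A minimal sketch of optimistic-concurrency conflict detection with a version counter (the names are illustrative, not drawn from the essay or from Alpha Anywhere's API):

```javascript
// Illustrative sketch: each record carries a version number; a client update
// says which version it was based on. A mismatch means someone else changed
// the record while this client was offline, so the client must merge or retry.
function applyUpdate(serverRecord, clientUpdate) {
  if (clientUpdate.baseVersion !== serverRecord.version) {
    return { status: "conflict", server: serverRecord };  // let the client resolve it
  }
  return {
    status: "ok",
    record: {
      ...serverRecord,
      ...clientUpdate.fields,
      version: serverRecord.version + 1,  // bump so later stale updates are caught
    },
  };
}
```

Timestamps, row hashes, or field-level merging are common variations on the same idea; the essential step is detecting the collision instead of silently letting the last writer win.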

[Image in the original post on Dan Bricklin's Log with caption: The Offline Problem: Some common images of low connectivity]

HTML5 First: Google innovators prototype in the browser

Fri, 27 Jun 2014 18:24:21 GMT

Two days ago, on Wednesday, I watched the Google I/O Keynote live stream. Early on in the two-and-a-half-hour mega-presentation they introduced their new "visual language" for UI design: Material Design. It has "materials" inspired by paper with shadows, bold colorful graphics, and "meaningful" motion/animation.

As a developer who makes heavy use of HTML5, what immediately struck me was this statement, made starting at about 20:20 into the video:

"Last year at I/O we announced Polymer, which is a powerful new UI library for the web.

Today, we're bringing you all of the Material Design capabilities to the web through Polymer. [Applause] As a web developer, you'll be able to build applications out of Material Design building blocks with all of the same surfaces, bold graphics, and smooth animations at 60 frames per second. [More applause, followed by the speaker smiling and ad-libbing: "That was good..."]

So, between the L preview and Polymer, you can bring the same, rich, fluid Material Design to every screen."

Wow, I thought, Google not only designed a mobile UI for their Java-driven devices, they then also went to the trouble of building it in HTML5 for web apps (mobile and desktop).

I was wrong. I did some looking at the documentation for Polymer, and in the FAQ I found this (emphasis added):

How is Polymer related to material design?

Polymer is the embodiment of material design for the web. The Polymer team works closely with the design teams behind material design. In fact, Polymer played a key role in material design's development: it was used to quickly prototype and iterate on design concepts. The material design components are still a work in progress, but will mature over the coming months.

So, the HTML5 version wasn't created after the native versions. It was the prototyping environment before the native code.

This is a great model to follow: Prototype, iterate, and even first ship, in HTML5. Once you know what you need, if necessary, take the time to do native code. This doesn't just apply to the old desktop (as it has for years). It also applies to today's polished, fluid mobile world.

As a developer who is closely connected to a system that produces HTML5, and that aids in rapid prototyping, I was delighted. Here are some of the leading-edge mobile developers, and they found that HTML5 has the power to do what they want, and do it quickly enough for the demands of the iterative design and testing that is so important in the mobile world. After hearing so many people claiming that "they hear" that HTML5 doesn't have the power for serious mobile applications, it was vindication to hear of people who actually build things choosing to go the HTML5 route, even when cost clearly wasn't the object, and succeeding.

To be fair, the Material Design developers did depend on the latest browser versions, which have built-in support for hardware-accelerated animation and compositing, and on a sophisticated JavaScript library to make coding easier. However, most mobile devices are now getting upgraded frequently to the latest browsers (Google announced better upgrading for Android in the same keynote and Apple brags about the large percentage of iOS users who run the latest version) and most mobile HTML5 developers and systems (the one I use included) make use of such libraries already.

I guess it's time for a new mobile-related tag: #HTML5first.

A new app from Dan and Alpha: AlphaRef Reader

Tue, 20 May 2014 16:29:54 GMT

For various reasons I decided that I needed an app tuned to reading reference material. I've released to the Apple App Store and the Google Play Store a new app that demonstrates addressing this need: AlphaRef King James Version of the Bible. This is the text of the Bible presented in a new reading environment that I created along with my daughter Adina, a UX and graphics designer. It is implemented using Alpha Anywhere and delivered as an HTML5 app in a PhoneGap wrapper.

To read the whole story, and to see a video of the app in action, read "AlphaRef Reader: Tablet-first design of an app for reading reference material" in the Writings section of my web site.

BCS Mac Introduction Meeting video is now available -- how that came about

Sun, 26 Jan 2014 20:55:46 GMT

The January 30, 1984, video of the introduction of the Macintosh to the public at a Boston Computer Society General Meeting is now available online. Harry McCracken, an editor at large at TIME for personal technology, posted an exclusive look at the video early this morning, right after excerpts were shown at the Mac 30th event in California (an "official" event marking the 30th anniversary of the original Mac announcement at the same place where it occurred). The Computer History Museum will be posting material on its web site tomorrow, and will update the video when additional processing of it finishes. See "Exclusive: Watch Steve Jobs’ First Demonstration of the Mac for the Public, Unseen Since 1984" on TIME's site.

Harry explains how there were other videos of prior deliveries of Steve Jobs' talk introducing the Mac, but that this one is special for several reasons, some of which I touched upon in my blog post Friday about it, such as the Q&A with a general tech audience. He also explains a bit about how the release of the video came about. Let me fill in a bit more.

These were not fully "lost" videos. I have had my copy of the January 1984 VHS tape for years and let others know I had it, showing excerpts sometimes when I gave talks. Given the use of Apple material, and the special nature of the video, I did not feel it was time to release it yet to the public as I was slowly doing with other videos I have (not BCS ones). I was in contact with the videographer, Glenn Koenig, who shot most of the BCS meetings on Software Arts' behalf over the years. He let me know that he had masters of many of them, though he hadn't cataloged them (and didn't search out the Jobs one until Harry asked, which yielded better versions of parts of the presentation). I let him know of the material that I had made sure made it to the Computer History Museum. I encouraged him to continue to maintain his tapes in good condition, which is what he has been doing all these years.

We always intended to complete the edits of this material, but there is a lot of it, and digitizing it properly and editing it takes a lot of time and money. We didn't have the funds to do that, and it was unclear if others would chip in and how to bring that about. Doing the work prematurely with the wrong equipment could compromise the tapes. (Of course, waiting too long could cause them to deteriorate too much.)

When Harry, a BCS member going back to the VisiCalc days, contacted BCS founder Jonathan Rotenberg in the fall of 2012 to ask about video from BCS meetings, Jonathan got him in contact with Glenn. Glenn, Jonathan, and I brainstormed about ways to finally make the restoration and release happen. I contacted Ray Ozzie, who I knew was interested in historic videos and had worked with us at Software Arts in the early 1980s. He, like Harry, suggested the Computer History Museum as a means for funding instead of something like Kickstarter. I have been a long-time supporter of the Museum, and have been inducted as one of their "fellows" for my work on VisiCalc, and knew this was the right way to go.

Eventually, Jonathan, Glenn, and I started working with people at the Museum, cataloging which material we had between us, and worked up a budget and a plan for fundraising. Glenn pointed out the upcoming 30th anniversary of the Mac as a good "news hook" and that a hoped-for article by TIME could help publicize the entire set (thank you, Harry, for coming through with a great one!). We met periodically on the phone (we all had other jobs to keep us very busy) and eventually crafted a "request" letter and a list of potential donors. Glenn produced a short trailer of excerpts from a few of the tapes using some refurbished equipment to show those people. In September of last year we had the material ready for fundraising and I hand-signed (and in some cases added a little note to) a few dozen lette[...]

The Mac turns 30 and because we preserved the history you will be able to relive it

Fri, 24 Jan 2014 19:44:48 GMT

Today is 30 years since the Apple Macintosh was announced. That event occurred at a shareholders meeting in California. The following Monday, Steve Jobs and a crew of people from Apple traveled to Boston, Massachusetts, to redo that event for the public. It was at the Boston Computer Society General Meeting on January 30, 1984, at the John Hancock Hall in Boston. In addition to Steve's speech and demonstration, he also brought many of the developers of the Mac software and hardware with him. After his presentation, they all sat on stage and answered questions from the audience and did further demos. For the over 1,000 people who attended, it was an amazing event.

At the shareholders meeting there was the worry of how it would be received and if the demos would work. At the BCS meeting there was much less pressure, and Steve was relaxed and confident and engaged the audience. The attendees were knowledgeable and savvy.

There is something else about the Boston event: My old company, Software Arts, at my suggestion as I recall, was paying to have video tapes made of many of the BCS General Meetings. (I was a board member of the BCS at various points.) For this event, Apple provided additional cameras and made sure the lighting was good (it was not good in California, apparently). We at Software Arts ended up with an edited VHS copy of the event that I have kept for these 30 years and sometimes show excerpts from when I talk to students to show them what it was like in the "old" days. I also would show it to my daughters to let them see how a real pro delivered a speech. At this point I know Steve's intonation on every syllable of the start of his presentation. (It's different than when he gave it in California.)

What happened last year is something wonderful: It turns out that many of the original BCS meeting tapes still exist. Some were donated to the Computer History Museum with my help. Many of the master tapes were still with the videographer, Glenn Koenig, including 3/4" tapes that are higher quality than the VHS copies. Together with Glenn and Jonathan Rotenberg (the founder and initial head of the BCS), we finally started a project with the Computer History Museum to restore the tapes and make the videos available to the public. This involves careful work on refurbished old equipment and careful editing, remembering the events themselves, to get the best possible video and sound. It will also include transcripts and other related museum-type treatments.

We have raised money to pay for this (there are over 20 meetings to process) and have started digitizing and editing the tapes. The Mac Introduction will be the first, but there are others of great interest with the leaders of many leading personal computing companies, from Microsoft to Radio Shack to Digital Research to Lotus and IBM. A thank you to Brad Feld, who remembers the impact the BCS had on his interest in both technology and entrepreneurship from his days at MIT, and all of the others who came through with the money we needed! The Mac video will be released soon (I'll post here when it's available) and others will follow over time.

[Photo of Steve Jobs about to insert disk into Mac in January 1984, from my VHS copy appears here in the original blog post on Dan Bricklin's Log]

For those of you who have only seen the young Steve Jobs portrayed by actors, seeing him and his team as they actually were should be a real treat. I get chills down my spine watching it. It is such a joy to realize that a decision we made over 30 years ago at my old company, when the personal computing industry was a young oddity, will bring those days to life for a new generation and for generations to come, and that they will care and appreciate it. (This isn't the first time I've felt that way. Another recording we did, an internal one of a staff meeting as the IBM PC was being announce[...]

MassTLC unConference session on Responsive App Design

Fri, 08 Nov 2013 20:04:01 GMT

Just as it has for the three years before, this fall the Massachusetts Technology Leadership Council ran its "Innovation unConference". They have a web site that explains it all, and I posted lots of photos to Flickr as I have in the past.

This time, I proposed and moderated a session on Responsive App Design. I started things off by showing my video on the topic to get everybody up to speed with what I meant by the term. (On this blog I had called the topic "Responsive App Layout", but I've since switched to the more common word "Design", now use the URL [] for the companion web site, and have redone the video to better show the example app.)

In the session, we had a lively, useful discussion, which one of the attendees kindly captured on a flip chart that the conference had provided in the room. There were things worth sharing, so I'm reproducing what was written with a little embellishment:

- Contexts / Break Point (what to take into account for different designs)
-- Width / Columns / Landscape / Portrait
-- Sighted / Blind / Color blind
-- Mouse / Touch
-- User standing / sitting / walking / computer mounted on a wall
-- Language orientation (e.g., left to right vs. right to left)
-- One-handed use vs. two-handed use
-- Google Glass, Voice controls, Wrist devices
-- The user's state of mind: Calm, stressed, etc.
-- Location: Inside, outside, driving, moving

- Examples of apps to learn from
-- Dark design
-- Inpath / Pathapp (for innovative controls)
-- Prezi
-- Grid on iOS
-- Runkeeper
-- Messaging apps (Apple: context changing UX)
-- Tandem Seven (design firm HQ in Massachusetts)
-- Has lots of examples of UX of different apps organized by tasks (screenshots)

- Apps vs. Sites:
-- Sites have a Presentation Layer
-- Apps also have major Control Layers and Input Layers

- Don't forget different expectations in each OS for user interface style
-- Android vs. iOS vs. ...?

- "Write Once, QA Many"

- UX control techniques
-- Ease of Discoverability vs. Learned / Taught

- Action Items
-- Design Guidelines (we're too soon for standards)
-- Need more good/successful examples of business apps

That's it. One of the attendees told us about how his company had worked quite hard on each variation of their native apps for each platform and the thinking that went into each input control and interaction. They are in an app category (messaging) that is quite competitive and where the specifics of the UI are one of the distinguishing features of the product and what makes it worthwhile for users to choose their product instead of just using the built-in app. I hope I can get a podcast with him at some point to hear what they went through.
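Several of the contexts in the flip-chart list above (orientation, mouse vs. touch, width) map directly onto standard CSS media queries that an HTML5 app can test from JavaScript. A small browser-side sketch; the media-query strings are standard CSS, but the helper names are mine, not from the session:

```javascript
// Illustrative context detection via standard media queries.
// In a browser, window.matchMedia(query).matches answers each one.
const contexts = {
  portrait: "(orientation: portrait)",
  touch: "(pointer: coarse)",      // coarse pointer ≈ finger rather than mouse
  narrow: "(max-width: 600px)",
};

// The matcher is injectable so the logic can run (and be tested) outside a browser.
function currentContexts(mq = q => window.matchMedia(q).matches) {
  return Object.keys(contexts).filter(name => mq(contexts[name]));
}
```

Many of the other contexts on the list (state of mind, standing vs. sitting, one-handed use) have no media query at all, which is part of why the group felt design guidelines were needed beyond what the platform can detect.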

Responsive App Layout

Thu, 17 Oct 2013 16:13:18 GMT

I've been writing a lot here about using HTML5 for business web apps, especially mobile apps. A benefit of HTML5 is that apps that run on multiple platforms are usually easier and less expensive to write than writing in native code for each platform. A challenge is having that one codebase adjust to the screen size and orientation differences of the different devices it runs on. This is more extensive than what is usually needed by the informational web sites traditionally targeted by Responsive Web Design. Developers need to take into account not just screen width but also screen height, as well as on- and off-screen controls.

I've produced a video, "Responsively Laying Out Web Apps", that explores this issue, embedded below. I've also written an essay about it, "Responsive Web App Design".

[Video on YouTube embedded in the original blog post on Dan Bricklin's Log]

Feedback that I've gotten is that this is an important issue that developers need to deal with. The video also shows how we have started to address it on the tools side by adding new capabilities to Alpha Anywhere.