
Adactio: Journal



The online journal of Jeremy Keith, an author and web developer living and working in Brighton, England.



 



Global Diversity CFP Day—Brighton edition

Thu, 01 Feb 2018 16:42:21 GMT

There are enough middle-aged straight white men like me speaking at conferences. That’s why the Global Diversity Call-For-Proposals Day is happening this Saturday, February 3rd.

The purpose is two-fold. One is to encourage a diverse range of people to submit talk proposals to conferences. The other is to help with the specifics—coming up with ideas, writing a good title and abstract, preparing the presentation, and all that.

Julie is organising the Brighton edition. Clearleft are providing the venue—68 Middle Street. I’ll be on hand to facilitate. Rosa and Dot will be doing the real work, mentoring the attendees.

If you’ve ever thought about submitting a talk proposal to a conference but just don’t know where to start, or if you’re just interested in the idea, please do come along on Saturday. It starts at 11am and will be all wrapped up by 3pm.

See you there!







GDPR and Google Analytics

Mon, 29 Jan 2018 12:55:35 GMT

Enforcement of the European Union’s General Data Protection Regulation is coming very, very soon. Look busy. This regulation is not limited to companies based in the EU—it applies to any service anywhere in the world that can be used by citizens of the EU. It’s less about data protection and more like a user’s bill of rights. That’s good.

Cennydd has written a techie’s rough guide to GDPR. The Open Data Institute’s Jeni Tennison wrote down her thoughts on how it could change data portability in particular. While she welcomes GDPR, she has some misgivings.

Blaine—who really needs to get a blog—shared his concerns in the form of the online equivalent of interpretive dance …a twitter thread (it’s called a thread because it inevitably gets all tangled, and it’s easy to break.)

It’s increasingly looking like GDPR is a massive scaled-up version of the idiotic and horrifically mis-managed “cookie law”.

— Blaine Cook (@blaine) January 28, 2018

The interesting thing about the so-called “cookie law” is that it makes no mention of cookies whatsoever. It doesn’t list any specific technology. Instead it states that any means of tracking or identifying users across websites requires disclosure.

So if you’re setting a cookie just to manage state—so that users can log in, or keep items in a shopping basket—the legislation doesn’t apply. But as soon as your site allows a third-party to set a cookie, it’s banner time.

Google Analytics is a classic example of a third-party service that uses cookies to track people across domains. That’s pretty much why it exists. We, as site owners, get to use this incredibly powerful tool, and all we have to do in return is add one little snippet of JavaScript to our pages. In doing so, we’re allowing a third party to read or write a cookie from their domain.

Before Google Analytics, Google—the search engine business—was able to identify and track what users were searching for, and which search results they clicked on. But as soon as the user left google.com, the trail went cold.

By creating an enormously useful analytics product that only required site owners to add a single line of JavaScript, Google—the online advertising business—gained the ability to keep track of users across most of the web, whether they were on a site owned by Google or not.

Under the old “cookie law”, using a third-party cookie-setting service like that meant you had to inform any of your users who were citizens of the EU. With GDPR, that changes. Now you have to get consent.

A dismissible little overlay isn’t going to cut it any more. Implied consent isn’t enough.

Now this situation raises an interesting question. Who’s responsible for getting consent? Is it the site owner or the third party whose script is the conduit for the tracking?

In the first scenario, you’d need to wait for an explicit agreement from a visitor to your site before triggering the Google Analytics functionality. Suddenly it’s not as simple as adding a single line of JavaScript to your site.

In the second scenario, you don’t do anything differently than before—you just add that single line of JavaScript. But now that script would need to launch the interface for getting consent before doing any tracking. Google Analytics would go from being something invisible to something that directly impacts the user experience of your site.

I’m just using Google Analytics as an example here because it’s so widespread. This also applies to third-party sharing buttons—Twitter, Facebook, etc.—and of course, advertising.
In the case of advertising, it gets even thornier because quite often, the site owner has no idea which third party is about to do the tracking. Many, many sites use intermediary services (y’know, ‘cause bloated ad scripts aren’t slowing down sites enough so let’s throw some just-in-time bidding into the mix too). You could get consent for the intermediary service, but not for the final advert[...]
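Coming back to the first scenario above—waiting for explicit agreement before triggering Google Analytics—here’s a minimal sketch of what that might look like, using the analytics.js async snippet. The “#consent-button” element, the localStorage key, and the “UA-XXXXX-Y” property ID are all placeholders, and this says nothing about what a compliant consent interface should actually look like.

```javascript
// Sketch: only load Google Analytics after the visitor explicitly opts in.
// "#consent-button", the localStorage key, and "UA-XXXXX-Y" are placeholders.
function loadAnalytics() {
  // Standard command queue, so ga() calls can be made before the script arrives.
  window.ga = window.ga || function () {
    (window.ga.q = window.ga.q || []).push(arguments);
  };
  window.ga.l = +new Date();
  ga('create', 'UA-XXXXX-Y', 'auto');
  ga('send', 'pageview');

  var script = document.createElement('script');
  script.async = true;
  script.src = 'https://www.google-analytics.com/analytics.js';
  document.head.appendChild(script);
}

var consentButton = document.querySelector('#consent-button');

if (localStorage.getItem('analytics-consent') === 'granted') {
  // A returning visitor who has already said yes.
  loadAnalytics();
} else if (consentButton) {
  consentButton.addEventListener('click', function () {
    localStorage.setItem('analytics-consent', 'granted');
    loadAnalytics();
  });
}
```

The point of the sketch is simply that the tracking script never touches the page until the user has acted—which is exactly why it’s no longer “just one line of JavaScript.”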



Design ops for design systems

Thu, 25 Jan 2018 12:56:57 GMT

Leading Design was one of the best events I attended last year. To be honest, that surprised me—I wasn’t sure how relevant it would be to me, but it turned out to be the most on-the-nose conference I could’ve wished for.

Seeing as the event was all about design leadership, there was inevitably some talk of design ops. But I noticed that the term was being used in two different ways.

Sometimes a speaker would talk about design ops and mean “operations, specifically for designers.” That means all the usual office practicalities—equipment, furniture, software—that designers might need to do their jobs. For example, one of the speakers recommended having a dedicated design ops person rather than trying to juggle that yourself. That’s good advice, as long as you understand what’s meant by design ops in that context.

There’s another context of use for the phrase “design ops”, and it’s one that we use far more often at Clearleft. It’s related to design systems.

Now, “design system” is itself a term that can be ambiguous. See also “pattern library” and “style guide”. Quite a few people have had a stab at disambiguating those terms, and I think there’s general agreement—a design system is the overall big-picture “thing” that can contain a pattern library, and/or a style guide, and/or much more besides:

Styleguides, Pattern Libraries and Design Languages

Design Systems vs. Pattern Libraries vs. Style Guides – What’s the Difference?

Design Systems, Style Guides, and Pattern Libraries: Oh My!

What’s the difference between style guides, pattern libraries, and design systems?

None of those great posts attempt to define design ops, and that’s totally fair, because they’re all attempting to define things—style guides, pattern libraries, and design systems—whereas design ops isn’t a thing, it’s a practice. But I do think that design ops follows on nicely from design systems. I think that design ops is the practice of adopting and using a design system.

There are plenty of posts out there about the challenges of getting people to use a design system, and while very few of them use the term design ops, I think that’s what all of them are about:

Why Design Systems Fail

Tips for in-house teams in a free market software culture

Putting your design system into practice

Clearly design systems and design ops are very closely related: you really can’t have one without the other. What I find interesting is that a lot of the challenges relating to design systems (and pattern libraries, and style guides) might be technical, whereas the challenges of design ops are almost entirely cultural.

I realise that tying design ops directly to design systems is somewhat limiting, and the truth is that design ops can encompass much more. I like Andy’s description:

Design Ops is essentially the practice of reducing operational inefficiencies in the design workflow through process and technological advancements.

Now, in theory, that can encompass any operational stuff—equipment, furniture, software—but in practice, when we’re dealing with design ops, 90% of the time it’s related to a design system. I guess I could use a whole new term (design systems ops?) but I think the term design ops works well …as long as everyone involved is clear on the kind of design ops we’re all talking about. [...]



Needs must

Sat, 20 Jan 2018 17:07:06 GMT

I got a follow-up comment to my follow-up post about the follow-up comment I got on my original post about Google Analytics. Keep up.

I made the point that, from a front-end performance perspective, server logs have no impact whereas a JavaScript-based analytics solution must have some impact on the end user. Paul Anthony says:

Google won the analytics war because dropping one line of JS in the footer and handing a tried and tested interface to customers is an obvious no brainer in comparison to setting up an open source option that needs a cron job to parse the files, a database to store the results and doesn’t provide mobile interface.

Good point. Dropping one snippet of JavaScript into your front-end codebase is certainly an easier solution …easier for you, that is. The cost is passed on to your users. This is a classic example of where user needs and developer needs are in opposition.

I’ve said it before and I’ll say it again:

Given the choice between making something my problem, and making something the user’s problem, I’ll choose to make it my problem every time.

It’s true that this often means doing more work. That’s why it’s called work. This is literally what our jobs are supposed to entail: we put in the work to make life easier for users. We’re supposed to be saving them time, not passing it along.

The example of Google Analytics is pretty extreme, I’ll grant you. The cost to the user of adding that snippet of JavaScript—if you’ve configured things reasonably well—is pretty small (again, just from a performance perspective; there’s still the cost of allowing Google to track them across domains), and the cost to you of setting up a comparable analytics system based on server logs can indeed be disproportionately high. But this tension between user needs and developer needs is something I see play out again and again.

I’ve often thought the HTML design principle called the priority of constituencies could be adopted by web developers:

In case of conflict, consider users over authors over implementors over specifiers over theoretical purity. In other words costs or difficulties to the user should be given more weight than costs to authors.

In Resilient Web Design, I documented the three-step approach I take when I’m building anything on the web:

Identify core functionality.

Make that functionality available using the simplest possible technology.

Enhance!

Now I’m wondering if I should’ve clarified that second step further. When I talk about choosing “the simplest possible technology”, what I mean is “the simplest possible technology for the user”, not “the simplest possible technology for the developer.”

For example, suppose I were going to build a news website. The core functionality is fairly easy to identify: providing the news. Next comes the step where I choose the simplest possible technology. Now, if I were a developer who had plenty of experience building JavaScript-driven single page apps, I might conclude that the simplest route for me would be to render the news via JavaScript. But that would be a fragile starting point if I’m trying to reach as many people as possible (I might well end up building a swishy JavaScript-driven single page app in step three, but step two should almost certainly be good ol’ HTML).

Time and time again, I see decisions that favour developer convenience over user needs. Don’t get me wrong—as a developer, I absolutely want developer convenience …but not at the expense of user needs.
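As a rough sketch of what steps two and three might look like for that hypothetical news site: the HTML links do the core job on their own, and a small script only adds “load more in place” behaviour when the browser can support it. The “.news-list” and “.news-list-next” hooks are made up for illustration; the point is that the enhancement can fail without taking the core functionality with it.

```javascript
// Enhancement layer for a server-rendered news listing (step three).
// The core functionality — reading the news — is plain HTML; this script only
// upgrades the "older stories" link when the browser is capable.
// The ".news-list" / ".news-list-next" hooks are hypothetical.
if ('fetch' in window && 'DOMParser' in window) {
  var list = document.querySelector('.news-list');
  var nextLink = document.querySelector('.news-list-next'); // an ordinary <a href>
  if (list && nextLink) {
    nextLink.addEventListener('click', function (event) {
      event.preventDefault();
      fetch(nextLink.href)
        .then(function (response) { return response.text(); })
        .then(function (html) {
          var doc = new DOMParser().parseFromString(html, 'text/html');
          var more = doc.querySelector('.news-list');
          if (more) {
            list.insertAdjacentHTML('beforeend', more.innerHTML);
          }
        })
        .catch(function () {
          // If the enhancement fails, fall back to a normal page load.
          window.location.href = nextLink.href;
        });
    });
  }
}
```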
I know that “empathy” is an over-used word in the world of user experience and design, but with good reason. I think we should try to remind ourselves of why we make our architectural decisions by invoking who those decisions benefit. For example, “This tech stack is the best option for our team”, or “This solution is the best for the widest range of users.” Then, given the choi[...]



Heisenberg

Fri, 19 Jan 2018 11:30:23 GMT

I wrote about Google Analytics yesterday. As usual, I syndicated the post to Ev’s blog, and I got an interesting response over there. Kelly Burgett set me straight on some of the finer details of how goals work, and finished with this thought:

You mention “delivering a performant, accessible, responsive, scalable website isn’t enough” as if it should be, and I have to disagree. It’s not enough for a business to simply have a great website if you are unable to understand performance of channel marketing, track user demographics and behavior on-site, and optimize your site/brand based on that data. I’ve seen a lot of ugly sites who have done exceptionally well in terms of ROI, simply because they are getting the data they need from the site in order make better business decisions. If your site cannot do that (ie. through data collection, often third party scripts), then your beautifully-designed site can only take you so far.

That makes an excellent case for having analytics. But that’s not necessarily the same as having Google Analytics, or even JavaScript-driven analytics at all.

By far the most useful information you get from analytics is around where people have come from, where they went next, and what kind of device they’re using. None of that information requires JavaScript. It’s all available from your server logs.

I don’t want to come across all old-man-yell-at-cloud here, but I’m trying to remember at what point self-hosted software for analysing your log traffic became not good enough.

Here’s the thing: logging on the server has no effect on the user experience. It’s basically free, in terms of performance. Logging via JavaScript, by its very nature, has some cost. Even if it’s negligible, that’s one more request, and that’s one more bit of processing for the CPU.

All of the data that you can only get via JavaScript (in-page actions, heat maps, etc.) are, in my experience, better handled by dedicated software. To me, that kind of more precise data feels different to analytics in the sense of funnels, conversions, goals and all that stuff.

So in order to get more fine-grained data to analyse, our analytics software has now doubled down on a technology—JavaScript—that has an impact on the end user, where previously the act of observation could be done at a distance.

There are also blind spots that come with JavaScript-based tracking. According to Google Analytics, 0% of your customers don’t have JavaScript. That’s not necessarily true, but there’s literally no way for Google Analytics—which relies on JavaScript—to even do its job in the absence of JavaScript. That can lead to a dangerous situation where you might be led to think that 100% of your potential customers are getting by, when actually a proportion might be struggling, but you’ll never find out about it. Related: according to Google Analytics, 0% of your customers are using ad-blockers that block requests to Google’s servers. Again, that’s not necessarily a true fact.

So I completely agree that analytics are a good thing to have for your business. But it does not follow that Google Analytics is a good thing for your business. Other options are available.

I feel like the assumption that “analytics = Google Analytics” is like the slippery slope in reverse. If we’re all agreed that analytics are important, then aren’t we also all agreed that JavaScript-based tracking is important? In a word, no. This reminds me of the arguments made in favour of intrusive, bloated advertising scripts.
All of the arguments focus on the need for advertising—to stay in business, to pay the writers—which are all great reasons for advertising, but have nothing to do with JavaScript, which is at the root of the problem. Everyone I know who uses an ad-blocker—including me—doesn’t use it to stop seeing adverts, but to stop the performance of the page being degra[...]
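As an aside on the server-log point above: here’s a rough sketch of how the “where did they come from, what device are they using” questions can be answered from a plain access log, with nothing at all sent to the visitor’s browser. It assumes the common “combined” log format and a hypothetical access.log file; dedicated log-analysis software does this far more thoroughly.

```javascript
// Node sketch: tally referrers and user agents from an Apache/nginx
// "combined" format access log. No code runs in the visitor's browser.
const fs = require('fs');
const readline = require('readline');

const referrers = {};
const userAgents = {};

// combined log format ends with: "request" status bytes "referrer" "user-agent"
const pattern = /"[^"]*" \d+ \S+ "([^"]*)" "([^"]*)"$/;

const lines = readline.createInterface({
  input: fs.createReadStream('access.log')
});

lines.on('line', (line) => {
  const match = line.match(pattern);
  if (!match) return;
  const [, referrer, userAgent] = match;
  referrers[referrer] = (referrers[referrer] || 0) + 1;
  userAgents[userAgent] = (userAgents[userAgent] || 0) + 1;
});

lines.on('close', () => {
  const top = (counts) => Object.entries(counts)
    .sort((a, b) => b[1] - a[1])
    .slice(0, 10);
  console.log('Top referrers:', top(referrers));
  console.log('Top user agents:', top(userAgents));
});
```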



Analysing analytics

Thu, 18 Jan 2018 19:32:25 GMT

Hell is other people’s JavaScript.

There’s nothing quite so crushing as building a beautifully performant website only to have it infested with a plague of third-party scripts that add to the weight of each page and reduce the responsiveness, making a mockery of your well-considered performance budget. Trent has been writing about this:

My latest realization is that delivering a performant, accessible, responsive, scalable website isn’t enough: I also need to consider the impact of third-party scripts.

He’s started the process by itemising third-party scripts. Frustratingly though, there’s rarely one single culprit that you can point to—it’s the cumulative effect of “just one more beacon” and “just one more analytics script” and “just one more A/B testing tool” that adds up to a crappy experience that warms your user’s hands by ensuring your site is constantly draining their battery.

Actually, having just said that there’s rarely one single culprit, Adobe Tag Manager is often at the root of third-party problems. That and adverts. It’s like opening the door of your beautifully curated dream home, and inviting a pack of diarrhetic elephants in: “Please, crap wherever you like.”

But even the more well-behaved third-party scripts can get out of hand. Google Analytics is so ubiquitous that it’s hardly even considered in the list of potentially harmful third-party scripts. On the whole, it’s a fairly well-behaved citizen of your site’s population of third-party scripts (y’know, leaving aside the whole surveillance capitalism business model that allows you to use such a useful tool for free in exchange for Google tracking your site’s visitors across the web and selling the insights from that data to advertisers).

The initial analytics script that you—asynchronously—load into your page isn’t very big. But depending on how you’ve configured your Google Analytics account, that might just be the start of a longer chain of downloads and event handlers.

Ed recently gave a lunchtime presentation at Clearleft on using Google Analytics—he professes modesty but he really knows his stuff. He was making sure that everyone knew how to set up goals’n’stuff.

As I understand it, there are two main categories of goals: events and destinations (there are also durations and pages, but they feel similar to destinations). You use events to answer questions like “Did the user click on this button?” or “Did the user click on that search field?”. You use destinations to answer questions like “Did the user arrive at this page?” or “Did the user come from that page?”

You can add as many goals to your site’s analytics as you want. That’s an intoxicating offer. The problem is that there is potentially a cost for each goal you create. It’s an invisible cost. It’s paid by the user in the currency of JavaScript sent down the wire (I wish that the Google Analytics admin interface were more like the old interface for Google Fonts, where each extra file you added literally pushed a needle higher on a dial).

It strikes me that the event-based goals would necessarily require more JavaScript in order to listen out for those clicks and fire off that information. The destination-based goals should be able to get all the information needed from regular page navigations.

So I have a hypothesis. I think that destination-based goals are less harmful to performance than event-based goals. I might well be wrong about that, and if I am, please let me know.
With that hypothesis in mind, and until I learn otherwise, I’ve got two rules of thumb to offer when it comes to using Google Analytics:

Try to keep the number of goals to a minimum.

If you must create a goal, favour destinations over events.

[...]
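To illustrate why I suspect event-based goals carry more client-side weight than destination-based ones (a hypothesis, not a measurement): a destination goal is matched by Google from the URL of the pageview hit that every tracked page already sends, whereas an event goal needs extra code in the page to listen for the interaction and fire an additional hit. A sketch using the analytics.js syntax; the “.signup-button” selector and the category/action names are purely illustrative.

```javascript
// A destination goal is matched from the URL of a pageview hit — the same hit
// every tracked page already sends — so it adds no extra code to the page:
ga('send', 'pageview');

// An event goal needs extra JavaScript: something has to listen for the
// interaction and send an additional hit. The selector and the
// category/action names here are illustrative placeholders.
var signupButton = document.querySelector('.signup-button');
if (signupButton) {
  signupButton.addEventListener('click', function () {
    ga('send', 'event', 'forms', 'signup-click');
  });
}
```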



Words I wrote in 2017

Mon, 01 Jan 2018 12:29:49 GMT

I wrote 78 blog posts in 2017. That works out at an average of six and a half blog posts per month. I’ll take it.

Here are some pieces of writing from 2017 that I’m relatively happy with:

Going Rogue. A look at the ethical questions raised by Rogue One.

In AMP we trust. My unease with Google’s AMP format was growing by the day.

A minority report on artificial intelligence. Revisiting two of Spielberg’s films after a decade and a half.

Progressing the web. I really don’t want progressive web apps to just try to imitate native apps. They can be so much more.

CSS. Simple, yes, but not easy.

Intolerable. A screed. I still get very, very angry when I think about how that manifestbro duped people.

Акула. Recounting a story told by a taxi driver.

Hooked and booked. Does A/B testing lead to dark patterns?

Ubiquity and consistency. Different approaches to building on the web.

I hope there’s something in there that you like. It’s always a nice bonus when other people like something I’ve written, but I write for myself first and foremost. Writing is how I figure out what I think. I will, of course, continue to write and publish on my website in 2018. I’d really like it if you did the same.




Food I ate in 2017

Mon, 01 Jan 2018 11:47:11 GMT

I did a fair bit of travelling in 2017, which I always enjoy. I particularly enjoy it when Jessica comes with me and we get to sample the cuisine of other countries.

Portugal will always be a culinary hotspot for me, particularly Porto (“tripas à moda do Porto” is one of the best things I’ve ever tasted). When I was teaching at the New Digital School in Porto back in February, I took full advantage of the culinary landscape. A seafood rice (and goose barnacles) at O Gaveto in Matosinhos was a particular highlight.

The most unexpected thing I ate in Porto was when I wandered off for lunch on my own one day. I ended up in a little place where, when I walked in, it was kind of like that bit in the Western when the music stops and everyone turns to look. This was clearly a place for locals. The owner didn’t speak any English. I didn’t speak any Portuguese. But we figured it out. She mimed something sandwich-like and said a word I wasn’t familiar with: bifana. Okay, I said. Then she mimed the universal action for drinking, so I said “agua.” She looked at me with a very confused expression. “Agua!? Não. Cerveja!” Who am I to argue? Anyway, she produced this thing which was basically some wet meat in a bun. It didn’t look very appetising. But this was the kind of situation where I couldn’t back out of eating it. So I took a bite and …it was delicious! Like, really, really delicious.

Later in February, we went to Pittsburgh to visit Cindy and Matt. We were there for my birthday, so Cindy prepared the most amazing meal. She reproduced a dish from the French Laundry—sous-vide lobster on orzo. It was divine!

Later in the year, we went to Singapore for the first time. The culture of hawker centres makes it the ideal place for trying lots of different foods. There were some real revelations in there.

We visited lots of other great places like Reykjavík, Lisbon, Barcelona, and Nuremberg. But as well as sampling the cuisine of distant locations, I had some very fine food right here in Brighton, home to Trollburger, purveyors of the best burger you’ll ever eat. I also have a thing for hot wings, so it’s very fortunate that The Joker, home to the best wings in Brighton, is just around the corner from the dance studio where Jessica goes for ballet. Regular wing nights became a thing in 2017.

I started a little routine in 2017 where I’d take a break from work in the middle of the afternoon, wander down to the seafront, and buy a single oyster. It only took a few minutes out of the day but it was a great little dose of perspective each time.

But when I think of my favourite meals of 2017, most of them were home-cooked. [...]



Audio I listened to in 2017

Sat, 30 Dec 2017 20:21:55 GMT

I huffduffed 290 pieces of audio in 2017. I’ve still got a bit of a backlog of items I haven’t listened to yet, but I thought I’d share some of my favourite items from the past year. Here are twelve pieces of audio, one for each month of 2017…

Donald Hoffman’s TED talk, Do we see reality as it really is?. TED talks are supposed to blow your mind, right? (22:15)
Donald Hoffman: Do we see reality as it is? | TED Talk | TED.com on Huffduffer

How to Become Batman on Invisibilia. Alix Spiegel and Lulu Miller challenge you to think of blindness as a social construct. Hear ‘em out. (58:02)
🎧 How to Become Batman | Invisibilia (NPR) on Huffduffer

Where to find what’s disappeared online, and a whole lot more: the Internet Archive on Public Radio International. I just love hearing Brewster Kahle’s enthusiasm and excitement. (42:43)
Where to find what’s disappeared online, and a whole lot more: the Internet Archive | Public Radio International on Huffduffer

Every Tuesday At Nine on Irish Music Stories. I’ve been really enjoying Shannon Heaton’s podcast this year. This one digs into that certain something that happens at an Irish music session. (40:50)
Episode 03-Every Tuesday at Nine | shannonheatonmusic.com on Huffduffer

Adam Buxton talks to Brian Eno (part two is here). A fun and interesting chat about Brian Eno’s life and work. (53:10 and 46:35)
EP.37 - Brian Eno, Part One on Huffduffer
EP.38 - Brian Eno, Part Two on Huffduffer

Nick Cave and Warren Ellis on Kreative Kontrol. This was far more revealing than I expected: genuine and unpretentious. (57:07)
Ep. #323: Nick Cave and Warren Ellis | Kreative Kontrol on Huffduffer

Paul Lloyd at Patterns Day. All the talks at Patterns Day were brilliant. Paul’s really stuck with me. (28:21)
Patterns Day: Paul Lloyd on Huffduffer

James Gleick on Time Travel at The Long Now. There were so many great talks from The Long Now’s seminars on long-term thinking. Nicky Case and Jennifer Pahlka were standouts too. (1:20:31)
James Gleick: Time Travel - The Long Now on Huffduffer

Long Distance on Reply All. It all starts with a simple phone call. (47:27)
#102 Long Distance on Huffduffer

The King of Tears on Revisionist History. Malcolm Gladwell’s style suits podcasting very well. I liked this episode about country songwriter Bobby Braddock. Related: Jon’s Troika episode on tearjerkers. (42:14)
The King of Tears on Huffduffer

Feet on the Ground, Eyes on the Stars: The True Story of a Real Rocket Man with G.A. “Jim” Ogle. This was easily my favourite podcast episode of 2017. It’s on the User Defenders podcast but it’s not about UX. Instead, host Jason Ogle interviews his father, a rocket scientist who worked on everything from Apollo to every space shuttle mission. His story is fascinating. (2:38:21)
Feet on the Ground, Eyes on the Stars: The True Story of a Real Rocket Man with G.A. “Jim” Ogle – User Defenders podcast : Inspiring Interviews with UX Superheroes. on Huffduffer

R.E.M. on Song Exploder. Breaking down the song Try Not To Breathe from Automatic For The People. (16:15)
Song Exploder | R.E.M. on Huffduffer

I’ve gone back and added the tag “2017roundup” to each of these items. So if you’d like to subscribe to a podcast of just these episodes, here are the links:

RSS
Subscribe in Podcasts app
Subscribe in Overcast
Subscribe in Downcast
Subscribe in Instacast
Subscribe in another app

[...]