
Adactio: Journal



The online journal of Jeremy Keith, an author and web developer living and working in Brighton, England.



 



Heisenberg

Fri, 19 Jan 2018 11:30:23 GMT

I wrote about Google Analytics yesterday. As usual, I syndicated the post to Ev’s blog, and I got an interesting response over there. Kelly Burgett set me straight on some of the finer details of how goals work, and finished with this thought:

You mention “delivering a performant, accessible, responsive, scalable website isn’t enough” as if it should be, and I have to disagree. It’s not enough for a business to simply have a great website if you are unable to understand performance of channel marketing, track user demographics and behavior on-site, and optimize your site/brand based on that data. I’ve seen a lot of ugly sites who have done exceptionally well in terms of ROI, simply because they are getting the data they need from the site in order make better business decisions. If your site cannot do that (ie. through data collection, often third party scripts), then your beautifully-designed site can only take you so far.

That makes an excellent case for having analytics. But that’s not necessarily the same as having Google Analytics, or even JavaScript-driven analytics at all.

By far the most useful information you get from analytics is around where people have come from, where they went next, and what kind of device they’re using. None of that information requires JavaScript. It’s all available from your server logs.

I don’t want to come across all old-man-yell-at-cloud here, but I’m trying to remember at what point self-hosted software for analysing your log traffic became not good enough.

Here’s the thing: logging on the server has no effect on the user experience. It’s basically free, in terms of performance. Logging via JavaScript, by its very nature, has some cost. Even if it’s negligible, that’s one more request, and that’s one more bit of processing for the CPU.

All of the data that you can only get via JavaScript (in-page actions, heat maps, etc.) are, in my experience, better handled by dedicated software. To me, that kind of more precise data feels different to analytics in the sense of funnels, conversions, goals and all that stuff.

So in order to get more fine-grained data to analyse, our analytics software has now doubled down on a technology—JavaScript—that has an impact on the end user, where previously the act of observation could be done at a distance.

There are also blind spots that come with JavaScript-based tracking. According to Google Analytics, 0% of your customers don’t have JavaScript. That’s not necessarily true, but there’s literally no way for Google Analytics—which relies on JavaScript—to even do its job in the absence of JavaScript. That can lead to a dangerous situation where you might be led to think that 100% of your potential customers are getting by, when actually a proportion might be struggling, but you’ll never find out about it. Related: according to Google Analytics, 0% of your customers are using ad-blockers that block requests to Google’s servers. Again, that’s not necessarily a true fact.

So I completely agree that analytics are a good thing to have for your business. But it does not follow that Google Analytics is a good thing for your business. Other options are available.

I feel like the assumption that “analytics = Google Analytics” is like the slippery slope in reverse. If we’re all agreed that analytics are important, then aren’t we also all agreed that JavaScript-based tracking is important? In a word, no. This reminds me of the arguments made in favour of intrusive, bloated advertising scripts.
All of the arguments focus on the need for advertising—to stay in business, to pay the writers—which are all great reasons for advertising, but have nothing to do with JavaScript, which is at the root of the problem. Everyone I know who uses an ad-blocker—including me—doesn’t use it to stop seeing adverts, but to stop the performance of the page being degraded (and to avoid being tracked across domains). So let’s not confuse the means with the ends. If you need to have advertising, tha[...]
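To make the server-log point above concrete, here is a minimal sketch of pulling referrer and user-agent information out of an access log with Node, with no client-side JavaScript involved. The log path and the “combined” log format are assumptions; adjust both for your own server. Self-hosted tools do this job far more thoroughly, but the point stands: the data is already sitting on the server.

```javascript
// A rough sketch, not a replacement for dedicated log-analysis software.
// Assumes the common "combined" log format and a hypothetical log path.
const fs = require('fs');
const readline = require('readline');

// The request line, referrer and user agent are the quoted fields in each line.
const combined = /"(\S+) (\S+)[^"]*" \d{3} \S+ "([^"]*)" "([^"]*)"/;

const referrers = new Map();
const userAgents = new Map();
const tally = (map, key) => map.set(key, (map.get(key) || 0) + 1);

const lines = readline.createInterface({
  input: fs.createReadStream('/var/log/nginx/access.log')
});

lines.on('line', (line) => {
  const match = line.match(combined);
  if (!match) return;
  const referrer = match[3] === '-' ? '(direct)' : match[3];
  tally(referrers, referrer);
  tally(userAgents, match[4]);
});

lines.on('close', () => {
  const top = (map) =>
    [...map.entries()].sort((a, b) => b[1] - a[1]).slice(0, 10);
  console.log('Top referrers:', top(referrers));
  console.log('Top user agents:', top(userAgents));
});
```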



Analysing analytics

Thu, 18 Jan 2018 19:32:25 GMT

Hell is other people’s JavaScript. There’s nothing quite so crushing as building a beautifully performant website only to have it infested with a plague of third-party scripts that add to the weight of each page and reduce the responsiveness, making a mockery of your well-considered performance budget.

Trent has been writing about this:

My latest realization is that delivering a performant, accessible, responsive, scalable website isn’t enough: I also need to consider the impact of third-party scripts.

He’s started the process by itemising third-party scripts. Frustratingly though, there’s rarely one single culprit that you can point to—it’s the cumulative effect of “just one more beacon” and “just one more analytics script” and “just one more A/B testing tool” that adds up to a crappy experience that warms your user’s hands by ensuring your site is constantly draining their battery.

Actually, having just said that there’s rarely one single culprit, Adobe Tag Manager is often at the root of third-party problems. That and adverts. It’s like opening the door of your beautifully curated dream home, and inviting a pack of diarrhetic elephants in: “Please, crap wherever you like.”

But even the more well-behaved third-party scripts can get out of hand. Google Analytics is so ubiquitous that it’s hardly even considered in the list of potentially harmful third-party scripts. On the whole, it’s a fairly well-behaved citizen of your site’s population of third-party scripts (y’know, leaving aside the whole surveillance capitalism business model that allows you to use such a useful tool for free in exchange for Google tracking your site’s visitors across the web and selling the insights from that data to advertisers).

The initial analytics script that you—asynchronously—load into your page isn’t very big. But depending on how you’ve configured your Google Analytics account, that might just be the start of a longer chain of downloads and event handlers.

Ed recently gave a lunchtime presentation at Clearleft on using Google Analytics—he professes modesty but he really knows his stuff. He was making sure that everyone knew how to set up goals’n’stuff.

As I understand it, there are two main categories of goals: events and destinations (there are also durations and pages, but they feel similar to destinations). You use events to answer questions like “Did the user click on this button?” or “Did the user click on that search field?”. You use destinations to answer questions like “Did the user arrive at this page?” or “Did the user come from that page?”

You can add as many goals to your site’s analytics as you want. That’s an intoxicating offer. The problem is that there is potentially a cost for each goal you create. It’s an invisible cost. It’s paid by the user in the currency of JavaScript sent down the wire (I wish that the Google Analytics admin interface were more like the old interface for Google Fonts, where each extra file you added literally pushed a needle higher on a dial).

It strikes me that the event-based goals would necessarily require more JavaScript in order to listen out for those clicks and fire off that information. The destination-based goals should be able to get all the information needed from regular page navigations.

So I have a hypothesis. I think that destination-based goals are less harmful to performance than event-based goals. I might well be wrong about that, and if I am, please let me know.

With that hypothesis in mind, and until I learn otherwise, I’ve got two rules of thumb to offer when it comes to using Google Analytics:

Try to keep the number of goals to a minimum.

If you must create a goal, favour destinations over events.

[...]
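To make the hypothesis concrete, here is a rough sketch of the difference in client-side cost as I understand it, using the analytics.js ga() syntax. The selector and the category/action names are made up for illustration, and the sketch assumes the standard snippet has already loaded; it is an illustration of the idea, not a measurement.

```javascript
// Destination-based goal: nothing extra to ship. The goal is configured in
// the Google Analytics admin (e.g. "Destination equals /thank-you/") and is
// matched against the pageview hit you're already sending on every page.
ga('send', 'pageview');

// Event-based goal: extra JavaScript has to be sent down the wire to listen
// for the interaction and fire a hit before the goal can ever be matched.
// The selector and the category/action names here are hypothetical.
var button = document.querySelector('.signup-button');
if (button) {
  button.addEventListener('click', function () {
    ga('send', 'event', 'Signup', 'click');
  });
}
```

If the hypothesis holds, the destination goal costs nothing extra at the point of interaction, while every event goal adds another listener and another request.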



Words I wrote in 2017

Mon, 01 Jan 2018 12:29:49 GMT

I wrote 78 blog posts in 2017. That works out at an average of six and a half blog posts per month. I’ll take it.

Here are some pieces of writing from 2017 that I’m relatively happy with:

Going Rogue. A look at the ethical questions raised by Rogue One.

In AMP we trust. My unease with Google’s AMP format was growing by the day.

A minority report on artificial intelligence. Revisiting two of Spielberg’s films after a decade and a half.

Progressing the web. I really don’t want progressive web apps to just try to imitate native apps. They can be so much more.

CSS. Simple, yes, but not easy.

Intolerable. A screed. I still get very, very angry when I think about how that manifestbro duped people.

Акула. Recounting a story told by a taxi driver.

Hooked and booked. Does A/B testing lead to dark patterns?

Ubiquity and consistency. Different approaches to building on the web.

I hope there’s something in there that you like. It’s always a nice bonus when other people like something I’ve written, but I write for myself first and foremost. Writing is how I figure out what I think. I will, of course, continue to write and publish on my website in 2018. I’d really like it if you did the same.




Food I ate in 2017

Mon, 01 Jan 2018 11:47:11 GMT

I did a fair bit of travelling in 2017, which I always enjoy. I particularly enjoy it when Jessica comes with me and we get to sample the cuisine of other countries.

Portugal will always be a culinary hotspot for me, particularly Porto (“tripas à moda do Porto” is one of the best things I’ve ever tasted). When I was teaching at the New Digital School in Porto back in February, I took full advantage of the culinary landscape. A seafood rice (and goose barnacles) at O Gaveto in Matosinhos was a particular highlight.

The most unexpected thing I ate in Porto was when I wandered off for lunch on my own one day. I ended up in a little place where, when I walked in, it was kind of like that bit in the Western when the music stops and everyone turns to look. This was clearly a place for locals. The owner didn’t speak any English. I didn’t speak any Portuguese. But we figured it out. She mimed something sandwich-like and said a word I wasn’t familiar with: bifana. Okay, I said. Then she mimed the universal action for drinking, so I said “agua.” She looked at me with a very confused expression. “Agua!? Não. Cerveja!” Who am I to argue? Anyway, she produced this thing which was basically some wet meat in a bun. It didn’t look very appetising. But this was the kind of situation where I couldn’t back out of eating it. So I took a bite and …it was delicious! Like, really, really delicious.

Later in February, we went to Pittsburgh to visit Cindy and Matt. We were there for my birthday, so Cindy prepared the most amazing meal. She reproduced a dish from the French Laundry—sous-vide lobster on orzo. It was divine!

Later in the year, we went to Singapore for the first time. The culture of hawker centres makes it the ideal place for trying lots of different foods. There were some real revelations in there.

We visited lots of other great places like Reykjavík, Lisbon, Barcelona, and Nuremberg. But as well as sampling the cuisine of distant locations, I had some very fine food right here in Brighton, home to Trollburger, purveyors of the best burger you’ll ever eat. I also have a thing for hot wings, so it’s very fortunate that The Joker, home to the best wings in Brighton, is just around the corner from the dance studio where Jessica goes for ballet. Regular wing nights became a thing in 2017.

I started a little routine in 2017 where I’d take a break from work in the middle of the afternoon, wander down to the seafront, and buy a single oyster. It only took a few minutes out of the day but it was a great little dose of perspective each time.

But when I think of my favourite meals of 2017, most of them were home-cooked. [...]



Audio I listened to in 2017

Sat, 30 Dec 2017 20:21:55 GMT

I huffduffed 290 pieces of audio in 2017. I’ve still got a bit of a backlog of items I haven’t listened to yet, but I thought I’d share some of my favourite items from the past year. Here are twelve pieces of audio, one for each month of 2017…

Donald Hoffman’s TED talk, Do we see reality as it really is?. TED talks are supposed to blow your mind, right? (22:15) Donald Hoffman: Do we see reality as it is? | TED Talk | TED.com on Huffduffer

How to Become Batman on Invisibilia. Alix Spiegel and Lulu Miller challenge you to think of blindness as a social construct. Hear ‘em out. (58:02) 🎧 How to Become Batman | Invisibilia (NPR) on Huffduffer

Where to find what’s disappeared online, and a whole lot more: the Internet Archive on Public Radio International. I just love hearing Brewster Kahle’s enthusiasm and excitement. (42:43) Where to find what’s disappeared online, and a whole lot more: the Internet Archive | Public Radio International on Huffduffer

Every Tuesday At Nine on Irish Music Stories. I’ve been really enjoying Shannon Heaton’s podcast this year. This one digs into that certain something that happens at an Irish music session. (40:50) Episode 03-Every Tuesday at Nine | shannonheatonmusic.com on Huffduffer

Adam Buxton talks to Brian Eno (part two is here). A fun and interesting chat about Brian Eno’s life and work. (53:10 and 46:35) EP.37 - Brian Eno, Part One on Huffduffer / EP.38 - Brian Eno, Part Two on Huffduffer

Nick Cave and Warren Ellis on Kreative Kontrol. This was far more revealing than I expected: genuine and unpretentious. (57:07) Ep. #323: Nick Cave and Warren Ellis | Kreative Kontrol on Huffduffer

Paul Lloyd at Patterns Day. All the talks at Patterns Day were brilliant. Paul’s really stuck with me. (28:21) Patterns Day: Paul Lloyd on Huffduffer

James Gleick on Time Travel at The Long Now. There were so many great talks from The Long Now’s seminars on long-term thinking. Nicky Case and Jennifer Pahlka were standouts too. (1:20:31) James Gleick: Time Travel - The Long Now on Huffduffer

Long Distance on Reply All. It all starts with a simple phone call. (47:27) #102 Long Distance on Huffduffer

The King of Tears on Revisionist History. Malcolm Gladwell’s style suits podcasting very well. I liked this episode about country songwriter Bobby Braddock. Related: Jon’s Troika episode on tearjerkers. (42:14) The King of Tears on Huffduffer

Feet on the Ground, Eyes on the Stars: The True Story of a Real Rocket Man with G.A. “Jim” Ogle. This was easily my favourite podcast episode of 2017. It’s on the User Defenders podcast but it’s not about UX. Instead, host Jason Ogle interviews his father, a rocket scientist who worked on everything from Apollo to every space shuttle mission. His story is fascinating. (2:38:21) Feet on the Ground, Eyes on the Stars: The True Story of a Real Rocket Man with G.A. “Jim” Ogle – User Defenders podcast : Inspiring Interviews with UX Superheroes. on Huffduffer

R.E.M. on Song Exploder. Breaking down the song Try Not To Breathe from Automatic For The People. (16:15) Song Exploder | R.E.M. on Huffduffer

I’ve gone back and added the tag “2017roundup” to each of these items. So if you’d like to subscribe to a podcast of just these episodes, here are the links: RSS / Subscribe in Podcasts app / Subscribe in Overcast / Subscribe in Downcast / Subscribe in Instacast / Subscribe in another app [...]



Books I read in 2017

Thu, 28 Dec 2017 13:49:15 GMT

Here are the books I read in 2017. It’s not as many as I hoped. I set myself a constraint this year so that I’d have to alternate between reading fiction and non-fiction: no reading two fiction books back-to-back, and no reading two non-fiction books back-to-back. I quite like the balanced book diet that resulted. I think I might keep it going. Anyway, in order of consumption, here are those books…

Leviathan Wakes by James S.A. Corey ★★★☆☆
I had already seen—and quite enjoyed—the first series of the television adaptation of The Expanse so I figured I’d dive into the books that everyone kept telling me about. The book was fun …but no more than that. I don’t think I’m invested enough to read any of the further books. In some ways, I think this makes for better TV than reading (despite the TV show’s annoying “slow motion in zero G” trope that somewhat lessens the hard sci-fi credentials).

Black Box Thinking by Matthew Syed ★★★★☆
This was recommended by James Box, and on the whole, I really liked it. There’s a lot of anecdata though. Still, the fundamental premise is a good one, comparing the attitudes towards risk in two different industries: aviation and healthcare. A little bit more trimming down would’ve helped the book—it dragged on just a bit too long.

The Separation by Christopher Priest ★★★★★
I need to read at least one Christopher Priest book a year. They’re in a league of their own, somehow outside the normal rules of criticism. This one is a true stand-out. As ever, it messes with your head and gets weirder as it goes on. If you haven’t read any Christopher Priest, I reckon this would be a great one to start with.

Deep Sea and Foreign Going by Rose George ★★★★☆
Recommended by both Jessica and Danielle, this is a well-crafted look into life on board a cargo ship, as well as an examination of ocean-going logistics. If you liked the Containers podcast, you’ll like this. I found it a little bit episodic—more like a collection of magazine articles sometimes—but still enjoyable.

Bloodchild by Octavia E. Butler
A false start. This is a short story, not a novel—I didn’t know that when I downloaded it to my Kindle. It’s an excellent short story though. Still, I felt it didn’t count in my zigzagging between fiction and non-fiction so I followed it with…

Star Maker by Olaf Stapledon ★★★☆☆
Science fiction from the 1930s. The breadth of imagination is quite staggering, even if the writing is sometimes a bit of a slog. Still, it seems remarkably ahead of its time in many ways.

The Sense Of Style by Steven Pinker ★★★★☆
I spent a portion of 2017 writing a book so I was eager to read Steven Pinker’s take on a style guide, having thoroughly enjoyed The Language Instinct and The Blank Slate. This book starts with a bang—a critique of some examples of great writing. Then there’s some good practical advice, and then there’s a bit of a laundry list of non-rules. Typical of Pinker, the points about unclear writing are illustrated with humorous real-world examples. Overall, a good guide but perhaps a little longer than it needs to be.

Aurora by Kim Stanley Robinson ★★★★★
I loved everything about this book.

Writing On The Wall by Tom Standage ★★☆☆☆
I’ve read all of Tom Standage’s books but none of them have ever matched the brilliance of The Victorian Internet. This one was frustratingly shallow. Every now and then there were glimpses of a better book. There’s a chapter on radio that gets genuinely exciting and intriguing. If Tom Standage wrote a whole book on that, I’d read it in a heartbeat. But in this collection of social media through the ages, it just reminded me of how much better he can be.

Grass by Sheri S. Tepper [...]



The Last Jedi

Tue, 26 Dec 2017 15:49:38 GMT

If you haven’t seen The Last Jedi (yet), please stop reading. Spoilers ahoy.

I’ve been listening to many, many podcast episodes about the latest Star Wars film. They’re all here on Huffduffer. You can subscribe to a feed of just those episodes if you want. I am well aware that the last thing anybody wants or needs is one more hot take on this film, but what the heck? I figured I’d jot down my somewhat simplistic thoughts.

I loved it. But I wasn’t sure at first. I’ve talked to other people who felt similarly on first viewing—they weren’t sure if they liked it or not. I know some people who, on reflection, decided they definitely didn’t like it. I completely understand that.

A second viewing helped to cement my positive feelings towards this film. This is starting to become a trend: I didn’t think much of Rogue One on first viewing, but a second watch reversed my opinion completely. Maybe I just find it hard to really get into the flow when I’m seeing a new Star Wars film for the very first time—an event that I once thought would never occur again.

My first viewing of The Last Jedi wasn’t helped by having the worst seats in the house. My original plan was to see it with Jessica at a minute past midnight in The Duke Of York’s in Brighton. I bought front-row tickets as soon as they were available. But then it turned out that we were going to be in Seattle at that time instead. We quickly grabbed whatever tickets were left. Those seats were right at the front and far edge of the cinema, so the screen was more trapezoid than rectangular. The lights went down, the fanfare blared, and the opening crawl began its march up …and to the left. My brain tried to compensate for the perspective effects but it was hard. Is Snoke’s face supposed to look like that? Does that person really have such a tiny head?

But while the spectacle was somewhat marred, the story unfolded in all its surprising delight. I thoroughly enjoyed the feeling of having the narrative rug repeatedly pulled out from under me.

I loved the unexpected end of Snoke in his vampiric boudoir. Let’s face it, he was the least interesting part of The Force Awakens—a two-dimensional evil mastermind. To despatch him in the middle of the middle chapter was the biggest signal that The Last Jedi was not simply going to retread the beats of the original trilogy.

I loved the reveal of Rey’s parentage. This was what I had been hoping for—that Rey came from nowhere in particular. After The Force Awakens, I wrote:

Personally, I’d like it if her parentage were unremarkable. Maybe it’s the socialist in me, but I’ve never liked the idea that the Force is based on eugenics; a genetic form of inherited wealth for the lucky 1%. I prefer to think of the Force as something that could potentially be unlocked by anyone who tries hard enough.

But I had resigned myself to the inevitable reveal that would tie her heritage into an existing lineage. What an absolute joy, then, that The Force is finally returned into everyone’s hands! Anil Dash describes this wonderfully in his post Every Last Jedi:

Though it’s well-grounded in the first definitions of The Force that we were introduced to in the original trilogy, The Last Jedi presents a radically inclusive new view of the Force that is bigger and broader than the Jedi religion which has thus-far colored our view of the entire Star Wars universe.

I was less keen on the sudden Force usage by Leia. I think it was the execution more than the idea that bothered me.
Still, I realise that the problem lies just as much with me. See, lots of the criticism of this film comes from people (justifiably) saying “That’s not how The Force works!” in relation to Rey, Kylo Ren, or Luke Skywalker. I don’t share that reaction and I want to say, &[...]



Ubiquity and consistency

Sat, 23 Dec 2017 09:45:10 GMT

I keep thinking about this post from Baldur Bjarnason, Over-engineering is under-engineering. It took me a while to get my head around what he was saying, but now that (I think) I understand it, I find it to be very astute.

Let’s take a single interface element, say, a dropdown menu. This is the example Laura uses in her article for 24 Ways called Accessibility Through Semantic HTML. You’ve got two choices, broadly speaking:

Use the HTML select element.

Create your own dropdown widget using JavaScript (working with divs and spans).

The advantage of the first choice is that it’s lightweight, it works everywhere, and the browser does all the hard work for you. But…

You don’t get complete control. Because the browser is doing the heavy lifting, you can’t craft the details of the dropdown to look identical on different browser/OS combinations.

That’s where the second option comes in. By scripting your own dropdown, you get complete control over the appearance and behaviour of the widget. The disadvantage is that, because you’re now doing all the work instead of the browser, it’s up to you to do all the work—that means lots of JavaScript, thinking about edge cases, and making the whole thing accessible.

This is the point that Baldur makes: no matter how much you over-engineer your own custom solution, there’ll always be something that falls between the cracks. So, ironically, the over-engineered solution—when compared to the simple under-engineered native browser solution—ends up being under-engineered.

Is it worth it? Rian Rietveld asks:

It is impossible to style select option. But is that really necessary? Is it worth abandoning the native browser behavior for a complete rewrite in JavaScript of the functionality?

The answer, as ever, is it depends. It depends on your priorities. If your priority is having consistent control over the details, then foregoing native browser functionality in favour of scripting everything yourself aligns with your goals. But I’m reminded of something that Eric often says:

The web does not value consistency. The web values ubiquity.

Ubiquity; universality; accessibility—however you want to label it, it’s what lies at the heart of the World Wide Web. It’s the idea that anyone should be able to access a resource, regardless of technical or personal constraints. It’s an admirable goal, and what’s even more admirable is that the web succeeds in this goal! But sometimes something’s gotta give, and that something is control. Rian again:

The days that a website must be pixel perfect and must look the same in every browser are over. There are so many devices these days, that an identical design for all is not doable. Or we must take a huge effort for custom form elements design.

So far I’ve only been looking at the micro scale of a single interface element, but this tension between ubiquity and consistency plays out at larger scales too. Take page navigations. That’s literally what browsers do. Click on a link, and the browser fetches that URL, displaying progress as it goes. The alternative, as exemplified by single page apps, is to do all of that for yourself using JavaScript: figure out the routing, show some kind of progress, load some JSON, parse it, convert it into HTML, and update the DOM.

Personally, I tend to go for the first option.
Partly that’s because I like to apply the rule of least power, but mostly it’s because I’m very lazy (I also have qualms about sending a whole lotta JavaScript down the wire just so the end user gets to do something that their browser would do for them anyway). But I get it. I understand why others might wish for greater control, even if it comes with a price tag of fragility. I think Jake’s navigation tra[...]
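Going back to the dropdown example above, here is a minimal sketch of what the scripted route signs you up for, next to the native alternative. The markup, class names and ARIA wiring are illustrative only, not a production-ready widget.

```javascript
// Native: one element, and the browser handles rendering, keyboard support,
// focus management, form submission, assistive technology and the OS picker:
//
//   <select name="choice">
//     <option>One</option>
//     <option>Two</option>
//   </select>
//
// Scripted: you now own all of that yourself. A bare-bones starting point:
function makeDropdown(button, list) {
  button.setAttribute('aria-haspopup', 'listbox');
  button.setAttribute('aria-expanded', 'false');
  list.setAttribute('role', 'listbox');
  list.hidden = true;

  button.addEventListener('click', function () {
    var opening = list.hidden;
    list.hidden = !opening;
    button.setAttribute('aria-expanded', String(opening));
  });

  // ...and that's before keyboard support (arrow keys, Escape, typeahead),
  // focus management, option selection, form integration, touch quirks and
  // all the other edge cases the browser would have handled for free.
}
```

Even this toy version hints at why the over-engineered route ends up under-engineered: every browser behaviour you opt out of becomes a feature you have to re-implement yourself.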



Origin story

Sat, 09 Dec 2017 10:32:49 GMT

In an excellent piece called The First Web Apps: 5 Apps That Shaped the Internet as We Know It, Matthew Guay wrote:

The world wide web wasn’t supposed to be this fun. Berners-Lee imagined the internet as a place to collaborate around text, somewhere to share research data and thesis papers.

In his somewhat confused talk at FFConf this year, James Kyle said:

The web was designed to share documents.

Douglas Crockford said:

The web was not designed to do any of things it is doing. It was intended to be a simple—even primitive—document retrieval system.

Some rando on Hacker News declared:

Essentially every single aspect of the web is terrible. It was designed as a static document presentation system with hyperlinks.

It appears to be a universally accepted truth. The web was designed for sharing documents, and was never meant for the kind of applications we can build these days. I don’t think that’s quite right. I think it’s fairer to say that the first use case for the web was document retrieval. And yes, that initial use case certainly influenced the first iteration of HTML. But right from the start, the vision for the web wasn’t constrained by what it was being asked to do at the time. (I mean, if you need an example of vision, Tim Berners-Lee called it the World Wide Web when it was just on one computer!)

The original people working on the web—Tim Berners-Lee, Robert Cailliau, Jean-Francois Groff, etc.—didn’t try to define the edges of what the web would be capable of. Quite the opposite. All of them really wanted a more interactive read-write web where documents could not only be read, but also edited and updated. As for the idea of having a programming language in browsers (as well as a markup language), Tim Berners-Lee was all for it …as long as it could be truly ubiquitous.

To say that the web was made for sharing documents is like saying that the internet was made for email. It’s true in the sense that it was the most popular use case, but that never defined the limits of the system. The secret sauce of the internet lies in its flexibility—it’s a deliberately dumb network that doesn’t care about the specifics of what runs on it. This lesson was then passed on to the web—another deliberately simple system designed to be agnostic to use cases.

It’s true that the web of today is very, very different to its initial incarnation. We got CSS; we got JavaScript; HTML has evolved; HTTP has evolved; URLs have …well, cool URIs don’t change, but you get the idea. The web is like the ship of Theseus—so much of it has been changed and added to over time. That doesn’t mean its initial design was flawed—just the opposite. It means that its initial design wasn’t unnecessarily rigid. The simplicity of the early web wasn’t a bug, it was a feature. The web (like the internet upon which it runs) was designed to be flexible, and to adjust to future use-cases that couldn’t be predicted in advance. The best proof of this flexibility is the fact that we can and do now build rich interactive applications on the World Wide Web. If the web had truly been designed only for documents, that wouldn’t be possible. [...]



Nosediving

Tue, 28 Nov 2017 12:01:04 GMT

Nosedive is the first episode of season three of Black Mirror. It’s fairly light-hearted by the standards of Black Mirror, but all the more chilling for that. It depicts a dystopia where people rate one another for points that unlock preferential treatment. It’s like a twisted version of the whuffie from Cory Doctorow’s Down And Out In The Magic Kingdom. Cory himself points out that reputation economies are a terrible idea.

Nosedive has become a handy shortcut for pointing to the dangers of social media (in the same way that Minority Report was a handy shortcut for gestural interfaces and Her is a handy shortcut for voice interfaces). “Social media is bad, m’kay?” is an understandable but, I think, fairly shallow reading of Nosedive. The problem isn’t with the apps, it’s with the system. A world in which we desperately need to keep our score up if we want to have any hope of advancing? That’s a nightmare scenario.

The thing is …that system exists today. Credit scores are literally a means of applying a numeric value to human beings.

Nosedive depicts a world where your score determines which seats you get in a restaurant, or which model of car you can rent. Meanwhile, in our world, your score determines whether or not you can get a mortgage.

Nosedive depicts a world in which you know your own score. Meanwhile, in our world, good luck with that:

It is very difficult for a consumer to know in advance whether they have a high enough credit score to be accepted for credit with a given lender. This situation is due to the complexity and structure of credit scoring, which differs from one lender to another. Lenders need not reveal their credit score head, nor need they reveal the minimum credit score required for the applicant to be accepted. Owing only to this lack of information to the consumer, it is impossible for him or her to know in advance if they will pass a lender’s credit scoring requirements.

Black Mirror has a good track record of exposing what’s unsavoury about our current time and place. On the surface, Nosedive seems to be an exposé on the dangers of going too far with the presentation of self in everyday life. Scratch a little deeper though, and it reveals an even more uncomfortable truth: that we’re living in a world driven by systems even worse than what’s depicted in this dystopia.

How about this for a nightmare scenario:

Two years ago Douglas Rushkoff had an unpleasant encounter outside his Brooklyn home. Taking out the rubbish on Christmas Eve, he was mugged — held at knife-point by an assailant who took his money, his phone and his bank cards. Shaken, he went back indoors and sent an email to his local residents’ group to warn them about what had happened.

“I got two emails back within the hour,” he says. “Not from people asking if I was OK, but complaining that I’d posted the exact spot where the mugging had taken place — because it might adversely affect their property values.” [...]