
Adactio: Journal



The online journal of Jeremy Keith, an author and web developer living and working in Brighton, England.



 



Writing on the web

Wed, 22 Mar 2017 18:52:08 GMT

Some people have been putting Paul’s crazy idea into practice.

  • Mike revived his site a while back and he’s been posting gold dust ever since. I enjoy his no-holds-barred perspective on his time in San Francisco.
  • Garrett’s writing goes all the way back to 2005. The cumulative result is two fascinating interweaving narratives—one about his health, another about his business.
  • Charlotte has been documenting her move from Brighton to Sydney. Much as I love her articles about front-end development, I’m liking the slice-of-life updates on life down under even more.
  • Amber has a great way with words. As well as regularly writing on her blog, she’s two-thirds of the way through writing 100 words every day for 100 days.
  • Ethan has been writing about responsive design—of course—but it’s his more personal posts that make me really grateful for his site.
  • Jeffrey and Eric never stopped writing on their own sites. Sure, there’s good stuff on their sites about web design and development, but it’s the writing about their non-web lives that’s so powerful.

There are more people I could mention …but, to be honest, not that many more. Seems like most people are happy to only publish on Ev’s blog or not at all.

I know not everybody wants to write on the web, and that’s fine. But it makes me sad when people choose not to publish their thoughts because they think no-one will be interested, or that it’s all been said before. I understand where those worries come from, but I believe—no, I know—that they are unfounded.

It’s a world wide web out there. There’s plenty of room for everyone. And I, for one, love reading the words of others.




Progressive Web App questions

Wed, 15 Mar 2017 11:17:29 GMT

I got a nice email recently from Colin van Eenige. He wrote:

For my graduation project I’m researching the development of Progressive Web Apps and found your offline book called Resilient Web Design. I was very impressed by the implementation of the website and it really was a nice experience. I’m very interested in your vision on progressive web apps and what capabilities are waiting for us regarding offline content. Would it be fine if I’d send you some questions?

I said that would be fine, although I couldn’t promise a swift response. He sent me four questions. I finally got ’round to sending my answers…

1. https://resilientwebdesign.com/ is an offline web book (progressive web app). What was the primary reason to make it available like this (besides the other formats)?

Well, given the subject matter, it felt right that the canonical version of the book should be not just online, but made with the building blocks of the web. The other formats are all nice to have, but the HTML version feels (to me) like the “real” book. Interestingly, it wasn’t too much trouble for people to generate other formats from the HTML (ePub, MOBI, PDF), whereas I think trying to go in the other direction would be trickier.

As for the offline part, that felt like a natural fit. I had already done that with a previous book of mine, HTML5 For Web Designers, which I put online a year or two after its print publication. In that case, I used AppCache for the offline functionality. AppCache is horrible, but this use case might be one of the few where it works well: a static book that’s never going to change. Cache invalidation is one of the worst parts of using AppCache, so by not having any kind of updates at all, I dodged that bullet. But when it came time for Resilient Web Design, a service worker was definitely the right technology. Still, I’ve got AppCache in there as well for the browsers that don’t yet support service workers.

2. What effect do you think Progressive Web Apps will have on content consumption, and do you think they will take over the purpose of some native apps?

The biggest effect that service workers could have is to change the expectations that people have about using the web, especially on mobile devices. Right now, people associate the web on mobile with long waits and horrible spammy overlays. Service workers can help solve that first part.

If people then start adding sites to their home screen, that will be a great sign that the web is really holding its own. But I don’t think we should get too optimistic about that: for a user, there’s no difference between a prompt on their screen saying “add to home screen” and a prompt on their screen saying “download our app”—they’re equally likely to be dismissed because we’ve trained people to dismiss anything that covers up the content they actually came for.

It’s entirely possible that websites could start taking over much of the functionality that previously was only possible in a native app. But I think that inertia and habit will keep people using native apps for quite some time. The big exception is in markets where storage space on devices is in short supply. That’s where the decision to install a native app isn’t taken lightly (given the choice between your family photos and an app, most people will reject the app). The web can truly shine here if we build lightweight, performant services.
Even in that situation, I’m still not sure how many people will end up adding those sites to their home screen (it might feel so similar to installing a native app that there may be some residual worry about storage space) but I don’t think that’s too much of a problem: if people get to a site via search or typing, that’s fine. I worry that the messaging around “progressive web apps” is perhaps over-fetishising the home screen. I don’t think that’s the real battlegr[...]
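For what it’s worth, here’s a rough sketch of the kind of cache-first service worker that suits a static, never-changing book like this (the file names are hypothetical, not the actual ones from resilientwebdesign.com):

    // A sketch of a cache-first service worker for a static book.
    // Asset names here are made up for illustration.
    const CACHE = 'book-v1';
    const ASSETS = [
      '/',
      '/styles.css',
      '/scripts.js'
    ];

    addEventListener('install', event => {
      // Cache everything up front; the content never changes,
      // so there is no cache invalidation to worry about.
      event.waitUntil(
        caches.open(CACHE).then(cache => cache.addAll(ASSETS))
      );
    });

    addEventListener('fetch', event => {
      // Serve from the cache first; fall back to the network.
      event.respondWith(
        caches.match(event.request)
          .then(cached => cached || fetch(event.request))
      );
    });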



In AMP we trust

Mon, 13 Mar 2017 17:27:37 GMT

AMP Conf was one of those deep dive events, with two days dedicated to one single technology: AMP. Except AMP isn’t really one technology, is it? And therein lies the confusion. This was at the heart of the panel I was on. When we talk about AMP, we could be talking about one of three things:

  • The AMP format. A bunch of web components. For instance, instead of using an img element on an AMP page, you use an amp-img element instead.
  • The AMP rules. There’s one JavaScript file, hosted on Google’s servers, that turns those web components from spans into working elements. No other JavaScript is allowed. All your styles must be in a style element instead of an external file, and there’s a limit on what you can do with those styles.
  • The AMP cache. The source of most confusion—and even downright enmity—this is what’s behind the fact that when you launch an AMP result from Google search, you don’t go to another website. You see Google’s cached copy of the page instead of the original.

The first piece of AMP—the format—is kind of like a collection of marginal gains. Where the img element might have some performance issues, the amp-img element optimises for perceived performance. But if you just used the AMP web components, it wouldn’t be enough to make your site blazingly fast.

The second part of AMP—the rules—is where the speed gains really start to show. You can’t have an external style sheet, and crucially, you can’t have any third-party scripts other than the AMP script itself. This is key to making AMP pages super fast. It’s not so much about what AMP does; it’s more about what it doesn’t allow. If you never used a single AMP component, but stuck to AMP’s rules disallowing external styles and scripts, you could easily make a page that’s even faster than what AMP can do. At AMP Conf, Natalia pointed out that The Guardian’s non-AMP pages beat out the AMP pages for performance. So why even have AMP pages?

Well, that’s down to the third, most contentious, part of the AMP puzzle. The AMP cache turns the user experience of visiting an AMP page from fast to instant. While you’re still on the search results page, Google will pre-render an AMP page in the background. Not pre-fetch, pre-render. That’s why it opens so damn fast. It’s also what causes the most confusion for end users.

From my unscientific polling, the behaviour of AMP results confuses the hell out of people. The fact that the page opens instantly isn’t the problem—far from it. It’s the fact that you don’t actually go to another page. Technically, you’re still on Google. An analogous mental model would be an RSS reader, or an email client: you don’t go to an item or an email; you view it in situ. Well, that mental model would be fine if it were consistent. But in Google search, only some results will behave that way (the AMP pages) and others will behave just like regular links to other websites. No wonder people are confused! Some search results take them away and some search results keep them on Google …even though the page looks like a different website.

The price that we pay for the instantly-opening AMP pages from the Google cache is the URL. Because we’re looking at Google’s pre-rendered copy instead of the original URL, the address bar is not pointing to the site the browser claims to be showing.
Everything in the body of the browser looks like an article from The Guardian, but if I look at the URL (which is what security people have been telling us for years is important to avoid being phished), then I’ll see a domain that is not The Guardian’s. But wait! Couldn’t Google pre-render the page at its original URL? Yes, they could. But they won’t. This was a point that Paul kept coming back to: trust. There’s no way that Google can trust that someone else’s URL will play by the AMP [...]
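To illustrate the format and rules sides of that, here’s a simplified sketch of an AMP-style page (real AMP pages require additional boilerplate; the URLs and dimensions here are placeholders): one permitted script, styles inline in a style element, and amp-img standing in for img.

    <!-- A simplified sketch of the AMP rules in practice;
         not the full boilerplate a real AMP page requires. -->
    <!doctype html>
    <html amp>
    <head>
      <meta charset="utf-8">
      <!-- the one and only script allowed: the AMP runtime -->
      <script async src="https://cdn.ampproject.org/v0.js"></script>
      <!-- all styles inline, within a size limit -->
      <style amp-custom>
        body { font-family: georgia, serif; }
      </style>
    </head>
    <body>
      <!-- explicit dimensions so layout can be calculated
           before the image loads -->
      <amp-img src="photo.jpg" width="600" height="400"></amp-img>
    </body>
    </html>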



Empire State

Mon, 06 Mar 2017 22:32:20 GMT

I’m in New York. Again. This time it’s for Google’s AMP Conf, where I’ll be giving ‘em a piece of my mind on a panel. The conference starts tomorrow so I’ve had a day or two to acclimatise and explore. Seeing as Google are footing the bill for travel and accommodation, I’m staying at a rather nice hotel close to the conference venue in Tribeca. There’s live jazz in the lounge most evenings, a cinema downstairs, and should I request it, I can even have a goldfish in my room.

Today I realised that my hotel sits in the apex of a triangle of interesting buildings: carrier hotels.

Looming above my hotel is 32 Avenue of the Americas. On the outside the building looks like your classic Gozer the Gozerian style of New York building. Inside, the lobby features a mosaic on the ceiling, and another on the wall extolling the connective power of radio and telephone.

The same architects also designed 60 Hudson Street, which has a similar Art Deco feel to it. Inside, there’s a cavernous hallway running through the ground floor, but I can’t show you a picture of it. A security guard told me I couldn’t take any photos inside …which is a little strange seeing as it’s splashed across the website of the building. I walked around the outside of 60 Hudson, taking more pictures. Another security guard asked me what I was doing. I told her I was interested in the history of the building, which is true; it was the headquarters of Western Union. For much of the twentieth century, it was a world hub of telegraphic communication, in much the same way that a beach hut in Porthcurno was the nexus of the nineteenth century.

For a 21st century hub, there’s the third and final corner of the triangle at 33 Thomas Street. It’s a breathtaking building. It looks like a spaceship from a Chris Foss painting. It was probably designed more like a spacecraft than a traditional building—its primary purpose was to withstand an atomic blast. Gone are niceties like windows. Instead there’s an impenetrable monolith that looks like something straight out of a dystopian sci-fi film.

Brutalist on the outside, its interior is host to even more brutal acts of invasive surveillance. The Snowden papers revealed this AT&T building to be a centrepiece of the Titanpointe programme:

They called it Project X. It was an unusually audacious, highly sensitive assignment: to build a massive skyscraper, capable of withstanding an atomic blast, in the middle of New York City. It would have no windows, 29 floors with three basement levels, and enough food to last 1,500 people two weeks in the event of a catastrophe. But the building’s primary purpose would not be to protect humans from toxic radiation amid nuclear war. Rather, the fortified skyscraper would safeguard powerful computers, cables, and switchboards. It would house one of the most important telecommunications hubs in the United States…

Looking at the building, it requires very little imagination to picture it as the lair of villainous activity. Laura Poitras’s short film Project X basically consists of a voiceover of someone reading an NSA manual, some ominous background music, and shots of 33 Thomas Street looming in its oh-so-loomy way.

A top-secret handbook takes viewers on an undercover journey to Titanpointe, the site of a hidden partnership. Narrated by Rami Malek and Michelle Williams, and based on classified NSA documents, Project X reveals the inner workings of a windowless skyscraper in downtown Manhattan. [...]



Small steps

Fri, 03 Mar 2017 11:51:58 GMT

The new Clearleft website is live! Huzzah!

Many people have been working very hard on it and it’s all looking rather nice. But, as I said before, the site launch isn’t the end—it’s just the beginning.

There are some obvious next steps: fixing bugs, adding content, tweaking copy, and, oh yeah, that whole “testing with real users” thing. But there’s also an opportunity to have some fun on the front end. Now that the site is out there in the wild, there’s a real incentive to improve its performance.

Off the top of my head, these are some areas where I think we can play around:

  • Font loading. Right now the site is just using @font-face. A smart font-loading strategy—at least for the body copy—could really help improve the perceived performance. (There’s a rough sketch of one possible approach after this list.)
  • Responsive images. A long-term solution will require some wrangling on the back end, but I reckon we can come up with some way of generating different sized images to reference in srcset. (Also sketched below.)
  • Service worker. It’s a no-brainer. Now that the Clearleft site is (finally!) running on HTTPS, having a simple service worker to cache static assets like CSS, JavaScript and some images seems like the obvious next step. The question is: what other offline shenanigans could we get up to?
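To make those first two items a little more concrete, here’s a rough sketch of one font-loading strategy: let the fallback font render immediately, then switch once the web font has actually loaded. The face name “BodyFont” is hypothetical, not the one the Clearleft site uses.

    // A sketch using the CSS Font Loading API.
    // "BodyFont" is a made-up name for illustration.
    if ('fonts' in document) {
      document.fonts.load('1em BodyFont').then(function () {
        // CSS can then opt in, e.g.
        // .fonts-loaded body { font-family: BodyFont, sans-serif; }
        document.documentElement.classList.add('fonts-loaded');
      });
    }

And for the responsive images item, the end result in the markup might look something like this (again, the file names are made up):

    <img src="photo-small.jpg"
         srcset="photo-small.jpg 480w,
                 photo-medium.jpg 960w,
                 photo-large.jpg 1440w"
         sizes="(min-width: 60em) 50vw, 100vw"
         alt="A description of the image">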

I’m looking forward to tinkering with some of those technologies. Each one should make an incremental improvement to the site’s performance. There are already some steps on the back-end that are making a big difference: upgrading to PHP7 and using HTTP2.

Now the real fun begins.




Long betting

Wed, 22 Feb 2017 11:42:22 GMT

It has been exactly six years to the day since I instantiated this prediction:

The original URL for this prediction (www.longbets.org/601) will no longer be available in eleven years.

It is exactly five years to the day until the prediction condition resolves to a Boolean true or false.

If it resolves to true, The Bletchley Park Trust will receive $1000.

If it resolves to false, The Internet Archive will receive $1000.

Much as I would like Bletchley Park to get the cash, I’m hoping to lose this bet. I don’t want my pessimism about URL longevity to be rewarded.

So, to recap, the bet was placed on

02011-02-22

It is currently

02017-02-22

And the bet times out on

02022-02-22.




Variable fonts

Tue, 21 Feb 2017 15:50:22 GMT

We have a tradition here at Clearleft of having the occasional lunchtime braindump. They’re somewhat sporadic, but it’s always a good day when there’s a “brown bag” gathering.

When Google’s AMP format came out, I did some investigating and then led a brown bag playback on it. Recently Mark did one on Fractal so that everyone knew how work on that was progressing.

Today Richard gave us a quick brown bag talk on variable web fonts. He talked us through how these will work on the web and in operating systems. We got a good explanation of how these fonts would get designed—the type designer designs the “extreme” edges of size, weight, or whatever, and then the file format itself can interpolate all the in-between stages. So, in theory, one single font file can hold hundreds, thousands, or hundreds of thousands of potential variations. It feels like switching from bitmap images to SVG—there’s suddenly much greater flexibility.

A variable font is a single font file that behaves like multiple fonts.

There were a couple of interesting tidbits that Rich pointed out…

While this is a new file format, there isn’t going to be a new file extension. These will be .ttf files, and so by extension, they can be .woff and .woff2 files too.

This isn’t some proposed theoretical standard: an unprecedented amount of co-operation has gone into the creation of this format. Adobe, Apple, Google, and Microsoft have all contributed. Agreement is the hardest part of any standards process. Once that’s taken care of, the technical solution follows quickly. So you can expect this to land very quickly and widely.

This technology is landing in web browsers before it lands in operating systems. It’s already available in the Safari Technology Preview. That means that for a while, the very best on-screen typography will be delivered not in eBook readers, but in web browsers. So if you want to deliver the absolute best reading experience, look to the web.

And here’s the part that I found fascinating…

We can currently use numbers for the font-weight property in CSS. Those number values increment in hundreds: 100, 200, 300, etc. Now with variable fonts, we can start using any integer in between: 321, 417, 183, etc. How fortuitous that we have 99 free slots between our current set of values!
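As a speculative sketch (the font name and file here are made up, and the exact syntax was still being finalised at the time of writing), that might end up looking something like this in CSS:

    /* One @font-face rule covering a whole weight range… */
    @font-face {
      font-family: "Example VF";
      src: url("example-vf.woff2") format("woff2");
      font-weight: 100 900; /* the range the variable font supports */
    }

    /* …then any in-between weight becomes fair game. */
    h1 {
      font-family: "Example VF", sans-serif;
      font-weight: 417; /* no longer limited to multiples of 100 */
    }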

Well, that’s no accident. The reason why the numbers were originally specced in increments of 100 back in 1996 was precisely so that some future sci-fi technology could make use of the ranges in between. That’s some future-friendly thinking! And as Håkon wrote:

One of the reasons we chose to use three-digit numbers was to support intermediate values in the future. And the future is now :)

Needless to say, variable fonts will be covered in Richard’s forthcoming book.




Amber

Mon, 20 Feb 2017 17:40:54 GMT

I really enjoyed teaching in Porto last week. It was like having a week-long series of CodeBar sessions.

Whenever I’m teaching at CodeBar, I like to be paired up with people who are just starting out. There’s something about explaining the web and HTML from first principles that I really like. And people often have lots and lots of questions that I enjoy answering (if I can). At CodeBar—and at The New Digital School—I found myself saying “Great question!” multiple times. The really great questions are the ones that I respond to with “I don’t know …let’s find out!”

CodeBar is always a very rewarding experience for me. It has given me the opportunity to try teaching. And having tried it, I can now safely say that I like it. It’s also a great chance to meet people from all walks of life. It gets me out of my bubble.

I can’t remember when I was first paired up with Amber at CodeBar. It must have been sometime last year. I do remember that she had lots of great questions—at some point I found myself explaining how hexadecimal colours work.

I was impressed with Amber’s eagerness to learn. I also liked that she was making her own website. I told her about Homebrew Website Club and she started coming along to that (along with other CodeBar people like Cassie and Alice).

I’ve mentioned to multiple CodeBar students that there’s pretty much an open-door policy at Clearleft when it comes to shadowing: feel free to come along and sit with a front-end developer while they’re working on client projects. A few people have taken up the offer and enjoyed observing myself or Charlotte at work. Amber was one of those people. Again, I was very impressed with her drive. She’s got a full-time job (with sometimes-crazy hours) but she’s so determined to get into the world of web design and development that she’s willing to spend her free time visiting Clearleft to soak up the atmosphere of a design studio.

We’ve decided to turn this into something more structured. Amber and I will get together for a couple of hours once a week. She’s given me a list of some of the areas she wants to explore, and I think it’s a fine-looking list:

  • I want to gather base, structural knowledge about the web and all related aspects. Things seem to float around in a big cloud at the moment.
  • I want to adhere to best practices.
  • I want to learn more about what direction I want to go in, find a niche.
  • I’d love the opportunity to chat with the brilliant people who work at Clearleft and gain a broad range of knowledge from them.

My plan right now is to take a two-track approach: one track about the theory, and another track about the practicalities. The practicalities will be HTML, CSS, JavaScript, and related technologies. The theory will be about understanding the history of the web and its strengths and weaknesses as a medium. And I want to make sure there’s plenty of UX, research, information architecture and content strategy covered too.

Seeing as we’ll only have a couple of hours every week, this won’t be quite like the masterclass I just finished up in Porto. Instead I imagine I’ll be laying some groundwork and then pointing to topics to research. I guess it’s a kind of homework. For example, after we talked today, I set Amber this little bit of research for the next time we meet: “What is the difference between the internet and the World Wide Web?”

I’m excited to see where this will lead. I find Amber’s drive and enthusiasm very inspiring. I also feel a certain weight of responsibility—I don’t want to enter into this lightly.
I’m not really sure what to call this though. Is it mentorship? Or is it coaching? Or training? All of the above? Whatever it is, I[...]



Teaching in Porto, day five

Fri, 17 Feb 2017 23:49:51 GMT

For the final day of the week-long masterclass, I had no agenda. This was a time for the students to work on their own projects, but I was there to answer any remaining questions they might have.

As I suspected, the people with the most interest and experience in development were the ones with plenty of questions. I was more than happy to answer them. With no specific schedule for the day, we were free to merrily go chasing down rabbit holes.

SVG? Sure, I’d be happy to talk about that. More JavaScript? My pleasure! Databases? Not really my area of expertise, but I’m more than willing to share what I know.

It was a fun day. The centrepiece was a most excellent lunch across the river at a really traditional seafood place.

At the very end of the day, after everyone else had gone, I sat down with Tiago to discuss how the week went. Overall, I was happy. I was nervous going into this masterclass—I had never done a whole week of teaching—but based on the feedback I got, I think I did okay. There were times when I got impatient, and I wish I could turn back the clock and erase those moments. I noticed that those moments tended to occur when it was time for hands-on-keyboards coding: “no, not like that—like this!” I need to get better at handling those situations. But when we were working on paper, or having stand-up discussions, or when I was just geeking out on a particular topic, everything felt quite positive.

All in all, this week has been a great experience. I know it sounds like a cliché, but I felt it was a real honour and a privilege to be involved with the New Digital School. I’ve enjoyed doing hands-on teaching, and I’d like to do more of it.




Teaching in Porto, day four

Fri, 17 Feb 2017 00:23:43 GMT

Day one covered HTML (amongst other things), day two covered CSS, and day three covered JavaScript. Each one of those days involved a certain amount of hands-on coding, with the students getting their hands dirty with angle brackets, curly braces, and semi-colons.

Day four was a deliberate step away from all that. No more laptops, just paper. Whereas the previous days had focused on collaboratively working on a single document, today I wanted everyone to work on a separate site.

The sites were generated randomly. I made five cards with types of sites on them: news, social network, shopping, travel, and learning. Another five cards had subjects: books, music, food, pets, and cars. And another five cards had audiences: students, parents, the elderly, commuters, and teachers. Everyone was dealt a random card from each deck, resulting in briefs like “a travel site about food for the elderly” or “a social network about music for commuters.”

For a bit of fun, the first brainstorming exercise (run as a 6-up) was to come up with potential names for this service—4 minutes for 6 ideas. Then we went around the table, shared the ideas, got feedback, and settled on the names.

Next I asked everyone to come up with a one-sentence mission statement for their newly-named service. This was a good way of teasing out the most important verbs and nouns, which led nicely into the next task: answering the question “what is the core functionality?”

If that sounds familiar, it’s because it’s the first part of the three-step process I outlined in Resilient Web Design:

1. Identify core functionality.
2. Make that functionality available using the simplest possible technology.
3. Enhance!

We did some URL design, figuring out what structures would make sense for straightforward GET requests, like:

/things
/things/ID

Then, once it was clear what the primary “thing” was (a car, a book, etc.), I asked them to write down all the pieces that might appear on such a page; one post-it note per item, e.g. “title”, “description”, “img”, “rating”, etc.

The next step involved prioritisation. They took those post-it notes and put them on the wall, but they had to put them in a vertical line from top to bottom in decreasing order of importance. This can be a challenge, but it’s better to solve these problems now rather than later.

Then I asked them to “mark up” those vertical lists of post-it notes: writing HTML tag names by each one. By doing this before doing any visual design, it meant they were thinking about the meaning of the content first.

After that, we did a good ol’ fashioned classic 6-up sketching exercise, followed by critique (including a “designated dissenter” for each round). At this point, I was encouraging them to go crazy with ideas—they already had the core functionality figured out (with plain ol’ client/server requests and responses) so they could add all the bells and whistles they wanted on top of that.

We finished up with a discussion of some of those bells and whistles, and how they could be used to improve the user experience: Ajax, geolocation, service workers, notifications, background sync …the sky’s the limit.

It was a whirlwind tour for just one day but I think it helped emphasise the importance of thinking about the fundamentals before adding enhancements. This marked the end of the structured masterclass lessons. Tomorrow I’m around to answer any miscellaneous questions (if I can) and chat to the students individually while they work on their term projects. [...]
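As an aside, here’s a rough sketch (not something we wrote in the workshop, which was paper-only) of what the “simplest possible technology” step could look like for those URL structures: plain GET requests answered with server-rendered HTML, before any of those enhancements are layered on top. The route handling and responses are hypothetical.

    // A sketch of the "simplest possible technology" step:
    // plain GET requests answered with server-rendered HTML.
    const http = require('http');

    http.createServer((request, response) => {
      response.writeHead(200, {'Content-Type': 'text/html'});
      if (request.url === '/things') {
        // /things — a list of all the items
        response.end('<h1>All the things</h1>');
      } else if (request.url.startsWith('/things/')) {
        // /things/ID — one item, with its pieces in
        // decreasing order of importance
        const id = request.url.split('/')[2];
        response.end(`<h1>Thing ${id}</h1>`);
      } else {
        response.end('<h1>Home</h1>');
      }
    }).listen(8080);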