
How to build faster websites


Ahead of his talk at Generate London on 21 September we caught up with Patrick Hamann, a web performance engineer at Fastly, who is on a mission to build a faster web for all. 

What does your role at Fastly involve?
Patrick Hamann: Fastly is an edge cloud platform that underpins some of the world's largest brands. My role predominantly focuses on R&D: working with teams within Fastly to utilise client-side technologies and web standards to improve the performance and delivery of our products and – most importantly – our customers' services. Some current projects include initiatives around browser performance monitoring, metrics and Service Workers.

Before you joined Fastly, you spent time at both the Guardian and the Financial Times. How did they approach web performance?
PH: Performance is no longer a post-deploy add-on or checklist item. It needs to be a constant effort that every person in the organisation considers, from design through to delivery. This is something these news organisations realised very early on, introducing practices such as building monitoring infrastructure to measure and compare performance against competitors, prioritising the delivery of content over other features and utilising technologies like Service Workers. 

What's the biggest obstacle to a fast experience online right now?
PH: One word: JavaScript. I guess I should elaborate slightly: the web is at the peak of a JavaScript obesity crisis. The average web page now delivers around 500KB of script – script that takes more than a second just to parse, let alone execute, on a low-powered device, and more than five seconds to reach a state where the user can interact with the page. Therefore, the only way to improve the user experience of our sites is to measure, optimise and reduce our JavaScript – above all else.
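One simple way to act on that "measure, optimise and reduce" advice is to enforce a script weight budget in a build or CI step. The sketch below is a minimal illustration of the idea (the asset names, byte sizes and the 170KB budget figure are hypothetical, not values from the interview):

```javascript
// A minimal script "weight budget" check. Assets and the budget
// threshold are hypothetical values for illustration.
const BUDGET_BYTES = 170 * 1024; // e.g. ~170KB of compressed script

function scriptBytes(assets) {
  // Sum the bytes of script assets only; stylesheets, images etc. are ignored.
  return assets
    .filter(a => a.type === 'script')
    .reduce((total, a) => total + a.bytes, 0);
}

const assets = [
  { name: 'app.js', type: 'script', bytes: 120 * 1024 },
  { name: 'vendor.js', type: 'script', bytes: 90 * 1024 },
  { name: 'styles.css', type: 'stylesheet', bytes: 30 * 1024 },
];

const total = scriptBytes(assets);
console.log(`Total script: ${total} bytes, over budget: ${total > BUDGET_BYTES}`);
```

A check like this could fail the build when the page's JavaScript creeps over the agreed budget, making regressions visible before they reach users.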

WebPageTest runs a free website speed test from multiple locations around the globe using real browsers and at real consumer connection speeds

What are your favourite tools to optimise web performance?
PH: I am a strong believer that you cannot optimise what you haven't yet measured. So my toolbox is heavily weighted towards measurement and profiling tools. For synthetic measurement, I'll always reach for WebPageTest and the browser developer tools (network and performance panes) first. However, nothing beats measuring real users' experiences (RUM) too, so a good knowledge of the browser performance timing APIs helps as well.
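As a flavour of those browser performance timing APIs, here is a minimal RUM-style sketch built on the standard PerformanceNavigationTiming fields. The `summarize` helper and the sample values are hypothetical; the field names come from the Navigation Timing Level 2 interface:

```javascript
// Derive a few headline metrics from a PerformanceNavigationTiming-shaped
// entry. summarize() is a hypothetical helper for illustration.
function summarize(entry) {
  return {
    ttfb: entry.responseStart - entry.requestStart,              // time to first byte
    domContentLoaded: entry.domContentLoadedEventEnd - entry.startTime,
    load: entry.loadEventEnd - entry.startTime,
  };
}

// In a browser you would feed it the real navigation entry, e.g.:
//   const [nav] = performance.getEntriesByType('navigation');
//   navigator.sendBeacon('/rum', JSON.stringify(summarize(nav)));

// Illustrated here with hypothetical timings (milliseconds):
const fakeNav = {
  startTime: 0,
  requestStart: 40,
  responseStart: 120,
  domContentLoadedEventEnd: 900,
  loadEventEnd: 2100,
};
console.log(summarize(fakeNav)); // { ttfb: 80, domContentLoaded: 900, load: 2100 }
```

Beaconing a small summary like this from real sessions is what lets you compare lab results from tools like WebPageTest against what users actually experience.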

You've worked on some very large codebases. What are the challenges of working at scale?
PH: By and large I'd argue that a big codebase shares most of the problems you'd find in a smaller one: eliminating unused CSS (a problem I think is unsolved), caching, asset build pipelines and versioning, and so on. You also get more 'bit rot' – best practices becoming anti-patterns over time. But personally I've found most of the challenges at scale are people problems, not technical ones. I'm yet to work in a large organisation that isn't affected by Conway's Law. 

You call yourself a progressive enhancement advocate. What is it about that approach that resonates with you?
PH: Contrary to popular belief, our users actually use the products we build in the real world: one full of non-ideal browsing conditions and failure around every corner. Progressive enhancement enables us to build experiences that are inclusive to all our users and resilient to the failures of the real world. It's quite simple: start with the basics, not the 300KB JavaScript library your local barista told you about.

What are you excited about in frontend development at the moment?
PH: The web is under threat. Users are spending more time in native – and thus siloed – app experiences, away from the openness of the web. Yet we still seem to be building sites that take 20 seconds and cost £1.20 per load on my roaming connection, ultimately driving our users further away.

Fortunately, technologies like Service Worker and associated APIs are here to help. I can't wait for a new era of fast, resilient websites that still function offline, can synchronise my data in the background and notify me with updates. Some people are calling this progression 'progressive web apps'. I prefer the term 'the web'. 
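To make the Service Worker idea concrete, here is a minimal sketch of a cache-first strategy – one common pattern (not necessarily the one Hamann uses) for keeping a site functional offline. The handler registration is shown in comments because it only runs in a Service Worker context; the lookup logic is a plain function so it can be exercised anywhere:

```javascript
// A cache-first fetch strategy: serve from the cache when possible,
// otherwise fetch from the network and store a copy for next time.
async function cacheFirst(request, cache, fetchFn) {
  const cached = await cache.match(request);
  if (cached) return cached;               // hit: respond even when offline
  const response = await fetchFn(request); // miss: go to the network
  await cache.put(request, response.clone());
  return response;
}

// In a Service Worker you would wire it up roughly like this:
//   self.addEventListener('fetch', event => {
//     event.respondWith(
//       caches.open('v1').then(cache => cacheFirst(event.request, cache, fetch))
//     );
//   });
```

Once a response has been cached, subsequent requests succeed even if the network is unavailable, which is exactly the kind of resilience the interview describes.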

In his talk at Generate London Patrick Hamann will explore the current, past, and future best-practices for loading assets in the browser

What can people expect to learn from your talk at Generate London?
PH: To the outsider, serving a website seems pretty simple: send some HTML and CSS down the wire, then the browser decides what to do next. However, a lot is actually going on under the hood, all of it coming at a cost to our users.

How does the browser determine what asset to request next? How can we measure the perceived speed of our websites? How can we use modern web platform features to influence the priority and speed of our assets? Hopefully my talk will answer these questions and more, giving the audience the tools to create faster, more resilient experiences for their users.
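One family of those modern platform features is resource hints, which let developers influence what the browser fetches and when. A small sketch (all file names and hosts are hypothetical):

```html
<!-- Tell the browser early about critical assets it would otherwise
     discover late in parsing. -->
<link rel="preload" href="/fonts/main.woff2" as="font" type="font/woff2" crossorigin>
<!-- Warm up the connection to a third-party origin before it is needed. -->
<link rel="preconnect" href="https://cdn.example.com">
<!-- Load non-critical script without blocking the parser. -->
<script src="/js/analytics.js" defer></script>
```

Hints like these don't reduce how much is downloaded, but they can reshape the request order so that render-critical assets arrive first.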

Generate London on 21/22 September features 15 other presentations covering web animations, UX strategy, prototyping, accessibility, responsive CSS components, and much more. There are also four workshops to choose from on the day before the conference but tickets are very limited. Reserve your spot now!

Original author: Oliver

Copyright

© FLIPBOARD - ORIGINAL AUTHORS
