Asset Performance and Optimization – So why do any of this? – Measure everything





Asset-Performance-and-Optimization

A talk about front end performance.

On Github keathley / Asset-Performance-and-Optimization

Asset Performance and Optimization

Chris Keathley / @ChrisKeathley · Brett Wise / @BrettWise

What we were listening to:

The "Electrochill" playlist on Spotify

&

Music is srsbsns

What are we gonna talk about today?

Optimizing all the things

Disclaimer

Don't check your brain at the door

All of this is based on experience and research, but your situation is always going to be different than ours, so make sure that you test things and make decisions based on data.

So why do any of this?

Impacts on mobile users

Users wait ~6 seconds before they bounce

40%

40% of users will bounce after waiting 3 seconds

Count'em

2

Overall, people expect a page to load in 2 seconds.

85% & 75%

85% expect mobile to load as fast as or faster than desktop; 75% will leave your site for a competitor if it's slow.

Impacts on Desktop

Users wait ~3 seconds before they bounce

Measure everything

  • YSlow
  • PageSpeed
  • Boomerang
  • WebPageTest
WebPageTest has a cool filmstrip feature that shows your site's state at intervals as it loads, answering the question: how long before your page actually starts to render?

Determine your target browsers

You have to know what you're dealing with in order to make optimizations
Real User Monitoring: Google Analytics has a feature that lets you see how quickly your site is being delivered to your users on their devices. But maybe Pingdom is better, because Google just samples and doesn't give you all the results; Pingdom does, and it's performance oriented. There are lots of tools, though.

First steps

Enable Gzip

Who gzips? Who knows what gzip is? It's really easy and it's free (if you pre-gzip), since it doesn't cost clock cycles at request time. It also saves you money if you're going through a CDN or otherwise paying for transfer costs.

Set Cache headers

Cache everything for at least a year. Make sure your cache headers and asset URLs are set up so you can bust caches later.

all-ur-images.jpg

Images make up the majority of site weight.

Image bytes = ↗

Image bytes are up 30% over last year.

🍇

Low hanging fruit

???

So what can we do? Glad you asked.

💾

We can start low tech. Basic. Photoshop is a good place to start, specifically the Save for Web feature. But which format, Brett?

JPEG

Best for photos and images with many colors

GIF

Don't hit next until you say what it's best used for. Well, you know.

PNG

8 vs. 24

Think logos. PNG-8 is good for images with few colors; PNG-24 is for when you need partial transparency.

WebP?

JPEG XR?

WebP is a Google thing. It's good because it's about 25% smaller than JPEG. JPEG XR is a Microsoft thing, supported in IE since IE9. It's good because... well, maybe it's not: the claim is that it's better in areas like compression and has support for higher color accuracy, but the tests I've seen smack that down.
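Since not every browser supports WebP, serving it usually means per-request content negotiation: browsers that understand WebP advertise `image/webp` in their `Accept` header. A minimal sketch (the helper name `pickImageFormat` is ours):

```javascript
// Serve WebP only to browsers that say they can handle it,
// based on the HTTP Accept request header.
function pickImageFormat(acceptHeader) {
  if (acceptHeader && acceptHeader.includes('image/webp')) {
    return 'webp';
  }
  return 'jpeg'; // safe fallback for everyone else
}
```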

Intermediate

Serve assets through a CDN

How many people are currently using a CDN? At the very least, turn on CloudFlare.

Avoid DNS lookups

DNS lookups are expensive, especially for blocking resources like CSS. Serve everything from a single domain if you can. There are exceptions to this rule, for instance forcing downloads in parallel, but by and large you should optimize away DNS lookups.

Use smart caching strats

It's typically good to have a vendor manifest that can be cached independently from the "application" code. You probably don't (and shouldn't) change your vendor files that much. This keeps your users from having to re-download all of your vendor assets every time you make a change to your application code.
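One way to get a separately cacheable vendor bundle is webpack's `splitChunks` option. Note this is the webpack 4+ API, which postdates this talk; the idea (split vendor code from application code) is the same, and the paths and names are illustrative.

```javascript
// webpack.config.js — sketch of splitting node_modules code into its
// own long-cacheable "vendor" bundle, fingerprinted for cache busting.
module.exports = {
  entry: './src/app.js',
  output: {
    filename: '[name].[contenthash].js', // new name whenever contents change
  },
  optimization: {
    splitChunks: {
      cacheGroups: {
        vendor: {
          test: /[\\/]node_modules[\\/]/, // everything pulled in from npm
          name: 'vendor',
          chunks: 'all',
        },
      },
    },
  },
};
```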

UnCSS

The build starts with Sass/SCSS in my case. For production, UnCSS strips unused rules, comments get removed, and everything gets concatenated. How it works: you feed it HTML files or URLs, and it drops any selectors those pages don't actually use.
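The "feed it HTML or URLs" workflow looks roughly like this with the `uncss` npm package's Node API (a sketch under that assumption; the file names and options are illustrative):

```javascript
// Strip unused CSS rules by letting uncss analyze the pages that
// actually use the stylesheet (requires the uncss npm package).
const uncss = require('uncss');

uncss(['index.html', 'about.html'], { ignore: ['.js-only-class'] },
  function (error, output) {
    if (error) throw error;
    // `output` is the stripped-down stylesheet; write it to your build dir.
    console.log(output);
  });
```

Classes added only at runtime by JavaScript won't appear in the static HTML, which is why an `ignore` list is usually needed.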

Advanced topics

Optimize your initcwnd

Who knows what the initial congestion window is? Who knows how much data you can shove in a packet? (around 1,500 bytes). When we open an HTTP connection, the browser sends the server a SYN, the server responds with a SYN-ACK, the browser responds with an ACK, and then the server sends some initial number of packets. Optimally we can get that window to 10, which gives us around 15k worth of data to send. So if we can get our files in that range, we can send an entire file in one burst and the browser doesn't have to send us more ACKs.
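The arithmetic behind that "around 15k" figure, as a quick back-of-the-envelope check (1,460 bytes is the typical TCP payload once IP/TCP headers are subtracted from a 1,500-byte MTU):

```javascript
// First-burst budget: initial congestion window × payload per packet.
const mssBytes = 1460;  // typical TCP maximum segment size
const initcwnd = 10;    // initial congestion window of 10 (RFC 6928)
const firstBurstBytes = mssBytes * initcwnd;

console.log(firstBurstBytes); // 14600 bytes, roughly 14 KB
```

Anything larger than that budget forces at least one extra round trip before the rest of the file arrives.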

Chunk files into 10k groups

Create multiple entry point files

You can use tools like webpack to create multiple entry points for a site, so that all of the important styles are loaded on the first page load.
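A sketch of what multiple entry points look like in a webpack config (entry names and paths are made up for illustration):

```javascript
// webpack.config.js — one bundle per page, so each page only loads
// the code and styles it actually needs.
module.exports = {
  entry: {
    home: './src/pages/home.js',
    checkout: './src/pages/checkout.js',
    admin: './src/pages/admin.js',
  },
  output: {
    filename: '[name].bundle.js', // home.bundle.js, checkout.bundle.js, ...
  },
};
```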

Inline small scripts

Webpack is also really good at inlining specific scripts. It's only really worth it for very small files and in special circumstances.

So Technology

💿

How about those high-tech approaches, though?

Enter imgix.com

Dependency-Free JS Library

Python, Ruby, Etc.

It does all the things for you. Connect your existing image store (S3, etc.). The first time an image is requested, it's fetched from your image store, and new formats like WebP are delivered automatically if the end user's browser can handle them. Using imgix.js, the container size and pixel density are detected, and then the exact image for that screen size is delivered.
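Under the hood this works through imgix's URL-based API: the same source image is resized and re-encoded on the fly via query parameters (`w`, `dpr`, and `auto=format` are real imgix parameters; the domain and helper function here are placeholders):

```javascript
// Build an imgix URL that asks for a resized, format-negotiated image.
function imgixUrl(host, path, params) {
  const query = Object.keys(params)
    .map(function (k) { return k + '=' + encodeURIComponent(params[k]); })
    .join('&');
  return 'https://' + host + path + '?' + query;
}

// A 400px-wide, 2x-density version, served as WebP where supported:
const url = imgixUrl('example.imgix.net', '/hero.jpg', { w: 400, dpr: 2, auto: 'format' });
```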

Crazy People Stuff

Are you ready to...

Mind = Blown

Concatenation: maybe not the best thing since sliced bread?

What if I told you... concatenating all your files may not be the fastest way? (Tell the story of how you found it.) The point is, loading only one file hinders an important optimization method, dynamic parallel loading, in favor of saving the small (but non-trivial) overhead of extra HTTP requests.

Concatenation

May Not Be the Best Way

WTF

h5bp GitHub issues. I was hanging out on the web, getting deep into some GitHub issues, this one in particular.

Conclusion