Google I/O 2010 – Optimize your site with Page Speed


Welcome to the tech talk, and thank you for coming. Today we'll be talking about optimizing every bit of your site, your serving, and your web pages using PageSpeed. I'm Richard Rabbat, a product manager at Google. And I'm Bryan McQuade, a software engineer working on web performance. Before we start, has anybody ever used PageSpeed? Can I see a show of hands? Great, perfect. This is the link to the Google Wave; we encourage you to look at it. There are live notes being taken there, and you can also put moderator questions there. There's a short link on bit.ly as well.

What you're going to get from this talk, quickly: this is a 101-level kind of talk, so we don't assume a lot of prior knowledge in terms of understanding of PageSpeed and performance. We'll cover some of the basics, but we'll also go into some more advanced details. We're going to cover a few things: most importantly, why you should be here, why performance affects your site, and why you should pay attention to performance. We're going to make sure you become familiar with PageSpeed and the new features in PageSpeed, and we'll also be talking about four new product features, namely the export functionality in PageSpeed, the SDK, the Apache module, and PageSpeed for ads and analytics.

We're going to spend some time talking about web performance, so for those of you who haven't seen PageSpeed, a brief view: it's basically a bunch of rules, and a report on how a web page is doing against those rules. We'll go over the details, but since we're spending some time on web performance, it's good for you to see the product first.

So, web performance 101: why should speed matter to you? We know from a lot of user studies that speed means more people viewing your site and more people coming back to your site. Last year at the Velocity conference, which is run by O'Reilly, we were fortunate enough to have a number of companies actually share some of their data on how performance affects traffic. Among those, Google ran what we consider a 400-millisecond latency experiment: we took a set of users that we served more slowly, by 400 milliseconds, and it corresponded to about a 0.6 percent decrease in searches, which is pretty substantial for a company such as Google. Yahoo did a similar experiment, and it hit their traffic by five to nine percent. Shopzilla went a little bit further: they re-architected their whole UI, which contributed to about a five-second latency decrease, and they got a twelve percent revenue increase. Not only that, it also decreased their opex costs, because they needed less hardware to do the serving. So these are important things to worry about whenever you're developing your website. So, Bryan.

So now that we've seen why web speed is important, I'll do a bit more of a deep dive into the technical aspects. Why don't we start with the building blocks of web performance. There are three categories you need to be thinking about when you think about web performance, that end-to-end picture: performance at the web server, on the network, and in the client, the browser. On the server, really the most important thing, the only thing we really look at, is server processing time: how long does it take your server to generate the response? For a static resource, like a file stored on disk, you'd expect that to be close to zero, but for a dynamic response, something generated in response to a user query, you might see increased processing time. Later in the talk we'll cover some ways to mitigate the impact of that processing, but that's the primary factor at the web server. Then on the network, the primary contributing
factors are bandwidth and round-trip time, and we'll dive into those a little more on upcoming slides. Finally, on the client, in the browser, you're looking at parse time, meaning how efficient the browser is at parsing HTML, and resource fetch time, meaning how efficient the browser is at finding and fetching resources. We've seen a big improvement in browser efficiency in the last 12 months in terms of resource discovery and resource fetching. Previously, browsers fetched JavaScript serially; now most modern browsers, all modern browsers in fact, as of the past 12 months, do parallel JavaScript fetches, which is a big win, and we're continuing to see improvements there. Then the last two categories are layout and render, and JavaScript. For most traditional web pages out there, traditional being the pre-Ajax pages, these don't have as big an impact. But if you've got a large or complex DOM, layout can actually be a significant time contributor, and JavaScript, if you've got a JavaScript-heavy Ajax page, is potentially an important contributor as well. For those latter two cases there's actually another tool, Google Speed Tracer (hopefully you got to attend that tech talk), which does a nice job of giving you a timeline and drilling down into the specifics of time spent in layout and rendering and in executing and parsing JavaScript. I'd recommend checking it out; it's a Google Chrome extension that does a nice job.

So now that we've looked at the building blocks, why don't we look at an example page load and see how those building blocks come together over the lifetime of the page load. We'll look at a page load for a Google search request, a search query for "Half Dome photos." We've got three columns here: client and server, which are the operations that happen on the client or the server, and in the third column the render column, which is what the page looks like as a result of these different steps along the course of the page load. What we've done here is really slow down the page load, so we'll see the discrete steps we go through and what each looks like in the browser, and we'll understand how these building blocks come together to actually display the page for the user.

The first thing the browser has to do, every time you navigate to a web page, is potentially perform a DNS lookup. That takes about one round-trip time; in fact, in many cases it'll take longer, because you'll hit multiple DNS caches along the way, but roughly speaking you're looking at one round-trip time for the DNS lookup. Then comes the TCP connection to the server: another round-trip time. Finally, after those two round trips, the client sends the HTTP request to the server, asking for that specific resource; the server begins to process the query and starts sending back the response. At this point, three round-trip times have passed. Round-trip time varies considerably depending on where you are and how well-connected you are on the internet: anywhere from single-digit milliseconds on a local LAN (I think the average is about 70 milliseconds) up to hundreds of milliseconds, or even a second in the worst cases. So minimizing round trips is a really important part of optimizing your website.

Finally, once the response comes back after these three round-trip times, the browser can begin to parse that content, and we start to see the page rendering on the screen. Subsequently more of the content comes back, the browser continues to parse, and in this case the browser has discovered that there are four image resources embedded in the response, and so it begins to fetch those resources.
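The round trips just described add up before the user sees a single byte. A back-of-the-envelope sketch: the 70 ms round-trip figure is the average quoted above, and the server processing time is a made-up example value.

```python
# Back-of-the-envelope model of time to first byte, following the
# sequence described above: DNS lookup, TCP connect, then the HTTP
# request/response, each costing roughly one round trip.
RTT_MS = 70        # one round trip (the talk's quoted average)
SERVER_MS = 50     # hypothetical server processing time

dns_lookup = RTT_MS     # ~1 RTT (often more, with multiple DNS caches)
tcp_connect = RTT_MS    # 1 RTT for the TCP handshake
http_request = RTT_MS   # 1 RTT for the request out and first bytes back

time_to_first_byte = dns_lookup + tcp_connect + http_request + SERVER_MS
print(time_to_first_byte)  # 260 ms before any content can render
```

Every sub-resource on a new host can pay the DNS and TCP costs again, which is why cutting round trips matters so much.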
What we see is that the network queues up each of those sub-fetches, and each one is potentially going to incur a DNS lookup and a TCP connection as well, so you're seeing additional latency there too. Eventually these responses start to come back. I'll mention that the gray section is the off-screen portion of the page, so the top portion is the part the user can actually see. What we're seeing here is that the page renders the most important, user-visible content first. The yellow regions are the repainted regions during the last iteration of the load, so as the image responses come back they continue to fill in, and finally the page finishes rendering. That's how these different factors, DNS, TCP, client-side parsing and layout, and sub-resource fetches, come together during the life cycle of the page load to load and render the page. Typically, when you're performing a Google search, it hopefully feels like it just loads, but in fact all these little discrete steps are happening along the way, and understanding them, and how they come together, helps you understand how to optimize the page. Given that, Richard will summarize.

If you go away from this tech talk and you need to remember just three things, these are the three speed guidelines you should always worry about whenever you're developing a web app. First: minimize the number of bytes that you're sending over the network, because bytes travel in packets, and more packets mean more round trips. Some of the ways we suggest you do that: compress when serving, by enabling gzip compression. Lots of people do it; some still don't. If you have a web hoster that is hosting your content and isn't enabling compression, move to another one. Optimize your images: a lot of the images that come out of a camera are very verbose, with a lot of unnecessary meta-information, so get rid of it using one of the many open source tools available, and also make sure you're only sending the right size and resolution of the image. That saves bytes on the wire, but it also saves processing time on the client. Get rid of all the content in the HTML, the JavaScript, and the style sheets that you put there for development's sake; all the comments are things your browser doesn't care about, so strip them with minification tools such as the Closure Compiler, which is also an open source project. And cache aggressively. The way I think of it, the fastest serving is when you don't have to serve at all, because everything is already in the cache. Also, see if you can push things to the browser early, things that will be needed a little bit later, so the user isn't left waiting.

So: serve fewer bytes. Second, parallelize resource downloads. Modern browsers use up to six parallel connections per host, so try to make use of them all, and we'll talk in a bit about one of the rules, optimizing the order of styles and scripts, which also helps parallelism. Third, don't shy away from promoting modern browsers. Don't develop for the lowest common denominator; it doesn't help. Push the envelope. If you need to support older browsers, detect the user agent and serve unoptimized content for that user agent. For example, don't serve sprites to old browsers that can't handle them, but do use image spriting when the user agent supports it. So, three things: serve fewer bytes, parallelize, and push the envelope in terms of browser support.

I know it's been a few minutes since we started the talk and people are anxious to see it, so let's take a look at PageSpeed.
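The compression advice above is easy to sanity-check yourself. A small sketch in Python showing how much a repetitive blob of markup shrinks under gzip; the markup is made up, and real savings depend on the content, but HTML, CSS, and JavaScript all tend to compress very well:

```python
import gzip

# Simulate a page full of repetitive markup, then gzip it and
# compare the byte counts, which is what "serve fewer bytes" buys you.
html = ("<div class='item'><span>Half Dome photo</span></div>\n" * 200).encode("utf-8")
compressed = gzip.compress(html)

print(len(html), len(compressed))
# Highly repetitive markup typically shrinks by well over 90 percent.
assert len(compressed) < len(html) // 10
```

Fewer bytes means fewer packets, which means fewer round trips on the wire.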
PageSpeed is a Firefox extension that runs inside Firebug, and we have about 1 million active users. For those of you that haven't used it: download it and join the fun. This is our site; go to google.com/speed/page-speed. The way you're going to use it: this is the little Firebug window, and you start it up. PageSpeed is an extension within Firebug, and it tells you about the new features we have. The first thing you do is analyze the performance of the page. I'm analyzing this page on the Code site, and it gives me a bunch of rules. The first thing you see is a score. The score is something we believe is a good indication, a good metric you can use, and it's reliably reproducible. We're getting about 82 out of 100; we think that's okay, an okay web page. Then you have a bunch of rules that we executed, and each one tells you what the issue is. For example, here, "leverage browser caching": a lot of this JavaScript has an expiry time of one hour. You should look at that expiry. Does it really need to be one hour, or can you push it to 24 hours, or seven days? Seven days is a good value; it makes sure that anybody who comes back to your site can actually have the resource in the browser's cache. There are obviously a number of rules here, and I encourage you to explore them, and also to look at some of the documentation. The easiest way to get to the documentation is just to click on a rule. If you don't understand the lingo, just click on the rule, and we have a lot of documentation, all of it open source, where we try to be very descriptive about what the problem is and how you can resolve it. So, going back to our presentation: let's look at one example of why speed-minded development matters.
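For reference, the "leverage browser caching" advice above comes down to response headers along these lines. This is a sketch only; the resource name, date, and the one-week lifetime are illustrative values, not something PageSpeed mandates:

```http
HTTP/1.1 200 OK
Content-Type: application/javascript
Cache-Control: public, max-age=604800
Expires: Thu, 27 May 2010 20:00:00 GMT
```

Here `max-age=604800` is seven days in seconds, the lifetime suggested in the talk, so returning visitors serve the file straight from the browser cache.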
For each of those PageSpeed suggestions: why is it important that you adopt it? What is it doing to make your site faster? We'll look at one specific example, which we talked a little bit about earlier around parallelization: the ordering of styles and scripts. Here's an example head of an HTML web page, with some interspersed CSS and JavaScript content. It looks reasonable enough, but in fact, in some browsers, intermixing CSS and JavaScript like this (some CSS, then JavaScript, then CSS) introduces additional serialization delays in the page load. What you get is: the first CSS and JavaScript files load, then there's another delay on the next JavaScript file, and then the final CSS files load. In this example, with a 100-millisecond round trip, we're looking at roughly 300 milliseconds. It turns out that if you just reorder these things, putting all the CSS up front followed by the JavaScript, the browser (some browsers, anyway) will fetch that content more efficiently, and you'll be able to remove one of the round-trip times. This is an easy fix, and like all of our suggestions, it shouldn't have any impact on the look and feel of your page; as far as the user is concerned, the page is exactly the same. What you get is that you remove one of those round trips and go from 300 milliseconds to 200 milliseconds without any other change in the page.

We launched in June, so it's been about a year, and over the last year we've been working hard on a number of things. We've added some new rules and fixed some others, and we want to talk about a few of them, just to give some examples. We added a rule called "minimize request size" in the last year. The idea there is that each request the browser makes has some overhead.
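Going back to the styles-and-scripts rule for a moment, the reorder looks like this in markup. The file names are made up; the point is only the relative order in the document head:

```html
<!-- Before: CSS and JavaScript interleaved; some browsers serialize these fetches -->
<link rel="stylesheet" href="a.css">
<script src="a.js"></script>
<link rel="stylesheet" href="b.css">
<script src="b.js"></script>

<!-- After: all stylesheets first, then scripts; the fetches can overlap -->
<link rel="stylesheet" href="a.css">
<link rel="stylesheet" href="b.css">
<script src="a.js"></script>
<script src="b.js"></script>
```

Same four resources, same rendered page, one fewer round trip in the browsers that serialize mixed CSS and JavaScript.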
There are things you can do, such as reducing cookie sizes and reducing the length of URLs, to keep that request size small so that it fits within a single TCP packet and is more likely to be transmitted efficiently and quickly over the network. That's especially important on mobile, because mobile tends to have high latency and asymmetric bandwidth, with a slower uplink than downlink.

Next, "specify a cache validator" is a new rule we added pretty recently. The idea is that for static content, static resources, once they do expire (say you set an expiration of a week, a month, or a year), it's possible for the browser to ask the server: "I have this resource, it's not fresh anymore, it's expired; has it changed?" The server can say "no, it hasn't changed," and the browser can renew it and keep it for another week, for instance. Using a cache validator allows you to do that; otherwise you have to download the entire resource again, even if it hasn't changed.

Then, "specify a character set early": it turns out that if you're serving HTML content and you don't specify the character set, whether it's UTF-8 or Shift-JIS or whatever it might be, the browser has to guess what the character set is. To do that, it buffers content in memory before it actually starts parsing. So the browser is downloading the content as quickly as the server can send it, but the user sees nothing on the screen until the browser finishes buffering and analyzing all that content to guess the character set, which it could possibly get wrong, and only at that point does it start rendering content. Just specifying a character set in the HTTP response headers, as in "Content-Type: text/html; charset=UTF-8" or whatever it might be, allows the browser to parse and render the content more efficiently as it arrives on the wire.
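The cache-validator revalidation described above is just a conditional request. A sketch of the exchange, with illustrative URL, date, and ETag values: the browser presents the validators it saved, and when nothing has changed the server answers with a body-less 304 instead of resending the resource.

```http
GET /static/logo.png HTTP/1.1
Host: www.example.com
If-Modified-Since: Tue, 18 May 2010 10:00:00 GMT
If-None-Match: "abc123"

HTTP/1.1 304 Not Modified
ETag: "abc123"
```

Either validator (Last-Modified or ETag) on the original response is enough to make this cheap revalidation possible.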
Finally, "minimize DNS lookups" is a rule we implemented initially based on analyzing pages internally, a long time ago in fact. What we noticed is that for certain sites and certain third-party content, it tended to flag resources that we felt we probably shouldn't be flagging. So we recently spent some time on the algorithm and tuned it, making it basically more accurate, so it gives more accurate recommendations. In fact, we just released PageSpeed 1.8, which has a new implementation of "minimize DNS lookups" that is more accurate and less likely to give you incorrect suggestions. We're constantly tuning these rules and constantly adding new ones, both as we find issues ourselves and from user feedback; we'll show you a link to the PageSpeed discussion forum at the end of the talk. Right, go ahead, Richard.

So, a bunch of new features. Today we'll talk a little bit about the export functionality. It's basically a beacon that you can send, and I'll just go through the demo directly. Let's go back here. Right, we have export functionality that will allow us to export... yes, we're switching browsers; I'll switch to Safari. Like I said, we're always finding and fixing problems. Basically we have two export functionalities: one sends the data back in JSON format, and the other sends the scores to showslow.com. If you guys want to try it, and you have PageSpeed running, just send it out; you'll get a bit of a legal disclaimer. We worked with an outside, independent developer who maintains showslow.com, and basically it gives you a way of keeping track of your PageSpeed score across time. In this case I'm showing an example of, I believe, google.com, youtube.com, and gmail.com, and measurements that we're sending to Show
Slow. So when you're doing development and you change your web page to be more performant, you can track the performance of your page across time. I encourage you to use this export functionality; just don't hit showslow.com with too many beacons at the same time, or you're sure to make ShowSlow a little slow. The actual site is here. YSlow, by the way, is a competitor to PageSpeed; we encourage you to use as many performance tools as are available to you. In this case, what we did earlier is send a bunch of requests, a bunch of beacons, to showslow.com, and they're recorded right here so you can keep track of them. These are the comparisons: google.com and youtube.com are here, and you can see the performance of your page over time. Okay, let's go to the next feature.

So one of the things we've been working on over the past year is the PageSpeed SDK. At the time of our initial launch, PageSpeed was entirely a JavaScript implementation, tightly coupled to Firefox APIs, and what we found was that we wanted to reuse the PageSpeed logic in other environments. One place early on where we said we'd like to provide this is in Google Webmaster Tools. How many of you are familiar with Google Webmaster Tools? Great. So hopefully you've seen that there are PageSpeed suggestions in the Webmaster Tools UI, in the Labs section. If you haven't used Webmaster Tools before, I'd definitely encourage you to check it out; it's a great resource with lots of helpful information for your website. Assuming you have a website, you can sign up and learn about your site there. What we did over the last nine months or so is port rules from the JavaScript base to a browser-independent library implemented in C++, which we're able to reuse in PageSpeed for Firefox, in Webmaster Tools, and in other environments as well. You can now download that SDK and use it. We've got builds set up for Linux and for Windows, and if you want to build on Mac, I don't think it'll take much work; if you figure out what small changes to make to the makefile to get that working, feel free to share them with us, and we'd be more than happy to include them in our open source repository.

I mentioned Webmaster Tools; that is one of the places where PageSpeed is available today. Here's an example: this is the YouTube area of Webmaster Tools, which we were able to see, and it gives you some example feedback for pages on your site. You can drill down and, for instance, see that these four rules have specific suggestions to help you tune and optimize the site. Here, for example, is the rule "combine external JavaScript," which you can learn more about in the documentation Richard showed earlier; the idea is that if you combine these two resources, the browser will be able to load the page more efficiently, at least in some browsers. So now you not only have access to PageSpeed suggestions in the Firefox tool; you can just go to Google Webmaster Tools and get this information without having to install an extension and without having to run it live on your site. This data is simply provided for you as part of the Google Webmaster Tools service.

In addition, we've worked with a couple of other tools as well. The PageSpeed for Firefox UI is now driven off the PageSpeed SDK. Gomez, a web performance company we've been working with, has also integrated the PageSpeed SDK rule set, and they're providing that in their tools; this is a pre-release they haven't actually launched yet, but it is something that will be coming soon. And
then Steve Souders built a nice web page where you can take a HAR file (an HTTP Archive file, a fairly new JSON format that lets you capture all the information about a page load: the resource content, headers, and timing information), paste it into the page, and it uses the PageSpeed SDK to come back and give you a PageSpeed score. These are just a few deployments; we launched the SDK about a month ago, and we've seen great uptake.

Let's do a little bit of a deep dive into the SDK now to see how you might use it. If you wanted to use the PageSpeed SDK, it's pretty straightforward. You just need to choose an output format, that is, how you want to present the results: plain text, HTML, JSON, etc. Then pick the PageSpeed rules you'd like to run, specify a source for your input data (for instance, HAR or some other input source), and then just invoke the engine. Let's look at a snippet of code that does that. We're choosing to use a text formatter, something that will print to standard out in this case, and we're populating the core rule set, the core PageSpeed rules you're familiar with from the tool. We're going to use a HAR file as our input; this is an example HAR. HAR is a JSON format, and the "..." here would be a big blob of content that contains all the resource bodies and other things. Then finally we invoke the PageSpeed engine: pass it the rules, initialize, compute, and format the results, and at that point the results are printed to standard out on the console.

So let's look at that, actually. One of the tools bundled with the PageSpeed SDK is called har_to_pagespeed, which is actually the tool that powers the HAR-to-PageSpeed website Steve built. The code we just looked at is the core, the guts, of that tool, plus the ability to read a file from a command-line argument.
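For reference, here's a heavily trimmed sketch of what a HAR file looks like. Real HAR 1.2 files carry more required fields (timings, cookies, page references, and so on), and every value here is made up; the "..." placeholders stand in for the resource bodies mentioned above:

```json
{
  "log": {
    "version": "1.2",
    "creator": { "name": "example-exporter", "version": "0.1" },
    "entries": [
      {
        "startedDateTime": "2010-05-19T12:00:00.000-07:00",
        "time": 120,
        "request": {
          "method": "GET",
          "url": "http://www.example.com/",
          "headers": []
        },
        "response": {
          "status": 200,
          "headers": [{ "name": "Content-Type", "value": "text/html" }],
          "content": { "mimeType": "text/html", "text": "..." }
        }
      }
    ]
  }
}
```

One entry per fetched resource, with its request, response, and timing data, is what gives the rules enough to score a page offline.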
Beyond that, it's essentially what we just looked at. You run it like this; very simple. Now you're not in a browser anymore, you're on the command line, a different environment, and you're able to get that same information, those same results, here on the console, easily and quickly. Potentially you could write an automation tool that uses something like this to automatically analyze HAR files over time, all without having to stand up a browser and run PageSpeed, so you can learn specifically what you can do to speed up the web page. So that's the PageSpeed SDK.

Now let's look at another deployment of PageSpeed technology that we've been working on. It's very early in the life cycle of the project, but what we decided we wanted to do is shift as much as possible from telling web developers what they can do to speed up their sites to actually doing it for them. So essentially, we implemented an Apache module that encapsulates a lot of these PageSpeed suggestions. All you have to do is install this module on your Apache server, and then, ideally, you don't have to think about the problem anymore: we just automatically optimize the images, the HTML, the CSS, and the JavaScript, combine resources, and extend caching lifetimes using a technique called resource fingerprinting, which is discussed in our documentation as well. All these things are handled automatically, so you don't have to go to the trouble of implementing them, and web content hosters don't have to either; they can just have this applied automatically. This project is open source as well. Like I said, it's early in the development cycle, so it's not ready for you yet, but if you're interested, take a look at our code.google.com repository.

The question was: does it insert semicolons at the ends of lines? I actually don't think it preserves the newlines, so it would need to insert semicolons.
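To illustrate the hazard behind that question, here's a sketch in Python, operating on JavaScript as plain strings. Real minifiers parse the code rather than doing string replacement; this only demonstrates why newlines can't simply be deleted.

```python
# Two JavaScript statements that rely on automatic semicolon insertion:
js = "var a = 1\nvar b = a + 1"

# Naive "minification": stripping newlines fuses the statements into
# nonsense that a JavaScript engine would reject.
broken = js.replace("\n", "")

# A safer sketch: turn each newline into an explicit semicolon first.
safe = js.replace("\n", ";")

print(broken)  # var a = 1var b = a + 1
print(safe)    # var a = 1;var b = a + 1
```

The first result is a syntax error in JavaScript; the second is still valid, which is exactly the fix the question is asking about.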
That's actually a good example of why we get that question. He asked whether it retains semicolons at the ends of lines, because one of the things JavaScript minifiers do is remove newlines, and JavaScript newlines implicitly add a semicolon, so if you just combine two lines, you can end up with JavaScript that breaks. I want to say we do fix that, but I'd have to double-check. In any case, if you run into a problem, go to the same URL and file an issue, or post on our discussion forum. We're always happy to accept code patches if you're interested in submitting them, or we'll try to fix the issue ourselves for a subsequent release.

So here's an example. As the HTML flows through Apache, coming in unoptimized like this, perhaps, we parse that HTML and perform some optimizations, and what you end up with is HTML that's a little more minified and serves fewer resources. What we've done here is combine the two CSS resources and combine the two JavaScript resources, and as you can see, we would have also extended caching lifetimes and removed unnecessary whitespace. Before you go on, why don't you talk a little bit about the extension of caching lifetimes, because it's quite interesting. Oh, sure. One thing we often find is that caching lifetimes are either unspecified or not set very aggressively on some sites, and developers are sometimes concerned: if I extend it for a week or a year, what if I need to change that resource? So what we recommend is a technique we call fingerprinting, URL fingerprinting essentially, which looks at the actual content of the resource and embeds a fingerprint of it in the URL. What you're looking at here is /cache/, then some blob that makes no sense, then .css, and that blob is actually part of an MD5 sum of the concatenation of the two files' contents.
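The fingerprinting idea can be sketched in a few lines of Python. The file contents, the /cache/ prefix, and the truncated digest are illustrative, not mod_pagespeed's actual naming scheme:

```python
import hashlib

# Hash the concatenated contents of the combined stylesheets and
# embed part of the digest in the URL.
a_css = b"body { margin: 0; }"
b_css = b".nav { color: blue; }"

digest = hashlib.md5(a_css + b_css).hexdigest()
url = "/cache/%s.css" % digest[:12]
print(url)

# If either file changes, the digest changes, and therefore so does
# the URL, so a far-future expiry on the old URL is safe: browsers
# refetch only when the content actually changes.
changed = hashlib.md5(a_css + b".nav { color: red; }").hexdigest()
```

Since `digest != changed`, the new content gets a new URL automatically, which is what makes the year-long caching lifetime described next safe to use.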
That is, a.css and b.css. So now, because we've captured a fingerprint of the actual contents in the URL, we can use a really aggressive caching lifetime; we can set this thing to not expire for a year. And if the resource does happen to change, well, the contents change, so the fingerprint changes, and in turn the URL changes. The browser will know: "Oh, I have to go fetch this other resource, which has a different URL that's not in my cache." So instead of having to wait for an expiration time, you basically expire the resource exactly when it changes: you just change the URL, and the browser downloads it as soon as the URL changes. It's a bit of a fragile technique to do by hand, since you have to match up the content signatures with the actual URLs and content, but if you use mod_pagespeed, it will do that for you automatically. So that's mod_pagespeed.

So, basically, this came out of Google Labs. We spent a lot of time trying to understand optimization of UIs at Google, and we built a lot of these rules internally. After we released PageSpeed as open source about a year ago, just like Bryan said, we got a lot of good feedback, and one of the most important pieces of feedback is that a website doesn't usually come from one property. For a publisher, for example, there's the content that the reporter is writing, there are the ad systems you're shipping so you can monetize your pages, and there's also tracking and analytics so you can keep track of the measurements and metrics you care about. We spent a lot of time trying to understand how to address this, and our approach is to give as much information back to the developer as possible. To do that, as a first step on third-party content, we started focusing on ads and trackers, and I'll show
So this is the YouTube page, and I'm going to start Firebug. This is Page Speed, and there's a filter option with these choices: analyze ads only, analyze trackers only, analyze the content only, and the complete page. The complete page is what you're used to when you run Page Speed, for those of you who have run it. Now we're going to filter only the ads and see what we get. First I'll analyze the performance of the complete page: I get a bunch of rules, and obviously there are a lot of recommendations for just about every rule. Then I'll analyze the ads only and refresh the analysis. You can see that all these rules are no longer applicable, and what we're looking at is specifically the ad content. We have a number of filters for what we think are ads, and by the way, all the filters are open source, so if you have suggestions for adding more, please do; we know we don't have a lot of international coverage today, so it would be good to get more coverage for international ad systems and ad sources we don't capture yet. You'll see, for example, DoubleClick being served on YouTube; it's obviously an ad, and we'll give you recommendations about it. The same thing happens for analytics, although I don't believe YouTube has analytics on its pages. What this gives you is enough information to understand how third-party content is affecting the performance of your pages. We're going to extend it to gadgets; we believe gadgets are becoming a big part of every web page, and we think it's important for every web developer to understand the impact of all the content they have, to understand the impact of the things they control versus the things they don't, and to make the right decision. Every development decision is a balance between adding more features and thinking about speed as a feature, and we're sure you can find that balance. With this, we hope that giving out all this information will also spur third-party content providers to make sure their content is optimal, so that when it's served from your web pages it's fast and performant.

This is a feature we just added; we released it this morning. It's in beta, and ten percent of our users are getting it as of this morning, so you may get it automatically if you have Page Speed installed. If not, you can go to the Page Speed download page and download the beta, and you'll get the feature as part of that download. Since it's a new feature, give us feedback online on the discussion list; that would be great. And obviously I just gave away the future work.

Yep, so looking forward, I'll talk about a few of the rules we're thinking about adding to the rule set over the coming months; I'll talk about three. The first is to recommend using chunked encoding. Chunked encoding is a technique that lets you send a page in pieces, as opposed to sending the whole thing only after generating the whole thing. This relates to the bit I talked about at the beginning: server latency, what we call server processing time, can add to the page load, and oftentimes if you use chunked encoding you'll mitigate or even eliminate that as far as the user is concerned. The assumption is that most dynamic pages, such as search results pages and user-customized pages like email websites, have a static bit of content at the head of the page that takes nothing to compute; it's essentially just a static string. The idea is that you send that as the first chunk of the response, and while you're doing that you start computing the actual dynamic data the user requested, so in parallel you're sending that static data on the wire while you're computing the user's result. As soon as the dynamic content is generated and ready to serve, you serve it right behind as a separate chunk, and depending on the user's connection, it may just look like one consistent stream of data that was never interrupted.

I should say that the default behavior in HTTP is to specify the length of the response in the response headers, and the response headers cover the entire response body, so by default you have to wait and buffer the entire dynamic response before you start sending any of it. Chunked encoding lets you do this in chunks: send the static header first and the dynamic body afterwards. What we've seen is that this has been a big win for Google properties like Search and Calendar that fit that constraint of a dynamic response with a static header. Oftentimes, before implementing this technique, you'll have an HTTP waterfall chart, the timeline of the resources being downloaded, that looks something like this: you spend a lot of time downloading the HTML resource, and only towards the end of that download do you start downloading the sub-resources declared in the content. Once you enable chunking, one nice side effect is that external JavaScript and CSS are often declared in that static chunk, so by sending the static chunk much sooner you pull those sub-resource fetches in considerably and allow the browser to start downloading, parsing, and applying those resources much earlier in the page load. So this is a useful technique for dynamic responses.

Second: minimize the size of early-loaded resources. The idea here is that browsers have become much more efficient at downloading resources, specifically JavaScript. A year ago, most browsers out there would download JavaScript serially.
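Going back to the chunked-encoding rule for a moment, the flush-the-static-head pattern might look something like this sketch. The `res` object stands in for a Node.js `http.ServerResponse`, which uses chunked transfer encoding automatically when you call `write()` without setting a Content-Length, and `computeBody` is a placeholder for whatever dynamic work the page actually needs:

```javascript
// Static head of the page: costs nothing to compute, and declares the
// CSS/JS sub-resources so the browser can start fetching them early.
const STATIC_HEAD =
  '<html><head>' +
  '<link rel="stylesheet" href="/styles.css">' +
  '<script src="/app.js"></script>' +
  '</head><body>';

function serveChunked(res, computeBody) {
  // First chunk: flush the static head before doing any dynamic work,
  // so it is on the wire while the server is still computing.
  res.write(STATIC_HEAD);
  // Meanwhile, compute the dynamic part of the page...
  const body = computeBody();
  // ...and send it right behind as the next chunk once it is ready.
  res.write(body + '</body></html>');
  res.end();
}
```

To the browser, the two chunks arrive as one continuous response, but the sub-resources declared in the static head can start downloading before the dynamic body even exists.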
If you had ten JavaScript files declared in a row, the browser would download one, wait for it to finish, parse and execute it, then move on to the next one, and you'd see this stair-step pattern in the waterfall chart. What we're seeing now with the modern browsers (all the major browser vendors have implemented this) is parallelized JavaScript fetches and much more efficient use of the network. But regardless, the browser can't show anything to the user until all of those resources have been downloaded and all of the JavaScript has been parsed and executed, and CSS as well. So the less you serve up front in the head of the page, and the more you can defer to later in the page, until after content has been rendered, the faster that initial flash of content, the time to first paint, will be, and the less time the user spends sitting there staring at a blank screen. You can actually apply this technique today using two of the rules in the Page Speed rule set for Firefox, remove unused CSS and defer loading of JavaScript, which help you understand which JavaScript and CSS are actually used up front on your page and which aren't needed until later. What this new rule will do is streamline that process a little and make the technique easier to apply.

And then finally: minimize fetches from JavaScript. As browsers have become more efficient in the last twelve months and parallelized their JavaScript fetches, what we've seen is that JavaScript fetched using JavaScript still gets serialized, so you pay a penalty for fetching JavaScript from JavaScript. You'll see this on a lot of major websites: they'll do something very straightforward, write a couple of script tags, and then use some JavaScript library to load a couple more JavaScript resources. It seems pretty reasonable.
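As a sketch of that pattern (the file names and the `loadScripts` helper here are made up for illustration): the browser's look-ahead parser scans the raw markup for script tags with a `src` attribute, so scripts requested from inside JavaScript are invisible to it until the renderer actually executes that code.

```javascript
// BEFORE: two resources loaded from JavaScript. Only loader.js is
// visible to a parser scanning the markup; foo.js and bar.js are not
// discovered until the inline script actually runs.
const before =
  '<script src="loader.js"></script>' +
  '<script>loadScripts(["foo.js", "bar.js"]);</script>';

// AFTER: the rewrite the rule recommends, plain script tags that a
// speculative fetcher can see and download in parallel.
const after =
  '<script src="foo.js"></script>' +
  '<script src="bar.js"></script>';

// A toy speculative fetcher: like the real one, it only "sees" src
// attributes in the markup, never the results of running JavaScript.
function speculativelyFetchable(html) {
  return [...html.matchAll(/<script src="([^"]+)"/g)].map((m) => m[1]);
}
```

Running the toy fetcher over the two fragments shows the difference: in the `before` version only `loader.js` can be prefetched, while in the `after` version both scripts are visible up front.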
But here's what it does in modern browsers. Traditionally this had no latency impact; a year ago, in most browsers, it made no difference whether you fetched a script using JavaScript or using a script tag, because it was going to be fetched serially either way. What happens now is that the browser uses a speculative fetcher that parses ahead of the renderer, looking for tags. It finds a script tag and says, foo.js, I'll fetch that. It parses ahead and ahead, hits an inline script block, and says, well, I'm just a speculative fetcher, I don't actually execute JavaScript, so I can't do anything about this one; skip it. Eventually the renderer receives the fetched scripts, parses and executes them, and says, OK, next I'll execute that inline script block, because it executes scripts in order, and only then do the JavaScript-loaded resources get fetched. What this ends up looking like is the stair-stepping behavior you saw in older browsers: serialized JavaScript fetches. So if you've got JavaScript fetched this way in your page, and it's easy to turn those fetches into plain script tags, you'll go from serialized JavaScript fetching to parallel JavaScript fetching, making your page display its content sooner than it otherwise would.

Then, finally, a bit about future plans. A lot of the development of Page Speed happened early on, when we didn't have Chrome and we didn't have its developer tools, and in the past year we've been focused a lot on the rules, the correctness of the rules, and building the SDK. We're going to be releasing a version of Page Speed for Chrome with integration into the Chrome developer tools, and we're hoping to get it out by the end of this year. We know it's been a highly requested feature by everybody, and we apologize for not having been able to do it earlier, but the developer tools are now such a complete developer environment for us that we know we'll be landing in Chrome this year. So, where can you get more information? A bunch of places.
We have a very well-developed website, thanks to our wonderful tech writer, so everything is at code.google.com/speed. All our development is in the open source; we don't do any development in a sandbox somewhere, so do contribute if you'd like to. Filing bugs and asking for features is great too, and there's a pretty active mailing list you can subscribe to. Help us make the product better: tell us about success stories using Page Speed, or about problems where you used Page Speed and it didn't perform, so we can improve it. Thank you. So let's see if we have anything on Moderator to cover, and if you have questions, please come up to the microphone here.
