4 votes

Scenario: You are building a large JavaScript-driven web application where you want as few page refreshes as possible. Imagine 80-100MB of unminified JavaScript, just for the sake of having a number to define "large".

My assumption is that if you lazy-load your JavaScript files, you can strike a better balance on your load times (meaning you don't have to wait a few seconds each time the page refreshes), hopefully resulting in the user not really noticing any lag during loading. I'm guessing that in a scenario like this, lazy loading would be more desirable than your typical single minified .js file.
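To illustrate what I mean, here is a minimal sketch of lazy loading via dynamic script injection (the file name, element ID, and `initEditor` function are hypothetical):

```javascript
// Minimal lazy loader: injects a <script> tag on demand and fires a
// callback once the file has loaded and executed.
function lazyLoad(src, onLoad) {
  var script = document.createElement('script');
  script.src = src;
  script.onload = onLoad;
  document.head.appendChild(script);
}

// Only fetch the editor module when the user actually opens the editor.
document.getElementById('open-editor').onclick = function () {
  lazyLoad('/js/editor.js', function () {
    // editor.js is assumed (hypothetically) to define initEditor()
    initEditor();
  });
};
```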

Now, theoretically, there is a fixed cost for a request to any file on a given server, regardless of the file's size. So too many requests would not be desirable. For example, if a small JavaScript file loads at the same time as 10 other small- or medium-sized files, it might be better to group them together to save on the cost of multiple requests.
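As a back-of-the-envelope model (the round trip time and bandwidth below are assumptions, not measurements): if each request pays roughly one round trip of fixed cost before any data flows, ten small files fetched one after another pay that cost ten times, while a single bundle of the same total size pays it once:

```javascript
// Toy model: each request costs ~1 round trip (RTT), then transfers at
// the connection's bandwidth. Worst case: files fetched serially.
function loadTimeSeconds(files, rttSec, bytesPerSec) {
  var total = 0;
  for (var i = 0; i < files.length; i++) {
    total += rttSec + files[i] / bytesPerSec;
  }
  return total;
}

var rtt = 0.1;              // assumption: ~100 ms round trip
var bandwidth = 500 * 1024; // assumption: ~4 Mbps ≈ 500 KB/s

var tenSmall = [];
for (var i = 0; i < 10; i++) tenSmall.push(50 * 1024); // ten 50 KB files

console.log(loadTimeSeconds(tenSmall, rtt, bandwidth));     // 2.0 s
console.log(loadTimeSeconds([500 * 1024], rtt, bandwidth)); // 1.1 s
```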

My question is: assuming reasonable defaults (say the client has a 3-5Mbps connection and decent hardware), what is the ideal size of a file to request? Too large, and you are back to loading too much at one time; too small, and the cost of the request becomes more expensive than the amount of data you get back, reducing your data-per-second economy.
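One way to frame that "data-per-second economy": under the same toy model as above, the effective throughput of a single request is size / (RTT + size / bandwidth), which climbs toward the raw bandwidth as the file grows (again, the RTT and bandwidth figures are assumptions):

```javascript
// Effective throughput of one request: the fixed round-trip cost is
// amortized over more bytes as the file gets larger.
function effectiveKBps(sizeBytes, rttSec, bytesPerSec) {
  return (sizeBytes / (rttSec + sizeBytes / bytesPerSec)) / 1024;
}

var rtt = 0.1, bw = 500 * 1024; // assumptions: 100 ms RTT, ~500 KB/s

console.log(effectiveKBps(10 * 1024, rtt, bw));   // ≈ 83 KB/s
console.log(effectiveKBps(100 * 1024, rtt, bw));  // ≈ 333 KB/s
console.log(effectiveKBps(1024 * 1024, rtt, bw)); // ≈ 477 KB/s
```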

Edit: All the answers were fantastic. I only chose Ben's because he gave a specific number.

3
Great question! I'm curious about the actual numbers here myself. – Razor Storm
Nitpick: "mb" stands for millibit, "Mb" stands for megabit, and "MB" stands for megabyte. May I assume that you actually mean 80-100MB and 3-5Mbps respectively? – BalusC
@BalusC Thanks for the clarification! It should read correctly now :) – benekastah

3 Answers

2 votes

I would try to keep the amount that needs to be loaded to show the page (even if that's just the loading indicator) under 300K. After that, I would pull down additional data in chunks of up to 5MB at a time, with a loading indicator (maybe a progress bar) shown. I've had 15MB downloads fail on coffee-shop broadband wifi that otherwise seemed OK. If the connection were bad enough that <5MB downloads failed, I probably wouldn't blame a website for not working.
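A rough sketch of that chunked approach (the chunk URLs and the `#progress` element are hypothetical):

```javascript
// Load script chunks one at a time, updating a progress bar as each
// chunk finishes. Chunk URLs and the #progress element are made up.
var chunks = ['/js/chunk1.js', '/js/chunk2.js', '/js/chunk3.js'];

function loadChunks(index) {
  if (index >= chunks.length) return; // all chunks loaded
  var script = document.createElement('script');
  script.src = chunks[index];
  script.onload = function () {
    var pct = Math.round(((index + 1) / chunks.length) * 100);
    document.getElementById('progress').style.width = pct + '%';
    loadChunks(index + 1); // fetch the next chunk only after this one ran
  };
  document.head.appendChild(script);
}

loadChunks(0);
```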

I would also consider downloading two files at a time beyond the initial <300K file, using a loader like LABjs or HeadJS to programmatically add script tags.
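Without pulling in a loader library, a minimal version of "two at a time" might look like this (file names are hypothetical; setting `async = false` on dynamically inserted scripts requests ordered execution in browsers that support it, which is the behavior loaders of this kind rely on):

```javascript
// Fetch a pair of scripts in parallel while keeping execution order.
['/js/moduleA.js', '/js/moduleB.js'].forEach(function (src) {
  var script = document.createElement('script');
  script.src = src;
  script.async = false; // download in parallel, execute in insertion order
  document.head.appendChild(script);
});
```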

2 votes

I think it's clear that making the client download more than a megabyte of JS before they can do anything is bad, and that making the client download more of anything than necessary is also bad. But there's a clear benefit to having it all cached.

Factors that will influence the numbers:

  1. Round-trip time
  2. Server response time
  3. Header size (including cookies)
  4. Caching technique
  5. Browser (see http://browserscope.com)

Balancing parallel downloading against differing cache requirements is another factor to worry about. This was partially covered recently by Kyle Simpson here: http://www.reddit.com/r/javascript/comments/j7ng4/do_you_use_script_loaders/c29wza8
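Several of the factors above can be observed directly in the browser. As a rough sketch, assuming Resource Timing API support (availability varies by browser, and cross-origin resources may report zeroed timings without a Timing-Allow-Origin header):

```javascript
// Inspect per-script timing to see how much of each download was fixed
// cost (waiting on the server) versus actual transfer time.
var entries = performance.getEntriesByType('resource');
entries.forEach(function (entry) {
  if (entry.initiatorType !== 'script') return;
  console.log(
    entry.name,
    'wait:', Math.round(entry.responseStart - entry.requestStart), 'ms,',
    'total:', Math.round(entry.duration), 'ms'
  );
});
```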