We're building a non-trivial web application using Backbone, RequireJS and Handlebars, and well, I'm just curious. At the moment, each of our modules sorta looks like this:
define(['Backbone', 'js/thing/a', 'js/thing/b', 'js/lib/bob'], function(Backbone, a, b, bob) {
    return Backbone.Router.extend({
        // stuff here
    });
});
where thing/a and thing/b both have their own dependencies, for example on Handlebars templates, etc. What happens now is that in my main.js, all of the 'top-level' routers are loaded and initialized; each top-level router has a set of dependencies (models, views, etc.), each of which has its own dependencies (templates, helpers, utils, etc.). Basically, a big tree structure.
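For reference, the main.js side of this boils down to something like the following sketch (the router name and path are made up for illustration):

// main.js (illustrative sketch; 'js/routers/appRouter' is a made-up path)
require(['Backbone', 'js/routers/appRouter'], function(Backbone, AppRouter) {
    // instantiating the top-level router pulls in its whole dependency tree
    new AppRouter();
    Backbone.history.start();
});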
The problem in this case is that this entire tree is resolved and loaded on page load. I don't mind that per se, as we'll eventually run it through the optimizer and end up with one big single file (reducing RequireJS to basically a modularization framework). However, I am curious whether you can load things like views and templates 'on demand'.
There is the "simplified CommonJS wrapping" explained here, so I tried that:
define(function(require) {
    var Backbone = require('Backbone');
    return Backbone.Router.extend({
        doStuff: function() {
            var MyView = require('js/myView');
            new MyView().render();
        }
    });
});
However, looking at Chrome's network inspector, it seems that RequireJS somehow still loads the myView dependency on page load, even though the route that fires the doStuff handler is never triggered.
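For comparison, the only pattern I'm aware of that should genuinely defer the fetch is the array/callback form of require inside the handler; a minimal sketch, assuming the same hypothetical js/myView module:

doStuff: function() {
    // assumption: the dependency-array form is fetched asynchronously at call
    // time, rather than being scanned out of the factory body up front
    require(['js/myView'], function(MyView) {
        new MyView().render();
    });
}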
Questions:

- Is this actually possible? Are there black magicks in RequireJS that look for calls to require() without the doStuff route ever being triggered?
- Is this the theoretically correct way to go about 'on-demand', lazy loading of RequireJS modules and resources?
- Does the r.js optimizer still work as advertised if you use this notation? (A rough sketch of the build config I have in mind follows below.)
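For context, the build configuration I have in mind is roughly this (a sketch with made-up paths; as I understand it, findNestedDependencies is the r.js option that controls whether nested require() calls get traced into the build, though I may be wrong about that):

// build.js (illustrative sketch, not our real config)
({
    baseUrl: 'js',
    name: 'main',
    out: 'dist/main-built.js',
    // assumption: when false (the default), nested require() calls are left
    // as runtime requests instead of being folded into the built file
    findNestedDependencies: false
})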