5 votes

We are creating a framework that we intend to use across multiple projects. All projects will use require.js to manage modules and dependencies.

Ideally I'd like to use the r.js optimizer to compile the framework into a single file that can be provided to the applications that use it. That file will contain all modules of the framework such that in my application I can write code like:

define(["framework/util/a", "framework/views/b"], function(A, B) {
  var a = new A();
  // etc...
});

But it appears there are two problems with this approach.

  1. Depending on framework/util/a doesn't tell require.js that it needs to load framework.js, the file in which it would actually find util/a.
  2. The optimizer generates names for the modules it bundles into framework.js, such as define("util/a", function() { ... });. Even if require.js did load framework.js, nothing tells it that the module defined as util/a is relative to framework and should therefore be resolved as framework/util/a (as sketched below).
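
To make point 2 concrete, here is a rough sketch of the mismatch, using the same example module names as above:

// framework.js as emitted by r.js: module IDs are relative to the
// framework's own baseUrl, so the "framework/" prefix is missing
define("util/a", [], function() {
  // ...
});
define("views/b", ["util/a"], function(A) {
  // ...
});

// the application, however, asks for "framework/util/a", an ID that never
// appears in framework.js, so require.js tries to fetch framework/util/a.js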

Am I missing something, or would a better approach be to structure my framework as a CommonJS package and use require.js's packages configuration option?
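
For reference, the packages configuration mentioned above looks roughly like this (the location and main values are placeholders for wherever the framework package actually lives):

require.config({
  packages: [
    {
      name: "framework",        // prefix used in dependency IDs, e.g. "framework/util/a"
      location: "../framework", // package directory relative to baseUrl (placeholder)
      main: "main"              // module loaded when "framework" itself is required
    }
  ]
});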

I don't think there is a reasonable way to do this. From @jrburke: "just 'distribute modules in a directory in source form', no build." – rharper

1 Answer

2 votes

Re: 1. It seems that r.js optimisation was indeed not designed to optimise partial dependency trees, since lazy loading hinges on file paths; asking for path/to/module and having that actually load path/to would amount to a hack. One solution is to forgo lazy loading and include framework-built.js ahead of your application code.
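
A minimal sketch of that, assuming the built bundle is named framework-built.js and sits under the application's baseUrl (both names are assumptions on my part). The deps array makes require.js fetch the bundle eagerly, so its define() calls are registered before any application module asks for them:

require.config({
  baseUrl: "js",
  // load the pre-built framework bundle up front instead of lazily
  deps: ["framework-built"]
});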

Re: 2. You'll then need your framework-built.js to contain the full module paths. One way to get them is to build a dummy parent module that requires all of framework, say dummy-framework.js. The resulting dummy-framework-built.js will contain defines with the full framework/... paths, and as long as it isn't lazily loaded, it should work fine.
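
A rough sketch of that idea, reusing the illustrative module names from the question (a real framework would list every module it exposes):

// dummy-framework.js -- aggregator that depends on everything the framework exposes
define([
  "framework/util/a",
  "framework/views/b"
  // ...and so on for the rest of the framework
], function() {});

// build.js -- r.js build profile; because baseUrl sits one level above the
// framework/ directory, the bundled defines keep their full "framework/..." IDs
({
  baseUrl: ".",
  name: "dummy-framework",
  out: "dummy-framework-built.js"
})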

Disclaimer: I haven't used require.js all that much, though that's my best effort :)