Alternatives and Environment Matching
jkrems opened this issue · 43 comments
One question that @guybedford brought up was how this proposal could handle things like different implementations depending on the platform.
Known Use Cases
Browser/Node Split
A package that uses very specific DOM (or node) APIs and wants to provide one export for all browsers and another one for all versions of node. Examples:
- TBD
Node Legacy Build
A package that uses newer node APIs (or JS features) wants to provide a clean export for those while still supporting older versions of node by pointing them to a pre-compiled version of its source. Examples:
- `yarn`'s legacy build.
Browser Legacy Build (dropped from scope)
A package that includes code using the latest JS features but also wants to provide code that can be imported as-is into slightly older browsers to support a compilation-less experience. Examples:
- TBD
Production Build
A package that has a lot of development-specific code that would be too expensive to run in a production environment. As part of its publishing process, it generates an optimized production build but also wants to provide the original development sources for their better debugging abilities. Examples:
Compilation Cache
A package wants to be up-and-running asap, so it ships a precompiled version of its source, be it a binary AST, V8 compilation cache, etc. This cache may be very specific to the engine and its version, so it wants to fall back to the original code if loading the cache fails. Examples:
- `yarn`'s use of `v8-compile-cache`
Prior Art
- Import map's array of alternative URLs
- The `browser` field
- The `react-native` field
- Browserslist
- Editions
- The `engines` field
- The `syntax` field
- Import map feature tests
Jan's Strawman
The idea is to embrace the import map format of specifying an array of alternatives. At its simplest, it's mostly useless:
```js
{
  // "Try each of these URLs, use the first that exists or is supported."
  "exports": ["./unpublished-file.mjs", "./lib/real-file.mjs"],
  // [...]
}
```
One possible use of this form would be to expose V8 compile caches etc., falling back to the original script. But it becomes more interesting when we allow replacing any element in the array with an object describing a set of constraints satisfied by this variant:
```js
{
  "exports": [
    {
      "node": "6",
      "from": "./lib/real-file.mjs"
    },
    "./build/es5.mjs"
  ],
  // [...]
}
```
Multiple constraints can be specified for each element. The constraints themselves are considered host-defined. The only reserved property name is `from`. The examples below will use the following constraints:

- `node`: The (minimum) node version this variant supports.
- `browserslist`: A browserslist expression describing the set of browsers this variant supports.
- `production`: A boolean constraint that specifies that this is an optimized build meant for production (e.g. `NODE_ENV=production`).
When evaluating the constraints, the following rules should be followed:
- Entries are evaluated in the order they appear in the array, one at a time.
- String entries are always considered matching.
- Unrecognized constraints must be ignored.
- If none of the constraints is recognized, the entry must be considered incompatible.
- If there are recognized constraints and they are compatible with the targeted environment, the entry is picked.
- A picked entry acts just like a simple string entry: Other entries will only be considered if loading the resource from this path fails.
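As a sanity check of these rules, a host-side resolver might look roughly like the sketch below. Everything here is hypothetical: `resourceExists` stands in for "loading the resource succeeds", the shape of `env` is made up, and the host is assumed to recognize only the `node` and `production` constraints (so `browserslist` counts as unrecognized).

```javascript
// Sketch of the evaluation rules above for a hypothetical node host.
// The host recognizes only 'node' and 'production'; 'browserslist'
// (or anything else) counts as an unrecognized constraint.
const RECOGNIZED = new Set(['node', 'production']);

function pickExport(entries, env, resourceExists) {
  for (const entry of entries) {
    // Rule: string entries are always considered matching.
    if (typeof entry === 'string') {
      if (resourceExists(entry)) return entry;
      continue;
    }
    const keys = Object.keys(entry).filter((k) => k !== 'from');
    const recognized = keys.filter((k) => RECOGNIZED.has(k));
    // Rule: if none of the constraints is recognized, the entry is
    // incompatible; unrecognized constraints are otherwise ignored.
    if (recognized.length === 0) continue;
    const matches = recognized.every((key) =>
      key === 'node'
        ? env.node >= Number(entry.node)
        : entry.production === env.production
    );
    // Rule: a picked entry still falls through if loading fails.
    if (matches && resourceExists(entry.from)) return entry.from;
  }
  return null;
}
```

For example, on a node 12 host an entry constrained only by `browserslist` would be skipped entirely, while `{ "node": "6", "from": ... }` would match.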
Jan's Strawman: Applied to Use Cases
Browser/Node Split
```json
{
  "exports": [
    {
      "browserslist": "100%",
      "from": "./lib/browser.mjs"
    },
    "./lib/node.mjs"
  ]
}
```
Browser and Node Legacy Build
```json
{
  "exports": [
    {
      "node": "6",
      "browserslist": "latest 2 Chrome, latest 2 Firefox",
      "from": "./lib/modern.mjs"
    },
    "./build/es5.mjs"
  ]
}
```
Production Build
```json
{
  "exports": [
    {
      "production": false,
      "from": "./lib/index.mjs"
    },
    "./build/optimized.mjs"
  ]
}
```
The issue with user agent sniffing is that it goes stale very quickly. For example, `latest 2 Chrome` is a moving target. This is fine in a build configuration that is updated daily, but for a package published on npm, it's a recipe for something to break in the future.
For this reason I'm not sure how I feel about version-based matching being designed into the system at such a low level.
This is why I tend to prefer targets being specific environment conditions, which can be known to be either "on" or "off".
User agent sniffing is generally a last resort when dealing with deployed code; feature detection is much more robust here.
Feature detection without code execution though is its own can of worms.
Throwing out a new idea here - one option here might be to permit a feature detection module to be defined which can itself determine if the condition should pass:
```json
{
  "exports": {
    "./feature.js": "./main-feature.js",
    "default": "./main.js"
  }
}
```
where if the condition is a "plain name" it is an environment condition, and where it is a "relative name", it is a local condition module whose default export is treated as a boolean.
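To make that concrete, the body of such a condition module could be as small as a single environment check. This is only a sketch; the exact contract for condition modules is hypothetical:

```javascript
// Sketch of what a condition module like ./feature.js might contain.
// In the actual module this would end with `export default isBrowser`,
// and the host would treat the default export as the condition result.
const isBrowser =
  typeof window !== 'undefined' && typeof window.document !== 'undefined';

console.log(isBrowser); // false under plain Node, true in a browser
```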
What might help further as well is to try to think in terms of very tangible use cases, over the general problem space.
(where I'm saying I'm not so sure about use cases 1 and 2 in #29 (comment))
> For example, `latest 2 Chrome` is a moving target.
I think it's important here that "this code supported the last 2 versions of Chrome at the time of publishing" will only break if Chrome removes support for features in the future, which is a pretty big "if". Building an exhaustive list of features that are being used is a worthy ideal but I'm not sure it's realistic outside of auto-generating such a field.
> Throwing out a new idea here - one option here might be to permit a feature detection module to be defined which can itself determine if the condition should pass
If that check can come from a library, maybe. But even then it would most likely lead to weird boilerplate code for things like "this is the production build targeting modern browsers and node 8+". But it may allow us to just punt on this and push everything into userland.
> I think it's important here that "this code supported the last 2 versions of Chrome at the time of publishing" will only break if Chrome removes support for features in the future, which is a pretty big "if". Building an exhaustive list of features that are being used is a worthy ideal but I'm not sure it's realistic outside of auto-generating such a field.
It's a moving target in that we now have to read time-based package.json metadata in combination with this field, and then work out which Chrome versions these were, based on the date. Publish time might not even be a reliable metric either; a mirror might have changed it, for example (just as file timestamps are not reliable).
A version reference like `^chrome@32` is certainly fine though, but these are also harder for the Node.js project, where versions can make breaking changes. If a user has `node@^12` and the package works fine in Node 14, how can users override this? Versions just bring up a lot of problems that will require other mechanisms to fix.
Rather, I would like us to explicitly understand the use case around version matching, with specific example relating to future workflows where es modules are the baseline, and how they help. Personally I think that the standard workflow in future will always be to compile the evergreen features to the "compatibility target", and that publishing evergreen syntax is an unnecessary goal, just like publishing evergreen code to the web is an impossible goal due to supporting older browsers.
Let me take an example of a library I was working on recently: it was using the global `URL` object. So the library was compatible with node 10+ and browsers. For older versions of node (8 and older), it would have needed a `URL` object injected from a polyfill. For older browsers I wouldn't have wanted to include my own `URL` implementation. I would've rather broken the library (unless the app was loading a proper global `URL` polyfill) so I wouldn't risk needlessly including a huge chunk of code.
There are 3 possible consumers here:

- The module loader in node (we'll assume all involved node versions have `import` support).
- A bundler trying to assemble builds for different predefined sets of browsers.
- A dev server trying to serve the right (ad-hoc) import map based on the browser requesting the page.
My package provides `getCurrentURL`, a function that returns a `URL` object with the current URL being served. The implementation for node is in `lib/current-url.mjs`. A proxy of it providing a URL polyfill is in `lib/node-6.mjs`. A browser implementation is in `lib/browser.mjs`.
If there were a feature test script, how would I make sure that the right version of each file is picked, especially in the bundler case? Would the bundler build a fake global environment using JSDOM, representing each possible browser targeted by the build? That seems fairly unrealistic.
Under the strawman above, I can solve this use case with the following `package.json#exports` value:
```json
[
  {
    "node": 10,
    "from": "./lib/current-url.mjs"
  },
  {
    "browserslist": "100%",
    "from": "./lib/browser.mjs"
  },
  "./lib/node-6.mjs"
]
```
The better way to handle this workflow is to have `URL` as a module, not a global. Globals won't be used anymore in a world of standard modules in the browser. So rather you'd just have `import { URL } from 'url'` and be done with it. The build tool would know that `url` is Node.js's standard library and include the minimal browser adaptor for it.
Note you also didn't include support for IE11 :)
In the case where `URL` was, say, a browser-specific feature, where you needed a custom fork in Node.js, then you might do something like:

```js
import URL from 'url';
```
and then have an exports map that maps the actual plain name:
```json
"exports": {
  "url": {
    "browser": "std:url",
    "node": "./node-url-version.js"
  }
}
```
Within that node file, if we had top-level await, then conditional polyfilling could be done too. (And top-level await is making progress...)
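A sketch of what that conditional polyfilling could look like inside the node-specific file. Top-level await isn't assumed here, so the selection is wrapped in an async function; with top-level await the wrapper would disappear and the module could export the result directly. `some-url-polyfill` is a made-up package name:

```javascript
// Sketch: pick the native URL if present, otherwise load a polyfill.
async function loadURL() {
  if (typeof globalThis.URL === 'function') {
    return globalThis.URL; // native implementation is available
  }
  // 'some-url-polyfill' is a hypothetical package name.
  const mod = await import('some-url-polyfill');
  return mod.URL;
}

loadURL().then((URLImpl) => {
  console.log(typeof URLImpl); // "function" on any recent node
});
```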
> Note you also didn't include support for IE11 :)
Right, on purpose! Because otherwise the bundler may actually include my polyfill and another polyfill. It's a feature, not a bug of that config. A simple feature test would have forced me to include superfluous code if the app is doing a global polyfill (which it may).
> Right, on purpose! Because otherwise the bundler may actually include my polyfill and another polyfill. It's a feature, not a bug of that config. A simple feature test would have forced me to include superfluous code if the app is doing a global polyfill (which it may).

Ahh, your polyfill is in `browser.mjs`? Your users might be annoyed to find out they included a polyfill they didn't want.
Are we at all certain that standard modules for built-in features will become a thing for every single feature? Also, that doesn't really cover language features, only importable APIs.
> Ahh, your polyfill is in `browser.mjs`? Your users might be annoyed to find out they included a polyfill they didn't want.
No, but I'm including a polyfill on older versions of node that don't have the global (or not even `require('url').URL` for node 6).
In addition the array maps could be useful to reserve for passing directly to import maps in the browser...
If what we interact with is covered by a standard module, we don't have to get fancy:
```json
"exports": {
  "url": ["std:url", "./node-url-version.js"]
}
```
This would imply though that `exports` allows overriding imports inside of the package, as opposed to only imports of the bare specifier itself.
Btw, for node purposes we can keep the "browserslist" as out of scope. That would be a bundler (and potentially dev server) concern. And there compilation is happening anyhow, so version ranges are less interesting.
Node versions, on the other hand, are more tricky:

- I don't think it's practical to compile away decorators and other future JS features on install or on run. And node will continue to add new APIs, and I don't think it's realistic that each feature will be exposed as exactly one module; see recursive `mkdir`.
- `NODE_ENV` checks ("production builds") will stay relevant, and unless we say "it's impossible to run a node app in production without bundling it first with 3rd-party tools", we should have a story for it. The story could be top-level await and hoping that browser bundlers can optimize it away, but it's a hack. And it's still super painful in the `export * from './dev.js'` case (multi-export).
I'm fine updating my examples with `browser: true` instead of `browserslist: "100%"`. It would encourage people to publish only the build for current evergreen browsers and make build tooling responsible for ensuring proper compilation, which I think is fair.
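Applied to the earlier browser/node split, that tweak would look something like this (a sketch only):

```json
{
  "exports": [
    {
      "browser": true,
      "from": "./lib/browser.mjs"
    },
    "./lib/node.mjs"
  ]
}
```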
Prior art to be aware of: https://blog.meteor.com/meteor-1-7-and-the-evergreen-dream-a8c1270b0901. It deals with many of the same issues.
Also, I wouldn't let users define custom top-level fields; that just gives us the kind of headaches we have now with trying to recreate `"module"`. Instead:
```json
{
  "exports": [
    {
      "path": "/foo",
      "from": "./lib/foo.mjs",
      "target": {
        "node": "6"
      }
    }
  ]
}
```
Hey, I'm a big fan of this strawman. I think it could work for other things too, such as testing for the existence of another built-in module, or the presence of some property of the global object. Imagine if BigInts were in a module `std:bigint`, and we have both a full polyfill, as well as a polyfill for just BigInt64Array/BigUint64Array. We could include a list like this:
```json
{
  "exports": [
    {
      "moduleExists": "std:bigint",
      "from": [
        {
          "modulePropertyExists": "std:bigint.BigUint64Array",
          "from": "std:bigint"
        },
        "./bigint-array-polyfill.mjs"
      ]
    },
    "./full-bigint-polyfill.mjs"
  ]
}
```
In this example, we have to test for multiple conditions being true at the same time, so `from` is allowed to be its own fallback list, with its own conditionals inside of it.
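Allowing `from` to nest like that effectively makes the evaluation recursive: the outer condition is ANDed with the inner list. Roughly, as a sketch, where `satisfies` is a hypothetical host predicate over a set of constraints:

```javascript
// Sketch: resolve an entry where `from` may itself be an array of
// alternatives. Nesting ANDs the outer condition with the inner list.
function resolveEntry(entry, satisfies) {
  if (typeof entry === 'string') return entry;  // plain path: always matches
  if (Array.isArray(entry)) {                   // fallback list: first match wins
    for (const alternative of entry) {
      const found = resolveEntry(alternative, satisfies);
      if (found !== null) return found;
    }
    return null;
  }
  // Conditional entry: check its constraints, then recurse into `from`.
  const { from, ...constraints } = entry;
  return satisfies(constraints) ? resolveEntry(from, satisfies) : null;
}
```

With a host where `std:bigint` exists but `BigUint64Array` does not, the example above would resolve to `./bigint-array-polyfill.mjs`.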
Okay, I'm starting to come around to @guybedford's position of "can we do feature testing instead of version constraints", especially if we'd have shorthands for some feature sets. E.g. for syntax features like ES20XX..?
> E.g. for syntax features like ES20XX..?
That would resemble Babel's presets that users are already familiar with.
Though don't forget ES modules are part of ES2015 😢
Well, it's the babel presets people are actively encouraged to migrate away from right now... The current recommended approach is using `babel-preset-env` with a browserslist expression:
> As of Babel v6, all the yearly presets have been deprecated. We recommend using @babel/preset-env instead.
>
> -- Babel Docs
But yes, anything below ES2015 wouldn't really make sense, to a certain degree.
For syntax, how about something like this?
```json
{
  "js-syntax": "import()",
  "from": "something-using-dynamic-import.mjs"
}
```
The string would be parsed as JS (as a Module?), and if there were no syntax errors, it would select the indicated module.
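One way to sketch such a parse-only check is the `Function` constructor, which throws a `SyntaxError` at construction time without executing anything. Note that this parses the sample as a function body rather than as a Module, so `import`/`export` declarations themselves couldn't be tested this way; the module names below are made up:

```javascript
// Sketch: does this host's parser accept the given syntax sample?
function supportsSyntax(sample) {
  try {
    new Function(sample); // parses only; the function is never called
    return true;
  } catch {
    return false; // a SyntaxError means the syntax is unsupported
  }
}

// Hypothetical selection between two published variants:
const entry = supportsSyntax('a?.b ?? c')
  ? './modern.mjs'
  : './fallback.mjs';
```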
I would be a bit concerned about this from a bundler perspective. E.g. a bundler/precompiler targeting environment X would have to parse the string, analyze the result, and then match it to a set of features (and then those to the target of the bundling). Definitely possible, but it seems a bit involved. We could define a `SyntaxError` during parse of the `from` file as a defining factor as well..?

P.S.: Yes, that last sentence would be even more tricky for bundlers, but it would at least also remove any uncertainty about the code and the `js-syntax` getting out of sync.
Node (e.g. requires Node APIs, N-API)

`package.json`

```js
{
  "name": "a",
  "engines": {
    "node": ">= 10"
  },
  "main": "./src", // Script (CJS)
  "exports": { // Module
    ".": "./src/index.js",
    ...
  }
}
```

Browser (e.g. requires Web APIs)

`package.json`

```js
{
  "name": "b",
  "engines": {
    "browser": "last 2 versions" // Browserslist or other feature detection
  },
  "exports": { // Module
    ".": "./src/index.js",
    ...
  },
  "browser": "./src/index.min.js" // Script (ES5, Bundled, Minified, ...) ?
}
```

Dual Mode

`package.json`

```js
{
  "name": "c",
  "engines": {
    "node": ">= 12",
    "browser": "last 2 versions"
  },
  "main": "./src", // Script (CJS)
  "exports": { // Module
    ".": "./src/index.js",
    ...
  },
  "browser": "./src/index.min.js" // Script (ES5, Bundled, Minified, ...) ?
}
```

Usage

```console
$PWD
npm engines -n|--node ">= 10" -b|--browser "chrome 74"

[WARNING] a ... requires a node environment ...
[WARNING] c ... requires node >= 12 ...
...
```
@michael-ciniawsky Right, engines is another field worth noting as a precedent above. Unfortunately your example works only for the case where the browser build is ES5 and where only a single file is involved, not different files with different requirements etc.
It also encourages the use of `engines`, which is a feature that (to me at least) was finally made completely irrelevant by `yarn`. For me the primary issue with `engines` is that it conflates "system that installs the package" with "target platform of this specific file load".
Added `engines` to the prior art.
@jkrems hm, how did `yarn` make it irrelevant? yarn chose to error on a mismatched `engines` field, unlike npm, which long ago learned it was best to only consider it advisory.
Just one quick thought re `"js-syntax": "import()"`: don't forget that many features people might want to detect are new methods on global objects, like `String.prototype.includes`. That's valid syntax even in runtimes that don't include that method.
> hm, how did `yarn` make it irrelevant? yarn chose to error on a mismatched `engines` field, unlike npm which long ago learned it was best to only consider it advisory.
Yeah, the error is what forced us to add `ignore-engines` to all our projects. There were multiple packages that said "needs node 8!" because they used modern syntax. But we never used them in node; we only used them in browser builds. So the error was just plain wrong and kept tripping people up for no good reason.
@jkrems yes, i agree that erroring on "engines" is always wrong :-) thanks for clarifying.
@GeoffreyBooth I think the argument would be that non-syntax features should be mirrored (?) by APIs that can be imported. E.g. `std:String.includes` as a super ad-hoc example.
That's a pretty hard problem to solve in a static way, given that the mere presence of API or syntax isn't sufficient - the semantics (including environment-specific bugs) of both also matter, including whether API methods are polyfilled or not, and how correctly.
Two more things to include in the list of ways to detect, along with browserslist and Babel presets, are https://node.green/ and where it gets its list, https://github.com/kangax/compat-table. That's a list of basically every feature from ES2015 onward, and which versions of Node support what.
@ljharb It may not fill all the use cases, but it's a step along the way to satisfy many use cases. I imagine dynamic checks would be possible as well to fill the gaps.
@GeoffreyBooth I'd be skeptical of basing a system on the Kangax checks specifically. These test some particular aspects, while leaving others open.
@littledan yeah I was just pointing out prior art. But if none of these can work then that's concerning. What can we use as a feature list then?
@GeoffreyBooth I don't have a good answer; I'm not really sure how to bound the work of checking for known bugs. I'd suggest starting with tests for properties/modules existing and syntax not being a syntax error.
Hey ho 👋, just wanted to say that this is a super useful feature and I'm happy that this proposal is trying to implement this kind of switch at the package level.
Regarding the engines/compatibility discussion: I think this can get quite complex and it might be too much for this proposal. For the first iteration I think it's enough to have this kind of "node", "browser" (maybe "electron") switch that we currently have. Ideally, the syntax/proposal would be extendable in such a way that we can implement this compatibility feature in a later iteration.
Looks like from the Chrome side (@mathiasbynens and @developit), there's currently a push for a `syntax` field as a shorthand for the yearly spec versions: whatwg/html#4432

See the proposed `package.json` field: https://twitter.com/_developit/status/1105582360907649024
The pushback on that thread seems pretty convincing; we should watch it closely.
Added @littledan's write-up in https://github.com/littledan/import-map-feature-tests/blob/master/README.md to the prior art.
A basic "is browser" test would potentially be `{ "if": { "global": "document" } }` when following that proposal.
Supporting arrays is as far as this proposal goes. We now have a separate proposal for a syntax to do environment matching that can be used inside of the array syntax: https://github.com/guybedford/proposal-pkg-targets/issues. Support in node for something like it could be added in the future.