modules aren't cached
akoskm opened this issue · 7 comments
Just discovered this while require()-ing modules that hold some kind of state. Requiring a module from the webworkify-ed web worker loads the module again instead of returning it from the cache.
Something like:
src/beam.js
var Store = require('../model/store');
Store.put(item); // put something into the store
src/worker.js
var Store = require('../model/store');
Store.list() // list store items, store is empty
Because, from the Node.js docs on module caching:
Modules are cached based on their resolved filename. Since modules may resolve to a different filename based on the location of the calling module (loading from node_modules folders), it is not a guarantee that require('foo') will always return the exact same object, if it would resolve to different files.
especially:
Since modules may resolve to a different filename based on the location of the calling module (loading from node_modules folders)
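For reference, in plain Node (outside a browserify bundle) that cache is keyed by the resolved filename, which is why I expected both files to get the same object; a minimal sketch:

```js
// Plain Node illustration of the quoted caching behavior
var store = require('../model/store');
var resolved = require.resolve('../model/store');

// The cache is keyed by the resolved filename, so a second require()
// of the same path returns the exact same exports object.
console.log(require.cache[resolved].exports === store); // true
console.log(require('../model/store') === store);       // true
```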
But because both beam.js and worker.js are in the same folder and their require statements are identical, I can only speculate that the way these workers are loaded (new Blob) somehow makes a difference in the caching.
Any ideas?
https://github.com/substack/webworkify/blob/master/index.js#L63
If I understand correctly, the plugin takes the dependencies from the web worker script, concatenates them, and feeds the result to new Blob, basically creating a new script.
So in my example it doesn't really require('../model/store'); in fact it skips the entire caching mechanism and only takes the script and concatenates it with the rest of the require statements and the main web worker script. And this is the reason why the module is loaded again instead of being returned from the cache.
Does that make sense?
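Roughly, this is what I picture happening (just a sketch of my understanding, not webworkify's actual code; the helper name is made up):

```js
// Sketch: the worker module plus its dependencies get bundled into a
// brand-new script string and loaded through a Blob URL, so the worker
// evaluates its own copies of every module instead of reusing the
// main bundle's cached instances.
var src = collectWorkerSources();            // hypothetical helper
var blob = new Blob([src], { type: 'text/javascript' });
var worker = new Worker(window.URL.createObjectURL(blob));
```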
This is the expected behavior — the worker bundle is isolated from the one in the main thread, and the only way of communicating between the two is using the WebWorker postMessage mechanism. So of course modifying a module in one will not be reflected in the other.
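A minimal sketch of that setup (assuming webworkify's documented usage, where the worker module exports a function that receives self):

```js
// main.js
var work = require('webworkify');
var worker = work(require('./worker.js'));

worker.addEventListener('message', function (ev) {
  console.log('items in the worker copy of the store:', ev.data);
});
worker.postMessage({ type: 'put', item: { id: 1 } });

// worker.js
module.exports = function (self) {
  var items = []; // state that lives only in the worker
  self.addEventListener('message', function (ev) {
    if (ev.data.type === 'put') {
      items.push(ev.data.item);
      self.postMessage(items);
    }
  });
};
```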
Thanks @mourner. I'm working on an application written in an OOP fashion. I have to use constructors in the worker that require some stateful objects, and I had problems posting those as messages (DATA_CLONE_ERR). I can't really find a way to share in-memory objects with the worker, probably because there isn't one.
@akoskm yes, you'll have to introduce serialization to and from JSON for your objects for this. We do this in https://github.com/mapbox/mapbox-gl-js.
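Something along these lines, as a sketch (the Store shape here is illustrative, not actual mapbox-gl-js code):

```js
// A constructor whose state can round-trip through JSON
function Store(items) {
  this.items = items || [];
}
Store.prototype.put = function (item) { this.items.push(item); };
Store.prototype.serialize = function () {
  return JSON.stringify({ items: this.items });
};
Store.deserialize = function (json) {
  return new Store(JSON.parse(json).items);
};

// main thread: post only plain, cloneable data
var store = new Store([{ id: 1 }]);
worker.postMessage({ type: 'store', payload: store.serialize() });

// worker: rebuild a real instance from the posted data
self.addEventListener('message', function (ev) {
  if (ev.data.type === 'store') {
    var workerStore = Store.deserialize(ev.data.payload);
    // use workerStore.put(...), etc. on the worker's own copy
  }
});
```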
@mourner I've looked at the web worker parts, nicely done! Surprisingly, I'm also working on a WebGL application.
I have a few questions; if you could answer them I would really appreciate it.
So the part where you post messages to the actual workers happens here:
I also see that here: https://github.com/mapbox/mapbox-gl-js/blob/2924b910d7a9a0a12e0d71bd0771f7529acfade8/js/util/web_worker.js#L38 you set workerBus.target = parentBus, where parentBus is an instance of MessageBus, which also contains function objects. Shouldn't the structured clone algorithm throw DATA_CLONE_ERR when you try to post this.target to the web worker:
postListeners[i]({data: data, target: this.target});
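A quick illustration of the error I ran into when a posted object carries functions (any real worker script will do here):

```js
var worker = new Worker('worker.js');
try {
  // Functions can't go through the structured clone algorithm
  worker.postMessage({ data: 1, target: { addEventListener: function () {} } });
} catch (e) {
  console.log(e.name); // "DataCloneError"
}
```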
@akoskm that file is purely for testing in Node; the browser one is much simpler: https://github.com/mapbox/mapbox-gl-js/blob/2924b910d7a9a0a12e0d71bd0771f7529acfade8/js/util/browser/web_worker.js
Oh yeah, I just realized that the code I grabbed from https://github.com/mapbox/mapbox-gl-js/blob/2924b910d7a9a0a12e0d71bd0771f7529acfade8/js/util/web_worker.js#L38 is actually postListeners, not postMessage.
I see, you're using a WorkerPool, every Actor has a worker, and this is where posting messages happens.
Looks like I must find a way to serialize the current state because I can't just throw objects into web workers and expect them to work.
Thanks @mourner. 😃