Should source update algorithm run asynchronously?
kornelski opened this issue · 10 comments
Let's say we have a `<picture>` element that has some source that matches, and we change sources so that a different source starts to match.
```js
var before = img.currentSrc;
picture.insertBefore(newSource, null);
var after = img.currentSrc;
```
Is `before === after`?
Should this change be immediately observable?
As far as I understand the spec, it just says to select sources in response to DOM mutation, so the source selection algorithm has to run synchronously.
Could that cause problems? "Source thrashing" like "layout thrashing"? Is synchronous MQ evaluation a problem?
My original idea was to always schedule the algorithm to run on the next tick (so `before === after` in the above example; you'd need `setTimeout` or such to see the change), which gives implementations the ability to delay and coalesce source updates as much as they want.
The downside (for authors) is that authors who want to observe how `<picture>` selects sources can't do it easily.
The upside (for browsers) is that authors who want to observe how `<picture>` selects sources can't do it easily :)
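The next-tick idea can be sketched with a toy model in plain JS (names like `ToyPicture`, `insertSource`, and `nextTick` are hypothetical stand-ins, not the real DOM API): mutations only mark the image dirty and schedule one update task, so back-to-back mutations coalesce and the change isn't observable until the queue is flushed.

```javascript
// Toy model (not the real DOM): source selection is scheduled on the
// "next tick" rather than run synchronously on each mutation.
const taskQueue = [];

class ToyPicture {
  constructor() {
    this.currentSrc = 'old.jpg';
    this._pending = null;
  }
  insertSource(src) {
    if (this._pending === null) {
      // Schedule exactly one update task for any number of mutations.
      taskQueue.push(() => {
        this.currentSrc = this._pending;
        this._pending = null;
      });
    }
    this._pending = src; // later mutations coalesce into the same task
  }
}

function nextTick() {
  // Stand-in for the event loop running its queued tasks.
  for (const task of taskQueue.splice(0)) task();
}

const pic = new ToyPicture();
const before = pic.currentSrc;
pic.insertSource('new.jpg');
const after = pic.currentSrc;
console.log(before === after); // true: the change is not yet observable
nextTick();
console.log(pic.currentSrc);   // 'new.jpg'
```

In this model a script that wants to see the new selection has to yield back to the event loop first, which is exactly the "you'd need setTimeout or such" behavior described above.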
At least for changes to `src`, the change needs to be immediately applied if the new image is in the image cache, for Web compat. Old `srcset` also runs sync.
Possibly we could delay running the algorithm if the `img` element has a `srcset` attribute or has a `picture` parent, or some such, if running it immediately is a problem for perf. However, it could be implemented lazily, so that it's not run until the script finishes or until the script asks for `currentSrc`/`naturalWidth`/`width`/`getComputedStyle`/etc. @yoavweiss WDYT?
The actual loading of a new resource is already delayed with "await a stable state", so a script can make a bunch of changes and still trigger only a single new request once it's done.
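That coalescing can be illustrated with a small sketch (again a toy model, not spec text; `setSource` and `stableState` are hypothetical names): mutations only record the latest candidate, and the single fetch happens once stable state is reached.

```javascript
// Toy model of "await a stable state": no request is issued per mutation;
// one coalesced fetch happens after the script has finished.
let fetchCount = 0;

class ToyImg {
  constructor() {
    this.currentSrc = '';
    this._pending = null;
  }
  setSource(src) {
    this._pending = src; // just record the latest candidate, no fetch yet
  }
  stableState() {
    // Runs once, after the script has done all of its mutations.
    if (this._pending !== null) {
      fetchCount += 1;   // a single coalesced request
      this.currentSrc = this._pending;
      this._pending = null;
    }
  }
}

const img = new ToyImg();
img.setSource('small.jpg');
img.setSource('medium.jpg');
img.setSource('large.jpg');
img.stableState();
console.log(fetchCount, img.currentSrc); // 1 'large.jpg'
```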
From an impl standpoint, I don't think it would be a problem to immediately apply changes to the `src` attribute on `<img>` (as it does today, for the sake of cached images), while delaying application of changes to everything else (`<source>` mutations, `srcset` changes, etc.) till we reach stable state.
In this case, `currentSrc`/`naturalWidth`/etc. won't change and would still show their old values until stable state is reached and a new resource is loaded.
I think that would solve the Web compat problem, while protecting us from complex mutations (and complex algorithms to handle them).
I see a few possible strategies:
(1) What we have now (all changes trigger the algorithm, selection is sync, fetch awaits stable state)
(2) Make all changes trigger the algorithm, but early exit if srcset is present or if the parent is picture
(3) Make only changes to src/crossorigin run the algorithm sync; other mutations are queued or await a stable state.
I don't see (1) as being more complex than (2) or (3) in the spec.
I think the problem with (2) and (3) is that it makes a difference in which order you set the `src` attribute relative to `srcset`, or relative to insertion into `picture`:
```js
var img = new Image();
img.srcset = 'foo.jpg 2x';
img.src = 'bar.jpg';
alert(img.currentSrc); // ???
```
vs
```js
var img = new Image();
img.src = 'bar.jpg';
img.srcset = 'foo.jpg 2x';
alert(img.currentSrc); // ???
```
I think this is something we should try to avoid.
The way I think of (3), both will show bar.jpg as the `currentSrc` until stable state is reached.
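That reading of (3) can be sketched as a toy model (hypothetical names, not the real algorithm; a 2x display picking the 2x candidate at stable state is an assumption for illustration): `src` applies synchronously, `srcset` is only consulted once stable state is reached, so both attribute orders observe the same value beforehand.

```javascript
// Toy model of strategy (3): src is applied synchronously (Web compat),
// srcset is deferred until stable state.
class ToyImg {
  constructor() {
    this.currentSrc = '';
    this._srcset = null;
  }
  set src(url) {
    this.currentSrc = url;  // sync update
  }
  set srcset(value) {
    this._srcset = value;   // deferred until stable state
  }
  stableState() {
    if (this._srcset !== null) {
      // Pretend the UA picks the srcset candidate (e.g. a 2x display).
      this.currentSrc = this._srcset.split(' ')[0];
    }
  }
}

const a = new ToyImg();
a.srcset = 'foo.jpg 2x';
a.src = 'bar.jpg';

const b = new ToyImg();
b.src = 'bar.jpg';
b.srcset = 'foo.jpg 2x';

const beforeStable = [a.currentSrc, b.currentSrc];
console.log(beforeStable);               // ['bar.jpg', 'bar.jpg'] in both orders

a.stableState();
b.stableState();
console.log(a.currentSrc, b.currentSrc); // 'foo.jpg' 'foo.jpg'
```

The point is that the order-dependence problem from the two snippets above disappears: both read `bar.jpg` until stable state.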
(2) may be complex if you want to look at the node's parent, since the node may or may not be linked to its parent when the attributes are set from JS, and it will not be linked to its parent when they are set by the parser.
Yeah, interpreting (3) as "run the algorithm sync, using the data you have synchronously available to you" seems to work fine, and keeps back-compat. The selection will then update async, possibly to something different, but that's okay.
This doesn't imply that browsers will start downloading the resource in the `src` attribute immediately. They can still hold off on actually downloading until they've done async selection with all the available data.
An argument against async selection is that it makes it more annoying to feature-detect things we add in the future.
True, async selection will be hard to observe for feature detection. However, I think it may be a good thing.
The spec gives UAs some freedom to innovate which images are chosen ("In a UA-specific manner, choose one image source from source set"), so e.g. they may always choose 1x images when bandwidth is slow/expensive or prefer cached versions of images.
I'm worried that if it's too easy to inspect what the selection algorithm does exactly, then some pages will test what sources have been selected and make poor assumptions based on that. This will make UA vendors reluctant to improve the selection algorithm to avoid web-compat problems.
So I think async algorithm may be harder to abuse, and more open to innovation (e.g. if UA wants to prefer selecting images that are already in the cache, then that may require disk I/O, so shouldn't happen synchronously on the main JS thread!)
If we add new features to `<picture>` or `<source>` that need feature detection, then we can also add properties that make them feature-detectable (which may be as simple as a DOM property reflecting a new attribute).
The image cache already needs to be sync for `src`. But you may be right that sync selection is bad for innovation.
Do we have a decision here? @zcorpan & @tabatkins, what do you guys want to do here?
I'm OK with making it async in the scripted case. In the HTML parser case, we should probably do the selection sync when the `img` is inserted into its parent so that it performs well. I think this one requires a bit of surgery to the HTML spec to do it right.