Why does `Lua::context` not take `&mut self`?
Closed this issue · 8 comments
I do not understand why `Lua::context` and the other `gc_*` methods take `&self` instead of `&mut self`. Using the context may modify the Lua state, so I find this weird.
In addition, tell me if I am wrong, but making these methods take `&mut self` would allow implementing `Sync` on `Lua`, while still requiring a mutex to use the state-modifying methods.
The reason that `Lua` methods don't take `&mut self` is at least partially a matter of history: the API was not originally based on `Context` callbacks, and everything that is now in `Context` used to be in `Lua`.
The design of the main API (the parts now in `Context`) would not work at all if it required `&mut` references, as every handle type must hold a reference to the `Context` it came from. `Context` methods can't take `&mut self`, and `Context` can't be `Sync`; otherwise you wouldn't be able to hold a handle while also doing anything with it. This is not just an artifact of how the lifetimes are set up: Lua is massively internally mutable, so doing almost anything can result in almost anything being mutated out from underneath you. Taking shared references and being honest about the internal mutability (everything is `!Sync`) is the only API I could find that made any sense.
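To make the borrow-checker argument concrete, here is a minimal sketch; `Context`, `Table`, `create_table`, and `gc_collect` are illustrative stand-ins, not the real rlua API:

```rust
// Illustrative sketch only -- these names are stand-ins, not the real rlua API.
struct Context {}

// Every handle borrows the Context it came from.
struct Table<'ctx> {
    ctx: &'ctx Context,
}

impl Context {
    fn create_table(&self) -> Table<'_> {
        Table { ctx: self }
    }

    // A shared-reference method can be called while handles are alive.
    fn gc_collect(&self) {}

    // If this took `&mut self` instead, calling it while `t` below is
    // alive would be a borrow-check error:
    // fn gc_collect(&mut self) {}
}

fn main() {
    let ctx = Context {};
    let t = ctx.create_table(); // `t` immutably borrows `ctx`
    ctx.gc_collect();           // fine: just another shared borrow
    let _ = t;                  // `t` is still alive here
}
```

With `&mut self` methods, holding any handle would lock you out of the entire `Context` API, which is why shared references are the only workable shape.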
Now that `Lua` is mostly just global setup functions and the `context` entry function, I could in theory make them take `&mut self`, but why? Right now it is perfectly safe to call into multiple independent `Context`s; in fact, callbacks spawn a separate `Context` every time they're called, so why would I want to forbid the user from doing the same thing through the top-level `Lua` struct? I could do this, but it would be a completely artificial limitation, and it wouldn't really save the user of the API from any potential misuse either: Lua just is internally mutable, everywhere, all the time.
I could make `Lua` `Sync` if I did this, but it wouldn't buy you anything. Every method of `Lua` would then have to take `&mut self` for soundness, and you end up in the same place: `Lua` would become `Sync` only because `&Lua` would become worthless.
So I don't really see any benefit? Is there a benefit I'm missing somewhere?
Thank you for the detailed response.
As for the benefits of making `Lua` `Sync`: for instance, I use an ECS framework that wraps every resource in a `RefCell` and requires `T` to be `Send + Sync`. So I would have to wrap `Lua` in a `Mutex`, which would then be wrapped in a `RefCell`... not optimal, right?
However, I understand the benefits of not making the methods take `&mut self`, and I think I agree.
It seems a bit odd that the ECS would put resources inside a `RefCell` and also require them to be `Sync`? `RefCell`, `Ref`, and `RefMut` are all blanket `!Sync`... so I don't get it?
I may lack some knowledge of how it works, but let's say it accepted `Lua` in just a `RefCell`: multiple threads could immutably borrow `Lua` at the same time, and because `Lua`'s methods are not `&mut`, that would let the different threads call `context` concurrently and cause undefined behavior, right?
> For the benefits of making `Lua` `Sync`, well, for instance I use an ECS framework that encapsulates every resource in `RefCell`s, and requires `T` to be `Send + Sync`. So I will encapsulate `Lua` in a `Mutex`, which would be encapsulated in a `RefCell`... not optimal, right?
The confusing part is that it requires both `RefCell` and `Sync`: that combination is not `Send` or `Sync`, so requiring `Sync` of the inner type seems pointless. Maybe I'm missing something?
What it should allow you to do is put `Lua` in just a `Mutex`, which makes the combination `Sync`; or, if `Sync` isn't actually required, in just a `RefCell`, and then drop the inner `Sync` bound.
Which ECS is this?
The ECS framework is `specs`. The implementation of the resource sharing is in `shred`.
I'm missing some knowledge of how this is implemented, but I think what it does is: for each system to dispatch, it takes a mutable or immutable reference to the required resources and hands it to the system. The dispatcher only runs in parallel those systems that don't require the same resource, or that require only immutable references to the same resource (so `RefCell::borrow_mut` never fails).
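As an illustration only (this is not `shred`'s actual code), that scheduling rule boils down to a conflict check like the following, where two systems may run in parallel unless one writes a resource the other touches:

```rust
// Hypothetical sketch of the scheduling rule -- not shred's real implementation.
#[derive(Clone, Copy, PartialEq)]
enum Access {
    Read,
    Write,
}

// Two systems conflict if they share a resource and at least one writes it.
fn conflict(a: &[(&str, Access)], b: &[(&str, Access)]) -> bool {
    a.iter().any(|(res_a, acc_a)| {
        b.iter().any(|(res_b, acc_b)| {
            res_a == res_b && (*acc_a == Access::Write || *acc_b == Access::Write)
        })
    })
}

fn main() {
    let physics = [("Positions", Access::Write), ("Time", Access::Read)];
    let render = [("Positions", Access::Read)];
    let clock = [("Time", Access::Read)];

    assert!(conflict(&physics, &render)); // Write vs Read on Positions
    assert!(!conflict(&render, &clock));  // read-only / disjoint: parallel OK
}
```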
Oh, okay, I'm sure whatever `shred` is doing internally is probably correct. Your issue makes sense now, though, and I understand better why you might want this.
It's a tad inconvenient that specs always requires resources to be `Sync`, but you can keep the resource in a `Mutex`, give your system mutable access, and use `Mutex::get_mut` to get at the inner type with a guarantee of no locking. It's kind of a weird use of `Mutex` just to get rid of a `Sync` requirement, but it works. Maybe this is actually what you were already doing?
I agree it's a bit silly to have to do this, but I guess in specs there is no way to say that a resource can only ever be used mutably, so the `Sync` bound is required?
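A minimal sketch of that `Mutex::get_mut` trick, with a stand-in `!Sync` type in place of the real `Lua`:

```rust
use std::cell::Cell;
use std::sync::Mutex;

// Stand-in for a !Sync resource such as Lua (Cell<T> is !Sync but Send).
struct FakeLua {
    calls: Cell<u64>,
}

fn main() {
    // Mutex<FakeLua> is Sync, so it satisfies a `Send + Sync` resource bound.
    let mut resource = Mutex::new(FakeLua { calls: Cell::new(0) });

    // With `&mut Mutex<T>` we provably have exclusive access, so
    // `get_mut` hands back `&mut T` without taking the lock at all.
    let lua = resource.get_mut().unwrap();
    lua.calls.set(lua.calls.get() + 1);

    assert_eq!(resource.get_mut().unwrap().calls.get(), 1);
}
```

The design choice here is that exclusivity is proven statically by the `&mut` reference, so no runtime locking (and no possibility of deadlock) is involved.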
No, the only way to say that a resource can only be used mutably is to make every useful method of the resource take `&mut self`.
Good idea with `Mutex::get_mut`, I did not know about that. Thanks a lot!