rust-lang/trait-system-refactor-initiative

`associated_type_defaults` are unsound in the new solver

compiler-errors opened this issue · 8 comments

rust-lang/rust#110673 attempted to make alias bounds sound in the new solver by only assembling alias-bound candidates for a projection (such as `<A as B>::C`) when the trait reference the projection comes from (`<A as B>`) can be satisfied via a param-env candidate.

This is problematic when checking whether an associated type default is well-formed, because, within the trait definition, we assume that `Self: Trait<..>` holds.

```rust
// compile-flags: -Ztrait-solver=next
#![feature(associated_type_defaults)]

trait Tr {
    type Ty: Copy = String;

    fn clone(x: &Self::Ty) -> Self::Ty {
        *x
    }
}

impl Tr for () {}

fn main() {
    let x = String::from("hello, world");
    let y = <() as Tr>::clone(&x);
    drop(x);
    println!("{y}");
    // ^^ prints the contents of a freed `String`: a use-after-free
}
```

During `check_type_bounds` in the above program, we must prove:

```
[
    Obligation(predicate=Binder { value: TraitPredicate(<<Self as Tr>::Ty as std::marker::Sized>, polarity:Positive), bound_vars: [] }, depth=0),
    Obligation(predicate=Binder { value: TraitPredicate(<<Self as Tr>::Ty as std::marker::Copy>, polarity:Positive), bound_vars: [] }, depth=0),
]
```

given the caller bounds of:

```
[
    Binder { value: TraitPredicate(<Self as Tr>, polarity:Positive), bound_vars: [] },
]
```

This means that the check implemented in `EvalCtxt::validate_alias_bound_self_from_param_env` incorrectly succeeds.

lcnr commented

This feels somewhat unsolvable until we fully enable coinductive trait goals and check impl where-clauses, instead of the current hack that disables alias-bound candidates for goals proven via impls.

It seems that the example provided does not compile anymore.

@TroyKomodo: When the test says `// compile-flags: -Ztrait-solver=next`, that is a UI-test directive. Such flags are not applied automatically in the Rust playground, only in Rust's internal test suite; you need to pass them to the compiler manually if you want to reproduce this on your own.

For the record, this still is unsound on nightly: https://godbolt.org/z/rWr78rbTf

Ah I see, sorry @compiler-errors my bad. Thanks for the insight 👍

pitaj commented

I'm not associated with the initiative, but I am interested in this issue. So I may be entirely off-base here.

Would it not be sufficient to prove that associated type defaults are valid at the trait definition itself, rather than when the trait is used?

I assume this has been considered but it's not clear to me why it wouldn't work. It must have something to do with alias bounds - are associated types treated as type aliases internally?

> Would it not be sufficient to prove that associated type defaults are valid at the trait definition itself, rather than when the trait is used?

Sorry, but what led you to believe that was not currently the case?

We already check that the associated type default is valid in the definition of the trait. This issue is pointing out a specific limitation in that approach that has to do with the way we treat assumptions in the new trait solver.

pitaj commented

I guess my misunderstanding was deeper than I thought. The example and description led me to believe that the check wasn't happening at the definition because it had been pushed to a later point.

It makes more sense on the second read-through, sorry.

This got fixed. We can reopen it if it's possible to side-step the fix, but there's no reason to keep it open now.