Union and Intersection types are inferred when they should not
Closed this issue · 10 comments
Hi! This issue is reproducible in scalamock 7.3.0-RC2.
The problem is that things which should not compile compile just fine. (The code doesn't compile in Scala 2, so this is probably a regression.)
Minimized example below.
Compiler version
3.6.3, 3.3.5
Minimized code
trait Foo[A, B] {
  def bar(x: A => B): Unit = ()
}

def toFoo[A, B](f: A => B): Foo[A, B] = new Foo[A, B] {}

trait Baz {
  def oneArg(x: Int): String = ""
}

val baz = new Baz {}

toFoo(baz.oneArg).bar((_: String) => 123)

Output
Compiles.
As I understand it, the compiler infers the argument of .bar as a function Int & String => String | Int, when it should infer Int => String.
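For illustration, a sketch of the instantiation I believe the compiler picks (my reading, not verified against the typer output): with A := Int & String and B := String | Int, both arguments fit by the usual function variance.

// Function1 is contravariant in its parameter and covariant in its result,
// so both of these typecheck:
val f1: (Int & String) => (String | Int) = baz.oneArg          // Int => String fits
val f2: (Int & String) => (String | Int) = (_: String) => 123  // String => Int fits

// The original call is then equivalent to this explicit instantiation:
toFoo[Int & String, String | Int](baz.oneArg).bar((_: String) => 123)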
Expectation
Does not compile. Infers Int => String.
Looks as designed to me. The compiler found a way to instantiate the type parameters to make the code compile. That's the purpose of type inference. Unless it produces code that throws a CCE or something, I don't see an issue here.
If this is by design, can I somehow achieve the desired behavior, the same as in Scala 2, without providing types explicitly? (Not considering =:= or <:<.) I tried using <:< and =:=, but did not find any solution; everything just blows up.
What is the desired behavior? Your reproduction is, as is, pointless. What's the larger context?
//> using scala 3.3.5
//> using dep org.scalamock::scalamock:7.3.0-RC2
import org.scalamock.stubs._
trait Foo {
  def foo(x: Int): Int
  def foo2(x: Int, y: String): Int
}

object Main {
  def main(args: Array[String]): Unit = {
    val stubs = new Stubs {}
    import stubs._

    val f = stub[Foo]

    (f.foo _).returns((x: String) => 1L) // should not compile
    f.foo(1)
    // I expect compiler to prevent this from happening
    /*
    Exception in thread "main" java.lang.ClassCastException: class java.lang.Integer cannot be cast to class java.lang.String (java.lang.Integer and java.lang.String are in module java.base of loader 'bootstrap')
      at org.scalamock.stubs.StubbedMethod$Internal.impl(StubbedMethod.scala:279)
    */

    (f.foo2 _).returns(_ => "") // should not compile
    f.foo2(1, "2")
    // I expect compiler to prevent this from happening
    /*
    Exception in thread "main" java.lang.ClassCastException: class java.lang.String cannot be cast to class java.lang.Integer (java.lang.String and java.lang.Integer are in module java.base of loader 'bootstrap')
      at scala.runtime.BoxesRunTime.unboxToInt(
    */
  }
}
If you change the Scala version to Scala 2, you see the desired behaviour:
//> using scala 2.13.16

type mismatch;
 found   : Long(1L)
 required: Int [17:38]
type mismatch;
 found   : String("")
 required: Int [18:29]
Do you have an asInstanceOf anywhere in the chain of method calls? Including in code generated by macros?
If yes, looks like that asInstanceOf is actually unsound. If not, then we have a compiler issue.
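For illustration, a minimal sketch (independent of scalamock) of how an unchecked cast moves the error from compile time to run time, producing the same kind of CCE as the stack traces above:

val f: Any = (s: String) => s.length
val g = f.asInstanceOf[Int => Int] // compiles: the cast is not checked by the typer
g(1)                               // ClassCastException at runtime: Integer cannot be cast to String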
Yes, there are a bunch of asInstanceOfs. Using them may be unsound, but I'm still almost sure the problem is not in them; the macros are crafted very carefully and work exactly as I intend. And making such a thing without asInstanceOf is hardly imaginable to me (even with Scala 3 powers).
Also, there is a minimal example without asInstanceOf above, at the very top.
The method def foo2(x: Int, y: String): Int has an exact signature, and it corresponds to a function (Int, String) => Int, not (Int, String) => Int | String, which is merely a supertype that I don't need here.
I expect the compiler to check that such a function is provided, and the Scala 2 compiler did this well.
I'll try to explain more in depth what I'm trying to achieve; I'd be grateful if you could help me achieve it another way or prove me wrong.
Given the StubbedMethod definition:

import java.util.concurrent.atomic.AtomicReference

trait StubbedMethod[A, R] {
  def returns(a: A => R): Unit
}

object StubbedMethod {
  class Internal[A, R] extends StubbedMethod[A, R] {
    private val functionRef = new AtomicReference[Option[A => R]](None)

    // the function set here should match the signature of the method that calls it
    def returns(a: A => R): Unit = functionRef.set(Some(a))

    def impl(args: A): R = functionRef.get() match {
      case None => ???
      case Some(fun) => fun(args)
    }
  }
}
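To make the intended contract concrete, here is a hypothetical direct usage of Internal (illustrative only, not generated code), where A is the tuple of the method's parameters and R is its result type:

// A = (Int, String), R = Int, mirroring def foo2(x: Int, y: String): Int
val m = new StubbedMethod.Internal[(Int, String), Int]
m.returns { case (x, y) => x + y.length } // (Int, String) => Int, the shape I want enforced
m.impl((1, "ab"))                         // evaluates to 3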
Given the definition:

trait Foo {
  def foo2(x: Int, y: String): Int
}

I need to generate:

val foo: Foo = new Foo with scala.reflect.Selectable {
  val stub$foo2$0 = new StubbedMethod.Internal[Any, Any](..., ..., ...)
  override def foo2(x: Int, y: String): Int = stub$foo2$0.impl((x, y)).asInstanceOf[Int]
}

Now when someone uses it, there is an implicit conversion from the eta-expanded method selection to StubbedMethod, and it is done via selectDynamic, which returns Any.
This:

(foo.foo2 _).returns((x, y) => 1)

expands to:

foo.selectDynamic("stub$foo2$0").asInstanceOf[StubbedMethod[(Int, String), Int]].returns((x, y) => 1)

And now somehow this:

(foo.foo2 _).returns((x, y) => "1")

expands to:

foo.selectDynamic("stub$foo2$0").asInstanceOf[StubbedMethod[(Int, String), Int | String]].returns((x, y) => "1")

instead of expanding to, and failing compilation for, this:

foo.selectDynamic("stub$foo2$0").asInstanceOf[StubbedMethod[(Int, String), Int]].returns((x, y) => "1")

Got some help; it's not very obvious, but adding using (T, O) =:= (T, O) fixes things, and it now behaves exactly as in Scala 2.
I'm not insisting it is a bug, but please investigate it closely, and if it is by design, I would ask that the problem and its solution be added to the type inference specification/documentation.
Thank you for your time and have a good day!
trait Foo[A, B] {
  def bar(x: A => B): Unit = ()
}

def toFoo[T, O](f: T => O)(using (T, O) =:= (T, O)): Foo[T, O] = new Foo {}

trait Baz {
  def oneArg(x: Int): String = ""
}

val baz = new Baz {}

toFoo(baz.oneArg).bar((_: String) => 123)

I still don't think it's an issue.
It seems you're trying to convince type inference to do what you want, and nothing else. You had one solution to convince Scala 2, and now you need a different solution for Scala 3. In both cases, the specific types that get inferred are completely incidental as far as the spec is concerned. The specification has little to say about type inference. The only constraint is that type inference should only infer types that then typecheck. There is no guarantee, in either Scala 2 or 3, that it won't infer "something you don't like", unless that would actually fail to compile.
Also this has nothing to do with unions and intersections. Scala 2 could very well have inferred Nothing and Any instead of Int & String and Int | String. This would be just as valid. The code would typecheck after inference.
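For example (a sketch of that point, using the original minimized toFoo[A, B] from the top of the issue), instantiating the parameters explicitly with the bottom and top types also typechecks, so unions and intersections are not essential to the outcome.

// Int => String and String => Int are both subtypes of Nothing => Any,
// so this compiles just as well as the version with inferred types:
toFoo[Nothing, Any](baz.oneArg).bar((_: String) => 123)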
You cannot complain that type inference infers something that makes your code compile. That's literally its job.
Thank you for the explanation!
