microsoft/TypeScript

Enable type parameter lower-bound syntax

jyuhuan opened this issue · 51 comments

TypeScript Version: 2.1.1

Code

class Animal {}
class Cat extends Animal {}
class Kitten extends Cat {}

function foo<A super Kitten>(a: A) { /* */ }

Expected behavior:
The type parameter A has the type Kitten as lower-bound.

Actual behavior:
Compilation failure. The syntax is unsupported.

Discussion:
The upper-bound counterpart of the failed code works fine:

class Animal {}
class Cat extends Animal {}
class Kitten extends Cat {}

function foo<A extends Animal>(a: A) { /* */ }

People in issue #13337 have suggested using

function foo <X extends Y, Y>(y: Y) { /* */ }

to lower-bound Y with X. But this does not cover the case where X is an actual type (instead of a type parameter).

What is this useful for?

@RyanCavanaugh : In short, it mimics contravariance, just as extends mimics covariance.

We will try to sort an array of cats to see the necessity of this feature.

To do comparison-based sorting, we need a Comparator interface. For this example, we define it as follows:

interface Comparator<T> {
  compare(x: T, y: T): number
}

The following code shows that the class Cat has Animal as its super-class:

class Animal {}
class Cat extends Animal {}

Now we can write a sorting function that supports arbitrary Cat comparators as follows:

function sort(cats: Cat[], comparator: Comparator<Cat>): void {
  // Some comparison-based sorting algorithm.
  // The following line uses the comparator to compare two cats.
  comparator.compare(cats[0], cats[1]);
  // ...
}

Now, we will try to use the sort function. The first thing is to implement a CatComparator:

class CatComparator implements Comparator<Cat> {
  compare(x: Cat, y: Cat): number {
    throw new Error('Method not implemented.');
  }
}

Then we create a list of Cats,

const cats = [ new Cat(), new Cat(), new Cat() ]

Now we can call sort as follows without any problem:

sort(cats, new CatComparator());

We have not seen the need for contravariance so far.

Now, suppose we are told that someone has already implemented a comparator for Animals as follows:

class AnimalComparator implements Comparator<Animal> {
  compare(x: Animal, y: Animal): number {
    throw new Error('Method not implemented.');
  }
}

Since a Cat is also an Animal, this AnimalComparator is also able to handle Cats: the compare function in AnimalComparator takes two Animals as input, so we can just pass two Cats to it without any problem.

Naturally, we would want to use AnimalComparator for sort too, i.e., call the sort function as:

sort(cats, new AnimalComparator());

However, since the following two types:

  • Comparator<Animal>
  • Comparator<Cat>

are not related from the point of view of TypeScript's type system, we cannot do that.

Therefore, I would like the sort function to look like the following:

function sort<T super Cat>(cats: Cat[], comparator: Comparator<T>): void {
  // Some comparison-based sorting algorithm.
  // The following line uses the comparator to compare two cats.
  comparator.compare(cats[0], cats[1]);
  // ...
}

or as in Java,

function sort(cats: Cat[], comparator: Comparator<? super Cat>): void {
  // Some comparison-based sorting algorithm.
  // The following line uses the comparator to compare two cats.
  comparator.compare(cats[0], cats[1]);
  // ...
}

I am aware of the fact that TypeScript does not complain if I pass AnimalComparator to sort. But I would like TypeScript's type system to explicitly handle type lower-bounds. In fact, the current type system of TypeScript will let some type error closely related to this issue silently pass the compiler's check (see issue #14524).

This is the best example of contravariance I've read in a long time. Props! 🙌

As a reference point, Flow uses + and - to indicate covariance and contravariance, respectively:

class ReadOnlyMap<K, +V> {
  store: { +[k:K]: V };
  constructor(store) { this.store = store; }
  get(k: K): ?V { return this.store[k]; }
}
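For reference on the TypeScript side: optional declaration-site variance annotations (in/out) were added later, in TypeScript 4.7, and play a similar role to Flow's + and -. A minimal sketch (names are illustrative, requires TS 4.7+):

```typescript
// `out` marks V covariant: it may only appear in output positions (TS 4.7+).
interface ReadOnlyBox<out V> {
  get(): V;
}

class Box<V> implements ReadOnlyBox<V> {
  constructor(private v: V) {}
  get(): V { return this.v; }
}

// Covariance: a Box<number> is usable where a ReadOnlyBox<unknown> is expected.
const b: ReadOnlyBox<unknown> = new Box<number>(42);
console.log(b.get()); // 42
```

Note these annotations document and check variance of existing type parameters; they are not lower bounds on fresh type parameters, which is what this issue asks for.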

Hi @jyuhuan, since you probably already know this: https://github.com/Microsoft/TypeScript/wiki/FAQ#why-are-function-parameters-bivariant

I'm afraid lower bound isn't that useful in current TypeScript where variance is unsound.

Indeed, there are also cases where a lower bound can be useful without variance, like:

interface Array<T> {
  concat<U super T>(arg: U): Array<U>
}

var a = [new Cat]
a.concat(new Animal) // inferred as Animal[]

In a case like an immutable sequence container, a lower bound helps TypeScript enable patterns where the new generic type is wider than the original type.

Yet such usage still needs a fuller proposal, since TS also has union types. For example, should strArr.concat(num) yield an Array<string | number>?
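Union types can already express this particular widening in a standalone helper, which suggests one possible answer. A sketch with a hypothetical concatWide helper (not the real Array#concat signature):

```typescript
// Hypothetical helper: widens the element type to the union T | U,
// which is roughly what a `concat<U super T>` bound would produce.
function concatWide<T, U>(xs: readonly T[], ...ys: U[]): (T | U)[] {
  return [...xs, ...ys];
}

const cats = ["cat"];
const mixed = concatWide(cats, 1); // mixed: (string | number)[]
console.log(mixed);
```

With this shape, strArr.concat(num) would indeed yield (string | number)[], but a true lower bound could instead infer a common supertype rather than a union.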

migrated from #14728:

currently when we see a constraint <A extends B> it means A is a subtype of B, so that

declare var a: A;
declare var b: B;
b = a; // allowed
a = b; // not allowed

consider adding a new constraint of the reversed relation: <A within B> that would mean A is a supertype of B (or in other words B is subtype of A), so that:

declare var a: A;
declare var b: B;
b = a; // not allowed
a = b; // allowed

use case

i have a BigFatClass with 50 methods and one little property. now i want to run some assertions over it. if i declare these assertions like expect<T>() and toEqual<T> over the same T, then toEqual asks me for 50 methods that don't matter for the test, and that's the problem

what i need is to declare expect<T>() and toEqual<U within T>() so that i could simply write:

expect(new BigFatClass()).toEqual({ value: true });

<U within keyof T> could be confusing, since U in this case should be a super-set of T keys, not "within" the keys of T.

the idea is that U is always a subset, never a superset, so it should not be a problem

So <U within keyof T> would be the same as <U extends keyof T>?

i always forget that a subtype of a union is a subset of that union, and conversely a supertype of a union must be a superset. so you're right that within would be a poor choice of word to indicate that something is a superset of something else. how about A unties B or A relaxes B, or loosens, frees, eases, etc.: https://www.powerthesaurus.org/loosen/synonyms

@aaronbeall the problem with Partial<T> for making a supertype of a product type is that it only works at the top level, so it doesn't really work for my use case. consider:

type Super<T> = Partial<T>;
type Data = { nested: { value: number; }; }
const one: Super<Data> = {}; // works
const another: Super<Data> = { nested: {} }; // bummer

so i am back to hammering the expected values with the type assertions

expect(data).toEqual(<Data>{ nested: {} });

@Aleksey-Bykov Probably doesn't make this feature any less valuable but for your case I think you can use a recursive partial type:

type DeepPartial<T> = {
  [P in keyof T]?: DeepPartial<T[P]>;
}

(This was suggested as an addition to the standard type defs, which I think would be very useful.)

@aaronbeall DeepPartial almost works for my needs, except that, because it can produce an empty object type, it can be assigned anything (except null and undefined), and that's a problem:

type Super<T> = DeepPartial<T>
type Data = { value: number; }
const one: Data = 5; // not allowed
const another: Super<Data> = null; // not allowed
const yetAnother: Super<Data> = 5; // allowed
const justLike: {} = 5; // <-- reason why (allowed)

@Aleksey-Bykov Woah, you just answered my question. But now I have another one, why is this allowed?

const a1: {} = 0;
const a2: {} = '0';
const a3: {} = false;
const a4: {} = { a: 0 };

Is {} the same as any minus null and undefined?

EDIT: I tried the following DeepPartial definition in the link I provided and it seems to work.

type DeepPartial<T> = {[P in keyof T]?: T[P] | (DeepPartial<T[P]> & object); };

It is pretty late, maybe that won't work either, I'll probably think of some example that will break it once I wake up.

Instead of <A super Kitten> or <A within Kitten> or any other new keyword, why not just <Kitten extends A>, i.e. allow putting the new type variable on the right of the extends?

I really like @HerringtonDarkholme's example in #14520 (comment). I'd add indexOf as another common, standard library method where this would be useful. That is, if I have:

declare const a: string[];
declare const b: string | number;

a.indexOf(b); // fails, would be nice for this to succeed without a type assertion on b.

That could happen if the types looked like:

interface Array<T> {
  indexOf<U super T>(arg: U): number
}
LPTK commented

@ethanresnick note that making the indexOf method too flexible can be dangerous and lead to surprising type checking behavior.

We can see that in Scala, where indexOf has the type you propose, as shown in this REPL session:

Welcome to the Ammonite Repl 1.6.7
(Scala 2.12.8 Java 1.8.0_121)
If you like Ammonite, please support our development at www.patreon.com/lihaoyi
@ Array(1,2,3).indexOf
def indexOf[B >: A](elem: B): Int             def indexOf[B >: A](elem: B,from: Int): Int
@ Array(1,2,3).indexOf("oops")
res0: Int = -1

Here, the problem is that Scala manages to type-check without errors the expression Array(1,2,3).indexOf("oops") by picking the type parameter B as Any and then using the fact that Any is a supertype of String; i.e., it infers Array[Int](1,2,3).indexOf[Any]("oops").

But it would be more useful to get an error in this case.

If I understand this correctly, what @ethanresnick has proposed would essentially set the parameter type for indexOf to any. Not literally, but essentially because any argument of any type could be passed to indexOf resulting in an infinite number of possible types.

@LPTK mentions the problem with being too flexible, but what about being too restrictive?
What about (<Array<2 | 3>>[2, 3]).indexOf(<1 | 2 | 3>n)?
This is a valid use case that currently produces a compiler error.

The only real solution in this case would be to test for any type overlap between the array type union and arg type union. But that may not be easy.

Is typecasting as any the best solution to over restrictive type checking in indexOf, or does anyone have a better solution?

I think exposing some notion of whether two types overlap could be quite interesting as a separate feature request, but it's kinda distinct from this issue about bounds. Not sure what to say about indexOf absent that...

Hi all.
I use as as a temporary solution:

const anyString = 'asdasdasdas';
const TAG_TYPES: Array<TTags> = ['p', 'h1', 'h2', ....];
(TAG_TYPES as Array<string>).indexOf(anyString) === -1;

I think I would like to specify the base type here, probably explicitly.
How about adding something like:

TAG_TYPES.indexOf<string>(anyString)

or adding some algorithm to find the base type implicitly (probably there is one)?

Conditional types make it possible to emulate this for primitives, at least.

interface Array<T> {
  includes(
    searchElement: T extends string
      ? string
      : T extends number
      ? number
      : T extends boolean
      ? boolean
      : T,
    fromIndex?: number,
  ): boolean;

  indexOf(
    searchElement: T extends string
      ? string
      : T extends number
      ? number
      : T extends boolean
      ? boolean
      : T,
    fromIndex?: number,
  ): number;
}
svr93 commented

I hope you find this example helpful: #26255 (comment)

UPD: works only with “strictNullChecks” -> false

Also, it’s possible to overload the existing signature globally

Not sure if it was mentioned before, but such a bound may be useful in ORM-like things, where you want to allow selecting some subset of a given entity's properties, like this:

class User {
  id: number;
  name: string;
  birthday: Date;
  photos: Photo[];
}

const userNames = getRepository(User).select({ name: true }).where(...).getAll();

You want select to accept an object (let's call it a "shape") with only properties that the entity (User in our case) has. But you cannot restrict it this way, because extends allows the shape to have extra properties. With super you could restrict the passed shape like this:

type ShapeOf<T extends object> = {
  [P in keyof T]: true | false;
};

type OnlyShape<E extends object, S extends object> = {
  [P in keyof E]: P extends keyof S ? S[P] extends true ? E[P] : never : never;
};

class Repository<Entity, Selected> {
  select<Shape super ShapeOf<Entity>>(shape: Shape): Repository<Entity, OnlyShape<Entity, Shape>> {...}
  getAll(): Selected[] {...}
}

so extra properties are prohibited. You can achieve this currently using a very hacky method,
where you compute the Shape type from Shape itself:

type ShapeOf<E, S> = {
  [P in keyof S]: P extends keyof E ? S[P] extends boolean ? S[P] : never : never;
}
...
  select<Shape extends ShapeOf<Entity, Shape>>(shape: Shape): Repository<Entity, OnlyShape<Entity, Shape>> {...}
...

Is there any chance of moving this along from "Discussion" phase, @RyanCavanaugh ? Several other issues about e.g. the signature of Array#includes / Array#indexOf have been closed in favor of this one, but the team doesn't seem to be involved here at all.

I don't think it's controversial that this should pass type checking:

declare const n: number;
const arr = [1,2,3] as const;
arr.includes(n);

At least, I haven't seen anybody in any of the related threads identifying this construct -- specifically for methods on const arrays of primitive literals -- as "error prone". If I'm right, and allowing this is uncontroversial, what's the next step in moving towards a solution?

If I'm wrong, and it is controversial, a) why?, and b) what's a better / safer way to write the above?
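For what it's worth, a user-defined type guard can approximate the desired includes signature today by widening the parameter and narrowing on success; a sketch with a hypothetical includesWide helper:

```typescript
// Sketch: a guard standing in for `includes<U super T>`.
// `T extends U` lets the search value be any supertype of the element type,
// and the predicate narrows it back down when the element is found.
function includesWide<T extends U, U>(arr: readonly T[], x: U): x is T {
  return arr.includes(x as T);
}

const arr = [1, 2, 3] as const;
const n: number = 2;

if (includesWide(arr, n)) {
  // n is narrowed to 1 | 2 | 3 here
  console.log("found", n);
}
```

This keeps the const array's literal types while accepting a plain number, at the cost of routing the check through a helper instead of the method.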

Also having issues with includes, but with string literals and the generic string type (#26255). Can't pass in a string to Array.includes if the array is filled with only literals, which seems to completely undermine the whole point of an includes method.

This would be useful for specifying an API that wants to specify the outer bound of what is supported.

For instance an API that only supports JSON serialization, so wants to specify that all the type parameters are subtypes of some JsonValue type (for instance the one defined by https://github.com/sindresorhus/type-fest/blob/main/source/basic.d.ts)

Putting the constraint in a conditional type could help solve this issue.

For OP's example:

class Animal {
    #brand!: symbol;
}
class Cat extends Animal {
    #brand!: symbol;
}
class Kitten extends Cat {
    #brand!: symbol;
}

function foo</* T super Cat */ T>(a: Cat extends T ? T : never) { /* */ }

foo(new Animal); // OK
foo(new Cat); // OK
foo(new Kitten); // Error, as expected.

For methods like Array#includes, Array#indexOf, etc...

// `includes` is a bivariant operation.
// I only add an overload for the contravariant behavior here; the covariant behavior is ensured by the existing overload.

interface Array<T> {
    includes</* U super T */ U>(this: T extends U ? this : never, searchElement: U, fromIndex?: number | undefined): boolean;
}

interface ReadonlyArray<T> {
    includes</* U super T */ U>(this: T extends U ? this : never, searchElement: U, fromIndex?: number | undefined): boolean;
}

var arr1 = ['a', 'b', 'c'];
var arr2 = ['a', 'b', 'c'] as const;
declare var v1: string;
declare var v2: number;
declare var v3: 'd';

arr1.includes(v1); // OK
arr1.includes(v2); // Error as expected
arr1.includes(v3); // OK
arr2.includes(v1); // OK
arr2.includes(v2); // Error as expected
arr2.includes(v3); // Error as expected

Let me present yet another case:

// If the property of the object may be undefined (ie: string | undefined), then mark the property as optional
export type OptionalUndefined<T, V extends keyof T = keyof T> = Partial<{
  [K in V as undefined extends T[K] ? K : never]: T[K];
}> & {
  [K in V as undefined extends T[K] ? never : K]: T[K];
} & Omit<T, V>;

type Demo = OptionalUndefined<{ a: string | undefined; b: string | undefined; c: string}, 'a'>
// Expect { a?: string | undefined; b: string | undefined; c: string }

So far so good. But once you plug it into a generic, since never extends literally everything and a generic can always be instantiated later, every targeted property is now never. You can try the following in the playground:

export type OptionalUndefined<T, V extends keyof T = keyof T> = Partial<{
  [K in V as undefined extends T[K] ? K : never]: T[K];
}> & {
  [K in V as undefined extends T[K] ? never : K]: T[K];
} & Omit<T, V>;

type ConfigObject<OptionValue, AllowUndefined extends boolean = false> = OptionalUndefined<{
  value: AllowUndefined extends true ? OptionValue | undefined : OptionValue;
  allowUndefined: AllowUndefined;
  foo: string;
}>;

// Errors :
// Property 'value' does not exist on type 'ConfigObject<OptionValue, AllowUndefined>'
// Property 'allowUndefined' does not exist on type 'ConfigObject<OptionValue, AllowUndefined>'
function myFunction<OptionValue, AllowUndefined extends boolean = false>({ value, allowUndefined, foo }: ConfigObject<OptionValue, AllowUndefined>): void {
  
}

I hate to add to an already long thread, but I wanted to contribute what I think is a much simpler example to motivate the value of super-type constraints, using nothing more than a list and an abstraction of .forEach().

Start with a list class that abstracts an array down to just its forEach:

class List1<T> {
  constructor(private readonly xs: readonly T[]) { }
  forEach(f: (x: T) => void) {
    this.xs.forEach(f);
  }
}

Next, give the caller control over the dispatch. Like something halfway between a Stream (push) and an Iterator (pull).

class List2<T> {
  constructor(private readonly xs: readonly T[]) { }
  forEach(f: (x: T) => void): () => void {
    let i = 0;
    return () => f(this.xs[i++]);
  }
}

to be called like:

function logFirstFew(xs: List2<number>) {
  const forEach = xs.forEach(x => console.log(x));
  forEach();
  forEach();
  forEach();
}

Finally, turn that iteration control into its own class:

class List3<T> {
  constructor(private readonly xs: readonly T[]) { }
  forEach(f: (x: T) => void): ForEach<T> {
    return new ForEach<T>(this.xs, f);
  }
}
class ForEach<T> {
  private i = 0;
  constructor(
    private readonly xs: readonly T[],
    private readonly f: (x: T) => void,
  ) { }
  next() {
    this.f(this.xs[this.i++]);
  }
}

Where are supertype constraints needed?

The usage of T in List<T> is essentially covariant, so List<T> should be assignable to List<U> if T is assignable to U. In particular, List<never> should be assignable to any List<T>.

Let's see if that's true:

const empty1 = new List1<never>([]);
const empty2 = new List2<never>([]);
const empty3 = new List3<never>([]);

let xs1: List1<number> = empty1;
let xs2: List2<number> = empty2;
let xs3: List3<number> = empty3;  // oh no, this fails.

The reason List3<T> suddenly lost its covariance of T (where List3<never> is no longer assignable to List3<?>) is that the private use of T in ForEach<T> (which is not visible on the public interface of ForEach<T>) has leaked into the type of List3<T>, forcing invariance.

Surely this scenario can be done without them?

Yes, there are ways to restore the covariance of T in List3<T>, such as:

  1. Boilerplate to split the interface/implementation of ForEach:
    ...in order to hide T from List<T>::forEach. e.g.

      forEach(f: (x: T) => void): { next(): void } {
        return new ForEach<T>(this.xs, f);
      }

    If the methods in the ForEach type grow, this gets cumbersome.

  2. Leveraging some DeMorgan-like equivalence of ? extends ((x: T) => void) <=> (x: ? super T) => void
    ...and introducing a generic F for the type of the callback function.

    class List3<T> {
      forEach<F extends (x: T) => void>(f: F): ForEach<T, F> {
        return new ForEach<T, F>(this.xs, f);
      }
    }
    class ForEach<T, F extends (x: T) => void> {
      constructor(
        private readonly xs: readonly T[],
        private readonly f: F,
      ) { }
    }

    This works, but the need for F is relatively confusing, even to those comfortable with variance.

What could it look like with them?

It would be a lot simpler and more concise if there were a Java-like super constraint and/or an existential wildcard ?, allowing:

  forEach<U super T>(f: (x: U) => void): ForEach<U> {
    return new ForEach<U>(this.xs, f);
  }

or

class ForEach<T> {
  constructor(
    private readonly xs: readonly T[],
    private readonly f: (x: ? super T) => void,
  ) { }
}

In particular, List3<never>::forEach::f would become (? super never) => void instead of (never) => void, thus retaining the ability to assign List3<never> to List3<?>.

A simple example of where lower bound constraints would be useful.

A service type defines secrets it has access to at runtime. Another composable runtime component wants to be able to get secrets for a service with at least a secret of a certain name. Ideally you should be able to define this with a lower bounded string union of secrets you expect the service parameter to have.

Eg. Playground

type Service<SecretName> = {secrets: SecretName[]}

// here you want to say accept any Service which has defined at least the `auth` secret
// eg. let getAuth = (service: Service<[> 'auth']>) => {}
let getAuth = (service: Service<'auth'>) => {}

let a: Service<never> = {secrets: []}
let b: Service<'auth'> = {secrets: ['auth']}
let c: Service<'auth' | 'docusign'> = {secrets: ['auth', 'docusign']}

let a1 = getAuth(a) // should fail
let b1 = getAuth(b) // should be fine
let c1 = getAuth(c) // should be fine

@chrisui if the generic naturally shows up in a contravariant position in your actual data model, you already get this behaviour: playground

type Service<SecretName> = {retrieveSecret: (s:SecretName)=>unknown}

let getAuth = (service: Service<'auth'>) => ''

declare let a: Service<never>;
declare let b: Service<'auth'>;
declare let c: Service<'auth' | 'docusign'>;

let a1 = getAuth(a) // fails
let b1 = getAuth(b) // works
let c1 = getAuth(c) // works

Cheers for the reply and playground @tadhgmister. That solution requires the type to appear only in a function position, where it is contravariant, if I understand correctly? But in my case the definition is, by design, just some deserialised configuration data. I.e. as soon as I add that back into the Service type, it fails.

There might still be a way for me to use this with a rejig, so thank you. I think I just need to separate usage of/inference from the "service definition" type/data-shape and the "service runtime" type/interface.

Yeah, like I say, this only works if it naturally comes up in the actual interface in a contravariant position. In your case @chrisui, I figured maybe something that is considered secret would want a method to filter access, with the actual property hidden. But if you do want the secrets to be typed not solely through the arguments of a method, then yeah, a solution using the existing mechanism will likely feel hacky.

Is there any movement on implementing this?
It has been 6 years, it is now blocking a bunch of other PRs, and it would be amazing to have some of that functionality in TS.

A somewhat working workaround for anyone interested 😃

type Range<Lower, Upper> = Upper & Partial<Extract<Lower, Upper>>

// test
type Big = { f1: number, f2: string, f3: string }
type Small = { f1: number }

const scoped: Range<Big, Small> = { f1: 1, f2: 'one' }
const small: Small = scoped
const big: Big = scoped // no no

This seems like the most common thing I want from TypeScript that it can't currently do. Examples:

/** Numeric text box component. `number` must be a valid value of `Num`,
    but TypeScript does not accept `<number extends Num extends NumEx>`. */
function NumericTextField<Num extends NumEx>(...): JSX.Element {...}
type NumEx = number | undefined | null | { toString():string };

/** This should say `null extends CFV` i.e. that null must be a valid value of CFV,
    but TypeScript doesn't support that kind of constraint AFAICT. */
function WithNullCheckbox<CFV = CustomFieldValue>(...) {...}

Until just now I thought the syntax should "obviously" be <constraint extends T>, e.g. number extends T means that let t: T = 7 must be legal. I realize now that this can't work:

type U = ...;
type V = ...;
// It's unclear whether the `U` or `V` is meant as the parameter name, and for 
// backward compatibility, `U` must be the name and `V` must be the constraint.
function Foo<U extends V>() {}

So I propose that when this feature is implemented, the TypeScript compiler should detect what the user is trying to do in most cases and offer a correction, e.g. if the final syntax were Foo<T includes Bound>, TypeScript could offer error messages like:

// Currently the error is "Cannot find name 'Y'" but if `X` was already defined, 
// I propose "Cannot find name 'Y'. Did you mean 'Y includes X'?"
function Foo<X extends Y>() {}

// Currently the error is "Type parameter name cannot be 'number'". I propose
// "Type parameter name cannot be 'number'. Did you mean 'T includes number'?"
function Foo<number extends T>() {}

// These don't parse, so better error messaging would need parser changes.
function Foo<{ toString(): string } extends T>() {}
function Foo<(X|Y) extends T>() {}
function Foo<X[] extends T>() {}

The use case I came across for this is an "action executor" in the context of an application.

interface AppContext { database; service1; service2; serviceN; }
function executeAction<T>(actionFn: (context: AppContext, args) => T) { ... }

Naturally, you can pass in an action that takes a subset of AppContext, which is desirable (so that actions can be tested, for example, without a full context), but what if I want to have an interface/type for an action?

interface Action<
  TContext, // super AppContext
  TResult
> {
  (context: TContext): void;
  name: string;
  metadata;
}

How do I constrain TContext correctly? Seems like it can't be done.

However, there is a relatively easy workaround, though kind of annoying, which is to always use a function type when you want a contravariant type parameter:

interface Action<
  TContextFn extends (context: AppContext) => void,
  TResult
> {
  (context: Parameters<TContextFn>[0]): TResult;
  name: string;
  metadata;
}

But that makes it kind of hard to explicitly type Action. In my case, that's not a big deal, but it would be really nice to have a proper lower-bound syntax.
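A reduced, runnable sketch of that Parameters<...>[0] workaround, with hypothetical names (AppContext trimmed down, countDb invented for illustration):

```typescript
// Hypothetical trimmed-down context.
type AppContext = { database: string; service1: string };

// The contravariant context type is smuggled in through a function type F,
// then recovered with Parameters<F>[0].
interface Action<F extends (ctx: AppContext) => unknown> {
  run: (ctx: Parameters<F>[0]) => ReturnType<F>;
  name: string;
}

// An action that only needs part of the context still type-checks,
// because the context parameter sits in a contravariant position.
const countDb: Action<(ctx: { database: string }) => number> = {
  run: (ctx) => ctx.database.length,
  name: "countDb",
};

const full: AppContext = { database: "db", service1: "s1" };
console.log(countDb.run(full)); // 2
```

The awkward part is exactly what the comment above notes: you must write the whole function type as the type argument instead of just the context type.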

jcalz commented

This would also allow safer property writes:

interface Foo { a: true }
const foo: Foo = { a: true };

function f(o: { a: boolean }) {
  o.a = Math.random() < 0.5; // is accepted but unsafe
}
f(foo); // is accepted but unsafe

function g<T super boolean>(o: { a: T }) {  
//           ^^^^^ <--- proposed feature
  o.a = Math.random() < 0.5; // accepted because it's safe
}
g(foo); // rejected
LPTK commented

@jcalz the underlying issue here is that TS unsoundly treats mutable records as covariant. The f(foo) call itself really should be forbidden. Using a lower bound in g does not solve anything as g's type can still be instantiated to f's type and called similarly: g<boolean>(foo).

Yes the solution is to have readonly properties with the current assignability, and mutable properties that are invariant. Perhaps a strict setting would make all properties of a received object readonly (preserving current assignability for the sake of external compatibility), highlighting unsafe code. The author could then mark the property mutable to make it invariant, or choose to override the error e.g. via cast to continue using unsafe code. #18770

export const isInstanceOf = <I>(c: new () => I) => <T supertype I>(v: T): v is I => {
    return v instanceof c
}

This is my use case. I would like to be able to do things like:

export const isInstanceOfString = isInstanceOf(String);
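Until such a bound exists, one fallback is to type the checked value as unknown; the static relationship between the argument type and I is lost, but the predicate still narrows. A sketch:

```typescript
// Sketch without the proposed `T supertype I` bound: accept `unknown`
// and rely on the type predicate for narrowing.
const isInstanceOf =
  <I>(c: new (...args: any[]) => I) =>
  (v: unknown): v is I =>
    v instanceof c;

const isInstanceOfString = isInstanceOf(String);

console.log(isInstanceOfString(new String("a"))); // true
console.log(isInstanceOfString("plain"));         // false (primitives are not String objects)
```

The downside versus the proposed bound is that nonsensical checks like isInstanceOfString(5) are not rejected at compile time.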

I’d like to present a compelling use case for introducing such a feature, inspired by Ramda’s R.includes function:

import * as R from "ramda";

type State = 0 | 1 | 2;

const includes0 = R.includes(0 as const);
const states: State[] = [1, 0, 1, 2];
includes0(states)

Ideally, the type signature of includes0 should be something like <T super 0>(list: readonly T[]) => boolean, allowing it to accept any array type that includes the value 0. For this to work, the type of R.includes itself would need to be structured as function includes<T>(target: T): <U super T>(list: readonly U[]) => boolean, but achieving this is currently not possible in TypeScript.

If you paste the code above into the TypeScript Playground, it actually results in an error:

includes0(states)
//        ~~~~~~
// Argument of type 'State[]' is not assignable to parameter of type 'readonly 0[]'.
//   Type 'State' is not assignable to type '0'.
//     Type '1' is not assignable to type '0'.

This error highlights limitations in the current typing of R.includes:

export function includes(s: string): (list: readonly string[] | string) => boolean;
export function includes<T>(target: T): (list: readonly T[]) => boolean;

The additional overload for string-based use cases exists to support scenarios like const includesFoo = R.includes("foo" as const); const xs: Array<"foo" | "bar"> = ["foo", "bar"]; includesFoo(xs), avoiding errors in these cases. However, this workaround is quite specific and doesn’t generalize well to other types.

I’ve encountered several similar issues due to the contravariant nature of function parameter types, where I desperately wish for syntax like <T super Type>, which could elegantly resolve such cases.

My use case is related to Array.includes().
I would like to check whether an item is in the array.

For example,

// Type of myArr is ["a", "b"]
const myArr = ['a', 'b'] as const;

// This item could come from reading env or an API response, anywhere that I can not know the exact value or type
const myItem = process.env.MY_ITEM; 

// This will error out saying: Argument of type 'string' is not assignable to parameter of type '"a" | "b"'
myArr.includes(myItem) 

The reason I call myArr.includes(myItem) is to check whether this item exists in myArr, but now TS refuses to compile this call because myItem is a super-type (string) of "a" | "b". But if I already knew myItem were either "a" or "b", why would I need to call includes()?

@linkfang very much agree, I have run into this many times especially when trying to validate a value is in some allowed set.

I believe I ran into this issue when letting users upload a file via a input element of type file. Here's the gist of the code:

export const AllowedUploadFileTypes: ("image/jpeg" | "image/png" | "image/webp")[] = [
	"image/jpeg", 
	"image/png", 
	"image/webp",
]

//elsewhere in the input handler for uploading files

if (!AllowedUploadFileTypes.includes(file.type)) {
  //handle rejection of unsupported file type
}

Within the .includes() argument, I get the TS error of: Argument of type 'string' is not assignable to parameter of type '"image/jpeg" | "image/png" | "image/webp"'.

This scenario feels like a common one, and it's unreasonable to have to fight Typescript over this. I could water down the type of AllowedUploadFileTypes to just be string, but then I lose helpful intellisense hints when I hover over it as to the exact allowed types.
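One way to keep the literal union on AllowedUploadFileTypes while accepting an arbitrary string is to widen only inside a type-guard wrapper; a sketch with a hypothetical isAllowedUploadFileType guard:

```typescript
const AllowedUploadFileTypes = ["image/jpeg", "image/png", "image/webp"] as const;
type AllowedUploadFileType = (typeof AllowedUploadFileTypes)[number];

// Widen the array to readonly string[] only for the check,
// keeping the literal union (and IntelliSense) everywhere else.
function isAllowedUploadFileType(t: string): t is AllowedUploadFileType {
  return (AllowedUploadFileTypes as readonly string[]).includes(t);
}

console.log(isAllowedUploadFileType("image/png")); // true
console.log(isAllowedUploadFileType("text/html")); // false
```

On success the input is narrowed to the literal union, so downstream code still benefits from the precise type.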

I'm not sure if it is related, but would it ever be possible for TS to treat reading and assigning properties differently? Ideally, assignment should use the lower bound; otherwise we get issues like these:

type Foo = {
  value: number | string
}

type Bar = {
  value: number
}

function test(foo: Foo) {
  foo.value = 'a'
}

const bar: Bar = { value: 1 }
test(bar)
bar.value // oops!

This issue comes up with React ref objects, preventing us from using refs for less specific types such as HTMLElement.

I think that is just an accepted unsoundness which exists in any subtyping-based structurally typed language. Otherwise, there would be a lot of annoying type errors.

Treating reading and writing differently is certainly a possible solution, though it would require a lot of new syntax.

Yes there is a route for it. Make objects readonly with a flag, and then add keywords like mutable/writeonly. But I don't think the TS team is interested in these types of changes. I've considered taking on further development of something like this, but it's not a project to take on lightly and I'd need funding. So we're a little stuck.

The workaround is to validate the type yourself:

type Foo = {
  value: number | string
}

type Bar = {
  value: number
}

type Writable<O, T> = O extends T ? O : never

function test<T extends Foo>(foo: Writable<Foo, T>) {
  foo.value = 'a'
}

const foo: Foo = { value: 1 }
test(foo) // OK

const bar: Bar = { value: 1 }
test(bar) // Err

Playground

This is very annoying when dealing with Array.includes. It expects searchElement to be the same type as the array's elements, which causes issues when dealing with union types.

A simple validation lookup like this is marked as an error:

Playground link