Description
I sometimes run into scenarios where it's clear to me what a generic type argument should be, but TypeScript fails to infer it. I think this is because inference only looks at the call site, not the surrounding context. The example I just ran into was this:
```ts
function comparing<A, B>(f: (x: A) => B): (x: A, y: A) => number {
    return function (x: A, y: A): number {
        return f(x) < f(y) ? -1 : f(x) == f(y) ? 0 : 1;
    };
}

var x: Array<{ foo: string }> = [{ foo: "hello" }, { foo: "world" }];

// Works, explicit types
function getFoo(o: { foo: string }): string { return o.foo; }
x.sort(comparing(getFoo));

// Works, inferred any everywhere
function getFoo2(o) { return o.foo; }
x.sort(comparing(getFoo2));

// Doesn't work, infers type parameters as {}
x.sort(comparing(o => o.foo));
```

In the last case it infers the type parameter `A` as `{}`, causing the call to fail since `{}` has no property `foo`. But from the context (`comparing` should return a function `(x: { foo: string }, y: { foo: string }) => number`, because its result is used as an argument to `sort` on `x`), it should be clear that `A` should be `{ foo: string }`.
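In the meantime, the workarounds I've found are to supply the missing type information by hand, either on the lambda's parameter or as explicit type arguments. A minimal sketch (reusing the `comparing` function from above):

```typescript
function comparing<A, B>(f: (x: A) => B): (x: A, y: A) => number {
    return function (x: A, y: A): number {
        return f(x) < f(y) ? -1 : f(x) == f(y) ? 0 : 1;
    };
}

const xs: Array<{ foo: string }> = [{ foo: "world" }, { foo: "hello" }];

// Option 1: annotate the lambda's parameter, so A is inferred as { foo: string }.
xs.sort(comparing((o: { foo: string }) => o.foo));

// Option 2: pass the type arguments explicitly; the lambda is then
// contextually typed by f's declared signature.
xs.sort(comparing<{ foo: string }, string>(o => o.foo));

console.log(xs.map(o => o.foo)); // ["hello", "world"]
```

Both options type-check, but they duplicate information that (as argued above) is already present at the use site.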
Is there any way this kind of inference could be added, or is it too complicated due to overloads/subtyping/etc.?
P.S. There seem to be several issues about things like this, but they all seem slightly different. If they're not, my apologies. I'd like to have a place where I can track this issue, so I would appreciate any pointers to another ticket, a FAQ, etc.