Support function decorators excellently
sixolet opened this issue · 22 comments
Decorators currently stress mypy's support for functional programming to the point that many decorators are impossible to type. I'm intending this issue as a project plan and exploration of the issue of decorators and how to type them. It's a collection point for problems or incompletenesses that keep decorators from being fully supported, plans for solutions to those problems, and fully-implemented solutions to those problems.
Decorators that can only decorate a function of a fixed signature
You can define your decorator's signature explicitly, and it's completely supported:
from typing import Callable, Tuple
def with_strlen(f: Callable[[int], str]) -> Callable[[int], Tuple[str, int]]:
    def ret(__i: int) -> Tuple[str, int]:
        r = f(__i)
        return r, len(r)
    return ret
@with_strlen
def lol(x: int) -> str:
return "lol"*x
reveal_type(lol) # E: Revealed type is 'def (builtins.int) -> Tuple[builtins.str, builtins.int]'
Notes:
- You can even use type variables to get some flexibility in argument and return types (see the sketch after this list), but this falls over as soon as you don't know the exact number of arguments to expect.
- Your decorated function's arguments can't be passed by keyword.
- #2607 enables that
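To make the note about type variables concrete, here is a small sketch (not from the original post): the argument type is left generic, but the decorator still only accepts one-argument functions.
from typing import Callable, Tuple, TypeVar
A = TypeVar('A')
def with_strlen_any_arg(f: Callable[[A], str]) -> Callable[[A], Tuple[str, int]]:
    # Same shape as with_strlen above, but the single argument's type is generic.
    def ret(__x: A) -> Tuple[str, int]:
        r = f(__x)
        return r, len(r)
    return ret
@with_strlen_any_arg
def shout(word: str) -> str:
    return word.upper() + "!"
reveal_type(shout) # E: Revealed type is 'def (builtins.str) -> Tuple[builtins.str, builtins.int]'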
Decorators that do not change the signature of the function
Nearly fully supported. Here's how you do it:
from typing import TypeVar, Callable, cast
T = TypeVar('T')
def print_callcount(f: T) -> T:
    x = 0
    def ret(*args, **kwargs):
        nonlocal x
        x += 1
        print("%d calls so far" % x)
        return f(*args, **kwargs)
    return cast(T, ret)
@print_callcount
def lol(x: int) -> str:
return "lol"*x
reveal_type(lol) # E: Revealed type is 'def (x: builtins.int) -> builtins.str'
Notes:
- Mypy trusts that you can call f. You can set a bound on T to be Callable, but you don't need to for it to typecheck.
- This business doesn't typecheck without the cast. That's a symptom of the fact that mypy doesn't understand the function nature of the argument to print_callcount, and there's no way to declare that argument as a Callable explicitly without losing the argument type of lol later. We'd like to minimize the places where casts are required. Doing so requires something along the lines of "variadic argument variables", discussed below.
Decorators that take arguments ("second-order"?)
Plenty of decorators "take arguments" by actually being functions that return a decorator. For example, we'd like to be able to do this:
from typing import Any, TypeVar, Callable, cast
T = TypeVar('T')
def callback_callcount(cb: Callable[[int], None]) -> Callable[[T], T]:
    def outer(f: T) -> T:
        x = 0
        def inner(*args, **kwargs):
            nonlocal x
            x += 1
            cb(x)
            return f(*args, **kwargs)
        return cast(T, inner)
    return outer
def print_int(x: int) -> None:
    print(x)
@callback_callcount(print_int)
def lol(x: int) -> str:
return "lol"*x
reveal_type(lol) # E: Revealed type is 'def (x: builtins.int) -> builtins.str'
Notes:
- This does not typecheck yet -- errors on calling the decorator, and lol ends up typed as None
- Relevant issue: #1551
- #3028 fixes this.
- Still has a non-ideal cast...
Decorators that mess with the return type or with arguments
For an arbitrary function you can't do this at all yet -- there isn't even a syntax. Here's me making up some syntax for it.
Messing with the return type
from typing import Any, Dict, Callable
from mypy_extensions import SomeArguments
def reprify(f: Callable[[SomeArguments], Any]) -> Callable[[SomeArguments], str]:
    def ret(*args: SomeArguments.positional, **kwargs: SomeArguments.keyword):
        return repr(f(*args, **kwargs))
    return ret
@reprify
def lol(x: int) -> Dict[str, int]:
return {"lol": x}
reveal_type(lol) # E: Revealed type is 'def (x: builtins.int) -> builtins.str'
Messing with the arguments
from typing import Any, Callable, TypeVar
from mypy_extensions import SomeArguments
R = TypeVar('R')
def supply_zero(f: Callable[[int, SomeArguments], R]) -> Callable[[SomeArguments], R]:
    def ret(*args: SomeArguments.positional, **kwargs: SomeArguments.keyword):
        return f(0, *args, **kwargs)
    return ret
@supply_zero
def lol(x: int, y: str) -> str:
return "%d and %s" % (x, y)
reveal_type(lol) # E: Revealed type is 'def (y: builtins.str) -> builtins.str'
The syntax here is fungible, but we would need a way to do approximately this thing -- capture the types and kinds of all a function's arguments in some kind of variation on a type variable.
Relevant issues and discussions:
Variadic type variables alone (python/typing#193) get you some of the way there, but lose all keyword arguments of the decorated function.
Things to do:
- Implement variadic type variables (fill in PR when I have it)
- Write up a detailed proposal for the semantics of argument variables
  - ... and how they interact with *args and **kwargs
  - ... and their relationship to variadic type variables and the expand operation
  - ... and the semantics of an easy-to-use SomeArguments-style alias, so nobody has to actually engage with the details of the above when writing normal decorators.
- Come to some kind of mypy-community consensus or near-consensus on that proposal. It'll be in mypy_extensions, not typing, at first -- this can be fodder for the future of PEP 484, but while we're playing in such experimental land, not yet.
- PR to implement the SomeArguments thing.
Great to see this may be really happening! The Twitter discussion gave me a new suggestion for what to name "second-order decorators" -- people seemed to like "decorator factory" best.
"decoratory" :)
Here are some thoughts about decorators that tweak arguments/return types. This proposal combines features from @sixolet's proposal and other sources, such as #1927 and python/typing#193.
Argspec type variables
We'd add a new kind of type variable: argspec. An argspec type variable represents an arbitrary sequence of callable argument types/kinds, such as DefaultArg(int, 'b') using the syntax recently implemented by @sixolet. (We could also generalize this to variadic type variables that represent simple sequences of types, but that's out of scope for this issue.)
This is how we'd define one:
from typing import TypeVar
Args = TypeVar('Args', argspec=True)
For example, we could use this in a stub for contextlib to define the signature of contextmanager:
RT = TypeVar('RT') # Regular type variable
def contextmanager(
    func: Callable[Args, Iterator[RT]]) -> Callable[Args, GeneratorContextManager[RT]]: ...
Decorator implementations
Outside a stub, we hit a problem: it's going to be tricky to type check implementations of functions that use an argspec type variable in their signatures. We can sidestep this issue by not supporting this at all -- instead we'd require a separate external signature declaration and an implementation for such functions, and the implementation can't use argspecs. This is analogous to how overloaded functions with implementations work. Sketch of how to implement contextmanager:
from typing import declared_type
...
@declared_type
def contextmanager(
    func: Callable[Args, Iterator[RT]]) -> Callable[Args, GeneratorContextManager[RT]]: ...
def contextmanager(
    func: Callable[..., Iterator[RT]]) -> Callable[..., GeneratorContextManager[RT]]:
    <implementation of context manager>
    return <something>
The name declared_type is analogous to decorated_type (see #3291), but I'm not convinced that this is the best name we can think of. Maybe we should even combine declared_type and decorated_type.
__call__
Now what about a decorator that is not a callable but a more general object? We could still generalize this approach by making it possible to use an argspec to define the type of __call__ (here I assume a stub file):
class dec(Generic[Args, RT]):
    def __init__(self, fn: Callable[Args, RT]) -> None: ...
    def __call__(self, *args: Args, **kwargs: Args) -> RT: ...
Note that Args can be used for both *args and **kwargs -- actually, it must always be used in such a way if used outside a Callable type, as only *args: Args doesn't really make sense.
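For illustration, a hypothetical use of such a class-based decorator under this proposal (no checker accepts this today; the names are made up):
@dec
def shout(text: str, times: int = 1) -> str:
    return (text.upper() + "!") * times
# shout would be an instance of dec whose __call__ accepts exactly
# (text: str, times: int = 1) and returns str:
shout("hi", times=2)  # ok
shout(123)            # error: expected str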
Expand
We can support an Expand[...] operator that lets us tweak argument lists. For example, here we add an extra int argument to the beginning of an argument list:
def dec(x: Callable[Args, RT]) -> Callable[[int, Expand[Args]], RT]: ...
The relationship between Args and Expand[Args] is like the relationship between alist and *alist.
Oh and the updated TypeVar and declared_type would initially live in mypy_extensions.
I have two clarifying questions/proposals:
- Maybe we can allow a simpler version of Expand for Python 3.5 and later:
  def dec(x: Callable[Args, RT]) -> Callable[[int, *Args], RT]: ...
- How should decorators that reduce the number of arguments be defined? Just in the opposite way?
  def dec(x: Callable[[int, *Args], RT]) -> Callable[Args, RT]: ...
Outside a stub, we hit a problem: it's going to be tricky to type check implementations of functions that use an argspec type variable in their signatures.
How so? I'm aware that it's difficult for general variadic functions (as mentioned in python/typing#193), but I'd expect that most decorators will just treat values of Args type as opaque objects to be passed on to their wrapped function.
@JukkaL I have been pondering this problem for the last two weeks, and haven't started an implementation of any solution because I hadn't solved it yet in my head. I like a whole lot about your suggestions, and I think they get us a lot further.
Some wiggly bits I have:
- I don't think argspec is a kind of type variable. I think it's different enough it should be its own thing.
- When you have this code block:
from typing import declared_type
@declared_type
def contextmanager(
    func: Callable[Args, Iterator[RT]]) -> Callable[Args, GeneratorContextManager[RT]]: ...
def contextmanager(
    func: Callable[..., Iterator[RT]]) -> Callable[..., GeneratorContextManager[RT]]:
    <implementation of context manager>
    return <something>
Can we consider this instead:
contextmanager_sig = Callable[
    [Callable[Args, Iterator[RT]]],
    Callable[Args, GeneratorContextManager[RT]],
]
@declared_type(contextmanager_sig)
def contextmanager(
    func: Callable[..., Iterator[RT]]) -> Callable[..., GeneratorContextManager[RT]]:
    <implementation of context manager>
    return <something>
This uses a more direct method than redefinition to declare the type of the function aside from its implementation. It also neatly sidesteps the problem of how to define an argspec in the def syntax at all, for some weird signatures:
@declared_type(Callable[[Callable[Args, R], Expand[Args]], R])
def apply(f: Callable[..., R], *args, **kwargs) -> R: ...
... I don't think these weird signatures are important enough to specifically jump through hoops on their own. Should do this form if we feel like it's cleaner.
The following is speculation and musing and I'm not particularly strongly attached to it.
I have some kind of sneaking suspicion that what we're calling "argspecs" here and variadic type variables are more closely related than (at least I) suspected. I'm looking for a way to relate these two concepts -- for example, a way to cleanly and pleasantly write the signature of map.
Maybe something like:
Args = ArgSpec('ArgSpec', keyword=False) # matches only positional arguments
R = TypeVar('R')
@declared_type(Callable[[Callable[Args, R], Expand[Iterable[Args], Args]], Iterable[R]])
def map(f: Callable[..., R], *args) -> Iterable[R]: ...
(Here Expand takes a second argument, specifically the argspec/variadic type var to expand. This turns it into some kind of comprehension.)
- Whichever of these forms we pick, the form for declaring the type of a decorated function should be exactly the same -- they shouldn't be two concepts.
I don't think argspec is a kind of type variable. I think it's different enough it should be its own thing.
That's fair. Anyway, we don't need to decide this very early, since this is only a matter of syntax.
This uses a more direct method than redefinition to declare the type of the function aside from its implementation.
I have no strong opinion either way right now, but I have a feeling that the flexible callable syntax can make more complex signatures hard to read. Again, this is mostly a matter of syntax and we can bikeshed it later after we've agreed on basic principles. And I agree that making this consistent with how we declare the decorated signature of a function would be nice.
I have some kind of sneaking suspicion that what we're calling "argspecs" here and variadic type variables are more closely related than (at least I) suspected.
Yes, I think that a variadic type variable would mostly behave like an argspec, but it wouldn't have any argument kinds (just a sequence of types). I left this out of my proposal because it's not very directly related to the current issue. A variadic type variable would also allow things like Tuple[Expand[X]], which argspecs wouldn't support.
How so? I'm aware that it's difficult for general variadic functions (as mentioned in python/typing#193), but I'd expect that most decorators will just treat values of Args type as opaque objects to be passed on to their wrapped function.
Yes, most decorators probably are trivial in that respect. However, any non-trivial decorators could be very tricky to type check. I'm worried that users have a significant number of those (though still a minority of all decorators) and if we don't type check them properly it will be a never-ending stream of bug reports and ideas about handling various edge cases. Also, as these are otherwise quite similar to variadic type variables, it might feel a little odd if we could type check one but not the other.
Maybe we can allow a simpler version of Expand for Python 3.5 and later:
Unfortunately that doesn't work with variadic type variables for things like Tuple[Expand[X]], and I'd rather not have two different syntax variants for expanding type variables.
How should decorators that reduce the number of arguments be defined? Just in the opposite way?
Yes, though changing the number of arguments might be something we implement only a bit later as the feature would be quite useful even without it.
It's been over a year w/o updates to this thread, but it's still the first one that comes up for me in a google search for mypy and decorators. Is this still the right place to look for updates/status/plans for decorator typing improvements?
Just for the record: if someone needs to change the return type of the function inside the decorator and still have typed parameters, you can use a custom mypy plugin that literally takes 15 LoC: https://github.com/dry-python/returns/blob/92eda5574a8e41f4f5af4dd29887337886301ee3/returns/contrib/mypy/decorator_plugin.py
Saved me a lot of time!
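For anyone curious, here is a rough sketch in the same spirit (the decorator name my_lib.my_decorator is a placeholder; the linked file is the real thing): it copies the decorated function's argument types onto the decorator's declared return type.
from mypy.plugin import FunctionContext, Plugin
from mypy.types import CallableType, Type

class DecoratorPlugin(Plugin):
    def get_function_hook(self, fullname: str):
        # Placeholder: the fully qualified name of your decorator.
        if fullname == 'my_lib.my_decorator':
            return _copy_signature
        return None

def _copy_signature(ctx: FunctionContext) -> Type:
    # The decorator is declared as Callable[..., NewReturnType]; graft the
    # decorated function's argument types, kinds, and names onto that type.
    decorated = ctx.arg_types[0][0]
    declared = ctx.default_return_type
    if isinstance(decorated, CallableType) and isinstance(declared, CallableType):
        return declared.copy_modified(
            arg_types=decorated.arg_types,
            arg_kinds=decorated.arg_kinds,
            arg_names=decorated.arg_names,
        )
    return declared

def plugin(version: str):
    return DecoratorPlugin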
@sobolevn This is btw exactly how mypy preserves precise types for @contextmanager.
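For example (with current mypy; the exact revealed type string varies between versions):
from contextlib import contextmanager
from typing import Iterator

@contextmanager
def connect(host: str, port: int = 80) -> Iterator[str]:
    yield "%s:%d" % (host, port)

reveal_type(connect)  # roughly 'def (host: str, port: int =) -> ContextManager[str]'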
We have just submitted a PEP to provide an extension to the type system to allow for the modification of return types of functions by decorators without the use of plugins at python/peps#1259. We would love to hear feedback over on typing-sig@python.org, especially by anyone on this issue that has struggled with this in the past (thread is here https://mail.python.org/archives/list/typing-sig@python.org/thread/UDHSH4EVVIDKNLRX2YGCIUCBGZ5ALRKC/)
We have just submitted a PEP to provide an extension to the type system to allow for the modification of return types of functions by decorators without the use of plugins at python/peps#1259. We would love to hear feedback over on typing-sig@python.org, especially by anyone on this issue that has struggled with this in the past (thread is here https://mail.python.org/archives/list/typing-sig@python.org/thread/UDHSH4EVVIDKNLRX2YGCIUCBGZ5ALRKC/)
Thanks! From reading the PEP, this will not help with decorators that change the argument type or return type, correct? Like the supply_zero example above?
@rggjan The PEP will address decorators that alter the return type, but not that alter the arguments.
For altering the arguments, we have another PEP coming down the pipe that may fit your needs, ListVariadics.
For addition:
from typing import Callable, TypeVar
import pyre_extensions
from pyre_extensions.type_variable_operators import Concatenate
Ts = pyre_extensions.ListVariadic("Ts")
def prepend_addition_argument(f: Callable[[Ts], int]) -> Callable[[Concatenate[int, Ts]], str]:
    def inner(x: int, *args: Ts) -> str:
        return str(x + f(*args))
    return inner
@prepend_addition_argument
def foo(x: int, y: int) -> int:
    return x + y
reveal_type(foo) # typing.Callable(foo)[[int, int, int], str]
For removal:
from typing import Callable, TypeVar, List
import pyre_extensions
from pyre_extensions.type_variable_operators import Concatenate
Ts = pyre_extensions.ListVariadic("Ts")
TReturn = TypeVar("TReturn")
def simple_partial_application(
    f: Callable[[Concatenate[float, Ts]], TReturn]
) -> Callable[[Ts], TReturn]:
    def inner(*args: Ts) -> TReturn:
        return f(42.0, *args)
    return inner
@simple_partial_application
def foo(x: float, y: str, z: bool) -> int:
    return 3
reveal_type(foo) # typing.Callable(foo)[[str, bool], int]
For more details on ListVariadics, you can read this presentation from the last typing summit (https://github.com/facebook/pyre-check/blob/master/docs/Variadic_Type_Variables_for_Decorators_and_Tensors.pdf)
The trade-off here is that by going in and out of this ListVariadic, we lose the names of the arguments, meaning that, for example, foo(y="A", z=True) would not be accepted by Pyre in the second example, even though it would work at runtime.
For reference, this is what supply_zero could look like:
def supply_zero(f: Callable[[Concatenate[int, SomeArguments]], R]) -> Callable[[SomeArguments], R]:
    def ret(*args: SomeArguments):
        return f(0, *args)
    return ret
Supporting mutation in the full-fidelity ParameterSpecifications would require rich handling of name collisions, which would get very complex very quickly. In my opinion working out the specification/implementation of that isn't worth blocking the rest of this, since it seems like the combination of these two features can get us a lot of the way there.
Thanks for the heads-up and detailed explanation! This looks very useful indeed as well. What I'm actually looking for currently is a way to let decorators transform arguments. E.g. you have a function taking any number of arguments (like str, int, int), and the decorator turns it into a function taking a list of each argument (List[str], List[int], List[int]). The decorator itself would then do the work of taking elements from the lists and giving them to the actual function. This seems to be still out of scope, even with ListVariadics, as far as I can see?
@rggjan , this is actually in scope for ListVariadics, and in fact is already implemented (but unfortunately only documented at this point in https://github.com/facebook/pyre-check/blob/master/docs/Variadic_Type_Variables_for_Decorators_and_Tensors.pdf).
The addition you'll need here is pyre_extensions.type_variable_operators.Map.
For your example, you'll need this:
from pyre_extensions import ListVariadic
from pyre_extensions.type_variable_operators import Map
from typing import Callable, TypeVar, List
Ts = ListVariadic("Ts")
TR = TypeVar("TR")
def transformer(f: Callable[[Ts], TR]) -> Callable[[Map[List, Ts]], TR]: ...
@transformer
def transformed(x: int, y: str, z: bool) -> None: ...
reveal_type(transformed) # Callable[[List[int], List[str], List[bool]], None]
Note that again you're losing the names of those parameters by passing through a ListVariadic, but the transformation you're looking for is there.
Does that work for your use case?
@mrkmndz Thanks a lot, that seems to be exactly what I was looking for (except that keeping the parameter names would be nice, of course). What are pyre and pyre_extensions, and how are they connected to mypy?
Which parts of @mrkmndz's pdf are on a standards track? Is this a strawman implementation for a PEP? (Otherwise, very interesting ideas in this document.)
@rggjan Pyre (https://pyre-check.org/) is another implementation of PEP484 type checking, and pyre_extensions (https://pypi.org/project/pyre-extensions/) is our pip package for the runtime components of our extensions that are not yet standardized.
As for MyPy, from my conversations with @ilevkivskyi and @JukkaL I believe that they are planning on building out a compatible implementation on the MyPy side in the near future with the help of @theodoretliu.
With regards to mapping a ParameterSpecification instead of a ListVariadic, one could definitely imagine an analogous Callable[Map[TParams, List], TR] syntax, but to me that seems too heavy-duty to implement for the amount of usage I have actually seen in practice. Map on ListVariadics comes up in enough places where we felt it was worth implementing there, and in my opinion, that same case does not exist for ParameterSpecifications.
@kaste I will be working on drafting a PEP on ListVariadic, Map, and Concatenate in the next month. I am planning on deferring the IntVar related stuff (e.g. Index etc.) into another one once that has an example implementation. Syntax is definitely subject to change, but I would anticipate that most of the core ideas there are planned to be headed for a standards-track PEP.
I think the initial post of this issue should be updated to briefly mention the current state of decorator support. A while ago, I skimmed through this issue and got the impression that "second-order decorators" (without modifying arguments) are not supported by Mypy. Only later did I discover that they are in fact supported.
This is also a documentation issue; the current documentation only explains bare decorators. I will look into submitting a PR for that.
Can this issue be closed now?
This business doesn't typecheck without the cast.
Now it can:
from typing import TypeVar, Callable, cast, ParamSpec
T = TypeVar('T')
P = ParamSpec('P')
def print_callcount(f: Callable[P, T]) -> Callable[P, T]:
    x = 0
    def ret(*args: P.args, **kwargs: P.kwargs) -> T:
        nonlocal x
        x += 1
        print("%d calls so far" % x)
        return f(*args, **kwargs)
    return ret # No cast!
@print_callcount
def lol(x: int) -> str:
    return "lol"*x
reveal_type(lol) # E: Revealed type is 'def (x: builtins.int) -> builtins.str'
The other broken example depends on completion of #8645
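For completeness, here is a hedged sketch of what supply_zero from the top of this issue could look like once Concatenate support (#8645, PEP 612) is complete; the exact revealed type depends on the mypy version:
from typing import Callable, TypeVar
from typing_extensions import Concatenate, ParamSpec

P = ParamSpec('P')
R = TypeVar('R')

def supply_zero(f: Callable[Concatenate[int, P], R]) -> Callable[P, R]:
    def ret(*args: P.args, **kwargs: P.kwargs) -> R:
        # Prepend the fixed int argument, forwarding the rest untouched.
        return f(0, *args, **kwargs)
    return ret

@supply_zero
def lol(x: int, y: str) -> str:
    return "%d and %s" % (x, y)

reveal_type(lol) # E: Revealed type is 'def (y: builtins.str) -> builtins.str'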