python/mypy

Declare and use attributes of function objects

dmoisset opened this issue · 57 comments

Python functions can have attributes, like

def some_function(x): ...
some_function.some_attribute = 42

The code above fails to typecheck, but there is real code that uses these attributes in a pretty structured way. For example, Django attaches metadata to a function that controls how the function's results are formatted when displayed in the admin (examples at https://docs.djangoproject.com/en/1.10/ref/contrib/admin/#django.contrib.admin.ModelAdmin.list_display). So it would be nice to be able to do something like:

class FunctionForAdmin(Callable[[], Any]):
    short_description: str
    boolean: bool = False
    admin_order_field: str
    empty_value_display: str

some_function: FunctionForAdmin
def some_function():
    return ...
some_function.short_description = "What a nice little function!"

And having it properly typechecked... is this possible now?

Oh wow, nice use case for PEP 526! I guess the equivalent pre-PEP-526 syntax would be

some_function = None  # type: FunctionForAdmin
def some_function():
    ...

(In a stub the initializer could be ... instead of None.)

Then all we have to do is change mypy so that this type-checks.

Type checking code like that could be tricky. A plain def some_function() isn't really compatible with FunctionForAdmin under ordinary type checking rules, since it doesn't define the extra attributes and it also isn't an instance of the class. Rewriting it makes this clear:

...
def _some_function():
    return ...
some_function: FunctionForAdmin = _some_function   # not compatible
some_function.short_description = "What a nice little function!"
# even now boolean etc. are missing from the object

For this to make sense we could perhaps support some form of multi-step initialization and structural subtyping, but it's unclear how exactly. Also Callable is not valid as a base class. We could perhaps define __call__ instead, but that wouldn't be quite as pretty.
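Something like this, roughly (just a sketch of the __call__ spelling):

from typing import Any

class FunctionForAdmin:
    short_description: str
    boolean: bool = False
    admin_order_field: str
    empty_value_display: str

    # Stand-in for "is callable like the original function":
    def __call__(self) -> Any: ...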

What about using a decorator instead that declares that a function conforms to a protocol (it doesn't define any attributes -- they need to come from somewhere else):

from typing import Protocol, apply_protocol

class AdminAttributes(Protocol):
    short_description: str
    boolean: bool = False
    ...

@apply_protocol(AdminAttributes)
def some_function():
    return ...
some_function.short_description = "..."

Here apply_protocol would generate an intersection type Intersection[AdminAttributes, <callable corresponding to the signature of the function>] for the function. This would have the benefits of working around the subtyping issue and preserving the exact signature of the function (via some magic).

@JukkaL your notation looks OK to me (I wasn't proposing a particular notation, just the general idea in the best way I could write it).

Any progress here? What's the latest way to work around this, other than just # type: ignore (my current method)?

No, this hasn't reached criticality yet...

Maybe this is a situation where we can (re-)use @declared_type from #3291 to specify a more precise type for the function?

Perhaps, though how would you define the type of a function f(int) -> int that also has an attribute foo: int? You can't inherit from Callable[[int], int].

You can't inherit from Callable[[int], int].

It is possible at runtime (since practically it is just an alias to collections.abc.Callable), but mypy will complain about this. Maybe we can do exactly the same as we do with Tuple[...] -- subclassing Callable would create a copy of it, but with a fallback to an actual instance?
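For comparison, here is a minimal illustration of the Tuple[...] analogy, which already type-checks today (Pair is a made-up name):

from typing import Tuple

class Pair(Tuple[int, int]):
    def total(self) -> int:
        return self[0] + self[1]

# mypy models Pair as a tuple type with the class as fallback,
# so both tuple behaviour and the extra method are understood.
p = Pair((1, 2))
p.total()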

Even if we allowed subclassing from Callable (like in the OP's initial comment), we'd still need some hack in mypy that allows a plain function to be assignable to a variable whose type is a subclass of Callable. See what Jukka wrote above.

we'd still need some hack in mypy that allows a plain function to be assignable to a variable whose type is a subclass of Callable. See what Jukka wrote above.

Exactly, this is why I propose to re-use @declared_type together with subclassing Callable:

class AdminAttributes(Callable[[int], int]):
    short_description: str
    boolean: bool = False
    ...

@declared_type(AdminAttributes)
def some_function(x: int) -> int:
    return ...
some_function.short_description = "..."

This way we would not need an additional decorator @apply_protocol; otherwise the semantics are the same as Jukka proposes.

But this doesn't follow from the rules for @declared_type -- it would require special-casing in mypy, because the "natural" type of some_function is not a subtype of AdminAttributes.

But this doesn't follow from the rules for @declared_type -- it would require special-casing in mypy, because the "natural" type of some_function is not a subtype of AdminAttributes.

Maybe I misunderstood @declared_type, but I thought it was always like this -- the type in @declared_type contains a more precise type for the function, so it should be a subtype of the type inferred by mypy, not vice versa, for example:

def magic_deco(func: Any) -> Any:
    # do some magic
    return func

@declared_type(Callable[[int], int])  # this is more precise than a "natural" type Any
@magic_deco
def fun(x: int) -> int:
    ...

Hm, I always thought it was the other way around. But what you say makes sense too.

I find the draft text for the PEP rather vague on this, and most unit tests, like your example above, skirt the issue by using Any, which is both a subtype and a supertype of everything.

If I look at the implementation it appears to be checking that the inferred type (i.e. before applying @declared_type) is a subtype of the declared type (i.e. the argument to @declared_type).

But this test seems to suggest that you're right.

Where did I go astray?

But this test seems to suggest that you're right.

Not really -- note that callables are contravariant in their arguments, so in that test the declared type is actually wider (i.e. "worse" in some sense) than the inferred one.
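To spell out the variance point with a tiny example:

from typing import Callable

class B: ...
class C(B): ...

def f(x: B) -> None: ...

# OK: a callable accepting any B also accepts any C, so the inferred
# Callable[[B], None] is a subtype of the wider Callable[[C], None].
g: Callable[[C], None] = f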

Anyway, FWIW, the above example already works with #3291 (if I define __call__ instead of subclassing Callable, which is not allowed now).

I am not sure what the reason for the subtyping check is, but it doesn't influence anything in PR #3291, it just emits an error. If I reverse the direction, the error of course disappears.

OK, so your suggestion sounds like it would work, but we're all confused about #3291? Let me ping @sixolet.

This request appeared again recently in #3882, so I think we should allow subclassing Callable, especially since it has already worked at runtime for quite some time and should be quite easy to implement (using fallbacks as I proposed above).

Also mentioning #3831 here, it discusses what to do with fallbacks for CallableType.

OK. I think we should also have a brief amendment to PEP 484 (and discuss with pytype and PyCharm folks).

Two updates:

  • It looks like @JukkaL doesn't like the idea of using @declared_type to work as a cast, so we could go with the originally proposed name @apply_protocol.
  • It was proposed on gitter to use the same decorator for classes, so that it would allow monkey-patching.

So the example usage would be like this:

class Described(Protocol):
    description: str

@apply_protocol(Described)
def func(arg: str) -> None:
    pass

func.description = 'This function does nothing.'
func.description = 1 # Error!
func.whatever = 'else' # Error!

and

class Patched(Protocol):
    def added_meth(self, x: int) -> int: ...

@apply_protocol(Patched)
class Basic:
    ...

def add_one_meth(self, x: int) -> int:
    return x + 1

Basic.added_meth = add_one_meth # OK
Basic.other = 1 # Error!

looks like @JukkaL doesn't like the idea of using @declared_type to work as a cast [...]

Where did he say that? And what exactly does it mean? (I have an idea.)

I like the idea of adding a protocol to a function to declare its attributes (since there's no easy other way). It extends the type. (Though perhaps there should also be some other, equivalent way, e.g. using @describe_type with a class that defines an appropriate __call__ method matching the function to which it's applied.)

But the extension to classes seems to have quite different semantics -- it seems to endow the class with writable methods. This seems a fairly arbitrary mechanism -- you could just as well create an ABC that has an attribute of type Callable and inherit from it. No protocols and magic decorator needed.
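Roughly like this (a sketch; a plain base class here, but an ABC works the same way -- and note the runtime caveat that a plain function stored as a class attribute gets bound as a method when accessed on an instance):

from typing import Callable, ClassVar, Optional

class Patchable:
    # Declared up front, so assigning it later is allowed:
    added_meth: ClassVar[Optional[Callable[[int], int]]] = None

class Basic(Patchable):
    ...

def add_one(x: int) -> int:
    return x + 1

Basic.added_meth = add_one  # OK: the attribute is declared on the base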

@gvanrossum

Where did he say that? And what exactly does it mean? (I have an idea.)

In the context of this comment #3291 (comment) about the direction of the subtype check.

But the extension to classes seems to have quite different semantics -- it seems to endow the class with writable methods. This seems a fairly arbitrary mechanism -- you could just as well create an ABC that has an attribute of type Callable and inherit from it. No protocols and magic decorator needed.

There are some situations where monkey-patching may be desirable/required. Currently mypy prohibits all monkey-patching of created classes:

class C:
    pass

C.x = 1 # Error!

But fully allowing it is not possible. Applying the proposed decorator to classes would tell mypy which attributes may be added outside of a class definition (very similar to the situation with functions).

I agree there's a use case. I'm not so sure using the same decorator for two different purposes is good API design -- I'd rather have one API for declaring function attributes and a different one for declaring monkey-patchable class attributes - they don't have much in common conceptually.

I'm not so sure using the same decorator for two different purposes is good API design

Using two different names is OK. But we need to be careful not to have too many. Currently three decorators are proposed:

  • @declared_type(typ)
  • @apply_protocol(proto)
  • @allow_patching(proto) or similar name

Maybe we can have only two -- one for classes and one for functions -- by somehow combining the first two in a single API? (But then this may conflict with the idea that it should not work as a cast).

They really are three different use cases. How would you document a combination of the first two bullets? If it starts by explaining the two different use cases, you're better off not combining them.

They really are three different use cases. How would you document a combination of the first two bullets?

If we allow @declared_type to act as a cast then both use cases are covered:

@declared_type(Callable[[int], int])
@complex_deco
def f(x: str) -> None:
    ...

class LabeledCallable:
    label: str
    def __call__(self, x: int) -> int: ...

@declared_type(LabeledCallable)
def g(x: int) -> int:
    ...

g.label = 'OK'

But if we don't want it to act as a cast, then yes, we need separate decorators. I see value in both choices. But it is probably not that important, so I am also leaning towards just having three separate names.

When you say "cast" do you mean something that overwrites the type without checking it, like the cast() function in PEP 484? Or is it one of those other meanings of the word, like C's cast from int to float or vice versa (really a conversion) or C++ casts (where I can never remember which way is considered down or up)? I really don't want anything else to behave like PEP 484 cast() -- it's too worrisome having another thing that changes the type without a check, one of those is enough. (Actually in a sense we have another way, # type: ignore. In any case we don't need another way.)

When you say "cast" do you mean something that overwrites the type without checking it, like the cast() function in PEP 484? Or is it one of those other meanings of the word, like C's cast from int to float or vice versa (really a conversion) or C++ casts (where I can never remember which way is considered down or up)?

I mainly mean the last one. I don't remember up/down either; here is an example:

class B:
    ...
class C(B):
    ...

@declared_type(Callable[[], C])  # OK
def func() -> B:
    ...

@declared_type(int)  # Error! Must be a subtype of inferred Callable[[str], str].
def other(x: str) -> str:
    ...

this would be implemented by just reversing the direction of the subtype check in PR #3291. But anyway it is still a cast; we still need to "trust" the user to a certain extent here. IIUC the current choice of direction is to avoid false negatives independently of what appears in @declared_type(...), so maybe we are better off keeping it and just having three decorators (as discussed above).

So, thinking and reading about it some more, a way to remember is that an upcast goes in the direction of the superclass. It's considered safe by mypy (which is why we complain about Liskov violations).

class B: ...
class C(B): ...
@declared_type(Callable[[], C])  # OK
def func() -> B: ...

This is an unsafe downcast, because mypy won't actually verify that func() returns a C. So this is the form of cast I don't like. I'm also unsure what would be the use case (why not just declare func() as returning C and add a cast() call in the return statement if necessary?).
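To illustrate, the same unsafety can be spelled today with an ordinary cast() (c_only is a made-up name):

from typing import Callable, cast

class B: ...
class C(B):
    def c_only(self) -> None: ...

def func() -> B:
    return B()

f2 = cast(Callable[[], C], func)  # the unchecked downcast
f2().c_only()  # type-checks, but raises AttributeError at runtime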

I'd prefer to limit @declared_type() to types that are supertypes of the inferred type (and IIUC that's currently how #3291 works).

The proposed @apply_protocol would only work for functions and depend on the implementation of functions always allowing arbitrary attributes; see #2087 (comment).

For a monkey-patchable class we'd have to invent a third form, @allow_patching:

class Patched(Protocol):
    def added_meth(self, x: int) -> int: ...

@allow_patching(Patched)
class Basic:
    ...

def add_one_meth(self, x: int) -> int:
    return x + 1

Basic.added_meth = add_one_meth # OK
Basic.other = 1 # Error!

I agree that @declared_type should not work as a cast.

For a monkey-patchable class we'd have to invent a third form, @allow_patching

More generally, it might make sense to somehow support applying protocols to classes defined elsewhere, including to library classes. However, this would make modular reasoning harder. Say, if module x patches class y.A, would an unrelated module z see the patched methods? If yes, this makes the implementation quite tricky. I had some ideas about this in the early days of mypy but I gave up because of the apparent complexity. And people could just fork the library stubs and apply protocols directly to library classes.

@gvanrossum

I'm also unsure what would be the use case (why not just declare func() as returning C and add a cast() call in the return statement if necessary?).

This is an oversimplified example; the actual use case would be a subtype of Callable with additional attributes, like in #2087 (comment) (but defining __call__ instead of subclassing Callable, while the latter is prohibited). But...

@JukkaL

I agree that @declared_type should not work as a cast.

OK, then let us make three separate decorators:

  • @declared_type(typ)
  • @apply_protocol(proto)
  • @allow_patching(proto)

I propose to implement the simple version (allow them only at class/function definition site as decorators) first. Then we will see if someone wants the "action at a distance" behaviour.

Are you going to use Naomi's (@sixolet's) work on @declared_type() as a starting point?

If necessary I can actually finish #3291. Adding the two other decorators seems quite simple and straightforward. For @allow_patching I will update names with the necessary precautions; for @apply_protocol I will just set the CallableType fallback to the protocol Instance (it will probably also be necessary to manually insert the standard function attributes; I don't like the fake builtins.function).

OK, then let us make three separate decorators:

I'm not sure if we are quite ready for an implementation yet.

@apply_protocol doesn't support decorators that add attributes (#3882, for example). It looks like a general approach would have to support declaring that the callable returned by a function conforms to a protocol. We could have a separate new decorator for that, but this would be inelegant. I feel like we are inventing one-off features that could be more generally represented using intersection types -- but intersection types have another set of issues.

Also, I don't remember seeing any real use case where @allow_patching would have been clearly useful. There was a user on gitter who wanted to monkey patch classes but it's unclear if @apply_protocol would have helped them. I think that this feature needs more research.

(Raising priority to high since this is a popular feature request).

Also it looks like we didn't discuss another popular use case here: adding attributes from another class in a generic manner. For example, if someone wants to define a proxy class:

class OtherCls:
    val: str

@allow_attributes_of(T)  # We can first support concrete classes/protocols like above
class Proxy(Generic[T]):
    def __init__(self, wrapped: T) -> None:
        self.wrapped = wrapped
    def __getattr__(self, attr):  # This pattern is used quite often
        ...
        return getattr(self.wrapped, attr)

pr = Proxy(OtherCls())
pr.val  # OK, inferred type is "str"

FWIW, TypeScript solved this problem in 3.1 by changing its type inference to find properties on functions and include them in the inferred type. Maybe mypy could copy or learn something from that approach?

see "Easier properties on function declarations" in https://blogs.msdn.microsoft.com/typescript/announcing-typescript-3-1/

Oh, that's nice. So after

def foo() -> None: ...
foo.bar = 12

the type of foo is "callable with no args returning None and attribute bar of type int".

It is worth mentioning, though, that TypeScript also has a way of expressing the type, which existed before this intelligent inference.

You can do the same in mypy with protocols:

class FancyCB(Protocol):
    def __call__(self) -> None: ...
    bar: int

And it will work as expected. The problem is that mypy currently prohibits "monkey-patching": once the body of a function/class ends, there is no way to add anything more. So we need to relax this restriction somehow. I am not sure always allowing it is a good idea. For example, what if an attribute is added in another module after importing the function? Should we also allow this? I would say no.
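Today the workaround is an explicit cast, so that mypy sees the protocol type from the start (a sketch; _foo is a made-up name):

from typing import Protocol, cast

class FancyCB(Protocol):
    bar: int
    def __call__(self) -> None: ...

def _foo() -> None: ...

foo = cast(FancyCB, _foo)  # mypy now knows about .bar
foo.bar = 12               # OK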

Perhaps the "same scope" rule is what we want?

Yes, maybe. Anyway, we don't have anyone to work on this now :-(

Perhaps the "same scope" rule is what we want?

I like this idea the most. We'd basically have a special construct like this (a function definition followed by some attribute assignments):

def name(...): 
    ...

name.attr = expr
name.attr2 = expr2
...

Mypy would internally construct a new protocol type for the function.

There are a few remaining questions:

  • Does the assignment have to immediately follow the function definition? I think that anywhere in the same scope after the function definition should be accepted.
  • How to use this in stubs? Perhaps like name.attr: a_type. This also implies that a variable annotation should be allowed in the assignment, and also a type comment.

Idea for an implementation: The semantic analyzer could recognize this construct and record the attributes/assignments that apply to each function. We'd mark the type of the function "partial" during type checking until all relevant assignments have been processed. If all attributes have annotations, we could perhaps construct the type earlier during semantic analysis.

I was having the same issue (trying to add an action to Django admin) and Google kept bringing me here. I'm pretty new to typing/mypy, so I don't know if this is perfect, but in case it's of use to anyone, I was able to solve my issue with the workaround below:

from typing import Any, Protocol

from django.contrib import admin
from django.db.models import QuerySet
from django.http import HttpRequest


class AdminAttributes(Protocol):
    short_description: str


def admin_attr_decorator(func: Any) -> AdminAttributes:
    return func


@admin_attr_decorator
def action(modeladmin: admin.ModelAdmin, request: HttpRequest, queryset: QuerySet) -> None:
    do_stuff()


action.short_description = 'do stuff'

I have a use case for this as well. I would like to extend a Callable type with attributes.

Not to add to the noise -- I'm just documenting this here for my future self, since I know I'll google this use case again in the future.

from typing import Callable, Any
import functools


def execute_only_once(func: Callable[..., Any]) -> Callable[..., Any]:
    """
    decorator function that ensures that the decorated function
    is only ran once.

    usage:
        class Cool:
            @execute_only_once
            def my_func(self, name):
                print("name:", name)
        c = Cool()
        c.my_func(name="John")
        c.my_func(name="James")
    """

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        if not wrapper.already_run:
            wrapper.already_run = True
            return func(*args, **kwargs)

    wrapper.already_run = False
    return wrapper

This currently fails with the error:

error: "Callable[..., Any]" has no attribute "already_run"

I have the same issue, wanting to use function attributes as a way of storing static variables for a function, e.g.:

def camel_to_snake(x: str) -> str:
    x = x.replace(" ", "_")
    try:
        return camel_to_snake.regex.sub(r"\1_\2", x).lower()
    except AttributeError:
        camel_to_snake.regex = re.compile(r"([^A-Z]+?)([A-Z])")
        return camel_to_snake.regex.sub(r"\1_\2", x).lower()

gives:

error: "Callable[[str], str]" has no attribute "regex"

Lowering priority since the core team is unlikely to have bandwidth to work on this in the near/medium term.

If you use @chris104957's nice hack and want the function to actually still be callable for mypy, the following works with mypy 0.761, although I don't know if it's perfectly legit:

from typing import Protocol, TypeVar, Callable, Optional, cast

# Note: can use a more restrictive bound if wanted.
F = TypeVar("F", bound=Callable[..., object])

class ActionWithAttributes(Protocol[F]):
    short_description: Optional[str]
    __call__: F

def action_with_attributes(action: F) -> ActionWithAttributes[F]:
    action_with_attributes = cast(ActionWithAttributes[F], action)
    # Make sure the cast isn't a lie.
    action_with_attributes.short_description = None
    return action_with_attributes

Testing with mypy --strict:

@action_with_attributes
def my_action(some: int, params: str) -> str:
    return "it works"

my_action.short_description = 'do stuff'
# -> OK

my_action.bad = ''
# -> error: "ActionWithAttributes[Callable[[int, str], str]]" has no attribute "bad"

my_action("wrong")
# -> error: Too few arguments for "my_action"
# -> error: Argument 1 to "my_action" has incompatible type "str"; expected "int"

my_action(10, params="ok")
# -> OK

my_action(10, typo="ok")
# -> error: Unexpected keyword argument "typo" for "my_action"

reveal_type(my_action.short_description)
# -> note: Revealed type is 'builtins.str'

This hack doesn't seem to work for methods, as it complains about missing arguments when calling the method (it stops thinking that self was passed in implicitly I think). Is there any workaround for that?

@libre-man you can likely fix this by annotating __call__ with @staticmethod, depending on your exact goal (seems to work for me -- the static method declaration just makes it look like a regular function to the checker).
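If I read that suggestion right, it means something like this (a sketch with a concrete signature, since the generic __call__: F attribute can't be decorated; exact behaviour may depend on your mypy version):

from typing import Optional, Protocol

class ActionWithAttributes(Protocol):
    short_description: Optional[str]

    # Declared static so mypy doesn't bind an implicit "self" when
    # the decorated callable is used as a method.
    @staticmethod
    def __call__(some: int, params: str) -> str: ...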

I was having the same issue (trying to add an action to Django admin) and Google kept bringing me here. I'm pretty new to typing/mypy, so I don't know if this is perfect, but in case its of use to anyone, I was able to solve my issue with the below workaround:

class AdminAttributes(Protocol):
    short_description: str


def admin_attr_decorator(func: Any) -> AdminAttributes:
    return func


@admin_attr_decorator
def action(modeladmin: admin.ModelAdmin, request: HttpRequest, queryset: QuerySet) -> None:
    do_stuff()


action.short_description = 'do stuff'

@chris104957's admin_attr_decorator workaround above worked for me, but if anyone does not know what a Protocol is (like me): it comes from the import from typing_extensions import Protocol; see https://mypy.readthedocs.io/en/stable/protocols.html

I have a similar issue, and I don't know how to solve it: I have a decorator factory that returns a decorator. The decorator adds one or more attributes to the decorated function, which are known beforehand. The only thing we know about the decorated functions is that they won't ever return anything (they're there purely for the side effects).

Example:

@dataclass
class Metadata:
    events: list[str] = field(default_factory=list)
    
def process(event_type: str) -> ???: # what should this type be?
    def process_decorator(func: Callable[..., None]) -> ???: # and what should this type be?
        func.metadata = getattr(func, "metadata", Metadata()) # functions might be decorated multiple times, so the metadata attribute might already exist
        func.metadata.events.append(event_type)
        return func

    return process_decorator

What are the return types for the decorator factory and the decorator?

If I understand your description correctly, this should work:

from typing import Callable, ParamSpec, Protocol, TypeVar, cast

P = ParamSpec("P")
R = TypeVar("R", covariant=True)

class FunctionWithMetadata(Protocol[P, R]):
    metadata: Metadata

    def __call__(self, *args: P.args, **kwargs: P.kwargs) -> R:
        ...

def process(event_type: str) -> Callable[[Callable[P, R]], FunctionWithMetadata[P, R]]:
    def process_decorator(func: Callable[P, R]) -> FunctionWithMetadata[P, R]:
        func.metadata = getattr(func, "metadata", Metadata())
        func.metadata.events.append(event_type)
        return cast(FunctionWithMetadata[P, R], func)

    return process_decorator
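Usage would then look like this (handle and "created" are made-up names) -- the attribute is visible and the signature is preserved, though note that a later comment reports the decorator body itself erroring on newer mypy:

@process("created")
def handle(payload: str) -> None:
    ...

handle("some event")           # the ParamSpec keeps the original signature
print(handle.metadata.events)  # ['created']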

@ilevkivskyi:

(Raising priority to high since this is a popular feature request).

Also it looks like we didn't discuss another popular use case here: adding attributes from another class in a generic manner. For example ...

Such a proxy class could, for example, also check that all attributes of a dataclass were read. That would preserve type checking of the attributes being read and also make it possible to ensure that no attributes were left unaccessed. This would come in handy when reading schema objects and applying them to models in a web server scenario.

The idea suggested in #2087 (comment) unfortunately doesn't work with the latest mypy 1.0.0 (at the time of writing):

# asd.py
from typing import *

class Metadata:
    ...

P = ParamSpec("P")

R = TypeVar("R", covariant=True)

class FunctionWithMetadata(Protocol[P, R]):
    metadata: Metadata

    def __call__(self, *args: P.args, **kwargs: P.kwargs) -> R:
        ...

def process(event_type: str) -> Callable[[Callable[P, R]], FunctionWithMetadata[P, R]]:
    def process_decorator(func: Callable[P, R]) -> FunctionWithMetadata[P, R]:
        func.metadata = getattr(func, "metadata", Metadata())
        func.metadata.events.append(event_type)
        return cast(FunctionWithMetadata[P, R], func)

    return process_decorator

$ mypy asd.py
asd.py:18: error: "Callable[P, R]" has no attribute "metadata"  [attr-defined]
asd.py:19: error: "Callable[P, R]" has no attribute "metadata"  [attr-defined]
Found 2 errors in 1 file (checked 1 source file)

I was having the same issue (trying to add an action to Django admin) and Google kept bringing me here. I'm pretty new to typing/mypy, so I don't know if this is perfect, but in case its of use to anyone, I was able to solve my issue with the below workaround:

class AdminAttributes(Protocol):
    short_description: str


def admin_attr_decorator(func: Any) -> AdminAttributes:
    return func


@admin_attr_decorator
def action(modeladmin: admin.ModelAdmin, request: HttpRequest, queryset: QuerySet) -> None:
    do_stuff()


action.short_description = 'do stuff'

This worked for me, but If anyone did not know what a Protocol is (like me). It comes from this import from typing_extensions import Protocol https://mypy.readthedocs.io/en/stable/protocols.html

First, for mypy 1.0.0, to get the admin_attr_decorator workaround from earlier in the thread to work (including actually calling action()) I had to add a dummy __call__ to AdminAttributes, like this:

from typing import Any, Protocol

class AdminAttributes(Protocol):
    short_description: str
    def __call__(self):
        pass

def admin_attr_decorator(func: Any) -> AdminAttributes:
    return func

@admin_attr_decorator
def action() -> None:
    print(action.short_description)

action.short_description = 'doing stuff'
action()

However, mypy also allows this, which is more concise, maybe more standard, and I think in most cases (maybe all?) achieves the same result:

class Action():
    short_description: str = "Default description"
    def __call__(self) -> None:
        print(self.short_description)
action: Action = Action()

action()
action.short_description = "doing stuff"
action()

Here's another way to achieve this that mypy does not allow:

class action():
    short_description: str = "Default description"
    def __new__(self) -> str: 
        print(self.short_description)
        return self.short_description

action.short_description = "doing stuff"
action()

mypy says:
test.py:5: error: Incompatible return type for "__new__" (returns "str", but must return a subtype of "action") [misc]

Maybe that's a mypy requirement, but it doesn't seem to be a python requirement:
discuss.python.org/t/metaclass-new-return-value/13376/6

I can even make an example based on this that uses no type annotations at all, and mypy rejects it even without --check-untyped-defs:

class foo():
    def print(self):  
        print("Hello World")
        return ("fooish")

class bar():
    ''' A callable class with a return value'''
    def __new__(self):  
        return foo()

bar().print()

This works fine in python 3.8 but mypy says:
test3.py:12: error: "bar" has no attribute "print" [attr-defined]

I guess I am surprised to see how much valid python mypy restricts, even for completely untyped code.

I'm new to mypy and may well miss the point, but it surprises me given that mypy is advertised as the reference implementation of typing PEPs, yet seems to be imposing non-reference restrictions on the language. Of course I get the argument above, that it may simply be really hard to do.

Here are the related bugs for __new__: #1020 #14426 #8330
It happens to be related because one may try to use either __new__ or function attributes to create a callable with an arbitrary return type and with "static" attributes that may or may not need to be externally accessible, and in both cases find that mypy complains. It's also related because of the general issue discussed there regarding whether the purpose of mypy should be to enforce good coding practices, in some cases without (I think) even any good way to opt out of the enforcement. Should there even be any discussion of valid use cases in a reference implementation?

This is a long and very old issue. Note that you can basically accomplish what is needed using Protocols:

from typing import *

class FunctionForAdmin:
    short_description: str
    boolean: bool
    admin_order_field: str
    empty_value_display: str
    
    def __call__(self) -> Any: ...  # or whatever signature you want

# Option 1: Declare type of some_function as below, note you may have to # type: ignore[no-redef] or disable that error code
some_function: FunctionForAdmin

# Option 2: Declare a decorator like in https://github.com/python/mypy/issues/2087#issuecomment-462726600, then use it to decorate some_function

def some_function():
    return ...
    
reveal_type(some_function.short_description)

As others have already noted, in some cases you may be better served by using a normal class with __call__ instead of patching attributes onto functions.

If you want to discuss intersection types, python/typing#213 / https://github.com/CarliJoy/intersection_examples is a good place to do this.

If you have needs that are not met by the above, please open a new issue.

@alessio-b2c2 you need a type: ignore in your inner function (or make the input type Any), and everything will work nicely. mypy is not going to understand the process of turning a function from a Callable into a FunctionWithMetadata.
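Concretely, that means something like this inside the process definition from above (a sketch of that suggestion):

def process_decorator(func: Callable[P, R]) -> FunctionWithMetadata[P, R]:
    # mypy cannot model attribute creation on a plain Callable, so
    # silence the two assignments it flags:
    func.metadata = getattr(func, "metadata", Metadata())  # type: ignore[attr-defined]
    func.metadata.events.append(event_type)  # type: ignore[attr-defined]
    return cast(FunctionWithMetadata[P, R], func)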