In 2006, the Python core team wrote PEP 3107, which introduced function annotations for Python 3.x. Nearly 4 years ago, I wrote this response to the PEP, but I published it to a discussion site that ended up becoming defunct (Clusterify). I saw that interest in using function annotations for type-checking was recently revived by GvR, and thought I might resurrect this discussion.

There is a huge flaw in the design of Python annotations, IMO. The problem only arises when you consider that at some point in the future, there may be more than one use case for function annotations (as the PEP itself suggests). For example, let's say that in my code, I use function annotations both for documentation and for optional run-time type checking. If I have one framework that expects all the annotations on my function definition to be docstrings, and another framework that expects all the annotations to be classes, how do I annotate my function with both documentation and type checks? This amounts to a lack of a standard for layering function annotations.

It's true that some standard for this could organically form in the community. For example, one could imagine tuples being used for this: if an annotation expression is a tuple, then every framework should iterate through the items of the tuple until it finds an item of the matching type. However, this won't always work: what if two frameworks are both expecting strings, or both expecting classes, with different semantics? If we used dicts instead, this could be avoided, since each framework could simply look up its own key in the annotation.

Function Annotation Syntax

An aside: my personal opinion is that the function annotation syntax is very, very ugly, and will quickly clutter function definitions so as to make them totally unreadable. If you compare how annotations would look using the PEP 3107 syntax to the kinds of "annotation decorators" you find in the Pyanno project, you can see exactly what I mean. Isn't this getting very ugly, very quickly?

What I'm proposing for this project is to work on a PEP 3107 "meta-framework": that is, a module that provides a set of functions, classes, and possibly decorators that make the function annotation support provided by PEP 3107 much easier to use, and that makes it possible for people to write function annotation processors that work around the problem described above by following some basic conventions for annotation layering.

Here's one idea, but I'd like to hear others. We could work around PEP 3107's syntax by mostly ignoring it. For example, let's say we created a class called "Annotation" and a corresponding "ann" function. Annotations could be declared thusly:

foo_args = Annotation(std_type_check=list, doc="arguments")
foo_kwargs = Annotation(std_type_check=dict, doc="keyword arguments")
foo_return = Annotation(std_type_check=Bar, doc="a Bar instance")

def foo(*args: foo_args, **kwargs: foo_kwargs) -> foo_return:
    ...

… or …

def foo(*args: ann(std_type_check=list, doc="arguments"),
        **kwargs: ann(std_type_check=dict, doc="keyword arguments")) \
        -> ann(std_type_check=Bar, doc="a Bar instance"):
    ...

Of course, these keyword arguments aren't very heavily namespaced, but they could be if we were to tweak the design a bit: the arguments that the Annotation class accepts could be registered dynamically by frameworks, so that, for example, importing foo.stdtypecheck is what makes std_type_check available.
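The tuple convention described above fits in a few lines. This is my own illustration, not code from the post; the helper name get_annotation and the example function foo are hypothetical:

```python
# Sketch of the tuple-layering convention: each framework scans a tuple
# annotation for the first item of the type it understands.
# `get_annotation` and `foo` are illustrative names only.
def get_annotation(func, name, expected_type):
    """Return the annotation layer for `name` matching `expected_type`."""
    ann = func.__annotations__.get(name)
    items = ann if isinstance(ann, tuple) else (ann,)
    for item in items:
        if isinstance(item, expected_type):
            return item
    return None

def foo(x: ("the x value", int)):  # documentation layer + type layer
    return x

print(get_annotation(foo, "x", str))   # the docs framework finds its string
print(get_annotation(foo, "x", type))  # the type-checker finds the class
```

Note that this breaks down exactly where the post says it does: if two frameworks both look for a string layer, neither can tell whose string it is.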
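The dict-based variant the post gestures at might look like the following sketch. The "doc"/"type" key names and the typecheck decorator are my assumptions for illustration, not an established convention:

```python
import functools
import inspect

# Hedged sketch of dict-based annotation layering: each framework reads only
# its own key, so layers cannot collide. The "doc" and "type" keys and the
# `typecheck` decorator are illustrative, not part of any real standard.
def typecheck(func):
    sig = inspect.signature(func)

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        for name, value in bound.arguments.items():
            ann = func.__annotations__.get(name)
            if isinstance(ann, dict) and "type" in ann:
                if not isinstance(value, ann["type"]):
                    raise TypeError(f"{name!r} must be {ann['type'].__name__}")
        return func(*args, **kwargs)
    return wrapper

@typecheck
def double(x: {"doc": "the value to double", "type": int}):
    return x * 2

print(double(21))     # 42
# double("oops") would raise TypeError: 'x' must be int
```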
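The proposed Annotation/ann pair could be implemented minimally as below. This is one possible design under my own assumptions; the get method and the plain-dict layer storage are my choices, not the post's:

```python
# Minimal sketch of the proposed Annotation class and `ann` helper: every
# keyword argument becomes a named layer that a framework looks up by key,
# so a docs tool and a type-checker never fight over the same slot.
# The `get` method is my design choice, not part of the post's proposal.
class Annotation:
    def __init__(self, **layers):
        self.layers = layers

    def get(self, key, default=None):
        return self.layers.get(key, default)

def ann(**layers):
    return Annotation(**layers)

foo_args = Annotation(std_type_check=list, doc="arguments")

def foo(*args: foo_args):
    return list(args)

a = foo.__annotations__["args"]
print(a.get("doc"))             # the docs layer
print(a.get("std_type_check"))  # the type-checking layer
```

Dynamic registration of allowed keys (the foo.stdtypecheck idea) could then be a thin validation layer on top of this, rejecting unknown keyword arguments unless a framework has registered them.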