"""
The typing module: Support for gradual typing as defined by PEP 484 and subsequent PEPs.

Among other things, the module includes the following:
* Generic, Protocol, and internal machinery to support generic aliases.
  All subscripted types like X[int], Union[int, str] are generic aliases.
* Various "special forms" that have unique meanings in type annotations:
  NoReturn, Never, ClassVar, Self, Concatenate, Unpack, and others.
* Classes whose instances can be type arguments to generic classes and functions:
  TypeVar, ParamSpec, TypeVarTuple.
* Public helper functions: get_type_hints, overload, cast, final, and others.
* Several protocols to support duck-typing:
  SupportsFloat, SupportsIndex, SupportsAbs, and others.
* Special types: NewType, NamedTuple, TypedDict.
* Deprecated aliases for builtin types and collections.abc ABCs.

Any name not present in __all__ is an implementation detail
that may be changed without notice. Use at your own risk!
"""

from abc import abstractmethod, ABCMeta
import collections
from collections import defaultdict
import collections.abc
import copyreg
import functools
import operator
import sys
import types
from types import WrapperDescriptorType, MethodWrapperType, MethodDescriptorType, GenericAlias

from _typing import (
    _idfunc,
    TypeVar,
    ParamSpec,
    TypeVarTuple,
    ParamSpecArgs,
    ParamSpecKwargs,
    TypeAliasType,
    Generic,
    NoDefault,
)

# Please keep __all__ alphabetized within each category.
__all__ = [
    # Super-special typing primitives.
    'Annotated',
    'Any',
    'Callable',
    'ClassVar',
    'Concatenate',
    'Final',
    'ForwardRef',
    'Generic',
    'Literal',
    'Optional',
    'ParamSpec',
    'Protocol',
    'Tuple',
    'Type',
    'TypeVar',
    'TypeVarTuple',
    'Union',

    # ABCs (from collections.abc).
    'AbstractSet',  # collections.abc.Set.
    'ByteString',
    'Container',
    'ContextManager',
    'Hashable',
    'ItemsView',
    'Iterable',
    'Iterator',
    'KeysView',
    'Mapping',
    'MappingView',
    'MutableMapping',
    'MutableSequence',
    'MutableSet',
    'Sequence',
    'Sized',
    'ValuesView',
    'Awaitable',
    'AsyncIterator',
    'AsyncIterable',
    'Coroutine',
    'Collection',
    'AsyncGenerator',
    'AsyncContextManager',

    # Structural checks, a.k.a. protocols.
    'Reversible',
    'SupportsAbs',
    'SupportsBytes',
    'SupportsComplex',
    'SupportsFloat',
    'SupportsIndex',
    'SupportsInt',
    'SupportsRound',

    # Concrete collection types.
    'ChainMap',
    'Counter',
    'Deque',
    'Dict',
    'DefaultDict',
    'List',
    'OrderedDict',
    'Set',
    'FrozenSet',
    'NamedTuple',  # Not really a type.
    'TypedDict',  # Not really a type.
    'Generator',

    # Other concrete types.
    'BinaryIO',
    'IO',
    'Match',
    'Pattern',
    'TextIO',

    # One-off things.
    'AnyStr',
    'assert_type',
    'assert_never',
    'cast',
    'clear_overloads',
    'dataclass_transform',
    'final',
    'get_args',
    'get_origin',
    'get_overloads',
    'get_protocol_members',
    'get_type_hints',
    'is_protocol',
    'is_typeddict',
    'LiteralString',
    'Never',
    'NewType',
    'no_type_check',
    'no_type_check_decorator',
    'NoDefault',
    'NoReturn',
    'NotRequired',
    'overload',
    'override',
    'ParamSpecArgs',
    'ParamSpecKwargs',
    'ReadOnly',
    'Required',
    'reveal_type',
    'runtime_checkable',
    'Self',
    'Text',
    'TYPE_CHECKING',
    'TypeAlias',
    'TypeGuard',
    'TypeIs',
    'TypeAliasType',
    'Unpack',
]


def _type_convert(arg, module=None, *, allow_special_forms=False):
    """For converting None to type(None), and strings to ForwardRef."""
    if arg is None:
        return type(None)
    if isinstance(arg, str):
        return ForwardRef(arg, module=module, is_class=allow_special_forms)
    return arg


def _type_check(arg, msg, is_argument=True, module=None, *, allow_special_forms=False):
    """Check that the argument is a type, and return it (internal helper).

    As a special case, accept None and return type(None) instead. Also wrap strings
    into ForwardRef instances. Consider several corner cases, for example plain
    special forms like Union are not valid, while Union[int, str] is OK, etc.
    The msg argument is a human-readable error message, e.g.::

        "Union[arg, ...]: arg should be a type."

    We append the repr() of the actual value (truncated to 100 chars).
    """
    invalid_generic_forms = (Generic, Protocol)
    if not allow_special_forms:
        invalid_generic_forms += (ClassVar,)
        if is_argument:
            invalid_generic_forms += (Final,)

    arg = _type_convert(arg, module=module, allow_special_forms=allow_special_forms)
    if (isinstance(arg, _GenericAlias) and
            arg.__origin__ in invalid_generic_forms):
        raise TypeError(f"{arg} is not valid as type argument")
    if arg in (Any, LiteralString, NoReturn, Never, Self, TypeAlias):
        return arg
    if allow_special_forms and arg in (ClassVar, Final):
        return arg
    if isinstance(arg, _SpecialForm) or arg in (Generic, Protocol):
        raise TypeError(f"Plain {arg} is not valid as type argument")
    if type(arg) is tuple:
        raise TypeError(f"{msg} Got {arg!r:.100}.")
    return arg


def _is_param_expr(arg):
    return arg is ... or isinstance(arg,
                                    (tuple, list, ParamSpec, _ConcatenateGenericAlias))


def _should_unflatten_callable_args(typ, args):
    """Internal helper for munging collections.abc.Callable's __args__.

    The canonical representation for a Callable's __args__ flattens the
    argument types, see https://github.com/python/cpython/issues/86361.

    For example::

        >>> import collections.abc
        >>> P = ParamSpec('P')
        >>> collections.abc.Callable[[int, int], str].__args__ == (int, int, str)
        True
        >>> collections.abc.Callable[P, str].__args__ == (P, str)
        True

    As a result, if we need to reconstruct the Callable from its __args__,
    we need to unflatten it.
    """
    return (
        typ.__origin__ is collections.abc.Callable
        and not (len(args) == 2 and _is_param_expr(args[0]))
    )


def _type_repr(obj):
    """Return the repr() of an object, special-casing types (internal helper).

    If obj is a type, we return a shorter version than the default
    type.__repr__, based on the module and qualified name, which is
    typically enough to uniquely identify a type. For everything
    else, we fall back on repr(obj).
    """
    # When changing this function, don't forget about
    # `_collections_abc._type_repr`, which does the same thing
    # and must be consistent with this one.
    if isinstance(obj, type):
        if obj.__module__ == 'builtins':
            return obj.__qualname__
        return f'{obj.__module__}.{obj.__qualname__}'
    if obj is ...:
        return '...'
    if isinstance(obj, types.FunctionType):
        return obj.__name__
    if isinstance(obj, tuple):
        # Special case for `repr` of types with `ParamSpec`:
        return '[' + ', '.join(_type_repr(t) for t in obj) + ']'
    return repr(obj)


def _collect_type_parameters(args, *, enforce_default_ordering: bool = True):
    """Collect all type parameters in args
    in order of first appearance (lexicographic order).

    For example::

        >>> P = ParamSpec('P')
        >>> T = TypeVar('T')
        >>> _collect_type_parameters((T, Callable[P, T]))
        (~T, ~P)
    """
    # required type parameter cannot appear after parameter with default
    default_encountered = False
    # or after TypeVarTuple
    type_var_tuple_encountered = False
    parameters = []
    for t in args:
        if isinstance(t, type):
            # We don't want __parameters__ descriptor of a bare Python class.
            pass
        elif isinstance(t, tuple):
            # `t` might be a tuple, when `ParamSpec` is substituted with
            # `[T, int]`, or `[int, *Ts]`, etc.
            for x in t:
                for collected in _collect_type_parameters([x]):
                    if collected not in parameters:
                        parameters.append(collected)
        elif hasattr(t, '__typing_subst__'):
            if t not in parameters:
                if enforce_default_ordering:
                    if type_var_tuple_encountered and t.has_default():
                        raise TypeError('Type parameter with a default'
                                        ' follows TypeVarTuple')

                    if t.has_default():
                        default_encountered = True
                    elif default_encountered:
                        raise TypeError(f'Type parameter {t!r} without a default'
                                        ' follows type parameter with a default')

                parameters.append(t)
        else:
            if _is_unpacked_typevartuple(t):
                type_var_tuple_encountered = True
            for x in getattr(t, '__parameters__', ()):
                if x not in parameters:
                    parameters.append(x)
    return tuple(parameters)


def _check_generic_specialization(cls, arguments):
    """Check correct count for parameters of a generic cls (internal helper).

    This gives a nice error message in case of count mismatch.
    """
    expected_len = len(cls.__parameters__)
    if not expected_len:
        raise TypeError(f"{cls} is not a generic class")
    actual_len = len(arguments)
    if actual_len != expected_len:
        # deal with defaults
        if actual_len < expected_len:
            # If the parameter at index `actual_len` in the parameters list
            # has a default, then all parameters after it must also have
            # one, because we validated as much in _collect_type_parameters().
            # That means that no error needs to be raised here, despite
            # the number of arguments being passed not matching the number
            # of parameters: all parameters that aren't explicitly
            # specialized in this call are parameters with default values.
            if cls.__parameters__[actual_len].has_default():
                return

            expected_len -= sum(p.has_default() for p in cls.__parameters__)
            expect_val = f"at least {expected_len}"
        else:
            expect_val = expected_len

        raise TypeError(f"Too {'many' if actual_len > expected_len else 'few'} arguments"
                        f" for {cls}; actual {actual_len}, expected {expect_val}")


def _unpack_args(*args):
    newargs = []
    for arg in args:
        subargs = getattr(arg, '__typing_unpacked_tuple_args__', None)
        if subargs is not None and not (subargs and subargs[-1] is ...):
            newargs.extend(subargs)
        else:
            newargs.append(arg)
    return newargs

def _deduplicate(params, *, unhashable_fallback=False):
    # Weed out strict duplicates, preserving the first of each occurrence.
    try:
        return dict.fromkeys(params)
    except TypeError:
        if not unhashable_fallback:
            raise
        # Happens for cases like `Annotated[dict, {'x': IntValidator()}]`
        return _deduplicate_unhashable(params)

def _deduplicate_unhashable(unhashable_params):
    new_unhashable = []
    for t in unhashable_params:
        if t not in new_unhashable:
            new_unhashable.append(t)
    return new_unhashable

def _compare_args_orderless(first_args, second_args):
    first_unhashable = _deduplicate_unhashable(first_args)
    second_unhashable = _deduplicate_unhashable(second_args)
    t = list(second_unhashable)
    try:
        for elem in first_unhashable:
            t.remove(elem)
    except ValueError:
        return False
    return not t

def _remove_dups_flatten(parameters):
    """Internal helper for Union creation and substitution.

    Flatten Unions among parameters, then remove duplicates.
    """
    # Flatten out Union[Union[...], ...].
    params = []
    for p in parameters:
        if isinstance(p, (_UnionGenericAlias, types.UnionType)):
            params.extend(p.__args__)
        else:
            params.append(p)

    return tuple(_deduplicate(params, unhashable_fallback=True))


def _flatten_literal_params(parameters):
    """Internal helper for Literal creation: flatten Literals among parameters."""
    params = []
    for p in parameters:
        if isinstance(p, _LiteralGenericAlias):
            params.extend(p.__args__)
        else:
            params.append(p)
    return tuple(params)


_cleanups = []
_caches = {}


def _tp_cache(func=None, /, *, typed=False):
    """Internal wrapper caching __getitem__ of generic types.

    For non-hashable arguments, the original function is used as a fallback.
    """
    def decorator(func):
        # The callback 'inner' references the newly created lru_cache
        # indirectly by performing a lookup in the global '_caches' dictionary.
        # This breaks a reference that can be problematic when combined with
        # C API extensions that leak references to types. See GH-98253.

        cache = functools.lru_cache(typed=typed)(func)
        _caches[func] = cache
        _cleanups.append(cache.cache_clear)
        del cache

        @functools.wraps(func)
        def inner(*args, **kwds):
            try:
                return _caches[func](*args, **kwds)
            except TypeError:
                pass  # All real errors (not unhashable args) are raised below.
            return func(*args, **kwds)
        return inner

    if func is not None:
        return decorator(func)

    return decorator


def _deprecation_warning_for_no_type_params_passed(funcname: str) -> None:
    import warnings

    depr_message = (
        f"Failing to pass a value to the 'type_params' parameter "
        f"of {funcname!r} is deprecated, as it leads to incorrect behaviour "
        f"when calling {funcname} on a stringified annotation "
        f"that references a PEP 695 type parameter. "
        f"It will be disallowed in Python 3.15."
    )
    warnings.warn(depr_message, category=DeprecationWarning, stacklevel=3)


class _Sentinel:
    __slots__ = ()
    def __repr__(self):
        return '<sentinel>'


_sentinel = _Sentinel()


def _eval_type(t, globalns, localns, type_params=_sentinel, *, recursive_guard=frozenset()):
    """Evaluate all forward references in the given type t.

    For use of globalns and localns see the docstring for get_type_hints().
    recursive_guard is used to prevent infinite recursion with a recursive
    ForwardRef.
    """
    if type_params is _sentinel:
        _deprecation_warning_for_no_type_params_passed("typing._eval_type")
        type_params = ()
    if isinstance(t, ForwardRef):
        return t._evaluate(globalns, localns, type_params, recursive_guard=recursive_guard)
    if isinstance(t, (_GenericAlias, GenericAlias, types.UnionType)):
        if isinstance(t, GenericAlias):
            args = tuple(
                ForwardRef(arg) if isinstance(arg, str) else arg
                for arg in t.__args__
            )
            is_unpacked = t.__unpacked__
            if _should_unflatten_callable_args(t, args):
                t = t.__origin__[(args[:-1], args[-1])]
            else:
                t = t.__origin__[args]
            if is_unpacked:
                t = Unpack[t]

        ev_args = tuple(
            _eval_type(
                a, globalns, localns, type_params, recursive_guard=recursive_guard
            )
            for a in t.__args__
        )
        if ev_args == t.__args__:
            return t
        if isinstance(t, GenericAlias):
            return GenericAlias(t.__origin__, ev_args)
        if isinstance(t, types.UnionType):
            return functools.reduce(operator.or_, ev_args)
        else:
            return t.copy_with(ev_args)
    return t


class _Final:
    """Mixin to prohibit subclassing."""

    __slots__ = ('__weakref__',)

    def __init_subclass__(cls, /, *args, **kwds):
        if '_root' not in kwds:
            raise TypeError("Cannot subclass special typing classes")


class _NotIterable:
    """Mixin to prevent iteration, without being compatible with Iterable.

    That is, we could do::

        def __iter__(self): raise TypeError()

    But this would make users of this mixin duck type-compatible with
    collections.abc.Iterable - isinstance(foo, Iterable) would be True.

    Luckily, we can instead prevent iteration by setting __iter__ to None, which
    is treated specially.
    """

    __slots__ = ()
    __iter__ = None


# Internal indicator of special typing constructs.
# See __doc__ instance attribute for specific docs.
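#
# Illustrative sketch (not part of the module logic): each special form
# defined later in this file is created by decorating a plain function, e.g.
#
#     @_SpecialForm
#     def Optional(self, parameters): ...
#
# which makes `Optional` an instance of _SpecialForm whose repr() is
# 'typing.Optional' and whose subscription, Optional[X], dispatches to the
# decorated function through __getitem__ below.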
class _SpecialForm(_Final, _NotIterable, _root=True):
    __slots__ = ('_name', '__doc__', '_getitem')

    def __init__(self, getitem):
        self._getitem = getitem
        self._name = getitem.__name__
        self.__doc__ = getitem.__doc__

    def __getattr__(self, item):
        if item in {'__name__', '__qualname__'}:
            return self._name

        raise AttributeError(item)

    def __mro_entries__(self, bases):
        raise TypeError(f"Cannot subclass {self!r}")

    def __repr__(self):
        return 'typing.' + self._name

    def __reduce__(self):
        return self._name

    def __call__(self, *args, **kwds):
        raise TypeError(f"Cannot instantiate {self!r}")

    def __or__(self, other):
        return Union[self, other]

    def __ror__(self, other):
        return Union[other, self]

    def __instancecheck__(self, obj):
        raise TypeError(f"{self} cannot be used with isinstance()")

    def __subclasscheck__(self, cls):
        raise TypeError(f"{self} cannot be used with issubclass()")

    @_tp_cache
    def __getitem__(self, parameters):
        return self._getitem(self, parameters)


class _TypedCacheSpecialForm(_SpecialForm, _root=True):
    def __getitem__(self, parameters):
        if not isinstance(parameters, tuple):
            parameters = (parameters,)
        return self._getitem(self, *parameters)


class _AnyMeta(type):
    def __instancecheck__(self, obj):
        if self is Any:
            raise TypeError("typing.Any cannot be used with isinstance()")
        return super().__instancecheck__(obj)

    def __repr__(self):
        if self is Any:
            return "typing.Any"
        return super().__repr__()  # respect to subclasses


class Any(metaclass=_AnyMeta):
    """Special type indicating an unconstrained type.

    - Any is compatible with every type.
    - Any assumed to have all methods.
    - All values assumed to be instances of Any.

    Note that all the above statements are true from the point of view of
    static type checkers. At runtime, Any should not be used with instance
    checks.
    """

    def __new__(cls, *args, **kwargs):
        if cls is Any:
            raise TypeError("Any cannot be instantiated")
        return super().__new__(cls)


@_SpecialForm
def NoReturn(self, parameters):
    """Special type indicating functions that never return.

    Example::

        from typing import NoReturn

        def stop() -> NoReturn:
            raise Exception('no way')

    NoReturn can also be used as a bottom type, a type that
    has no values. Starting in Python 3.11, the Never type should
    be used for this concept instead. Type checkers should treat the two
    equivalently.
    """
    raise TypeError(f"{self} is not subscriptable")

# This is semantically identical to NoReturn, but it is implemented
# separately so that type checkers can distinguish between the two
# if they want.
@_SpecialForm
def Never(self, parameters):
    """The bottom type, a type that has no members.

    This can be used to define a function that should never be
    called, or a function that never returns::

        from typing import Never

        def never_call_me(arg: Never) -> None:
            pass

        def int_or_str(arg: int | str) -> None:
            never_call_me(arg)  # type checker error
            match arg:
                case int():
                    print("It's an int")
                case str():
                    print("It's a str")
                case _:
                    never_call_me(arg)  # OK, arg is of type Never
    """
    raise TypeError(f"{self} is not subscriptable")


@_SpecialForm
def Self(self, parameters):
    """Used to spell the type of "self" in classes.

    Example::

        from typing import Self

        class Foo:
            def return_self(self) -> Self:
                ...
                return self

    This is especially useful for:
    - classmethods that are used as alternative constructors
    - annotating an `__enter__` method which returns self
    """
    raise TypeError(f"{self} is not subscriptable")


@_SpecialForm
def LiteralString(self, parameters):
    """Represents an arbitrary literal string.

    Example::

        from typing import LiteralString

        def run_query(sql: LiteralString) -> None:
            ...

        def caller(arbitrary_string: str, literal_string: LiteralString) -> None:
            run_query("SELECT * FROM students")  # OK
            run_query(literal_string)  # OK
            run_query("SELECT * FROM " + literal_string)  # OK
            run_query(arbitrary_string)  # type checker error
            run_query(  # type checker error
                f"SELECT * FROM students WHERE name = {arbitrary_string}"
            )

    Only string literals and other LiteralStrings are compatible
    with LiteralString. This provides a tool to help prevent
    security issues such as SQL injection.
    """
    raise TypeError(f"{self} is not subscriptable")


@_SpecialForm
def ClassVar(self, parameters):
    """Special type construct to mark class variables.

    An annotation wrapped in ClassVar indicates that a given
    attribute is intended to be used as a class variable and
    should not be set on instances of that class.

    Usage::

        class Starship:
            stats: ClassVar[dict[str, int]] = {}  # class variable
            damage: int = 10                      # instance variable

    ClassVar accepts only types and cannot be further subscribed.

    Note that ClassVar is not a class itself, and should not
    be used with isinstance() or issubclass().
    """
    item = _type_check(parameters, f'{self} accepts only single type.', allow_special_forms=True)
    return _GenericAlias(self, (item,))

@_SpecialForm
def Final(self, parameters):
    """Special typing construct to indicate final names to type checkers.

    A final name cannot be re-assigned or overridden in a subclass.

    For example::

        MAX_SIZE: Final = 9000
        MAX_SIZE += 1  # Error reported by type checker

        class Connection:
            TIMEOUT: Final[int] = 10

        class FastConnector(Connection):
            TIMEOUT = 1  # Error reported by type checker

    There is no runtime checking of these properties.
    """
    item = _type_check(parameters, f'{self} accepts only single type.', allow_special_forms=True)
    return _GenericAlias(self, (item,))

@_SpecialForm
def Union(self, parameters):
    """Union type; Union[X, Y] means either X or Y.

    On Python 3.10 and higher, the | operator
    can also be used to denote unions;
    X | Y means the same thing to the type checker as Union[X, Y].

    To define a union, use e.g. Union[int, str]. Details:
    - The arguments must be types and there must be at least one.
    - None as an argument is a special case and is replaced by
      type(None).
    - Unions of unions are flattened, e.g.::

        assert Union[Union[int, str], float] == Union[int, str, float]

    - Unions of a single argument vanish, e.g.::

        assert Union[int] == int  # The constructor actually returns int

    - Redundant arguments are skipped, e.g.::

        assert Union[int, str, int] == Union[int, str]

    - When comparing unions, the argument order is ignored, e.g.::

        assert Union[int, str] == Union[str, int]

    - You cannot subclass or instantiate a union.
    - You can use Optional[X] as a shorthand for Union[X, None].
    """
    if parameters == ():
        raise TypeError("Cannot take a Union of no types.")
    if not isinstance(parameters, tuple):
        parameters = (parameters,)
    msg = "Union[arg, ...]: each arg must be a type."
    parameters = tuple(_type_check(p, msg) for p in parameters)
    parameters = _remove_dups_flatten(parameters)
    if len(parameters) == 1:
        return parameters[0]
    if len(parameters) == 2 and type(None) in parameters:
        return _UnionGenericAlias(self, parameters, name="Optional")
    return _UnionGenericAlias(self, parameters)

def _make_union(left, right):
    """Used from the C implementation of TypeVar.

    TypeVar.__or__ calls this instead of returning types.UnionType
    because we want to allow unions between TypeVars and strings
    (forward references).
    """
    return Union[left, right]

@_SpecialForm
def Optional(self, parameters):
    """Optional[X] is equivalent to Union[X, None]."""
    arg = _type_check(parameters, f"{self} requires a single type.")
    return Union[arg, type(None)]

@_TypedCacheSpecialForm
@_tp_cache(typed=True)
def Literal(self, *parameters):
    """Special typing form to define literal types (a.k.a. value types).

    This form can be used to indicate to type checkers that the corresponding
    variable or function parameter has a value equivalent to the provided
    literal (or one of several literals)::

        def validate_simple(data: Any) -> Literal[True]:  # always returns True
            ...

        MODE = Literal['r', 'rb', 'w', 'wb']
        def open_helper(file: str, mode: MODE) -> str:
            ...

        open_helper('/some/path', 'r')      # Passes type check
        open_helper('/other/path', 'typo')  # Error in type checker

    Literal[...] cannot be subclassed. At runtime, an arbitrary value
    is allowed as type argument to Literal[...], but type checkers may
    impose restrictions.
    """
    # There is no '_type_check' call because arguments to Literal[...] are
    # values, not types.
    parameters = _flatten_literal_params(parameters)

    try:
        parameters = tuple(p for p, _ in _deduplicate(list(_value_and_type_iter(parameters))))
    except TypeError:  # unhashable parameters
        pass

    return _LiteralGenericAlias(self, parameters)


@_SpecialForm
def TypeAlias(self, parameters):
    """Special form for marking type aliases.

    Use TypeAlias to indicate that an assignment should
    be recognized as a proper type alias definition by type
    checkers.

    For example::

        Predicate: TypeAlias = Callable[..., bool]

    It's invalid when used anywhere except as in the example above.
    """
    raise TypeError(f"{self} is not subscriptable")


@_SpecialForm
def Concatenate(self, parameters):
    """Special form for annotating higher-order functions.

    ``Concatenate`` can be used in conjunction with ``ParamSpec`` and
    ``Callable`` to represent a higher-order function which adds, removes or
    transforms the parameters of a callable.

    For example::

        Callable[Concatenate[int, P], int]

    See PEP 612 for detailed information.
    """
    if parameters == ():
        raise TypeError("Cannot take a Concatenate of no types.")
    if not isinstance(parameters, tuple):
        parameters = (parameters,)
    if not (parameters[-1] is ... or isinstance(parameters[-1], ParamSpec)):
        raise TypeError("The last parameter to Concatenate should be a "
                        "ParamSpec variable or ellipsis.")
    msg = "Concatenate[arg, ...]: each arg must be a type."
    parameters = (*(_type_check(p, msg) for p in parameters[:-1]), parameters[-1])
    return _ConcatenateGenericAlias(self, parameters)


@_SpecialForm
def TypeGuard(self, parameters):
    """Special typing construct for marking user-defined type predicate functions.

    ``TypeGuard`` can be used to annotate the return type of a user-defined
    type predicate function. ``TypeGuard`` only accepts a single type argument.
    At runtime, functions marked this way should return a boolean.

    ``TypeGuard`` aims to benefit *type narrowing* -- a technique used by static
    type checkers to determine a more precise type of an expression within a
    program's code flow. Usually type narrowing is done by analyzing
    conditional code flow and applying the narrowing to a block of code. The
    conditional expression here is sometimes referred to as a "type predicate".

    Sometimes it would be convenient to use a user-defined boolean function
    as a type predicate. Such a function should use ``TypeGuard[...]`` or
    ``TypeIs[...]`` as its return type to alert static type checkers to
    this intention. ``TypeGuard`` should be used over ``TypeIs`` when narrowing
    from an incompatible type (e.g., ``list[object]`` to ``list[int]``) or when
    the function does not return ``True`` for all instances of the narrowed type.

    Using ``-> TypeGuard[NarrowedType]`` tells the static type checker that
    for a given function:

    1. The return value is a boolean.
    2. If the return value is ``True``, the type of its argument
       is ``NarrowedType``.

    For example::

        def is_str_list(val: list[object]) -> TypeGuard[list[str]]:
            '''Determines whether all objects in the list are strings'''
            return all(isinstance(x, str) for x in val)

        def func1(val: list[object]):
            if is_str_list(val):
                # Type of ``val`` is narrowed to ``list[str]``.
                print(" ".join(val))
            else:
                # Type of ``val`` remains as ``list[object]``.
                print("Not a list of strings!")

    Strict type narrowing is not enforced -- ``TypeB`` need not be a narrower
    form of ``TypeA`` (it can even be a wider form) and this may lead to
    type-unsafe results. The main reason is to allow for things like
    narrowing ``list[object]`` to ``list[str]`` even though the latter is not
    a subtype of the former, since ``list`` is invariant. The responsibility of
    writing type-safe type predicates is left to the user.

    ``TypeGuard`` also works with type variables. For more information, see
    PEP 647 (User-Defined Type Guards).
    """
    item = _type_check(parameters, f'{self} accepts only single type.')
    return _GenericAlias(self, (item,))


@_SpecialForm
def TypeIs(self, parameters):
    """Special typing construct for marking user-defined type predicate functions.

    ``TypeIs`` can be used to annotate the return type of a user-defined
    type predicate function. ``TypeIs`` only accepts a single type argument.
    At runtime, functions marked this way should return a boolean and accept
    at least one argument.

    ``TypeIs`` aims to benefit *type narrowing* -- a technique used by static
    type checkers to determine a more precise type of an expression within a
    program's code flow. Usually type narrowing is done by analyzing
    conditional code flow and applying the narrowing to a block of code. The
    conditional expression here is sometimes referred to as a "type predicate".

    Sometimes it would be convenient to use a user-defined boolean function
    as a type predicate. Such a function should use ``TypeIs[...]`` or
    ``TypeGuard[...]`` as its return type to alert static type checkers to
    this intention. ``TypeIs`` usually has more intuitive behavior than
    ``TypeGuard``, but it cannot be used when the input and output types
    are incompatible (e.g., ``list[object]`` to ``list[int]``) or when the
    function does not return ``True`` for all instances of the narrowed type.

    Using ``-> TypeIs[NarrowedType]`` tells the static type checker that for
    a given function:

    1. The return value is a boolean.
    2. If the return value is ``True``, the type of its argument
       is the intersection of the argument's original type and
       ``NarrowedType``.
    3. If the return value is ``False``, the type of its argument
       is narrowed to exclude ``NarrowedType``.

    For example::

        from typing import assert_type, final, TypeIs

        class Parent: pass
        class Child(Parent): pass
        @final
        class Unrelated: pass

        def is_parent(val: object) -> TypeIs[Parent]:
            return isinstance(val, Parent)

        def run(arg: Child | Unrelated):
            if is_parent(arg):
                # Type of ``arg`` is narrowed to the intersection
                # of ``Parent`` and ``Child``, which is equivalent to
                # ``Child``.
                assert_type(arg, Child)
            else:
                # Type of ``arg`` is narrowed to exclude ``Parent``,
                # so only ``Unrelated`` is left.
                assert_type(arg, Unrelated)

    The type inside ``TypeIs`` must be consistent with the type of the
    function's argument; if it is not, static type checkers will raise
    an error. An incorrectly written ``TypeIs`` function can lead to
    unsound behavior in the type system; it is the user's responsibility
    to write such functions in a type-safe manner.

    ``TypeIs`` also works with type variables. For more information, see
    PEP 742 (Narrowing types with ``TypeIs``).
    """
    item = _type_check(parameters, f'{self} accepts only single type.')
    return _GenericAlias(self, (item,))


class ForwardRef(_Final, _root=True):
    """Internal wrapper to hold a forward reference."""

    __slots__ = ('__forward_arg__', '__forward_code__',
                 '__forward_evaluated__', '__forward_value__',
                 '__forward_is_argument__', '__forward_is_class__',
                 '__forward_module__')

    def __init__(self, arg, is_argument=True, module=None, *, is_class=False):
        if not isinstance(arg, str):
            raise TypeError(f"Forward reference must be a string -- got {arg!r}")

        # If we do `def f(*args: *Ts)`, then we'll have `arg = '*Ts'`.
        # Unfortunately, this isn't a valid expression on its own, so we
        # do the unpacking manually.
        if arg.startswith('*'):
            arg_to_compile = f'({arg},)[0]'  # E.g. (*Ts,)[0] or (*tuple[int, int],)[0]
        else:
            arg_to_compile = arg
        try:
            code = compile(arg_to_compile, '<string>', 'eval')
        except SyntaxError:
            raise SyntaxError(f"Forward reference must be an expression -- got {arg!r}")

        self.__forward_arg__ = arg
        self.__forward_code__ = code
        self.__forward_evaluated__ = False
        self.__forward_value__ = None
        self.__forward_is_argument__ = is_argument
        self.__forward_is_class__ = is_class
        self.__forward_module__ = module

    def _evaluate(self, globalns, localns, type_params=_sentinel, *, recursive_guard):
        if type_params is _sentinel:
            _deprecation_warning_for_no_type_params_passed("typing.ForwardRef._evaluate")
            type_params = ()
        if self.__forward_arg__ in recursive_guard:
            return self
        if not self.__forward_evaluated__ or localns is not globalns:
            if globalns is None and localns is None:
                globalns = localns = {}
            elif globalns is None:
                globalns = localns
            elif localns is None:
                localns = globalns
            if self.__forward_module__ is not None:
                globalns = getattr(
                    sys.modules.get(self.__forward_module__, None), '__dict__', globalns
                )

            # type parameters require some special handling,
            # as they exist in their own scope
            # but `eval()` does not have a dedicated parameter for that scope.
            # For classes, names in type parameter scopes should override
            # names in the global scope (which here are called `localns`!),
            # but should in turn be overridden by names in the class scope
            # (which here are called `globalns`!)
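            # Illustrative example (assumes PEP 695 syntax): for
            #     class C[T]:
            #         x: "list[T]"
            # the string "list[T]" is evaluated with T injected from
            # type_params below, unless the class namespace (globalns here)
            # already binds the name T, in which case that binding wins.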
            if type_params:
                globalns, localns = dict(globalns), dict(localns)
                for param in type_params:
                    param_name = param.__name__
                    if not self.__forward_is_class__ or param_name not in globalns:
                        globalns[param_name] = param
                        localns.pop(param_name, None)

            type_ = _type_check(
                eval(self.__forward_code__, globalns, localns),
                "Forward references must evaluate to types.",
                is_argument=self.__forward_is_argument__,
                allow_special_forms=self.__forward_is_class__,
            )
            self.__forward_value__ = _eval_type(
                type_,
                globalns,
                localns,
                type_params,
                recursive_guard=(recursive_guard | {self.__forward_arg__}),
            )
            self.__forward_evaluated__ = True
        return self.__forward_value__

    def __eq__(self, other):
        if not isinstance(other, ForwardRef):
            return NotImplemented
        if self.__forward_evaluated__ and other.__forward_evaluated__:
            return (self.__forward_arg__ == other.__forward_arg__ and
                    self.__forward_value__ == other.__forward_value__)
        return (self.__forward_arg__ == other.__forward_arg__ and
                self.__forward_module__ == other.__forward_module__)

    def __hash__(self):
        return hash((self.__forward_arg__, self.__forward_module__))

    def __or__(self, other):
        return Union[self, other]

    def __ror__(self, other):
        return Union[other, self]

    def __repr__(self):
        if self.__forward_module__ is None:
            module_repr = ''
        else:
            module_repr = f', module={self.__forward_module__!r}'
        return f'ForwardRef({self.__forward_arg__!r}{module_repr})'


def _is_unpacked_typevartuple(x: Any) -> bool:
    return ((not isinstance(x, type)) and
            getattr(x, '__typing_is_unpacked_typevartuple__', False))


def _is_typevar_like(x: Any) -> bool:
    return isinstance(x, (TypeVar, ParamSpec)) or _is_unpacked_typevartuple(x)


def _typevar_subst(self, arg):
    msg = "Parameters to generic types must be types."
    arg = _type_check(arg, msg, is_argument=True)
    if ((isinstance(arg, _GenericAlias) and arg.__origin__ is Unpack) or
        (isinstance(arg, GenericAlias) and getattr(arg, '__unpacked__', False))):
        raise TypeError(f"{arg} is not valid as type argument")
    return arg


def _typevartuple_prepare_subst(self, alias, args):
    params = alias.__parameters__
    typevartuple_index = params.index(self)
    for param in params[typevartuple_index + 1:]:
        if isinstance(param, TypeVarTuple):
            raise TypeError(f"More than one TypeVarTuple parameter in {alias}")

    alen = len(args)
    plen = len(params)
    left = typevartuple_index
    right = plen - typevartuple_index - 1
    var_tuple_index = None
    fillarg = None
    for k, arg in enumerate(args):
        if not isinstance(arg, type):
            subargs = getattr(arg, '__typing_unpacked_tuple_args__', None)
            if subargs and len(subargs) == 2 and subargs[-1] is ...:
                if var_tuple_index is not None:
                    raise TypeError("More than one unpacked arbitrary-length tuple argument")
                var_tuple_index = k
                fillarg = subargs[0]
    if var_tuple_index is not None:
        left = min(left, var_tuple_index)
        right = min(right, alen - var_tuple_index - 1)
    elif left + right > alen:
        raise TypeError(f"Too few arguments for {alias};"
                        f" actual {alen}, expected at least {plen-1}")
    if left == alen - right and self.has_default():
        replacement = _unpack_args(self.__default__)
    else:
        replacement = args[left: alen - right]

    return (
        *args[:left],
        *([fillarg]*(typevartuple_index - left)),
        replacement,
        *([fillarg]*(plen - right - left - typevartuple_index - 1)),
        *args[alen - right:],
    )


def _paramspec_subst(self, arg):
    if isinstance(arg, (list, tuple)):
        arg = tuple(_type_check(a, "Expected a type.") for a in arg)
    elif not _is_param_expr(arg):
        raise TypeError(f"Expected a list of types, an ellipsis, "
                        f"ParamSpec, or Concatenate. Got {arg}")
    return arg


def _paramspec_prepare_subst(self, alias, args):
    params = alias.__parameters__
    i = params.index(self)
    if i == len(args) and self.has_default():
        args = [*args, self.__default__]
    if i >= len(args):
        raise TypeError(f"Too few arguments for {alias}")
    # Special case where Z[[int, str, bool]] == Z[int, str, bool] in PEP 612.
    if len(params) == 1 and not _is_param_expr(args[0]):
        assert i == 0
        args = (args,)
    # Convert lists to tuples to help other libraries cache the results.
    elif isinstance(args[i], list):
        args = (*args[:i], tuple(args[i]), *args[i+1:])
    return args


@_tp_cache
def _generic_class_getitem(cls, args):
    """Parameterizes a generic class.

    At least, parameterizing a generic class is the *main* thing this method
    does. For example, for some generic class `Foo`, this is called when we
    do `Foo[int]` - there, with `cls=Foo` and `args=int`.

    However, note that this method is also called when defining generic
    classes in the first place with `class Foo(Generic[T]): ...`.
    """
    if not isinstance(args, tuple):
        args = (args,)

    args = tuple(_type_convert(p) for p in args)
    is_generic_or_protocol = cls in (Generic, Protocol)

    if is_generic_or_protocol:
        # Generic and Protocol can only be subscripted with unique type variables.
        if not args:
            raise TypeError(
                f"Parameter list to {cls.__qualname__}[...] cannot be empty"
            )
        if not all(_is_typevar_like(p) for p in args):
            raise TypeError(
                f"Parameters to {cls.__name__}[...] must all be type variables "
                f"or parameter specification variables.")
        if len(set(args)) != len(args):
            raise TypeError(
                f"Parameters to {cls.__name__}[...] must all be unique")
    else:
        # Subscripting a regular Generic subclass.
        for param in cls.__parameters__:
            prepare = getattr(param, '__typing_prepare_subst__', None)
            if prepare is not None:
                args = prepare(cls, args)
        _check_generic_specialization(cls, args)

        new_args = []
        for param, new_arg in zip(cls.__parameters__, args):
            if isinstance(param, TypeVarTuple):
                new_args.extend(new_arg)
            else:
                new_args.append(new_arg)
        args = tuple(new_args)

    return _GenericAlias(cls, args)


def _generic_init_subclass(cls, *args, **kwargs):
    super(Generic, cls).__init_subclass__(*args, **kwargs)
    tvars = []
    if '__orig_bases__' in cls.__dict__:
        error = Generic in cls.__orig_bases__
    else:
        error = (Generic in cls.__bases__ and
                 cls.__name__ != 'Protocol' and
                 type(cls) != _TypedDictMeta)
    if error:
        raise TypeError("Cannot inherit from plain Generic")
    if '__orig_bases__' in cls.__dict__:
        tvars = _collect_type_parameters(cls.__orig_bases__)
        # Look for Generic[T1, ..., Tn].
        # If found, tvars must be a subset of it.
        # If not found, tvars is it.
        # Also check for and reject plain Generic,
        # and reject multiple Generic[...].
        gvars = None
        for base in cls.__orig_bases__:
            if (isinstance(base, _GenericAlias) and
                    base.__origin__ is Generic):
                if gvars is not None:
                    raise TypeError(
                        "Cannot inherit from Generic[...] multiple times.")
                gvars = base.__parameters__
        if gvars is not None:
            tvarset = set(tvars)
            gvarset = set(gvars)
            if not tvarset <= gvarset:
                s_vars = ', '.join(str(t) for t in tvars if t not in gvarset)
                s_args = ', '.join(str(g) for g in gvars)
                raise TypeError(f"Some type variables ({s_vars}) are"
                                f" not listed in Generic[{s_args}]")
            tvars = gvars
    cls.__parameters__ = tuple(tvars)


def _is_dunder(attr):
    return attr.startswith('__') and attr.endswith('__')

class _BaseGenericAlias(_Final, _root=True):
    """The central part of the internal API.

    This represents a generic version of type 'origin' with type arguments 'params'.
    There are two kind of these aliases: user defined and special. The special ones
    are wrappers around builtin collections and ABCs in collections.abc. These must
    have 'name' always set. If 'inst' is False, then the alias can't be instantiated;
    this is used by e.g. typing.List and typing.Dict.
    """

    def __init__(self, origin, *, inst=True, name=None):
        self._inst = inst
        self._name = name
        self.__origin__ = origin
        self.__slots__ = None  # This is not documented.
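
    # Illustrative note (not authoritative): calling a parameterized alias,
    # e.g. SomeGeneric[int](), runs __call__ below, which instantiates the
    # origin class and then best-effort records the alias on the result as
    # __orig_class__ so the concrete type arguments can be introspected later.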
1312 1313 def __call__(self, *args, **kwargs): 1314 if not self._inst: 1315 raise TypeError(f"Type {self._name} cannot be instantiated; " 1316 f"use {self.__origin__.__name__}() instead") 1317 result = self.__origin__(*args, **kwargs) 1318 try: 1319 result.__orig_class__ = self 1320 # Some objects raise TypeError (or something even more exotic) 1321 # if you try to set attributes on them; we guard against that here 1322 except Exception: 1323 pass 1324 return result 1325 1326 def __mro_entries__(self, bases): 1327 res = [] 1328 if self.__origin__ not in bases: 1329 res.append(self.__origin__) 1330 1331 # Check if any base that occurs after us in `bases` is either itself a 1332 # subclass of Generic, or something which will add a subclass of Generic 1333 # to `__bases__` via its `__mro_entries__`. If not, add Generic 1334 # ourselves. The goal is to ensure that Generic (or a subclass) will 1335 # appear exactly once in the final bases tuple. If we let it appear 1336 # multiple times, we risk "can't form a consistent MRO" errors. 1337 i = bases.index(self) 1338 for b in bases[i+1:]: 1339 if isinstance(b, _BaseGenericAlias): 1340 break 1341 if not isinstance(b, type): 1342 meth = getattr(b, "__mro_entries__", None) 1343 new_bases = meth(bases) if meth else None 1344 if ( 1345 isinstance(new_bases, tuple) and 1346 any( 1347 isinstance(b2, type) and issubclass(b2, Generic) 1348 for b2 in new_bases 1349 ) 1350 ): 1351 break 1352 elif issubclass(b, Generic): 1353 break 1354 else: 1355 res.append(Generic) 1356 return tuple(res) 1357 1358 def __getattr__(self, attr): 1359 if attr in {'__name__', '__qualname__'}: 1360 return self._name or self.__origin__.__name__ 1361 1362 # We are careful for copy and pickle. 1363 # Also for simplicity we don't relay any dunder names 1364 if '__origin__' in self.__dict__ and not _is_dunder(attr): 1365 return getattr(self.__origin__, attr) 1366 raise AttributeError(attr) 1367 1368 def __setattr__(self, attr, val): 1369 if _is_dunder(attr) or attr in {'_name', '_inst', '_nparams', '_defaults'}: 1370 super().__setattr__(attr, val) 1371 else: 1372 setattr(self.__origin__, attr, val) 1373 1374 def __instancecheck__(self, obj): 1375 return self.__subclasscheck__(type(obj)) 1376 1377 def __subclasscheck__(self, cls): 1378 raise TypeError("Subscripted generics cannot be used with" 1379 " class and instance checks") 1380 1381 def __dir__(self): 1382 return list(set(super().__dir__() 1383 + [attr for attr in dir(self.__origin__) if not _is_dunder(attr)])) 1384 1385 1386# Special typing constructs Union, Optional, Generic, Callable and Tuple 1387# use three special attributes for internal bookkeeping of generic types: 1388# * __parameters__ is a tuple of unique free type parameters of a generic 1389# type, for example, Dict[T, T].__parameters__ == (T,); 1390# * __origin__ keeps a reference to a type that was subscripted, 1391# e.g., Union[T, int].__origin__ == Union, or the non-generic version of 1392# the type. 1393# * __args__ is a tuple of all arguments used in subscripting, 1394# e.g., Dict[T, int].__args__ == (T, int). 1395 1396 1397class _GenericAlias(_BaseGenericAlias, _root=True): 1398 # The type of parameterized generics. 1399 # 1400 # That is, for example, `type(List[int])` is `_GenericAlias`. 1401 # 1402 # Objects which are instances of this class include: 1403 # * Parameterized container types, e.g. `Tuple[int]`, `List[int]`. 1404 # * Note that native container types, e.g. `tuple`, `list`, use 1405 # `types.GenericAlias` instead. 
1406 # * Parameterized classes: 1407 # class C[T]: pass 1408 # # C[int] is a _GenericAlias 1409 # * `Callable` aliases, generic `Callable` aliases, and 1410 # parameterized `Callable` aliases: 1411 # T = TypeVar('T') 1412 # # _CallableGenericAlias inherits from _GenericAlias. 1413 # A = Callable[[], None] # _CallableGenericAlias 1414 # B = Callable[[T], None] # _CallableGenericAlias 1415 # C = B[int] # _CallableGenericAlias 1416 # * Parameterized `Final`, `ClassVar`, `TypeGuard`, and `TypeIs`: 1417 # # All _GenericAlias 1418 # Final[int] 1419 # ClassVar[float] 1420 # TypeGuard[bool] 1421 # TypeIs[range] 1422 1423 def __init__(self, origin, args, *, inst=True, name=None): 1424 super().__init__(origin, inst=inst, name=name) 1425 if not isinstance(args, tuple): 1426 args = (args,) 1427 self.__args__ = tuple(... if a is _TypingEllipsis else 1428 a for a in args) 1429 enforce_default_ordering = origin in (Generic, Protocol) 1430 self.__parameters__ = _collect_type_parameters( 1431 args, 1432 enforce_default_ordering=enforce_default_ordering, 1433 ) 1434 if not name: 1435 self.__module__ = origin.__module__ 1436 1437 def __eq__(self, other): 1438 if not isinstance(other, _GenericAlias): 1439 return NotImplemented 1440 return (self.__origin__ == other.__origin__ 1441 and self.__args__ == other.__args__) 1442 1443 def __hash__(self): 1444 return hash((self.__origin__, self.__args__)) 1445 1446 def __or__(self, right): 1447 return Union[self, right] 1448 1449 def __ror__(self, left): 1450 return Union[left, self] 1451 1452 @_tp_cache 1453 def __getitem__(self, args): 1454 # Parameterizes an already-parameterized object. 1455 # 1456 # For example, we arrive here doing something like: 1457 # T1 = TypeVar('T1') 1458 # T2 = TypeVar('T2') 1459 # T3 = TypeVar('T3') 1460 # class A(Generic[T1]): pass 1461 # B = A[T2] # B is a _GenericAlias 1462 # C = B[T3] # Invokes _GenericAlias.__getitem__ 1463 # 1464 # We also arrive here when parameterizing a generic `Callable` alias: 1465 # T = TypeVar('T') 1466 # C = Callable[[T], None] 1467 # C[int] # Invokes _GenericAlias.__getitem__ 1468 1469 if self.__origin__ in (Generic, Protocol): 1470 # Can't subscript Generic[...] or Protocol[...]. 1471 raise TypeError(f"Cannot subscript already-subscripted {self}") 1472 if not self.__parameters__: 1473 raise TypeError(f"{self} is not a generic class") 1474 1475 # Preprocess `args`. 1476 if not isinstance(args, tuple): 1477 args = (args,) 1478 args = _unpack_args(*(_type_convert(p) for p in args)) 1479 new_args = self._determine_new_args(args) 1480 r = self.copy_with(new_args) 1481 return r 1482 1483 def _determine_new_args(self, args): 1484 # Determines new __args__ for __getitem__. 1485 # 1486 # For example, suppose we had: 1487 # T1 = TypeVar('T1') 1488 # T2 = TypeVar('T2') 1489 # class A(Generic[T1, T2]): pass 1490 # T3 = TypeVar('T3') 1491 # B = A[int, T3] 1492 # C = B[str] 1493 # `B.__args__` is `(int, T3)`, so `C.__args__` should be `(int, str)`. 1494 # Unfortunately, this is harder than it looks, because if `T3` is 1495 # anything more exotic than a plain `TypeVar`, we need to consider 1496 # edge cases. 
1497 1498 params = self.__parameters__ 1499 # In the example above, this would be {T3: str} 1500 for param in params: 1501 prepare = getattr(param, '__typing_prepare_subst__', None) 1502 if prepare is not None: 1503 args = prepare(self, args) 1504 alen = len(args) 1505 plen = len(params) 1506 if alen != plen: 1507 raise TypeError(f"Too {'many' if alen > plen else 'few'} arguments for {self};" 1508 f" actual {alen}, expected {plen}") 1509 new_arg_by_param = dict(zip(params, args)) 1510 return tuple(self._make_substitution(self.__args__, new_arg_by_param)) 1511 1512 def _make_substitution(self, args, new_arg_by_param): 1513 """Create a list of new type arguments.""" 1514 new_args = [] 1515 for old_arg in args: 1516 if isinstance(old_arg, type): 1517 new_args.append(old_arg) 1518 continue 1519 1520 substfunc = getattr(old_arg, '__typing_subst__', None) 1521 if substfunc: 1522 new_arg = substfunc(new_arg_by_param[old_arg]) 1523 else: 1524 subparams = getattr(old_arg, '__parameters__', ()) 1525 if not subparams: 1526 new_arg = old_arg 1527 else: 1528 subargs = [] 1529 for x in subparams: 1530 if isinstance(x, TypeVarTuple): 1531 subargs.extend(new_arg_by_param[x]) 1532 else: 1533 subargs.append(new_arg_by_param[x]) 1534 new_arg = old_arg[tuple(subargs)] 1535 1536 if self.__origin__ == collections.abc.Callable and isinstance(new_arg, tuple): 1537 # Consider the following `Callable`. 1538 # C = Callable[[int], str] 1539 # Here, `C.__args__` should be (int, str) - NOT ([int], str). 1540 # That means that if we had something like... 1541 # P = ParamSpec('P') 1542 # T = TypeVar('T') 1543 # C = Callable[P, T] 1544 # D = C[[int, str], float] 1545 # ...we need to be careful; `new_args` should end up as 1546 # `(int, str, float)` rather than `([int, str], float)`. 1547 new_args.extend(new_arg) 1548 elif _is_unpacked_typevartuple(old_arg): 1549 # Consider the following `_GenericAlias`, `B`: 1550 # class A(Generic[*Ts]): ... 1551 # B = A[T, *Ts] 1552 # If we then do: 1553 # B[float, int, str] 1554 # The `new_arg` corresponding to `T` will be `float`, and the 1555 # `new_arg` corresponding to `*Ts` will be `(int, str)`. We 1556 # should join all these types together in a flat list 1557 # `(float, int, str)` - so again, we should `extend`. 1558 new_args.extend(new_arg) 1559 elif isinstance(old_arg, tuple): 1560 # Corner case: 1561 # P = ParamSpec('P') 1562 # T = TypeVar('T') 1563 # class Base(Generic[P]): ... 1564 # Can be substituted like this: 1565 # X = Base[[int, T]] 1566 # In this case, `old_arg` will be a tuple: 1567 new_args.append( 1568 tuple(self._make_substitution(old_arg, new_arg_by_param)), 1569 ) 1570 else: 1571 new_args.append(new_arg) 1572 return new_args 1573 1574 def copy_with(self, args): 1575 return self.__class__(self.__origin__, args, name=self._name, inst=self._inst) 1576 1577 def __repr__(self): 1578 if self._name: 1579 name = 'typing.' + self._name 1580 else: 1581 name = _type_repr(self.__origin__) 1582 if self.__args__: 1583 args = ", ".join([_type_repr(a) for a in self.__args__]) 1584 else: 1585 # To ensure the repr is eval-able. 
1586 args = "()" 1587 return f'{name}[{args}]' 1588 1589 def __reduce__(self): 1590 if self._name: 1591 origin = globals()[self._name] 1592 else: 1593 origin = self.__origin__ 1594 args = tuple(self.__args__) 1595 if len(args) == 1 and not isinstance(args[0], tuple): 1596 args, = args 1597 return operator.getitem, (origin, args) 1598 1599 def __mro_entries__(self, bases): 1600 if isinstance(self.__origin__, _SpecialForm): 1601 raise TypeError(f"Cannot subclass {self!r}") 1602 1603 if self._name: # generic version of an ABC or built-in class 1604 return super().__mro_entries__(bases) 1605 if self.__origin__ is Generic: 1606 if Protocol in bases: 1607 return () 1608 i = bases.index(self) 1609 for b in bases[i+1:]: 1610 if isinstance(b, _BaseGenericAlias) and b is not self: 1611 return () 1612 return (self.__origin__,) 1613 1614 def __iter__(self): 1615 yield Unpack[self] 1616 1617 1618# _nparams is the number of accepted parameters, e.g. 0 for Hashable, 1619# 1 for List and 2 for Dict. It may be -1 if variable number of 1620# parameters are accepted (needs custom __getitem__). 1621 1622class _SpecialGenericAlias(_NotIterable, _BaseGenericAlias, _root=True): 1623 def __init__(self, origin, nparams, *, inst=True, name=None, defaults=()): 1624 if name is None: 1625 name = origin.__name__ 1626 super().__init__(origin, inst=inst, name=name) 1627 self._nparams = nparams 1628 self._defaults = defaults 1629 if origin.__module__ == 'builtins': 1630 self.__doc__ = f'A generic version of {origin.__qualname__}.' 1631 else: 1632 self.__doc__ = f'A generic version of {origin.__module__}.{origin.__qualname__}.' 1633 1634 @_tp_cache 1635 def __getitem__(self, params): 1636 if not isinstance(params, tuple): 1637 params = (params,) 1638 msg = "Parameters to generic types must be types." 1639 params = tuple(_type_check(p, msg) for p in params) 1640 if (self._defaults 1641 and len(params) < self._nparams 1642 and len(params) + len(self._defaults) >= self._nparams 1643 ): 1644 params = (*params, *self._defaults[len(params) - self._nparams:]) 1645 actual_len = len(params) 1646 1647 if actual_len != self._nparams: 1648 if self._defaults: 1649 expected = f"at least {self._nparams - len(self._defaults)}" 1650 else: 1651 expected = str(self._nparams) 1652 if not self._nparams: 1653 raise TypeError(f"{self} is not a generic class") 1654 raise TypeError(f"Too {'many' if actual_len > self._nparams else 'few'} arguments for {self};" 1655 f" actual {actual_len}, expected {expected}") 1656 return self.copy_with(params) 1657 1658 def copy_with(self, params): 1659 return _GenericAlias(self.__origin__, params, 1660 name=self._name, inst=self._inst) 1661 1662 def __repr__(self): 1663 return 'typing.' 
+ self._name 1664 1665 def __subclasscheck__(self, cls): 1666 if isinstance(cls, _SpecialGenericAlias): 1667 return issubclass(cls.__origin__, self.__origin__) 1668 if not isinstance(cls, _GenericAlias): 1669 return issubclass(cls, self.__origin__) 1670 return super().__subclasscheck__(cls) 1671 1672 def __reduce__(self): 1673 return self._name 1674 1675 def __or__(self, right): 1676 return Union[self, right] 1677 1678 def __ror__(self, left): 1679 return Union[left, self] 1680 1681 1682class _DeprecatedGenericAlias(_SpecialGenericAlias, _root=True): 1683 def __init__( 1684 self, origin, nparams, *, removal_version, inst=True, name=None 1685 ): 1686 super().__init__(origin, nparams, inst=inst, name=name) 1687 self._removal_version = removal_version 1688 1689 def __instancecheck__(self, inst): 1690 import warnings 1691 warnings._deprecated( 1692 f"{self.__module__}.{self._name}", remove=self._removal_version 1693 ) 1694 return super().__instancecheck__(inst) 1695 1696 1697class _CallableGenericAlias(_NotIterable, _GenericAlias, _root=True): 1698 def __repr__(self): 1699 assert self._name == 'Callable' 1700 args = self.__args__ 1701 if len(args) == 2 and _is_param_expr(args[0]): 1702 return super().__repr__() 1703 return (f'typing.Callable' 1704 f'[[{", ".join([_type_repr(a) for a in args[:-1]])}], ' 1705 f'{_type_repr(args[-1])}]') 1706 1707 def __reduce__(self): 1708 args = self.__args__ 1709 if not (len(args) == 2 and _is_param_expr(args[0])): 1710 args = list(args[:-1]), args[-1] 1711 return operator.getitem, (Callable, args) 1712 1713 1714class _CallableType(_SpecialGenericAlias, _root=True): 1715 def copy_with(self, params): 1716 return _CallableGenericAlias(self.__origin__, params, 1717 name=self._name, inst=self._inst) 1718 1719 def __getitem__(self, params): 1720 if not isinstance(params, tuple) or len(params) != 2: 1721 raise TypeError("Callable must be used as " 1722 "Callable[[arg, ...], result].") 1723 args, result = params 1724 # This relaxes what args can be on purpose to allow things like 1725 # PEP 612 ParamSpec. Responsibility for whether a user is using 1726 # Callable[...] properly is deferred to static type checkers. 1727 if isinstance(args, list): 1728 params = (tuple(args), result) 1729 else: 1730 params = (args, result) 1731 return self.__getitem_inner__(params) 1732 1733 @_tp_cache 1734 def __getitem_inner__(self, params): 1735 args, result = params 1736 msg = "Callable[args, result]: result must be a type." 1737 result = _type_check(result, msg) 1738 if args is Ellipsis: 1739 return self.copy_with((_TypingEllipsis, result)) 1740 if not isinstance(args, tuple): 1741 args = (args,) 1742 args = tuple(_type_convert(arg) for arg in args) 1743 params = args + (result,) 1744 return self.copy_with(params) 1745 1746 1747class _TupleType(_SpecialGenericAlias, _root=True): 1748 @_tp_cache 1749 def __getitem__(self, params): 1750 if not isinstance(params, tuple): 1751 params = (params,) 1752 if len(params) >= 2 and params[-1] is ...: 1753 msg = "Tuple[t, ...]: t must be a type." 1754 params = tuple(_type_check(p, msg) for p in params[:-1]) 1755 return self.copy_with((*params, _TypingEllipsis)) 1756 msg = "Tuple[t0, t1, ...]: each t must be a type." 
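        # Otherwise this is the fixed-length form, e.g. Tuple[int, str].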
1757 params = tuple(_type_check(p, msg) for p in params) 1758 return self.copy_with(params) 1759 1760 1761class _UnionGenericAlias(_NotIterable, _GenericAlias, _root=True): 1762 def copy_with(self, params): 1763 return Union[params] 1764 1765 def __eq__(self, other): 1766 if not isinstance(other, (_UnionGenericAlias, types.UnionType)): 1767 return NotImplemented 1768 try: # fast path 1769 return set(self.__args__) == set(other.__args__) 1770 except TypeError: # not hashable, slow path 1771 return _compare_args_orderless(self.__args__, other.__args__) 1772 1773 def __hash__(self): 1774 return hash(frozenset(self.__args__)) 1775 1776 def __repr__(self): 1777 args = self.__args__ 1778 if len(args) == 2: 1779 if args[0] is type(None): 1780 return f'typing.Optional[{_type_repr(args[1])}]' 1781 elif args[1] is type(None): 1782 return f'typing.Optional[{_type_repr(args[0])}]' 1783 return super().__repr__() 1784 1785 def __instancecheck__(self, obj): 1786 for arg in self.__args__: 1787 if isinstance(obj, arg): 1788 return True 1789 return False 1790 1791 def __subclasscheck__(self, cls): 1792 for arg in self.__args__: 1793 if issubclass(cls, arg): 1794 return True 1795 return False 1796 1797 def __reduce__(self): 1798 func, (origin, args) = super().__reduce__() 1799 return func, (Union, args) 1800 1801 1802def _value_and_type_iter(parameters): 1803 return ((p, type(p)) for p in parameters) 1804 1805 1806class _LiteralGenericAlias(_GenericAlias, _root=True): 1807 def __eq__(self, other): 1808 if not isinstance(other, _LiteralGenericAlias): 1809 return NotImplemented 1810 1811 return set(_value_and_type_iter(self.__args__)) == set(_value_and_type_iter(other.__args__)) 1812 1813 def __hash__(self): 1814 return hash(frozenset(_value_and_type_iter(self.__args__))) 1815 1816 1817class _ConcatenateGenericAlias(_GenericAlias, _root=True): 1818 def copy_with(self, params): 1819 if isinstance(params[-1], (list, tuple)): 1820 return (*params[:-1], *params[-1]) 1821 if isinstance(params[-1], _ConcatenateGenericAlias): 1822 params = (*params[:-1], *params[-1].__args__) 1823 return super().copy_with(params) 1824 1825 1826@_SpecialForm 1827def Unpack(self, parameters): 1828 """Type unpack operator. 1829 1830 The type unpack operator takes the child types from some container type, 1831 such as `tuple[int, str]` or a `TypeVarTuple`, and 'pulls them out'. 1832 1833 For example:: 1834 1835 # For some generic class `Foo`: 1836 Foo[Unpack[tuple[int, str]]] # Equivalent to Foo[int, str] 1837 1838 Ts = TypeVarTuple('Ts') 1839 # Specifies that `Bar` is generic in an arbitrary number of types. 1840 # (Think of `Ts` as a tuple of an arbitrary number of individual 1841 # `TypeVar`s, which the `Unpack` is 'pulling out' directly into the 1842 # `Generic[]`.) 1843 class Bar(Generic[Unpack[Ts]]): ... 1844 Bar[int] # Valid 1845 Bar[int, str] # Also valid 1846 1847 From Python 3.11, this can also be done using the `*` operator:: 1848 1849 Foo[*tuple[int, str]] 1850 class Bar(Generic[*Ts]): ... 1851 1852 And from Python 3.12, it can be done using built-in syntax for generics:: 1853 1854 Foo[*tuple[int, str]] 1855 class Bar[*Ts]: ... 1856 1857 The operator can also be used along with a `TypedDict` to annotate 1858 `**kwargs` in a function signature:: 1859 1860 class Movie(TypedDict): 1861 name: str 1862 year: int 1863 1864 # This function expects two keyword arguments - *name* of type `str` and 1865 # *year* of type `int`. 1866 def foo(**kwargs: Unpack[Movie]): ... 
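
    For illustration, a type checker would treat calls to the hypothetical
    `foo` above roughly as follows (argument values are never checked at
    runtime)::

        foo(name='Blade Runner', year=1982)    # Accepted
        foo(name='Blade Runner', year='1982')  # Rejected: 'year' expects an int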
1867 1868 Note that there is only some runtime checking of this operator. Not 1869 everything the runtime allows may be accepted by static type checkers. 1870 1871 For more information, see PEPs 646 and 692. 1872 """ 1873 item = _type_check(parameters, f'{self} accepts only single type.') 1874 return _UnpackGenericAlias(origin=self, args=(item,)) 1875 1876 1877class _UnpackGenericAlias(_GenericAlias, _root=True): 1878 def __repr__(self): 1879 # `Unpack` only takes one argument, so __args__ should contain only 1880 # a single item. 1881 return f'typing.Unpack[{_type_repr(self.__args__[0])}]' 1882 1883 def __getitem__(self, args): 1884 if self.__typing_is_unpacked_typevartuple__: 1885 return args 1886 return super().__getitem__(args) 1887 1888 @property 1889 def __typing_unpacked_tuple_args__(self): 1890 assert self.__origin__ is Unpack 1891 assert len(self.__args__) == 1 1892 arg, = self.__args__ 1893 if isinstance(arg, (_GenericAlias, types.GenericAlias)): 1894 if arg.__origin__ is not tuple: 1895 raise TypeError("Unpack[...] must be used with a tuple type") 1896 return arg.__args__ 1897 return None 1898 1899 @property 1900 def __typing_is_unpacked_typevartuple__(self): 1901 assert self.__origin__ is Unpack 1902 assert len(self.__args__) == 1 1903 return isinstance(self.__args__[0], TypeVarTuple) 1904 1905 1906class _TypingEllipsis: 1907 """Internal placeholder for ... (ellipsis).""" 1908 1909 1910_TYPING_INTERNALS = frozenset({ 1911 '__parameters__', '__orig_bases__', '__orig_class__', 1912 '_is_protocol', '_is_runtime_protocol', '__protocol_attrs__', 1913 '__non_callable_proto_members__', '__type_params__', 1914}) 1915 1916_SPECIAL_NAMES = frozenset({ 1917 '__abstractmethods__', '__annotations__', '__dict__', '__doc__', 1918 '__init__', '__module__', '__new__', '__slots__', 1919 '__subclasshook__', '__weakref__', '__class_getitem__', 1920 '__match_args__', '__static_attributes__', '__firstlineno__', 1921}) 1922 1923# These special attributes will be not collected as protocol members. 1924EXCLUDED_ATTRIBUTES = _TYPING_INTERNALS | _SPECIAL_NAMES | {'_MutableMapping__marker'} 1925 1926 1927def _get_protocol_attrs(cls): 1928 """Collect protocol members from a protocol class objects. 1929 1930 This includes names actually defined in the class dictionary, as well 1931 as names that appear in annotations. Special names (above) are skipped. 1932 """ 1933 attrs = set() 1934 for base in cls.__mro__[:-1]: # without object 1935 if base.__name__ in {'Protocol', 'Generic'}: 1936 continue 1937 annotations = getattr(base, '__annotations__', {}) 1938 for attr in (*base.__dict__, *annotations): 1939 if not attr.startswith('_abc_') and attr not in EXCLUDED_ATTRIBUTES: 1940 attrs.add(attr) 1941 return attrs 1942 1943 1944def _no_init_or_replace_init(self, *args, **kwargs): 1945 cls = type(self) 1946 1947 if cls._is_protocol: 1948 raise TypeError('Protocols cannot be instantiated') 1949 1950 # Already using a custom `__init__`. No need to calculate correct 1951 # `__init__` to call. This can lead to RecursionError. See bpo-45121. 1952 if cls.__init__ is not _no_init_or_replace_init: 1953 return 1954 1955 # Initially, `__init__` of a protocol subclass is set to `_no_init_or_replace_init`. 1956 # The first instantiation of the subclass will call `_no_init_or_replace_init` which 1957 # searches for a proper new `__init__` in the MRO. The new `__init__` 1958 # replaces the subclass' old `__init__` (ie `_no_init_or_replace_init`). 
Subsequent 1959 # instantiation of the protocol subclass will thus use the new 1960 # `__init__` and no longer call `_no_init_or_replace_init`. 1961 for base in cls.__mro__: 1962 init = base.__dict__.get('__init__', _no_init_or_replace_init) 1963 if init is not _no_init_or_replace_init: 1964 cls.__init__ = init 1965 break 1966 else: 1967 # should not happen 1968 cls.__init__ = object.__init__ 1969 1970 cls.__init__(self, *args, **kwargs) 1971 1972 1973def _caller(depth=1, default='__main__'): 1974 try: 1975 return sys._getframemodulename(depth + 1) or default 1976 except AttributeError: # For platforms without _getframemodulename() 1977 pass 1978 try: 1979 return sys._getframe(depth + 1).f_globals.get('__name__', default) 1980 except (AttributeError, ValueError): # For platforms without _getframe() 1981 pass 1982 return None 1983 1984def _allow_reckless_class_checks(depth=2): 1985 """Allow instance and class checks for special stdlib modules. 1986 1987 The abc and functools modules indiscriminately call isinstance() and 1988 issubclass() on the whole MRO of a user class, which may contain protocols. 1989 """ 1990 return _caller(depth) in {'abc', 'functools', None} 1991 1992 1993_PROTO_ALLOWLIST = { 1994 'collections.abc': [ 1995 'Callable', 'Awaitable', 'Iterable', 'Iterator', 'AsyncIterable', 1996 'AsyncIterator', 'Hashable', 'Sized', 'Container', 'Collection', 1997 'Reversible', 'Buffer', 1998 ], 1999 'contextlib': ['AbstractContextManager', 'AbstractAsyncContextManager'], 2000} 2001 2002 2003@functools.cache 2004def _lazy_load_getattr_static(): 2005 # Import getattr_static lazily so as not to slow down the import of typing.py 2006 # Cache the result so we don't slow down _ProtocolMeta.__instancecheck__ unnecessarily 2007 from inspect import getattr_static 2008 return getattr_static 2009 2010 2011_cleanups.append(_lazy_load_getattr_static.cache_clear) 2012 2013def _pickle_psargs(psargs): 2014 return ParamSpecArgs, (psargs.__origin__,) 2015 2016copyreg.pickle(ParamSpecArgs, _pickle_psargs) 2017 2018def _pickle_pskwargs(pskwargs): 2019 return ParamSpecKwargs, (pskwargs.__origin__,) 2020 2021copyreg.pickle(ParamSpecKwargs, _pickle_pskwargs) 2022 2023del _pickle_psargs, _pickle_pskwargs 2024 2025 2026# Preload these once, as globals, as a micro-optimisation. 2027# This makes a significant difference to the time it takes 2028# to do `isinstance()`/`issubclass()` checks 2029# against runtime-checkable protocols with only one callable member. 2030_abc_instancecheck = ABCMeta.__instancecheck__ 2031_abc_subclasscheck = ABCMeta.__subclasscheck__ 2032 2033 2034def _type_check_issubclass_arg_1(arg): 2035 """Raise TypeError if `arg` is not an instance of `type` 2036 in `issubclass(arg, <protocol>)`. 2037 2038 In most cases, this is verified by type.__subclasscheck__. 2039 Checking it again unnecessarily would slow down issubclass() checks, 2040 so, we don't perform this check unless we absolutely have to. 2041 2042 For various error paths, however, 2043 we want to ensure that *this* error message is shown to the user 2044 where relevant, rather than a typing.py-specific error message. 2045 """ 2046 if not isinstance(arg, type): 2047 # Same error message as for issubclass(1, int). 2048 raise TypeError('issubclass() arg 1 must be a class') 2049 2050 2051class _ProtocolMeta(ABCMeta): 2052 # This metaclass is somewhat unfortunate, 2053 # but is necessary for several reasons... 
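    #
    # For orientation, a minimal sketch of what this machinery provides for
    # @runtime_checkable protocols (HasClose is a made-up example, not part
    # of this module):
    #
    #     import io
    #
    #     @runtime_checkable
    #     class HasClose(Protocol):
    #         def close(self) -> None: ...
    #
    #     isinstance(io.StringIO(), HasClose)  # True: a 'close' attribute exists
    #     issubclass(int, HasClose)            # False: int defines no 'close'
    #
    # Only the presence of members is checked; signatures are never verified.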
2054 def __new__(mcls, name, bases, namespace, /, **kwargs): 2055 if name == "Protocol" and bases == (Generic,): 2056 pass 2057 elif Protocol in bases: 2058 for base in bases: 2059 if not ( 2060 base in {object, Generic} 2061 or base.__name__ in _PROTO_ALLOWLIST.get(base.__module__, []) 2062 or ( 2063 issubclass(base, Generic) 2064 and getattr(base, "_is_protocol", False) 2065 ) 2066 ): 2067 raise TypeError( 2068 f"Protocols can only inherit from other protocols, " 2069 f"got {base!r}" 2070 ) 2071 return super().__new__(mcls, name, bases, namespace, **kwargs) 2072 2073 def __init__(cls, *args, **kwargs): 2074 super().__init__(*args, **kwargs) 2075 if getattr(cls, "_is_protocol", False): 2076 cls.__protocol_attrs__ = _get_protocol_attrs(cls) 2077 2078 def __subclasscheck__(cls, other): 2079 if cls is Protocol: 2080 return type.__subclasscheck__(cls, other) 2081 if ( 2082 getattr(cls, '_is_protocol', False) 2083 and not _allow_reckless_class_checks() 2084 ): 2085 if not getattr(cls, '_is_runtime_protocol', False): 2086 _type_check_issubclass_arg_1(other) 2087 raise TypeError( 2088 "Instance and class checks can only be used with " 2089 "@runtime_checkable protocols" 2090 ) 2091 if ( 2092 # this attribute is set by @runtime_checkable: 2093 cls.__non_callable_proto_members__ 2094 and cls.__dict__.get("__subclasshook__") is _proto_hook 2095 ): 2096 _type_check_issubclass_arg_1(other) 2097 non_method_attrs = sorted(cls.__non_callable_proto_members__) 2098 raise TypeError( 2099 "Protocols with non-method members don't support issubclass()." 2100 f" Non-method members: {str(non_method_attrs)[1:-1]}." 2101 ) 2102 return _abc_subclasscheck(cls, other) 2103 2104 def __instancecheck__(cls, instance): 2105 # We need this method for situations where attributes are 2106 # assigned in __init__. 2107 if cls is Protocol: 2108 return type.__instancecheck__(cls, instance) 2109 if not getattr(cls, "_is_protocol", False): 2110 # i.e., it's a concrete subclass of a protocol 2111 return _abc_instancecheck(cls, instance) 2112 2113 if ( 2114 not getattr(cls, '_is_runtime_protocol', False) and 2115 not _allow_reckless_class_checks() 2116 ): 2117 raise TypeError("Instance and class checks can only be used with" 2118 " @runtime_checkable protocols") 2119 2120 if _abc_instancecheck(cls, instance): 2121 return True 2122 2123 getattr_static = _lazy_load_getattr_static() 2124 for attr in cls.__protocol_attrs__: 2125 try: 2126 val = getattr_static(instance, attr) 2127 except AttributeError: 2128 break 2129 # this attribute is set by @runtime_checkable: 2130 if val is None and attr not in cls.__non_callable_proto_members__: 2131 break 2132 else: 2133 return True 2134 2135 return False 2136 2137 2138@classmethod 2139def _proto_hook(cls, other): 2140 if not cls.__dict__.get('_is_protocol', False): 2141 return NotImplemented 2142 2143 for attr in cls.__protocol_attrs__: 2144 for base in other.__mro__: 2145 # Check if the members appears in the class dictionary... 2146 if attr in base.__dict__: 2147 if base.__dict__[attr] is None: 2148 return NotImplemented 2149 break 2150 2151 # ...or in annotations, if it is a sub-protocol. 2152 annotations = getattr(base, '__annotations__', {}) 2153 if (isinstance(annotations, collections.abc.Mapping) and 2154 attr in annotations and 2155 issubclass(other, Generic) and getattr(other, '_is_protocol', False)): 2156 break 2157 else: 2158 return NotImplemented 2159 return True 2160 2161 2162class Protocol(Generic, metaclass=_ProtocolMeta): 2163 """Base class for protocol classes. 
2164 2165 Protocol classes are defined as:: 2166 2167 class Proto(Protocol): 2168 def meth(self) -> int: 2169 ... 2170 2171 Such classes are primarily used with static type checkers that recognize 2172 structural subtyping (static duck-typing). 2173 2174 For example:: 2175 2176 class C: 2177 def meth(self) -> int: 2178 return 0 2179 2180 def func(x: Proto) -> int: 2181 return x.meth() 2182 2183 func(C()) # Passes static type check 2184 2185 See PEP 544 for details. Protocol classes decorated with 2186 @typing.runtime_checkable act as simple-minded runtime protocols that check 2187 only the presence of given attributes, ignoring their type signatures. 2188 Protocol classes can be generic, they are defined as:: 2189 2190 class GenProto[T](Protocol): 2191 def meth(self) -> T: 2192 ... 2193 """ 2194 2195 __slots__ = () 2196 _is_protocol = True 2197 _is_runtime_protocol = False 2198 2199 def __init_subclass__(cls, *args, **kwargs): 2200 super().__init_subclass__(*args, **kwargs) 2201 2202 # Determine if this is a protocol or a concrete subclass. 2203 if not cls.__dict__.get('_is_protocol', False): 2204 cls._is_protocol = any(b is Protocol for b in cls.__bases__) 2205 2206 # Set (or override) the protocol subclass hook. 2207 if '__subclasshook__' not in cls.__dict__: 2208 cls.__subclasshook__ = _proto_hook 2209 2210 # Prohibit instantiation for protocol classes 2211 if cls._is_protocol and cls.__init__ is Protocol.__init__: 2212 cls.__init__ = _no_init_or_replace_init 2213 2214 2215class _AnnotatedAlias(_NotIterable, _GenericAlias, _root=True): 2216 """Runtime representation of an annotated type. 2217 2218 At its core 'Annotated[t, dec1, dec2, ...]' is an alias for the type 't' 2219 with extra annotations. The alias behaves like a normal typing alias. 2220 Instantiating is the same as instantiating the underlying type; binding 2221 it to types is also the same. 2222 2223 The metadata itself is stored in a '__metadata__' attribute as a tuple. 2224 """ 2225 2226 def __init__(self, origin, metadata): 2227 if isinstance(origin, _AnnotatedAlias): 2228 metadata = origin.__metadata__ + metadata 2229 origin = origin.__origin__ 2230 super().__init__(origin, origin, name='Annotated') 2231 self.__metadata__ = metadata 2232 2233 def copy_with(self, params): 2234 assert len(params) == 1 2235 new_type = params[0] 2236 return _AnnotatedAlias(new_type, self.__metadata__) 2237 2238 def __repr__(self): 2239 return "typing.Annotated[{}, {}]".format( 2240 _type_repr(self.__origin__), 2241 ", ".join(repr(a) for a in self.__metadata__) 2242 ) 2243 2244 def __reduce__(self): 2245 return operator.getitem, ( 2246 Annotated, (self.__origin__,) + self.__metadata__ 2247 ) 2248 2249 def __eq__(self, other): 2250 if not isinstance(other, _AnnotatedAlias): 2251 return NotImplemented 2252 return (self.__origin__ == other.__origin__ 2253 and self.__metadata__ == other.__metadata__) 2254 2255 def __hash__(self): 2256 return hash((self.__origin__, self.__metadata__)) 2257 2258 def __getattr__(self, attr): 2259 if attr in {'__name__', '__qualname__'}: 2260 return 'Annotated' 2261 return super().__getattr__(attr) 2262 2263 def __mro_entries__(self, bases): 2264 return (self.__origin__,) 2265 2266 2267@_TypedCacheSpecialForm 2268@_tp_cache(typed=True) 2269def Annotated(self, *params): 2270 """Add context-specific metadata to a type. 2271 2272 Example: Annotated[int, runtime_check.Unsigned] indicates to the 2273 hypothetical runtime_check module that this type is an unsigned int. 
2274 Every other consumer of this type can ignore this metadata and treat 2275 this type as int. 2276 2277 The first argument to Annotated must be a valid type. 2278 2279 Details: 2280 2281 - It's an error to call `Annotated` with less than two arguments. 2282 - Access the metadata via the ``__metadata__`` attribute:: 2283 2284 assert Annotated[int, '$'].__metadata__ == ('$',) 2285 2286 - Nested Annotated types are flattened:: 2287 2288 assert Annotated[Annotated[T, Ann1, Ann2], Ann3] == Annotated[T, Ann1, Ann2, Ann3] 2289 2290 - Instantiating an annotated type is equivalent to instantiating the 2291 underlying type:: 2292 2293 assert Annotated[C, Ann1](5) == C(5) 2294 2295 - Annotated can be used as a generic type alias:: 2296 2297 type Optimized[T] = Annotated[T, runtime.Optimize()] 2298 # type checker will treat Optimized[int] 2299 # as equivalent to Annotated[int, runtime.Optimize()] 2300 2301 type OptimizedList[T] = Annotated[list[T], runtime.Optimize()] 2302 # type checker will treat OptimizedList[int] 2303 # as equivalent to Annotated[list[int], runtime.Optimize()] 2304 2305 - Annotated cannot be used with an unpacked TypeVarTuple:: 2306 2307 type Variadic[*Ts] = Annotated[*Ts, Ann1] # NOT valid 2308 2309 This would be equivalent to:: 2310 2311 Annotated[T1, T2, T3, ..., Ann1] 2312 2313 where T1, T2 etc. are TypeVars, which would be invalid, because 2314 only one type should be passed to Annotated. 2315 """ 2316 if len(params) < 2: 2317 raise TypeError("Annotated[...] should be used " 2318 "with at least two arguments (a type and an " 2319 "annotation).") 2320 if _is_unpacked_typevartuple(params[0]): 2321 raise TypeError("Annotated[...] should not be used with an " 2322 "unpacked TypeVarTuple") 2323 msg = "Annotated[t, ...]: t must be a type." 2324 origin = _type_check(params[0], msg, allow_special_forms=True) 2325 metadata = tuple(params[1:]) 2326 return _AnnotatedAlias(origin, metadata) 2327 2328 2329def runtime_checkable(cls): 2330 """Mark a protocol class as a runtime protocol. 2331 2332 Such protocol can be used with isinstance() and issubclass(). 2333 Raise TypeError if applied to a non-protocol class. 2334 This allows a simple-minded structural check very similar to 2335 one trick ponies in collections.abc such as Iterable. 2336 2337 For example:: 2338 2339 @runtime_checkable 2340 class Closable(Protocol): 2341 def close(self): ... 2342 2343 assert isinstance(open('/some/file'), Closable) 2344 2345 Warning: this will check only the presence of the required methods, 2346 not their type signatures! 2347 """ 2348 if not issubclass(cls, Generic) or not getattr(cls, '_is_protocol', False): 2349 raise TypeError('@runtime_checkable can be only applied to protocol classes,' 2350 ' got %r' % cls) 2351 cls._is_runtime_protocol = True 2352 # PEP 544 prohibits using issubclass() 2353 # with protocols that have non-method members. 2354 # See gh-113320 for why we compute this attribute here, 2355 # rather than in `_ProtocolMeta.__init__` 2356 cls.__non_callable_proto_members__ = set() 2357 for attr in cls.__protocol_attrs__: 2358 try: 2359 is_callable = callable(getattr(cls, attr, None)) 2360 except Exception as e: 2361 raise TypeError( 2362 f"Failed to determine whether protocol member {attr!r} " 2363 "is a method member" 2364 ) from e 2365 else: 2366 if not is_callable: 2367 cls.__non_callable_proto_members__.add(attr) 2368 return cls 2369 2370 2371def cast(typ, val): 2372 """Cast a value to a type. 2373 2374 This returns the value unchanged. 
To the type checker this 2375 signals that the return value has the designated type, but at 2376 runtime we intentionally don't check anything (we want this 2377 to be as fast as possible). 2378 """ 2379 return val 2380 2381 2382def assert_type(val, typ, /): 2383 """Ask a static type checker to confirm that the value is of the given type. 2384 2385 At runtime this does nothing: it returns the first argument unchanged with no 2386 checks or side effects, no matter the actual type of the argument. 2387 2388 When a static type checker encounters a call to assert_type(), it 2389 emits an error if the value is not of the specified type:: 2390 2391 def greet(name: str) -> None: 2392 assert_type(name, str) # OK 2393 assert_type(name, int) # type checker error 2394 """ 2395 return val 2396 2397 2398_allowed_types = (types.FunctionType, types.BuiltinFunctionType, 2399 types.MethodType, types.ModuleType, 2400 WrapperDescriptorType, MethodWrapperType, MethodDescriptorType) 2401 2402 2403def get_type_hints(obj, globalns=None, localns=None, include_extras=False): 2404 """Return type hints for an object. 2405 2406 This is often the same as obj.__annotations__, but it handles 2407 forward references encoded as string literals and recursively replaces all 2408 'Annotated[T, ...]' with 'T' (unless 'include_extras=True'). 2409 2410 The argument may be a module, class, method, or function. The annotations 2411 are returned as a dictionary. For classes, annotations include also 2412 inherited members. 2413 2414 TypeError is raised if the argument is not of a type that can contain 2415 annotations, and an empty dictionary is returned if no annotations are 2416 present. 2417 2418 BEWARE -- the behavior of globalns and localns is counterintuitive 2419 (unless you are familiar with how eval() and exec() work). The 2420 search order is locals first, then globals. 2421 2422 - If no dict arguments are passed, an attempt is made to use the 2423 globals from obj (or the respective module's globals for classes), 2424 and these are also used as the locals. If the object does not appear 2425 to have globals, an empty dictionary is used. For classes, the search 2426 order is globals first then locals. 2427 2428 - If one dict argument is passed, it is used for both globals and 2429 locals. 2430 2431 - If two dict arguments are passed, they specify globals and 2432 locals, respectively. 2433 """ 2434 if getattr(obj, '__no_type_check__', None): 2435 return {} 2436 # Classes require a special treatment. 2437 if isinstance(obj, type): 2438 hints = {} 2439 for base in reversed(obj.__mro__): 2440 if globalns is None: 2441 base_globals = getattr(sys.modules.get(base.__module__, None), '__dict__', {}) 2442 else: 2443 base_globals = globalns 2444 ann = base.__dict__.get('__annotations__', {}) 2445 if isinstance(ann, types.GetSetDescriptorType): 2446 ann = {} 2447 base_locals = dict(vars(base)) if localns is None else localns 2448 if localns is None and globalns is None: 2449 # This is surprising, but required. Before Python 3.10, 2450 # get_type_hints only evaluated the globalns of 2451 # a class. To maintain backwards compatibility, we reverse 2452 # the globalns and localns order so that eval() looks into 2453 # *base_globals* first rather than *base_locals*. 2454 # This only affects ForwardRefs. 
2455 base_globals, base_locals = base_locals, base_globals 2456 for name, value in ann.items(): 2457 if value is None: 2458 value = type(None) 2459 if isinstance(value, str): 2460 value = ForwardRef(value, is_argument=False, is_class=True) 2461 value = _eval_type(value, base_globals, base_locals, base.__type_params__) 2462 hints[name] = value 2463 return hints if include_extras else {k: _strip_annotations(t) for k, t in hints.items()} 2464 2465 if globalns is None: 2466 if isinstance(obj, types.ModuleType): 2467 globalns = obj.__dict__ 2468 else: 2469 nsobj = obj 2470 # Find globalns for the unwrapped object. 2471 while hasattr(nsobj, '__wrapped__'): 2472 nsobj = nsobj.__wrapped__ 2473 globalns = getattr(nsobj, '__globals__', {}) 2474 if localns is None: 2475 localns = globalns 2476 elif localns is None: 2477 localns = globalns 2478 hints = getattr(obj, '__annotations__', None) 2479 if hints is None: 2480 # Return empty annotations for something that _could_ have them. 2481 if isinstance(obj, _allowed_types): 2482 return {} 2483 else: 2484 raise TypeError('{!r} is not a module, class, method, ' 2485 'or function.'.format(obj)) 2486 hints = dict(hints) 2487 type_params = getattr(obj, "__type_params__", ()) 2488 for name, value in hints.items(): 2489 if value is None: 2490 value = type(None) 2491 if isinstance(value, str): 2492 # class-level forward refs were handled above, this must be either 2493 # a module-level annotation or a function argument annotation 2494 value = ForwardRef( 2495 value, 2496 is_argument=not isinstance(obj, types.ModuleType), 2497 is_class=False, 2498 ) 2499 hints[name] = _eval_type(value, globalns, localns, type_params) 2500 return hints if include_extras else {k: _strip_annotations(t) for k, t in hints.items()} 2501 2502 2503def _strip_annotations(t): 2504 """Strip the annotations from a given type.""" 2505 if isinstance(t, _AnnotatedAlias): 2506 return _strip_annotations(t.__origin__) 2507 if hasattr(t, "__origin__") and t.__origin__ in (Required, NotRequired, ReadOnly): 2508 return _strip_annotations(t.__args__[0]) 2509 if isinstance(t, _GenericAlias): 2510 stripped_args = tuple(_strip_annotations(a) for a in t.__args__) 2511 if stripped_args == t.__args__: 2512 return t 2513 return t.copy_with(stripped_args) 2514 if isinstance(t, GenericAlias): 2515 stripped_args = tuple(_strip_annotations(a) for a in t.__args__) 2516 if stripped_args == t.__args__: 2517 return t 2518 return GenericAlias(t.__origin__, stripped_args) 2519 if isinstance(t, types.UnionType): 2520 stripped_args = tuple(_strip_annotations(a) for a in t.__args__) 2521 if stripped_args == t.__args__: 2522 return t 2523 return functools.reduce(operator.or_, stripped_args) 2524 2525 return t 2526 2527 2528def get_origin(tp): 2529 """Get the unsubscripted version of a type. 2530 2531 This supports generic types, Callable, Tuple, Union, Literal, Final, ClassVar, 2532 Annotated, and others. Return None for unsupported types. 
2533 2534 Examples:: 2535 2536 >>> P = ParamSpec('P') 2537 >>> assert get_origin(Literal[42]) is Literal 2538 >>> assert get_origin(int) is None 2539 >>> assert get_origin(ClassVar[int]) is ClassVar 2540 >>> assert get_origin(Generic) is Generic 2541 >>> assert get_origin(Generic[T]) is Generic 2542 >>> assert get_origin(Union[T, int]) is Union 2543 >>> assert get_origin(List[Tuple[T, T]][int]) is list 2544 >>> assert get_origin(P.args) is P 2545 """ 2546 if isinstance(tp, _AnnotatedAlias): 2547 return Annotated 2548 if isinstance(tp, (_BaseGenericAlias, GenericAlias, 2549 ParamSpecArgs, ParamSpecKwargs)): 2550 return tp.__origin__ 2551 if tp is Generic: 2552 return Generic 2553 if isinstance(tp, types.UnionType): 2554 return types.UnionType 2555 return None 2556 2557 2558def get_args(tp): 2559 """Get type arguments with all substitutions performed. 2560 2561 For unions, basic simplifications used by Union constructor are performed. 2562 2563 Examples:: 2564 2565 >>> T = TypeVar('T') 2566 >>> assert get_args(Dict[str, int]) == (str, int) 2567 >>> assert get_args(int) == () 2568 >>> assert get_args(Union[int, Union[T, int], str][int]) == (int, str) 2569 >>> assert get_args(Union[int, Tuple[T, int]][str]) == (int, Tuple[str, int]) 2570 >>> assert get_args(Callable[[], T][int]) == ([], int) 2571 """ 2572 if isinstance(tp, _AnnotatedAlias): 2573 return (tp.__origin__,) + tp.__metadata__ 2574 if isinstance(tp, (_GenericAlias, GenericAlias)): 2575 res = tp.__args__ 2576 if _should_unflatten_callable_args(tp, res): 2577 res = (list(res[:-1]), res[-1]) 2578 return res 2579 if isinstance(tp, types.UnionType): 2580 return tp.__args__ 2581 return () 2582 2583 2584def is_typeddict(tp): 2585 """Check if an annotation is a TypedDict class. 2586 2587 For example:: 2588 2589 >>> from typing import TypedDict 2590 >>> class Film(TypedDict): 2591 ... title: str 2592 ... year: int 2593 ... 2594 >>> is_typeddict(Film) 2595 True 2596 >>> is_typeddict(dict) 2597 False 2598 """ 2599 return isinstance(tp, _TypedDictMeta) 2600 2601 2602_ASSERT_NEVER_REPR_MAX_LENGTH = 100 2603 2604 2605def assert_never(arg: Never, /) -> Never: 2606 """Statically assert that a line of code is unreachable. 2607 2608 Example:: 2609 2610 def int_or_str(arg: int | str) -> None: 2611 match arg: 2612 case int(): 2613 print("It's an int") 2614 case str(): 2615 print("It's a str") 2616 case _: 2617 assert_never(arg) 2618 2619 If a type checker finds that a call to assert_never() is 2620 reachable, it will emit an error. 2621 2622 At runtime, this throws an exception when called. 2623 """ 2624 value = repr(arg) 2625 if len(value) > _ASSERT_NEVER_REPR_MAX_LENGTH: 2626 value = value[:_ASSERT_NEVER_REPR_MAX_LENGTH] + '...' 2627 raise AssertionError(f"Expected code to be unreachable, but got: {value}") 2628 2629 2630def no_type_check(arg): 2631 """Decorator to indicate that annotations are not type hints. 2632 2633 The argument must be a class or function; if it is a class, it 2634 applies recursively to all methods and classes defined in that class 2635 (but not to methods defined in its superclasses or subclasses). 2636 2637 This mutates the function(s) or class(es) in place. 2638 """ 2639 if isinstance(arg, type): 2640 for key in dir(arg): 2641 obj = getattr(arg, key) 2642 if ( 2643 not hasattr(obj, '__qualname__') 2644 or obj.__qualname__ != f'{arg.__qualname__}.{obj.__name__}' 2645 or getattr(obj, '__module__', None) != arg.__module__ 2646 ): 2647 # We only modify objects that are defined in this type directly. 
2648 # If classes / methods are nested in multiple layers, 2649 # we will modify them when processing their direct holders. 2650 continue 2651 # Instance, class, and static methods: 2652 if isinstance(obj, types.FunctionType): 2653 obj.__no_type_check__ = True 2654 if isinstance(obj, types.MethodType): 2655 obj.__func__.__no_type_check__ = True 2656 # Nested types: 2657 if isinstance(obj, type): 2658 no_type_check(obj) 2659 try: 2660 arg.__no_type_check__ = True 2661 except TypeError: # built-in classes 2662 pass 2663 return arg 2664 2665 2666def no_type_check_decorator(decorator): 2667 """Decorator to give another decorator the @no_type_check effect. 2668 2669 This wraps the decorator with something that wraps the decorated 2670 function in @no_type_check. 2671 """ 2672 import warnings 2673 warnings._deprecated("typing.no_type_check_decorator", remove=(3, 15)) 2674 @functools.wraps(decorator) 2675 def wrapped_decorator(*args, **kwds): 2676 func = decorator(*args, **kwds) 2677 func = no_type_check(func) 2678 return func 2679 2680 return wrapped_decorator 2681 2682 2683def _overload_dummy(*args, **kwds): 2684 """Helper for @overload to raise when called.""" 2685 raise NotImplementedError( 2686 "You should not call an overloaded function. " 2687 "A series of @overload-decorated functions " 2688 "outside a stub module should always be followed " 2689 "by an implementation that is not @overload-ed.") 2690 2691 2692# {module: {qualname: {firstlineno: func}}} 2693_overload_registry = defaultdict(functools.partial(defaultdict, dict)) 2694 2695 2696def overload(func): 2697 """Decorator for overloaded functions/methods. 2698 2699 In a stub file, place two or more stub definitions for the same 2700 function in a row, each decorated with @overload. 2701 2702 For example:: 2703 2704 @overload 2705 def utf8(value: None) -> None: ... 2706 @overload 2707 def utf8(value: bytes) -> bytes: ... 2708 @overload 2709 def utf8(value: str) -> bytes: ... 2710 2711 In a non-stub file (i.e. a regular .py file), do the same but 2712 follow it with an implementation. The implementation should *not* 2713 be decorated with @overload:: 2714 2715 @overload 2716 def utf8(value: None) -> None: ... 2717 @overload 2718 def utf8(value: bytes) -> bytes: ... 2719 @overload 2720 def utf8(value: str) -> bytes: ... 2721 def utf8(value): 2722 ... # implementation goes here 2723 2724 The overloads for a function can be retrieved at runtime using the 2725 get_overloads() function. 2726 """ 2727 # classmethod and staticmethod 2728 f = getattr(func, "__func__", func) 2729 try: 2730 _overload_registry[f.__module__][f.__qualname__][f.__code__.co_firstlineno] = func 2731 except AttributeError: 2732 # Not a normal function; ignore. 2733 pass 2734 return _overload_dummy 2735 2736 2737def get_overloads(func): 2738 """Return all defined overloads for *func* as a sequence.""" 2739 # classmethod and staticmethod 2740 f = getattr(func, "__func__", func) 2741 if f.__module__ not in _overload_registry: 2742 return [] 2743 mod_dict = _overload_registry[f.__module__] 2744 if f.__qualname__ not in mod_dict: 2745 return [] 2746 return list(mod_dict[f.__qualname__].values()) 2747 2748 2749def clear_overloads(): 2750 """Clear all overloads in the registry.""" 2751 _overload_registry.clear() 2752 2753 2754def final(f): 2755 """Decorator to indicate final methods and final classes. 2756 2757 Use this decorator to indicate to type checkers that the decorated 2758 method cannot be overridden, and decorated class cannot be subclassed. 
2759 2760 For example:: 2761 2762 class Base: 2763 @final 2764 def done(self) -> None: 2765 ... 2766 class Sub(Base): 2767 def done(self) -> None: # Error reported by type checker 2768 ... 2769 2770 @final 2771 class Leaf: 2772 ... 2773 class Other(Leaf): # Error reported by type checker 2774 ... 2775 2776 There is no runtime checking of these properties. The decorator 2777 attempts to set the ``__final__`` attribute to ``True`` on the decorated 2778 object to allow runtime introspection. 2779 """ 2780 try: 2781 f.__final__ = True 2782 except (AttributeError, TypeError): 2783 # Skip the attribute silently if it is not writable. 2784 # AttributeError happens if the object has __slots__ or a 2785 # read-only property, TypeError if it's a builtin class. 2786 pass 2787 return f 2788 2789 2790# Some unconstrained type variables. These were initially used by the container types. 2791# They were never meant for export and are now unused, but we keep them around to 2792# avoid breaking compatibility with users who import them. 2793T = TypeVar('T') # Any type. 2794KT = TypeVar('KT') # Key type. 2795VT = TypeVar('VT') # Value type. 2796T_co = TypeVar('T_co', covariant=True) # Any type covariant containers. 2797V_co = TypeVar('V_co', covariant=True) # Any type covariant containers. 2798VT_co = TypeVar('VT_co', covariant=True) # Value type covariant containers. 2799T_contra = TypeVar('T_contra', contravariant=True) # Ditto contravariant. 2800# Internal type variable used for Type[]. 2801CT_co = TypeVar('CT_co', covariant=True, bound=type) 2802 2803 2804# A useful type variable with constraints. This represents string types. 2805# (This one *is* for export!) 2806AnyStr = TypeVar('AnyStr', bytes, str) 2807 2808 2809# Various ABCs mimicking those in collections.abc. 2810_alias = _SpecialGenericAlias 2811 2812Hashable = _alias(collections.abc.Hashable, 0) # Not generic. 2813Awaitable = _alias(collections.abc.Awaitable, 1) 2814Coroutine = _alias(collections.abc.Coroutine, 3) 2815AsyncIterable = _alias(collections.abc.AsyncIterable, 1) 2816AsyncIterator = _alias(collections.abc.AsyncIterator, 1) 2817Iterable = _alias(collections.abc.Iterable, 1) 2818Iterator = _alias(collections.abc.Iterator, 1) 2819Reversible = _alias(collections.abc.Reversible, 1) 2820Sized = _alias(collections.abc.Sized, 0) # Not generic. 2821Container = _alias(collections.abc.Container, 1) 2822Collection = _alias(collections.abc.Collection, 1) 2823Callable = _CallableType(collections.abc.Callable, 2) 2824Callable.__doc__ = \ 2825 """Deprecated alias to collections.abc.Callable. 2826 2827 Callable[[int], str] signifies a function that takes a single 2828 parameter of type int and returns a str. 2829 2830 The subscription syntax must always be used with exactly two 2831 values: the argument list and the return type. 2832 The argument list must be a list of types, a ParamSpec, 2833 Concatenate or ellipsis. The return type must be a single type. 2834 2835 There is no syntax to indicate optional or keyword arguments; 2836 such function types are rarely used as callback types. 2837 """ 2838AbstractSet = _alias(collections.abc.Set, 1, name='AbstractSet') 2839MutableSet = _alias(collections.abc.MutableSet, 1) 2840# NOTE: Mapping is only covariant in the value type. 
2841Mapping = _alias(collections.abc.Mapping, 2) 2842MutableMapping = _alias(collections.abc.MutableMapping, 2) 2843Sequence = _alias(collections.abc.Sequence, 1) 2844MutableSequence = _alias(collections.abc.MutableSequence, 1) 2845ByteString = _DeprecatedGenericAlias( 2846 collections.abc.ByteString, 0, removal_version=(3, 14) # Not generic. 2847) 2848# Tuple accepts variable number of parameters. 2849Tuple = _TupleType(tuple, -1, inst=False, name='Tuple') 2850Tuple.__doc__ = \ 2851 """Deprecated alias to builtins.tuple. 2852 2853 Tuple[X, Y] is the cross-product type of X and Y. 2854 2855 Example: Tuple[T1, T2] is a tuple of two elements corresponding 2856 to type variables T1 and T2. Tuple[int, float, str] is a tuple 2857 of an int, a float and a string. 2858 2859 To specify a variable-length tuple of homogeneous type, use Tuple[T, ...]. 2860 """ 2861List = _alias(list, 1, inst=False, name='List') 2862Deque = _alias(collections.deque, 1, name='Deque') 2863Set = _alias(set, 1, inst=False, name='Set') 2864FrozenSet = _alias(frozenset, 1, inst=False, name='FrozenSet') 2865MappingView = _alias(collections.abc.MappingView, 1) 2866KeysView = _alias(collections.abc.KeysView, 1) 2867ItemsView = _alias(collections.abc.ItemsView, 2) 2868ValuesView = _alias(collections.abc.ValuesView, 1) 2869Dict = _alias(dict, 2, inst=False, name='Dict') 2870DefaultDict = _alias(collections.defaultdict, 2, name='DefaultDict') 2871OrderedDict = _alias(collections.OrderedDict, 2) 2872Counter = _alias(collections.Counter, 1) 2873ChainMap = _alias(collections.ChainMap, 2) 2874Generator = _alias(collections.abc.Generator, 3, defaults=(types.NoneType, types.NoneType)) 2875AsyncGenerator = _alias(collections.abc.AsyncGenerator, 2, defaults=(types.NoneType,)) 2876Type = _alias(type, 1, inst=False, name='Type') 2877Type.__doc__ = \ 2878 """Deprecated alias to builtins.type. 2879 2880 builtins.type or typing.Type can be used to annotate class objects. 2881 For example, suppose we have the following classes:: 2882 2883 class User: ... # Abstract base for User classes 2884 class BasicUser(User): ... 2885 class ProUser(User): ... 2886 class TeamUser(User): ... 2887 2888 And a function that takes a class argument that's a subclass of 2889 User and returns an instance of the corresponding class:: 2890 2891 def new_user[U](user_class: Type[U]) -> U: 2892 user = user_class() 2893 # (Here we could write the user object to a database) 2894 return user 2895 2896 joe = new_user(BasicUser) 2897 2898 At this point the type checker knows that joe has type BasicUser. 
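
    Passing a different subclass narrows the result accordingly; a type
    checker would infer the type of jane below as ProUser::

        jane = new_user(ProUser)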
2899 """ 2900 2901 2902@runtime_checkable 2903class SupportsInt(Protocol): 2904 """An ABC with one abstract method __int__.""" 2905 2906 __slots__ = () 2907 2908 @abstractmethod 2909 def __int__(self) -> int: 2910 pass 2911 2912 2913@runtime_checkable 2914class SupportsFloat(Protocol): 2915 """An ABC with one abstract method __float__.""" 2916 2917 __slots__ = () 2918 2919 @abstractmethod 2920 def __float__(self) -> float: 2921 pass 2922 2923 2924@runtime_checkable 2925class SupportsComplex(Protocol): 2926 """An ABC with one abstract method __complex__.""" 2927 2928 __slots__ = () 2929 2930 @abstractmethod 2931 def __complex__(self) -> complex: 2932 pass 2933 2934 2935@runtime_checkable 2936class SupportsBytes(Protocol): 2937 """An ABC with one abstract method __bytes__.""" 2938 2939 __slots__ = () 2940 2941 @abstractmethod 2942 def __bytes__(self) -> bytes: 2943 pass 2944 2945 2946@runtime_checkable 2947class SupportsIndex(Protocol): 2948 """An ABC with one abstract method __index__.""" 2949 2950 __slots__ = () 2951 2952 @abstractmethod 2953 def __index__(self) -> int: 2954 pass 2955 2956 2957@runtime_checkable 2958class SupportsAbs[T](Protocol): 2959 """An ABC with one abstract method __abs__ that is covariant in its return type.""" 2960 2961 __slots__ = () 2962 2963 @abstractmethod 2964 def __abs__(self) -> T: 2965 pass 2966 2967 2968@runtime_checkable 2969class SupportsRound[T](Protocol): 2970 """An ABC with one abstract method __round__ that is covariant in its return type.""" 2971 2972 __slots__ = () 2973 2974 @abstractmethod 2975 def __round__(self, ndigits: int = 0) -> T: 2976 pass 2977 2978 2979def _make_nmtuple(name, types, module, defaults = ()): 2980 fields = [n for n, t in types] 2981 types = {n: _type_check(t, f"field {n} annotation must be a type") 2982 for n, t in types} 2983 nm_tpl = collections.namedtuple(name, fields, 2984 defaults=defaults, module=module) 2985 nm_tpl.__annotations__ = nm_tpl.__new__.__annotations__ = types 2986 return nm_tpl 2987 2988 2989# attributes prohibited to set in NamedTuple class syntax 2990_prohibited = frozenset({'__new__', '__init__', '__slots__', '__getnewargs__', 2991 '_fields', '_field_defaults', 2992 '_make', '_replace', '_asdict', '_source'}) 2993 2994_special = frozenset({'__module__', '__name__', '__annotations__'}) 2995 2996 2997class NamedTupleMeta(type): 2998 def __new__(cls, typename, bases, ns): 2999 assert _NamedTuple in bases 3000 for base in bases: 3001 if base is not _NamedTuple and base is not Generic: 3002 raise TypeError( 3003 'can only inherit from a NamedTuple type and Generic') 3004 bases = tuple(tuple if base is _NamedTuple else base for base in bases) 3005 types = ns.get('__annotations__', {}) 3006 default_names = [] 3007 for field_name in types: 3008 if field_name in ns: 3009 default_names.append(field_name) 3010 elif default_names: 3011 raise TypeError(f"Non-default namedtuple field {field_name} " 3012 f"cannot follow default field" 3013 f"{'s' if len(default_names) > 1 else ''} " 3014 f"{', '.join(default_names)}") 3015 nm_tpl = _make_nmtuple(typename, types.items(), 3016 defaults=[ns[n] for n in default_names], 3017 module=ns['__module__']) 3018 nm_tpl.__bases__ = bases 3019 if Generic in bases: 3020 class_getitem = _generic_class_getitem 3021 nm_tpl.__class_getitem__ = classmethod(class_getitem) 3022 # update from user namespace without overriding special namedtuple attributes 3023 for key, val in ns.items(): 3024 if key in _prohibited: 3025 raise AttributeError("Cannot overwrite NamedTuple attribute " + key) 
3026 elif key not in _special: 3027 if key not in nm_tpl._fields: 3028 setattr(nm_tpl, key, val) 3029 try: 3030 set_name = type(val).__set_name__ 3031 except AttributeError: 3032 pass 3033 else: 3034 try: 3035 set_name(val, nm_tpl, key) 3036 except BaseException as e: 3037 e.add_note( 3038 f"Error calling __set_name__ on {type(val).__name__!r} " 3039 f"instance {key!r} in {typename!r}" 3040 ) 3041 raise 3042 3043 if Generic in bases: 3044 nm_tpl.__init_subclass__() 3045 return nm_tpl 3046 3047 3048def NamedTuple(typename, fields=_sentinel, /, **kwargs): 3049 """Typed version of namedtuple. 3050 3051 Usage:: 3052 3053 class Employee(NamedTuple): 3054 name: str 3055 id: int 3056 3057 This is equivalent to:: 3058 3059 Employee = collections.namedtuple('Employee', ['name', 'id']) 3060 3061 The resulting class has an extra __annotations__ attribute, giving a 3062 dict that maps field names to types. (The field names are also in 3063 the _fields attribute, which is part of the namedtuple API.) 3064 An alternative equivalent functional syntax is also accepted:: 3065 3066 Employee = NamedTuple('Employee', [('name', str), ('id', int)]) 3067 """ 3068 if fields is _sentinel: 3069 if kwargs: 3070 deprecated_thing = "Creating NamedTuple classes using keyword arguments" 3071 deprecation_msg = ( 3072 "{name} is deprecated and will be disallowed in Python {remove}. " 3073 "Use the class-based or functional syntax instead." 3074 ) 3075 else: 3076 deprecated_thing = "Failing to pass a value for the 'fields' parameter" 3077 example = f"`{typename} = NamedTuple({typename!r}, [])`" 3078 deprecation_msg = ( 3079 "{name} is deprecated and will be disallowed in Python {remove}. " 3080 "To create a NamedTuple class with 0 fields " 3081 "using the functional syntax, " 3082 "pass an empty list, e.g. " 3083 ) + example + "." 3084 elif fields is None: 3085 if kwargs: 3086 raise TypeError( 3087 "Cannot pass `None` as the 'fields' parameter " 3088 "and also specify fields using keyword arguments" 3089 ) 3090 else: 3091 deprecated_thing = "Passing `None` as the 'fields' parameter" 3092 example = f"`{typename} = NamedTuple({typename!r}, [])`" 3093 deprecation_msg = ( 3094 "{name} is deprecated and will be disallowed in Python {remove}. " 3095 "To create a NamedTuple class with 0 fields " 3096 "using the functional syntax, " 3097 "pass an empty list, e.g. " 3098 ) + example + "." 
3099 elif kwargs: 3100 raise TypeError("Either list of fields or keywords" 3101 " can be provided to NamedTuple, not both") 3102 if fields is _sentinel or fields is None: 3103 import warnings 3104 warnings._deprecated(deprecated_thing, message=deprecation_msg, remove=(3, 15)) 3105 fields = kwargs.items() 3106 nt = _make_nmtuple(typename, fields, module=_caller()) 3107 nt.__orig_bases__ = (NamedTuple,) 3108 return nt 3109 3110_NamedTuple = type.__new__(NamedTupleMeta, 'NamedTuple', (), {}) 3111 3112def _namedtuple_mro_entries(bases): 3113 assert NamedTuple in bases 3114 return (_NamedTuple,) 3115 3116NamedTuple.__mro_entries__ = _namedtuple_mro_entries 3117 3118 3119def _get_typeddict_qualifiers(annotation_type): 3120 while True: 3121 annotation_origin = get_origin(annotation_type) 3122 if annotation_origin is Annotated: 3123 annotation_args = get_args(annotation_type) 3124 if annotation_args: 3125 annotation_type = annotation_args[0] 3126 else: 3127 break 3128 elif annotation_origin is Required: 3129 yield Required 3130 (annotation_type,) = get_args(annotation_type) 3131 elif annotation_origin is NotRequired: 3132 yield NotRequired 3133 (annotation_type,) = get_args(annotation_type) 3134 elif annotation_origin is ReadOnly: 3135 yield ReadOnly 3136 (annotation_type,) = get_args(annotation_type) 3137 else: 3138 break 3139 3140 3141class _TypedDictMeta(type): 3142 def __new__(cls, name, bases, ns, total=True): 3143 """Create a new typed dict class object. 3144 3145 This method is called when TypedDict is subclassed, 3146 or when TypedDict is instantiated. This way 3147 TypedDict supports all three syntax forms described in its docstring. 3148 Subclasses and instances of TypedDict return actual dictionaries. 3149 """ 3150 for base in bases: 3151 if type(base) is not _TypedDictMeta and base is not Generic: 3152 raise TypeError('cannot inherit from both a TypedDict type ' 3153 'and a non-TypedDict base class') 3154 3155 if any(issubclass(b, Generic) for b in bases): 3156 generic_base = (Generic,) 3157 else: 3158 generic_base = () 3159 3160 tp_dict = type.__new__(_TypedDictMeta, name, (*generic_base, dict), ns) 3161 3162 if not hasattr(tp_dict, '__orig_bases__'): 3163 tp_dict.__orig_bases__ = bases 3164 3165 annotations = {} 3166 own_annotations = ns.get('__annotations__', {}) 3167 msg = "TypedDict('Name', {f0: t0, f1: t1, ...}); each t must be a type" 3168 own_annotations = { 3169 n: _type_check(tp, msg, module=tp_dict.__module__) 3170 for n, tp in own_annotations.items() 3171 } 3172 required_keys = set() 3173 optional_keys = set() 3174 readonly_keys = set() 3175 mutable_keys = set() 3176 3177 for base in bases: 3178 annotations.update(base.__dict__.get('__annotations__', {})) 3179 3180 base_required = base.__dict__.get('__required_keys__', set()) 3181 required_keys |= base_required 3182 optional_keys -= base_required 3183 3184 base_optional = base.__dict__.get('__optional_keys__', set()) 3185 required_keys -= base_optional 3186 optional_keys |= base_optional 3187 3188 readonly_keys.update(base.__dict__.get('__readonly_keys__', ())) 3189 mutable_keys.update(base.__dict__.get('__mutable_keys__', ())) 3190 3191 annotations.update(own_annotations) 3192 for annotation_key, annotation_type in own_annotations.items(): 3193 qualifiers = set(_get_typeddict_qualifiers(annotation_type)) 3194 if Required in qualifiers: 3195 is_required = True 3196 elif NotRequired in qualifiers: 3197 is_required = False 3198 else: 3199 is_required = total 3200 3201 if is_required: 3202 required_keys.add(annotation_key) 
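                # The key may have been inherited as optional from a base
                # class; make sure it is no longer in the optional set.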
3203 optional_keys.discard(annotation_key) 3204 else: 3205 optional_keys.add(annotation_key) 3206 required_keys.discard(annotation_key) 3207 3208 if ReadOnly in qualifiers: 3209 if annotation_key in mutable_keys: 3210 raise TypeError( 3211 f"Cannot override mutable key {annotation_key!r}" 3212 " with read-only key" 3213 ) 3214 readonly_keys.add(annotation_key) 3215 else: 3216 mutable_keys.add(annotation_key) 3217 readonly_keys.discard(annotation_key) 3218 3219 assert required_keys.isdisjoint(optional_keys), ( 3220 f"Required keys overlap with optional keys in {name}:" 3221 f" {required_keys=}, {optional_keys=}" 3222 ) 3223 tp_dict.__annotations__ = annotations 3224 tp_dict.__required_keys__ = frozenset(required_keys) 3225 tp_dict.__optional_keys__ = frozenset(optional_keys) 3226 tp_dict.__readonly_keys__ = frozenset(readonly_keys) 3227 tp_dict.__mutable_keys__ = frozenset(mutable_keys) 3228 tp_dict.__total__ = total 3229 return tp_dict 3230 3231 __call__ = dict # static method 3232 3233 def __subclasscheck__(cls, other): 3234 # Typed dicts are only for static structural subtyping. 3235 raise TypeError('TypedDict does not support instance and class checks') 3236 3237 __instancecheck__ = __subclasscheck__ 3238 3239 3240def TypedDict(typename, fields=_sentinel, /, *, total=True): 3241 """A simple typed namespace. At runtime it is equivalent to a plain dict. 3242 3243 TypedDict creates a dictionary type such that a type checker will expect all 3244 instances to have a certain set of keys, where each key is 3245 associated with a value of a consistent type. This expectation 3246 is not checked at runtime. 3247 3248 Usage:: 3249 3250 >>> class Point2D(TypedDict): 3251 ... x: int 3252 ... y: int 3253 ... label: str 3254 ... 3255 >>> a: Point2D = {'x': 1, 'y': 2, 'label': 'good'} # OK 3256 >>> b: Point2D = {'z': 3, 'label': 'bad'} # Fails type check 3257 >>> Point2D(x=1, y=2, label='first') == dict(x=1, y=2, label='first') 3258 True 3259 3260 The type info can be accessed via the Point2D.__annotations__ dict, and 3261 the Point2D.__required_keys__ and Point2D.__optional_keys__ frozensets. 3262 TypedDict supports an additional equivalent form:: 3263 3264 Point2D = TypedDict('Point2D', {'x': int, 'y': int, 'label': str}) 3265 3266 By default, all keys must be present in a TypedDict. It is possible 3267 to override this by specifying totality:: 3268 3269 class Point2D(TypedDict, total=False): 3270 x: int 3271 y: int 3272 3273 This means that a Point2D TypedDict can have any of the keys omitted. A type 3274 checker is only expected to support a literal False or True as the value of 3275 the total argument. True is the default, and makes all items defined in the 3276 class body be required. 3277 3278 The Required and NotRequired special forms can also be used to mark 3279 individual keys as being required or not required:: 3280 3281 class Point2D(TypedDict): 3282 x: int # the "x" key must always be present (Required is the default) 3283 y: NotRequired[int] # the "y" key can be omitted 3284 3285 See PEP 655 for more details on Required and NotRequired. 
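
    At runtime, the required/optional split for the variant just shown is
    visible through the introspection attributes::

        Point2D.__required_keys__ == frozenset({'x'})
        Point2D.__optional_keys__ == frozenset({'y'})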
3286 3287 The ReadOnly special form can be used 3288 to mark individual keys as immutable for type checkers:: 3289 3290 class DatabaseUser(TypedDict): 3291 id: ReadOnly[int] # the "id" key must not be modified 3292 username: str # the "username" key can be changed 3293 3294 """ 3295 if fields is _sentinel or fields is None: 3296 import warnings 3297 3298 if fields is _sentinel: 3299 deprecated_thing = "Failing to pass a value for the 'fields' parameter" 3300 else: 3301 deprecated_thing = "Passing `None` as the 'fields' parameter" 3302 3303 example = f"`{typename} = TypedDict({typename!r}, {{{{}}}})`" 3304 deprecation_msg = ( 3305 "{name} is deprecated and will be disallowed in Python {remove}. " 3306 "To create a TypedDict class with 0 fields " 3307 "using the functional syntax, " 3308 "pass an empty dictionary, e.g. " 3309 ) + example + "." 3310 warnings._deprecated(deprecated_thing, message=deprecation_msg, remove=(3, 15)) 3311 fields = {} 3312 3313 ns = {'__annotations__': dict(fields)} 3314 module = _caller() 3315 if module is not None: 3316 # Setting correct module is necessary to make typed dict classes pickleable. 3317 ns['__module__'] = module 3318 3319 td = _TypedDictMeta(typename, (), ns, total=total) 3320 td.__orig_bases__ = (TypedDict,) 3321 return td 3322 3323_TypedDict = type.__new__(_TypedDictMeta, 'TypedDict', (), {}) 3324TypedDict.__mro_entries__ = lambda bases: (_TypedDict,) 3325 3326 3327@_SpecialForm 3328def Required(self, parameters): 3329 """Special typing construct to mark a TypedDict key as required. 3330 3331 This is mainly useful for total=False TypedDicts. 3332 3333 For example:: 3334 3335 class Movie(TypedDict, total=False): 3336 title: Required[str] 3337 year: int 3338 3339 m = Movie( 3340 title='The Matrix', # typechecker error if key is omitted 3341 year=1999, 3342 ) 3343 3344 There is no runtime checking that a required key is actually provided 3345 when instantiating a related TypedDict. 3346 """ 3347 item = _type_check(parameters, f'{self._name} accepts only a single type.') 3348 return _GenericAlias(self, (item,)) 3349 3350 3351@_SpecialForm 3352def NotRequired(self, parameters): 3353 """Special typing construct to mark a TypedDict key as potentially missing. 3354 3355 For example:: 3356 3357 class Movie(TypedDict): 3358 title: str 3359 year: NotRequired[int] 3360 3361 m = Movie( 3362 title='The Matrix', # typechecker error if key is omitted 3363 year=1999, 3364 ) 3365 """ 3366 item = _type_check(parameters, f'{self._name} accepts only a single type.') 3367 return _GenericAlias(self, (item,)) 3368 3369 3370@_SpecialForm 3371def ReadOnly(self, parameters): 3372 """A special typing construct to mark an item of a TypedDict as read-only. 3373 3374 For example:: 3375 3376 class Movie(TypedDict): 3377 title: ReadOnly[str] 3378 year: int 3379 3380 def mutate_movie(m: Movie) -> None: 3381 m["year"] = 1992 # allowed 3382 m["title"] = "The Matrix" # typechecker error 3383 3384 There is no runtime checking for this property. 3385 """ 3386 item = _type_check(parameters, f'{self._name} accepts only a single type.') 3387 return _GenericAlias(self, (item,)) 3388 3389 3390class NewType: 3391 """NewType creates simple unique types with almost zero runtime overhead. 3392 3393 NewType(name, tp) is considered a subtype of tp 3394 by static type checkers. At runtime, NewType(name, tp) returns 3395 a dummy callable that simply returns its argument. 3396 3397 Usage:: 3398 3399 UserId = NewType('UserId', int) 3400 3401 def name_by_id(user_id: UserId) -> str: 3402 ... 
3403 3404 UserId('user') # Fails type check 3405 3406 name_by_id(42) # Fails type check 3407 name_by_id(UserId(42)) # OK 3408 3409 num = UserId(5) + 1 # type: int 3410 """ 3411 3412 __call__ = _idfunc 3413 3414 def __init__(self, name, tp): 3415 self.__qualname__ = name 3416 if '.' in name: 3417 name = name.rpartition('.')[-1] 3418 self.__name__ = name 3419 self.__supertype__ = tp 3420 def_mod = _caller() 3421 if def_mod != 'typing': 3422 self.__module__ = def_mod 3423 3424 def __mro_entries__(self, bases): 3425 # We defined __mro_entries__ to get a better error message 3426 # if a user attempts to subclass a NewType instance. bpo-46170 3427 superclass_name = self.__name__ 3428 3429 class Dummy: 3430 def __init_subclass__(cls): 3431 subclass_name = cls.__name__ 3432 raise TypeError( 3433 f"Cannot subclass an instance of NewType. Perhaps you were looking for: " 3434 f"`{subclass_name} = NewType({subclass_name!r}, {superclass_name})`" 3435 ) 3436 3437 return (Dummy,) 3438 3439 def __repr__(self): 3440 return f'{self.__module__}.{self.__qualname__}' 3441 3442 def __reduce__(self): 3443 return self.__qualname__ 3444 3445 def __or__(self, other): 3446 return Union[self, other] 3447 3448 def __ror__(self, other): 3449 return Union[other, self] 3450 3451 3452# Python-version-specific alias (Python 2: unicode; Python 3: str) 3453Text = str 3454 3455 3456# Constant that's True when type checking, but False here. 3457TYPE_CHECKING = False 3458 3459 3460class IO(Generic[AnyStr]): 3461 """Generic base class for TextIO and BinaryIO. 3462 3463 This is an abstract, generic version of the return of open(). 3464 3465 NOTE: This does not distinguish between the different possible 3466 classes (text vs. binary, read vs. write vs. read/write, 3467 append-only, unbuffered). The TextIO and BinaryIO subclasses 3468 below capture the distinctions between text vs. binary, which is 3469 pervasive in the interface; however we currently do not offer a 3470 way to track the other distinctions in the type system. 
3471 """ 3472 3473 __slots__ = () 3474 3475 @property 3476 @abstractmethod 3477 def mode(self) -> str: 3478 pass 3479 3480 @property 3481 @abstractmethod 3482 def name(self) -> str: 3483 pass 3484 3485 @abstractmethod 3486 def close(self) -> None: 3487 pass 3488 3489 @property 3490 @abstractmethod 3491 def closed(self) -> bool: 3492 pass 3493 3494 @abstractmethod 3495 def fileno(self) -> int: 3496 pass 3497 3498 @abstractmethod 3499 def flush(self) -> None: 3500 pass 3501 3502 @abstractmethod 3503 def isatty(self) -> bool: 3504 pass 3505 3506 @abstractmethod 3507 def read(self, n: int = -1) -> AnyStr: 3508 pass 3509 3510 @abstractmethod 3511 def readable(self) -> bool: 3512 pass 3513 3514 @abstractmethod 3515 def readline(self, limit: int = -1) -> AnyStr: 3516 pass 3517 3518 @abstractmethod 3519 def readlines(self, hint: int = -1) -> List[AnyStr]: 3520 pass 3521 3522 @abstractmethod 3523 def seek(self, offset: int, whence: int = 0) -> int: 3524 pass 3525 3526 @abstractmethod 3527 def seekable(self) -> bool: 3528 pass 3529 3530 @abstractmethod 3531 def tell(self) -> int: 3532 pass 3533 3534 @abstractmethod 3535 def truncate(self, size: int = None) -> int: 3536 pass 3537 3538 @abstractmethod 3539 def writable(self) -> bool: 3540 pass 3541 3542 @abstractmethod 3543 def write(self, s: AnyStr) -> int: 3544 pass 3545 3546 @abstractmethod 3547 def writelines(self, lines: List[AnyStr]) -> None: 3548 pass 3549 3550 @abstractmethod 3551 def __enter__(self) -> 'IO[AnyStr]': 3552 pass 3553 3554 @abstractmethod 3555 def __exit__(self, type, value, traceback) -> None: 3556 pass 3557 3558 3559class BinaryIO(IO[bytes]): 3560 """Typed version of the return of open() in binary mode.""" 3561 3562 __slots__ = () 3563 3564 @abstractmethod 3565 def write(self, s: Union[bytes, bytearray]) -> int: 3566 pass 3567 3568 @abstractmethod 3569 def __enter__(self) -> 'BinaryIO': 3570 pass 3571 3572 3573class TextIO(IO[str]): 3574 """Typed version of the return of open() in text mode.""" 3575 3576 __slots__ = () 3577 3578 @property 3579 @abstractmethod 3580 def buffer(self) -> BinaryIO: 3581 pass 3582 3583 @property 3584 @abstractmethod 3585 def encoding(self) -> str: 3586 pass 3587 3588 @property 3589 @abstractmethod 3590 def errors(self) -> Optional[str]: 3591 pass 3592 3593 @property 3594 @abstractmethod 3595 def line_buffering(self) -> bool: 3596 pass 3597 3598 @property 3599 @abstractmethod 3600 def newlines(self) -> Any: 3601 pass 3602 3603 @abstractmethod 3604 def __enter__(self) -> 'TextIO': 3605 pass 3606 3607 3608def reveal_type[T](obj: T, /) -> T: 3609 """Ask a static type checker to reveal the inferred type of an expression. 3610 3611 When a static type checker encounters a call to ``reveal_type()``, 3612 it will emit the inferred type of the argument:: 3613 3614 x: int = 1 3615 reveal_type(x) 3616 3617 Running a static type checker (e.g., mypy) on this example 3618 will produce output similar to 'Revealed type is "builtins.int"'. 3619 3620 At runtime, the function prints the runtime type of the 3621 argument and returns the argument unchanged. 3622 """ 3623 print(f"Runtime type is {type(obj).__name__!r}", file=sys.stderr) 3624 return obj 3625 3626 3627class _IdentityCallable(Protocol): 3628 def __call__[T](self, arg: T, /) -> T: 3629 ... 3630 3631 3632def dataclass_transform( 3633 *, 3634 eq_default: bool = True, 3635 order_default: bool = False, 3636 kw_only_default: bool = False, 3637 frozen_default: bool = False, 3638 field_specifiers: tuple[type[Any] | Callable[..., Any], ...] 
= (), 3639 **kwargs: Any, 3640) -> _IdentityCallable: 3641 """Decorator to mark an object as providing dataclass-like behaviour. 3642 3643 The decorator can be applied to a function, class, or metaclass. 3644 3645 Example usage with a decorator function:: 3646 3647 @dataclass_transform() 3648 def create_model[T](cls: type[T]) -> type[T]: 3649 ... 3650 return cls 3651 3652 @create_model 3653 class CustomerModel: 3654 id: int 3655 name: str 3656 3657 On a base class:: 3658 3659 @dataclass_transform() 3660 class ModelBase: ... 3661 3662 class CustomerModel(ModelBase): 3663 id: int 3664 name: str 3665 3666 On a metaclass:: 3667 3668 @dataclass_transform() 3669 class ModelMeta(type): ... 3670 3671 class ModelBase(metaclass=ModelMeta): ... 3672 3673 class CustomerModel(ModelBase): 3674 id: int 3675 name: str 3676 3677 The ``CustomerModel`` classes defined above will 3678 be treated by type checkers similarly to classes created with 3679 ``@dataclasses.dataclass``. 3680 For example, type checkers will assume these classes have 3681 ``__init__`` methods that accept ``id`` and ``name``. 3682 3683 The arguments to this decorator can be used to customize this behavior: 3684 - ``eq_default`` indicates whether the ``eq`` parameter is assumed to be 3685 ``True`` or ``False`` if it is omitted by the caller. 3686 - ``order_default`` indicates whether the ``order`` parameter is 3687 assumed to be True or False if it is omitted by the caller. 3688 - ``kw_only_default`` indicates whether the ``kw_only`` parameter is 3689 assumed to be True or False if it is omitted by the caller. 3690 - ``frozen_default`` indicates whether the ``frozen`` parameter is 3691 assumed to be True or False if it is omitted by the caller. 3692 - ``field_specifiers`` specifies a static list of supported classes 3693 or functions that describe fields, similar to ``dataclasses.field()``. 3694 - Arbitrary other keyword arguments are accepted in order to allow for 3695 possible future extensions. 3696 3697 At runtime, this decorator records its arguments in the 3698 ``__dataclass_transform__`` attribute on the decorated object. 3699 It has no other runtime effect. 3700 3701 See PEP 681 for more details. 3702 """ 3703 def decorator(cls_or_fn): 3704 cls_or_fn.__dataclass_transform__ = { 3705 "eq_default": eq_default, 3706 "order_default": order_default, 3707 "kw_only_default": kw_only_default, 3708 "frozen_default": frozen_default, 3709 "field_specifiers": field_specifiers, 3710 "kwargs": kwargs, 3711 } 3712 return cls_or_fn 3713 return decorator 3714 3715 3716type _Func = Callable[..., Any] 3717 3718 3719def override[F: _Func](method: F, /) -> F: 3720 """Indicate that a method is intended to override a method in a base class. 3721 3722 Usage:: 3723 3724 class Base: 3725 def method(self) -> None: 3726 pass 3727 3728 class Child(Base): 3729 @override 3730 def method(self) -> None: 3731 super().method() 3732 3733 When this decorator is applied to a method, the type checker will 3734 validate that it overrides a method or attribute with the same name on a 3735 base class. This helps prevent bugs that may occur when a base class is 3736 changed without an equivalent change to a child class. 3737 3738 There is no runtime checking of this property. The decorator attempts to 3739 set the ``__override__`` attribute to ``True`` on the decorated object to 3740 allow runtime introspection. 3741 3742 See PEP 698 for details. 
3743 """ 3744 try: 3745 method.__override__ = True 3746 except (AttributeError, TypeError): 3747 # Skip the attribute silently if it is not writable. 3748 # AttributeError happens if the object has __slots__ or a 3749 # read-only property, TypeError if it's a builtin class. 3750 pass 3751 return method 3752 3753 3754def is_protocol(tp: type, /) -> bool: 3755 """Return True if the given type is a Protocol. 3756 3757 Example:: 3758 3759 >>> from typing import Protocol, is_protocol 3760 >>> class P(Protocol): 3761 ... def a(self) -> str: ... 3762 ... b: int 3763 >>> is_protocol(P) 3764 True 3765 >>> is_protocol(int) 3766 False 3767 """ 3768 return ( 3769 isinstance(tp, type) 3770 and getattr(tp, '_is_protocol', False) 3771 and tp != Protocol 3772 ) 3773 3774 3775def get_protocol_members(tp: type, /) -> frozenset[str]: 3776 """Return the set of members defined in a Protocol. 3777 3778 Example:: 3779 3780 >>> from typing import Protocol, get_protocol_members 3781 >>> class P(Protocol): 3782 ... def a(self) -> str: ... 3783 ... b: int 3784 >>> get_protocol_members(P) == frozenset({'a', 'b'}) 3785 True 3786 3787 Raise a TypeError for arguments that are not Protocols. 3788 """ 3789 if not is_protocol(tp): 3790 raise TypeError(f'{tp!r} is not a Protocol') 3791 return frozenset(tp.__protocol_attrs__) 3792 3793 3794def __getattr__(attr): 3795 """Improve the import time of the typing module. 3796 3797 Soft-deprecated objects which are costly to create 3798 are only created on-demand here. 3799 """ 3800 if attr in {"Pattern", "Match"}: 3801 import re 3802 obj = _alias(getattr(re, attr), 1) 3803 elif attr in {"ContextManager", "AsyncContextManager"}: 3804 import contextlib 3805 obj = _alias(getattr(contextlib, f"Abstract{attr}"), 2, name=attr, defaults=(bool | None,)) 3806 elif attr == "_collect_parameters": 3807 import warnings 3808 3809 depr_message = ( 3810 "The private _collect_parameters function is deprecated and will be" 3811 " removed in a future version of Python. Any use of private functions" 3812 " is discouraged and may break in the future." 3813 ) 3814 warnings.warn(depr_message, category=DeprecationWarning, stacklevel=2) 3815 obj = _collect_type_parameters 3816 else: 3817 raise AttributeError(f"module {__name__!r} has no attribute {attr!r}") 3818 globals()[attr] = obj 3819 return obj
Add context-specific metadata to a type.
Example: Annotated[int, runtime_check.Unsigned] indicates to the hypothetical runtime_check module that this type is an unsigned int. Every other consumer of this type can ignore this metadata and treat this type as int.
The first argument to Annotated must be a valid type.
Details:
- It's an error to call Annotated with less than two arguments.
- Access the metadata via the __metadata__ attribute::
assert Annotated[int, '$'].__metadata__ == ('$',)
- Nested Annotated types are flattened::
assert Annotated[Annotated[T, Ann1, Ann2], Ann3] == Annotated[T, Ann1, Ann2, Ann3]
- Instantiating an annotated type is equivalent to instantiating the underlying type::
assert Annotated[C, Ann1](5) == C(5)
- Annotated can be used as a generic type alias::
type Optimized[T] = Annotated[T, runtime.Optimize()]
# type checker will treat Optimized[int]
# as equivalent to Annotated[int, runtime.Optimize()]
type OptimizedList[T] = Annotated[list[T], runtime.Optimize()]
# type checker will treat OptimizedList[int]
# as equivalent to Annotated[list[int], runtime.Optimize()]
- Annotated cannot be used with an unpacked TypeVarTuple::
type Variadic[*Ts] = Annotated[*Ts, Ann1]  # NOT valid
This would be equivalent to::
Annotated[T1, T2, T3, ..., Ann1]
where T1, T2 etc. are TypeVars, which would be invalid, because only one type should be passed to Annotated.
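For illustration (not part of the original docstring; MaxLen is a made-up metadata class), the metadata survives at runtime and can be recovered with get_type_hints(..., include_extras=True)::

from typing import Annotated, get_args, get_type_hints

class MaxLen:
    def __init__(self, n: int) -> None:
        self.n = n

class User:
    name: Annotated[str, MaxLen(30)]

# Type checkers see the attribute as a plain str; the metadata is only
# visible to tools that ask for it explicitly.
hints = get_type_hints(User, include_extras=True)
base, *metadata = get_args(hints["name"])
assert base is str
assert isinstance(metadata[0], MaxLen) and metadata[0].n == 30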
599class Any(metaclass=_AnyMeta): 600 """Special type indicating an unconstrained type. 601 602 - Any is compatible with every type. 603 - Any assumed to have all methods. 604 - All values assumed to be instances of Any. 605 606 Note that all the above statements are true from the point of view of 607 static type checkers. At runtime, Any should not be used with instance 608 checks. 609 """ 610 611 def __new__(cls, *args, **kwargs): 612 if cls is Any: 613 raise TypeError("Any cannot be instantiated") 614 return super().__new__(cls)
Special type construct to mark class variables.
An annotation wrapped in ClassVar indicates that a given attribute is intended to be used as a class variable and should not be set on instances of that class.
Usage::
class Starship:
stats: ClassVar[dict[str, int]] = {} # class variable
damage: int = 10 # instance variable
ClassVar accepts only types and cannot be further subscribed.
Note that ClassVar is not a class itself, and should not be used with isinstance() or issubclass().
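For illustration only (this relies on the dataclasses module's documented handling of ClassVar; Counter is a hypothetical class), a ClassVar annotation is excluded from the generated dataclass fields::

from dataclasses import dataclass, fields
from typing import ClassVar

@dataclass
class Counter:
    instances: ClassVar[int] = 0  # shared class attribute, not an init field
    value: int = 0

# Only 'value' becomes a dataclass field; 'instances' stays a class variable.
assert [f.name for f in fields(Counter)] == ['value']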
Special form for annotating higher-order functions.
Concatenate can be used in conjunction with ParamSpec and Callable to represent a higher-order function which adds, removes or transforms the parameters of a callable.
For example::
Callable[Concatenate[int, P], int]
See PEP 612 for detailed information.
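A fuller sketch (illustrative only; with_request_id and handle are hypothetical names) of a decorator that supplies the leading int argument itself, so only the remaining parameters are exposed to callers of the wrapped function::

from typing import Callable, Concatenate

def with_request_id[**P, R](f: Callable[Concatenate[int, P], R]) -> Callable[P, R]:
    def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
        return f(42, *args, **kwargs)  # 42 stands in for a real request id
    return wrapper

@with_request_id
def handle(request_id: int, path: str) -> str:
    return f"{request_id}:{path}"

assert handle("/home") == "42:/home"  # request_id is filled in by the decorator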
Special typing construct to indicate final names to type checkers.
A final name cannot be re-assigned or overridden in a subclass.
For example::
MAX_SIZE: Final = 9000
MAX_SIZE += 1 # Error reported by type checker
class Connection:
TIMEOUT: Final[int] = 10
class FastConnector(Connection):
TIMEOUT = 1 # Error reported by type checker
There is no runtime checking of these properties.
1016class ForwardRef(_Final, _root=True): 1017 """Internal wrapper to hold a forward reference.""" 1018 1019 __slots__ = ('__forward_arg__', '__forward_code__', 1020 '__forward_evaluated__', '__forward_value__', 1021 '__forward_is_argument__', '__forward_is_class__', 1022 '__forward_module__') 1023 1024 def __init__(self, arg, is_argument=True, module=None, *, is_class=False): 1025 if not isinstance(arg, str): 1026 raise TypeError(f"Forward reference must be a string -- got {arg!r}") 1027 1028 # If we do `def f(*args: *Ts)`, then we'll have `arg = '*Ts'`. 1029 # Unfortunately, this isn't a valid expression on its own, so we 1030 # do the unpacking manually. 1031 if arg.startswith('*'): 1032 arg_to_compile = f'({arg},)[0]' # E.g. (*Ts,)[0] or (*tuple[int, int],)[0] 1033 else: 1034 arg_to_compile = arg 1035 try: 1036 code = compile(arg_to_compile, '<string>', 'eval') 1037 except SyntaxError: 1038 raise SyntaxError(f"Forward reference must be an expression -- got {arg!r}") 1039 1040 self.__forward_arg__ = arg 1041 self.__forward_code__ = code 1042 self.__forward_evaluated__ = False 1043 self.__forward_value__ = None 1044 self.__forward_is_argument__ = is_argument 1045 self.__forward_is_class__ = is_class 1046 self.__forward_module__ = module 1047 1048 def _evaluate(self, globalns, localns, type_params=_sentinel, *, recursive_guard): 1049 if type_params is _sentinel: 1050 _deprecation_warning_for_no_type_params_passed("typing.ForwardRef._evaluate") 1051 type_params = () 1052 if self.__forward_arg__ in recursive_guard: 1053 return self 1054 if not self.__forward_evaluated__ or localns is not globalns: 1055 if globalns is None and localns is None: 1056 globalns = localns = {} 1057 elif globalns is None: 1058 globalns = localns 1059 elif localns is None: 1060 localns = globalns 1061 if self.__forward_module__ is not None: 1062 globalns = getattr( 1063 sys.modules.get(self.__forward_module__, None), '__dict__', globalns 1064 ) 1065 1066 # type parameters require some special handling, 1067 # as they exist in their own scope 1068 # but `eval()` does not have a dedicated parameter for that scope. 1069 # For classes, names in type parameter scopes should override 1070 # names in the global scope (which here are called `localns`!), 1071 # but should in turn be overridden by names in the class scope 1072 # (which here are called `globalns`!) 
1073 if type_params: 1074 globalns, localns = dict(globalns), dict(localns) 1075 for param in type_params: 1076 param_name = param.__name__ 1077 if not self.__forward_is_class__ or param_name not in globalns: 1078 globalns[param_name] = param 1079 localns.pop(param_name, None) 1080 1081 type_ = _type_check( 1082 eval(self.__forward_code__, globalns, localns), 1083 "Forward references must evaluate to types.", 1084 is_argument=self.__forward_is_argument__, 1085 allow_special_forms=self.__forward_is_class__, 1086 ) 1087 self.__forward_value__ = _eval_type( 1088 type_, 1089 globalns, 1090 localns, 1091 type_params, 1092 recursive_guard=(recursive_guard | {self.__forward_arg__}), 1093 ) 1094 self.__forward_evaluated__ = True 1095 return self.__forward_value__ 1096 1097 def __eq__(self, other): 1098 if not isinstance(other, ForwardRef): 1099 return NotImplemented 1100 if self.__forward_evaluated__ and other.__forward_evaluated__: 1101 return (self.__forward_arg__ == other.__forward_arg__ and 1102 self.__forward_value__ == other.__forward_value__) 1103 return (self.__forward_arg__ == other.__forward_arg__ and 1104 self.__forward_module__ == other.__forward_module__) 1105 1106 def __hash__(self): 1107 return hash((self.__forward_arg__, self.__forward_module__)) 1108 1109 def __or__(self, other): 1110 return Union[self, other] 1111 1112 def __ror__(self, other): 1113 return Union[other, self] 1114 1115 def __repr__(self): 1116 if self.__forward_module__ is None: 1117 module_repr = '' 1118 else: 1119 module_repr = f', module={self.__forward_module__!r}' 1120 return f'ForwardRef({self.__forward_arg__!r}{module_repr})'
Abstract base class for generic types.
On Python 3.12 and newer, generic classes implicitly inherit from Generic when they declare a parameter list after the class's name::
class Mapping[KT, VT]:
def __getitem__(self, key: KT) -> VT:
...
# Etc.
On older versions of Python, however, generic classes have to explicitly inherit from Generic.
After a class has been declared to be generic, it can then be used as follows::
def lookup_name[KT, VT](mapping: Mapping[KT, VT], key: KT, default: VT) -> VT:
try:
return mapping[key]
except KeyError:
return default
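A brief runtime-oriented sketch (illustrative; Box is a hypothetical class): subscripting a generic class produces a generic alias that can be introspected, while instances remain instances of the plain class::

from typing import get_args, get_origin

class Box[T]:  # implicitly generic on Python 3.12+
    def __init__(self, item: T) -> None:
        self.item = item

alias = Box[int]  # a generic alias, suitable for annotations
assert get_origin(alias) is Box
assert get_args(alias) == (int,)
assert isinstance(Box(1), Box)  # subscription does not change the runtime type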
Special typing form to define literal types (a.k.a. value types).
This form can be used to indicate to type checkers that the corresponding variable or function parameter has a value equivalent to the provided literal (or one of several literals)::
def validate_simple(data: Any) -> Literal[True]: # always returns True
...
MODE = Literal['r', 'rb', 'w', 'wb']
def open_helper(file: str, mode: MODE) -> str:
...
open_helper('/some/path', 'r') # Passes type check
open_helper('/other/path', 'typo') # Error in type checker
Literal[...] cannot be subclassed. At runtime, an arbitrary value is allowed as type argument to Literal[...], but type checkers may impose restrictions.
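Although no runtime checking is performed, the allowed values can be recovered at runtime, e.g. for validation (an illustrative sketch, not from the docstring)::

from typing import Literal, get_args

MODE = Literal['r', 'rb', 'w', 'wb']
assert get_args(MODE) == ('r', 'rb', 'w', 'wb')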
Optional[X] is equivalent to Union[X, None].
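The equivalence is observable at runtime (a small illustration)::

from typing import Optional, Union

assert Optional[int] == Union[int, None]
assert Optional[int] == (int | None)  # also equal to the | syntax on Python 3.10+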
Parameter specification variable.
The preferred way to construct a parameter specification is via the dedicated syntax for generic functions, classes, and type aliases, where the use of '**' creates a parameter specification::
type IntFunc[**P] = Callable[P, int]
The following syntax creates a parameter specification that defaults to a callable accepting two positional-only arguments of types int and str:
type IntFuncDefault[**P = (int, str)] = Callable[P, int]
For compatibility with Python 3.11 and earlier, ParamSpec objects can also be created as follows::
P = ParamSpec('P')
DefaultP = ParamSpec('DefaultP', default=(int, str))
Parameter specification variables exist primarily for the benefit of static type checkers. They are used to forward the parameter types of one callable to another callable, a pattern commonly found in higher-order functions and decorators. They are only valid when used in Concatenate, or as the first argument to Callable, or as parameters for user-defined Generics. See class Generic for more information on generic types.
An example for annotating a decorator::
def add_logging[**P, T](f: Callable[P, T]) -> Callable[P, T]:
'''A type-safe decorator to add logging to a function.'''
def inner(*args: P.args, **kwargs: P.kwargs) -> T:
logging.info(f'{f.__name__} was called')
return f(*args, **kwargs)
return inner
@add_logging
def add_two(x: float, y: float) -> float:
'''Add two numbers together.'''
return x + y
Parameter specification variables can be introspected. e.g.::
>>> P = ParamSpec("P")
>>> P.__name__
'P'
Note that only parameter specification variables defined in the global scope can be pickled.
2163class Protocol(Generic, metaclass=_ProtocolMeta): 2164 """Base class for protocol classes. 2165 2166 Protocol classes are defined as:: 2167 2168 class Proto(Protocol): 2169 def meth(self) -> int: 2170 ... 2171 2172 Such classes are primarily used with static type checkers that recognize 2173 structural subtyping (static duck-typing). 2174 2175 For example:: 2176 2177 class C: 2178 def meth(self) -> int: 2179 return 0 2180 2181 def func(x: Proto) -> int: 2182 return x.meth() 2183 2184 func(C()) # Passes static type check 2185 2186 See PEP 544 for details. Protocol classes decorated with 2187 @typing.runtime_checkable act as simple-minded runtime protocols that check 2188 only the presence of given attributes, ignoring their type signatures. 2189 Protocol classes can be generic, they are defined as:: 2190 2191 class GenProto[T](Protocol): 2192 def meth(self) -> T: 2193 ... 2194 """ 2195 2196 __slots__ = () 2197 _is_protocol = True 2198 _is_runtime_protocol = False 2199 2200 def __init_subclass__(cls, *args, **kwargs): 2201 super().__init_subclass__(*args, **kwargs) 2202 2203 # Determine if this is a protocol or a concrete subclass. 2204 if not cls.__dict__.get('_is_protocol', False): 2205 cls._is_protocol = any(b is Protocol for b in cls.__bases__) 2206 2207 # Set (or override) the protocol subclass hook. 2208 if '__subclasshook__' not in cls.__dict__: 2209 cls.__subclasshook__ = _proto_hook 2210 2211 # Prohibit instantiation for protocol classes 2212 if cls._is_protocol and cls.__init__ is Protocol.__init__: 2213 cls.__init__ = _no_init_or_replace_init
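As a runtime illustration of the runtime_checkable behaviour mentioned in the docstring above (Closable and Resource are hypothetical names)::

from typing import Protocol, runtime_checkable

@runtime_checkable
class Closable(Protocol):
    def close(self) -> None: ...

class Resource:
    def close(self) -> None:
        pass

# isinstance() only checks that the attribute exists; signatures are ignored.
assert isinstance(Resource(), Closable)
assert not isinstance(object(), Closable)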
Type variable.
The preferred way to construct a type variable is via the dedicated syntax for generic functions, classes, and type aliases::
class Sequence[T]: # T is a TypeVar
...
This syntax can also be used to create bound and constrained type variables::
# S is a TypeVar bound to str
class StrSequence[S: str]:
...
# A is a TypeVar constrained to str or bytes
class StrOrBytesSequence[A: (str, bytes)]:
...
Type variables can also have defaults:
class IntDefault[T = int]: ...
However, if desired, reusable type variables can also be constructed manually, like so::
T = TypeVar('T')  # Can be anything
S = TypeVar('S', bound=str)  # Can be any subtype of str
A = TypeVar('A', str, bytes)  # Must be exactly str or bytes
D = TypeVar('D', default=int)  # Defaults to int
Type variables exist primarily for the benefit of static type checkers. They serve as the parameters for generic types as well as for generic function and type alias definitions.
The variance of type variables is inferred by type checkers when they are created through the type parameter syntax and when infer_variance=True is passed. Manually created type variables may be explicitly marked covariant or contravariant by passing covariant=True or contravariant=True. By default, manually created type variables are invariant. See PEP 484 and PEP 695 for more details.
Type variable tuple. A specialized form of type variable that enables variadic generics.
The preferred way to construct a type variable tuple is via the dedicated syntax for generic functions, classes, and type aliases, where a single '*' indicates a type variable tuple::
def move_first_element_to_last[T, *Ts](tup: tuple[T, *Ts]) -> tuple[*Ts, T]:
return (*tup[1:], tup[0])
Type variable tuples can have default values:
type AliasWithDefault[*Ts = (str, int)] = tuple[*Ts]
For compatibility with Python 3.11 and earlier, TypeVarTuple objects can also be created as follows::
Ts = TypeVarTuple('Ts') # Can be given any name
DefaultTs = TypeVarTuple('Ts', default=(str, int))
Just as a TypeVar (type variable) is a placeholder for a single type, a TypeVarTuple is a placeholder for an arbitrary number of types. For example, if we define a generic class using a TypeVarTuple::
class C[*Ts]: ...
Then we can parameterize that class with an arbitrary number of type arguments::
C[int] # Fine
C[int, str] # Also fine
C[()] # Even this is fine
For more details, see PEP 646.
Note that only TypeVarTuples defined in the global scope can be pickled.
Union type; Union[X, Y] means either X or Y.
On Python 3.10 and higher, the | operator can also be used to denote unions; X | Y means the same thing to the type checker as Union[X, Y].
To define a union, use e.g. Union[int, str]. Details:
- The arguments must be types and there must be at least one.
- None as an argument is a special case and is replaced by type(None).
- Unions of unions are flattened, e.g.::
assert Union[Union[int, str], float] == Union[int, str, float]
- Unions of a single argument vanish, e.g.::
assert Union[int] == int # The constructor actually returns int
- Redundant arguments are skipped, e.g.::
assert Union[int, str, int] == Union[int, str]
- When comparing unions, the argument order is ignored, e.g.::
assert Union[int, str] == Union[str, int]
- You cannot subclass or instantiate a union.
- You can use Optional[X] as a shorthand for Union[X, None].
2958@runtime_checkable 2959class SupportsAbs[T](Protocol): 2960 """An ABC with one abstract method __abs__ that is covariant in its return type.""" 2961 2962 __slots__ = () 2963 2964 @abstractmethod 2965 def __abs__(self) -> T: 2966 pass
2936@runtime_checkable 2937class SupportsBytes(Protocol): 2938 """An ABC with one abstract method __bytes__.""" 2939 2940 __slots__ = () 2941 2942 @abstractmethod 2943 def __bytes__(self) -> bytes: 2944 pass
2925@runtime_checkable 2926class SupportsComplex(Protocol): 2927 """An ABC with one abstract method __complex__.""" 2928 2929 __slots__ = () 2930 2931 @abstractmethod 2932 def __complex__(self) -> complex: 2933 pass
2914@runtime_checkable 2915class SupportsFloat(Protocol): 2916 """An ABC with one abstract method __float__.""" 2917 2918 __slots__ = () 2919 2920 @abstractmethod 2921 def __float__(self) -> float: 2922 pass
2947@runtime_checkable 2948class SupportsIndex(Protocol): 2949 """An ABC with one abstract method __index__.""" 2950 2951 __slots__ = () 2952 2953 @abstractmethod 2954 def __index__(self) -> int: 2955 pass
2903@runtime_checkable 2904class SupportsInt(Protocol): 2905 """An ABC with one abstract method __int__.""" 2906 2907 __slots__ = () 2908 2909 @abstractmethod 2910 def __int__(self) -> int: 2911 pass
2969@runtime_checkable 2970class SupportsRound[T](Protocol): 2971 """An ABC with one abstract method __round__ that is covariant in its return type.""" 2972 2973 __slots__ = () 2974 2975 @abstractmethod 2976 def __round__(self, ndigits: int = 0) -> T: 2977 pass
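All of these protocols are decorated with @runtime_checkable, so they support isinstance() checks; for example (an illustrative sketch)::

from typing import SupportsIndex

assert isinstance(3, SupportsIndex)        # int defines __index__
assert not isinstance(3.5, SupportsIndex)  # float does not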
3049def NamedTuple(typename, fields=_sentinel, /, **kwargs): 3050 """Typed version of namedtuple. 3051 3052 Usage:: 3053 3054 class Employee(NamedTuple): 3055 name: str 3056 id: int 3057 3058 This is equivalent to:: 3059 3060 Employee = collections.namedtuple('Employee', ['name', 'id']) 3061 3062 The resulting class has an extra __annotations__ attribute, giving a 3063 dict that maps field names to types. (The field names are also in 3064 the _fields attribute, which is part of the namedtuple API.) 3065 An alternative equivalent functional syntax is also accepted:: 3066 3067 Employee = NamedTuple('Employee', [('name', str), ('id', int)]) 3068 """ 3069 if fields is _sentinel: 3070 if kwargs: 3071 deprecated_thing = "Creating NamedTuple classes using keyword arguments" 3072 deprecation_msg = ( 3073 "{name} is deprecated and will be disallowed in Python {remove}. " 3074 "Use the class-based or functional syntax instead." 3075 ) 3076 else: 3077 deprecated_thing = "Failing to pass a value for the 'fields' parameter" 3078 example = f"`{typename} = NamedTuple({typename!r}, [])`" 3079 deprecation_msg = ( 3080 "{name} is deprecated and will be disallowed in Python {remove}. " 3081 "To create a NamedTuple class with 0 fields " 3082 "using the functional syntax, " 3083 "pass an empty list, e.g. " 3084 ) + example + "." 3085 elif fields is None: 3086 if kwargs: 3087 raise TypeError( 3088 "Cannot pass `None` as the 'fields' parameter " 3089 "and also specify fields using keyword arguments" 3090 ) 3091 else: 3092 deprecated_thing = "Passing `None` as the 'fields' parameter" 3093 example = f"`{typename} = NamedTuple({typename!r}, [])`" 3094 deprecation_msg = ( 3095 "{name} is deprecated and will be disallowed in Python {remove}. " 3096 "To create a NamedTuple class with 0 fields " 3097 "using the functional syntax, " 3098 "pass an empty list, e.g. " 3099 ) + example + "." 3100 elif kwargs: 3101 raise TypeError("Either list of fields or keywords" 3102 " can be provided to NamedTuple, not both") 3103 if fields is _sentinel or fields is None: 3104 import warnings 3105 warnings._deprecated(deprecated_thing, message=deprecation_msg, remove=(3, 15)) 3106 fields = kwargs.items() 3107 nt = _make_nmtuple(typename, fields, module=_caller()) 3108 nt.__orig_bases__ = (NamedTuple,) 3109 return nt
A simple typed namespace. At runtime it is equivalent to a plain dict.
TypedDict creates a dictionary type such that a type checker will expect all instances to have a certain set of keys, where each key is associated with a value of a consistent type. This expectation is not checked at runtime.
Usage::
>>> class Point2D(TypedDict):
... x: int
... y: int
... label: str
...
>>> a: Point2D = {'x': 1, 'y': 2, 'label': 'good'} # OK
>>> b: Point2D = {'z': 3, 'label': 'bad'} # Fails type check
>>> Point2D(x=1, y=2, label='first') == dict(x=1, y=2, label='first')
True
The type info can be accessed via the Point2D.__annotations__ dict, and the Point2D.__required_keys__ and Point2D.__optional_keys__ frozensets. TypedDict supports an additional equivalent form::
Point2D = TypedDict('Point2D', {'x': int, 'y': int, 'label': str})
By default, all keys must be present in a TypedDict. It is possible to override this by specifying totality::
class Point2D(TypedDict, total=False):
x: int
y: int
This means that a Point2D TypedDict can have any of the keys omitted. A type checker is only expected to support a literal False or True as the value of the total argument. True is the default, and makes all items defined in the class body be required.
The Required and NotRequired special forms can also be used to mark individual keys as being required or not required::
class Point2D(TypedDict):
x: int # the "x" key must always be present (Required is the default)
y: NotRequired[int] # the "y" key can be omitted
See PEP 655 for more details on Required and NotRequired.
The ReadOnly special form can be used to mark individual keys as immutable for type checkers::
class DatabaseUser(TypedDict):
id: ReadOnly[int] # the "id" key must not be modified
username: str # the "username" key can be changed
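On Python 3.13, the key-category introspection attributes mentioned above can be checked at runtime; a small sketch (Movie is a hypothetical TypedDict)::

from typing import TypedDict, NotRequired, ReadOnly

class Movie(TypedDict):
    title: ReadOnly[str]
    year: NotRequired[int]

assert Movie.__required_keys__ == frozenset({'title'})
assert Movie.__optional_keys__ == frozenset({'year'})
assert Movie.__readonly_keys__ == frozenset({'title'})
assert Movie.__mutable_keys__ == frozenset({'year'})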
BinaryIO is the typed version of the return of open() in binary mode.
IO is the generic base class for TextIO and BinaryIO.
This is an abstract, generic version of the return of open().
NOTE: This does not distinguish between the different possible classes (text vs. binary, read vs. write vs. read/write, append-only, unbuffered). The TextIO and BinaryIO subclasses below capture the distinctions between text vs. binary, which is pervasive in the interface; however we currently do not offer a way to track the other distinctions in the type system.
TextIO is the typed version of the return of open() in text mode.
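These classes are intended purely as annotation targets; a minimal illustration (count_lines is a hypothetical helper)::

import io
from typing import IO

def count_lines(stream: IO[str]) -> int:
    # Accepts anything file-like opened in text mode, including io.StringIO.
    return sum(1 for _ in stream)

assert count_lines(io.StringIO("a\nb\n")) == 2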
2383def assert_type(val, typ, /): 2384 """Ask a static type checker to confirm that the value is of the given type. 2385 2386 At runtime this does nothing: it returns the first argument unchanged with no 2387 checks or side effects, no matter the actual type of the argument. 2388 2389 When a static type checker encounters a call to assert_type(), it 2390 emits an error if the value is not of the specified type:: 2391 2392 def greet(name: str) -> None: 2393 assert_type(name, str) # OK 2394 assert_type(name, int) # type checker error 2395 """ 2396 return val
2606def assert_never(arg: Never, /) -> Never: 2607 """Statically assert that a line of code is unreachable. 2608 2609 Example:: 2610 2611 def int_or_str(arg: int | str) -> None: 2612 match arg: 2613 case int(): 2614 print("It's an int") 2615 case str(): 2616 print("It's a str") 2617 case _: 2618 assert_never(arg) 2619 2620 If a type checker finds that a call to assert_never() is 2621 reachable, it will emit an error. 2622 2623 At runtime, this throws an exception when called. 2624 """ 2625 value = repr(arg) 2626 if len(value) > _ASSERT_NEVER_REPR_MAX_LENGTH: 2627 value = value[:_ASSERT_NEVER_REPR_MAX_LENGTH] + '...' 2628 raise AssertionError(f"Expected code to be unreachable, but got: {value}")
2372def cast(typ, val): 2373 """Cast a value to a type. 2374 2375 This returns the value unchanged. To the type checker this 2376 signals that the return value has the designated type, but at 2377 runtime we intentionally don't check anything (we want this 2378 to be as fast as possible). 2379 """ 2380 return val
2750def clear_overloads(): 2751 """Clear all overloads in the registry.""" 2752 _overload_registry.clear()
dataclass_transform is a decorator to mark an object as providing dataclass-like behaviour.
The decorator can be applied to a function, class, or metaclass.
Example usage with a decorator function::
@dataclass_transform()
def create_model[T](cls: type[T]) -> type[T]:
...
return cls
@create_model
class CustomerModel:
id: int
name: str
On a base class::
@dataclass_transform()
class ModelBase: ...
class CustomerModel(ModelBase):
id: int
name: str
On a metaclass::
@dataclass_transform()
class ModelMeta(type): ...
class ModelBase(metaclass=ModelMeta): ...
class CustomerModel(ModelBase):
id: int
name: str
The CustomerModel classes defined above will be treated by type checkers similarly to classes created with @dataclasses.dataclass. For example, type checkers will assume these classes have __init__ methods that accept id and name.
The arguments to this decorator can be used to customize this behavior:
- eq_default indicates whether the eq parameter is assumed to be True or False if it is omitted by the caller.
- order_default indicates whether the order parameter is assumed to be True or False if it is omitted by the caller.
- kw_only_default indicates whether the kw_only parameter is assumed to be True or False if it is omitted by the caller.
- frozen_default indicates whether the frozen parameter is assumed to be True or False if it is omitted by the caller.
- field_specifiers specifies a static list of supported classes or functions that describe fields, similar to dataclasses.field().
- Arbitrary other keyword arguments are accepted in order to allow for possible future extensions.
At runtime, this decorator records its arguments in the __dataclass_transform__ attribute on the decorated object. It has no other runtime effect.
See PEP 681 for more details.
2755def final(f): 2756 """Decorator to indicate final methods and final classes. 2757 2758 Use this decorator to indicate to type checkers that the decorated 2759 method cannot be overridden, and decorated class cannot be subclassed. 2760 2761 For example:: 2762 2763 class Base: 2764 @final 2765 def done(self) -> None: 2766 ... 2767 class Sub(Base): 2768 def done(self) -> None: # Error reported by type checker 2769 ... 2770 2771 @final 2772 class Leaf: 2773 ... 2774 class Other(Leaf): # Error reported by type checker 2775 ... 2776 2777 There is no runtime checking of these properties. The decorator 2778 attempts to set the ``__final__`` attribute to ``True`` on the decorated 2779 object to allow runtime introspection. 2780 """ 2781 try: 2782 f.__final__ = True 2783 except (AttributeError, TypeError): 2784 # Skip the attribute silently if it is not writable. 2785 # AttributeError happens if the object has __slots__ or a 2786 # read-only property, TypeError if it's a builtin class. 2787 pass 2788 return f
2559def get_args(tp): 2560 """Get type arguments with all substitutions performed. 2561 2562 For unions, basic simplifications used by Union constructor are performed. 2563 2564 Examples:: 2565 2566 >>> T = TypeVar('T') 2567 >>> assert get_args(Dict[str, int]) == (str, int) 2568 >>> assert get_args(int) == () 2569 >>> assert get_args(Union[int, Union[T, int], str][int]) == (int, str) 2570 >>> assert get_args(Union[int, Tuple[T, int]][str]) == (int, Tuple[str, int]) 2571 >>> assert get_args(Callable[[], T][int]) == ([], int) 2572 """ 2573 if isinstance(tp, _AnnotatedAlias): 2574 return (tp.__origin__,) + tp.__metadata__ 2575 if isinstance(tp, (_GenericAlias, GenericAlias)): 2576 res = tp.__args__ 2577 if _should_unflatten_callable_args(tp, res): 2578 res = (list(res[:-1]), res[-1]) 2579 return res 2580 if isinstance(tp, types.UnionType): 2581 return tp.__args__ 2582 return ()
2529def get_origin(tp): 2530 """Get the unsubscripted version of a type. 2531 2532 This supports generic types, Callable, Tuple, Union, Literal, Final, ClassVar, 2533 Annotated, and others. Return None for unsupported types. 2534 2535 Examples:: 2536 2537 >>> P = ParamSpec('P') 2538 >>> assert get_origin(Literal[42]) is Literal 2539 >>> assert get_origin(int) is None 2540 >>> assert get_origin(ClassVar[int]) is ClassVar 2541 >>> assert get_origin(Generic) is Generic 2542 >>> assert get_origin(Generic[T]) is Generic 2543 >>> assert get_origin(Union[T, int]) is Union 2544 >>> assert get_origin(List[Tuple[T, T]][int]) is list 2545 >>> assert get_origin(P.args) is P 2546 """ 2547 if isinstance(tp, _AnnotatedAlias): 2548 return Annotated 2549 if isinstance(tp, (_BaseGenericAlias, GenericAlias, 2550 ParamSpecArgs, ParamSpecKwargs)): 2551 return tp.__origin__ 2552 if tp is Generic: 2553 return Generic 2554 if isinstance(tp, types.UnionType): 2555 return types.UnionType 2556 return None
2738def get_overloads(func): 2739 """Return all defined overloads for *func* as a sequence.""" 2740 # classmethod and staticmethod 2741 f = getattr(func, "__func__", func) 2742 if f.__module__ not in _overload_registry: 2743 return [] 2744 mod_dict = _overload_registry[f.__module__] 2745 if f.__qualname__ not in mod_dict: 2746 return [] 2747 return list(mod_dict[f.__qualname__].values())
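A short sketch of how the overload registry behaves at runtime (scale is a hypothetical function)::

from typing import overload, get_overloads

@overload
def scale(x: int) -> int: ...
@overload
def scale(x: float) -> float: ...
def scale(x):
    return x * 2

# The two @overload-decorated stubs stay available for introspection.
assert len(get_overloads(scale)) == 2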
Return the set of members defined in a Protocol.
Example::
>>> from typing import Protocol, get_protocol_members
>>> class P(Protocol):
... def a(self) -> str: ...
... b: int
>>> get_protocol_members(P) == frozenset({'a', 'b'})
True
Raise a TypeError for arguments that are not Protocols.
def get_type_hints(obj, globalns=None, localns=None, include_extras=False):
    """Return type hints for an object.

    This is often the same as obj.__annotations__, but it handles
    forward references encoded as string literals and recursively replaces all
    'Annotated[T, ...]' with 'T' (unless 'include_extras=True').

    The argument may be a module, class, method, or function. The annotations
    are returned as a dictionary. For classes, annotations include also
    inherited members.

    TypeError is raised if the argument is not of a type that can contain
    annotations, and an empty dictionary is returned if no annotations are
    present.

    BEWARE -- the behavior of globalns and localns is counterintuitive
    (unless you are familiar with how eval() and exec() work).  The
    search order is locals first, then globals.

    - If no dict arguments are passed, an attempt is made to use the
      globals from obj (or the respective module's globals for classes),
      and these are also used as the locals.  If the object does not appear
      to have globals, an empty dictionary is used.  For classes, the search
      order is globals first then locals.

    - If one dict argument is passed, it is used for both globals and
      locals.

    - If two dict arguments are passed, they specify globals and
      locals, respectively.
    """
    if getattr(obj, '__no_type_check__', None):
        return {}
    # Classes require a special treatment.
    if isinstance(obj, type):
        hints = {}
        for base in reversed(obj.__mro__):
            if globalns is None:
                base_globals = getattr(sys.modules.get(base.__module__, None), '__dict__', {})
            else:
                base_globals = globalns
            ann = base.__dict__.get('__annotations__', {})
            if isinstance(ann, types.GetSetDescriptorType):
                ann = {}
            base_locals = dict(vars(base)) if localns is None else localns
            if localns is None and globalns is None:
                # This is surprising, but required.  Before Python 3.10,
                # get_type_hints only evaluated the globalns of
                # a class.  To maintain backwards compatibility, we reverse
                # the globalns and localns order so that eval() looks into
                # *base_globals* first rather than *base_locals*.
                # This only affects ForwardRefs.
                base_globals, base_locals = base_locals, base_globals
            for name, value in ann.items():
                if value is None:
                    value = type(None)
                if isinstance(value, str):
                    value = ForwardRef(value, is_argument=False, is_class=True)
                value = _eval_type(value, base_globals, base_locals, base.__type_params__)
                hints[name] = value
        return hints if include_extras else {k: _strip_annotations(t) for k, t in hints.items()}

    if globalns is None:
        if isinstance(obj, types.ModuleType):
            globalns = obj.__dict__
        else:
            nsobj = obj
            # Find globalns for the unwrapped object.
            while hasattr(nsobj, '__wrapped__'):
                nsobj = nsobj.__wrapped__
            globalns = getattr(nsobj, '__globals__', {})
        if localns is None:
            localns = globalns
    elif localns is None:
        localns = globalns
    hints = getattr(obj, '__annotations__', None)
    if hints is None:
        # Return empty annotations for something that _could_ have them.
        if isinstance(obj, _allowed_types):
            return {}
        else:
            raise TypeError('{!r} is not a module, class, method, '
                            'or function.'.format(obj))
    hints = dict(hints)
    type_params = getattr(obj, "__type_params__", ())
    for name, value in hints.items():
        if value is None:
            value = type(None)
        if isinstance(value, str):
            # class-level forward refs were handled above, this must be either
            # a module-level annotation or a function argument annotation
            value = ForwardRef(
                value,
                is_argument=not isinstance(obj, types.ModuleType),
                is_class=False,
            )
        hints[name] = _eval_type(value, globalns, localns, type_params)
    return hints if include_extras else {k: _strip_annotations(t) for k, t in hints.items()}
Return type hints for an object.
This is often the same as obj.__annotations__, but it handles forward references encoded as string literals and recursively replaces all 'Annotated[T, ...]' with 'T' (unless 'include_extras=True').
The argument may be a module, class, method, or function. The annotations are returned as a dictionary. For classes, annotations also include inherited members.
TypeError is raised if the argument is not of a type that can contain annotations, and an empty dictionary is returned if no annotations are present.
BEWARE -- the behavior of globalns and localns is counterintuitive (unless you are familiar with how eval() and exec() work). The search order is locals first, then globals.
- If no dict arguments are passed, an attempt is made to use the globals from obj (or the respective module's globals for classes), and these are also used as the locals. If the object does not appear to have globals, an empty dictionary is used. For classes, the search order is globals first then locals.
- If one dict argument is passed, it is used for both globals and locals.
- If two dict arguments are passed, they specify globals and locals, respectively.
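A hedged sketch of the common case, with no namespace arguments passed (Node is an illustrative class, not part of typing)::

    from typing import get_type_hints

    class Node:
        # A forward reference encoded as a string literal.
        next: 'Node | None'
        value: int

    # The string is evaluated against the class's module namespace.
    assert get_type_hints(Node) == {'next': Node | None, 'value': int}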
def is_protocol(tp: type, /) -> bool:
    """Return True if the given type is a Protocol.

    Example::

        >>> from typing import Protocol, is_protocol
        >>> class P(Protocol):
        ...     def a(self) -> str: ...
        ...     b: int
        >>> is_protocol(P)
        True
        >>> is_protocol(int)
        False
    """
    return (
        isinstance(tp, type)
        and getattr(tp, '_is_protocol', False)
        and tp != Protocol
    )
Return True if the given type is a Protocol.
Example::
>>> from typing import Protocol, is_protocol
>>> class P(Protocol):
... def a(self) -> str: ...
... b: int
>>> is_protocol(P)
True
>>> is_protocol(int)
False
def is_typeddict(tp):
    """Check if an annotation is a TypedDict class.

    For example::

        >>> from typing import TypedDict
        >>> class Film(TypedDict):
        ...     title: str
        ...     year: int
        ...
        >>> is_typeddict(Film)
        True
        >>> is_typeddict(dict)
        False
    """
    return isinstance(tp, _TypedDictMeta)
Check if an annotation is a TypedDict class.
For example::
>>> from typing import TypedDict
>>> class Film(TypedDict):
... title: str
... year: int
...
>>> is_typeddict(Film)
True
>>> is_typeddict(dict)
False
Represents an arbitrary literal string.
Example::
from typing import LiteralString
def run_query(sql: LiteralString) -> None:
...
def caller(arbitrary_string: str, literal_string: LiteralString) -> None:
run_query("SELECT * FROM students") # OK
run_query(literal_string) # OK
run_query("SELECT * FROM " + literal_string) # OK
run_query(arbitrary_string) # type checker error
run_query( # type checker error
f"SELECT * FROM students WHERE name = {arbitrary_string}"
)
Only string literals and other LiteralStrings are compatible with LiteralString. This provides a tool to help prevent security issues such as SQL injection.
The bottom type, a type that has no members.
This can be used to define a function that should never be called, or a function that never returns::
from typing import Never
def never_call_me(arg: Never) -> None:
pass
def int_or_str(arg: int | str) -> None:
never_call_me(arg) # type checker error
match arg:
case int():
print("It's an int")
case str():
print("It's a str")
case _:
never_call_me(arg) # OK, arg is of type Never
class NewType:
    """NewType creates simple unique types with almost zero runtime overhead.

    NewType(name, tp) is considered a subtype of tp
    by static type checkers. At runtime, NewType(name, tp) returns
    a dummy callable that simply returns its argument.

    Usage::

        UserId = NewType('UserId', int)

        def name_by_id(user_id: UserId) -> str:
            ...

        UserId('user')          # Fails type check

        name_by_id(42)          # Fails type check
        name_by_id(UserId(42))  # OK

        num = UserId(5) + 1     # type: int
    """

    __call__ = _idfunc

    def __init__(self, name, tp):
        self.__qualname__ = name
        if '.' in name:
            name = name.rpartition('.')[-1]
        self.__name__ = name
        self.__supertype__ = tp
        def_mod = _caller()
        if def_mod != 'typing':
            self.__module__ = def_mod

    def __mro_entries__(self, bases):
        # We defined __mro_entries__ to get a better error message
        # if a user attempts to subclass a NewType instance. bpo-46170
        superclass_name = self.__name__

        class Dummy:
            def __init_subclass__(cls):
                subclass_name = cls.__name__
                raise TypeError(
                    f"Cannot subclass an instance of NewType. Perhaps you were looking for: "
                    f"`{subclass_name} = NewType({subclass_name!r}, {superclass_name})`"
                )

        return (Dummy,)

    def __repr__(self):
        return f'{self.__module__}.{self.__qualname__}'

    def __reduce__(self):
        return self.__qualname__

    def __or__(self, other):
        return Union[self, other]

    def __ror__(self, other):
        return Union[other, self]
NewType creates simple unique types with almost zero runtime overhead.
NewType(name, tp) is considered a subtype of tp by static type checkers. At runtime, NewType(name, tp) returns a dummy callable that simply returns its argument.
Usage::
UserId = NewType('UserId', int)
def name_by_id(user_id: UserId) -> str:
...
UserId('user') # Fails type check
name_by_id(42) # Fails type check
name_by_id(UserId(42)) # OK
num = UserId(5) + 1 # type: int
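A hedged sketch of the runtime behavior described above, reusing UserId from the usage example::

    from typing import NewType

    UserId = NewType('UserId', int)

    uid = UserId(42)                       # At runtime this just returns 42.
    assert uid == 42 and type(uid) is int  # No new runtime class is created.
    assert UserId.__supertype__ is int     # The wrapped type is recorded here.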
def no_type_check(arg):
    """Decorator to indicate that annotations are not type hints.

    The argument must be a class or function; if it is a class, it
    applies recursively to all methods and classes defined in that class
    (but not to methods defined in its superclasses or subclasses).

    This mutates the function(s) or class(es) in place.
    """
    if isinstance(arg, type):
        for key in dir(arg):
            obj = getattr(arg, key)
            if (
                not hasattr(obj, '__qualname__')
                or obj.__qualname__ != f'{arg.__qualname__}.{obj.__name__}'
                or getattr(obj, '__module__', None) != arg.__module__
            ):
                # We only modify objects that are defined in this type directly.
                # If classes / methods are nested in multiple layers,
                # we will modify them when processing their direct holders.
                continue
            # Instance, class, and static methods:
            if isinstance(obj, types.FunctionType):
                obj.__no_type_check__ = True
            if isinstance(obj, types.MethodType):
                obj.__func__.__no_type_check__ = True
            # Nested types:
            if isinstance(obj, type):
                no_type_check(obj)
    try:
        arg.__no_type_check__ = True
    except TypeError:  # built-in classes
        pass
    return arg
Decorator to indicate that annotations are not type hints.
The argument must be a class or function; if it is a class, it applies recursively to all methods and classes defined in that class (but not to methods defined in its superclasses or subclasses).
This mutates the function(s) or class(es) in place.
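A hedged sketch of the effect on introspection (Config is an illustrative class, not part of typing)::

    from typing import get_type_hints, no_type_check

    @no_type_check
    class Config:
        retries: "free-form note"      # Annotations here are not type hints.
        timeout: "another note"

    assert Config.__no_type_check__ is True
    assert get_type_hints(Config) == {}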
def no_type_check_decorator(decorator):
    """Decorator to give another decorator the @no_type_check effect.

    This wraps the decorator with something that wraps the decorated
    function in @no_type_check.
    """
    import warnings
    warnings._deprecated("typing.no_type_check_decorator", remove=(3, 15))
    @functools.wraps(decorator)
    def wrapped_decorator(*args, **kwds):
        func = decorator(*args, **kwds)
        func = no_type_check(func)
        return func

    return wrapped_decorator
Decorator to give another decorator the @no_type_check effect.
This wraps the decorator with something that wraps the decorated function in @no_type_check.
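A hedged sketch of how it composes (logged is an illustrative decorator; note that this helper is deprecated and slated for removal, so applying @no_type_check directly is usually preferable)::

    import functools
    from typing import no_type_check_decorator

    @no_type_check_decorator
    def logged(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            return func(*args, **kwargs)
        return wrapper

    @logged
    def handler(x: "not a hint") -> None: ...

    # Whatever the wrapped decorator returns is marked @no_type_check.
    assert handler.__no_type_check__ is True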
Special type indicating functions that never return.
Example::
from typing import NoReturn
def stop() -> NoReturn:
raise Exception('no way')
NoReturn can also be used as a bottom type, a type that has no values. Starting in Python 3.11, the Never type should be used for this concept instead. Type checkers should treat the two equivalently.
Special typing construct to mark a TypedDict key as potentially missing.
For example::
class Movie(TypedDict):
title: str
year: NotRequired[int]
m = Movie(
title='The Matrix', # typechecker error if key is omitted
year=1999,
)
def overload(func):
    """Decorator for overloaded functions/methods.

    In a stub file, place two or more stub definitions for the same
    function in a row, each decorated with @overload.

    For example::

        @overload
        def utf8(value: None) -> None: ...
        @overload
        def utf8(value: bytes) -> bytes: ...
        @overload
        def utf8(value: str) -> bytes: ...

    In a non-stub file (i.e. a regular .py file), do the same but
    follow it with an implementation.  The implementation should *not*
    be decorated with @overload::

        @overload
        def utf8(value: None) -> None: ...
        @overload
        def utf8(value: bytes) -> bytes: ...
        @overload
        def utf8(value: str) -> bytes: ...
        def utf8(value):
            ...  # implementation goes here

    The overloads for a function can be retrieved at runtime using the
    get_overloads() function.
    """
    # classmethod and staticmethod
    f = getattr(func, "__func__", func)
    try:
        _overload_registry[f.__module__][f.__qualname__][f.__code__.co_firstlineno] = func
    except AttributeError:
        # Not a normal function; ignore.
        pass
    return _overload_dummy
Decorator for overloaded functions/methods.
In a stub file, place two or more stub definitions for the same function in a row, each decorated with @overload.
For example::
@overload
def utf8(value: None) -> None: ...
@overload
def utf8(value: bytes) -> bytes: ...
@overload
def utf8(value: str) -> bytes: ...
In a non-stub file (i.e. a regular .py file), do the same but follow it with an implementation. The implementation should not be decorated with @overload::
@overload
def utf8(value: None) -> None: ...
@overload
def utf8(value: bytes) -> bytes: ...
@overload
def utf8(value: str) -> bytes: ...
def utf8(value):
... # implementation goes here
The overloads for a function can be retrieved at runtime using the get_overloads() function.
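A hedged sketch of that runtime retrieval, reusing the utf8 overloads from the example above (the implementation body is illustrative)::

    from typing import get_overloads, overload

    @overload
    def utf8(value: None) -> None: ...
    @overload
    def utf8(value: bytes) -> bytes: ...
    def utf8(value):
        return value if value is None else bytes(value)

    # The registered stub definitions are returned in definition order.
    assert len(get_overloads(utf8)) == 2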
def override[F: _Func](method: F, /) -> F:
    """Indicate that a method is intended to override a method in a base class.

    Usage::

        class Base:
            def method(self) -> None:
                pass

        class Child(Base):
            @override
            def method(self) -> None:
                super().method()

    When this decorator is applied to a method, the type checker will
    validate that it overrides a method or attribute with the same name on a
    base class.  This helps prevent bugs that may occur when a base class is
    changed without an equivalent change to a child class.

    There is no runtime checking of this property. The decorator attempts to
    set the ``__override__`` attribute to ``True`` on the decorated object to
    allow runtime introspection.

    See PEP 698 for details.
    """
    try:
        method.__override__ = True
    except (AttributeError, TypeError):
        # Skip the attribute silently if it is not writable.
        # AttributeError happens if the object has __slots__ or a
        # read-only property, TypeError if it's a builtin class.
        pass
    return method
Indicate that a method is intended to override a method in a base class.
Usage::
class Base:
def method(self) -> None:
pass
class Child(Base):
@override
def method(self) -> None:
super().method()
When this decorator is applied to a method, the type checker will validate that it overrides a method or attribute with the same name on a base class. This helps prevent bugs that may occur when a base class is changed without an equivalent change to a child class.
There is no runtime checking of this property. The decorator attempts to set the __override__ attribute to True on the decorated object to allow runtime introspection.
See PEP 698 for details.
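A hedged sketch of that best-effort runtime flag (the class names are illustrative, not part of typing)::

    from typing import override

    class Base:
        def close(self) -> None: ...

    class Child(Base):
        @override
        def close(self) -> None: ...

    # The flag is set only when the decorated object allows attribute writes.
    assert getattr(Child.close, '__override__', False) is True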
The args for a ParamSpec object.
Given a ParamSpec object P, P.args is an instance of ParamSpecArgs.
ParamSpecArgs objects have a reference back to their ParamSpec::
>>> P = ParamSpec("P")
>>> P.args.__origin__ is P
True
This type is meant for runtime introspection and has no special meaning to static type checkers.
The kwargs for a ParamSpec object.
Given a ParamSpec object P, P.kwargs is an instance of ParamSpecKwargs.
ParamSpecKwargs objects have a reference back to their ParamSpec::
>>> P = ParamSpec("P")
>>> P.kwargs.__origin__ is P
True
This type is meant for runtime introspection and has no special meaning to static type checkers.
A special typing construct to mark an item of a TypedDict as read-only.
For example::
class Movie(TypedDict):
title: ReadOnly[str]
year: int
def mutate_movie(m: Movie) -> None:
m["year"] = 1992 # allowed
m["title"] = "The Matrix" # typechecker error
There is no runtime checking for this property.
Special typing construct to mark a TypedDict key as required.
This is mainly useful for total=False TypedDicts.
For example::
class Movie(TypedDict, total=False):
title: Required[str]
year: int
m = Movie(
title='The Matrix', # typechecker error if key is omitted
year=1999,
)
There is no runtime checking that a required key is actually provided when instantiating a related TypedDict.
def reveal_type[T](obj: T, /) -> T:
    """Ask a static type checker to reveal the inferred type of an expression.

    When a static type checker encounters a call to ``reveal_type()``,
    it will emit the inferred type of the argument::

        x: int = 1
        reveal_type(x)

    Running a static type checker (e.g., mypy) on this example
    will produce output similar to 'Revealed type is "builtins.int"'.

    At runtime, the function prints the runtime type of the
    argument and returns the argument unchanged.
    """
    print(f"Runtime type is {type(obj).__name__!r}", file=sys.stderr)
    return obj
Ask a static type checker to reveal the inferred type of an expression.
When a static type checker encounters a call to reveal_type(), it will emit the inferred type of the argument::
x: int = 1
reveal_type(x)
Running a static type checker (e.g., mypy) on this example will produce output similar to 'Revealed type is "builtins.int"'.
At runtime, the function prints the runtime type of the argument and returns the argument unchanged.
def runtime_checkable(cls):
    """Mark a protocol class as a runtime protocol.

    Such protocol can be used with isinstance() and issubclass().
    Raise TypeError if applied to a non-protocol class.
    This allows a simple-minded structural check very similar to
    one trick ponies in collections.abc such as Iterable.

    For example::

        @runtime_checkable
        class Closable(Protocol):
            def close(self): ...

        assert isinstance(open('/some/file'), Closable)

    Warning: this will check only the presence of the required methods,
    not their type signatures!
    """
    if not issubclass(cls, Generic) or not getattr(cls, '_is_protocol', False):
        raise TypeError('@runtime_checkable can be only applied to protocol classes,'
                        ' got %r' % cls)
    cls._is_runtime_protocol = True
    # PEP 544 prohibits using issubclass()
    # with protocols that have non-method members.
    # See gh-113320 for why we compute this attribute here,
    # rather than in `_ProtocolMeta.__init__`
    cls.__non_callable_proto_members__ = set()
    for attr in cls.__protocol_attrs__:
        try:
            is_callable = callable(getattr(cls, attr, None))
        except Exception as e:
            raise TypeError(
                f"Failed to determine whether protocol member {attr!r} "
                "is a method member"
            ) from e
        else:
            if not is_callable:
                cls.__non_callable_proto_members__.add(attr)
    return cls
Mark a protocol class as a runtime protocol.
Such a protocol can be used with isinstance() and issubclass(). TypeError is raised if this is applied to a non-protocol class. This allows a simple-minded structural check, very similar to the one-trick ponies in collections.abc such as Iterable.
For example::
@runtime_checkable
class Closable(Protocol):
def close(self): ...
assert isinstance(open('/some/file'), Closable)
Warning: this will check only the presence of the required methods, not their type signatures!
Used to spell the type of "self" in classes.
Example::
from typing import Self
class Foo:
def return_self(self) -> Self:
...
return self
This is especially useful for:
- classmethods that are used as alternative constructors
- annotating an __enter__ method which returns self
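A hedged sketch of the alternative-constructor case from the list above (Config and from_mapping are illustrative names)::

    from typing import Self

    class Config:
        def __init__(self, raw: dict[str, str]) -> None:
            self.raw = raw

        @classmethod
        def from_mapping(cls, raw: dict[str, str]) -> Self:
            # A subclass calling Sub.from_mapping(...) is inferred to
            # return Sub, not Config.
            return cls(raw)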
Special form for marking type aliases.
Use TypeAlias to indicate that an assignment should be recognized as a proper type alias definition by type checkers.
For example::
Predicate: TypeAlias = Callable[..., bool]
It's invalid when used anywhere except as in the example above.
Special typing construct for marking user-defined type predicate functions.
TypeGuard can be used to annotate the return type of a user-defined type predicate function. TypeGuard only accepts a single type argument. At runtime, functions marked this way should return a boolean.
TypeGuard aims to benefit type narrowing -- a technique used by static type checkers to determine a more precise type of an expression within a program's code flow. Usually type narrowing is done by analyzing conditional code flow and applying the narrowing to a block of code. The conditional expression here is sometimes referred to as a "type predicate".
Sometimes it would be convenient to use a user-defined boolean function as a type predicate. Such a function should use TypeGuard[...] or TypeIs[...] as its return type to alert static type checkers to this intention. TypeGuard should be used over TypeIs when narrowing from an incompatible type (e.g., list[object] to list[int]) or when the function does not return True for all instances of the narrowed type.
Using -> TypeGuard[NarrowedType] tells the static type checker that for a given function:
- The return value is a boolean.
- If the return value is True, the type of its argument is NarrowedType.
For example::
def is_str_list(val: list[object]) -> TypeGuard[list[str]]:
'''Determines whether all objects in the list are strings'''
return all(isinstance(x, str) for x in val)
def func1(val: list[object]):
if is_str_list(val):
# Type of ``val`` is narrowed to ``list[str]``.
print(" ".join(val))
else:
# Type of ``val`` remains as ``list[object]``.
print("Not a list of strings!")
Strict type narrowing is not enforced -- TypeB need not be a narrower form of TypeA (it can even be a wider form) and this may lead to type-unsafe results. The main reason is to allow for things like narrowing list[object] to list[str] even though the latter is not a subtype of the former, since list is invariant. The responsibility of writing type-safe type predicates is left to the user.
TypeGuard also works with type variables. For more information, see PEP 647 (User-Defined Type Guards).
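Since TypeGuard also works with type variables, a hedged generic sketch (all_of_type and total are illustrative helpers, not part of typing)::

    from typing import TypeGuard

    def all_of_type[T](values: list[object], kind: type[T]) -> TypeGuard[list[T]]:
        # Narrows to list[T] for whichever element type is passed in.
        return all(isinstance(v, kind) for v in values)

    def total(values: list[object]) -> int:
        if all_of_type(values, int):
            return sum(values)   # values is treated as list[int] here.
        return 0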
Special typing construct for marking user-defined type predicate functions.
TypeIs can be used to annotate the return type of a user-defined type predicate function. TypeIs only accepts a single type argument. At runtime, functions marked this way should return a boolean and accept at least one argument.
TypeIs aims to benefit type narrowing -- a technique used by static type checkers to determine a more precise type of an expression within a program's code flow. Usually type narrowing is done by analyzing conditional code flow and applying the narrowing to a block of code. The conditional expression here is sometimes referred to as a "type predicate".
Sometimes it would be convenient to use a user-defined boolean function as a type predicate. Such a function should use TypeIs[...] or TypeGuard[...] as its return type to alert static type checkers to this intention. TypeIs usually has more intuitive behavior than TypeGuard, but it cannot be used when the input and output types are incompatible (e.g., list[object] to list[int]) or when the function does not return True for all instances of the narrowed type.
Using -> TypeIs[NarrowedType] tells the static type checker that for a given function:
- The return value is a boolean.
- If the return value is True, the type of its argument is the intersection of the argument's original type and NarrowedType.
- If the return value is False, the type of its argument is narrowed to exclude NarrowedType.
For example::
from typing import assert_type, final, TypeIs
class Parent: pass
class Child(Parent): pass
@final
class Unrelated: pass
def is_parent(val: object) -> TypeIs[Parent]:
return isinstance(val, Parent)
def run(arg: Child | Unrelated):
if is_parent(arg):
# Type of ``arg`` is narrowed to the intersection
# of ``Parent`` and ``Child``, which is equivalent to
# ``Child``.
assert_type(arg, Child)
else:
# Type of ``arg`` is narrowed to exclude ``Parent``,
# so only ``Unrelated`` is left.
assert_type(arg, Unrelated)
The type inside TypeIs must be consistent with the type of the function's argument; if it is not, static type checkers will raise an error. An incorrectly written TypeIs function can lead to unsound behavior in the type system; it is the user's responsibility to write such functions in a type-safe manner.
TypeIs also works with type variables. For more information, see PEP 742 (Narrowing types with TypeIs).
Type alias.
Type aliases are created through the type statement::
type Alias = int
In this example, Alias and int will be treated equivalently by static type checkers.
At runtime, Alias is an instance of TypeAliasType. The __name__ attribute holds the name of the type alias. The value of the type alias is stored in the __value__ attribute. It is evaluated lazily, so the value is computed only if the attribute is accessed.
Type aliases can also be generic::
type ListOrSet[T] = list[T] | set[T]
In this case, the type parameters of the alias are stored in the __type_params__ attribute.
See PEP 695 for more information.
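A hedged sketch of that runtime introspection (the alias name is illustrative)::

    type ListOrSet[T] = list[T] | set[T]

    assert ListOrSet.__name__ == 'ListOrSet'
    assert len(ListOrSet.__type_params__) == 1
    print(ListOrSet.__value__)   # list[T] | set[T], evaluated only on access.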
Type unpack operator.
The type unpack operator takes the child types from some container type, such as tuple[int, str] or a TypeVarTuple, and 'pulls them out'.
For example::
# For some generic class `Foo`:
Foo[Unpack[tuple[int, str]]] # Equivalent to Foo[int, str]
Ts = TypeVarTuple('Ts')
# Specifies that `Bar` is generic in an arbitrary number of types.
# (Think of `Ts` as a tuple of an arbitrary number of individual
# `TypeVar`s, which the `Unpack` is 'pulling out' directly into the
# `Generic[]`.)
class Bar(Generic[Unpack[Ts]]): ...
Bar[int] # Valid
Bar[int, str] # Also valid
From Python 3.11, this can also be done using the * operator::
Foo[*tuple[int, str]]
class Bar(Generic[*Ts]): ...
And from Python 3.12, it can be done using built-in syntax for generics::
Foo[*tuple[int, str]]
class Bar[*Ts]: ...
The operator can also be used along with a TypedDict to annotate **kwargs in a function signature::
class Movie(TypedDict):
name: str
year: int
# This function expects two keyword arguments - *name* of type `str` and
# *year* of type `int`.
def foo(**kwargs: Unpack[Movie]): ...
Note that there is only some runtime checking of this operator. Not everything the runtime allows may be accepted by static type checkers.
For more information, see PEPs 646 and 692.