typing

The typing module: Support for gradual typing as defined by PEP 484 and subsequent PEPs.

Among other things, the module includes the following:

  • Generic, Protocol, and internal machinery to support generic aliases. All subscripted types like X[int], Union[int, str] are generic aliases.
  • Various "special forms" that have unique meanings in type annotations: NoReturn, Never, ClassVar, Self, Concatenate, Unpack, and others.
  • Classes whose instances can be type arguments to generic classes and functions: TypeVar, ParamSpec, TypeVarTuple.
  • Public helper functions: get_type_hints, overload, cast, final, and others.
  • Several protocols to support duck-typing: SupportsFloat, SupportsIndex, SupportsAbs, and others.
  • Special types: NewType, NamedTuple, TypedDict.
  • Deprecated aliases for builtin types and collections.abc ABCs.

Any name not present in __all__ is an implementation detail that may be changed without notice. Use at your own risk!
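
For example, a small generic container and a helper that uses it might be annotated as follows (an illustrative sketch; Box and first are example names, not part of the module):

    from typing import Generic, Optional, TypeVar

    T = TypeVar('T')

    class Box(Generic[T]):
        def __init__(self, item: T) -> None:
            self.item = item

    def first(items: list[T]) -> Optional[T]:
        return items[0] if items else None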

   1"""
   2The typing module: Support for gradual typing as defined by PEP 484 and subsequent PEPs.
   3
   4Among other things, the module includes the following:
   5* Generic, Protocol, and internal machinery to support generic aliases.
   6  All subscripted types like X[int], Union[int, str] are generic aliases.
   7* Various "special forms" that have unique meanings in type annotations:
   8  NoReturn, Never, ClassVar, Self, Concatenate, Unpack, and others.
   9* Classes whose instances can be type arguments to generic classes and functions:
  10  TypeVar, ParamSpec, TypeVarTuple.
  11* Public helper functions: get_type_hints, overload, cast, final, and others.
  12* Several protocols to support duck-typing:
  13  SupportsFloat, SupportsIndex, SupportsAbs, and others.
  14* Special types: NewType, NamedTuple, TypedDict.
  15* Deprecated aliases for builtin types and collections.abc ABCs.
  16
  17Any name not present in __all__ is an implementation detail
  18that may be changed without notice. Use at your own risk!
  19"""
  20
  21from abc import abstractmethod, ABCMeta
  22import collections
  23from collections import defaultdict
  24import collections.abc
  25import copyreg
  26import functools
  27import operator
  28import sys
  29import types
  30from types import WrapperDescriptorType, MethodWrapperType, MethodDescriptorType, GenericAlias
  31
  32from _typing import (
  33    _idfunc,
  34    TypeVar,
  35    ParamSpec,
  36    TypeVarTuple,
  37    ParamSpecArgs,
  38    ParamSpecKwargs,
  39    TypeAliasType,
  40    Generic,
  41    NoDefault,
  42)
  43
  44# Please keep __all__ alphabetized within each category.
  45__all__ = [
  46    # Super-special typing primitives.
  47    'Annotated',
  48    'Any',
  49    'Callable',
  50    'ClassVar',
  51    'Concatenate',
  52    'Final',
  53    'ForwardRef',
  54    'Generic',
  55    'Literal',
  56    'Optional',
  57    'ParamSpec',
  58    'Protocol',
  59    'Tuple',
  60    'Type',
  61    'TypeVar',
  62    'TypeVarTuple',
  63    'Union',
  64
  65    # ABCs (from collections.abc).
  66    'AbstractSet',  # collections.abc.Set.
  67    'ByteString',
  68    'Container',
  69    'ContextManager',
  70    'Hashable',
  71    'ItemsView',
  72    'Iterable',
  73    'Iterator',
  74    'KeysView',
  75    'Mapping',
  76    'MappingView',
  77    'MutableMapping',
  78    'MutableSequence',
  79    'MutableSet',
  80    'Sequence',
  81    'Sized',
  82    'ValuesView',
  83    'Awaitable',
  84    'AsyncIterator',
  85    'AsyncIterable',
  86    'Coroutine',
  87    'Collection',
  88    'AsyncGenerator',
  89    'AsyncContextManager',
  90
  91    # Structural checks, a.k.a. protocols.
  92    'Reversible',
  93    'SupportsAbs',
  94    'SupportsBytes',
  95    'SupportsComplex',
  96    'SupportsFloat',
  97    'SupportsIndex',
  98    'SupportsInt',
  99    'SupportsRound',
 100
 101    # Concrete collection types.
 102    'ChainMap',
 103    'Counter',
 104    'Deque',
 105    'Dict',
 106    'DefaultDict',
 107    'List',
 108    'OrderedDict',
 109    'Set',
 110    'FrozenSet',
 111    'NamedTuple',  # Not really a type.
 112    'TypedDict',  # Not really a type.
 113    'Generator',
 114
 115    # Other concrete types.
 116    'BinaryIO',
 117    'IO',
 118    'Match',
 119    'Pattern',
 120    'TextIO',
 121
 122    # One-off things.
 123    'AnyStr',
 124    'assert_type',
 125    'assert_never',
 126    'cast',
 127    'clear_overloads',
 128    'dataclass_transform',
 129    'final',
 130    'get_args',
 131    'get_origin',
 132    'get_overloads',
 133    'get_protocol_members',
 134    'get_type_hints',
 135    'is_protocol',
 136    'is_typeddict',
 137    'LiteralString',
 138    'Never',
 139    'NewType',
 140    'no_type_check',
 141    'no_type_check_decorator',
 142    'NoDefault',
 143    'NoReturn',
 144    'NotRequired',
 145    'overload',
 146    'override',
 147    'ParamSpecArgs',
 148    'ParamSpecKwargs',
 149    'ReadOnly',
 150    'Required',
 151    'reveal_type',
 152    'runtime_checkable',
 153    'Self',
 154    'Text',
 155    'TYPE_CHECKING',
 156    'TypeAlias',
 157    'TypeGuard',
 158    'TypeIs',
 159    'TypeAliasType',
 160    'Unpack',
 161]
 162
 163
 164def _type_convert(arg, module=None, *, allow_special_forms=False):
 165    """For converting None to type(None), and strings to ForwardRef."""
 166    if arg is None:
 167        return type(None)
 168    if isinstance(arg, str):
 169        return ForwardRef(arg, module=module, is_class=allow_special_forms)
 170    return arg
 171
 172
 173def _type_check(arg, msg, is_argument=True, module=None, *, allow_special_forms=False):
 174    """Check that the argument is a type, and return it (internal helper).
 175
 176    As a special case, accept None and return type(None) instead. Also wrap strings
 177    into ForwardRef instances. Consider several corner cases, for example plain
 178    special forms like Union are not valid, while Union[int, str] is OK, etc.
 179    The msg argument is a human-readable error message, e.g.::
 180
 181        "Union[arg, ...]: arg should be a type."
 182
 183    We append the repr() of the actual value (truncated to 100 chars).
 184    """
 185    invalid_generic_forms = (Generic, Protocol)
 186    if not allow_special_forms:
 187        invalid_generic_forms += (ClassVar,)
 188        if is_argument:
 189            invalid_generic_forms += (Final,)
 190
 191    arg = _type_convert(arg, module=module, allow_special_forms=allow_special_forms)
 192    if (isinstance(arg, _GenericAlias) and
 193            arg.__origin__ in invalid_generic_forms):
 194        raise TypeError(f"{arg} is not valid as type argument")
 195    if arg in (Any, LiteralString, NoReturn, Never, Self, TypeAlias):
 196        return arg
 197    if allow_special_forms and arg in (ClassVar, Final):
 198        return arg
 199    if isinstance(arg, _SpecialForm) or arg in (Generic, Protocol):
 200        raise TypeError(f"Plain {arg} is not valid as type argument")
 201    if type(arg) is tuple:
 202        raise TypeError(f"{msg} Got {arg!r:.100}.")
 203    return arg
 204
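    # An illustrative sketch of _type_check's observable behaviour (doctest-style;
    # ``msg`` is just an example error message):
    #
    #     >>> msg = "arg should be a type."
    #     >>> _type_check(None, msg)          # None is converted to type(None)
    #     <class 'NoneType'>
    #     >>> _type_check("MyClass", msg)     # strings become forward references
    #     ForwardRef('MyClass')
    #     >>> _type_check(Union, msg)         # plain special forms are rejected
    #     Traceback (most recent call last):
    #         ...
    #     TypeError: Plain typing.Union is not valid as type argument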
 205
 206def _is_param_expr(arg):
 207    return arg is ... or isinstance(arg,
 208            (tuple, list, ParamSpec, _ConcatenateGenericAlias))
 209
 210
 211def _should_unflatten_callable_args(typ, args):
 212    """Internal helper for munging collections.abc.Callable's __args__.
 213
 214    The canonical representation for a Callable's __args__ flattens the
 215    argument types, see https://github.com/python/cpython/issues/86361.
 216
 217    For example::
 218
 219        >>> import collections.abc
 220        >>> P = ParamSpec('P')
 221        >>> collections.abc.Callable[[int, int], str].__args__ == (int, int, str)
 222        True
 223        >>> collections.abc.Callable[P, str].__args__ == (P, str)
 224        True
 225
 226    As a result, if we need to reconstruct the Callable from its __args__,
 227    we need to unflatten it.
 228    """
 229    return (
 230        typ.__origin__ is collections.abc.Callable
 231        and not (len(args) == 2 and _is_param_expr(args[0]))
 232    )
 233
 234
 235def _type_repr(obj):
 236    """Return the repr() of an object, special-casing types (internal helper).
 237
 238    If obj is a type, we return a shorter version than the default
 239    type.__repr__, based on the module and qualified name, which is
 240    typically enough to uniquely identify a type.  For everything
 241    else, we fall back on repr(obj).
 242    """
 243    # When changing this function, don't forget about
 244    # `_collections_abc._type_repr`, which does the same thing
 245    # and must be consistent with this one.
 246    if isinstance(obj, type):
 247        if obj.__module__ == 'builtins':
 248            return obj.__qualname__
 249        return f'{obj.__module__}.{obj.__qualname__}'
 250    if obj is ...:
 251        return '...'
 252    if isinstance(obj, types.FunctionType):
 253        return obj.__name__
 254    if isinstance(obj, tuple):
 255        # Special case for `repr` of types with `ParamSpec`:
 256        return '[' + ', '.join(_type_repr(t) for t in obj) + ']'
 257    return repr(obj)
 258
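    # A few concrete examples of what _type_repr produces (illustrative only):
    #
    #     >>> _type_repr(int)                      # builtin types are shortened
    #     'int'
    #     >>> _type_repr(collections.abc.Sized)    # others keep their module prefix
    #     'collections.abc.Sized'
    #     >>> _type_repr(...)
    #     '...'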
 259
 260def _collect_type_parameters(args, *, enforce_default_ordering: bool = True):
 261    """Collect all type parameters in args
 262    in order of first appearance (lexicographic order).
 263
 264    For example::
 265
 266        >>> P = ParamSpec('P')
 267        >>> T = TypeVar('T')
 268        >>> _collect_type_parameters((T, Callable[P, T]))
 269        (~T, ~P)
 270    """
 271    # required type parameter cannot appear after parameter with default
 272    default_encountered = False
 273    # or after TypeVarTuple
 274    type_var_tuple_encountered = False
 275    parameters = []
 276    for t in args:
 277        if isinstance(t, type):
 278            # We don't want the __parameters__ descriptor of a bare Python class.
 279            pass
 280        elif isinstance(t, tuple):
 281            # `t` might be a tuple, when `ParamSpec` is substituted with
 282            # `[T, int]`, or `[int, *Ts]`, etc.
 283            for x in t:
 284                for collected in _collect_type_parameters([x]):
 285                    if collected not in parameters:
 286                        parameters.append(collected)
 287        elif hasattr(t, '__typing_subst__'):
 288            if t not in parameters:
 289                if enforce_default_ordering:
 290                    if type_var_tuple_encountered and t.has_default():
 291                        raise TypeError('Type parameter with a default'
 292                                        ' follows TypeVarTuple')
 293
 294                    if t.has_default():
 295                        default_encountered = True
 296                    elif default_encountered:
 297                        raise TypeError(f'Type parameter {t!r} without a default'
 298                                        ' follows type parameter with a default')
 299
 300                parameters.append(t)
 301        else:
 302            if _is_unpacked_typevartuple(t):
 303                type_var_tuple_encountered = True
 304            for x in getattr(t, '__parameters__', ()):
 305                if x not in parameters:
 306                    parameters.append(x)
 307    return tuple(parameters)
 308
 309
 310def _check_generic_specialization(cls, arguments):
 311    """Check correct count for parameters of a generic cls (internal helper).
 312
 313    This gives a nice error message in case of count mismatch.
 314    """
 315    expected_len = len(cls.__parameters__)
 316    if not expected_len:
 317        raise TypeError(f"{cls} is not a generic class")
 318    actual_len = len(arguments)
 319    if actual_len != expected_len:
 320        # deal with defaults
 321        if actual_len < expected_len:
 322            # If the parameter at index `actual_len` in the parameters list
 323            # has a default, then all parameters after it must also have
 324            # one, because we validated as much in _collect_type_parameters().
 325            # That means that no error needs to be raised here, despite
 326            # the number of arguments being passed not matching the number
 327            # of parameters: all parameters that aren't explicitly
 328            # specialized in this call are parameters with default values.
 329            if cls.__parameters__[actual_len].has_default():
 330                return
 331
 332            expected_len -= sum(p.has_default() for p in cls.__parameters__)
 333            expect_val = f"at least {expected_len}"
 334        else:
 335            expect_val = expected_len
 336
 337        raise TypeError(f"Too {'many' if actual_len > expected_len else 'few'} arguments"
 338                        f" for {cls}; actual {actual_len}, expected {expect_val}")
 339
 340
 341def _unpack_args(*args):
 342    newargs = []
 343    for arg in args:
 344        subargs = getattr(arg, '__typing_unpacked_tuple_args__', None)
 345        if subargs is not None and not (subargs and subargs[-1] is ...):
 346            newargs.extend(subargs)
 347        else:
 348            newargs.append(arg)
 349    return newargs
 350
 351def _deduplicate(params, *, unhashable_fallback=False):
 352    # Weed out strict duplicates, preserving the first of each occurrence.
 353    try:
 354        return dict.fromkeys(params)
 355    except TypeError:
 356        if not unhashable_fallback:
 357            raise
 358        # Happens for cases like `Annotated[dict, {'x': IntValidator()}]`
 359        return _deduplicate_unhashable(params)
 360
 361def _deduplicate_unhashable(unhashable_params):
 362    new_unhashable = []
 363    for t in unhashable_params:
 364        if t not in new_unhashable:
 365            new_unhashable.append(t)
 366    return new_unhashable
 367
 368def _compare_args_orderless(first_args, second_args):
 369    first_unhashable = _deduplicate_unhashable(first_args)
 370    second_unhashable = _deduplicate_unhashable(second_args)
 371    t = list(second_unhashable)
 372    try:
 373        for elem in first_unhashable:
 374            t.remove(elem)
 375    except ValueError:
 376        return False
 377    return not t
 378
 379def _remove_dups_flatten(parameters):
 380    """Internal helper for Union creation and substitution.
 381
 382    Flatten Unions among parameters, then remove duplicates.
 383    """
 384    # Flatten out Union[Union[...], ...].
 385    params = []
 386    for p in parameters:
 387        if isinstance(p, (_UnionGenericAlias, types.UnionType)):
 388            params.extend(p.__args__)
 389        else:
 390            params.append(p)
 391
 392    return tuple(_deduplicate(params, unhashable_fallback=True))
 393
 394
 395def _flatten_literal_params(parameters):
 396    """Internal helper for Literal creation: flatten Literals among parameters."""
 397    params = []
 398    for p in parameters:
 399        if isinstance(p, _LiteralGenericAlias):
 400            params.extend(p.__args__)
 401        else:
 402            params.append(p)
 403    return tuple(params)
 404
 405
 406_cleanups = []
 407_caches = {}
 408
 409
 410def _tp_cache(func=None, /, *, typed=False):
 411    """Internal wrapper caching __getitem__ of generic types.
 412
 413    For non-hashable arguments, the original function is used as a fallback.
 414    """
 415    def decorator(func):
 416        # The callback 'inner' references the newly created lru_cache
 417        # indirectly by performing a lookup in the global '_caches' dictionary.
 418        # This breaks a reference that can be problematic when combined with
 419        # C API extensions that leak references to types. See GH-98253.
 420
 421        cache = functools.lru_cache(typed=typed)(func)
 422        _caches[func] = cache
 423        _cleanups.append(cache.cache_clear)
 424        del cache
 425
 426        @functools.wraps(func)
 427        def inner(*args, **kwds):
 428            try:
 429                return _caches[func](*args, **kwds)
 430            except TypeError:
 431                pass  # All real errors (not unhashable args) are raised below.
 432            return func(*args, **kwds)
 433        return inner
 434
 435    if func is not None:
 436        return decorator(func)
 437
 438    return decorator
 439
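    # A minimal sketch of the cache's effect (assuming the public names are
    # imported from typing): subscripting the same alias twice returns the
    # identical cached object, while unhashable arguments fall back to the
    # uncached call and simply are not memoized.
    #
    #     >>> from typing import List
    #     >>> List[int] is List[int]
    #     True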
 440
 441def _deprecation_warning_for_no_type_params_passed(funcname: str) -> None:
 442    import warnings
 443
 444    depr_message = (
 445        f"Failing to pass a value to the 'type_params' parameter "
 446        f"of {funcname!r} is deprecated, as it leads to incorrect behaviour "
 447        f"when calling {funcname} on a stringified annotation "
 448        f"that references a PEP 695 type parameter. "
 449        f"It will be disallowed in Python 3.15."
 450    )
 451    warnings.warn(depr_message, category=DeprecationWarning, stacklevel=3)
 452
 453
 454class _Sentinel:
 455    __slots__ = ()
 456    def __repr__(self):
 457        return '<sentinel>'
 458
 459
 460_sentinel = _Sentinel()
 461
 462
 463def _eval_type(t, globalns, localns, type_params=_sentinel, *, recursive_guard=frozenset()):
 464    """Evaluate all forward references in the given type t.
 465
 466    For use of globalns and localns see the docstring for get_type_hints().
 467    recursive_guard is used to prevent infinite recursion with a recursive
 468    ForwardRef.
 469    """
 470    if type_params is _sentinel:
 471        _deprecation_warning_for_no_type_params_passed("typing._eval_type")
 472        type_params = ()
 473    if isinstance(t, ForwardRef):
 474        return t._evaluate(globalns, localns, type_params, recursive_guard=recursive_guard)
 475    if isinstance(t, (_GenericAlias, GenericAlias, types.UnionType)):
 476        if isinstance(t, GenericAlias):
 477            args = tuple(
 478                ForwardRef(arg) if isinstance(arg, str) else arg
 479                for arg in t.__args__
 480            )
 481            is_unpacked = t.__unpacked__
 482            if _should_unflatten_callable_args(t, args):
 483                t = t.__origin__[(args[:-1], args[-1])]
 484            else:
 485                t = t.__origin__[args]
 486            if is_unpacked:
 487                t = Unpack[t]
 488
 489        ev_args = tuple(
 490            _eval_type(
 491                a, globalns, localns, type_params, recursive_guard=recursive_guard
 492            )
 493            for a in t.__args__
 494        )
 495        if ev_args == t.__args__:
 496            return t
 497        if isinstance(t, GenericAlias):
 498            return GenericAlias(t.__origin__, ev_args)
 499        if isinstance(t, types.UnionType):
 500            return functools.reduce(operator.or_, ev_args)
 501        else:
 502            return t.copy_with(ev_args)
 503    return t
 504
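    # _eval_type is what ultimately resolves stringified annotations when
    # get_type_hints() is called.  A small illustrative sketch (``f`` is a
    # hypothetical function with a nested string annotation):
    #
    #     >>> from typing import get_type_hints
    #     >>> def f(x: "list['int']") -> None: ...
    #     >>> get_type_hints(f)['x'] == list[int]
    #     True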
 505
 506class _Final:
 507    """Mixin to prohibit subclassing."""
 508
 509    __slots__ = ('__weakref__',)
 510
 511    def __init_subclass__(cls, /, *args, **kwds):
 512        if '_root' not in kwds:
 513            raise TypeError("Cannot subclass special typing classes")
 514
 515
 516class _NotIterable:
 517    """Mixin to prevent iteration, without being compatible with Iterable.
 518
 519    That is, we could do::
 520
 521        def __iter__(self): raise TypeError()
 522
 523    But this would make users of this mixin duck type-compatible with
 524    collections.abc.Iterable - isinstance(foo, Iterable) would be True.
 525
 526    Luckily, we can instead prevent iteration by setting __iter__ to None, which
 527    is treated specially.
 528    """
 529
 530    __slots__ = ()
 531    __iter__ = None
 532
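    # A sketch of the trick described above, using a stand-alone example class
    # (the class name is made up for the illustration):
    #
    #     >>> import collections.abc
    #     >>> class NotIterable:
    #     ...     __iter__ = None
    #     >>> isinstance(NotIterable(), collections.abc.Iterable)
    #     False
    #     >>> iter(NotIterable())
    #     Traceback (most recent call last):
    #         ...
    #     TypeError: 'NotIterable' object is not iterable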
 533
 534# Internal indicator of special typing constructs.
 535# See __doc__ instance attribute for specific docs.
 536class _SpecialForm(_Final, _NotIterable, _root=True):
 537    __slots__ = ('_name', '__doc__', '_getitem')
 538
 539    def __init__(self, getitem):
 540        self._getitem = getitem
 541        self._name = getitem.__name__
 542        self.__doc__ = getitem.__doc__
 543
 544    def __getattr__(self, item):
 545        if item in {'__name__', '__qualname__'}:
 546            return self._name
 547
 548        raise AttributeError(item)
 549
 550    def __mro_entries__(self, bases):
 551        raise TypeError(f"Cannot subclass {self!r}")
 552
 553    def __repr__(self):
 554        return 'typing.' + self._name
 555
 556    def __reduce__(self):
 557        return self._name
 558
 559    def __call__(self, *args, **kwds):
 560        raise TypeError(f"Cannot instantiate {self!r}")
 561
 562    def __or__(self, other):
 563        return Union[self, other]
 564
 565    def __ror__(self, other):
 566        return Union[other, self]
 567
 568    def __instancecheck__(self, obj):
 569        raise TypeError(f"{self} cannot be used with isinstance()")
 570
 571    def __subclasscheck__(self, cls):
 572        raise TypeError(f"{self} cannot be used with issubclass()")
 573
 574    @_tp_cache
 575    def __getitem__(self, parameters):
 576        return self._getitem(self, parameters)
 577
 578
 579class _TypedCacheSpecialForm(_SpecialForm, _root=True):
 580    def __getitem__(self, parameters):
 581        if not isinstance(parameters, tuple):
 582            parameters = (parameters,)
 583        return self._getitem(self, *parameters)
 584
 585
 586class _AnyMeta(type):
 587    def __instancecheck__(self, obj):
 588        if self is Any:
 589            raise TypeError("typing.Any cannot be used with isinstance()")
 590        return super().__instancecheck__(obj)
 591
 592    def __repr__(self):
 593        if self is Any:
 594            return "typing.Any"
 595        return super().__repr__()  # let subclasses use the default repr
 596
 597
 598class Any(metaclass=_AnyMeta):
 599    """Special type indicating an unconstrained type.
 600
 601    - Any is compatible with every type.
 602    - Any is assumed to have all methods.
 603    - All values are assumed to be instances of Any.
 604
 605    Note that all the above statements are true from the point of view of
 606    static type checkers. At runtime, Any should not be used with instance
 607    checks.
 608    """
 609
 610    def __new__(cls, *args, **kwargs):
 611        if cls is Any:
 612            raise TypeError("Any cannot be instantiated")
 613        return super().__new__(cls)
 614
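    # The runtime behaviour of Any, in short (illustrative doctest):
    #
    #     >>> from typing import Any
    #     >>> isinstance(42, Any)
    #     Traceback (most recent call last):
    #         ...
    #     TypeError: typing.Any cannot be used with isinstance()
    #     >>> Any()
    #     Traceback (most recent call last):
    #         ...
    #     TypeError: Any cannot be instantiated
    #
    # Subclassing Any is allowed, which is useful for objects (such as mocks)
    # that static checkers should accept wherever any type is expected.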
 615
 616@_SpecialForm
 617def NoReturn(self, parameters):
 618    """Special type indicating functions that never return.
 619
 620    Example::
 621
 622        from typing import NoReturn
 623
 624        def stop() -> NoReturn:
 625            raise Exception('no way')
 626
 627    NoReturn can also be used as a bottom type, a type that
 628    has no values. Starting in Python 3.11, the Never type should
 629    be used for this concept instead. Type checkers should treat the two
 630    equivalently.
 631    """
 632    raise TypeError(f"{self} is not subscriptable")
 633
 634# This is semantically identical to NoReturn, but it is implemented
 635# separately so that type checkers can distinguish between the two
 636# if they want.
 637@_SpecialForm
 638def Never(self, parameters):
 639    """The bottom type, a type that has no members.
 640
 641    This can be used to define a function that should never be
 642    called, or a function that never returns::
 643
 644        from typing import Never
 645
 646        def never_call_me(arg: Never) -> None:
 647            pass
 648
 649        def int_or_str(arg: int | str) -> None:
 650            never_call_me(arg)  # type checker error
 651            match arg:
 652                case int():
 653                    print("It's an int")
 654                case str():
 655                    print("It's a str")
 656                case _:
 657                    never_call_me(arg)  # OK, arg is of type Never
 658    """
 659    raise TypeError(f"{self} is not subscriptable")
 660
 661
 662@_SpecialForm
 663def Self(self, parameters):
 664    """Used to spell the type of "self" in classes.
 665
 666    Example::
 667
 668        from typing import Self
 669
 670        class Foo:
 671            def return_self(self) -> Self:
 672                ...
 673                return self
 674
 675    This is especially useful for:
 676        - classmethods that are used as alternative constructors
 677        - annotating an `__enter__` method which returns self
 678    """
 679    raise TypeError(f"{self} is not subscriptable")
 680
 681
 682@_SpecialForm
 683def LiteralString(self, parameters):
 684    """Represents an arbitrary literal string.
 685
 686    Example::
 687
 688        from typing import LiteralString
 689
 690        def run_query(sql: LiteralString) -> None:
 691            ...
 692
 693        def caller(arbitrary_string: str, literal_string: LiteralString) -> None:
 694            run_query("SELECT * FROM students")  # OK
 695            run_query(literal_string)  # OK
 696            run_query("SELECT * FROM " + literal_string)  # OK
 697            run_query(arbitrary_string)  # type checker error
 698            run_query(  # type checker error
 699                f"SELECT * FROM students WHERE name = {arbitrary_string}"
 700            )
 701
 702    Only string literals and other LiteralStrings are compatible
 703    with LiteralString. This provides a tool to help prevent
 704    security issues such as SQL injection.
 705    """
 706    raise TypeError(f"{self} is not subscriptable")
 707
 708
 709@_SpecialForm
 710def ClassVar(self, parameters):
 711    """Special type construct to mark class variables.
 712
 713    An annotation wrapped in ClassVar indicates that a given
 714    attribute is intended to be used as a class variable and
 715    should not be set on instances of that class.
 716
 717    Usage::
 718
 719        class Starship:
 720            stats: ClassVar[dict[str, int]] = {} # class variable
 721            damage: int = 10                     # instance variable
 722
 723    ClassVar accepts only types and cannot be further subscripted.
 724
 725    Note that ClassVar is not a class itself, and should not
 726    be used with isinstance() or issubclass().
 727    """
 728    item = _type_check(parameters, f'{self} accepts only single type.', allow_special_forms=True)
 729    return _GenericAlias(self, (item,))
 730
 731@_SpecialForm
 732def Final(self, parameters):
 733    """Special typing construct to indicate final names to type checkers.
 734
 735    A final name cannot be re-assigned or overridden in a subclass.
 736
 737    For example::
 738
 739        MAX_SIZE: Final = 9000
 740        MAX_SIZE += 1  # Error reported by type checker
 741
 742        class Connection:
 743            TIMEOUT: Final[int] = 10
 744
 745        class FastConnector(Connection):
 746            TIMEOUT = 1  # Error reported by type checker
 747
 748    There is no runtime checking of these properties.
 749    """
 750    item = _type_check(parameters, f'{self} accepts only single type.', allow_special_forms=True)
 751    return _GenericAlias(self, (item,))
 752
 753@_SpecialForm
 754def Union(self, parameters):
 755    """Union type; Union[X, Y] means either X or Y.
 756
 757    On Python 3.10 and higher, the | operator
 758    can also be used to denote unions;
 759    X | Y means the same thing to the type checker as Union[X, Y].
 760
 761    To define a union, use e.g. Union[int, str]. Details:
 762    - The arguments must be types and there must be at least one.
 763    - None as an argument is a special case and is replaced by
 764      type(None).
 765    - Unions of unions are flattened, e.g.::
 766
 767        assert Union[Union[int, str], float] == Union[int, str, float]
 768
 769    - Unions of a single argument vanish, e.g.::
 770
 771        assert Union[int] == int  # The constructor actually returns int
 772
 773    - Redundant arguments are skipped, e.g.::
 774
 775        assert Union[int, str, int] == Union[int, str]
 776
 777    - When comparing unions, the argument order is ignored, e.g.::
 778
 779        assert Union[int, str] == Union[str, int]
 780
 781    - You cannot subclass or instantiate a union.
 782    - You can use Optional[X] as a shorthand for Union[X, None].
 783    """
 784    if parameters == ():
 785        raise TypeError("Cannot take a Union of no types.")
 786    if not isinstance(parameters, tuple):
 787        parameters = (parameters,)
 788    msg = "Union[arg, ...]: each arg must be a type."
 789    parameters = tuple(_type_check(p, msg) for p in parameters)
 790    parameters = _remove_dups_flatten(parameters)
 791    if len(parameters) == 1:
 792        return parameters[0]
 793    if len(parameters) == 2 and type(None) in parameters:
 794        return _UnionGenericAlias(self, parameters, name="Optional")
 795    return _UnionGenericAlias(self, parameters)
 796
 797def _make_union(left, right):
 798    """Used from the C implementation of TypeVar.
 799
 800    TypeVar.__or__ calls this instead of returning types.UnionType
 801    because we want to allow unions between TypeVars and strings
 802    (forward references).
 803    """
 804    return Union[left, right]
 805
 806@_SpecialForm
 807def Optional(self, parameters):
 808    """Optional[X] is equivalent to Union[X, None]."""
 809    arg = _type_check(parameters, f"{self} requires a single type.")
 810    return Union[arg, type(None)]
 811
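    # How Optional and Union relate at runtime (illustrative doctest, assuming
    # both names are imported from typing):
    #
    #     >>> Optional[int] == Union[int, None]
    #     True
    #     >>> Union[int, str] == Union[str, int]   # argument order is ignored
    #     True
    #     >>> Union[int] is int                    # single-argument unions vanish
    #     True
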
 812@_TypedCacheSpecialForm
 813@_tp_cache(typed=True)
 814def Literal(self, *parameters):
 815    """Special typing form to define literal types (a.k.a. value types).
 816
 817    This form can be used to indicate to type checkers that the corresponding
 818    variable or function parameter has a value equivalent to the provided
 819    literal (or one of several literals)::
 820
 821        def validate_simple(data: Any) -> Literal[True]:  # always returns True
 822            ...
 823
 824        MODE = Literal['r', 'rb', 'w', 'wb']
 825        def open_helper(file: str, mode: MODE) -> str:
 826            ...
 827
 828        open_helper('/some/path', 'r')  # Passes type check
 829        open_helper('/other/path', 'typo')  # Error in type checker
 830
 831    Literal[...] cannot be subclassed. At runtime, an arbitrary value
 832    is allowed as type argument to Literal[...], but type checkers may
 833    impose restrictions.
 834    """
 835    # There is no '_type_check' call because arguments to Literal[...] are
 836    # values, not types.
 837    parameters = _flatten_literal_params(parameters)
 838
 839    try:
 840        parameters = tuple(p for p, _ in _deduplicate(list(_value_and_type_iter(parameters))))
 841    except TypeError:  # unhashable parameters
 842        pass
 843
 844    return _LiteralGenericAlias(self, parameters)
 845
 846
 847@_SpecialForm
 848def TypeAlias(self, parameters):
 849    """Special form for marking type aliases.
 850
 851    Use TypeAlias to indicate that an assignment should
 852    be recognized as a proper type alias definition by type
 853    checkers.
 854
 855    For example::
 856
 857        Predicate: TypeAlias = Callable[..., bool]
 858
 859    It's invalid when used anywhere except as in the example above.
 860    """
 861    raise TypeError(f"{self} is not subscriptable")
 862
 863
 864@_SpecialForm
 865def Concatenate(self, parameters):
 866    """Special form for annotating higher-order functions.
 867
 868    ``Concatenate`` can be used in conjunction with ``ParamSpec`` and
 869    ``Callable`` to represent a higher-order function which adds, removes or
 870    transforms the parameters of a callable.
 871
 872    For example::
 873
 874        Callable[Concatenate[int, P], int]
 875
 876    See PEP 612 for detailed information.
 877    """
 878    if parameters == ():
 879        raise TypeError("Cannot take a Concatenate of no types.")
 880    if not isinstance(parameters, tuple):
 881        parameters = (parameters,)
 882    if not (parameters[-1] is ... or isinstance(parameters[-1], ParamSpec)):
 883        raise TypeError("The last parameter to Concatenate should be a "
 884                        "ParamSpec variable or ellipsis.")
 885    msg = "Concatenate[arg, ...]: each arg must be a type."
 886    parameters = (*(_type_check(p, msg) for p in parameters[:-1]), parameters[-1])
 887    return _ConcatenateGenericAlias(self, parameters)
 888
 889
 890@_SpecialForm
 891def TypeGuard(self, parameters):
 892    """Special typing construct for marking user-defined type predicate functions.
 893
 894    ``TypeGuard`` can be used to annotate the return type of a user-defined
 895    type predicate function.  ``TypeGuard`` only accepts a single type argument.
 896    At runtime, functions marked this way should return a boolean.
 897
 898    ``TypeGuard`` aims to benefit *type narrowing* -- a technique used by static
 899    type checkers to determine a more precise type of an expression within a
 900    program's code flow.  Usually type narrowing is done by analyzing
 901    conditional code flow and applying the narrowing to a block of code.  The
 902    conditional expression here is sometimes referred to as a "type predicate".
 903
 904    Sometimes it would be convenient to use a user-defined boolean function
 905    as a type predicate.  Such a function should use ``TypeGuard[...]`` or
 906    ``TypeIs[...]`` as its return type to alert static type checkers to
 907    this intention. ``TypeGuard`` should be used over ``TypeIs`` when narrowing
 908    from an incompatible type (e.g., ``list[object]`` to ``list[int]``) or when
 909    the function does not return ``True`` for all instances of the narrowed type.
 910
 911    Using ``-> TypeGuard[NarrowedType]`` tells the static type checker that
 912    for a given function:
 913
 914    1. The return value is a boolean.
 915    2. If the return value is ``True``, the type of its argument
 916       is ``NarrowedType``.
 917
 918    For example::
 919
 920         def is_str_list(val: list[object]) -> TypeGuard[list[str]]:
 921             '''Determines whether all objects in the list are strings'''
 922             return all(isinstance(x, str) for x in val)
 923
 924         def func1(val: list[object]):
 925             if is_str_list(val):
 926                 # Type of ``val`` is narrowed to ``list[str]``.
 927                 print(" ".join(val))
 928             else:
 929                 # Type of ``val`` remains as ``list[object]``.
 930                 print("Not a list of strings!")
 931
 932    Strict type narrowing is not enforced -- ``NarrowedType`` need not be a
 933    narrower form of the argument's declared type (it can even be a wider form),
 934    and this may lead to type-unsafe results.  The main reason is to allow for
 935    things like narrowing ``list[object]`` to ``list[str]`` even though the
 936    latter is not a subtype of the former, since ``list`` is invariant.  The
 937    responsibility of writing type-safe type predicates is left to the user.
 938
 939    ``TypeGuard`` also works with type variables.  For more information, see
 940    PEP 647 (User-Defined Type Guards).
 941    """
 942    item = _type_check(parameters, f'{self} accepts only single type.')
 943    return _GenericAlias(self, (item,))
 944
 945
 946@_SpecialForm
 947def TypeIs(self, parameters):
 948    """Special typing construct for marking user-defined type predicate functions.
 949
 950    ``TypeIs`` can be used to annotate the return type of a user-defined
 951    type predicate function.  ``TypeIs`` only accepts a single type argument.
 952    At runtime, functions marked this way should return a boolean and accept
 953    at least one argument.
 954
 955    ``TypeIs`` aims to benefit *type narrowing* -- a technique used by static
 956    type checkers to determine a more precise type of an expression within a
 957    program's code flow.  Usually type narrowing is done by analyzing
 958    conditional code flow and applying the narrowing to a block of code.  The
 959    conditional expression here is sometimes referred to as a "type predicate".
 960
 961    Sometimes it would be convenient to use a user-defined boolean function
 962    as a type predicate.  Such a function should use ``TypeIs[...]`` or
 963    ``TypeGuard[...]`` as its return type to alert static type checkers to
 964    this intention.  ``TypeIs`` usually has more intuitive behavior than
 965    ``TypeGuard``, but it cannot be used when the input and output types
 966    are incompatible (e.g., ``list[object]`` to ``list[int]``) or when the
 967    function does not return ``True`` for all instances of the narrowed type.
 968
 969    Using ``-> TypeIs[NarrowedType]`` tells the static type checker that for
 970    a given function:
 971
 972    1. The return value is a boolean.
 973    2. If the return value is ``True``, the type of its argument
 974       is the intersection of the argument's original type and
 975       ``NarrowedType``.
 976    3. If the return value is ``False``, the type of its argument
 977       is narrowed to exclude ``NarrowedType``.
 978
 979    For example::
 980
 981        from typing import assert_type, final, TypeIs
 982
 983        class Parent: pass
 984        class Child(Parent): pass
 985        @final
 986        class Unrelated: pass
 987
 988        def is_parent(val: object) -> TypeIs[Parent]:
 989            return isinstance(val, Parent)
 990
 991        def run(arg: Child | Unrelated):
 992            if is_parent(arg):
 993                # Type of ``arg`` is narrowed to the intersection
 994                # of ``Parent`` and ``Child``, which is equivalent to
 995                # ``Child``.
 996                assert_type(arg, Child)
 997            else:
 998                # Type of ``arg`` is narrowed to exclude ``Parent``,
 999                # so only ``Unrelated`` is left.
1000                assert_type(arg, Unrelated)
1001
1002    The type inside ``TypeIs`` must be consistent with the type of the
1003    function's argument; if it is not, static type checkers will raise
1004    an error.  An incorrectly written ``TypeIs`` function can lead to
1005    unsound behavior in the type system; it is the user's responsibility
1006    to write such functions in a type-safe manner.
1007
1008    ``TypeIs`` also works with type variables.  For more information, see
1009    PEP 742 (Narrowing types with ``TypeIs``).
1010    """
1011    item = _type_check(parameters, f'{self} accepts only single type.')
1012    return _GenericAlias(self, (item,))
1013
1014
1015class ForwardRef(_Final, _root=True):
1016    """Internal wrapper to hold a forward reference."""
1017
1018    __slots__ = ('__forward_arg__', '__forward_code__',
1019                 '__forward_evaluated__', '__forward_value__',
1020                 '__forward_is_argument__', '__forward_is_class__',
1021                 '__forward_module__')
1022
1023    def __init__(self, arg, is_argument=True, module=None, *, is_class=False):
1024        if not isinstance(arg, str):
1025            raise TypeError(f"Forward reference must be a string -- got {arg!r}")
1026
1027        # If we do `def f(*args: *Ts)`, then we'll have `arg = '*Ts'`.
1028        # Unfortunately, this isn't a valid expression on its own, so we
1029        # do the unpacking manually.
1030        if arg.startswith('*'):
1031            arg_to_compile = f'({arg},)[0]'  # E.g. (*Ts,)[0] or (*tuple[int, int],)[0]
1032        else:
1033            arg_to_compile = arg
1034        try:
1035            code = compile(arg_to_compile, '<string>', 'eval')
1036        except SyntaxError:
1037            raise SyntaxError(f"Forward reference must be an expression -- got {arg!r}")
1038
1039        self.__forward_arg__ = arg
1040        self.__forward_code__ = code
1041        self.__forward_evaluated__ = False
1042        self.__forward_value__ = None
1043        self.__forward_is_argument__ = is_argument
1044        self.__forward_is_class__ = is_class
1045        self.__forward_module__ = module
1046
1047    def _evaluate(self, globalns, localns, type_params=_sentinel, *, recursive_guard):
1048        if type_params is _sentinel:
1049            _deprecation_warning_for_no_type_params_passed("typing.ForwardRef._evaluate")
1050            type_params = ()
1051        if self.__forward_arg__ in recursive_guard:
1052            return self
1053        if not self.__forward_evaluated__ or localns is not globalns:
1054            if globalns is None and localns is None:
1055                globalns = localns = {}
1056            elif globalns is None:
1057                globalns = localns
1058            elif localns is None:
1059                localns = globalns
1060            if self.__forward_module__ is not None:
1061                globalns = getattr(
1062                    sys.modules.get(self.__forward_module__, None), '__dict__', globalns
1063                )
1064
1065            # type parameters require some special handling,
1066            # as they exist in their own scope
1067            # but `eval()` does not have a dedicated parameter for that scope.
1068            # For classes, names in type parameter scopes should override
1069            # names in the global scope (which here are called `localns`!),
1070            # but should in turn be overridden by names in the class scope
1071            # (which here are called `globalns`!)
1072            if type_params:
1073                globalns, localns = dict(globalns), dict(localns)
1074                for param in type_params:
1075                    param_name = param.__name__
1076                    if not self.__forward_is_class__ or param_name not in globalns:
1077                        globalns[param_name] = param
1078                        localns.pop(param_name, None)
1079
1080            type_ = _type_check(
1081                eval(self.__forward_code__, globalns, localns),
1082                "Forward references must evaluate to types.",
1083                is_argument=self.__forward_is_argument__,
1084                allow_special_forms=self.__forward_is_class__,
1085            )
1086            self.__forward_value__ = _eval_type(
1087                type_,
1088                globalns,
1089                localns,
1090                type_params,
1091                recursive_guard=(recursive_guard | {self.__forward_arg__}),
1092            )
1093            self.__forward_evaluated__ = True
1094        return self.__forward_value__
1095
1096    def __eq__(self, other):
1097        if not isinstance(other, ForwardRef):
1098            return NotImplemented
1099        if self.__forward_evaluated__ and other.__forward_evaluated__:
1100            return (self.__forward_arg__ == other.__forward_arg__ and
1101                    self.__forward_value__ == other.__forward_value__)
1102        return (self.__forward_arg__ == other.__forward_arg__ and
1103                self.__forward_module__ == other.__forward_module__)
1104
1105    def __hash__(self):
1106        return hash((self.__forward_arg__, self.__forward_module__))
1107
1108    def __or__(self, other):
1109        return Union[self, other]
1110
1111    def __ror__(self, other):
1112        return Union[other, self]
1113
1114    def __repr__(self):
1115        if self.__forward_module__ is None:
1116            module_repr = ''
1117        else:
1118            module_repr = f', module={self.__forward_module__!r}'
1119        return f'ForwardRef({self.__forward_arg__!r}{module_repr})'
1120
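    # Forward references in practice: string annotations are wrapped in
    # ForwardRef and resolved lazily, typically via get_type_hints()
    # (illustrative doctest; ``f`` is a made-up function):
    #
    #     >>> from typing import get_type_hints
    #     >>> def f(x: 'int') -> 'str': ...
    #     >>> f.__annotations__            # still plain strings at this point
    #     {'x': 'int', 'return': 'str'}
    #     >>> get_type_hints(f)            # the ForwardRefs evaluate to real types
    #     {'x': <class 'int'>, 'return': <class 'str'>}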
1121
1122def _is_unpacked_typevartuple(x: Any) -> bool:
1123    return ((not isinstance(x, type)) and
1124            getattr(x, '__typing_is_unpacked_typevartuple__', False))
1125
1126
1127def _is_typevar_like(x: Any) -> bool:
1128    return isinstance(x, (TypeVar, ParamSpec)) or _is_unpacked_typevartuple(x)
1129
1130
1131def _typevar_subst(self, arg):
1132    msg = "Parameters to generic types must be types."
1133    arg = _type_check(arg, msg, is_argument=True)
1134    if ((isinstance(arg, _GenericAlias) and arg.__origin__ is Unpack) or
1135        (isinstance(arg, GenericAlias) and getattr(arg, '__unpacked__', False))):
1136        raise TypeError(f"{arg} is not valid as type argument")
1137    return arg
1138
1139
1140def _typevartuple_prepare_subst(self, alias, args):
1141    params = alias.__parameters__
1142    typevartuple_index = params.index(self)
1143    for param in params[typevartuple_index + 1:]:
1144        if isinstance(param, TypeVarTuple):
1145            raise TypeError(f"More than one TypeVarTuple parameter in {alias}")
1146
1147    alen = len(args)
1148    plen = len(params)
1149    left = typevartuple_index
1150    right = plen - typevartuple_index - 1
1151    var_tuple_index = None
1152    fillarg = None
1153    for k, arg in enumerate(args):
1154        if not isinstance(arg, type):
1155            subargs = getattr(arg, '__typing_unpacked_tuple_args__', None)
1156            if subargs and len(subargs) == 2 and subargs[-1] is ...:
1157                if var_tuple_index is not None:
1158                    raise TypeError("More than one unpacked arbitrary-length tuple argument")
1159                var_tuple_index = k
1160                fillarg = subargs[0]
1161    if var_tuple_index is not None:
1162        left = min(left, var_tuple_index)
1163        right = min(right, alen - var_tuple_index - 1)
1164    elif left + right > alen:
1165        raise TypeError(f"Too few arguments for {alias};"
1166                        f" actual {alen}, expected at least {plen-1}")
1167    if left == alen - right and self.has_default():
1168        replacement = _unpack_args(self.__default__)
1169    else:
1170        replacement = args[left: alen - right]
1171
1172    return (
1173        *args[:left],
1174        *([fillarg]*(typevartuple_index - left)),
1175        replacement,
1176        *([fillarg]*(plen - right - left - typevartuple_index - 1)),
1177        *args[alen - right:],
1178    )
1179
1180
1181def _paramspec_subst(self, arg):
1182    if isinstance(arg, (list, tuple)):
1183        arg = tuple(_type_check(a, "Expected a type.") for a in arg)
1184    elif not _is_param_expr(arg):
1185        raise TypeError(f"Expected a list of types, an ellipsis, "
1186                        f"ParamSpec, or Concatenate. Got {arg}")
1187    return arg
1188
1189
1190def _paramspec_prepare_subst(self, alias, args):
1191    params = alias.__parameters__
1192    i = params.index(self)
1193    if i == len(args) and self.has_default():
1194        args = (*args, self.__default__)
1195    if i >= len(args):
1196        raise TypeError(f"Too few arguments for {alias}")
1197    # Special case where Z[[int, str, bool]] == Z[int, str, bool] in PEP 612.
1198    if len(params) == 1 and not _is_param_expr(args[0]):
1199        assert i == 0
1200        args = (args,)
1201    # Convert lists to tuples to help other libraries cache the results.
1202    elif isinstance(args[i], list):
1203        args = (*args[:i], tuple(args[i]), *args[i+1:])
1204    return args
1205
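    # A sketch of the PEP 612 special case mentioned above (``Z`` is a
    # hypothetical class, assuming ParamSpec and Generic are imported from typing):
    #
    #     >>> P = ParamSpec('P')
    #     >>> class Z(Generic[P]): ...
    #     >>> Z[int, str] == Z[[int, str]]   # bare arguments are wrapped into one list
    #     True
    #     >>> Z[int, str].__args__
    #     ((<class 'int'>, <class 'str'>),)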
1206
1207@_tp_cache
1208def _generic_class_getitem(cls, args):
1209    """Parameterizes a generic class.
1210
1211    At least, parameterizing a generic class is the *main* thing this method
1212    does. For example, for some generic class `Foo`, this is called when we
1213    do `Foo[int]` - there, with `cls=Foo` and `args=int`.
1214
1215    However, note that this method is also called when defining generic
1216    classes in the first place with `class Foo(Generic[T]): ...`.
1217    """
1218    if not isinstance(args, tuple):
1219        args = (args,)
1220
1221    args = tuple(_type_convert(p) for p in args)
1222    is_generic_or_protocol = cls in (Generic, Protocol)
1223
1224    if is_generic_or_protocol:
1225        # Generic and Protocol can only be subscripted with unique type variables.
1226        if not args:
1227            raise TypeError(
1228                f"Parameter list to {cls.__qualname__}[...] cannot be empty"
1229            )
1230        if not all(_is_typevar_like(p) for p in args):
1231            raise TypeError(
1232                f"Parameters to {cls.__name__}[...] must all be type variables "
1233                f"or parameter specification variables.")
1234        if len(set(args)) != len(args):
1235            raise TypeError(
1236                f"Parameters to {cls.__name__}[...] must all be unique")
1237    else:
1238        # Subscripting a regular Generic subclass.
1239        try:
1240            parameters = cls.__parameters__
1241        except AttributeError as e:
1242            init_subclass = getattr(cls, '__init_subclass__', None)
1243            if init_subclass not in {None, Generic.__init_subclass__}:
1244                e.add_note(
1245                    f"Note: this exception may have been caused by "
1246                    f"{init_subclass.__qualname__!r} (or the "
1247                    f"'__init_subclass__' method on a superclass) not "
1248                    f"calling 'super().__init_subclass__()'"
1249                )
1250            raise
1251        for param in parameters:
1252            prepare = getattr(param, '__typing_prepare_subst__', None)
1253            if prepare is not None:
1254                args = prepare(cls, args)
1255        _check_generic_specialization(cls, args)
1256
1257        new_args = []
1258        for param, new_arg in zip(parameters, args):
1259            if isinstance(param, TypeVarTuple):
1260                new_args.extend(new_arg)
1261            else:
1262                new_args.append(new_arg)
1263        args = tuple(new_args)
1264
1265    return _GenericAlias(cls, args)
1266
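    # What parameterizing a user-defined generic class produces (``Box`` is a
    # hypothetical example, assuming TypeVar and Generic are imported from typing):
    #
    #     >>> T = TypeVar('T')
    #     >>> class Box(Generic[T]): ...
    #     >>> Box[int].__origin__ is Box
    #     True
    #     >>> Box[int].__args__
    #     (<class 'int'>,)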
1267
1268def _generic_init_subclass(cls, *args, **kwargs):
1269    super(Generic, cls).__init_subclass__(*args, **kwargs)
1270    tvars = []
1271    if '__orig_bases__' in cls.__dict__:
1272        error = Generic in cls.__orig_bases__
1273    else:
1274        error = (Generic in cls.__bases__ and
1275                    cls.__name__ != 'Protocol' and
1276                    type(cls) != _TypedDictMeta)
1277    if error:
1278        raise TypeError("Cannot inherit from plain Generic")
1279    if '__orig_bases__' in cls.__dict__:
1280        tvars = _collect_type_parameters(cls.__orig_bases__)
1281        # Look for Generic[T1, ..., Tn].
1282        # If found, tvars must be a subset of it.
1283        # If not found, tvars is it.
1284        # Also check for and reject plain Generic,
1285        # and reject multiple Generic[...].
1286        gvars = None
1287        for base in cls.__orig_bases__:
1288            if (isinstance(base, _GenericAlias) and
1289                    base.__origin__ is Generic):
1290                if gvars is not None:
1291                    raise TypeError(
1292                        "Cannot inherit from Generic[...] multiple times.")
1293                gvars = base.__parameters__
1294        if gvars is not None:
1295            tvarset = set(tvars)
1296            gvarset = set(gvars)
1297            if not tvarset <= gvarset:
1298                s_vars = ', '.join(str(t) for t in tvars if t not in gvarset)
1299                s_args = ', '.join(str(g) for g in gvars)
1300                raise TypeError(f"Some type variables ({s_vars}) are"
1301                                f" not listed in Generic[{s_args}]")
1302            tvars = gvars
1303    cls.__parameters__ = tuple(tvars)
1304
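    # The net effect of the bookkeeping above, on a couple of made-up classes
    # (assuming TypeVar, Generic and Mapping are imported from typing):
    #
    #     >>> T1, T2 = TypeVar('T1'), TypeVar('T2')
    #     >>> class A(Generic[T1, T2]): ...
    #     >>> A.__parameters__
    #     (~T1, ~T2)
    #     >>> class B(Mapping[T2, int], Generic[T2, T1]): ...
    #     >>> B.__parameters__     # the order comes from the explicit Generic[...]
    #     (~T2, ~T1)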
1305
1306def _is_dunder(attr):
1307    return attr.startswith('__') and attr.endswith('__')
1308
1309class _BaseGenericAlias(_Final, _root=True):
1310    """The central part of the internal API.
1311
1312    This represents a generic version of type 'origin' with type arguments 'params'.
1313    There are two kinds of these aliases: user-defined and special. The special ones
1314    are wrappers around builtin collections and ABCs in collections.abc. These must
1315    have 'name' always set. If 'inst' is False, then the alias can't be instantiated;
1316    this is used by e.g. typing.List and typing.Dict.
1317    """
1318
1319    def __init__(self, origin, *, inst=True, name=None):
1320        self._inst = inst
1321        self._name = name
1322        self.__origin__ = origin
1323        self.__slots__ = None  # This is not documented.
1324
1325    def __call__(self, *args, **kwargs):
1326        if not self._inst:
1327            raise TypeError(f"Type {self._name} cannot be instantiated; "
1328                            f"use {self.__origin__.__name__}() instead")
1329        result = self.__origin__(*args, **kwargs)
1330        try:
1331            result.__orig_class__ = self
1332        # Some objects raise TypeError (or something even more exotic)
1333        # if you try to set attributes on them; we guard against that here
1334        except Exception:
1335            pass
1336        return result
1337
1338    def __mro_entries__(self, bases):
1339        res = []
1340        if self.__origin__ not in bases:
1341            res.append(self.__origin__)
1342
1343        # Check if any base that occurs after us in `bases` is either itself a
1344        # subclass of Generic, or something which will add a subclass of Generic
1345        # to `__bases__` via its `__mro_entries__`. If not, add Generic
1346        # ourselves. The goal is to ensure that Generic (or a subclass) will
1347        # appear exactly once in the final bases tuple. If we let it appear
1348        # multiple times, we risk "can't form a consistent MRO" errors.
1349        i = bases.index(self)
1350        for b in bases[i+1:]:
1351            if isinstance(b, _BaseGenericAlias):
1352                break
1353            if not isinstance(b, type):
1354                meth = getattr(b, "__mro_entries__", None)
1355                new_bases = meth(bases) if meth else None
1356                if (
1357                    isinstance(new_bases, tuple) and
1358                    any(
1359                        isinstance(b2, type) and issubclass(b2, Generic)
1360                        for b2 in new_bases
1361                    )
1362                ):
1363                    break
1364            elif issubclass(b, Generic):
1365                break
1366        else:
1367            res.append(Generic)
1368        return tuple(res)
1369
1370    def __getattr__(self, attr):
1371        if attr in {'__name__', '__qualname__'}:
1372            return self._name or self.__origin__.__name__
1373
1374        # Care is needed here so that copy and pickle keep working.
1375        # Also, for simplicity, we don't relay any dunder names.
1376        if '__origin__' in self.__dict__ and not _is_dunder(attr):
1377            return getattr(self.__origin__, attr)
1378        raise AttributeError(attr)
1379
1380    def __setattr__(self, attr, val):
1381        if _is_dunder(attr) or attr in {'_name', '_inst', '_nparams', '_defaults'}:
1382            super().__setattr__(attr, val)
1383        else:
1384            setattr(self.__origin__, attr, val)
1385
1386    def __instancecheck__(self, obj):
1387        return self.__subclasscheck__(type(obj))
1388
1389    def __subclasscheck__(self, cls):
1390        raise TypeError("Subscripted generics cannot be used with"
1391                        " class and instance checks")
1392
1393    def __dir__(self):
1394        return list(set(super().__dir__()
1395                + [attr for attr in dir(self.__origin__) if not _is_dunder(attr)]))
1396
1397
1398# Special typing constructs Union, Optional, Generic, Callable and Tuple
1399# use three special attributes for internal bookkeeping of generic types:
1400# * __parameters__ is a tuple of unique free type parameters of a generic
1401#   type, for example, Dict[T, T].__parameters__ == (T,);
1402# * __origin__ keeps a reference to a type that was subscripted,
1403#   e.g., Union[T, int].__origin__ == Union, or the non-generic version of
1404#   the type.
1405# * __args__ is a tuple of all arguments used in subscripting,
1406#   e.g., Dict[T, int].__args__ == (T, int).
1407
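    # For example (doctest-style, using the deprecated but still functional
    # typing.Dict alias and an imported TypeVar):
    #
    #     >>> T = TypeVar('T')
    #     >>> Dict[T, int].__origin__ is dict
    #     True
    #     >>> Dict[T, int].__parameters__
    #     (~T,)
    #     >>> Dict[T, int][str] == Dict[str, int]   # substituting the free parameter
    #     True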
1408
1409class _GenericAlias(_BaseGenericAlias, _root=True):
1410    # The type of parameterized generics.
1411    #
1412    # That is, for example, `type(List[int])` is `_GenericAlias`.
1413    #
1414    # Objects which are instances of this class include:
1415    # * Parameterized container types, e.g. `Tuple[int]`, `List[int]`.
1416    #  * Note that native container types, e.g. `tuple`, `list`, use
1417    #    `types.GenericAlias` instead.
1418    # * Parameterized classes:
1419    #     class C[T]: pass
1420    #     # C[int] is a _GenericAlias
1421    # * `Callable` aliases, generic `Callable` aliases, and
1422    #   parameterized `Callable` aliases:
1423    #     T = TypeVar('T')
1424    #     # _CallableGenericAlias inherits from _GenericAlias.
1425    #     A = Callable[[], None]  # _CallableGenericAlias
1426    #     B = Callable[[T], None]  # _CallableGenericAlias
1427    #     C = B[int]  # _CallableGenericAlias
1428    # * Parameterized `Final`, `ClassVar`, `TypeGuard`, and `TypeIs`:
1429    #     # All _GenericAlias
1430    #     Final[int]
1431    #     ClassVar[float]
1432    #     TypeGuard[bool]
1433    #     TypeIs[range]
1434
1435    def __init__(self, origin, args, *, inst=True, name=None):
1436        super().__init__(origin, inst=inst, name=name)
1437        if not isinstance(args, tuple):
1438            args = (args,)
1439        self.__args__ = tuple(... if a is _TypingEllipsis else
1440                              a for a in args)
1441        enforce_default_ordering = origin in (Generic, Protocol)
1442        self.__parameters__ = _collect_type_parameters(
1443            args,
1444            enforce_default_ordering=enforce_default_ordering,
1445        )
1446        if not name:
1447            self.__module__ = origin.__module__
1448
1449    def __eq__(self, other):
1450        if not isinstance(other, _GenericAlias):
1451            return NotImplemented
1452        return (self.__origin__ == other.__origin__
1453                and self.__args__ == other.__args__)
1454
1455    def __hash__(self):
1456        return hash((self.__origin__, self.__args__))
1457
1458    def __or__(self, right):
1459        return Union[self, right]
1460
1461    def __ror__(self, left):
1462        return Union[left, self]
1463
1464    @_tp_cache
1465    def __getitem__(self, args):
1466        # Parameterizes an already-parameterized object.
1467        #
1468        # For example, we arrive here doing something like:
1469        #   T1 = TypeVar('T1')
1470        #   T2 = TypeVar('T2')
1471        #   T3 = TypeVar('T3')
1472        #   class A(Generic[T1]): pass
1473        #   B = A[T2]  # B is a _GenericAlias
1474        #   C = B[T3]  # Invokes _GenericAlias.__getitem__
1475        #
1476        # We also arrive here when parameterizing a generic `Callable` alias:
1477        #   T = TypeVar('T')
1478        #   C = Callable[[T], None]
1479        #   C[int]  # Invokes _GenericAlias.__getitem__
1480
1481        if self.__origin__ in (Generic, Protocol):
1482            # Can't subscript Generic[...] or Protocol[...].
1483            raise TypeError(f"Cannot subscript already-subscripted {self}")
1484        if not self.__parameters__:
1485            raise TypeError(f"{self} is not a generic class")
1486
1487        # Preprocess `args`.
1488        if not isinstance(args, tuple):
1489            args = (args,)
1490        args = _unpack_args(*(_type_convert(p) for p in args))
1491        new_args = self._determine_new_args(args)
1492        r = self.copy_with(new_args)
1493        return r
1494
1495    def _determine_new_args(self, args):
1496        # Determines new __args__ for __getitem__.
1497        #
1498        # For example, suppose we had:
1499        #   T1 = TypeVar('T1')
1500        #   T2 = TypeVar('T2')
1501        #   class A(Generic[T1, T2]): pass
1502        #   T3 = TypeVar('T3')
1503        #   B = A[int, T3]
1504        #   C = B[str]
1505        # `B.__args__` is `(int, T3)`, so `C.__args__` should be `(int, str)`.
1506        # Unfortunately, this is harder than it looks, because if `T3` is
1507        # anything more exotic than a plain `TypeVar`, we need to consider
1508        # edge cases.
1509
1510        params = self.__parameters__
1511        # In the example above, new_arg_by_param (built below) would be {T3: str}
1512        for param in params:
1513            prepare = getattr(param, '__typing_prepare_subst__', None)
1514            if prepare is not None:
1515                args = prepare(self, args)
1516        alen = len(args)
1517        plen = len(params)
1518        if alen != plen:
1519            raise TypeError(f"Too {'many' if alen > plen else 'few'} arguments for {self};"
1520                            f" actual {alen}, expected {plen}")
1521        new_arg_by_param = dict(zip(params, args))
1522        return tuple(self._make_substitution(self.__args__, new_arg_by_param))
1523
1524    def _make_substitution(self, args, new_arg_by_param):
1525        """Create a list of new type arguments."""
1526        new_args = []
1527        for old_arg in args:
1528            if isinstance(old_arg, type):
1529                new_args.append(old_arg)
1530                continue
1531
1532            substfunc = getattr(old_arg, '__typing_subst__', None)
1533            if substfunc:
1534                new_arg = substfunc(new_arg_by_param[old_arg])
1535            else:
1536                subparams = getattr(old_arg, '__parameters__', ())
1537                if not subparams:
1538                    new_arg = old_arg
1539                else:
1540                    subargs = []
1541                    for x in subparams:
1542                        if isinstance(x, TypeVarTuple):
1543                            subargs.extend(new_arg_by_param[x])
1544                        else:
1545                            subargs.append(new_arg_by_param[x])
1546                    new_arg = old_arg[tuple(subargs)]
1547
1548            if self.__origin__ == collections.abc.Callable and isinstance(new_arg, tuple):
1549                # Consider the following `Callable`.
1550                #   C = Callable[[int], str]
1551                # Here, `C.__args__` should be (int, str) - NOT ([int], str).
1552                # That means that if we had something like...
1553                #   P = ParamSpec('P')
1554                #   T = TypeVar('T')
1555                #   C = Callable[P, T]
1556                #   D = C[[int, str], float]
1557                # ...we need to be careful; `new_args` should end up as
1558                # `(int, str, float)` rather than `([int, str], float)`.
1559                new_args.extend(new_arg)
1560            elif _is_unpacked_typevartuple(old_arg):
1561                # Consider the following `_GenericAlias`, `B`:
1562                #   class A(Generic[*Ts]): ...
1563                #   B = A[T, *Ts]
1564                # If we then do:
1565                #   B[float, int, str]
1566                # The `new_arg` corresponding to `T` will be `float`, and the
1567                # `new_arg` corresponding to `*Ts` will be `(int, str)`. We
1568                # should join all these types together in a flat list
1569                # `(float, int, str)` - so again, we should `extend`.
1570                new_args.extend(new_arg)
1571            elif isinstance(old_arg, tuple):
1572                # Corner case:
1573                #    P = ParamSpec('P')
1574                #    T = TypeVar('T')
1575                #    class Base(Generic[P]): ...
1576                # Can be substituted like this:
1577                #    X = Base[[int, T]]
1578                # In this case, `old_arg` will be a tuple:
1579                new_args.append(
1580                    tuple(self._make_substitution(old_arg, new_arg_by_param)),
1581                )
1582            else:
1583                new_args.append(new_arg)
1584        return new_args
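
    # A small worked example of the substitution performed by __getitem__,
    # _determine_new_args and _make_substitution above (an informal sketch;
    # the names T1, T2, T3, A, B, P, T and C are illustrative only):
    #
    #     T1, T2, T3 = TypeVar('T1'), TypeVar('T2'), TypeVar('T3')
    #     class A(Generic[T1, T2]): pass
    #     B = A[int, T3]
    #     B[str].__args__                # (<class 'int'>, <class 'str'>)
    #
    #     P = ParamSpec('P'); T = TypeVar('T')
    #     C = Callable[P, T]
    #     C[[int, str], float].__args__  # (<class 'int'>, <class 'str'>, <class 'float'>)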
1585
1586    def copy_with(self, args):
1587        return self.__class__(self.__origin__, args, name=self._name, inst=self._inst)
1588
1589    def __repr__(self):
1590        if self._name:
1591            name = 'typing.' + self._name
1592        else:
1593            name = _type_repr(self.__origin__)
1594        if self.__args__:
1595            args = ", ".join([_type_repr(a) for a in self.__args__])
1596        else:
1597            # To ensure the repr is eval-able.
1598            args = "()"
1599        return f'{name}[{args}]'
1600
1601    def __reduce__(self):
1602        if self._name:
1603            origin = globals()[self._name]
1604        else:
1605            origin = self.__origin__
1606        args = tuple(self.__args__)
1607        if len(args) == 1 and not isinstance(args[0], tuple):
1608            args, = args
1609        return operator.getitem, (origin, args)
1610
1611    def __mro_entries__(self, bases):
1612        if isinstance(self.__origin__, _SpecialForm):
1613            raise TypeError(f"Cannot subclass {self!r}")
1614
1615        if self._name:  # generic version of an ABC or built-in class
1616            return super().__mro_entries__(bases)
1617        if self.__origin__ is Generic:
1618            if Protocol in bases:
1619                return ()
1620            i = bases.index(self)
1621            for b in bases[i+1:]:
1622                if isinstance(b, _BaseGenericAlias) and b is not self:
1623                    return ()
1624        return (self.__origin__,)
1625
1626    def __iter__(self):
1627        yield Unpack[self]
1628
1629
1630# _nparams is the number of accepted parameters, e.g. 0 for Hashable,
1631# 1 for List and 2 for Dict.  It may be -1 if a variable number of
1632# parameters is accepted (which needs a custom __getitem__).
1633
1634class _SpecialGenericAlias(_NotIterable, _BaseGenericAlias, _root=True):
1635    def __init__(self, origin, nparams, *, inst=True, name=None, defaults=()):
1636        if name is None:
1637            name = origin.__name__
1638        super().__init__(origin, inst=inst, name=name)
1639        self._nparams = nparams
1640        self._defaults = defaults
1641        if origin.__module__ == 'builtins':
1642            self.__doc__ = f'A generic version of {origin.__qualname__}.'
1643        else:
1644            self.__doc__ = f'A generic version of {origin.__module__}.{origin.__qualname__}.'
1645
1646    @_tp_cache
1647    def __getitem__(self, params):
1648        if not isinstance(params, tuple):
1649            params = (params,)
1650        msg = "Parameters to generic types must be types."
1651        params = tuple(_type_check(p, msg) for p in params)
1652        if (self._defaults
1653            and len(params) < self._nparams
1654            and len(params) + len(self._defaults) >= self._nparams
1655        ):
1656            params = (*params, *self._defaults[len(params) - self._nparams:])
1657        actual_len = len(params)
1658
1659        if actual_len != self._nparams:
1660            if self._defaults:
1661                expected = f"at least {self._nparams - len(self._defaults)}"
1662            else:
1663                expected = str(self._nparams)
1664            if not self._nparams:
1665                raise TypeError(f"{self} is not a generic class")
1666            raise TypeError(f"Too {'many' if actual_len > self._nparams else 'few'} arguments for {self};"
1667                            f" actual {actual_len}, expected {expected}")
1668        return self.copy_with(params)
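
    # For instance (an informal sketch; it relies on the Generator alias
    # defined near the end of this module with defaults=(NoneType, NoneType)):
    #
    #     Generator[int]        # typing.Generator[int, NoneType, NoneType]
    #     Generator[int, None]  # typing.Generator[int, NoneType, NoneType]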
1669
1670    def copy_with(self, params):
1671        return _GenericAlias(self.__origin__, params,
1672                             name=self._name, inst=self._inst)
1673
1674    def __repr__(self):
1675        return 'typing.' + self._name
1676
1677    def __subclasscheck__(self, cls):
1678        if isinstance(cls, _SpecialGenericAlias):
1679            return issubclass(cls.__origin__, self.__origin__)
1680        if not isinstance(cls, _GenericAlias):
1681            return issubclass(cls, self.__origin__)
1682        return super().__subclasscheck__(cls)
1683
1684    def __reduce__(self):
1685        return self._name
1686
1687    def __or__(self, right):
1688        return Union[self, right]
1689
1690    def __ror__(self, left):
1691        return Union[left, self]
1692
1693
1694class _DeprecatedGenericAlias(_SpecialGenericAlias, _root=True):
1695    def __init__(
1696        self, origin, nparams, *, removal_version, inst=True, name=None
1697    ):
1698        super().__init__(origin, nparams, inst=inst, name=name)
1699        self._removal_version = removal_version
1700
1701    def __instancecheck__(self, inst):
1702        import warnings
1703        warnings._deprecated(
1704            f"{self.__module__}.{self._name}", remove=self._removal_version
1705        )
1706        return super().__instancecheck__(inst)
1707
1708
1709class _CallableGenericAlias(_NotIterable, _GenericAlias, _root=True):
1710    def __repr__(self):
1711        assert self._name == 'Callable'
1712        args = self.__args__
1713        if len(args) == 2 and _is_param_expr(args[0]):
1714            return super().__repr__()
1715        return (f'typing.Callable'
1716                f'[[{", ".join([_type_repr(a) for a in args[:-1]])}], '
1717                f'{_type_repr(args[-1])}]')
1718
1719    def __reduce__(self):
1720        args = self.__args__
1721        if not (len(args) == 2 and _is_param_expr(args[0])):
1722            args = list(args[:-1]), args[-1]
1723        return operator.getitem, (Callable, args)
1724
1725
1726class _CallableType(_SpecialGenericAlias, _root=True):
1727    def copy_with(self, params):
1728        return _CallableGenericAlias(self.__origin__, params,
1729                                     name=self._name, inst=self._inst)
1730
1731    def __getitem__(self, params):
1732        if not isinstance(params, tuple) or len(params) != 2:
1733            raise TypeError("Callable must be used as "
1734                            "Callable[[arg, ...], result].")
1735        args, result = params
1736        # This relaxes what args can be on purpose to allow things like
1737        # PEP 612 ParamSpec.  Responsibility for whether a user is using
1738        # Callable[...] properly is deferred to static type checkers.
1739        if isinstance(args, list):
1740            params = (tuple(args), result)
1741        else:
1742            params = (args, result)
1743        return self.__getitem_inner__(params)
1744
1745    @_tp_cache
1746    def __getitem_inner__(self, params):
1747        args, result = params
1748        msg = "Callable[args, result]: result must be a type."
1749        result = _type_check(result, msg)
1750        if args is Ellipsis:
1751            return self.copy_with((_TypingEllipsis, result))
1752        if not isinstance(args, tuple):
1753            args = (args,)
1754        args = tuple(_type_convert(arg) for arg in args)
1755        params = args + (result,)
1756        return self.copy_with(params)
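
    # For example (illustrative only):
    #
    #     Callable[[int, str], float].__args__  # (<class 'int'>, <class 'str'>, <class 'float'>)
    #     Callable[..., int]                    # typing.Callable[..., int]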
1757
1758
1759class _TupleType(_SpecialGenericAlias, _root=True):
1760    @_tp_cache
1761    def __getitem__(self, params):
1762        if not isinstance(params, tuple):
1763            params = (params,)
1764        if len(params) >= 2 and params[-1] is ...:
1765            msg = "Tuple[t, ...]: t must be a type."
1766            params = tuple(_type_check(p, msg) for p in params[:-1])
1767            return self.copy_with((*params, _TypingEllipsis))
1768        msg = "Tuple[t0, t1, ...]: each t must be a type."
1769        params = tuple(_type_check(p, msg) for p in params)
1770        return self.copy_with(params)
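
    # For example (illustrative only):
    #
    #     Tuple[int, str].__args__   # (<class 'int'>, <class 'str'>)
    #     Tuple[int, ...]            # typing.Tuple[int, ...]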
1771
1772
1773class _UnionGenericAlias(_NotIterable, _GenericAlias, _root=True):
1774    def copy_with(self, params):
1775        return Union[params]
1776
1777    def __eq__(self, other):
1778        if not isinstance(other, (_UnionGenericAlias, types.UnionType)):
1779            return NotImplemented
1780        try:  # fast path
1781            return set(self.__args__) == set(other.__args__)
1782        except TypeError:  # not hashable, slow path
1783            return _compare_args_orderless(self.__args__, other.__args__)
1784
1785    def __hash__(self):
1786        return hash(frozenset(self.__args__))
1787
1788    def __repr__(self):
1789        args = self.__args__
1790        if len(args) == 2:
1791            if args[0] is type(None):
1792                return f'typing.Optional[{_type_repr(args[1])}]'
1793            elif args[1] is type(None):
1794                return f'typing.Optional[{_type_repr(args[0])}]'
1795        return super().__repr__()
1796
1797    def __instancecheck__(self, obj):
1798        for arg in self.__args__:
1799            if isinstance(obj, arg):
1800                return True
1801        return False
1802
1803    def __subclasscheck__(self, cls):
1804        for arg in self.__args__:
1805            if issubclass(cls, arg):
1806                return True
1807        return False
1808
1809    def __reduce__(self):
1810        func, (origin, args) = super().__reduce__()
1811        return func, (Union, args)
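
    # For example (illustrative only):
    #
    #     Union[int, str] == Union[str, int]   # True (order-insensitive)
    #     Union[int, None]                     # typing.Optional[int]
    #     isinstance(3, Union[int, str])       # True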
1812
1813
1814def _value_and_type_iter(parameters):
1815    return ((p, type(p)) for p in parameters)
1816
1817
1818class _LiteralGenericAlias(_GenericAlias, _root=True):
1819    def __eq__(self, other):
1820        if not isinstance(other, _LiteralGenericAlias):
1821            return NotImplemented
1822
1823        return set(_value_and_type_iter(self.__args__)) == set(_value_and_type_iter(other.__args__))
1824
1825    def __hash__(self):
1826        return hash(frozenset(_value_and_type_iter(self.__args__)))
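
    # Comparing values together with their types is what keeps, e.g.,
    # Literal[0] distinct from Literal[False] (illustrative only):
    #
    #     Literal[1, 2] == Literal[2, 1]   # True
    #     Literal[0] == Literal[False]     # False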
1827
1828
1829class _ConcatenateGenericAlias(_GenericAlias, _root=True):
1830    def copy_with(self, params):
1831        if isinstance(params[-1], (list, tuple)):
1832            return (*params[:-1], *params[-1])
1833        if isinstance(params[-1], _ConcatenateGenericAlias):
1834            params = (*params[:-1], *params[-1].__args__)
1835        return super().copy_with(params)
1836
1837
1838@_SpecialForm
1839def Unpack(self, parameters):
1840    """Type unpack operator.
1841
1842    The type unpack operator takes the child types from some container type,
1843    such as `tuple[int, str]` or a `TypeVarTuple`, and 'pulls them out'.
1844
1845    For example::
1846
1847        # For some generic class `Foo`:
1848        Foo[Unpack[tuple[int, str]]]  # Equivalent to Foo[int, str]
1849
1850        Ts = TypeVarTuple('Ts')
1851        # Specifies that `Bar` is generic in an arbitrary number of types.
1852        # (Think of `Ts` as a tuple of an arbitrary number of individual
1853        #  `TypeVar`s, which the `Unpack` is 'pulling out' directly into the
1854        #  `Generic[]`.)
1855        class Bar(Generic[Unpack[Ts]]): ...
1856        Bar[int]  # Valid
1857        Bar[int, str]  # Also valid
1858
1859    From Python 3.11, this can also be done using the `*` operator::
1860
1861        Foo[*tuple[int, str]]
1862        class Bar(Generic[*Ts]): ...
1863
1864    And from Python 3.12, it can be done using built-in syntax for generics::
1865
1866        Foo[*tuple[int, str]]
1867        class Bar[*Ts]: ...
1868
1869    The operator can also be used along with a `TypedDict` to annotate
1870    `**kwargs` in a function signature::
1871
1872        class Movie(TypedDict):
1873            name: str
1874            year: int
1875
1876        # This function expects two keyword arguments - *name* of type `str` and
1877        # *year* of type `int`.
1878        def foo(**kwargs: Unpack[Movie]): ...
1879
1880    Note that there is only some runtime checking of this operator. Not
1881    everything the runtime allows may be accepted by static type checkers.
1882
1883    For more information, see PEPs 646 and 692.
1884    """
1885    item = _type_check(parameters, f'{self} accepts only a single type.')
1886    return _UnpackGenericAlias(origin=self, args=(item,))
1887
1888
1889class _UnpackGenericAlias(_GenericAlias, _root=True):
1890    def __repr__(self):
1891        # `Unpack` only takes one argument, so __args__ should contain only
1892        # a single item.
1893        return f'typing.Unpack[{_type_repr(self.__args__[0])}]'
1894
1895    def __getitem__(self, args):
1896        if self.__typing_is_unpacked_typevartuple__:
1897            return args
1898        return super().__getitem__(args)
1899
1900    @property
1901    def __typing_unpacked_tuple_args__(self):
1902        assert self.__origin__ is Unpack
1903        assert len(self.__args__) == 1
1904        arg, = self.__args__
1905        if isinstance(arg, (_GenericAlias, types.GenericAlias)):
1906            if arg.__origin__ is not tuple:
1907                raise TypeError("Unpack[...] must be used with a tuple type")
1908            return arg.__args__
1909        return None
1910
1911    @property
1912    def __typing_is_unpacked_typevartuple__(self):
1913        assert self.__origin__ is Unpack
1914        assert len(self.__args__) == 1
1915        return isinstance(self.__args__[0], TypeVarTuple)
1916
1917
1918class _TypingEllipsis:
1919    """Internal placeholder for ... (ellipsis)."""
1920
1921
1922_TYPING_INTERNALS = frozenset({
1923    '__parameters__', '__orig_bases__',  '__orig_class__',
1924    '_is_protocol', '_is_runtime_protocol', '__protocol_attrs__',
1925    '__non_callable_proto_members__', '__type_params__',
1926})
1927
1928_SPECIAL_NAMES = frozenset({
1929    '__abstractmethods__', '__annotations__', '__dict__', '__doc__',
1930    '__init__', '__module__', '__new__', '__slots__',
1931    '__subclasshook__', '__weakref__', '__class_getitem__',
1932    '__match_args__', '__static_attributes__', '__firstlineno__',
1933})
1934
1935# These special attributes will not be collected as protocol members.
1936EXCLUDED_ATTRIBUTES = _TYPING_INTERNALS | _SPECIAL_NAMES | {'_MutableMapping__marker'}
1937
1938
1939def _get_protocol_attrs(cls):
1940    """Collect protocol members from a protocol class object.
1941
1942    This includes names actually defined in the class dictionary, as well
1943    as names that appear in annotations. Special names (above) are skipped.
1944    """
1945    attrs = set()
1946    for base in cls.__mro__[:-1]:  # without object
1947        if base.__name__ in {'Protocol', 'Generic'}:
1948            continue
1949        annotations = getattr(base, '__annotations__', {})
1950        for attr in (*base.__dict__, *annotations):
1951            if not attr.startswith('_abc_') and attr not in EXCLUDED_ATTRIBUTES:
1952                attrs.add(attr)
1953    return attrs
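
# For a protocol such as the following (HasName is a made-up example, not
# defined in this module), this would collect both the annotated attribute
# and the method, i.e. {'name', 'rename'}:
#
#     class HasName(Protocol):
#         name: str
#         def rename(self, new: str) -> None: ...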
1954
1955
1956def _no_init_or_replace_init(self, *args, **kwargs):
1957    cls = type(self)
1958
1959    if cls._is_protocol:
1960        raise TypeError('Protocols cannot be instantiated')
1961
1962    # Already using a custom `__init__`. No need to calculate correct
1963    # `__init__` to call. This can lead to RecursionError. See bpo-45121.
1964    if cls.__init__ is not _no_init_or_replace_init:
1965        return
1966
1967    # Initially, `__init__` of a protocol subclass is set to `_no_init_or_replace_init`.
1968    # The first instantiation of the subclass will call `_no_init_or_replace_init` which
1969    # searches for a proper new `__init__` in the MRO. The new `__init__`
1970    # replaces the subclass' old `__init__` (ie `_no_init_or_replace_init`). Subsequent
1971    # instantiation of the protocol subclass will thus use the new
1972    # `__init__` and no longer call `_no_init_or_replace_init`.
1973    for base in cls.__mro__:
1974        init = base.__dict__.get('__init__', _no_init_or_replace_init)
1975        if init is not _no_init_or_replace_init:
1976            cls.__init__ = init
1977            break
1978    else:
1979        # should not happen
1980        cls.__init__ = object.__init__
1981
1982    cls.__init__(self, *args, **kwargs)
1983
1984
1985def _caller(depth=1, default='__main__'):
1986    try:
1987        return sys._getframemodulename(depth + 1) or default
1988    except AttributeError:  # For platforms without _getframemodulename()
1989        pass
1990    try:
1991        return sys._getframe(depth + 1).f_globals.get('__name__', default)
1992    except (AttributeError, ValueError):  # For platforms without _getframe()
1993        pass
1994    return None
1995
1996def _allow_reckless_class_checks(depth=2):
1997    """Allow instance and class checks for special stdlib modules.
1998
1999    The abc and functools modules indiscriminately call isinstance() and
2000    issubclass() on the whole MRO of a user class, which may contain protocols.
2001    """
2002    return _caller(depth) in {'abc', 'functools', None}
2003
2004
2005_PROTO_ALLOWLIST = {
2006    'collections.abc': [
2007        'Callable', 'Awaitable', 'Iterable', 'Iterator', 'AsyncIterable',
2008        'AsyncIterator', 'Hashable', 'Sized', 'Container', 'Collection',
2009        'Reversible', 'Buffer',
2010    ],
2011    'contextlib': ['AbstractContextManager', 'AbstractAsyncContextManager'],
2012}
2013
2014
2015@functools.cache
2016def _lazy_load_getattr_static():
2017    # Import getattr_static lazily so as not to slow down the import of typing.py
2018    # Cache the result so we don't slow down _ProtocolMeta.__instancecheck__ unnecessarily
2019    from inspect import getattr_static
2020    return getattr_static
2021
2022
2023_cleanups.append(_lazy_load_getattr_static.cache_clear)
2024
2025def _pickle_psargs(psargs):
2026    return ParamSpecArgs, (psargs.__origin__,)
2027
2028copyreg.pickle(ParamSpecArgs, _pickle_psargs)
2029
2030def _pickle_pskwargs(pskwargs):
2031    return ParamSpecKwargs, (pskwargs.__origin__,)
2032
2033copyreg.pickle(ParamSpecKwargs, _pickle_pskwargs)
2034
2035del _pickle_psargs, _pickle_pskwargs
2036
2037
2038# Preload these once, as globals, as a micro-optimisation.
2039# This makes a significant difference to the time it takes
2040# to do `isinstance()`/`issubclass()` checks
2041# against runtime-checkable protocols with only one callable member.
2042_abc_instancecheck = ABCMeta.__instancecheck__
2043_abc_subclasscheck = ABCMeta.__subclasscheck__
2044
2045
2046def _type_check_issubclass_arg_1(arg):
2047    """Raise TypeError if `arg` is not an instance of `type`
2048    in `issubclass(arg, <protocol>)`.
2049
2050    In most cases, this is verified by type.__subclasscheck__.
2051    Checking it again unnecessarily would slow down issubclass() checks,
2052    so, we don't perform this check unless we absolutely have to.
2053
2054    For various error paths, however,
2055    we want to ensure that *this* error message is shown to the user
2056    where relevant, rather than a typing.py-specific error message.
2057    """
2058    if not isinstance(arg, type):
2059        # Same error message as for issubclass(1, int).
2060        raise TypeError('issubclass() arg 1 must be a class')
2061
2062
2063class _ProtocolMeta(ABCMeta):
2064    # This metaclass is somewhat unfortunate,
2065    # but is necessary for several reasons...
2066    def __new__(mcls, name, bases, namespace, /, **kwargs):
2067        if name == "Protocol" and bases == (Generic,):
2068            pass
2069        elif Protocol in bases:
2070            for base in bases:
2071                if not (
2072                    base in {object, Generic}
2073                    or base.__name__ in _PROTO_ALLOWLIST.get(base.__module__, [])
2074                    or (
2075                        issubclass(base, Generic)
2076                        and getattr(base, "_is_protocol", False)
2077                    )
2078                ):
2079                    raise TypeError(
2080                        f"Protocols can only inherit from other protocols, "
2081                        f"got {base!r}"
2082                    )
2083        return super().__new__(mcls, name, bases, namespace, **kwargs)
2084
2085    def __init__(cls, *args, **kwargs):
2086        super().__init__(*args, **kwargs)
2087        if getattr(cls, "_is_protocol", False):
2088            cls.__protocol_attrs__ = _get_protocol_attrs(cls)
2089
2090    def __subclasscheck__(cls, other):
2091        if cls is Protocol:
2092            return type.__subclasscheck__(cls, other)
2093        if (
2094            getattr(cls, '_is_protocol', False)
2095            and not _allow_reckless_class_checks()
2096        ):
2097            if not getattr(cls, '_is_runtime_protocol', False):
2098                _type_check_issubclass_arg_1(other)
2099                raise TypeError(
2100                    "Instance and class checks can only be used with "
2101                    "@runtime_checkable protocols"
2102                )
2103            if (
2104                # this attribute is set by @runtime_checkable:
2105                cls.__non_callable_proto_members__
2106                and cls.__dict__.get("__subclasshook__") is _proto_hook
2107            ):
2108                _type_check_issubclass_arg_1(other)
2109                non_method_attrs = sorted(cls.__non_callable_proto_members__)
2110                raise TypeError(
2111                    "Protocols with non-method members don't support issubclass()."
2112                    f" Non-method members: {str(non_method_attrs)[1:-1]}."
2113                )
2114        return _abc_subclasscheck(cls, other)
2115
2116    def __instancecheck__(cls, instance):
2117        # We need this method for situations where attributes are
2118        # assigned in __init__.
2119        if cls is Protocol:
2120            return type.__instancecheck__(cls, instance)
2121        if not getattr(cls, "_is_protocol", False):
2122            # i.e., it's a concrete subclass of a protocol
2123            return _abc_instancecheck(cls, instance)
2124
2125        if (
2126            not getattr(cls, '_is_runtime_protocol', False) and
2127            not _allow_reckless_class_checks()
2128        ):
2129            raise TypeError("Instance and class checks can only be used with"
2130                            " @runtime_checkable protocols")
2131
2132        if _abc_instancecheck(cls, instance):
2133            return True
2134
2135        getattr_static = _lazy_load_getattr_static()
2136        for attr in cls.__protocol_attrs__:
2137            try:
2138                val = getattr_static(instance, attr)
2139            except AttributeError:
2140                break
2141            # this attribute is set by @runtime_checkable:
2142            if val is None and attr not in cls.__non_callable_proto_members__:
2143                break
2144        else:
2145            return True
2146
2147        return False
2148
2149
2150@classmethod
2151def _proto_hook(cls, other):
2152    if not cls.__dict__.get('_is_protocol', False):
2153        return NotImplemented
2154
2155    for attr in cls.__protocol_attrs__:
2156        for base in other.__mro__:
2157            # Check if the member appears in the class dictionary...
2158            if attr in base.__dict__:
2159                if base.__dict__[attr] is None:
2160                    return NotImplemented
2161                break
2162
2163            # ...or in annotations, if it is a sub-protocol.
2164            annotations = getattr(base, '__annotations__', {})
2165            if (isinstance(annotations, collections.abc.Mapping) and
2166                    attr in annotations and
2167                    issubclass(other, Generic) and getattr(other, '_is_protocol', False)):
2168                break
2169        else:
2170            return NotImplemented
2171    return True
2172
2173
2174class Protocol(Generic, metaclass=_ProtocolMeta):
2175    """Base class for protocol classes.
2176
2177    Protocol classes are defined as::
2178
2179        class Proto(Protocol):
2180            def meth(self) -> int:
2181                ...
2182
2183    Such classes are primarily used with static type checkers that recognize
2184    structural subtyping (static duck-typing).
2185
2186    For example::
2187
2188        class C:
2189            def meth(self) -> int:
2190                return 0
2191
2192        def func(x: Proto) -> int:
2193            return x.meth()
2194
2195        func(C())  # Passes static type check
2196
2197    See PEP 544 for details. Protocol classes decorated with
2198    @typing.runtime_checkable act as simple-minded runtime protocols that check
2199    only the presence of given attributes, ignoring their type signatures.
2200    Protocol classes can be generic; they are defined as::
2201
2202        class GenProto[T](Protocol):
2203            def meth(self) -> T:
2204                ...
2205    """
2206
2207    __slots__ = ()
2208    _is_protocol = True
2209    _is_runtime_protocol = False
2210
2211    def __init_subclass__(cls, *args, **kwargs):
2212        super().__init_subclass__(*args, **kwargs)
2213
2214        # Determine if this is a protocol or a concrete subclass.
2215        if not cls.__dict__.get('_is_protocol', False):
2216            cls._is_protocol = any(b is Protocol for b in cls.__bases__)
2217
2218        # Set (or override) the protocol subclass hook.
2219        if '__subclasshook__' not in cls.__dict__:
2220            cls.__subclasshook__ = _proto_hook
2221
2222        # Prohibit instantiation for protocol classes
2223        if cls._is_protocol and cls.__init__ is Protocol.__init__:
2224            cls.__init__ = _no_init_or_replace_init
2225
2226
2227class _AnnotatedAlias(_NotIterable, _GenericAlias, _root=True):
2228    """Runtime representation of an annotated type.
2229
2230    At its core 'Annotated[t, dec1, dec2, ...]' is an alias for the type 't'
2231    with extra annotations. The alias behaves like a normal typing alias.
2232    Instantiating is the same as instantiating the underlying type; binding
2233    it to types is also the same.
2234
2235    The metadata itself is stored in a '__metadata__' attribute as a tuple.
2236    """
2237
2238    def __init__(self, origin, metadata):
2239        if isinstance(origin, _AnnotatedAlias):
2240            metadata = origin.__metadata__ + metadata
2241            origin = origin.__origin__
2242        super().__init__(origin, origin, name='Annotated')
2243        self.__metadata__ = metadata
2244
2245    def copy_with(self, params):
2246        assert len(params) == 1
2247        new_type = params[0]
2248        return _AnnotatedAlias(new_type, self.__metadata__)
2249
2250    def __repr__(self):
2251        return "typing.Annotated[{}, {}]".format(
2252            _type_repr(self.__origin__),
2253            ", ".join(repr(a) for a in self.__metadata__)
2254        )
2255
2256    def __reduce__(self):
2257        return operator.getitem, (
2258            Annotated, (self.__origin__,) + self.__metadata__
2259        )
2260
2261    def __eq__(self, other):
2262        if not isinstance(other, _AnnotatedAlias):
2263            return NotImplemented
2264        return (self.__origin__ == other.__origin__
2265                and self.__metadata__ == other.__metadata__)
2266
2267    def __hash__(self):
2268        return hash((self.__origin__, self.__metadata__))
2269
2270    def __getattr__(self, attr):
2271        if attr in {'__name__', '__qualname__'}:
2272            return 'Annotated'
2273        return super().__getattr__(attr)
2274
2275    def __mro_entries__(self, bases):
2276        return (self.__origin__,)
2277
2278
2279@_TypedCacheSpecialForm
2280@_tp_cache(typed=True)
2281def Annotated(self, *params):
2282    """Add context-specific metadata to a type.
2283
2284    Example: Annotated[int, runtime_check.Unsigned] indicates to the
2285    hypothetical runtime_check module that this type is an unsigned int.
2286    Every other consumer of this type can ignore this metadata and treat
2287    this type as int.
2288
2289    The first argument to Annotated must be a valid type.
2290
2291    Details:
2292
2293    - It's an error to call `Annotated` with less than two arguments.
2294    - Access the metadata via the ``__metadata__`` attribute::
2295
2296        assert Annotated[int, '$'].__metadata__ == ('$',)
2297
2298    - Nested Annotated types are flattened::
2299
2300        assert Annotated[Annotated[T, Ann1, Ann2], Ann3] == Annotated[T, Ann1, Ann2, Ann3]
2301
2302    - Instantiating an annotated type is equivalent to instantiating the
2303    underlying type::
2304
2305        assert Annotated[C, Ann1](5) == C(5)
2306
2307    - Annotated can be used as a generic type alias::
2308
2309        type Optimized[T] = Annotated[T, runtime.Optimize()]
2310        # type checker will treat Optimized[int]
2311        # as equivalent to Annotated[int, runtime.Optimize()]
2312
2313        type OptimizedList[T] = Annotated[list[T], runtime.Optimize()]
2314        # type checker will treat OptimizedList[int]
2315        # as equivalent to Annotated[list[int], runtime.Optimize()]
2316
2317    - Annotated cannot be used with an unpacked TypeVarTuple::
2318
2319        type Variadic[*Ts] = Annotated[*Ts, Ann1]  # NOT valid
2320
2321      This would be equivalent to::
2322
2323        Annotated[T1, T2, T3, ..., Ann1]
2324
2325      where T1, T2 etc. are TypeVars, which would be invalid, because
2326      only one type should be passed to Annotated.
2327    """
2328    if len(params) < 2:
2329        raise TypeError("Annotated[...] should be used "
2330                        "with at least two arguments (a type and an "
2331                        "annotation).")
2332    if _is_unpacked_typevartuple(params[0]):
2333        raise TypeError("Annotated[...] should not be used with an "
2334                        "unpacked TypeVarTuple")
2335    msg = "Annotated[t, ...]: t must be a type."
2336    origin = _type_check(params[0], msg, allow_special_forms=True)
2337    metadata = tuple(params[1:])
2338    return _AnnotatedAlias(origin, metadata)
2339
2340
2341def runtime_checkable(cls):
2342    """Mark a protocol class as a runtime protocol.
2343
2344    Such a protocol can be used with isinstance() and issubclass().
2345    Raise TypeError if applied to a non-protocol class.
2346    This allows a simple-minded structural check, very similar to the
2347    one-trick ponies in collections.abc such as Iterable.
2348
2349    For example::
2350
2351        @runtime_checkable
2352        class Closable(Protocol):
2353            def close(self): ...
2354
2355        assert isinstance(open('/some/file'), Closable)
2356
2357    Warning: this will check only the presence of the required methods,
2358    not their type signatures!
2359    """
2360    if not issubclass(cls, Generic) or not getattr(cls, '_is_protocol', False):
2361        raise TypeError('@runtime_checkable can be only applied to protocol classes,'
2362                        ' got %r' % cls)
2363    cls._is_runtime_protocol = True
2364    # PEP 544 prohibits using issubclass()
2365    # with protocols that have non-method members.
2366    # See gh-113320 for why we compute this attribute here,
2367    # rather than in `_ProtocolMeta.__init__`
2368    cls.__non_callable_proto_members__ = set()
2369    for attr in cls.__protocol_attrs__:
2370        try:
2371            is_callable = callable(getattr(cls, attr, None))
2372        except Exception as e:
2373            raise TypeError(
2374                f"Failed to determine whether protocol member {attr!r} "
2375                "is a method member"
2376            ) from e
2377        else:
2378            if not is_callable:
2379                cls.__non_callable_proto_members__.add(attr)
2380    return cls
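
# Note that a @runtime_checkable protocol with non-method members supports
# isinstance() but not issubclass() (a sketch; `Named`, `obj` and `SomeClass`
# are illustrative only):
#
#     @runtime_checkable
#     class Named(Protocol):
#         name: str
#
#     isinstance(obj, Named)        # OK: checks that obj has a 'name' attribute
#     issubclass(SomeClass, Named)  # TypeError: non-method members don't
#                                   # support issubclass()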
2381
2382
2383def cast(typ, val):
2384    """Cast a value to a type.
2385
2386    This returns the value unchanged.  To the type checker this
2387    signals that the return value has the designated type, but at
2388    runtime we intentionally don't check anything (we want this
2389    to be as fast as possible).
2390    """
2391    return val
2392
2393
2394def assert_type(val, typ, /):
2395    """Ask a static type checker to confirm that the value is of the given type.
2396
2397    At runtime this does nothing: it returns the first argument unchanged with no
2398    checks or side effects, no matter the actual type of the argument.
2399
2400    When a static type checker encounters a call to assert_type(), it
2401    emits an error if the value is not of the specified type::
2402
2403        def greet(name: str) -> None:
2404            assert_type(name, str)  # OK
2405            assert_type(name, int)  # type checker error
2406    """
2407    return val
2408
2409
2410_allowed_types = (types.FunctionType, types.BuiltinFunctionType,
2411                  types.MethodType, types.ModuleType,
2412                  WrapperDescriptorType, MethodWrapperType, MethodDescriptorType)
2413
2414
2415def get_type_hints(obj, globalns=None, localns=None, include_extras=False):
2416    """Return type hints for an object.
2417
2418    This is often the same as obj.__annotations__, but it handles
2419    forward references encoded as string literals and recursively replaces all
2420    'Annotated[T, ...]' with 'T' (unless 'include_extras=True').
2421
2422    The argument may be a module, class, method, or function. The annotations
2423    are returned as a dictionary. For classes, annotations also include
2424    inherited members.
2425
2426    TypeError is raised if the argument is not of a type that can contain
2427    annotations, and an empty dictionary is returned if no annotations are
2428    present.
2429
2430    BEWARE -- the behavior of globalns and localns is counterintuitive
2431    (unless you are familiar with how eval() and exec() work).  The
2432    search order is locals first, then globals.
2433
2434    - If no dict arguments are passed, an attempt is made to use the
2435      globals from obj (or the respective module's globals for classes),
2436      and these are also used as the locals.  If the object does not appear
2437      to have globals, an empty dictionary is used.  For classes, the search
2438      order is globals first then locals.
2439
2440    - If one dict argument is passed, it is used for both globals and
2441      locals.
2442
2443    - If two dict arguments are passed, they specify globals and
2444      locals, respectively.
2445    """
2446    if getattr(obj, '__no_type_check__', None):
2447        return {}
2448    # Classes require special treatment.
2449    if isinstance(obj, type):
2450        hints = {}
2451        for base in reversed(obj.__mro__):
2452            if globalns is None:
2453                base_globals = getattr(sys.modules.get(base.__module__, None), '__dict__', {})
2454            else:
2455                base_globals = globalns
2456            ann = base.__dict__.get('__annotations__', {})
2457            if isinstance(ann, types.GetSetDescriptorType):
2458                ann = {}
2459            base_locals = dict(vars(base)) if localns is None else localns
2460            if localns is None and globalns is None:
2461                # This is surprising, but required.  Before Python 3.10,
2462                # get_type_hints only evaluated the globalns of
2463                # a class.  To maintain backwards compatibility, we reverse
2464                # the globalns and localns order so that eval() looks into
2465                # *base_globals* first rather than *base_locals*.
2466                # This only affects ForwardRefs.
2467                base_globals, base_locals = base_locals, base_globals
2468            for name, value in ann.items():
2469                if value is None:
2470                    value = type(None)
2471                if isinstance(value, str):
2472                    value = ForwardRef(value, is_argument=False, is_class=True)
2473                value = _eval_type(value, base_globals, base_locals, base.__type_params__)
2474                hints[name] = value
2475        return hints if include_extras else {k: _strip_annotations(t) for k, t in hints.items()}
2476
2477    if globalns is None:
2478        if isinstance(obj, types.ModuleType):
2479            globalns = obj.__dict__
2480        else:
2481            nsobj = obj
2482            # Find globalns for the unwrapped object.
2483            while hasattr(nsobj, '__wrapped__'):
2484                nsobj = nsobj.__wrapped__
2485            globalns = getattr(nsobj, '__globals__', {})
2486        if localns is None:
2487            localns = globalns
2488    elif localns is None:
2489        localns = globalns
2490    hints = getattr(obj, '__annotations__', None)
2491    if hints is None:
2492        # Return empty annotations for something that _could_ have them.
2493        if isinstance(obj, _allowed_types):
2494            return {}
2495        else:
2496            raise TypeError('{!r} is not a module, class, method, '
2497                            'or function.'.format(obj))
2498    hints = dict(hints)
2499    type_params = getattr(obj, "__type_params__", ())
2500    for name, value in hints.items():
2501        if value is None:
2502            value = type(None)
2503        if isinstance(value, str):
2504            # Class-level forward refs were handled above; this must be either
2505            # a module-level annotation or a function argument annotation.
2506            value = ForwardRef(
2507                value,
2508                is_argument=not isinstance(obj, types.ModuleType),
2509                is_class=False,
2510            )
2511        hints[name] = _eval_type(value, globalns, localns, type_params)
2512    return hints if include_extras else {k: _strip_annotations(t) for k, t in hints.items()}
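
# A quick illustration (a sketch; `stats` is a made-up function, and the
# string annotation only resolves where Sequence is reachable through the
# caller's globals, e.g. after `from typing import *`):
#
#     def stats(xs: "Sequence[float]", scale: float = 1.0) -> float: ...
#
#     get_type_hints(stats)
#     # {'xs': typing.Sequence[float], 'scale': <class 'float'>, 'return': <class 'float'>}
#     get_type_hints(stats, include_extras=True)  # keeps any Annotated[...] metadata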
2513
2514
2515def _strip_annotations(t):
2516    """Strip the annotations from a given type."""
2517    if isinstance(t, _AnnotatedAlias):
2518        return _strip_annotations(t.__origin__)
2519    if hasattr(t, "__origin__") and t.__origin__ in (Required, NotRequired, ReadOnly):
2520        return _strip_annotations(t.__args__[0])
2521    if isinstance(t, _GenericAlias):
2522        stripped_args = tuple(_strip_annotations(a) for a in t.__args__)
2523        if stripped_args == t.__args__:
2524            return t
2525        return t.copy_with(stripped_args)
2526    if isinstance(t, GenericAlias):
2527        stripped_args = tuple(_strip_annotations(a) for a in t.__args__)
2528        if stripped_args == t.__args__:
2529            return t
2530        return GenericAlias(t.__origin__, stripped_args)
2531    if isinstance(t, types.UnionType):
2532        stripped_args = tuple(_strip_annotations(a) for a in t.__args__)
2533        if stripped_args == t.__args__:
2534            return t
2535        return functools.reduce(operator.or_, stripped_args)
2536
2537    return t
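
# For example (informal): _strip_annotations(List[Annotated[int, "meta"]])
# returns List[int], mirroring what get_type_hints does when
# include_extras is False.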
2538
2539
2540def get_origin(tp):
2541    """Get the unsubscripted version of a type.
2542
2543    This supports generic types, Callable, Tuple, Union, Literal, Final, ClassVar,
2544    Annotated, and others. Return None for unsupported types.
2545
2546    Examples::
2547
2548        >>> P = ParamSpec('P')
2549        >>> assert get_origin(Literal[42]) is Literal
2550        >>> assert get_origin(int) is None
2551        >>> assert get_origin(ClassVar[int]) is ClassVar
2552        >>> assert get_origin(Generic) is Generic
2553        >>> assert get_origin(Generic[T]) is Generic
2554        >>> assert get_origin(Union[T, int]) is Union
2555        >>> assert get_origin(List[Tuple[T, T]][int]) is list
2556        >>> assert get_origin(P.args) is P
2557    """
2558    if isinstance(tp, _AnnotatedAlias):
2559        return Annotated
2560    if isinstance(tp, (_BaseGenericAlias, GenericAlias,
2561                       ParamSpecArgs, ParamSpecKwargs)):
2562        return tp.__origin__
2563    if tp is Generic:
2564        return Generic
2565    if isinstance(tp, types.UnionType):
2566        return types.UnionType
2567    return None
2568
2569
2570def get_args(tp):
2571    """Get type arguments with all substitutions performed.
2572
2573    For unions, basic simplifications used by Union constructor are performed.
2574
2575    Examples::
2576
2577        >>> T = TypeVar('T')
2578        >>> assert get_args(Dict[str, int]) == (str, int)
2579        >>> assert get_args(int) == ()
2580        >>> assert get_args(Union[int, Union[T, int], str][int]) == (int, str)
2581        >>> assert get_args(Union[int, Tuple[T, int]][str]) == (int, Tuple[str, int])
2582        >>> assert get_args(Callable[[], T][int]) == ([], int)
2583    """
2584    if isinstance(tp, _AnnotatedAlias):
2585        return (tp.__origin__,) + tp.__metadata__
2586    if isinstance(tp, (_GenericAlias, GenericAlias)):
2587        res = tp.__args__
2588        if _should_unflatten_callable_args(tp, res):
2589            res = (list(res[:-1]), res[-1])
2590        return res
2591    if isinstance(tp, types.UnionType):
2592        return tp.__args__
2593    return ()
2594
2595
2596def is_typeddict(tp):
2597    """Check if an annotation is a TypedDict class.
2598
2599    For example::
2600
2601        >>> from typing import TypedDict
2602        >>> class Film(TypedDict):
2603        ...     title: str
2604        ...     year: int
2605        ...
2606        >>> is_typeddict(Film)
2607        True
2608        >>> is_typeddict(dict)
2609        False
2610    """
2611    return isinstance(tp, _TypedDictMeta)
2612
2613
2614_ASSERT_NEVER_REPR_MAX_LENGTH = 100
2615
2616
2617def assert_never(arg: Never, /) -> Never:
2618    """Statically assert that a line of code is unreachable.
2619
2620    Example::
2621
2622        def int_or_str(arg: int | str) -> None:
2623            match arg:
2624                case int():
2625                    print("It's an int")
2626                case str():
2627                    print("It's a str")
2628                case _:
2629                    assert_never(arg)
2630
2631    If a type checker finds that a call to assert_never() is
2632    reachable, it will emit an error.
2633
2634    At runtime, this throws an exception when called.
2635    """
2636    value = repr(arg)
2637    if len(value) > _ASSERT_NEVER_REPR_MAX_LENGTH:
2638        value = value[:_ASSERT_NEVER_REPR_MAX_LENGTH] + '...'
2639    raise AssertionError(f"Expected code to be unreachable, but got: {value}")
2640
2641
2642def no_type_check(arg):
2643    """Decorator to indicate that annotations are not type hints.
2644
2645    The argument must be a class or function; if it is a class, it
2646    applies recursively to all methods and classes defined in that class
2647    (but not to methods defined in its superclasses or subclasses).
2648
2649    This mutates the function(s) or class(es) in place.
2650    """
2651    if isinstance(arg, type):
2652        for key in dir(arg):
2653            obj = getattr(arg, key)
2654            if (
2655                not hasattr(obj, '__qualname__')
2656                or obj.__qualname__ != f'{arg.__qualname__}.{obj.__name__}'
2657                or getattr(obj, '__module__', None) != arg.__module__
2658            ):
2659                # We only modify objects that are defined in this type directly.
2660                # If classes / methods are nested in multiple layers,
2661                # we will modify them when processing their direct holders.
2662                continue
2663            # Instance, class, and static methods:
2664            if isinstance(obj, types.FunctionType):
2665                obj.__no_type_check__ = True
2666            if isinstance(obj, types.MethodType):
2667                obj.__func__.__no_type_check__ = True
2668            # Nested types:
2669            if isinstance(obj, type):
2670                no_type_check(obj)
2671    try:
2672        arg.__no_type_check__ = True
2673    except TypeError:  # built-in classes
2674        pass
2675    return arg
2676
2677
2678def no_type_check_decorator(decorator):
2679    """Decorator to give another decorator the @no_type_check effect.
2680
2681    This wraps the decorator with something that wraps the decorated
2682    function in @no_type_check.
2683    """
2684    import warnings
2685    warnings._deprecated("typing.no_type_check_decorator", remove=(3, 15))
2686    @functools.wraps(decorator)
2687    def wrapped_decorator(*args, **kwds):
2688        func = decorator(*args, **kwds)
2689        func = no_type_check(func)
2690        return func
2691
2692    return wrapped_decorator
2693
2694
2695def _overload_dummy(*args, **kwds):
2696    """Helper for @overload to raise when called."""
2697    raise NotImplementedError(
2698        "You should not call an overloaded function. "
2699        "A series of @overload-decorated functions "
2700        "outside a stub module should always be followed "
2701        "by an implementation that is not @overload-ed.")
2702
2703
2704# {module: {qualname: {firstlineno: func}}}
2705_overload_registry = defaultdict(functools.partial(defaultdict, dict))
2706
2707
2708def overload(func):
2709    """Decorator for overloaded functions/methods.
2710
2711    In a stub file, place two or more stub definitions for the same
2712    function in a row, each decorated with @overload.
2713
2714    For example::
2715
2716        @overload
2717        def utf8(value: None) -> None: ...
2718        @overload
2719        def utf8(value: bytes) -> bytes: ...
2720        @overload
2721        def utf8(value: str) -> bytes: ...
2722
2723    In a non-stub file (i.e. a regular .py file), do the same but
2724    follow it with an implementation.  The implementation should *not*
2725    be decorated with @overload::
2726
2727        @overload
2728        def utf8(value: None) -> None: ...
2729        @overload
2730        def utf8(value: bytes) -> bytes: ...
2731        @overload
2732        def utf8(value: str) -> bytes: ...
2733        def utf8(value):
2734            ...  # implementation goes here
2735
2736    The overloads for a function can be retrieved at runtime using the
2737    get_overloads() function.
2738    """
2739    # classmethod and staticmethod
2740    f = getattr(func, "__func__", func)
2741    try:
2742        _overload_registry[f.__module__][f.__qualname__][f.__code__.co_firstlineno] = func
2743    except AttributeError:
2744        # Not a normal function; ignore.
2745        pass
2746    return _overload_dummy
2747
2748
2749def get_overloads(func):
2750    """Return all defined overloads for *func* as a sequence."""
2751    # classmethod and staticmethod
2752    f = getattr(func, "__func__", func)
2753    if f.__module__ not in _overload_registry:
2754        return []
2755    mod_dict = _overload_registry[f.__module__]
2756    if f.__qualname__ not in mod_dict:
2757        return []
2758    return list(mod_dict[f.__qualname__].values())
2759
2760
2761def clear_overloads():
2762    """Clear all overloads in the registry."""
2763    _overload_registry.clear()
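
# A minimal sketch of the overload registry in action (the `parse` function
# is illustrative, not part of this module):
#
#     @overload
#     def parse(x: int) -> int: ...
#     @overload
#     def parse(x: str) -> int: ...
#     def parse(x): return int(x)   # implementation; not registered
#
#     get_overloads(parse)   # -> the two @overload-decorated stubs, in order
#     clear_overloads()      # after this, get_overloads(parse) == []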
2764
2765
2766def final(f):
2767    """Decorator to indicate final methods and final classes.
2768
2769    Use this decorator to indicate to type checkers that the decorated
2770    method cannot be overridden, and the decorated class cannot be subclassed.
2771
2772    For example::
2773
2774        class Base:
2775            @final
2776            def done(self) -> None:
2777                ...
2778        class Sub(Base):
2779            def done(self) -> None:  # Error reported by type checker
2780                ...
2781
2782        @final
2783        class Leaf:
2784            ...
2785        class Other(Leaf):  # Error reported by type checker
2786            ...
2787
2788    There is no runtime checking of these properties. The decorator
2789    attempts to set the ``__final__`` attribute to ``True`` on the decorated
2790    object to allow runtime introspection.
2791    """
2792    try:
2793        f.__final__ = True
2794    except (AttributeError, TypeError):
2795        # Skip the attribute silently if it is not writable.
2796        # AttributeError happens if the object has __slots__ or a
2797        # read-only property, TypeError if it's a builtin class.
2798        pass
2799    return f
2800
2801
2802# Some unconstrained type variables.  These were initially used by the container types.
2803# They were never meant for export and are now unused, but we keep them around to
2804# avoid breaking compatibility with users who import them.
2805T = TypeVar('T')  # Any type.
2806KT = TypeVar('KT')  # Key type.
2807VT = TypeVar('VT')  # Value type.
2808T_co = TypeVar('T_co', covariant=True)  # Any type covariant containers.
2809V_co = TypeVar('V_co', covariant=True)  # Any type covariant containers.
2810VT_co = TypeVar('VT_co', covariant=True)  # Value type covariant containers.
2811T_contra = TypeVar('T_contra', contravariant=True)  # Ditto contravariant.
2812# Internal type variable used for Type[].
2813CT_co = TypeVar('CT_co', covariant=True, bound=type)
2814
2815
2816# A useful type variable with constraints.  This represents string types.
2817# (This one *is* for export!)
2818AnyStr = TypeVar('AnyStr', bytes, str)
2819
2820
2821# Various ABCs mimicking those in collections.abc.
2822_alias = _SpecialGenericAlias
2823
2824Hashable = _alias(collections.abc.Hashable, 0)  # Not generic.
2825Awaitable = _alias(collections.abc.Awaitable, 1)
2826Coroutine = _alias(collections.abc.Coroutine, 3)
2827AsyncIterable = _alias(collections.abc.AsyncIterable, 1)
2828AsyncIterator = _alias(collections.abc.AsyncIterator, 1)
2829Iterable = _alias(collections.abc.Iterable, 1)
2830Iterator = _alias(collections.abc.Iterator, 1)
2831Reversible = _alias(collections.abc.Reversible, 1)
2832Sized = _alias(collections.abc.Sized, 0)  # Not generic.
2833Container = _alias(collections.abc.Container, 1)
2834Collection = _alias(collections.abc.Collection, 1)
2835Callable = _CallableType(collections.abc.Callable, 2)
2836Callable.__doc__ = \
2837    """Deprecated alias to collections.abc.Callable.
2838
2839    Callable[[int], str] signifies a function that takes a single
2840    parameter of type int and returns a str.
2841
2842    The subscription syntax must always be used with exactly two
2843    values: the argument list and the return type.
2844    The argument list must be a list of types, a ParamSpec,
2845    Concatenate or ellipsis. The return type must be a single type.
2846
2847    There is no syntax to indicate optional or keyword arguments;
2848    such function types are rarely used as callback types.
2849    """
2850AbstractSet = _alias(collections.abc.Set, 1, name='AbstractSet')
2851MutableSet = _alias(collections.abc.MutableSet, 1)
2852# NOTE: Mapping is only covariant in the value type.
2853Mapping = _alias(collections.abc.Mapping, 2)
2854MutableMapping = _alias(collections.abc.MutableMapping, 2)
2855Sequence = _alias(collections.abc.Sequence, 1)
2856MutableSequence = _alias(collections.abc.MutableSequence, 1)
2857ByteString = _DeprecatedGenericAlias(
2858    collections.abc.ByteString, 0, removal_version=(3, 14)  # Not generic.
2859)
2860# Tuple accepts variable number of parameters.
2861Tuple = _TupleType(tuple, -1, inst=False, name='Tuple')
2862Tuple.__doc__ = \
2863    """Deprecated alias to builtins.tuple.
2864
2865    Tuple[X, Y] is the cross-product type of X and Y.
2866
2867    Example: Tuple[T1, T2] is a tuple of two elements corresponding
2868    to type variables T1 and T2.  Tuple[int, float, str] is a tuple
2869    of an int, a float and a string.
2870
2871    To specify a variable-length tuple of homogeneous type, use Tuple[T, ...].
2872    """
2873List = _alias(list, 1, inst=False, name='List')
2874Deque = _alias(collections.deque, 1, name='Deque')
2875Set = _alias(set, 1, inst=False, name='Set')
2876FrozenSet = _alias(frozenset, 1, inst=False, name='FrozenSet')
2877MappingView = _alias(collections.abc.MappingView, 1)
2878KeysView = _alias(collections.abc.KeysView, 1)
2879ItemsView = _alias(collections.abc.ItemsView, 2)
2880ValuesView = _alias(collections.abc.ValuesView, 1)
2881Dict = _alias(dict, 2, inst=False, name='Dict')
2882DefaultDict = _alias(collections.defaultdict, 2, name='DefaultDict')
2883OrderedDict = _alias(collections.OrderedDict, 2)
2884Counter = _alias(collections.Counter, 1)
2885ChainMap = _alias(collections.ChainMap, 2)
2886Generator = _alias(collections.abc.Generator, 3, defaults=(types.NoneType, types.NoneType))
2887AsyncGenerator = _alias(collections.abc.AsyncGenerator, 2, defaults=(types.NoneType,))
2888Type = _alias(type, 1, inst=False, name='Type')
2889Type.__doc__ = \
2890    """Deprecated alias to builtins.type.
2891
2892    builtins.type or typing.Type can be used to annotate class objects.
2893    For example, suppose we have the following classes::
2894
2895        class User: ...  # Abstract base for User classes
2896        class BasicUser(User): ...
2897        class ProUser(User): ...
2898        class TeamUser(User): ...
2899
2900    And a function that takes a class argument that's a subclass of
2901    User and returns an instance of the corresponding class::
2902
2903        def new_user[U](user_class: Type[U]) -> U:
2904            user = user_class()
2905            # (Here we could write the user object to a database)
2906            return user
2907
2908        joe = new_user(BasicUser)
2909
2910    At this point the type checker knows that joe has type BasicUser.
2911    """
2912
2913
2914@runtime_checkable
2915class SupportsInt(Protocol):
2916    """An ABC with one abstract method __int__."""
2917
2918    __slots__ = ()
2919
2920    @abstractmethod
2921    def __int__(self) -> int:
2922        pass
2923
2924
2925@runtime_checkable
2926class SupportsFloat(Protocol):
2927    """An ABC with one abstract method __float__."""
2928
2929    __slots__ = ()
2930
2931    @abstractmethod
2932    def __float__(self) -> float:
2933        pass
2934
2935
2936@runtime_checkable
2937class SupportsComplex(Protocol):
2938    """An ABC with one abstract method __complex__."""
2939
2940    __slots__ = ()
2941
2942    @abstractmethod
2943    def __complex__(self) -> complex:
2944        pass
2945
2946
2947@runtime_checkable
2948class SupportsBytes(Protocol):
2949    """An ABC with one abstract method __bytes__."""
2950
2951    __slots__ = ()
2952
2953    @abstractmethod
2954    def __bytes__(self) -> bytes:
2955        pass
2956
2957
2958@runtime_checkable
2959class SupportsIndex(Protocol):
2960    """An ABC with one abstract method __index__."""
2961
2962    __slots__ = ()
2963
2964    @abstractmethod
2965    def __index__(self) -> int:
2966        pass
2967
2968
2969@runtime_checkable
2970class SupportsAbs[T](Protocol):
2971    """An ABC with one abstract method __abs__ that is covariant in its return type."""
2972
2973    __slots__ = ()
2974
2975    @abstractmethod
2976    def __abs__(self) -> T:
2977        pass
2978
2979
2980@runtime_checkable
2981class SupportsRound[T](Protocol):
2982    """An ABC with one abstract method __round__ that is covariant in its return type."""
2983
2984    __slots__ = ()
2985
2986    @abstractmethod
2987    def __round__(self, ndigits: int = 0) -> T:
2988        pass
2989
2990
2991def _make_nmtuple(name, types, module, defaults = ()):
2992    fields = [n for n, t in types]
2993    types = {n: _type_check(t, f"field {n} annotation must be a type")
2994             for n, t in types}
2995    nm_tpl = collections.namedtuple(name, fields,
2996                                    defaults=defaults, module=module)
2997    nm_tpl.__annotations__ = nm_tpl.__new__.__annotations__ = types
2998    return nm_tpl
2999
3000
3001# attributes prohibited to set in NamedTuple class syntax
3002_prohibited = frozenset({'__new__', '__init__', '__slots__', '__getnewargs__',
3003                         '_fields', '_field_defaults',
3004                         '_make', '_replace', '_asdict', '_source'})
3005
3006_special = frozenset({'__module__', '__name__', '__annotations__'})
3007
3008
3009class NamedTupleMeta(type):
3010    def __new__(cls, typename, bases, ns):
3011        assert _NamedTuple in bases
3012        for base in bases:
3013            if base is not _NamedTuple and base is not Generic:
3014                raise TypeError(
3015                    'can only inherit from a NamedTuple type and Generic')
3016        bases = tuple(tuple if base is _NamedTuple else base for base in bases)
3017        types = ns.get('__annotations__', {})
3018        default_names = []
3019        for field_name in types:
3020            if field_name in ns:
3021                default_names.append(field_name)
3022            elif default_names:
3023                raise TypeError(f"Non-default namedtuple field {field_name} "
3024                                f"cannot follow default field"
3025                                f"{'s' if len(default_names) > 1 else ''} "
3026                                f"{', '.join(default_names)}")
3027        nm_tpl = _make_nmtuple(typename, types.items(),
3028                               defaults=[ns[n] for n in default_names],
3029                               module=ns['__module__'])
3030        nm_tpl.__bases__ = bases
3031        if Generic in bases:
3032            class_getitem = _generic_class_getitem
3033            nm_tpl.__class_getitem__ = classmethod(class_getitem)
3034        # update from user namespace without overriding special namedtuple attributes
3035        for key, val in ns.items():
3036            if key in _prohibited:
3037                raise AttributeError("Cannot overwrite NamedTuple attribute " + key)
3038            elif key not in _special:
3039                if key not in nm_tpl._fields:
3040                    setattr(nm_tpl, key, val)
3041                try:
3042                    set_name = type(val).__set_name__
3043                except AttributeError:
3044                    pass
3045                else:
3046                    try:
3047                        set_name(val, nm_tpl, key)
3048                    except BaseException as e:
3049                        e.add_note(
3050                            f"Error calling __set_name__ on {type(val).__name__!r} "
3051                            f"instance {key!r} in {typename!r}"
3052                        )
3053                        raise
3054
3055        if Generic in bases:
3056            nm_tpl.__init_subclass__()
3057        return nm_tpl
3058
3059
3060def NamedTuple(typename, fields=_sentinel, /, **kwargs):
3061    """Typed version of namedtuple.
3062
3063    Usage::
3064
3065        class Employee(NamedTuple):
3066            name: str
3067            id: int
3068
3069    This is equivalent to::
3070
3071        Employee = collections.namedtuple('Employee', ['name', 'id'])
3072
3073    The resulting class has an extra __annotations__ attribute, giving a
3074    dict that maps field names to types.  (The field names are also in
3075    the _fields attribute, which is part of the namedtuple API.)
3076    An alternative equivalent functional syntax is also accepted::
3077
3078        Employee = NamedTuple('Employee', [('name', str), ('id', int)])
3079    """
3080    if fields is _sentinel:
3081        if kwargs:
3082            deprecated_thing = "Creating NamedTuple classes using keyword arguments"
3083            deprecation_msg = (
3084                "{name} is deprecated and will be disallowed in Python {remove}. "
3085                "Use the class-based or functional syntax instead."
3086            )
3087        else:
3088            deprecated_thing = "Failing to pass a value for the 'fields' parameter"
3089            example = f"`{typename} = NamedTuple({typename!r}, [])`"
3090            deprecation_msg = (
3091                "{name} is deprecated and will be disallowed in Python {remove}. "
3092                "To create a NamedTuple class with 0 fields "
3093                "using the functional syntax, "
3094                "pass an empty list, e.g. "
3095            ) + example + "."
3096    elif fields is None:
3097        if kwargs:
3098            raise TypeError(
3099                "Cannot pass `None` as the 'fields' parameter "
3100                "and also specify fields using keyword arguments"
3101            )
3102        else:
3103            deprecated_thing = "Passing `None` as the 'fields' parameter"
3104            example = f"`{typename} = NamedTuple({typename!r}, [])`"
3105            deprecation_msg = (
3106                "{name} is deprecated and will be disallowed in Python {remove}. "
3107                "To create a NamedTuple class with 0 fields "
3108                "using the functional syntax, "
3109                "pass an empty list, e.g. "
3110            ) + example + "."
3111    elif kwargs:
3112        raise TypeError("Either list of fields or keywords"
3113                        " can be provided to NamedTuple, not both")
3114    if fields is _sentinel or fields is None:
3115        import warnings
3116        warnings._deprecated(deprecated_thing, message=deprecation_msg, remove=(3, 15))
3117        fields = kwargs.items()
3118    nt = _make_nmtuple(typename, fields, module=_caller())
3119    nt.__orig_bases__ = (NamedTuple,)
3120    return nt
3121
3122_NamedTuple = type.__new__(NamedTupleMeta, 'NamedTuple', (), {})
3123
3124def _namedtuple_mro_entries(bases):
3125    assert NamedTuple in bases
3126    return (_NamedTuple,)
3127
3128NamedTuple.__mro_entries__ = _namedtuple_mro_entries
3129
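A brief sketch (not part of the module source) of the class syntax with a
default value and the runtime attributes of the generated class; ``Point``
is an illustrative name::

    from typing import NamedTuple

    class Point(NamedTuple):
        x: int
        y: int = 0  # defaulted fields must follow non-default fields

    p = Point(3)
    assert p == (3, 0)                       # still a plain tuple
    assert Point._fields == ('x', 'y')
    assert Point._field_defaults == {'y': 0}
    assert Point.__annotations__ == {'x': int, 'y': int}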
3130
3131def _get_typeddict_qualifiers(annotation_type):
3132    while True:
3133        annotation_origin = get_origin(annotation_type)
3134        if annotation_origin is Annotated:
3135            annotation_args = get_args(annotation_type)
3136            if annotation_args:
3137                annotation_type = annotation_args[0]
3138            else:
3139                break
3140        elif annotation_origin is Required:
3141            yield Required
3142            (annotation_type,) = get_args(annotation_type)
3143        elif annotation_origin is NotRequired:
3144            yield NotRequired
3145            (annotation_type,) = get_args(annotation_type)
3146        elif annotation_origin is ReadOnly:
3147            yield ReadOnly
3148            (annotation_type,) = get_args(annotation_type)
3149        else:
3150            break
3151
3152
3153class _TypedDictMeta(type):
3154    def __new__(cls, name, bases, ns, total=True):
3155        """Create a new typed dict class object.
3156
3157        This method is called when TypedDict is subclassed,
3158        or when TypedDict is instantiated. This way
3159        TypedDict supports all three syntax forms described in its docstring.
3160        Subclasses and instances of TypedDict return actual dictionaries.
3161        """
3162        for base in bases:
3163            if type(base) is not _TypedDictMeta and base is not Generic:
3164                raise TypeError('cannot inherit from both a TypedDict type '
3165                                'and a non-TypedDict base class')
3166
3167        if any(issubclass(b, Generic) for b in bases):
3168            generic_base = (Generic,)
3169        else:
3170            generic_base = ()
3171
3172        tp_dict = type.__new__(_TypedDictMeta, name, (*generic_base, dict), ns)
3173
3174        if not hasattr(tp_dict, '__orig_bases__'):
3175            tp_dict.__orig_bases__ = bases
3176
3177        annotations = {}
3178        own_annotations = ns.get('__annotations__', {})
3179        msg = "TypedDict('Name', {f0: t0, f1: t1, ...}); each t must be a type"
3180        own_annotations = {
3181            n: _type_check(tp, msg, module=tp_dict.__module__)
3182            for n, tp in own_annotations.items()
3183        }
3184        required_keys = set()
3185        optional_keys = set()
3186        readonly_keys = set()
3187        mutable_keys = set()
3188
3189        for base in bases:
3190            annotations.update(base.__dict__.get('__annotations__', {}))
3191
3192            base_required = base.__dict__.get('__required_keys__', set())
3193            required_keys |= base_required
3194            optional_keys -= base_required
3195
3196            base_optional = base.__dict__.get('__optional_keys__', set())
3197            required_keys -= base_optional
3198            optional_keys |= base_optional
3199
3200            readonly_keys.update(base.__dict__.get('__readonly_keys__', ()))
3201            mutable_keys.update(base.__dict__.get('__mutable_keys__', ()))
3202
3203        annotations.update(own_annotations)
3204        for annotation_key, annotation_type in own_annotations.items():
3205            qualifiers = set(_get_typeddict_qualifiers(annotation_type))
3206            if Required in qualifiers:
3207                is_required = True
3208            elif NotRequired in qualifiers:
3209                is_required = False
3210            else:
3211                is_required = total
3212
3213            if is_required:
3214                required_keys.add(annotation_key)
3215                optional_keys.discard(annotation_key)
3216            else:
3217                optional_keys.add(annotation_key)
3218                required_keys.discard(annotation_key)
3219
3220            if ReadOnly in qualifiers:
3221                if annotation_key in mutable_keys:
3222                    raise TypeError(
3223                        f"Cannot override mutable key {annotation_key!r}"
3224                        " with read-only key"
3225                    )
3226                readonly_keys.add(annotation_key)
3227            else:
3228                mutable_keys.add(annotation_key)
3229                readonly_keys.discard(annotation_key)
3230
3231        assert required_keys.isdisjoint(optional_keys), (
3232            f"Required keys overlap with optional keys in {name}:"
3233            f" {required_keys=}, {optional_keys=}"
3234        )
3235        tp_dict.__annotations__ = annotations
3236        tp_dict.__required_keys__ = frozenset(required_keys)
3237        tp_dict.__optional_keys__ = frozenset(optional_keys)
3238        tp_dict.__readonly_keys__ = frozenset(readonly_keys)
3239        tp_dict.__mutable_keys__ = frozenset(mutable_keys)
3240        tp_dict.__total__ = total
3241        return tp_dict
3242
3243    __call__ = dict  # static method
3244
3245    def __subclasscheck__(cls, other):
3246        # Typed dicts are only for static structural subtyping.
3247        raise TypeError('TypedDict does not support instance and class checks')
3248
3249    __instancecheck__ = __subclasscheck__
3250
3251
3252def TypedDict(typename, fields=_sentinel, /, *, total=True):
3253    """A simple typed namespace. At runtime it is equivalent to a plain dict.
3254
3255    TypedDict creates a dictionary type such that a type checker will expect all
3256    instances to have a certain set of keys, where each key is
3257    associated with a value of a consistent type. This expectation
3258    is not checked at runtime.
3259
3260    Usage::
3261
3262        >>> class Point2D(TypedDict):
3263        ...     x: int
3264        ...     y: int
3265        ...     label: str
3266        ...
3267        >>> a: Point2D = {'x': 1, 'y': 2, 'label': 'good'}  # OK
3268        >>> b: Point2D = {'z': 3, 'label': 'bad'}           # Fails type check
3269        >>> Point2D(x=1, y=2, label='first') == dict(x=1, y=2, label='first')
3270        True
3271
3272    The type info can be accessed via the Point2D.__annotations__ dict, and
3273    the Point2D.__required_keys__ and Point2D.__optional_keys__ frozensets.
3274    TypedDict supports an additional equivalent form::
3275
3276        Point2D = TypedDict('Point2D', {'x': int, 'y': int, 'label': str})
3277
3278    By default, all keys must be present in a TypedDict. It is possible
3279    to override this by specifying totality::
3280
3281        class Point2D(TypedDict, total=False):
3282            x: int
3283            y: int
3284
3285    This means that a Point2D TypedDict can have any of the keys omitted. A type
3286    checker is only expected to support a literal False or True as the value of
3287    the total argument. True is the default, and makes all items defined in the
3288    class body be required.
3289
3290    The Required and NotRequired special forms can also be used to mark
3291    individual keys as being required or not required::
3292
3293        class Point2D(TypedDict):
3294            x: int               # the "x" key must always be present (Required is the default)
3295            y: NotRequired[int]  # the "y" key can be omitted
3296
3297    See PEP 655 for more details on Required and NotRequired.
3298
3299    The ReadOnly special form can be used
3300    to mark individual keys as immutable for type checkers::
3301
3302        class DatabaseUser(TypedDict):
3303            id: ReadOnly[int]  # the "id" key must not be modified
3304            username: str      # the "username" key can be changed
3305
3306    """
3307    if fields is _sentinel or fields is None:
3308        import warnings
3309
3310        if fields is _sentinel:
3311            deprecated_thing = "Failing to pass a value for the 'fields' parameter"
3312        else:
3313            deprecated_thing = "Passing `None` as the 'fields' parameter"
3314
3315        example = f"`{typename} = TypedDict({typename!r}, {{{{}}}})`"
3316        deprecation_msg = (
3317            "{name} is deprecated and will be disallowed in Python {remove}. "
3318            "To create a TypedDict class with 0 fields "
3319            "using the functional syntax, "
3320            "pass an empty dictionary, e.g. "
3321        ) + example + "."
3322        warnings._deprecated(deprecated_thing, message=deprecation_msg, remove=(3, 15))
3323        fields = {}
3324
3325    ns = {'__annotations__': dict(fields)}
3326    module = _caller()
3327    if module is not None:
3328        # Setting correct module is necessary to make typed dict classes pickleable.
3329        ns['__module__'] = module
3330
3331    td = _TypedDictMeta(typename, (), ns, total=total)
3332    td.__orig_bases__ = (TypedDict,)
3333    return td
3334
3335_TypedDict = type.__new__(_TypedDictMeta, 'TypedDict', (), {})
3336TypedDict.__mro_entries__ = lambda bases: (_TypedDict,)
3337
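A small sketch (not part of the module source) showing how the special
attributes computed by the metaclass above can be inspected at runtime;
``Movie`` is an illustrative name::

    from typing import NotRequired, ReadOnly, TypedDict

    class Movie(TypedDict):
        id: ReadOnly[int]
        title: str
        year: NotRequired[int]

    assert Movie.__required_keys__ == frozenset({'id', 'title'})
    assert Movie.__optional_keys__ == frozenset({'year'})
    assert Movie.__readonly_keys__ == frozenset({'id'})
    assert Movie.__mutable_keys__ == frozenset({'title', 'year'})
    assert Movie.__total__ is True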
3338
3339@_SpecialForm
3340def Required(self, parameters):
3341    """Special typing construct to mark a TypedDict key as required.
3342
3343    This is mainly useful for total=False TypedDicts.
3344
3345    For example::
3346
3347        class Movie(TypedDict, total=False):
3348            title: Required[str]
3349            year: int
3350
3351        m = Movie(
3352            title='The Matrix',  # typechecker error if key is omitted
3353            year=1999,
3354        )
3355
3356    There is no runtime checking that a required key is actually provided
3357    when instantiating a related TypedDict.
3358    """
3359    item = _type_check(parameters, f'{self._name} accepts only a single type.')
3360    return _GenericAlias(self, (item,))
3361
3362
3363@_SpecialForm
3364def NotRequired(self, parameters):
3365    """Special typing construct to mark a TypedDict key as potentially missing.
3366
3367    For example::
3368
3369        class Movie(TypedDict):
3370            title: str
3371            year: NotRequired[int]
3372
3373        m = Movie(
3374            title='The Matrix',  # typechecker error if key is omitted
3375            year=1999,
3376        )
3377    """
3378    item = _type_check(parameters, f'{self._name} accepts only a single type.')
3379    return _GenericAlias(self, (item,))
3380
3381
3382@_SpecialForm
3383def ReadOnly(self, parameters):
3384    """A special typing construct to mark an item of a TypedDict as read-only.
3385
3386    For example::
3387
3388        class Movie(TypedDict):
3389            title: ReadOnly[str]
3390            year: int
3391
3392        def mutate_movie(m: Movie) -> None:
3393            m["year"] = 1992  # allowed
3394            m["title"] = "The Matrix"  # typechecker error
3395
3396    There is no runtime checking for this property.
3397    """
3398    item = _type_check(parameters, f'{self._name} accepts only a single type.')
3399    return _GenericAlias(self, (item,))
3400
3401
3402class NewType:
3403    """NewType creates simple unique types with almost zero runtime overhead.
3404
3405    NewType(name, tp) is considered a subtype of tp
3406    by static type checkers. At runtime, NewType(name, tp) returns
3407    a dummy callable that simply returns its argument.
3408
3409    Usage::
3410
3411        UserId = NewType('UserId', int)
3412
3413        def name_by_id(user_id: UserId) -> str:
3414            ...
3415
3416        UserId('user')          # Fails type check
3417
3418        name_by_id(42)          # Fails type check
3419        name_by_id(UserId(42))  # OK
3420
3421        num = UserId(5) + 1     # type: int
3422    """
3423
3424    __call__ = _idfunc
3425
3426    def __init__(self, name, tp):
3427        self.__qualname__ = name
3428        if '.' in name:
3429            name = name.rpartition('.')[-1]
3430        self.__name__ = name
3431        self.__supertype__ = tp
3432        def_mod = _caller()
3433        if def_mod != 'typing':
3434            self.__module__ = def_mod
3435
3436    def __mro_entries__(self, bases):
3437        # We defined __mro_entries__ to get a better error message
3438        # if a user attempts to subclass a NewType instance. bpo-46170
3439        superclass_name = self.__name__
3440
3441        class Dummy:
3442            def __init_subclass__(cls):
3443                subclass_name = cls.__name__
3444                raise TypeError(
3445                    f"Cannot subclass an instance of NewType. Perhaps you were looking for: "
3446                    f"`{subclass_name} = NewType({subclass_name!r}, {superclass_name})`"
3447                )
3448
3449        return (Dummy,)
3450
3451    def __repr__(self):
3452        return f'{self.__module__}.{self.__qualname__}'
3453
3454    def __reduce__(self):
3455        return self.__qualname__
3456
3457    def __or__(self, other):
3458        return Union[self, other]
3459
3460    def __ror__(self, other):
3461        return Union[other, self]
3462
3463
3464# Python-version-specific alias (Python 2: unicode; Python 3: str)
3465Text = str
3466
3467
3468# Constant that's True when type checking, but False here.
3469TYPE_CHECKING = False
3470
3471
3472class IO(Generic[AnyStr]):
3473    """Generic base class for TextIO and BinaryIO.
3474
3475    This is an abstract, generic version of the return of open().
3476
3477    NOTE: This does not distinguish between the different possible
3478    classes (text vs. binary, read vs. write vs. read/write,
3479    append-only, unbuffered).  The TextIO and BinaryIO subclasses
3480    below capture the distinctions between text vs. binary, which is
3481    pervasive in the interface; however we currently do not offer a
3482    way to track the other distinctions in the type system.
3483    """
3484
3485    __slots__ = ()
3486
3487    @property
3488    @abstractmethod
3489    def mode(self) -> str:
3490        pass
3491
3492    @property
3493    @abstractmethod
3494    def name(self) -> str:
3495        pass
3496
3497    @abstractmethod
3498    def close(self) -> None:
3499        pass
3500
3501    @property
3502    @abstractmethod
3503    def closed(self) -> bool:
3504        pass
3505
3506    @abstractmethod
3507    def fileno(self) -> int:
3508        pass
3509
3510    @abstractmethod
3511    def flush(self) -> None:
3512        pass
3513
3514    @abstractmethod
3515    def isatty(self) -> bool:
3516        pass
3517
3518    @abstractmethod
3519    def read(self, n: int = -1) -> AnyStr:
3520        pass
3521
3522    @abstractmethod
3523    def readable(self) -> bool:
3524        pass
3525
3526    @abstractmethod
3527    def readline(self, limit: int = -1) -> AnyStr:
3528        pass
3529
3530    @abstractmethod
3531    def readlines(self, hint: int = -1) -> List[AnyStr]:
3532        pass
3533
3534    @abstractmethod
3535    def seek(self, offset: int, whence: int = 0) -> int:
3536        pass
3537
3538    @abstractmethod
3539    def seekable(self) -> bool:
3540        pass
3541
3542    @abstractmethod
3543    def tell(self) -> int:
3544        pass
3545
3546    @abstractmethod
3547    def truncate(self, size: int = None) -> int:
3548        pass
3549
3550    @abstractmethod
3551    def writable(self) -> bool:
3552        pass
3553
3554    @abstractmethod
3555    def write(self, s: AnyStr) -> int:
3556        pass
3557
3558    @abstractmethod
3559    def writelines(self, lines: List[AnyStr]) -> None:
3560        pass
3561
3562    @abstractmethod
3563    def __enter__(self) -> 'IO[AnyStr]':
3564        pass
3565
3566    @abstractmethod
3567    def __exit__(self, type, value, traceback) -> None:
3568        pass
3569
3570
3571class BinaryIO(IO[bytes]):
3572    """Typed version of the return of open() in binary mode."""
3573
3574    __slots__ = ()
3575
3576    @abstractmethod
3577    def write(self, s: Union[bytes, bytearray]) -> int:
3578        pass
3579
3580    @abstractmethod
3581    def __enter__(self) -> 'BinaryIO':
3582        pass
3583
3584
3585class TextIO(IO[str]):
3586    """Typed version of the return of open() in text mode."""
3587
3588    __slots__ = ()
3589
3590    @property
3591    @abstractmethod
3592    def buffer(self) -> BinaryIO:
3593        pass
3594
3595    @property
3596    @abstractmethod
3597    def encoding(self) -> str:
3598        pass
3599
3600    @property
3601    @abstractmethod
3602    def errors(self) -> Optional[str]:
3603        pass
3604
3605    @property
3606    @abstractmethod
3607    def line_buffering(self) -> bool:
3608        pass
3609
3610    @property
3611    @abstractmethod
3612    def newlines(self) -> Any:
3613        pass
3614
3615    @abstractmethod
3616    def __enter__(self) -> 'TextIO':
3617        pass
3618
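A usage sketch (not part of the module source): the IO hierarchy is mainly
used to annotate file-like parameters, with AnyStr tying the element type to
the stream's mode; ``write_report`` and ``copy_lines`` are illustrative names::

    import sys
    from typing import IO, TextIO

    def write_report(out: IO[str]) -> None:
        out.write("report\n")

    def copy_lines(src: TextIO, dst: TextIO) -> None:
        for line in src:
            dst.write(line)

    write_report(sys.stdout)  # sys.stdout is a TextIO at type-check time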
3619
3620def reveal_type[T](obj: T, /) -> T:
3621    """Ask a static type checker to reveal the inferred type of an expression.
3622
3623    When a static type checker encounters a call to ``reveal_type()``,
3624    it will emit the inferred type of the argument::
3625
3626        x: int = 1
3627        reveal_type(x)
3628
3629    Running a static type checker (e.g., mypy) on this example
3630    will produce output similar to 'Revealed type is "builtins.int"'.
3631
3632    At runtime, the function prints the runtime type of the
3633    argument and returns the argument unchanged.
3634    """
3635    print(f"Runtime type is {type(obj).__name__!r}", file=sys.stderr)
3636    return obj
3637
3638
3639class _IdentityCallable(Protocol):
3640    def __call__[T](self, arg: T, /) -> T:
3641        ...
3642
3643
3644def dataclass_transform(
3645    *,
3646    eq_default: bool = True,
3647    order_default: bool = False,
3648    kw_only_default: bool = False,
3649    frozen_default: bool = False,
3650    field_specifiers: tuple[type[Any] | Callable[..., Any], ...] = (),
3651    **kwargs: Any,
3652) -> _IdentityCallable:
3653    """Decorator to mark an object as providing dataclass-like behaviour.
3654
3655    The decorator can be applied to a function, class, or metaclass.
3656
3657    Example usage with a decorator function::
3658
3659        @dataclass_transform()
3660        def create_model[T](cls: type[T]) -> type[T]:
3661            ...
3662            return cls
3663
3664        @create_model
3665        class CustomerModel:
3666            id: int
3667            name: str
3668
3669    On a base class::
3670
3671        @dataclass_transform()
3672        class ModelBase: ...
3673
3674        class CustomerModel(ModelBase):
3675            id: int
3676            name: str
3677
3678    On a metaclass::
3679
3680        @dataclass_transform()
3681        class ModelMeta(type): ...
3682
3683        class ModelBase(metaclass=ModelMeta): ...
3684
3685        class CustomerModel(ModelBase):
3686            id: int
3687            name: str
3688
3689    The ``CustomerModel`` classes defined above will
3690    be treated by type checkers similarly to classes created with
3691    ``@dataclasses.dataclass``.
3692    For example, type checkers will assume these classes have
3693    ``__init__`` methods that accept ``id`` and ``name``.
3694
3695    The arguments to this decorator can be used to customize this behavior:
3696    - ``eq_default`` indicates whether the ``eq`` parameter is assumed to be
3697        ``True`` or ``False`` if it is omitted by the caller.
3698    - ``order_default`` indicates whether the ``order`` parameter is
3699        assumed to be True or False if it is omitted by the caller.
3700    - ``kw_only_default`` indicates whether the ``kw_only`` parameter is
3701        assumed to be True or False if it is omitted by the caller.
3702    - ``frozen_default`` indicates whether the ``frozen`` parameter is
3703        assumed to be True or False if it is omitted by the caller.
3704    - ``field_specifiers`` specifies a static list of supported classes
3705        or functions that describe fields, similar to ``dataclasses.field()``.
3706    - Arbitrary other keyword arguments are accepted in order to allow for
3707        possible future extensions.
3708
3709    At runtime, this decorator records its arguments in the
3710    ``__dataclass_transform__`` attribute on the decorated object.
3711    It has no other runtime effect.
3712
3713    See PEP 681 for more details.
3714    """
3715    def decorator(cls_or_fn):
3716        cls_or_fn.__dataclass_transform__ = {
3717            "eq_default": eq_default,
3718            "order_default": order_default,
3719            "kw_only_default": kw_only_default,
3720            "frozen_default": frozen_default,
3721            "field_specifiers": field_specifiers,
3722            "kwargs": kwargs,
3723        }
3724        return cls_or_fn
3725    return decorator
3726
3727
3728type _Func = Callable[..., Any]
3729
3730
3731def override[F: _Func](method: F, /) -> F:
3732    """Indicate that a method is intended to override a method in a base class.
3733
3734    Usage::
3735
3736        class Base:
3737            def method(self) -> None:
3738                pass
3739
3740        class Child(Base):
3741            @override
3742            def method(self) -> None:
3743                super().method()
3744
3745    When this decorator is applied to a method, the type checker will
3746    validate that it overrides a method or attribute with the same name on a
3747    base class.  This helps prevent bugs that may occur when a base class is
3748    changed without an equivalent change to a child class.
3749
3750    There is no runtime checking of this property. The decorator attempts to
3751    set the ``__override__`` attribute to ``True`` on the decorated object to
3752    allow runtime introspection.
3753
3754    See PEP 698 for details.
3755    """
3756    try:
3757        method.__override__ = True
3758    except (AttributeError, TypeError):
3759        # Skip the attribute silently if it is not writable.
3760        # AttributeError happens if the object has __slots__ or a
3761        # read-only property, TypeError if it's a builtin class.
3762        pass
3763    return method
3764
3765
3766def is_protocol(tp: type, /) -> bool:
3767    """Return True if the given type is a Protocol.
3768
3769    Example::
3770
3771        >>> from typing import Protocol, is_protocol
3772        >>> class P(Protocol):
3773        ...     def a(self) -> str: ...
3774        ...     b: int
3775        >>> is_protocol(P)
3776        True
3777        >>> is_protocol(int)
3778        False
3779    """
3780    return (
3781        isinstance(tp, type)
3782        and getattr(tp, '_is_protocol', False)
3783        and tp != Protocol
3784    )
3785
3786
3787def get_protocol_members(tp: type, /) -> frozenset[str]:
3788    """Return the set of members defined in a Protocol.
3789
3790    Example::
3791
3792        >>> from typing import Protocol, get_protocol_members
3793        >>> class P(Protocol):
3794        ...     def a(self) -> str: ...
3795        ...     b: int
3796        >>> get_protocol_members(P) == frozenset({'a', 'b'})
3797        True
3798
3799    Raise a TypeError for arguments that are not Protocols.
3800    """
3801    if not is_protocol(tp):
3802        raise TypeError(f'{tp!r} is not a Protocol')
3803    return frozenset(tp.__protocol_attrs__)
3804
3805
3806def __getattr__(attr):
3807    """Improve the import time of the typing module.
3808
3809    Soft-deprecated objects which are costly to create
3810    are only created on-demand here.
3811    """
3812    if attr in {"Pattern", "Match"}:
3813        import re
3814        obj = _alias(getattr(re, attr), 1)
3815    elif attr in {"ContextManager", "AsyncContextManager"}:
3816        import contextlib
3817        obj = _alias(getattr(contextlib, f"Abstract{attr}"), 2, name=attr, defaults=(bool | None,))
3818    elif attr == "_collect_parameters":
3819        import warnings
3820
3821        depr_message = (
3822            "The private _collect_parameters function is deprecated and will be"
3823            " removed in a future version of Python. Any use of private functions"
3824            " is discouraged and may break in the future."
3825        )
3826        warnings.warn(depr_message, category=DeprecationWarning, stacklevel=2)
3827        obj = _collect_type_parameters
3828    else:
3829        raise AttributeError(f"module {__name__!r} has no attribute {attr!r}")
3830    globals()[attr] = obj
3831    return obj
Annotated = Annotated

Add context-specific metadata to a type.

Example: Annotated[int, runtime_check.Unsigned] indicates to the hypothetical runtime_check module that this type is an unsigned int. Every other consumer of this type can ignore this metadata and treat this type as int.

The first argument to Annotated must be a valid type.

Details:

  • It's an error to call Annotated with less than two arguments.
  • Access the metadata via the __metadata__ attribute::

    assert Annotated[int, '$'].__metadata__ == ('$',)

  • Nested Annotated types are flattened::

    assert Annotated[Annotated[T, Ann1, Ann2], Ann3] == Annotated[T, Ann1, Ann2, Ann3]

  • Instantiating an annotated type is equivalent to instantiating the underlying type::

    assert Annotated[C, Ann1](5) == C(5)

  • Annotated can be used as a generic type alias::

    type Optimized[T] = Annotated[T, runtime.Optimize()]
    # type checker will treat Optimized[int]
    # as equivalent to Annotated[int, runtime.Optimize()]

    type OptimizedList[T] = Annotated[list[T], runtime.Optimize()]
    # type checker will treat OptimizedList[int]
    # as equivalent to Annotated[list[int], runtime.Optimize()]

  • Annotated cannot be used with an unpacked TypeVarTuple::

    type Variadic[*Ts] = Annotated[*Ts, Ann1] # NOT valid

    This would be equivalent to::

    Annotated[T1, T2, T3, ..., Ann1]

    where T1, T2 etc. are TypeVars, which would be invalid, because only one type should be passed to Annotated.
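
A runtime sketch (not shown on this page) of retrieving the metadata;
``MaxLen`` and ``greet`` are hypothetical names used only for illustration::

    from typing import Annotated, get_type_hints

    class MaxLen:  # hypothetical metadata marker
        def __init__(self, n: int) -> None:
            self.n = n

    def greet(name: Annotated[str, MaxLen(10)]) -> str:
        return "Hello, " + name

    assert Annotated[str, MaxLen(10)].__metadata__[0].n == 10

    hints = get_type_hints(greet, include_extras=True)
    assert hints["name"].__metadata__[0].n == 10
    assert get_type_hints(greet)["name"] is str  # metadata stripped by default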

class Any:
599class Any(metaclass=_AnyMeta):
600    """Special type indicating an unconstrained type.
601
602    - Any is compatible with every type.
603    - Any assumed to have all methods.
604    - All values assumed to be instances of Any.
605
606    Note that all the above statements are true from the point of view of
607    static type checkers. At runtime, Any should not be used with instance
608    checks.
609    """
610
611    def __new__(cls, *args, **kwargs):
612        if cls is Any:
613            raise TypeError("Any cannot be instantiated")
614        return super().__new__(cls)

Special type indicating an unconstrained type.

  • Any is compatible with every type.
  • Any assumed to have all methods.
  • All values assumed to be instances of Any.

Note that all the above statements are true from the point of view of static type checkers. At runtime, Any should not be used with instance checks.

Callable = Callable
ClassVar = ClassVar

Special type construct to mark class variables.

An annotation wrapped in ClassVar indicates that a given attribute is intended to be used as a class variable and should not be set on instances of that class.

Usage::

class Starship:
    stats: ClassVar[dict[str, int]] = {} # class variable
    damage: int = 10                     # instance variable

ClassVar accepts only types and cannot be further subscribed.

Note that ClassVar is not a class itself, and should not be used with isinstance() or issubclass().

Concatenate = Concatenate

Special form for annotating higher-order functions.

Concatenate can be used in conjunction with ParamSpec and Callable to represent a higher-order function which adds, removes or transforms the parameters of a callable.

For example::

Callable[Concatenate[int, P], int]

See PEP 612 for detailed information.
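
For instance, a decorator that supplies the first argument of the wrapped
callable can be annotated as follows (a sketch, not part of the page above;
``with_prefix`` and ``emit`` are illustrative names)::

    from typing import Callable, Concatenate

    def with_prefix[**P, R](f: Callable[Concatenate[str, P], R]) -> Callable[P, R]:
        # Injects a fixed first argument, removing it from the signature
        # that callers of the returned function see.
        def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
            return f("[log] ", *args, **kwargs)
        return wrapper

    @with_prefix
    def emit(prefix: str, message: str) -> str:
        return prefix + message

    assert emit("hello") == "[log] hello"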

Final = Final

Special typing construct to indicate final names to type checkers.

A final name cannot be re-assigned or overridden in a subclass.

For example::

MAX_SIZE: Final = 9000
MAX_SIZE += 1  # Error reported by type checker

class Connection:
    TIMEOUT: Final[int] = 10

class FastConnector(Connection):
    TIMEOUT = 1  # Error reported by type checker

There is no runtime checking of these properties.

class ForwardRef(_Final):
1016class ForwardRef(_Final, _root=True):
1017    """Internal wrapper to hold a forward reference."""
1018
1019    __slots__ = ('__forward_arg__', '__forward_code__',
1020                 '__forward_evaluated__', '__forward_value__',
1021                 '__forward_is_argument__', '__forward_is_class__',
1022                 '__forward_module__')
1023
1024    def __init__(self, arg, is_argument=True, module=None, *, is_class=False):
1025        if not isinstance(arg, str):
1026            raise TypeError(f"Forward reference must be a string -- got {arg!r}")
1027
1028        # If we do `def f(*args: *Ts)`, then we'll have `arg = '*Ts'`.
1029        # Unfortunately, this isn't a valid expression on its own, so we
1030        # do the unpacking manually.
1031        if arg.startswith('*'):
1032            arg_to_compile = f'({arg},)[0]'  # E.g. (*Ts,)[0] or (*tuple[int, int],)[0]
1033        else:
1034            arg_to_compile = arg
1035        try:
1036            code = compile(arg_to_compile, '<string>', 'eval')
1037        except SyntaxError:
1038            raise SyntaxError(f"Forward reference must be an expression -- got {arg!r}")
1039
1040        self.__forward_arg__ = arg
1041        self.__forward_code__ = code
1042        self.__forward_evaluated__ = False
1043        self.__forward_value__ = None
1044        self.__forward_is_argument__ = is_argument
1045        self.__forward_is_class__ = is_class
1046        self.__forward_module__ = module
1047
1048    def _evaluate(self, globalns, localns, type_params=_sentinel, *, recursive_guard):
1049        if type_params is _sentinel:
1050            _deprecation_warning_for_no_type_params_passed("typing.ForwardRef._evaluate")
1051            type_params = ()
1052        if self.__forward_arg__ in recursive_guard:
1053            return self
1054        if not self.__forward_evaluated__ or localns is not globalns:
1055            if globalns is None and localns is None:
1056                globalns = localns = {}
1057            elif globalns is None:
1058                globalns = localns
1059            elif localns is None:
1060                localns = globalns
1061            if self.__forward_module__ is not None:
1062                globalns = getattr(
1063                    sys.modules.get(self.__forward_module__, None), '__dict__', globalns
1064                )
1065
1066            # type parameters require some special handling,
1067            # as they exist in their own scope
1068            # but `eval()` does not have a dedicated parameter for that scope.
1069            # For classes, names in type parameter scopes should override
1070            # names in the global scope (which here are called `localns`!),
1071            # but should in turn be overridden by names in the class scope
1072            # (which here are called `globalns`!)
1073            if type_params:
1074                globalns, localns = dict(globalns), dict(localns)
1075                for param in type_params:
1076                    param_name = param.__name__
1077                    if not self.__forward_is_class__ or param_name not in globalns:
1078                        globalns[param_name] = param
1079                        localns.pop(param_name, None)
1080
1081            type_ = _type_check(
1082                eval(self.__forward_code__, globalns, localns),
1083                "Forward references must evaluate to types.",
1084                is_argument=self.__forward_is_argument__,
1085                allow_special_forms=self.__forward_is_class__,
1086            )
1087            self.__forward_value__ = _eval_type(
1088                type_,
1089                globalns,
1090                localns,
1091                type_params,
1092                recursive_guard=(recursive_guard | {self.__forward_arg__}),
1093            )
1094            self.__forward_evaluated__ = True
1095        return self.__forward_value__
1096
1097    def __eq__(self, other):
1098        if not isinstance(other, ForwardRef):
1099            return NotImplemented
1100        if self.__forward_evaluated__ and other.__forward_evaluated__:
1101            return (self.__forward_arg__ == other.__forward_arg__ and
1102                    self.__forward_value__ == other.__forward_value__)
1103        return (self.__forward_arg__ == other.__forward_arg__ and
1104                self.__forward_module__ == other.__forward_module__)
1105
1106    def __hash__(self):
1107        return hash((self.__forward_arg__, self.__forward_module__))
1108
1109    def __or__(self, other):
1110        return Union[self, other]
1111
1112    def __ror__(self, other):
1113        return Union[other, self]
1114
1115    def __repr__(self):
1116        if self.__forward_module__ is None:
1117            module_repr = ''
1118        else:
1119            module_repr = f', module={self.__forward_module__!r}'
1120        return f'ForwardRef({self.__forward_arg__!r}{module_repr})'

Internal wrapper to hold a forward reference.

ForwardRef(arg, is_argument=True, module=None, *, is_class=False)
class Generic:

Abstract base class for generic types.

On Python 3.12 and newer, generic classes implicitly inherit from Generic when they declare a parameter list after the class's name::

class Mapping[KT, VT]:
    def __getitem__(self, key: KT) -> VT:
        ...
    # Etc.

On older versions of Python, however, generic classes have to explicitly inherit from Generic.

After a class has been declared to be generic, it can then be used as follows::

def lookup_name[KT, VT](mapping: Mapping[KT, VT], key: KT, default: VT) -> VT:
    try:
        return mapping[key]
    except KeyError:
        return default
Literal = Literal

Special typing form to define literal types (a.k.a. value types).

This form can be used to indicate to type checkers that the corresponding variable or function parameter has a value equivalent to the provided literal (or one of several literals)::

def validate_simple(data: Any) -> Literal[True]:  # always returns True
    ...

MODE = Literal['r', 'rb', 'w', 'wb']
def open_helper(file: str, mode: MODE) -> str:
    ...

open_helper('/some/path', 'r')  # Passes type check
open_helper('/other/path', 'typo')  # Error in type checker

Literal[...] cannot be subclassed. At runtime, an arbitrary value is allowed as type argument to Literal[...], but type checkers may impose restrictions.

Optional = Optional

Optional[X] is equivalent to Union[X, None].
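
A brief sketch (not part of the page above; ``find_user`` is an illustrative name)::

    from typing import Optional

    def find_user(users: dict[int, str], user_id: int) -> Optional[str]:
        # Same as `-> str | None` or `-> Union[str, None]`.
        return users.get(user_id)

    assert find_user({1: "alice"}, 2) is None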

class ParamSpec:

Parameter specification variable.

The preferred way to construct a parameter specification is via the dedicated syntax for generic functions, classes, and type aliases, where the use of '**' creates a parameter specification::

type IntFunc[**P] = Callable[P, int]

The following syntax creates a parameter specification that defaults to a callable accepting two positional-only arguments of types int and str:

type IntFuncDefault[**P = (int, str)] = Callable[P, int]

For compatibility with Python 3.11 and earlier, ParamSpec objects can also be created as follows::

P = ParamSpec('P')
DefaultP = ParamSpec('DefaultP', default=(int, str))

Parameter specification variables exist primarily for the benefit of static type checkers. They are used to forward the parameter types of one callable to another callable, a pattern commonly found in higher-order functions and decorators. They are only valid when used in Concatenate, or as the first argument to Callable, or as parameters for user-defined Generics. See class Generic for more information on generic types.

An example for annotating a decorator::

def add_logging[**P, T](f: Callable[P, T]) -> Callable[P, T]:
    '''A type-safe decorator to add logging to a function.'''
    def inner(*args: P.args, **kwargs: P.kwargs) -> T:
        logging.info(f'{f.__name__} was called')
        return f(*args, **kwargs)
    return inner

@add_logging
def add_two(x: float, y: float) -> float:
    '''Add two numbers together.'''
    return x + y

Parameter specification variables can be introspected. e.g.::

>>> P = ParamSpec("P")
>>> P.__name__
'P'

Note that only parameter specification variables defined in the global scope can be pickled.

def has_default(self, /):

Return whether the parameter specification has a default value.

args

Represents positional arguments.

kwargs

Represents keyword arguments.

class Protocol():
2175class Protocol(Generic, metaclass=_ProtocolMeta):
2176    """Base class for protocol classes.
2177
2178    Protocol classes are defined as::
2179
2180        class Proto(Protocol):
2181            def meth(self) -> int:
2182                ...
2183
2184    Such classes are primarily used with static type checkers that recognize
2185    structural subtyping (static duck-typing).
2186
2187    For example::
2188
2189        class C:
2190            def meth(self) -> int:
2191                return 0
2192
2193        def func(x: Proto) -> int:
2194            return x.meth()
2195
2196        func(C())  # Passes static type check
2197
2198    See PEP 544 for details. Protocol classes decorated with
2199    @typing.runtime_checkable act as simple-minded runtime protocols that check
2200    only the presence of given attributes, ignoring their type signatures.
2201    Protocol classes can be generic, they are defined as::
2202
2203        class GenProto[T](Protocol):
2204            def meth(self) -> T:
2205                ...
2206    """
2207
2208    __slots__ = ()
2209    _is_protocol = True
2210    _is_runtime_protocol = False
2211
2212    def __init_subclass__(cls, *args, **kwargs):
2213        super().__init_subclass__(*args, **kwargs)
2214
2215        # Determine if this is a protocol or a concrete subclass.
2216        if not cls.__dict__.get('_is_protocol', False):
2217            cls._is_protocol = any(b is Protocol for b in cls.__bases__)
2218
2219        # Set (or override) the protocol subclass hook.
2220        if '__subclasshook__' not in cls.__dict__:
2221            cls.__subclasshook__ = _proto_hook
2222
2223        # Prohibit instantiation for protocol classes
2224        if cls._is_protocol and cls.__init__ is Protocol.__init__:
2225            cls.__init__ = _no_init_or_replace_init

Base class for protocol classes.

Protocol classes are defined as::

class Proto(Protocol):
    def meth(self) -> int:
        ...

Such classes are primarily used with static type checkers that recognize structural subtyping (static duck-typing).

For example::

class C:
    def meth(self) -> int:
        return 0

def func(x: Proto) -> int:
    return x.meth()

func(C())  # Passes static type check

See PEP 544 for details. Protocol classes decorated with @typing.runtime_checkable act as simple-minded runtime protocols that check only the presence of given attributes, ignoring their type signatures. Protocol classes can be generic, they are defined as::

class GenProto[T](Protocol):
    def meth(self) -> T:
        ...
Tuple = Tuple
Type = Type
class TypeVar:

Type variable.

The preferred way to construct a type variable is via the dedicated syntax for generic functions, classes, and type aliases::

class Sequence[T]:  # T is a TypeVar
    ...

This syntax can also be used to create bound and constrained type variables::

# S is a TypeVar bound to str
class StrSequence[S: str]:
    ...

# A is a TypeVar constrained to str or bytes
class StrOrBytesSequence[A: (str, bytes)]:
    ...

Type variables can also have defaults:

class IntDefault[T = int]: ...

However, if desired, reusable type variables can also be constructed manually, like so::

T = TypeVar('T') # Can be anything
S = TypeVar('S', bound=str) # Can be any subtype of str
A = TypeVar('A', str, bytes) # Must be exactly str or bytes
D = TypeVar('D', default=int) # Defaults to int

Type variables exist primarily for the benefit of static type checkers. They serve as the parameters for generic types as well as for generic function and type alias definitions.

The variance of type variables is inferred by type checkers when they are created through the type parameter syntax and when infer_variance=True is passed. Manually created type variables may be explicitly marked covariant or contravariant by passing covariant=True or contravariant=True. By default, manually created type variables are invariant. See PEP 484 and PEP 695 for more details.

def has_default(self, /):

Return whether the type variable has a default value.

class TypeVarTuple:

Type variable tuple. A specialized form of type variable that enables variadic generics.

The preferred way to construct a type variable tuple is via the dedicated syntax for generic functions, classes, and type aliases, where a single '*' indicates a type variable tuple::

def move_first_element_to_last[T, *Ts](tup: tuple[T, *Ts]) -> tuple[*Ts, T]:
    return (*tup[1:], tup[0])

Type variable tuples can have default values:

type AliasWithDefault[*Ts = (str, int)] = tuple[*Ts]

For compatibility with Python 3.11 and earlier, TypeVarTuple objects can also be created as follows::

Ts = TypeVarTuple('Ts')  # Can be given any name
DefaultTs = TypeVarTuple('Ts', default=(str, int))

Just as a TypeVar (type variable) is a placeholder for a single type, a TypeVarTuple is a placeholder for an arbitrary number of types. For example, if we define a generic class using a TypeVarTuple::

class C[*Ts]: ...

Then we can parameterize that class with an arbitrary number of type arguments::

C[int]       # Fine
C[int, str]  # Also fine
C[()]        # Even this is fine

For more details, see PEP 646.

Note that only TypeVarTuples defined in the global scope can be pickled.

def has_default(self, /):

Return whether the type variable tuple has a default value.

Union = Union

Union type; Union[X, Y] means either X or Y.

On Python 3.10 and higher, the | operator can also be used to denote unions; X | Y means the same thing to the type checker as Union[X, Y].

To define a union, use e.g. Union[int, str]. Details:

  • The arguments must be types and there must be at least one.
  • None as an argument is a special case and is replaced by type(None).
  • Unions of unions are flattened, e.g.::

    assert Union[Union[int, str], float] == Union[int, str, float]
  • Unions of a single argument vanish, e.g.::

    assert Union[int] == int # The constructor actually returns int

  • Redundant arguments are skipped, e.g.::

    assert Union[int, str, int] == Union[int, str]

  • When comparing unions, the argument order is ignored, e.g.::

    assert Union[int, str] == Union[str, int]

  • You cannot subclass or instantiate a union.

  • You can use Optional as a shorthand for Union[X, None].
AbstractSet = AbstractSet
ByteString = ByteString
Container = Container
ContextManager = ContextManager
Hashable = Hashable
ItemsView = ItemsView
Iterable = Iterable
Iterator = Iterator
KeysView = KeysView
Mapping = Mapping
MappingView = MappingView
MutableMapping = MutableMapping
MutableSequence = MutableSequence
MutableSet = MutableSet
Sequence = Sequence
Sized = Sized
ValuesView = ValuesView
Awaitable = Awaitable
AsyncIterator = AsyncIterator
AsyncIterable = AsyncIterable
Coroutine = Coroutine
Collection = Collection
AsyncGenerator = AsyncGenerator
AsyncContextManager = AsyncContextManager
Reversible = Reversible
@runtime_checkable
class SupportsAbs(Protocol, typing.Generic[T]):

An ABC with one abstract method __abs__ that is covariant in its return type.

@runtime_checkable
class SupportsBytes(Protocol):

An ABC with one abstract method __bytes__.

@runtime_checkable
class SupportsComplex(Protocol):

An ABC with one abstract method __complex__.

@runtime_checkable
class SupportsFloat(Protocol):
2926@runtime_checkable
2927class SupportsFloat(Protocol):
2928    """An ABC with one abstract method __float__."""
2929
2930    __slots__ = ()
2931
2932    @abstractmethod
2933    def __float__(self) -> float:
2934        pass

An ABC with one abstract method __float__.

@runtime_checkable
class SupportsIndex(Protocol):
2959@runtime_checkable
2960class SupportsIndex(Protocol):
2961    """An ABC with one abstract method __index__."""
2962
2963    __slots__ = ()
2964
2965    @abstractmethod
2966    def __index__(self) -> int:
2967        pass

An ABC with one abstract method __index__.

@runtime_checkable
class SupportsInt(Protocol):
2915@runtime_checkable
2916class SupportsInt(Protocol):
2917    """An ABC with one abstract method __int__."""
2918
2919    __slots__ = ()
2920
2921    @abstractmethod
2922    def __int__(self) -> int:
2923        pass

An ABC with one abstract method __int__.

@runtime_checkable
class SupportsRound(Protocol, typing.Generic[T]):
2981@runtime_checkable
2982class SupportsRound[T](Protocol):
2983    """An ABC with one abstract method __round__ that is covariant in its return type."""
2984
2985    __slots__ = ()
2986
2987    @abstractmethod
2988    def __round__(self, ndigits: int = 0) -> T:
2989        pass

An ABC with one abstract method __round__ that is covariant in its return type.
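
Because these protocols are decorated with @runtime_checkable, they support isinstance() checks that test only for the presence of the required method; a minimal sketch::

from typing import SupportsAbs, SupportsFloat, SupportsIndex

assert isinstance(-3, SupportsAbs)            # int defines __abs__
assert isinstance(3, SupportsFloat)           # int defines __float__
assert isinstance(3, SupportsIndex)           # int defines __index__
assert not isinstance('text', SupportsIndex)  # str does not define __index__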

ChainMap = ChainMap
Counter = Counter
Deque = Deque
Dict = Dict
DefaultDict = DefaultDict
List = List
OrderedDict = OrderedDict
Set = Set
FrozenSet = FrozenSet

def NamedTuple(typename, fields=<sentinel>, /, **kwargs):
3061def NamedTuple(typename, fields=_sentinel, /, **kwargs):
3062    """Typed version of namedtuple.
3063
3064    Usage::
3065
3066        class Employee(NamedTuple):
3067            name: str
3068            id: int
3069
3070    This is equivalent to::
3071
3072        Employee = collections.namedtuple('Employee', ['name', 'id'])
3073
3074    The resulting class has an extra __annotations__ attribute, giving a
3075    dict that maps field names to types.  (The field names are also in
3076    the _fields attribute, which is part of the namedtuple API.)
3077    An alternative equivalent functional syntax is also accepted::
3078
3079        Employee = NamedTuple('Employee', [('name', str), ('id', int)])
3080    """
3081    if fields is _sentinel:
3082        if kwargs:
3083            deprecated_thing = "Creating NamedTuple classes using keyword arguments"
3084            deprecation_msg = (
3085                "{name} is deprecated and will be disallowed in Python {remove}. "
3086                "Use the class-based or functional syntax instead."
3087            )
3088        else:
3089            deprecated_thing = "Failing to pass a value for the 'fields' parameter"
3090            example = f"`{typename} = NamedTuple({typename!r}, [])`"
3091            deprecation_msg = (
3092                "{name} is deprecated and will be disallowed in Python {remove}. "
3093                "To create a NamedTuple class with 0 fields "
3094                "using the functional syntax, "
3095                "pass an empty list, e.g. "
3096            ) + example + "."
3097    elif fields is None:
3098        if kwargs:
3099            raise TypeError(
3100                "Cannot pass `None` as the 'fields' parameter "
3101                "and also specify fields using keyword arguments"
3102            )
3103        else:
3104            deprecated_thing = "Passing `None` as the 'fields' parameter"
3105            example = f"`{typename} = NamedTuple({typename!r}, [])`"
3106            deprecation_msg = (
3107                "{name} is deprecated and will be disallowed in Python {remove}. "
3108                "To create a NamedTuple class with 0 fields "
3109                "using the functional syntax, "
3110                "pass an empty list, e.g. "
3111            ) + example + "."
3112    elif kwargs:
3113        raise TypeError("Either list of fields or keywords"
3114                        " can be provided to NamedTuple, not both")
3115    if fields is _sentinel or fields is None:
3116        import warnings
3117        warnings._deprecated(deprecated_thing, message=deprecation_msg, remove=(3, 15))
3118        fields = kwargs.items()
3119    nt = _make_nmtuple(typename, fields, module=_caller())
3120    nt.__orig_bases__ = (NamedTuple,)
3121    return nt

Typed version of namedtuple.

Usage::

class Employee(NamedTuple):
    name: str
    id: int

This is equivalent to::

Employee = collections.namedtuple('Employee', ['name', 'id'])

The resulting class has an extra __annotations__ attribute, giving a dict that maps field names to types. (The field names are also in the _fields attribute, which is part of the namedtuple API.) An alternative equivalent functional syntax is also accepted::

Employee = NamedTuple('Employee', [('name', str), ('id', int)])
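
A minimal runnable sketch of the resulting class (the field values are illustrative)::

from typing import NamedTuple

class Employee(NamedTuple):
    name: str
    id: int

e = Employee(name='Guido', id=1)
assert e.name == 'Guido' and e[1] == 1                       # attribute and index access
assert Employee._fields == ('name', 'id')                    # namedtuple API
assert Employee.__annotations__ == {'name': str, 'id': int}  # extra typing metadata
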
def TypedDict(typename, fields=<sentinel>, /, *, total=True):
3253def TypedDict(typename, fields=_sentinel, /, *, total=True):
3254    """A simple typed namespace. At runtime it is equivalent to a plain dict.
3255
3256    TypedDict creates a dictionary type such that a type checker will expect all
3257    instances to have a certain set of keys, where each key is
3258    associated with a value of a consistent type. This expectation
3259    is not checked at runtime.
3260
3261    Usage::
3262
3263        >>> class Point2D(TypedDict):
3264        ...     x: int
3265        ...     y: int
3266        ...     label: str
3267        ...
3268        >>> a: Point2D = {'x': 1, 'y': 2, 'label': 'good'}  # OK
3269        >>> b: Point2D = {'z': 3, 'label': 'bad'}           # Fails type check
3270        >>> Point2D(x=1, y=2, label='first') == dict(x=1, y=2, label='first')
3271        True
3272
3273    The type info can be accessed via the Point2D.__annotations__ dict, and
3274    the Point2D.__required_keys__ and Point2D.__optional_keys__ frozensets.
3275    TypedDict supports an additional equivalent form::
3276
3277        Point2D = TypedDict('Point2D', {'x': int, 'y': int, 'label': str})
3278
3279    By default, all keys must be present in a TypedDict. It is possible
3280    to override this by specifying totality::
3281
3282        class Point2D(TypedDict, total=False):
3283            x: int
3284            y: int
3285
3286    This means that a Point2D TypedDict can have any of the keys omitted. A type
3287    checker is only expected to support a literal False or True as the value of
3288    the total argument. True is the default, and makes all items defined in the
3289    class body be required.
3290
3291    The Required and NotRequired special forms can also be used to mark
3292    individual keys as being required or not required::
3293
3294        class Point2D(TypedDict):
3295            x: int               # the "x" key must always be present (Required is the default)
3296            y: NotRequired[int]  # the "y" key can be omitted
3297
3298    See PEP 655 for more details on Required and NotRequired.
3299
3300    The ReadOnly special form can be used
3301    to mark individual keys as immutable for type checkers::
3302
3303        class DatabaseUser(TypedDict):
3304            id: ReadOnly[int]  # the "id" key must not be modified
3305            username: str      # the "username" key can be changed
3306
3307    """
3308    if fields is _sentinel or fields is None:
3309        import warnings
3310
3311        if fields is _sentinel:
3312            deprecated_thing = "Failing to pass a value for the 'fields' parameter"
3313        else:
3314            deprecated_thing = "Passing `None` as the 'fields' parameter"
3315
3316        example = f"`{typename} = TypedDict({typename!r}, {{{{}}}})`"
3317        deprecation_msg = (
3318            "{name} is deprecated and will be disallowed in Python {remove}. "
3319            "To create a TypedDict class with 0 fields "
3320            "using the functional syntax, "
3321            "pass an empty dictionary, e.g. "
3322        ) + example + "."
3323        warnings._deprecated(deprecated_thing, message=deprecation_msg, remove=(3, 15))
3324        fields = {}
3325
3326    ns = {'__annotations__': dict(fields)}
3327    module = _caller()
3328    if module is not None:
3329        # Setting correct module is necessary to make typed dict classes pickleable.
3330        ns['__module__'] = module
3331
3332    td = _TypedDictMeta(typename, (), ns, total=total)
3333    td.__orig_bases__ = (TypedDict,)
3334    return td

A simple typed namespace. At runtime it is equivalent to a plain dict.

TypedDict creates a dictionary type such that a type checker will expect all instances to have a certain set of keys, where each key is associated with a value of a consistent type. This expectation is not checked at runtime.

Usage::

>>> class Point2D(TypedDict):
...     x: int
...     y: int
...     label: str
...
>>> a: Point2D = {'x': 1, 'y': 2, 'label': 'good'}  # OK
>>> b: Point2D = {'z': 3, 'label': 'bad'}           # Fails type check
>>> Point2D(x=1, y=2, label='first') == dict(x=1, y=2, label='first')
True

The type info can be accessed via the Point2D.__annotations__ dict, and the Point2D.__required_keys__ and Point2D.__optional_keys__ frozensets. TypedDict supports an additional equivalent form::

Point2D = TypedDict('Point2D', {'x': int, 'y': int, 'label': str})

By default, all keys must be present in a TypedDict. It is possible to override this by specifying totality::

class Point2D(TypedDict, total=False):
    x: int
    y: int

This means that a Point2D TypedDict can have any of the keys omitted. A type checker is only expected to support a literal False or True as the value of the total argument. True is the default, and makes all items defined in the class body be required.

The Required and NotRequired special forms can also be used to mark individual keys as being required or not required::

class Point2D(TypedDict):
    x: int               # the "x" key must always be present (Required is the default)
    y: NotRequired[int]  # the "y" key can be omitted

See PEP 655 for more details on Required and NotRequired.

The ReadOnly special form can be used to mark individual keys as immutable for type checkers::

class DatabaseUser(TypedDict):
    id: ReadOnly[int]  # the "id" key must not be modified
    username: str      # the "username" key can be changed
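
A minimal sketch of the runtime introspection attributes mentioned above (the Movie class is illustrative)::

from typing import NotRequired, TypedDict

class Movie(TypedDict):
    title: str
    year: NotRequired[int]

m: Movie = {'title': 'The Matrix'}  # 'year' may be omitted
assert isinstance(m, dict)          # a plain dict at runtime
assert Movie.__required_keys__ == frozenset({'title'})
assert Movie.__optional_keys__ == frozenset({'year'})
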
Generator = Generator

class BinaryIO(typing.IO[bytes]):
3572class BinaryIO(IO[bytes]):
3573    """Typed version of the return of open() in binary mode."""
3574
3575    __slots__ = ()
3576
3577    @abstractmethod
3578    def write(self, s: Union[bytes, bytearray]) -> int:
3579        pass
3580
3581    @abstractmethod
3582    def __enter__(self) -> 'BinaryIO':
3583        pass

Typed version of the return of open() in binary mode.

@abstractmethod
def write(self, s: Union[bytes, bytearray]) -> int:
3577    @abstractmethod
3578    def write(self, s: Union[bytes, bytearray]) -> int:
3579        pass

class IO(typing.Generic[~AnyStr]):
3473class IO(Generic[AnyStr]):
3474    """Generic base class for TextIO and BinaryIO.
3475
3476    This is an abstract, generic version of the return of open().
3477
3478    NOTE: This does not distinguish between the different possible
3479    classes (text vs. binary, read vs. write vs. read/write,
3480    append-only, unbuffered).  The TextIO and BinaryIO subclasses
3481    below capture the distinctions between text vs. binary, which is
3482    pervasive in the interface; however we currently do not offer a
3483    way to track the other distinctions in the type system.
3484    """
3485
3486    __slots__ = ()
3487
3488    @property
3489    @abstractmethod
3490    def mode(self) -> str:
3491        pass
3492
3493    @property
3494    @abstractmethod
3495    def name(self) -> str:
3496        pass
3497
3498    @abstractmethod
3499    def close(self) -> None:
3500        pass
3501
3502    @property
3503    @abstractmethod
3504    def closed(self) -> bool:
3505        pass
3506
3507    @abstractmethod
3508    def fileno(self) -> int:
3509        pass
3510
3511    @abstractmethod
3512    def flush(self) -> None:
3513        pass
3514
3515    @abstractmethod
3516    def isatty(self) -> bool:
3517        pass
3518
3519    @abstractmethod
3520    def read(self, n: int = -1) -> AnyStr:
3521        pass
3522
3523    @abstractmethod
3524    def readable(self) -> bool:
3525        pass
3526
3527    @abstractmethod
3528    def readline(self, limit: int = -1) -> AnyStr:
3529        pass
3530
3531    @abstractmethod
3532    def readlines(self, hint: int = -1) -> List[AnyStr]:
3533        pass
3534
3535    @abstractmethod
3536    def seek(self, offset: int, whence: int = 0) -> int:
3537        pass
3538
3539    @abstractmethod
3540    def seekable(self) -> bool:
3541        pass
3542
3543    @abstractmethod
3544    def tell(self) -> int:
3545        pass
3546
3547    @abstractmethod
3548    def truncate(self, size: int = None) -> int:
3549        pass
3550
3551    @abstractmethod
3552    def writable(self) -> bool:
3553        pass
3554
3555    @abstractmethod
3556    def write(self, s: AnyStr) -> int:
3557        pass
3558
3559    @abstractmethod
3560    def writelines(self, lines: List[AnyStr]) -> None:
3561        pass
3562
3563    @abstractmethod
3564    def __enter__(self) -> 'IO[AnyStr]':
3565        pass
3566
3567    @abstractmethod
3568    def __exit__(self, type, value, traceback) -> None:
3569        pass

Generic base class for TextIO and BinaryIO.

This is an abstract, generic version of the return of open().

NOTE: This does not distinguish between the different possible classes (text vs. binary, read vs. write vs. read/write, append-only, unbuffered). The TextIO and BinaryIO subclasses below capture the distinctions between text vs. binary, which is pervasive in the interface; however we currently do not offer a way to track the other distinctions in the type system.

mode: str
3488    @property
3489    @abstractmethod
3490    def mode(self) -> str:
3491        pass
name: str
3493    @property
3494    @abstractmethod
3495    def name(self) -> str:
3496        pass
@abstractmethod
def close(self) -> None:
3498    @abstractmethod
3499    def close(self) -> None:
3500        pass
closed: bool
3502    @property
3503    @abstractmethod
3504    def closed(self) -> bool:
3505        pass
@abstractmethod
def fileno(self) -> int:
3507    @abstractmethod
3508    def fileno(self) -> int:
3509        pass
@abstractmethod
def flush(self) -> None:
3511    @abstractmethod
3512    def flush(self) -> None:
3513        pass
@abstractmethod
def isatty(self) -> bool:
3515    @abstractmethod
3516    def isatty(self) -> bool:
3517        pass
@abstractmethod
def read(self, n: int = -1) -> ~AnyStr:
3519    @abstractmethod
3520    def read(self, n: int = -1) -> AnyStr:
3521        pass
@abstractmethod
def readable(self) -> bool:
3523    @abstractmethod
3524    def readable(self) -> bool:
3525        pass
@abstractmethod
def readline(self, limit: int = -1) -> ~AnyStr:
3527    @abstractmethod
3528    def readline(self, limit: int = -1) -> AnyStr:
3529        pass
@abstractmethod
def readlines(self, hint: int = -1) -> List[~AnyStr]:
3531    @abstractmethod
3532    def readlines(self, hint: int = -1) -> List[AnyStr]:
3533        pass
@abstractmethod
def seek(self, offset: int, whence: int = 0) -> int:
3535    @abstractmethod
3536    def seek(self, offset: int, whence: int = 0) -> int:
3537        pass
@abstractmethod
def seekable(self) -> bool:
3539    @abstractmethod
3540    def seekable(self) -> bool:
3541        pass
@abstractmethod
def tell(self) -> int:
3543    @abstractmethod
3544    def tell(self) -> int:
3545        pass
@abstractmethod
def truncate(self, size: int = None) -> int:
3547    @abstractmethod
3548    def truncate(self, size: int = None) -> int:
3549        pass
@abstractmethod
def writable(self) -> bool:
3551    @abstractmethod
3552    def writable(self) -> bool:
3553        pass
@abstractmethod
def write(self, s: ~AnyStr) -> int:
3555    @abstractmethod
3556    def write(self, s: AnyStr) -> int:
3557        pass
@abstractmethod
def writelines(self, lines: List[~AnyStr]) -> None:
3559    @abstractmethod
3560    def writelines(self, lines: List[AnyStr]) -> None:
3561        pass

Match = Match
Pattern = Pattern

class TextIO(typing.IO[str]):
3586class TextIO(IO[str]):
3587    """Typed version of the return of open() in text mode."""
3588
3589    __slots__ = ()
3590
3591    @property
3592    @abstractmethod
3593    def buffer(self) -> BinaryIO:
3594        pass
3595
3596    @property
3597    @abstractmethod
3598    def encoding(self) -> str:
3599        pass
3600
3601    @property
3602    @abstractmethod
3603    def errors(self) -> Optional[str]:
3604        pass
3605
3606    @property
3607    @abstractmethod
3608    def line_buffering(self) -> bool:
3609        pass
3610
3611    @property
3612    @abstractmethod
3613    def newlines(self) -> Any:
3614        pass
3615
3616    @abstractmethod
3617    def __enter__(self) -> 'TextIO':
3618        pass

Typed version of the return of open() in text mode.

buffer: <class 'BinaryIO'>
3591    @property
3592    @abstractmethod
3593    def buffer(self) -> BinaryIO:
3594        pass
encoding: str
3596    @property
3597    @abstractmethod
3598    def encoding(self) -> str:
3599        pass
errors: Optional[str]
3601    @property
3602    @abstractmethod
3603    def errors(self) -> Optional[str]:
3604        pass
line_buffering: bool
3606    @property
3607    @abstractmethod
3608    def line_buffering(self) -> bool:
3609        pass
newlines: Any
3611    @property
3612    @abstractmethod
3613    def newlines(self) -> Any:
3614        pass
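
A minimal sketch of how IO, TextIO and BinaryIO are typically used in annotations (the function names are illustrative; annotations are not enforced at runtime)::

import io
from typing import IO, BinaryIO, TextIO

def count_lines(stream: TextIO) -> int:
    return sum(1 for _ in stream)

def copy_bytes(src: BinaryIO, dst: BinaryIO) -> None:
    dst.write(src.read())

def close_quietly(stream: IO[str]) -> None:
    if not stream.closed:
        stream.close()

assert count_lines(io.StringIO('a\nb\n')) == 2
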
def assert_type(val, typ, /):
2395def assert_type(val, typ, /):
2396    """Ask a static type checker to confirm that the value is of the given type.
2397
2398    At runtime this does nothing: it returns the first argument unchanged with no
2399    checks or side effects, no matter the actual type of the argument.
2400
2401    When a static type checker encounters a call to assert_type(), it
2402    emits an error if the value is not of the specified type::
2403
2404        def greet(name: str) -> None:
2405            assert_type(name, str)  # OK
2406            assert_type(name, int)  # type checker error
2407    """
2408    return val

Ask a static type checker to confirm that the value is of the given type.

At runtime this does nothing: it returns the first argument unchanged with no checks or side effects, no matter the actual type of the argument.

When a static type checker encounters a call to assert_type(), it emits an error if the value is not of the specified type::

def greet(name: str) -> None:
    assert_type(name, str)  # OK
    assert_type(name, int)  # type checker error

def assert_never(arg: Never, /) -> Never:
2618def assert_never(arg: Never, /) -> Never:
2619    """Statically assert that a line of code is unreachable.
2620
2621    Example::
2622
2623        def int_or_str(arg: int | str) -> None:
2624            match arg:
2625                case int():
2626                    print("It's an int")
2627                case str():
2628                    print("It's a str")
2629                case _:
2630                    assert_never(arg)
2631
2632    If a type checker finds that a call to assert_never() is
2633    reachable, it will emit an error.
2634
2635    At runtime, this throws an exception when called.
2636    """
2637    value = repr(arg)
2638    if len(value) > _ASSERT_NEVER_REPR_MAX_LENGTH:
2639        value = value[:_ASSERT_NEVER_REPR_MAX_LENGTH] + '...'
2640    raise AssertionError(f"Expected code to be unreachable, but got: {value}")

Statically assert that a line of code is unreachable.

Example::

def int_or_str(arg: int | str) -> None:
    match arg:
        case int():
            print("It's an int")
        case str():
            print("It's a str")
        case _:
            assert_never(arg)

If a type checker finds that a call to assert_never() is reachable, it will emit an error.

At runtime, this throws an exception when called.

def cast(typ, val):
2384def cast(typ, val):
2385    """Cast a value to a type.
2386
2387    This returns the value unchanged.  To the type checker this
2388    signals that the return value has the designated type, but at
2389    runtime we intentionally don't check anything (we want this
2390    to be as fast as possible).
2391    """
2392    return val

Cast a value to a type.

This returns the value unchanged. To the type checker this signals that the return value has the designated type, but at runtime we intentionally don't check anything (we want this to be as fast as possible).
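
A minimal sketch; the value is returned unchanged and only the static type changes (the helper function is illustrative)::

from typing import cast

def first_as_str(items: list[object]) -> str:
    return cast(str, items[0])  # no runtime check is performed

assert first_as_str(['a', 1]) == 'a'
assert cast(int, 'still a str') == 'still a str'  # unchanged at runtime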

def clear_overloads():
2762def clear_overloads():
2763    """Clear all overloads in the registry."""
2764    _overload_registry.clear()

Clear all overloads in the registry.

def dataclass_transform( *, eq_default: bool = True, order_default: bool = False, kw_only_default: bool = False, frozen_default: bool = False, field_specifiers: tuple[Union[type[Any], Callable[..., Any]], ...] = (), **kwargs: Any) -> <class '_IdentityCallable'>:
3645def dataclass_transform(
3646    *,
3647    eq_default: bool = True,
3648    order_default: bool = False,
3649    kw_only_default: bool = False,
3650    frozen_default: bool = False,
3651    field_specifiers: tuple[type[Any] | Callable[..., Any], ...] = (),
3652    **kwargs: Any,
3653) -> _IdentityCallable:
3654    """Decorator to mark an object as providing dataclass-like behaviour.
3655
3656    The decorator can be applied to a function, class, or metaclass.
3657
3658    Example usage with a decorator function::
3659
3660        @dataclass_transform()
3661        def create_model[T](cls: type[T]) -> type[T]:
3662            ...
3663            return cls
3664
3665        @create_model
3666        class CustomerModel:
3667            id: int
3668            name: str
3669
3670    On a base class::
3671
3672        @dataclass_transform()
3673        class ModelBase: ...
3674
3675        class CustomerModel(ModelBase):
3676            id: int
3677            name: str
3678
3679    On a metaclass::
3680
3681        @dataclass_transform()
3682        class ModelMeta(type): ...
3683
3684        class ModelBase(metaclass=ModelMeta): ...
3685
3686        class CustomerModel(ModelBase):
3687            id: int
3688            name: str
3689
3690    The ``CustomerModel`` classes defined above will
3691    be treated by type checkers similarly to classes created with
3692    ``@dataclasses.dataclass``.
3693    For example, type checkers will assume these classes have
3694    ``__init__`` methods that accept ``id`` and ``name``.
3695
3696    The arguments to this decorator can be used to customize this behavior:
3697    - ``eq_default`` indicates whether the ``eq`` parameter is assumed to be
3698        ``True`` or ``False`` if it is omitted by the caller.
3699    - ``order_default`` indicates whether the ``order`` parameter is
3700        assumed to be True or False if it is omitted by the caller.
3701    - ``kw_only_default`` indicates whether the ``kw_only`` parameter is
3702        assumed to be True or False if it is omitted by the caller.
3703    - ``frozen_default`` indicates whether the ``frozen`` parameter is
3704        assumed to be True or False if it is omitted by the caller.
3705    - ``field_specifiers`` specifies a static list of supported classes
3706        or functions that describe fields, similar to ``dataclasses.field()``.
3707    - Arbitrary other keyword arguments are accepted in order to allow for
3708        possible future extensions.
3709
3710    At runtime, this decorator records its arguments in the
3711    ``__dataclass_transform__`` attribute on the decorated object.
3712    It has no other runtime effect.
3713
3714    See PEP 681 for more details.
3715    """
3716    def decorator(cls_or_fn):
3717        cls_or_fn.__dataclass_transform__ = {
3718            "eq_default": eq_default,
3719            "order_default": order_default,
3720            "kw_only_default": kw_only_default,
3721            "frozen_default": frozen_default,
3722            "field_specifiers": field_specifiers,
3723            "kwargs": kwargs,
3724        }
3725        return cls_or_fn
3726    return decorator

Decorator to mark an object as providing dataclass-like behaviour.

The decorator can be applied to a function, class, or metaclass.

Example usage with a decorator function::

@dataclass_transform()
def create_model[T](cls: type[T]) -> type[T]:
    ...
    return cls

@create_model
class CustomerModel:
    id: int
    name: str

On a base class::

@dataclass_transform()
class ModelBase: ...

class CustomerModel(ModelBase):
    id: int
    name: str

On a metaclass::

@dataclass_transform()
class ModelMeta(type): ...

class ModelBase(metaclass=ModelMeta): ...

class CustomerModel(ModelBase):
    id: int
    name: str

The CustomerModel classes defined above will be treated by type checkers similarly to classes created with @dataclasses.dataclass. For example, type checkers will assume these classes have __init__ methods that accept id and name.

The arguments to this decorator can be used to customize this behavior:

  • eq_default indicates whether the eq parameter is assumed to be True or False if it is omitted by the caller.
  • order_default indicates whether the order parameter is assumed to be True or False if it is omitted by the caller.
  • kw_only_default indicates whether the kw_only parameter is assumed to be True or False if it is omitted by the caller.
  • frozen_default indicates whether the frozen parameter is assumed to be True or False if it is omitted by the caller.
  • field_specifiers specifies a static list of supported classes or functions that describe fields, similar to dataclasses.field().
  • Arbitrary other keyword arguments are accepted in order to allow for possible future extensions.

At runtime, this decorator records its arguments in the __dataclass_transform__ attribute on the decorated object. It has no other runtime effect.

See PEP 681 for more details.

def final(f):
2767def final(f):
2768    """Decorator to indicate final methods and final classes.
2769
2770    Use this decorator to indicate to type checkers that the decorated
2771    method cannot be overridden, and decorated class cannot be subclassed.
2772
2773    For example::
2774
2775        class Base:
2776            @final
2777            def done(self) -> None:
2778                ...
2779        class Sub(Base):
2780            def done(self) -> None:  # Error reported by type checker
2781                ...
2782
2783        @final
2784        class Leaf:
2785            ...
2786        class Other(Leaf):  # Error reported by type checker
2787            ...
2788
2789    There is no runtime checking of these properties. The decorator
2790    attempts to set the ``__final__`` attribute to ``True`` on the decorated
2791    object to allow runtime introspection.
2792    """
2793    try:
2794        f.__final__ = True
2795    except (AttributeError, TypeError):
2796        # Skip the attribute silently if it is not writable.
2797        # AttributeError happens if the object has __slots__ or a
2798        # read-only property, TypeError if it's a builtin class.
2799        pass
2800    return f

Decorator to indicate final methods and final classes.

Use this decorator to indicate to type checkers that the decorated method cannot be overridden, and decorated class cannot be subclassed.

For example::

class Base:
    @final
    def done(self) -> None:
        ...
class Sub(Base):
    def done(self) -> None:  # Error reported by type checker
        ...

@final
class Leaf:
    ...
class Other(Leaf):  # Error reported by type checker
    ...

There is no runtime checking of these properties. The decorator attempts to set the __final__ attribute to True on the decorated object to allow runtime introspection.

def get_args(tp):
2571def get_args(tp):
2572    """Get type arguments with all substitutions performed.
2573
2574    For unions, basic simplifications used by Union constructor are performed.
2575
2576    Examples::
2577
2578        >>> T = TypeVar('T')
2579        >>> assert get_args(Dict[str, int]) == (str, int)
2580        >>> assert get_args(int) == ()
2581        >>> assert get_args(Union[int, Union[T, int], str][int]) == (int, str)
2582        >>> assert get_args(Union[int, Tuple[T, int]][str]) == (int, Tuple[str, int])
2583        >>> assert get_args(Callable[[], T][int]) == ([], int)
2584    """
2585    if isinstance(tp, _AnnotatedAlias):
2586        return (tp.__origin__,) + tp.__metadata__
2587    if isinstance(tp, (_GenericAlias, GenericAlias)):
2588        res = tp.__args__
2589        if _should_unflatten_callable_args(tp, res):
2590            res = (list(res[:-1]), res[-1])
2591        return res
2592    if isinstance(tp, types.UnionType):
2593        return tp.__args__
2594    return ()

Get type arguments with all substitutions performed.

For unions, basic simplifications used by Union constructor are performed.

Examples::

>>> T = TypeVar('T')
>>> assert get_args(Dict[str, int]) == (str, int)
>>> assert get_args(int) == ()
>>> assert get_args(Union[int, Union[T, int], str][int]) == (int, str)
>>> assert get_args(Union[int, Tuple[T, int]][str]) == (int, Tuple[str, int])
>>> assert get_args(Callable[[], T][int]) == ([], int)

def get_origin(tp):
2541def get_origin(tp):
2542    """Get the unsubscripted version of a type.
2543
2544    This supports generic types, Callable, Tuple, Union, Literal, Final, ClassVar,
2545    Annotated, and others. Return None for unsupported types.
2546
2547    Examples::
2548
2549        >>> P = ParamSpec('P')
2550        >>> assert get_origin(Literal[42]) is Literal
2551        >>> assert get_origin(int) is None
2552        >>> assert get_origin(ClassVar[int]) is ClassVar
2553        >>> assert get_origin(Generic) is Generic
2554        >>> assert get_origin(Generic[T]) is Generic
2555        >>> assert get_origin(Union[T, int]) is Union
2556        >>> assert get_origin(List[Tuple[T, T]][int]) is list
2557        >>> assert get_origin(P.args) is P
2558    """
2559    if isinstance(tp, _AnnotatedAlias):
2560        return Annotated
2561    if isinstance(tp, (_BaseGenericAlias, GenericAlias,
2562                       ParamSpecArgs, ParamSpecKwargs)):
2563        return tp.__origin__
2564    if tp is Generic:
2565        return Generic
2566    if isinstance(tp, types.UnionType):
2567        return types.UnionType
2568    return None

Get the unsubscripted version of a type.

This supports generic types, Callable, Tuple, Union, Literal, Final, ClassVar, Annotated, and others. Return None for unsupported types.

Examples::

>>> P = ParamSpec('P')
>>> assert get_origin(Literal[42]) is Literal
>>> assert get_origin(int) is None
>>> assert get_origin(ClassVar[int]) is ClassVar
>>> assert get_origin(Generic) is Generic
>>> assert get_origin(Generic[T]) is Generic
>>> assert get_origin(Union[T, int]) is Union
>>> assert get_origin(List[Tuple[T, T]][int]) is list
>>> assert get_origin(P.args) is P

def get_overloads(func):
2750def get_overloads(func):
2751    """Return all defined overloads for *func* as a sequence."""
2752    # classmethod and staticmethod
2753    f = getattr(func, "__func__", func)
2754    if f.__module__ not in _overload_registry:
2755        return []
2756    mod_dict = _overload_registry[f.__module__]
2757    if f.__qualname__ not in mod_dict:
2758        return []
2759    return list(mod_dict[f.__qualname__].values())

Return all defined overloads for func as a sequence.
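
A minimal sketch combining @overload with get_overloads() (the utf8 function mirrors the example under overload below)::

from typing import get_overloads, overload

@overload
def utf8(value: None) -> None: ...
@overload
def utf8(value: str) -> bytes: ...
def utf8(value):
    return None if value is None else value.encode('utf-8')

assert len(get_overloads(utf8)) == 2  # the two @overload stubs are registered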

def get_protocol_members(tp: type, /) -> frozenset[str]:
3788def get_protocol_members(tp: type, /) -> frozenset[str]:
3789    """Return the set of members defined in a Protocol.
3790
3791    Example::
3792
3793        >>> from typing import Protocol, get_protocol_members
3794        >>> class P(Protocol):
3795        ...     def a(self) -> str: ...
3796        ...     b: int
3797        >>> get_protocol_members(P) == frozenset({'a', 'b'})
3798        True
3799
3800    Raise a TypeError for arguments that are not Protocols.
3801    """
3802    if not is_protocol(tp):
3803        raise TypeError(f'{tp!r} is not a Protocol')
3804    return frozenset(tp.__protocol_attrs__)

Return the set of members defined in a Protocol.

Example::

>>> from typing import Protocol, get_protocol_members
>>> class P(Protocol):
...     def a(self) -> str: ...
...     b: int
>>> get_protocol_members(P) == frozenset({'a', 'b'})
True

Raise a TypeError for arguments that are not Protocols.

def get_type_hints(obj, globalns=None, localns=None, include_extras=False):
2416def get_type_hints(obj, globalns=None, localns=None, include_extras=False):
2417    """Return type hints for an object.
2418
2419    This is often the same as obj.__annotations__, but it handles
2420    forward references encoded as string literals and recursively replaces all
2421    'Annotated[T, ...]' with 'T' (unless 'include_extras=True').
2422
2423    The argument may be a module, class, method, or function. The annotations
2424    are returned as a dictionary. For classes, annotations include also
2425    inherited members.
2426
2427    TypeError is raised if the argument is not of a type that can contain
2428    annotations, and an empty dictionary is returned if no annotations are
2429    present.
2430
2431    BEWARE -- the behavior of globalns and localns is counterintuitive
2432    (unless you are familiar with how eval() and exec() work).  The
2433    search order is locals first, then globals.
2434
2435    - If no dict arguments are passed, an attempt is made to use the
2436      globals from obj (or the respective module's globals for classes),
2437      and these are also used as the locals.  If the object does not appear
2438      to have globals, an empty dictionary is used.  For classes, the search
2439      order is globals first then locals.
2440
2441    - If one dict argument is passed, it is used for both globals and
2442      locals.
2443
2444    - If two dict arguments are passed, they specify globals and
2445      locals, respectively.
2446    """
2447    if getattr(obj, '__no_type_check__', None):
2448        return {}
2449    # Classes require a special treatment.
2450    if isinstance(obj, type):
2451        hints = {}
2452        for base in reversed(obj.__mro__):
2453            if globalns is None:
2454                base_globals = getattr(sys.modules.get(base.__module__, None), '__dict__', {})
2455            else:
2456                base_globals = globalns
2457            ann = base.__dict__.get('__annotations__', {})
2458            if isinstance(ann, types.GetSetDescriptorType):
2459                ann = {}
2460            base_locals = dict(vars(base)) if localns is None else localns
2461            if localns is None and globalns is None:
2462                # This is surprising, but required.  Before Python 3.10,
2463                # get_type_hints only evaluated the globalns of
2464                # a class.  To maintain backwards compatibility, we reverse
2465                # the globalns and localns order so that eval() looks into
2466                # *base_globals* first rather than *base_locals*.
2467                # This only affects ForwardRefs.
2468                base_globals, base_locals = base_locals, base_globals
2469            for name, value in ann.items():
2470                if value is None:
2471                    value = type(None)
2472                if isinstance(value, str):
2473                    value = ForwardRef(value, is_argument=False, is_class=True)
2474                value = _eval_type(value, base_globals, base_locals, base.__type_params__)
2475                hints[name] = value
2476        return hints if include_extras else {k: _strip_annotations(t) for k, t in hints.items()}
2477
2478    if globalns is None:
2479        if isinstance(obj, types.ModuleType):
2480            globalns = obj.__dict__
2481        else:
2482            nsobj = obj
2483            # Find globalns for the unwrapped object.
2484            while hasattr(nsobj, '__wrapped__'):
2485                nsobj = nsobj.__wrapped__
2486            globalns = getattr(nsobj, '__globals__', {})
2487        if localns is None:
2488            localns = globalns
2489    elif localns is None:
2490        localns = globalns
2491    hints = getattr(obj, '__annotations__', None)
2492    if hints is None:
2493        # Return empty annotations for something that _could_ have them.
2494        if isinstance(obj, _allowed_types):
2495            return {}
2496        else:
2497            raise TypeError('{!r} is not a module, class, method, '
2498                            'or function.'.format(obj))
2499    hints = dict(hints)
2500    type_params = getattr(obj, "__type_params__", ())
2501    for name, value in hints.items():
2502        if value is None:
2503            value = type(None)
2504        if isinstance(value, str):
2505            # class-level forward refs were handled above, this must be either
2506            # a module-level annotation or a function argument annotation
2507            value = ForwardRef(
2508                value,
2509                is_argument=not isinstance(obj, types.ModuleType),
2510                is_class=False,
2511            )
2512        hints[name] = _eval_type(value, globalns, localns, type_params)
2513    return hints if include_extras else {k: _strip_annotations(t) for k, t in hints.items()}

Return type hints for an object.

This is often the same as obj.__annotations__, but it handles forward references encoded as string literals and recursively replaces all 'Annotated[T, ...]' with 'T' (unless 'include_extras=True').

The argument may be a module, class, method, or function. The annotations are returned as a dictionary. For classes, annotations include also inherited members.

TypeError is raised if the argument is not of a type that can contain annotations, and an empty dictionary is returned if no annotations are present.

BEWARE -- the behavior of globalns and localns is counterintuitive (unless you are familiar with how eval() and exec() work). The search order is locals first, then globals.

  • If no dict arguments are passed, an attempt is made to use the globals from obj (or the respective module's globals for classes), and these are also used as the locals. If the object does not appear to have globals, an empty dictionary is used. For classes, the search order is globals first then locals.

  • If one dict argument is passed, it is used for both globals and locals.

  • If two dict arguments are passed, they specify globals and locals, respectively.
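
A minimal sketch showing how string (forward reference) annotations are resolved (the greet function is illustrative)::

from typing import Optional, get_type_hints

def greet(name: 'str', times: int = 1) -> 'Optional[str]':
    return ' '.join(['hello'] * times) if name else None

assert get_type_hints(greet) == {'name': str, 'times': int, 'return': Optional[str]}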

def is_protocol(tp: type, /) -> bool:
3767def is_protocol(tp: type, /) -> bool:
3768    """Return True if the given type is a Protocol.
3769
3770    Example::
3771
3772        >>> from typing import Protocol, is_protocol
3773        >>> class P(Protocol):
3774        ...     def a(self) -> str: ...
3775        ...     b: int
3776        >>> is_protocol(P)
3777        True
3778        >>> is_protocol(int)
3779        False
3780    """
3781    return (
3782        isinstance(tp, type)
3783        and getattr(tp, '_is_protocol', False)
3784        and tp != Protocol
3785    )

Return True if the given type is a Protocol.

Example::

>>> from typing import Protocol, is_protocol
>>> class P(Protocol):
...     def a(self) -> str: ...
...     b: int
>>> is_protocol(P)
True
>>> is_protocol(int)
False

def is_typeddict(tp):
2597def is_typeddict(tp):
2598    """Check if an annotation is a TypedDict class.
2599
2600    For example::
2601
2602        >>> from typing import TypedDict
2603        >>> class Film(TypedDict):
2604        ...     title: str
2605        ...     year: int
2606        ...
2607        >>> is_typeddict(Film)
2608        True
2609        >>> is_typeddict(dict)
2610        False
2611    """
2612    return isinstance(tp, _TypedDictMeta)

Check if an annotation is a TypedDict class.

For example::

>>> from typing import TypedDict
>>> class Film(TypedDict):
...     title: str
...     year: int
...
>>> is_typeddict(Film)
True
>>> is_typeddict(dict)
False

LiteralString = LiteralString

Represents an arbitrary literal string.

Example::

from typing import LiteralString

def run_query(sql: LiteralString) -> None:
    ...

def caller(arbitrary_string: str, literal_string: LiteralString) -> None:
    run_query("SELECT * FROM students")  # OK
    run_query(literal_string)  # OK
    run_query("SELECT * FROM " + literal_string)  # OK
    run_query(arbitrary_string)  # type checker error
    run_query(  # type checker error
        f"SELECT * FROM students WHERE name = {arbitrary_string}"
    )

Only string literals and other LiteralStrings are compatible with LiteralString. This provides a tool to help prevent security issues such as SQL injection.

Never = Never

The bottom type, a type that has no members.

This can be used to define a function that should never be called, or a function that never returns::

from typing import Never

def never_call_me(arg: Never) -> None:
    pass

def int_or_str(arg: int | str) -> None:
    never_call_me(arg)  # type checker error
    match arg:
        case int():
            print("It's an int")
        case str():
            print("It's a str")
        case _:
            never_call_me(arg)  # OK, arg is of type Never

class NewType:
3403class NewType:
3404    """NewType creates simple unique types with almost zero runtime overhead.
3405
3406    NewType(name, tp) is considered a subtype of tp
3407    by static type checkers. At runtime, NewType(name, tp) returns
3408    a dummy callable that simply returns its argument.
3409
3410    Usage::
3411
3412        UserId = NewType('UserId', int)
3413
3414        def name_by_id(user_id: UserId) -> str:
3415            ...
3416
3417        UserId('user')          # Fails type check
3418
3419        name_by_id(42)          # Fails type check
3420        name_by_id(UserId(42))  # OK
3421
3422        num = UserId(5) + 1     # type: int
3423    """
3424
3425    __call__ = _idfunc
3426
3427    def __init__(self, name, tp):
3428        self.__qualname__ = name
3429        if '.' in name:
3430            name = name.rpartition('.')[-1]
3431        self.__name__ = name
3432        self.__supertype__ = tp
3433        def_mod = _caller()
3434        if def_mod != 'typing':
3435            self.__module__ = def_mod
3436
3437    def __mro_entries__(self, bases):
3438        # We defined __mro_entries__ to get a better error message
3439        # if a user attempts to subclass a NewType instance. bpo-46170
3440        superclass_name = self.__name__
3441
3442        class Dummy:
3443            def __init_subclass__(cls):
3444                subclass_name = cls.__name__
3445                raise TypeError(
3446                    f"Cannot subclass an instance of NewType. Perhaps you were looking for: "
3447                    f"`{subclass_name} = NewType({subclass_name!r}, {superclass_name})`"
3448                )
3449
3450        return (Dummy,)
3451
3452    def __repr__(self):
3453        return f'{self.__module__}.{self.__qualname__}'
3454
3455    def __reduce__(self):
3456        return self.__qualname__
3457
3458    def __or__(self, other):
3459        return Union[self, other]
3460
3461    def __ror__(self, other):
3462        return Union[other, self]

NewType creates simple unique types with almost zero runtime overhead.

NewType(name, tp) is considered a subtype of tp by static type checkers. At runtime, NewType(name, tp) returns a dummy callable that simply returns its argument.

Usage::

UserId = NewType('UserId', int)

def name_by_id(user_id: UserId) -> str:
    ...

UserId('user')          # Fails type check

name_by_id(42)          # Fails type check
name_by_id(UserId(42))  # OK

num = UserId(5) + 1     # type: int

NewType(name, tp)
3427    def __init__(self, name, tp):
3428        self.__qualname__ = name
3429        if '.' in name:
3430            name = name.rpartition('.')[-1]
3431        self.__name__ = name
3432        self.__supertype__ = tp
3433        def_mod = _caller()
3434        if def_mod != 'typing':
3435            self.__module__ = def_mod
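
A minimal sketch of the runtime behaviour described above: the returned object is an identity callable that records its supertype::

from typing import NewType

UserId = NewType('UserId', int)

assert UserId(42) == 42             # simply returns its argument
assert UserId.__supertype__ is int
assert UserId.__name__ == 'UserId'
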
def no_type_check(arg):
2643def no_type_check(arg):
2644    """Decorator to indicate that annotations are not type hints.
2645
2646    The argument must be a class or function; if it is a class, it
2647    applies recursively to all methods and classes defined in that class
2648    (but not to methods defined in its superclasses or subclasses).
2649
2650    This mutates the function(s) or class(es) in place.
2651    """
2652    if isinstance(arg, type):
2653        for key in dir(arg):
2654            obj = getattr(arg, key)
2655            if (
2656                not hasattr(obj, '__qualname__')
2657                or obj.__qualname__ != f'{arg.__qualname__}.{obj.__name__}'
2658                or getattr(obj, '__module__', None) != arg.__module__
2659            ):
2660                # We only modify objects that are defined in this type directly.
2661                # If classes / methods are nested in multiple layers,
2662                # we will modify them when processing their direct holders.
2663                continue
2664            # Instance, class, and static methods:
2665            if isinstance(obj, types.FunctionType):
2666                obj.__no_type_check__ = True
2667            if isinstance(obj, types.MethodType):
2668                obj.__func__.__no_type_check__ = True
2669            # Nested types:
2670            if isinstance(obj, type):
2671                no_type_check(obj)
2672    try:
2673        arg.__no_type_check__ = True
2674    except TypeError:  # built-in classes
2675        pass
2676    return arg

Decorator to indicate that annotations are not type hints.

The argument must be a class or function; if it is a class, it applies recursively to all methods and classes defined in that class (but not to methods defined in its superclasses or subclasses).

This mutates the function(s) or class(es) in place.
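
A minimal sketch (the decorated function is illustrative); once the flag is set, get_type_hints() returns an empty dict for the object::

from typing import get_type_hints, no_type_check

@no_type_check
def legacy(data: "not a real type") -> "just documentation":
    return data

assert legacy.__no_type_check__ is True
assert get_type_hints(legacy) == {}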

def no_type_check_decorator(decorator):
2679def no_type_check_decorator(decorator):
2680    """Decorator to give another decorator the @no_type_check effect.
2681
2682    This wraps the decorator with something that wraps the decorated
2683    function in @no_type_check.
2684    """
2685    import warnings
2686    warnings._deprecated("typing.no_type_check_decorator", remove=(3, 15))
2687    @functools.wraps(decorator)
2688    def wrapped_decorator(*args, **kwds):
2689        func = decorator(*args, **kwds)
2690        func = no_type_check(func)
2691        return func
2692
2693    return wrapped_decorator

Decorator to give another decorator the @no_type_check effect.

This wraps the decorator with something that wraps the decorated function in @no_type_check.

NoDefault = NoDefault
NoReturn = NoReturn

Special type indicating functions that never return.

Example::

from typing import NoReturn

def stop() -> NoReturn:
    raise Exception('no way')

NoReturn can also be used as a bottom type, a type that has no values. Starting in Python 3.11, the Never type should be used for this concept instead. Type checkers should treat the two equivalently.

NotRequired = NotRequired

Special typing construct to mark a TypedDict key as potentially missing.

For example::

class Movie(TypedDict):
    title: str
    year: NotRequired[int]

m = Movie(
    title='The Matrix',  # typechecker error if key is omitted
    year=1999,
)

def overload(func):
2709def overload(func):
2710    """Decorator for overloaded functions/methods.
2711
2712    In a stub file, place two or more stub definitions for the same
2713    function in a row, each decorated with @overload.
2714
2715    For example::
2716
2717        @overload
2718        def utf8(value: None) -> None: ...
2719        @overload
2720        def utf8(value: bytes) -> bytes: ...
2721        @overload
2722        def utf8(value: str) -> bytes: ...
2723
2724    In a non-stub file (i.e. a regular .py file), do the same but
2725    follow it with an implementation.  The implementation should *not*
2726    be decorated with @overload::
2727
2728        @overload
2729        def utf8(value: None) -> None: ...
2730        @overload
2731        def utf8(value: bytes) -> bytes: ...
2732        @overload
2733        def utf8(value: str) -> bytes: ...
2734        def utf8(value):
2735            ...  # implementation goes here
2736
2737    The overloads for a function can be retrieved at runtime using the
2738    get_overloads() function.
2739    """
2740    # classmethod and staticmethod
2741    f = getattr(func, "__func__", func)
2742    try:
2743        _overload_registry[f.__module__][f.__qualname__][f.__code__.co_firstlineno] = func
2744    except AttributeError:
2745        # Not a normal function; ignore.
2746        pass
2747    return _overload_dummy

Decorator for overloaded functions/methods.

In a stub file, place two or more stub definitions for the same function in a row, each decorated with @overload.

For example::

@overload
def utf8(value: None) -> None: ...
@overload
def utf8(value: bytes) -> bytes: ...
@overload
def utf8(value: str) -> bytes: ...

In a non-stub file (i.e. a regular .py file), do the same but follow it with an implementation. The implementation should not be decorated with @overload::

@overload
def utf8(value: None) -> None: ...
@overload
def utf8(value: bytes) -> bytes: ...
@overload
def utf8(value: str) -> bytes: ...
def utf8(value):
    ...  # implementation goes here

The overloads for a function can be retrieved at runtime using the get_overloads() function.

def override(method: F, /) -> F:
3732def override[F: _Func](method: F, /) -> F:
3733    """Indicate that a method is intended to override a method in a base class.
3734
3735    Usage::
3736
3737        class Base:
3738            def method(self) -> None:
3739                pass
3740
3741        class Child(Base):
3742            @override
3743            def method(self) -> None:
3744                super().method()
3745
3746    When this decorator is applied to a method, the type checker will
3747    validate that it overrides a method or attribute with the same name on a
3748    base class.  This helps prevent bugs that may occur when a base class is
3749    changed without an equivalent change to a child class.
3750
3751    There is no runtime checking of this property. The decorator attempts to
3752    set the ``__override__`` attribute to ``True`` on the decorated object to
3753    allow runtime introspection.
3754
3755    See PEP 698 for details.
3756    """
3757    try:
3758        method.__override__ = True
3759    except (AttributeError, TypeError):
3760        # Skip the attribute silently if it is not writable.
3761        # AttributeError happens if the object has __slots__ or a
3762        # read-only property, TypeError if it's a builtin class.
3763        pass
3764    return method

Indicate that a method is intended to override a method in a base class.

Usage::

class Base:
    def method(self) -> None:
        pass

class Child(Base):
    @override
    def method(self) -> None:
        super().method()

When this decorator is applied to a method, the type checker will validate that it overrides a method or attribute with the same name on a base class. This helps prevent bugs that may occur when a base class is changed without an equivalent change to a child class.

There is no runtime checking of this property. The decorator attempts to set the __override__ attribute to True on the decorated object to allow runtime introspection.

See PEP 698 for details.

class ParamSpecArgs:

The args for a ParamSpec object.

Given a ParamSpec object P, P.args is an instance of ParamSpecArgs.

ParamSpecArgs objects have a reference back to their ParamSpec::

>>> P = ParamSpec("P")
>>> P.args.__origin__ is P
True

This type is meant for runtime introspection and has no special meaning to static type checkers.

class ParamSpecKwargs:

The kwargs for a ParamSpec object.

Given a ParamSpec object P, P.kwargs is an instance of ParamSpecKwargs.

ParamSpecKwargs objects have a reference back to their ParamSpec::

>>> P = ParamSpec("P")
>>> P.kwargs.__origin__ is P
True

This type is meant for runtime introspection and has no special meaning to static type checkers.

ReadOnly = ReadOnly

A special typing construct to mark an item of a TypedDict as read-only.

For example::

class Movie(TypedDict):
    title: ReadOnly[str]
    year: int

def mutate_movie(m: Movie) -> None:
    m["year"] = 1992  # allowed
    m["title"] = "The Matrix"  # typechecker error

There is no runtime checking for this property.

Required = Required

Special typing construct to mark a TypedDict key as required.

This is mainly useful for total=False TypedDicts.

For example::

class Movie(TypedDict, total=False):
    title: Required[str]
    year: int

m = Movie(
    title='The Matrix',  # typechecker error if key is omitted
    year=1999,
)

There is no runtime checking that a required key is actually provided when instantiating a related TypedDict.

def reveal_type(obj: T, /) -> T:
3621def reveal_type[T](obj: T, /) -> T:
3622    """Ask a static type checker to reveal the inferred type of an expression.
3623
3624    When a static type checker encounters a call to ``reveal_type()``,
3625    it will emit the inferred type of the argument::
3626
3627        x: int = 1
3628        reveal_type(x)
3629
3630    Running a static type checker (e.g., mypy) on this example
3631    will produce output similar to 'Revealed type is "builtins.int"'.
3632
3633    At runtime, the function prints the runtime type of the
3634    argument and returns the argument unchanged.
3635    """
3636    print(f"Runtime type is {type(obj).__name__!r}", file=sys.stderr)
3637    return obj

Ask a static type checker to reveal the inferred type of an expression.

When a static type checker encounters a call to reveal_type(), it will emit the inferred type of the argument::

x: int = 1
reveal_type(x)

Running a static type checker (e.g., mypy) on this example will produce output similar to 'Revealed type is "builtins.int"'.

At runtime, the function prints the runtime type of the argument and returns the argument unchanged.

def runtime_checkable(cls):
2342def runtime_checkable(cls):
2343    """Mark a protocol class as a runtime protocol.
2344
2345    Such protocol can be used with isinstance() and issubclass().
2346    Raise TypeError if applied to a non-protocol class.
2347    This allows a simple-minded structural check very similar to
2348    one trick ponies in collections.abc such as Iterable.
2349
2350    For example::
2351
2352        @runtime_checkable
2353        class Closable(Protocol):
2354            def close(self): ...
2355
2356        assert isinstance(open('/some/file'), Closable)
2357
2358    Warning: this will check only the presence of the required methods,
2359    not their type signatures!
2360    """
2361    if not issubclass(cls, Generic) or not getattr(cls, '_is_protocol', False):
2362        raise TypeError('@runtime_checkable can be only applied to protocol classes,'
2363                        ' got %r' % cls)
2364    cls._is_runtime_protocol = True
2365    # PEP 544 prohibits using issubclass()
2366    # with protocols that have non-method members.
2367    # See gh-113320 for why we compute this attribute here,
2368    # rather than in `_ProtocolMeta.__init__`
2369    cls.__non_callable_proto_members__ = set()
2370    for attr in cls.__protocol_attrs__:
2371        try:
2372            is_callable = callable(getattr(cls, attr, None))
2373        except Exception as e:
2374            raise TypeError(
2375                f"Failed to determine whether protocol member {attr!r} "
2376                "is a method member"
2377            ) from e
2378        else:
2379            if not is_callable:
2380                cls.__non_callable_proto_members__.add(attr)
2381    return cls

Mark a protocol class as a runtime protocol.

Such protocol can be used with isinstance() and issubclass(). Raise TypeError if applied to a non-protocol class. This allows a simple-minded structural check very similar to one trick ponies in collections.abc such as Iterable.

For example::

@runtime_checkable
class Closable(Protocol):
    def close(self): ...

assert isinstance(open('/some/file'), Closable)

Warning: this will check only the presence of the required methods, not their type signatures!

Self = Self

Used to spell the type of "self" in classes.

Example::

from typing import Self

class Foo:
    def return_self(self) -> Self:
        ...
        return self

This is especially useful for:
  • classmethods that are used as alternative constructors
  • annotating an __enter__ method which returns self
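
A short sketch covering both bullet points above, using an illustrative Connection class::

from typing import Self

class Connection:
    @classmethod
    def create(cls) -> Self:
        # Alternative constructor: each subclass is inferred to
        # return an instance of itself, not the base class.
        return cls()

    def __enter__(self) -> Self:
        return self

    def __exit__(self, *exc_info: object) -> None:
        pass
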
Text = <class 'str'>
TYPE_CHECKING = False
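
TYPE_CHECKING is False at runtime but assumed to be True by static type checkers, so it is commonly used to guard imports that are only needed for annotations. A minimal sketch (expensive_module is a hypothetical module)::

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Imported only while type checking, never at runtime.
    # 'expensive_module' is a hypothetical name used for illustration.
    import expensive_module

def process(frame: "expensive_module.Frame") -> None:
    ...
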
TypeAlias = TypeAlias

Special form for marking type aliases.

Use TypeAlias to indicate that an assignment should be recognized as a proper type alias definition by type checkers.

For example::

Predicate: TypeAlias = Callable[..., bool]

It's invalid when used anywhere except as in the example above.
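
As a small illustrative sketch, the alias declared above can then be used like any other type in annotations (keep_if is just an example function)::

from typing import Callable, TypeAlias

Predicate: TypeAlias = Callable[..., bool]

def keep_if(values: list[int], pred: Predicate) -> list[int]:
    return [v for v in values if pred(v)]

evens = keep_if([1, 2, 3, 4], lambda v: v % 2 == 0)  # [2, 4]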

TypeGuard = TypeGuard

Special typing construct for marking user-defined type predicate functions.

TypeGuard can be used to annotate the return type of a user-defined type predicate function. TypeGuard only accepts a single type argument. At runtime, functions marked this way should return a boolean.

TypeGuard aims to benefit type narrowing -- a technique used by static type checkers to determine a more precise type of an expression within a program's code flow. Usually type narrowing is done by analyzing conditional code flow and applying the narrowing to a block of code. The conditional expression here is sometimes referred to as a "type predicate".

Sometimes it would be convenient to use a user-defined boolean function as a type predicate. Such a function should use TypeGuard[...] or TypeIs[...] as its return type to alert static type checkers to this intention. TypeGuard should be used over TypeIs when narrowing from an incompatible type (e.g., list[object] to list[int]) or when the function does not return True for all instances of the narrowed type.

Using -> TypeGuard[NarrowedType] tells the static type checker that for a given function:

  1. The return value is a boolean.
  2. If the return value is True, the type of its argument is NarrowedType.

For example::

def is_str_list(val: list[object]) -> TypeGuard[list[str]]:
    '''Determines whether all objects in the list are strings'''
    return all(isinstance(x, str) for x in val)

def func1(val: list[object]):
    if is_str_list(val):
        # Type of ``val`` is narrowed to ``list[str]``.
        print(" ".join(val))
    else:
        # Type of ``val`` remains as ``list[object]``.
        print("Not a list of strings!")

Strict type narrowing is not enforced -- TypeB need not be a narrower form of TypeA (it can even be a wider form) and this may lead to type-unsafe results. The main reason is to allow for things like narrowing list[object] to list[str] even though the latter is not a subtype of the former, since list is invariant. The responsibility of writing type-safe type predicates is left to the user.

TypeGuard also works with type variables. For more information, see PEP 647 (User-Defined Type Guards).
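
A brief sketch of the type-variable case mentioned above, narrowing an arbitrary-length tuple to a two-element tuple (is_two_element_tuple is illustrative)::

from typing import TypeGuard, TypeVar

T = TypeVar("T")

def is_two_element_tuple(val: tuple[T, ...]) -> TypeGuard[tuple[T, T]]:
    return len(val) == 2

def func(names: tuple[str, ...]) -> None:
    if is_two_element_tuple(names):
        first, last = names  # narrowed to tuple[str, str]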

TypeIs = TypeIs

Special typing construct for marking user-defined type predicate functions.

TypeIs can be used to annotate the return type of a user-defined type predicate function. TypeIs only accepts a single type argument. At runtime, functions marked this way should return a boolean and accept at least one argument.

TypeIs aims to benefit type narrowing -- a technique used by static type checkers to determine a more precise type of an expression within a program's code flow. Usually type narrowing is done by analyzing conditional code flow and applying the narrowing to a block of code. The conditional expression here is sometimes referred to as a "type predicate".

Sometimes it would be convenient to use a user-defined boolean function as a type predicate. Such a function should use TypeIs[...] or TypeGuard[...] as its return type to alert static type checkers to this intention. TypeIs usually has more intuitive behavior than TypeGuard, but it cannot be used when the input and output types are incompatible (e.g., list[object] to list[int]) or when the function does not return True for all instances of the narrowed type.

Using -> TypeIs[NarrowedType] tells the static type checker that for a given function:

  1. The return value is a boolean.
  2. If the return value is True, the type of its argument is the intersection of the argument's original type and NarrowedType.
  3. If the return value is False, the type of its argument is narrowed to exclude NarrowedType.

For example::

from typing import assert_type, final, TypeIs

class Parent: pass
class Child(Parent): pass
@final
class Unrelated: pass

def is_parent(val: object) -> TypeIs[Parent]:
    return isinstance(val, Parent)

def run(arg: Child | Unrelated):
    if is_parent(arg):
        # Type of ``arg`` is narrowed to the intersection
        # of ``Parent`` and ``Child``, which is equivalent to
        # ``Child``.
        assert_type(arg, Child)
    else:
        # Type of ``arg`` is narrowed to exclude ``Parent``,
        # so only ``Unrelated`` is left.
        assert_type(arg, Unrelated)

The type inside TypeIs must be consistent with the type of the function's argument; if it is not, static type checkers will raise an error. An incorrectly written TypeIs function can lead to unsound behavior in the type system; it is the user's responsibility to write such functions in a type-safe manner.

TypeIs also works with type variables. For more information, see PEP 742 (Narrowing types with TypeIs).
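
A brief sketch of TypeIs combined with a type variable (the is_set_of helper is illustrative, not part of typing)::

from typing import Any, TypeIs, TypeVar

T = TypeVar("T")

def is_set_of(val: set[Any], type_: type[T]) -> TypeIs[set[T]]:
    return all(isinstance(x, type_) for x in val)

def describe(items: set[Any]) -> None:
    if is_set_of(items, str):
        # ``items`` is narrowed to set[str] on this branch.
        print(", ".join(sorted(items)))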

class TypeAliasType:

Type alias.

Type aliases are created through the type statement::

type Alias = int

In this example, Alias and int will be treated equivalently by static type checkers.

At runtime, Alias is an instance of TypeAliasType. The __name__ attribute holds the name of the type alias. The value of the type alias is stored in the __value__ attribute. It is evaluated lazily, so the value is computed only if the attribute is accessed.

Type aliases can also be generic::

type ListOrSet[T] = list[T] | set[T]

In this case, the type parameters of the alias are stored in the __type_params__ attribute.

See PEP 695 for more information.
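
A small runtime sketch of those attributes, assuming Python 3.12 or later for the type statement::

type Alias = int
type ListOrSet[T] = list[T] | set[T]

print(Alias.__name__)             # 'Alias'
print(Alias.__value__)            # <class 'int'>; evaluated lazily on access
print(ListOrSet.__type_params__)  # (T,)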

Unpack = Unpack

Type unpack operator.

The type unpack operator takes the child types from some container type, such as tuple[int, str] or a TypeVarTuple, and 'pulls them out'.

For example::

# For some generic class `Foo`:
Foo[Unpack[tuple[int, str]]]  # Equivalent to Foo[int, str]

Ts = TypeVarTuple('Ts')
# Specifies that `Bar` is generic in an arbitrary number of types.
# (Think of `Ts` as a tuple of an arbitrary number of individual
#  `TypeVar`s, which the `Unpack` is 'pulling out' directly into the
#  `Generic[]`.)
class Bar(Generic[Unpack[Ts]]): ...
Bar[int]  # Valid
Bar[int, str]  # Also valid

From Python 3.11, this can also be done using the * operator::

Foo[*tuple[int, str]]
class Bar(Generic[*Ts]): ...

And from Python 3.12, it can be done using built-in syntax for generics::

Foo[*tuple[int, str]]
class Bar[*Ts]: ...

The operator can also be used along with a TypedDict to annotate **kwargs in a function signature::

class Movie(TypedDict):
    name: str
    year: int

# This function expects two keyword arguments - *name* of type `str` and
# *year* of type `int`.
def foo(**kwargs: Unpack[Movie]): ...

Note that there is only some runtime checking of this operator. Not everything the runtime allows may be accepted by static type checkers.

For more information, see PEPs 646 and 692.
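
As one more illustrative sketch, Unpack (or the equivalent * form) can also make an ordinary function generic over an arbitrary number of argument types (as_tuple is just an example)::

from typing import TypeVarTuple, Unpack

Ts = TypeVarTuple('Ts')

def as_tuple(*args: Unpack[Ts]) -> tuple[Unpack[Ts]]:
    # The argument types are captured by Ts, so a checker infers
    # as_tuple(1, "a") as tuple[int, str].
    return args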