General-Purpose Python Type Contracts

For a while now at my new job, I have been staring at a lot of Python code. It's nice and easy to read, but I keep wondering where improvements can be made. A lot of them come down to actually adding type annotations to functions and running mypy over the codebase to catch the fuzzy cases.

I have problems with that, though, because it feels like a half-baked solution. mypy enables smarter static analysis, but I don't feel like it helps at runtime, or during day-to-day development, very much. The fact that mypy is completely optional doesn't help either. In fact, the first note in the typing docs tells us as much:

Note: The Python runtime does not enforce function and
variable type annotations. They can be used by third party
tools such as type checkers, IDEs, linters, etc.

Python is a dynamic language, and as such there is no real "static typing" unless you employ tools (like mypy) to force your project to act as if it were statically typed.

One project I wrote a long time ago attempted to address this by using function decorators to "fix" the language. I was influenced a lot by Haskell, and still am to this day. I love how functional languages like Haskell or OCaml do away with dynamic-language problems by making it easy to write statically typed code. They do this because they made their compilers just... smarter. That's it.

Decorator Patching

Using decorators is the easiest, lowest-barrier fix we could make, since a decorator simply overlays on top of a plain Python function. The decorator inspects the arguments, passes them along to the target, and once executed, returns the output of its target function.

There are two parts to this we can implement: an expects() decorator and an outputs() decorator. expects() takes a list of types describing the parameters of our target function, while outputs() takes a list of types describing the value(s) it returns. You can use neither, both, or only one of them; it depends on how far you want to go in solidifying your code.

def expects(*types):
    """Check the positional arguments of the decorated function against `types`."""
    def func_in(fn):
        def in_wrap(*args, **kwargs):
            # Arity check: one type per positional argument
            if len(types) != len(args):
                raise SyntaxError(f"Expected {len(types)}, got {len(args)}.")
            # Type check each argument against its declared type
            for t, v in zip(types, args):
                if not isinstance(v, t):
                    raise TypeError(f"Value '{v}' not of type '{t}'")
            return fn(*args, **kwargs)
        return in_wrap
    return func_in

def outputs(*types):
    """Check the return value(s) of the decorated function against `types`."""
    def func_out(fn):
        def in_wrap(*args, **kwargs):
            finalv = fn(*args, **kwargs)
            if not hasattr(finalv, '__iter__'):
                # Single, non-iterable return value: must match one of the types
                if not isinstance(finalv, types):
                    raise TypeError(f"Value '{finalv}' not of type '{types}'")
            else:
                # Iterable return value: check element-wise against the types
                for t, v in zip(types, finalv):
                    if not isinstance(v, t):
                        raise TypeError(f"Value '{v}' not of type '{t}'")
            return finalv
        return in_wrap
    return func_out

If it looks like there's a lot going on here, let me explain.

Decorators work because Python treats functions as first-class citizens, allowing us to pass around a reference to the function itself. Because of this, functions can take in other functions, call them, and return new functions.

A decorator is a function that takes in a function and returns a new function that does something with it. In some cases, decorators exist to modify a function's arguments in some meaningful way; they let us change the behavior of functions without touching their definitions, and they can be re-used across a codebase. Instead of littering a function with gross print statements, we can write a decorator that logs the inputs and outputs for us.

def logger(fn):
    def handler(*args, **kwargs):
        # Log the call, run the real function, then log and return its result
        print(f"{fn.__name__}({args}, {kwargs})")
        v = fn(*args, **kwargs)
        print(f"=> {v}")
        return v
    return handler

@logger
def f(x, y):
    return x*y
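Calling the decorated function then logs both the call and its result:

>>> f(3, 4)
f((3, 4), {})
=> 12
12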

So long as you carry the arguments through to the final function call, these decorators do their job. Since @expects and @outputs are separate decorators that can be used independently, you might worry that stacking both would invoke the target function twice. It doesn't: each decorator wraps the one below it, so the target is only ever called once, as you can see in this example.

@expects(int) 
def do_with_int(x):
    print("Hi, I handled an int")

@outputs(str) 
def output_str():
    print("Hello, returning a string")
    return "Hi, I'm a string"

@expects(int, int)
@outputs(int) 
def multiply(x, y):
    print("I don't get activated twice")
    return x * y
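As a quick sanity check with the functions above, each body prints exactly once, and a bad argument is rejected before the body ever runs:

do_with_int(5)       # prints "Hi, I handled an int"
output_str()         # prints its message once and returns "Hi, I'm a string"
multiply(3, 4)       # prints "I don't get activated twice" exactly once, returns 12

multiply(3, "4")     # raises TypeError: Value '4' not of type '<class 'int'>'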

Flat Contracts

Using decorators gets us a jump-start on better type checking, which is nice. It solidifies code and produces runtime errors that almost act like compile-time errors. But the true power of languages like Haskell and OCaml comes from extending the type system so that functions can accept and produce values of any type within a certain range.

Take for example this function, which adds one to a number and returns it.

@expects(int) 
@outputs(int) 
def add1(x):
    return x+1

add1(5) # 6
add1(5.1) # exception thrown, not an int

The + operator (also available as operator.add in the operator module) works with any type that implements __add__(), and both float and int implement that method. However, our crude type checker doesn't have a way of binding multiple types to one input variable, which is unfortunate and would lead to duplicated code for this case.

A better way is to come up with a system that can be used to define a range of type checks and other useful facts about values. Sure, we have the int type to compare against and construct values with, but there is no contextual information attached; it's purely a number. How do we restrict numbers to a range? How do we check if a number is positive? Or if it's odd or even?

Asserting that a condition holds is what we could call a "contract" on that value: it must satisfy some bounds we define in order to be considered valid. The contractual agreement between values and functions has to be satisfied before the function runs, so the function can operate within that scope, leading to fewer run-time errors from silly things like out-of-bounds indices, off-by-one mistakes, nulled-out values, or malformed data.

Let's start with what I will call a "flat" contract. It exerts minimal effort in asserting whether a value is valid; a flat contract is normally one that is easy to prove based on its bounds and its inputs. Let's say we have a contract that takes one Python type and checks whether the value(s) given to it all match that type.

def flat_contract(t):
    def inner_check(*vals):
        # Every value given must be an instance of the single type `t`
        for x in vals:
            if not isinstance(x, t):
                return False
        return True
    return inner_check

is_int = flat_contract(int)
is_float = flat_contract(float)

print(is_int(3)) # true
print(is_float(3)) # false
print(is_float(3.1)) # true

We can see that is_int and is_float are valid flat contracts that are pretty simple to prove, and that an int cannot pass as a float in this system; isinstance() doesn't do any numeric coercion.

The next step is providing tools to check whether inputs belong to a family of types, solving the numeric overloading issue we just ran into. If we can write a contract that checks whether a value matches a range of types, it will help us write better, more re-usable, generic Python code. For this we need contracts that implement logic similar to that of the any and all functions.

def or_contract(*types):
    def inner_check(*vals):
        for x in vals:
            res = [isinstance(x, t) for t in types]
            if not any(res):
                return False
        return True
    return inner_check

is_num = or_contract(int, float)

print(is_num(3, "seven")) # false
print(is_num(3.1, 4.1, 700)) # true
print(is_num("3.1", "300")) # false

Here we use logic similar to a Boolean or, where at least one operand must be true for the whole expression to be true. We check each value against the list of types, and if not a single type matches, we consider the value invalid. In our example, we put int and float into an or_contract, which lets us validate incoming values of either numeric type.

The more complicated contract is the and_contract, which is implemented by changing only a few things around.

def and_contract(*types):
    def inner_check(*vals):
        for x in vals:
            res = [isinstance(x, t) for t in types]
            if not all(res):
                return False
        return True
    return inner_check

is_num = and_contract(int, int)

print(is_num(3, 3.1)) # false
print(is_num(3.1, 4.1, 700)) # false
print(is_num(3, 4, 5, 6)) # true

It's the same core logic as or_contract, but the reason I call it more complicated is that very few types in Python will ever satisfy it unless a fair amount of class inheritance is involved. Some types do qualify as others, but rarely: a float can never be an int, for instance, while a defaultdict is also a dict, so it can satisfy bounds on both types at once.

>>> from collections import defaultdict
>>> d = {}
>>> isinstance(d, dict)
True
>>> isinstance(d, defaultdict)
False # dict() does not qualify as defaultdict()
>>> d2 = defaultdict()
>>> isinstance(d2, defaultdict)
True # qualifies as a defaultdict
>>> isinstance(d2, dict)
True # also qualifies as a dict, can be and_contract'd
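So a defaultdict can satisfy an and_contract built over both classes, while a plain dict cannot. A small sketch (is_dictlike is just an illustrative name):

from collections import defaultdict

is_dictlike = and_contract(dict, defaultdict)

print(is_dictlike({}))             # false, a plain dict is not a defaultdict
print(is_dictlike(defaultdict()))  # true, satisfies both bounds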

Now all we need is a way to bind contracts to functions, via what's called a function contract. It's a lot like expects() and outputs(), but we can't reuse those, since contracts are functions rather than types. For this we'll use the names contract_in() and contract_out().

def contract_in(*contracts):
    if not all([callable(c) for c in contracts]):
        raise TypeError("All types must be callable contracts")
    def fn_wrap(fn):
        def arg_wrap(*args, **kwargs):
            # One contract per positional argument
            if len(contracts) != len(args):
                raise SyntaxError(f"Expected {len(contracts)} inputs, got {len(args)}")
            for con, val in zip(contracts, args):
                if not con(val):
                    raise TypeError(f"Expecting a value to satisfy {con.__name__}, got {type(val)}")
            return fn(*args, **kwargs)
        return arg_wrap
    return fn_wrap


def contract_out(*contracts):
    def func_out(fn):
        def in_wrap(*args, **kwargs):
            finalv = fn(*args, **kwargs)
            if not hasattr(finalv, '__iter__'):
                # Single, non-iterable return value: must satisfy every contract
                for con in contracts:
                    if not con(finalv):
                        raise TypeError(f"Expecting value to satisfy {con.__name__}, got '{type(finalv)}'")
            else:
                # Iterable return value: check element-wise against the contracts
                for con, val in zip(contracts, finalv):
                    if not con(val):
                        raise TypeError(f"Expecting value to satisfy {con.__name__}, got '{type(val)}'")
            return finalv
        return in_wrap
    return func_out

Seems similar, right? That's because it's almost exactly the same code as expects() and outputs(). This version operates on contract functions instead of plain type constructors, allowing arbitrary predicate checks at both the input and output stages of a function.

is_num = or_contract(int, float)

@contract_in(is_num) 
@contract_out(is_num) 
def square(x):
    return x * x

square(500) # passes
square(True) # passes?
square("string") # fails

Seems like we're all good here! Except, wait: why is a Boolean passing the is_num contract? Well...

>>> isinstance(True, int)
True

Oops! Python thinks True, a Boolean value, is also a number. bool is actually a subclass of int, so it even supports integer addition. This is kind of okay, because a Boolean is conventionally treated as a number (True=1, False=0). Normally this would be fine, but since a Boolean usually isn't what we mean when we ask for a number, we might want a better alternative, and I think we can blame isinstance() for this fun little bug.

It might be more beneficial in some cases to compare classes directly instead of using isinstance(). isinstance() follows class inheritance, and as such some values may qualify as types you do not expect. It may be useful to have a strict_contract() function that compares the exact class instead.

def strict_contract(t):
    def inner_wrap(val):
        # Compare the exact class, ignoring inheritance entirely
        return val.__class__ == t
    return inner_wrap

This compares the class we pass in against the exact class that was used to construct the given value. We could extend or_contract and and_contract to do the same, or add a keyword toggle for a strict mode on the contracts, but I will leave that as an exercise for the reader.
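As a quick illustration of the difference (is_strict_int and increment are just example names), swapping strict_contract in for the flat contract makes the Boolean case behave:

is_strict_int = strict_contract(int)

print(is_strict_int(5))     # true
print(is_strict_int(True))  # false, bool no longer sneaks through as an int

@contract_in(is_strict_int)
@contract_out(is_strict_int)
def increment(x):
    return x + 1

increment(3)     # passes
increment(True)  # fails at the input check with a TypeError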

Overlaying with Type Annotations

Since type annotations are part of the language itself, one is left to wonder how they actually work. Ever since annotations were introduced a few versions back, every function has carried a hidden attribute, which you can see for yourself here:

>>> def fn(x): return x
>>> fn.__annotations__
{}
>>> from typing import *
>>> def gn(x: Any) -> Any: return x
>>> gn.__annotations__
{'x': typing.Any, 'return': typing.Any}

The information is attached to the function object, and this is pretty much how third-party tools gather type information from Python code. But can we use it ourselves, to our own benefit? This is where I think it gets tricky, and unfortunately the result would still end up looking a lot like our contract system.

Take for example this code:

def fn(x: int) -> str:
    return f"{x}"

Its annotations:

>>> fn.__annotations__
{'x': <class 'int'>, 'return': <class 'str'>}

It would be relatively simple to check these with a decorator we could call enforce(), which enforces the types used to annotate the function.

def enforce(fn):
    def in_wrap(*args, **kwargs):
        annotes = fn.__annotations__
        # Pair each positional argument with its parameter name from the code object
        for vname, val in zip(fn.__code__.co_varnames, args):
            if not val.__class__ == annotes[vname]:
                raise TypeError(f"Argument '{vname}' does not match its annotation")
        finalv = fn(*args, **kwargs)
        # Check the return value against the 'return' annotation
        if not finalv.__class__ == annotes['return']:
            raise TypeError("Return value does not match its annotation")
        return finalv
    return in_wrap

@enforce
def fn(x: int) -> str:
    print("activated")
    return "YO"

fn(500)
fn("oops")

At first glance, this is completely fine. It does roughly what we defined earlier with the contract system: it checks types and raises errors if a value of the wrong type is passed in. I was lazy and didn't handle **kwargs, but again, that's another exercise for the reader. Note that this code digs into the Python code object itself, scanning the variable names to match them up against the positional inputs. However, this approach does not carry over to code like this:

import typing

def fn(x: int) -> typing.List[int]:
    return [y*y for y in range(x)]

With its annotations:

>>> fn.__annotations__
{'x': <class 'int'>, 'return': typing.List[int]}

The point of typing was to add a way to describe functions in a more abstract manner, but since the typing library doesn't provide any enforcement, all of that work is left to tool developers to figure out, which I find annoying. typing.List[int] describes an iterable object that should be a list of ints, but since lazy generation exists, it doesn't do a good job of answering "should my output be a flat list, or a lazily-generated iterator that yields values and is really a function?"

I suspect the Python developers expect us, the programmers, to come up with solutions for the large amount of work the typing library leaves behind. If you run dir(typing), you can see all the names it defines; they are mostly enumerated names or special syntactic wrappers around other types.
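That said, if you did want an enforce()-style decorator to understand parameterized annotations, the typing module does expose the raw pieces through typing.get_origin() and typing.get_args() (available since Python 3.8). Here's a minimal sketch of a shallow check, nothing like the recursive logic a real implementation would need:

import typing

annotation = typing.List[int]

print(typing.get_origin(annotation))  # <class 'list'>
print(typing.get_args(annotation))    # (<class 'int'>,)

# A shallow check: the container is a list, and every element is an int
value = [1, 4, 9]
origin, args = typing.get_origin(annotation), typing.get_args(annotation)
print(isinstance(value, origin) and all(isinstance(v, args[0]) for v in value))  # true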

Summary

If we wanted an enforce() function that covered everything in the typing library, we would have a long road ahead of us, full of gigantic branching logic to validate every construct the library defines. For most of the Python community, that's probably fine, because it's a shared goal everyone can contribute to. mypy is important for exactly that reason: it's a great static analysis tool that others can join in on and help with, and it takes minimal effort to start using.

However, I think I lean more toward my contract system. I am probably the only one who thinks a clever contract system like this would do a better job of constraining programs. mypy doesn't actively make Python code stronger at runtime; it tells developers what is wrong with the code they have now. A contract system, on the other hand, enforces validity as the code runs and cements the abstractions in place.

However you go about it, hopefully this was an enlightening piece on how to make your Python code a little smarter. A few examples of contracts you could easily write (a couple of quick sketches follow the list):

  • integers within a certain range
  • odd or even numbers
  • strings being non-zero, or being a certain length
  • checking if a list contains a uniform type
  • restricting the values of a dictionary
  • checking whether certain values have bound methods
  • automatic regex checking for valid input
  • add Python doc strings to aid developers and users
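Here is what a couple of those could look like, built on the same patterns as above (the names are just illustrative):

def range_contract(low, high):
    """Contract: value is a number within [low, high]."""
    def inner_check(val):
        return isinstance(val, (int, float)) and low <= val <= high
    return inner_check

def nonempty_str_contract():
    """Contract: value is a string with at least one character."""
    def inner_check(val):
        return isinstance(val, str) and len(val) > 0
    return inner_check

def uniform_list_contract(t):
    """Contract: value is a list whose elements are all of type t."""
    def inner_check(val):
        return isinstance(val, list) and all(isinstance(x, t) for x in val)
    return inner_check

is_percentage = range_contract(0, 100)

@contract_in(is_percentage)
def set_volume(level):
    print(f"Volume set to {level}%")

set_volume(75)    # passes
set_volume(150)   # fails with a TypeError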

Thank you for reading, and I hope you learned a thing or two.
