Ask HN: How do you make sure a back end matches API-spec?

  • Great question! This problem recurs, fractally, at every level of the stack; I ran into it when formally verifying operating systems, compilers, and runtimes. The core questions are always the same: what do we mean by correct, and how can we assure ourselves that the implementation has that property? You have the right instinct that it's important to have a specification.

    One approach is an "executable specification": the question then reduces to whether the specification interpreter/compiler is correct. If it is, your implementation has no choice but to match the specification. The Python library Connexion implements this approach (https://connexion.readthedocs.io/en/stable/routing.html): you augment your spec with references to specific Python functions, and Connexion automatically validates every request against the spec.
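    To make the idea concrete, here's a minimal, library-free sketch of the executable-specification pattern (all names here are illustrative, not Connexion's actual API): the spec is plain data, a generic dispatcher validates every request against it, and handler code only runs if validation passes.

```python
# A toy "executable specification": the spec is data, and a generic
# dispatcher enforces it before any handler code runs.
SPEC = {
    "/pets": {
        "method": "POST",
        "required_fields": {"name": str, "age": int},
        "handler": "create_pet",  # name of the function to call
    },
}

def create_pet(body):
    return {"created": body["name"]}

HANDLERS = {"create_pet": create_pet}

def dispatch(path, method, body):
    route = SPEC.get(path)
    if route is None or route["method"] != method:
        return 404, {"error": "not found"}
    # Validate the request against the spec before touching handler code.
    for field, ftype in route["required_fields"].items():
        if field not in body or not isinstance(body[field], ftype):
            return 400, {"error": f"bad field: {field}"}
    return 200, HANDLERS[route["handler"]](body)
```

    A well-formed request reaches the handler; a request missing `age` is rejected with a 400 by the dispatcher alone, so no handler can ever see spec-violating input.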

    Another approach is template generation, where the specification becomes input to a code generator that produces a skeleton you fill out with your own code. Swagger Codegen uses this approach, and there are other generators too. I haven't used them, but I'd be worried about the overhead of regenerating on every specification change, and about the implementation drifting from the spec between regenerations.
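    Roughly, the workflow looks like this (a hypothetical stub, illustrative only; real Swagger Codegen output is more elaborate and wired into a web framework): the generator owns the signature, which pins it to the spec, and you fill in only the body.

```python
# Hypothetical skeleton emitted by a spec-driven code generator.
# The signature comes from the spec; the body is yours to write.

def get_pet_by_id(pet_id: int) -> dict:
    # --- the generated stub originally said: ---
    # raise NotImplementedError("implement me")
    # --- hand-written implementation below ---
    return {"id": pet_id, "name": f"pet-{pet_id}"}
```

    The drift risk is exactly here: if the spec changes and you don't regenerate, the hand-written body keeps compiling against a stale signature.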

    At a different level entirely, you have automated testing. https://dredd.org/en/latest/index.html is a great tool for OpenAPI: it does basic behavioral testing that the implementation matches the spec. If you're using the above approaches, running such a test is kinda superfluous. https://schemathesis.readthedocs.io/en/stable/ is another cool one. If your main worry is specification drift, these tools are your friends.
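    The spirit of these tools can be sketched without the libraries themselves (this is a toy stand-in, not Schemathesis's API; the real thing generates inputs with Hypothesis and makes actual HTTP calls): draw many inputs from the spec's constraints and check that every response satisfies the spec's response shape.

```python
import random

# Toy spec: the endpoint takes an integer age in 0..120 and must
# return a dict containing an "age_group" string.
def handler(age: int) -> dict:
    return {"age_group": "minor" if age < 18 else "adult"}

def response_matches_spec(resp) -> bool:
    return isinstance(resp, dict) and isinstance(resp.get("age_group"), str)

def fuzz(n=200, seed=0):
    rng = random.Random(seed)
    for _ in range(n):
        age = rng.randint(0, 120)  # draw inputs from the spec's range
        resp = handler(age)
        assert response_matches_spec(resp), f"spec violation for age={age}"
    return True
```

    Tools like Schemathesis do the same loop for real: they read the OpenAPI document, generate conforming (and deliberately malformed) requests, and flag any response that violates the declared schema.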

  • Several teams at my org use Smithy. The gist is that you define the spec once, and Smithy generates the data model and clients from it.

    https://awslabs.github.io/smithy/
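    For flavor, a small model in the Smithy IDL looks roughly like this (a sketch using IDL 2.0 syntax; the service and shape names are made up):

```smithy
$version: "2"

namespace example.pets

service PetStore {
    version: "2024-01-01"
    operations: [GetPet]
}

operation GetPet {
    input := {
        @required
        petId: String
    }
    output := {
        name: String
    }
}
```

    From a model like this, Smithy's build plugins emit the serializers, data types, and client stubs, so the spec stays the single source of truth.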