Evidently AI migrates to Litestar!


Exciting news out in the Litestar ecosystem!

Evidently, the open-source machine learning observability platform, has migrated from FastAPI to Litestar!

What is Evidently AI?

From the horse's mouth:

Evidently AI allows you to evaluate, test, and monitor machine learning (ML) models from validation to production. From tabular data to natural language processing (NLP) and large language models (LLM). Built for data scientists and ML engineers.



They also offer self-hosting as well as managed cloud hosting.

You can read more on the evidentlyai.com website!

Why Litestar?

As described in evidently#862 and evidently#989 (and probably most helpfully in this comment), users wanted to be able to use the excellent Pydantic v2 library inside their apps.

The Issue

Evidently, at the time of writing, uses Pydantic v1 internally, and a limitation prevented Pydantic v1 and Pydantic v2 from running inside the same application.

The Solution

Thanks to the request in litestar#2419 and the subsequent pull request litestar#2487, Litestar now allows users to easily run Pydantic v1 and Pydantic v2 side by side in the same application.

Litestar x Pydantic, a history

Litestar itself migrated internally from Pydantic to msgspec, largely to avoid the maintenance burden of supporting two differing modeling systems.

However, we quite enjoy Pydantic and the work its team is doing, and we know our userbase thinks the same - so we maintain a native Pydantic plugin!

The Future

With the advent of the mixed Pydantic v1 / v2 support, Evidently had what they needed to migrate... but that wasn't all they got.

As a result, we decided to replace the FastAPI dependency with Litestar. On top of this, we had experienced some other issues with FastAPI as we further developed the UI and collector service (where it is used), so we also expect additional performance improvements from migrating to Litestar, solving two issues at once.

What does the future hold for your project?

There are many debates about which methodologies, tools, frameworks, and libraries to use. We are an opinionated group after all 😉.

In my opinion, there is not really a bad choice. Use FastAPI, Sanic, Litestar, whatever! Use the right tool for the job. Don't be a "frameworker", be a great engineer. These are all amazing projects made by fantastic groups of people who started out doing it for free.

Who Uses Litestar?

Our framework and tools are used across a wide variety of places, as you might expect (or be surprised by); this includes indie open-source projects, Google, O'Reilly Auto, Telemetry Sports (which serves the NFL and NCAA®), homelabbers, and everywhere in between.

One of the more exciting users of Litestar, though, is Scalar.com, which has built a (soon-to-be) open-source LLM. Stay tuned for that!

With that being said, Litestar continues to grow in usage across Fortune 500 companies and open source projects large and small. It's a valid choice for hosting your full stack apps, ML pipelines, APIs, and more!

We invite you to try out Litestar, or even [come contribute](https://github.com/search?q=user%3Alitestar-org+state%3Aopen+label%3A%22good+first+issue%22+++no%3Aassignee+&type=issues) and join our team!

It's a simple pip away!

```shell
python3 -m pip install litestar
# or, to include our native Pydantic plugin:
python3 -m pip install 'litestar[pydantic]'
```

Thanks to all of our contributors, sponsors, and users. We love what we do, but we also love using what we build and hope you do too.

  • Jacob | Litestar Maintainer & Polar.sh fanboy