
Aimbase

Declarative, instant REST APIs for base AI models, built on instarest (a FastAPI, Pydantic, SQLAlchemy, and PostgreSQL library) and MinIO.


Documentation: aimbase.erob.io

Source Code: github.com/erob123/aimbase


Aimbase offers an opinionated yet flexible abstraction that integrates with large-object storage to safeguard proprietary or data-controlled model weights. Building on the instant database and API configuration provided by Instarest, it gives you a declarative, extensible framework for crafting REST APIs for AI models. It streamlines development by eliminating redundant boilerplate and promoting code reuse, which is especially valuable when working with models tied to sensitive or proprietary data.

Our goal is to help you turn months of work into days and thousands of lines of code into less than a hundred.

By using Aimbase, you will notice:

  • Simplicity: your typical multi-folder, multi-file, multi-class, multi-method, multi-line, multi-annotation, multi-configuration FastAPI application will be reduced to a single file with a few lines of code.
  • Consistency: your application will be built on a consistent, declarative, and opinionated foundation, making it easier to understand and maintain.
  • Speed: your application will be built on a foundation that is designed to be fast, both in terms of development and runtime performance.
  • Fewer Unit Tests: your application will be built on a foundation that is designed to be correct, reducing the need for extensive unit testing. Complete code coverage can be achieved with a handful of unit tests.
  • Easy: Designed to be easy to use and learn. Less time reading docs.
  • Extensible: Aimbase is built to be modular, so it is easy to extend with your own custom opinions and abstractions. Frameworks such as YourFramework and Chainbase are built on top of Aimbase in this way.
  • Standards-based: Based on FastAPI, which itself is based on (and fully compatible with) the open standards for APIs: OpenAPI (previously known as Swagger) and JSON Schema.

Requirements

  • Python 3.11+
  • A PostgreSQL database (use docker-compose to get one up and running quickly)
  • MinIO (used for large-object storage of proprietary or data-controlled model weights and configurations)

Installation

$ pip install aimbase

---> 100%

Example

Create it

Let's create a comprehensive, database-backed, large-object storage-enabled, type-checked, versioned REST API tailored for handling proprietary or data-controlled models with Aimbase in just five minutes:

  • Create a file main.py with:
## ************ ENV VAR INIT BEFORE IMPORTS ************ ##
# Make sure to set ENVIRONMENT, ENV_VAR_FOLDER, and SECRETS in your environment,
# outside of any .env file.  This is to ensure that the correct environment
# variables are loaded before the app is initialized.
# Default values are: ENVIRONMENT=local, ENV_VAR_FOLDER=./env_vars, SECRETS=False if not set here
import os

os.environ["ENVIRONMENT"] = "local"
os.environ["ENV_VAR_FOLDER"] = os.path.join(
    os.path.abspath(os.path.dirname(__file__)), "env_vars"
)
os.environ["SECRETS"] = "False"
## ************ ENV VAR INIT BEFORE IMPORTS ************ ##

from aimbase.crud.base import CRUDBaseAIModel
from aimbase.db.base import BaseAIModel
from aimbase.initializer import AimbaseInitializer
from aimbase.routers.sentence_transformers_router import (
    SentenceTransformersRouter,
)
from aimbase.dependencies import get_minio
from aimbase.crud.sentence_transformers_vector import (
    CRUDSentenceTransformersVectorStore,
)
from aimbase.db.vector import AllMiniVectorStore, SourceModel
from instarest import (
    AppBase,
    DeclarativeBase,
    SchemaBase,
    Initializer,
    get_db,
    RESTRouter,
    CRUDBase,
)

from aimbase.services.sentence_transformers_inference import (
    SentenceTransformersInferenceService,
)

# TODO: import to __init__.py for aimbase and update imports here
Initializer(DeclarativeBase).execute(vector_toggle=True)
AimbaseInitializer().execute()

# build pydantic data transfer schemas automagically
base_ai_schemas = SchemaBase(BaseAIModel)
vector_embedding_schemas = SchemaBase(AllMiniVectorStore)

# build db services automagically
crud_ai_test = CRUDBaseAIModel(BaseAIModel)
crud_vector_test = CRUDSentenceTransformersVectorStore(AllMiniVectorStore)

## ************ DEV INITIALIZATION ONLY (if desired to simulate
#  no internet connection...will auto init on first endpoint hit, but
#  will not auto-upload to minio) ************ ##
SentenceTransformersInferenceService(
    model_name="all-MiniLM-L6-v2",
    db=next(get_db()),
    crud=crud_ai_test,
    s3=get_minio(),
    prioritize_internet_download=False,
).dev_init()
## ************ DEV INITIALIZATION ONLY ************ ##

# build ai router automagically
document_vector_store_router = SentenceTransformersRouter(
    model_name="all-MiniLM-L6-v2",
    schema_base=vector_embedding_schemas,
    crud_ai_base=crud_ai_test,
    crud_base=crud_vector_test,
    prefix="/sentences",
    allow_delete=True,
)

# build pydantic data transfer schemas & crud db services automagically
source_model_schemas = SchemaBase(
    SourceModel,
    optional_fields=[
        "description",
        "downloaded_datetime",
        "private_url",
        "public_url",
        "embedding",
    ],
)
crud_source_model = CRUDBase(SourceModel)

# build sources router automagically
sources_router = RESTRouter(
    schema_base=source_model_schemas,
    crud_base=crud_source_model,
    prefix="/sources",
    allow_delete=False,
)

# set up the app base from the routers
app_base = AppBase(
    crud_routers=[document_vector_store_router, sources_router],
    app_name="Aimbase Inference Test App API",
)

# autowired, versioned app
auto_app = app_base.get_autowired_app()

# core underlying app
app = app_base.get_core_app()

Set up the database and MinIO

If you already have a PostgreSQL database and MinIO running, you can skip this step.

If not, we will launch them via local containers:

  1. First, make sure that you have Docker installed and running. If you don't, you can install it by following the instructions in the Docker documentation.

  2. Download the aimbase docker-compose file:

$ curl -O https://raw.githubusercontent.com/erob123/aimbase/main/docker-compose.yml

  3. Launch the database and MinIO via docker-compose from the same directory as the docker-compose.yml file:
$ docker-compose up --build

Tell your app how to connect to the database and MinIO

Aimbase is set up to automatically connect to PostgreSQL and MinIO with the following environment variables. To get started, create a file named local.env in the same directory as main.py with the following contents:

POSTGRES_USER=postgres
POSTGRES_PASSWORD=postgres
POSTGRES_SERVER=localhost
POSTGRES_PORT=5432
POSTGRES_DB=postgres

DOCS_UI_ROOT_PATH=""
LOG_LEVEL="DEBUG"

MINIO_BUCKET_NAME=test
MINIO_ACCESS_KEY=miniouser
MINIO_SECRET_KEY=minioadmin
MINIO_ENDPOINT_URL=localhost:9000
MINIO_REGION=""
MINIO_SECURE=False

This will allow your app to connect to the database and MinIO large-object storage on launch. For reference, these values are defined within the docker-compose.yml file for local development; in production they should come from the credentials provided by your database and object-storage providers.
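
If you want to sanity-check both connections before launching the app, a short standalone script like the one below can help. This is a minimal sketch and not part of Aimbase itself: it assumes the docker-compose services above, the credentials from local.env, and that the sqlalchemy, psycopg2, and minio packages are installed.

# connect_check.py -- optional, standalone sanity check (not part of Aimbase)
from minio import Minio
from sqlalchemy import create_engine, text

# PostgreSQL: values mirror the POSTGRES_* entries in local.env
engine = create_engine("postgresql://postgres:postgres@localhost:5432/postgres")
with engine.connect() as conn:
    print("PostgreSQL reachable:", conn.execute(text("SELECT 1")).scalar() == 1)

# MinIO: values mirror the MINIO_* entries in local.env
client = Minio(
    "localhost:9000",
    access_key="miniouser",
    secret_key="minioadmin",
    secure=False,
)
print("MinIO reachable, bucket 'test' exists:", client.bucket_exists("test"))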

Run it

You should now have two files in the same directory: main.py and local.env. Let's run the app with:

$ uvicorn main:auto_app --reload

INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO:     Started reloader process [28720]
INFO:     Started server process [28722]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
About the command uvicorn main:auto_app --reload...

The command uvicorn main:auto_app refers to:

  • main: the file main.py (the Python "module").
  • auto_app: the object created inside of main.py with the line auto_app = app_base.get_autowired_app().
  • --reload: make the server restart after code changes. Only do this for development.
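
If you prefer to launch from Python instead of the shell, the equivalent programmatic call is roughly the sketch below. uvicorn.run is plain Uvicorn, not something Aimbase provides, and the hypothetical run.py simply mirrors the CLI command above.

# run.py -- optional programmatic equivalent of `uvicorn main:auto_app --reload`
import uvicorn

if __name__ == "__main__":
    # reload=True is for development only, just like --reload on the CLI
    uvicorn.run("main:auto_app", host="127.0.0.1", port=8000, reload=True)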

Interactive API docs

Now go to http://127.0.0.1:8000/v1/docs.

You will see the automatic interactive API documentation (provided by Swagger UI).

Congratulations! You have just created your first fully-functional REST API with Aimbase, implementing AI-based sentence embeddings backed by a database in just a few lines.

Check it

Send a test sentence to the API

In the interactive docs (or using curl if you prefer), go to the POST operation for /sentences/encode and try it. Send this JSON body:

{
  "documents": ["This is a test sentence."]
}

You should see the curl command that was sent:

$ curl -X 'POST' \
'http://localhost:8000/v1/sentences/encode' \
-H 'accept: application/json' \
-H 'Content-Type: application/json' \
-d '{"documents": ["This is a test sentence."]}'

and the response received (with calculated embeddings):

{
  "embeddings": [[0.123, 0.456, 0.789, ...]]
}
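
If you would rather script this check than use the docs UI or curl, a snippet along these lines should produce the same result. It is only a sketch: it assumes the requests package is installed and the server is still running locally.

# encode_check.py -- same request as the curl above, sent via the requests package
import requests

response = requests.post(
    "http://localhost:8000/v1/sentences/encode",
    json={"documents": ["This is a test sentence."]},
)
response.raise_for_status()
print(response.json())  # expect {"embeddings": [[...]]}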

And that's it! You have successfully created a base API for sentence-transformers embeddings. Continue through the documentation to see how to build more extensible APIs using sentence-transformers and other AI models with aimbase.

All done in under 50 lines of code.

Alternative API docs

If you prefer, go to http://127.0.0.1:8000/v1/redoc.

You will see the alternative automatic documentation (provided by ReDoc).

License

This project is licensed under the terms of the MIT license.