Documentation

Getting started with VecsAI takes only three steps.

1. Start the VecsAI Server

Create a docker-compose.yml file and start the VecsAI server:

```yaml
services:
  vecsai-db:
    image: vecsai/vecsai-db:latest
    container_name: vecsai-db
    restart: always
    ports:
      - "8137:8137"
    volumes:
      - vecsai-data:/data
volumes:
  vecsai-data:
```

```bash
docker compose up -d
```

The VecsAI server is now running on port 8137. All that remains is to install the Python SDK.
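If you want to confirm that the container is accepting connections before moving on, you can probe the port with Python's standard library. This is a generic TCP check, not part of the VecsAI SDK:

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Prints True once the container from step 1 is up
print("VecsAI reachable:", port_open("localhost", 8137))
```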

2. Install Python SDK

Install the official Python SDK with pip. The SDK is fully typed and ready to use in your AI or LLM applications.

```bash
pip install vecsai
```

3. Basic Usage Example

Connect to the database, insert a vector, and perform a real-time similarity search.

```python
from vecsai import Client

# Initialize the client pointing at your local container
client = Client(host="localhost", port=8137)

# Create a vector collection; the dimension must match your vectors.
# The toy vectors below are 3-dimensional; in practice, use your
# embedding model's output dimension (e.g. 1536).
collection = client.create_collection(
    name="documents",
    dimension=3
)

# Insert vectors alongside custom metadata
collection.insert([
    {"id": "doc1", "vector": [0.1, 0.2, 0.3], "metadata": {"title": "AI Trends"}},
    {"id": "doc2", "vector": [0.5, 0.4, 0.9], "metadata": {"title": "Vector DBs"}}
])

# Run a nearest neighbor search for the 5 closest vectors
results = collection.search(
    vector=[0.1, 0.25, 0.8],
    top_k=5
)

print(results)
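Under the hood, a similarity search ranks every stored vector by its distance to the query and returns the closest matches. The sketch below illustrates the idea in plain Python, independent of the VecsAI SDK, assuming cosine similarity as the distance metric (your deployment may use a different metric):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# The same toy vectors as in the example above
stored = {
    "doc1": [0.1, 0.2, 0.3],
    "doc2": [0.5, 0.4, 0.9],
}
query = [0.1, 0.25, 0.8]

# Rank ids by similarity to the query, best match first
ranked = sorted(stored, key=lambda k: cosine_similarity(stored[k], query), reverse=True)
print(ranked)  # → ['doc1', 'doc2']
```

A production vector database avoids this brute-force scan by using approximate nearest neighbor indexes, but the ranking principle is the same.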