Getting started

Install the SDK, authenticate, and make your first API call.

The SeekrFlow Python SDK provides programmatic access to the SeekrFlow platform, including inference, agents, fine-tuning, content moderation, explainability, and data preparation. The SDK supports Python 3.9+ with synchronous and asynchronous clients.

For direct HTTP access without an SDK, see the API Reference.

Prerequisites

  • Python 3.9 or higher
  • A SeekrFlow account. See Access SeekrFlow for setup instructions.
  • A SeekrFlow API key, available in the User Profile section of the dashboard.

Installation

pip install --upgrade seekrai

Authentication

Set your API key as an environment variable:

macOS/Linux:

export SEEKR_API_KEY=your_api_key

Windows:

set SEEKR_API_KEY=your_api_key

Verify the variable is set by running echo $SEEKR_API_KEY (or echo %SEEKR_API_KEY% on Windows).
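In Python code, you can fail fast with a clear message if the variable is missing. This small helper is our own, not part of the SDK:

```python
import os

def require_api_key(env_var: str = "SEEKR_API_KEY") -> str:
    """Read the API key from the environment, raising a clear error if unset."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"{env_var} is not set; export it before creating the client.")
    return key
```

Pass the result to the client constructor, e.g. `SeekrFlow(api_key=require_api_key())`, instead of letting an unset key surface later as an authentication error.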

Supported integrations

SeekrFlow supports the native Python SDK and two third-party clients. All three provide access to the same inference engine.

SeekrFlow SDK – Native Python client with full platform access, including agents, fine-tuning, data engine, and explainability. No additional packages required beyond seekrai.

OpenAI SDK – OpenAI-compatible inference endpoint. Existing OpenAI-based applications can connect to SeekrFlow by changing the base_url. Install with:

pip install openai

The OpenAI compatibility layer supports model, messages, stream, temperature, logprobs, top_logprobs, max_tokens, stop, top_p, frequency_penalty, presence_penalty, and tools. Parameters such as tool_choice, parallel_tool_calls, n, logit_bias, and max_completion_tokens are not supported.
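When porting an existing OpenAI-based application, a quick way to catch unsupported parameters before sending a request is to check the keyword arguments against the supported set above. The helper and its name are our own, shown only as a sketch:

```python
# Parameters the SeekrFlow OpenAI-compatible endpoint accepts (from the list above)
SUPPORTED_PARAMS = {
    "model", "messages", "stream", "temperature", "logprobs", "top_logprobs",
    "max_tokens", "stop", "top_p", "frequency_penalty", "presence_penalty", "tools",
}

def check_request(kwargs: dict) -> dict:
    """Raise if a request uses parameters the compatibility layer does not support."""
    unsupported = set(kwargs) - SUPPORTED_PARAMS
    if unsupported:
        raise ValueError(f"Unsupported for SeekrFlow: {sorted(unsupported)}")
    return kwargs
```

For example, a request passing `n=2` would be rejected here rather than failing at the endpoint.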

LangChain – The ChatSeekrFlow wrapper integrates SeekrFlow models into LangChain chains, prompts, and tools. Install with:

pip install langchain langchain-community langchain-seekrflow

ChatSeekrFlow supports tool calling, structured output, JSON mode, streaming, and token usage tracking. Async APIs, image input, audio input, and video input are not currently supported.

First API call

The examples below make the same chat completion request with each of the three supported clients.

SeekrFlow SDK:

import os
from seekrai import SeekrFlow

client = SeekrFlow(api_key=os.environ.get("SEEKR_API_KEY"))

response = client.chat.completions.create(
    model="meta-llama/Llama-3.1-8B-Instruct",
    messages=[{"role": "user", "content": "What is SeekrFlow?"}],
)
print(response.choices[0].message.content)

OpenAI SDK:

# Requires: pip install openai
from openai import OpenAI

client = OpenAI(
    base_url="https://flow.seekr.com/v1/inference",
    api_key="your_seekr_api_key",
)

response = client.chat.completions.create(
    model="meta-llama/Llama-3.1-8B-Instruct",
    messages=[{"role": "user", "content": "What is SeekrFlow?"}],
)
print(response.choices[0].message.content)

LangChain:

# Requires: pip install seekrai langchain langchain-community langchain-seekrflow
import os
from seekrai import SeekrFlow
from langchain_seekrflow import ChatSeekrFlow
from langchain.schema import HumanMessage

seekr_client = SeekrFlow(api_key=os.environ.get("SEEKR_API_KEY"))

llm = ChatSeekrFlow(
    client=seekr_client,
    model_name="meta-llama/Llama-3.1-8B-Instruct",
)

response = llm.invoke([HumanMessage(content="What is SeekrFlow?")])
print(response.content)

Asynchronous usage

The SeekrFlow SDK includes an asynchronous client for non-blocking requests.

import os, asyncio
from seekrai import AsyncSeekrFlow

client = AsyncSeekrFlow(api_key=os.environ.get("SEEKR_API_KEY"))

async def main():
    response = await client.chat.completions.create(
        model="meta-llama/Llama-3.1-8B-Instruct",
        messages=[{"role": "user", "content": "What is SeekrFlow?"}],
    )
    print(response.choices[0].message.content)

asyncio.run(main())
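The async client pays off when you fan out several requests concurrently with `asyncio.gather`. The sketch below uses a stand-in coroutine so it runs without credentials; in real code, `ask` would await `client.chat.completions.create(...)` and return `response.choices[0].message.content`:

```python
import asyncio

async def ask(question: str) -> str:
    # Stand-in for an AsyncSeekrFlow chat completion call
    await asyncio.sleep(0)
    return f"answer to: {question}"

async def main() -> list:
    questions = ["What is SeekrFlow?", "Which models are available?"]
    # gather issues all requests concurrently instead of awaiting them one by one
    return await asyncio.gather(*(ask(q) for q in questions))

answers = asyncio.run(main())
```

With real network calls, total latency approaches that of the slowest single request rather than the sum of all of them.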

What you can build