Getting started
Install the SDK, authenticate, and make your first API call.
The SeekrFlow Python SDK provides programmatic access to the SeekrFlow platform, including inference, agents, fine-tuning, content moderation, explainability, and data preparation. The SDK supports Python 3.9+ with synchronous and asynchronous clients.
For direct HTTP access without an SDK, see the API Reference.
Prerequisites
- Python 3.9 or higher
- A SeekrFlow account. See Access SeekrFlow for setup instructions.
- A SeekrFlow API key, available in the User Profile section of the dashboard.
Installation
```
pip install --upgrade seekrai
```

Authentication
Set your API key as an environment variable:
On macOS/Linux:

```
export SEEKR_API_KEY=your_api_key
```

On Windows:

```
set SEEKR_API_KEY=your_api_key
```

Verify the variable is set by running echo $SEEKR_API_KEY (or echo %SEEKR_API_KEY% on Windows).
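Since the SDK reads the key from the environment at client creation time, it can help to check for it up front. A minimal sketch (not part of the SDK) that fails fast with a clear message:

```python
import os

# The SDK clients read this variable; check it before constructing a client
# so a missing key produces an obvious message rather than an auth error later.
api_key = os.environ.get("SEEKR_API_KEY")
if api_key:
    print("SEEKR_API_KEY is set")
else:
    print("SEEKR_API_KEY is missing; export it before creating a client")
```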
Supported integrations
SeekrFlow supports the native Python SDK and two third-party clients. All three provide access to the same inference engine.
SeekrFlow SDK – Native Python client with full platform access, including agents, fine-tuning, data engine, and explainability. No additional packages required beyond seekrai.
OpenAI SDK – OpenAI-compatible inference endpoint. Existing OpenAI-based applications can connect to SeekrFlow by changing the base_url. Install with:
```
pip install openai
```

The OpenAI compatibility layer supports model, messages, stream, temperature, logprobs, top_logprobs, max_tokens, stop, top_p, frequency_penalty, presence_penalty, and tools. Parameters such as tool_choice, parallel_tool_calls, n, logit_bias, and max_completion_tokens are not supported.
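The supported/unsupported split can also be enforced client-side before a request is sent. The helper below is a hypothetical sketch, not part of either SDK; it simply drops the parameters listed above as unsupported:

```python
# Hypothetical helper, not part of the OpenAI or SeekrFlow SDKs: keep only
# the parameters the compatibility layer documents as supported.
SUPPORTED_PARAMS = {
    "model", "messages", "stream", "temperature", "logprobs", "top_logprobs",
    "max_tokens", "stop", "top_p", "frequency_penalty", "presence_penalty",
    "tools",
}


def filter_for_seekrflow(params: dict) -> dict:
    """Return a copy of params without keys the endpoint rejects."""
    return {k: v for k, v in params.items() if k in SUPPORTED_PARAMS}


request = {
    "model": "meta-llama/Llama-3.1-8B-Instruct",
    "messages": [{"role": "user", "content": "Hello"}],
    "temperature": 0.2,
    "n": 3,            # unsupported: dropped
    "logit_bias": {},  # unsupported: dropped
}
print(sorted(filter_for_seekrflow(request)))
# → ['messages', 'model', 'temperature']
```

The filtered dict can then be passed to client.chat.completions.create(**filtered) as usual.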
LangChain – The ChatSeekrFlow wrapper integrates SeekrFlow models into LangChain chains, prompts, and tools. Install with:
```
pip install langchain langchain-community langchain-seekrflow
```

ChatSeekrFlow supports tool calling, structured output, JSON mode, streaming, and token usage tracking. Async APIs, image input, audio input, and video input are not currently supported.
First API call
SeekrFlow SDK:

```python
import os

from seekrai import SeekrFlow

client = SeekrFlow(api_key=os.environ.get("SEEKR_API_KEY"))

response = client.chat.completions.create(
    model="meta-llama/Llama-3.1-8B-Instruct",
    messages=[{"role": "user", "content": "What is SeekrFlow?"}],
)
print(response.choices[0].message.content)
```

OpenAI SDK:

```python
# Requires: pip install openai
from openai import OpenAI

client = OpenAI(
    base_url="https://flow.seekr.com/v1/inference",
    api_key="your_seekr_api_key",
)

response = client.chat.completions.create(
    model="meta-llama/Llama-3.1-8B-Instruct",
    messages=[{"role": "user", "content": "What is SeekrFlow?"}],
)
print(response.choices[0].message.content)
```

LangChain:

```python
# Requires: pip install seekrai langchain langchain-community langchain-seekrflow
import os

from langchain.schema import HumanMessage
from langchain_seekrflow import ChatSeekrFlow
from seekrai import SeekrFlow

seekr_client = SeekrFlow(api_key=os.environ.get("SEEKR_API_KEY"))

llm = ChatSeekrFlow(
    client=seekr_client,
    model_name="meta-llama/Llama-3.1-8B-Instruct",
)
response = llm.invoke([HumanMessage(content="What is SeekrFlow?")])
print(response.content)
```

Asynchronous usage
The SeekrFlow SDK includes an asynchronous client for non-blocking requests.
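Because the async client's methods are awaitable, several prompts can also be issued concurrently with asyncio.gather. The sketch below substitutes a stand-in coroutine for the real client.chat.completions.create call so it runs without network access or credentials:

```python
import asyncio


# Stand-in for client.chat.completions.create; a real program would
# await the AsyncSeekrFlow client here instead.
async def fake_completion(prompt: str) -> str:
    await asyncio.sleep(0)  # yield control, as a network call would
    return f"answer to: {prompt}"


async def main() -> list[str]:
    prompts = ["What is SeekrFlow?", "What models are available?"]
    # gather schedules all requests concurrently and preserves input order.
    return await asyncio.gather(*(fake_completion(p) for p in prompts))


results = asyncio.run(main())
print(results)
# → ['answer to: What is SeekrFlow?', 'answer to: What models are available?']
```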
```python
import asyncio
import os

from seekrai import AsyncSeekrFlow

client = AsyncSeekrFlow(api_key=os.environ.get("SEEKR_API_KEY"))


async def main():
    response = await client.chat.completions.create(
        model="meta-llama/Llama-3.1-8B-Instruct",
        messages=[{"role": "user", "content": "What is SeekrFlow?"}],
    )
    print(response.choices[0].message.content)


asyncio.run(main())
```

What you can build
Agents
Create agents, attach tools, manage threads, and run multi-turn conversations.
Fine-tuning
Train models on your data with instruction fine-tuning, context-grounded fine-tuning, or reinforcement tuning.
Content moderation
Classify content for safety and brand risk using Seekr ContentGuard and Meta Llama Guard.
Data engine
Ingest files, generate training datasets, and build vector databases for retrieval.
Explainability
Retrieve the fine-tuning data that influenced a model's response.
