Custom Functions

Custom functions can now be automatically invoked by the model based on context, streamlining business logic integration.

The example code below shows how to use the OpenAI client with SeekrFlow's inference engine to create a custom unit conversion tool that can be configured dynamically.

Primary use cases for custom functions are:

  • Fetching up-to-date data to incorporate into the model's response (RAG): stock market prices, exchange rates
  • Taking actions: calling APIs, submitting forms, or performing agentic workflow actions (e.g., escalating a ticket)

Create the client and make an API request

import os
import openai

# Set the API key
os.environ["OPENAI_API_KEY"] = "YOUR API KEY"

# Create the OpenAI client, pointing it at SeekrFlow's inference endpoint
client = openai.OpenAI(
    base_url="https://flow.seekr.com/v1/inference",
    api_key=os.environ.get("OPENAI_API_KEY"),
)
  
# Send a request so the specified Llama model can act as a unit conversion tool

response = client.chat.completions.create(
    model="meta-llama/Llama-3.1-8B-Instruct",
    stream=False,
    messages=[{
        "role": "user",
        "content": "Convert from 5 kilometers to miles"
    }],
    max_tokens=100,
    tools=[{
        "type": "function",
        "function": {
            "name": "convert_units",
            "description": "Convert between different units of measurement",
            "parameters": {
                "type": "object",
                "properties": {
                    "value": {"type": "number"},
                    "from_unit": {"type": "string"},
                    "to_unit": {"type": "string"}
                },
                "required": ["value", "from_unit", "to_unit"]
            }
        }
    }]
)

Define and register a function from JSON

# Parse the JSON definition and register the function it describes

def register_from_json(json_obj):
    # Build the function source; 'docstring' and 'code' are expected to
    # already carry one level of indentation
    code = f"def {json_obj['name']}({', '.join(json_obj['args'])}):\n{json_obj['docstring']}\n{json_obj['code']}"
    print(code)
    namespace = {}
    exec(code, namespace)
    return namespace[json_obj["name"]]
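As a concrete illustration, the JSON object below is a hypothetical definition of `convert_units` (its field values are assumptions for this sketch, not part of any API); feeding it to `register_from_json` yields a callable. The helper is repeated here so the sketch runs on its own:

```python
def register_from_json(json_obj):
    # Same helper as above, repeated so this sketch is self-contained
    code = f"def {json_obj['name']}({', '.join(json_obj['args'])}):\n{json_obj['docstring']}\n{json_obj['code']}"
    namespace = {}
    exec(code, namespace)
    return namespace[json_obj["name"]]

# Hypothetical JSON definition; body lines carry their own indentation
# because register_from_json splices them into the source verbatim
spec = {
    "name": "convert_units",
    "args": ["value", "from_unit", "to_unit"],
    "docstring": '    """Convert a value between units of measurement."""',
    "code": (
        "    factors = {('kilometers', 'miles'): 0.621371}\n"
        "    return value * factors[(from_unit, to_unit)]"
    ),
}

convert_units = register_from_json(spec)
print(convert_units(5, "kilometers", "miles"))  # ≈ 3.106855
```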

Run the unit conversion tool

This function executes the tool call, given an LLM response object.

# Execute our tool
import json

def execute_tool_call(resp):
    tool_call = resp.choices[0].message.tool_calls[0]

    func_name = tool_call.function.name
    args = tool_call.function.arguments

    # Look up the registered function by name
    func = globals().get(func_name)
    if not func:
        raise ValueError(f"Function {func_name} not found")

    # The model returns arguments as a JSON string; decode to a dict
    if isinstance(args, str):
        args = json.loads(args)

    return func(**args)

execute_tool_call(response)
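End to end, the pieces above can be exercised without an API key by faking the response object. Here `SimpleNamespace` stands in for the client's response type, and `convert_units` is a hypothetical plain-Python implementation of the registered tool:

```python
import json
from types import SimpleNamespace

def convert_units(value, from_unit, to_unit):
    # Hypothetical implementation of the registered tool
    factors = {("kilometers", "miles"): 0.621371}
    return value * factors[(from_unit, to_unit)]

def execute_tool_call(resp):
    # Same logic as above: find the named function and call it with the
    # decoded arguments
    tool_call = resp.choices[0].message.tool_calls[0]
    func = globals().get(tool_call.function.name)
    if not func:
        raise ValueError(f"Function {tool_call.function.name} not found")
    args = tool_call.function.arguments
    if isinstance(args, str):
        args = json.loads(args)
    return func(**args)

# Fake response mimicking the shape of a chat completion with a tool call
mock_response = SimpleNamespace(choices=[SimpleNamespace(message=SimpleNamespace(
    tool_calls=[SimpleNamespace(function=SimpleNamespace(
        name="convert_units",
        arguments='{"value": 5, "from_unit": "kilometers", "to_unit": "miles"}',
    ))]
))])

print(execute_tool_call(mock_response))  # ≈ 3.106855
```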

Sample output

This is the expected output for the earlier request to convert 5 kilometers to miles.

3.106855