Creating a Custom Model Class#

This guide shows how to create a custom Model class for byLLM that bypasses the default LiteLLM integration. This is useful when you want to use a self-hosted language model, a custom API, or any service not supported by LiteLLM. The example demonstrates this by implementing a custom class using the OpenAI SDK.

IMPORTANT

This guide assumes you understand how to run inference with your language model. If you are unsure, please refer to your language model's documentation.

Steps#

  • Create a new class that inherits from the BaseLLM class.
from byllm.llm import BaseLLM
from openai import OpenAI

class MyOpenAIModel(BaseLLM):
    def __init__(self, model_name: str, **kwargs: object) -> None:
        """Initialize the MockLLM connector."""
        super().__init__(model_name, **kwargs)

    def model_call_no_stream(self, params):
        client = OpenAI(api_key=self.api_key)
        response = client.chat.completions.create(**params)
        return response

    def model_call_with_stream(self, params):
        client = OpenAI(api_key=self.api_key)
        response = client.chat.completions.create(stream=True, **params)
        return response
import from byllm.llm { BaseLLM }
import from openai { OpenAI }

obj MyOpenAIModel(BaseLLM) {
    has model_name: str;
    has config: dict = {};

    def post_init() {
        # Forward the model name and any extra configuration to BaseLLM.
        super().__init__(model_name=self.model_name, **self.config);
    }

    def model_call_no_stream(params: dict) {
        client = OpenAI(api_key=self.api_key);
        response = client.chat.completions.create(**params);
        return response;
    }

    def model_call_with_stream(params: dict) {
        client = OpenAI(api_key=self.api_key);
        response = client.chat.completions.create(stream=True, **params);
        return response;
    }
}
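The two methods above implement a simple contract: `model_call_no_stream` returns one complete response, while `model_call_with_stream` returns an iterable of chunks. The following self-contained sketch illustrates that contract by replacing the real OpenAI client with an in-memory stub; `StubClient`, its canned reply, and `MyStubModel` are illustrative assumptions, not part of byLLM or the OpenAI SDK.

```python
class StubCompletions:
    """Mimics client.chat.completions with canned replies."""

    def create(self, stream=False, **params):
        if stream:
            # Streaming mode: return an iterator of response chunks.
            return iter([{"delta": "Hello"}, {"delta": " world"}])
        # Non-streaming mode: return one complete response object.
        return {"choices": [{"message": {"content": "Hello world"}}]}


class StubChat:
    def __init__(self):
        self.completions = StubCompletions()


class StubClient:
    """Stand-in for openai.OpenAI(api_key=...) -- no network calls."""

    def __init__(self, api_key=None):
        self.chat = StubChat()


class MyStubModel:
    """Mirrors the custom model class above, minus byLLM's BaseLLM."""

    def __init__(self, model_name, api_key=None):
        self.model_name = model_name
        self.api_key = api_key

    def model_call_no_stream(self, params):
        client = StubClient(api_key=self.api_key)
        return client.chat.completions.create(**params)

    def model_call_with_stream(self, params):
        client = StubClient(api_key=self.api_key)
        return client.chat.completions.create(stream=True, **params)


llm = MyStubModel(model_name="gpt-4o")
params = {
    "model": llm.model_name,
    "messages": [{"role": "user", "content": "Say hello"}],
}

full = llm.model_call_no_stream(params)
print(full["choices"][0]["message"]["content"])  # one complete response

chunks = [c["delta"] for c in llm.model_call_with_stream(params)]
print("".join(chunks))  # same text, assembled from streamed chunks
```

Swapping `StubClient` back for `OpenAI` recovers the real implementation; the surrounding method signatures stay the same.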
  • Initialize your model with the required parameters.
# Initialize as global variable
glob llm = MyOpenAIModel(model_name="gpt-4o");
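Any extra keyword arguments given at construction time are forwarded to `BaseLLM` via `**kwargs`. The sketch below shows that forwarding pattern with `FakeBase`, a hypothetical stand-in for `byllm.llm.BaseLLM` (its `config` attribute is an assumption for illustration only).

```python
class FakeBase:
    """Hypothetical stand-in for byllm.llm.BaseLLM: records the model
    name and keeps any extra keyword arguments as configuration."""

    def __init__(self, model_name, **kwargs):
        self.model_name = model_name
        self.config = kwargs


class MyOpenAIModel(FakeBase):
    def __init__(self, model_name, **kwargs):
        # Pass everything through to the base class, as in the guide.
        super().__init__(model_name, **kwargs)


llm = MyOpenAIModel(model_name="gpt-4o", temperature=0.2)
print(llm.model_name)             # gpt-4o
print(llm.config["temperature"])  # 0.2
```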

That's it! You have successfully created a custom model class for use with byLLM.

NOTICE

This feature is under development. If you encounter an incompatibility, please open an issue in the project's issue tracker.