Creating a Custom Model Class#
This guide shows how to create a custom Model class for byLLM that bypasses the default LiteLLM integration. This is useful when you want to use a self-hosted language model, a custom API, or any service not supported by LiteLLM. The example demonstrates this by implementing a custom class using the OpenAI SDK.
IMPORTANT
This guide assumes you have a solid understanding of how to run inference with your language model. If you are unsure, please refer to your language model's documentation.
Steps#
- Create a new class that inherits from the `BaseModel` class.
- Initialize your model with the required parameters.
That's it! You have successfully created your own language model class for use with byLLM.
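Putting the steps together, here is a minimal sketch of such a class backed by the OpenAI SDK. Note the hedges: the exact `BaseModel` import path and the name of the method you must override depend on your byLLM version, so this example stubs a stand-in `BaseModel` and uses a placeholder `invoke` method purely to show the shape; check the byLLM source for the real interface before wiring it in.

```python
# Stand-in for byLLM's BaseModel so this sketch runs standalone.
# In real code, import the actual class from byLLM instead
# (the import path may differ by version).
class BaseModel:
    def __init__(self, **kwargs):
        self.config = kwargs


class MyOpenAIModel(BaseModel):
    """Custom model that talks to an OpenAI-compatible endpoint."""

    def __init__(self, model_name="gpt-4o-mini", base_url=None, api_key=None, **kwargs):
        super().__init__(model_name=model_name, **kwargs)
        self.model_name = model_name
        self.base_url = base_url
        self.api_key = api_key
        self._client = None  # created lazily, so importing this module stays cheap

    @property
    def client(self):
        if self._client is None:
            # Requires the `openai` package: pip install openai
            from openai import OpenAI
            self._client = OpenAI(base_url=self.base_url, api_key=self.api_key)
        return self._client

    def invoke(self, prompt: str) -> str:
        # `invoke` is a placeholder name -- override whichever method
        # byLLM's BaseModel actually dispatches inference through.
        response = self.client.chat.completions.create(
            model=self.model_name,
            messages=[{"role": "user", "content": prompt}],
        )
        return response.choices[0].message.content


# Point the model at a self-hosted OpenAI-compatible server (hypothetical URL).
llm = MyOpenAIModel(base_url="http://localhost:8000/v1", api_key="not-needed")
print(llm.model_name)  # -> gpt-4o-mini
```

Because the OpenAI client is built lazily, constructing the model is free of network calls; the SDK is only touched on the first inference request, which also keeps the class importable in environments without credentials.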
NOTICE
This feature is under development; if you encounter an incompatibility, please open an issue here.