Connect proprietary models
Connect models from external providers and invoke them through Adaptive Engine
Adaptive Engine enables you to connect proprietary models, so you can use them through the Adaptive API in tandem with other platform features, such as A/B tests and interaction and feedback logging. The only requirement is that you supply a valid external API key.
This feature currently supports OpenAI models (via both the OpenAI API and Azure OpenAI Service) and Google models.
To connect an external model, click on Connect external model on the top right of the Models page in the Adaptive Engine UI.
The Connect external model wizard allows you to configure an external proprietary model connection in Adaptive Engine
You can also connect an external model using the Adaptive SDK.
Supported model IDs:
- GPT4O
- GPT4O_MINI
- GPT4
- GPT4_TURBO
- GPT3_5_TURBO
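Before calling the SDK, it can be useful to sanity-check the connection fields this page describes (`external_model_id`, `endpoint`, and the external API key). The sketch below is illustrative only: the helper and its field names are assumptions mirroring this page, not the actual Adaptive SDK API, so consult the SDK reference for the real method names.

```python
# Hypothetical pre-flight check for an external-model connection.
# Field names (provider, external_model_id, endpoint, api_key) mirror
# this page; they are NOT the real Adaptive SDK signature.

SUPPORTED_OPENAI_IDS = {"GPT4O", "GPT4O_MINI", "GPT4", "GPT4_TURBO", "GPT3_5_TURBO"}

def validate_connection(config: dict) -> list[str]:
    """Return a list of problems with an external-model config (empty if OK)."""
    problems = []
    # A valid external API key is the one hard requirement.
    if not config.get("api_key"):
        problems.append("a valid external API key is required")
    provider = config.get("provider")
    model_id = config.get("external_model_id", "")
    # For direct OpenAI connections, the model ID must be a supported one.
    if provider == "openai" and model_id not in SUPPORTED_OPENAI_IDS:
        problems.append(f"unsupported OpenAI model ID: {model_id!r}")
    # Azure OpenAI connections additionally need a subscription endpoint.
    if provider == "azure" and not config.get("endpoint"):
        problems.append("Azure OpenAI connections need a subscription endpoint")
    return problems

# A well-formed OpenAI connection passes with no problems.
assert validate_connection({
    "provider": "openai",
    "external_model_id": "GPT4O",
    "api_key": "sk-...",
}) == []
```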
When connecting a model through Azure OpenAI Service:
- the `external_model_id` should be the deployment name/ID of your model
- the `endpoint` should be your Azure OpenAI subscription endpoint, which should look something like `https://aoairesource.openai.azure.com`

See here for more information.
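A malformed endpoint is a common source of Azure connection errors, so a quick shape check like the one below can help. The pattern is an assumption based only on the example endpoint above, not an official validation rule.

```python
import re

# Hypothetical sanity check for an Azure OpenAI subscription endpoint,
# based on the https://<resource>.openai.azure.com shape shown above.
AZURE_ENDPOINT_RE = re.compile(r"^https://[a-z0-9-]+\.openai\.azure\.com/?$")

def looks_like_azure_endpoint(url: str) -> bool:
    """True if the URL matches the expected Azure OpenAI endpoint shape."""
    return bool(AZURE_ENDPOINT_RE.match(url))

# The example endpoint from this page passes; a plain OpenAI URL does not.
assert looks_like_azure_endpoint("https://aoairesource.openai.azure.com")
assert not looks_like_azure_endpoint("https://api.openai.com/v1")
```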
Supported model IDs are the ones listed in this table, with the exception of embedding models.
Once the proprietary model is connected to Adaptive Engine, you can attach it to a use-case and make inference requests the same way you would for any other Adaptive Engine model.