If you are using a supported LangChain Chat model class to invoke an LLM API in your application, you can continue using it when switching to Adaptive Engine. You can also connect proprietary models and invoke them through the Adaptive API.

Currently, the Adaptive API is compatible with the following LangChain Chat model classes:

Below are usage examples for the supported classes:

import os

from langchain_google_genai import ChatGoogleGenerativeAI

# Authenticate against Adaptive by passing your Adaptive API key
# where the Google client expects its key.
os.environ["GOOGLE_API_KEY"] = "ADAPTIVE_API_KEY"

llm = ChatGoogleGenerativeAI(
    model="use_case_key",  # your Adaptive use case key, not a Google model name
    client_options={"api_endpoint": "ADAPTIVE_URL/api/v1/google"},  # Adaptive's Google-compatible endpoint
    transport="rest",  # route requests over REST to the custom endpoint
    # api_key="ADAPTIVE_API_KEY",  # alternative to the GOOGLE_API_KEY env var
)
messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]
response = llm.invoke(messages)
print(response.content)  # the model's reply text
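The `api_endpoint` passed in `client_options` is simply your Adaptive base URL with the provider-specific path appended; `/api/v1/google` is the Google-compatible route used above. A minimal sketch of assembling it, assuming a hypothetical value for ADAPTIVE_URL:

```python
# Hypothetical placeholder for your deployment's ADAPTIVE_URL.
ADAPTIVE_URL = "https://adaptive.example.com"

# The Google-compatible route sits under /api/v1/google on the Adaptive host.
api_endpoint = f"{ADAPTIVE_URL}/api/v1/google"

# This is the dict handed to ChatGoogleGenerativeAI via client_options.
client_options = {"api_endpoint": api_endpoint}
print(client_options["api_endpoint"])
```

Other supported providers follow the same pattern, with a different trailing path segment selecting the provider-compatible API surface.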