If you are using a supported LangChain Chat model class to invoke an LLM API in your application, you can keep using it when you switch to Adaptive Engine. You can also connect proprietary models so that they can be invoked through the Adaptive API.
Currently, the Adaptive API is compatible with the following LangChain Chat model classes:
- ChatGoogleGenerativeAI
- ChatOpenAI

Below are usage examples for the supported classes.

ChatGoogleGenerativeAI
```python
import os

from langchain_google_genai import ChatGoogleGenerativeAI

# Route Google GenAI calls through the Adaptive Engine endpoint
os.environ["GOOGLE_API_KEY"] = "ADAPTIVE_API_KEY"

llm = ChatGoogleGenerativeAI(
    model="use_case_key",
    client_options={"api_endpoint": "ADAPTIVE_URL/api/v1/google"},
    transport="rest",
    # api_key="ADAPTIVE_API_KEY",  # alternative to the GOOGLE_API_KEY env var
)

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]
response = llm.invoke(messages)
print(response)
```
ChatOpenAI

```python
import os

from langchain_openai import ChatOpenAI

# Point the OpenAI client at the Adaptive Engine endpoint
os.environ["OPENAI_API_KEY"] = "ADAPTIVE_API_KEY"

llm = ChatOpenAI(
    model="use_case_key/optional[model_key]",
    base_url="ADAPTIVE_URL/api/v1",
    # api_key="ADAPTIVE_API_KEY",  # alternative to the OPENAI_API_KEY env var
)

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]
response = llm.invoke(messages)
print(response)
```
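The two examples above differ only in where they point the client: OpenAI-compatible traffic goes to `ADAPTIVE_URL/api/v1`, while Google-compatible traffic goes to `ADAPTIVE_URL/api/v1/google`. A minimal sketch of that layout follows; the helper name and mapping are illustrative only and not part of Adaptive's SDK or the LangChain API:

```python
# Illustrative helper (hypothetical, not an Adaptive SDK function):
# composes the provider-specific base URL used in the examples above.
_PROVIDER_PATHS = {
    "openai": "/api/v1",
    "google": "/api/v1/google",
}


def adaptive_base_url(adaptive_url: str, provider: str) -> str:
    """Return the Adaptive endpoint for a supported provider."""
    try:
        path = _PROVIDER_PATHS[provider]
    except KeyError:
        raise ValueError(f"unsupported provider: {provider!r}") from None
    # Normalize a trailing slash so the path is appended exactly once
    return adaptive_url.rstrip("/") + path


print(adaptive_base_url("https://adaptive.example.com", "openai"))
# https://adaptive.example.com/api/v1
```

Whichever class you use, the `ADAPTIVE_API_KEY` placeholder is supplied either through the provider's usual environment variable (`OPENAI_API_KEY` or `GOOGLE_API_KEY`) or through the class's `api_key` parameter, as shown in the commented-out lines above.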