Make API calls to Adaptive using LangChain Chat models
If you are using a supported LangChain Chat model class to invoke an LLM API in your application, you can keep using it when switching to Adaptive Engine. You can also connect proprietary models and invoke them through the Adaptive API.
Currently, the Adaptive API is compatible with the following LangChain Chat model classes:
- ChatOpenAI (langchain_openai)
- ChatGoogleGenerativeAI (langchain_google_genai)
Below are usage examples for the supported classes:
ChatGoogleGenerativeAI

```python
from langchain_google_genai import ChatGoogleGenerativeAI
import os

os.environ["GOOGLE_API_KEY"] = "ADAPTIVE_API_KEY"

llm = ChatGoogleGenerativeAI(
    model="use_case_key",
    client_options={"api_endpoint": "ADAPTIVE_URL/api/v1/google"},
    transport="rest",
    # api_key="ADAPTIVE_API_KEY"  # alternative to GOOGLE_API_KEY env var
)

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]

response = llm.invoke(messages)
print(response)
```
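Because the client behaves like any other LangChain chat model, the standard streaming interface can be used as well. A minimal sketch, assuming the `llm` and `messages` objects defined above and that your Adaptive deployment supports streamed responses:

```python
# Stream the reply token by token instead of waiting for the full message.
# Assumes `llm` and `messages` from the example above, and that the
# Adaptive endpoint supports streamed responses.
for chunk in llm.stream(messages):
    print(chunk.content, end="", flush=True)
print()
```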
ChatOpenAI

```python
from langchain_openai import ChatOpenAI
import os

os.environ["OPENAI_API_KEY"] = "ADAPTIVE_API_KEY"

llm = ChatOpenAI(
    model="use_case_key/optional[model_key]",
    base_url="ADAPTIVE_URL/api/v1",
    # api_key="ADAPTIVE_API_KEY"  # alternative to OPENAI_API_KEY env var
)

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]

response = llm.invoke(messages)
print(response)
```
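Since the Adaptive-backed model is a regular LangChain chat model, it can also be composed into chains. A minimal sketch using a LangChain prompt template, assuming the `llm` object configured above (the prompt wording and input variables here are illustrative):

```python
from langchain_core.prompts import ChatPromptTemplate

# Build a reusable translation chain on top of the Adaptive-backed model.
# Assumes `llm` is the ChatOpenAI instance configured above.
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant that translates English to {language}."),
        ("human", "{text}"),
    ]
)

chain = prompt | llm
result = chain.invoke({"language": "French", "text": "I love programming."})
print(result.content)
```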