Client
Adaptive (Sync)
parameters
- base_url: The base URL for the Adaptive API.
- api_key: API key for authentication. Defaults to None, in which case the ADAPTIVE_API_KEY environment variable needs to be set.
- timeout_secs: Timeout in seconds for HTTP requests. Defaults to 90.0 seconds. Set to None for no timeout.
AsyncAdaptive (Async)
parameters
- base_url: The base URL for the Adaptive API.
- api_key: API key for authentication. Defaults to None, in which case the ADAPTIVE_API_KEY environment variable needs to be set.
- timeout_secs: Timeout in seconds for HTTP requests. Defaults to 90.0 seconds. Set to None for no timeout.
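The api_key fallback described above can be sketched in plain Python. The helper name is illustrative; only the ADAPTIVE_API_KEY variable comes from these docs:

```python
import os

def resolve_api_key(api_key=None):
    """Mirror the documented fallback: an explicit api_key wins;
    otherwise the ADAPTIVE_API_KEY environment variable must be set."""
    if api_key is not None:
        return api_key
    key = os.environ.get("ADAPTIVE_API_KEY")
    if key is None:
        raise ValueError("api_key not given and ADAPTIVE_API_KEY is not set")
    return key
```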
Resources
A/B Tests
Resource to interact with AB tests. Access via adaptive.ab_tests.
cancel
parameters
- key: The AB test key.
create
parameters
- ab_test_key: A unique key to identify the AB test.
- feedback_key: The feedback key against which the AB test will run.
- models: The models to include in the AB test; they must be attached to the use case.
- traffic_split: Percentage of production traffic to route to the AB test. traffic_split*100% of inference requests for the use case will be sent randomly to one of the models included in the AB test.
- feedback_type: The type of feedback to run the AB test on: metric or preference.
- auto_deploy: If set to True, the winning model is automatically promoted to the use case default model when the AB test completes.
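The traffic_split semantics above can be illustrated with a small, self-contained routing sketch (the function and variable names are hypothetical, not part of the SDK):

```python
import random

def route_request(ab_models, default_model, traffic_split, rng=random):
    """Illustrates the documented rule: traffic_split * 100% of inference
    requests for the use case go randomly to one of the AB test models;
    the remainder go to the use case's default model."""
    if rng.random() < traffic_split:
        return rng.choice(ab_models)
    return default_model
```

With traffic_split=0.2, roughly 20% of requests land on one of the AB-test models.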
get
parameters
- key: The AB test key.
list
parameters
- active: Filter on active or inactive AB tests.
- status: Filter on one of the possible AB test statuses.
- use_case: Use case key. Falls back to client’s default if not provided.
Artifacts
Resource to interact with job artifacts. Access via adaptive.artifacts.
download
parameters
- artifact_id: The UUID of the artifact to download.
- destination_path: Local file path where the artifact will be saved.
Chat
Access via adaptive.chat.
create
parameters
- messages: Input messages, each a dict with keys role and content.
- stream: If True, partial message deltas will be returned. Once the stream is over, chunk.choices will be None.
- model: Target model key for inference. If None, requests will be routed to the use case's default model.
- stop: Sequences where the API will stop generating further tokens.
- max_tokens: Maximum number of tokens to generate.
- temperature: Sampling temperature.
- top_p: Threshold for top-p sampling.
- stream_include_usage: If set, an additional chunk will be streamed with the token usage statistics for the entire request.
- session_id: Session ID to group related interactions.
- use_case: Use case key. Falls back to client’s default if not provided.
- user: ID of the user making the request. If not None, it will be logged as metadata for the request.
- ab_campaign: AB test key. If set, the request is guaranteed to count towards AB test results, regardless of the configured traffic_split.
- n: Number of chat completions to generate for the input messages.
- labels: Key-value pairs of interaction labels.
- store: Whether to store the interaction for future reference. Stored by default.
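The expected shape of messages, a list of dicts with keys role and content, can be checked with a small helper; this validator is illustrative and not part of the SDK:

```python
def validate_messages(messages):
    """Check the documented input shape for chat.create: a list of
    dicts, each with keys role and content."""
    for i, message in enumerate(messages):
        missing = {"role", "content"} - set(message)
        if missing:
            raise ValueError(f"message {i} is missing keys: {sorted(missing)}")
    return messages
```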
Compute Pools
Resource to interact with compute pools. Access via adaptive.compute_pools.
list
resize_inference_partition
Recipes
Resource to interact with custom scripts. Access via adaptive.recipes.
delete
parameters
- recipe_key: The key or ID of the recipe to delete.
- use_case: Optional use case key. Falls back to client’s default.
generate_sample_input
parameters
- recipe_key: The key or ID of the recipe.
- use_case: Optional use case key. Falls back to client’s default.
get
parameters
- recipe_key: The key or ID of the recipe.
- use_case: Optional use case key. Falls back to client’s default.
list
parameters
- use_case: Optional use case key. Falls back to client’s default.
update
parameters
- recipe_key: The key of the recipe to update.
- path: Optional new path to a Python file or directory to replace recipe code. If None, only metadata (name, description, labels) is updated.
- entrypoint: Optional path to the recipe entrypoint file, relative to the path directory. Only applicable when path is a directory. Raises ValueError if path is a single file (entrypoint is not supported for single files) or if path is a directory that already contains main.py. Raises FileNotFoundError if the specified entrypoint file doesn't exist in the directory. If path is a directory and entrypoint is None, the directory must contain a main.py file, or FileNotFoundError is raised.
- name: Optional new display name.
- description: Optional new description.
- labels: Optional new key-value labels as tuples of (key, value).
- use_case: Optional use case key. Falls back to client’s default.
upload
parameters
- path: Path to a Python file or directory containing the recipe.
- recipe_key: Optional unique key for the recipe. If not provided, inferred from: the file name (without .py) if path is a file; "dir_name/entrypoint_name" if path is a directory and a custom entrypoint is specified; the directory name if path is a directory and no custom entrypoint is specified.
- entrypoint: Optional path to the recipe entrypoint file, relative to the path directory. Only applicable when path is a directory. Raises ValueError if path is a single file (entrypoint is not supported for single files) or if path is a directory that already contains main.py. Raises FileNotFoundError if the specified entrypoint file doesn't exist in the directory. If path is a directory and entrypoint is None, the directory must contain a main.py file, or FileNotFoundError is raised.
- name: Optional display name for the recipe.
- description: Optional description.
- labels: Optional key-value labels.
- use_case: Optional use case identifier.
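The recipe_key inference rules above can be sketched as a pure function. This is illustrative only; the SDK may implement the rules differently, and dropping the entrypoint's .py suffix in the key is an assumption here:

```python
from pathlib import PurePosixPath

def infer_recipe_key(path, entrypoint=None, is_dir=False):
    """Sketch of the documented inference for a missing recipe_key:
    - file name without .py when path is a file
    - "dir_name/entrypoint_name" when path is a directory and a
      custom entrypoint is specified
    - directory name when path is a directory with no custom entrypoint
    """
    p = PurePosixPath(path)
    if not is_dir:
        return p.stem  # file name without the .py suffix
    if entrypoint is not None:
        return f"{p.name}/{PurePosixPath(entrypoint).stem}"
    return p.name
```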
Datasets
Resource to interact with file datasets. Access via adaptive.datasets.
delete
get
parameters
- key: Dataset key.
list
upload
parameters
- file_path: Path to jsonl file.
- dataset_key: New dataset key.
- name: Optional name to render in the UI; if None, defaults to the same as dataset_key.
Embeddings
Resource to interact with embeddings. Access via adaptive.embeddings.
create
parameters
- input: Input text to embed.
- model: Target model key for inference. If None, requests will be routed to the use case's default model. The request will error if the default model is not an embedding model.
- encoding_format: Encoding format of the response.
- user: ID of the user making the request. If not None, it will be logged as metadata for the request.
Graders
Resource to interact with grader definitions used to evaluate model completions. Access via adaptive.graders.
delete
get
list
lock
parameters
- grader_key: ID or key of the grader.
- locked: Whether to lock (True) or unlock (False) the grader.
- use_case: Explicit use-case key. Falls back to client.default_use_case.
test_external_endpoint
Integrations
Resource to manage integrations and notification subscriptions. Access via adaptive.integrations.
create
parameters
- team: Team ID or key.
- input: Integration creation input.
delete
parameters
- id: Integration UUID.
get
parameters
- id: Integration UUID.
get_provider
parameters
- name: Provider name.
list
parameters
- team: Team ID or key.
list_providers
test_notification
parameters
- input: Test notification input with topic, scope, and payload.
update
parameters
- id: Integration UUID.
- input: Integration update input.
Jobs
Resource to interact with jobs. Access via adaptive.jobs.
cancel
parameters
- job_id: The ID of the job to cancel.
get
parameters
- job_id: The ID of the job to retrieve.
list
parameters
- first: Number of jobs to return from the beginning.
- last: Number of jobs to return from the end.
- after: Cursor for forward pagination.
- before: Cursor for backward pagination.
- kind: Filter by job types.
- use_case: Filter by use case key.
run
parameters
- recipe_key: The key of the recipe to run.
- num_gpus: Number of GPUs to allocate for the job.
- args: Optional arguments to pass to the recipe; must match the recipe schema.
- name: Optional human-readable name for the job.
- use_case: Use case key for the job.
- compute_pool: Optional compute pool key to run the job on.
Feedback
Resource to interact with and log feedback. Access via adaptive.feedback.
get_key
parameters
- feedback_key: The feedback key.
link
parameters
- feedback_key: The feedback key to be linked.
list_keys
log_metric
Log metric feedback. The value must be compatible with the kind of the feedback_key it is logged against.
parameters
- value: The feedback value.
- completion_id: The completion_id to attach the feedback to.
- feedback_key: The feedback key to log against.
- user: ID of the user submitting feedback. If not None, it will be logged as metadata for the request.
- details: Textual details for the feedback. Can be used to provide further context on the feedback value.
log_preference
parameters
- feedback_key: The feedback key to log against.
- preferred_completion: Can be a completion_id or a dict with keys model and text, corresponding to a valid model key and its attributed completion.
- other_completion: Can be a completion_id or a dict with keys model and text, corresponding to a valid model key and its attributed completion.
- user: ID of the user submitting feedback.
- messages: Input chat messages, each a dict with keys role and content. Ignored if preferred_completion and other_completion are completion_ids.
- tied: Indicates whether both completions tied as equally bad or equally good.
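The two accepted forms of preferred_completion / other_completion can be normalized with a small helper; the function and its return tagging are illustrative, not the SDK's API:

```python
def normalize_completion(completion):
    """Accept the two documented forms: a completion_id string, or a
    dict with keys model and text. The ("id", ...) / ("inline", ...)
    tagging is illustrative only."""
    if isinstance(completion, str):
        return ("id", completion)
    if isinstance(completion, dict):
        missing = {"model", "text"} - set(completion)
        if missing:
            raise ValueError(f"completion dict is missing keys: {sorted(missing)}")
        return ("inline", completion["model"], completion["text"])
    raise TypeError("completion must be a completion_id or a model/text dict")
```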
register_key
parameters
- key: Feedback key.
- kind: Feedback kind. If "bool", you can log values 0, 1, True, or False only. If "scalar", you can log any integer or float value.
- scoring_type: Indicates what good means for this feedback key: a higher numeric value (or True), or a lower numeric value (or False).
- name: Human-readable feedback name that will render in the UI. If None, will be the same as key.
- description: Description of the intended purpose or nuances of the feedback. Will render in the UI.
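The documented value rules for the two feedback kinds can be sketched as a validator (illustrative, not part of the SDK):

```python
def validate_feedback_value(value, kind):
    """Enforce the documented kinds: "bool" accepts only 0, 1, True, or
    False; "scalar" accepts any integer or float."""
    if kind == "bool":
        if value not in (0, 1, True, False):
            raise ValueError(f'kind "bool" accepts only 0, 1, True or False, got {value!r}')
    elif kind == "scalar":
        if not isinstance(value, (int, float)):
            raise TypeError(f'kind "scalar" requires an int or float, got {value!r}')
    else:
        raise ValueError(f"unknown feedback kind: {kind!r}")
    return value
```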
unlink
parameters
- feedback_key: The feedback key to be unlinked.
Interactions
Resource to interact with interactions. Access via adaptive.interactions.
create
parameters
- messages: Input chat messages, each dict should have keys role and content.
- completion: Model completion.
- model: Model key.
- feedbacks: List of feedbacks, each a dict with keys feedback_key and value, plus an optional details.
- user: ID of the user making the request. If not None, it will be logged as metadata for the interaction.
- session_id: Session ID to group related interactions.
- use_case: Use case key. Falls back to client's default if not provided.
- ab_campaign: AB test key. If set, the provided feedbacks will count towards AB test results.
- labels: Key-value pairs of interaction labels.
- created_at: Timestamp of interaction creation or ingestion.
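The expected shape of the feedbacks argument can be checked with a small helper (illustrative, not part of the SDK):

```python
def validate_feedbacks(feedbacks):
    """Check the documented shape for interactions.create feedbacks:
    each entry is a dict with keys feedback_key and value, plus an
    optional details key."""
    allowed = {"feedback_key", "value", "details"}
    for i, feedback in enumerate(feedbacks):
        missing = {"feedback_key", "value"} - set(feedback)
        if missing:
            raise ValueError(f"feedback {i} is missing keys: {sorted(missing)}")
        unexpected = set(feedback) - allowed
        if unexpected:
            raise ValueError(f"feedback {i} has unexpected keys: {sorted(unexpected)}")
    return feedbacks
```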
get
parameters
- completion_id: The ID of the completion.
list
parameters
- order: Ordering of results.
- filters: List filters.
- page: Paging config.
- group_by: Retrieve interactions grouped by selected dimension.
Models
Resource to interact with models. Access via adaptive.models.
add_external
parameters
- name: Adaptive name for the new model.
- external_model_id: Should match the model id publicly shared by the model provider.
- api_key: API Key for authentication against external model provider.
- provider: External proprietary model provider.
add_hf_model
parameters
- hf_model_id: The ID of the selected model repo on HuggingFace Model Hub.
- output_model_key: The key that will identify the new model in Adaptive.
- hf_token: Your HuggingFace token, needed to validate access to gated/restricted models.
add_to_use_case
parameters
- model: Model key.
- use_case: Use case key. Falls back to client’s default if not provided.
attach
parameters
- model: Model key.
- wait: If the model is not already deployed, attaching it to the use case will automatically deploy it. If True, this call blocks until the model is Online.
- make_default: Make the model the use case's default on attachment.
deploy
parameters
- model: Model key.
- wait: If True, block until the model is online.
- make_default: Make the model the use case's default after deployment.
- use_case: Use case key.
- placement: Optional placement configuration for the model.
detach
parameters
- model: Model key.
get
parameters
- model: Model key.
list
terminate
parameters
- model: Model key.
- force: If the model is attached to several use cases, force must equal True in order for the model to be terminated.
update
parameters
- model: Model key.
- is_default: Change the selection of the model as default for the use case. True to promote to default, False to demote from default. If None, no changes are applied.
- attached: Whether the model should be attached to or detached from the use case. If None, no changes are applied.
- desired_online: Turn model inference on or off for the client use case. This does not influence the global status of the model; it is use case-bounded. If None, no changes are applied.
update_compute_config
Permissions
Resource to list permissions. Access via adaptive.permissions.
list
Roles
Resource to manage roles. Access via adaptive.roles.
create
parameters
- key: Role key.
- permissions: List of permission identifiers such as use_case:read. You can list all possible permissions with client.permissions.list().
- name: Role name; if not provided, defaults to key.
list
Teams
Resource to manage teams. Access via adaptive.teams.
create
parameters
- key: Unique key for the team.
- name: Human-readable team name. If not provided, defaults to key.
list
Use Cases
Resource to interact with use cases. Access via adaptive.use_cases.
create
parameters
- key: Use case key.
- name: Human-readable use case name which will be rendered in the UI. If not set, will be the same as key.
- description: Description of the use case, which will be rendered in the UI.
- team: Team key to associate the use case with.
get
list
share
parameters
- use_case: Use case key.
- team: Team key.
- role: Role key.
unshare
parameters
- use_case: Use case key.
- team: Team key.
Users
Resource to manage users and permissions. Access via adaptive.users.
add_to_team
parameters
- email: User email.
- team: Key of the team to which the user will be added.
- role: Assigned role.
create
parameters
- email: User’s email address.
- name: User’s display name.
- teams_with_role: Sequence of (team_key, role_key) tuples assigning the user to teams with specific roles.
delete
parameters
- email: The email address of the user to delete.
list
me
remove_from_team
parameters
- email: User email.
- team: Key of team to remove user from.

