Client
Adaptive (Sync)
parameters
- base_url: The base URL for the Adaptive API.
- api_key: API key for authentication. Defaults to None, in which case the environment variable ADAPTIVE_API_KEY needs to be set.
- timeout_secs: Timeout in seconds for HTTP requests. Defaults to 90.0 seconds. Set to None for no timeout.
AsyncAdaptive (Async)
parameters
- base_url: The base URL for the Adaptive API.
- api_key: API key for authentication. Defaults to None, in which case the environment variable ADAPTIVE_API_KEY needs to be set.
- timeout_secs: Timeout in seconds for HTTP requests. Defaults to 90.0 seconds. Set to None for no timeout.
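The api_key fallback described above can be sketched as the following resolution logic. This is a minimal illustration, not SDK code; resolve_api_key is a hypothetical helper mirroring the documented behavior.

```python
import os

def resolve_api_key(api_key=None):
    """Return an explicit API key, or fall back to the ADAPTIVE_API_KEY
    environment variable, as the client constructors document."""
    if api_key is not None:
        return api_key
    key = os.environ.get("ADAPTIVE_API_KEY")
    if key is None:
        raise ValueError("No API key provided and ADAPTIVE_API_KEY is not set")
    return key
```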
Resources
A/B tests
Resource to interact with AB tests. Access via adaptive.ab_tests
cancel
parameters
- key: The AB test key.
create
parameters
- ab_test_key: A unique key to identify the AB test.
- feedback_key: The feedback key against which the AB test will run.
- models: The models to include in the AB test; they must be attached to the project.
- traffic_split: Percentage of production traffic to route to the AB test. traffic_split*100% of inference requests for the project will be sent randomly to one of the models included in the AB test.
- feedback_type: The type of feedback to run the AB test on, metric or preference.
- auto_deploy: If set to True, the winning model is automatically promoted to the project's default model when the AB test completes.
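The traffic_split semantics can be illustrated with a small routing sketch. route_request is a hypothetical function, not part of the SDK; with traffic_split=0.5, roughly half of requests go to a random AB-test model.

```python
import random

def route_request(default_model, ab_test_models, traffic_split, rng=random):
    """With probability traffic_split, route the request to a random model
    in the AB test; otherwise use the project's default model."""
    if ab_test_models and rng.random() < traffic_split:
        return rng.choice(ab_test_models)
    return default_model
```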
get
parameters
- key: The AB test key.
list
parameters
- active: Filter on active or inactive AB tests.
- status: Filter on one of the possible AB test statuses.
- project: Project key. Falls back to client’s default if not provided.
optional_project_key
parameters
- project: Optional explicit project key.
project_key
parameters
- project: Optional explicit project key.
Artifacts
Resource to interact with job artifacts. Access via adaptive.artifacts
download
parameters
- artifact_id: The UUID of the artifact to download.
- destination_path: Local file path where the artifact will be saved.
Chat
Access via adaptive.chat
create
parameters
- messages: Input messages, each a dict with keys role and content.
- stream: If True, partial message deltas will be returned. When the stream is over, chunk.choices will be None.
- model: Target model key for inference. If None, requests will be routed to the project's default model.
- stop: Sequences at which the API will stop generating further tokens.
- max_tokens: Maximum number of tokens to generate.
- temperature: Sampling temperature.
- top_p: Threshold for top-p sampling.
- stream_include_usage: If set, an additional chunk will be streamed with the token usage statistics for the entire request.
- session_id: Session ID to group related interactions.
- project: Project key. Falls back to client’s default if not provided.
- user: ID of the user making the request. If not None, it will be logged as metadata for the request.
- ab_campaign: AB test key. If set, the request is guaranteed to count towards AB test results, no matter the configured traffic_split.
- n: Number of chat completions to generate for each input message.
- labels: Key-value pairs of interaction labels.
- store: Whether to store the interaction for future reference. Stores by default.
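The messages argument is a list of role/content dicts. A minimal shape check can be sketched as follows; validate_messages is a hypothetical helper (not SDK code), assuming the usual system/user/assistant roles.

```python
VALID_ROLES = {"system", "user", "assistant"}  # assumed role set

def validate_messages(messages):
    """Check that each message is a dict with 'role' and 'content' keys,
    the shape chat.create expects."""
    if not messages:
        raise ValueError("messages must be a non-empty list")
    for i, msg in enumerate(messages):
        if not isinstance(msg, dict) or not {"role", "content"} <= msg.keys():
            raise ValueError(f"message {i} must have 'role' and 'content' keys")
        if msg["role"] not in VALID_ROLES:
            raise ValueError(f"message {i} has unknown role {msg['role']!r}")
    return messages
```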
optional_project_key
parameters
- project: Optional explicit project key.
project_key
parameters
- project: Optional explicit project key.
Compute pools
Resource to interact with compute pools. Access via adaptive.compute_pools
list
optional_project_key
parameters
- project: Optional explicit project key.
project_key
parameters
- project: Optional explicit project key.
resize_inference_partition
Recipes
Resource to interact with custom scripts. Access via adaptive.recipes
delete
parameters
- recipe_key: The key or ID of the recipe to delete.
- project: Optional project key. Falls back to client’s default.
generate_sample_input
parameters
- recipe_key: The key or ID of the recipe.
- project: Optional project key. Falls back to client’s default.
get
parameters
- recipe_key: The key or ID of the recipe.
- project: Optional project key. Falls back to client’s default.
list
parameters
- project: Optional project key. Falls back to client’s default.
optional_project_key
parameters
- project: Optional explicit project key.
project_key
parameters
- project: Optional explicit project key.
update
parameters
- recipe_key: The key of the recipe to update.
- path: Optional new path to a Python file or directory to replace recipe code. If None, only metadata (name, description, labels) is updated.
- entrypoint: Optional path to the recipe entrypoint file, relative to the path directory. Only applicable when path is a directory. Raises ValueError if path is a single file (entrypoint is not supported for single files) or if path is a directory that already contains main.py. Raises FileNotFoundError if the specified entrypoint file doesn't exist in the directory. If path is a directory and entrypoint is None, the directory must contain a main.py file, or FileNotFoundError is raised.
- entrypoint_config: Optional path to a separate config file that specifies the InputConfig for the recipe entrypoint. Only applicable when path is a directory. Raises ValueError if path is a single file (entrypoint_config is not supported for single files) or if path is a directory that already contains config.py. Raises FileNotFoundError if the specified entrypoint_config file doesn't exist in the directory. If path is a directory and entrypoint_config is None: if entrypoint is specified, the InputConfig should be included in it; otherwise main.py should contain the InputConfig, or a config.py file must be present.
- name: Optional new display name.
- description: Optional new description.
- labels: Optional new key-value labels as tuples of (key, value).
- project: Optional project key. Falls back to client’s default.
upload
parameters
- path: Path to a Python file or directory containing the recipe.
- recipe_key: Optional unique key for the recipe. If not provided, it is inferred from: the file name (without .py) if path is a file; "dir_name/entrypoint_name" if path is a directory and a custom entrypoint is specified; the directory name if path is a directory and no custom entrypoint is specified.
- entrypoint: Optional path to the recipe entrypoint file, relative to the path directory. Only applicable when path is a directory. Raises ValueError if path is a single file (entrypoint is not supported for single files) or if path is a directory that already contains main.py. Raises FileNotFoundError if the specified entrypoint file doesn't exist in the directory. If path is a directory and entrypoint is None, the directory must contain a main.py file, or FileNotFoundError is raised.
- entrypoint_config: Optional path to a separate config file that specifies the InputConfig for the recipe entrypoint. Only applicable when path is a directory. Raises ValueError if path is a single file (entrypoint_config is not supported for single files) or if path is a directory that already contains config.py. Raises FileNotFoundError if the specified entrypoint_config file doesn't exist in the directory. If path is a directory and entrypoint_config is None: if entrypoint is specified, the InputConfig should be included in it; otherwise main.py should contain the InputConfig, or a config.py file must be present.
- name: Optional display name for the recipe.
- description: Optional description.
- labels: Optional key-value labels.
- project: Optional project identifier.
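The recipe_key inference fallbacks described above can be sketched as a small function. infer_recipe_key is a hypothetical illustration of the documented rules, not SDK code; it uses the .py suffix to distinguish files from directories rather than touching the filesystem.

```python
from pathlib import Path

def infer_recipe_key(path, entrypoint=None):
    """Infer a recipe key following the documented fallbacks: file name
    without .py for a single file; "dir_name/entrypoint_name" when a custom
    entrypoint is given; otherwise the directory name."""
    p = Path(path)
    if p.suffix == ".py":          # single-file recipe
        return p.stem
    if entrypoint is not None:     # directory with a custom entrypoint
        return f"{p.name}/{Path(entrypoint).stem}"
    return p.name                  # plain directory recipe
```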
upsert
parameters
- path: Path to a Python file or directory containing the recipe.
- recipe_key: Optional unique key for the recipe. If not provided, it is inferred from: the file name (without .py) if path is a file; "dir_name/entrypoint_name" if path is a directory and a custom entrypoint is specified; the directory name if path is a directory and no custom entrypoint is specified.
- entrypoint: Optional path to the recipe entrypoint file, relative to the path directory. Only applicable when path is a directory. Raises ValueError if path is a single file (entrypoint is not supported for single files) or if path is a directory that already contains main.py. Raises FileNotFoundError if the specified entrypoint file doesn't exist in the directory. If path is a directory and entrypoint is None, the directory must contain a main.py file, or FileNotFoundError is raised.
- entrypoint_config: Optional path to a separate config file that specifies the InputConfig for the recipe entrypoint. Only applicable when path is a directory. Raises ValueError if path is a single file (entrypoint_config is not supported for single files) or if path is a directory that already contains config.py. Raises FileNotFoundError if the specified entrypoint_config file doesn't exist in the directory. If path is a directory and entrypoint_config is None: if entrypoint is specified, the InputConfig should be included in it; otherwise main.py should contain the InputConfig, or a config.py file must be present.
- name: Optional display name for the recipe.
- description: Optional description.
- labels: Optional key-value labels.
- project: Optional project identifier; falls back to client's default if it is set.
Datasets
Resource to interact with file datasets. Access via adaptive.datasets
delete
get
parameters
- key: Dataset key.
list
optional_project_key
parameters
- project: Optional explicit project key.
project_key
parameters
- project: Optional explicit project key.
upload
parameters
- file_path: Path to jsonl file.
- dataset_key: New dataset key.
- name: Optional name to render in the UI; if None, defaults to the same as dataset_key.
Embeddings
Resource to interact with embeddings. Access via adaptive.embeddings
create
parameters
- input: Input text to embed.
- model: Target model key for inference. If None, requests will be routed to the project's default model. The request will error if the default model is not an embedding model.
- encoding_format: Encoding format of the response.
- user: ID of the user making the request. If not None, it will be logged as metadata for the request.
optional_project_key
parameters
- project: Optional explicit project key.
project_key
parameters
- project: Optional explicit project key.
Graders
Resource to interact with grader definitions used to evaluate model completions. Access via adaptive.graders
delete
get
list
lock
parameters
- grader_key: ID or key of the grader.
- locked: Whether to lock (True) or unlock (False) the grader.
- project: Explicit project key. Falls back to client.default_project.
optional_project_key
parameters
- project: Optional explicit project key.
project_key
parameters
- project: Optional explicit project key.
test_external_endpoint
Integrations
Resource to manage integrations and notification subscriptions. Access via adaptive.integrations
create
parameters
- team: Team ID or key.
- name: Human-readable name for the integration.
- provider: Provider name.
- connection: Connection config. Use one of: ConnectionConfigInputSlack(webhook_url=..., bot_token=...), ConnectionConfigInputSmtp(host=..., port=..., username=..., password=..., from_email=..., to_emails=[...]), ConnectionConfigInputWebhook(url=..., method=..., headers=...), ConnectionConfigInputGitHub(api_token=..., org=..., repo=...).
- subscriptions: Optional list of SubscriptionInput notification subscriptions.
- delivery_policy: Delivery policy, either "multishot" or "singleshot".
delete
parameters
- id: Integration UUID.
get
parameters
- id: Integration UUID.
get_provider
parameters
- name: Provider name.
list
parameters
- team: Team ID or key.
list_providers
test_notification
parameters
- topic: Notification topic string.
- payload: Notification payload, e.g. NotificationPayload(job_update=JobUpdatePayload(...)).
- scope_user: List of user UUIDs to scope the notification to.
- scope_team: Team ID or key to scope the notification to.
- scope_organization: If True, scope the notification to the organization.
- scope_admin: If True, scope the notification to admins.
update
parameters
- id: Integration UUID.
- name: New name for the integration.
- enabled: Enable or disable the integration.
- connection: Updated connection config. See create() for the available types.
- subscriptions: Updated list of SubscriptionInput notification subscriptions.
- delivery_policy: Updated delivery policy, either "multishot" or "singleshot".
Jobs
Resource to interact with jobs. Access via adaptive.jobs
cancel
parameters
- job_id: The ID of the job to cancel.
get
parameters
- job_id: The ID of the job to retrieve.
list
parameters
- first: Number of jobs to return from the beginning.
- last: Number of jobs to return from the end.
- after: Cursor for forward pagination.
- before: Cursor for backward pagination.
- kind: Filter by job types.
- project: Filter by project key.
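first/last/after/before describe cursor-style pagination. A sketch of draining a forward-paginated listing follows; iterate_all and fetch_page are hypothetical names, and the real SDK's page shape may differ.

```python
def iterate_all(fetch_page, page_size=50):
    """Drain a cursor-paginated listing: repeatedly fetch `first=page_size`
    items `after` the last cursor until no next page remains.

    fetch_page(first=..., after=...) must return a tuple
    (items, end_cursor, has_next_page), mirroring jobs.list's
    first/after parameters."""
    cursor = None
    while True:
        items, cursor, has_next = fetch_page(first=page_size, after=cursor)
        yield from items
        if not has_next:
            return
```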
optional_project_key
parameters
- project: Optional explicit project key.
project_key
parameters
- project: Optional explicit project key.
run
parameters
- recipe_key: The key of the recipe to run.
- num_gpus: Number of GPUs to allocate for the job.
- args: Optional arguments to pass to the recipe; must match the recipe schema.
- name: Optional human-readable name for the job.
- project: Project key for the job.
- compute_pool: Optional compute pool key to run the job on.
Feedback
Resource to interact with and log feedback. Access via adaptive.feedback
get_key
parameters
- feedback_key: The feedback key.
list_keys
log_metric
Logs a metric value against the specified feedback_key.
parameters
- value: The feedback value.
- completion_id: The completion_id to attach the feedback to.
- feedback_key: The feedback key to log against.
- user: ID of the user submitting feedback. If not None, it will be logged as metadata for the request.
- details: Textual details for the feedback. Can be used to provide further context on the feedback value.
log_preference
parameters
- feedback_key: The feedback key to log against.
- preferred_completion: Can be a completion_id or a dict with keys model and text, corresponding to a valid model key and its attributed completion.
- other_completion: Can be a completion_id or a dict with keys model and text, corresponding to a valid model key and its attributed completion.
- user: ID of the user submitting feedback.
- messages: Input chat messages, each a dict with keys role and content. Ignored if preferred_completion and other_completion are completion_ids.
- tied: Indicates whether both completions tied as equally good or equally bad.
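preferred_completion and other_completion accept either a completion_id string or a model/text dict. A sketch normalizing that union type is shown below; normalize_completion is a hypothetical helper, not SDK code.

```python
def normalize_completion(completion):
    """Return ("id", value) for a completion_id string, or ("inline", value)
    for a {"model": ..., "text": ...} dict, raising on anything else."""
    if isinstance(completion, str):
        return ("id", completion)
    if isinstance(completion, dict) and {"model", "text"} <= completion.keys():
        return ("inline", completion)
    raise ValueError("expected a completion_id or a dict with 'model' and 'text'")
```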
optional_project_key
parameters
- project: Optional explicit project key.
project_key
parameters
- project: Optional explicit project key.
register_key
parameters
- key: Feedback key.
- kind: Feedback kind. If "bool", you can log only the values 0, 1, True or False. If "scalar", you can log any integer or float value.
- scoring_type: Indicates what good means for this feedback key: a higher numeric value (or True), or a lower numeric value (or False).
- name: Human-readable feedback name that will render in the UI. If None, it will be the same as key.
- description: Description of the intended purpose or nuances of the feedback. Will render in the UI.
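The kind rules for register_key can be captured in a small validator; validate_feedback_value is a hypothetical helper mirroring the documented bool/scalar constraints, not SDK code.

```python
def validate_feedback_value(kind, value):
    """Enforce the documented rules: "bool" keys accept only 0, 1, True or
    False; "scalar" keys accept any integer or float."""
    if kind == "bool":
        # True == 1 and False == 0 in Python, so both forms pass this check.
        if isinstance(value, (bool, int)) and value in (0, 1):
            return value
        raise ValueError("bool feedback accepts only 0, 1, True or False")
    if kind == "scalar":
        if isinstance(value, (int, float)):
            return value
        raise ValueError("scalar feedback accepts any int or float")
    raise ValueError(f"unknown feedback kind: {kind!r}")
```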
Interactions
Resource to interact with interactions. Access via adaptive.interactions
create
parameters
- messages: Input chat messages, each a dict with keys role and content.
- completion: Model completion.
- model: Model key.
- feedbacks: List of feedbacks, each a dict with keys feedback_key, value, and optionally details.
- user: ID of the user making the request. If not None, it will be logged as metadata for the interaction.
- session_id: Session ID to group related interactions.
- project: Project key. Falls back to client’s default if not provided.
- ab_campaign: AB test key. If set, the provided feedbacks will count towards AB test results.
- labels: Key-value pairs of interaction labels.
- created_at: Timestamp of interaction creation or ingestion.
get
parameters
- completion_id: The ID of the completion.
list
parameters
- order: Ordering of results.
- filters: List filters.
- page: Paging config.
- group_by: Retrieve interactions grouped by selected dimension.
optional_project_key
parameters
- project: Optional explicit project key.
project_key
parameters
- project: Optional explicit project key.
Models
Resource to interact with models. Access via adaptive.models
add_external
parameters
- name: Adaptive name for the new model.
- external_model_id: Should match the model id publicly shared by the model provider.
- api_key: API Key for authentication against external model provider.
- provider: External proprietary model provider.
- extra_params: Additional provider-specific parameters (supported for open_ai and azure).
add_hf_model
parameters
- hf_model_id: The ID of the selected model repo on HuggingFace Model Hub.
- output_model_key: The key that will identify the new model in Adaptive.
- hf_token: Your Hugging Face token, needed to validate access to gated/restricted models.
add_to_project
parameters
- model: Model key.
- project: Project key. Falls back to client’s default if not provided.
attach
parameters
- model: Model key.
- wait: If the model is not already deployed, attaching it to the project will automatically deploy it. If True, this call blocks until the model is Online.
- make_default: Make the model the project's default on attachment.
- num_draft_steps: Optional number of speculative decoding draft steps.
deploy
parameters
- model: Model key.
- wait: If True, block until the model is online.
- make_default: Make the model the project's default after deployment.
- project: Project key.
- placement: Optional placement configuration for the model.
- num_draft_steps: Optional number of speculative decoding draft steps.
detach
parameters
- model: Model key.
get
parameters
- model: Model key.
list
optional_project_key
parameters
- project: Optional explicit project key.
project_key
parameters
- project: Optional explicit project key.
terminate
parameters
- model: Model key.
- force: If the model is attached to several projects, force must equal True in order for the model to be terminated.
update
parameters
- model: Model key.
- is_default: Change the selection of the model as the project default. True to promote to default, False to demote from default. If None, no changes are applied.
- attached: Whether the model should be attached to or detached from the project. If None, no changes are applied.
- desired_online: Turn model inference on or off for the client project. This does not influence the global status of the model; it is project-bounded. If None, no changes are applied.
- num_draft_steps: Optional number of speculative decoding draft steps.
update_compute_config
Permissions
Resource to list permissions. Access via adaptive.permissions
list
optional_project_key
parameters
- project: Optional explicit project key.
project_key
parameters
- project: Optional explicit project key.
Roles
Resource to manage roles. Access via adaptive.roles
create
parameters
- key: Role key.
- permissions: List of permission identifiers such as project:read. You can list all possible permissions with client.permissions.list().
- name: Role name; if not provided, defaults to key.
list
optional_project_key
parameters
- project: Optional explicit project key.
project_key
parameters
- project: Optional explicit project key.
Teams
Resource to manage teams. Access via adaptive.teams
create
parameters
- key: Unique key for the team.
- name: Human-readable team name. If not provided, defaults to key.
list
optional_project_key
parameters
- project: Optional explicit project key.
project_key
parameters
- project: Optional explicit project key.
Projects
Resource to interact with projects. Access via adaptive.projects
create
parameters
- key: Project key.
- name: Human-readable project name which will be rendered in the UI. If not set, it will be the same as key.
- description: Description of the project, which will be rendered in the UI.
get
list
optional_project_key
parameters
- project: Optional explicit project key.
project_key
parameters
- project: Optional explicit project key.
share
parameters
- project: Project key.
- team: Team key.
- role: Role key.
unshare
parameters
- project: Project key.
- team: Team key.
Users
Resource to manage users and permissions. Access via adaptive.users
add_to_team
parameters
- email: User email.
- team: Key of the team to which the user will be added.
- role: Assigned role.
create
parameters
- email: User’s email address.
- name: User’s display name.
- teams_with_role: Sequence of (team_key, role_key) tuples assigning the user to teams with specific roles.
create_service_account
parameters
- name: Account name. Must contain only lowercase letters (a-z), numbers, hyphens, and underscores.
- teams_with_role: Sequence of (team_key, role_key) tuples.
delete
parameters
- email: The email address of the user to delete.
list
me
remove_from_team
parameters
- email: User email.
- team: Key of team to remove user from.

