API Reference
Get Model Parameters
Fetches the configured model parameters and messages, rendered with the provided variables and mapped to the configured LLM provider. This endpoint abstracts away the mapping between different providers while still allowing you to call the providers directly.
POST /sdk/v1/prompts/{id}
Path Parameters
id
string
required
Body
application/json
variables
object
The template variables added to the prompt when executing the prompt.
overrideMessages
object[]
Replaces the configured prompt messages when running the prompt.
appendMessages
object[]
Appended to the end of the configured prompt messages before running the prompt.
user
string
A unique identifier representing your end-user, which can help monitor and detect abuse.
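As an illustration, a request could look like the following sketch. The base URL, authorization header, prompt ID, and the role/content shape of the message objects are assumptions for this example and are not defined by this reference.

```typescript
// Hypothetical request sketch: fetch rendered parameters for a prompt.
// Base URL, auth header, prompt ID, and message shape are assumptions.
const response = await fetch("https://api.example.com/sdk/v1/prompts/prompt_123", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${process.env.API_KEY}`,
  },
  body: JSON.stringify({
    // Template variables substituted into the configured prompt
    variables: { customerName: "Ada" },
    // Messages appended after the configured prompt messages
    appendMessages: [{ role: "user", content: "What is my order status?" }],
    // End-user identifier used for abuse monitoring
    user: "user_42",
  }),
});

const prompt = await response.json();
```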
Response
200 - application/json
provider
enum<string>
required
Available options:
anthropic
name
string
required
parameters
object
required
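The response can then be forwarded to the matching provider SDK. The sketch below assumes the parameters object is already in the Anthropic Messages API shape (model, max_tokens, messages, and so on); that shape is an assumption of this example, not something guaranteed by this reference.

```typescript
// Hypothetical sketch of using the response; the contents of `parameters`
// are assumed, not guaranteed by this endpoint.
import Anthropic from "@anthropic-ai/sdk";

interface PromptConfig {
  provider: "anthropic";
  name: string;
  parameters: Record<string, unknown>;
}

async function runPrompt(prompt: PromptConfig) {
  if (prompt.provider === "anthropic") {
    const anthropic = new Anthropic(); // reads ANTHROPIC_API_KEY from the environment
    // Assumes `parameters` carries provider-ready fields such as
    // model, max_tokens, and the rendered messages.
    return anthropic.messages.create(
      prompt.parameters as Anthropic.MessageCreateParams
    );
  }
  throw new Error(`Unsupported provider: ${prompt.provider}`);
}
```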