API Reference
Completion
Initiates a completion request to the configured LLM provider using specified parameters and provided variables. This endpoint abstracts the integration with different model providers, enabling seamless switching between models while maintaining a consistent data model for your application.
POST /sdk/v1/prompts/{id}/completion
Path Parameters
id
string
required
Body
application/json
variables
object
The template variables substituted into the prompt when it is executed.
overrideMessages
object[]
Replaces the configured prompt messages when running the prompt.
appendMessages
object[]
Appended to the end of the configured prompt messages before running the prompt.
user
string
A unique identifier representing your end-user, which can help monitor and detect abuse.
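A minimal request sketch is shown below. The base URL, the Authorization header, and the `{role, content}` shape of appended messages are assumptions for illustration; adjust them to match your SDK configuration.

```typescript
// Sketch of a completion request, assuming bearer-token auth and a
// {role, content} message shape (both assumptions, not documented above).
async function runCompletion(promptId: string): Promise<unknown> {
  const response = await fetch(
    `https://api.example.com/sdk/v1/prompts/${promptId}/completion`, // assumed base URL
    {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: "Bearer YOUR_API_KEY", // assumed auth scheme
      },
      body: JSON.stringify({
        // Template variables substituted into the configured prompt.
        variables: { customerName: "Ada" },
        // Extra messages appended after the configured prompt messages.
        appendMessages: [{ role: "user", content: "Summarize my last order." }],
        // End-user identifier used for abuse monitoring.
        user: "user-1234",
      }),
    }
  );
  return response.json();
}
```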
Response
200 - application/json
provider
enum<string>
required
The provider of the model used for the completion.
Available options: ANTHROPIC, OPENAI
name
string
required
stats
object
required
message
object
required
The completion message generated by the model.
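The 200 response can be modeled roughly as the interface below. Only `provider`, `name`, `stats`, and `message` are documented above; the fields shown inside `stats` and `message` are assumptions for illustration.

```typescript
// Rough TypeScript model of the 200 response body.
// The inner shapes of `stats` and `message` are assumptions, not documented above.
interface CompletionResponse {
  provider: "ANTHROPIC" | "OPENAI";   // provider of the model used
  name: string;                        // model name
  stats: Record<string, unknown>;      // shape not documented (e.g. token usage)
  message: {
    role?: string;                     // assumed
    content?: string;                  // assumed
    [key: string]: unknown;
  };
}
```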