You’ve created and tested your prompt, and now you’re ready to deploy it to your production environment.

1. Deploy

Click the “Deploy” button in the prompt editor to push the prompt to your production environment.

2. Create API Key

Navigate to the “API Keys” section in your organization settings and create a new API key. Store it in a secure environment variable or a secret manager before closing the tab; you will not be able to view it again.
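For example, on macOS or Linux you might export the key in your shell profile. The variable name `PROMPT_FOUNDRY_API_KEY` is an assumption here; use whatever name your application reads.

```shell
# Assumed variable name; replace the placeholder with the key you copied.
export PROMPT_FOUNDRY_API_KEY="<your-api-key>"
```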

3. Install SDK

Now, install the PromptFoundry SDK in your application.
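For a TypeScript/Node application, the install might look like the following; the package name is assumed from the TypeScript SDK, so check the “Libraries” section for the package that matches your language.

```shell
# Assumed package name for the TypeScript SDK.
npm install @prompt-foundry/typescript-sdk
```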

4. Integrate into your provider call

Integrate the SDK into your provider call to start using your deployed prompt.

This call initiates a completion request to the configured LLM provider using the deployed prompt’s parameters and the variables you provide. The SDK abstracts the integration with each model provider, so you can switch models while keeping a consistent data model in your application.
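As a rough sketch of the pattern (not the real SDK API — the function names and shapes below are illustrative stand-ins), the SDK resolves your deployed prompt and variables into a provider-ready parameter object, which your application then passes to the provider call:

```typescript
// Hypothetical sketch: the real Prompt Foundry SDK exposes its own client.
// getPromptParameters and callProvider below are stand-ins, not SDK names.

interface PromptParameters {
  model: string;
  messages: { role: "system" | "user" | "assistant"; content: string }[];
}

// Stand-in for the SDK call that fetches your deployed prompt with the
// template variables already substituted into the messages.
function getPromptParameters(
  promptId: string,
  variables: Record<string, string>,
): PromptParameters {
  // Illustrative template; the real prompt body lives in Prompt Foundry.
  return {
    model: "gpt-4o",
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: `Summarize: ${variables.text}` },
    ],
  };
}

// Stand-in for the provider call (e.g. a chat completion request). Because
// the parameter object is provider-ready, changing the model is an edit to
// the deployed prompt, not a code change here.
function callProvider(params: PromptParameters): string {
  return `[${params.model}] reply to: ${params.messages.at(-1)?.content}`;
}

const params = getPromptParameters("my-prompt-id", { text: "hello world" });
const reply = callProvider(params);
```

The point of the pattern is the separation: prompt content and model choice stay in Prompt Foundry, while the application only supplies variables and forwards the resulting parameters.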

This option requires you to add a provider API key in your organization settings.

For more information on each SDK, visit the “Libraries” section of the documentation.