Installation
Install the Prompt Foundry SDK
npm install @prompt-foundry/typescript-sdk
Integration
The full API of this library is documented on the API Reference page; select JavaScript in the interactive examples to see the corresponding SDK calls.
Option 1 - Completion Proxy
Initiates a completion request to the configured LLM provider using the prompt's configured parameters and the variables you provide. This endpoint abstracts the integration with different model providers, enabling seamless switching between models while maintaining a consistent data model for your application.
import PromptFoundry from "@prompt-foundry/typescript-sdk";

const promptFoundry = new PromptFoundry({
  apiKey: process.env["PROMPT_FOUNDRY_API_KEY"],
});

async function main() {
  const completionCreateResponse = await promptFoundry.completion.create(
    "637ae1aa8f4aa6fad144ccbd",
    {
      appendMessages: [
        {
          role: "user",
          content: [
            {
              type: "TEXT",
              text: "What is the weather in Seattle, WA?",
            },
          ],
        },
      ],
      variables: {},
    }
  );

  console.log(completionCreateResponse.message);
}

main().catch(console.error);
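The variables object supplies values for any template variables defined in the prompt. A minimal sketch of passing one, assuming the prompt template references a variable named "city" (a hypothetical name used only for illustration):

import PromptFoundry from "@prompt-foundry/typescript-sdk";

const promptFoundry = new PromptFoundry({
  apiKey: process.env["PROMPT_FOUNDRY_API_KEY"],
});

async function main() {
  const completionCreateResponse = await promptFoundry.completion.create(
    "637ae1aa8f4aa6fad144ccbd",
    {
      appendMessages: [
        {
          role: "user",
          content: [{ type: "TEXT", text: "What is the weather like today?" }],
        },
      ],
      // "city" is a hypothetical template variable; use the names defined in your own prompt.
      variables: { city: "Seattle" },
    }
  );

  console.log(completionCreateResponse.message);
}

main().catch(console.error);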
Option 2 - Direct Provider Integration
Fetches the prompt's configured model parameters and messages, rendered with the provided variables and mapped to the format of the configured LLM provider. This endpoint removes the need to handle parameter mapping between providers yourself, while still allowing you to call the provider SDKs directly.
OpenAI Integration
Install the OpenAI SDK
npm install openai
Import the OpenAI and Prompt Foundry SDKs
import PromptFoundry from "@prompt-foundry/typescript-sdk";
import OpenAI from "openai";

const promptFoundry = new PromptFoundry({
  apiKey: process.env["PROMPT_FOUNDRY_API_KEY"],
});

const openai = new OpenAI({
  apiKey: process.env["OPENAI_API_KEY"],
});

async function main() {
  const modelParameters = await promptFoundry.prompts.getParameters("1212121", {
    appendMessages: [
      {
        role: "user",
        content: [
          {
            type: "TEXT",
            text: "What is the weather in Seattle, WA?",
          },
        ],
      },
    ],
    variables: { hello: "world" },
  });

  // Only call OpenAI when the prompt is configured for the OpenAI provider.
  if (modelParameters.provider === "openai") {
    const modelResponse = await openai.chat.completions.create(
      modelParameters.parameters
    );
    console.log(modelResponse);
  }
}

main().catch(console.error);
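If you only need the assistant's reply text, you can read it from the first choice of the response inside the openai branch above. This is the standard OpenAI chat completion shape for a non-streaming request; content is null when the model responds with tool calls:

  const modelResponse = await openai.chat.completions.create(
    modelParameters.parameters
  );
  // The first choice holds the assistant message for a standard, non-streaming completion.
  const replyText = modelResponse.choices[0]?.message.content ?? "";
  console.log(replyText);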
Anthropic Integration
Install the Anthropic SDK
npm install @anthropic-ai/sdk
Import the Anthropic and Prompt Foundry SDKs
import PromptFoundry from "@prompt-foundry/typescript-sdk";
import Anthropic from "@anthropic-ai/sdk";

const promptFoundry = new PromptFoundry({
  apiKey: process.env["PROMPT_FOUNDRY_API_KEY"],
});

const anthropic = new Anthropic({
  apiKey: process.env["ANTHROPIC_API_KEY"],
});

async function main() {
  const modelParameters = await promptFoundry.prompts.getParameters("1212121", {
    appendMessages: [
      {
        role: "user",
        content: [
          {
            type: "TEXT",
            text: "What is the weather in Seattle, WA?",
          },
        ],
      },
    ],
    variables: { hello: "world" },
  });

  if (modelParameters.provider === "anthropic") {
    const message = await anthropic.messages.create(modelParameters.parameters);
    console.log(message.content);
  }
}

main().catch(console.error);
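Anthropic returns content as an array of content blocks rather than a single string. A small sketch of pulling out just the reply text inside the anthropic branch above, assuming a non-streaming request:

  const message = await anthropic.messages.create(modelParameters.parameters);
  // Join the text blocks; non-text blocks (for example, tool use) are skipped.
  const replyText = message.content
    .map((block) => (block.type === "text" ? block.text : ""))
    .join("");
  console.log(replyText);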
For more details, visit the GitHub Repo.