Installation

Install the Prompt Foundry SDK

npm install @prompt-foundry/typescript-sdk

Integration

The full API of this library can be found on the API Reference page by selecting JavaScript in the interactive examples.

Option 1 - Completion Proxy

This option requires you to add a provider API key in your organization settings.

Initiates a completion request to the configured LLM provider using the prompt's configured parameters and the provided variables. This endpoint abstracts the integration with different model providers, enabling seamless switching between models while maintaining a consistent data model for your application.

import PromptFoundry from "@prompt-foundry/typescript-sdk";

// Initialize Prompt Foundry SDK with your API key
const promptFoundry = new PromptFoundry({
  apiKey: process.env["PROMPT_FOUNDRY_API_KEY"],
});

async function main() {
  // Create a completion for the prompt
  const completionCreateResponse = await promptFoundry.completion.create(
    "637ae1aa8f4aa6fad144ccbd",
    {
      // Optionally append additional messages to the conversation thread on top of your configured prompt messages
      appendMessages: [
        {
          role: "user",
          content: [
            {
              type: "TEXT",
              text: "What is the weather in Seattle, WA?",
            },
          ],
        },
      ],
      // Supports prompt template variables
      variables: {},
    }
  );
  // Print the completion response
  console.log(completionCreateResponse.message);
}

main().catch(console.error);
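The `variables` field maps template variable names to values; the rendering itself happens server-side in Prompt Foundry. Conceptually it works like the following local sketch — the `{{name}}` placeholder syntax and the `renderTemplate` helper here are illustrative assumptions, not part of the SDK:

```typescript
// Illustrative only: a local stand-in for server-side template rendering.
// Assumes {{name}} placeholders; the actual syntax is defined by your
// prompt configuration in Prompt Foundry.
function renderTemplate(
  template: string,
  variables: Record<string, string>
): string {
  // Replace each {{key}} with its value, leaving unknown keys empty.
  return template.replace(/\{\{(\w+)\}\}/g, (_match, key) => variables[key] ?? "");
}

const rendered = renderTemplate("What is the weather in {{city}}?", {
  city: "Seattle, WA",
});
console.log(rendered); // "What is the weather in Seattle, WA?"
```

Passing `variables: { city: "Seattle, WA" }` in the request above would fill a placeholder like this in your configured prompt messages.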

Option 2 - Direct Provider Integration

Fetches the configured model parameters and messages, rendered with the provided variables and mapped to the configured LLM provider. This endpoint abstracts away the mapping between different providers while still allowing you to call each provider directly.

OpenAI Integration

Install the OpenAI SDK

npm install openai

Import the OpenAI and Prompt Foundry SDKs

import PromptFoundry from "@prompt-foundry/typescript-sdk";
import OpenAI from "openai";

// Initialize Prompt Foundry SDK with your API key
const promptFoundry = new PromptFoundry({
  apiKey: process.env["PROMPT_FOUNDRY_API_KEY"],
});

// Initialize OpenAI SDK with your API key
const openai = new OpenAI({
  apiKey: process.env["OPENAI_API_KEY"],
});

async function main() {
  // Retrieve model parameters for the prompt
  const modelParameters = await promptFoundry.prompts.getParameters("1212121", {
    appendMessages: [
      {
        role: "user",
        content: [
          {
            type: "TEXT",
            text: "What is the weather in Seattle, WA?",
          },
        ],
      },
    ],
    variables: { hello: "world" },
  });

  // Check if the provider is OpenAI
  if (modelParameters.provider === "openai") {
    // Use the retrieved parameters to create a chat completion request
    const modelResponse = await openai.chat.completions.create(
      modelParameters.parameters
    );

    // Print the response from OpenAI
    console.log(modelResponse.choices[0].message);
  }
}

main().catch(console.error);

Anthropic Integration

Install the Anthropic SDK

npm install @anthropic-ai/sdk

Import the Anthropic and Prompt Foundry SDKs

import PromptFoundry from "@prompt-foundry/typescript-sdk";
import Anthropic from "@anthropic-ai/sdk";

// Initialize Prompt Foundry SDK with your API key
const promptFoundry = new PromptFoundry({
  apiKey: process.env["PROMPT_FOUNDRY_API_KEY"],
});

// Initialize Anthropic SDK with your API key
const anthropic = new Anthropic({
  apiKey: process.env["ANTHROPIC_API_KEY"],
});

async function main() {
  // Retrieve model parameters for the prompt
  const modelParameters = await promptFoundry.prompts.getParameters("1212121", {
    appendMessages: [
      {
        role: "user",
        content: [
          {
            type: "TEXT",
            text: "What is the weather in Seattle, WA?",
          },
        ],
      },
    ],
    variables: { hello: "world" },
  });

  // Check if the provider is Anthropic
  if (modelParameters.provider === "anthropic") {
    // Use the retrieved parameters to create a chat completion request
    const message = await anthropic.messages.create(modelParameters.parameters);

    // Print the response from Anthropic
    console.log(message.content);
  }
}

main().catch(console.error);
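If your application supports both providers in one code path, you can branch on `modelParameters.provider` with an exhaustive `switch` so that a newly added provider becomes a compile-time error rather than a silently skipped branch. A minimal sketch, assuming `"openai"` and `"anthropic"` are the only provider values (as in the examples above):

```typescript
// Provider values assumed from the two examples above.
type Provider = "openai" | "anthropic";

// Returns which SDK call handles a given provider. The `never` assignment in
// the default branch makes the switch exhaustive at compile time: adding a
// new Provider member without a case here fails type-checking.
function sdkCallFor(provider: Provider): string {
  switch (provider) {
    case "openai":
      return "openai.chat.completions.create";
    case "anthropic":
      return "anthropic.messages.create";
    default: {
      const unhandled: never = provider;
      throw new Error(`Unhandled provider: ${unhandled}`);
    }
  }
}

console.log(sdkCallFor("openai")); // "openai.chat.completions.create"
```

In practice the two branches would invoke the corresponding SDK clients shown above; this sketch only demonstrates the dispatch pattern.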

Additional Information

For more details, visit the GitHub Repo.
