WatsonX AI
LangChain.js supports integration with IBM WatsonX AI. Check out WatsonX AI for a list of available models.
Setup
You will need to set the following environment variables to use the WatsonX AI API.
- IBM_CLOUD_API_KEY, which can be generated via IBM Cloud
- WATSONX_PROJECT_ID, which can be found in your project's manage tab

Alternatively, these can be set during WatsonxAI class instantiation as ibmCloudApiKey and projectId respectively.
For example:
const model = new WatsonxAI({
  ibmCloudApiKey: "My secret IBM Cloud API Key",
  projectId: "My secret WatsonX AI Project id",
});
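If you go the environment-variable route instead, no credentials need to be passed to the constructor. A minimal sketch, assuming IBM_CLOUD_API_KEY and WATSONX_PROJECT_ID are already present in your environment (the modelId shown is illustrative):

import { WatsonxAI } from "@langchain/community/llms/watsonx_ai";

// Assumes IBM_CLOUD_API_KEY and WATSONX_PROJECT_ID are set in the environment,
// so the constructor only needs the model configuration.
const model = new WatsonxAI({
  modelId: "meta-llama/llama-2-70b-chat",
});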
Usage
- npm
- Yarn
- pnpm
npm install @langchain/community @langchain/core
yarn add @langchain/community @langchain/core
pnpm add @langchain/community @langchain/core
import { WatsonxAI } from "@langchain/community/llms/watsonx_ai";
// Note that modelParameters are optional
const model = new WatsonxAI({
  modelId: "meta-llama/llama-2-70b-chat",
  modelParameters: {
    max_new_tokens: 100,
    min_new_tokens: 0,
    stop_sequences: [],
    repetition_penalty: 1,
  },
});

const res = await model.invoke(
  "What would be a good company name for a company that makes colorful socks?"
);
console.log({ res });
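Like other LangChain.js LLMs, the model can also be composed with prompts via the Runnable interface. A minimal sketch, not specific to WatsonX AI, using the standard PromptTemplate and pipe APIs from @langchain/core (the modelId and prompt below are illustrative):

import { WatsonxAI } from "@langchain/community/llms/watsonx_ai";
import { PromptTemplate } from "@langchain/core/prompts";

const model = new WatsonxAI({
  modelId: "meta-llama/llama-2-70b-chat",
});

// Build a simple prompt -> model chain and invoke it with template variables.
const prompt = PromptTemplate.fromTemplate(
  "What would be a good company name for a company that makes {product}?"
);
const chain = prompt.pipe(model);

const chainRes = await chain.invoke({ product: "colorful socks" });
console.log({ chainRes });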
API Reference:
- WatsonxAI from @langchain/community/llms/watsonx_ai
Related
- LLM conceptual guide
- LLM how-to guides