Class: HClaudeClientConnector

new HClaudeClientConnector(apiKey, defaultModel)

Create a new HClaudeClientConnector.

Parameters:

  apiKey (string)
      Claude API key.

  defaultModel (string, default: claude-3-opus-20240229)
      Optional default model to use.
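A minimal sketch of how this constructor might be used and how the documented defaultModel fallback could behave. The class body below is illustrative only, not the project's implementation:

```javascript
// Hypothetical sketch of the constructor described above; the real
// implementation lives in this project, not here.
class HClaudeClientConnector {
  constructor(apiKey, defaultModel = 'claude-3-opus-20240229') {
    if (!apiKey) throw new Error('Claude API key is required');
    this.apiKey = apiKey;
    this.defaultModel = defaultModel;
  }
}

// Omitting defaultModel falls back to the documented default.
const claude = new HClaudeClientConnector('my-api-key');
console.log(claude.defaultModel); // 'claude-3-opus-20240229'
```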

Class: MistralConnector

new MistralConnector(apiKey, baseUrl, defaultModel)

Create a new MistralConnector.

Parameters:

  apiKey (string)
      Mistral API key.

  baseUrl (string, default: https://api.mistral.ai/v1)
      Optional base URL for the API.

  defaultModel (string, default: mistral-medium)
      Optional default model to use.
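An illustrative sketch (not the project's code) showing how both optional parameters could fall back to the documented defaults when omitted:

```javascript
// Hypothetical sketch: baseUrl and defaultModel take the documented
// defaults when the caller supplies only the API key.
class MistralConnector {
  constructor(
    apiKey,
    baseUrl = 'https://api.mistral.ai/v1',
    defaultModel = 'mistral-medium'
  ) {
    this.apiKey = apiKey;
    this.baseUrl = baseUrl;
    this.defaultModel = defaultModel;
  }
}

const mistral = new MistralConnector('my-api-key');
console.log(mistral.baseUrl);      // 'https://api.mistral.ai/v1'
console.log(mistral.defaultModel); // 'mistral-medium'
```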

Class: NomicConnector

new NomicConnector(apiKey, defaultModel)

Create a new NomicConnector.

Parameters:

  apiKey (string, default: null)
      Nomic API key (optional; falls back to the NOMIC_API_KEY environment variable).

  defaultModel (string, default: nomic-embed-text-v1.5)
      Optional default model to use.
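An illustrative sketch (not the project's code) of the environment-variable fallback described above: when apiKey is null, the key is read from NOMIC_API_KEY.

```javascript
// Hypothetical sketch of the documented NOMIC_API_KEY fallback.
class NomicConnector {
  constructor(apiKey = null, defaultModel = 'nomic-embed-text-v1.5') {
    this.apiKey = apiKey ?? process.env.NOMIC_API_KEY ?? null;
    this.defaultModel = defaultModel;
  }
}

process.env.NOMIC_API_KEY = 'env-key';
const nomic = new NomicConnector();
console.log(nomic.apiKey);       // 'env-key'
console.log(nomic.defaultModel); // 'nomic-embed-text-v1.5'
```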

Class: HOllamaClientConnector

new HOllamaClientConnector(baseUrl, defaultModel)

Create a new HOllamaClientConnector.

Parameters:

  baseUrl (string, default: http://localhost:11434)
      Optional base URL for the Ollama API.

  defaultModel (string, default: qwen2:1.5b)
      Optional default model to use.
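An illustrative sketch (not the project's code): unlike the hosted connectors above, no API key is taken; the first argument is the local Ollama endpoint.

```javascript
// Hypothetical sketch: both parameters default to the documented
// values, targeting a local Ollama server.
class HOllamaClientConnector {
  constructor(baseUrl = 'http://localhost:11434', defaultModel = 'qwen2:1.5b') {
    this.baseUrl = baseUrl;
    this.defaultModel = defaultModel;
  }
}

const ollama = new HOllamaClientConnector();
console.log(ollama.baseUrl);      // 'http://localhost:11434'
console.log(ollama.defaultModel); // 'qwen2:1.5b'
```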

new exports(llmProvider, chatModel, temperature, options)

Parameters:

  llmProvider (LLMProvider)

  chatModel (string)

  temperature (number, optional, default: 0.7)

  options (Object, optional, default: {})
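An illustrative sketch (not the project's code) of the optional-parameter defaults documented above. The docs do not name this class, so "Handler" below is a placeholder:

```javascript
// Hypothetical sketch: temperature and options are optional and
// default to 0.7 and {} respectively, as documented.
class Handler {
  constructor(llmProvider, chatModel, temperature = 0.7, options = {}) {
    this.llmProvider = llmProvider;
    this.chatModel = chatModel;
    this.temperature = temperature;
    this.options = options;
  }
}

// A stub provider object stands in for a real LLMProvider here.
const handler = new Handler({ name: 'stub-provider' }, 'some-model');
console.log(handler.temperature); // 0.7
console.log(handler.options);     // {}
```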