new ClientConnector(baseUrl, defaultModel)
Create a new ClientConnector
Parameters:
Name | Type | Description |
---|---|---|
baseUrl | string | Optional base URL for the Ollama API (defaults to http://localhost:11434) |
defaultModel | string | Optional default model to use |
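A minimal sketch of how construction with these parameters might behave, assuming the defaults described above. The class body here is a stand-in re-implementation for illustration, not the actual source of `ClientConnector`:

```javascript
// Hypothetical re-implementation of the documented constructor.
// Only the parameter handling described on this page is shown.
class ClientConnector {
  constructor(baseUrl, defaultModel) {
    // Fall back to the documented default Ollama endpoint
    this.baseUrl = baseUrl || "http://localhost:11434";
    // defaultModel is optional; null when not provided (an assumption)
    this.defaultModel = defaultModel || null;
  }
}

// Omitting baseUrl picks up the documented default
const connector = new ClientConnector(undefined, "llama3");
console.log(connector.baseUrl); // http://localhost:11434
```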
Methods
(async) generateChat(model, messages, options) → {string}
Generate chat completion using Ollama
Parameters:
Name | Type | Description |
---|---|---|
model | string | Model name to use |
messages | Array | Array of message objects with role and content |
options | Object | Additional options |
Returns:
- Response text
- Type: string
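A hedged sketch of the request `generateChat` might issue. The endpoint and payload shape (`model`, `messages`, `options`, `stream`) follow Ollama's public REST API (`POST /api/chat`); the helper name `buildChatRequest` and the exact merge of `options` are assumptions, not this class's actual internals:

```javascript
// Hypothetical helper: assemble the non-streaming chat request body
// that a wrapper like generateChat could POST to Ollama.
function buildChatRequest(baseUrl, model, messages, options = {}) {
  return {
    url: `${baseUrl}/api/chat`,
    body: {
      model,
      messages,        // [{ role, content }, ...] as documented above
      options,         // model options per the Ollama REST API
      stream: false,   // request a single JSON response
    },
  };
}

const req = buildChatRequest("http://localhost:11434", "llama3", [
  { role: "user", content: "Hello" },
]);
// A caller would POST req.body as JSON; in the non-streaming Ollama
// response, the returned text is in message.content.
```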
(async) generateCompletion(model, prompt, options) → {string}
Generate completion using Ollama
Parameters:
Name | Type | Description |
---|---|---|
model | string | Model name to use |
prompt | string | Text prompt |
options | Object | Additional options |
Returns:
- Response text
- Type: string
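The same idea for plain completions, sketched under the same assumptions: Ollama's documented `POST /api/generate` endpoint takes `model` and `prompt`, and `buildGenerateRequest` is a hypothetical helper, not part of this class's public API:

```javascript
// Hypothetical helper: assemble the non-streaming completion request
// that a wrapper like generateCompletion could POST to Ollama.
function buildGenerateRequest(baseUrl, model, prompt, options = {}) {
  return {
    url: `${baseUrl}/api/generate`,
    body: {
      model,
      prompt,          // the text prompt documented above
      options,         // model options per the Ollama REST API
      stream: false,   // request a single JSON response
    },
  };
}

const req = buildGenerateRequest(
  "http://localhost:11434",
  "llama3",
  "Why is the sky blue?"
);
// In the non-streaming Ollama response, the generated text is in
// the response field of the returned JSON.
```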
(async) generateEmbedding(model, input) → {Array.<number>}
Generate embeddings using Ollama
Parameters:
Name | Type | Description |
---|---|---|
model | string | Model name to use for embedding |
input | string | Text to generate embedding for |
Returns:
- Vector embedding
- Type: Array.&lt;number&gt;
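A hedged sketch of the embedding request. Ollama's public REST API exposes `POST /api/embed` taking `model` and `input`, which matches the parameter names above; `buildEmbedRequest` and the model name in the example are assumptions for illustration:

```javascript
// Hypothetical helper: assemble the embedding request that a wrapper
// like generateEmbedding could POST to Ollama.
function buildEmbedRequest(baseUrl, model, input) {
  return {
    url: `${baseUrl}/api/embed`,
    body: { model, input },
  };
}

const req = buildEmbedRequest(
  "http://localhost:11434",
  "nomic-embed-text", // example embedding model name (an assumption)
  "hello world"
);
// The Ollama response carries an embeddings array; for a single
// string input, embeddings[0] would be the Array.<number> vector.
```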
(async) initialize()
Initialize the client
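The page does not say what initialization does, so this is only a plausible sketch: one common pattern is to verify the server is reachable before first use. The health check against `GET /api/version` (a real Ollama endpoint) and the `ready` flag are assumptions, not the class's actual behavior:

```javascript
// Hypothetical stand-in showing one way initialize() could work:
// ping the Ollama server and record readiness.
class ClientConnector {
  constructor(baseUrl) {
    this.baseUrl = baseUrl || "http://localhost:11434";
    this.ready = false; // assumed internal flag, not documented
  }

  async initialize() {
    // GET /api/version is a lightweight reachability check
    const res = await fetch(`${this.baseUrl}/api/version`);
    if (!res.ok) {
      throw new Error(`Ollama not reachable at ${this.baseUrl}`);
    }
    this.ready = true;
  }
}
```

A caller would `await connector.initialize()` once before invoking the generate methods.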