LlamaHttpClient

class LlamaHttpClient(defaultPort: () -> Int, log: (CodBi.LogLevel, String) -> Unit)

HTTP client for communicating with a local LLAMA-Server instance.

Provides synchronous and streaming POST requests to the LLAMA-Server API. The port is supplied as a lambda so that each request resolves to the currently active port.

Parameters

defaultPort

Lambda returning the current server port. Called on each request.

log

Log function for diagnostic output.

Constructors

constructor(defaultPort: () -> Int, log: (CodBi.LogLevel, String) -> Unit)
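A construction sketch, assuming `CodBi.LogLevel` is the host project's log-level enum; the `activePort` variable and the port value are illustrative, not part of the documented API:

```kotlin
// Hypothetical mutable port, e.g. updated when the server restarts.
var activePort = 8080

val client = LlamaHttpClient(
    defaultPort = { activePort },                 // re-read on every request
    log = { level, msg -> println("[$level] $msg") }
)
```

Passing the port as a lambda rather than a fixed `Int` means a long-lived client keeps working after the server is restarted on a different port.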

Properties


Base URL using the default port.

Functions

fun httpPost(endpoint: String, jsonBody: String, timeoutMs: Int, port: Int = defaultPort()): String

Sends a POST request to the LLAMA-Server and returns the response body.
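A minimal call sketch, assuming an already-constructed `client`; the `/completion` endpoint and the JSON body are assumptions about the llama-server API, not guaranteed by this class:

```kotlin
// Hypothetical endpoint and payload; adjust to your llama-server version.
val body = """{"prompt": "Hello", "n_predict": 32}"""

val response: String = client.httpPost(
    endpoint = "/completion",
    jsonBody = body,
    timeoutMs = 30_000
    // port omitted: defaults to defaultPort(), i.e. the current active port
)
```

Pass `port` explicitly only when you need to target a server other than the currently active one.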

fun httpPostStreaming(endpoint: String, jsonBody: String, onLine: (String) -> Unit, shouldStop: () -> Boolean = { false }, timeoutMs: Int, port: Int = defaultPort())

Sends a POST request to the LLAMA-Server and streams the response as SSE lines.
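A streaming sketch, assuming an already-constructed `client`. The endpoint, the `"stream": true` field, and the `data: `/`[DONE]` framing follow the common SSE convention used by llama-server-style APIs and are assumptions, not guarantees of this class:

```kotlin
import java.util.concurrent.atomic.AtomicBoolean

// Flag polled via shouldStop so the stream can be aborted between lines.
val done = AtomicBoolean(false)

client.httpPostStreaming(
    endpoint = "/completion",
    jsonBody = """{"prompt": "Hello", "stream": true}""",
    onLine = { line ->
        // SSE data lines carry a "data: " prefix; "[DONE]" marks end of stream.
        if (line.startsWith("data: ")) {
            val payload = line.removePrefix("data: ")
            if (payload == "[DONE]") done.set(true) else print(payload)
        }
    },
    shouldStop = { done.get() },
    timeoutMs = 120_000
)
```

Because `timeoutMs` follows the defaulted `shouldStop` parameter, callers that skip `shouldStop` must pass `timeoutMs` as a named argument.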


Base URL for a specific port.