# Friendli.ai Go Client Library
A Go client library for the [Friendli.ai](https://friendli.ai) serverless API, providing chat completions with streaming support.
## Features
- ✅ Chat completions (conversational AI)
- ✅ Streaming and non-streaming responses
- ✅ OpenAI-compatible API structure
- ✅ Server-Sent Events (SSE) streaming
- ✅ Tool/function calling support
- ✅ Comprehensive error handling
- ✅ Context support for cancellation
## Quick Start
### Initialize Client
```go
import "hacknight/internal/friendli"
client, err := friendli.NewClient(os.Getenv("FRIENDLI_API_KEY"))
if err != nil {
    log.Fatal(err)
}
```
### Non-Streaming Chat Completion
```go
req := friendli.NewChatCompletionRequest(
    "meta-llama-3.1-8b-instruct",
    []friendli.Message{
        friendli.NewSystemMessage("You are a helpful assistant"),
        friendli.NewUserMessage("What is Go?"),
    },
).WithTemperature(0.7).WithMaxTokens(200)
resp, err := client.Chat.CreateCompletion(context.Background(), req)
if err != nil {
    log.Fatal(err)
}
fmt.Println(resp.Choices[0].Message.Content)
```
### Streaming Chat Completion
```go
req := friendli.NewChatCompletionRequest(
    "meta-llama-3.1-8b-instruct",
    []friendli.Message{
        friendli.NewUserMessage("Tell me a story"),
    },
)
stream, err := client.Chat.CreateCompletionStream(context.Background(), req)
if err != nil {
    log.Fatal(err)
}
defer stream.Close()
for {
    chunk, err := stream.Recv()
    if err == io.EOF {
        break
    }
    if err != nil {
        log.Fatal(err)
    }
    if len(chunk.Choices) > 0 {
        fmt.Print(chunk.Choices[0].Delta.Content)
    }
}
```
## API Reference
### Client Creation
```go
// Create client with API key
client, err := friendli.NewClient(apiKey)
// With custom options
client, err := friendli.NewClient(apiKey,
    friendli.WithTimeout(120 * time.Second),
    friendli.WithHTTPClient(customHTTPClient),
)
```
### Chat Completions
```go
// Non-streaming
resp, err := client.Chat.CreateCompletion(ctx, req)
// Streaming
stream, err := client.Chat.CreateCompletionStream(ctx, req)
```
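Both calls accept a `context.Context`, so requests can be cancelled or given a deadline. A minimal timeout sketch:
```go
// Abort the request if it takes longer than 30 seconds.
ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
defer cancel()

resp, err := client.Chat.CreateCompletion(ctx, req)
if err != nil {
    // On timeout, err wraps context.DeadlineExceeded.
    log.Fatal(err)
}
fmt.Println(resp.Choices[0].Message.Content)
```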
### Request Builders
```go
// Create basic request
req := friendli.NewChatCompletionRequest(model, messages)
// Fluent API for configuration
// Keep the dot at the end of each line so Go's automatic
// semicolon insertion doesn't split the chain
req.WithTemperature(0.7).
    WithMaxTokens(200).
    WithTopP(0.9).
    WithTools(tools)
```
### Message Helpers
```go
friendli.NewSystemMessage("You are helpful")
friendli.NewUserMessage("Hello")
friendli.NewAssistantMessage("Hi there!")
friendli.NewToolMessage(result, toolCallID)
```
### Stream Helpers
```go
// Collect entire stream as string
content, err := stream.CollectContent()
// Collect all chunks
chunks, err := stream.CollectAll()
// Stream to channels (for concurrent processing)
chunkCh := make(chan *friendli.ChatCompletionChunk, 10)
errCh := make(chan error, 1)
go stream.StreamToChannel(chunkCh, errCh)
```
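A sketch of consuming those channels, assuming `StreamToChannel` closes both channels once the stream ends (sending at most one error first):
```go
for chunk := range chunkCh {
    if len(chunk.Choices) > 0 {
        fmt.Print(chunk.Choices[0].Delta.Content)
    }
}
// Assumes errCh is closed after chunkCh, so this receive cannot block
// forever; a closed, empty channel yields a nil error.
if err := <-errCh; err != nil {
    log.Fatal(err)
}
```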
## Error Handling
```go
if err != nil {
    var apiErr *friendli.APIError
    // errors.As also matches wrapped *friendli.APIError values.
    if errors.As(err, &apiErr) {
        switch {
        case apiErr.IsRateLimitError():
            // Handle rate limiting
        case apiErr.IsAuthenticationError():
            // Handle invalid API key
        case apiErr.IsValidationError():
            // Handle request validation errors
        default:
            // Handle other API errors
        }
    }
}
```
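Rate limits are usually transient, so a small retry loop on `IsRateLimitError` is often enough. A simple sketch with linear backoff:
```go
// Retry up to three times, but only on rate-limit errors.
for attempt := 1; attempt <= 3; attempt++ {
    resp, err := client.Chat.CreateCompletion(ctx, req)
    if err == nil {
        fmt.Println(resp.Choices[0].Message.Content)
        break
    }
    var apiErr *friendli.APIError
    if !errors.As(err, &apiErr) || !apiErr.IsRateLimitError() {
        log.Fatal(err) // not retryable
    }
    time.Sleep(time.Duration(attempt) * time.Second) // linear backoff
}
```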
## Supported Models
- **Llama 3.1**: 8B, 70B, 405B (Instruct variants)
- **Llama 3.2 Vision**: 11B, 90B
- **Mistral**: 7B, 7B Instruct
- **CodeLlama**: Various sizes
- **468,000+ other models** from Hugging Face
See https://suite.friendli.ai for the full model catalog.
## Advanced Features
### Tool Calling
```go
tools := []friendli.Tool{
    {
        Type: "function",
        Function: friendli.FunctionDef{
            Name:        "get_weather",
            Description: "Get the current weather",
            Parameters: map[string]interface{}{
                "type": "object",
                "properties": map[string]interface{}{
                    "location": map[string]string{
                        "type": "string",
                    },
                },
                "required": []string{"location"},
            },
        },
    },
}
req := friendli.NewChatCompletionRequest(model, messages).WithTools(tools)
```
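When the model answers with a tool call instead of text, the round trip is: read the call from the response, run the function yourself, and send the result back with `NewToolMessage`. A sketch, assuming the response exposes tool calls in the OpenAI-compatible shape (`ToolCalls` entries with `ID`, `Function.Name`, and `Function.Arguments`):
```go
resp, err := client.Chat.CreateCompletion(ctx, req)
if err != nil {
    log.Fatal(err)
}
assistant := resp.Choices[0].Message
messages = append(messages, assistant) // keep the assistant turn in history
for _, call := range assistant.ToolCalls {
    if call.Function.Name == "get_weather" {
        result := `{"temp_c": 21}` // stand-in for a real weather lookup
        messages = append(messages, friendli.NewToolMessage(result, call.ID))
    }
}
// Ask the model to turn the tool output into a final answer.
resp, err = client.Chat.CreateCompletion(ctx,
    friendli.NewChatCompletionRequest(model, messages).WithTools(tools))
```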
### Response Format Control
```go
// JSON mode
req.ResponseFormat = &friendli.ResponseFormat{
    Type: "json_object",
}
// JSON schema validation
req.ResponseFormat = &friendli.ResponseFormat{
    Type: "json_schema",
    JSONSchema: yourSchema,
}
// Regex pattern
req.ResponseFormat = &friendli.ResponseFormat{
    Type: "regex",
    Regex: `\d{3}-\d{3}-\d{4}`,
}
```
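In `json_object` mode the reply should be a single JSON document, so it can be decoded with `encoding/json`. A sketch; the struct fields here are hypothetical and must match whatever shape your prompt asks for:
```go
resp, err := client.Chat.CreateCompletion(ctx, req)
if err != nil {
    log.Fatal(err)
}
var out struct {
    Answer string `json:"answer"` // hypothetical field requested by the prompt
}
if err := json.Unmarshal([]byte(resp.Choices[0].Message.Content), &out); err != nil {
    log.Fatal(err)
}
fmt.Println(out.Answer)
```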
### HTMX Integration
The streaming API pairs naturally with HTMX's server-sent-events (SSE) extension:
```go
func handleChat(w http.ResponseWriter, r *http.Request) {
    w.Header().Set("Content-Type", "text/event-stream")
    w.Header().Set("Cache-Control", "no-cache")
    w.Header().Set("Connection", "keep-alive")
    stream, err := client.Chat.CreateCompletionStream(r.Context(), req)
    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }
    defer stream.Close()
    flusher, ok := w.(http.Flusher)
    if !ok {
        http.Error(w, "streaming unsupported", http.StatusInternalServerError)
        return
    }
    for {
        chunk, err := stream.Recv()
        if err == io.EOF {
            break
        }
        if err != nil {
            fmt.Fprintf(w, "event: error\ndata: %s\n\n", err.Error())
            flusher.Flush()
            return
        }
        if len(chunk.Choices) > 0 {
            content := chunk.Choices[0].Delta.Content
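            // Note: multi-line content must be split across multiple "data:"
            // lines (or JSON-encoded) to keep the SSE framing valid.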
            fmt.Fprintf(w, "data: %s\n\n", content)
            flusher.Flush()
        }
    }
}
```
## Environment Variables
```bash
export FRIENDLI_API_KEY="flp_your_api_key_here"
```
Get your API key from https://suite.friendli.ai (Settings → Tokens).
## Implementation Details
### Architecture
- **client.go**: Core HTTP client with authentication
- **chat.go**: Chat completions service
- **stream.go**: SSE streaming handler
- **types.go**: Request/response structures
- **errors.go**: Error types and handling
### Dependencies
Uses only Go standard library (`net/http`, `encoding/json`, etc.). No external dependencies required.
### Thread Safety
The client is safe for concurrent use. Multiple goroutines can share a single client instance.
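For example, a single client can fan out requests across goroutines; a minimal sketch using `sync.WaitGroup`:
```go
var wg sync.WaitGroup
for _, prompt := range []string{"What is Go?", "What is HTMX?"} {
    wg.Add(1)
    go func(p string) {
        defer wg.Done()
        req := friendli.NewChatCompletionRequest(
            "meta-llama-3.1-8b-instruct",
            []friendli.Message{friendli.NewUserMessage(p)},
        )
        // All goroutines share the same client instance.
        resp, err := client.Chat.CreateCompletion(context.Background(), req)
        if err != nil {
            log.Println(err)
            return
        }
        fmt.Println(resp.Choices[0].Message.Content)
    }(prompt)
}
wg.Wait()
```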
## License
Internal library for the hacknight project.