# @happyvertical/browser-ai
Client-side AI inference with WebGPU acceleration, ONNX Runtime, and offline-first capabilities.
v0.19.0 · Browser AI · ONNX · WebGPU
## Overview
browser-ai enables client-side AI inference in the browser using ONNX Runtime Web with WebGPU or WebAssembly backends. It provides embedding generation, text generation, and model management.
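Under the hood, ONNX Runtime Web selects an execution provider when a session is created. The sketch below illustrates the usual WebGPU-first, WebAssembly-fallback pattern using the public onnxruntime-web API; it is not browser-ai's internal code, the model URL is a placeholder, and the exact entry point (`onnxruntime-web` vs `onnxruntime-web/webgpu`) depends on the onnxruntime-web version you install.

```typescript
import * as ort from 'onnxruntime-web';

// Minimal sketch: prefer the WebGPU backend when the browser exposes it,
// otherwise fall back to WebAssembly. 'model.onnx' is a placeholder URL,
// not a file shipped with browser-ai.
async function createSession(modelUrl: string): Promise<ort.InferenceSession> {
  const executionProviders = 'gpu' in navigator ? ['webgpu', 'wasm'] : ['wasm'];
  return ort.InferenceSession.create(modelUrl, { executionProviders });
}

const session = await createSession('model.onnx');
console.log('inputs:', session.inputNames, 'outputs:', session.outputNames);
```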
## Installation
```bash
npm install @happyvertical/browser-ai
```

## Quick Start

```typescript
import { BrowserAI } from '@happyvertical/browser-ai';
// Initialize (loads ONNX Runtime in browser)
const ai = await BrowserAI.create();
// Generate embeddings
const embedding = await ai.embed('Hello world');
// Returns: Float32Array of 384 dimensions
// Semantic search
const documents = [
  { id: 1, text: 'Paris is the capital of France' },
  { id: 2, text: 'Tokyo is the capital of Japan' },
  { id: 3, text: 'Berlin is the capital of Germany' }
];
const results = await ai.semanticSearch('What is the capital of France?', documents);
// Returns documents sorted by similarity
// Generate an embedding with generateEmbedding()
const docEmbedding = await ai.generateEmbedding('Machine learning is fascinating');
console.log(docEmbedding); // Float32Array(384)
```
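semanticSearch ranks documents by how similar their embeddings are to the query embedding. The sketch below shows one plausible ranking using plain cosine similarity, built only on the embed() call from the example above; cosineSimilarity and rankBySimilarity are illustrative helpers, not part of the browser-ai API.

```typescript
// Cosine similarity between two embedding vectors.
function cosineSimilarity(a: Float32Array, b: Float32Array): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Illustrative helper (not part of browser-ai): embed the query and every
// document, then sort documents by descending similarity to the query.
async function rankBySimilarity(
  ai: { embed(text: string): Promise<Float32Array> },
  query: string,
  documents: { id: number; text: string }[]
) {
  const queryEmbedding = await ai.embed(query);
  const scored = await Promise.all(
    documents.map(async (doc) => ({
      ...doc,
      score: cosineSimilarity(queryEmbedding, await ai.embed(doc.text)),
    }))
  );
  return scored.sort((a, b) => b.score - a.score);
}
```

In practice you would embed each document once and cache the vectors, rather than re-embedding the corpus on every query.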