We finally fixed Observability!
Stop paying the data tax. Get efficient, petabyte-scale, flexible, OTEL-native observability for the age of AI agents.
Built by veterans from
Why Berserk?
Efficient by Design
Ingest petabytes of data per day without breaking the bank. Minimal processing and plain S3 storage — bring your own bucket or self-host with MinIO.
AI Native
Built-in CLI and MCP server. Query in KQL, the widely known Kusto Query Language that AI agents already understand.
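A query over the last hour of error logs might look like this — a sketch only, since the table and column names here (`Logs`, `SeverityText`, `ServiceName`) are illustrative assumptions, not Berserk's actual schema:

```kql
// Illustrative KQL sketch — table and field names are assumptions.
Logs
| where Timestamp > ago(1h)
| where SeverityText == "ERROR"
| summarize ErrorCount = count() by ServiceName
| order by ErrorCount desc
```

The pipe-per-line shape is what makes KQL easy for agents to generate and for humans to review step by step.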
Claude setup →
Zero-Config Ingest
Point your OpenTelemetry collectors at Berserk and ship. No schemas to define, no indexes to tune, no configuration needed.
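In practice that means adding one exporter to your existing OpenTelemetry Collector config. The sketch below uses the standard `otlphttp` exporter; the endpoint is a placeholder, not Berserk's documented default:

```yaml
# Illustrative OpenTelemetry Collector config — the Berserk endpoint
# below is a placeholder assumption.
receivers:
  otlp:
    protocols:
      http:

exporters:
  otlphttp:
    endpoint: http://localhost:4318   # assumed Berserk ingest endpoint

service:
  pipelines:
    logs:
      receivers: [otlp]
      exporters: [otlphttp]
```

No schema or index definitions appear anywhere in the config — the exporter just ships OTLP as-is.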
Fast and Flexible
Sub-second queries on semi-structured data. Apply strict types and columns only where you need them.
Integrations
Observability needed a rewrite
Software is producing more telemetry than ever. Security events, error logs, application traces—and now AI prompts and model reasoning logs. With every deploy, the volume grows. Together they tell the story of our systems, but the plot gets lost in the noise and scale.
AI agents make the problem even harder. Their prompts and reasoning traces generate large, text-heavy outputs that capture the decisions driving our businesses. These outputs don't fit traditional telemetry schemas, yet they carry the same operational signal.
What we need isn't an auxiliary system for AI logs. We need a unified system that can correlate logs, metrics, traces, and AI output.
We built Berserk for telemetry in the AI era. It is schemaless, fast, and designed to handle large text-heavy logs alongside traditional telemetry—while remaining exceptionally affordable, even at petabyte scale.

Ready to get started?
Run the full stack locally with Docker Compose in under five minutes.
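A local stack pairs Berserk with MinIO for S3-compatible storage. This compose file is a hypothetical sketch — the image names, ports, and environment variables are placeholders, not Berserk's shipped file:

```yaml
# Hypothetical docker-compose.yml sketch — image names, ports, and
# environment variables are assumptions for illustration.
services:
  berserk:
    image: berserk/berserk:latest      # assumed image name
    ports:
      - "4318:4318"                    # OTLP/HTTP ingest
    environment:
      S3_ENDPOINT: http://minio:9000   # points at the MinIO service below
    depends_on:
      - minio

  minio:
    image: minio/minio:latest
    command: server /data
    ports:
      - "9000:9000"
```

With a file like this in place, `docker compose up -d` brings up both services, and collectors can target the local ingest port.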
