February 25, 2025
How Sawmills Plans to Cut Big Observability Data Down to Size

(Matveev-Aleksandr/Shutterstock)

Observability data is supposed to help companies identify and solve IT issues, but the data has grown so large and complex that it’s become a problem in its own right. The latest vendor with a plan to slash observability data and curb spending on observability vendors is Sawmills, which emerged from stealth last week with AI-powered software for managing telemetry pipelines.

Problems with observability data–the logs, traces, and metrics generated by applications–have been building for years. Back in 2022, Dynatrace declared companies were “drowning in data” as a result of surging observability data collection, and last year it reported a 15% increase in IT leaders saying the data had grown beyond humans’ ability to comprehend.

Today, the average company spends $1.95 million per year to manage its observability data, a 2024 New Relic study found. One vendor often on the receiving end of this observability-driven ire is Datadog. On an earnings call in 2023, Datadog executives discussed one customer in the crypto industry that spent a whopping $65 million on Datadog the previous year (which some bloggers have theorized was Coinbase).

Sawmills co-founders (from left) CTO Amir Jakoby, CEO Ronit Belson, CPO Erez Rusovsky

One person’s big data problem is another person’s big data opportunity, and so it’s not a surprise that observability data solutions have popped up. Cribl was one of the first companies to have the idea of stepping directly into the observability data pipeline and acting as a traffic cop to effectively weed out the extraneous, unnecessary data while sending the essential stuff on its merry way.

Now another vendor has appeared with a similar plan. Sawmills emerged last week from stealth with $10 million in seed funding from Mayfield and Alumni Ventures and a plan to cut the fat out of customers’ observability data pipelines.

The company has developed a telemetry management tool that uses machine learning and AI to analyze the raw feed of data, identify duplicate, noisy, or unneeded data, and then route the data to its ultimate destination, such as a traditional observability platform like Datadog or an observability data lake.

“By proactively trimming high-volume or high-cardinality data streams before they’re ingested or stored by observability platforms (which charge by volume or data points), Sawmills significantly reduces both data ingestion and long-term retention costs,” the company says. Sawmills claims it can reduce customers’ observability data costs by 50% to 80% using this approach.
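Sawmills hasn’t published how its rule engine works, but the kind of pre-ingestion trimming described here can be illustrated with a minimal Python sketch. All field names, log levels, and sample rates below are hypothetical, chosen only to show how dropping fields and sampling noisy streams shrinks volume before a platform ever bills for it:

```python
import random

# Hypothetical reduction rules for illustration -- not Sawmills' actual API.
DROP_FIELDS = {"stack_trace", "request_headers"}   # bulky fields to strip
SAMPLE_RATES = {"DEBUG": 0.1, "INFO": 0.5}         # keep-fraction per log level

def reduce_log(record, rng=random.random):
    """Return a trimmed copy of the record, or None if it is sampled out."""
    rate = SAMPLE_RATES.get(record.get("level", "INFO"), 1.0)
    if rng() >= rate:
        return None  # dropped before ingestion, so never billed
    return {k: v for k, v in record.items() if k not in DROP_FIELDS}

logs = [
    {"level": "ERROR", "msg": "db timeout", "stack_trace": "..."},
    {"level": "DEBUG", "msg": "cache hit"},
    {"level": "DEBUG", "msg": "cache hit"},
]
kept = [r for r in (reduce_log(r) for r in logs) if r is not None]
```

Errors pass through untouched (minus the bulky fields), while debug chatter is cut to a tenth of its volume, which is the general shape of the 50% to 80% savings the company claims.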

Sawmills uses AI to make recommendations for reducing observability data

Sawmills CEO Ronit Belson, who co-founded the company with Amir Jakoby and Erez Rusovsky, says observability has become the second largest expense after cloud costs for most companies.

“In our conversations with VPs of Engineering at leading companies, they consistently tell us that up to 90% of their observability data is useless–yet they’re still paying to collect, process, and store all of it,” Belson says in a press release.

Customers are struggling with unpredictable observability bills and are paying large sums for simple mistakes. Belson says her team spoke with one prospect that was billed $250,000 because of one error made by a developer.

“Engineering teams need intelligent telemetry data management that not only improves data quality but also prevents costly mistakes before they happen,” she continues. “Sawmills automatically identifies optimization opportunities and implements guardrails to protect against unexpected cost spikes while ensuring you capture the data that matters.”

A core component of Sawmills’ strategy is how it uses AI and ML to analyze observability data. Its software generates recommendations for how customers can cut costs, “such as sampling, dropping unneeded fields, aggregating, or converting logs to metrics,” the company tells BigDATAwire.
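Of the techniques listed, “converting logs to metrics” is perhaps the least self-explanatory. The idea is that thousands of near-identical log lines can be collapsed into a handful of counter data points. A rough sketch, with hypothetical field and metric names:

```python
from collections import Counter

def logs_to_metric(records):
    """Collapse repetitive log events into counts keyed by (service, level)."""
    counts = Counter((r["service"], r["level"]) for r in records)
    return [
        {"metric": "log_events_total", "service": svc, "level": lvl, "value": n}
        for (svc, lvl), n in sorted(counts.items())
    ]

records = [
    {"service": "api", "level": "ERROR", "msg": "timeout"},
    {"service": "api", "level": "ERROR", "msg": "timeout"},
    {"service": "web", "level": "INFO",  "msg": "200 OK"},
]
metrics = logs_to_metric(records)  # three log lines become two data points
```

The per-event message bodies are discarded, which is exactly the trade-off: the team keeps the signal (how often something happened, and where) at a fraction of the storage cost.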

Sawmills doesn’t replace existing observability tools. Rather, it sits in front of observability tools from vendors like Splunk, New Relic, Grafana, Prometheus, Elastic, Dynatrace, and Amazon CloudWatch. Sawmills works with logs, metrics, and traces defined using the OpenTelemetry (OTel) formats as well as other proprietary and open standards, the company says.

While intercepting the telemetry data flow looks somewhat similar to Cribl’s strategy, there are important differences, Sawmills says: Cribl is focused on security use cases, while Sawmills is oriented toward observability.

Sawmills’ co-founders are tech industry veterans with firsthand experience of the challenges of managing observability data. Belson and co-founder Rusovsky both worked at Rollout.io, which was acquired by CloudBees, while Jakoby was VP of AI Ops at New Relic.

In a blog post, Belson elaborates on why she and her co-founders started the company. After surveying about 100 senior IT leaders, they realized that the situation with observability data was much worse than they had originally thought, with 70% to 90% of it essentially noise. Better management of that data, then, became the goal.

“Observability is essential, yet its costs are spiraling out of control, forcing engineering teams into an impossible choice: maintain full visibility or keep costs in check,” she says. “[We] started Sawmills because we believe they shouldn’t have to choose.”

Related Items:

Companies Drowning in Observability Data, Dynatrace Says

2025 Observability Predictions and Observations

Sumo Logs a Stand with Dynamic Observability
