Meet gnata: The AI-Generated Go Library That Saved Reco $500K a Year
From $500,000 to $400: How Reco Used AI to Port JSONata to Go in Just 7 Hours
In a remarkable display of AI-assisted engineering, the security firm Reco has shared its journey of porting the JSONata library from JavaScript to Go. By eliminating massive architectural overhead, the company successfully slashed its annual server costs by $500,000, all for a one-time AI token cost of just $400.
The Architectural Bottleneck
Reco’s core engine, written in Go, relies heavily on JSONata to evaluate security policies for internal events. Since JSONata is natively a JavaScript library, Reco previously ran it in separate containers, communicating via RPC (Remote Procedure Call).
While functional, this setup was incredibly inefficient. For large-scale clients, Reco had to maintain up to 200 replicas, costing the company $300,000 per year in server fees alone. Previous optimization attempts, such as embedding the V8 engine into the Go process or switching to simpler libraries like GJSON, either failed to reduce costs significantly or could not support the complex queries required.
The 7-Hour Transformation
Inspired by Cloudflare’s successful re-engineering of Next.js, Reco engineer Nir Barak decided to rebuild JSONata in Go using AI. To ensure 100% compatibility, he utilized:
1,778 existing test cases from the original JSONata.
2,107 new test cases designed to verify Reco’s specific internal wrappers.
In total, the development took only 7 hours and cost $400 in AI tokens.
The Result: Massive Savings and Open Source
The move to the native Go implementation, now dubbed "gnata," allowed Reco to eliminate the redundant JavaScript containers entirely. This not only saved the initial $300,000 but also improved resource management so effectively that it cut the core engine's server costs by an additional $200,000.
Total Annual Savings: $500,000.
Reco has released gnata as an open-source project under the MIT License, making it freely available for the developer community.
What made this project a success wasn't just the intelligence of the AI but the test suites. Nearly 4,000 test cases served as a compass, guiding the AI toward 100% accurate code. This is a crucial lesson: in the AI era, programmers' work shifts from writing code to structuring the tests that keep the AI on track.
In the world of microservices, cross-container RPC calls are often treated as normal. The Reco case, however, shows that serializing and deserializing large JSON datasets across process boundaries creates enormous overhead in both latency and CPU usage. Moving to a native Go library eliminates this step entirely, significantly speeding up the system.
Currently, we're seeing a trend of porting older scripting-language libraries (JavaScript/Python) to performance-oriented languages (Go/Rust) using AI. What used to take a team of five engineers three months can now be accomplished in a single afternoon by one person and an AI. This is one of the most striking productivity boosts of the decade.
Reco's decision to open-source gnata is a smart strategy. JSONata is a widely used standard but has long lacked a solid native Go implementation. Opening the project will attract a community of developers to help find bugs and further enhance Reco's reputation as a leader in security technology.
Source: Reco
