Spice Cloud Platform

Spice Firecache | Cloud-Scale DuckDB

Luke Kim
Founder and CEO

Spice AI is not just another data indexer or data provider.

Spice.ai is a set of AI/ML-infra building blocks for creating data and AI-driven applications — web3 data included.

Alongside October’s General Availability announcement, we also introduced Spice Firecache.

Spice Firecache is a real-time, in-memory SQL service built on cloud-scale DuckDB instances that delivers blazing-fast SQL queries, up to 10x the performance of general SQL query. For example, EigenLayer uses Spice Firecache to enable scenarios not possible before, including serving dynamic data to their high-traffic dashboards, real-time monitoring, and analytics. ML models hosted on the Spice.ai platform can be paired with Firecache to power super-fast, low-latency inferencing, as demonstrated by the AI predictions demo on the Spice.ai website.

If you like DuckDB, you’ll love it at cloud-scale, automatically provisioned and updated with real-time data.

Experience high-concurrency, blazing-fast SQL queries, up to 10x the performance of general SQL query, in the Firecache Playground.

🔥 Spice Firecache in Preview

Spice.ai is preloaded with over 100TB of Web3 and Asset Prices data, which can be combined with your own PostgreSQL, MySQL, and data lake external data sources. While organizations need access to terabytes or petabytes of data, the working set used day-to-day is often orders of magnitude smaller, generally in the 10s of GBs.

Cloud-Scale DuckDB. Spice Firecache enables you to configure specific datasets for inclusion in cloud-scale managed DuckDB instances, for query performance 10x, and sometimes even 100x, faster than general SQL query.

Platform managed. The entire Firecache data lifecycle, from ETL to query, is completely platform managed, ensuring datasets are updated in real-time. All the user has to do is enable Firecache on a dataset, via configuration as code through GitHub-connected Datasets, deploy it, and start querying.

Easy to query. Once datasets have been deployed, they can be queried through the same interfaces as general SQL query, including the Playground SQL Query Editor in the Spice.ai portal, the HTTP API, the Apache Arrow Flight API, and the SDKs.

Dedicated instances. Dedicated Firecache instances are deployed per organization and are Spice app-specific, only available to the Spice app to which they were deployed.

Easy to integrate. Simply swap the Query() method for the Firequery() method in one of the Go, Python, Node.js, or Rust SDKs to start using Firecache.
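As a minimal, illustrative sketch of that swap (the client protocol and method names below are assumptions standing in for the real SDK surface, not the SDK's actual API):

```python
# Hypothetical sketch: route a SQL statement to either the general query
# path or the Firecache path. SpiceClient is an assumed stand-in for the
# real SDK client, which exposes Query()/Firequery()-style methods.
from typing import Protocol, Tuple


class SpiceClient(Protocol):
    def query(self, sql: str) -> Tuple[str, str]: ...       # general SQL query
    def fire_query(self, sql: str) -> Tuple[str, str]: ...  # Firecache query


def run_sql(client: SpiceClient, sql: str, use_firecache: bool = False):
    """Send sql via Firecache when the dataset is Firecache-enabled."""
    if use_firecache:
        return client.fire_query(sql)  # served from in-memory DuckDB instances
    return client.query(sql)           # falls back to general SQL query


# Stub client demonstrating the swap without network access:
class StubClient:
    def query(self, sql: str):
        return ("general", sql)

    def fire_query(self, sql: str):
        return ("firecache", sql)


engine, _ = run_sql(
    StubClient(), "SELECT * FROM eth.recent_blocks LIMIT 10", use_firecache=True
)
print(engine)
```

Because both paths accept the same SQL, enabling Firecache is a one-line change at the call site rather than a rewrite of queries.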

What it enables

Intelligent data and AI-driven software is often limited by the speed and cost of data retrieval from big data warehouses and systems.

High-performance access to data is required to power real-time operations, observability, analytics, ML inferencing, and the recently popular Retrieval-Augmented Generation (RAG) for large language models (LLMs).

By keeping the day-to-day working set of frequently queried data available in Firecache, data retrieval latency can be significantly reduced, resulting in higher-performance (and lower-cost) frontends, applications, and AI-powered insights.

This is particularly beneficial for customers like EigenLayer, who quickly rose to popularity and now make tens of millions of SQL queries a month, and growing, to power their dynamic community and user dashboards. Spice Firecache ensures that software can meet user expectations for speed and interactivity, supercharging the user experience, all while reducing cost.

Spice Firecache is currently available in preview for Spice AI Design Partners and Enterprise customers.

Get in touch at hey@spice.ai if you’d like to trial Firecache for free.

Thank you!

The Team at Spice AI

🌐 Community updates:

Spice AI recently partnered with RISC Zero, a leader in Zero-Knowledge (ZK) technology, to provide ZK-provable ML inferencing for Ethereum gas-fees.

Read the full case-study at spice.ai/cases/risc-zero.

👥 Join the growing Spice AI Community:

Twitter | LinkedIn | GitHub | Discord | Telegram

💡 We’re hiring! 💡

We’re looking for innovators across engineering, design, and devrel. Discover the latest opportunities at Spice AI here and learn how you can play a leading role in shaping the future of intelligent software and application development.

About Spice AI

Spice.ai is a set of AI/ML-infra building blocks for creating data and AI-driven applications — web3 data included.

Spice AI eliminates the complexity of building and operating costly data and AI infrastructure by composing real-time and historical time-series data, custom ETL, machine learning training and inferencing, in a high-performance, enterprise-grade platform.

Have questions or feedback? Get in touch on Discord.


Work with Spice AI

Interested in working with Spice AI or looking to learn a little more about the work we do? We are always looking for our next big challenge. Book an introductory call via our Calendly. Take a deeper look at our enterprise offerings by visiting Spice.ai.

