Spice Firecache | Cloud-Scale DuckDB

Spice Cloud Platform

Luke Kim

December 5, 2023

Spice AI is not just another data indexer or data provider.

Spice.ai is a set of AI/ML-infra building blocks for creating data and AI-driven applications — web3 data included.

In October’s General Availability announcement we also announced Spice Firecache.

Spice Firecache is a real-time, in-memory SQL service built on cloud-scale DuckDB instances that delivers blazing-fast SQL queries, up to 10x the performance of general SQL query. For example, EigenLayer uses Spice Firecache to enable scenarios that were not possible before, including serving dynamic data to its high-traffic dashboards, real-time monitoring, and analytics. ML models hosted on the Spice.ai platform can be paired with Firecache to power super-fast, low-latency inferencing, as demonstrated by the AI predictions demo on the Spice.ai website.

If you like DuckDB, you’ll love it at cloud scale, automatically provisioned and updated with real-time data.

Experience high-concurrency, blazing-fast SQL queries, up to 10x the performance of general SQL query, in the Firecache Playground.

🔥 Spice Firecache in Preview

Spice.ai is preloaded with over 100TB of Web3 and Asset Prices data, which can be combined with your own PostgreSQL, MySQL, and data lake external data sources. While organizations need access to terabytes or petabytes of data, the working set used day-to-day is often orders of magnitude smaller, generally in the tens of gigabytes.

Cloud-Scale DuckDB. Spice Firecache enables you to configure specific datasets for inclusion in cloud-scale managed DuckDB instances, for query performance 10x, and sometimes even 100x, faster than general SQL query.

Platform managed. The entire Firecache data lifecycle, from ETL to query, is completely platform managed, ensuring datasets are updated in real time. All the user has to do is enable Firecache on a dataset via configuration-as-code through GitHub-connected Datasets, deploy it, and start querying.
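As a sketch of that configuration-as-code step, enabling Firecache on a GitHub-connected Dataset might look something like the following. This is illustrative only: the dataset name and the `firecache` field are assumptions for this example, not the platform's documented schema.

```yaml
# Illustrative sketch only — field names are assumptions,
# not the documented Spice.ai Datasets schema.
name: eth_recent_blocks   # hypothetical dataset name
firecache:
  enabled: true           # opt this dataset into Firecache
```

After committing and deploying the change, the platform handles the rest of the lifecycle: loading the dataset into its managed DuckDB instances and keeping it updated in real time.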

Easy to query. Once datasets have been deployed, they can be queried through the same interfaces as general SQL query, including the Playground SQL Query Editor in the Spice.ai portal, the HTTP API, the Apache Arrow Flight API, and the SDKs.

Dedicated instances. Dedicated Firecache instances are deployed per organization and are Spice app-specific, only available to the Spice app to which they were deployed.

Easy to integrate. Simply swap the Query() method for the Firequery() method in one of the Go, Python, Node.js, or Rust SDKs to start using Firecache.
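To illustrate that one-line swap, here is a minimal Python sketch. The `Client` class below is a stand-in written for this example; the method names `query()` and `fire_query()` are assumptions modeled on the description above, so check your SDK's documentation for the exact identifiers and return types (the real SDKs return Apache Arrow data over the Flight API).

```python
# Stand-in client for illustration only; the real SDK's method names
# and return types may differ.
class Client:
    def query(self, sql: str) -> str:
        # Routed to general SQL query in the real SDK.
        return f"general:{sql}"

    def fire_query(self, sql: str) -> str:
        # Same SQL, served from a dedicated Firecache instance.
        return f"firecache:{sql}"


client = Client()
sql = "SELECT number, gas_used FROM eth.recent_blocks LIMIT 10"

before = client.query(sql)       # existing call path
after = client.fire_query(sql)   # the one-line swap to Firecache
```

The SQL itself is unchanged; only the method call changes, which is what makes the migration a one-line edit per call site.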

What it enables

Intelligent data and AI-driven software is often limited by the speed and cost of data retrieval from big data warehouses and systems.

High-performance access to data is required to power real-time operations, observability, analytics, ML inferencing, and the recently popular Retrieval-Augmented Generation (RAG) for large language models (LLMs).

By keeping the day-to-day working set of frequently queried data available in Firecache, data retrieval latency can be significantly reduced, resulting in higher-performance (and lower-cost) frontends, applications, and AI-powered insights.

This is particularly beneficial for customers like EigenLayer, who rose to popularity quickly and need to make tens of millions of SQL queries a month, and growing, to power their dynamic community and user dashboards. Spice Firecache ensures that software can meet user expectations for speed and interactivity, supercharging the user experience, all while reducing cost.

Spice Firecache is currently available in preview for Spice AI Design Partners and Enterprise customers.

Get in touch at hey@spice.ai if you’d like to trial Firecache for free.

Thank you!

The Team at Spice AI

🌐 Community updates:

Spice AI recently partnered with RISC Zero, a leader in Zero-Knowledge (ZK) technology, to provide ZK-provable ML inferencing for Ethereum gas fees.

Read the full case-study at spice.ai/cases/risc-zero.

👥 Join the growing Spice AI Community:

Twitter | LinkedIn | GitHub | Discord | Telegram

💡 We’re hiring! 💡

We’re looking for innovators across engineering, design, and devrel. Discover the latest opportunities at Spice AI here and learn how you can play a leading role in shaping the future of intelligent software and application development.

About Spice AI

Spice.ai is a set of AI/ML-infra building blocks for creating data and AI-driven applications — web3 data included.

Spice AI eliminates the complexity of building and operating costly data and AI infrastructure by composing real-time and historical time-series data, custom ETL, machine learning training and inferencing, in a high-performance, enterprise-grade platform.

Have questions or feedback? Get in touch on Discord.

