---
date: 2025-01-22
title: 'Announcing Spice.ai Open Source 1.0-stable: A Portable Compute Engine for Data-Grounded AI — Now Ready for Production'
sidebar_label: 'Announcing 1.0-stable'
description: 'Announcement of the 1.0-stable release of Spice.ai Open Source, a portable compute engine for data-grounded AI, now ready for production.'
image: /img/blog/2024/announcing-1.0-stable/spice.ai-open-source.png
slug: announcing-1.0-stable
type: blog
authors: [lukekim]
tags: [announcement, release]
---

import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
import ReactPlayer from 'react-player';

import Quote from '@site/src/components/molecules/quote/quote';

🎉 Today marks the [1.0-stable release](../releases/v1.0-stable.md) of Spice.ai Open Source—purpose-built to help enterprises ground AI in data. By unifying federated data query, retrieval, and AI inference into a single engine, Spice mitigates AI hallucinations, accelerates data access for mission-critical workloads, and makes it simple for developers to build fast, accurate, data-intensive applications across cloud, edge, or on-prem.

![Spice.ai Open Source](/img/blog/2024/announcing-1.0-stable/spice.ai-open-source.png)

Enterprise AI systems are only as good as the context they’re provided. When data is inaccessible, incomplete, or outdated, even the most advanced models can generate outputs that are inaccurate, misleading, or worse, potentially harmful. In one example, [a chatbot was tricked into selling a 2024 Chevy Tahoe for $1](https://hothardware.com/news/car-dealerships-chatgpt-goes-awry-when-internet-gets-to-it) due to a lack of contextual safeguards. For enterprises, errors like these are unacceptable—it’s the difference between success and failure.

Retrieval-Augmented Generation (RAG) is part of the answer — but traditional RAG is only as good as the data it has access to. If data is locked away in disparate, often legacy data systems, or cannot be stitched together for accurate retrieval, you get, as Benioff puts it, "Clippy 2.0".

![Benioff Post](/img/blog/2024/announcing-1.0-stable/benioff-clippy-2.0.png)

And often, after initial Python-scripted pilots, you’re left with a new set of problems: How do you deploy AI that meets enterprise requirements for performance, security, and compliance while being cost-efficient? Directly querying large datasets for retrieval is slow and expensive. Building and maintaining complex ETL pipelines requires expensive data teams that most organizations don’t have. And because enterprise data is highly sensitive, you need secure access and auditable observability—something many RAG setups don’t even consider.

Developers need a platform at the intersection of data and AI—one specifically designed to ground AI in data. A solution that unifies data query, search, retrieval, and model inference—ensuring performance, security, and accuracy so you can build AI that you and your customers can trust.

## Spice.ai OSS: A portable data, AI, and retrieval engine

In March of 2024, [we introduced Spice.ai Open Source](https://spiceai.org/blog/2024/adding-spice), a SQL query engine to materialize and accelerate data from any database, data warehouse, or data lake so that data can be accessed wherever it lives across the enterprise — consistently fast. But that was only the start.

Building on this foundation, Spice.ai OSS unifies data, retrieval, and AI to provide current, relevant context that mitigates AI “hallucinations” and significantly reduces incorrect outputs. That is just one of the many mission-critical use cases Spice.ai addresses.

Spice is a portable, single-node compute engine built in Rust. It [embeds the fastest single-node SQL query engine](https://datafusion.apache.org/blog/2024/11/18/datafusion-fastest-single-node-parquet-clickbench/), DataFusion, to serve secure, virtualized data views to data-intensive apps, AI, and agents. Sub-second data query is accelerated locally using Apache Arrow, DuckDB, or SQLite.

Now at version 1.0-stable, Spice is ready for production. It’s already in production use at Twilio, Barracuda Networks, and NRC Health, and can be deployed anywhere—cloud-hosted, BYOC, edge, or on-prem.

![A diagram of the Spice.ai OSS compute-engine](/img/blog/2024/announcing-1.0-stable/spice.ai-compute-engine.png)

## Data-grounded AI

Data-grounded AI anchors models in accurate, current, and domain-specific data, rather than relying solely on pre-trained knowledge. By unifying enterprise data—across databases, data lakes, and APIs—and applying advanced ingestion and retrieval techniques, these systems dynamically incorporate real-world context at inference time without leaking sensitive information. This approach helps developers minimize hallucinations, reduce operational risk, and build trust in AI by delivering reliable, relevant outputs.

![Comparison of AI without contextual data and with data-grounded AI](/img/blog/2024/announcing-1.0-stable/data-grounded-ai.png)

**How does Spice.ai OSS solve data-grounding?**

With Spice, models always have access to materializations of low-latency, real-time data for near-instant retrieval, minimizing data movement while enabling AI feedback so apps and agents can learn and adapt over time. For example, you can join customer records from PostgreSQL with sales data in Snowflake and logs stored in S3—all with a single SQL query or LLM function call.
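As a minimal sketch of such a join, assuming datasets named `customers`, `sales`, and `logs` have been defined in the spicepod and a local runtime is listening on its default HTTP port, a client can send one SQL statement over the SQL query API (dataset names, columns, and the port/path here are illustrative):

```python
import json
from urllib import request

# One federated query across PostgreSQL-, Snowflake-, and S3-backed datasets.
# Spice resolves each table to its source (or a local acceleration) at query time.
FEDERATED_SQL = """
SELECT c.name, s.total, l.event
FROM customers AS c
JOIN sales AS s ON s.customer_id = c.id
JOIN logs AS l ON l.customer_id = c.id
LIMIT 10
""".strip()

def run_sql(sql: str, url: str = "http://localhost:8090/v1/sql") -> str:
    """POST a SQL statement to the Spice runtime's HTTP SQL endpoint."""
    req = request.Request(url, data=sql.encode("utf-8"), method="POST")
    with request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

# rows = json.loads(run_sql(FEDERATED_SQL))  # requires a running `spice` instance
```

The application never needs to know which backend holds which table; swapping a source or adding an acceleration is a spicepod change, not a code change.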

![Secure Compute Engine for Data-grounded AI](/img/blog/2024/announcing-1.0-stable/secure-compute-engine.png)

Spice provides an advanced suite of LLM tools: vector and hybrid search, text-to-SQL, SQL query and retrieval, data sampling, and context formatting—all purpose-built for accurate outputs.

The latest research is continually incorporated so that teams can focus on business objectives rather than trying to keep up with the incredibly fast-moving and often overwhelming space of AI.

## Spice.ai OSS: The engine that makes AI work

Spice.ai OSS is a lightweight, portable runtime (single ~140 MB binary) with the capabilities of a high-speed cloud data warehouse built into a self-hostable AI inference engine, all in a single, run-anywhere package.

It's designed to be distributed and integrated at the application level, rather than being a bulky, centralized system to manage, and is often deployed as a sidecar. Whether running one Spice instance per service or one for each customer, Spice is flexible enough to fit your application architecture.

Apps and agents integrate with Spice.ai OSS via three industry-standard APIs, so that it can be adopted incrementally with minimal changes to applications.

1. **SQL Query APIs**: HTTP, Arrow Flight, Arrow Flight SQL, ODBC, JDBC, and ADBC.

2. **OpenAI-Compatible APIs**: HTTP APIs compatible with the OpenAI SDK, AI SDK with local model serving (CUDA/Metal accelerated), and gateway to hosted models.

3. **Iceberg Catalog REST APIs**: A unified Iceberg Catalog REST API.
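Because the second surface follows the OpenAI wire format, an existing chat client needs little more than a different base URL. A sketch with only the standard library, where the model name and the host/port of a default local runtime are assumptions:

```python
import json
from urllib import request

# Chat request against Spice's OpenAI-compatible endpoint. The model is whatever
# is defined in the spicepod (a locally served model, or a hosted one proxied
# through the gateway).
payload = {
    "model": "gpt-4o",
    "messages": [
        {"role": "user", "content": "Which customers had delayed orders last week?"}
    ],
}

def chat(body: dict, url: str = "http://localhost:8090/v1/chat/completions") -> str:
    """POST an OpenAI-format chat completion request to the Spice runtime."""
    req = request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```

The official OpenAI SDK works the same way: point its `base_url` at the runtime and keep the rest of the application unchanged.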

![Spice.ai OSS architecture](/img/blog/2024/announcing-1.0-stable/inside-spice.ai-oss.png)

## Key features of Spice.ai OSS

- **Federated SQL Query Across Data Sources**: Perform SQL queries across disparate data sources with over 25 open-source data connectors, including catalogs (Unity Catalog, Iceberg Catalog, etc.), databases (PostgreSQL, MySQL, etc.), data warehouses (Snowflake, Databricks, etc.), and data lakes (S3, ABFS, MinIO, etc.).

- **Data Materialization and Acceleration**: Locally materialize and accelerate data using Arrow, DuckDB, SQLite, and PostgreSQL, enabling low-latency, high-speed transactional and analytical queries. Data can be ingested via Change Data Capture (CDC) using Debezium, via catalog integrations, on an interval, or by trigger.

- **AI Inference, Gateway, and LLM Toolset**: Load and serve models like Llama 3 locally, or use Spice as a gateway to hosted AI platforms including OpenAI, Anthropic, xAI, and NVIDIA NIM. Automatically use a purpose-built LLM toolset for data-grounded AI.

- **Enterprise Search and Retrieval**: Advanced search capabilities for LLM applications, including vector-based similarity search and hybrid search across structured and unstructured data. Real-time retrieval grounds AI applications in dynamic, contextually relevant information, enabling state-of-the-art RAG.

- **LLM Memory**: Enable long-term memory for LLMs by efficiently storing, retrieving, and updating context across interactions. Support real-time contextual continuity and grounding for applications that require persistent and evolving understanding.

- **LLM Evaluations**: Test and boost model reliability and accuracy with integrated LLM-powered evaluation tools to assess and refine AI outputs against business objectives and user expectations.

- **Monitoring and Observability**: Ensure operational excellence with telemetry, distributed tracing, query/task history, and metrics that provide end-to-end visibility into data flows and model performance in production.

- **Deploy Anywhere; Edge-to-Cloud Flexibility**: Deploy Spice as a standalone instance, Kubernetes sidecar, microservice, or scalable cluster, with the flexibility to run distributed across edge, on-premises, or any cloud environment. Spice AI offers managed, cloud-hosted deployments of Spice.ai OSS through the Spice Cloud Platform (SCP).
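To make the search-and-retrieval feature concrete, here is a hypothetical request body a client might POST to the runtime's search API. The field names, dataset name, and endpoint shape are assumptions for illustration; the actual contract is in the API reference:

```python
import json

# Sketch of a hybrid-search request over an accelerated dataset.
# "support_tickets" and every field name below are illustrative only.
search_request = {
    "text": "customers reporting delivery delays",  # natural-language query
    "datasets": ["support_tickets"],                # where to search
    "limit": 5,                                     # top-k results
}

encoded = json.dumps(search_request)  # body a client would POST to the runtime
```

The returned matches can then be formatted into LLM context by the same runtime, which is what keeps retrieval and inference in one engine.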

## Real-world use-cases

Spice delivers data readiness for teams like Twilio and Barracuda, and accelerates time-to-market of data-grounded AI, such as with developers on GitHub and at NRC Health.

Here are some examples of how Spice.ai OSS solves real problems for these teams.

---

![Twilio Logo](/svg/twilio.svg)

### CDN for Databases — Twilio

![Twilio datalake and database acceleration using Spice.ai Open Source](/img/blog/2024/announcing-1.0-stable/use-case-twilio.png)

A core requirement for many applications is consistently fast data access, with or without AI. Twilio uses Spice.ai OSS as a data acceleration framework or [Database CDN](https://materializedview.io/p/building-a-cdn-for-databases-spice-ai), staging data in object-storage that's accelerated with Spice for sub-second query to improve the reliability of critical services in its messaging pipelines. Before Spice, a database outage could result in a service outage.

<Quote name="Peter Janovsky" title="Software Architect" company="Twilio" imageUrl="/img/blog/2024/announcing-1.0-stable/peter-janovsky.jpeg">
  "Spice opened the door to take these critical control-plane datasets and move them next to our services in the runtime path."
</Quote>

With Spice, Twilio has achieved:

- **Significantly Improved Query Performance**: Used Spice to co-locate control-plane data in the messaging runtime, accelerated with DuckDB, to send messages with a P99 query time of < 5ms.

- **Low-Latency Multi-Tenancy Controls**: Spice is integrated into the message-sending runtime to manage multi-tenancy data controls. Before, data changes required manual triggers and took hours to propagate. Now, they update automatically and reach the messaging front door within five minutes via a resilient data-availability framework.

- **Mission-Critical Reliability**: Reduced reliance on queries to databases by using Spice to accelerate data in-memory locally, with automatic failover to query data directly from S3, ensuring uninterrupted service even during database downtime.

<Quote name="David Blum" title="Principal Software Engineer" company="Twilio" imageUrl="/img/blog/2024/announcing-1.0-stable/david-blum.jpeg">
  "With a simple drop in container, we are able to double our data redundancy by using Spice."
</Quote>

By adopting Spice.ai OSS, Twilio strengthened its infrastructure, ensuring reliable services for customers and scalable data access across its growing platform.

---

![Barracuda Logo](/svg/barracuda.svg)

### Datalake Accelerator — Barracuda

![Barracuda Delta Lake acceleration using Spice.ai Open Source](/img/blog/2024/announcing-1.0-stable/use-case-barracuda.png)

Barracuda uses Spice.ai OSS to modernize data access for their email archiving and audit log systems, solving two big problems: slow query performance and costly queries. Before Spice, customers experienced frustrating delays of up to two minutes when searching email archives, due to the data volume being queried.

<Quote name="David Stancu" title="Senior Principal Software Engineer" company="Barracuda" imageUrl="/img/blog/2024/announcing-1.0-stable/david-stancu.jpeg">
  "It's just a huge gain in responsiveness for the customer."
</Quote>

With Spice, Barracuda has achieved:

- **100x Query Performance Improvement**: Accelerated email archive queries from a P99 time of 2 minutes to 100-200 milliseconds.

- **Efficient Audit Logs**: Offloaded audit logs to Parquet files in S3, queried directly by Spice.

- **Mission-Critical Reliability**: Reduced load on Cassandra, improving overall infrastructure stability.

- **Significant Cost Reduction**: Replaced expensive Databricks Spark queries, significantly cutting expenses while improving performance.

<Quote name="Darin Douglass" title="Principal Software Engineer" company="Barracuda" imageUrl="/img/blog/2024/announcing-1.0-stable/darin-douglass.jpeg">
  "It just kinda spins up and it just works, which is really nice."
</Quote>

---

![NRC Health Logo](/svg/nrc-health.svg)

### Data-Grounded AI apps and agents — NRC Health

![NRC Health data-grounded AI using Spice.ai Open Source](/img/blog/2024/announcing-1.0-stable/use-case-nrc.png)

NRC Health uses Spice.ai OSS to simplify and accelerate the development of data-grounded AI features, unifying data from multiple platforms, including MySQL, SharePoint, and Salesforce, into secure, AI-ready data. Before Spice, scaling AI expertise across the organization to build complex RAG-based scenarios was a challenge.

<Quote name="Dustin Warner" title="Director of Software Engineering" company="NRC Health" imageUrl="/img/blog/2024/announcing-1.0-stable/dustin-warner.jpeg">
  "What I like the most about Spice, it's very easy to collect data from different data sources and I am able to chat with this data and do everything in one place."
</Quote>

With Spice OSS, NRC Health has achieved:

- **Developer Productivity**: Partnered with Spice in three company-wide AI hackathons to build complete end-to-end data-grounded AI features in hours instead of weeks or months.

- **Accelerated Time-to-Market**: Centralized data integration and AI model serving into a single, enterprise-ready service.

<Quote name="Taher Ahmed" title="Software Engineering Manager" company="NRC Health" imageUrl="/img/blog/2024/announcing-1.0-stable/taher-ahmed.jpeg">
  "I explored AI, embeddings, search algorithms, and features with our own database. I read a lot about this, but it was so much easier to use Spice than doing it from scratch."
</Quote>

---

### Data-Grounded AI Software Development — Spice.ai GitHub Copilot Extension

When using tools like GitHub Copilot, developers often face the hassle of switching between multiple environments to get the data they need.

<div style={{display: 'flex', justifyContent: 'center', marginBottom: '15px'}}>
  <ReactPlayer
    controls
    url='https://www.youtube.com/watch?v=A0QdHVUKfAk'
  />
</div>

The [Spice.ai for GitHub Copilot Extension](https://github.com/marketplace/spice-ai-for-github-copilot), built on Spice.ai OSS, gives developers the ability to connect data from external sources to Copilot, grounding Copilot in relevant data not generally available in GitHub, like test data stored in a development database.

Developers can simply type **`@spiceai`** to interact with connected data, with relevant answers now surfaced directly in Copilot Chat, significantly improving productivity.

## Why choose Spice.ai OSS?

Adopting Spice.ai OSS addresses real challenges in modern AI development: it grounds models in accurate, domain-specific, real-time data. With Spice, engineering teams can focus on what matters—delivering innovative, accurate, AI-powered applications and agents that work. Additionally, Spice.ai OSS is open-source under Apache 2.0, ensuring transparency and extensibility so your organization remains free to innovate without vendor lock-in.

## Get started in 30 seconds

You can install Spice.ai OSS in less than a minute on macOS, Linux, and Windows:

<Tabs>
  <TabItem value="default" label="macOS, Linux, and WSL" default>
  ```bash
  curl https://install.spiceai.org | /bin/bash
  ```

  Or using `brew`:

  ```bash
  brew install spiceai/spiceai/spice
  ```

  </TabItem>
  <TabItem value="windows" label="Windows">
  ```powershell
  curl -L "https://install.spiceai.org/Install.ps1" -o Install.ps1 && PowerShell -ExecutionPolicy Bypass -File ./Install.ps1
  ```
  </TabItem>
</Tabs>

Once installed, follow the [Getting Started with Spice.ai guide](/docs/getting-started) to ground OpenAI chat with data from S3 in less than 2 minutes.

## Looking ahead

The 1.0-stable release of Spice.ai OSS marks a major step toward accurate AI for developers. By combining data, AI, and retrieval into a unified runtime, Spice anchors AI in relevant, real-time data—helping you build apps and agents that work.

A cloud-hosted, fully managed Spice.ai OSS service is available in the Spice Cloud Platform. It’s SOC 2 Type II compliant and makes it easy to operate Spice deployments.

Beyond apps and agents, the vision for Spice is to be the best digital labor platform for building autonomous AI employees and teams. These are exciting times! Stay tuned for some upcoming announcements later in 2025!

The Spice AI Team

## Learn more

- **[Cookbook](/cookbook)**: 47+ samples and examples using Spice.ai OSS
- **[Documentation](/docs)**: Learn about features, use cases, and advanced configurations
- **[X](https://x.com/spice_ai)**: Follow @spice_ai on X for news and updates
- **[Discord](https://discord.gg/kZnTfneP5u)**: Connect with the team and the community
- **[GitHub](https://github.com/spiceai/spiceai)**: Star the repo, contribute, and raise issues
1---
2date: 2025-01-22
3title: 'Announcing Spice.ai Open Source 1.0-stable: A Portable Compute Engine for Data-Grounded AI — Now Ready for Production'
4sidebar_label: 'Announcing 1.0-stable'
5description: 'Announcement of the 1.0-stable release of Spice.ai Open Source, a portable compute engine for data-grounded AI, now ready for production.'
6image: /img/blog/2024/announcing-1.0-stable/spice.ai-open-source.png
7slug: announcing-1.0-stable
8type: blog
9authors: [lukekim]
10tags: [announcement, release]
11---
12
13import Tabs from '@theme/Tabs';
14import TabItem from '@theme/TabItem';
15import ReactPlayer from 'react-player';
16
17import Quote from '@site/src/components/molecules/quote/quote';
18
19🎉 Today marks the [1.0-stable release](../releases/v1.0-stable.md) of Spice.ai Open Source—purpose-built to help enterprises ground AI in data. By unifying federated data query, retrieval, and AI inference into a single engine, Spice mitigates AI hallucinations, accelerates data access for mission-critical workloads, and makes it simple and easy for developers to build fast and accurate data-intensive applications across cloud, edge, or on-prem.
20
21![Spice.ai Open Source](/img/blog/2024/announcing-1.0-stable/spice.ai-open-source.png)
22
23Enterprise AI systems are only as good as the context they’re provided. When data is inaccessible, incomplete, or outdated, even the most advanced models can generate outputs that are inaccurate, misleading, or worse, potentially harmful. In one example, [a chatbot was tricked into selling a 2024 Chevy Tahoe for $1](https://hothardware.com/news/car-dealerships-chatgpt-goes-awry-when-internet-gets-to-it) due to a lack of contextual safeguards. For enterprises, errors like these are unacceptable—it’s the difference between success and failure.
24
25Retrieval-Augmented Generation (RAG) is part of the answer — but traditional RAG is only as good as the data it has access to. If data is locked away in disparate, often legacy data systems, or cannot be stitched together for accurate retrieval, you get, as Benioff puts it, "Clippy 2.0".
26
27![Benioff Post](/img/blog/2024/announcing-1.0-stable/benioff-clippy-2.0.png)
28
29And often, after initial Python-scripted pilots, you’re left with a new set of problems: How do you deploy AI that meets enterprise requirements for performance, security, and compliance while being cost efficient? Directly querying large datasets for retrieval is slow and expensive. Building and maintaining complex ETL pipelines requires expensive data teams that most organizations don’t have. And because enterprise data is highly sensitive, you need secure access and auditable observability—something many RAG setups don’t even consider.
30
31Developers need a platform at the intersection of data and AI—one specifically designed to ground AI in data. A solution that unifies data query, search, retrieval, and model inference—ensuring performance, security, and accuracy so you can build AI that you and your customers can trust.
32
33## Spice.ai OSS: A portable data, AI, and retrieval engine
34
35In March of 2024, [we introduced Spice.ai Open Source](https://spiceai.org/blog/2024/adding-spice), a SQL query engine to materialize and accelerate data from any database, data warehouse, or data lake so that data can be accessed wherever it lives across the enterprise — consistently fast. But that was only the start.
36
37Building on this foundation, Spice.ai OSS unifies data, retrieval, and AI, to provide current, relevant context to mitigate AI “hallucinations” and significantly reduce incorrect outputs-just one of the many mission-critical use cases Spice.ai addresses.
38
39Spice is a portable, single-node, compute engine built in Rust. It [embeds the fastest single-node SQL query engine](https://datafusion.apache.org/blog/2024/11/18/datafusion-fastest-single-node-parquet-clickbench/), DataFusion, to serve secure, virtualized data views to data-intensive apps, AI, and agents. Sub-second data query is accelerated locally using Apache Arrow, DuckDB, or SQLite.
40
41Now at version 1.0-stable, Spice is ready for production. It’s already deployed in enterprise use at Twilio, Barracuda Networks, and NRC Health, and can be deployed anywhere—cloud-hosted, BYOC, edge, on-prem.
42
43![A diagram of the Spice.ai OSS compute-engine](/img/blog/2024/announcing-1.0-stable/spice.ai-compute-engine.png)
44
45## Data-grounded AI
46
47Data-grounded AI anchors models in accurate, current, and domain-specific data, rather than relying solely on pre-trained knowledge. By unifying enterprise data—across databases, data lakes, and APIs—and applying advanced ingestion and retrieval techniques, these systems dynamically incorporate real-world context at inference time without leaking sensitive information. This approach helps developers minimize hallucinations, reduce operational risk, and build trust in AI by delivering reliable, relevant outputs.
48
49![Comparison of AI without contextual data and with data-grounded AI](/img/blog/2024/announcing-1.0-stable/data-grounded-ai.png)
50
51**How does Spice.ai OSS solve data-grounding?**
52
53With Spice, models always have access to materializations of low-latency, real-time data for near-instant retrieval, minimizing data movement while enabling AI feedback so apps and agents can learn and adapt over time. For example, you can join customer records from PostgreSQL with sales data in Snowflake and logs stored in S3—all with a single SQL query or LLM function call.
54
55![Secure Compute Engine for Data-grounded AI](/img/blog/2024/announcing-1.0-stable/secure-compute-engine.png)
56
57Spice includes an advanced suite of LLM tools including vector and hybrid search, text-to-SQL, SQL query and retrieval, data sampling, and context formatting—all purpose-built for accurate outputs.
58
59The latest research is continually incorporated so that teams can focus on business objectives rather than trying to keep up with the incredibly fast-moving and often overwhelming space of AI.
60
61## Spice.ai OSS: The engine that makes AI work
62
63Spice.ai OSS is a lightweight, portable runtime (single ~140 MB binary) with the capabilities of a high-speed cloud data warehouse built into a self-hostable AI inference engine, all in a single, run-anywhere package.
64
65It's designed to be distributed and integrated at the application level, rather than being a bulky, centralized system to manage, and is often deployed as a sidecar. Whether running one Spice instance per service or one for each customer, Spice is flexible enough to fit your application architecture.
66
67Apps and agents integrate with Spice.ai OSS via three industry-standard APIs, so that it can be adopted incrementally with minimal changes to applications.
68
691. **SQL Query APIs**: HTTP, Arrow Flight, Arrow Flight SQL, ODBC, JDBC, and ADBC.
70
712. **OpenAI-Compatible APIs**: HTTP APIs compatible with the OpenAI SDK, AI SDK with local model serving (CUDA/Metal accelerated), and gateway to hosted models.
72
733. **Iceberg Catalog REST APIs**: A unified Iceberg Catalog REST API.
74
75![Spice.ai OSS architecture](/img/blog/2024/announcing-1.0-stable/inside-spice.ai-oss.png)
76
77## Key features of Spice.ai OSS include:
78
79- **Federated SQL Query Across Data Sources**: Perform SQL queries across disparate data sources with over 25 open-source data connectors, including catalogs (Unity Catalog, Iceberg Catalog, etc), databases (PostgreSQL, MySQL, etc.), data warehouses (Snowflake, Databricks, etc.), and data lakes (e.g., S3, ABFS, MinIO, etc.).
80
81- **Data Materialization and Acceleration**: Locally materialize and accelerate data using Arrow, DuckDB, SQLite, and PostgreSQL, enabling low-latency and high-speed transactional and analytical queries. Data can be ingested via Change-Data-Capture (CDC) using Debezium, Catalog integrations, on an interval, or by trigger.
82
83- **AI Inference, Gateway, and LLM toolset**: Load and serve models like Llama3 locally, or use Spice as a gateway to hosted AI platforms including OpenAI, Anthropic, xAI, and NVidia NIM. Automatically use a purpose-built LLM toolset for data-grounded AI.
84
85- **Enterprise Search and Retrieval**: Advanced search capabilities for LLM applications, including vector-based similarity search and hybrid search across structured and unstructured data. Real-time retrieval grounds AI applications in dynamic, contextually relevant information, enabling state-of-the-art RAG.
86
87- **LLM Memory**: Enable long-term memory for LLMs by efficiently storing, retrieving, and updating context across interactions. Support real-time contextual continuity and grounding for applications that require persistent and evolving understanding.
88
89- **LLM Evaluations**: Test and boost model reliability and accuracy with integrated LLM-powered evaluation tools to assess and refine AI outputs against business objectives and user expectations.
90
91- **Monitoring and Observability**: Ensure operational excellence with telemetry, distributed tracing, query/task history, and metrics, that provide end-to-end visibility into data flows and model performance in production.
92
93- **Deploy Anywhere; Edge-to-Cloud Flexibility**: Deploy Spice as a standalone instance, Kubernetes sidecar, microservice, or scalable cluster, with the flexibility to run distributed across edge, on-premises, or any cloud environment. Spice AI offers managed, cloud-hosted deployments of Spice.ai OSS through the Spice Cloud Platform (SCP).
94
95## Real-world use-cases
96
97Spice delivers data readiness for teams like Twilio and Barracuda, and accelerates time-to-market of data-grounded AI, such as with developers on GitHub and at NRC Health.
98
99Here are some examples of how Spice.ai OSS solves real problems for these teams.
100
101---
102
103![Twilio Logo](/svg/twilio.svg)
104
105### CDN for Databases — Twilio
106
107![Twilio datalake and database acceleration using Spice.ai Open Source](/img/blog/2024/announcing-1.0-stable/use-case-twilio.png)
108
109A core requirement for many applications is consistently fast data access, with or without AI. Twilio uses Spice.ai OSS as a data acceleration framework or [Database CDN](https://materializedview.io/p/building-a-cdn-for-databases-spice-ai), staging data in object-storage that's accelerated with Spice for sub-second query to improve the reliability of critical services in its messaging pipelines. Before Spice, a database outage could result in a service outage.
110
111<Quote name="Peter Janovsky" title="Software Architect" company="Twilio" imageUrl="/img/blog/2024/announcing-1.0-stable/peter-janovsky.jpeg">
112 "Spice opened the door to take these critical control-plane datasets and move them next to our services in the runtime path."
113</Quote>
114
115With Spice, Twilio has achieved:
116
117- **Significantly Improved Query Performance**: Used Spice to co-locate control-plane data in the messaging runtime, accelerated with DuckDB, to send messages with a P99 query time of < 5ms.
118
119- **Low-Latency Multi-Tenancy Controls**: Spice is integrated into the message-sending runtime to manage multi-tenancy data controls. Before, data changes required manual triggers and took hours to propagate. Now, they update automatically and reach the messaging front door within five minutes via a resilient data-availability framework.
120
121- **Mission-Critical Reliability**: Reduced reliance on queries to databases by using Spice to accelerate data in-memory locally, with automatic failover to query data directly from S3, ensuring uninterrupted service even during database downtime.
122
123<Quote name="David Blum" title="Principal Software Engineer" company="Twilio" imageUrl="/img/blog/2024/announcing-1.0-stable/david-blum.jpeg">
124 "With a simple drop in container, we are able to double our data redundancy by using Spice."
125</Quote>
126
127By adopting Spice.ai OSS, Twilio strengthened its infrastructure, ensuring reliable services for customers and scalable data access across its growing platform.
128
129---
130
131![Barracuda Logo](/svg/barracuda.svg)
132
133### Datalake Accelerator — Barracuda
134
135![Barracuda Delta Lake acceleration using Spice.ai Open Source](/img/blog/2024/announcing-1.0-stable/use-case-barracuda.png)
136
137Barracuda uses Spice.ai OSS to modernize data access for their email archiving and audit log systems, solving two big problems: slow query performance and costly queries. Before Spice, customers experienced frustrating delays of up to two minutes when searching email archives, due to the data volume being queried.
138
139<Quote name="David Stancu" title="Senior Principal Software Engineer" company="Barracuda" imageUrl="/img/blog/2024/announcing-1.0-stable/david-stancu.jpeg">
140 "It's just a huge gain in responsiveness for the customer."
141</Quote>
142
143With Spice, Barracuda has achieved:
144
145- **100x Query Performance Improvement**: Accelerated email archive queries from a P99 time of 2 minutes to 100-200 milliseconds.
146
147- **Efficient Audit Logs**: Offloaded audit logs to Parquet files in S3, queried directly by Spice.
148
149- **Mission-Critical Reliability**: Reduced load on Cassandra, improving overall infrastructure stability.
150
151- **Significant Cost Reduction**: Replaced expensive Databricks Spark queries, significantly cutting expenses while improving performance.
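
For the audit-log offload, a dataset declaration along these lines would let Spice query the Parquet files in S3 directly. The bucket path and dataset name are illustrative, not Barracuda's actual configuration:

```yaml
datasets:
  # Queried from S3 on demand — no acceleration block, so there is no
  # local copy to keep fresh and the source database stays out of the
  # read path for audit queries.
  - from: s3://my-bucket/audit-logs/ # illustrative location
    name: audit_logs
    params:
      file_format: parquet
```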
152
153<Quote name="Darin Douglass" title="Principal Software Engineer" company="Barracuda" imageUrl="/img/blog/2024/announcing-1.0-stable/darin-douglass.jpeg">
  "It just kinda spins up and it just works, which is really nice."
155</Quote>
156
157---
158
159![NRC Health Logo](/svg/nrc-health.svg)
160
161### Data-Grounded AI apps and agents — NRC Health
162
163![NRC Health data-grounded AI using Spice.ai Open Source](/img/blog/2024/announcing-1.0-stable/use-case-nrc.png)
164
NRC Health uses Spice.ai OSS to simplify and accelerate the development of data-grounded AI features, unifying data from multiple platforms, including MySQL, SharePoint, and Salesforce, into secure, AI-ready data. Before Spice, scaling AI expertise across the organization to build complex RAG-based scenarios was a challenge.
166
167<Quote name="Dustin Warner" title="Director of Software Engineering" company="NRC Health" imageUrl="/img/blog/2024/announcing-1.0-stable/dustin-warner.jpeg">
168 "What I like the most about Spice, it's very easy to collect data from different data sources and I am able to chat with this data and do everything in one place."
169</Quote>
170
171With Spice OSS, NRC Health has achieved:
172
173- **Developer Productivity**: Partnered with Spice in three company-wide AI hackathons to build complete end-to-end data-grounded AI features in hours instead of weeks or months.
174
- **Accelerated Time-to-Market**: Centralized data integration and AI model serving into a single enterprise-ready service, accelerating time to market.
176
177<Quote name="Taher Ahmed" title="Software Engineering Manager" company="NRC Health" imageUrl="/img/blog/2024/announcing-1.0-stable/taher-ahmed.jpeg">
178 "I explored AI, embeddings, search algorithms, and features with our own database. I read a lot about this, but it was so much easier to use Spice than doing it from scratch."
179</Quote>
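
In spicepod terms, declaring a chat model and an embedding model alongside the datasets looks roughly like the following sketch. The connector, model choices, and secret name are illustrative assumptions, not NRC Health's configuration — check the current docs for exact parameters:

```yaml
# Hypothetical fragment: a dataset, a chat model, and an embedding model
# declared side by side so retrieval and inference run in one engine.
datasets:
  - from: mysql:surveys # illustrative MySQL table reference
    name: surveys

models:
  - from: openai:gpt-4o-mini # served via the OpenAI-compatible API
    name: assistant
    params:
      openai_api_key: ${ secrets:SPICE_OPENAI_API_KEY }

embeddings:
  - from: openai:text-embedding-3-small
    name: embed
```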
180
181---
182
183### Data-Grounded AI Software Development — Spice.ai GitHub Copilot Extension
184
185When using tools like GitHub Copilot, developers often face the hassle of switching between multiple environments to get the data they need.
186
187<div style={{display: 'flex', justifyContent: 'center', marginBottom: '15px'}}>
188 <ReactPlayer
189 controls
190 url='https://www.youtube.com/watch?v=A0QdHVUKfAk'
191 />
192</div>
193
The [Spice.ai for GitHub Copilot Extension](https://github.com/marketplace/spice-ai-for-github-copilot), built on Spice.ai OSS, gives developers the ability to connect data from external sources to Copilot, grounding Copilot in relevant data not generally available in GitHub, such as test data stored in a development database.
195
196Developers can simply type **`@spiceai`** to interact with connected data, with relevant answers now surfaced directly in Copilot Chat, significantly improving productivity.
197
198## Why choose Spice.ai OSS?
199
200Adopting Spice.ai OSS addresses real challenges in modern AI development: it grounds models in accurate, domain-specific, real-time data. With Spice, engineering teams can focus on what matters—delivering innovative, accurate, AI-powered applications and agents that work. Additionally, Spice.ai OSS is open-source under Apache 2.0, ensuring transparency and extensibility so your organization remains free to innovate without vendor lock-in.
201
202## Get started in 30 seconds
203
You can install Spice.ai OSS in less than a minute on macOS, Linux, and Windows:
205
206<Tabs>
207 <TabItem value="default" label="macOS, Linux, and WSL" default>
208 ```bash
209 curl https://install.spiceai.org | /bin/bash
210 ```
211
212 Or using `brew`:
213
214 ```bash
215 brew install spiceai/spiceai/spice
216 ```
217
218 </TabItem>
  <TabItem value="windows" label="Windows">
220 ```bash
221 curl -L "https://install.spiceai.org/Install.ps1" -o Install.ps1 && PowerShell -ExecutionPolicy Bypass -File ./Install.ps1
222 ```
223 </TabItem>
224</Tabs>
225
226Once installed, follow the [Getting Started with Spice.ai guide](/docs/getting-started) to ground OpenAI chat with data from S3 in less than 2 minutes.
227
228## Looking ahead
229
230The 1.0-stable release of Spice.ai OSS marks a major step toward accurate AI for developers. By combining data, AI, and retrieval into a unified runtime, Spice anchors AI in relevant, real-time data—helping you build apps and agents that work.
231
232A cloud-hosted, fully managed Spice.ai OSS service is available in the Spice Cloud Platform. It’s SOC 2 Type II compliant and makes it easy to operate Spice deployments.
233
Beyond apps and agents, the vision for Spice is to be the best digital labor platform for building autonomous AI employees and teams. These are exciting times! Stay tuned for upcoming announcements later in 2025!
235
236The Spice AI Team
237
238## Learn more
239
240- **[Cookbook](/cookbook)**: 47+ samples and examples using Spice.ai OSS
241- **[Documentation](/docs)**: Learn about features, use cases, and advanced configurations
242- **[X](https://x.com/spice_ai)**: Follow @spice_ai on X for news and updates
243- **[Discord](https://discord.gg/kZnTfneP5u)**: Connect with the team and the community
244- **[GitHub](https://github.com/spiceai/spiceai)**: Star the repo, contribute, and raise issues