Spice Cloud Platform

Spice Cloud v1.8.0: Iceberg Write Support, Acceleration Snapshots & More

Wyatt Wenzel
Product Marketing Lead
October 8, 2025

Spice Cloud & Spice.ai Enterprise 1.8.0 are live! v1.8.0 includes Iceberg write support, acceleration snapshots, partitioned S3 Vector indexes, a new AI SQL function for LLM integration, and an updated Spice.js SDK.

v1.8.0 also introduces developer experience upgrades, including a redesigned Spice Cloud dashboard with tabbed navigation:

Figure 1: Switch between datasets, queries, and models without losing context.

Spice Cloud customers are automatically upgraded to v1.8.0 on deployment, while Spice.ai Enterprise customers can pull the Enterprise v1.8.0 image from the Spice AWS Marketplace listing.

What’s New in v1.8.0

Iceberg Write Support (Preview)

Spice now supports writing to Apache Iceberg tables using standard SQL INSERT INTO statements. This greatly simplifies creating and updating Iceberg datasets in the Spice runtime, letting teams write to open table data directly with SQL instead of relying on third-party tools. Get started with Iceberg writes in Spice here.

Example queries:

-- Insert from another table
INSERT INTO iceberg_table
SELECT * FROM existing_table;

-- Insert with values
INSERT INTO iceberg_table (id, name, amount)
VALUES (1, 'John', 100.0), (2, 'Jane', 200.0);

-- Insert into catalog table
INSERT INTO ice.sales.transactions
VALUES (1001, '2025-01-15', 299.99, 'completed');

Acceleration Snapshots (Preview)

A new snapshotting system enables datasets accelerated with file-based engines (DuckDB or SQLite) to bootstrap from stored snapshots in object storage like S3 - significantly reducing cold-start latency and simplifying distributed deployments. Learn more.

Partitioned Amazon S3 Vector Indexes

Vector search at scale is now faster and more efficient with partitioned Amazon S3 Vector indexes - ideal for large-scale semantic search, recommendation systems, and embedding-based applications. Learn more.
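As a purely illustrative sketch, a vector search against a partitioned index could be issued from SQL along these lines. The vector_search table function, its arguments, and the product_reviews dataset are assumptions for this example rather than confirmed syntax; see the Spice documentation linked above for the exact API.

-- Hypothetical example: top 10 semantic matches from a partitioned S3 Vector index
SELECT id, review_text, score
FROM vector_search(product_reviews, 'battery life on long flights')
ORDER BY score DESC
LIMIT 10;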

AI SQL Function (Preview)

A new asynchronous ai SQL function enables developers to call large language models (LLMs) directly from SQL, making it possible to integrate LLM inference into federated or analytical workflows without additional services. Learn more.
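As a minimal sketch, the function can be used like any other SQL expression. The single-argument prompt form and the support_tickets table below are illustrative assumptions; consult the documentation linked above for the supported arguments, such as model selection.

-- Hypothetical example: classify ticket sentiment inline with the ai() function
SELECT ticket_id,
       ai('Classify the sentiment of this support ticket as positive, negative, or neutral: ' || body) AS sentiment
FROM support_tickets
LIMIT 10;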

Spice.js v3.0.3 SDK

v3.0.3 brings improved reliability and broader platform support. Highlights include new query methods, automatic transport fallback between gRPC and HTTP, and built-in health checks and dataset refresh controls. Learn more.

Bug & Stability Fixes

v1.8.0 also includes numerous fixes and improvements:

  • Reliability: Improved logging, error handling, and network readiness checks across connectors (Iceberg, Databricks, etc.).
  • Vector search durability and scale: Refined logging, stricter default limits, safeguards against index-only scans and duplicate results, and always-accessible metadata for robust queryability at scale.
  • Cache behavior: Tightened cache logic for modification queries.
  • Full-Text Search: FTS metadata columns now usable in projections.
  • RRF Hybrid Search: Reciprocal Rank Fusion (RRF) UDTF enhancements for advanced hybrid search scenarios.

For more on v1.8.0, check out the full release notes.

v1.8 Release Community Call

Join us on Thursday, October 16th for live demos of the new functionality delivered in v1.8! Register here.

Figure 2: October 16th, v1.8 Release Community Call


Work with Spice AI

Interested in working with Spice AI or looking to learn a little more about the work we do? We are always looking for our next big challenge. Book an introductory call via our Calendly. Take a deeper look at our enterprise offerings by visiting Spice.ai.

