Snowflake Summit 2025: AI, Acceleration, and a Lot of Inspiration


The message at Snowflake Summit 2025 was clear: the future of data is here, and it’s infused with AI.

Between keynotes packed with announcements (complete with a live orchestra) and more casual get-togethers, this year’s Summit felt like a signal that the data and AI world is officially leveling up.

Snowflake Summit 2025 was a whirlwind of data innovation and AI buzz, and I couldn’t be more excited to share the highlights. I spent the first week of June immersed in Snowflake’s annual Summit, and the company did not disappoint. In a single event, they unveiled a staggering range of new capabilities spanning AI, data engineering, cloud infrastructure, and collaboration. In this blog post, I’ll walk through the major announcements from Summit 2025 and reflect on what they mean for enterprises and data-driven teams.

The AI-First Data Cloud Vision

Snowflake’s vision this year can be summed up in one word: AI. Nearly every announcement was touched by artificial intelligence in some way – from new AI features baked into the platform to partnerships that bring AI to the forefront. The keynote theme was about making AI “easy, connected, and trusted” on top of your data. Here are the standout AI-centric launches:

  • Snowflake Intelligence (Public Preview) – Think of this as a secure, built-in AI assistant for your data. It allows anyone in your organization to converse with data in natural language via a chat interface (at ai.snowflake.com) and get immediate insights, even charts and visual answers. No SQL or BI tool needed: just ask, “Why did sales dip in Q3?” and get an answer with full context and sources. What makes Snowflake’s approach unique is that it runs within Snowflake’s governed environment, so all your data security and privacy rules still apply. Under the hood, it’s powered by large language models from OpenAI and Anthropic running inside Snowflake, ensuring AI is delivered with enterprise-grade security and trust. As a data leader, I find this hugely promising – it’s AI that connects to our single source of truth rather than living in a silo.

  • Snowflake Cortex “Agentic” AI Platform – Snowflake is embracing the concept of agentic AI, meaning AI that can take actions, not just answer questions. They introduced Cortex Agents (coming in preview), which let developers build custom AI agents that orchestrate workflows across your data and apps. These agents can be integrated into things like Slack, Teams, or custom applications to automate multi-step tasks. For example, you could have an agent that detects anomalies in data, sends alerts, or kicks off pipeline jobs – all guided by AI. It’s like having smart co-workers (AI interns, as Sam Altman quipped during the keynote) embedded in your data stack. This signals a future where AI isn’t just an analysis tool, but an active participant in business processes.

  • Cortex AISQL (Public Preview) – In one of my favorite announcements, Snowflake is bringing AI into SQL itself. AISQL lets you call AI functions from within SQL queries. This means you can, say, extract sentiment from text, label an image, or summarize a document using a simple SQL function, with the heavy AI lifting done behind the scenes. For analysts and engineers, it collapses complex AI tasks into the familiar SQL interface. The Summit demo showed how Document AI can pull structured tables out of PDFs and how Cortex Search can vector-scan documents – all invoked via SQL. It’s a powerful concept: your database not only stores data but can understand and analyze unstructured data on the fly. This will open up AI capabilities to anyone comfortable with SQL, no special ML frameworks required.
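To make that concrete, here’s a hedged sketch of what AI-in-SQL looks like. The table and column names are hypothetical, and while functions like `SNOWFLAKE.CORTEX.SENTIMENT` and `SNOWFLAKE.CORTEX.SUMMARIZE` already exist in Cortex, treat the exact AISQL surface (e.g. `AI_CLASSIFY`) as preview syntax that may shift:

```sql
-- Hypothetical table/columns; AISQL function names per the preview announcement.
SELECT
    review_id,
    SNOWFLAKE.CORTEX.SENTIMENT(review_text)  AS sentiment_score,  -- numeric sentiment
    SNOWFLAKE.CORTEX.SUMMARIZE(review_text)  AS summary,          -- LLM-generated summary
    AI_CLASSIFY(review_text,
                ['complaint', 'praise', 'question']) AS category  -- preview AISQL function
FROM customer_reviews
WHERE review_date >= DATEADD('day', -30, CURRENT_DATE());
```

The appeal is that the LLM calls behave like ordinary scalar functions, so they compose with joins, filters, and aggregations you already write.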

  • Data Science Automation & Observability – Snowflake also unveiled tools aimed at data scientists and ML engineers. A new Data Science Agent acts as an AI-powered assistant to auto-generate pipelines, prep data, and even train models with minimal code. For organizations struggling to implement machine learning due to talent gaps, this could be a game-changer – it’s like giving every data scientist a junior co-pilot to speed up experimentation. Additionally, AI observability features are being built into Snowflake’s Cortex framework, meaning every LLM (large language model) call and generative AI action can be monitored and audited with no custom tooling. This focus on “trust” and traceability in AI was a running theme; as I often say, in enterprise AI, transparency is everything. Snowflake gets that – their AI components even provide citations for answers by default, so you can trace conclusions back to data sources. This commitment to governed, explainable AI is crucial for industry adoption.

By tightly coupling data and AI in a secure environment, Snowflake signals a future where AI is ubiquitous in analytics but also accountable. As a data consultant, I see huge opportunities here: businesses will be able to harness advanced AI without needing a fleet of PhDs or a patchwork of external tools – it’s all in one place, with guardrails. That’s a vision I can get behind.

Unifying Data Engineering and Integration

Another major focus at Summit 2025 was solving the less-glamorous but absolutely critical challenges of data engineering and interoperability. Snowflake knows that fancy AI means nothing if your data is stuck in silos or pipelines break. Some big announcements in this area aim to bridge data silos and streamline pipelines:

  • Snowflake Openflow (GA on AWS) – This one caused a lot of buzz on the floor. Openflow is Snowflake’s new managed data ingestion and integration service, powered under the hood by Apache NiFi. Essentially, Snowflake is stepping into the ETL/ELT arena by providing an open, extensible way to move any data into Snowflake (and out). It comes with hundreds of pre-built connectors and processors, making it easier to ingest streaming data, files, databases – you name it – without the usual custom coding or third-party ETL tools. One eyebrow-raising detail: Snowflake even announced a partnership with Oracle to enable change-data-capture (CDC) replication from Oracle databases straight into Snowflake. As someone who’s wrestled with plenty of brittle data pipelines, this is exciting. The vibe I got from attendees: data movement is now seen as “essential infrastructure” in the age of AI, not an afterthought. By eliminating data silos and manual pipeline work, Openflow aims to free up data engineers to focus on higher-value tasks. It’s a strategic move – Snowflake wants to be the central circulatory system for data in an organization.

  • Native dbt in Snowflake – For all the analytics engineers out there, this is big news. Snowflake is bringing dbt (data build tool) projects directly into its UI (Snowsight) with a new Snowflake Workspaces feature. In practice, this means you can develop, run, and monitor your dbt data transformation pipelines inside Snowflake’s environment, with built-in version control (Git integration) and even an AI SQL copilot to assist with code suggestions. No more context switching between development environments – it’s all unified. This reflects a broader trend: the lines between data warehousing and data transformation tooling are blurring. By natively supporting dbt, Snowflake is acknowledging that its platform can be the end-to-end home for data engineering workflows. I’m thrilled about this because it reduces friction for teams adopting analytics engineering best practices. If your team uses dbt for ELT, Snowflake just made your life easier.
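For readers newer to dbt, the models you’d now author inside Snowflake Workspaces are just SQL files with a thin layer of Jinja. A minimal sketch (model and column names are illustrative, not from the announcement):

```sql
-- models/orders_enriched.sql — a minimal dbt model; names are illustrative.
-- In Snowflake Workspaces this file lives in a dbt project and runs via the built-in runner.
{{ config(materialized='incremental', unique_key='order_id') }}

select
    o.order_id,
    o.customer_id,
    o.order_total,
    o.updated_at,
    c.segment
from {{ ref('stg_orders') }} as o
left join {{ ref('stg_customers') }} as c
    on o.customer_id = c.customer_id
{% if is_incremental() %}
-- On incremental runs, only process rows newer than what's already materialized.
where o.updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

The `ref()` calls give you lineage and dependency ordering for free, which is exactly the workflow Snowflake is now hosting natively.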

  • Open Table Formats & Interoperability – Snowflake historically was somewhat of a walled garden, but that’s changing fast. At Summit, they doubled down on open format support, particularly Apache Iceberg. New enhancements will let you read and write Iceberg tables directly from Snowflake using external catalogs. In fact, they announced Snowflake Open Catalog as a way to govern data across many engines and storage locations (a nod to those with hybrid architectures). You can even create “Catalog Linked Databases” in Snowflake to query Iceberg data stored elsewhere, and use Dynamic Tables (Snowflake’s take on materialized views) on top of Iceberg data. Plus, performance features like Merge on Read are coming to optimize how Snowflake interacts with these tables. The takeaway: Snowflake doesn’t want to force everyone into one format or one ecosystem. They see value in playing nice with others – whether that’s data lakes on S3 or other query engines. In fact, Snowflake’s tagline could be “no data left behind”: wherever your data lives, Snowflake wants to interconnect with it. This open approach is strategic; it acknowledges that enterprises have diverse data landscapes (and it’s a subtle answer to competitors championing lakehouse architectures). For data teams, it means more flexibility and fewer barriers when unifying data for analysis.
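To ground this, here’s what a Snowflake-managed Iceberg table looks like in practice. The external volume, location, and table names below are placeholders, but the DDL shape follows Snowflake’s existing Iceberg support:

```sql
-- Sketch of a Snowflake-managed Iceberg table; volume and names are placeholders.
CREATE ICEBERG TABLE sales_iceberg (
    sale_id  BIGINT,
    amount   DECIMAL(12, 2),
    sold_at  TIMESTAMP
)
    CATALOG = 'SNOWFLAKE'              -- Snowflake maintains the Iceberg metadata
    EXTERNAL_VOLUME = 'my_s3_volume'   -- data files land in your own cloud storage
    BASE_LOCATION = 'sales/';

-- Query it like any other table; other engines can read the same files via the catalog.
SELECT DATE_TRUNC('month', sold_at) AS month, SUM(amount) AS revenue
FROM sales_iceberg
GROUP BY 1;
```

The point is that the data files stay in open Parquet/Iceberg format in your bucket, so Spark, Trino, or another engine can read them without an export step.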

  • DevOps & Pipeline Productivity – A few other engineering-focused updates worth noting: Snowflake’s Terraform provider for infrastructure-as-code is now generally available, making it easier to manage Snowflake resources with DevOps practices. They also added support for custom Git repository URLs (so you’re not locked into one Git provider) and updated Snowflake Notebooks to support Python 3.9. These might seem minor, but as an architect I appreciate the continual improvements to make Snowflake more developer-friendly. It shows Snowflake is listening to the needs of modern data teams who treat pipelines and data infrastructure as code.

All these integration and engineering enhancements signal Snowflake’s push to be the unified data backbone for enterprises. The easier it is to get data into Snowflake and work with it in open formats, the more valuable the platform becomes. For companies, this means less time wrestling with plumbing and more time delivering insights. As I often tell my clients: the speed of data to insight is a competitive differentiator. Snowflake’s making moves to remove speed bumps on that road.

Performance and Automation in the Cloud Warehouse

No Summit would be complete without some good old-fashioned improvements in performance and infrastructure. Snowflake introduced new features that make its cloud data platform faster, more efficient, and more “auto-pilot” than before:

  • Standard Warehouse Gen 2 – Snowflake’s compute warehouses are getting a serious upgrade. The Generation 2 standard warehouse comes with new hardware and software optimizations that reportedly deliver 2.1× faster performance for typical analytics workloads. They specifically noted improved speed for large table scans and DML operations like DELETE, UPDATE, MERGE. In the past year, Snowflake claims these upgrades already doubled query performance for customers. What does this mean in practice? Potentially lower query times and/or lower costs (since faster queries use fewer credits). If you’ve ever sat waiting on an important report or dashboard to load, you know faster is always better. Snowflake is basically saying: we’ve turbo-charged the engine under the hood. For enterprises, this helps ensure the platform can keep up with growing data volumes and user demands without breaking the bank.
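Opting in is intended to be a small DDL change rather than a migration. A hedged sketch, based on the announced `RESOURCE_CONSTRAINT` parameter (availability and exact syntax may vary by region and cloud):

```sql
-- Hedged sketch: creating a Gen2 warehouse via the announced parameter.
CREATE WAREHOUSE analytics_wh
    WAREHOUSE_SIZE = 'MEDIUM'
    RESOURCE_CONSTRAINT = 'STANDARD_GEN_2';

-- Existing warehouses can reportedly be switched in place:
ALTER WAREHOUSE analytics_wh SET RESOURCE_CONSTRAINT = 'STANDARD_GEN_2';
```

Worth noting: Gen2 credits are priced differently, so the 2.1× speedup claim should be weighed against the per-credit rate for your workloads.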

  • Adaptive Compute (Private Preview) – This was one of those “wow” announcements for infrastructure nerds like me. Snowflake Adaptive Compute is a new automated resource management feature that will auto-scale and auto-tune your compute warehouses on the fly. The idea is to eliminate manual tweaking of warehouse sizes. Instead, Snowflake will intelligently scale resources up or down and even route queries to the appropriately sized compute cluster to handle the workload efficiently. They’re calling the result Adaptive Warehouses, and the goal is to maximize performance and cost-efficiency without the user needing to manage anything. As the Snowflake team put it on stage, this service “lowers the burden of resource management” for customers. To me, this signals a future where the cloud data platform becomes truly serverless in experience – you don’t worry about what size warehouse or how many nodes; you just run queries and the platform invisibly handles it optimally. If done right, this could yield significant cost savings (no more paying for over-provisioned warehouses or suffering slow queries because of under-provisioning). It’s Snowflake’s answer to similar auto-scaling capabilities in cloud data rivals, and it plays into the narrative of making the infrastructure disappear so data teams can focus on data, not tuning knobs. Keep an eye on this one as it matures; it could change how we think about capacity planning in data platforms.

  • FinOps and Cost Control Tools – Although not a single headline feature, Snowflake sprinkled in mentions of better cost oversight tools. For example, Snowflake Marketplace now offers “Offers” – allowing providers to offer usage-based pricing deals to customers, which could help companies optimize spend on third-party data/products. Snowflake’s Trust Center and admin console are also getting enhancements for usage tracking and budget enforcement (they even mentioned features like alerting if an AI model is burning too many credits). This focus on FinOps (financial operations) is very welcome. Many of my conversations with tech execs nowadays revolve around controlling cloud data costs. Snowflake seems to recognize that and is providing more levers to manage cost and performance in tandem. The Gen2 warehouse and Adaptive Compute are part of that efficiency story – more bang for your buck. As a result, I suspect Snowflake’s TCO (total cost of ownership) will improve for many workloads, which could make CFOs as happy as the CTOs.

  • Security and Governance Upgrades – In the realm of cloud infrastructure, security is paramount. Snowflake announced that they are deprecating password-based logins entirely in favor of more secure methods like federated auth and passkeys. “We will force you to be secure,” Snowflake’s product lead quipped on stage, which got a chuckle from the crowd – but it’s the right move. Weak passwords are an enemy of any cloud deployment. Additionally, a new Horizon Trust Center is providing admins with a unified view of security posture, and Horizon Catalog Copilot (in preview) will let you use natural language to ask questions about data governance and security policies. Imagine typing “Show me all datasets with sensitive PHI data shared externally” and getting an immediate answer. That’s the kind of AI-assisted governance that can save a data governance team hours of digging. By baking security and compliance into the fabric of the platform (and even making it conversational with AI), Snowflake is emphasizing that their Data Cloud is enterprise-ready at the highest standard. For data teams in regulated industries, this is a reassuring direction.

In summary, Snowflake’s infrastructure improvements are about speed, efficiency, and confidence. They’re making the platform faster and smarter on its own, while giving us better tools to manage it responsibly. This ensures that as we scale our data usage, we don’t hit performance bottlenecks or governance nightmares. Instead, we get a cloud data platform that can dynamically adapt to workload needs and keep our data secure by default. It’s the kind of behind-the-scenes innovation that may not grab as many headlines as AI, but absolutely enables those headline-grabbing use cases.

Collaboration, Apps, and the New Data Ecosystem

Snowflake wasn’t only talking about internal platform features – a big part of Summit was about the growing ecosystem around Snowflake, including data sharing, third-party apps, and partnerships. There’s a clear strategic direction: Snowflake aims to be not just a data warehouse, but a data application platform and marketplace. Some key developments:

  • Semantic Layers & Business Collaboration – One announcement that really struck a chord with me was Snowflake Semantic Views (now in preview). This essentially lets data teams define business metrics and relationships centrally in Snowflake. Think of it as a unified semantic layer inside the data cloud – you can define, for example, “annual recurring revenue” once and have both your BI tools and AI tools understand that definition. By sharing these semantic models, different departments or even partner organizations can be sure they’re talking about the same numbers. Snowflake is even allowing these semantic models to be shared across organizations (with proper governance) so that, for instance, a company and its supplier could share a common data model for reporting. This is huge for collaboration. In my experience, one of the biggest pain points in data-driven orgs is inconsistency in metrics – everyone has their own calculations. A semantic layer in Snowflake could standardize this and boost trust in data. It also aids AI interpretability; those Snowflake Intelligence chatbots can give better answers if they’re grounded in well-defined business concepts. As Snowflake puts it, the future of AI starts with accessible, structured context – and semantic models provide exactly that context.
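To illustrate the idea (and only the idea – this is a rough sketch, and the preview’s actual DDL will likely differ in its details), a semantic view pins a metric’s definition to one place:

```sql
-- Illustrative sketch only; the preview's exact semantic-view DDL may differ.
-- The point: "annual_recurring_revenue" is defined once, centrally.
CREATE SEMANTIC VIEW finance_model
    TABLES (
        subscriptions AS analytics.finance.subscriptions
            PRIMARY KEY (subscription_id)
    )
    DIMENSIONS (
        subscriptions.region AS region
    )
    METRICS (
        subscriptions.annual_recurring_revenue AS SUM(monthly_fee) * 12
    )
    COMMENT = 'Shared definitions for BI tools and Snowflake Intelligence';
```

Once a BI dashboard and a Snowflake Intelligence chat both resolve “ARR” to this one definition, the classic “whose number is right?” argument disappears.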

  • Snowflake Marketplace & Native Apps – Snowflake’s Marketplace (their “app store” for data and apps) got a lot of love at Summit. Two notable additions: Agentic Apps and Cortex Knowledge Extensions are now first-class citizens on the Marketplace. This means third-party developers can offer Snowflake Native Apps that come with built-in AI/agent capabilities – for example, a finance data app that uses an LLM to answer natural language queries on financial data, installable directly into your Snowflake account. Snowflake has enhanced the Native App Framework to support better versioning, usage tracking, and even a compliance badge to indicate an app meets security standards. In short, they’re smoothing the path for a new wave of enterprise AI apps that run inside the data cloud. Meanwhile, Cortex Knowledge Extensions allow AI apps to pull in fresh content from external sources (like licensed news data, research, Stack Overflow Q&A, etc.) in a governed way. Imagine an AI app in Snowflake that can answer questions using both your private data and, say, the latest news articles – but it does so with proper attribution and without exposing more than it should. That’s what Knowledge Extensions facilitate: retrieval augmented generation with guardrails. For data teams, the Marketplace developments signal that Snowflake is becoming a platform to build and distribute data-driven applications, not just share datasets. In fact, at Summit’s developer day, I saw early-stage companies pitching apps built directly on Snowflake – AI tools, security automation, you name it. Snowflake is actively creating space for others to build on its foundation, much like how Salesforce’s ecosystem took off. This opens up an opportunity for enterprises to either build their own internal apps on Snowflake or leverage third-party solutions that can be deployed with a click. I find this extremely forward-looking: our data cloud is turning into an app platform where we don’t have to export data to use external innovations – the innovations come to the data, inside Snowflake’s secure perimeter.

  • Data Sharing & Clean Rooms – Snowflake also continues to strengthen data sharing capabilities. They highlighted improvements to Clean Rooms (secure data sharing environments for multiple parties) with better privacy features and an Egress Cost Optimizer to minimize cloud egress fees when sharing data across regions or clouds. This is a nod to the reality that data collaborations often span organizations and infrastructure – Snowflake wants to make those as frictionless as possible. Additionally, the Horizon Catalog (Snowflake’s catalog and governance layer) is expanding to index external data sources – not just stuff inside Snowflake, but also data in other clouds, BI dashboards, and more. That means you could potentially discover and govern data beyond Snowflake from one place, a big plus for enterprises with hybrid environments.

The overarching theme here is connectivity – connecting people with data, connecting different data systems, and connecting third-party innovation with first-party data. Snowflake sees that a modern data strategy isn’t confined within one company’s four walls. It’s an ecosystem play now: your customers, partners, and providers are all part of a connected data network. These announcements (semantic models, marketplace apps, enhanced sharing) show Snowflake positioning itself as the hub of that hub-and-spoke model. For forward-thinking enterprises, this is an invitation to participate in a larger data economy – sharing data products, monetizing data assets, and consuming external data/services more easily. For data teams, it means your scope of impact can broaden; you’re not just building for internal stakeholders anymore, you could be building data products that live on a marketplace or integrating external knowledge to enrich your analytics. It’s an exciting direction that blurs the line between data platform and application platform.

Strategic Partnerships and Industry Moves

Snowflake Summit 2025 wasn’t just product features – there were a few strategic announcements that show where the company is headed and how it’s planting flags in the broader tech landscape:

  • Acquisition of Crunchy Data (Postgres) – In a move that surprised many, Snowflake announced plans to acquire Crunchy Data, a company known for its expertise in PostgreSQL, for ~$250M. The goal? Launching “Snowflake Postgres”, a managed Postgres database service that will live alongside Snowflake’s own engine. This is fascinating because it suggests Snowflake will offer both its proprietary database and a standard Postgres option, likely to accommodate transactional workloads or just to grab workloads that prefer Postgres. It’s a recognition that one size (or one engine) doesn’t fit all. Strategically, this is about embracing open source tech and developer communities – some developers love Postgres and have existing apps using it, so why not host that data in the Snowflake Data Cloud ecosystem? Along with their Iceberg support, Snowflake is clearly signaling an open-arms approach to data formats and engines. For customers, this means more flexibility: you might use Snowflake’s native tables for analytics, but spin up a Postgres database for an app, and still enjoy unified security, billing, and governance across both. It’s a cloud-era twist on a “hybrid database” offering – not something we expected from Snowflake a couple years ago, and it shows how aggressively they plan to be the universal data platform.

Bringing the Community Together

One of the best parts of Summit is always the people. It’s an opportunity to connect with industry peers, product teams, and some new faces throughout the week, including at a dinner we hosted.

From organized talks to impromptu hallway run-ins, this community is absolutely stacked with imaginative and inspirational thinkers who genuinely care about data innovation.


Snowflake Summit 2025 painted a picture of a data platform evolving into an intelligent, connected nerve center for the modern enterprise. Snowflake is not just adding features for features’ sake – there’s a clear strategic narrative: integrate everything (data, AI, apps) in one secure hub, make it incredibly easy to use, and enable an ecosystem so others can extend it. It’s both ambitious and sensible. Ambitious, because Snowflake is aiming to be the default data+AI platform in an era where competition is fierce (hello Databricks, BigQuery, etc.). Sensible, because it addresses real pain points like data silos, slow time-to-insight, governance fears around AI, and the need to do more with less.

Enterprises that leverage these new tools stand to gain a competitive edge – whether it’s by uncovering insights faster with AI, cutting costs with smarter compute scaling, or creating new revenue streams by productizing data and apps. Data-driven teams should feel energized by what’s coming: we’ll have better toys to play with! And more importantly, we’ll be equipped to solve complex problems in simpler ways.

I’ll certainly be keeping a close eye on how quickly these features roll out and how they perform in the wild. But one thing’s for sure – Snowflake has set a bold direction that is pushing the whole industry forward. The lines between data engineering, analytics, and application development are blurring, and Snowflake is at that convergence point. Speaking as Josh, and on behalf of Blue Orange Digital, we’re thrilled to ride this next wave with our clients, helping turn these Summit announcements into real-world wins.