Accelerating FinOps with Cortex Code on Snowflake

How FinOps Reporting Dynamically Evolves on an Enterprise Data and AI Platform

Snowflake has always run its own FinOps practice on the Snowflake platform. Bringing cloud costs, usage and business metrics into one governed environment has given us consistency, scale and trust across all of our stakeholders. The power of our enterprise data and AI platform is the ability to contextualize these data sets to drive real business outcomes.

As we described in earlier posts on the Snowflake Builders Blog [first, second, third], this curated self-serve model has enabled us to democratize intelligence and share analytics across our teams. That foundation allowed us to operate at massive scale while retaining granular visibility across our entire stack.

However, our business is changing quickly and demands rapid iteration. Snowflake Cortex AI features, new warehouse SKUs and fast-changing customer needs are reshaping how we think about measuring cloud economics and margins. The pace of change is ramping up, and we need to iterate at the same speed.

Stepping on the gas pedal: Cortex Code

The February launch of Snowflake Cortex Code has changed how we work almost overnight. 

With Cortex Code, it’s like we’ve added a fleet of analysts and developers for each team member on top of the data we already have in Snowflake. Our team can now make a request like: “Create a Streamlit interface that shows monthly expenses on Google Cloud by region and flag anomalies over 10% of a metric that you define.”

Cortex Code translates that intent into deployable application code that runs securely against governed data already in Snowflake. The combination may not look like traditional FinOps tooling, but it delivers what matters: rapid iteration tied to business context. That's the foundation of any successful FinOps practice.
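To make the request above concrete, here is a minimal sketch of the anomaly rule such a generated app might apply: flag any region whose month-over-month spend moves more than 10%. The data and names are illustrative, not Snowflake's actual schema or Cortex Code's actual output.

```python
# Hypothetical anomaly rule: flag month-over-month spend changes above a
# threshold, per region. Spend figures below are made up for the example.

def flag_anomalies(monthly_spend, threshold=0.10):
    """monthly_spend: {region: [spend_month1, spend_month2, ...]} in dollars.
    Returns (region, month_index, pct_change) tuples for flagged months."""
    flags = []
    for region, series in monthly_spend.items():
        for i in range(1, len(series)):
            prev, curr = series[i - 1], series[i]
            if prev == 0:
                continue  # skip brand-new regions to avoid division by zero
            pct = (curr - prev) / prev
            if abs(pct) > threshold:
                flags.append((region, i, round(pct, 3)))
    return flags

spend = {
    "us-central1": [100_000, 104_000, 130_000],  # +25% jump in month 3
    "europe-west1": [80_000, 82_000, 83_000],    # stays within 10%
}
print(flag_anomalies(spend))  # [('us-central1', 2, 0.25)]
```

In the generated Streamlit app, a rule like this would sit behind the charts, with the threshold exposed as a user-adjustable input.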

Use Case 1: Fully automated cloud forecasting

Like most teams, our forecasts historically lived in complicated spreadsheet-based models. The process works but is brittle as it scales, doesn’t directly connect to downstream tooling and takes significant work to update for each cycle.

We’re now using Cortex Code to rebuild forecasting across multiple areas: all cloud consumption across our internal and customer-facing workloads, Snowflake’s consumption of Snowflake features, third-party SaaS consumption and more. Now our forecast values, like our historical cost data, live in structured tables that plug into existing tools. Engineering and Product teams can build goals and track KPIs on top of the same numbers the Finance team uses: no exports, no ongoing metric gaps, and one shared forecast.

A few wins to consider from this updated approach:

  • Faster finance cycles: Forecasts are more frequently refreshed (daily is the goal!) and with significantly less manual effort.

  • Earlier variance signals: We’re able to tie anomaly reporting directly to our locked forecasts and give leadership advance notice of variances, allowing us to be proactive rather than reactive when closing any financial period.

  • Improved visibility: Forecast logic is out in the open for key stakeholders to inspect and challenge. This feedback cycle is critical for continued iteration. Shared data means direct accountability.

Our goal is to reduce the time to refresh our forecast across billions of dollars of expense from about a week of person time to a few hours. This isn’t a pie-in-the-sky dream; we’re already delivering on this outcome.
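Because actuals and forecast values share the same structured tables, the "earlier variance signals" above reduce to a simple comparison against the locked forecast. The sketch below shows that check in isolation; line items, thresholds and figures are hypothetical.

```python
# Illustrative variance check against a locked forecast: compare actuals to
# forecast values held in the same tables, and surface any line item whose
# variance exceeds an alert threshold ahead of period close.

def variance_signals(rows, alert_pct=0.05):
    """rows: list of dicts with 'line_item', 'forecast', 'actual' (dollars).
    Returns line items whose actual deviates from forecast by > alert_pct."""
    signals = []
    for r in rows:
        if r["forecast"] == 0:
            continue  # no baseline to compare against
        var = (r["actual"] - r["forecast"]) / r["forecast"]
        if abs(var) > alert_pct:
            signals.append({"line_item": r["line_item"],
                            "variance_pct": round(var, 4)})
    return signals

rows = [
    {"line_item": "compute", "forecast": 1_000_000, "actual": 1_080_000},
    {"line_item": "storage", "forecast": 400_000, "actual": 404_000},
]
print(variance_signals(rows))  # [{'line_item': 'compute', 'variance_pct': 0.08}]
```

In practice this would run as a scheduled query over the shared forecast tables, with flagged lines routed into the anomaly reporting described above.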

Use Case 2: An AI-enabled weekly metrics review and granular anomaly detection

We initially implemented weekly metric reviews in order to adjust to small week-to-week changes in our business. As a group we would review trend changes across platform areas, customers and workloads, then debug findings that analysts had spent hours manually drilling into the data to explain. By the time we were through these meetings, the analysis was complete but the opportunity for action had diminished.

Our team saw a clear opportunity to automate this process and did so through Streamlit. However, the real step change in our approach came when we began developing dedicated Cortex Code skills for FinOps.

We now have first-pass variance analysis, anomaly detection and root-cause guidance ready for the team to review, with minimal human effort required ahead of the meeting. Given the reduction in time-to-action, we can triage with greater speed across our footprint. Surfacing drivers and first-draft commentary is also rapidly reducing our time to financial guidance, another key outcome for our small team. These are not static analytic capabilities but skills that mature with additional use over time.

Despite this speed and intelligence, we believe firmly that this does not replace analyst judgment; it amplifies it. Teams arrive at discussions focused on making decisions and driving outcomes rather than gathering data.

Use Case 3: Rebuilding accounting controls

Given the large number of features we manage, third-party services we ingest, and internal models we require to correctly recognize expenses, our accounting and audit partnership is fundamental to the health of Snowflake’s business.

While audit controls are, by nature, meant to be repeatable, inspectable and clearly explainable, Snowflake’s scale and ongoing growth have led to processes that can require significant manual toil. Naturally, these are the types of items that lend themselves to systematized streamlining. We’re now rebuilding many of our processes and audit controls using Cortex Code directly on Snowflake.

One of our key monthly cloud controls ensures correct recognition of our cloud spend by account. Given our scale, this can be fairly cumbersome. We replicated this control directly in Streamlit, with workflows, instructions and purpose documented in GitHub. Approvals live in the application itself, creating a built-in, timestamped audit trail. Manual data manipulation and query aggregation are no longer necessary, removing the bulk of operator-error risk.
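The in-app approval trail is the key durability win here. A minimal sketch of the idea, with illustrative field names rather than the actual control's schema, could look like:

```python
# Minimal sketch of an in-app approval log: each approval is appended with a
# UTC timestamp, yielding a built-in, inspectable audit trail. Field names
# are hypothetical; a real control would persist these rows to a table.

from datetime import datetime, timezone

class ApprovalLog:
    def __init__(self):
        self.entries = []

    def approve(self, control_id, approver, note=""):
        entry = {
            "control_id": control_id,
            "approver": approver,
            "note": note,
            "approved_at": datetime.now(timezone.utc).isoformat(),
        }
        self.entries.append(entry)
        return entry

log = ApprovalLog()
log.approve("cloud-spend-recognition-2025-06", "analyst@example.com",
            "tie-out complete")
print(len(log.entries), log.entries[0]["control_id"])
```

Because every approval carries who, what and when, auditors can inspect the trail directly instead of reconstructing it from emails and spreadsheets.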

Since updating this process, we have seen a 70% reduction in operational effort each month. These controls have become more durable, auditable, and extensible, and we are continuing to migrate additional processes to ensure they become direct extensions of our core data structures rather than fragile side workflows.

The subject matter experts who previously spent days maintaining these controls now invest that time improving them: tightening logic, expanding coverage and focusing on higher-value work. Cortex Code bought back time, and the team can now decide how to reinvest it.


Governance and trust still matter: New roles for data practitioners

As a business partner of Finance, the Analytics department remains foundational to enterprise FinOps motions. AI has heightened that responsibility, not reduced it. An analytics team still owns the health, accuracy and defensibility of the underlying data, so that insights stay grounded in business context.

While traditional BI tasks such as dashboard building and routine analysis are increasingly automated, the role of the Analytics team at Snowflake is also quickly evolving. As Cortex Code takes on more of the technical execution, we’re shifting from manual manipulation to orchestration and productization. We now operate as executive producers: directing agents, curating outputs and embedding business context into our semantic layer.

With the speed offered by Cortex Code, it is critical for us to have data guardrails built directly into the system. Validation checks are run before and after insights are generated, outputs are logged and traceable, and historical results persist. We maintain active feedback loops across teams, feeding qualitative context back into our skills and semantic models, which sharpens the models’ abilities. We treat these tools like additional analysts, holding the outputs to a high bar.
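One way to structure the "validate before and after" guardrail described above is to wrap insight generation so inputs are checked first, outputs are checked after, and every run is logged for traceability. The checks and names below are hypothetical illustrations, not Snowflake's actual implementation.

```python
# Hypothetical guardrail wrapper: run pre-checks on the input data, generate
# the insight, run post-checks on the result, then persist the run to a log.

run_log = []  # stands in for a persisted, queryable run-history table

def guarded_insight(data, generate, pre_checks, post_checks):
    for check in pre_checks:
        if not check(data):
            raise ValueError(f"pre-check failed: {check.__name__}")
    result = generate(data)
    for check in post_checks:
        if not check(result):
            raise ValueError(f"post-check failed: {check.__name__}")
    run_log.append({"input_rows": len(data), "result": result})
    return result

def non_empty(data):
    return len(data) > 0

def total_is_finite(result):
    return result == result and abs(result) < 1e15  # rejects NaN and blowups

total = guarded_insight([120.0, 340.5], sum, [non_empty], [total_is_finite])
print(total)  # 460.5
```

A failing check raises before any output reaches consumers, and the log gives the traceable, persistent history the guardrails require.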

As AI continues to accelerate workflows, our cross-team operating model is maturing. Teams increasingly handle rapid design, reporting and dashboarding, while the analytics team acts as executive producer, providing feedback and layering AI-driven enhancements on top. That allows us to focus on higher-order problems like context, quality and consumable workflows for Snowflake to share internally and externally.

Contextualized outcomes accelerated with Cortex Code

In the FinOps Foundation’s State of FinOps 2026, the mission statement of the organization was updated from “Advancing the People who manage the Value of Cloud” to “Advancing the People who manage the Value of Technology.” At Snowflake, our approach to platform finance directly aligns with that outlook.

FinOps tools have shaped modern cloud platform financial management, but cost reporting without enterprise context no longer meets the moment. As consumption-based models scale in depth and breadth, organizations must tie investment directly to performance and business value, anchored in core company metrics. At Snowflake, we have done this for years and are accelerating further by enabling subject matter experts to rapidly iterate on solutions with the support of key BI and data partners.

Cortex Code on Snowflake enables technical and nontechnical consumers alike to build FinOps tools that reflect how their business truly operates, with metrics, drivers, and workflows tailored to real needs rather than forced into predefined models. When cost, usage and business data live together on a governed platform, Snowflake and Cortex Code can reason across domains instead of within silos, compounding time savings and accelerating insight across the enterprise.

The bottom line

For organizations managing cloud cost at scale, the Snowflake platform now lets you prototype where your enterprise data lives. Pairing your cloud costs with business context turns raw cost data into true business insight.

Get started with Cortex Code or start your free trial.
