Manufacturer saves millions with Snowflake-powered analytics strategy

August 30, 2022

Getting the most impact and value from Snowflake, or any other cloud-based data platform or tool, hinges on having a modern data and analytics strategy in place. The strategy sets analytics priorities that are aligned to business goals — and enables business and IT leaders to make technology investment decisions that advance those goals. It can take time and discipline to design the right strategy first — especially when powerful new platforms and tools like Snowflake are easy to stand up and quick to deliver an initial pay-off. But when an organization commits to a strategy-first approach, it pays operational and business dividends for both the short and long term.

One of NTT DATA’s clients, a U.S.-based manufacturer of building products, had to make this decision: leverage technology for a short-term fix to its analytics frustrations or map out a phased strategy that could flex to meet IT and business goals over time.

An unsustainable situation

Near the end of a multiyear, enterprise-wide SAP implementation, our manufacturing client faced several challenges to its analytic objectives. Data couldn’t be easily integrated from within or outside SAP. The HANA views created for business unit reporting had performance issues that made ad-hoc analysis cumbersome and ineffective. Processes were riddled with redundancies and inefficiencies.

Simply put, our client had maxed out the analytic capabilities of its SAP system. The data was mostly there, but it was largely inaccessible to the areas of the business that needed it to make strategic decisions: sales, finance, logistics and manufacturing.

Obstacles to business insights — from all directions

Because of multiple data sources and custom hierarchies, IT teams struggled to enrich, extract or transform data in any meaningful way. Key ERP and transportation management databases in SAP couldn’t easily talk to each other. When teams tried to bring in third-party data to increase visibility into specific areas of the business, the process demanded still more time and manual effort. Severely limited reporting options and a lack of visualization capabilities meant “reports” were delivered as rudimentary data tables in Excel spreadsheets. Insights on problem areas or trends over time were difficult to spot, track and address.

The obstacles and limitations involved in pulling and analyzing data made it impossible for the organization to have a single source of truth. Reports of different shapes, sizes and data points proliferated across IT and business functions. Case in point: eight different business teams were pulling reports on product shipments, all rife with inconsistencies and inaccuracies. Because even these sub-par reports were so painful to create, they were produced weekly at best. The business was flying blind between its Monday status meetings, and even when the reports arrived, its view of operations was nowhere close to 20/20.

Gaining clarity on the best path forward

The company knew that getting on the right analytics path would require more than technology, so it turned to NTT DATA to map out a modern data and analytics strategy. The client wanted to make quick progress on the analytics front, but it also needed a better and more scalable environment and data foundation for its future-state analytics ambitions.

Before it could move forward, the client needed to know where it was starting from. That meant conducting a thorough and objective assessment of existing data and analytics capabilities, technology limitations and pain points. The assessment produced a detailed understanding of the client’s current analytics processes, the metrics used and the decisions made, and it clearly spelled out the unmet needs. With this complete picture in hand, the NTT DATA team got to work defining a target architecture and identifying the tools to drive a next-generation data model.

Establishing the foundation: Snowflake’s value-add

Using Snowflake as a key enabling component of the strategy had been a possibility from early in the assessment, but the client wanted to understand how best to leverage its capabilities, and combine it with other technologies, for maximum impact. Snowflake’s elastic architecture, which separates storage from compute and lets virtual warehouses scale independently, would provide the accessibility and scalability the organization required to accommodate different user hierarchies and workstream needs on demand (and manage costs in the process).
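
To make that elasticity concrete, here is a minimal sketch (not the client’s actual configuration) of resizing a Snowflake virtual warehouse on demand with the snowflake-connector-python package; the account, credentials and warehouse name are placeholders.

```python
# Minimal sketch: scale a virtual warehouse up for a heavy workload, then shrink
# it and rely on auto-suspend to control idle cost. All identifiers are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    role="SYSADMIN",
)
cur = conn.cursor()

# Scale up ahead of a heavy reporting window.
cur.execute("ALTER WAREHOUSE REPORTING_WH SET WAREHOUSE_SIZE = 'LARGE'")

# Scale back down afterward; AUTO_SUSPEND (in seconds) stops billing when the
# warehouse sits idle.
cur.execute(
    "ALTER WAREHOUSE REPORTING_WH SET WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60"
)

cur.close()
conn.close()
```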

Snowflake’s multi-cloud support would also give the client the freedom to essentially plug and play the cloud and analytics technologies that would best support its near-term goals (in this case Microsoft Azure, Power BI and Databricks) while providing a robust foundation for maturing its analytics capabilities.

The icing on the cake was Snowflake’s data sharing and data marketplace capabilities. Being able to easily find, integrate and access specific third-party data sets would enable the client to improve visibility and conduct more advanced analytics without the headaches of its existing process.

A multi-tech strategy comes to life

What does the Snowflake-powered multisource technology solution ultimately look like?

  • We created an end-to-end, scalable analytics solution built on Snowflake and Azure cloud services.
  • We extracted 200+ tables from the client’s existing SAP system using Azure Data Factory and landed the raw exports in Azure Data Lake Storage Gen2.
  • We built a curated, multidimensional layer in Databricks that loads into Snowflake, combining multiple input sources and applying business logic, custom hierarchies and record-level attributes (a minimal sketch of this load pattern follows the list).
  • We identified third-party data sets, available through Snowflake data sharing and the Snowflake Data Marketplace, that would enrich existing data and decision-making.
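
As a rough illustration of the load pattern described above, the sketch below reads a raw SAP extract landed by Azure Data Factory in Azure Data Lake Storage Gen2, applies a simple piece of business logic in Databricks (PySpark) and writes the curated result to Snowflake through the Spark connector. Every path, column, table name and credential shown is a placeholder rather than the client’s actual schema.

```python
# Hypothetical Databricks notebook cell: curate one raw SAP extract and load it
# into Snowflake. All paths, column names, table names and credentials are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks

# Raw table landed by Azure Data Factory as Parquet in ADLS Gen2.
raw_shipments = spark.read.parquet(
    "abfss://raw@examplelake.dfs.core.windows.net/sap/shipments/"
)

# Example of curated-layer business logic: derive a freight lane key and keep
# only completed shipments.
curated = (
    raw_shipments
    .withColumn("lane_key", F.concat_ws("-", F.col("origin_plant"), F.col("dest_zip")))
    .filter(F.col("delivery_status") == "C")
)

# Snowflake connection options for the Spark connector (placeholders).
sf_options = {
    "sfURL": "example_account.snowflakecomputing.com",
    "sfUser": "etl_user",
    "sfPassword": "********",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "CURATED",
    "sfWarehouse": "LOAD_WH",
}

(
    curated.write
    .format("net.snowflake.spark.snowflake")
    .options(**sf_options)
    .option("dbtable", "SHIPMENTS_CURATED")
    .mode("overwrite")
    .save()
)
```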

Future state: target architecture (diagram)

The analytics engine takes shape

With a coherent strategy in place and the data easily accessible, the team could focus on putting accurate and actionable analytics in the hands of business users. We created 12 certified data models and 20 certified Power BI dashboards, and we trained 150+ users on how to create and consume reports. On the back end, we provided the client’s IT team with skills development on data pipelines, the platform and data warehousing; implemented release pipelines and code repositories; and helped onboard new Snowflake and Power BI hires.

New efficiencies fuel new investments

We helped the client understand how to track and manage the usage and associated costs of its new cloud-based data and analytics environment. We identified savings by reducing or eliminating the use of less efficient platforms and by extracting greater value from Snowflake and Power BI specifically. These savings, the funds they freed up for reallocation, and an estimated 6,000 person-hours saved helped offset the cost of the client’s overall cloud data strategy investment.
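
As one example of how that usage tracking can work, Snowflake exposes credit consumption per warehouse through its standard ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY view; the sketch below (with placeholder connection details) totals the credits each warehouse consumed over the past 30 days.

```python
# Sketch: summarize Snowflake credit consumption by warehouse for the last 30 days.
# Connection parameters are placeholders; the ACCOUNT_USAGE views require a
# suitably privileged role such as ACCOUNTADMIN.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    role="ACCOUNTADMIN",
)

query = """
    SELECT warehouse_name,
           SUM(credits_used) AS credits_last_30_days
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD('day', -30, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits_last_30_days DESC
"""

for warehouse, credits in conn.cursor().execute(query):
    print(f"{warehouse}: {credits:.1f} credits")

conn.close()
```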

Building on the benefits

One year ago, our client couldn’t imagine a reality where it had easy access to the data it needed — from SAP and third-party sources — to support its business, let alone accurate and insightful reporting at its fingertips. Thanks to a refreshed data and analytics strategy with Snowflake as the cornerstone of a multisource technology ecosystem, the client can ingest and analyze data faster, easier and with greater accuracy and impact. Here are a few examples of the impact:

  • Freight lane optimization: By replacing the team’s time-consuming manual data collection and analysis process with an automatic feed of freight lane rates from the Snowflake Data Marketplace, the team was able to pull timely, verified rate data, engineer it and compare it to existing freight spend (a simplified sketch of this comparison follows the list). That comparison surfaced inefficiencies and excess costs in the shipment network, which equipped our client to negotiate lower freight contracts with shipping partners and saved the manufacturer an estimated $3M to $5M annually. We’re now working with the logistics team to engineer a more efficient tendering process, which will optimize how it assigns its 1,000+ daily shipments across five carriers. The new process will replace the existing chronological approach with one that identifies and automates the best possible assignment of carriers to shipments to optimize costs.
  • Improved reporting and visibility: Remember those eight different shipment reports? They’re now in one beautiful dashboard that’s a click away for business users across departments and hierarchies. Business teams can pull indicator reports from Snowflake regularly and adjust their pricing, sales, customer service or manufacturing strategies to maximize opportunities.
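
To give a sense of what the freight lane comparison mentioned above might look like once benchmark rates are flowing in from the Snowflake Data Marketplace, here is a simplified pandas sketch; the lanes, rates and column names are illustrative, not the client’s actual data.

```python
# Illustrative only: compare actual freight spend per lane to marketplace benchmark
# rates and flag lanes paying well above benchmark. All values are made up.
import pandas as pd

# In practice both frames would be queried from Snowflake tables; they are
# hard-coded here to keep the sketch self-contained.
actual_spend = pd.DataFrame({
    "lane": ["ATL-DAL", "ATL-CHI", "MEM-DEN"],
    "shipments_per_year": [420, 310, 150],
    "avg_cost_per_shipment": [1180.0, 950.0, 1625.0],
})
benchmark_rates = pd.DataFrame({
    "lane": ["ATL-DAL", "ATL-CHI", "MEM-DEN"],
    "benchmark_cost_per_shipment": [1010.0, 930.0, 1400.0],
})

lanes = actual_spend.merge(benchmark_rates, on="lane")
lanes["pct_over_benchmark"] = (
    lanes["avg_cost_per_shipment"] / lanes["benchmark_cost_per_shipment"] - 1
)
lanes["annualized_excess"] = (
    (lanes["avg_cost_per_shipment"] - lanes["benchmark_cost_per_shipment"])
    * lanes["shipments_per_year"]
)

# Lanes more than 10% above benchmark become renegotiation candidates.
flagged = lanes[lanes["pct_over_benchmark"] > 0.10]
print(flagged[["lane", "pct_over_benchmark", "annualized_excess"]])
```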

These analytic product solutions are just the beginning. Snowflake will provide the scale and functionality to enable the client’s continued move up the analytics maturity curve. Examining causal factors in the data, understanding why things are changing and building predictive and prescriptive analytics capabilities will help the company identify issues before they become problems. In the wake of all this progress, what’s the biggest challenge still facing our client? Meeting the surging internal demand for analytics and reporting. Now that business teams have visibility into what’s happening in their departments and the insights to make informed decisions, they can’t get enough.

This success story is a great example of how an organization can maximize the business impact from Snowflake when it evaluates and implements the platform as part of a modern data and analytics strategy. Speed to stand up is a good thing, but speed to value is even better.

— By
David Mobley, Vice President, Data & Artificial Intelligence
Greg Stuhlman, Managing Director, Data & Artificial Intelligence
Shan-Ming Chiu, Senior Manager, Data & Artificial Intelligence
