
Case Study: Faster IT Processes Improve Plan Delivery

Tags: plan quality

The company saved $19M per month by reducing run times by 40% and improving on-time plan delivery to 95%.

For supply chains to be more responsive, plan generation needs to become faster. Consistent on-time delivery of system loads and operating plans is a key metric for internal and external stakeholders.

The Company

A $14B high-tech manufacturer was experiencing market growth and had made a number of acquisitions that added new factories and product lines. The supply chain team was focused on generating production schedules for 60 factories and rolling out a supply chain planning application to the new factories.

The Challenge

The increased volume and complexity of data made it more difficult to collect and process the data needed to generate production plans and reports on time. Data quality errors and expanding runtimes were leading to late delivery of production schedules to the factories and reports to the businesses.

On-time delivery was critical to the manufacturing business and to the team’s responsiveness to customers. However, the team was only able to generate plans on time 60% of the time. Each hour of delay cost millions of dollars; an internal study quantified the cost of delays at $20M a month.

The Solution

Initially the company focused on upgrading the hardware running the optimization engine, tuning the algorithm and database processes, and running the batch jobs in parallel. All of this helped, but it wasn’t enough. We then focused on optimizing the data flowing through the system and developed a solution approach with multiple elements to resolve the challenges.

Automated Data Quality Checks

Batch run times are a function of processing time plus the reaction time when failures happen. We integrated data audits into the batch processes, not as part of the ETL jobs themselves, but as checks that alert the production support team during the load when data exceptions occur, for example run-over-run variance in orders or an unexpected change in volume.
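
As an illustration of this kind of audit, the sketch below flags items whose order volume swings sharply between consecutive runs. It is a minimal example assuming order data is available per run as pandas DataFrames; the column names, the 25% threshold, and the alerting hook are hypothetical, not the checks the company actually ran.

```python
# Minimal sketch of a run-over-run variance audit. Column names ("item",
# "order_qty"), the 25% threshold, and the alert hook are illustrative.
import pandas as pd

VARIANCE_THRESHOLD = 0.25  # flag items whose order volume swings more than 25%

def run_over_run_exceptions(previous_run: pd.DataFrame,
                            current_run: pd.DataFrame) -> pd.DataFrame:
    """Return items whose total order quantity changed more than the threshold."""
    prev = previous_run.groupby("item")["order_qty"].sum()
    curr = current_run.groupby("item")["order_qty"].sum()
    merged = pd.concat([prev, curr], axis=1, keys=["prev_qty", "curr_qty"]).fillna(0)
    # clip avoids division by zero for items that are new in the current run
    merged["pct_change"] = (merged["curr_qty"] - merged["prev_qty"]) / merged["prev_qty"].clip(lower=1)
    return merged[merged["pct_change"].abs() > VARIANCE_THRESHOLD]

def alert_production_support(exceptions: pd.DataFrame) -> None:
    """Notify the support team during the load, before the planning engine runs."""
    if not exceptions.empty:
        print(f"DATA AUDIT ALERT: {len(exceptions)} items exceed the variance threshold")
        print(exceptions.to_string())
```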


Result

The parallel audit processes and proactive notifications alerted the team to errors hours before they previously would have found them. This made it possible to fix errors before the plan delivery deadlines and improved initial plan quality.

Data Quality Report with Exceptions and Resolutions

As at most companies, planners were spending 10-20% of their time validating and reconciling data. This was done manually, after the plans were already distributed, which delayed execution of the plans and responses to customers. In addition to the batch data checks, a data quality and plan quality scorecard was published for each run, with exceptions and potential resolutions.
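
A per-run scorecard of this kind could be assembled along the lines of the sketch below. The rule names and suggested resolutions are invented for illustration; they stand in for whatever checks and remediation guidance the team actually published.

```python
# Illustrative per-run data quality scorecard. The rule names and resolution
# text are hypothetical placeholders.
import pandas as pd

SUGGESTED_RESOLUTIONS = {
    "missing_lead_time": "Confirm supplier lead time with procurement and reload the item master.",
    "negative_inventory": "Correct the inventory snapshot in the source warehouse system.",
    "order_volume_spike": "Validate the order feed with the order management team before release.",
}

def build_scorecard(run_id: str, exceptions: pd.DataFrame) -> pd.DataFrame:
    """Summarize exceptions by rule and attach a suggested resolution for each."""
    summary = (exceptions.groupby("rule")
                         .size()
                         .reset_index(name="exception_count"))
    summary["suggested_resolution"] = (summary["rule"]
                                       .map(SUGGESTED_RESOLUTIONS)
                                       .fillna("Review with the data steward."))
    summary.insert(0, "run_id", run_id)
    return summary

# Example: publish the scorecard alongside the plan output.
# build_scorecard("run-001", exceptions_df).to_csv("dq_scorecard.csv", index=False)
```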

Result

The automated exception reports cut down the manual data validation time for business users and enabled faster planning and execution cycles.

Data Trends and Runtimes Analysis

The company’s initial approach was to represent the full detail and complexity of its various factories, with the goal of producing the most accurate and optimal plan possible. However, planning engines are sensitive to different aspects of data complexity, for example order volume, capacity constraints, and BOM complexity. Analyzing the batch run time data points alongside the associated model metrics helped identify patterns and establish correlations with different trends in the data.
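
The sketch below shows one simple way such an analysis could be set up, assuming a history of batch runs with runtime and model-size metrics logged per run; the metric names are placeholders rather than the company's actual measures.

```python
# Hedged sketch: rank assumed model-complexity metrics by how strongly they
# correlate with batch runtime across historical runs.
import pandas as pd

def runtime_correlations(run_history: pd.DataFrame) -> pd.Series:
    """Correlate each complexity metric with total runtime, strongest first."""
    metrics = ["order_count", "constrained_resources", "bom_levels", "forecast_buckets"]
    return (run_history[metrics + ["runtime_minutes"]]
            .corr()["runtime_minutes"]
            .drop("runtime_minutes")
            .sort_values(ascending=False))
```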


Result

By comparing model complexity, plan runtimes, and plan quality, we were able to balance speed against optimization to meet the on-time requirements while still delivering an optimal plan.

Data Consolidation and Education

In addition to the model complexity, the granularity of the data being collected also represented an opportunity. For example, the demand forecast didn’t need to be in weekly buckets 6-8 months out; it could be consolidated into monthly buckets. Low-volume forecasts could be consolidated even further to align with minimum lot quantities. Because some users perceived this as a loss of visibility, education backed by plan quality analysis was needed to show the low impact and the benefits aligned with the planning goals.
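
The consolidation idea can be sketched as below: keep weekly buckets inside a near-term horizon and roll everything beyond it up into monthly buckets. The column names and the 26-week cutoff are assumptions for illustration, not the company’s actual horizon.

```python
# Sketch of forecast bucket consolidation. Assumes columns "item", "week_start",
# "forecast_qty"; the 26-week weekly horizon is an illustrative choice.
import pandas as pd

def consolidate_forecast(forecast: pd.DataFrame, weekly_horizon_weeks: int = 26) -> pd.DataFrame:
    """Keep weekly granularity inside the horizon, monthly buckets beyond it."""
    forecast = forecast.copy()
    forecast["week_start"] = pd.to_datetime(forecast["week_start"])
    cutoff = forecast["week_start"].min() + pd.Timedelta(weeks=weekly_horizon_weeks)

    near = forecast[forecast["week_start"] < cutoff]
    far = (forecast[forecast["week_start"] >= cutoff]
           .assign(bucket=lambda df: df["week_start"].dt.to_period("M").dt.start_time)
           .groupby(["item", "bucket"], as_index=False)["forecast_qty"].sum()
           .rename(columns={"bucket": "week_start"}))

    return pd.concat([near, far], ignore_index=True).sort_values(["item", "week_start"])
```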


Result

Data-driven analysis allowed some granular data collection and processing to be retired. This led to faster run times and freed team members to focus on more value-added tasks.

Business Benefit

With the daily quality checks and analysis in place, on-time delivery and plan quality showed rapid improvement. Over a 3-month period we were able to reduce daily workflow runtimes by 40%. On-time delivery of plans improved from 60% to 95%, which saved the company $19M a month in cost impacts.

