The Science of Labor and Productivity

Data Is Helping DCs Cut Labor Costs, Boost Performance, and Retain Workers

Imagine for a moment that you could know exactly how many workers you’ll need to run your operation at peak efficiency next week. While you’re at it, assume that it’s also possible for you to know exactly where each of your employees will work most effectively, which incentives will motivate them to get their jobs done, and who’s thinking about quitting next week.

This scenario isn’t as far-fetched as it may sound. In fact, all of these insights are possible, thanks to recent advances in data science, enabled by the development of cutting-edge artificial intelligence (AI) and machine learning.

By incorporating these capabilities into next-generation labor management software (LMS) systems, connected distribution centers (DCs) can now harness an unprecedented level of proactive control over their workforce — and their operations. How DC operators choose to use this power will have a significant impact on the industry. Early signs suggest that savvy DC operators are using it to discover new opportunities for optimization, overcome the challenges of tight labor markets, and drive the development of autonomous systems.

The Declining Cost of LMS

One of the most promising benefits of modern LMS systems is their plummeting cost, which is making them accessible to more operations than ever before.

A significant part of an LMS setup is the development of engineered labor standards, which provide the benchmarks for each individual task in a DC. Historically, this process has been performed by a team of industrial engineers, which can be both costly and time-consuming.

Today, however, a significant portion of these standards can be generated automatically, using regression models and other data science tools that draw on the deep experience of materials handling and labor management experts. This approach has reduced the initial investment cost of the latest LMS systems by 40 percent or more.
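To make this concrete, here is a minimal sketch of how a labor standard might be derived from execution history with a simple least-squares regression. The task, field names, and numbers are all illustrative placeholders, not output from any real LMS.

```python
# Hypothetical sketch: deriving an engineered labor standard for a picking
# task from execution history, using ordinary least squares. All figures
# are made up for illustration.

def fit_linear(xs, ys):
    """Ordinary least squares for y = intercept + slope * x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return mean_y - slope * mean_x, slope

# Historical observations: (units picked, minutes to complete the pick).
units = [4, 8, 12, 16, 20]
minutes = [3.1, 5.0, 7.2, 8.9, 11.1]

base, per_unit = fit_linear(units, minutes)

def standard_minutes(units_picked):
    """Engineered standard: fixed setup time plus a per-unit allowance."""
    return base + per_unit * units_picked

print(round(standard_minutes(10), 2))
```

In practice the regression would draw on many more predictors (travel distance, container type, zone congestion), which is where the expertise of materials handling specialists shapes the feature set.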

Once a modern LMS system is up and running, it can also handle ongoing adjustments to labor standards and other calculations, automatically balancing expectations to maintain peak performance without leaving workers feeling overwhelmed. Over time, the model also learns to predict other factors, such as optimal pay rates for a given job and which incentives are generating the strongest return on investment.

These capabilities provide additional cost savings by relieving industrial engineers of back-end work, effectively giving them more time to invest in higher-value tasks, such as developing new process improvements. In many cases, they also eliminate the need for DCs to hire additional staff to manage the system itself.

Foreseeable Futures

During the last decade, the focus of LMS has gradually shifted from interpreting the past to predicting the future. About 10 years ago, the insights of LMS were largely limited to the previous day's performance, highlighting which areas were meeting their goals and which needed improvement. Details started coming faster as LMS evolved, providing information about the previous shift, or even supplying iterative updates once or twice during the current shift. This data had value, and it still does, but for the most part, it only enabled reactive responses to past results.

Fast-forward to today. With the advent of advanced machine-learning algorithms, LMS systems can now use this historical data to predict a DC’s future labor needs.

For example, a DC’s execution history can be used to predict resource needs for the next day or the next week. In addition, these plans can be supported by real-time data monitoring and predictive modeling. In a scenario like this, the LMS might notice that there are too many workers concentrated in one area while another will soon have a shortage. If the potential benefit of reassigning workers outweighs the opportunity cost of moving them, the system could automatically recommend an intra-day change for labor balancing.
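The balancing rule described above can be sketched as a simple comparison: recommend a move only when the projected benefit of relieving the shortage outweighs the opportunity cost of pulling workers from their current zone. Every number and field name below is hypothetical.

```python
# Illustrative sketch of an intra-day labor balancing decision. Assumed
# inputs: a zone with spare capacity, a zone heading for a shortage, and a
# rough per-worker throughput rate. Not a real LMS calculation.

def recommend_move(surplus_zone, shortage_zone, workers_to_move,
                   units_per_worker_hour, move_overhead_hours=0.25):
    # Benefit: units the shortage zone would otherwise fail to process.
    benefit = min(workers_to_move * units_per_worker_hour,
                  shortage_zone["backlog_units"])
    # Opportunity cost: output lost while workers travel and re-orient,
    # plus any backlog created in the zone they leave.
    lost_output = workers_to_move * units_per_worker_hour * move_overhead_hours
    new_backlog = max(0, workers_to_move * units_per_worker_hour
                         - surplus_zone["slack_units"])
    return benefit > lost_output + new_backlog

packing = {"slack_units": 300}      # zone with spare capacity
receiving = {"backlog_units": 180}  # zone heading for a shortage

print(recommend_move(packing, receiving, workers_to_move=2,
                     units_per_worker_hour=60))
```

A real system would weigh many more factors (cross-training, certification, shift timing), but the core logic is this benefit-versus-cost comparison, recomputed continuously from real-time data.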

Great Power Creates Great Opportunities

Another emerging benefit is the ability to predict labor attrition. Drawing on years of historical data from multiple facilities worldwide, machine-learning algorithms can track employees from their start date, as they transition to new roles, quit, get fired or laid off, and so on.

Once the model gains enough local data from a given site, it's capable of identifying risk factors that signal when a worker is at risk of quitting. Local data is key, because what workers do differently before they leave can vary by organization, and even by location within the same organization.

While this is not exactly a "crystal ball," early tests suggest that once the model has been trained, the turnover risk of any given individual can be predicted with accuracy rates around 95 percent. The same machine-learning model can also help pinpoint factors that encourage or discourage employee retention, such as which supervisors are managing the largest number of satisfied employees, and what their peers could do to improve their own retention rates.
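To illustrate the shape of such a model, here is a minimal logistic-style risk score. In a production LMS the weights would be learned from local historical data; the weights, feature names, and worker records below are invented purely for illustration.

```python
# Hypothetical attrition risk score. The coefficients are made-up values,
# NOT learned parameters from any real model; a real system would fit them
# (e.g. via logistic regression) on local historical data.

import math

WEIGHTS = {
    "bias": -3.0,
    "absences_last_month": 0.6,
    "pct_drop_in_performance": 4.0,
    "schedule_changes": 0.4,
}

def attrition_risk(worker):
    """Logistic score in [0, 1]: probability-like risk of quitting soon."""
    z = WEIGHTS["bias"]
    z += WEIGHTS["absences_last_month"] * worker["absences_last_month"]
    z += WEIGHTS["pct_drop_in_performance"] * worker["pct_drop_in_performance"]
    z += WEIGHTS["schedule_changes"] * worker["schedule_changes"]
    return 1.0 / (1.0 + math.exp(-z))

steady = {"absences_last_month": 0, "pct_drop_in_performance": 0.0,
          "schedule_changes": 1}
at_risk = {"absences_last_month": 4, "pct_drop_in_performance": 0.3,
           "schedule_changes": 3}

print(round(attrition_risk(steady), 2), round(attrition_risk(at_risk), 2))
```

The point of the sketch is the structure: behavioral signals in, a single comparable risk score out, which downstream processes can then act on.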

Early adopters are using this powerful information with varying degrees of sophistication. Some companies simply use this kind of data to automatically fire workers with the highest risk scores in a kind of “preemptive strike.” This strategy may eliminate some problems before they escalate, but it also overlooks the greater levels of value the data can provide. Considering today’s tight labor environment — which shows every sign of getting worse long before it improves — this approach may also be short-sighted.

Designing Engagement Models to Keep Good Workers

A more enlightened approach is to use risk data to inspire and engage the workers you’d like to retain. The powerful outputs of data science can be used to drive resource-focused engagement, creating measurable improvements for both workers and the warehouse. 

Each organization can choose the engagement incentives it’s willing to offer. Even simple, low-cost strategies like badges, pizza parties or an extra day off can pay big dividends when it comes to employee satisfaction and retention. The system can also predict which workers are most likely to respond to more substantial incentives, such as bonuses, training or career development. 

Many factors determine which engagement models will appeal most to a given worker. For example, baby boomers typically prefer monetary incentives, while many millennials value non-monetary perks such as extra time off. Other factors that often influence engagement models are an employee’s performance level, tenure, utilization, operation and current risk score.

For example, let’s say that your LMS system identifies a group of employees who are likely to quit in the next six weeks. Based on each individual’s unique risk factors, you could divide them into three groups: 

  • Red: Employees with high turnover risk, low performance records, or both
  • Yellow: Medium-risk workers with moderate to high performance levels
  • Green: Similar to the yellow group, but with lower levels of risk
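The triage above can be expressed as a small decision rule, assuming each worker record carries a model-produced risk score (0 to 1) and a performance rating relative to standard (1.0 = meets standard). The thresholds are illustrative placeholders, not recommended values.

```python
# Sketch of the red/yellow/green grouping described above. Threshold
# values are assumptions for illustration; a real deployment would tune
# them from local data.

def triage(risk, performance, high_risk=0.7, low_risk=0.4, min_perf=0.85):
    if risk >= high_risk or performance < min_perf:
        return "red"      # high turnover risk, low performance, or both
    if risk >= low_risk:
        return "yellow"   # medium risk, moderate-to-high performance
    return "green"        # similar performance, lower risk

print(triage(0.8, 1.10))  # red
print(triage(0.5, 1.00))  # yellow
print(triage(0.2, 1.05))  # green
```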

Next, each group can be approached with a tailored engagement model. Members of the red group are either unlikely to stay regardless of incentives, or lack sufficient potential for engagement to be worthwhile. This group can be monitored, and supervisors can begin contingency planning for their specific roles, but the primary insight in their case is that they're unlikely to stay. By contrast, the green group doesn't need much encouragement to stick around. Their engagement model could include some standard training or a bit of extra attention.

Attrition prediction offers the most value for members of the yellow group. Retaining them could require more intensive training, larger incentives, or other timely interventions designed to avert a labor shortage. The system can help determine which incentives are likely to work best, balancing their cost against the likely benefits.

These investments are easily justified when you consider the costs of recruiting and retraining replacements. If just 15 percent of workers in the green and yellow groups can be persuaded to stay, a medium-sized DC could easily see six-figure savings each year.
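A back-of-envelope check makes the claim concrete. The headcount and the per-worker replacement cost below are assumed figures for a hypothetical medium-sized DC, not data from the source.

```python
# Rough savings estimate for the retention scenario above. Both inputs
# are assumptions chosen for illustration.

yellow_and_green_workers = 200   # assumed headcount across the two groups
retention_lift = 0.15            # 15 percent persuaded to stay
cost_per_replacement = 5_000     # assumed recruiting + retraining cost

annual_savings = yellow_and_green_workers * retention_lift * cost_per_replacement
print(annual_savings)
```

Even with conservative inputs, retaining a modest share of at-risk workers clears the six-figure mark once recruiting and retraining costs are counted.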

Needless to say, the system also monitors the results of all these initiatives. With properly engineered labor standards and engagement models, early tests suggest that for every 50 cents employers invest in engagement, they should expect to see about a dollar in throughput benefits.

At Honeywell Intelligrated, our labor management experts are currently conducting testing to further refine the effectiveness of attrition prediction. Machine learning plays a key role in shaping these models, balancing them with the aid of historical information. Once trained, the system maintains a performance curve that drives results, predicts the best incentives to offer, and helps determine the standards an organization should set as its goals. Going forward, it also monitors these and many other factors, suggesting changes as needed to maintain peak efficiency.

The Power of a Single Platform

LMS systems thrive when the information they need flows through a single, consistent platform. With proper integration, more data can be pulled in real time, and process improvements can be implemented with greater speed and efficiency.

This is particularly true in the case of labor balancing, which requires the LMS system to have a real-time view of current resources and capacity. Seamless LMS integration into a connected infrastructure allows rapid calculation of opportunity costs and makes it easier for the system to redirect resources when and where they can provide the greatest benefit.

While some of the capabilities described in this article are relatively new, they’re already delivering significant competitive advantages to forward-looking DCs. From lower costs to higher throughput rates to more engaged workers, the evolution of data science and sophisticated machine learning is opening new frontiers for LMS systems.