Why Machine Learning is NOT the First Step to Full Automation

Learn more about the misconceptions of machine learning within the automation process.

The year was 2009 and change was coming! For nearly a decade prior, major innovations utilizing artificial intelligence were at the forefront of a technological revolution striving to automate our world. These milestone achievements in technology were accomplished by teaming process knowledge and subject matter expertise with software developers to create targeted applications to improve efficiency, reduce errors, and automate processes. This was one of the pathways to full automation, and it was a successful avenue that produced technologies such as the autopilot software the airline industry uses today as well as a plethora of military and government related technologies for defense application purposes–among many others.

However, a shift occurred around 2010 that changed the way process automation would be pursued, and it delayed the path to full automation for industry functions, especially those requiring analytical human logic and intervention. By year two of the recession, companies had started cutting costs. They restructured their workforces by removing many “non-essential” employees, including subject matter experts, process gurus, and continuous improvement practitioners. Organizations also phased out pension plans, force-retired long-tenured employees (or offered them appealing retirement packages), leaned out their existing analytical-focused positions, and scaled back on benefits, eroding far more than company loyalty. Consequently, process knowledge and subject matter expertise were significantly diminished, if not fully eliminated. Organizations came to believe that this valuable logic, experience, and insight could be replaced externally: through the “fresh sets of eyes” of new hires, by promoting hourly and/or floor-level employees into analytical roles, and by incorporating machine learning.

As a result, companies moved toward top-down software applications while centralizing a large portion of their data analysis activities. The previously successful function-level, bottom-up approach to software development, in which the people who understood the work collaborated with the software engineers and programmers to automate it, was all but abandoned as a relic. Organizations chose short-term cost-cutting returns over long-term success.

Here are a couple of examples of this “shift” observed during our team’s time in industry:

By 2011, one Fortune 500 automotive manufacturer had laid off the majority of its material planners and forced the existing production schedulers to absorb the workload with limited training and little process knowledge sharing. The decision failed so badly that the company had to refill the planning positions a couple of years later. Those new planners, however, were external hires who required a significant onboarding period to learn and develop, in the midst of an ongoing recession and with minimal process support from the organization, because the company had laid off its subject matter experts two years prior.

Two other Fortune 500 manufacturers, in the appliance and consumer goods industries, decided that promoting floor-level hourly employees into planning and scheduling positions was a quick fix to resolve the process knowledge gap and save money. Today, across all sites, approximately 60-80% of these analytical positions are filled by employees who may or may not have the skillset, education, training, and/or analytical prowess to perform the work effectively. Due to the resulting inefficiencies, these departments have grown to nearly twice the size they were ten years ago, yet they continue to struggle to be successful (as seen during the pandemic).

Computational technology also made a shift in 2010. Instead of continuing to pair process knowledge and subject matter expertise with software development to create targeted automation solutions, software engineering moved toward a more standalone existence, which fueled the rapid mainstream emergence and growth of machine learning. The growing dependence on machine learning was locked into place largely by the cutbacks organizations implemented during the recession. The Oxford dictionary defines machine learning as “the use and development of computer systems that are able to learn and adapt without following explicit instructions, by using algorithms and statistical models to analyze and draw inferences from patterns in data.” However, as Industry 4.0 initiatives become a top priority for most companies, with process automation an important element of Industry 4.0, machine learning has had a difficult time keeping up with expectations, especially on timelines.

In April of 2018, Elon Musk said on Twitter, “Yes, excessive automation at Tesla was a mistake. To be precise, my mistake. Humans are underrated.” Machine learning was a primary aspect of Tesla’s automation plan, and it proved a difficult one with less-than-optimal results. This is because the whole premise of machine learning revolves around a computer “using algorithms and statistical models to analyze and draw inferences from patterns in data.” So what happens when the quality of data is poor, the process is not well defined, the process knowledge provided to the computer is limited (no situational awareness), and/or the implementation of machine learning is complex and painfully slow? It sounds like a recipe for failure, and organizations have collectively invested enormous sums in the technology without seeing the return on investment they hoped for.
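The failure mode described above can be sketched in a few lines. The toy classifier below “learns” one centroid per class purely from patterns in the data; the two clusters and the -999 missing-value sentinel are illustrative assumptions, not anyone’s production system. When a share of readings goes missing and is silently stored as a sentinel, the learned model collapses:

```python
import random

random.seed(7)

def make_data(n, missing_rate=0.0):
    """Two 1-D sensor clusters: class 0 near 0.0, class 1 near 5.0.
    missing_rate = fraction of readings lost and recorded as -999,
    a common 'poor data quality' pattern in enterprise systems."""
    data = []
    for _ in range(n):
        label = random.choice([0, 1])
        x = random.gauss(0.0 if label == 0 else 5.0, 1.0)
        if random.random() < missing_rate:
            x = -999.0  # missing value silently stored as a sentinel
        data.append((x, label))
    return data

def train_centroids(data):
    """'Learn' one centroid per class -- inference purely from data patterns."""
    sums = {0: 0.0, 1: 0.0}
    counts = {0: 0, 1: 0}
    for x, y in data:
        sums[y] += x
        counts[y] += 1
    return {y: sums[y] / counts[y] for y in (0, 1)}

def accuracy(centroids, test_set):
    """Score each point against its nearest learned centroid."""
    hits = sum(
        1 for x, y in test_set
        if min(centroids, key=lambda c: abs(x - centroids[c])) == y
    )
    return hits / len(test_set)

test_set = make_data(2000)  # clean data for evaluation
clean = accuracy(train_centroids(make_data(2000)), test_set)
dirty = accuracy(train_centroids(make_data(2000, missing_rate=0.3)), test_set)
print(f"trained on clean data:     {clean:.2f}")
print(f"trained on corrupted data: {dirty:.2f}")
```

The model trained on clean data separates the two classes almost perfectly, while the one trained on corrupted data does little better than a coin flip: the sentinel values drag both centroids far from the real clusters. Explicit validation of the inputs would have caught the bad values before any “learning” took place.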

Make no mistake, the Perfect Planner Team believes machine learning is a pivotal component of artificial intelligence and process automation, and it has a dual purpose. First, machine learning is powerful when a process cannot be, or has not yet been, fully defined with what is known today; facial recognition technology and planet-hunting telescopes are two great innovations born of this purpose. Second, machine learning complements the automation process at the end of the journey, when sustainment and continuous improvement become the focus. However, three fundamental steps should come before artificial intelligence and machine learning are applied to the software development process (which our team considers the fourth step). These steps are what we call the Straight to Execution Approach, and they provide the foundational process automation elements required to transition smoothly into artificial intelligence and machine learning pairing, ultimately resulting in a quick and effective automation outcome.

THE STRAIGHT TO EXECUTION APPROACH

The Straight to Execution Approach is a four-step process to full automation designed by the Perfect Planner Team, founded on the principles and theories of how software was developed before 2010. It has proven highly successful at multiple Fortune 500 companies. For instance, one Fortune 500 appliance manufacturer used this approach over the course of two years after the pandemic impacted supply chain efficiency and operational resources (especially manpower). More than 30 process automation solutions were created in 2020 and 2021, and these organization-approved best practices affected eight core functions across three departments. Four of those functions were automated between 60% and 70%, and the results were achieved using only the first three steps listed below; machine learning and robotics were not applied. The first three steps in the Straight to Execution Approach have now paved the way for full automation success when the time is right.

Step 1: A Process Must Come First

A function-specific, comprehensive process covering all tasks and actions required for the job, prioritized by criticality, is necessary to standardize the role across industries. The very best software has well-defined requirements, so those requirements (i.e., the process) need to be provided first. In the past, a significant part of a software budget went to understanding and defining the requirements.

As stated previously, for the past 10 to 12 years companies have essentially purged process knowledge and subject matter expertise. In their place, machine learning was inserted with varying results. Our belief is that the normalization of work across industries leading to the full automation of functions cannot occur without well-defined and proven processes. As the old Six Sigma adage asserts, “Good processes equal good results!”

Step 2: Data Accuracy & Situational Awareness are Essential

The difficult-to-find-and-quantify process knowledge extends beyond defining the requirements and standardizing the process. There must be data integrity for either a human or a computer to do the job without error, and using machine learning without data accuracy will undoubtedly hinder automation progress. In addition, from our team’s extensive experience with enterprise system data, typically only 50-60% of situational elements are visible in the data. That data gap leaves a machine learning algorithm with a diminished environment in which to assess and apply situational-awareness logic to its calculated results, further impacting accuracy. The only way we have found to consistently overcome the data integrity and situational awareness conundrum is through coded logic, and we believe this is the “missing link” to full automation.
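A minimal sketch of what coded logic can look like, using a hypothetical material-planning record (the field names, supplier codes, and bounds are illustrative assumptions): explicit rules encode both data-integrity checks and situational knowledge that the raw data alone does not reveal.

```python
from dataclasses import dataclass

@dataclass
class PlanningRecord:
    """Hypothetical material-planning record from an enterprise system."""
    part_number: str
    on_hand_qty: int
    lead_time_days: int
    supplier: str

# Explicit, codified process rules -- the situational knowledge that
# rarely appears in the raw data itself (values assumed for illustration).
SUPPLIERS_ON_SHUTDOWN = {"ACME-02"}   # known plant shutdown
MAX_REASONABLE_LEAD_TIME = 120        # integrity bound from the process

def validate(record):
    """Return a list of issues found by coded logic; empty if clean."""
    issues = []
    if record.on_hand_qty < 0:
        issues.append("negative on-hand quantity: data integrity error")
    if not (0 < record.lead_time_days <= MAX_REASONABLE_LEAD_TIME):
        issues.append("lead time outside process-defined bounds")
    if record.supplier in SUPPLIERS_ON_SHUTDOWN:
        issues.append("supplier on shutdown: adjust plan (situational rule)")
    return issues

records = [
    PlanningRecord("P-100", 40, 14, "ACME-01"),
    PlanningRecord("P-200", -5, 14, "ACME-01"),
    PlanningRecord("P-300", 10, 900, "ACME-02"),
]
for r in records:
    print(r.part_number, validate(r) or ["OK"])
```

Each rule here is ordinary, auditable code rather than an inferred pattern, which is what lets it catch both bad data and situational exceptions that never appear in the data at all.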

Step 3: Make the Output Structured & Easy to Navigate

The most straightforward way to instruct someone on what needs to be accomplished is a checklist. If steps one and two have been completed with a high level of confidence, then this step is the “making it easy” step. For full automation to succeed, the user output needs to be ranked, listed, and provided to the human or computer in an executable form. Regardless of who or what completes the work, the presentation of executables needs to be linear, structured, and easy to navigate.
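As a sketch of this step, the snippet below turns a handful of hypothetical tasks (the actions and criticality scores are assumptions for illustration) into a single ranked, numbered checklist, the same executable form whether a person or a program works it:

```python
# Illustrative tasks with process-assigned criticality scores (assumed).
tasks = [
    {"action": "Expedite part P-200 from supplier", "criticality": 9},
    {"action": "Confirm receipt of inbound order",  "criticality": 4},
    {"action": "Resolve negative on-hand balance",  "criticality": 10},
    {"action": "Review slow-moving inventory",      "criticality": 2},
]

def build_checklist(tasks):
    """Rank by criticality (highest first) and number the steps linearly."""
    ranked = sorted(tasks, key=lambda t: t["criticality"], reverse=True)
    return [f"{i}. {t['action']}" for i, t in enumerate(ranked, start=1)]

for line in build_checklist(tasks):
    print(line)
```

The output is deliberately flat: one ordered list, no branching, so a person can work it top to bottom today and an automated executor can consume the same structure later.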

Step 4: Apply AI & Machine Learning for Full Automation & Further Refinement

Now that the function has achieved up to 75% automation through the successful completion of steps one through three, the time is right to apply artificial intelligence and machine learning to finish the full automation journey.

 

Author: Thomas Beil

Publication Date: March 15, 2022

© Copyright 2023 Perfect Planner LLC. All rights reserved.
