News Column

Finding solutions with imperfect information

July 1, 2014

Sink, D Scott

The new ISE must solve problems in a world of big data, little data and poor data

In the summer of 1990, Sloan Management Review published a seminal article by Thomas Davenport and James Short titled "The New Industrial Engineering: Information Technology and Business Process Redesign." To paraphrase their introduction: those who endeavor to improve the way work is done need to apply the capabilities of information technology as integrated with business process and workflow. Business process design and re-engineering are natural partners, and yet industrial and systems engineers never have fully exploited their relationship. Davenport's assertion may well be as valid today as it was in 1990, which is surprising. Slowly but surely, however, industrial and systems engineering (ISE) is moving in this direction.

We can put this article in context by recalling that in the same year, Michael Hammer, formerly of MIT, published an article titled "Re-engineering Work: Don't Automate, Obliterate" in Harvard Business Review. These articles reflected the previous 10-plus years of heightened awareness that the United States was behind the rest of the world in the areas of quality and productivity. A decade before, on June 24, 1980, NBC aired "If Japan Can, Why Can't We?" That program sparked a decade of work by academia, business and industry in the area of total quality management (TQM), as well as the Malcolm Baldrige National Quality Improvement Act of 1987. W. Edwards Deming also played a vital role in the 1980s, providing leadership in the areas of principles, values and methods, highlighted in his outstanding book Out of the Crisis.

By 2000, lean and Six Sigma were supplanting or augmenting TQM, and we saw an evolution and maturation in deployment designs and executions. James P. Womack wrote about the Toyota Production System. Eli Goldratt wrote about the theory of constraints. Motorola and GE led the way toward Six Sigma. These are just the major thought leaders or corporate leaders for lean and Six Sigma, as many more were involved in the evolution of this movement.

By the early 2000s, it became increasingly clear that lean (elimination of waste, improvement in flow, constant search to eliminate bottlenecks, etc.) and Six Sigma (reduction in variation, improvement in quality, elimination of defects, etc.) made the most impact when they were integrated. Thus we began to see integrated lean Sigma emerge and evolve in terms of training and certification.

This need for integration became clear to me in 2005 at MDS when we made the move to improve performance in our diagnostic lab service business. We started with our largest and most complex lab, in Toronto, bringing in a lean specialist to coach us through the process. When we were done, things had improved, but a lot of improvement was left on the table. What remained could be called Six Sigma and ISE work. That experience convinced me to go forward with designing and deploying our program with an integrated approach, which served us well from 2005 to 2007. These days, the integration of lean and Six Sigma is the norm.

In addition to integrating lean and Six Sigma, another critical success factor is the need for data and facts to improve or re-engineer key processes, and that's the focus of the next section.

Integrated lean Sigma, data and facts

Basically, integrated lean and Six Sigma are the scientific method tailored to process improvement and/or design. As such, it's not surprising that they have a critical need for data and facts. Davenport's subtitle, "Information Technology and Business Process Redesign," portends the shift that has been felt in the world of ISE. As is the case with all engineering disciplines, ISE always has been based on hard data. But as Davenport and others have noted, the ability to get the right data, and then do the right things with it to improve business processes, is not trivial.

Since 2007, a strong shift and pull have lured ISE graduates into finance, healthcare, retail services and insurance. This has been driven by the recognition that business process redesign/re-engineering is critical to success, growth and survival, as well as the recognition that engineers (ISEs specifically) have foundational knowledge that allows them to create high value in these endeavors. When industrial and systems engineers get lean and Six Sigma principles, theory, methods and thinking on top of their already solid ISE core curriculum in operations research, manufacturing and production systems, human factors engineering, and, in some departments, management systems engineering, then they are exactly what Davenport suggested: the new industrial and systems engineers.

So how has the evolution of IT enablement since 1990 driven more integration between process improvement (how work is done) and what we have to work with from a data and fact perspective?

One of Davenport's major themes was the role of information technology in business process redesign and, of course, how ISEs should be naturals at integrating those two. It is interesting to note that also in 1990, the technology research company Gartner Inc. reportedly used the acronym ERP (enterprise resource planning) for the first time as an extension of material requirements planning (MRP).

ERP has come to represent the evolution of application and database integration beyond manufacturing. At a high level, ERP systems have normative workflows as their basis and have IT-enabled those workflows through data capture, data analysis, information portrayal, decision support, etc. ERP systems have modules that essentially represent business processes (as Davenport discusses). High-level examples of modules are customer relationship management, supply chain management, manufacturing execution system, human capital management, procurement, financial management and business intelligence. These are extremely high-level workflows. However, most ISEs working on improvement projects would be breaking down processes that are many layers below this level of process architecture.

The point is this: The evolution of ERP systems during the last 24 years has created more maturity and sophistication in the ability to capture data, hence part of the buzz around "big data." And the past 24 years have brought together a number of important streams of development. This is the context within which the new ISE is emerging and why management systems engineering and analytics will play an increasingly important role in our profession.

Management systems engineering, as opposed to engineering management, is quite simply the engineering of management systems. To be more specific, let's use a conceptual model developed by Virginia Tech's Harold Kurstedt. Figure 1 is a simplified version of Kurstedt's management systems model. Let's decompose the model.

1. The value stream: Note that the business processes (the system) that industrial and systems engineers are working to improve are listed at the center, across the bottom of Figure 1. In integrated lean Six Sigma terms, this would be depicted as our SIPOC diagram and value stream maps.

2. The data capture system: The data-to-measurement interface represents the capture of data and facts. Sometimes this is automated and integral to an ERP system, but often, ISEs in the field find that some of the more important data and facts from key control points are not captured or saved.

3. The data/fact storage and processing system: The upper right bubble represents where the data is warehoused, retrieved and analyzed to yield information that can transform processes.

4. The data/fact portrayal system: After analysis, the data must be portrayed with some type of usable interface, which feeds into the bubble occupied by the stakeholders.

5. The decision-making system: This is where the data, facts and information are finally converted into decisions and actions aimed at controlling the "system" and improving it. The decisions to take action are made, and the study, plan, do, adjust (SPDA) cycle is completed.

6. New ISE role: One role for the new ISEs is to engineer improved management systems and apply this model to their organizations. More on this a little later.
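The loop traced by these six elements can be sketched as a toy data pipeline. The function and field names below are hypothetical, invented purely for illustration; Kurstedt's model is conceptual and prescribes no particular implementation.

```python
# A minimal sketch of the management systems loop in Figure 1.
# All names are hypothetical; the numbers track the model's elements.

def capture(value_stream_events):
    """(2) Data capture: pull measurements from process control points."""
    return [e["cycle_time"] for e in value_stream_events]

def store_and_analyze(measurements):
    """(3) Storage/processing: warehouse the data and compute statistics."""
    return {"mean_cycle_time": sum(measurements) / len(measurements),
            "n": len(measurements)}

def portray(analysis):
    """(4) Portrayal: render the information in a decision-ready form."""
    return {"text": f"Mean cycle time: {analysis['mean_cycle_time']:.1f} min "
                    f"(n={analysis['n']})",
            "mean_cycle_time": analysis["mean_cycle_time"]}

def decide(portrayal, target=10.0):
    """(5) Decision-making: convert information into action on the system."""
    return "improve" if portrayal["mean_cycle_time"] > target else "hold"

# (1) The value stream itself, reduced to simple event records.
events = [{"cycle_time": t} for t in (12.0, 9.5, 11.0)]
action = decide(portray(store_and_analyze(capture(events))))
```

The engineering question for the new ISE is which measurements the capture step should pull from which control points in the value stream, which is exactly the granularity issue that surfaces when data is poor or missing.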

Big data, poor data

Gartner Inc. consistently has placed the phenomenon of big data at the peak of its "hype cycle," with one to two years left of hype. The hype cycle, shown in Figure 2, is a conceptual framework for understanding how technologies move from initial invention to widespread practical application.

The basic pattern is pretty simple: Whenever a new abstraction or technology comes along, it usually gets hyped to the point of inflated expectations. When reality inevitably sinks in, there is widespread disillusionment because of unfulfilled promises. Then the technology/abstraction is reduced to practice and has an impact.

No one debates that we have exponentially more data at our disposal to use in many human endeavors. How we make productive use of it and work out of the trough of disillusionment is the challenge ahead. Jameson Toole made a presentation during a TEDx event at the University of Michigan that showed a great example of creative ways to process big data to extract insights and meaning. In "Big Data for Tomorrow," Toole takes massive amounts of data from his own cell phone and computer access and that of his significant other. He codes the information and then represents it as a stream of data. With some training, this visualization makes the data easier to translate, interpret and gain insights from, hence one step closer to productive use of all this data. This is a simple example of visualization developments that include software from Tableau, Business Objects Software and Microsoft Business Intelligence. These platforms will make it easier to extract insights and get closer to productive application of large sets of data.

But despite more than 20 years of ERP implementation, big data just isn't realistic at the operational level in most organizations. Businesses clearly are out of the trough of disillusionment with ERP, but not necessarily at the plateau of productivity. Even organizations with ERP systems must deal with DRIP (data rich and information poor), poor data, little data, wrong data and even no data.

My experience at MDS, where I was part of an Oracle install, and since then with sponsors who are installing, upgrading and using SAP, Oracle or other MRP and WMS type systems, has shown that ISEs will find that the data and facts required to solve many process improvement, lean or Six Sigma problems are deficient. Often this is a product of the measurement breakdown structure, a granularity issue.

When ERP systems are configured to capture data during normative, somewhat generalized workflows, the automated or manual inputs are defined as best they can be during what is often a chaotic process of installation. Much detailed thinking about granular data capture is missed, and it leaves organizations with insufficient data capture. ISEs can and should play a larger role in ERP configurations because they can ensure that design for lean and Six Sigma is integrated into the configuration. This ensures that a larger percentage of the data elements required for sustainable, continuous improvement are captured at the outset of the project. This will speed up the improvement process significantly and help the organization compete more effectively and efficiently.

Ending the DRIP

While we don't want to oversimplify this challenge, I have found that going back to basics is often the best approach. The management systems model is a great framework for showing the way forward when faced with data and fact issues.

When you engineer a management system, you build it in a prescribed, proper sequence, 1-5-4-2-3, using the numbers on the management systems model in Figure 1: Understand the system (1); understand stakeholder require- ments and the decision-making process (5); understand information portrayal requirements/cognitive engineering (4); design the data capture system (2); and finally develop the storage, retrieval and processing of the data system (3).

However, the use of an engineered management system follows more of a study, plan, do, adjust (SPDA) sequence, or 4-5-2-3-1. In using the system, we study visualizations (4) that yield insights and actions (5). We continue to capture data (2), continue to evaluate the performance of the system (3, 1), and start the cycle over again.

This concept of design sequence/process vs. execute sequence/process is often misunderstood. As a result, ISEs struggle to build effective measurement systems. Design has different requirements and sequences than what is needed to run and operate the system.

Step 2 is essentially the measurement and analysis plan. What data elements are required to solve the operations problem? Do they exist? If so, where? If not, what do we do? Often, again at a granular level, many data elements simply don't exist. Many control points in value streams are not measured at all, and those that are often are not measured in a systematic way.
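The gap-analysis core of such a plan can be sketched simply: compare the data elements a project requires against what the system actually captures, and assign a fallback source to each gap. The element names below are made up for the example.

```python
# Hedged sketch of a measurement and analysis plan's gap analysis.
# "system" means the ERP already captures it; otherwise fall back to
# asking people or observing the process directly.

required = {
    "changeover_time": "observe",            # fallback: direct observation
    "defect_count": "ask",                   # fallback: interview operators
    "queue_length_at_station_3": "observe",  # fallback: direct observation
}
available_in_erp = {"defect_count"}          # what the system captures today

plan = {element: ("system" if element in available_in_erp else fallback)
        for element, fallback in required.items()}
```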

There are three ways to get data: by asking, by observing or through system documentation and data. If we, in fact, had good, prevalent big data, then the third source would make our jobs easier. But again, I contend that even in systems with Oracle, SAP, et cetera, the right data is scarce or nonexistent.

This places a premium on ISEs who can be creative, skillful and innovative with asking, observing and using good, old-fashioned measurement systems planning, analysis and work measurement. Yet it is interesting to note that many universities are not beefing up work measurement courses or broadening their focus, breadth and depth. Instead, many departments are cutting them back, which is precisely the wrong move.

Training our ISEs in measurement systems planning and analysis to support their analytics work is essential. Industrial and systems engineers in the real world end up having to patch together ways to capture data. They must be inventive in terms of where to get data, how to bring it together and "lay it down" in a fashion that is analyzable. Only then can the analytics, which very likely will be the centerpiece of the new ISE, come front and center.

Framing analytics

Regardless of the quantity or quality of data that ISEs have to work with, the analytics process is the same. Figure 3, borrowed with permission from Intel, represents a nice, simple view of the analytics process.

The first thing to highlight is the foundational data role in the bottom half of the triangle in Figure 3. The activities above the red line are roles handled by analysts. The ideal model would have the roles separated, but ISEs often find that they have to do both roles - every row in the triangle. This breadth of requirements that ISEs must meet to be successful for their organizations is exactly what makes ISE so challenging, valued and interesting.

As far as the implications for our profession, industrial and systems engineering hasn't really changed all that much since the days of Gilbreth and Taylor. We still focus on performance improvement, which encompasses effectiveness, efficiency, quality, productivity, innovation, quality of work life, profitability, sustainability and resilience. Although the early focus was on processes, ISEs now have expanded their focus to business processes and systems. Clearly, IT enablement has changed drastically since the 1970s, as reflected by Davenport's article and the evolution of ERP systems.

But in many respects, the ISEs of today often are in a situation similar to the one I faced in 1974 at Kodak, when I had little data, no data, poor data, and the right data was hard to get. That required a lot of creative observing, interviewing and chasing down leads. The new ISE must be curious, inventive and unwilling to take no for an answer. These ISEs have to be willing to dig harder and deeper than others might. They have to get to know more stakeholders in the system and make friends with people in IT and finance. There often is, in fact, a lot of useful data out there, but you have to mine for it - not just press a button on an ERP system.

As mentioned earlier, the importance of being able to develop measurement and analysis plans is critical in the new ISE role. Discover what data and facts you need to solve the problem and improve performance. Build detailed plans for how to get it and use it. Review those plans with your core team. Make sure IT and finance people are on those teams because they often can bring data to the surface that you might have assumed didn't exist.

For our part, ISE educators must make sure that we factor in the whole triangle in Figure 3 as we add analytics to our program designs. My sense is that many analytics programs in ISE departments start out with a narrow focus. Some focus on computer science, others on visualization, others on data warehouse architecture. But we must make sure that the integrative, holistic nature of industrial and systems engineering shows up in our approach to blending analytics into our curriculum.

First, figure out what you have

When it comes to using big data, perhaps it's best for organizations to learn to walk before they run.

"You May Not Need Big Data After All" in the December 2013 Harvard Business Review reported that many investments in data scientists, data warehouses and data analytics software have not paid off - and it's possible they never will.

Seven case studies and interviews with executives at 51 companies revealed that few companies know how to analyze the information already in their operating systems. Without such analyses, they can't turn information into effective responses. Reeling in more information from the 2.4 quintillion digital data bits generated daily (according to Forbes) won't help develop those competencies.

Companies that are exceptions have adopted evidence-based decision-making, according to the researchers, and have developed four practices: "They establish one undisputed source of performance data; they give decision-makers at all levels near-real-time feedback; they consciously articulate their business rules and regularly update them in response to facts; and they provide high-quality coaching to employees who make decisions on a regular basis."

D. Scott Sink is director of the Integrated LeanSigma Certification Program in the Department of Integrated Systems Engineering at The Ohio State University. He spent 2000 to 2007 in the private sector in vice president of business process improvement roles for Exchange Solutions and MDS.


Source: Industrial Engineer
