There has been a lot of buzz over the past couple of years about the new role of the data scientist in organizations. A recent article in the Harvard Business Review touts the role of data scientist as the “sexiest job of the 21st century.” And what exactly are the key attributes of this role? According to the article, this person is a “data hacker, analyst, communicator, and trusted adviser.” However, as we laypeople in the business and technology world know, you can build the most elegant solution to a problem, but if you can’t embed it efficiently within a business process, it is likely to have less impact. In fact, I think a better term for this role is “data artist.”
But the data artist doesn’t live in a vacuum. As competitive pressures increase the need for organizations to master analytics, internal analytic teams have increased their statistical sophistication, but are struggling to operationalize their insight. Many analytic groups find themselves managing production processes, while the demand for analytics, as well as the data volume and need for speed-to-insight, grow within the organization. A need emerges to balance the creative instincts of the scientist with the creation of a flexible analytic delivery framework that adapts and evolves to the unpredictability of innovation.
Over time, analytic processes can grow so complex that it becomes difficult to identify the root causes when the process breaks down, and a breakdown can set the team back days, if not weeks, as resources try to find and fix the problem. Since much of the work in the data mining process is knowledge based, the question becomes: What is the right framework to support the knowledge process?
Our data artists must be cognizant of the interdependencies between the design of their model and other components of the overall system. The model development and deployment process may span a number of functional or business areas, requiring the coordination of resources, data and technology across the organization. It’s a delicate balance: embracing formal controls requires adherence to a collective goal, often with pre-determined standards or processes. This may conflict with the creative aspects or methods inherent in the data discovery and model development process.
The Lean methodology has emerged as a way for insurer analytic groups to identify and streamline their workflow. Lean’s goal is to eliminate waste and improve efficiency: it provides a light toolkit for managing the analytic lifecycle, coordinating the elements (people, processes, data and technology) necessary to optimize the model development process from conception to deployment. The methodology gives organizations a way to drive value in their products and services (in this case, the analytic product) by allowing workers to perform work in the most efficient and effective way possible. Lean organizations do more with less (less effort, fewer resources, less time) while providing customers, defined as anyone downstream from a process, with maximum value. Following are examples of how the methodology can be applied:
• An analytic modeler in one organization spent six months performing the detective work necessary to define and understand the data needed to create a predictive model that would answer a key business question. The actual modeling work took only two weeks. In those first six months, the modeler had not only to identify multiple system and data owners across the organization, but also to set up meetings with each of them as he built out the data set for the model. The organization lacked the consistent data quality routines and metadata that would have reduced the need for this modeler to scour the company for knowledge and resources.
• Another organization developed a data model to support its new customer-focused analytic initiatives. Leveraging a just-in-time approach to development, the team identified high-value customer analytic activities in a use case format (such as customer retention, acquisition, next-best-offer, etc.) and phased in data delivery in support of these use cases. Subsequent iterations are driven by additional use cases, which in turn drive further development of the data model. This iterative approach ensures that the correct data, in the correct format, is available to support prioritized business-specific processes.
In one analytic process analyzed, there were multiple teams involved, more than 100 process steps, and a lead time from idea generation to execution that exceeded 100 days – and that’s if everything went as planned. An error in the workflow (which was rife with data quality issues, inconsistent quality control mechanisms and an insane number of handoffs between teams) could result in rework that set the process back two to three weeks. The time spent uncovering the problems reduced the capacity of the entire analytic team and delayed the output of the insight.
As you can imagine, the first step down the path to Lean is getting cross-functional stakeholders together and creating shared accountability. A critical success factor for Lean is horizontal process ownership, not vertical, functional ownership. The next step is to map out the current state of the process (also known as value stream mapping) and identify areas for improvement.
One analytic team applied Lean improvement techniques to an operational analytic process and got back five days per month. That’s five days the team could use to focus on new and possibly game-changing initiatives for their organization. By inserting a little process discipline into their workflow, the team was able to increase their creative capacity (and job satisfaction for data artists!). Are you thinking Lean?
About the Author: Rachel Alt-Simmons is senior industry consultant for Insurance at Cary, N.C.-based SAS. Alt-Simmons has driven business intelligence initiatives at Travelers and Hartford Life and has been Research Director for Life & Annuity at research firm TowerGroup.