
5.4 General Methodology-Related Considerations

5.4.1 Planning an Analytics Project

A critical success factor in technical projects, particularly where there is any element of exploration and discovery, is project planning. This is no different for analytics projects. In fact, when one adds the expectation of a usable outcome (i.e., a tested and implemented process coded in software, running on real data, complete with a user interface and full documentation, all while providing smashing insights and impactful results), the project risks and failure odds go up fast. As mentioned in the macro-methodology section, the macro-methods align nicely with project planning because they give a roadmap that equates to the high-level set of sequential activities in an analytics project. When macro- and micro-method planning are considered together, the skills and details of activities can be revealed, so that tasks can be estimated and dependencies identified. In fact, one of the traditional applications of network models taught to students of operations research is PERT (program evaluation and review technique)/CPM (critical path method), a micro-method that practitioners can apply to macro-methodology to help smoothly plan and schedule a complex set of related activities (see Ref. [14]).
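
As a concrete illustration, the following is a minimal sketch of the CPM forward pass over a toy activity network; the activities, durations, and dependencies are hypothetical and chosen only to mirror typical analytics project steps.

```python
# Minimal CPM forward pass on a hypothetical analytics-project network.
from graphlib import TopologicalSorter  # Python 3.9+ standard library

# activity -> (duration in days, set of predecessor activities)
activities = {
    "define_problem": (3, set()),
    "collect_data":   (5, {"define_problem"}),
    "build_model":    (7, {"collect_data"}),
    "verify_model":   (2, {"build_model"}),
    "write_report":   (4, {"build_model"}),
    "deliver":        (1, {"verify_model", "write_report"}),
}

# Forward pass: compute the earliest finish time of each activity.
graph = {act: preds for act, (_, preds) in activities.items()}
earliest_finish = {}
for act in TopologicalSorter(graph).static_order():
    duration, preds = activities[act]
    earliest_start = max((earliest_finish[p] for p in preds), default=0)
    earliest_finish[act] = earliest_start + duration

print("Project duration:", max(earliest_finish.values()), "days")
```

The forward pass alone gives the project duration; a matching backward pass would yield slack values, and the zero-slack chain of activities is the critical path.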

When there are expectations for a usable software implementation outcome, practitioners can augment their macro-methodology steps with appropriate software engineering steps. The software engineering requirements step is recommended for planning the desired functionality of the outcome, as well as usability needs and assumptions. In fact, complex technical requirements, such as integration into an existing operations environment, or perhaps data traceability for regulatory compliance, are best considered early, in requirements steps that complement the domain and data understanding steps.

Overall, while prototyping and rapid development often coincide with projects of a more exploratory nature, which analytics projects often are, some project planning and ongoing project management remain the best way to minimize the risks of failure, budget overruns, and outcome disappointment.

5.4.2 Software and Tool Selection

Most, if not all, of our analytics projects need some computational support in the form of software and tools. Aside from DIY software, which is sometimes necessary when new methods or new extensions are developed for a project, most micro-solution methods are available in the form of commercial and/or open-source software.

Without intending to endorse any specific software package or brand, a few packages are named here to illustrate appropriate choices, while leaving it to the reader to decide which packages best fit their specific project needs.

For (Group I) exploration, discovery, and understanding methods, popular packages include R, Python, SAS, SPSS, MATLAB, MINITAB, and Microsoft EXCEL. Swain [64] provides a very recent (2017) and comprehensive survey of statistical analysis software, intended for the INFORMS audience. Most of these packages also include GLM, factoring, and clustering methods needed to cover (Group III) data-dependent methods, as well.
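
To make the Group I tooling concrete, here is a minimal exploratory sketch in Python using only widely available open-source libraries (NumPy and scikit-learn); the two-segment data set is synthetic and purely illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(seed=42)
# Two synthetic "segments" in a 2D feature space.
segment_a = rng.normal(loc=(0.0, 0.0), scale=0.5, size=(100, 2))
segment_b = rng.normal(loc=(3.0, 3.0), scale=0.5, size=(100, 2))
data = np.vstack([segment_a, segment_b])

# Quick descriptive summary, then an exploratory clustering pass.
print("mean:", data.mean(axis=0), "std:", data.std(axis=0))
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(data)
print("cluster sizes:", np.bincount(labels))
```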

For (Group II), a fairly recent survey of simulation software, again by Swain [65], and a very recent linear programming software survey by Fourer [66] are resources for selecting tools to support these methods, respectively. An older but still useful nonlinear programming software survey by Nash [67] is also a resource for practitioners. MATLAB, Mathematica, and Maple continue to provide extensive toolboxes for nonlinear optimization needs. For Branch and Bound, the IBM ILOG CPLEX toolbox is freely available to academic researchers and educators. COIN-OR, Gurobi, GAMS, LINDO, AMPL, SAS, MATLAB, and XPRESS all provide various toolboxes across the optimization space. More and more, open-source libraries for specific languages, such as Python, offer tools that are ready to use; for example, StochPy is a Python library addressing stochastic modeling methods.
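
To illustrate how little scaffolding such tools require, here is a minimal sketch that solves a small product-mix linear program with the open-source SciPy stack; the numbers are a classic textbook-style example, used here purely for illustration.

```python
from scipy.optimize import linprog

# Maximize 3x + 5y subject to x <= 4, 2y <= 12, 3x + 2y <= 18, x, y >= 0.
# linprog minimizes, so the objective coefficients are negated.
result = linprog(
    c=[-3, -5],
    A_ub=[[1, 0], [0, 2], [3, 2]],
    b_ub=[4, 12, 18],
    bounds=[(0, None), (0, None)],
    method="highs",
)
print("optimal (x, y):", result.x, "optimal value:", -result.fun)
```

The optimum here is x = 2, y = 6 with value 36, which is easy to confirm by hand, a point that leads directly into the verification note below.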

As a final note, practitioners using commercial or open-source software packages for analytics are encouraged to use them carefully within a macro-solution methodology. In particular, verification, that is, testing to make sure the package provides correct results, is always recommended.
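
In that spirit, verification can be as simple as a unit-test-style check that a packaged routine reproduces a result worked out by hand. The sketch below, using hypothetical noise-free data, tests NumPy's polyfit against a line whose slope and intercept are known exactly.

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0  # exact data on the known line y = 2x + 1

# polyfit returns coefficients from highest degree down: [slope, intercept].
slope, intercept = np.polyfit(x, y, deg=1)
assert abs(slope - 2.0) < 1e-9 and abs(intercept - 1.0) < 1e-9
print("verification passed: slope =", slope, ", intercept =", intercept)
```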

5.4.3 Visualization

Visualization has always been important to problem-solving. Imagine in high school having to study analytical geometry without 3D sketches of cylinders. Similarly, operations research has a strong history of illustrating concepts through visualization. Some examples include feasible regions in optimization problems, state space diagrams in stochastic processes, linear regression models, various forms of data plots, and network shortest paths. In today's world of voluminous data, sometimes the best way to understand data is to visualize it, and sometimes the only way to explain results to an executive is to show a picture of the data and something illustrating the "solution."
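
As a small illustration of showing "the data and the solution" in one picture, the following sketch plots synthetic noisy observations together with a fitted regression line using matplotlib; all data here is invented for the example.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(seed=1)
x = np.linspace(0, 10, 50)
y = 1.5 * x + 2.0 + rng.normal(scale=1.0, size=x.size)  # noisy observations

slope, intercept = np.polyfit(x, y, deg=1)  # the fitted "solution"

plt.scatter(x, y, s=15, label="raw data")
plt.plot(x, slope * x + intercept, color="red", label="fitted model")
plt.xlabel("x")
plt.ylabel("y")
plt.legend()
plt.title("Data and fitted solution in one view")
plt.show()
```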

Other chapters in this book cover the topic of analytics and visualization, for example, see Chapters 3 and 6. The following points regarding visualization from a solution methodology perspective are provided in order to establish a tie with the methods of this chapter:

Analytics and OR researchers and practitioners should consider visualizations that support understanding of raw data, understanding of transformed data, illumination of process and method steps, and presentation of solution outcomes.

Visualization in analytics projects has three forms, which are not always interchangeable (a brief sketch contrasting them follows the list):

1) Exploratory, that is, the analyst needs to design quick visualizations to support their exploration and discovery process. The visualizations may help to build intuition and give new ideas, but are not necessarily of "publish" or "presentation" quality.

2) Presentation, that is, the analyst needs to design visualizations as part of a presentation of ideas, method steps, and results to sponsors, stakeholders, and users.

3) Publishing, that is, the analyst wants to design figures or animations that will be published or posted and must be of suitable quality for archival purposes.
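
The sketch below contrasts the first form with the latter two in matplotlib terms; the file name, styling choices, and data are all hypothetical.

```python
import numpy as np
import matplotlib.pyplot as plt

data = np.random.default_rng(seed=7).normal(size=500)

# 1) Exploratory: one quick call, default styling, nothing saved.
plt.hist(data)
plt.show()

# 2)/3) Presentation or publishing: labeled, titled, high-resolution output.
fig, ax = plt.subplots(figsize=(6, 4))
ax.hist(data, bins=30, color="steelblue", edgecolor="white")
ax.set_xlabel("Observed value")
ax.set_ylabel("Frequency")
ax.set_title("Distribution of observations")
fig.savefig("figure_for_archive.png", dpi=300, bbox_inches="tight")
```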

5.4.4 Fields with Related Methodologies

Many disciplines are using analytics in research and practice. As shown in the macro-methodology section summary, all macro-methodologies are derivatives of the scientific method. In fact, many of our micro-solution methodologies are shared and used across disciplines. As a community, we benefit from and have influenced shared methods with the fields of science, engineering, software development and computer science (including AI and machine learning), education, and the newly evolving discipline of data science. This cross-pollination helps macro- and micro-solution methodologies to stay relevant.

5.5 Summary and Conclusions

This chapter has presented analytics solution methodologies at both the macro- and microlevels. Although the chapter makes no claim to cover all possible solution methodologies comprehensively, hopefully the reader has found it to be a valuable resource and a thought-provoking reference to support the practice of an analytics and OR project. The chapter's goals were to clarify the distinction between macro- and micro-solution methodologies, to provide enough detail for a practitioner to incorporate these methodologies into the design of a high-level analytics project plan according to some macro-level solution methodology, and to provide guidance for assessing and selecting micro-solution methodologies appropriate for a new analytics project; hopefully these goals have come through in the earlier sections. In addition to a few pearls scattered throughout the chapter, we conclude by stating that solution methodologies can help the analytics practitioner, and can help that practitioner help our discipline at large, which can then help more practitioners. That is a scalable and iterative growth process, accomplished by reporting our experiences at conferences and through peer-reviewed publication, which often forces us to organize our thoughts in terms of methodology anyway, so we might as well start with it too! The main barriers to adopting solution methodology seem to be myths, and dispelling some of those myths is the subject of these final few paragraphs.

5.5.1 "Ding Dong, the Scientific Method Is Dead!" [68]

The scientific method may be old, but it is not dead yet. By illustrating its relationship to several macro-solution methodologies in this chapter, we've shown that the scientific method is indeed alive and well. Arguments to use it literally may be futile, however, since the world of technology and analytics practice often places time and resource constraints on projects that demand quick results. Admittedly, it is quite possible that rigor and systematic methodology could lead to results that are contrary to the "desired" outcome of an analytics study. Thus, our field of practice may be inadvertently missing the discovery of truth and its consequences.

5.5.2 "Methodology Cramps My Analytics Style"

Imagine for a moment that analytics practitioners used systematic solution methodologies to a greater extent, particularly at the macrolevel, and then published their applied case studies following an outline that detailed the steps they had followed. Our published applied literature could then be a living source of experience and practice to emulate, not only for learning best practices and new techniques, but also for learning how to apply and perfect the old standards. More analytics projects might be done faster because they wouldn't have to "start from scratch" and reinvent a process of doing things. Suppose that analytics practitioners, in addition to putting rigor into defining their problem statements, also enumerated their research questions and hypotheses in the early phases of their projects. Would we publish experiences that report rejecting a hypothesis? Does anyone know of even one published science research paper that reports rejecting a hypothesis, let alone one in the analytics and OR/MS literature?

Research articles on failed projects rarely (probably never) get published, and these could quite probably be the valuable missing links to helping practitioners and researchers in the analytics/OR field be more productive, do higher quality work, and thrive by learning from studies that show what doesn't work. When authentically applied, the scientific method should result in a failed hypothesis every once in a while, reflecting the true nature of exploration and the risks we take as researchers of operations and systems. The modern deluge of data allows us to inquire and test our hunches systematically, without the limitations and scarcity of observations we faced in the past. Macro-solution methodologies, whether the scientific method or any derivative of it (which is just about all of them), could relieve analytics project cramps not only by giving us efficient and repeatable approaches but also by recognizing that projects sometimes "fail" or fail to reject a null hypothesis; doing so within the structure of a methodology allows the result to be reported in an objective, thoughtful manner that others can learn from and that can help practitioners and researchers avoid reinvention.
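
To ground this in something concrete, here is a minimal hypothesis-testing sketch using scipy.stats; the two samples are synthetic and drawn from the same distribution, so the test will usually fail to reject the null hypothesis, exactly the kind of "negative" result argued above to be worth reporting.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=3)
# Two synthetic processes that, unknown to the analyst, behave identically.
process_a = rng.normal(loc=10.0, scale=2.0, size=50)
process_b = rng.normal(loc=10.0, scale=2.0, size=50)

# H0: the two processes have equal means.
t_stat, p_value = stats.ttest_ind(process_a, process_b)
alpha = 0.05
print(f"p = {p_value:.3f}:",
      "reject H0" if p_value < alpha else "fail to reject H0")
```

Enumerating H0 and the significance level before looking at the data is precisely the kind of early-phase rigor the preceding paragraphs advocate.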

5.5.3 "There Is Only One Way to Solve This"

We've all heard the saying, if all you have is a hammer, then every problem looks like a nail. This general concept, phrased in a number of different ways since it was first put forward in the mid-1960s, is credited to Maslow [69], who authored the book The Psychology of Science. In our complex world, there are usually many alternate ways to solve a problem. These choices, in analytics projects, may be listed among the micro-methodology techniques described in this chapter or elsewhere. Sometimes there are well-established techniques that work just fine, and sometimes a new technique needs to be designed. The point is that there are many ways to solve a problem, even though many of us tend to resort first to our favorite ways, because those tend to align with our personal experiences and expertise. That's not a bad approach to project work, because experience usually means that we are using other knowledge and lessons learned. However, behind this lies the danger of using the wrong micro-solution methodology. In fact, an ill-defined problem can lead to overreliance on certain tools, often the most familiar ones. What does this mean? That in our macro-solution methodology, steps such as understanding the business and data, defining the problem, and stating hypotheses are useful in guiding us to which micro-methodologies to choose from, and thus in avoiding the potential pitfalls of picking the wrong micro-method or overusing a solution method.

5.5.4 "Perceived Success Is More Important Than the Right Answer"

In math class, school teachers might make a grading key that lists the right answer to each exam or homework problem. In practice, however, there is no solutions manual or key for checking whether an analytics project outcome is right or wrong. We have steps within various macro-solution methodologies, for example, verification, that help us make the best case for the outcome being considered "right," but for the most part, the correctness of an analytics project outcome is elusive, and projects are usually judged by the perceived results of the implementation of a solution. In analytics and OR practice, there are cases where implementation results judged wildly successful by one measure were judged failures by another. For example, an analytics/OR project recognized as an INFORMS Edelman award finalist for its contribution to a company's savings of over $1 billion might nevertheless be judged unsuccessful because the company that created the OR solution was unable to commercialize the assets and find practitioners in its ranks to learn and deploy them, and thus could not reproduce the solution as a profitable product (see, for example, Ref. [70]).

Documentation of the reasons for analytics project failures probably exists, but it is rarely reported as such. Plausible reasons for failure (or, perhaps more accurately, "lack of perceived success") include the following: the solution was implemented, but there was no impact, or it was not used; a solution was developed but never implemented; a viable solution was not found; and so on. Because of the relationship between analytics projects and information technology and software, some insights can be drawn from those more general domains. Reference [71] provides an insightful essay on why IT projects fail that is loaded with examples and experiences, many with analogues and wisdom transferable back to analytics. Software project failures have been studied in the software engineering community for over two decades, with various insights; see, for example, Ref. [72]. The related area of systems engineering offers good general practices and a guide to systematic approaches: one of the most recognized texts for the field of industrial engineering is by Blanchard and Fabrycky, now in its fifth edition [73].

It is important to remember that in practice, the ultimate perceived success or failure of an analytics project may not mean "finding the right answer," that is, finding the right solution. By perceived success, we mean that an analytics solution was implemented to solve a real-world problem with some meaningful impact acknowledged by stakeholders. Conversely, perceived failure means that, for one of a number of reasons, the project was deemed not successful by some or all of the stakeholders. Not unlike some micro-solution methodologies of classic operations research, we have necessary and sufficient conditions for achieving success in an analytics project, and they seem to be related to perception and quality. Analytics practitioners need to judge these criteria for their own projects, while perhaps keeping in mind that there have been well-meaning and not-so-well-meaning uses of data and information to design perceptions and influence. See, for example, How to Lie with Statistics by Darrell Huff [74] and the more contemporary writing in a similar vein, How to Lie with Maps by Mark Monmonier [75].

The book How to Lie with Analytics has not been written yet, but unfortunately its subject is likely already practiced. By practicing some form of systematic solution methodologies, macro and micro, in our analytics projects, we may help our field to form an anchoring credibility that is resilient when that book does come out.

