Oct 07, 2013

When taking on an analytics project or designing a reporting system (dashboard or otherwise), a core component of superior execution is properly understanding the question(s) the vehicle is expected to answer.  This may seem obvious, but it is amazing how often metrics are chosen for convenience rather than impact. Additionally, dashboards and reports are often (at least initially) put together by individuals with little training in design and business reporting. A monkey can make a graph, but it takes a bit of thought and planning to make something impactful.  I would argue that the state of business intelligence in general suffers from this issue – people undervalue the opportunity to use data to make great business decisions because they have learned that the data available to them is not useful for doing so. Instead of insisting that metrics and reports be useful for business decisions, they write off the full potential of the data and go back to the inefficiencies of traditional gut-based decision making. What they fail to realize, or are not made aware of, is the wide variety of data available that is not being used. Empowering decision makers to utilize data is a core purpose of an analyst.

While assigning fault for the poor state of BI affairs isn’t particularly helpful, it’s worth noting that it is a systemic issue rooted in the past delivery of inferior metrics and reports coupled with limited decision-making timeframes. It is often compounded by general ignorance of what data is available. The analyst’s job is to right these wrongs by retraining the organization in how data is approached, assembled, and utilized for business decisions. That retraining begins with the most basic step in the analytics chain: defining the right questions.

Question definition is key because all further analytical planning stems from it. Question definition, while the job of the analyst, requires input from the consumers of the information. Interviewing those who will use the analyst’s output is key to deciding what any given analytical product will contain, including how many metrics are needed, in what format, and how frequently they will need to be updated.  The unsophisticated organization derives metrics by default, in a post-hoc manner, based on whatever data happens to have been collected. This approach is unlikely to have the impact of a carefully planned set of information tailored to business needs.

Additionally, some decision makers will believe they know exactly what data they want and need, and it is important that the analyst probe to make sure this is correct. Finding out what a particular metric will be used for, or why a specific piece of information is useful, can uncover unintended interpretation mistakes (e.g. when a unique ID is not indicative of a unique user due to cookie expiration).  It is safe to say that while business owners often do not understand the data used to create particular metrics, they often have strong opinions about what the metrics mean. It is the job of the analyst to educate and to redefine around these shortcomings. Furthermore, the analyst should be aware of the myriad data sources available for creating metrics, aiding the business owner through discovery. This is a major reason it is critical to get the BI consumer/business decision maker talking about the questions they are trying to answer rather than expecting them to rattle off a full list of needed metrics. The analyst defines the metrics based on the needs of the business owner. It is crucial that the analyst take an active, participatory role in the design stage rather than a passive “fast food order” style of design. You are a data educator – act like one.

In closing, there are a number of additional steps to defining the mechanism necessary for answering the questions settled upon. Some questions require multiple metrics, new data collection, derivative metrics (e.g. time between purchases, derived as last purchase timestamp minus previous purchase timestamp – a sketch of one such derived metric follows below), or a restatement of the question due to limitations in the available data. This layer of design lies between the initial question definition and the visual display of the data, but again is key to the successful execution of an analytic output. The analyst is an artist, a teacher, a designer, a data counselor, and a storyteller, as well as the one who knows the data. You can’t design an output mechanism if you don’t know what the input question is.
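To make the derivative-metric idea concrete, here is a minimal SQL sketch of “time between purchases”, assuming a hypothetical purchases(user_id, purchased_at) table; your schema and warehouse dialect will differ.

```sql
-- Hypothetical schema: purchases(user_id, purchased_at).
-- Pair each purchase with the same user's previous purchase
-- and subtract the timestamps to derive time between purchases.
SELECT
  user_id,
  purchased_at,
  purchased_at - LAG(purchased_at) OVER (
    PARTITION BY user_id
    ORDER BY purchased_at
  ) AS time_since_previous_purchase
FROM purchases;
```

Note that each user’s first purchase yields a NULL – itself a design decision the analyst should surface rather than hide.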

Question → translates to data → summarized by metrics → interpreted by analysis → converted to display → answers or informs question

May 13, 2013

There are a number of often unspoken jobs that the analyst must perform in order to be useful, helpful, and thereby successful.  These include being a question definer, a translator, a data expert, a storyteller, and an anticipator of post-analysis questions. Without approaching a business question (which should inform a specific business action) carefully, the opportunity for error or underwhelming findings greatly increases.

Definer of THE QUESTION – When I think about analysis I always start with “what is the question we are trying to answer?” The question to be answered is never as simple as whatever my boss has asked me; data is messy, the question is complex, and more often than not, the initial question is wrong. By wrong, I mean that it is too general, makes assumptions about the answer that may be wrong, or just does not make sense from a business point of view. One way to think about whether the question is a good one is to come up with a fake answer and see if it would change anything. “How many women in their 40s are tweeting about our product?” the boss asks you… the answer, by itself, is probably pretty useless – “135”. What the boss really wants to know is “what are the demographics of people tweeting about our products? I suspect it’s mostly demographic X.” Assuming you have access to the demographics, you can present a plethora of data – by product, by cohort, etc. That approach makes your answer to the initial (not so good) question make more sense: “Looks like 135, which is 32% of everyone tweeting about us.” Engaging in a Socratic dialogue with the data consumer (or yourself, for that matter) can help you understand the kernel of the question – its impetus – and thereby redefine it in a way that extends the usefulness of the answer and guides the additional analyses needed to understand the phenomenon more deeply.
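To make the reframed question concrete, here is a minimal sketch of the kind of breakdown that answers it, assuming a hypothetical product_tweeters table (one row per unique account tweeting about the product, with demographic attributes already attached); the table and column names are illustrative.

```sql
-- Hypothetical schema: product_tweeters(user_id, gender, age_bucket),
-- one row per unique account tweeting about the product.
-- Report each demographic's share of all tweeters, not just a raw count.
SELECT
  gender,
  age_bucket,
  COUNT(*) AS tweeters,
  ROUND(100.0 * COUNT(*) / SUM(COUNT(*)) OVER (), 1) AS pct_of_all_tweeters
FROM product_tweeters
GROUP BY gender, age_bucket
ORDER BY tweeters DESC;
```

A row such as (female, 40–49, 135, 32.0) answers both the boss’s literal question and the better one behind it.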

Rosetta Stone – The analyst must be able to translate fluidly between many entities. Business- and marketing-oriented groups will not speak the same language as data miners and data scientists, and the audience consuming the data is constantly changing. The analyst must anticipate these differences, speak the different languages, and be sensitive to the fact that her job is to increase understanding (rather than expecting her customers, the data consumers, to do that research after the data is presented). Again, if the analyst were not needed to act as a translator, she would eventually be replaced by a tool. Once the question to be answered has been defined, the analyst will most likely have to interact with other entities to extract the raw data. One man’s “how many” is another man’s “SELECT * FROM table_name”.
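As a toy example of that translation, consider a hedged sketch against a hypothetical orders(customer_id, ordered_at) table – the stakeholder’s sentence and the query are the same question in two languages.

```sql
-- Stakeholder's language: "How many customers bought something in April?"
-- Analyst's language, against a hypothetical orders(customer_id, ordered_at) table:
SELECT COUNT(DISTINCT customer_id) AS april_buyers
FROM orders
WHERE ordered_at >= DATE '2013-04-01'
  AND ordered_at <  DATE '2013-05-01';
```

Notice that the translation already embeds analytical judgment: COUNT(DISTINCT customer_id) versus COUNT(*) is the difference between “customers” and “purchases”, a distinction the stakeholder never spelled out.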

Master of Data – At the end of the day, the analyst must know the data they are transforming and interpreting. They must understand the structures, the nuances, and the quality. This means a part of an analyst’s job is to investigate, on their own, the data sources they interact with. Blindly extracting data is a great way to reach false conclusions thanks to poor-quality data, null fields, strange conventions, and more. I recall running an analysis in which I found the “gender” column of a master table contained the values “male”, “female”, and “child”. Imagine if I had tried some sort of deductive analysis in which I got a count of total uniques and total males, and then took a shortcut (uniques minus males) to derive females. Oops. There is no universal taxonomy; there is no universal ontology. When it comes to data, you have to check and double-check the sources to be sure you understand the nature of the data you are extracting value from.
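A cheap habit that would have caught the “child” surprise: profile a column’s domain before deriving anything from it. A minimal sketch, using a hypothetical master_table standing in for the table from the anecdote:

```sql
-- Profile the column's actual domain before assuming it is binary.
SELECT gender, COUNT(*) AS n
FROM master_table  -- hypothetical table name
GROUP BY gender
ORDER BY n DESC;
-- Any unexpected values ('child') or NULLs surface here, before a
-- shortcut like (uniques - males) bakes them into a wrong answer.
```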

Storyteller – The analyst needs to take data, in its raw form, and mold it into something that can be understood and easily retained by the audience. Answers to business questions are rarely straightforward (and if they were, the analyst would be replaced by a dashboard) and often require a contextual back story. The analyst must determine, to the best of his or her ability, the relevant context for the answers presented. This context often goes beyond the data and requires (gasp) talking to others or (shudder) using the product that generated the data. Without the appropriate data context and the ability to describe that context to the audience, the analyst is nothing more than an overpaid calculator moving numbers into tables and charts for someone else to interpret. A great interview on the storytelling aspect of data analysis was given by Cole Nussbaumer (whose blog I am adding to the blogroll on the front page) for the website klevr.org.

Anticipator of Questions – I believe that a successful presentation of data provides clear information about a specific set of business questions such that decisions can be made. I also believe that a successful analysis generates more questions. That may seem counterintuitive (if you answered the original questions, why are people asking more?), but in my experience the questions asked after a successful analysis build upon the insights you presented (rather than being unrelated or confrontational/doubtful). If you get no questions, I fear you have bored or confused your audience. That said, anticipating the most likely follow-up questions and running the analyses for a few of them is generally low cost and high reward: it shows that you have defined the question space well, translated it appropriately, retrieved the data to answer those questions above and beyond expectations, mastered the data, and are now weaving it into the epilogue of your story, delighting your audience and giving them valuable information. If nobody asks the questions you anticipated, you can always present them (quickly, as you have spent most of your time on the core question) as bonus material: “I was curious about…”. This also leaves your audience with the assurance that you actually care about unearthing insights – and frankly, if you really don’t, you should find a new job.

Nov 25, 2012

So Nate Silver is the stats nerd of the year for his great (or lucky, if you hate science) methodology of aggregating and weighting polls to predict the outcome of the recent national elections. Congratulations, Nate: if I didn’t live in a country with Byzantine banking laws, I would have made a tidy sum using your legwork (among others’ – I firmly believe in leveraging the wisdom of crowds of experts) to invest in “Obama to win” via the event-based market InTrade. I haven’t been able to find any apologies from the demonizers who suggested Nate was just another political hack (like them?) rooting for the wrong team and hiding it behind magical thinking in the guise of science, but I can’t say I looked too hard.

The disappointing part of the whole Nate-Silver-predicts-the-elections affair lies in the constant misinterpretation of how Nate actually came by his numbers, owing to the general public’s pseudo-understanding of statistics. Still, the beauty of the press he received both before and after the election is that it elevated the role of data in decision making – even messy social data like poll results (essentially surveys, with all their inherent issues). The age-old “gut feeling” as the sole driver of decision making (i.e. guessing) is coming under needed scrutiny in an age where having current and historical information is finally possible. Those who fail to incorporate data, especially data that is readily available or easily gathered, will be left behind, or, when successful in their guesses (expertise does have its place), will be less efficient.

It is my firm opinion that gut feeling is a garnish best placed on top of data-driven analysis, where the depth of gut needed is (roughly) inversely proportional to the data available. Nate doesn’t use gut feelings; he uses data, which can then be handed to those responsible for making decisions.

So how does Nate Silver make my job easier? As Silver commented to Jon Stewart on The Daily Show after being asked what it would mean if his model had been wrong: “It would have been bad, I think, because for some reason 538 became invested with this symbolic power, and you know symbolic power isn’t particularly rational, right, but it became this symbol for people who were believing in hey, let’s look at the polls. Let’s do some empirical research, right.” Empirical research was shown to best guts. That research ran contrary to a huge contingent of (not surprisingly) biased observers, yet proved superior to all other estimations, guesses, and scrying-stone proclamations – even those made by individuals with a vested interest in Obama winning. His. Model. Won. Data won.  As data won such a highly scrutinized, over-thought, expensive contest over individual “expert” opinion, my job got easier. The hugely symbolic power of that specific use of data serves as a powerful example of what data can do. When talking to organizations about the value of data, the value of quality data, and the usefulness of measurement to drive business decisions, I now have an example that everyone knows and, in some small way, understands. Am I comparing myself to Nate Silver? Not particularly – we come from very different backgrounds, educations, and approaches. But one thing is certain: he has just made the human-interaction part of my job a lot easier – the part where I am convincing a client to invest in data resources, to care about data quality, data completeness, and data-driven decision making. Thanks, Nate.