Trends in Customer Insights: Shifting from ‘the Next Big Thing’ to Delivering Business Impact

The pace of change has never been so fast. And will never be so slow again.

origin unknown

Research allows us to explore people's lives: how they live, their routines, their likes and dislikes, their attitudes and opinions, their beliefs. In today's rapidly changing world, understanding people and uncovering their truths has never been more central to businesses meeting their customers' needs.

The market research toolkit has come a long way: from the early days of knocking on doors to speak to people in their homes, to petabytes of behavioral data offering minute detail on people's digital activities. Much of the toolkit's expansion has occurred over the past few decades and, as echoed in industries around the world, the rapid pace of change means each year brings 'the next big thing'. Most often, this has been a new technology promising to revolutionize how we collect data. However, that trend may be drawing to a close as the focus shifts from the mechanics of how we collect data to the impact of the data: the business decisions and outcomes that it informs. This paper explores the shifts taking place within our industry, beginning with a look at where we have come from in order to frame where we are today and project possibilities for tomorrow.

The early pioneers

The origins of the research techniques and principles that are fundamental within today's toolkit can be traced back over 150 years. London in the mid-to-late 1800s was a hub of activity for the emerging social – and subsequently market – research approaches.

The first social questionnaire is attributed to Francis Galton, who also did pivotal work in the early development of modern statistics (Hamed et al., 2014). Another notable figure shaping social research was the wealthy businessman Charles Booth, one of the early pioneers of the social survey and ethnographic methods. Booth undertook the most comprehensive survey of London life in a concerted effort to obtain a true measurement and understanding of poverty within the capital. His inquiry took him door to door, interviewing inhabitants and on occasion spending days or weeks living in different homes.

These approaches stayed largely the same until the 1940s, when Robert K. Merton, also known as the "Father of Focus Groups", began the lifelong work that shaped the method and our industry beyond measure. In Merton's own words, it "all started with a thoroughly unplanned work session with Paul Lazarsfeld in November 1941", when they were both young sociologists at the Bureau of Applied Social Research at Columbia University. Lazarsfeld, head of the Office of Radio Research, spontaneously invited Merton to observe a session testing audience response to radio programs for the Office for Facts and Figures (which eventually became Voice of America). At the studio, Merton was shown into a room containing 12 to 20 people sitting on chairs arranged in rows, with two built-in buttons to press during the program: green for positive and red for negative reactions to the material. An interviewer then questioned the audience on their recorded responses. Merton's reaction combined intrigue at this previously unseen format with a critique of the interviewing procedure: "Although this is a new kind of interview situation for me, I am not unfamiliar with the art and craft of interviewing" (Merton et al., 1990). Thus began his lifelong work on what we today call focus groups.

A well-known example of an early-1950s focus group leading to business growth, and shaping consumer culture, was undertaken by Ernest Dichter for the Betty Crocker brand. Struggling dry cake mix sales were put back on track by focus group insights uncovering that the 'just add water' version felt unsatisfying. The new version required bakers to add eggs as well, delivering a more satisfying 'homemade' experience.

During this period there were also rapid methodological developments on the quantitative front. Work in the decades from the 1930s to the 1950s greatly advanced sampling theory and the understanding of measurement error within questionnaire design (Groves, 2011). Further growth in statistical techniques came with the advent of commercial mainframe computing, which opened new analytical possibilities. Interestingly, the first commercial mainframe computer was delivered to the US Census Bureau on March 31, 1951. Known as UNIVAC I (Universal Automatic Computer), it took input from punched cards and tabulated data that was then printed out or stored on magnetic tape. Its first applications were to tabulate part of the 1950 population census and the entire 1954 economic census (Census.Gov, 2018).

Computing technology transforms

Primary data collection methods remained stable into the 1970s, with interviewer-administered and mail surveys on the quantitative side, coupled with ethnographic methods, focus groups, and individual interviews on the qualitative side. At this time, the growth and ever-increasing availability of computer technology led to various computer-assisted interviewing (CAI) methods. The first of these was computer-assisted telephone interviewing (CATI), devised in the early 1970s; it went on to become a widely used research tool in the US during that decade, though it was only in the early 1980s that it gained extensive use in Europe.

The emergence of laptops in the mid-1980s led to the general use of computer-assisted personal interviewing (CAPI) in survey research. The development of CAPI was initially hindered by the limited memory and speed of the first laptops, not to mention their size and weight, but as the technology improved, computer-assisted survey research became the norm in the early to mid-1990s. At this time, Touch-Tone Data Entry (TDE) and Interactive Voice Response (IVR) were also introduced. This decade also saw the emergence of email surveys and early online qualitative approaches with bulletin board discussions. All of these new entrants broadened the reach of the people we could talk to by lowering geographic barriers, reducing cost, increasing speed, and engaging respondents in ways that mirrored emerging changes in communication.

The turn of the century and the rapid expansion and adoption of the World Wide Web brought perhaps the most transformative changes impacting us today. Web surveys now dominate all research approaches, and in recent years they have been morphing into mobile web surveys. Qualitatively, activity-based online methods, mobile ethnographies, and webcam interviews and focus groups are all in use. As much as these are 'new' modes within our toolkit, they are all built on established methodologies, following the same principles and guidelines developed in previous decades. This is an important point that we'll return to later.

There has also been a swath of fresh approaches, all of which have received notable attention for their future promise. These include quantitative eye tracking via webcam surveys, neuroscience techniques such as galvanic skin response, and early offerings in virtual/augmented reality. We are also experiencing a new period of growth within analytics and scientific approaches. Behavioral economics has shone a new light on how we ask questions: are we tapping into System One or System Two thinking when people respond? System One thinking is fast, instinctive, and emotional; we use it, for example, when driving a car in easy conditions. In contrast, System Two thinking is slower, deliberate, and logical, as when we call on our memory for when we last used a particular product or went to the cinema (Kahneman, 2011).

System One is thinking that is fast, instinctive and emotional. System Two thinking is slower, deliberate and logical.

Kahneman, 2011

All of the approaches mentioned to this point form part of Small Data: the data gathered through asking questions. However, Big Data is also now part of the toolkit. There are new datasets at our disposal: social networks offer the ability to pull and analyze online social conversations, while clickstream and transactional data give a firsthand view of people's actual behaviors. We also see a corresponding growth in analytical approaches through Artificial Intelligence (AI). AI is not new – the term was coined by John McCarthy at a 1956 conference (Smith, 2006) – but today advancements in computing power are driving growth in this field. For research, AI is enabling new ways of analyzing both structured and unstructured small data generated from our studies, as well as big data sets from behavioral, transactional, and other sources. For example, AI-powered text analytics using deep neural networks can help identify key moments, sentiment, and themes. It's worth noting that some have assumed big data would answer all our questions. However, it only tells one part of the story. Big data shows the behaviors, the actions taken, while small data offers deeper insight into why people took that course of action.
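
To give a flavor of what such text analytics looks like in practice, below is a minimal sketch of sentiment scoring on open-ended survey responses. It assumes the open-source Hugging Face transformers library and its default pretrained sentiment model; the responses are invented for illustration, and a production text-analytics platform would layer theme and key-moment detection on top.

```python
# Minimal sketch: scoring open-ended survey responses with a pretrained
# deep neural network for sentiment (Hugging Face transformers).
from transformers import pipeline

# Hypothetical open-ended responses from a customer survey.
responses = [
    "The checkout process was quick and painless.",
    "I waited twenty minutes and nobody answered my call.",
    "Packaging was fine, nothing special.",
]

# Load a default pretrained sentiment model (downloaded on first use).
classifier = pipeline("sentiment-analysis")

# Each result is a dict with a predicted label and a confidence score.
for text, result in zip(responses, classifier(responses)):
    print(f"{result['label']:>8} ({result['score']:.2f})  {text}")
```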

Figure 1: Evolving Research Approaches

Benefits and challenges

All these approaches, new and adapted, provide researchers with multiple ways to reach people and understand their truths.

This evolution reflects changes both in technological developments and in the culture around us. A perfect example today is mobile qualitative research, which can tap into the mainstream culture of content creators and sharers. From a researcher's perspective, what better way to understand people than through their own lens, with pictures and videos taken in the moment?

Furthermore, the method addresses memory and post-rationalization challenges while also being about as unobtrusive as research can get. Similarly, a quick mobile survey helps capture near-to-the-moment customer experience, providing timely feedback on brand interactions. Certainly, feedback surveys have proliferated since the turn of the century, given the affordability and ease with which they can be deployed and the data analyzed in real time.

Another important advantage of the adapted web-based approaches is the increased feasibility of blended methods, offering reduced time and costs. This is particularly true of triangulating quantitative and qualitative methods to provide a full, rich picture of the research topic. However, other mixed-mode approaches, for example an online diary followed by a webcam interview, also yield powerful insight.

All this said, the variety of approaches now at our disposal brings layers of complexity. Researchers are still grappling with the design choices involved in creating an online questionnaire and with fully understanding the implications of those decisions for respondents and results. Often, design choices are made (or not made) in deference to our quest for standardization and data comparability. This can at times fly in the face of the best practices set out in our understanding of measurement error.

Practically speaking, the amount of data returned from even relatively small projects can be overwhelming. This is particularly true for qualitative studies. Consider that a recent 7-day FocusVision Revelation online community with 72 participants generated more than 1,000 unique responses, 1,200 images, and 170 videos. Quantitatively, big data promises a wealth of understanding but, like small data, these immense, often unstructured datasets are cumbersome to mine. Whatever the data, time-strapped researchers juggle the business need for speed with delivering quality and impact.

Today’s landscape

With the wealth of new and adapted tools and the regular discussions of each year's latest innovations, one might expect widespread usage across the expanded toolkit. However, this is not the case. The tried and tested 'traditional' approaches remain firmly mainstream. Surveys, in-depth interviews, and focus groups are used more than any other approaches. While surveys have mostly moved online (laptops, tablets, and mobile), IDIs and focus groups tend to be undertaken in person. Of the new approaches, online communities are used by nearly half of researchers.

Figure 2: Which approaches have you used in the past year? Source: FocusVision 2018

These results are mirrored by the 2017 GRIT report exploring the adoption of new methods. Despite all the new tools and discussion around 'the next big thing', we remain solidly in the traditional realm.

Figure 3: Use of Emerging Methods Source: GRIT Report 2017

What’s going on?

Why is the hype around the new approaches so high but adoption so low? In short, the day-to-day business realities.

The 2007 global economic crisis impacted research budgets, and while companies bounced back, their budgets (along with those of many other departments) didn't. This reality continues to affect insights teams, who often work with flat or shrinking budgets year-on-year even as demand increases. As a result, businesses are looking to technology to help with speed and cost: for example, automated insights from text, video, and numeric data, and data visualizations that ease analysis and reporting.

Sample quality is the other major area challenging researchers. Sample is a key element affecting overall data quality and, ultimately, good business decisions. Yet quality remains a persistent struggle. Respondents in our study went as far as to say: 'fix panel quality… I dread even doing web surveys any more'. As a reminder, sampling theory was largely formed in the 1930s. The first notable innovation occurred in the 1970s, when ubiquitous landline penetration and random digit dialing allowed probability sampling. The next development occurred in the early 2000s with the rise of opt-in (and therefore non-probability) online panels. These panels facilitated the rise of web surveys and have played an important role in our industry over the past two decades, and we are once again at a juncture where more innovation is required.
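
As an aside on why random digit dialing enabled probability sampling: when numbers are generated uniformly at random within known working blocks, every telephone line in the frame has the same known chance of selection, whether or not it appears in a directory. A minimal sketch, with purely hypothetical area codes and exchanges:

```python
# Minimal sketch of random digit dialing (RDD): generate phone numbers
# uniformly so every number in the frame has an equal, known selection
# probability (here 1 in 30,000, since each block holds 10,000 numbers).
import random

# Hypothetical working (area code, exchange) blocks defining the frame.
WORKING_BLOCKS = [("212", "555"), ("718", "555"), ("646", "555")]

def rdd_number(rng: random.Random) -> str:
    """Draw one number uniformly: pick a block, then four random digits."""
    area, exchange = rng.choice(WORKING_BLOCKS)
    return f"{area}-{exchange}-{rng.randrange(10000):04d}"

rng = random.Random(42)  # seeded so the draw is reproducible
print([rdd_number(rng) for _ in range(5)])
```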

Figure 4: Top Research Challenges Source: FocusVision 2018

The immediate impact of technology

Overall, our data showed that budgets and sample are the burning platforms for researchers today, not exciting new approaches.

The future technology innovations that will win are not new approaches but ones that help address these pain points.

For example, automation will assist with resources by taking the pain out of repetitive tasks (e.g., data visualization). Artificial Intelligence will speed analysis and uncover patterns in qualitative and quantitative data. It's possible, although these are very early days, that blockchain may help address sampling challenges by validating respondent profiles, offering new payment options, and potentially facilitating the linking and exchange of data (Poynter, 2018).
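
As one hedged illustration of such automation, the repetitive task of charting every closed-ended question in a survey can be scripted end to end. The sketch below assumes survey results sitting in a pandas DataFrame with hypothetical column names; a real workflow would point the same loop at exported survey data.

```python
# Minimal sketch: auto-generate one frequency chart per survey question
# using pandas and matplotlib, removing a repetitive reporting task.
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical closed-ended survey responses.
df = pd.DataFrame({
    "satisfaction": ["High", "Medium", "High", "Low", "High", "Medium"],
    "would_recommend": ["Yes", "Yes", "No", "Yes", "No", "Yes"],
})

for question in df.columns:
    counts = df[question].value_counts()
    ax = counts.plot(kind="bar", title=question)
    ax.set_ylabel("Respondents")
    plt.tight_layout()
    plt.savefig(f"{question}.png")  # one chart image per question
    plt.close()
```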

There's also a bigger shift going on. When asked about the 'one thing you'd change' in their day-to-day roles, researchers' need for organizational integration came through loud and clear. Research is now being conducted within all departments across the organization. At times this is carried out by the insights department, but at others it is done by in-department researchers (such as user experience) or by non-researchers needing quick answers to pressing questions, fueling the current movement around the democratization of research as DIY tools help meet these needs. Additionally, researchers are grappling with linking datasets (big and small) and delivering insights that become part of the fabric of their businesses. As one respondent said: 'how to line up data with transactional information and making it part of the [organizational] cultural environment.' This is a key area that we'll now turn to.

Evolving business needs

As much as research approaches have evolved over the past 150 years, business itself is undergoing major shifts. 21st-century business demands:

  • Experience: customer experience is all-encompassing, spanning all departments, from product and service to brand and marketing.
  • Data: understanding Customer Truth holistically, through big and small data.
  • Speed: hours and days. Business decisions can’t wait for the next board meeting; they need to be made now.

The rise of the 'experience economy,' fueled by digital transformation, has resulted in businesses' need to add human-derived data back into the data equation to deliver always-on, real-time direction.

In short, timely customer feedback is paramount to business growth.

This customer feedback needs to provide an understanding of customers in their entirety by combining the power of big data and small data. The 'why' isn't gathered just from voice-of-the-consumer surveys, but also from the all-important qualitative (or thick) data. Talking to your customers in webcam focus groups or interviews, and seeing their lives through their lens via mobile ethnographies and communities, delivers the richness of who they are, what they think, and how they feel. It provides the readily relatable clues that tie it all together, making the insights from big data and numeric small data come to life.

Breaking down the silos

In decades gone by, consumer insight informed areas in silos, such as advertising messaging and effectiveness, brand health, or ad hoc inspiration for innovation. Today, the need for customer feedback means insights' reach is extending into all departments. This presents a new set of challenges.

As we heard earlier, budget is a core issue. While the importance of insights is rising, teams are still small, placing extra strain not just on gathering the insights but on ensuring the findings are available across the business. Oftentimes, teams' first challenge is simply building awareness of their work.

Insights teams are also becoming educators. In companies where research is becoming part of the fabric of every department, there's a shift toward non-researchers undertaking studies that don't make sense for the insights team to tackle, whether due to capacity or scope. This enables all departments to conduct research and is a key driver of the democratization-of-insights movement.

There is another aspect to breaking down the silos: insights roles are moving out of dedicated teams and becoming integrated into other departments.

Such roles might now be held in Marketing or Product Innovation, under titles like Customer Experience Insight, Marketing Intelligence, and Market Strategist. Whether insights become integrated into other departments or continue as a discrete function, the challenges remain: doing more with less while ensuring all areas of the organization benefit from the knowledge.

The new narrative

Today data is essential, not a nice-to-have. From creating customer experiences to driving product innovation, data is the driving force. Business decisions are increasingly made less on instinct and gut feel and more on data from all areas of the business. Researchers, keenly aware of insights' importance to business, have long talked about getting a seat at the table. That time has come. Brands that understand their customers, uncover their truth by asking meaningful questions, and apply that knowledge throughout the business are more successful than those that don't.

The narrative has shifted from how we collect the data to how the data can influence all business decisions.

Data dissemination becomes a critical need, both in terms of delivery, for easy understanding and direct impact, and of accessibility, the ability to revisit and mine existing data. This new world also requires that everyone within the organization understand how to evaluate data from different sources. They need to be open to what it is saying, while also curious and critical, asking questions of it to ensure full understanding.

In terms of how we gather the data, the proliferation of research approaches has opened many new doors to understanding Customer Truth. Fueled by technology, we can reach people in ways that are culturally relevant, cost-effective, and quick. However, it is the established approaches (surveys, focus groups, and interviews) that remain our core means of asking questions.

Whether the mode is in-person or online, all these approaches have well-established methodological principles guiding their use, and it's fundamental that we don't lose sight of these principles. We need to understand our data. Sampling error: what was our sample frame, and how does it bias the data? Measurement error: are we asking the questions in a way that will yield the answers we are after? Moreover, the role of the researcher is to understand these principles, apply them, and educate others across the business on them. Good data means good business decisions. Today, this is a vital role.
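
As a concrete illustration of sampling error, consider the familiar margin of error for a proportion measured in a survey. A minimal worked example, assuming a simple random (probability) sample and a 95% confidence level ($z = 1.96$):

$$\text{MOE} = z \sqrt{\frac{p(1-p)}{n}}$$

For $n = 1{,}000$ respondents and $p = 0.5$ (the most conservative case), this gives $1.96 \times \sqrt{0.25/1000} \approx \pm 3.1$ percentage points. Strictly, the formula applies to probability samples; for the opt-in online panels discussed earlier, it serves only as an approximation.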
