Introduction
When analyzing the market researcher position, the most useful lens of rationalization is the question of how current methods of data collection came to be and how those methods apply to today’s digital age of reaching the consumer. From its inception in the 1920s until the spread of the internet in the 1990s, the majority of marketing research was done by hand, whether that meant collecting, processing, or interpreting data, and it focused on gathering more specific, detailed data from a smaller slice of the population. While this data may have been slow to collect and difficult to process on a large scale, it laid the foundations for modern data collection by ironing out flaws in the collection processes and determining which methods produced the most useful data for researchers. However, as the job has come to be run predominantly through computers in the internet age, marketing researchers have seen a seismic shift in both the tools available to them and the priorities of their work. In the present day, market researchers can collect vast quantities of data about a wide range of subjects in very little time, turning insights that once took months, or even years, to assemble into data available almost instantly. Through the rationalization processes of efficiency, calculability, predictability, and control, as laid out by sociologist George Ritzer in his book The McDonaldization of Society: Into the Digital Age, we can attempt to explain both the methods that brought this change about and why employment in the market research sector is projected to grow by 19% by 2031.
1920-1940: The Era of the Quantitative Questionnaire
Since the inception of markets and, particularly, marketing, businesses have always faced one particular question: how effective are their advertising campaigns at selling their products and services? Traditionally, the best way to answer this question was to compare sales figures before and after running promotions. However, this approach left businesses with little tangible information to act on, as these figures were not tied to any specific marketing campaign, nor did they help distinguish changes occurring in the broader market. These changes could be anything as simple as shifts in weather conditions over a few days or as complex as consumers gradually shifting preferences toward competing products populating the marketplace.
The Starch Test
In 1931, American media researcher Daniel Starch created and disseminated the world’s first copy test, the Starch Test, and with it he birthed the field of market research as it is known today. Up until this point, advertisers and the businesses they worked for had no way of gauging whether their ad campaigns were effective or whether they were conveying the intended messages to consumers. Starch approached the problem from the perspective of understanding how a business’s target audience thinks, acts, and feels, based on a pre-determined series of questions. By collecting quantitative measurements of whether the reader had seen the ad in question, had noticed the name of the product or company the ad endorsed, and had read half or more of the written material, Starch hoped to gain insight into how effectively certain magazine advertisements were reaching the consumer. As a result, companies finally had tangible statistics they could use to judge the effectiveness of their advertising campaigns and whether the money they were spending was producing a positive return on investment.
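To make the measurement concrete, the short sketch below (a minimal illustration, not Starch’s actual scoring procedure) shows how such readership responses reduce to three simple percentages; the reader responses are invented.

```python
# A minimal sketch of how Starch-style readership scores reduce to percentages:
# the share of readers who noticed the ad, who associated it with the
# advertiser, and who read most of its copy. The responses are invented.

readers = [
    {"noted": True,  "associated": True,  "read_most": False},
    {"noted": True,  "associated": False, "read_most": False},
    {"noted": False, "associated": False, "read_most": False},
    {"noted": True,  "associated": True,  "read_most": True},
]

for measure in ("noted", "associated", "read_most"):
    share = sum(r[measure] for r in readers) / len(readers)
    print(f"{measure}: {share:.0%}")
```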
The Gallup Poll
As Starch refined his methodology throughout the 1930s, fellow American media researcher George Gallup pursued a similar interest in collecting consumer data, focusing his efforts on consumer feedback through a process known as Aided Recall. Based on the idea that small samples of the populace could predict general attitudes, Gallup established the now famous Gallup Poll (his company lives on today as a leader in global public polling) to identify specific demographic groups that could be represented proportionally on a wider scale, thus predicting the most likely outcomes of a given situation.
Gallup’s moment in the sun, the one that propelled his polling to national acclaim, came in 1936 when he correctly predicted that Franklin Roosevelt would beat Alfred Landon for the US presidency. To do this, he broke the US electorate down into a precise set of demographic groups, represented each proportionally in his sample group of 3,000 people, and extrapolated the results to the population as a whole. The prediction directly contradicted The Literary Digest, the poll of record at the time, and showcased the promising future of the Gallup Poll. Further testing and refinement followed in subsequent years, with versions of the polls adapted to chart data across all mediums of advertising, including radio and television ads, culminating in a final product that was generally accurate about a population and its desires.
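The arithmetic behind representing demographic groups proportionally is simple weighting, illustrated in the sketch below with hypothetical groups and support figures rather than Gallup’s actual categories or data.

```python
# A minimal sketch of the proportional-weighting idea behind Gallup-style
# polling: each demographic group's response is weighted by that group's share
# of the overall electorate. Group names and figures are hypothetical.

groups = {
    "urban":    {"share": 0.35, "support_for_a": 0.62},
    "rural":    {"share": 0.40, "support_for_a": 0.48},
    "suburban": {"share": 0.25, "support_for_a": 0.55},
}

# weight each group's support by its share of the population
predicted_support = sum(g["share"] * g["support_for_a"] for g in groups.values())
print(f"Predicted national support for candidate A: {predicted_support:.1%}")
```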
Rationalization
The development of the Starch Test and the Gallup Poll proved seismic in changing the outlook of market research. Not only did these pioneering programs create the first usable data for businesses to study, making both data collection processes and market outcomes (market trends, consumer sentiments, etc.) far more predictable, but they also made the profession more calculable by providing researchers with hard figures on how effective particular marketing strategies were and whether changes were needed. At the same time, by establishing and then standardizing data collection methodologies, those in charge of conducting the research could better control the specific methods used to gather information, while researchers could concentrate on the collection methods that had proven most effective, boosting the efficiency of how researchers spend their working time.
1940-1960: The Era of the Qualitative Questionnaire
With the previous generation of researchers having established a baseline understanding of the target audience through quantitative data collection, the next generation sought to build on that work by developing a complete and comprehensive understanding of the individual beyond simple statistics. The biggest difference lay in the conceptual strategy with which researchers approached consumers: earlier researchers had focused on recording responses after contact with an advertisement had already been made. During the new era of qualitative questioning, researchers instead made contact with consumers before they were exposed to a business’s messaging, in order to determine their desires before the fact rather than after. In doing so, researchers emphasized the importance of forming a deeper connection with the consumer, which they felt would allow them to better understand why customers made certain decisions. By better understanding the motivations and desires of the consumer, researchers could turn these understandings into actionable information for businesses, which could then position themselves to meet customers’ exact needs rather than relying on ideas of what they think consumers might want.
Focus Groups
The first steps toward achieving this goal were taken by Paul Lazarsfeld and Robert Merton in the early 1940s with their introduction of focus groups. The overarching idea of focus groups, as Lazarsfeld and Merton saw it, was to bring a small group of people together to answer questions in a moderated setting, giving researchers insight into the experiences and perspectives of the designated target group. Researchers could then turn these insights into actionable information for companies, informing how they should conduct future operations. To hone this strategy, Lazarsfeld and Merton worked for the US government during World War II, surveying soldiers about the effectiveness of military training and morale films for the research branch of the US Army’s Information and Education Division. From this research, both they and the Army could determine what had worked well in the films, what needed improvement, and what actionable guidance to carry into future training programs. After the war ended in 1945, the business world began to take notice of Lazarsfeld and Merton’s success with focus groups, with companies such as CBS folding the strategy into their research on potential pilot radio and television shows, a practice that continues at the network today.
Motivational Research
Following the early successes of Lazarsfeld and Merton’s focus groups, the soon-to-be-famous researcher and consultant Ernest Dichter felt it was time to take the idea a step further with a theory that became known as motivational research. Predicated on the notion that focus groups did not reach far enough into the human mind, Dichter focused mainly on conducting depth interviews that closely resembled therapy sessions and on observing consumers’ interactions with products in simulated or real environments. Dichter firmly believed that every product had a ‘soul’ and that individuals bought a product not only for its purpose but for the values it embodied. From this, Dichter concluded that possessions serve as a kind of mirror reflecting our own image, one that researchers and businesses could study to create products that fit into that image.
Unfortunately for Dichter, interest in his studies tailed off into the 1960s as critics, such as Vance Packard in his 1957 book The Hidden Persuaders, began to question the viability of his research methodology. Packard argued strongly that motivational research was a manipulative and potentially dangerous tool that allowed advertisers to "play upon our deepest fears and desires." He further contended that advertisers were using this research to craft messages specifically designed to bypass consumers’ rational defenses and appeal to their emotions on a subconscious level, while also raising concerns that motivational research could be used for political purposes, namely to manipulate public opinion and suppress dissent. With strong pushback from sources such as Packard, marketers and researchers began to turn their backs on Dichter’s ideas.
Rationalization
On the whole, these qualitative methods promoted the rationalization of market research in a few key ways. First, they made researchers’ work more efficient by creating a deeper understanding of how researchers should engage with consumers, allowing them to focus their time and business resources on the strategies that would provide the highest net value to the business commissioning the research. Additionally, they made market research more predictable: by knowing the customer on a more personal level, researchers could better anticipate how consumers would act and react to the actions of businesses, and could therefore guide those actions toward more favorable customer reactions. Finally, much like the quantitative testing of 1920-1940, the standardization of data collection methodologies allowed those in charge of the research to exert greater control over the methods deemed to collect the most useful information, making actions more routinized. There is a notable missing pillar here, though: qualitative data collection did little to advance calculability, which centers on quantifiable, objective outcomes rather than subjective ones. Even without advancements in calculability, however, rationalization still progressed strongly thanks to the efforts of Lazarsfeld, Merton, and Dichter.
1960-1980: The Era of Refining
Around the start of the 1960s, market research shifted back toward quantitative analysis, this time with the express goal of understanding consumer interactions with brands. Following the work of the qualitative researchers of the 1940s and 1950s on understanding customers’ mindsets, researchers and, in turn, businesses saw an opportunity to create quantitative statistics that reached beyond surface-level figures (sales, customer recognition, etc.) and captured true customer desires. What set the era of refinement apart, though, was that market researchers stepped back from analyzing only the data points surrounding the purchase itself and adopted a more encompassing, lifetime-of-ownership view of products. No longer was it simply about making a sale; it became about how the customer interacted with the product or service over the complete period of ownership and how those interactions influenced future decision-making.
Conjoint Analysis
First to capitalize on this shift was American researcher Paul Green who, in 1971, published his methodology of conjoint analysis as it related to marketing. While conjoint analysis had been introduced back in 1964 as a means of scientific measurement, Green took the theory, grounded it in survey data built from a bank of pre-determined questions about an individual’s likes and dislikes, and used the data collected to find how individuals value the different attributes (features, functions, benefits, etc.) that make up a product or service. From these observations, Green could then predict what people would do in the future and offer companies solutions for making their product offerings reflect those future values. It should be noted that one of the biggest challenges Green faced in using conjoint analysis effectively was narrowing down the particular attributes consumers actually weigh. Inherently, no two customers are the same and each could fit into any number of categories, so it was very important that researchers use the correct mix of questions and analysis techniques to conduct the research effectively.
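In modern practice, the attribute values at the heart of conjoint analysis are often recovered by fitting a linear model to respondents’ ratings of product profiles. The sketch below is a bare-bones illustration of that idea, with invented attributes, profiles, and ratings; it is not Green’s own estimation procedure.

```python
# A minimal sketch of the core idea behind conjoint analysis: respondents rate
# product profiles made up of attribute levels, and a linear model recovers how
# much each level contributes to the rating ("part-worth utilities").
# Attributes, profiles, and ratings here are hypothetical.
import numpy as np

# each row is a product profile: [intercept, premium_brand, large_size, low_price]
profiles = np.array([
    [1, 1, 1, 0],
    [1, 1, 0, 1],
    [1, 0, 1, 1],
    [1, 0, 0, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 0],
])
ratings = np.array([8.0, 7.5, 6.0, 3.0, 6.5, 4.5])  # one respondent's scores

# ordinary least squares recovers the part-worth utility of each attribute level
part_worths, *_ = np.linalg.lstsq(profiles, ratings, rcond=None)
for name, w in zip(["baseline", "premium brand", "large size", "low price"], part_worths):
    print(f"{name}: {w:+.2f}")
```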
Market Segmentation
Shortly after Green introduced conjoint analysis, researchers Yoram “Jerry” Wind and Richard Cardozo introduced their methodology of consumer classification, market segmentation, based on a two-step model of macro-segmentation and micro-segmentation. A segment was defined as a group of present and potential customers with common characteristics relevant to explaining and predicting their response to a supplier’s marketing stimuli. The goal of the strategy was to organize these segments into current and previous customer groups that identified the most likely candidates for future consumers. From studying this methodology, researchers began to understand that businesses needed to target different segments with different messages, rather than a blanket marketing strategy aimed at all potential consumers, in order to maximize customer response. To do this, researchers used data classification methods similar to the groupings Green had used in his conjoint analysis to break the market into sections; by determining what a specific group valued most, they could then guide businesses on how to use this information to engage with customers.
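Today, segmentation of this kind is frequently operationalized by clustering customers on a handful of shared characteristics. The sketch below uses k-means on made-up spend-and-frequency data purely as a generic illustration; it is not the Wind-Cardozo macro/micro procedure itself.

```python
# A generic illustration of grouping customers into segments by clustering
# them on shared characteristics. Customer figures are invented.
import numpy as np
from sklearn.cluster import KMeans

# hypothetical customers: [annual spend ($ thousands), purchases per year]
customers = np.array([
    [1.2, 2], [0.8, 1], [5.5, 12], [6.1, 15],
    [2.9, 6], [3.1, 5], [0.5, 1], [6.8, 14],
])

# assign each customer to one of three segments
segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(customers)
for customer, segment in zip(customers, segments):
    print(f"spend={customer[0]:.1f}k, purchases={customer[1]:.0f} -> segment {segment}")
```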
Rationalization
Conjoint analysis and market segmentation had a net positive effect on rationalizing the field of market research: they greatly increased the efficiency and calculability of research by significantly cutting the processing time needed to determine what customers value while making the resulting data much more quantitative in nature. This was particularly true when comparing the newly developed statistical collection methodologies with focus groups that had aimed to answer the same questions about customer values. Where before researchers might rely on a small sample of focus group respondents as a source of data about what customers valued, now researchers could reach a far greater number of people in roughly the same amount of time while making the data collected far easier to process into actionable information. Additionally, with the standardization of data collection processes that came with conjoint analysis and market segmentation, market research became more predictable and controllable: data finally became easily comparable with other data collected using the same or similar methodologies, allowing researchers to focus on the specific data points found to be most useful, compare results, and create more insightful information on which business decisions could be made.
1980-2020: The Era of Evolution
With market researchers now having a rather complete understanding of what consumers valued when selecting their brands and how they interacted with that brand during and after the purchase, researchers then moved on to solving the last piece of the puzzle: how to proactively influence and predict what customers will be drawn to before making a product or service selection. To do this, researchers would not only harness new methods of data collection but, for the first time, where they looked to collect consumer data and how they collected it would also shift with the implementation of advanced technology, radically changing the entire outlook of conducting market research.
Theory of Reasoned Action and Theory of Planned Behavior
A key breakthrough in predictive data analysis came from Martin Fishbein and Icek Ajzen in 1980, when the two researchers linked behaviors and attitudes together in a study outlining the Theory of Reasoned Action (TRA), which was later extended into the Theory of Planned Behavior (TPB) in 1985. With their roots in attitude theory and the social cognitive tradition, both theories focused primarily on individuals’ beliefs about their potential future performance of a given behavior and formed the basic conceptual framework for predicting, explaining, and changing human social behavior. The theories were built on Fishbein’s original multi-attribute attitude model, which combined three components of attitude (salient beliefs, object-attribute linkages, and evaluations) into a measurable score representing a consumer’s attitude, forming the basis of the TRA. The TRA was created to predict how individuals would behave based on their pre-existing attitudes and behavioral intentions, which the TPB expanded on by adding the role of perceived behavioral control (a person’s belief that a behavior is under their own control). In the end, Fishbein and Ajzen found a strong relationship between attitudes, intentions, and behavior, with changes in attitudes and intentions often quickly followed by changes in behavior. Through these conclusions, they were able to offer brands the ability to measure consumer attitudes and gain insight into the factors driving consumer behavior.
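Fishbein’s multi-attribute model is commonly summarized as an attitude score equal to the sum, across salient beliefs, of belief strength multiplied by the evaluation of the associated attribute. The sketch below works through that arithmetic with hypothetical attributes on illustrative -3 to +3 scales.

```python
# A minimal sketch of a Fishbein-style multi-attribute attitude score: the
# attitude toward an object is the sum, over salient beliefs, of belief
# strength times the evaluation of that attribute. Values are hypothetical.

# (belief strength that the product has the attribute, evaluation of attribute)
# both on illustrative -3..+3 scales
beliefs = {
    "long battery life": (3, 3),
    "high price":        (2, -2),
    "stylish design":    (1, 2),
}

attitude = sum(strength * evaluation for strength, evaluation in beliefs.values())
print(f"Attitude score: {attitude}")  # 3*3 + 2*(-2) + 1*2 = 7
```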
Net Promoter Score
In a further attempt to quantify consumer brand preferences and behaviors, researcher Fred Reichheld created the Net Promoter Score (NPS) in 2003 to take the generally subjective idea of customer satisfaction and turn it into a quantifiable set of data that companies could use to track brand satisfaction. In an NPS survey, consumers are asked a question such as: ‘How likely are you to recommend us to a friend or colleague?’ However, rather than giving the kind of anecdotal evidence that past methods like focus groups had sought to collect, customers answer on a scale of 0-10. Market researchers then divide the answers into categories of Promoters (9-10), Passives (7-8), and Detractors (0-6). Once it has been determined how many people fall into each category, NPS is calculated as the percentage of promoters minus the percentage of detractors, leaving the company with a relatively accurate measure of customer satisfaction. This rating matters because the higher it is, the more likely a business is to see increased revenue, reduced marketing costs, and improved customer loyalty and reputation. By focusing on customer satisfaction, businesses can improve their bottom line and build a more sustainable business over the long term, making it imperative for businesses to understand how customers view them.
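The calculation itself is simple enough to show in a few lines; the sketch below applies the bucketing and subtraction just described to a handful of made-up survey responses.

```python
# A minimal sketch of the NPS arithmetic described above: bucket 0-10 responses
# into promoters (9-10), passives (7-8), and detractors (0-6), then subtract
# the detractor percentage from the promoter percentage. Scores are made up.

responses = [10, 9, 9, 8, 7, 6, 10, 3, 9, 5]  # hypothetical survey answers

promoters = sum(score >= 9 for score in responses)
detractors = sum(score <= 6 for score in responses)
nps = (promoters - detractors) / len(responses) * 100
print(f"NPS: {nps:+.0f}")  # 5 promoters, 3 detractors out of 10 -> +20
```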
Rationalization
While the two research strategies focus on distinctly different elements of consumer behavior, they share many of the same outcomes in terms of rationalizing the profession of market research. Both methods make market research more calculable by quantifying consumer observations that are inherently subjective (thoughts, feelings, attitudes, etc.), while also making outcomes more predictable in the sense that studying the data derived from these tests makes it easier and more accurate to determine consumer attitudes toward whatever is being studied. Similarly, by quantifying these qualitative observations, researchers gain further control over the data through further standardization of the important data points to observe about customers, making comparisons between data sets easier to understand and more efficient, as researchers no longer have to spend as much time poring over large quantities of data gathered through numerous forms of collection. Instead, researchers can focus on a few specific numbers that represent general customer attitudes and satisfaction, streamlining their job and allowing them to spend their time collecting more data and interpreting it.
Customer Data Tracking
By the start of the 1990s, the internet was in its infancy, yet as it grew, it became ever clearer how quickly it would entrench itself in the global state of affairs. From the moment Sir Tim Berners-Lee went live with the first HTML code in late 1990, market researchers saw the potential to acquire new streams of data about customers by tracking them through this new digital space. This potential finally came to fruition in 1995 when Dr. Stephen Turner released a first-of-its-kind web tracking program named Analog, which individuals and businesses could use to track usage patterns on their web servers. Up until this time, web analytics were rather complicated and required dedicated engineers to interpret patterns. Analog changed this by making reports generated from log files comprehensible to online business owners, with clear documentation and visual graphs.
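At its core, this kind of log-file analysis amounts to counting requests per page. The sketch below illustrates that idea on a few invented log lines written roughly in Common Log Format; it is not Analog’s actual parsing code or report format.

```python
# A minimal sketch of log-file analysis: counting page views per URL from web
# server access logs. The log lines below are invented examples roughly in
# Common Log Format, not Analog's actual input or output.
from collections import Counter

log_lines = [
    '203.0.113.5 - - [10/Mar/1995:10:00:01] "GET /index.html HTTP/1.0" 200 1043',
    '203.0.113.5 - - [10/Mar/1995:10:00:09] "GET /products.html HTTP/1.0" 200 2310',
    '198.51.100.7 - - [10/Mar/1995:10:01:14] "GET /index.html HTTP/1.0" 200 1043',
]

page_views = Counter()
for line in log_lines:
    # the requested path is the second token inside the quoted request string
    request = line.split('"')[1]   # e.g. 'GET /index.html HTTP/1.0'
    path = request.split()[1]      # e.g. '/index.html'
    page_views[path] += 1

for path, views in page_views.most_common():
    print(f"{path}: {views} view(s)")
```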
By measuring where customers had been, how long they spent on websites, and what they did while there, researchers and businesses could now track information about customers on a far more in-depth scale than ever before, with far fewer concerns about the reliability of the data collected. From these early advancements by programs such as Analog, and with the further expansion of the internet, web analytics has since become the backbone of many modern companies, which use everything from their own data collection systems to third-party services, such as Google Analytics (launched in 2005), to do the collection for them. All of this, however, traces back to the advancements made by Analog, which helped lay the groundwork for collecting data from users interacting with a particular website.
Consumer Insights on Demand
As the internet has spread at ever quicker rates since the 1990s, so has the number of people interacting with it, many of whom remain plugged into the digital world for ever longer periods of time. With this influx of users spending more and more time online, researchers can now harness this activity and turn it into usable consumer insight nearly instantly with just a few clicks. By using strategies such as keyword research (finding specific words and phrases used by individuals online that make them likely to fit into a group with others who use the same words and phrases), digital performance data, and point-of-sale data, researchers and businesses now have streams of market data on tap, current down to the second. These data streams show what customers gravitate toward far more quickly and accurately (without relying on self-reporting, as in many past eras) than ever before, allowing researchers to generate insights that businesses can act on far more rapidly.
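As a rough illustration of the keyword-research idea mentioned above, the sketch below scans a few invented posts for phrases associated with a hypothetical interest group; real tools work at vastly larger scale with far more sophisticated matching.

```python
# A generic sketch of keyword research: scanning text that consumers post
# online for phrases associated with an interest group. The posts and the
# keyword list are invented for illustration.

keywords = {"trail running", "ultralight", "thru-hike"}

posts = [
    "Just finished my first thru-hike, already planning the next one!",
    "Looking for an ultralight tent recommendation for trail running trips.",
    "Best pizza in town? Asking for a friend.",
]

for post in posts:
    matched = [kw for kw in keywords if kw in post.lower()]
    if matched:
        print(f"Likely outdoor-interest consumer {matched}: {post}")
```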
Rationalization
With the advancements in internet technology made over the last 30 years, market research as a whole has undergone numerous shifts in terms of rationalization. Chief among these changes is efficiency: data collection has been scaled to what would once have been considered mammoth proportions, in lightning-quick time. As more users access the internet than ever before and the technology continues to develop, there will be ever more data available for researchers to take hold of, along with ever faster methods of collecting it. This means market researchers can gather far wider quantities and varieties of data while also being free to spend more time creating insights from the data collected, instead of focusing on the comparatively small samples they used to gather. The data collected is also often quantitative, since organizing and interpreting quantitative data at large scale is much easier, making it more calculable than open-ended data collected in small quantities. Additionally, with greater quantities and specificity of data, outcomes and trends become more predictable, as there is a greater pool of knowledge from which to project what can be expected to happen. Along the same lines, with data being controlled and standardized, along with factors such as how that data was collected and how target consumers were reached, it becomes far easier to compare against other data collected both in the present (to analyze current market trends) and in the past (to analyze historical trends and derive actionable information).
2020-Present: The Era of Ease and Pace
As the world moves into the 2020s and beyond, the future of market research appears quite bright, as data has seemingly become the lifeblood of modern business. With this in mind, researchers and the businesses they work for have set their sights on streamlining the process of data aggregation and analysis, including through methods such as automation. The end goal is for businesses to have data interpreted into actionable information in real time, making it easier than ever to take advantage of market trends and respond to changes as they happen.
Artificial Intelligence
While it has not been completely integrated into the profession yet (although it is rather close), artificial intelligence (AI) has stepped onto the market analytics scene as a fascinating tool that has shown promise in delivering quick business insights for researchers and companies. AI primarily relies on the collection and input of large amounts of market data, from which it can then perform complex functions ranging from analyzing the effectiveness of a marketing campaign to predicting how experimental product lines may fare in the future marketplace. Companies such as Zappi have been using AI-driven tools to do just this on a wide scale, with a relatively high success rate, over the past 10-15 years. While AI does not replace the role of the researcher, it does allow the researcher to streamline the analysis process and produce useful market data on a much quicker timeline, allowing companies to make informed decisions at a much faster rate.
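As a toy illustration of the kind of predictive modeling described above, the sketch below fits a simple regression on invented past-campaign data to estimate the lift of a planned campaign; the features, figures, and model choice are all assumptions, and platforms such as Zappi’s are far more sophisticated.

```python
# A toy sketch of campaign-effectiveness prediction: fit a model on past
# campaign data, then estimate the sales lift of a planned campaign.
# All data, features, and the model choice are hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression

# past campaigns: [ad spend ($k), weeks run, prior brand awareness (0-1)]
X = np.array([
    [50, 4, 0.30], [80, 6, 0.35], [20, 2, 0.25],
    [120, 8, 0.40], [60, 5, 0.32], [90, 6, 0.38],
])
y = np.array([3.1, 5.0, 1.2, 7.4, 3.8, 5.6])  # observed sales lift (%)

model = LinearRegression().fit(X, y)
planned_campaign = np.array([[100, 6, 0.36]])
print(f"Estimated sales lift: {model.predict(planned_campaign)[0]:.1f}%")
```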
Rationalization
In the current rationalization of market research, the introduction of AI is having by far the greatest impact on every facet of the job, particularly as it relates to control. This can be attributed to AI shifting what researchers spend their time doing: with machine automation of analytical insights, market researchers have been freed from the most basic level of data interpretation in their studies. Because of the many limitations that still exist with the technology (more on that below), researchers will likely continue to work closely with the data interpretation aspects of their job, but with AI on board, that process can be carried out with much greater speed. With this advancement in control comes a rather significant advancement in efficiency as well: with tasks and insights completed far more rapidly thanks to AI, researchers can now produce greater amounts of useful information than ever before. Finally, it should be noted that with the use of AI, market research becomes both more predictable and more calculable, in the sense that AI can often be tuned to offer feedback from only a specific set of inputs (which helps with the accuracy of results), meaning data must be standardized, likely in quantitative form, for the software to provide actionable feedback. Standardizing this data sets expectations for how data collection will be conducted, since specific forms of collection will be the ones able to supply the data that yields the best insights from the AI tools.
Conclusions
Though the methodologies of conducting market research have changed over time, the presence of rationalization has remained constant throughout the years. Through the utilization of Ritzer’s variables of efficiency, calculability, predictability, and control, modern analytics have become possible and the data-driven world we know today has taken shape around us. While particular parts of the system have seen irrationalities develop, namely the potential overreliance on AI that society seems to be hurtling toward, on the whole, rationalizing market research has been a great benefit to businesses. By developing thorough, complete, and quick processes of data collection, market researchers have made it possible for businesses to be far more effective in conducting their operations, especially in terms of decision-making, while also helping those businesses reach more customers than ever before.
References
Barge, Carlos. What Is the Impact of the Internet On Market Research Field? Digital Competitive Intelligence, 4 June 2020, https://www.bi.wygroup.net/digital-transformation/what-is-the-impact-of-the-internet-on-market-research-field/.
Booker, Bel. The 100-Year History of Market Research - 1920 to 2020. Attest, 7 Apr. 2023, https://www.askattest.com/blog/articles/history-of-market-research.
Contentsquare. A Brief History of Web Analytics: UX & Usability: Web Optimization. Contentsquare, 20 Mar. 2023, https://contentsquare.com/blog/a-brief-history-of-web-analytics/.
Duggal, Nikita. Advantages and Disadvantages of Artificial Intelligence. Simplilearn, 31 Mar. 2023, https://www.simplilearn.com/advantages-and-disadvantages-of-artificial-intelligence-article.
Gallup, George. “Impact: A New Method For Evaluating Advertising.” The Impact Method, Journalism Quarterly, 1950, pp. 378–382.
Glanz, Karen, et al. Health Behavior: Theory, Research, and Practice. 5th ed., Jossey-Bass, 2015.
Green, Paul E., and V. Srinivasan. “Conjoint Analysis in Consumer Research: Issues and Outlook.” Journal of Consumer Research, vol. 5, no. 2, Sept. 1978, pp. 103–123., https://doi.org/10.1086/208721.
Green, Paul E., et al. “Thirty Years of Conjoint Analysis: Reflections and Prospects.” Interfaces, vol. 31, no. 3, 2001, https://www.jstor.org/stable/25062702.
Hagger, Martin S. “The Reasoned Action Approach and the Theories of Reasoned Action and Planned Behavior.” Oxford University Press, 27 Mar. 2019, https://doi.org/10.1093/obo/9780199828340-0240.
Kerner, Sean Michael. How AI-Powered Market Research Helps Predict Success of Future Products and Advertising. VentureBeat, 13 Dec. 2022, https://venturebeat.com/ai/how-ai-powered-market-research-helps-predict-success-of-future-products-and-advertising/.
Komando, Kim. Tech Privacy: 5 Ways You’re Being Tracked and How to Stop It. USA Today, 7 July 2022, https://www.usatoday.com/story/tech/columnist/komando/2022/07/07/tech-privacy-unknown-tracking/7792835001/.
Muhammad, Zia. 7 In 10 Of Consumers Don’t Want Companies Tracking Them. Digital Information World, 7 Oct. 2021, https://www.digitalinformationworld.com/2021/10/7-in-10-of-consumers-dont-want.html.
Oliver, Richard L. Satisfaction: A Behavioral Perspective on the Consumer. 2nd ed., Routledge, 2015.
Packard, Vance. The Hidden Persuaders. New York, D. McKay Co., 1957.
Pastor, Alexandra. How Has Market Research Changed Over the Years? Drive Research, 17 Nov. 2022, https://www.driveresearch.com/market-research-company-blog/how-has-market-research-changed-over-the-years/.
Patel, Neil. What Data Can You Pull from Cookie Tracking? NP Digital, 31 Oct. 2021, https://neilpatel.com/blog/cookie-based-advertising-wont-work/.
Ritzer, George. The McDonaldization of Society: Into the Digital Age. 9th ed., SAGE, 2019.
Sar, Sela, and Lulu Rodriguez. “Copy Test and Starch Test.” The International Encyclopedia of Communication Research Methods, 2017, https://doi.org/10.1002/9781118901731.iecrm0047.
Stewart, David W., and Prem N. Shamdasani. Focus Groups: Theory and Practice. 3rd ed., Sage, 2015.
U.S. Bureau of Labor Statistics. Market Research Analysts: Occupational Outlook Handbook. 6 Feb. 2023, https://www.bls.gov/ooh/business-and-financial/market-research-analysts.htm.
Fullerton, Ronald. “Ernest Dichter: The Motivational Researcher.” Ernest Dichter and Motivation Research: New Perspectives on the Making of Post-War Consumer Culture, edited by Stefan Schwarzkopf and Rainer Gries, Palgrave Macmillan, 2010, https://doi.org/10.1057/9780230293946_3.
Walle, Thomas. Five Data Streams Needed To Measure Retail Market Share. Forbes Magazine, 30 Mar. 2023, https://www.forbes.com/sites/forbestechcouncil/2023/03/29/five-data-streams-needed-to-measure-retail-market-share/?sh=1e34303badaa.
Wind, Yoram, and Richard N Cardozo. “Industrial Market Segmentation.” Industrial Marketing Management, 3rd ed., vol. 3, Elsevier Scientific Publishing Company, Amsterdam, 1974, pp. 153–165.