
Quantitative vs. Qualitative Research

Quantitative research, in general, leverages statistics as the basis for making generalizations about an issue at hand. Qualitative research, on the other hand, performs qualitative inquiry built on small data, context, and human judgment.

What’s quantitative data?

The characteristics of quantitative research contribute to methods that use statistics as the basis for making generalizations about something. These generalizations are constructed from data that is used to find patterns and averages and test causal relationships.

Quantitative data has become extremely important, especially for improving business processes.

When dealing with quantitative data, it’s critical to have a selection pipeline that determines which data makes sense and can drive the business.

In other words, a lot of time will be spent building and curating the dataset, which will be used as the foundation to analyze the business.

Otherwise, the risk is relying on unreliable data, which only adds noise for the business.
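As a rough illustration, a minimal curation step might look like the sketch below (a Python example with hypothetical column names and thresholds, not the pipeline of any specific company): it keeps only reliable records before any analysis runs.

```python
# Minimal sketch of a data selection/curation step.
# Column names ("order_id", "order_value", "order_date") and thresholds
# are hypothetical, used purely for illustration.
import pandas as pd

def curate_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Keep only the rows that are reliable enough to analyze."""
    df = raw.copy()

    # Drop records missing the fields the analysis depends on.
    df = df.dropna(subset=["order_id", "order_value", "order_date"])

    # Remove duplicate orders, which would otherwise inflate counts.
    df = df.drop_duplicates(subset="order_id")

    # Filter out implausible values that would skew averages.
    df = df[(df["order_value"] > 0) & (df["order_value"] < 100_000)]

    # Normalize dates so downstream aggregations behave consistently.
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    return df.dropna(subset=["order_date"])

# Example usage: only the curated dataset feeds the business analysis.
# orders = curate_orders(pd.read_csv("orders.csv"))
# print(orders["order_value"].mean())
```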

Many tech companies, like Google, Amazon, Netflix, and Microsoft, leverage data in their business processes.

Some examples of how quantitative data drives those processes include:

  • Inventory management.
  • Order fulfillment.
  • Product recommendation.
  • Indexing and ranking.
  • Spam detection.
  • A/B testing.
  • Content recommendation.

In other words, there are tons of practical use cases for which data can be used to improve business processes.

It’s also important to balance that with qualitative data.

What’s qualitative data?

Qualitative research is performed by businesses that acknowledge the human condition and want to learn more about it. Some of the key characteristics of qualitative research that enable practitioners to perform qualitative inquiry comprise small data, the absence of a definitive truth, the importance of context, and the researcher’s skills and area of interest.

Qualitative data is extremely important as it can change the nature of our quantitative understanding.

For instance, while tech companies leverage quantitative data to improve their processes, much of it is informed by qualitative understanding, which makes that quantitative data far more valuable.

Indeed, the risk of quantitative data is too much generalization, ultimately leading to the creation of abstract scenarios that do not exist in the real world.

In addition, quantitative data is skewed toward things that can be measured, thus attributing far more importance to what can easily be measured than to what can’t.

Take the case of digital marketing campaigns, where you can easily track clicks, thus attributing more importance to platforms like Google Ads, which are easily tracked.

Yet, you realize that people might be clicking on your ad campaigns thanks to your strong brand, which can’t be directly measured.

Thus, only by applying qualitative judgment to your business do you find out that branding drives your performance campaigns.

This is one of the many examples of how qualitative data can inform quantitative data.

Other examples include:

  • Data selection.
  • Data curation.
  • Data cleaning.
  • Validation workflows.
  • Understanding changing contexts in which quantitative data no longer makes sense.

All of the above help make quantitative data much more valuable by removing a substantial amount of noise.

Quantitative vs. Qualitative Research

Dealing with data is extremely hard.

It’s one of the hardest things in business.

And as most businesses now have a lot of data available, it’s easy to fall into the trap of misusing it.

For that, it’s critical to establish proper business processes that make it clear to the internal team when to use quantitative data, qualitative data, or both.

Quantitative research, if used in the proper context, can be incredibly effective.

For instance, companies like Amazon have been using quantitative research to drastically improve – over time – their business processes, from inventory management to order fulfillment.

This is part of Jeff Bezos’ “Day One” Mindset.

In a letter to shareholders in 2016, Jeff Bezos addressed a topic he had been thinking about quite profoundly in the last decades as he led Amazon: Day 1. As Jeff Bezos put it, “Day 2 is stasis. Followed by irrelevance. Followed by excruciating, painful decline. Followed by death. And that is why it is always Day 1.”

This forced Amazon to understand how to leverage both quantitative data (for business processes) and qualitative data (for discovery).

In what Bezos labeled as customer obsession.

As Jeff Bezos recounted in 2006:

Many of the important decisions we make at Amazon.com can be made with data. There is a right answer or a wrong answer, a better answer or a worse answer, and math tells us which is which. These are our favorite kinds of decisions.

As Jeff Bezos also highlighted at the time:

As our shareholders know, we have made a decision to continuously and significantly lower prices for customers year after year as our efficiency and scale make it possible.

Indeed, this was the core tenet of Amazon’s flywheel.

And Jeff Bezos also explained:

This is an example of a very important decision that cannot be made in a math-based way. In fact, when we lower prices, we go against the math that we can do, which always says that the smart move is to raise prices.

This is a critical point to understand, as Amazon has learned how to integrate quantitative and qualitative understanding within its business processes over the years.

Indeed, as Jeff Bezos further explained:

We have significant data related to price elasticity. With fair accuracy, we can predict that a price reduction of a certain percentage will result in an increase in units sold of a certain percentage. With rare exceptions, the volume increase in the short term is never enough to pay for the price decrease. 

In other words, by using statistical tools like price elasticity, you can have a short-term quantitative understanding.
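As a rough sketch, with made-up numbers rather than Amazon’s actual figures, that short-term calculation looks something like this:

```python
# Illustrative short-term elasticity arithmetic (hypothetical numbers,
# not Amazon's data). Elasticity is given as an absolute value: the
# % increase in units sold per % decrease in price.
def short_term_revenue_change(price_cut_pct: float, elasticity: float) -> float:
    price_factor = 1 - price_cut_pct                # e.g. 0.90 for a 10% cut
    units_factor = 1 + elasticity * price_cut_pct   # e.g. 1.08 at elasticity 0.8
    return price_factor * units_factor - 1

# A 10% price cut with an elasticity of 0.8 lifts units sold by 8%,
# yet short-term revenue still falls by roughly 2.8%.
print(f"{short_term_revenue_change(0.10, 0.8):+.1%}")
```

That immediate revenue hit is exactly the kind of answer quantitative tools can predict with fair accuracy.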

But it tells you nothing about the potential long-term effects of it.

This is where you understand the limitations of statistical tools.

Which Jeff Bezos explained extremely well:

However, our quantitative understanding of elasticity is short-term. We can estimate what a price reduction will do this week and this quarter. But we cannot numerically estimate the effect that consistently lowering prices will have on our business over five years or ten years or more. 

By understanding the drawbacks and limitations of quantitative methods, you know when human judgment needs to kick in.

The Importance of Human Judgement

Jeff Bezos articulated it incredibly well, when he said, back in 2006:

Our judgment is that relentlessly returning efficiency improvements and scale economies to customers in the form of lower prices creates a virtuous cycle that leads over the long term to a much larger dollar amount of free cash flow, and thereby to a much more valuable Amazon.com.

This is a great point to emphasize.

As most long-term decisions with second-order effects require a different thinking approach.

Indeed, most of Amazon’s successful long-term projects that really moved the needle were mostly the result of human judgment, as Jeff Bezos further articulated:

We’ve made similar judgments around Free Super Saver Shipping and Amazon Prime, both of which are expensive in the short term and—we believe—important and valuable in the long term.

Balancing Data with Human Intuition

That is why it’s critical to know when human judgment needs to kick in.

This usually happens when we need to balance short-term decisions with long-term ones.

While quantitative data is extremely useful for telling us the short-term consequences of a decision, it might not tell us anything about long-term ones.

This is true for both positive and negative cases.

Imagine the case, for instance, where quantitative data tells you that a decision is sound in the short term, yet it might turn out to carry a lot of hidden costs in the long run.

For instance, consider a company that invests all its marketing dollars in performance marketing campaigns without ever building a solid brand.

Quantitative judgement tells you that performance marketing campaigns work exceptionally well.

However, the thing is, unless you build a strong brand, you won’t survive in the long term.

Only a deep understanding of the business can help you deal with that.

And the opposite is true.

Imagine you would not spend resources building your brand, as you don’t see short-term results.

From an intuitive standpoint, you know that your competitive moat will depend on your ability to build a brand.

Yet if you were to follow the short-term understanding of your business through quantitative research alone, you would end up destroying it in the long run.

Second-Order Effects and System Thinking

Thus, you need to properly balance quantitative with qualitative data.

And short-term with long-term thinking.

The way to do that is to practice second-order thinking.

Second-order thinking is a means of assessing the implications of our decisions by considering future consequences. Second-order thinking is a mental model that considers all future possibilities. It encourages individuals to think outside the box so that they can prepare for any eventuality. It also discourages the tendency for individuals to default to the most obvious choice.

Read Next: Qualitative Data, Quantitative Data.
