There are different types of data: quantitative and qualitative. They are different in nature and used for different forms of research, but both are data.
Quantitative data is what you usually see on dashboards or what is often discussed a bit more in management groups. It is numerical data or data that can be translated into some form of statistics and is derived from a larger population. If you work with the number of conversions, NPS, or the number of users who answered “I agree” in the latest survey, then it is quantitative data.
Examples of quantitative methods are web analytics in tools such as Google Analytics or Adobe Analytics, clickmaps, and polls and surveys with so-called closed answers (i.e., the user can only respond based on a set of predetermined answer options). Even interviews can be designed quantitatively if you close the questions and do not allow users to elaborate on their answers.
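The countable nature of closed answers can be made concrete with a small sketch. This is a hypothetical example (the answer options and responses are made up), but it shows why closed answers turn directly into statistics:

```python
from collections import Counter

# Hypothetical closed-answer survey: each user picked exactly one of
# the predetermined options, so the data is directly countable.
responses = [
    "Agree", "Agree", "Disagree", "Neutral", "Agree",
    "Disagree", "Agree", "Neutral", "Agree", "Agree",
]

counts = Counter(responses)
total = len(responses)

# Quantitative summary: share of users per answer option.
for option, n in counts.most_common():
    print(f"{option}: {n}/{total} ({n / total:.0%})")
```

An open answer ("I wasn't sure what the trial actually included…") has no such automatic tally; it first has to be interpreted, which is exactly the qualitative work described next.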
Qualitative data is mainly used for exploratory research. It digs into needs, opinions, and motivations and explores what cannot really be quantified. Usually, a smaller number of data points are used, and the outcome is often in the form of descriptions of insights into the user's thoughts and behaviors.
Examples of qualitative methods are semi-structured or unstructured interviews, usability tests, and polls and surveys with open answers. You can absolutely quantify insights from qualitative studies, but that requires further research. The results from qualitative research with small populations cannot be scaled up to represent an entire user base. It will not be representative in a quantitative way. But the insights that the result consists of are still data.
Quantitative data is excellent for answering questions like "How many users do X?", tracking trends over time, or measuring effect and uplift. When you conduct quantitative studies, you find out what users do: they bought a product, added something to the cart, or clicked a button. But to find out why they did or did not do something, you need qualitative methods. So to get the whole picture, you usually need to combine these different types of data. You cannot use a "what" without having a "why", and vice versa.
"You cannot use "what" without having a "why" and vice versa."
When conducting research, you usually set up a research question. It is a way to frame the continued work so that it has a clear direction and boundary. Having a well-formulated research question will guide you on which methods should be used.
If you want to explore what the next step is for your site or product, the research question could be something like:
What uncertainties do users have that prevent them from agreeing to a trial period when browsing the site?
Exploring uncertainties means digging into feelings and motivation, so a relevant setup to answer this question could be:
→ Interviews or surveys with open answers → Result: what users say prevents them from completing the flow.
→ Usability tests → Result: both what users say prevents them from moving forward, and insights into specific behaviors that indicate they are distracted or cannot find their way.
→ Web analytics → Result: what users do, how they move through the site, and in which steps they seem to have the most problems.
To get the whole picture and answer the research question, you can't really remove any of these methods. You can replace one or more with other methods, but the problem still needs to be explored from different perspectives.
Combining methods and different types of data is called triangulating. Simply put, it means looking at a problem from different perspectives to get a more comprehensive picture. It doesn't have to be difficult. It mainly involves continuing to conduct research from different angles to see if the data tells the same story regardless of how you look at it.
"Simply put, it means looking at a problem from different perspectives to get a more comprehensive picture."
Sure, sometimes you conduct a study and the result shows something completely different from what you expected. In these cases, just keep digging and find out why it gives a different result this time. It doesn't necessarily mean the study was conducted incorrectly (but watch out for confirmation bias!); it could be as simple as not having the whole picture yet. So keep doing the research and triangulate the data!
Have you tried everything but are still not getting any uplift? You may have all the necessary tracking in place and indications of where the problem lies on the site. You've tried every persuasive principle from every possible book, but nothing seems to give a result.
Let go of the numbers for a moment and try a different approach: use qualitative methods to find out why users behave the way they do.
From there, you can take the project further. Conduct an A/B test, redesign pages, or change the copy on product pages. Whatever the problem may be, you have now illuminated it well enough to take the next step.
Good luck with the research!