Conversionista worked alongside the e-commerce team at Fortum to help shift their focus and process from idea – implementation – reporting to analysis – insight – experimentation.
In this case study we walk through a deep-dive analysis that led to small, targeted changes to specific elements. These changes helped Fortum go from a design that converted 9% worse than the original version to a design that converted 9% better!
When Conversionista started collaborating with Fortum, the team saw big opportunities in working in a more data-driven way. The e-commerce team had gaps in its knowledge of the user journey, the pain points, and where to start optimizing the site. Time was primarily spent on reporting, and even though there were plenty of optimization ideas within the team, no processes were in place to measure the effect of the implemented ideas. This is where we came in!
We wanted to create a truly data-driven way of working, with analytics as a core building block. Conversionista helped the team build a process for turning ideas into clear hypotheses based on a combination of data points. We call that data triangulation! The team started with A/B testing and prioritization of the A/B tests. Prioritization was important because of the limited traffic and limited resources in the team. The analysis of each experiment after completion was equally important, to gain as much insight into the user journey as possible.
Together with Conversionista, the Fortum team ran an A/B test on the price page in the purchase flow. We wanted to redesign the pricing blocks to clarify the discounts and make important information more visible. We also wanted to reduce complexity by helping the user navigate to further information and giving them the right amount of information at the right time within the pricing blocks. The main KPI for the test was click-through rate (CTR); we also decided to follow up on how many users clicked to view more price details and how that behavior affected the click-through rate.
Why CTR as the main KPI? A site may have too little traffic to reach a statistically significant result in conversion rate or revenue, so measuring whether the user proceeds to the next step in the funnel is key!
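To see why a funnel-step KPI needs less traffic, here is a minimal sketch comparing the sample size required to detect the same 9% relative lift on two KPIs with different baselines. The baseline rates (40% step CTR, 2% purchase conversion) are illustrative assumptions, not Fortum's real numbers, and the formula is the standard two-proportion sample-size approximation (alpha = 0.05 two-sided, 80% power):

```python
# Sketch: why CTR can reach significance with far less traffic than
# conversion rate. Baseline rates below are hypothetical examples.
import math

def sample_size_per_variant(p1: float, relative_lift: float,
                            z_alpha: float = 1.96, z_beta: float = 0.8416) -> int:
    """Approximate n per variant to detect p1 -> p1 * (1 + relative_lift)."""
    p2 = p1 * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# The same 9% relative lift, measured on two different KPIs:
n_ctr = sample_size_per_variant(0.40, 0.09)   # step CTR, assumed 40% baseline
n_conv = sample_size_per_variant(0.02, 0.09)  # purchase conversion, assumed 2%
print(n_ctr, n_conv)  # the funnel-step KPI needs far fewer users per variant
```

With these assumed baselines, the low-rate conversion KPI needs tens of thousands of users per variant, while the step CTR needs only a few thousand.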
The result (of the first test)
The result from this A/B test was statistically significant: -9% on the main KPI, click-through rate.
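The article does not share the raw counts, but a result like this is typically checked with a two-sided two-proportion z-test. The sketch below uses made-up counts in which the variant's CTR is about 9% lower in relative terms; only the direction and rough size of the effect mirror the article:

```python
# Minimal two-proportion z-test sketch (counts are hypothetical).
import math

def two_proportion_z_test(clicks_a, n_a, clicks_b, n_b):
    """Two-sided z-test for the difference between two click-through rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p_value

# Hypothetical: 40.0% CTR in the original vs 36.4% in the variant (~ -9%)
z, p = two_proportion_z_test(2000, 5000, 1820, 5000)
print(round(z, 2), round(p, 4))
```

With these invented counts the drop is significant at the 5% level; with less traffic the same relative drop would not be.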
Since multiple elements were changed in the test, Conversionista did a deeper analysis by looking into the smaller elements to gain insight into what caused the decrease in the click-through rate.
Below we walk through the process of how we conducted the deeper analysis so we could maximise our learnings.
The price details element was analyzed to see if there was a change in the number of users clicking on “See how we calculated” vs. “See detailed price”. The original had a drop-down function, whilst in the variant the user had to open a modal to view the price details. The change in the variant was based on the hypothesis that the original price block lacked clear communication regarding price. However, no change in the number of interactions was seen.
The analysis continued by looking at the interactions within the drop-down and within the modal. The hypothesis was that it was not clear in the original that users could expand more price details after opening the card the first time. Here there was a big difference in the share of users clicking to see more details:
🔹15% of the users who opened the price details in the original (drop-down) also opened the next level of price details
🔹57% of the users who opened the price details in the variant (modal) also opened the next level of price details
The conclusion was that users might find the second level of price details easier to find, or interact with, in the variant. But the results could also indicate that users were looking for further information in the variant, since it initially showed less information than the original.
Deeper analysis looked at click-through behavior after interaction with the price details. Fewer users in the variant continued to the next step of the funnel after opening the second level of price details: 18% in the variant vs. 75% in the original. This could indicate that users were facing problems with the modal interaction.
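The kind of segmentation behind numbers like these can be sketched as a conditional rate: for each variant, the share of users who continued to the next step given that they opened the second level of price details. The event log below is a made-up toy dataset, not Fortum's data, and the field names are hypothetical:

```python
# Segmented CTR sketch: continuation rate conditioned on opening the
# second level of price details. Toy data, hypothetical field names.
from collections import defaultdict

def segmented_ctr(events):
    """events: iterable of (user_id, variant, opened_details, clicked_through)."""
    opened = defaultdict(int)
    continued = defaultdict(int)
    for _, variant, opened_details, clicked_through in events:
        if opened_details:
            opened[variant] += 1
            if clicked_through:
                continued[variant] += 1
    return {v: continued[v] / opened[v] for v in opened}

toy_log = [
    ("u1", "original", True, True),
    ("u2", "original", True, True),
    ("u3", "original", True, True),
    ("u4", "original", True, False),
    ("u5", "variant", True, True),
    ("u6", "variant", True, False),
    ("u7", "variant", True, False),
    ("u8", "variant", True, False),
    ("u9", "variant", False, True),  # never opened details: excluded from segment
]
print(segmented_ctr(toy_log))  # → {'original': 0.75, 'variant': 0.25}
```

Comparing this conditional rate across variants is what isolates the modal interaction as a suspect, rather than the redesign as a whole.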
Conclusion and Learnings
These findings led to the conclusion that the price details were probably easier to find in the variant, but that the modal could be the cause of the decrease in click-through rate.
Based on this test and its analysis, the variant was redesigned: most of the changes were kept, except the modal function for the price details. The drop-down function was retained, and copy and colors were clarified to make the second level of price details easier to find.
A new test was run with the drop-down function instead of the modal function. This test led to a new significant result: +9% in click-through rate.
These results show the importance of deeper analysis. Analysis brings more value and insights to the entire experimentation process; it is a big part of interpreting the outcomes of an A/B test and deciding what the next step should be.
If you would like to discuss how your organization can benefit from a truly data-driven process, please reach out to organize a meeting with our experts.