...

Look closely at the testing results to decipher and apply key findings.

Once you’ve run the test, you’ll want to take a look at the results. We’ve tried a number of analytics platforms and learned that while none of them is perfect, the quality of the output you get mirrors the effort you put into setting them up. Here’s a quick rundown of the platforms we tried.

MOUSESTATS

We were initially really impressed with MouseStats, which positions itself as a “set it and forget it” solution, but it didn’t work out for us in the end. It provides much of the data we need, such as form analytics, heatmaps, and screen recordings, and it has tools to help you identify areas for improvement. However, MouseStats doesn’t let you export and download your analytics data in a useful format such as CSV or JSON, which was a dealbreaker for us.

MouseStats example
HOTJAR

Hotjar is a direct competitor to MouseStats, but we saw immediately that it wasn’t a good fit. Like MouseStats, there’s no way to export analytics data, and on top of that, it doesn’t treat forms in a way we find useful. For example, we want a group of radio buttons to be treated as a single form element, with the analytics data combined to reflect user interactions with the group as a whole, rather than analyzing each radio button individually (which is what Hotjar does). A rough sketch of the kind of grouped tracking we have in mind follows the screenshot below.

Hotjar example
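To make the requirement concrete, here’s a minimal sketch of grouped radio-button tracking. It isn’t Hotjar code; the `recordInteraction` hook and the `contact-method` group name are made-up placeholders, just to illustrate combining a group’s interactions into one event.

```javascript
// Treat a whole radio group as one form element: report a single
// combined interaction instead of one event per radio button.
// `recordInteraction` is a hypothetical reporting hook, not a Hotjar API.
function trackRadioGroup(form, groupName, recordInteraction) {
  var changes = 0;

  form.addEventListener('change', function (event) {
    var input = event.target;
    if (input.name !== groupName || input.type !== 'radio') {
      return;
    }
    changes += 1;
    recordInteraction({
      element: groupName,         // the group, not the individual button
      selectedValue: input.value,
      timesChanged: changes       // how often the user switched answers
    });
  });
}

// Usage (form and group names are invented for the example):
// trackRadioGroup(document.querySelector('#signup-form'), 'contact-method', console.log);
```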
FORMISIMO

Formisimo didn’t work and spammed us with emails.

MOUSEFLOW

Mouseflow initially seemed like the winner for us. It captures the form analytics we needed and let us export the data as a CSV for further processing. These were exactly the features we were looking for, but we ran into insurmountable issues with the accuracy of its reports: false positives for things like form completion and form abandonment, and consistently inaccurate reports of how long users spent interacting with form elements.

Mouseflow example
MIXPANEL

After working with MouseStats, Hotjar, and Mouseflow, we concluded that we needed a more sophisticated solution that we could customize to fit the unique analytics needs of each of our tests. The “set it and forget it” approach of the services above wasn’t cutting it anymore. We turned to Mixpanel because it provides a JavaScript library of tracking utilities that lets us specify exactly what we want to measure on any given test. Our Mixpanel data is refreshingly accurate, and the data processing and filtering tools on its site let us generate custom reports for each test.

Mixpanel example

Today we’re using a combination of Mixpanel and Mouseflow to generate all of our required analytics. Mixpanel gives us precise measurements of the tracking events we specify for our tests (time spent in a field, number of corrections, form validation errors), while Mouseflow captures screen recordings and heatmaps. A sketch of what that kind of instrumentation can look like follows.
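Here’s a rough sketch of per-field instrumentation using Mixpanel’s `time_event` and `track` calls. The event names, selectors, and the way corrections are counted are illustrative assumptions, not our exact production setup.

```javascript
// Sketch: measure time spent in a field and count corrections with Mixpanel.
// Event names and selectors below are illustrative, not our actual setup.
mixpanel.init('YOUR_PROJECT_TOKEN');

function instrumentField(field) {
  var corrections = 0;

  field.addEventListener('focus', function () {
    // Starts a timer; the next track() of the same event gets a $duration property.
    mixpanel.time_event('Field Completed');
  });

  field.addEventListener('keydown', function (event) {
    if (event.key === 'Backspace' || event.key === 'Delete') {
      corrections += 1; // crude proxy for "number of corrections"
    }
  });

  field.addEventListener('blur', function () {
    mixpanel.track('Field Completed', {
      field: field.name,
      corrections: corrections
    });
    corrections = 0;
  });
}

// Example: instrument every input in a hypothetical signup form.
document.querySelectorAll('#signup-form input').forEach(instrumentField);
```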

What you’re looking for, and what you’re looking at, will change depending on which analytics platform you choose, but for the first few tests we ran, our analytics focused on time spent in a given form field. When analyzing our data, we’d start by referring back to our original question to reorient ourselves and remember what answer we were looking for. We’d then examine the demographic distribution of our sample to see whether it matched the control test. If the two matched (and they always did), we’d look at the analytics for the field in question to see whether the key metrics we had identified for it improved. We’d also look at heatmaps and screen recordings to help explain the metrics we saw, as there’s always something interesting going on behind the numbers.
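For the “did the key metric improve” step, the exported data can be crunched however you like. Here’s a small sketch that averages time-in-field per variant from an exported CSV; the column names (`variant`, `field`, `duration_ms`) and file name are hypothetical, so a real export would need its own parsing.

```javascript
// Sketch: compare average time-in-field between variant and control from an
// exported CSV. Column names are assumptions, not a specific platform's schema.
const fs = require('fs');

function averageDurations(csvPath) {
  const lines = fs.readFileSync(csvPath, 'utf8').trim().split('\n');
  const header = lines.shift().split(',');
  const variantCol = header.indexOf('variant');
  const fieldCol = header.indexOf('field');
  const durationCol = header.indexOf('duration_ms');

  const totals = {}; // "variant/field" -> { sum, count }
  for (const line of lines) {
    const cols = line.split(',');
    const key = cols[variantCol] + '/' + cols[fieldCol];
    if (!totals[key]) totals[key] = { sum: 0, count: 0 };
    totals[key].sum += Number(cols[durationCol]);
    totals[key].count += 1;
  }

  for (const [key, { sum, count }] of Object.entries(totals)) {
    console.log(key, 'average ms in field:', (sum / count).toFixed(0));
  }
}

averageDurations('exported-events.csv'); // file name is an example
```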

Finally, once we got more data to help us answer our initial question, we’d decide on what to tackle next. Oftentimes, running a test raised questions that led to two or three additional tests, and we’d begin the process all over again by asking ourselves how we would find the answers to those questions.