14th Feb 2018 –
When we asked what people thought ‘testing technology’ meant, no two answers were the same.
Of course, vendor websites are good for accessing resources too. Many platforms offer regular support and content on their blog (or resource) pages. But these usually focus on their own technology rather than the wider CRO industry.
Some respondents took a middle ground, also citing the need to include analytics teams. For others, the issue was not who chooses the toolset but how well it is integrated to produce the clearest results.
Maybe the tools used in testing only reflect how well planned the experiments are.
A poor choice of tools by the client could impact budgets and resources. An agency's choice of tools needs to encompass the wider business goals, and gaining an understanding of those goals from clients can still pose a challenge for agencies.
Partnering with a specific toolset often gives agencies access to deeper functionality. Agencies can also be more cost-effective than running the same tool at the same level with an in-house team.
Choosing tools that don't match the maturity level of a testing programme can also be an issue: the tools may end up underutilized, or inadequate for the testing required.
‘Testing’ means different things to different people:
The client-side culture for CRO is going to be a factor in how welcome agencies are, and how valued their testing is. Clients who have been 'burnt' in the past by mediocre or poor agency work may take a negative view of implementing CRO.
Agencies often specialize in specific tools: ones they've benchmarked, tried, and tested across many industries and clients. Partnering with these vendors ensures they keep up to date with new features and developments. Internal marketing teams may already have contracted tools in place within the business, or may be about to re-procure for upcoming campaigns from new budgets.
- Google Analytics – web analytics service that tracks and reports traffic to websites.
- Clicky – web analytics software that tracks and reports traffic to websites, with additional options for heatmaps and uptime monitoring.
- Hotjar – toolset for analytics, surveys, heatmaps and other customer monitoring functions.
- Convert.com – A/B testing, split URL testing, and multivariate testing software.
- Optimizely – customer experience optimization software.
- VWO (Visual Website Optimizer) – visitor research, optimization road mapping, and testing software.
- A1WebStats – visitor identification and analysis software.
- Qubit – programmatic personalization and experience testing software.
- Google Optimize – A/B testing and personalization tools for website owners.
- AB Tasty – A/B testing and personalization software.
- Peerius – website personalization software acquired by Episerver.
- Adobe Target – automated personalization and A/B and multivariate testing software.
- Mailchimp – marketing automation and email marketing platform.
- Tealeaf – customer experience capture and analysis software, now part of IBM.
- Dotmailer – marketing automation and email marketing platform.
- Kitewheel – real-time personalized customer journey software.
Marketing as an industry is more aware of the benefits of optimization. Yet that level of awareness can differ by sector, and by the skills and size of the marketing team. Some gave us technical responses; others described more of a general feeling of the results they were after. It wasn't limited to A/B, MVT, and personalization testing: testing for compliance and channel efficacy was part of these answers too.
The technology wasn’t the differentiator for clients.
There are other considerations businesses need to be aware of when choosing tools for optimization campaigns. To help understand the challenges in choosing technology, we surveyed people from both sides of the optimization world. As long as the choice of technology is independent of a commission, that is!
For 2018 onwards, the biggest challenge for marketers will be around GDPR compliance.
We contacted key marketers and CRO agencies for their opinions.
Respondents mentioned a perceived kickback in toolkit partnerships. Some agencies will receive a percentage on any software contract sold to their customers, while others have contracts in place with clients to guarantee neutrality. This depends on the type of partnership contract they have with the toolset they use. GDPR compliance, though not referenced widely by our respondents, will pose a challenge for which tools form part of their testing and marketing toolkit.
As expected, the answer to this question differed by role. Those in agency roles saw themselves as better placed to choose; those in marketing roles held the same view of their own side.
But who is best placed to decide on the right ones to use? More than once, we noted comments that the choice of tools might not be independent. While not a major issue for most, the transparency behind this choice could concern clients.
We then asked a couple of deeper questions to determine why, and asked people to list the best sources for up-to-date CRO information. There was little common ground.
Our findings show that either role (client or agency) could choose the toolkit. But this relies on honestly assessing, and accepting, whether a person's skill level is up to the task.
What influences the choice of working with a CRO agency?
For our respondents, CRO experience was the key influence on their decision making, not the tools an agency chose to partner with.
Add your voice to this conversation by taking part in our short survey here.
Agencies bring experience in both approach and execution, as well as the ability to offer advice and direction during campaigns. They take the lead on how best to draw insights and conclusions from any results.
The challenge for digital marketing and optimization:
Agencies may already have a small roster of approved partner tools they work with—often choosing to specialize in fewer tools to develop their level of expertise in these systems.
It was considered that the client is best placed to understand the wider impact of testing. After all, they know the goals of the business strategy. An agency may only see a small section of the bigger picture.
If not, or if the business has no CRO leadership within its team, then an agency partner should lead on this choice.
One clear suggestion from both agencies and marketers was Conversion XL's blog.
We then asked which were the top tools they were aware of, and which they used. Once tallied, the tools named (and in no particular order) were as follows:
Resources and learning:
Here are some of the insights we uncovered.
A big part of successful optimization comes from the tools used to run the testing campaigns.
This can be down to the pace at which businesses work, and the desire to see results faster. The continuous innovation in the industry creates a need for continuous learning. This remains a challenge for both CRO agencies and digital marketing teams.
Learn more about how to ensure your chosen tool is ahead of GDPR compliance.
(You can read more about how Convert.com is getting ahead of this regulation here.)
Running in-house teams means there are fewer people in the decision-making process. Implementing new technology or making changes to campaigns can be quicker.
An agency's ability to showcase this expertise was key to their decision. The expectation was that any technology partnerships would be in the best interests of their clients.
The tools mentioned above are just a selection of those available. There are many options out there for all aspects of testing and optimization, but when it comes to selecting which tools to use together, the choice can soon become overwhelming.
Agency, Client (or both) – who do you think is best placed to choose the tools for testing?
The different tools and technology they use will need to be compliant in the way they capture and store user data. Obtaining permission for this data capture may require a review of their current technology and tools.
The nature of their work means they also have to keep up with the latest industry technology—often comparing and contrasting new technologies against their preferred toolset. They update this toolset to maintain their advantage for better results.
Clients might not be aware of the features and issues to consider when selecting a tool. Finding time to research and educate themselves can be difficult for internal teams with limited resources.