Recently, Solomon posted a blog about why quality assurance (QA) is important and how to get started. To take things a step further, here are some pairs of email QA concepts that look similar but differ in ways that matter for accurate quality assurance.
Resolved links vs. Correct links
A big part of email QA is testing that all links resolve correctly per the intake documentation or copy doc provided by content teams. However, a link that resolves to a seemingly correct page may still not be the correct link. A root domain and subfolder will send you to the right web page, but your URL parameters, tracking codes, or anchor fragments could be wrong. As alarming as this sounds, it is easy to straighten out.
Compare the links in the intake document against the links in your email, either in the View in Browser version or directly in the platform. If you want to take the guesswork out of it, paste both links into a web tool like Text Compare to be sure they match exactly.
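If you'd rather script that comparison than eyeball it, here is a minimal sketch using only Python's standard library. It splits two URLs into their parts and reports any mismatch in domain, path, query parameters, or anchor fragment; the example URLs and UTM values are placeholders, not anything from a real campaign.

```python
# A rough link-comparison sketch: intake_url comes from the copy doc,
# email_url comes from the built email (both as plain strings).
from urllib.parse import urlparse, parse_qs

def diff_links(intake_url: str, email_url: str) -> list[str]:
    """Return a list of human-readable differences between two URLs."""
    a, b = urlparse(intake_url), urlparse(email_url)
    problems = []
    # Compare the structural pieces first: scheme, domain, path, anchor.
    for part in ("scheme", "netloc", "path", "fragment"):
        if getattr(a, part) != getattr(b, part):
            problems.append(f"{part}: {getattr(a, part)!r} != {getattr(b, part)!r}")
    # Compare query parameters, since tracking codes usually live here.
    qa, qb = parse_qs(a.query), parse_qs(b.query)
    for key in sorted(set(qa) | set(qb)):
        if qa.get(key) != qb.get(key):
            problems.append(f"param {key}: {qa.get(key)} != {qb.get(key)}")
    return problems

# Example: same landing page, but the campaign tracking value is off by a typo.
print(diff_links(
    "https://example.com/sale?utm_source=email&utm_campaign=spring",
    "https://example.com/sale?utm_source=email&utm_campaign=sprng",
))
```

Both links would pass a quick "does it land on the right page" check, but the script flags the mismatched parameter immediately.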
Misspelling vs. Local spelling
Depending on your email content process, copy may vary in local spelling between where the content originated and where it is being sent (e.g., color vs. colour). Be sure there is rhyme or reason behind any inconsistent spelling, or it may stand out to your audience as an error.
Keep a library of common words or grammar patterns that vary between regional variants of the same language, so your team can reference it in one place if there is a question.
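As a rough illustration, assuming that library is just a US-to-UK word map kept alongside your copy docs, a few lines of Python can flag copy where both spellings of the same word appear. The word list here is only an example.

```python
# A minimal sketch: flag mixed US/UK spellings within one block of copy.
import re

# Example entries only -- your team's library would be far more complete.
US_TO_UK = {"color": "colour", "center": "centre", "organize": "organise"}

def mixed_spellings(copy_text: str) -> set[str]:
    """Return US words whose UK counterpart also appears in the same copy."""
    words = set(re.findall(r"[a-z]+", copy_text.lower()))
    return {us for us, uk in US_TO_UK.items() if us in words and uk in words}

print(mixed_spellings("Pick your favourite colour at our color center."))
# {'color'} -- both spellings appear, so confirm the inconsistency is intentional.
```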
Email client version A vs. Email client version B
When it comes to email clients, the differences between versions can determine whether your email renders correctly or breaks. Therefore, it is not sufficient to test a single version of an email client; you will need to test several. Litmus is a great tool for testing how one email renders across multiple clients.
Before you begin your project, make a list of which versions of mobile and desktop clients you will test and have it approved by members of the build and QA team.
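One way to keep that list reviewable is to store it as plain data and print it as a sign-off checklist. The sketch below is only an illustration; the clients and versions are placeholders, so swap in whatever your audience data says you actually need to cover.

```python
# A hypothetical client test matrix kept as data so it can live in version control.
TEST_MATRIX = [
    {"client": "Apple Mail", "platform": "desktop", "versions": ["16"]},
    {"client": "Outlook",    "platform": "desktop", "versions": ["2019", "365"]},
    {"client": "Gmail app",  "platform": "mobile",  "versions": ["Android", "iOS"]},
]

# Print a checklist the build and QA teams can approve before the project starts.
for entry in TEST_MATRIX:
    for version in entry["versions"]:
        print(f'[ ] {entry["client"]} ({entry["platform"]}) - {version}')
```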
Test send vs. Live send
For the purposes of this item, we’re defining a live send as a simulated send with specific parameters and a test send as one that uses default content. While both may conveniently be sent to your email team’s inbox from the platform for testing, they are very different.
In some cases, test sends may not include the link tracking of a live send. Most importantly, a test send usually is not sent through the system the way your final email will deploy, whereas a live send exercises dynamic content within segmentation parameters as part of a real send. You will want your QA experience to resemble the final audience experience as closely as possible.
Quality Assurance vs. Customer Feedback
A great way for marketers to gauge how their content is being consumed is simply to ask for feedback. Whether it comes from a survey email blast or a feedback form embedded in the original email, customer feedback is not quality assurance…and vice versa. On a technical level, nothing replaces meticulous checks like combing through every possible customer experience in your scope; still, every once in a while a piece of feedback will alert your team to a technical check that slipped past.
It’s important to think of the Quality Workbook as a living document. Whether the prompt is a tip (i.e., likely a complaint) from a user or your own team’s creativity, make time to update the document regularly so it becomes more comprehensive, more current, and even more efficient.
————————————————————-
How can you find the right kind of folks who will nerd out about Quality Assurance for your team? Just give us a shout.