
Email Testing Playbook

Sarah Chen
Email Strategy Lead at MiN8T

Every email team has a horror story. The broken image that went to 200,000 subscribers. The personalization merge tag that rendered as {{first_name}} instead of "Sarah." The CTA button that was invisible in Outlook dark mode. The unsubscribe link that pointed to a 404. These are not edge cases -- they are the natural consequence of shipping emails without a rigorous testing process.

This playbook provides a complete, repeatable testing methodology. It covers everything from the 60-second pre-send sanity check to the multi-week A/B testing strategy that compounds small improvements into significant gains.

24% -- of emails have rendering issues
3.5x -- ROI from testing programs
90+ -- client/device combos to test
$1.2M -- avg. annual cost of email errors

1 Why Testing Matters

The cost of an email error scales with your list size. A broken layout that goes to 500 subscribers is embarrassing. The same error going to 500,000 subscribers is a business-level incident -- it erodes brand trust, triggers unsubscribes, generates complaint spikes that damage sender reputation, and in some cases, creates regulatory exposure (broken unsubscribe links violate CAN-SPAM).

Research from Return Path and Validity consistently shows that companies with formalized email testing processes achieve 20-30% higher inbox placement rates and 15-25% higher engagement rates compared to those that rely on ad-hoc spot checks. The reason is simple: testing catches problems before they reach subscribers, and subscribers who consistently receive well-rendered, functional emails are more likely to engage.

Testing is not a quality tax -- it is a multiplier. Every hour invested in testing protects the thousands of hours invested in list building, content creation, and strategy development.


2 The Pre-Send Checklist

Before any email goes to rendering tests, it must pass a structural pre-send checklist. This catches the obvious errors that waste testing time and prevents the most common sending mistakes.

Content checklist

Technical checklist


Gmail clipping: Gmail clips emails larger than 102KB of HTML (not including images). When clipped, a "View entire message" link appears, and everything below the clip point is hidden -- including your footer, unsubscribe link, and CTA. Always check HTML file size before sending.
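That size check is easy to automate. A minimal sketch in Python (generic code, not a MiN8T feature; the 102KB threshold is the figure cited above):

```python
GMAIL_CLIP_LIMIT = 102 * 1024  # 102KB of HTML; linked images do not count


def check_clipping_risk(html: str, limit: int = GMAIL_CLIP_LIMIT) -> bool:
    """Return True if the HTML body is small enough to avoid Gmail clipping."""
    size = len(html.encode("utf-8"))
    print(f"HTML size: {size / 1024:.1f}KB (limit {limit / 1024:.0f}KB)")
    return size <= limit


# A short body passes; an oversized one would be clipped.
check_clipping_risk("<html><body>Hello</body></html>")
```

Wiring a check like this into the build step means an oversized template fails fast instead of clipping in production.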


3 Rendering & Client Testing

Rendering tests verify that your email looks correct across the email clients your audience actually uses. This is the most time-consuming part of testing, but also the most impactful -- a beautifully designed email that renders as a broken mess in 30% of opens is worse than a plain-text email that works everywhere.

Priority client matrix

Test every client your audience uses, but prioritize based on actual open data. A typical B2C priority matrix:

| Priority | Client | Typical Share | Key Concerns |
| --- | --- | --- | --- |
| P0 (always test) | Apple Mail / iOS Mail | 40-55% | Dark mode, font rendering |
| P0 | Gmail (Android + web) | 25-35% | Style stripping, clipping, no media queries |
| P1 (test weekly) | Outlook (Windows) | 5-15% | Word rendering engine, no CSS3 |
| P1 | Yahoo / AOL | 5-10% | Proprietary quirks, CSS stripping |
| P2 (test monthly) | Samsung Mail | 3-8% | Dark mode, Android variations |
| P2 | Outlook (macOS / mobile) | 3-5% | Different engine than Windows Outlook |

Dark mode testing

Dark mode is no longer optional. Test your email in both light and dark mode for every P0 and P1 client. The most common dark mode failures:

MiN8T feature: MiN8T's preview panel includes a one-click dark mode toggle for every client preview. You can view your email in Apple Mail dark mode, Gmail dark mode, and Outlook dark mode side by side without leaving the editor.

Accessibility testing

Accessibility testing should be part of every QA cycle, not an afterthought. The key checks:


4 Spam Score & Deliverability Tests

Before your email reaches a subscriber, it has to pass the spam filter. Spam scoring tests analyze your email against the same criteria that mailbox providers use, flagging potential issues before you send.

What spam filters evaluate

Pre-send spam test workflow

  1. Run a spam score check -- send your email through a spam scoring service (SpamAssassin, Mail-Tester, or MiN8T's built-in checker). Aim for a score of 0 or as close to 0 as possible. A SpamAssassin score above 5.0 is almost certainly hitting spam.
  2. Check all URLs against blacklists -- every link in your email should be clean. A single blacklisted URL can tank the entire message.
  3. Verify authentication -- send a test email and inspect the headers. Confirm SPF pass, DKIM pass, and DMARC pass for your sending domain.
  4. Check HTML size -- ensure the HTML body is under 100KB to avoid Gmail clipping, which can hide your unsubscribe link and trigger compliance concerns.
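Step 3 above can be partly automated by parsing the Authentication-Results header of a test message. A minimal sketch (the header string is illustrative; real headers carry more detail):

```python
import re


def auth_results(raw_headers: str) -> dict:
    """Extract SPF/DKIM/DMARC verdicts from an Authentication-Results header."""
    verdicts = {}
    for mech in ("spf", "dkim", "dmarc"):
        m = re.search(rf"\b{mech}=(\w+)", raw_headers, re.IGNORECASE)
        verdicts[mech] = m.group(1).lower() if m else "missing"
    return verdicts


headers = "Authentication-Results: mx.example.com; spf=pass; dkim=pass; dmarc=pass"
print(auth_results(headers))  # {'spf': 'pass', 'dkim': 'pass', 'dmarc': 'pass'}
```

Any verdict other than "pass" on the sending domain is worth investigating before the real send.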

MiN8T feature: MiN8T runs an automated spam score check every time you preview an email. The score is displayed in the editor toolbar with a color indicator (green/yellow/red) and specific recommendations for improving it. No manual steps required.


5 A/B Testing Strategy

A/B testing (also called split testing) is the practice of sending two or more variants of an email to a small portion of your audience, measuring which performs better, and sending the winner to the remainder. It is the single most effective method for continuously improving email performance over time.

What to test (in priority order)

  1. Subject line -- the highest-impact element. A better subject line can improve open rates by 10-30%. Test length, personalization, urgency, curiosity, and emoji usage.
  2. Send time -- the second highest impact. Test morning vs. afternoon, weekday vs. weekend, and time-zone-optimized sending.
  3. From name -- "Sarah at MiN8T" vs. "MiN8T" vs. "MiN8T Team." The from name is the first thing recipients see and heavily influences open decisions.
  4. CTA copy and placement -- "Get started free" vs. "Start your trial" vs. "See it in action." Test button color, size, and position (above-fold vs. below content).
  5. Content layout -- single-column vs. multi-column, image-heavy vs. text-focused, long-form vs. short-form.
  6. Personalization depth -- first name in subject vs. no personalization. Product recommendations vs. generic content.

Statistical rigor

The most common A/B testing mistake is declaring a winner too early. To get a statistically significant result (95% confidence), you typically need:
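One standard way to estimate the required audience is a two-proportion power calculation. A generic sketch (textbook statistics, not a MiN8T feature; 95% confidence and 80% power assumed):

```python
from math import ceil
from statistics import NormalDist


def sample_size_per_variant(p_base: float, p_variant: float,
                            alpha: float = 0.05, power: float = 0.80) -> int:
    """Minimum recipients per variant for a two-proportion z-test."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for 95% confidence
    z_b = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    variance = p_base * (1 - p_base) + p_variant * (1 - p_variant)
    delta = p_variant - p_base
    return ceil((z_a + z_b) ** 2 * variance / delta ** 2)


# Detecting a lift from a 20% to a 22% open rate needs thousands per variant:
print(sample_size_per_variant(0.20, 0.22))
```

The takeaway: small lifts on small test segments are usually noise, which is exactly why declaring a winner early is so dangerous.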

Compound testing: A 5% improvement from a better subject line, combined with a 3% improvement from optimized send time and a 4% improvement from better CTA placement, compounds to a 12.5% total improvement. Small, tested gains accumulate into transformative results.
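The compounding arithmetic is quick to verify (the lift values are the illustrative figures from above):

```python
from math import prod

lifts = [0.05, 0.03, 0.04]  # subject line, send time, CTA placement
total = prod(1 + lift for lift in lifts) - 1
print(f"{total:.1%}")  # → 12.5%
```

Multiplying the factors rather than adding the percentages is what makes the total slightly higher than 12%.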


6 Team QA Workflows

Individual testing is necessary but insufficient. For teams producing multiple campaigns per week, you need a formalized QA workflow that ensures every email meets quality standards regardless of who built it.

The three-stage review process

  1. Self-review (builder) -- the person who created the email runs through the pre-send checklist and verifies rendering in their top 3 clients. This catches 80% of issues.
  2. Peer review (teammate) -- a second team member reviews the email with fresh eyes. They check copy, links, and brand compliance. Fresh eyes catch things the builder has become blind to through repetition.
  3. Technical review (QA lead) -- for high-volume or high-stakes campaigns, a designated QA lead runs the full rendering test suite, spam score check, and accessibility audit. This is the final gate before sending.

Automated vs. manual testing

| Test Type | Automate | Keep Manual |
| --- | --- | --- |
| Link validation | Yes -- crawl all links, check HTTP status codes | -- |
| Spam score | Yes -- run automatically on every preview | -- |
| HTML size check | Yes -- flag if over 100KB | -- |
| Image loading | Yes -- verify all image URLs return 200 | -- |
| Rendering quality | -- | Yes -- human eyes needed for visual judgment |
| Copy review | -- | Yes -- tone, voice, and clarity require human judgment |
| Accessibility | Partially -- automated contrast checks, but screen reader testing needs a human | Yes |
| Dark mode | -- | Yes -- automated screenshots help but human review is essential |

MiN8T feature: MiN8T automates the entire "automate" column in the table above. Link validation, spam scoring, HTML size checking, and image verification all run automatically every time you save or preview. The manual items are surfaced in MiN8T's QA checklist panel, which tracks completion status and prevents sending until all items are checked off.
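The link-validation row from the table above can be sketched with the standard library. A simplified version (a production tool would also handle redirects, retries, and rate limits):

```python
import re
import urllib.request


def validate_links(html: str, timeout: float = 5.0) -> dict:
    """Check every http(s) href in the HTML; return {url: status_or_error}."""
    results = {}
    for url in re.findall(r'href="(https?://[^"]+)"', html):
        req = urllib.request.Request(url, method="HEAD",
                                     headers={"User-Agent": "qa-link-check"})
        try:
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                results[url] = resp.status
        except Exception as exc:  # HTTPError for 4xx/5xx, URLError for DNS failures
            results[url] = str(exc)
    return results
```

Anything other than a 200 for a CTA or unsubscribe link should block the send.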

A disciplined testing process does not slow down email production -- it prevents the kind of errors that force teams to spend hours on damage control, apology emails, and reputation repair. Build the process once, follow it every time, and your error rate will approach zero while your engagement rates climb.

Test every email before it ships

Automated link checks, spam scoring, and 90+ client previews built into your editor.

Start building free