What is an inbox placement test and how does it work?
An inbox placement test evaluates the DNS authentication signals associated with a sending domain -- MX records, SPF, DMARC, and DKIM -- to predict whether email sent from that domain is likely to land in the inbox or the spam folder. This tool performs live DNS lookups via Cloudflare DoH and Google DoH, scores each signal based on its configuration quality, and produces an overall placement prediction. A score above 80% indicates well-configured authentication that is unlikely to trigger spam filters on technical grounds.
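The live lookups described above can be sketched as follows. This is a hypothetical helper, not the tool's actual code: both Cloudflare (cloudflare-dns.com/dns-query) and Google (dns.google/resolve) expose JSON DNS-over-HTTPS APIs that take name and type query parameters. Only the request URL is built here; Cloudflare's JSON endpoint additionally expects an Accept: application/dns-json header when the request is sent.

```python
from urllib.parse import urlencode

# Public DoH JSON endpoints (Cloudflare also requires the
# "Accept: application/dns-json" header on the actual request).
DOH_ENDPOINTS = {
    "cloudflare": "https://cloudflare-dns.com/dns-query",
    "google": "https://dns.google/resolve",
}

def build_doh_url(provider: str, name: str, rtype: str = "TXT") -> str:
    """Return the DNS-over-HTTPS JSON query URL for a record lookup."""
    base = DOH_ENDPOINTS[provider]
    return f"{base}?{urlencode({'name': name, 'type': rtype})}"

print(build_doh_url("cloudflare", "example.com", "MX"))
# https://cloudflare-dns.com/dns-query?name=example.com&type=MX
```

Querying two independent resolvers and comparing answers is a simple way to guard against a stale cache on either side.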
What signals have the biggest impact on email inbox placement?
The three authentication records -- SPF, DKIM, and DMARC -- have the largest technical impact on inbox placement. SPF specifies which mail servers are authorised to send on behalf of your domain. DKIM adds a cryptographic signature to every outgoing message that receiving servers can verify. DMARC ties them together with a policy that instructs receivers what to do when messages fail authentication. Missing any of these signals, or having them misconfigured, is the leading cause of legitimate email being routed to spam folders by Gmail, Outlook, and Yahoo.
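For a concrete picture, here is what the three records can look like as DNS TXT entries for an illustrative domain (example.com, the google selector, and the truncated public key are placeholder values, not a recommendation):

```
example.com.                    TXT  "v=spf1 include:_spf.google.com ~all"
google._domainkey.example.com.  TXT  "v=DKIM1; k=rsa; p=MIIBIjANBg..."
_dmarc.example.com.             TXT  "v=DMARC1; p=quarantine; rua=mailto:dmarc@example.com"
```

SPF lives at the domain root, DKIM under a selector-specific hostname, and DMARC under the fixed _dmarc subdomain.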
What DKIM selector should I enter for my domain?
The DKIM selector is a short identifier that is part of the DKIM DNS record hostname: selector._domainkey.yourdomain.com. Common selectors include 'default', 'google' (Google Workspace), 'mail', 'k1' (Klaviyo), 's1' and 's2' (SendGrid), 'mandrill' (Mailchimp/Mandrill), and 'amazonses' (Amazon SES). Check your email service provider's documentation for the correct selector. If you have access to your email headers, look for the DKIM-Signature header and find the s= tag -- that value is your selector.
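Extracting the selector from a header and building the lookup hostname is mechanical. A minimal sketch (the function name and sample header are illustrative):

```python
import re

def dkim_selector_hostname(header: str, domain: str) -> str:
    """Pull the selector (s= tag) from a raw DKIM-Signature header
    and return the hostname where the public key is published."""
    match = re.search(r"(?:^|;)\s*s=([^;\s]+)", header)
    if not match:
        raise ValueError("no s= tag found in DKIM-Signature header")
    return f"{match.group(1)}._domainkey.{domain}"

header = "v=1; a=rsa-sha256; d=example.com; s=google; h=from:to; b=..."
print(dkim_selector_hostname(header, "example.com"))
# google._domainkey.example.com
```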
What is the difference between SPF ~all and -all?
SPF records end with an 'all' mechanism that defines what to do with email from senders not listed in the SPF record. '-all' means hard fail -- reject all unauthorised senders. '~all' means soft fail -- mark unauthorised senders as suspicious but still deliver. '-all' is stricter and signals stronger enforcement to receiving servers, but '~all' is recommended for most senders because it avoids accidentally blocking legitimate email from forwarded messages. '+all' and '?all' are effectively open policies that provide no protection and should be avoided.
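The four qualifiers can be summarised in a small lookup helper (a sketch with illustrative descriptions; note that per RFC 7208 a bare 'all' with no qualifier is treated as '+all'):

```python
QUALIFIERS = {
    "-all": "hard fail: unauthorised senders are rejected",
    "~all": "soft fail: unauthorised senders are accepted but flagged",
    "?all": "neutral: no assertion, effectively no protection",
    "+all": "pass: anyone may send, record is useless",
}

def spf_all_policy(record: str) -> str:
    """Describe the trailing 'all' mechanism of an SPF record."""
    for term in record.split():
        if term == "all":
            term = "+all"  # bare 'all' implies '+' per RFC 7208
        if term in QUALIFIERS:
            return QUALIFIERS[term]
    return "no all mechanism: defaults to neutral"

print(spf_all_policy("v=spf1 include:_spf.google.com ~all"))
# soft fail: unauthorised senders are accepted but flagged
```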
What does DMARC p=none vs p=quarantine vs p=reject mean?
DMARC's p= tag defines the policy applied to messages that fail both SPF and DKIM alignment. p=none means monitor only -- failed messages are delivered normally and you receive reports about failures. This is used when first deploying DMARC. p=quarantine means failed messages should be sent to the spam/junk folder. p=reject means failed messages should be rejected entirely and never delivered. For maximum inbox placement credibility, p=reject signals to receiving mail servers that you have full control over your domain's sending infrastructure.
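A DMARC record is a semicolon-separated tag list, so reading the policy out of it is a few lines of parsing. A sketch (function name and descriptions are illustrative):

```python
POLICIES = {
    "none": "monitor only: deliver normally, send failure reports",
    "quarantine": "route failing messages to the spam/junk folder",
    "reject": "refuse failing messages outright",
}

def dmarc_policy(record: str) -> str:
    """Read the p= tag from a DMARC TXT record and describe what
    receivers do with messages failing both SPF and DKIM alignment."""
    tags = dict(
        part.strip().split("=", 1)
        for part in record.split(";")
        if "=" in part
    )
    return POLICIES.get(tags.get("p", ""), "invalid or missing p= tag")

print(dmarc_policy("v=DMARC1; p=quarantine; rua=mailto:reports@example.com"))
# route failing messages to the spam/junk folder
```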
Why does my email still go to spam even when I pass all authentication checks?
Authentication is necessary but not sufficient for inbox placement. Even with perfect SPF, DKIM, and DMARC configuration, email can still land in spam because of sending-IP reputation (new IP addresses, or IPs with a complaint history); domain reputation (new domains need a warm-up period of gradually increasing volume); content signals (spam trigger words, all-caps subject lines, excessive punctuation, image-heavy messages with minimal text); engagement history (low open rates signal to Gmail and others that recipients don't want your mail); and recipient-specific filters (individual users blocking your domain).
How do I fix a failing SPF record to improve inbox placement?
An SPF record is a TXT DNS record at your domain root. A basic SPF record looks like: v=spf1 include:_spf.google.com ~all. Replace the include: value with the SPF include provided by your email service (SendGrid, Mailchimp, etc.). If you send from multiple providers, combine them in one record: v=spf1 include:sendgrid.net include:_spf.google.com ~all. Keep the total number of DNS lookups (include:, a:, mx:) at 10 or fewer -- exceeding this limit causes SPF evaluation to return permerror, which receivers treat as a failure. End with ~all or -all, never +all.
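The 10-lookup limit can be checked mechanically. A sketch that counts the lookup-consuming terms defined by RFC 7208 (include, a, mx, ptr, exists, redirect); ip4:/ip6: terms cost nothing:

```python
LOOKUP_TERMS = {"include", "a", "mx", "ptr", "exists", "redirect"}

def count_spf_lookups(record: str) -> int:
    """Count the DNS-lookup-consuming terms in an SPF record.
    More than 10 of these produces a permerror at evaluation time."""
    count = 0
    for term in record.split()[1:]:  # skip the v=spf1 version tag
        term = term.lstrip("+-~?")   # drop any qualifier prefix
        # strip the value/modifier part to isolate the mechanism name
        mech = term.split(":", 1)[0].split("=", 1)[0].split("/", 1)[0]
        if mech in LOOKUP_TERMS:
            count += 1
    return count

record = "v=spf1 include:sendgrid.net include:_spf.google.com mx ~all"
print(count_spf_lookups(record))  # 3
```

Note that includes are counted recursively by real evaluators -- every lookup inside an included record also counts toward the same limit -- so this flat count is a lower bound.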
Can this tool test actual email delivery to Gmail or Outlook?
No -- this tool tests DNS-level signals only. Actual inbox delivery testing requires sending a real email from your infrastructure to seed email addresses at Gmail, Outlook, Yahoo, etc., and then checking which folder it landed in. Tools like Mail-Tester, GlockApps, and Litmus Spam Testing take this seed-list approach. This tool is complementary: it checks the technical authentication configuration that accounts for a large share of placement decisions, runs faster, and requires no email sending.
What is email warm-up and why does it affect inbox placement?
Email warm-up is the practice of gradually increasing sending volume from a new domain or IP address over several weeks. Mail providers like Gmail track sending patterns and treat sudden high-volume sending from a new sender as suspicious -- a common pattern for spammers. Starting with small volumes (50-100 emails/day), consistently sending to engaged recipients who open and click, and slowly increasing volume over 4-8 weeks builds a positive reputation history with mail providers. Using a new domain for cold email outreach without warming it up is one of the most common causes of immediate spam placement.
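As a rough illustration, a warm-up ramp can be modelled as steady multiplicative growth. The numbers below are illustrative, not a provider-mandated schedule -- starting around 50/day and growing roughly 50% per week lands in the commonly recommended 4-8 week range:

```python
def warmup_schedule(start: int = 50, growth: float = 1.5, weeks: int = 8) -> list[int]:
    """Daily sending volume per week for a gradual warm-up ramp."""
    return [round(start * growth ** week) for week in range(weeks)]

print(warmup_schedule())
# [50, 75, 112, 169, 253, 380, 570, 854]
```

Whatever the exact curve, the point is consistency: mail providers reward a predictable, slowly rising volume sent to recipients who engage.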
How often should I run an inbox placement test for my domain?
Run an inbox placement test whenever you set up a new sending domain, add a new email service provider (which requires new SPF includes and DKIM keys), notice a sudden drop in open rates (which can indicate spam placement), change your DNS configuration, or after any domain migration. For active email senders, running a monthly check is a good practice to catch DNS configuration drift -- SPF records sometimes lose includes when DNS is updated, and DKIM keys can expire or be deleted. DMARC aggregate reports (rua=) provide ongoing monitoring between manual checks.