SEO Troubleshooting Interview Questions

General Troubleshooting Methodology

1. You notice a 50% drop in organic traffic overnight. What is your very first step?
I check Google Search Console's manual actions and security issues reports, then verify that the tracking code is still firing on all pages (real-time analytics and tag manager). In parallel, I confirm the site is accessible and not accidentally blocked by robots.txt or a noindex tag.

2. Walk me through your general framework for diagnosing any SEO issue.
I use a differential diagnosis approach: (1) Verify the problem exists (data vs. noise). (2) Isolate scope – specific pages, templates, devices, or entire site? (3) Check internal changes: code deployments, content updates, server changes. (4) Check external factors: algorithm updates, competitor moves, SERP feature changes. (5) Form hypotheses, test them in priority order, and document findings.

3. What tools do you use first when something goes wrong?
Google Search Console (Index Coverage, Performance, URL Inspection), a crawler (Screaming Frog or Sitebulb), server logs if available, Google Analytics real-time, and an algorithm tracker (Semrush Sensor, Algoroo).

4. How do you rule out analytics tracking issues vs. genuine traffic loss?
I compare server-side access logs with analytics data and check whether other metrics (GSC impressions, rankings) also declined. If impressions are stable but sessions dropped, it's likely a tracking issue; a quick cross-check is sketched below. I also verify whether the dip appears only in certain browsers or segments.
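Where both a GSC Performance export and an analytics export are available, a small script can surface the divergence quickly. A minimal sketch, assuming hypothetical CSV filenames and columns (date/clicks and date/sessions):

```python
# Flag days where GA sessions collapse relative to GSC clicks.
# Assumes two hypothetical exports:
#   gsc_clicks.csv  -> columns: date, clicks    (GSC Performance export)
#   ga_sessions.csv -> columns: date, sessions  (GA organic-traffic export)
import pandas as pd

gsc = pd.read_csv("gsc_clicks.csv", parse_dates=["date"])
ga = pd.read_csv("ga_sessions.csv", parse_dates=["date"])
merged = gsc.merge(ga, on="date", how="inner").sort_values("date")

# Sessions-per-click should be roughly stable; a sudden collapse
# (clicks steady, sessions gone) points at tracking, not rankings.
merged["sessions_per_click"] = merged["sessions"] / merged["clicks"]
baseline = merged["sessions_per_click"].rolling(28, min_periods=7).median()
suspect = merged[merged["sessions_per_click"] < 0.5 * baseline]

print(suspect[["date", "clicks", "sessions"]])
```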

5. What sources do you check to confirm a Google algorithm update?
Google Search Central on X (Twitter), Search Engine Roundtable, Semrush Sensor, MozCast, Google's Search Status Dashboard and ranking updates list, and community forums like r/TechSEO or WebmasterWorld. I cross-reference the timing with my traffic dip.

6. How do you decide whether to escalate an issue immediately or investigate further first?
If the issue is catastrophic (site offline, de-indexed, manual action) I escalate immediately while gathering data. For moderate traffic declines, I investigate first, prepare a summary with impact and likely causes, then escalate with context.

7. How do you isolate whether a ranking drop is page-specific, template-wide, or sitewide?
I segment queries/pages by directory, URL pattern, or page type in GSC and rank trackers. If only /blog/ pages dropped and /product/ pages are stable, I suspect a blog template issue or content quality problem specific to that section.

8. A stakeholder says “our SEO is broken” but provides no data. How do you start?
I ask clarifying questions: “What specifically are you observing? Rankings, traffic, conversions? Since when?” Then I pull objective data from GSC and analytics to either confirm or refine their observation before any diagnosis.

9. What’s a “red herring” in SEO troubleshooting? Give an example.
A correlation mistaken for causation. Example: a site migrates to a new design and traffic drops. The immediate blame goes to URL structure changes, but the real cause is that the new design accidentally noindexed all pages. I always verify root causes, not surface correlations.

10. How do you document a troubleshooting process for future team use?
I use a standard incident report template: symptoms, timeline, investigation steps, hypotheses tested, root cause, solution, and preventative measures. This goes into a shared knowledge base and is tagged for easy retrieval.

Traffic Drop Diagnosis

11. GSC shows impressions down but clicks up. What might explain this?
We may have lost rankings on high-volume, low-CTR keywords, while our higher-CTR keywords improved. Or the site began ranking for more branded terms (high CTR), even as non-brand impression volume dropped.

12. Both impressions and clicks dropped, but average position is stable. Why?
Search demand for those keywords may have declined seasonally or due to a trend shift. Alternatively, we lost rich snippet features (like star ratings) that boosted previous click volume, even though the blue link position didn’t change.

13. Rankings are unchanged but organic conversions halved. What would you investigate?
On-page conversion rate issues: broken checkout/forms, slow page speed hurting UX, changed call-to-action, or new friction (pop-ups). I also check if the traffic mix shifted to less-commercial query intent, or if my tracking for conversions broke.

14. A specific high-traffic page lost 80% of its clicks but the URL is still indexed. What would you check?
The page’s canonical tag might have changed, passing authority away. It could have lost critical backlinks. The content may have been thinned in a refresh. Or a competitor published a much stronger page, bumping us down just enough to drop CTR dramatically.

15. Traffic suddenly dropped only on mobile. What are your hypotheses?
A mobile-specific layout change (content hidden in accordions, interstitials), a mobile speed regression, mobile-first indexing issues (canonical/alternate tags between desktop and mobile URLs), or a mobile-targeted algorithm update focused on page experience.

16. How do you check if a drop is seasonal vs. an actual problem?
Compare year-over-year data (same period last year), use Google Trends for query topic seasonality, and check if competitors experienced the same pattern in tools like Semrush Traffic Analytics.

17. You see a drop in “direct” channel traffic alongside organic traffic. What might this indicate?
Organic traffic might have been miscategorized as direct due to missing UTM parameters or improper referrer passing, especially if the site migrated to HTTPS poorly. Or a genuine brand-awareness decline affected both.

18. What’s the difference between a traffic drop from a keyword ranking loss vs. a CTR drop?
Ranking loss: impression share falls, average position moves down. CTR drop: impressions and position might be stable, but our snippet is less appealing (meta/title changed, competitors got rich snippets, SERP layout changed). I check GSC’s CTR change by query.

19. A blog section suddenly gets 0 traffic, but the rest of the site is fine. What’s your first check?
Disallow rule in robots.txt or a noindex meta tag accidentally applied to the blog template. A quick URL Inspection in GSC and crawler audit of the /blog/ subdirectory will confirm.

20. How long do you wait before declaring an organic traffic decline “significant” and initiating deep investigation?
I set anomaly detection thresholds ahead of time (e.g., ±15% weekly traffic). If an alert fires, I begin initial triage the same day; a minimal alert sketch follows. Even small sustained drops merit investigation before they snowball.
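A minimal week-over-week alert along those lines, assuming a hypothetical weekly_organic.csv export with week_start and sessions columns:

```python
# Fire an alert when organic sessions move more than ±15% week-over-week.
import pandas as pd

THRESHOLD = 0.15  # the ±15% trigger described above

df = (pd.read_csv("weekly_organic.csv", parse_dates=["week_start"])
        .sort_values("week_start"))
df["pct_change"] = df["sessions"].pct_change()

for _, row in df[df["pct_change"].abs() > THRESHOLD].iterrows():
    print(f"{row.week_start.date()}: {row.pct_change:+.1%} week-over-week, begin triage")
```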

Indexing & Crawling Issues

21. A large chunk of important pages suddenly falls out of the index. What’s your first command?
Check GSC's Pages report (formerly Index Coverage) for the specific exclusion reason (Crawled – currently not indexed, Excluded by 'noindex' tag, Not found (404), etc.). Then I immediately run live inspections on a few affected URLs.

22. Google is crawling far fewer pages per day than usual. What could cause a crawl budget reduction?
Server slowdown (5xx errors), an increase in low-value URLs wasting budget (infinite URL spaces), new robots.txt blocks on important paths, site downtime, or an aggressive crawl-rate limit (GSC's legacy crawl rate setting has been retired, but server-side rate limiting has the same effect).

23. How do you debug a “Crawled – currently not indexed” issue on a large scale?
I assess overall site quality signals (thin/duplicate content), internal link importance (are deep pages isolated?), and check if a sitewide quality issue from a recent update is suppressing our “index-worthiness.” I focus on improving E-E-A-T and pruning low-quality pages.

24. Google is indexing parameter versions of a URL (?sort=price, ?color=red). How do you fix it?
Ensure the canonical tag points to the clean, parameter-free URL and standardize internal linking to that clean version (GSC's URL Parameters tool has been retired, so canonicals and linking do the work). For severe crawl waste I may add a robots.txt disallow on parameter patterns, with the caveat that blocked URLs can no longer pass their canonical signal.

25. A page returns a 200 status but Googlebot sees it as a soft 404. Why?
The page content signals “no results” or is essentially empty/thin. A product page with “Out of Stock” and no related links, or a blog category with no posts, can be treated as a soft 404. I add meaningful content or noindex such pages.

26. How do you determine if JavaScript-linked content is being rendered by Google?
Use GSC’s URL Inspection Tool and view the rendered HTML screenshot. Compare it to the live page. Also test with Mobile-Friendly Test or Rich Results Test to see rendered output. I check if critical content appears in the rendered DOM.
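One way to spot-check this outside GSC is to diff the raw HTML against a headless-browser render. A minimal sketch, assuming Playwright is installed and using a placeholder URL and marker phrase:

```python
# Compare the raw HTML with the JS-rendered DOM for a marker phrase.
# Requires: pip install requests playwright && playwright install chromium
import requests
from playwright.sync_api import sync_playwright

URL = "https://www.example.com/some-page"   # placeholder
MARKER = "critical product description"     # text Google must be able to see

raw_html = requests.get(URL, timeout=30).text

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto(URL, wait_until="networkidle")
    rendered_html = page.content()
    browser.close()

print("in raw HTML:     ", MARKER in raw_html)
print("in rendered DOM: ", MARKER in rendered_html)
# If the marker appears only after rendering, verify with GSC's URL Inspection
# that Google's renderer sees it too.
```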

27. After a CDN migration, Google can’t crawl the site. What do you check?
The CDN may be blocking Googlebot's IP ranges or user-agent. I verify DNS propagation, check firewall rules, and ensure there's no rate-limiting targeting crawlers. A live test in GSC's URL Inspection tool (the successor to 'Fetch as Google') will reveal the exact error.

28. Sitemap says “couldn’t fetch” in GSC. What’s wrong?
The sitemap URL returns a non-200 status, is blocked by robots.txt, or is returning invalid XML (encoding issues, whitespace). I validate the sitemap URL in a browser and XML validator.

29. New pages are taking weeks to get indexed despite internal links and sitemap inclusion. How do you accelerate?
I submit the URL directly in the Inspection Tool, ensure the pages are linked from high-crawl-frequency pages (like homepage or news section), check that the content is unique and substantial, and consider adding them to an HTML sitemap. I also audit crawl budget waste on the rest of the site.

30. A staging site accidentally got indexed. How do you remove it?
I immediately password-protect the staging server (an auth wall returning 401/403 also leads to deindexing), serve a noindex meta tag or X-Robots-Tag while the pages remain crawlable, and use the GSC Removals tool for a temporary takedown. I hold off on a robots.txt disallow until the URLs have dropped out, since blocking crawling hides the noindex from Googlebot. I then add a standard noindex/auth step to the deployment checklist for every staging environment.

Technical Glitches & Infrastructure

31. Site speed suddenly worsened, and rankings start slipping. What’s your troubleshooting flow?
I run PageSpeed Insights and WebPageTest for the affected pages. I check recent code deploys (large images, unoptimized JS), server response time (TTFB), CDN health, and third-party script bloat. I stage a rollback test if feasible.

32. How do you diagnose a high TTFB (Time to First Byte) issue?
I test a small static resource (e.g., a .txt file): if it's fast while HTML pages are slow, the bottleneck is application-side rather than network-side. I check server load, database query performance, and geographic routing, and involve engineering with a HAR export showing the network timing breakdown.
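For a quick first measurement before pulling HAR files, requests can approximate TTFB (its elapsed timer stops once response headers arrive). A rough sketch with a placeholder URL:

```python
# Approximate TTFB over several runs; r.elapsed measures time from sending
# the request until response headers are parsed, which excludes body
# download when stream=True is set.
import statistics
import requests

URL = "https://www.example.com/"

timings = []
for _ in range(5):
    r = requests.get(URL, stream=True, timeout=30)
    timings.append(r.elapsed.total_seconds())
    r.close()

print(f"median ~TTFB: {statistics.median(timings):.3f}s over {len(timings)} runs")
# Compare against a static asset (e.g., /robots.txt): static fast + HTML slow
# usually means application or database time, not the network.
```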

33. After an HTTPS migration, the site now has mixed content warnings. How does this affect SEO and how do you fix it?
Mixed content can degrade user trust and page experience signals, potentially impacting rankings. I crawl the site with a tool like Screaming Frog and review its insecure content report to identify HTTP resources, update them to HTTPS, and enforce a Content Security Policy (upgrade-insecure-requests) to auto-upgrade anything left over.

34. The homepage rank tanked after adding an interstitial pop-up. How would you test this?
I’d check if it’s intrusive (Google’s interstitial penalty guidelines). I’d temporarily remove the pop-up for Googlebot and a subset of users, monitor rankings, and if they recover, propose a less intrusive format (banner) that follows Google’s best practices.

35. A site uses infinite scroll and new content isn’t being indexed. Why?
The scroll likely doesn’t update the URL or load content for crawlers. I suggest implementing a “Load More” button with unique paginated URLs, supporting History API (pushState) to change the URL as the user scrolls, and providing an HTML sitemap linking to all content.

36. How do you check if your XML sitemap contains only canonical URLs?
Crawl the sitemap URLs and cross-check with each page’s canonical tag. Any mismatch (sitemap points to a non-canonical) needs correction. I automate this with a Python script or Screaming Frog’s sitemap + canonical crawl mode.
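A minimal version of the Python automation mentioned above, using a placeholder sitemap URL (requires requests, beautifulsoup4, and lxml):

```python
# Fetch the XML sitemap, then compare every listed URL against the
# rel=canonical declared on its own page.
import requests
from bs4 import BeautifulSoup

SITEMAP = "https://www.example.com/sitemap.xml"

sitemap = BeautifulSoup(requests.get(SITEMAP, timeout=30).text, "xml")
urls = [loc.text.strip() for loc in sitemap.find_all("loc")]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    link = soup.find("link", rel="canonical")
    canonical = link["href"].strip() if link else None
    if canonical != url:
        print(f"MISMATCH: sitemap={url} canonical={canonical}")
```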

37. You get reports of duplicate content across thousands of URLs. How do you find the root cause?
I segment by URL pattern: www vs. non-www, HTTP vs. HTTPS, trailing slash vs. non-slash, parameterized URLs, session IDs, or printer-friendly versions. I then implement the appropriate 301/canonical and remove the source of variation (standardize internal linking, server config).

38. How do you troubleshoot a discrepancy between GSC’s indexed page count and your actual number of pages?
If GSC is higher: I’ve got index bloat (staging, parameters, old pages). I crawl the site to find all indexable URLs, compare against the sitemap, and noindex/remove unnecessary URLs. If GSC is lower: crawling blockades, poor internal linking, or low-quality signals.

39. After a platform migration (e.g., WordPress to headless), organic traffic is slowly decaying. What’s your approach?
I crawl the old site’s key pages and compare on-page elements (titles, meta, headings, content blocks, schema) with the new site. I check for redirect chains, canonical mismatches, and ensure internal link equity flow is preserved. Loss often comes from subtle structural and semantic changes.

40. Google is indexing your PDF files but you want them deindexed. How?
Add an X-Robots-Tag: noindex HTTP header for .pdf files (you can't place a meta tag inside a PDF). Robots.txt alone won't remove them from the index, and a disallow would hide the noindex header from Googlebot, so I leave the PDFs crawlable until they've dropped out. I use the GSC Removals tool for immediate, temporary action.
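After the server change ships, it's worth verifying the header is actually being served. A quick check with placeholder PDF URLs:

```python
# Confirm the X-Robots-Tag header is present on PDF responses.
import requests

pdf_urls = [
    "https://www.example.com/docs/report.pdf",
    "https://www.example.com/docs/brochure.pdf",
]

for url in pdf_urls:
    headers = requests.head(url, allow_redirects=True, timeout=30).headers
    print(f"{url} -> X-Robots-Tag: {headers.get('X-Robots-Tag', '(missing)')}")
# Expect "noindex" here; also confirm robots.txt is not blocking these paths,
# or Googlebot will never see the header.
```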

Content & On-Page Issues

41. A page that ranked #1 for a high-volume keyword falls to #11 overnight. How do you investigate?
Check if the page’s content was changed recently (accidental keyword removal, heading restructure). Analyze the new #1 competitor to see what they improved. Check if the page lost critical internal or external links. Ensure the page is not accidentally blocked or canonicalized elsewhere.

42. Keyword rankings are fluctuating wildly. What could cause this?
Testing phase of an algorithm update, a new competitor actively gaining/losing authority, site speed inconsistencies, or the page content being dynamically changed (e.g., personalized content, rotating offers) causing Google to re-evaluate relevance.

43. How do you diagnose keyword cannibalization?
I export GSC query data and filter for multiple URLs ranking for the same query. I sort by impressions to find significant overlaps. Then I check if those URLs serve distinct intents or if they’re inadvertently competing. I decide whether to merge, differentiate, or canonicalize.
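The GSC export step can be automated. A minimal sketch over a hypothetical query/page export (adjust the filename and column names to your export):

```python
# Surface queries where more than one URL earns impressions.
import pandas as pd

df = pd.read_csv("gsc_query_page_export.csv")  # columns: query, page, impressions

candidates = (
    df.groupby("query")
      .agg(urls=("page", "nunique"), impressions=("impressions", "sum"))
      .query("urls > 1")
      .sort_values("impressions", ascending=False)
)
print(candidates.head(20))  # top cannibalization candidates to review by hand
```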

44. A service page with a great backlink profile isn’t ranking for its target keyword. What’s wrong?
Possible over-optimization of anchor text from backlinks, the content not matching the true search intent of the keyword, a hidden technical issue (noindex, canonical pointing to a different page), or the page is buried too deep in the site architecture to be seen as important.

45. Rich snippets (like review stars) suddenly disappeared from SERPs. What do you check?
I check the structured data and rich results reports in GSC for new errors or warnings. I re-validate the page's schema markup with the Rich Results Test, ensure guidelines aren't violated (e.g., the review snippet rules against self-serving review markup), and check whether Google updated its display requirements for that snippet type.

46. The title tag you set isn’t appearing in SERPs; Google is rewriting it. Why?
Google often rewrites titles that are too long, keyword-stuffed, missing key context, or just don’t match the query well. I ensure the title is concise, descriptive, and matches on-page H1 and content. I test alternatives to see if Google accepts them.

47. How do you check if a page has been affected by the Helpful Content Update?
I compare the traffic drop against confirmed helpful content update rollout dates (the helpful content system has since been folded into Google's core updates). I evaluate the page for "unhelpful" signals: thin content, heavy advertising, lack of first-hand experience, high bounce-back-to-SERP rates. GSC offers no explicit HCU classification, so I benchmark against competitors that held or gained during the same window.

48. Brand search terms suddenly started ranking lower. What could cause this?
Negative PR, a competitor bidding heavily on our brand name pushing down organic CTR, a site migration where the new homepage diluted brand signals, or a technical issue making the homepage less crawlable, causing Google to favor a social media profile instead.

49. A category page with unique content is being treated as duplicate content by Google. How?
Often filter/sort variants are being indexed as separate URLs with near-identical content, or the category's own canonical points elsewhere. I ensure the category page's canonical tag is self-referencing, that filter URLs canonicalize to it, and that the unique content isn't drowned out by boilerplate.

50. How do you troubleshoot a high bounce rate on an organic landing page?
I check if the page fulfills the keyword intent immediately. If users land expecting a list and get a long-form guide, they bounce. I also check page speed, mobile usability, intrusive ads, and ensure clear next-step internal links exist.

Link Profile & Penalty Issues

51. Your backlink profile suddenly spiked with thousands of low-quality links. What do you do?
First, determine if it’s a negative SEO attack or spammy auto-generated linking. I document the links, then if they are truly toxic and large-scale, I prepare a disavow file as a precaution. I also try to contact the spammy site hosts where possible. I monitor for any manual action.

52. You receive a manual action notice for “unnatural links to your site.” How do you proceed?
I download the sample links GSC provides, do a full backlink audit to identify all suspicious links, reach out to site owners to remove them, document outreach efforts, and only then submit a disavow file with a thorough reconsideration request explaining our clean-up.

53. A manual penalty is revoked, but rankings don’t recover. Why?
Manual penalty removal just lifts the manual action. It doesn’t restore the lost algorithmic trust. It can take weeks or months for Google’s algorithms to re-evaluate the site without that penalty. I continue building quality signals and ensure no other algorithmic filters are in play.

54. How do you differentiate between an algorithmic demotion and a manual penalty?
Manual penalty: notification in GSC and often a sharp, complete drop on specific pages. Algorithmic: no notification, often partial or gradual, correlates with a known update. If no GSC message, I assume algorithmic and work on holistic quality improvements.

55. A key page lost a powerful backlink; how do you detect this?
Backlink monitoring tools like Ahrefs/Semrush send alerts for lost backlinks. I see a “lost” report and check if the specific link was removed or the page linking to us went offline. I then prioritize link reclamation or rebuilding a similar high-quality link.

56. How do you test if your disavow file is harming rankings instead of helping?
Disavow files rarely harm unless you’ve mistakenly disavowed good links. I audit the disavow file by cross-referencing with backlink data. If I suspect over-disavowal, I can remove the file (Google will re-crawl and re-integrate those signals). I proceed cautiously and monitor.

57. How do you find the exact toxic link causing a penalty when GSC only provides a few samples?
I perform a full link audit on all referring domains, classifying them by trust metrics, relevance, and anchor-text patterns. I look for spikes in exact-match commercial anchors, blog network footprints, and links from de-indexed or penalized domains.

58. Rankings dropped after disavowing a lot of links. What could be the cause?
Those links may have been providing equity that wasn’t toxic enough to merit disavowal. Or, I inadvertently disavowed valuable links misclassified by an automated tool. I reconstruct the disavow process to find errors in judgment.

59. How do you recover from a negative SEO attack where someone cloned your site on a spammy domain?
I file a DMCA takedown with the clone's hosting provider and submit Google's copyright removal request to deindex it. I ensure my canonical tags and internal signals are robust and monitor for any ranking impact, though Google can usually identify the original.

60. You see a spike in referring domains but they are all from irrelevant foreign-language sites. Should you do anything?
Check if they’re causing a traffic anomaly or a manual action. If not, they may be ignored by Google’s algorithms (likely spam). I may preemptively add the most obvious spam anchors to a disavow file, but I don’t panic-disavow everything. I monitor.

Local SEO Troubleshooting

61. Your Google Business Profile (GBP) listing suddenly disappeared from Local Pack. What’s your first step?
Check for suspension in the GBP dashboard. Verify NAP consistency across all citations, look for recent guideline violations in the listing (keyword stuffing in business name, address issues), and review the edit history. Then I file a reinstatement request if suspended.

62. A duplicate GBP listing is stealing your traffic. How do you remove it?
I use the “Suggest an edit” function on the duplicate listing, marking it as “Never existed” or “Duplicate of another place.” I also contact GBP support with evidence of the legitimate listing. I ensure my own listing is verified and fully claimed.

63. Your local rankings dropped but your organic rankings held. Where do you focus?
The Local Algorithm (proximity, prominence, relevance) behaves differently. I check for new competitors in the local area, sudden loss of reviews (or rating drop), incorrect category selection, missing/inconsistent citations, and whether my GBP posts are up-to-date.

64. How do you troubleshoot a “Your business is not eligible for verification” issue on GBP?
Ensure the business type is supported (not an online-only business, rental property rules), address format matches guidelines (no P.O. Box, correct physical location), and there’s no pre-existing verified listing for the same address that’s causing a conflict.

65. Your customer reviews are disappearing from GBP. What’s happening?
Google’s spam filter may be removing them (often reviews with unnatural language, bulk drop, or from accounts with low history). I inform clients not to solicit reviews from the same IP or incentivize them, and I contact GBP support for legitimate missing reviews.

Mobile & International Issues

66. After switching to a responsive design, mobile rankings dropped but desktop improved. Why?
Possible mobile-specific speed regression, UX issues like tap targets too close, intrusive mobile interstitials, or content hidden by default on mobile (accordions). I run a mobile-specific Lighthouse audit and check the Mobile Usability report in GSC.

67. The m.example.com version of your site is ranking instead of the responsive www. site. How do you fix this?
Implement the proper annotations: rel="canonical" on the m. pages pointing to the corresponding responsive URLs, and rel="alternate" with a media attribute in the other direction. If you're fully retiring the separate mobile site, 301 redirect the m. pages to the responsive URLs. I also consolidate hreflang if the site is international.

68. Hreflang tags are implemented but the wrong country version ranks in the UK. How do you debug?
I use an hreflang validator (like the one in Screaming Frog) to check return tags and errors. I ensure the canonical tags of alternate pages align with the hreflang URLs. I verify no conflicting geo-targeting settings in GSC or server signals (IP redirects) that override hreflang.
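Return-tag errors are the most common culprit and are easy to spot-check per URL pair. A minimal sketch with placeholder URLs (requires requests and beautifulsoup4):

```python
# Check that two locale pages declare each other via hreflang return tags.
import requests
from bs4 import BeautifulSoup

def hreflang_targets(url):
    """Return the set of hreflang hrefs declared in the page's head."""
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    return {
        tag.get("href")
        for tag in soup.find_all("link", rel="alternate")
        if tag.get("hreflang")
    }

us = "https://www.example.com/en-us/page"
uk = "https://www.example.com/en-gb/page"

print("US page declares UK:", uk in hreflang_targets(us))
print("UK page declares US back:", us in hreflang_targets(uk))
# A missing return tag invalidates the pair and can let the wrong locale rank.
```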

69. Quick-loading AMP pages were serving as the main URL, but now you’ve moved to non-AMP. Traffic dipped. What happened?
Google replaced the AMP placements with standard links, which can cost CTR; note that AMP is no longer required for Top Stories. If the new non-AMP pages have worse LCP, page experience signals can also suffer. I ensure performance parity with the old AMP pages and monitor Top Stories visibility.

70. A global site has separate subdirectories, but US version shows up in Australia. Why?
Missing or incorrect hreflang tags for en-AU, no local backlinks or signals for Australia, or the Australian version being too thin and similar to the US version, causing Google to treat the pages as duplicates and consolidate them. I localize content distinctly, build local links, and implement accurate hreflang.

Analytics & Data Discrepancies

71. Google Analytics shows zero organic sessions, but GSC shows clicks. How can this be?
The GA tracking code is broken or missing from the landing pages where users arrive. I check with Tag Assistant and browser network tab. Often occurs after a code push or migration where the global snippet wasn’t included on newer templates.
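A template-level gap can be confirmed quickly by scanning the raw HTML of a sample of landing pages for the tag, with a placeholder measurement ID and URLs (pages tagged via GTM will show the GTM container instead):

```python
# Scan landing pages for the GA4 snippet in the raw HTML.
import requests

MEASUREMENT_ID = "G-XXXXXXX"  # placeholder GA4 ID
pages = [
    "https://www.example.com/",
    "https://www.example.com/blog/some-post",
]

for url in pages:
    html = requests.get(url, timeout=30).text
    found = MEASUREMENT_ID in html or "googletagmanager.com" in html
    print(f"{url}: {'tag found' if found else 'TAG MISSING'}")
# Pages flagged TAG MISSING explain GSC clicks arriving with zero GA sessions.
```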

72. GSC shows clicks, GA shows sessions, but the numbers don’t match. Why?
Differences in measurement: GA sessions can be filtered (internal traffic, bot filtering) and require JavaScript execution, and multiple SERP clicks within one session window count as a single session. GSC counts every click from the SERP, even when the user bounces before GA loads. The numbers rarely match exactly but should correlate directionally.

73. Conversion rate for organic traffic suddenly dropped to zero. Likely cause?
A broken conversion tracking event (button change broke the GTM trigger, thank-you page URL changed, goal setup broken in GA4). It’s almost never an actual human behavior change to literally zero; it’s a tracking breakdown.

74. How do you verify if landing page URLs in GA suddenly show “(not set)” for organic traffic?
“(not set)” can indicate a redirect stripping referrer data, or that the landing page dimension isn’t passed correctly. I check if the session source/medium remains “google / organic” but pages are missing. I inspect recent changes to redirects or cross-domain tracking.

75. You see traffic spikes from a city where you don’t do business. Bad or good?
Probably bot traffic. I check for unusual language, high bounce rate, zero conversion, and a disproportionate number of sessions from a single ISP. I filter out bots in GA using known bot networks, and investigate server logs to block at the firewall level if necessary.

76. What’s a common reason GSC’s “average position” improved but clicks didn’t increase?
The improvement is on low-volume keywords or irrelevant long-tail queries. Or we gained position on many low-search-volume terms while our head terms fell slightly. I segment query data by impressions to see where movement actually matters.

77. How do you troubleshoot when a page is listed in analytics as the landing page but you never created it?
It could be a scraped site with your tracking ID, a ghost spam referral hitting your measurement protocol directly, or a parameterized version of your page that apps auto-generate. I investigate the hostname dimension to see if it’s my actual domain.

Site Migrations & Redesigns

78. After a site redesign, you discover the developers deleted your meta descriptions to “let Google decide.” Traffic dropped. How do you prove causality and fix it?
I show a before/after CTR comparison in GSC for top pages, correlating with the deployment date. I demonstrate how the auto-generated snippets decreased CTR. I request immediate rollback of meta descriptions to the pre-launch versions and track recovery.

79. A domain migration to a brand-new domain is causing volatile rankings. What’s the most fixable common mistake?
The redirect map might be incomplete or incorrect. I crawl the old site’s sitemap and test every URL for a proper 301 redirect pointing to the correct corresponding new URL, not just to the homepage. Site-wide redirects to the homepage destroy traffic.
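Testing the map is easy to script. A minimal sketch assuming a hypothetical two-column CSV (old_url, expected_new_url, no header row):

```python
# Verify every legacy URL 301s, in a single hop, to the mapped new URL.
import csv
import requests

with open("redirect_map.csv", newline="") as f:
    for old_url, expected in csv.reader(f):
        r = requests.get(old_url, allow_redirects=True, timeout=30)
        hops = [resp.status_code for resp in r.history]
        if hops != [301] or r.url != expected:
            print(f"{old_url}: hops={hops or 'none'} landed={r.url} expected={expected}")
# Flags missing redirects, 302s, chains, and anything dumped on the homepage.
```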

80. The staging server of your site migration is crawled and now lives as a shadow domain. What do you do?
Immediately serve a noindex on the staging domain (meta tag and X-Robots-Tag header), password-protect it, and purge it from the CDN if it's publicly cached. I use the GSC Removals tool to drop it from the index quickly, and only add a robots.txt block once the URLs are out, since blocking crawling would hide the noindex.

81. After a server move, the site’s IP changed, and rankings tanked. Why?
Most often the new server's geo-location differs from the target audience's country, or performance (TTFB) regressed on the new infrastructure. A "bad neighborhood" shared IP gets blamed too, though Google has indicated shared hosting is rarely an issue. I verify geo-routing and CDN setup, benchmark response times against the old host, and consider a dedicated IP only if the evidence points that way.

82. Post-migration, 404 errors spiked because some old URLs aren’t redirecting. How do you recover?
I crawl the old domain’s known URL inventory (from GSC, backlinks, prior crawls) to find the missing redirects. I add 301 redirects for any URL with decent backlinks or traffic history. For the rest, I ensure a helpful custom 404 page.

Advanced & Niche Issues

83. The site is returning 5xx server errors only to Googlebot, not to human visitors. How to detect and fix?
I run a live test in GSC's URL Inspection tool, and reproduce the problem by sending requests with a Googlebot user-agent from curl or a crawler. The server, WAF, or CDN may be rate-limiting or blocking Googlebot by user-agent or IP. I coordinate with DevOps to allowlist Googlebot's published IP ranges.
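A minimal reproduction of that user-agent test, with a placeholder URL (this catches UA-based blocking only; IP-based blocking needs GSC's live test or a request from Googlebot's published ranges):

```python
# Compare response codes for a browser UA versus a Googlebot UA.
import requests

URL = "https://www.example.com/"
user_agents = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for name, ua in user_agents.items():
    r = requests.get(URL, headers={"User-Agent": ua}, timeout=30)
    print(f"{name}: HTTP {r.status_code}")
# 200 for the browser but 5xx/403 for Googlebot confirms user-agent-based
# blocking at the server, WAF, or CDN layer.
```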

84. Google is indexing your search results pages even though they’re noindex’d. Why?
The noindex might not be seen if it's injected via JavaScript and absent from the server-rendered HTML. Or the pages are blocked by robots.txt, which prevents Googlebot from ever reading the noindex, so it can still index the bare URLs from external links. I serve the noindex server-side (meta tag or X-Robots-Tag) and keep the pages crawlable until they drop out of the index; only then is a robots.txt block safe.

85. How do you fix a site that was hacked and injected with Japanese keyword spam?
Immediate action: take site offline for maintenance, clean all infected files, patch the CMS vulnerability, change credentials, and submit a removal request via GSC for the spammy URLs. Then, request a review if a manual action was applied, and monitor closely for re-infection.

86. The site seems fine, but Google is showing old meta descriptions even after updating them weeks ago. Why?
Google may not have recrawled that specific page because its crawl frequency is low; I request indexing via GSC's URL Inspection tool. If the old snippet persists after a recrawl, Google is likely choosing its own description because the new one doesn't match the page content or common queries well.

87. How do you troubleshoot a situation where organic traffic is solid but leads from organic are declining?
I analyze the landing pages driving conversions. A shift in keyword ranking toward less commercial-intent terms can bring traffic but fewer conversions. Form abandonment issues, new UX friction on mobile, or a change in the lead definition in analytics could also be the cause.

88. A client’s competitor is ranking above them using a page built entirely with AI-generated content that’s low-quality. Why?
Google hasn’t caught it yet. Short-term, I focus on creating higher-quality, expert-driven content with genuine E-E-A-T signals. I may file a spam report if it includes factual inaccuracies or dangerous information. Long-term, quality almost always wins.

89. What is a “phantom” de-indexation and how do you diagnose it?
A page appears to be indexed (no reporting issues) but cannot be found for any keyword or even the exact URL query. I use “site:example.com/url” search. I check if the page is on a CDN not accessible to Google, or if a soft-404 scenario is at play; often an algorithm just devalues the page’s relevance.

90. How do you troubleshoot a newly launched e-commerce site with faceted navigation that generated 100,000 parameter pages?
I implement canonical tags on facet URLs pointing to the master category and tighten internal linking (GSC's URL Parameters tool has been retired, so it's no longer an option). If canonicals alone can't contain the crawl waste, I block low-value facet combinations (like price sorting) in robots.txt. I ensure high-value combinations (like brand) get unique, optimized, indexable pages that aren't blocked.

Communication & Scenario-Based Questions

91. The CEO emails you at 10 PM: “Why is our traffic down?” What is your immediate response?
I acknowledge the email, say I’m looking into it, and provide a preliminary finding within a brief timeframe (next morning). If I can see it’s a known algorithm update from my phone, I mention that context. I never ignore; I manage anxiety with rapid acknowledgment and a promise of thorough follow-up.

92. A product manager argues that SEO slows down their release cycle. How do you resolve this conflict?
I propose embedding SEO review earlier in the design phase, provide a self-service checklist for non-critical launches, and agree on a “fast track” vs. “thorough review” classification. For fast-tracked items, I accept post-launch monitoring with a defined rollback plan if impacts are severe.

93. How would you explain to a non-technical stakeholder that a page being deindexed wasn’t an “accident” but a helpful update from Google?
“Google determined our content wasn’t truly helpful enough for the query, even if we thought it was. It’s an automated evaluation of overall site quality. The fix isn’t re-indexing that one page; it’s improving our overall content standards to meet Google’s evolving definition of helpfulness.”

94. You identified the problem, but it requires significant engineering resources. The engineering manager declines your request. What now?
I build a stronger business case—quantify the revenue impact of the issue, show the organic traffic at risk if not fixed, and present a smaller-scope pilot fix that proves value with minimal effort. I also escalate to the product owner who prioritizes the joint roadmap.

95. A junior analyst on your team misdiagnosed a ranking drop for a client. How do you handle it?
I review the analysis with them privately, identifying where the diagnostic path went wrong (data misinterpretation, unverified assumption). We correct the diagnosis together, I guide them to communicate the updated findings to the client, and I treat it as a learning moment without blame.

96. A client demands a guarantee that rankings will return to previous levels after an update. How do you respond?
“I can’t guarantee specific ranking positions because that’s controlled by Google, not me. But I can guarantee a rigorous diagnostic process, a transparent action plan, and our full commitment to implementing best practices that maximize the probability of recovery.”

97. You’ve been given a site with chronic SEO issues, but no budget for tools. How do you troubleshoot?
Google Search Console, Google Analytics, and a free Screaming Frog license (500 URLs) get me 80% of the way. I also use the open-source Lighthouse, Google Trends, and manual SERP analysis. Free tools are sufficient for root cause analysis; premium tools speed it up, but don’t replace reasoning.

98. How do you prioritize troubleshooting efforts when multiple SEO issues are discovered simultaneously?
Revenue impact and severity: blocking issues (noindex, 500 errors) > high-traffic page ranking drops > low-traffic page issues. I triage like an emergency room: stop the bleeding first, then treat chronic conditions.

99. Describe a time your initial troubleshooting hypothesis was completely wrong. What did you learn?
I once assumed a traffic drop was a Google update, but after 3 days of digging, we found the analytics tag was firing twice, artificially inflating previous data, so there was no “drop”—just corrected inflation. I learned to always take a baseline audit of data integrity before deep-diving into external factors.

100. What’s your core philosophy when an SEO problem seems unsolvable?
Break it into smaller questions. “We don’t know why this happened” is too big. I ask: “Is it a crawl issue? Is it a content quality issue? Is it links? Is it SERP feature cannibalization?” By isolating each variable, even the most mysterious problem yields to structured investigation. There is always a root cause.
