Here are 100 practical SEO interview questions and answers. These are scenario-based, hands-on questions that test your ability to diagnose issues, implement fixes, and make data-driven decisions. Each question is followed by a detailed answer.
You notice a page that was ranking #3 for a high-value keyword has dropped to #25 overnight. What is your step-by-step diagnostic process?
Answer: First, I check Google Search Console for any manual action or coverage errors. Second, I verify the drop date against known Google algorithm updates using tools like MozCast or SEMrush Sensor. Third, I use GSC performance report to see if the drop is for that keyword only or across many. Fourth, I check the page’s backlinks using Ahrefs to see if any high-quality links were lost or toxic links appeared. Fifth, I check if the page’s content changed recently (via CMS logs or Wayback Machine). Sixth, I check competitor activity – did a new page outrank me? Seventh, I check technical issues: page speed, Core Web Vitals, noindex tag, canonical issues, or server errors. Finally, I prioritize fixes based on findings.
A client’s website has 10,000 product pages, but only 3,000 are indexed. How do you diagnose and fix this?
Answer: I first check the Google Search Console coverage report to see why the other 7,000 are not indexed. Common reasons: “Crawled – currently not indexed” (quality or canonicals), “Discovered – currently not indexed” (crawl budget), or “Excluded” (noindex, canonical to another page). Then I sample the unindexed URLs. If they are thin content (manufacturer descriptions), I improve content uniqueness. If crawl budget is the issue, I block low-value pages (faceted navigation, internal search) via robots.txt. I also ensure XML sitemaps only include important pages. I request indexing for high-priority pages via GSC.
You are migrating a site from HTTP to HTTPS. What is your complete checklist?
Answer: 1) Install SSL certificate correctly. 2) Update all internal links to HTTPS (relative paths help). 3) Implement 301 redirects from every HTTP URL to its HTTPS equivalent. 4) Update canonical tags to HTTPS. 5) Update XML sitemap to HTTPS URLs. 6) Update robots.txt to reference HTTPS sitemap. 7) Update hreflang tags if used. 8) Update Google Search Console property to HTTPS version. 9) Update Google Analytics tracked property. 10) Update any hardcoded external resources (images, scripts) to HTTPS to avoid mixed content. 11) Test with crawler. 12) Monitor GSC for crawl errors for 4 weeks.
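Step 3 is the one most often broken after launch, and it is easy to verify with a short script. A minimal sketch, assuming a plain-text file of legacy HTTP URLs (the filename and sample data are illustrative, not part of the checklist above):

```python
# Verify that each legacy HTTP URL issues a single 301 to its exact HTTPS equivalent.
# Assumes "http_urls.txt" contains one HTTP URL per line (illustrative filename).
import requests

with open("http_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    # Do not follow redirects automatically; we want to inspect the first hop only.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "")
    expected = url.replace("http://", "https://", 1)
    status = "OK" if resp.status_code == 301 and target == expected else "CHECK"
    print(f"{status}  {url} -> {resp.status_code} {target}")
```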
A page has a high bounce rate (85%) but ranks well. How would you investigate and improve it?
Answer: High bounce rate with good rankings suggests the search intent may not match the page content. I first check the keyword’s search intent – if the user expects a product page but lands on a blog post, that’s a mismatch. I also check page load speed – slow pages increase bounce. I review the content: is it scannable, engaging, with clear CTAs? I use Hotjar or session recordings to see user behavior. Fixes: rewrite title and meta to accurately reflect content, improve page layout, add internal links to keep users engaged, add a clear next step (CTA), and improve mobile usability.
You are asked to improve the organic traffic for a blog that has 200 posts but only 10 get regular traffic. What do you do?
Answer: I first audit the 10 high-traffic posts to understand what they have in common: keyword targeting, length, backlinks, freshness. Then I analyze the 190 underperforming posts. I prioritize: 1) Refresh outdated posts with new data and optimize for current search intent. 2) Merge very thin or similar posts into comprehensive guides. 3) Add internal links from high-traffic posts to underperforming relevant ones. 4) Build a few backlinks to the best underperforming posts. 5) Noindex or delete posts with zero value. 6) Create new topic clusters around the winners.
A client wants to rank for a very competitive keyword (“insurance”). Their site is new. What realistic strategy do you propose?
Answer: I explain that ranking for “insurance” directly is nearly impossible for a new site. Instead, I propose a long-tail strategy: target phrases like “best life insurance for seniors over 60” or “car insurance for new drivers in Texas”. I build topical authority through cluster content. I also focus on local SEO if applicable. I invest in high-quality backlinks via digital PR and original research. Over 12-24 months, as domain authority grows, we can target broader terms. I set realistic KPIs.
You find that a website’s mobile traffic has dropped 40% while desktop remains stable. What is your first check?
Answer: The first check is for mobile usability problems – text too small, tap targets too close together, viewport not set – using Lighthouse or PageSpeed Insights (GSC’s standalone Mobile Usability report has been retired). Second, I check whether the site was recently redesigned and whether the mobile version is missing content compared to desktop (mobile-first indexing). Third, I check Core Web Vitals specifically on mobile. Fourth, I check whether a recent algorithm update targeted mobile experience. Fifth, I verify that resources needed by the mobile version are not blocked by robots.txt.
How would you set up an SEO test to prove that changing title tags improves CTR?
Answer: Google Search Console has no built-in experiment feature, so I run a controlled split test with a third-party tool like SearchPilot or set one up manually. I select a set of pages with similar baseline CTR, change title tags on half (the test group), and keep the control group unchanged. After 4 weeks, I compare the CTR difference in GSC. I also use Google Analytics to see if the test group’s organic sessions increased proportionally. I check for statistical significance (p-value < 0.05) before drawing conclusions, as sketched below. I document results.
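For the significance check, a two-proportion z-test on clicks versus impressions is enough. A minimal sketch using statsmodels; the counts are made-up placeholders to be replaced with aggregated GSC data per group:

```python
# Two-proportion z-test: is the test group's CTR significantly different from control?
# All counts are placeholders; substitute aggregated clicks/impressions from GSC.
from statsmodels.stats.proportion import proportions_ztest

control_clicks, control_impressions = 1200, 60000   # pages with original titles
test_clicks, test_impressions = 1450, 61500         # pages with new titles

stat, p_value = proportions_ztest(
    count=[test_clicks, control_clicks],
    nobs=[test_impressions, control_impressions],
)
print(f"z = {stat:.2f}, p = {p_value:.4f}")
print("Significant at 5%" if p_value < 0.05 else "Not significant - keep collecting data")
```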
A site has thousands of faceted navigation URLs indexed (e.g., /category?color=red&size=10). How do you clean this up?
Answer: GSC’s URL Parameters tool has been retired, so the cleanup relies on on-site signals. I add canonical tags on filter pages pointing to the main category page, or a noindex tag on filter pages that have no unique search value. Once those pages have dropped out of the index, I can block crawling of the filter parameters via robots.txt (e.g., Disallow: /*?color=) to save crawl budget – but not before, because a crawl block stops Google from seeing the canonical or noindex. Finally, I internally link only to clean category URLs, not filter ones.
You discover that a competitor has copied your entire product descriptions word-for-word. What do you do?
Answer: First, I document the duplicate content with screenshots and timestamps. I send a DMCA takedown notice to the competitor’s hosting provider and Google via the legal removal request tool. I also add a canonical tag on my page pointing to itself (self-referential) to assert originality. I consider adding a visible copyright notice. For SEO, duplicate content usually doesn’t penalize the original, but I monitor if my rankings drop. I also add unique content to my pages to differentiate.
A client’s website has a 20% drop in organic traffic but rankings are the same. What is likely the cause?
Answer: Rankings same but traffic down suggests a SERP feature change. For example, Google may have introduced a featured snippet, local pack, or shopping carousel that pushes organic results down the page, reducing visibility. Alternatively, the total search volume for the keyword may have decreased seasonally. I check GSC for CTR changes per keyword – if CTR dropped, that’s the cause. I also check for increased paid ads above organic results. I respond by optimizing for featured snippets or adding rich results to regain CTR.
How would you prioritize keywords for a new ecommerce site with budget for only 10 product pages?
Answer: I focus on bottom-of-funnel (transactional) keywords with moderate to low difficulty, decent search volume (100-1,000/month), and high commercial intent (e.g., “buy”, “price”, “discount”). I also look for long-tail phrases with clear purchase intent. I avoid head terms. I map each keyword to a product page. I also target “best [product]” keywords for category pages because they pre-educate users. I use keyword difficulty scores from SEMrush or Ahrefs to ensure feasibility.
You are handed a website with 5,000 404 errors from external backlinks. How do you handle this?
Answer: First, I export the 404 list from GSC. I categorize: 1) URLs that have obvious replacements – I 301 redirect to the closest relevant page. 2) URLs with no replacement – I leave as 404 but create a custom 404 page. 3) Bulk of errors from a specific pattern – I write regex redirects. I do not redirect all to homepage (that’s a soft 404). I also reach out to valuable linking sites to update their links. I monitor the 404 count weekly.
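To find the “bulk of errors from a specific pattern”, I group the 404 URLs by their first path segment before writing any regex redirects. A small sketch, assuming the GSC export was saved as a CSV with a URL column (the filename and column name are assumptions):

```python
# Group 404 URLs by their first path segment; large buckets are candidates for one regex redirect.
# "404s.csv" and its "URL" column are assumed names - adjust to your export.
import csv
from collections import Counter
from urllib.parse import urlparse

prefix_counts = Counter()
with open("404s.csv", newline="") as f:
    for row in csv.DictReader(f):
        path = urlparse(row["URL"]).path.strip("/")
        first_segment = path.split("/")[0] if path else "(root)"
        prefix_counts[f"/{first_segment}/"] += 1

for prefix, count in prefix_counts.most_common(15):
    print(f"{count:6d}  {prefix}")
```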
A client’s homepage has 10,000 backlinks, but their product pages have very few. How do you leverage this?
Answer: I use internal linking to distribute link equity from the homepage to important product pages. I add contextual links within the homepage content (not just footer). I also create a “featured products” section with links deep to product pages. Additionally, I reach out to sites that linked to the homepage and ask if they would also link directly to relevant product pages. I ensure the homepage’s anchor text is diverse and natural.
You run a crawl and find 200 pages with duplicate meta descriptions. What is your action plan?
Answer: I first identify which pages are high priority (product pages, category pages, blog posts). For those, I write unique, compelling meta descriptions manually, incorporating target keywords. For low-priority pages (e.g., paginated archives, tag pages), I either set a dynamic rule to generate unique descriptions using page title or category name, or I noindex those pages. I use a CMS plugin or script to automate bulk changes where possible.
A site’s load time is 8 seconds on mobile. List the top three fixes you would implement first.
Answer: 1) Compress and serve images in next-gen formats (WebP/AVIF) with lazy loading. 2) Minify CSS, JavaScript, and HTML, and defer render-blocking scripts. 3) Use a CDN and enable browser caching. These three typically give the biggest wins. Then I would move to server-side optimizations (PHP/DB, HTTP/2, hosting upgrade).
You are asked to double organic revenue in 6 months with a limited budget. What’s your strategy?
Answer: I focus on high-conversion, low-competition long-tail keywords that are already bringing some traffic but not converting well. I optimize those pages for conversion (CTAs, trust signals). I also identify the top 20% of pages that bring 80% of conversions and improve their rankings by building a few quality backlinks each. I also fix any technical issues on checkout or lead forms. I do not spend on broad content; I double down on what already works.
A competitor launches a new product category and immediately outranks you. You have the same products. What do you do?
Answer: I analyze their page: title, headings, content depth, internal links, backlinks, schema, and user engagement (if possible). I then improve my page: add more detailed content, better images, customer reviews, FAQ schema, internal links from high-authority pages. I build 2-3 quality backlinks to my page. I also check if they are using a different search intent (e.g., “best” vs “buy”) and adjust. I monitor and iterate.
How would you test if a canonical tag is correctly implemented and respected by Google?
Answer: I use Google Search Console’s URL Inspection tool for the non-canonical URL. Under “Coverage”, it will show “Duplicate, submitted URL not selected as canonical” and list the canonical URL Google chose. I check if that matches my canonical tag. I also crawl the site with Screaming Frog and verify the canonical tag is present. I then check if the canonical URL actually returns 200 OK and is indexable.
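The crawl-level check can also be scripted for a short list of URLs: read each page’s rel=canonical and confirm the canonical target resolves with a 200. A sketch with requests and BeautifulSoup; the URLs are placeholders:

```python
# Extract each page's rel=canonical and verify the canonical target returns 200 OK.
# URLs below are placeholders.
import requests
from bs4 import BeautifulSoup

pages = ["https://example.com/product-a", "https://example.com/product-a?ref=123"]

for url in pages:
    html = requests.get(url, timeout=10).text
    link = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
    if link is None or not link.get("href"):
        print(f"{url}: no canonical tag found")
        continue
    canonical = link["href"]
    target_status = requests.get(canonical, timeout=10).status_code
    print(f"{url}: canonical -> {canonical} (target returns {target_status})")
```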
A site has been penalized by a manual action for “pure spam”. What immediate steps do you take?
Answer: First, I read the exact manual action message in GSC to understand the reason. I then audit the entire site for spammy content: doorway pages, keyword stuffing, cloaking, or hidden text. I remove or noindex all violating pages. I also check for hacked content. After cleanup, I submit a reconsideration request detailing every action taken. I also disavow any unnatural backlinks mentioned. I do not submit until I am confident the site is clean.
You notice that Google is indexing the “print” version of your pages instead of the main version. How do you fix?
Answer: The print version likely has no canonical tag or one pointing to itself. I add a canonical tag on each print page pointing to the main page, or a noindex tag if users never need print pages in search. I hold off on blocking the print folder in robots.txt (Disallow: /print/) until those pages have dropped out of the index, because a crawl block would prevent Google from seeing the canonical or noindex. I ensure the main page has a self-referential canonical.
A client wants to rank for “best coffee beans” but their product page only lists 3 products. Competitors list 20+ with detailed reviews. How do you advise?
Answer: I explain that “best” queries require comprehensive, comparative content. I recommend creating a dedicated “best coffee beans” blog post or category page that reviews at least 10-15 products with pros/cons, ratings, and buying guide. That page then links to individual product pages. This is a classic topic cluster. I also suggest adding user reviews and expert quotes. The existing product page is for transactional intent, not commercial investigation.
You find that a page has a canonical tag pointing to itself, but Google has chosen a different canonical. Why?
Answer: Google overrides canonical tags if it detects stronger signals. Possible reasons: the page has very thin content while another page is more comprehensive; there are many internal links pointing to a different version; the page has a 302 redirect; or the page is blocked by robots.txt. I investigate which URL Google chose as canonical and why. I then improve my page’s content and internal linking to reinforce the self-canonical.
How would you optimize a PDF that ranks for a keyword but your HTML page does not?
Answer: I create a new HTML page that contains the same or better content as the PDF, with proper title tags, headings, and internal links. Because a PDF cannot carry an HTML canonical tag, I send a rel="canonical" Link HTTP header on the PDF pointing to the HTML page. I also add a line in the PDF saying “for updated content, visit [HTML URL]”. If the PDF is no longer needed, I 301 redirect the PDF URL to the HTML page. Over time, Google will replace the PDF with the HTML page in the results.
A client’s website is built entirely in React with client-side rendering. Google is indexing only the blank template. What do you recommend?
Answer: I recommend implementing server-side rendering (SSR) using Next.js or Nuxt.js. Alternatively, use dynamic rendering: detect Googlebot and serve a pre-rendered static HTML version (using Puppeteer or Rendertron). I also ensure all critical content is available in the initial HTML and that the meta tags are not injected via JavaScript. I test using GSC URL Inspection Tool.
You have two versions of a page: one at /product and one at /product?ref=123. Both are indexed. How do you consolidate?
Answer: I add a canonical tag on the parameterized page pointing to /product and update internal links to point only to /product. The GSC URL Parameters tool has been retired, so the canonical is the main signal; if the ref value is not needed for tracking, I also 301 redirect the parameterized URL to the clean URL.
A blog post ranks #1 but has a low CTR (1%). How do you improve CTR without changing ranking?
Answer: I A/B test title tags and meta descriptions. I try adding numbers (“7 Ways to…”), brackets (“[2026 Guide]”), power words (“Ultimate”), or emotional triggers. I also test adding the current year. I ensure the meta description matches user intent and includes a call-to-action. I also check whether a SERP feature such as a competitor’s featured snippet is absorbing the clicks – if so, I optimize to win it.
You discover that a developer has added a noindex tag to all product pages by mistake. The pages have been noindexed for 2 weeks. What do you do?
Answer: First, I immediately remove the noindex tag. Second, I use GSC URL Inspection tool to request re-indexing for the most important product pages. Third, I submit updated sitemaps. Fourth, I monitor the coverage report to see pages returning to index. Fifth, I identify any lost rankings and build a recovery plan (backlinks, content refresh). Sixth, I communicate with the development team to prevent recurrence.
A client wants to know why their new blog post isn’t ranking even though it’s well-written. How do you troubleshoot?
Answer: I check if the page is indexed (GSC URL Inspection). If not, I request indexing. If indexed but not ranking, I check if it’s targeting a keyword with very high difficulty. I analyze the top-ranking pages for that keyword: their content length, backlinks, domain authority, and freshness. I also check internal linking – does the post receive links from high-authority pages? I then improve the post: add more depth, internal links, and build a few backlinks.
How would you identify keyword cannibalization on a large site?
Answer: I use the site:domain.com “keyword” search operator and see which pages appear. I also use Google Search Console performance report: filter by query and see which pages are getting impressions for that query. If multiple pages appear, that’s cannibalization. I also use crawling tools like Screaming Frog to export all title tags and meta descriptions, then look for duplicate keywords. I then consolidate the content.
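At scale, the GSC check is easier through the Search Analytics API: pull query-page pairs and flag queries where more than one URL collects impressions. A sketch assuming a service account set up for the Search Console API; the property URL, date range, and key file name are illustrative:

```python
# Flag queries for which more than one page received impressions (possible cannibalization).
# The property URL, dates, and "service-account.json" key file are illustrative; the service
# account must have access to the GSC property.
from collections import defaultdict
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE = "https://example.com/"
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

body = {
    "startDate": "2024-01-01",
    "endDate": "2024-03-31",
    "dimensions": ["query", "page"],
    "rowLimit": 25000,
}
rows = service.searchanalytics().query(siteUrl=SITE, body=body).execute().get("rows", [])

pages_per_query = defaultdict(set)
for row in rows:
    query, page = row["keys"]
    pages_per_query[query].add(page)

for query, pages in sorted(pages_per_query.items()):
    if len(pages) > 1:
        print(query, "->", sorted(pages))
```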
You are asked to improve the crawl budget of a site with 2 million URLs. Where do you start?
Answer: I first analyze server logs to see which URLs Googlebot actually crawls. I then remove or block low-value URLs: faceted navigation parameters, internal search results, paginated archives beyond page 5, user profiles, and duplicate pages. I implement noindex on pages that are not useful. I update robots.txt to disallow crawling of waste sections. I also make sure XML sitemaps only contain high-priority URLs. I fix any server errors (5xx) that waste crawl attempts.
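The server-log step can start with a very small script: count Googlebot requests per top-level path to see where crawl budget actually goes. A sketch assuming a combined-format access.log (the filename and format are assumptions, and it skips reverse-DNS verification of Googlebot):

```python
# Count Googlebot requests per first path segment from a combined-format access log.
# "access.log" and its format are assumptions; no reverse-DNS check of Googlebot IPs is done.
import re
from collections import Counter

request_re = re.compile(r'"(?:GET|POST|HEAD) (\S+) HTTP/[^"]*"')
hits = Counter()

with open("access.log", encoding="utf-8", errors="ignore") as f:
    for line in f:
        if "Googlebot" not in line:
            continue
        match = request_re.search(line)
        if not match:
            continue
        path = match.group(1).split("?")[0].strip("/")
        segment = "/" + (path.split("/")[0] if path else "")
        hits[segment] += 1

for segment, count in hits.most_common(20):
    print(f"{count:8d}  {segment}")
```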
A page has great content but zero backlinks. How do you get the first few backlinks without a budget?
Answer: I use the skyscraper technique: find a popular but outdated resource in the same niche, create a better version, then email the people who linked to the original, politely presenting my improved resource. I also use HARO (Help a Reporter Out) to earn editorial links. I share the page on relevant social media and communities (Reddit, LinkedIn) where appropriate. I also internally link to it from high-authority pages on my own site.
You notice that a competing site has a huge number of backlinks from .edu domains. How can you replicate this ethically?
Answer: Many .edu backlinks come from scholarship pages, resource lists, or student projects. I create a legitimate scholarship program ($500) and list it on scholarship databases. I also create high-quality educational content (e.g., “Guide to X”) and reach out to professors or university librarians to suggest it as a resource. I also offer to speak at universities or provide free tools for educational use.
A client’s website has 500 pages with “Click here” as anchor text for internal links. How do you fix?
Answer: I audit all internal links using a crawler. For important pages, I change “click here” to descriptive anchor text (e.g., “learn more about SEO services”). I prioritize pages that are money pages. I use a database update or CMS script to bulk change common patterns. I then monitor if the target pages’ rankings improve due to better anchor text relevance.
How would you handle a situation where a product page is out of stock permanently, but it has many backlinks?
Answer: I 301 redirect the product page to the closest relevant category page or a similar product page. This preserves most link equity. I do not 404 or 410 a page with valuable backlinks. I also update internal links pointing to the old product page. If there is no replacement product, I redirect to a related resource page or the homepage as last resort.
You find that Google is indexing a staging version of the site (staging.example.com). What immediate steps do you take?
Answer: First, I add a noindex directive to all staging pages via the robots meta tag or the X-Robots-Tag header. Second, I request removal of the staging URLs via the GSC Removals tool. Third, I password-protect the staging site with basic authentication so it cannot be crawled going forward. I avoid adding a robots.txt Disallow until the pages have dropped out of the index, because blocking crawling would hide the noindex from Google. Finally, I make sure staging is not linked from anywhere public.
A site ranks #1 for a keyword but has a very low conversion rate. What changes would you test?
Answer: I first analyze user intent – is the keyword informational but the page is transactional? If yes, I either change the page to match intent or target a different keyword. If intent matches, I improve the call-to-action: increase visibility, change button color or text, add urgency, simplify forms, add trust signals (testimonials, guarantees), and reduce friction. I also A/B test different headlines and layouts.
How would you measure the SEO impact of a site redesign that changed URL structures?
Answer: I set up a benchmark using GSC and Google Analytics from the 4 weeks before launch. After launch, I monitor daily: 404 errors (redirects working?), organic sessions, keyword rankings, and crawl stats. I compare week-over-week and month-over-month. I also check if Google has re-indexed the new URLs (coverage report). I give it at least 4 weeks before drawing conclusions.
You discover that a client’s site has a huge number of backlinks from porn or gambling sites. What is your action plan?
Answer: This could be a negative SEO attack. I first check if there is a manual action in GSC. If not, I monitor for ranking drops. I attempt to contact those site owners to remove links – unlikely to work. Then I disavow those domains using the Google Disavow Tool. I do not disavow individual URLs but entire domains. I also set up backlink monitoring to catch future attacks early.
A blog post ranks for a keyword but the featured snippet is held by a competitor. How can you take it?
Answer: I analyze the competitor’s snippet: format (paragraph, list, table), length, and exact wording. I then reformat my answer to match the snippet type, making it more concise and direct. I place the answer immediately after a heading that matches the query. I use schema markup (FAQ or QAPage). I also ensure my answer is factually better. Then I wait for Google to recrawl.
How would you optimize a website that has been penalized by the “Link Spam Update”?
Answer: The Link Spam Update targets unnatural links. I audit the backlink profile for spammy patterns (exact-match anchor text, low-authority foreign domains, PBNs). I attempt to remove those links. For those I cannot remove, I disavow them. I also review internal linking for over-optimized anchor text. After cleanup, I build high-quality, relevant links to dilute the remaining low-quality ones.
You are given a website with no XML sitemap. Create one from scratch using only the information you have.
Answer: I export a list of all important URLs by crawling the site with Screaming Frog (or using a CMS export). I then create a sitemap index file and individual sitemaps (max 50,000 URLs each, 50MB). I include lastmod, changefreq (optional), and priority (optional but not heavily used). I validate the sitemap with a validator. I upload it to the root directory. I then submit via GSC and reference it in robots.txt.
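Once the URL list exists, writing the file is mechanical. A minimal sketch that produces a single sitemap from a crawler export (urls.txt is an assumed filename); larger sites repeat this per 50,000-URL chunk and wrap the files in a sitemap index:

```python
# Build one XML sitemap from a plain URL list ("urls.txt" is an assumed filename).
# lastmod here uses today's date as a placeholder; real modification dates are better.
from datetime import date
from xml.sax.saxutils import escape

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

today = date.today().isoformat()
entries = "\n".join(
    f"  <url><loc>{escape(u)}</loc><lastmod>{today}</lastmod></url>" for u in urls[:50000]
)
with open("sitemap.xml", "w", encoding="utf-8") as out:
    out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    out.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    out.write(entries + "\n</urlset>\n")
print(f"Wrote sitemap.xml with {min(len(urls), 50000)} URLs")
```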
A client wants to target the same keyword in both English and Spanish on the same page using tabs. Is this good for SEO?
Answer: No. Content hidden in tabs may not be crawled effectively or given full weight. Also, the same URL cannot target both languages well. Instead, I recommend separate URLs for each language (example.com/en/page and example.com/es/page) with hreflang tags. Users can switch via language selector. This is much better for international SEO.
You find that your sitemap includes 404 pages. What are the risks and how do you fix?
Answer: Including 404 pages wastes crawl budget and can dilute sitemap effectiveness. Google may also lose trust in the sitemap. I fix by removing those URLs from the sitemap generation process. I also automatically 301 redirect those URLs if they have obvious replacements, or keep them as 404 but remove from sitemap. I regenerate sitemap and resubmit.
A page has a large image that is causing LCP to be 5 seconds. The image is a hero banner. How do you optimize?
Answer: I compress the image using a tool like Squoosh or ImageOptim. I convert it to WebP or AVIF. I add loading="eager" for hero image (not lazy load). I set fetchpriority="high". I also implement responsive images using srcset so mobile users get a smaller image. I move the image to a CDN. I also consider reducing image dimensions if visually acceptable.
How would you set up tracking for a “click-to-call” button as an SEO conversion?
Answer: I use Google Tag Manager to create a trigger on the click-to-call link (e.g., a[href^="tel:"]). I create a GA4 event tag with parameters: event name = “phone_call”, parameter “phone_number” = the number. Then I mark that event as a conversion in GA4. I also import offline data if calls are tracked. For GSC, I cannot track calls directly, but I can see the organic landing pages that lead to calls.
You notice that a competitor’s page has a much better internal linking structure than yours. How do you replicate?
Answer: I crawl both sites and compare the internal link graphs. I note the competitor’s hub pages (most linked-to), anchor text patterns, and link depth. I then recreate a similar structure on my site: add contextual links from relevant posts to my pillar pages, create a “related articles” section, and ensure each important page receives links from at least 3 other pages. I also use breadcrumbs and a table of contents.
A client has a single-page application (SPA) with a blog. The blog posts are loaded via AJAX and have no unique URLs. How do you fix?
Answer: I implement the History API (pushState) to give each blog post a unique, crawlable URL. I also configure server-side routing to return the correct HTML for those URLs. I add meta tags dynamically based on the post. I also create an XML sitemap for the blog posts. I test with GSC URL Inspection.
You are asked to estimate the potential organic traffic for a new product category before launching. How do you do it?
Answer: I conduct keyword research for terms related to that category. I sum the monthly search volumes for the top 10-20 relevant keywords, multiply by a realistic click-through rate for the position we can expect (e.g., around 20% for a top-3 ranking), and then apply a conservative discount based on our domain authority versus competitors. I also sanity-check the estimate against the traffic of competitors’ category pages in SEMrush or Ahrefs.
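The arithmetic is simple enough to show explicitly; every number below is a placeholder:

```python
# Back-of-envelope organic traffic estimate for a new category; all figures are placeholders.
keyword_volumes = [2400, 1900, 1300, 880, 720, 480, 390, 260, 210, 170]  # monthly searches
expected_ctr = 0.20        # rough CTR for a top-3 ranking
authority_factor = 0.5     # discount because our domain is weaker than the current top results

estimate = sum(keyword_volumes) * expected_ctr * authority_factor
print(f"Estimated monthly organic visits: {estimate:.0f}")  # ~871 with these inputs
```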
How do you use the “URL Inspection” tool to verify that a canonical tag is working?
Answer: I enter the non-canonical URL. Under “Coverage”, Google shows “Duplicate, submitted URL not selected as canonical” and displays the URL that Google selected as canonical. I compare that with my intended canonical. I also check “Indexing allowed?” and if no, why. I also see the last crawl date. If Google ignores my canonical, I investigate why.
A client’s website has a large number of HTML errors (e.g., unclosed tags, duplicate IDs). Does this affect SEO?
Answer: Generally, minor HTML errors do not directly impact rankings. However, severe errors can cause rendering issues for Googlebot, especially with JavaScript-dependent content. Duplicate IDs can cause structured data problems. I recommend fixing them for maintainability, but I prioritize broken links, canonical issues, and speed over HTML validation.
How would you identify the pages on your site that are losing the most organic traffic month over month?
Answer: In Google Analytics, I compare two date ranges (this month vs last month) for organic traffic. I sort by “change” descending for negative change. I also use Google Search Console’s performance report, compare date ranges, and filter by “Pages” with the biggest drop in clicks. I then investigate each page for reasons.
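With two “Pages” exports from the GSC performance report (one per date range), the comparison can be automated with pandas. The filenames and column names below mirror a typical GSC CSV export but should be treated as assumptions:

```python
# Compare clicks per landing page across two GSC "Pages" exports and list the biggest losers.
# Filenames and the "Page"/"Clicks" columns are assumed; check your export's exact headers.
import pandas as pd

last_month = pd.read_csv("pages_last_month.csv")[["Page", "Clicks"]]
this_month = pd.read_csv("pages_this_month.csv")[["Page", "Clicks"]]

merged = last_month.merge(this_month, on="Page", how="outer",
                          suffixes=("_last", "_this")).fillna(0)
merged["delta"] = merged["Clicks_this"] - merged["Clicks_last"]

# Most negative deltas = pages losing the most organic clicks month over month.
print(merged.sort_values("delta").head(20).to_string(index=False))
```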
You inherit a site with a massive number of redirect chains (e.g., A-B-C-D-E). How do you simplify them?
Answer: I crawl the site with Screaming Frog, which reports redirect chains. I export the chains. For each chain, I replace the entire series with a single 301 redirect from the original URL to the final destination. I update .htaccess or server config accordingly. I also update any hardcoded internal links that point to intermediate URLs.
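Individual chains can also be confirmed outside the crawler by following each hop. A short sketch; the starting URLs are placeholders, and anything with more than one hop is a chain to flatten:

```python
# Follow each URL's redirects and report the hops and final destination.
# Starting URLs are placeholders.
import requests

start_urls = ["https://example.com/old-page", "https://example.com/a"]

for url in start_urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = resp.history  # each intermediate redirect response, in order
    print(f"{url}: {len(hops)} hop(s), final {resp.status_code} at {resp.url}")
    for i, hop in enumerate(hops, 1):
        print(f"   {i}. {hop.status_code} {hop.url} -> {hop.headers.get('Location', '')}")
```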
A client wants to understand why their homepage is not ranking for their brand name. What could be wrong?
Answer: The homepage should rank for brand name by default. Possible issues: the homepage has a noindex tag, is blocked by robots.txt, has a canonical pointing elsewhere, or has a very weak internal link structure. Also, if there are other pages with “brand name” in the title (e.g., a blog post), cannibalization could occur. I fix by ensuring the homepage is indexable and has a self-referential canonical, and I set the brand name as title.
How do you test if Google is crawling your JavaScript-rendered content correctly?
Answer: I use the GSC URL Inspection tool, click “Test Live URL”, then “View Tested Page”. I compare the rendered HTML with the raw HTML – if critical content is missing from the rendered HTML, there’s a problem. The Rich Results Test also shows the page’s HTML after JavaScript execution. Additionally, I can fetch the page from the command line with a Googlebot user-agent to see what the server sends to crawlers before rendering.
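The command-line check can be scripted: fetch the page with and without a Googlebot user-agent and test whether a key phrase appears in the initial HTML. No JavaScript is executed here, so this complements the rendered view in GSC rather than replacing it; the URL and marker phrase are placeholders:

```python
# Fetch a page with a normal and a Googlebot user-agent and check for key content in the raw HTML.
# Useful for spotting dynamic rendering (or its absence); URL and marker text are placeholders.
import requests

URL = "https://example.com/some-js-page"
MARKER = "Free shipping on all orders"   # phrase that should appear in the rendered content

agents = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}
for name, ua in agents.items():
    html = requests.get(URL, headers={"User-Agent": ua}, timeout=10).text
    print(f"{name:9s}: {len(html):7d} bytes, marker present: {MARKER in html}")
```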
A site is losing rankings for 30% of its keywords after a hosting change. What is the most likely cause?
Answer: The most likely cause is degraded server performance or configuration after the move: response times increased, the new server sits in a distant geographic location, or the new IP has a poor history. SSL certificate issues or mixed content can also appear after a migration. I compare server response times and Core Web Vitals before and after the change, and I verify that robots.txt, redirects, and server headers were carried over correctly.
You discover that a page has a “noindex” tag in the HTTP header but not in the HTML. How do you find this?
Answer: I use a crawler like Screaming Frog that checks HTTP headers and look for X-Robots-Tag: noindex. I can also check a single URL from the command line with curl -I. If the directive is found and unwanted, I remove or modify it at the server or CMS level and ensure it is consistent with the HTML meta robots tags.
How would you recover from a “Discovered – currently not indexed” issue for a large number of pages?
Answer: This indicates crawl budget starvation. I first reduce the number of low-value pages (facets, filters, internal search) by blocking or noindexing them. I also improve internal linking to important pages from the homepage. I compress page sizes for faster crawling. I submit a fresh, trimmed sitemap. I also use GSC to request indexing for top priority pages individually.
A client has a blog with comments. Many comments contain spammy links. What do you do?
Answer: I enable comment moderation. For existing spammy comments, I delete them. I add rel="ugc" (user-generated content) to all comment links to signal to Google that they are not endorsed. I also consider nofollow on all comment links. I use an anti-spam plugin (Akismet). I also check if any of those spam links have led to a manual action.
You have two similar product pages that both have strong backlinks. You need to merge them without losing link equity. How?
Answer: I consolidate both pages into a single, comprehensive product page that includes all information from both. I then 301 redirect the weaker page to the new consolidated page. For the stronger page, I also 301 redirect if it’s not the chosen canonical, OR I choose that page as the canonical and merge the content there. I ensure all internal links point to the canonical version. I monitor rankings for both keywords.
How do you verify that a 301 redirect is passing link equity?
Answer: You cannot directly measure link equity passed, but you can check if the target page’s rankings improve over time. I also use a tool like Ahrefs to see if backlinks pointing to the old URL are now attributed to the new URL in their reports (it may take time). I also check Google Search Console’s Links report for the new URL – backlinks to the old URL should eventually appear.
A site’s pages are indexed but have “noindex” in the HTTP header for some pages. How do you locate all such pages?
Answer: I use Screaming Frog to crawl the site and check the HTTP headers tab. I filter for X-Robots-Tag containing “noindex”. I also use a script to check all URLs in my sitemap using curl. Once identified, I remove the noindex headers.
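That sitemap sweep is a few lines of code: pull every URL from the sitemap and flag any whose response carries a noindex in X-Robots-Tag. The sitemap URL is a placeholder, and a sitemap index would need one extra level of fetching:

```python
# Flag sitemap URLs whose HTTP response includes "noindex" in the X-Robots-Tag header.
# The sitemap URL is a placeholder; a sitemap index file would need its child sitemaps fetched too.
import requests
import xml.etree.ElementTree as ET

SITEMAP = "https://example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP, timeout=10).content)
urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

for url in urls:
    headers = requests.head(url, allow_redirects=True, timeout=10).headers
    robots = headers.get("X-Robots-Tag", "")
    if "noindex" in robots.lower():
        print(f"NOINDEX header: {url} -> {robots}")
```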
You run a report and see that a page has 10,000 impressions but only 1 click. What is the likely issue?
Answer: Extremely low CTR suggests the title and meta description are not compelling, or the page is ranking on position 8-10 where clicks are low. Also, the search result may be surrounded by ads, knowledge panels, or other SERP features pushing the organic result down. I optimize the title and meta, and aim to move the ranking to top 3.
A client’s site has 50,000 products, each with a unique URL, but many are out of stock and never return. What is the best long-term strategy?
Answer: For permanently discontinued products, I return a 410 Gone status code to de-index them quickly and remove them from the sitemap, or 301 redirect to a close substitute or the parent category when one exists and the page has backlinks. For products that may return, I keep the page live with a 200 status, clear “out of stock” messaging, and suggested alternatives rather than removing it.
How would you use Google Trends to inform your content calendar?
Answer: I search for my core topics and compare interest over 12 months. I identify seasonal peaks (e.g., “Christmas gifts” peaks in November). I produce content 2-3 months before the peak to allow time for indexing and ranking. I also look for rising queries to create timely content before competition intensifies.
You find that Google is indexing URL parameters that lead to the same content (e.g., /page?sessionid=123). How do you stop this?
Answer: The GSC URL Parameters tool has been retired, so I fix this at the source: I configure the site to carry session state in cookies instead of URL parameters. For URLs already indexed, I add a canonical tag pointing to the clean URL (or 301 redirect them) and update internal links to drop the parameter.
A client’s website has no structured data. Which schema types would you prioritize for a local service business?
Answer: 1) LocalBusiness schema with name, address, phone, and hours. 2) Service schema to describe the services offered. 3) Organization schema for the homepage. 4) AggregateRating and Review markup where genuine third-party reviews are displayed on the page – keeping in mind that Google treats purely self-serving reviews on LocalBusiness pages as ineligible for star snippets. I implement everything using JSON-LD; a sketch of the core LocalBusiness block follows.
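A sketch of the core LocalBusiness block, generated as JSON-LD; every business detail is a fictional example, and the output belongs inside a script type="application/ld+json" tag in the page head:

```python
# Build a LocalBusiness JSON-LD block; all business details are fictional examples.
import json

local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Plumbing Co.",
    "url": "https://example.com/",
    "telephone": "+1-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
        "addressRegion": "IL",
        "postalCode": "62701",
        "addressCountry": "US",
    },
    "openingHours": ["Mo-Fr 08:00-18:00", "Sa 09:00-13:00"],
}

# Embed the output in the page as: <script type="application/ld+json"> ... </script>
print(json.dumps(local_business, indent=2))
```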
How do you test if a page is mobile-friendly using command line?
Answer: I use Google’s Mobile-Friendly Test API. I send a GET request with the URL and receive a JSON response indicating if it passes. Alternatively, I use curl to fetch the page and manually inspect for viewport tag, but the API is best.
A site has a high number of “soft 404” errors in GSC. How do you fix them?
Answer: Soft 404s are pages that return 200 OK but have “page not found” content. I locate these pages via GSC. For each, I either: return a proper 404 status code or 301 redirect to relevant content. I also check if a plugin or theme is generating these pages. I fix the underlying cause.
You are asked to write a robots.txt file for a site with 3 sections: public, members only, and admin. How would you write it?
Answer:
User-agent: *
Allow: /public/
Disallow: /members/
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml
The Allow line is optional, since anything not disallowed is crawlable by default. Disallow blocks crawling, not indexing, so the members and admin areas should also sit behind authentication rather than relying on robots.txt. If the members area should appear in search, remove its Disallow line and make sure it exposes no sensitive data.
How do you set up Google Search Console for a site with subdomains and subdirectories?
Answer: For subdomains (blog.example.com), add a separate property for each subdomain. For subdirectories (example.com/blog/), you can add a single domain property (example.com) and it covers all subdirectories. But you can also add a URL prefix property for specific subdirectories if you need separate settings.
A page has both a title tag and an H1 that are identical. Is this bad?
Answer: Not necessarily bad, but not ideal. The H1 can cover the same topic but should be worded slightly differently – for example, title: “Best SEO Tools 2026 – Complete Guide”, H1: “The Ultimate Guide to Best SEO Tools”. Identical text is not a ranking problem, but it wastes an opportunity to work in secondary keywords or a different angle.
How would you identify if a site is being affected by a core update without any manual action?
Answer: I compare ranking and traffic drops with known core update dates using tools like RankRanger or SEMrush Sensor. If the drop coincides within 1-2 days of a confirmed core update, that’s likely the cause. Then I analyze the content of affected pages for E-E-A-T signals. There is no specific fix beyond improving overall quality and waiting for a subsequent update to recognize the improvements.
You are given a sitemap that returns a 404 error. What do you do?
Answer: First, I check if the sitemap file exists in the root directory. If not, I generate a proper sitemap using a CMS plugin or manually. I then upload it to the correct location. I update robots.txt to reference the correct path. I submit the new sitemap via GSC.
A client’s website has a huge number of inbound links with the exact same anchor text “click here”. What risk does this pose?
Answer: A single repeated anchor across a large share of inbound links looks pattern-like, although a generic anchor such as “click here” carries far less risk than exact-match commercial anchors; Google’s link spam systems (formerly Penguin) may simply devalue those links. I would diversify the profile by earning new links with varied, natural anchors. I cannot change existing links, but I can disavow the spammy ones if they appear to be part of a scheme.
How do you simulate a crawl from a different geographic location to test international SEO?
Answer: I can use a VPN or a proxy service to get an IP in the target country, then crawl with a tool like Screaming Frog configured to use that proxy. I can also check localized SERPs by setting the gl and hl parameters on Google search or using a rank tracker with location targeting (GSC’s old International Targeting report has been deprecated). Online tools like GeoPeeker show how a page resolves and renders from different regions.
You find that a page has a canonical tag pointing to a URL that returns a 404. What happens?
Answer: Google will ignore the canonical tag because the target is invalid. The page will be treated as a standalone page, potentially causing duplicate content. I fix by updating the canonical to a valid URL or removing it.
A client’s internal search results pages are getting indexed and causing duplicate content. How do you prevent this?
Answer: I add a noindex tag to all internal search result pages and let Google recrawl them so the tag is seen. Once they have dropped out of the index, I block the search folder in robots.txt (Disallow: /search/) to stop wasting crawl budget – blocking first would hide the noindex. I update the CMS so search URLs are never included in sitemaps, and I make sure no internal links point to search results pages.
How would you prioritize fixing 500 internal server errors across a large site?
Answer: I export the list of 5xx URLs from GSC. I group by the server resource or script causing the error. I prioritize pages that are: 1) high-traffic pages, 2) conversion pages, 3) pages with many backlinks. I work with developers to fix the underlying server issue (e.g., PHP timeout, database connection). I test fixes on staging.
A page has a high number of outbound links (over 200). Does this harm SEO?
Answer: Excess outbound links can dilute PageRank passing to each link, but Google has stated there is no penalty. However, usability may suffer. I review the links: remove irrelevant or low-value ones, combine where possible. For large resource pages, it’s fine.
How do you check if a website is using dynamic rendering correctly?
Answer: I use a user-agent switcher to set my browser to Googlebot. I then visit a page that should be dynamically rendered. I view the source and see if the content is present in HTML (not just JS). I also use GSC URL Inspection and check the “rendered HTML” tab.
You are asked to rank for a keyword that has 0 search volume. Is it worth it?
Answer: Zero search volume in tools often means there is still some traffic (Google doesn’t share all data). It could also be misspelled or very new. I would target it as part of a broader topic cluster, or if it’s a brand term, yes. I also check Google Trends for any interest. If truly zero, I don’t prioritize.
A client wants to remove a page from Google search results quickly. What method do you use?
Answer: The fastest way is the GSC Removals tool, which hides the URL from results for about six months. For permanent removal, I add a noindex tag to the page and let Google recrawl it – and I deliberately do not block the URL in robots.txt, because the block would prevent Google from seeing the noindex. If the page must disappear entirely, I return a 404/410 or put it behind authentication.
How do you check if a competitor is buying backlinks?
Answer: I look for sudden spikes in backlinks from low-authority or irrelevant domains, especially those with commercial anchor text. I check the linking pages: are they full of other paid links? I also use tools like Ahrefs’ “Link Intersect” to see if there are many same low-quality domains linking to them. It’s not certain, but suspicious.
You inherit a site with thousands of broken internal links. How do you efficiently fix them?
Answer: I use a crawler to generate a report of all broken internal links. I group by target URL. For broken links pointing to a page that has a replacement, I update the link. For pages that are gone permanently, I 301 redirect to relevant page. For site-wide footers, I fix once. I automate using find-replace in database.
A page has a canonical tag to itself, but also a noindex tag. Which one takes precedence?
Answer: Noindex takes precedence. Google will not index the page, so the canonical is irrelevant. If you want the page indexed, remove noindex. If you want it not indexed, noindex is enough.
How would you measure the impact of fixing duplicate title tags on the same site?
Answer: I compare the rankings and organic traffic of the affected pages before and after the fix (at least 4 weeks after). I use GSC performance report. I also check if the pages that previously cannibalized each other now have clearer ranking distribution. I document the changes.
A client’s website has a large number of pages with very similar content (e.g., location pages with only city name changed). How do you fix?
Answer: I add unique content to each location page: local landmarks, testimonials from that city, specific services, local phone number, embedded map, and local photos. If resources are limited, I consolidate into a single “service areas” page with a list of cities and links to a few detailed pages for major cities only.
You find that a page’s title tag is dynamically generated and sometimes includes the brand name twice. How do you fix?
Answer: I edit the template logic to remove duplication. I set a rule: if brand name already present in product name (e.g., “Nike shoes”), do not append again. I test the changes on staging before deploying.
How do you ensure that a 410 Gone page is properly removed from Google’s index?
Answer: After the URL starts returning a 410 status, I use GSC URL Inspection to prompt a recrawl (or the Removals tool if it must disappear from results immediately). Once Google recrawls and sees the 410, the page is dropped from the index, usually within days. I also remove the URL from the sitemap and monitor the page indexing report to confirm removal.
A client wants to rank for a keyword that is very competitive but has high commercial value. You have limited budget. What do you advise?
Answer: I advise targeting long-tail variations first (e.g., “buy [product] in [city]” or “best [product] for [specific need]”). I create a cluster of content around the topic to build topical authority. I also focus on on-page optimization and user experience. Over time, as authority grows, we can target the head term. I set realistic expectations (12+ months).
You are asked to do a backlink audit for a site that has 500,000 backlinks. How do you scale the manual review?
Answer: I filter the export in Ahrefs (or a similar tool) by signals of low quality: low DR, little or no organic traffic, irrelevant language or topic, and sitewide or redirect links. I also flag over-optimized anchor text. I manually sample the 100 most suspicious domains rather than every link, then create a domain-level disavow file for those that are clearly spam; the rest of the review is handled by filtering and clustering, as sketched below.
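The filtering pass can be scripted directly over the exported backlink data. A sketch; the CSV filename, column names, and thresholds are all assumptions to adapt to whichever tool’s export is used, and the output still deserves a manual sample check before upload:

```python
# Build a domain-level disavow file from a backlink export.
# "backlinks.csv" and its "Referring Domain"/"DR"/"Domain Traffic" columns are assumed names,
# and the thresholds are judgment calls - adjust both to your export and risk tolerance.
import csv

suspect_domains = set()
with open("backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        domain = row["Referring Domain"].strip().lower()
        dr = float(row.get("DR") or 0)
        traffic = float(row.get("Domain Traffic") or 0)
        if dr < 5 and traffic < 100:      # crude spam heuristic; review a sample manually
            suspect_domains.add(domain)

with open("disavow.txt", "w") as out:
    for domain in sorted(suspect_domains):
        out.write(f"domain:{domain}\n")   # Google's disavow file expects this exact syntax

print(f"Wrote {len(suspect_domains)} domains to disavow.txt - review before uploading in GSC.")
```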
A site’s breadcrumb navigation is implemented but not showing in SERPs. What could be wrong?
Answer: Breadcrumbs in SERPs require BreadcrumbList schema markup. I check if schema is implemented correctly using Rich Results Test. Also, Google may not show them for all queries. I ensure the JSON-LD is valid and the URLs in the breadcrumb are correct and indexable.
How would you set up redirects for a site moving from /old-category/ to /new-category/ while keeping all product URLs under that category?
Answer: I use a regex rule: RedirectMatch 301 ^/old-category/(.*)$ /new-category/$1. This preserves the product slug. I also update internal links and sitemap. I test a few examples before implementing.
A client’s Google Business Profile is suspended. How does this affect their local SEO, and what do you do?
Answer: It severely harms local pack rankings and map visibility. I first identify the suspension reason (quality guidelines). I edit the profile to comply (accurate NAP, no keyword stuffing in business name). I then appeal via GBP support, providing documentation (utility bill, business license). While waiting, I ensure website local SEO is strong.
You have a page that uses rel=next and prev for pagination. Google announced they no longer use these. Should you remove them?
Answer: They are not harmful, but they are ignored. I can remove them to simplify code, but I keep if they help users or if other search engines use them. For Google, I rely on proper internal linking and sitemaps instead.
How do you test if a page’s structured data is valid for Google’s rich results?
Answer: I use Google’s Rich Results Test tool. I enter the URL or code snippet. It will show which rich result types are eligible and any errors or warnings. I also use the Schema Markup Validator for syntax. I then fix issues and retest.
A client’s site has high traffic but low conversions. You suspect the traffic is from low-intent keywords. How do you verify?
Answer: I export organic keyword traffic from GSC and GA4. I manually review the top 50 keywords by clicks. If they are informational (e.g., “what is”, “how to”), not commercial, that confirms low intent. I then develop content targeting commercial keywords (e.g., “buy”, “price”, “best”) and optimize pages accordingly.
What is your final step before launching a major SEO change?
Answer: My final step is to take a full backup of the site (files and database). I also take screenshots of current rankings and traffic. I have a rollback plan. Then I implement the change on staging, test thoroughly, and only then push to production. I monitor GSC and GA4 daily for the next week.