Key Summary
- Topical authority, not individual keyword optimization, is the primary driver of ranking in 2026. Sites that own a subject cluster consistently outrank sites with stronger domain authority on isolated pages.
- AI Overviews pull citations from pages already ranking in positions 1 to 10. You cannot target AI visibility without first earning organic rankings.
- E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) is the quality signal Google uses to evaluate content. Demonstrating real author credentials and first-hand experience directly affects which pages get surfaced.
- Core Web Vitals are a confirmed ranking signal. A failing LCP score caps your rankings regardless of how strong your content is.
- Schema markup for FAQ, Article, and Breadcrumb increases eligibility for rich results and gives AI systems cleanly structured content to cite.
- The single highest-leverage SEO action for most sites is fixing internal linking gaps, not building more backlinks.
SEO in 2026 has two audiences: Google’s ranking algorithm and the AI systems (Google AI Overviews, ChatGPT, Perplexity, Claude) that now answer millions of queries directly without a single click. Most articles about SEO best practices are still written as if only the first audience exists.
This one covers both.
I have used every practice on this list on real sites. Some of them I learned the hard way, after watching traffic drop because I skipped them. The list is organized by category, not by arbitrary ranking, because the priority depends on where your site is right now.
SEO Best Practices for Technical Foundations
Technical SEO is not glamorous, but it is the floor your content stands on. Weak technical foundations mean Google cannot efficiently crawl and index your pages, Core Web Vitals failures cap your rankings, and duplicate content dilutes PageRank away from your best pages. Fix the floor first.
1. Audit and Fix Your Core Web Vitals
Core Web Vitals are not aspirational metrics. They are confirmed Google ranking signals, and a failing score will cost you positions against competitors with equivalent content.
The three metrics that matter are Largest Contentful Paint (LCP), Interaction to Next Paint (INP), and Cumulative Layout Shift (CLS). Google measures these using real-user data from the Chrome User Experience Report (CrUX), not just lab scores from PageSpeed Insights. The distinction matters because your PageSpeed Insights score reflects a single controlled test. The CrUX data in Google Search Console reflects actual user experiences across all devices and connection speeds.
Start with the Core Web Vitals report in Google Search Console under Experience > Core Web Vitals. Filter to “Poor” URLs and sort by the number of affected URLs. That gives you the highest-impact fixes first. For LCP, the most common fix on content sites is preloading the featured image with a <link rel="preload"> tag and serving it in WebP format. For CLS, start with images missing explicit width and height attributes. For INP, audit which plugins run long JavaScript tasks on the main thread and remove or defer the ones not serving a front-end function.
Target passing scores (LCP under 2.5 seconds, INP under 200ms, CLS under 0.1) in the field data, not just the lab score.
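As a quick sanity check, those thresholds can be encoded in a few lines. This is a minimal sketch, assuming you have already pulled 75th-percentile field values (from the CrUX API or a Search Console export) into a plain dict; the field names here are my own shorthand, not an official API shape.

```python
# "Good" thresholds for the three Core Web Vitals, per the targets above.
CWV_THRESHOLDS = {
    "lcp": 2.5,   # Largest Contentful Paint, seconds
    "inp": 200,   # Interaction to Next Paint, milliseconds
    "cls": 0.1,   # Cumulative Layout Shift, unitless
}

def classify_cwv(field_data):
    """Return pass/fail per metric for 75th-percentile field values.
    field_data is an assumed dict like {"lcp": 3.1, "inp": 180, "cls": 0.05}."""
    return {
        metric: field_data[metric] <= limit
        for metric, limit in CWV_THRESHOLDS.items()
        if metric in field_data
    }
```

Run it against each URL group from the Search Console report to get a quick pass/fail map before digging into fixes.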
2. Fix Crawlability Before Everything Else
Googlebot needs to be able to reach your pages efficiently. If your robots.txt is blocking the wrong paths, if your internal link structure creates orphan pages, or if your site has crawl traps (infinite pagination, faceted navigation generating thousands of URLs), you are wasting crawl budget on URLs that will never rank.
Run Screaming Frog on your entire site. Filter for pages that return a 200 status code but are noindexed, have no incoming internal links (orphan pages), or are blocked by robots.txt. Orphan pages receive no PageRank through internal links, which means they start every ranking competition at zero even if their content is strong.
For large sites (above 1,000 pages), upload your server log file to Screaming Frog’s Log File Analyzer and filter for Googlebot. Pages that have not been crawled in 90 days are not being treated as important by Google. That is signal, not noise.
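The orphan-page check itself is scriptable once you have two exports: the full list of URLs you expect to exist (your sitemap) and the internal link edges from a crawl (for example Screaming Frog’s “All Inlinks” report). A minimal sketch, assuming both are already loaded as Python lists:

```python
def find_orphans(all_urls, links):
    """Return URLs from all_urls that no internal link points to.
    links: iterable of (source_url, target_url) pairs from a crawl export.
    Exclude the homepage before calling; it needs no inlinks to matter."""
    targets = {target for _, target in links}
    return sorted(url for url in all_urls if url not in targets)
```

Anything this returns is a page Google can only discover through the sitemap, starting every ranking competition at zero.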
3. Eliminate Duplicate Content at the Source
Duplicate content is not just a penalty concern. It is a PageRank dilution problem. When the same content is accessible at multiple URLs, Google has to decide which version to rank. It usually picks correctly, but “usually” is not good enough when you are fighting for competitive keywords.
The most common duplicate content sources: HTTP vs. HTTPS, www vs. non-www, trailing slash vs. no trailing slash, pagination URLs, category and tag archives that display full post content, and URL parameters added by analytics, sharing, or filter systems. Audit all of these with Screaming Frog by running a full crawl and filtering for “Duplicate Page Titles” and “Duplicate Meta Descriptions,” then tracing each back to whether the duplicate is a canonical URL issue or a content issue.
Set canonical tags on every page. Every page, including paginated ones, needs a self-referencing canonical unless it is intentionally pointing to a different canonical URL.
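The self-referencing canonical rule is easy to verify in bulk. A minimal sketch using only the standard library; the trailing-slash normalization is one possible equivalence policy, not a universal rule, so adjust it to whatever your site treats as canonical.

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the first rel="canonical" href found in the document."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attr = dict(attrs)
        if tag == "link" and attr.get("rel") == "canonical" and self.canonical is None:
            self.canonical = attr.get("href")

def is_self_canonical(page_url, html):
    """True if the page declares a canonical that points back at itself.
    Missing canonicals count as a failure, per the rule above."""
    finder = CanonicalFinder()
    finder.feed(html)
    normalize = lambda u: (u or "").rstrip("/")  # assumed equivalence policy
    return finder.canonical is not None and normalize(finder.canonical) == normalize(page_url)
```

Feed it each URL and its fetched HTML from a crawl export, and flag every False for manual review.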
4. Submit a Clean XML Sitemap
Your XML sitemap is a direct instruction to Google about which pages deserve attention. A sitemap full of noindexed pages, 404s, redirect chains, and thin archives tells Google your site lacks editorial judgment. That reduces the crawl priority given to your legitimate content.
Clean rule: the sitemap should contain only URLs that return a 200 status code, have a self-referencing canonical, and are indexable. Run your sitemap through Screaming Frog (File > Import > Sitemap) and filter for any URL not meeting all three criteria. Remove the ones that fail. Resubmit through Google Search Console. Monitor the “Submitted vs. Indexed” ratio over 30 days. If the ratio drops below 70%, the remaining unindexed pages likely have a content quality problem that the sitemap cannot fix.
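The three-criteria filter can be automated against a raw sitemap file. A sketch assuming you have already collected status codes, indexability, and canonical checks into dicts keyed by URL (for example from a Screaming Frog export); the dict shapes are my assumption, not a tool’s native format.

```python
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract every <loc> value from a standard urlset sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(f"{NS}loc")]

def clean_sitemap(xml_text, status, indexable, self_canonical):
    """Keep only URLs meeting all three criteria: 200 status,
    indexable, and self-referencing canonical."""
    return [
        url for url in sitemap_urls(xml_text)
        if status.get(url) == 200
        and indexable.get(url)
        and self_canonical.get(url)
    ]
```

Regenerate the sitemap from the cleaned list rather than hand-editing the XML.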
5. Move to HTTPS and Verify It Is Actually Correct
HTTPS is a confirmed (minor) ranking signal and a hard trust requirement for any site asking users to enter data. But just having an SSL certificate is not enough. Verify the full setup.
Check that HTTP to HTTPS redirects are 301s, not 302s. A 302 signals a temporary move and can delay Google consolidating signals on the HTTPS version. Check for mixed content warnings in the browser console (resources loading over HTTP on an HTTPS page). Verify your TLS configuration grade at SSL Labs (ssllabs.com/ssltest). An A or A+ grade means the configuration is correct. A B or below usually indicates deprecated TLS versions or weak cipher suites, which modern browsers flag as security warnings.
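The redirect checks can be scripted against a recorded redirect chain. This sketch takes the chain as a list of (status_code, url) hops you have already captured (for example with curl -IL); it does not make network requests itself.

```python
def audit_redirect_chain(chain):
    """chain: ordered list of (status_code, url) hops, ending at the final 200.
    Flags temporary redirects and multi-hop chains, per the checks above."""
    issues = []
    redirects = [(status, url) for status, url in chain if 300 <= status < 400]
    for status, url in redirects:
        if status != 301:
            issues.append(f"{status} (temporary) redirect at {url}; use a 301")
    if len(redirects) > 1:
        issues.append(f"redirect chain of {len(redirects)} hops; collapse to one 301")
    return issues
```

An empty list means the HTTP-to-HTTPS hop is a single permanent redirect, which is the target state.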
On-Page SEO Best Practices That Move Rankings
On-page SEO in 2026 is not about keyword stuffing or hitting a specific keyword density. It is about writing pages that directly and completely answer the search intent, then structuring them so Google and AI systems can parse and cite the content clearly.
6. Match Your Content to the Full Search Intent
Search intent is the actual goal behind a query. Google classifies intent as informational, navigational, commercial, or transactional, but the useful classification for content is more specific: what does the searcher need to do or decide after reading your page?
A query like “best CRM for small business” has commercial intent, but the real intent is comparison and evaluation before a purchase decision. Content that only defines what a CRM is fails this intent even if it is well-written. Content that compares specific tools with specific criteria, real pricing, and honest trade-offs fully serves it.
Before writing any page, look at the top 5 results for your target keyword. Identify what content format they use (list, guide, comparison, definition), what specific questions they answer, and where they stop short. The gap between where the top results stop and what the searcher still needs to know after reading them is your content opportunity.
7. Use the Right Keyword Placement, Not Just Frequency
The primary keyword belongs in five places: the title tag, the H1 heading, the first 100 words of the body, at least one H2 subheading, and the meta description. These are the locations Google weights most heavily for keyword relevance signals.
Beyond those five locations, keyword frequency matters much less than semantic coverage. A page about “project management software” that also covers “task tracking,” “team collaboration tools,” “Gantt chart software,” and “Agile project management” will outrank a page that repeats “project management software” 20 times.
Use Ahrefs Keywords Explorer or Semrush’s Keyword Magic Tool to find the secondary and semantically related keywords for your primary term (often loosely called LSI keywords, though Google does not actually use Latent Semantic Indexing). In Ahrefs, enter your primary keyword, click “Related terms,” and filter for keywords that appear in the top-ranking pages. These are the terms Google associates with full coverage of the topic.
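The five-placement rule from this section is mechanical enough to check in code. A sketch; the page dict’s field names are assumptions about your crawl export, and the substring match is deliberately naive (it will not catch reworded variants of the keyword).

```python
def keyword_placement(keyword, page):
    """Check the five placements above for a primary keyword.
    page: assumed dict with 'title', 'h1', 'h2s' (list), 'meta', 'body'."""
    kw = keyword.lower()
    first_100 = " ".join(page["body"].split()[:100]).lower()
    return {
        "title": kw in page["title"].lower(),
        "h1": kw in page["h1"].lower(),
        "first_100_words": kw in first_100,
        "h2": any(kw in heading.lower() for heading in page["h2s"]),
        "meta_description": kw in page["meta"].lower(),
    }
```

Any False in the result is a placement to fix; remember that semantic coverage, not repetition, does the rest of the work.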
8. Write Title Tags That Get Clicked
The title tag is the most-seen piece of content your site produces. Every searcher sees it before they see a single word of your article. A title that communicates clear value and relevance generates a higher click-through rate, which reinforces positive ranking signals.
Keep title tags under 60 characters to avoid truncation in SERPs. Lead with the primary keyword when the sentence structure allows it naturally. Include a specific benefit or differentiator when possible: “10 Best Project Management Tools (Compared for Teams Under 20)” outperforms “Best Project Management Tools” because it signals specificity and the searcher can self-qualify before clicking.
Avoid writing title tags that sound like they were written for a keyword report. “Project Management Software: A Complete Guide to Project Management Tools in 2026” has two redundant keyword insertions and no click reason. “Project Management Software for Small Teams: What Actually Works” has one keyword placement and a clear point of view.
9. Optimize Meta Descriptions as Click Copy
Meta descriptions are not a direct ranking signal, but they are direct click-through rate copy. Google does not always display your meta description (it rewrites it for roughly 60% of queries), but when it does display your version, it functions as an ad.
Write meta descriptions that complete the searcher’s thought: identify the problem, state what the page resolves, and give one concrete reason to click. Keep them between 150 and 160 characters. Include the primary keyword naturally (Google bolds it in the SERP when it matches the query). End with an implicit call to action, not a literal “Click here.”
Bad: “Learn about SEO best practices in this article. We cover everything you need to know about SEO in 2026.”
Better: “20 SEO best practices that actually improve rankings and AI visibility in 2026. Covers technical, on-page, content, and GEO tactics.”
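The length targets for titles and meta descriptions can be linted before publishing. One caveat, noted in the comments: Google truncates by pixel width, not character count, so these character limits are a practical proxy rather than a guarantee.

```python
TITLE_MAX = 60          # character proxy for SERP title truncation
DESC_RANGE = (150, 160) # target meta description length from the section above

def snippet_audit(title, description):
    """Flag titles and meta descriptions outside the target lengths.
    Character counts approximate Google's pixel-width truncation."""
    issues = []
    if len(title) > TITLE_MAX:
        issues.append(f"title is {len(title)} chars; likely truncated past {TITLE_MAX}")
    low, high = DESC_RANGE
    if not low <= len(description) <= high:
        issues.append(f"description is {len(description)} chars; target {low}-{high}")
    return issues
```

Run it over every page in a crawl export to find the snippets worth rewriting first.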
10. Structure Headings for Human Readers and AI Parsers
Heading structure (H1, H2, H3) serves three purposes simultaneously: it helps human readers scan the page, it signals content hierarchy to Google’s crawlers, and it creates the labeled sections that AI systems extract when generating summaries and citations.
One H1 per page. It should match or closely mirror the title tag. H2s mark major topic sections. H3s subdivide each major section into specific subtopics. Do not skip heading levels (going from H2 directly to H4) and do not use headings for decorative styling.
For AI visibility, the heading structure is particularly important. When Google’s AI Overviews or other AI systems pull information from your page, they typically identify a heading, then extract the first one to three sentences beneath it as the answer. Write the first sentence under every H2 and H3 as a complete, self-contained answer to the implicit question the heading poses. The reader should be able to read just the heading and the first sentence and understand the point without reading further.
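The hierarchy rules (one H1, no skipped levels) are easy to lint if you extract headings as an ordered list of (level, text) pairs, for example from your CMS or a crawl export. A minimal sketch:

```python
def audit_headings(headings):
    """headings: ordered list of (level, text), e.g. [(1, "Title"), (2, "Intro")].
    Flags multiple (or missing) H1s and skipped levels, per the rules above."""
    issues = []
    h1_count = sum(1 for level, _ in headings if level == 1)
    if h1_count != 1:
        issues.append(f"expected exactly one H1, found {h1_count}")
    previous = 0
    for level, text in headings:
        if previous and level > previous + 1:
            issues.append(f"skipped level before H{level}: {text!r}")
        previous = level
    return issues
```

An empty result means the outline is structurally sound; it says nothing about whether the first sentence under each heading actually answers the heading’s implicit question, which still needs a human pass.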
Content SEO Best Practices for Topical Authority
Topical authority is the single most durable ranking advantage available in 2026. A site that thoroughly covers a subject area outranks sites with higher domain authority on a consistent basis, because Google trusts it as the authoritative source on that topic. Building topical authority is a content strategy, not a content volume play.
11. Build Topic Clusters, Not Isolated Pages
A topic cluster is a group of pages organized around one central “pillar” page and multiple “cluster” pages, each covering a specific subtopic in depth. The pillar page targets the broad head term. The cluster pages target long-tail variants and related questions. All cluster pages link back to the pillar, and the pillar links to each cluster.
This structure does three things: it signals to Google that your site covers the topic comprehensively, it concentrates PageRank on the pillar page through internal links, and it answers the full range of questions a searcher might have at different stages of their research.
A practical example from my work: a site targeting “eCommerce SEO” as a pillar topic had separate cluster pages for “Shopify SEO,” “WooCommerce technical SEO,” “product page optimization,” “category page SEO,” “eCommerce schema markup,” and “site speed for online stores.” Each cluster page linked to the pillar. After building out the cluster over four months, the pillar page moved from position 14 to position 3 for “eCommerce SEO” without a single new backlink.
Start building clusters in Ahrefs by entering your primary topic in Keywords Explorer, clicking “Matching Terms,” and grouping keywords by Parent Topic. That grouping shows you the natural cluster structure Google already perceives around your topic.
12. Cover Every Stage of the Searcher’s Journey
Most SEO content covers the awareness stage (informational content explaining what something is) and ignores the evaluation and decision stages. Sites that cover all three stages rank more pages, generate more internal link opportunities, and capture more converting traffic.
The three stages for any topic: awareness (what is it, how does it work, why does it matter), evaluation (best options, comparisons, pros and cons, what to look for), and decision (specific product/service reviews, pricing, how to get started).
A site about email marketing software that only has “What is email marketing” and “best email marketing software” articles is leaving the evaluation and decision stages open for competitors. Adding pages like “Mailchimp vs. Klaviyo for eCommerce,” “email marketing software pricing comparison,” and “how to switch email providers without losing your list” fills those gaps and creates a content moat.
13. Update High-Potential Pages Before Creating New Ones
Content decay is real. A page that ranked in position 4 eighteen months ago and has since dropped to position 9 is generating roughly 60% less organic traffic than it was. The cause is usually one of three things: competitors published stronger content, the search landscape shifted and your page no longer matches current intent, or the page simply shows age signals (outdated statistics, references to old tools, publication date in the title showing a year that has passed).
Before creating new content, run a monthly audit of pages that have dropped 3 or more positions in the past 90 days. Use Google Search Console’s Performance report filtered by page, sorted by change in clicks over the past 3 months vs. the prior 3 months. Pages showing significant click decline without a ranking change have a CTR problem (fix the title and meta description). Pages showing both position and click decline need a content refresh.
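The triage logic in that audit reduces to two deltas per page. A sketch; the thresholds (any decline at all) are deliberately strict, and in practice you would add a minimum-clicks floor to ignore noise on low-traffic pages.

```python
def classify_decline(position_delta, click_delta):
    """Triage a page from the decay audit above.
    position_delta: new rank minus old rank (positive = dropped).
    click_delta: clicks vs. the prior period (negative = fewer clicks)."""
    if click_delta < 0 and position_delta <= 0:
        return "ctr_problem"      # ranking held; rewrite title and meta description
    if click_delta < 0 and position_delta > 0:
        return "content_refresh"  # ranking and clicks both fell; refresh the content
    return "healthy"
```

Apply it to the Search Console export and work the "content_refresh" bucket first, since those pages are losing both visibility and clicks.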
When refreshing, do not just add new paragraphs to the existing content. Rewrite sections that no longer match search intent. Update statistics and tool references. Add depth to sections where competitors now outperform you. Update the publication date only after making substantive changes.
14. Demonstrate E-E-A-T in Every Piece of Content
E-E-A-T stands for Experience, Expertise, Authoritativeness, and Trustworthiness. Google’s quality raters use it to evaluate content, and it is the framework Google uses when assessing whether a page deserves to rank in its category.
Experience is the newest addition to the framework. It refers to first-hand experience with the subject. A post about “best trail running shoes” written by someone who has actually run trails carries more experiential authority than one written from research alone. Demonstrate experience by including specific personal examples, actual results, or first-person observations that could only come from doing the thing.
Expertise signals include: named authors with credentials visible on the page, author pages linking to external profiles (LinkedIn, industry publications, conference talks), citations of primary sources (original research, official documentation, authoritative external sources), and content depth that goes beyond what a non-practitioner could produce.
Authoritativeness is largely built off-page through backlinks, brand mentions, and co-citations with authoritative sources in your niche. But on-page, it is reinforced through how you reference and build on existing knowledge in the field.
Trustworthiness covers technical trust signals (HTTPS, privacy policy, clear contact information, editorial policies) and content trust signals (accurate facts, honest trade-offs, no misleading headlines).
Link Building Best Practices That Still Work
Backlinks remain one of Google’s strongest ranking signals. The sites I have grown from zero to significant organic traffic all have one thing in common: a deliberate link acquisition strategy, not a “publish great content and the links will come” approach.
15. Earn Links Through Original Research and Data
The most consistent link magnet I have used is original data. Surveys, original studies, dataset analyses, and proprietary research attract links from journalists, bloggers, and industry publications who need a citable source for the claims they make.
The data does not need to be a massive study. A survey of 200 practitioners in your niche about a specific challenge produces citable findings that no one else has. A publicly available dataset analyzed in a way that produces a new insight is link-worthy. A competitive analysis comparing 50 websites on a specific metric produces concrete findings others will reference.
After publishing original research, proactively reach out to the journalists and bloggers who cover that topic. A direct email with the key finding and a link to the full data is more effective than waiting for them to discover it. Platforms like Qwoted (and the newer successors to HARO, which shut down in 2024) connect you to journalists actively seeking sources, which is a faster path to earned editorial mentions.
16. Fix Unlinked Brand Mentions
Unlinked brand mentions are references to your brand, site, or content that do not include a link. They are the easiest link acquisition targets because the publisher already knows you exist and already values your work enough to mention it. Asking for a link conversion is a much lower-friction request than cold outreach.
Use Ahrefs Content Explorer or Google Alerts to find mentions of your brand name without a corresponding backlink. In Ahrefs Content Explorer, search for your brand name and filter to “One page per domain,” then export the list. Cross-reference against your backlink profile to find mentions that are not yet links. Reach out with a brief, direct email referencing the specific mention and requesting a link attribution.
17. Build Internal Links Systematically
Internal links are the most underused SEO tactic. Every internal link does two things: it passes PageRank from the linking page to the linked page, and it signals to Google the semantic relationship between the two pages.
The mistake most sites make is building internal links ad hoc while writing new content, which concentrates links in recently published pages and leaves older, high-authority pages as untapped reservoirs of PageRank.
Build internal links systematically by doing a quarterly internal link audit. In Screaming Frog, run a crawl and look at the “Inlinks” count for each page. Pages with low inlinks that have high organic potential (they rank between position 10 and 20, meaning they are close to the first page) are prime candidates for internal link addition. Find your 5 to 10 most-linked pages and add contextual links from them to the underlinked target pages.
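The candidate selection described above can be expressed directly. The dict keys are a hypothetical export format (URL, average position, inlink count from the crawl), and the inlink cutoff of 5 is an arbitrary starting point rather than a rule.

```python
def internal_link_candidates(pages, max_inlinks=5):
    """Return striking-distance pages (position 10-20) starved of inlinks.
    pages: assumed list of dicts with 'url', 'position', 'inlinks' keys."""
    return sorted(
        (page for page in pages
         if 10 <= page["position"] <= 20 and page["inlinks"] <= max_inlinks),
        key=lambda page: page["position"],
    )
```

Each page it returns is a target for contextual links from your most-linked pages, ordered by how close it already is to the first page.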
AI Visibility and GEO Best Practices
AI visibility is not a separate strategy from SEO. It is an extension of it. Google’s AI Overviews pull from pages already in positions 1 to 10. ChatGPT and Perplexity cite sources they retrieve through search. The path to AI citation is through organic ranking, and the path to organic ranking is through the practices above.
That said, there are specific structural and content choices that make your pages more likely to be cited by AI systems when they have the organic ranking to be eligible.
18. Structure Content for AI Extraction
AI systems extract information by identifying a heading, then pulling the most information-dense sentences immediately following it. Pages written with vague, buildup-heavy introductions to each section are skipped in favor of pages where the answer appears in the first sentence.
The structure that gets cited: H2 or H3 heading that names the specific topic, followed immediately by a sentence that fully answers the implicit question, followed by explanation and examples. Every section should be able to stand alone as a complete answer if extracted out of context.
This structure also increases your chances of earning featured snippets, which appear at the top of the SERP above organic results. Featured snippet content almost always comes from a page ranking in positions 1 to 5 and uses the answer-first format with a clear heading above it.
19. Add Schema Markup for FAQ, Article, and Breadcrumbs
Schema markup is structured data in the page’s HTML that explicitly tells Google and AI systems what type of content each page contains. It directly affects eligibility for rich results in Google SERPs and increases the confidence AI systems have when extracting and citing information.
The three schema types that deliver the most SEO and AI visibility value in 2026 are FAQ schema (Google now shows FAQ rich results only for a narrow set of authoritative sites, but the markup still hands AI systems clean Q&A pairs to cite), Article schema (signals publication date, author, and content type for editorial content), and BreadcrumbList schema (enables breadcrumb display in search results and signals site hierarchy to both Google and AI indexing systems).
Validate your schema at Google’s Rich Results Test (search.google.com/test/rich-results) after adding it. Errors in schema do not break your page, but they prevent the schema from qualifying for rich result display.
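For FAQ markup specifically, generating the JSON-LD programmatically avoids hand-editing errors. This sketch emits the schema.org FAQPage structure; paste the output into a script tag with type "application/ld+json" and run the page through the Rich Results Test as described above.

```python
import json

def faq_jsonld(pairs):
    """Build FAQPage JSON-LD from (question, answer) tuples,
    following the schema.org FAQPage structure."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in pairs
        ],
    }, indent=2)
```

Keeping the markup generated from the same source as the visible FAQ content also guarantees the two never drift apart, which is a structured-data guideline in its own right.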
20. Build Topical Authority That Makes You the Default Source
The highest-leverage AI visibility strategy is also the highest-leverage organic SEO strategy: become the most thorough, most cited, most credible source on a specific topic.
AI systems are trained on data from across the web. Sites that appear repeatedly as cited sources in their domain, that are referenced by other authoritative sources, and that have comprehensive coverage of a subject become the default citation source for AI-generated answers about that topic. This is not a tactic. It is the result of executing all the practices above consistently over 12 to 18 months.
I have seen this play out practically. A client site focused on Shopify SEO that built out 40-plus topic cluster pages, earned links from major eCommerce publications, and maintained E-E-A-T signals across every page now gets cited by AI Overviews for Shopify SEO queries at a rate disproportionate to its domain authority. The citations come from depth, specificity, and repeated presence across the topic, not from any single AI-targeting tactic.
Wrapping It Up
The 20 SEO best practices above are not a checklist to complete once. They are an ongoing operational standard.
If you are starting from scratch, prioritize in this order: fix technical foundations (practices 1 to 5), then build topical authority through content clusters (practices 11 to 14), then build links systematically (practices 15 to 17), then layer on AI visibility signals (practices 18 to 20). On-page optimization (practices 6 to 10) applies to every piece of content you create, starting from day one.
The sites I have seen grow fastest are not the ones that found a shortcut. They are the ones that executed these fundamentals without skipping steps.
Frequently Asked Questions
What are SEO best practices?
SEO best practices are the specific techniques and strategies that improve a website’s ability to rank in search engine results pages. They include technical site health (crawlability, page speed, HTTPS), on-page optimization (title tags, heading structure, keyword placement), content strategy (topical authority, search intent matching, E-E-A-T signals), link building (earning authoritative backlinks and converting unlinked mentions), and structured data (schema markup for rich results).
How many SEO best practices should I focus on at once?
Focus on one category at a time rather than all 20 practices simultaneously. Start with technical SEO because errors there affect the ability of all other work to produce results. Once your technical foundation is solid, move to content and on-page optimization. Address link building in parallel with content creation. Trying to fix everything at once spreads attention too thin and makes it difficult to attribute which changes produced which results.
Do SEO best practices still apply when trying to rank in AI Overviews?
Yes. Google’s AI Overviews pull citations from pages already ranking in positions 1 to 10 for the given query. You cannot appear in an AI Overview for a query where you do not have an organic ranking. Standard SEO best practices, specifically topical authority, strong E-E-A-T signals, and answer-first content structure, are the primary factors that determine both organic rankings and AI Overview citation eligibility.
How long does it take for SEO best practices to improve rankings?
Technical SEO fixes typically show results in 2 to 8 weeks after Google recrawls the affected pages. On-page optimization changes show results in 4 to 12 weeks. Content additions and refreshes take 2 to 6 months to fully register. Link building impact varies by the authority of the links acquired, but expect 3 to 6 months for significant movement on competitive keywords. Topical authority accrues over 12 to 18 months of consistent content production and link acquisition.
What is the difference between on-page SEO and technical SEO best practices?
Technical SEO covers the infrastructure of the site: crawlability, page speed, HTTPS, site architecture, XML sitemaps, canonical tags, and structured data. On-page SEO covers the content and HTML elements on individual pages: title tags, meta descriptions, heading structure, keyword placement, and content quality. Both are required. Technical SEO ensures Google can find and index your pages. On-page SEO determines how relevant and authoritative those pages appear for target queries.
Which SEO best practice has the highest ROI for most websites?
Internal linking has the highest ROI for most established websites because it is free, fully within your control, and directly redistributes PageRank to underperforming pages that are close to ranking on the first page. For new sites, the highest ROI practice is building topical authority through topic clusters rather than targeting isolated high-competition keywords. Both practices consistently outperform the time spent on link outreach campaigns for sites that have not already done them systematically.
Are SEO best practices different for AI search engines like Perplexity and ChatGPT?
The fundamentals are the same: authoritative, factually accurate, clearly structured content that covers a topic thoroughly. The difference is that AI search engines weight citation frequency (how often a source is referenced in other content), entity clarity (how clearly the page identifies who wrote it, when, and why it is authoritative), and answer density (how much useful information is present per paragraph). Pages optimized for Google using E-E-A-T principles, schema markup, and answer-first structure are naturally well-positioned for AI search citation.