TL;DR
- Technical SEO is still the gate. If Google can't crawl, render, and trust your site, nothing you write will rank — and no LLM will cite it.
- JavaScript rendering is still the #1 failure mode we see on B2B sites. Most teams don't know they have the problem.
- Core Web Vitals are worth the investment in 2026 — not because of direct ranking impact, but because they correlate with every downstream metric that matters.
- Schema is no longer optional. It's how both Google and LLMs parse your pages with confidence.
Why technical SEO still matters in 2026
A conversation we have a lot in 2026: "Does technical SEO even matter anymore? The game is AEO now." It's a fair question with an obvious answer. Every answer engine that cites you — ChatGPT, Claude, Perplexity, Gemini — ultimately depends on crawled, indexed, parseable source pages. Your technical foundation is the substrate underneath both SEO and AEO. If it's broken, neither surface will work.
Over the last 24 months we've audited 50+ B2B websites. Roughly 91% had at least two of the five failures below. Roughly 40% had all five. None of them are exotic. All of them are fixable in under a quarter.
Failure 1: JavaScript rendering gaps
Most B2B sites built in the last six years run on React, Vue, or a similar client-side framework. Without proper server-side rendering (SSR), static generation (SSG), or at minimum a robust hydration pattern, Googlebot gets served a blank shell and the LLM crawlers get the same — or worse.
The symptom: you publish a new page, link to it internally, confirm it's live in the browser — and it sits at position 50+ for a month with no organic traffic. Your SEO team is mystified. Googlebot has rendered the page's header and footer but never loaded the product content, because the framework bundle blew past the rendering timeout.
How to diagnose in 10 minutes
- Pick a product page on your site.
- View source (not "inspect"). Search for a product-specific phrase.
- If it's not in the raw HTML, it's being injected by JavaScript at runtime.
- Run the page through Google's Rich Results Test or the URL Inspection tool in Search Console. Look at the rendered HTML.
- If the rendered HTML is missing your main content, you have a rendering problem.
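The raw-HTML half of that check is easy to script across a list of pages. A minimal sketch in Python using only the standard library; the URL and product phrase below are placeholders, and the check is deliberately naive (a case-insensitive substring match against the HTML as served, before any JavaScript runs):

```python
import urllib.request

def phrase_in_raw_html(html: str, phrase: str) -> bool:
    """True if the phrase appears in the HTML as served (pre-JavaScript)."""
    return phrase.lower() in html.lower()

def fetch_raw_html(url: str, user_agent: str = "Mozilla/5.0") -> str:
    """Fetch the initial HTML response -- what a crawler sees before rendering."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

# Hypothetical usage -- a page whose product copy is injected client-side
# will fail this check even though the content is visible in a browser:
# raw = fetch_raw_html("https://example.com/products/widget")
# print(phrase_in_raw_html(raw, "industrial-grade widget"))
```

If the phrase is missing from the raw response but present in the browser, you have found the rendering gap without opening devtools.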
Rule of thumb
Any content that matters for ranking or citation — H1, body copy, product specifications, pricing signals, schema — should exist in the initial HTML response, not injected by JS after hydration. SSR or SSG solves this. Client-side rendering alone does not.
Failure 2: Core Web Vitals slipping year-over-year
CWV isn't a magic ranking factor the way some agencies still pitch it. But it correlates with nearly every downstream metric that matters: bounce rate, conversion, crawl frequency, and, increasingly, LLM ingest cadence. Fast pages get re-crawled more often, which means fresh content gets into the index and into LLM training windows faster.
The problem: most sites were fast at launch and have been slowing ever since. Marketing teams ship tags. Product teams ship new embeds. A/B testing libraries pile up. The site that had an LCP of 1.8s at launch is at 4.2s two years later — and nobody on the team has noticed.
The fix, prioritized
- Audit third-party scripts. Cut anything unused. Load the rest async / defer. Heatmaps, session recorders, A/B tools, ad pixels — each one adds 50–300ms of tax.
- Self-host fonts and critical images. Subset fonts to the glyphs you actually use.
- Fix layout shift. Reserve space for images, embeds, and ads. Most CLS issues come down to a missing width/height attribute.
- Serve from the edge. Vercel, Cloudflare, Netlify — any of them largely fixes TTFB for global traffic.
- Instrument CWV in production. Field data from real users is the only metric that matters. Lab data lies.
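To make "instrument CWV in production" actionable, field data has to be judged against thresholds. A sketch in Python; the bands are Google's published good/needs-improvement/poor thresholds at the time of writing (verify against web.dev before wiring this into alerting):

```python
# Google's published Core Web Vitals bands: (good ceiling, poor floor).
THRESHOLDS = {
    "lcp_ms": (2500, 4000),   # Largest Contentful Paint
    "inp_ms": (200, 500),     # Interaction to Next Paint
    "cls":    (0.1, 0.25),    # Cumulative Layout Shift (unitless)
}

def classify(metric: str, p75: float) -> str:
    """Classify a 75th-percentile field value against the CWV bands."""
    good, poor = THRESHOLDS[metric]
    if p75 <= good:
        return "good"
    if p75 <= poor:
        return "needs improvement"
    return "poor"

def cwv_budget_check(field_p75: dict) -> list[str]:
    """Return the metrics failing the 'good' band -- your prioritized fix list."""
    return [m for m, v in field_p75.items() if classify(m, v) != "good"]
```

The 4.2s-LCP site from above classifies as "poor" (`classify("lcp_ms", 4200)`), while its 1.8s launch value was "good" — the kind of silent drift this check exists to surface.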
Failure 3: Missing or broken schema
Schema markup (JSON-LD following schema.org) is how you explicitly tell Google and LLMs what your page is: an article, a product, an FAQ, an organization, a how-to. Without it, both surfaces guess. They guess wrong more often than you'd expect.
On the B2B sites we audit, schema failures fall into three categories:
- Missing entirely. The site has no schema at all — common on older WordPress installs and custom CMS builds.
- Invalid. Schema present but failing validation — typos, missing required fields, wrong type nesting. These are sometimes worse than no schema, because they confuse crawlers.
- Static and stale. Schema manually written once, never updated as the page content changed. Now the schema claims the article was published in 2022 while the page shows a 2025 update.
The minimum viable schema stack
- Organization on every page.
- WebSite with SearchAction if you have internal search.
- BreadcrumbList on every non-home page.
- Article on blog posts and news.
- Product on product pages (with Offer, AggregateRating where relevant).
- FAQPage on pages with genuine Q&A content.
- HowTo where applicable (step-by-step content).
Generate all of it from your CMS rather than hand-crafting it. Validate it in CI. Test a sample of pages weekly with the Rich Results Test. This is the kind of maintenance work that sounds boring but compounds quietly over a year.
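Generating from the CMS and validating in CI can share the same code path. A minimal sketch in Python for the Article case; the property names (`headline`, `datePublished`, `dateModified`, `author`) are real schema.org properties, but the required-field set here is a deliberate simplification of Google's fuller guidelines:

```python
import json
from datetime import date

def article_schema(headline: str, published: date,
                   modified: date, author: str) -> dict:
    """Build Article JSON-LD from CMS fields, so it can never drift from the page."""
    return {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "datePublished": published.isoformat(),
        "dateModified": modified.isoformat(),
        "author": {"@type": "Person", "name": author},
    }

REQUIRED = {"@context", "@type", "headline", "datePublished"}

def validate(schema: dict) -> list[str]:
    """Minimal CI gate: missing required fields, plus the stale-date failure mode."""
    errors = [f"missing {f}" for f in sorted(REQUIRED - schema.keys())]
    pub, mod = schema.get("datePublished"), schema.get("dateModified")
    if pub and mod and mod < pub:  # ISO dates compare correctly as strings
        errors.append("dateModified precedes datePublished")
    return errors
```

Fail the deploy when `validate` returns errors, and the "schema says 2022, page says 2025" failure from category three becomes impossible to ship.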
Failure 4: Crawl budget waste
Your site has a crawl budget — the number of pages Googlebot (and LLM crawlers) will fetch in a given window. On most B2B sites, 30–60% of that budget is wasted on pages that shouldn't be crawled at all: faceted search URLs with every filter combination, pagination tail pages with no unique content, admin or staging URLs accidentally public, UTM-parameterized duplicates.
"Most of your crawl budget is being spent on pages you wouldn't want to rank. Fix that, and your important pages get crawled twice as often — without adding authority."
Diagnostics
- Pull last 30 days of log files. Filter to Googlebot / Bingbot / OpenAI / Anthropic user-agents.
- Group by URL pattern. Sort by crawl frequency.
- For every top-crawled pattern, ask: "Do I want this ranked?" If the answer is no, you're leaking crawl budget.
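The grouping step is where teams usually stall, so here is a minimal sketch in Python for combined-format access logs. The bot user-agent tokens were accurate at the time of writing but change often; verify them against each vendor's crawler documentation, and expect to extend `url_pattern` with your own site's URL shapes:

```python
import re
from collections import Counter

# User-agent tokens to keep -- verify against current vendor docs.
BOTS = {"Googlebot", "bingbot", "GPTBot", "OAI-SearchBot", "ClaudeBot"}

def url_pattern(path: str) -> str:
    """Collapse URLs into patterns so parameterized and ID'd pages group together."""
    path = re.sub(r"\?.*$", "?<params>", path)      # any query string
    path = re.sub(r"/\d+(/|$)", r"/<id>\1", path)   # numeric path IDs
    return path

def crawl_profile(log_lines: list[str]) -> Counter:
    """Count bot hits per URL pattern from combined-format access log lines."""
    hits = Counter()
    for line in log_lines:
        m = re.search(r'"(?:GET|HEAD) (\S+)', line)
        if m and any(bot in line for bot in BOTS):
            hits[url_pattern(m.group(1))] += 1
    return hits
```

Sort the resulting counter with `.most_common()` and walk the top patterns asking the "do I want this ranked?" question from step three.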
The fixes
- robots.txt for patterns that should never be crawled (faceted search, filters, internal search results).
- noindex for pages that must be crawlable (for links) but shouldn't rank.
- Canonical tags on parameterized URLs pointing to the clean version.
- XML sitemaps segmented by priority — critical pages in one, long-tail in another — so you can watch crawl behavior on each bucket separately.
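The first and last fixes can be sketched together in a robots.txt fragment. The paths below are hypothetical examples, not recommendations for any particular site; note also that `*` wildcards in Disallow rules are honored by the major crawlers but are an extension, not part of the original robots.txt standard:

```
# Hypothetical patterns -- adapt to your own URL structure.
User-agent: *
Disallow: /search          # internal search results
Disallow: /*?color=        # one faceted-filter parameter per line
Disallow: /*?utm_          # tracking-parameter duplicates

# Segmented sitemaps so crawl behavior is observable per bucket.
Sitemap: https://example.com/sitemap-critical.xml
Sitemap: https://example.com/sitemap-longtail.xml
```

Keep noindex and canonical out of robots.txt: a page blocked by robots.txt is never fetched, so a noindex tag or canonical link on it will never be seen.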
Failure 5: Internal linking as an afterthought
Internal linking is the single most under-invested technical SEO lever on most B2B sites. The team spent six months building the new flagship page and left it effectively orphaned: reachable from the main nav but nowhere else in the content.
Google's PageRank algorithm is ancient by internet standards, but its logic still holds: pages gain authority from the pages that link to them internally. LLM crawlers use internal links similarly, as confidence signals about which pages are most important.
The internal linking audit
- Identify your 10 most important pages (revenue pages, priority product pages).
- For each one, count how many internal links point to it — excluding the main nav and footer.
- If any are under 10 internal links, you have a linking gap.
- Fix it by: adding contextual links from related blog posts, adding "related content" modules, adding one-sentence context links within longer pages.
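The counting step of that audit is automatable. A simplified sketch in Python using the stdlib HTML parser; it treats anything inside `<nav>` or `<footer>` as chrome to exclude, counts only root-relative or same-site links, and the `site_root` value is a placeholder:

```python
from collections import Counter
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Collect internal <a href> links, ignoring anything inside <nav>/<footer>."""
    def __init__(self, site_root: str):
        super().__init__()
        self.site_root, self.chrome_depth, self.links = site_root, 0, []

    def handle_starttag(self, tag, attrs):
        if tag in ("nav", "footer"):
            self.chrome_depth += 1
        elif tag == "a" and self.chrome_depth == 0:
            href = dict(attrs).get("href", "")
            if href.startswith("/") or href.startswith(self.site_root):
                self.links.append(href)

    def handle_endtag(self, tag):
        if tag in ("nav", "footer"):
            self.chrome_depth -= 1

def contextual_links(html: str, site_root: str = "https://example.com") -> list[str]:
    parser = LinkCounter(site_root)
    parser.feed(html)
    return parser.links

def inbound_counts(pages: dict[str, str]) -> Counter:
    """pages maps path -> HTML. Returns contextual inbound links per target path."""
    counts = Counter()
    for source, html in pages.items():
        for target in set(contextual_links(html)):
            if target != source:
                counts[target] += 1
    return counts
```

Run `inbound_counts` over a crawl of your own HTML and compare each priority page's count against the under-10 threshold from step three.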
The fix sequence
If you're looking at this list and wondering where to start, the order that works in practice is:
- JavaScript rendering. Fix this first. Everything else is wasted if Google can't see your content.
- Schema. The cheapest high-impact win. Ship in two weeks via your CMS.
- Crawl budget. Robots.txt and canonical fixes. Usually a two-sprint project.
- Core Web Vitals. Ongoing; start with the top five traffic pages.
- Internal linking. Continuous investment; build rituals into your content calendar.
What good technical SEO looks like in 2026
A technically excellent B2B site in 2026 does all of the above — and instruments them. That means:
- Real-user CWV monitoring in production (not just lab data).
- Schema validation in CI on every deploy.
- Weekly log-file review for crawl patterns.
- A monthly ritual for internal-link audits on priority pages.
- Dashboards that show all of this in one view.
None of that is glamorous. All of it compounds. Teams that invest in technical SEO consistently see 40–80% organic traffic gains within 12 months — without writing a single new page.
Benchmark
Across 14 B2B technical-SEO engagements in 2024–25, the median organic traffic lift at 12 months was +58% — and the median number of new pages published was zero. The entire gain came from fixing what was already there.
Not all of this is glamorous
Technical SEO doesn't get talked about at conferences because it doesn't make for good keynote material. You can't charge $50k for a 12-slide deck about it. But in our experience, it's the highest-ROI work most B2B teams aren't doing — and fixing it unlocks every downstream content and AEO investment you've already made.
The site you have is probably a better site than you're giving it credit for. It's just got five fixable problems silently capping its ceiling.