Technical SEO is the foundation that allows search engines to understand, rank, and display your website. In simple terms, it is concerned with how well search engines can find and process your content. When the technical foundation is weak, even the best content with the strongest backlink profile won’t perform as well as it could.
Most modern CMS platforms like WordPress, Squarespace, and Shopify take care of many technical SEO tasks by default, but sometimes they do a poor job or nothing at all, which gives anyone who understands these concepts an advantage in SEO optimization.
This is meant as a high-level article, not a guide to practical implementation. If you’d like to go deeper, Google Search Central is the best place to start.
Crawling & Indexing

Before your site can appear in search results, search engines must first crawl your website and index its pages. It’s important to remember that search engines rank individual pages, NOT entire websites. Crawlers begin at your top-level domain and work their way through your URL structure until they reach the final page, then return to the top level. From there, the search engine determines whether a page is valuable enough to index. If it is, the page becomes eligible to rank and can begin appearing in the SERP (Search Engine Results Page).
We have several ways to control what gets crawled, what gets indexed, and what is excluded from search results. Sometimes we don’t want certain files such as videos, images, or pages to be indexed. This can be for many reasons, including sensitive information, duplicate content, or outdated content.
This is where sitemaps, robots.txt, canonicals, links, and metadata come in. These tools help us manage how search engines access and interpret our content.
Sitemap

A sitemap is a file written in XML or HTML that tells search engines which files and pages relate to each other. This helps search engines understand which content to prioritize when crawling. Because each website is given a limited crawl budget, specifying details such as when a file was last updated helps search engines decide what to revisit and what to skip on the next crawl.
For small websites, this may not matter much. But for larger sites with 1,000+ pages, sitemaps become essential. Once your crawl budget is used up, the crawler stops, whether it has reached every page or not, and you’ll have to wait until the next crawl cycle.
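For illustration, here’s what a minimal XML sitemap might look like; the URLs and dates below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <lastmod> helps crawlers decide what to revisit -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/technical-seo/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```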
robots.txt

robots.txt is a file that tells search engines which files or directories their crawlers are allowed to access. If a file is disallowed, it still exists on your server, but the crawler should not visit it, and in most cases it won’t appear in the SERP. There are a few exceptions, though:
- If another website links to a disallowed URL, the page can still appear in search results as an un-crawled, un-rendered URL.
- Not all search engines support the same directives, and different crawlers may interpret robots.txt rules differently.
Because of this, it’s important to be careful when configuring your robots.txt file to avoid unintentionally blocking valuable content.
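As a hypothetical example, a robots.txt that keeps all crawlers out of a couple of private areas (the directory names here are placeholders) might look like:

```
# Applies to all crawlers
User-agent: *
# Keep crawlers out of these (hypothetical) directories
Disallow: /admin/
Disallow: /drafts/

# Point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```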
Canonical

A canonical tag tells search engines which version of a page should be treated as the primary one when duplicates exist. It helps consolidate ranking signals, including link equity, toward the canonical URL. It works like a 301 redirect, except it applies only to crawlers: it signals to search engines how the related pages should be indexed and ranked.
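For example, a duplicate page can point at the primary version with a link tag in its <head> (the URL is a placeholder):

```html
<!-- On the duplicate page; href points to the version you want indexed -->
<link rel="canonical" href="https://www.example.com/products/blue-widget/" />
```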
URL Slugs

A URL slug is the part of a URL that identifies a specific page on a website. A clear, descriptive slug helps search engines better understand the content of the page while crawling.
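For example (hypothetical URLs), a descriptive slug beats an opaque one:

```
Descriptive: https://www.example.com/blog/technical-seo-basics   (slug: technical-seo-basics)
Opaque:      https://www.example.com/?p=12345
```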
Links
Links are simply ways to point to other web pages, and anchor text is the clickable text within those links. Both help crawlers understand the relevance of pages and discover new content to crawl. Generally, there are three types of links.
Internal Links

Internal links are links that point to other pages within your own website. Internal links help establish site structure, distribute authority, and reinforce topical relevance. There are strategies such as content siloing that involve linking related pages together within the same category to strengthen their thematic connection and improve overall site organization.
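A minimal sketch of an internal link with descriptive anchor text (the path is a placeholder):

```html
<!-- The anchor text tells crawlers what the linked page is about -->
<a href="/blog/crawling-and-indexing/">how crawling and indexing work</a>
```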
External Links

External links are links that point to pages outside your website. If you don’t want to pass authority or credibility to the destination page, you can add a rel attribute, such as nofollow to signal that search engines should not transfer ranking value through that link.
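For example (the URL is a placeholder):

```html
<!-- rel="nofollow" asks search engines not to pass ranking value through this link -->
<a href="https://example.com/resource/" rel="nofollow">an outside resource</a>
```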
Backlinks

Backlinks are links from other websites that point to a page on your site. They signal to search engines that your content has value, helping to build your site’s domain authority. Over time, this authority and trust can improve the ranking of future pages you publish, as search engines are more likely to favor your site over others on the same topic.
Page & Content Metadata

Title and description tags provide search engines with key information about your page. Other meta tags exist, but they are typically used for specific cases rather than general SEO.
Title Tag
The title tag describes the content of a page. It helps search engines understand what the page is about during crawling and often appears as the clickable headline in search results.
Description Meta Tag
The meta description works similarly to the title tag, but provides a brief summary of the page’s content.
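Both tags live in the page’s <head>; here’s a hypothetical example:

```html
<head>
  <title>Technical SEO Basics: Crawling, Indexing & Ranking</title>
  <meta name="description" content="A high-level overview of how search engines crawl, index, and rank your pages." />
</head>
```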
Search Appearance
Now that the search engine has properly evaluated and indexed our page, let’s discuss what appears on the SERP. While these elements aren’t direct ranking factors, they can significantly influence click-through rates (CTR), which in turn can impact rankings.
Title & Description Metadata
The title tag and meta description that were crawled earlier also appear in the SERP, often forming the main headline and snippet that users see in search results.
Favicon
The favicon is the small image that appears next to your page title in the SERP. It can also serve as the icon shown when users bookmark your site or add it to their home screen on iOS and Android devices. Favicons are important for brand recognition, helping your site stand out among other search results.
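A favicon is typically declared in the page’s <head>; the path below is a placeholder:

```html
<link rel="icon" href="/favicon.ico" />
```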
Sitelinks

Sitelinks are additional pages that appear beneath the main site listing in the SERP when the search engine deems them relevant. They aren’t critical for SEO, but if there’s a page you don’t want displayed, the best approach is to deindex it.
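One common way to deindex a page is a robots meta tag in its <head>:

```html
<!-- Asks search engines not to include this page in their index -->
<meta name="robots" content="noindex" />
```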
Og Metadata

Open Graph (OG) metadata controls how your webpage appears when shared on social media. OG tags specify which image, title, URL, and description should appear in the preview, helping improve click-through rates before users even visit your page.
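OG tags also live in the <head>; a hypothetical example:

```html
<meta property="og:title" content="Technical SEO Basics" />
<meta property="og:description" content="How search engines crawl, index, and rank your site." />
<meta property="og:image" content="https://www.example.com/images/cover.png" />
<meta property="og:url" content="https://www.example.com/blog/technical-seo/" />
```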
Videos & Images

Adding descriptive attributes like alt text and title attributes to images and videos helps them appear in search results. High-quality visuals with descriptive attributes increase the likelihood that users will click on your content.
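For example (the file name and text are placeholders):

```html
<img src="/images/blue-widget.jpg"
     alt="Blue widget photographed from the front on a white background"
     title="Blue widget product photo" />
```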
Structured Data Markup (Schema)

Adding structured data can enhance how your pages appear in search results, creating more engaging rich results. Depending on the type of content, rich results such as enhanced text snippets, ratings, or other visual elements can significantly improve click-through rates on the SERP.
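Structured data is commonly added as a JSON-LD script tag using the schema.org vocabulary. A hypothetical product with ratings might be marked up like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Blue Widget",
  "image": "https://www.example.com/images/blue-widget.jpg",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  }
}
</script>
```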
Ranking Factors
Technical SEO does have some ranking factors, although they are not as important as on-page and off-page signals. While we don’t know their exact weights, cumulatively they make a meaningful difference. Ensuring your site is compliant gives you the best chance to rank well.
HTTPS / HTTP

HTTPS is a ranking factor. Making sure your site isn’t susceptible to man-in-the-middle attacks is important both from a user-trust standpoint and for search engines.
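If your site runs on Apache, a common sketch for forcing HTTPS is a 301 redirect in .htaccess (this assumes mod_rewrite is enabled; adapt to your server):

```
RewriteEngine On
# If the request came in over plain HTTP...
RewriteCond %{HTTPS} off
# ...permanently redirect it to the HTTPS version of the same URL
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```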
PageSpeed and Core Web Vitals

Core Web Vitals are three specific performance metrics within the broader PageSpeed score that impact rankings: LCP, INP, and CLS. Two other metrics, TTI and TBT, are not Core Web Vitals but are still worth mentioning. These vitals focus on user-centric aspects of a webpage’s experience: loading speed, interactivity, and visual stability.
Largest Contentful Paint (LCP)
This metric measures how long it takes for the largest content element such as an image or text block to fully load and become visible on the screen.
Interaction to Next Paint (INP)
Measures the page's responsiveness to user interactions (like clicks or taps) throughout the entire session.
Cumulative Layout Shift (CLS)
Measures how much the content of a page shifts unexpectedly during loading.
Time to Interactive (TTI)
Measures how long it takes before the page becomes fully interactive for the visitor.
Total Blocking Time (TBT)
Measures the total time a page's main thread is blocked from responding to user input during loading.
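If you want to see these numbers for real visitors, one option is Google’s open-source web-vitals JavaScript library; a minimal sketch (logging to the console, though in practice you’d send the values to your analytics):

```ts
import { onLCP, onINP, onCLS } from 'web-vitals';

// Each callback fires once the metric's value is known for the current page view
onLCP((metric) => console.log('LCP:', metric.value, metric.rating));
onINP((metric) => console.log('INP:', metric.value, metric.rating));
onCLS((metric) => console.log('CLS:', metric.value, metric.rating));
```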
Mobile Friendliness
Search engines primarily use the mobile version of a site for indexing and ranking, due to the high volume of searches conducted on mobile devices.
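A basic prerequisite for a mobile-friendly page is the viewport meta tag in the <head>:

```html
<!-- Tells browsers to scale the page to the device's width -->
<meta name="viewport" content="width=device-width, initial-scale=1" />
```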
Orphan Pages & Broken Links

Orphan pages are web pages that have no internal links pointing to them, making them difficult for both users and search engines to discover, which hurts indexing. Broken links are URLs that lead to non-existent pages (e.g., a “404” error). Both negatively impact SEO and user experience.
Conclusion
Fixing technical SEO issues should be the first step in your SEO efforts. Much like building a house, if the foundation is weak you can expect issues in the rest of the structure. Every website is built on top of technical SEO, so making sure everything is set up correctly gives you the best chance of appearing at the top of the search results. If you’d like a free technical audit, reach out and we’ll be more than happy to assist you.

