Last updated on May 17th, 2022
Definitive Guide to Technical SEO
What is Technical SEO?
Writing quality content alone cannot guarantee that your site will be fetched into the top results. Rankings also depend on how well your website is configured and how well your site's servers are optimized, so that crawlers can read and index your site smoothly and render (present) it to users in web browsers without loading issues or errors.
Why Technical SEO?
- To improve page experience and user experience (UX), and consequently reduce bounce rate.
- Smoother crawling and indexing, so that Googlebot understands your site structure better.
- Faster page loads to improve user experience.
- Using breadcrumbs for easy website navigation, so users can quickly find the information they are looking for within the site.
- To fix 404 (Not Found) errors and rebuild broken links.
- To protect the website content and keep your site secure and safe.
- To enhance how your site is displayed in search results by implementing structured data.
- Using a robots.txt file to block crawlers from crawling pages with confidential or private information.
- To reduce the cumulative layout shift (CLS) of page elements for better interaction with the site.
- To stop the use of unnecessary pop-ups that disturb the user's reading flow and interest.
- Practicing an SEO-friendly URL structure so crawlers can easily understand the content and site structure.
- Using sitemaps to help crawlers find orphan pages that would otherwise go unindexed because of poor internal linking.
Know the Major Differences Between On Page Vs Off Page Vs Technical SEO
How Technical SEO Works?
The goal of search engines is to provide online users with the best results for their search intent (keyword). To achieve this, search engines analyze, evaluate, and rank pages based on various technical SEO ranking factors, in addition to on-page and off-page ranking factors.
These technical ranking factors include site structure, internal and external links, code quality, canonical tags, the robots.txt file, sitemaps, breadcrumbs, structured data, 404 errors, 301 and 302 redirects, page speed, etc.
Technical SEO checklist
Robots.txt is a text file with instructions that tell robots whether or not to crawl content on certain pages, directories, etc.
Example of a robots.txt file:
User-agent: *
Disallow: /
The above instructions block all search engines from crawling the site.
User-agent: googlebot
Disallow: /media
The above instructions will block Googlebot from crawling the media directory of your site.
Note: Blocking spiders from crawling a page will not stop search engines from displaying that page in SERPs if there are enough links to the blocked page from other sites. So, add a noindex tag to the page (and leave it crawlable, so search engines can see the tag) to stop it from being indexed in SERPs.
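A minimal sketch of such a noindex directive, placed in the page's head (this excludes all robots; a specific crawler name such as "googlebot" could be used instead):

```html
<!-- Place inside the page's <head>. The page must remain crawlable
     (not blocked in robots.txt) so search engines can see this tag. -->
<meta name="robots" content="noindex" />
```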
Pros and cons of robots.txt file:
Pros: It helps in managing the crawl budget allocated to a site based on the site authority, size and reputation. Blocking search engines from crawling unnecessary things on a site makes them focus on important sections of a site.
Cons: Blocking a page may not stop it from displaying in SERPs. It also stops the passage of link value to other links on the page.
Search engines use internal links to find new content and treat the most-linked pages as the most important ones. This process requires search engines to crawl every page on the site to find updated content, which consumes the site's crawl budget.
A sitemap is a file in XML or HTML format that contains the complete list of pages present on your site, along with their priority, last-modified dates, alternate language versions of a page (if any), and the relationships between them.
Sitemaps help search engines discover newly created or recently updated pages easily, saving your site's crawl budget. Sitemaps also act as a backup in case of bad internal linking.
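A minimal XML sitemap might look like the following (the URL and date here are illustrative placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Hypothetical page URL -->
    <loc>https://www.example.com/blog/technical-seo/</loc>
    <lastmod>2022-05-17</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>
```

The sitemap can then be submitted in Google Search Console, or referenced from robots.txt with a `Sitemap:` line.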
The 404 error (page not found) is an HTTP status code sent by the server which means that the user was able to communicate with the server, but the server could not find the resource that was requested.
This happens due to various reasons like:
- Page Permalink structure was modified
- Page was intentionally deleted by the site admin
- URL was misspelled etc.
How to fix (page not found) 404 errors?
- If the page was removed intentionally, serve it with a 404 (Not Found) or 410 (Gone) HTTP status code.
- Sometimes pages that are available (active) are treated as soft 404 errors by search engines due to duplicate or thin content on the page. Rewrite the page with unique, solid, relevant content and resubmit it so search engines assign it some value.
- If the page was moved to a new location, set up a 301 (permanent) redirect from the old page to the new one to send users to the new page.
- If you want the page to stay available but want to stop search engines from indexing it in SERPs, add a noindex meta tag to it.
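As an illustration, on an Apache server a 301 redirect for a moved page can be set up in the site's .htaccess file (the paths here are hypothetical placeholders):

```apache
# Permanently redirect the old URL to its new location
Redirect 301 /old-page/ https://www.example.com/new-page/
```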
What is HTTPS or SSL Certificate?
HTTPS is HTTP secured with the secure sockets layer (SSL) protocol, or its successor TLS, used to transfer data between the browser and the site's server.
- HTTPS protects the right to privacy of a user.
- With HTTPS, data sent from the browser to the site is secured against third-party interception.
- For example: over an HTTP connection, sensitive information provided by the user, like credit card numbers, usernames, and passwords in sign-up forms, is sent to the site's server in plain text, which can be intercepted and may lead to data theft or data leakage.
- With an HTTPS-secured connection, the data sent to the site's server is encrypted, protecting the privacy of the user.
- Upgrading your site to HTTPS makes it safer and faster, because HTTPS enables newer technologies (such as HTTP/2) that speed up page loads.
- In 2014, Google announced that sites with SSL certificates get a slight ranking boost.
- It gives users a good experience and builds trust among visitors of your site.
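After installing an SSL certificate, sites typically force all traffic to HTTPS. A common Apache sketch using mod_rewrite (assuming mod_rewrite is enabled on the server):

```apache
RewriteEngine On
# If the request did not arrive over HTTPS, redirect it permanently
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```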
Use of hreflang for international sites:
What is hreflang?
Hreflang is an attribute that provides a technical solution for international websites that serve the same content to users in different languages based on country or region.
Benefits of hreflang?
Presenting content to users in the appropriate language for the location or country they live in improves the engagement rate and reduces the bounce rate.
It also prevents duplicate-content issues, as the hreflang attribute in the head tag informs Google that the same content is being presented to users in different languages under different country-specific URLs.
Hreflang can be technically implemented on sites in 3 different ways:
- by adding link elements in the head tag
- by listing the links in the sitemap
- by using HTTP headers
Example of how hreflang is implemented for different countries in the head tag:
<link rel="alternate" href="https://www.example.com/" hreflang="en" />
<link rel="alternate" href="https://www.example.com/en-gb/" hreflang="en-gb" />
<link rel="alternate" href="https://www.example.com/en-au/" hreflang="en-au" />
Second method: using hreflang HTTP headers
Link: <https://es.example.com/document.pdf>; rel="alternate"; hreflang="es",
<https://en.example.com/document.pdf>; rel="alternate"; hreflang="en",
<https://de.example.com/document.pdf>; rel="alternate"; hreflang="de"
Third method: hreflang implementation in the XML sitemap
<url>
  <loc>https://www.example.com/uk/</loc>
  <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/" />
  <xhtml:link rel="alternate" hreflang="en-au" href="https://www.example.com/au/" />
  <xhtml:link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/" />
</url>
Structured data is schema markup code used to highlight important information on a site, such as the author, price, event timings, FAQs, reviews, etc.
It makes it easy for search engines to understand your site's content and structure, and to show users additional information, like product price, in rich results when appropriate.
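For example, product price information can be marked up with a JSON-LD snippet in the page (the product name and price below are made-up placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD"
  }
}
</script>
```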
These days users are impatient, expecting a site to load in 3-5 seconds. If it takes longer, the poor page experience causes them to switch to another site, which leads to loss of website traffic, an increase in bounce rate, and a drop in search rankings.
In 2021, Google made page experience a ranking factor. It also developed its page speed testing web tool, which reports the average loading, rendering, and interaction times of the Core Web Vitals:
- First Contentful Paint (FCP)
- Largest Contentful Paint (LCP)
- Cumulative Layout Shift (CLS)
- First Input Delay (FID)