More visibility on Google with Technical SEO

11/30/2020 12:00 AM by seowebsitestats in Seo

How can you gain visibility on Google? How do you get more traffic? More conversions? These are questions every entrepreneur asks in their early days. Several solutions exist, one of which is natural referencing, better known as Search Engine Optimization (SEO).

Natural referencing is a set of web marketing techniques that aim to improve a website’s visibility in the results of search engines such as Google.

For more visibility on Google, you have to work on the four main pillars of SEO, namely:

  1. The technical aspect
  2. The user experience
  3. Content
  4. Recommendations

We will look at the first pillar in this article, but note that all four must be worked on to gain visibility on Google.


What is the technical aspect of SEO?

The first pillar of SEO is technical referencing: optimizing a set of criteria, internal to the website, that Google looks at. Google attaches importance to these criteria because they affect both the user experience (user side) and the navigation of its robots (Google side). In other words, technical referencing is mainly the optimization of a website so that search engines can understand it as well as possible.

If Google doesn’t understand your site or can’t visit all your pages, you lose visibility on Google. So you need to make sure that everything “under the hood” of your website purrs along properly. We will therefore be particularly interested in how a website was built and which CMS (a website-building tool like WordPress or Wix) it uses, if any. Most elements of technical referencing are therefore addressed to Google and invisible to Internet users.

There are many technical criteria to put in place for Google; here are a few.

15 technical criteria for more Google visibility

Crawl and indexability

Google handles websites in three steps. Crawling and indexing are the first two:

  • The crawl: Google sends its robots to analyze every page of a site
  • Indexing: once analyzed, the pages are indexed in the search engine and become visible to users

It is possible to control these two steps. The crawl can be managed via the robots.txt file, which tells Google which pages it is not allowed to visit. Indexing can be managed with the noindex tag, which tells Google not to index a page. It is worth controlling these two steps to gain visibility on Google; I talk about it in this article dedicated to the quality of the pages.
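As a sketch, these two controls look like this (the blocked paths below are purely illustrative):

```
# robots.txt, placed at the root of the site:
# tells Google's robots which paths they may not crawl
User-agent: *
Disallow: /admin/
Disallow: /internal-search/
```

```html
<!-- In the <head> of a page you do not want in Google's index -->
<meta name="robots" content="noindex">
```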

Even when you use a CMS like WordPress, it is worth checking whether Google has indexed all the pages you want indexed. Very often, Google indexes pages we do not want indexed and skips others we do want to see indexed. It’s very frustrating, and you have to keep checking that your pages are indexed correctly.

For this, you can use Google Search Console. In the “Coverage” tab, you can see all the pages of your website (past or present). You can then see the pages in error, the indexed pages (valid) and those that Google has not indexed (voluntarily or not). Check this information: all the pages you want indexed must be in the “Valid” section, and all the pages you do not want indexed must be in the “Excluded” section.

To guide Google, you can create a sitemap (a map of the site) and submit it to Google via Search Console. On WordPress, the well-known Yoast SEO plugin supports this feature.
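A minimal sitemap is simply an XML list of the URLs you want crawled; the example.com addresses below are illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-11-30</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/technical-seo/</loc>
  </url>
</urlset>
```

Yoast SEO generates this file automatically; you then submit its URL in Search Console under “Sitemaps”.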

The accessibility of pages

It is very important to provide a good user experience for Internet users. For this, 100% of the links on your pages must work properly. A user who clicks on a broken link will be frustrated and will leave. This is dramatic for your visibility on Google. Three HTTP status codes come up very often:

  • Code 200 links: everything is fine; the link works properly. 100% of the links on your website should return code 200.
  • Code 301 links: the landing page is accessible but redirected. This most often happens when the URL of the landing page has been changed but the links pointing to it have not been updated. Code 301 means a permanent redirect. It’s bad for your loading time and your SEO.
  • Code 404 links: 404 errors are not dramatic for natural referencing, but they do harm the user experience. A 404 error is an inaccessible page; there is nothing worse for making a user leave the website.

You can check your links by crawling your website. The free Screaming Frog SEO Spider tool identifies broken links. Note that the same goes for outgoing links.
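If you export the status codes from a crawl (Screaming Frog can do this), a few lines of Python are enough to bucket links as described above; the function name and URLs are my own illustration:

```python
def classify_links(statuses):
    """Bucket crawled URLs by HTTP status code:
    200 = fine, 301/302 = redirected, anything else = broken."""
    report = {"ok": [], "redirected": [], "broken": []}
    for url, code in statuses.items():
        if code == 200:
            report["ok"].append(url)
        elif code in (301, 302):
            report["redirected"].append(url)
        else:
            report["broken"].append(url)
    return report

# Example with status codes exported from a crawl:
crawl = {
    "https://example.com/": 200,
    "https://example.com/old-url": 301,
    "https://example.com/missing": 404,
}
print(classify_links(crawl)["broken"])  # the 404s to fix first
```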


Page depth

Depth is the number of clicks needed to reach a web page. Users usually stop at 3 clicks, so depth affects the user experience. Internet users are lazy and want to find information very quickly; if they struggle to find it, they will leave. In addition, robots spend a limited amount of time on a website and may miss a page that is too deep. It will then not be indexed (not immediately, anyway).

Limit depth to 3 clicks at most. To do this, you need a simple and concise site tree. Internal linking is also a good way to reduce depth.

Internal linking

This is the best way to make navigation easier for Google’s robots, but also for Internet users. Internal linking simply means making links between the pages of the same website. It’s good for your visibility on Google but also for keeping users on the site. Wikipedia is a perfect example. For my part, I think this is a major criterion of natural referencing; discover my complete guide on internal linking.
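In HTML, an internal link is nothing more than an `<a>` tag pointing to another page of the same site (the path here is illustrative):

```html
<p>To go further on this criterion, see the
  <a href="/blog/internal-linking/">complete guide to internal linking</a>.</p>
```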

The Title Tag

The SEO title is an ultra-important element and needs to be optimized with good keywords. It determines whether the user will find, and then click, the link to access the page. It should be enticing, and the right keywords should be placed as far to the left as possible. The Title tag runs 60 to 65 characters; I advise you to use as much of that space as possible. Note that each Title tag must be unique and that the title displayed on the page is not necessarily identical to the Title tag. On WordPress, the Title tag can be edited with Yoast SEO.
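For example, a Title tag for a page like this one might look as follows (wording of my own invention, main keyword on the left, about 60 characters):

```html
<title>Technical SEO: 15 criteria for more visibility on Google</title>
```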

The meta description tag

The meta description tag carries much less weight than the Title tag; it is the page’s snippet, which appears only in search results. Too often forgotten, the meta description should be about 160 characters and also be optimized with good keywords. The keyword (or a variant) of the Title tag should also appear in the meta description. Check out my full guide to the meta description.
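The meta description sits in the `<head>` next to the Title tag; this example is my own wording, reusing the keyword of the Title tag:

```html
<meta name="description"
      content="Discover 15 technical SEO criteria (crawl, indexing, tags, loading speed) to gain more visibility on Google.">
```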

HN tags

Hn tags correspond to the title and subtitles, i.e. the architecture of a web page. They must be in order (H1 – H2 – H3 – H4). The tags themselves are invisible to Internet users, but they allow robots to better identify the data and semantics of the content. The more Google understands a page, the better it can display it in the most relevant search results.
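A well-ordered heading structure for an article like this one might look as follows (the titles are invented for the example):

```html
<h1>More visibility on Google with technical SEO</h1>
  <h2>Crawl and indexability</h2>
  <h2>The accessibility of pages</h2>
    <h3>Code 404 links</h3>
```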

The size of the text

Google works with keywords. Inevitably, the more text you add, the more keywords you are likely to match among those users type into Google (even keywords you would never imagine). According to several analyses, a well-optimized article runs to around 3,000 words. I always recommend a minimum of 1,000 words. This is a very important criterion for gaining visibility on Google.


Bold text

This again touches on the user experience: bold text catches the eye and helps robots better identify what a page is talking about. It’s not much, but it’s a plus for gaining more SEO weight.

AMP pages

Still fairly new, AMP pages (Accelerated Mobile Pages) let you serve a very fast mobile version of your pages. Google increasingly promotes fast websites on mobile.

Loading speed – bounce rate

The longer a page takes to load, the more likely the user is to leave. If the user leaves the site within the first seconds of loading, your bounce rate rises. For Google, you are hurting the user experience, and it penalizes you for it. Take advantage of the browser cache and optimize your images to be as light as possible. Check out my full guide on the subject.
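On an Apache server, browser caching can be enabled with the standard mod_expires module; here is a minimal .htaccess sketch (the cache lifetimes are chosen arbitrarily):

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Keep static assets in the browser cache so repeat visits load faster
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```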

The duplicate content

Google hates copy and paste, whether external or internal. Try never to have more than 10% of a web page’s text duplicated elsewhere.


The age of the domain

You can’t do much about this one. Unfortunately, Google often gives more weight to an old website than to a recent one. The solution may lie in buying an old domain name.


HTTPS

Google has made it clear that unsecured (HTTP) sites are visually flagged in web browsers. It has also indicated that HTTPS is now a ranking factor. It is therefore essential to switch your website to HTTPS. Here’s a tutorial to move your WordPress website to HTTPS.
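Once the SSL certificate is in place, a site on Apache can force HTTPS with a single 301 rule in .htaccess (a standard mod_rewrite sketch, not specific to any host):

```apache
RewriteEngine On
# Permanently redirect every HTTP request to its HTTPS equivalent
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```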


URL structure

The URL structure also matters to Google. What you absolutely must remember is the stability of URLs. To avoid 301 or 404 links, avoid changing a URL that is already online; doing so can create a lot of technical SEO errors. Adding keywords to the URL is a good practice that brings a little extra visibility on Google.

Avoid accented or special characters in URLs.

Final word: how to improve your visibility on Google?

As we have seen, hundreds of criteria come into play in good technical referencing, and therefore in maximum visibility on Google. The 15 elements above are more than enough to produce a well-optimized web page. Keep in mind that technical referencing is only the first pillar of SEO; you must then combine this information with the criteria of the other three pillars for your SEO strategy to be complete.

This article is the latest in a series on the internal optimization of a website. Also read:

  • Why do you need web writing?
  • The semantic study to find the right keywords to use
  • The corporate blog
  • My SEO hack to generate more traffic
