Eduardo Zepeda

Originally published at coffeebytes.dev

My mistakes regarding the tech SEO optimization of my website

A few years ago, I migrated my website a couple of times: first from WordPress to Frontity (a WordPress framework built on React), and then from Frontity to a Hugo app hosted on DigitalOcean. I don’t regret my decision at all, but I made a few SEO mistakes that you can probably avoid if you take into account what I’m about to tell you.

Why should I care about SEO in web development?

SEO is the factor that determines whether a website appears at the top of a search engine's results (almost always Google's) or is buried in the last positions, receiving little or no traffic and condemning the business it represents to bankruptcy, or, if it is a personal project, to oblivion.

And I’m not talking about subtle differences, I’m talking about abysmal differences.

At the risk of sounding redundant, I will say it again: a website's traffic matters far more than its efficiency, its looks, or the language or framework it is built with. Yes, even if you write it in C++ or directly in assembly.

Aves exoticas is a perfect example of a website with good SEO but an awful UI
Aves exoticas is the perfect example of a website that is not visually attractive, but whose impeccable SEO puts it in the first position on Google for the keyword "aves exoticas".

Web developers often ignore the potential of SEO.

Most developers come from an engineering background, where efficiency and best practices are valued while the business side of a website is overlooked.

Hence, when a web developer launches their personal projects, they often completely ignore SEO and focus on optimizing their website to the maximum, usually resulting in an extremely fast, efficient, and even visually appealing website, but with no traffic.

Web development in assembly meme
It is said that only true programmers program in low-level languages.

My mistakes while migrating a website without considering Technical SEO

When I migrated the blog the first thing I ignored were the multiple consequences of doing it abruptly.

I made three main mistakes:

  • Ignoring the changes in the sitemap
  • Ignoring changes in URLs' structure
  • Ignoring schema

The presence of a sitemap is crucial in Tech SEO

A sitemap is a file, usually in XML format, that works as a map for navigating your site: it lists the pages of your website so search engines can find them.
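To make the idea concrete, here is a minimal sketch of generating such a file with Python's standard library. The URLs are hypothetical placeholders, not the pages of any real site.

```python
# Sketch: generating a minimal sitemap.xml using only the standard library.
# The URLs below are hypothetical examples.
import xml.etree.ElementTree as ET


def build_sitemap(urls):
    """Return a sitemap XML string with one <url><loc> entry per page."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")


sitemap = build_sitemap([
    "https://example.com/",
    "https://example.com/posts/my-first-entry/",
])
print(sitemap)
```

The resulting string is what you would save as sitemap.xml at the address you later submit to Google Search Console.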

My previous website's sitemap lived at a specific address that changed when I migrated the site, so Google was unable to find the new sitemap. And then what happened? Well… Google indexed the pages however it could, and, as you probably already know, I suffered the consequences.

One night my cell phone vibrated to the rhythm of a cascade of Google Search Console warnings arriving as notifications.

How could I have prevented it? By logging into Google Search Console, replacing the old sitemap address with the new one, and asking Google's systems for a fresh crawl.

How I realized that URL structure is important in Tech SEO

But that wasn’t all. After the migration from WordPress to Hugo, Google detected a lot of 404 errors when accessing the old URLs and, as a result of the consequent penalty, my traffic dropped by about 70%.

Why did this happen? Imagine that search engines see your website with a URL structure like the following.

Website-->Year-->Month-->Day-->Entry;

And when you perform the migration, the structure changes:

Website-->Posts-->Entry;

The important thing to remember is that search engines have no easy way to recognize that one entry is exactly the same as another that has changed location, especially if the migration also slightly changes the page. While it is true that Google can detect duplicate content and is able to render a web page, that doesn't mean it "sees" the entries visually, as a human would; under the hood it is still receiving and parsing text in the form of HTML.

How could I have prevented the traffic drop? With a redirect. In this case it was enough to tell Google that a request for /2020/12/12/entry_1 should redirect to /posts/entry_1. And how? By returning an HTTP 301 or 308 response, Moved Permanently or Permanent Redirect, respectively. A 302 (Found) signals a temporary move, which is not what you want for a migration.
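The mapping from old to new URLs can be sketched as a small function. The regex and the /posts/ prefix here are illustrative assumptions based on the structures above; in practice you would configure this in your web server or hosting platform.

```python
# Sketch: mapping old date-based URLs to the new /posts/ structure with a
# permanent (301) redirect. The path pattern is an illustrative assumption.
import re

# Matches paths like /2020/12/12/entry_1 and captures the final slug.
OLD_URL = re.compile(r"^/\d{4}/\d{2}/\d{2}/(?P<slug>[^/]+)/?$")


def redirect_for(path):
    """Return (status, new_path) for an old URL, or None if no redirect applies."""
    match = OLD_URL.match(path)
    if match is None:
        return None
    # 301 Moved Permanently tells search engines to transfer the old
    # page's ranking signals to the new address.
    return 301, f"/posts/{match.group('slug')}"


print(redirect_for("/2020/12/12/entry_1"))
```

Any path that doesn't match the old date-based pattern is left alone, so the rule only fires for the URLs that actually moved.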

The absence of structured data or Schema markup

When I used WordPress, the Yoast plugin took care of the structured data markup, but in Hugo this has to be done manually, so my website went a while without structured data. The result? A penalty from Google in the form of fewer impressions, and therefore less traffic to my website.

Screenshot of structured data markup on a web site
This is what the structured data markup for a website looks like

Structured data markup usually takes the form of an application/ld+json script on a web page. It cannot be seen visually, but search engines read it, and it helps them understand the type of each entity on your website and the relationships between them.
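As a rough sketch, such a script tag can be assembled like this. The field values are placeholders, and the vocabulary (BlogPosting, Person, and so on) comes from schema.org.

```python
# Sketch: building an application/ld+json block for a blog post.
# The headline, author, and date below are placeholder values.
import json


def blog_posting_jsonld(headline, author, date_published):
    """Serialize a minimal schema.org BlogPosting as a JSON-LD script tag."""
    data = {
        "@context": "https://schema.org",
        "@type": "BlogPosting",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": date_published,
    }
    return f'<script type="application/ld+json">{json.dumps(data)}</script>'


tag = blog_posting_jsonld("My SEO mistakes", "Eduardo Zepeda", "2022-01-01")
print(tag)
```

In Hugo you would typically render something equivalent from a partial template and drop it into the page's head, rather than generating it with Python.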

That was the tragic story of how I decreased my own traffic as a layman in SEO.

But this incident has a happy ending: I started reading about SEO and put a lot of what I learned into practice. The website is not yet at the level it was before, but it is heading that way, and the best thing is that now I know what I'm doing.

I had not given myself the chance to touch on this topic in the blog before, because semantically it feels quite far from what most devs understand by web development, even though it really isn't.

But I finally decided to write these mistakes up in a post, and if they save you a couple of headaches, so much the better.
