Makerlog #12 - SEO-opsy: How I removed all my articles from Google

Google can't find my websites

I did an oopsie with the SEO of my blogs.

Since I've been writing a bit more online, I've been dabbling with SEO. Something I noticed these past few months was that a small blog with 3 blog posts written in 2020 (fullstackalmanack.com) was getting a lot more hits than another blog with 5 blog posts (gcpsecured.com) and this blog with 15-ish blog posts. I know these are rookie numbers for veteran bloggers, but it still made me wonder why.

One of the distinctions was my tech stack. For FullstackAlmanack I use Hugo with a template; the others I moved to NextJS for more control over the design. After diving into the code, I immediately saw that the FullstackAlmanack template had some good out-of-the-box SEO optimisations baked in. I checked the others: zero.

This looked like something I could tackle in a weekend. Two small tasks:

  1. Dynamically add meta tags.
  2. Generate a sitemap after the site has been built.

Add meta tags to NextJS dynamically

Meta tags let you control how Google or another search engine sees your website. You want a relevant title and description, since this is what users will see in the Google search results. But you can also add data like when a post was published or updated, to let Google know your content is still relevant.

Not only search engines but also social media platforms make use of these meta tags. They can be used to tweak the card that, for example, Twitter generates when somebody shares your site (you can try it out by sharing this site if you want, wink). This is achieved through OpenGraph.
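To make that concrete, here is roughly what those tags look like when written by hand in a NextJS page using next/head (the title, dates, and image URL are made-up placeholders, not my real values):

```jsx
import Head from 'next/head';

export default function PostPage() {
  return (
    <Head>
      {/* What searchers see in the result listing */}
      <title>SEO-opsy: How I removed all my articles from Google</title>
      <meta name="description" content="How one default canonical tag deindexed my blog." />
      {/* Freshness hints for search engines */}
      <meta property="article:published_time" content="2021-06-01T09:00:00Z" />
      <meta property="article:modified_time" content="2021-06-15T09:00:00Z" />
      {/* OpenGraph tags that social media cards are built from */}
      <meta property="og:title" content="SEO-opsy" />
      <meta property="og:description" content="How one default canonical tag deindexed my blog." />
      <meta property="og:image" content="https://example.com/og-card.png" />
    </Head>
  );
}
```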

Of course I didn't want to add all the meta tags myself on every page, so I looked for a solution. I stumbled upon next-seo, a handy tool that lets me set a default for all my pages and allows me to dynamically update tags per page. Best of all, it was easily implemented. On to the next task.
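For reference, the setup is roughly this: a DefaultSeo component in _app.js for site-wide defaults, and a NextSeo component on individual pages to override them. The titles and descriptions below are placeholders, not my real config:

```jsx
// pages/_app.js — site-wide defaults
import { DefaultSeo } from 'next-seo';

export default function MyApp({ Component, pageProps }) {
  return (
    <>
      <DefaultSeo
        titleTemplate="%s | My Blog"
        description="A blog about making things"
        openGraph={{ type: 'website', siteName: 'My Blog' }}
      />
      <Component {...pageProps} />
    </>
  );
}
```

```jsx
// pages/blog/some-post.js — per-page overrides
import { NextSeo } from 'next-seo';

export default function SomePost() {
  return (
    <>
      <NextSeo
        title="Some post"
        description="What this specific post is about"
        openGraph={{
          type: 'article',
          article: { publishedTime: '2021-06-01T09:00:00Z' },
        }}
      />
      <article>{/* post content */}</article>
    </>
  );
}
```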

Create a sitemap for a NextJS static site

Another thing I wanted to add was a sitemap: a file, most often found at /sitemap.xml, that lists all the pages on your website. This helps a search engine index all your pages by taking the guesswork away.

To accomplish this, I wrote a small script that generates the XML file based on the files that end up in the build folder after a static export. It took some tweaking, but it got done.
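My version has some blog-specific tweaks, but the core idea looks something like this sketch, assuming the static export lands in an out/ folder and example.com stands in for the real domain:

```js
// scripts/generate-sitemap.js — run after the static export
// A rough sketch: assumes the export lands in out/ and
// https://example.com is a stand-in for the real domain.
const fs = require('fs');
const path = require('path');

const SITE_URL = 'https://example.com';
const OUT_DIR = path.join(__dirname, '..', 'out');

// Recursively collect every exported .html file
function collectPages(dir, pages = []) {
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) collectPages(full, pages);
    else if (entry.name.endsWith('.html')) pages.push(full);
  }
  return pages;
}

// Turn out/blog/post.html into https://example.com/blog/post
function toUrl(file) {
  const route = path
    .relative(OUT_DIR, file)
    .split(path.sep)
    .join('/')
    .replace(/index\.html$/, '')
    .replace(/\.html$/, '');
  return `${SITE_URL}/${route}`;
}

const urls = collectPages(OUT_DIR).map(toUrl);

const xml = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${urls.map((url) => `  <url><loc>${url}</loc></url>`).join('\n')}
</urlset>`;

fs.writeFileSync(path.join(OUT_DIR, 'sitemap.xml'), xml);
console.log(`Wrote sitemap.xml with ${urls.length} URLs`);
```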

And then I did an oopsie

With all the changes implemented, I deployed and waited for the flood of people to land on my site.

And then… nothing. Now, one thing I learned from reading SEO blogs is that it takes time: domains have to be warmed up, blog posts have to mature. So I wasn't really expecting much.

But my changes seemed to have the reverse effect. Instead of a flood, the little trickle I had completely dried up. I gave it a few weeks; still nothing. This is the point where I discovered Google Search Console, a handy tool to see how Google is treating your website in the search rankings.

I also discovered something else: the only page Google was indexing on both sites was the index page. All the others were gone.

Remember that we installed next-seo and that it had this handy feature of setting default meta tags that get reused on every page? Well, one of the tags I put in there was the canonical tag. For those of you who don't know what it does: a canonical tag tells a search engine that this page is a copy of whatever page the tag points to, which lets Google know not to index your duplicate pages.

Well, I set it to my root domain and didn't override the canonical tag on any individual page, meaning the search engine now thought that every other page or blog post was a copy of my landing page, so it didn't see a reason to index them anymore. Ouch.
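In next-seo terms, the footgun looked something like this (domain is a placeholder):

```jsx
// pages/_app.js — the mistake, reconstructed
import { DefaultSeo } from 'next-seo';

export default function MyApp({ Component, pageProps }) {
  return (
    <>
      <DefaultSeo
        titleTemplate="%s | My Blog"
        // The mistake: every page inherits this canonical and
        // claims to be a copy of the homepage, so Google drops
        // everything except the homepage from the index.
        canonical="https://example.com/"
      />
      <Component {...pageProps} />
    </>
  );
}
```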

From a technical point of view this is quickly fixed. I removed the default canonical for now and will probably add canonicals back later, once I can generate them dynamically per page. Now it's just waiting for Google to find my pages again.
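When I do add them back, the per-page version will probably look something like this sketch, deriving each canonical from the current route via NextJS's router (domain is again a placeholder):

```jsx
// pages/blog/some-post.js — a canonical derived per page
import { useRouter } from 'next/router';
import { NextSeo } from 'next-seo';

export default function SomePost() {
  const router = useRouter();
  return (
    <>
      <NextSeo
        title="Some post"
        // Each page now points at itself, not at the homepage
        canonical={`https://example.com${router.asPath}`}
      />
      <article>{/* post content */}</article>
    </>
  );
}
```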