Hashnode’s Search Visibility Woes & Opportunities 

Hashnode is an exciting, feature-rich platform with an excellent UX, so why is it struggling to gain search visibility?

Hashnode is a great platform focused on the highly competitive area of tech blogging. There are numerous established competitors out there, such as dev.to and devdojo.com, and less focused ones like Ghost, Medium and hackernoon.com. So what sets Hashnode apart?

Well, they have an excellent user interface, personalized reading suggestions based on your tech stack, and those all-important analytics baked right into the platform.

They also have the ability to map their platform to your own domain, which presents a very interesting link-building opportunity, but more on this later.

And Hashnode has seen huge success with this recipe, with co-founder Sayed Fazle Rahman posting on LinkedIn about their meteoric growth.


Notes, Exceptions and Caveats: If you find this beneficial and have an open-source or commercial project you want me to take a look at, connect with me, Murrough Foley, on LinkedIn.

But as with many startups, over the years they have pivoted several times and this has led to a slightly incoherent SEO strategy.

Hashnode Search Visibility Woes & Opportunities

It would seem that the title was a little spicy to publish on the Hashnode.com platform, as within hours of posting, the mods deleted the post.

But anyway.

Why Is Organic Traffic Important To Tech Blogs?

As I wrote recently over at Dev.to, organic traffic is a force multiplier for any blogging platform. Users invest their time and resources in writing articles for a number of different reasons:

  • The need to share what they are interested in
  • Show competency in their area
  • Tick a box for certification renewal
  • Make connections with others with similar interests

But just like social networks, when we take the time to write, we want to know that the article will be read and that the right audience will read it.

Organic traffic creates a positive feedback loop for the platform. The better the organic visibility, the more page views, the more people are aware of the platform and then, naturally the more signups.

So what’s the deal with Hashnode SEO?

Ahrefs Organic Traffic Estimation

Below is the organic traffic estimation from Ahrefs. It is far from exact, being derived from monthly keyword search volumes and estimated SERP positions, but it does show general trends.

It doesn’t capture direct or social activity on the platform; it just provides some insight into organic search traffic from Google.


But this doesn’t tell the whole story does it?

No it certainly doesn’t.

Now, when you sign up with hashnode dot com you get a profile URL at hashnode dot com, and you also get a subdomain at hashnode.dev.

Here’s the traffic estimation for the .dev domain.


Things are definitely moving in the right direction here, but organic traffic to the user subdomains is still relatively small compared to Hashnode’s competition.

So why is that?

Using this site structure, splitting the content across different domains and subdomains, means that Hashnode users miss out on huge SEO opportunities.

https://hashnode.com/@MattPorterBridges - profile
https://mattlawrence.hashnode.dev - subdomain site
https://mattlawrence.hashnode.dev/the-weirdest-interview-ive-ever-prepped-for - post

You are going to have to explain that a bit more.

Okay, it’s about using the available resources to your advantage.

Domains, Sub-Domains & Age

Google knows and trusts the main domain – hashnode dot com. Google knows that the domain is associated with coding and technology, and it has a large number of links from other technology sites. The domain hashnode dot com has built authority in this niche and Google knows it.

But by creating the user sub-domains at hashnode.dev, user-generated content sees no benefit from the domain age and the thousands of links pointed at hashnode dot com.

I don’t get it

Imagine building a house with the aim of getting lots of organic traffic from Google. You get halfway through building the house on the domain hashnode dot com and then you stop. And start building a new house on the domain hashnode.dev.

Okay, so using the .dev tld wasn’t the best option from an SEO perspective, what else?

Structure-wise it gets worse. Google treats each subdomain as a separate domain. The shared benefits of having a very powerful domain with lots of content and lots of links don’t exist for any of the subdomains on hashnode.dev.

Give me an example

Both dev.to and hackernoon.com have a very simple site structure. They are both single root (mostly) domains with all the user generated content on the main domain. This means that all the content on the domain is seen as part of one site, and all the link equity to these domains is shared amongst all the pages.

To put it another way, these two sites are growing a very large tree that can be seen for miles around, but hashnode.com is planting thousands of little saplings that will never get seen.

Hashnode Is Losing Organic Traffic

Hashnode is an amazing platform that is always evolving and recently the team introduced a feature that allowed it to become a hosting platform. You can now map your domain to their servers and have a dev blog set up within minutes.

Without a doubt, this is a great feature and takes away a huge amount of hassle from getting started blogging on your own domain. I’m sure this will prove to be a hugely successful feature, but from an SEO perspective it does present some problems.

By adding this domain mapping feature, Hashnode are depriving the main domain of great content and future organic growth possibilities. This means they need an in-house content strategy that can produce targeted articles that rank and advertise the platform to new users.

But smart people make smart decisions

Yes, and no doubt Hashnode are aware of this and planning for the future.

Mysterious 307 Redirects

I have no ‘inside baseball’ on the goals of the engineering team, and they have created one of the most user-friendly and feature-rich platforms around. I’m sure the coding required to roll out full domain mapping was a milestone.

Hashnode dot com are using 307 redirects on their site as a way to forward traffic to the mapped domain or sub-domain.


Caveat: Hashnode block the Internet Archive bot in their robots.txt so my ability to chart the history of this feature/bug is limited. The team are obviously a smart bunch and they may have a very good reason for using the 307 instead of a 301.

But the issue with using a 307 here is that it’s a temporary redirect, which means Google expects the page to come back at this URL at some stage in the future and keeps it in its index. A 301 or 308 redirect would flush these pages out of the index and make Hashnode a more slim and trim site.
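To make the distinction concrete, here’s a minimal Python sketch of how a redirect audit might classify what Google is likely to do with each redirected URL. It only classifies status codes, no crawling involved:

```python
# Minimal sketch: classify redirect status codes by their likely effect
# on Google's index. Permanent redirects (301/308) signal that the old
# URL should be dropped and its signals passed on; temporary redirects
# (302/307) signal the old URL may return, so it tends to stay indexed.
PERMANENT = {301, 308}
TEMPORARY = {302, 307}

def keeps_old_url_indexed(status: int) -> bool:
    """True if the status code suggests Google will keep the old URL indexed."""
    if status in PERMANENT:
        return False
    if status in TEMPORARY:
        return True
    raise ValueError(f"{status} is not a redirect status code")

print(keeps_old_url_indexed(307))  # True: the page lingers in the index
print(keeps_old_url_indexed(301))  # False: the page gets flushed out
```

In a real audit you’d feed this from a crawler that requests each URL without following redirects and records the status code.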

What do you mean by slim and trim website?

Okay that’s a big one and an issue for many large sites.

Unnecessary Pages In The Google Index

Google is a business and as such has to make decisions that affect their bottom line. For years Google has applied an algorithmic penalty to sites that take up space in their index with low-quality or useless pages. You won’t get a notification in Google Search Console for an algorithmic penalty; you just won’t have the same search visibility as other slim and trim websites.


First of all what is the index?

Google’s bots constantly crawl the web to index its content, allowing us to perform the pretty remarkable feat of querying their database. Indexing all these pages and parsing the data takes huge resources, so Google is always looking for ways to reduce its costs and make its job easier.

This comes in the form of recommendations to developers and site managers and also technical standards like schema to make it easier for search engines to understand structured data.

But Google has long asked website owners to minimise the number of indexable pages on a site to those that provide the person searching with value.

Google recommends that webmasters limit the number of low-quality pages, sometimes called “thin content”, and doorway pages with the use of ‘canonical’ or ‘noindex’ tags.

In fact, as I write this, Google just announced a new update targeting ‘low quality pages’ that don’t add value.

So the idea is: be Google’s friend, reduce their costs, provide real value to the visitor, and they’ll give you some more visibility.

Okay, I’m with you, but how does this affect Hashnode?

It affects it in a pretty big way. Let’s take a closer look.

Are These Pages Providing Value To Someone Coming From A Search Engine?


In the screen grab above, Google is indicating that there are 200k pages in its index. And as we noted before, there are a bunch of pages that have been 307’d rather than 301’d, which take up about 50k of those.

So what else is there that doesn’t need to be?

Quite a bit of stuff, actually. Every new user that registers generates a profile page and two subpages.


If we just look at the quality of these three pages, we can easily surmise that the core profile page is the only one that benefits someone coming from Google. The following and followers endpoints provide very little value and little original content, and I can’t think of a question or query that either would answer.

Marking these pages ‘noindex’ (via a meta tag or X-Robots-Tag header, since noindex is not a robots.txt directive Google supports) or canonicalising them to the main profile page would greatly reduce the number of pages in the index and make the site more slim and trim.
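Either fix is easy to verify from the page source. Below is a small Python sketch using only the standard library’s html.parser; the sample followers-page markup is hypothetical, showing what the head might look like after the suggested fix:

```python
from html.parser import HTMLParser

class IndexSignalAudit(HTMLParser):
    """Collect the robots meta tag and canonical link from a page's HTML."""
    def __init__(self):
        super().__init__()
        self.noindex = False
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.noindex = "noindex" in a.get("content", "").lower()
        elif tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")

# Hypothetical followers-page markup after the fix: noindexed and
# canonicalised to the main profile page.
html = """<head>
<meta name="robots" content="noindex, follow">
<link rel="canonical" href="https://hashnode.com/@someuser">
</head>"""

audit = IndexSignalAudit()
audit.feed(html)
print(audit.noindex, audit.canonical)
```

Run against the live followers and following URLs, a script like this would quickly show which index signals are actually being sent.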

site:hashnode.com inurl:followers

There are others too that could be hunted down, but I’ll just use one last example below.


In many instances these two URLs serve almost identical content, and the /recent slug should be canonicalised to the main tag page to avoid duplicate content issues.

When looking at large sites in recent years, I’ve started to run a subdomain hunter to see if there are any issues that aren’t immediately visible from Google dorks and SEO scans. And whilst Hashnode have been pretty tidy, the ‘legacy’ subdomain should be noindexed to avoid indexing or duplicate content issues.


That’s quite the list of things to consider

But wait, there’s more. We haven’t even touched on links yet.


The above is a screengrab from Ahrefs showing the steady increase of links from new domains over time. This is exactly what any webmaster wants to see and usually accompanies a steady growth in site visibility and organic traffic.

But dig a little deeper through this list of the top linked pages and we start to see a different picture.

Vietnamese Spammers Working Overtime

It’s not just Vietnamese spammers, it’s everybody, but this type of tiered link building works a bit better in non-English geographies. I’ll just pull a couple of examples to show what’s going on.


I’ve highlighted three random profiles that all happen to be Vietnamese. Each of these profile pages links to the main website, and you can see that the spammers have built links to this page.

These types of automated links are pretty low quality, and if too much of a website’s link profile is made up of them, Google can penalise the site.

If Hashnode is acquiring great links from quality sources at the same time, this may not be an issue, as over the years Google has gotten better and better at filtering these toxic links out of their algo.

But what about this great/dangerous link building strategy you mentioned?

Okay, we will get to that now.

Hashnode As A Platform & Subdomains

Digging a little deeper into the link profile, a great number (almost half) of the links come from domains that were taking advantage of Hashnode’s blog functionality on their own domain or subdomain.

Below I have two examples.


After we exclude these footer links, the total number of linking domains comes out at “24,527”, which is a little less impressive. If we add in the pages that are canonicalised to other domains (meaning the link equity is passed to another page and domain), the authority of Hashnode is less compelling.

Numerous SEOs have called out this type of linking strategy over the years. You provide a service and encourage the user to add a badge to their domain that links back to your site, rank and profit. Sometimes this link is embedded in a WordPress plugin or sometimes it’s in a Theme footer.

But Google started stepping in on some occasions and giving manual penalties to sites that were abusing this type of link building tactic.

Many of the manual penalties that were handed out over the years have been baked into the algos that run from time to time. So whilst my example of WPZoom above had the benefit of knowing they had a penalty applied to their site, many site admins never get that message and have to deduce it for themselves when there is a significant dip in traffic.

The Hashnode.com team should bear this in mind over the coming years and, if their content push doesn’t get the expected traction, consider setting these footer links to rel=”nofollow” and waiting 3 months to see if there is a benefit.
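As a rough illustration, a platform-side change like this could be as simple as rewriting the footer anchor before it’s served. A hypothetical Python sketch, using a regex over a known, simple snippet rather than general HTML parsing:

```python
import re

def nofollow_links(snippet: str) -> str:
    """Add rel="nofollow" to any <a> tag in the snippet lacking a rel attribute."""
    def fix(match):
        tag = match.group(0)
        if "rel=" in tag:
            return tag  # leave any existing rel value alone
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r"<a\b[^>]*>", fix, snippet)

# Hypothetical footer badge of the kind mapped domains embed:
footer = '<a href="https://hashnode.com">Powered by Hashnode</a>'
print(nofollow_links(footer))
```

A regex is fine for a template snippet the platform controls; for arbitrary user HTML a real parser would be the safer choice.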

SEO Considerations When Looking At Page Types & Templates

I’m going to take a quick look at the page templates used across the site here. But it’s important to bear in mind that different pages have unique goals, and UX should always be one of the key considerations.

The Hashnode team have done a phenomenal job at creating a great blogging and social platform. These suggestions are purely from an SEO perspective.

But before that, there are two issues that need to be touched on first.

Problem: Baking In Content Silos To Site Structure

Bruce Clay wrote the definitive article on content silos 15 years ago and I won’t go into the same level of detail, but suffice it to say that Google will give a boost to groups of closely related content that are linked together in a methodical and hierarchical way.

A problem that developers have is how to integrate this structure for user-generated content on a platform that covers such a fast-paced industry, where new technologies emerge monthly.

Not an easy task when a user has no incentive to cross-link to other great content on your platform.

As indicated above, Google and the other search engines give a boost to topic expertise as indicated to them by lots of closely related content that is linked together.

Dynamic content (and links) is great for user experience; a constantly updating page shows off activity across the platform and reinforces the social media aspect of Hashnode. But what does Google think about it?

Googlebot will crawl a site at regular intervals and analyse the page content to best match it to user queries. If all or most of a page’s content is dynamic, how can Google know which user queries to match to the page?

Give me a real world example.

The online sports betting industry is probably the most competitive for search engine optimisation and any mistake can cost traffic which immediately impacts revenue.

A quick Google of one of the most profitable terms, “sports betting NFL” (link to Google, not a sports betting site), shows DraftKings at #1.

The page contains mostly dynamic content that changes as the calendar moves forward and odds change.

But DraftKings have 500 words of static content in an info box and FAQ at the bottom of the page. This adds context to the dynamic content on the page.


This is not the only way to do it, but it’s been a standard for many years.

So pages need some static content to give the dynamic content context, right?

Exactly right, and it’s not just static content that’s needed, but also static links.

Google and the other search engines use links to understand the relationships between content and the importance of pages. Static links between closely related topics are essential to build topical trust.

Okay, you are going to look at the individual page templates now, right?

We’ll take a quick look.

Page Types & Optimizations

All sites are made up of page templates that serve a purpose. That can be for user content organisation, as a home for the user, or for other technical reasons like helping Googlebot spider the whole site.

I’m going to break down the types of pages and a purpose for each. My view is skewed from an SEO perspective and that’s fine for the purposes here.

  • Content Pages: Informational posts on the primary domain – hashnode dot com.
  • Profile Pages: Pages that centralize information about a user and their activity.
  • Tag Pages: Pages that centralize information around a topic or subtopic.
  • Subdomain pages: Catch all term for user blogs under hashnode dot dev.

Content Pages

Purpose: On the main site, the primary purpose of these pages should be to provide answers to queries entered into Google and to provide interesting resources for the community that attract backlinks from external sites and funnel new visitors to sign up for an account.

The main site has relatively little content on it, which is rather surprising as it has accrued links and age and is ripe for a content marketing push.

I ran a bunch of pages through Screaming Frog and relatively few came back without a 307. But those that did had almost no internal linking to other related pages on the platform.

This issue is not unique to Hashnode and all the competitors in this space that I’ve looked at have the same issue. Hackernoon does have an editorial process, so they are best placed to deal with the problem.

To be explicit, the issue is that topically related pages should link to each other across the main domain.

Example Content Page

Here’s an excellent example. This listicle pulls in tonnes of organic traffic for the site.

It’s nothing complicated, just the typical listicle done well, with 30 topically relevant researched links, nice big images and a couple of lines about each linked resource, all targeting the key phrase set around “web developer portfolios“.

The listicle article type is working for hashnode, they should be targeting similar types of key terms like “web design sites for inspiration” or showcasing great content written by their active community.

Cross-linking these posts is achieved via a “Read Next” module at the bottom of the page.


The issue is that this module is restricted to other articles written by the same author. That’s fine on mapped domains and subdomains, but on the main Hashnode platform it means the links are not always topically relevant.

What if Didicodes writes a JavaScript tutorial one week and AWS best practices the next?

There is little SEO benefit to these links.

Profile Pages

Purpose: The main purpose of these pages is to rank for the member’s name and to centralize a place where their posts and comments are cataloged.

As it stands, these pages are beautifully organised and provide all the necessary information in a coherent way.

So what’s the issue?

There’s a small oversight in the use of H1s.


Googlebot does not authenticate or log in and, as such, sees two H1s in the source code. This is a technical SEO no-no and something we avoid. Two H1s could confuse Google as to the primary purpose of the page, which is to rank for the user’s name.
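The duplicate is easy to detect programmatically. A minimal sketch with Python’s standard html.parser; the sample markup is hypothetical, standing in for what a logged-out crawler sees:

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Count <h1> tags as a logged-out crawler would see them."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.count += 1

# Hypothetical profile-page source where both the site header and the
# profile name use <h1>; only the user's name should get one.
html = "<header><h1>Hashnode</h1></header><main><h1>Matt Lawrence</h1></main>"
counter = H1Counter()
counter.feed(html)
print(counter.count)  # 2
```

A check like this could run in CI against rendered templates and flag any page where the count exceeds one.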

It’s an easy oversight and an easy fix too.

Tag Pages

Purpose: The tag pages are the main hub around which the discussion happens. These pages contain links to articles and posts around the web (on mapped domains and the hashnode.dev subdomains). These tag pages help indexation and allow users to keep an eye on the tonnes of great content produced by the community.

Whilst these pages are fantastic for UX and help Googlebot find and index content, they aren’t targeting any particular keyword.

This is a real missed opportunity as these tag pages are automatically interlinked throughout the main hashnode dot com website and on tag pages on mapped domains and subdomains.

With all these inbound links, why not optimise the tag pages for organic traffic?

And how would you suggest doing that?

There is no SEO targeting on these pages. Take a look at the title, URL and meta description, which should be targeting the primary head keyword “JavaScript”.


And then take a look at the multiple dynamic h1s that will only confuse Google as to the main topic of the page.


Okay, that’s a big one. What else?

As mentioned above, these pages need some static content and static links to highly related shoulder topics.

Non-Dynamic Content


As highlighted above, the only static content on the page is a very limited box of 40-odd words.

Adding 500 to 1200 words could be done in a variety of ways that wouldn’t impact the site’s UX. This content would allow for contextually cross-linking relevant shoulder topics and give the page the opportunity to rank for hundreds of related keywords.

Cross Linking Other Relevant Tag Pages

On many of the tag pages, the highlighted boxes contain very similar links. There is an opportunity to manually link highly relevant shoulder topics rather than just repeating the same links.

Maybe the Python page would link to the top packages, Django or web scraping tutorials. These static links from the most interlinked pages on the domain would raise the tide for all pages.
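One hedged way to implement this is a hand-curated map of shoulder topics per tag, rendered as static links on the template. A Python sketch, where the tag slugs and URL pattern are my assumptions, not Hashnode’s actual routes:

```python
# Hand-curated shoulder topics per tag; editing this map is the "manual"
# part, so the links stay static and topically relevant.
SHOULDER_TOPICS = {
    "python": ["django", "flask", "web-scraping", "pandas"],
    "javascript": ["reactjs", "nodejs", "typescript"],
}

def shoulder_links(tag: str, base: str = "https://hashnode.com/n/") -> list:
    """Return static URLs for the tags most closely related to `tag`."""
    return [base + related for related in SHOULDER_TOPICS.get(tag, [])]

print(shoulder_links("python"))
```

Because the map is static, every tag page gets the same stable set of contextual links on each crawl, which is exactly the signal dynamic modules fail to send.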

Hashnode.dev Subdomains & Technical SEO

Freedom is great but perhaps we all need to work within certain boundaries. Hashnode provides users a fair amount of freedom on how they structure their pages. And anyone who is used to writing in markdown has their own flow.

But Hashnode, by allowing users to have the full freedom of markdown, has created a situation whereby a huge number of users have multiple h1s in their content.


It’s not ideal, but without a nudge or visual alert, users will continue to hamstring their own content by overusing the h1 tag on their pages.

Hashnode Content Strategy

Hashnode are well aware that their domain mapping feature is both a huge advantage and something that robs them of great user-generated content. I’m sure they know they need a major content push to capitalise on their growing position in the market.

Some of the questions around content that need to be answered are:

  • Organic traffic is a funnel, but who is the funnel targeting?
  • What type of content does Google already favor from Hashnode?
  • How best to leverage and highlight the existing user base and their great content?
  • How quickly can Hashnode scale to keep pace with their competitors in this space?

I have no doubt that Hashnode will overcome these challenges and continue rolling out great features on their platform.

Conclusion – Murrough Foley

Okay, this article has grown from a target of 2,500 words into another Sunday read. Hashnode are a fascinating platform with an interesting mix of issues and opportunities. To date, they have built a great, feature-rich platform, and smart people make smart decisions.

I’ll keep an eye on the site as it progresses and I wish them well.