Launched in 2016 to cover tech, entrepreneurship and productivity, Hackernoon has gone through an amazing transformation over the years.
From covering a very limited number of topics at the outset, David Smooke and team have built a platform that people love to publish on, with a stringent editorial process that ensures all the articles provide value.
This SEO review is part of a series that looks at similar platforms where people can publish their tech stories across the web. It grew out of my own curiosity about where best to publish, and each of the platforms in the series has its own positives and negatives.
You can use the menu below to jump to a particular section.
- Organic Traffic
- Why New & Old Projects Fail To Rank
- The Magic Of Internal Linking
- Hackernoon’s Unique Position
- The Evidence For Organic Traffic Growth Being Limited On Hackernoon
- Hackernoon Today
- Two Tests To Confirm The Issue
- Other SEO Issues That Need Some TLC
But first of all let’s take a look at where Hackernoon is at for organic traffic.
Why Is Organic Traffic Important?
Sites are set up and run in a variety of ways with different strategies for acquiring new users and producing revenue.
Dev.to is a great social site with a focus on community. It supports the open-source project Forem, which is used by a variety of enterprise sponsors and their internal teams. Whilst Dev.to wasn’t built from the ground up to capture new users from organic search, they’ve done a pretty good job.
Hashnode has also focused on the community aspect of their platform and coupled that with some pretty nifty features like domain mapping.
Has Hackernoon Organic Traffic Stagnated?
Below is a screen grab from Ahrefs’ organic traffic estimates. It does not reflect traffic from social or direct traffic, and even its organic traffic estimates are wildly inaccurate. Ahrefs, despite being a great tool, just can’t accurately track all the long-tail keywords, especially around the tech and emerging tech sectors.
It is useful to indicate the general trends of a site, and the below estimates should be seen in that light.
Hackernoon was making great progress until around April/May 2020 and then things seemed to drop off in a big way.
This analysis tries to isolate what happened at this time and how to fix it. Like all things in SEO, there is usually more than one factor at play, but in this case I believe I found the main issue.
Before I dive into it, and show the timeline, we need to set a bit of context first.
Why New & Old Projects Fail To Rank
Developers are first and foremost looking at functionality, then at the UX of a web app or site, and SEO often comes as an afterthought, unfortunately. Take a look at the sheer number of areas a developer has to be familiar with, then consider the growth of new technologies, and you can understand why SEO best practice sometimes drops through the cracks.
But the consequences of ignoring SEO can be pretty sobering, as visitors fail to find a new website or organic growth is stunted.
Tried & Tested Platforms
The majority of the internet is built on reliable, tried and tested platforms like WordPress (love it or hate it), Magento, Squarespace, phpBB and Shopify.
Many of these platforms are themselves commercial enterprises with a paying user base that won’t be happy if an update or change to the platform affects organic visibility. They have a very real commercial need to get the SEO bit right. Any disruption to organic rankings on their platforms will see an exodus of customers and a drop in income.
Interestingly, Squarespace have worked hard over the years to improve their reputation and woo the SEO crowd. I’m not sure it’s worked.
The point here is that most of the web relies on tried and tested platforms that know, through years of trial and error, what works and what doesn’t.
Sometimes a new technology comes along and we get so excited at the new possibilities that we don’t consider the bigger picture.
We Built This City On HTML
Google, Bing and the increasing number of new search engines were all built around a simple idea.
- Parse server side HTML
- Understand the topic and questions a page (or url) can answer
- Process the number and quality of inbound links
- Rate/rank the page accordingly for a variety of terms
- Serve the most useful to the user
This hasn’t really changed, despite all the new technologies and updates that cause hand-wringing. It’s still quality, targeted content and links.
Google Is a Business
Google is a business, and there are costs associated with every page crawled, every article parsed and every link followed. Time and time again they have asked webmasters and developers to help reduce their datacenter costs by a variety of methods, including:
- Limiting the number of indexable pages to quality, usable content
- Using Schema Markup to help parse and process information
- Simplifying the crawling process by using clean, well organised code
- Improving site crawl-ability via hierarchical and intuitive internal linking
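To make the Schema Markup point above concrete, here is a minimal sketch of the kind of JSON-LD "Article" object that structured data is built from. All the field values are hypothetical, invented purely for illustration:

```python
import json

# A minimal, hypothetical JSON-LD "Article" object of the kind that gets
# embedded in a page inside a <script type="application/ld+json"> tag.
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "10 Data Structure, Algorithms, and Programming Courses",
    "author": {"@type": "Person", "name": "Example Author"},
    "datePublished": "2020-01-15",
}

# ensure_ascii=False keeps legitimate non-ASCII characters readable instead
# of escaping them, which avoids one common class of encoding noise.
print(json.dumps(article_schema, ensure_ascii=False, indent=2))
```

Handing the crawler a clean, machine-readable summary like this is exactly the "help parse and process information" trade: less work for Google, better odds of rich results for the site.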
Google wants us as webmasters, SEOs, developers and entrepreneurs to make their job easy, reduce their costs and assist them in providing the best search results possible.
And for that we get improved search engine visibility for our projects, more clicks and visitors which may mean more income.
It’s a pretty easy to understand proposition.
A dynamic page or dynamic module can indicate the activity on a site, provide immediate feedback to a user, or be part of a full fledged app that works from your browser. Isn’t technology great?
But what’s great for the user isn’t always great for a search engine, and there are two main issues with client-side rendered content:
- It increases the costs for Google
- It may delay crawling or content may not be crawled at all
The takeaway is that a page should not rely completely on dynamic content. Important links and static content should be available to Googlebot’s initial crawl. This speeds up indexing and ensures that the page context (its primary topic) is understood and that the site is crawled quicker.
To be fair, Google has gotten much better at this over the last number of years and I’ve no doubt that the devs at Hackernoon have done a great coding job.
Is this the reason that Hackernoon’s organic traffic is flat? Probably not, but it does highlight the next issue.
The Magic Of Internal Linking
SEOs love links and we spend a large part of our time on the clock building them. Whether it’s an outreach email, or a controversial bit of content and its promotion, a large part of the work week is centered around building links.
But internal links are the best! They are free and, as a rule of thumb, 4 internal links are as good as one decent inbound external link. (This rule of thumb is nonsense – but useful if you need to quantify and prioritise internal linking to a customer or client who needs a cost-benefit ratio.)
I’m not going to go into the nitty gritty of content silos, Bruce Clay wrote about it 15 years ago and not a lot has changed since then. A large amount of highly related content, interlinked in a hierarchical way, will increase visibility for all the linked pages. It shows Google that a domain has in-depth specialist knowledge and you get a boost for covering a topic from several different angles.
Hackernoon & Internal Linking Topics
After looking at a number of similar platforms over the last number of weeks, one issue crops up again and again with sites that leverage user generated content.
A lack of internal linking and an over-reliance on the “similar articles module”.
Unfortunately across the multiple platforms I’ve looked at, users don’t cross-link similar articles. And why would they? There is no prompt or visual aid to help find other great content on the platform about the topic they are writing about.
Most of these sites rely on a “similar articles module” to provide these internal cross-links as suggestions at the end of the article.
But there are two issues with the way that Hackernoon have implemented this module.
- Unrelated internal articles linked
- Module only loads on scroll.
Most of these modules use a mix of things to generate the related articles including; the user, the tags and the date.
Hackernoon’s suggestions are completely dynamic (this is a problem) and it shows articles based on date – i.e. the freshest articles. This is great for UX and for highlighting fresh articles throughout the site. But…
By not linking articles on similar topics, and by using dynamic links based on recency, Hackernoon misses out on the opportunity to build any content silos and the search visibility that goes with them.
Perhaps this is by design but it’s awful for SEO.
Related Articles Module Only Loads On Scroll
The “related articles” module is loaded into the DOM by a user scroll event, so these links may not be crawled or recognised by Google at all.
Google’s renderer can execute some JavaScript, but in my tests the Hackernoon “related articles” and “tags” modules aren’t triggered, and so wouldn’t be crawled.
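A quick way to see the problem is to look only at what a first-pass fetch of the page returns, before any JavaScript runs. The sketch below uses Python’s standard-library HTML parser on a simplified, invented stand-in for the server-rendered HTML; the real page is obviously more complex, but the principle is the same:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects every href found in the HTML it is fed."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# Simplified, hypothetical stand-in for the server-rendered HTML that a
# crawler fetches first. The related-articles module is empty because its
# links are only injected by JavaScript on a scroll event.
initial_html = """
<article>
  <h1>10 Data Structure Courses</h1>
  <a href="/tagged/programming">programming</a>
  <div id="related-articles"><!-- filled in by JS on scroll --></div>
</article>
"""

collector = LinkCollector()
collector.feed(initial_html)
# Only the server-rendered tag link is visible to the first-pass crawl;
# the related-article links never appear in the initial HTML at all.
print(collector.links)  # → ['/tagged/programming']
```

If the internal links you care about don’t show up in a pass like this, you are betting entirely on the rendering queue and on the scroll event being simulated, which in my tests it isn’t.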
Hackernoon’s Unique Position
Hackernoon has developed a great reputation over the years and there are some very talented people who take the time to write on the platform. An important part of developing this reputation for quality in-depth articles is the Hackernoon editorial process.
Low quality spammy pieces don’t make the grade and editors work with content creators to make sure articles hit a balance of technical or specialist insight and readability.
This editorial control baked into the platform offers Hackernoon a huge advantage over its competition. The editorial process allows the Hackernoon team to add internal links to other relevant (evergreen) articles on Hackernoon.
I’m unsure whether the site’s terms of service would allow this to be done retroactively. I’m guessing it would be, outside of the corporate partners Hackernoon has developed over the years.
The Evidence For Organic Traffic Growth Being Limited On Hackernoon
From the Ahrefs organic traffic estimation tool, we can see that the issue reared its head in March/April 2020. These traffic estimation tools take a little while to register lower-volume keywords, so what looks like a decline over the course of weeks could have been an immediate drop across the board.
For illustration purposes, I’m going to take a look at the post, “10 Data Structure, Algorithms, and Programming Courses to Crack Any Coding Interview”.
Here is the full list of archived pages which you can sort via “Captures” and below are the captures relevant to our discussion.
This is a great article that includes custom images and in-depth explanations, and is really well written. It garnered 99 linking domains for Hackernoon and is not canonicalised to another site – meaning the links benefit Hackernoon rather than some other, canonical site.
So what happened?
Hackernoon, as you would expect, has evolved over the years, adding features, refining the user experience and speeding up its loading time. The areas that we are interested in are the “tags” and “related articles” modules located at the bottom of the page.
Hackernoon November 2018
In late 2018, it looks like the tag pages are hardcoded and there are two profile cards at the bottom of the article.
It doesn’t look like there is a “similar articles” module, but it may have been loaded in a different way.
Hackernoon August 2019
There was an iteration throughout early 2019, and in August the bottom section contained:
- Links to the tag pages
- More by author
- More related stories
Interestingly, the linked articles in “related stories” don’t change or update over time. They look static which creates permanent connections between related pages and is great for SEO.
Hackernoon May 2020
As we saw from the traffic estimation tool, May saw a huge drop in traffic site wide. So what happened in May?
The tags links are still there, but the direct links to the authors other posts has been removed and the related stories module has been reworked completely.
The related articles module now uses a dynamic process to show the fresh and sponsored articles that have little topical relevance to the rest of the content on the page.
Remember that, given the way the “tags” and “related stories” modules are now loaded, it’s highly unlikely that these links are seen at all. I would have to dig into the server logs to confirm this.
Regardless of whether Googlebot parses these links, you can see that suggested articles are based on time rather than topic and so don’t provide an SEO benefit.
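As a sketch of what a topic-based alternative could look like, related articles can be ranked by tag overlap rather than publish date. Everything below – the articles, their tags and the scoring – is invented for illustration, not Hackernoon’s actual implementation:

```python
def related_by_tags(current_tags, candidates, limit=3):
    """Rank candidate articles by how many tags they share with the
    current article, rather than by how recently they were published."""
    scored = []
    for title, tags in candidates:
        overlap = len(set(current_tags) & set(tags))
        if overlap > 0:  # only suggest articles with at least one shared tag
            scored.append((overlap, title))
    scored.sort(reverse=True)  # most tags in common first
    return [title for _, title in scored[:limit]]

# Hypothetical articles and tags, purely for illustration.
candidates = [
    ("Intro to Rust", ["rust", "programming"]),
    ("Cracking the Coding Interview", ["interview", "algorithms"]),
    ("Big-O Explained", ["algorithms", "data-structures"]),
    ("This Week in Crypto", ["crypto"]),
]

print(related_by_tags(["algorithms", "data-structures", "interview"], candidates))
```

Even a crude score like this would link the coding-interview article to other algorithms content instead of to whatever happened to be published that morning, which is what starts to build a content silo.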
Two Tests To Confirm The Issue
Nobody wants to do work unnecessarily, and if you are proposing major changes to a customer or client, it makes sense to show them low-risk proof that the invested time and money will pay off. I’m sure the developers at Hackernoon are no different. So here are two low-risk, low-effort ways to confirm that the “related articles” module needs to be rethought.
Reworking the “related articles” module would be a huge endeavour that would require a lot of thought, consultation and testing. For a quick test this is not feasible.
The next obvious choice is to take a look at the “tags” module; loading this with the main content should be a lot easier and would allow the huge amount of link equity to move through the site.
I would expect to see site-wide improvement within about 2–3 months of this change, with recent articles in the tag pages benefiting the most.
Choose Two Topics & Manually Interlink
Interlinking within the body of an article is the most powerful type of internal link as Google is able to look at the context of the hyperlink and draw from the anchor text to understand the target page. The Hackernoon editorial process probably allows for the team to go in and make small edits such as adding internal links.
So how do you choose a topic or set of articles and track them? Ideally you want to look for pages that are on page 2 for a keyword or group of keywords, add the internal links with related anchors to the set of pages, and wait.
The coding interview URL from the example above is a reasonable place to start, as Hackernoon has a large number of related articles that could be linked.
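Finding those page-2 candidates is easy to script. The sketch below filters a Search Console-style export for queries sitting at positions 11–20; the column names, URLs and numbers are all assumptions for illustration, not real Hackernoon data:

```python
import csv
import io

# Hypothetical Search Console export; in practice you would read this
# from a downloaded CSV file rather than an inline string.
export = io.StringIO("""\
page,query,position
/coding-interview-courses,data structure courses,12.4
/coding-interview-courses,algorithms interview,3.1
/python-tips,python tricks,15.8
/crypto-news,bitcoin today,45.0
""")

# "Page 2" of Google is roughly average positions 11-20: close enough to
# benefit quickly from a few well-anchored internal links.
page_two = [
    (row["page"], row["query"], float(row["position"]))
    for row in csv.DictReader(export)
    if 11 <= float(row["position"]) <= 20
]

for page, query, pos in page_two:
    print(f"{page}: '{query}' at position {pos:.1f}")
```

Run before the test, run again 4–8 weeks after adding the internal links, and you have a cheap before/after comparison to show the team.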
Other SEO Issues That Need Some TLC
There were some other things I came across that could do with a little attention, so in no particular order:
The tag pages provide little or no inbound traffic. Do they need to be indexed? Why not add a couple of hundred words per page to add context to the list of internal links and at least give them some value beyond the UX perspective.
Profile pages don’t contain an H1. If the SEO purpose is for the user’s profile to be found high in the SERPs when the name is Googled, this lack of a clear H1 with the username is an oversight.
However the article schema is generated from the body of the content, it’s full of encoding errors. If the idea of schema is to provide data in a clean and structured way, is this being achieved currently?
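Catching this class of problem is cheap to automate: extract each JSON-LD block from a page and simply try to parse it. The blobs below are invented examples of one clean block and one with the kind of encoding debris I’m describing:

```python
import json

# Two hypothetical extracted JSON-LD blobs: one clean, one with a stray
# HTML entity and an unterminated string left over from a bad encode.
raw_blocks = [
    '{"@type": "Article", "headline": "Clean headline"}',
    '{"@type": "Article", "headline": "Broken &amp;quot; headline}',
]

def validate(blocks):
    """Split blocks into parsed objects and JSON parse-error messages."""
    ok, errors = [], []
    for block in blocks:
        try:
            ok.append(json.loads(block))
        except json.JSONDecodeError as exc:
            errors.append(str(exc))
    return ok, errors

ok, errors = validate(raw_blocks)
print(len(ok), "valid,", len(errors), "invalid")
```

A check like this in the publishing pipeline would flag broken structured data before it ever reaches a crawler.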
Hackernoon is a great site and has built an industry wide brand that is synonymous with insightful and entertaining tech content. They’ve built partnerships with the top tech companies in the world and steadily introduced great new features to the platform.
With a huge number of inbound linking domains and all this content, plus topical interlinking baked into the editorial process and a re-engineered “related articles” module, I have no doubt their organic traffic would sky-rocket.