Betpawa Uganda | An SEO Challenge 

Betpawa Uganda is part of the larger Betpawa group, which operates in a number of different countries in Africa. They give users the opportunity to bet on a range of different sports from across the world.

And they have been hugely successful with direct marketing and advertising in each of the regions where they operate.

So if Betpawa Uganda are making the big bucks and doing great, why is an SEO looking at their site?

Nobody likes to see missed opportunities and money left on the table. I took a look at this site 24 months ago and was curious whether my suggestions had been acted upon.

The Preamble

Betpawa Uganda is part of the larger Betpawa group, with domains and business across 9 other countries. I’m taking a look at the Uganda domain as I believe it’s the one with the most traffic. I do not work for Betpawa Uganda and I don’t have any inside baseball on their internal metrics. I am relying solely on third-party tools for traffic estimation.

The other consideration is that Betpawa have built their own platform from the ground up and refactoring code and systems to focus on SEO may be a challenge.

I’ll be using examples from international competitors in this piece, and allowances should be made for the different screen sizes common in different markets. I have no affiliation with these competitors and this is a constructive review.

The Current Situation

Betpawa get a tonne of traffic.

But by digging a little deeper, things begin to look a little less rosy.

After removing all the branded searches, estimated organic monthly traffic is in the low hundreds and not in the millions.

Betpawa are not catching organic traffic, fresh leads or first time depositors via their SEO strategy.

So what’s the deal?

There are a number of issues holding Betpawa back, primarily structural and technical.

Unnecessary Pages In The Google Index

As I wrote elsewhere recently, this is a huge problem for many platforms that haven’t had a technical SEO on board from the start of development. It’s not unique to Betpawa Uganda and it’s something I come across time and time again.

First of all what is the Google Index?

The Google index is where Google caches your webpage: it’s like a copy of the page on their servers. When Google’s bots roam the internet, they look at each page and decide whether it’s important or high quality. The ways it does this are numerous, but let’s keep it simple for now.

Google is always trying to give the user the best result to match the searcher’s intent.

If the search query is “free no deposit spins paypal” (links to Google, not a casino site), the first result is a high-quality page that loads fast and has a list of casino bonuses that allow for deposits with PayPal.

So Googlebot spiders a page, looks at the title, content, images, etc., and tries to decide the topic, what questions the page can answer, or whether the page should be in the index at all.


Google is a business, with costs, overheads and all the rest, and there is a cost associated with keeping a page in its index. Google wants to keep costs down and has for years been recommending that webmasters and SEOs keep their sites as slim and trim as possible.

So how does Betpawa fare?

Things look better than they did 24 months ago, but the site still doesn’t fare well.

Just for comparison’s sake, let’s take a look at some of the market leaders in online sports betting.

Okay, you’ve made your point.

The competition all have between 22k and 31k pages in the index. And assuming that there are only a limited number of sports, teams and competitions, where are all the extra Betpawa pages coming from?

A quick investigation of the site structure shows that there are a number of page types that shouldn’t be there, one of the most unnecessary folders is shown below.

But this doesn’t tell the whole story; we still have tens of thousands of pages that shouldn’t be there.

Here comes the Dev Vs SEO war.

From what I can see, the way the Betpawa platform is coded, it generates a new page for every event. Take a look at the pages below for the Chelsea vs Arsenal game, home and away.

So that’s 10 in total, where the max should be 3: home, away and the Community Shield.

This is a huge problem, but not just because it spams pages into the Google Index.

There’s another reason too.

Page Age, SEO & Google Trust

Two significant ranking factors are

  • Domain age
  • Page age

If I want to sell Christmas trees, I have to build the site in April. This gives Google enough time to understand the site and begin to trust it by the time Xmas comes around.

It’s no different with pages. When a page is brand new, Google (often) doesn’t trust and rank it immediately. This is to fight churn-and-burn SEOs, and there are some studies showing that it can take up to two years for a page to reach its full traffic potential. And just to be clear, when we talk about a page here, we really mean its URL (the address Google uses to find it) and, to an extent, the content on the page.

If you generate a new page for every game, you will never have the benefit of age for the yearly recurring search terms of TeamA vs TeamB, and you will be fighting for position against established pages every year.

Some betting sites will switch in new data as the fixture comes around again. Here is an example of BWin updating the content and keeping the page.

But of course that’s not the only way to skin a cat. Paddy Power keep their space in the index to a minimum by deleting pages which are no longer relevant.

But competitor pages, such as Paddy Power’s English Premier League page, get enormous targeted traffic because the URL is static.

Betpawa seem to have 4 similar pages in the index covering these search terms. It’s unclear if these different URLs are for different years/seasons or if they are an iteration of the interface.

If it’s due to an updated interface, that means every competition probably has 4 pages of the same content and no indication in the canonical tag as to which is the latest.
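A canonical tag would let Betpawa point all of the duplicate competition URLs at a single preferred version, so Google knows which one to index and rank. A minimal sketch, using hypothetical URLs rather than Betpawa’s actual paths:

```html
<!-- Placed in the <head> of each duplicate/older competition page. -->
<!-- The href below is a hypothetical "latest" URL, not Betpawa's real one. -->
<link rel="canonical" href="https://www.betpawa.ug/football/english-premier-league" />
```

Google treats the canonical as a strong hint rather than a directive, but combined with consistent internal linking it usually consolidates the duplicates into one indexed page.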


Schema

Schema is an increasingly important factor in SEO. Not just because this structured data tells Google explicitly about the information on the page, but also as a “click-through” factor that can help rankings.

Consider the following two search engine results:

The Paddy Power result takes up more space, provides more relevant information and even contains deep links to bet on the events themselves.

More and more sites across the web are implementing schema to better help Google understand the information on their pages. Whether that is event schema used across the betting industry or FAQ schema or any of the new types of schema being introduced all the time.
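As an illustration, the event schema used across the betting industry is typically embedded as a JSON-LD block in the page. The fixture, date and venue below are placeholders, not markup taken from any of these sites:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "SportsEvent",
  "name": "Chelsea vs Arsenal",
  "startDate": "2024-08-17T15:00:00+01:00",
  "location": { "@type": "Place", "name": "Stamford Bridge" },
  "homeTeam": { "@type": "SportsTeam", "name": "Chelsea" },
  "awayTeam": { "@type": "SportsTeam", "name": "Arsenal" }
}
</script>
```

Because this is generated from data the platform already holds (teams, kick-off times, venues), it is usually one of the cheaper SEO wins for a custom-built betting platform.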

Okay, so schema is good, and you can bake it right into the code.

Ugly URLs

The URL is the address of the page. It’s broken up into different parts including the domain and the slug as shown in the simplified example below.
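A simplified example (the domain here is a placeholder, not a real site):

```
https://www.example.com/football/english-premier-league
        |_____________||______________________________|
             domain                  slug
```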

When Google first visits a page, it will consider the title, the H1s and H2s, and the meta description. It may not digest the whole page. So it’s important to signal to Googlebot as clearly as possible exactly what the page topic is. This includes having the main keyword in the slug.

Here are the slugs for a couple of English Premier League betting sites.

And here is the Betpawa English Premier League URL.

From a UX point of view, this is not so much of a problem as Google usually picks up the BreadcrumbList schema as shown below.
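BreadcrumbList markup looks something like the following; the path names and URLs are hypothetical, chosen only to illustrate the structure:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Football",
      "item": "https://www.betpawa.ug/football" },
    { "@type": "ListItem", "position": 2, "name": "English Premier League",
      "item": "https://www.betpawa.ug/football/english-premier-league" }
  ]
}
</script>
```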

But it still means the site is missing out on any SEO benefit and the traffic that would result from making the change.

Progressive Web Applications & Search Engine Optimisation

So now we come to a very tricky topic, PWAs and rendering.

Progressive Web Apps (PWAs) were all the rage a couple of years ago in the dev sector. They allowed coders to build complex apps that worked in the browser. There was an explosion of interest across the web as developers built cross-platform masterpieces. But things soon cooled due to a simple consideration.

Rendering, and the small but important cost that comes with it.

With a PWA, a company shifts the rendering to the user’s browser. This has numerous benefits, but the computing needed to render the page (or endpoint) is done on the user’s device and not on the server.

If you are Google, your bots crawl the internet constantly and you have to pay the datacenter bills, so this small change could add up to a massive increase in costs: now you not only have to visit and interpret pages, you must also render these web apps.

Long story short, Google’s resources are vast but limited, and it caps the amount of resources it applies to these PWAs.

What’s this got to do with Betpawa?

Betpawa are only delivering the page head and a minimum of content via static HTML; the rest of the content is dynamically generated, as can be seen in the Google cache.
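This is the classic client-side rendering pattern: the static HTML a crawler initially receives is little more than an empty shell, with the real content only appearing after JavaScript executes. A hypothetical sketch of what such a shell looks like (not Betpawa’s actual source):

```html
<!-- What a crawler sees before JavaScript runs. -->
<html>
  <head>
    <title>betPawa</title>
    <!-- Hypothetical bundle path; the app is fetched and executed client-side. -->
    <script src="/static/js/app.bundle.js" defer></script>
  </head>
  <body>
    <!-- Empty mount point: odds, fixtures and markets are injected by the app. -->
    <div id="root"></div>
  </body>
</html>
```

Until Google schedules a second, rendered pass over a page like this, the only thing it can index is the title and whatever minimal content is in the shell.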

Without seeing server logs, I can’t say with certainty how big an issue this is. But all the competition are using server-side rendered solutions.


Betpawa have a tremendous business built on advertising and a great user experience, so do they need to change things for more organic traffic? Probably not, but it’s a pity to see such a missed opportunity.