9 Things That Hurt Your Website Rankings


Google considers hundreds of factors before ranking a website on the search engine results page. The frustrating part is that the search engine shares very few of the signals it uses, safeguarding this information to prevent SEO manipulation. However, SEO experts have learned a few things that will hurt your website rankings.

We'll look at a few of them below:

1. No SSL Certificate

In 2014, Google announced that HTTPS would be a ranking signal: given two sites that are otherwise similar, the one with an SSL certificate (https://) would rank higher than the one without.

SSL is short for Secure Sockets Layer. An SSL certificate adds an extra layer of security to protect the information visitors share with a site, establishing an encrypted link between the browser and the server so that all data passing between them remains private.

It’s difficult to measure the exact impact an SSL certificate has on a website’s ranking because Google weighs it alongside many other factors, so SEO experts should not expect an instant boost in rankings after installing one.

Several case studies on the impact of HTTPS on ranking have found that SSL certification correlates with appearing on Google’s first page of results.
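If your site still serves plain HTTP, the usual first step after installing a certificate is to redirect every HTTP request to its HTTPS equivalent. Here’s a minimal sketch of that redirect, assuming a Node server built on Express (the certificate itself would come from your host or a provider such as Let’s Encrypt):

```ts
// Sketch only: force HTTPS on an Express app. Assumes a certificate is
// already installed at the web server or load balancer.
import express, { Request, Response, NextFunction } from "express";

const app = express();

// Redirect any plain-HTTP request to its HTTPS equivalent with a 301 so
// that visitors and crawlers always land on the secure version.
app.use((req: Request, res: Response, next: NextFunction) => {
  // Proxies and load balancers commonly set "x-forwarded-proto".
  const proto = req.headers["x-forwarded-proto"] ?? req.protocol;
  if (proto !== "https") {
    return res.redirect(301, `https://${req.headers.host}${req.originalUrl}`);
  }
  next();
});

app.get("/", (_req, res) => res.send("Served over HTTPS"));
app.listen(3000); // illustrative port
```

A permanent (301) redirect also signals to Google that ranking credit should move to the HTTPS version of each page.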

2. Slow Loading

In 2010, Google stated that page speed would affect how a website ranks on SERPs. Page speed is the time a website takes to load. Most e-commerce websites take around 7 seconds to load, but the ideal load time is 3 seconds or less.

Loading speed is a function of multiple factors, including the visitor’s connection type, the processing power of their device, and the apps running on it, to mention a few. However, that does not mean web developers can be careless when optimizing sites for ranking. Research shows:

  • A one-second delay in page load reduces page views by 11%
  • A one-second delay decreases customer satisfaction by 16%
  • A one-second delay reduces the conversion rate by 7%

The Google algorithm penalizes websites with slow loading speeds by ranking them lower. Combined with the relatively short time Google’s crawler spends on each site, the chances of a slow-loading website ranking high are pretty slim.

Ways a website can improve loading speed include:

Compress files

CSS, HTML, and JavaScript files can be compressed with tools like Gzip before they are sent to the browser. Images can be compressed using programs like Lightroom or Photoshop.
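How compression is enabled depends on your stack. As one illustration, here is a minimal sketch that gzips responses from a Node/Express server using the compression middleware (Apache and Nginx have equivalent built-in modules):

```ts
// Sketch only: gzip responses from an Express server using the
// "compression" middleware (npm install compression).
import express from "express";
import compression from "compression";

const app = express();

// Responses larger than ~1 KB are compressed for browsers that send
// "Accept-Encoding: gzip", shrinking HTML, CSS, and JS payloads.
app.use(compression({ threshold: 1024 }));

// Static assets are compressed on the fly as well.
app.use(express.static("public")); // "public" is an illustrative folder

app.listen(3000);
```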

Reduce redirects

When a page redirects to another, visitors have to wait for a request-response cycle between the server and the browser. Reducing the number of redirects will improve load speed.
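To see how many hops a URL actually goes through, you can follow its redirects by hand. The sketch below is one way to do that, assuming a Node 18+ runtime where fetch is built in and manual redirect mode exposes the 3xx response:

```ts
// Sketch only: count how many redirect hops a URL goes through.
// Assumes Node 18+, where fetch is available globally.
async function countRedirects(startUrl: string, maxHops = 10): Promise<number> {
  let url = startUrl;
  let hops = 0;

  while (hops < maxHops) {
    const res = await fetch(url, { redirect: "manual" });
    const location = res.headers.get("location");

    // Anything other than a 3xx with a Location header ends the chain.
    if (res.status < 300 || res.status >= 400 || !location) break;

    // Resolve relative Location headers against the current URL.
    url = new URL(location, url).toString();
    hops += 1;
  }
  return hops;
}

// Illustrative URL.
countRedirects("http://example.com/old-page").then((hops) =>
  console.log(`Redirect hops: ${hops}`)
);
```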

Use browser caching

Browsers can cache much of a page’s content, such as stylesheets, scripts, and images, so they don’t have to reload the entire page when a user returns to a website. Setting appropriate cache headers tells the browser what it can safely reuse.
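The exact headers depend on your server, but as a rough sketch, an Express site could set long-lived Cache-Control headers on its static assets like this:

```ts
// Sketch only: long-lived browser caching for static assets on Express.
import express from "express";

const app = express();

// Cache-Control headers let returning visitors reuse CSS, JS, and images
// from their local cache instead of downloading them again.
app.use(
  "/assets",
  express.static("public", {
    maxAge: "30d", // example lifetime; use long values for fingerprinted files
    etag: true,    // allows cheap revalidation once the cache expires
  })
);

app.listen(3000);
```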

Improve server response time

Factors such as the hosting solution and the amount of traffic a website receives can slow a server’s response time. Identifying potential bottlenecks like insufficient memory and slow database queries can improve loading speed.
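Before and after any tuning, it helps to measure. Here’s a small sketch that times a full response from the server, again assuming a Node 18+ runtime with built-in fetch:

```ts
// Sketch only: time a full response from the server so slow pages can be
// spotted before and after tuning. Assumes Node 18+ for built-in fetch.
import { performance } from "node:perf_hooks";

async function timeResponse(url: string): Promise<number> {
  const start = performance.now();
  const res = await fetch(url);
  await res.arrayBuffer();           // wait for the complete body
  return performance.now() - start;  // elapsed milliseconds
}

// Illustrative URL.
timeResponse("https://example.com/").then((ms) =>
  console.log(`Response time: ${ms.toFixed(0)} ms`)
);
```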

3. No Internal Linking

The primary objective of internal links is to help visitors and search engines discover new content. Websites continuously publish new material, and it’s up to the site owner to help Google find it. Some pages are discovered when you link to them from a page Google already knows about, while others are found directly by Google’s crawler.

Internal links also help rank a page higher. The more internal links a page has, the higher it tends to rank. Note that it’s not just the number of internal links that matters but their quality. High-quality internal links make it easier for pages to be indexed and ranked because they directly influence other ranking factors, such as page importance and click depth. These links send clear signals about how Google should prioritize the pages on your website. Here are some principles of internal linking:

  • More internal links pointing to a page signal more value. They help Google gauge the importance of a page relative to the other pages on a website.
  • Google considers 1,000 a reasonable number of links per page, including links in the header, footer, menu, and sidebar.
  • Internal links from strong pages can give a significant boost to low-value pages.
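To get a feel for how your internal links are distributed, you can count them page by page. The sketch below does this with a simple regex, which is fine for a rough audit (a real crawler would use a proper HTML parser); it assumes Node 18+ for the built-in fetch:

```ts
// Sketch only: count internal links on a page for a rough audit.
// Uses a simple regex for brevity; a real audit would use an HTML parser.
// Assumes Node 18+ for built-in fetch.
async function countInternalLinks(pageUrl: string): Promise<number> {
  const html = await (await fetch(pageUrl)).text();
  const site = new URL(pageUrl).hostname;
  const hrefs = [...html.matchAll(/href="([^"#]+)"/g)].map((m) => m[1]);

  return hrefs.filter((href) => {
    try {
      // Relative links and links to the same hostname count as internal.
      return new URL(href, pageUrl).hostname === site;
    } catch {
      return false; // skip hrefs that are not valid URLs
    }
  }).length;
}

// Illustrative URL.
countInternalLinks("https://example.com/blog/post").then((n) =>
  console.log(`Internal links: ${n}`)
);
```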

4. Page Layout

Google rewards websites that enhance the user’s experience. Today, online visitors are more inclined to visit sites with a clean layout that provides quick and easy access to information. A strategic design paired with thoroughly researched, relevant content should boost your website rankings.

User experience is not just about the layout of the pages on the website but a focus on metrics such as:

Bounce rate

The bounce rate is the percentage of web users who leave your site after viewing a single page. A high bounce rate suggests users have not found what they are looking for and may indicate that your website is poorly designed, the UX is unappealing, or the content is poorly formatted.

On-page time

The time visitors spend on a page shows how engaged they are. A high on-page time shows the content is interesting and provides the information users are looking for.

URL structure

The URL structure should communicate what the page is about and orient the user to the overall hierarchy of the website. Site owners should focus on creating coherent URLs that can be understood by the search engine crawlers and users alike.
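One practical piece of this is generating clean, readable slugs from page titles. A small helper like the sketch below (the exact rules are up to you) keeps URLs short, lowercase, and keyword-bearing:

```ts
// Sketch only: turn a page title into a clean, keyword-bearing URL slug.
function toSlug(title: string): string {
  return title
    .toLowerCase()
    .normalize("NFKD")                 // split accented characters
    .replace(/[\u0300-\u036f]/g, "")   // drop the accents
    .replace(/[^a-z0-9\s-]/g, "")      // strip punctuation
    .trim()
    .replace(/\s+/g, "-");             // spaces become hyphens
}

console.log(toSlug("9 Things That Hurt Your Website Rankings"));
// "9-things-that-hurt-your-website-rankings"
```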

Fresh content

Fresh content not only attracts new visitors but also helps retain existing ones. New content draws in visitors who have not seen your website before and can help convert them into customers through avenues like content marketing, conversion rate optimization, and retargeting.

Search engines love fresh content, and Google has a ranking factor called Query Deserves Freshness (QDF). Search engines use crawlers to find and index new content, and when they find fresh pages with new, relevant material, the website may rank even higher. Websites without fresh content give crawlers less reason to return. Consistently creating new content encourages Google and other search engines to crawl and re-index the pages on your website, which improves ranking.

5. Bad or Ugly Web Design

Sometimes even highly attractive website designs fail to achieve the best rank. This is because SEO is not only based on visual optimization but, more importantly, text optimization. A visually appealing website does not guarantee a higher rank on SERPs.

Google’s algorithms are designed to give higher rankings to sites that are easy to access and navigate. Websites with great designs can have higher rankings, but the look has to be practical.

Some web design aspects you should consider when optimizing your website for higher rankings include:

Keywords

Keywords should be strategically placed to improve rankings. Each page’s URL, for example, should contain a keyword. Title tags also improve rankings by helping Google understand what each page is about. Other places to include keywords are alt tags, body copy, and heading tags.

Navigation

A website should be easy to navigate, meaning pages should be easy to find and reach. Broken site architecture frustrates search engine crawlers and visitors alike. Using content hierarchies, for example, provides an opportunity for the website to rank higher for general terms and long-tail keywords.

Formatting and appearance

Search engines like Google reward text that is easy to scan. Elements such as bulleted lists, numbered points, bold text, and short paragraphs will improve a website’s readability and appearance.

Security

Web design also involves data protection and the type of security used. As earlier highlighted, websites using HTTPS encryption are ranked higher because they protect users.

6. Lack of Expertise, Authority, and Trust (E-A-T)

Google aims to ensure that websites displayed on SERPs demonstrate a high level of expertise, authority, and trustworthiness, hence the acronym E-A-T. According to Google’s Search Quality Evaluator Guidelines, the goal is to reduce the likelihood of surfacing low-quality content.

The guidelines have been updated over time to reflect content and website quality needs. Here’s what the E-A-T guidelines entail:

Expertise

According to these guidelines, Google ranks websites with expert content higher than ordinary sites. That means the content should meet the needs of its audience directly and should also anticipate and answer related questions the visitor might have.

Authority

You also want your content to be a source of information for other experts and influencers. Metrics used to assess a site’s authoritativeness include links from authority sites, content shared across social media, mentions on authoritative websites, and a high volume of branded searches.

Trustworthiness

You also need to rein in negative publicity around your company. Google pays close attention to negative comments, and too many bad reviews are a sign of low quality that may cause the search engine to lower your site’s rank.

7. Lack of Focus on the Right Keywords

Many SEO experts rely too heavily on keyword suggestion tools to generate keywords for their websites. The problem with software-generated keywords is that they lack personalization. If you’re not careful, you might end up using the same popular keywords more often than you should, which amounts to keyword stuffing.

The best way to generate keywords is to use the actual keywords people use to find information. This way, you can identify keywords that drive the most traffic and lead to conversions. Long-tail keywords, for example, show more intent than shorter keywords. They are also less competitive and will help your site rank higher.

You also need to group and organize the keywords to enhance strategic marketing initiatives. Organized keyword groups contribute to:

Targeted landing pages

Optimized landing pages improve conversion rates, while landing pages that match their ads and keywords improve the Quality Score.

Targeted PPC ads

Organized keyword groups enable you to create more targeted PPC ads, which increase click-through rates and lead to higher rankings for keyword advertising campaigns.

Better information architecture

Keyword organization improves a site’s structure and hierarchy, ultimately improving its rank.

8. Broken Links

Research shows that 88% of online visitors leave a page after a bad experience. One such experience is clicking a broken link: a link that, when clicked, does not lead to the desired page or leads nowhere at all.

Some causes of broken links include:

  • Changes in domain names
  • Moving the site to a new URL
  • Moving to a new page and failing to update the internal links
  • Deleting an image, file or video
  • Making a typo when creating a link

Broken links affect a site’s ranking in two main ways:

They hurt the user experience

If a site has several broken links, they degrade the user’s overall experience, which is a critical factor in how a website is ranked.

They undermine your SEO efforts

When search engines find a broken link, they spend time verifying whether the link is indeed broken. That wastes crawl budget that could be spent on working links, and Google will rank your website lower despite the rest of your SEO efforts.
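Broken links are easy to catch with a periodic check. Here’s a rough sketch that requests each URL in a list and flags anything that errors out or returns a 4xx/5xx status, assuming Node 18+ with built-in fetch (some servers reject HEAD requests, in which case a GET works too):

```ts
// Sketch only: flag broken links in a list of URLs. Assumes Node 18+ for
// built-in fetch; some servers reject HEAD, in which case use GET instead.
async function findBrokenLinks(urls: string[]): Promise<string[]> {
  const broken: string[] = [];

  for (const url of urls) {
    try {
      const res = await fetch(url, { method: "HEAD" });
      if (res.status >= 400) broken.push(`${url} -> HTTP ${res.status}`);
    } catch {
      broken.push(`${url} -> unreachable`);
    }
  }
  return broken;
}

// Illustrative URLs.
findBrokenLinks(["https://example.com/", "https://example.com/old-page"]).then(
  (broken) => console.log(broken.join("\n") || "No broken links found")
);
```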

9. Missing, Poor, or Duplicate Metadata

Even SEO experts sometimes fail to include metadata or end up duplicating it across pages. Yet metadata plays an essential role in attracting visitors to your website: meta titles and descriptions act as mini-ads optimized to increase the click-through rate.

SEO professionals should give as much attention to metadata as they do to other ranking factors.

Common metadata optimization mistakes to avoid include:

Duplicating meta descriptions

Write a unique description for every page. According to Google, meta descriptions are short blurbs that convince users the page is precisely what they are looking for.

Not including keywords

Meta descriptions should include keywords that are relevant to the page’s content and that meet the needs of the user.

Focusing on character count

A length of 135-160 characters has long been the standard for meta descriptions. That changed in 2017, when Google began displaying descriptions as long as 320 characters. Even so, SEO professionals should not waste time padding meta descriptions just to make them longer.
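A quick automated pass can catch all three of these mistakes at once. The sketch below fetches a list of pages and flags descriptions that are missing, duplicated, or well outside the typical length range; it assumes Node 18+ and uses a simple regex rather than a full HTML parser:

```ts
// Sketch only: flag missing, duplicate, or oddly sized meta descriptions
// across a list of pages. Uses a simple regex rather than an HTML parser;
// assumes Node 18+ for built-in fetch. Length thresholds are adjustable.
async function auditMetaDescriptions(urls: string[]): Promise<void> {
  const seen = new Map<string, string>(); // description -> first URL using it

  for (const url of urls) {
    const html = await (await fetch(url)).text();
    const match = html.match(
      /<meta\s+name=["']description["']\s+content=["']([^"']*)["']/i
    );
    const description = match?.[1]?.trim() ?? "";

    if (!description) {
      console.log(`${url}: missing meta description`);
    } else if (seen.has(description)) {
      console.log(`${url}: duplicates the description on ${seen.get(description)}`);
    } else {
      if (description.length < 70 || description.length > 160) {
        console.log(`${url}: description is ${description.length} characters`);
      }
      seen.set(description, url);
    }
  }
}

// Illustrative URLs.
auditMetaDescriptions(["https://example.com/", "https://example.com/about"]);
```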

If you need help improving your website’s rankings, contact us today. Don’t let your competitors keep getting the traffic that should be yours.