News on the block is that Google has “evolved” the nofollow link tag. You are probably wondering what a nofollow link is and how this impacts you. Let me explain…
Nofollow links have been around for about 15 years; they were introduced in 2005 to help combat blog comment spam. Google wanted to stop spammy sites from ranking well in search results – understandably! Previously, Google ignored nofollow links entirely and counted only followed links in its ranking algorithms.
Fast forward to March 2020, and Google has started to roll out the second phase of a planned algorithm update that changes how search engines crawl and index nofollow links. Google will now treat nofollow as a “hint” about how to handle the link for crawling and indexing purposes. Ultimately, links that were previously ignored, and so contributed nothing to your search engine rankings, may now be considered alongside other factors when ranking a site.
Back in September 2019, Google also introduced two new link attributes to provide more context about the content you’re linking to:
- rel="sponsored" – for links related to advertising or sponsorship.
- rel="ugc" – for links that appear within user generated content, such as comments and forum posts.
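To illustrate (the URLs here are placeholders), this is how the attributes look in a page’s HTML:

```html
<!-- A paid or sponsored link -->
<a href="https://example.com/partner" rel="sponsored">Our sponsor</a>

<!-- A link posted by a user, e.g. in a blog comment -->
<a href="https://example.com/user-site" rel="ugc">Commenter's site</a>

<!-- The original nofollow, now treated as a hint rather than a directive -->
<a href="https://example.com/untrusted" rel="nofollow">Unvetted link</a>
```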
Google’s reasoning is that if it can look at all links, it can better spot unnatural linking patterns, and the new attributes give it more information about each link and how the site owner intends it to be used. Here are the key points to note:
- Link attributes are still important as they help avoid penalties.
- Google expects little to change with search results due to this change.
- You don’t need to change existing link attributes.
- Google recommends switching to “sponsored” attribute for links related to ads or sponsorships.
- You can use multiple attributes on one link, e.g. rel="sponsored ugc".
- This change should not result in more spamming as a lot of blog sites already have methods to deter link spam.
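Combining attributes is straightforward – the values are simply space separated within a single rel attribute (again, placeholder URLs):

```html
<!-- A sponsored link inside user generated content: both attributes apply -->
<a href="https://example.com/deal" rel="sponsored ugc">Affiliate deal</a>

<!-- You can keep nofollow alongside the newer attributes for compatibility
     with search engines that don't recognise sponsored/ugc -->
<a href="https://example.com/deal" rel="nofollow sponsored">Affiliate deal</a>
```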
However – if you were using nofollow links to prevent old, irrelevant pages from being indexed – this may change. Google has advised using a more robust method, such as disallowing crawling in robots.txt or adding a noindex robots meta tag, to control this. The last thing you want is an old, outdated page being indexed and showing in the search results to potential customers/clients!
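For reference, the noindex robots meta tag sits in the head of the page you want kept out of search results:

```html
<head>
  <!-- Tells search engines not to show this page in results -->
  <meta name="robots" content="noindex">
</head>
```

Note that robots.txt only blocks crawling; if you want a page reliably removed from results, the noindex meta tag is the more dependable option, because a page blocked in robots.txt can still be indexed via links from elsewhere.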
As this change started to roll out on March 1st, I would recommend comparing your search ranking and search visibility data from before and after the change.