So, another BrightonSEO conference has been and gone, and just like April, it was great. An incredible line-up (as usual), great topics and knowledgeable speakers, and a couple of free t-shirts!
Yet again, my biggest disappointment is that I couldn’t attend all of the talks. After all, there’s only one of me (much to the relief of anyone who knows me).
Here’s a quick recap of a few of the talks I attended.
The Future Of SEO
Greg Gifford, Vice President of Search at Wikimotive
First talk of the day was Greg Gifford. First takeaway: this guy talks like me! Really, really quickly, with a few bad words here and there. He was lively, engaging, and made the topic interesting and fun. Plus, his slides had 92 horror film recommendations (always a win in my eyes).
His talk was titled “Beetlejuice’s Guide to Entities and the future of SEO”. It featured some awful, and therefore incredible, puns.
Takeaways
- Entities (I’ll explain what these are in a moment) are the most important concept in SEO; they underpin pretty much everything we do.
- To perform better in SEO, stop focusing on tiny tweaks here and there. Focus on providing the best answer to a searcher’s intent.
- Google is beginning to use real-world signals to help rank websites (these can’t be faked).
- Pay attention to Google patents. These will literally tell you what Google plans to do next.
- Stop focusing solely on keywords and backlinks.
Entities and how they relate to search
Entities in SEO can be anything; they are the basis of everything we do. Google’s understanding of an entity is that it is ‘a thing or concept that is singular, unique, well-defined and distinguishable’.
Google used to look at keywords rather than the actual meaning and intent behind a search. The introduction of the Knowledge Graph in 2012 was Google’s first step towards “Entity Search”.
With Hummingbird in 2013, the focus began to shift towards semantics. Then in 2015 and 2016 respectively, Google applied for patents for “ranking search results based on entity metrics” and “question answering using entity references in unstructured data”. From these points alone, it is clear that Google is focusing on entity-based search.
So how does this actually relate to search?
Real-world signals – these can’t be faked. Offline actions will help website rankings.
Search intent – this is more important now than the actual keywords!
Quick Trick
Google My Business “Questions and Answers” can be ridiculously useful. Use them as a “pre-website FAQ”. Any answer with 3 thumbs up will show automatically, and in the future, if someone uses voice search to ask a question similar to one already asked on GMB, your answer may be read out.
Tim Soulo, CMO & Product Advisor at Ahrefs
Following on from Greg was Tim Soulo from Ahrefs, with his talk “Rethinking the fundamentals of keyword research with the insights from big data”.
His first slide backed up his entire presentation: in 4 years, Ahrefs has seen traffic grow over 15x, with 60% year-on-year ARR (annual recurring revenue) growth.
Ahrefs achieved this by breaking keywords down into 3 things.
- Traffic Potential – The more volume, the higher the traffic, right? Wrong. Look at the total search traffic potential. Pages never rank for a single keyword; they rank for variations, long-tail terms and so on. Ahrefs undertook research that showed the average page ranking number 1 also ranked for 1,000 other keywords, so look at the total search potential across all of those keywords, not just one. Each search “topic” has a search demand curve of its own – this is the “total search potential”.
- Business Potential – A metric that Ahrefs use internally. They score keywords from 3 down to 1: 3 means your product is an irreplaceable solution to the “problem”, and 1 means your product is barely relevant at all. Anything you score a 3 would (potentially) be considered “converting” traffic.
- Ranking Potential – Everyone wants to rank number 1. However, Ahrefs conducted a study (based on 100k search queries) which showed that the top-ranking page got the most total search traffic only 49% of the time. Basically, focus on search intent and cover a broad topic rather than a narrow one. (I’ve had a go at pulling these three ideas together in a rough script just below.)
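To make that a bit more concrete, here’s a minimal, purely illustrative Python sketch of how the three factors could be combined to prioritise a keyword list. The numbers, field names and weighting are my own assumptions for the example – this isn’t anything Ahrefs shared.

```python
# Illustrative only: blend topic traffic potential, business potential and
# ranking potential into a single score for sorting keywords. All values
# below are made up.
from dataclasses import dataclass

@dataclass
class Keyword:
    phrase: str
    topic_traffic_potential: int  # estimated monthly traffic for the whole topic, not one keyword
    business_potential: int       # 3 = your product is the solution, 1 = barely relevant
    ranking_potential: float      # 0-1 rough feasibility of taking the top spots

def priority_score(kw: Keyword) -> float:
    """Combine the three factors into one number for prioritisation."""
    return kw.topic_traffic_potential * kw.business_potential * kw.ranking_potential

keywords = [
    Keyword("keyword research tool", 40_000, 3, 0.4),
    Keyword("what is seo", 90_000, 1, 0.2),
    Keyword("backlink checker", 60_000, 3, 0.5),
]

for kw in sorted(keywords, key=priority_score, reverse=True):
    print(f"{kw.phrase}: {priority_score(kw):,.0f}")
```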
Onsite Technical
Fili Wiese, Ex-Google Engineer & SEO Expert
So I stand by my statement from April: Fili Wiese has indeed forgotten more about SEO than I think I will ever know.
Fili shared how he sped up his own website, from the challenges he faced to how he actually succeeded. As he pointed out, it’s our responsibility to make our sites faster for our users, and we have to test continually.
Takeaways
- Maintain a mobile-friendly, mobile-focused mindset.
- Use HTTP/2, which allows you to serve content over a single connection.
- Utilise resource hints to give the browser more instructions on what to prioritise loading.
- Look into increasing server resources, as more server power can improve the average response time.
- Use a resource-budgeting approach and check your site against the Google Chrome team’s ‘Never-Slow Mode’ list, which sets ideal budgets for image and script file sizes, connection times, and more (I’ve sketched a rough budget check just below).
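To show what I mean by a budget check, here’s a rough Python sketch of my own (not from Fili’s talk): it fetches a page, finds its scripts and images, and flags anything over a made-up size budget. The budget figures and the URL are placeholders, not the actual Never-Slow Mode numbers.

```python
# Rough resource-budget check. Budgets and URL are placeholders.
# Requires: pip install requests beautifulsoup4
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

BUDGETS = {"script": 50_000, "img": 100_000}  # bytes, illustrative only

def check_budgets(page_url: str) -> None:
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag, budget in BUDGETS.items():
        for element in soup.find_all(tag, src=True):
            resource_url = urljoin(page_url, element["src"])
            head = requests.head(resource_url, timeout=10, allow_redirects=True)
            size = int(head.headers.get("Content-Length", 0))
            if size > budget:
                print(f"Over budget ({size:,} > {budget:,} bytes): {resource_url}")

check_budgets("https://www.example.com/")
```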
I’ll be honest, I would love to pick Fili’s brain one-to-one for about 10 minutes. I honestly think I would walk away with a billion ideas.
Francois Goube, CEO & Founder of Oncrawl
Francois’ talk was titled “Best Kept Secrets from Robots.txt” and focused on getting back to basics. A really great talk, because it’s amazing how often robots.txt files go wrong. He even shared some examples of people doing “stupid things” (like including FTP passwords and account info in your robots.txt).
My biggest takeaway (which lined up with some of my previous thinking on PBNs) was that Google can use robots.txt files to detect PBNs. They even have patents to prove it (See! It ties back perfectly with Greg’s tip: keep an eye on Google patents!).
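On the back-to-basics theme, one simple check I’d suggest (my own sketch, not from the talk) is to run your important URLs through Python’s built-in robots.txt parser and make sure none of them are accidentally blocked. The domain and URLs below are just examples.

```python
# Sanity-check that key URLs aren't accidentally blocked by robots.txt.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()

important_pages = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
    "https://www.example.com/products/",
]

for url in important_pages:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'OK     ' if allowed else 'BLOCKED'} {url}")
```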
Best Burger I Have Ever Had – Revised
Lunchtime! Now this is where I realise a trip to Brighton just to try all the burgers may be needed in the future.
Last time I was here, amazing burger from Lucky Beach Cafe.
This time, amazing burger from Patty and Bun.
(And yes, I know I said Lucky Beach Cafe was the best burger I’d ever had, but I hadn’t eaten at Patty and Bun before. A man is allowed to change his mind.)
Advanced SEO
Jamie Alberico, Not a Robot
Another talk I was really looking forward to was Jamie Alberico’s, and I wasn’t disappointed. Plus, it was Dungeons and Dragons themed (always a win).
The talk was titled “How Googlebot Renders (Roleplaying as Google’s Web Rendering Service – D&D Style)”.
The talk explained how a crawler “builds a page” and how to improve the search engine’s understanding of it.
So, what is our “quest” as technical SEOs? It’s to protect site visibility by delivering our content to Google’s index.
Our quest has changed. We used to think it was:
Crawl – Index – Rank
Now there are two waves of indexing, with a delay between the HTML and the JavaScript.
This may mean that pages needing to be rendered get stuck in a queue, so they won’t be indexed straight away. And if we can’t render, our content won’t show.
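One quick way to spot a potential second-wave problem (again, my own sketch rather than anything from the talk) is to fetch the raw HTML – which is all the first wave sees – and check that your critical content is there before any JavaScript has run. The URL and phrase below are placeholders.

```python
# Check whether critical content is present in the initial HTML response,
# i.e. before JavaScript runs. URL and phrase are placeholders.
# Requires: pip install requests
import requests

def content_in_raw_html(page_url: str, critical_phrase: str) -> bool:
    html = requests.get(page_url, timeout=10).text
    return critical_phrase.lower() in html.lower()

if content_in_raw_html("https://www.example.com/", "our flagship product"):
    print("Critical content is in the initial HTML - no rendering needed for it.")
else:
    print("Critical content only appears after JavaScript runs - it may be waiting in the render queue.")
```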
Jamie’s biggest takeaway? Make friends with developers. They can help you make better decisions around JavaScript and how to mitigate any risks that may be encountered.
I’ve attached links to the slides in this blog (otherwise this post would be huge if I tried to explain everything I learned in this talk).
Summary
I always feel like I leave BrightonSEO with a whole list of things to try, and honestly it is the best search marketing conference I have ever attended. It brings great speakers from the world of search and I guarantee you’ll walk away with something.
Events like BrightonSEO help us stay at the forefront of the ever-evolving field of organic search. If you’d like to find out more about our approach to SEO here at Coast Digital, then read our SEO service page.
Alternatively, if you’d like to talk about SEO then why not get in touch?