200+ signals.
For the past two decades, this figure has stood as the accepted count of the signals Google uses in its algorithm to generate any given SERP.
For quite some time, SEOs had a clear roadmap of steps on "what to do" (and what to do first) when it came to ranking a website. The algorithm would change, but we still had a blueprint to navigate the seas of SEO and overcome the algorithm-update tsunamis. In the last five years, however, the waters have been muddied. Unless you're breaking down the SERPs daily, you no longer have an edge on optimizing for the "200+ signals".
It doesn't help that John Mueller has a long list of now-deleted tweets claiming the following factors no longer directly affect SEO.
Domain Name Age - This is the first ranking factor. A good SEO would put up a one-page placeholder site with keyword-rich content while fleshing out the full website and strategy, just to start clocking domain name age. But Mueller has stated that domain name age doesn't play a role and "means nothing". That being said, getting a page up, getting it indexed, and setting up all your metrics is a great way to continually test the state of Google's aversion to brand-new sites.
Charles Float has a great, up-to-date video about what you should do in the first 100 days of a new niche site.
Structured Data (Schema.org) - Out of all the things that Google has neglected in the past 7 years, structured data is the one I believe they've missed the boat on the most.
There are companies that pay millions of dollars for AI labeling to train their models. Google could have gotten the entire globe to label data for free. Each of Schema's categories, such as "Car," "Organization," or "CreativeWork," could be infinitely expanded. Schema.org represented an opportunity for Google to put the world's SEO community to work. If they simply made schema.org more prominent in the SERPs, how many hours of free data labeling would Google get? Are AI Overviews better than Schema.org rich snippets? Hell no.
Wouldn't a strong index of websites with feature-rich schema markup dramatically help Google stay #1 in the AI race?
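To make that labeling concrete, here's a minimal sketch of generating Schema.org JSON-LD markup, the kind of structured data an SEO would embed in a page, using only Python's standard library. The product name, brand, and price are hypothetical placeholders:

```python
import json

# Hypothetical product data; in practice this would come from your CMS or database.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Acme Widget",  # placeholder product name
    "brand": {"@type": "Organization", "name": "Acme"},
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD",
    },
}

# Serialize to the payload a page would embed inside
# <script type="application/ld+json">...</script>.
json_ld = json.dumps(product, indent=2)
print(json_ld)
```

Every site that publishes markup like this is, in effect, hand-labeling an entity for free.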
Disavowing Toxic Links - Bing dropped the disavow link tool in 2023, and it was up in the air whether Google was going to follow suit. As of June 2025, Google still allows you to disavow links, but John Mueller has stated that it's a "waste of billable time."
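For reference, the disavow file Google accepts is a plain UTF-8 .txt file with one entry per line: a `domain:` prefix disavows every link from that domain, a bare URL disavows a single page, and lines starting with `#` are comments. The domains below are placeholders:

```
# Disavow all links from this domain
domain:spammy-links.example.com
# Disavow a single specific page
https://another-site.example.com/bad-page.html
```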
Removing Old Content - John Mueller has stated that removing old content doesn't make the rest of your content rank better. So if you have a thin affiliate site with thousands of articles that haven't been updated in quite some time, you're going to want to take a serious look at your content update/delete strategy.
Third-Party Tools Don't Have the Data - Using Ahrefs, SEMrush, or Ubersuggest? John Mueller has basically said these companies don't have access to real Google data and shouldn't be relied upon for keyword volume and similar metrics.
Affiliate "Go" Links - This isn't necessarily a ranking factor, but take a look at these comments regarding redirecting your affiliate links:
"From our point of view that's perfectly fine, there's nothing that we would consider bad there.
"From my point of view, because there's really nothing there that provides any additional value for your website in the sense that it looks better by having redirects to your affiliate pages or not.
"Personally, I would recommend using less complexity and not setting up these kinds of redirects and just linking to that site directly. I just think you save yourself the effort of trying to maintain all of these things individually."
Now this is just a useless statement. There are plenty of benefits to managing your affiliate links in a database and redirecting them through a go-folder, aside from SEO. If you're looking for some of the many reasons why you'd want to cloak affiliate links, here are 7.
Converting Images to WebP - John has stated on Reddit that there's no "direct benefit" from converting your images. But once again, WebP images mean a faster site, and if site speed is a ranking factor, that statement is contradictory.
I'm sorry, but when you start adding all this up, you start to wonder: what has Google been doing since Matt Cutts posted that guest blogging is dead, all the way back in 2014?
Dynamic Content Rendering - John Mueller wants you to keep your content server-side rendered, and that makes sense, as it reduces the work for Google's crawlers. But with enterprise sites running a decoupled front end and back end, large corporations often have a lot of dynamically (client-side) rendered content. Google absolutely must keep working on maximizing crawl budget, and that means giving precedence to rendering dynamically generated content, because those companies need to understand how their tests are performing. I believe this is still one of the signals and capabilities where Google has a decent lead over Bing, but Bing is catching up.
There are many other factors that John Mueller has dismissed as directly related to SEO. The takeaway?
Each of these factors provides value beyond rankings, so even if they don't help you rank better, you should probably still do them. As you stack these signals, you'll more than likely end up with a better site. In short, John Mueller's growing list of "non-factors" may be a good indicator of the signals Google is still paying attention to.