Google has what amounts to a monopoly over our sites. We are beholden to their search engine for traffic, their browser for the consumption of our content and, for some, even their advertising for revenue (which, oddly enough, affects performance).
Have we not then been effectively reduced to optimizing our sites for the Google ecosystem more than for our readers?
In the past, such impetus came naturally from our readers. You’d get complaints about your site having become slower than usual, about a missing image here and there, about that new design you just rolled out, and so on.
Today, most such suggestions (for that’s all they should be) come from one entity. An entity that in truth overrides all other suggestions, for theirs is the one that ultimately counts the most.
But here’s the thing: theirs are not mere suggestions that you can put up for consideration before deciding on a way forward. They are more than that. They’ve become rules. They have become nothing short of a “quasi-standard”.
They’ve in effect created a standard without it being exactly a standard. For a standard is formulated by multiple entities that have come together for a common cause, but that’s not the case here. Why is that?
Because Google is under no obligation to rope in the suggestions of their competitors (Bing, Yandex, DuckDuckGo etc.) and all other entities vested in our content (such as our users and other advertisers) when they can silently “force your hand” with their other products.
Even more problematic: how does one measure and decide what constitutes a good page experience, given the astounding variety of web designs on the internet and how differently end-users respond to them?
What if a design actually necessitates going against some Core Web Vitals metrics because it’s expected to behave as such? Should the web developer remain faithful to the design handed to them, seek an overhaul for a compliant design, or needlessly work on finding inventive workarounds that conform to the standard?
But more importantly, what if the end-users are happy with the results and Google is not? Should such a site be “punished” with low rankings for non-compliance, even though its target users are content with their page experience?
Google’s AMP project raises similar questions with its standardization efforts and I can’t help but see a similar thinking in play here.
So while Core Web Vitals professes good intentions on the outside, that is, to create an enjoyable experience for our visitors, its echo is a very punitive one for content creators, to whom it nonchalantly whispers: comply, or lose ranking, lose traffic and ultimately lose revenue.
Nevertheless, you still have a choice. That choice is optimizing your site first and foremost for your users before any other entity.
Remember that a score of 100 on PageSpeed Insights is ultimately an arbitrary one, since it comes from the standards of one entity. For instance, before GTmetrix switched to Lighthouse, they used YSlow, which applied a different set of tests from those of PageSpeed Insights.
Then of course, you have other analyzers such as Cloudflare’s Browser Insights, WebPageTest, Varvy etc.
Is this to say that all these other tools are inferior? Absolutely not. They simply use a different set of rules, or at least weigh them differently, but Google’s set of rules and weighting count more because they have a whole ecosystem enforcing their standard.
But at least there are certain things they agree on, and that’s the raw performance of your website, which has nothing to do with some of the more arbitrary user experience metrics. These include metrics such as load time, Time to First Byte (TTFB), First Contentful Paint (FCP) etc.
Ideally, then, one should tailor their site’s design to the user experience that they (not Google) want their visitors to get from it, then optimize that design for the best possible performance. That’s the way it should be.
The above article wasn’t intended to be a post on its own, rather, it was meant to be a postscript to my lengthy WordPress optimization guide that I published recently.
Drafting that article, however, ended up taking more time than I had initially foreseen. The reason for the delay was that I was basing the article on my first-hand experience optimizing my WordPress sites.
I had undertaken that task in anticipation of the impending Page Experience update (aka Core Web Vitals) that Google had scheduled to roll out in May 2021, a task which, to say the least, was proving monumental.
Nevertheless, I carried on into that world of perfect page speed scores, as I couldn’t risk the repercussions of non-compliance, namely reduced traffic.
All my sites are heavily dependent on organic traffic from Google, and the past had taught me well about the indiscriminate aftermath that Google algorithm updates often leave in their wake.
We often hear that “content is king”, along with other nice mottos from SEO merchants, but these rules of thumb are not reliable in the face of such search engine updates. A more encompassing and durable motto, to me, would be something along the lines of “Google is King”.
Indeed, we are serfs in the kingdom that Google, as it were, opens the door to, or rather, holds the treasures to, and this whole page experience hysteria proved that point clearly.
Anyway, as it turned out later, the update was postponed to June, but by that time you could say I had long been disillusioned by the whole endeavor it had set in motion.
This inevitably derailed the article I was working on, as it had become evident by that point how useless, or rather, to be fair, arbitrary, page speed scores were.
It occurred to me that what ultimately mattered here is what in SEO lingo they call Authority, and small sites would have better success focusing on that than on speedometer readings.
Instead, here we were optimizing performance beyond what we could feasibly achieve, maintain or, more likely, afford.
Yes, that is the other stark revelation that one finds waiting at the end of that speed tunnel. The fact of the matter is that perfect page speed scores have more to do with server performance than with the on-site optimizations that merely act as a bandage.
This costs more money, as it is not a matter of simply installing a free plugin to do all the magic. Yet, performance analyzers merely whisper this truth and instead prefer to send you out on wild goose chases, when in fact the golden egg lies at some server rack somewhere by way of deeper pockets.
SEO gurus spamming my contact form likewise suggest a similar strategy when it comes to their enigmatic work. By that I mean, why worry about Google’s edicts when some money can game its soft underbelly and propel you to the top of its throne.
In any case, Google doesn’t explicitly make it clear that complying with its rules will automatically improve one’s ranking, but it is happy if you do.
A dating analogy surrounding free meals would be relevant here, but I have something far more appropriate to illustrate this. There’s a saying in Swahili whose literal translation is, “there are no refunds from a witch doctor”.
That saying, I would say, holds true with Google in the sense that there are no guarantees for whatever “absurd” things they put you up to in return for ranking.
The common thread one observes is that Google never reveals its “secret recipe” for success and prefers instead to hide behind suggestions, recommendations and penalties, all of which create a vacuum that SEO gurus are more than happy to tap into profitably with speculation.
With regard to the aforementioned saying, one could liken these gurus to the virulent witch doctor adverts that one finds strategically stuck on electricity poles and trees.
If you’ve been to Kenya, then you know exactly what I mean, but here’s an example for those of you who haven’t had the pleasure of coming on a Safari.
I get offers in the same vein as these from Google “consultants” who make promises such as:
- We will give you first page ranking on Google, Yahoo, and Bing.
- Improve your organic traffic and sales.
- Secure your website from Google penguin updates 4.0
- Increase your conversion rate.
- Target your local market to increase business.
It seems the penguin update is still fishing these many years later. By the way, I didn’t make up the above list; I actually got this one recently from one Jay Leno, who signed off the email with the title of “Site Analyst / Digital Marketing”.
Oh, and don’t get me started on the bunch that are busy trying to sell backlinks from Amazon and other big sites. Their subject lines are always along the lines of:
DA 96 Do-follow Backlink from [big site here]
DA 50 Backlink to [my-site-here]
Well, you get the idea. The point I’m trying to drive home here is that while the page experience update “means well” for the most part, it will nevertheless be co-opted by shady SEO experts as another bullet point in the bid to achieve “first page ranking”.
Essentially, it’s Google feeding the very machinery that manipulates its search engine, which in turn affects the quality of its results.
That should explain why certain keywords spew out the same unhelpful spammy sites on every occasion while a legitimate site is buried on page 3.
Here’s the thing: if an individual is only concerned about making quick, easy money, then they will do anything within their means to achieve that objective.
Indeed, they will comply with all of Google’s rules, get a 100/100 page speed score if necessary, but they will also hire content farms, buy backlinks, spam their competitors and do everything else that should get them to position 1.
As all this unfolds, what does the end user get? The same rehashed content available on several other sites, the same structure, the same phrasing, the same clickbait titles – basically the same “winning formula” as dictated indirectly by Google, for the main objective has been distilled into just getting eyeballs.
And who does this paradigm favor? Well, it’s clearly the one who can afford not only to comply, but also face off in the wild to come out at the very top.
Small sites stand little chance in this state of affairs. And trust me, I speak from experience, having been a victim of content theft by much larger sites and having had to watch them get away with it at the top of Google.
And now they have one extra thing they can beat us at without resorting to underhand tactics: better performance.
I wish Google would focus its efforts not so much on “standardizing” the internet, but more on getting the best possible outcomes for its end users with regard to content; and that essentially means two things as I see it:
- weeding out all the spammy sites that provide no value whatsoever; then
- ranking the legitimate sites based purely on content relevance to the keywords searched, and not on all these other ‘gameable’ metrics that favor some unscrupulous few
This way, the field is slightly leveled for any site to come out at the top and, even more crucially, there’s a chance for a diversity of ideas, opinions, writing styles and web designs (that is, if we factor in Google’s push for the AMP project).
Basically, it’s having confidence that users can make their own judgements when presented with a fair list of choices.
It should be within their rights to determine what counts as a slow or a fast website to them, what is a good design and what is a bad one, i.e. all factors that are bound to be subjective.
This way, they are given the chance to make their own individualized decisions of sites they identify as reliable, trustworthy etc. instead of being presented with ones that have best passed some “authoritarian” quality assurance checks.
But, hey, beggars can’t be choosers. Right?