I recently read Ziemek Bucko’s fascinating article, Rendering Queue: Google Needs 9X More Time To Crawl JS Than HTML, on the Onely blog.
I recall several instances of fielding phone calls and emails from frustrated clients asking why their stuff wasn’t showing up in search results.
In all but one case, the problem turned out to be that the pages were built on a JS-only or mostly JS platform.
To be clear, I'm not against JS. Like any tool, though, it's best used for tasks other tools cannot do. I'm against using it where it doesn't make sense.
But there are other reasons to consider judiciously using JS instead of relying on it for everything.
Here are some tales from my experience to illustrate some of them.
1. Text? What text?!
Within a week of the new site going live, organic search traffic plummeted to near zero, causing an understandable panic among the clients.
A quick investigation revealed that besides the site being considerably slower (see the next tales), Google’s live page test showed the pages to be blank.
My team did an evaluation and surmised that it would take Google some time to render the pages. After 2-3 more weeks, though, it was apparent that something else was going on.
I met with the site’s lead developer to puzzle through what was happening. As part of our conversation, they shared their screen to show me what was happening on the back end.
That’s when the “aha!” moment hit. As the developer stepped through the code line by line in their console, I noticed that each page’s text was loading outside the viewport using a line of CSS but was pulled into the visible frame by some JS.
This was intended to make for a fun animation effect where the text content “slid” into view. However, because the page rendered so slowly in the browser, the text was already in view when the page’s content was finally displayed.
The actual slide-in effect was not visible to users. I guessed Google couldn’t pick up on the slide-in effect and did not see the content.
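The pattern boiled down to something like this sketch. The helper function and all the numbers are my own illustrations, not the client's actual code: CSS parks the text far off-screen, and JS later animates it into view, so anything that doesn't execute the animation JS only ever "sees" the off-screen position.

```javascript
// Illustrative sketch of the problem (all names and numbers are assumptions):
// CSS initially positions the text at something like left: -9999px, and a
// JS animation slides it to left: 0 after the page loads.

// Tiny helper: does a box overlap the viewport horizontally?
function isInViewport(elementLeft, elementWidth, viewportWidth) {
  return elementLeft + elementWidth > 0 && elementLeft < viewportWidth;
}

// Before the slide-in JS runs: parked off-screen by CSS, not visible.
console.log(isInViewport(-9999, 300, 1280)); // false

// After the JS animation moves it in: visible.
console.log(isInViewport(0, 300, 1280)); // true
```

If the renderer never runs (or never finishes) the animation step, the first state is the only one it sees.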
Once that effect was removed and the site was recrawled, the traffic numbers started to recover.
2. It’s just too slow
This could be several tales, but I'll summarize them in one. JS platforms like AngularJS and React are fantastic for rapidly developing applications, including websites.
They are well-suited for sites needing dynamic content. The challenge comes in when websites have a lot of static content that is dynamically driven.
Several pages on one website I evaluated scored very low in Google’s PageSpeed Insights (PSI) tool.
Examined from the Core Web Vitals side, the JS accounted for nearly 8 seconds of blocking time because all of the code had to be downloaded and run in the browser.
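To make "blocking time" concrete: the Total Blocking Time metric sums, across all long main-thread tasks, the portion of each task that exceeds 50 ms. Here is a rough sketch of that tally, my own illustration rather than anything from the PSI codebase:

```javascript
// Rough sketch of how Total Blocking Time (TBT) is tallied: any main-thread
// task longer than 50 ms is a "long task," and only the portion beyond
// 50 ms counts as blocking. Illustrative only, not PSI's actual code.
function totalBlockingTime(taskDurationsMs) {
  return taskDurationsMs
    .filter((d) => d > 50)
    .reduce((sum, d) => sum + (d - 50), 0);
}

// Three heavy script tasks: (900-50) + (2500-50) + (5000-50) = 8250 ms,
// i.e. roughly the 8 seconds of blocking time described above.
// The 30 ms task is under the threshold and contributes nothing.
console.log(totalBlockingTime([900, 2500, 5000, 30])); // 8250
```

A large JS bundle that parses and executes in a few big chunks racks up blocking time fast, which is exactly what PSI penalizes.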
While the former developer in me agreed with that concept, the SEO in me could not accept how Google’s apparent negative perception of the site’s user experience was likely to degrade traffic from organic search.
Unfortunately, in my experience, SEO often loses out to a lack of desire to change things once they have been launched.
3. This is the slowest site ever!
Similar to the previous tale comes a site I recently reviewed that scored zero on Google’s PSI. Up to that time, I’d never seen a zero score before. Lots of twos, threes and a one, but never a zero.
I’ll give you three guesses about what happened to that site’s traffic and conversions, and the first two don’t count!
To be fair, excessive CSS, images that are far larger than needed, and autoplay video backgrounds can also slow download times and cause indexing issues.
I wrote a bit about those in two previous articles:
- 5 ways content helps create a better user experience
- Bad website design: Bad for SEO, UX and business
For example, in my second tale, the sites involved also tended to have excessive CSS that was not used on most pages.
So, what is the SEO to do in these situations?
Solutions to problems like this involve close collaboration between SEO, development, and client or other business teams.
Building a coalition can be delicate and involves giving and taking. As an SEO practitioner, you must work out where compromises can and cannot be made and move accordingly.
Start from the beginning
It’s best to build SEO into a website from the start. Once a site is launched, changing or updating it to meet SEO requirements is much more complicated and expensive.
Work to get involved in the website development process at the very beginning when requirements, specifications, and business goals are set.
Try to get search engine bots included as user stories early in the process so teams can understand their unique quirks and help get content spidered and indexed quickly and efficiently.
Be a teacher
Part of the process is education. Developer teams often need to be informed about the importance of SEO, so you need to tell them.
Put your ego aside and try to see things from the other teams’ perspectives.
Help them learn the importance of implementing SEO best practices while understanding their needs and finding a good balance between them.
Sometimes it’s helpful to hold a lunch-and-learn session and bring some food. Sharing a meal during discussions helps break down walls – and it doesn’t hurt as a bit of a bribe either.
Some of the most productive discussions I’ve had with developer teams have been over a few slices of pizza.
For existing sites, get creative
You’ll have to get more creative if a site has already launched.
Frequently, the developer teams have moved on to other projects and may not have time to circle back and “fix” things that are working according to the requirements they received.
There is also a good chance that clients or business owners will not want to invest more money in another website project. This is especially true if the website in question was recently launched.
One possible solution is server-side rendering. This offloads the client-side work and can speed things up significantly.
A variation of this combines server-side rendering with caching of the plain-text HTML output. This can be an effective solution for static or semi-static content.
It also saves a lot of overhead on the server side because pages are rendered only when changes are made or on a regular schedule instead of each time the content is requested.
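At its core, the caching variation can be as simple as rendering a page once and reusing the HTML until something invalidates it. The following is a minimal sketch; the function names are my own illustrations, and a real setup would render with an actual SSR framework:

```javascript
// Minimal sketch of server-side render caching. renderPage() stands in
// for an expensive SSR call; real setups would invalidate on content
// changes or on a regular schedule.
const cache = new Map();

function renderPage(path) {
  // Stand-in for the expensive server-side render.
  return `<html><body>Rendered ${path}</body></html>`;
}

function getPage(path) {
  if (!cache.has(path)) {
    cache.set(path, renderPage(path)); // render only on a cache miss
  }
  return cache.get(path); // every other request is a cheap lookup
}

function invalidate(path) {
  cache.delete(path); // e.g. called when an editor updates the content
}
```

Crawlers and users alike receive fully rendered HTML immediately, with no client-side rendering wait.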
Other alternatives that can help but may not totally solve speed challenges are minification and compression.
Minification removes whitespace and other unnecessary characters, making files smaller. GZIP compression can be applied to downloaded JS and CSS files.
Minification and compression don’t resolve blocking time challenges. But, at least they reduce the time needed to pull down the files themselves.
For a long time, I believed that at least part of the reason Google was slower in indexing JS content was the higher cost of processing it.
It seemed logical based on the way I’ve heard this described:
- A first pass grabbed all the plain text.
- A second pass was needed to grab, process, and render JS.
I surmised that the second step would require more bandwidth and processing time.
I asked Google’s John Mueller on Twitter if this was a fair assumption, and he gave an interesting answer.
From what he sees, JS pages are not a huge cost factor. What is expensive in Google’s eyes is respidering pages that are never updated.
In the end, the most important factor to them was the relevance and usefulness of the content.