JavaScript SEO: The Things You Need to Know
JavaScript is a programming language used to build interactive and dynamic web pages. It runs in the visitor's browser, so it doesn't need constant round trips to the web server. JS is the only scripting language supported by all major web browsers for client-side scripting. Another advantage of JS is that it doesn't require any special program to write usable code; a plain text editor is enough. Most websites, even those built on easy-to-manage content management systems, use some kind of JavaScript to add dynamic and interactive elements and improve the overall user experience. In this article, we'll discuss everything you need to know about JavaScript SEO.
What is JavaScript SEO?
JavaScript SEO is the part of technical SEO aimed at making JavaScript-heavy websites easy to crawl and index, as well as search-friendly, that is, at making your JavaScript-backed website findable via Google. To optimize for it, you don't really have to learn JavaScript; instead, you need to know how Google handles JavaScript and how to troubleshoot issues. Even though most websites use JS, many underperform in Google because they don't do JavaScript SEO properly.
Some Basic Definitions
- HTML: Hypertext Markup Language is the backbone of a site, organizing its content. It defines the structural elements of the page, such as headings, paragraphs, and list items.
- CSS: Cascading Style Sheets are the design and style that are added to a website, making up the presentation layer of the page.
- JavaScript: JavaScript is a programming language that makes web pages interactive and dynamic.
- AJAX: Asynchronous JavaScript and XML updates website content without refreshing the whole page, letting web applications and servers exchange data without interfering with the page the user is viewing.
- DOM: The Document Object Model is the structured representation of a webpage that browsers (and Google) work with. The browser first receives an HTML document, parses its contents, and fetches additional resources such as images, CSS, and JavaScript files; the structured, organized version of the page's code that results is the DOM. (See the sketch after this list for AJAX and the DOM in action.)
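To make these definitions concrete, here is a minimal sketch of AJAX and the DOM working together: a script fetches data from a server endpoint and injects it into the page without a full reload. The endpoint `/api/latest-posts` and the element id `posts` are hypothetical placeholders, not part of any real site.

```javascript
// Minimal AJAX + DOM sketch (endpoint and element id are made up).
fetch('/api/latest-posts')               // AJAX: ask the server for data
  .then((response) => response.json())   // parse the JSON payload
  .then((posts) => {
    const list = document.getElementById('posts'); // DOM lookup
    posts.forEach((post) => {
      const item = document.createElement('li');   // build a DOM node
      item.textContent = post.title;
      list.appendChild(item);            // inject it without a page reload
    });
  });
```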
Why Do Google (and Other Search Engines) Find JavaScript Difficult?
Usually, while crawling a conventional website, Googlebot downloads the HTML file, extracts the links from the source code, and crawls them, often in parallel. It also downloads CSS files and sends every downloaded resource to Google's indexer, which then indexes the page. Things become more complicated when it comes to crawling a JavaScript-based website. Here, Googlebot will not find links in the source code immediately after downloading the HTML file, because they are only injected after executing JavaScript. Once the CSS and JS files are downloaded, Googlebot uses Google's Web Rendering Service (WRS) to parse, compile, and execute the JavaScript. The WRS fetches data from external APIs, databases, and so on, and only then can the indexer index the fully rendered page.
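For illustration, here is the kind of link Googlebot cannot see in the initial HTML, because it only exists in the DOM after the script runs (the URL and container id below are made up):

```javascript
// This link is invisible in the raw HTML source; it only appears in the
// DOM after the script executes, so discovering it requires rendering.
const nav = document.getElementById('nav');  // hypothetical container
const link = document.createElement('a');
link.href = '/category/javascript-seo';      // hypothetical URL
link.textContent = 'JavaScript SEO articles';
nav.appendChild(link);
```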
But rendering and indexing can go wrong on JavaScript websites. As the steps above show, the process gets complicated when JavaScript is involved. Parsing, compiling, and running JavaScript files is time-consuming for users and for Google (and other search engines) alike. On JS-based websites, Google cannot index the content until the page is fully rendered. And since rendering is also how new links are discovered, Google cannot find links that JavaScript injects into a page until that page has been rendered.
Googlebot Doesn't Crawl Like Your Browser
Googlebot visits web pages much like a browser does, but it is not a typical browser. Googlebot declines user permission requests; for instance, it will deny video autoplay requests. Cookies, local storage, and session storage are cleared across page loads, so if your content relies on cookies or other stored data, Google won't pick it up. Another difference is that while browsers download all of a page's resources (images, scripts, stylesheets), Googlebot may choose not to.
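As a hedged illustration, content gated on stored data like the following would never render for Googlebot, since storage is wiped between page loads (the key name and message are made up):

```javascript
// Googlebot clears cookies, localStorage, and sessionStorage across page
// loads, so the "returning visitor" branch below never runs for it.
if (localStorage.getItem('returningVisitor')) {   // hypothetical key
  document.getElementById('greeting').textContent =
    'Welcome back! Here are your personalized picks.';
} else {
  localStorage.setItem('returningVisitor', 'yes');
}
```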
Since Google optimizes its crawlers for performance, they may not load every resource from the server and may not even visit every page they encounter. Google's algorithms try to identify whether a resource is necessary from a rendering point of view; if an algorithm decides it isn't, Googlebot may not fetch it at all.
How to Make JavaScript SEO-Friendly?
1. Recheck robots.txt so that Search Engines See Your JavaScript
Always start by checking your robots.txt file and meta robots tags to make sure no one has accidentally blocked the bots. Robots.txt tells search engines which parts of your site they may crawl. Blocking JS makes the page appear differently to web crawlers than it does to users; search engines get a different experience, and Google can end up interpreting this as cloaking.
To prevent this, give web crawlers access to all the resources they need to see your web pages exactly as users do. Discuss with your developers and decide together which files should be hidden from search engines and which should be made accessible. A sketch of what that can look like follows.
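For example, a robots.txt along these lines keeps a private area out of crawlers' reach while explicitly leaving script and style resources crawlable (the directory names are placeholders for your own setup):

```
User-agent: *
# Keep private areas uncrawled (hypothetical path)
Disallow: /admin/
# Make sure rendering resources stay accessible
Allow: /assets/js/
Allow: /assets/css/
```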
2. Use HTTP Status Codes to Interact with Bots
Bots use HTTP status codes to determine what went wrong during the crawling process. Meaningful status codes can tell bots that a certain web page should not be crawled or indexed, such as a page requiring a login; in that case, a 401 HTTP status code is appropriate. Status codes can also inform bots that a page has moved to a new URL, so they can index it accordingly.
Here are some of the most commonly used HTTP status codes (a server-side sketch follows the list):
- 401/403: Page unavailable due to permission issues.
- 301/302: Page has moved to a new URL (301 permanently, 302 temporarily).
- 404/410: Page not found (404) or permanently removed (410).
- 5xx: Issues on the server-side.
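As a minimal sketch of sending these codes, assuming a Node.js server using the Express framework (the routes and URLs are made up):

```javascript
const express = require('express');
const app = express();

// 301: the page has moved permanently to a new URL
app.get('/old-page', (req, res) => {
  res.redirect(301, '/new-page');
});

// 401: the page requires a login, so bots should not index it
app.get('/account', (req, res) => {
  res.status(401).send('Login required');
});

// 410: the page is gone for good
app.get('/retired-page', (req, res) => {
  res.status(410).send('This page has been removed');
});

app.listen(3000);
```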
3. Check Your JavaScript Code’s Compatibility with Different Browsers
Different browsers expose different APIs, and your website's JS code needs to be compatible with them to work correctly in any browser. Write the code with the features and limitations of the most common bots in mind (at least Google's and Bing's), and entrust a team with identifying and fixing JavaScript issues that may be hindering the indexing or display of your pages in search engines.
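One common defensive pattern here is feature detection: check that an API exists before relying on it, and fall back gracefully when it doesn't. A minimal sketch (the `.animate` class is a hypothetical example):

```javascript
// Feature-detect an API before using it, so the page still works in
// browsers (and bots) that don't support it.
if ('IntersectionObserver' in window) {
  // Modern path: reveal elements as they scroll into view.
  const observer = new IntersectionObserver((entries) => {
    entries.forEach((entry) => {
      if (entry.isIntersecting) entry.target.classList.add('visible');
    });
  });
  document.querySelectorAll('.animate').forEach((el) => observer.observe(el));
} else {
  // Fallback path: just show everything immediately.
  document.querySelectorAll('.animate').forEach((el) =>
    el.classList.add('visible')
  );
}
```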
4. Try Search-Friendly Loading
Images weigh on a website's bandwidth and performance, but they are a resource you can't simply leave out. Try something like lazy loading, where images are loaded later but in a search-friendly manner. Make sure you do it correctly, without ending up hiding content from search engines. You can use a JavaScript library that loads content when it enters the viewport, or the browser's native lazy loading, as sketched below.
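One search-friendly option is the browser's native lazy loading, which keeps the image URL in the markup where crawlers can find it (the file name below is a placeholder):

```html
<!-- Native lazy loading: the src stays in the markup, so search engines
     can discover the image even though the browser defers fetching it. -->
<img src="/images/product-photo.jpg" alt="Product photo"
     width="800" height="600" loading="lazy">
```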
5. Test Your Website Frequently
Although Google is able to crawl and understand many forms of JavaScript, it is still recommended to test frequently. There are two basic steps to detect breakage on the website: check whether the content of your web pages appears in the DOM, and test a few pages to make sure Google is able to index your content.
Sometimes Google may not be able to see your content or execute your JavaScript properly. In such cases, manually check parts of your content and fetch them as Google would to see whether the content appears. You can test Googlebot's ability to fetch your page with the URL Inspection tool, which runs a live test of the page.
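As a quick hedged check you can run yourself in the browser console, compare the raw HTML against the rendered DOM for a phrase from your JS-injected content (the phrase is a placeholder):

```javascript
// Paste into the browser console on the page under test. If the phrase is
// missing from the raw HTML but present in the rendered DOM, that content
// depends on JavaScript and needs rendering to be indexed.
const phrase = 'your unique content phrase'; // placeholder
fetch(location.href)
  .then((res) => res.text())
  .then((rawHtml) => {
    console.log('In raw HTML:    ', rawHtml.includes(phrase));
    console.log('In rendered DOM:', document.body.innerText.includes(phrase));
  });
```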
6. Use HTML Snapshots
An HTML snapshot is a capture of a webpage showing its contents after the page has been completely parsed, interpreted, and rendered by the browser. Snapshots help search engines in situations where they cannot process the JavaScript on your website: they make the content interpretable, and thus indexable, without any JavaScript needing to be executed.
Google strives to get the exact same experience a visitor does, and HTML snapshots help with that, so it can make sense to return them to search engine crawlers. That said, use HTML snapshots only as a fallback when something is wrong with your JavaScript and it isn't possible to get it fixed.
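A minimal sketch of the idea, assuming an Express server and a hypothetical `renderSnapshot` helper that produces the pre-rendered HTML (for example via a headless browser or a prerendering service); the bot detection here is deliberately simplistic:

```javascript
const express = require('express');
const app = express();

// Hypothetical helper: returns fully rendered HTML for a URL.
async function renderSnapshot(url) {
  /* ...render with a headless browser or prerendering service... */
  return '<html>...</html>';
}

app.get('*', async (req, res, next) => {
  const ua = req.get('User-Agent') || '';
  if (/googlebot|bingbot/i.test(ua)) {
    // Serve the snapshot to crawlers that struggle with JavaScript.
    res.send(await renderSnapshot(req.originalUrl));
  } else {
    next(); // regular visitors get the normal JS-driven page
  }
});

app.listen(3000);
```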
7. Reduce Site Latency
Usually, when a browser builds the DOM from the HTML document it received, it loads resources in the order they appear in the document. So when large files referenced near the top of the HTML load, they delay the loading of everything below them. The key idea behind Google's critical rendering path is to prioritize the content above the fold: Google favors pages that load the content most crucial to users first.
Render-blocking JavaScript can also drag down your website's loading speed: it holds back a page that would otherwise appear faster. Such scripts can be detected with PageSpeed Insights or similar tools. If you find this difficult, we have an amazing team of SEO experts who can help you through it.
You can resolve this to an extent by inlining critical JavaScript in the HTML and using the 'async' attribute to make the rest of your JavaScript load asynchronously. But remember to follow the basic rules of JavaScript: if a script references a file that hasn't loaded yet, changing the execution order will break things. The sketch below shows the difference.
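A small illustration, with placeholder file names: `async` scripts run as soon as they finish downloading (order not guaranteed), while `defer` preserves document order, which matters when one script depends on another.

```html
<!-- async: downloads in parallel and runs immediately when ready;
     fine for independent scripts such as analytics. -->
<script async src="/js/analytics.js"></script>

<!-- defer: downloads in parallel but runs in document order after the
     HTML is parsed; use it when widgets.js depends on library.js. -->
<script defer src="/js/library.js"></script>
<script defer src="/js/widgets.js"></script>
```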
Conclusion
JavaScript is a powerful tool for building interactive websites, and JavaScript and SEO can go well together when it's implemented correctly. Ensure that the JavaScript you use on your website is easily accessible to Google by testing meticulously, understand how search engine bots crawl and index web content, and try implementing the approaches we've covered in this article.
Still confused? We’ve got a team of SEO experts to help you. They’ll be glad to help you out!