Before JS took over web development, search engines could only crawl text-based HTML content. As JavaScript gained popularity, Google began working on parsing JS resources and comprehending the pages that rely on them. Although search crawlers have come a long way over the years, they still struggle to access and view content on single-page applications.
For now, let’s see what actions you can take to make your single-page application website fully visible to crawlers. Among the helpful things, you’ll find SEO basics such as a clean, polished sitemap and some specific techniques related to rendering.
Before tackling SPA-specific crawling and indexing issues, make sure you start with the basics: create a properly formatted sitemap file and submit it to Google. It will not help you with JS resources, but it will inform search engines that your pages exist and how your website is structured.
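As a sketch, a minimal sitemap file following the sitemaps.org protocol looks like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per indexable page of the SPA -->
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2023-01-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/about</loc>
  </url>
</urlset>
```

Save it as sitemap.xml at the site root and submit it via the Sitemaps report in Google Search Console.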
Feature detection is among Google’s major recommendations for SPAs. This technique involves progressively enhancing the experience with different code resources. Here’s how it works: a simple HTML page serves as a basis that is accessible to both crawlers and users, while the features built on top of it (CSS and JS resources) are enabled or disabled according to browser support.
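A minimal sketch of the feature-detection idea (the `supports` helper is hypothetical, not from Google’s docs): the baseline HTML works everywhere, and an enhancement is wired up only when the browser actually exposes the API it needs.

```javascript
// Check whether a named feature exists on a given scope object
// (e.g. window) before enabling an enhancement that depends on it.
function supports(feature, scope) {
  return typeof scope !== 'undefined' && feature in scope;
}

// Example: enable an infinite-scroll enhancement only when
// IntersectionObserver is available; otherwise the plain HTML
// pagination links in the baseline page keep working.
if (typeof window !== 'undefined' && supports('IntersectionObserver', window)) {
  // ...attach the enhanced scrolling behavior here...
}
```

Crawlers and older browsers simply never enter the enhanced branch, so they always see the functional baseline page.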
When users scroll through a SPA, they move between separate website sections. Technically, a SPA contains only one page (a single index.html file), but visitors feel like they’re browsing multiple pages. When users move through different parts of a single-page application website, the URL changes only in its hash part.
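A minimal hash-routing sketch illustrates this (the routes and view names are made up for the example): the page itself never reloads; only the fragment after `#` changes, and the app swaps content based on it.

```javascript
// Map URL fragments to views; index.html stays the only real page.
const routes = {
  '#/': 'Home',
  '#/about': 'About us',
  '#/contact': 'Contact',
};

// Resolve a hash to a view, falling back to the home view
// for unknown fragments.
function resolveRoute(hash) {
  return routes[hash] || routes['#/'];
}

// In the browser, this would be wired to the hashchange event:
// window.addEventListener('hashchange', () => {
//   document.getElementById('app').textContent = resolveRoute(location.hash);
// });
```

Because only the fragment changes, the server sees a single URL for all of these "pages", which is exactly why crawlers have historically struggled with hash-based SPA navigation.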
Websites often overlook social sharing optimization: we’ve learned that missing Twitter Cards top the list of the most common issues identified by SE Ranking’s Website Audit tool. No matter how insignificant it may look, implementing Twitter Cards and Facebook’s Open Graph will allow for rich sharing across popular social media channels, which is good for the website’s search visibility.
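As an illustration, a basic set of Open Graph and Twitter Card tags looks like this (the titles, descriptions, and image URL are placeholder values):

```html
<!-- Open Graph tags read by Facebook and most other platforms -->
<meta property="og:title" content="Example page title" />
<meta property="og:description" content="Short page description" />
<meta property="og:image" content="https://example.com/preview.png" />
<!-- Twitter Card tags; summary_large_image shows a large preview -->
<meta name="twitter:card" content="summary_large_image" />
<meta name="twitter:title" content="Example page title" />
```

On a SPA, make sure these tags are present in the rendered HTML that crawlers receive, not only injected client-side after load.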
Previously, Google Webmaster Tools included the Fetch as Google feature, which let you see the downloaded HTTP response and the page HTML as fetched and rendered by the search engine. But in 2019, Google removed the tool, and now you can only access some crawling and indexing information in the URL Inspection section of Search Console. It doesn’t give an informative preview of what Google sees, but it does provide basic information about crawling and indexing.
Since the traditional Google Analytics tracking code doesn’t work with single-page websites, you’ll have to use additional tools. The trick here is to record and monitor real user interaction instead of pageviews. Google Analytics itself suggests tracking virtual pageviews by using the set command with a new page value. You can also implement plugins like Angulartics that track pageviews based on user navigation across the website.
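The virtual-pageview pattern can be sketched as follows. The `set`/`send` calls follow GA’s documented analytics.js API; the `sendVirtualPageview` wrapper is a hypothetical helper, not a GA function:

```javascript
// Update the tracker's page field, then send a pageview hit.
// Call this from the SPA router whenever the view changes.
function sendVirtualPageview(ga, path) {
  ga('set', 'page', path);
  ga('send', 'pageview');
}

// Usage in the browser, inside a navigation callback:
// sendVirtualPageview(window.ga, '/#/about');
```

Passing the tracker function in as an argument keeps the helper testable and avoids depending on the global `ga` object being loaded.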
The specific nature of SPAs aside, most general optimization advice suits this type of website. Basic SEO efforts entail:
To make your SPA shine in the eyes of search engines, make its content easily accessible to crawlers. Remember to serve a static version to search engines while providing visitors with dynamic content loading, fast performance, and seamless navigation. Furthermore, ensure that you have a correct sitemap, that you use clean URLs rather than fragment identifiers, and that you mark up different content types with structured data.
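For the structured data point, a minimal JSON-LD sketch using schema.org vocabulary looks like this (the headline and date are example values):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article title",
  "datePublished": "2023-01-01"
}
</script>
```

As with social meta tags, this markup should be present in the HTML that crawlers receive rather than added only after client-side rendering.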