{"id":2246,"date":"2019-09-13T11:14:47","date_gmt":"2019-09-13T11:14:47","guid":{"rendered":"https:\/\/myblog2u.com\/?p=2246"},"modified":"2021-03-12T04:58:26","modified_gmt":"2021-03-12T04:58:26","slug":"basics-to-javascript-seo","status":"publish","type":"post","link":"https:\/\/myblog2u.com\/basics-to-javascript-seo\/","title":{"rendered":"Basics to JavaScript SEO: A Guide by Google"},"content":{"rendered":"\r\n
JavaScript is an important part of the web platform, providing many of the features that turn the web into a powerful application platform. If your JavaScript-powered web application is discoverable in Google Search, it can attract new users and bring existing users back to your website for the information you provide.<\/p>\r\n\r\n\r\n\r\n
JavaScript SEO is the branch of SEO that focuses on websites built with JavaScript, helping them become discoverable in Google Search.<\/p>\r\n\r\n\r\n\r\n
Google has released a guide to JavaScript SEO that covers the basics of the topic.<\/p>\r\n\r\n\r\n\r\n
The guide begins with the three-phase process Googlebot uses to handle JavaScript: crawling, rendering, and indexing.<\/p>\r\n\r\n\r\n\r\n
After that, the guide gives some simple tips for making Google-friendly content with JavaScript:<\/p>\r\n\r\n\r\n\r\n
Using unique, descriptive titles and useful meta descriptions makes it easier for users to recognize the best result for their search.<\/p>\r\n\r\n\r\n\r\n
Now let\u2019s see how you can create an effective and relevant title.<\/p>\r\n\r\n\r\n\r\n
Titles are an important part of the content. A title gives users an insight into the content, and it is often the main information on which a user decides whether to click.<\/p>\r\n\r\n\r\n\r\n
So, here are a few tips to create a good and effective title:<\/p>\r\n\r\n\r\n\r\n
Titles like \u201cfoo bars, foo bar, Foobar, foobars\u201d do not help users at all, and this kind of keyword repetition can make Google treat your result as spammy.<\/p>\r\n\r\n\r\n\r\n
Instead, consider putting your site name at the beginning or end of your page title.<\/p>\r\n\r\n\r\n\r\n
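The advice above can be sketched in a few lines of JavaScript. The helper below is purely illustrative (the function and its names are not from Google's guide); it builds a descriptive page title with the site name appended once, avoiding the repeated-keyword pattern warned about earlier. In a single-page app you would assign the result to document.title on client-side navigation.

```javascript
// Illustrative sketch: build a descriptive title with the site name at the end.
// makeTitle and its arguments are hypothetical, not part of any Google API.
function makeTitle(pageTitle, siteName) {
  const trimmed = pageTitle.trim();
  // Avoid "Example Store | Example Store"-style repetition,
  // which can read as keyword stuffing.
  if (trimmed.toLowerCase() === siteName.toLowerCase()) {
    return trimmed;
  }
  return trimmed + ' | ' + siteName;
}

// In a single-page app, update the title on client-side navigation:
// document.title = makeTitle('Red Running Shoes', 'Example Store');
```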
Follow Google\u2019s guidelines to make sure that your code is compatible with Googlebot; this will also help you avoid common JavaScript problems.<\/p>\r\n\r\n\r\n\r\n
To avoid JavaScript issues, follow the steps given below:<\/p>\r\n\r\n\r\n\r\n
You can use HTTP status codes to let Googlebot know what you want to do with a page. With a status code you can tell Googlebot, for example, that a page should not be crawled, or that it has moved to a different URL.<\/p>\r\n\r\n\r\n\r\n
Given below is a list of HTTP status codes, and where to use them:<\/p>\r\n\r\n\r\n\r\n
301\/ 302- <\/strong>For pages that have been permanently or temporarily moved to a new URL<\/p>\r\n\r\n\r\n\r\n 401\/ 403- <\/strong>For pages that are unavailable because of permission issues<\/p>\r\n\r\n\r\n\r\n 404\/ 410- <\/strong>For pages that are no longer available.<\/p>\r\n\r\n\r\n\r\n 5xx- <\/strong>When something goes wrong on the server.<\/p>\r\n\r\n\r\n\r\n If you want to prevent a page from being indexed by Googlebot, you can do so by using the robots meta tag.<\/p>\r\n\r\n\r\n\r\n For example, if you want Googlebot not to index your page, you can add the following tag at the top of your page:<\/p>\r\n\r\n\r\n\r\n <!-- Googlebot won't index this page or follow links on this page --><\/strong><\/p>\r\n\r\n\r\n\r\n <meta name=\"robots\" content=\"noindex, nofollow\"><\/strong><\/p>\r\n\r\n\r\n\r\n You can also change or add a robots meta tag on your page using JavaScript, for example to stop Googlebot from indexing the page when an API call fails. The following code shows how to do this:<\/p>\r\n\r\n\r\n\r\n fetch('\/api\/products\/' + productId)<\/strong><\/p>\r\n\r\n\r\n\r\n \u00a0\u00a0.then(function (response) { return response.json(); })<\/strong><\/p>\r\n\r\n\r\n\r\n \u00a0\u00a0.then(function (apiResponse) {<\/strong><\/p>\r\n\r\n\r\n\r\n \u00a0\u00a0\u00a0\u00a0if (apiResponse.isError) {<\/strong><\/p>\r\n\r\n\r\n\r\n \u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\/\/ get the robots meta tag<\/strong><\/p>\r\n\r\n\r\n\r\n \u00a0\u00a0\u00a0\u00a0\u00a0\u00a0var metaRobots = document.querySelector('meta[name=\"robots\"]');<\/strong><\/p>\r\n\r\n\r\n\r\n \u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\/\/ if there was no robots meta tag, add one<\/strong><\/p>\r\n\r\n\r\n\r\n \u00a0\u00a0\u00a0\u00a0\u00a0\u00a0if (!metaRobots) {<\/strong><\/p>\r\n\r\n\r\n\r\n \u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0metaRobots = document.createElement('meta');<\/strong><\/p>\r\n\r\n\r\n\r\n \u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0metaRobots.setAttribute('name', 'robots');<\/strong><\/p>\r\n\r\n\r\n\r\n 
\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0document.head.appendChild(metaRobots);<\/strong><\/p>\r\n\r\n\r\n\r\n \u00a0\u00a0\u00a0\u00a0\u00a0\u00a0}<\/strong><\/p>\r\n\r\n\r\n\r\n \u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\/\/ tell Googlebot to exclude this page from the index<\/strong><\/p>\r\n\r\n\r\n\r\n \u00a0\u00a0\u00a0\u00a0\u00a0\u00a0metaRobots.setAttribute('content', 'noindex');<\/strong><\/p>\r\n\r\n\r\n\r\n \u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\/\/ display an error message to the user<\/strong><\/p>\r\n\r\n\r\n\r\n \u00a0\u00a0\u00a0\u00a0\u00a0\u00a0errorMsg.textContent = 'This product is no longer available';<\/strong><\/p>\r\n\r\n\r\n\r\n \u00a0\u00a0\u00a0\u00a0\u00a0\u00a0return;<\/strong><\/p>\r\n\r\n\r\n\r\n \u00a0\u00a0\u00a0\u00a0}<\/strong><\/p>\r\n\r\n\r\n\r\n \u00a0\u00a0\u00a0\u00a0\/\/ display product information<\/strong><\/p>\r\n\r\n\r\n\r\n \u00a0\u00a0\u00a0\u00a0\/\/ …<\/strong><\/p>\r\n\r\n\r\n\r\n \u00a0\u00a0});<\/strong><\/p>\r\n\r\n\r\n\r\n When Googlebot sees noindex<\/strong> in the robots meta tag before executing JavaScript, it does not render or index the page.<\/p>\r\n\r\n\r\n\r\n Images can be costly in terms of data transfer and page performance.<\/p>\r\n\r\n\r\n\r\n Lazy-loading is a good strategy for images: it loads an image only when the user is about to see it.<\/p>\r\n\r\n\r\n\r\n To implement lazy-loading in a search-friendly way, follow the process below.<\/p>\r\n\r\n\r\n\r\n For Googlebot to see all the content on your page, make sure your lazy-loading implementation loads all relevant content whenever it becomes visible in the viewport.<\/p>\r\n\r\n\r\n\r\n Here are some ways to do it:<\/p>\r\n\r\n\r\n\r\n If you implement infinite scroll on your page, you also need to support paginated loading.<\/p>\r\n\r\n\r\n\r\n Paginated loading is important. 
It allows users to share and re-engage with your content, and it lets Google show a link to a specific point in the content rather than to the top of an infinite scrolling page.<\/p>\r\n\r\n\r\n\r\n To support paginated loading, provide a unique link to each section that users can share and load directly. You can use the History API to update the URL as the content loads dynamically.<\/p>\r\n\r\n\r\n\r\n After you have completed the implementation, make sure it works properly. One way to do this is to use a Puppeteer script to quickly check it.<\/p>\r\n\r\n\r\n\r\n To run the script, you need Node.js. Follow the given steps to get the script and run it:<\/p>\r\n\r\n\r\n\r\n git clone https:\/\/github.com\/GoogleChromeLabs\/puppeteer-examples<\/strong><\/p>\r\n\r\n\r\n\r\n cd puppeteer-examples<\/strong><\/p>\r\n\r\n\r\n\r\n npm i<\/strong><\/p>\r\n\r\n\r\n\r\n node lazyimages_without_scroll_events.js -h<\/strong><\/p>\r\n\r\n\r\n\r\n After running the script, review the screenshots it takes to check that they contain all the content you expect to be visible and indexed by Googlebot.<\/p>\r\n\r\n\r\n\r\n So, these are the basics of JavaScript SEO. Using JavaScript properly on your website can improve your site\u2019s presence in Google Search.<\/p>\r\n\r\n\r\n\r\n This guide should help you optimize your website significantly.<\/p>\r\n","protected":false},"excerpt":{"rendered":" One of the important parts of the web platform is JavaScript. It helps to convert the web into a good powerful platform by applying many features. If your web application, […]<\/p>\n","protected":false},"author":2,"featured_media":2365,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[284],"tags":[],"yoast_head":"\n4. Make use of the Meta Robots Tags Properly<\/h4>\r\n\r\n\r\n\r\n
5. Fix Images and Lazy-Loaded Content<\/h4>\r\n\r\n\r\n\r\n
\r\n
\r\n
\r\n
\r\n
Conclusion<\/h2>\r\n\r\n\r\n\r\n