QQ, 2016-10-19 13:11:52
Google

How do I get a page fragment that is loaded via JS indexed?

Hello.
There is a piece of JS that inserts an important block of internal site links into the page:

document.addEventListener('DOMContentLoaded', function() {
  // Insert the internal link into the #add container once the DOM is ready
  var add = document.querySelector('#add');
  add.innerHTML = '<a href="/hello_world">Hello world!</a>';
});

Yandex recommends using an HTML copy for AJAX sites. Mine is not an AJAX site, but I still need the links inserted via JS to be indexed.
Alternatively, these links could be added to the sitemap, but I would like them to be indexed through the site's pages themselves. How can this be implemented?
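Just to show what I mean by the sitemap alternative, here is a minimal sketch (example.com stands in for my domain):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- URL that currently only appears in the page via JS -->
  <url>
    <loc>https://example.com/hello_world</loc>
  </url>
</urlset>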

3 answers
Max, 2016-10-19
@AloneCoder

Re-read it: "Yandex recommends using an HTML copy."
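Roughly, the HTML-copy (AJAX indexing) scheme works like this; this is a sketch of the idea, not a drop-in solution. The meta tag tells the crawler to re-request the page as /your-page?_escaped_fragment_=, and the server must answer that request with a full HTML snapshot in which the links are already in the markup:

<!-- Page served to browsers -->
<head>
  <meta name="fragment" content="!">
</head>
<body>
  <div id="add"></div>
  <!-- JS inserts the link here for real visitors -->
</body>

<!-- Snapshot the server returns for /your-page?_escaped_fragment_= -->
<body>
  <div id="add"><a href="/hello_world">Hello world!</a></div>
</body>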

Golover, 2016-10-19
@Golover

Possible causes:
1. The JS ran only after the crawler had already scanned the page. Try moving the code as high up in the page as possible.
2. The JS files are blocked from crawling (see the robots.txt sketch below).
Check the "Fetch as Google" report in Google Search Console (Webmaster Tools).
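For point 2, a hypothetical robots.txt fragment that keeps scripts crawlable could look like this (the /js/ path is a placeholder for wherever your scripts actually live):

# Hypothetical robots.txt fragment: do not block script files,
# otherwise the crawler cannot execute the JS that inserts the links.
User-agent: *
Allow: /js/
Allow: /*.js$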

Yury Egorenkov, 2016-12-18
@yury_egorenkov

Whether it is AJAX or not isn't the issue; the JS needs to be executed. Search engines and social networks don't do that: they index the HTML that came back in response to the request, and that's it. To have the JS executed for crawlers, you can use a service that does this, for example renderjs.io.
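If you'd rather not rely on a third-party service, the same idea can be sketched by hand: the server looks at the user agent and gives crawlers a ready-made HTML snapshot with the links already in the markup. A rough Node.js illustration (not the renderjs.io API; the bot list and port are placeholders):

// Hypothetical prerendering sketch: crawlers that do not execute JS
// receive a static snapshot, browsers receive the normal JS version.
const http = require('http');

const BOT_PATTERN = /yandex|googlebot|bingbot|facebookexternalhit/i;

// Snapshot with the JS-inserted link already rendered into the HTML.
const snapshot = `<!doctype html>
<html><body>
  <div id="add"><a href="/hello_world">Hello world!</a></div>
</body></html>`;

// Normal page: the link is added on the client by the original script.
const clientPage = `<!doctype html>
<html><body>
  <div id="add"></div>
  <script>
    document.addEventListener('DOMContentLoaded', function () {
      document.querySelector('#add').innerHTML =
        '<a href="/hello_world">Hello world!</a>';
    });
  </script>
</body></html>`;

http.createServer(function (req, res) {
  const ua = req.headers['user-agent'] || '';
  res.writeHead(200, { 'Content-Type': 'text/html; charset=utf-8' });
  // Crawlers get the snapshot, everyone else gets the JS version.
  res.end(BOT_PATTERN.test(ua) ? snapshot : clientPage);
}).listen(3000);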
