olijen, 2018-05-31 02:08:04
JavaScript

A project with a non-standard architecture: how do I optimize an AJAX site for SEO?

I'm trying out a new project architecture. The idea is to combine the advantages of synchronous and asynchronous applications.

I use the PJAX technique (loading entire blocks via AJAX + pushState).
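For context, a minimal client-side setup with the jQuery PJAX plugin (defunkt/jquery-pjax) looks roughly like this; the container ids here are just illustrations:

    // Enhance ordinary links: fetch via AJAX, swap the container, push the URL.
    // Without JS (or if this script fails), the same links do full page loads.
    $(document).pjax('a', '#main', { timeout: 3000 });

    // Reload a block programmatically, e.g. after a comment is submitted.
    $.pjax.reload('#comments');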

The site is designed so that a synchronous request (of the form /controller/action?param1=value) does not differ from the same asynchronous one. That is, if you turn PJAX off across the entire site, the site should keep working as it did: the links do not change and obediently serve content, reloading the whole page with the full layout.

For example, when the user clicks an item in the main menu, I reload the menu block and the main content block. The menu links are ordinary ones, as if the site were synchronous, but PJAX adds the _pjax=#some-block parameter; the server uses it to decide whether to return only the requested block or the entire page.
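A sketch of what the server side might look like under these assumptions (Node/Express here purely for illustration; renderBlock and renderFullPage are hypothetical helpers):

    const express = require('express');
    const app = express();

    app.get('/blog/:slug', (req, res) => {
      // jquery-pjax appends _pjax=<container selector> to GET requests.
      const pjaxContainer = req.query._pjax;
      if (pjaxContainer) {
        // Asynchronous request: return only the requested block's HTML.
        res.send(renderBlock(pjaxContainer, req.params.slug));
      } else {
        // Direct/synchronous request (user or bot): return the full page.
        res.send(renderFullPage(req.params.slug));
      }
    });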

I set everything up so that it works with human-readable (SEF) URLs. I also made it possible to declare block dependencies and reload several blocks at once (for example, when a comment is submitted, update the form, the comment list, and the notification block).

The system works many times faster and more efficiently than with synchronous requests. In theory, the site can be indexed normally, because the links look like those of a regular synchronous site or blog. But that is only an assumption.

That, in fact, is the question. Am I right to assume that with this approach it is possible to build not only a good, fast, user-friendly website, but one that also does not sink in the search results?

Do I need to update the meta tags on asynchronous requests? Or will the bots follow the links synchronously?
Will a search engine accept links that sit in href="..." (they work in synchronous mode, on a direct request; that is the trick)?
Or will the search engine dig deeper, track the events attached to the PJAX block, and perceive the link not as /sefurl-ssilka but as /sefurl-ssilka?_pjax=#some-block?

If the search engines do not appreciate my efforts, then I have to decide: either build a synchronous site and stop worrying about it, or ignore the search engines entirely and build an SPA. I have already done both, and it is no longer interesting; I would like to experiment with a new approach, but not to the detriment of my project.

There is very little information. Everyone writes that PJAX is a cool technology, but everyone who tries it runs into a lot of problems because of it and, apparently, gives up at some point. That is not a solution. Who has worked with this? Can anyone share sources of information, or explain the specifics of how search engines behave in this respect?


5 answer(s)
Denis, 2018-05-31
@sidni

Why guess? Register in Google Search Console; it has a feature for viewing a page as the Google bot sees it. Enter your PJAX link there and see what happens (you get two panes: how the user sees the page and how the bot sees it).
But it seems to me a controversial situation may arise. Even if a bot without JS follows the link while a JS user stays on the old page and only receives new content, the meta tags in the head (and some of the images) may end up different. Google may then decide you are trying to deceive the bot by serving different content to users and to the bot, and in quite a few cases that means a ban. True, the problem of AJAX sites has been around for ten years; maybe Google is more lenient about it by now.
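If you do update meta tags on asynchronous navigation, one hedged sketch using jquery-pjax events (the plugin already swaps document.title when the fetched fragment contains a title tag; the description sync below assumes the server embeds it as a data attribute on the fragment, which is purely illustrative):

    $(document).on('pjax:success', function (event, data, status, xhr) {
      // Keep the meta description in sync with the newly loaded content.
      var desc = $('#main').data('meta-description');
      if (desc) {
        $('meta[name="description"]').attr('content', desc);
      }
    });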

Dmitry Evgrafovich, 2018-05-31
@Tantacula

And what is cool about it? Serving pieces of content is, at the very least, trouble down the road when you decide to change the design: will your markup developer crawl through your server code? You can anticipate that at an early stage by splitting templates into separate files in the template engine (and not forgetting to maintain them).
On the front end there are also potential troubles with event tracking: what if the buttons you need to track are inside the loaded block? Constructs like $('body').on('click', 'some-element-inside-loaded-block') are slower than $('some-element').on('click'), and on top of that I have no idea how you would hook an adequate framework into this setup and make it work. And if the loaded piece of markup contains, say, a DataTables view plugin, you will have to activate it; and will the old instance keep eating memory if it is not unloaded before the HTML block is removed? Eventually you will conclude that along with the HTML you also have to ship JS through the same PJAX channel, and at that point you will realize you are badly stuck. A sketch of the event problem follows below.
In fairness, I note that I have not yet used SSR for JS frameworks in practice (there has been no need, since in my tasks the frameworks have mostly handled non-indexed pages such as personal accounts), and there everything is done in two clicks; but of these two options I would rather read up on SSR than implement PJAX.
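To illustrate the event-tracking point: handlers bound directly to elements inside a block are lost every time the block is replaced, so you either delegate or re-initialize on a PJAX event. A sketch (the selectors are illustrative; the pjax:* events are from jquery-pjax):

    // Delegated handler: survives block replacement, but every click
    // has to bubble up to body before the selector is matched.
    $('body').on('click', '.comment-reply', handleReply);

    // Alternative: re-initialize widgets after each PJAX load...
    $(document).on('pjax:end', function () {
      $('#main table.datatable').DataTable();
    });

    // ...and tear them down first, or old instances keep eating memory.
    $(document).on('pjax:beforeReplace', function () {
      $('#main table.datatable').DataTable().destroy();
    });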

ivankomolin, 2018-05-31
@ivankomolin

If the site is aimed both at users and at SEO, then this is exactly how it should be: fast for users, accessible to search engines.
I implemented a similar system, and I think it is the only way to avoid slowing users down because of the demands of SEO specialists.
But you need to understand that search engines do not stand still; they should already have a robot that can execute JS. So the main rule is: for the same URL, the bot and a JS-enabled browser must receive exactly the same content. Then there should not be any problems.
Points that came up during development:
1. Do not reload the page in very small pieces, because that pushes complex logic onto the client. The simpler the JS on the client, the better. In the end, instead of many small pieces I arrived at two: a static part and a dynamic part.
2. Form the HTML on the server. That is the most convenient option for the client.
3. The template for a given part must be the same for the bot and for the JS-enabled browser. It is easier to maintain, and thanks to point 2 it becomes easier than ever. See the sketch after this list.
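A sketch of points 2 and 3, assuming Express with some template engine; renderTemplate and the template names are hypothetical. The same template renders the dynamic part whether it is requested alone or embedded in the full layout:

    app.get('/blog/:slug', (req, res) => {
      // One template for both the bot and the JS-enabled browser.
      const html = renderTemplate('blocks/article', { slug: req.params.slug });
      if (req.query._pjax) {
        res.send(html);                       // dynamic part only
      } else {
        res.send(renderTemplate('layout', {   // static part wrapping it
          content: html
        }));
      }
    });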

Maxim Timofeev, 2018-05-31
@webinar

everyone writes that this PJAX is a cool technology

It is cool. But that does not mean it is meant for your undertaking. Its job is to load blocks simply and uniformly. That is mega-convenient when building an admin panel, for example, or submitting a form via AJAX. Building the entire site (the front end) on it is not the best idea; it makes more sense to take some JS framework. That said, you will not run into very serious problems SEO-wise: if you did everything right, GET requests to your links should return valid HTML, and what else does a bot need to be happy?
PJAX is probably the first solution of this kind you have come across, so I advise you to look at other JS technologies as well, since PJAX is more of a hack than a technology.

profesor08, 2018-05-31
@profesor08

Afraid of the parameter? Get rid of it. Load everything via GET and only a part via POST, or come up with some other way to identify the request.
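One such way, if the _pjax query parameter bothers you: jquery-pjax also sends an X-PJAX request header, so the server can key off that and the URLs stay clean. A sketch in Express:

    app.use((req, res, next) => {
      // True for PJAX requests; direct requests and bots lack the header.
      req.isPjax = req.get('X-PJAX') === 'true';
      // Tell caches that the same URL serves two different bodies.
      res.set('Vary', 'X-PJAX');
      next();
    });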
