Choosing a city on the site?
Let's say there is a site with ads. The city is determined via geoIP or when the user has manually selected a city. The city ID is recorded in the session, and then the necessary ads are shown to the user.
That is, there is no difference in the URL between a person from Kaliningrad and one from Khabarovsk.
Now the question: how do we get search robots to crawl all the ads? If a robot reaches the city-selection links and then follows them, will a city ID be stored in a session for it? If not, what is the way out of this situation?
You are doing it wrong. First, the ideology of the Web implies that different content lives at different addresses; showing different people different things at the same URL depending on their city is wrong. Each page must have its own unique address.
Now imagine a user. He goes to Qooqle, types "buy a moped moscow", sees "Selling a moped, cheap, Moscow", clicks the link, and instead of the moped you show him ads from his own city of Zamkadovsk (where at most you can buy a used bike). That is not right. We make a website for people, not for SEO, right?
For the same reason, you cannot show different content to bots and to people. A person who comes to the site from a search expects to see the same thing he saw in the search results. For violating this rule, search engines can lower the site's ranking and remove pages from the results, and rightly so.
You need to make separate folders or domains for different cities: saint-peterburg.objava.ru/list, for example, or objava.ru/saint-petersburg/list. If a user from one city lands on another city's domain, you can show an offer in the header to switch to his own city's domain, but do not impose it or force the switch. Maybe I am travelling to some city and want to see ads from there.
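To illustrate the path-based option, here is a minimal sketch of taking the city from the first URL segment instead of the session. The city slugs and function name are hypothetical, not from the original site:

```python
from urllib.parse import urlparse

# Hypothetical set of supported city slugs.
KNOWN_CITIES = {"moscow", "saint-petersburg", "khabarovsk"}

def city_from_url(url):
    """Return the city slug from a path like /saint-petersburg/list,
    or None if the first segment is not a known city."""
    segments = [s for s in urlparse(url).path.split("/") if s]
    if segments and segments[0] in KNOWN_CITIES:
        return segments[0]
    return None
```

With this scheme the same URL always means the same ads, for every visitor and for every crawler, and nothing depends on session state.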
As for the main domain, objava.ru, you can show a map of Russia or a list of cities on it (with normal links, without JavaScript and other rubbish), highlight the user's city in it, and offer to go there. For example, the way it is done on the main page of avito.ru. A person will go to his city, and a robot will crawl everything.
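The main-page city list can be rendered as plain anchor tags so both people and crawlers can follow them. A minimal sketch, assuming a hypothetical city dictionary and the path scheme above:

```python
# Hypothetical city data; slugs double as the first URL segment.
CITIES = {"moscow": "Moscow", "saint-petersburg": "Saint Petersburg"}

def render_city_list(user_city=None):
    """Render the city list as plain <a> links; the geoIP-detected
    city (if any) is merely highlighted, never forced."""
    items = []
    for slug, name in sorted(CITIES.items()):
        label = f"<strong>{name}</strong>" if slug == user_city else name
        items.append(f'<li><a href="/{slug}/list">{label}</a></li>')
    return "<ul>\n" + "\n".join(items) + "\n</ul>"
```

Because these are ordinary links, a crawler discovers every city section without needing cookies or sessions, while the user still gets a hint about his own city.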
But for showing different content at the same URL depending on IP, and for redirecting without asking the user, I think sites should be banned (which sometimes happens). We do not need a collective-farm Internet. Let's do everything according to standards, usability guidelines, and best practices, instead of trying to outwit search engines and users.
A legitimate question: even if the search engine indexes all this goodness, how will it send users to these ads? A link for one city will not work for a user whose city is detected by IP as a different one. So you cannot do it this way; the URLs must differ. You can use subdomains, or put the city into the first segment of the path.
Making the URL differ for different content is the only correct solution, with or without search bots.
Even if the bot crawls all the ads, the links in the search results will still be wrong, because a person's city may be detected differently.
Otherwise you will keep building more and more half-working crutches (what about caching later? copying a link? the Back button when switching between cities?). It is better to eliminate the cause.