I was wondering if the nature of decentralization would negatively affect SEO, since people can access the same post from many different instances.
I tried searching for the title of this post verbatim and it isn't in Google results, period.
That could just be lag between when the post was created and when the Google crawler finds and indexes the page.
https://lemmy.ml/robots.txt, https://lemmy.world/robots.txt, etc. don't seem to disallow posts, so the text-based content should be easy to index, at least for these instances.
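For reference, a permissive robots.txt for an instance could look something like this (a hypothetical sketch, not the actual contents of either file above; paths and the sitemap URL are illustrative):

```
# Hypothetical robots.txt for a Lemmy instance
User-agent: *          # applies to all crawlers
Disallow: /login       # keep auth and settings pages out of the index
Disallow: /settings
Allow: /               # everything else, including posts and communities, is crawlable
Sitemap: https://example-instance.tld/sitemap.xml
```

As long as post and community paths aren't disallowed, crawlers like Googlebot are free to fetch and index them.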
Related news: Google is getting a lot worse because of the Reddit blackouts.
An earlier post pointed out that federated sites seem likely to suffer against centralized content in an SEO world, regardless of whether they are technically indexable.
I wonder if Lemmy should have an SEO-friendly federated site… .com domain, robots.txt and everything else…
Right, though the SEO game is changing drastically with AI. People often use GPT-like models in place of searches and, likewise, expect search results to contain their answers rather than being vague pointers. Following this reasoning, search engines will need to direct users to where the valuable information is, not always, but often enough not to lose users to competitors.
So the thing about SEO is that it's often an attention game in which advertisers and smaller websites compete with each other. The information in public forums and threads is invaluable to the success of the search engine itself, so search engines are the ones that will eventually have to adapt to the new federated reality, should it become mainstream, and I do hope so.
How long does it usually take for Google to index websites? Because I tried the string

`lemmy site:lemmy.ml after:2023-06-15`

and only one post turned up for me, and it was "Memes"… the current state of affairs does not seem promising 😔 And when I tried another instance with the same keywords,

`lemmy site:kbin.social after:2023-06-15`

nothing turned up at all.

I wonder though, will search engines adapt to Lemmy and its fediverse system? Or will search engines die? Or will we see dedicated search engines for searching through the fediverse?
> How long does it usually take for Google to index websites?
Anything from a couple of hours to more than a week. I don't think having a "real-time feed" through Google is important, though; other than World Cup scores, their results were never about speed.
Oh, good point. Yes, probably? We cannot simply assume search engines know that all of these point to the same content:
- https://slrpnk.net/c/technology
- https://feddit.de/c/technology@slrpnk.net
- https://sopuli.xyz/c/technology@slrpnk.net
- https://beehaw.org/c/technology@slrpnk.net
Or even worse, due to defederation, they may not all point to the exact same content.
Without further investment from either Lemmy's or the search engines' side, they are probably seen as distinct, unaggregated sources, which makes each one individually less relevant and less likely to show up.
Also note that none of the addresses above contains "lemmy". How would users search for Lemmy content in these cases? You can't do "technology site:lemmy", can you?
That said, Lemmy content is visible: I haven't seen it on the first page of Ecosia yet, but it does show up on page 2 or 3.
This is relatively simple to solve from a technology perspective: federated sites just need to include a canonical URL meta tag that references the source URL. It'd be trivial to implement, provided the authoritative URL is known.
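A minimal sketch of what that could look like in a federated copy's HTML head (both URLs here are hypothetical, just to show the direction of the reference):

```html
<!-- Served on the federated copy, e.g. https://feddit.de/post/123 -->
<!-- Tells crawlers that the originating instance's URL is the authoritative version, -->
<!-- so ranking signals consolidate there instead of splitting across instances. -->
<link rel="canonical" href="https://slrpnk.net/post/456" />
</head>
```

Search engines that honor rel=canonical would then fold the duplicate copies into the canonical one, addressing the relevance-splitting problem mentioned above.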
Maybe you could use site:lemmy.ml? Since they federate with most instances, they're likely to have most of Lemmy's content.