This way, Google would continue to crawl the old URLs and "see" the redirects. Webmasters often make the mistake of removing sitemaps too early, which can cause Google to reduce its crawl rate, meaning it could take longer for Google to process the redirects. Sitemaps aren't a guarantee that Google will visit all your old URLs, but they do provide a hint. In fact, even with the sitemaps in place, there were still several thousand URLs that Google hadn't visited after several months.
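Before relying on the old-URL sitemaps, it's worth confirming that each old URL actually returns a permanent redirect. Here's a minimal sketch of that kind of audit; the URLs and the injected `fetch_status` callable are hypothetical stand-ins (in practice you'd wrap something like `urllib.request` to fetch real status codes):

```python
from typing import Callable, Dict, List

def audit_redirects(urls: List[str], fetch_status: Callable[[str], int]) -> Dict[str, List[str]]:
    """Bucket old URLs by whether they return a permanent redirect.

    fetch_status is injected so the audit can run against a stub in tests;
    in production it would make real HTTP requests.
    """
    report: Dict[str, List[str]] = {"redirected": [], "needs_fix": []}
    for url in urls:
        status = fetch_status(url)
        if status in (301, 308):          # permanent redirects Google will follow
            report["redirected"].append(url)
        else:                             # 200, 302, 404, etc. need attention
            report["needs_fix"].append(url)
    return report

# Example with stubbed status codes (hypothetical URLs):
statuses = {
    "https://example.com/q/old-post": 301,
    "https://example.com/q/stale-post": 404,
}
report = audit_redirects(list(statuses), statuses.get)
```

Injecting the fetcher keeps the audit logic testable offline and makes it easy to swap in a rate-limited HTTP client later.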
Regardless, without the sitemaps of the old URLs, the issue could have taken much longer to resolve. 2. New URLs: Our old sitemaps were grouped into lists of 50,000 URLs each — the maximum allowed by Google. There's some suggestion in the SEO community that grouping URLs into smaller sitemaps can actually improve crawling efficiency. Fortunately, NodeBB allowed us to build smaller sitemaps by default, so that's exactly what we did. Instead of 2-3 sitemaps with tens of thousands of URLs each, we now had 130 individual XML sitemaps, typically with no more than 500 URLs each.
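The chunking step is straightforward to script if your platform doesn't do it for you. Here's a minimal sketch that splits a URL list into sitemap documents of at most 500 entries each (the URL pattern is a hypothetical example, not Moz's actual structure):

```python
from typing import List

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemaps(urls: List[str], chunk_size: int = 500) -> List[str]:
    """Split a flat URL list into sitemap XML documents of at most
    chunk_size <url> entries each."""
    sitemaps = []
    for i in range(0, len(urls), chunk_size):
        entries = "\n".join(
            f"  <url><loc>{u}</loc></url>" for u in urls[i:i + chunk_size]
        )
        sitemaps.append(
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            f'<urlset xmlns="{SITEMAP_NS}">\n{entries}\n</urlset>'
        )
    return sitemaps

# 1,300 hypothetical URLs become three sitemaps: 500 + 500 + 300
urls = [f"https://example.com/community/q/{n}" for n in range(1300)]
sitemaps = build_sitemaps(urls)
```

In practice each chunk would be written to its own file and listed in a sitemap index so Google can discover them all from one entry point.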
Examples of Moz XML sitemaps. 3. Spam + cruft cleanup: As I mentioned earlier, the old Q&A had over 60,000 individual posts built up over 10 years. Inevitably, a number of these posts were very low quality. We suspected that both the low quality of the posts and the poor user experience could be causing Google to rank us lower. Again, time constraints meant we couldn't do a full content pruning audit. Fortunately, NodeBB came to the rescue again (this is starting to sound like an advertorial — I swear it's not!) and ran all 60,000 posts through their spam plugin to remove the most obvious, low-quality offenders.
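If you don't have a spam plugin to lean on, even crude heuristics catch a surprising share of cruft. Here's a hedged sketch of that kind of first-pass filter; it is not how NodeBB's plugin works, just an illustration of the idea, and `min_words` is an arbitrary threshold:

```python
from typing import Dict, List

def flag_low_quality(posts: List[Dict], min_words: int = 5) -> List[Dict]:
    """Flag posts that look like cruft: very short bodies or link-only
    content. A rough stand-in for real spam-detection signals."""
    flagged = []
    for post in posts:
        body = post.get("content", "")
        words = body.split()
        link_only = body.strip().startswith("http") and len(words) == 1
        if len(words) < min_words or link_only:
            flagged.append(post)
    return flagged

# Hypothetical posts: two obvious offenders, one substantive answer
posts = [
    {"content": "thanks!"},
    {"content": "http://spam.example"},
    {"content": "Here is a detailed answer explaining canonical tags properly."},
]
cruft = flag_low_quality(posts)
```

A filter like this is best used to produce a review queue rather than to delete posts automatically, since short replies are sometimes legitimate.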