Top 4 On-Site SEO Analysis Tools
Posted: Mon Dec 09, 2024 9:27 am
Managing optimization projects, especially for large e-commerce sites, is one of the most common tasks we handle at Zeo. Leading a team responsible for on-site SEO operations and rolling out improvements on sites of that scale takes serious effort. Fortunately, as always, there are excellent tools built to reduce the manual workload in this process.
For any site with 100+ pages, you will need crawl analysis tools to see the overall picture more easily. With these tools, you can measure the SEO health of your pages faster and with far less manual effort. In this article, I will introduce some tools that we frequently use in our clients' projects at Zeo.
Although some of the tools I will mention are paid and some are free, I see them as complementary. Using them side by side during the evaluation process lets each cover the others' blind spots and produces much more effective results. That way, you gain more control over your on-site SEO work and can build effective optimization strategies around these tools.
Moz Pro
With this tool from Moz, we can examine the on-site SEO condition of projects of any size, thanks to research reports written in understandable language. I say "understandable" because even someone without advanced knowledge of the subject can easily grasp the current situation.
Another advantage of Moz Pro is that, in addition to on-site SEO analysis, it offers its users many other services, such as competitor analysis, rank tracking, and link analysis. In this article, though, I will stay within the scope of on-site SEO.

I would like to walk through the process in general. First, we create a campaign by entering information about the website we want to analyze. After creating the campaign, the site needs a certain amount of time to be crawled; since large e-commerce sites have many subpages, this crawl can take a few days. Once all the analysis is complete, we reach the evaluation phase, which is frankly one of the most critical parts of the job. It is essential to carry out this evaluation accurately, in other words, to look at our site from Google's perspective, because the changes we decide to make will be based on it.
When you go to the "Crawl Diagnostics" tab, you can access all crawled pages and their status. The example shown is the analysis of a medium-sized e-commerce site. This page lists page-level errors and points to watch out for, with every problem shown in detail. On the right, you can see which issues occur and roughly how many pages are affected by each. Through the menus under the "Show" button above, you can inspect the problems one by one, grouped under the headings "error", "warning", and "notice".
Moz reports some findings as "errors" that definitely need to be fixed: 4xx errors, 5xx errors, missing or empty titles, duplicate page content, duplicate page titles, and pages blocked by robots.txt. Although the count looks high in the example, you will see these problems shrink steadily over a short time once appropriate strategies are in place. If you are unsure what any of these errors mean exactly, each error type has a dedicated page with detailed information.
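To make the error buckets above concrete, here is a minimal sketch of how crawl results could be sorted into them. This is not Moz's actual logic; the page records and field names are hypothetical, purely for illustration.

```python
# Hypothetical sketch: classify crawl results into the error buckets
# described above (4xx/5xx responses, missing titles, duplicate titles,
# pages blocked by robots.txt). Not Moz's real implementation.

def classify_crawl_issues(pages):
    """pages: list of dicts with 'url', 'status', 'title', optional 'blocked'."""
    issues = {"4xx_error": [], "5xx_error": [], "missing_title": [],
              "duplicate_title": [], "blocked_by_robots": []}
    titles = {}
    for page in pages:
        if 400 <= page["status"] < 500:
            issues["4xx_error"].append(page["url"])
        elif page["status"] >= 500:
            issues["5xx_error"].append(page["url"])
        if not page.get("title"):
            issues["missing_title"].append(page["url"])
        else:
            titles.setdefault(page["title"], []).append(page["url"])
        if page.get("blocked"):
            issues["blocked_by_robots"].append(page["url"])
    # Any title shared by two or more pages counts as duplicated.
    for urls in titles.values():
        if len(urls) > 1:
            issues["duplicate_title"].extend(urls)
    return issues

# Sample (made-up) crawl data for a quick demonstration.
sample = [
    {"url": "/a", "status": 200, "title": "Shoes"},
    {"url": "/b", "status": 200, "title": "Shoes"},
    {"url": "/c", "status": 404, "title": ""},
    {"url": "/d", "status": 500, "title": "Bags", "blocked": True},
]
report = classify_crawl_issues(sample)
```

Here `/a` and `/b` would be flagged for duplicate titles, `/c` for a 4xx response and a missing title, and `/d` for a 5xx response and a robots.txt block.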
Moz's crawl analysis re-runs at regular intervals until you end the campaign, reporting the site's status and progress. If you wish, you can export this progress as a PDF or Excel file and continue your analysis and adjustments from there.
Screaming Frog SEO Spider
Screaming Frog is a truly indispensable program for understanding how your website performs in organic search and, through the changes you make, for ensuring Googlebot can navigate your site easily.
Screaming Frog is a user-friendly desktop program built specifically for SEO work. It is especially valuable for large projects, since it has no 10,000- or 20,000-page crawl limit like Moz Pro. It can shrink a manual review that might take hours down to minutes while surfacing many kinds of analysis results. Considering that Google does not share some of this data even in its own Search Console, SEO Spider does some serious data mining.
Using it is straightforward: right after downloading and installing the program, you simply enter the URL of the website you want to examine in the address bar and press "Start". Once the crawl completes, the different result types and data views are listed.
We can filter this data by page structure to isolate and analyze exactly the results we want. As an example of the categories: under the "Internal" tab, we can easily see the site's internal links and which pages link to which.
In addition, the "Status Code" column lets us check each linked page's state: whether it has problems or redirects the way we intend.
The "External" tab is not much different. It covers links pointing outside the site and their condition, listing the full URL of each target, its content type (image, text, video, etc.), and its response (status code).
Other important analyses Screaming Frog offers are "Page Titles" and "Meta Description". By examining these results, we can easily detect duplicate page titles and missing or incomplete meta descriptions. Beyond these, you can access plenty of other data, from h1, h2, and other heading usage to image file analysis.
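Under the hood, a title or meta-description check boils down to parsing each page's HTML. As a rough illustration of what such tools extract, here is a sketch using only Python's standard-library HTML parser; the sample HTML is made up for the example.

```python
# Rough sketch (not Screaming Frog itself): pull the <title> and meta
# description out of a page's HTML with the standard library only.
from html.parser import HTMLParser

class TitleMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Made-up sample page for demonstration.
html = ("<html><head><title>Red Shoes</title>"
        "<meta name='description' content='Comfortable red shoes.'>"
        "</head><body></body></html>")
parser = TitleMetaParser()
parser.feed(html)
```

Running the same extraction over every crawled page, then grouping pages by title and flagging empty descriptions, gives you exactly the duplicate-title and missing-description reports described above.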