
Setting up Screaming Frog SEO Spider

Posted: Sat Feb 01, 2025 8:17 am
by rifat28dddd
With the help of Screaming Frog SEO Spider, you can detect broken links, find duplicate pages and uncover other shortcomings that keep a site from performing well. The program is used by SEO specialists of all levels, web analysts and site owners. We cover it in more detail in this article.
Screaming Frog SEO Spider ("Spider", "SEO Spider") is a crawling program designed for technical SEO audits: it checks a website's compliance with the requirements of search engines.

Features of Screaming Frog SEO Spider
Operating principle
The program was developed by a UK company in 2010 and is compatible with Windows, macOS and Linux. The interface is available in English and four other languages; there is no Russian localization. The scanner works as follows:

"Spider" collects data about the site and links on it;
analyzes them;
prepares a report in a format convenient for downloading.
The program provides more than 25 tools for checking the site.
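To make that workflow concrete, here is a minimal Python sketch of the same collect-analyze-report cycle: it follows internal links starting from a placeholder address (example.com), records each page's status code and title, and writes the results to a CSV file. It is only an illustration of the general principle, not how Screaming Frog itself is built.

# A minimal sketch of the crawl -> analyze -> report cycle described above.
# The start URL is a placeholder; this is not Screaming Frog's implementation.
import csv
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://example.com"  # assumption: a site you are allowed to crawl
MAX_PAGES = 50                     # keep the sketch small

def crawl(start_url, max_pages=MAX_PAGES):
    """Collect internal URLs with their HTTP status codes and titles."""
    domain = urlparse(start_url).netloc
    queue, seen, rows = [start_url], set(), []
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            rows.append({"url": url, "status": "error", "title": ""})
            continue
        soup = BeautifulSoup(resp.text, "html.parser")
        title = soup.title.string.strip() if soup.title and soup.title.string else ""
        rows.append({"url": url, "status": resp.status_code, "title": title})
        # only follow links that stay on the same domain
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == domain:
                queue.append(link)
    return rows

def report(rows, path="crawl_report.csv"):
    """Write the collected data to a CSV file for later analysis."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["url", "status", "title"])
        writer.writeheader()
        writer.writerows(rows)

if __name__ == "__main__":
    report(crawl(START_URL))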

Cost and installation
Screaming Frog SEO Spider is available for free, but there is also a paid version. The free edition lets you check up to 500 URLs and use seven basic crawling tools: finding duplicates, broken links, checking key queries, and so on. The paid version includes every feature: it can sync with Google Analytics, search for spelling and grammar errors, and crawl an unlimited number of URLs.

In both versions you first need to install the program: download the distribution, open it and follow the instructions of the installation wizard. No personal information is required. After installation, launch the program from the desktop shortcut. You can then start crawling right away by entering the site address and clicking Start, but it is recommended to configure the "Spider" first.
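To give a sense of what the duplicate check does conceptually, here is a minimal Python sketch that flags exact duplicates by hashing normalized page text. The pages dictionary is a made-up input (URL mapped to already-downloaded HTML); this only illustrates the idea, not Screaming Frog's actual algorithm.

# A minimal sketch of exact-duplicate detection by hashing page text.
# The pages dict is a hypothetical input; this is an illustration only.
import hashlib
from collections import defaultdict

def find_duplicates(pages):
    """Group URLs whose normalized page text produces the same hash."""
    groups = defaultdict(list)
    for url, html in pages.items():
        normalized = " ".join(html.split()).lower()   # collapse whitespace
        digest = hashlib.md5(normalized.encode("utf-8")).hexdigest()
        groups[digest].append(url)
    # keep only hashes shared by more than one URL
    return [urls for urls in groups.values() if len(urls) > 1]

if __name__ == "__main__":
    pages = {  # hypothetical crawl results
        "https://example.com/a": "<html><body>Hello world</body></html>",
        "https://example.com/b": "<html><body>Hello   world</body></html>",
        "https://example.com/c": "<html><body>Something else</body></html>",
    }
    print(find_duplicates(pages))  # -> [['https://example.com/a', 'https://example.com/b']]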

After opening the program, click Configuration and then Spider. The window that opens has several tabs:

Basic. To speed up crawling, you can exclude CSS, JavaScript and SWF files; the program will then ignore them while it works and carry less load. If the site is closed to crawlers in robots.txt, enable the option to ignore that file on this tab;
Limits. In the Limit Search Depth setting, specify how many levels of page nesting should be crawled. This matters when checking a catalog or another resource with many folders nested one inside another: pages buried deeper than the limit will not be checked;
Advanced. Tick the boxes so that only what search engines see is taken into account. This strips out everything unnecessary and makes the results easier to read;
Preferences. Here you set the maximum sizes for meta tags, URLs, titles, image alt attributes and the images themselves. If it is important to account for every page, enter larger values (see the sketch after this list).
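The limits configured on the Preferences tab come down to simple length checks against each crawled page. Below is a minimal Python sketch of that idea; the limit values and the sample page are placeholders, not the program's actual defaults.

# A minimal sketch of the kind of length checks set on the Preferences tab.
# The limits and the sample page below are assumptions, not the program's defaults.
LIMITS = {
    "title": 60,              # characters
    "meta_description": 155,  # characters
    "url": 115,               # characters
}

def check_page(page):
    """Return warnings for every field that exceeds its configured limit."""
    warnings = []
    for field, limit in LIMITS.items():
        value = page.get(field, "")
        if len(value) > limit:
            warnings.append(f"{field} is {len(value)} chars (limit {limit})")
    return warnings

if __name__ == "__main__":
    sample = {  # hypothetical page taken from a crawl
        "url": "https://example.com/category/product-page",
        "title": "A product title written to be deliberately much longer than sixty characters",
        "meta_description": "Short description.",
    }
    for warning in check_page(sample):
        print(warning)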