
Documentation

Detailed description of the Majento SiteAnalyzer program

Purpose of the program

The SiteAnalyzer program is designed to analyze a website and identify technical errors (broken links, duplicate pages, incorrect server responses), as well as errors and omissions in its SEO optimization (empty meta tags, multiple or missing h1 headings on pages, page content analysis, internal linking quality, and a variety of other SEO parameters).

SiteAnalyzer, site auditor

Getting started

When the program is launched, the address bar for entering the URL of the site to be analyzed becomes available (you can enter any page of the site: following the links of the starting page, the crawler will traverse the entire site, including the main page, provided that the links are plain HTML and are not generated by JavaScript).

After clicking the "Start" button, the crawler starts re-passing all the pages of the site via internal links (it does not go to external resources, it also does not go through links performed on Javascript).

Once the robot has crawled all pages of the site, a report becomes available in the form of a table that displays the collected data, grouped into thematic tabs.

All analyzed projects are displayed in the left-hand pane of the program and are automatically saved in the program database together with the collected data. To delete unneeded sites, use the context menu of the project list.

Program settings

The "Settings" section of the main menu is intended for fine-tuning how the program works with external sites and contains four tabs:

SiteAnalyzer, program settings

Main settings

The main settings section is used to specify the user-defined parameters applied when scanning the site.

Description of the parameters:

Other settings relate to excluding content when crawling the site: honoring the "robots.txt" file, "nofollow" links, and the "meta name='robots'" directives in the page code of the site.
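
For reference, these directives typically take the following generic forms (illustrative examples only, not taken from any particular site):

    # robots.txt (placed in the site root)
    User-agent: *
    Disallow: /private/

    <!-- in the page code -->
    <meta name="robots" content="noindex, nofollow">
    <a href="https://example.com/page.html" rel="nofollow">link</a>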

User-Agent

In the User-Agent section, you can specify which user agent the program will present when accessing external sites during scanning. By default, a custom user agent is set; however, if necessary, you can select one of the standard agents most commonly found on the Internet, including the search engine bots YandexBot, GoogleBot and MicrosoftEdge, the Chrome, Firefox and IE8 browsers, as well as mobile devices such as iPhone and Android, and many others.
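
In practice this simply means that HTTP requests to the scanned site carry the selected user-agent string. A minimal Python sketch of the same idea (the URL is a placeholder, and the string shown is the publicly documented GoogleBot user agent, not necessarily the exact string the program sends):

    import urllib.request

    url = "https://example.com/"  # placeholder page being scanned
    req = urllib.request.Request(url, headers={
        # The scanned site sees this string and may answer differently depending on it.
        "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
    })
    print(urllib.request.urlopen(req).status)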

Proxy server

If you need to work through a proxy, this section lets you specify the settings of the proxy server through which the program will access external resources.
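
The effect is equivalent to routing all HTTP requests through the specified proxy, as in this minimal Python sketch (the proxy address and the URL are placeholders):

    import urllib.request

    proxy = urllib.request.ProxyHandler({
        "http":  "http://proxy.example.com:3128",   # placeholder proxy address
        "https": "http://proxy.example.com:3128",
    })
    opener = urllib.request.build_opener(proxy)
    print(opener.open("https://example.com/").status)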

Exceptions

This section allows you to exclude certain pages and sections of the site from crawling.

Using regular expressions, you can specify which sections of the site should not be crawled and therefore should not end up in the program database. This list acts as a local list of exclusions for the duration of the scan (its "global" counterpart is the "robots.txt" file in the site root).
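
For example, an exclusion list of this kind might work as follows (a sketch with made-up patterns; the exact syntax accepted by the program may differ in detail):

    import re

    # Hypothetical exclusion patterns in the spirit of the "Exceptions" list
    exclusions = [
        r"/basket/",   # skip the shopping-cart section
        r"\?sort=",    # skip sorted duplicates of catalogue pages
        r"\.pdf$",     # skip direct links to PDF files
    ]

    def is_excluded(url):
        """Return True if the URL matches any pattern and must not be crawled."""
        return any(re.search(pattern, url) for pattern in exclusions)

    print(is_excluded("https://example.com/basket/item-1"))  # True
    print(is_excluded("https://example.com/catalog/"))       # False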

Working with the program

After the scan is completed, the information in the "Master data" block becomes available to the user. Each tab contains data grouped according to its name (for example, the "Title" tab contains the contents of the <title></title> tag of each page, the "Images" tab lists all images of the site, and so on). Using this data, you can analyze the content of the site and find "broken" links or incorrectly filled meta tags.

SiteAnalyzer, find 404 errors

If necessary (for example, after making changes on the site), individual URLs can be rescanned via the context menu so that the changes are reflected in the program.

Using the same menu, you can display duplicate pages by the corresponding parameters (duplicate title, description, keywords, h1, h2, or page content).
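
Conceptually, finding duplicates amounts to grouping pages by the chosen parameter and keeping the groups with more than one URL; a minimal sketch with made-up data (grouping by title):

    from collections import defaultdict

    # Hypothetical scan results: (URL, title) pairs such as the "Title" tab might contain
    pages = [
        ("https://example.com/",       "Home"),
        ("https://example.com/about",  "About us"),
        ("https://example.com/about2", "About us"),
    ]

    groups = defaultdict(list)
    for url, title in pages:
        groups[title].append(url)

    for title, urls in groups.items():
        if len(urls) > 1:            # a title shared by several URLs is a duplicate
            print(title, urls)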

SiteAnalyzer, context menu

Project list context menu

Sitemap.xml generation

The sitemap is generated from the crawled pages of the site; only pages of the "text/html" format are added to it.

For large sites, starting from 50 000 pages, there is a function for automatically splitting "sitemap.xml" into several files (in this case, the main file contains links to additional files, which in turn contain the direct links to the pages of the site). This is required because of search engine limits on processing large sitemap files.

SiteAnalyzer, sitemap.xml

If necessary, the number of pages per "sitemap.xml" file can be changed from the default value of 50 000 to the desired value in the main settings of the program.
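
The splitting scheme itself is part of the sitemap protocol: each part is an ordinary "urlset" file, and the main file is a "sitemapindex" that only references the parts. The sketch below shows the idea (it is not the program's implementation; the file names, the example domain and the tiny limit are placeholders):

    import math
    from xml.sax.saxutils import escape

    def write_sitemaps(urls, limit=50000):
        """Write sitemap files of at most `limit` URLs each, plus an index if needed."""
        parts = math.ceil(len(urls) / limit) or 1
        for i in range(parts):
            chunk = urls[i * limit:(i + 1) * limit]
            body = "\n".join("  <url><loc>%s</loc></url>" % escape(u) for u in chunk)
            with open("sitemap%d.xml" % (i + 1), "w", encoding="utf-8") as f:
                f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                        + body + "\n</urlset>\n")
        if parts > 1:
            # The main file contains only links to the additional files.
            refs = "\n".join(
                "  <sitemap><loc>https://example.com/sitemap%d.xml</loc></sitemap>" % (i + 1)
                for i in range(parts))
            with open("sitemap.xml", "w", encoding="utf-8") as f:
                f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                        + refs + "\n</sitemapindex>\n")

    write_sitemaps(["https://example.com/page%d.html" % n for n in range(1, 101)], limit=50)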

Data export

For more flexible analysis of the collected data, it can be exported to CSV format (which can be opened in Excel).

Data is exported from the currently active tab of the "Master data" block of the selected project.
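
The exported file can then be processed with any tool that reads CSV; for example, in Python (the file name is a placeholder, and the exact columns depend on which tab was active when exporting):

    import csv

    with open("export.csv", newline="", encoding="utf-8") as f:
        rows = list(csv.reader(f))

    header, data = rows[0], rows[1:]
    print(header)
    print(len(data), "rows exported")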

Multilanguage support

The program lets you choose the preferred language to work in.

The main supported languages are English, German, Italian, Spanish, French, Russian, and others. At the moment, the program has been translated into more than fifteen of the most popular languages.

SiteAnalyzer, multilanguage support

If you want to translate the program into your own language, it is enough to translate any "*.lng" file into the language of interest and send the translated file to "support@site-analyzer.pro" (comments in the email should be written in Russian or English); your translation will then be included in the next release of the program.

More detailed instructions on translating the program can be found in the distribution (file "lcids.txt").

P.S. If you have any comments on the quality of the translation, send them, along with corrections, to "support@site-analyzer.pro".

Additional

The main menu item "Compress Database" performs a database packing operation: it cleans the database of previously deleted projects and reorders the data (analogous to defragmenting a disk on a personal computer).
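
Purely as an analogy (the program's actual storage format is not documented here), this is similar to what the VACUUM command does for an SQLite database: deleting rows only marks space as free, and packing rebuilds the file so that the space is actually reclaimed:

    import sqlite3

    # Toy database used only to illustrate the idea of packing after deletions
    conn = sqlite3.connect("example.db")
    conn.execute("CREATE TABLE IF NOT EXISTS pages (project_id INTEGER, url TEXT)")
    conn.executemany("INSERT INTO pages VALUES (1, ?)",
                     [("page%d" % n,) for n in range(10000)])
    conn.commit()

    conn.execute("DELETE FROM pages WHERE project_id = 1")  # a "deleted project"
    conn.commit()
    conn.execute("VACUUM")  # rebuild the file so the freed space is reclaimed
    conn.close()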

This procedure is useful when, for example, a large project containing many records has been deleted from the program. In general, it is recommended to compress the database periodically to get rid of redundant data and reduce its size.

The answers to the remaining questions can be found in the FAQ section.
