5 main points you need to pay attention to
As already mentioned in earlier articles, the foundation of a good site is its technical side. Competent technical optimization allows search engine robots to index the site quickly and correctly: the site returns no errors or failures, each page corresponds to exactly one address, and so on.
Let us examine the 5 main points you need to pay attention to in the technical optimization of a website.
1. The robots.txt file
A text file named robots.txt must be present in the root directory of every site. It is the first file a robot requests when it visits the site, and it stores instructions for robots.
In this file you can specify pages that should not be indexed – it is advisable to close system pages and pages with duplicate content from indexing. In the same file you should specify the main mirror of the site, give a link to the site map, and indicate how often robots may download information from the site. With the help of this file you can pass robots other information important for proper indexing of the website.
A video tutorial on how to create a robots.txt file can be viewed at this link.
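As an illustration of the directives described above, here is a minimal robots.txt sketch. The directory names are hypothetical; Host is a Yandex-specific directive for declaring the main mirror, and Crawl-delay asks robots to pause between downloads.

```
# Rules for all robots: keep system and duplicate pages out of the index
User-agent: *
Disallow: /admin/
Disallow: /search/

# Yandex additionally honours the Host directive (main mirror)
User-agent: Yandex
Disallow: /admin/
Disallow: /search/
Host: site.ru
Crawl-delay: 5

# Link to the site map
Sitemap: http://site.ru/sitemap.xml
```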
2. The addresses of the site's pages
Check that the addresses of all pages of the website are uniform – they must follow the same pattern:
whether or not you use trailing slashes, be consistent everywhere.
For example, site.ru/katalog/ and site.ru/cost/ – correct;
http://site.ru/katalog and site.ru/cost/ – wrong.
addresses should be entirely in Latin characters – do not mix Russian and Latin words or letters in a single address. This error is especially common on sites in the .рф zone.
For example, site.ru/katalog/sweetshots/ is correct.
It is better if the addresses of internal pages avoid the characters "?", "&", "=", "@" and others, because they may not be processed correctly.
It will be convenient to check the addresses of internal pages for errors after you create a site map.
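The address rules above can be checked automatically. The following is a minimal sketch of such a check; the URL list and the exact set of allowed characters are assumptions you would adapt to your own site.

```python
# Sketch: check page addresses against the rules from this article.
# The `urls` list is hypothetical; in practice you would take it from a site map.
import re

FORBIDDEN = set("?&=@")  # characters that may not be processed correctly

def url_problems(url):
    """Return a list of rule violations for a single page address."""
    problems = []
    path = url.split("://", 1)[-1]
    # Rule: only Latin letters, digits and a few safe symbols in an address
    if not re.fullmatch(r"[A-Za-z0-9./_-]*", path):
        problems.append("non-Latin or unexpected characters")
    # Rule: avoid "?", "&", "=", "@"
    if any(ch in FORBIDDEN for ch in url):
        problems.append("contains ?, &, = or @")
    # Rule: uniform trailing slashes (assuming the site uses them)
    if not url.endswith("/"):
        problems.append("missing trailing slash")
    return problems

urls = ["site.ru/katalog/", "site.ru/cost", "site.ru/каталог/"]
for u in urls:
    print(u, url_problems(u))
```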
3. Sitemap (site map)
A site map is needed so that users can easily find information on the site, and so that search engines index it better.
With the help of a site map, search engine robots can see the entire structure of the site and quickly index new pages.
Users, in turn, navigate the site more easily and find the information they need faster.
With the help of a Sitemap you can tell Yandex which pages of your website to index, how often you update information on the site, and which pages are most important.
To create a Sitemap you can use various services, for example http://www.xml-sitemaps.com/
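For reference, an XML Sitemap that conveys update frequency and page importance looks roughly like the fragment below (the URL and values are placeholders, following the sitemaps.org protocol):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://site.ru/katalog/</loc>
    <lastmod>2013-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```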
4. Site errors
When a user opens a page in the browser (or a robot starts scanning it), the server hosting the site returns an HTTP status code in response to the request, i.e. it provides information about the site and the requested page.
200 – the page is fine,
404 – the page does not exist,
503 – the server is temporarily unavailable.
Sometimes the status code returned is incorrect. For example, a working page returns 404, or vice versa, a non-existent page returns 200. You need to track this and configure the correct status codes in the .htaccess file.
It is especially important to configure the 404 error correctly. If a page exists but the server reports a 404 error when it is requested, the page will not be indexed by search engines.
To check status codes you can use Yandex.Webmaster or the Firebug extension for Mozilla Firefox.
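Status codes can also be checked with a small script. This is a sketch using only the Python standard library; the example URL is a placeholder, and the descriptions simply mirror the list above.

```python
# Sketch: fetch and interpret the HTTP status code of a page.
from urllib.request import Request, urlopen
from urllib.error import HTTPError

DESCRIPTIONS = {
    200: "OK - the page is fine",
    404: "Not Found - non-existent page",
    503: "Service Unavailable - server temporarily down",
}

def describe_status(code):
    """Map a status code to the short description used in this article."""
    return DESCRIPTIONS.get(code, "other status: %d" % code)

def fetch_status(url):
    """Return the HTTP status code the server gives for `url`."""
    try:
        # A HEAD request asks for headers only, not the page body
        with urlopen(Request(url, method="HEAD")) as resp:
            return resp.status
    except HTTPError as err:  # 4xx/5xx responses also carry a status code
        return err.code

# Example (requires network access; site.ru is a placeholder):
# print(fetch_status("http://site.ru/"), describe_status(200))
```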
5. A properly configured redirect
An error occurring on almost every website is a missing 301 redirect from the domain with www in the URL to the domain without www, or vice versa.
Decide which address will be the main one for your website and inform the search engines with the help of the .htaccess file. Since search engines may treat www.site.ru and site.ru as different sites, duplicates can appear in the search results, which will create difficulties with ranking in the SERP.
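Such a redirect can be sketched in .htaccess as follows, assuming an Apache server with mod_rewrite enabled and site.ru (without www) chosen as the main address:

```
# Permanently (301) redirect www.site.ru to site.ru
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.site\.ru$ [NC]
RewriteRule ^(.*)$ http://site.ru/$1 [R=301,L]
```

To redirect in the opposite direction (to the www version), swap the host in the condition and the target in the rule.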
These are only the basic points that you primarily need to pay attention to.