Google SandBox
In early 2004, a new and mysterious concept appeared in the SEO community: the Google Sandbox. The name was given to a new Google spam filter aimed at keeping newly created sites out of the search results.
The Sandbox filter manifests itself in that newly created sites are absent from the search results for practically all phrases. This happens despite the presence of high-quality, unique content and correctly conducted promotion (without the use of spam methods).
At present the Sandbox affects only the English-language segment; sites in Russian and other languages are not subject to this filter. However, it is quite possible that the filter will expand its reach.
One may assume that the purpose of the Sandbox filter is to exclude spam sites from the results; indeed, no search spammer can afford to wait months for results to appear. Along with them, however, a large number of normal, newly created sites suffer.
There is still no exact information about what the Sandbox filter actually is. There are a number of assumptions based on practical experience, which we present below:
- The Sandbox is a filter on young sites. A newly created site is placed in the "sandbox" and stays there for an indefinite time, until the search engine moves it into the "regular" category;
- The Sandbox is a filter on new links pointing to newly created sites. Note the fundamental difference from the previous assumption: the filter is applied not to the age of the site but to the age of the links to the site. In other words, Google has no complaints about the site itself, but it refuses to take inbound links into account if less than X months have passed since they appeared. Since inbound links are one of the main ranking factors, ignoring them is equivalent to the site being absent from the search results. It is hard to say which of these two assumptions is correct; quite possibly both are true;
- A site can stay in the sandbox from 3 months to a year or more. There is also an observation that sites leave the sandbox en masse, i.e. the sandbox term is defined not individually for each site but for large groups of sites (sites created within a certain time range fall into one group). The filter is then lifted for the whole group at once, so sites from the same group end up spending different amounts of time in the "sand".
Typical signs that your site is in the sandbox:
- Your site is normally indexed by Google and regularly visited by the search robot;
- Your site has PageRank; the search engine knows about and correctly displays inbound links to your site;
- A search for the site address (www.site.com) returns correct results, with the correct title, snippet (resource description), etc.;
- Your site is found normally by rare and unique word combinations contained in the text of its pages;
- Your site is not visible in the first thousand results for any other queries, even the ones it was originally created for. Sometimes there are exceptions, and the site appears around position 500-600 for some queries, which of course does not change the essence of the matter.
There are practically no methods of bypassing the filter. There are a number of assumptions about how it might be done, but they are no more than assumptions, and of little use to the ordinary webmaster. The basic method is to keep working on the site and wait for the filter to end.
When the filter is lifted, rankings rise sharply, by 400-500 positions or more.
Google LocalRank
On February 25, 2003, Google patented a new page ranking algorithm named LocalRank. It is based on the idea of ranking pages not by their global link citation but by their citation among the group of pages thematically related to the query.
The LocalRank algorithm is not used in practice (at least not in the form in which it is described in the patent); however, the patent contains a number of interesting ideas that, in our view, every SEO specialist should be familiar with. Taking the topics of linking pages into account is used by almost all search engines, though probably via somewhat different algorithms, and studying the patent gives a general idea of how it can be implemented.
While reading this chapter, keep in mind that it presents theoretical information rather than a practical guide to action.
The basic idea of the LocalRank algorithm is expressed by the following three points:
1. Using some algorithm, a certain number of documents relevant to the search query is selected (we will denote this number N). These documents are initially sorted according to some criterion (it could be PageRank, a relevance estimate, any other criterion, or a combination of them). We will denote the numerical expression of this criterion OldScore.
2. Each of the N pages goes through a new ranking procedure, as a result of which it receives a new rank. We will denote it LocalScore.
3. At this step the OldScore and LocalScore values are multiplied, producing a new value, NewScore, according to which the final ranking of the pages is performed.
The key part of this algorithm is the new ranking procedure that assigns each page its new LocalScore rank. We will describe this procedure in more detail.
0. Using some ranking algorithm, the N pages matching the search query are selected. The new ranking algorithm will work only with these N pages. Each page in this group has some rank, OldScore.
1. When LocalScore is calculated for a given page, all pages from N that link to the given page are selected. We will denote this set of pages M. Pages from the same host (the filtering is done by IP address) and pages that are mirrors of the given page do not get into the set M.
2. The set M is broken into subsets Li. Pages are grouped into these subsets according to the following criteria:
- Belonging to the same (or similar) host. Pages whose IP addresses share the same first three octets, i.e. pages whose IP addresses fall within the range xxx.xxx.xxx.0 – xxx.xxx.xxx.255, are considered to belong to one group;
- Pages with identical or similar content (mirrors);
- Pages of the same site (domain).
3. Each page in each set Li has some rank (OldScore). From each set, the one page with the highest OldScore is selected; the rest are excluded from consideration. Thus, we obtain a set K of pages linking to the given page.
4. The pages in set K are sorted by OldScore, after which only the first k pages remain in K (k is some preset number); the other pages are excluded from consideration.
5. At this step LocalScore is calculated: the OldScore values of the remaining k pages are summed. This can be expressed by the following formula:
LocalScore(i) = OldScore(1)^m + OldScore(2)^m + ... + OldScore(k)^m
Here m is some preset parameter, which can vary from 1 to 3 (unfortunately, the information contained in the patent does not give a detailed description of this parameter).
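The steps above can be put into code. The following is a minimal Python sketch of the LocalScore procedure, not the patented implementation itself: the page representation (dictionaries with url, ip, old_score and a set of outgoing link targets) and the default values of k and m are illustrative assumptions, and the mirror check from step 1 is omitted for brevity.

```python
from collections import defaultdict

def c_class(ip):
    """First three octets of an IP address (the 'same or similar host' test)."""
    return ip.rsplit(".", 1)[0]

def local_score(target, pages, k=10, m=2):
    """LocalScore of `target` among the N candidate pages (a sketch).

    Each page is a dict: {'url', 'ip', 'old_score', 'links_to' (set of URLs)}.
    """
    # Step 1: pages from N that link to the target, excluding pages hosted
    # on the same C-class network as the target (mirror filtering omitted).
    m_set = [p for p in pages
             if target["url"] in p["links_to"]
             and c_class(p["ip"]) != c_class(target["ip"])]

    # Step 2: split M into subsets Li by host (same first three IP octets).
    subsets = defaultdict(list)
    for p in m_set:
        subsets[c_class(p["ip"])].append(p)

    # Step 3: keep only the page with the highest OldScore from each subset.
    best = [max(group, key=lambda p: p["old_score"]) for group in subsets.values()]

    # Step 4: sort by OldScore and keep only the first k pages.
    best = sorted(best, key=lambda p: p["old_score"], reverse=True)[:k]

    # Step 5: sum OldScore^m over the remaining pages.
    return sum(p["old_score"] ** m for p in best)
```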
After LocalScore has been calculated for every page in the set N, the NewScore values are calculated and the pages are re-sorted according to the new criterion. NewScore is calculated using the following formula:
NewScore(i) = (a + LocalScore(i)/MaxLS) * (b + OldScore(i)/MaxOS)
i – the page for which the new rank value is being calculated;
a and b – some constants (the patent gives no more detailed information about these parameters);
MaxLS – the maximum of the calculated LocalScore values;
MaxOS – the maximum of the OldScore values.
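Continuing the sketch above, the final combination and re-sorting might look like this; the values of a and b are arbitrary placeholders, since the patent does not disclose them.

```python
def rerank(pages, a=1.0, b=1.0, k=10, m=2):
    """Re-sort the N pages by NewScore = (a + LS/MaxLS) * (b + OS/MaxOS)."""
    ls = {p["url"]: local_score(p, pages, k, m) for p in pages}
    max_ls = max(ls.values()) or 1.0             # guard against division by zero
    max_os = max(p["old_score"] for p in pages) or 1.0

    def new_score(p):
        return (a + ls[p["url"]] / max_ls) * (b + p["old_score"] / max_os)

    return sorted(pages, key=new_score, reverse=True)
```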
Now let us step back from the mathematics and restate all of the above in plain language.
At the first stage, a certain number of pages matching the query is selected. This is done using algorithms that do not take link topics into account (for example, relevance and overall link popularity).
Once the group of pages has been defined, the local link popularity of each page is calculated. All these pages are in one way or another related to the topic of the search query and therefore have roughly similar topics. By analysing the links between the pages of the selected group (ignoring all other pages on the Internet), we obtain the local (thematic) link popularity.
After this step we have OldScore (a rating of the page based on relevance, overall link popularity and other factors) and LocalScore (a rating of the page among thematically related pages). The final rating and ranking of the pages is based on a combination of these two factors.
Specific features of various search engines
Everything said above about text optimisation and increasing link popularity applies to all search engines equally. Google is described in more detail because more information about this search engine is readily available, but the ideas stated with respect to Google apply to a large degree to other search engines as well.
In general, I am not a supporter of looking for "secret knowledge" of how the algorithms of various search engines work in detail. They all follow common rules to one degree or another, and competent work on a site (without any special tricks) leads to good positions in almost all search engines.
Nevertheless, here are some specific features of various search engines:
Google – very fast indexing; great importance is given to inbound links. The Google database is used by a large number of other search engines and portals.
MSN – a greater emphasis on the information content of a site than in other search engines.
Yandex – the largest Russian search engine. It processes (according to various estimates) from 60% to 80% of all Russian-language search queries. It pays particular attention to thematic links (non-thematic inbound links also have an effect, but to a lesser degree than in other search engines). Indexing is slower than in Google, but still within acceptable time limits. It lowers in the rankings, or excludes from its index, sites that engage in non-thematic link exchange (link catalogues with off-topic links created only to raise a site's rating), as well as sites participating in automatic link exchange systems. During database updates, which last several days, the Yandex results change constantly; during such periods it is advisable to refrain from any work on the site and wait for the search engine's results to stabilise.
One more feature of Yandex is that search results depend on the case of the keywords (i.e. "Word" and "word" give different search results).
Rambler – the most mysterious search engine. It occupies second place (according to other data, third place after Google) in popularity among Russian users. According to available observations, it lowers in the rankings sites that engage in aggressive promotion (a fast increase in the number of inbound links). It values the presence of search terms in the plain text of the page (without highlighting with various style tags).
Mail.ru – a search engine that is gaining popularity. It uses the results of the Google search engine after some additional processing. Optimisation for Mail.ru comes down to optimisation for Google.
Tips, assumptions, observations
This chapter presents information gathered from the analysis of various articles, conversations with SEO specialists, practical observations, and so on. This information is neither exact nor reliable; it is only assumptions and ideas, though interesting ones. Treat the data presented in this section not as exact guidance but as food for thought.
- Outbound links. Link to authoritative resources in your field using the appropriate keywords. Search engines value links to other resources on the same topic;
- Outbound links. Do not link to FFA (free-for-all) sites or other sites excluded from a search engine's index. This can lead to a drop in your own site's rating;
- Outbound links. A page should not contain more than 50-100 outbound links. Exceeding this number does not cause the page to drop in the rankings, but the links above this limit will not be taken into account by the search engine;
- External site-wide links, that is, links placed on every page of a site. It is believed that search engines view such links negatively and do not count them in ranking. There is also another opinion that this applies only to large sites with thousands of pages;
- The ideal keyword density. This question comes up very often. The answer is that there is no ideal keyword density; it is different for each query, i.e. it is calculated by the search engine dynamically, depending on the search term. Our advice is to analyse the top sites in the search results, which allows you to estimate the situation approximately (see the sketch after this list);
- Site age. Search engines prefer old sites as more stable;
- Site updates. Search engines prefer developing sites, i.e. sites to which new information and new pages are periodically added;
- Domain zone (applies to Western search engines). Preference is given to sites located in the .edu, .mil, .gov, etc. zones. Only the corresponding organisations can register such domains, so these sites enjoy more trust;
- Search engines track what percentage of visitors return to the search results after visiting a particular site. A high percentage of returns means off-topic content, and such a page is lowered in the search results;
- Search engines track how often a particular link in the search results is clicked. If a link is clicked rarely, the page is of no interest, and such a page is lowered in the rankings;
- Use synonyms and related forms of keywords; search engines will value this;
- Too fast a growth in the number of inbound links is perceived by search engines as artificial promotion and leads to a drop in ranking. This is a very disputable statement, primarily because such a method could be used to lower competitors' ratings;
- Google does not count inbound links if they come from the same (or a similar) host, that is, from pages whose IP addresses fall within the range xxx.xxx.xxx.0 – xxx.xxx.xxx.255. This opinion most likely stems from the fact that Google stated the idea in its patents. However, Google employees state that no restrictions by IP address are imposed on inbound links, and there is no reason not to believe them;
- Search engines check the information about the domain owner. Accordingly, links from sites belonging to the same owner carry less weight than ordinary links. This information is presented in the patent;
- The term for which the domain is registered. The longer the term, the more preference is given to the site.
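Regarding the keyword density point above, it is usually more informative to measure the density on pages that already rank well for your query than to look for a universal figure. A rough calculation might look like the following sketch; the tokenisation is deliberately simplistic and the file name is only an example.

```python
import re

def keyword_density(text, keyword):
    """Share of words in `text` equal to `keyword`, as a percentage."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

# Example: estimate the density of a term in a saved copy of a top-ranking page.
with open("competitor_page.txt", encoding="utf-8") as f:
    print(round(keyword_density(f.read(), "hosting"), 2))
```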
Creating the right content
Content (the information filling of a site) plays a major role in site promotion. There are many reasons for this, which we will discuss in this chapter; we will also give advice on how to fill a site with information correctly.
- Unique content. Search engines value new information that has not been published anywhere before. Therefore, when building a site, rely on your own texts. A site built on other people's materials has a much smaller chance of reaching the top of the search results. As a rule, the original source always ranks higher;
- When creating a site, do not forget that it is made first of all for visitors, not for search engines. Bringing a visitor to the site is only the first and not the most difficult step. Keeping the visitor on the site and turning them into a buyer is the really challenging task. This can be achieved only by competently filling the site with information that is interesting to people;
- Try to update the information on the site regularly and add new pages. Search engines value developing sites. Besides, more text means more visitors to the site. Write articles on the topic of your site, publish visitors' feedback, create a forum for discussing your project (the latter only if site traffic allows an active forum to form). Interesting content is the key to attracting interested visitors;
- A site created for people rather than for search engines has every chance of getting into important directories such as DMOZ, Yandex and others;
- An interesting thematic site has a much better chance of receiving links, mentions and reviews from other thematic sites. Such reviews can in themselves bring a good flow of visitors; in addition, inbound links from thematic resources will be valued by search engines.
In conclusion, one more piece of advice. As they say, the shoemaker should make boots, and texts should be written by a journalist or a technical writer. If you manage to create fascinating materials for your site, that is very good. However, most of us have no special talent for writing attractive texts. In that case it is better to entrust this part of the work to professionals. It is a more expensive option, but in the long run it will pay for itself.
Choosing a domain and hosting
Nowadays anyone can create a page on the Internet, and no expenses are required for it. There are companies offering free hosting that will host your page in exchange for the right to display their advertising on it. Many Internet providers will also give you space on their server if you are their client. However, all these options have very significant drawbacks, so when creating a commercial project you should treat these questions more responsibly.
First of all, you should buy your own domain. It offers you the following advantages:
- A project that has no domain of its own is perceived as something ephemeral. Indeed, why should we trust a resource whose owners are not ready to spend even a symbolic sum on creating a minimal image? Placing free materials on such resources is possible, but an attempt to create a commercial project without its own domain is almost always doomed to failure;
- Your own domain gives you freedom in choosing a hosting provider. If the current company no longer suits you, you can move your site to another, more convenient or faster platform at any moment.
When choosing a domain, keep the following points in mind:
- Try to choose a domain name that is easy to remember and whose pronunciation and spelling are unambiguous;
- For promoting international English-language projects, a .com domain is the most suitable. Domains from the .net, .org, .biz, etc. zones can also be used, but this option is less preferable;
- For promoting national projects, you should always take a domain in the corresponding national zone (.ru for Russian-language projects, .de for German ones, etc.);
- In the case of bilingual (or multilingual) sites, you should allocate a separate domain for each language. National search engines value this approach more than subsections in various languages on the main site.
A domain costs (depending on the registrar and the zone) $10-20 per year.
When choosing a hosting provider, consider the following factors:
- Access speed;
- Server availability (uptime);
- Traffic cost per gigabyte and the amount of prepaid traffic;
- It is desirable that the servers are located in the same geographical region as the majority of your visitors.
Hosting for small projects costs around $5-10 per month.
When choosing a domain and hosting, avoid "free" offers. Hosting companies often offer free domains to their clients. As a rule, in this case the domains are registered not to you but to the company, i.e. the actual owner of the domain is your hosting provider. As a result, you will not be able to change the hosting for your project, or you will be forced to buy back your own, already promoted domain. In most cases it is also worth following the rule of not registering domains through a hosting company, as this can complicate a possible transfer of the site to another host (even though you are the full owner of the domain).
Changing the site address
Sometimes, for various reasons, it becomes necessary to change a project's address. Some resources that start out on free hosting with a free address grow into full-fledged commercial projects and require a move to their own domain. In other cases a better name for the project is found. In any such situation the question arises of how to transfer the site to the new address correctly.
Our advice here is this: create a new site with new, unique content at the new address. On the old site, place visible links to the new resource so that visitors can move to your new site, but do not remove the old site and its content entirely.
With this approach you can receive search visitors to both the new and the old resource. At the same time you get the opportunity to cover additional topics and keywords, which would be difficult to do within a single resource.
Transferring a project to a new address is a difficult and not very pleasant task (since in any case the promotion of the new address has to start practically from scratch); however, if the transfer is necessary, you should extract the maximum benefit from it.