Search Engine Optimization (SEO) denotes measures that help web pages appear in higher positions in a search engine's organic, unpaid search results (the so-called natural listings).
Search engine optimization is a branch of search engine marketing.
How SEO Works
The crawling and parsing of web page content follows the known HTML standards of the Web, which is why compliance with those standards when creating a website is the first step of optimization. According to statements by Google, however, the validity of an HTML page is not a ranking factor. Nevertheless, pages that are highly HTML-compliant tend to load faster in the browser, which search engines reward.
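To illustrate the idea of HTML compliance, the following is a minimal sketch of a tag-balance check using Python's standard library. It is a rough illustration only, not a substitute for a real validator such as the W3C service, and the set of void elements is simplified.

```python
from html.parser import HTMLParser

# Void elements that never take a closing tag in HTML5 (simplified list).
VOID = {"area", "base", "br", "col", "embed", "hr", "img", "input",
        "link", "meta", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    """Rough check that every opened tag is closed again, in order."""
    def __init__(self):
        super().__init__()
        self.stack = []    # currently open tags
        self.errors = []   # mismatched closing tags

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors.append(f"unexpected </{tag}>")

def check(html_text):
    """True if all non-void tags are properly nested and closed."""
    checker = TagBalanceChecker()
    checker.feed(html_text)
    return not checker.errors and not checker.stack

print(check("<p>valid <b>markup</b></p>"))   # True
print(check("<p>missing close <b>tag</p>"))  # False
```

A full validator checks far more (attributes, nesting rules, character encoding), but even a check like this catches the unclosed tags that slow down or confuse parsers.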
Within online marketing, SEO is roughly divided into two areas: on-page and off-page optimization. The categorization rests on whether one works on the site itself or exerts influence via other websites.
On-page optimization includes all content-related adjustments of the website itself. These include optimizing the page content with respect to quality, formatting, and headings, but also technical aspects such as the header and tags, as well as the internal link structure of the page. On-page optimization usually precedes any off-page optimization. The meta tag "keywords", once designed specifically for search engines, is no longer taken into account by Google. The same applies to the meta description as a ranking factor; depending on the query, however, it may be displayed as a text excerpt in the SERPs and therefore should not be ignored.
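The header elements mentioned above can be sketched as follows: a small Python helper that builds a minimal HTML head with a title and a meta description. The truncation limit of roughly 155 characters is a common rule of thumb for snippet length, not an official specification, and the function name is purely illustrative.

```python
import html

def meta_head(title, description, max_desc=155):
    """Build a minimal <head> fragment with title and meta description.

    The ~155-character limit is a rule of thumb for how much of the
    description search engines typically show as a snippet.
    """
    if len(description) > max_desc:
        description = description[:max_desc - 1].rstrip() + "…"
    return (
        "<head>\n"
        f"  <title>{html.escape(title)}</title>\n"
        f'  <meta name="description" content="{html.escape(description, quote=True)}">\n'
        "</head>"
    )

print(meta_head("Example Page", "A short description of the page."))
```

Even though the description does not influence the ranking directly, a well-written snippet can improve the click-through rate from the results page.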
A further step is the selection of appropriate search terms (keywords). These can be obtained from freely available keyword databases. Furthermore, the Google AdWords Keyword Tool lists related keywords along with the approximate number of monthly searches per keyword.
Usually a page is optimized for one to three keywords. A comprehensive site is often split into several pages so that each can be optimized for different terms. Primary and secondary keywords are defined for the respective pages and combined with matching content. One form of optimization works through so-called landing pages: by clicking a link, the user arrives at a page that has been specially optimized for SEO purposes.
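A simple measure used when matching keywords to page content is keyword density, the share of all words on a page taken up by the keyword. The following is a minimal sketch for single-word keywords; real analyses also handle phrases, stemming, and stop words.

```python
import re
from collections import Counter

def keyword_density(text, keyword):
    """Fraction of all words in `text` that equal `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

text = ("SEO basics: good SEO content mentions the topic, "
        "but good content comes first.")
print(round(keyword_density(text, "SEO"), 3))  # 0.154
```

There is no universally "correct" density; stuffing a page with keywords is nowadays more likely to be penalized than rewarded.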
A distinct style of writing has developed in this area, which can be described as a search-engine-optimized style. It follows the "rules" of SEO, which are to some extent determined by the ranking mechanisms of the search engines. These rules are implicit, since they are reconstructed from the success factors of past optimizations: search engine optimizers usually do not know the criteria by which indexed pages are qualitatively classified.
Search engine optimization techniques take into account the crawlers and sorting algorithms of the search engines under consideration. These are disclosed only in part and are changed frequently, both to make abuse harder and to deliver relevant results to users. The unknown and undisclosed parts are investigated by reverse engineering the search results: one analyzes how search engines index websites and their content, and by which criteria these are evaluated and ranked.
The requirements that search engine optimization places on pages can thereby completely contradict the rules of classical text production. For example, grammatical rules hardly matter to the search algorithms; a frequently misspelled keyword can thus contribute more to the ranking than the technically correct term. However, this approach is in decline, because Google and other search engines increasingly identify misspellings and map them to the intended term on their own.
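One basic building block for recognizing misspellings is edit distance, which measures how many single-character changes separate two words. The sketch below shows the classic Levenshtein dynamic-programming algorithm; actual search engine spell correction is far more sophisticated, drawing on query logs and statistical models, but the underlying idea of "closeness" is the same.

```python
def levenshtein(a, b):
    """Minimum number of insertions, deletions, and substitutions
    needed to turn string `a` into string `b` (classic DP, O(len(a)*len(b)))."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

print(levenshtein("optimisation", "optimization"))  # 1
print(levenshtein("kitten", "sitting"))             # 3
```

A distance of 1 or 2 between a query and a known term is a strong hint that the query is a spelling variant rather than a different word.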
Since these mechanisms are subject to constant development, this style of writing is adapted just as often in order to keep delivering the best possible optimization result. A page is therefore never optimized just once; rather, it requires a long-term review of the relevance of the keywords used, since user behavior changes as well.
Off-page optimization includes all activities that take place outside the site to be optimized.
It is no longer enough to increase the relevance of web pages alone. A good listing and good search engine visibility are also driven by the quantity and quality of incoming links to a website, known as backlinks. Off-page optimization is about building a link structure with other websites in order to position oneself better in specific thematic areas. For example, one looks for thematically suitable websites in order to win them over for a link partnership. The use of so-called link research tools can be worthwhile for finding the best optimization opportunities for a domain. In addition, the wording of the link text (anchor text) of incoming references is essential for the placement of certain search terms. To assess sites for their potential, specific web applications can be used. Often small changes suffice to greatly increase the ranking in search engines.
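The weighting of incoming links by quantity and quality can be illustrated with a toy version of the original PageRank power iteration. This is a simplified sketch on a hypothetical four-page link graph; production ranking combines many additional signals beyond the link structure.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank by power iteration.

    `links` maps each page to the list of pages it links to.
    Returns a dict of page -> rank; ranks sum to 1.
    """
    pages = set(links) | {t for targets in links.values() for t in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new[t] += share
        # Pages without outgoing links spread their rank evenly (simplification).
        dangling = sum(rank[p] for p in pages if not links.get(p))
        for p in pages:
            new[p] += damping * dangling / n
        rank = new
    return rank

# Hypothetical graph: "c" receives links from both "b" and "d".
graph = {"a": ["b"], "b": ["c"], "c": ["a"], "d": ["c"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # c
```

The page with the most (and best-connected) incoming links ends up with the highest score, which is exactly the effect that backlink building targets.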
A newer development in off-page optimization is, alongside the generation of backlinks, the simultaneous generation of so-called "social signals": mentions of the website in the form of "likes", "shares", and "comments" (Facebook) or "+1" (Google+). The background to this development is that search engines no longer analyze backlinks alone, but also check algorithmically how intensively users share a web page within selected social networks such as Facebook or Google+.
Academic Search Engine Optimization
Search engine optimization is applied not only to web pages, but also to academic PDF files for academic search engines such as Google Scholar and CiteSeer. The basic principle of Academic Search Engine Optimization (ASEO) is the same as in traditional web search engine optimization: the PDF should have a high keyword density and, instead of (hyper)links, contain citations. The field of academic search engine optimization is still young and is currently being debated intensely in the academic community. Some consider it immoral to tailor scientific articles to academic search engines; others consider it necessary so that search engines can index and weight the contents of PDFs better and more "fairly".
Methods that bring irrelevant web pages to top positions of the search engine results pages are referred to as spamdexing. They violate the rules that search engines have set up to protect their search results against manipulation. For instance, it is possible to set up automatic redirects on text pages created specifically for search engines. This technique, known as doorway pages, contradicts the guidelines of most search engines. Cases uncovered by the search engine operators usually lead to the pages in question being excluded from the search index.
Ethical search engine optimization (white hat SEO) avoids spamming. It renounces undesirable practices such as the use of doorway pages or link farms and follows the guidelines of the individual search engines. This avoids the risk of exclusion from, or downgrading in, the search result pages. Optimization methods that are unwanted by the search engines, in contrast to ethical "white hat" optimization, are called "black hat" optimization.
Since the end of 2000, the concept of the link pyramid has appeared in the field of search engine optimization. A link pyramid tries to simulate a natural backlink structure. To this end, the backlink profiles of several successful sites were analyzed and reconstructed. The so-called foundation consists of many simple links, and their number decreases as the quality of the links increases; hence the image of a pyramid, from which the term was derived. Seen bottom-up, a link pyramid therefore uses fewer and fewer links per tier, which reflects the fact that high-quality links become increasingly difficult to obtain.
Pages with a strongly graphics-oriented design built from movies, images, and embedded text, for example programmed in Flash, offer search engines hardly any evaluable text. Since 2008, Adobe has provided Google and Yahoo with technology that lets them access previously unexploitable content in Flash files. In this way, Googlebot can index almost all of the text that is displayed to users when they interact with SWF files on a site. From these texts, Googlebot can generate snippets or match queried terms in Google search. In addition, Googlebot can discover URLs in SWF files and, for example, follow links to other pages of the website. However, neither texts loaded dynamically in this way, nor the link texts of the various sub-pages within Flash, are recognized. Programming a website entirely in Flash is therefore not advisable from the perspective of search engine optimization.
Translated from the German Wikipedia.