Link Building | Transalpine Internet Services

Transalpine has a department specialised in the highly technical field of website content optimisation. We explain here how Content Engineering can improve the search engine performance of your website.


Link Building

Link building is one of the two essential aspects of SEO success. In combination with great content that is well organised and easily indexed by the search engines, a well-developed set of links to the site from quality sources will secure strong rankings in the SERPs (search engine results pages).

Backlinks are given high priority in a search engine's algorithm as a measure of the authority and popularity of a site. Without links, a site will not be found easily; it is like an island off the main shipping routes.

Good Link Development Strategies

Planning a site should include SEO in its early stages. Part of that planning is creating a structure and content which will encourage the continuous development of links. It should be borne in mind that a few high-quality links are worth more than hundreds of low-quality links from spurious sources, which dilute the overall impression and value of the site. For example, links that point to missing pages harm the SEO value, and sudden, unsustained growth in links from blogs may suggest a sensationalist and therefore unreliable resource.
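Broken link targets, at least, are easy to detect mechanically. The following is a minimal sketch in Python using the widely available requests library; the list of URLs is a hypothetical placeholder and would normally come from a crawl of the site or a backlink report.

```python
# Minimal broken-link check: flags link targets that no longer resolve.
# Assumes the 'requests' library is installed; the URL list is illustrative.
import requests

urls_to_check = [
    "https://www.example.com/good-page",
    "https://www.example.com/moved-or-deleted-page",
]

def is_broken(url: str) -> bool:
    """Return True if the URL cannot be fetched or answers with a client/server error."""
    try:
        response = requests.head(url, allow_redirects=True, timeout=10)
        return response.status_code >= 400
    except requests.RequestException:
        return True  # network failure, DNS error, timeout, etc.

for url in urls_to_check:
    if is_broken(url):
        print(f"Broken link target: {url}")
```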

Nothing drives a link campaign forward better than great resources on the site which people like and want to pass on to their community. This content must offer something different from that of longer-established competitors. It could be text, graphics, audio-visual material, or even a software tool which people cannot obtain easily elsewhere.

Gaining Authority

Authority is granted by external sites making references to the site. The more these referring sites are considered authoritative in the same or a related field, the more authority value they pass. The search engine algorithm works like an academic citation system: a paper is given more importance if the most authoritative journals cite it. Academics care little whether the paper also has many references from non-academic sources (they don't read them anyway).

In the same vein, if a site wishes to be taken seriously, it must submit to the rigours of peer review to gain acceptance from those whose opinions matter.

Deciding the Authority Profile for a site

In the initial cross-disciplinary brainstorming phase of SEO consultation, the marketing objectives of the site are established. The SEO consultant then designs the site to satisfy marketing's demand for traffic which matches its target profile. To do this, the site is planned in detail so that keyword distribution across pages matches expected organic search criteria, while the authority of the site is established through the careful acquisition of good and relevant linking sites.

Site elements which support authority

As part of the analysis of a page's relevancy, the links to and from the page are not just counted, but analysed for their adherence to the thematic profile the search engine has developed for the page from its semantic analysis.

A fundamental aid for search engines in this task is careful use of anchor text and attributes such as title and alt text. 'Click here' is far less informative than 'Visit Andrew Bone's science news site', provided the hyperlink is related to the text around it.

Proximal text is therefore used to determine whether the link is really part of the page's discourse, or some unrelated (paid) link that will be an annoyance to the user rather than added value.
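As an illustration of this principle, the sketch below (plain Python, not real search-engine code) scores a link by the overlap between its anchor text and the words surrounding it; the tokenisation and the example strings are arbitrary assumptions for the purpose of the sketch.

```python
# Illustrative check: does the anchor text share vocabulary with its surrounding text?
# The tokenisation and example strings are arbitrary choices for this sketch.
import re

def tokens(text: str) -> set:
    return set(re.findall(r"[a-z']+", text.lower()))

def anchor_relevance(anchor_text: str, surrounding_text: str) -> float:
    """Fraction of anchor-text words that also appear in the surrounding prose."""
    anchor_words = tokens(anchor_text)
    if not anchor_words:
        return 0.0
    return len(anchor_words & tokens(surrounding_text)) / len(anchor_words)

context = "The latest physics stories are summarised each week on a dedicated science news site."
print(anchor_relevance("Click here", context))         # low overlap
print(anchor_relevance("science news site", context))  # high overlap
```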

It is not a successful policy in the long term to attempt to cheat the search engines. They are continuously adapting their algorithms to defeat the newest schemes to raise SERP rankings unfairly. It has become accepted wisdom that a site can only hope to be ranked well for its core themes if it provides what visitors are seeking: quality returns for time invested in visiting.

Bad Link Generation Strategies

The quick path leads to a short journey

Received wisdom among SEO practitioners is that sites should endeavour to earn respect and authority within the internet community in which they reside through original, quality content. The strategies adopted by poor-quality sites which lead to downgrading for authority and relevance include spamming and link farming:

Spamming

Spam is the opposite of good content. Broadly speaking, spam includes anything which annoys or distracts a visitor from his or her purpose for coming to the site. Unscrupulous spam tactics can go as far as cloaking content, false download buttons, misleading links, lies and illegal tricks, such as phishing for personal security details. Search engine crawlers look for spam and, if they find it, will downgrade or even block the site in the SERPs.

Needless to say, the more spammy a site is, the less likely authoritative, quality sites are to link to it.

Unable to build legitimate link profiles, these sites turn to strategies such as link farming:

Link Farming

Link farming is the practice of attempting to pass on link authority (link juice) in exchange for payment or mutual benefits between consenting sites. It is frowned upon by search engines because it breaks their tenet that sites should be endeavouring to provide genuine enhanced value to visitors. Link farming tends to generate links en masse to poor or irrelevant sites, misleading the visitor.

'Free' template sites are often full of hidden external links, sapping the link juice from the site.

Erratic link acquisition rates

Link acquisition rates are also taken into consideration in the way search engines rank a site. A baseline assumption is that a good site does not have an erratic link acquisition history. Rapid link acquisition followed by much slower rates may indicate a site that is not developing in step with its community. SEO practitioners should avoid quick-fix campaigns to gain backlinks illegitimately, such as hidden links (e.g. in page counters and other 'free' items) and paid links (link farming).
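As a rough illustration of what an erratic history might look like, the sketch below compares each month's new-link count with the running average of the preceding months and flags large spikes; the monthly counts and the spike factor are invented for the example.

```python
# Flag months in which new backlinks arrive far faster than the running average.
# The counts and the spike factor (3x) are invented values for illustration only.
new_links_per_month = [12, 15, 14, 18, 240, 9, 7]  # hypothetical backlink counts

def flag_spikes(counts, factor=3.0):
    flagged = []
    for i in range(1, len(counts)):
        average_so_far = sum(counts[:i]) / i
        if average_so_far > 0 and counts[i] > factor * average_so_far:
            flagged.append(i)
    return flagged

for month_index in flag_spikes(new_links_per_month):
    print(f"Month {month_index}: {new_links_per_month[month_index]} new links looks erratic")
```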

The Google PageRank system

Google uses an algorithm of scalable relevance: its intent is to remain focused on the desires of the user as a guide to how to tune its PageRank programme. There are numerous factors that Google has learned to incorporate in its algorithm, some in reaction to 'spamming', attempts by web designers to manipulate search results unfairly. These hundreds of factors can, with a little generous imagination, ultimately be boiled down to two broad groups: relevancy and popularity. Both need to be attended to for a site to succeed in the SERP derby.
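The original published PageRank idea can be illustrated with a short power-iteration sketch in Python. The four-page link graph below is invented, and real implementations handle dangling pages and enormous scale, but the core principle that links pass authority is the same.

```python
# Power-iteration sketch of the original PageRank idea: a page's score is the
# damped sum of the scores of the pages linking to it, shared over their outlinks.
# The four-page link graph is invented purely for illustration.
links = {
    "home":    ["about", "blog"],
    "about":   ["home"],
    "blog":    ["home", "about", "partner"],
    "partner": ["home"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

for page, score in sorted(pagerank(links).items(), key=lambda item: -item[1]):
    print(f"{page}: {score:.3f}")
```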

Popularity

Popularity may refer equally to the number of visitors to a site and to the number of links to a page from other domains. For the SEO practitioner, the number of relevant hyperlinks should be the more dominant factor, since these imply a more serious acknowledgement of the site's importance to the linker. As with YouTube videos, the raw number of views is no true indicator of the viewer's satisfaction.

A distinction needs to be made between 'page popularity' and 'domain popularity'. Some services operate from a single page (e.g. a search engine, news or meteorological service), or offer frequent updates to their applications (such as video players and the Adobe PDF Reader). These applications are updated regularly from a single page by people who do not normally extend their visit to the rest of the domain in question. As a result, there is high page popularity but relatively low domain popularity.

Other domains, such as wikipedia.org, are heavily visited, but not for specific pages, so have high domain popularity, but low page popularity (except perhaps the homepage).
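The distinction can be made concrete by aggregating the same backlink list at page level and at domain level, as in the sketch below; the backlink data and the domain names are invented for the example.

```python
# Aggregate the same backlink list two ways: per target page and per target domain.
# The backlink list and domains are invented for illustration.
from collections import Counter
from urllib.parse import urlparse

backlinks_to = [  # pages on our sites that received a link from elsewhere
    "https://tools.example.com/pdf-reader",
    "https://tools.example.com/pdf-reader",
    "https://tools.example.com/pdf-reader",
    "https://encyclopedia.example.org/article/alpha",
    "https://encyclopedia.example.org/article/beta",
    "https://encyclopedia.example.org/article/gamma",
]

page_popularity = Counter(backlinks_to)
domain_popularity = Counter(urlparse(url).netloc for url in backlinks_to)

print(page_popularity.most_common(1))   # one page concentrates the links
print(domain_popularity.most_common())  # links spread evenly across one domain, concentrated on the other
```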

The best ranking will be achieved by a domain which can achieve the optimum combination of these two popularity metrics for its purposes. There is no easy solution, since this dynamic is dependent on many and diverse market and service specific factors.

Relevancy

Relevancy is the degree to which a recommended SERP page satisfies the user's intended query. One of the most frightening things about SEO is that it is a double-edged sword: if visitors using certain keywords tend to bounce, or leave a domain quickly, Google will develop a profile that downgrades that domain for the keywords involved in the query.
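To make the bounce signal concrete, the sketch below computes a bounce rate per query keyword from hypothetical visit records; the data structure is an assumption for illustration, not a real analytics API.

```python
# Hypothetical analytics records: (query keyword, pages viewed during the visit).
# A visit that views only one page is counted as a bounce for this sketch.
from collections import defaultdict

visits = [
    ("link building", 1), ("link building", 4), ("link building", 1),
    ("seo services", 3), ("seo services", 5),
]

totals = defaultdict(lambda: [0, 0])  # keyword -> [bounces, visits]
for keyword, pages_viewed in visits:
    totals[keyword][1] += 1
    if pages_viewed <= 1:
        totals[keyword][0] += 1

for keyword, (bounces, count) in totals.items():
    print(f"{keyword}: bounce rate {bounces / count:.0%}")
```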

Relevancy is also the relationship between a page and a specific theme. Here, the authority granted to the site by the specialised internet community (i.e. established and authoritative sites which specialise in the same field as the domain in question) is a factor, as is the semantic analysis of the page and its content.

When a search engine sees a page, it collects a large range of data concerning the page's content, structure and other factors, such as how parts of the content relate to other parts. The primary tool in this operation is the long-established science of information retrieval (IR). The two chief criteria used in SEO IR are relevance and importance. In library systems, citation analysis was used to determine whether an academic community considered a document important; in SEO, the equivalent is link analysis.
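A toy version of combining the two criteria might look like the following: a simple term-frequency relevance score for the query, weighted by a pre-computed link-based importance score. The documents, the importance values and the weighting are all invented for illustration.

```python
# Toy ranking: combine query relevance (term frequency) with link-based importance.
# Documents, importance scores and the weighting are invented for illustration.
import re

documents = {
    "page-a": "link building guide with practical link building advice",
    "page-b": "company history and office photographs",
}
importance = {"page-a": 0.4, "page-b": 0.9}  # e.g. from a PageRank-style link analysis

def relevance(query: str, text: str) -> float:
    words = re.findall(r"[a-z]+", text.lower())
    query_terms = query.lower().split()
    if not words:
        return 0.0
    return sum(words.count(term) for term in query_terms) / len(words)

def score(query: str, page: str) -> float:
    return relevance(query, documents[page]) * importance[page]

for page in documents:
    print(page, round(score("link building", page), 3))
```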

This information is imported from our site www.sciencelibrary.info. For further information, please visit the site, or contact us.
