SEO Glossary of Terms

Transalpine has a department specialised in the highly technical field of website content optimization. Here we explain some of the most important technical terms used in Search Engine Optimization (SEO).


301 redirect

A permanent redirect. A better option than the temporary 302 redirect, since 301 passes link juice, while 302 does not.
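As a minimal sketch, a 301 redirect can be configured in an Apache .htaccess file (assuming mod_rewrite is available; the domain names are placeholders):

```apache
# Permanently (301) redirect the non-www domain to the www version.
# example.com stands in for a real domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

A temporary redirect would use [R=302] instead, which, as noted above, does not pass link juice.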


Algorithm

Numerical analysis of data from a wide range of sources. Each search engine (Google, Bing, etc.) has its own algorithms, by which it initially assesses and then regularly monitors how likely a site is to satisfy a search query. These algorithms are continuously adapted to the changing Internet, to the needs of searchers, and to counter illegitimate practices such as spamming.


Authority

A major search engine algorithm parameter. The relative authority of a site is determined from how well the site is respected and referenced by the quality sites within the online community in which it resides. In measuring authority, the perceived authority and relevance of the linking sites is more important than the total number of links.


Backlink

A link from an external site to a site or page. Backlinks are important for assessing the popularity and authority of a site for specific themes and keywords. The context of the internet community the links come from, as well as the quality (authority and popularity) of the external sites, is also an important indicator. It is better to have a few links from high-quality sites than many from irrelevant, low-quality sites.

Bounce Rate

A parameter used to judge how satisfied a user is with the website the search engine has directed them to in response to a query. A 100% bounce rate means the visitor left the site without opening a second page. This is interpreted as the site being irrelevant for the query. Too many results of this type will cause the search engine to downgrade the site for that and similar search queries.
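As a minimal sketch (the function name and figures are illustrative, not part of any standard analytics API), bounce rate is simply the share of visits that viewed only one page:

```python
def bounce_rate(single_page_visits, total_visits):
    """Percentage of visits that left without opening a second page."""
    if total_visits == 0:
        return 0.0
    return 100.0 * single_page_visits / total_visits

# 40 of 200 visitors left after the first page -> 20.0% bounce rate
print(bounce_rate(40, 200))
```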


Brainstorming

A formal procedure in which a cross-disciplinary team meets in an intensive session to find creative solutions to problems. Brainstorming lays emphasis on lateral thinking techniques and the non-exclusion of any ideas during the initial phases. In SEO consultation, brainstorming is a standard tool to utilise the in-house knowledge of all departments of a company, and to expose bias in marketing and development strategies. It also serves as an opportunity to communicate that SEO is the responsibility of everyone in an enterprise.


Canonicalization

Multiple versions of the same site, or parts of it, may exist online. Clearly identifying the 'official' (canonical) version of content that appears in more than one location is standard good SEO practice. For example, each domain has a URL with and without the 'www.' prefix. A 301 redirect should be used from one to the other to ensure that link juice is not divided between what the search engine sees as two distinct domains.
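Besides a 301 redirect, a page can declare its official version explicitly with a canonical link element in the HTML head (the URL shown is a placeholder):

```html
<head>
  <!-- Tells search engines which URL is the official version of this content -->
  <link rel="canonical" href="https://www.example.com/products/widgets">
</head>
```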


Cloaking

The practice of deceiving users as to the content of a page, while presenting search engines with a different profile. A simple example would be mislabelling elements which are visible only to humans, such as images, AV and Flash content. Google sees a title and alt text, and assumes the content of the video or image is true to that. Cloaking takes advantage of this to present users with different content, such as an ad or worse.


Consultants (SEO)

SEO professionals who advise clients on such matters as keyword analysis, site and content optimization, targeted audience behaviour, monitoring of site performance, and backlink generation.

Content Design

A branch of SEO concerned with the quality of content within a website. Its primary aim is to provide a site with material of sufficient quality, abundance, language coverage and authority that other sites within its online community voluntarily link to it and use it as a prime reference.

Content Engineering

A branch of web design concerned with the structuring of information within a formal architecture. SEO is concerned that this information architecture (IA) provides the desired level of access to search engines, so that they can build an accurate and fair index, and that human visitors may navigate easily through it. The hardest part of content engineering is ensuring relevant cross references between materials at the same hierarchical level.

Content Inventory

A formal phase of SEO consultation, in which existing online content is analysed for suitability, placement and performance in terms of the SEO parameters of keyword targeting, link patterns, and semantic mapping. Material that is not online may also be integrated into the inventory if deemed suitable for online publication.


Crawling

The process by which a programme, often called a 'spider', passes through websites via sitemaps and hyperlink paths and generates an index of the World Wide Web. Spiders pass through a domain on average every 6-12 weeks, and index a maximum of 1000 pages per domain. Their behaviour on the site can be controlled to some extent by the domain publisher.
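One common way the publisher steers spider behaviour is a robots.txt file at the domain root; a minimal sketch (the paths and sitemap URL are placeholders):

```text
# Applies to all spiders
User-agent: *
# Keep these sections out of the index
Disallow: /admin/
Disallow: /search-results/

# Point spiders at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```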


Depersonalization

The removal of Google personalization settings which distort search returns. SEO practitioners should view the internet as much as possible as anyone else would.


Folksonomy

A system of collaborative tagging of items in a catalogue, such that on a large scale a reliable naming and classification of the items emerges.


Geotargeting

The information and techniques employed to create a strong association between a site and a region or locality. Among the techniques are choosing a country-code TLD (.ch) in preference to a global TLD (.com), adding the address to every page, employing Google Maps locators, and associating the site with the local telephone directory entry and with other localised companies.

Global Navigation

Whole-site system for the orientation of the user. Beyond ensuring that every page has an evident return-to-home button (a logo is an effective element for this), a 'breadcrumb' trail is useful: the path from the homepage, through the various levels and category and sub-category pages, to the current page is displayed as a logical sequence.
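A breadcrumb trail of this kind might be marked up as follows (the page names and paths are illustrative):

```html
<nav class="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/services/">Services</a> &gt;
  <a href="/services/seo/">SEO</a> &gt;
  <span>Keyword Analysis</span>
</nav>
```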


Index

The records kept about sites on the World Wide Web, formed from data collected by spiders. Indexes are the results of machine learning algorithms which interpret and evaluate many different parameters, such as link density, semantic analysis of content, and keyword placement.


Keyword

A word or phrase which is expected to be used regularly by searchers as their input string for a specific topic. Keyword analysis and targeting are important techniques in SEO.

Keyword Analysis

A set of tools and techniques to acquire an accurate understanding of the relative importance of keywords for a specific site or page. Keyword analysis is done as an initial step in the SEO consultation process, and involves brainstorming, with the participation of all stakeholders.

Keyword Cannibalization

The negative effect of conflicts in keyword targeting, confusing human searchers and search engines as to which page is the best match for specific keywords in searches. Cannibalization is countered by carefully structured hierarchies in the information architecture.

Keyword Placement

The various elements of a page do not have the same weighting in terms of contribution to overall ranking importance. SEO is concerned with developing a strategy for placement of keywords to target individual pages for specific search queries, while avoiding keyword cannibalization.

Keyword Targeting

The practical application of the knowledge gained in keyword analysis. Keyword targeting is a page by page application of keywords which specifically channel search query results to the most relevant pages on a site. A global plan for keywords is integrated into the information architecture of a site to prevent keyword cannibalization.

Link Farming

Abusive attempts to gain authority by excessive backlinks with low relevance. This is one of the deadly sins that Google is on the look out for.

Link Juice

The enhancement of a site's authority on a subject for a search engine through relevant and judicious links and backlinks. SEO consultants and site designers seek ways to increase and internally distribute a site's link juice.

Long Tail

The 70% of all search queries using terms which are not popular enough to be considered keywords. These may be highly specific queries (e.g. 'Brother DCP-365CN' instead of 'printer'), foreign terms in primarily English queries, less-popular synonyms and morphological variants, misspellings, colloquialisms, and the plain weird. Named after the shape of the graph of frequency of term use in queries, which has 30% of queries based on a few known keywords, but 70% forming a long, rapidly diminishing frequency 'tail'. The long tail is a much more difficult area to exploit than obvious keywords, but can bring considerable benefits if incorporated well into a keyword targeting strategy.

Meta data

The information (description, author, keywords) stored in meta tags in the head section of an HTML page. Only the title is used by search engines for page ranking purposes. The description is an important advertisement to human visitors in the SERP listing, while the keywords tag has no value at all to machine or beast.
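As an illustration (the values are placeholders), meta data sits in the head section of an HTML page:

```html
<head>
  <!-- The only meta data used for page ranking -->
  <title>SEO Glossary | Example Company</title>
  <!-- Shown to human visitors in the SERP listing, not used for ranking -->
  <meta name="description" content="Definitions of the most important SEO terms.">
  <!-- Ignored by modern search engines -->
  <meta name="keywords" content="SEO, glossary, search engine optimization">
</head>
```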

Organic results

Non-paid returns to queries in the SERP (Search Engine Results Page). Sites are indexed and evaluated by algorithms to assess their suitability as matches for specific keywords in the query. Long-term good ranking from organic searches is the aim of SEO.

Paid results

Sites may be returned for specific keyword searches, irrespective of their evaluation by the search engine algorithms. The results are displayed prominently at the top of the SERP, and must be paid for. Competitive bidding for important keywords makes paid results an ever more expensive way to advertise. SEO utilises data available from paid results tools, such as keyword analytics, but proposes that organic results are a better return-on-investment strategy.


Popularity

A major search engine algorithm parameter. The popularity of a site is essentially how much traffic it receives and the number of backlinks it has. However, the algorithms weight these parameters by traffic type and by the quality and relevance of the backlinking sites.


Ranking

The non-paid (organic) position of a site or other resource on a SERP in response to a particular search query. The aim of SEO is to improve rankings by optimizing a site for the key parameters used by search engines to determine ranking.


Relevance

A major search engine algorithm parameter. The relevance of a site or page is how well it matches the intent of a search query. Key parameters in measuring relevance include user response to the SERP listing (click-through and bounce rates). Do searchers click through to the site for specific keywords, and if they do, do they continue on the site, or return immediately to the SERP (bounce)?

Search Engine Optimization (SEO)

The techniques in the design and content of a website which enhance, through fair means, its results on the search engine results page (SERP). In the Web 2.0 Internet of today, SEO is a primary factor in the success or failure of an online business. SEO depends heavily on changes in the search engine algorithms which assess the relevance and value of a website in satisfying the probable expectations of the searcher.

Semantic Analysis

A method by which search engines develop a map of the content of a page, and in particular how well the content is true to the theme profile presented through keywords in titles and headings. Semantic analysis is a tool to counter spamming, and to establish the relative value of a page for specific search queries.

Semantic Connectivity

A determination of how well content links within a page and to other pages of a site, or other sites, are relevant to each other.

Semantic Map

A description of a page's semantic connectivity and structure resulting from document analysis, based on data collected by a spider. It is used in conjunction with algorithms which categorise by parameters of authority, relevance, importance and popularity, to develop a profile of a page's suitability as a response to specific keywords in queries.


SERP

Search Engine Results Page. The page of possible websites and other internet resources, such as videos, images and maps, which may satisfy a searcher's query. SERPs return websites ranked according to parameters such as relevance and authority. Most results are organic, chosen from the search engine indexes, but some are paid for by site owners to obtain prominent positions.


Spamming

Attempts to raise the rankings of a low-quality site through techniques which are unfair and misleading to the user, and which search engines have declared illegitimate. Spamming includes cloaking, keyword stuffing, content theft, page scraping, content duplication, mislabelled links, and link farming.


Spider

Search engines collect information about a site by the use of 'spiders'. The site is accessed and information is collected: for example, the regularity of use of certain keywords, and the relative value and relevancy of key elements such as titles and internal site structure. Links to and from the site are also assessed.


Structure

Also known as information architecture, structure is concerned with the accessibility of information and the clarity of its presentation and thematic fidelity, in logical, readily understood patterns. In CMS systems, structure must ensure self-replicability at any scale, without loss of accessibility and clarity.


SWOT Analysis

Strengths, Weaknesses, Opportunities, Threats. A marketing technique which facilitates analysis of a company with respect to its competitors.


Title

The page title, found in the head section of an HTML document, is the highest-ranking keyword location opportunity. Contrary to popular belief, the title is the only piece of meta data which is considered by search engines in their page ranking systems.


URL

Uniform Resource Locator. The URL is the address of a page, shown in the address bar at the top of the browser. The URL is a major indicator to search engines and human navigators of the nature of the site. It should be clearly legible, free of codes and illegible flags, and the suffix should be local (e.g. the country code .ch) for geolocation of the site to a region. Generic top-level domains (.com, .info, .net) should only be used for genuinely international sites, since they lose ranking importance for local service queries.
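For illustration (both addresses are hypothetical), compare a legible URL with one full of codes and flags:

```text
Legible:    https://www.example.ch/services/seo/keyword-analysis
Illegible:  https://www.example.ch/index.php?id=73&cat=4&sess=9f2a
```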

