What is Search Engine Optimisation (SEO) – History, Practices and Outlook

1. What is Search Engine Optimisation?

Search engine optimisation (SEO) is the process of affecting the online visibility of a website or a web page in a web search engine’s unpaid results—often referred to as “natural”, “organic”, or “earned” results. In general, the earlier (or higher ranked on the search results page), and more frequently a website appears in the search results list, the more visitors it will receive from the search engine’s users; these visitors can then be converted into customers. SEO may target different kinds of search, including image search, video search, academic search, news search, and industry-specific vertical search engines. SEO differs from local search engine optimisation in that the latter is focused on optimising a business’ online presence so that its web pages will be displayed by search engines when a user enters a local search for its products or services. The former instead is more focused on national or international searches.

As an Internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine behavior, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by their targeted audience. Optimising a website may involve editing its content, adding content, and modifying its HTML and associated coding, both to increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic. By May 2015, mobile search had surpassed desktop search. In 2015, it was reported that Google was developing and promoting mobile search as a key feature within future products. In response, many brands began to take a different approach to their Internet marketing strategies.[4]

2. History of SEO

Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed only to submit the address of a page, or URL, to the various engines which would send a “spider” to “crawl” that page, extract links to other pages from it, and return information found on the page to be indexed. The process involves a search engine spider downloading a page and storing it on the search engine’s own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.

Website owners recognized the value of a high ranking and visibility in search engine results, creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase “search engine optimization” probably came into use in 1997. Sullivan credits Bruce Clay as one of the first people to popularize the term. On May 2, 2007, Jason Gambert attempted to trademark the term SEO by convincing the Trademark Office in Arizona that SEO is a “process” involving manipulation of keywords and not a “marketing service.”

Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page’s content. Using metadata to index pages was found to be less than reliable, however, because the webmaster’s choice of keywords in the meta tag could potentially be an inaccurate representation of the site’s actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches. Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines. By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms in an effort to prevent webmasters from manipulating rankings.

By relying so much on factors such as keyword density which were exclusively within a webmaster’s control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals. Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb, Adversarial Information Retrieval on the Web was created to bring together practitioners and researchers concerned with search engine optimization and related topics.

Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[15] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban. Google’s Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.

Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimisation. Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website.[20] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the “crawl rate”, and track the web pages’ index status.

3. SEO and the Google algorithm

In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed “Backrub”, a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links. PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
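
To make the random-surfer idea concrete, here is a minimal sketch of the PageRank iteration on a small, made-up link graph. The damping factor of 0.85 and the toy graph are illustrative assumptions, not Google’s actual parameters or data.

```python
# Minimal PageRank sketch: the "random surfer" follows a link with
# probability d (the damping factor) and jumps to a random page otherwise.
# The graph below is a made-up toy example.

def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}           # start with a uniform distribution
    for _ in range(iterations):
        new_rank = {p: (1.0 - d) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                      # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += d * rank[page] / n
            else:
                share = d * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share     # each outlink passes an equal share
        rank = new_rank
    return rank

toy_graph = {
    "A": ["B"],
    "C": ["B"],
    "D": ["B", "C"],
    "B": ["C"],
    "E": [],
}
print(pagerank(toy_graph))  # B accumulates the highest score: it has the most inbound links
```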

Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design. Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.

By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. In June 2007, The New York Times’ Saul Hansell stated that Google ranks sites using more than 200 different signals. The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimisation and have shared their personal opinions. Patents related to search engines can provide information to better understand search engines. In 2005, Google began personalizing search results for each user: depending on their history of previous searches, Google crafted results for logged-in users.

In 2007, Google announced a campaign against paid links that transfer PageRank. On June 15, 2009, Google disclosed that it had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting. As a result of this change, the usage of nofollow led to the evaporation of PageRank. In order to avoid the above, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash, and JavaScript.

In December 2009, Google announced it would be using the web search history of all its users in order to populate search results. On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts and other content much sooner after publishing than before, Google Caffeine was a change to the way Google updated its index in order to make things show up more quickly on Google. According to Carrie Grimes, the software engineer who announced Caffeine for Google, “Caffeine provides 50 percent fresher results for web searches than our last index…” Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.

In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites have copied content from one another and benefited in search engine rankings by engaging in this practice; however, Google implemented a new system which punishes sites whose content is not unique. The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine. Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google’s natural language processing and semantic understanding of web pages.

4. How algorithms work

Search engines use complex mathematical algorithms to interpret which websites a user seeks. Picture each website as a bubble in a link graph, with arrows representing the links between sites; programs sometimes called spiders examine which sites link to which other sites. Websites receiving more inbound links, or stronger links, are presumed to be more important and closer to what the user is searching for. In a hypothetical example, website B, as the recipient of numerous inbound links, ranks more highly in a web search. And the links “carry through”: website C, even though it has only one inbound link, benefits because that link comes from a highly popular site (B), while site E does not.

The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review. Google offers Google Search Console, through which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links. Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click; this was discontinued in 2009.
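
As a rough illustration, a basic XML sitemap of the kind submitted through Search Console can be generated with nothing but the Python standard library. The URLs and dates below are placeholders, and only the loc element is strictly required by the sitemap protocol.

```python
# Sketch: build a minimal XML sitemap (see https://www.sitemaps.org/protocol.html)
# using only the standard library. The URLs and dates below are placeholders.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc              # required: the page's URL
        if lastmod:
            ET.SubElement(url, "lastmod").text = lastmod  # optional: last modification date
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + ET.tostring(urlset, encoding="unicode")

pages = [
    ("https://www.example.com/", "2018-01-15"),
    ("https://www.example.com/services/", None),
]
print(build_sitemap(pages))   # save the output as sitemap.xml and submit it in Search Console
```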

Search engine crawlers may look at a number of different factors when crawling a site. Not every page is indexed by the search engines. Distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled.

5. How to avoid getting listed

To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine’s database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.
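
For illustration, the sketch below uses Python’s standard robots.txt parser to show how a well-behaved crawler decides whether a URL may be fetched. The domain, paths and rules are hypothetical.

```python
# Sketch: how a well-behaved crawler consults robots.txt before fetching a page.
# The domain and paths are hypothetical. A typical robots.txt might contain:
#   User-agent: *
#   Disallow: /cart/
#   Disallow: /search
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()   # download and parse the live file

for path in ("/products/blue-widget", "/cart/checkout", "/search?q=widgets"):
    allowed = rp.can_fetch("*", "https://www.example.com" + path)
    print(path, "-> crawl allowed" if allowed else "-> blocked by robots.txt")

# Pages that should stay crawlable but out of the index are usually handled with a
# robots meta tag instead, e.g. <meta name="robots" content="noindex, follow">.
```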

6. Increasing organic (SEO) visibility 

A variety of methods can increase the prominence of a webpage within the search results. Cross-linking between pages of the same website to provide more links to important pages may improve its visibility. Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic. Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page’s metadata, including the title tag and meta description, will tend to improve the relevancy of a site’s search listings, thus increasing traffic. URL normalization of web pages accessible via multiple URLs, using the canonical link element or via 301 redirects, can help make sure links to different versions of the URL all count towards the page’s link popularity score.
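
As a hedged illustration of URL normalization, the sketch below collapses several common variants of the same address onto one canonical form. The specific rules chosen (force HTTPS, lower-case host without “www.”, strip tracking parameters, drop the trailing slash) are assumptions that a real site would tailor to its own URL scheme, usually alongside a rel="canonical" tag or 301 redirects.

```python
# Sketch: collapse common URL variants onto one canonical form so that link
# popularity is not split across duplicates. The rules below are illustrative
# assumptions, not universal best practice.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "gclid"}

def canonicalise(url):
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    query = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    path = parts.path.rstrip("/") or "/"
    return urlunsplit(("https", host, path, urlencode(query), ""))

variants = [
    "http://WWW.Example.com/Shoes/",
    "https://example.com/Shoes?utm_source=newsletter",
]
print({canonicalise(u) for u in variants})   # both collapse to https://example.com/Shoes
```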

7. White hat versus black hat SEO techniques

SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design (“white hat”), and those techniques of which search engines do not approve (“black hat”). The search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO, or black hat SEO. White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.

An SEO technique is considered white hat if it conforms to the search engines’ guidelines and involves no deception. As the search engine guidelines are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines, but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online “spider” algorithms, rather than attempting to trick the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility, although the two are not identical.

Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines, or involve deception. One black hat technique uses text that is hidden, either as text colored similar to the background, in an invisible div, or positioned off screen. Another method gives a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking. Another category sometimes used is grey hat SEO. This is in between black hat and white hat approaches, where the methods employed avoid the site being penalized, but do not act in producing the best content for users. Grey hat SEO is entirely focused on improving search engine rankings.

Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines’ algorithms, or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices. Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google’s list.

8. Inbound Marketing Strategy with SEO

SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, such as paid advertising through pay-per-click (PPC) campaigns, depending on the site operator’s goals. Search engine marketing (SEM) is the practice of designing, running, and optimizing search engine ad campaigns. Its difference from SEO is most simply depicted as the difference between paid and unpaid priority ranking in search results. Its purpose concerns prominence more than relevance; website developers should still give SEM careful consideration, since most users navigate to the primary listings of their search. A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade, setting up analytics programs to enable site owners to measure results, and improving a site’s conversion rate. In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public,[55] which showed a shift in its focus towards “usefulness” and mobile search.

SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors. Search engines can change their algorithms, impacting a website’s placement, possibly resulting in a serious loss of traffic. According to Google’s CEO, Eric Schmidt, in 2010 Google made over 500 algorithm changes – almost 1.5 per day. It is considered wise business practice for website operators to liberate themselves from dependence on search engine traffic. In addition to accessibility in terms of web crawlers (addressed above), user web accessibility has become increasingly important for SEO.

“Based on search data, organic visitors through SEO are 47% along their buyer journey, have the highest trust factor, higher average sale and better engagement than paid visitors. These signals are getting stronger each year – making SEO investment essential for businesses.” – Shane Chand, Digital Squad – Digital Marketing Agency Auckland

9. International SEO

Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines’ market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[59] In markets outside the United States, Google’s share is often larger, and Google remains the dominant search engine worldwide as of 2007.[60] As of 2006, Google had an 85–90% market share in Germany.[61] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[61] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[62] That market share is achieved in a number of countries.

As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable example markets are China, Japan, South Korea, Russia and the Czech Republic where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.

Successful search optimization for international markets may require professional translation of web pages, registration of a domain name with a top level domain in the target market, and web hosting that provides a local IP address. Otherwise, the fundamental elements of search optimization are essentially the same, regardless of language.[61]

10. What are Backlinks?

A backlink for a given web resource is a link from some other website (the referrer) to that web resource. A web resource may be (for example) a website, web page, or web directory.

A backlink is a reference comparable to a citation. The quantity, quality, and relevance of backlinks for a web page are among the factors that search engines like Google evaluate in order to estimate how important the page is. PageRank calculates a score for each web page based on how all the web pages are connected among themselves, and is one of the variables that Google Search uses to determine how high a web page should appear in search results. This weighting of backlinks is analogous to citation analysis of books, scholarly papers, and academic journals. A Topical PageRank has also been researched and implemented, which gives more weight to backlinks coming from pages on the same topic as the target page.

Some other words for backlink are incoming link, inbound link, inlink, inward link, and citation.[1]

11. Backlinks and search engines

Search engines often use the number of backlinks that a website has as one of the most important factors for determining that website’s search engine ranking, popularity and importance. Google’s description of its PageRank system, for instance, notes that “Google interprets a link from page A to page B as a vote, by page A, for page B.” Knowledge of this form of search engine ranking has fueled a portion of the SEO industry commonly termed linkspam, where a company attempts to place as many inbound links as possible to its site regardless of the context of the originating site. Search engine rankings are regarded as a crucial parameter for online business and for the conversion rate of visitors to any website, particularly when it comes to online shopping. Blog commenting, guest blogging, article submission, press release distribution, social media engagement, and forum posting can be used to increase backlinks.

Websites often employ SEO techniques to increase the number of backlinks pointing to their website. Some methods are free for use by everyone whereas some methods, like linkbaiting, require quite a bit of planning and marketing to work. There are also paid techniques to increase the number of backlinks to a target site. For example, private blog networks can be used to purchase backlinks.

There are several factors that determine the value of a backlink. Backlinks from authoritative sites on a given topic are highly valuable. If both sites and pages have content geared toward the topic, the backlink is considered relevant and believed to have strong influence on the search engine rankings of the web page granted the backlink. A backlink represents a favorable ‘editorial vote’ for the receiving webpage from another granting webpage. Another important factor is the anchor text of the backlink. Anchor text is the descriptive labeling of the hyperlink as it appears on a web page. Search engine bots (i.e., spiders, crawlers, etc.) examine the anchor text to evaluate how relevant it is to the content on a webpage. Anchor text and webpage content congruency are highly weighted in search engine results page (SERP) rankings of a webpage with respect to any given keyword query by a search engine user.
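
To show what a crawler actually records, the sketch below extracts link targets and their anchor text from a fragment of HTML using only the Python standard library; the HTML snippet is invented.

```python
# Sketch: extract backlink targets and their anchor text from an HTML snippet,
# which is roughly what a crawler records when it evaluates a link.
from html.parser import HTMLParser

class AnchorExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []            # list of (href, anchor_text) pairs
        self._current_href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._current_href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._current_href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current_href is not None:
            self.links.append((self._current_href, "".join(self._text).strip()))
            self._current_href = None

html = '<p>Read this <a href="https://www.example.com/seo-guide">guide to SEO basics</a>.</p>'
parser = AnchorExtractor()
parser.feed(html)
print(parser.links)  # [('https://www.example.com/seo-guide', 'guide to SEO basics')]
```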

Changes to the algorithms that produce search engine rankings can place a heightened focus on relevance to a particular topic. While some backlinks might be from sources containing highly valuable metrics, they could also be unrelated to the consumer’s query or interest. An example of this would be a link from a popular shoe blog (with valuable metrics) to a site selling vintage pencil sharpeners. While the link appears valuable, it provides little to the consumer in terms of relevance.

12. Conversion Rate Optimisation Explained

Online conversion rate optimization (or website optimization) was born out of the need of e-commerce marketers to improve their website’s performance in the aftermath of the dot-com bubble. As competition grew on the web during the early 2000s, website analysis tools and an awareness of website usability prompted internet marketers to produce measurables for their tactics and improve their website’s user experience.

In 2004, new tools enabled internet marketers to experiment with website design and content variations to determine which layouts, copy text, offers, and images perform best. This form of optimization accelerated in 2007 with the introduction of the free Google Website Optimizer.[3] Today optimization and conversion are key aspects of many digital marketing campaigns. A research study conducted among internet marketers in 2014, for example, showed that 59% of respondents thought that CRO was “crucial to their overall digital marketing strategy”.[4]

Conversion rate optimization shares many principles with direct response marketing – a marketing approach that emphasizes tracking, testing, and on-going improvement. Direct marketing was popularized in the early twentieth century and supported by the formation of industry groups such as the Direct Marketing Association, which formed in 1917.

Like modern day conversion rate optimization, direct response marketers also practice A/B split-testing, response tracking, and audience testing to optimize mail, radio, and print campaigns.

13. Statistical approach to Conversion Rate Optimisation

Frequently, when marketers study a lift in an ad campaign, they discover customer behavior is not consistent. Online marketing response rates fluctuate widely from hour to hour, segment to segment, and offer to offer.

This phenomenon can be traced to the difficulty humans have separating chance events from real effects. Using the haystack process, at any given time, marketers are limited to examining and drawing conclusions from small data samples. However, psychologists (led by Daniel Kahneman and Amos Tversky) have documented tendencies to find spurious patterns in small samples to explain why poor decisions are made. Statistical methodologies can be leveraged to study large samples, mitigating the urge to see patterns where none exist.

These methodologies, or “conversion optimization” methods, are then taken a step further to run in a real-time environment. The real-time data collection and subsequent messaging increase the scale and effectiveness of the online campaign.

Reaching a statistically significant result in itself is not enough. Conversion optimization practitioners must ensure that their sample size accounts for important variables. For example, a test may appear statistically significant well before seasonal factors (time of day, day of week, time of year) have been adequately reflected in the data sample. One variation may appeal to one season more than others and ultimately misguide the result.
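
As an illustration of such a significance check, here is a sketch of a two-proportion z-test comparing two conversion rates, using only the Python standard library. The visitor and conversion counts are invented, and as noted above a real test must also cover the relevant seasonal cycles.

```python
# Sketch: two-proportion z-test comparing conversion rates of variants A and B.
# The counts are made up; a real analysis must also span the seasonal factors
# (time of day, day of week, time of year) that matter for the business.
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                 # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))               # two-sided p-value
    return p_a, p_b, z, p_value

p_a, p_b, z, p = two_proportion_z_test(conv_a=200, n_a=10_000, conv_b=260, n_b=10_000)
print(f"A: {p_a:.2%}  B: {p_b:.2%}  z={z:.2f}  p={p:.4f}")
# A small p-value suggests the difference is unlikely to be pure chance, but the
# sample must still cover the segments and seasonality described above.
```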

It is equally important to understand how various segments affect tests and results. Different user segments (e.g. device type, location, new vs. returning visitor) will respond differently to each variation. Analyzing results without accounting for these segments can hide what is really happening: a significant improvement for one segment may be offset by poor results for another. For example, an uplift in desktop conversion rate could offset a decreased conversion rate on mobile devices; in this instance, only the desktop version should be declared the ‘winning’ test.
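
The short sketch below shows how an aggregate uplift can hide a drop within one segment; all figures are invented for illustration.

```python
# Sketch: the same A/B test broken down by device segment. The aggregate
# numbers hide the fact that variant B hurts mobile. All figures are invented.
results = {
    # segment: (visitors_A, conversions_A, visitors_B, conversions_B)
    "desktop": (6_000, 150, 6_000, 210),
    "mobile":  (4_000, 120, 4_000, 100),
}

total = [0, 0, 0, 0]
for segment, (n_a, c_a, n_b, c_b) in results.items():
    print(f"{segment:8s}  A: {c_a / n_a:.2%}   B: {c_b / n_b:.2%}")
    total = [t + x for t, x in zip(total, (n_a, c_a, n_b, c_b))]

n_a, c_a, n_b, c_b = total
print(f"{'overall':8s}  A: {c_a / n_a:.2%}   B: {c_b / n_b:.2%}")
# Overall B looks better (3.10% vs 2.70%), yet mobile conversion actually fell
# (2.50% vs 3.00%), so only the desktop variant should be declared the winner.
```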


14. Conversion Rate Optimisation (CRO) methodology

Conversion rate optimisation (CRO) seeks to increase the percentage of website visitors that take a specific action (often submitting a web form, making a purchase, signing up for a trial, etc.) by methodically testing alternate versions of a page or process. In doing so, businesses are able to generate more leads or sales without investing more money in website traffic, hence increasing their marketing return on investment and overall profitability.

A conversion rate is defined as the percentage of visitors who complete a goal, as set by the site owner. Some test methods, such as split testing or A/B testing, enable one to monitor which headlines, copy, images, social proof elements, and content help convert visitors into customers.

There are several approaches to conversion optimisation with two main schools of thought prevailing in the last few years. One school is more focused on testing to discover the best way to increase website, campaign, or landing page conversion rates. The other school is focused on the pretesting stage of the optimisation process. In this second approach, the optimisation company will invest a considerable amount of time understanding the audience and then creating a targeted message that appeals to that particular audience. Only then would it be willing to deploy testing mechanisms to increase conversion rates.

15. Elements of the test focused approach

Conversion optimization platforms for content, campaigns, and delivery consist of the following elements:

16. Data collection and processing

The platform must process hundreds of variables and automatically discover which subsets have the greatest predictive power, including any multivariate relationship. A combination of pre- and post-screening methods is employed, dropping irrelevant or redundant data as appropriate. A flexible data warehouse environment accepts customer data as well as data aggregated by third parties.

This means it is essential to ensure the data is as ‘clean’ as possible before undertaking any analysis, for example by eliminating activity from bots, traffic from staging websites, and data skewed by incorrect configurations of tools such as Google Analytics.

Data can be numeric or text-based, nominal or ordinal. Bad or missing values are handled gracefully.

Data may be geographic, contextual, frequential, demographic, behavioral, customer based, etc.
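
As a small illustration of this cleaning step, the sketch below drops bot and staging-site hits and fills in missing values before any analysis. The record fields, bot signatures and hostnames are all assumptions.

```python
# Sketch: basic pre-analysis cleaning of raw hit records. Field names, bot
# signatures, and the staging hostname are illustrative assumptions.
raw_hits = [
    {"host": "www.example.com", "user_agent": "Mozilla/5.0", "country": "NZ", "converted": 1},
    {"host": "staging.example.com", "user_agent": "Mozilla/5.0", "country": "NZ", "converted": 0},
    {"host": "www.example.com", "user_agent": "Googlebot/2.1", "country": None, "converted": 0},
    {"host": "www.example.com", "user_agent": "Mozilla/5.0", "country": None, "converted": 0},
]

BOT_SIGNATURES = ("bot", "spider", "crawler")

def clean(hits):
    cleaned = []
    for hit in hits:
        if hit["host"] != "www.example.com":           # drop staging / wrong-property traffic
            continue
        if any(sig in hit["user_agent"].lower() for sig in BOT_SIGNATURES):
            continue                                    # drop known bot traffic
        hit = dict(hit, country=hit["country"] or "unknown")  # handle missing values gracefully
        cleaned.append(hit)
    return cleaned

print(clean(raw_hits))   # keeps only the two genuine human hits on the live site
```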

17. Hypothesis

After data collection, forming a hypothesis is the next step. This process forms the foundation of why changes are made. Hypotheses are made based on observation and deduction. It is important that each hypothesis be measurable; without measurable hypotheses, no conclusions can be derived.

18. Optimization goals

The official definition of “optimization” is the discipline of applying advanced analytical methods to make better decisions. Under this framework, business goals are explicitly defined and then decisions are calibrated to optimize those goals. The methodologies have a long record of success in a wide variety of industries, such as airline scheduling, supply chain management, financial planning, military logistics and telecommunications routing. Goals should include maximization of conversions, revenues, profits, LTV or any combination thereof.

19. Business rules

Arbitrary business rules must be handled under one optimization framework. Using such a platform entails that one should understand these and other business rules, then adapt targeting rules accordingly.

20. Real-time decision making

Once mathematical models have been built, ad/content servers use an audience screen method to place visitors into segments and select the best offers, in real time. Business goals are optimized while business rules are enforced simultaneously. Mathematical models can be refreshed at any time to reflect changes in business goals or rules.
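
A minimal sketch of this serving step is given below, assuming the pre-built model can be reduced to per-segment offer scores and a single business rule; the segments, offers, scores and rule are all invented.

```python
# Sketch: real-time offer selection. A pre-built model is reduced here to a
# lookup table of predicted conversion rates per segment; the business rule
# (never show a discount to existing subscribers) is an invented example.
MODEL_SCORES = {
    # segment: {offer: predicted conversion rate}
    "new_mobile":        {"free_trial": 0.041, "10_percent_off": 0.035, "newsletter": 0.012},
    "returning_desktop": {"free_trial": 0.018, "10_percent_off": 0.052, "newsletter": 0.020},
}

def allowed(offer, visitor):
    # Business rule: existing subscribers never see discount offers.
    return not (visitor["subscriber"] and offer == "10_percent_off")

def choose_offer(visitor):
    scores = MODEL_SCORES[visitor["segment"]]
    candidates = {o: s for o, s in scores.items() if allowed(o, visitor)}
    return max(candidates, key=candidates.get)    # highest predicted conversion rate

print(choose_offer({"segment": "returning_desktop", "subscriber": True}))   # newsletter
print(choose_offer({"segment": "returning_desktop", "subscriber": False}))  # 10_percent_off
```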

21. Statistical learning

Results are made repeatable by employing a wide array of statistical methodologies. Variable selection, validation testing, simulation, control groups and other techniques together help to distinguish true effects from chance events. A champion/challenger framework ensures that the best mathematical models are always deployed. In addition, performance is enhanced by the ability to analyze huge datasets and to retain historical learning.

The term “white hat” in Internet slang refers to an ethical computer hacker, or a computer security expert, who specializes in penetration testing and in other testing methodologies to ensure the security of an organization’s information systems. Ethical hacking is a term coined by IBM meant to imply a broader category than just penetration testing. Contrasted with black hat, a malicious hacker, the name comes from Western films, where heroic and antagonistic cowboys might traditionally wear a white and a black hat respectively.

White-hat hackers may also work in teams called “sneakers”, red teams, or tiger teams.

22. Black Hat Basics

One of the first instances of an ethical hack being used was a “security evaluation” conducted by the United States Air Force, in which the Multics operating system was tested for “potential use as a two-level (secret/top secret) system.” The evaluation determined that while Multics was “significantly better than other conventional systems,” it also had “… vulnerabilities in hardware security, software security and procedural security” that could be uncovered with “a relatively low level of effort.” The authors performed their tests under a guideline of realism, so their results would accurately represent the kinds of access an intruder could potentially achieve. They performed tests involving simple information-gathering exercises, as well as outright attacks upon the system that might damage its integrity; both results were of interest to the target audience. There are several other now unclassified reports describing ethical hacking activities within the US military.

By 1981, The New York Times described white hat activities as part of a “mischievous but perversely positive ‘hacker’ tradition”. When a National CSS employee revealed the existence of his password cracker, which he had used on customer accounts, the company chastised him not for writing the software but for not disclosing it sooner. The letter of reprimand stated: “The Company realizes the benefit to NCSS and in fact encourages the efforts of employees to identify security weaknesses to the VP, the directory, and other sensitive software in files”.

The idea to bring this tactic of ethical hacking to assess security of systems was formulated by Dan Farmer and Wietse Venema. With the goal of raising the overall level of security on the Internet and intranets, they proceeded to describe how they were able to gather enough information about their targets to have been able to compromise security if they had chosen to do so. They provided several specific examples of how this information could be gathered and exploited to gain control of the target, and how such an attack could be prevented. They gathered up all the tools they had used during their work, packaged them in a single, easy-to-use application, and gave it away to anyone who chose to download it. Their program, called Security Administrator Tool for Analyzing Networks, or SATAN, was met with a great amount of media attention around the world in 1992.

23. Tactics

While penetration testing concentrates on attacking software and computer systems from the start – scanning ports, examining known defects and patch installations, for example – ethical hacking may include other things. A full-blown ethical hack might include emailing staff to ask for password details, rummaging through executives’ dustbins, and even breaking and entering, all without the knowledge and consent of the targets. Only the owners, CEOs and board members (stakeholders) who asked for a security review of this magnitude are aware. To try to replicate some of the destructive techniques a real attack might employ, ethical hackers may arrange for cloned test systems, or organize a hack late at night while systems are less critical. In the most recent cases these hacks are carried out as a long-term con (days, if not weeks, of human infiltration into an organization). Some examples include leaving USB flash drives with hidden auto-start software in a public area, as if someone had lost the small drive and an unsuspecting employee found it and took it.

Some other methods of carrying out these include:

  • DoS attacks
  • Social engineering tactics
  • Security scanners such as:
    • W3af
    • Nessus
    • Nexpose
    • Burp Suite
  • Frameworks such as:
    • Metasploit

Such methods identify and exploit known vulnerabilities and attempt to evade security to gain entry into secured areas. They are able to do this by hiding software and system ‘back-doors’ that could be used as a link to the information or access that the non-ethical hacker, also known as a ‘black-hat’ or ‘grey-hat’, may want to reach.

24. Future of SEO – Voice Search

“50% of all searches will be voice searches by 2020” according to comScore

“About 30% of searches will be done without a screen by 2020.” via Mediapost

“We estimate there will be 21.4 million smart speakers in the US by 2020” according to Activate

“By 2019, the voice recognition market will be a $601 million industry”, according to a report from Technavio via Skyword.

“This year (2017), 25 million devices will be shipped, bringing the total number of voice-first devices to 33 million in circulation.” based on a new study by VoiceLabs via Mediapost

25. Current Usage of Voice Search

“There are over one billion voice searches per month. (January 2018)” estimates Alpine.AI

“Google voice search queries in 2016 are up 35x over 2008” according to Google trends via Search Engine Watch

“40% of adults now use voice search once per day” according to Location World

“Cortana now has 133 million monthly users” according to Microsoft/Tech Radar

“In May 2016, 1 in 5 searches on an Android app in the USA were through speech” according to KPCB

“25% of 16-24s use voice search on mobile” via Global Web Index

“41% of people using voice search have only started in the last 6 months” according to MindMeld

“60% of people using voice search have started in the last year” according to MindMeld

“11% of people using voice search started more than 3 years ago” according to MindMeld

19% of people use Siri at least daily. (HubSpot, 2015) (Source: https://www.hubspot.com/marketing-statistics)

“9% of users said that they’ve used AI personal assistants like Siri or Cortana in the past day” according to AYTM

“45% of those who have used AI personal assistants said they’ve used Siri. 33% have used Google Now. 27% used Microsoft’s Cortana. 10% have used Amazon Echo or Alexa.” via AYTM

“1 in 5 online adults have used voice search on their mobile in the last month” via Global Web Index

“37% use Siri, 23% use Microsoft’s Cortana AI, and 19% use Amazon’s Alexa AI at least monthly.” (HubSpot, 2015) (Source: https://www.hubspot.com/marketing-statistics)

“We estimate that 325.8 million people used voice control in the past month” according to Global Web Index (that’s almost 10% of the online population according to Internet Stats).

“We estimate that the retail giant (Amazon) has sold 5.1 million of the smart speakers in the U.S since it launched in 2014” according to CIRP via Geekwire.

“Amazon sold approximately 2 million units in the first nine months of 2016” according to CIRP

“Amazon sold 4.4 million Echo units in its first full year of sales” according to Geek Wire

“25% of searches on Windows 10 taskbar are voice. On desktop!” according to Purna Virji

“Only around a third of smartphone owners use their personal assistants regularly, even though 95% have tried them at some point.” according to Creative Strategies via The Economist

“Only 11% of respondents who already own an Amazon Alexa or Google Home device will also buy a competing device.” via Voicelabs.

“Application growth for Amazon Alexa has been impressive – over 500% in the second half of 2016″ according to Voicelabs.

“Evercore estimates 500,000 Google Home units shipped in 2016” via Bloomberg

“65 percent of people who own an Amazon Echo or Google Home can’t imagine going back to the days before they had a smart speaker.” via Geomarketing.com

“42 percent say voice-activated devices have quickly become ‘essential’ to their lives.” via Geomarketing.com

“The Echo Dot was the best-selling product on all of Amazon in the 2018 holiday season” via Techcrunch

“1 in 2 use voice technology on their smartphone, 1 in 3 voice technology users use voice technology daily.” via ComScore

“47% expect their voice technology usage to increase” via ComScore

“The number of households in the US with smart speakers has grown 49% in the last 5 months (Jun-Nov 2017)” via ComScore

“Amazon and Google account for 94% of all smart speakers in use” via Strategy Analytics

“Google Home has roughly a 25 percent share of the US smart speaker market.” via Search Engine Land

“56% of online grocery shoppers use or plan to use voice controlled smart assistant/speaker” via Global Web Index

“52% of people keep their voice activated speaker in their common room (e.g family or living room), 25% in bedroom and 22% in their kitchen” via Think with Google

“72% of people who own a voice-activated speaker say their devices are often used as part of their daily routine.” via Think with Google

“41% say using their voice-activated speaker is like talking to a friend or another person.” via Think with Google

“1 in 4 shoppers used voice assistants in their holiday shopping during the 2017 season.” according to CTA via Hubspot

26. Intent

“Mobile voice-related searches are 3X more likely to be local-based than text” via Search Engine Watch

But “just 13 percent of smart speaker owners use their smart speakers to find a local business” according to an NPR survey via Geomarketing.com

“Home Alone and Elf were the most requested 2016 holiday movies with Alexa.” via Amazon

“Customers use Amazon Echo for many purposes, with one-third using it as an information provider responding to questions and over 40% as an audio speaker for listening to streaming music.” according to CIRP.

“Nearly 50% of people are now using voice search when researching products.” via Social Media Today

“High consumer usage of voice assistants in autos (51%) and household (39%) indicates increased comfort with the technology” – according to Activate via WSJ.

“Google’s AI has been reading nearly 3,000 romance novels in order to improve its conversational search abilities” via Click Hub

“‘Personal assistants’ is the top marketing search of 2016” according to Bing via Econsultancy

“Voice activated speaker owners would like to receive the following from brands; deals, sales and promos (52%), personalised tips and info (48%), events and activity information (42%), business information such as store location (39%) and customer service support (38%).” via Think with Google

27. Reason

“Humans can speak 150 words per minute vs type 40 words per minute” via Katherine Watier

“28% think voice search is a more accurate way of searching” via Katherine Watier

“43% cite that using voice search is quicker than using a website or an app” via Katherine Watier

“42% say that use while driving is a reason for using voice search” via Katherine Watier

“21% don’t like typing on their mobile phone and so turn to voice search” via Katherine Watier and Statista, 2015 (Source: https://www.hubspot.com/marketing-statistics)

“82 percent of Amazon Echo smart speaker owners subscribe to Amazon Prime” via Geomarketing.com

“More than two thirds of current owners of Amazon Echo and Google Home smart speakers are planning to buy another smart speaker within the next six months” according to Strategy Analytics

28. Errors

“outside 35% of normal recognition errors, 31% were noise related and 22% were pronunciation related” according to Research Gate

“Today, speech recognition word error rate is 8 percent.” via Bruce Clay

“Fifteen years ago quality had stalled, with word-error rates of 20-30%. Microsoft’s latest system, which has six neural networks running in parallel, has reached 5.9% (see chart), the same as a human transcriber’s.” via The Economist


Curated from Wikipedia under Fair Use and Creative Commons Usage.

Contact: Digital Squad – Cape Town SEO Agency
