Sunday 12 March 2017

History of Search Engine Optimization

Webmasters and content providers began optimizing sites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed only to submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed. The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
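To make the spider/indexer/scheduler pipeline concrete, here is a minimal Python sketch of one crawl-and-index cycle. The seed URL, the naive tokenizer, and the in-memory structures are illustrative assumptions, not any real engine's design.

```python
# A minimal sketch of the crawl-and-index cycle described above.
import re
import urllib.request
from collections import defaultdict

def crawl(url):
    """Download a page (the 'spider' step) and return its raw HTML."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

def index(url, html, inverted_index, frontier):
    """Extract words and links from a page (the 'indexer' step)."""
    # Record each word and its position so later queries can weight terms.
    for pos, word in enumerate(re.findall(r"[a-z0-9]+", html.lower())):
        inverted_index[word].append((url, pos))
    # Queue every outgoing link for crawling at a later date (the scheduler).
    frontier.extend(re.findall(r'href="(https?://[^"]+)"', html))

inverted_index = defaultdict(list)
frontier = ["https://example.com/"]  # hypothetical seed URL

url = frontier.pop(0)
index(url, crawl(url), inverted_index, frontier)
```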

Site owners recognized the value of a high ranking and visibility in search engine results, creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits Bruce Clay as one of the first people to popularize the term. On May 2, 2007, Jason Gambert attempted to trademark the term SEO by convincing the Trademark Office in Arizona that SEO is a "process" involving manipulation of keywords, and not a "marketing service".
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages proved less than reliable, however, because the webmaster's choice of keywords in the meta tag could be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches. Web content providers also manipulated a number of attributes within the HTML source of a page in an attempt to rank well in search engines.
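As a rough illustration of how an engine might have read the keyword meta tag, here is a small Python sketch using the standard-library HTML parser; the sample page and the comma-splitting convention are assumptions for illustration.

```python
# A minimal sketch of reading the keywords meta tag; the sample HTML is
# invented, and real engine parsers were considerably more involved.
from html.parser import HTMLParser

class MetaKeywordParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # The webmaster, not the page body, supplies these terms, which
        # is exactly why they were easy to stuff with irrelevant words.
        if tag == "meta" and (attrs.get("name") or "").lower() == "keywords":
            self.keywords = [k.strip() for k in (attrs.get("content") or "").split(",")]

page = '<html><head><meta name="keywords" content="cars, cheap cars, free"></head></html>'
parser = MetaKeywordParser()
parser.feed(page)
print(parser.keywords)  # ['cars', 'cheap cars', 'free']
```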

By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms in an effort to prevent webmasters from manipulating rankings.
By relying so heavily on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density toward a more holistic process for scoring semantic signals. Since the success and popularity of a search engine is determined by its ability to produce the most relevant results for any given search, poor-quality or irrelevant results could drive users to other search sources. Search engines responded by developing more complex ranking algorithms that took into account additional factors that were more difficult for webmasters to manipulate.
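A toy density score shows why this factor was so easy to game: since the author controls every word on the page, repeating a term inflates the score directly. The sample strings below are invented for illustration.

```python
# A toy keyword-density score of the kind early engines leaned on.
def keyword_density(text, term):
    words = text.lower().split()
    return words.count(term.lower()) / len(words) if words else 0.0

honest = "affordable used cars for sale in town"
stuffed = "cars cars cars cars cheap cars buy cars cars"
print(keyword_density(honest, "cars"))   # ~0.14
print(keyword_density(stuffed, "cars"))  # ~0.78
```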

In 2005, AIRWeb, the annual Adversarial Information Retrieval on the Web conference, was created to bring together practitioners and researchers concerned with search engine optimization and related topics.

Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients. Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban. Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.
Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, chats, and seminars. Major search engines provide information and guidelines to help with site optimization. Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website, and it also provides data on Google traffic to the website. Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the crawl rate, and lets them track each page's index status.
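As a hedged illustration, the following Python sketch generates the sort of minimal XML sitemap those programs accept. The URLs are placeholders, and real sitemaps support additional tags such as <lastmod> and <changefreq>.

```python
# A minimal sketch of generating a sitemap.xml for submission to a
# search engine's webmaster tools; the URLs below are placeholders.
import xml.etree.ElementTree as ET

urls = ["https://example.com/", "https://example.com/about"]  # hypothetical pages

# The sitemaps.org namespace is the standard one the major engines read.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in urls:
    loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
    loc.text = url

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```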
