What is SEO?
SEO stands for Search Engine Optimization, a marketing discipline focused on growing visibility in organic (non-paid) search engine results. SEO encompasses both the technical and creative elements required to improve rankings, drive traffic, and increase awareness in search engines. There are many aspects to SEO, from the words on your page to the way other sites link to you on the web. Sometimes SEO is simply a matter of making sure your site is structured in a way that search engines understand.
SEO is not just about building search engine-friendly websites. It is about making your site better for people too; these principles go hand in hand.
This guide is designed to describe all areas of SEO: from finding the terms and phrases (keywords) that generate traffic to your website, to making your site friendly to search engines, to building links and marketing the unique value of your site. If you are confused about these things, you are not alone, and we are here to help.
The Best Search Engines in the World
Google Search is the biggest search engine in the world and one of Google's most popular products. Google holds almost 70 percent of the search engine market. The tech giant is always evolving and looking to improve its search algorithm to provide the best results to the end user. Although Google appears to be the biggest search engine, as of 2015 YouTube was more popular than Google (on desktop computers).
Bing is Microsoft's answer to Google and was launched in 2009. Bing is the default search engine in Microsoft's web browser. Microsoft is always striving to make Bing a better search engine, but it has a long way to go to give Google real competition. Microsoft's search engine provides different services including image, web, and video search along with maps. Bing introduced Places (Google's equivalent is Google My Business), a great platform for businesses to submit their details to optimise their search results.
Yahoo and Bing compete more with each other than with Google. A recent report on netmarketshare.com puts Yahoo's market share at 7.68 percent. Although a leader as a free email provider, Yahoo is declining significantly, particularly following its recent acknowledgement that user details and passwords were hacked last year.
Baidu is the most used search engine in China and was founded in January 2000 by Chinese entrepreneurs Robin Li and Eric Xu. The search engine delivers results for websites, audio files, and images. It also provides other services including maps, news, cloud storage, and much more.
Aol.com is also among the top search engines. These are the guys who used to send out CDs you'd load onto your PC to install their browser and modem software. Once the pre-eminent player, AOL now has a market share of 0.59 percent. Verizon Communications bought AOL for $4.4 billion. The company was started back in 1983 as Control Video Corporation, renamed America Online in 1991, and became AOL Inc. in 2009. AOL is a global mass media company based in New York. The company also provides services such as AOL Advertising, AOL Mail, and AOL Platform.
Founded in 1995, Ask.com was previously known as Ask Jeeves. Its key concept was search results based on a simple question-and-answer web format. It is a question-and-answer community where you can get answers to your questions, and it integrates a large amount of archive data to answer them. Because of this dependency on archived and active user contributions, the results will not be as current as those from Google, Bing, or Yahoo. To counter this, where their own resources don't have the answer, they take help from a third-party search engine. Interestingly, they don't name who this is.
Excite is not widely known, but it is one that still makes the top 10. Excite is an online service portal that provides internet services like email, search, news, instant messaging, and weather updates. It also surfaces the latest trends, topics, and search phrases, such as: What can President Trump actually do?
DuckDuckGo is a popular search engine known for protecting its users' privacy. Unlike Ask.com, it is quite open about where it gets its search results: it has partnered with Yahoo, Bing, and Yummly. DuckDuckGo was founded back in 2008 by Gabriel Weinberg in Pennsylvania, and its revenue comes from the Yahoo-Bing search alliance network and affiliates.
Wolfram Alpha is a computational knowledge engine that does not give a list of documents or web pages as search results. Instead, results are based on facts and data about the query. Its mission statement is to make all systematic knowledge computable and broadly accessible. Launched in 2009, Wolfram Alpha now has a Pro solution with pricing designed for students and educators. Targeted as it is, it's an awesome tool for the right market.
Launched in 1997, Yandex is the most used search engine in Russia. Yandex also has a strong presence in Ukraine, Kazakhstan, Belarus, and Turkey. It provides services like Yandex Maps, Yandex Music, an online translator, Yandex Money, and many others.
Lycos has a good reputation in the search engine industry. Its key areas are email, web hosting, social networking, and entertainment websites.
December 2016 UPDATE: ChaCha ceased trading due to declining advertising revenues
ChaCha.com was a human-guided search engine founded in 2006. You could ask anything in its search box and be answered in real time. It also provided mobile search and marketing services, with mobile apps for iPhone, iPad, and Android.
New to SEO? Need to polish up your knowledge? The Beginner's Guide to SEO has been read over 3 million times and provides the comprehensive information you need to get on the road to professional-quality Search Engine Optimization, or SEO.
Crawling and Indexing & Providing Answers
Search engines have two major functions:
- Crawling and indexing the billions of documents, pages, files, news, videos, and media on the World Wide Web.
- Providing answers to user queries, most frequently through lists of relevant pages that they’ve retrieved and ranked for relevancy.
Imagine the World Wide Web as a network of stops in a big city subway system.
Each stop is a unique document (usually a web page, but sometimes a PDF, JPG, or other file). The search engines need a way to “crawl” the entire city and find all the stops along the way, so they use the best path available—links.
The link structure of the web serves to bind all of the pages together.
Links allow the search engines’ automated robots, called “crawlers” or “spiders,” to reach the many billions of interconnected documents on the web.
Once the engines find these pages, they decipher the code from them and store selected pieces in massive databases, to be recalled later when needed for a search query. To accomplish the monumental task of holding billions of pages that can be accessed in a fraction of a second, the search engine companies have constructed datacenters all over the world.
These monstrous storage facilities hold thousands of machines processing large quantities of information very quickly. When a person performs a search at any of the major engines, they demand results instantaneously; even a one- or two-second delay can cause dissatisfaction, so the engines work hard to provide answers as fast as possible.
Search engines are answer machines. When a person performs an online search, the search engine scours its corpus of billions of documents and does two things: first, it returns only those results that are relevant or useful to the searcher’s query; second, it ranks those results according to the popularity of the websites serving the information. It is both relevance and popularity that the process of SEO is meant to influence.
How do search engines determine relevance and popularity?
To a search engine, relevance means more than finding a page with the right words. In the early days of the web, search engines didn’t go much further than this simplistic step, and search results were of limited value. Over the years, smart engineers have devised better ways to match results to searchers’ queries. Today, hundreds of factors influence relevance, and we’ll discuss the most important of these in this guide.
Popularity and relevance aren’t determined manually. Instead, the engines employ mathematical equations (algorithms) to sort the wheat from the chaff (relevance), and then to rank the wheat in order of quality (popularity).
- Provide alt text for images. Assign images in gif, jpg, or png format “alt attributes” in HTML to give search engines a text description of the visual content.
- Supplement search boxes with navigation and crawlable links.
- Supplement Flash or Java plug-ins with text on the page.
- Provide a transcript for video and audio content if the words and phrases used are meant to be indexed by the engines.
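As a minimal illustration of the first recommendation, an image tag with a descriptive alt attribute might look like this (the filename and alt text here are hypothetical):

```html
<!-- Hypothetical example: the alt attribute gives search engines
     a text description of the visual content -->
<img src="/images/juggling-panda.jpg"
     alt="Panda juggling three red balls at the city zoo" />
```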
Crawlable Link Structures
Just as search engines need to see content in order to list pages in their massive keyword-based indexes, they also need to see links in order to find the content in the first place. A crawlable link structure—one that lets the crawlers browse the pathways of a website—is vital to them finding all of the pages on a website. Hundreds of thousands of sites make the critical mistake of structuring their navigation in ways that search engines cannot access, hindering their ability to get pages listed in the search engines’ indexes.
Below, we’ve illustrated how this problem can happen:
Optimize your content
The title element of a page is meant to be an accurate, concise description of a page’s content. It is critical to both user experience and search engine optimization.
As title tags are such an important part of search engine optimization, the following best practices for title tag creation make for terrific low-hanging SEO fruit. The recommendations below cover the critical steps to optimize title tags for search engines and for usability.
Be mindful of length
Search engines display only the first 65-75 characters of a title tag in the search results (after that, the engines show an ellipsis – “…” – to indicate when a title tag has been cut off). This is also the general limit allowed by most social media sites, so sticking to this limit is generally wise. However, if you’re targeting multiple keywords (or an especially long keyword phrase), and having them in the title tag is essential to ranking, it may be advisable to go longer.
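As a quick sketch, a title's length can be checked programmatically before publishing; the 70-character default below is an assumed midpoint of the 65-75 character range mentioned above, not a fixed rule:

```python
def title_fits_serp(title: str, limit: int = 70) -> bool:
    """Return True if the title is unlikely to be truncated in results.

    The 70-character default is an assumed midpoint of the 65-75
    character range that search engines typically display.
    """
    return len(title) <= limit

# A short title fits; an 80-character title would be cut off
print(title_fits_serp("Beginner's Guide to SEO"))  # True
print(title_fits_serp("x" * 80))                   # False
```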
Place important keywords close to the front
The closer to the start of the title tag your keywords are, the more helpful they’ll be for ranking, and the more likely a user will be to click them in the search results.
We love to end every title tag with a brand name mention, as this helps to increase brand awareness and creates a higher click-through rate among people who like and are familiar with a brand. Sometimes it makes sense to place your brand at the beginning of the title tag instead, such as on your homepage. Since words at the beginning of the title tag carry more weight, be mindful of what you are trying to rank for.
Consider readability and emotional impact
Title tags should be descriptive and readable. The title tag is a new visitor’s first interaction with your brand and should convey the most positive impression possible. Creating a compelling title tag will help grab attention on the search results page, and attract more visitors to your site. This underscores that SEO is about not only optimization and strategic keyword usage, but the entire user experience.
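Putting these recommendations together, a title element might look like the following (the keyword phrase and brand name are hypothetical):

```html
<head>
  <!-- Hypothetical title: primary keyword first, brand name last,
       kept under roughly 70 characters -->
  <title>Trail Running Shoes: How to Choose the Right Pair | ExampleStore</title>
</head>
```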
Meta tags were originally intended as a proxy for information about a website’s content. Several of the basic meta tags are listed below, along with a description of their use.
The Meta Robots tag can be used to control search engine crawler activity (for all of the major engines) on a per-page level. There are several ways to use Meta Robots to control how search engines treat a page:
- index/noindex tells the engines whether the page should be crawled and kept in the engines’ index for retrieval. If you opt to use “noindex,” the page will be excluded from the index. By default, search engines assume they can index all pages, so using the “index” value is generally unnecessary.
- follow/nofollow tells the engines whether links on the page should be crawled. If you elect to employ “nofollow,” the engines will disregard the links on the page for discovery, ranking purposes, or both. By default, all pages are assumed to have the “follow” attribute.
Example: <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
- noarchive is used to restrict search engines from saving a cached copy of the page. By default, the engines will maintain visible copies of all pages they have indexed, accessible to searchers through the cached link in the search results.
- nosnippet informs the engines that they should refrain from displaying a descriptive block of text next to the page’s title and URL in the search results.
- noodp/noydir are specialized tags telling the engines not to grab a descriptive snippet about a page from the Open Directory Project (DMOZ) or the Yahoo! Directory for display in the search results.
The X-Robots-Tag HTTP header directive also accomplishes these same objectives. This technique works especially well for content within non-HTML files, like images.
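For instance, a server response for an image could carry the same directives in its headers (a sketch of a raw HTTP response; the content type is illustrative):

```
HTTP/1.1 200 OK
Content-Type: image/png
X-Robots-Tag: noindex, nofollow
```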
The meta description tag exists as a short description of a page’s content. Search engines do not use the keywords or phrases in this tag for rankings, but meta descriptions are the primary source for the snippet of text displayed beneath a listing in the results.
The meta description tag serves the function of advertising copy, drawing readers to your site from the results. It is an extremely important part of search marketing. Crafting a readable, compelling description using important keywords (notice how Google bolds the searched keywords in the description) can draw a much higher click-through rate of searchers to your page.
Meta descriptions can be any length, but search engines generally truncate snippets longer than 160 characters, so it’s generally wise to stay within this limit.
In the absence of meta descriptions, search engines will create the search snippet from other elements of the page. For pages that target multiple keywords and topics, this is a perfectly valid tactic.
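A meta description is added in the page's head section; for example (the copy here is hypothetical):

```html
<head>
  <!-- Hypothetical meta description, kept under ~160 characters;
       used for the search snippet, not for rankings -->
  <meta name="description"
        content="Learn the basics of SEO: keywords, title tags, crawlable links, and more in this beginner's guide.">
</head>
```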
Make your site interesting and useful
Creating compelling and useful content will likely influence your website more than any of the other factors discussed here. Users know good content when they see it and will likely want to direct other users to it. This could be through blog posts, social media services, email, forums, or other means.
Organic or word-of-mouth buzz is what helps build your site’s reputation with both users and Google, and it rarely comes without quality content.
Keyword usage and targeting are still a part of the search engines’ ranking algorithms, and we can apply some effective techniques for keyword usage to help create pages that are well-optimized. Here at Moz, we engage in a lot of testing and get to see a huge number of search results and shifts based on keyword usage tactics. When working with one of your own sites, this is the process we recommend. Use the keyword phrase:
- In the title tag at least once. Try to keep the keyword phrase as close to the beginning of the title tag as possible. More detail on title tags follows later in this section.
- Once prominently near the top of the page.
- At least two or three times, including variations, in the body copy on the page. Perhaps a few more times if there’s a lot of text content. You may find additional value in using the keyword or variations more than this, but in our experience adding more instances of a term or phrase tends to have little or no impact on rankings.
- At least once in the alt attribute of an image on the page. This not only helps with web search, but also image search, which can occasionally bring valuable traffic.
- Once in the URL. Additional rules for URLs and keywords are discussed later on in this section.
- At least once in the meta description tag. Note that the meta description tag does not get used by the engines for rankings, but rather helps to attract clicks by searchers reading the results page, as the meta description becomes the snippet of text used by the search engines.
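The checklist above can be sketched as a single page targeting a hypothetical keyword phrase, "trail running shoes" (all names and copy below are illustrative):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Keyword near the start of the title tag -->
  <title>Trail Running Shoes: How to Choose the Right Pair | ExampleStore</title>
  <!-- Keyword in the meta description (shapes the snippet, not rankings) -->
  <meta name="description"
        content="Our guide to trail running shoes covers fit, grip, and durability so you can pick the right pair.">
</head>
<body>
  <!-- Keyword prominently near the top of the page -->
  <h1>Trail Running Shoes</h1>
  <p>Choosing trail running shoes starts with the terrain you run on...</p>
  <!-- Keyword (or a variation) in an image alt attribute -->
  <img src="/images/shoes.jpg" alt="Pair of trail running shoes on a rocky path">
  <p>A good pair of shoes for trail running should also drain well...</p>
</body>
</html>
<!-- And once in the URL, e.g. example.com/trail-running-shoes -->
```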
Keywords dominate how we communicate our search intent and interact with the engines. When we enter words to search for, the engine matches pages to retrieve based on the words we entered. The order of the words (“pandas juggling” vs. “juggling pandas”), spelling, punctuation, and capitalization provide additional information that the engines use to help retrieve the right pages and rank them.
Search engines measure how keywords are used on pages to help determine the relevance of a particular document to a query. One of the best ways to optimize a page’s rankings is to ensure that the keywords you want to rank for are prominently used in titles, text, and metadata.
Generally speaking, as you make your keywords more specific, you narrow the competition for search results, and improve your chances of achieving a higher ranking. The map graphic to the left compares the relevance of the broad term “books” to the specific title Tale of Two Cities. Notice that while there are a lot of results for the broad term, there are considerably fewer results (and thus, less competition) for the specific result.
How To Choose Keywords For Your Website
Keywords are the essence of search engine optimization. Think of them as ID Cards that tell the searchers and the search engines what your web page is about.
How to Judge the Value of a Keyword
Is the keyword relevant to your website’s content? Will searchers find what they are looking for on your site when they search using these keywords? Will they be happy with what they find? Will this traffic result in financial rewards or other organizational goals? If the answer to all of these questions is a clear “Yes!” then proceed …
Search for the term/phrase in the major engines
Understanding which websites already rank for your keyword gives you valuable insight into the competition, and also how hard it will be to rank for the given term. Are there search advertisements running along the top and right-hand side of the organic results? Typically, many search ads mean a high-value keyword, and multiple search ads above the organic results often mean a highly lucrative and directly conversion-prone keyword.
Buy a sample campaign for the keyword at Google AdWords and/or Bing Adcenter
If your website doesn’t rank for the keyword, you can nonetheless buy test traffic to see how well it converts. In Google Adwords, choose “exact match” and point the traffic to the relevant page on your website. Track impressions and conversion rate over the course of at least 200-300 clicks.
Using the data you’ve collected, determine the exact value of each keyword
For example, assume your search ad generated 5,000 impressions in one day, of which 100 visitors have come to your site, and three have converted for a total profit (not revenue!) of $300. In this case, a single visitor for that keyword is worth $3 to your business. Those 5,000 impressions in 24 hours could generate a click-through rate of between 18-36% with a #1 ranking (see the Slingshot SEO study for more on potential click-through rates), which would mean 900-1800 visits per day, at $3 each, or between 1 and 2 million dollars per year. No wonder businesses love search marketing!
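The arithmetic above can be sketched in a few lines, using the same hypothetical numbers from the example:

```python
# Hypothetical figures from the worked example above
impressions = 5_000   # daily ad impressions for the keyword
visitors = 100        # clicks that reached the site
profit = 300.0        # total profit (not revenue!) from 3 conversions

value_per_visitor = profit / visitors           # $3.00 per visitor

# Assumed click-through range for a #1 organic ranking (18-36%)
low_ctr, high_ctr = 0.18, 0.36
low_daily = impressions * low_ctr * value_per_visitor    # ~$2,700/day
high_daily = impressions * high_ctr * value_per_visitor  # ~$5,400/day

print(value_per_visitor)     # 3.0
print(low_daily * 365)       # ~985,500 per year (about $1M)
print(high_daily * 365)      # ~1,971,000 per year (about $2M)
```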
Search Engine Protocols
Think of a sitemap as a list of files that give hints to the search engines on how they can crawl your website. Sitemaps help search engines find and classify content on your site that they may not have found on their own. Sitemaps also come in a variety of formats and can highlight many different types of content, including video, images, news, and mobile.
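For example, a minimal XML sitemap listing a single (hypothetical) URL follows the sitemaps.org protocol:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/trail-running-shoes</loc>
    <lastmod>2017-01-15</lastmod>
  </url>
</urlset>
```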
The robots.txt file, a product of the Robots Exclusion Protocol, is a file stored on a website’s root directory (e.g., www.google.com/robots.txt). The robots.txt file gives instructions to automated web crawlers visiting your site, including search crawlers.
By using robots.txt, webmasters can indicate to search engines which areas of a site they would like to disallow bots from crawling, as well as indicate the locations of sitemap files and crawl-delay parameters. You can read more details about this at the robots.txt Knowledge Center page.
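A simple robots.txt might disallow one directory and point crawlers at the sitemap file (the paths here are hypothetical):

```
User-agent: *
Disallow: /private/
Crawl-delay: 10
Sitemap: https://www.example.com/sitemap.xml
```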
Best Search Engine Tools
Google Search Console
Bing Webmaster Tools
Getting Penalties Lifted
The task of requesting reconsideration or re-inclusion in the engines is painful and often unsuccessful. It’s also rarely accompanied by any feedback to let you know what happened or why. However, it is important to know what to do in the event of a penalty or banning.
- If you haven’t already, register your site with the engine’s Webmaster Tools service (Google’s and Bing’s). This registration creates an additional layer of trust and connection between your site and the search engine teams.
- Make sure to thoroughly review the data in your Webmaster Tools accounts, from broken pages to server or crawl errors to warnings or spam alert messages. Very often, what’s initially perceived as a mistaken spam penalty is, in fact, related to accessibility issues.
- Send your reconsideration/re-inclusion request through the engine’s Webmaster Tools service rather than the public form; again, this creates a greater trust layer and a better chance of hearing back.
- Full disclosure is critical to getting consideration. If you’ve been spamming, own up to everything you’ve done—links you’ve acquired, how you got them, who sold them to you, etc. The engines, particularly Google, want the details so they can improve their algorithms. Hold back, and they’re likely to view you as dishonest, corrupt, or simply incorrigible (and they probably won’t respond).