The complicated algorithms of search engines may appear impenetrable at first glance, and the engines themselves provide little insight into how to achieve better results or garner more traffic. The guidance on optimization and best practices that the engines do provide is listed below:
Googlers recommend the following to get better rankings in their search engine:
Bing engineers at Microsoft recommend the following to get better rankings in their search engine:
These statements are important to keep in mind as you optimize your site.
Clean navigation is essential for spiders to be able to effectively crawl an entire website. The following best practices focus on various domain issues that can hinder a spider from crawling a webpage. A high-quality, well-built URL is as compact as possible while still incorporating targeted keywords that tell users and search engines what each page is about.
Pages closest to the root directory are the most important pages on the site, as they are the first pages spiders see. These pages should be optimized carefully and effectively, in order to rank well for relevant terms. For example –
Search engines consider http://www.mystore.com/ the most important page.
Next would be http://www.mystore.com/category/, and so on down the directory structure.
When using separators in your URLs, make sure to use dashes (-) rather than underscores (_), as search engines do not recognize underscores as spaces. Also limit the number of words you use: if you can accurately describe the page with a single keyword, that is best.
Some search engines consider sub-domains to be independent sites from the parent domain. Because search engine algorithms assign a measure of authority to all pages within a domain, the issue is that one highly-authoritative domain may not pass on its authority to a sub-domain of that same site. For example, Google considers a site shop.mystore.com separately from www.mystore.com in its measurement of value or authority for the site. So, although www.mystore.com and its subpages may rank well and have a high perception of authority for its themes, this may not be shared with shop.mystore.com – even though the information on shop.mystore.com is related to the same brand and its themes. To set up a sub-domain for your store click the button below.
|Do this for my Store|
Just as search engines need to see content in order to list pages in their massive keyword-based indexes, they also need to see links in order to find the content. A crawlable link structure - one that lets their spiders browse the pathways of a website - is vital in order to find all of the pages on a website. Hundreds of thousands of sites make the critical mistake of structuring their navigation in ways that search engines cannot access, thus impacting their ability to get pages listed in the search engines' indexes. Below, we've illustrated how this problem can happen:
In the example above, Google's spider has reached page "A" and sees links to pages "B" and "E". However, even though C and D might be important pages on the site, the spider has no way to reach them (or even know they exist.) This is because no direct, crawlable links point to those pages. As far as Google is concerned, they might as well not exist - great content, good keyword targeting, and smart marketing won't make any difference at all if the spiders can't reach those pages in the first place.
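To make the difference concrete, here is a simplified sketch (the URLs are hypothetical): a plain HTML anchor gives a spider a path it can parse and follow, while a destination that exists only inside a script is effectively invisible to a crawler reading the HTML.

```html
<!-- Crawlable: the spider can parse this anchor and follow it -->
<a href="http://www.mystore.com/page-c">Page C</a>

<!-- Not reliably crawlable: the destination lives only in script,
     so a spider reading the HTML finds no link pointing to this page -->
<span onclick="window.location='/page-d'">Page D</span>
```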
When a website with new and old content wants visitors to only see the new content, redirects must be used. Each type of redirect has a different purpose, and it is important to use the proper redirect for each situation. In the past, websites have tried to use redirects to artificially boost rankings and traffic. To combat this, search engine spiders have become very critical of redirects.
1. Search engines (such as Google) have most likely already crawled and indexed your site. If a user then finds your site organically in the search results, it would be a poor user experience if the link led to a 404 page.
2. Search engines may recrawl your site via the old URLs and, if they stumble upon a 404 page, will most likely drop you from the results if they can't see the association to the new URL. This, again, comes down to user experience: search engines place high importance on ensuring users find what they're looking for.
3. You will lose link juice from external sites, because the trust and authority built up by the old URL does not flow to the new URL. Loss of link juice means your site will lose authority and trust: two important factors in SEO.
4. If you do not update your internal links to point to the new URLs, you will also have many broken links, which will negatively affect your internal PageRank flow.
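For these reasons, a permanent (301) redirect is the usual fix when a URL changes: it tells both browsers and spiders that the move is permanent, so engines can carry the old URL's association over to the new one. As one sketch (assuming an Apache server; the paths are illustrative), a rule in an .htaccess file could map the old URL to the new one:

```apache
# Permanently redirect the old URL to the new one (mod_alias)
Redirect 301 /old-page http://www.mystore.com/new-page
```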
Canonicalization is the practice of organizing your content in such a way that every unique piece has one and only one URL.
How Does it Operate?
The tag is part of the HTML header on a web page, the same section where you'd find the title tag and meta description tag. In fact, this tag isn't new, but like nofollow, it simply uses a new rel parameter. For example:
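Placed inside a duplicate page's head section, a canonical tag pointing at the preferred URL looks like this (using the www.mystore.com/page URL from this guide):

```html
<link rel="canonical" href="http://www.mystore.com/page" />
```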
This would tell Yahoo!, Live & Google that the page in question should be treated as though it were a copy of the URL www.mystore.com/page and that all of the link & content metrics the engines apply should technically flow back to that URL.
The Canonical URL tag attribute is similar in many ways to a 301 redirect from an SEO perspective. In essence, you're telling the engines that multiple pages should be considered as one (which a 301 does), without actually redirecting visitors to the new URL (often saving your dev staff considerable heartache). There are some differences, though:
Sitemaps are essentially lists of all the pages on a website, helping search engines and users move through the site. The two kinds of sitemaps usually implemented are XML sitemaps and HTML sitemaps. XML sitemaps are primarily meant for search engines as a reference, whereas an HTML sitemap is visible to the user and is also useful for navigation. Spiders can easily access sitemaps in order to index pages.
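As a sketch, a minimal XML sitemap following the sitemaps.org protocol lists each page in its own url entry (the URLs and values here are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.mystore.com/</loc>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.mystore.com/category/</loc>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```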
Your Bigcommerce store dynamically creates your sitemap as your store changes. To learn how to submit your sitemap to Google, click the "Do this for my Store" button.
|Do this for my Store|
Speeding up websites is important, not just to site owners but to all Internet users. Faster sites create happy users, which is what Google's main guideline is all about. Another important reason to speed up your website is search engine ranking. This makes sense because the purpose of a search engine is not just to provide relevant sources of the information a user desires, but to provide them as fast as possible. Thus if Site A and Site B provide roughly the same information, and Site B provides it three times as fast, the searcher benefits by visiting Site B.
To help you achieve this we have implemented a CDN (Content Delivery Network). This makes it faster for your shoppers to get content from a place that’s closer to them. The biggest speed gains will be experienced by shoppers who are closer to one of our CDN nodes, some of which are located in London, Los Angeles, New York, Sydney, Tokyo, Sao Paulo, Hong Kong and Singapore.
Speed = More Sales: Site abandonment increases with page load times, resulting in lost sales. It's well studied that faster page load times increase sales, so faster pages also mean better conversion rates.
Keyword targeting is about understanding searchers' intent and how they interact with the engines. For example, a common search query pattern might go something like this:
When a search is performed, the engine matches pages to retrieve based on the words entered into the search box. Other data, such as the order of the words ("tanks shooting" vs. "shooting tanks"), spelling, punctuation, and capitalization of those keywords provide additional information that the engines use to help retrieve the right pages and rank them.
To help accomplish this, search engines measure the ways keywords are used on pages to help determine the "relevance" of a particular document to a query. One of the best ways to "optimize" a page's rankings is to ensure that keywords are prominently used in titles, text, and meta data.
Generally, the more specific your keywords, the better your chances of ranking, because the competition is lower.
Since the dawn of online search, folks have abused keywords in a misguided effort to manipulate the engines. This involves "stuffing" keywords into text, the URL, meta tags, and links. Unfortunately, this tactic almost always does more harm than good to your site.
The best practice is to use your keywords naturally and strategically (more on this below.) If your page targets the keyword phrase "Eiffel Tower" then you might naturally include content about the Eiffel Tower itself, the history of the tower, or even recommended Paris hotels. On the other hand, if you simply sprinkle the words "Eiffel Tower" onto a page with irrelevant content, such as a page about dog breeding, then your efforts to rank for "Eiffel Tower" will be a long, uphill battle.
What should optimal keyword usage look like, then? An optimal page for the phrase "running shoes" would look something like this:
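A rough sketch of such a page's markup (the copy, brand, and filenames are illustrative) places the phrase in the title, the main heading, the body copy, and image alt text:

```html
<head>
  <title>Running Shoes | MyStore</title>
  <meta name="description" content="Shop our selection of running shoes for road and trail." />
</head>
<body>
  <h1>Running Shoes</h1>
  <p>Body copy that uses the phrase "running shoes" naturally a few times,
     alongside the related terms a reader would expect.</p>
  <img src="running-shoes.jpg" alt="Running shoes" />
</body>
```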
You can read more information about On-Page Optimization at this post.
Selecting The Keywords!
Working out which keywords to target can be a long process, especially when you are working with e-commerce. Some ways to speed up the process include:
Looking at what keywords competitors are targeting.
Gathering a list of products and categories.
Looking at the current site structure.
Taking ideas from the Google Keyword Tool.
Using other keyword research tools such as Wordtracker.
Sorting The List In Google Keyword Tool!
OK, so you have worked out the keywords you wish to target. The next step is to head over to the Google Keyword Tool.
If you sold shoes it would be great to rank #1 for the keyword "shoes" - or would it?
It's wonderful to deal with keywords that have 5,000 searches a day, or even 500 searches a day, but in reality, these "popular" search terms actually make up less than 30% of the searches performed on the web. The remaining 70% lie in what's called the "long tail" of search. The long tail contains hundreds of millions of unique searches that might be conducted a few times in any given day, but, when taken together, they comprise the majority of the world's demand for information through search engines.
Another lesson search marketers have learned is that long tail keywords often convert better, because they catch people later in the buying/conversion cycle. A person searching for "shoes" is probably browsing, and not ready to buy. On the other hand, someone searching for "best price on Air Jordan size 12" practically has their wallet out!
Understanding the search demand curve is critical. Below we've included a sample keyword demand curve, illustrating the small number of queries sending larger amounts of traffic alongside the volume of less-searched terms and phrases that bring the bulk of our search traffic.
Primary Keyword - Secondary Keyword | Brand Name
Brand Name | Primary Keyword and Secondary Keyword
Make page titles 60-70 characters or less, as this is the limit Google displays in search results.
The title element of a web page is meant to be an accurate and concise description of a page's content. This element creates value in three specific areas (covered below) and is critical to both user experience and search engine optimization:
Creating a descriptive, keyword-laden title tag is important for increasing rankings in search engines.
As title tags are such an important part of search engine optimization, implementing best practices for title tags makes for terrific low-effort, high-impact SEO tasks. The recommendations below cover the critical parts of optimizing title tags for search engine and usability goals.
Many SEO firms recommend using the brand name at the end of a title tag instead, and there are times when this can be a better approach. The differentiating factor is the strength and awareness of the brand in the target market. If it is a well known brand, and it can make a difference in click-through rates in search results, the brand name should be first. If this is not the case, the keyword should be first.
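As a sketch (the keywords and brand name are placeholders), the two orderings look like this in markup:

```html
<!-- Lesser-known brand: lead with the keyword -->
<title>Running Shoes - Trail Running Shoes | MyStore</title>

<!-- Well-known brand: lead with the brand to lift click-through -->
<title>MyStore | Running Shoes and Trail Running Shoes</title>
```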
The meta description tag exists as a short description of a page's content. Search engines do not use the keywords or phrases in this tag for rankings, but meta descriptions are the primary source for the snippet of text displayed beneath a listing in the results.
The meta description tag serves the function of advertising copy, drawing readers to your site from the results, and is thus an extremely important part of search marketing. Crafting a readable, compelling description using important keywords (notice how Google "bolds" the searched keywords in the description) can draw a much higher click-through rate of searchers to your page. Meta descriptions can be any length, but search engines generally truncate snippets longer than 160 characters, so it's wise to stay within these limits.
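A sketch of a compelling meta description, kept under the roughly 160-character snippet limit (the copy and brand are illustrative):

```html
<meta name="description" content="Shop running shoes from top brands at MyStore. Free shipping, easy returns, and a fit guide to help you choose." />
```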
In the absence of meta descriptions, search engines will create the search snippet from other elements of the page. For pages that target multiple keywords and topics, this is a perfectly valid tactic.
The meta keywords tag had value at one time, but is no longer valuable or important to search engine optimization. For more on the history and a full account of why meta keywords has fallen into disuse, read Meta Keywords Tag 101 from SearchEngineLand.
Content is the most important on-page factor that search engines look for when evaluating the value of a website. Websites with a lot of informative, unique content are always valued highly by search engines. It is also very important to add fresh content to the website periodically. The content of a webpage should ideally include at least 2-3 instances of a keyword relevant to that page. Writing great content for your audience is important: the goal is to create content that others can't help but link to. This is called "link bait".
Link bait is content on your site to which other sites link because they want to, not because you ask them to. Traditionally, links are hard to get. But with link bait, you “bait” your content and sit back and wait. Of course, you can be a little proactive …
Great content always serves as link bait. Breaking news often falls in that category, but so does an amazing ebook. A “How-to Guide” is another example.
Manners may buy you links. If you remember to thank your partners and competitors (or cite them), they will probably do the same for you, when the time arises.
Link bait could be a great gadget. All kinds of companies create calculators for specific purposes that become link bait. There are far too many mortgage calculators and “how much do you need to retire?” calculators, but how about a calculator that figures out what kind of reusable insulation you need in your steam room, based on pipe size? (Now that’s one of a kind, and is great bait.)
Link bait could be a widget. Widgets create a link from the site that uses it back to the site that created it – and there’s the link bait again.
Pictures are link bait, too. How about pictures on your blog of the latest industry event? You might even get links from your competitors, who want to show off their faces….
So, how does link bait help you?