SEO Learning Notes By Piyush Kumar
-----------------------------------------------------------------------------------------------------------------------
Introduction to SEO and Search Engines
Search
Engine Optimization (SEO) is often considered the more technical part of Web
marketing. This is true because SEO does help in the promotion of sites and at
the same time it requires some technical knowledge – at least familiarity with
basic HTML. SEO is sometimes also called SEO copywriting because most of the
techniques that are used to promote sites in search engines deal with text.
Generally, SEO can be defined as the activity of optimizing Web pages or whole
sites in order to make them more search engine-friendly, thus getting higher
positions in search results.
One of
the basic truths in SEO is that even if you do all the things that are
necessary to do, this does not automatically guarantee you top ratings but if
you neglect basic rules, this certainly will not go unnoticed. Also, if you set
realistic goals – i.e. to get into the top 30 results in Google for a particular
keyword, rather than be the number one for 10 keywords in 5 search engines, you
will feel happier and more satisfied with your results.
Although
SEO helps to increase the traffic to one's site, SEO is not advertising. Of
course, you can be included in paid search results for given keywords but
basically the idea behind the SEO techniques is to get top placement because
your site is relevant to a particular search term, not because you pay.
SEO can
be a 30-minute job or a permanent activity. Sometimes it is enough to do some
generic SEO in order to get high in search engines – for instance, if you are a
leader for rare keywords, then you do not have a lot to do in order to get
decent placement. But in most cases, if you really want to be at the top, you
need to pay special attention to SEO and devote significant amounts of time and
effort to it. Even if you plan to do some basic SEO, it is essential that you
understand how search engines work and which items are most important in SEO.
1. How Search Engines Work
The first
basic truth you need to learn about SEO is that search engines are not humans.
While this might be obvious for everybody, the differences between how humans
and search engines view web pages aren't. Unlike humans, search engines are text-driven.
Although technology advances rapidly, search engines are far from intelligent
creatures that can feel the beauty of a cool design or enjoy the sounds and
movement in movies. Instead, search engines crawl the Web, looking at
particular site items (mainly text) to get an idea what a site is about. This
brief explanation is not the most precise because as we will see next, search
engines perform several activities in order to deliver search results –
crawling, indexing, processing, calculating relevancy, and retrieving.
First,
search engines crawl the Web to see what is there. This task is performed by a piece of software called a crawler or a spider (or Googlebot, as is the case
with Google). Spiders follow links from one page to another and index
everything they find on their way. Having in mind the number of pages on the
Web (over 20 billion), it is impossible for a spider to visit a site daily just
to see if a new page has appeared or if an existing page has been modified.
Sometimes crawlers will not visit your site for a month or two, so during this
time your SEO efforts will not be rewarded. But there is nothing you can do
about it, so just keep quiet.
What you
can do is to check what a crawler sees from your site. As already mentioned,
crawlers are not humans and they do not see images, Flash movies, JavaScript,
frames, password-protected pages and directories, so if you have tons of these
on your site, you'd better run a spider simulator tool to see if these
goodies are viewable by the spider. If they are not viewable, they will not be
spidered, not indexed, not processed, etc. - in a word they will be
non-existent for search engines.
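To make this concrete, here is a minimal, hypothetical HTML fragment (the file names and wording are invented for illustration). In the first version the message exists only inside an image, so a text-driven spider sees nothing; in the second version the same message is exposed as real text plus a descriptive alt attribute that can be crawled and indexed:
<!-- Invisible to a text-driven crawler: the words live only inside the image -->
<img src="welcome-banner.png">
<!-- Crawler-friendly: the same message as real text plus descriptive alt text -->
<h1>Handmade Leather Bags - Free UK Delivery</h1>
<img src="welcome-banner.png" alt="Handmade leather bags with free UK delivery">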
After a
page is crawled, the next step is to index its content. The indexed page is
stored in a giant database, from where it can later be retrieved. Essentially,
the process of indexing is identifying the words and expressions that best
describe the page and assigning the page to particular keywords. For a human it
will not be possible to process such amounts of information but generally
search engines deal just fine with this task. Sometimes they might not get the
meaning of a page right but if you help them by optimizing it, it will be
easier for them to classify your pages correctly and for you – to get higher
rankings.
When a
search request comes, the search engine processes it – i.e. it compares the
search string in the search request with the indexed pages in the database.
Since it is likely that more than one page (practically it is millions of pages)
contains the search string, the search engine starts calculating the relevancy
of each of the pages in its index to the search string.
There are
various algorithms to calculate relevancy. Each of these algorithms has
different relative weights for common factors like keyword density, links, or
metatags. That is why different search engines give different search results
pages for the same search string. What is more, it is a known fact that all
major search engines, like Yahoo!, Google, MSN, etc. periodically change their
algorithms and if you want to keep at the top, you also need to adapt your
pages to the latest changes. This is one reason (the other is your competitors)
to devote permanent efforts to SEO, if you'd like to be at the top.
The last
step in search engines' activity is retrieving the results. Basically, it is
nothing more than simply displaying them in the browser – i.e. the endless
pages of search results that are sorted from the most relevant to the least
relevant sites.
2. Differences Between the Major Search Engines
Although
the basic principle of operation of all search engines is the same, the minor
differences between them lead to major changes in results relevancy. For
different search engines different factors are important. There were times,
when SEO experts joked that the algorithms of Bing are intentionally made just
the opposite of those of Google. While this might have a grain of truth, it is
a matter of fact that the major search engines like different stuff, and if you plan
to conquer more than one of them, you need to optimize carefully.
There are
many examples of the differences between search engines. For instance, for
Yahoo! and Bing, on-page keyword factors are of primary importance, while for
Google links are very, very important. Also, for Google sites are like wine –
the older, the better, while Yahoo! generally has no expressed preference
towards sites and domains with tradition (i.e. older ones). Thus you might need more time for your site to mature before it is admitted to the top in Google than in Yahoo!.
How Many Types of SEO?
Mainly, there are two types of SEO:
* White Hat SEO
* Black Hat SEO
White Hat SEO
White hat
SEO, as the name suggests, is clean and wholesome, and the type of search
engine optimization service most businesses would want for their website. To
put it in more accessible terms, white hat SEO is to search engine marketing
what organically grown food is to a healthy diet. It is not only wholesome and
ethical, but is also sustainable. Of course, developing organic and sustainable
rankings (just like organic food) requires a lot of time and care. Naturally,
this is reflected in the cost of practicing white hat SEO, whether you do it
yourself or hire a professional to do it for you. The good news is, however,
that the initial higher cost of developing a sustainable SEO strategy, and
implementation thereof, translates into cost savings in the long term.
White hat
SEO tactics, techniques, and strategies are those which adhere to guidelines
set by the search engines, and involve no deception. White hat search engine
optimization merely seeks to provide the most search engine friendly
presentation of useful content which is inherently valuable and specifically
designed for human consumption.
Another
important characteristic of white hat SEO is that it cannot generate great
results for poor quality content.
Black Hat SEO
Black hat
SEO, as you may have already guessed (assuming you read the part about white
hat SEO) is the evil brother. It is slick, talks a fast game, and can get you
on the top ten lists for a while, but your website, and ultimately you, may end
up paying a very high price for letting the evil brother be your guide. Going back to the food reference, you can think of black hat SEO as the fat-infused fast food, or the sugary treat full of high fructose corn syrup--it tastes so
good and makes you crave more of it. Unfortunately, the goodness comes at a
cost, which can be a debilitating and even life-threatening illness in the case
of your body, and a penalized or banned website in the case of, well, your
website.
Black hat
SEO, unlike its wholesome kin, uses tricks, schemes, and games to circumvent
the algorithmic barriers set up by the search engines to prevent bad content
from gaining high rankings in the search engine result pages (SERPs).
Black hat
SEO is not to be mistaken for plain bad search engine optimization which is the
result of either lack of knowledge or cutting corners.
One New Addition is called Grey Hat SEO
Grey Hat SEO is mid-way between the two approaches above and is all about the balance between risk and reward. A wide range of SEO services fall under this category. Some Grey Hat SEO services may use more dubious strategies and take even bigger risks to produce fast, high search engine rankings. While many Grey Hat SEO methods obey search engine guidelines, others might put you at risk. If you opt for Grey Hat SEO services, be sure about what exactly you are subjecting your online site to.
On-Page Optimization
On-page
optimization (on-page SEO) is what can be done on the pages of a website to
maximize its performance in the search engines for target keywords related to
the on-page content.
On-Page SEO Checklist
* Always start with keyword selection, research and testing
* Meta Description tag
* ALT tags
* H1 tags
* URL structure
* Internal linking strategy
* Content
* Keyword density
* Site maps, both XML and user facing
* Usability and accessibility
* Track target keywords
Avoid
common on-page SEO mistakes such as:
* Duplicate content
* URL variants of the same pages
* Off-site images and content on-site
* Duplicate title tags
Do not
use on-page SEO spamming tactics such as:
* Hidden text
* Hidden links
* Keyword repetition
* Doorway pages
* Mirror pages
* Cloaking
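Pulling several items from the checklist above into one place, the fragment below is a minimal sketch of what an optimized page might look like; the keyword, URLs and file names are purely illustrative (the phrase "portable generators" is borrowed from the keyword research example later in these notes):
<html>
<head>
<!-- Title and meta description carry the target keyword -->
<title>Portable Generators - Buying Guide and Reviews</title>
<meta name="description" content="Compare portable generators by power output, noise level and price.">
</head>
<body>
<!-- A single H1 containing the main phrase -->
<h1>Choosing a Portable Generator</h1>
<!-- Descriptive ALT text on images -->
<img src="generator-2kw.jpg" alt="2kW portable generator">
<!-- Keyword-friendly URL structure and internal linking -->
<p>See our <a href="/portable-generators/quiet-models.html">quiet portable generators</a> section.</p>
</body>
</html>
Note that this sketch deliberately avoids the spamming tactics listed above: the keyword appears a handful of times in visible, useful places rather than being hidden or endlessly repeated.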
Off-Page Optimization (SEO)
Defined: Off-page optimization (off-page SEO) is what can be done off the pages of a website to maximize its performance in the search engines for target keywords related to the on-page content and keywords in off-page direct-links.
Off-Page SEO Checklist
* Always start with keyword research, testing and selection
* Use Keywords in link anchor text
* Obtain links from high ranking publisher sites
* One-way inbound links (not link exchange or reciprocal links)
* Different keywords in your link-ads from the same site
* Gradual link building technology (no growth spikes)
* Use relevant keywords near your inbound link (contextual relevance)
* Deep linking (from multiple pages to multiple pages)
* Target a large list of keywords (5-500+)
* Link from sites with a variety of LinkRanks
* Actively track all keywords and refine strategy as required
* Discontinue campaigns if ranking does not improve
* Expect results in 1-2 months (Bing), 1-9 months (Google, Yahoo)
Avoid
common off-page SEO mistakes:
* Duplicate keywords in link adverts
* Site-wide links causing link growth spikes
* Using on-page SEOs to do the work of specialist off-page SEOs
* Placing random links without keywords near your link adverts
Do not
use off-page SEO spamming tactics such as:
* Link farms (sites with 100+ outbound links per page)
* Using irrelevant keywords in your link-ads
* Garbage links
* Link churning
* Hidden inbound links
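As a small illustration of the anchor text and contextual relevance points in the checklist above, an inbound link placed on a relevant third-party page might look like the following (the site, URL and keyword are invented for illustration):
<!-- Keyword-rich anchor text surrounded by topically related copy -->
<p>For advice on backup power during storm season, this
<a href="http://www.example.com/portable-generators/">portable generator buying guide</a>
compares run times, noise levels and prices.</p>
The anchor text tells the search engine what the target page is about, and the surrounding sentence supplies the contextual relevance the checklist asks for.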
Off-Page Optimization vs. On-Page SEO (Optimization)
You may have made sure that you picked one of the best on-page optimization agencies in the industry, got references and asked for case studies; then, after risking it all with a 12-month contract, you invested all your expectations and a large part of your internet marketing budget with the on-page Search Engine Optimization (SEO) agency.
Months
later with your patience worn through and no significant results delivered, you
must be frustrated with the lack of returns on your investment and possibly
even ready to give up on SEO altogether.
We have
been very surprised to see how many of the top SEO agencies in the UK have been
very slow to change their optimization strategies, now that on-page optimization has mostly lost its effect.
Is there any need for On-Page Optimization?
There are
some basic optimization issues that are critical to have in place and then
there are more technical / advanced techniques that can improve your search
engine rankings. You should not pay for basic SEO advice and you do not need to
pay much for advanced optimization advice.
We
disclose up to date SEO advice and tips to most of our clients at no charge.
After
having the basics in place, creating masses of useful content is the main
on-page optimization strategy that a webmaster should focus on, but without
off-page SEO you will not see your website’s current ranking increase
significantly.
More on off-page optimization…
Off page
optimization or off-page SEO is basically controlling how the internet portrays
your website.
A professional off-page SEO will be able to employ their own resources to control how search engines view your website and thereby control your ranking. Most off-page SEO techniques, done well, will result in very high ROI and high rankings in MSN, Google and Yahoo!. Such resources include:
* One-way links from their link publishing partners
* Gradual link building technology
* Your business partners and their link publishing resources
* General internet resources: powerful free directories, one-way link brokering, etc.
* Online PR campaigns
* News articles
Directory
Submissions: Submitting
your site URL to the relevant categories of popular directories like DMOZ, Best
of the Web, etc. can help you to get valuable backlinks.
Article
Submission: An easy
way to get link juice via back links is to submit unique articles to various
popular article submission sites like EzineArticles.com, GoArticles.com,
ArticleDashboard.com, iSnare.com, and ArticlesBase.com.
Forum: Set up your account on some
popular forums, build your credibility there and soon you will be allowed to
add your site URL in your signature, which will act as a backlink and help to
lure your avid followers to your site.
Blogging: Whether you have a blog of your
own or want to write as a contributing blogger for some popular blogging sites,
this can prove to be an effective way of making people take notice of what you
have to offer. RSS Feed generation and submission also help to keep your avid
readers interested in your updates and news even when they don’t have the time
to actually visit the site.
Social
Bookmarking and Q and A Postings: You can also further your SEO interests by posting
questions and answers on Yahoo! Answers, and via social bookmarking.
Social
Networking:
Networking on various social platforms like Twitter, Facebook, MySpace, and LinkedIn is the newest buzz in SEO tactics, which webmasters are using with zeal. You too can join the bandwagon after some careful planning.
SEO Competitive Analysis
Many SEO
clients are focused on receiving ranking reports for their keywords as a major
deliverable associated with a properly managed SEO campaign.
But
ranking reports don't mean nearly as much as they once did. Search engine
rankings change regularly, are different on various data centers, and won't
generate traffic to the Web site, much less generate leads and sales,
especially if a site ranks well for keywords that aren't often searched.
So, I
preach to my prospects and clients that they should be focused on analytics and
measurement of SEO much like they would (try to) measure any form of marketing
effort. Is the SEO program generating qualified traffic to the site? Is the SEO
effort generating phone calls (yes, you can track this)? Is the SEO effort
delivering a solid ROI (for what I'm spending on these efforts, either in
internal resources or outsourcing)?
Now,
that's not to say that a firm you've outsourced your SEO efforts to shouldn't
be delivering reports. They absolutely should. But, let's try to focus on
things that actually matter. These include things like solid keyword research,
a competitive analysis, a site structure analysis and analytics reports that
"mean something" (making sure that analytics programs are set up
properly and tracking what matters).
Today,
I'll touch upon one of the most overlooked aspects of a successful SEO effort:
the competitive analysis.
Determine Who Your Competitors Are
Many CMOs
are quick to list off a number of competitors (those that they think of as
competitors in the traditional sense). In the SEO landscape, we lean towards
those "keyword competitors" -- Web sites that are ranking for
keywords we'd like our client to be found for.
A good
example of this would be a client from my former life who sold
"signs" (banners, billboards, etc.). One of their main keywords was
(is) "signs." At the time, the movie "Signs," starring Mel
Gibson, was released. Obviously, the movie isn't a direct competitor for this
keyword, but a page devoted to this movie ranks number one in Google for
"signs," and the movie still has several mentions in Google's top 10.
How to Compete for Various Keywords
Once
you've determined the keyword competitors, you need to determine the factors
that might be in play to help these Web sites to rank, while yours may not.
It's
possible to get carried away with this type of analysis, as there are over 100
factors in play to determine why a Web site might rank, and the factors (and
the weight of the factors) will fluctuate in the search engine's algorithms.
With that
said, there are some pretty consistent things that you can look for to better
compete for various keywords:
Age of
Domain: Many
people getting into business on the Web for the first time don't know this
simple rule. Buying an aged domain saves you a great amount of time. While
you're at it, buy a domain that already has links pointing to it (links from
within your chosen industry, ideally). When you look at the Web sites that are
ranking for your selected keywords, you'll most likely see a trend that those
listed on the first page of the search results are many years old. This could
be because it took that long to generate enough quality links/content, but an
aged domain is certainly one of the most important factors that goes into
getting a Web site to rank.
Pages
Indexed: This is
what I refer to as the "Wikipedia effect." Wikipedia is an extremely
deep Web site, with only one page relevant to your search. Why does it keep
showing up when you're searching? Because the search engines have determined
that the Web site -- as a whole -- is an "authority" site. That is,
it's deep with quality content (there are a number of other reasons why this
Web site ranks, but the depth of the Web site is certainly key among those
reasons). Wikipedia has more than 380 million pages indexed in Yahoo.
Linking: Through an easy
"site:www.sitename.com" search on Yahoo, you can see the pages
indexed and links indexed for any Web site that you're analyzing. Click on the
"Inlinks" link and use the drop down to select "except from this
domain," so that you aren't counting those internal links in your
analysis.
By
following these three simple steps, you'll gain a greater insight into what it
takes to rank for the keywords you're interested in ranking for, and help you
better understand the steps/tactics that you'll need to employ to compete with
those that are showing up in the SERPs.
Keyword Research and Analysis
Keyword
research is one of the most important parts of online marketing. Sometimes we
don't realize just how much of a difference it can make in the overall success
of a search engine marketing campaign or even the success of an online
business. In fact, it's my belief that the keyword phrase (the term that a searcher types into the search engine when they're searching) can make or break an online business. Target the wrong keyword phrases and an online business is destined to fail.
Domain names by themselves are no longer the most valuable "location" on the internet; keywords are. If you buy a domain
name and put up a website, visitors won't automatically flock to your website
and buy your products. The popular phrase "if you build it they will
come" is not true when it comes to online businesses. You must market your
website online, and one of the best forms of ROI for an online business is
through search engine marketing. All search engine marketing campaigns need to
start with a set of keywords. It's those keywords that are an online business'
location, just as a traditional brick and mortar store's location is a physical
street address.
Location
If you're
considering opening a gas station, one of the most important decisions related
to the success of the gas station is going to be its location. Selling gas near
a busy highway where there is a large traffic count would be ideal. So, it's
logical to get the traffic count data for several locations before deciding
where to purchase land and build your gas station. In the online marketing
world, we have the opportunity to get traffic counts as well: the average number of searches per day for certain
keyword phrases. By positioning your online business in the proper keyword
"locations", your online business will thrive. Target the wrong
keyword phrases and your online business won't get any search engine traffic,
and no potential customers will visit your website.
If you're
a brick and mortar business that sells power tools, proper keyword research can
be a tremendous help even before you set up your
online business. Keyword research can tell you exactly which power tools are
more popular online, which may be different from the
best selling power tools in your retail store. Armed with this knowledge, you
can focus on the more popular products that customers are looking for online.
For example, did you know that more people search for portable generators
online than any other type of power tool? Perhaps this is due to the recent
weather-related disasters that have hit the United States or the pending winter
weather, but focusing on selling portable generators online might be good for
business. Keyword research using Wordtracker (www.wordtracker.com) gives us this valuable information; it can also give us the list of
other power tools that are being searched for, which will be a good start for a
list of keywords.
Content Copywriting & Optimization
Search
engines love relevance. They live for relevance. Nothing pleases them more than
finding the most informative, appropriate and useful website to match a query.
The search engine companies spend millions of pounds and working hours in
attempting to formulate ways by which the rubbish can be ignored and only the
purest, most useful sites returned. It's a noble quest. They work tirelessly to
out-tech, head off and second-guess the engine manipulators, the spamdexers and the accidental offenders, those who would knowingly or otherwise abuse the integrity
of search returns by working the system to return unhelpful and irrelevant
pages against queries for their own ends.
The
search engines are seeking the foremost authority on a given subject. They want
the most reputable sites and those that know the most on a particular topic.
The most rewarding approach when seeking to legitimately see your site at the
top of the heap is not to ask what search engines can do for you, but what you
can do for search engines. It's not up to the search engines to make your site
relevant. That's the job of the site owner.
The way
you do this is in principle, simple enough - through SEO Copywriting and
text-related content. Engines can't read images or graphics; they cannot
determine the relevance of Flash in relation to a search term. The two most
important factors upon which they can determine relevance are the site text and, more importantly, the links from authoritative sites inspired by that text.
The days
of the Search Engine Optimization copywriter weaving their magic with perfect
keyword selection, placement and density to achieve wondrous top page rankings
are long gone (despite what many will try to tell you). Of course, keywords are
still important, especially in titles as engines prefer nice tight search
return matches, but it's more a case of frequency rather than density that
improves rankings.
There's a
common consensus these days amongst Search Engine Optimization professionals
that the major determinant of ranking position for any particular page is down
to what happens off the page, in the form of links from other sites. Reputation
built on authority bestowed by other reputable sites through sheer worth and
relevance. Good is good is good. When Barack Obama recently quipped 'you can
put lipstick on a pig but it's still a pig' what he's saying (without wishing
to be piggist) is that you can't properly disguise something ugly, something
unappealing. By the same token quality will always out. Compelling content
gains eyeballs, gains traction, gains authority, gains high search engine
placement, gains eyeballs, gains traction and so it continues.
At SEO
Consult we understand this clearly and work hard to get the wheels of virtuous
search engine performance rolling.
Appreciating
something is all well and good; it's acting on that knowledge that delivers
results and at SEO Consult a great deal of our work is in partnering our
clients to formulate and apply the maximum possible value into site Copy and
Content.
Getting
authoritative links has become the most difficult aspect of Search Engine
Optimization, hence the emergence from 2006 of Social Media Marketing as a way
to attract links with compelling content, hence the explosion in online
articles and blogs.
The
critical starting point before a word of Search Engine Optimization copywriting
has been written is the project scope and definition stage.
It's at
this point that business objectives need to be clearly defined through goal
analysis. What are we aiming to achieve? This helps us measure campaign
effectiveness and stay on track and focused.
We then
define our ideal site visitors through audience analysis. Through research,
surveys and the like we try to get inside the head of the prospective audience
and define the campaign's semantic space. Content on the site will be directed
specifically at them, in a language they understand and that appeals to them. Content needs
to press the right emotional buttons to generate positive responses and
ultimately inbound links. Similar emotional forces to those that prompt us to
buy can also cause us to link, bookmark, Twitter and Digg. Compelling benefits
in the form of content provide visitors the motivation to emotionally invest in
a site by linking to it. At this stage of the Search Engine Optimization
process we're simply defining an environment to which our audience might best relate.
"Ask
yourself what creates value for your users," Google says.
The
audience analysis acts as the foundation for intelligent keyword selection -
all part of the semantic space definition process.
We
collaborate with clients to create a list of preferable keywords and phrases
that, based on our research, present the best opportunities for attracting
potential visitors, customers and links. We make a point of revisiting the
keyword formulation step as part of campaign assessment so that we can hone and
tweak as necessary.
It's
impossible to repeat this too often or overstress - Content of value is the key
and it's the inbound links that reflect its worth in the eyes of Internet users
and therefore the search engines. Of course there are useful optimization
techniques that should also be applied to the site to make it search friendly and
at SEO Consult we expertly address optimization issues such as - site
structure, site hierarchy, optimized and themed Tag Naming. In reality though,
it's only worth pursuing additional optimization through tweaking once those
all-important inbound links have been attracted to the site in the first place.
Tweaking a page for higher rankings before the content has been established as
compelling is largely futile. The site is 'all dressed up, with nowhere to go.'
As those
search engine algorithms move further and further away from old school
relevance measurements and increasingly assign site importance and authority to
off page factors such as social media tagging and blog-driven links, so expert
Search Engine Optimization copywriters armed with the ability and empathy to
prompt inbound links and consequent conversions are finding themselves becoming
vital components in search engine marketing campaigns.
SEO
Consult offers passionate, professional and hugely experienced writers. Both
creative and qualified our writers are committed to delivering the very best
copy solutions to your campaigns.
SEO
copywriting is the process of writing literal and utilitarian content for
websites with the perfect blend of information and keywords. The SEO
copywriting and content optimization services of a professional SEO company
would achieve top ranking for your website. The skillfully crafted website
content brings immense benefits to your business firm.
Quality Services Ensure Better Rankings
SEO
copywriting and optimization services of a SEO company help to make a website
attractive and thus increase the number of visitors. They raise the client
company’s online visibility, attracting increased traffic and leading to
enhanced sales. Various SEO copywriting optimization services include:
• SEO press release writing services
• Case study writing and optimization
• SEO article writing
• Newsletter writing
• Travel writing
• Technical writing
• PPC ad writing
• Ecommerce writing
• Web page copywriting services
• Blog copywriting services
• Article writing
• Writing research-based articles
A premier
SEO company offers professional services for SEO optimization and copywriting.
If the content is highly optimized, it places your website in a top position on
popular search engines. Optimized copywriting is achieved by employing advanced
technology and search engine optimization tools. It is performed by highly
skilled and trained copywriters. They analyze the target of a business firm and
carry out productive copywriting. Copywriting is done in an easy-to-read,
understandable and attractive style. Most searched keywords are placed in
optimal format with a combination of title, keyword tags, headings, alt text,
description and link anchor tags.
Utilize Services of a Top SEO Company
A leading
SEO company is the best option if you need professional aid in copywriting and
optimized content for your website. SEO copywriting and content optimization
services of such a company maintain top search engine rankings on a regular
basis, making your website profitable.
RSS Creation & Submission
How to Create RSS Feeds
Since any RSS file is a specially formatted XML file, it can be edited with any XML editor. And since XML files are just plain text, you can use any text editor, even Notepad, to create your first feed.
Step-by-Step Guide to Creating an RSS Feed
Follow
these steps to create a simple RSS feed manually.
1: Create an empty text file
Use Windows Notepad or any other text editor.
2: Add XML Declaration Tag
Since RSS is a dialect of XML, the first line in the feed must be the XML declaration.
<?xml version="1.0"?>
3: RSS Channel
Now it is time to add the rss XML tag, and the channel tag. All feed contents will go inside these two tags.
<rss version="2.0">
<channel>
4: RSS Feed Properties
The next step is to place information about the RSS feed, such as its title, its description, its language and a link to its website. And finally add the lastBuildDate field, which should be the date and time that the feed was last changed. This field is optional, but highly recommended.
<title>John Smith News</title>
<link>http://JohnSmithHomepage.com</link>
<description>Latest stories from John Smith</description>
<lastBuildDate>Mon, 12 Sep 2005 18:37:00 GMT</lastBuildDate>
<language>en-us</language>
5: Adding Items to your RSS Feed
Every RSS feed consists of items, and each item in an RSS feed has a title, link, description, publication date, and (optionally) a guid (unique identifier).
<item>
<title>My First Article</title>
<link>http://JohnSmithHomepage.com/Article1.html</link>
<guid>http://JohnSmithHomepage.com/Article1.html</guid>
<pubDate>Mon, 12 Sep 2005 18:37:00 GMT</pubDate>
<description>It's my first article. Hello World!</description>
</item>
<!-- insert more items here -->
6: Add closing tags for Channel and RSS
</channel>
</rss>
7: Validate your new RSS feed
After you have created your RSS feed, validate it with a feed validator (for example, the W3C Feed Validation Service) to catch formatting mistakes.
Sample feed
Here's a sample RSS file which can be used as a template for your first feed:
<?xml version="1.0"?>
<rss version="2.0">
<channel>
<title>John Smith News</title>
<link>http://JohnSmithHomepage.com/</link>
<description>Latest stories from John Smith</description>
<language>en-us</language>
<lastBuildDate>Tue, 10 Jun 2003 09:41:01 GMT</lastBuildDate>
<item>
<title>My First Article</title>
<link>http://JohnSmithHomepage.com/Article1.html</link>
<description>It's my first article. Hello World!</description>
<pubDate>Tue, 03 Jun 2003 09:39:21 GMT</pubDate>
</item>
<item>
<title>My Second Article - I have bought a cat</title>
<link>http://JohnSmithHomepage.com/Article2.html</link>
<description>I've bought a cat. Now I have a pet.</description>
<pubDate>Mon, 09 Jun 2003 10:39:21 GMT</pubDate>
</item>
</channel>
</rss>
Robots Exclusion Standard
The Robot
Exclusion Standard, also known as the Robots Exclusion Protocol or robots.txt
protocol, is a convention to prevent cooperating web spiders and other web
robots from accessing all or part of a website which is otherwise publicly
viewable. Robots are often used by search engines to categorize and archive web
sites, or by webmasters to proofread source code. The standard is unrelated to,
but can be used in conjunction with, Sitemaps, a robot inclusion standard for
websites.
About the standard
If a site
owner wishes to give instructions to web robots they must place a text file
called robots.txt in the root of the web site hierarchy (e.g.
www.example.com/robots.txt). This text file should contain the instructions in
a specific format (see examples below). Robots that choose to follow the
instructions try to fetch this file and read the instructions before fetching
any other file from the web site. If this file doesn't exist web robots assume
that the web owner wishes to provide no specific instructions.
A
robots.txt file on a website will function as a request that specified robots
ignore specified files or directories in their search. This might be, for
example, out of a preference for privacy from search engine results, or the
belief that the content of the selected directories might be misleading or
irrelevant to the categorization of the site as a whole, or out of a desire
that an application only operate on certain data.
For
websites with multiple sub domains, each sub domain must have its own
robots.txt file. If example.com had a robots.txt file but a.example.com did
not, the rules that would apply for example.com would not apply to
a.example.com.
Disadvantages
The
protocol is purely advisory. It relies on the cooperation of the web robot, so
that marking an area of a site out of bounds with robots.txt does not guarantee
privacy. Some web site administrators have tried to use the robots file to make
private parts of a website invisible to the rest of the world, but the file is
necessarily publicly available and its content is easily checked by anyone with
a web browser.
There is
no official standards body or RFC for the robots.txt protocol. It was created
by consensus in June 1994 by members of the robots mailing list
(robots-request@nexor.co.uk). The information specifying the parts that should
not be accessed is specified in a file called robots.txt in the top-level
directory of the website. The robots.txt patterns are matched by simple
substring comparisons, so care should be taken to make sure that patterns
matching directories have the final '/' character appended, otherwise all files
with names starting with that substring will match, rather than just those in
the directory intended.
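As a hypothetical illustration of the trailing-slash point: a rule written as Disallow: /private would also block URLs such as /private-notes.html and /private.html, because matching is a simple substring comparison from the start of the path, whereas Disallow: /private/ restricts the rule to the contents of the /private/ directory only.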
Examples
This
example allows all robots to visit all files because the wildcard "*"
specifies all robots:
User-agent: *
Disallow:
This
example keeps all robots out:
User-agent: *
Disallow: /
The next
is an example that tells all crawlers not to enter four directories of a
website:
User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /tmp/
Disallow: /private/
Example
that tells a specific crawler not to enter one specific directory:
User-agent: BadBot # replace the 'BadBot' with the actual user-agent of the bot
Disallow: /private/
Example
that tells all crawlers not to enter one specific file:
User-agent: *
Disallow: /directory/file.html
Note that
all other files in the specified directory will be processed.
Example
demonstrating how comments can be used:
# Comments appear after the "#" symbol at the start of a line, or after a directive
User-agent: * # match all bots
Disallow: / # keep them out
Nonstandard extensions
Crawl-delay directive
Several
major crawlers support a Crawl-delay parameter, set to the number of seconds to
wait between successive requests to the same server:
User-agent: *
Crawl-delay: 10
Allow directive
Some
major crawlers support an Allow directive which can counteract a following
Disallow directive. This is useful when one disallows an entire directory but
still wants some HTML documents in that directory crawled and indexed. While by
standard implementation the first matching robots.txt pattern always wins,
Google's implementation differs in that Allow patterns with equal or more characters
in the directive path win over a matching Disallow pattern. Bing uses the Allow
or Disallow directive which is the most specific.
In order
to be compatible to all robots, if one wants to allow single files inside an
otherwise disallowed directory, it is necessary to place the Allow directive(s)
first, followed by the Disallow, for example:
Allow: /folder1/myfile.html
Disallow: /folder1/
This
example will Disallow anything in /folder1/ except /folder1/myfile.html, since
the latter will match first. In case of Google, though, the order is not
important.
Sitemap
Some
crawlers support a Sitemap directive, allowing multiple Sitemaps in the same
robots.txt in the form:
Sitemap: http://www.gstatic.com/s2/sitemaps/profiles-sitemap.xml
Sitemap: http://www.google.com/hostednews/sitemap_index.xml
Static vs. Dynamic Websites - What's the Difference?
What are static and dynamic websites?
There are many static websites on the Internet. You won't always be able to tell immediately whether a site is static, but the chances are that if the site looks basic, is for a smaller company, and simply delivers information without any bells and whistles, it could be a static website. Static websites can only really be updated by someone with a knowledge of website development. Static websites are the cheapest to develop and host, and many smaller companies still use these to get a web presence.
Advantages of static websites
* Quick to develop
* Cheap to develop
* Cheap to host
Disadvantages of static websites
* Requires web development expertise to update site
* Site not as useful for the user
* Content can get stagnant
Dynamic
sites on the other hand can be more expensive to develop initially, but the
advantages are numerous. At a basic level, a dynamic website can give the
website owner the ability to simply update and add new content to the site. For
example, news and events could be posted to the site through a simple browser
interface. Dynamic features of a site are only limited by imagination. Some
examples of dynamic website features could be: content management system,
e-commerce system, bulletin / discussion boards, intranet or extranet
facilities, ability for clients or users to upload documents, ability for
administrators or users to create content or add information to a site (dynamic
publishing).
Advantages of dynamic websites
* Much more functional website
* Much easier to update
* New content brings people back to the site and helps in the search engines
* Can work as a system to allow staff or users to collaborate
Disadvantages of dynamic websites
* Slower / more expensive to develop
* Hosting costs a little more
Summary
Many sites from the last decade are static, but more and more people are realising the advantages of having a dynamic website. A dynamic website lets you make the most of your site, using it either as a tool or to create a professional, interesting experience for your visitors.
Structuring your website
Internet
users are objective driven. That means that they arrive at your site because
they are looking for specific information. The way you structure your website
is therefore very important in retaining visitors.
Plan your structure on paper first
It’s a
good idea to draw a structure chart on paper before you brief your web
designer, and follow these tips:
* Limit your site to between 6 and 8 main sections - any more and you risk
creating information overload
* Put important information as few clicks away as possible
* Remember that people may be looking for different ways to get to the same
content. Some enquiries are product brand driven, while other users will focus
on the end use of a product
* Not everyone will arrive at your site on the Home page. You should plan
effective landing pages that will clearly route users directly into specific
topic areas.
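As a rough sketch of these tips, a hypothetical site limited to six main sections might expose its structure through a simple navigation list on every page, so each important area is one click away (the section names and URLs are invented):
<ul>
<li><a href="/products/">Products</a></li>
<li><a href="/solutions/">Solutions</a></li>
<li><a href="/support/">Support</a></li>
<li><a href="/case-studies/">Case Studies</a></li>
<li><a href="/about/">About Us</a></li>
<li><a href="/contact/">Contact</a></li>
</ul>
Landing pages for specific topics can then link back into this structure so that visitors arriving from a search result can still reach every main section directly.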
Social Media Optimization: 13 Rules of SMO
Here’s a
summary of what they’ve put together to date:
1. Increase your linkability: Think blogs, content, aggregation & linkbait.
2. Make
tagging and bookmarking easy: Include calls to action for users to tag,
bookmark and Digg your stuff. I’d suggest the Sociable Plugin if you have a
WordPress powered blog.
3. Reward
inbound links: List blogs which link back to you via permalinks, trackbacks or
recently linking blogs (like the Yahoo & Google blogs do).
4. Help
your content travel: Content diversification can lead to mobility of your
content beyond the browser.
5.
Encourage the mashup: Let others use your content or tools to produce something
a bit different or outside of the box with your stuff, even RSS.
6. Be a
User Resource, even if it doesn’t help you: Add value and outbound links, even
if it doesn’t help in the short term, it will in the long run.
7. Reward
helpful and valuable users: Give your contributors and readers the recognition
they deserve.
8.
Participate: Get in there and get involved in the discussions going on among
the blogs and sites of others, and do it organically. Earn your rep on
Digg.com, don’t try and force it.
9. Know
how to target your audience: Understand your appeal and those people you wish
to attract.
10.
Create content: A little bit of rules 1 & 4 here, but the underlying
message is know the form of content working for you.
11. Be
real: Transparency pays off and no one likes a fake.
And mine:
12. Don’t
forget your roots, be humble: Sometimes it can be easy to get carried away
being a BlogStar or industry talking head. Remember those who helped you along
the way, and that respect will help all involved.
13. Don’t
be afraid to try new things, stay fresh: Social Media is changing and morphing
by the minute, keep up on new tools, products and challenges in your social
sphere.
Search Engine Results Page(s) (SERPs)
Within
SEM, there are three main opportunities for organizations to get their message
across, to gain visibility and to direct visitors to their sites. The first two
opportunities are via the SERPs and the third is on third-party sites.
1. The natural or organic listings. This is the part of the page listing results from a search engine query, displayed in a sequence according to the relevance of the match between the keyword phrase typed into the search engine and a web page, as determined by the ranking algorithm used by the search engine.
The
method for achieving placement in this part of the page is called search engine
optimization (SEO) and is the focus of this best practice guide.
2. The
paid or sponsored listings. A relevant ad (typically a text ad) with a link to
a destination page is displayed when the user of a search engine types in a
specific phrase. A fee is charged for every click of each link, with the amount
bid for the click the main factor determining its position.
The
method for achieving placement in this part of the page is called paid-search
(aka ‘pay per-click’ or PPC). Econsultancy publishes
a dedicated best practice guide to paid-search marketing, to help you plan,
launch and optimize PPC campaigns.
3. Content-network listings. These ads are displayed on third-party sites that have an AdSense relationship with Google, or which display Yahoo! or MIVA listings on their website. These actually account for a sizeable proportion of Google's revenue, but tend to have much lower click-through rates.
On-page optimization
In this
section we make recommendations on how you should create documents which the
search engine will assess as being highly relevant to a particular search term
a search user has entered as their query. The most basic test of relevance is
the number of times the search phrase appears on the page. However, there are
many factors which are also applied. In this section we will review:
* Within-page key phrase factors, including keyword density, synonyms and position.
* Page markup key phrase factors, including syntactical accuracy, <title> tags, <meta> tags, <a href=> hyperlink tags and <img> alt tags.
* Document-level key phrase factors, such as the inclusion of key phrases in the domain and document file name.
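As a rough, hypothetical illustration of the keyword density factor listed above: density is usually taken as the number of occurrences of the phrase divided by the total number of words on the page, so a 400-word page that uses the target phrase 8 times has a keyword density of 8 / 400 = 2%.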
Competitor benchmarking
The first
stage of competitor benchmarking is to identify your online competitor types
for search traffic. Competitors for particular key phrases are not necessarily
your traditional competitors. For example, for a mobile phone retailer, when
someone searches for a product, you will be competing for search visibility
with these types of websites:
* Retailers.
* Network providers.
* Handset manufacturers.
* Affiliates and partner sites.
* Media-owned sites.
* Blogs and personal sites about mobile phone technology.
To assess
the extent that search strategy should focus on SEO and PPC (and also to be
able to compete with these different types of content providers) it is
necessary to assess the relative strength of these sources, as well as the
various approaches to SEM they use. Try to identify competitors who have
optimized their sites most effectively.
Retailers
trying to compete on particular product phrases in the organic listings may
find that it is very difficult, since handset and network providers will often
feature prominently in the natural listings because of their scale (see also
Mike Grehan's 'rich-get-richer' argument, for explanations on
why top Google results can become happily entrenched in their positions).
Meanwhile,
many media-owned sites and blogs can feature highly in the natural listings,
because content is king. This isn’t at all
surprising, given the search robots' love of
text. Retailers tend to display big conversion-friendly images and lists of
features / specifications, which may be less attractive content as far as
Googlebot is concerned, if more appealing to visitors.
With all
this in mind, it seems obvious that many retail e-commerce managers favor PPC.
More likely, it is about short-term (versus long-term) goals. Or, maybe it is
just a case of easy versus difficult.
The
second stage of competitor analysis is to compare their relative performance.
Competitors can be compared in a number of ways using tools that are freely
available within the search engines, or using paid-for software or services.
So how can I benchmark performance against competitors?
- Ranking Position report
Compare the relative performance in the natural listings for different keyphrase types, e.g. generic / qualified.
Pay per click (PPC)
Pay per
click (PPC) is an
Internet advertising model used on websites, where advertisers pay their host
only when their ad is clicked. With search engines, advertisers typically bid
on keyword phrases relevant to their target market. Content sites commonly
charge a fixed price per click rather than use a bidding system.
Cost per
click (CPC) is the
sum paid by an advertiser to search engines and other Internet publishers for a
single click on their advertisement which directs one visitor to the
advertiser's website.
In
contrast to the generalized portal, which seeks to drive a high volume of
traffic to one site, PPC implements the so-called affiliate model, that
provides purchase opportunities wherever people may be surfing. It does this by
offering financial incentives (in the form of a percentage of revenue) to
affiliated partner sites. The affiliates provide purchase-point click-through
to the merchant. It is a pay-for-performance model: If an affiliate does not
generate sales, it represents no cost to the merchant. Variations include
banner exchange, pay-per-click, and revenue sharing programs.
Websites
that utilize PPC ads will display an advertisement when a keyword query matches
an advertiser's keyword list, or when a content site displays relevant content.
Such advertisements are called sponsored links or sponsored ads, and appear adjacent
to or above organic results on search engine results pages, or anywhere a web
developer chooses on a content site.
Among PPC
providers, Google AdWords, Yahoo! Search Marketing, and Microsoft adCenter are
the three largest network operators, and all three operate under a bid-based
model. Cost per click (CPC) varies depending on the search engine and the level
of competition for a particular keyword.
The PPC
advertising model is open to abuse through click fraud, although Google and
others have implemented automated systems to guard against abusive clicks by
competitors or corrupt web developers.
Determining cost per click
There are
two primary models for determining cost per click: flat-rate and bid-based. In
both cases the advertiser must consider the potential value of a click from a
given source. This value is based on the type of individual the advertiser is
expecting to receive as a visitor to his or her website, and what the
advertiser can gain from that visit, usually revenue, both in the short term as
well as in the long term. As with other forms of advertising, targeting is key,
and factors that often play into PPC campaigns include the target's interest
(often defined by a search term they have entered into a search engine, or the
content of a page that they are browsing), intent (e.g., to purchase or not),
location (for geo targeting), and the day and time that they are browsing.
Flat-rate PPC
In the
flat-rate model, the advertiser and publisher agree upon a fixed amount that
will be paid for each click. In many cases the publisher has a rate card that
lists the CPC within different areas of their website or network. These various
amounts are often related to the content on pages, with content that generally
attracts more valuable visitors having a higher CPC than content that attracts
less valuable visitors. However, in many cases advertisers can negotiate lower
rates, especially when committing to a long-term or high-value contract.
The
flat-rate model is particularly common to comparison shopping engines, which
typically publish rate cards. However, these rates are sometimes minimal, and
advertisers can pay more for greater visibility. These sites are usually neatly
compartmentalized into product or service categories, allowing a high degree of
targeting by advertisers. In many cases, the entire core content of these sites
is paid ads.
Bid-based PPC
In the
bid-based model, the advertiser signs a contract that allows them to compete
against other advertisers in a private auction hosted by a publisher or, more
commonly, an advertising network. Each advertiser informs the host of the
maximum amount that he or she is willing to pay for a given ad spot (often
based on a keyword), usually using online tools to do so. The auction plays out
in an automated fashion every time a visitor triggers the ad spot.
When the
ad spot is part of a search engine results page (SERP), the automated auction
takes place whenever a search for the keyword that is being bid upon occurs.
All bids for the keyword that target the searcher's geo-location, the day and
time of the search, etc. are then compared and the winner determined. In
situations where there are multiple ad spots, a common occurrence on SERPs,
there can be multiple winners whose positions on the page are influenced by the
amount each has bid. The ad with the highest bid generally shows up first,
though additional factors such as ad quality and relevance can sometimes come
into play (see Quality Score).
In
addition to ad spots on SERPs, the major advertising networks allow for
contextual ads to be placed on the properties of third parties with whom they
have partnered. These publishers sign up to host ads on behalf of the network.
In return, they receive a portion of the ad revenue that the network generates,
which can be anywhere from 50% to over 80% of the gross revenue paid by
advertisers. These properties are often referred to as a content network and
the ads on them as contextual ads because the ad spots are associated with
keywords based on the context of the page on which they are found. In general,
ads on content networks have a much lower click-through rate (CTR) and
conversion rate (CR) than ads found on SERPs and consequently are less highly
valued. Content network properties can include websites, newsletters, and
e-mails.
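To put the CTR and CR comparison into concrete terms (the figures are invented for illustration): an ad served 10,000 times that receives 150 clicks has a click-through rate of 150 / 10,000 = 1.5%; if 3 of those clicks lead to a sale, the conversion rate is 3 / 150 = 2%. Content-network placements typically sit at the lower end of both measures compared with ads on SERPs.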
Advertisers
pay for each click they receive, with the actual amount paid based on the
amount bid. It is common practice amongst auction hosts to charge a winning
bidder just slightly more (e.g. one penny) than the next highest bidder or the
actual amount bid, whichever is lower. This avoids situations where bidders are
constantly adjusting their bids by very small amounts to see if they can still
win the auction while paying just a little bit less per click.
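A small worked example of this pricing rule (the bids are hypothetical): if the winning advertiser bids $1.00 per click and the next highest bid is $0.60, the winner is typically charged $0.61 per click, one penny more than the runner-up rather than the full $1.00 bid, since that is the lower of the two amounts described above.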
To maximize
success and achieve scale, automated bid management systems can be deployed.
These systems can be used directly by the advertiser, though they are more
commonly used by advertising agencies that offer PPC bid management as a
service. These tools generally allow for bid management at scale, with
thousands or even millions of PPC bids controlled by a highly automated system.
The system generally sets each bid based on the goal that has been set for it,
such as maximize profit, maximize traffic at breakeven, and so forth. The
system is usually tied into the advertiser's website and fed the results of
each click, which then allows it to set bids. The effectiveness of these
systems is directly related to the quality and quantity of the performance data
that they have to work with - low-traffic ads can lead to a scarcity of data
problem that renders many bid management tools useless at worst, or inefficient
at best.
Social media marketing
Social
media marketing is a recent addition to organizations’ integrated marketing
communications plans. Integrated marketing communications is a principle
organizations follow to connect with their targeted markets. Integrated
marketing communications coordinates the elements of the promotional mix;
advertising, personal selling, public relations, publicity, direct marketing,
and sales promotion. In the traditional marketing communications model, the
content, frequency, timing, and medium of communications by the organization are developed in collaboration with an external agent, i.e. advertising agencies, marketing
research firms, and public relations firms. However, the growth of social media
has impacted the way organizations communicate. With the emergence of Web 2.0,
the internet provides a set of tools that allow people to build social and
business connections, share information and collaborate on projects online.
Social
media marketing programs usually center on efforts to create content that
attracts attention and encourages readers to share it with their social
networks. A corporate message spreads from user to user and presumably
resonates because it is coming from a trusted source, as opposed to the brand
or company itself.
Social
media has become a platform that is easily accessible to anyone with internet
access, opening doors for organizations to increase their brand awareness and
facilitate conversations with the customer. Additionally, social media serves
as a relatively inexpensive platform for organizations to implement marketing
campaigns. With the emergence of services like Twitter, the barrier to entry in social media is greatly reduced. A report from the company Sysomos shows that half of Twitter's users are located outside the US, demonstrating the global significance of social media marketing. Organizations can receive direct feedback from their customers and targeted markets.
Platforms
Social media marketing, also known as SMO (Social Media Optimization), benefits organizations and individuals by providing an additional channel for customer support, a means to gain customer and competitive insight, recruitment and retention of new customers/business partners, and a method of managing their reputation online. Key factors that ensure its success are its relevance to the customer, the value it provides them with, and the strength of the foundation on which it is built. A strong foundation serves as a platform on which the organization can centralize its information and direct customers to its recent developments via other social media channels, such as article and press release publications.
The most
popular platforms include:
* Blogs
* Delicious
* Facebook
* Flickr
* Hi5
* LinkedIn
* MySpace
* Reddit
* Tagged
* Twitter
* YouTube
* More...
Web
analytics
Web
analytics is the measurement, collection, analysis and reporting of internet
data for purposes of understanding and optimizing web usage.
Web
analytics is not just a tool for measuring website traffic but can be used as a
tool for business research and market research. Web analytics applications can
also help companies measure the results of traditional print advertising
campaigns. It helps one to estimate how the traffic to the website changed
after the launch of a new advertising campaign. Web analytics provides data on
the number of visitors, page views, etc. to gauge the traffic and popularity
trends, which is helpful for market research.
There are
two categories of web analytics; off-site and on-site web analytics.
Off-site
web analytics refers to web measurement and analysis regardless of whether you
own or maintain a website. It includes the measurement of a website's potential
audience (opportunity), share of voice (visibility), and buzz (comments) that
is happening on the Internet as a whole.
On-site
web analytics measure a visitor's journey once on your website. This includes
its drivers and conversions; for example, which landing pages encourage people
to make a purchase. On-site web analytics measures the performance of your
website in a commercial context. This data is typically compared against key performance indicators and used to improve a website's or marketing campaign's audience response.
Historically,
web analytics has referred to on-site visitor measurement. However in recent
years this has blurred, mainly because vendors are producing tools that span
both categories.
On-site
web analytics technologies
Many
different vendors provide on-site web analytics software and services. There
are two main technological approaches to collecting the data. The first method,
logfile analysis, reads the logfiles in which the web server records all its
transactions. The second method, page tagging, uses JavaScript on each page to
notify a third-party server when a page is rendered by a web browser. Both
collect data that can be processed to produce web traffic reports.
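To make the page tagging approach concrete, a page tag is typically a small piece of JavaScript that reports the page view to a collection server when the page loads. The sketch below uses a simple image "beacon"; collect.example-analytics.com is a made-up endpoint, not a real vendor:

// Minimal page-tagging sketch: report the current page view to a (hypothetical)
// third-party collection server by requesting a tiny image with the details
// appended as query parameters. Real vendors' tags are far more elaborate.
(function () {
  var beacon = new Image();
  beacon.src = 'https://collect.example-analytics.com/hit.gif' +
    '?page=' + encodeURIComponent(window.location.pathname) +
    '&ref=' + encodeURIComponent(document.referrer) +
    '&t=' + Date.now(); // cache-buster so every page view is recorded
})();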
In addition, other data sources may be added to augment the data: for example, e-mail response rates, direct mail campaign data, sales and lead information, user performance data such as click heat mapping, or other custom metrics as needed.
Key
definitions
There are no globally agreed definitions within web analytics, as the industry bodies have been trying to agree on definitions that are useful and definitive for some time. The main bodies that have had input in this area are JICWEBS (the Joint Industry Committee for Web Standards)/ABCe (Audit Bureau of Circulations electronic, UK and Europe), the WAA (Web Analytics Association, US) and, to a lesser extent, the IAB (Interactive Advertising Bureau). This does not prevent the following list from being a useful guide, suffering only slightly from ambiguity. Both the WAA and the ABCe provide more definitive lists for those who are declaring their statistics using the metrics defined by either.
* Hit - A request for a file from the web server. Available only in log
analysis. The number of hits received by a website is frequently cited to
assert its popularity, but this number is extremely misleading and dramatically
over-estimates popularity. A single web-page typically consists of multiple
(often dozens) of discrete files, each of which is counted as a hit as the page
is downloaded, so the number of hits is really an arbitrary number more
reflective of the complexity of individual pages on the website than the
website's actual popularity. The total number of visitors or page views
provides a more realistic and accurate assessment of popularity.
* Page view - A request for a file whose type is defined as a page in log
analysis. An occurrence of the script being run in page tagging. In log
analysis, a single page view may generate multiple hits as all the resources
required to view the page (images, .js and .css files) are also requested from
the web server.
* Visit / Session - A visit is defined as a series of page requests from the same uniquely identified client with a time of no more than 30 minutes between each page request. A session is defined as a series of page requests from the same uniquely identified client with a time of no more than 30 minutes and no requests for pages from other domains intervening between page requests. In other words, a session ends when someone goes to another site, or 30 minutes elapse between page views, whichever comes first. A visit ends only after a 30 minute time delay. If someone leaves a site, then returns within 30 minutes, this will count as one visit but two sessions. In practice, most systems ignore sessions and many analysts use both terms for visits. Because time between pageviews is critical to the definition of visits and sessions, a single page view does not constitute a visit or a session (it is a "bounce"). A short sketch showing how visits and several of the derived metrics below can be computed from raw page views follows this list.
* First Visit / First Session - A visit from a visitor who has not made any
previous visits.
* Visitor / Unique Visitor / Unique User - The uniquely identified client
generating requests on the web server (log analysis) or viewing pages (page
tagging) within a defined time period (i.e. day, week or month). A Unique
Visitor counts once within the timescale. A visitor can make multiple visits.
Identification is made to the visitor's computer, not the person, usually via
cookie and/or IP+User Agent. Thus the same person visiting from two different computers
will count as two Unique Visitors. Increasingly, visitors are also uniquely identified by Flash LSOs (Local Shared Objects), which are less susceptible to privacy enforcement.
* Repeat Visitor - A visitor that has made at least one previous visit. The period between the last and current visit is called visitor recency and is measured in days.
* New Visitor - A visitor that has not made any previous visits. This
definition creates a certain amount of confusion (see common confusions below),
and is sometimes substituted with analysis of first visits.
* Impression - An impression is each time an advertisement loads on a user's
screen. Anytime you see a banner, that is an impression.
* Singletons - The number of visits where only a single page is viewed. While not a useful metric in and of itself, the number of singletons is indicative of various forms of click fraud, as well as being used to calculate bounce rate and, in some cases, to identify automatons (bots).
* Bounce Rate - The percentage of visits where the visitor enters and exits at
the same page without visiting any other pages on the site in between.
* % Exit - The percentage of users who exit from a page.
* Visibility time - The time a single page (or a blog, Ad Banner...) is viewed.
* Session Duration - Average amount of time that visitors spend on the site
each time they visit. This metric can be complicated by the fact that analytics
programs can not measure the length of the final page view.
* Page View Duration / Time on Page - Average amount of time that visitors spend on each page of the site. As with Session Duration, this metric is complicated by the fact that analytics programs cannot measure the length of the final page view unless they record a page close event, such as onUnload().
* Active Time / Engagement Time - Average amount of time that visitors spend
actually interacting with content on a web page, based on mouse moves, clicks,
hovers and scrolls. Unlike Session Duration and Page View Duration / Time on
Page, this metric can accurately measure the length of engagement in the final
page view.
* Page Depth / Page Views per Session - Page Depth is the average number of
page views a visitor consumes before ending their session. It is calculated by
dividing total number of page views by total number of sessions and is also
called Page Views per Session or PV/Session.
* Frequency / Session per Unique - Frequency measures how often visitors come
to a website. It is calculated by dividing the total number of sessions (or
visits) by the total number of unique visitors. Sometimes it is used to measure
the loyalty of your audience.
* Click path - the sequence of hyperlinks one or more website visitors follows
on a given site.
* Click - "refers to a single instance of a user following a hyperlink
from one page in a site to another". Some use click analytics to analyze
their web sites.
* Site Overlay is a technique in which graphical statistics are shown beside each link on the web page. These statistics represent the percentage of clicks on each link.
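The sketch below (plain JavaScript) shows how visits, bounce rate, page depth and frequency can be computed from raw page-view records, using the 30-minute rule from the Visit / Session definition above. The record format is an assumption made for the example:

// Page views are { visitorId, time } records (time in milliseconds), assumed to be
// sorted by visitor and then by time. A gap of more than 30 minutes starts a new visit.
var TIMEOUT_MS = 30 * 60 * 1000;

function buildVisits(pageViews) {
  var visits = [];
  var lastTime = {}; // time of the previous page view, per visitor
  pageViews.forEach(function (pv) {
    var prev = lastTime[pv.visitorId];
    if (prev === undefined || pv.time - prev > TIMEOUT_MS) {
      visits.push({ visitorId: pv.visitorId, pageViews: 1 }); // new visit starts here
    } else {
      visits[visits.length - 1].pageViews += 1; // same visit continues
    }
    lastTime[pv.visitorId] = pv.time;
  });
  return visits;
}

function metrics(pageViews) {
  var visits = buildVisits(pageViews);
  var uniqueVisitors = {};
  pageViews.forEach(function (pv) { uniqueVisitors[pv.visitorId] = true; });
  var singletons = visits.filter(function (v) { return v.pageViews === 1; }).length;
  return {
    visits: visits.length,
    bounceRate: singletons / visits.length,                        // single-page visits / all visits
    pageDepth: pageViews.length / visits.length,                   // page views per visit
    frequency: visits.length / Object.keys(uniqueVisitors).length  // visits per unique visitor
  };
}

// Example: visitor 'a' views two pages five minutes apart, then returns hours later;
// visitor 'b' views a single page (a bounce).
console.log(metrics([
  { visitorId: 'a', time: 0 },
  { visitorId: 'a', time: 5 * 60 * 1000 },
  { visitorId: 'a', time: 5 * 60 * 60 * 1000 },
  { visitorId: 'b', time: 0 }
])); // visits: 3, bounceRate: ~0.67, pageDepth: ~1.33, frequency: 1.5

In practice, of course, the raw data comes from the web server log files or from a page tag like the one described earlier, and visitor identification relies on cookies and/or IP+User Agent as noted above.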
Google
Penalty Advice
Finding
the Causes of a Sudden Drop in Ranking
Checking for Google penalties with any degree of certainty can be difficult. For example, a sudden reduction in ranking for your website's main keyword terms can be caused solely by a Google algorithm change or a search results (SERP) update.
[Figure: Google penalty example, shown using Google Analytics data]
When any
algorithm change or Google SERP update is released, there are always winners
and losers, and when a sudden drop in rankings is experienced Google penalties
are often incorrectly blamed.
However, where the traffic reduction from Google non-paid search is very extreme, as in the figure above (from Google Analytics data: Traffic Sources > Search Engines > Google), then a penalty is much more likely.
There are
a growing number of Google filters now built into the Google algorithm which
aim to detect violations of Google Webmaster Guidelines in order to help
maintain the quality of Google's search results (SERP) for any given query. One
such algorithmic filter is thought to have caused the massive drop in Google traffic pictured above.
Link
Devaluation Effects
When considering the cause of a ranking reduction, it's worth noting that Google continually applies link devaluation to links from various non-reputable sources that it considers spammers are exploiting to artificially raise the ranking of their sites. Hence, continual Google algorithm tweaks are being made in an effort to combat link spam.
When link devaluation is applied, as it has been with reciprocal links as well as links from many paid link advertisements, low quality web directories and link farms, reductions in Google ranking may occur, affecting the recipient site of the links. The severity of the ranking reduction is usually proportional to the website's reliance on that particular type of linking.
There's no doubt that do-follow blog links and low quality web directory links have also been devalued and that this has led to reduced rankings for sites which got a significant number of backlinks or site-wide links from do-follow blogs or directories. In addition, backlinks from sites with unrelated themes are also experiencing Google devaluation - so if your site relies heavily on these links, then it too may experience a sudden drop in Google rankings.
If you
suspect a Google penalty, it first makes sense to check whether any Google
algorithm changes have been made which could be the cause of the problem. SEO
Forum posts reflecting algorithm changes usually appear on the SEO Chat Forum
soon after the effects of any update are felt.
That said, if your website suffers a sudden and dramatic fall in ranking and no Google algorithm changes have been made, then a Google penalty or filter may be the cause, especially if you have been embarking on activities which might have contravened Google Webmaster Guidelines. The most severe Google penalties lead to total website de-indexing, and where the SEO misdemeanour is serious a site ban may be imposed by Google, accompanied by a Page Rank reduction to 0 and a greyed out Google Toolbar Page Rank indication. Google filters are less extreme, but can still be extremely damaging to a company's profits.
Whatever the cause, recovering from a Google penalty or filter is a challenge, and our SEO checklist will help identify likely causes and reasons for a sudden reduction in Google ranking or a major drop in SERPS position for your main keywords.
Initial
Test for a Penalty
When a penalty is suspected, start by checking with Google the number of URLs it has indexed. This can be done using the site:yourdomainname.com command within a Google search window. If no URLs are indexed and no backlinks show up when link:yourdomain.com is entered, then there is a high probability of a Google penalty, especially if your site used to be indexed and used to show backlinks.
Another indicator of a Google penalty is ceasing to rank for your own company name, where previously you ranked well for your own brand name. The exception to this rule is a new website with few backlinks, which may not be Google indexed since it is still waiting to be crawled. Such websites frequently show no backlinks, but this doesn't imply they have received a Google penalty!
Not all
Google penalties result in a loss of Page Rank. For example, various Google
filters can be triggered by unnatural irregularities in backlinks (detected by
the clever Google algorithm) or by excessive reciprocal link exchange,
particularly when similar keyword-optimized anchor text is used in your links. A typical result is a sharp reduction in website traffic caused by the Google SEO penalty.
Another good indication that a site is under penalty is to take a unique paragraph of text from a popular page on the affected site and search for it in Google. If the page doesn't come back as #1 and the page is still showing as cached using cache:www.mydomain.com/page.htm, then this is a good indication that a penalty or filter has been placed on the domain.
To avoid
a Google penalty or SERPS filter, take particular care when embarking on any
link building program. In particular, avoid reciprocal link exchange becoming the mainstay of your SEO campaign.
If you
suspect your website has received a Google penalty, you can contact Google by
sending an e-mail to help@google.com to ask for help. They will usually check
the spam report queue and offer some form of assistance.
Interestingly,
in a recent move by Google, web sites which are in clear violation of Google's
webmaster guidelines or terms of service may receive an e-mail from Google
advising them to clean up their act, warning of a penalty and website
de-indexing. When the breach of Google's terms (e.g. link spam or hidden text)
is removed from the offending site, Google will usually automatically clear the
penalty and re-index the site as many so-called penalties are actually
'filters' triggered by irregularities found by Google's algorithm.
Google
Penalty Checklist
If your
website has suffered a Google penalty, some free SEO advice to help identify
the cause and solve the problem is provided below. Once you have identified the
cause of the problem, we suggest watching the Google reconsideration tips video
to help prepare a successful reconsideration request to Google.
For
further assistance with Google penalties contact us for professional help.
Linking
to banned sites
Run a
test on all outbound links from your site to see if you are linking to any
sites which have themselves been Google banned. These will be sites which are
Google de-listed and show Page Rank 0 with a greyed out Toolbar Page Rank
indicator.
Linking
to bad neighborhoods
Check you
are not linking to any bad neighbourhoods (neighborhoods - US
spelling), link farms or doorway pages. Bad neighbourhoods include
spam sites and doorway pages, whilst link farms are just pages of links to
other sites, with no original or useful content.
If in
doubt, we recommend quality checking all of your outbound links to external
sites using the Bad Neighbourhood detection tool. Whilst this SEO
tool isn't perfect, it may spot "problem sites". Another good tip is
to do a Google search for the HTML homepage title of sites that you link to. If
the sites don't come up in the top 20 of the Google SERPS, then they are almost
certainly low trust domains and linking to them should be avoided.
Automated
query penalty
Google
penalties can sometimes be caused by using automated query tools which make use
of Google's API, particularly when such queries are made from the same IP
address that hosts your website. These tools break Google's terms of service
(as laid out in their Webmaster Guidelines). Google allows certain automated
queries into its database using its analytic tools and when accessing through a
registered Google API account. Unauthorized types of automated query can cause
problems, particularly when used excessively.
Over
optimization penalties and Google filters
These can
be triggered by poor SEO techniques such as aggressive link building using the
same keywords in link anchor text. When managing link building campaigns,
always vary the link text used and incorporate a variety of different keyword
terms. Use a back link anchor text analyzer tool to check back links for
sufficient keyword spread. Optimizing for high paying (often abused) keywords like "Viagra" can further elevate risk, so mix some long tail keywords into the equation. For brand new domains, be sensible: add a few one-way back links a week and use deep linking to internal website pages, rather than just building homepage links. Above all, always vary your link anchor text to incorporate different keywords, not variations on the same keyword!
There is
strong evidence that Google has introduced some new automatic over optimization
filters into their algorithm. These seem to have the effect of applying a
penalty to a page which has been over optimized for the same keyword by link
building. See Google filters for more information or contact KSL Consulting for
assistance (fees apply).
Website
cross linking & link schemes
If you run more than one website and the Google penalty hits all sites at the same time, check the interlinking (cross linking) between those sites. Extensive interlinking of websites, particularly if they are on the same C Class IP address (same ISP), can be viewed as a "link scheme" by Google, breaking their terms of service. The risks are even higher where site A links site-wide to site B and site B links site-wide back to site A. In addition, link schemes offering paid link placement in the footer section of webpages (even on high Page Rank pages) are detectable search engine spam and are best avoided.
Site-wide
links should also be avoided at all costs. The reality is that site wide links
do little to increase site visibility in the Google SERPS, nor do they improve
Page Rank more than a single link, as Google only counts one link from a site
to another. KSL Consulting also believe that Yahoo! now applies a similar
policy. There is some evidence that the extensive use of site-wide links can
lower website Google trust value, which can subsequently reduce ranking.
Duplicate
Content problems
Whilst
duplicate content in its own right is not thought to trigger Google penalties,
it can be responsible for the non-indexation of website content and for placing
all duplicate web pages into Google's supplemental index, which results in
pages not ranking in the Google SERP. This can result in significant traffic
loss to a site, similar to that caused by a penalty.
Google
will not index duplicate content and any site which utilizes large amounts of
content (like news feeds/articles) featured elsewhere on the web will likely
suffer as a result.
Hidden
text or links
Remove
any hidden text in your content and remove any hidden keywords. Such content
may be hidden from view using CSS or alternatively, text may have been coded to
be the same colour as the page background, rendering it invisible. These risky
SEO techniques often lead to a Google penalty or web site ban and should be
removed immediately. The same applies to hidden links, which Matt Cutts has
openly stated break their webmaster guidelines.
Keyword
stuffing (spamming)
Remove
excessive keyword stuffing in your website content (unnatural repetitions of
the same phrase in body text). Always use natural, well written web copywriting
techniques.
Check for
Malware Problems
It is
worthwhile carrying out a check to see if Google has blacklisted your site as
unsafe for browsing. To assess whether this is the case visit
www.google.com/safebrowsing/diagnostic?site=mydomain.co.uk, replacing
'mydomain.co.uk' with your domain.
Automated
page redirects
Check for the use of automated browser redirects in any of your pages. Meta Refresh and JavaScript automated redirects often result in Google penalties, as the pages using them are perceived to be doorway pages. This technique is especially dangerous if the refresh time is less than 5 seconds. To avoid Google penalties, use a 301 redirect or mod_rewrite technique instead of these methods. This involves setting up a .htaccess file on your web server.
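For example, on an Apache web server the .htaccess file can issue a permanent (301) redirect like this (the file names and domain are illustrative):

# .htaccess on an Apache server: permanently redirect the old page to the new one.
Redirect 301 /old-page.html http://www.example.com/new-page.html

# The same thing using mod_rewrite:
RewriteEngine On
RewriteRule ^old-page\.html$ http://www.example.com/new-page.html [R=301,L]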
Link
buying or selling
Check for any paid links (i.e. buying text links from known link suppliers / companies). There is some evidence that buying links can hurt rankings, and this was implied by comments from Matt Cutts (a Google engineer) on his Google SEO blog. Matt states that Google will also devalue links from companies selling text links, such that they offer zero value to the recipient in terms of improving website rankings or Page Rank. More recently, Google applied a Page Rank penalty to known link sellers and many low quality directories.
Reciprocal
link building campaigns
Excessive
reciprocal linking may trigger a Google penalty or cause a SERPS filter to be
applied when the same or very similar link anchor text is used over and over
again and large numbers of reciprocal links are added in a relatively short
time.
The dangers are made worse by adding reciprocal links to low quality sites or websites which have an unrelated theme. This can lead to a back link over optimization penalty (known as a BLOOP to SEO experts!). A Google back link over optimization penalty causes a sudden drop in SERPS ranking (often severe). To avoid this problem, reciprocal link exchange should only be used as part of a more sustainable SEO strategy which also builds quality one way links to original website content.
Adding
reciprocal links to unrelated sites is a risky SEO strategy, as is reciprocal
link exchange with low quality websites. To help identify quality link exchange
partners we use a simple but effective test - regardless of indicated Page
Rank, if you can't find a website's homepage in the top 20 of the Google search
results (SERPS) when you search for the first 4 words of a site's full HTML
title (shown at the top of the Internet Explorer window) then undertaking
reciprocal link exchange with that site may offer few advantages. Don't forget
to check that prospective reciprocal link partners have a similar theme to your homepage too.
Paid
links on Commercial Directories
Some leading online web directories offer paid placement for multiple regions, whereby a link to your website appears on many pages of the directory with keyword-optimized anchor text, and these links are search engine accessible (i.e. they have no "nofollow" attribute).
If you
have optimized the same keyword elsewhere in your SEO campaign, adding hundreds
of links from commercial directories with the same or similar anchor text in a
short space of time can cause serious problems. In extreme cases we've seen
these kinds of directory links trigger a Google filter.
Thin
Affiliates and "Made for Adsense" sites
It's a well known fact that Google dislikes affiliate sites with thin content, and the same applies to "made for Adsense" sites. Always make sure affiliate sites have quality original content if you don't want to get them filtered out of the search results when someone completes a Google spam report. We have had personal experience of affiliate sites acquiring a Google penalty, so don't spend time and money on SEO for such sites without the right content.
Content
Feeds and I-Frames
Whilst
content feeds (including RSS) are widely used on the web, there is some
evidence that pulling in large amounts of duplicate content through such feeds
may have an adverse effect on ranking and in extreme cases may trigger a Google
penalty. In particular, the use of I-frames to pull in affiliate content should
be avoided where possible. Consider the use of banners and text links as an alternative.
Same
Registrant Domains
As Google
has access to the WHOIS records for domains and is known to use this
information, it is possible that a penalty applied to one website may reduce
the ranking of other websites with the same registrant, although most filters
only affect one domain.
Check
Google Webmaster Guidelines
Read the Google Webmaster Guidelines and check website compliance in all respects. Since early 2007, Google may alert webmasters who it feels might have unknowingly broken the guidelines, via the Google Webmaster Console, advising them that their site has been removed from Google for a set period of time due to breaking one or more of Google's Webmaster Guidelines.
However,
blatant spam or significant breaches of Google's rules will often result in a
site being banned, with no Webmaster Console notification. Where notification
of a violation of Google's guidelines is received, it usually encourages the
webmaster to correct the problem(s) and then submit a Google re-inclusion request
(now referred to as a 'reconsideration request' in Webmaster Tools). From my
experience, after this is done the website will usually regain its original
ranking in around 14 days, assuming that all violations of Google's terms and
conditions have been resolved.
Google
Webmaster Tools
According to Matt Cutts's blog, Google is improving webmaster communication with respect to banned sites and penalties. Google is now informing some (but not all) webmasters of the cause of a website ban or penalty, via their excellent new Webmaster Console. In addition, a Google re-inclusion request can be made from the same interface. For this reason, if you've been hit by a web site ban or penalty, it is worthwhile signing up for Google Webmaster Tools, uploading an XML Sitemap onto your site and then checking site status in the Google Webmaster Console. This is an easy 15 minute job and may help to identify the cause of, and fix for, the problem!
Preparing
Your Site for Google Reconsideration
Google
recently prepared a Google reconsideration video tutorial on how to create a
good reconsideration request, including tips on what Google look for when
assessing the reinclusion of any website. The video tutorial is presented by
actual members of Google's reconsideration team and is very helpful to any
webmaster looking to successfully prepare a reconsideration request.
Google
SERP Filters
There is
clear evidence that over-optimizing a single keyword through adding too many
back links and site-wide links can result in triggering a Google filter whereby
the recipient page of these links no longer ranks in the organic SERP for the
keyword being optimized.
Affected page(s) appear to still be Google indexed and cached. The Google Trust Rank of
the website may be slightly affected leading to a ranking reduction for other
keywords. Interestingly though, affected websites can retain ranking for other
long tail keywords which have not been over optimized, particularly on pages
which have not been subject to aggressive link building, but may have one or
two decent natural links.
One other
fact worth noting is that affected pages seem to have high keyword density to
the point of being over-optimized. In some cases changes to increase page
keyword density for the problem keyword may have been made shortly prior to the
Google filter being applied.
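Keyword density here simply means the number of times the keyword or phrase appears divided by the total number of words on the page. A rough way to check it in JavaScript (the example text and keyword are made up):

// Rough keyword density check: occurrences of the phrase divided by total words.
// For a single-word keyword this is the classic occurrences / total words ratio.
function keywordDensity(bodyText, keyword) {
  var words = bodyText.toLowerCase().split(/\s+/).filter(Boolean);
  var phrase = keyword.toLowerCase().split(/\s+/);
  var hits = 0;
  for (var i = 0; i + phrase.length <= words.length; i++) {
    if (phrase.every(function (w, j) { return words[i + j] === w; })) hits++;
  }
  return hits / words.length;
}

console.log(keywordDensity('cheap widgets and more cheap widgets for sale', 'cheap widgets')); // 0.25 - 2 matches in 8 words

There is no single "safe" density figure; the point is simply to spot pages where one phrase is repeated far more often than natural copy would allow.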
In the
cases observed, the websites still rank for their company name and pages still
show in the Google index (using the site:domain.com command). However, picking
a sentence of text from the affected page and searching for it in Google
yielded no results. It is therefore fair to assume that the filtered page was
all but removed from the index as far as its ability to rank - even for
long-tail keywords, although it still showed as being Google cached
(cache:domain.com/page).
To assess whether your website is affected by a Google SERP filter, do a site-wide back link anchor text analysis using Majestic SEO (free) or a paid SEO tool like SEOmoz Linkscape, and check that the spread of keywords used in links to your page looks natural. Check your keyword density too, excluding Meta tags. Google is tightening up on link spam in a big way; be warned!
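A back link anchor text report is essentially a list of anchor texts and how many links use each one; checking the spread can be as simple as looking at the share taken by the single most common anchor. A small sketch, assuming you have exported such a list from whichever backlink tool you use (the row format and figures are made up):

// Rough anchor text spread check on an exported backlink report.
// Input rows are { anchor, count } pairs - an assumed format, not tied to any tool.
function anchorTextSpread(rows) {
  var total = rows.reduce(function (sum, r) { return sum + r.count; }, 0);
  var top = rows.slice().sort(function (a, b) { return b.count - a.count; })[0];
  return { topAnchor: top.anchor, topShare: top.count / total };
}

console.log(anchorTextSpread([
  { anchor: 'cheap widgets', count: 180 },
  { anchor: 'Acme Widgets Ltd', count: 40 },
  { anchor: 'www.example.com', count: 30 }
])); // { topAnchor: 'cheap widgets', topShare: 0.72 } - 72% on one keyword phrase looks unnatural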
Check for
a Total Google Website Ban
If you've used unethical black hat SEO techniques, your website could be Google banned and consequently totally de-indexed. If your site no longer shows any pages indexed when the site:www.yourdomain.com command is used in Google (and it was previously indexed), then your site may have received the most extreme form of penalty - a total Google ban. Check for possible causes using the free SEO advice contained in our penalty checklist above.
Google
Penalty Recovery Strategy
Recovering
from a Google penalty normally involves fixing the cause of the problem and
then waiting for Google to remove any over optimization penalties or SERPS
filters. To fully recover Google ranking may take around 2-3 months after all
website problems are corrected, although we have seen penalty recovery in a
matter of weeks following full and thorough resolution of the Google Webmaster
Guidelines infringements.
The
Google algorithm can automatically remove penalties if the affected website is
still Google indexed. To check whether a particular website is still Google
indexed, refer to our Google indexing page. If your website has been Google
de-indexed and lost Page Rank, then you will need to make a Google re-inclusion
request. Where the reason for the penalty is clear, it helps to provide details
of any changes you've made to correct violations of the Google Webmaster
Guidelines.
The best
recovery strategy from any Google penalty is to thoroughly familiarize yourself
with Google Webmaster Guidelines and also check the SEO Chat Forum for threads
surrounding any recent Google algorithm changes and to evaluate recent changes
made to your website prior to the sudden drop in Google ranking. Don't forget
to check your link building strategy as poor SEO often causes Google penalties.
Start by removing any reciprocal links to low quality websites, or sites having
no relevance to your website theme.
Preparing
for a Google Re-Inclusion (Reconsideration) Request
We
recommend you start by watching the Google reconsideration tips video.
If your
site has been de-indexed due to a Google penalty, correct the problem and then
apply to be re-included in the Google index by submitting a Google re-inclusion
request from your Webmaster Tools account. More information about this is
provided in Google Webmaster Help. Google refer to this process as making a
"reconsideration request" which is now submitted from your Webmaster
Tools login.
How long
does site reconsideration take?
By
submitting a reconsideration request to Google you enter the queue for the
manual review process whereby your site is manually checked for violations of
Google's Webmaster Guidelines. This can take several weeks. At the end of the
process, an Inbox message is usually sent to the Webmaster to confirm that the
reconsideration has been processed. This will be visible by logging into
Webmaster Tools and then checking your Inbox under 'Messages'.
Guerrilla Marketing (Viral Marketing)
Viral
marketing and viral advertising are buzzwords referring to marketing techniques
that use pre-existing social networks to produce increases in brand awareness
or to achieve other marketing objectives (such as product sales) through
self-replicating viral processes, analogous to the spread of viruses or computer viruses. It can be delivered by word of mouth or enhanced by the network effects
of the Internet. Viral promotions may take the form of video clips, interactive
Flash games, advergames, ebooks, brandable software, images, or even text
messages.
The goal of marketers interested in creating successful viral marketing programs is to identify individuals with high Social Networking Potential (SNP) and create viral messages that appeal to this segment of the population and have a high probability of being passed along.
The term "viral marketing" has also been used pejoratively to refer to stealth marketing campaigns - the unscrupulous use of astroturfing online, combined with undermarket advertising in shopping centers, to create the impression of spontaneous word-of-mouth enthusiasm. Viral marketing works by using social media and other channels of communication to spread planned content, aiming to reach the target audience in the most efficient and friendly manner. In short, the idea spreads from person to person.
Email
Marketing
E-mail
marketing is a form of direct marketing which uses electronic mail as a means
of communicating commercial or fund-raising messages to an audience. In its
broadest sense, every e-mail sent to a potential or current customer could be
considered e-mail marketing. However, the term is usually used to refer to:
* sending e-mails with the purpose of enhancing the relationship of a merchant
with its current or previous customers, to encourage customer loyalty and
repeat business,
* sending e-mails with the purpose of acquiring new customers or convincing
current customers to purchase something immediately,
* adding advertisements to e-mails sent by other companies to their customers,
and