
How Google Plus Profiles & Pages Gain Search Authority

Posted by admin | Posted in Blogger, Case Studies, Google | Posted on 29-11-2013


At SMX East this past October, I gave a presentation titled, “Putting the SEO Power of Google+ to Work.” The centerpiece of that presentation was a first peek at a study I’d conducted that seemed to confirm my hypothesis that Google+ profiles and pages gain authority for ranking in Google Search (and elsewhere in Google, as you’ll see below) in much the same way regular Web pages do.

In this article, I’m going to expand upon that presentation and lay out the full case.

Profile Ranking Mysteries: A Personal Case Study

I’ve been active on Google+ since its third day. Thanks to relationships I’d already established through Google Buzz and other social networks, I quickly built a network with many influential Google+ users. By about eight months in, I had a respectable 10,000 followers, my highest follower count ever for a social network, but hardly a Google+ superstar.

But around then, I began noticing a strange power in my profile. If I reshared a Google+ post by a highly-followed G+ user, and then checked Google Search a day or two later (logged out of Google, cache and history cleared), time and again my reshare would be the highest-ranking Google+ post for the title of the original post. In other words, I was often out-ranking users with many times my follower count for their own posts in search.

Here’s an example from March 2012 that will help illustrate the phenomenon.

On March 5, I reshared on Google+ a post by popular social media speaker and author Mari Smith:

Google Plus: Fastest Growing Facebook Pages

Notice that when you reshare someone else’s post on Google+, the entire text of the original post is embedded in the reshare and becomes part of the reshare. Also take note of the keyword phrase in the original post.

Below is a screen capture of the actual, logged-out-of-Google top search results for [fastest growing Facebook pages] just one day after my reshare of Mari Smith’s post:

Google Plus Search Result for Facebook Pages

Note that my Google+ post, not Mari Smith’s, was the top ranking Google+ post (indeed, the top ranking non-news post, period) for [fastest growing Facebook pages] at that time, even though the original post was by Mari Smith. As you can see, Google even grabbed the indexed title tag text from the first line of Mari’s post, not mine.

But here’s the more startling fact: At the time, Mari Smith had 60,000 Google+ followers, six times as many as I had.

This repeated itself again and again: I could very often outrank people who had far more followers than I did, for their own posts, when I reshared them. At that time, most people assumed that the more followers you had, or the more +1s your posts got, the higher you would rank in search. But I was able to show that neither was necessarily the case.

This continues to this day. I’m still able (not always, but more often than not) to outrank many highly-circled Google+ users in Google Search with my reshares of their original posts, even when the original post got a lot more +1s than mine did.

So what actually did (and still does) cause some profiles to rank higher than others in Google search? If not follower count or +1 count, what was the magic factor?

Discovery: Profile & Page PageRank Authority

In the early days of Google+, a number of SEO-savvy users noticed that their profiles showed Google PageRank in toolbar PageRank tools. But then one day, toolbar PageRank stopped showing in those tools. Most assumed that Google had changed its mind about assigning PageRank to Google+ profiles.

That was, until alert SEO Joshua Berg discovered it had been there all along. Working off a suggestion by Dan Petrovic that a change in URL structure by Google+ had caused most PageRank-checking tools to show zero, Joshua revealed one tool, prchecker.net, that could properly parse the URLs and show that Google+ profiles did indeed still have Google PageRank. (Tip: The new Google+ custom URLs will not work in that tool. You must right-click on a user’s name anywhere on G+, and copy and use that URL.)

Finally, I had a possible explanation for my super-ranking abilities. I began to test, and sure enough, in case after case, if the person whose post I reshared had a lower, or at most equal, PageRank compared to mine, I could outrank them in search for the same post keyword. There were always anomalies, but not so many that they couldn’t be explained by either the imprecision of toolbar PageRank or the fact that other factors undoubtedly come into play in post ranking that a 0-10 PageRank scale can’t capture.

By the way, we soon discovered that in addition to profiles, Google+ pages and communities also have their own PageRank.

But questions remained: Where does this PageRank come from? How does a Google+ profile or page earn PageRank (and more importantly, the search authority it represents)?

Sources Of Google+ PageRank

Below are the internal and external sources of Google+ PageRank, followed by an initial experiment and then a more extensive study on Google+ PageRank External Links.

Internal Sources

Since, at least on its surface, Google+ is a social network, it stands to reason that a primary source from which Google would assess profile authority would be connections within the network itself.

We know that Google uses links from regular pages on the Web as a primary means of assessing the relative authority of the pages to which they point. Google+ profiles and pages interlink with each other as well, so it makes sense that Google would use a similar strategy for Google+.

So, it’s likely that mentions, reshares, and perhaps other engagement from other users help build the PageRank authority of a profile. When you +mention someone on Google+ (type a + and their name), it creates a followed link to that person’s profile.

Google Plus internal PageRank Flow

If the users mentioning you, resharing your posts, or otherwise engaging with you have high-authority profiles, then most likely they pass on more authority to your profile. Therefore, the more Google+ power users you network with, the higher your profile’s PageRank will probably climb.
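The mention-based authority flow described above can be sketched with a toy power-iteration PageRank. This is purely illustrative: the profile names, the graph, and the damping factor are assumptions for the sketch, not data from the study, and Google's actual algorithm is certainly far more complex.

```python
# A toy power-iteration PageRank over a hypothetical "+mention" graph.
# Everything here (names, graph, damping factor) is an illustrative assumption.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each profile to the list of profiles it +mentions."""
    nodes = set(links) | {t for targets in links.values() for t in targets}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for source, targets in links.items():
            for t in targets:
                # Each +mention is a followed link that passes on a share
                # of the mentioning profile's own authority.
                new[t] += damping * rank[source] / len(targets)
        rank = new
    return rank

# One well-connected profile mentions "me"; a profile nobody links to
# mentions "other".
mentions = {
    "fan1": ["power_user"],
    "fan2": ["power_user"],
    "fan3": ["power_user"],
    "fan4": ["power_user"],
    "power_user": ["me"],
    "quiet_user": ["other"],
    "me": [],
    "other": [],
}
ranks = pagerank(mentions)
# "me" ends up with more authority than "other", even though each
# received exactly one mention -- because the mention came from a
# profile that itself has high authority.
```

This mirrors the observation above: who mentions you matters more than how many do.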

External Sources

In addition to the internal authority flow within Google+, if Google has assigned PageRank to G+ profiles, pages, and communities, then it seems likely that links from Web pages external to Google+ also help build the authority of those G+ entities.

Google Plus external PageRank flow

In the illustration above, a blogger who interviewed me linked to my Google+ profile in her blog post. That followed link should send some PageRank authority from her site to my profile.

An Initial Experiment

When I first noticed that some Google+ Communities were ranking in search for their own names, I decided to launch an experiment. We got articles written about two up-and-coming Google+ communities on several high-authority blogs, with anchor text links with each community’s name linking to the communities. Within a few days, we saw the Fitness & Nutrition community move from page 8 (logged-out search) to page 1 for [fitness and nutrition], and the Google Authorship and Author Rank community move from page 4 to #1 on page 1 for [authorship and author rank].

The fitness community has since dropped down (although it remains #1 for [fitness and nutrition community]), while the Authorship community retains its #1 ranking at the time of this publication. I attribute the staying power of the latter community to the fact that it now has 320 external links from 41 domains pointing at it.

Those experiences made me want to test further to try to confirm my hypothesis that external links help determine the PageRank authority of Google+ profiles, pages, and communities. So I launched a new, more extensive study.

Google+ PageRank External Link Study

I decided to examine a range of profiles and pages at all levels of PageRank to see if there were any correlation between the strength of their external link profiles and their PageRank. I am heavily indebted to Paul Shapiro, who volunteered to collect the necessary data.

Some necessary background before we get to the results:

  • Toolbar PageRank has not been updated since February 2013.
  • Our profile backlink data were collected in June 2013.
  • We are assuming that PageRank of the study set of profiles would not have changed significantly enough from February to June to throw off the results.
  • For backlink data, we used MajesticSEO, one of the few backlink tools that tracks Google+ profiles and pages.
  • We eliminated profiles or pages that had few or no backlinks, since those profiles would have earned their PageRank entirely from internal Google+ links, and we were testing for external link influence.

The distribution of our sample group of profiles and pages is probably a fair approximation of the average PageRank distribution across all active Google+ accounts. Most profiles that have been active for a while fall into the 2 to 4 range, and very few have a PR of 6 or higher. We have found only one (Google’s own page) with an 8, and none higher than that.

Here is the distribution of our sample group:

Google Plus PageRank distribution

PageRank Vs. Citation Flow

We first extracted the MajesticSEO Citation Flow for each of the sample profiles. Citation Flow is defined by MajesticSEO as “a number predicting how influential a URL might be based on how many sites link to it.” The chart below plots the average Citation Flow of our sample group against their PageRank scores. Citation Flow is on the left axis, and PageRank is along the bottom:

Google Plus profile citation flow chart

As you can see, citation flow seems to graph as we would expect if profile/page PageRank is indeed influenced by external links. The more sites linking to a profile, the higher its PageRank.

What about Trust Flow? MajesticSEO Trust Flow is “a number predicting how trustworthy a page is based on how trustworthy sites tend to link to trustworthy neighbors.” In other words, this metric attempts to gauge the value of the links pointing toward (in this case) a profile. A higher Trust Flow number indicates that the links pointing to the profile are mostly of the sort that a search engine would be more inclined to trust, and thus give more weight.

Here is how Trust Flow compared to PageRank in our sample group:

Google Plus profile trust flow chart

The curve is a bit bumpier, but still overall conforms to the expectations of our hypothesis. In general, the higher the trust level of the backlinks to a profile, the higher its PageRank.

Here are the two metrics, Citation Flow and Trust Flow, combined for comparison:

Google Plus profile citation and trust flow comparison chart

The curves seem to confirm our hypothesis: the strength of external backlinks to a Google+ profile or page has an effect on the profile’s or page’s PageRank, and thus, on its ability to rank for its content, both within Google+ and in Google Search.
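One way to put a number on how strongly the charted metrics track PageRank is a simple correlation coefficient. The values below are invented placeholders, not the study’s data; the sketch only shows the calculation one might run on the real averages.

```python
# Pearson correlation between PageRank level and average Citation Flow.
# The data points are hypothetical stand-ins, not the study's numbers.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

pagerank_levels = [1, 2, 3, 4, 5, 6, 7]
avg_citation_flow = [8, 14, 19, 27, 33, 41, 50]  # made-up averages per PR level
r = pearson(pagerank_levels, avg_citation_flow)
```

A coefficient near 1.0 would express quantitatively what the upward-sloping curves show visually.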

Profile PageRank & Google Authorship

It occurred to me that if my hypothesis about external backlinks was correct, then using Google Authorship should also have an effect on profile PageRank. Why? Because establishing Authorship for a piece of content, in most cases, involves placing a link from that content back to the author’s Google+ profile. It would seem to follow then, that profile owners who regularly create content on a diversity of sites using Google Authorship ought to have, on average, higher PageRank authority than those who do not.
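At the time, the usual Authorship markup was a followed link back to the author’s Google+ profile carrying rel="author". As a rough illustration, a page can be scanned for such links with Python’s standard-library HTML parser; the sample page and profile URL below are invented.

```python
# Find rel="author" links (the typical Google Authorship markup of the era)
# in an HTML page using only the standard library.
from html.parser import HTMLParser

class AuthorLinkFinder(HTMLParser):
    """Collects hrefs of <a> tags carrying rel="author"."""
    def __init__(self):
        super().__init__()
        self.author_links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "author" in attrs.get("rel", "").split():
            self.author_links.append(attrs.get("href"))

# Invented sample page for illustration only.
sample = '''
<p>By Jane Doe.
<a rel="author" href="https://plus.google.com/112345678901234567890">
Google+ profile</a></p>
'''
finder = AuthorLinkFinder()
finder.feed(sample)
```

Each such link on an external site is, in effect, one more backlink pointing at the profile.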

To test this, I selected a sample of 60 active profiles: 30 that actively use Authorship, posting regularly on a variety of sites, and 30 that (as far as I can tell) don’t. Other than that criterion, the profiles were selected randomly, without my looking at their PageRank in advance. The graph below shows the median and average PageRank for both sets of profiles:

Google Plus PageRank as a function of Authorship use

While I would hesitate to call this study conclusive, for this test sample at least, profiles that use Google Authorship average about one full PageRank level higher than those that do not. Once again, this is what we would expect to see if external links affect the authority of Google+ profiles.
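The median/average comparison behind the chart is straightforward to reproduce with the standard library. The scores below are invented stand-ins for the two 30-profile samples, chosen only to illustrate a one-level gap of the kind reported.

```python
# Compare central tendency of PageRank between two profile samples.
# The score lists are hypothetical, not the study's actual data.
from statistics import mean, median

authorship = [2, 3, 3, 4, 4, 4, 5, 5, 6]      # profiles using Authorship
no_authorship = [1, 2, 2, 3, 3, 3, 4, 4, 5]   # profiles not using it

gap_mean = mean(authorship) - mean(no_authorship)
gap_median = median(authorship) - median(no_authorship)
```

With these illustrative samples, both the mean and the median differ by one PageRank level.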

Final Thoughts

When Google set out to build Google+, the social network that they planned to make the “social layer” tying together all things Google, it’s not surprising that they baked into it some of their existing technology and expertise.

Their intention was that over time, Google+ pages and profiles would play an important role in helping to determine what should have more importance in various Google properties, and none are more important than Search. So Google gave profiles and pages (and now communities) the ability to have authority rankings in ways very similar to how Google evaluates “regular” sites and Web pages.

This means that Google+ profiles and pages build authority with Google by means not intuitive to most social media experts and analysts (which is why I believe so many of them have totally missed this).

What counts most is not necessarily how many followers or low-level social signals (e.g., +1s) one’s profile has. Rather, Google takes a much more sophisticated and nuanced look at the links and relationships between various entities and a profile.

As I have demonstrated here, those entities can be both internal to Google+ (strong relational linkages from other profiles and/or pages) and external (strong backlink profiles from regular websites).

Increasing Visibility & Influence In Google Search

That means that anyone who wants to make use of Google+ as part of an overall strategy of increasing visibility and influence in Google Search should be actively pursuing all of the following tactics:

  1. Build a strong network within Google+. You should seek to cultivate not just a large number of followers, but more importantly, active relationships and partnerships with influential Google+ users. Their citations to your profile have a powerful effect on its authority to Google.
  2. Cultivate quality links from trusted websites. Create the kind of presence on Google+ that site owners want to recommend and link to. When being interviewed or referenced by a site, ask if they would link to your Google+ profile or page for identity purposes.
  3. Use Google Authorship for your content across the Web. This tip applies only to personal profiles, but since Google Authorship involves a legitimate, Google-approved link to your profile, the more quality content you produce on trusted sites, the more authority flows to your profile.

Bing Integrates TripAdvisor Tools, Content Into Search Results

Posted by admin | Posted in Blogger, Social Media | Posted on 29-11-2013


Travel site TripAdvisor made two Microsoft-related announcements today. The company announced a Windows 8.1 app. But more significantly, TripAdvisor and Bing are now presenting TripAdvisor content and travel search capabilities within Bing search results pages.

Bing will now display TripAdvisor reviews and photos, as well as TripAdvisor’s hotel price comparison tool, in SERPs. TripAdvisor’s full content library, including restaurants, hotels and attractions, will reportedly be available to Bing.

TripAdvisor said this on its blog about the Bing integration:

In the U.S., we are partnering with Bing to embed TripAdvisor’s price comparison tools, traveler reviews and photos within search results on Bing.com. Now, when you search for somewhere to stay on Bing, you’ll have instant access to our community’s reviews and ratings without needing to leave the page. And, if you have specific dates in mind, you can see which hotels are available and at what price. You can go from thinking about where to go to being ready to book in a matter of seconds.

It’s not clear whether this partnership is the beginning of a broader third party content syndication program for TripAdvisor or whether this is an exclusive deal with Bing. I haven’t received an answer from TripAdvisor to that question yet.

Regardless, it’s not likely that Google and TripAdvisor would make a similar arrangement for several reasons — among them the fact that TripAdvisor is part of anti-Google lobbying organization Fairsearch.org as well as the fact that Google has its own competing travel products that it’s trying to promote.

Below are live screens I pulled a few minutes ago, based on screenshots sent to me by TripAdvisor public relations.

Bing + TripAdvisor


It wasn’t clear from the material I saw whether Bing search partner Yahoo will benefit from this deal and content. My suspicion is probably not. However, Yahoo could potentially negotiate its own deal directly with TripAdvisor.

TripAdvisor offers very useful content, which will greatly enhance Bing’s SERPs. Good reviews on TripAdvisor become even more important now given their exposure through Bing. However, I suspect there are no Bing ranking benefits coming to those well-reviewed on TripAdvisor.

The Curious Case Of Bing Search Results In Google Search Results

Posted by admin | Posted in Google, Social Media | Posted on 23-10-2013

0

Over the weekend, Bill Hartzer noticed that some Google searches returned Bing search results. As of last night, Google search results are once again Bing-free. What happened?

Taking a closer look, the Bing search results weren’t www.bing.com/search URLs, which are correctly blocked by Bing’s robots.txt file. They were coming from www.bing.com/entities/search. This pattern is not blocked, which is how the related URLs ended up indexed by Google. As for why those URLs are no longer indexed? Google may have noticed and pulled them.
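The prefix-matching behavior described above can be checked with Python’s standard-library robots.txt parser. The two-line robots.txt below is a simplified stand-in for Bing’s actual file, capturing just the rule at issue: a Disallow on /search matches by path prefix, so /entities/search is untouched.

```python
# Demonstrate why blocking /search does not block /entities/search:
# robots.txt Disallow rules match by URL path prefix.
from urllib.robotparser import RobotFileParser

# Simplified stand-in for Bing's robots.txt at the time.
rules = [
    "User-agent: *",
    "Disallow: /search",
]
rp = RobotFileParser()
rp.parse(rules)

blocked = rp.can_fetch("*", "https://www.bing.com/search?q=cable+television+seattle")
crawlable = rp.can_fetch("*", "https://www.bing.com/entities/search?q=cable+television+seattle")
```

The first URL is disallowed; the second is fetchable, which is exactly how the /entities pages ended up in Google’s index.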

But what are these /entities URLs? They seem to be a hybrid of map results and search results. Take a look, for instance, at this Bing search for [cable television seattle].

Bing Search Results

The first few listings (after the ads) are web results, with a map on the right. The link to “cable television” (circled above) is to an /entities page.

Scrolling down below the fold, there are local listings and a “see all business listings” link (also pointing to the /entities page), followed by more web results.

Bing Search and Local

That /entities page is slightly different from the regular web search results (a larger map, more local listings, and web search results above or below the business listings), and yet not exactly like the Bing Maps page. A “Local” tab is highlighted (a tab that isn’t available in regular web search or Maps search).

Bing Entity Search

For reference, here are the web results at the bottom of the page (missing from the regular maps results page).

Bing Entity Web Listings

Arguably, these pages are basically Bing search results, which Google doesn’t want to index. As Google notes in their webmaster guidelines:

“Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don’t add much value for users coming from search engines.”

Google wants to send searchers to an answer, not to more search results (for that matter, that’s what Bing wants to do too). Google (and not to make it weird or anything, but by Google I mean, in part, me) started talking about this back in 2007. In this case, Bing hasn’t yet added /entities to their robots.txt file, but Google appears to have removed the pages (and fairly quickly; yesterday, over 30,000 URLs were indexed). Google has noted before that they may remove these types of pages from their index if the pages don’t provide additional value beyond the aggregation of listings.

How do you add value to search results pages? Give the user a reason to visit that page first. Do the Bing pages do that? The yellowpages.com listing just above where the www.bing.com/entities result was is still there. Isn’t it a search results page too?

Google Bing Listings

It’s hard to say. Both pages include data beyond the web listings, including address, phone number, and ratings.

Yellow Pages Example

 


The question of how to add value to these types of pages is an ongoing challenge and it’s clearly a work in progress for search engines too.


The Double Serving Myth: When One Company Monopolizes PPC Ad Results

Posted by admin | Posted in Digital Marketing Practices, Pay Per Click | Posted on 23-10-2013


Here are two scenarios that are currently keeping many PPC advertisers up at night — maybe you’re one of them:

  1. A parent company buys up a significant portion of the competitors in a sector and, with the backing of large budgets, advertises each site on AdWords and Bing Ads, shutting out most of the competition.
  2. A parent company launches several brands in the same sector and advertises them all with the same keywords on AdWords and Bing Ads, shutting competition out of the auction.

One player monopolizing ad share happens more often than you may realize, often devastating the once-successful paid search performance of the smaller advertisers left in its wake.

But aren’t Double Serving policies written to prevent this type of monopolization from happening? It turns out, not really. Google and Microsoft’s policies don’t seem to protect smaller advertisers from either of these scenarios.

Real World Examples

Let’s look at two situations happening today:

J2 Global Communications has acquired the bulk of the websites that offer online faxing services, including eFax.com, Fax.com, MyFax.com and SmartFax.com. The company has been accused of taking aggressive measures to crush competition. But does the company’s paid search strategy violate double serving policies?

J2 sites double serving

Below are screenshots of recent search results for “online faxing” on both Google and Bing. The ads in red boxes are by sites operated by J2.

AdWords Double Serving

Bing Double Serving

Now, here’s another example of a parent company operating and advertising sites within the same vertical. Build.com runs a network of sites including LightingDirect.com, HandleSets.com, VentingDirect.com and FaucetDirect.com. The company also owns additional sites within each of these verticals.

Build.com faucet sites double serving

Below is a screenshot showing how the company can monopolize the top ad spots for a search query — in this case, “faucets”. The company owns FaucetDirect.com, Build.com and Faucet.com.

Build.com Double Serving

On the face of it, the examples above appear to violate Google’s and Bing Ads’ double-serving policies. Upon closer reading, it is clear the search engines leave much to their own discretion.

Bing Ads Duplicate Ads Policy

Let’s look at Bing Ads’ policy for Duplicate Ads first (emphasis mine):

To provide the best possible user experience, Microsoft reserves the right to disallow specific ads or sites for offering a redundant user experience if the search results are too homogeneous. For example, we may disallow ads that link to websites whose content is too similar.

Multiple ads from the same advertiser may be displayed if the target site for each ad has:

  • A separate, distinct brand
  • A unique look and feel
  • Different products or services

Notice the language does not explicitly say duplicate ads WILL be disallowed; Microsoft “reserves the right” and “may” disallow ads for sites it deems too similar. Microsoft also doesn’t mention anything about common ownership of the websites.

Looking at the bullet points in the policy: the various J2 and Build.com sites do have separate branding, and each has a look and feel distinct from the other sites. Bullet number three, “Different products or services”, is where things get fuzzy. Obviously, it’s in the best interest of the user to serve ads for sites offering faucets on a search for “faucets”. However, the inventory and pricing on Faucet.com and FaucetDirect.com are identical, so it seems unique branding and look and feel trump the different-products stipulation.

Microsoft was quick to issue a statement when I asked for clarification. A Microsoft spokesperson said,

“We remove specific ads or sites from showing on a query when they provide a redundant consumer experience. Multiple ads from the same advertiser are allowed provided that they are marketed as a distinct experience with a unique look and feel. We do not take into account parent or affiliate company relationships, as each of these sites have unique landing pages. For example, we would not block Amazon and Zappos from bidding on the same terms, despite the fact that they share the same parent company. For more information, please see our ad content and style guidelines.”

So it seems clear that both scenarios we’re looking at — J2 and Build.com — are not considered violations of Microsoft’s Duplicate Ads policy. As long as the branding and experience are different, Microsoft most likely won’t take issue with duplicate ad serving by a parent company that either acquires competing sites or launches sites that carry the same inventory within a vertical.

Google Double Serving

Google is where the real financial impact is felt by advertisers who find themselves squeezed out of the auction. Google’s double serving policy appears fairly straightforward; in practice, however, it’s anything but (again, emphasis mine):

To protect the value and diversity of the ads people see on Google, we generally discourage advertisers from running ads for the same or similar businesses across multiple accounts triggered by the same or similar keywords. This policy, known as “double serving,” prevents multiple ads from the same or commonly-owned company from appearing on the same search results page.

To comply with this policy, advertisers should avoid running ads from different accounts on the same or similar keywords that point to the same or similar websites. Violations of this policy occur when multiple websites share Common Ownership … plus when two or more of the following factors are present:

  • Common product offering: For physical goods being sold, the sites share products in common such that a user browsing the site would perceive little difference in inventory between the sites.
  • Similar pricing: When pricing is available on the sites, there’s a price difference between the sites of 25% or less for substantially the same product or service. When two or more sites solicit contact information from users in order to provide a custom quote, they will be considered to have zero price difference.
  • Similar customer support experience: The sites offer the same or similar type of product or service for which the customer can expect to receive the same or similar level of Support.
  • Brand: The sites have non-differentiated Brands for which either the brand name is the same or the logo is the same.

Again, note the lack of clear directives such as “prohibit” instead of “generally discourage” and “must” instead of “should” in the policy.
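For a sense of how the “25% or less” pricing factor might be applied in practice, here is a small sketch. The policy text doesn’t say which price serves as the baseline, so measuring the gap against the lower price is an assumption of this sketch, as are the example prices.

```python
# Sketch of Google's "similar pricing" factor: a price difference of
# 25% or less counts toward a double-serving violation.
def similar_pricing(price_a, price_b, threshold=0.25):
    """True if the two prices differ by <= threshold (default 25%).
    Assumption: the difference is measured against the lower price;
    the policy text doesn't specify the baseline."""
    low, high = sorted((price_a, price_b))
    if low == 0:
        return high == 0
    return (high - low) / low <= threshold

# Identical pricing (as on the Build.com sites) is clearly "similar".
same = similar_pricing(129.00, 129.00)
# A 40% gap falls outside the stated threshold.
different = similar_pricing(100.00, 140.00)
```

Note that per the policy, sites that only solicit contact details for a custom quote are treated as having zero price difference, so they would always satisfy this factor.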

Let’s look at how our two scenarios stack up against Google’s policy:

Common Ownership, check. In both cases, J2 and Build.com are transparent about their ownership of these websites. All sites are registered to their respective parent companies according to WhoIs records, and the footers on each of the sites include either the logo or registered name of the parent company.

Next, Common Ownership needs to be coupled with at least two of the bullet points to indicate “Violations of this policy”.

Common Product Offerings, check. Similar Pricing, check. In a spot check of pricing on the Build.com sites, the prices are identical, and the pricing on the J2 sites is within Google’s 25 percent threshold.

Similar Customer Support Experience is harder to pin down. These companies have likely consolidated their customer service teams (and this point applies to customer service, not the actual sales process), so it’s nearly impossible for Google to determine without proactively contacting the companies.

That leaves Brand as the sole differentiator among the sites and the only point on which the sites aren’t in clear violation of the policy. It seems that, despite how Google’s policy is written, and despite at least two of the criteria being met in these two cases, branding trumps everything.

No Clarification From Google

Unlike Microsoft, Google declined to issue a clarifying statement. This silence, more than anything, is what’s so maddening for advertisers who find themselves shut out of the auctions by larger parent companies. One advertiser, who declined to be named, told me that after they presented Google with their findings about the J2 ads several months ago, they were told, “We’ll look into it, but you won’t know what we decide one way or the other”.

Here’s what I have learned while researching this: Google won’t provide any further clarification or comment on the double serving policy. The company polices potential violations when and how it sees fit, at its own discretion. If an advertiser has reported a violation and nothing is done about it, they should interpret that as Google having decided it isn’t a violation, granting tacit approval.

In other words, the fact that the J2 and Build.com ads are still running means Google decided they aren’t in violation of the double serving policy. This, in turn, suggests that distinct branding and unique logos are the only aspects of sites that Google really considers when deciding whether ads violate the policy. And yet, even then, the policy is often not enforced.

David Veldt highlighted a number of apparent double serving violations on his Interactually blog in July. It’s still easy to find many of them. Here’s just one of his examples that I was able to replicate today. The Terminix “offers” site is not an affiliate; both Terminix domains are registered to parent company ServiceMaster in Memphis, TN. Both landing pages have the same $50-off offer.

Double Serving AdWords

This is just one case that violates every bullet on Google’s policy, and yet these ads have been running for months.

Mixed Signals Cause Even More Confusion

The reality is that the double serving policies weren’t designed to keep parent companies from advertising their various brands and sites. They were written to keep companies (of all sizes) from trying to cheat the system by cloning their sites or running various versions of landing pages in order to hoard all the ad spots. The real problem is that Google’s double serving policy causes more confusion and frustration than it should: in part because of the way it’s written, but in far larger part because the company seems uninterested in enforcing it with any uniformity.

Here’s one example in which the policy actually was enforced: In August, after an AdWords community contributor reported that a fairly small company was running ads for two sites — BritishFoodShop.com and BritishFood.com — Google did find it was in violation of the double serving policy. This despite the fact that the sites have different branding and design.

It raises the question: how is this situation any different from Faucet.com and FaucetDirect.com, other than the size of the company buying the ads?

BritishFood Double Serving

The advertiser that reported the violation finally got results by posting the case, with screenshots, on the AdWords Community forum. The complaint encapsulates what I’ve been hearing from advertisers who find themselves facing these situations:

As an Adwords user, i’m curious and frustrated when I see someone violating the double serving policy. Whats more frustrating is when we report it several times over the years, wait patiently for Google to correct the issue but nothing gets done. It makes it difficult for us to compete and drives up our advertising costs.

For whatever reason, this time the complaint was heard, acknowledged and the duplicate ads were taken down. So, if you’ve been complaining about a specific case to Google but received no word, you might try the AdWords Community Forum. If you still hear nothing, it could be time to move on and redirect some of that paid search budget to other channels.

Foursquare Gives iPhone Users Real-Time Recommendations

Posted by admin | Posted in Social Media | Posted on 15-10-2013

Tags:

0

Foursquare announced three new upgrades today. The location-sharing app has extended its ‘Real-Time Recommendations’ to iPhone users and added two new search features: a ‘Nearby’ button and a ‘Friends at a Glance’ feature that displays friends’ most recent check-ins.

According to the announcement, the app’s look and feel has also been updated with, “A simple feed that shows you what’s most relevant nearby, right now.”


While ‘Real-Time Recommendations’ has been available to Android users since August, Foursquare says the feature, which makes automatic recommendations, is now available to a limited number of iPhone users, with access “expanding more every day.” The new ‘Nearby’ button, defined by Foursquare as a ‘Right Now’ feature, allows users to see which friends may be closest to their current location.

For the ‘Friends at a Glance’ upgrade, Foursquare says users can go beyond each friend’s most recent check-ins by tapping on their profile to find more details.

Foursquare Updates Oct 2013

EdgeRank Is Dead: Facebook’s News Feed Algorithm Now Has Close To 100K Weight Factors

Posted by admin | Posted in FACEBOOK, Social Media | Posted on 21-08-2013

Tags: , ,

0

The next time you tell a client how Facebook selects and ranks the content that shows up in the News Feed, you’ll need to do it without using the word EdgeRank.

EdgeRank, Facebook’s original News Feed ranking system, is dead.

Facebook hasn’t used the word internally for about two-and-a-half years. That’s when the company began employing a more complex ranking algorithm based on machine learning. The current News Feed algorithm doesn’t have a catchy name, but it’s clear from talking to the company’s engineers that EdgeRank is a thing of the past.

During a phone call this week, Lars Backstrom, Engineering Manager for News Feed Ranking at Facebook, estimated that there are as many as “100,000 individual weights in the model that produces News Feed.” The three original EdgeRank elements — Affinity, Weight and Time Decay — are still factors in News Feed ranking, but “other things are equally important,” he says.

In other words, the News Feed algorithm of today is much more sophisticated than just a couple years ago.


“The easiest analogy is to search engines and how they rank web pages,” Backstrom says. “It’s like comparing the Google of today with Alta Vista. Both Google and Bing have a lot of new signals, like personalization, that they use. It’s more sophisticated than the early days of search, when the words on a page were the most important thing.”

This has implications for marketers and business owners far beyond the wording used to describe News Feed rankings. It’s a reflection — and a cause, too — of today’s complex battle to reach Facebook users organically.

The winners? They’ll be the ones who understand how Facebook has moved past Affinity, Weight and Time Decay, and move past it themselves. Before we get into today’s News Feed algorithm, let’s go back a few years.

knobs

In The Beginning It Was … Turning Knobs

Facebook’s News Feed was born in September 2006, promising to provide … and I quote … “a personalized list of news stories throughout the day, so you’ll know when Mark adds Britney Spears to his Favorites or when your crush is single again.”

Yep, that’s a direct quote from the announcement. Cute, huh?

With the launch of News Feed, Facebook wanted to show users the most important content from their social network without making them click to visit their friends’ profiles. And it had to figure out a way to decide what was important to each person.

“In the beginning, News Feed ranking was turning knobs,” said Facebook VP of Product Chris Cox during Facebook’s recent News Feed media event. “Turn up photos a little bit, turn down platform stories a little bit.”

Cox gave a funny account of how he and a co-worker sat in Facebook’s offices and changed the ranking “knobs” based on feedback from users — feedback in the form of often angry emails and conversations with users outside the Facebook office.

Times were much simpler then.

From Knobs To EdgeRank

Facebook has obviously grown up a lot since then, particularly with the simultaneous launch of Facebook Ads and Pages in November 2007.

Businesses, clubs, and organizations began creating Facebook Pages and using them to try to reach existing and new fans. That meant more content and more chances for users’ News Feeds to get crowded and unwieldy.

The company advanced from “turning knobs” to EdgeRank, the algorithm that a) determined which of the thousands of stories (or “edges” as Facebook called them) qualified to show up in a user’s News Feed, and b) ranked them for display purposes. EdgeRank had three primary pieces:

  • Affinity — i.e., how close is the relationship between the user and the content/source?
  • Weight — i.e., what type of action was taken on the content?
  • Decay — i.e., how recent/current is the content?

EdgeRank made it possible for Facebook to give users a more personalized News Feed. As Cox explained, users who played a lot of games on Facebook would see more game-related content in their News Feed. Users who took part in a lot of Group discussions would see more content like that. And so forth.
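As a rough illustration, the three-factor model can be sketched as a product-sum over a user's "edges." This is a toy reconstruction with made-up coefficients and function names; Facebook never published the actual formula:

```python
import math
import time

def edgerank_score(edges, now=None):
    """Toy EdgeRank: sum over edges of affinity * weight * time decay.

    Each edge is (affinity, weight, created_at_unix). The decay constant
    and all values here are illustrative, not Facebook's real numbers.
    """
    now = now or time.time()
    total = 0.0
    for affinity, weight, created_at in edges:
        age_hours = (now - created_at) / 3600.0
        decay = math.exp(-0.1 * age_hours)  # older edges count for less
        total += affinity * weight * decay
    return total

now = time.time()
fresh_like = (0.8, 1.0, now - 3600)          # close friend, liked an hour ago
stale_comment = (0.8, 2.0, now - 72 * 3600)  # same friend, comment 3 days ago
print(edgerank_score([fresh_like], now) > edgerank_score([stale_comment], now))
```

Even with a heavier action weight, the three-day-old comment decays below the fresh like, which is the behavior the Affinity/Weight/Decay description implies.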

From EdgeRank To… ?

With EdgeRank, the way you used Facebook largely determined what showed up in your News Feed. And it still does because, as Cox said last week, “We’re in the business of giving our users the most interesting possible experience every time they visit.”

But that job is now more complicated than ever.

Consider that there are more than a billion people using Facebook each month, including 128 million in the U.S. who use Facebook every day. They’re using dozens of different mobile devices with different capabilities for displaying content. There are 18 million Pages, many of which are actively looking for attention and a way to show up in the News Feed as often as possible. And that number doesn’t include the numerous businesses that use Facebook via regular accounts rather than Pages.

With all of that going on, Facebook says that the typical user has about 1,500 stories that could show in the News Feed on every visit.

So how does Facebook decide what users see, and which content from Facebook Pages makes it into the News Feed? As you can imagine, Facebook isn’t about to give away all the details, but Backstrom did talk openly about several ways the algorithm has grown up in recent years.

Affinity, Weight & Time Decay

These are “still important,” Backstrom says, but there are now multiple weight levels. “There are a lot of different facets. We have categories and sub-categories of affinity.”

Facebook is attempting to measure how close each user is to friends and Pages, but that measurement isn’t just based on personal interactions. Backstrom says Facebook looks at global interactions, too, and those can outweigh personal interactions if the signal is strong enough.

“For example, if we show an update to 100 users, but only a couple of them interact with it, we may not show it in your News Feed. But if a lot of people are interacting with it, we might decide to show it to you, too.”

Relationship Settings

Another factor is the relationship settings that Facebook users can apply. With each friend, you can go a step further and label the person a “close friend” or “acquaintance.” With liked Pages, users can choose to “Get notifications” or “Receive updates,” and there are deeper settings to control what kind of content the user wants to see.

facebook-settings

“We try to extract affinity naturally,” Backstrom says, “but if you go to the trouble to tell us more about your relationships, we will factor that in.”

Post Types

The News Feed algorithm takes into account the type of posts that each user tends to like. Users that often interact with photo posts are more likely to see more photo posts in the News Feed, and users that tend to click more on links will see more posts with links.

Backstrom says this is also applied on a deeper level. “It’s not just about global interactions. We also look at what types of posts you interact with the most from each friend.”

In other words, Facebook Page owners who continually publish only one type of post are likely not reaching the fans who prefer to interact with other types of posts.

Hide Post / Spam Reporting

News Feed visibility can also be impacted by users’ ability to hide posts or mark them as spam. But it’s not as simple as having a set threshold that will cause posts to stop showing in users’ News Feeds.

“For every story, we do the same computation,” Backstrom explains. “Given this story, and given the user’s history, what’s the probability that you’ll like this story? What’s the probability that you’ll hide it? We’re looking at this and trying to decide, is it a net positive to show this story in the News Feed?”

Further, Backstrom says there’s an element of decay when considering posts that have been hidden. Recent “hides” may carry more weight when deciding if a post shows in the News Feed, but those “hides” will have less impact as they decay over time.
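Backstrom's description amounts to an expected-value test: weigh the probability of a positive reaction against the probability of a negative one, with negative signals costing more. A hypothetical sketch (the values and the `net_value` helper are my own, not Facebook's):

```python
def net_value(p_like, p_hide, like_value=1.0, hide_cost=5.0):
    """Expected value of showing a story: likely positive reactions minus
    likely negative ones. hide_cost > like_value reflects the idea that an
    explicit 'hide' is a much stronger signal than a like (an assumption)."""
    return p_like * like_value - p_hide * hide_cost

def should_show(p_like, p_hide):
    """Show the story only if the expected value is a net positive."""
    return net_value(p_like, p_hide) > 0

print(should_show(0.10, 0.01))  # small hide risk: worth showing
print(should_show(0.10, 0.05))  # same like rate, higher hide risk: suppressed
```

Note how the same 10% like probability leads to opposite decisions depending on the hide probability, which matches the "net positive" framing in the quote.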

Clicking On Ads, Viewing Other Timelines

The News Feed algorithm is completely separate from the algorithm that decides what ads to show, when to show ads, and where to show them. But how a user interacts with Facebook ads can influence what shows in the News Feed.

“Nothing is off the table when we’re looking at what we should show users,” Backstrom says. “It can be clicking on ads or looking at other timelines. It doesn’t have to be just what the user interacts with in the News Feed.”

Device & Technical Considerations

Yep, the News Feed algorithm even considers what device is being used and things like the speed of a user’s internet connection when deciding what to show.

“The technical limitations of some old feature phones make it impossible to show some content,” Backstrom says. “We also know that some content doesn’t perform as well with Facebook users on certain devices. And if the user has a slow internet connection, we may show more text updates. We’re trying to show users content that they’ll find interesting and want to interact with.”

Story Bumping & Last Actor

Don’t forget these two changes that Facebook just announced last week. Story Bumping bends the “decay” rules by giving older, unseen posts a second chance at News Feed visibility if they’re still getting interaction.

Last Actor puts a premium on recency. Facebook is tracking a user’s most recent 50 interactions and giving them more weight when deciding what to show in the News Feed. This works on a rolling basis, so the value of an interaction will decline after the user has made 50 more recent interactions.
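A rolling window like the one Last Actor describes is easy to picture in code. This sketch assumes a flat boost for any actor still in the window, which is an invented simplification of whatever weighting Facebook actually uses:

```python
from collections import deque

class LastActor:
    """Rolling window of a user's most recent interactions; actors still
    in the window get a ranking boost. The window size of 50 comes from
    the article; the boost value is a made-up illustration."""
    def __init__(self, window=50):
        self.recent = deque(maxlen=window)  # oldest entries fall off automatically

    def record(self, actor_id):
        self.recent.append(actor_id)

    def boost(self, actor_id):
        return 1.5 if actor_id in self.recent else 1.0

la = LastActor(window=50)
la.record("page_A")
for _ in range(50):          # 50 newer interactions push page_A out
    la.record("friend_B")
print(la.boost("page_A"))    # back to baseline: aged out of the window
print(la.boost("friend_B"))  # still boosted: recently interacted with
```

The `deque(maxlen=...)` does the "rolling basis" part for free: once 50 newer interactions arrive, the older actor's influence disappears.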

Final Thoughts

It should be clear that Facebook’s News Feed algorithm has developed significantly over the past few years. EdgeRank is a thing of the past, and it’s been replaced by a machine learning-based algorithm that, as Backstrom says, “only ever gets more complicated.”

That poses new challenges for brands and marketers hoping to get attention on Facebook, but the company says its advice to Page owners and others is the same: create and publish a variety of interesting content that will attract shares, comments, likes and clicks. That requires understanding your Facebook fans, from the types of posts they interact with to the different devices they might be using when they’re on Facebook.

We’ll keep reporting on Facebook’s News Feed changes, and our contributing writers will keep sharing tips and advice, too. You might also keep an eye on the new Facebook for Business news page, because the company has promised to be more open in the future about changes that affect how the News Feed works.

The Smart Watch: Samsung, Apple Ready Dueling Wrist Devices

Posted by admin | Posted in Blogger | Posted on 21-08-2013

Tags:

0

The competitive effort by Samsung and Apple to introduce a smart watch is not unlike the US-Soviet race to launch a man into space. Early September should bring new wearable devices from both companies within days of one another.

Both Bloomberg and the NY Times reported that Samsung is going to announce its smart watch on either September 4 or 6, seeking to preempt Apple’s anticipated similar announcement during its iPhone event on September 10 (still not confirmed).

smart-watches-generic-featured-570x270

Samsung’s watch will reportedly be called “Galaxy Gear,” which sounds more like a product category than an individual product. We can expect Apple’s smart watch to be branded “iWatch”; according to Reuters, the company has applied for the “iWatch” trademark in Japan.


The Samsung device, based on Android, will reportedly “make phone calls, play video games and send e-mails,” according to the Times’ article. The Apple watch will likely interact with the iPhone and iOS apps, but it’s not clear what specific features it will possess.

There are already a number of smart watches in the market. The Kickstarter-funded Pebble, for example, is now available from Best Buy for $150. The outlook for Pebble could be dramatically affected by the introduction of the Samsung and Apple smart watches. Ultimately some alchemical mix of features, design and pricing will determine sales.

Smart watches are part of a new category of wearable Internet devices, which include Google Glass but also Fitbit and other gadgets. A recent consumer study conducted for Rackspace found mixed demand and reaction to wearable technology. While many people thought their lives had been enhanced, there were also frustrations and privacy concerns.

Google Glass, while not a mainstream product, has already captured the public’s imagination. And though it’s unclear how these new experiences will be monetized, Google is already considering “pay per gaze” ad models.

Marketers have yet to fully embrace and adapt to mobile. Yet, here comes another category of connected devices to address.

The smartphone has dramatically impacted watch sales. However, sufficiently “cool” and functional smart watches may revive interest. The idea of a smart watch or watch phone has captivated people since the Dick Tracy comic strip first appeared in the 1930s.

Off Page Optimization

Posted by ishan mishra | Posted in Digital Marketing Practices, SEO Gossip | Posted on 25-09-2011

Tags:

0

 

Off page optimization consists entirely of link building strategies. Because it involves third parties, you will often be limited in how much control you can exert. However, knowing what the search engines are looking for can help you make the most of the opportunities where you do have control.

Search engines evaluate pages that link to yours for relevance. Having a link from a page discussing widget training services will be more helpful to a widget storefront than a link from “Joe Bob’s Land of a Million Free Links”. In addition to relevance that search engines will appreciate, links from topically related sites can and will drive traffic to your site in their own right.

 

 

 

 

Table 4.1 Off Page Optimization

1. Link Building Campaigns
2. Reciprocal Links

Link Building Campaigns

With a new site, it can be helpful to engage in a link building campaign to get your site some immediate exposure. As your site matures and develops more content, you will find that other sites will start linking to it without solicitation.

There are several avenues you can pursue to expand your link building campaign:

Free directories – Crawling search engines use these human-edited directories to find (and categorize) websites. Be sure you get listings in all directories that are applicable:

    • DMOZ – the largest free directory, used by Google to power the Google Directory. It can take a long time (months in some cases) for an application to be processed, so patience is golden when applying to DMOZ.
    • WebSavvy
    • JoeAnt – You must apply to become an editor if you do not want to pay for inclusion.
    • Gimpsy – Only accepts sites that offer some interactive features. Be sure to check their guidelines before applying.
    • Zeal – Only accepts non-commercial sites.

When applying to these directories, comply with their instructions and make it as easy as possible for their editors to approve your submission(s). Consider becoming an editor for some directories and you might get the ability to fast-track your own submissions.

Themed portals – Look for magazines, discussion forums, targeted vertical directories, etc. that pertain to your niche. Request listings where appropriate.

Press releases / articles – Many websites will be happy to publish a press release or content article and include a link back to your site.

Industry organizations – Many non-profit organizations offer directories of their membership with links to their websites. Join as many as make sense for your venture.

Participate in discussion forums – Many internet discussion forums allow hyperlinks in “signature files”. Participate in discussions and offer advice and wisdom. Do not spam forums with blatant advertising. This will reflect poorly on you as well as possibly get you banned from the forum.

Set up a search engine friendly affiliate program – offer commissions on sales of your product(s) for others to advertise your wares and link to your site.

RSS news feeds – Publish company news through RSS syndication. Other websites pick up the RSS feed and publish your news story (which can include a link back to your site, of course).

Paid advertising – There are many directories (like Yahoo), portals and websites that offer paid advertising options from text links to banners.

Reciprocal Links

You can use the search engines to find websites that might link to you. Search on your keywords and your competitors’ backlinks to find sites that link out to websites in your industry. Visit those sites and try to determine the owner’s name (or a manager as appropriate). Send them a personalized e-mail introducing them to your site. Request consideration for a listing on their links page (and include the URL for that page) where your website would be of value to their audience.

Many owners/managers/webmasters will be happy to link to your site if it is relevant, high-quality and useful to their audience. Some might request reciprocation. If their site is useful for visitors to your site, you should consider reciprocating. We do not recommend trading links with websites that have no topical connection to your site: it looks unprofessional, and some search engines may penalize sites for it.

When you solicit a link from another site, be sure to format your requested link to conform to the format published on the target link page. Be sure to use your keyphrases whenever descriptions are published next to links.

As you generate backlinks with keyword rich anchor text or from related authority sites, you will be maximizing your off page optimizations.


ON PAGE OPTIMIZATION

Posted by ishan mishra | Posted in SEO Gossip, Uncategorized | Posted on 25-09-2011

Tags:

0

Getting a good ranking in a search engine hasn’t been the easiest thing for many. Search engines are getting smarter and more intelligent every day, so it now takes more than just good content to top your competitors.

On page optimization is one of the very first steps of SEO, and one that every webmaster should look into. It probably won’t even take you an hour to learn and implement some of these on-page optimization techniques. Why is it so important? Because proper on-page optimization will not only help you rank well in a search engine but also increase the overall readability of your website for your visitors.

Below I have summarized some of the most important on-page optimization techniques. You can implement some, if not all, of these to give your site better exposure to the search engines as well as to increase your overall CTR (click-through rate).

On Page seo indore india by ISHANTECH



 

3.2. Title Optimization

A site’s title tag is by far the most important on-page optimization element. A title tag should be short but descriptive enough for your visitors to identify you and your business. The title tag is the first thing shown and indexed by the search engines, so it is given very high importance. Out of the thousands of results a searcher sees, your site’s title has to be appealing enough for him to want to find out more. At the same time, your title has to be appealing to the search engine in order to rank you above thousands of other similar websites.

Important things to include in your title:

Your Name / Business Name / Site Name: This is very important for branding purposes. If you feel that your customers may search for you by your brand name, then it’s also useful to put it somewhere in your title.

 

Keywords: If you want to rank for certain keywords, it is always good to place some of them in your title tag. A title tag represents the whole flavor and content of your website. So if you are selling pizza online, you can include keywords like “order pizza” or “home delivery pizza” in your title tag. Don’t stuff too many keywords into your title; write a title that is readable to humans and also good for the search engine (e.g. Domino’s Pizza, Order Pizza Online for Delivery – Dominos.com).

 

Include your 1-800 or other toll-free numbers: Some may not agree with me on this, but I think including your phone number in the title tag does help your visitors take direct action. It also makes your site look more professional and legitimate when it’s displayed in the SERP (Search Engine Result Page). Searchers are likely to click on a result that has a phone number attached to it because, in the back of their mind, it gives them a good impression of the authenticity of the business and the level of support. If you prefer not to include your number in the title tag, you can alternatively include it in your meta description, which will give you almost the same benefits.
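To make these guidelines concrete, here is a small, hypothetical title checker. The 60-character limit is a common rule of thumb for what Google displays in results, not an official specification, and the function name is my own:

```python
def check_title(title, keywords, max_len=60):
    """Flag common title-tag problems: too long for the SERP snippet,
    or missing every target keyword. max_len=60 is a rough display
    limit, not an official spec."""
    problems = []
    if len(title) > max_len:
        problems.append(f"title is {len(title)} chars; may be truncated in results")
    if not any(k.lower() in title.lower() for k in keywords):
        problems.append("no target keyword appears in the title")
    return problems

# The Domino's example title from above, checked against a target phrase:
print(check_title("Domino's Pizza, Order Pizza Online for Delivery - Dominos.com",
                  ["order pizza"]))
```

Running a list of candidate titles through a check like this before publishing is a cheap way to catch truncation and missing-keyword problems early.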

3.3. Meta Tags Optimization

A site’s Meta tags may not be as important as they once were; however, the Meta description is something you can’t just ignore. A site’s Meta description should contain a brief description of your website focusing on the areas and services your business specializes in. This small piece of text can be considered a selling snippet: if a searcher finds it appealing, he is likely to click through to your page to find out more. But if your Meta description is too generic or isn’t written well, there is a good chance your site will simply be ignored.

Important things to include in your Meta Description:

Include your Selling Point– Tell your customers what they want to hear through your site’s Meta Description, and you will definitely get some advantage over others in the SERP. For instance – if you sell ‘cheap web hosting’ then including a phrase like “hosting starting from only $0.99” may result in more clicks and visitors because your description will exactly match the flavor of search performed by the user.

 

Keywords – Including some of your keywords will give you some advantage in Google’s relevancy algorithm.

 

1-800 or other Toll-Free Numbers – If you haven’t already included this in your title, you can include it in your Meta description instead. If users have Skype installed on their system, any phone number in their browser becomes clickable, resulting in a direct Skype-out call. So if your number appears in the Meta description, some of your customers might just prefer to call you directly instead of going through your site.

3.4. Important HTML Tags

It is necessary to highlight the parts of your website that you want your readers to look at, and there are several tags in HTML which allow you to do so: for instance, the header tags [h1] [h2] [h3], bold [strong] and italic [em]. The text inside your header tags (e.g. [h1]) is given very high importance by the search engine. Usually you can use them to define the page/post titles or the important sections of your website.

Header Elements:

Header 1: Header 1 should be used to define the most important section of your page. Usually Header 1 is used for the site’s title or header text.

 

Header 2 & 3: Header 2 and 3 can be used for Page/Post titles or important sections of your pages. Separating your content with headers is a good practice as it makes your site more readable and easy to navigate.

 

Text Styles:

Bold: You can bold (e.g. [strong]) certain words which are of high importance. Sometimes it’s good to bold your keywords where appropriate. However overdoing this may get you penalized.

 

Italic: You can use the [em] tag to emphasize certain words which will appear in italic.

Quote: This is very useful when you are quoting from someone.

 

3.5. Keyword Optimization

Your site’s content needs to be optimized in such a way that it can suit both search engines & your readers. Stuffing your site with too many keywords can make your site unreadable. So you will need to have some sort of balance between your keywords & your content.

Important elements of Keyword Optimization:

Research: Do a proper research before you decide on your keywords. There are plenty of free tools out there that can help you to do keyword research. Some of my personal favorites are: SEObook Keyword Suggestion tool, Google Adwords Keyword Tool & Overture Keyword Tool.

 

Keyword Density: Try to have a moderate keyword density so that it can help the search engine to determine that your page is indeed related & relevant to the keyword that you are targeting.

 

Synonyms & Related Keywords: I personally like to use synonyms instead of having a high keyword density. This helps to make my content sound natural but still helps in SEO.

 

Long Tail Keywords: It’s often good to target some long tail keywords as they are comparatively easier to rank for. During your keyword research you should be able to gather some good long tail keywords that you can optimize your site for. But you can also come up with your own long tail keywords; for example try adding some common words like – ‘best’, ‘free’, ‘cheap’, ‘top’ etc. along with your actual keyword and you might eventually get some good long tail keywords.

 

If you are using a CMS, try using permalinks. This way your keywords/post title will be in the link itself, and thus it will be valued more by the search engines.
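A simple way to eyeball keyword density is a short script like this one (a crude single-word counter, offered as an illustration rather than a recommended target):

```python
import re
from collections import Counter

def keyword_density(text, keyword):
    """Percentage of words in `text` that match `keyword` (single word,
    case-insensitive). A crude metric -- use it as a guide, not a goal."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return 100.0 * Counter(words)[keyword.lower()] / len(words)

sample = "Order pizza online. Our pizza delivery brings hot pizza fast."
print(round(keyword_density(sample, "pizza"), 1))
```

If the number comes back uncomfortably high, that is usually the cue to swap some occurrences for synonyms or related phrases, as suggested above.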

3.6. Image Optimization

If your site has a lot of images, you need to optimize them too, as they can’t be read by the search engines. It’s very easy for a human reader to interpret an image’s meaning; for a Web crawler, however, the process is completely different. Search engine spiders can only read text, not images, so you need to use some special tags for your images in order to give them meaning.

Alt text: ALT text, or alternate text, describes your image to search engines and to visitors who can’t see it. The text should be meaningful but short, and you can use your relevant keywords as ALT text. If your browser can’t display the image for some reason, the alt text is shown in place of that particular image.

 

File name: Always use a meaningful file name for your images; use names like “apple-iphone-cover.jpg” instead of the meaningless “DSC24045.jpg”. Keep the image file name the same as, or similar to, the ALT text.

 

Image title: Always use the title attribute on images, which shows the title as a tooltip when a user moves his mouse over the image. Example of an image with a title attribute: [img src=”http://imagelocation.jpg” alt=”Image description” title=”Title of the Image”]

 

 

Image linking: Whenever you link to an image, use the image keywords in your link text. For example, use “view an Apple iPhone” instead of “Click here to view” as the anchor text.
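Auditing a page for missing ALT text can be automated with the standard library. A minimal sketch, assuming plain HTML input (the sample markup is hypothetical):

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collect <img> tags that are missing alt text -- the attribute
    search engines rely on, since they can't 'see' the image itself."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "(no src)"))

page = ('<img src="apple-iphone-cover.jpg" alt="Apple iPhone cover">'
        '<img src="DSC24045.jpg">')
checker = AltChecker()
checker.feed(page)
print(checker.missing)  # only the undescribed image is flagged
```

Running a check like this across your templates catches images that are invisible to search engine spiders before they go live.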

We generate and send Google Analytics reports to you by email that tell you exactly what keywords your visitors searched for and even which search engine they used to find your site.

 

Html Sitemap

We generate and add two sitemaps to your website. The first, called an HTML sitemap, is a page on your site. The HTML sitemap is helpful to both visitors and search engines when navigating the site.

 

XML Sitemap

The second is called an XML sitemap: a file that lists the URLs for a site along with additional information about each URL (when it was last updated, how often it usually changes, and how important it is relative to other URLs in the site) so that search engines can more intelligently crawl the site.
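For illustration, a minimal XML sitemap can be generated with a few lines of Python; the URLs and dates below are placeholders:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml: one <url> entry with <loc> and
    <lastmod> per page, in the sitemaps.org namespace."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([("http://www.example.com/", "2013-10-01")])
print(xml)
```

The resulting file is what you would upload to your site root and submit to the search engines via their webmaster tools.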


WEBSITE ANALYSIS

Posted by ishan mishra | Posted in SEO Gossip, WordPress | Posted on 25-09-2011

Tags:

0

Conducting a competitive analysis is an important part of the job if you’re a usability engineer or information architect. A good competitive analysis not only produces usability metrics but also aids decision makers in their strategic goal-setting and planning. Done right, a good competitive analysis can steer a Web development project in the right direction.

The day will come when you’re sitting happily at your desk and someone from marketing or business development will come into your office and ask you to do a competitive analysis for them. The company is launching a news site or portal, and the decision makers want to be sure that their site will stand up to the competition.

Suddenly, you’re not just in the world of usability and information architecture — of theories and deep thinking about cognitive psychology. You’re now in the rubber-meets-the-road world of business. Although you’ll be doing old-fashioned usability analysis work, you’re also expected to guide the team toward increasing return on investment. You’re expected to provide baseline readings from which to measure success. And you’re expected to help the team snoop out what the competition is doing.

What to analyze

Now that you have a list of competitors, you need to draw up a list of items to analyze when you visit their sites. I’ve developed a categorized list of items over the years, which are included below:

Home page.  How informative is the home page? Does it set the proper context for visitors? Is it just an annoying splash page with multimedia? How fast does it load?

Navigation.  Is the global navigation consistent from page to page? Do major sections have local navigation? Is it consistent?

Site organization.  Is the site organization intuitive and easy to understand?

Links and labels.  Are labels on section headers and content groupings easy to understand? Are links easy to distinguish from each other, or are they ambiguous and uninformative (“click here” or “white paper”)? Are links spread out in documents, or gathered conveniently in sidebars or other groupings?

Search and search results.  Is the search engine easy to use? Are there basic and advanced search functions? What about search results? Are they organized and easy to understand? Do they give relevance weightings or provide context? Do the search results remind you what you searched for?

Readability.  Is the font easy to read? Are line lengths acceptable? Is the site easy to scan, with chunked information, or is it just solid blocks of text?

Performance. Overall, do pages load slowly or quickly? Are graphics and applications like search and multimedia presentations optimized for easy Web viewing?

Content.  Is there sufficient depth and breadth in the content offerings? Does the content seem to match the mission of the organization and the needs of the audience? Is the site developing its own content or syndicating other sources? Is there a good balance of in-depth material (detailed case studies, articles, and white papers) and superficial content (press releases, marketing copy)?
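The performance criterion above is one you can spot-check with numbers rather than judge by feel. Here is a minimal timing sketch in Python; the fetch function is injected as a parameter (a hypothetical design choice for this example) so the timing logic stays runnable and testable without network access:

```python
import time

def time_page_load(fetch, url):
    """Return the elapsed seconds for one fetch of `url`.

    `fetch` is any callable that downloads the page, e.g. a thin
    wrapper around urllib.request.urlopen. Injecting it lets you
    swap in a stub when you have no network connection.
    """
    start = time.perf_counter()
    fetch(url)
    return time.perf_counter() - start
```

In practice you would call this several times per page and record the median, since a single load time can be skewed by caching or network hiccups.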

I provide a rating for each question on each site visited: 1=bad, 2=poor, 3=fair, 4=good, 5=outstanding. Naturally, you may want to tweak this scale to fit your needs, but it’s important to have some kind of scale to make the job of comparison easier. The list of resources contains links to other criteria you can use.
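The 1-to-5 scale lends itself to a simple data structure for your notes. A sketch of one way to record it, with hypothetical category names and scores:

```python
# The rating scale described above
SCALE = {1: "bad", 2: "poor", 3: "fair", 4: "good", 5: "outstanding"}

def category_average(ratings):
    """Average the 1-5 ratings given to the questions in one category."""
    return sum(ratings) / len(ratings)

# Hypothetical ratings for one site's Navigation category:
# global nav consistency, local nav presence, local nav consistency
navigation = [4, 3, 5]
print(round(category_average(navigation), 2))  # 4.0
```

Keeping the ratings in plain lists or a spreadsheet keeps the comparison step later almost mechanical.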

Conducting the analysis

Now that you have a list of sites to visit and a list of criteria to compare, start your analysis. Be sure to conduct your analysis with some rigor. Don’t be haphazard, and don’t do things differently with each site visit. Try to analyze a site without interruption. In other words, do everything you can to reduce bias in your investigation.

Here are some additional guidelines:

  1. Visit one site at a time, and take the same (or at least similar) paths through each site. Follow the checklist of criteria.
  2. For each criterion, take lots of notes. You’ll refer to these notes when you organize and write your report.
  3. Try to give a score for each criterion as you complete it. That way you’ll have scores for each major category as well as for each site.
  4. If the company you’re doing the analysis for has an existing site, remember to rate it last. Rating it after you’ve visited the competitors gives you some measure of objectivity, and it provides a good measurement comparison for the readers of your report.
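Once the visits are done, the per-criterion scores from step 3 can be rolled up into per-site averages and ranked. A sketch of that roll-up, using invented site names and numbers purely for illustration:

```python
from statistics import mean

# Hypothetical results: site -> category -> average rating (1-5)
scores = {
    "competitor-a.example": {"Home page": 4, "Navigation": 3, "Search": 5},
    "competitor-b.example": {"Home page": 2, "Navigation": 4, "Search": 3},
    "our-site.example":     {"Home page": 3, "Navigation": 3, "Search": 4},
}

def site_overall(categories):
    """Overall score for a site: the mean of its category averages."""
    return mean(categories.values())

# Rank sites from strongest to weakest overall
ranking = sorted(scores, key=lambda s: site_overall(scores[s]), reverse=True)
for site in ranking:
    print(f"{site}: {site_overall(scores[site]):.2f}")
```

A simple mean weights every category equally; in a real report you might weight categories like search or content more heavily, depending on the site's mission.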