Thursday 28 February 2013

Why Did I Lose My Rankings in Google?

Recent updates to Google’s search algorithms (named Panda and Penguin) have caused many website owners to doubt Google’s “don’t be evil” attitude. Before you start hating Google for your website’s change in rankings, let’s take a look at why Google makes the decisions it does.
The first item in Google’s philosophy is “Focus on the user and all else will follow.” Google works hard to provide a search engine that will give searchers the best information available on the web relative to their search terms. Google is working to serve web users, not just website owners. It follows, then, that Google would adjust its algorithms to bring higher quality content to the top of its search rankings.

Google Panda Update
“I’m THE Big Fat Panda”
In late February 2011, Google rolled out the first of its major algorithm updates, dubbed “Google Panda.” The main intent of Google Panda was to improve the user experience by lowering the ranking of websites with “low-quality” or “thin” content. Conversely, websites with high-quality, unique content would be rewarded with increased rankings. This change affected nearly 12% of all Google searches.
By April 2011, this update had rolled out to encompass all English queries, and by August, Google Panda had gone international, including all non-English queries.
To figure out what makes a site “high-quality” in Panda’s algorithms, let’s take a look at four main points:
Is the site trustworthy?
A trustworthy site has information written by someone who knows what he or she is talking about (an expert). It includes facts and figures that readers find believable and would have no reason to report to Google as suspicious.
Does the site have original content?
A website should have new information on each page that isn’t redundant and hasn’t been gathered from another source.
Is there value?
The more value a website has, the better rankings Google will give it. Valuable websites have information that is interesting, complete, comprehensive and shareable.
Does the site have lots of spam?
Websites with lots of ads and “click here” banners are not considered high-quality sites. Panda also takes a site’s page layout into account: an October 2012 update to the algorithm targets pages with too many ads above the fold.

Google Penguin Update
“There’s a New Sheriff in Town”
In April 2012, Google rolled out an update code-named “Penguin”. With this update, various SEO methods that once worked are no longer desirable and can actually hurt search rankings. If you are trying any of the following, you should probably stop right away.
Keyword Stuffing
This is exactly what it sounds like: stuffing a webpage with every applicable keyword, hiding text behind images, and going crazy with meta tags. Not only is this annoying to your page’s visitors, it provides no advantage on almost any current search engine. Google will actually penalize your site in the rankings if it detects keyword stuffing. Remember, it’s about quality content, not quantity.
Cloaking

Cloaking is a slightly more advanced method of “black-hat” SEO, but this also will not increase your Google page rank. This is where a script on your website delivers a different version of your page to the search engine’s “crawler” than the webpage that your actual visitors see. With this, you can send whatever keywords you want to increase rankings, yet have something totally different pop up when users click through to your page. Sneaky workarounds like this no longer get results.
Link Schemes
You may have received emails from so-called SEO companies, or other businesses like yours, offering a link swapping service. It sounds a lot like the old chain mail postcards kids used to send around in grade school. “Send out ten postcards and get 8 billion in return.” In link schemes, you post other people’s links on your website in return for their posting your links on their websites.

The idea is that Google will see that many websites refer to your website, giving you a better ranking in search results. Penguin looks for websites participating in link schemes and penalizes them for manipulating the search results.
Duplicate Content
Akin to keyword stuffing, duplicate content involves deliberately posting the same content on multiple pages within a website. The intention of duplicate content is to appear to have a plethora of pages and information on certain topics, and thereby garner higher rankings in Google searches. Again, Penguin identifies sites that have high levels of duplicate content and penalizes them in search results. This has caused difficulties for sites with multiple similar products. With quality product descriptions, however, penalization can be avoided.
Exact Match Domain Update
On Friday, September 28, 2012, Google released an update to its algorithm that affects “exact-match” domains in the search results. The purpose of this update is to reduce the number of websites that rank for searches just because their domain is the same as a particular search. For example, if someone Googles “pet groomers in Chicago”, one of the websites in the search results might be “www.petgroomersinchicago.com”. That site may have been ranking in the search results because of its name, not because it was a quality website. This algorithm update is supposed to change that.
So remember, the key to a lasting presence in the search results is to have high quality content. It’s still good to include useful links and applicable keywords, but it should all be legitimate content. Hopefully the tips included here are helpful to you. If you have more questions, contact us here at Wallaroo Media!

Reference: http://wallaroomedia.com/why-did-i-lose-my-rankings-in-google/

EBriks Infotech: SEO India

The Link Disavow Tool – Google Just Isn’t That Into You

When the link disavow tool was introduced at Pubcon 2012, many greeted it as an answer to their prayers. Looked upon as a reset button to rebuild a website or link profile, many were stunned when their link disavows were met with – nothing.  I’ve read a few stories where websites have bounced back after a link disavow was processed, but in every one of these cases, a painstaking process of link removals and reconsideration requests was also involved.
Meanwhile back at the Google Ranch
In slide 12 from Matt Cutts’ Pubcon presentation, two important points were highlighted in red:
“Don’t use this tool unless you are sure you need to use it”
“Most sites shouldn’t use this tool”
In Matt’s Disavow Links Video, he drives home the point that you need to make multiple link removal requests on your own. Once you are down to a “small fraction” of links you can’t get removed, you should then use the tool.
In a follow up interview, when asked how long it would take to see results, Matt answered:
“It can definitely take some time, and potentially months. There’s a time delay for data to be baked into the index. Then there can also be the time delay after that for data to be refreshed in various algorithms”.
I’ve suspected from the introduction of this “new tool” that it may be nothing more than a “new feature” added to an old tool – The Spam Report. I don’t think Google is interested in a micro approach to fixing link spam one website at a time. Like spam reports, I believe Link disavow reports go directly to Web spam engineers and are used to devise scalable solutions. This could also explain the “baked into the index” comment.
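For reference, the disavow file itself is just a plain text upload. A minimal example (the domains and URL here are placeholders):

    # Link removal requested 1/1/2013 and 2/1/2013, no response
    domain:spammy-blog-network.example.com
    domain:paid-links-directory.example.net
    http://forum.example.org/profile/12345

A “domain:” line disavows every link from that domain, a bare URL disavows a single page, and lines starting with “#” are comments – useful for documenting your removal attempts, which still matter for reconsideration requests.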

Reference: http://www.searchenginejournal.com/skinny-on-googles-link-disavow-tool/53493/

EBriks Infotech: SEO Company

Tools to Find Your Competitors’ Keyword Rankings

Before I start reviewing the available tools that show which keywords your competitors rank highly for in Google, let me describe how you can use this information:
  • do not just copy-paste your competitors’ keywords; use them as additional input for your own keyword research;
  • compare the keywords your competitors optimized their pages for (i.e., tried to rank for) with the keywords they actually ended up ranking for. This way you’ll be able to avoid their mistakes.
1. SEO Digger (paid with free trial) will show you Google keyword position, check date (International style: DD/MM/YYYY) and Wordtracker rank:
[Screenshot: SEO Digger results]

2. SpyFu (paid with free trial) spies on both your competitors’ organic search rankings and their Google AdWords campaign keywords:
[Screenshot: SpyFu organic search rankings]
3. KeywordSpy (paid with free trial) will show you both the keywords your competitor ranked for and the total number of results returned for each keyword. While the tool has a really user-friendly interface, it lacks some sorting options.
[Screenshot: KeywordSpy results]
4. KeywordRemix (if the tool is back, please comment) also offers some interesting data on keyword rankings, such as the total number of results in Google, Google News and Google Blog Search competition, data age and synonymous phrases (the tool also has some bugs: it shows the same key term multiple times):
[Screenshot: KeywordRemix results]
Final note: all these tools showed me different keywords. So my advice is to try all of them before you decide which one works best for you.

Reference: http://www.searchenginejournal.com/tools-to-find-your-competitors-keyword-rankings/7119/

EBriks Infotech: SEO Company

What is SEO Cocitation?

Each year SEO experts profess that SEO, as we knew it, is dead. They describe how the old tactics don't work and introduce new concepts formed from the latest Google algorithm update. But the more things change, the more they stay the same. Yes, links still matter a great deal for achieving rankings. But some other concepts that may seem outdated are reviving.

SEO cocitation was a hot topic many years ago as Google introduced anti-spam updates geared toward devaluing links from poor-quality blog networks. More recent discussions of cocitation have shifted the focus away from link juice and toward the words used around links. These factors are all gaining traction in the Google algorithm as Google evolves and becomes smarter at detecting manufactured links (versus earned links). For webmasters attempting to reshape an SEO strategy that failed in 2012 due to Penguin and Panda, re-engaging SEO cocitation is a smart move.

SEO Cocitation Core Concepts

1. Link juice flows backwards as well as forwards
Most webmasters obsess over inbound links, working feverishly to earn them from reputable sources. Generally speaking, the more inbound root domain links acquired, the higher the domain authority, PageRank, rankings, and traffic. But this orientation can lead to link juice hoarding. Often overlooked is the notion that outbound hyperlinks are also important in the link juice equation. Outbound links to high-authority sites are positive for the user experience.

2. Good websites link to other good websites
Linking to other great websites in a competitive space is counter-intuitive, as it may aid the enemy. This SEO cocitation concept is frustrating for new webmasters, as it implies that top rankings are an exclusive club and breaking in is impossible. There is some truth to this, but smart publishers need to understand that to get, they must also give. Linking to high-quality, high-authority websites is a positive signal to Google. The trick is to find high-quality sites that are relevant, but not competitive.

3. Linking to a bad neighborhood hurts a website
On the flip side, outbound links to a poor-quality domain mark the website as a member of a bad neighborhood. This is particularly helpful to Google in its pursuit to devalue links from spammy blog networks that provide no value to end users. It is fairly easy for Google's algorithm to spot a cluster of low domain-authority sites that link exclusively to other low domain-authority sites. The manipulative intentions are even more obvious when the sites don't share any topical relevance to each other, are rarely shared in social media, and have poor time-on-site (i.e., are not useful).

4. The link graph uses the transitive property
The most advanced concept in SEO cocitation is the transitive relationship. Put in mathematical terms:

Website A links to ---> Website C
Website A links to ---> Website B
Website C's authority is a benefit to Website B

Here, link juice flows backwards, and then forwards. While there is no link between Website B and Website C, there is a transitive relationship based on the fact that Website A links to both of them. If Website C is highly authoritative, the transitive principle of SEO cocitation suggests that Website B will gain a benefit simply because Website A links to both.

5. The words around your links matter
More recently, the term cocitation has been expanded to include semantic analysis and word frequency – also called co-occurrence and semantic similarity. Specifically, several SEOs have found domains ranking on keywords that have never been used as anchors for inbound links. There has always been evidence that the words on pages that link to a website (contextual phrases) are influential, but many SEOs see the importance of co-occurrence, or semantic similarity, increasing in the Google algorithm.

Conclusions

Trying to understand SEO cocitation can be difficult. But the basic takeaways for small business website owners are clear:
  • Link freely to other highly authoritative websites in the space (i.e., relevant ones).
  • Don't link to bad neighborhoods.
  • When seeking inbound links, be mindful not only of the domain authority and relevance of the linking website, but also of where else that website links. This will rule out most link buying tactics, as those who sell links usually do so from networks and almost always link to low-quality sites (i.e., other link buyers). Webmasters should avoid being pulled unknowingly into a bad neighborhood by cocitation.
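To make the transitive idea concrete, here is a tiny sketch in Python that finds co-cited pairs – pairs of sites that share a linking page. The link graph is entirely made up for illustration:

    from itertools import combinations

    # Hypothetical link graph: each site maps to the sites it links out to
    outlinks = {
        "website-a.com": ["website-b.com", "website-c.com"],
        "website-d.com": ["website-c.com", "website-e.com"],
    }

    # Two sites are co-cited whenever a third site links to both of them
    for source, targets in outlinks.items():
        for site1, site2 in combinations(sorted(targets), 2):
            print(f"{site1} and {site2} are co-cited by {source}")

If website-c.com is highly authoritative, the cocitation principle suggests some of that authority rubs off on website-b.com, even though no direct link exists between them.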

Reference: http://searchenginewatch.com/article/2251195/What-is-SEO-Cocitation

EBriks Infotech: SEO Firms in India

Wednesday 27 February 2013

Measuring Your Link Building with Google Analytics



In the past, link building was less about bringing people straight to your site and more about making sure Google ranked your site as high as possible.
These days, many link building methods focus on tactics that bring people straight to your site as well as helping Google understand that you should rank well for related terms. So, following on from Julie Joyce’s post on 3 Ways to Measure Link Building ROI, I’d like to expand on ways that you can use Google Analytics to measure link building.
This post will assume that the link building you're doing is the type that has the potential to generate traffic directly from the links, rather than only through improving rankings. I’m not going to go into detail about any of these techniques; there are plenty of great resources on this blog and others to help you with that.
Google Analytics is all about traffic – where it came from, what it led to on site and how much value this generated. The main focus for analyzing link building will be referral traffic data. You build a link, people use it to get to your site, you measure the value of this. But there is a lot more to it than just looking at this basic data.
This post will cover how to use Advanced Segments to your advantage, Multi-Channel Funnels for improved attribution figures, campaign tagging for easier analysis, a free custom report and how to easily use the API for exporting and combining data.

Basic Analysis

Firstly, let’s head to the Referrals data in Google Analytics.
Here we can see how many visits external websites brought to the site. Alongside this is interaction data such as pages per visit and time on site to help you understand the type of users coming from each site. By clicking the goal and ecommerce buttons above the graph you can also see how well these sites have performed with regard to conversions.
I’ve created a Custom Report to help you analyze these metrics: just sign in to Google Analytics, click this link, choose which profile to apply the report to, and enjoy the data:
[Screenshot: link analysis custom report]
All of this information can help you reach a decision as to whether the effort you went to for each link was worth it. However, you are likely to wonder how much of the referral traffic can be attributed to link building, how much to links gained naturally, and how much to social media.
You will need to decide at this stage whether you count traffic from naturally gained links and from social media as link building or whether you separate these out. You could argue the case either way, so it may be best to break it down based on whether you have separate marketing budgets for social, link building off site and creating link worthy content on site.
If there are three different budgets, you will want to break them down; if it’s all under the same roof, then you can analyze everything together. To break the three sections down we will use Advanced Segments.

Advanced Segments

Advanced segments enable data to be shown based on a set of instructions, be it “traffic from Twitter”, “visits with more than 1 conversion” or “visits from users in London”.
We have the following three sections to break down:
  • Social media
  • Natural links
  • Link building
Of these, natural links are the hardest to identify, because you can’t know in advance which referring sites belong in this category. However, logic dictates that if we set up one filter to include all known social media sites, and another to include all known link building sites, we can then exclude both from a third advanced segment to leave only the natural links.
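In set terms, the exclusion logic is simple. A rough illustration in Python (the referrer lists are made up):

    # Hypothetical referring domains seen in your Referrals report
    all_referrers = {"bbc.co.uk", "facebook.com", "twitter.com", "someblog.org"}
    social_media = {"facebook.com", "twitter.com"}
    link_building = {"bbc.co.uk"}

    # Whatever is left after excluding the known groups is treated as natural
    natural_links = all_referrers - social_media - link_building
    print(natural_links)  # {'someblog.org'}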

Link Building Advanced Segment

To create a segment that will only show traffic from the sites that you have targeted through link building follow these steps:
  • Click Advanced Segments
  • + New Custom Segment
  • Name it
  • Select Include
  • Choose Source in the green box
  • Change Containing to Matching RegExp
  • Fill the box with all of your link building domains, putting a pipe between each one and a backslash in front of any dot or hyphen, like so:
Searchenginewatch\.com|bbc\.co\.uk|other\-link\-building\-website\.com
The pipe means ‘or’, and the backslash tells the filter to treat the following character as a literal character rather than as a regular expression metacharacter.
Once this is set up and applied to the relevant Google Analytics profile, you will have filtered all data shown throughout all reports to only show data that included visits from your link building sites.
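If your list of target domains is long, escaping every dot and hyphen by hand is error-prone. As a rough sketch (outside Google Analytics itself, and with a hypothetical domain list), Python can build and sanity-check the same pattern:

    import re

    # Hypothetical domains targeted through link building
    domains = ["searchenginewatch.com", "bbc.co.uk", "other-link-building-website.com"]

    # re.escape() backslash-escapes regex metacharacters such as the dot;
    # the pipe means "or", matching the Matching RegExp option above
    pattern = "|".join(re.escape(d) for d in domains)
    print(pattern)

    # Quick check against a couple of sample referral sources
    for source in ["bbc.co.uk", "facebook.com"]:
        print(source, "matches" if re.search(pattern, source) else "does not match")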

Social Media Advanced Segment

To create a social-media-only segment, follow the steps above, but fill it with social media sites instead of link building sites. Like so:
facebook\.com|twitter\.com|linkedin|del\.icio\.us|delicious\.com|technorati|digg\.com|hootsuite|stumbleupon|
netvibes|bloglines|faves\.com|aim\.com|friendfeed|blinklist|fark|furl|msplinks|myspace|bit\.ly|tr\.im|cli\.gs|
zi\.ma|poprl|tinyurl|ow\.ly|reddit|plus\.url\.google\.com|^t\.co|m\.facebook\.com|tweetdeck|youtube|
ycombinator|flickr|popurls|myspace|pinterest\.com
Or, as I’ve already built it, you can apply this segment to your chosen profile just by clicking this link.
You may want to edit the sites in this segment; just remember to escape any special characters with a backslash, and you may need to start very short URLs with a caret (^) to prevent them matching other sites that end with the same text (see t.co for details!).

Natural Link Advanced Segment

So now that we have those two set up we can copy out the text used in each and then use these as excludes in another new segment, as shown in this image:
[Screenshot: natural links advanced segment]
These three segments can all be applied together to help you compare the different results or individually to help you focus in on the performance of each method. Try applying them to different reports throughout Google Analytics for different insights.
For example, the Ecommerce report will show you the money you are generating from each segment, the location and language reports might show that you’re attracting a more diverse demographic than you expected.
If this method appeals to you and you want to take it further, later on you'll discover how to create custom channel groupings in Multi Channel Funnel reports.

Campaign Tagging

If you don’t have referral data that is easy to break down, you may want to consider using custom tagging on the links that you build, adding information to help you identify them in Google Analytics. This is done with the URL Builder tool, which allows you to choose the campaign name, source, medium and other information for your URLs before you use them in link building or other campaigns.
The URL builder is a way to put together the campaign tracking data in a format that Google can use. You would take the URL:
http://www.mysite.com/awesome-infographic
and make it relevant to the site you are promoting it on:
http://www.mysite.com/awesome-infographic?utm_source=www.link-building-site.com&utm_medium=referral&utm_campaign=link-building
[Screenshot: Google Analytics URL Builder]
You might be familiar with this method from tracking other campaigns, but why shouldn’t it also be applied to link building? Obviously there is a limit to how many links you can add tracking to, and I’ve not investigated whether tagging has an impact on the quality of a link, but if it can help you evaluate the investment in resources and budget then it’s got a high chance of being useful!
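If you tag a lot of URLs, scripting the tagging may be quicker than the web form. A minimal sketch in Python that mirrors the example above (the function name is mine, not part of any Google tool):

    from urllib.parse import urlencode

    def tag_url(page_url, source, medium="referral", campaign="link-building"):
        # Append Google Analytics campaign parameters, URL-encoding the values
        params = urlencode({
            "utm_source": source,
            "utm_medium": medium,
            "utm_campaign": campaign,
        })
        return page_url + ("&" if "?" in page_url else "?") + params

    print(tag_url("http://www.mysite.com/awesome-infographic",
                  "www.link-building-site.com"))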

Multi Channel Funnels

* Images in this section have been edited to protect actual data
In order to understand the bigger picture of your link building, it can be beneficial to know how links have assisted conversions that are not attributed to them. Google Analytics’ standard reports use last non-direct click attribution; this means that if a user first comes to your site through a link you have built, then later returns via an organic search to make a purchase, the conversion is shown as organic.
So does your link building assist conversions at all?
The best way to find out is to take your link building advanced segment from before and use the regular expression to create custom channel groupings, like so:
  • Navigate to Top Conversion Report 
  • Above the data click Other 
  • Select Copy Basic Channel Grouping, rename 
  • Edit ‘Referral’ to be link building sites 
  • Add a new rule for your natural links (excluding the same data as above)
[Screenshot: custom channel grouping for link building]
To reverse this and show the natural links you need to exclude the link building sources and ensure that remaining referral data is caught:
[Screenshot: custom channel grouping for natural links]
It may also be good to keep the social analysis consistent by editing the default social segment to include the sites you identified for your advanced segment.
So after a few short tweaks you will now have a report looking something like this:
[Screenshot: Multi-Channel Funnels link analysis report]
Now, I like the path analysis for showing you the journey that users take, but if your site has many different journeys (as is common) the Assisted Conversions report will be better for you. The custom channel grouping that you just created can be applied to this report and will show you results like so:
[Screenshot: assisted conversions link analysis]
This separates out the data for both assisted and last interaction conversions, neither of which would be visible if you relied only on the standard reports in Google Analytics.

API Export Fun

When working on a link building campaign, you will want to know both the traffic generated and how strong the promoted page of your site becomes. To do this you can export your traffic data from Google Analytics and combine it alongside link and social metrics.
If you haven't used the Google Analytics API before, start by using SEO Tools for Excel, which makes it simple to export from Google Analytics as well as Majestic SEO and other tools.
Using this tool, I created the following report to help analyze the full picture surrounding content marketing within a link building project:
[Screenshot: API analysis report]
This kind of analysis helps you bring everything together and understand not just the value of the traffic but the benefit that your link building has had in strengthening the link profile and social visibility of your pages.
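If you prefer scripting to the Excel plugin, the same join can be done in a few lines. A rough sketch using Python and pandas, with hypothetical CSV exports – one of referral traffic per promoted page from Google Analytics, one of link metrics per URL from a tool such as Majestic SEO (file and column names are made up):

    import pandas as pd

    # Hypothetical exports; adjust file and column names to your own data
    traffic = pd.read_csv("ga_referral_traffic.csv")    # url, visits, conversions
    links = pd.read_csv("majestic_link_metrics.csv")    # url, ref_domains, citation_flow

    # Join the two exports on the promoted page's URL
    report = traffic.merge(links, on="url", how="left")

    # Rank promoted pages by referral visits alongside their link strength
    print(report.sort_values("visits", ascending=False).head(10))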

Concluding Thoughts

As with many aspects of SEO, there is no single method that answers everything. There are different routes that you can take to analyze the value of link building and each project will require its own approach.
The ideas above are here to help you dig a bit deeper next time someone asks you what value the link building has brought, on top of the methods that Julie identified.

Reference: http://searchenginewatch.com/article/2250861/Measuring-Your-Link-Building-with-Google-Analytics

EBriks Infotech: SEO India

Facebook Graph Search: How Multi-Location Brands & Local Marketers Can Capitalize

Local marketing has grown increasingly important in recent years with the proliferation of smartphones, tablets, GPS, and other mobile navigation-enabled devices.
When people on the go search for businesses, they typically need directions to the nearest locations, phone numbers and other details, increasing the importance of data availability and accuracy.
As if local marketers and multi-location brands weren’t already struggling to make this information accurate, available and accessible, Facebook upped the stakes yet again with Graph Search.
Facebook’s Graph Search has already sparked an extraordinary level of interest and media coverage, because it stands a good chance of being the first true disruptor of traditional search behaviors.
Traditional search engines return relevant content from the entire web; Facebook’s Graph Search returns highly personalized content from the searcher’s own personal network (social graph).
By tapping into a searcher’s social graph, Facebook searches for people, photos, video pages, places, etc.; anything shared publicly or shared with the person conducting the search on Facebook is considered.
[Screenshots: Facebook Graph Search results for “restaurants in Chicago” and restaurants friends like]

Opportunity Awaits

This tidal shift gives local marketers, multi-location brands and the agencies, publishers, aggregators and resellers they work with a chance to climb into the driver’s seat and capitalize on this immense opportunity. To do so, they’ll need to get local by claiming a location page for each location and working with local representatives to build an engaged local fan base.
Keep in mind, local Facebook pages have the potential to outperform national brand pages in several key facets when you consider engagement on the individual fan level. A study by Mainstay Salire found that on average local Facebook pages receive five times more marketing reach and eight times more engagement per fan than do corporate brand pages.
Content from local Facebook pages reaches a higher percentage of users’ news feeds, and fans are much more likely to engage with location-specific content. This higher level of engagement and local relevance also gives local pages an edge when it comes to Graph Search.
Those who already embraced Facebook local marketing tactics to grow their local page, increase engagement and dominate users’ news feeds should be in pretty good shape, but even those starting from scratch have a great opportunity to get out in front of the pack and reap rewards. Follow these steps for Graph Search success.

Take Control

  • Be sure to claim location pages on Facebook: Continue to manage the established corporate page but also build out pages for each location that tie to the corporate page. The name, category, vanity URL and information in the “about” section will help users find these pages in Graph Search.
  • Ensure accuracy of location and contact information: This may seem obvious, but inaccurate or incomplete location and contact information leads to frustration for customers and potential customers trying to visit a location or speak to a customer representative.
  • Merge or remove duplicate listings: Often, when customers don’t find a location page when they attempt to “check in,” they will create a Place on Facebook. Places are separate from pages, creating duplicate listings with potentially incorrect or incomplete location and contact information. Be sure to check for these duplicate listings and ask for them to be removed or merged with the correct page.

Engage the Customer

  • Build the local fan base: Be sure to publicize newly created pages for each location and incentivize patrons to become fans of the page through Facebook-only discounts or rewards.
  • Encourage customer reviews, likes and check-ins: The more likes and interactions a page receives the greater the chance it appears in Graph Search results; upon arrival ask visitors to check in to the location. Check-in rewards like coupons and giveaways encourage visitors to check in, spreading the word about the location with their network. Once the customer interaction is complete be sure to ask for a review or recommendation on the Facebook page. Retailers can easily add these requests to customer receipts while anyone with a website can add a recommendation box which ties directly to the Facebook page.
  • Publish relevant local content to spark engagement: Providing customers with interesting, authentic, location-based content will go a long way toward garnering customer loyalty and repeat page interactions. Again, the more interaction a page receives the more likely it appears in Graph Search results.

Avoiding Potential Roadblocks

As with any new technology, there are potential stumbling blocks for those interested in creating, maintaining and optimizing local Facebook pages for Graph Search, including scalability, compliance, and resources.
Multi-location brands often lack the bandwidth to manage more than a handful of location pages at the corporate level, but handing the controls over to franchise or branch managers opens the door to inconsistent brand messaging and page post frequency issues.
Fortunately, there are tools available to assist with these issues; automation technology allows a corporate marketer to manage multiple pages simultaneously while ensuring brand consistency. Oftentimes these tools include varying levels of account access, so corporate marketers can empower branch managers to provide local content while still maintaining control through an established approval process.
When evaluating the variety of tools on the market, brands should look for one that offers multi-page updating, post moderation, compliance and social analytics to measure success. Beyond this minimum standard, brands should find the tool that best suits their needs from a partner they trust.

Don’t Get Left Behind

The competition is already working to maximize their local Facebook presence; marketers who don’t keep pace run the risk of getting left behind. To ensure they’re keeping up, marketers should measure their pages’ success against those of the competition; this provides an easy way to identify page strengths and weaknesses and quickly adjust strategy if necessary.
If by chance marketers discover that the competition hasn’t taken these steps for themselves, they can take the lead. Marketers should also set page-specific goals to meet each month such as a certain number of check-ins or recommendations or increasing the number of page interactions (likes, shares, etc.) by a certain percentage each month.
On the flip side, if marketers remain inactive on this front, the consequences could be disastrous. Choosing to ignore this opportunity severely diminishes Facebook visibility as Graph Search gains momentum, because locations won’t factor into Graph Search results.
Marketers spent significant time in recent years establishing a presence with corporate Facebook pages. Some have taken the next step to capitalize on the opportunity to be more locally relevant. Graph Search shortens the timeline and raises the stakes for those who haven’t yet.

Reference: http://searchenginewatch.com/article/2250855/Facebook-Graph-Search-How-Multi-Location-Brands-Local-Marketers-Can-Capitalize

EBriks Infotech: SEO Company

AdWords Bill of Rights



Google dropped the Panda of paid search with enhanced campaigns, the most radical change to managing paid search in years.
Product changes are inevitable, but their value is often mixed. Consider session-based broad match vs. modified broad match.
Google needs advertisers as much as advertisers need Google. We can’t drive the evolution of AdWords. But, we as paid search professionals can make sure that our needs are reflected in the options and formats Google chooses to pursue.
That’s why I think we need an AdWords Bill of Rights – a series of basic requirements that any performance marketer should be able to expect from their advertising vendor. Let me know what you’d amend in the comments.

1. Opt-in vs. Opt-Out

Fundamentally, I believe that advertisers should have choice of which features to use. Extensions are a great example of an opt-in feature. They’re often useful, but the choice of when and where they’re applied is up to you. The option to automatically show your ad for plurals and misspellings when you’re running phrase and exact match is another good example.
Combining all keywords together for every device is a bad example of choice. This takes advantage of the default bias, much in the same way combining search and display networks does.

2. Segmentation

Not every prospect is equally valuable. We as marketers want to be able to divide, target and value different users as granularly as possible.
Econsultancy reported that tablet users spend about 21 percent more than desktop shoppers. While tablets may gradually be replacing laptops and desktops, the makeup and potential ROI of those visitors may not yet be at parity for every advertiser.

3. Maximum Transparency

Search query reports and placement reports are the mainstays of great optimization. You know where your ads ran.
Enhanced campaigns will simplify management, but complicate reporting. Campaign numbers will become increasingly less useful as more sources of clicks are lumped together, with bids modified by layered bid adjustments and extensions served up differently within each campaign.
If there is a data point before the click, we should be able to see our results with that data after the click and adjust accordingly.

4. Efficiency

Google needs to make money. Advertisers need to make money. The tension between the two often meets in product features. Remember when Google wanted to force text ads to optimize for clicks instead of rotating evenly?
As performance marketers, we have the right to features that allow us to see how our money is being spent and to make it more efficient.

5. Accurate Measurement

Web analytics are often murky, but they are more sophisticated now than in the early days of log files. Mobile web and app analytics are still very much in their early stages.
As we’re pushed increasingly to non-desktop devices, we have the right to expect the same level of insight and data we take for granted on the desktop web.

Reference: http://searchenginewatch.com/article/2250827/AdWords-Bill-of-Rights

EBriks Infotech: SEO Company India

Creative Link Prospecting: Following a News Story



Sometimes a breaking news story spreads like wildfire. It’s covered extensively by mainstream media, specialist sites, and expert bloggers.
Such stories leave a trail behind them that can provide a rich stream of quality link prospects. Examining breaking news stories can help you discover prospects even months after the news event.
Link building is a creative process, so it’s good to try something different once in a while. This link prospecting exercise might help you get the creative juices going. Have some fun and you might just come up with some unusual link prospects.

What’s so Special About a Breaking News Story?

Breaking news stories have a fantastic momentum that pulls people to them – they attract journalists, bloggers, and experts who write regularly about the topic of the breaking news.
How they react to the story and the position they take gives you insights into their individual views, the things they take a stand on, and the types of stories they’re likely to cover. All of this is fantastic intelligence if you want to approach them in the future.
The coverage gives you great insights into the different angles that can be covered in a single news story – if you haven’t done too much public relations in the past, this can be a revelation and can spark some terrific creative ideas.
These stories take you outside the “usual suspects” that might be in your database or media directories, and identify others that you might not have thought of. So they expand your thinking about the sectors that might be relevant.
Quirky stories are the best – the sort that go viral without being pushed to go viral. Think “link bait” without the premeditated intention behind it. Here are three recent breaking news stories we can learn from.

Neverseconds

A great example is the story of the schoolgirl Martha Payne, and her blog Neverseconds: the blog consists simply of photographs and a short daily review of her school dinner. Nothing special in that – it only turned into a major news story when the local Council decided to forbid her from publishing her blog.
The result was worldwide outrage as the story spread. Here’s one story from the New York Times:
[Screenshot: New York Times story on Martha Payne]
And the writer, Ravi Somaiya, was not alone. Probably every food reporter in all the major media outlets had to jump in, and probably every food blogger on the planet was stirred to write about it – even those who hadn’t posted for weeks.
Add to that blogs on education and schools, blogs on local government, not to mention a host of blogs on public relations, online reputation and online mess-ups!
If you were in the food industry, analyzing who wrote about and linked to this story would give you a healthy catch of link prospects – and a great opener for getting in touch with them.
Here’s how Majestic SEO reports the blog’s backlink discovery rate – and that of the Argyll and Bute Council, and of “Mary’s Meals”, a charity supported by young Martha.
[Screenshot: Majestic SEO backlink discovery charts]
MuckRack.com provides a subscription-based database of journalists from top publications and broadcasters. The database can be searched by publication, by beat or by keyword.
Every tweet shown is a tweet or retweet from a verified journalist. It offers a fantastic opportunity to understand and build relationships with top journalists and editors.
MuckRack searches not only in the tweet, but also in the actual article that has been tweeted, making it a powerful discovery tool.
Here’s just a few of the many results from MuckRack:
[Screenshot: MuckRack results for Martha Payne]
CitationLabs.com’s Link Prospector tool is another that can be used to uncover news stories:
[Screenshot: Citation Labs Link Prospector results]

BrewDog Craft Beer Award Scandal

Here’s another where the ineptitude of a major corporate is hard to fathom. BrewDog.com is a well-known Scottish craft brewer that exports worldwide – and happens to produce excellent beers. So good, in fact, that they won first prize for “Bar Operator of the Year” in the British Institute of Innkeeping’s Annual Scottish Awards.
However, the event’s sponsor, Diageo, refused to give the award to BrewDog – even though they were the clear winners.
[Screenshot: Huffington Post story on the BrewDog award scandal]
Again, the story spread virally and brought a surge of new backlinks to both Brewdog.com and Diageo.com.
[Screenshot: Majestic SEO backlink charts for BrewDog and Diageo]
And here’s an example of a search on MuckRack.com:
[Screenshot: MuckRack search for BrewDog]

Horse Meat Scandal in Europe

There’s been a growing story of how horse meat has entered the food chain and been misrepresented as beef. The story first broke in Ireland, spread to the UK and then to Europe. Stories on the scandal have now reached the U.S.:
[Screenshot: CNN story on tainted horse meat]
MuckRack.com can show exactly what journalists are saying:
[Screenshot: MuckRack results for horse meat]

InkyBee.com

Inkybee is a new blogger outreach tool that is aimed at the public relations industry. However, it’s also a great place to discover important blogs writing about specific topics.
The tool will emerge from public beta testing in the next few weeks and promises to offer a powerful blogger outreach tool at a competitive price.
Here are just some of the results for a search on horse meat:
[Screenshot: InkyBee results for horse meat]

Conclusion

Following news stories can give you a great idea of how stories spread and who the most active bloggers and journalists are, and it suggests ways that you can build relationships.

Reference: http://searchenginewatch.com/article/2250881/Creative-Link-Prospecting-Following-a-News-Story

EBriks Infotech: SEO India

Why SEOs Are Focusing on Bing SEO Along with Google



It is beyond doubt that until now Google has dominated the search market, but from 2013 Google and Bing will both be helping people search for information. When it comes to SEO and site optimization, people have looked only at Google, treating it as the base for all their efforts and innovative ideas because of Google’s dominant position in the online world. With the coming of the new year, however, people have started focusing on Bing too.

The share of Bing in the search engine market has increased. After all, the Microsoft-powered search engines Bing and Yahoo together hold one quarter of the total search engine market. The recent release of Bing’s webmaster guidelines also reflects its growing popularity.

Bing lays more stress on the quality of content: content is for people, not for search crawlers. The information shared in a piece of writing should convey the right message to the target audience; a top ranking without any useful information is just a trick, and won’t be of much use to people. Keywords should not be over-optimized. Optimizing the right page in the right place can do wonders for your conversion rate, bringing in more business leads and increased sales. Content should be written not for the purpose of SEO, but with the aim of bringing useful information to users and visitors.

Social media participation is the next area to look into. The involvement of people is very much required for improving the ranking of a website. Sharing information on social media sites should not just be about posting; the content should be interesting and useful enough for people to like and share. Queries posted by followers should be answered, which also improves the reputation of the company. Online reputation is very much needed for online visibility.

Bing also looks at the page load time of a website, as websites with longer load times are not liked by people. Bounce rate should also be taken into consideration: a high bounce rate is a direct indication that users don’t much like your site. Navigation patterns and page content are other areas to look into.
In addition to these, there are several other areas to consider, including sitemaps, website technology, anchor text, etc. Certain things should be avoided, such as duplicate content, link schemes, cloaking, etc.

EBriks Infotech: SEO Company

Digital Marketing Strategy For Auto Dealers by EBriks Infotech

EBriks Infotech shares a good video showing the factors that describe a digital marketing strategy for auto dealers. EBriks Infotech, an SEO services company, provides digital marketing strategy services. If you want more info, please visit www.ebriks.com.

Tuesday 26 February 2013

Nielsen Average Social Media User Infographic By EBriks Infotech

Who is the average visitor to social networks and blogs? Here EBriks Infotech shows you a quality infographic on the subject. If you want to see it, visit www.ebriks.com.

Tracking the Impact of Digital Marketing by EBriks Infotech

What is the impact of digital marketing on the online market, and how do you track it? Here EBriks Infotech shows you a quality infographic on the subject. If you want to see it, visit www.ebriks.com.

10 Lessons to Run Any Successful Business by EBriks Infotech

Hello friends! Here EBriks Infotech shares a PDF about 10 top-level lessons for running a successful business. These lessons will improve your business strategies and business growth. If you want more info, please visit www.ebriks.com.

SEO Guides Should Add Google+ Among Primary Necessities



Google+ is one of the social websites that hold importance for businesses. Getting connected to this site directly influences online visibility, which makes connecting a business website to Google+ a wise decision. The “+1” button here is similar to the “Like” button on Facebook.

With the intention of making Google+ more popular, even more than Twitter or Facebook, Google rewards +1’d links with more traffic and higher weighting when it comes to ranking on Google SERPs. This is because a +1’d link gets a higher click-through rate and is shared more across various social platforms, ultimately leading to more reach, more leads and more business. The difference can easily be noted by businesses that have already listed their websites on Google+. At the same time, Google+ profile page optimization is becoming popular.

Let’s discuss a few of the important parameters to focus on in this direction.
Google AuthorRank is a status that Google awards to sites for their online presence and credibility. In order to achieve this status, one should fulfil certain eligibility factors, as defined by Google. The best way to build AuthorRank is to have attractive content that attracts fans, connecting social media with your online web content. Achieve the status and see your Google+ image in search results.

Content plays a major role here. The content posted should be informative and of high quality, such that readers add you to their circles. The higher the number of circles you are in, the more famous and successful you will be in your online business. Apart from adding good content, you should also optimize your Google+ profile page, attaining 100% profile completeness. Getting into high-profile circles and sharing content publicly can also do good for your business. Don’t forget that engagement and involvement are what Google is aiming for with its users. An attractive, thoughtful title along with a topic-related image is a further advantage: an image increases the probability of your content getting shared.

Google+ doesn’t end there: it has brought much more with its chat, which can be a personal chat or any discussion.

SEO experts believe that adding a Google+ page to a business listing can be a game changer for online businesses. Participating in the Google+ platform is expected to do a lot of good for businesses. It increases visibility, as social extensions link your Google+ page to your AdWords campaign, so all “+1”s from your page, ad, website, etc. get combined together.

EBriks Infotech: SEO Company India

Google Panda Two Years Later: Losers Still Losing & One Real Recovery

(Editor’s Note: This is the first in a series of articles looking at the aftermath of Google’s Panda algorithm update, which launched February 24, 2011.)
Two years ago today, Google sent shockwaves through not only the SEO industry, but also through online publishing in general when it launched the Panda algorithm update.
It was originally called the “farmer” update because Google’s prime target was “content farms,” a name used to describe sites that created high quantities of low-quality content that sometimes ranked highly in Google’s search results. Although Google didn’t specifically say it was targeting content farms when Panda launched, Matt Cutts, head of Google’s webspam team, told us at the time: “I think people will get the idea of the types of sites we’re talking about.”
People did.
And Google’s targets became more obvious in the days after Panda launched when several search and software companies began issuing lists of winners and losers — websites that had been hurt or helped by Google’s algorithm change.
Of course, for every loser that lost search visibility, there’s also a winner that gained search visibility. But few of those winners have spoken out in the two years since Panda.
As you’ll see below, on a list of nearly two dozen of Panda’s original losers, only two websites have returned to the SEO visibility that they had about three weeks post-Panda. The others have all continued to lose search visibility.
Some other Panda-hit websites have recovered, though not all of those recoveries have been permanent. We’ll look at all that later in this article. First, some background.

Background: Panda’s Original Winners & Losers

It only took two days for the first look at Panda’s winners and losers to come in. Companies like Searchmetrics, Sistrix and others used their own tools and data to determine which websites lost or gained visibility in Google’s search results. Though these reports are far from official, many of the sites impacted eventually stepped forward to confirm that they were hit.
Panda’s early winners included several major content destinations like YouTube and Wikipedia, plus large brands like eBay and Amazon. Hundreds of other sites, big and small, no doubt saw their visibility go up as others were hurt. As I said above, for every loser that drops out of Google’s search results, there has to be a winner replacing it.
Our reports on Panda’s early losers listed hundreds of sites; here are some that were commonly included:
  • EzineArticles.com
  • HubPages.com
  • AssociatedContent.com
  • Mahalo.com
  • Examiner.com
  • Suite101.com
  • Buzzle.com
  • Squidoo.com
  • Buzzillions.com
You might’ve expected to see some of Demand Media’s sites on that list, but they were largely left off the first lists of Panda losers. More on that in a moment.

How Are The Panda Losers Now?

In a nutshell: Still losing.
In fact, some of Panda’s losers no longer exist and others have completely changed their name and/or business model. That’s the topic of tomorrow’s article.
We recently asked Searchmetrics to go back to one of its original lists of Panda losers from two years ago, and run its same “SEO Visibility” report on some of them. The company did that last week, and provided loads of information. (Note: We also contacted Sistrix with a similar request, but didn’t receive a reply in time for inclusion in this article.)
Searchmetrics looked at 22 Panda losers and compared their visibility in Google’s search results at three points:
  1. Before Panda (February 20, 2011)
  2. After Panda (March 13, 2011)
  3. Now (February 17, 2013)
The results? None of the 22 sites has returned to its pre-Panda visibility, and only two sites have improved their visibility today compared to their post-Panda visibility.
Here’s the spreadsheet that Searchmetrics shared with us:
[Screenshot: Searchmetrics SEO Visibility spreadsheet for the 22 Panda losers]
(Note: The numbers reflect Searchmetrics’ “SEO Visibility” score, which doesn’t reflect estimated traffic losses, but instead reflects how visible a domain is in Google’s search results across millions of keywords that the company tracks.)
In the image above, the key columns are to the far right: H and I. The way to read it is this: Suite101.com has seen its SEO visibility drop 96 percent since before Panda and has dropped 81 percent from its post-Panda visibility.
If you browse down column I, you’ll only see two sites with a positive number: both MerchantCircle.com and Business.com have rebounded enough to have their current SEO visibility scores be better than they were after Panda launched. But, as column H shows, both are still far less visible than they were before Panda came along — as are all of the other 20 sites in this Searchmetrics list.
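For clarity, columns H and I are simple percent changes computed from the visibility scores at the three dates. A quick sketch with made-up numbers:

    # Hypothetical SEO Visibility scores for one domain
    pre_panda, post_panda, now = 100000, 20000, 4000

    change_vs_pre = (now - pre_panda) / pre_panda * 100      # -96.0 (column H)
    change_vs_post = (now - post_panda) / post_panda * 100   # -80.0 (column I)

    print(f"{change_vs_pre:.0f}% vs. pre-Panda, {change_vs_post:.0f}% vs. post-Panda")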
The SEO visibility chart for Business.com, shown below, is an eye-opener.
[Screenshot: Searchmetrics SEO Visibility chart for Business.com]
Panda’s impact is obvious in February 2011, and the site’s visibility looks like a seesaw after that. It appears to have won back visibility in late 2011 or early 2012, around the time of Panda 9 or 10. It’s bounced a few times since then and today is doing a little better than it was right after Panda, but nowhere near pre-Panda visibility.

What About Demand Media Sites?

They’re not included above because, for the most part, they weren’t originally among the big Panda losers.
The company’s flagship site, eHow.com — a site that many associated with the term “content farm” — was actually reported to have gained visibility when Panda launched. That didn’t last long, though; eHow was hit a couple months later when Google rolled out Panda 2.0. Searchmetrics’ chart shows eHow gaining visibility in February 2011 when Panda launched, but losing it in April 2011.
[Screenshot: Searchmetrics SEO Visibility chart for eHow.com]
Although the site’s visibility appears to have gained a bit since September 2012, it’s still down 63 percent in Searchmetrics’ SEO visibility score compared to pre-Panda levels.
Another Demand Media site, Livestrong.com, spent much of 2012 on the rebound from Panda.
Searchmetrics says its SEO visibility dropped 35 percent in the first few weeks post-Panda — far less than some of the others mentioned above. But, as the chart below shows, it not only rebounded in 2012, but also far exceeded its pre-Panda SEO visibility … at least until the latter portion of the year.
[Screenshot: Searchmetrics SEO Visibility chart for Livestrong.com]
After regaining visibility all year long, it appears that Livestrong was hit hard by Panda Update 22 in late November. It’s been dropping ever since. Today, Livestrong.com is about 13 percent below its pre-Panda visibility.
Panda hurt Demand Media: A year ago, the Los Angeles Times reported that Panda was to blame for Demand suffering a $6.4 million loss in Q4 of 2011.
But just last week, in its latest earnings report, Demand Media said that page views were up 24 percent in 2012 (compared to 2011) on its owned and operated websites, “driven primarily by strong traffic growth on eHow.com and Livestrong.com.” In the statement, CEO Richard Rosenblatt said the company “improved content quality” in 2012 and is “now prepared to significantly increase our content investments in 2013.”
Despite that optimism, Demand Media’s sites appear to be a mixed bag at this point in terms of post-Panda recovery.
At least one other site, however, has done better.

MotorTrend.com: A True Panda Recovery

Motor Trend is a long-running magazine with a presumably trusted website, and its annual “Car of the Year” award is about as prestigious as it gets in the auto industry. I’m not a Motor Trend reader, nor have I ever spent quality time on the MotorTrend.com website. So, I can’t speak to whether it deserved to get hit by Panda. But it certainly did, as this Searchmetrics chart shows:
[Screenshot: Searchmetrics SEO Visibility chart for MotorTrend.com]
MotorTrend.com was obviously hit in the initial Panda update, then recovered in July 2011 around the time of Panda 5. It dropped again with Panda 7 — and we mentioned it in our coverage — then quickly recovered again a couple weeks later with Panda 8.
Today, the site appears to have steady visibility based on Searchmetrics’ scoring — and better visibility than it had pre-Panda. I don’t recall ever reading about Motor Trend’s trials in dealing with Panda, but it might be an interesting read (assuming the magazine was aware of that five-month visibility drop).
The irony of Motor Trend being hit by Panda, and then recovering as it has, is that one of Google’s 23 questions for Panda-hit webmasters was, “Would you expect to see this article in a printed magazine, encyclopedia or book?”
Perhaps Google realized the answer to that question, in this case, was “yes.”

Further Reading

For more about the Google Panda update, read through the Panda categories in our article library.
This series on the second anniversary of Google’s Panda update continues tomorrow with a look at its aftermath and impact on several of the sites that were labeled as losers. (UPDATE: That article is now online: Google Panda Two Years Later: The Real Impact Beyond Rankings & SEO Visibility.)


Reference: http://searchengineland.com/google-panda-two-years-later-losers-still-losing-one-real-recovery-149491

EBriks Infotech: SEO India