Wednesday, October 5, 2016

This write-up discusses the most common pitfalls to avoid when hiring an SEO reseller agency in India. Falling into them can be risky, even disastrous, for your business.

Are you going to hire a private label SEO reseller for your company? If so, it can feel like walking to the edge of a cliff. A huge number of agencies have sprung up and become popular in the last few years due to the increasing demand for private label services. Because the industry has grown so fast, it is extremely challenging to choose the right firm for your business. Many freelancers as well as large web agencies offer these services in order to leverage the benefits of SEO reselling as part of their business.

SEO resellers in India have gained even more popularity because of the cost advantage in addition to the expertise. But given the number of options, it is important to take every step carefully to ensure that you choose the right agency and avoid disastrous pitfalls in how they work.

Pitfalls to Avoid When Hiring an SEO Reseller in India
·         Black Hat SEO Techniques.
If you are in a web-related business (or any other business) and have little or no knowledge of SEO, the first thing to watch out for is black hat SEO techniques. In the rush to drive more traffic to a website, many unethical agencies use aggressive black hat techniques that search engines no longer tolerate: spammy link building, content stuffed with keywords, irrelevant content, and other poor practices. These may work in the short term, but in the long run they bring only penalties from the major search engines. Once the mess is created, cleaning it up can take years and cost a great deal.

·         Poor Content is Hazardous to Your Brand.
In the whole process of SEO, content is king, and if the king is bad, everything goes bad. From your website to social media and other platforms, content will be posted in several places on your behalf. Poor-quality or duplicate content riddled with grammatical errors does not bring results and also hurts your brand image. Therefore, it is important to check the SEO reseller's portfolio and ask them to write one or two samples for your business.

·         Lack of Product/Service Knowledge.
In order to run successful SEO campaigns, detailed knowledge of the business in question is a must. An agency must understand the products and services they are promoting, and be aware of the challenges and competition involved. If you hire an agency that has experience working on similar projects, that is a bonus: it will help you achieve your business goals more quickly. They must be confident about and passionate towards their job.

·         Don’t Leave It Entirely To Them.
A good SEO reseller will do everything on your behalf. But it is important to stay involved and communicate with your agency on a regular basis. Spend time explaining your goals to them and agree on a regular reporting schedule. Try to remove all communication barriers, which are perhaps the biggest obstacle in these kinds of dealings.

By avoiding these pitfalls, you can ensure that your marketing campaigns are in the right hands. In this way, an SEO reseller in India can prove to be a blessing for your business while avoiding the risks of online marketing.

Tuesday, June 24, 2014

Knowledge Graph for Free Blog

Hi Guys,

Do you have a knowledge graph for your free blog like ***
Do you think the Knowledge Graph is only for big brand websites?

If you are good at SEO and have done ethical SEO for a less popular website or blog, the results can be just as appealing. Check out the example below for SEO Fundas (my free Blogspot blog).

Friday, May 23, 2014

YouTube SEO Tips

Wednesday, March 19, 2014

Bing Webmaster Tools: An Overview

If you don't yet have a Bing Webmaster Tools (BWT) account, just go ahead and sign up. In the two years since we first published this guide to BWT, they've added more and more features that can help you manage your sites.
Just like Google Webmaster Tools (GWT), Bing's Webmaster Tools provides some great data for webmasters to use and address potential SEO issues, and serves as the primary mechanism for Bing to communicate those issues to site owners. All of this for the princely sum of free (which is a very fair price IMHO).
For those unfamiliar with Bing Webmaster Tools, this guide will walk you through the various features available to all webmasters.

Adding a Site

The first step is to log in to Bing Webmaster Tools, enter the URL of your home page, and click on the ADD button. This takes you to a screen to enter a sitemap URL and to enter some basic day parting information.
Bing Webmaster Tools Add a Site
Once you've clicked the ADD button you'll be taken to the dashboard page where you'll see a thumbnail of the home page and a note informing you that you need to verify the site. To do so, click on the "Verify Now" text.


Bing offers multiple verification methods.

XML File Verification

With this option you'll download a file named "BingSiteAuth.xml" which will automatically have an entry keyed to your account. This file will need to be placed in the root directory of your site. Once it's there you click the Verify button at the bottom of the page.

Meta Tag Verification

With this option you'll take the line of code provided and place it in the <head> section of the home page of your site. Once it's there you click the Verify button at the bottom of the page.
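If you want to confirm the tag actually made it into your live page before clicking Verify, a small stdlib-only check is sketched below. The `msvalidate.01` name is the one Bing's verification meta tag uses; the token value here is a made-up placeholder, not a real key:

```python
from html.parser import HTMLParser

class BingMetaFinder(HTMLParser):
    """Collects the content of a <meta name="msvalidate.01"> tag, if present."""
    def __init__(self):
        super().__init__()
        self.token = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "msvalidate.01":
            self.token = attrs.get("content")

def find_bing_token(html: str):
    """Return the Bing verification token found in the page, or None."""
    finder = BingMetaFinder()
    finder.feed(html)
    return finder.token

page = '<html><head><meta name="msvalidate.01" content="PLACEHOLDER123" /></head></html>'
print(find_bing_token(page))  # PLACEHOLDER123
```

In practice you would fetch your home page's HTML (for example with `urllib.request`) and pass it to `find_bing_token` to make sure the tag survived your CMS or template pipeline.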

DNS Verification

This option is more technical than the previous two, but if you only have access to your hosting solution, and don't have the ability either to modify the head of your home page or to drop files in the root of your domain, then it is the only option that will work for you.
You'll need to add a CNAME record with the name provided, pointing to the value Bing specifies. There are instructions on how to do this for the majority of popular hosting solutions.
Once this has been done, you just have to click the Verify button at the bottom of the page.


Bing Webmaster Tools Dashboard
Clicking on your site takes you to the dashboard. Here you'll see trending data for the last month – clicks, impressions, pages crawled, crawl errors, and pages indexed. You'll also see some basic information on your sitemaps, top keywords, and top pages linked to.

Configure My Site

Clicking on this tab will give you a dashboard of data for items that you can affect when configuring your site. Clicking on the sub-tabs will give you more detail and allow you to make changes where applicable.


Bing Webmaster Tools Sitemaps
Here's where you can submit new sitemaps, and be informed of the success/error status of the sitemap.

Ignore URL Parameters

Bing Webmaster Tools Ignore These URL Parameters
Here you have the ability to instruct the Bing crawler (Bingbot) to ignore URL parameters that have no impact upon the content of the page (usually tracking parameters).
This option exists because of concerns over duplicate content – the same content at different URLs. By instructing the crawler to ignore certain parameters, the potential for duplicate content is reduced.
If you're using canonical tags you have no need to use this option as the canonical tags will take care of normalizing your URLs. If you're not using canonical tags, simply enter the key element of the parameter to be ignored and click submit (then get a canonical tag project on your product roadmap so you can ignore this in the future).
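For illustration, here is what that kind of URL normalization looks like in a few lines of Python. The parameter names in the ignore list are hypothetical examples of tracking parameters, not anything Bing prescribes:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Example parameters that don't change page content (illustrative only)
IGNORED_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

def normalize_url(url: str) -> str:
    """Drop content-irrelevant query parameters, keeping the rest in order."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in IGNORED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

print(normalize_url("https://example.com/product?id=7&utm_source=news"))
# https://example.com/product?id=7
```

Both "Ignore URL Parameters" and canonical tags are doing a version of this: collapsing many parameter variants down to one canonical address so the engine treats them as a single page.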

Crawl Control

Bing Webmaster Tools Crawl Control
Search engines want to be able to get as much good content from your site as possible, so they can in turn provide that content to users within the search results, but they also recognize that they don't want to harm your business while they do that.
Bing allows you to customize their crawling pattern so they hit your site the hardest when you have the least traffic coming to it. You can accept one of their default options or specify the best times for you to be crawled yourself.

Deep Links

Similar to sitelinks in Google, Deep Links give your page more shelf space and visibility in the search results by providing additional content options for users to click through to. These are automatically generated based on the pages that Bing deems to be the most important and relevant to users.
Bing Webmaster Tools Deep Links
You don't have the ability to add Deep Links, but clicking on one of the URLs takes you to a page that shows the deep links in order. There you can block one or more of them (for example, if a completely unrelated or login-required section is displaying), or provide weighting information for each option in order to reorder them.

Block URLs

Bing Webmaster Tools Block URLs
If you need to remove either a page or a directory from the Bing index, here's where you do it. Simply select page or directory, enter the URL and click on either of the block buttons depending on whether you're blocking it from just the cache or completely.
Unlike Google, Bing doesn't require that you've removed or redirected the original content. Instead, they block it for 90 days; if it's still crawlable after that and you've not extended the block, they'll re-index it.

Page Preview

Bing Webmaster Tools Page Preview
Here is where you can either block a page preview image within Bing, or request a page preview refresh. You would do this if you have indexed content that you need to remove as quickly as possible for legal or other reasons. Note: it may take up to 24 hours for the page preview removal / refresh to actually happen.

Disavow Links

Bing Webmaster Tools Disavow Links
Here you can inform Bing of any links to your site that you really don't want. This is how Bing is looking at dealing with negative SEO. Simply select page, directory, or domain and enter the location that houses the link to your site.


Bing Webmaster Tools Geo-Targeting
This section allows you to set geo-targeting information for your entire site, a subdomain, a directory, or an individual page. Bing will take this information as a suggestion for how your identified page(s) should display in the search results. Note that this will not override stronger geo signals such as a country-specific TLD.

Verify Ownership

Bing Webmaster Tools Verify Ownership
This section confirms that your account has been validated to access information about this domain. Should the validation keys be removed, then you have the same options here as you had when you originally added the site (see above).

Connected Pages

Bing Webmaster Tools Connected Pages
If your site has social pages (if it doesn't... why not?) here's where you can hook them up to your Bing account. Simply enter the appropriate URLs for your pages here for everything from your Facebook page down to your MySpace page (yes, that's actually there) and hit the verify button. Once you've added them you'll start seeing impression and click data from Bing in the connected pages dashboard, which displays on this page.


Bing Webmaster Tools Users
Here you have the ability to add new users without going through the verification process for each one (although to get here one user has to be manually verified). Simply add a valid LiveID email address, select the role you want them to have (Read only, Read/Modify, or Administrator) and click add. That user will then see this site displayed in their BWT dashboard.

Reports and Data

This section gives you, as the name suggests, access to reports and data on the effectiveness of your site on Bing. In most of these sections you have the ability to export the data and play with it in the spreadsheet program of your choice.

Page Traffic

Bing Webmaster Tools Page Traffic
Page Traffic shows you the traffic stats for the top performing pages on a site. You get to see click data, impression data, CTR data, the average position when clicked, and the average position when viewed. The View hyperlink at the end opens up a window that shows you the keywords for that URL and their data.

Index Explorer

Bing Webmaster Tools Index Explorer
This fount of information lets you see all the data about the pages that Bing has crawled, or attempted to crawl. You can see the number of URLs discovered, the number of them that have surfaced in search, and those that have been clicked on in search. You can click down through the folders and get the data just for that section, which is a valuable tool for a site that segments content on a folder basis rather than a subdomain basis.
One really nice feature here is that it shows the subdomains that have been crawled, so if a dev forgets to put the right robots.txt on your sandbox site you'll see it listed here.
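As a sanity check for that sandbox scenario, Python's stdlib `urllib.robotparser` can tell you whether a given robots.txt actually keeps Bingbot out. The sandbox hostname below is hypothetical:

```python
from urllib.robotparser import RobotFileParser

# The robots.txt you'd expect on a sandbox/staging subdomain: block everything.
sandbox_rules = """\
User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(sandbox_rules.splitlines())

# With these rules in place, Bingbot is kept out of the whole sandbox site.
print(rp.can_fetch("bingbot", "https://sandbox.example.com/any-page"))  # False
```

If Index Explorer shows pages from a subdomain that should print `False` here, the robots.txt on that host is missing or wrong.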
You can also filter the data to show only pages with 301 redirects, 404 errors, or identified malware infections with a single click. If you want to see pages that have returned other error codes (e.g., the 500 series), then all you have to do is select that range from the HTTP code drop-down.

Search Keywords

Bing Webmaster Tools Search Keywords
Here you're going to see some analytics data showing your top performing keywords in Bing. You'll be able to view the clicks, the impressions, the CTR, the average position when clicked, and the average position overall. Clicking on the View hyperlink shows all of the pages that were served up when that keyword was searched on Bing or Yahoo.

SEO Reports

Bing Webmaster Tools SEO Reports
Here you're going to get all of the SEO recommendations that will help your site comply with SEO best practices. Simply click on the error to get a full description of the problem along with a list of the top 50 pages that were non-compliant.

Inbound Links

Bing Webmaster Tools Inbound Links
This section shows the external links that Bing has found that point to your site. The trending information shows whether you're growing or losing links. Clicking on either the Target Page or on the count of links for that page brings up a popup window that shows you up to 20k links (and associated anchor text) for that page.

Crawl Information

Bing Webmaster Tools Crawl Information
This section contains similar data to the Index Explorer section, but gives a different view into the data. To see the pages that the errors have been reported for, click on the number under the error type and they'll be displayed below.


Bing Webmaster Tools Malware
In the Malware section Bing will inform you of any malware they have detected on your pages. This also includes any links on your page to pages that have been identified as ones containing malware.

Diagnostics and Tools

This section currently has seven useful tools that any webmaster can use.

Keyword Research

Bing Webmaster Tools Keyword Research
As is frequently expressed here on SEW, keyword research is one of the fundamental tasks of any SEO campaign. Here Bing gives you access to their keyword research data so you can see the query volumes on Bing for the keywords you're interested in, along with related keywords which can give you ideas for other areas that you may want to target.

Link Explorer

Bing Webmaster Tools Link Explorer
In this section you can enter any URL and get a list of pages that link to it. Other filter options are available: you can filter by site to get a list of the pages that link to that URL, filter by anchor text to see who links with "Click Here" and who links with the brand name, and filter by source – internal, external, or both (note: internal also includes subdomains).

Fetch as Bingbot

Bing Webmaster Tools Fetch as Bingbot
If you'd like to see your page as Bing's crawler – Bingbot – sees it, just enter your URL here and click fetch. You'll then see the headers and content for the page displayed. Note that, unlike the previous tools, this only works on pages that you are verified to view in Bing Webmaster Tools.

Markup Validator

Bing Webmaster Tools Markup Validator
If you're using structured markup (Microformats, RDFa, Open Graph, etc.), this tool will validate your page to ensure that it correctly meets the specifications. If it does, then Bing may use that structured data directly in their search results.

SEO Analyzer

Bing Webmaster Tools SEO Analyzer
Here you can ask for analysis of an individual page to see what work would need to be done for the page to comply with SEO best practices.

Verify Bingbot Tool

Bing Webmaster Tools Verify Bingbot Tool
If you're looking through your log files and come across an IP address that you're not quite sure belongs to Bingbot, simply pop over here, enter the IP address, and you'll get confirmation either way.
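Under the hood this is the classic reverse-then-forward DNS check, which you could also script yourself. A sketch follows; the `search.msn.com` suffix is the domain Bingbot hosts reverse-resolve into, but treat the exact suffix list as an assumption to confirm against Bing's documentation:

```python
import socket

# Assumed domain suffix for genuine Bingbot hosts
BING_SUFFIXES = (".search.msn.com",)

def hostname_is_bing(hostname: str) -> bool:
    """True if a reverse-DNS hostname sits in Bing's crawler domain."""
    return hostname.rstrip(".").endswith(BING_SUFFIXES)

def verify_bingbot(ip: str) -> bool:
    """Reverse-resolve the IP, check the domain, then forward-resolve to confirm."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)          # reverse DNS
    except socket.herror:
        return False
    if not hostname_is_bing(hostname):
        return False
    try:
        return ip in socket.gethostbyname_ex(hostname)[2]  # forward confirmation
    except socket.gaierror:
        return False
```

The forward lookup matters: anyone can fake a reverse-DNS name, but only the real owner of the hostname can make it resolve back to the IP in your logs.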

Site Move

Bing Webmaster Tools Site Move
If you're ready to move your site to a new domain, or you're moving content around your site, this is where you tell Bing. Sure, they'll pick up the redirects and eventually update the index, but this can help hurry that process along.

Message Center

Bing Webmaster Tools Message Center
This is where Bing will communicate with you: any issues they need to inform you of, from malware detection to crawl speed concerns, will appear here. If you work with multiple sites you can filter by site, and if you have lots of messages you can also filter by message type.

Webmaster API

Bing Webmaster API
If you'd like to programmatically pull any of the data from Bing Webmaster Tools into your internal tools, Bing provides an API to do so. Simply follow the directions in the documentation and use the API key provided on this page.
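As a rough sketch of what a call looks like: the JSON endpoint and `GetQueryStats` method below match Bing's Webmaster API documentation at the time of writing, but treat the method name, parameters, and response shape as assumptions to check against the current docs before relying on them:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API_BASE = "https://ssl.bing.com/webmaster/api.svc/json"

def build_request_url(method: str, api_key: str, site_url: str) -> str:
    """Build a GET URL for one of the JSON API methods."""
    return f"{API_BASE}/{method}?{urlencode({'apikey': api_key, 'siteUrl': site_url})}"

def get_query_stats(api_key: str, site_url: str):
    """Fetch top query stats for a verified site (needs a real key and network access)."""
    with urlopen(build_request_url("GetQueryStats", api_key, site_url)) as resp:
        return json.load(resp)["d"]
```

With a valid key, `get_query_stats("YOUR_KEY", "http://www.example.com")` would return the same keyword data you see in the Search Keywords report, ready to drop into your own dashboards.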

Thursday, December 19, 2013

Matt Cutts, Google’s head of search spam, posted a video a few days back about duplicate content and its repercussions within Google’s search results.

Matt said that somewhere between 25% and 30% of the content on the web is duplicative: over one-quarter of all the pages and content across the internet repeats content found elsewhere.

Matt Cutts says you don’t have to worry about it. Google doesn't treat duplicate content as spam. It is true that Google only wants to show one of those pages in their search results, which may feel like a penalty if your content is not chosen — but it is not.

Google takes all the duplicates and groups them into a cluster. Then Google will show the best of the results in that cluster.

Matt Cutts did say Google does reserve the right to penalize a site that is excessively duplicating content, in a manipulative manner. But overall, duplicate content is normal and not spam.

The head of search spam at Google, Matt Cutts, has confirmed that Google has applied a 15% reduction in the number of rich snippets displayed in the search results.

Matt Cutts announced at PubCon a couple of months ago that this would happen, saying that the ability to have and use rich snippets might be taken away from low-quality sites in the coming months. And this has indeed happened. Matt said this would likely reduce authorship by 15%, showing only the more authoritative authors.

Cyrus Shepard wrote that the MozCast Features tool noticed a drop in authorship, and webmasters have recently been complaining about their authorship markup being dropped.

Friday, September 27, 2013

Google has a new search algorithm, the system it uses to sort through all the information it has when you search and come back with answers. It’s called “Hummingbird,” and below is what we know about it so far.

Hummingbird is the new search algorithm that Google is using, one that Google says should return better results. Hummingbird looks at the PageRank of a page (how important links to that page are deemed to be), page quality, the words used on it, and so on.

Google started using Hummingbird about a month ago, it said, but only announced the change today. Hummingbird is a brand-new engine, though it continues to use some parts of the old one, like Penguin and Panda. Hummingbird should better focus on the meaning behind the words.

Google said that Hummingbird is paying more attention to each word in a query, ensuring that the whole query — the whole sentence or conversation or meaning — is taken into account, rather than particular words. The goal is that pages matching the meaning do better, rather than pages matching just a few words.

But this does not mean that SEO is dead. According to Google, the guidance is still the same: high-quality, unique content. Hummingbird just allows Google to process those signals in new and hopefully better ways.

Monday, September 23, 2013

Google has switched all searches over to encrypted searches using HTTPS. This means no more keyword data will be passed to site owners. It appears that Google has cut off keyword data altogether.

Encrypted Google searches don't pass the keyword data through to websites, thereby eliminating the ability to track users by their keyword searches. The biggest impact for many site owners has been not being able to segment users by keywords within their web analytics software.

I was so expecting that. Google wants to generate more and more money from AdWords, which means they need to make SEO difficult so that companies move to PPC to get results.

But other search engines, like Bing, still send keyword data through.

Friday, September 20, 2013

This article focuses on a competitor's tactic of buying or building low-quality backlinks to your website to diminish its rankings in search engines. This is the most common attack from competitors these days. If you see a sudden decrease in your website's ranking and traffic, there is a possibility that your website is suffering from a competitor's negative link attack. There are many kinds of attack:

- Buying Low-Quality Links to the Homepage vs. a Subpage
- Single Spam Attack vs. Multiple Attacks Over Time
- Attack on New Website vs. Established Website

Taking down a competitor's website isn't the only motive for building low-quality links to it. Occasionally, spammers will build low-quality links to your pages to help boost the equity of links pointing from your site to theirs.

Negative SEO link attacks that aim to diminish the organic search visibility of a website tend to target the homepage, so the low-quality backlinks tend to point there. Conversely, attacks aimed at boosting existing links on a page tend to target deeper pages of a website, often areas where there is user-generated content.

So keep in mind that when you have a new website, you have to pay attention to your link niche. Focus on earning relevant links from high-PageRank, branded websites. Avoid any kind of submission links at the start, and monitor your inbound links at least once a month for the first year of your site's life.

How badly a negative SEO link attack impacts you varies with your unique situation, so quickly check the scope of the hit. Check whether your home page or subpages were impacted. Check traffic in WebTrends, Omniture, and Google Analytics. Check Google Webmaster Tools for unnatural links. Keep a record of what kinds of links were created by your own team. Disavow spam links in Google and Bing. Then submit a reconsideration request in Webmaster Tools.

Thursday, September 19, 2013

Most site owners and businesses know exactly which pages on their site are the most profitable (i.e., the content that drives the most leads and/or revenue). If you run a site and you don't know, then you really should figure it out today.

- Move your top content pages higher up in the site structure/navigation and mega menu so they get crawled more frequently by search engines and seen more often by visitors. This clearly signals to everyone that you place more value on that content.

- Add contextual links in your top content pages to relevant pages so that users can easily move to other relevant pages of your site.

Since a primary goal of most organizations is to grow revenue, they might work to increase the profitability of their top pages with tactics like:

- Hiring a conversion rate optimization specialist.
- Using a fantastic A/B tool like Unbounce to test and iterate.
- Conducting user tests to increase conversions, using an insanely affordable tool.