Dodgy math and a lot of numbers are a bad mix, and paid search provides enough of the latter to create ample opportunity for the former. Like many other forms of online advertising, paid search generates masses of data, easily available through platforms like AdWords, and this often leads some apparent professionals to confuse information with insights.
Recommendations for account expansion based on historic placement or search query reports filtered for cost per acquisition (CPA) or conversion rate (CVR) are not as useful as they may seem. In practice these lists are mostly populated by junk: placements and queries that just happened to get their one conversion a year within the reporting period, on so little traffic that they manage to match the criteria used. This isn’t data, and it certainly isn’t an insight. But it can cost you money.
Math to the Rescue!
Fortunately there are simple, robust tools you can use to deal with this, tools able to find what is likely to work and identify the riskier options. It’s called math. Numeracy and a familiarity with statistics and probability can be useful in marketing. In the scenario outlined above, the biggest issue is determining how closely the data collected reflects reality.
Discovering how frequently a conversion occurred is easy. Strictly speaking, a conversion rate is a measure of what has happened. Just because Keyword F got a conversion rate of 50% from the sample below does not mean it would do so again over 10 or 100 clicks. This holds true for any of the examples below. While a conversion rate can be an indication of future performance, assuming a large enough sample and that nothing changes, as the number of clicks goes down, so does the reliability of this number.
How Wrong are You?
While this data won’t necessarily tell you what to expect for the next 10, 20 or 100 clicks, you can determine the range within which the actual conversion rate is likely to sit. Confidence intervals can be used to determine the range within which the actual success rate is likely to fall 95% (or 50%, or 99%, or whatever) of the time.
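The ranges in the table below appear to follow the standard normal approximation (Wald) interval: for an observed conversion rate $\hat{p}$ over $n$ clicks,

$$\hat{p} \pm z\sqrt{\frac{\hat{p}(1-\hat{p})}{n}}$$

where $z \approx 1.96$ for 95% confidence. For Keyword A this gives $0.10 \pm 1.96\sqrt{0.10 \times 0.90 / 100} \approx 10\% \pm 5.88\%$, matching the 4.12% – 15.88% range shown.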
| Keyword | Clicks | Conversions | CVR | 95% Confidence Interval |
|---------|--------|-------------|-----|-------------------------|
| A | 100 | 10 | 10% | 4.12% – 15.88% |
| B | 80 | 5 | 6.25% | 0.95% – 11.55% |
| C | 60 | 10 | 16.67% | 7.23% – 26.1% |
| D | 40 | 15 | 37.5% | 22.5% – 52.5% |
| E | 20 | 5 | 25% | 6.02% – 43.98% |
| F | 10 | 5 | 50% | 19.01% – 80.99% |
These ranges were calculated using Wolfram Alpha’s confidence interval for binomial tool. Unsurprisingly, the smaller the sample used, the greater the range.
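If you would rather script this than rely on an online calculator, here is a minimal Python sketch of the same normal approximation interval; the function name and the 95% default are my own choices rather than anything from a particular analytics library:

```python
import math

def wald_interval(conversions, clicks, z=1.96):
    """Normal approximation (Wald) confidence interval for a conversion rate.
    z = 1.96 corresponds to 95% confidence."""
    p = conversions / clicks
    margin = z * math.sqrt(p * (1 - p) / clicks)
    # Clamp to [0, 1] since a rate cannot fall outside that range.
    return max(0.0, p - margin), min(1.0, p + margin)

# Keyword F from the table: 5 conversions from 10 clicks.
low, high = wald_interval(5, 10)
print(f"{low:.2%} – {high:.2%}")  # 19.01% – 80.99%
```

Running it against any row of the table above reproduces the ranges, and shows how quickly the interval widens as the number of clicks shrinks.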
Use Numbers Better
There is no excuse for getting this wrong, not with the easy availability of these kinds of tools and the information needed to use them. Probability and statistics are very much relevant to analysing and managing paid search campaigns, in much the same way that economics can be useful too. Be it constructing a formula in a spreadsheet or using an online calculator, looking a little harder at the data you base decisions on isn’t all that hard.
The Google disavow links tool has just been released, and unsurprisingly Google’s version has attracted a lot more attention than Bing’s. This new Google Webmaster Tools (GWT) feature gives those with “unnatural link” warnings and concerns over Penguin-related penalties something to do other than trying to get links removed by disinterested or non-existent site owners or past SEO agencies.
While removing these links from the web is still the preferred solution, Google will typically ignore links submitted via this tool once they have been recrawled. The links are submitted as a list in a text file, and each site can only submit one. These lists can be updated by downloading the original file, altering it and resubmitting it through the same tool. The links submitted will be recrawled before the change takes effect, which can take weeks. Google also advises waiting a while after submitting a disavow request before submitting a reconsideration request.
The Link Builder’s Dilemma
One of the best links you can get for SEO is from a good site that no-one else, especially your competitors, has access to. These kind of links are not common, and usually require some work or cost to acquire. This also means the process isn’t scalable or easily commoditised. As a result most link profiles are full of shared link resources. These kind of sites include forums, directories, aggregators, news and resource sites, blogs that accept guest posts and anything with poorly moderated comments. Unfortunately this is also why Google is attempting to discourage this behaviour through “unnatural link” notices and penalties, and requesting disclosure of purchased and artificial links as a part of this.
In this environment the Prisoner’s Dilemma is a good metaphor for how the behaviour of SEOs can affect themselves and others in the industry. One SEO disclosing links and practices contrary to Google’s Guidelines will affect others in the industry. This provides information to Google for identifying patterns in behaviour and can even result in direct penalties. Reconsideration requests, “unnatural links” notices and the Disavow tool seem simply to encourage this behaviour further by offering a benefit.
Hypothetically, if no-one discloses all the shared resources they are using, there is a chance that Google won’t be able to accurately identify them. Conversely, if enough people provide Google with information on these networks, then most other sites that rely on them are likely to suffer as they are penalised or their links are disregarded. Collectively, then, it is in no-one’s interest to disclose shared link resources that may be violating Google’s guidelines.
The Disavow tool will get used, and for a lot of people it will probably be very valuable, especially after the predations of Google’s Penguin algorithm. Another certainty is that Google will collect a lot of data from this. From outing competitors and annoying link spammers to disclosing huge lists of links in reconsideration requests, collecting data on what other people think is a bad link isn’t new. It is just a little easier.
As their name implies, shared link resources are likely to link to more than one site, providing them with better rankings, nothing at all or a penalty. Sometimes it can be hard to work out which of the three it actually is, which can be an issue when it comes to picking which links to have removed or disavowed.
Winning the Link Builder’s Dilemma
In the Prisoner’s Dilemma both the players are equal. They have the same options, the same resources and the same consequences apply to their collective decisions. This isn’t really the case with the Link Builder’s Dilemma.
Perhaps if the original Prisoner’s Dilemma had one player committing robbery and the other adding a little murder to the outing, it would be a truer reflection. In the case of the Link Builder’s Dilemma, not all the players face the same risks, and some are better able to endure the negative effects of choosing to disclose more than their competitors.
Playing the Game
Disavowing links purportedly provides a benefit. Providing a list of links for Google to disregard should assist reconsideration requests and lift penalties, provided that the list covers the links affecting the site.
As it is hard to tell which links are having a negative effect on a site’s rankings, there is an incentive to over-report and list everything, especially if the site in question has a good range of other, natural links. Brand sites (Site A) are likely to fall into this category: they have the resources to attract links others might not have access to through sponsorship, mainstream media activity and their place in the community. Often these entities have also engaged in other link building activity to remain competitive prior to the latest series of changes.
Other sites that are more reliant on shared link resources are less likely to over-report, as the risk of torching otherwise still valuable links carries greater consequences. These two kinds of sites come into conflict when they share link resources. If one disavows and flags links the other believes are legitimate and necessary, and if Google eventually uses this data to remove any value this link can pass, then the one that didn’t disavow will lose out.
Sites likely to fall into this second group are pure web businesses (Site B) such as aggregators and other online services without the resources to have established a good brand or presence throughout the community: sites that are not likely to be able to generate the same kind of links just by existing as Site A.
Hypothetical and Simplified Link Builder’s Dilemma

| | Site A Over-Disavows | Site A Disavows | Site A Doesn’t Disavow |
|---|---|---|---|
| Site B Over-Disavows | Penalty Lifted, Site A has a link advantage | Penalty Lifted, Site A has a link advantage | Site B Penalty Lifted |
| Site B Disavows | Penalty Lifted, Site A has a link advantage | Penalty Lifted, Some shared link resources remain | Site B Penalty Lifted |
| Site B Doesn’t Disavow | Site A Penalty Lifted | Site A Penalty Lifted | Status Quo |
Realistically, it will always be in the interest of sites like Site A to over-report and disavow any link that could be associated with a penalty. Established sites will usually have enough links that losing some shared resources that are providing good links is acceptable. Sites with a profile like Site B are not likely to be in this position.
To say that the Disavow tool will render all shared link resources ineffective is a massive assumption. It is far more likely that this is just one more source of data for Google to use in building and training their algorithms.
Something like the Disavow tool is certainly needed. Attempting to get thousands of comment spam, footer and splog links taken offline is not practical, and as links have been seen to be effective for so long, there are a lot of them out there.
Like any other change in how Google collects and uses data, there are going to be consequences, and as with any other change, there will be some who benefit, some who lose and some who are unaffected. Though if I were pressed to choose a winner in this environment, it would be those sites that are not reliant on just shared link resources.
How do you keep an eye on your competition in search? There are a lot of tools available, from free options to suites costing thousands. However, it does not stop there: it is not enough to see what has changed, you need to know how to respond.
For any business online, ranking for the terms that drive traffic to products that sell is just the first half of the battle. Ensuring you keep that valuable search traffic is just as important. Competitor analysis is a vital part of any ongoing optimization plan, especially if you rely on traffic from sought after search terms.
Google Analytics handles search traffic from image search the same as any other organic query. The search term will appear under Sources/Search/Organic, and the keywords will be reported the same way as one from the blended search results page. Fortunately there are ways to differentiate and identify different kinds of organic search traffic using Google Analytics, if you do not mind creating new filters and profiles.
Google Analytics provides the tools that make this possible: a new profile for the web property you want to track and a few simple filters. Filters in Google Analytics provide a lot of additional functionality. They can be used to exclude certain audiences from the reports, manage how additional variables are handled and alter how and what information is displayed. Tracking other search types in Google Analytics will require some of the functionality of the last two.
Referrers and Search
Traffic from one site to another will usually pass a referrer. Even in the case of Google’s SSL search, a referrer is still passed, even if it does have important information stripped out of it. This referrer is why it is possible to identify image search traffic.
Traffic from image search will include imgres in the referring URL, at least for the moment. Other strings appear in this position depending on the link that is clicked, including url, search and adclk. There are a few ways to monitor these referring URLs coming from search engines; two of the easiest are the Firefox plugin Live HTTP Headers and setting up a profile in Google Analytics to report on full referring URLs.
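As a rough illustration, here is a short Python sketch that classifies a referrer by that first path segment. The example URL is hypothetical, and the mapping of strings to search types is my reading of the behaviour described above rather than anything documented by Google:

```python
from urllib.parse import urlparse

# Strings observed in Google referring URLs and a guess at what they signify.
SEARCH_TYPES = {
    "imgres": "image search",
    "search": "search",
    "url": "web search redirect",
    "adclk": "ad click",
}

def classify_referrer(referrer):
    """Return the search type suggested by a Google referring URL, if any."""
    path = urlparse(referrer).path.strip("/")
    first_segment = path.split("/")[0] if path else ""
    return SEARCH_TYPES.get(first_segment)

# Hypothetical referrer from an image result:
print(classify_referrer("https://www.google.com/imgres?imgurl=example"))  # image search
```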
Profiles and Filters
It is important to start with a new profile whenever you plan to implement new filters. Leaving one profile untouched gives you at least one baseline set of reports to check the others against, especially if something does not work out as planned.
Once a new profile has been set up, the filters are very easy to implement. Create a new filter and use the following settings:
- Select “Custom Filter/Advanced” as the type
- “Field A -> Extract A” needs to be set as:
- Referral / (imgres|search|url|map)(\?)
- This will look for and match to the strings included in the regular expression. Add other strings as needed. This is where it is useful to keep an eye on full referring URLs.
- “Output To -> Constructor” needs to be set as:
- Custom Field 1 / $A1
- This will select the string in the first set to be passed as a variable to the next filter.
To take the string extracted from the full referring URL and associate it with the search term to which the visit is linked, you will need another filter. This one takes the string from the first filter and adds it to the end of the search term.
- “Field A -> Extract A” needs to be set as:
- Custom Field 1 / (.*)
- This will take the value set as Custom Field 1 in the previous filter and make it available to the Constructor.
- “Field B -> Extract B” needs to be set as:
- Campaign Term / (.*)
- This will take the campaign term (search term) associated with the visit and make it available to the Constructor.
- “Output To -> Constructor” needs to be set as:
- Campaign Term / $B1 ($A1)
- This will take the string from the referring URL, such as imgres, and add it to the end of the Campaign term in brackets.
Once these filters are finished, ensure they are in the correct order; the search terms under the traffic reports should then be displayed with the strings matched by the first filter’s extract.
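To make the mechanics concrete, here is a rough Python sketch of what the two filters do in combination. The filtering itself happens inside Google Analytics, so this is only an illustration, and the sample referrer and search term are hypothetical:

```python
import re

def apply_filters(referrer, campaign_term):
    """Simulate the two filters: pull the search-type string out of the
    referring URL, then append it to the campaign (search) term."""
    # Filter 1: Extract A matches the referral against the regular
    # expression and stores the captured string as Custom Field 1.
    match = re.search(r"(imgres|search|url|map)(\?)", referrer)
    if not match:
        return campaign_term  # no match, so the term passes through unchanged
    custom_field_1 = match.group(1)
    # Filter 2: the Constructor rewrites the term as "$B1 ($A1)".
    return f"{campaign_term} ({custom_field_1})"

# Hypothetical image search visit:
print(apply_filters("https://www.google.com/imgres?imgurl=example", "red shoes"))
# red shoes (imgres)
```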
While the example here was created to tag search terms in Google Analytics with the kind of search that they were, that is not all this can do. Ultimately these filters are just taking a string of characters from a referring URL and adding it to the end of a value that is displayed in Google Analytics.
There is more that can be done with these kinds of tools. Even if all you want to use this for is categorising Google search queries, it is still worth keeping track of the full referring URLs sending traffic to a site, and Analytics makes it easy. All you need is some way to take the referring URL, and then show it as a custom variable, perhaps even as a “User Defined” one.
Most tools have their own limitations, and one of Google Analytics’ blind spots is pages with no traffic at all. Google Analytics tracks visitors, not pages, which, if it is used as the only tool for monitoring site performance, can lead the user to ignore certain parts of the site.
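One way around this blind spot, sketched below in Python, is to diff a list of URLs you know exist (from a sitemap or crawl export) against the pages Google Analytics reports as having received visits. The file names and the one-URL-per-line format are hypothetical:

```python
def load_urls(path):
    """Read a one-URL-per-line text file into a set."""
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

all_pages = load_urls("sitemap_urls.txt")        # every page the site has
visited_pages = load_urls("ga_pages_report.txt") # pages GA saw traffic on

# Pages Google Analytics is blind to: they exist but recorded no visits.
for url in sorted(all_pages - visited_pages):
    print(url)
```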
Managing large sites is challenging. There is a lot to deal with, from the technical quirks of the content management system to managing crawl paths to the content you want to promote without overly enthusiastic use of nofollow/noindex.