My latest guest post, “Using Excel Formulas To Group Your Google Analytics Long Tail Keywords Into Themes”, is live on Search Engine People’s blog.
This post expands further on my last post, “Living with SSL Search” and covers a fairly simple and quick technique for dealing with theming keyword traffic:
Unfortunately natural language processing tools are not as widely available as we would like and frankly word clouds just do not cut it. This leaves us having to find some kind of a workable kludge to make sense of the information Google Analytics provides. In this case, it is going to be Excel’s formulas. A sample spreadsheet is provided at the end of this post.
You can read the full post and grab the example spreadsheet with the formula here.
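The theming idea the post describes can be sketched outside Excel too. Below is a minimal Python sketch of the same approach: assign each keyword to the first theme whose marker word appears in it, much as nested IF/SEARCH formulas would in a spreadsheet. The theme names and marker words here are invented for illustration.

```python
# Hypothetical themes and marker words -- substitute your own.
THEMES = {
    "pricing": ["price", "cost", "cheap"],
    "reviews": ["review", "rating"],
    "how-to": ["how to", "guide"],
}

def theme_for(keyword):
    """Return the first theme whose marker appears in the keyword, else 'other'."""
    kw = keyword.lower()
    for theme, markers in THEMES.items():
        if any(marker in kw for marker in markers):
            return theme
    return "other"

def group_by_theme(rows):
    """Aggregate (keyword, visits) rows into per-theme visit totals."""
    totals = {}
    for keyword, visits in rows:
        theme = theme_for(keyword)
        totals[theme] = totals.get(theme, 0) + visits
    return totals
```

For example, `group_by_theme([("widget price", 10), ("widget review", 5), ("blue widgets", 2)])` groups the first two keywords under “pricing” and “reviews” and leaves the last as “other”.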
(Originally written May 23, 2012. Updated and edited prior to publishing.)
In October 2011 Google announced that all their signed-in users would use SSL search by default. Google’s SSL search was originally rolled out in the USA market and provoked a lively response from the SEO community. SSL search as Google implemented it removed search query information from the referring URL that the analytics package on the destination site would see. Google’s SSL search would not pass information about the user’s search query if the user:
- Searched through a Google domain where SSL search was live,
- Was signed in to a Google account, and
- Clicked on an organic listing (AdWords keyword referrals are still passed).
However, SSL search would pass keyword referral information if:
- The search was through a Google domain where SSL search was not live,
- The user was not logged in, or
- The user clicked on an AdWords ad.
Whether the destination site was using SSL had no effect on this. If the user went from SSL search via an AdWords ad to an unencrypted site, the keyword data was passed, without any concern for privacy. If the user clicked on an organic link to an encrypted site, which would keep their referrer information protected from third parties, the data was not passed. How Google’s SSL search was implemented prompted a lot of discussion around the Internet, some of which I linked to from a previous blog post, which covers this topic in more detail.
SSL Search for Google.com.au in Australia
On the 6th of March 2012, Google announced that they were going to be “Bringing more secure search around the globe“. Shortly afterwards, some users noticed that their accounts were using SSL search by default on Google.com.au. It was not long before it was possible to confirm this, and it seems that from now on, Australian online marketers will have to account for (not provided) when analysing their organic keyword traffic from Google.
How to Find the Missing Keyword Data?
The short answer is that you can’t. The data is gone, and the best outcome would simply be a model. Google has indicated that Google Webmaster Tools (GWT) provides keyword data unaffected by SSL search. For now, though, the search query data within GWT is far from perfect, being incomplete and potentially unclear for most sites.
One technique for dealing with this is to assume that (not provided) keywords share the same composition as known keyword traffic. Assigning (not provided) traffic to groups based on the distribution of known traffic across mutually exclusive keyword themes is certainly one of the more straightforward approaches.
There are a number of issues that will limit its effectiveness. Its accuracy will diminish as the volume of traffic being modelled shrinks. Another issue is that those using SSL search are a different group from those who are not. While the actual difference between someone who is signed in to Google and someone who isn’t might not be clear, it would be negligent not to acknowledge the issues this may create. Finally, the smaller the traffic volume for a specific keyword theme, the less reliable it is for modelling anything.
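The distribution model described above is just proportional allocation, and can be sketched in a few lines. This is an illustrative sketch only; the theme names and visit counts are invented.

```python
def allocate_not_provided(known_by_theme, not_provided_visits):
    """Distribute (not provided) visits across themes in proportion
    to the known keyword traffic for each theme."""
    total_known = sum(known_by_theme.values())
    if total_known == 0:
        # No known traffic to model from -- nothing can be allocated.
        return {theme: 0.0 for theme in known_by_theme}
    return {
        theme: not_provided_visits * visits / total_known
        for theme, visits in known_by_theme.items()
    }
```

For example, with 600/300/100 known visits across three themes, 500 (not provided) visits would be modelled as 300/150/50. As noted above, the smaller a theme’s known traffic, the less you should trust its share of the model.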
Treating Not Provided as a Keyword
Avinash Kaushik also proposed an alternative approach on his own blog soon after SSL search went live in the USA. He put forward the idea that it is possible to use behavioural metrics to identify the provenance of the unknown search traffic. As usual, his post is worth reading and fairly in-depth. However, it does not offer much if you wish to conduct any kind of detailed analysis of organic traffic.
Avinash focused on treating (not provided) as a single homogeneous group, one that potentially matches the behavioural profile of an existing, more easily identifiable group. This approach simplifies reporting and in some cases can at least provide some actionable information. However, it does not resolve the main issue SSL search creates: tracking the effectiveness of targeted optimisation activity.
Filling in the Keyword Gaps
Assuming (not provided) represents a homogeneous group across an entire site does not provide anything meaningful either. It is better to assess (not provided) traffic at page level, based on known traffic and the keyword themes the content ranks for, and to monitor on-page content updates, changes in backlink profiles and the associated changes in visibility and traffic.
Unfortunately SSL search has made it harder to see small, interesting trends emerge in how people find your site and what they do once they get there. There won’t be enough known data in most cases to create a model, and most of the time, it will just disappear into the noise that is (not provided) and your own known keyword traffic.
Last month, Google had an important High Court case ruled in their favour. In short, this ruling means that Google is no longer responsible for, or obliged to prevent, advertisers directly targeting branded keywords, bringing the Australian market in line with other English-speaking markets such as the UK and USA.
This case was brought by the ACCC and accused Google of “misleading or deceptive conduct” by allowing advertisers to appear for other brands’ names and to use these in their ad copy. The court ruled that Google was an intermediary and as such:
…Google search engine is only a means of communication between advertisers and consumers.
This effectively means that any responsibility regarding misleading or deceptive conduct rests with the advertisers. Some commentators have suggested that advertisers are only likely to run into issues should they use their competitors’ brands in their own ad copy, not merely because an ad appears for a search on these kinds of keywords.
The Expected Policy Change
Unsurprisingly, this ruling is now reflected in Google’s policy towards brand terms. In an update from this week affecting Australia, New Zealand, China, Hong Kong, Macau, Taiwan, South Korea, and Brazil:
Google will no longer prevent advertisers from selecting a third party’s trademark as a keyword in ads targeting these regions.
In their policy update, Google stated that for these regions they will only respond to trademark complaints regarding advertisers using brands in their ad copy. However, Google was careful to state clearly that they are not in a position to make recommendations regarding the use of trademarks, as you would expect.
Picking the Change in the Market
While other markets such as the USA and UK have had this policy for a while now, it is yet to come into effect for many other countries, including Australia. As a result we cannot be certain how this will change the way the market behaves around brand terms, at least not without looking to the USA and the UK as examples of what to expect, or waiting for the change to take effect in Australia from the 23rd of April.
However, it is not hard to predict that this change will lead to some advertising activity across competitors’ brand traffic, and depending on how a campaign is implemented, it could in some cases negatively affect organic click-through rates. Even though the brand itself should be more efficient than its competition, the cost of traffic will probably rise too, and with it the account’s cost per acquisition.
Due to their position, there is a strong incentive for the brand to ensure they are covering searches on their name with paid as well as organic listings. This policy change will probably also allow paid search advertisers to employ ambush-marketing-style tactics on a brand’s own navigational search traffic in interesting ways.
No More Google Brand Police
The decision from last month and this week’s update to their policy means Google is no longer responsible for enforcing rules regarding the use of trademarks in paid search or for how their advertisers use these tools. This ruling appears to let them pass this responsibility on to their advertisers, which is, when you consider how other media platforms work, where it really should lie. It is just fortunate for Google that this should also lead to increased competition and added incentive to advertise for a certain class of keywords.
Google has begun to roll out Enhanced Campaigns, and the reactions have been mixed. At least until the middle of this year, older campaigns, now called Legacy Campaigns, will remain as they are, though any new campaigns will be Enhanced by default. These changes will add a number of interesting new features to AdWords while simultaneously creating a whole new set of what could politely be called ‘challenges’.
Enhanced Campaigns are described as a step towards helping advertisers deal with changing user behaviour in a multi-device world. It is too late to be ahead of the trends and become mobile ready. The shift in user behaviour towards portable touch and hybrid devices has already happened, and if you have not adapted already, you are just trying to catch up.

AdWords’ Enhanced Campaigns give advertisers more tools for advertising to touch devices like tablets and mobiles. Enhanced Campaigns will target all types of devices by default, sharing one budget and one bid level, and will also provide a number of bid-level modifiers. An Enhanced Campaign’s ad groups will also share advertising copy and ad extensions across devices, with the option of marking some to be used only with mobile. One more minor detail: Enhanced Campaigns do not differentiate between desktop and tablet computers.
There are Features too
With Legacy Campaigns, if you wanted to target mobile and desktop users separately you would need to set up two separate campaigns using different device targeting options. With Enhanced Campaigns you no longer have this option; each campaign targets all devices by default. Enhanced Campaigns do have the ability to change some settings for mobile. You can mark copy and extensions as preferred for mobile and you can apply a bid modifier to mobile phone traffic. Unfortunately no such options are available for tablets.
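The mobile bid adjustment works as a simple multiplier on the shared base bid, which the following sketch illustrates. The figures are examples, and the -100% to +300% range reflects the adjustment limits Google has documented; everything else in an Enhanced Campaign inherits the base bid unchanged.

```python
def effective_bid(base_bid, mobile_modifier_pct, device):
    """Return the bid used in the auction for a given device.

    Mobile traffic gets base_bid * (1 + modifier), where the
    modifier can run from -100% (opt out) to +300%. Desktop,
    laptop and tablet all share the unmodified base bid.
    """
    if device == "mobile":
        return round(base_bid * (1 + mobile_modifier_pct / 100.0), 2)
    return base_bid
```

So with a $2.00 base bid and a -40% mobile adjustment, mobile clicks are bid at $1.20 while tablet and desktop both stay at $2.00 -- there is no equivalent lever to separate the latter two.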
Enhanced Campaigns also come with a number of other features, such as support for a few new conversion types, like calls generated by clicks from mobile-optimised websites using new conversion code snippets. Enhanced Campaigns will also give the advertiser greater control over bids by location for different devices. Advertisers now also have the ability to set up extensions at ad group level, and get better tracking of activity on sitelinks.
But Google Thinks They are the Same
The most significant change that Google has introduced with Enhanced Campaigns treats desktop, laptops and tablets as the same thing for targeting purposes. These devices will now always share the same bid, the same ad copy and all other targeting settings in search. Display advertising is the exception to this:
For “Display Network only” campaigns, you can still target the device model or operating system to restrict where your ads are shown. Use the “Advanced mobile and tablet options,” which is the same as in legacy campaigns.
In my experience, tablet traffic is less likely to convert than visitors using a more traditional computer, and it seems to cost less. It isn’t just me, with a number of news sites reporting the same (Google may pull in $5 billion in ad revenue from tablets this year). Given the different cost and effectiveness, it makes sense to adjust the bidding strategy between the two kinds of devices. Enhanced Campaigns more or less make this impossible. Obviously this is going to force many advertisers to reassess their bidding strategies.
The Not Quite Generalised Second Price Auction
AdWords is effectively a Generalised Second Price auction with a twist: the winner of a position on the page is determined by ad rank, not by money alone. The utility each advertiser would assign to each position on the Search Engine Result Page (SERP) is highly variable. Even ignoring cross-vertical competition, not every participant in the auction will assign the traffic the same value.
This difference in perceived value is going to be influenced by a number of factors, including their ability to convert the traffic, the apparent value of each conversion action to the business and the perceived value in maintaining a set level of reach and frequency. If the difference between advertisers is high enough, at least for some groups of keywords, it is likely that the bidding is not a Nash equilibrium, as the participant for whom the traffic has a higher utility can bid at the break-even point of their competitor with no negative consequences.
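The auction mechanics above can be made concrete with a toy sketch: positions go to the highest ad rank (here simplified to bid times a quality factor), and each winner pays only just enough to hold their position rather than their full bid. This is a simplified illustration, not AdWords’ actual pricing, which involves more inputs; the bidder names, bids and quality figures are invented.

```python
def gsp_auction(bidders):
    """bidders: list of (name, bid, quality) tuples.
    Returns (name, cpc) per slot, ordered by ad rank.

    Ad rank = bid * quality. Each winner pays the minimum bid
    that would still beat the ad rank of the bidder below them,
    plus one cent -- the 'second price' twist."""
    ranked = sorted(bidders, key=lambda b: b[1] * b[2], reverse=True)
    results = []
    for i, (name, bid, quality) in enumerate(ranked):
        if i + 1 < len(ranked):
            next_rank = ranked[i + 1][1] * ranked[i + 1][2]
            cpc = round(next_rank / quality + 0.01, 2)
        else:
            cpc = 0.01  # placeholder reserve price for the last slot
        results.append((name, cpc))
    return results
```

Note how bidder A in `gsp_auction([("A", 4.0, 0.5), ("B", 1.0, 1.0), ("C", 3.0, 0.2)])` wins the top slot but pays $2.01, well under its $4.00 bid: exactly why a higher-utility participant can sit at a rival’s break-even point without paying it.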
AdWords’ Legacy Campaigns made it easy for an advertiser to control which auctions they participated in, even if some other optimisation tools were removed in the past. The differences in behaviour between devices, as well as the impact the destination site’s own user interface (UI) has on different users, can have a significant effect on the value of a click to an advertiser.
The difference in the value of traffic from one device to another influences bidding strategy through the targeting options available on platforms like AdWords or Facebook. Enhanced Campaigns will have an impact on this by changing the segmentation of the audience, and more specifically by bundling two dissimilar products into one with a different value from what each product had before.
More Competition is Good…
In my own experience, tablet traffic has been consistently cheaper than desktop traffic in AdWords, and has performed worse. Even at the lower price, the value of this traffic has made it illogical for some advertisers to bid as high for tablet traffic as they would for other devices. For some advertisers it would even be rational not to bid on tablet traffic at all.
With Enhanced Campaigns, they do not have the choice. Advertisers can no longer adjust for differences in performance between mobile, tablet and desktop in their bidding strategy, except for the adjustment tools available for mobile. Enhanced Campaigns target all devices by default and the performance of this traffic will change to reflect this mix.
Generally speaking, it is likely that this change will increase competition for tablet and mobile traffic, which should also increase the cost of that traffic for all advertisers. Assuming that grouping all devices together negatively affects the performance of AdWords traffic for all advertisers, and that they behave rationally, one scenario is that the blended cost per click (CPC) will come down relative to what desktop traffic cost prior to this change, while almost certainly sitting higher than what tablet or mobile traffic generated on their own.
This is assuming that all advertisers experience the same issues in getting value from tablet and mobile traffic. In reality this is unlikely to be the case, and advertisers able to get a return from tablet and mobile traffic as good as they get from desktop will be able to capitalise on this change. In this scenario, with a large enough share of spend, this group is likely to drive up the cost of traffic across both tablet and desktop. However, reality is rarely neat, and in AdWords you only need to pay what the participant you beat was willing to pay; if the return you can get from the same traffic is much higher than theirs, then you are certainly not paying up to your break-even point.
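The blended-CPC reasoning above is just a traffic-weighted average, which a short sketch makes explicit. All of the click and cost figures are invented for illustration.

```python
def blended_cpc(segments):
    """segments: list of (clicks, cpc) per device segment.
    Returns the traffic-weighted average CPC once the segments
    share a single bid and are reported together."""
    total_clicks = sum(clicks for clicks, _ in segments)
    total_cost = sum(clicks * cpc for clicks, cpc in segments)
    return round(total_cost / total_clicks, 2)
```

For instance, 800 desktop clicks at $1.00 blended with 200 tablet clicks at $0.70 gives an observed CPC of $0.94: below the old desktop price but well above what tablet traffic cost on its own, matching the scenario sketched above.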
Markets Following Markets
While I am sure this, as Google would put it, is for the user, there are a number of other likely outcomes for the search giant from this change. Tablets and smartphones are replacing PCs, both in terms of traffic online and in units sold. Some research, including Google’s own Understanding Tablet Use: A Multi-Method Exploration paper, indicates that this growth in tablet web traffic is due to many people switching devices:
By the end of 2013 tablets will account for 20 percent of Google’s paid search ad clicks in the U.S., up from 6 percent in January 2012, according to research by Marin Software, a San Francisco-based company that helps large advertisers manage their online advertising.
Given the relatively low cost of tablet traffic to date and the scale and pace of this change in user behaviour, Google needs to ensure that they can make as much money from tablet users as they do from those on PCs. Assuming nothing changes except the switch to portable touch, with prices for search ads up to 17% cheaper than on desktop and advertisers unable to convert users as easily, Google is likely to lose revenue.
While current trends seem to indicate that the cost of tablet traffic is closing the gap on desktop, Enhanced Campaigns will ensure that Google is able to use all advertising inventory effectively and profitably, both for themselves and their clients who are ready for tablet and mobile users.