How can you identify trustworthy themes and plugins? There are a few telltale signs. Here are some ways to identify trustworthy themes and plugins so you can rest assured the code is of good quality and, more importantly, safe.
The WordPress plugin directory contains a rich and abundant collection of free themes and plugins. The great thing about the directory is that it is maintained and policed by a great team of contributors, including people like Ipstenu, who contributes directly to WordPress core.
This maintenance is invaluable, as the team is very quick to act on reports of untrustworthy content. Abusive themes and plugins are promptly removed, and the majority are reviewed before the first iteration is published.
Additionally, any registered user can review a theme or plugin and rate it out of 5. Reviews are a great indication of how the plugin performs on people’s sites. Reviews may also be useful to double check on specific features, or possible conflicts with other themes and/or plugins running on your site.
Read the reviews rated 1/5. Reviewers generally give one star when a plugin is genuinely low quality or doesn't work, but sometimes a user leaves a low rating because the plugin doesn't work for them specifically, not realizing that a conflict elsewhere may be the cause. In addition, WordPress moderators do check reviews and look over their substance.
Each theme and plugin hosted at the directory has its own support area. Take a quick look inside to see if there are a lot of issues and, if so, how much they could affect your own installation.
Also look at the proportion of threads labeled [resolved], as this shows the author's own activity within the area and is evidence that support is offered and potential bug fixes are being seen to.
Generally, any plugin that hasn't been updated in more than two years is a plugin you should avoid. This is mainly because WordPress core code has evolved a lot over the past two years, bringing with it new functions and processes that developers need to adopt to ensure compatibility with current versions. Two years ago today the latest version of WordPress was 3.1.3, as opposed to today's 3.5.1 (with 3.6 soon to be released).
This rule doesn't have to apply to free themes in the WordPress directory that have already earned the trust described above; I am referring to themes available on the web in general. Don't trust those themes. The code could contain anything, and could harm your site in terms of both performance and security.
Essentially, it would be very easy for anyone to develop a simple WordPress theme and code it in such a way that they could have your whole installation (including posts, pages, usernames and general login credentials) conveniently emailed to them – or someone could run a simple script on the site that sends details about every visitor to your site.
Another thing to avoid is trying to find a free version of a premium theme or plugin. It’s easy enough to search for “[theme name] WordPress theme download free” and find what you want, but that zip file could have been forked by anyone. I have tested this once by purchasing a premium theme and then downloading one on the web offered for free. The difference between the two were slight, but enough to raise concerns.
The main culprit behind hidden abusive scripts can be found by searching all of your theme or plugin files for "base64_". These functions have obviously honest uses, but within WordPress themes and plugins they are widely employed for dishonest means: a developer can insert encoded scripts that you can't easily find.
For example, let's say someone wants to include a malicious script in your theme. Rather than pasting it in as readable code, the dodgy line they can produce is:
```php
$str = base64_decode('PHNjcmlwdCB0eXBlPSJqYXZhc2NyaXB0IiBocmVmPSJodHRwOi8vZG9kZ3kuY29tL3Nj
```
Hidden well, right? To ensure you have none of these scripts running (I’m yet to see a trustworthy theme or plugin that uses it for “normal” purposes), remove all instances from all your plugin/theme files. If you aren’t confident in removing this, then at least contact the plugin/theme developer and ask how and why it’s being used.
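Searching every file by hand is tedious, so a short script can do the walk for you. Here is a minimal sketch; the wp-content path, the demo fixture file, and the scanned extensions are all assumptions for illustration, so point it at your own install:

```python
# Sketch: scan theme/plugin files for suspicious "base64_" calls.
# The wp-content path, extensions, and demo fixture are assumptions.
import os

def find_base64_calls(root):
    """Return (path, line_number, line) tuples for lines containing 'base64_'."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if not name.endswith((".php", ".js")):
                continue
            path = os.path.join(dirpath, name)
            with open(path, errors="ignore") as fh:
                for num, line in enumerate(fh, start=1):
                    if "base64_" in line:
                        hits.append((path, num, line.strip()))
    return hits

# Demo fixture so the sketch is runnable on its own:
os.makedirs("wp-content/themes/demo-theme", exist_ok=True)
with open("wp-content/themes/demo-theme/functions.php", "w") as fh:
    fh.write('<?php eval(base64_decode("aGVsbG8="));\n')

for path, num, line in find_base64_calls("wp-content"):
    print(f"{path}:{num}: {line}")
```

Any hit it reports still needs a human look: the point is to surface every occurrence, not to judge intent.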
N.B. This function is not limited to WordPress. On Tuesday we removed it from a Magento installation; the script was a blatant hack left over from an old installation we recently took control of.
A few months ago I purchased a WordPress theme from a well-known premium theme directory. The theme itself didn’t work very well as soon as we started to make the simplest of edits.
Upon further inspection I found that the simplest requirements set out by WordPress were being ignored. The developer had numerous themes, so I purchased another just to test, and it turned out the same rookie mistakes appeared in both.
WordPress created these requirements so that minimal conflicts and errors would occur, yet non-compliance is all too common. The shame in my example is that the author publicizes himself as a WordPress theme developer and capitalizes (very well) on it, yet ignores the core of WordPress theme development best practices.
The WordPress-recommended way to load a theme's styles and scripts looks like this:

```php
wp_register_style( 'random-css-style', get_template_directory_uri() . '/css/random-style.css' );
wp_enqueue_style( 'random-css-style' );
wp_enqueue_script( 'functions-script', get_template_directory_uri() . '/js/functions.js' );
```

If a theme skips or botches this pattern, I immediately ask questions. The functions wp_enqueue_style() and wp_enqueue_script() are easy to implement and are basic, required functions within the core code. If they aren't implemented correctly, I automatically doubt the rest of the code.
This was the case with my example above, and both themes were refunded… however, that isn't the solution I wanted. It wasted my time, and the theme directory's time, to refund me and report back to the author. The author had his own support site, but it was never maintained and eventually shut down completely – not something you want when you've paid for the theme!
As you can see, there are a few telltale signs of poorly developed themes and plugins. Hopefully now you’ll be able to keep yourself safe from any suspect themes and plugins in the future.
Have you had a bad experience with a plugin or theme (free or paid)? Please let us know in the comments below.
Each morning, millions of people log into their AdWords and Bing Ads accounts to see how their performance was on the previous day. This often includes reviewing which keywords drove the most traffic, how that new ad text you created is performing, and how you’re pacing against your client’s goals.
Each SEM manager most likely has their own approach and set of reports they run. Unfortunately, to truly understand how your account is performing, it can take some time. Thankfully, Google has released a new feature: the top movers report.
Per Google, this report will “show you which campaigns and ad groups have experienced the largest changes in clicks and cost, and highlight changes you made which might have contributed to those moves.”
The top movers report will be found under the Dimensions tab. You’ll need to select two consecutive time periods of equal length.
Once the date has been selected you will be shown the campaigns and ad groups that experienced the largest change during that time period. You can compare periods of 7, 14, or 28 days, or look at reports generated in the last 90 days, according to Google.
Once the report is created you will be presented with a roll up of information such as Top Increases, Top Decreases, Total Change and more. These data points will be available for both cost and clicks.
When you drill down further into the top movers data, Google will actually provide a “possible cause” of why the change occurred. Per Google’s AdWords Blog you could see “bids were increased” or “new keywords were added”.
You will also be provided with additional information, such as the exact change that occurred (for example, an increase of 100 clicks) and the percentage of change, as well as impressions, CTR, Avg. Pos. and Avg. CPC.
At first glance this new report seems quite helpful. SEM managers will be able to log into their accounts, run one single report, and know which campaigns and ad groups need to be focused on. You'll easily be able to see whether the changes you made to your account have had a positive or negative effect on its performance.
This report can also be super helpful to those accounts which have multiple SEM managers touching them. In most agencies you’ll see a lower level employee making the day to day adjustments on the campaign and the more senior manager overseeing the optimization efforts and client relationship. With this report, the senior team member can easily see what has caused the fluctuation in account performance.
When creating reports, access to insights and changes in performance will be easier than ever. Simply select the date range and any major change that caused fluctuation in performance is within reach.
If you took the time to set up a very granular account structure, this report will save you quite a bit of time. If you happen to have larger campaigns and ad groups without a clear structure strategy behind them, you'll still be spending your morning sifting through keyword and ad text level data to determine what really occurred.
The next most obvious and glaring need for improvement is the inclusion of conversion data. For most SEM managers, spend levels and total clicks aren’t KPIs. Having data points such as cost per conversion, conversion rate, and even total conversions would be extremely informative and actionable.
In Google’s blog post there is no mention of device level data. It would be extremely beneficial to understand if the change in performance was related to desktop and laptops or mobile devices.
You'll also notice that the top movers are selected by total change (number of clicks) rather than percentage of change. If you happen to have ad groups that drive significantly more traffic than others, expect to see them in this report quite often.
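The difference between ranking by total change and ranking by percentage change is easy to see with made-up numbers. This sketch (all figures hypothetical) ranks two ad groups both ways:

```python
# Hypothetical click counts for two consecutive, equal-length periods.
ad_groups = {
    "big-brand-terms": {"before": 10000, "after": 10500},  # +500 clicks, +5%
    "niche-terms":     {"before": 100,   "after": 180},    # +80 clicks, +80%
}

def movers(data):
    """Compute (name, absolute change, percentage change) for each ad group."""
    rows = []
    for name, d in data.items():
        change = d["after"] - d["before"]
        pct = 100.0 * change / d["before"]
        rows.append((name, change, pct))
    return rows

# Ranking by total change surfaces the high-traffic ad group...
by_total = max(movers(ad_groups), key=lambda r: abs(r[1]))
# ...while ranking by percentage would surface the small but volatile one.
by_pct = max(movers(ad_groups), key=lambda r: abs(r[2]))
print("Top mover by total change:", by_total[0])
print("Top mover by percentage:  ", by_pct[0])
```

A report ranked by absolute change will keep showing the big ad group's +500 clicks, even though the small ad group moved far more dramatically in relative terms.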
By involving their audience and making them feel like their input is listened to, a brand can build advocates and perhaps come up with ideas it wouldn't have had on its own.
However, it doesn’t always go right.
Unsuspecting marketers who blindly attempt to take advantage of the crowds may find themselves causing more issues than benefits for their brands.
Earlier this year Durex decided to offer a new service where their condoms would be sent directly to couples in need in cities across the world, either on the web or using an app. Their marketing team decided that the best way to kick this campaign off was to ask their users to pick the first city this would launch in.
Unfortunately for them, they didn't think ahead and left it open for any city to be submitted. Thanks to the wit of the crowds, Kuala Lumpur ended in second place behind the winner: the predominantly conservative and Muslim, though amusingly named, city of Batman, Turkey.
Durex closed the campaign down without offering the service anywhere, let alone in Batman.
In 1990 David Bowie decided to do a greatest hits tour “Sound+Vision”. In order to decide what songs to play he decided to let the crowds decide.
As it was in those prehistoric, pre-Internet days, the votes were collected via telephone. Music magazine NME heard about it and asked their readers to vote for one specific Bowie song – “The Laughing Gnome”. The vote was scrapped with that song in the lead.
(The author would like to apologize for getting that stuck in your head).
Last August Mountain Dew was ready to launch an Apple version and decided to get the crowd to share their wisdom by asking them to name this new variant. As you’d expect, the crowd decided to show their wit instead, by submitting and voting up names that wouldn’t be allowed anywhere near a bottle of soda.
The vote was quickly cancelled, and the soda was imaginatively named… “Apple Mountain Dew“.
Early last year, over on Twitter, McDonalds decided to use the hashtag #McDStories to get customers to share their stories of McDonalds. Apparently they thought that nothing could go wrong…
Last year English soccer player Jonny Howson transferred from his hometown team, Leeds United, to Norwich City. The Leeds United fans weren’t thrilled that he was transferred, given that he was the team captain, so when Norwich announced that they were taking questions for Jonny on Twitter…
…the Leeds fans took over the chat, completely.
The first thing you should do is set boundaries. If Durex had picked 20 cities around the world for their condom delivery service, and asked people to vote on those 20 cities, then this would have worked for them. They’d have been able to control the situation by preventing the crowd adding in cities that they’d not be able to deliver to.
Similarly David Bowie’s people should have restricted the options to a subset of his work, perhaps all of his songs that charted except for any gnome related ones.
If you truly want the crowd to be able to submit whatever they want, then you first need T&Cs in place informing users that you have the right to take down anything derogatory, defamatory, or just plain dumb. Then you need to actually enforce them.
Monitoring the suggestions and deleting inappropriate ones would probably have helped Mountain Dew with their issue. If the voting was taking place on their site, and they had the development resources, then implementing a filter of some kind, to not show any suggestions with “Hitler” or other words that had no place in the crowdsourcing, would also have been a good idea.
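Such a filter can be as simple as a blocklist check before a suggestion is ever displayed. Here's a minimal sketch; the blocked words and suggestions are placeholders, and a real campaign would pair a longer curated list with human review:

```python
# Minimal moderation sketch: hold back any suggestion containing a
# blocked word. The blocklist and suggestions are placeholders only.
BLOCKLIST = {"hitler", "diabeetus"}

def is_displayable(suggestion):
    """Return True if no word in the suggestion is on the blocklist."""
    words = suggestion.lower().split()
    return not any(word.strip(".,!?") in BLOCKLIST for word in words)

suggestions = ["Apple Blast", "Hitler Dew", "Dew Delight"]
approved = [s for s in suggestions if is_displayable(s)]
print(approved)  # ['Apple Blast', 'Dew Delight']
```

A word-level blocklist is crude (it won't catch misspellings or creative evasions), which is why it complements, rather than replaces, human monitoring.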
There are lots of people that have a beef with McDonalds, from PETA to anti-obesity campaigners. While inside McDonalds corporate HQ the concept of getting customers to share their stories may have resulted in high 5′s all around, stepping back and thinking about prevailing sentiment around the company may have given them pause for thought.
While Norwich City wanted to engage their fans (and they have over 3x the number of followers than the Leeds United official twitter account), they should have realized that there was a prevailing sense of anger amongst the Leeds fans about the Jonny Howson transfer, and should perhaps have either waited for that to die down, had the Q&A sessions somewhere where they’d have more control, or just simply used another player.
If, despite your best planning, crowdsourcing heads off in a bad direction, and you have no way to steer it back in the right direction, then pulling the plug is the best way to go. Just take whatever you’ve learned from the experience and make sure to apply it the next time you try to do something similar.
Surprisingly, it has taken over 10 years for the FTC to update these guidelines, despite the vast changes that have occurred with search engines, their products, and how advertising is integrated.
Yesterday the FTC released updated guidelines for the search engine industry on how it distinguishes between advertising and natural search results. In particular, the FTC believes that in recent years it has become harder for consumers to recognize what is paid and what is not. Its primary objective is to ensure that advertising isn't misleading consumers into believing ads are natural search results.
The letters note that in recent years, paid search results have become less distinguishable as advertising, and the FTC is urging the search industry to make sure the distinction is clear.
One of their concerns is that paid ads and ad blocks aren't clearly shown to be paid advertisements: the labels have significantly smaller text, or are more hidden in the top right corner of an ad block as opposed to the left corner. They believe consumers may not notice the reference to paid advertising when it sits over on the right side.
We have observed that search engines have reduced the font size of some text labels to identify top ads and other advertising and often locate these labels in the top right-hand corner of the shaded area or “ad block,” as is the case with top ads. Consumers may not as readily notice the labels when placed in the top right-hand corner, especially when the labels are presented in small print and relate to more than one result. Web research suggests that web pages are normally viewed from left-center to right, with substantially less focus paid to the right-hand side.
The use of shading ad blocks has often been used to differentiate between paid and unpaid listings. The most common example of this are the Google AdWords listings in the Google search results. However, the FTC has noticed that the background shading used by many search engines tends to be less visible when viewed on mobile devices and certain computer monitors.
We have observed that, increasingly, search engines have introduced background shading that is significantly less visible or “luminous” and that consumers may not be able to detect on many computer monitors or mobile devices. Reliance on this method to distinguish advertising results requires that search engines select hues of sufficient luminosity to account for varying monitor types, technology settings, and lighting conditions.
Instead, the FTC suggests that search engines use prominent shading with a clear outline, a prominent border, or both.
Accordingly, we recommend that in distinguishing any top ads or other advertising results integrated into the natural search results, search engines should use: (1) more prominent shading that has a clear outline; (2) a prominent border that distinctly sets off advertising from the natural search results; or (3) both prominent shading and a border.
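One rough way to sanity-check the "sufficient luminosity" point is to compute the contrast ratio between an ad block's background shade and the white page around it, using the standard WCAG relative-luminance formula. The hex colors below are illustrative stand-ins, not any search engine's actual palette:

```python
# Compare an ad block's background shade against white using the
# WCAG relative-luminance and contrast-ratio formulas.
# The hex colors are illustrative stand-ins only.

def relative_luminance(hex_color):
    """WCAG relative luminance of an sRGB color given as '#rrggbb'."""
    rgb = [int(hex_color.lstrip("#")[i:i + 2], 16) / 255.0 for i in (0, 2, 4)]
    linear = [c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
              for c in rgb]
    return 0.2126 * linear[0] + 0.7152 * linear[1] + 0.0722 * linear[2]

def contrast_ratio(c1, c2):
    """WCAG contrast ratio between two colors (1.0 = identical, 21.0 = max)."""
    l1, l2 = sorted((relative_luminance(c1), relative_luminance(c2)),
                    reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# A very pale shade is nearly indistinguishable from a white page...
print(contrast_ratio("#fffdf5", "#ffffff"))
# ...while a deeper shade stands a better chance on a washed-out monitor.
print(contrast_ratio("#f5e9c8", "#ffffff"))
```

The closer a ratio is to 1.0, the more likely the shading disappears entirely on bright or low-quality displays, which is exactly the failure mode the FTC describes.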
The FTC also brings up social listings, such as the ones you see in Facebook's Graph Search, or Google+-influenced personalized results where some listings are displayed more prominently because of someone in the user's personal network. The FTC's example is a restaurant recommended because the user's contacts have enjoyed it: if there is any paid compensation behind such a listing appearing in social results, it must be clearly noted.
With the popularity of voice-activated search, such as Siri, the FTC also advises that if search results are delivered verbally, any paid advertisement must come with an audio disclosure stating that it is paid advertising, one the user can easily hear and recognize as such.
The FTC knows that search engines will continue to evolve with different technologies for providing users with search results, and it wants to ensure that any new ways of receiving or displaying results in the future also clearly differentiate between paid and unpaid. Since the original letter in 2002, mobile and tablet use has increased significantly, as have the platforms search engines use to display and promote information, such as mobile apps, social media, and their own narrowed segments of results, like news or blog search. With this in mind, the FTC wants search engines to follow these guidelines from the outset as they introduce new ways of displaying their results.
Online search is far from static, and continues to evolve. Indeed, in the past few years, the growth of social media and mobile apps, and the introduction of voice assistants on mobile devices, have offered consumers new ways of getting information. Regardless of the precise form search may take in the future, the long-standing principle of making advertising distinguishable from natural results will remain applicable.
The FTC sent out letters to what they considered the primary general-purpose search engines, which includes AOL, Ask.com, Bing, Blekko, DuckDuckGo, Google, and Yahoo. They also sent out letters to 17 specialized search engines, ones that specialize in specific market areas of shopping, travel, and local businesses that also display advertising.
A new Easter egg turns Google Images into a playable classic arcade game with a Google twist. Head to Google Images and type “Atari breakout” to fire up “Image Breakout” and start destroying rows of images by bouncing a ball into them.
Start the game by going to Google Images and typing in “atari breakout”. The regular image results will load briefly, before your screen transforms to “Image Breakout”.
You’ll be given five balls at the beginning. You’ll lose a ball every time one touches the bottom of the screen.
The rules are pretty simple: there are five rows (sometimes only four) to destroy (blue, green, yellow, orange, and red) at the top of the screen. Use your mouse or the left and right arrow keys to control the blue bar at the bottom of the screen to angle your shots at the images you want to smash with your gray ball.
Hint: If you can make a hole and shoot your ball into that opening, your ball will bounce around and destroy several images at a time.
The “levels”, which are just different Google Images searches, seem to be pretty much the same difficulty level. After completing the first screen, you’ll be taken to random image searches – for example, after destroying the Atari images, up next for me were pictures of ginseng, mayonnaise, Tahiti, English Cocker Spaniel, and Macau.
After your game is over, Google gives you the option to share your score on Google+ (with the message “I played #breakout on Google Image Search! I made it to level x in Image Breakout! Can you beat my score of xx?”) or via a customized short URL.
I believe this is the first Google Images Easter egg.
Atari game “Breakout” debuted 37 years ago. Here’s what it looked like on the Atari 2600:
This is somewhat similar to the “Zerg Rush” Easter egg game Google put out, in which you had to shoot red and yellow zergs that devoured Google’s search results.
Google also famously paid tribute to another Atari game, Pac-Man, with a playable Doodle in 2010 for its 30th anniversary.
The latest Google Webmaster video features Distinguished Engineer Matt Cutts talking about what webmasters can expect to see in the next few months in terms of SEO for Google, particularly changes combating black hat web spam from many different angles in a variety of areas.
Here are nine search and SEO changes webmasters will likely see – although, as always, Cutts warns nothing is set in stone and it should be taken with a grain of salt.
This update targets more black hat web spam. Penguin 2.0, the name Google uses internally for the next-generation Penguin, will be much more comprehensive than Penguin 1.0; it will go deeper and have a larger impact than the original.
Many advertorials (a.k.a., native advertising) violate Google’s quality guidelines. More importantly, they should not flow PageRank.
Google is planning to be a lot stronger on their enforcement of these types of paid links and advertising, disguised as “advertorials”. Cutts did clarify there is nothing wrong with advertorials, simply that they don’t want them to be abused for PageRank and linking reasons. If you use advertorials, Cutts suggested that they should be clearly marked and obvious that it is paid advertising.
Cutts mentioned that this is a problematic search, and there are others like it, so they are tackling it a couple of different ways. For those that play in that space, however, you’re out of luck since Cutts isn’t revealing exactly how they are dealing with it, just that it will be happening.
He said that they are targeting specific areas (another example he included was porn queries) that have traditionally been more spammy.
Again, Cutts isn’t going into details about this, but they are working on making link buying less effective and have a couple ideas for detailed link analysis to tackle this issue.
They want to roll out a next generation of hacked-site detection, as well as better webmaster notification. They would like to point webmasters to more specific information, such as whether they are dealing with malware or a hacked site, so they can hopefully clean it up.
If Google’s algorithms believe you or your site is an authority in a particular area, they want to make sure those sites rank a little bit higher than other sites.
They are looking for some additional signals for sites that are in the “gray area” or “border zone”, and looking for other signals that suggest the site truly is high quality, so it will help those sites who have been previously impacted by Panda.
If you’re doing deep searches in Google, and going back 5, 6 or more results pages deep, you can see the same site popping up with a cluster of results on those deep pages.
Google is looking into a change where once you have seen a cluster of results from the same site, you will be less likely to see more and more from that same site as you go deeper. Cutts mentioned this as being something that came specifically from user feedback.
Cutts said they want to be able to keep giving webmasters more specific and detailed information via webmaster tools. He mentions specifically example URLs to help webmasters diagnose problems on their site.
He believes that the changes will really make a difference with the quality of the search results, as well as impact the amount of spam that is showing up.
Cutts says if you are focused on high quality content, you don’t have much to worry about. But if you’re dabbling in the black hat arts, you might have a busy summer.
If you have bought so-called “clean” links that pass PageRank, you might want to take a closer look to see if those links are still actually clean and that you haven’t been penalized for it.
Google’s Distinguished Engineer Matt Cutts reports that they have taken action today against several thousand link sellers that were passing PageRank.
Word hasn't come out yet about which network was penalized, or whether it was a private in-house link network.
Judging from the wording of his tweet, it sounds as though manual action was taken against the link sellers, rather than an updated algorithm designed to target specifically these kinds of link sellers. That said, I fully expect they will be working on automated detection for these types of sellers.
This seems to be part of the link devaluing Cutts talked about in his “What to expect” webmaster video earlier this week.
It’s been a crazy couple of years for SEO. We’ve seen the rise of unnatural links manual penalties. The infamous algo updates like Panda and Penguin, and some lesser talked about ones like the page layout algorithm. The rise and fall of super-agencies. And, of course, the increased value of social media.
And yes, there’s been a ton of others, but you get the idea.
The whole nature of search – and by extension SEO – has changed drastically. But what isn't entirely clear is how SEO providers have evolved to meet this new reality. One gets the feeling that they really haven't fully embraced it; or at least that a lot of folks were never really much more than link sellers and hype merchants, and thus seem unable to adapt.
I was reading some articles lately about unnatural links messages and various tools and remedies to deal with them (or even Google Penguin). In most cases they talk about addressing or finding which links might be the problem in the website’s profile.
OK, seems sensible. But is it?
I mean come on… do you really need a tool?
For starters, if you’re even slightly worth your salt as an SEO, you should be able to pick them out without much trouble. It’s kinda stunning that people actually need guidance or a tool to do that.
Furthermore, if it's your website or you worked on the link building, I am pretty sure you know exactly which links were manufactured. Remember that crap-hat SEO you bought on eBay? Or from that circa-2005 long scrolling sales page? Fiverr? Pretty sure you can now start removing those links.
Now let’s consider Google’s take on whether a link was manufactured to increase rankings. If that link was not about actual traffic, branding or exposure… good chance it’s going to fit the description. But still it seems there’s a ton of folks still trying to walk that thin line. It’s bound not to end well.
Start thinking about link attraction not link building.
This becomes the real question to be asking. All too often we see SEOs that talk about search engine optimization including things such as social media, public relations, conversion optimization, content strategy, and more. But is this really the case?
I suspect that:
While I wish it was the latter, sadly it is often the former. We do a great disservice to the professionals in those respective fields by pretending it is the domain of the optimizer. Which of course begs the above question, what is SEO?
Some of the things we actually do:
Some of the areas we’re advisory in:
There’s more, but you get the idea here right? I’m tired of hearing about the “new SEO.” Article after article tries to redefine what SEO is because somehow a lot of people got lost. I do understand that we touch a lot of things in our job, but let’s not lose sight here.
Notice link building isn't on the list? There's a reason for that: it's really not SEO to me. There… I said it. Nyeaaah!
OK, so let’s look at this concept of link building. First off, the name. It actually puts us right square in the sights of ol Googly. Why? Because that’s manipulation. That’s what an unnatural link is to them.
One of the catchier buzz phrases of the last while is inbound marketing. As my pal Dan Thies mused the other day, "if you're doing outreach for guest posting/links, how is that inbound in any way, shape or form?" Damn good question, huh?
Need some links, do ya? I hate to be a nutter, but have you considered:
Content + Outreach + Social + Promotion + Brand reach
And I don’t want to hear that your client can’t afford all that. Then I guess they can’t afford to be in business.
Seriously. Pick your battles. Maybe cut back on the number of terms you're targeting.
Not all content needs to be pillar content (popular, stands the test of time). You can have filler as well. But create a strategy that’s doable. That’s your freakin’ job!
The fake-it-til-you-make-it days of "just throw some links at it" are well and truly gone, my friends. As a consultant who specializes in forensic work, I am tired of seeing shoddy SEO work that lands sites on the bad side of Google and ultimately hurts the industry I love.
At the end of the day, this has become a HUGE problem. The lines have become blurred, and that's not a good thing. Yes, a lot of things do affect SEO. That doesn't mean it's all our domain. Yes, links do still play a role in the SERPs. But that doesn't mean building them is the only answer.
And there’s even pressure from clients. Pressure from agency higher-ups. That all comes down to education. We need to educate others as to the evolution of SEO. Like any (link) addict, we need to break the cycle.
Mobile isn’t just for finding local merchants to start the shopping experience. Recent data from Google shows mobile is a shopping companion while consumers are in the store looking to buy.
In a new study, “Mobile In-Store Research: How in-store shoppers are using mobile devices,” Google showcases consumer behavior while they shop. According to the data, 84 percent of smartphone shoppers use their devices to help shop while in the store.
Google also found that in-store mobile activity begins with search, rather than with shopping apps or navigating directly to brand/retail websites. Eighty-two percent of smartphone shoppers use mobile search to help make purchase decisions, Google reports.
This aligns nicely with all the data we’ve been seeing lately on how mobile drives user behavior online.
Earlier this month, we learned that local search via non-PC devices more than quadrupled in 2012, according to a study by the Local Search Association. We also saw data from Telmetrics and xAd that showed 46 percent of searchers now use mobile exclusively to research.
But what we haven’t seen yet is new data on how people use mobile while they shop, until now. Let’s take a look at some key stats from the Google study.
First, how many shoppers actually use their phone to assist in the shopping experience while in the store?
This is across several categories, too:
And shoppers are relying less on customer service reps and more on their mobile devices for help while shopping:
And the last stat applies to some categories more than others:
Here’s what consumers use their mobile device for while in the store:
And here’s where users go online to perform specific functions related to their real-time shopping:
So, what does all of this mean to a business? As Google puts it:
A whirlwind of upgrades, information, and new features have been spilling out of Google AdWords of late. Consider these five new features that will have advertisers and PPC managers excited about the progress.
With the launch of enhanced campaigns in AdWords, we're also seeing upgraded sitelinks. In the past, a set of sitelinks was created at the campaign level. Now individual sitelinks can be created and then assigned to ad groups; this "override" at the ad group level offers the ability to mix and match, bringing a precision that's new to the ad extension feature.
Mobile-specific sitelinks open up opportunities for additional messaging with a local emphasis, like directions or local specials. Scheduling sitelinks is great for promotions and anything seasonal.
We reported on the recent announcement of the keyword planner in April and are really looking forward to this new feature. The keyword planner takes the old keyword tool we all know and love and bumps it up to the next level by combining the keyword tool and traffic estimator.
The functionality is very similar, but improved: you can change the match type in line before adding a keyword to your plan, and the tool shows traffic estimates right next to each keyword.
Also available is a feature to multiply keyword lists to get estimates. Combine two or more lists of keywords automatically into new keywords, and then get performance estimates for the new list. This can save you the time of manually combining keywords.
I can see a great application of this being to easily add geo or purchase modifiers to a campaign.
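The "multiply" feature described above is essentially a cartesian product of keyword lists. As a rough illustration of what it saves you from doing by hand, here is a minimal Python sketch (the function name and the sample keywords are my own, not from the article or from AdWords):

```python
from itertools import product

def multiply_keyword_lists(*lists):
    """Combine keyword lists the way the Keyword Planner's
    'multiply keyword lists' feature is described: every phrase
    from one list is paired with every phrase from the next."""
    combined = lists[0]
    for nxt in lists[1:]:
        combined = [f"{a} {b}" for a, b in product(combined, nxt)]
    return combined

# Hypothetical example: appending geo modifiers to a base list.
base = ["plumber", "emergency plumber"]
geos = ["chicago", "evanston"]
print(multiply_keyword_lists(base, geos))
# → ['plumber chicago', 'plumber evanston',
#    'emergency plumber chicago', 'emergency plumber evanston']
```

With a purchase-modifier list added as a third argument ("cheap", "24 hour", and so on), the list grows multiplicatively, which is exactly why having the tool generate and estimate the combinations for you is a time-saver.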
Google’s new Display Planner has the look and feel of AdWords with additional data and estimates of the Google Ad Planner. Advertisers can find topics, interests, or placements to target with information on audience opportunity.
AdWords tells you how many "cookies" are available to target; a cookie represents a device that's been cookied, rather than a person. Impressions-per-week estimates are also a great piece of information.
Mobile bid adjustments for AdWords enhanced campaigns are made at the campaign level. You can see how the same bid adjustment may not be applicable to all ad groups. No worries! Over the next few weeks, Google is rolling out ad group level mobile adjustments.
It’s good to see a granular level of customization. Depending on past performance, if one ad group has CPCs much different from the rest of the campaign, use the ad group level bids to even the playing field and limit losses.
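To make the math concrete: a mobile bid adjustment scales the base max CPC by (1 + adjustment/100), with -100% opting the campaign out of mobile entirely. The dollar figures below are illustrative, not from the article:

```python
def adjusted_bid(base_bid, adjustment_pct):
    """Apply a mobile bid adjustment: the base max CPC is scaled
    by (1 + adjustment/100). An adjustment of -100 opts out of
    mobile traffic altogether."""
    return round(base_bid * (1 + adjustment_pct / 100), 2)

# Hypothetical campaign with a $1.50 base bid and a -30% mobile
# adjustment; one ad group that performs well on mobile overrides
# it with +10% at the ad group level.
print(adjusted_bid(1.50, -30))   # → 1.05
print(adjusted_bid(1.50, 10))    # → 1.65
print(adjusted_bid(1.50, -100))  # → 0.0 (opted out of mobile)
```

This is the "even the playing field" idea: an ad group whose mobile CPCs run hotter or colder than the rest of the campaign gets its own multiplier instead of inheriting the campaign-wide one.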
Recently released, the upgrade center for enhanced campaigns can make upgrading easier for advertisers. With just a few clicks, several campaigns can be updated.
However, do a little research before taking the easy way out. In this tool, AdWords has a default bid setting “Use the Google-suggested default, calculated for each campaign”. If you know your mobile CPCs are a percentage of spend, use that to guide you and avoid defaulting as much as possible.
The upgrade center rolls out to all accounts over the next few weeks, accessible from the left nav bar on the Campaigns tab.
Many of these updates are now available to experiment with and others will be released in the next few weeks. What are your favorite new features, tools, bells and whistles in Google AdWords?