Tag Archives: SEO News

Long Tail CTR Study: The Forgotten Traffic Beyond Top 10 Rankings

New post on Online Marketing Hub

Long Tail CTR Study: The Forgotten Traffic Beyond Top 10 Rankings
by christopherjanb
Posted by GaryMoyle

This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of Moz, Inc.

Search behavior is fundamentally changing, as users become more savvy and increasingly familiar with search technology. Google’s results have also changed significantly over the last decade, going from a simple page of 10 blue links to a much richer layout, including videos, images, shopping ads and the innovative Knowledge Graph.

We also know there is an increasing number of touchpoints in a customer journey, involving different channels and devices. Google's
Zero Moment of Truth (ZMOT) theory, which describes a revolution in the way consumers search for information online, supports this idea and predicts that the number of times natural search is involved on the path to a conversion will keep rising.

Understanding how people interact with Google and other search engines will always be important. Organic click curves show how many clicks you might expect from search engine results and are one way of evaluating the impact of our campaigns, forecasting performance and exploring changing search behavior.

Using search query data from Google UK for a wide range of leading brands, based on millions of impressions and clicks, we can gain insights into how CTR in natural search has evolved beyond those shown in previous studies by
Catalyst, Slingshot and AOL.

Our methodology
The NetBooster study is based entirely on UK top search query data and has been refined by day in order to give us the most accurate sample size possible. This helped us reduce anomalies in the data in order to achieve the most reliable click curve possible, allowing us to extend it way beyond the traditional top 10 results.

We developed a method to extract data day by day to greatly increase the volume of keywords and to help improve the accuracy of the
average ranking position. It ensured that the average was taken across the shortest timescale possible, reducing rounding errors.
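The day-by-day averaging described above can be sketched in a few lines. This is a hypothetical illustration, not NetBooster's actual tooling: the row shape `(date, query, impressions, position)` is an assumption about what a daily top-query export looks like.

```python
from collections import defaultdict

def daily_weighted_positions(rows):
    """Average rank per (query, day), weighting by impressions.

    `rows` is an iterable of (date, query, impressions, position)
    tuples -- an assumed shape; the study's raw format isn't published.
    Averaging within a single day keeps the sample window short, so
    rounding of the reported average position distorts the curve less.
    """
    sums = defaultdict(lambda: [0.0, 0])  # (query, date) -> [pos*imps, imps]
    for date, query, imps, pos in rows:
        acc = sums[(query, date)]
        acc[0] += pos * imps
        acc[1] += imps
    return {key: s / n for key, (s, n) in sums.items()}

rows = [
    ("2014-10-01", "red shoes", 100, 3.0),
    ("2014-10-01", "red shoes", 300, 4.0),
    ("2014-10-02", "red shoes", 200, 5.0),
]
avg = daily_weighted_positions(rows)
# avg[("red shoes", "2014-10-01")] -> 3.75, since (100*3 + 300*4) / 400
```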

The NetBooster study included:

65,446,308 (65 million) clicks
311,278,379 (311 million) impressions
1,253,130 (1.2 million) unique search queries
54 unique brands
11 household brands (sites with a total of 1M+ branded keyword impressions)
Data covering several verticals, including retail, travel and financial services
We also looked at organic CTR for mobile, video and image results to better understand how people are discovering content in natural search across multiple devices and channels.

We’ll explore some of the most important elements in this article.

How does our study compare against others?
Let’s start by looking at the top 10 results. In the graph below we have normalized the results in order to compare our curve, like-for-like, with previous studies from Catalyst and Slingshot. Straight away we can see that there is higher participation beyond the top four positions when compared to other studies. We can also see much higher CTR for positions lower on the pages, which highlights how searchers are becoming more comfortable with mining search results.
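Normalizing, as used in the comparison above, just means rescaling each study's top-10 CTRs so they sum to the same total, which makes the shape of the curves directly comparable. A minimal sketch, using the desktop figures from the table later in this article:

```python
def normalize_curve(ctrs):
    """Scale a top-10 click curve so its values sum to 100%,
    making curves from different studies comparable like-for-like."""
    total = sum(ctrs)
    return [round(100 * c / total, 2) for c in ctrs]

# Desktop CTRs for positions 1-10 from the NetBooster table (percent).
netbooster = [19.35, 15.09, 11.45, 8.68, 7.21, 5.85, 4.63, 3.93, 3.35, 2.82]
normalized = normalize_curve(netbooster)
# The normalized values preserve the curve's shape (position 1 still
# dominates) but remove differences in absolute click volume.
```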

A new click curve to rule them all
Our first click curve is the most useful, as it provides the click-through rates for generic non-brand search queries across positions 1 to 30. Initially, we can see a significant amount of traffic going to the top three results, with position No. 1 receiving 19% of total traffic, position No. 2 receiving 15% and position No. 3 receiving 11.45%. The interesting thing to note, however, is that our curve shows a relatively high CTR for positions typically below the fold. Positions 6-10 all received a higher CTR than shown in previous studies. It also demonstrates that searchers are frequently exploring pages two and three.

When we look beyond the top 10, we can see that CTR is also higher than anticipated, with positions 11-20 accounting for 17% of total traffic. Positions 21-30 also show higher than anticipated results, with over 5% of total traffic coming from page three. This gives us a better understanding of the potential uplift in visits when improving rankings from positions 11-30.
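The "potential uplift" mentioned above can be estimated directly from the click curve: the extra visits from a ranking improvement are the CTR difference between the two positions, applied to the query's impressions. This is a rough sketch under a big assumption, namely that the study's aggregate desktop curve applies to your niche; the positions shown are a subset of the full table below.

```python
# Desktop CTR by position (percent), a subset of the study's table.
DESKTOP_CTR = {5: 7.21, 8: 3.93, 12: 2.36, 15: 1.79, 22: 0.75}

def estimated_uplift(impressions, current_pos, target_pos, curve=DESKTOP_CTR):
    """Extra visits expected from moving a query between two positions,
    assuming the benchmark curve holds for your site (a big assumption)."""
    gain = curve[target_pos] - curve[current_pos]
    return impressions * gain / 100

# A query with 10,000 monthly impressions moving from page two (15)
# to position 5: (7.21 - 1.79)% of 10,000, roughly 542 extra visits.
extra = estimated_uplift(10_000, 15, 5)
```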

This highlights that searchers are frequently going beyond the top 10 to find the exact result they want. The prominence of paid advertising, shopping ads, Knowledge Graph and the OneBox may also be pushing users below the fold more often as users attempt to find better qualified results. It may also indicate growing dissatisfaction with Google results, although this is a little harder to quantify.

Of course, it’s important we don’t just rely on one single click curve. Not all searches are equal. What about the influence of brand, mobile and long-tail searches?

Brand bias has a significant influence on CTR
One thing we particularly wanted to explore was how the size of your brand influences the curve. To explore this, we banded each of the domains in our study into small, medium and large categories based on the sum of brand query impressions across the entire duration of the study.
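The banding step can be sketched as a simple threshold function. The 1M+ branded-impression cut-off for household brands is stated in the study; the small/medium boundary used here is an illustrative guess, not a figure from NetBooster.

```python
def band(branded_impressions, small_max=100_000, large_min=1_000_000):
    """Bucket a domain by total branded-query impressions over the study.

    The 1M+ threshold for large ("household") brands comes from the
    study's description; small_max is a hypothetical boundary chosen
    only for illustration.
    """
    if branded_impressions >= large_min:
        return "large"
    if branded_impressions > small_max:
        return "medium"
    return "small"

# e.g. band(2_000_000) -> "large", band(50_000) -> "small"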

When we look at how brand bias is influencing CTR for non-branded search queries, we can see that better known brands get a sizable increase in CTR. More importantly, small- to medium-size brands are actually losing out to results from these better-known brands and experience a much lower CTR in comparison.

What is clear is keyphrase strategy will be important for smaller brands in order to gain traction in natural search. Identifying and targeting valuable search queries that aren’t already dominated by major brands will minimize the cannibalization of CTR and ensure higher traffic levels as a result.

How does mobile CTR reflect changing search behavior?
Mobile search has become a huge part of our daily lives, and our clients are seeing a substantial shift in natural search traffic from desktop to mobile devices. According to Google, 30% of all searches made in 2013 were on a mobile device; they also predict mobile searches will constitute over 50% of all searches in 2014.

Understanding CTR from mobile devices will be vital as the mobile search revolution continues. It was interesting to see that the click curve remained very similar to our desktop curve. Despite the lack of screen real estate, searchers are clearly motivated to scroll below the fold and beyond the top 10.

NetBooster CTR curves for top 30 organic positions

Position | Desktop CTR | Mobile CTR | Large Brand | Medium Brand | Small Brand
1        | 19.35%      | 20.28%     | 20.84%      | 13.32%       | 8.59%
2        | 15.09%      | 16.59%     | 16.25%      | 9.77%        | 8.92%
3        | 11.45%      | 13.36%     | 12.61%      | 7.64%        | 7.17%
4        | 8.68%       | 10.70%     | 9.91%       | 5.50%        | 6.19%
5        | 7.21%       | 7.97%      | 8.08%       | 4.69%        | 5.37%
6        | 5.85%       | 6.38%      | 6.55%       | 4.07%        | 4.17%
7        | 4.63%       | 4.85%      | 5.20%       | 3.33%        | 3.70%
8        | 3.93%       | 3.90%      | 4.40%       | 2.96%        | 3.22%
9        | 3.35%       | 3.15%      | 3.76%       | 2.62%        | 3.05%
10       | 2.82%       | 2.59%      | 3.13%       | 2.25%        | 2.82%
11       | 3.06%       | 3.18%      | 3.59%       | 2.72%        | 1.94%
12       | 2.36%       | 3.62%      | 2.93%       | 1.96%        | 1.31%
13       | 2.16%       | 4.13%      | 2.78%       | 1.96%        | 1.26%
14       | 1.87%       | 3.37%      | 2.52%       | 1.68%        | 0.92%
15       | 1.79%       | 3.26%      | 2.43%       | 1.51%        | 1.04%
16       | 1.52%       | 2.68%      | 2.02%       | 1.26%        | 0.89%
17       | 1.30%       | 2.79%      | 1.67%       | 1.20%        | 0.71%
18       | 1.26%       | 2.13%      | 1.59%       | 1.16%        | 0.86%
19       | 1.16%       | 1.80%      | 1.43%       | 1.12%        | 0.82%
20       | 1.05%       | 1.51%      | 1.36%       | 0.86%        | 0.73%
21       | 0.86%       | 2.04%      | 1.15%       | 0.74%        | 0.70%
22       | 0.75%       | 2.25%      | 1.02%       | 0.68%        | 0.46%
23       | 0.68%       | 2.13%      | 0.91%       | 0.62%        | 0.42%
24       | 0.63%       | 1.84%      | 0.81%       | 0.63%        | 0.45%
25       | 0.56%       | 2.05%      | 0.71%       | 0.61%        | 0.35%
26       | 0.51%       | 1.85%      | 0.59%       | 0.63%        | 0.34%
27       | 0.49%       | 1.08%      | 0.74%       | 0.42%        | 0.24%
28       | 0.45%       | 1.55%      | 0.58%       | 0.49%        | 0.24%
29       | 0.44%       | 1.07%      | 0.51%       | 0.53%        | 0.28%
30       | 0.36%       | 1.21%      | 0.47%       | 0.38%        | 0.26%
Creating your own click curve
This study gives you a set of benchmarks for both non-branded and branded click-through rates against which you can confidently compare your own click curve data. Using this data as a comparison will let you understand whether the appearance of your content is working for or against you.

We have made things a little easier for you by creating an Excel spreadsheet: simply drop your own top search query data in and it’ll automatically create a click curve for your website.

Simply visit the NetBooster website and download our tool to start making your own click curve.
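If you'd rather script it than use a spreadsheet, the same calculation is straightforward: bucket each query by its rounded average position, then divide total clicks by total impressions per bucket. This is a minimal sketch assuming rows of `(impressions, clicks, average_position)`, as in a top-search-queries export from Google Webmaster Tools.

```python
from collections import defaultdict

def click_curve(rows, max_pos=30):
    """Build a click curve (CTR percent per position bucket) from rows
    of (impressions, clicks, average_position), e.g. exported top
    search query data. Average positions are rounded to integer buckets."""
    imps = defaultdict(int)
    clicks = defaultdict(int)
    for i, c, pos in rows:
        bucket = round(pos)
        if 1 <= bucket <= max_pos:
            imps[bucket] += i
            clicks[bucket] += c
    return {p: 100 * clicks[p] / imps[p] for p in sorted(imps)}

# Three illustrative queries: positions 1.2, 2.7 and 11.4 fall into
# buckets 1, 3 and 11 respectively.
curve = click_curve([(1000, 180, 1.2), (500, 40, 2.7), (800, 20, 11.4)])
```

You can then plot this curve against the benchmark table above to see where your CTR is under- or over-performing for its position.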

In conclusion
It’s been both a fascinating and rewarding study, and we can clearly see a change in search habits. Whatever the reasons for this evolving search behavior, we need to start thinking beyond the top 10, as pages two and three are likely to get more traffic in future.

We also need to maximize the traffic created from existing rankings and not just think about position.

Most importantly, we can see practical applications of this data for anyone looking to understand and maximize their content’s performance in natural search. Having the ability to quickly and easily create your own click curve and compare this against a set of benchmarks means you can now understand whether you have an optimal CTR.

What could be the next steps?
There is, however, plenty of scope for improvement. We are looking forward to continuing our investigation, tracking the evolution of search behavior. If you’d like to explore this subject further, here are a few ideas:

Segment search queries by intent (How does CTR vary depending on whether a search query is commercial or informational?)
Understand CTR by industry or niche
Monitor the effect of new Knowledge Graph formats on CTR across both desktop and mobile search
Conduct an annual analysis of search behavior (Are people’s search habits changing? Are they clicking on more results? Are they mining further into Google’s results?)
Ultimately, click curves like this will change as the underlying search behavior continues to evolve. We are now seeing a massive shift in the underlying search technology, with Google in particular heavily investing in entity-based search (i.e., the Knowledge Graph). We can expect other search engines, such as Bing, Yandex and Baidu, to follow suit and use a similar approach.

The rise of smartphone adoption and constant connectivity also means natural search is becoming more focused on mobile devices. Voice-activated search is also a game-changer, as people start to converse with search engines in a more natural way. This has huge implications for how we monitor search activity.

What is clear is no other industry is changing as rapidly as search. Understanding how we all interact with new forms of search results will be a crucial part of measuring and creating success.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

For more including images see:
http://omhub.wordpress.com/2014/12/04/long-tail-ctr-study-the-forgotten-traffic-beyond-top-10-rankings/

This page about SEO and Page Rankings has been posted “By Mike Armstrong”

SEO News & Advice

New post on Online Marketing Hub

The Danger of Crossing Algorithms: Uncovering The Cloaked Panda Update During Penguin 3.0
by christopherjanb
Posted by glenngabe

Penguin 3.0 was one of the most anticipated algorithm updates in recent years when it rolled out on October 17, 2014. Penguin hadn’t run for over a year at that point,
and there were many webmasters sitting in Penguin limbo waiting for recovery. They had cleaned up their link profiles, disavowed what they could, and were
simply waiting for the next update or refresh. Unfortunately, Google was wrestling with the algo internally and over twelve months passed without an
update.

So when Pierre Far finally
announced Penguin 3.0 a few days later on October 21, a few things
stood out. First, this was not a new algorithm like Gary Illyes had explained it would be at SMX East. It was a refresh and underscored
the potential problems Google was battling with Penguin (cough, negative SEO).

Second, we were not seeing the impact that we expected. The rollout seemed to begin with a heavier international focus, and the overall U.S. impact has been underwhelming, to say the least. There were definitely many fresh hits globally, but there were a number of websites that should have recovered but didn't, for some reason. And many are still waiting for recovery today.

Third, the rollout would be slow and steady and could take weeks to fully complete. That’s unusual, but makes sense given the microscope Penguin 3.0 was under. And this third point (the extended rollout) is even more important than most people think. Many webmasters are already confused when they get hit
during an acute algorithm update (for example, when an algo update rolls out on one day). But the confusion gets exponentially worse when there is an extended rollout.

The more time that goes by between the initial launch and the impact a website experiences, the more questions pop up. Was it Penguin 3.0 or was it something else? Since I work heavily with algorithm updates, I’ve heard similar questions many times over the past several years. And the extended Penguin 3.0 rollout is a great example of why confusion can set in. That’s my focus today.

Penguin, Pirate, and the anomaly on October 24

With the Penguin 3.0 rollout, we also had Pirate 2 rolling out. And yes, there are some websites that could be impacted by both. That added a layer of complexity to the situation, but nothing like what was about to hit. You see, I picked up a very strange anomaly on October 24. And I clearly saw serious movement on that day (starting late in the day ET).

So, if there was a third algorithm update, then that’s three potential algo updates rolling out at the same time. More about this soon, but it underscores the confusion that can set in when we see extended rollouts, with a mix of confirmed and unconfirmed updates.

Penguin 3.0 tremors and analysis
Since I do a lot of Penguin work, and have researched many domains impacted by Penguin in the past, I heavily studied the Penguin 3.0 rollout and published a blog post analyzing the first ten days of Penguin 3.0, which included some interesting findings for sure.

And based on the extended rollout, I definitely saw Penguin tremors beyond the initial October 17 launch. For example, check out the screenshot below of a website seeing Penguin impact on October 17, 22, and 25.

But as mentioned earlier, something else happened on October 24 that set off sirens in my office. I started to see serious movement on sites impacted by Panda, and not Penguin. And when I say serious movement, I’m referring to major traffic gains or losses all starting on October 24. Again, these were sites heavily dealing with Panda and had clean link profiles. Check out the trending below from October 24 for several sites that saw impact.

A good day for a Panda victim:
A bad day for a Panda victim:
And an incredibly frustrating day for a 9/5 recovery that went south on 10/24:

(All on the link below)!

I saw this enough that I tweeted heavily about it and
included a section about Panda in my Penguin 3.0 blog post. And that’s when something wonderful happened, and it highlights the true beauty and power of the internet.

As more people saw my tweets and read my post, I started receiving messages from other webmasters explaining that they saw the exact same thing on their websites dealing with Panda and not Penguin. And not only did they tell me about it, they showed me the impact.

I received emails containing screenshots and tweets with photos from Google Analytics and Google Webmaster Tools. It was amazing to see, and it confirmed
that we had just experienced a Panda update in the middle of a multi-week Penguin rollout. Yes, read that line again. Panda during Penguin, right when the internet world was clearly focused on Penguin 3.0.

That was a sneaky move Google… very sneaky. 🙂

So, based on what I explained earlier about webmaster confusion and algorithms, can you tell what happened next? Yes, massive confusion ensued. We had the
trifecta of algorithm updates with Penguin, Pirate, and now Panda.

Webmaster confusion and a reminder of the algo sandwich from 2012
So, we had a major algorithm update during two other major algorithm updates (Penguin and Pirate) and webmaster confusion was hitting extremely high levels. And I don’t blame anyone for being confused. I’m neck deep in this stuff and it confused me at first.

Was the October 24 update a Penguin tremor or was this something else? Could it be Pirate? And if it was indeed Panda, it would have been great if Google told us it was Panda! Or did they want to throw off SEOs analyzing Penguin and Pirate? Does anyone have a padded room I can crawl into?

Once I realized this was Panda, and started to communicate the update via Twitter and my blog, I had a number of people ask me a very important question:

“Glenn, would Google really roll out two or three algorithm updates so close together, or at the same time?”

Why yes, they would. Anyone remember the algorithm sandwich from April of 2012? That’s when Google rolled out Panda on April 19, then Penguin 1.0 on April 24,
followed by Panda on April 27. Yes, we had three algorithm updates all within ten days. And let’s not forget that the Penguin update on April 24, 2012 was the first of its kind! So yes, Google can, and will, roll out multiple major algos around the same time.

Where are we headed? It’s fascinating, but not pretty

Panda is near real-time now
When Panda 4.1 rolled out on September 23, 2014, I immediately disliked the title and version number of the update. Danny Sullivan named it 4.1, so it stuck. But for
me, that was not 4.1… not even close. It was more like 4.75. You see, there have been a number of Panda tremors and updates since P4.0 on May 20,
2014.

I saw what I was calling “tremors”
nearly weekly based on having access to a large amount of Panda data (across sites, categories, and countries).
And based on what I was seeing, I reached out to John Mueller at Google to clarify the tremors. John’s response was great and confirmed what I was seeing.
He explained that there was not a set frequency for algorithms like Panda. Google can roll out an algorithm, analyze the SERPs, refine the algo to get the desired results, and keep pushing it out. And that’s exactly what I was seeing (again, almost weekly since Panda 4.0).

When Panda and Penguin meet in real time…
…they will have a cup of coffee and laugh at us. 🙂 So, since Panda is near-real time, the crossing of major algorithm updates is going to happen.
And we just experienced an important one on October 24 with Penguin, Pirate, and Panda. But it could (and probably will) get more chaotic than what we have now.
We are quickly approaching a time where major algorithm updates crafted in a lab will be unleashed on the web in near-real time or in actual real time.

And if organic search traffic from Google is important to you, then pay attention. We’re about to take a quick trip into the future of Google and SEO. And after hearing what I have to say, you might just want the past back…

Google’s brilliant object-oriented approach to fighting webspam
I have presented at the past two SES conferences about Panda, Penguin, and other miscellaneous disturbances in the force. More about those “other
disturbances” soon. In my presentation, one of my slides looks like this:

(See link below)!

Over the past several years, Google has been using a brilliant, object-oriented approach to fighting webspam and low quality content. Webspam engineers can craft external algorithms in a lab and then inject them into the real-time algorithm whenever they want. It’s brilliant because it isolates specific problems, while also being extremely scalable. And by the way, it should scare the heck out of anyone breaking the rules.

For example, we have Panda, Penguin, Pirate, and Above the Fold. Each was crafted to target a specific problem and can be unleashed on the web whenever Google wants. Sure, there are undoubtedly connections between them (either directly or indirectly), but each specific algo is its own black box. Again, it’s object-oriented.

Now, Panda is a great example of an algorithm that has matured to where Google highly trusts it. That’s why Google announced in June of 2013 that Panda would roll out monthly, over ten days. And that’s also why it matured even more with Panda 4.0 (and why I’ve seen tremors almost weekly.)

And then we had Gary Illyes explain that Penguin was moving along the same path. At SMX East, Gary explained that the new Penguin algorithm (which clearly didn’t roll out on October 17) would be structured in a way where subsequent updates could be rolled out more easily.
You know, like Panda.

And by the way, what if this happens to Pirate, Above the Fold, and other algorithms that Google is crafting in its Frankenstein lab? Well my friends, then we’ll have absolute chaos and society as we know it will crumble. OK, that’s a bit dramatic, but you get my point.

We already have massive confusion now… and a glimpse into the future reveals a continual flow of major algorithms running in real-time, each that could pummel a site to the ground. And of course, with little or no sign of which algo actually caused the destruction. I don’t know about you, but I just broke out in hives. 🙂

Actual example of what (near) real-time updates can do
After Panda 4.0, I saw some very strange Panda movement for sites impacted by recent updates. And it underscores the power of near-real-time algo updates. As a quick example, temporary Panda recoveries can happen if you don't get out of the gray area enough. And now that we are seeing Panda tremors almost weekly, you can experience potential turbulence several times per month.

Here is a screenshot from a site that recovered from Panda, didn’t get out of the gray area and reentered the strike zone, just five days later.

(See link to article below)!

Holy cow, that was fast. I hope they didn’t plan any expensive trips in the near future. This is exactly what can happen when major algorithms roam the web in real time. One week you’re looking good and the next week you’re in the dumps. Now, at least I knew this was Panda. The webmaster could tackle more content problems and get out of the gray area… But the ups and downs of a Panda roller coaster ride can drive a webmaster insane. It’s one of the reasons I recommend making
significant changes when you’ve been hit by Panda. Get as far out of the gray area as possible.

An “automatic action viewer” in Google Webmaster Tools could help (and it’s actually being discussed internally by Google)
Based on webmaster confusion, many have asked Google to create an “automatic action viewer” in Google Webmaster Tools. It would be similar to the “manual
actions viewer,” but focused on algorithms that are demoting websites in the search results (versus penalties). Yes, there is a difference by the way.

The new viewer would help webmasters better understand the types of problems that are being impacted by algorithms like Panda, Penguin, Pirate, Above the
Fold, and others. Needless to say, this would be incredibly helpful to webmasters, business owners, and SEOs.

So, will we see that viewer any time soon? Google’s John Mueller
addressed this question during the November 3 webmaster hangout (at 34:54).

http://ift.tt/1zj066n

John explained they are trying to figure something out, but it’s not easy. There are so many algorithms running that they don’t want to provide feedback
that is vague or misleading. But, John did say they are discussing the automatic action viewer internally. So you never know…

A quick note about Matt Cutts
As many of you know, Matt Cutts took an extended leave this past summer (through the end of October). Well, he announced on Halloween that he is extending his leave into 2015. I won’t go crazy here talking about his decision overall, but I will
focus on how this impacts webmasters as it relates to algorithm updates and webspam.

Matt does a lot more than just announce major algo updates… He actually gets involved when collateral damage rears its ugly head. And there’s not a
faster way to rectify a flawed algo update than to have Mr. Cutts involved. So before you dismiss Matt’s extended leave as uneventful, take a look at the
trending below:

Notice the temporary drop off a cliff, then 14 days of hell, only to see that traffic return? That’s because Matt got involved. That’s the
movie blog fiasco from early 2014 that I heavily analyzed. If
Matt was not notified of the drop via Twitter, and didn’t take action, I’m not sure the movie blogs that got hit would be around today. I told Peter from
SlashFilm that his fellow movie blog owners should all pay him a bonus this year. He’s the one that pinged Matt via Twitter and got the ball rolling.

It’s just one example of how having someone with power out front can nip potential problems in the bud. Sure, the sites experienced two weeks of utter
horror, but traffic returned once Google rectified the problem. Now that Matt isn’t actively helping or engaged, who will step up and be that guy? Will it
be John Mueller, Pierre Far, or someone else? John and Pierre are greatly helpful, but will they go to bat for a niche that just got destroyed? Will they
push changes through so sites can turn around? And even at its most basic level, will they even be aware the problem exists?

These are all great questions, and I don’t want to bog down this post (it’s already incredibly long). But don’t laugh off Matt Cutts taking an extended leave. If he’s gone for good, you might only realize how important he was to the SEO community
after he’s gone. And hopefully it’s not because your site just tanked as collateral damage during an algorithm update. Matt might be running a marathon or trying on new Halloween costumes. Then where will you be?

Recommendations moving forward:
So where does this leave us? How can you prepare for the approaching storm of crossing algorithms? Below, I have provided several key bullets that I think every webmaster should consider. I recommend taking a hard look at your site now, before major algos are running in near-real time.

- Truly understand the weaknesses of your website. Google will continue crafting external algos that can be injected into the real-time algorithm. And they will go real-time at some point. Be ready by cleaning up your site now.
- Document all changes and fluctuations as best you can. Use annotations in Google Analytics and keep a spreadsheet updated with detailed information.
- Along the same lines, download your Google Webmaster Tools data monthly (at least). Having helped many companies with algorithm hits, I can tell you that information is incredibly valuable and can help lead you down the right recovery path.
- Use a mix of audits and focus groups to truly understand the quality of your site. I mentioned in my post about aggressive advertising and Panda that human focus groups are worth their weight in gold (for surfacing Panda-related problems). Most business owners are too close to their own content and websites to accurately measure quality. Bias can be a nasty problem and can quickly lead to bamboo-overflow on a website.
- Beyond on-site analysis, make sure you tackle your link profile as well. I recommend heavily analyzing your inbound links and weeding out unnatural links. And use the disavow tool for links you can't remove. The combination of enhancing the quality of your content, boosting engagement, knocking down usability obstacles, and cleaning up your link profile can help you achieve long-term SEO success. Don't tackle one quarter of your SEO problems. Address all of them.
- Remove barriers that inhibit change and action. You need to move fast. You need to be decisive. And you need to remove red tape that can bog down the cycle of getting changes implemented. Don't water down your efforts because there are too many chefs in the kitchen. Understand the changes that need to be implemented, and take action. That's how you win SEO-wise.
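The monthly-download habit recommended above is easy to automate halfway: once you've exported a CSV from Google Webmaster Tools by hand, a small script can file it away under a dated name so a history accumulates. A minimal sketch using only the standard library; the file and directory names are illustrative.

```python
import shutil
from datetime import date
from pathlib import Path

def archive_export(csv_path, archive_dir="gwt-archive"):
    """File a manually exported Webmaster Tools CSV under a dated name,
    building the monthly history you'll want after an algorithm hit."""
    src = Path(csv_path)
    dest_dir = Path(archive_dir)
    dest_dir.mkdir(exist_ok=True)
    # Prefix with year-month so files sort chronologically.
    dest = dest_dir / f"{date.today():%Y-%m}-{src.name}"
    shutil.copy2(src, dest)  # copy2 preserves the export's timestamp
    return dest

# Usage: archive_export("TopSearchQueries.csv")
```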

Summary: Are you ready for the approaching storm?
SEO is continually moving and evolving, and it's important that webmasters adapt quickly. Over the past few years, Google's brilliant object-oriented approach to fighting webspam and low quality content has yielded algorithms like Panda, Penguin, Pirate, and Above the Fold. And more are on their way. My advice is to get your situation in order now, before crossing algorithms blend a recipe of confusion that makes it exponentially harder to identify, and then fix, the problems riddling your website.

Now excuse me while I try to build a flux capacitor. 🙂


For more on this Article including the images and graphs see:
http://omhub.wordpress.com/2014/11/12/the-danger-of-crossing-algorithms-uncovering-the-cloaked-panda-update-during-penguin-3-0/

The SEO News & Advice page was posted “By Mike Armstrong” to the SEO Blog category.

A recent post about how users view and interact with Today’s Google Search Engine Results Page

New post on Online Marketing Hub

Eye Tracking in 2014: How Users View and Interact with Today’s Google SERPs
by christopherjanb
Posted by rMaynes1

In September 2014, Mediative released its latest eye-tracking research entitled “The Evolution of Google’s Search Engine Results Pages and Their Effects on User Behaviour”.

This large study had participants conduct various searches using Google on a desktop. For example, participants were asked “Imagine you’re moving from
Toronto to Vancouver. Use Google to find a moving company in Toronto.” Participants were all presented with the same Google SERP, no matter the search
query.

Mediative wanted to know where people look and click on the SERP the most, what role the location of the listing on the SERP plays in winning views and
clicks, and how click activity on listings has changed with the introduction of Google features such as the carousel, the knowledge graph etc.

Mediative discovered that, just as Google’s SERP has evolved over the past decade, so too has the way in which search engine users scan the page before
making a click.

Back in 2005 when
a similar eye-tracking study was conducted for the first time by Mediative (formerly Enquiro), it was
discovered that people searched in a distinctive “triangle” pattern, starting in the top left of the search results page where they expected the first
organic listing to be located, and reading across horizontally before moving their eyes down to the second organic listing, and reading horizontally, but
not quite as far. This area of concentrated gaze activity became known as Google’s “Golden Triangle”. The study concluded that if a business’s listing was
not in the Golden Triangle, its odds of being seen by a searcher were dramatically reduced.

Heat map from 2005 showing the area known as Google’s “Golden Triangle” (see link below).

But now, in 2014, the top organic results are no longer always in the top-left corner where searchers expect them to be, so they scan other areas of the
SERP, trying to seek out the top organic listing, but being distracted by other elements along the way. The #1 organic listing is shifting further down the
page, and while this listing still captures the most click activity (32.8%) regardless of what new elements are presented, the shifting location has opened
up the top of the page with more potential areas for businesses to achieve visibility.

Where scanning was once more horizontal, the adoption of mobile devices over the past 9 years has conditioned searchers to scan more vertically—they are looking for the fastest path to the desired content, and, compared to 9 years ago, they are viewing more search results listings during a single session and spending less time viewing each one.

Searchers on Google now scan far more vertically than several years ago (see link below)

One of the biggest changes from the SERPs of 9 years ago to today is that Google is now trying to keep people on the results page for as long as it can.

An example is the knowledge graph. In Mediative’s study, when searchers were looking for “weather in New Orleans”, the results page presented to them showed exactly what they needed to know. Participants were asked to click on the result that they felt best met their needs, even if, in reality, they wouldn’t have clicked through (in order to end the task). When a knowledge graph result exactly met the intent of the searcher, the study found that 80% of people looked at that result, and 44% clicked on it. Google provided searchers with a relevant enough answer to keep them on the SERP. The top organic listing captured only 36.5% of page clicks, compared to 82% when the knowledge graph did not provide the answer the searcher was looking for.

It’s a similar case with the carousel results; when a searcher clicks on a listing, instead of going through to the listing’s website, another SERP is
presented specifically about the business, as Google tries to increase paid ad impressions/clicks on the Google search results page.

How can businesses stay on top of these changes and ensure they still get listed?
There are four main things to keep in mind:

1.
The basic fundamentals of SEO are as important as ever
Create unique, fresh content that speaks to the needs of your customers, as this will always trump chasing the algorithm. There are also on-page and off-page SEO tactics you can employ to increase your chances of being listed in areas of the SERP other than your website’s organic listing, such as front-loading keywords in page titles and meta descriptions, getting listed on directories and ratings-and-reviews sites, and having social pages. It’s important to note that SEO strategy is no longer a one-size-fits-all approach.

2.
Consider using schema mark-up wherever possible
In Mediative’s 2014 Google SERP research, it was discovered that blog posts that had been marked up using schema to show the picture and name of the author
got a significant amount of engagement, even when quite far down the first page—these listings garnered an average of 15.5% of total page clicks.

Note:

As of August 2014, Google removed authorship markup entirely. However, the results are still a good example of how schema mark-up can be used to make your business listing stand out on the SERP, potentially capturing more views and clicks, and therefore more website traffic.

In the study, participants were asked to “Imagine that you’re starting a business and you need to find a company to host your website. Use Google to find
information about website hosting companies”. The SERP presented is shown below:

Almost 45% of clicks went to 2 blog posts titled “Five Best Web Hosting Companies” and “10 Best Web Hosting Companies”.

In general, the top clicked posts were those that had titles including phrases such as:

“Best…”
“Reviews of…”
“Top 5…”
“How-to…”
According to Google, “On-page markup helps search engines understand the information on webpages and provide richer results… Google doesn’t use markup for ranking purposes at this time, but rich snippets can make your web pages appear more prominently in search results, so you may see an increase in traffic.”

Schema markup is probably the most under-utilized tool in SEO, presenting a huge opportunity for companies that do use this Google-approved tool.
Searchmetrics reported that only 0.3% of websites
use schema markup, yet over a third of Google’s results contain rich snippets (additional text, images and links below the individual search results).
BruceClay.com reports that rich snippets can increase the CTR of listings by between 15% and 50%, and that websites using schema markup tend to rank higher in search results.

Schema mark-up can be used to add star ratings, number of reviews, pricing (all shown in the listing below) and more to a search results page listing.
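To make this concrete, here is a minimal sketch of the kind of JSON-LD mark-up involved, assembled with Python’s standard json module. The product name, price, and rating values are invented for illustration; a real listing would use your own data and the schema.org types appropriate to your business.

```python
import json

# Hypothetical product data -- every name and value here is illustrative only.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "offers": {
        "@type": "Offer",
        "price": "24.99",
        "priceCurrency": "GBP",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.4",
        "reviewCount": "89",
    },
}

# Wrap the JSON-LD in the script tag that search engines read for rich snippets.
snippet = '<script type="application/ld+json">\n{}\n</script>'.format(
    json.dumps(product, indent=2)
)
print(snippet)
```

Pasting a block like this into a page’s HTML is what lets star ratings, review counts and pricing appear alongside the listing.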

3.
Know the intent of your users
Understanding what searchers are trying to discover when they conduct a search can help determine how much effort you should put into appearing in the number one organic listing, which can be an extremely difficult task without unlimited budget and resources. And even if you do make it to the number one organic listing, traffic is not guaranteed, as this research discovered. If you’re competing with big-name brands, or ratings and review sites, and
THAT is what your customers want, then you are going to struggle to compete.

The importance of your business being the first listing vs. on the first page, therefore, is highly dependent on the searcher’s intent, plus the strength of your brand. The key is to always keep
user intent top-of-mind, and this can be established by talking to real people, rather than guessing. What are they looking for when they are searching for your site? Structure your content around what people really want and need, list your site
on the directories that people actually visit or reference, create videos (if that’s what your audience wants)—know what your actual customers are
looking for, and then provide it.

There are going to be situations when a business can’t get to number one in the organic listings. As previously mentioned, the study shows that this is still the key place to be, and the top organic listing captures more clicks than any other single listing. But if your chances of getting to that number one spot are slim, you need to focus on other areas of the SERP, such as positions #2-4, which are easier to rank for. Businesses positioned lower on the SERP (especially positions 2-4) see more click activity than they did several years ago, making this real estate much more valuable. As Gord Hotchkiss writes, searchers tend to “chunk” information on the SERP and scan each chunk in the same way they used to scan the entire SERP: in a triangle pattern. Getting listed at the top of a “chunk” can therefore be effective for many businesses. This idea of “chunking” and scanning can be seen in the heat map below.

To add to that, Mediative’s research showed that everything located above the top 4 organic listings (so, carousel results, knowledge graph, paid listings,
local listings etc.) combined captured 84% of clicks. If you can’t get your business listing to #1, but can get listed somewhere higher than #4, you have a good chance of being seen, and clicked on by searchers. Ultimately, people expect Google to continue to do its job, and respond to search queries with the most relevant results at the top. The study points out that only 1% of participants were willing to click through to Page 2 to see more results. If you’re not listed on page 1 of Google for relevant searches, you may as well not exist online.

4.
A combination of SEO and paid search can maximize your visibility in SERP areas that have the biggest impact on both branding and traffic
Even though organic listings are where many businesses are striving to be listed (and where the majority of clicks take place), it’s important not to forget about paid listings as a component of your digital strategy. Click-through rates for top sponsored listings (positions 1 and 2) have changed very little in the past decade. Where the huge change has taken place is in the ability of sponsored ads on the right rail to attract attention and clicks.

Activity on this section of the page is almost non-existent. This can be put down to a couple of factors: searchers’ conditioned behaviour, mentioned before, of scanning more vertically thanks to increased mobile usage, and the fact that over the years we have learned that those results are typically less relevant than the organic results, so we tend not to even take the time to view them.

Mediative’s research also found that there are branding effects of paid search, even if not directly driving traffic. We asked participants to “Imagine you are traveling to New Orleans and are looking for somewhere to meet a friend for dinner in the French Quarter area. Use Google to find a restaurant.”
Participants were presented with a SERP showing 2 paid ads—the first was for opentable.com, and the second for the restaurant Remoulade, remoulade.com.

The top sponsored listing, opentable.com, was viewed by 84% of participants, and captured 26% of clicks. The second listing, remoulade.com, only captured
2% of clicks but was looked at by 73% of participants. By being seen by almost 3/4 of participants, the paid listing can increase brand affinity, and
therefore purchase (or choice) consideration in other areas! For example, if the searcher comes back and searches again another time, or clicks to opentable.com and then sees Remoulade listed, it may benefit from a higher brand affinity from having already been seen in the paid listings. Mediative conducted a Brand Lift study featuring Honda that found the more real estate that brands own on the SERP, the higher the CTR, and the higher the brand affinity, brand recognition, purchase consideration etc. Using paid search for more of a branding play is essentially free
brand advertising: while you should of course be prepared to get the clicks and pay for them, it’s likely that your business listing will be seen by a large number of people without capturing the same number of clicks. Impression data can also be easily tracked with Google paid ads, so you know
exactly how many times your ad was shown, and can therefore estimate how many people actually looked at it from a branding point of view.
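That back-of-the-envelope estimate can be sketched in a few lines of Python. The view and click rates below are the figures the study reported for the second sponsored listing; the impression count is an assumed figure for illustration, not from the study.

```python
# Rough sketch: estimating brand exposure from paid-search impression data,
# applying the view/click rates from Mediative's restaurant-SERP test.
impressions = 10_000   # from your own ads reporting (assumed figure)

view_rate = 0.73       # 73% of participants looked at the second paid listing
click_rate = 0.02      # only 2% clicked it

estimated_views = impressions * view_rate      # people who likely saw the ad
estimated_clicks = impressions * click_rate    # clicks you actually pay for
unpaid_exposures = estimated_views - estimated_clicks  # seen, but not paid for

print(f"~{estimated_views:.0f} people likely saw the ad")
print(f"~{estimated_clicks:.0f} clicks paid for")
print(f"~{unpaid_exposures:.0f} free branding exposures")
```

The gap between views and clicks is the “free brand advertising” the article describes: exposures you were seen for but never billed for.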

Rebecca Maynes is a Marketing Communications Strategist with Mediative, and was a major contributor on this study. The full study, including click-through rates for all areas of the SERP, can be downloaded at

http://ift.tt/1vuhDXI.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

For more on this article or for pictures see: http://omhub.wordpress.com/2014/10/22/eye-tracking-in-2014-how-users-view-and-interact-with-todays-google-serps/

A recent post about how users view and interact with Today’s Google Search Engine Results Page was posted “By Mike Armstrong”

Content Marketing & Content Writing for Search Engine Optimisation

If you are adopting or implementing a content marketing strategy, and looking to provide content writing on a blog or website to help improve your ranking for certain “keyword searches” in the search engine results pages (SERPs), then you should look to post or write content regularly.

You should also write 300 to 500 words per post, and use the keyword (or keyword phrase) that you are looking to rank highly for regularly throughout the content, and within your headers and meta descriptions.
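As a quick sketch of that guideline, a short Python helper (the function name and thresholds are my own, not an established tool) can check a draft post’s word count and keyword usage before it goes live:

```python
import re

def check_post(text: str, keyword: str) -> dict:
    """Check a draft post against the rough guidelines above:
    300-500 words, with the target keyword used several times."""
    words = re.findall(r"[\w'-]+", text.lower())
    return {
        "word_count": len(words),
        "length_ok": 300 <= len(words) <= 500,
        "keyword_hits": text.lower().count(keyword.lower()),
    }

# Toy draft: 5 keyword uses plus filler, ~330 words in total.
draft = ("content marketing " * 5 + "filler word " * 160).strip()
report = check_post(draft, "content marketing")
print(report)
```

A report like `{'word_count': 330, 'length_ok': True, 'keyword_hits': 5}` tells you at a glance whether a draft fits the length and keyword-frequency advice; the right number of keyword uses is a judgment call, not a fixed rule.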

If you are looking for help with a content marketing strategy, content marketing or content writing services or Keyword Analysis or hot spot details in order to get the most out of your content or your content marketing please call: 07517 024979 | or email: maconsultancy1@gmail.com

*If you like this SEO post you might also like this other SEO post:

SEO Tip / Search Engine Optimisation:

http://maconsultancycardiff.com/blogging/seo-tip-search-engine-optimisation/

The Content Marketing & Content Writing for Search Engine Optimisation page was posted “By Mike Armstrong”


2014 Local Search Ranking Factors

New post on Online Marketing Hub

Announcing the 2014 Local Search Ranking Factors Results
by christopherjanb
Posted by David-Mihm

Many of you have been tweeting, emailing, asking in conference Q&As, or just generally awaiting this year’s Local Search Ranking Factors survey results.

Here they are!


Hard to believe, but this is the seventh year I’ve conducted this survey—local search has come a long way since the early days of the 10-pack way back in 2008! As always, a massive thanks to all of the expert panelists who in many cases gave up a weekend or a date night in order to fill out the survey.

New this year
As the complexity of the local search results has increased, I’ve tried to keep the survey as manageable as possible for the participants, and the presentation of results as actionable as possible for the community. So to that end, I’ve made a couple of tweaks this year.

Combination of desktop and mobile results
Very few participants last year perceived any noticeable difference between ranking criteria on desktop and mobile devices, so this year I simply asked that they rate localized organic results, and pack/carousel results, across both result types.

Results limited to top 50 factors in each category
Again, the goal here was to simplify some of the complexity and help readers focus on the factors that really matter. Let me know in the comments if you think this decision detracts significantly from the results, and I’ll revisit it in 2015.

Factors influenced by Pigeon
If you were at Matt McGee’s Pigeon session at SMX East a couple of weeks ago, you got an early look at these results in my presentation. The big winners were domain authority and proximity to searcher, while the big losers were proximity to centroid and having an address in the city of search. (For those who weren’t at my presentation, the latter assessment may have to do with larger radii of relevant results for geomodified phrases).

My own takeaways
Overall, the
algorithmic model that Mike Blumenthal developed (with help from some of the same contributors to this survey) way back in 2008 continues to stand up. Nonetheless, there were a few clear shifts this year that I’ll highlight below:

Behavioral signals—especially clickthrough rate from search results—seem to be increasing in importance. Darren Shaw in particular noted Rand’s IMEC Labs research, saying “I think factors like click through rate, driving directions, and “pogo sticking” are valuable quality signals that Google has cranked up the dial on.”
Domain authority seems to be on its way up—particularly since the Pigeon rollout here in the U.S. Indeed, even in clear instances of post-Pigeon spam, the poor results seem to relate to Google’s inability to reliably separate “brands” from “spam” in Local. I expect Google to get better at this, and the importance of brand signals to remain high.
Initially, I was surprised to see authority and consistency of citations rated so highly for localized organic results. But then I thought to myself, “if Google is increasingly looking for brand signals, then why shouldn’t citations help in the organic algorithm as well?” And while the quantity of structured citations still rated highly for pack and carousel results, consistent citations from quality sources continue to carry the day across both major result types.
Proximity to searcher saw one of the biggest moves in this year’s survey. Google is getting better at detecting location at a more granular level—even on the desktop. The user is the new Centroid.
For markets where Pigeon has not rolled out yet (i.e. everywhere besides the U.S.), I’d encourage business owners and marketers to start taking as many screenshots of their primary keywords as possible. With the benefit of knowing that Pigeon will eventually roll out in your countries, the ability to compare before-and-after results for the same keywords will yield great insight for you in discerning the direction of the algorithm.
As with every year, though, it’s the comments from the experts and community (that’s you, below!) that I find most interesting to read. So I think at this point I’ll sign off, crack open a
GABF Gold-Medal-Winning Breakside IPA from Portland, and watch them roll in!

2014 Local Search Ranking Factors

For more on this article or content marketing see:
http://omhub.wordpress.com/2014/10/13/announcing-the-2014-local-search-ranking-factors-results/

The 2014 Local Search Ranking Factors page was posted “By Mike Armstrong”

SEO News from recent Online Marketing Hub post!

New post on Online Marketing Hub

SEO Teaching: Should SEO Be Taught at Universities?
by christopherjanb
Posted by Carla_Dawson

This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of Moz, Inc.

SEO is a concept that has been around for years and some universities have incorporated it into the curricula. A while back, I posted this question on Moz and noticed some very strong opinions on the idea that SEO should be part of formal education. Search Engine Journal also posted an article on the idea that SEO should not be taught in universities. We (I co-wrote this post with Aleksej Heinze, who also currently teaches SEO) obviously believe SEO should be taught in higher education and got together to discuss how it benefits the SEO industry and how SEO can be incorporated in higher education. Aleksej teaches SEO in the U.K.; I teach SEO in Argentina.

Before I get started with the pros and cons, I want to share with you some opinions from people in industry on the topic of SEO in universities.

Wil Reynolds (Founder – Seer Interactive)
1. Do you believe universities or higher education institutions should equip students with the skills to meet industry needs?

Yes, people take BIG loans to go to the university in the U.S.; we should at least make sure when they graduate they have the skills that are in…demand in the workplace.

2. Are SEO skills something you believe are lacking in industry?

Not sure. “SEO skills” is a broad phrase.

3. Do you think teaching SEO in universities gives credibility to the profession?

Not really, I think the profession has credibility. Teaching SEO in universities gives a student a great platform to learn and to be prepared for one of the industries that is in desperate need of talent.

4. Do you think teaching SEO in universities benefits the industry?

Yes, but I think SEO is too narrow, according to many definitions. If you think about it, SEO is as much about technical as it is about link building [or] keyword research. To teach the broad definition of SEO you’d need a pretty multi-disciplinary group to teach it. Maybe we’d just teach it as part of a digital marketing rotation.

Stephen Lock (Head of Content & Inbound Marketing, Linkdex.com)
1. Do you believe universities or higher education institutions should equip students with the skills to meet industry needs?

Yes, it makes sense that universities, where appropriate, offer courses that are based heavily on industry demands, especially if the course/institution has been marketed as…tailored for employers.

2. Are SEO skills something you believe are lacking in industry?

They definitely are. There is a real shortage, and due to the fast-moving nature of the field, knowledge is quickly outdated, meaning even experienced practitioners aren’t always great candidates.

3. Do you think teaching SEO in universities gives credibility to the profession?

I believe it does, although it is one of those fields where it’s common for people to…come from a broad range of backgrounds. The skills required are so diverse that it’s also understandable that people who have studied one field can adapt. From experience, employers are more interested in the person, their attitude and capacity to learn. However, SEO in universities can only be a good thing for the industry.

4. Do you think teaching SEO in universities benefits the industry?

Teaching SEO, I believe, would benefit the industry, as the skills shortage is so acute and it is so common for entry-level candidates to come from many different backgrounds. My final thoughts are that SEO is so broad as a discipline that calling it just SEO may not do it justice.

What we can see from these and other opinions we received for this article is that views are still mixed, since SEO education is not clearly defined. Where do you start with a subject area that touches such a broad range of disciplines, including technical, content and engagement? Even so, the vast majority of our respondents were positive about the need to integrate SEO into higher education!

Pros to teaching SEO in universities
Eli Overbey wrote a great article on this topic here, but Aleksej and I took some of the ideas one step further. Basically, we identified problems in the industry and considered how teaching SEO in universities might help solve them.

How teaching SEO in universities may benefit the industry

Industry problem: Long sales cycles. Selling SEO is largely about educating your potential client.
How higher education might help: Today's student is tomorrow's potential client. Students who learn SEO formally (and not just on the job) are likely to have a broader understanding of its benefits and, therefore, be able to "sell" it more effectively to clients.

Industry problem: Lack of credibility. Most SEOs learned SEO on the job, or through reading great books like "The Art of SEO" and great articles on the internet; few formal institutions recognize it as a valid marketing technique, and SEO is not taught in many marketing-related programs.
How higher education might help: Creating an educational standard for SEO increases the credibility of the field. Treating the discipline as if it were law, engineering, etc. would elevate SEO to a field seen as requiring a significant period of study before it can be practiced.

Industry problem: Everyone says they know SEO. Without a recognized standard for the field, anyone and everyone can claim to know SEO.
How higher education might help: A recognized educational standard makes such claims easier to verify.

Industry problem: Clients with bad experiences don't trust SEO companies.
How higher education might help: Showing clients you have a certified person on your team may alleviate this situation.

Industry problem: Long recruiting cycles. Recruiters currently have to give SEO tests to verify that the job candidate in front of them really knows SEO.
How higher education might help: A certification or a degree does not guarantee you know the subject (this is true for lots of fields), but it is an excellent filter and a great starting point.

Industry problem: SEO is constantly changing, making it hard to keep up.
How higher education might help: Law, medicine and most other subject areas are also constantly changing, and content and concepts are updated accordingly. The same can be true for SEO in universities.

Industry problem: Clients challenge your techniques (e.g., "Why don't you use the keyword meta tag?" or "Why are you using parallax scrolling when it is not SEO-friendly?").
How higher education might help: This happens in all industries; being able to reference an independent institution and a high-quality article will probably reduce discussion time.

Industry problem: There is high demand for SEO skills (see the articles listed later in this post).
How higher education might help: Universities are in the business of creating professionals and satisfying workforce demands, yet higher education institutions are often criticized for lacking courses that equip students with the skills to meet specific industry needs. SEO is relevant today and will be well into the foreseeable future.
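The "keyword meta tag" objection above is a good example of something formal teaching settles quickly: Google stated publicly in 2009 that it does not use the keywords meta tag in web ranking. As an illustration only (the sample HTML is invented, not from any real site), a minimal Python sketch using just the standard library can flag pages that still rely on the legacy tag:

```python
from html.parser import HTMLParser

class MetaKeywordsCheck(HTMLParser):
    """Flags the legacy <meta name="keywords"> tag, which Google ignores for ranking."""
    def __init__(self):
        super().__init__()
        self.has_keywords_tag = False

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag being opened
        if tag == "meta" and dict(attrs).get("name", "").lower() == "keywords":
            self.has_keywords_tag = True

# Hypothetical page snippet, purely for illustration:
page = '<html><head><meta name="keywords" content="seo, courses"></head></html>'
checker = MetaKeywordsCheck()
checker.feed(page)
print(checker.has_keywords_tag)  # True: the page still uses a tag search engines ignore
```

A check like this gives a student (or a consultant answering a skeptical client) a concrete, verifiable talking point rather than an appeal to authority.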

Cons to teaching SEO in universities
We do see some negatives to teaching SEO in universities, but we see them more as issues to be mitigated.
John Weber did a great job identifying the difficulties of teaching SEO in his article on searchenginejournal.com. We agree with several of his points, but believe they can be alleviated through great program development.

Obstacle: Google makes changes to its algorithm constantly.
Potential solution: Bring this exact topic up in the classroom. Students understand that what they learn in school is somewhat "academic" and may be slightly out of date, but it is still useful. (On a side note, laws change all the time, yet law is taught in school.)

Obstacle: SEO is complex; it requires analytical and creative skills.
Potential solution: Case studies are a great way to teach complex concepts and creativity. Law is perhaps similar to SEO in that it requires analytical and creative skills to be successful, and it is taught in universities.

Obstacle: No one absolutely knows "the magic formula."
Potential solution: Bring this exact topic up in the classroom, too. The same is true of many professions: medicine is not an exact science and continuously evolves, and physicians often prescribe differing treatments for the same diagnosis.
Current flaws in academia
We also see lots of flaws within the academic world regarding SEO, specifically the fact that where the subject is taught at all, it is mostly offered as an extension (vocational) course or as an optional part of an MBA program.

Here are some universities that offer SEO:

University of California San Diego, U.S. – taught as an extension course
City University London, U.K. – taught within a digital marketing program
Georgetown University, U.S. – taught as a course within a public relations and communications program
University of Salford, U.K. – taught as an extension course as part of BSc Business Information Technology, MSc Marketing and the Salford MBA course
Universidad Católica de Córdoba, Argentina – taught as a course within a digital marketing program
Universidad Blas Pascal, Argentina – taught as a course within a digital marketing program
Universidad Siglo 21, Argentina – taught as a course within a digital communications and social media program and as an online course
University of Sydney, Australia – taught as a course within a Joomla Framework
We feel SEO should be included as part of many other degree programs.

Please note that mentioning the concept and explaining it is not the same as teaching how to do SEO. In some cases, the concept should be mentioned and included; in other cases, SEO should be fully taught. For example, at Salford Business School, students are expected to plan, execute and evaluate live SEO campaigns and report on their results. This kind of SEO learning helps in job interviews, where students can show their own artefacts and discuss what they have done and learned from their practical SEO experience. The academic world has not incorporated the subject in a holistic manner.

How could SEO be incorporated into higher education?
Degree focus: Master of Business Administration (MBA)
SEO concept to be incorporated (not to be confused with a course): How to use SEO as a business strategy for the long-term sustainability of a business.
Comments: Few MBA courses recognize SEO as a strategic tool for developing business value; hence, a number of businesses are missing growth opportunities in the online world.

Degree focus: Advertising
SEO concept: How to use SEO with viral marketing and word of mouth as an advertising technique. Is inbound marketing an advertising technique?
Comments: Television ads are no longer as effective as those created for YouTube with viral sharing in mind.

Degree focus: Web design / computer science
SEO concept: Designing for search engines. Is SEO part of web design?
Comments: SEO is not taught in many web design or computer science schools. This has major implications for agencies that must turn a non-SEO-friendly website into one that search engines can crawl.

Degree focus: Marketing
SEO concept: Organic search results are an important marketing channel, yet the concept has little visibility in the educational system.
Comments: Many marketing programs talk about SEO as if it were useful only to someone else. We are all individual brands who can learn and use SEO (e.g., integrating keyword research allows for better digital consumer profiling and for learning about the digital personas to engage with in the marketing mix).

Degree focus: Public relations (PR)
SEO concept: Synergies of online PR with content development strategies and long-term link building.
Comments: Many PR professionals ignore the benefits of SEO and miss out on the mutual benefits that integrating SEO and online PR could provide.

Degree focus: Journalism
SEO concept: Writing text for online readability and scannability (e.g., using headings, bullet points, etc.).
Comments: Many journalism courses are still built around great headlines and a catchy first paragraph; these are great techniques when combined with SEO, too. Ignoring the online audience means losing a lot of reach, with articles "thrown" onto the web without much consideration.
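The journalism point about headings and scannability lends itself to a concrete exercise. As a sketch, assuming nothing beyond the Python standard library and an invented article snippet, a small parser can pull out the heading outline that a reader scanning the page, or a search engine, would see:

```python
from html.parser import HTMLParser

class HeadingOutline(HTMLParser):
    """Collects (level, text) pairs for h1-h6 tags to audit an article's scannability."""
    def __init__(self):
        super().__init__()
        self.in_heading = None  # heading level currently open, or None
        self.outline = []

    def handle_starttag(self, tag, attrs):
        if tag in {"h1", "h2", "h3", "h4", "h5", "h6"}:
            self.in_heading = int(tag[1])

    def handle_data(self, data):
        if self.in_heading is not None and data.strip():
            self.outline.append((self.in_heading, data.strip()))

    def handle_endtag(self, tag):
        if self.in_heading is not None and tag == f"h{self.in_heading}":
            self.in_heading = None

# Invented article markup, for illustration only:
article = "<h1>Should SEO Be Taught?</h1><p>...</p><h2>Pros</h2><h2>Cons</h2>"
parser = HeadingOutline()
parser.feed(article)
print(parser.outline)  # [(1, 'Should SEO Be Taught?'), (2, 'Pros'), (2, 'Cons')]
```

An article with no outline at all (a single wall of text) fails exactly the scannability test the journalism row describes.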
We argue for the wider adoption of SEO in university teaching for three reasons:

Shaping the SEO industry
By teaching an understanding of SEO principles at the university level, we are shaping the digital marketing professionals of the future. Recognizing the growing range of opportunities that digital marketing creates as a consequence of good SEO practices is an invitation for new talent to join the industry. Offering SEO at universities will not stop cowboy SEO practices, but it will at least reduce the use of such practices out of incompetence.

SEO is no longer a “dark art”
By demystifying the process of SEO, companies will be more likely to employ SEO professionals, recognizing and better appreciating the value they create. Once SEO is no longer perceived as a "black box" or "dark art," individuals who supervise others will be better able to expect higher standards and to discern whether someone is using unwelcome practices.

Good SEO practices will make our industry sustainable
By integrating SEO into wider advertising, digital marketing, journalism, web design, PR and MBA courses, we are able to create a better long-term future for SEO as a profession. SEO skills apply to many disciplines, and businesses would be prepared to pay for these skills as soon as they recognise the return on investment that good SEO can create. By teaching SEO in higher education, the field will appear more professional, which will lead to long-term sustainability.

Is there demand in the industry for SEO skills?
Universities have often been criticized for offering courses not relevant to industry needs. Students invest in higher education to broaden their horizons, but also to obtain skills that equip them better for their chosen profession. The underlying principle is that universities have to offer “universal knowledge and skills” to improve innovation and skills of the world we live in. So if an industry demands SEO skills, then perhaps it is time for higher education to respond? Here are some articles that show workforce demand related to SEO.

2012 – Conductor –
Demand for SEO Professionals Has Never Been Greater [Study]

2013 – Bruce Clay –
Studies Reveal SEO Analysts are in High Demand

2013 – Search Engine Land –
SEO Talent In High Demand — How To Hire An SEO

Here are some great stats from the articles above.

Studies show a 112 percent year-over-year increase in demand for SEO professionals, with salaries as high as $94,000, as reported by Conductor, an SEO technology company based in New York.
Search Engine Land surveyed the SEO industry and found that 93 percent of respondents expected their SEO business to grow by the end of 2013. It makes sense, then, that 82 percent of respondents also reported plans to hire additional SEO staff this year.
Digital Journal proclaimed “there is no doubt that a career in an SEO agency as an SEO professional can be an exciting and rewarding one. Stress levels would match the lows found in other online positions, while the employment opportunities in such a fast growing business are obvious … Mid-level strategist and management roles can earn from $60,000, while senior marketing directors can expect to approach six-figure sums.”
First-hand experience – Aleksej Heinze
Salford Business School is currently leading a European project, the Joint European Masters in Digital and Social Media Marketing (JEMSS). This project aims to develop the digital marketers of the future. JEMSS is a partnership between five European universities and two commercial organizations, one of which is a digital marketing recruitment agency based in Manchester, UK.

As part of this project, an extensive consultation with digital agencies and in-house teams has been conducted across five European countries. This multi-stage research project started with a brainstorming session that included ten UK-based agencies in December 2013, which looked at the top 10 digital marketing skills for international business. The key skill identified by this focus group was search engine optimization.

The views from the UK-based agencies were also in line with the results of an online survey of students and potential students regarding digital marketing courses. The list of 25 skills was developed through the initial focus group with industry practitioners, and SEO clearly tops the table of skills needed when developing knowledge and skills in digital marketing. The online survey was completed by 712 respondents across several countries; we focused on the five countries taking part in the JEMSS project: Bulgaria, Greece, Lithuania, Poland and the UK. At least 50 respondents were collected for each of these countries to ensure a representative sample.

Do people want to learn SEO?
Looking at generic searches related to learning SEO and SEO courses in various parts of the world, we see some interesting trends:

This Google Trends screenshot (see link below) shows some of the main terms related to the popularity of SEO courses. We can see a major difference between "SEO training" and "SEO courses," which may mean most people see SEO as a vocational skill rather than an academic subject. It is also interesting to note that interest in "SEO courses" tends to be concentrated in India, the U.K. and the U.S. More research should be done to identify additional hot spots throughout the world.

First-hand experience – Carla Dawson

My students are eager to learn about SEO. Many of them make comments like "Carla, we have been waiting for this class" or "This is the best class [in the] program." In the SEO class, I notice that students pay closer attention than they do in other classes. My students have made multiple requests to "offer a second course or a seminar" so they can learn more about SEO. It almost seems as if the SEO course has more value than some of the others. In class, I get questions like "Where can we learn more about SEO?" and "What sources are reliable?"

Conclusion
Long gone are the days when universities were run by nuns and monks and the main courses included Latin, metaphysics and theology. Most universities are becoming businesses that develop, research and sell educational products.

If you believe that universities and higher education institutions should equip students with the skills to meet specific industry needs, then perhaps SEO, or better yet "search marketing," is an ideal subject for universities?

SEO touches so many fields that, in our opinion, it should be incorporated into various degree programs, not just offered as an extension course. We would love to hear the community's opinion on this topic, so please comment below!

This article was co-authored with Aleksej Heinze from the University of Salford Manchester.

For more on this SEO article and content marketing, see:
http://omhub.wordpress.com/2014/10/09/seo-teaching-should-seo-be-taught-at-universities/

SEO News from a recent Online Marketing Hub post, posted by Mike Armstrong.