
SEO News & Advice

New post on Online Marketing Hub

The Danger of Crossing Algorithms: Uncovering The Cloaked Panda Update During Penguin 3.0
by christopherjanb
Posted by glenngabe

Penguin 3.0 was one of the most anticipated algorithm updates in recent years when it rolled out on October 17, 2014. Penguin hadn’t run for over a year at that point,
and there were many webmasters sitting in Penguin limbo waiting for recovery. They had cleaned up their link profiles, disavowed what they could, and were
simply waiting for the next update or refresh. Unfortunately, Google was wrestling with the algo internally and over twelve months passed without an
update.

So when Pierre Far finally announced Penguin 3.0 a few days later on October 21, a few things stood out. First, this was not a new algorithm, as Gary Illyes had explained it would be at SMX East. It was a refresh, and that underscored the potential problems Google was battling with Penguin (cough, negative SEO).

Second, we were not seeing the impact that we expected. The rollout seemed to begin with a heavier international focus, and the overall U.S. impact has been underwhelming, to say the least. There were definitely many fresh hits globally, but there were a number of websites that should have recovered but didn't
for some reason. And many are still waiting for recovery today.

Third, the rollout would be slow and steady and could take weeks to fully complete. That’s unusual, but makes sense given the microscope Penguin 3.0 was under. And this third point (the extended rollout) is even more important than most people think. Many webmasters are already confused when they get hit
during an acute algorithm update (for example, when an algo update rolls out on one day). But the confusion gets exponentially worse when there is an extended rollout.

The more time that goes by between the initial launch and the impact a website experiences, the more questions pop up. Was it Penguin 3.0 or was it something else? Since I work heavily with algorithm updates, I’ve heard similar questions many times over the past several years. And the extended Penguin 3.0 rollout is a great example of why confusion can set in. That’s my focus today.

Penguin, Pirate, and the anomaly on October 24

With the Penguin 3.0 rollout, we also had Pirate 2 rolling out. And yes, there are some websites that could be impacted by both. That added a layer of complexity to the situation, but nothing like what was about to hit. You see, I picked up a very strange anomaly on October 24. And I clearly saw serious movement on that day (starting late in the day ET).

So, if there was a third algorithm update, then that’s three potential algo updates rolling out at the same time. More about this soon, but it underscores the confusion that can set in when we see extended rollouts, with a mix of confirmed and unconfirmed updates.

Penguin 3.0 tremors and analysis
Since I do a lot of Penguin work, and have researched many domains impacted by Penguin in the past, I heavily studied the Penguin 3.0 rollout and published a blog post analyzing the first ten days of the rollout, which included some interesting findings.

And based on the extended rollout, I definitely saw Penguin tremors beyond the initial October 17 launch. For example, check out the screenshot below of a website seeing Penguin impact on October 17, 22, and 25.

But as mentioned earlier, something else happened on October 24 that set off sirens in my office. I started to see serious movement on sites impacted by Panda, and not Penguin. And when I say serious movement, I’m referring to major traffic gains or losses all starting on October 24. Again, these were sites heavily dealing with Panda and had clean link profiles. Check out the trending below from October 24 for several sites that saw impact.

A good day for a Panda victim:
A bad day for a Panda victim:
And an incredibly frustrating day for a 9/5 recovery that went south on 10/24:

(All on the link below)!

I saw this enough that I tweeted heavily about it and
included a section about Panda in my Penguin 3.0 blog post. And that’s when something wonderful happened, and it highlights the true beauty and power of the internet.

As more people saw my tweets and read my post, I started receiving messages from other webmasters explaining that they saw the same exact thing on their websites dealing with Panda, and not Penguin. And not only did they tell me about it, they showed me the impact.

I received emails containing screenshots and tweets with photos from Google Analytics and Google Webmaster Tools. It was amazing to see, and it confirmed
that we had just experienced a Panda update in the middle of a multi-week Penguin rollout. Yes, read that line again. Panda during Penguin, right when the internet world was clearly focused on Penguin 3.0.

That was a sneaky move, Google… very sneaky. 🙂

So, based on what I explained earlier about webmaster confusion and algorithms, can you tell what happened next? Yes, massive confusion ensued. We had the
trifecta of algorithm updates with Penguin, Pirate, and now Panda.

Webmaster confusion and a reminder of the algo sandwich from 2012
So, we had a major algorithm update during two other major algorithm updates (Penguin and Pirate) and webmaster confusion was hitting extremely high levels. And I don’t blame anyone for being confused. I’m neck deep in this stuff and it confused me at first.

Was the October 24 update a Penguin tremor or was this something else? Could it be Pirate? And if it was indeed Panda, it would have been great if Google told us it was Panda! Or did they want to throw off SEOs analyzing Penguin and Pirate? Does anyone have a padded room I can crawl into?

Once I realized this was Panda, and started to communicate the update via Twitter and my blog, I had a number of people ask me a very important question:

“Glenn, would Google really roll out two or three algorithm updates so close together, or at the same time?”

Why yes, they would. Anyone remember the algorithm sandwich from April of 2012? That’s when Google rolled out Panda on April 19, then Penguin 1.0 on April 24,
followed by Panda on April 27. Yes, we had three algorithm updates all within ten days. And let’s not forget that the Penguin update on April 24, 2012 was the first of its kind! So yes, Google can, and will, roll out multiple major algos around the same time.

Where are we headed? It’s fascinating, but not pretty

Panda is near real-time now
When Panda 4.1 rolled out on September 23, 2014, I immediately disliked the title and version number of the update. Danny Sullivan named it 4.1, so it stuck. But for
me, that was not 4.1… not even close. It was more like 4.75. You see, there have been a number of Panda tremors and updates since P4.0 on May 20,
2014.

I saw what I was calling “tremors”
nearly weekly based on having access to a large amount of Panda data (across sites, categories, and countries).
And based on what I was seeing, I reached out to John Mueller at Google to clarify the tremors. John’s response was great and confirmed what I was seeing.
He explained that there was not a set frequency for algorithms like Panda. Google can roll out an algorithm, analyze the SERPs, refine the algo to get the desired results, and keep pushing it out. And that’s exactly what I was seeing (again, almost weekly since Panda 4.0).

When Panda and Penguin meet in real time…
…they will have a cup of coffee and laugh at us. 🙂 So, since Panda is near-real time, the crossing of major algorithm updates is going to happen.
And we just experienced an important one on October 24 with Penguin, Pirate, and Panda. But it could (and probably will) get more chaotic than what we have now.
We are quickly approaching a time where major algorithm updates crafted in a lab will be unleashed on the web in near-real time or in actual real time.

And if organic search traffic from Google is important to you, then pay attention. We’re about to take a quick trip into the future of Google and SEO. And after hearing what I have to say, you might just want the past back…

Google’s brilliant object-oriented approach to fighting webspam
I have presented at the past two SES conferences about Panda, Penguin, and other miscellaneous disturbances in the force. More about those “other
disturbances” soon. In my presentation, one of my slides looks like this:

(See link below)!

Over the past several years, Google has been using a brilliant, object-oriented approach to fighting webspam and low quality content. Webspam engineers can craft external algorithms in a lab and then inject them into the real-time algorithm whenever they want. It’s brilliant because it isolates specific problems, while also being extremely scalable. And by the way, it should scare the heck out of anyone breaking the rules.

For example, we have Panda, Penguin, Pirate, and Above the Fold. Each was crafted to target a specific problem and can be unleashed on the web whenever Google wants. Sure, there are undoubtedly connections between them (either directly or indirectly), but each specific algo is its own black box. Again, it’s object-oriented.

Now, Panda is a great example of an algorithm that has matured to where Google highly trusts it. That’s why Google announced in June of 2013 that Panda would roll out monthly, over ten days. And that’s also why it matured even more with Panda 4.0 (and why I’ve seen tremors almost weekly.)

And then we had Gary Illyes explain that Penguin was moving along the same path. At SMX East, Gary explained that the new Penguin algorithm (which clearly didn’t roll out on October 17) would be structured in a way where subsequent updates could be rolled out more easily.
You know, like Panda.

And by the way, what if this happens to Pirate, Above the Fold, and other algorithms that Google is crafting in its Frankenstein lab? Well my friends, then we’ll have absolute chaos and society as we know it will crumble. OK, that’s a bit dramatic, but you get my point.

We already have massive confusion now… and a glimpse into the future reveals a continual flow of major algorithms running in real time, each of which could pummel a site to the ground. And of course, with little or no sign of which algo actually caused the destruction. I don't know about you, but I just broke out in hives. 🙂

Actual example of what (near) real-time updates can do
After Panda 4.0, I saw some very strange Panda movement for sites impacted by recent updates. And it underscores the power of near real-time algo updates.
As a quick example, temporary Panda recoveries can happen if you don’t get out of the gray area enough. And now that we are seeing Panda tremors almost weekly, you can experience potential turbulence several times per
month.

Here is a screenshot from a site that recovered from Panda, didn’t get out of the gray area and reentered the strike zone, just five days later.

(See link to article below)!

Holy cow, that was fast. I hope they didn’t plan any expensive trips in the near future. This is exactly what can happen when major algorithms roam the web in real time. One week you’re looking good and the next week you’re in the dumps. Now, at least I knew this was Panda. The webmaster could tackle more content problems and get out of the gray area… But the ups and downs of a Panda roller coaster ride can drive a webmaster insane. It’s one of the reasons I recommend making
significant changes when you’ve been hit by Panda. Get as far out of the gray area as possible.
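Turbulence like this is easier to untangle if you log organic traffic week by week and flag swings that line up with known update dates. Here is a minimal sketch of that idea; the traffic numbers and the 20% threshold are invented for illustration:

```python
from datetime import date

# Hypothetical weekly organic sessions, keyed by week start date.
weekly_sessions = {
    date(2014, 9, 29): 12400,
    date(2014, 10, 6): 12100,
    date(2014, 10, 13): 7900,   # sharp drop
    date(2014, 10, 20): 8100,
    date(2014, 10, 27): 11800,  # partial recovery
}

def flag_swings(series, threshold=0.20):
    """Return (week, pct_change) pairs where traffic moved more than
    `threshold` versus the prior week. These are the candidates to
    match against known update dates (e.g. Penguin 3.0 on 10/17,
    the Panda anomaly on 10/24)."""
    weeks = sorted(series)
    flags = []
    for prev, cur in zip(weeks, weeks[1:]):
        change = (series[cur] - series[prev]) / series[prev]
        if abs(change) >= threshold:
            flags.append((cur, round(change, 3)))
    return flags

print(flag_swings(weekly_sessions))
```

Paired with annotations in Google Analytics, a simple log like this makes it much easier to argue "this drop started the week of the update" instead of guessing months later.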

An “automatic action viewer” in Google Webmaster Tools could help (and it’s actually being discussed internally by Google)
Based on webmaster confusion, many have asked Google to create an “automatic action viewer” in Google Webmaster Tools. It would be similar to the “manual
actions viewer,” but focused on algorithms that are demoting websites in the search results (versus penalties). Yes, there is a difference by the way.

The new viewer would help webmasters better understand the types of problems being targeted by algorithms like Panda, Penguin, Pirate, Above the Fold, and others. Needless to say, this would be incredibly helpful to webmasters, business owners, and SEOs.

So, will we see that viewer any time soon? Google’s John Mueller
addressed this question during the November 3 webmaster hangout (at 34:54).

http://ift.tt/1zj066n

John explained they are trying to figure something out, but it’s not easy. There are so many algorithms running that they don’t want to provide feedback
that is vague or misleading. But, John did say they are discussing the automatic action viewer internally. So you never know…

A quick note about Matt Cutts
As many of you know, Matt Cutts took an extended leave this past summer (through the end of October). Well, he announced on Halloween that he is extending his leave into 2015. I won’t go crazy here talking about his decision overall, but I will
focus on how this impacts webmasters as it relates to algorithm updates and webspam.

Matt does a lot more than just announce major algo updates… He actually gets involved when collateral damage rears its ugly head. And there’s not a
faster way to rectify a flawed algo update than to have Mr. Cutts involved. So before you dismiss Matt’s extended leave as uneventful, take a look at the
trending below:

Notice the temporary drop off a cliff, then 14 days of hell, only to see that traffic return? That’s because Matt got involved. That’s the
movie blog fiasco from early 2014 that I heavily analyzed. If
Matt was not notified of the drop via Twitter, and didn’t take action, I’m not sure the movie blogs that got hit would be around today. I told Peter from
SlashFilm that his fellow movie blog owners should all pay him a bonus this year. He’s the one that pinged Matt via Twitter and got the ball rolling.

It’s just one example of how having someone with power out front can nip potential problems in the bud. Sure, the sites experienced two weeks of utter
horror, but traffic returned once Google rectified the problem. Now that Matt isn’t actively helping or engaged, who will step up and be that guy? Will it
be John Mueller, Pierre Far, or someone else? John and Pierre are greatly helpful, but will they go to bat for a niche that just got destroyed? Will they
push changes through so sites can turn around? And even at its most basic level, will they even be aware the problem exists?

These are all great questions, and I don't want to bog down this post (it's already incredibly long). But don't laugh off Matt Cutts taking an extended leave. If he's gone for good, you might only realize how important he was to the SEO community after he's gone. And hopefully that realization won't come because your site just tanked as collateral damage during an algorithm update. Matt might be off running a marathon or trying on new Halloween costumes. Then where will you be?

Recommendations moving forward:
So where does this leave us? How can you prepare for the approaching storm of crossing algorithms? Below, I have provided several key bullets that I think every webmaster should consider. I recommend taking a hard look at your site now, before major algos are running in near-real time.

Truly understand the weaknesses with your website. Google will continue crafting external algos that can be injected into the real-time algorithm.
And they will go real-time at some point. Be ready by cleaning up your site now.
Document all changes and fluctuations the best you can. Use annotations in Google Analytics and keep a spreadsheet updated with detailed
information.
Along the same lines, download your Google Webmaster Tools data monthly (at least). Having helped many companies with algorithm hits, I can tell you that information is incredibly valuable and can help lead you down the right recovery path.
Use a mix of audits and focus groups to truly understand the quality of your site. I mentioned in my post about aggressive advertising and Panda that human focus groups are worth their weight in gold (for surfacing Panda-related problems). Most business owners are too close to their own content and websites to accurately measure quality. Bias can be a nasty problem and can quickly lead to bamboo-overflow on a website.
Beyond on-site analysis, make sure you tackle your link profile as well. I recommend heavily analyzing your inbound links and weeding out unnatural links. And use the disavow tool for links you can’t remove. The combination of enhancing the quality of your content, boosting engagement, knocking down usability obstacles, and cleaning up your link profile can help you achieve long-term SEO success. Don’t tackle one quarter of your SEO problems. Address
all of them.
Remove barriers that inhibit change and action. You need to move fast. You need to be decisive. And you need to remove red tape that can bog down
the cycle of getting changes implemented. Don’t water down your efforts because there are too many chefs in the kitchen. Understand the changes that need to be implemented, and take action. That’s how you win SEO-wise.
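The link-cleanup bullet above can be partially scripted. Below is a minimal sketch, assuming you have exported your inbound link URLs (e.g. from a GWT "Links to Your Site" download) and keep your own list of domains you have judged unnatural; the output uses the "domain:" and "#" comment syntax that Google's disavow tool accepts. The domains themselves are invented:

```python
from urllib.parse import urlparse

# Hypothetical exported backlink URLs.
backlinks = [
    "http://spammy-directory.example/page1",
    "http://spammy-directory.example/page2",
    "http://respected-blog.example/post",
    "http://paid-links.example/footer",
]

# Domains you judged unnatural and could not get removed.
unnatural_domains = {"spammy-directory.example", "paid-links.example"}

def build_disavow(links, bad_domains):
    """Build disavow-file text: '#' comment lines plus one
    'domain:' line per unnatural domain found in the link export."""
    found = sorted({urlparse(u).netloc for u in links} & bad_domains)
    lines = ["# Domains disavowed after manual link audit"]
    lines += ["domain:%s" % d for d in found]
    return "\n".join(lines)

print(build_disavow(backlinks, unnatural_domains))
```

The judgment call of which links are unnatural is still manual; the script only keeps the paperwork consistent so the disavow file stays in sync with your audit.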

Summary: Are you ready for the approaching storm?
SEO is continually moving and evolving, and it's important that webmasters adapt quickly. Over the past few years, Google's brilliant object-oriented approach to fighting webspam and low-quality content has yielded algorithms like Panda, Penguin, Pirate, and Above the Fold. And more are on their way. My advice is to get your situation in order now, before crossing algorithms blend a recipe of confusion that makes it exponentially harder to identify, and then fix, the problems riddling your website.

Now excuse me while I try to build a flux capacitor. 🙂

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

For more on this Article including the images and graphs see:
http://omhub.wordpress.com/2014/11/12/the-danger-of-crossing-algorithms-uncovering-the-cloaked-panda-update-during-penguin-3-0/

The SEO News & Advice page was posted “By Mike Armstrong” to the SEO Blog category.

SEO Tips / SEO Advice / Blogging Advice


How to Be the Best Answer with Topic Targeting
by christopherjanb

This weekend I had the good fortune to present at the Minnesota Blogger Conference where nearly 300 local bloggers gathered to learn, get inspired and network.

For my part, I gave a presentation on how blogs are still an incredibly useful tool for marketing. Keeping the reason for blogging top of mind as well as empathy for reader preferences in how they find, consume and act on information are essential if a blog author expects marketing outcomes from their efforts.

When a blog or any content hub can become “the best answer” for the topics that are important for buyers, the return on blogging goes way, way up. One way to execute a content plan to become known as an authority is through topic targeting.

For experienced multi-channel and integrated marketing pros, this kind of approach will be fairly common. But for the vast majority of bloggers, whether corporate or enthusiast, moving from writing for yourself (or your brand) to writing to satisfy specific audience needs is a fundamental shift.

Topic targeting starts by answering a few key questions:

How do you want to be known? How do you want your product or service to be known? What are you, your brand, product, or service the "best answer" for? That singular distinction is essential in order to stand out.
What questions relevant to your area of expertise do buyers have? What information do they need in order to move from curiosity to specific interest to transaction?
As you come to find the sweet spot between how you want to be known and what customers care about, that’s the focus of your topic targeting plan.

Topic targeting is an approach that involves creating resources, experiences and connections that result in an undisputed affinity between a target topic and your brand.
On a large scale for large companies, this is essentially brand marketing. For a small or medium business without massive budgets or resources, these 3 phases below represent a practical approach to becoming the “best answer” wherever customers are looking.

Inspire:
When starting out from a position without prominent authority on your desired topic, one of the most effective ways to close the gap between where you are and where you want your brand to be is to connect with those that already have the authority and community you desire. Recognizing topical influencers in a creative and qualitative way with an emphasis on inspiring readers to think in new ways about the topic is a good start. Co-creating content with topic influencers is also particularly effective. Your target topic will drive which influencers you engage with, the questions and interactions you have, and the titling of the resulting content.

Additional inspire tactics include speaking events that are “on topic” in the conference scope, track and/or title of your presentation. Social engagement promoting target topic content and events should also align. Comments made on industry articles (blogs and online magazines) are also opportunities to create affinity. Blogging about the target topic from different perspectives (what would a buyer need to know from start to finish) is also an effective directed content effort that will contribute to becoming the best answer.

Lastly, a limited number of guest posts on relevant, high-profile blogs, and contributed articles to industry magazines and websites on your target topic, will provide added support for your brand and the target topic.

Desire:
Anticipation is a gateway to topical authority. Continuing to blog on the target topic and growing influencer relationships will lead to even more community engagement opportunities. Consistent creation of useful and entertaining blog content as well as alignment with industry influencers will create a very powerful mental state amongst your blog readers: anticipation. A community that can’t wait to see what you’re going to publish next will be instrumental for amplifying content and stimulating new perspectives on your target topic. That desire leads to advocacy, evangelism and scale for reaching a target audience in a highly credible way.

Acquire:
Demand for information and expertise leads to demand for your solutions. As authority is built on your target topic through the content you create on your own websites, third-party references to your brand as an authority, growth of your community around the topic, and advertising activities, there are several opportunities to show more tangible evidence of expertise. Some examples include:

Case studies
Definitive topic resource/guide
Events – online and off
Industry survey and report (annual)
Lists recognizing experts in the topic (annual or quarterly)
All of these tactics provide opportunities for readers to move from awareness and learning about the topic (with your brand at the forefront) to consideration and action – leads and transactions. Consumers increasingly expect to be able to educate themselves to a purchase decision and making it easy to find, experience and act on your content isn’t just good content marketing, it’s what buyers want.

Specificity is essential with topic targeting as are patience and persistence. This is an earned achievement that also needs to be maintained. But once consensus and momentum are achieved, the ability to attract those actively seeking what you have to offer will expand the value of your content beyond lead generation and sales to other means of monetization – sponsorships, advertising, syndication.

Applying the approach mentioned in this post will require some homework: research in your market or industry to see what kinds of content and messages resonate with the target audience. That's where the audience Discovery, Consumption and Action model for understanding your audience comes into play. It is also a continuous effort that can start simply and scale based on what works and what doesn't.

But the most important thing of all is to start: How do you want to be known? How does that fit with what your customers want to know?

For more on this SEO Tips /SEO Advice / Blogging Advice post or content marketing in general see:
http://omhub.wordpress.com/2014/10/27/how-to-be-the-best-answer-with-topic-targeting/

The SEO Tips / SEO Advice / Blogging Advice page was posted “By Mike Armstrong”


A recent post about how users view and interact with Today’s Google Search Engine Results Page


Eye Tracking in 2014: How Users View and Interact with Today’s Google SERPs
by christopherjanb
Posted by rMaynes1

In September 2014, Mediative released its latest eye-tracking research entitled “The Evolution of Google’s Search Engine Results Pages and Their Effects on User Behaviour”.

This large study had participants conduct various searches using Google on a desktop. For example, participants were asked “Imagine you’re moving from
Toronto to Vancouver. Use Google to find a moving company in Toronto.” Participants were all presented with the same Google SERP, no matter the search
query.

Mediative wanted to know where people look and click on the SERP the most, what role the location of the listing on the SERP plays in winning views and
clicks, and how click activity on listings has changed with the introduction of Google features such as the carousel, the knowledge graph etc.

Mediative discovered that, just as Google’s SERP has evolved over the past decade, so too has the way in which search engine users scan the page before
making a click.

Back in 2005 when
a similar eye-tracking study was conducted for the first time by Mediative (formerly Enquiro), it was
discovered that people searched in a distinctive “triangle” pattern, starting in the top left of the search results page where they expected the first
organic listing to be located, and reading across horizontally before moving their eyes down to the second organic listing, and reading horizontally, but
not quite as far. This area of concentrated gaze activity became known as Google’s “Golden Triangle”. The study concluded that if a business’s listing was
not in the Golden Triangle, its odds of being seen by a searcher were dramatically reduced.

Heat map from 2005 showing the area known as Google’s “Golden Triangle” (see link below).

But now, in 2014, the top organic results are no longer always in the top-left corner where searchers expect them to be, so they scan other areas of the
SERP, trying to seek out the top organic listing, but being distracted by other elements along the way. The #1 organic listing is shifting further down the
page, and while this listing still captures the most click activity (32.8%) regardless of what new elements are presented, the shifting location has opened
up the top of the page with more potential areas for businesses to achieve visibility.

Where scanning was once more horizontal, the adoption of mobile devices over the past 9 years has conditioned searchers to now scan more vertically—they are looking for the fastest path to the desired content, and, compared to 9 years ago, they are viewing more search results listings during a single session and spending less time viewing each one.

Searchers on Google now scan far more vertically than several years ago (see link below)

One of the biggest changes from the SERPs of 9 years ago to today is that Google is now trying to keep people on the results page for as long as it can.

An example is the knowledge graph. In Mediative's study, when searchers were looking for "weather in New Orleans", the results page that was presented to them showed exactly what they needed to know. Participants were asked to click on the result that they felt best met their needs, even if, in reality, they wouldn't have clicked through (in order to end that task). When a knowledge graph result exactly met the intent of the searcher, the study found 80% of people looked at that result, and 44% clicked on it. Google provided searchers with a relevant enough answer to keep them on the SERP. The top organic listing captured 36.5% of page clicks—compared to 82% when the knowledge graph did not provide the searcher with the answer they were looking for.

It’s a similar case with the carousel results; when a searcher clicks on a listing, instead of going through to the listing’s website, another SERP is
presented specifically about the business, as Google tries to increase paid ad impressions/clicks on the Google search results page.

How can businesses stay on top of these changes and ensure they still get listed?
There are four main things to keep in mind:

1.
The basic fundamentals of SEO are as important as ever
Create unique, fresh content that speaks to the needs of your customers, as this will always trump chasing the algorithm. There are also on-page and off-page SEO tactics you can employ to increase your chances of being listed in areas of the SERP other than your website's organic listing, such as front-loading keywords in page titles and meta descriptions, getting listed on directories and ratings and reviews sites, and having social pages. It's important to note that SEO strategy is no longer a one-size-fits-all approach.

2.
Consider using schema mark-up wherever possible
In Mediative’s 2014 Google SERP research, it was discovered that blog posts that had been marked up using schema to show the picture and name of the author
got a significant amount of engagement, even when quite far down the first page—these listings garnered an average of 15.5% of total page clicks.

Note:

As of August 2014, Google removed authorship markup entirely. However, the results are still a good example of how schema mark-up can be used to make your business listing stand out more on the SERP, potentially capturing more views and clicks, and therefore more website traffic.

In the study, participants were asked to “Imagine that you’re starting a business and you need to find a company to host your website. Use Google to find
information about website hosting companies”. The SERP presented is shown below:

Almost 45% of clicks went to 2 blog posts titled “Five Best Web Hosting Companies” and “10 Best Web Hosting Companies”.

In general, the top clicked posts were those that had titles including phrases such as:

“Best…”
“Reviews of…”
“Top 5…”
“How-to…”
According to Google, "On-page markup helps search engines understand the information on webpages and provide richer results… Google doesn't use markup for ranking purposes at this time, but rich snippets can make your web pages appear more prominently in search results, so you may see an increase in traffic."

Schema markup is probably the most under-utilized tool for SEO, presenting a huge opportunity for companies that do utilize the Google approved tool.
Searchmetrics reported that only 0.3% of websites use schema markup, yet over a third of Google's results contain rich snippets (additional text, images, and links below the individual search results). BruceClay.com reports that rich snippets can increase listings' CTRs by 15–50%, and that websites using schema markup tend to rank higher in search results.

Schema mark-up can be used to add star ratings, number of reviews, pricing (all shown in the listing below) and more to a search results page listing.
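As a rough illustration of what such markup looks like, here is a sketch that builds a JSON-LD snippet using the public schema.org Product and AggregateRating vocabulary. All of the values are invented; a page would embed the printed output inside a script tag of type "application/ld+json":

```python
import json

# Hypothetical product listing marked up with schema.org vocabulary.
# This is the kind of structured data that can surface star ratings,
# review counts, and pricing as rich snippets in a result listing.
listing = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Web Hosting Plan",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "213",
    },
    "offers": {
        "@type": "Offer",
        "price": "7.99",
        "priceCurrency": "USD",
    },
}

print(json.dumps(listing, indent=2))
```

The same vocabulary can be expressed as microdata or RDFa attributes directly in the HTML; JSON-LD is simply the most compact form to show here.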

3.
Know the intent of your users
Understanding what searchers are trying to discover when they conduct a search can help determine how much effort you should put into appearing in the number one organic listing, which can be an extremely difficult task without unlimited budget and resources—and, even if you do make it to the number one organic listing, traffic is not guaranteed, as discovered in this research. If you're competing with big-name brands, or ratings and review sites, and THAT is what your customers want, then you are going to struggle to compete.

The importance of your business being the first listing vs. simply being on the first page, therefore, depends heavily on the searcher’s intent, plus the strength of your brand. The key is to always keep user intent top-of-mind, and this can be established by talking to real people, rather than guessing. What are they looking for when they search for your site? Structure your content around what people really want and need, list your site on the directories that people actually visit or reference, create videos (if that’s what your audience wants). Know what your actual customers are looking for, and then provide it.

There are going to be situations when a business can’t get to number one on the organic listings. As previously mentioned, the study shows that this is still the key place to be, and the top organic listing captures more clicks than any other single listing. But if your chances of getting to that number one spot are slim, you need to focus on other areas of the SERP, such as positions 2 to 4, which are easier to rank for. Businesses in these lower positions see more click activity than they did several years ago, making this real estate much more valuable. As Gord Hotchkiss writes, searchers tend to “chunk” information on the SERP and scan each chunk in the same way they used to scan the entire SERP: in a triangle pattern. Getting listed at the top of a “chunk” can therefore be effective for many businesses. This idea of “chunking” and scanning can be seen in the heat map below.

To add to that, Mediative’s research showed that everything located above the top four organic listings (carousel results, knowledge graph, paid listings, local listings, etc.) combined captured 84% of clicks. If you can’t get your business listing to #1, but can get listed above position #4, you have a good chance of being seen, and clicked on, by searchers. Ultimately, people expect Google to continue to do its job and respond to search queries with the most relevant results at the top. The study points out that only 1% of participants were willing to click through to page 2 to see more results. If you’re not listed on page 1 of Google for relevant searches, you may as well not exist online.

4. A combination of SEO and paid search can maximize your visibility in SERP areas that have the biggest impact on both branding and traffic
Even though organic listings are where many businesses are striving to be listed (and where the majority of clicks take place), it’s important not to forget about paid listings as a component of your digital strategy. Click-through rates for top sponsored listings (positions 1 and 2) have changed very little in the past decade. Where the huge change has taken place is in the ability of sponsored ads on the right rail to attract attention and clicks.

Activity on this section of the page is almost non-existent. This can be put down to a couple of factors: searchers’ conditioned behaviour, as mentioned before, to scan more vertically thanks to our increased mobile usage, and the fact that over the years we have learned those results are typically less relevant, and less useful, than the organic results, so we tend not to even take the time to view them.

Mediative’s research also found that there are branding effects of paid search, even if not directly driving traffic. We asked participants to “Imagine you are traveling to New Orleans and are looking for somewhere to meet a friend for dinner in the French Quarter area. Use Google to find a restaurant.”
Participants were presented with a SERP showing two paid ads: the first was for opentable.com, and the second for the restaurant Remoulade, remoulade.com.

The top sponsored listing, opentable.com, was viewed by 84% of participants and captured 26% of clicks. The second listing, remoulade.com, captured only 2% of clicks but was looked at by 73% of participants. By being seen by almost three-quarters of participants, the paid listing can increase brand affinity, and therefore purchase (or choice) consideration, in other areas. For example, if the searcher comes back and searches again another time, or clicks through to opentable.com and then sees Remoulade listed there, Remoulade may benefit from the higher brand affinity built by having already been seen in the paid listings. Mediative conducted a brand lift study featuring Honda that found that the more real estate a brand owns on the SERP, the higher the CTR, and the higher the brand affinity, brand recognition, purchase consideration, etc. Using paid search as more of a branding play is essentially free brand advertising: while you should of course be prepared to get the clicks and pay for them, it is likely that your business listing will be seen by a large number of people without capturing the same number of clicks. Impression data can also be easily tracked with Google paid ads, so you know exactly how many times your ad was shown, and can therefore estimate how many people actually looked at it from a branding point of view.

Rebecca Maynes is a Marketing Communications Strategist with Mediative, and was a major contributor on this study. The full study, including click-through rates for all areas of the SERP, can be downloaded at

http://ift.tt/1vuhDXI.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

For more on this article or for pictures see: http://omhub.wordpress.com/2014/10/22/eye-tracking-in-2014-how-users-view-and-interact-with-todays-google-serps/

A recent post about how users view and interact with today’s Google Search Engine Results Page was posted “By Mike Armstrong”

SEO & Content Marketing Advice

New post on Online Marketing Hub

How To Be Successful With Content Marketing Through Your Search Engine Strategy
by christopherjanb

Find a publicly advertised brand you identify with, from a healthy perspective, and follow their brand online.

What do you see?

Consistent values? Positive, helpful and resourceful communication? Ways to get in touch with them?

If you answer “yes” to these questions, then that brand shows positive signs of effective content marketing through search engine strategy.

Content marketing and search engine strategy go hand-in-hand now more than ever.

Taking all the above into consideration, I did a brief case study using Kaiser Permanente to demonstrate how search strategy works. You can see for yourself how Kaiser has tapped into the power of content marketing through their search engine strategy.

In this article, I’ll walk you through the basics of optimizing Google’s search capabilities for your own brand’s content marketing success. You’ll learn about the following:

How IP addresses and location-based services play into localized search engine strategy.
The importance of branding… and why paying for Google ads may be worth your while.
Why and how Google rewards brands that create quality content.
First, let’s talk about search engine strategy…
Enter ‘Kaiser’ in the Google search bar and you’ll find this:

See link below:

You’ll notice the following in the screenshot:

1. The result numbers from your search term.

The search term ‘Kaiser’ yields about 47,100,000 results from across the entire Web. What does this mean for you? Consider the strength you – i.e., your website – must have to be #1 on a first page that shows only about ten of those results.

2. A map showing you the results of the matching Kaiser locations in your area.

Your computer has what is known as an “IP address,” which is short for Internet Protocol address. Google, and other sites, use your IP address to identify computers on the Internet.

Your IP address is typically based on your real-world location, which helps Google provide you with relevant, local results. You can learn more about what Google has to say about IP addresses, including how to find yours here.
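As a rough illustration of the mechanics: when your browser requests a page, the server sees the request’s originating IP address, and when the request passes through a proxy or load balancer, the original client IP is conventionally carried in the X-Forwarded-For header. The helper below is a hypothetical sketch of how a site might recover that IP, not how Google actually geolocates searchers:

```python
def client_ip(headers, remote_addr):
    """Return the likely client IP for an incoming HTTP request.

    Behind a proxy or load balancer, the originating IP is conventionally
    the first entry in the X-Forwarded-For header; otherwise the socket's
    remote address is the client itself. (Illustrative helper only.)
    """
    xff = headers.get("X-Forwarded-For")
    if xff:
        return xff.split(",")[0].strip()
    return remote_addr

# A geolocation database then maps this IP to an approximate location.
print(client_ip({"X-Forwarded-For": "203.0.113.7, 10.0.0.1"}, "10.0.0.1"))
# prints 203.0.113.7
```

A geolocation lookup against an IP like this is what lets Google return map results near you without your ever typing a city name.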

More people currently rely on their mobile devices to search and obtain instant and accurate results from their search efforts than ever before. Location services for mobile devices are essential for viewing maps and other local results, based on your current location. Google Maps can detect your location through the following:

GPS
WiFi
Cell Tower
Apple Location Services
Google provides helpful suggestions for improving your location’s accuracy, GPS signals, and more for your mobile device here.

3. Kaiser’s ad is the first result.

Google likes ads. I’ll talk more about that below.

4. The second result is their name in URL/website domain name format, and their top locations in your area.

Site structure, page names, and organization are important for the user experience and help Google find information quickly. IP addresses and GPS/location settings also make a difference.

5. The third result is a social profile where reviews are encouraged and unconfirmed, such as Yelp.

Google gives preference to sites with high traffic and usage, and to those that procure advertising.

What did this case study teach me about using search engine strategy for my own content marketing?
I learned about the importance of branding and advertising
Your brand name is everything.
Dominating the webiverse with your brand name is a great end goal. Since IP addresses are key for people finding you online, you’ll want to start locally and build up from there. Google rewards brands that register their physical locations by giving them top placement within search engine results. A recent article via Vertical Response lists the top 20 places where you should list your business online.

Social profiles and reviews are important to Google.
As your brand’s name gains exposure, more people start learning about you. As more people use your brand name on these sites, it also helps Google recognize you, helping your brand show up higher in Google rankings and search results.

Google likes being paid.
Paying for advertising, especially search advertising, increases the likelihood that your brand name shows up in Google search results. Using Google AdWords makes it easy to create and run ad campaigns. Depending on how much you pay and the types of campaigns you set up, you can control different features of your ad, such as where it shows up in a Google search.

I also learned more about how to create the content Google wants…and how that can work for my clients and my business…
Our website pages and content are only as valuable as the words we use.
How often do you hear the expression, “content is king”? We hear it often because, when it comes to your website ranking higher in search results, it’s still very much the truth. When it comes to creating content, we need to understand that it’s the quality of our words that matters. Not only are people searching online for content that informs and educates them, but the Google search engine also craves useful and appealing content. Google wants to return results that are relevant to the search, and to share the best, most trustworthy results.

Moz explains the importance of creating “great content” for your site in the following way:

“Every search performed at the engines comes with an intent – to find, learn, solve, buy, fix, treat, or understand. Search engines place web pages in their results in order to satisfy that intent in the best possible way, and crafting the most fulfilling, thorough content that addresses a searcher’s needs provides an excellent chance to earn top rankings.”

Regular content is extremely valuable.
We now know that Google prefers quality content, but Google especially rewards those sites which produce quality content regularly.

Eric Sornoso lists and explains the following three reasons why the Google search engine likes regular posting in his recent article for SEOBlog.com, “Google Rewards Sites That Regularly Post Great Content”:

Google likes fresh, new content.
Google likes a constant flow of content.
Google likes accountability.
Creating quality content on a regular basis certainly helps Google (and potential clients) find you, but you can also take extra steps to improve your results. For example, one of the first (and best) actions you should take, especially when your site is new, is to set up your RSS with Feedburner.

This is Google’s RSS Management tool, and it notifies Google each time your blog is updated with a new post. Even better? As you start producing content which generates higher traffic to your website, Google will continue rewarding you through more traffic referrals. It’s a win-win-win: for you, for Google, and for the searcher.

Over to you
Without a doubt, tapping into the power of Google and forming a strong search engine strategy can take your content marketing – and your business – to entirely new levels. Are you using search engine strategy as a way to bolster your own content marketing efforts?

Author information

Jamie Teasdale
Founder & Lead Strategist at Plan Promote Prosper
Jamie Teasdale is a business growth advocate and strategist who is passionate about supporting small businesses in their quest to effectively communicate with their target market. Focusing on content marketing (inbound and outbound) and brand messaging, Jamie’s company Plan Promote Prosper assists companies who recognize the value of strategic and consistent content marketing through blogging, email marketing and social media engagement, and the impact it makes toward positive SEO.
Planning a content marketing strategy is no small endeavor and should be done each year. Jamie and her team make it easy and affordable. Plan Promote Prosper offers eight white-labeled content marketing products and services that are sought after by leading marketing companies, web firms, copywriters and social managers. Jamie’s office is located in downtown Portland, Oregon. In her spare time, Jamie enjoys painting, traveling and spending time with friends and family.

For this full article including images see:
http://omhub.wordpress.com/2014/10/15/how-to-be-successful-with-content-marketing-through-your-search-engine-strategy/

SEO & Content Marketing Advice page posted “By Mike Armstrong”