Tag: SEO Advice

#MikeArmstrong #KingofMarketing #Podcast – Search Marketing and SEO Advice – #MikeArmstrongPodcasts

Listen to the most recent episode of my podcast: #KingofMarketing #MikeArmstrong talking about Search Marketing and Search Engine Optimisation / SEO😎 https://anchor.fm/mike-armstrong9/episodes/KingofMarketing-MikeArmstrong-talking-about-Search-Marketing-and-Search-Engine-Optimisation–SEO-ef97r3

#KingofMarketing #MikeArmstrong talking about Search Marketing and Search Engine Optimisation / SEO 👑 😎💪🙌🏴󠁧󠁢󠁷󠁬󠁳󠁿🇬🇧🌍 – Another King of Marketing Episode #KingofMarketing – The Importance of getting your search marketing right for those switching to online 🐺🏴󠁧󠁢󠁷󠁬󠁳󠁿 #MikeArmstrong #YouCanDoIt #PositivityPodcast #Motivation #MotivationalPodcast #Entrepreneurship #PersonalDevelopmentPodcast #SalesPodcast #MarketingPodcast 😎 #Sales #Marketing #PersonalDevelopment #WOLFofWALES #WOW #WOWPodcast #SalesTraining #LifeCoaching #BusinessTraining from #MikeArmstrong Teaching people how to achieve their goals and dreams via various motivation, education and personal development teachings! #MikeArmstrongSalesTraining #MikeArmstronarketingTraining #MikeArmstrongBusinessTraining #MikeArmstrongEntrepreneurTraining #MikeArmstrongTraining #MATraining on his #YouCanDoItPodcast – #MikeArmstrongPodcast / #MikeArmstrongPodcasts – Rapid Business Growth, Personal Development and Sales & Marketing Training & Advice From #MikeArmstrong on the #YouCanDoItPodcast #YCDI #YCDIPodcast featuring the #Awesome Mike Armstrong #AwesomeArmstrong – #Motivation #Motivated #Motivational #MotivationalPodcast – #10x #20x #Infinityx  #BusinessGrowth #RapidBusinessGrowth #PersonalGrowth #PersonalDevelopment 🚀😎 – #MikeArmstrongYouCanDoItPodcast – More About Mike;  Mike is “The Awesome Mike Armstrong” – A #Philanthropic #Entrepreneur who loves to help people. He’s also an Author, Speaker, Mentor, Coach, Blogger, Vlogger & Podcaster who lives to help people especially; Struggling Business Owners and Entrepreneurs who need a lift and those suffering with Mental Health issues #MentalHealth #MentalWellbeing #MentalHealthSupport. Mike has spent years cultivating an awesome global network, and is currently building an #AwesomeArmy of similarly minded #Philanthropist #Entrepreneurs and is happy to share the contacts and the love with those who are deserving. If that’s you please get in touch with Mike. Mike Armstrong of Mike Armstrong Ltd | MA Group | MA Consultancy | MA Web | MA Training | Marketing Wales / WelshBiz | Tourism Wales | Things To Do In | MA News | MAN Media | MA Property | Mike Armstrong News & Mike Armstrong’s You Can Do It Podcast.

Mike’s areas of Interest and Expertise include Welsh Business News & Events, UK Business News & Events, Global Business News & Events, Business Advice & Personal Development, Rapid Business Growth, Happiness, Success, Goal Achieving, Knowledge Sharing, Elite Performance, as well as Sales & Marketing Mentoring, Coaching, Training and Services inc. Sales & Marketing Strategy & Services, Social Media Strategy & Services, SEO Strategy & Services, Content Marketing Strategy & Services, Ecommerce Strategy & Services, Business Growth Strategy & Services and Property Maintenance, Property Management and Property Development Joint Ventures (JV’s) – All aimed at Biz Owners, Entrepreneurs, Speakers, Coaches, Startups, Networkers, Global Networks and people in need of help, support, love and a pick-me-up etc.

MA Website – https://mikearmstrong.me #MikeArmstrong

MA News Site – https://MikeArmstrong.me/news/

#MikeArmstrongNews

#MikeArmstrongPoems

#PositiveCoronavirusNews

MA Podcast – https://anchor.fm/mike-armstrong9

#MikeArmstrongPodcasts

#YouCanDoItPodcast #WOLFofWALES #WOWPodcast

Co. Websites – www.maconsultancycardiff.com #MAConsultancy

www.marketing.wales  #WelshBiz

Search & connect with Mike Armstrong on any social media, as well as MA Consultancy & WelshBiz!

Also please join one or all of my Cardiff Businesses, Welsh Businesses, UK Businesses, Global Businesses, Global Networkers, Entrepreneur Zone, Wolf of Wales Fans, Mental Health Support Group, or Mike Armstrong Podcast Fans – Groups on FB 👍😎 or the #AwesomeArmy if you want to get involved and join the team!

Business and Sports News from Mike Armstrong – See http://mikearmstrong.me

11 SEO Hacks Guaranteed to Deliver Impressive Results on Google [Infographic]

Ensure you’re maximizing your search opportunities with these SEO tips.

https://www.socialmediatoday.com/news/11-seo-hacks-guaranteed-to-deliver-impressive-results-on-google-infographi/574621/

Google’s 200+ Ranking Factors: The Complete List as of 2020

Google’s 200+ Ranking Factors: The Complete List as of 2020 🌍💼📰

http://mikearmstrong.me/googles-200-ranking-factors-the-complete-list-as-of-2020-%f0%9f%8c%8d%f0%9f%92%bc%f0%9f%93%b0/

The 10 Most Important SEO Tips You Need to Know

The 10 Most Important SEO Tips You Need to Know 🌍📲📰

http://mikearmstrong.me/the-10-most-important-seo-tips-you-need-to-know/

Google’s 200+ Ranking Factors: The Complete List as of 2020

Interesting article on Google Search Engine Ranking Factors as of 2020. A great read for anyone looking for SEO tips or ways to improve their website’s search rankings:

You might already know that Google uses over 200 ranking factors in their search engine algorithm…

But what are they?

Well, you are in for a treat because I’ve put together a complete list.

Some Factors are proven.

Some Factors are controversial.

Other Factors are SEO speculation.

But they are all here.

And the list was recently refreshed: all 200+ Google Ranking Factors are updated for 2020.

Let’s dive right in.

The 200+ Google Search Engine Factors are split into the following nine sections:

Domain SEO Factors
Page-Level SEO Factors
Site-Level SEO Factors
Backlink SEO Factors
User Interaction Factors
Special Google Algorithm Rules
Brand Signals
On-Site Webspam Factors
Off-Site Webspam Factors

Domain SEO Factors

1. Domain Age:

Google’s Matt Cutts states that:

“The difference between a domain that’s six months old versus one year old is really not that big at all.”
In other words, they do use domain age. But it’s not that important. There is, however, much more of a difference between a domain that’s 10 years old and one that’s just 6 months old.

2. Keyword Appears in Top Level Domain:

Having a keyword in your domain name doesn’t give you the SEO boost that it used to. But it still acts as a relevancy signal for your page SEO.

3. Keyword As First Word in Domain:

A domain that starts with its target keyword has an SEO edge over sites that either don’t have that keyword in their domain (or have the keyword in the middle or end of their domain).

4. Domain registration length:

A Google patent states:

“Valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year. Therefore, the date when a domain expires in the future can be used as a factor in predicting the legitimacy of a domain.”

5. Keyword in Subdomain:

Moz’s SEO expert panel agrees that a keyword appearing in the subdomain of a website can boost your search engine rankings.

6. Domain History:

A site with volatile ownership or several drops may tell Google to “reset” the site’s history, negating links pointing to the domain. Or, in certain cases, a penalised domain may carry the penalty over to the new domain owner.

7. Exact Match Domain:

Exact Match Domains may still give you a slight SEO edge. But if your EMD happens to be a low-quality site, it’s vulnerable to the EMD update.

8. Public vs. Private WhoIs:

Private WhoIs information may be a sign of “something to hide”. Googler Matt Cutts is quoted as stating:

“…When I checked the whois on them, they all had “whois privacy protection service” on them. That’s relatively unusual. …Having whois privacy turned on isn’t automatically bad, but once you get several of these factors all together, you’re often talking about a very different type of webmaster than the fellow who just has a single site or so.”

9. Penalised WhoIs Owner:

If Google identifies a particular person as a spammer it makes sense that they would scrutinise other sites owned by that person.

10. Country TLD extension:

Having a Country Code Top Level Domain (.cn, .pt, .ca) can help the site rank for that particular country… but it can limit the site’s ability to rank globally.

Page-Level SEO Factors

11. Keyword in Title Tag:

Although not as important as it once was, your title tag remains an important on-page SEO ranking signal.

12. Title Tag Starts with Keyword:

According to Moz, title tags that start with a keyword tend to perform better in search engine rankings than title tags with the keyword towards the end of the tag.

13. Keyword in Description Tag:

Google doesn’t use the meta description tag as a direct page ranking signal. However, your description tag can impact click-through-rate, which is a key SEO ranking factor.

14. Keyword Appears in H1 Tag (main page title):

H1 tags are a “second title tag”. Along with your title tag, Google uses your H1 tag as a secondary relevancy signal, according to results from one correlation study.

15. TF-IDF:

A fancy way of saying: “How often does a certain word appear in a document?”. The more often that word appears on a page, the more likely it is that the page is about that word. Google likely uses a sophisticated version of TF-IDF.
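
For the curious, here is a minimal Python sketch of the classic TF-IDF calculation, run over a made-up three-page corpus. It only illustrates the idea; Google’s actual weighting scheme is not public and is certainly far more sophisticated.

```python
import math
from collections import Counter

# A tiny, made-up corpus of three "pages".
docs = [
    "seo tips for small business seo",
    "best toaster reviews and toaster tips",
    "local seo tips for restaurants",
]

def tf_idf(term, doc, corpus):
    words = doc.split()
    tf = Counter(words)[term] / len(words)       # term frequency within this page
    df = sum(term in d.split() for d in corpus)  # number of pages containing the term
    idf = math.log(len(corpus) / df)             # rarer terms across the corpus weigh more
    return tf * idf

print(round(tf_idf("toaster", docs[1], docs), 3))  # 0.366 - distinctive for the toaster page
print(round(tf_idf("tips", docs[0], docs), 3))     # 0.0 - "tips" appears on every page
```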

16. Content Length:

Content with more words can cover a wider breadth and is likely preferable in the algorithm compared to shorter, superficial articles. Indeed, one recent ranking factors industry study found that content length correlated with SERP position.

17. Table of Contents:

Using a linked table of contents can help Google better understand your page’s content. It can also result in sitelinks.

18. Keyword Density:

Although not as important as it once was, Google may use it to determine the topic of a webpage. But going overboard can hurt your Search Engine page ranking.

19. Latent Semantic Indexing Keywords in Content (LSI):

LSI keywords help search engines extract meaning from words that have more than one meaning (for example: Apple the computer company vs. Apple the fruit). The presence/absence of LSI probably also acts as a content quality signal.

20. LSI Keywords in Title and Description Tags:

As with webpage content, LSI keywords in page meta tags probably help Google discern between words with multiple potential meanings. May also act as a relevancy signal.

21. Page Covers Topic In-Depth:

There’s a known correlation between depth of topic coverage and Google rankings.

Therefore, pages that cover every angle likely have an edge vs. pages that only cover a topic partially.

22. Page Loading Speed via HTML:

Both Google and Bing use page speed as a ranking factor. Search engine spiders can estimate your site speed fairly accurately based on your page’s HTML code.
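
You cannot see Googlebot’s own measurement, but you can get a rough feel for a page’s server response time and raw HTML weight yourself. A small sketch using the requests package (the URL is a placeholder; a proper audit would use a tool like PageSpeed Insights or Lighthouse):

```python
import time
import requests  # assumes the requests package is installed

url = "https://example.com"  # placeholder URL to test

start = time.perf_counter()
response = requests.get(url, timeout=10)
elapsed = time.perf_counter() - start

print(f"HTTP status: {response.status_code}")
print(f"Fetch time:  {elapsed:.2f}s")
print(f"HTML weight: {len(response.content) / 1024:.1f} KB")
```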

23. Page Loading Speed via Chrome:

Google also uses Chrome user data to get a better handle on a page’s loading time. That way, they can measure how quickly a page actually loads to users.

24. Use of AMP:

While not a direct Google ranking factor, AMP may be a requirement to rank in the mobile version of the Google News Carousel.

25. Entity Match:

Does a page’s content match the “entity” that a user is searching for? If so, that page may get a rankings boost for that keyword.

26. Google Hummingbird:

This “algorithm change” helped Google go beyond keywords. Thanks to Hummingbird, Google can now better understand the topic of a webpage.

27. Duplicate Content:

Identical content on the same site (even slightly modified) can negatively influence a site’s search engine visibility.

28. Rel=Canonical:

When used properly, use of this tag may prevent Google from penalising your site for duplicate content.

29. Image Optimisation:

Images send search engines important relevancy signals through their file name, alt text, title, description and caption. Failing to describe your images with relevant keywords can hurt your page’s rankings.
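
One quick way to catch images that send no signal at all is to flag missing alt text. A rough sketch, assuming the requests and beautifulsoup4 packages are installed and using a placeholder URL:

```python
import requests
from bs4 import BeautifulSoup

url = "https://example.com/some-page"  # placeholder URL to audit
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Flag every image with missing or empty alt text so it can be fixed.
for img in soup.find_all("img"):
    alt = (img.get("alt") or "").strip()
    if not alt:
        print("Missing alt text:", img.get("src"))
```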

30. Content Recency:

The Google Caffeine update favors recently published or updated content, especially for time-sensitive searches.

Highlighting this factor’s importance, Google shows the date of a page’s last update for certain pages.

31. Magnitude of Content Updates:

The significance of edits and changes also serves as a freshness factor.

Adding or removing entire sections is more significant than switching around the order of a few words or fixing a typo.

32. Historical Page Updates:

How often has the page been updated over time?

Daily, weekly, every 5 years? Frequency of page updates also plays a role in freshness.

33. Keyword Prominence:

Having a keyword appear in the first 100 words of a page’s content is correlated to first page Google rankings.
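
If you want to sanity-check this on your own copy, a trivial sketch (the page text and keyword below are invented):

```python
def keyword_in_first_100_words(text: str, keyword: str) -> bool:
    """Return True if the keyword appears within the first 100 words of the text."""
    first_100 = " ".join(text.lower().split()[:100])
    return keyword.lower() in first_100

# Hypothetical page copy and target keyword.
page_copy = "Search marketing matters more than ever for businesses switching to online..."
print(keyword_in_first_100_words(page_copy, "search marketing"))  # True
```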

34. Keyword in H2, H3 Tags:

Having your keyword appear as a subheading in H2 or H3 format may be another weak relevancy signal. In fact, Googler John Mueller states:

“These heading tags in HTML help us to understand the structure of the page.”

35. Outbound Link Quality:

Many SEOs think that linking out to authority sites helps send trust signals to Google. And this is backed up by a recent industry study.

36. Outbound Link Theme:

According to the Hilltop Algorithm, Google may use the content of the pages you link to as a relevancy signal.

For example, if you have a page about cars that links to movie-related pages, this may tell Google that your page is about the movie Cars, not the automobile.

37. Grammar and Spelling:

Proper grammar and spelling is a quality signal, although Cutts gave mixed messages a few years back on whether or not this was important.

38. Syndicated Content:

Is the content on the page original? If it’s scraped or copied from an indexed page it won’t rank as well… or may not get indexed at all.

39. Mobile-Friendly Update:

Often referred to as “Mobilegeddon“, this update rewarded pages that were properly optimised for mobile devices.

40. Mobile Usability of your web content:

Websites that mobile users can easily use may have an edge in Google’s “Mobile-first Index”.

41. “Hidden” Content on Mobile:

Hidden content on mobile devices may not get indexed (or may not be weighed as heavily) vs. fully visible content.

However, a Googler recently stated that hidden content is OK. But he also said in the same video: “…if it’s critical content it should be visible…”.

42. Helpful “Supplementary Content”:

According to a now-public Google Rater Guidelines Document, helpful supplementary content is an indicator of a page’s quality (and therefore, Google ranking).

Examples include currency converters, loan interest calculators and interactive recipes.

43. Content Hidden Behind Tabs on a web page:

Do users need to click on a tab to reveal some of the content on your page? If so, Google has said that this content “may not be indexed”.

44. Number of Outbound Links:

Too many dofollow OBLs can “leak” PageRank, which can hurt that page’s rankings.

45. Multimedia Content:

Images, videos and other multimedia elements may act as a content quality signal. For example, one industry study found a correlation between multimedia and rankings.

46. Number of Internal Links Pointing to Web Page:

The number of internal links to a page indicates its importance relative to other pages on the site (more internal links=more important).

47. Quality of Internal Links Pointing to a Web Page:

Internal links from authoritative pages on the domain have a stronger effect than pages with no or low PageRank.

48. Broken Links:

Having too many broken links on a page may be a sign of a neglected or abandoned site. The Google Rater Guidelines Document uses broken links as one way to assess a homepage’s quality.
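
A rough broken-link check can be scripted in a few lines. This sketch assumes the requests and beautifulsoup4 packages and a placeholder URL; a real crawler would also handle rate limiting, retries and edge cases more carefully:

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

page = "https://example.com"  # placeholder page to audit
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for a in soup.find_all("a", href=True):
    link = urljoin(page, a["href"])   # resolve relative links against the page URL
    if not link.startswith("http"):
        continue                      # skip mailto:, tel:, javascript:, anchors, etc.
    try:
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print("Possibly broken:", link, status)
```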

49. Reading Level:

There’s no doubt that Google estimates the reading level of webpages. In fact, Google used to give you reading level stats.

But what they do with that information is up for debate. Some say that a basic reading level will help you rank better because it will appeal to the masses. But others associate a basic reading level with content mills like Ezine Articles.

50. Affiliate Links:

Affiliate links themselves probably won’t hurt your rankings. But if you have too many, Google’s algorithm may pay closer attention to other quality signals to make sure you’re not a “thin affiliate site“.

51. HTML errors/W3C validation:

Lots of HTML errors or sloppy coding may be a sign of a poor quality site. While controversial, many in SEO think that a well-coded page is used as a quality signal.

52. Domain Authority:

All things being equal, a page on an authoritative domain will rank higher than a page on a domain with less authority.

53. Page’s PageRank:

Not perfectly correlated. But pages with lots of authority tend to outrank pages without much link authority.

54. URL Length:

Excessively long URLs may hurt a page’s search engine visibility.

In fact, several industry studies have found that short URLs tend to have a slight edge in Google’s search results.

55. URL Path:

A page closer to the homepage may get a slight authority boost vs. pages buried deep down in a site’s architecture.

56. Human Editors:

Although never confirmed, Google has filed a patent for a system that allows human editors to influence the SERPs.

57. Page Category:

The category the page appears on is a relevancy signal. A page that’s part of a closely related category may get a relevancy boost compared to a page that’s filed under an unrelated category.

58. WordPress Tags:

Tags are a WordPress-specific relevancy signal. According to Yoast.com:

“The only way it improves your SEO is by relating one piece of content to another, and more specifically a group of posts to each other.”

59. Keyword in URL:

Another relevancy signal. A Google rep recently called this “a very small ranking factor”. But a ranking factor nonetheless.

60. URL String:

The categories in the URL string are read by Google and may provide a thematic signal to what a page is about.

61. References and Sources:

Citing references and sources, like research papers do, may be a sign of quality. The Google Quality Guidelines state that reviewers should keep an eye out for sources when looking at certain pages:

“This is a topic where expertise and/or authoritative sources are important…”.

However, Google has denied that they use external links as a ranking signal.

62. Bullets and Numbered Lists:

Bullets and numbered lists help break up your content for readers, making them more user friendly.

Google likely agrees and may prefer content with bullets and numbers.

63. Priority of a Page in your web Sitemap:

The priority a page is given via the sitemap.xml file may influence the ranking of that page.
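
For illustration, here is one way a sitemap.xml with per-page priority values might be generated. The URLs and priority numbers are invented, and how much weight Google actually gives the priority field is debatable:

```python
import xml.etree.ElementTree as ET

# Hypothetical pages and the priority you want to suggest for each.
pages = [
    ("https://example.com/", "1.0"),
    ("https://example.com/services/", "0.8"),
    ("https://example.com/blog/old-post/", "0.3"),
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, priority in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "priority").text = priority

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```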

64. Too Many Outbound Links:

Straight from the aforementioned Quality rater document:

“Some pages have way, way too many links, obscuring the page and distracting from the Main Content.”

65. UX Signals From Other Keywords Page Ranks For:

If the page ranks for several other keywords, it may give Google an internal sign of quality.

In fact, Google’s recent “How Search Works” report states:

“We look for sites that many users seem to value for similar queries.”

66. Page Age:

Although Google prefers fresh content, an older page that’s regularly updated may outperform a newer page.

67. User Friendly Layout:

Citing the Google Quality Guidelines Document yet again:

“The page layout on highest quality pages makes the Main Content immediately visible.”

68. Parked Domains:

A Google update in December of 2011 decreased search visibility of parked domains.

69. Useful Content:

As pointed out by Backlinko reader Jared Carrizales, Google may distinguish between “quality” and “useful” content.

Site-Level Factors

70. Content Provides Value and Unique Insights:

Google has stated that they are happy to penalise websites that don’t bring anything new or useful to the table, especially thin affiliate sites.

71. Contact Us Page:

The aforementioned Google Quality Document states that they prefer sites with an “appropriate amount of contact information”. Make sure that your contact information matches your whois info.

72. Domain Trust/TrustRank:

Many SEOs believe that “TrustRank” is a massively important ranking factor.

And a Google Patent titled “Search result ranking based on trust”, seems to back this up.

73. Site Architecture:

A well put-together site architecture (for example, a silo structure) helps Google thematically organise your content. It also helps Googlebot access and index all of your site’s pages.

74. Site Updates:

Many SEOs believe that website updates — and especially when new content is added to the site — work as a site-wide freshness factor.

Although Google has recently denied that they use “publishing frequency” in their algorithm.

75. Presence of Sitemap:

A sitemap helps search engines index your pages more easily and thoroughly, improving visibility.

However, Google recently stated that HTML sitemaps aren’t “useful” for SEO.

76. Site Uptime:

Lots of downtime from site maintenance or server issues may hurt your rankings (and can even result in deindexing if not corrected).

77. Server Location:

Server location influences where your site ranks in different geographical regions (source).

Especially important for geo-specific searches.

78. SSL Certificate:

Google has confirmed that they use HTTPS as a ranking signal.

According to Google, however, HTTPS only acts as a “tiebreaker“.

79. Terms of Service and Privacy Pages:

These two pages help tell Google that a site is a trustworthy member of the internet. They may also help improve your site’s E-A-T.

80. Duplicate Meta Information On-Site:

Duplicate meta information across your site may bring down all of your pages’ visibility.
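
A simple audit for one common case, duplicate title tags, might look like this sketch (placeholder URLs; assumes the requests and beautifulsoup4 packages):

```python
import requests
from bs4 import BeautifulSoup
from collections import defaultdict

# Placeholder list of pages to check for duplicate <title> tags.
urls = [
    "https://example.com/",
    "https://example.com/services/",
    "https://example.com/contact/",
]

pages_by_title = defaultdict(list)
for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    pages_by_title[title].append(url)

for title, pages in pages_by_title.items():
    if len(pages) > 1:
        print(f"Duplicate title {title!r} on: {pages}")
```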

81. Breadcrumb Navigation:

This is a style of user-friendly site architecture that helps users (and search engines) know where they are on a site.

Google states that: “Google Search uses breadcrumb markup in the body of a web page to categorise the information from the page in search results.”
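
Breadcrumb markup is typically added as JSON-LD structured data. A hedged sketch of what such a block could look like, generated with Python; the page names and URLs are placeholders, not a template from this site:

```python
import json

# Hypothetical breadcrumb trail: Home > Blog > SEO Advice
breadcrumbs = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": 1, "name": "Home",
         "item": "https://example.com/"},
        {"@type": "ListItem", "position": 2, "name": "Blog",
         "item": "https://example.com/blog/"},
        {"@type": "ListItem", "position": 3, "name": "SEO Advice",
         "item": "https://example.com/blog/seo-advice/"},
    ],
}

# The output would go inside a <script type="application/ld+json"> tag on the page.
print(json.dumps(breadcrumbs, indent=2))
```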

82. Mobile Optimised:

With more than half of all searches done from mobile devices, Google wants to see that your site is optimised for mobile users.

In fact, Google now penalises websites that aren’t mobile friendly.

83. YouTube:

There’s no doubt that YouTube videos are given preferential treatment in the SERPs (probably because Google owns it).

In fact, Search Engine Land found that YouTube.com traffic increased significantly after Google Panda.

84. Site Usability:

A site that’s difficult to use or to navigate can hurt rankings indirectly by reducing time on site and pages viewed, and by increasing bounce rate (in other words, RankBrain ranking factors).

85. Use of Google Analytics and Google Search Console:

Some think that having these two programs installed on your site can improve your page’s indexing. They may also directly influence rankings by giving Google more data to work with (ie. more accurate bounce rate, whether or not you get referral traffic from your backlinks etc.).

That said, Google has denied this, calling it a myth.

86. User reviews/Site reputation:

A site’s reputation on sites like Yelp.com likely plays an important role in Google’s algorithm. Google even posted a rarely candid outline of how they use online reviews after one site was caught ripping off customers in an effort to get press and links.

Backlink SEO Ranking Factors

87. Linking Domain Age:

Backlinks from aged domains may be more powerful than new domains.

88. # of Linking Root Domains:

The number of referring domains is one of the most important ranking factors in Google’s algorithm, as you can see from this industry study of 1 million Google Search results.

89. # of Links from Separate C-Class IPs:

Links from separate class-c IP addresses suggest a wider breadth of sites linking to you, which can help with rankings.

90. # of Linking Pages:

The total number of linking pages — even from the same domain — has an impact on rankings.

91. Backlink Anchor Text:

As noted in this description of Google’s original algorithm:

“First, anchors often provide more accurate descriptions of web pages than the pages themselves.”

Obviously, anchor text is less important than before (and, when over-optimised, works as a webspam signal). But keyword-rich anchor text still sends a strong relevancy signal in small doses.

92. Alt Tag (for Image Links):

Alt text acts as anchor text for images.

93. Links from .edu or .gov Domains:

Matt Cutts has stated that TLD doesn’t factor into a site’s importance. And Google has said they “ignore” lots of Edu links. However, that doesn’t stop SEOs from thinking that there’s a special place in the algorithm for .gov and .edu TLDs.

94. Authority of Linking Page:

The authority (PageRank) of the referring page has been an extremely important ranking factor since Google’s early days and still is.

95. Authority of Linking Domain:

The referring domain’s authority may play an independent role in a link’s value.

96. Links From Competitors:

Links from other pages ranking in the same SERP may be more valuable to a page’s ranking for that particular keyword.

97. Links from “Expected” Websites:

Although speculative, some SEOs believe that Google won’t fully trust your website until you get linked to from a set of “expected” sites in your industry.

98. Links from Bad Neighborhoods:

Links from so-called “bad neighborhoods” may hurt your site.

99. Guest Posts:

Although links from guest posts still pass value, they likely aren’t as powerful as true editorial links (plus, “large-scale” guest posting can get your site into trouble).

100. Links From Ads:

According to Google, links from ads should be nofollowed. However, it’s likely that Google is able to identify and filter out followed links from ads.

101. Homepage Authority:

Links to a referring page’s homepage may be of special importance in evaluating a site’s — and therefore a link’s — weight.

102. Nofollow Links:

This is one of the most controversial topics in SEO. Google’s official word on the matter is:

“In general, we don’t follow them.”
Which suggests that they do… at least in certain cases. Having a certain % of nofollow links may also indicate a natural vs. unnatural link profile.

103. Diversity of Link Types:

Having an unnaturally large percentage of your links coming from a single source (ie. forum profiles, blog comments) may be a sign of webspam. On the other hand, links from diverse sources are a sign of a natural link profile.

104. “Sponsored” or “UGC” Tags:

Links tagged as “rel=sponsored” or “rel=UGC” are treated differently than normal “followed” or rel=nofollow links.

105. Contextual Links:

Links embedded inside a page’s content are considered more powerful than links on an empty page or found elsewhere on the page.

106. Excessive 301 Redirects to Page:

Backlinks coming from 301 redirects dilute some PageRank, according to a Webmaster Help Video.
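
If you want to see how many hops a redirected backlink actually passes through, you can trace the chain yourself. A small sketch with the requests package and a placeholder URL:

```python
import requests

url = "http://example.com/old-page"  # placeholder backlink target to trace

response = requests.get(url, allow_redirects=True, timeout=10)

# response.history lists each intermediate redirect hop, in order.
for hop in response.history:
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print("Final destination:", response.status_code, response.url)
```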

107. Internal Link Anchor Text:

Internal link anchor text is another relevancy signal. That said, internal links likely have much less weight than anchor text coming from external sites.

108. Link Title Attribution:

The link title (the text that appears when you hover over a link) may also be used as a weak relevancy signal.

109. Country TLD of Referring Domain:

Getting links from country-specific top level domain extensions (.de, .cn, .co.uk) may help you rank better in that country.

110. Link Location In Content:

Links in the beginning of a piece of content may carry slightly more weight than links placed at the end of the content.

111. Link Location on Page:

Where a link appears on a page is important. Generally, a link embedded in a page’s content is more powerful than a link in the footer or sidebar area.

112. Linking Domain Relevancy:

A backlink from a site in a similar niche is significantly more powerful than a link from a completely unrelated site.

113. Page-Level Relevancy:

A backlink from a relevant page also passes more value.

114. Keyword in Title:

Google gives extra love to links from pages that contain your page’s keyword in the title (“Experts linking to experts”.)

115. Positive Link Velocity:

A site with positive link velocity usually gets a SERP boost as it shows your site is increasing in popularity.

116. Negative Link Velocity:

On the flip side, a negative link velocity can significantly reduce rankings as it’s a signal of decreasing popularity.

117. Links from “Hub” Pages:

The Hilltop Algorithm suggests that links from pages considered top resources (or hubs) on a certain topic are given special treatment.

118. Link from Authority Sites:

A link from a site considered an “authority site” likely passes more juice than a link from a small, relatively unknown site.

119. Linked to as Wikipedia Source:

Although the links are nofollow, many think that getting a link from Wikipedia gives you a little added trust and authority in the eyes of search engines.

120. Co-Occurrences:

The words that tend to appear around your backlinks help tell Google what that page is about.

121. Backlink Age:

According to a Google patent, older links have more ranking power than newly minted backlinks.

122. Links from Real Sites vs. “Splogs”:

Due to the proliferation of blog networks, Google probably gives more weight to links coming from “real sites” than from fake blogs. They likely use brand and user-interaction signals to distinguish between the two.

123. Natural Link Profile:

A site with a “natural” link profile is going to rank highly and be more durable to updates than one that has obviously used black hat strategies to build links.

124. Reciprocal Links:

Google’s Link Schemes page lists “Excessive link exchanging” as a link scheme to avoid.

125. User Generated Content Links:

Google can identify UGC vs. content published by the actual site owner. For example, they know that a link from the official WordPress.com blog is very different than a link from besttoasterreviews.wordpress.com.

126. Links from 301:

Links from 301 redirects may lose a little bit of juice compared to a direct link. However, Matt Cutts says that 301s are similar to direct links.

127. Schema.org Usage:

Pages that support microformats may rank above pages without them. This may be a direct boost, or the fact that pages with microformatting have a higher SERP CTR.

128. TrustRank of Linking Site:

The trustworthiness of the site linking to you determines how much “TrustRank” gets passed on to you.

129. Number of Outbound Links on a Web Page:

PageRank is finite. A link on a page with hundreds of external links passes less PageRank than a page with a handful of outbound links.

130. Forum Links:

Because of industrial-level spamming, Google may significantly devalue links from forums.

131. Word Count of Linking Content:

A link from a 1000-word post is usually more valuable than a link inside of a 25-word snippet.

132. Quality of Linking Content:

Links from poorly written or spun content don’t pass as much value as links from well-written content.

133. Sitewide Links:

Matt Cutts has confirmed that sitewide links are “compressed” to count as a single link.

User Interaction Ranking Factors

134. RankBrain:

RankBrain is Google’s AI algorithm. Many believe that its main purpose is to measure how users interact with the search results (and rank the results accordingly).

135. Organic Click Through Rate for a Keyword:

According to Google, pages that get clicked on more often (a higher organic CTR) may get a SERP boost for that particular keyword.

136. Organic CTR for All Keywords:

A site’s organic CTR for all keywords it ranks for may be a human-based, user interaction signal (in other words, a “Quality Score” for the organic results).

137. Website Bounce Rate:

Not everyone in SEO agrees that bounce rate matters, but it may be a way for Google to use its users as quality testers (after all, pages with a high bounce rate probably aren’t a great result for that keyword). Also, a recent study by SEMRush found a correlation between bounce rate and Google rankings.

138. Direct Web Traffic:

It’s confirmed that Google uses data from Google Chrome to determine how many people visit a site (and how often). Sites with lots of direct traffic are likely higher quality sites vs. sites that get very little direct traffic. In fact, the SEMRush study I just cited found a significant correlation between direct traffic and Google rankings.

139. Repeat Website Traffic:

Websites with repeat visitors may get a Google ranking boost.

140. Pogosticking:

“Pogosticking” is a special type of bounce. In this case, the user clicks on other search results in an attempt to find the answer to their query.

Results that people Pogostick from may get a significant rankings drop.

141. Blocked Sites:

Google has discontinued this feature in Chrome. However, Panda used this feature as a quality signal. So Google may still use a variation of it.

142. Chrome Bookmarks:

We know that Google collects Chrome browser usage data. Pages that get bookmarked in Chrome might get a boost.

143. Number of Comments:

Pages with lots of comments may be a signal of user-interaction and quality. In fact, one Googler said comments can help “a lot” with rankings.

144. Dwell Time:

Google pays very close attention to “dwell time“: how long people spend on your page when coming from a Google search.

This is also sometimes referred to as “long clicks vs short clicks”. In short: Google measures how long Google searchers spend on your page. The longer time spent, the better.

Special Google Algorithm Rules

145. Query Deserves Freshness:

Google gives newer pages a boost for certain searches.

146. Query Deserves Diversity:

Google may add diversity to a SERP for ambiguous keywords, such as “Ted”, “WWF” or “ruby”.

147. User Browsing History:

You’ve probably noticed this yourself: websites that you visit frequently get a SERP boost for your searches.

148. User Search History:

Search chains influence search results for later searches.

For example, if you search for “reviews” then search for “toasters”, Google is more likely to rank toaster review sites higher in the SERPs.

149. Featured Snippets:

According to an SEMRush study, Google chooses Featured Snippets content based on a combination of content length, formatting, page authority and HTTPS usage.

150. Geo Targeting:

Google gives preference to sites with a local server IP and country-specific domain name extension.

151. Safe Search:

Search results with curse words or adult content won’t appear for people with Safe Search turned on.

152. Google+ Circles:

Even though Google+ is soon to be dead, Google still shows higher results for authors and sites that you’ve added to your Google Plus Circles.

153. “YMYL” Keywords:

Google has higher content quality standards for “Your Money or Your Life” keywords.

154. DMCA Complaints:

Google “downranks” pages with legitimate DMCA complaints.

155. Domain Diversity:

The so-called “Bigfoot Update” supposedly added more domains to each SERP page.

156. Transactional Searches:

Google sometimes displays different results for shopping-related keywords, like flight searches.

157. Local Searches:

For local searches, Google often places local results above the “normal” organic SERPs.

158. Top Stories box:

Certain keywords trigger a Top Stories box.

159. Big Brand Preference:

After the Vince Update, Google began giving big brands a boost for certain keywords.

160. Shopping Results:

Google sometimes displays Google Shopping results in organic SERPs.

161. Image Results:

Google images sometimes appear in the normal, organic search results.

162. Easter Egg Results:

Google has a dozen or so Easter Egg results. For example, when you search for “Atari Breakout” in Google image search, the search results turn into a playable game (!). Shout out to Victor Pan for this one.

163. Single Site Results for Brands:

Domain or brand-oriented keywords bring up several results from the same site.

164. Payday Loans Update:

This is a special algorithm designed to clean up “very spammy queries“.

Brand Signals

165. Brand Name Anchor Text:

Branded anchor text is a simple — but strong — brand signal.

166. Branded Searches:

People search for brands. If people search for your brand in Google, this shows Google that your site is a real brand.

167. Brand + Keyword Searches:

Do people search for a specific keyword along with your brand (for example: “Backlinko Google ranking factors” or “Backlinko SEO”)? If so, Google may give you a rankings boost when people search for the non-branded version of that keyword in Google.

168. Website Has Facebook Page and Likes:

Brands tend to have Facebook pages with lots of likes.

169. Website has Twitter Profile with Followers:

Twitter profiles with a lot of followers signal a popular brand.

170. Official LinkedIn Company Page:

Most real businesses have company LinkedIn pages.

171. Known Authorship:

In February 2013, Google’s Eric Schmidt famously claimed:

“Within search results, information tied to verified online profiles will be ranked higher than content without such verification, which will result in most users naturally clicking on the top (verified) results.”

172. Legitimacy of Social Media Accounts:

A social media account with 10,000 followers and 2 posts is probably interpreted a lot differently than another 10,000-follower strong account with lots of interaction. In fact, Google filed a patent for determining whether or not social media accounts were real or fake.

173. Brand Mentions on Top Stories:

Really big brands get mentioned on Top Stories sites all the time. In fact, some brands even have a feed of news from their own website on the first page.

174. Unlinked Brand Mentions:

Brands get mentioned without getting linked to. Google likely looks at non-hyperlinked brand mentions as a brand signal.

175. Brick and Mortar Location:

Real businesses have offices. It’s possible that Google fishes for location-data to determine whether or not a site is a big brand.

On-Site Webspam Factors

176. Panda Penalty:

Websites with low-quality content (particularly content farms) are less visible in search after getting hit by a Panda penalty.

177. Links to Bad Neighbourhoods:

Linking out to “bad neighborhoods” — like spammy pharmacy or payday loan websites — may hurt your search visibility.

178. Redirects:

Sneaky redirects are a big no-no. If caught, it can get a website not just penalised, but de-indexed.

179. Popups or “Distracting Ads”:

The official Google Rater Guidelines Document says that popups and distracting ads are a sign of a low-quality website.

180. Interstitial Popups:

Google may penalise websites that display full page “interstitial” popups to mobile users.

181. Site Over-Optimisation:

Yes, Google does penalise people for over-optimising their site. This includes: keyword stuffing, header tag stuffing, excessive keyword decoration.

182. Gibberish Content:

A Google Patent outlines how Google can identify “gibberish” content, which is helpful for filtering out spun or auto-generated content from their index.

183. Doorway Pages:

Google wants the page you show to Google to be the page that users ultimately see. If your page redirects people to another page, that’s a “Doorway Page”. Needless to say, Google doesn’t like websites that use Doorway Pages.

184. Ads Above the Fold:

The “Page Layout Algorithm” penalises websites with lots of ads (and not much content) above the fold.

185. Hiding Affiliate Links:

Going too far when trying to hide affiliate links (especially with cloaking) can bring on a penalty.

186. Fred:

A nickname given to a series of Google updates starting in 2017. According to Search Engine Land, Fred “targets low-value content sites that put revenue above helping their users.”

187. Affiliate Sites:

It’s no secret that Google isn’t the biggest fan of affiliates. And many think that sites that monetise with affiliate programs are put under extra scrutiny.

188. Autogenerated Content:

Google understandably hates autogenerated content. If they suspect that your site’s pumping out computer-generated content, it could result in a penalty or de-indexing.

189. Excess PageRank Sculpting:

Going too far with PageRank sculpting — by nofollowing all outbound links — may be a sign of gaming the system.

190. IP Address Flagged as Spam:

If your server’s IP address is flagged for spam, it may affect all websites on that server.

191. Meta Tag Spamming:

Keyword stuffing can also happen in meta tags. If Google thinks you’re adding keywords to your title and description tags in an effort to game the algo, they may hit your site with a penalty.

Off-Site Webspam Factors

192. Hacked Site:

If your site gets hacked it can get dropped from the search results. In fact, Search Engine Land was completely deindexed after Google thought it had been hacked.

193. Unnatural Influx of Links:

A sudden (and unnatural) influx of links is a sure-fire sign of phony links.

194. Penguin Penalty:

Websites that were hit by Google Penguin are significantly less visible in search. Although, apparently, Penguin now focuses more on filtering out bad links vs. penalising entire websites.

195. Link Profile with High % of Low Quality Links:

Lots of links from sources commonly used by black hat SEOs (like blog comments and forum profiles) may be a sign of gaming the system.

196. Links From Unrelated Websites:

A high-percentage of backlinks from topically-unrelated sites can increase the odds of a manual penalty.

197. Unnatural Links Warning:

Google has sent out thousands of “Google Search Console notice of detected unnatural links” messages. This usually precedes a ranking drop, although not 100% of the time.

198. Low-Quality Directory Links:

According to Google, backlinks from low-quality directories can lead to a penalty.

199. Widget Links:

Google frowns on links that are automatically generated when a user embeds a “widget” on their site.

200. Links from the Same Class C IP:

Getting an unnatural amount of links from sites on the same server IP may help Google determine that your links are coming from a blog network.

201. “Poison” Anchor Text:

Having “poison” anchor text (especially pharmacy keywords) pointed to your website may be a sign of spam or a hacked site. Either way, it can hurt your website’s ranking.

202. Unnatural Link Spike:

A 2013 Google Patent describes how Google can identify whether or not an influx of links to a page is legitimate. Those unnatural links may become devalued.

203. Links From Articles and Press Releases:

Article directories and press releases have been abused to the point that Google now considers these two link building strategies a “link scheme” in many cases.

204. Manual Actions:

There are several types of these, but most are related to black hat link building.

205. Selling Links:

Getting caught selling links can hurt your search visibility.

206. Google Sandbox:

New sites that get a sudden influx of links are sometimes put in the Google Sandbox, which temporarily limits search visibility.

207. Google Dance:

The Google Dance can temporarily shake up rankings. According to a Google Patent, this may be a way for them to determine whether or not a site is trying to game the algorithm.

208. Disavow Tool:

Use of the Disavow Tool may remove a manual or algorithmic penalty for sites that were the victims of negative SEO.

209. Reconsideration Request:

A successful reconsideration request can lift a penalty.

210. Temporary Link Schemes:

Google has caught onto people that create — and quickly remove — spammy links. Also known as a temporary link scheme.

You can read more in the full article below, along with other marketing tips and advice.

— Read on backlinko.com/google-ranking-factors

Business and Sports News from Mike Armstrong – See http://mikearmstrong.me

SEO News & Advice

New post on Online Marketing Hub

The Danger of Crossing Algorithms: Uncovering The Cloaked Panda Update During Penguin 3.0
by christopherjanb
Posted by glenngabe

Penguin 3.0 was one of the most anticipated algorithm updates in recent years when it rolled out on October 17, 2014. Penguin hadn’t run for over a year at that point, and there were many webmasters sitting in Penguin limbo waiting for recovery. They had cleaned up their link profiles, disavowed what they could, and were simply waiting for the next update or refresh. Unfortunately, Google was wrestling with the algo internally and over twelve months passed without an update.

So when Pierre Far finally announced Penguin 3.0 a few days later on October 21, a few things stood out. First, this was not a new algorithm like Gary Illyes had explained it would be at SMX East. It was a refresh and underscored the potential problems Google was battling with Penguin (cough, negative SEO).

Second, we were not seeing the impact that we expected. The rollout seemed to begin with a heavier international focus and the overall U.S. impact has been underwhelming to say the least. There were definitely many fresh hits globally, but there were a number of websites that should have recovered but didn’t for some reason. And many are still waiting for recovery today.

Third, the rollout would be slow and steady and could take weeks to fully complete. That’s unusual, but makes sense given the microscope Penguin 3.0 was under. And this third point (the extended rollout) is even more important than most people think. Many webmasters are already confused when they get hit during an acute algorithm update (for example, when an algo update rolls out on one day). But the confusion gets exponentially worse when there is an extended rollout.

The more time that goes by between the initial launch and the impact a website experiences, the more questions pop up. Was it Penguin 3.0 or was it something else? Since I work heavily with algorithm updates, I’ve heard similar questions many times over the past several years. And the extended Penguin 3.0 rollout is a great example of why confusion can set in. That’s my focus today.

Penguin, Pirate, and the anomaly on October 24

With the Penguin 3.0 rollout, we also had Pirate 2 rolling out. And yes, there are some websites that could be impacted by both. That added a layer of complexity to the situation, but nothing like what was about to hit. You see, I picked up a very strange anomaly on October 24. And I clearly saw serious movement on that day (starting late in the day ET).

So, if there was a third algorithm update, then that’s three potential algo updates rolling out at the same time. More about this soon, but it underscores the confusion that can set in when we see extended rollouts, with a mix of confirmed and unconfirmed updates.

Penguin 3.0 tremors and analysis

Since I do a lot of Penguin work, and have researched many domains impacted by Penguin in the past, I heavily studied the Penguin 3.0 rollout and published a blog post based on analyzing the first ten days of Penguin 3.0, which included some interesting findings for sure.

And based on the extended rollout, I definitely saw Penguin tremors beyond the initial October 17 launch. For example, check out the screenshot below of a website seeing Penguin impact on October 17, 22, and 25.

But as mentioned earlier, something else happened on October 24 that set off sirens in my office. I started to see serious movement on sites impacted by Panda, and not Penguin. And when I say serious movement, I’m referring to major traffic gains or losses all starting on October 24. Again, these were sites heavily dealing with Panda and had clean link profiles. Check out the trending below from October 24 for several sites that saw impact.

A good day for a Panda victim:
A bad day for a Panda victim:
And an incredibly frustrating day for a 9/5 recovery that went south on 10/24:

(All on the link below)!

I saw this enough that I tweeted heavily about it and included a section about Panda in my Penguin 3.0 blog post. And that’s when something wonderful happened, and it highlights the true beauty and power of the internet.

As more people saw my tweets and read my post, I started receiving messages from other webmasters explaining that they saw the same exact thing, and on their websites dealing with Panda and not Penguin. And not only did they tell me about it, they showed me the impact.

I received emails containing screenshots and tweets with photos from Google Analytics and Google Webmaster Tools. It was amazing to see, and it confirmed that we had just experienced a Panda update in the middle of a multi-week Penguin rollout. Yes, read that line again. Panda during Penguin, right when the internet world was clearly focused on Penguin 3.0.

That was a sneaky move Google… very sneaky. 🙂

So, based on what I explained earlier about webmaster confusion and algorithms, can you tell what happened next? Yes, massive confusion ensued. We had the trifecta of algorithm updates with Penguin, Pirate, and now Panda.

Webmaster confusion and a reminder of the algo sandwich from 2012

So, we had a major algorithm update during two other major algorithm updates (Penguin and Pirate) and webmaster confusion was hitting extremely high levels. And I don’t blame anyone for being confused. I’m neck deep in this stuff and it confused me at first.

Was the October 24 update a Penguin tremor or was this something else? Could it be Pirate? And if it was indeed Panda, it would have been great if Google told us it was Panda! Or did they want to throw off SEOs analyzing Penguin and Pirate? Does anyone have a padded room I can crawl into?

Once I realized this was Panda, and started to communicate the update via Twitter and my blog, I had a number of people ask me a very important question:

“Glenn, would Google really roll out two or three algorithm updates so close together, or at the same time?”

Why yes, they would. Anyone remember the algorithm sandwich from April of 2012? That’s when Google rolled out Panda on April 19, then Penguin 1.0 on April 24, followed by Panda on April 27. Yes, we had three algorithm updates all within ten days. And let’s not forget that the Penguin update on April 24, 2012 was the first of its kind! So yes, Google can, and will, roll out multiple major algos around the same time.

Where are we headed? It’s fascinating, but not pretty

Panda is near real-time now

When Panda 4.1 rolled out on September 23, 2014, I immediately disliked the title and version number of the update. Danny Sullivan named it 4.1, so it stuck. But for me, that was not 4.1… not even close. It was more like 4.75. You see, there have been a number of Panda tremors and updates since P4.0 on May 20, 2014.

I saw what I was calling “tremors” nearly weekly based on having access to a large amount of Panda data (across sites, categories, and countries). And based on what I was seeing, I reached out to John Mueller at Google to clarify the tremors. John’s response was great and confirmed what I was seeing. He explained that there was not a set frequency for algorithms like Panda. Google can roll out an algorithm, analyze the SERPs, refine the algo to get the desired results, and keep pushing it out. And that’s exactly what I was seeing (again, almost weekly since Panda 4.0).

When Panda and Penguin meet in real time…

…they will have a cup of coffee and laugh at us. 🙂 So, since Panda is near-real time, the crossing of major algorithm updates is going to happen. And we just experienced an important one on October 24 with Penguin, Pirate, and Panda. But it could (and probably will) get more chaotic than what we have now. We are quickly approaching a time where major algorithm updates crafted in a lab will be unleashed on the web in near-real time or in actual real time.

And if organic search traffic from Google is important to you, then pay attention. We’re about to take a quick trip into the future of Google and SEO. And after hearing what I have to say, you might just want the past back…

Google’s brilliant object-oriented approach to fighting webspam

I have presented at the past two SES conferences about Panda, Penguin, and other miscellaneous disturbances in the force. More about those “other disturbances” soon. In my presentation, one of my slides looks like this:

(See link below)!

Over the past several years, Google has been using a brilliant, object-oriented approach to fighting webspam and low quality content. Webspam engineers can craft external algorithms in a lab and then inject them into the real-time algorithm whenever they want. It’s brilliant because it isolates specific problems, while also being extremely scalable. And by the way, it should scare the heck out of anyone breaking the rules.

For example, we have Panda, Penguin, Pirate, and Above the Fold. Each was crafted to target a specific problem and can be unleashed on the web whenever Google wants. Sure, there are undoubtedly connections between them (either directly or indirectly), but each specific algo is its own black box. Again, it’s object-oriented.

Now, Panda is a great example of an algorithm that has matured to where Google highly trusts it. That’s why Google announced in June of 2013 that Panda would roll out monthly, over ten days. And that’s also why it matured even more with Panda 4.0 (and why I’ve seen tremors almost weekly).

And then we had Gary Illyes explain that Penguin was moving along the same path. At SMX East, Gary explained that the new Penguin algorithm (which clearly didn’t roll out on October 17) would be structured in a way where subsequent updates could be rolled out more easily. You know, like Panda.

And by the way, what if this happens to Pirate, Above the Fold, and other algorithms that Google is crafting in its Frankenstein lab? Well my friends, then we’ll have absolute chaos and society as we know it will crumble. OK, that’s a bit dramatic, but you get my point.

We already have massive confusion now… and a glimpse into the future reveals a continual flow of major algorithms running in real-time, each that could pummel a site to the ground. And of course, with little or no sign of which algo actually caused the destruction. I don’t know about you, but I just broke out in hives. 🙂

Actual example of what (near) real-time updates can do

After Panda 4.0, I saw some very strange Panda movement for sites impacted by recent updates. And it underscores the power of near-real time algo updates. As a quick example, temporary Panda recoveries can happen if you don’t get out of the gray area enough. And now that we are seeing Panda tremors almost weekly, you can experience potential turbulence several times per month.

Here is a screenshot from a site that recovered from Panda, didn’t get out of the gray area and reentered the strike zone, just five days later.

(See link to article below)!

Holy cow, that was fast. I hope they didn’t plan any expensive trips in the near future. This is exactly what can happen when major algorithms roam the web in real time. One week you’re looking good and the next week you’re in the dumps. Now, at least I knew this was Panda. The webmaster could tackle more content problems and get out of the gray area… But the ups and downs of a Panda roller coaster ride can drive a webmaster insane. It’s one of the reasons I recommend making significant changes when you’ve been hit by Panda. Get as far out of the gray area as possible.

An “automatic action viewer” in Google Webmaster Tools could help (and it’s actually being discussed internally by Google)

Based on webmaster confusion, many have asked Google to create an “automatic action viewer” in Google Webmaster Tools. It would be similar to the “manual actions viewer,” but focused on algorithms that are demoting websites in the search results (versus penalties). Yes, there is a difference by the way.

The new viewer would help webmasters better understand the types of problems that are being impacted by algorithms like Panda, Penguin, Pirate, Above the Fold, and others. Needless to say, this would be incredibly helpful to webmasters, business owners, and SEOs.

So, will we see that viewer any time soon? Google’s John Mueller addressed this question during the November 3 webmaster hangout (at 34:54).

http://ift.tt/1zj066n

John explained they are trying to figure something out, but it’s not easy. There are so many algorithms running that they don’t want to provide feedback that is vague or misleading. But, John did say they are discussing the automatic action viewer internally. So you never know…

A quick note about Matt Cutts
As many of you know, Matt Cutts took an extended leave this past summer (through the end of October). Well, he announced on Halloween that he is extending his leave into 2015. I won’t go crazy here talking about his decision overall, but I will
focus on how this impacts webmasters as it relates to algorithm updates and webspam.

Matt does a lot more than just announce major algo updates… He actually gets involved when collateral damage rears its ugly head. And there’s not a
faster way to rectify a flawed algo update than to have Mr. Cutts involved. So before you dismiss Matt’s extended leave as uneventful, take a look at the
trending below:

Notice the temporary drop off a cliff, then 14 days of hell, only to see that traffic return? That’s because Matt got involved. That’s the
movie blog fiasco from early 2014 that I heavily analyzed. If Matt had not been notified of the drop via Twitter, and hadn’t taken action, I’m not sure the movie blogs that got hit would be around today. I told Peter from
SlashFilm that his fellow movie blog owners should all pay him a bonus this year. He’s the one that pinged Matt via Twitter and got the ball rolling.

It’s just one example of how having someone with power out front can nip potential problems in the bud. Sure, the sites experienced two weeks of utter
horror, but traffic returned once Google rectified the problem. Now that Matt isn’t actively helping or engaged, who will step up and be that guy? Will it
be John Mueller, Pierre Far, or someone else? John and Pierre are extremely helpful, but will they go to bat for a niche that just got destroyed? Will they
push changes through so sites can turn around? And even at its most basic level, will they even be aware the problem exists?

These are all great questions, and I don’t want to bog down this post (it’s already incredibly long). But don’t laugh off Matt Cutts taking an extended leave. You might only realize how important he was to the SEO community once he’s gone for good. And hopefully that realization won’t come because your site just tanked as collateral damage during an algorithm update while Matt was off running a marathon or trying on new Halloween costumes. Then where will you be?

Recommendations moving forward:
So where does this leave us? How can you prepare for the approaching storm of crossing algorithms? Below, I have provided several key bullets that I think every webmaster should consider. I recommend taking a hard look at your site now, before major algos are running in near-real time.

Truly understand the weaknesses with your website. Google will continue crafting external algos that can be injected into the real-time algorithm.
And they will go real-time at some point. Be ready by cleaning up your site now.
Document all changes and fluctuations the best you can. Use annotations in Google Analytics and keep a spreadsheet updated with detailed
information.
Along the same lines, download your Google Webmaster Tools data monthly (at least). After helping many companies with algorithm hits, I can tell you that information is incredibly valuable and can help lead you down the right recovery path.
Use a mix of audits and focus groups to truly understand the quality of your site. I mentioned in my post about aggressive advertising and Panda that human focus groups are worth their weight in gold (for surfacing Panda-related problems). Most business owners are too close to their own content and websites to accurately measure quality. Bias can be a nasty problem and can quickly lead to bamboo-overflow on a website.
Beyond on-site analysis, make sure you tackle your link profile as well. I recommend heavily analyzing your inbound links and weeding out unnatural links. And use the disavow tool for links you can’t remove (a minimal sketch of what a disavow file looks like follows this list). The combination of enhancing the quality of your content, boosting engagement, knocking down usability obstacles, and cleaning up your link profile can help you achieve long-term SEO success. Don’t tackle one quarter of your SEO problems. Address all of them.
Remove barriers that inhibit change and action. You need to move fast. You need to be decisive. And you need to remove red tape that can bog down
the cycle of getting changes implemented. Don’t water down your efforts because there are too many chefs in the kitchen. Understand the changes that need to be implemented, and take action. That’s how you win SEO-wise.
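
For reference, the disavow file that gets uploaded through Google’s disavow tool is just a plain-text list of URLs and domain: entries, one per line, with # marking comments. Here is a minimal sketch of assembling such a file; the domains and URL are made-up examples, not a recommendation of what to disavow:

```python
# Minimal sketch of building a Google disavow file.
# The domains and URL below are hypothetical examples from a link audit.
disavow_entries = [
    "# Spammy directories found during the link audit (removal requests ignored)",
    "domain:spammy-directory.example",
    "domain:paid-links.example",
    "# Individual pages linking with manipulative anchor text",
    "http://blog.example/forum/profile?id=12345",
]

# Google expects a plain-text file (UTF-8), one entry or comment per line.
with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(disavow_entries) + "\n")
```

Once uploaded, the file simply asks Google to ignore those links when assessing your site; it does not remove them from the web, so removal outreach still matters.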

Summary: Are you ready for the approaching storm?
SEO is continually moving and evolving, and it’s important that webmasters adapt quickly. Over the past few years, Google’s brilliant object-oriented approach to fighting webspam and low-quality content has yielded algorithms like Panda, Penguin, Pirate, and Above the Fold. And more are on their way. My advice is to get your situation in order now, before crossing algorithms blend a recipe of confusion that makes it exponentially harder to identify, and then fix, the problems riddling your website.

Now excuse me while I try to build a flux capacitor. 🙂


For more on this article, including the images and graphs, see:
http://omhub.wordpress.com/2014/11/12/the-danger-of-crossing-algorithms-uncovering-the-cloaked-panda-update-during-penguin-3-0/

The SEO News & Advice page was posted “By Mike Armstrong” to the SEO Blog category.

SEO Tips / SEO Advice / Blogging Advice

New post on Online Marketing Hub

How to Be the Best Answer with Topic Targeting
by christopherjanb

This weekend I had the good fortune to present at the Minnesota Blogger Conference where nearly 300 local bloggers gathered to learn, get inspired and network.

For my part, I gave a presentation on how blogs are still an incredibly useful tool for marketing. Keeping the reason for blogging top of mind, as well as maintaining empathy for how readers prefer to find, consume and act on information, is essential if a blog author expects marketing outcomes from their efforts.

When a blog or any content hub can become “the best answer” for the topics that are important for buyers, the return on blogging goes way, way up. One way to execute a content plan to become known as an authority is through topic targeting.

For experienced multi-channel and integrated marketing pros, this kind of approach is going to be fairly common. But for the vast majority of bloggers, whether they be corporate or enthusiasts, the shift from writing for yourself (or your brand) to writing to satisfy specific audience needs is a fundamental shift.

Topic targeting starts by answering a few key questions:

How do you want to be known? How do you want your product or service to be known? What are you, your brand, product or service the “best answer” for? That singular distinction is essential in order to stand out.
What questions relevant to your area of expertise do buyers have? What information do they need in order to move from curiosity to specific interest to transaction?
As you come to find the sweet spot between how you want to be known and what customers care about, that becomes the focus of your topic targeting plan.

Topic targeting is an approach that involves creating resources, experiences and connections that result in an undisputed affinity between a target topic and your brand.
On a large scale for large companies, this is essentially brand marketing. For a small or medium business without massive budgets or resources, these 3 phases below represent a practical approach to becoming the “best answer” wherever customers are looking.

Inspire:
When starting out from a position without prominent authority on your desired topic, one of the most effective ways to close the gap between where you are and where you want your brand to be is to connect with those that already have the authority and community you desire. Recognizing topical influencers in a creative and qualitative way with an emphasis on inspiring readers to think in new ways about the topic is a good start. Co-creating content with topic influencers is also particularly effective. Your target topic will drive which influencers you engage with, the questions and interactions you have, and the titling of the resulting content.

Additional inspire tactics include speaking events that are “on topic” in the conference scope, track and/or title of your presentation. Social engagement promoting target topic content and events should also align. Comments made on industry articles (blogs and online magazines) are also opportunities to create affinity. Blogging about the target topic from different perspectives (what would a buyer need to know from start to finish) is also an effective directed content effort that will contribute to becoming the best answer.

Lastly, a limited number of guest posts on relevant, high-profile blogs and contributed articles to industry magazines and websites on your target topic will provide added support for your brand and the target topic.

Desire:
Anticipation is a gateway to topical authority. Continuing to blog on the target topic and growing influencer relationships will lead to even more community engagement opportunities. Consistent creation of useful and entertaining blog content as well as alignment with industry influencers will create a very powerful mental state amongst your blog readers: anticipation. A community that can’t wait to see what you’re going to publish next will be instrumental for amplifying content and stimulating new perspectives on your target topic. That desire leads to advocacy, evangelism and scale for reaching a target audience in a highly credible way.

Acquire:
Demand for information and expertise leads to demand for your solutions. As authority is built on your target topic (represented by the content you create on your own websites, third-party references to your brand as an authority, growth of your community around the topic, and advertising activities), there are several opportunities to show more tangible evidence of expertise. Some examples include:

Case studies
Definitive topic resource/guide
Events – online and off
Industry survey and report (annual)
Lists recognizing experts in the topic (annual or quarterly)
All of these tactics provide opportunities for readers to move from awareness and learning about the topic (with your brand at the forefront) to consideration and action – leads and transactions. Consumers increasingly expect to be able to educate themselves to a purchase decision and making it easy to find, experience and act on your content isn’t just good content marketing, it’s what buyers want.

Specificity is essential with topic targeting as are patience and persistence. This is an earned achievement that also needs to be maintained. But once consensus and momentum are achieved, the ability to attract those actively seeking what you have to offer will expand the value of your content beyond lead generation and sales to other means of monetization – sponsorships, advertising, syndication.

Applying the approach mentioned in this post will require some homework – research in your market or industry to see what kinds of content and messages resonate with the target audience. That’s where the Discovery, Consumption and Action model for understanding your audience comes into play. It is also a continuous effort that can start simply and scale based on what works and what doesn’t.

But the most important thing of all is to start: How do you want to be known? How does that fit with what your customers want to know?

For more on this SEO Tips / SEO Advice / Blogging Advice post, or content marketing in general, see:
http://omhub.wordpress.com/2014/10/27/how-to-be-the-best-answer-with-topic-targeting/

The SEO Tips / SEO Advice / Blogging Advice page was posted “By Mike Armstrong”


A recent post about how users view and interact with Today’s Google Search Engine Results Page

New post on Online Marketing Hub

Eye Tracking in 2014: How Users View and Interact with Today’s Google SERPs
by christopherjanb
Posted by rMaynes1

In September 2014, Mediative released its latest eye-tracking research entitled “The Evolution of Google’s Search Engine Results Pages and Their Effects on User Behaviour”.

This large study had participants conduct various searches using Google on a desktop. For example, participants were asked “Imagine you’re moving from
Toronto to Vancouver. Use Google to find a moving company in Toronto.” Participants were all presented with the same Google SERP, no matter the search
query.

Mediative wanted to know where people look and click on the SERP the most, what role the location of the listing on the SERP plays in winning views and
clicks, and how click activity on listings has changed with the introduction of Google features such as the carousel, the knowledge graph etc.

Mediative discovered that, just as Google’s SERP has evolved over the past decade, so too has the way in which search engine users scan the page before
making a click.

Back in 2005 when
a similar eye-tracking study was conducted for the first time by Mediative (formerly Enquiro), it was
discovered that people searched in a distinctive “triangle” pattern, starting in the top left of the search results page where they expected the first
organic listing to be located, and reading across horizontally before moving their eyes down to the second organic listing, and reading horizontally, but
not quite as far. This area of concentrated gaze activity became known as Google’s “Golden Triangle”. The study concluded that if a business’s listing was
not in the Golden Triangle, its odds of being seen by a searcher were dramatically reduced.

Heat map from 2005 showing the area known as Google’s “Golden Triangle” (see link below).

But now, in 2014, the top organic results are no longer always in the top-left corner where searchers expect them to be, so they scan other areas of the
SERP, trying to seek out the top organic listing, but being distracted by other elements along the way. The #1 organic listing is shifting further down the
page, and while this listing still captures the most click activity (32.8%) regardless of what new elements are presented, the shifting location has opened
up the top of the page with more potential areas for businesses to achieve visibility.

Where scanning was once more horizontal, the adoption of mobile devices over the past 9 years has conditioned searchers to now scan more vertically—they are looking for the fastest path to the desired content, and, compared to 9 years ago, they are viewing more search results listings during a single session and spending less time viewing each one.

Searchers on Google now scan far more vertically than several years ago (see link below)

One of the biggest changes from SERPs 9 years ago to today is that Google is now trying to keep people on the results page for as long as it can.

An example is the knowledge graph. In Mediative’s study, when searchers were looking for “weather in New Orleans”, the results page that was presented to them showed exactly what they needed to know. Participants were asked to click on the result that they felt best met their needs, even if, in reality, they wouldn’t have clicked through (in order to end that task). When a knowledge graph result exactly met the intent of the searcher, the study found 80% of people looked at that result, and 44% clicked on it. Google provided searchers with a relevant enough answer to keep them on the SERP. The top organic listing captured 36.5% of page clicks—compared to 82% when the knowledge graph did not provide the searcher with the answer they were looking for.

It’s a similar case with the carousel results; when a searcher clicks on a listing, instead of going through to the listing’s website, another SERP is
presented specifically about the business, as Google tries to increase paid ad impressions/clicks on the Google search results page.

How can businesses stay on top of these changes and ensure they still get listed?
There are four main things to keep in mind:

1.
The basic fundamentals of SEO are as important as ever
Create unique, fresh content that speaks to the needs of your customers, as this will always trump chasing the algorithm. There are also on-page and off-page SEO tactics that you can employ to increase your chances of being listed in areas of the SERP other than your website’s organic listing, such as front-loading keywords in page titles and meta descriptions, getting listed on directories and ratings and reviews sites, having social pages, etc. It’s important to note that SEO strategy is no longer a one-size-fits-all approach.

2.
Consider using schema mark-up wherever possible
In Mediative’s 2014 Google SERP research, it was discovered that blog posts that had been marked up using schema to show the picture and name of the author
got a significant amount of engagement, even when quite far down the first page—these listings garnered an average of 15.5% of total page clicks.

Note:

As of August 2014, Google removed authorship markup entirely. However, the results are still a good example of how schema mark-up can be used to make your business listing stand out more on the SERP, potentially capturing more views and clicks, and therefore more website traffic.

In the study, participants were asked to “Imagine that you’re starting a business and you need to find a company to host your website. Use Google to find
information about website hosting companies”. The SERP presented is shown below:

Almost 45% of clicks went to 2 blog posts titled “Five Best Web Hosting Companies” and “10 Best Web Hosting Companies”.

In general, the top clicked posts were those that had titles including phrases such as:

“Best…”
“Reviews of…”
“Top 5…”
“How-to…”
According to Google, “On-page markup helps search engines understand the information on webpages and provide richer results…Google doesn’t use markup
for ranking purposes at this time-but rich snippets can make your web pages appear more prominently in search results, so you may see an increase in
traffic.”

Schema markup is probably the most under-utilized tool for SEO, presenting a huge opportunity for the companies that do use this Google-approved tool. Searchmetrics reported that only 0.3% of websites use schema markup, yet over a third of Google’s results contain rich snippets (additional text, images and links below the individual search results). BruceClay.com reports that rich snippets can increase the CTRs of listings by 15-50%, and that websites using schema markup tend to rank higher in search results.

Schema mark-up can be used to add star ratings, number of reviews, pricing (all shown in the listing below) and more to a search results page listing.
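
To make that concrete, here is a minimal sketch of what schema.org markup carrying a star rating, review count and price can look like. The product name, rating, review count and price below are invented for illustration; the JSON printed by this snippet would be embedded in the page’s HTML inside a script tag of type application/ld+json:

```python
import json

# Sketch of schema.org "Product" markup with a star rating, review count and price.
# All values are hypothetical placeholders; swap in your own listing's details.
product_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Web Hosting Plan",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",   # star rating surfaced in the rich snippet
        "reviewCount": "128",   # review count surfaced in the rich snippet
    },
    "offers": {
        "@type": "Offer",
        "price": "9.99",
        "priceCurrency": "USD",
    },
}

# Printed here for inspection; on a live page this JSON sits inside a
# <script type="application/ld+json"> block in the HTML.
print(json.dumps(product_markup, indent=2))
```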

3.
Know the intent of your users
Understanding what searchers are trying to discover when they conduct a search can help determine how much effort you should put into appearing in the number one organic listing, which can be an extremely difficult task without unlimited budget and resources—and, even if you do make it to the number one organic listing, traffic is not guaranteed, as discovered in this research. If you’re competing with big-name brands, or ratings and review sites, and THAT is what your customers want, then you are going to struggle to compete.

The importance of your business being the first listing versus simply being on the first page is therefore highly dependent on the searcher’s intent, plus the strength of your brand. The key is to always keep
user intent top-of-mind, and this can be established by talking to real people, rather than guessing. What are they looking for when they are searching for your site? Structure your content around what people really want and need, list your site
on the directories that people actually visit or reference, create videos (if that’s what your audience wants)—know what your actual customers are
looking for, and then provide it.

There are going to be situations when a business can’t get to number one in the organic listings. As previously mentioned, the study shows that this is still the key place to be, and the top organic listing captures more clicks than any other single listing. But if your chances of getting to that number one spot are slim, you need to focus on other areas of the SERP, such as positions #4 or higher, which will be easier to rank for—businesses that are positioned lower on the SERP (especially positions 2-4) see more click activity than they did several years ago, making this real estate much more valuable. As Gord Hotchkiss has written, searchers tend to “chunk” information on the SERP and scan each chunk in the same way they used to scan the entire SERP—in a triangle pattern. Getting listed at the top of a “chunk” can therefore be effective for many businesses. This idea of “chunking” and scanning can be seen in the heat map below.

To add to that, Mediative’s research showed that everything located above the top 4 organic listings (so, carousel results, knowledge graph, paid listings,
local listings etc.) combined captured 84% of clicks. If you can’t get your business listing to #1, but can get listed somewhere higher than #4, you have a good chance of being seen, and clicked on by searchers. Ultimately, people expect Google to continue to do its job, and respond to search queries with the most relevant results at the top. The study points out that only 1% of participants were willing to click through to Page 2 to see more results. If you’re not listed on page 1 of Google for relevant searches, you may as well not exist online.

4.
A combination of SEO and paid search can maximize your visibility in SERP areas that have the biggest impact on both branding and traffic
Even though organic listings are where many businesses are striving to be listed (and where the majority of clicks take place), it’s important not to forget about paid listings as a component of your digital strategy. Click-through rates for top sponsored listings (positions 1 and 2) have changed very little in the past decade. Where the huge change has taken place is in the ability of sponsored ads on the right rail to attract attention and clicks.

Activity on this section of the page is almost non-existent. This can be put down to a couple of factors, including searchers’ conditioned behaviour (mentioned before) of scanning more vertically, thanks to our increased mobile usage, and the fact that over the years we have learned that those results may not typically be very relevant, or as good as the organic results, so we tend not to even take the time to view them.

Mediative’s research also found that there are branding effects of paid search, even if not directly driving traffic. We asked participants to “Imagine you are traveling to New Orleans and are looking for somewhere to meet a friend for dinner in the French Quarter area. Use Google to find a restaurant.”
Participants were presented with a SERP showing 2 paid ads—the first was for opentable.com, and the second for the restaurant Remoulade, remoulade.com.

The top sponsored listing, opentable.com, was viewed by 84% of participants, and captured 26% of clicks. The second listing, remoulade.com, only captured
2% of clicks but was looked at by 73% of participants. By being seen by almost 3/4 of participants, the paid listing can increase brand affinity, and
therefore purchase (or choice) consideration in other areas! For example, if the searcher comes back and searches again another time, or clicks to opentable.com and then sees Remoulade listed, it may benefit from a higher brand affinity from having already been seen in the paid listings. Mediative conducted a Brand Lift study featuring Honda that found the more real estate that brands own on the SERP, the higher the CTR, and the higher the brand affinity, brand recognition, purchase consideration etc. Using paid search for more of a branding play is essentially free
brand advertising—while you should be prepared to get the clicks and pay for them of course, it is likely that your business listing will be seen by a large number of people without capturing the same number of clicks. Impression data can also be easily tracked with Google paid ads, so you know
exactly how many times your ad was shown, and can therefore estimate how many people actually looked at it from a branding point of view.

Rebecca Maynes is a Marketing Communications Strategist with Mediative, and was a major contributor on this study. The full study, including click-through rates for all areas of the SERP, can be downloaded at

http://ift.tt/1vuhDXI.


For more on this article or for pictures see: http://omhub.wordpress.com/2014/10/22/eye-tracking-in-2014-how-users-view-and-interact-with-todays-google-serps/

The post about how users view and interact with Today’s Google Search Engine Results Page was posted “By Mike Armstrong”

SEO & Content Marketing Advice

New post on Online Marketing Hub

How To Be Successful With Content Marketing Through Your Search Engine Strategy
by christopherjanb

Find a publicly advertised brand you identify with, from a healthy perspective, and follow their brand online.

What do you see?

Consistent values? Positive, helpful and resourceful communication? Ways to get in touch with them?

If you answer, “yes” to these questions, then that brand shows positive signs of effective content marketing through search engine strategy.

Content marketing and search engine strategy go hand-in-hand now more than ever.

Taking all the above into consideration, I did a brief case study using Kaiser Permanente to demonstrate how search strategy works. You can see for yourself how Kaiser has tapped into the power of content marketing through their search engine strategy.

In this article, I’ll walk you through the basics of optimizing Google’s search capabilities for your own brand’s content marketing success. You’ll learn about the following:

How IP addresses and location-based services play into localized search engine strategy.
The importance of branding… and why paying for Google ads may be worth your while.
Why and how Google rewards brands that create quality content.
First, let’s talk about search engine strategy…
Enter ‘Kaiser’ in the Google search bar and you’ll find this:

See link below:

You’ll notice the following in the screenshot:

1. The result numbers from your search term.

The search term, ‘Kaiser,’ yields about 47,100,000 results from across the entire Web. What does this mean to you? Take into consideration the strength you – i.e., your website – must have to be #1 on a first page that shows only about ten results.

2. A map showing you the results of the matching Kaiser locations in your area.

Your computer has what is known as an “IP address,” which is short for Internet Protocol address. Google, and other sites, use your IP address to identify computers on the Internet.

Your IP address is typically based on your real-world location, which helps Google provide you with relevant, local results. You can learn more about what Google has to say about IP addresses, including how to find yours here.
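
As a quick, hedged illustration (this uses the third-party ipify service rather than anything Google-specific), here is one way to see the public IP address that websites associate with your connection, which is the starting point for this kind of location inference:

```python
import urllib.request

# Illustration only: ask the third-party ipify service (api.ipify.org) for the
# public IP address this connection presents to websites such as Google.
with urllib.request.urlopen("https://api.ipify.org") as response:
    public_ip = response.read().decode("utf-8")

print(f"Sites see requests from this connection as coming from: {public_ip}")
```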

More people currently rely on their mobile devices to search and obtain instant and accurate results from their search efforts than ever before. Location services for mobile devices are essential for viewing maps and other local results, based on your current location. Google Maps can detect your location through the following:

GPS
WiFi
Cell Tower
Apple Location Services
Google provides helpful suggestions for improving your location’s accuracy, GPS signals, and more for your mobile device here.

3. Kaiser’s ad is the first result.

Google likes ads. I’ll talk more about that below.

4. The second result is their name in URL/website domain name format, and their top locations in your area.

Site structure, page names, and organization are important for the user experience and help Google find information quickly. IP addresses and GPS/location settings also make a difference.

5. The third result is a social profile where reviews are encouraged and unconfirmed, such as Yelp.

Google prefers sites which are based on traffic, use or advertisement procurement.

What did this case study teach me about using search engine strategy for my own content marketing?
I learned about the importance of branding and advertising
Your brand name is everything.
Dominating the webiverse with your brand name is a great end goal. Since IP addresses are key for people finding you online, you’ll want to start locally and build up from there. Google rewards brands that register their physical locations by giving them top placement within search engine results. A recent article via Vertical Response lists the top 20 places where you should list your business online.

Social profiles and reviews are important to Google.
As your brand’s name gains exposure, more people start learning about you and your name. As more people use your brand name on these sites, it also helps Google recognize you, helping your brand show up higher in Google rankings and search results.

Google likes being paid.
Paying for advertising, especially search advertising, increases the likelihood that your brand name shows up in Google search results. Using Google AdWords makes it easy to create and run ad campaigns. Depending on how much you pay and the types of campaigns you set up, you can control different features of your ad, such as where it shows up in a Google search.

I also learned more about how to create the content Google wants…and how that can work for my clients and my business…
Our website pages and content are only as valuable as the words we use.
How often do you hear the expression, “content is king”? We hear it often because, when it comes to your website ranking higher in search results, it’s still very much the truth. When it comes to creating content, we need to understand that it’s the quality of our words that matters. Not only are people searching online for content that informs and educates them, but the Google search engine also craves useful and appealing content. Google wants results which are relevant to the search and to share the best, most trustworthy results.

Moz explains the importance of creating “great content” for your site in the following way:

“ Every search performed at the engines comes with an intent – to find, learn, solve, buy, fix, treat, or understand. Search engines place web pages in their results in order to satisfy that intent in the best possible way, and crafting the most fulfilling, thorough content that addresses a searcher’s needs provides an excellent chance to earn top rankings.”

Regular content is extremely valuable.
We know now how Google prefers quality content, but Google especially rewards those sites which produce quality content regularly.

Eric Sornoso lists and explains the following 3 reasons why the Google search engine likes regular posting in his recent article for SEOBlog.com, “Google Rewards Sites That Regularly Post Great Content”:

Google likes fresh, new content.
Google likes a constant flow of content.
Google likes accountability.
Creating quality content on a regular basis certainly helps Google (and potential clients) find you, but you can also take extra steps to improve your results. For example, one of the first (and best) actions you should take, especially when your site is new, is to set up your RSS with Feedburner.

This is Google’s RSS Management tool, and it notifies Google each time your blog is updated with a new post. Even better? As you start producing content which generates higher traffic to your website, Google will continue rewarding you through more traffic referrals. It’s a win-win-win: for you, for Google, and for the searcher.
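
As a side note, FeedBurner can only notify Google of new posts if your underlying feed is valid. The snippet below is a small, hedged sketch (the feed URL is a placeholder) for sanity-checking that your blog’s RSS or Atom feed fetches and parses cleanly before you point FeedBurner at it:

```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder: replace with your blog's real RSS/Atom feed URL.
FEED_URL = "https://example.com/feed/"

def check_feed(url: str) -> None:
    """Fetch the feed and confirm it parses as XML, reporting how many entries it holds."""
    with urllib.request.urlopen(url) as response:
        root = ET.fromstring(response.read())
    # RSS 2.0 nests items under channel/item; Atom uses namespaced <entry> elements.
    items = root.findall(".//item") or root.findall(".//{http://www.w3.org/2005/Atom}entry")
    print(f"Feed parsed OK with {len(items)} entries.")

if __name__ == "__main__":
    check_feed(FEED_URL)
```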

Over to you
Without a doubt, tapping into the power of Google and forming a strong search engine strategy can take your content marketing – and your business – to entirely new levels. Are you using search engine strategy as a way to bolster your own content marketing efforts?

Author information

Jamie Teasdale
Founder & Lead Strategist at Plan Promote Prosper
Jamie Teasdale is a business growth advocate and strategist who is passionate about supporting small businesses in their quest to effectively communicate with their target market. Focusing on content marketing (inbound and outbound) and brand messaging, Jamie’s company Plan Promote Prosper assists companies who recognize the value of strategic and consistent content marketing through blogging, email marketing and social media engagement, and the impact it makes toward positive SEO.
Planning a content marketing strategy is no small endeavor and should be done each year. Jamie and her team make it easy and affordable. Plan Promote Prosper offers eight white-labeled content marketing products and services that are sought after by leading marketing companies, web firms, copywriters and social managers. Jamie’s office is located in downtown Portland, Oregon. In her spare time, Jamie enjoys painting, traveling and spending time with friends and family.

For this full article including images see:
http://omhub.wordpress.com/2014/10/15/how-to-be-successful-with-content-marketing-through-your-search-engine-strategy/

SEO & Content Marketing Advice page posted “By Mike Armstrong”