Vast majority of marketers now highly focused on user experience
Good news for those annoyed by horrifically complex user journeys: marketers are finally paying attention to your plight! It's probably good news for marketers as well, as a lack of focus on the customer journey means prospects dropping out of the marketing funnel left, right, and centre.
The chart below, based on data from the programmatic ad agency Celtra, shows how many marketers have awakened to the importance of customer experience. Very close to half said it was a high priority, while another quarter declared it their highest priority. As for the 22% of laggards who don't think about it: you may want to start! Although, in fairness, that figure may reflect a flaw in the survey, given that there was no 'not applicable' option.
If you've ever had the pleasure of purchasing a diamond, you're probably familiar with the four Cs that determine its value: Cut, Clarity, Color, and Carat.
As the Editor of a national online publication, I'm faced with the task of assessing the “value” of articles that are submitted to me every day.
The value of an article depends on its ability to resonate with readers.
Any editor will tell you that predicting what will strike a chord with an audience is an inexact science. But, over the years, I've developed my own system of three Cs that help me effectively evaluate the quality of an article.
I check for:
Clarity
Continuity
Connection
And the beauty is, you can also use the three Cs to decide whether or not your work is ready to be published.
1. Clarity
Sometimes I read an article and can't pin down exactly what the writer is trying to say. What idea are they trying to communicate? If it's not clear, the writer hasn't spent enough time creating a precise message.
Similarly, a writer may begin an article with one idea and then veer off on a related, but separate, tangent halfway through the text. We've all done it - even me.
For instance, my last post on Copyblogger was about getting comfortable with throwing away your words. In the first draft of that post, I also covered self-editing. Those two ideas are related, but the introduction of that additional idea weakened my main message. In my second draft, I deleted everything related to self-editing to bring clarity back to my primary message.
After you've written a first draft, here's a three-step process for bringing clarity to a piece of writing:
Communicate one big idea. If your article contains two big ideas, save the second one for another piece of content.
Craft a magnetic headline. Your headline must make a strong promise based on your one big idea. If it doesn't show how a reader will benefit from the article, rewrite it.
Cut extra text. Eliminate every word in your article that does not deliver on the promise made in your headline.
Once you've brought clarity to your article, you can move on to the next C.
2. Continuity
This C improves the structure of your article. Now that your headline makes a strong promise and you know the big idea you're trying to communicate, it's time to ensure your article takes the reader on a logical journey.
Here are three elements that promote continuity:
State your premise. For example, the premise of this article is that it's helpful to have a framework to evaluate the quality of your content before it's published.
Introduce and support your big idea. The big idea here is that measuring Clarity, Continuity, and Connection will help you create high-quality content. Use subheads and bullet points to reinforce your message.
Give readers a payoff. Highlight how the big idea will make their lives better and motivate them to take action now.
In short, your blog post needs to be structured in a way that naturally leads the reader to your desired conclusions and delivers a genuine payoff for them: a big “aha” moment.
3. Connection
This final C is the key to creating an article that readers will be inclined to share. It doesn't matter how clear your ideas are, how well-structured your article is, or even how informative it might be … if your readers don't connect with it, they won't feel compelled to pass it on.
The fastest path to connection is showing vulnerability. The easiest way to get vulnerable? Share a story. It doesn't need to be long, but the story must be honest - just like my confession above about the mistake I made when writing the first draft of my last Copyblogger post.
Speaking of that post, I told a longer story in that article about getting critiqued by a writing teacher who told me my work was completely vanilla. That made it very easy for readers to feel connected to me because we've all had a cringe-worthy experience like that, right?
Use the three Cs to transform the quality of your content
The three Cs remind you to remain audience-focused when creating content, and you can use them when you write content for clients as well as when you're promoting your own business.
They'll help you produce useful content readers will engage with and share.
What techniques do you use to evaluate your writing?
Through patent filings over the years, Google has explored many ways that it might use “freshness” as a ranking signal. Back in 2011, we published a popular Moz Blog post about these “Freshness Factors” for SEO. Following our own advice, this is a brand new update of that article.
In 2003, Google engineers filed a patent named Information retrieval based on historical data that shook the SEO world. The patent not only offered insight into the mind of Google engineers at the time, but also seemingly provided a roadmap for Google's algorithm for years to come.
In his series on the “10 most important search patents of all time,” Bill Slawski offers an excellent writeup showing how this patent spawned an entire family of Google child patents - the latest from October 2011.
This post doesn't attempt to describe every way Google may determine freshness to rank web pages; instead, it focuses on the areas we're most likely to influence through SEO.
Giant, great big caveat: Keep in mind that while multiple Google patent filings describe these techniques - often in great detail - we have no guarantee how Google uses them in its algorithm. While we can't be 100% certain, evidence suggests that they use at least some, and possibly many, of these techniques to rank search results.
For another take on these factors, I highly recommend reading Justin Briggs' excellent article Methods for Evaluating Freshness.
Google may determine exactly which queries require fresh content by monitoring the web and its own huge warehouse of data, including:
Search volume: Are queries for a particular term spiking (e.g. “Earthquake Los Angeles”)?
News and blog coverage: If a number of news organizations start writing about the same subject, it's likely a hot topic.
Social media: A spike in mentions of a particular topic may indicate the topic is “trending.”
While some queries need fresh content, other search queries may be better served by older content.
Fresh is often better, but not always. (More on this later.)
Below are ten ways Google may determine the freshness of your content. Images courtesy of my favorite graphic designer, Dawn Shepard.
1. Freshness by inception date
Initially, a web page can be given a “freshness” score based on its inception date, which decays over time. This freshness score may boost a piece of content for certain search queries, but degrades as the content becomes older.
The inception date is often when Google first becomes aware of the document, such as when Googlebot first indexes a document or discovers a link to it.
"For some queries, older documents may be more favorable than newer ones. As a result, it may be beneficial to adjust the score of a document based on the difference (in age) from the average age of the result set." – All captions from US Patent Document Scoring Based on Document Content Update
2. Amount of change influences freshness: How much
The age of a webpage or domain isn't the only freshness factor. Search engines can score regularly updated content for freshness differently from content that doesn't change. In this case, the amount of change on your webpage plays a role.
For example, changing a single sentence won't have as big a freshness impact as a large change to the main body text.
"Also, a document having a relatively large amount of its content updated over time might be scored differently than a document having a relatively small amount of its content updated over time."
In fact, Google may choose to ignore small changes completely. That's one reason why when I update a link on a page, I typically also update the text surrounding it. This way, Google may be less likely to ignore the change. Consider the following:
"In order to not update every link's freshness from a minor edit of a tiny unrelated part of a document, each updated document may be tested for significant changes (e.g., changes to a large portion of the document or changes to many different portions of the document) and a link's freshness may be updated (or not updated) accordingly."
3. Changes to core content matter more: How important
Changes made in “important” areas of a document will signal freshness differently than changes made in less important content.
Less important content includes:
JavaScript
Comments
Advertisements
Navigation
Boilerplate material
Date/time tags
Conversely, “important” content often means the main body text.
So simply changing out the links in your sidebar, or updating your footer copy, likely won't be considered a signal of freshness.
"…content deemed to be unimportant if updated/changed, such as Javascript, comments, advertisements, navigational elements, boilerplate material, or date/time tags, may be given relatively little weight or even ignored altogether when determining UA."
This brings up the issue of timestamps on a page. Some webmasters like to update timestamps regularly - sometimes in an attempt to fake freshness - but there is conflicting evidence on how well this works. Suffice it to say, the freshness signals are likely much stronger when you keep the actual page content itself fresh and updated.
4. The rate of document change: How often
Content that changes more often is scored differently than content that only changes every few years.
For example, consider the homepage of the New York Times, which updates every day and has a high degree of change.
"For example, a document whose content is edited often may be scored differently than a document whose content remains static over time. Also, a document having a relatively large amount of its content updated over time might be scored differently than a document having a relatively small amount of its content updated over time."
Google may treat links from these pages differently as well (more on this below). For example, a fresh “link of the day” from the Yahoo homepage may be assigned less significance than a link that remains in place more permanently.
5. New page creation
Instead of revising individual pages, fresh websites often add completely new pages over time. (This is the case with most blogs.) Websites that add new pages at a higher rate may earn a higher freshness score than sites that add content less frequently.
"UA may also be determined as a function of one or more factors, such as the number of 'new' or unique pages associated with a document over a period of time. Another factor might include the ratio of the number of new or unique pages associated with a document over a period of time versus the total number of pages associated with that document."
Some webmasters advocate adding 20–30% new pages to your site every year. Personally, I don't believe this is necessary as long as you send other freshness signals, including keeping your content up-to-date and regularly earning new links.
6. Rate of new link growth signals freshness
Not all freshness signals are restricted to the page itself. Many external signals can indicate freshness as well, oftentimes with powerful results.
If a webpage sees an increase in its link growth rate, this can signal relevance to search engines. For example, if folks start linking to your personal website because you're about to get married, your site could be deemed more relevant and fresh (as far as this current event goes).
"…a downward trend in the number or rate of new links (e.g., based on a comparison of the number or rate of new links in a recent time period versus an older time period) over time could signal to search engine 125 that a document is stale, in which case search engine 125 may decrease the document's score."
Be warned: an unusual increase in linking activity can also indicate spam or manipulative link building techniques. Search engines are likely to devalue such behavior. Natural link growth over time is usually the best bet.
7. Links from fresh sites pass fresh value
Links from sites that have a high freshness score themselves can raise the freshness score of the sites they link to.
For example, if you obtain a link from an old, static site that hasn't been updated in years, it may not pass the same level of freshness value as a link from a fresh page, e.g. the homepage of Wired. Justin Briggs coined the term "FreshRank" for this.
"Document S may be considered fresh if n% of the links to S are fresh or if the documents containing forward links to S are considered fresh."
8. Traffic and engagement metrics may signal freshness
When Google presents a list of search results to users, the results the users choose and how much time they spend on each one can be used as an indicator of freshness and relevance.
For example, if users consistently click a search result further down the list, and they spend much more time engaged with that page than the other results, this may mean the result is more fresh and relevant.
"If a document is returned for a certain query and over time, or within a given time window, users spend either more or less time on average on the document given the same or similar query, then this may be used as an indication that the document is fresh or stale, respectively."
You might interpret this to mean that click-through rate is a ranking factor, but that's not necessarily the case. A more nuanced interpretation might say that the increased clicks tell Google there is a hot interest in the topic, and this page - and others like it - happen to match user intent.
For a more detailed explanation of this CTR phenomenon, I highly recommend reading Eric Enge's excellent article about CTR as a ranking factor.
9. Changes in anchor text may devalue links
If the subject of a web page changes dramatically over time, it makes sense that any new anchor text pointing to the page will change as well.
For example, if you buy a domain about racing cars, then shift its focus to content about baking, over time your new incoming anchor text will shift from cars to cookies.
In this instance, Google might determine that your site has changed so much that the old anchor text is now stale (the opposite of fresh) and devalue those older links entirely.
"The date of appearance/change of the document pointed to by the link may be a good indicator of the freshness of the anchor text based on the theory that good anchor text may go unchanged when a document gets updated if it is still relevant and good."
The lesson here is that if you update a page, don't deviate too much from the original context or you may risk losing equity from your pre-existing links.
10. Older is often better
Google understands the newest result isn't always the best. Consider a search query for "Magna Carta." An older, authoritative result may be best here.
In this case, having a well-aged document may actually help you.
Google's patent suggests it determines the freshness requirement for a query based on the average age of documents returned for the query.
"For some queries, documents with content that has not recently changed may be more favorable than documents with content that has recently changed. As a result, it may be beneficial to adjust the score of a document based on the difference from the average date-of-change of the result set."
A good way to determine this is to simply Google your search term, and gauge the average inception age of the pages returned in the results. If they all appear more than a few years old, a brand-new fresh page may have a hard time competing.
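For illustration, the adjustment the patent describes might look like a penalty proportional to how far a page's age sits from the result set's average; the weight is an invented constant, not a known Google value.

```python
def age_adjusted_score(base_score: float, page_age_days: float,
                       result_set_ages_days: list[float],
                       weight: float = 0.0001) -> float:
    """Dock a page's score by its distance from the result set's average age."""
    avg_age = sum(result_set_ages_days) / len(result_set_ages_days)
    return base_score - weight * abs(page_age_days - avg_age)

# A brand-new page competing against decade-old "Magna Carta" results
# takes a larger penalty than a similarly aged page would.
print(age_adjusted_score(1.0, 30, [3650, 4000, 3800]))
```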
Freshness best practices
The goal here shouldn't be to update your site simply for the sake of updating it and hoping for better ranking. If this is your practice, you'll likely be frustrated with a lack of results.
Instead, your goal should be to update your site in a timely manner that benefits users, with an aim of increasing clicks, user engagement, and fresh links. These are the clearest signals you can pass to Google to show that your site is fresh and deserving of high rankings.
Aside from updating older content, other best practices include:
Create new content regularly.
When updating, focus on core content, not unimportant boilerplate material.
Keep in mind that small changes may be ignored. If you're going to update a link, consider updating the surrounding text as well.
Steady link growth is almost always better than spiky, inconsistent link growth.
All other things being equal, links from fresher pages likely pass more value than links from stale pages.
Engagement metrics are your friend. Work to increase clicks and user satisfaction.
If you change the topic of a page too much, older links to the page may lose value.
Updating older content works amazingly well when you also earn fresh links to the content. A perfect example of this is when Geoff Kenyon updated his Technical Site Audit Checklist post on Moz. You can see the before and after results below:
Be fresh.
Be relevant.
Most important, be useful.
Adult Swim is debuting a new show - Brad Neely's Harg Nallin Sclopio Peepio - on July 10. If you want to catch it before the premiere, you can view the entire first episode on your favorite six-second streaming platform, Vine. "I just watched an adult swim show on vine. Long form is dope pic.twitter.com/Oyfx3QMgnF" - Kenny Knox (@KennyKnox97) June 27, 2016. The initial clip is the typical six-second, repeating Vine. After clicking through, the player launches a full-screen video that lets you catch the episode in its entirety - nearly 10 minutes. If you want…
In late 2016 the tipping point will be reached, as digital ad spend outstrips TV advertising for the first time
If the forecasts based on current data from eMarketer are correct, the US is about to spend more on digital ads than it does on TV for the first time ever. This is the culmination of years of growth in digital advertising, but it has been given a massive boost by the explosion of mobile ads over the last couple of years. From virtually nothing a few years ago, mobile advertising is now booming. Now that over 50% of internet users are on mobile devices, it makes sense for budgets to shift to digital spending.
The data in this chart also points to other mega-trends, such as increased online video streaming rather than cable TV viewing, as well as increased time spent on social networking sites. These all combine to mean we are increasingly spending large chunks of our free time online rather than watching TV (or often doing both, and giving neither our full attention). As such, ad spend is shifting towards where the eyeballs are.
This is good news for those of us in the digital marketing industry, but it is not the time to be complacent. Many advertisers still prefer TV for brand awareness, so digital must prove it can tell stories as well as generate clicks.
This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author's views are entirely his or her own and may not reflect the views of Moz, Inc.
As SEOs, we often find ourselves facing new changes implemented by search engines that impact how our clients' websites perform in the SERPs. With each change, it's important that we look beyond its immediate impact and think about its future implications so that we can try to answer this question: "If I were Google, why would I do that?"
Recently, Google implemented a series of manual penalties that affected sites deemed to have unnatural outbound links. Webmasters of affected sites received messages like this in Google Search Console:
Webmasters were notified in an email that Google had detected a pattern of "unnatural, artificial, deceptive, or manipulative outbound links." The manual action itself described the links as being either "unnatural or irrelevant."
The responses from webmasters varied in their usual extreme fashion, with recommendations ranging from "do nothing" to "nofollow every outbound link on your site."
Google's John Mueller posted in product forums that you don't need to nofollow every link on your site, but you should focus on nofollowing links that point to a product, sales, or social media page as the result of an exchange.
Now, on to the fun part of being an SEO: looking at a problem and trying to reverse-engineer Google's intentions to decipher the implications this could have on our industry, clients, and strategy.
The intent of this post is not to dispute the opinions that this action specifically targeted bloggers who placed dofollow links in product/business reviews, but to present a few ideas to incite discussion of the potential big-picture strategy that could be at play here.
A few concepts that influenced my thought process are as follows:
Penguin has repeatedly missed its "launch date," which indicates that Google engineers don't feel it's accurate enough to release into the wild.
The growth of negative SEO makes it even more difficult for Google to identify/penalize sites for tactics that are not implemented on their own websites.
Penguin temporarily impacted link-building markets in a way Google would want. The decline reached its plateau in July 2015, as shown in this graph from Google Trends:
If I were Google, I would expect webmasters impacted by the unnatural outbound links penalty to respond in one of these ways:
Do nothing. The penalty is specifically stated to "discount the trust in links on your site." As a webmaster, do you really care if Google trusts the outbound links on your site or not? What about if the penalty does not impact your traffic, rankings, visibility, etc.? What incentive do you have to take any action? Even if you sell links, if the information is not publicly displayed, this does nothing to harm your link-selling business.
Innocent site cleanup effort. A legitimate site that has not exchanged goods for links (or wants to pretend they haven't) would simply go through their site and remove any links that they feel may have triggered the issue and then maybe file a bland reconsideration request stating as much.
Guilty site cleanup effort. A site that has participated in link schemes would know exactly which links are the offenders and remove them. Now, depending on the business owner, some might then file a reconsideration request saying, "I'm sorry, so-and-so paid me to do it, and I'll never do it again." Others may simply state, "Yes, we have identified the problem and corrected it."
In scenario No. 1, Google wins because this helps further the fear, uncertainty, and doubt (FUD) campaigns around link development. It becomes impossible to know whether a site's outbound links have value, because the site may carry a penalty that prevents them from passing value. So link building not only continues to carry the risk of creating a penalty on your site, but it also becomes more obvious that you could exchange goods/money/services for a link that has no value, despite its MozRank or any other external "ranking" metric.
In scenarios No. 2 and No. 3, Google wins because they can monitor the links that have been nofollowed/removed and add potential link scheme violators to training data.
In scenario No. 3, they may be able to procure evidence of sites participating in link schemes through admissions by webmasters who sold the links.
If I were Google, I would really love to have a control group of known sites participating in link schemes to further develop my machine-learned algorithm for detecting link profile manipulation. I would take the unnatural outbound link data from scenario No. 3 above and run those sites as a data set against Penguin, aiming for 100% confidence, knowing that all those sites definitely participated in link schemes. Then I would tweak Penguin with this training dataset and issue manual actions against the linked sites.
This wouldn't be the first time SEOs have predicted a Google subtext of leveraging webmasters and their data to help them further develop their algorithms for link penalties. In 2012, the SEO industry was skeptical regarding the use of the disavow tool and whether or not Google was crowdsourcing webmasters for their spam team.
"Clearly there are link schemes that cannot be caught through the standard algorithm. That's one of the reasons why there are manual actions. It's within the realm of possibilities that disavow data can be used to confirm how well they're catching spam, as well as identifying spam they couldn't catch automatically. For example, when web publishers disavow sites that were not caught by the algorithm, this can suggest a new area for quality control to look into." - Roger Montti, Martinibuster.com
What objectives could the unnatural outbound links penalties accomplish?
Legit webmasters could become more afraid to sell/place links because they get "penalized."
Spammy webmasters could continue selling links from their penalized sites, which would add to the confusion and devaluation of link markets.
Webmasters could become afraid to buy/exchange links because they could get scammed by penalized sites and be more likely to be outed by the legitimate sites.
The Penguin algorithm could have increased confidence scoring and become ready for real-time implementation.
"There was a time when Google would devalue the PR of a site that was caught selling links. With that signal gone, and Google going after outbound links, it is now more difficult than ever to know whether a link acquired is really of value." -- Russ Jones, Principal Search Scientist at MOZ
Again, if I were Google, the next generation of Penguin would likely heavily weight irrelevantly placed links, and not just commercial keyword-specific anchor text. Testing this first on the sites I think are guilty of providing the links and simply devaluing those links seems much smarter. Of course, at this point, there is no specific evidence to indicate Google's intention behind the unnatural outbound links penalties were intended as a final testing phase for Penguin and to further devalue the manipulated link market. But if I were Google, that's exactly what I would be doing.
"Gone are the days of easily repeatable link building strategies. Acquiring links shouldn't be easy, and Penguin will continue to change the search marketing landscape whether we like it or not. I, for one, welcome our artificially intelligent overlords. Future iterations of the Penguin algorithm will further solidify the “difficulty level” of link acquisition, making spam less popular and forcing businesses toward legitimate marketing strategies." - Tripp Hamilton, Product Manager at Removeem.com
Google's webmaster guidelines show that link schemes are interpreted by intent. I wonder what happens if I start nofollowing links from my site with the intent of devaluing a site's rankings? The intent is manipulation. Am I at risk of being considered a participant in link schemes? If I do link building as part of an SEO campaign, am I inherently conducting a link scheme?
So, since I'm an SEO, not Google, I have to ask myself and my colleagues, "What does this do to change or reinforce my SEO efforts?" I immediately think back to a Whiteboard Friday from a few years ago that discusses the Rules of Link Building.
"At its best, good link building is indistinguishable from good marketing." - Cyrus Shepard, former Content Astronaut at Moz
When asked what type of impact SEOs should expect from this, Garrett French from Citation Labs shared:
"Clearly this new effort by Google will start to dry up the dofollow sponsored post, sponsored review marketplace. Watch for prices to drop over the next few months and then go back and test reviews with nofollowed links to see which ones actually drive converting traffic! If you can't stomach paying for nofollowed links then it's time to get creative and return to old-fashioned, story-driven blog PR. It doesn't scale well, but it works well for natural links."
In conclusion, as SEOs, we are responsible for predicting the future of our industry; we do not simply act in the present. Google does not wish for its results to be gamed and has departments full of data scientists dedicated to building algorithms that identify and devalue manipulative practices. If you are incapable of legitimately building links, then you must mimic legitimate links in all aspects (or consider a new career).
Takeaways
Most importantly, any links that we try to build must provide value. If a URL links to a landing page that is not contextually relevant to its source page, then this irrelevant link is likely to be flagged and devalued. Remember, Google can do topical analysis, too.
In link cleanup mode or Penguin recovery, we've typically treated links as obviously unnatural when they use commercial keywords (e.g. "insurance quotes") as anchor text, because natural links more often use the URL, brand, or navigational labels. It's also safe to assume that natural links tend to appear in content related to the destination they point to, so link relevance should be considered.
Finally, we should continue to identify and present clients with methods for naturally building authority by providing value in what they offer and working to build real relationships and brand advocates.
What are your thoughts? Do you agree? Disagree?
Sundar Pichai isn't going to have a happy start to his week. The Google CEO's Quora account appears to have been hacked by a group called OurMine, which broke into Facebook boss Mark Zuckerberg's Twitter and Pinterest accounts earlier this month. The three-man hacker outfit has been posting messages on Quora through Pichai's account; the account is also connected to his Twitter account, and as a result OurMine was able to publicize its hack to all 508,000 of his followers. The tweets have now been removed, but we've got a screenshot. OurMine has been targeting major tech execs of late,…
Apple is again celebrating Pride in San Francisco, but this time it is gifting employees a rainbow Apple Watch band for taking part in the festivities. Many Apple employees have taken to Twitter and Instagram to show off the band, which comes with a card letting them know that their participation earned them the band. It doesn't appear that Apple is making the band available for purchase, so expect this to be one of the hottest eBay items ever if an employee or 12 decides to cash in.