Google has begun displaying multiple Quick Answers on certain search engine result pages (SERP). How does this impact optimizing for the Quick Answer box and featured snippet placements? Buried in a January article on Google’s The Keyword webmaster blog was a mention that the search engine giant was going to start displaying more than one […]

The post Google Adds Multiple Quick Answers to SERPs: What It Means for SEO appeared first on BrightEdge SEO Blog.

Digital Marketers Point to AI-Powered Applications as Their Next Big Trends

By MarketingCharts

What's "the next big trend" in marketing? BrightEdge asked more than 500 search, content, and digital marketers that very question.

Search Engine Optimization 101

This webinar will help you build a strong foundation of SEO knowledge.

Available on-demand

Whether you're new to the world of SEO or looking to help educate others in your organization, this webinar will help you build a strong foundation of knowledge. We will talk through real-world examples and best practices.

  • What is SEO?
  • How do search engines work?
  • How to succeed in SEO?
  • How to identify quick wins?

Download now.

AI, Content and Search: 5 Macro Market Trends for Micro Marketing

By Andy Betts

A recent BrightEdge survey of over 500 search, content and digital marketers found that more than 30% of respondents cited a better understanding of the customer as the foremost advantage.

WhiteHat Security Uses Smart Content Framework

By cbao · Posted 8 years 1 month ago · 9 min read

We recently caught up with Avi Bhatnagar, Senior Director of Digital Strategy at WhiteHat Security, to learn about how he increased website traffic and leads using the BrightEdge Smart Content framework. Avi oversees several marketing channels and often comes up with a new approach for one program that he later reapplies elsewhere. The interview below covers his recent experiences.

The New Era Of Smart Content

Smart Content framework interview

BrightEdge: Avi, what does WhiteHat Security do?

Avi Bhatnagar: WhiteHat Security has been in the business of securing applications for over 15 years. It offers a cloud service that allows organizations to bridge the gap between Security and Developers. We are the leading application security provider that combines the best of technology and human intelligence to ensure a safe digital life for our customers.

BE: You’ve been focusing on brand awareness and top-of-the-funnel traffic lately. Tell us about that.

AB: Yes, as we all know, the last few years have seen an enormous increase in data breaches, malware attacks, and other security nightmares. Just a few months ago, several high-profile data breaches were reported, including the news about Equifax. It became a priority to educate security teams and developers about the importance of security best practices in the application development process, as opposed to treating security as an afterthought. Increasing brand awareness through valuable content became an obvious opportunity. Therefore, we developed a Knowledge Center glossary for our audiences to leverage as a source for Quick Answers. We used a framework called Smart Content to ensure that those new website pages would be optimized for search engines and would reach audiences much faster than traditional content does.

BE: How did you apply the Smart Content framework?

AB: Traditional content development starts with an ideation process around topics that are aligned with our integrated marketing campaigns. It’s a very inside-out approach: the story is based on a particular topic, so the persona is limited to a specific target audience, and there’s always a challenge in marketing this content to a niche persona.

Smart Content takes an outside-in approach. It evaluates the marketplace of ideas: what questions prospects and customers are asking, and how well brands and their competitors are addressing those questions with quality content. We took a comprehensive approach by partnering with our Engineering teams to define topics that were outside of our product taxonomy. We looked at specific questions that our target audience raised around security testing tools and application vulnerabilities to specific threats and attacks, leveraged the results from our Stats Reports, and developed glossary entries specifically for these topics.

More and more brands are realizing that search engines are a great way to reach new audiences and establish wider brand awareness. In traditional content development, a writer creates a piece of content and publishes it, and then a search engine optimization (SEO) professional needs to optimize it to make it more visible in search. Smart Content flips this approach on its head: it injects all the SEO and mobile-friendliness wisdom into the content right in the writing process, which helps ensure that the content is optimized from the get-go. That makes it easy for search engines to understand the content, so the content ranks high on search result pages.

BE: There’s an almost perfect parallel between Smart Content, which bakes SEO into the content writing process, and WhiteHat Security, which bakes security practices into the application development process.

AB: It’s almost poetic, isn’t it? Yeah, maybe that’s why I got so drawn to Smart Content.

BE: You were able to increase website leads too, which is a primary business metric for you guys.

AB: Yes, we increased our lead conversions by 150% year-over-year by implementing this type of Smart Content. Again, in traditional content development, the messaging is focused on writing the value statements but risks neglecting content performance. The Subject Matter Experts (SMEs) write to explain things, but oftentimes pay little or no attention to the marketing funnel or customer journey. Smart Content focuses on content performance: from prioritizing content topics that have strong demand yet weak competition, to driving relevant internal links that help bots and humans find the content and take the desired CTA (Call to Action). Performance is key.

BE: So what happened next?

AB: The glossary took off instantaneously and started to generate almost 10% of our overall traffic within months. With Smart Content, we have increased our organic search traffic by 60% year-over-year. I was amazed by how fast the content performed on the SERPs, so I developed a hypothesis that maybe I could also increase my SEM efficiency if I tied my paid search ads to the Smart Content pages.

I ran an experiment in Integrated Search - SEO + SEM alignment. I placed bids on some of my glossary terms and set my Smart Content pages as the landing pages in my SEM ads. One example is the term "os command injection," for which we rank in position six organically.

My destination URL is the Smart Content glossary page for this term. The content is well-tuned for SEO and has a clear CTA directly related to the topic. I also included dynamic content recommendations at the bottom of the page to keep non-converting readers from abandoning the site.

Using Smart Content to power paid search campaigns has proven fruitful: because the pages were so well tuned, Quality Scores increased from a site average of 3.4 to 6.1, so I was able to maintain my ad positions while reducing Cost-Per-Click by about 15%.
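A commonly cited simplified model of Google's ad auction shows why a higher Quality Score lowers Cost-Per-Click: roughly, your actual CPC is the ad rank of the advertiser below you divided by your own Quality Score, plus one cent. The sketch below uses purely illustrative numbers, not WhiteHat's actual auction data; real savings also depend on competing bids and positions, which is why a toy calculation like this will not match the 15% figure above.

```python
def actual_cpc(competitor_ad_rank: float, quality_score: float) -> float:
    """Simplified second-price model: you pay just enough to beat
    the ad rank of the advertiser immediately below you."""
    return round(competitor_ad_rank / quality_score + 0.01, 2)

# Illustrative numbers only: same competitor, Quality Score 3.4 vs 6.1.
before = actual_cpc(competitor_ad_rank=16.0, quality_score=3.4)
after = actual_cpc(competitor_ad_rank=16.0, quality_score=6.1)
print(f"CPC at QS 3.4: ${before:.2f}")
print(f"CPC at QS 6.1: ${after:.2f}")
print(f"Reduction: {1 - after / before:.0%}")
```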

When I compared the Smart Content SEM campaigns with the traditional SEM campaigns, Click-Through-Rate increased by 55%, and Form Submissions - our measure for lead generation activities - increased by a substantial 125%.

BE: What was the reaction when people saw the results?

AB: The extended team was ecstatic to have such positive results and see the true value of investing in organic and paid search together. I used the results to make a case for - and was able to secure - additional SEM budget. I will increase the spend on paid search ads that will drive relevant new visitors to my Smart Content pages.

BE: What’s the key lesson from all of this?

AB: Marketing organizations need to realize that creating content to tell a story isn’t enough. The new mindset to adopt is "if you optimize it, they will come." Content should function more like connective tissue between what customers are looking for and the actions that brands ultimately want them to take.

Content development must be pragmatic: it should focus on the precise interests and intents that your target audiences care about most, some of which may seem to be outside of your product terminology, blog topics, or typical taxonomy. While it’s important to provide something interesting, useful, or shareable, branded content should also clearly lead readers to conversion points so they don’t have to guess what to do next.

This helps to continuously build your website into a marketing engine.

BE: Thanks for sharing your experiences with us.

AB: My pleasure. And if anyone wants to learn more, feel free to DM me on Twitter: @avibay.   


The post WhiteHat Security Increases SEM Efficiency Using Smart Content Framework appeared first on BrightEdge SEO Blog.

60% of Enterprise Marketers Set To Use Artificial Intelligence (AI) in Content Marketing Strategy This Year

60% of Enterprise Marketers Set To Use AI in Content Strategy in 2018

More than 500 surveyed marketers provide insight on AI-adoption tipping point, with key benefits aligned around personalization and customer experience

A new survey released by BrightEdge, the leader in enterprise SEO and content performance marketing, reveals marketers have become more receptive to adoption of artificial intelligence technologies, such as machine learning, and their capabilities to deliver better customer experience and marketing performance. In its second annual “Future of Marketing” survey, BrightEdge surveyed over 500 digital marketers at Fortune 500 brands split evenly across both B2B and B2C companies over a one-month period in February 2018. The “Future of Marketing” survey offers a bevy of insights on the current state of how brands are using AI to deliver a more personalized customer experience through smart content.

In the 2017 “Future of Marketing” survey, marketers believed in a future ruled by the likes of Artificial Intelligence (AI) and voice search. Yet, most in-house marketers and agency practitioners had done little to introduce these powerful components into their digital marketing strategy to deliver compelling customer experiences that perform.

In 2018, marketer responses on the “Future of Marketing” reveal quite different patterns of technology adoption and benefits. Most notably, this year’s findings revealed 60 percent of marketers intend to use artificial intelligence (AI) to develop content marketing strategy. 

In this year’s survey, nearly 75% of marketers’ responses clustered around three dominant trends, each of which requires and uses artificial intelligence today. According to marketers, the next big technologies are as follows:

  • Personalization - 29% of respondents
  • Artificial Intelligence - 26% of respondents
  • Voice Search - 22% of respondents

“Despite some of the hype surrounding artificial intelligence, this survey shows that AI is very real and marketers are adopting AI-first technology in search and content marketing sectors faster than most,”  said Jim Yu, CEO of BrightEdge. “The insights that AI brings allows marketers to make smarter and faster decisions to deliver compelling customer experiences that perform.”

Finding #1: Marketers Set to use AI to Develop their Content Marketing Strategies

How likely are you to use artificial intelligence? Survey results

Nearly 60% of all marketers responded that they plan to use artificial intelligence in their content marketing strategy. In 2017, only 43% of respondents said they were likely to use artificial intelligence or deep learning to develop their content marketing strategy, while 50% said they were “very unlikely.” This is a major finding and an important leap forward for the modern data-driven marketer. Marketers are adopting cutting-edge technologies to deliver more personalized customer experiences:

  • 42 percent of marketers responded they are somewhat likely to use Artificial Intelligence, Machine Learning and Deep Learning to develop their content marketing strategy
  • 17 percent of marketers are very likely to implement AI and Deep Learning
  • 4 percent have already implemented AI and Deep Learning

“Artificial Intelligence and Machine Learning is an interesting area we’ve been exploring to weave into our SEO strategy at IBM, and our partnership with BrightEdge is critical to staying ahead of the digital marketing curve,” said Ellen Mamedov, Head of SEO at IBM.

Finding #2: AI is Enabling Marketers to Understand the Customer, Drive Productivity and Create Better Performing Content

What success stories are you seeing with AI? Survey results

Marketers already using AI indicated it is delivering them benefits, particularly in terms of understanding the customer and driving productivity.

  • 31 percent have a better understanding of the customer, helping to drive more personalized customer experiences
  • 27.1 percent of respondents said it is driving more productivity and time-savings
  • 14.5 percent indicated better-performing content
  • 8.5 percent of respondents said it is driving an increase in ROI

“Digital marketing has become very complex, from opportunity identification through content creation to optimization. BrightEdge brings the AI firepower to my job and makes it easier for me to succeed,” said Eugene Feygin of Sears.

Finding #3: Obstacles to AI Adoption and a Wake-Up Call for MarTech

Marketers have gone through plenty of hype cycles. From the social craze to the big data movement, the next big thing comes and goes. But AI is real, and its potential can’t be overstated. Companies like Google and Amazon are transforming into AI-first companies and accelerating the pace of change. From Google’s RankBrain algorithm to voice search to Amazon’s product recommendations, AI is driving change in the market faster than ever before.

With that said, AI is still met with confusion and, in some cases, skepticism. The early signs of a possible AI divide are emerging in the digital marketing space. Some marketers and companies are making the push towards AI-First. Others are not, and the typical challenges facing marketers are still prevalent:

  • 37 percent of marketers have no plans to use AI and Deep Learning, especially around the development of their content marketing strategy
  • 10 percent of respondents said they believe AI is all hype or lack an understanding of what AI is and what AI really does for a marketer
  • In terms of obstacles, marketers indicated confusion about what is and is not AI (30 percent) and limited budget (28 percent) as the greatest deterrents to adoption

Finding #4: 58 Percent of Marketers Do Not Have a Data Scientist in Their Organization

Do you have a data scientist at your company? Survey results

One of the key challenges marketers highlighted is the absence of a data scientist in the organization: 58% of marketers surveyed said they did not have a data scientist, with an additional 10 percent ‘not sure.’ In the absence of a data scientist, brands often turn to AI to help them uncover insights.

Finding #5: AI has Become “Mission Critical” to Marketers

The survey also asked whether AI is mission-critical in current marketing technologies. A majority of respondents (54 percent) said AI is a must-have, mission-critical, or important to have. This is a tipping point: for the first time, the majority of marketers see AI as a key feature in the marketing technology they purchase. Additionally, 32% of respondents agreed marketing technologies must integrate AI into their current role and workflow.

However, more than a quarter of marketers remain skeptical of the benefits of artificial intelligence, with 29% of respondents saying they see AI as an added feature rather than a requirement.

Marketers looking to integrate AI into their strategies can start by researching the technology partners already in their workflow, looking at which of their current platforms apply AI, and learning about the power of AI. The most innovative companies are natively integrating AI into their platforms, making it possible for any marketer to be an AI-first marketer today.

You can download the full survey, results and charts below:

BrightEdge Future of Marketing and AI Survey

 

About BrightEdge:

BrightEdge, the global leader in enterprise organic search and content performance, empowers marketers to transform online content into business results such as traffic, conversions and revenue. The BrightEdge S3 platform is powered by a sophisticated deep learning engine, and BrightEdge is the only company capable of web-wide, real-time measurement of content engagement across all digital channels, including search, social and mobile. BrightEdge’s 1,500+ customers include global brands such as 3M, Microsoft and Nike, as well as 57 of the Fortune 100. The company has eight offices worldwide and is headquartered in Foster City, California.

 

PR Inquiries: brightedge@theabbiagency.com

 


How Do Manual Crawl Limits Impact Crawl Management

By enewton@brightedge.com · Posted 8 years 1 month ago · 9 min read

Earlier this month, Google made a subtle change in their Search Console that many overlooked: they changed the maximum quotas for the number of times you can manually request that Google crawl a URL. Previously, you could request that Google crawl a selected URL 500 times per 30-day period, or crawl a selected URL and all of its directly linked pages 10 times in a 3-day period. Under the new limits, you can request a specific URL 10 times per day, and a specific URL plus the pages it links to directly 2 times per day.

For many marketers, this change will not cause much concern or alter their SEO reports, because they only submit URLs when they update a page, and few would do so above the new maximum frequency. BrightEdge always keeps customers apprised of industry changes, and this one gives us a chance to reiterate the importance of the fundamentals of site hygiene, taxonomy, internal links, robots.txt, and sitemaps, and how they influence the crawl, crawl budget, and indexation of site pages.

What is crawl management and why is it important?

Crawl management describes SEOs’ efforts to control how search engines crawl their sites, including the pages they read and how they navigate the website. Search engine spiders can enter and exit a domain in any order: they might land on your site through a random product page and exit through your homepage. Once the spider has landed on your site, however, you want to ensure that it can navigate your domain easily. The way your site links together will direct the spider to new pages and tell it how the site is organized, which helps it determine where your brand’s expertise lies.

The crawling of your site remains critical to how your pages rank on the SERP. The spider logs the information about your page, which is used to help determine your rankings; after you update a particular page, the new version will not impact your rankings until Google returns to crawl it again. Crawl management can also protect your reputation with Google: through techniques like configuring your robots.txt file, you can keep spiders away from certain content, such as duplicate content, and avoid the associated penalties.

Google uses a crawl budget to determine how much of your content to crawl and when. According to Google’s Gary Illyes, however, crawl budget should not be a main priority for the majority of sites; for sites with large numbers of pages, it is more of a consideration. Illyes reports that Google determines crawl budget based on two main factors:

  1. How often and how fast the Google spider can crawl your site without hurting your server. If the spider detects that crawling your site slows it down for users, it will crawl less often.
  2. The demand for your content, including the popularity of your website as well as the freshness of the rest of the content on the topic.

Webmasters with thousands of pages, therefore, will want to carefully consider their site’s ability to handle the crawls, their site speed, and the freshness of their content to boost their crawl budget.

How do I successfully control crawl management?

Successful crawl management requires building a site that is user- and bot-friendly, while also carefully protecting pages you do not want the bot to find. Consider the site structure before you design or redesign a website. The taxonomy you select will determine how you categorize and link posts, and linking posts in a purposeful way will help both your readers and the bot understand your structure. When creating your links, remember to focus on people first and bots second: the main priority should be providing value, not just making the bot happy.

As you build your website, you may also find it useful to mark certain pages as "Disallow" within robots.txt to indicate that you do not want search engines to crawl them. This could be for a variety of reasons, such as duplicate content or a page that you are still building and do not want included just yet. These directives let you steer search engine spiders away from certain content, protecting the reputation of your site.
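As an illustration of the Disallow directives described above, a minimal robots.txt might look like the following (the paths and sitemap URL are hypothetical):

```
# Hypothetical robots.txt sketch - paths are illustrative
User-agent: *
Disallow: /drafts/     # pages still under construction
Disallow: /search      # parameterized duplicate content
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```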

Crawl management could not be complete without also discussing your sitemap. Google recommends that when constructing a sitemap, you carefully look through each page and determine the canonical version of it to help avoid any duplicate content penalties. You can then submit your sitemap to Google to tell them how to crawl your site. It will help ensure that Google can find all of your content, even if your efforts to build links throughout your site have not been particularly strong.
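For illustration, a bare-bones sitemap that lists only the canonical version of each page might look like this (URLs and dates are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List only the canonical URL for each page to avoid duplicate-content signals -->
  <url>
    <loc>https://www.example.com/glossary/sql-injection</loc>
    <lastmod>2018-03-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/glossary/cross-site-scripting</loc>
    <lastmod>2018-02-15</lastmod>
  </url>
</urlset>
```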

How will the Google changes impact crawl management?

Google has changed their crawl limit quotas. They have also removed the quota statement on the actual ‘fetch as Google’ space - although the limit still remains clearly stated on the Google help page. These adjustments appear to be in response to abuse of the feature. Google wants to discourage people from manually asking Google to crawl individual pages. It wants people to instead focus on building websites that are naturally crawlable and generating useful sitemaps. This push encourages better-designed websites and an improved experience for users.

What you need to know about the changes to the crawl limits

  1. Google has changed the quotas for the number of URLs you can manually request Google crawls.
  2. Google has made these changes to encourage people to create naturally crawlable sites so they do not need to manually request each URL to be crawled.
  3. Using crawl management best practices - including quality site organization or taxonomy, linking within the site, and using robots.txt - can help you encourage crawling and ensure that the page earns a positive reputation with bots and users.
  4. Creating a sitemap will help you inform Google of your site organization and priorities.

Google continues to push webmasters towards quality site construction. They regularly do so with algorithm updates that punish poor-quality content and reward valuable content. This change to the crawl limits likely has similar intentions. Brands that focus on site organization and prioritize user-friendly layouts will find that this change impacts them minimally. If you previously used this crawl feature on a regular basis, Google wants you to know that the time has come for you to redesign your site.

GSC can help identify problems

Here are a few red-flag use cases where you need to proactively manage the crawlability of selected URLs visible in GSC:

  1. If a page is generating a 404 error, the Googlebot will not crawl beyond that page, and the link equity that flowed to it will be lost. Regaining that link equity is a long, hard process, which makes 404s the number-one site hygiene symptom to search out and repair.
  2. You have orphan pages that you need crawled - for instance, a URL being promoted with paid advertising as the landing page of an integrated campaign.
  3. You have made significant changes to a page and want to make sure Google re-crawls it immediately, say for competitive reasons.
  4. You have changed the status of a URL from NoIndex to Index, or vice versa.
  5. Your sitemap is poorly populated, either containing URLs that you do not want crawled or missing new URLs that you have published and do want crawled.
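Several of these red flags lend themselves to a simple automated check once you have crawl data in hand. The sketch below is a minimal, hypothetical triage script; the field names, URLs, and data are illustrative, not the output of GSC or any particular crawler:

```python
# Hypothetical triage of crawl results: flag 404s and noindexed pages.
# Field names, URLs, and statuses here are illustrative examples.
crawl_results = [
    {"url": "/glossary/sql-injection", "status": 200, "robots": "index,follow"},
    {"url": "/glossary/old-term", "status": 404, "robots": ""},
    {"url": "/landing/campaign-a", "status": 200, "robots": "noindex,follow"},
]

def red_flags(results):
    """Return URLs that need attention: 404s lose link equity, and
    noindexed pages listed in a sitemap send Google mixed signals."""
    flags = []
    for page in results:
        if page["status"] == 404:
            flags.append((page["url"], "404 - fix or redirect"))
        elif "noindex" in page["robots"]:
            flags.append((page["url"], "noindex - confirm this is intended"))
    return flags

for url, reason in red_flags(crawl_results):
    print(f"{url}: {reason}")
```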

BrightEdge detects these scenarios for you easily within the platform:

  1. Your first line of defense is BrightEdge’s latest AI-powered innovation, BrightEdge Insights. The Insights product works like another colleague looking over your shoulder to ensure you stay ahead of any potential site hygiene issues. Insights scans through your website each week and will flag any 404 or NoIndex issues on the most important pages of your site.
  2. You can also secure peace of mind by creating a weekly ContentIQ crawl. ContentIQ is an advanced website-auditing solution built for the mobile era. It helps identify website errors, so that your website content stays well-tuned, easily found by search engine bots, and easy to use for your website visitors.
  3. A sitemap-based ContentIQ crawl will run each week and help surface URLs that have discoverability issues, such as NoIndex, NoFollow, or Disallowed by robots.txt. Reviewing the weekly ContentIQ results will help you be the first to know, so you can take action easily.
  4. Create an Orphan Page ContentIQ crawl to ensure you know of any orphan pages on your site. Read our blog to learn more about how to detect Orphan Pages.

Log into the BrightEdge platform today to take advantage of these capabilities.


The post New Manual Crawl Limits in Google Search Console: How It Will Impact Crawl Management appeared first on BrightEdge SEO Blog.
