BrightEdge and Oncrawl Join Forces

Jim
Posted 4 years 1 month ago · 9 min read

Fifteen years ago, Lem and I started BrightEdge because we saw a huge gap in digital. Brands were investing to produce and optimize content but had no way to predict what would make that content findable by their customers. I had myself run websites and knew how hard it was to figure this out. 

Lem and I are both engineers and we worked hard to build the DataCube, the first reverse index of the internet. I will never forget how it felt when our first customers shared with us how we had helped them. It was exciting to hear about BrightEdge working as we had designed it – and even more exciting to hear about new use cases that our customers discovered on their own. 

That is because SEO is by its nature holistic. Every element of a page influences how it ranks in search: the title, the content, how fast it loads, everything. Our dream has always been to create that holistic platform that helps marketers win at every aspect of search. And so that early success of DataCube inspired us to build ContentIQ as part of our solution in 2017, allowing for cross-team governance and SEO technical analysis on an enterprise scale. We also created pixel parsing, allowing for a truer vision of top ranking SERPs. And with Autopilot, we created the only machine learning solution that improves search performance. 

Now, we are so excited to announce that we are extending our holistic platform even further by joining forces with Oncrawl. BrightEdge and Oncrawl have combined because holistic SEO is not easy. Getting it right requires high-quality enterprise data and multiple technical capabilities. At BrightEdge, we have built a comprehensive SEO suite, and adding Oncrawl’s data science-driven technical SEO capabilities represents the future of enterprise SEO.

Oncrawl pioneered big data infrastructure in crawl technology as well as semantic analysis of SEO data, allowing for much more control and visibility. Oncrawl has rapidly become a must-have for advanced SEOs and has won numerous awards.  Oncrawl leadership shares our dream of holistic SEO and has built a following of SEO ambassadors in the industry.   

Through technical innovation, Oncrawl is positioning itself to be the driving force behind making data science core to SEO programs. With this new combination, BrightEdge and Oncrawl now provide customers with the most flexibility in SEO deployments. For SEO, some scenarios require technical crawlers to be integrated into a broader platform for things like site governance and ongoing SEO hygiene. Other scenarios demand highly customized crawls and data modeling to pinpoint what optimizations need to occur. Accommodating both is complicated for enterprise organizations, especially if they need to leverage multiple toolsets to address all these use cases. 

We are truly excited. I have never encountered a team with a vision like this. Oncrawl is as committed as we are to building AI and machine learning that helps marketers. Together we bring you the most comprehensive SEO and data platform.

Thank you and looking forward to the next fifteen! 
Jim

BrightEdge Acquires Oncrawl to Future-proof Web 3.0 Strategies


Combining next-generation marketing and data science technologies to understand machine-to-machine communication for future data privacy and compliance needs.

FOSTER CITY, Calif. – February 23rd, 2022 – BrightEdge, the global leader in organic search and content performance, today announced that it has acquired Oncrawl. Oncrawl is a multi-award-winning, AI-driven platform that helps digital and search marketers make smarter technical website and content marketing decisions.

According to BrightEdge CEO Jim Yu, “BrightEdge and Oncrawl have a common DNA and a shared vision for the future for digital technology and marketing. We are delighted to expand industry horizons by acquiring highly specialized search technologies whose creators have the same commitment to innovation and customer success. We were extremely interested in Oncrawl’s vision of the role of data and AI in the future of data-driven marketing."
 
There are over 2 billion websites worldwide today, with 250,000 new websites created every day. The combination of more sites, more data, and the evolution of web 3.0 represent significant challenges and opportunities in how the internet will operate in the future. 
 
Understanding how websites interact with users was the ultimate digital marketing priority in the past. However, as websites now interact autonomously with other websites and with entities like Google, Amazon, and Meta, machine-to-machine communication is becoming the fabric of Web 3.0.
 
It is this new complexity of site-to-site interaction that this acquisition and partnership help marketers navigate. 
 
Together, BrightEdge and Oncrawl are the first to offer a complete enterprise-oriented solution that provides both sets of customers with the additional flexibility needed to do machine learning-led project-based research and analysis, utilize business intelligence and activate resources. It also helps customers address current and future compliance requirements such as GDPR, CCPA, and soon CPRA.
 
The deal creates the world's most comprehensive, flexible, and intuitive organic search software. In addition, it is the first and only one capable of addressing the informational data complexities faced by modern enterprise marketing departments. BrightEdge and Oncrawl are coming together to drive a new wave of data science innovations by combining resources while offering integrated and dedicated search marketing solutions.
 
"When we created Oncrawl, our mission was to democratize Natural Language Processing (NLP) and advanced engineering to build the next generation of data and technical SEO solutions," said François Goube, Oncrawl CEO. "This acquisition gives marketers ultimate elasticity to get the most out of their data and support the entire digital marketing ecosystem. I am excited to combine forces with BrightEdge to be the first in the market to deliver this to customers."
 
From today, BrightEdge users can perform sophisticated data science tasks in their website analysis to complement the work they are already doing on the platform. Oncrawl users can leverage BrightEdge's advanced automation and data visualization technology to reduce manual labor and scale their SEO and digital marketing campaigns, all without adding new vendors to their tech stack.
 
The union of these two organic search technology pioneers paves the way for how cutting-edge enterprise organizations will leverage true data science in their SEO programs. Both search and digital marketers will benefit from the ultimate flexibility, security, insights, and control over their data.
 
The industry is expected to follow. Learn more here.
 
About BrightEdge
BrightEdge, the global leader in enterprise organic search and content performance, empowers marketers to transform online content into business results, such as traffic, conversions, and revenue. It is powered by a sophisticated deep learning engine, the BrightEdge platform. It is the only company capable of web-wide, real-time measurement of content engagement across all digital channels, including search, social, and mobile. BrightEdge's thousands of enterprise customers include global brands, such as Microsoft and Adobe, and 64 of the Fortune 100 and 9 of 10 leading international digital agencies. The company has offices worldwide and is headquartered in Foster City, California.
 
About Oncrawl
Oncrawl is a technical SEO platform that pioneered big data infrastructure in crawl technology and semantic analysis of organic search data. Their solutions help more than a thousand clients in 66 countries to improve their organic traffic, rankings, and revenues by opening Google's black box. Clients include Vistaprint, Canon, Lastminute.com, Forbes, and other leading companies. In 2021, Oncrawl became the most awarded SEO platform, winning Best SEO Software at the US, UK, Canadian, Global, European, APAC, and MENA Search Awards.

 

 


Google Begins Using Page Experience to Rank Desktop Search Results

tvura
Posted 4 years 1 month ago · 9 min read

February 2022 marks the start of Google using Page Experience as a ranking factor for desktop search. The search company says the desktop rollout will be complete by the end of March. Page Experience has been used as a ranking factor for mobile searches since August 2021. 

Page Experience is essentially Google’s way of rewarding or penalizing websites based on a page’s user experience. If a page loads too slowly, shifts once loaded or inhibits users from accessing the page’s content with intrusive ads, for example – even if the content is relevant – Google effectively sees this as diminishing the value of the content.  It is in keeping with a long-standing focus on returning the best result after understanding the intent of a given search. 

At the heart of Page Experience are (mostly) objective measures of technical performance for a given page, starting with what Google calls Core Web Vitals. For desktop Page Experience, the five elements of performance are as follows:  

1) Largest Contentful Paint (LCP): A measure of the time it takes for a page’s main content to load. Google recommends a target of 2.5 seconds or less for LCP. (Core Web Vital 1 of 3) 

2) Cumulative Layout Shift (CLS): Refers to the stability of the content once it’s loaded. If content shifts up or down while a user is viewing it, Google deems this a negative experience. Google recommends a CLS of less than 0.1, which sites can measure using BrightEdge or Google’s Page Experience report. (Core Web Vital 2 of 3) 

3) First Input Delay (FID): The time between a user’s first interaction with a page (clicking a button, tapping a link, pressing a key) and the moment the browser is able to respond. If the page takes too long to respond to input, Google views this as a poor user experience. Google recommends an FID of less than 100 milliseconds. (Core Web Vital 3 of 3)  

4) Proper use of HTTPS security protocols: Pages not served over hypertext transfer protocol secure (HTTPS) represent greater risk to the user and, accordingly, a poor user experience that negatively impacts the page’s search ranking. Learn more about HTTPS.   

5) The absence of intrusive interstitials: Intrusive interstitials are pop-ups and overlays that interfere with the content on the page and are a Page Experience no-no.  
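The thresholds above can be encoded in a small helper that labels a measurement. As a minimal sketch, the "good" cutoffs (2.5 s LCP, 100 ms FID, 0.1 CLS) come straight from the list above, while the "poor" boundaries (4.0 s, 300 ms, 0.25) are assumptions based on Google's published guidance and may change over time:

```python
# Sketch: label a Core Web Vitals measurement. The "good" cutoffs are the
# ones listed above; the "poor" boundaries are assumed from Google's
# published guidance and should be re-checked against current docs.

THRESHOLDS = {
    # metric: (good_at_or_below, poor_above)
    "lcp": (2.5, 4.0),   # Largest Contentful Paint, seconds
    "fid": (100, 300),   # First Input Delay, milliseconds
    "cls": (0.1, 0.25),  # Cumulative Layout Shift, unitless score
}

def classify(metric: str, value: float) -> str:
    """Return 'good', 'needs improvement', or 'poor' for a single metric."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"
```

A page with a 2.1 s LCP would be rated "good" on that metric, while a 0.3 CLS would be rated "poor" and flagged for layout-stability work.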

Per Google, the elements and their thresholds are the same as those used for mobile Page Experience (minus mobile-friendliness, which only applies to mobile search rankings). Brands that have already optimized for mobile Page Experience, in other words, are ahead of the game, but should still measure desktop Page Experience and address any desktop-specific issues. Google has added dedicated desktop performance reporting in Search Console to help highlight any disparities in Page Experience-related performance between the mobile and desktop versions of a site. 

For websites that have not yet been optimized for Page Experience, here are some additional resources to help site operators understand the new ranking factor and make the necessary improvements: 

Measuring and Managing Page Experience with BrightEdge Instant 

BrightEdge Instant provides analysis and dashboards for users to evaluate, prepare and implement changes to improve Page Experience on both desktop and mobile. Ongoing reporting within the tool makes it easy to see and communicate the impact of website changes. 

BrightEdge Instant Use Case: Website Performance Analysis 

Key Takeaways 

We will keep an eye on the impact of Page Experience on desktop search results. As a potential point of comparison, early indications from the rollout in mobile search suggest a comparatively high prioritization of Page Experience as a ranking factor.  

If you have waited to evaluate and improve your site for Page Experience (perhaps the majority of your traffic comes from desktop, for example), we’d strongly recommend prioritizing it in 2022.

 

 

Instant Product Use Case:
Website Performance Analysis

As marketers and SEOs, we often think about website performance in terms of the results the site generates: visits, bounce rates, time-on-site, conversions, and so forth. Search engines, on the other hand, think about website performance primarily in terms of user experience and use it as a ranking factor, and organic search remains the leading referral source for qualified traffic. In fact, with the rollout of Google’s Page Experience update and associated Core Web Vitals in mid-2021, website performance has proven to be among the most important ranking factors for organic search.

It is essential, then, for any brand that relies on traffic from organic search to benchmark site performance, understand opportunities for improvement, work with developers to implement required changes and then continue to monitor performance and its impact on search rank.

In addition to its ability to provide real-time insight into page rank, keyword research and competitor rankings, Instant provides capabilities to measure, optimize and monitor website technical performance. Instant enables brands to optimize the customer experience and maximize search rankings by providing access to Google Page Speed Insights (lab data) and Google Chrome User Experience (field data) at scale. Now they can analyze multiple URLs in parallel across their sites or benchmarked against competitors. 

Download the product use case today to learn how Instant helps SEOs and marketers evaluate ranking factors, benchmark page speed performance, prioritize precious development resources, and measure the impact of changes in real-time.


Best Practices for International SEO in 2022

monique.johnson
Posted 4 years 1 month ago · 9 min read

Digital media continues to create global interconnections, and in response many brands want to expand beyond their borders and engage with new prospective customers in foreign countries. We put together this list of international SEO best practices to equip you with what you need to know about driving qualified, high-intent global traffic to your organization.

Know the search engines used in your target countries

Google is the largest search engine globally, with a market share of nearly 90%. In many countries, optimizing for Google alone will allow you to reach your prospective clients. Bing is the second largest search engine worldwide, with a small but consistent presence across the world. That said, several large countries rely mostly on local search engines. 

Here are some examples of international search engines:

  • China - Baidu
  • Russia - Yandex
  • South Korea - Naver

Look at the search statistics for the areas you want to target and figure out which search engines are most popular there. Although some of the basics for optimization will remain consistent across search engines, like a focus on quality content and user experience, other factors will differ. 

For example, sites wanting to have visibility in Baidu should ideally be created with a Chinese domain and hosted on a Chinese server. If you want to learn more, we’ve written an in-depth post about the various major international search engines. We’ve also published a comprehensive overview of Baidu.

Work with native speakers for keyword research

When it comes to expanding internationally into non-English speaking markets, foreign-language content will form the basis of your search marketing strategy. 

As always, when undertaking search engine optimization, consider user experience. No one likes to read content generated by an automated translator. Many ideas are easily lost in translation, being clunky at best, or misleading and incorrect at worst, opening up a whole new world of unpleasant liabilities. 

Working with a native speaker allows you to capitalize on local colloquialisms, culture, and other aspects of semantics that might not be readily obvious to someone who speaks the language fluently but doesn’t live in the target country. 

Don't just rely on translating your existing content. You should also create content that is tailored to address issues the local audience cares about.

Certain keywords will carry high volumes irrespective of international boundaries, and a significant portion of your content will be suitable for direct translation, but your keyword research processes should also account for country-specific queries, both general and long-tail. 

It’s similarly important to pay attention to small details like currencies, time zones, addresses, payment options, and so on. All of these factors will contribute to making visitors feel at home on your site. 

Site structure for international visitors

Setting up your website for international visitors will be one of the most important steps you take in international SEO. And there are a few ways to do it.

Some brands, such as Amazon, register a separate country-specific domain for each market, such as Amazon.com in the United States and Amazon.co.uk in the United Kingdom. Generally, this method works best when you have a very large business with offers and services that differ from country to country.

Success with this type of setup also requires considerable brand recognition because of the lack of integration between the domains. When you create a new domain for a new country, you more or less start from zero from an SEO perspective: the new domain has little to no reputation, no backlinks, and so on. 

Unless your brand is recognizable enough to overcome this challenge, like Amazon, separate domains will likely not be the way to go.

For most moderately sized companies, creating sub-directories or sub-folders branched from a primary TLD tends to work best. This allows you to channel all the power of the domain. It also becomes simple to add an additional country later if you choose to expand again. 

Here are your two main options when it comes to picking a URL structure:

  • Sub-domain: Host foreign-language content on a separate host under your current domain (e.g. ca.clothingretailer.com or de.clothingretailer.com)
  • Sub-directory: Create a sub-directory on your domain which acts as a separate area of your website dedicated to foreign-language pages (e.g. clothingretailer.com/canada/ or clothingretailer.com/deutschland/)

It’s also possible to indicate location with URL parameters of the form site.com?loc=, but Google recommends against this approach. 
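To make the two recommended structures concrete, here is a minimal sketch; the domain and locale slugs are hypothetical examples, not a real site:

```python
# Minimal sketch contrasting the two URL structures described above.
# The domain and locale slugs are hypothetical examples only.

DOMAIN = "clothingretailer.com"

def subdomain_url(locale: str) -> str:
    """Sub-domain scheme: the locale becomes a separate host."""
    return f"https://{locale}.{DOMAIN}/"

def subdirectory_url(slug: str) -> str:
    """Sub-directory scheme: the locale becomes a folder on the main domain."""
    return f"https://{DOMAIN}/{slug}/"
```

Either function yields a predictable URL per market, which is what makes adding a new country later a simple, mechanical step.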

Inform Google of Foreign Language Versions of Pages With “hreflang” Tags

Whatever URL structure you decide to use, you will need hreflang tags to give Google the strongest signal for language and country. Although Google can generally detect language reliably, many languages are spoken in more than one country, and you may want to tailor content towards people in a specific country. 

Hreflang tags also make it crystal clear to Google how the pages are related and when they should be presented. 

There are three main options for setting up hreflang tags:

  • Add them to the HTML for your site.
  • Include them in your Sitemap.xml file.
  • Add them to your HTTP header. 

Regardless of which one you choose, make sure that all of the pages point to each other. For example, the US English version of a page (hreflang="en-us") should point to the UK version (hreflang="en-gb") for UK users, and the UK version should point back to the US version for US visitors. (Note that the UK region code is gb, not uk.) If you fail to connect the pages from both sides, you may find yourself running into errors in Google Search Console. 
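The reciprocity rule can be sketched as a small check. The page map and URLs below are hypothetical; a real implementation would first parse the annotations out of HTML, a sitemap, or HTTP headers:

```python
# Sketch: verify that hreflang annotations are reciprocal. `pages` maps
# each URL to the alternates it declares ({hreflang_code: target_url}).
# The URLs and structure here are hypothetical.

def missing_return_links(pages):
    """Return (source, target) pairs where target never links back to source."""
    problems = []
    for source, alternates in pages.items():
        for target in alternates.values():
            if target == source:
                continue  # a page's self-referencing annotation is fine
            if source not in pages.get(target, {}).values():
                problems.append((source, target))
    return problems

pages = {
    "https://example.com/us/": {"en-us": "https://example.com/us/",
                                "en-gb": "https://example.com/uk/"},
    # The UK page omits the link back to the US page: a reciprocity error.
    "https://example.com/uk/": {"en-gb": "https://example.com/uk/"},
}
```

Running the check on this map surfaces the missing US-to-UK return link, which is exactly the kind of one-sided annotation Search Console reports as an error.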

Look for international competitors and strategies on local markets

As you move into new markets, you'll need to identify your local competitors. Just because you compete with other international brands in the United States does not mean that those companies will also be your primary competitors overseas. 

You also have to consider local companies. Examine the SERPs related to your industry and see what kind of content local search engines are rewarding for your most valuable keyword groups. 

As you begin to identify your top competitors, you also want to look at their digital strategy. Research into the local search engines can provide helpful information about what the SERPs generally value, but for any search engine, that can vary by industry. Looking at the content strategies employed by others in your sector can help you improve your ability to compete.

Adjust content and products to account for changes in local opinions

For many businesses, their product offerings will remain largely consistent across different countries. 

This makes it tempting to simply translate the product names and pages to the new language, but doing so will overlook potential areas of optimization. Taking the time to tailor content and product lines to local sensibilities will make it easier to gain footing within the region’s market and begin to secure more leads and customers. 

Work with native speakers to create product names, pages, and content that fits the needs and interests of local audiences. Consider what's most important to them and how your product can fill local needs as you progress. 

Marketing in general is about personalization and speaking to the needs of specific personas. Customers in a new country want to feel that you are just as committed to serving them as you are to your home audience, and meeting that expectation can also improve engagement metrics such as bounce rate. You can demonstrate this commitment by producing content tailored to their needs.

Other Considerations

Expanding overseas can present you with incredible opportunities to grow your business, and taking these international SEO best practices into account can help you succeed. 

In addition to the tips outlined above, here are a few more things to keep in mind:

  • Don’t assume that all visitors will want to browse in the language of their location. Provide the option to change language and location on all pages. If a visitor in Japan, for example, arrives at your site via an English-language search result, it doesn’t mean that they want to be forwarded to your Japanese-language site automatically. Ask them!
  • Be conscious of culture when it comes to design, layout, color choice and so forth. This can be particularly relevant for conversions and, with Google’s growing emphasis on page experience, for search rankings. 
  • Recognize the difference between multilingual and multi-regional sites, which require slightly different approaches. A multilingual site offers content in multiple languages. A multi-regional site targets users in different countries. Your site may be either of these or both. 
  • Remember to build links and other forms of off-site authority in the country you are targeting. Rankings in one national Google domain don’t necessarily translate into rankings in another. 
  • If you are duplicating or using very similar same-language content across pages aimed at different audiences, use canonical tags to tell Google which is the primary page to serve in search results. This is important when you have multiple near-identical versions of a page, so that you can choose which one shows in search by default.

 

 

 

Long-Tail Keywords: A Comprehensive Guide

monique.johnson
monique.johnson
M Posted 4 years 1 month ago
t 9 min read

Updated: February 2022

Long-tail keywords account for over 90% of search queries. 

Accessing that traffic pool can be incredibly profitable. The good news is that identifying, targeting and ranking for long-tail keywords is a straightforward and cost-effective process. 

In this post, we’ll clear up some common misconceptions about long-tail keywords, cover the benefits of having a dedicated long-tail strategy, and describe how to conduct thorough, extensive keyword research. 

Where Does the Idea of the “Long Tail” Come From?

The concept of the “long tail” has been instrumental in shaping how organizations understand markets. Chris Anderson popularized the concept in a 2004 Wired article and his 2006 book The Long Tail: Why the Future of Business Is Selling Less of More, but statisticians have been studying it in various forms since at least the 1950s. 

In his now-famous article on Wired and his subsequent book, Chris Anderson argued that there is more profit to be made by selling lots of different products, each with low demand and few competitors, than by attempting to create one big hit that relies on leveraging unified demand in a crowded space. The advent of democratized marketplaces with low barriers to entry - just like the good ol’ internet - has made this approach possible.

Applied to search engine optimization, the idea is simple: target lots of low-volume, low-competition keywords instead of wastefully expending resources on highly competitive, high-volume counterparts. 

Now, while that strategy looks uncomplicated on paper, it’s a little more multifaceted in practice. So let’s dig into the specifics. 

What Are Long-Tail Keywords? 

There is no shortage of definitions of the term “long-tail keyword” on the web. But despite its importance and uniqueness, search engine experts still get things wrong when describing the concept. 

Long-tail keywords are search queries that have relatively low search volumes compared to high-volume “head keywords.” You can understand this idea in terms of specific, thematically related groups of keywords, say around the topic of “home improvement,” or as applied to the totality of queries on Google (or any other search engine) over a given time.

Low-volume queries sit on the “tail” of a curve that plots search volume on the y-axis against keywords, ranked by volume, on the x-axis. If you could see the whole graph, the long tail would stretch for miles. 

High-volume keywords comprise what is called the “head.” Middle-volume keywords are sometimes said to constitute the “thorax” or “chunky middle.” 
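To see why the tail matters in aggregate, here is a toy model using an assumed Zipf-like volume distribution (illustrative numbers only, not real search data):

```python
# Toy model of the head/tail split, assuming a Zipf-like distribution:
# the keyword at popularity rank r gets volume proportional to 1/r.
# These numbers are illustrative, not real search data.

N = 100_000                    # distinct keywords modeled
volumes = [1_000_000 / r for r in range(1, N + 1)]

head = sum(volumes[:100])      # the 100 highest-volume "head" terms
tail = sum(volumes[100:])      # everything else: the long tail

tail_share = tail / (head + tail)
# Under these assumptions the tail collectively outweighs the head,
# even though no individual tail keyword approaches a head keyword.
```

This is the long-tail argument in miniature: no single tail term is worth much, but together they carry more than half of all modeled volume.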

Misconceptions About Long-Tail Keywords

To say that long-tail keywords are phrases of multiple words with search volumes of ten or fewer isn’t entirely accurate, although this is often the case. There are many single-word long-tail keywords. What’s more, the term “low” has to be understood relatively for the concept of the long tail to make sense.

Another common mistake people make is to define long-tail keywords as always being highly specific. While this is usually true, there are exceptions. For example, the low-volume keyword “bog snorkeling” (yep, it’s a recognized sport) is just as semantically general as “golf.”

The key takeaway here is that long-tail keywords should be understood primarily in terms of volume (monthly searches). Applying other attributes just serves to needlessly muddy the waters and is usually unhelpful from a marketing perspective. 

Why Are Long-Tail Keywords Important?

Long-tail keywords are important because they are effective at driving traffic. A well-executed long-tail keyword strategy can result in significant amounts of new visitors and high value leads.

Here’s a quick rundown of the main reasons that long-tail keywords are worth your attention: 

1. Long-Tail Keywords Have Low Competition

Long-tail keywords tend to be less competitive from a search perspective than high-volume keywords. As such, they are easier to rank for. 

This is due to a mix of reasons. First, many companies focus exclusively on high-volume terms, leaving long-tail keywords wide open. Moreover, the sheer number of long-tail keywords means that competitor activity is more widely distributed. 

2. Long-Tail Keywords Are Easy to Target From a Practical Perspective

Creating content for long-tail keywords is a relatively straightforward process. Specific terms typically require only short, precise explanations. A broad query like “radiators,” for example, easily lends itself to an article of thousands of words with numerous sub-sections. A term like “where to buy cheap radiators in Honolulu,” on the other hand, can be targeted with a comparatively brief piece of text. 

It’s also possible to target numerous long-tail keywords within a single webpage or piece of content. Using long-tail keywords to structure content will enable your website to rank for terms that might otherwise have been missed. Additionally, there is little cost involved in optimizing longer-form content for long-tail traffic. 

3. Long-Tail Keywords Have High Conversion Rates

Consider the difference between the keywords “water bottle” and “two-liter blue water bottle with a folding cap.” The second one carries a highly specific intent. As a result, the searcher responsible for typing it into Google is more likely to follow up by purchasing a product related to the keyword.

Because long-tail keywords are usually very precise, companies can tailor highly-targeted offers and opt-in incentives to capture site visitors. 

4. The Pool of Long-Tail Keywords Is Large

There are billions of long-tail keywords. You can’t see it on a typical graph because the x-axis has to be truncated for practical reasons, but the long tail goes on for miles. If you’re in a well-known industry, it will be practically impossible to run out of keywords. And even niche organizations will have their work cut out for them in attempting to capture even a portion of all available long-tail traffic. 

How to Find and Use Long-Tail Keywords: A 5-Step Guide 

Here are five general steps that can help form the basis of your long-tail keyword strategy: 

1. Create Buyer Personas and Identify Broad Keyword Topics

Before you dig deeper and pinpoint specific terms, you need to identify the broad topics you will be researching. This definition will normally be in terms of generic keywords. Having clearly-defined parameters will enable you to stay on track during the later stages of this process. 

In particular, you should ask two questions:

  • Who is your target audience?
  • What topics are they interested in?

Keep in mind that your answers will likely be different depending on which part of the customer journey you’re considering. Profiles for first-time searchers will be different from those of returning visitors, and your long-tail keyword targeting should account for this. 

2. Use a Keyword Research Tool Like Data Cube

Once you’ve identified “tier one” terms, enter them into a research solution like BrightEdge’s Data Cube. Data Cube has dedicated functionality for discovering long-tail keywords. You can use it to sort potential target queries by volume, competition, potential value, and more. 

While it’s not unusual for SEOs to use ancillary tools and apps, particularly those that specialize in semantic keyword and long-tail query generation, it is good practice to leverage one high-quality solution as the basis for finding and organizing long-tail keywords. In this way, your workflow will have a central, easily accessible hub. 

3. Evaluate Competitors

Competitor tracking is another effective way of identifying profitable long-tail keywords. The Share of Voice functionality within the BrightEdge platform allows you to uncover long-tail terms for which other sites are ranking.

Many of your competitors’ search results will not be the outcome of actively targeting a particular long-tail keyword. Often, existing content will be ranking “accidentally.” This presents you with an opportunity to create higher-quality content and achieve better results.  

4. Collect Questions From Community Spaces

Analyzing user-generated content on sites like Reddit, Quora, Facebook, Amazon and topical forums can give you a range of insights into the ways potential customers are talking about their interests and problems. 

Trawling through forums and discussion boards is a time-consuming process, and you will still need to run gathered terms through your software. That said, you can find lots of hidden gems this way and, depending on their value to your company, it may be worthwhile as a long-term approach. 

5. Organize and Rank Long-Tail Keywords

Once you’ve collected a set of “raw” keywords that show promise, you should organize them into a coherent structure that can act as a guide for creating content. Metrics to consider when undertaking this process include value, relevancy, competition and, of course, volume. 

You should also account for the following semantic distinctions:

  • Synonyms - The terms “how to get over the January blues,” “feeling down in January,” and “tips for beating January blues” are synonymous: they all mean the same thing, and Google is smart enough to recognize this. Rather than creating individual pages for each one, you should target synonyms in the same piece of content. 
  • Primary terms - These terms will act as the main subjects of individual pieces of content. Some primary terms, like “how to dye curly hair naturally” or “how to revive a dying spider plant,” will be obvious in the sense that they cover quite a lot of ground. Others may look like secondary terms but actually warrant their own page. “How to dye curly hair naturally for women” could be added as a subtopic to an article about dyeing hair, for example, but will probably be targeted more effectively individually. 
  • Secondary terms - Secondary terms should constitute part of a larger piece of content. One of the best ways to decide whether or not to designate a term as a primary or secondary keyword is by checking existing results and seeing whether Google is ranking dedicated pages or ones covering a broader topic. 

Organizing keywords semantically and topically isn’t an exact science. The decision of whether to create a new piece of content or update an existing one will often come down to personal judgment. 
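To make the prioritization step in this last stage concrete, here is a minimal sketch of ranking “raw” keywords by a composite score. The weights, metric scales, and sample data below are hypothetical illustrations, not formulas from the BrightEdge platform or any other tool:

```python
# Illustrative only: rank gathered long-tail keywords by a simple
# composite score. Weights and sample values are made up for this sketch.

def score(kw):
    # Reward search volume and estimated business value;
    # penalize competition (all on arbitrary example scales).
    return kw["volume"] * kw["value"] * (1 - kw["competition"])

keywords = [
    {"term": "how to dye curly hair naturally", "volume": 900, "value": 0.8, "competition": 0.4},
    {"term": "tips for beating january blues", "volume": 400, "value": 0.3, "competition": 0.2},
    {"term": "how to revive a dying spider plant", "volume": 700, "value": 0.5, "competition": 0.6},
]

ranked = sorted(keywords, key=score, reverse=True)
for kw in ranked:
    print(f"{kw['term']}: {score(kw):.0f}")
```

In practice you would feed in exported metrics from your research tool and tune the weighting to your business, but the principle is the same: turn several metrics into one sortable number so the list can guide your content calendar.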

Conclusion: What’s Next?

So you’ve done your research and built a jam-packed list of long-tail keywords. What’s next? 

Well, it’s time to start creating content. A long-tail strategy is an invaluable business asset. But it’s nothing without a well-developed content plan. 

SEOs who consistently target high-value, low-competition keywords can build a steady stream of website visitors and leads. Dedicating time and resources to ongoing research will pay dividends well into the future. 


SEO Bright Now: February 4, 2022

andrew.riker
Posted 4 years 2 months ago
9 min read

The last two weeks have seen several feature releases from Google. In addition, Google documentation updates and explanatory content from John Mueller (Google’s Search Advocate) and other sources provided search engine optimizers with clarity about a handful of long-running uncertainties.

In terms of practical features, web admins can now take advantage of the indexifembedded tag, which tells Google how to handle embedded content. There’s also a new search section on mobile results. The upcoming release of the “Topics” feature, in line with Google’s deprecation of third-party cookies, is also an important development. 

In other news, Google made a few changes to its online guidance. SafeSearch documentation has been merged, and a new note in Google Webmaster Guidelines lays out the relationship between “Car” and “Product” schema markup. John Mueller also provided some insights into how Google evaluates internal links and Danny Sullivan (Google’s Search Liaison) explained how “deduplication” works in relation to “Top stories.” 

Let’s dig into the latest updates, announcements, and search-related analysis from the last two weeks. 

Topics to Replace FLoC as Part of Google’s Privacy Sandbox Initiative

On January 25th, Google announced that it would be retiring “Federated Learning of Cohorts” (FLoC) and replacing it with an alternative targeting technology called “Topics.” 

“Topics” is part of Google’s “Privacy Sandbox,” an initiative tasked with developing digital tools that allow publishers and advertisers to continue to leverage data about user behavior as cookies become redundant. 

“Topics” will share subjects that individual browsers have expressed interest in with third-party sites, thus negating the need to provide confidential personal information. Google published a nifty little explainer video that shows how everything works. 

In an official blog post, Google wrote: “With Topics, your browser determines a handful of topics, like “Fitness” or “Travel & Transportation,” that represent your top interests for that week based on your browsing history. Topics are kept for only three weeks and old topics are deleted.”

It’s still early days, and a developer trial will launch in Chrome shortly. What the final tech will look like remains to be seen. Nonetheless, it’s a change that will affect all businesses that rely on visitor data to serve ads and generate audience insights. It highlights the importance of web admins taking Google’s deprecation of cookies seriously. 

New Robots Meta Tag (indexifembedded) Added to Documentation

Google has introduced a new robots tag, indexifembedded, that lets web admins tell Google to index content that’s embedded via iframes (and some other HTML tags) in third-party pages, or elsewhere on the same site. It works in combination with the noindex tag: the standalone version of the content stays out of the index, but the embedded version can still be indexed.

Google says that it may prove to be a particularly useful tool for media publishers, who often allow third parties to embed content but don’t want to publish the content on their own site. 

You can learn more about the new tag in this recently published post on the Google Search Console blog. Currently, Google is the only search engine that supports this feature.
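Based on Google’s announcement, the tag combination looks roughly like the sketch below; verify the exact placement against the official documentation before relying on it:

```html
<!-- On the page whose standalone version should stay out of the index,
     but whose content may be indexed when embedded elsewhere: -->
<meta name="googlebot" content="noindex" />
<meta name="googlebot" content="indexifembedded" />

<!-- The equivalent HTTP response headers: -->
<!-- X-Robots-Tag: googlebot: noindex -->
<!-- X-Robots-Tag: googlebot: indexifembedded -->
```

Note that indexifembedded only has an effect when it accompanies noindex; on its own it changes nothing.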

New Mobile Search Feature on Google Mobile

Google has released a new feature called “People search next” on its mobile search pages. A Google spokesperson confirmed the rollout to Search Engine Land. 

“People search next” appears alongside other features like “Related searches” and “People also search for.” At the time of writing, it appears to be available only in the US. 

“People search next” is interesting from an SEO perspective for two reasons. First, it provides content creators with new ideas for keyword-focused pages. Second, search result page widgets like this one potentially take up space occupied by results from third-party websites. It’s essential for SEOs to be aware of these changes as they may affect how ranking strategies are formulated. 

Google Updates SafeSearch Documentation

Google has updated its SafeSearch documentation but the guidance remains the same. All documentation is now available in one place instead of spread across different subsections of the Google Search Central documentation. 

If you haven’t already, you should make sure that your website is optimized for SafeSearch. The instructions show you how to check if some or all of your web pages are being filtered and how to remedy any mistakes on Google’s part. 

Google Clarifies Car and Product Schema

Google has added a note instructing web admins how to apply “Car” markup in a way that doesn’t forfeit eligibility for “Product” review snippets. 

The note reads: “Currently Car is not supported automatically as a subtype of Product. So for now, you will need to include both Car and Product types if you would like to attach ratings to it and be eligible for the Search feature.”

In a nutshell, this means that you should use both “Car” and “Product” schema on vehicle listing pages. If you only use “Car” schema markup, product reviews may not appear in search results. 
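In JSON-LD, this can be done by declaring both types on the same node. The snippet below is a minimal hypothetical sketch of that pattern, not Google’s full recommended vehicle markup:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": ["Product", "Car"],
  "name": "2022 Example Sedan",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "reviewCount": "89"
  }
}
</script>
```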

Google Removes Time Ranges From Recipe Schema Markup

If you publish recipes on your blog, Google’s update to its “Recipe” documentation will be of note, and you’ll need to make some minor changes to your schema markup. 

All references to ranges have been removed and Google no longer supports time ranges for “Time” properties. Instead, Google advises publishers to use “an exact time; time ranges aren't supported.”
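In practice, that means collapsing any ranged values in “Time” properties to a single ISO 8601 duration — for example, a “30 to 40 minutes” cook time becomes an exact PT30M (or whichever single value you choose). A minimal hypothetical example:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Example Soup",
  "prepTime": "PT15M",
  "cookTime": "PT30M",
  "totalTime": "PT45M"
}
</script>
```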

John Mueller Provides Some Insights Into How Google Evaluates Internal Links Based on Page Location

“Internal linking” refers to the practice of linking to different pages of a website so as to create an optimized “flow” of authority (or “link equity”). The argument runs that well-structured internal link architectures are correlated with higher rankings. 

In an office-hours hangout, Search Advocate John Mueller said that the location of internal links on a page (header, footer, in-content, etc.) doesn’t matter from Google’s perspective. So it looks like SEOs don’t need to worry about exactly where they place internal links; a better use of that effort is optimizing the user experience. 

Danny Sullivan Provides Insight Into Deduplication Process for “Top stories”

In reply to a tweet by Dieter Bohn, Executive Editor of The Verge, Search Liaison Danny Sullivan shed some light on how Google’s deduplication process works in relation to “Top stories.”

In a nutshell, Google will remove a link to a webpage from the main results if that link appears first in “Top stories.” However, if the “Top stories” widget appears after the normal results, the link is not removed. 

Danny Sullivan said, “...we deduplicate a link from web results if a link appears as the first link in Top Stories and if the Top Stories box appears before web results. If it comes after, we don't.”
