SEO Bright Now: May 20, 2022

andrew.riker
Posted 3 years 10 months ago · 9 min read

Despite a lack of major news stories, it’s been an interesting two weeks in search, largely due to a myriad of small updates and announcements. Google also made a few mildly embarrassing missteps.

MOZ was removed from the Google index after an overly-broad DMCA request. And it was later discovered that Google’s official digital marketing course contains incorrect advice about content length and keyword density.

On a more positive note, Google published several updates to Search Central. There are two new pages, one on metadata and the other on schema for Q&A and flashcard pages. You’ll also find extra information regarding page titles—if you’ve experienced incorrect titles in search results, stay tuned.

You’ll want to take note of multiple minor feature updates and announcements. A handful of sitemap extension tags are no longer supported by Google, Search Console has a filter for translatable results, and an indexing report for videos will be released soon.

We’ll also be covering inclusive ranking factors, Vimeo, and virtual reality optimization (don’t worry, it’s not a thing yet). With all that in mind, let’s look at all the recent news, updates, and tidbits from the world of search engine optimization over the last couple of weeks.

MOZ Removed From Google Index After False DMCA Request

MOZ was removed from Google’s index after a DMCA takedown notice was filed, the mechanism by which copyright holders request the removal of allegedly infringing content from search results.

As the issue was resolved quickly, this may seem unnewsworthy. But the story clarifies an important point for site owners regarding a process that has long been viewed as open to abuse and error.

The fact that Google responded quickly is a positive sign. It also shows that the search engine is aware of problems associated with DMCA and that SEOs have a safeguard against misuse. In a statement issued through Search Engine Land, Google said, “If we find that pages have been removed from our results in error, we reinstate them, which we did in this case.”

The events in which MOZ was embroiled demonstrate how important it is to monitor indexation through platforms like Search Console. Additionally, it hints at what happens when the human element is eliminated from some processes and automation is left unchecked.

Google Recognizes Its Own SEO Course Contains Mistakes

If you were thinking about instructing your SEO team to take the recently announced digital marketing course from Google, you have reason to pause. It turns out some of the training material is incorrect. And that isn’t just an opinion from third parties. Representatives from Google have said so.

Course material specified that pages should have a 2% keyword density and a minimum of 300 words of content to meet industry standards. Search Liaison Danny Sullivan quickly responded by saying, “This can be ignored,” and emphasized the relevance of Search Central documentation as the authoritative resource.

To our minds, the incident highlights the importance of having a multi-faceted training system for in-house SEO. It’s essential to draw from multiple sources and not take official advice at face value.

Google Makes Several Updates to Search Central Documentation

Google has made several updates to its Search Central documentation. Practically all of the topics outlined in Search Central are foundational aspects of search engine optimization and web admins need to be familiar with Google’s guidance.

Here’s an overview of the changes:

  • Metadata elements - A new page covers best practices in relation to metadata. It also specifies which elements not to use in the <head> section of a webpage to avoid issues with HTML processing.
  • Quiz schema for flashcard and Q&A pages - Another recently published document outlines instructions for adding Quiz schema to flashcard and Q&A pages. The guidance—titled Education Q&A—helps webmasters structure their content so it’s eligible for inclusion in the Education Q&A carousel, and in Google Assistant and Google Lens results.
  • Title links - Search Central guidance on titles has been amended and now includes a troubleshooting table covering issues like half-empty, obsolete, and inaccurate title tags. If your title links aren’t appearing correctly in search results, consult this section.
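On the metadata front, Google’s core warning is to keep only valid metadata elements inside <head>. Here’s a minimal sketch of a clean head section (the URL and content values are placeholders):

```html
<head>
  <meta charset="utf-8">
  <title>Example page title</title>
  <meta name="description" content="A concise summary of the page.">
  <link rel="canonical" href="https://www.example.com/example-page">
  <!-- Avoid invalid elements such as <img> or <div> here: when Google
       encounters one, it may assume the head has ended and ignore any
       metadata that follows. -->
</head>
```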

All of the new advice is relatively easy to implement, and although traffic gains are unlikely to be significant, there are no downsides from an SEO perspective.
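For the flashcard guidance, eligibility hinges on Quiz structured data. Here’s a hedged JSON-LD sketch (the subject and question text are invented for illustration; consult the Education Q&A document for the full list of required and recommended properties):

```json
{
  "@context": "https://schema.org/",
  "@type": "Quiz",
  "about": {
    "@type": "Thing",
    "name": "Photosynthesis"
  },
  "hasPart": [
    {
      "@type": "Question",
      "eduQuestionType": "Flashcard",
      "text": "What gas do plants absorb during photosynthesis?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Carbon dioxide"
      }
    }
  ]
}
```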

Google Deprecates Some Sitemap Extension Tags

Google will be removing support for some sitemap extension tags in August. Sitemap tags are instances of extended syntax that provide information about media files.

If you use any of the video and image extension tags listed below on your sitemap, it’s a good idea to remove them:

  • caption
  • geo_location
  • title
  • license
  • category
  • player_loc[@allow_embed]
  • player_loc[@autoplay]
  • gallery_loc
  • price[@all]
  • tvshow[@all]

Your rankings won’t be affected if you leave these tags in place, but removing them keeps your site uncluttered and up to date. You may receive warnings in Search Console after the update is made.
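For reference, here’s what cleanup looks like in a hypothetical image sitemap entry (URLs are placeholders; the core <image:loc> tag itself remains supported):

```xml
<url>
  <loc>https://www.example.com/sample-page.html</loc>
  <image:image>
    <image:loc>https://www.example.com/images/photo.jpg</image:loc>
    <!-- Deprecated; delete these if present:
         <image:caption>, <image:title>,
         <image:geo_location>, <image:license> -->
  </image:image>
</url>
```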

Google Releases Two New Search Console Features

Google has released two new Search Console features: a filter for translated results and a video indexing report.

The filter in the performance report allows admins to monitor searchers that interact with their website in a translated language. Google will sometimes translate the title tag and description of a result and serve it in response to foreign-language queries. If a user visits the site, they will interact with pages through Google Translate.

The video report, which was announced at Google I/O, helps web admins identify and resolve indexing issues on video pages on their site. There’s no exact release date but it’s expected fairly soon.

Vimeo Adds Structured Data to All Videos

Vimeo has decided to add structured data to all of its videos, making it likely they will receive increased exposure in search results. If you publish content on Vimeo, you may see more hits. It’s also worth experimenting with the platform if you don’t already use it.

Inclusive Schema to Become Image Ranking Factor

Google has announced that it will include diversity as a ranking factor in the near future. In essence, image data related to skin tone, hair texture, hair color, and so on will be incorporated into the algorithm.

A new inclusive schema for labeling images and other visual assets will be released soon. SEOs should keep an eye on this development and consider tailoring their use of images accordingly.

John Mueller Clarifies Role of Meta Descriptions in Rankings

It’s an ongoing SEO debate. Do meta descriptions affect rankings? Well, according to Google Search Advocate John Mueller, the answer is no.

In an SEO Office Hours session, he said, “...the meta description is primarily used as a snippet in the search results page. And that’s not something that we would use for ranking.”

However, he did point out that meta descriptions can affect whether or not people click on a result, and therefore still carry importance. So while this age-old debate amongst SEOs still rages, understand that there is value in owning real estate on the results page, and attention should be given to meta descriptions.

Should You Be Thinking About Virtual Reality Optimization?

How often do you think about virtual reality? If you’re like most SEOs, the answer is probably not very often. But it may well turn out to be the next big thing.

If you’re interested in this topic, Tyler Kurtz published a fascinating piece in Search Engine Land, titled Virtual environment optimization (VEO): The next evolution of SEO?

We’re not there yet, of course. Odds are we’re not even close. But it’s definitely a direction we’re trending towards as technology becomes more immersive.

SEO by the Sea Founder Bill Slawski Passes Away

Bill Slawski was a widely respected and pioneering search expert, well known for his incisive posts taking apart Google patents on his blog SEO by the Sea. He was also SEO Director at Go Fish Digital. Tributes have been pouring in on Twitter, and he’ll be missed by many in the industry.

BrightEdge and Oncrawl Create Industry's First Intelligent System for SEO


The combination of SEO and Data Science technology creates a new level of industry-specific insights across retail, banking, insurance, and real estate industries.

FOSTER CITY, Calif. – May 18th, 2022 – BrightEdge and Oncrawl are combining best-in-class technologies to create a new intelligent system for search marketers. Research unveiled at Share22 reveals a new layer of insights is achievable, powered by the combination of SEO and Data Science. By connecting different data sets, marketers can now gain a new understanding that helps drive value faster and guides the development of future search marketing technology.

Today's data revolution is happening so fast that a step change in technology is needed for marketers to make sense of and extract value from all the information at their disposal. As McKinsey research shows, companies that use data effectively are 23 times more likely to outperform their competitors.

To help marketers stay ahead in their industries, BrightEdge constantly innovates ahead of the market so they can evolve in line with major shifts. From the creation of the Data Cube in 2014 to the pioneering launch of real-time SEO with BrightEdge Instant in 2019, innovation and customer success have always been BrightEdge’s top priorities.

Fast forward to 2022, and following on from the acquisition of Oncrawl earlier this year, the creation of a new intelligent system is unlocking a new level of insights never seen in our industry before. By bringing together best-of-breed technologies – BrightEdge SearchIQ and OnCrawl Data Science – both companies are revealing a new layer of insights that represent the future of SEO via new intelligent systems.

As a result, these findings can be highly insightful, helping marketers make decisions and take action based on hard, industry-specific data points rather than generic one-size-fits-all best practices.

For example, new insights show how web experiences vary by industry.

  • In the retail industry, manufacturer product descriptions create too many duplicate content experiences.
  • Short and concise content wins in the banking industry.
  • It is vital to create unique experiences with fast page speed in the insurance industry.
  • Focusing on unique, scannable content (5-10 minute read time) matters most in real estate.

"The most critical challenge that marketers face today and in the future is making sense of all the data at their disposal," said BrightEdge CEO Jim Yu. "Data needs to be structured, and new patterns discovered to drive meaningful value. There are distinct differences between what matters in some markets compared to others. As a result, a new type of technology based on connected data sets is required to deliver unique and specific layers of intelligence that help marketers drive value even faster, and with less manual action." 

The new intelligent system combines data from BrightEdge SearchIQ technology and Oncrawl's Data Science modules. The convergence of SEO and Data Science, and the birth of new intelligent systems, represents a new future for the SEO and digital marketing community.

 

About BrightEdge

BrightEdge, the global leader in enterprise organic search and content performance, empowers marketers to transform online content into business results, such as traffic, conversions, and revenue. It is powered by a sophisticated deep learning engine, the BrightEdge platform. It is the only company capable of web-wide, real-time measurement of content engagement across all digital channels, including search, social, and mobile. BrightEdge's thousands of enterprise customers include global brands, such as Microsoft and Adobe, and 64 of the Fortune 100 and 9 of 10 leading international digital agencies. The company has offices worldwide and is headquartered in Foster City, California.

About Oncrawl

Oncrawl is a technical SEO platform that pioneered big data infrastructure in crawl technology and semantic analysis of organic search data. Their solutions help more than a thousand clients in 66 countries to improve their organic traffic, rankings, and revenues by opening Google's black box. Clients include Vistaprint, Canon, Lastminute.com, Forbes, and other leading companies. In 2021, Oncrawl became the most awarded SEO platform with multiple awards at the US, UK, Canadian, Global, European, APAC, and Mena Search Awards as Best SEO Software.


Minification and SEO: The Complete Guide

andrew.riker
Posted 3 years 11 months ago · 9 min read

Minification of web code is essential when optimizing for search engine rankings. However, it can seem like an opaque topic to those without coding knowledge. 

Despite appearances, minification is a relatively straightforward process. And web developers have access to a range of tools, many of them free, to streamline their workflow when minifying code. What’s more, there are various platforms for measuring the effects of minification in terms of page load time and overall site speed. 

In this post, you’ll learn about minification and why it’s important. We’ll also show you how to minify your site’s code using a variety of methods, from manual inputs to the largely hands-off option of a CDN. 

What Is Minification?

Minification is the process of shortening source code to remove unnecessary characters, empty space, comments, delimiters, long-name variables, and other superfluous elements. In the context of search engine optimization (SEO), minification is applied to languages typically used for web development like HTML, CSS, and JavaScript. 

Here’s a quick example of minification with some JavaScript code. Below is the original JavaScript code: 

// example of a function called greetbeusers()
function greetbeusers() {
  console.log("Hello BE Readers");
}

After minification, it looks like this: 

function greetbeusers(){console.log("Hello BE Readers")}

Some people ask why programmers don’t write minified code directly. And there’s a simple reason: it’s very difficult to write and maintain code without spaces, comments, and properly named functions. Minified code is designed to be parsed easily by web browsers, not read by humans. 
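To make the mechanics concrete, here’s a deliberately naive minifier sketch in JavaScript. It handles only simple input (it would mangle code containing "//" inside strings, such as URLs); real tools like Terser or cssnano do far more:

```javascript
// Deliberately naive minifier: strips // comments, indentation,
// and line breaks. Illustration only, not production-ready.
function naiveMinify(source) {
  return source
    .replace(/\/\/[^\n]*/g, '')        // drop single-line comments
    .split('\n')
    .map((line) => line.trim())        // drop leading/trailing whitespace
    .filter((line) => line.length > 0) // drop now-empty lines
    .join('');
}

const original = `
// example of a simple greeting function
function greetbeusers() {
    console.log("Hello BE Readers");
}
`;

console.log(naiveMinify(original));
// function greetbeusers() {console.log("Hello BE Readers");}
```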

Why Is Minification Important?

Minification is important because it improves both site speed and search robot crawlability. Websites and online applications that use minified code take less time to load, primarily because of lower bandwidth use and faster script execution, which in turn increases user satisfaction. 

This is relevant for SEO because Google incorporates speed as a key ranking factor, to the point where it provides dedicated tools (mainly in the form of PageSpeed Insights and the associated API) for web admins to monitor the performance of their pages.

Google treats the minification of CSS and JavaScript as one of the top-priority audits in its speed assessments, and we see it flagged across most sites we assess.

Differences Between Minification and Similar Processes

Minification is often confused with other processes that alter source code. It’s important to understand the main differences because they tend to have distinct SEO roles. 

Here is a quick overview of the main processes often mistaken for minification:

  • Concatenation - When files are merged they are said to be “concatenated.” It is common practice to combine different CSS files into one, thus lowering the amount of server calls required to load a website.
  • Compression - Compression reduces the size of a file without forfeiting any data (or losing only limited amounts). Compression involves capturing the same amount of data in fewer bytes compared to the parent file. GZIP and Deflate compression are commonly used by SEOs. 
  • Encryption - Encryption converts code into a non-readable form that can only be decoded with a cipher. It is primarily used to protect information during transit. 
  • Obfuscation - Obfuscation involves creating complex code that is unreadable by humans, usually for the purpose of concealment. 

When it comes to minifying source code, you can either proceed manually with minification tools or by leveraging a content delivery network (CDN). We’ll look at minification tools first. 

Minification Tools

Manual minification programs broadly fall into two categories: online interfaces and web development apps with minification features, particularly IDEs and GUI tools. It’s also possible to use command line apps to quickly minify source code. 

Online tools, which offer a browser interface into which code is pasted, are unsuitable for most developers, especially those working on enterprise SEO. They are rudimentary applications and provide little in the way of customization. 

The second, more feasible option is to minify code directly within a development environment. Programs that don’t provide native minification features can usually be modified with extensions. There are, for example, numerous minification plugins for Microsoft Visual Studio. GUI applications like Koala and Prepros include native minification and bundling functionality.

Content Delivery Network

Content delivery networks (CDNs) provide one of the most efficient ways of minifying code, especially for large, complex websites. They are also typically cheaper when compared with the equivalent development costs of manually minifying and uploading source code. 

A CDN is a network of servers distributed across geographic regions; some CDNs, like Cloudflare, are global. Cached versions of sites are stored on these servers, thereby reducing transfer times when requests are made. 

Manual minification is usually a cumbersome process that requires the maintenance of two groups of server-side files, one set containing the human-readable source code and another containing minified code. Ensuring consistency across these files can be difficult, especially when dealing with large sites and web applications. 

A CDN overcomes this problem by storing minified versions of a site on its caching servers. In the vast majority of cases, no configuration is required on the hosting server. 

Minification: An Essential Piece of the SEO Jigsaw Puzzle

Minification improves overall site speed. As a result, it can have a significant impact on page rankings, since faster pages allow search engines to traverse and understand more of your site. What’s more, it’s a very quick and inexpensive change to make to a site’s source code. And while there are some potential issues that may arise, they tend to be rare and easy to fix. 

However, keep in mind that minification is not the only aspect of SEO related to site speed that requires consideration. Minification is best undertaken in conjunction with other types of performance optimization, including image compression, rectifying redirects, enabling compression, and so on. 

More Resources

Are you eager to ensure your pages are loading as quickly as possible? Here are some more resources to check out:

What is Page Speed in SEO? 9 Best Page Speed Fixes

Core Web Vitals: Preparing for the Page Experience Update

Optimizing Images for SEO

SEO Bright Now: May 5, 2022

andrew.riker
Posted 3 years 11 months ago · 9 min read

What percentage of search clicks do you think goes to websites featured in snippets? According to one study, the answer is 35.1%. Whichever way you look at it, that’s a hefty amount of traffic. So we think it’s excellent news that Google is testing two additions to featured snippets—“From the web” and “Other sites say”—that give exposure to additional sites.

In other news, Google has made its first public reference to SpamBrain, an AI-powered spam filter, in its 2021 webspam report. As SEOs, it’s essential we keep up to date with Google’s position on what constitutes spam.

There are also some technical updates happening in the SEO space. BingBot will have fully transitioned to a new user agent by Q4 2022, with implications for certain websites. And if your site serves dynamic web pages, you may be affected by Google’s announcement to launch SXG (Signed Exchanges) for desktop users.

Finally, Alphabet has published its quarter one earnings, so we’ll take a look at that. Search traffic has played a sizable role in the company’s revenue growth, as per usual.

With all that in mind, here’s your twice-monthly roundup of all the latest news, announcements, and other noteworthy developments from the world of search.

Google Tests Two Snippet Features: “From the web” and “Other sites say”

Snippets account for 35.1% of all clicks through search engines. As such, they represent a sizable chunk of search traffic competing with first rank results. In recent weeks, Google has been testing out two new features: a “From the web” section which displays content in the form of extracts from multiple websites, and an area labeled “Other sites say,” which lists alternatives to the main featured snippet.

Considering the prevalence of snippets, this is potentially big news for SEOs. The caveat being, of course, that it’s still early days and we don’t know whether Google will commit to a full rollout.

Depending on how often your content comes up in snippets, this news is either good or bad. Our expectation is that the majority of websites will benefit from the increased exposure.

Google Releases Webspam 2021 Report and References AI System SpamBrain

Google released its 2021 webspam report on the 21st of April. Notably, the report directly referenced SpamBrain, an AI system for filtering spam websites, and detailed how it has helped “keep more than 99% of searches spam-free.”

So why does this matter from an SEO perspective? It’s important for web admins to stay up to date with Google’s spam policies. It’s the best way to protect against updates to Google’s filters and ensure rankings aren’t negatively affected with the type of content you produce.

Recognizing that SpamBrain sits at the core of Google’s anti-spam processes gives SEOs a long-term reference point for establishing what constitutes spam, and helps content creators and webmasters avoid certain triggers.

Bing to Drop Old BingBot User Agents

Bing has said that it will finalize the move over to new crawler user agents, with separate ones for mobile and desktop, by the end of this year. In December 2019, Bing announced, “Bing is adopting the new Microsoft Edge as the engine to run JavaScript and render web pages.”

Web admins are advised to download Microsoft Edge to check that their sites render correctly. If your site doesn’t currently support the new user agents, you’ll need to update its code. You can learn how to modify how Bing crawlers interact with your site by consulting the official documentation. The majority of sites will not be affected, but it’s something to check if you haven’t already.

Google to Launch SXG (Signed Exchange) Support for Desktop Users

Google announced that it will provide SXG support for desktop users. Signed Exchange technology enables the pre-fetching of websites. It has a variety of SEO benefits, particularly related to site speed. You can learn more about it (and how to implement it on your site) by checking the documentation in Google Search Central.

Sites using dynamic serving will need to make some changes to ensure desktop users aren’t directed to the mobile versions of pages. If you use responsive design or separate URLs for mobile and desktop, you don’t need to take action.

John Mueller Provides Information About Optimizing for Google Lens

In a recent Google office-hours hangout, Search Advocate John Mueller answered a question about Google Lens, available in the Chrome, Android, and iOS apps. He talked about how the multisearch feature, which allows people to search with images and text simultaneously, might affect SEO.

In essence, SEOs should ensure they’re following best practices when using images—proper context, correct tags, device-friendly, and so on. John Mueller said there’s not much to do manually to improve rankings as long as existing practices are followed.

If your content lends itself to being found through images, particularly if you run an ecommerce store, it is worth auditing your site to ensure that all visual assets are optimized, crawled and indexed.

Google Offers SEO Certification

Google now offers search engine optimization training as part of its Digital Marketing and Ecommerce Certificate, offered through the Grow With Google training platform. This has come as a surprise to many. Google has traditionally maintained it would not provide SEO certifications (like it does with Ads, Analytics, and other tools).

The new course has received a mixed response from industry insiders, with one commentator pointing out the potential for SEOs to misuse Google’s name for leverage with customers.

However, if you run a business, there would seem to be definite value in employing search marketers with training from Google specialists. Similarly, it may be in your organization’s interests to encourage existing employees to complete the certification.

Google and Bing Publish First Quarter Earnings

To finish off, let’s look at the first-quarter earnings of Alphabet (Google’s parent company) and the third-quarter earnings of Microsoft.

Google reported revenues of $68 billion, up from $55.3 billion in the first quarter of 2021, of which $54.7 billion was from advertising revenue. This indicates strong growth and a limited impact from halting ad sales in Russia after the invasion of Ukraine. These figures put to bed any talk of a slump in ad spending, which was a widespread prediction going into 2022.

Microsoft reported similar growth of around 20% in the same quarter, surpassing its earnings estimate and driving an increase in its share price.

By all accounts, these reports point to the ongoing strength and market stability of search engines, and the role that SEO will play. And on that note, it’s over and out.

2022 U.S. Travel & Hospitality Industry Research Report

When the pandemic started in 2020, the U.S. travel industry saw immediate and dramatic declines in air passenger volume and hotel stays. The rollout of vaccines in 2021 brought some relief and an uptick in travel demand, which is growing still through the start of 2022.

In this report we delve into a deeper analysis of search trends around travel over roughly the last four years. To do this and to better understand the nature of consumer attitudes and the dynamics of consumer interest across the U.S. travel industry, we conducted an analysis of organic search trends related to travel using BrightEdge Data Cube.

In this guide we will explore:

  • Search trends for the overall travel industry
  • Differences in how the pandemic affected segments of the travel industry, including hotels, flights, and activities
  • Directional insights for the travel industry in 2022

Download the report today!
