Marketers and SEO professionals alike watch for Google search engine updates with a mix of anticipation and dread. Google announced its latest search engine update in May, when it introduced the Multitask Unified Model (MUM). MUM is a multimodal algorithm designed to answer complex queries by concurrently assessing information across multilingual text, images, video and audio. In September, at its Search On event, Google followed its earlier MUM announcement with further previews of how MUM could change the way people search for information.
Google MUM and a Brief Explanation of Multimodal Search
Since its launch in 1997, Google has consistently dominated the search engine market. Over the years, Google has made thousands of changes to its search algorithm, most recently with BERT. BERT enhanced voice search and added features that reorganized how information is presented on the SERP. With MUM, Google introduced a new machine learning model to account for more complex queries and the many ways information is deployed online.
A brief explanation of the significance of the multimodal model: multimodal learning is a machine learning technique that compares and combines information from multiple sources to form a single response. The “modal” in multimodal refers to a modality, a distinct type of data within media, such as visual data from images and video, language data from text documents, and audio data from music and sound recordings. Modalities are incorporated into the training dataset for a machine learning model. Multimodal sentiment analysis, for example, can inspect combinations of text, audio and visual data to assess sentiment toward an event or occurrence. With MUM, Google treats media types as modalities to improve the user experience with its search.
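To make the idea of combining modalities concrete, here is a minimal sketch of "late fusion," one common way multimodal sentiment analysis merges per-modality signals into a single score. The modality names, scores and weights below are illustrative assumptions for this example, not anything drawn from Google's MUM implementation.

```python
# Minimal late-fusion sketch: each modality (text, audio, visual) has already
# been scored for sentiment on a -1.0 (negative) to 1.0 (positive) scale,
# and we combine the scores into one weighted average.

def fuse_sentiment(modality_scores, weights=None):
    """Combine per-modality sentiment scores into a single fused score.

    modality_scores: dict mapping modality name -> sentiment score
    weights: optional dict mapping modality name -> relative weight
             (defaults to equal weighting across modalities)
    """
    if weights is None:
        weights = {name: 1.0 for name in modality_scores}
    total_weight = sum(weights[name] for name in modality_scores)
    return sum(score * weights[name]
               for name, score in modality_scores.items()) / total_weight

# Hypothetical example: positive text, neutral audio tone, mildly positive visuals.
scores = {"text": 0.8, "audio": 0.0, "visual": 0.4}
fused = fuse_sentiment(scores)
print(fused)  # a weighted average of the three modality scores
```

Production systems learn these fusion weights from data rather than hand-picking them, but the core idea is the same: independent modality signals are merged into one judgment.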
Multimodal models fit Google’s needs because of the growing number of non-text sources online, such as video (including livestreams) and audio files (such as podcasts). To develop MUM, Google trained the algorithm “across 75 different languages and many different tasks at once” to refine its comprehension of information and digital details. MUM also considers knowledge across languages, comparing a query against sources that aren’t written in the user’s native language to improve the accuracy of results. As a result, Google claims MUM is 1,000 times more powerful than BERT.
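The "many different tasks at once" phrasing describes multitask training, where a single shared model is optimized against several objectives simultaneously. A common way to do this is to minimize a weighted sum of per-task losses; the sketch below shows only that combination step. The task names and weights are hypothetical, and this is in no way Google's MUM training code.

```python
# Illustrative multitask-learning sketch: a shared model's total training
# loss is a weighted sum of the losses from each individual task.

def combined_loss(task_losses, task_weights):
    """Weighted sum of per-task losses, the quantity a multitask
    training loop would minimize at each step."""
    return sum(task_weights[task] * loss
               for task, loss in task_losses.items())

# Hypothetical per-task losses from one training step.
losses = {"translation": 0.9, "question_answering": 1.3, "image_captioning": 0.7}
# Hypothetical weights expressing how much each task influences the update.
weights = {"translation": 1.0, "question_answering": 0.5, "image_captioning": 0.25}

print(combined_loss(losses, weights))
```

Because every task updates the same underlying parameters, knowledge learned for one task (or one language) can transfer to the others, which is the property Google highlights in MUM's cross-language comprehension.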
Related Article: What You Need to Know About Google BERT and the Top Stories Carousel
What MUM Means to SEO Strategy, Google … and You
MUM complements a broader trend in the use of multiple forms of media as a communication method online. Marketers are increasingly deploying a variety of media to communicate with customers. Through MUM, Google will revitalize how it connects people to information on any given brand — potentially repositioning search as a competitor to the social media platforms, which people often use to engage brands.
For marketers, the addition of MUM to search will call for further refinement of content marketing strategies: ensuring correct labeling of audio and video files, and thinking creatively about how to coordinate content across the platforms that appear in search results.
For Google, MUM means upgrading how media from different platforms is matched in search results. Over the last few years I have reported on how posts from Pinterest and YouTube can factor into SEO query considerations. MUM is an evolution of that tactic, so marketers should be savvier about how their white papers, podcasts, memes and posts are deployed.
MUM also gives Google an opportunity to address some public concerns about machine learning bias. With its significant technological investment, Google sounds hopeful that MUM’s enhanced modeling across media can minimize bias in search results.
Related Article: 4 Reasons Why Explainable AI Is the Future of AI
What’s Next for MUM
Google will continue to invest in MUM as it launches a variety of updates across the products that rely on search, such as Search Console and Google Analytics 360. The first notable application of MUM will be in Google Lens, an image recognition application offered on Android phones. Marketers will see other “MUM-powered features and improvements” soon. In the meantime, Google will continue to test and refine MUM to address a number of concerns, including applying its latest research on reducing the carbon footprint of large-scale machine learning training systems. Most industry experts see MUM as the successor to BERT.
Marketers should recognize that their search and content strategies need robust, cohesive labeling and metadata when content launches online. The ability to link images, video and supporting documentation will be more critical for capturing the attention of MUM, and of prospective customers as a result.
Pierre DeBois is the founder of Zimana, a small business digital analytics consultancy. He reviews data from web analytics and social media dashboard solutions, then provides recommendations and web development actions that improve marketing strategy and business profitability.