FACT-Finder, a company that offers ecommerce companies tools to personalize their sites with features like AI-driven recommendations, said it has acquired Loop54, a provider of personalized search results. The deal is the latest in a wave of consolidation in the ecommerce world: a host of companies arose to offer personalization built on new technologies like AI, and the bigger players are now acquiring the smaller ones, particularly in the ecommerce software-as-a-service (SaaS) search market. On the smaller side, we reported last week on Coveo's acquisition of AI-powered personalization provider Qubit. On the much bigger side, reports emerged yesterday that PayPal is making a $45 billion bid for e-commerce giant Pinterest. "With the expertise and unique approach that our new colleagues at Loop54 bring to the table, we will significantly expand our market leadership and push the bounds of what is possible in e-commerce," said Emile Bloemen, CEO of FACT-Finder.
As we're able to introduce new technologies like MUM into Search, they'll help us greatly improve our systems and introduce entirely new product experiences. And they can also help us tackle other challenges we face. Improved AI systems can help bolster our spam fighting capabilities and even help us combat known loss patterns. In fact, we recently introduced a BERT-based system to better identify queries seeking explicit content, so we can better avoid shocking or offending users not looking for that information, and ultimately make our Search experience safer for everyone.
Today, Google is working to make its search engine better able to meet the needs of users. For example, when a user wants to browse pages related to a specific topic such as technology, artificial intelligence will surface the best pages whose content is about technology; if the user has questions, the artificial intelligence will answer them. Pandu Nayak, Vice President of Search at Google, gave the example of a query like "Can you get someone's medicine from the pharmacy?" That question had been a challenge for Google's search engine, which was unable to understand it.
Google is developing a new feature called Big Moments, which will compete with rivals Facebook and Twitter in delivering the latest breaking news updates during major events. The COVID-19 pandemic forced the search engine to react quickly and constantly to its users' needs for the latest and most authoritative information, according to Google. A team at Google has been working on the project for over a year, after the company struggled to provide the latest updates on the U.S. Capitol attack in January and Black Lives Matter protests last summer, says The Information, a Silicon Valley-based technology news site. Big Moments hopes to build upon Google's Full Coverage feature, which it launched in Google News in 2018 and later integrated with its search engine in March of 2021. Full Coverage allows users to tap into a news headline and see how that story is reported from a variety of sources.
Using MUM, we can even show related topics that aren't explicitly mentioned in the video, based on our advanced understanding of information in the video. In this example, while the video doesn't say the words "macaroni penguin's life story," our systems understand that topics contained in the video relate to this topic, like how macaroni penguins find their family members and navigate predators. The first version of this feature will roll out in the coming weeks, and we'll add more visual enhancements in the coming months. Across all these MUM experiences, we look forward to helping people discover more web pages, videos, images and ideas that they may not have come across or otherwise searched for. We're also making it easier to shop from the widest range of merchants, big and small, no matter what you're looking for.
In May, Google executives unveiled experimental new artificial intelligence trained with text and images they said would make internet searches more intuitive. Wednesday, Google offered a glimpse into how the tech will change the way people search the web. Starting next year, the Multitask Unified Model, or MUM, will enable Google users to combine text and image searches using Lens, a smartphone app that's also incorporated into Google search and other products. So you could, for example, take a picture of a shirt with Lens, then search for "socks with this pattern." Searching "how to fix" on an image of a bike part will surface instructional videos or blog posts.
Google has announced a new redesign of its search tools, making them more visual and adding extra contextual information to its results. At its Search On event, the web giant also announced new features for Google Chrome and its Google Lens artificially-intelligent photo software. The main aesthetic change is visually browsable results, "for searches where you need inspiration" such as "pour painting ideas", Google says, which will surface a series of pictures at the top of search results without the user having to navigate to the Images tab. It will also bring in more contextual information, rolled out over the coming months, with a new "Things to know" section that includes "different dimensions people typically search for". For those searching how to paint with acrylics, for example, underneath the top result will be a series of drop-down results that include a step-by-step guide, tips, or style options.
Google announced today it will be applying A.I. advancements, including a new technology called Multitask Unified Model (MUM) to improve Google Search. At the company's Search On event, the company demonstrated new features, including those that leverage MUM, to better connect web searchers to the content they're looking for, while also making web search feel more natural and intuitive. One of the features being launched is called "Things to know," which will focus on making it easier for people to understand new topics they're searching for. This feature understands how people typically explore various topics and then shows web searchers the aspects of the topic people are most likely to look at first. For example, Google explained, if you were searching for "acrylic painting," it may suggest "Things to know" like how to get started with painting, step-by-step, or the different styles of acrylic painting, tips about acrylic painting, how to clean acrylic paint, and more.
Being able to recommend links between users in online social networks is important both for users seeking to connect with like-minded individuals and for the platforms themselves, as well as for third parties leveraging social media information to grow their business. Predictions are typically based on unsupervised or supervised learning, often leveraging simple yet effective graph topological information, such as the number of common neighbors. However, we argue that richer information about the personal social structure of individuals might lead to better predictions. In this paper, we propose to leverage well-established social cognitive theories to improve link prediction performance. According to these theories, individuals arrange their social relationships along, on average, five concentric circles of decreasing intimacy. We postulate that relationships in different circles have different importance in predicting new links. In order to validate this claim, we focus on popular feature-extraction prediction algorithms (both unsupervised and supervised) and we extend them to include social-circle awareness. We validate the prediction performance of these circle-aware algorithms against several benchmarks (including their baseline versions as well as node-embedding- and GNN-based link prediction), leveraging two Twitter datasets comprising a community of video gamers and generic users. We show that social awareness generally provides significant improvements in prediction performance, also beating state-of-the-art solutions like node2vec and SEAL, without increasing computational complexity. Finally, we show that social awareness can be used in place of a classifier (which may be costly or impractical) for targeting a specific category of users.
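The circle-aware idea described in the abstract can be sketched as a weighted variant of the classic common-neighbors score, in which each shared neighbor is weighted by the intimacy circle it occupies for each endpoint. This is a minimal illustrative sketch: the circle weights, data structures, and function names below are assumptions for exposition, not the paper's actual formulation.

```python
# Hypothetical weights for five Dunbar-style concentric circles of
# decreasing intimacy (circle 1 = most intimate). These values are an
# illustrative assumption, not the weights used in the paper.
CIRCLE_WEIGHT = {1: 1.0, 2: 0.8, 3: 0.6, 4: 0.4, 5: 0.2}

def circle_aware_common_neighbors(adj, circle_of, u, v):
    """Score a candidate link (u, v) by their common neighbors,
    weighting each shared neighbor by the circle it occupies
    for u and for v.

    adj       -- dict: node -> set of neighbor nodes
    circle_of -- dict: (node, neighbor) -> circle index in 1..5
    """
    score = 0.0
    for w in adj[u] & adj[v]:          # shared neighbors of u and v
        weight_u = CIRCLE_WEIGHT[circle_of[(u, w)]]
        weight_v = CIRCLE_WEIGHT[circle_of[(v, w)]]
        score += weight_u * weight_v   # inner-circle ties count more
    return score
```

Setting every weight to 1.0 recovers the plain common-neighbors count, so the circle-aware score adds no asymptotic cost over the baseline, consistent with the abstract's claim that the extension does not increase computational complexity.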
Snapchat's camera has to date mostly been associated with sending disappearing messages and goofy AR effects, like a virtual dancing hot dog. But what if it did things for you, like suggest ways to make your videos look and sound better? Or show you a similar shirt based on the one you're looking at? Starting Thursday, a feature called Scan is being upgraded and placed front and center in the app's camera, letting it identify a range of things in the real world, like clothes or dog breeds. Scan's prominent placement in Snapchat means that the company is slowly becoming not just a messaging app, but a visual search engine.