• Google has introduced a new technology called Multitask Unified Model (MUM), which it says is 1,000 times more powerful than BERT.
• MUM uses the T5 text-to-text framework and is trained across 75 different languages and many different tasks at once, allowing it to develop a more comprehensive understanding of information and world knowledge than previous models.
• MUM has the potential to break down language barriers by transferring knowledge across languages. It can learn from sources that aren’t written in the language you wrote your search in, and help bring that information to you.
• Eventually, you might be able to take a photo of your hiking boots and ask “Can I use these to hike Mt. Fuji?” MUM would understand the image, connect it with your question, and let you know your boots would work just fine.
