Microsoft’s interest in GPT-3 reflects the growing adoption of transformer-based deep learning models. More organizations are using these models because they are better at understanding semantic relationships. One problem Microsoft has found with transformer-based deep learning, however, is that subtle nuances in those relationships are often beyond the models. Microsoft Research has now created a neural network that nearly matches the capabilities of GPT-3. With 135 billion parameters, it is the biggest universal AI model ever built. According to a paper from the research team, the model is already running in Bing. That parameter count puts it close to the 175 billion parameters GPT-3 can handle, making it one of the most sophisticated artificial intelligence models ever made.
MEB
Microsoft is naming the AI model MEB (Make Every Feature Binary) and is leveraging it in the Bing search engine. Specifically, it analyzes users’ search terms and helps locate the most relevant websites. It surfaces the best results by working alongside other machine learning algorithms. As the research team explains:

“One reason MEB works so well as a complement to Transformer-based deep learning models for search relevance is that it can map single facts to features, allowing MEB to gain a more nuanced understanding of individual facts. For example, many deep neural network (DNN) language models might overgeneralize when filling in the blank in this sentence: ‘(blank) can fly.’ Since the majority of DNN training cases result in ‘birds can fly,’ DNN language models might only fill the blank with the word ‘birds.’”

You can read more about MEB and its capabilities on the Microsoft Research blog here.
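To make the “map single facts to features” idea concrete, here is a minimal toy sketch (not Microsoft’s implementation; the feature names and weights are invented for illustration). Each observed query/document pairing becomes its own 0/1 feature with its own learned weight, so individual facts such as “penguins can fly → unlikely” stay distinct instead of being generalized away:

```python
# Toy sketch of "make every feature binary" style sparse features.
# Every (query term, document term) pair is its own binary feature,
# so each fact can carry an independent weight.

def binary_features(query, doc_terms):
    """Cross each query term with each document term into 0/1 features."""
    return {f"q:{q}|d:{d}": 1 for q in query.split() for d in doc_terms}

# Hypothetical learned weights: the model keeps "birds can fly" and
# "penguins can fly" as separate facts rather than overgeneralizing.
weights = {
    "q:can_fly|d:birds": 0.9,
    "q:can_fly|d:penguins": -0.8,
}

def score(query, doc_terms):
    """Linear score over the sparse binary features."""
    feats = binary_features(query, doc_terms)
    return sum(weights.get(f, 0.0) * v for f, v in feats.items())

print(score("can_fly", ["birds"]))     # 0.9
print(score("can_fly", ["penguins"]))  # -0.8
```

A dense language model trained mostly on “birds can fly” might assign both completions similar scores; the per-fact binary features let the two cases be weighted independently, which is the nuance the quoted passage describes.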