News & features

Research Focus: Week of December 4, 2023
Using LLMs in a Rust-based formal verification framework; Rethinking network measurements with user feedback; 3D telemedicine using Holoportation™ communication technology could enhance overseas surgical visits.

The Power of Prompting
| Eric Horvitz
Microsoft Chief Scientific Officer Eric Horvitz explains how new prompting strategies can enable generalist large language models like GPT-4 to achieve exceptional expertise in specific domains, such as medicine, and outperform fine-tuned specialist models.

Research Focus: Week of November 22, 2023
A new deep-learning compiler for dynamic sparsity; Tongue Tap could make tongue gestures viable for VR/AR headsets; Ranking LLM-Generated Loop Invariants for Program Verification; Assessing the limits of zero-shot foundation models in single-cell biology.
In the news | Communications of the ACM
What Would the Chatbot Say?
Sebastien Bubeck is a senior principal research manager in the Machine Learning Foundations Group at Microsoft Research. Bubeck often generates stories about unicorns for his young daughter using a chatbot powered by GPT-4, the latest large language model (LLM) by…
In the news | Scientific American
When It Comes to AI Models, Bigger Isn’t Always Better
Artificial intelligence models are getting bigger, along with the data sets used to train them. But scaling down could solve some big AI problems. Artificial intelligence has been growing in size. The large language models (LLMs) that power prominent chatbots,…

Orca 2: Teaching Small Language Models How to Reason
| Ahmed Awadallah, Andres Codas, Luciano Del Corro, Hamed Khanpour, Shweti Mahajan, Arindam Mitra, Hamid Palangi, Corby Rosset, Clarisse Simoes Ribeiro, and Guoqing Zheng
At Microsoft, we’re expanding AI capabilities by training small language models to achieve the kind of enhanced reasoning and comprehension typically found only in much larger models.
In the news | VentureBeat
Microsoft releases Orca 2, a pair of small language models that outperform larger counterparts
Even as the world bears witness to the power struggle and mass resignation at OpenAI, Microsoft, the long-time backer of the AI major, is not slowing down its own AI efforts. Today, the research arm of the Satya Nadella-led company…

Skeleton-of-Thought: Parallel decoding speeds up and improves LLM output
| Xuefei Ning and Zinan Lin
This research was accepted by the 2024 International Conference on Learning Representations. Large language models (LLMs) such as LLaMA and OpenAI’s GPT-4 are revolutionizing technology. However, one of the common complaints about LLMs is their speed, or lack thereof. In…

What’s Your Story: Desney Tan
| Johannes Gehrke and Desney Tan
From service in the Singapore Armed Forces to autonomous navigation with NASA and VR with Disney, Desney Tan’s life journey hasn’t been linear. Learn how Tan landed at Microsoft and about the purpose guiding his work in the podcast series…