Being smaller also allows the model to run on phones and laptops rather than relying on the cloud.
Microsoft shared details about Phi-3 in a research paper. The Verge then shared insight on the model and quotes from Microsoft.

Microsoft’s new small AI model can compete with larger models that power tools like ChatGPT and Copilot.
Phi-3 Mini is a 3.8 billion-parameter language model that was trained on 3.3 trillion tokens.
Phi-3 Mini is a scaled-up version of Phi-2, which was released in December 2023.
Lightweight models aren’t exclusive to Microsoft.

Some PCs will be able to run Microsoft Copilot locally rather than through the cloud.
Microsoft's training approach was inspired by how children learn from hearing bedtime stories, according to a company vice president.
A model like Phi-3 Mini is not meant to replace GPT-4 or other large language models.
Instead, small models can focus on specific tasks and use cases.

Small models are also useful for companies using internal data for training.
Models like Phi-3 Mini are also compact enough to run on phones, laptops, and other small devices, as the sketch below illustrates.
NPU performance is the key figure here because Copilot requires at least 40 TOPS to run locally.
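For readers curious what running a model of this size locally looks like in practice, here is a minimal sketch using the Hugging Face Transformers library in Python. The model identifier and generation settings are illustrative assumptions rather than details from Microsoft's paper, and the machine still needs enough memory to hold the roughly 3.8 billion parameters.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# NOTE: this model ID is an assumption for illustration; check the actual
# Hugging Face listing (some releases also require trust_remote_code=True).
model_id = "microsoft/Phi-3-mini-4k-instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Explain in one sentence why small language models are useful on-device."
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a short completion entirely on local hardware -- no cloud call involved.
outputs = model.generate(**inputs, max_new_tokens=80)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```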

Qualcomm’s Snapdragon X Elite has 45 TOPS of NPU performance, meaning the processor can also power Copilot locally.