Microsoft has unveiled the details behind the Copilot+ PC, which uses NPU power to run artificial intelligence models on-device. The centerpiece model, Phi Silica, is a variant of the Phi-3 model customized for NPU execution.
Derived from the Phi-3-mini model, Phi Silica has been scaled down further (from 3.8B to 3.3B parameters) for optimal NPU performance. Capable of processing 650 tokens per second while drawing just 1.5 watts, Phi Silica showcases impressive efficiency.
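To put those figures in perspective, here is a rough back-of-the-envelope estimate of energy per token, assuming the 650 tokens/s throughput and 1.5 W power draw describe the same sustained workload:

```python
# Back-of-the-envelope efficiency estimate from the quoted figures.
# Assumes 650 tokens/s and 1.5 W refer to the same sustained workload.

tokens_per_second = 650   # quoted throughput
power_watts = 1.5         # quoted power draw

energy_per_token_mj = power_watts / tokens_per_second * 1000  # millijoules per token
tokens_per_joule = tokens_per_second / power_watts

print(f"~{energy_per_token_mj:.2f} mJ per token, ~{tokens_per_joule:.0f} tokens per joule")
# ~2.31 mJ per token, ~433 tokens per joule
```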
Microsoft states that Copilot+ PCs will ship with roughly 40 built-in on-device models, collectively known as the Windows Copilot Library. Future updates will add AI customization features such as retrieval-augmented generation (RAG), vector embeddings, and text summarization; the general RAG pattern is sketched below.
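The sketch below illustrates only the general idea of RAG over vector embeddings; it is not the Windows Copilot Library API (which Microsoft exposes through Windows developer APIs), and the `embed` function here is a hypothetical placeholder for an on-device embedding model.

```python
# Conceptual sketch of retrieval-augmented generation (RAG) with vector embeddings.
# Not the Windows Copilot Library API; the embedding function is a placeholder.
import hashlib
import numpy as np

def embed(text: str) -> np.ndarray:
    """Placeholder embedding: a real system would call an on-device embedding model."""
    seed = int.from_bytes(hashlib.sha256(text.encode()).digest()[:4], "little")
    vec = np.random.default_rng(seed).standard_normal(128)
    return vec / np.linalg.norm(vec)

documents = [
    "Phi Silica is a 3.3B-parameter model tuned for NPU execution.",
    "Copilot+ PCs ship with roughly 40 on-device models.",
    "DirectML lets frameworks like PyTorch target local accelerators.",
]
doc_vectors = np.stack([embed(d) for d in documents])

def retrieve(query: str, k: int = 1) -> list[str]:
    """Rank documents by cosine similarity to the query embedding."""
    scores = doc_vectors @ embed(query)
    return [documents[i] for i in np.argsort(scores)[::-1][:k]]

# The retrieved context would then be prepended to the prompt sent to the language model.
print(retrieve("Which model runs on the NPU?"))
```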
When combined with frameworks and tools such as PyTorch and DirectML, the Library grows into a comprehensive AI development suite that Microsoft calls the Windows Copilot Runtime.
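As a small illustration of that tooling, the snippet below runs a PyTorch computation through DirectML. It assumes the torch-directml package is installed (pip install torch-directml); actual device support depends on the machine's drivers and hardware.

```python
# Minimal sketch of running a PyTorch computation on a DirectML device.
# Assumes the torch-directml package is installed; hardware support varies.
import torch
import torch_directml

dml = torch_directml.device()   # select the default DirectML device

x = torch.randn(4, 8).to(dml)   # move tensors onto the DirectML device
w = torch.randn(8, 2).to(dml)
y = torch.relu(x @ w)           # the matmul and ReLU execute through DirectML

print(y.shape, y.device)
```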
TLDR: Microsoft reveals the Copilot+ PC's Phi Silica model, which runs AI on the NPU with impressive efficiency, alongside a growing set of customizable AI models in the Windows Copilot Library.