Microsoft Phi-3: Cheaper, smaller, yet better AI?

01:50 PM April 24, 2024

OpenAI has been rapidly transforming the world with ChatGPT and, later, GPT-4.
Consequently, people could hardly contain their excitement for the next version.

Yet, TechCrunch reported a perplexing statement from the company at an MIT event last year: “I think we’re at the end of the era where it’s going to be these giant models, and we’ll make them better in other ways.”

Surprisingly, it is AI rival Microsoft that has borne out that prediction with its Phi-3 family of AI models.
The official blog says they are the “most capable and cost-effective small language models (SLMs) available.”

What are Microsoft Phi-3’s features?

Microsoft announced the first member of the Phi-3 family: Phi-3-mini. It has only 3.8 billion parameters, yet Microsoft says it outperforms models twice its size.

Phi-3-mini also beats similar models across language, reasoning, math and coding benchmarks.

It comes in two context-length variants: 4K and 128K tokens. Context length is the amount of text, measured in tokens (roughly words), that a model can take in at once.

Moreover, it is the first model in its class to support a 128K-token context window with little impact on quality.

In other words, Microsoft Phi-3 can take in roughly that much text in one go without its performance degrading.
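For developers who want to see what those context windows mean in practice, here is a minimal Python sketch using the Hugging Face transformers library. It assumes Phi-3-mini’s tokenizer is published under an identifier such as microsoft/Phi-3-mini-4k-instruct; the exact model ID and token limits are illustrative, not Microsoft’s official example.

```python
# Minimal sketch: counting tokens to check whether a document fits in a
# context window. Assumes Phi-3-mini's tokenizer is available on Hugging Face
# under an ID such as "microsoft/Phi-3-mini-4k-instruct" (illustrative).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/Phi-3-mini-4k-instruct")

document = "A long report the model should read and summarize. " * 500

# Tokens, not words, are what the context window actually measures.
token_count = len(tokenizer.encode(document))
print(f"Document is about {token_count} tokens long")

if token_count <= 4_096:
    print("Fits in the 4K variant's context window.")
elif token_count <= 131_072:
    print("Too long for 4K; the 128K variant would be needed.")
else:
    print("Too long even for 128K; split the document first.")
```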

The latest model is also instruction-tuned, meaning it was trained to follow instructions phrased the way people naturally communicate. As a result, the model is ready to use out of the box.
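As a rough illustration of what “ready out of the box” looks like, the sketch below sends a plain-language request to the model through transformers. The checkpoint ID and generation settings are assumptions for illustration, not Microsoft’s official sample code.

```python
# Hedged sketch: asking an instruction-tuned Phi-3-mini a plain-language
# question. Assumes the checkpoint ID below exists on Hugging Face and that a
# recent transformers version is installed (older versions may need
# trust_remote_code=True).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed identifier
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {"role": "user",
     "content": "Explain in two sentences why small language models matter."}
]

# The chat template wraps the request in the special tokens the model was
# instruction-tuned on, so no custom prompt engineering is required.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```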

Microsoft says its latest AI program is a more practical choice for those who want to build generative AI applications.

Moreover, it is ideal for the following purposes:

  • Transforming businesses with generative AI
  • Resource-constrained environments, including on-device and offline inference scenarios (see the sketch after this list)
  • Latency-bound scenarios where fast response times are critical
  • Cost-constrained use cases, particularly those with simpler tasks
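
For the resource-constrained scenario above, one common approach (not specific to Microsoft’s guidance) is to load the model with 4-bit quantization so it fits on a modest GPU or laptop. The sketch below assumes the bitsandbytes library is installed and reuses the same illustrative checkpoint ID as earlier.

```python
# Hedged sketch: shrinking Phi-3-mini with 4-bit quantization for
# resource-constrained or offline use. Requires the bitsandbytes package and
# a GPU; the model ID is the same illustrative one as above.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed identifier
quant_config = BitsAndBytesConfig(load_in_4bit=True)  # store weights in ~4 bits

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",
)

prompt = tokenizer.apply_chat_template(
    [{"role": "user",
      "content": "List three offline uses for a small language model."}],
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

reply = model.generate(prompt, max_new_tokens=96)
print(tokenizer.decode(reply[0][prompt.shape[-1]:], skip_special_tokens=True))
```

Once the weights are cached locally, inference itself needs no internet connection, which is the point of on-device and offline use.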

Soon, IT professionals in developing countries like the Philippines may use Microsoft Phi-3 to build generative AI apps.

The Bill Gates-founded company stated it will launch additional models in the coming weeks: Phi-3-small (7B) and Phi-3-medium (14B). 
