
Apple releases eight small AI language models aimed at on-device use

posted on April 26, 2024
by l33tdawg
Source: Ars Technica

In the world of AI, what might be called "small language models" have been growing in popularity recently because they can be run on a local device instead of requiring data center-grade computers in the cloud. On Wednesday, Apple introduced a set of tiny source-available AI language models called OpenELM that are small enough to run directly on a smartphone. They're mostly proof-of-concept research models for now, but they could form the basis of future on-device AI offerings from Apple.

Apple's new AI models, collectively named OpenELM for "Open-source Efficient Language Models," are currently available on the Hugging Face Hub under an Apple Sample Code License. Since there are some restrictions in the license, it may not fit the commonly accepted definition of "open source," but the source code for OpenELM is available.
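Because the weights are hosted on the Hugging Face Hub, they can in principle be downloaded and run locally like any other checkpoint. The snippet below is a minimal sketch of that workflow using the transformers library; the repository name apple/OpenELM-270M, the trust_remote_code flag, and the tokenizer setup are assumptions to verify against the model card and the license terms, not details confirmed by the article.

```python
# Minimal sketch: loading the smallest OpenELM checkpoint with Hugging Face transformers.
# The repository name "apple/OpenELM-270M", the need for trust_remote_code, and the
# tokenizer handling are assumptions; consult the model card and the Apple Sample
# Code License before using the weights.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "apple/OpenELM-270M"  # smallest of the eight models (270 million parameters)

# Some model cards point to a separate tokenizer repository; adjust if needed.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "Small language models are useful on-device because"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```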

On Tuesday, we covered Microsoft's Phi-3 models, which aim to achieve something similar: a useful level of language understanding and processing performance in small AI models that can run locally. Phi-3-mini features 3.8 billion parameters, but some of Apple's OpenELM models are much smaller, ranging from 270 million to 3 billion parameters in eight distinct models.


Tags: Industry News, Artificial Intelligence, Apple
