
Meta's latest release includes small-scale models designed to run locally on mobile devices and edge hardware, alongside larger multimodal models.
Meta has expanded its Llama family with Llama 3.2. The release is notable for including lightweight 1B and 3B parameter text models optimized for mobile devices, as well as 11B and 90B multimodal models that can understand both text and images.
The 1B and 3B models are designed to run on Qualcomm and MediaTek hardware, enabling features like on-device summarization and personal assistants without the need for an internet connection. This is a major step toward ubiquitous, private AI that avoids cloud round-trips and keeps user data on the device.
Mark Zuckerberg emphasized that Llama 3.2 is part of Meta's commitment to open-source AI. By providing these models for free, Meta aims to make Llama the industry standard for both high-end cloud applications and lightweight edge computing, challenging the proprietary dominance of Google and Apple.