In the race to make truly useful AI for a mass audience, Meta just jumped forward a few key steps — including AI's ability to "see" objects and provide live, lip-synched translations.
At the Meta Connect developers' conference, CEO Mark Zuckerberg unveiled the latest version of Llama, the open-source large language model (LLM) powering the AI chatbot in the company's main services: Facebook, WhatsApp, Messenger, and Instagram.
Given that reach, Zuckerberg described Meta AI as "the most-used AI assistant in the world, probably," with about 500 million active users. The service won't be available in the European Union yet, given that Meta hasn't joined the EU's AI pact, but Zuckerberg said he remains "eternally optimistic that we can figure that out."
He's also optimistic that the open-source Llama — a contrast to Google's Gemini and OpenAI's GPT, both proprietary closed systems — will become the industry standard. "Open source is the most cost-effective and the most customizable," Zuckerberg said. Llama is "sort of the Linux of AI."
But what can you do with it? "It can understand images as well as text," Zuckerberg added — showing how a photo could be manipulated simply by asking the Llama chatbot to make edits. "My family now spends a lot of time taking photos and making them more ridiculous."
Voice chat is now rolling out to all versions of Meta AI, including voices from celebrities such as Judi Dench, John Cena and Awkwafina. Another user-friendly update: when using Meta AI's voice assistant on the company's smart glasses, you no longer have to say "hey Meta" or "look and tell me."
SEE ALSO: Meta Connect 2024: Meta's Orion AR glasses unveiled

Zuckerberg and his executives also demonstrated a number of use cases. For example, a user can set up Meta AI to provide pre-recorded responses to frequently asked questions over video. You can use it to remember where you parked. Or you can ask it to suggest items in your room that might help to accessorize a dress.
The most notable, and possibly most useful feature: live translation. Currently available in Spanish, French, Italian and English, the AI will automatically repeat what the other person said in your chosen language. Zuckerberg, who admitted that he doesn't really know Spanish, demonstrated this feature by having an awkward conversation live on stage with UFC fighter Brandon Moreno.
Slightly more impressive was the live translation option on Reels and other Meta videos. The AI will synchronize the speakers' lips so they look like they're actually speaking the language you're hearing. Nothing creepy about that at all.