Meta AI is Now a Standalone App and It’s Good

DJURO SEN - EDITOR

Today Meta launched the Meta AI app, built with Llama 4.

When it comes to AI interaction, accuracy is paramount, but speed is a very close second. My initial voice experience with the new Meta AI standalone app delivered on both: it was fast and accurate. Things take a wild turn, though, when you activate the full-duplex demo. This more natural-sounding AI is very conversational (complete with ums and ahs) but also experimental. I can’t wait to share that experience on video.

WHAT’S HAPPENING IN BRIEF

  • Meta launched the first version of the Meta AI app: a new standalone app where you can access Meta AI.
  • The app includes a Discover feed, a place to share and explore how others are using AI. 
  • The new Meta AI app will merge with the Meta View companion app for Ray-Ban Meta glasses and will serve as the companion app for the AI glasses.

Meta says, “We’ve improved our underlying model with Llama 4 to bring you responses that feel more personal and relevant, and more conversational in tone. And the app integrates with other Meta AI features like image generation and editing, which can now all be done through a voice or text conversation with your AI assistant.”

This is a serious challenge to ChatGPT.

The voice demo, built with full-duplex speech technology, needs to be toggled on to test, and it doesn’t appear to be available everywhere. It’s a different type of interaction: the AI generates voice directly instead of reading out written responses.

This is why Meta included it.

“It doesn’t have access to the web or real-time information, but we wanted to provide a glimpse into the future by letting people experiment with this,” Meta says in its blog.

“You may encounter technical issues or inconsistencies, so we’ll continue to gather feedback to help us improve the experience over time. Voice conversations, including the full-duplex demo, are available in Australia to start.”

The brains behind Meta AI is Llama 4. It’s designed to solve problems and better understand the world around you. Its ability to search the web and give meaningful responses is impressive.

Wearables are going to play a larger part in the AI expansion.

“Glasses have emerged as the most exciting new hardware category of the AI era, and Ray-Ban Meta glasses have led the way in defining what’s possible,” Meta says.

“To integrate all our most powerful AI experiences, we’re merging the new Meta AI app with the Meta View companion app for Ray-Ban Meta glasses, and in some countries you’ll be able to switch from interacting with Meta AI on your glasses to the app. You’ll be able to start a conversation on your glasses, then access it in your history tab from the app or web to pick up where you left off. And you can chat between the app and the web bidirectionally (you cannot start in the app or on the web and pick up where you left off on your glasses).”

Meta AI on the web is also being upgraded. It comes with voice interactions and the new Discover feed, just like the app. The web interface has been optimised for larger screens and desktop workflows and includes an improved image generation experience.

Australians will be able to test-drive a rich document editor, designed to generate documents full of text and images and export them as PDFs. Meta is also testing the ability to import documents for Meta AI to analyse and understand.

More reporting and video examples coming soon.
