Author: Boxu Li
Meta’s Vibes Feed is a recently launched short-form video feed composed entirely of AI-generated videos. Introduced in late September 2025, this feature has quickly drawn attention for its TikTok-style endless scroll of algorithmically personalized clips – except in this case, every video is generated by artificial intelligence[1][2]. In the weeks since launch, Vibes has coincided with a surge in engagement on Meta’s AI app, raising important questions about how AI-driven content feeds work and how they compare to platforms like TikTok, YouTube Shorts, and Instagram Reels. This report provides an overview of Vibes, a technical look at its content generation and recommendation system, its early growth and user engagement, comparisons to other feeds, and insights into its future trajectory.

Vibes is an AI video feed integrated into the Meta AI mobile app and website (at meta.ai), designed for discovering, creating, and sharing short AI-generated videos[4]. In essence, it functions similarly to popular short-video feeds on social platforms – a user opens the feed and vertically scrolls through an endless stream of clips. However, unlike TikTok or Reels where videos are created by human users, Vibes clips are generated by artificial intelligence models (often from text or image prompts) and then shared by users or creators via the feed[1]. Mark Zuckerberg unveiled Vibes with examples of its AI-made videos, such as “a group of fuzzy-looking creatures hopping between fuzzy cubes, a cat kneading dough, and an ancient Egyptian woman taking a selfie overlooking Ancient Egypt”[5] – surreal, creative snippets emblematic of AI-generated visuals.
Meta positions Vibes as a way to spark creativity and inspiration. As you browse Vibes, you’ll encounter a range of AI videos contributed by “creators and communities,” according to the company, and the feed will gradually personalize to your tastes over time[2]. If a video catches your eye, Vibes lets you take action: you can generate your own video from scratch or remix an existing video on the feed by tweaking its visuals, style, or music[6]. The interface includes built-in AI editing tools so that with a few taps you might, for example, take an AI clip of a city skyline at dusk and “remix” it into a sunrise scene with different music. Once satisfied, you can share your creation to the public Vibes feed, send it to friends via DM, or cross-post it to Instagram/Facebook as a Reel or Story[7]. (In fact, if someone sees a Meta AI-generated video on Instagram, they can tap it and jump into the Meta AI app to remix it themselves[8][9].) In short, Vibes is not only a lean-back feed for consumption, but also a lean-forward playground for AI video creation and collaboration.
This feature matters because it represents one of the first attempts by a major social media company to build a dedicated social feed around AI-generated content. By doing so, Meta is testing user appetite for what some critics have dubbed “infinite AI slop” – endless streams of synthetic media[1][10]. Initial reactions were mixed: many users were skeptical, posting comments like “Nobody wants this” on Zuckerberg’s announcement[10]. Yet, early data shows that Vibes has driven a substantial uptick in app usage, suggesting significant curiosity and engagement with AI videos[11][12]. This tension – between skepticism of “AI content” and actual user behavior – makes Vibes an important experiment in the evolving relationship between AI and social media.
Vibes blends cutting-edge AI generation tools with a familiar recommendation engine akin to other short-video platforms. Understanding the feature involves two technical pieces: (1) how the AI videos are created and (2) how the feed decides which videos to show each user.
All content in the Vibes feed is produced by generative AI models, often based on text-to-image and image-to-video synthesis. At launch, Meta disclosed partnerships with well-known AI content generators – Midjourney (for creating high-quality images) and Black Forest Labs (a group known for their FLUX text-to-video model and other visual AI research) – to power the early versions of Vibes[13]. In practice, a typical Vibes video might start with a user entering a descriptive prompt (e.g. “majestic mountain goats leaping through deep snow in bright daylight”). The system generates a short video clip that matches this prompt; in fact, in the Vibes app each video is displayed alongside the text prompt that created it, providing transparency into its AI origin[14]. Users have reported that many Vibes clips feel like “dreamy vignettes assembled from text prompts, paired with music” – for example, “pastel pets in impossible rooms,” “neon cityscapes swooshing by,” or “photorealistic road trips that evaporate mid-turn.”[15] These visuals reflect the current capabilities (and limitations) of generative video AI: models can produce striking imagery but often lack narrative structure or continuity (a point we’ll revisit)[16][17].
From a toolset perspective, Vibes lets users either generate new videos from scratch or remix existing ones. Generating from scratch involves providing an idea – either in the form of a prompt, a starting image, or even using one of your own photos/videos as a base – and letting Meta’s AI models create a new clip[18]. The “remix” feature is especially novel: you can grab any AI video on the feed and modify it by “adding new visuals, layering music, or changing the style”[18]. For example, you might see a clip of a tranquil forest generated by someone, and you could remix it by instructing the AI to add a waterfall and apply a cartoonish art style, resulting in a new derived video. This remix can then be attributed to you and shared. The idea is to encourage iterative creativity, much like how TikTok allows duets or remixes, but here the remixing is AI-powered. Meta emphasizes that Vibes is a place to “find creative inspiration and experiment with [its] AI media tools”, seamlessly going “from inspiration to creation.”[19][8]
On the backend, Meta is still developing its own proprietary generative AI models for video, so it leans on partners in the interim[13]. Midjourney’s technology likely helps generate high-quality images or keyframes based on user prompts, while Black Forest Labs’ video model might interpolate those into motion or otherwise generate short video sequences from text. (Black Forest Labs has been noted for working on state-of-the-art text-to-video AI, and their FLUX system can indeed “transform text or images into stunning videos”[20].) The Vibes content pipeline may involve combining image generation with animation techniques – for instance, producing a series of AI-generated frames and smoothing them into a video. The exact models in use haven’t been publicly detailed, but commentators observed that the results in Vibes show far fewer glaring flaws (like extra fingers or mangled faces) than last year’s generation of AI videos, indicating Meta curated the content to highlight relatively high-quality outputs[21]. Still, artifacts remain: physics can be off (water or motion may appear unnatural) and coherence between frames sometimes “winks on and off,” reminding us that generative video tech is still maturing[17].
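To make the keyframe-plus-interpolation idea concrete, here is a minimal, purely illustrative sketch in Python: it cross-fades between two “keyframes” (represented as flat lists of pixel intensities) to build a short clip. Real pipelines use learned motion models rather than naive cross-fades, and nothing here reflects Meta’s actual implementation – the function and its representation of frames are assumptions for illustration only.

```python
def interpolate_keyframes(kf_a, kf_b, n_frames):
    """Linearly cross-fade between two keyframes, given as flat lists of
    pixel intensities (0-255). Real video models use learned motion priors,
    not naive blends; this only illustrates turning keyframes into a clip."""
    clip = []
    for i in range(n_frames):
        t = i / (n_frames - 1)  # 0.0 at the first frame, 1.0 at the last
        clip.append([round((1 - t) * a + t * b) for a, b in zip(kf_a, kf_b)])
    return clip

# Toy 4-pixel keyframes: black (0) fading to white (255) over 5 frames
clip = interpolate_keyframes([0, 0, 0, 0], [255, 255, 255, 255], 5)
print(clip[0], clip[2], clip[-1])
# [0, 0, 0, 0] [128, 128, 128, 128] [255, 255, 255, 255]
```

The same blend-over-time structure is what smoothing “a series of AI-generated frames” into a video amounts to at the simplest level; production systems replace the linear blend with learned interpolation networks.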
Meta has put guardrails on what the AI can generate. For example, the Meta AI app does not allow creating videos of real people, celebrities, or public figures, nor can users make videos using each other’s likeness[22][23]. These policies aim to prevent deepfake-style misuse on the platform. Additionally, Meta has pledged to label AI-generated media clearly, and supports industry standards like the C2PA Content Credentials for provenance[24]. In practice, that means Vibes videos are marked as AI-generated (and, as noted, show the originating prompt). This transparency will be increasingly important if/when Vibes content spreads to larger platforms – users should be able to distinguish an authentic video from an AI-generated clip at a glance to maintain trust.
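The labeling idea can be pictured as attaching a provenance record to every generated clip. The snippet below mimics only the shape of such metadata – real C2PA Content Credentials are cryptographically signed manifests with a defined schema, and every field name here is a hypothetical stand-in, not Meta’s format.

```python
import json

def label_ai_video(video_id, prompt, model):
    """Attach a simplified provenance record to a generated clip.
    Loosely inspired by the idea behind C2PA Content Credentials;
    real manifests are cryptographically signed, and this sketch is not."""
    manifest = {
        "video_id": video_id,
        "ai_generated": True,   # surfaced to viewers as an "AI-generated" label
        "prompt": prompt,       # Vibes displays the originating prompt
        "generator": model,
    }
    return json.dumps(manifest)

record = label_ai_video("clip-001", "mountain goats leaping through snow",
                        "partner-model")
print(record)
```

Whatever the concrete format, the point is that the AI-generated flag and the originating prompt travel with the video, so downstream surfaces (Instagram, Facebook) can keep the disclosure visible.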
A core component of Vibes’ technology is the algorithmic feed that decides which AI videos appear for each user. In its current incarnation, Vibes behaves much like any other engagement-optimized content feed: it uses AI-driven recommendations to predict what clips will keep you watching and interacting, then serves those clips in an infinite stream. “As you browse, the feed will become more personalized over time,” Meta says[25], meaning the system is learning from your behavior. If you linger on or “like” videos of, say, playful kittens and vibrant cityscapes, Vibes will show you more of those; if you quickly swipe past sports clips or skip abstract art videos, those will appear less often. The more you consume, the finer-tuned the feed becomes in keeping you scrolling[26]. This design taps into the same dopamine loop as TikTok’s For You page or Instagram’s Reels: swipe, get content you enjoy, which encourages more swiping, in a self-reinforcing cycle[27].
Under the hood, Meta likely employs a large-scale deep learning recommendation model similar to those behind its other feeds (Facebook, Instagram) – one that considers dozens of signals (video topic, visual style, your past likes/views, possibly your demographics or interests) to rank content. One unique twist is that because Vibes’ content is AI-generated, the system isn’t limited to showing you videos that human creators happened to make – in theory, it could generate videos on-demand tailored to your interests. Traditional feeds are constrained by the pool of user-generated content: for example, TikTok’s algorithm can only serve videos that someone, somewhere has created and uploaded. With Vibes, if the algorithm detects you really love a niche (say, “sunsets over ocean cliffs with lo-fi music”), it could potentially just generate endless fresh clips of that very niche. Observers have noted this as a revolutionary shift: “Traditional recommendation algorithms pick stuff they think you’re likely to engage with, but with [Vibes], the content can be tailor-made for you specifically.”[28] In other words, the feed itself could become a generative engine. (At launch, Vibes mostly serves videos created by users or invited AI creators, rather than conjuring wholly new ones without human prompt. But the infrastructure opens the door to algorithmically steering the generative process – a powerful concept that Meta could explore in the future.)
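The engagement loop described above can be sketched in a few lines. This is a toy model with entirely hypothetical signal names: real rankers are deep learning models over dozens of signals, not a single per-topic affinity score nudged by watch time.

```python
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    # Hypothetical per-topic affinity, learned from watch time and likes
    affinity: dict = field(default_factory=dict)

    def update(self, topic, watched_fraction, lr=0.1):
        """Nudge the topic affinity toward how much of the clip was watched."""
        old = self.affinity.get(topic, 0.5)
        self.affinity[topic] = old + lr * (watched_fraction - old)

def rank_feed(candidates, profile):
    """Order candidate clips by predicted engagement (here: topic affinity)."""
    return sorted(candidates,
                  key=lambda c: profile.affinity.get(c["topic"], 0.5),
                  reverse=True)

profile = UserProfile()
profile.update("kittens", 1.0)    # watched a kitten clip to the end
profile.update("abstract", 0.1)   # swiped past an abstract-art clip
feed = rank_feed([{"id": 1, "topic": "abstract"},
                  {"id": 2, "topic": "kittens"}], profile)
print([c["topic"] for c in feed])  # ['kittens', 'abstract']
```

The generative twist mentioned above would replace `rank_feed`’s fixed candidate pool with a call that synthesizes new clips for the highest-affinity topics – the step Vibes has not (yet) taken at launch.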
Even without on-the-fly generation, personalization in Vibes can leverage Meta’s vast data ecosystem. In fact, Meta has announced it will start using data from your interactions with AI across its platforms to fine-tune content recommendations. For example, your conversations with the Meta AI assistant (text or voice chats) may be used as signals to personalize what posts or Reels you see on Facebook and Instagram[29]. If you chat with Meta’s AI about hiking, you might later be shown more outdoor activity content – “hiking groups, posts about local trails, or even ads for hiking boots,” as Meta explains[29]. This policy takes effect in late 2025 and suggests a future where your interests expressed to AI agents feed into your content feeds. For Vibes, this could mean that if you’ve been asking the Meta AI assistant (built into the app) a lot about a certain anime or a style of art, the Vibes algorithm might take that into account and surface AI videos aligned with those interests. Meta assures that sensitive topics (like one’s health, religion, political views expressed in chats) will be excluded from such personalization[30], focusing only on benign interest indicators.
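Meta’s stated carve-out – sensitive topics excluded, benign interests kept – can be illustrated with a trivial filter. The category names and the keyword-style approach are assumptions for illustration; Meta has not published how its exclusion actually works.

```python
# Hypothetical sensitive categories, per Meta's stated policy of excluding
# health, religion, and political views from chat-derived personalization
SENSITIVE = {"health", "religion", "politics"}

def extract_interest_signals(chat_topics):
    """Keep only benign interest indicators for feed personalization."""
    return [t for t in chat_topics if t not in SENSITIVE]

signals = extract_interest_signals(["hiking", "politics", "anime", "health"])
print(signals)  # ['hiking', 'anime']
```

In a real system the classification would be model-based rather than a fixed set, but the contract is the same: chat-derived signals reach the recommender only after sensitive categories are stripped.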
Overall, the Vibes recommendation engine is an extension of Meta’s expertise in engagement-driven feeds, now applied to an AI content catalog. It uses the familiarity of what makes TikTok so sticky – rapid feedback loops and predictive modeling – and adds the dimension of generative AI. The result is what one commentator called an “infinite slop machine” – content scientifically optimized to hold attention, though potentially “hypnotic motion with minimal meaning.”[31][32] In the next section, we’ll see how this is playing out in terms of user engagement numbers.
Since the launch of Vibes on September 25, 2025, Meta AI’s mobile app has experienced a dramatic surge in usage. Independent analysis by market intelligence firm Similarweb showed that the app’s daily active users (DAUs) jumped from roughly 775,000 before Vibes to about 2.7 million by mid-October[11][12] – a ~3.5× increase in less than a month. The growth trajectory is evident in Similarweb’s data: immediately after Vibes rolled out, the DAU curve inflected sharply upward[33]. (See figure below.)

Daily Active Users of the Meta AI app (worldwide, iOS + Android). A sharp rise from under 0.8M to ~2.7M DAUs coincides with the late-Sept 2025 introduction of the Vibes video feed[11][33]. Chart: Similarweb.
Alongside usage, app downloads spiked as well. The Meta AI app went from under 200,000 new installs per day to around 300,000 new downloads each day in October[34][35]. For context, a year prior (October 2024) the app was almost obscure – on the order of only 4,000 downloads per day[36]. Now it’s pulling hundreds of thousands of daily installs, at times even hitting top ranks on app stores. This suggests Vibes not only re-engaged existing users but also attracted many new users curious about AI video generation.
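The growth multiples implied by the Similarweb figures quoted above work out as follows (taking “under 200,000” installs as roughly 200,000 for the comparison):

```python
# Similarweb figures cited above (worldwide, iOS + Android)
dau_before, dau_after = 775_000, 2_700_000          # pre-Vibes vs mid-October
installs_before, installs_after = 200_000, 300_000  # approximate daily installs

print(f"DAU multiple: {dau_after / dau_before:.2f}x")            # ~3.48x
print(f"Install multiple: {installs_after / installs_before:.1f}x")  # 1.5x
```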
Meta hasn’t reported official internal metrics yet, but the timing strongly implicates Vibes as the growth driver. Similarweb noted “no meaningful correlation” with external marketing or search trends that could explain the surge, and no broad Facebook/Instagram traffic spike that week (though if Meta cross-promoted inside its own apps, Similarweb’s models might not fully capture it)[37][38]. Instead, all signs point to the short-form AI video feed as the hook bringing people in. Many users likely saw news of Vibes or posts about it and opened the Meta AI app to try making or watching AI clips.
Interestingly, the surge also coincided with a headline-grabbing moment in the AI world: OpenAI’s release of “Sora,” a rival AI video generator, around the same time. Sora’s app hit the top of the iOS App Store as people clamored to generate videos with OpenAI’s tech[39]. However, Sora was available only by invite, leaving a large swath of interested users locked out. Meta appears to have benefited from this pent-up demand – it offered a comparable AI video tool with no waitlist (anyone with a Meta account could use Vibes for free)[40][23]. As TechCrunch noted, “those who couldn’t try out the OpenAI app may have looked for an alternative… OpenAI’s decision to gatekeep Sora may have directly boosted its rivals.”[41] In other words, some of Sora’s hype spilled over to Meta AI, driving curious creators or users its way.
It’s also worth noting that over the same comparison window in which Meta AI’s DAUs climbed ~15.6%, some other AI apps saw dips: OpenAI’s ChatGPT app’s usage was estimated down ~3.5%, Elon Musk’s Grok (an AI chatbot) down ~7.3%, and AI search app Perplexity down ~2.3%[42][43]. This suggests a shift in user attention towards generative visual apps – people may have diverted some of their “AI time” to playing with video creation instead of text chatbots. Meta AI effectively differentiated itself by offering a fun, visual experience via Vibes at a time when interest in AI content creation is high.
From an engagement standpoint, short-form video is a potent format. Data from Data.ai and others show TikTok leads the industry in time spent per user, with users often watching hundreds of short videos in one sitting[44]. Instagram and YouTube have also reorganized around quick clips (Reels, Shorts) because of this high engagement potential[44]. By introducing Vibes, Meta gave its AI app a similar “heavyweight champion” format of the attention economy[45]. Early anecdotal evidence indicates that Vibes can indeed be “bingeable” – some users report getting quickly hooked on the feed’s hypnotic flow of content[31][27]. If each AI video is only 10–30 seconds long, and the algorithm learns what visuals fascinate you, it’s easy to imagine users losing track of time scrolling this “infinite AI video feed.”
That said, a key question is whether this engagement is sustainable. Novelty is a big factor – the first time you see AI videos of, say, a cat floating through space or a photorealistic ancient Rome tour, it’s mesmerizing. But some have noted that “novelty plateaus” quickly[46]. Unlike human-created TikToks, which often carry humor, stories, or relatable moments, many AI-generated clips lack a narrative backbone or emotional hook[16]. They can feel like random “eye candy” – beautiful or odd for a moment, but without context or payoff. Over time, this could limit how long users stay engaged or how frequently they return. Meta’s challenge will be to keep the content compelling as the bar for AI video entertainment rises.
For now, however, the growth numbers post-Vibes launch are a strong signal that there is both curiosity and appetite for AI-driven video experiences. Meta has effectively jump-started its AI app’s user base and is now in a race to deepen that engagement before the fad fades or competitors catch up.
Vibes enters an arena dominated by TikTok, YouTube Shorts, and Instagram Reels, which have set the standard for algorithmically driven video feeds, so it is worth weighing the new feed against these incumbents in terms of content, user experience, and underlying algorithms.
In summary, Vibes replicates the addictive form factor of TikTok/Shorts (short vertical videos, endless scroll, personalized picks) while innovating in content creation by using generative AI and built-in remixing. The user experience is a trade-off: you lose the human relatability and spontaneity of traditional viral videos, but you gain an abundance of fantastical, “out-of-this-world” content on demand. For AI enthusiasts and creators, that’s exciting. For users seeking authentic connection or information, it may fall short compared to seeing real people’s lives. It’s telling that Meta launched Vibes as a separate feed – perhaps acknowledging that a pure AI video feed appeals to a certain subset of users, while others might ignore it or even react negatively if it intruded into their normal social feed.
With Vibes still in its early days, several questions remain open for Meta’s AI video feed, among them content quality, authenticity, and how deeply it will integrate with Meta’s other platforms.
Meta’s Vibes video feed is a bold foray into AI-driven social content, marrying the proven engagement mechanics of TikTok-like feeds with the emerging capabilities of generative AI. In a short time, Vibes has demonstrated both the allure and the controversy of such a concept – spurring millions to try creating and watching AI videos, even as skeptics deride the content as “AI slop.” Technically, Vibes showcases how recommendation algorithms and generative models can intertwine, potentially ushering in a new era of personalized media that is created for you, not just chosen for you. It also highlights challenges around content quality, authenticity, and user impact that Meta and others will need to navigate.
As of late 2025, Vibes is still an early experiment – but one with significant momentum. Its user growth and engagement spike indicate that Meta has tapped into something intriguing for the AI-curious public. The feature will likely evolve rapidly, with better AI models, more integration across Meta’s platforms, and new rules as both the company and its community learn what works (and what doesn’t). Comparisons to TikTok, YouTube Shorts, and Reels are natural, and indeed Vibes competes with them for eyeballs, yet Vibes is also a different creature altogether: a feed where the creator is as much an algorithm as it is a person.
For an AI-literate audience, Vibes is a case study in applied AI at scale – it’s one part generative AI showcase, one part social media strategy. Whether it remains a quirky side project or becomes a blueprint for future feeds will depend on how users respond in the long run. Will we embrace infinite AI content if it entertains us, or will we crave the irreplaceable authenticity of human-made moments? Meta is essentially posing that question with Vibes. The coming year should provide some answers, as we watch Vibes develop and perhaps inspire competitors to launch their own AI-fueled feeds. In the meantime, if you need more AI videos in your life, Meta’s new Vibes feed will “serve you all you can stomach.”[63]
Sources:
[3] [15] [16] [17] [21] [24] [26] [27] [31] [32] [44] [45] [46] [49] [50] [54] [55] [57] What’s Inside Meta Vibes Infinite AI Video Feed
https://www.theverge.com/news/786499/meta-ai-vibes-feed-discover-videos
https://flux-ai.io/flux-video-ai/
https://www.theverge.com/news/789168/meta-ai-chats-personalized-advertising