Author: Boxu Li

Meta’s Vibes Feed is a recently launched short-form video feed composed entirely of AI-generated videos. Introduced in late September 2025, this feature has quickly drawn attention for its TikTok-style endless scroll of algorithmically personalized clips – except in this case, every video is generated by artificial intelligence[1][2]. In the weeks since launch, Vibes has coincided with a surge in engagement on Meta’s AI app, raising important questions about how AI-driven content feeds work and how they compare to platforms like TikTok, YouTube Shorts, and Instagram Reels. This report provides an overview of Vibes, a technical look at its content generation and recommendation system, its early growth and user engagement, comparisons to other feeds, and insights into its future trajectory.

What Is “Vibes”?

Vibes is an AI video feed integrated into the Meta AI mobile app and website (at meta.ai), designed for discovering, creating, and sharing short AI-generated videos[4]. In essence, it functions similarly to popular short-video feeds on social platforms – a user opens the feed and vertically scrolls through an endless stream of clips. However, unlike TikTok or Reels where videos are created by human users, Vibes clips are generated by artificial intelligence models (often from text or image prompts) and then shared by users or creators via the feed[1]. Mark Zuckerberg unveiled Vibes with examples of its AI-made videos, such as “a group of fuzzy-looking creatures hopping between fuzzy cubes, a cat kneading dough, and an ancient Egyptian woman taking a selfie overlooking Ancient Egypt”[5] – surreal, creative snippets emblematic of AI-generated visuals.

Meta positions Vibes as a way to spark creativity and inspiration. As you browse Vibes, you’ll encounter a range of AI videos contributed by “creators and communities,” according to the company, and the feed will gradually personalize to your tastes over time[2]. If a video catches your eye, Vibes lets you take action: you can generate your own video from scratch or remix an existing video on the feed by tweaking its visuals, style, or music[6]. The interface includes built-in AI editing tools so that with a few taps you might, for example, take an AI clip of a city skyline at dusk and “remix” it into a sunrise scene with different music. Once satisfied, you can share your creation to the public Vibes feed, send it to friends via DM, or cross-post it to Instagram/Facebook as a Reel or Story[7]. (In fact, if someone sees a Meta AI-generated video on Instagram, they can tap it and jump into the Meta AI app to remix it themselves[8][9].) In short, Vibes is not only a lean-back feed for consumption, but also a lean-forward playground for AI video creation and collaboration.

This feature matters because it represents one of the first attempts by a major social media company to build a dedicated social feed around AI-generated content. By doing so, Meta is testing user appetite for what some critics have dubbed “infinite AI slop” – endless streams of synthetic media[1][10]. Initial reactions were mixed: many users were skeptical, posting comments like “Nobody wants this” on Zuckerberg’s announcement[10]. Yet, early data shows that Vibes has driven a substantial uptick in app usage, suggesting significant curiosity and engagement with AI videos[11][12]. This tension – between skepticism of “AI content” and actual user behavior – makes Vibes an important experiment in the evolving relationship between AI and social media.

Under the Hood: How Vibes Works

Vibes blends cutting-edge AI generation tools with a familiar recommendation engine akin to other short-video platforms. Understanding the feature involves two technical pieces: (1) how the AI videos are created and (2) how the feed decides which videos to show each user.

AI-Generated Video Content Pipeline

All content in the Vibes feed is produced by generative AI models, often based on text-to-image and image-to-video synthesis. At launch, Meta disclosed partnerships with well-known AI content generators – Midjourney (for creating high-quality images) and Black Forest Labs (a group known for their FLUX text-to-video model and other visual AI research) – to power the early versions of Vibes[13]. In practice, a typical Vibes video might start with a user entering a descriptive prompt (e.g. “majestic mountain goats leaping through deep snow in bright daylight”). The system generates a short video clip that matches this prompt; in fact, in the Vibes app each video is displayed alongside the text prompt that created it, providing transparency into its AI origin[14]. Users have reported that many Vibes clips feel like “dreamy vignettes assembled from text prompts, paired with music” – for example, “pastel pets in impossible rooms,” “neon cityscapes swooshing by,” or “photorealistic road trips that evaporate mid-turn.”[15] These visuals reflect the current capabilities (and limitations) of generative video AI, which can produce striking imagery but often lacks narrative structure or continuity (a point we’ll revisit)[16][17].

From a toolset perspective, Vibes lets users either generate new videos from scratch or remix existing ones. Generating from scratch involves providing an idea – either in the form of a prompt, a starting image, or even using one of your own photos/videos as a base – and letting Meta’s AI models create a new clip[18]. The “remix” feature is especially novel: you can grab any AI video on the feed and modify it by “adding new visuals, layering music, or changing the style”[18]. For example, you might see a clip of a tranquil forest generated by someone, and you could remix it by instructing the AI to add a waterfall and apply a cartoonish art style, resulting in a new derived video. This remix can then be attributed to you and shared. The idea is to encourage iterative creativity, much like how TikTok allows duets or remixes, but here the remixing is AI-powered. Meta emphasizes that Vibes is a place to “find creative inspiration and experiment with [its] AI media tools”, seamlessly going “from inspiration to creation.”[19][8]
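The remix flow described above can be thought of as prompt transformation: the original clip’s prompt plus the user’s requested edits yield a new generation prompt. The sketch below is purely illustrative – none of the function or parameter names come from Meta’s actual pipeline.

```python
# Hypothetical sketch of a prompt-based "remix": the original clip's prompt
# is combined with the user's requested edits to form a new prompt for the
# video model. All names here are invented for illustration.

def remix_prompt(original_prompt: str, edits: dict) -> str:
    """Build a new prompt from an existing clip's prompt plus user edits."""
    parts = [original_prompt]
    if "add" in edits:                     # new visual elements to insert
        parts.append("with " + ", ".join(edits["add"]))
    if "style" in edits:                   # overall art style to apply
        parts.append(f"in a {edits['style']} style")
    if "music" in edits:                   # audio layer (metadata only here)
        parts.append(f"set to {edits['music']} music")
    return ", ".join(parts)

new_prompt = remix_prompt(
    "a tranquil forest at dawn",
    {"add": ["a waterfall"], "style": "cartoonish"},
)
print(new_prompt)
# "a tranquil forest at dawn, with a waterfall, in a cartoonish style"
```

The derived prompt would then be attributed to the remixing user, mirroring how the feed credits remixed clips.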

On the backend, Meta is still developing its own proprietary generative AI models for video, so it leans on partners in the interim[13]. Midjourney’s technology likely helps generate high-quality images or keyframes based on user prompts, while Black Forest Labs’ video model might interpolate those into motion or otherwise generate short video sequences from text. (Black Forest Labs has been noted for working on state-of-the-art text-to-video AI, and their FLUX system can indeed “transform text or images into stunning videos”[20].) The Vibes content pipeline may involve combining image generation with animation techniques – for instance, producing a series of AI-generated frames and smoothing them into a video. The exact models in use haven’t been publicly detailed, but commentators observed that the results in Vibes show far fewer glaring flaws (like extra fingers or mangled faces) than last year’s generation of AI videos, indicating Meta curated the content to highlight relatively high-quality outputs[21]. Still, artifacts remain: physics can be off (water or motion may appear unnatural) and coherence between frames sometimes “winks on and off,” reminding us that generative video tech is still maturing[17].

Meta has put guardrails on what the AI can generate. For example, the Meta AI app does not allow creating videos of real people, celebrities, or public figures, nor can users make videos using each other’s likeness[22][23]. These policies aim to prevent deepfake-style misuse on the platform. Additionally, Meta has pledged to label AI-generated media clearly, and supports industry standards like the C2PA Content Credentials for provenance[24]. In practice, that means Vibes videos are marked as AI-generated (and, as noted, show the originating prompt). This transparency will be increasingly important if/when Vibes content spreads to larger platforms – users should be able to distinguish an authentic video from an AI-generated clip at a glance to maintain trust.
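The two guardrails just described – refusing prompts that target real people, and labeling every output as AI-generated along with its originating prompt – can be sketched in miniature. This is a toy stand-in, not Meta’s actual policy enforcement; the blocked-term list and metadata fields are invented, and a real system would use learned detectors rather than keyword matching.

```python
# Toy sketch of two guardrails: a (naive) real-person policy check on the
# prompt, and provenance labeling of the output in the spirit of C2PA
# Content Credentials. Term lists and field names are illustrative only.

BLOCKED_TERMS = {"celebrity", "politician"}  # stand-in for a real-person detector

def check_prompt(prompt: str) -> bool:
    """Return True if the prompt passes the (toy) real-person policy."""
    return not any(term in prompt.lower() for term in BLOCKED_TERMS)

def label_output(prompt: str) -> dict:
    """Attach provenance metadata so viewers can see the clip's AI origin."""
    return {"ai_generated": True, "source_prompt": prompt}

assert check_prompt("a cat kneading dough")
assert not check_prompt("a celebrity taking a selfie")
print(label_output("a cat kneading dough"))
```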

Personalization and the Vibes Recommendation Engine

A core component of Vibes’ technology is the algorithmic feed that decides which AI videos appear for each user. In its current incarnation, Vibes behaves much like any other engagement-optimized content feed: it uses AI-driven recommendations to predict what clips will keep you watching and interacting, then serves those clips in an infinite stream. “As you browse, the feed will become more personalized over time,” Meta says[25], meaning the system is learning from your behavior. If you linger on or “like” videos of, say, playful kittens and vibrant cityscapes, Vibes will show you more of those; if you quickly swipe past sports clips or skip abstract art videos, those will appear less often. The more you consume, the finer-tuned the feed becomes in keeping you scrolling[26]. This design taps into the same dopamine loop as TikTok’s For You page or Instagram’s Reels: swipe, get content you enjoy, which encourages more swiping, in a self-reinforcing cycle[27].
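The feedback loop described above can be sketched as a per-topic interest score that each interaction nudges up or down. A production recommender uses learned models over many signals, so this is only a minimal illustration under invented topic labels.

```python
from collections import defaultdict

# Minimal sketch of the engagement feedback loop: watching a clip to the end
# or liking it raises that topic's score; a quick skip lowers it. The next
# batch of clips is then ranked by these scores. Entirely a toy model.

def update_interests(scores, topic, watch_fraction, liked=False):
    """Reward topics watched past the halfway point or liked; penalize skips."""
    delta = watch_fraction - 0.5     # watching >50% of the clip counts as positive
    if liked:
        delta += 1.0
    scores[topic] += delta
    return scores

scores = defaultdict(float)
update_interests(scores, "playful kittens", watch_fraction=1.0, liked=True)  # +1.5
update_interests(scores, "sports clips", watch_fraction=0.1)                 # -0.4
ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked)  # kittens ranked above sports
```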

Under the hood, Meta likely employs a large-scale deep learning recommendation model similar to those behind its other feeds (Facebook, Instagram) – one that considers dozens of signals (video topic, visual style, your past likes/views, possibly your demographics or interests) to rank content. One unique twist is that because Vibes’ content is AI-generated, the system isn’t limited to showing you videos that human creators happened to make – in theory, it could generate videos on-demand tailored to your interests. Traditional feeds are constrained by the pool of user-generated content: for example, TikTok’s algorithm can only serve videos that someone, somewhere has created and uploaded. With Vibes, if the algorithm detects you really love a niche (say, “sunsets over ocean cliffs with lo-fi music”), it could potentially just generate endless fresh clips of that very niche. Observers have noted this as a revolutionary shift: “Traditional recommendation algorithms pick stuff they think you’re likely to engage with, but with [Vibes], the content can be tailor-made for you specifically.”[28] In other words, the feed itself could become a generative engine. (At launch, Vibes mostly serves videos created by users or invited AI creators, rather than conjuring wholly new ones without a human prompt. But the infrastructure opens the door to algorithmically steering the generative process – a powerful concept that Meta could explore in the future.)
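The “feed as generative engine” idea above can be made concrete with a small sketch: rather than only ranking an existing pool, the system could turn a user’s strongest interest into a fresh generation prompt. This is a hypothetical illustration of the concept, not a described Meta feature; the generator itself is stubbed out.

```python
# Hypothetical sketch: map the user's top-scoring interest to a prompt for
# an on-demand video generation step. Names and templates are invented.

def next_clip_prompt(interest_scores: dict, templates: dict) -> str:
    """Pick the user's strongest interest and produce a generation prompt."""
    top = max(interest_scores, key=interest_scores.get)
    return templates.get(top, f"a short cinematic clip about {top}")

scores = {"sunsets over ocean cliffs": 3.2, "neon cityscapes": 1.1}
templates = {"sunsets over ocean cliffs":
             "sunset over ocean cliffs, lo-fi music, slow pan"}
print(next_clip_prompt(scores, templates))
# "sunset over ocean cliffs, lo-fi music, slow pan"
```

In a full loop, the returned prompt would be handed to the video model and the resulting clip inserted into the feed, closing the rank-generate-observe cycle.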

Even without on-the-fly generation, personalization in Vibes can leverage Meta’s vast data ecosystem. In fact, Meta has announced it will start using data from your interactions with AI across its platforms to fine-tune content recommendations. For example, your conversations with the Meta AI assistant (text or voice chats) may be used as signals to personalize what posts or Reels you see on Facebook and Instagram[29]. If you chat with Meta’s AI about hiking, you might later be shown more outdoor activity content – “hiking groups, posts about local trails, or even ads for hiking boots,” as Meta explains[29]. This policy takes effect in late 2025 and suggests a future where your interests expressed to AI agents feed into your content feeds. For Vibes, this could mean that if you’ve been asking the Meta AI assistant (built into the app) a lot about a certain anime or a style of art, the Vibes algorithm might take that into account and surface AI videos aligned with those interests. Meta assures that sensitive topics (like one’s health, religion, political views expressed in chats) will be excluded from such personalization[30], focusing only on benign interest indicators.

Overall, the Vibes recommendation engine is an extension of Meta’s expertise in engagement-driven feeds, now applied to an AI content catalog. It uses the familiarity of what makes TikTok so sticky – rapid feedback loops and predictive modeling – and adds the dimension of generative AI. The result is what one commentator called an “infinite slop machine” – content scientifically optimized to hold attention, though potentially “hypnotic motion with minimal meaning.”[31][32] In the next section, we’ll see how this is playing out in terms of user engagement numbers.

Early Growth and User Engagement

Since the launch of Vibes on September 25, 2025, Meta AI’s mobile app has experienced a dramatic surge in usage. Independent analysis by market intelligence firm Similarweb showed that the app’s daily active users (DAUs) jumped from roughly 775,000 before Vibes to about 2.7 million by mid-October[11][12] – a ~3.5× increase in less than a month. The growth trajectory is evident in Similarweb’s data: immediately after Vibes rolled out, the DAU curve inflected sharply upward[33]. (See figure below.)

Daily Active Users of the Meta AI app (worldwide, iOS + Android). A sharp rise from under 0.8M to ~2.7M DAUs coincides with the late-Sept 2025 introduction of the Vibes video feed[11][33]. Chart: Similarweb.

Alongside usage, app downloads spiked as well. The Meta AI app went from under 200,000 new installs per day to around 300,000 new downloads each day in October[34][35]. For context, a year prior (October 2024) the app was almost obscure – on the order of only 4,000 downloads per day[36]. Now it’s pulling hundreds of thousands of daily installs, at times even hitting top ranks on app stores. This suggests Vibes not only re-engaged existing users but also attracted many new users curious about AI video generation.

Meta hasn’t reported official internal metrics yet, but the timing strongly implicates Vibes as the growth driver. Similarweb noted “no meaningful correlation” with external marketing or search trends that could explain the surge, and no broad Facebook/Instagram traffic spike that week (though if Meta cross-promoted inside its own apps, Similarweb’s models might not fully capture it)[37][38]. Instead, all signs point to the short-form AI video feed as the hook bringing people in. Many users likely saw news of Vibes or posts about it and opened the Meta AI app to try making or watching AI clips.

Interestingly, the surge also coincided with a headline-grabbing moment in the AI world: OpenAI’s release of “Sora,” a rival AI video generator, around the same time. Sora’s app hit the top of the iOS App Store as people clamored to generate videos with OpenAI’s tech[39]. However, Sora was available only by invite, leaving a large swath of interested users locked out. Meta appears to have benefited from this pent-up demand – it offered a comparable AI video tool with no waitlist (anyone with a Meta account could use Vibes for free)[40][23]. As TechCrunch noted, “those who couldn’t try out the OpenAI app may have looked for an alternative… OpenAI’s decision to gatekeep Sora may have directly boosted its rivals.”[41] In other words, some of Sora’s hype spilled over to Meta AI, driving curious creators or users its way.

It’s also worth noting that during the period where Meta AI’s app shot up ~15.6% in DAU, some other AI apps saw dips: OpenAI’s ChatGPT app’s usage was estimated down ~3.5%, Elon Musk’s Grok (an AI chatbot) down ~7.3%, and AI search app Perplexity down ~2.3%[42][43]. This suggests a shift in user attention towards generative visual apps – people might have diverted some of their “AI time” to playing with video creation instead of just text chatbots. Meta AI effectively differentiated itself by offering a fun, visual experience via Vibes at a time when interest in AI content creation is high.

From an engagement standpoint, short-form video is a potent format. Data from Data.ai and others show TikTok leads the industry in time spent per user, with users often watching hundreds of short videos in one sitting[44]. Instagram and YouTube have also reorganized around quick clips (Reels, Shorts) because of this high engagement potential[44]. By introducing Vibes, Meta gave its AI app a similar “heavyweight champion” format of the attention economy[45]. Early anecdotal evidence indicates that Vibes can indeed be “bingeable” – some users report getting quickly hooked on the feed’s hypnotic flow of content[31][27]. If each AI video is only 10–30 seconds long, and the algorithm learns what visuals fascinate you, it’s easy to imagine users losing track of time scrolling this “infinite AI video feed.”

That said, a key question is whether this engagement is sustainable. Novelty is a big factor – the first time you see AI videos of, say, a cat floating through space or a photorealistic ancient Rome tour, it’s mesmerizing. But some have noted that “novelty plateaus” quickly[46]. Unlike human-created TikToks, which often carry humor, stories, or relatable moments, many AI-generated clips lack a narrative backbone or emotional hook[16]. They can feel like random “eye candy” – beautiful or odd for a moment, but without context or payoff. Over time, this could limit how long users stay engaged or how frequently they return. Meta’s challenge will be to keep the content compelling as the bar for AI video entertainment rises.

For now, however, the growth numbers post-Vibes launch are a strong signal that there is both curiosity and appetite for AI-driven video experiences. Meta has effectively jump-started its AI app’s user base and is now in a race to deepen that engagement before the fad fades or competitors catch up.

How Vibes Compares to TikTok, Shorts, and Reels

Vibes enters an arena dominated by platforms like TikTok, YouTube Shorts, and Instagram Reels, which have set the standard for algorithmically driven video feeds. It’s useful to compare Vibes to these incumbents in terms of content, user experience, and underlying algorithms:

  • Content Origin: The most obvious difference is that Vibes is 100% AI-generated content, whereas TikTok, Shorts, and Reels predominantly feature human-generated videos (with occasional filters or effects). On TikTok or Reels, you’re watching real people dancing, doing comedy skits, cooking recipes, or reporting news. In Vibes, you’re watching AI-synthesized scenes – often fantastical or impossible moments (a cat piloting a spaceship, a fictional character come to life, etc.). This gives Vibes a very different vibe (pun intended): it feels more like watching an animation or abstract art channel than peering into real people’s lives. Some users enjoy the surreal creativity; others find it “bleak” or disjointed compared to genuine human stories[47][48].
  • Narrative and Meaning: Relatedly, user experience on Vibes vs TikTok can diverge in how meaningful the content feels. TikTok’s most viral videos usually have a human story or clever idea (a joke, a challenge, a narrative payoff in a short skit). Vibes videos, as current AI tech stands, are more about visual spectacle than storytelling – many clips are “moments with nothing to tether them into memories”, as one reviewer put it[49][50]. This can make the Vibes feed feel like a series of visual experiments or music video snippets, rather than a social feed of relatable content. The user experience could be described as cool but impersonal. By contrast, scrolling TikTok or Reels often feels like tapping into culture and people’s real experiences, which can be more emotionally engaging. YouTube Shorts content runs the gamut (from repurposed TikToks to creator vlogs), but again, most Shorts have a human behind them, inherently providing context or personality which Vibes content lacks by design.
  • Personalization Algorithms: All these feeds use powerful AI-driven recommendation algorithms, but the inputs to those algorithms differ. TikTok’s famed For You algorithm learns from an enormous user base and content base – every like, share, re-watch, or follow helps it cluster users and videos into niche interest groups. Vibes, starting out, had a smaller pool of both users and content (initially seeded by invited AI creators and early adopters). It likely relies on similar signals – did you watch a video till the end? Did you hit the heart or remix button? – to gauge interest. One interesting advantage for Vibes is that the algorithm might utilize the actual prompt metadata of videos. For example, if a video’s prompt contains “sunset” and “beach” and you engage strongly, the system knows those keywords appealed to you, and can find (or generate) other videos with similar descriptors. TikTok’s algorithm has to infer what’s in a video via computer vision or hashtags, whereas Vibes knows the intended content of each clip from the prompt that generated it. This could make topic-based recommendations even more precise in Vibes. On the flip side, TikTok benefits from social signals and trends – e.g., if a million people are loving a dance trend, TikTok can surface that to you even if you haven’t shown interest in its topic, knowing it’s broadly engaging. Vibes, at least initially, doesn’t have the same scale of viral trends (few of its AI videos will spontaneously get millions of shares on their own). Its feed is more individually tailored and less driven by mass popularity – a double-edged sword for user experience.
  • Social Features and Community: TikTok and Reels are inherently social – you can follow creators, see content from people you follow intermingled with algorithmic suggestions, and creators build audiences. Vibes is in a Meta AI app that most people hadn’t used socially before; it’s essentially a new network centered on AI content. You can follow friends or creators on Vibes (the app allows sharing to “friends and followers”[51]), but it’s not yet clear how robust the social graph is there. Instead, Vibes feels more akin to a content discovery platform than a social network. Instagram Reels integrates with your existing Instagram connections, so sometimes you see what your friends liked or posted; Vibes, being separate, doesn’t have a built-in giant social graph (although it may import your Facebook/IG account as an identity). However, Meta made a smart move by allowing cross-posting between Vibes and its flagship platforms. A creator can make an AI video in Vibes and then share it on Instagram or Facebook easily[7]. This means Vibes could feed content into the larger social ecosystem of Meta. Already, if you encounter an AI Reel on Instagram (tagged as created by Meta AI), you know it came from Vibes – and Instagram even lets you click through to remix it in Vibes[8]. In effect, Vibes might become an AI content creation hub feeding many channels, whereas TikTok is a closed loop (content largely stays on TikTok, unless re-uploaded elsewhere manually).
  • Remix Culture: TikTok has features like “Stitch” and “Duet” that let users build on each other’s videos, but these still rely on users filming or editing something themselves. Vibes takes remixing to another level by letting the AI do the heavy lifting. The user just specifies changes. This lowers the barrier to creative participation – you don’t need to dance or film your face; you can simply imagine something and have the AI render it. It’s a different kind of creative empowerment. One could argue Vibes is to video creation what Instagram filters were to photography – a way to make average users feel like creators by providing powerful tools. This could cultivate a new breed of “AI creators” who may not be charismatic influencers on camera, but are great at concocting cool AI scenes. TikTok’s algorithm famously turned ordinary people into viral stars; Vibes might turn ordinary people into AI filmmakers, if the content resonates.
  • Quality and “Sloppiness”: A critical comparison being made is around content quality and the risk of “slop”. TikTok and Instagram have been wrestling with a rise of low-effort, clickbait videos – some of it even algorithmically generated or mass-produced – and are trying to promote more original, high-quality content. In fact, earlier in 2025 Meta stated it wanted to curb “unoriginal” content and encourage “authentic storytelling” on Facebook[52]. This is partly why observers found Vibes puzzling – it could flood Meta’s ecosystem with more algorithmically generated, low-substance content at odds with those goals[53]. YouTube is similarly concerned about AI-generated spam and recently looked into cracking down on “AI slop” on its platform[52]. Vibes in its current form inherently produces what some call “high gloss, low stakes” videos[54] – they’re visually polished (thanks to powerful models) but often don’t say much. In the attention economy, this kind of content can do well because it’s eye-catching and doesn’t require mental investment from the viewer. All the major platforms’ algorithms, if left unchecked, tend to favor such easily consumable content to maximize watch time[54]. The question is whether Vibes doubles down on that trend or eventually finds a way to encourage more depth (for instance, longer AI videos with plots, or multi-scene AI stories). YouTube’s approach, for example, is to experiment with AI-assisted content (like Dream Screen, which auto-generates video backgrounds for Shorts) but still have a human creator framing it with some narrative[55]. Vibes for now is at the extreme end – all AI, all the time. It’s a bold move that sets it apart, but also means it must contend with the novelty and monotony problems unique to fully AI media.
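The prompt-metadata advantage noted in the comparison above can be sketched as simple token overlap: because each Vibes clip carries the text prompt that generated it, candidate clips can be scored against words from prompts the user previously engaged with, with no computer vision required. A toy illustration with invented data; real systems would use learned text embeddings rather than raw word overlap.

```python
# Toy sketch of prompt-metadata matching: score a candidate clip by how
# many words its generating prompt shares with prompts the user liked.

def prompt_affinity(candidate_prompt: str, liked_prompts: list) -> int:
    """Count words the candidate prompt shares with previously liked prompts."""
    liked_words = set()
    for p in liked_prompts:
        liked_words.update(p.lower().split())
    return len(set(candidate_prompt.lower().split()) & liked_words)

liked = ["sunset over a quiet beach", "golden beach at dusk"]
candidates = ["beach bonfire at sunset", "snowy mountain pass"]
ranked = sorted(candidates, key=lambda c: prompt_affinity(c, liked), reverse=True)
print(ranked[0])  # "beach bonfire at sunset"
```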

In summary, Vibes replicates the addictive form factor of TikTok/Shorts (short vertical videos, endless scroll, personalized picks) while innovating in content creation by using generative AI and built-in remixing. The user experience is a trade-off: you lose the human relatability and spontaneity of traditional viral videos, but you gain an abundance of fantastical, “out-of-this-world” content on demand. For AI enthusiasts and creators, that’s exciting. For users seeking authentic connection or information, it may fall short compared to seeing real people’s lives. It’s telling that Meta launched Vibes as a separate feed – perhaps acknowledging that a pure AI video feed appeals to a certain subset of users, while others might ignore it or even react negatively if it intruded into their normal social feed.

Future Outlook and Open Questions for Meta

With Vibes still in early days, there are several open questions and potential directions for Meta’s AI video feed:

  • Will AI Video Quality and Variety Improve? Meta has indicated it is working on more powerful generative models and creative tools for Vibes, collaborating with “talented visual artists and creators” to refine the output[56]. In the near future, we can expect Meta’s in-house video AI models (perhaps building on their Make-A-Video research or the Emu text-to-video model family) to be integrated, reducing reliance on third parties like Midjourney. This could bring longer or more coherent videos to the feed. If AI clips can evolve to have multiple scenes, consistent characters, and story arcs (a hard technical challenge), Vibes content might become more engaging and narrative-driven rather than the current one-shot visuals[57]. Meta’s ability to push the state of the art here will directly impact whether users stick around once the initial wow factor fades.
  • Longevity of User Engagement: The big question – was the post-launch spike a one-time novelty rush, or can Meta retain and grow that audience? The retention of users who downloaded the app for Vibes will be a key metric. If many of those 2.7 million DAUs log off after a few weeks, Meta will need to iterate fast. To keep people coming back, Meta might introduce gamification or community features – perhaps challenges for best AI video of the week, or ways to follow top AI creators in-app. It could also integrate Vibes content more into Facebook/Instagram feeds, essentially syndicating the AI videos to where eyeballs already are. However, doing so runs the risk of irritating users who didn’t ask for “AI slop” in their regular feed. Meta will have to gauge sentiment carefully. As it stands, Vibes is a bit of an experimental silo (within the Meta AI app), which gives the company freedom to try things without upsetting the billions on Facebook/IG.
  • Monetization and Creator Incentives: None of the current info suggests monetization in Vibes yet – videos are ad-free and there’s no known payment for AI creators. But if usage remains high, Meta will surely consider ways to monetize the feed. This could include sponsored AI videos (brands generating catchy AI clips to insert, much like ads between Reels) or even selling premium styles/visual effects for creators. Another angle is subscription or paywalls for advanced features; for instance, Meta could allow free users to generate a limited number of videos per day, but heavy creators might pay for unlimited generation or higher-resolution outputs. Also, how will traditional creators react? Some might embrace AI tools (e.g. an influencer could use Vibes to make content faster), while others might feel crowded out by a flood of faceless AI content. Meta might need to create a creator economy around Vibes – possibly recognizing top AI video makers and giving them a following, or even a cut of future revenue if their creations drive engagement. This will determine whether Vibes becomes a thriving ecosystem or just a novelty toy.
  • Integration with AR/VR and the Metaverse: It’s no coincidence that the Meta AI app is also tied to managing Meta’s AI glasses and AR devices[58]. Vibes could be a stepping stone to more immersive AI media in those domains. Imagine wearing Meta’s smart glasses and having AI “vibes” (visuals or little video scenes) overlaying your environment, or using voice to ask your glasses to generate a vibe clip on the fly. In VR (Meta’s Horizon or Quest platform), AI-generated videos or environments could become part of the experience. Meta’s long-term vision likely sees generative AI content populating the metaverse, and a feed like Vibes helps train both the AI models and the users to accept AI visuals as part of their social content diet. Meta’s recent restructuring into groups focusing on foundation models and product integration[59] also hints that features like Vibes are just the first of many AI injections into their products.
  • Moderation and Ethical Concerns: With user-generated AI comes the perennial issue of content moderation. Meta will need to prevent harmful or misleading content on Vibes. For instance, while deepfakes of real people are banned, what about AI videos that feel real and could be mistaken for news? As generative video improves, a fake clip of “an explosion in a city” or “celebrity doing X” could spread misinformation if not clearly labeled. Meta has committed to labeling AI content[24], and likely employs automatic detection of banned subjects (e.g. pornographic or violent imagery) in the generation process. Still, this is an ongoing area of concern. Regulators are watching AI content; in some jurisdictions, rules may emerge that affect features like Vibes (for example, requiring age restrictions or watermarking on AI-generated media). Meta will have to navigate these to scale Vibes globally. Notably, Meta is excluding the EU, UK, and a few regions from some AI features initially due to stricter data and AI regulations[60] – a sign that rollout of things like Vibes might be slower or adjusted outside the US until compliance is sorted.
  • User Autonomy and “Addiction”: The combination of AI-generated infinite content and personalization raises philosophical questions: Could a feed like Vibes become too engaging? Some have half-jokingly called it “the fentanyl to social media’s crystal meth”, suggesting that an AI that can generate exactly what stimulates you might be dangerously addictive[61][62]. Of course, this is speculative and tongue-in-cheek, but it reflects a real concern in the AI community: when algorithms know you even better and can create (not just curate) content for you, it could amplify the attention-trapping nature of social media. Meta will likely face scrutiny if Vibes usage skyrockets at the cost of user well-being. Already, all social platforms deal with concerns about excessive screen time. An AI content feed might require new forms of digital wellness safeguards – perhaps built-in reminders to take a break, or tools to tune the algorithm (like “I want more educational content and less mindless content”). It’s uncharted territory how to give users control over an AI feed that literally makes what they want to see.
  • Impact on the Broader Content Ecosystem: Finally, it’s worth pondering how an AI-first feed coexists with human content feeds. If Vibes remains a separate domain, it may attract those who specifically seek out AI art and video. But if it starts bleeding into mainstream feeds (e.g., if Facebook one day adds an “AI recommends” section showing Vibes-style clips), it could change the overall content mix people consume. There is potential for AI content fatigue or backlash: some users may tire of not knowing whether what they see online is real or AI-generated, while others may embrace the creativity and entertainment value regardless of origin. Meta’s own stance contains a contradiction – the company previously pushed for authenticity and downranked low-effort viral memes, yet it has now built a system that can algorithmically pump out such content at scale[53]. The coming months will reveal whether Meta can resolve this tension. Perhaps it sees Vibes as supplemental – a sandbox for fun – while keeping the Facebook and Instagram feeds grounded in reality. Or, if Vibes proves wildly successful, it could herald a future in which a significant chunk of our social media diet is machine-generated.

Conclusion

Meta’s Vibes video feed is a bold foray into AI-driven social content, marrying the proven engagement mechanics of TikTok-like feeds with the emerging capabilities of generative AI. In a short time, Vibes has demonstrated both the allure and the controversy of such a concept – spurring millions to try creating and watching AI videos, even as skeptics deride the content as “AI slop.” Technically, Vibes showcases how recommendation algorithms and generative models can intertwine, potentially ushering in a new era of personalized media that is created for you, not just chosen for you. It also highlights challenges around content quality, authenticity, and user impact that Meta and others will need to navigate.

As of late 2025, Vibes is still an early experiment – but one with significant momentum. Its user growth and engagement spike indicate that Meta has tapped into something intriguing for the AI-curious public. The feature will likely evolve rapidly, with better AI models, deeper integration across Meta’s platforms, and new rules as both the company and its community learn what works (and what doesn’t). Comparisons to TikTok, YouTube Shorts, and Reels are natural, and Vibes does compete with them for eyeballs, yet it is also a different creature altogether: a feed where the creator is as much an algorithm as a person.

For an AI-literate audience, Vibes is a case study in applied AI at scale – it’s one part generative AI showcase, one part social media strategy. Whether it remains a quirky side project or becomes a blueprint for future feeds will depend on how users respond in the long run. Will we embrace infinite AI content if it entertains us, or will we crave the irreplaceable authenticity of human-made moments? Meta is essentially posing that question with Vibes. The coming year should provide some answers, as we watch Vibes develop and perhaps inspire competitors to launch their own AI-fueled feeds. In the meantime, if you need more AI videos in your life, Meta’s new Vibes feed will “serve you all you can stomach.”[63]

Sources:

  • Meta Newsroom – “Introducing Vibes: A New Way to Discover and Create AI Videos” (Sept 25, 2025) [18][19]
  • TechCrunch – “Meta launches ‘Vibes,’ a short-form video feed of AI slop” by A. Malik (Sept 25, 2025) [1][2][5][6][7][10][13][52][53][59] – https://techcrunch.com/2025/09/25/meta-launches-vibes-a-short-form-video-feed-of-ai-slop/
  • TechCrunch – “Meta AI’s app downloads and daily users spiked after launch of ‘Vibes’...” by S. Perez (Oct 20, 2025) [11][33] – https://techcrunch.com/2025/10/20/meta-ais-app-downloads-and-daily-users-spiked-after-launch-of-vibes-ai-video-feed/
  • Gadgets360 – “Meta AI App Popularity Gained With Vibes Feature” (Oct 21, 2025) [64][65] – https://www.gadgets360.com/ai/news/meta-ai-app-download-daily-active-users-increase-vibes-feature-ai-videos-9490041/amp
  • The Verge – “Meta AI funnels AI videos from creators into new ‘Vibes’ feed” by E. Welle (Sept 26, 2025) [14][51] – https://www.theverge.com/news/786499/meta-ai-vibes-feed-discover-videos
  • FindArticles (Bill Thompson) – “What’s Inside Meta Vibes Infinite AI Video Feed” (Sept 27, 2025) [3][15][16][17][21][24][26][27][31][32][44][45][46][49][50][54][55][57]
  • Reddit (r/singularity) – Discussion on Meta Vibes and AI feeds (Oct 2025) [28] – https://www.reddit.com/r/singularity/comments/1nqqlmf/meta_introduces_vibe_an_ai_generated_video_feed/
  • The Verge – “Meta will soon use your AI chats to personalize your feeds” by E. Roth (Oct 1, 2025) [29][30][60] – https://www.theverge.com/news/789168/meta-ai-chats-personalized-advertising
  • Flux AI – “Free Flux AI Video Generator – image to video AI” [20] – https://flux-ai.io/flux-video-ai/

Boxu earned his Bachelor's degree at Emory University, majoring in Quantitative Economics. Before joining Macaron, he spent most of his career in the private equity and venture capital space in the US. He is now Chief of Staff and VP of Marketing at Macaron AI, handling finance, logistics, and operations, and overseeing marketing.
