AR and VR were once seen as experimental. Today, they're solving real problems. By 2030, the global market is projected to reach $200B+, with Europe contributing $15.8B by 2029. But the more important shift is where AR/VR is actually starting to work.

We're already seeing tangible impact in healthcare:
→ The World Health Organization reports a 20% reduction in surgical errors through immersive simulation.
→ The NHS in the UK now treats over 10,000 patients annually using VR-based therapies.
→ In France, hospitals using AR-assisted surgeries have achieved 35% faster recovery rates.

Backing these outcomes, the European Investment Bank recorded over €5 billion in public and private investment in AR/VR healthcare applications in 2022 alone.

Countries like Germany, France, and the UK are leading the way through industrial use cases. 5G networks are strengthening the technical foundation. And the integration of AI is making AR and VR systems more intelligent, more responsive, and easier to apply in real-world environments.

The broader trend is clear: AR/VR is no longer a speculative bet. That changes the lens for both founders and investors. Instead of asking "What can we build with AR/VR?", the better question is: "Where does this technology naturally integrate to unlock measurable value?"

We have put together in the map below the AR/VR startups in Europe; if we missed any, let us know in the comments so we can add them in version 2 of this map.

#Venturecapital #AI #Deeptech #Startups

Follow us at APEX Ventures and subscribe to our newsletter for exclusive content on groundbreaking Deep Tech startups: 🔗 https://t2m.io/EV2qHQuo
VR Gaming Platforms
-
"AI" doesn't mean "ChatGPT."

AI in games has been around for a long time, and not in the way most people realize. For years, Ubisoft's animation research has been featured at SIGGRAPH; they use generative AI to drive natural, adaptive animation. That's real AI applied to production problems.

The research has also been applied to AI agents that can play games. In the case of StarCraft II, agents trained with reinforcement learning were able to beat professional players. That AI was actually learning strategy, adapting, and improving beyond human skill.

So where are all the games that use this technology? Where's the AI that learns how you play, or that evolves its strategy as the game goes on? Where's the game with smooth AI animation transitions, instead of the character popping every time the game needs a new animation state?

The technology exists, but we don't put it into games. Instead, the spotlight goes to chatbots that pretend to agree with everything you say. Scammy in the best case, harmful in the worst, as we've seen multiple times in the last few months.

The real future of generative AI in games isn't text autocomplete. It's adaptive AI that learns, evolves, and creates new kinds of gameplay and content we've never experienced before.
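An "AI that learns how you play" doesn't have to start with a giant model. A minimal sketch of the idea (purely illustrative; the class and strategy names are invented, not from any shipped game): an opponent that tracks which of its strategies the player beats and shifts its mix toward whatever the player keeps losing to.

```python
import random

class AdaptiveOpponent:
    """Toy opponent that adapts its strategy mix to the player's results."""

    def __init__(self, strategies, lr=0.1):
        # Start with equal weight on every strategy.
        self.weights = {s: 1.0 for s in strategies}
        self.lr = lr  # how aggressively to adapt

    def pick(self):
        """Sample a strategy proportionally to its current weight."""
        total = sum(self.weights.values())
        r = random.uniform(0, total)
        for s, w in self.weights.items():
            r -= w
            if r <= 0:
                return s
        return s  # fallback for floating-point edge cases

    def update(self, strategy, player_won):
        """Reinforce strategies the player loses against, decay the rest."""
        if player_won:
            self.weights[strategy] = max(0.1, self.weights[strategy] * (1 - self.lr))
        else:
            self.weights[strategy] *= (1 + self.lr)
```

The same loop, run against real match outcomes, is the seed of the adaptive behavior the post is asking for: the opponent's distribution drifts toward counters to the player's habits instead of cycling a fixed script.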
-
Entertainment is moving from watching to experiencing. Would you try this one?

The global AR/VR market is projected to exceed $100B this decade, with double-digit CAGR driven by gaming, live events, and location-based experiences. But the real unlock isn't just headsets like Meta Quest 3 or Apple Vision Pro. It's 6D immersion:
• Motion platforms
• Haptic feedback
• Wind / heat / scent effects
• Spatial audio
• Multiplayer synchronization

Location-based players like Zero Latency are proving consumers will pay premium pricing for full immersion.

Why this matters:
🎟 Higher ticket values
⏱ Longer engagement time
💰 New monetization (virtual goods + immersive ads)
🌍 Global scalability of live experiences

Streaming optimized for convenience. VR + 6D optimizes for intensity. The winners won't produce content. They'll build worlds.

#VirtualReality #ImmersiveTech #FutureOfEntertainment #Innovation #ExperienceEconomy
-
Microsoft just unveiled Muse, an AI model that can generate minutes of cohesive gameplay from a single second of frames and controller actions. The implications for gaming are massive.

Muse is the first World and Human Action Model (WHAM), and the scale of training is mind-blowing:
— Trained on 1B+ gameplay images
— Used 7+ years of continuous gameplay data
— Learned from real Xbox multiplayer matches

From a single second of gameplay plus controller inputs, Muse can create multiple unique, playable sequences that follow actual game physics, mechanics, and rules. The version shown in the research was trained on just a single game (Bleeding Edge).

We're watching the beginning of a major shift in how games are created. Game development traditionally takes months or years of character design, animation, and gameplay testing. Scaled-up models like Muse could eventually compress this cycle from months to minutes.

Microsoft's research is published in Nature today, and the researchers will also release Muse's model weights and training data samples, and share the testing interface. Another win for open source!

Hot take: we'll see the first fully AI-assisted indie game hit the top 10 on Steam within 18 months. Not because AI makes better games, but because it lets creative people iterate 100x faster.
-
Meta leads the global XR (AR/VR headset) market, and AR glasses captured one-fourth of shipments in Q3 2025, per Counterpoint Research.

VR is slowing, while AR + AI smart glasses are expanding the market into a more "daily-wear" form factor, which naturally fragments share. #Meta still led VR but felt a slight slowdown: its shipments dipped 4% YoY and its share fell to 57% (from 71% in Q2). At the same time, AR smart glasses hit their highest-ever quarterly shipments, pushing AR to ~25% of total XR volumes, with #MicroLED rising to 25% of AR shipments.

Meta's strategy is clear with Ray-Ban Meta Display: enter early, scale fast through mainstream eyewear distribution, and make #AI + camera glasses a daily habit while displays mature. Notably, sales began Sept 30, so Q3 sell-in reflects just one day; the real signal will be Q4 demand and how quickly Meta can convert "glasses" into its next platform.

RayNeo's rise is what "right product at the right moment" looks like in #AR + AI glasses. It's less about spec wars and more about shipping volume as the category moves to everyday utility.

Sony remains the steady premium anchor with #PSVR2, because ecosystems beat one-off hardware cycles. When content and platform gravity are strong, you don't need to win units; you win engagement.

XREAL is quietly riding the shift toward display-led smart glasses as the mainstream "personal screen" use case forms. If comfort, optics, and portability keep improving, this lane will expand faster than many expect.

PICO XR is proving that #XR demand from enterprise and location-based entertainment can still move meaningful volume. In a soft market, predictable deployments matter more than hype-driven launches.
-
🚀 The XR Showdown: A Season of High-End Product Reveals Set to Reshape the Industry

From now until October, the XR (Extended Reality) world is entering one of its most intense and exciting seasons yet. 🌐✨ A wave of high-end product presentations from the biggest players in the industry is about to hit the stage, each promising to redefine immersive experiences.

🏢 Major tech companies are gearing up for a strategic showdown, unveiling their latest innovations in Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR). The competition is heating up, not just in hardware performance and design, but also in ecosystem integration, developer support, and business applications. 🔥

📈 This surge of innovation comes at a pivotal moment:
- On the consumer side, XR is edging closer to mainstream adoption thanks to lighter headsets, stunning visuals, and richer interactive content. 🎮👓
- On the enterprise side, XR is no longer just experimental; it's becoming a cornerstone for training, remote collaboration, product design, and customer engagement. 🏭💼

🌟 The convergence of these worlds, consumer entertainment and enterprise productivity, means the next few months could mark a turning point for the industry. By October, the XR landscape may look radically different, with new leaders emerging and the boundaries between digital and physical reality becoming more blurred than ever.

⚡ The message is clear: the XR revolution is not on the horizon. It's happening right now.

#XR #ExtendedReality #VirtualReality #AugmentedReality #MixedReality #TechInnovation #FutureOfWork #DigitalTransformation #ImmersiveTech #EnterpriseXR #ConsumerTech #VR #AR #MR #Innovation #TechTrends #XRRevolution
-
🚀 𝐓𝐡𝐞 𝐀𝐑/𝐕𝐑 𝐖𝐚𝐯𝐞 𝐈𝐬 𝐍𝐨𝐭 𝐂𝐨𝐦𝐢𝐧𝐠. 𝐈𝐭’𝐬 𝐀𝐥𝐫𝐞𝐚𝐝𝐲 𝐇𝐞𝐫𝐞.

We're witnessing a quiet revolution. A few years back, AR and VR were buzzwords tossed around at tech conferences. Today, they're the building blocks of how industries will operate by 2030. Let's break it down 👇

🛍️ 𝐄𝐜𝐨𝐦𝐦𝐞𝐫𝐜𝐞: Virtual try-on is turning "window shopping" into "mirror shopping." Customers can visualize, compare, and personalize products, leading to higher conversions and lower returns.

🏠 𝐑𝐞𝐚𝐥 𝐄𝐬𝐭𝐚𝐭𝐞 & 𝐅𝐮𝐫𝐧𝐢𝐭𝐮𝐫𝐞: 3D walkthroughs and room-scale AR are letting buyers "feel at home" before they sign. No more imagination gaps.

👗 𝐅𝐚𝐬𝐡𝐢𝐨𝐧 & 𝐁𝐞𝐚𝐮𝐭𝐲: AI-powered body and face mapping are merging personalization with immersion, redefining digital fitting rooms.

🏥 𝐇𝐞𝐚𝐥𝐭𝐡𝐜𝐚𝐫𝐞: AR-assisted training and remote diagnosis are reducing error rates and expanding reach.

🎓 𝐄𝐝𝐮𝐜𝐚𝐭𝐢𝐨𝐧: Immersive simulations are replacing static textbooks, helping learners retain more through experience.

⚙️ 𝐌𝐚𝐧𝐮𝐟𝐚𝐜𝐭𝐮𝐫𝐢𝐧𝐠: From remote assembly to AR-based maintenance, the industrial metaverse is making factories smarter.

💡 𝐁𝐲 2030, 𝐀𝐑/𝐕𝐑 𝐰𝐨𝐧’𝐭 𝐛𝐞 𝐚 𝐬𝐞𝐩𝐚𝐫𝐚𝐭𝐞 𝐢𝐧𝐝𝐮𝐬𝐭𝐫𝐲; 𝐢𝐭’𝐥𝐥 𝐛𝐞 𝐭𝐡𝐞 𝐢𝐧𝐟𝐫𝐚𝐬𝐭𝐫𝐮𝐜𝐭𝐮𝐫𝐞 𝐞𝐯𝐞𝐫𝐲 𝐢𝐧𝐝𝐮𝐬𝐭𝐫𝐲 𝐫𝐞𝐥𝐢𝐞𝐬 𝐨𝐧.

The brands that experiment today will be the ones defining the rules tomorrow.

Marvin XR is proud to be part of this shift, helping eCommerce brands lead the immersive future through WebAR and virtual try-on that scales across platforms.

Because innovation doesn't wait until 2030. It starts right now, with the decision to see your business in 3D.

#AugmentedReality #VirtualReality #3DCommerce #FutureOfEcommerce #Innovation #VirtualTryOn #MarvinXR #ARStartups #WebAR #DigitalTransformation
-
Lethal Company sold 10M+ copies without traditional marketing. One viral mechanic did the selling.

For years, gameplay and marketing were separate disciplines. Build the game, then focus on distribution. That model is breaking. Why? Because three things changed at the same time:
→ High-fidelity engines like Unreal Engine 5 are now cheap and accessible.
→ Discovery is algorithmic (Steam, TikTok, Shorts).
→ Players trust raw gameplay clips more than polished trailers.

Distribution no longer comes solely from highly optimised marketing campaigns, but from moment generation. If a game reliably creates high-emotion moments, players will do the marketing for you.

One example of mechanics turning into viral marketing devices is the recent surge in popularity of proximity chat. Directional, proximity-based voice chat, often with audio degradation or "muffling" effects, has become a primary driver of organic virality. Lethal Company's developer, Zeekerss, used a background in Roblox horror to engineer a specific auditory experience where voice chat cuts off immediately upon death or muffles behind walls. This mechanic turns every player interaction into a potential skit, effectively outsourcing trailer production to the player base. Combined with "slapstick" physics, these unscripted moments ensure the game feels "alive" on social feeds long after launch, leading to 10M+ sales.

I also particularly liked Content Warning's approach to gamified distribution, which achieved 6M+ downloads in just 24 hours. After launch, they let players save in-game recordings directly to their actual desktop, allowing players to flood the internet with branded content, with zero editing required.

Here's how you too can design a mechanic that creates its own distribution.

1. Design to Pique Spectator Interest First, Player Mastery Second
Ask one brutal question during mechanic design: "If someone saw this moment on TikTok with no context, would they immediately feel something?" If a mechanic only becomes interesting after 10 minutes of explanation, it won't travel.

2. Force Unscripted Outcomes Inside the Core Loop
Viral moments are rarely "earned"; they're forced by constraints. Look to bake in:
→ Time pressure (extraction timers, collapsing zones)
→ Irreversible consequences (death = silence, lost loot)
Inevitable mistakes are what create clips.

3. Remove One Full Step Between "That Was Funny" and "Posted"
Audit your clip funnel:
→ Can a player export a clip without opening another app?
→ Does the game auto-capture highlights by default?
→ Is the file usable as-is, with no editing?
Every extra step cuts your content volume in half, so work to reduce the friction.

If your game launched tomorrow, what exact mechanic would generate clips without you paying a creator? If you can't name it, you may end up relying solely on paid ads to grow.
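The muffling mechanic described above is, at its core, simple audio math. A minimal sketch (illustrative only, not Lethal Company's actual implementation; the function and parameter names are invented): linear distance falloff, a hard cutoff on death, and a crude per-wall attenuation for the "muffled behind walls" effect.

```python
import math

def voice_gain(listener_pos, speaker_pos, speaker_alive=True,
               walls_between=0, max_range=20.0):
    """Toy proximity-chat volume: 1.0 at zero distance, 0.0 beyond max_range."""
    if not speaker_alive:
        return 0.0  # death = instant silence, the skit-generating rule
    dx = speaker_pos[0] - listener_pos[0]
    dy = speaker_pos[1] - listener_pos[1]
    dist = math.hypot(dx, dy)
    if dist >= max_range:
        return 0.0
    gain = 1.0 - dist / max_range   # linear falloff with distance
    gain *= 0.5 ** walls_between    # each wall halves the volume ("muffle")
    return round(gain, 3)
```

A real engine would drive a low-pass filter for occlusion rather than just scaling gain, but even this toy version produces the dramatic beats the post describes: voices fade as teammates wander off and vanish the instant someone dies.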
-
Virtual simulations from game tech will increasingly be used for real-world applications.

Last week we at Andreessen Horowitz did a round-up of the Big Ideas for 2025. Here was mine:

Traditionally, games have been virtual-world simulations designed for fun. Now gaming technology is extending beyond entertainment to transform how businesses operate. While gaming has long pioneered breakthrough technologies, from Nvidia's graphics to Unreal Engine's real-time 3D rendering, these tools are now solving critical business challenges. Consider Applied Intuition, a company built on Unreal Engine, which creates virtual simulations to train and test autonomous vehicles.

Three forces are accelerating this shift: generative AI is slashing the cost of virtual content creation; advanced 3D capture technologies are digitizing real-world environments (aka digital twins); and next-generation XR devices are making immersive experiences practical for workers.

The applications are already here: Anduril Industries leverages game engines for defense simulations; Tesla creates virtual worlds for autonomous systems; BMW is incorporating AR in future heads-up display systems; Matterport revolutionizes real estate with virtual walkthroughs; Traverse3D helps companies unlock virtual interactive training for their workforce.

Whether it's training autonomous systems in virtual environments, helping consumers shop with 3D visuals, or scaling tomorrow's workforce via simulations, I think game tech will infuse every sector in 2025.
-
Arrowhead Game Studios have leveraged, even co-opted, social media platforms as their own #marketing arm. How they did it, and how you can too.

While typical campaigns in the genre revolve around seasonal roadmaps spearheaded and amplified by the developers themselves across social media, Arrowhead opted for a different strategy:
1. Release patches that address issues and implement quality-of-life changes.
2. Announce those fixes and alterations.
3. Also introduce new content in each patch.
4. Maintain secrecy about the specifics of the new content, or whether it exists at all.

Done over and over again, this in practice achieves the following:
➸ Patches are perceived almost like huge loot boxes: players anticipate some form of new content within each patch, fostering excitement regardless of its significance.
➸ Every update turns into an Easter egg hunt, prompting players to eagerly seek out hidden elements.
➸ That excitement extends to content creators, who strive to be among the first to discover and share the new content, fueling their own creative endeavors.
➸ Once uncovered, social media platforms become inundated with player-shared content, reinforcing the perception of a game that is alive and ever-evolving. This drives churned players back to see the new content for themselves, and draws new players into the game.

More than two months after the launch of Helldivers 2, I have yet to see the comment "dead game" thrown at it, which is a feat in and of itself.