9 Takeaways from the Vision Pro After 6 Months
Apple’s biggest bet in 17 years launched on February 2. Before giving any detailed thoughts on the device, I wanted to spend a few months using it and seeing how other users and developers fared (or didn’t). Here are my “nine thoughts after six months” on Apple’s own expectations and spend, what the Vision Pro has and hasn’t achieved, fair and unfair evaluations, and more.
#1: Apple Did Not (and Does Not) Want to Manage Expectations for the Vision Pro
The Vision Pro is arguably the highest-profile and most important device debuted by Apple since the iPhone in January 2007. The company spent more time (eight years versus the iPhone’s three) and money (see point #2) developing the device than any other in its history. The Vision Pro is clearly the most ambitious of its product launches since the iPhone, the first to be wholly developed under the purview of CEO Tim Cook (though various head-mounted display prototypes were underway as early as 2006), and reporting suggests that its viability was controversial internally (with some employees arguing that Head-Mounted Displays (“HMDs”) impart harm by isolating their wearers from other people and, ultimately, the world around them).
Despite this pressure, Apple did not seem worried about raising expectations for their first HMD. When Cook unveiled the Vision Pro at WWDC 2023, he declared it was a “revolutionary product” that would sit among the few that “shift the way [civilization] looks at technology and the role it plays in our lives.” Mike Rockwell, the Apple VP who led Vision Pro’s development, declared it to be “the most advanced personal electronics device ever” (a description used extensively across Apple’s marketing of the Vision Pro). And for a company that spent its first thirty-one years named “Apple Computer Inc.,” there was something particularly significant about Cook’s declaration that with the Vision Pro, “the era of spatial computing had arrived.” This last line was also repeated extensively across Apple’s website, social accounts, and ad campaigns.
The week before the Vision Pro’s February 2024 release in the United States, Cook described the device as “mind-blowing” and commented that he had known “for years that Apple would get here. I didn’t know when, but I knew that we would arrive here.” Meanwhile, Greg Joswiak, Apple’s SVP of Worldwide Marketing, said that the Vision Pro “feels like we reached into the future and grabbed this product.” In July 2024, as Apple prepared to expand the Vision Pro’s availability to more countries globally, Cook boasted, “Spatial computing is a big idea, you know, just like the Mac ushered in personal computing and the iPhone really propelled mobile computing,” adding, “I use it in all aspects of my daily life now” and “Vision Pro will introduce everybody to spatial computing.”
Apple’s language is relevant not just because the company is an extraordinary marketer but also because it has consistently proven that it knows what (nearly) everyone in the world needs but doesn’t even know they might want. And rather than release the Vision Pro as a “dev kit” (a strategy some expert outsiders have argued Apple should pursue) or position this first HMD as a high-end luxury device for high-octane designers and media professionals, Apple has not just argued that it was “for everybody” but has also priced the device quite reasonably, given the devices it could replace. In the very sentence before Apple announced the Vision Pro’s price at WWDC23, Rockwell explained—rationalized—that “If you purchased a new state-of-the-art TV, surround sound system, powerful computer with multiple high-definition displays, high-end camera, and more, you still would not have come close to what Vision Pro delivers.” Given this, we have to evaluate the Vision Pro with the fullest of expectations. And to that end…
#2: The Vision Pro Probably Cost Tens of Billions to Develop
So what did it take to release a product that felt “grabbed from the future”? Even before the Vision Pro had debuted, Apple had begun to publicly explain the answer (and, in doing so, raise expectations).
In a 2023 cover story in GQ, Cook addressed rumors about Apple’s not-yet-announced HMD and the fact that, thus far, neither Google Glass nor Meta’s Quest seemed to have made (as GQ put it) a “dent” in the marketplace. “Pretty much everything we’ve ever done, there were loads of skeptics with it. If you do something that’s on the edge, it will always have skeptics.” In the same story, Cook added that before entering a market, Apple first asks, “Can we make a significant contribution, in some kind of way, something that other people are not doing? Can we own the primary technology? I’m not interested in putting together pieces of somebody else’s stuff. Because we want to control the primary technology. Because we know that’s how you innovate.”
We don’t know exactly how much it cost Apple to build an HMD that met Cook’s requirements, as the company never discloses (and argues it does not plausibly have) direct P&Ls for its products. Investments in Apple’s silicon technology, displays, production capabilities, and more are widely repurposed across its device lines. But when he unveiled the Vision Pro in June 2023, Cook said the company filed for more than five thousand patents during its development.
Reports suggest that development on the Vision Pro began in late 2015, and from that time until WWDC 2023, Apple filed for over twenty thousand worldwide patents and spent about $130 billion on R&D. If one assigns 25% of this budget (5,000 of 20,000 patents) to the Vision Pro, the product carries a development cost of roughly $33 billion, broadly in line with Meta’s spend after removing device marketing, subsidies, and content investments.
A patent-based allocation may overstate or understate the true figure, though understatement is more likely. XR R&D likely exceeds that on, say, MacBooks or iPads on a dollar-per-patent basis. Apple’s annual R&D spend also grew from $8 billion in 2015 to $30 billion in 2023, and it’s likely that “most” of the intensive R&D for the Vision Pro took place late in that stretch, in which case it might be more accurate to posit a sliding scale of per-year allocations (e.g., 10% of R&D in 2015, but 50% in 2021). But even if one applies a 30% haircut to the above estimate, that’s still roughly $24 billion! As with Meta’s investments, this sum cannot be exclusively attributed to a single device (i.e., the Vision Pro), as it also includes foundational investments for other XR-related devices and subsequent entries in the Vision line, but the cost to make this first device is doubtless immense.
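To make the allocation math explicit, here is a minimal back-of-the-envelope sketch. The dollar, patent, and haircut figures are the rough public estimates discussed above, not Apple disclosures, and the patent-weighted split is simply the illustrative heuristic used in this section.

```python
# Back-of-the-envelope Vision Pro R&D estimate, using the rough public
# figures cited above (not Apple disclosures).

total_rd_2015_2023 = 130e9   # ~$130B of Apple R&D from late 2015 through WWDC 2023
vision_patents = 5_000       # patents Apple says it filed during Vision Pro development
total_patents = 20_000       # Apple's approximate worldwide filings over the same stretch

# Naive patent-weighted allocation: 5,000 / 20,000 = 25% of the R&D budget.
naive_share = vision_patents / total_patents
naive_estimate = total_rd_2015_2023 * naive_share        # ~$32.5B, i.e., "roughly $33 billion"

# Apply the ~30% haircut discussed above (R&D was back-loaded, patents are an
# imperfect proxy, etc.). The result (~$23B) lands in the same ballpark as the
# roughly $24 billion cited in the text.
conservative_estimate = naive_estimate * (1 - 0.30)

print(f"Naive allocation:  ${naive_estimate / 1e9:.1f}B")
print(f"With 30% haircut:  ${conservative_estimate / 1e9:.1f}B")
```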
#3: The Vision Pro Is Not Really From The Future
Though the cost to develop the Vision Pro may have been comparable to Meta’s own investments in XR, the device’s unveiling led many to believe that Apple had instantly vaporized Meta’s hardware ambitions. According to preview testers, the Vision Pro was by far the most impressive HMD they had ever used—not only was it marvelously constructed out of precision-cut aluminum and glass (in contrast to the Quest’s plastic enclosure), but it supported eye and face tracking (not just hand tracking), did not require controllers even for precise inputs (unlike the Quest line), had over three times the resolution per eye and a much wider color range, a third of the passthrough latency, and on and on. Sure, Apple has a history of blowing away earlier entrants to a given category, but Mark Zuckerberg is also one of the most adept business leaders of the twenty-first century, having already overseen Facebook’s best-in-class pivot from one platform to another. How, then, pundits wondered, could Apple have so fully eclipsed Zuckerberg’s decade-long investments in Oculus VR/Reality Labs and a half dozen HMDs (including Meta’s own high-end HMD, 2022’s Quest Pro)?
Three days after WWDC, Zuckerberg sent a missive to his staff that seemed to reject such skepticism, writing, “From what I’ve seen initially, I’d say the good news is that there’s no kind of magical solutions that they have to any of the constraints on laws of physics that our teams haven’t already explored and thought of.” Zuckerberg’s remarks were quickly mocked. And this mockery was renewed after the Vision Pro was released.
A few months later, there is a wider understanding that while Apple has built some brilliant technology (inclusive of software and hardware), much of the device’s relative spectacle stems from the high-end components Apple chose to use and which Meta has thus far opted against. The display, for example, is beautiful—but it’s not made by Apple, nor is it exclusive to Apple. Instead, the display is manufactured by Sony, which appears to be using it for its forthcoming enterprise headset and will probably sell a similar display to OEMs such as Samsung and LG as they produce their own higher-end Horizon OS and Android XR-based headsets.
Sony’s display, according to some reports, costs over $450—more than twice the retail price of the 2020 Quest 2 (which launched at $300 and now retails for $200) and nearly matches 2023’s Quest 3 ($500). The display has broader implications, too. To support such a high-resolution display, the Vision Pro requires a very large and powerful battery. This is a major reason why the Vision Pro has a tethered, external battery that weighs 70% as much as the entire Quest 3 (including its built-in battery). Another reason is heat. The Vision Pro’s battery stores roughly 36 watt-hours—just over twice the capacity of the Quest 3’s—and a pack that large, discharged that quickly, simply gets too hot to be placed on the wearer’s face. This power draw also explains why the Quest 3 and Vision Pro have comparable battery life (2–3 hours) even though the former’s battery weighs roughly 69 grams and the latter’s 350 grams. The Vision Pro also boasts twice as many sensors and cameras as the Quest 3, all of which enable better eye, hand, and environmental tracking (and, with it, the dropping of mandatory controllers), but also increase the device’s cost, weight, battery drain, CPU requirements, and so on.
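The implied power draw follows from simple division. A quick sketch, assuming the ~36 Wh and ~18 Wh capacities implied above and a 2.5-hour midpoint of the advertised runtimes:

```python
# Implied average power draw from the approximate figures above.
# Capacities and runtimes are ballpark public numbers, not official specs.

vision_pro_capacity_wh = 36.0   # ~36 Wh external battery pack
quest_3_capacity_wh = 18.0      # just over half the Vision Pro's capacity

runtime_hours = 2.5             # midpoint of the 2-3 hour range both devices deliver

vision_pro_draw_w = vision_pro_capacity_wh / runtime_hours   # ~14 W average
quest_3_draw_w = quest_3_capacity_wh / runtime_hours         # ~7 W average

print(f"Vision Pro: ~{vision_pro_draw_w:.0f} W average draw")
print(f"Quest 3:    ~{quest_3_draw_w:.0f} W average draw")
```

Roughly double the energy consumed over the same runtime means roughly double the heat to dissipate, which helps explain why the Vision Pro’s pack sits on a cable rather than on the wearer’s head.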
It is wrong to reduce the differences between the two devices to just the bill of materials. In comparing the Quest 2 and 3 to Apple’s Vision Pro, we see a fundamentally different perspective on what the minimum specs of an XR device should be. For that matter, Meta’s Quest Pro seems to have resided in an uncanny valley of being too costly for the average user (it debuted at $1,500 but was subsequently cut to $1,000), yet not powerful enough to convert many high-end customers. It’s also valid to wonder whether customers would be as keen to buy a $3,500-level device from Meta as from Apple—and whether Meta could construct an equivalent device on an equivalent budget. Apple’s silicon is an extraordinary advantage, for example, and the Vision Pro’s 11ms passthrough latency requires a lot more than just “higher-end components.” One must also consider the role of “taste” in building successful hardware/software platforms—and also developer support.
But the BOM matters a lot. In many developed markets, the sales tax on the Vision Pro exceeds the price of the Quest 3 and is over twice the price of the Quest 2. At $499, the two-year AppleCare warranty for the Vision Pro also costs as much as a Quest 3 (and more than twice as much as a Quest 2)—and each repair carries a $300 deductible!
As of now, Zuckerberg does seem right: there is no clear breakthrough “magic” in the Vision Pro—and the category remains as constrained by “science” as ever. The Vision Pro is heavy, uncomfortable, and expensive, and its mixed reality environmental recognition capabilities remain basic (it understands “floor” but not “rug,” “street,” and so forth). Apple may claim the Vision Pro is from the future, but to me, it feels more like a modern-day device that uses components we believe will be affordable “in the future” – at which point the devices of that era are likely to be far more sophisticated, too.
#4: EyeSight is An Expensive Feature — and Not Worth It
One of the Vision Pro’s most distinctive features is EyeSight, an externally facing display that reproduces a (relatively) low-resolution version of the wearer’s eyes in real time. The intent of EyeSight is twofold, with both aspects seeking to reduce the stigma and social isolation that come from wearing a VR device. First, EyeSight enables those in the vicinity of someone using a Vision Pro to know when that user is looking at the real world (i.e., through the external cameras). This “disclosure” seeks to reduce the creepiness that might arise from someone not knowing when they are and are not being watched. Second, the thinking is that EyeSight makes it easier for a Vision Pro wearer to have a “natural” conversation with someone physically near them without requiring the removal of the Vision Pro.
EyeSight was not wholly unique—Meta had even publicly demonstrated a similarly minded prototype in 2021—but culturally, it seemed uniquely Apple. When marketing the Apple Watch, for example, Cook had emphasized the way it reduced digital isolation by keeping users from pulling out their phones and tilting their heads down to them. In time, we may come to consider EyeSight (or similar technologies) essential to the mainstream adoption (and, further, use) of HMDs. Thus far, however, the feature seems like a costly mistake. It adds considerable weight and financial cost to the device (another screen with precision-cut glass), makes it far more fragile (a single, highly curved piece of glass), and is a further burden on already scarce computing and battery power. In exchange, the benefits seem modest-to-none. And given that the Vision Pro seems to be in need of more comfort, a (much) lower price, and potentially an integrated battery, the insistence on EyeSight feels like a mistake. I suspect that the standard (non-Pro) Vision SKU, when it is released, will drop EyeSight.
#5: The Vision Pro is, Today, Mostly a VR device (Even Though Apple Claims Otherwise)
The Vision Pro is best-in-class when it comes to “spatial mapping” of real-world environments. Its passthrough functionality is also best-in-class in latency, precision, and image quality. It was also important to Apple that the device be seen as a “mixed reality” or “spatial computing” device, not a virtual reality one. At the same time, the device is, functionally speaking, a virtual reality device. Nilay Patel, editor-in-chief of The Verge, noted this extensively in his advance review: “Apple doesn’t want anyone to think of the Vision Pro as a VR headset, but it’s a VR headset—albeit a VR headset that almost lets you pretend it’s not a VR headset. . . . It is a VR headset masquerading as an AR headset. . . . And in the entertainment context, where Apple lets the Vision Pro fall back into being the VR headset it fundamentally is, it absolutely shines.”
After seven months, Patel’s observations have been validated—and Apple has yet to meaningfully counter them. Many of the most popular use cases, such as watching a movie, have no requirement for any passthrough functionality (and are arguably improved by turning passthrough off, as this aids viewability).
One of the highest-profile Vision Pro exclusives — Marvel Studios’ What If...? An Immersive Story — is narratively rooted in a mixed reality experience (a multiversal traveler visits the user in their living room before enlisting them in an intergalactic adventure). However, this MR experience is really just a passthrough video with a virtual character standing on the floor – not unlike early Pokémon Go experiences. No part of the user’s living room, or the real world, is engaged, let alone consequential (when the user jumps through the traveler’s portal, the portal comes to them; the user doesn’t move through it). And nearly all of What If’s story and interactivity takes place in pure virtual reality. It is a bit like having Captain America brief you on the roller coaster you’re about to ride at Disneyland, and then debrief you after you get off. Technically the attraction is more than the roller coaster, and it does feel more personal when he talks specifically to you, but it’s still a roller coaster.
There is some utility in using the Vision Pro’s multiple virtual monitors in a passthrough setting (e.g., at your physical-world desk, while still able to see those physically around you), but that’s not really “AR” or “MR”; it’s just a partially rendered view, and there are various benefits to turning off passthrough (more and larger screens are visible, it’s easier to focus, and brightness/viewability improves, among others).
In time, the Vision Pro may become, on a practical basis, more of an AR/MR/spatial device rather than a VR one. To support this, Apple is now providing enterprise applications with access to the Vision Pro’s raw camera feed, which is essential for advanced real-world object recognition and related use cases (e.g., automotive repairs) and is not currently available on Meta’s various HMDs. But at least to date, Apple’s revolutionary spatial computer is mostly a virtual one.
#6: The Benefits of Using a Vision Pro Fall Significantly Short of its Drawbacks
The Vision Pro is a remarkably constructed device that does often feel like magic – not just to those new to VR, but even to those who have tried dozens of headsets. When Apple says it’s the most advanced consumer electronics device ever released, that’s almost assuredly true (though, as Patel noted in his review, “visionOS feels also designed for an eye tracking system that’s just slightly more precise than it actually is”). There are some things the Vision Pro does much better than other devices—at least according to some. I know many Hollywood executives who strongly prefer the device for watching “dailies,” a number of industrial designers in automotive and real estate swear by it, and clearly many consumers love its video-watching functionality.
However, the Vision Pro’s many drawbacks (weight, comfort, price, battery life, exclusionary usage) far outweigh its benefits—at least most of the time and for most people. In most cases, using the Vision Pro isn’t better than using another device (it’s often worse). In the outlier cases, usage is capped by comfort, battery life, location of use, and the need to swap to other devices (e.g., a laptop or PC, smartphone, tablet, etc.) for a given use case. Spatial videos and photos look incredible, as does a live photo blown up to 20 feet wide and 10 feet tall, but you can’t co-experience them with a loved one . . . unless they buy a Vision Pro of their own. The same goes for all apps. The Vision Pro may afford any user a portable TV that’s larger, higher resolution, and richer in color than the one they have at home – but they can’t share what they’re watching.
Overall, it’s unlikely there are any users for whom it might currently (or in the reasonably foreseeable future) replace a laptop, smartphone, or tablet, which in turn means it’s the N+1th device you need to bring with you or store at your desk and swap between. This is a big problem because HMDs face a trilemma: greater “power” (varyingly defined) is needed to expand use cases, but the devices also need to be lighter and more comfortable, and cheaper, too. Each of these variables trades off against the others, and the severity of these constraints seems unlikely to relax anytime soon.
Thus far, Meta’s Quest strategy seems better suited to the present-day constraints of HMDs. Zuckerberg and Meta’s leaders have been fairly candid that HMDs have progressed more slowly than expected and that adoption has been accordingly slower, too. Furthermore, the company has struggled to drive enterprise/productivity adoption of Quest devices (despite these use cases being at the forefront of device marketing in 2021 and 2022). However, the Quest has found modest product-market fit in games thanks to its low price and weight, integrated controllers, and app pioneers such as Rec Room, VRChat, Gorilla Tag, Beat Saber, and more. Sales growth is still fairly modest (especially after accounting for Quest 3s that replace Quest 2s rather than directly expand the Quest’s install base), but the company’s shift to an OEM licensing model may re-accelerate growth. Beginning next year, it’s likely that a fleet of specialized, Meta-powered devices from the likes of Samsung, LG, HTC, and who knows who else will come to market, each one targeting a specific use case. For example, we might see an even lower-cost, lower-weight headset designed just for fitness and meditation, or a higher-powered, gaming-focused device that is still more comfortable and lighter than existing headsets because it skips the many tracking cameras and sensors that most VR games don’t need.
(Interestingly, Apple seems largely uninterested in gaming on the Vision Pro. The company rarely gets excited about gaming apps across any of its devices, but this is particularly noteworthy given it’s the category where the Quest demonstrates the greatest product-market fit – and because the Vision Pro’s aluminum frame and weight make it rather ill-suited to the use case.)
#7: Developer Investment Remains a Problem
When Steve Jobs unveiled the iPhone in January 2007, he pitched it as a single device that converged three others: “a widescreen iPod with touch controls,” “a mobile phone,” and “an Internet communicator” (i.e., a web browser). Though the iPhone was indeed better at each of these than the dedicated devices then on the market, it was only after the launch of the App Store in 2008 that the iPhone became extraordinary. In 2009, the company debuted its “There’s an App for That” campaign, emphasizing that, well, the iPhone was now for everything. And today, the three “devices” that were originally touted are far from the most popular “apps” on an iPhone.
For the Vision Pro to thrive, it needs apps made for it. After six months, Apple says there are 2,500 “spatial” apps for the Vision Pro, with 250 premiering each month. In contrast, there are more than 2,000,000 iOS apps (i.e., only about 0.12% have been natively adapted for the Vision Pro). Many of these “spatial” apps are also rudimentary (Disney+ only qualifies as a spatial app because you can watch its content in a half dozen virtual Disney theaters, rather than Apple’s stock environments), and, more importantly, a breakout app is nowhere in sight.
To be fair, six months is not that long. However, there was an eight-month gap between Apple’s unveiling of the Vision Pro (and the simultaneous launch of its dev kit) and the public release of the device. As such, developers have had fourteen months to create and ship an application. Indeed, it’s likely that the length of this delay (the longest in Apple’s history), as well as Apple’s choice to reveal the device at its Worldwide Developers Conference, was specifically so that it might launch with more and higher-quality apps. And after six months, the iPhone had over 15,000 apps (after a year, it had over 50,000). Yes, the iPhone apps of 2008/2009 were much easier and cheaper to make than Vision Pro apps are in 2024, but the app economy barely existed in 2008, whereas it now drives trillions in revenue annually and employs hundreds of thousands of people.
It’s also notable that many of the most technically and financially adept “Big Media” and “Big Tech” companies, such as Netflix, YouTube, Amazon, and Spotify, have declined to ship a native app on the Vision Pro (Meta has ported its hit VR game Beat Saber to Sony’s PlayStation VR2, but not to the Vision Pro). Of course, these companies have some competitive reasons not to support Apple’s device, but it’s telling that they don’t feel any pressure to do so (they do ship on Apple TV, another rivalrous device), and their absence makes it harder for customers to justify that $3,500 hit to their bank account.
And while Apple’s HMD is by far the most powerful in its category, it remains tiny not just as a computing platform (over 1.2B smartphones and 250MM PCs are sold annually) but also among HMDs (roughly 25MM Quests have been sold since 2020). If Apple sells a half million Vision Pros this year and another million next year, it’ll still be difficult for most developers to justify a native app for the device. And even for VR/XR-specific developers, it’s difficult to dedicate resources to producing an app that takes advantage of the Vision Pro’s unique capabilities, power, and feature set because that means ignoring the 98% of the existing user base on other headsets (i.e., mostly Quests). This dynamic makes it harder for the Vision Pro to really show itself off and earn new customers.
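For a rough sense of the developer’s calculus, here is the same back-of-the-envelope math, using the unit and app figures cited above (which are estimates, not official disclosures):

```python
# Rough platform-share math from the figures cited above.
# Unit counts are public estimates, not official disclosures.

quest_units = 25e6          # ~25MM Quests sold since 2020
vision_pro_units = 0.5e6    # the hypothetical half million Vision Pros sold this year

hmd_base = quest_units + vision_pro_units
print(f"Vision Pro share of the installed HMD base: ~{vision_pro_units / hmd_base:.1%}")  # ~2%

ios_apps = 2_000_000        # ~2MM apps on iOS
vision_native_apps = 2_500  # "spatial" apps Apple cites after six months
print(f"Share of iOS apps adapted for Vision Pro: {vision_native_apps / ios_apps:.3%}")   # ~0.125%
```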
And beyond outside developers, it’s surprising that Apple has not debuted spatial versions of more of its Apple TV+ originals and sports content (and where it has, the spatial integrations are not as rich as one might expect), especially in the case of series such as Monarch: Legacy of Monsters. These are coming, I’m sure, but it’s not clear why they aren’t already here given that Apple, unlike outside developers, has had years to plan, prepare, and test.
#8: Apple has Promptly Reshaped Terminology, Customer Perceptions, and Competitor Plans
The term “spatial computing” dates from the early 1990s and was never intended to describe mixed-reality use cases, but Apple’s adoption of the term has led much of the industry to follow suit (and led many companies to CTRL+F/Replace “Metaverse”). Even Meta now uses the term—when the Quest 3 was unveiled at Meta Connect last September, Meta’s CTO and head of Reality Labs declared it to be “the best-value spatial computing headset on the market for a long time to come,” and a day before the Vision Pro’s release, Meta launched “spatial photos” and “spatial videos.” There has also been a notable shift in Meta’s marketing focus toward mixed-reality use cases such as virtual monitors.
Based on reports, Apple’s decision to launch such a high-end device does seem to have changed the device plans of its competitors. Google, Samsung, LG, and others are all said to be planning devices at the $1,000–1,500 price point, which seemed unthinkable following the failure of Meta’s 2022 Quest Pro (which retailed at $1,500 and was quickly discounted to $1,000 to little effect) and Sony’s PSVR2 (which cost $600 and required a $500 PlayStation), as well as Meta’s market dominance with devices at $200–500. Apparently Meta, too, has now renewed plans for a Quest Pro 2. These moves may be coincidence. Or perhaps Apple gave manufacturers optimism that there was an audience for devices at four-digit prices. It’s also possible that Apple has so significantly reset customer expectations for performance and visuals that the rest of the market (especially those who have yet to enter) no longer believes a $500 SKU is viable.
#9: The Apple Vision Pro Is a Reminder That Even If the Era of HMDs Has Arrived, It’s Also Still Far Away
HMDs were first described in the 1930s and have been in active development since the 1950s, with notable efforts from companies such as Philco and McDonnell Douglas in the 1960s and 1970s, NASA and Nintendo in the 1980s and 1990s, and modern pioneers such as Magic Leap, Microsoft’s HoloLens, Oculus VR, and Google Glass kicking off in the early 2010s. The pace of improvement, cost of R&D, and rate of customer adoption (and usage) have all been consistent disappointments thus far.
The launch of the Vision Pro is an obvious sign the era of HMDs has begun—not because Apple has started it but because it has joined the category. There is no company with a richer history of cracking open a longstanding category than Apple. Not only does Apple typically discover the core form factor for a category (something that has proven particularly tough for HMDs), it is expert at communicating the value proposition of that device (critically important for HMDs, given the various stigmas associated with their use and their less-than-intuitive purpose), and it boasts a 500-store retail network and tens of thousands of employees to help explain the device and teach would-be customers how to use it (immersive HMDs are inherently hard to explain via a commercial or webpage).
If Apple’s decision to launch an HMD in 2024 is the unmistakable harbinger of a new computing era, the device Apple chose to release also suggests that the “mainstream phenomenon” phase remains a ways away. That isn’t to say that Vision Pro sales should have been gangbusters, or even “hot,” after just six months of (limited worldwide) availability. It is clear in the price ($3,500 before tax, custom prescription lenses, AppleCare, and accessories) as well as the sales pathway (initially requiring a hands-on 30-minute demo for in-store pickups, plus a 7–10 day wait for the custom lenses to arrive at your home) that Apple did not expect to ship millions of units in 2024. The limitations to adoption are also evident in Apple’s branding decisions. The Vision Pro is the first time Apple has ever debuted a “Pro” model before offering a standard edition. The first Plus or Pro model iPhone launched fully seven years after the first standard-issue iPhone and a year after Apple introduced a low-end model (the iPhone 5c, later followed by the SE). The iPad Pro came five years after the first iPad and three after the low-cost iPad Mini, while the Apple Watch Ultra came seven years after the debut Apple Watch and two years after the entry-level Apple Watch SE. The AirPods Pro came two years after the AirPods, with the AirPods Max debuting after three. Apple’s inverted approach to the Vision HMD doubtless reflects its belief that, in 2024, releasing a more affordable model would require so many concessions that the device would not be worth selling.
It’s also important to stress that HMDs face an inherently harder path to adoption than mobile phones, the last major computing platform. For the vast majority of people on earth, smartphones were their first-ever computer and first-ever personal gateway to the internet. HMDs are, at best, a person’s second internet-enabled computer, and more likely their third, fourth, or fifth. It is impractical to assume that the N+1th device to provide an individual access to two of mankind’s most powerful creations will see a similar adoption curve to the first. Moreover, HMDs will, for the most part, be more expensive and/or more limited than these other devices. AirPods, which like HMDs are “smart wearables,” benefited from the fact that they replaced another device (tethered “dumb” earphones) that everyone already used; the Apple Watch primarily displaced other watches (and also bolstered the utility of an iPhone); and the iPad grew in part by displacing the PC platform with a device that was cheaper, easier to use, and more portable.
But even when considering all of the various caveats, it’s clear the Vision Pro is not in high demand—even among those the device squarely targets—and that many of those who did purchase it don’t use it often. That’s not good news for anyone (including Meta).
A Summary View of the Future
There are fundamental limits to evaluating any new “thing” after eight months. The iPhone is the most successful product in history, and that trajectory was not apparent after eight or even eighteen months. Apple’s Vision line is backed not just by Apple’s commitment to the category, its prowess, and its reach, but also by billions in R&D that has yet to ship. But Apple is unlikely to release a new Vision device until mid-2026, and a trajectory shift before then seems equally unlikely.
It’s typical to hear a few justifications of the Vision Pro’s weak sales: “only” 1.5MM iPhones shipped in the device’s first year, but 10MM sold in year two, 20MM in year three, and 40MM in year four (and to this point, the 2019 Quest sold roughly a million units in its first year, whereas 2020’s Quest 2 sold 10MM); the first three iPhones had dreadful cameras and no multitasking; the App Store didn’t launch until a year after the first iPhone; the first two iPhones couldn’t reliably make phone calls; even if Apple only sells 500k units, that’s close to $2B in revenue; and so on. Yet these are obviously inadequate defenses. In 2024, nearly 2 billion personal computing devices will be sold, there are over 1.5 billion active iOS users (and 2.2 billion active Apple devices), and trillions are spent through these devices. It will also be far harder to convince those who purchased a $3,500 Vision Pro 1 to upgrade to a newer Vision device than it was to convince those who purchased the early iPhones (for $500 and, later, $100–200 after wireless carrier subsidies) to get a model 3 or 4 iPhone (at which point nearly all were bought for $100–200 with subsidy), and to convince those who skipped the Vision Pro 1 that they should get the Vision Pro 2 or Vision 1. The enormity of Apple’s investments and its publicly set expectations also merits a more critical lens.
The utility of HMDs in certain fields (e.g., healthcare, construction, safety training) and use cases (e.g., learning about the circulatory system) is, I find, obvious to anyone who tries these devices. I detail a number of these instances in my book – and just two weeks ago, the FAA provided its first-ever approval for a VR flight simulator that would give pilots credit towards their ratings and flight eligibilities. Apple’s Encounter Dinosaurs app is also a simple demonstration of how rad the future of HMDs will be, and it’s not too hard to imagine a future Vision device – perhaps at $1,999 thanks to a reduced Apple margin, falling component costs, the dropping of EyeSight, and maybe even a switch to high-quality plastics (versus heavy, precision-cut aluminum), which would also reduce the HMD’s weight.
But the Vision Pro is not just a promise of the future, nor did Apple intend for it to be just one of many HMDs in the category’s history. The Vision Pro is an individual product. And Apple wanted us to evaluate it as one.
Matthew Ball (@ballmatthew)