Earlier this year, Apple debuted its Vision Pro XR headset, positioning it as an immersive productivity tool and a technological leap beyond the firm’s Mac laptop and desktop portfolio. Apple is selling the upcoming device – due in 2024 – on a spatial computing vision that lets users work in a 3D computing space, with digital workplace applications displayed on virtual monitors surrounding them.

Recently, Apple introduced 3D spatial recording features on its latest iPhone iteration, allowing users to capture moments as 3D assets using the device’s camera. Users can then view their iPhone-captured 3D spatial recordings via Apple’s Vision Pro MR headset as an AR visualisation.

Following its latest iPhone spatial update, the firm is deepening its offering by debuting object capture and photogrammetry abilities, which could increase the number of workplace use cases for the iPhone in specific sectors. Apple’s iPhone portfolio is already deeply integrated into many businesses thanks to the product’s ubiquity as a worldwide digital communication tool.

Last week, Apple introduced Object Capture as a new feature for the latest iPhone and iPad models that developers can access via the macOS Monterey and Xcode 13 beta.

In a promotional video, an Apple spokesperson said:

Instead of manually creating 3D models, which can take weeks, Object Capture uses photogrammetry to turn a series of 2D images into photo-realistic 3D objects in just minutes. Leading developers like Maxon and Unity are already using Object Capture to unlock entirely new ways of creating 3D content.

Moreover, Apple’s promotional video explained that, to create a 3D asset, a user takes photos of a real-life object from various angles; the images are then converted into a 3D model using software such as Maxon’s Cinema 4D.
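For developers, the photos-to-model workflow described above is exposed through RealityKit’s PhotogrammetrySession API on macOS. The sketch below shows the basic shape of a session – the folder and output paths are hypothetical placeholders, and error handling is kept minimal:

```swift
import Foundation
import RealityKit

// Hypothetical paths: a folder of photos taken from various angles,
// and the USDZ model file to produce.
let inputFolder = URL(fileURLWithPath: "/tmp/ObjectPhotos", isDirectory: true)
let outputFile = URL(fileURLWithPath: "/tmp/CapturedObject.usdz")

// Configure and create the photogrammetry session.
var configuration = PhotogrammetrySession.Configuration()
configuration.featureSensitivity = .normal

let session = try PhotogrammetrySession(input: inputFolder,
                                        configuration: configuration)

// Listen for progress and completion messages as the model is built.
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("Progress: \(Int(fraction * 100))%")
        case .processingComplete:
            print("Model written to \(outputFile.path)")
        case .requestError(_, let error):
            print("Reconstruction failed: \(error)")
        default:
            break
        }
    }
}

// Request a medium-detail USDZ model from the input photos.
try session.process(requests: [
    .modelFile(url: outputFile, detail: .medium)
])
```

The resulting USDZ file can then be previewed in AR directly on compatible Apple devices, which is the hand-off point the article describes for tools like Cinema 4D and Unity.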

Following the conversion, developers can view the scanned object as an AR visualisation via a compatible device. Apple notes how Wayfair uses Object Capture to develop innovative digital tools at its manufacturing plants. Employees use iPhones and Macs to create AR assets that customers can place in their homes before purchasing a piece of furniture.

The spokesperson added:

This is a massive step forward for 3D content creation. What used to be the most difficult and expensive part of building AR experiences and 3D scenes is now available for all developers in macOS Monterey.

Object Capture for Enterprise

The feature’s debut means a new range of enterprise end users can leverage Apple’s growing product portfolio for new use cases. Even if a user or their workplace does not own a Vision Pro headset, they can still view the 3D assets on various devices thanks to the integrated OpenUSD file format – created by Pixar – which promotes interoperability.

Firms such as Arvizio already allow clients working in construction and repair to leverage photogrammetry assets to quickly send up-to-date information to co-workers, saving capital and time while driving sustainability.

Moreover, product design teams leverage tools from a firm like Campfire to collaborate over scanned AR assets, massively streamlining product review sessions.

With Apple entering the photogrammetry market and extending its enterprise roadmap, many immersive technology solutions providers may face a new form of competition.

Apple’s existing hardware presence in enterprise workflows provides a pre-existing user base ready to adopt Object Capture, potentially driving photogrammetry adoption in a market already familiar with 3D file types.

However, Apple is not the only player in the enterprise photogrammetry market. Giants like Microsoft, NVIDIA, and Epic Games are all individually working on photogrammetry capture tools to suit use cases ranging from entertainment application development to industrial maintenance. Once again, though, Apple’s device ubiquity will be a significant factor in securing the firm’s market share.