The new RoomPlan Swift API uses the LiDAR sensor on iPhone and iPad to scan indoor spaces, including furniture layouts.
Introduced at WWDC 2022, RoomPlan is an augmented reality technology that helps collect data about a room and its contents.
On a dedicated Apple Developer page, the company touted the RoomPlan API as useful for creating floor plans. These plans can then be applied directly in real estate and hospitality apps that require detailed knowledge of a building's layout.
The 3D scanning of interior spaces was also promoted as a way to easily plan solutions in architectural and interior design workflows.
Utilizing the LiDAR sensors on select iPhone and iPad models, the display shows "real-time scanning progress, and a dollhouse visualization that shows everything in the room that has been recognized."
Plans can be exported in the widely compatible USD or USDZ file formats for import into popular 3D utilities such as Cinema 4D and AutoCAD, and they include the dimensions of the room as well as those of the recognized furniture.
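As a rough sketch of what driving a scan looks like in code: the class names below (RoomCaptureView, CapturedRoom, and the export call) come from Apple's RoomPlan documentation, but the view-controller wiring is illustrative and assumes a LiDAR-equipped device running the iOS 16 beta.

```swift
import UIKit
import RoomPlan

// Illustrative scan-and-export flow using RoomPlan's RoomCaptureView,
// which renders the live camera feed and the "dollhouse" visualization.
class ScanViewController: UIViewController, RoomCaptureViewDelegate {
    private var roomCaptureView: RoomCaptureView!

    override func viewDidLoad() {
        super.viewDidLoad()
        roomCaptureView = RoomCaptureView(frame: view.bounds)
        roomCaptureView.delegate = self
        view.addSubview(roomCaptureView)
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Start scanning; progress feedback is drawn by the view itself.
        roomCaptureView.captureSession.run(
            configuration: RoomCaptureSession.Configuration())
    }

    // Let RoomPlan post-process the raw scan data when the session ends.
    func captureView(shouldPresent roomDataForProcessing: CapturedRoomData,
                     error: Error?) -> Bool {
        return true
    }

    // Receive the final CapturedRoom and export it as a USDZ file,
    // which carries the room and furniture geometry and dimensions.
    func captureView(didPresent processedResult: CapturedRoom, error: Error?) {
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent("Room.usdz")
        try? processedResult.export(to: url)
    }
}
```

The exported file could then be opened directly in a USDZ-aware tool, or handed to an app's own floor-plan logic.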
Twitter user @jonstephen85 tested RoomPlan on a 12.9-inch iPad Pro. In a series of tweets detailing his experience, he demonstrated that the technology is able to detect mirrors and other loose objects.
However, it seemed to have considerable trouble adapting to recently closed doors, moving between stories, and keeping scans of the walls consistent.
The new RoomPlan Swift API is available in beta to registered developers as of June 8.
Comments
That will be useful. I can just bring the scanned interior to Home Depot for home improvement and furnishing. Good one, Apple.
What technology would be necessary for the Home app to know exactly which room I am in, so it knows which fan to turn on when I say, "Hey Siri, turn on the fan!"?
I am sure a more or less accurate house floor plan stored in the Home app would help. Hopefully, it would be protected like other personal data.