Explore spatial accessory input in visionOS
Learn how to integrate spatial accessories into your app. Enhance your virtual experiences on visionOS by displaying virtual content, interacting with your app, tracking accessories in space, and retrieving information about interactions.
Chapters
- 0:00 - Start
- 2:41 - Build a sculpting app
- 13:37 - Track accessories with ARKit
- 14:45 - Design considerations
Resources
Related Videos
WWDC25
WWDC24
WWDC21
WWDC20
-
Hi there! I’m Amanda Han, an engineer on the RealityKit team. In this session, I’ll take you through creating a spatial sculpting app for visionOS using new spatial accessory support. On visionOS, we have a powerful eyes-and-hands-first input paradigm. Using just your eyes and hands, you can navigate Apple Vision Pro seamlessly. And we’re expanding the range of experiences you can build on Vision Pro with tools that let people have finer control, button input, and haptic feedback. This year, we’re adding spatial accessory support.
We’re adding support for two spatial accessories: PlayStation VR2 Sense controller and Logitech Muse.
PS VR2 Sense controller is great for gaming. It has buttons, a joystick, and a trigger. It can even navigate the system using standard gestures, such as tap.
PS VR2 Sense controller can be tracked in space so well you can even play sports, like in Pickle Pro by Resolution Games! Another new accessory is Logitech Muse. Logitech Muse has force sensors that allow for variable input on the tip and side button, as well as powerful haptic feedback.
The precision is great for both productivity and creativity apps. Here’s an example with a sculpting app I’ll build in this session with the Photos app open next to it. You can use these accessories in full space and shared space apps. We track the position and rotation of these spatial accessories using a combination of Apple Vision Pro’s cameras and the spatial accessory’s sensors.
You can make use of tactile haptic feedback in your apps to increase levels of immersion.
You connect to spatial accessories using the Game Controller framework and access anchor data using RealityKit or ARKit. In this session, I’ll take you through using spatial accessory input in your apps. I’ll build a sculpting app using a combination of GameController, RealityKit, and ARKit.
I’ll look at directly using ARKit for tracking spatial accessories.
And lastly, I’ll cover some design considerations.
Now, let’s start building the sculpting app! Start by setting up the Xcode project.
Connect to a spatial accessory via the Game controller framework. Display the sculpting tool virtually. Start interacting with the app by carving into virtual clay.
And, display a reactive toolbar based on tracking interaction data.
Let’s get started with setup in Xcode.
I’m using the Game Controller framework to manage connections with my accessories.
So, I’ll add in spatial game controller support to the plist via the Xcode capabilities editor.
And I’ll indicate that I’m supporting game controllers with spatial accessory tracking by ticking the box Spatial Gamepad.
I’ll describe the way I’ll use the accessory in the description field for Accessory Tracking Usage in the app’s plist.
For the sculpting app, I’ll say I’m “Tracking accessory movements to carve into virtual clay”. People will see this as a pop up to allow accessory tracking.
Now, let’s track one of the accessories. Spatial accessories require active connections to transmit inertial sensor data, so I need to discover the connection in order to make the sculpting tool.
I’ll use the GameController framework to discover the connections with spatial accessories. Listen to connection and disconnection events or iterate over current connections. For more on game controllers, watch “Tap into virtual and physical game controllers”.
Two of the classes in Game Controller can support spatial accessory tracking, GCController and GCStylus. They represent game controllers and styli respectively. Both conform to the GCDevice protocol.
Not every game controller or stylus may support spatial accessory tracking. I’ll show you how to check that either does in our sculpting app.
My digital sculptor might not have their accessories connected when the app opens. So, I’ll need to listen for when they connect their accessories. For styli, I’ll listen for the GCStylusDidConnect notification and grab the GCStylus. To check that the GCStylus supports spatial accessory tracking, I’ll make sure that its product category is Spatial Stylus.
The accessory might also disconnect during the app’s lifetime. Disconnect events are implemented similarly. And to use the GCController API, I can swap out GCStylus for GCController, checking that the GCController has the product category of Spatial Controller.
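As a minimal sketch of the stylus path: the GCStylusDidConnect notification name and the GCProductCategorySpatialStylus constant below follow the pattern of the GCController sample at the end of this page, so treat them as assumptions to verify against the GameController documentation.

// Sketch: listen for stylus connections and check for spatial accessory tracking support.
// GCStylusDidConnect and GCProductCategorySpatialStylus mirror the GCController names
// used in the sample code below.
import GameController

NotificationCenter.default.addObserver(forName: NSNotification.Name.GCStylusDidConnect, object: nil, queue: nil) { notification in
    guard let stylus = notification.object as? GCStylus,
          stylus.productCategory == GCProductCategorySpatialStylus else { return }
    // This stylus supports spatial accessory tracking; set up the sculpting tool here.
}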
I’ve set up a connection, but I don’t see anything on my accessory yet! Let’s display a virtual tip on the end of it. On visionOS, the technique to track an object’s position and orientation in space is called anchoring. Regardless of whether you’re using RealityKit, ARKit, or a combination of the two, anchoring to accessories is available in full space and shared space apps. The sculpting app will use the shared space, as I know that pulling up reference images in other apps like Safari or Photos for sculpting is super useful. To protect privacy, only the currently focused and authorized app can track accessory movements.
Every accessory labels its own set of locations for anchoring. PS VR2 Sense controller labels the aim, grip, and grip surface, while Logitech Muse doesn’t label where your grip goes, only labeling the aim location.
The sculpting app anchors to each spatial accessory’s Aim. I’ll use a RealityKit AnchorEntity to anchor virtual content to my accessory. An AnchorEntity is an Entity that attaches virtual content to the physical world. It can interact with other elements of my RealityKit scene.
Let’s take a look at anchoring here in code. Create an AccessoryAnchoringSource from a spatial accessory.
I’ll target the “aim” location of my accessory. If my accessory doesn’t support that location, I won’t anchor.
And I’ll create the AnchorEntity with an `accessory` target, that “aim” location, and the predicted tracking mode.
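Putting those steps together, here’s a minimal sketch that extends the setupSpatialAccessory(device:) function from the sample code at the end of this page by attaching a simple tip model. The sphere size, the material, and the RealityViewContent handle used to add the entity to the scene are placeholders.

// Sketch: anchor a placeholder tip model to the accessory's aim location.
import GameController
import RealityKit

func setupSpatialAccessory(device: GCDevice, content: RealityViewContent) async throws {
    let source = try await AnchoringComponent.AccessoryAnchoringSource(device: device)
    guard let location = source.locationName(named: "aim") else { return }

    // Predicted tracking keeps rendering latency low; .continuous trades latency for accuracy.
    let sculptingEntity = AnchorEntity(.accessory(from: source, location: location),
                                       trackingMode: .predicted)

    // Placeholder tip: a small white sphere at the aim location.
    let tip = ModelEntity(mesh: .generateSphere(radius: 0.005),
                          materials: [SimpleMaterial(color: .white, isMetallic: false)])
    sculptingEntity.addChild(tip)
    content.add(sculptingEntity)
}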
I’ve created the AnchorEntity on the left. The predicted tracking mode uses a sophisticated model to predict where your accessory will be at the time content renders. But if you make jerky movements, the movement might overshoot. You can see that on frame 4, where the purple predicted frame does not match the actual position in gray. Use it for rendering and low latency interactions.
Use the continuous tracking mode when you need more accuracy. This has higher latency, but provides higher accuracy poses and does not overshoot. For our accessory, I’ll choose to maximize responsiveness with the predicted tracking mode. In a sculpting app, I don’t expect my sculptors to perform jerky hand movements. Now I have a virtual tip anchored onto my accessory! It might look great, but my sculpting tool can’t interact with anything else in the app. I’d like to carve into the clay based on my accessory’s position, and provide haptic feedback.
To carve into virtual clay, I’ll need to get the sculpting tool’s transform.
Last year, we added the SpatialTrackingSession, which allows you to get the transforms of configured target AnchorEntity types.
New this year, we’ve added .accessory to the list of supported SpatialTrackingSession configurations. Once I add .accessory to the Spatial Tracking Configuration, I can run the session to get the transforms of any accessory AnchorEntity.
For more on SpatialTrackingSession, watch Adrian’s talk, “Build a Spatial Drawing App with RealityKit”.
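Here’s a rough sketch of that flow: run the session as in the sample code, then read the anchored entity’s world-space transform whenever you update the sculpture. The carve(at:) helper is hypothetical, and sculptingEntity refers to the accessory AnchorEntity created earlier.

// Sketch: run spatial accessory tracking, then sample the tool's transform each update.
import RealityKit

let session = SpatialTrackingSession()
let configuration = SpatialTrackingSession.Configuration(tracking: [.accessory])
await session.run(configuration)

// Later, in a RealityView update closure or a custom System:
let toolTransform = sculptingEntity.transformMatrix(relativeTo: nil)  // world-space pose of the tip
carve(at: Transform(matrix: toolTransform).translation)               // carve(at:) is a hypothetical helper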
Let’s add a tactile feel to the sculpting. A spatial accessory can have haptics, which is a great feedback mechanism. I’ll add it in our sculpting app to really feel the clay.
Get the haptics from the accessory, create a haptics engine, and start it.
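For example, once the engine is running you can play a short transient tap each time the tool removes material. This sketch uses Core Haptics directly; the intensity value is a placeholder.

// Sketch: play a transient haptic tap on each carve, using the engine created above.
import CoreHaptics

func playCarveHaptic(on engine: CHHapticEngine) {
    let intensity = CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.7)  // placeholder strength
    let tap = CHHapticEvent(eventType: .hapticTransient, parameters: [intensity], relativeTime: 0)
    do {
        let pattern = try CHHapticPattern(events: [tap], parameters: [])
        let player = try engine.makePlayer(with: pattern)
        try player.start(atTime: CHHapticTimeImmediate)
    } catch {
        print("Failed to play carve haptic: \(error)")
    }
}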
For more on haptics, like setting up haptic patterns, watch “Advancements in Game Controllers”.
Let’s look at carving into the clay using my sculpting tool. Whenever the accessory moves into the clay, I remove some material and play haptics for an immersive experience. But I might remove too much material from some parts of my creation, just like that. So, let me add an additive sculpting mode. I’ll show a toolbar to swap between sculpting modes that appears based on the user’s interactions with their accessories. I’ll show it on the left or the right of the accessory depending on which hand is holding it.
For this, I can use the ARKit AccessoryAnchor. The ARKit AccessoryAnchor provides four properties: the handedness, telling me which hand is holding the accessory; the relative motion of the accessory in space; the relative rotational movement in space; and the tracking state. Tracking quality declines when sensors or cameras have reduced coverage.
For specific use cases, such as taking real-world measurements with the help of spatial accessories, ARKit also exposes an API that allows you to receive metric anchor transforms. For more details on this, check out the “Coordinate spaces” API in the documentation.
Back to my example. I need to access the ARKit AccessoryAnchor, but I only have the RealityKit AnchorEntity. This year, RealityKit allows you to get ARKit anchors from any RealityKit AnchorEntity if you have a SpatialTrackingSession running and configured. Let’s set up a function to get the ARKit AccessoryAnchor from a RealityKit AnchorEntity.
Simply access the ARKitAnchorComponent on an AnchorEntity, grab its ARKitAnchor, then conditionally cast it to an AccessoryAnchor.
For more on ARKitAnchorComponent, listen to Laurence’s talk, “What’s new in RealityKit”.
Let’s get to displaying that toolbar. I’ll grab the accessory Anchor from my AnchorEntity using that helper function I just made. And I’ll use the handedness to determine the position of the toolbar. That property is stored in the accessory anchor’s held chirality.
If the handedness is left, I’ll display the toolbar in positive x, and vice versa for right handedness. Otherwise, the accessory is not held in either hand, so I won’t offset it.
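A sketch of that positioning logic, using the getAccessoryAnchor helper from the sample code below: heldChirality follows the property name described above, and toolbarEntity along with the 0.15 m offset are placeholders.

// Sketch: offset the toolbar based on which hand is holding the accessory.
guard let accessoryAnchor = getAccessoryAnchor(entity: sculptingEntity) else { return }

switch accessoryAnchor.heldChirality {
case .left:
    toolbarEntity.position.x = 0.15    // left-held accessory: toolbar in positive x
case .right:
    toolbarEntity.position.x = -0.15   // right-held accessory: toolbar in negative x
default:
    toolbarEntity.position.x = 0       // not held in either hand: no offset
}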
Let’s see that reactive toolbar in my app. When I press the button while the accessory is held in my right hand, the toolbar shows up to the left of my accessory. I can swap to the additive mode and cover up that mistake I made earlier.
Eh... good enough! Let’s recap. I created an immersive sculpting app using the new spatial accessories APIs. I used GameController to connect to accessories.
Then, I used RealityKit APIs to create our sculpting tool. And I displayed a reactive toolbar using the combined capabilities of RealityKit and ARKit. The finished app looks and feels great.
But if you prefer working directly with ARKit, or if you’re building an app with custom rendering, there are APIs to help you achieve equivalent functionality for tracking spatial accessories. Let’s take a look. I’ll outline the key points of ARKit accessory tracking: the accessory tracking provider and accessory anchor updates. Use a GCStylus or a GCController to create an Accessory.
Use an Accessory tracking provider to track Accessory objects.
And when an accessory connects to or disconnects from your app, you need to handle it gracefully. This means you will need to re-run the ARKit session with the updated set of Accessories.
For a look at the implementation, check out the ARKit sample app “Tracking accessories in volumetric windows”. Accessory anchors are similar to hand anchors in their update patterns.
You can choose to receive accurate updates in a stream or use on demand prediction for interactive user feedback. For more on ARKit prediction, watch “Create enhanced spatial computing experiences with ARKit”.
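A rough sketch of the ARKit path might look like this; the Accessory(device:) initializer, the AccessoryTrackingProvider(accessories:) shape, and the anchorUpdates stream are written to match the description above and should be checked against the ARKit documentation.

// Sketch: track a spatial accessory directly with ARKit.
import ARKit
import GameController

let arSession = ARKitSession()

func startAccessoryTracking(for stylus: GCStylus) async throws {
    // Create an Accessory from the connected GCStylus (a GCController works the same way).
    let accessory = try await Accessory(device: stylus)

    // Track it with an accessory tracking provider; re-run the session when accessories change.
    let provider = AccessoryTrackingProvider(accessories: [accessory])
    try await arSession.run([provider])

    // Receive accurate anchor updates as a stream (or use on-demand prediction for interaction).
    for await update in provider.anchorUpdates {
        let anchor = update.anchor   // an AccessoryAnchor with the accessory's pose and tracking state
        _ = anchor
    }
}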
I’ve covered a lot of great spatial accessory APIs. Next, I’ll share some design considerations when building your apps.
Let’s start with using gestures to interact with UI. You can already tell a view to receive game controller input like buttons or triggers as the input method instead of gestures.
And now, you can handle both standard hand gestures and game controllers as input in your views.
Here is how this works in code. Tell your SwiftUI view to receive game controller events. Then, tell it to receive gesture events as well by setting the modifier .receivesEventsInView.
If the game controller has spatial accessory tracking, your spatial event gesture will have the chirality, also known as the handedness, of the game controller populated.
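Here’s a sketch of how that might look on a SwiftUI view. The .handlesGameControllerEvents(matching:) modifier and the no-argument form of .receivesEventsInView() are assumptions based on the description above; the gesture handling uses SpatialEventGesture, whose events carry a chirality.

// Sketch: a view that accepts both game controller input and spatial gestures.
import SwiftUI

struct SculptingControlsView: View {   // placeholder view name
    var body: some View {
        Color.clear                                           // placeholder content
            .handlesGameControllerEvents(matching: .gamepad)  // route controller buttons and triggers here
            .receivesEventsInView()                           // also receive gesture events (form assumed)
            .gesture(
                SpatialEventGesture()
                    .onChanged { events in
                        for event in events {
                            // With a tracked accessory, chirality reports which hand produced the event.
                            _ = event.chirality
                        }
                    }
            )
    }
}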
For apps running in a full space, consider using the .persistentSystemOverlays API to hide the home indicator and .upperLimbVisibility API to hide upper limbs and accessories. This can further enhance the immersion of apps and games.
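For instance, a full-space scene might apply both modifiers like this; the app name, space identifier, and content are placeholders.

// Sketch: hide the home indicator and upper limbs (including held accessories) in a full space.
import SwiftUI
import RealityKit

@main
struct SculptingApp: App {   // placeholder app name
    var body: some Scene {
        ImmersiveSpace(id: "Sculpting") {
            RealityView { _ in /* placeholder immersive content */ }
                .persistentSystemOverlays(.hidden)   // hide the home indicator
        }
        .upperLimbVisibility(.hidden)                // hide hands, arms, and held accessories
    }
}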
Spatial accessories enable powerful new ways to interact with apps and games. And to make sure your app supports as many users as possible, you can offer adaptive support for both spatial accessories and hands.
ARKit natively tracks hands even faster this year, so hands and eyes are even better as input.
When you want to display that you support game controllers with spatial accessory tracking, add the “Spatial game controller support” badge on the App Store. And if your app absolutely needs game controllers with spatial accessory tracking, you can display the “Spatial game controller required” badge. Check the documentation for the corresponding keys to add to your app’s plist to display either badge.
Let’s wrap up this session. You can adopt spatial accessories for finer input control and haptic feedback. Integrate accessories using GameController, RealityKit, and ARKit.
And when designing your apps, make sure to design for adaptive support for hands and accessories. This is just a glimpse of the apps and games you can build using spatial accessories. I can’t wait to see what you create. Have a great WWDC25!
-
0:09 - Get in-app transforms
// Get in-app transforms
let session = SpatialTrackingSession()
let configuration = SpatialTrackingSession.Configuration(tracking: [.accessory])
await session.run(configuration)
-
4:57 - Check for accessory support
// Check spatial accessory support
NotificationCenter.default.addObserver(forName: NSNotification.Name.GCControllerDidConnect, object: nil, queue: nil) { notification in
    if let controller = notification.object as? GCController,
       controller.productCategory == GCProductCategorySpatialController {

    }
}
-
7:20 - Anchor virtual content to an accessory
// Anchor virtual content to an accessory
func setupSpatialAccessory(device: GCDevice) async throws {
    let source = try await AnchoringComponent.AccessoryAnchoringSource(device: device)
    guard let location = source.locationName(named: "aim") else { return }
    let sculptingEntity = AnchorEntity(.accessory(from: source, location: location),
                                       trackingMode: .predicted)
}
-
9:45 - Add haptics
// Add haptics to an accessory
let stylus: GCStylus = ...
guard let haptics = stylus.haptics else { return }
guard let hapticsEngine: CHHapticEngine = haptics.createEngine(withLocality: .default) else { return }
try? hapticsEngine.start()
-
11:25 - Access ARKit anchors from AnchorEntity
// Access ARKit anchors from AnchorEntity
func getAccessoryAnchor(entity: AnchorEntity) -> AccessoryAnchor? {
    if let component = entity.components[ARKitAnchorComponent.self],
       let accessoryAnchor = component.arKitAnchor as? AccessoryAnchor {
        return accessoryAnchor
    }
    return nil
}
-