1. Hand-tracking technology on Vision Pro
Hand tracking is a new ARKit capability: it provides an anchor containing skeletal data for each hand.
On Vision Pro, Apple uses six Sony IMX418 cameras for spatial, body, and hand recognition. Two infrared flood illuminators on the underside improve hand recognition in low-light environments, and a LiDAR scanner plus two depth cameras handle fine-grained hand tracking. The R1 chip's image-processing, motion-capture, motion-prediction, and motion-compensation algorithms fuse all of these inputs to deliver hand tracking on Vision Pro.
IMX418 cameras
Infrared flood illuminators
LiDAR
R1 chip
2. Limitations of hand tracking
Hand tracking requires a running ARKit session and the wearer's authorization. ARKit data is only available in an ImmersiveSpace; in a Window or a Volume, you can only use the 2D gestures that SwiftUI provides.
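As a minimal sketch of this constraint (the scene id and view contents are illustrative, not from the original), the app must open an ImmersiveSpace before any hand-tracking data can flow:

```swift
import SwiftUI
import RealityKit

@main
struct HandTrackingApp: App {
    var body: some Scene {
        // A plain window: only SwiftUI's 2D gestures are available here.
        WindowGroup {
            Text("Open the immersive space to start hand tracking")
        }
        // ARKit (and therefore hand tracking) only delivers data while an
        // immersive space like this one is open.
        ImmersiveSpace(id: "HandTracking") {
            RealityView { _ in
                // Scene content driven by hand anchors goes here.
            }
        }
    }
}
```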
3. Key points of the hand
When your hands are detected, they are delivered to you as HandAnchors, which you can use to place content relative to a hand or to detect custom gestures.
A HandAnchor is a TrackableAnchor. Each HandAnchor includes a skeleton and a chirality; the chirality tells you whether the anchor is the left hand or the right hand.
A HandAnchor's transform is the transform of the wrist relative to the app's origin.
The skeleton is made up of joints, which can be queried by name.
The hand skeleton exposes 26 joints in total, and for each joint you can read its parent joint, its localTransform, its rootTransform, and whether it isTracked.
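For instance, querying a single joint and resolving its world-space transform might look like the hedged sketch below (in the shipping SDK, the root and local joint transforms are exposed as anchorFromJointTransform and parentFromJointTransform):

```swift
import ARKit

// A sketch: read one joint from a HandAnchor. anchorFromJointTransform is
// the joint's transform relative to the wrist (the anchor's origin).
func indexFingerTipWorldTransform(of anchor: HandAnchor) -> simd_float4x4? {
    guard let skeleton = anchor.handSkeleton else { return nil }
    // Query a joint by name.
    let tip = skeleton.joint(.indexFingerTip)
    guard tip.isTracked else { return nil }
    // Composing the wrist-relative joint transform with the anchor's own
    // transform yields the joint's transform in the app's origin space.
    return anchor.originFromAnchorTransform * tip.anchorFromJointTransform
}
```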
4. How to add hand tracking
Initialize an ARKitSession and a HandTrackingProvider:
let session = ARKitSession()
var handTracking = HandTrackingProvider()
@Published var latestHandTracking: HandsUpdates = .init(left: nil, right: nil)

struct HandsUpdates {
    var left: HandAnchor?
    var right: HandAnchor?
}
Start the ARKitSession and subscribe to handTracking.anchorUpdates:
func start() async {
    do {
        if HandTrackingProvider.isSupported {
            print("ARKitSession starting.")
            try await session.run([handTracking])
        }
    } catch {
        print("ARKitSession error:", error)
    }
}
func publishHandTrackingUpdates() async {
    for await update in handTracking.anchorUpdates {
        switch update.event {
        case .updated:
            let anchor = update.anchor
            // Publish updates only if the hand and the relevant joints are tracked.
            guard anchor.isTracked else { continue }
            // Update left-hand info.
            if anchor.chirality == .left {
                latestHandTracking.left = anchor
            } else if anchor.chirality == .right { // Update right-hand info.
                latestHandTracking.right = anchor
            }
        default:
            break
        }
    }
}
func monitorSessionEvents() async {
    for await event in session.events {
        switch event {
        case .authorizationChanged(let type, let status):
            if type == .handTracking && status != .allowed {
                // Stop the game and ask the user to grant hand tracking
                // authorization again in Settings.
            }
        default:
            print("Session event \(event)")
        }
    }
}
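Wiring these pieces together might look like the sketch below. The view and model names are illustrative, not from the original; `model` is assumed to be the object that owns the session and the three functions above.

```swift
import SwiftUI
import RealityKit

struct HandTrackingImmersiveView: View {
    // Hypothetical model type that owns session, start(),
    // publishHandTrackingUpdates(), and monitorSessionEvents().
    @StateObject var model = HandTrackingModel()

    var body: some View {
        RealityView { content in
            // Build scene content driven by latestHandTracking here.
        }
        .task {
            // Optionally request authorization up front; run(_:) would
            // otherwise prompt on first use.
            _ = await model.session.requestAuthorization(for: [.handTracking])
            await model.start()
        }
        .task { await model.publishHandTrackingUpdates() }
        .task { await model.monitorSessionEvents() }
    }
}
```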
Define detection rules by checking the relationships between the joints of the left and right hands. For example, a two-handed "heart" gesture:
func computeTransformOfUserPerformedHeartGesture() -> simd_float4x4? {
    // Get the latest hand anchors; return nil if either hand isn't tracked.
    guard let leftHandAnchor = latestHandTracking.left,
          let rightHandAnchor = latestHandTracking.right,
          leftHandAnchor.isTracked, rightHandAnchor.isTracked else {
        return nil
    }
    // Resolve the joints the gesture needs; bail out if any is untracked.
    guard let leftHandThumbKnuckle = leftHandAnchor.handSkeleton?.joint(.thumbKnuckle),
          let leftHandThumbTip = leftHandAnchor.handSkeleton?.joint(.thumbTip),
          let rightHandThumbKnuckle = rightHandAnchor.handSkeleton?.joint(.thumbKnuckle),
          let rightHandThumbTip = rightHandAnchor.handSkeleton?.joint(.thumbTip),
          let rightHandIndexFingerTip = rightHandAnchor.handSkeleton?.joint(.indexFingerTip),
          leftHandThumbKnuckle.isTracked, leftHandThumbTip.isTracked,
          rightHandThumbKnuckle.isTracked, rightHandThumbTip.isTracked,
          rightHandIndexFingerTip.isTracked else {
        return nil
    }
    // World position of a joint: anchor transform composed with joint transform.
    func worldPosition(of joint: HandSkeleton.Joint, in anchor: HandAnchor) -> SIMD3<Float> {
        let t = anchor.originFromAnchorTransform * joint.anchorFromJointTransform
        return SIMD3(t.columns.3.x, t.columns.3.y, t.columns.3.z)
    }
    let leftHandThumbKnuckleWorldPosition = worldPosition(of: leftHandThumbKnuckle, in: leftHandAnchor)
    let leftHandThumbTipWorldPosition = worldPosition(of: leftHandThumbTip, in: leftHandAnchor)
    let rightHandThumbKnuckleWorldPosition = worldPosition(of: rightHandThumbKnuckle, in: rightHandAnchor)
    let rightHandThumbTipWorldPosition = worldPosition(of: rightHandThumbTip, in: rightHandAnchor)
    let rightHandIndexFingerTipWorldPosition = worldPosition(of: rightHandIndexFingerTip, in: rightHandAnchor)
    // Compute a position in the middle of the heart gesture.
    let halfway = (rightHandIndexFingerTipWorldPosition - leftHandThumbTipWorldPosition) / 2
    let heartMidpoint = rightHandIndexFingerTipWorldPosition - halfway
    // X axis: from the left thumb knuckle to the right thumb knuckle, normalized.
    let xAxis = normalize(rightHandThumbKnuckleWorldPosition - leftHandThumbKnuckleWorldPosition)
    // Y axis: from the right thumb tip to the right index finger tip, normalized.
    let yAxis = normalize(rightHandIndexFingerTipWorldPosition - rightHandThumbTipWorldPosition)
    // Z axis: perpendicular to both, completing the frame.
    let zAxis = normalize(cross(xAxis, yAxis))
    // Build the final transform for the heart gesture from the three axes and the midpoint.
    let heartMidpointWorldTransform = simd_matrix(
        SIMD4(xAxis.x, xAxis.y, xAxis.z, 0),
        SIMD4(yAxis.x, yAxis.y, yAxis.z, 0),
        SIMD4(zAxis.x, zAxis.y, zAxis.z, 0),
        SIMD4(heartMidpoint.x, heartMidpoint.y, heartMidpoint.z, 1)
    )
    return heartMidpointWorldTransform
}
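As a usage sketch (heartEntity is a hypothetical RealityKit entity, not from the original), the returned matrix can position content at the gesture on each update:

```swift
import RealityKit

// Hypothetical entity created elsewhere in the scene, e.g.:
// let heartEntity = ModelEntity(mesh: .generateSphere(radius: 0.02))

if let gestureTransform = computeTransformOfUserPerformedHeartGesture() {
    // Place the entity at the midpoint of the heart gesture.
    heartEntity.transform = Transform(matrix: gestureTransform)
    heartEntity.isEnabled = true
} else {
    // Hide the entity while the gesture isn't detected.
    heartEntity.isEnabled = false
}
```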