Semantic feedback/haptics on iOS
| Originator: | frozendevil | | |
| Number: | rdar://21540065 | Date Originated: | 24-Jun-2015 10:07 PM |
| Status: | Open | Resolved: | |
| Product: | iOS SDK | Product Version: | iOS 9.0 |
| Classification: | Enhancement | Reproducible: | Not Applicable |
watchOS 2.0 introduced the concept of semantic haptics with the `-[WKInterfaceDevice playHaptic:]` API. I would really like to use this on iOS. A consistent set of haptic and audio cues for well-defined UX events would bring a level of polish and consistency to iOS apps that is difficult to achieve today.

As a contractor, I find that most large companies aren't interested in investing in sound design for their apps; as an independent developer, I simply don't have the resources to record and implement professional-quality sound design. In either role, the only public option I have for device vibration is the somewhat clumsy `AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)` API. Being able to simply call `playHaptic:` at the appropriate times would be great.

This could also be an accessibility boon, as it would provide a common vocabulary for non-visual UX feedback. I think it would be a great augmentation for a VoiceOver interface that, while powerful, is relatively one-dimensional.
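For context, here is a rough sketch of the two sides today. The controller classes and `confirmSave` actions are hypothetical, added only to anchor the calls; the APIs themselves are the shipping ones. On watchOS 2, one call expresses a well-defined UX event and the system supplies the matching haptic and sound design:

```objc
// watchOS 2 (WatchKit extension target): the semantic haptic API this
// report asks Apple to bring to iOS.
#import <WatchKit/WatchKit.h>

@interface SaveController : WKInterfaceController
@end

@implementation SaveController

// Hypothetical action handler, shown only to illustrate where the call sits.
- (IBAction)confirmSave {
    // The system plays a haptic (and sound) designed for "success".
    [[WKInterfaceDevice currentDevice] playHaptic:WKHapticTypeSuccess];
}

@end
```

Compare the only public option on iOS 9, which is all-or-nothing and carries no semantic meaning:

```objc
// iOS 9: the lone public vibration hook. No variants for success, failure,
// click, etc., and no system-designed sound to accompany it.
#import <UIKit/UIKit.h>
#import <AudioToolbox/AudioToolbox.h>

@interface SaveViewController : UIViewController
@end

@implementation SaveViewController

// Hypothetical action handler, mirroring the watchOS example above.
- (IBAction)confirmSave:(id)sender {
    // Triggers the generic device vibration with no semantic distinction.
    AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
}

@end
```

A `playHaptic:`-style call on iOS would let the same action produce feedback that is consistent across apps, with no per-app sound design required.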