Definition of Unicode literal string in Swift is incorrect
| Originator: | craig.hockenberry | | |
| Number: | rdar://19758644 | Date Originated: | 07-Feb-2015 06:04 PM |
| Status: | Open | Resolved: | |
| Product: | Documentation | Product Version: | Xcode 6.1.1 (6A2008a) |
| Classification: | Other Bug | Reproducible: | Always |
The documentation states that a string literal in Swift can include “An arbitrary Unicode scalar, written as \u{n}, where n is between one and eight hexadecimal digits”:
https://developer.apple.com/library/prerelease/ios/documentation/Swift/Conceptual/Swift_Programming_Language/StringsAndCharacters.html#//apple_ref/doc/uid/TP40014097-CH7-ID293
In the current version of Swift, n must be between one and five hexadecimal digits (which is consistent with the 21-bit number used to represent the value).
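For illustration, here is a minimal Swift sketch of the \u{n} escape at different digit counts; the limit noted in the final comment reflects the behavior this report describes for Xcode 6.1.1, and the character names in the comments are from the Unicode standard:

```swift
// The \u{n} Unicode scalar escape in Swift string literals.
let letter = "\u{61}"    // 2 hex digits: LATIN SMALL LETTER A ("a")
let heart  = "\u{2665}"  // 4 hex digits: BLACK HEART SUIT ("♥")
let hearts = "\u{1F496}" // 5 hex digits: SPARKLING HEART ("💖")
print("\(letter) \(heart) \(hearts)")

// Per this report, an escape padded to more than five digits, such as
// "\u{0001F496}" (8 hex digits), fails to compile in Xcode 6.1.1 even
// though the documentation says one to eight digits are allowed.
```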