NSNumberFormatter misinterpreting NSUInteger as NSInteger during formatting.
| Originator: | regexident | | |
| --- | --- | --- | --- |
| Number: | rdar://19068423 | Date Originated: | 22-Nov-2014 09:25 PM |
| Status: | Open | Resolved: | |
| Product: | OS X | Product Version: | Mac OS X 10.10 |
| Classification: | Serious Bug | Reproducible: | Always |
Summary:

NSNumberFormatter appears to misinterpret 64-bit unsigned integer NSNumbers (a.k.a. NSUInteger on 64-bit, created via `NSNumber(unsignedLongLong: …)`) as 64-bit *signed* integer (a.k.a. NSInteger on 64-bit) NSNumbers when formatting.

Steps to Reproduce:

Swift Playground:

```swift
import Cocoa // or UIKit

let uint64_max: UInt64 = UInt64.max
let uint64_max_number = NSNumber(unsignedLongLong: uint64_max)
let uint64_max_string = NSNumberFormatter().stringFromNumber(uint64_max_number)!
```

Expected Results:

Playground console log:

```
18446744073709551615
18446744073709551615
"18446744073709551615"
```

Actual Results:

Playground console log:

```
18446744073709551615
18446744073709551615
"-1"
```

Regression:

Notes:

This also occurs from Objective-C (on both iOS and OS X); it is thus not purely Swift-related.
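The "-1" output is consistent with the formatter reading the value's two's-complement bit pattern as a signed 64-bit integer: `UInt64.max` is the all-ones pattern, which is exactly -1 when reinterpreted as `Int64`. A minimal sketch of that reinterpretation, plus a possible workaround via `NSDecimalNumber` (the workaround is an assumption on my part, not part of the original report), in the same Swift 1 style as the report:

```swift
import Foundation

// Why "-1" appears: UInt64.max is the all-ones bit pattern, and a
// signed 64-bit (two's complement) read of that pattern yields -1.
let reinterpreted = Int64(bitPattern: UInt64.max) // -1

// Possible workaround (assumption, not from the report): route the
// value through NSDecimalNumber, which represents the full unsigned
// 64-bit range without a signed reinterpretation, before formatting.
let decimal = NSDecimalNumber(mantissa: UInt64.max, exponent: 0, isNegative: false)
let workaround = NSNumberFormatter().stringFromNumber(decimal)
```

With the decimal-backed number, the formatter has no 64-bit signed fast path to fall into, so it should produce the full unsigned value.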