Calling integerValue and unsignedIntegerValue on NSDecimalNumbers created from strings (including those using KVC) yields inconsistent results.

Originator:isaac.greenspan
Number:rdar://20211532 Date Originated:18-Mar-2015 03:46 PM
Status:Closed (dup of rdar://19812966) Resolved:
Product:Developer Tools Product Version:10.10.2 / Xcode 6.2
Classification:Enhancement Reproducible:Always
 
This is a duplicate of rdar://20211159

Summary:
This bit someone trying to use key-value coding to compute the average of an NSUInteger property across an array of items.

The NSNumber returned by the @avg.value KVC collection operator had the value 588.33333333333333333333333333333333333. The person then tried to read this value by calling both average.integerValue and average.unsignedIntegerValue.
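The averaging step can be sketched without KVC as well. The values below are made up (hypothetical) so that the average comes out to 588.333…, matching the report; @avg.value over an NSUInteger property produces an equivalent NSDecimalNumber:

```swift
import Foundation

// Hypothetical values chosen so the average is 588.333…, as in the report.
let values: [UInt] = [587, 588, 590]

// Sum and divide with NSDecimalNumber arithmetic, which is what the
// @avg.value collection operator yields under the hood: an NSDecimalNumber
// carrying a long (up to 38-digit) fractional representation.
let sum = values.reduce(NSDecimalNumber.zero) { $0.adding(NSDecimalNumber(value: $1)) }
let average = sum.dividing(by: NSDecimalNumber(value: values.count))

print(average) // ≈ 588.333…
```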

On this person's computer, running Mavericks, both calls returned 588. On my computer, running Yosemite, both returned zero. After quite a bit of futzing about, we discovered that NSDecimalNumbers created from a string convert to an integer differently depending on the length of the string.

Steps to Reproduce:
See attached playground. 
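The attached playground is not reproduced here, but a minimal sketch of the comparison it makes might look like this (the zero result noted in the comment is the behavior reported on Yosemite, not a guaranteed output):

```swift
import Foundation

// The same quantity spelled with a short and a very long fractional part.
let shortForm = NSDecimalNumber(string: "588.3333")
let longForm  = NSDecimalNumber(string: "588.33333333333333333333333333333333333")

// Expected: both convert to 588. On 10.10.2, longForm.integerValue (and
// unsignedIntegerValue) reportedly returned 0 instead.
print(shortForm.integerValue, longForm.integerValue)
```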

Expected Results:
The integerValue and unsignedIntegerValue results would be the same across platforms and regardless of how the NSDecimalNumber was created.

Actual Results:
(see “summary”)

Regression:
This is occurring on 10.10.2.

Notes:
Please see attached playground for some examples of how this winds up working.
