Swift: Unmanaged<AnyObject>? is a 9-byte value and breaks optimized builds
| Originator: | kevin | | |
| Number: | rdar://19533815 | Date Originated: | 1/20/2015 |
| Status: | Open | Resolved: | |
| Product: | Developer Tools | Product Version: | |
| Classification: | Serious Bug | Reproducible: | Always |
Summary:
The type `Unmanaged<AnyObject>?` is considered to be a 9-byte value even though
other types such as `Unmanaged<NSObject>?` are correctly treated as an 8-byte
value. But it's only a 9-byte value part of the time. When wrapped in
`UnsafePointer`, as in `UnsafePointer<Unmanaged<AnyObject>?>` it becomes an
8-byte value. Most of the time. In optimized builds (`-O`) it still ends up
being a 9-byte value part of the time.
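For illustration, the size discrepancy can be observed directly with `sizeof()` and `sizeofValue()` (a minimal sketch; the 9-byte results are the buggy behavior being reported here, the 8-byte ones are what's expected on x86_64):

    import Foundation

    // A bare Unmanaged<T> is a single pointer: 8 bytes on x86_64.
    println(sizeof(Unmanaged<AnyObject>.self))            // 8
    println(sizeof(Unmanaged<NSObject>.self))             // 8

    // Wrapping in Optional keeps the NSObject variant at 8 bytes but grows the
    // AnyObject variant to 9.
    println(sizeof(Optional<Unmanaged<NSObject>>.self))   // 8
    println(sizeof(Optional<Unmanaged<AnyObject>>.self))  // 9 (the bug)

    var x: Unmanaged<AnyObject>? = nil
    println(sizeofValue(x))                               // 9 (the bug)
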
This causes particular trouble when calling C functions that use an out-param
like `UnsafeMutablePointer<Unmanaged<AnyObject>?>`, such as
`SecItemCopyMatching()`. Passing a reference to a local `Unmanaged<AnyObject>?`
that is initialized to `nil` seems to work in debug builds, but in optimized
builds the local value is considered to be `nil` even though the function wrote
a valid pointer to it, because the 9th byte, the discriminant, is still set to
0x01 and the compiler emits code that tests the discriminant instead of testing
the pointer.
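As an illustration of the failing pattern (a sketch only: `copyFirstMatch()` is a hypothetical helper, and building the query dictionary is left out because only the out-param matters):

    import Foundation
    import Security

    // Hypothetical helper around the Swift 1.x signature of SecItemCopyMatching(),
    // whose result parameter is UnsafeMutablePointer<Unmanaged<AnyObject>?>.
    func copyFirstMatch(query: CFDictionary) -> AnyObject? {
        var result: Unmanaged<AnyObject>? = nil
        let status = SecItemCopyMatching(query, &result)
        if status != 0 /* errSecSuccess */ {
            return nil
        }
        // SecItemCopyMatching() writes a valid pointer into the first 8 bytes of
        // `result`, but in an -O build the compiler tests the 9th byte (still
        // 0x01, i.e. nil), so this returns nil despite the successful status.
        return result?.takeRetainedValue()
    }
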
This can be reproduced with `SecItemCopyMatching()`, but it can also be tested
by redeclaring `CFArrayGetValues()` to use `Unmanaged<AnyObject>`, as the code
in the reproduction steps does.
Something about this that's truly puzzling is the fact that the discriminant is
chosen such that a value of `0x01` indicates `nil` and `0x00` indicates a valid
value. This seems quite backwards; one would expect a discriminant of `0x00` to
be chosen to represent `nil` so that it can be unified with the bytes used for
the pointer.
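The swapped discriminant can be observed with the same `unsafeBitCast()` trick used in the reproduction below (a sketch; the commented values are what this report describes):

    import Foundation

    // (UInt64, UInt8) is a 9-byte tuple, so this bitcast only succeeds while the
    // compiler treats Unmanaged<AnyObject>? as a 9-byte value.
    let obj: AnyObject = NSObject()
    let some: Unmanaged<AnyObject>? = Unmanaged.passUnretained(obj)
    let none: Unmanaged<AnyObject>? = nil

    println(unsafeBitCast(none, (UInt64, UInt8).self)) // (0, 1): discriminant 0x01 means nil
    println(unsafeBitCast(some, (UInt64, UInt8).self)) // (<object address>, 0): 0x00 means a value
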
Also puzzling is the fact that `sizeofValue()` always returns 9 for
`Unmanaged<AnyObject>?`, but returns 8 for the expression `x.memory` where `x`
is `UnsafePointer<Unmanaged<AnyObject>?>`. My best guess is that `x.memory` is
actually of type `@lvalue Unmanaged<AnyObject>?`, which means it's really a
pointer, and the compiler is then able to perform the null pointer
optimization. But having the size change like that merely by taking a pointer to
the value is kind of frightening. It also doesn't explain why `x == nil` changes
behavior under optimization, as `==` doesn't take lvalues.
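This can be demonstrated with a tiny generic helper (`memorySize()` is hypothetical, written here to mirror the `show()` function in the reproduction; the 9/8 results are the behavior described above):

    // Reports the size of the pointee as seen through an UnsafePointer.
    func memorySize<T>(p: UnsafePointer<T>) -> Int {
        return sizeofValue(p.memory)
    }

    var x: Unmanaged<AnyObject>? = nil
    println(sizeofValue(x))   // 9: the value measured directly
    println(memorySize(&x))   // 8: the same value measured as p.memory
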
And finally, all this weirdness goes away when using `Unmanaged<NSObject>?`
instead of `Unmanaged<AnyObject>?` (or using any subclass of `NSObject`).
`Unmanaged<NSObject>?` is an 8-byte value and behaves identically in optimized
and non-optimized builds.
Steps to Reproduce:
1. Compile and run the following code both without optimization and with `-O`:

    import Foundation

    @asmname("CFArrayGetValues") func getValues(ary: NSArray, range: CFRange, values: UnsafeMutablePointer<Unmanaged<AnyObject>?>)

    @inline(never)
    func foo() {
        var x: Unmanaged<AnyObject>?
        println("x == nil: \(x == nil)")
        println(unsafeBitCast(x, (UInt64,UInt8).self))
        show(&x)
        getValues(["one"], CFRange(location: 0, length: 1), &x)
        println("set x to: \(x?.takeRetainedValue() as String?)")
        println("x == nil: \(x == nil)")
        println(unsafeBitCast(x, (UInt64,UInt8).self))
        show(&x)
    }

    func show<T>(x: UnsafePointer<T>) {
        let p = UnsafePointer<UInt8>(x)
        var s = "<"
        for i in 0..<sizeofValue(x.memory) {
            if i > 0 && i % 4 == 0 {
                s.extend(" ")
            }
            let byte = String(p[i], radix: 16)
            if countElements(byte) == 1 {
                s.extend("0")
            }
            s.extend(byte)
        }
        s.extend(">")
        println(s)
    }

    foo()
Expected Results:
Ideally, the compiler would correctly understand that `Unmanaged<AnyObject>?` is
an 8-byte value. In that case the above program would actually fail at
`unsafeBitCast()` because of the mismatched sizes, but with the cast type
replaced by `UInt64` it would ideally print something like:

    x == nil: true
    0
    <00000000 00000000>
    set x to: Optional("one")
    x == nil: false
    140720087302720
    <4002d1f2 fb7f0000>

In fact, this is precisely what happens when you edit the code to use
`Unmanaged<NSObject>?` instead of `Unmanaged<AnyObject>?`.
Actual Results:
However, it is treated as a 9-byte value, and so the above can't happen.
Instead, without optimization it prints:

    x == nil: true
    (0, 1)
    <00000000 00000000>
    set x to: Optional("one")
    x == nil: false
    (140720087302720, 1)
    <4002d1f2 fb7f0000>

Three things to note here. First, `x` starts out as `(0, 1)`, which demonstrates
the oddly swapped discriminants. Second, the size as seen by `show()` is only 8
bytes, and so it prints an 8-byte null pointer for `nil`. Third, even though the
discriminant remains as `1` after the value is populated, Swift believes the
value to be non-nil.
With optimization (using -O):

    x == nil: true
    (0, 1)
    <00000000 00000000 01>
    set x to: nil
    x == nil: true
    (140498359550960, 1)
    <f003d052 c87f0000 01>

Things to note here: the discriminant is still `1` for `nil`, but this time
`show()` actually sees a 9-byte value. Not only that, but Swift believes the
value to still be `nil` after the call to `getValues()` because it's now testing
the discriminant.
Version:
Swift version 1.1 (swift-600.0.57.3)
Target: x86_64-apple-darwin14.0.0
Notes:
This bug is a very serious problem for anyone using the Keychain APIs, or
anything else that uses `Unmanaged<AnyObject>`. Not only is it obviously broken,
it's very oddly broken only in optimized builds, meaning that people who hit it
won't ever see it in their development environment. Perhaps even worse, they
won't ever get usable assertion failure messages when they e.g. force-unwrap
the optional value, because optimization disables those.
Configuration:
OS X 10.10.1 (14B25)