Core Image sampling mode not exposed on iOS, cannot display image with 'nearest' filtering
| Originator: | raphael | | |
| Number: | rdar://15376683 | Date Originated: | 02-Nov-2013 09:38 AM |
| Status: | Open | Resolved: | |
| Product: | iOS SDK | Product Version: | iOS 7.0.3 |
| Classification: | Enhancement | Reproducible: | Always |
The CIImage sampling mode cannot be chosen on iOS. As a result, when drawing to an OpenGL context, displaying a pixellated image (for example, the output of CIPixellate with a 4x4 pixel size) at a zoom level above 1.0 always produces blurry, linearly interpolated pixel edges.

On OS X, nearest-neighbour sampling can be obtained by using a pass-through filter with a specific sampler:

```objc
CISampler *sampler = [CISampler samplerWithImage:self
                                   keysAndValues:kCISamplerFilterMode,
                                                 kCISamplerFilterNearest, nil];
```

Could you please consider adding an API to expose sampling on iOS? Thank you.
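For reference, a minimal sketch of what the OS X workaround mentioned above might look like as a complete pass-through filter (assuming an ARC target; the class name `PassThroughNearestFilter` and its property are illustrative, not Apple API):

```objc
#import <QuartzCore/QuartzCore.h>

// Illustrative pass-through filter that forces nearest-neighbour sampling
// by wrapping its input image in a CISampler with kCISamplerFilterNearest.
@interface PassThroughNearestFilter : CIFilter
@property (nonatomic, strong) CIImage *inputImage;
@end

@implementation PassThroughNearestFilter

+ (CIKernel *)passThroughKernel {
    static CIKernel *kernel = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        // Trivial kernel: return the sampled pixel unchanged.
        NSString *source =
            @"kernel vec4 passThrough(sampler src) {"
            @"    return sample(src, samplerCoord(src));"
            @"}";
        kernel = [[CIKernel kernelsWithString:source] firstObject];
    });
    return kernel;
}

- (CIImage *)outputImage {
    // The sampler, not the kernel, carries the nearest-neighbour filter mode.
    CISampler *sampler = [CISampler samplerWithImage:self.inputImage
                                       keysAndValues:kCISamplerFilterMode,
                                                     kCISamplerFilterNearest, nil];
    return [self apply:[[self class] passThroughKernel],
                       sampler,
                       kCIApplyOptionDefinition, [sampler definition],
                       nil];
}

@end
```

None of this compiles on iOS, because CISampler, CIKernel and -[CIFilter apply:] are OS X-only as of iOS 7, which is exactly the gap this request asks to close.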