No fast path from UIView layers to CGImageRef for use in image processing

Originator: dillion
Number: rdar://14232801
Date Originated: 22-Jun-2013 05:10 AM
Status: Open
Resolved: No
Product: iPhone SDK
Product Version: iOS 7.0 beta
Classification: Feature (New)
Reproducible: Always
 
Summary:

The new snapshot method in UIView returns a UIView based on the contents of the current view. I would like a similar method that returns a UIImage or CGImageRef composited from the current view's layers, and that does so faster than CALayer's renderInContext: method.
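
Purely to illustrate the shape of what is being requested (this signature is hypothetical, not an existing or proposed API), something along these lines:

    @interface UIView (FastSnapshotImage) // hypothetical category name
    // Hypothetical: return the composited contents of the receiver's layer tree
    // as a CGImageRef, using the same fast path as the new snapshot method,
    // rather than a CPU re-render via renderInContext:.
    // The caller owns the returned CGImageRef.
    - (CGImageRef)createSnapshotCGImage;
    @end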

This would be useful for image-processing effects such as blur in screenshot-based animations, since CIFilter and OpenGL filters can take CGImageRefs as input. Today, having to call renderInContext: before the image processing, whether on the CPU or the GPU, significantly slows the animation down; see the sketch of the current workaround below.
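
For reference, a minimal sketch of the workaround described above, assuming a plain UIView and an arbitrary blur radius (the function name and values are illustrative only):

    #import <UIKit/UIKit.h>
    #import <CoreImage/CoreImage.h>

    // Current slow path: render the layer on the CPU with renderInContext:,
    // then hand the resulting CGImage to Core Image for blurring.
    // The caller owns the returned CGImageRef.
    static CGImageRef CreateBlurredSnapshot(UIView *view, CGFloat radius)
    {
        UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0.0);
        [view.layer renderInContext:UIGraphicsGetCurrentContext()]; // the expensive step
        UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        CIImage *input = [CIImage imageWithCGImage:snapshot.CGImage];
        CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
        [blur setValue:input forKey:kCIInputImageKey];
        [blur setValue:@(radius) forKey:@"inputRadius"];

        CIContext *context = [CIContext contextWithOptions:nil];
        return [context createCGImage:blur.outputImage fromRect:input.extent];
    }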

If it helps performance, the rendering could be done at a low resolution, since the image data will be filtered and then animated within a short period of time; for other purposes that require full resolution, I believe the existing snapshot method is sufficient.
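
As an illustration of the low-resolution idea (the scale factor here is an arbitrary example, not a value from this report), the same CPU path can render into a smaller context, trading resolution for speed:

    // Render a view's layer into a downscaled bitmap; a "scale" of e.g. 0.25
    // produces a quarter-resolution image that is cheaper to render and filter.
    static UIImage *LowResolutionSnapshot(UIView *view, CGFloat scale)
    {
        CGSize size = CGSizeMake(view.bounds.size.width * scale,
                                 view.bounds.size.height * scale);
        UIGraphicsBeginImageContextWithOptions(size, NO, 1.0);
        CGContextScaleCTM(UIGraphicsGetCurrentContext(), scale, scale);
        [view.layer renderInContext:UIGraphicsGetCurrentContext()];
        UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return image;
    }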

Steps to Reproduce:

Expected Results:

Actual Results:

Regression:

Notes:
