How can I do fast image processing from the iPhone camera?

I am trying to write an iPhone application which will do some real-time camera image processing. I used the example presented in the AVFoundation docs as a starting point: setting up a capture session, making a UIImage from the sample buffer data, then drawing it in -drawRect: after calling -setNeedsDisplay on the main thread.
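For reference, the per-frame path described above looks roughly like this (a minimal sketch, assuming a BGRA-configured AVCaptureVideoDataOutput; the `updateWithImage:` method is a hypothetical placeholder for whatever triggers the redraw):

```objectivec
// Sketch of the slow path described above:
// AVCaptureVideoDataOutput delegate -> UIImage -> -setNeedsDisplay.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    // Wrap the BGRA pixel data in a CGImage, then a UIImage.
    void *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    size_t width = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
        bytesPerRow, colorSpace,
        kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    // Hand off to the main thread, which calls -setNeedsDisplay
    // and eventually -drawRect:. (updateWithImage: is hypothetical.)
    [self performSelectorOnMainThread:@selector(updateWithImage:)
                           withObject:image
                        waitUntilDone:NO];
}
```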

This works, but it is fairly slow (about 50 ms per frame, measured between -drawRect: calls, for a 192 x 144 preset), and I've seen applications on the App Store which work faster than this. Roughly half of my time is spent in -setNeedsDisplay.

How can I speed up this image processing?

asked by Brad Larson, 9 March 2011 at 19:16