Recently we noticed that we can load about 50 images at 2448 × 3264 (the iPhone camera resolution) into UIImageViews that are on screen and currently visible, so all 50 images are decoded. On iPhone, the memory budget per app is roughly 500 MB (RAM and VRAM combined). If one 2448 × 3264 image is decoded to RGB888, it takes about 24 MB, and 50 (image count) × 24 MB (per image) > 500 MB (the app's budget), so how is this possible? We ran Instruments with a blank template, added the 'VM Tracker' instrument from the library, and under 'IOKit' (where texture memory is accounted) we saw memory grow by only about 11.9 MB per image, not 24 MB. We suspect that iOS decodes JPEG files not to RGB888 but to YUV420.
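As a back-of-envelope check of the figures above (a sketch, assuming decimal megabytes and the standard bytes-per-pixel counts for each format, not anything measured from iOS itself):

```python
# Hypothetical memory-footprint arithmetic for one decoded iPhone photo.
width, height = 2448, 3264
pixels = width * height  # 7,990,272 pixels

# RGB888: 3 bytes per pixel.
rgb888_bytes = pixels * 3

# YUV 4:2:0: full-resolution luma plus quarter-resolution U and V
# planes, i.e. 1.5 bytes per pixel on average.
yuv420_bytes = pixels * 3 // 2

print(rgb888_bytes / 1_000_000)  # ≈ 23.97 MB, matching the "24 MB" estimate
print(yuv420_bytes / 1_000_000)  # ≈ 11.99 MB, close to the observed ~11.9 MB
```

The YUV420 figure lines up well with the ~11.9 MB per image seen in VM Tracker, which is what makes the YUV420 hypothesis plausible.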
Can someone help us explain this behaviour?
I can share code samples if needed.