I'm using CIDetector to find faces in images on the iPhone. It's working and I'm able to find the eye and mouth locations and the face angle, no problems.
I'm wondering if there is a way to work out if the person is looking left or right from the positions of the features? Or some other info in the CIFaceFeature data?
Does anyone know a formula for working this out, or is there a different API you'd suggest to do this? I'd like to do it on both moving and still images.
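CIFaceFeature itself exposes no yaw property (faceAngle is the in-plane roll), so left/right has to be inferred geometrically. One common heuristic is to compare the horizontal midpoint of the two eyes against the center of the face's bounding box: as the head yaws, the eyes shift toward one edge of the detected face rectangle. A sketch of that idea, where `estimateDirection` and the `0.08` threshold are my own hypothetical names/values, not part of the CoreImage API:

```swift
import CoreImage

// "left"/"right" are from the camera's point of view.
enum GazeDirection { case left, right, straight }

// Hypothetical heuristic: compare the eyes' horizontal midpoint with the
// center of the face bounding box. Not an API feature of CIFaceFeature.
func estimateDirection(for face: CIFaceFeature,
                       threshold: CGFloat = 0.08) -> GazeDirection {
    guard face.hasLeftEyePosition, face.hasRightEyePosition else {
        return .straight
    }
    // Note: leftEyePosition is the eye on the left of the *image* (the
    // subject's right eye), and Core Image uses a bottom-left origin.
    let eyeMidX = (face.leftEyePosition.x + face.rightEyePosition.x) / 2
    // Normalize by face width so the threshold is scale-independent.
    let offset = (eyeMidX - face.bounds.midX) / face.bounds.width
    if offset < -threshold { return .left }
    if offset > threshold { return .right }
    return .straight
}

// Detection itself, as in the question:
let detector = CIDetector(ofType: CIDetectorTypeFace,
                          context: nil,
                          options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
// let faces = detector?.features(in: ciImage) as? [CIFaceFeature] ?? []
// for face in faces { print(estimateDirection(for: face)) }
```

The threshold would need tuning against real images; for moving images, passing `CIDetectorTracking: true` in the detector options (available on iOS 6 and later) keeps feature positions stable across video frames, which makes a per-frame heuristic like this much less noisy.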
Any advice would be awesome.