---
title: FaceRecognitionIndexer
keywords: fastai
sidebar: home_sidebar
nb_path: "nbs/indexers.FaceRecognitionIndexer.ipynb"
---
First, we create a `FaceRecognitionIndexer`, load a test photo, and detect the faces it contains.

```python
indexer = FaceRecognitionIndexer()
photo = IPhoto.from_data(file=PYI_TESTDATA / "photos" / "facerecognition" / "celebs.jpg")

# Detect faces: bounding boxes and facial landmarks
boxes, landmarks = indexer.predict_boundingboxes(photo)
photo.draw_boxes(boxes)

# Crop each detected face out of the photo
crops = photo.get_crops(boxes, landmarks)
```
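As a quick sanity check we can count how many faces were found. This assumes `boxes` and `crops` are ordinary Python sequences, which matches how they are indexed later in this notebook.

```python
# Sanity check (assumes boxes and crops support len())
print(f"Detected {len(boxes)} faces, extracted {len(crops)} crops")
```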
Let's take two of the detected faces from our photo:

```python
show_images([crops[0], crops[1]])
```
And check whether they show the same person:
```python
# Two different people should get a low similarity score
similarity = indexer.compare(crops[0], crops[1])
assert similarity < 0.5
print("Not the same person")
```
Now let's load two different photos of the same person:

```python
ellen1 = IPhoto.from_data(file=PYI_TESTDATA / "photos" / "facerecognition" / "ellen1.png")
ellen2 = IPhoto.from_data(file=PYI_TESTDATA / "photos" / "facerecognition" / "ellen2.png")
show_images([ellen1.data, ellen2.data])
```
```python
# Two photos of the same person should get a high similarity score
sim = indexer.compare(ellen1.data, ellen2.data)
assert sim > 0.5
print("Same person")
```
When we plot the crops, we see that our model is actually doing some magic behind the scenes to normalize and scale our images.
```python
photo.plot_crops(boxes, landmarks)
# show_images(crops, cols=4)
```