Describe the image you want in one sentence, and matching photos can be found in your library

Introduction

Repository: https://github.com/mazzzystar/Queryable

Queryable is an open-source iOS app that leverages OpenAI's CLIP model to conduct offline searches in the 'Photos' album. Unlike the category-based search built into the iOS Photos app, Queryable lets you search your album with natural-language descriptions, such as "a brown dog sitting on a bench". Since it runs entirely offline, your album privacy is never compromised by any company, including Apple or Google.


How does it work?

  • Encode all album photos offline with the CLIP Image Encoder, compute their image vectors, and save them.
  • For each new text query, compute the corresponding text vector with the CLIP Text Encoder.
  • Compute the similarity between this text vector and every stored image vector.
  • Rank the results and return the top K most similar photos (see the Swift sketch after this list).
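The ranking step boils down to a similarity comparison between the query's text vector and each stored image vector. Below is a minimal Swift sketch of that step; `cosineSimilarity`, `topKMatches`, and the `imageVectors` dictionary are illustrative names, not Queryable's actual API, and the embeddings are assumed to be plain `[Float]` arrays.

```swift
import Foundation

/// Cosine similarity between two embedding vectors of equal length.
func cosineSimilarity(_ a: [Float], _ b: [Float]) -> Float {
    var dot: Float = 0, normA: Float = 0, normB: Float = 0
    for i in 0..<a.count {
        dot += a[i] * b[i]
        normA += a[i] * a[i]
        normB += b[i] * b[i]
    }
    return dot / ((normA * normB).squareRoot() + 1e-8)
}

/// Rank all stored image vectors against a query's text vector and
/// return the IDs of the top-K most similar photos.
func topKMatches(textVector: [Float],
                 imageVectors: [String: [Float]],  // photo ID -> embedding
                 k: Int) -> [String] {
    imageVectors
        .map { ($0.key, cosineSimilarity(textVector, $0.value)) }
        .sorted { $0.1 > $1.1 }  // highest similarity first
        .prefix(k)
        .map { $0.0 }
}
```

Note that CLIP embeddings are often L2-normalized before being stored, in which case cosine similarity reduces to a plain dot product; the app's actual implementation may take that shortcut.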

The process is as follows:

[Figure: diagram of the search pipeline described above]

For more details, please refer to my blog: Run CLIP on iPhone to Search Photos.

Run on Xcode

Download ImageEncoder_float32.mlmodelc and TextEncoder_float32.mlmodelc from Google Drive. Clone this repo, place the downloaded models under the CoreMLModels/ path, and run the project in Xcode; it should work.
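For orientation, compiled .mlmodelc bundles like these are loaded through Core ML at runtime. The following is a rough sketch under the assumption that the two bundles end up in the app's main bundle; the helper function and its error handling are hypothetical, not Queryable's actual code.

```swift
import Foundation
import CoreML

/// Load the two compiled CLIP encoders from the app bundle.
func loadEncoders() throws -> (image: MLModel, text: MLModel) {
    guard
        let imageURL = Bundle.main.url(forResource: "ImageEncoder_float32",
                                       withExtension: "mlmodelc"),
        let textURL = Bundle.main.url(forResource: "TextEncoder_float32",
                                      withExtension: "mlmodelc")
    else {
        throw CocoaError(.fileNoSuchFile)
    }
    let config = MLModelConfiguration()
    config.computeUnits = .all  // let Core ML choose CPU / GPU / Neural Engine
    return (try MLModel(contentsOf: imageURL, configuration: config),
            try MLModel(contentsOf: textURL, configuration: config))
}
```

If loading fails, the usual culprit is the models not being copied into the app target, so double-check they appear in the build's bundle resources.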

 
