Loading Custom Open CLIP Models into Marqo
In this recipe we provide example code for exporting models from OpenCLIP for use in Marqo. For the purposes of this demo we use a pre-trained model from OpenCLIP; in practice you would typically do this with weights you have fine-tuned on your own data, or a model you trained from random initialization. For detailed instructions on training models with OpenCLIP, please refer to their documentation. We recommend OpenCLIP as it is well maintained and has the best open-source code for scalable CLIP training, with multi-GPU, multi-node, and SLURM support.
This recipe requires open-clip-torch:
pip install open-clip-torch
Exporting the Model
We will use Open CLIP, which is built on PyTorch. First, we instantiate the model and load the pre-trained weights for the purposes of this example.
import open_clip
# create_model_and_transforms returns the model, training transforms, and evaluation transforms
# we only need the model here
model, _, _ = open_clip.create_model_and_transforms(
"ViT-B-32", pretrained="laion2b_s34b_b79k"
)
model.eval()
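Optionally, you can check the dimensionality of the embeddings the model produces now, as you will need this number later for the dimensions field in the Marqo index settings. A minimal sketch (the tokenizer name matches the architecture):
import open_clip
import torch

tokenizer = open_clip.get_tokenizer("ViT-B-32")

with torch.no_grad():
    # encode a sample sentence and inspect the embedding shape
    text_features = model.encode_text(tokenizer(["a sanity check sentence"]))

print(text_features.shape)  # torch.Size([1, 512]) -> dimensions is 512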
This will load a ViT model with the ViT-B-32 architecture and the laion2b_s34b_b79k pretrained weights (~600MB). We can then export this to a .pt file with torch:
import torch
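# state_dict() captures only the learned weights; the architecture itself
# is specified separately when the model is loaded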
tensors = model.state_dict()
torch.save(tensors, "my_model_for_marqo.pt")
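To verify the export, you can reload the saved weights into a freshly instantiated model with random initialization; load_state_dict will raise an error if any parameter names or shapes do not match. A minimal sketch:
import open_clip
import torch

# same architecture as before, but no pretrained weights this time
fresh_model, _, _ = open_clip.create_model_and_transforms("ViT-B-32")
fresh_model.load_state_dict(torch.load("my_model_for_marqo.pt"))
fresh_model.eval()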
Loading the Model into Marqo
To load the model into Marqo you will need to host it somewhere that makes it accessible to Marqo via a URL. A common choice is blob storage such as AWS S3 or Google Cloud Storage. If you want to secure your model behind authorisation, please refer to the docs for using S3 or Hugging Face authentication with Marqo. We do not recommend using pre-signed URLs, as Marqo will not be able to access the model once the URL expires; this becomes a problem if you scale inference nodes or if a node goes down and needs to be replaced.
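As an illustration, here is a minimal sketch of uploading the exported file to S3 with boto3. The bucket name and key are hypothetical placeholders, and the object must be readable by Marqo (for example via a bucket policy or the S3 authentication described in the docs):
import boto3

s3 = boto3.client("s3")

# "my-model-bucket" and the key are placeholders; substitute your own
s3.upload_file(
    "my_model_for_marqo.pt",
    "my-model-bucket",
    "models/my_model_for_marqo.pt",
)

# the object would then be reachable at a URL such as
# https://my-model-bucket.s3.amazonaws.com/models/my_model_for_marqo.pt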
To load your hosted model, provide the index settings with the model and modelProperties keys:
import marqo

mq = marqo.Client("http://localhost:8882")  # point this at your Marqo instance

settings = {
    "type": "unstructured",  # unstructured or structured, the model syntax is the same
    "model": "my_custom_vitb32",  # a unique name you give your model to identify it in Marqo (appears in the cloud UI)
    "modelProperties": {
        "name": "ViT-B-32",  # the architecture name, must match the model you trained
        "dimensions": 512,  # the number of dimensions in the embeddings produced by the model
        "type": "open_clip",  # the library to use for loading the model
        "url": "https://url/to/my/model/my_model_for_marqo.pt",  # the URL to the model
    },
    "normalizeEmbeddings": True,
}

mq.create_index("index_w_custom_model", settings_dict=settings)
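Once the index is created, you can confirm the custom model works end to end by indexing a document and running a search. A minimal sketch for an unstructured index; the field name and document are illustrative:
# embed the listed tensor fields with the custom model at indexing time
mq.index("index_w_custom_model").add_documents(
    [{"title": "a photo of a dog", "_id": "doc1"}],
    tensor_fields=["title"],
)

# the query is embedded with the same custom model
results = mq.index("index_w_custom_model").search("puppy")
print(results["hits"][0]["title"])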