Tensorizer is a great tool for loading transformer models very quickly; however, loading Sentence Transformers models isn't supported natively. I could load a Sentence Transformers model through AutoModel and then run the forward pass in torch to get embeddings, but the SentenceTransformer `model.encode()` method is very convenient, and I would love to use that instead.
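For context, the manual workaround I'm describing looks roughly like the sketch below: a forward pass through the base model followed by pooling. I'm assuming mean pooling here (the most common default for Sentence Transformers models, though the actual pooling depends on the model's config), and I'm using dummy tensors in place of a real `AutoModel(**inputs).last_hidden_state` so the idea is self-contained:

```python
import torch

def mean_pooling(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    """Average token embeddings, counting only real (non-padding) tokens."""
    # Expand the mask to the hidden dimension so padded positions contribute 0.
    mask = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    summed = (token_embeddings * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1e-9)  # avoid division by zero
    return summed / counts

# Dummy stand-in for AutoModel(**inputs).last_hidden_state:
# batch of 2 sequences, 4 tokens, hidden size 8.
token_embeddings = torch.randn(2, 4, 8)
attention_mask = torch.tensor([[1, 1, 1, 0], [1, 1, 0, 0]])

embeddings = mean_pooling(token_embeddings, attention_mask)
# L2-normalize, like encode(normalize_embeddings=True) would.
embeddings = torch.nn.functional.normalize(embeddings, p=2, dim=1)
print(embeddings.shape)  # torch.Size([2, 8])
```

This works, but it means re-implementing the pooling and normalization that `model.encode()` already handles (plus batching, tokenization, and device placement), which is why native support would be so nice.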
Thanks!