Load a model as DPRQuestionEncoder in HuggingFace

I would like to load BERT's weights (or those of any other transformer) into a DPRQuestionEncoder architecture, so that I can use the HuggingFace save_pretrained method and plug the saved model into the RAG architecture for end-to-end fine-tuning.

But I got an error when loading the weights.

I am using the latest version of Transformers.


Answer

As already mentioned in the comments, DPRQuestionEncoder does not currently provide any functionality to load other models. I still recommend creating your own class that inherits from DPRQuestionEncoder, loads your custom model, and adjusts its methods accordingly.
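A minimal sketch of such a subclass, assuming `bert-base-uncased` as the custom model and the standard DPR checkpoint (the class name and the `from_bert` helper are illustrative, not part of the original answer):

```python
from transformers import BertModel, DPRQuestionEncoder


class BertDPRQuestionEncoder(DPRQuestionEncoder):
    """Hypothetical subclass that fills a DPR question encoder
    with the weights of a plain BERT checkpoint."""

    @classmethod
    def from_bert(cls, bert_checkpoint,
                  dpr_checkpoint="facebook/dpr-question_encoder-single-nq-base"):
        # Instantiate the DPR architecture first ...
        model = cls.from_pretrained(dpr_checkpoint)
        # ... then load the BERT weights we want to transfer.
        # add_pooling_layer=False because the DPR wrapper's inner
        # BertModel may not have a pooler.
        bert = BertModel.from_pretrained(bert_checkpoint, add_pooling_layer=False)
        # strict=False tolerates version differences such as a missing pooler
        model.question_encoder.bert_model.load_state_dict(
            bert.state_dict(), strict=False
        )
        return model
```

The result is still a DPRQuestionEncoder, so save_pretrained works as usual and the saved model can be plugged into RAG.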

But you asked in the comments whether there is another way, and there is, in case the parameters of your model and of the model that your DPRQuestionEncoder object holds are exactly the same (same architecture and parameter shapes): you can copy the weights over directly.
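A commented sketch of this direct weight copy, assuming `bert-base-uncased` as the source model and the standard DPR checkpoint (both checkpoint names are assumptions for illustration):

```python
import torch
from transformers import BertModel, DPRQuestionEncoder

# Source model whose weights we want to transfer (assumed checkpoint).
# add_pooling_layer=False because the DPR wrapper's inner BertModel
# may not have a pooler.
bert = BertModel.from_pretrained("bert-base-uncased", add_pooling_layer=False)

# Target DPR question encoder; its inner BERT must have the same
# architecture and parameter shapes as the source model
dpr = DPRQuestionEncoder.from_pretrained(
    "facebook/dpr-question_encoder-single-nq-base"
)

# The DPR class wraps the transformer in question_encoder.bert_model;
# copy the weights over (strict=False tolerates e.g. a pooler that is
# present in one version but not the other)
dpr.question_encoder.bert_model.load_state_dict(bert.state_dict(), strict=False)

# Verify that the weights were actually copied
for name, tensor in dpr.question_encoder.bert_model.state_dict().items():
    if name in bert.state_dict():
        assert torch.equal(tensor, bert.state_dict()[name])

# The encoder can now be saved and plugged into RAG
dpr.save_pretrained("./bert-dpr-question-encoder")
```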

It works from a technical perspective, but I cannot tell you how it will perform on your task.

User contributions licensed under: CC BY-SA