Following my question on how to delete layers from a finetuned LM, I came across a GitHub repo that at first glance seems to do that (see from line 580).
They use `model.config.num_hidden_layers` when defining their model for training.
I am not sure what this does, though. If I set it to 10 after loading a finetuned model, will I only be using the first 10 layers?
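From what I can tell, that attribute is normally set *before* the model is built, something like this (just a sketch of my understanding, not the repo's actual code):

```python
from transformers import BertConfig, BertForQuestionAnswering

# My understanding of how num_hidden_layers is normally used: the config is
# changed before construction, so the model is *built* with 10 encoder blocks.
config = BertConfig.from_pretrained('bert-base-uncased', num_hidden_layers=10)
small_model = BertForQuestionAnswering(config)  # fresh, randomly initialized 10-layer model
```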
More specifically, I loaded a finetuned model for QA like this:
```python
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

model_name = 'twmkn9/bert-base-uncased-squad2'
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForQuestionAnswering.from_pretrained(model_name)
```
and then

```python
model.config.num_hidden_layers = 10
```
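When I inspect the model afterwards, though, the encoder module still seems to have all 12 blocks (assuming I am reading `model.bert.encoder.layer` correctly), so I am not sure the change has any effect:

```python
# Checking whether changing the config after loading actually removed layers.
# I assume the encoder blocks live in model.bert.encoder.layer.
print(model.config.num_hidden_layers)   # 10 -- the value I just set
print(len(model.bert.encoder.layer))    # 12 -- all the blocks still seem to be there
```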
Does that mean that only the first 10 layers are active now?
And do I need to add a layer on top to process the output so that I still get the SQuAD-style output?
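For context, this is the kind of thing I imagined would be needed to actually drop layers (just a sketch of my assumption that slicing the encoder's ModuleList works and that the existing qa_outputs head can stay on top; I have not verified this):

```python
from torch import nn

# Sketch of what I imagine "deleting layers" would look like: keep only the
# first 10 encoder blocks and update the config to match. (Assuming the
# blocks live in model.bert.encoder.layer and that the existing qa_outputs
# head can sit directly on the truncated encoder's output.)
model.bert.encoder.layer = nn.ModuleList(model.bert.encoder.layer[:10])
model.config.num_hidden_layers = 10
```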
Thank you