from transformers.modeling_utils import PreTrainedModel, Conv1D, prune_conv1d_layer, SequenceSummary
from transformers.modeling_gpt2 import *
from transformers.modeling_bert import gelu
from transformers.configuration_gpt2 import GPT2Config
from transformers.file_utils import add_start_docstrings

tokens = tokenizer.batch_encode_plus(text.tolist(), max_length=128, 
                                     pad_to_max_length=True, truncation=True, 
                                     return_token_type_ids=False)

The inputs are texts of different lengths, and padding is not working here.

I'm using this older version of transformers because the GPT-2 models above work fine with it.

How can I solve this?

Please detail your input and the expected shape you want to receive. Also note that version 2.4.1 is extremely outdated; the latest version is 4.11.3. Unless there is a specific reason to keep this old version, is it possible for you to update? – dennlinger
Please go through the question once again. – SS Varshini
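For reference, what `pad_to_max_length=True` is expected to produce: each token-id sequence is truncated at `max_length` and shorter ones are right-padded with the pad token id, so every row of the batch has the same length. A minimal pure-Python sketch of that behavior (the `pad_id=0` value and the example ids are assumptions for illustration, not output from the tokenizer above):

```python
def pad_and_truncate(ids, max_length=128, pad_id=0):
    """Truncate a token-id list to max_length, then right-pad with pad_id."""
    ids = ids[:max_length]
    return ids + [pad_id] * (max_length - len(ids))

# Hypothetical batch: one short sequence, one longer than max_length
batch = [[101, 7592, 102], [101] + [2000] * 200 + [102]]
padded = [pad_and_truncate(seq, max_length=8) for seq in batch]
# Every row now has length 8: the short one is padded, the long one truncated.
```

If padding like this is not happening, printing the lengths of the returned `input_ids` rows will show which sequences are affected.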