from transformers.modeling_utils import PreTrainedModel, Conv1D, prune_conv1d_layer, SequenceSummary
from transformers.modeling_gpt2 import *
from transformers.modeling_bert import gelu
from transformers.configuration_gpt2 import GPT2Config
from transformers.file_utils import add_start_docstrings

# `tokenizer` was created earlier (a GPT-2 tokenizer); `text` is presumably a
# pandas Series or NumPy array of strings, hence the .tolist().
tokens = tokenizer.batch_encode_plus(text.tolist(), max_length=128,
                                     pad_to_max_length=True, truncation=True,
                                     return_token_type_ids=False)

The input texts have different lengths, but padding is not working here. I'm using this older version of transformers because the GPT-2 modules imported above work fine with it.
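
If it helps, below is a minimal, self-contained sketch of what I think should work. The pad-token assignment is my own guess: as far as I can tell, the pretrained GPT-2 tokenizer ships without a pad token, and reusing the EOS token as padding is a common workaround for GPT-2, though I haven't confirmed it on this version.

from transformers import GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")

# GPT-2 defines no pad token out of the box (assumption: that is why padding
# fails). Reuse the EOS token as the pad token.
tokenizer.pad_token = tokenizer.eos_token

# Stand-in for text.tolist(): two strings of different lengths.
texts = ["a short sentence",
         "a noticeably longer sentence with quite a few more tokens in it"]

tokens = tokenizer.batch_encode_plus(texts, max_length=128,
                                     pad_to_max_length=True, truncation=True,
                                     return_token_type_ids=False)

# Both rows should now come back padded to the same length (128).
print([len(ids) for ids in tokens["input_ids"]])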

Is this the right approach, and how can I get padding to work here?