When loading pretrained BERT, the maximum sequence length is 512. In your data pipeline, each sentence is padded to the length of the longest sentence in its batch, but I don't see any truncation applied when a sequence exceeds 512 tokens. Won't this break on long inputs?
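For illustration, here is a minimal sketch of what I mean, assuming the batches are built with a Hugging Face `transformers` tokenizer (the tokenizer name and sentence list are just placeholders, not from your code):

```python
from transformers import BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")

sentences = [
    "A short sentence.",
    "A very long sentence that might exceed the 512-token limit..." * 100,
]

# padding="longest" pads to the longest sentence in the batch,
# truncation=True with max_length=512 cuts off anything beyond
# BERT's positional-embedding limit.
batch = tokenizer(
    sentences,
    padding="longest",
    truncation=True,
    max_length=512,
    return_tensors="pt",
)

print(batch["input_ids"].shape)  # second dimension is capped at 512
```

Without the `truncation=True` / `max_length=512` arguments (or an equivalent manual slice of the token IDs), a batch containing a sentence longer than 512 tokens would produce input that BERT's position embeddings cannot handle.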