How to use Medissa/t5_more_context with the Transformers library:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Medissa/t5_more_context")
model = AutoModelForSeq2SeqLM.from_pretrained("Medissa/t5_more_context")
```
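Once the tokenizer and model are loaded, inference follows the standard seq2seq pattern: tokenize the input, call `generate`, and decode the result. A minimal sketch is below; the `question: ... context: ...` input format is an assumption based on the model's name (the repository has no model card describing the expected prompt), so adapt it to the task the checkpoint was actually trained on.

```python
# Hedged sketch: a standard seq2seq inference pass with the checkpoint.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Medissa/t5_more_context")
model = AutoModelForSeq2SeqLM.from_pretrained("Medissa/t5_more_context")

# Assumed input format (question + context); not confirmed by a model card.
text = (
    "question: What is T5? "
    "context: T5 is a text-to-text transformer model released by Google."
)

inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
answer = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(answer)
```

Because T5-family models cast every task as text-to-text, the same three steps (tokenize, generate, decode) apply regardless of what the fine-tuning task was.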