Description
I am getting this tokenizer error when constructing the `PromptDataLoader`:
```
---> 86 probabilities, predicted_labels = predict_medical_concept_relevance(metadata)
     87 print("Probabilities:", probabilities)
     88 print("Predicted Labels:", predicted_labels)

Cell In[2], line 68
     65 dataset = [input_example]
     67 # Use PromptDataLoader to tokenize and prepare input data
---> 68 data_loader = PromptDataLoader(dataset=dataset, template=prompt_model.template, tokenizer=prompt_model.tokenizer, max_seq_len=512, batch_size=1, shuffle=False)
     70 # Model prediction: process inputs and decode outputs
     71 for batch in data_loader:

File /usr/local/lib/python3.10/dist-packages/openprompt/pipeline_base.py:76, in PromptDataLoader.__init__(self, dataset, template, tokenizer_wrapper, tokenizer, tokenizer_wrapper_class, verbalizer, max_seq_length, batch_size, shuffle, teacher_forcing, decoder_max_length, predict_eos_token, truncate_method, drop_last, **kwargs)
     74 if tokenizer_wrapper is None:
     75     if tokenizer_wrapper_class is None:
---> 76         raise RuntimeError("Either wrapped_tokenizer or tokenizer_wrapper_class should be specified.")
     77 if tokenizer is None:
     78     raise RuntimeError("No tokenizer specified to instantiate tokenizer_wrapper.")

RuntimeError: Either wrapped_tokenizer or tokenizer_wrapper_class should be specified.
```