Did a couple tests with Colab, moving the installs and imports up to the top. Results... seem to suggest that doing all the installs/imports before actually running the tokenization and such might fix the problem too?
It's a bit confusing. I made a couple of cells at the top, like thus:

```
!pip install clearml
```

and

```
from clearml import Task
task = Task.init(project_name="project name", task_name="Esperanto_Bert_2")
```

and

```
# Check that PyTorch sees it
import torch
torch.cuda.is_available()
```

and

```
# We won't need TensorFlow here
!pip uninstall -y tensorflow

# Install transformers from master
!pip install git+
!pip list | grep -E 'transformers|tokenizers'

# transformers version at notebook update --- 2.11.0
# tokenizers version at notebook update --- 0.8.0rc1
```

and it seems that no matter what order I run them in, I don't get an error. This is complicated by the fact that I'm trying to get Colab to give me a clean runtime each time but I'm having some odd issues with that.
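(Side note on the clean-runtime part: I think a kernel restart triggered from a cell isn't actually enough here, since pip installs survive a kernel restart; as far as I know only Runtime → Factory reset runtime gives a genuinely fresh VM. The usual cell-based restart trick, for reference, is just:)

```python
import os

# Kills the Python process; Colab reconnects to a fresh kernel,
# but anything pip-installed on the VM is still there afterwards.
os.kill(os.getpid(), 9)
```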
So I wonder if it's got something to do with not just the installs but all the other imports along the way, e.g. importing the tokenizer object and so forth?
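To make that concrete, what I mean by pulling all the imports up is roughly a single setup cell like the sketch below, run right after the install cells and before any tokenization/training cells. I'm assuming "the tokenizer object" is the `ByteLevelBPETokenizer` from the tokenizers library, since that's what the Esperanto notebook uses; treat it as an example rather than the exact culprit.

```python
# Sketch of an "everything imported up front" cell (run after the install cells above,
# before any tokenization/training cells). The tokenizer import is an assumption:
# it stands in for whatever tokenizer object the later cells pull in.
from clearml import Task
import torch
from tokenizers import ByteLevelBPETokenizer

task = Task.init(project_name="project name", task_name="Esperanto_Bert_2")
print(torch.cuda.is_available())
```

If that ordering consistently avoids the error, it would point at the later imports rather than the installs themselves.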