Hi all,
I'm trying to save my model checkpoints during runtime but am running into a confusing snag.
I'm using the Hugging Face architecture for a transformer, with their Trainer module controlling training. In the training args, I have the
called with:
from clearml import Task
from transformers import AutoImageProcessor

task = Task.init(
    project_name=project_name, task_name=task_name, output_uri="fileserver_address"
)
task.connect(config)  # logs the config dict as the task's hyperparameters

checkpoint = config.get("model_path")
image_processor = AutoImageProcessor.from_pretrained(
    checkpoint,
    num_labels=config.get("class_number"),
)

best_model = training(checkpoint, image_processor)
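
For reference, this is roughly the shape of the training-args side (a minimal sketch only; the output_dir, save_strategy, and save_total_limit values are placeholders rather than my exact config, and I'm assuming the Trainer is constructed inside training()):

from transformers import TrainingArguments

# Placeholder values -- not the exact arguments from my run.
training_args = TrainingArguments(
    output_dir="checkpoints",   # local directory the Trainer writes checkpoints into
    save_strategy="epoch",      # save a checkpoint at the end of every epoch
    save_total_limit=2,         # keep only the two most recent checkpoints
)
# training_args would then be passed to Trainer(args=training_args, ...) inside training().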