Hello everyone,
Is logging of "AutoModelForCausalLM" models supported? Even when manually logging Llama-2-7b-hf, the model does not appear in the model list (even when uploading it as an artifact).
Hi @CostlyOstrich36,
Thanks for the quick reply.
Here is a snippet of the code I use to register the model.
I load the model, set its parameters, and then try to register it.
The result is a model pickle file in the Artifacts tab, but the model itself is never registered.
import torch
from clearml import Task
from transformers import AutoModelForCausalLM

task_name = "Llama2_model"
task = Task.init(project_name='project_name', task_name=task_name, output_uri="s3://...")

# Load the model from a local checkpoint (Llama2_source and device are defined earlier)
model = AutoModelForCausalLM.from_pretrained(
    Llama2_source,
    trust_remote_code=True,
    local_files_only=True,
    torch_dtype=torch.float16,
).to(device)

# Generation / sampling parameters
model.config.do_sample = True
model.config.temperature = temperature
model_kwargs = {
    'do_sample': True,
    'temperature': temperature,
    'max_new_tokens': max_new_tokens,
    'top_k': top_k,
    'num_return_sequences': num_return_sequences,
}

# Uploads the model object as a pickled artifact only
task.upload_artifact('Llama2_model', model)
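For completeness, this is roughly how I checked what ended up attached to the task (assuming task.artifacts and task.models are the right properties to inspect):

# Inspect what was actually attached to the task
print(list(task.artifacts.keys()))   # contains 'Llama2_model' (the pickled artifact)
print(task.models['output'])         # nothing registered under the Models tab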
I also tried using the OutputModel class, but with no results:
from clearml import Task, OutputModel

task_name = "Llama2_model"
task = Task.init(project_name='project_name', task_name=task_name, output_uri="s3://...")
output_model = OutputModel(task=task)
...
# Save the Hugging Face model locally and point ClearML at the weights
model.save_pretrained('model')
output_model.update_weights('model')
output_model.set_name('Test Model Name')
output_model.publish()
task.close()
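For reference, save_pretrained writes a whole directory (config plus weight shards), while update_weights takes a single weights file, so a packaged variant is what I would try next. This is only a sketch: I am assuming update_weights_package is the right call for a directory of weights, and 'model_dir' is just a placeholder name.

from clearml import Task, OutputModel

task = Task.init(project_name='project_name', task_name='Llama2_model', output_uri="s3://...")
output_model = OutputModel(task=task, name='Test Model Name', framework='PyTorch')

# save_pretrained writes config.json plus the weight files into a directory
# (model is the AutoModelForCausalLM instance loaded earlier)
model.save_pretrained('model_dir')

# Assumption: update_weights_package uploads the whole folder and registers it
# as this task's output model
output_model.update_weights_package(weights_path='model_dir')
output_model.publish()
task.close()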