i updated to 1.10
i am uploading the model inside the main() function, using this code:
import pickle

# serialize the prophet model to disk, then upload it as the task's output model
model_path = model_name + '.pkl'
with open(model_path, "wb") as f:
    pickle.dump(prophet_model, f)
output_model.update_weights(weights_filename=model_path, iteration=0)
ok, it is solved by setting force_git_root_python_path: true in clearml.conf
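for reference, this is roughly where that flag sits in our clearml.conf (writing it from memory, so the exact nesting might be slightly off):

agent {
    # add the git repository root to PYTHONPATH when the agent runs the task
    force_git_root_python_path: true
}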
@<1523701205467926528:profile|AgitatedDove14>
ok so now i upload with the following line:
op_model.update_weights(weights_filename=model_path, upload_uri=upload_uri)
and while running it locally, it seems to upload
when i let it run remotely i get the original Failed uploading error.
although, one time when i ran it remotely it did upload it, and at other times it didn't. weird behavior
can you help?
the base image is python:3.9-slim
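(for context, the base image is set more or less like this in the clearml.conf agent section -- paraphrasing from memory, so the exact keys may differ:)

agent {
    default_docker {
        # container image the agent uses when running tasks in docker mode
        image: "python:3.9-slim"
    }
}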
we use the clearml hosted server, so i don't know the version
WebApp: 3.16.3-949 • Server: 3.16.1-974 • API: 2.24
no, i just commented it out and it worked fine
that's the one, I'll add a comment (I didn't check the number of connections it opens, so idk the right number)
only sometimes; the pipeline runs using local machines
ok so, idk why it helped, but setting base_task_id instead of base_task_name in the pipe.add_step function seems to overcome this
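roughly what i changed (a sketch; the pipeline/step names and the task id are placeholders for our real ones):

from clearml import PipelineController

pipe = PipelineController(name="my-pipeline", project="examples", version="0.0.1")  # placeholder names

# before: the step template was looked up by name
# pipe.add_step(name="train", base_task_name="prophet_training")

# after: reference the template task directly by its id (placeholder id below)
pipe.add_step(
    name="train",
    base_task_id="a1b2c3d4e5f64f0b8a1234567890abcd",
)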
why doesn't it try to use ssh by default? the clearml.conf doesn't contain a username and password
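i guess i can force ssh myself with something like this in clearml.conf (if i'm remembering the flag correctly):

agent {
    # rewrite https git urls to ssh before the agent clones the repo
    force_git_ssh_protocol: true
}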
i need to read and write. i do have access from the genesis autoscaler when i turn off all firewall rules, but this is not recommended by microsoft.
I need to add specific firewall rules for the genesis machines, to allow them to authenticate against my azure blob storage
(i'm running it on docker)
another question: if i save heavy artifacts, should my services worker's ram be at least as high? (or is it enough for the default queue workers to have that)
i can send you our pipeline file and task
but why does it matter if i ran it on a remote agent?
how do i access the clearml.conf custom variables then?
or - how do i configure and access env variables that way?
the use case is simple:
i wanna fetch data from an sql table, inside a task.
so i want to execute a query and then do some operations on the result, from within a task. to do that i have to connect to the db,
and i don't want the connection details to be logged
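something like this is what i mean (a rough sketch; the env variable names, the postgres connection string and the query are placeholders for our actual setup):

import os

import pandas as pd
from sqlalchemy import create_engine
from clearml import Task

task = Task.init(project_name="examples", task_name="sql-fetch")  # placeholder names

# read connection details from environment variables, so they are never passed
# as task parameters and therefore don't show up in the logged configuration
db_user = os.environ["DB_USER"]          # placeholder env var names
db_password = os.environ["DB_PASSWORD"]
db_host = os.environ["DB_HOST"]
db_name = os.environ["DB_NAME"]

engine = create_engine(f"postgresql://{db_user}:{db_password}@{db_host}/{db_name}")

# execute the query and do the actual work on the result inside the task
df = pd.read_sql("SELECT * FROM my_table", engine)  # placeholder query
print(df.describe())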
then it works
i opened a new, clean venv just now
plus, is there an option to configure the agent settings? for example we are using:
force_git_root_python_path: true
can we do it there as well?
@<1523701070390366208:profile|CostlyOstrich36>
By the way, how do i set up a shell script?
i don't see an option to do it from the UI
@<1523701070390366208:profile|CostlyOstrich36>
hey john, let us know if you need any more information
ok, yeah, makes sense. thanks John!
yes,
so basically I should create a services queue, and preferably let it contain its own workers
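so something like this for the dedicated worker, right? (i'm taking the flags from the docs, so correct me if it's off):

clearml-agent daemon --services-mode --queue services --create-queue --docker --cpu-only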
ok, thanks jake
what will be the fastest fix for it?
so i think debian (and python 3.9)