Or is there any specific link you can recommend to try and create my own server?
Another question: for the parents argument in pipe.add_step, we have to pass in the names of the parent steps, right?
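In case it helps anyone later, this is roughly how I understand it, sketched with placeholder project/task names (running it needs a configured ClearML server, so the import is kept inside the function):

```python
def build_pipeline():
    # Sketch, not verified against a live server: the strings passed in
    # `parents` are the `name` values given to earlier add_step calls,
    # not task ids. Project/task names below are placeholders.
    from clearml import PipelineController

    pipe = PipelineController(name="my pipeline", project="examples", version="1.0.0")
    pipe.add_step(
        name="split_dataset",
        base_task_project="examples",
        base_task_name="split dataset",
    )
    pipe.add_step(
        name="train_model",
        parents=["split_dataset"],  # the parent step's *name*
        base_task_project="examples",
        base_task_name="train model",
    )
    return pipe
```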
Wait, so a pipeline step only runs if the pre_execute callback returns True? And the step is skipped if it doesn't?
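For reference, my understanding (please correct me if wrong) is that returning False from the callback skips that one step rather than stopping the whole pipeline. A minimal sketch of such a callback; the (pipeline, node, param_overrides) signature is what I gathered from the docs, and the dataset_id parameter is made up:

```python
def gate_train_step(pipeline, node, param_overrides):
    # Returning False should skip this step (the rest of the pipeline
    # keeps going); returning True lets it run. The 'General/dataset_id'
    # key is a hypothetical example parameter.
    dataset_id = param_overrides.get("General/dataset_id")
    return dataset_id not in (None, "", "-1")
```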
Tagging AgitatedDove14 SuccessfulKoala55 For anyone available right now to help out.
Here's the screenshot TimelyPenguin76
These are the pipeline steps. Basically unable to pass these.
Some more of the error.
ValueError: Node train_model, parameter '${split_dataset.split_dataset_id}', input type 'split_dataset_id' is invalid
2021-12-30 16:22:00,130 - clearml.Repository Detection - WARNING - Failed auto-generating package requirements: exception SystemExit() not a BaseException subclass
After the step which gets the merged dataset, I should use pipe.stop if it returned -1?
Not sure myself. I have a pipeline step now that'll return either a clearml dataset id or -1. I want to stop the pipeline execution if I get -1 in the output of that step, but I'm not sure how to achieve that.
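What I sketched for myself in the meantime: a tiny helper that treats -1 as the "no data" sentinel, which the controller script or a callback can use to decide whether to continue (the stopping mechanism itself would still go through ClearML's API):

```python
def should_continue(step_output):
    # -1 (int or string) is our sentinel for "no new dataset produced";
    # anything else is assumed to be a usable clearml dataset id.
    return str(step_output) != "-1"
```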
Is the only way to get a specific node to use one of get_running_nodes or get_processed_nodes, and then check every node in the list to see if the name matches the one we're looking for?
I don't think I changed anything.
since I've either used add_function_step or add_step
before pipe.add_step(train_model)?
Okay, so I read the docs and the above questions are cleared up now, thank you. I just have one other question: how would I access the artifact of a previous step within the pre_execute callback? Can you share an example?
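For anyone searching later, here's the rough shape I had in mind, pieced together from the docs; I haven't verified the node attributes against a live server, and the artifact name below is made up:

```python
def use_parent_artifact(pipeline, node, param_overrides):
    # Unverified sketch: look up the parent step among the processed
    # nodes, fetch its executed Task by id, and read one of its
    # artifacts. 'my_artifact' is a hypothetical artifact name.
    from clearml import Task  # lazy import: needs a configured server

    parent_name = node.parents[0]
    parent_node = next(
        n for n in pipeline.get_processed_nodes() if n.name == parent_name
    )
    parent_task = Task.get_task(task_id=parent_node.executed)
    value = parent_task.artifacts["my_artifact"].get()
    print(f"parent artifact value: {value}")
    return True  # let the step run
```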
This is the task scheduler btw, which will run a function every 6 hours.
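For context, the scheduler setup looks roughly like this (untested sketch; my_sync_function is a placeholder, and my reading of the TaskScheduler docs is that hour=6 with the default recurring behaviour means "every six hours", which is worth double-checking):

```python
def start_scheduler(my_sync_function):
    # Untested sketch of clearml.automation.TaskScheduler; needs a
    # running ClearML server. `my_sync_function` is a placeholder for
    # whatever callable should run on the schedule.
    from clearml.automation import TaskScheduler

    scheduler = TaskScheduler()
    scheduler.add_task(
        schedule_function=my_sync_function,  # run this callable...
        hour=6,                              # ...every 6 hours (recurring)
    )
    scheduler.start()  # blocks; use start_remotely() to run on an agent
```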
I'm assuming the triton serving engine is running on the serving queue in my case. Is the serving example also running on the serving queue, or is it running on the services queue? And lastly, I don't have a clearml agent listening to the services queue; does clearml handle this on its own?
As I wrap my head around that: in terms of the example given in the repo, can you tell me what the serving example is, and what the triton serving engine is, in the context of the explanation above?
For anyone reading this: apparently there aren't any credentials for my own custom server for now. I just ran it without credentials and it seems to work.
AgitatedDove14 Just wanted to confirm: what kind of file is the string artifact stored in, a txt file or a pkl file?
This is the original repo which I've slightly modified.
because those spawned processes are from a file called register_dataset.py; however, I'm not personally using any file like that, and I think it's a file from the library.
I just want to be able to pass the output of one step as input to another step.
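To spell out what I mean: the mechanism I found is parameter_override with ${...} references that the controller resolves at runtime. The reference forms I've seen in the docs are the parent step's task id and its artifacts; the parameter and artifact names here are hypothetical:

```python
# Hypothetical override dict for a pipe.add_step call; the ${...} values
# are resolved by the pipeline controller when the step actually runs.
parameter_override = {
    "General/dataset_id": "${split_dataset.id}",  # parent step's task id
    "General/split_url": "${split_dataset.artifacts.split_data.url}",  # artifact URL
}
```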
Yeah, I kept seeing the message but I was sure there were files in the location.
I just realized: I hadn't worked with the Datasets api for a while, and I forgot that I'm supposed to call add_files(location) and then upload(), not upload(location). My bad.
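For anyone hitting the same thing, the flow that worked for me, as a sketch (project/name are placeholders; it needs a ClearML server, hence the lazy import):

```python
def publish_folder(folder_path, project="my_project", name="my_dataset"):
    # add_files() registers the local files; upload() takes no source
    # path -- that was my mistake. finalize() closes the version.
    from clearml import Dataset

    ds = Dataset.create(dataset_project=project, dataset_name=name)
    ds.add_files(path=folder_path)
    ds.upload()
    ds.finalize()
    return ds.id
```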
Normally when you save a model in tensorflow, you get a whole SavedModel, not just the weights. Is there no way to get the whole model, including the architecture?
For anyone reading this: I think I've gotten an understanding. I can add folders to a dataset, so I'll be creating a single dataset and will just keep adding folders to it, then keep records of it in a database.
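A sketch of the "keep adding folders" idea using dataset versions (untested; it assumes a dataset already exists, and project/name are placeholders):

```python
def add_folder_version(new_folder, project="my_project", name="my_dataset"):
    # Create a child version on top of the latest dataset, add the new
    # folder, and upload -- older versions stay intact, which helps with
    # the record-keeping mentioned above.
    from clearml import Dataset

    latest = Dataset.get(dataset_project=project, dataset_name=name)
    child = Dataset.create(
        dataset_project=project,
        dataset_name=name,
        parent_datasets=[latest.id],
    )
    child.add_files(path=new_folder)
    child.upload()
    child.finalize()
    return child.id
```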