looks like at the end of the day we removed
proxy_set_header Host $host;
and used the FQDN for the proxy_pass line
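For reference, a minimal nginx sketch of that kind of setup (the hostname and port are placeholders, not from the original conversation): the proxy_pass target is the backend's fully-qualified domain name, and no Host header override is set.

```nginx
location / {
    # no "proxy_set_header Host $host;" here -- the upstream sees
    # the FQDN from proxy_pass as the Host header instead
    proxy_pass http://backend.example.com:8080;
}
```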
And did that solve the issue?
I can share some code
Please do 🙂
Hmm, I guess it's doable 🙂 Could you open a GitHub issue with a feature request?
If we have enough support, it will bump up its priority 🤞
Just wanted to know how many people are actively working on clearml.
Probably 30+ 🙂
ReassuredTiger98 are you afraid of a lack of support, or are you offering some (it is always welcome)?
And is this repo installed on the pipeline creating machine ?
Basically I'm asking how come it did not automatically detect it?
Hi SteadyFox10
Short answer: no 🙂
Long answer: full permissions are available in the paid tier, alongside a few more advanced features.
Fortunately, in this specific use case, the community service allows you to share a single experiment (or multiple experiments) with a read-only link. Would that work?
MagnificentSeaurchin79 making sure the basics work.
Can you see the 3D plots under the Plot section ?
Regarding the Tensors, could you provide a toy example for us to test?
And you are seeing a bunch of the GS SSL errors?
Hi SarcasticSparrow10
which database services are used to...
Mongo & Elastic
You can query everything using ClearML interface, or talk directly with the databases.
Full RestAPI is here:
https://clear.ml/docs/latest/docs/references/api/endpoints
You can use the APIClient for an easier pythonic interface:
See example here
https://github.com/allegroai/clearml/blob/master/examples/services/cleanup/cleanup_service.py
What is the exact use case you have in mind?
Task deletion failed: unhashable type: 'dict'
Hi FlutteringWorm14 trying to figure where this is coming from, give me a sec
For that I need more info, what exactly do you need (or trying to achieve) ?
SmarmySeaurchin8
Something like this one:
vector_series = np.random.randint(10, size=10).reshape(2, 5)
logger.report_vector(title='vector example', series='vector series', values=vector_series, iteration=0, labels=['A', 'B'], xaxis='X axis label', yaxis='Y axis label')
New version will contain much more advanced search (including all the task fields)
are there any more fields in this function with partial matching? for example project? tags?
Yes they can all be filtered (basically everything you see in the UI)
notice: tags are strings (you can provide list of tags), project is an ID of the project
(Use Task.get_project_id, I think)
I meant even just a link to a blank comparison and one can then add the experiments from that view
Just making sure you are aware, once you are in comparison you can always add Tasks (any Task):
Notice you can press "Add experiments", then select any experiment (including from all projects, using the filters)
Notice you need to remove all filters (the red x on the right side of the filter icon)
Hi @<1523703397830627328:profile|CrookedMonkey33>
If you click on "Task Information" (on the Version Info panel, right-hand side), it will open the Task details page; there you have the "hamburger" menu at the top right, where you have Publish
(Maybe we should add that to the main right click menu?!)
Hi GrotesqueOctopus42
In theory it can be built, the main hurdle is getting elk/mongo/redis containers for arm64 ...
Please go ahead with the PR 🙂
Click on the Task it is running and abort it; it seems to be stuck, and I guess this is why the others are not being pulled
BTW: Can you also please test with the latest clearml version, 1.7.2?
YummyWhale40 from the code snippet, it seems like the argument is passed.
"reuse_last_task_id=True" is the default, and it means that if the previous run of the task did not create any artifacts/models and was executed 72 hours ago (configurable), The Task will be reset (i.e. all logs cleared) and will be reused in the current run.
Hi ImpressionableRaven99
Yes, it is 🙂
Call this one before Task.init, and it will run offline (at the end of the execution, you will get a link to the local zip file of the execution):
Task.set_offline(True)
Then later you can import it into the system with:
Task.import_offline_session('./my_task_aaa.zip')
Oh my bad, post 0.17.5 🙂
RC will be out soon; in the meantime you can install directly from GitHub:
pip install git+
JitteryCoyote63 fix should be pushed later today 🙂
Meanwhile you can manually add the Task.init() call at the top of the original script, it is basically the same 🙂
And as far as I can see there is no mechanism in place to load objects other than the model file inside the Preprocess class, right?
Well actually this is possible, let's assume you have another Model that is part of the preprocessing, then you could have:
Something like this should work:
def preprocess(self, ...):
    if not getattr(self, "_preprocess_model", None):
        self._preprocess_model = joblib.load(Model(model_id).get_weights())
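A runnable, generic sketch of that lazy-load-and-cache pattern: the dict below stands in for the joblib.load(Model(...).get_weights()) call, which needs a ClearML server; the class and method names are made up for the example.

```python
class Preprocess:
    def _get_aux_model(self):
        # Load the auxiliary object only on first use, then cache it on
        # the instance; the dict is a stand-in for the real model loading.
        if not getattr(self, "_aux_model", None):
            self._aux_model = {"loaded": True}
        return self._aux_model
```

Every call after the first returns the same cached object, so the expensive load happens once per instance.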
Can you verify by adding the following to your extra_docker_shell_script:
https://github.com/allegroai/clearml-agent/blob/a5a797ec5e5e3e90b115213c0411a516cab60e83/docs/clearml.conf#L152
extra_docker_shell_script: ["echo machine example.com > ~/.netrc", "echo login MY_USERNAME >> ~/.netrc", "echo password MY_PASSWORD >> ~/.netrc"]
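For clarity, those three echo commands build a ~/.netrc file inside the container that looks like this (example.com, MY_USERNAME, and MY_PASSWORD are the placeholders from the snippet above):

```
machine example.com
login MY_USERNAME
password MY_PASSWORD
```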
It seems like there is no way to define that a Task requires docker support from an agent, right?
Correct, basically the idea is you either have workers working in venv mode or docker.
If you have a mixture of the two, then you can have the venv agents pulling from one queue (say default_venv) and the docker mode agents pulling from a different queue (say default_docker). This way you always know what you are getting when you enqueue your Task
ConfusedPig65 could you send the full log (console) of this execution?