I assume that at some point during execution, the client (where the task is running) sends JSON payloads to the mongo service, and that is what we see in the web UI.
Since we are talking about a case where there is no internet available, maybe these could be dumped to files/stdout and the user could insert them manually.
The manual-insertion UX could be something like a CLI copy-paste or an endpoint that accepts files - but since your UX is so good ( 🙂 ) I'm sure you'll figure this part out better
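Something like this is what I have in mind - a purely hypothetical sketch, nothing like this exists in the SDK today and the names are made up:
```python
import json

# Hypothetical sketch of the "dump instead of send" idea above.
# Instead of POSTing each event to the server, append it to a local
# JSONL file that the user can later copy to a machine with access
# and insert manually (CLI copy-paste / file-upload endpoint).
def record_event_offline(event: dict, path: str = "offline_events.jsonl") -> None:
    with open(path, "a") as f:
        f.write(json.dumps(event) + "\n")

record_event_offline({"type": "scalar", "task": "1234", "title": "loss", "value": 0.12})
```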
that is because my own machine has CUDA 10.2 (not the docker image, the machine the agent is running on)
essentially editing the auth.fixed_users.users section in apiserver.conf
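For reference, if I understand the docs correctly, that section looks roughly like this (the user values here are just placeholders):
```
auth {
    fixed_users {
        enabled: true
        users: [
            {
                username: "jane"
                password: "12345678"
                name: "Jane Doe"
            }
        ]
    }
}
```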
Yep, if the communication is two-way, there is no way (that I can think of) to solve it for offline mode.
But if the calls made from the server to the client can be skipped in a specific setup (some functionality will not work, but enough valuable functionality remains), then it is possible in the manual way
Especially from the standpoint of a team leader or other supervisor (or anyone viewing the experiment who is not the code author): when looking at an experiment you want to see the actual code
moreover, in each pipeline I have 10 different settings of task A -> task B (and then task C), and on each run 1-2 of them fail randomly
Very nice, thanks! I'm going to try the SA server + agents setup this week, let's see how it goes ✌
but the pending task says it's in the queue
I'm trying it now
Anyway, I checked the base task, and this is what it has under installed packages (it seems like it doesn't list all the packages actually in the environment)
Now I see the watermarks are 2GB
yeah I guessed so
👍
Searched for "custom plotly" and "log plotly", didn't think about "report plotly"
AgitatedDove14 just so you know, this is a severe problem that occurs from time to time and we can't explain why it happens... Just to remind you, we are using a pipeline controller task which, at the end of the last execution, gathers artifacts from all the children tasks and uploads a new artifact to the pipeline's task object. Then what happens is that Task.current_task() returns None for the pipeline's task...
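To make it concrete, this is roughly what the end of our controller does - a simplified sketch, the artifact name and the merged object are placeholders:
```python
from clearml import Task

# Simplified sketch of the end of our pipeline controller.
pipeline_task = Task.current_task()
if pipeline_task is None:
    # this is the failure we keep hitting
    raise RuntimeError("Task.current_task() returned None for the pipeline's task")

# artifacts gathered from all the children tasks, merged into one object
merged = {"child_results": []}
pipeline_task.upload_artifact(name="summary", artifact_object=merged)
```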
when I specify --packages, I should manually list them all, shouldn't I?
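i.e. something like this, if I read the help correctly (the project/script names and package versions are just examples):
```bash
clearml-task --project examples --name remote_run --script train.py \
    --packages "torch>=1.7" "tqdm" "pandas==1.2.0"
```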
Example code? I didn't see an example anywhere of filtering by project name
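Something like this is what I'm after - guessing at the call from the docstrings, I haven't verified the argument names:
```python
from clearml import Task

# Fetch tasks filtered by project name (task_name acts as a partial-match filter).
tasks = Task.get_tasks(project_name="MyProject", task_name="train")
for t in tasks:
    print(t.id, t.name, t.get_status())
```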
I don't think that has anything to do with the value zero. The lines that should come out of 'mean' and 'median' have the value None under quantile, but have a dre_0.5 associated with them. Those lines appear in the notebook but not in the UI
nvidia/cuda:10.1-base-ubuntu18.04
🤔 is the "installed packages" part editable? good to know
Isn't it a bit risky to manually change a package version? What if it isn't compatible with the rest?
