_reload() prints ('verify', '/etc/ssl/certs/ca-certificates.crt')
Originally I wanted to use the environment variable AWS_CA_BUNDLE=/etc/ssl/certs, but from my testing boto3 doesn't seem to respect that variable; the config that gets sent to boto3 still only has verify=True.
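For what it's worth, botocore does document AWS_CA_BUNDLE, but it only takes effect if it is present in the process environment before the boto3 session/client is created, so one thing worth ruling out is the variable being set too late. A minimal sketch (the bundle path is the one from this thread, not a universal default):

```python
import os

# Must be exported BEFORE the SDK (or ClearML) constructs its boto3
# session; setting it afterwards has no effect on an existing session.
os.environ["AWS_CA_BUNDLE"] = "/etc/ssl/certs/ca-certificates.crt"

print(os.environ["AWS_CA_BUNDLE"])
```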
api {
    web_server:
    api_server:
    files_server:
    credentials {
        access_key: key
        secret_key: secret
    }
}
sdk {
    aws {
        s3 {
            bucket: bucket-name
            key: my-key
            secret: my-secret
            secure: true
            verify: "etc/ssl/certs/ca-certificates.crt"
            multipart: false
        }
    }
}
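As an aside, verify usually expects the path of a CA bundle *file*; a bare certificate directory generally only works if it has been prepared with c_rehash. You can check what your Python build treats as the defaults with the stdlib ssl module:

```python
import ssl

# Where this Python build looks for CA certificates by default.
paths = ssl.get_default_verify_paths()
print("cafile:", paths.cafile)  # a bundle file, e.g. .../ca-certificates.crt
print("capath:", paths.capath)  # a hashed certificate directory, may be None
```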
I have: _reload() prints ('verify', '/etc/ssl/certs'), while __init__() simply prints {'endpoint_url': None, 'use_ssl': True, 'verify': True, 'region_name': None, 'config': <botocore.config.Config object at 0x7f4408d08a00>}
We've added {"script.requirements.pip": "urllib3==1.26.14"} in each step of the pipeline, but it still installs urllib3 2.0.3. We've added that parameter under the controller, but that parameter doesn't exist for add_step(); is there an alternative?
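To double-check which urllib3 a step's environment actually resolved, a small stdlib helper can be dropped into the step itself (installed_version is just an illustrative name, not a ClearML API):

```python
from importlib.metadata import PackageNotFoundError, version


def installed_version(package: str):
    """Return the installed version of `package`, or None if it is absent."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None


# Prints the version that actually got installed in this environment.
print("urllib3:", installed_version("urllib3"))
```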
When I'm using the default Python kernel for notebooks it does store it successfully; however, when switching to a venv it doesn't.
When working outside of my venv I can see my Jupyter notebook inside "uncommitted changes" and also as an artifact. When inside a venv I don't see it as an artifact, and all I see under "uncommitted changes" is the contents of ipykernel_launcher.py.
Seems like an issue when using ipykernel...
Steps to reproduce:
virtualenv my_env
source my_env/bin/activate
pip install ipykernel
python -m ipykernel install --user --name=my_env
Then switch the notebook's kernel to the new my_env kernel to reproduce.
I should mention that reporting artifacts and everything else works. I'm just not seeing the changes inside the web UI.
I need ClearML Agent to connect to a pipeline; from what I understood, you need an agent running in services mode to do that.
@<1523701205467926528:profile|AgitatedDove14> hey, we found out what was causing that issue
When a new venv is created it does not contain any Python libraries, so when ClearML tried to list the currently running Jupyter servers (using the Jupyter Notebook Python library), it failed because that library did not exist in the venv.
Not sure why there were no warnings or errors regarding it...
We fixed it by running pip install notebook inside the venv, and it worked!
CC: @<1564422819012415488:p...
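Given that root cause, here is a quick sanity check you can run inside a fresh venv before expecting notebook capture to work (notebook_lib_present is just an illustrative name):

```python
import importlib.util


def notebook_lib_present() -> bool:
    """True if the Jupyter `notebook` package is importable
    from the current interpreter (i.e. the active venv)."""
    return importlib.util.find_spec("notebook") is not None


if not notebook_lib_present():
    # Matches the fix described above.
    print("notebook package missing; install it with: pip install notebook")
```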
I see, so I simply need to run k8s glue and connect it to the "services" queue?
Let me correct myself: it breaks when switching to a different Python kernel.
We're using a pipeline; there is no Task object.
Never mind, I messed up the preprocessing.
I'm using the helm chart, is that not part of it?
Where do I run clearml-serving then?