is it ok?
Task init
params setup
task.execute_remotely()
real_code here
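The flow above might look roughly like this in code (a minimal sketch; the project name, task name, queue name, and parameter values are all placeholders, and it assumes `clearml` is installed with an agent listening on the queue):

```python
def main():
    # deferred import so the sketch can be read without clearml installed
    from clearml import Task

    # 1. Task init
    task = Task.init(project_name="examples", task_name="generate tfrecords")

    # 2. params setup -- connect a dict so values are editable in the UI
    params = {"dataset_path": "/data/raw", "num_shards": 10}
    task.connect(params)

    # 3. hand execution off to a remote agent; the local run stops here
    task.execute_remotely(queue_name="default")

    # 4. real_code here -- this part only executes on the agent
    print("building tfrecords from", params["dataset_path"])
```

Calling `main()` enqueues the task; everything after `execute_remotely()` runs on the worker, not locally.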
File "aicalibration/generate_tfrecord_pipeline.py", line 30, in <module>
  task.upload_artifact('train_tfrecord', artifact_object=fn_train)
 File "/home/usr_341317_ulta_com/.clearml/venvs-builds/3.7/lib/python3.7/site-packages/clearml/task.py", line 1484, in upload_artifact
  auto_pickle=auto_pickle, preview=preview, wait_on_upload=wait_on_upload)
 File "/home/usr_341317_ulta_com/.clearml/venvs-builds/3.7/lib/python3.7/site-packages/clearml/binding/artifacts.py", line 560, in upload_artifa...
If I have 2.4 or 2.2 in both, there is no issue.
Not sure, I'm using GCS, not S3. Is download_folder doing something different from downloading all the files inside the folder?
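For reference, a sketch of how the `StorageManager.download_folder` call mentioned above is typically used (the bucket path and local folder are placeholders; my understanding is that it recursively fetches everything under the prefix, with GCS vs. S3 differing only in the URL scheme):

```python
def fetch_folder():
    # deferred import so the sketch can be read without clearml installed
    from clearml import StorageManager

    # download every object under the gs:// prefix into local_folder,
    # preserving the relative directory structure
    local_copy = StorageManager.download_folder(
        remote_url="gs://my-bucket/models/",  # placeholder bucket
        local_folder="/tmp/models",
    )
    return local_copy
```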
It seems that I need to add it (import pandas) in the main file... even though I don't use it there...
I think that worked, because now I'm having a different issue: it says that it cannot import pandas. I have it both in my requirements.txt and in task.add_requirements('pandas')
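One detail worth checking here: `Task.add_requirements` only takes effect if it is called before `Task.init`. A minimal sketch (project/task names are placeholders):

```python
def start_task():
    # deferred import so the sketch can be read without clearml installed
    from clearml import Task

    # add_requirements must run BEFORE Task.init, otherwise the extra
    # package never makes it into the task's installed-packages list
    Task.add_requirements("pandas")

    task = Task.init(project_name="examples", task_name="pipeline step")
    return task
```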
no, my_package is never added manually
and then it works
not according to the logs:
info ClearML Monitor: Reporting detected, reverting back to iteration based reporting
Hi! I was going to ask about this, but I didn't understand the solution... currently the logs are >100 MB because of this. Is there a way to save the line only once the epoch is done? AgitatedDove14
my_package now works ok 🙂
what could be wrong?
the thing is that I have to manually add all imports of packages that don't appear in my main script
in this example my main func is going to be the scripts that creates the pipeline controller
for it to work in a remote worker
and in dummy_module I have:
import pandas as pd

def func(args):
    pd.read_csv(args.file)
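The workaround described above — re-importing pandas in the main file purely so the agent's dependency analysis picks it up — might look like this as a self-contained sketch (the file name and stand-in `func` are illustrative, not the actual project):

```python
# main.py -- illustrative pipeline entry point
import pandas  # noqa: F401  present only so the agent detects the dependency

import os
import tempfile

def func(file):
    # stand-in for dummy_module.func: reads the CSV with pandas
    return pandas.read_csv(file)

# quick self-check with a throwaway CSV
with tempfile.NamedTemporaryFile("w", suffix=".csv", delete=False) as f:
    f.write("a,b\n1,2\n3,4\n")
    path = f.name
df = func(path)
os.unlink(path)
```

The unused top-level `import pandas` is the whole trick: the agent analyzes imports reachable from the entry script, so a dependency used only inside a submodule can otherwise be missed.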
whereas this is what is being logged in your toy example: tf.Tensor(1742.0144, shape=(), dtype=float32)
How would you like me to share it?
So far I have this:
tensorflow_object_detection_autoinstall.sh
Before running:
You need to set your venv
install numpy
export TF_DIR=$HOME/tensorflow
mkdir $TF_DIR
cd $TF_DIR
echo `pwd`
wget
unzip protoc-3.14.0-linux-x86_64.zip -d protoc
export PATH=$PATH:`pwd`/protoc/bin
git clone
cd models
git checkout 8a06433
cd $TF_DIR/models/research
protoc object_detection/protos/*.proto --python_out=.
git clone
cd cocoapi/PythonAPI
make
cp -r py...
Hi! I'm trying to find a workaround for this: I can't do pip install <name_of_package>
I executed the task, and it created a cache venv.
But when running the code, it couldn't import the package because it wasn't listed.
I then sourced the venv, and manually installed the package
so if I do python -c 'import object_detection' it works
and can the agent, when running locally with no base docker, also inherit system-wide packages?
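For reference, the agent has a configuration flag for exactly this; in `clearml.conf` on the worker machine (assuming a reasonably recent clearml-agent):

```
agent {
    package_manager {
        # create task venvs with --system-site-packages, so packages
        # installed system-wide are visible inside the task's venv
        system_site_packages: true
    }
}
```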
so I have a couple of questions regarding setup.py.
If I add the requirement '.' as the last entry, does that mean it will install my package last? Can I make the modifications to the tensorflow code in setup.py? I need to see how I can change the tensorflow code after it was installed, and prevent another tensorflow installation from overwriting it... is it clear?
yes, that would work, except that I need to modify tensorflow as well... I'm currently working on creating a wheel for the modified tf, but it's taking a while...
so if it executes python setup.py install last, I can do stuff like add a line to a file in my venv inside the setup script
btw, do you see these messages AgitatedDove14 when they are inside an old thread? or should I start a new message?