How would you like me to share it?
So far I have this:
tensorflow_object_detection_autoinstall.sh
Before running:
You need to set your venv
install numpy

```shell
export TF_DIR=$HOME/tensorflow
mkdir $TF_DIR
cd $TF_DIR
echo `pwd`
wget
unzip protoc-3.14.0-linux-x86_64.zip -d protoc
export PATH=$PATH:`pwd`/protoc/bin
git clone
cd models
git checkout 8a06433
cd $TF_DIR/models/research
protoc object_detection/protos/*.proto --python_out=.
git clone
cd cocoapi/PythonAPI
make
cp -r py...
```
so I need to run a sed command to replace some lines in one of the tensorflow files. Do you know if I can do this as part of the setup.py install?
whereas this is what is being logged in your toy example: tf.Tensor(1742.0144, shape=(), dtype=float32)
FYI, in case it is useful for someone else:
```python
import tensorflow as tf
import numpy as np
from plotly.subplots import make_subplots
import plotly.graph_objects as go
import itertools
from clearml import Task

def get_trace(z, series, classes, colorscale='blues', showscale=True, verbose=False):
    if verbose:
        print(z)
    ind = '1' if series == 'train' else '2'
    trace = dict(
        type="heatmap",
        z=z[::-1],
        x=classes,
        y=classes[::-1],
        ...
```
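A plotly-free sketch of what the truncated `get_trace` above appears to do (an assumption on my part: `ind` selects the subplot axes), so the row-reversal logic can be checked in isolation:

```python
# Minimal sketch, assuming the intent of the truncated get_trace above:
# build a plotly-style heatmap dict without importing plotly, so the
# row-reversal and subplot-axis selection can be seen on their own.
def make_heatmap_dict(z, series, classes):
    ind = '1' if series == 'train' else '2'  # subplot column 1 or 2
    return dict(
        type="heatmap",
        z=z[::-1],          # reverse rows so row 0 is drawn at the bottom
        x=classes,
        y=classes[::-1],    # matching reversed y labels
        xaxis='x' + ind,    # assumption: axes are wired to the subplot index
        yaxis='y' + ind,
    )

trace = make_heatmap_dict([[1, 2], [3, 4]], 'test', ['cat', 'dog'])
print(trace['yaxis'])  # → y2
```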
if I list_files the new dataset, I see the same files 😕
what could be wrong?
is there a way to prevent creating a new setup in my worker each time?
is it ok like this?
```
Task init
params setup
task.execute_remotely()
real_code here
```
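A hedged sketch of that init / params / execute_remotely pattern (assuming a configured ClearML server and agent; the project, task, and queue names below are placeholders):

```python
# Hedged sketch of the pattern above. Assumes a configured ClearML server
# and agent; names are placeholders. The import is guarded so the sketch
# stays illustrative even where clearml is not installed.
try:
    from clearml import Task
except ImportError:
    Task = None  # clearml not installed; pattern shown for illustration


def real_code(params):
    pass  # placeholder for the actual training code


def run():
    task = Task.init(project_name="examples", task_name="remote-run")  # Task init
    params = {"lr": 0.001, "epochs": 10}
    task.connect(params)                            # params setup
    # Everything above runs locally; execute_remotely() then exits the local
    # process and enqueues the task, so the agent re-runs the script and
    # continues past this call with the real work.
    task.execute_remotely(queue_name="default")
    real_code(params)                               # real_code here
```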
thanks! that was the script I used, but for some reason making two sbs was a bit more complicated than just stacking two..
but I was finally able to do it:
I commented the upload_artifact at the end of the code and it finishes correctly now
yes, that would work, except that I need to modify tensorflow as well..I'm currently working on creating a wheel for modified tf..but it's taking a while...
and what about those packages that are not being loaded because they don't appear in the main file?
let me know if you need any help/ have issues trying to reproduce...thanks!
and in dummy_module I have:
```python
import pandas as pd

def func(args):
    pd.read_csv(args.file)
```
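If the issue is that automatic requirement detection only looks at the entry script's imports, one hedged workaround (using the `Task.add_requirements` API) is to declare the hidden dependency explicitly before `Task.init`:

```python
# Hedged sketch: pandas is imported only inside dummy_module, so it may be
# missed by automatic requirement detection. Declaring it explicitly before
# Task.init is one workaround. The import is guarded and Task.init is left
# commented so the sketch stays self-contained.
hidden_requirements = ["pandas"]  # packages used only in helper modules

try:
    from clearml import Task
    for pkg in hidden_requirements:
        Task.add_requirements(pkg)  # must be called before Task.init
    # task = Task.init(project_name="examples", task_name="dummy-module-run")
except ImportError:
    pass  # clearml not installed here; shown for illustration only

# import dummy_module  # pandas is used only inside this helper
```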
I did, but I still have the same issue..
```
tglema@mvd0000xlrndtl2 clearml-src git:(28b8502) ✗ git status
HEAD detached at 0.17.5rc3
```
I did a python setup.py develop, and ran the script:
```python
from clearml import Dataset

dataset = Dataset.create(dataset_project='test', dataset_name='example')
dataset.add_files('/home/tglema/example.jpeg')
dataset.add_files('/home/tglema/logo.png')
print(dataset.list_files())
dataset.upload()
dataset.finalize()

dataset_new = Dataset.create...
```
I missed that part! sorry
One question, what do I put in the task_id in the step2 file?
because once it clones task of step1, the task_id changes, so it no longer points to the actual task that was run
I understand that it uses time in seconds when there is no report being logged..but, it has already logged three times..
https://github.com/GoogleCloudPlatform/python-docs-samples/blob/master/compute/api/create_instance.py I think this is a nice place to start
ah, I see master is the same as 0.17.5rc3
sounds like you need to run a service to monitor for new commits in PROJ_1, to trigger the pipeline
so I have a couple of questions regarding setup.py.
If I add the requirement '.' as the last entry, does that mean it will install my package last? And can I do the modifications to the tensorflow code in setup.py? I need to see how I can change the tensorflow code after it is installed, and prevent another tensorflow installation from overwriting it. Is that clear?
if I put ~/clearml in the default_output_uri key, and start the task, when run as agent in GCP I get clearml.Task - INFO - Completed model upload to file:///$github_proj_directory/~/clearml/$proj_name/$experiment_name
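That literal `~` surviving into the upload path suggests the value is being treated as a relative path rather than expanded (my assumption about the cause). A small stdlib sketch of why: Python only expands a tilde at the very start of a path.

```python
# Why a literal "~" can survive into an output path: os.path.expanduser only
# rewrites a *leading* tilde, and joining "~/..." onto another directory
# keeps it verbatim. An absolute path in default_output_uri avoids this.
import os
import os.path

print(os.path.expanduser("~/clearml"))         # expanded, e.g. /home/you/clearml
print(os.path.join("/repo/dir", "~/clearml"))  # tilde kept: /repo/dir/~/clearml
print(os.path.expanduser("x/~/y"))             # not leading, so not expanded
```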