Answered
Hi There, I Have A Package Called

Hi there,
I have a package called feast[redis] in my requirements.txt file.
When I run locally everything works, but from the UI it is not listed under INSTALLED PACKAGES.
Then, when I run it, I get an error.

Does anyone know how to solve this?

  
  
Posted one year ago

Answers 12


Hi IrritableGiraffe81

I have a package called

feast[redis]

in my requirements.txt file.

This means feast is installing additional packages. Once the agent is done installing everything, it basically calls pip freeze and stores back all the packages, including versions.
Now the question is: how come redis is not installed?
Notice that the Task already has the auto-detected packages (it basically ignores requirements.txt, as it is often incomplete or just wrong).
That said, if you clear the "Installed packages" it will revert to using requirements.txt.
You can also set it to use the requirements.txt in the code to begin with:
```
Task.add_requirements("redis")
```
or the full requirements file with:
```
Task.force_requirements_env_freeze(requirements_file="./requirements.txt")
Task.init(...)
```
wdyt?
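That "pip freeze and store back" step can be sketched in plain Python. This is an assumption about the mechanism, not ClearML's actual agent code:

```python
import subprocess
import sys

# Sketch: after installing the environment, capture the fully resolved
# package list (including transitive installs such as redis pulled in
# by the feast[redis] extra).
frozen = subprocess.run(
    [sys.executable, "-m", "pip", "freeze"],
    capture_output=True, text=True, check=True,
).stdout

# Pinned lines record name==version, which is what ends up in the
# task's "Installed packages" section.
pinned = [line for line in frozen.splitlines() if "==" in line]
```

This is why the stored list reflects what was actually installed rather than what requirements.txt asked for.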

  
  
Posted one year ago

Here you go:
```
@PipelineDecorator.pipeline(name='training', project='kgraph', version='1.2')
def pipeline(...):
    return

if __name__ == '__main__':
    Task.force_requirements_env_freeze(requirements_file="./requirements.txt")
    pipeline(...)
```
If you need anything for the pipeline component you can do:
```
@PipelineDecorator.component(packages="./requirements.txt")
def step(data):
    # some stuff
```

  
  
Posted one year ago

AgitatedDove14 Thanks for the explanation
I got it.
How can I use force_requirements_env_freeze with PipelineDecorator(), as I do not have the Task object created?
```
@PipelineDecorator.pipeline(name='training', project='kgraph', version='1.2')
def main(feature_view_name, data_timestamp=None, tk_list=None):
    """Pipeline to train ...
```

  
  
Posted one year ago

I've also tried with clearml-1.6.5rc2 and got the same error.
I am lost 😔

  
  
Posted one year ago

Got it!
Thanks a lot AgitatedDove14
I will try !

  
  
Posted one year ago

🤞

  
  
Posted one year ago

AgitatedDove14 Worked!

But a new error is raised:

```
File "kgraph/pipelines/token_join/train/pipeline.py", line 48, in main
    timestamp = pd.to_datetime(data_timestamp) if data_timestamp is not None else get_latest_version(feature_view_name)
File "/root/.clearml/venvs-builds/3.8/task_repository/Data-Science/kgraph/featurestore/query_data.py", line 77, in get_latest_version
    fv = store.get_feature_view(fv_name)
File "/root/.clearml/venvs-builds/3.8/lib/python3.8/site-packages/feast/usage.py", line 287, in wrapper
    raise exc.with_traceback(traceback)
File "/root/.clearml/venvs-builds/3.8/lib/python3.8/site-packages/feast/usage.py", line 276, in wrapper
    return func(*args, **kwargs)
File "/root/.clearml/venvs-builds/3.8/lib/python3.8/site-packages/feast/feature_store.py", line 380, in get_feature_view
    return self._get_feature_view(name, allow_registry_cache=allow_registry_cache)
File "/root/.clearml/venvs-builds/3.8/lib/python3.8/site-packages/feast/feature_store.py", line 388, in _get_feature_view
    feature_view = self._registry.get_feature_view(
File "/root/.clearml/venvs-builds/3.8/lib/python3.8/site-packages/feast/registry.py", line 1438, in get_feature_view
    return FeatureView.from_proto(feature_view_proto)
File "/root/.clearml/venvs-builds/3.8/lib/python3.8/site-packages/typeguard/__init__.py", line 1033, in wrapper
    retval = func(*args, **kwargs)
File "/root/.clearml/venvs-builds/3.8/lib/python3.8/site-packages/feast/feature_view.py", line 471, in from_proto
    batch_source = DataSource.from_proto(feature_view_proto.spec.batch_source)
File "/root/.clearml/venvs-builds/3.8/lib/python3.8/site-packages/typeguard/__init__.py", line 1033, in wrapper
    retval = func(*args, **kwargs)
File "/root/.clearml/venvs-builds/3.8/lib/python3.8/site-packages/feast/data_source.py", line 339, in from_proto
    return cls.from_proto(data_source)
File "/root/.clearml/venvs-builds/3.8/lib/python3.8/site-packages/feast_postgres/offline_stores/postgres_source.py", line 56, in from_proto
    event_timestamp_column=data_source.event_timestamp_column,
AttributeError: event_timestamp_column
```
This error is when I run from the UI.

When I run using:
```
# Run the pipeline in your local machine
PipelineDecorator.run_locally()
```
it works just fine.

This seems to be some kind of type error similar to this one:

https://clearml.slack.com/archives/CTK20V944/p1660259947051559?thread_ts=1660258298.953429&cid=CTK20V944
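The AttributeError in the traceback is the shape a renamed or removed proto field produces: feast_postgres reads an attribute the proto object no longer exposes. A minimal sketch, with the class and field names purely illustrative (not feast's real proto):

```python
# Illustrative stand-in for a protobuf message whose field was renamed
# between library versions; the consumer still asks for the old name.
class DataSourceProto:
    timestamp_field = "event_ts"  # only the new name exists now

def read_old_field(proto):
    # mirrors feast_postgres accessing data_source.event_timestamp_column
    # on a proto that no longer carries that attribute
    return proto.event_timestamp_column

try:
    read_old_field(DataSourceProto())
    failed = False
except AttributeError:
    failed = True
```

If that is what is happening here, pinning feast and feast-postgres to versions that were released together would be the usual fix.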

  
  
Posted one year ago

IrritableGiraffe81 could it be the pipeline component is not importing pandas inside the function? Notice that a function decorated as a pipeline component becomes standalone; this means that if you need pandas you have to import it inside the function. The same goes for all the rest of the packages used.
When you run with run_locally or debug_pipeline you are using your local env, as opposed to the actual pipeline, where a new env is created inside the repo.
Can you send the entire pipeline component console log? (Just click download in the web UI and attach it here.)
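The "import inside the function" pattern can be shown without ClearML at all; here json stands in for pandas so the sketch stays self-contained, and the component decorator is omitted:

```python
def step(data):
    # A pipeline component runs as a standalone function, often in a fresh
    # environment/process, so module-level imports from the pipeline file
    # are not available there; import every dependency inside the body.
    import json  # stand-in for e.g. `import pandas as pd`
    return json.dumps(data, sort_keys=True)

result = step({"b": 2, "a": 1})
```

A module-level `import pandas` in the pipeline file would work locally but leave the remotely executed component without the name, which matches the local-vs-UI difference described above.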

  
  
Posted one year ago

I don’t think so AgitatedDove14
I’ve tested with:

- PipelineDecorator.debug_pipeline()
- PipelineDecorator.run_locally()
- Docker

and got no error.

  
  
Posted one year ago

I've built a container using the same image the agent uses.
Training ran with no errors.

  
  
Posted one year ago

This looks like a 'feast' error, could it be a missing configuration?

  
  
Posted one year ago

Thanks for the reply, I will send it to you soon.

  
  
Posted one year ago