BeefyFrog17
Moderator
2 Questions, 4 Answers
Active since 18 March 2023
Last activity 9 months ago

Reputation: 0
Badges (1): 4 × Eureka!
0 Votes 3 Answers 618 Views
Hi, do you know how to upload PySpark dataframes with ClearML as an artifact? For example, I have code: task = Task.init( project_name="Try to upload pyspark df...
9 months ago
0 Votes 4 Answers 1K Views
Hi, when I use ClearML in a Jupyter notebook, the task displays SCRIPT PATH: ipykernel_launcher.py, so UNCOMMITTED CHANGES does not show the code. Does any...
one year ago
0 Hi, When I Use Clearml In Jupyter Notebook, The Task Displays Script Path: Ipykernel_Launcher.Py So Uncommitted Changes Does Not Show The Code. Does Anyone Know How To Fix This?

Yes, of course:

UNCOMMITTED CHANGES:

"""Entry point for launching an IPython kernel.

This is separate from the ipykernel package so we can avoid doing imports until
after removing the cwd from sys.path.
"""

import sys

if __name__ == "__main__":
 # Remove the CWD from sys.path while we load stuff.
 # This is added back by InteractiveShellApp.init_path()
 if sys.path[0] == "":
 del sys.path[0]

 from ipykernel import kernelapp as app

 app.launch_new_instance()
one year ago
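
For context, the script path points at the launcher above because Jupyter starts every kernel via python -m ipykernel_launcher, so inside a notebook cell the interpreter's entry script is that launcher rather than the notebook file. A minimal check (plain Python, not a ClearML call) that shows this:

# Run inside a Jupyter notebook cell.
# sys.argv[0] is the script the interpreter was started with; under Jupyter
# this is the ipykernel launcher shown above, which is what ClearML records
# as SCRIPT PATH instead of the notebook itself.
import sys
print(sys.argv[0])  # e.g. .../site-packages/ipykernel_launcher.py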
0 Hi, Do You Know How To Upload Pyspark Dataframes With Clearml As Artifact? For Example, I Have Code:

Hi @<1523701435869433856:profile|SmugDolphin23>! Thanks for the answer. This is my error:

RuntimeError: It appears that you are attempting to reference SparkContext from a broadcast variable, action, or transformation. SparkContext can only be used on the driver, not in code that it run on workers. For more information, see SPARK-5063.

Can you show me an example of your solution?

9 months ago
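
A minimal sketch of one way around SPARK-5063, assuming a local SparkSession and hypothetical project/task names and data: materialize the DataFrame on the driver (for example with toPandas()) and upload that object, so no SparkContext reference ends up in code shipped to workers.

from clearml import Task
from pyspark.sql import SparkSession

# Hypothetical project/task names and toy data, for illustration only.
task = Task.init(project_name="examples", task_name="upload pyspark df")
spark = SparkSession.builder.master("local[*]").getOrCreate()
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

# Collect the DataFrame on the driver and upload the resulting pandas object;
# the SparkContext never leaves the driver, which is what SPARK-5063 forbids.
task.upload_artifact(name="my_dataframe", artifact_object=df.toPandas())

task.close()
spark.stop()

ClearML serializes a pandas DataFrame artifact to CSV by default, so this suits modestly sized frames; a very large DataFrame would be better written out (e.g. to Parquet) and uploaded as a file instead.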