Hello, I am a first-timer with ClearML. I deployed a ClearML server locally (successfully) and am now trying to deploy the agent in my Kubernetes cluster. I followed the Helm chart from "helm repo add clearml
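For context, a hedged sketch of the usual install sequence for the agent chart. The repo URL, release name, namespace, and value keys below are assumptions based on the public clearml-helm-charts repository and may differ from the actual setup; the placeholders in angle brackets must be replaced with real values:

```shell
# Assumed repo URL and chart names from the public clearml-helm-charts project.
helm repo add clearml https://clearml.github.io/clearml-helm-charts
helm repo update

# Install the agent, pointing it at the locally deployed server.
# The *UrlReference values must be reachable, non-empty URLs -- note that
# in the pod description below they render as blank.
helm install clearml-agent clearml/clearml-agent \
  --namespace clearml-prod \
  --set clearml.agentk8sglueKey=<ACCESS_KEY> \
  --set clearml.agentk8sglueSecret=<SECRET_KEY> \
  --set agentk8sglue.apiServerUrlReference=http://<API_HOST>:8008 \
  --set agentk8sglue.fileServerUrlReference=http://<FILES_HOST>:8081 \
  --set agentk8sglue.webServerUrlReference=http://<WEB_HOST>:8080
```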


kubectl describe pod -n clearml-prod -l app.kubernetes.io/name=clearml-agent
kubectl logs -n clearml-prod -l app.kubernetes.io/name=clearml-agent --previous 2>/dev/null || true
Name:             clearml-agent-848875fbdc-x8x6s
Namespace:        clearml-prod
Priority:         0
Service Account:  clearml-agent-sa
Node:             kharrinhao/192.168.70.211
Start Time:       Mon, 21 Jul 2025 15:23:02 +0000
Labels:           app.kubernetes.io/instance=clearml-agent
                  app.kubernetes.io/managed-by=Helm
                  app.kubernetes.io/name=clearml-agent
                  app.kubernetes.io/version=1.24
                  helm.sh/chart=clearml-agent-5.3.3
                  pod-template-hash=848875fbdc
Annotations:      checksum/config: 5c1b50a353fea7ffd1fa5e62f968edc92e2610e0f0fd7783900a44f899ebe9ca
                  cni.projectcalico.org/containerID: 6964e25aa0cf54fa1dc91e36648d97e6deeae3366a924579be1e72742a25365a
                  cni.projectcalico.org/podIP: 192.168.31.162/32
                  cni.projectcalico.org/podIPs: 192.168.31.162/32
Status:           Running
IP:               192.168.31.162
IPs:
  IP:           192.168.31.162
Controlled By:  ReplicaSet/clearml-agent-848875fbdc
Init Containers:
  init-k8s-glue:
    Container ID:
    Image:         docker.io/allegroai/clearml-agent-k8s-base:1.24-21
    Image ID:      docker.io/allegroai/clearml-agent-k8s-base@sha256:772827a01bb5a4fff5941980634c8afa55d1d6bbf3ad805ccd4edafef6090f28
    Port:          <none>
    Host Port:     <none>
    Command:
      /bin/sh
      -c
      set -x; while [ $(curl --insecure -sw '%{http_code}' "" -o /dev/null) -ne 200 ] ; do
        echo "waiting for apiserver" ;
        sleep 5 ;
      done; while [[ $(curl --insecure -sw '%{http_code}' "" -o /dev/null) =~ 403|405 ]] ; do
        echo "waiting for fileserver" ;
        sleep 5 ;
      done; while [ $(curl --insecure -sw '%{http_code}' "" -o /dev/null) -ne 200 ] ; do
        echo "waiting for webserver" ;
        sleep 5 ;
      done

    State:          Terminated
      Reason:       Completed
      Exit Code:    0
      Started:      Mon, 21 Jul 2025 15:23:03 +0000
      Finished:     Mon, 21 Jul 2025 15:23:03 +0000
    Ready:          True
    Restart Count:  0
    Environment:    <none>
    Mounts:
      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-7f2zt (ro)
Containers:
  k8s-glue:
    Container ID:
    Image:         docker.io/allegroai/clearml-agent-k8s-base:1.24-21
    Image ID:      docker.io/allegroai/clearml-agent-k8s-base@sha256:772827a01bb5a4fff5941980634c8afa55d1d6bbf3ad805ccd4edafef6090f28
    Port:          <none>
    Host Port:     <none>
    Command:
      /bin/bash
      -c
      export PATH=$PATH:$HOME/bin; source /root/.bashrc && /root/entrypoint.sh

    State:          Waiting
      Reason:       CrashLoopBackOff
    Last State:     Terminated
      Reason:       Error
      Exit Code:    1
      Started:      Mon, 21 Jul 2025 15:23:58 +0000
      Finished:     Mon, 21 Jul 2025 15:24:02 +0000
    Ready:          False
    Restart Count:  3
    Environment:
      CLEARML_API_HOST:
      CLEARML_WEB_HOST:
      CLEARML_FILES_HOST:
      CLEARML_API_HOST_VERIFY_CERT:  false
      K8S_GLUE_EXTRA_ARGS:           --namespace clearml-prod --template-yaml /root/template/template.yaml  --create-queue
      CLEARML_CONFIG_FILE:           /root/clearml.conf
      K8S_DEFAULT_NAMESPACE:         clearml-prod
      CLEARML_API_ACCESS_KEY:        <set to the key 'agentk8sglue_key' in secret 'clearml-agent-ac'>     Optional: false
      CLEARML_API_SECRET_KEY:        <set to the key 'agentk8sglue_secret' in secret 'clearml-agent-ac'>  Optional: false
      CLEARML_WORKER_ID:             clearml-agent
      CLEARML_AGENT_UPDATE_REPO:
      FORCE_CLEARML_AGENT_REPO:
      CLEARML_DOCKER_IMAGE:          ubuntu:18.04
      K8S_GLUE_QUEUE:                default
    Mounts:
      /root/clearml.conf from k8sagent-clearml-conf-volume (ro,path="clearml.conf")
      /root/template from clearml-agent-pt (rw)
      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-7f2zt (ro)
Conditions:
  Type                        Status
  PodReadyToStartContainers   True
  Initialized                 True
  Ready                       False
  ContainersReady             False
  PodScheduled                True
Volumes:
  clearml-agent-pt:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      clearml-agent-pt
    Optional:  false
  k8sagent-clearml-conf-volume:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  clearml-agent-ac
    Optional:    false
  kube-api-access-7f2zt:
    Type:                    Projected (a volume that contains injected data from multiple sources)
    TokenExpirationSeconds:  3607
    ConfigMapName:           kube-root-ca.crt
    ConfigMapOptional:       <nil>
    DownwardAPI:             true
QoS Class:                   BestEffort
Node-Selectors:              <none>
Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
Events:
  Type     Reason     Age                From               Message
  ----     ------     ----               ----               -------
  Normal   Scheduled  96s                default-scheduler  Successfully assigned clearml-prod/clearml-agent-848875fbdc-x8x6s to kharrinhao
  Normal   Pulled     95s                kubelet            Container image "docker.io/allegroai/clearml-agent-k8s-base:1.24-21" already present on machine
  Normal   Created    95s                kubelet            Created container: init-k8s-glue
  Normal   Started    95s                kubelet            Started container init-k8s-glue
  Normal   Pulled     40s (x4 over 94s)  kubelet            Container image "docker.io/allegroai/clearml-agent-k8s-base:1.24-21" already present on machine
  Normal   Created    40s (x4 over 94s)  kubelet            Created container: k8s-glue
  Normal   Started    40s (x4 over 93s)  kubelet            Started container k8s-glue
  Warning  BackOff    10s (x6 over 84s)  kubelet            Back-off restarting failed container k8s-glue in pod clearml-agent-848875fbdc-x8x6s_clearml-prod(42a51ff8-6423-485a-89e3-6109b3c0583a)
    not nested and not items))
  File "/usr/lib/python3.6/sre_parse.py", line 765, in _parse
    p = _parse_sub(source, state, sub_verbose, nested + 1)
  File "/usr/lib/python3.6/sre_parse.py", line 416, in _parse_sub
    not nested and not items))
  File "/usr/lib/python3.6/sre_parse.py", line 734, in _parse
    flags = _parse_flags(source, state, char)
  File "/usr/lib/python3.6/sre_parse.py", line 803, in _parse_flags
    raise source.error("bad inline flags: cannot turn on global flag", 1)
sre_constants.error: bad inline flags: cannot turn on global flag at position 92 (line 4, column 20)
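The traceback above comes from Python's regex parser (`sre_parse`), which suggests some configuration value is being interpolated into a regular expression unescaped — note that `CLEARML_API_HOST`, `CLEARML_WEB_HOST`, and `CLEARML_FILES_HOST` all render as blank in the pod description. A minimal sketch of this class of failure and the usual fix, using a hypothetical URL value (not taken from the logs):

```python
import re

# Hypothetical stand-in for a misconfigured value: the "(?p" sequence is
# parsed by re as an (invalid) inline group extension.
server_url = "https://host/(?page)"

# Embedding the raw string in a pattern crashes the regex parser.
try:
    re.compile(server_url)
except re.error as exc:
    print("compile failed:", exc)

# The usual fix: escape the literal before building a pattern from it.
pattern = re.compile(re.escape(server_url))
assert pattern.search("prefix https://host/(?page) suffix") is not None
```

If the agent container builds patterns from the server URLs, the first thing to verify is that the three `*UrlReference` values in the Helm release are set and well-formed, since empty or malformed URLs would reach the glue code unvalidated.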
  
  
Posted one month ago
32 Views
0 Answers