AgitatedDove14 , I'll have a look at it and let you know. According to you, the VPN shouldn't be a problem, right?
I tried it, and it is uploading to Debug Samples ( `Task.get_task()` with `task.get_logger().report_matplotlib_figure()` ), but with `Task.init()` it's uploading to Plots.
Checking with the RC package now
Should we also provide credentials for the Storage Account on the Web UI under 'Profile' section?
I tried setting the variables with export
but got this error:
```
Traceback (most recent call last):
  File "test.py", line 1, in <module>
    from trains import Task
  File "/home/sam/VirtualEnvs/test/lib/python3.8/site-packages/trains/__init__.py", line 4, in <module>
    from .task import Task
  File "/home/sam/VirtualEnvs/test/lib/python3.8/site-packages/trains/task.py", line 28, in <module>
    from .backend_interface.metrics import Metrics
  File "/home/sam/VirtualEnvs/test/lib/pyth...
```
Ahh okay, commented out the whole thing and got the same error as earlier ( `Could not get access credentials` )
This is the whole error dump:
```
2020-12-03 13:31:27,296 - trains.storage - ERROR - Azure blob storage driver not found. Please install driver using "pip install 'azure.storage.blob>=2.0.1'"
Traceback (most recent call last):
  File "test.py", line 3, in <module>
    task = Task.init(project_name="Test", task_name="debugging")
  File "/home/sam/VirtualEnvs/test/lib/python3.8/site-packages/trains/task.py", line 461, in init
    task.output_uri = cls.__default_output_uri
  File "/home/sam/Virtu...
```
Thanks a lot SuccessfulKoala55!
Oh, it worked! I did the pip install multiple times earlier, but to no avail. I think it's because of the env variables? Let me try to unset those and provide the credentials within `trains.conf`
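In case it helps anyone else, this is roughly the section I'm filling in `trains.conf` (structure and field names taken from the sample config shipped with trains, so treat the exact layout as an assumption and compare with your own `trains.conf`; all values below are placeholders):

```
sdk {
    azure.storage {
        containers: [
            {
                # hypothetical placeholder credentials
                account_name: "my_account"
                account_key: "my_key"
                container_name: "my_container"
            }
        ]
    }
}
```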
Understood, I'll look into it!
My use case is that, let's say I'm training with a file called `train.py` in which I have `Task.init()`. Now, after the training is finished, I generate some more graphs with a file called `graphs.py` and want to attach/upload them to this training task which has finished. That's when I realised `Task.get_task()` is not working as intended, but it is when I have a `Task.init()` before it.
Indeed, `sleep()` did the trick, but it's going into the Debug Samples tab and not the Plots. Any reason why? Earlier (with `Task.init()` followed by `Task.get_task()` ) the same plt plots got reported to 'Plots'.
Are you doing `plt.imshow` ?
Nope
And yes, I set the `report_image=False`
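For context, a minimal sketch of the pattern being discussed, with `task_id` as a placeholder and the `report_image=False` flag from above (the figure building is plain matplotlib; the trains part assumes a reachable trains server, and the exact keyword names of `report_matplotlib_figure` should be double-checked against your SDK version):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so no display is needed
import matplotlib.pyplot as plt
import numpy as np


def make_scatter_figure(n=50):
    """Build a figure with two random scatter series, as in the snippet below."""
    fig = plt.figure()
    plt.scatter(np.random.rand(n), np.random.rand(n), alpha=0.5)
    plt.scatter(np.random.rand(n), np.random.rand(n), alpha=0.5)
    return fig


def report_to_existing_task(task_id):
    """Attach a matplotlib figure to an already-finished task."""
    from trains import Task

    task = Task.get_task(task_id=task_id)  # 'task_id' is a placeholder
    fig = make_scatter_figure()
    # report_image=False asks for rendering under 'Plots'
    # rather than 'Debug Samples' (per the discussion above)
    task.get_logger().report_matplotlib_figure(
        title="My Plot Title",
        series="series A",
        iteration=0,
        figure=fig,
        report_image=False,
    )
```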
0.16.1-320
you mean 0.16?
Ah my bad, I picked up the version from the docker-compose file :D
```python
from trains import Task
import matplotlib.pyplot as plt
import numpy as np
import time

task = Task.get_task(task_id='task_id')
for i in range(0, 10):
    x_1 = np.random.rand(50)
    y_1 = np.random.rand(50)
    x_2 = np.random.rand(50)
    y_2 = np.random.rand(50)
    plt.figure()
    plt.scatter(x_1, y_1, alpha=0.5)
    plt.scatter(x_2, y_2, alpha=0.5)
    # Plot will be reported automatically
    # plt.show()
    task.get_logger().report_matplotlib_figure(title="My Plot Title", serie...
```
Hey SuccessfulKoala55 , I'm new to Trains and want to set up an Azure Storage Container to store my model artifacts. I see that we can do this by providing an output_uri in Task.init(), but is there another way to send all the artifacts to Azure instead of using Task.init()? Like setting a variable somewhere, so that whenever I run my tasks I know the artifacts will get stored in Azure even if I don't provide an output_uri.
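For reference, the setting I was after appears to be `default_output_uri` (it shows up as `cls.__default_output_uri` in the traceback later in this thread). A hedged sketch of how it might look in `trains.conf`, with a hypothetical placeholder Azure URI; verify the exact URI scheme against the trains documentation for your version:

```
sdk {
    development {
        # Tasks upload artifacts here unless Task.init(output_uri=...) overrides it.
        # The URI below is a placeholder, not a real account.
        default_output_uri: "azure://myaccount.blob.core.windows.net/mycontainer"
    }
}
```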
Yeah I noticed that too! Ports are configured properly in the conf file though
Oh! With the `sleep()` function? Let me try it again
Great! Thanks for the clarification AgitatedDove14!
AgitatedDove14 , thanks a lot! I'll get back with a script in a day or two.
AgitatedDove14 , let me clarify. I meant, let's say I have all the data like checkpoints, test and train logdirs, and the scripts that were used to train a model. So, how would I upload all of that to the ClearML server without retraining the model, so that 'Scalars', 'Debug Samples', 'Hyperparameters', everything shows up on the ClearML server like they generally do?
Okay, my bad, the code snippet I sent is correctly uploading it to 'Plots', but in the actual script I use this:
```python
def plot_graphs(input_df, output_df, task):
    # Plotting all the graphs
    for metric in df_cols:
        # Assigning X and Y axes data
        in_x = input_df["frameNum"]
        in_y = input_df[metric]
        out_x = output_df["frameNum"]
        out_y = output_df[metric]
        # Creating a new figure
        plt.figure()
        plt.xlabel('Frame Number')
        ...
```
```
Verifying credentials ...
Error: could not verify credentials: key=xxxxxx secret=xxxxx
Enter user access key:
```
It's going to Debug Samples with the RC too
No, those env variables aren't set.
It says the container name is optional in `azure.storage` . If I don't provide the container name and remove it from the end of `default_uri` , would that work?
I tried it about 2-3 months ago with trains-init (same use-case as this one) and it failed that time too.
Could it be the credentials are actually incorrect?
Highly unlikely, like I said, I generated a new set of credentials from the Web-UI and it worked perfectly fine for an Azure VM (not under the VPN).