PompousParrot44
Moderator
32 Questions, 85 Answers
  Active since 10 January 2023
  Last activity 8 months ago

Reputation: 0
Badges: 1
85 × Eureka!
0 Votes 0 Answers 1K Views
hmm, it seems the plots are ending up under debug samples
4 years ago
0 Votes 12 Answers 1K Views
for the frameworks which are supported built-in, trains stores the trained model as an output model, e.g. for xgboost here https://github.com/allegroai/trains/bl...
4 years ago
0 Wondering Why My Plots Are Not In Plot Section But Under Debug Section

it may be that i am new to trains, but in my normal notebook flow they are both images, and as a trains user i expected it to be under the Plot section, since i think this is an image.. in a nutshell, all matplotlib plots display data as an image 🙂 (see the reporting sketch after this entry)

4 years ago
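For context, here is a minimal sketch of how a matplotlib figure could be reported explicitly so it lands under Plots rather than Debug Samples. It assumes the Logger's report_matplotlib_figure method is available in the installed trains/clearml version; the project and task names are placeholders, not anything from the thread.

```python
# Minimal sketch (assumptions noted above): explicitly report a matplotlib
# figure so it shows up as a plot instead of relying on automatic capture.
import matplotlib.pyplot as plt
from trains import Task

# Placeholder project/task names
task = Task.init(project_name="examples", task_name="matplotlib reporting sketch")

fig, ax = plt.subplots()
ax.plot([0, 1, 2, 3], [0, 1, 4, 9])
ax.set_title("quadratic")

# Explicit reporting call; assumed to exist in the installed SDK version
task.get_logger().report_matplotlib_figure(
    title="my plot", series="series A", iteration=0, figure=fig
)
```

Whether an auto-captured figure is stored as an interactive plot or as a debug image can vary with how the figure is produced, so explicit reporting is one way to pin it to the Plots section.
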
0 For The Frameworks Which Are Supported Built-In, Trains Stores The Trained Model As Output Model E.G. For Xgboost Here

so i was expecting that the uploaded model would be, for example, LightGBM.1104445eca4749f89962669200481397/models/Model%20object/model.pkl

4 years ago
0 Another Question Is If I Have A Conda Env Available On My Workers Systemwide.. Can I Use That Env Directly When Running Tasks With

i know it supports conda.. but i have another system-wide env which is not base.. say ml, so i am wondering if i can configure trains-agent to use that... not standard practice, but just asking if it is possible

4 years ago
0 Colors Of Cm Reporting Are Strange... Is It Possible To Adjust The Default Ones

trains is run using docker-compose with allegroai/trains-agent-services:latest and allegroai/trains:latest

4 years ago
0 Colors Of Cm Reporting Are Strange... Is It Possible To Adjust The Default Ones

i am simply proxying it using ssh port forwarding

4 years ago
0 Colors Of Cm Reporting Are Strange... Is It Possible To Adjust The Default Ones

also one thing i noticed.. when i report a confusion matrix and some other plots, e.g. seaborn with matplotlib.. on the server side i can see the plots are there, but they are not visible at all

4 years ago
0 When Trains-Agent Is Configured With

is it because of something wrong with this package build from its owner, or something else?

4 years ago
0 In Ui Under Execution Tab, I See That The Trains Has

with this layout.. it didn't work earlier

4 years ago
0 In Ui Under Execution Tab, I See That The Trains Has

i can not check the working directory today due to vpn issues accessing the server, but the script path was -m test.scripts; the script was missing from it

4 years ago
0 When Trains-Agent Is Configured With

thanks for the update... it seems currently i can not pass the http/s proxy parameters, as when the agent creates a new env and tries to download some package, it gets blocked by our corp firewall... all outgoing connections need to pass through a proxy.. so is it possible to specify that, or pass environment variables to the agent?

4 years ago
0 When Trains-Agent Is Configured With

thanks for letting me know.. but it turns out that after i recreated my whole system environment from scratch, trains-agent is working as expected..

4 years ago
0 When Trains-Agent Is Configured With

AgitatedDove14 it seems i am having issues when i restart the agent... it fails at creating/setting up the env again... when i clean up the .trains/venv-builds folder and run a job for the agent.. it is able to create the env fine and run the job successfully.. when i restart the agent, it fails with messages like
` Requirement already satisfied: cffi@ file:///home/conda/feedstock_root/build_artifacts/cffi_1595805535531/work from file:///home/conda/feedstock_root/build_artifacts/cffi_1595805535...

4 years ago
0 In Ui Under Execution Tab, I See That The Trains Has

trains-agent version as mentioned is 0.16.1 and server is 0.16.1 as well

4 years ago
0 Colors Of Cm Reporting Are Strange... Is It Possible To Adjust The Default Ones

seems like port forwarding had an issue.. fixed that.. now running the test again to see if things work out as expected

4 years ago
0 Colors Of Cm Reporting Are Strange... Is It Possible To Adjust The Default Ones

this is when executed directly with task.init()

4 years ago
0 I Am Seeing Issue When Running A Script With Command As

there are multiple scripts under the test/scripts folder.. the example is running one script from that folder

4 years ago
0 When Running In

i don't need this right away.. i just wanted to know the possibility of dividing the current machine into multiple workers... i guess if it's not readily available then maybe you guys can discuss to see if it makes sense to have it on the roadmap..

4 years ago
0 When Running In

thanks for your help AgitatedDove14

4 years ago
0 Is There A Link Which Describes The Differences In Community And Enterprise Versions

couldn't find the licensing price for enterprise version

4 years ago
0 For The Frameworks Which Are Supported Built-In, Trains Stores The Trained Model As Output Model E.G. For Xgboost Here

AgitatedDove14 when using OutputModel(task, name='LightGBM model', framework='LightGBM').update_weights(f"{args.out}/model.pkl") i am seeing this in the logs: No output storage destination defined, registering local model /tmp/model.pkl. when i go to the trains UI.. i see the model name and details, but when i try to download it, it points to the path file:///tmp/model.pkl which is incorrect. wondering how to fix it

4 years ago
0 For The Frameworks Which Are Supported Built-In, Trains Stores The Trained Model As Output Model E.G. For Xgboost Here

AgitatedDove14 it seems uploading artifacts and uploading models are two different things when it comes to the fileserver... when i upload an artifact it works as expected, but when uploading a model using the OutputModel class, it wants an output_uri path.. wondering how i can ask it to store it under the fileserver like artifacts, e.g. LightGBM.1104445eca4749f89962669200481397/artifacts/Model%20object/model.pkl (see the sketch after this entry)

4 years ago
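Related to the two entries above, a minimal sketch of one way the upload destination could be configured so the model ends up on the fileserver like artifacts do. It assumes the output_uri argument of Task.init is the mechanism for setting the destination and uses a placeholder fileserver address; this is not necessarily the fix proposed in the thread.

```python
# Minimal sketch (assumptions noted above): point model uploads at the
# fileserver so the registered OutputModel is uploaded rather than only
# registered as a local /tmp path.
from trains import Task, OutputModel

task = Task.init(
    project_name="LightGBM",
    task_name="train",
    output_uri="http://files.example.com:8081",  # placeholder fileserver address
)

# ... train and dump the model to model.pkl ...

output_model = OutputModel(task=task, name="LightGBM model", framework="LightGBM")
# With output_uri set on the task, the weights file should be uploaded to the
# configured destination instead of being registered as file:///tmp/model.pkl
output_model.update_weights(weights_filename="model.pkl")
```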