Ah, apparently the reason was that the squash() method defaults its output URL to file_server instead of the project's default storage string. It might be nice to run the storage-validity checks before spawning the sub-processes.
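For anyone hitting the same thing, here is a minimal sketch of a possible workaround, assuming the installed clearml version exposes the output_url argument on Dataset.squash(). The dataset name and variables mirror the traceback in the question below; the dataset IDs and the storage URI are hypothetical placeholders, substitute your own project's default storage target:

from clearml import Dataset

# Fetch the two dataset versions to merge (IDs are placeholders)
baseline_dataset = Dataset.get(dataset_id='<baseline-dataset-id>')
dataset = Dataset.get(dataset_id='<new-dataset-id>')

# Squash them, pointing the squashed output at an explicit storage target
# instead of relying on the implicit file_server default
new_baseline_dataset = Dataset.squash(
    dataset_name='VinzYoloBalancedDataset',
    dataset_ids=[baseline_dataset.id, dataset.id],
    output_url='s3://my-bucket/datasets',  # hypothetical URI; use your project's default storage
)

Passing the target explicitly (and checking it is reachable before the call) should surface a clear storage error up front instead of the opaque multiprocessing backtrace.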
			
Heya, good day to everyone! I'm finding myself facing this random error with a very opaque backtrace when attempting to squash two distinct versions of the same dataset. Does it ring any bells?

File "/root/.clearml/venvs-builds/3.10/code/build_continuous_learning_dataset.py", line 74, in build_continuous_learning_dataset
    new_baseline_dataset = Dataset.squash(dataset_name='VinzYoloBalancedDataset', dataset_ids=[baseline_dataset.id, dataset.id])
File "/root/.clearml/venvs-builds/3.10/lib/python3.10/site-packages/clearml/datasets/dataset.py", line 1763, in squash
    pool.map(
File "/usr/lib/python3.10/multiprocessing/pool.py", line 367, in map
    return self._map_async(func, iterable, mapstar, chunksize).get()
File "/usr/lib/py