Hello everyone,
*Context:*
I am currently facing a headache-inducing issue with the integration of flash attention V2 for LLM training.
I am running a Python script locally that is then executed remotely. Without the integration of flash attention, the co…
Hi @SuccessfulRaven86, how exactly are you running the code remotely? Is this a daemon agent running on that EC2 instance?
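For reference, a minimal sketch of the local-then-remote pattern being discussed, assuming the script uses ClearML's `Task.execute_remotely()` to hand the run off to an agent and Hugging Face `transformers` with `attn_implementation="flash_attention_2"` for the model (both are assumptions; the thread does not confirm the exact setup, and the model name below is hypothetical):

```python
# Sketch only: assumes ClearML for remote execution, Hugging Face transformers
# for the LLM, and the flash-attn v2 package installed on the remote worker.
import torch
from clearml import Task
from transformers import AutoModelForCausalLM, AutoTokenizer

# Register the run; when launched locally this enqueues the task and exits,
# and a clearml-agent (e.g. a daemon running on an EC2 instance) picks it up.
task = Task.init(project_name="llm-training", task_name="flash-attn-v2-run")
task.execute_remotely(queue_name="default", exit_process=True)

# From here on, the code executes on the remote worker.
model_name = "meta-llama/Llama-2-7b-hf"  # hypothetical model choice
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,               # flash attention needs fp16/bf16
    attn_implementation="flash_attention_2",  # enable FlashAttention-2 kernels
)
```

With this pattern, whether flash attention works remotely depends on the remote worker's environment (CUDA toolkit, `flash-attn` wheel, GPU architecture), which is typically where local-vs-remote discrepancies come from.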