Cohere releases Tiny Aya, a family of 3.35B-parameter open-weight models supporting 70+ languages for offline use, trained on a single cluster of 64 H100 GPUs (Ivan Mehta/TechCrunch) https://bit.ly/40kd1c9
Ivan Mehta / TechCrunch:
Enterprise AI company Cohere launched a new family of multilingual models on the sidelines of the ongoing India AI Summit.