What’s up today? (Part 2)

@dirvine
1-bit LLMs are being developed by Microsoft: LLMs radically reduced in size and computational (inference) complexity, without compromising performance. Running LLMs on smartphones should become easily possible.
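
For anyone curious what "1-bit" means in practice: Microsoft's BitNet b1.58 work quantizes each weight to one of three values {-1, 0, +1} (about 1.58 bits per weight), so matrix multiplies reduce mostly to additions and subtractions. Below is a minimal sketch of that absmean ternary quantization; the function names and the NumPy implementation are my own for illustration, not Microsoft's code.

```python
import numpy as np

def quantize_weights_ternary(W: np.ndarray, eps: float = 1e-5):
    """Absmean quantization: scale by the mean absolute weight, then
    round-and-clip every weight to one of {-1, 0, +1}."""
    scale = np.abs(W).mean() + eps              # per-tensor absmean scale
    W_q = np.clip(np.round(W / scale), -1, 1)   # ternary weights
    return W_q.astype(np.int8), scale

def ternary_matmul(x: np.ndarray, W_q: np.ndarray, scale: float):
    """Inference-time matmul: with ternary weights, the inner products need
    only additions/subtractions; the scale is re-applied once at the end."""
    return (x @ W_q.astype(np.float32)) * scale

# Toy usage: quantize a random weight matrix and check the approximation error.
rng = np.random.default_rng(0)
W = rng.normal(size=(256, 256)).astype(np.float32)
x = rng.normal(size=(1, 256)).astype(np.float32)
W_q, scale = quantize_weights_ternary(W)
print(np.abs(x @ W - ternary_matmul(x, W_q, scale)).mean())
```

The memory win is roughly 16x versus fp16 weights, and the absence of full-precision multiplies is what makes on-device (e.g. smartphone) inference plausible.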
