simple,

Aren’t there things like qemu and box64…

Yeah, but they're experimental and probably very buggy. I've used box64 on my phone, and it doesn't play well with everything.

Can it do LLM inference as fast as a M2/M3 Macbook?

It should allegedly be better at AI workloads than M-series laptops. Many manufacturers have actually started listing prices for the new laptops; the new Microsoft ones start at $1,000 with 16GB of RAM. I know the Lenovo one can reach 64GB of RAM, but I'm not sure about its pricing.
