Comments
bad-frog: just curious...
how do you manage to leverage multicore in a way that a GPU wouldn't?
i am interested in that chip because of the many cores, but then it's Apple...
@bad-frog control-heavy tasks, and/or stuff with a lot of IO time or a lot of sparsity (both compute and memory), suck on a GPU. Multi-core CPUs generally do much better, but it depends on the task and scale. You'd only use a GPU for regular, dense, independent, throughput-optimized compute at large scale. For everything else (which is most stuff that's not based on linear algebra) you want a CPU.
Also that chip (and most modern SoCs) has a GPU and an AI accelerator (basically a massive matrix computation unit), so you can use whichever one you want depending on the task. The SoC design and unified memory mean the units are physically very close to each other, so data transfer between them is pretty fast.
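To make the distinction concrete, here's a minimal Python/NumPy sketch (purely illustrative; the sizes and workload names are made up, and nothing here uses Apple's actual APIs) contrasting a dense, throughput-style matmul, the kind of thing you'd offload to a GPU or the matrix unit, with a branchy, pointer-chasing traversal that favors a multi-core CPU:

import time
import numpy as np

# Dense, regular, independent work: the access pattern a GPU / matrix
# unit is built for. Every element is touched in a predictable order.
def dense_matmul(n=1024):
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)
    return a @ b

# Control-heavy, data-dependent work: unpredictable memory access and
# branching dominate, so wide SIMD lanes would mostly sit idle. This
# kind of task is better served by a CPU core.
def pointer_chase(n=1_000_000, steps=1_000_000):
    rng = np.random.default_rng(0)
    next_idx = rng.integers(0, n, size=n)  # random "linked list" of indices
    i, acc = 0, 0
    for _ in range(steps):
        i = next_idx[i]   # dependent, cache-unfriendly memory access
        if i % 2:         # data-dependent branch
            acc += i
    return acc

for name, fn in [("dense matmul", dense_matmul), ("pointer chase", pointer_chase)]:
    start = time.perf_counter()
    fn()
    print(f"{name}: {time.perf_counter() - start:.3f}s")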
@homo-lorens don’t think so - but I’m not using anything like that in this coming year
!rant... but just gotta say... this new M1 MacBook feels fucking awesome. And I already had a late 2019 MacBook. The actual feel is different... right? Not just the chip? But it's super super fast. Also NO fan when I record screencasts...
It's super fucking rad... and - yeah. Just kinda want to shout that. Maybe I'm crazy...