Even Apple finally admits that 8GB RAM isn't enough ( www.xda-developers.com )

There were a number of exciting announcements from Apple at WWDC 2024, from macOS Sequoia to Apple Intelligence. However, a subtle addition to Xcode 16 — the development environment for Apple platforms, like iOS and macOS — is a feature called Predictive Code Completion. Unfortunately, if you bought into Apple's claim that 8GB of unified memory was enough for base-model Apple silicon Macs, you won't be able to use it. There's a memory requirement for Predictive Code Completion in Xcode 16, and it's the closest thing we'll get from Apple to an admission that 8GB of memory isn't really enough for a new Mac in 2024.

KingThrillgore ,
@KingThrillgore@lemmy.ml avatar

They moved to on-package RAM for a reason: to nickel and dime yo ass.

I needed to expense a Mac Mini for iOS development, and everyone (me, the company, our purchasing department) was baffled at how much it cost to get 16GB. And they only go up to 24GB. Imagine how much they'll charge for 32GB in a year!

sugar_in_your_tea ,

It's technically a bit faster, but yeah, I think charging more is the bigger motivation.

dojan ,
@dojan@lemmy.world avatar

Companies primarily make decisions to maximise someone's profit, and it's never the consumer's.

Zink ,

Sounds like one of those rare cases where engineering and marketing might agree on something.

echodot ,

It's a bit faster, but if their primary motivation was performance improvements they wouldn't be soldering on just 16 GB.

If you're going to weld shoes to your feet, you better at least make sure that they're good shoes.

sugar_in_your_tea ,

Why not? There is a performance benefit to being closer to the CPU, and soldering gets you a lot closer to the CPU. That's a fact.

echodot ,

Yeah, but if you're only putting 8 GB of RAM on, then you're also going to be constantly hitting the SSD. So any performance gain you get from soldering is lost by going all the way out to storage every 3 microseconds.

It's only better performance on paper; in reality there's no real benefit. If you can run an application entirely within the 8 GB of RAM, and assuming you're not running anything else, then maybe you get better performance.

sugar_in_your_tea ,

And that's the idea. Soldering memory is an engineering decision. How much to solder is a marketing decision. Since users can't easily add more, marketing can upsell on more RAM.

It's not "on paper," the RAM itself is performing better vs socketed RAM. Whether the system runs better depends on the configuration, as in, did you order enough RAM.

echodot ,

I can't tell if you're a stooge or if you really think that. I hope you're a stooge, because otherwise that's a really stupid position you've decided to take, and you clearly don't actually understand the issue.

sugar_in_your_tea ,

I'm pretty sure I do understand the issue. Here are some facts (and an article to back it up):

  1. putting memory closer to the CPU improves performance through lower latency and higher bandwidth - from 96 GB/s up to 200 GB/s (M1) or 400 GB/s (M1 Max)
  2. customers can't easily solder on more RAM
  3. Apple's RAM upgrades are way more expensive than socketed options on the market

And here's my interpretation/guesses:

  1. marketing sees 1 & 2, and sees an opportunity to do more of 3
  2. marketing probably asked engineering what the bare minimum is, and they probably said 8GB (assuming web browsing and whatnot only), though 16GB is preferable (that's what I'd answer)
  3. marketing sets the minimum at 8GB, banking on most users who need more than the basics buying more up front, or buying another laptop sooner when they realize they've run out of RAM (after-sale RAM upgrades are expensive)

So:

  • using soldered RAM is an engineering decision due to improved performance (double the bandwidth of the socketed RAM in Intel Macs on the M1, quadruple on the M1 Max)
  • limiting RAM to 8GB is a marketing decision
  • if you don't have enough RAM, that doesn't mean the RAM isn't performing well, it means you don't have enough RAM

Using socketed RAM won't fix performance issues caused by running out of RAM; that problem is the same regardless. Only adding RAM will fix those performance issues, and Apple could just as easily make "special" RAM so you couldn't buy compatible socketed modules on the regular market anyway (e.g. they'd need a different memory standard due to Unified Memory).

I have hated Apple's memory pricing for decades now; it has always been way more expensive to add RAM to an Apple device at order time vs PC competitors (I still add my own RAM to laptops, but even at build time it's usually way cheaper through HP, Lenovo, etc. than through Apple). I'm not defending them here, I'm merely saying that the decision to use soldered RAM makes a lot of engineering sense, especially with the new Unified Memory architecture they're using in the M-series devices.

stoly ,

The Mac Mini is meant to be sort of the starter desktop. For higher-end uses, they want you on the Mac Studio, an iMac, or a Mac Pro.

FarraigePlaisteach ,

I assumed that the Mini was effectively a Mac without a monitor. Is it relatively underpowered too?

Thekingoflorda ,
@Thekingoflorda@lemmy.world avatar

As far as I understand, the plain "Mac" models (Mini, Studio, Pro) don't have screens, the iMacs are stationary and do have a screen, and the MacBooks are the laptops.

PrettyLights ,

It's not underpowered for average users, but it's not meant for professional uses beyond basic office work.

Similar to the Mini, they offer the Studio, which doesn't have a monitor built in:
https://www.apple.com/mac-mini/compare/?modelList=Mac-studio-2023,Mac-mini-M2

Then for higher-end uses they offer a more typical tower format: https://www.apple.com/mac-pro/

FarraigePlaisteach ,

But would an M1 Mini be similar to an M1 iMac?

egeres ,
@egeres@lemmy.world avatar

Why do they struggle so much with some "obvious things" sometimes? We wouldn't have a USB-C iPhone if the EU hadn't pressured them into making the switch.

TheSealStartedIt ,

💸💸💸

helenslunch ,
@helenslunch@feddit.nl avatar

They don't "struggle". They are intentional and malicious decisions meant to drive revenue, as they have been since the beginning.

Valmond ,

The eMac (looks like a toilet, sounds like a jet) came with 256 MB of RAM in one of its two slots. Adding a 512 MB stick was dirt cheap (everyone had at the very least 1 GB in their PC), well, dirt cheap unless you bought it from Apple...

It's how Apple monetizes their customers: figure out an artificial shortcoming they can sell the fix for as an upgrade (see dongles, for example).

dan ,
@dan@upvote.au avatar

They didn't have a reason to switch to USB-C, and several reasons to avoid it for as long as possible. Their old Lightning connector (and the big 30-pin connector that came before it) was proprietary, and companies had to pay a royalty to Apple for every port and connector they manufactured. They made a lot of money off of the royalties.

helenslunch ,
@helenslunch@feddit.nl avatar

I don't consider that an "admission" at all...

maxinstuff ,
@maxinstuff@lemmy.world avatar

Oh man, I remember so many people defending 8GB when the M1 first came out (and ever since).

I always argued it would significantly reduce the lifetime of these machines if you bought one, not just because you'd be swapping a lot more to the (soldered-in, BTW) SSD, but because after a few years of updates it would become unbearably slow, or the hardware would fail, or both.

Didn’t stop people constantly “tHe aRchITecTuRE iS cOmPlETelY diFFeRenT!!!”

Sure, it's different, but it's still just a computer. A technical person can still look at the spec sheet and calculate effective performance, accounting for bus widths etc.

Disclosure: I bought a top spec 16GB M1 Mac Air on launch and have been extremely happy with it - it’s still going strong.

uis , (edited )
@uis@lemm.ee avatar

Didn’t stop people constantly “tHe aRchITecTuRE iS cOmPlETelY diFFeRenT!!!”

Different Turing Machine on different math and alternative physics, I guess.

I bought a top spec 16GB M1 Mac Air on launch

My condolences.

EDIT: do people genuinely believe that math doesn't apply to Apple's products, or do they just not understand even such concentrated sarcasm?

Treczoks ,

I'd love to have 8GB of RAM. The SoC I'm working with has only 2K ;-)

AVincentInSpace ,

Come on, man, AVR chips aren't SoCs except in the technical sense.

Treczoks ,

Not an AVR, it's a small LPC from NXP. Chosen for the price, of course, but I have to somehow squeeze the software into it. At this point, even 8 KB would make me happy...

uis ,
@uis@lemm.ee avatar

NXP, fancy. I expected ST, AVR, nRF, WCH or some Chinese cheaptroller.

Why them? Something to do with NFC?

uis ,
@uis@lemm.ee avatar

Man, microcontrollers are what gave the term SoC its name.

Valmond ,

Bet your compiler isn't running on that hardware either ;-)

Treczoks ,

Luckily, no ;-)

Centaur ,

2 kilobytes?

Treczoks ,

Yes. 2 kilobytes. Coincidentally, this is as big as the display's internal buffer, so I cannot even keep a shadow copy of it in my RAM for the GUI.

uis ,
@uis@lemm.ee avatar

I've never seen a backbuffer called a shadow copy.

Treczoks ,

And I have never heard it called "backbuffer", so we are even.

uis ,
@uis@lemm.ee avatar

I guess so.

Example: https://www.khronos.org/opengl/wiki/Default_Framebuffer#Double_buffering

EDIT: Wait. Do you have a framebuffer at all? Because from the sounds of it, you might not have one. If you don't store the entire frame in RAM, then you don't have a framebuffer, not just a backbuffer.

Treczoks ,

I never said anything about framebuffers. The 256x64-pixel display with 16 brightness levels probably has something comparable inside. I just tell it that I want to update a rectangle, and send it some data for that.

uis ,
@uis@lemm.ee avatar

It should have.

Then, since you don't store the contents of the entire screen in memory, which simple math says you can't, I was partially wrong (depending on whether you count the buffer inside the display as a framebuffer) when I interpreted "shadow copy" as a backbuffer.
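Rough math, assuming the display packs its 16 brightness levels into 4 bits per pixel: 256 × 64 × 4 bits = 65,536 bits = 8,192 bytes, so a full shadow copy of the screen would need 8 KB against the 2 KB of RAM available.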

Centaur ,

Thanks for the clarification.

Nicoleism101 , (edited )

I have everything from Apple, but I know that 8GB is basically planned obsolescence in disguise.

We pay serious extra cash for just a notch more refined experience. However, I had to try buying every possible thing from Apple at least once in my life to see if it's worth it, and basically only the M4 iPad Pro 13 is truly worth the money and irreplaceable for me.

Everything else is nice for someone who is super lazy like me, but it can easily be replaced with cheaper shit without much difference.

uis ,
@uis@lemm.ee avatar

To be fair, there are only two reasons I hate it:

  1. People incorrectly use the term UMA
  2. It's crApple

On Linux, if you don't compile Rust or Firefox, 8GB is fine. 4GB is fine too.

ssebastianoo ,
@ssebastianoo@programming.dev avatar

I have a MacBook Air M2 with 8GB of RAM and I can even run Ollama. I've never had RAM problems; I don't get all the hate.

sverit OP ,

Which model with how many parameters do you use in Ollama? With 8GB you should only be able to use the smallest models, which is faaaar from ideal:

You should have at least 8 GB of RAM available to run the 7B models, 16 GB to run the 13B models, and 32 GB to run the 33B models.

ssebastianoo , (edited )
@ssebastianoo@programming.dev avatar

llama3:8b. I know it's "far from ideal", but only really specific use cases require more advanced models to run locally; if you do software development, graphic design, or video editing, 8GB is enough.

edit: just tried it again after some time and it works better than I remembered (showcase)
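For anyone curious, here's a minimal sketch of what querying that model locally looks like through Ollama's HTTP API (the endpoint and default port 11434 are Ollama's documented defaults; the helper function and prompt are just illustrative, not from this thread):

```python
# Minimal sketch: send one non-streaming request to a local Ollama server
# (default endpoint http://localhost:11434/api/generate) for the llama3:8b
# model mentioned above. Assumes `ollama serve` is running and the model
# has already been pulled.
import json
import urllib.request

def ask(prompt: str, model: str = "llama3:8b") -> str:
    """Return the model's reply for a single prompt."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode())["response"]

print(ask("In one sentence, why does a larger model need more RAM?"))
```

The quantized llama3:8b weights are roughly 4-5 GB, and they sit in memory alongside everything else, which is why the RAM guidance quoted above matters on an 8GB machine.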

yournamehere ,

Maybe in a browser using external resources. Open some Chrome tabs to feel the pain.
Apple is a joke.

ssebastianoo ,
@ssebastianoo@programming.dev avatar

here you are

VS Code + Photoshop + Illustrator + Discord + Arc + Chrome + screen recording, and still no lag

yournamehere ,

So not a single cool app, and yet you own a computer.

ssebastianoo ,
@ssebastianoo@programming.dev avatar

wtf does that mean

Muffi ,

Every time I compare specs to prices on Apple's website, I get irrationally angry.
