Even Apple finally admits that 8GB RAM isn't enough ( www.xda-developers.com )

There were a number of exciting announcements from Apple at WWDC 2024, from macOS Sequoia to Apple Intelligence. However, a subtle addition to Xcode 16 — the development environment for Apple platforms like iOS and macOS — stands out: a feature called Predictive Code Completion. Unfortunately, if you bought into Apple's claim that 8GB of unified memory was enough for base-model Apple silicon Macs, you won't be able to use it. Predictive Code Completion in Xcode 16 has a memory requirement, and it's the closest thing we'll get from Apple to an admission that 8GB of memory isn't really enough for a new Mac in 2024.

Evotech ,

8GB of dedicated VRAM is hardly enough these days...

arin ,

Especially with 4k

uis ,
@uis@lemm.ee avatar

4k rendering or 4k textures?

fishbone ,

This is my biggest lament about getting a 2060 without knowing how important vram is. I can make it perform better and more efficiently a bunch of different ways, but to my knowledge, I can't get around the 6GB vram wall.

_number8_ ,

imagine showing this post to someone in 1995

shit has gotten too bloated these days. i mean even in my head 8GB still sounds like 'a lot' of RAM and 16GB feels extravagant

rottingleaf ,

I still can't fully accept that 1GB is not normal, 2GB is not very good, and 4GB is not all you're ever gonna need.

If only it got bloated for some good reasons.

Aux ,

High quality content is the reason. Sit in a terminal and your memory usage will be low.

rottingleaf ,

256MB or 512MB was fine for high-quality content in 2002, so what was all that, then?

Suppose the number of pixels and everything else quadrupled - OK, then 2GB it is.

But 4GB being not enough? Do you realize what 4GB is?

Aux ,

One frame for a 4K monitor takes 33MB of memory. You need three of them for the triple buffering used back in 2002, so half of your 256MB went to simply displaying a bloody UI. But there's more! Today we use viewport composition, so the more apps you run, the more memory you need just to display the UI. That's what the OS uses to render the final result, but your apps will use additional memory for high-res icons, fonts, photos, videos, etc. 4GB today is nothing.
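The frame arithmetic above is easy to check. A quick Python sketch (my own illustration, not from the comment; the 33MB figure comes out in decimal megabytes):

```python
# One uncompressed 4K frame at 4 bytes per pixel (RGBA8).
def frame_bytes(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel

one_4k = frame_bytes(3840, 2160)
print(one_4k / 10**6)       # ~33.2 decimal MB per frame
print(3 * one_4k / 10**6)   # ~99.5 MB for triple buffering
```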

I can tell you an anecdote. My partner was making a set of photo collages, about seven artworks to be printed in large format (think 5m+ per side). Those 7 photo collages, with their source material saved on an external drive, took 500 gigs. Tell me more about 256MB, lol.

rottingleaf ,

Yes, you wouldn't have 4K in 2002.

4GB today is nothing.

My normal usage would be kinda strained with it, but possible.

$ free -h
               total        used        free      shared  buff/cache   available
Mem:            17Gi       3,1Gi        11Gi       322Mi       3,0Gi        14Gi
Swap:          2,0Gi          0B       2,0Gi
$ 
Aux ,

I can do a cold boot and show you empty RAM as well. So fucking what?

rottingleaf ,

It's not a cold boot and it's not empty.

lastweakness ,

They didn't just quadruple. They're orders of magnitude higher these days. So content is a real thing.

But that's not what's actually being discussed here, memory usage these days is much more of a problem caused by bad practices rather than just content.

rottingleaf ,

I know. BTW, if something is done in a way that's an order of magnitude less efficient than it could be, one might consider it the result of an intentional policy aimed at neutering development. Just not clear whose. There are fewer corporations affecting this than big governments, and those are capable of reaching consensus from time to time. So not a conspiracy theory.

lastweakness ,

So we're just going to ignore stuff like Electron, unoptimized assets, etc... Basically every other known problem... Yeah let's just ignore all that

Aux ,

Is Electron that bad? Really? I have Slack open right now with two servers and it takes around 350MB of RAM. Not that bad, considering that every other colleague thinks that posting dumb shit GIFs into work chats is cool. That's definitely nowhere close to Firefox, Chrome and WebStorm eating multiple gigs each.

lastweakness ,

Yes, it really is that bad. 350 MBs of RAM for something that could otherwise have taken less than 100? That isn't bad to you? And also, it's not just RAM. It's every resource, including CPU, which is especially bad with Electron.

I don't really mind Electron myself because I have enough resources. But pretending the lack of optimization isn't a real problem is just not right.

Aux ,

First of all, 350MB is a drop in the bucket. But what's more important is performance, because it affects things like power consumption, carbon emissions, etc. I'd rather see Slack "eating" one gig of RAM and running smoothly on a single E core below boost clocks with pretty much zero CPU use. That's the whole point of having fast memory - so you can cache and pre-render as much as possible and let it rest statically in memory.

Verat , (edited )

When (according to about:unloads) my average Firefox tab is 70-230MB depending on what it is and how old the tab is (YouTube tabs, for example, bloat up the longer they're open), a chat app using over 350 is a pretty big deal

Just checked: my Firefox is using 4.5GB of RAM, while Telegram is using 2.3, and that's while minimized to the system tray. Granted, Telegram doesn't use Electron, but this is a trend across lots of programs, and Electron is a big enough offender that I avoid apps that use it. When I get off shift I can launch Discord and check it too, but it's usually bad enough that I close it entirely when not in use

Aux ,

Telegram is using only 66 megs here. Again - it's about content.

Shadywack ,
@Shadywack@lemmy.world avatar
AnxiousOtter ,

Well that other guy said it's only 66megs so, you're wrong.

/s

Aux ,

Again - content.

rottingleaf ,

If a program keeps in RAM all the things you are not currently, like right now, displaying or editing, then its author shouldn't be in the profession. That also applies to the portions of a huge text file or a huge image you aren't touching right now.

EDIT: Thankfully people writing video players usually understand this.
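The video-player point generalizes: memory-mapping lets a program work with a huge file without holding it all in RAM, because the OS pages in only the bytes actually read. A minimal Python sketch (the file name and sizes are made up for illustration):

```python
import mmap
import os
import tempfile

# Create a sparse ~100 MiB file to stand in for a huge log.
path = os.path.join(tempfile.mkdtemp(), "huge.log")
with open(path, "wb") as f:
    f.seek(100 * 2**20 - 1)
    f.write(b"\0")

# Map it and read a single 4 KiB window; only those pages get
# faulted in, not the whole 100 MiB.
with open(path, "rb") as f:
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    offset = 50 * 2**20
    window = mm[offset:offset + 4096]
    mm.close()

print(len(window))  # 4096
```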

Shadywack ,
@Shadywack@lemmy.world avatar

Again, you're wrong.

Aux ,

Ahaha, ok!

lastweakness ,

CPU usage is famously terrible with Electron, which I also pointed out in the comment you're replying to. But yes, having a separate Chromium instance running for each "app" is terrible

Aux ,

No, it's not.

lastweakness ,

... Okay?

Shadywack ,
@Shadywack@lemmy.world avatar

Yes it is.

"iT'S oNLy a FeW hUnDrED MB oF LiBRAriES and BiNAriES pEr aPp, iT'S oNLy dOuBLe oR tRiPLe tHe RAM, DiSk, anD cpU uSAgE"

Then we have the fucking shit show of 6-8GB of RAM used just by booting the fucking machine. Chromium/Webkit is practically an OS by itself for all the I/O, media handling, and built in libraries upon libraries of shit. Let's run that whole entire stack for all these electron apps, and then fragment each one independent of each other (hello Discord, who used Electron 12 for WAY too long) then say "bUt iT's pORtaBLe!".

Yes, it isn't just terrible, it's fucking obnoxiously and horrendously terrible, like we grabbed defeat from the jaws of victory terrible, and moronically insipid. Optimization in the fucking trash can and a fire hydrant in all our fucking assholes, terrible. That's HOW terrible it actually is, so you're wrong.

Aux ,

RAM usage doesn't matter in the slightest.

Shadywack ,
@Shadywack@lemmy.world avatar

[Thread, post or comment was deleted by the moderator]

    Shadywack ,
    @Shadywack@lemmy.world avatar

    [Thread, post or comment was deleted by the moderator]

    AnxiousOtter ,

    Great rebuttal. Really got him with this one.

    Aux ,

    Do you really want me to go into the details of how JIT works in V8 and which Electron APIs allow the apps to idle correctly?

    nossaquesapao ,

    First of all, 350MB is a drop in the bucket

    People don't run just a single app on their machines. If we triple the RAM usage of several apps, it adds up to a massive increase. That's how bloat happens: it's a cumulative increase in everything. Analyzed as single cases, none of them look that bad individually, but the end result is the need for a constant, fast increase in hardware resources.

    Aux ,

    People don’t run just a single app in their machines

    That's not bloat, that's people running more apps than ever.

    the end result is the necessity for a constant and fast increase in hardware resources.

    That's not true. Machines with 8 to 16GB of RAM became common in the early 2010s, and barely anyone is using 32 gigs today. Even if we look at the most recent Steam Hardware & Software Survey, we see that even gamers are pretty much stuck at 16 gigs. 32 gigs are installed in less than 30% of machines, and more than that is barely 4%. Ten years ago, 8 gigs was the most common option, with 12+ gigs (Steam didn't have a 16-gig category in 2014) in third place. The switch to 16 gigs as number one happened in December 2019, so we're five years into 16 gigs being the most common option, and more RAM is nowhere close to replacing it (47.08% for 16 gigs and 28.72% for 32 gigs as of May 2024).

    Now if you look at the late '90s and 2000s, you'll see that RAM was doubling pretty much every 2-3 years. We can look at Steam data once again. Back in 2008 (the earliest data available on archive.org), 2 gigs was the most common option. The next year, the 3-gig option got very close and sat in 2nd place. In 2010, 2GB, 3GB and 4GB were splitting hairs. The 4GB option became the most common in 2011, with the 3GB variant a very close 2nd. The 5GB option became king in 2012. And the very next year, 8 gigs became the norm.

    So, 2 gigs in 2008, 4 gigs in 2011 and 8 gigs in 2013. You can check historical data yourself here https://web.archive.org/web/20130915000000*/http://store.steampowered.com/hwsurvey/

    nossaquesapao ,

    That’s not bloat, that’s people running more apps than ever.

    Not necessarily. People were writing text documents while looking up references on the internet, listening to music, and chatting with friends at the same time in 2010, and even earlier, but the same use case (office suite + browser + music player + chat app) takes far more resources today, with only a small increase in usability and features.

    Bloat is a complicated thing to discuss, because there's no hard definition of it, and each person thinks about it in a different way, so what one person considers bloat another may not, and we end up talking about different things. You're right that hardware resources have been increasing at a slower rate, and that may force some more optimization, but a lot of software is still getting heavier without bringing new functionality.

    Aux ,

    The software is getting heavier because of content, not code. Again, we can look at games. Take some old games like GTA V or Skyrim; they will fly on modern high-end machines! Now add mods with 8K textures, higher-definition models, HDR support, etc., and these old games will bend over your RTX 4090.

    nossaquesapao ,

    Content is also getting heavier, but both things aren't mutually exclusive. It's more objective to compare modern software, instead of older and newer ones. Before reddit created obstacles for third-party apps, they were famous for being much lighter than the official one, while doing the same (some even had more features). Now, if we compare lemmy to reddit, it's also much lighter, while providing a very similar experience. Telegram has a desktop app that does everything the web version does, and more, while lighter on resources. Most linux distros will work fine with far less hardware resources than windows. If you install lineageos on an older phone, it will perform better than the stock rom, even while using a newer aosp version. If you play a video on youtube, and the same one on vlc, vlc will do the same with less resources. If you use most sites with and without content blockers, the second one will be lighter, while not losing anything important.

    I could go on and on, but that's enough examples. There is a bloat component to software getting heavier, and not everything can be explained by heavier content and more features.

    jas0n ,

    Just wanted to point out that the number 1 performance blocker in the CPU is memory. In the general case, if you're wasting memory, you're wasting CPU. These two things really cannot be talked about in isolation.

    Aux ,

    No, that's the other way round. You either have high CPU load and low memory, or low CPU load and high memory.

    jas0n ,

    I'm not sure what metric you're using to determine this. The bottom line is, if you're trying to get the CPU to really fly, using memory efficiently is just as important (if not more) than the actual instructions you send to it. The reason for this is the high latency required to go out to external memory. This is performance 101.
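The latency point can be felt even from a high-level language: the same million additions run at different speeds depending only on the order memory is walked. A rough Python sketch of my own (in C the gap is far larger, since interpreter overhead doesn't drown it out there):

```python
import time

N = 1000
grid = [[1] * N for _ in range(N)]  # N x N table of ones

def sum_row_major(g):
    # Walks each inner list sequentially: cache-friendly.
    return sum(g[i][j] for i in range(N) for j in range(N))

def sum_col_major(g):
    # Hops between inner lists on every step: cache-hostile.
    return sum(g[i][j] for j in range(N) for i in range(N))

t0 = time.perf_counter()
rows = sum_row_major(grid)
t1 = time.perf_counter()
cols = sum_col_major(grid)
t2 = time.perf_counter()

assert rows == cols == N * N  # identical work, identical result
print(f"row-major: {t1 - t0:.3f}s, col-major: {t2 - t1:.3f}s")
```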

    nossaquesapao ,

    It sure is. I'm running ferdium at this very moment with 3 chat apps open, and it consumes almost a gigabyte for something that could take just a few megabytes.

    Jakeroxs ,

    What's wrong with using Gifs in work chat lmao, can laugh or smile while hating your job like the rest of us.

    Aux ,

    Get a better job.

    Jakeroxs ,
    Honytawk ,

    The moment you use a file that is bigger than 1GB, that computer will explode.

    Some of us do more than just browse Lemmy.

    rottingleaf ,

    Wow. Have you ever considered how people worked with files bigger than the total RAM they had, back in the normal days of computing?

    So in your opinion, if you have a 2GB+ log file, editing it should keep 2GB of RAM occupied?

    I just have no words, the ignorance.

    SpaceCadet , (edited )
    @SpaceCadet@feddit.nl avatar

    I remember when I got my first computer with 1GB of RAM, where my previous computer had 64MB, later upgraded to 192MB. And there were only like 3 or 4 years in between them.

    It was like: holy shit, now I can put all the things in RAM. I will never run out.

    cyberpunk007 ,

    I chalk it up to lazy rushed development. Good code is art.

    Aux ,

    That's not true at all. The code doesn't take much space. The content does. Your high quality high res photos, 4K HDR videos, lossless 96kHz audio, etc.

    cyberpunk007 ,

    But there are lots of shortcuts now. Asset packs and coding environments that come bundled with all kinds of things you don't need. People import packages that consume a lot of space to use one tiny piece of it.

    To be clear, I'm not talking about videos and images. You'd have these either way.

    Aux ,

    All these packages don't take much memory. Also, tree shaking is a thing. For example, one of the projects I currently work on has over 5 gigs of dependencies, but once I compile it for production, the whole code base is a mere 3 megs, and that's including inlined styles and icons. The code itself is pretty much non-existent.

    On the other hand, I have 100KB of text translations for the English language alone, because there's shitloads of text. And over 100MB of images, which are part of the build. And then there's remote storage with gigabytes of documents.

    Even if I double the code base by copy-pasting, it will be a drop in the bucket.

    Bjornir ,

    I have a VPS that uses 1GB of RAM; it has 6-7 apps running in Docker containers, which isn't the most RAM-efficient way of running apps.

    A light OS really helps. Plus, the web browser, the most-used app that eats a lot of RAM, actually reduces its consumption when needed and uses more when memory is free. On one computer I have Chrome running with some hundreds of MB used, instead of the usual GBs, because RAM is running out.

    So it appears that memory is full, but you can actually have a bit more memory available that is "hidden"

    derpgon ,

    Same here. When idle, the apps basically consume nothing. If it's just a webserver that calls some PHP script, it takes basically no RAM at all when idle, and some RAM when actually used.

    Websites and phone apps are such unoptimized pieces of garbage that they are the sole reason for high RAM requirements. That, and lots of background bloatware.

    Specal ,

    This is resource reservation, and it happens at the OS level. If Chrome appears to be using a lot of RAM, that memory will be freed up once either the OS or another application requires it.

    It just exists so that an application knows that if it needs that resource, it can use X amount for now.
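On Linux, this distinction shows up directly in /proc/meminfo: MemAvailable counts reclaimable page cache that MemFree doesn't, which is the "hidden" headroom described above. A Linux-only Python sketch:

```python
# Linux-specific: parse /proc/meminfo into a dict of kB values.
def meminfo():
    fields = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, rest = line.split(":")
            fields[key.strip()] = int(rest.split()[0])  # value in kB
    return fields

m = meminfo()
# "free" is what nothing touches; "available" adds cache the OS can reclaim.
print(f"MemFree: {m['MemFree'] // 1024} MiB")
print(f"MemAvailable: {m['MemAvailable'] // 1024} MiB")
```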

    Aux ,

    You can always switch to a text-based terminal and free up your memory. Just don't complain that YouTube doesn't play 4K videos anymore.

    SpaceCadet ,
    @SpaceCadet@feddit.nl avatar

    Just don’t complain that YouTube doesn’t play 4K videos anymore.

    strange, mpv handles it just fine

    Aux ,

    MPV doesn't work in terminal (well, technically it does, but what's the point of 4K HDR video in ASCII mode?). Please don't confuse terminal emulator in GUI mode with a real text mode terminal.

    SpaceCadet ,
    @SpaceCadet@feddit.nl avatar

    The point is that your example use case of "YouTube 4k videos" doesn't need a browser full of bloated js garbage.

    Aux ,

    The point is that MPV will use shitloads of memory too.

    SpaceCadet , (edited )
    @SpaceCadet@feddit.nl avatar

    Actually a lot less than the browser. Under 300MB, I just checked, and that's mostly just the network buffer, which is 150MB by default.

    Aux ,

    That's about what my Slack is using, while being written in Electron, lol. Oh, you people...

    uis ,
    @uis@lemm.ee avatar

    KMSDRM is in terminal enough for me. Fbcon too.

    EDIT: obviously not a dumb terminal over UART or the like.

    stoly ,

    You just have to watch your favorite tablet get slower year after year to understand that a lot of this is artificial. They could make applications that don't need those resources but would never do so.

    Shadywack ,
    @Shadywack@lemmy.world avatar

    We measure success by how many GBs we have consumed when the only keys pressed from power-on to desktop are for our password. This shit right here is the real issue.

    mycodesucks ,
    @mycodesucks@lemmy.world avatar

    Absolutely.

    Bad, rushed software that wires together 200 different giant libraries just to use a fraction of them, then runs in a sandboxed container with three daemons it needs for some reason, doesn't mean "8GB isn't enough". It means write tighter, better software.

    AnxiousOtter ,

    That ship has long sailed unfortunately. The industry gave up on optimization in favour of praying that hardware advancements can keep up with the bloat.

    jas0n ,

    Guy from '95: "I bet it's lightning fast though..."

    No dude. It peaks pretty soon. In my time, Microsoft is touting a chat program that starts in under 10 seconds. And they're genuinely proud of it.

    uis ,
    @uis@lemm.ee avatar

    And latency is more shit than it ever was

    qqq , (edited )

    I once went for lower-CAS-timing 2x 128MB RAM sticks (256MB) instead of 2x 256s with slower timings, because I thought 512MB was insane overkill. I realized how wrong I was trying to play the Star Wars Galaxies MMORPG: when a lot of people were on screen, it started swapping to disk. Look up the specs of an IBM Aptiva, the first computer my parents bought, and you'll understand how 512MB could seem like a lot.

    Now my current computer has 64GB (most gaming computers went for 32GB at the time I built it). My workstation at work has 128GB, which really isn't even enough for some workloads we have that use a lot of in-memory cache. And large servers can have multiple TB of RAM. My mind has been blown multiple times.

    Hux ,

    This isn’t a big deal.

    If you’re developing in Xcode, you did not buy an 8GB Mac in the last 10 years.

    If you are just using your Mac for Facebook and email, I don’t think you know what RAM is.

    If you know what RAM is, and you bought an 8GB Mac in the last 10 years, then you are likely self-aware of your limited demands and/or made an informed compromise.

    filister ,

    If you know what RAM is, and you bought an 8GB Mac in the last 10 years, then you are likely self-aware of your limited demands and/or made an informed compromise.

    Or you simply refuse to pay $200+ to get a proper machine. Like, seriously, 8GB Macs should have disappeared long ago, but nope, Apple sticks with them as part of the planned-obsolescence tactics on their hardware, stubbornly refusing to admit that releasing a MacBook with a soldered 8GB of RAM in 2023 is wholly inadequate.

    Specal ,

    I get around this by simply not buying a Mac. Frees up so much money for RAM.

    sverit OP ,

    Yeah, the 8GB model's purpose is to make a "starting at $xxxx" price tag possible.

    DJDarren ,
    @DJDarren@thelemmy.club avatar

    I’m not gonna stand up and declare that 8gb is absolutely fine, because in very short order it won’t be. But yeah, currently for an average use case, it is.

    My work Mac mini has 8gb. It’s a 2014 so can’t be upgraded, but for the tasks I ask of it it’s ok. Sure, it gets sluggish if I’m using the Win11 VM I sometimes need, but generally I don’t really have any issues doing regular office tasks.

    That said, I sometimes get a bee in my bonnet about it, so I open Activity Monitor to see what it’s doing, and am shocked by how much RAM some websites consume in open tabs in Safari.

    8gb is generally ok on low end gear, but devs are working very hard to ensure that it’s not.

    stoly ,

    Funny: knowing that you only get one shot, I bought 32GB of RAM for my Mac Mini like 1.5 years ago. I figured that it gave me the best shot of keeping it usable past 5 years.

    poorlytunedAstring ,

    For the record, on Windows 10, I'm using 9GB (rounded up from 8.something) to run Firefox and look at this website, can't forget Discord inviting itself to my party in the background, and the OS. I had to close tabs to get down here. Streams really eat the RAM up.

    Throw a game in there, with FF open for advice and Discord running for all the usual gaming reasons, and yeah, way over.

    Notice I haven't even touched any productivity stuff that demands more.

    8? Eat a penis, Apple. Fuckin clown hardware.

    howlingecko ,

    Also for the record, I have experienced an 8GB Mac Mini run Firefox with at least 20 tabs, Jetbrains Rider with code open and editable, Jetbrains DataGrip with queries, somehow Microsoft Teams, MS Outlook and didn’t seem to have a problem. Was also able to share the screen on a Teams call and switch between the applications without lag.

    Windows OS couldn’t handle your application load? Eat a penis, Microsoft. Fucking clown memory management.

    Telodzrum ,

    MacOS’s memory scheduler is leaps and bounds better than what Windows uses. It’s more apt to compare the RAM on a machine running MacOS to one running a common Linux distro. Windows needs more RAM than the other two by two to three times because it’s fuckterrible at using it.

    SpeedLimit55 ,

    8GB is definitely not enough for coding, gaming, or most creative work but it’s fine for basic office/school work or entertainment. Heck my M1 Macbook Air is even good with basic Photoshop/Illustrator work and light AV editing. I certainly prefer my PC laptop with 32GB and a dedicated GPU but its power adapter weighs more than a Macbook Air.

    cmnybo ,

    8GB would be fine for basic use if it was upgradable. With soldered RAM the laptop becomes e-waste when 8GB is no longer enough.

    slaacaa ,

    Yeah, the soldering is outrageous. I miss the time when Apple was a (more) customer friendly company. I could open my Mac mini 2009 and just add more RAM, which I did.

    DJDarren ,
    @DJDarren@thelemmy.club avatar

    When I bought my first MacBook in ‘07 I asked the guy in the store about upgrading the RAM. He told me that what Apple charged was outrageous and pointed me to a website where I’d get what I needed for much less.

    I feel that if Apple could have soldered the RAM back then, they would have.

    boonhet ,

    I feel that if Apple could have soldered the RAM back then, they would have.

    Apple used to ship repair and upgrade kits with guides on how to apply them. Not sure they were as anti-repair then as they are now.

    barsquid ,

    Embrace, extend, extinguish is an attitude for more than one company I guess.

    cheddar ,
    @cheddar@programming.dev avatar

    8GB is definitely not enough for coding, gaming, or most creative work but it’s fine for basic office/school work or entertainment.

    The thing is, basic office/school/work tasks can be done on any laptop that costs half as much as an 8GB MacBook.

    SpeedLimit55 ,

    This is true for part-time or casual use, but for all-day work use, including travel, you get better build quality and far fewer problems with a pro-grade machine. We spend the same on a MacBook, ThinkPad, Surface or ProBook for our basic full-time users.

    While it may be a bit overkill for someone who spends their day in Word, Excel, Chrome and Zoom, we save money in the long term due to reliability. There is far less downtime and IT time spent on each user over the life of the system (3-4 years). The same is true of higher-quality computer accessories.

    Specal ,

    I mean I develop software on an 8GB laptop. Most of the time it's fine, when I need more I have a desktop with 128GB ram available.

    It really depends what type of software you're making. If you're using Python, a few TB might be required.

    small44 , (edited )

    For who? My mother, who only uses Facebook, YouTube and Google, doesn't need 8GB.

    ABCDE ,

    I don't know what Xcode is so yeah, I haven't been found wanting with my 8GB M2. Videos, downloading, web browsing, writing, chat applications, some photo editing, games (what I can actually play on a Mac, anyway), all good here.

    16GB+ is obviously going to be necessary though, and not exactly that expensive to put into their base models so it should be put in soon.

    Glowstick ,

    Sounds like all she needs is a dirt cheap chromebook then

    small44 ,

    That's what she has

    Glowstick ,

    Then her situation isn't applicable to this topic

    cybersandwich ,

    This comment chain made me chuckle. It's such an "internet comment section" ..trope? I don't know the right word.

    QuarterSwede ,
    @QuarterSwede@lemmy.world avatar

    It’s a straw man argument that doesn’t address the main point the commenter was making.

    disguy_ovahea ,

    That all depends on how much work they want to put into troubleshooting it for her. I got my mom a Mac Mini when her PC needed to be replaced. It’s way less responsibility on my part. I mostly just answer the occasional how-to.

    Glowstick ,

    Mac is easier than Windows, sure, but not easier than a chromebook. Nothing is simpler than a Chromebook. You can do much more with a Mac, but a chromebook is much easier.

    iopq ,

    I had a laptop with 8GB. Doing one of those things at a time was fine, but when you opened up another program it took forever to switch back to the browser

    Petter1 ,

    And then you have to activate Linux app support for a thing she needs that can't be done on a Chromebook, and suddenly it's more complicated than macOS?

    JeeBaiChow ,

    They should do 4Gb. I hear M3 Macs make it seem like 8Gb.

    disguy_ovahea ,

    I think you mean gigabytes, not gigabits.

    8 Gb = 1 GB

    Aatube ,
    @Aatube@kbin.melroy.org avatar

    Darn you broadband

    disguy_ovahea ,

    That’s true. Data transmission is usually measured in bits, not bytes. Gigabit Ethernet can only transmit a maximum of ~125 MB/s.
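The conversion is just a divide-by-8, using decimal units since that's how link rates are quoted. A small sketch for illustration:

```python
# Link rates are decimal: 1 Gbit/s = 10**9 bits per second.
def gbps_to_mb_per_s(gbps):
    return gbps * 10**9 / 8 / 10**6  # bytes/s, expressed in decimal MB/s

print(gbps_to_mb_per_s(1))  # 125.0 MB/s for Gigabit Ethernet, before overhead
```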

    JeeBaiChow ,

    You mean they can even make 0.5GB appear as 8GB?! That's 16x! That apple silicon is just something else!

    Petter1 ,

    8Gb from 4Gb is 1GB from 0.5GB 😉

    rockSlayer , (edited )

    If you allocate it right, you can add 200GB of swap space and then that 4GB of RAM will feel like 408GB!

    ryannathans ,

    Wake me up when phones have enough ram to run good voice to text engines on the phone itself

    sunzu ,

    12 aint enough?

    ryannathans ,

    Think you need 16

    can ,

    How much do you need?

    ryannathans ,

    Last check about 10GB for the model. Anything less doesn't translate my voice to text accurately

    can ,

    What app are you using?

    Aatube ,
    @Aatube@kbin.melroy.org avatar

    We already have that since iOS 15 if you have a phone that released after the iPhone X. It's time to become woke, sheeple.

    can ,

    So 8gb?

    Aatube ,
    @Aatube@kbin.melroy.org avatar

    3GB, actually. That was on the iPhone XR, which is basically the only budget iPhone Apple has made.

    Jtee ,
    @Jtee@lemmy.world avatar

    And now all the fan boys and girls will go out and buy another MacBook. That's planned obsolescence for ya

    mp3 ,
    @mp3@lemmy.ca avatar

    And why they solder the RAM, or even worse make it part of the SoC.

    cm0002 ,

    BUT BUT you'll get 5% fasTEr SpeED!!! And MOrE seCuRiTy!!!

    umami_wasbi ,

    Well, the claim they made still holds true, despite how much I dislike this design choice. It is faster, and more secure (though attacks on NAND chips are hard and require high skill levels that most attackers won't possess).

    And add one more: it saves power by using LPDDR5 rather than DDR5. For a laptop, where battery life matters a lot, I agree that's important. However, I have no idea how much standby or active time is gained by using LPDDR5.

    rockSlayer ,

    There are real world performance benefits to ram being as close as possible to the CPU, so it's not entirely without merit. But that's what CAMM modules are for.

    akilou ,

    But do those benefits outweigh doubling or tripling the amount of RAM by simply inserting another stick that you can buy for dozens of dollars?

    rockSlayer ,

    That's extremely dependent on the use case, but in my opinion, generally no. However CAMM has been released as an official JEDEC interface and does a good job at being a middle ground between repairability and speed.

    halcyoncmdr ,
    @halcyoncmdr@lemmy.world avatar

    It's an officially recognized spec, so Apple will ignore it as long as they can. Until they can find a way to make money from it or spin marketing as if it's some miraculous new invention of theirs, for something that should just be how it's done.

    umami_wasbi ,

    Parts pairing will do. That's what Apple is known for: kneecapping consumer rights.

    gravitas_deficiency ,

    It’s highly dependent on the application.

    For instance, I could absolutely see having certain models with LPCAMM expandability as a great move for Apple, particularly in the pro segment, so they’re not capped by whatever they can cram into their monolithic SoCs. But for most consumer (that is, non-engineer/non-developer users) applications, I don’t see them making it expandable.

    Or more succinctly: they should absolutely put LPCAMM in the next generation of MBPs, in my opinion.

    BorgDrone ,

    Yes, there are massive advantages. It’s basically what makes unified memory possible on modern Macs. Especially with all the interest in AI nowadays, you really don’t want a machine with a discrete GPU/VRAM, a discrete NPU, etc.

    Take for example a modern high-end PC with an RTX 4090. Those only have 24GB VRAM and that VRAM is only accessible through the (relatively slow) PCIe bus. AI models can get really big, and 24GB can be too little for the bigger models. You can spec an M2 Ultra with 192GB RAM and almost all of it is accessible by the GPU directly. Even better, the GPU can access that without any need for copying data back and forth over the PCIe bus, so literally 0 overhead.

    The advantages of this multiply when you have more dedicated silicon. For example: if you have an NPU, that can use the same memory pool and access the same shared data as the CPU and GPU with no overhead. The M series also have dedicated video encoder/decoder hardware, which again can access the unified memory with zero overhead.

For example: you could have an application that replaces the background on a video using AI. It takes a video and decompresses it using the video decoder; the decompressed frames are immediately available to all other components. The GPU can then pre-process the frames, the NPU can use the processed frames as input to an AI model to generate new frames, and the video encoder can immediately access that result and compress it into a new video file.

    The overhead of just copying data for such an operation on a system with non-unified memory would be huge. That’s why I think that the AI revolution is going to be one of the driving factors in killing systems with non-unified memory architectures, at least for end-user devices.
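A rough estimate of how "huge" that copy overhead gets for the background-replacement pipeline above. The hop count and uncompressed frame format are my own illustrative assumptions, not a real profile of any system:

```python
# Estimate the bus traffic a non-unified pipeline would generate for the
# video background-replacement example. Hops and format are assumptions.
WIDTH, HEIGHT, BYTES_PER_PIXEL = 3840, 2160, 4   # uncompressed 4K RGBA
FPS = 60
HOPS = 3  # decoder -> GPU, GPU -> NPU, NPU -> encoder

frame_mb = WIDTH * HEIGHT * BYTES_PER_PIXEL / 1e6
traffic_gb_s = frame_mb * FPS * HOPS / 1e3
print(f"One 4K frame: ~{frame_mb:.1f} MB")
print(f"Copy traffic at {FPS} fps over {HOPS} hops: ~{traffic_gb_s:.1f} GB/s")
```

That's several GB/s of bus traffic doing nothing but shuffling the same pixels around, which unified memory avoids entirely.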

    dustyData ,

    Bus goes Vrrrroom vrrooom. Fuck AI.

    neo2478 ,

    That’s a fantastic explanation! Thank you!

    vaultdweller013 ,
    @vaultdweller013@sh.itjust.works avatar

I feel like this is an argument for new specialized computers at best. At worst it shows that this AI crap is even more harmful to the end user.

    FarraigePlaisteach ,

And even if the out-of-the-box RAM is soldered to the machine, it should still be possible to add supplementary RAM that isn't soldered for when the system demands it. Other computers have worked like this in the past, with RAM soldered on board plus a socket for adding more.

    TheGrandNagus ,

    Apple's SoC long predates CAMM.

    Dell first showed off CAMM in 2022, and it only became JEDEC standardised in December 2023.

    That said, if Dell can create a really good memory standard and get JEDEC to make it an industry standard, so can Apple. They just chose not to.

    balder1991 ,

In this particular case the RAM is part of the chip package as an attempt to squeeze out more performance. Processors have become very fast, but that speed is wasted if the rest of the components can't keep up. The traditional memory architecture has become a bottleneck, the same way HDDs were before the introduction of SSDs.

    You’ll see this same trend extend to Windows laptops as they shift to Snapdragon processors too.
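For a sense of the gap being closed: comparing nominal peak bandwidth of a typical socketed laptop configuration against Apple's stated figure for the M2 Ultra. Both numbers are theoretical peaks; real-world throughput is lower for each.

```python
# Peak memory bandwidth: socketed dual-channel DDR5 vs. on-package unified
# memory. Nominal figures only; the DDR5 config chosen is an assumption.
ddr5_sodimm_gbs = 2 * 8 * 5600 / 1000   # 2 channels * 8 bytes/transfer * 5600 MT/s
m2_ultra_gbs = 800.0                    # Apple's stated M2 Ultra bandwidth

print(f"Dual-channel DDR5-5600 SO-DIMMs: ~{ddr5_sodimm_gbs:.1f} GB/s")
print(f"M2 Ultra unified memory: ~{m2_ultra_gbs:.0f} GB/s "
      f"(~{m2_ultra_gbs / ddr5_sodimm_gbs:.0f}x)")
```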

    stoly ,

    People do like to downplay this, but SoC is the future. There's no way to get performance over a system bus anymore.

    helenslunch ,
    @helenslunch@feddit.nl avatar

    There is. It's called CAMM.

    stoly ,

    Funny that within one minute, they state the exact same problem.

    helenslunch ,
    @helenslunch@feddit.nl avatar

    If you actually watch past the first minute of the video, they explain that LPCAMM solves that problem...

    bamboo ,

    Someone who is buying a MacBook with the minimum specs probably isn’t the same person that’s going to run out and buy another one to get one specific feature in Xcode. Not trying to defend Apple here, but if you were a developer who would care about this, you probably would have paid for the upgrade when you bought it in the first place (or couldn’t afford it then or now).

    TheGrandNagus ,

    Well no, not this specific scenario, because of course devs will generally buy machines with more RAM.

    But there are definitely people who will buy an 8GB Apple laptop, run into performance issues, then think "oh I must need to buy a new MacBook".

    If Apple didn't purposely manufacture ewaste-tier 8GB laptops, that would be minimised.

    narc0tic_bird ,
    @narc0tic_bird@lemm.ee avatar

    I wouldn't be so sure. I feel like many people would not buy another MacBook if it were to feel a lot slower after just a few years.

    This feels like short term gains vs. long term reputation.

    Mongostein ,

    And the apple haters will keep making this exact same comment on every post using their 3rd laptop in ten years while I’m still using my 2014 MacBook daily with no issues.

    Be more original.

    Jtee ,
    @Jtee@lemmy.world avatar

Nice attempt to justify planned obsolescence. To think Apple hasn't done this time and time again, you'd have to be a fool.

    Mongostein ,

    👍

    -posted from my ten year old MacBook which shows no need for replacement

    Jtee ,
    @Jtee@lemmy.world avatar

And is what, 3 or 4 operating systems behind due to it being obsolete?

    Honytawk ,

    At which point did Apple decide your MacBook was too old to be usable and stop giving updates or allow new software to run on it?

    Mongostein , (edited )

    Still gets security updates. All the software I need to run on it runs on it.

    My email, desktop, and calendar all still sync with my newer desktop. I can still play StarCraft. I can join zoom meetings while running Roll 20. I can even run Premiere and do video editing… to a point.

    I guess if you need the latest and greatest then you might have a point, but I don’t.

    This whole thread is bitching about software bloat and Apple does that to stop the software bloat on older machines, but noooo that’s planned obsolescence. 🙄

    stoly ,

    This is pretty much it. People really just want to find reasons to hate Apple over the past 2 - 3 years. You're right, though, your Mac can run easily for 10+ years. You're good basically until the web browsers no longer support your OS version, which is more in the 12-15 year range.

    theneverfox ,
    @theneverfox@pawb.social avatar

In fairness, most computers built from around 2014-2016 onward last way longer, since performance started to level off not long after that. After all, devs write software for what people have; if everyone had 128 gigs of RAM we'd load everything we could think of into memory and you'd need it to keep up.

    Macs did have some incredible build quality though; the newer ones aren't holding up anywhere close to as well. I'm still using a couple of 2012 Macs to play videos. It's slow as hell when you interact, but once the video is playing it still looks and sounds good.

    helenslunch ,
    @helenslunch@feddit.nl avatar

    They will keep making the same comment as long as it keeps being true.

    • Typed from my 2009 ThinkPad

    Meanwhile your 2014 MacBook stopped receiving OS updates 3 years ago.

    Mongostein ,

    Weren’t you just complaining about software bloat?

    helenslunch ,
    @helenslunch@feddit.nl avatar

    ...no?

    Honytawk ,

    I still have a fully functioning Windows 95 machine.

    My daily driver desktop is also from around 2014.

    Mongostein ,

    That’s pretty sick actually

    Lucidlethargy ,
    @Lucidlethargy@sh.itjust.works avatar

    These were obsolete the minute they were made, though... So it's not really planned obsolescence. I got one for free (MacBook Air), and it's always been trash.

    bamboo ,

    I have an M2 MBA and it’s the best laptop I’ve ever owned or used, second to the M3 Max MBP I get to use for work. Silent, battery lasts all week, interface is fast and runs all my dev tools like a charm. Zero issues with the device.
