
Th4tGuyII , to Technology in Finnish startup says it can speed up any CPU by up to 100x using a tiny piece of hardware with no recoding
@Th4tGuyII@fedia.io avatar

The TL;DR for the article is that the headline isn't exactly true. Right now their PPU can potentially double a CPU's performance - the 100x claim comes with the caveat of "further software optimisation".


Tbh, I'm sceptical of the caveat. It feels like me telling someone I can only draw a stickman right now, but I could paint the Mona Lisa with some training.

Of course that could happen, but it's not very likely to - so I'll believe it when I see it.

Having said that, they're not wrong about CPU bottlenecks and the slowed rate of CPU performance improvements - so a doubling of performance would be huge in the current market.

barsquid ,

Putting the claim instead of the reality in the headline is journalistic malpractice. 2x for free is still pretty great tho.

barsquid ,

Just finished the article, it's not for free at all. Chips need to be designed to use it. I'm skeptical again. There's no point IMO. Nobody wants to put the R&D into massively parallel CPUs when they can put that effort into GPUs.

frezik ,

Not every problem is amenable to GPUs. If it has a lot of branching, or needs to fetch back and forth from memory a lot, GPUs don't help.
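
For a concrete picture, here's a minimal C sketch (purely illustrative, nothing from the article) of the kind of branchy, pointer-chasing loop that GPUs don't speed up: each iteration waits on the previous load, and the branch depends on the data.

```c
#include <stddef.h>

/* Linked-list node: the address of the next load isn't known
 * until the previous load has completed. */
struct node {
    struct node *next;
    long value;
};

/* Each iteration depends on the pointer fetched by the previous one,
 * and the branch direction depends on loaded data. A GPU's thousands
 * of threads can't split this up; even a superscalar CPU can only
 * overlap a little of the surrounding work. */
long sum_positive(const struct node *head) {
    long total = 0;
    for (const struct node *n = head; n != NULL; n = n->next) {
        if (n->value > 0)   /* data-dependent branch */
            total += n->value;
    }
    return total;
}
```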

Now, does this thing have exactly the same limitations? I'm guessing yes, but it's all too vague to know for sure. It sounds like they're doing what superscalar CPUs have done for a while - on x86 that goes back to the original Pentium in 1993, and Crays were doing it in the '60s. What are they doing to supercharge this idea?

Does this avoid some of the security problems that have popped up with superscalar archs? For example, kernel code running at ring 0 ends up executing alongside userspace code, and the userspace code effectively gets ring 0 privileges as a result.

Clusterfck ,

I get that we have to impress shareholders, but why can’t they just be honest and say it doubles CPU performance, with the chance of even further improvement through software optimization? Doubling the performance of the same hardware is still HUGE.

Zorque ,

They... they did?

Feathercrown ,

Not in the title

Zorque ,

They didn't write the title.

Feathercrown ,

I don't know what "they" you're talking about, but I think it's clear I'm referring to the person responsible for writing the original title. Not OP and not the article author if the publisher is choosing the title.

Zorque ,

And I think it's pretty clear I'm not. And it seems pretty clear the OP wasn't either.

So... are you just stating random things for the fuck of it, or did you have an actual reason for bringing up a non-sequitur?

Feathercrown ,

Was it though?

pop ,

I'm just glad there are companies trying to optimize current tech rather than just piling on new hardware every damn year with forced planned obsolescence.

Though the claim is absurd, I think double the performance is NEAT.

dustyData ,

This is new hardware piling. What they claim to do requires reworking manufacturing, doesn't work retroactively with current designs, and demands more hardware components. It is basically a hardware thread scheduler. Cool idea, but it won't save us from planned obsolescence - if anything, it's an incentive for more waste.

MadMadBunny ,

Ah, good ol’ magic wishful thinking…

thoralf , to Technology in I'm convinced watchOS 11 is hinting at an Apple Watch 10 with better battery life
@thoralf@discuss.tchncs.de avatar

I sold my watch because I was so annoyed by the bad battery life.

I expect a watch to run for a couple of days at least.
Charging it while I take a shower has to be sufficient. And I do not shower for half an hour.

As long as the Apple Watch is not able to do that, it’s a no-go for me.

kalleboo ,

"A couple of days" seems like the worst of both worlds - it needs to be charged often, but not on a fixed schedule, so you have to keep tabs on the battery and plan ahead.

Personally I just have a charger on my night stand and charge it every night alongside my phone. It's an easy routine and I don't want to sleep with a watch on anyway (smart or not) since when I do I eventually get a rash on my wrist.

For those who want to do sleep tracking, they need to speed up charging so that the "charging while I take a shower" routine works for those of us who take shorter showers.

LucidNightmare ,

Just thought I’d mention that if you are getting a rash when sleeping with the watch, you need to start cleaning the bottom of the watch or clean it better. I wipe mine down with an alcohol wipe once a day, and use my nails to get the wipe into the crevices where dirt and sweat might be building up. Hope that helps you!

mlg , to Technology in Finnish startup says it can speed up any CPU by up to 100x using a tiny piece of hardware with no recoding
@mlg@lemmy.world avatar

Startup discovers what a northbridge is

Kazumara , to Technology in Finnish startup says it can speed up any CPU by up to 100x using a tiny piece of hardware with no recoding

The techradar article is terrible, the techcrunch article is better, and the Flow website has some detail.

But overall I have to say I don't believe them. You can't just make threads independent if they logically have dependencies. Or just remove cache coherency latency by removing caches.
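
A tiny illustration of the dependency point (a made-up C example, not anything from their material): a loop-carried recurrence where each iteration consumes the result of the previous one, so no scheduler, hardware or software, can run the steps in parallel without changing the algorithm itself.

```c
#include <stddef.h>

/* Loop-carried dependency: each step consumes the previous result,
 * so the iterations are logically sequential. Extra parallel
 * hardware can't help without changing the algorithm. */
double iterate(double x, double c, size_t steps) {
    for (size_t i = 0; i < steps; i++)
        x = x * x + c;   /* the next value needs the current one */
    return x;
}
```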

bitfucker ,

Can't have cache latency if there is no cache!

StupidBrotherInLaw ,

So THIS is what the communists were talking about when they told me about the benefits of transitioning to a cacheless society!

tombruzzo , to Technology in Finnish startup says it can speed up any CPU by up to 100x using a tiny piece of hardware with no recoding

I don't care. Intel promised 5nm 10GHz single-core processors by this point, and I still want that on principle.

probableprotogen , to Technology in Finnish startup says it can speed up any CPU by up to 100x using a tiny piece of hardware with no recoding

Gee, it's like all modern computers already have massively parallel processing devices built in.

blahsay , to Technology in Finnish startup says it can speed up any CPU by up to 100x using a tiny piece of hardware with no recoding

10 tricks to speed up your CPU and trim belly fat. Electrical engineers hate them! Invest now! The startup is called 'DefinitelyNotAScam'.

TheFeatureCreature , to Technology in I watched Nvidia's Computex 2024 keynote and it made my blood run cold
@TheFeatureCreature@lemmy.world avatar

On the plus side, the industry is rapidly moving towards locally-run AI models specifically because they don't want to purchase and run fleets of these absurd things or any other expensive hardware.

MudMan ,
@MudMan@fedia.io avatar

The tragic irony of the kind of misinformed article linked here is that the server farms that would run this stuff are fairly efficient. The water is reused and recycled, and the heat is often used for other applications, because wasting fewer resources is cheaper than wasting more resources.

But all those locally-run models on laptop CPUs and desktop GPUs? That's grid power being turned into heat and vented into a home (probably with air conditioning on).

The weird AI panic, driven by an attempt to repurpose the popular anti-crypto arguments whether they matched the new challenges or not, is going to PR this tech into wasting way more energy than it would otherwise by distributing it over billions of computing devices paid for by individual users. And nobody is going to notice or care.

I do hate our media landscape sometimes.

Chee_Koala ,

But efficiency is not the only consideration; privacy and self-reliance are important facets as well. Your argument about efficient computing is 100% valid, but there is a lot more to it.

XeroxCool ,

If I make a gas engine with 100% heat efficiency but only run it in my backyard, do the greenhouse gases not count because it's so efficient? Of course they do. The high efficiency of a data center is great, but that's not what the article laments. The problem it's calling out is the absurdly wasteful reason these farms will flourish: powering excessively animated programs that feign intelligence, vainly burning energy on what a simple program was already addressing.

It's the same story with lighting. LEDs seemed like a savior for energy consumption because they were so efficient. Sure, they save energy overall (for now), but they prompted people to multiply the number of lights and total output by an order of magnitude simply because light became so cheap. This creates a secondary issue of further increasing light pollution and intrusion.

Greater efficiency doesn't make things right if it comes with an increase in use.

MudMan ,
@MudMan@fedia.io avatar

For one thing, it's absolutely not true that what these apps provide is the same as what we had. That's another place where the AI grifters and the AI fearmongers are both lying. This is not a 1:1 replacement for older tech. Not on search, where LLM queries are less reliable at finding facts and being accurate but better at matching fuzzy searches without specific parameters. Not with image generation, obviously. Not with tools like upscaling, frame interpolation and so on.

For another, some of the numbers being thrown around are not realistic or factual, are not presented in context, or are part of a power-increase trend that was already ongoing with earlier applications. The average high-end desktop PC ran on 250W in the '90s and 500W in the 2000s; mine now runs at 1000W. Playing a videogame used to burn as much power as a couple of lightbulbs; now it's the equivalent of turning on your microwave oven.

The argument that we are burning more power because we're using more compute for entertainment purposes is not factually incorrect, but it's both hyperbolic (some of the cost estimates being shared virally are deliberate overestimates taken out of context) and not inconsistent with how we use, and have used, other computing features for ages.

The only reason you're so mad about me wasting some energy asking an AI to generate a cute picture, but not about me using AI to generate frames for my videogame, is that one of those maps nicely onto the viral panic people already had about crypto, while the other is a frog that has been slow-boiling for three decades, so people have no reason to have an opinion about it.

downhomechunk , to Technology in Finnish startup says it can speed up any CPU by up to 100x using a tiny piece of hardware with no recoding
@downhomechunk@midwest.social avatar

Overclockers:
"Give me some liquid nitrogen and I'll make that 102x."

over_clox ,

Meh, I just spit on it.

rtxn , to Technology in Finnish startup says it can speed up any CPU by up to 100x using a tiny piece of hardware with no recoding

Cybercriminals are creaming their jorts at the potential exploits this might open up.

Brunbrun6766 ,
@Brunbrun6766@lemmy.world avatar

Please, hackers wear cargo shorts and toe shoes sir

MajorHavoc , (edited )

Oof. But yeah. Fair.

I want to go on record that sometimes I just wear sandals with socks.

qevlarr , to Technology in Finnish startup says it can speed up any CPU by up to 100x using a tiny piece of hardware with no recoding
@qevlarr@lemmy.world avatar

🚨 ⚠ 🚨 Hoax alert! 🚨 ⚠ 🚨

_sideffect , to Technology in Finnish startup says it can speed up any CPU by up to 100x using a tiny piece of hardware with no recoding

You can download more RAM too!

xantoxis , to Technology in Finnish startup says it can speed up any CPU by up to 100x using a tiny piece of hardware with no recoding

This change is likened to expanding a CPU from a one-lane road to a multi-lane highway

This analogy just pegged the bullshit meter so hard I almost died of eyeroll.

rottingleaf ,

Apparently the percentage of people in the management side of the industry who actually understand what they're doing is now too low to filter out even bullshit like this.

AnarchistArtificer ,

You've got to be careful with rolling your eyes, because the parallelism of the two eyes means that the eye roll can be twice as powerful ^1


(1) If measured against the silly baseline of a single eyeroll

amanda , to Technology in Finnish startup says it can speed up any CPU by up to 100x using a tiny piece of hardware with no recoding
@amanda@aggregatet.org avatar

Has anyone been able to find an actual description of what this does? I clicked two layers deep and neither explains the details. It does sound like they’re doing CPU scheduling in the hardware, which is cool and makes some sense, but the descriptions are too vague to explain what the hell this is except “more parallelism goes brrrr” and it’s not clear to me why current GPUs aren’t already that.

Buffalox , to Technology in Finnish startup says it can speed up any CPU by up to 100x using a tiny piece of hardware with no recoding

Why is this bullshit upvoted?
In the very first sentence they walk the headline's "without recoding" back to "with further optimization".
Then comes the explanation: "a companion chip that optimizes processing tasks in real-time".
That has already been done at the compiler level, and internally in every modern CPU, for more than a decade.

It might be possible to some degree for some specific forms of code, like maybe Java. But in general, for the CPU, this is bullshit, and the headline is decidedly dishonest.
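
As a rough illustration of "already done at compiler level" (an assumed example with a stock toolchain, nothing specific to this product): give a modern compiler a loop with independent iterations and it extracts the parallelism on its own, while the CPU's out-of-order superscalar core overlaps the loads, multiplies and stores at runtime.

```c
#include <stddef.h>

/* Independent iterations: gcc and clang will typically auto-vectorize
 * this at -O3, emitting SIMD instructions, and the CPU's out-of-order
 * core overlaps the memory accesses and multiplies on its own.
 * No companion chip required. */
void scale(float *dst, const float *src, float k, size_t n) {
    for (size_t i = 0; i < n; i++)
        dst[i] = src[i] * k;
}
```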
