
Markaos ,

Yeah, even the TLDR makes it sound more like Qualcomm is yielding to the pressure from OEMs who want to be able to offer longer updates

Markaos ,

Convenience (after you install it, all you have to do is enter the code and you're connected, no other setup required), familiarity (it's the default name people will think of or find if they want remote access - that alone means they can get away with pushing their users slightly more) and - IMHO most importantly - connectivity: if two computers can connect to the TeamViewer servers, they will be able to connect to each other.

That's huge in the world of broken Internet where peer-to-peer networking feels like rocket science - pretty much every consumer device will be sitting behind a NAT, which means "just connecting" is not possible. You can set up port forwarding (either manually or automatically using UPnP, which is its own bag of problems), or you can use IPv6 (which appears to be currently available to roughly 40% of users globally; to use it, both sides need to have functional IPv6), or you can try various NAT traversal techniques (which only work with certain kinds of NAT and always require a coordinating server to pull off - this is one of the functions provided by TeamViewer servers). Oh, and if you're behind CGNAT (a kind of NAT used by internet providers; apparently it's moderately common), then neither port forwarding nor NAT traversal is possible. So if both sides are behind CGNAT and at least one doesn't have IPv6, establishing a direct link is impossible.
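
To make the "coordinating server" part less abstract, here's roughly what UDP hole punching looks like. This is just an illustrative sketch (made-up host name, port and message format), not how TeamViewer actually does it:

```python
# Bare-bones UDP hole punching sketch, assuming both peers can reach a public
# rendezvous host. Names, port and message format are made up for illustration.
import socket

def rendezvous_server(port=40000):
    """Runs on a public server: learns each peer's public address and swaps them."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    peers = []
    while len(peers) < 2:
        _, addr = sock.recvfrom(64)   # addr = the peer as seen from the internet (its NAT mapping)
        if addr not in peers:
            peers.append(addr)
    a, b = peers
    sock.sendto(f"{b[0]}:{b[1]}".encode(), a)
    sock.sendto(f"{a[0]}:{a[1]}".encode(), b)

def peer(server=("rendezvous.example.com", 40000)):
    """Runs on each device behind NAT."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(b"hello", server)     # opens an outbound NAT mapping towards the server
    data, _ = sock.recvfrom(64)       # server replies with the other peer's public endpoint
    host, port = data.decode().rsplit(":", 1)
    remote = (host, int(port))
    # Both sides now send to each other; the outgoing packet punches a "hole" in
    # the local NAT so the other side's packets can get in. This only works with
    # cooperative NAT types - behind CGNAT it usually fails.
    sock.sendto(b"punch", remote)
    msg, addr = sock.recvfrom(64)
    print("direct link established with", addr, msg)
```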

With a relay server (like TeamViewer provides), you don't have to worry about being unable to connect - it will try to get you a direct link, but if that fails, it will just act as a tunnel and pass the data between both devices.
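
The relay fallback is conceptually even simpler: both devices make outbound connections to a public server (which works even behind CGNAT) and the server just shovels bytes between them. A toy version, with an illustrative port and none of the authentication or session matching a real relay would need:

```python
# Toy relay fallback: both devices dial *out* to the relay, so NAT never has to
# accept an incoming connection. The relay copies bytes between the two sockets.
import socket
import threading

def pump(src, dst):
    # Copy bytes one way until the source closes.
    while True:
        chunk = src.recv(4096)
        if not chunk:
            break
        dst.sendall(chunk)

def relay(port=9000):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("0.0.0.0", port))
    srv.listen(2)
    a, _ = srv.accept()   # first device dials in
    b, _ = srv.accept()   # second device dials in
    threading.Thread(target=pump, args=(a, b), daemon=True).start()
    pump(b, a)            # traffic flows both ways through the relay
```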

Sure, you can self-host all this, but that takes time and effort to do right. And if your ISP happens to use CGNAT, that means renting a VPS because you can't host it at home. With TeamViewer, you're paying for someone else to worry about all that (and pay for the servers that coordinate NAT traversal and relay data, and their internet bandwidth, neither of which is free).

Markaos ,

It doesn't slow charge, at least not on the Pixel 7a. Well, you could argue whether 20W is slow charging, but it's all this phone can do.

It just charges normally to 80%, stops, and then resumes charging about an hour or two before the alarm. And the last time I used it, it had a cool bug where if it failed to reach 80% by the time it was supposed to resume charging, it would just stop charging no matter what the current charge level was. Since that experience, I've turned this feature off and I just plug the phone in whenever it starts running low.
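
For reference, my reading of the behaviour (and the bug) in rough code - this is just how I understand it from the outside, not Google's actual implementation:

```python
# Sketch of the Pixel charging behaviour as described above (my interpretation only).
from datetime import datetime, timedelta

TOP_OFF_LEAD = timedelta(hours=2)    # assumed: start topping off ~2 h before the alarm

def should_charge(level: int, now: datetime, alarm: datetime) -> bool:
    if now >= alarm - TOP_OFF_LEAD:
        return level < 100           # top-off window: charge all the way
    return level < 80                # before that: hold at 80%

# The bug I hit looked as if the top-off branch also required having already
# reached 80%, so a phone that entered the window below 80% just stopped charging:
def buggy_should_charge(level: int, now: datetime, alarm: datetime) -> bool:
    if now >= alarm - TOP_OFF_LEAD:
        return 80 <= level < 100     # never charges if it's still below 80% here
    return level < 80
```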

Markaos ,

If it doesn't come at the expense of battery wear, then sure, a shorter charge time is just better. But that would make phone batteries the only batteries that don't get excessively stressed by fast charging. Yeah, phone manufacturers generally claim that fast charging is perfectly fine for the battery, but I'm not sure I believe them too much when battery degradation is one of the main reasons people buy new phones.

I have no clue how other manufacturers do it (so for all I know they could all be doing it right and actually use slow charging), but Google has a terrible implementation of battery conservation - Pixels just fast charge to 80%, then wait until some specific time before the alarm, then fast charge the rest. Compare that to a crappy Lenovo IdeaPad laptop I have, whose battery conservation feature sets a charge limit AND a power limit (60% with 25W charging), because it wouldn't make sense to limit the charge and still use the full 65W for charging.

Markaos , (edited)

I don't really see the big problem here? Like sure, it's silly that it's cheaper to make wireless headphones than wired ones (I assume - the manufacturers are clearly not too bothered by trademarks and such if they put the Lightning logo on it, so they wouldn't avoid a wired solution just because of licensing fees), but what business does Apple have in cracking down on this? Other than the obvious issues with trademarks, but those would be present even if they were true wired earphones. It's just a knockoff manufacturer.

The cheapest possible wired earphones won't sound much better than the cheapest possible wireless ones, so sound quality probably isn't a factor. And on the plus side, you don't have multiple batteries to worry about, or you could do something funny like plugging the earphones into a power bank in your pocket and having freak "hybrid" earphones with a multi-day battery (they're not wireless, but also not tethered to your phone). On the other hand, you do waste some power on the wireless link, which is not good for the environment in the long run (the batteries involved will see marginally more wear).

Honestly the biggest issue in my mind is forcing people to turn on Bluetooth, but I don't think this will change anyone's habits - people who don't know what Bluetooth is will definitely just leave it on anyway (it's the default state), and people technical enough to want to turn it off will recognize that there's something fishy about these earphones.

Markaos ,

Are you sure you didn't set DNS directly on some/all of your devices? Because in that case they won't care about the router settings and will use whatever you set them to.

Also, as the other commenter said, DNS changes might not propagate to other devices on the network until the next time they connect - a reboot, or unplugging the cable from your computer for a few seconds, is a dirty but pretty OS-agnostic way to force a reconnect.
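
If you want to check which resolver a device actually ends up using, something like this works on Linux (just an example; on systemd-resolved systems /etc/resolv.conf may only show the local 127.0.0.53 stub, and resolvectl status shows the real upstream):

```python
# Prints the nameservers a Linux machine is configured to use (glibc-style resolv.conf).
def configured_nameservers(path="/etc/resolv.conf"):
    servers = []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) >= 2 and parts[0] == "nameserver":
                servers.append(parts[1])
    return servers

print(configured_nameservers())
```

If it prints your router's address, the device is following the router's DHCP settings; if it prints something else (8.8.8.8, 1.1.1.1, ...), DNS was overridden on the device itself.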

Markaos ,

I feel like the ingest system will be sophisticated enough to throw away pieces of text that begin with a message like "ChatGPT says". Probably even stuff that follows the "paragraph with assumptions and clarifications, followed by a list, followed by a brief conclusion" structure - everything old has been ingested already, and most of the new stuff with that structure is probably AI-generated.
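
Even something as dumb as this would catch the most obvious cases (the markers are just my guesses; a real ingest pipeline would be far more elaborate):

```python
import re

# Guessed markers of pasted chatbot output - not any known pipeline's actual list.
CHATBOT_MARKERS = re.compile(r"^(chatgpt says|as an ai language model)", re.IGNORECASE)

def keep_for_training(text: str) -> bool:
    lines = text.strip().splitlines()
    if not lines:
        return False
    return CHATBOT_MARKERS.match(lines[0]) is None

print(keep_for_training("ChatGPT says: Sure! Here are five reasons..."))  # False
print(keep_for_training("Yeah, even the TLDR makes it sound like..."))    # True
```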

Markaos ,

If a thief knows your PIN (by watching an earlier unlock), Android is now requiring “biometrics for accessing and changing critical Google account and device settings, like changing your PIN, disabling theft protection or accessing Passkeys, from an untrusted location.”

Sounds great for Pixel 6 series with their reportedly highly reliable fingerprint sensors /s

Honestly, I'm not sure what to think about this - extra protection against unauthorized access is good, but requiring biometric verification with no apparent alternative rubs me the wrong way.

Maybe that's just because of my experiences with Nokia 5.3 and its awful rear fingerprint sensor with like 10% success rate. But then again, there will eventually be phones with crappy sensors running Android 15.

Markaos ,

I had a similar opinion when I was buying that phone - pretty much every phone had a fingerprint scanner and people generally didn't complain about them, so decent scanners should have been mass-produced and cheap - but HMD/Nokia managed to make me reconsider that opinion.

For context, the Nokia 5.3 is a 3- or 4-year-old model, so it definitely doesn't disprove your statement, but I remain sceptical.

Markaos ,

The downside is that you're then zooming in on the compression artifacts and all the "enhancements" we've all learned to "love" over the past decade (thanks, Google!), while the in-app zoom probably works with raw image data before zooming in.

Markaos ,

GadgetBridge is not really against supporting online-only functions, it just can't be part of the main app. Weather Providers is what you're looking for.

Markaos ,

Oh sorry, I understood your comment as saying that you couldn't get weather info with GadgetBridge because of its somewhat unique architecture. If you're using a different companion app then apps for GadgetBridge probably won't work.

Also I'm not familiar with Pebble, I'm just assuming it works similarly to other GadgetBridge-supported watches / bands like the Mi Band I use.

Markaos ,

Well, feel free to click on this link then: https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode.en

(it's just a link to Google homepage - the point is that you really shouldn't trust the link text lol)
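
For the curious: the visible text of a link and its actual target are two completely independent things. A tiny demonstration using just the Python standard library (the HTML snippet here is made up to mirror the joke above):

```python
from html.parser import HTMLParser

# The link "says" it goes to Creative Commons, but actually points elsewhere.
html = ('<a href="https://www.google.com">'
        'https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode.en</a>')

class LinkInspector(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            print("actually points to:", dict(attrs).get("href"))

    def handle_data(self, data):
        print("looks like it points to:", data.strip())

LinkInspector().feed(html)
```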

Markaos ,

Not a fair comparison, IMHO - Ethernet is designed to be a connection between two or more otherwise independent peers (on L2), while USB's goal was to allow connecting simple peripheral devices to computers. There was never meant to be a situation where it's unclear which side is the host.

Also note that the bridging "cable" is literally just two USB devices, one for each computer (although they are both on the same chip, so there's that), with some internal link to pass the data.

Markaos ,

Yeah, it's not practical right now, but in 10 years? Who knows, we might finally have some built-in AI accelerator capable of running big neural networks on consumer CPUs by then (we do have AI accelerators in a large chunk of current CPUs, but they're not up to the task yet). The system memory should also go up now that memory-hungry AI is inching closer to mainstream use.

Sure, Internet bandwidth will also increase, meaning this compression will be less important, but on the other hand, it's not like we stopped improving video codecs after H.264 because it was good enough - there are better codecs now even though we have the resources to handle bigger H.264 videos.

The technology doesn't have to be useful right now - for example, neural networks capable of learning have been studied since the 1940s, even though there was no way to run them for many decades, and it took even longer before they could be run in a useful capacity. But now that we have the technology to do so, they enjoy rapid progress building on top of that original foundation.
