manton
How to rethink Siri (manton.org)

paulrobertlloyd

@manton If they do use cloud-based models, I’m curious how they’d account for the impact these would have on their very public climate commitments (the answer can’t be offsets).

bax

@manton I mean, just try using an Apple Watch with no connectivity (like on a jog with no phone, as I do multiple times a week) to add a reminder or note of something you just thought of… it’s a huge exercise in frustration that it can’t do something so basic, despite newer watches having on-device neural processing for things like transcription now.

Granted, third-party apps are also bad no-phone/no-connectivity watch citizens, but I’d expect better from the first party.

manton

@paulrobertlloyd I think Apple has two choices: announce that all cloud AI runs in their own data centers, which are already 100% renewable; or announce the OpenAI partnership using Microsoft servers, which will be 100% renewable in 2025. If they let this go without saying anything, I’ll be very surprised. I’m also skeptical that Apple’s own data centers are ready to scale up.

fgtech

@manton That is an excellent point about universal interactions. Our mental model is that we are interacting with the same entity each time, but we are not. This is a design flaw, but a fixable one.

fgtech

@manton I keep seeing people say stuff like this:

> If there was a way to have some kind of weighting in the models so that Siri could answer “I’m not sure, but I think…” that would go a long way to dealing with hallucinations

While that would be a great thing to have, it’s just not a thing an LLM can do. For this to work, the model needs to “understand” what it’s saying. It doesn’t. It’s just generating words that match a statistical model of the words in its training data. That’s it.
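To make the mechanism concrete, here is a minimal sketch of what “generating words from a statistical model” means at a single step, with an invented vocabulary and invented scores standing in for a real model: all the model produces is a probability distribution over next tokens, and nothing in that distribution says whether the resulting sentence is true.

```python
import math
import random

# Toy next-token step: an LLM only emits a probability distribution
# over tokens. The vocabulary and logits below are invented for
# illustration; no real model is involved.
vocab = ["Paris", "London", "Berlin", "Madrid"]
logits = [4.2, 1.1, 0.3, -0.5]  # raw scores for "The capital of France is ..."

# Softmax turns raw scores into probabilities.
exps = [math.exp(x) for x in logits]
probs = [e / sum(exps) for e in exps]

# Generation samples a token in proportion to those probabilities.
token = random.choices(vocab, weights=probs, k=1)[0]
print({w: round(p, 3) for w, p in zip(vocab, probs)}, "->", token)

# The ~0.93 on "Paris" measures how likely that token is to come next
# given the context, not whether the completed sentence is true, which
# is why a trustworthy "I'm not sure, but I think…" can't simply be
# read off the model's own numbers.
```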
