for me, it’s easier to rotate my wrist this way. and if you can rotate the display to landscape… seems like the way to go. i’m sure there’s a good reason they don’t do this. but doesn’t it seem like sensors would work better on the inside of the wrist too? here’s a bad mock-up.
- just a guy with dumb ideas.
Would be great. Strongly support.
Difficulties:
Different biology down there. From what we can see, Apple is doing a lot of unstructured learning on large datasets to build its inference algorithms. It’s possible they’d be set back a lot if they had to support an area they weren’t collecting data for. (Hard to say; I don’t know what the major research bottlenecks are for them. I agree there are probably a lot of great signals down there, but a great language you don’t understand isn’t communicative, if you get me.)
It’s a touch screen. You’d need to very reliably turn it off when it faces away from you, or you’d get lots of accidental taps and movement from a leg, wet gym clothes, etc. (And watches don’t have a camera; they don’t know when you’re looking at them.) There’s a rough sketch of that kind of gating after this list.
It’s a scratchable surface that you’d now have facing down. It’s going to be constantly run back and forth over metal laptops etc. That significantly increases the abrasion a lot of people would be looking at.
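On the touch-screen point: here’s a minimal Swift sketch of what orientation-based touch gating could look like, using Core Motion’s gravity vector to guess when the face points away from the wearer. This is not Apple’s actual wrist-detection logic; the `ignoreTouches` callback is a hypothetical hook (watchOS doesn’t expose a public API to disable the touchscreen), and the 0.6 threshold is just a guess.

```swift
import CoreMotion

// Rough sketch: when the display faces the floor / away from the wearer,
// gravity along the device's z-axis is strongly positive, so we treat any
// touches during that time as accidental.
final class FaceAwayTouchGate {
    private let motion = CMMotionManager()

    func start(ignoreTouches: @escaping (Bool) -> Void) {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 0.2
        motion.startDeviceMotionUpdates(to: .main) { data, _ in
            guard let gravity = data?.gravity else { return }
            // Screen facing up: gravity.z ≈ -1. Screen facing down/away: ≈ +1.
            let facingAway = gravity.z > 0.6   // threshold is an assumption
            ignoreTouches(facingAway)
        }
    }

    func stop() {
        motion.stopDeviceMotionUpdates()
    }
}
```

Even with something like this, you’d still need it to be near-perfect to avoid pocket-dial-style taps from clothing and skin contact.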