Looks like this is Apple engineers' stab at empathic computing
Thanks for posting this interesting item. Reading their paper on arXiv (https://arxiv.org/abs/2501.12493), it is Apple's stab at bringing empathic computing closer to everyday experience. If they succeed, it will be a game changer. Most robotic products, such as vacuum cleaners, address only the "functional" aspects, but as this research shows, adding expressive behaviour to functions such as reminders and social cues can be well received, at least by some users.

That said, the study population was still predominantly White and male, and essentially made up of Apple employees, so the conclusion that expressive computing over and above functional use will be broadly accepted does not yet stack up unless wider research is conducted. Even so, there is nothing to stop Apple from bringing some of this into everyday use in the tools they already make. The table lamps were instantiations (in the real world, nothing like them may eventuate), but at the very least we can expect that VoiceOver and Siri might actually express voice-based empathy, and even that would likely be well accepted. I'd say this is an exciting space to watch.