When computers begin to perceive or feel things the way a human would, they could be described as sentient. In a way, it may not matter what the computer actually feels if the human feels like they are connecting with something sentient, much like the storyline of the 2013 movie Her. Patrick Levy Rosenthal, Founder & CEO of Emoshape, Inc., demonstrates the company's “emotional engine,” which he explains senses 64 million possible emotional states every tenth of a second.
Analogous to a graphics processing unit, Levy Rosenthal describes a device that can augment a PC with emotional intelligence. At CES 2018, Emoshape demonstrated this capability by immersing attendees in a dream-like virtual reality setting that changed based on feedback from the viewer.
Levy Rosenthal sees the emotional engine chip being built into consumer devices. For example, integrated into an e-reader, the computer voice could change to reflect the context of what it is reading (e.g., automatically shifting to a higher pitch if a character were nervous). Of course, this synthetic actor may also have a visual component, as shown by the company's fictional character, Jade. At first glance, this might seem like the death knell for the acting profession, and it may well represent a huge change for that industry.
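To make the e-reader idea concrete, here is a minimal, hypothetical sketch of how detected emotion might steer a text-to-speech voice. It is not Emoshape's actual API; the emotion labels, the VoiceSettings fields, and the settings_for_passage function are all illustrative assumptions standing in for whatever an emotion engine would report.

```python
# Illustrative sketch only (not Emoshape's interface): map an emotion label
# reported for a passage of text to text-to-speech voice parameters.
from dataclasses import dataclass


@dataclass
class VoiceSettings:
    pitch_shift: float  # semitones relative to the narrator's baseline voice
    rate: float         # speaking-rate multiplier (1.0 = normal)


# Hypothetical mapping: e.g., a nervous character is read faster and higher.
EMOTION_TO_VOICE = {
    "neutral": VoiceSettings(pitch_shift=0.0, rate=1.0),
    "nervous": VoiceSettings(pitch_shift=+3.0, rate=1.15),
    "sad":     VoiceSettings(pitch_shift=-2.0, rate=0.9),
    "angry":   VoiceSettings(pitch_shift=+1.0, rate=1.2),
}


def settings_for_passage(detected_emotion: str) -> VoiceSettings:
    """Return voice settings for the emotion detected in a passage."""
    return EMOTION_TO_VOICE.get(detected_emotion, EMOTION_TO_VOICE["neutral"])


if __name__ == "__main__":
    # In a real e-reader the label would come from the emotion engine
    # analyzing the text; here it is hard-coded for illustration.
    samples = [('"Are you sure it\'s safe?"', "nervous"),
               ("The rain had not stopped for days.", "sad")]
    for passage, emotion in samples:
        v = settings_for_passage(emotion)
        print(f"{passage}: pitch {v.pitch_shift:+.1f} semitones, rate x{v.rate}")
```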
But the bigger change may be the creation of virtual assistants that reflect the personalities of their human masters. These are the kinds of virtual assistants that Michael Robinson described with his Ambrogio concept, and that Emoshape explores in this post about how we may someday interact with machines and humans, a line that will become increasingly blurred.