CASE STUDY: USING POSTURE AND BEHAVIOR TO INFORM AFFECTIVE COMPUTERS BY PREDICTING HUMAN INTENTION
T D Jones and S W Lawson
School of Computing, Napier University, Edinburgh
Key Results: Preliminary experiments show that an affective computer system is capable of monitoring human posture and successfully outputting the correct posture classification to match the movement being exhibited.
How does the work advance the state-of-the-art: Affective computers can predict the intention of a person through the behavior that the person exhibits and provide a service or function based on the recognition of the action to be performed.
Motivation (problems addressed): A single class of behavior can be exhibited in varied ways by different people; recognition of a particular movement that can be performed in diverse ways will also have to adjust to varying speeds of the same behavior and alternate motions that comprise it.
The context in which behavior is performed is also paramount in defining the outcome of the intended action; an affective computer would need to have sufficient data of the context in which a person is situated.
Affective computing is a research area in computing which suggests that for computers to be able to perform at a level of interaction more natural for people, and for computers to be genuinely intelligent, they must have the ability to recognize, understand, and even to have and express emotions (Picard 1997).
Behavior is an expression of our emotions and can provide others with a certain amount of information about our state of mind. A person's posture and behavior can inform those around them of their intended actions; that is to say, a human can predict the actions of another by interpreting their movements in light of previous experience.
The aim of the research introduced in this paper is to capture human behavior using sensors that will record a performed movement. This data will be processed automatically by an affective computer which will attempt to predict the intention of the person from their exhibited behavior.
The remainder of this paper describes the apparatus and experimental procedure adopted for the research; it then discusses the results of some preliminary experiments and concludes with proposed future work.
Apparatus and Experimental Procedure
To capture behaviors, a Polhemus LIBERTY system was used, equipped with eight independent sensors measuring six degrees of freedom: spatial coordinates X, Y, Z and orientation coordinates X, Y, Z. Each sensor is capable of producing 240 updates per second.
Fig. 1: Polhemus LIBERTY sensors attached to a subject.
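The output of one sensor can be pictured as a simple record type. The sketch below shows one way to hold a single six-degree-of-freedom reading; the field layout and the whitespace-separated text format are assumptions for illustration, not the Liberty's actual export format:

```python
from dataclasses import dataclass

SAMPLE_RATE_HZ = 240  # each sensor reports up to 240 updates per second


@dataclass
class SensorSample:
    """One reading from a single sensor: spatial coordinates and
    orientation coordinates relative to the fixed transmitter."""
    x: float   # spatial X
    y: float   # spatial Y
    z: float   # spatial Z
    rx: float  # orientation X
    ry: float  # orientation Y
    rz: float  # orientation Z


def parse_record(line: str) -> SensorSample:
    """Parse one whitespace-separated record (hypothetical format)."""
    values = [float(v) for v in line.split()]
    return SensorSample(*values)


sample = parse_record("12.5 -3.1 101.2 45.0 10.0 0.0")
```

A full capture would then be a time-ordered list of such samples per sensor, one list for each of the eight sensors.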
Two behaviors were captured from six subjects: a punch and a handshake. Three sensors were strapped to the right arm (Fig. 1), the first positioned on the hand (sensor A), the second on the upper forearm (sensor B), and the third on the upper arm (sensor C).
The sensors relay their information relative to a stationary transmitter, which produces a low-frequency electromagnetic field.
Results and Discussion
The data gathered by the Polhemus equipment was analysed offline. Fig. 2 shows the data produced by a punch for sensor A only, which was attached to the hand.
The X coordinate shows the movement of the hand travelling directly toward the punch bag. The Y coordinate shows the hand moving back behind the body and then its trajectory as it moves forward and upwards to the target, and the Z coordinate shows the height of the hand as it moves backwards and up and then travels forward towards the target.
Fig. 2: Results of punch behavior
Fig. 3 shows the data produced by a handshake for sensor A only, which was attached to the hand.
The X coordinate shows the hand travelling toward the target by crossing in front of the subject's body, indicated by the upward slope on the graph. The Y coordinate shows the forward motion of the hand. The Z coordinate shows the height of the hand up to the point of contact, indicated by the first line, and then one shake of the hands to a rest position, the second line.
Fig. 3: Results of handshake behavior
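As an illustration of how such coordinate traces might be segmented into movement phases, the following sketch (using hypothetical displacement values, not measured data) locates the backswing trough and the forward strike peak in a hand-sensor series:

```python
def movement_extremes(series):
    """Return (index, value) of the minimum (e.g. the hand drawn back
    behind the body) and the maximum (e.g. full forward extension)
    of one coordinate series."""
    trough = min(enumerate(series), key=lambda p: p[1])
    peak = max(enumerate(series), key=lambda p: p[1])
    return trough, peak


# Hypothetical X displacement of the hand toward the punch bag, in cm.
x_series = [0.0, -2.0, -5.0, -3.0, 4.0, 12.0, 20.0, 18.0]
(trough_i, trough_v), (peak_i, peak_v) = movement_extremes(x_series)
# trough at index 2 (hand drawn back), peak at index 6 (contact)
```

The samples between trough and peak would then cover the forward phase of the behavior.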
For each subject, the data was analysed and compared across their repetitions. Next, each subject's data was compared against that of the other test subjects. Preliminary results show very little variation between an individual's repetitions and, overall, only small variations between test subjects performing the same behavior.
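One simple way such variation between repetitions could be quantified, given that the same behavior may be performed at different speeds, is to resample each trajectory to a common length and average the pointwise differences. The sketch below (with hypothetical values; the paper does not specify its comparison method) illustrates this:

```python
def resample(series, n):
    """Linearly resample a trajectory to n points so repetitions
    performed at different speeds can be compared point-by-point."""
    m = len(series)
    out = []
    for i in range(n):
        t = i * (m - 1) / (n - 1)
        lo = int(t)
        hi = min(lo + 1, m - 1)
        frac = t - lo
        out.append(series[lo] * (1 - frac) + series[hi] * frac)
    return out


def mean_abs_diff(a, b, n=100):
    """Mean absolute difference between two resampled trajectories."""
    ra, rb = resample(a, n), resample(b, n)
    return sum(abs(p - q) for p, q in zip(ra, rb)) / n


rep1 = [0.0, 5.0, 12.0, 20.0, 15.0]        # one repetition
rep2 = [0.0, 4.0, 13.0, 21.0, 15.0, 14.0]  # same punch, slightly slower
d = mean_abs_diff(rep1, rep2)  # small value -> consistent repetitions
```

A small mean difference within one subject's repetitions, and only modestly larger differences across subjects, would match the preliminary findings above.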
For a computer to be able to predict behavior it must first be able to recognize the behaviors and make sense of them. To do this, the data gathered so far will be used as training sets for programs such as neural and probabilistic networks (Rumelhart 1986), (Specht 1990). Once the networks are trained they will be fed data from further tests in an attempt to achieve the highest percentage of recognition. Once this has been achieved, the data will be streamlined to find the least data required to produce the same result. Finally, the networks will be trained to predict behavior; it is intended that a real-time system will be created to demonstrate this.
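As a minimal sketch of the probabilistic-network idea cited above (Specht 1990), the classifier below places a Gaussian kernel on each training example and assigns a query to the class with the largest summed kernel response. The feature vectors here are hypothetical summaries of resampled coordinate data, not the study's actual training sets:

```python
import math


def pnn_classify(train, query, sigma=1.0):
    """Probabilistic neural network in the spirit of Specht (1990).

    train: list of (feature_vector, label) pairs; each training example
    contributes a Gaussian kernel to its class's score. Returns the
    label with the largest summed response for the query vector."""
    scores = {}
    for feats, label in train:
        d2 = sum((f - q) ** 2 for f, q in zip(feats, query))
        scores[label] = scores.get(label, 0.0) + math.exp(-d2 / (2 * sigma ** 2))
    return max(scores, key=scores.get)


# Hypothetical feature vectors for the two captured behaviors.
train = [
    ([0.0, -5.0, 20.0], "punch"),
    ([0.2, -4.5, 19.0], "punch"),
    ([0.0, 8.0, 6.0], "handshake"),
    ([0.1, 7.5, 6.5], "handshake"),
]
label = pnn_classify(train, [0.1, -4.8, 19.5])  # -> "punch"
```

A real-time predictor would run the same scoring over a sliding window of incoming sensor samples, reporting a class as soon as one score dominates.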
The data gathered has shown that it is possible to recognize human behavior and determine the trajectory of movement throughout all phases of the behavior being exhibited. If an autonomous affective computer system can successfully process this data it should be possible to have a system that can recognize and predict human behavior.