Samsung phones of the future may tell if you are happy, sad, or altogether disgusted. Samsung has filed for a patent on a method and device that can tell a user's emotions based on facial expressions. The patent, "Method and Apparatus for Recognizing an Emotion of an Individual Based on Facial Action Units," was filed back in October last year but came to light this month. Samsung's approach will be to use "Action Units" (AUs) to recognize how a person feels.
The AUs are components of a facial-action coding system. Several AUs combine to form a string that's detected and matched to an emotion label that best fits the string. The patent does not specify smartphones or any other specific form factor; the device is referred to generally in the patent as "an apparatus."
The patent language is difficult to parse, but it states that what Samsung has in mind is "An apparatus for recognizing an emotion of an individual using Action Units (AUs), the apparatus comprising: a processor; and a memory coupled to the processor, wherein the memory includes instructions stored therein, that when executed by the processor, cause the processor to: [sic] receive an input AU string…"
Fundamentally, the patent describes a device that will pick up a person's emotions by "reading" the person's face, whether the results indicate the user is angry, disgusted, sad, happy, or something else in the realm of emotions. Samsung is by no means the first to entertain the notion of using a coding system for facial expressions, or of computers reading emotions.
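To make the mechanism concrete, here is a minimal sketch of what matching an input AU string against emotion templates could look like. The specific AU numbers, the emotion templates, and the overlap-based scoring below are illustrative assumptions drawn from commonly cited FACS examples; the patent does not disclose its exact matching rule here.

```python
# Illustrative sketch: an input string of facial Action Units (AUs) is
# compared against template AU sets for a handful of emotions, and the
# closest template wins. The AU-to-emotion pairings (e.g., AU6 + AU12 for
# happiness) are common FACS examples, not details taken from the patent.

# Hypothetical emotion templates keyed by FACS Action Unit numbers.
EMOTION_TEMPLATES = {
    "happy":     {6, 12},        # cheek raiser + lip corner puller
    "sad":       {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "disgusted": {9, 15, 16},    # nose wrinkler + lip corner depressor + lower lip depressor
    "angry":     {4, 5, 7, 23},  # brow lowerer + upper lid raiser + lid tightener + lip tightener
}

def recognize_emotion(input_aus):
    """Return the emotion label whose AU template best overlaps the detected AUs."""
    observed = set(input_aus)
    best_label, best_score = "unknown", 0.0
    for label, template in EMOTION_TEMPLATES.items():
        # Simple overlap ratio: how much of the template is present in the input.
        score = len(observed & template) / len(template)
        if score > best_score:
            best_label, best_score = label, score
    return best_label, best_score

if __name__ == "__main__":
    # An input "AU string" detected from a face: AUs 6 and 12, plus an unrelated AU.
    print(recognize_emotion([6, 12, 25]))  # -> ('happy', 1.0)
```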
The Facial Action Coding System (FACS) is an index of facial expressions that was developed by Paul Ekman, Wallace V. Friesen, and Richard J. Davidson in 1978. Action Units (AUs) were described as the fundamental actions of individual muscles or groups of muscles.
"Our primary goal in developing the Facial Action Coding System (FACS) was to develop a comprehensive system which could distinguish all possible visually distinguishable facial movements," said Ekman and Friesen.
Also, over at Cambridge University, Prof. Simon Baron-Cohen, Director of the Autism Research Centre, has provided a taxonomy of facial expressions and the emotions they represent. Researchers have worked up a computer system based on his research.
At MIT, the Affective Computing Research Group is working on computers that can read facial expressions and track basic states like confusion, liking or disliking.
The four people filing for the Samsung patent, meanwhile, are from Bangalore, India. That a new mobile breakthrough might emanate from Bangalore would come as no surprise to those familiar with Samsung India Software Operations (SISO), a Samsung R&D center.
While the SISO site makes no mention of an emotion-reading device in the works, there is mention of a Samsung research area where SISO plays a significant role, "the Next eXperiences Team (NXT)." These researchers are tasked with working out new ideas through "user research and design with a special focus on mobile devices."
Emotion-reading technology may arrive on your smartphone later rather than sooner, but Samsung's patent application implies the company wants to step in that direction. As Ubergizmo puts it, "Perhaps future Galaxy range from Samsung will be able to tell whether you are happy or sad."
Outside the realm of Samsung, scientists consider the concept of facial-reading technology worth pursuing. Medically, such technology could help teach people with disorders to better recognize emotions. In fact, "emotion measurement technology will soon be ubiquitous," says MIT Media Lab research scientist Rana El Kaliouby. "It will allow people to communicate in new, different ways. It's a kind of very sophisticated version of the 'Like' button on Facebook," she told the BBC.