Safer, more comfortable, and more energy-efficient living spaces are in constant demand. However, most buildings are designed for prescribed scenarios and therefore cannot respond to abrupt environmental changes. We propose the "Biofication of Living Spaces," in which a building learns occupants' lifestyles and acts on the collected information, thereby incorporating high adaptability into the building. Our goal is to make living spaces more "comfortable." Because comfort is tied to human emotion, what "comfortable" means differs from one individual to another; our study therefore focuses on the recognition of human emotion. We propose using robots as sensor agents: equipped with various sensors, they can interact with both occupants and the environment. We use a sensor agent robot called "e-bio." In this research, we construct a human tracking system and identify residents' emotions from their walking information, focusing on the influence of illuminance and sound. Emotions are classified by computing the distances between points mapped into comfortable and uncomfortable spaces with the parametric eigenspace method, whose parameters are determined by mapping walking tracks into the space. A weighted k-nearest neighbor classifier is used for pattern recognition. Experiments varying the illuminance and sound environments show a good correlation between emotion and the environment.
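The final classification step described above, a weighted k-nearest neighbor vote over points projected into the eigenspace, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the toy 2-D points standing in for eigenspace projections of walking tracks, and the inverse-distance weighting scheme are all assumptions.

```python
import math
from collections import defaultdict

def weighted_knn(train, query, k=3):
    """Classify `query` by an inverse-distance-weighted vote among its
    k nearest training points. `train` is a list of (vector, label) pairs,
    where each vector stands in for a track mapped into the eigenspace."""
    # Sort training points by Euclidean distance to the query, keep k nearest.
    nearest = sorted((math.dist(x, query), label) for x, label in train)[:k]
    votes = defaultdict(float)
    for d, label in nearest:
        votes[label] += 1.0 / (d + 1e-9)  # closer neighbors carry more weight
    return max(votes, key=votes.get)

# Toy example: hypothetical 2-D eigenspace projections of tracks recorded
# in comfortable and uncomfortable environments.
train = [((0.0, 0.0), "comfortable"), ((0.2, 0.1), "comfortable"),
         ((1.0, 1.0), "uncomfortable"), ((0.9, 1.2), "uncomfortable")]
print(weighted_knn(train, (0.1, 0.1)))  # near the comfortable cluster
```

Weighting the vote by inverse distance lets a single very close neighbor outvote two distant ones, which matters when the comfortable and uncomfortable clusters overlap in the eigenspace.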