Interactions between robots and their environment are essential in many robotic tasks. In such interactions, both visual and haptic information are important: visual information captures the state of the environment before an interaction, whereas haptic information captures it afterward. Recent studies have investigated the relationship between vision and touch using deep learning; however, the resulting models are complicated and difficult to interpret. In this study, we propose a framework that estimates a probability distribution of an object's stiffness from visual observations and contact information, based on object detection and a Gaussian mixture model (GMM). We focus on environmental stiffness as one of the important properties of the environment. The proposed framework can incorporate prior knowledge of the environment when designing the GMM parameters. In addition, we applied the proposed method to an object identification task and experimentally validated it.
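As a minimal illustrative sketch (not the paper's implementation), the core idea of modeling stiffness as a GMM whose parameters encode prior knowledge can be written with scikit-learn. The object classes, stiffness values, and initialization below are hypothetical:

```python
# Sketch: stiffness as a probability distribution via a Gaussian
# mixture model. All numbers here are hypothetical examples.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Hypothetical stiffness samples (N/m) from two object classes,
# e.g. a soft object (sponge-like) and a stiff one (wood-like).
soft = rng.normal(loc=200.0, scale=30.0, size=300)
stiff = rng.normal(loc=1500.0, scale=100.0, size=300)
samples = np.concatenate([soft, stiff]).reshape(-1, 1)

# Prior knowledge of the environment can be encoded in the GMM
# parameters, e.g. by fixing the number of components and
# initializing the component means near expected stiffness values.
gmm = GaussianMixture(
    n_components=2,
    means_init=np.array([[250.0], [1400.0]]),
    random_state=0,
)
gmm.fit(samples)

# Posterior responsibilities indicate which component (object class)
# a new contact measurement most likely came from, which is the basis
# of an identification step like the one described above.
measurement = np.array([[220.0]])
probs = gmm.predict_proba(measurement)[0]
idx = int(probs.argmax())
print(idx, gmm.means_[idx, 0])
```

Here object identification reduces to picking the mixture component with the highest posterior responsibility for a measured stiffness.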