Recent progress in mobile devices such as smartphones enables humans to leverage their perception abilities as part of a sensing framework. This framework, so-called participatory sensing, distributes various sensing tasks (e.g., weather reports, waiting times in queues, traffic conditions) to potential participants, who can then select and perform the tasks. However, with the rapid growth of participatory sensing and the increasing number of sensing tasks, it will become very difficult for users to choose appropriate sensing tasks around them. To solve this problem, we propose a system called Help Me!, which evaluates and visualizes the importance of sensing tasks by quantifying them in cooperation with physical sensors. Since the Help Me! system provides an objective index for sensing tasks, it enhances users' opportunities to participate in sensing tasks. We designed and implemented the Help Me! system as an integrated architecture of physical sensors and participatory sensors. Through an initial in-lab experiment, we confirmed that the Help Me! system can enhance users' opportunities to participate in sensing tasks.