The purpose of this study is to sense the movements of 100-m runners from publicly available video, such as Internet broadcasts. Normally, the information that can be obtained from such video is limited to the number of steps and the average stride length. Our proposed method, however, makes it possible to measure not only this information but also time-series information, such as the length of every stride and the speed transition, from the same input. The proposed method consists of three steps. First, we generate a panoramic image of the 100-m track, which allows us to estimate where along the 100 m the runners are in each frame. Second, we detect whether a runner's foot strikes the ground in a frame, using the detected track lines and the runners' leg joint positions. Finally, we project every step onto the overview image of the 100-m track to estimate stride length at the 100-m scale. In the experiments, we apply our method to various race videos and evaluate its accuracy by comparison with data measured using conventional methods. In addition, we evaluate the accuracy of the step-count estimation and visualize the runners' steps and speed transitions.
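
The projection of detected foot strikes onto the track overview can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes a planar homography `H` (hypothetical, e.g. obtained from the detected track lines) that maps image pixels to track coordinates in metres, and computes stride lengths as distances between consecutive projected foot-strike positions:

```python
import numpy as np

def project_points(H, pts):
    """Apply a 3x3 homography H to Nx2 image points (with homogeneous division)."""
    pts_h = np.hstack([np.asarray(pts, float), np.ones((len(pts), 1))])
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

def stride_lengths(H, foot_pixels):
    """Map detected foot-strike pixels to track coordinates (metres)
    and return the distance between each pair of consecutive steps."""
    track_xy = project_points(H, foot_pixels)
    return np.linalg.norm(np.diff(track_xy, axis=0), axis=1)

# Toy example: a diagonal homography that simply scales pixels to metres
# (assumes 100 px per metre along the track; purely illustrative).
H = np.diag([0.01, 0.01, 1.0])
steps = [(0, 0), (220, 0), (450, 0)]  # hypothetical foot-strike pixel positions
print(stride_lengths(H, steps))      # stride lengths in metres
```

Per-stride speed follows the same idea: dividing each stride length by the frame-time difference between consecutive foot strikes yields the speed transition over the race.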