The relationship between audience and performers is crucial to what makes live events so special. The aim of this work is to develop a new approach to amplifying the link between audiences and performers. Specifically, we explore the use of wearable sensors to gather real-time audience data and augment the visuals of a live dance performance. We used the J!NS MEME, smart glasses with integrated electrodes that enable eye-movement analysis (e.g. blink detection) and inertial motion sensing of the head (e.g. nodding recognition). These data are streamed from the audience and visualised live on stage during the performance; in addition, we collected heart rate and eye gaze from selected audience members. In this paper we present the recorded dataset, including accelerometer, electrooculography (EOG), and gyroscope data from 23 audience members.