In this paper, we present initial work towards evaluating augmented reality interfaces to enhance underwater navigation. We propose a conceptual framework that combines real-time GPS coordinates, fetched from an Aqua-Fi module, with a computer-vision approach to overlay AR-generated arrows on a first-person view (FPV), pointing in real time toward the exit point of the dive. The system will allow the diver to trace the progression of the dive and easily find the way out in low-visibility, high-turbidity conditions. We present an initial requirements analysis, based on test dives conducted by the researchers to better understand the problem domain, and an initial proposed system with early feasibility tests. For a first test, we are evaluating an integrated AR system (inertial motion sensing, GPS, and computer vision via DolphinSLAM) with visual feedback, but we are also considering haptic and other interaction modalities.
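The abstract does not specify how the arrow direction is computed; as a minimal sketch under our own assumptions, the AR arrow could be oriented by taking the initial great-circle bearing from the diver's current GPS fix to the stored exit coordinates, then subtracting the diver's compass heading to get a screen-relative angle. All function names and parameters below are hypothetical illustrations, not part of the proposed system:

```python
import math

def bearing_to_exit(lat, lon, exit_lat, exit_lon):
    """Initial great-circle bearing (degrees, 0 = north, clockwise)
    from the diver's position to the dive exit point."""
    phi1, phi2 = math.radians(lat), math.radians(exit_lat)
    dlon = math.radians(exit_lon - lon)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def arrow_angle(bearing, heading):
    """Angle the AR arrow should be rotated on screen, relative to the
    diver's current heading, normalized to [-180, 180)."""
    return (bearing - heading + 180.0) % 360.0 - 180.0

# Exit due east of the diver: bearing is 90 degrees.
b = bearing_to_exit(0.0, 0.0, 0.0, 1.0)
# Diver facing north (heading 0): arrow points 90 degrees to the right.
a = arrow_angle(b, 0.0)
```

In practice the raw GPS fix would be fused with the inertial and visual (e.g. DolphinSLAM) estimates before this step, since GPS alone degrades quickly underwater.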