Motion Segmentation for Neuromorphic Aerial Surveillance

This paper addresses these challenges by introducing a novel motion segmentation method that applies self-supervised vision transformers to both event data and optical flow. Our approach eliminates the need for human annotations and reduces dependence on scene-specific parameters.