The performance of mobile multi-robot systems depends critically on each robot's awareness of the others, particularly their positions. GPS and motion-capture cameras are commonly used to acquire and communicate robot positions, but such sensing schemes depend on external infrastructure and restrict the capabilities of a multi-robot system; e.g., the robots cannot operate in both indoor and outdoor environments. Conversely, peer-to-peer localization algorithms free the robots from such infrastructure: each robot uses on-board sensing to infer the positions of nearby robots. In this approach, a model of the other robots' motion is essential. We introduce a flocking-aware localization scheme that exploits the motion behavior exhibited by the other robots. The proposed scheme relies only on the robots' on-board sensors and computational capabilities, and it yields more accurate localization than peer-to-peer algorithms that ignore the flocking behavior. We verify the performance of our scheme in simulation and demonstrate it in experiments with two unmanned aerial vehicles.