DAPs: Deep Action Proposals for Action Understanding

Victor Escorcia, Fabian Caba Heilbron, Juan Carlos Niebles, Bernard Ghanem

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

160 Scopus citations

Abstract

Object proposals have contributed significantly to recent advances in object understanding in images. Inspired by the success of this approach, we introduce Deep Action Proposals (DAPs), an effective and efficient algorithm for generating temporal action proposals from long videos. We show how to exploit the capacity of deep learning models and memory cells to retrieve, from untrimmed videos, temporal segments that are likely to contain actions. A comprehensive evaluation indicates that our approach outperforms previous work on a large-scale action benchmark, runs at 134 FPS, making it practical for large-scale scenarios, and exhibits an appealing ability to generalize, i.e., to retrieve good-quality temporal proposals of actions unseen in training.
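To make the proposal mechanism concrete, the sketch below illustrates the two generic building blocks such a pipeline relies on: generating candidate temporal segments ending at a sliding-window position from a set of anchor durations, and suppressing near-duplicate segments by temporal IoU. This is a minimal illustration, not the authors' implementation; the anchor durations, the IoU threshold, and the function names are assumptions for the example, and the learned scoring model (the deep network with memory cells) is left out.

```python
import numpy as np

def anchor_segments(t_end, durations):
    """For a window ending at frame t_end, produce one candidate segment
    [t_end - d, t_end] per anchor duration d, clipped at frame 0.
    The durations here are illustrative, not taken from the paper."""
    starts = np.maximum(0, t_end - np.asarray(durations))
    ends = np.full(len(durations), t_end)
    return np.stack([starts, ends], axis=1)

def temporal_iou(seg, segs):
    """Intersection-over-union of one segment against an array of segments."""
    inter = np.maximum(0.0, np.minimum(seg[1], segs[:, 1])
                            - np.maximum(seg[0], segs[:, 0]))
    union = (seg[1] - seg[0]) + (segs[:, 1] - segs[:, 0]) - inter
    return inter / np.maximum(union, 1e-8)

def temporal_nms(segments, scores, iou_thresh=0.7):
    """Greedy non-maximum suppression over temporal segments:
    keep the highest-scoring segment, drop overlapping ones, repeat."""
    order = np.argsort(scores)[::-1]
    keep = []
    while order.size:
        i = order[0]
        keep.append(int(i))
        ious = temporal_iou(segments[i], segments[order[1:]])
        order = order[1:][ious <= iou_thresh]
    return keep
```

Scoring each candidate with a learned confidence and then applying `temporal_nms` yields a ranked, non-redundant proposal list; the fixed anchor set is what lets one forward pass emit many segments at once.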
Original language: English (US)
Title of host publication: Lecture Notes in Computer Science
Publisher: Springer Nature
Pages: 768-784
Number of pages: 17
ISBN (Print): 9783319464862
DOIs
State: Published - Sep 17 2016

