UnweaveNet: Unweaving Activity Stories

University of Bristol

Abstract

Our lives can be seen as a complex weaving of activities; we switch from one activity to another, to maximise our achievements or in reaction to demands placed upon us. Observing a video of unscripted daily activities, we parse the video into its constituent activity threads through a process we call unweaving. To accomplish this, we introduce a video representation explicitly capturing activity threads called a thread bank, along with a neural controller capable of detecting goal changes and the resumption of past activities, together forming UnweaveNet. We train and evaluate UnweaveNet on sequences from the unscripted egocentric dataset EPIC-KITCHENS. We propose and showcase the efficacy of pretraining UnweaveNet in a self-supervised manner.
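
To make the idea of a thread bank and controller more concrete, below is a minimal, hypothetical PyTorch sketch of the decision faced for each incoming clip: continue an existing activity thread, resume a paused one, or open a new one. The class name ThreadBankSketch, the GRU-based thread-state update, and the bilinear compatibility score are illustrative assumptions for exposition only, not the architecture used in the paper.

import torch
import torch.nn as nn


class ThreadBankSketch(nn.Module):
    """Conceptual sketch: keep one state vector per activity thread and, for
    each incoming clip feature, either route it to an existing thread
    (continuing or resuming that activity) or open a new thread."""

    def __init__(self, feat_dim: int = 256):
        super().__init__()
        # Hypothetical components; the paper's actual networks differ.
        self.update = nn.GRUCell(feat_dim, feat_dim)         # folds a clip into a thread state
        self.score = nn.Bilinear(feat_dim, feat_dim, 1)      # clip-vs-thread compatibility
        self.new_thread_bias = nn.Parameter(torch.zeros(1))  # score for opening a new thread

    def forward(self, clip_feats: torch.Tensor):
        """clip_feats: (T, feat_dim) sequence of per-clip features."""
        threads: list[torch.Tensor] = []  # the thread bank: one state per thread
        assignments: list[int] = []       # which thread each clip was routed to
        for feat in clip_feats:
            # Score the clip against every existing thread, plus the "new thread" option.
            scores = [self.score(feat, state).squeeze() for state in threads]
            scores.append(self.new_thread_bias.squeeze())
            # Hard argmax for illustration; a trained controller would use a
            # learnt, supervised (or self-supervised) decision instead.
            choice = int(torch.stack(scores).argmax())
            if choice == len(threads):
                # Open a new thread seeded from this clip.
                threads.append(self.update(feat.unsqueeze(0)).squeeze(0))
            else:
                # Continue or resume thread `choice` by folding the clip into its state.
                threads[choice] = self.update(
                    feat.unsqueeze(0), threads[choice].unsqueeze(0)
                ).squeeze(0)
            assignments.append(choice)
        return assignments, threads


# Example: route 8 random clip features into threads.
model = ThreadBankSketch()
assignments, bank = model(torch.randn(8, 256))

In the paper, the thread states, update network, and controller are learnt jointly and can be pretrained in a self-supervised manner; the sketch above only illustrates the bookkeeping of routing clips to threads.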

CVPR 2022 Talk (June 2022)

Supplementary Video (Mar 2022)

Dataset

You can find the annotations for EPIC-KITCHENS-100 threads here.


BibTeX

@InProceedings{Price_2022_CVPR,
    author        = {Price, Will and Vondrick, Carl and Damen, Dima},
    title         = {UnweaveNet: Unweaving Activity Stories},
    booktitle     = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    year          = {2022},
    pages         = {13770-13779},
}

Downloads

Acknowledgements

Funded by the EPSRC National Productivity Investment Fund (NPIF) Doctoral Training Programme, EPSRC UMPIRE (EP/T004991/1), and NSF NRI Award #2132519.