OpenMMLab's Next Generation Video Understanding Toolbox and Benchmark
An open-source toolbox for action understanding based on PyTorch
Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits
Spatial Temporal Graph Convolutional Networks (ST-GCN) for the recognition of quick human actions. Quick actions are derived by downsampling the NTU-RGB+D dataset.
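The ST-GCN idea behind these projects — a spatial graph convolution over skeleton joints followed by a temporal convolution over frames — can be sketched with NumPy on a toy skeleton (the chain graph, shapes, and random weights here are all hypothetical stand-ins, not the actual ST-GCN implementation):

```python
import numpy as np

# Hypothetical toy skeleton: 5 joints connected in a simple chain.
num_joints, num_frames, channels = 5, 8, 3
edges = [(0, 1), (1, 2), (2, 3), (3, 4)]

# Adjacency with self-loops, symmetrically normalized (standard GCN recipe).
A = np.eye(num_joints)
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
deg = A.sum(axis=1)
A_norm = A / np.sqrt(np.outer(deg, deg))

# Input: (frames, joints, channels), e.g. 3D joint coordinates per frame.
rng = np.random.default_rng(0)
x = rng.standard_normal((num_frames, num_joints, channels))
W = rng.standard_normal((channels, channels))  # learnable in a real model

# Spatial step: mix features across graph-connected joints, then project.
spatial = np.einsum("ij,tjc,cd->tid", A_norm, x, W)

# Temporal step: average each joint/channel series over a 3-frame window.
kernel = np.ones(3) / 3.0
temporal = np.stack(
    [np.convolve(spatial[:, j, c], kernel, mode="same")
     for j in range(num_joints) for c in range(channels)],
    axis=-1,
).reshape(num_frames, num_joints, channels)

print(temporal.shape)  # (8, 5, 3): same layout, features now spatio-temporally mixed
```

A real ST-GCN stacks many such layers with learned weights and partitioned adjacency matrices, but the alternating spatial-then-temporal aggregation shown here is the core operation.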
A platform that helps moderators and researchers leverage open data to understand and further research on Wikipedia edit contributions, based on their geospatial and temporal event distribution