Paper Title


CSCNet: Contextual Semantic Consistency Network for Trajectory Prediction in Crowded Spaces

Authors

Beihao Xia, Conghao Wong, Qinmu Peng, Wei Yuan, Xinge You

Abstract


Trajectory prediction aims to forecast the movement trends of agents such as pedestrians, cyclists, and vehicles. It helps analyze and understand human activities in crowded spaces and is widely applied in areas such as surveillance video analysis and autonomous driving systems. Thanks to the success of deep learning, trajectory prediction has made significant progress. Current methods are dedicated to studying agents' future trajectories under social interactions and the scenes' physical constraints, and how to handle these factors still attracts researchers' attention. However, these methods ignore the \textbf{Semantic Shift Phenomenon} when modeling such interactions across various prediction scenes. Several kinds of semantic deviations exist within or between social and physical interactions, which we call the "\textbf{Gap}". In this paper, we propose a \textbf{C}ontextual \textbf{S}emantic \textbf{C}onsistency \textbf{Net}work (\textbf{CSCNet}) to predict agents' future activities with powerful and efficient context constraints. We utilize a well-designed context-aware transfer to obtain intermediate representations from the scene images and trajectories. We then eliminate the differences between social and physical interactions by aligning activity semantics and scene semantics to cross the Gap. Experiments demonstrate that CSCNet outperforms most current methods both quantitatively and qualitatively.
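The abstract does not specify how the alignment of activity semantics and scene semantics is realized. As a purely illustrative sketch (not the paper's actual architecture or loss), one common way to enforce consistency between two heterogeneous feature sources is to project both into a shared embedding space and penalize their cosine disagreement; all function and parameter names below are hypothetical:

```python
import numpy as np

def l2_normalize(x, axis=-1, eps=1e-8):
    """Scale each vector along `axis` to unit length."""
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

def consistency_loss(activity_feat, scene_feat, W_a, W_s):
    """1 - mean cosine similarity between the two projected semantics.

    activity_feat: (N, Da) trajectory-derived features
    scene_feat:    (N, Ds) scene-image-derived features
    W_a, W_s:      learned projections into a shared D-dim space
    Returns 0 when the projected semantics are perfectly aligned,
    up to 2 when they point in opposite directions.
    """
    a = l2_normalize(activity_feat @ W_a)   # (N, D) activity semantics
    s = l2_normalize(scene_feat @ W_s)      # (N, D) scene semantics
    cos = np.sum(a * s, axis=-1)            # per-sample cosine similarity
    return 1.0 - cos.mean()

# Toy usage with random features and projections.
rng = np.random.default_rng(0)
act = rng.normal(size=(4, 16))              # 4 agents, 16-dim activity features
scn = rng.normal(size=(4, 32))              # matching 32-dim scene features
W_a = rng.normal(size=(16, 8))
W_s = rng.normal(size=(32, 8))
print(consistency_loss(act, scn, W_a, W_s))
```

Minimizing such a term during training would push the two semantic streams toward a shared representation, which is one plausible reading of "aligning activity semantics and scene semantics to cross the Gap."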
