Paper Title

Accelerating the Inference of the Exa.TrkX Pipeline

Authors

Alina Lazar, Xiangyang Ju, Daniel Murnane, Paolo Calafiura, Steven Farrell, Yaoyuan Xu, Maria Spiropulu, Jean-Roch Vlimant, Giuseppe Cerati, Lindsey Gray, Thomas Klijnsma, Jim Kowalkowski, Markus Atkinson, Mark Neubauer, Gage DeZoort, Savannah Thais, Shih-Chieh Hsu, Adam Aurisano, V Hewes, Alexandra Ballow, Nirajan Acharya, Chun-yi Wang, Emma Liu, Alberto Lucas

Abstract

Recently, graph neural networks (GNNs) have been successfully used for a variety of particle reconstruction problems in high energy physics, including particle tracking. The Exa.TrkX pipeline based on GNNs demonstrated promising performance in reconstructing particle tracks in dense environments. It includes five discrete steps: data encoding, graph building, edge filtering, GNN, and track labeling. All steps were written in Python and run on both GPUs and CPUs. In this work, we accelerate the Python implementation of the pipeline through customized and commercial GPU-enabled software libraries, and develop a C++ implementation for inferencing the pipeline. The implementation features an improved, CUDA-enabled fixed-radius nearest neighbor search for graph building and a weakly connected component graph algorithm for track labeling. GNNs and other trained deep learning models are converted to ONNX and inferenced via the ONNX Runtime C++ API. The complete C++ implementation of the pipeline allows integration with existing tracking software. We report the memory usage and average event latency tracking performance of our implementation applied to the TrackML benchmark dataset.
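
As a rough illustration of the ONNX Runtime C++ API usage mentioned in the abstract, the sketch below loads one exported model and runs it on a batch of hit features. This is only a minimal sketch: the model path, the tensor names ("features", "embedding"), and the input shape are placeholders and not the actual values used by the Exa.TrkX pipeline.

// Minimal sketch: run an exported (ONNX) model with the ONNX Runtime C++ API.
// Model path, tensor names, and shapes are illustrative placeholders only.
#include <onnxruntime_cxx_api.h>

#include <array>
#include <iostream>
#include <vector>

int main() {
  Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "exatrkx-inference");

  Ort::SessionOptions options;
  options.SetIntraOpNumThreads(1);
  // To run on GPU, the CUDA execution provider could be appended here, e.g.:
  // OrtSessionOptionsAppendExecutionProvider_CUDA(options, /*device_id=*/0);

  // Hypothetical path to one of the converted models (e.g. the embedding MLP).
  Ort::Session session(env, "embedding.onnx", options);

  // Fake input: 4 hits with 3 features each (e.g. cylindrical coordinates).
  constexpr int64_t kNumHits = 4;
  constexpr int64_t kNumFeatures = 3;
  std::vector<float> hits(kNumHits * kNumFeatures, 0.5f);
  std::array<int64_t, 2> shape{kNumHits, kNumFeatures};

  Ort::MemoryInfo mem_info =
      Ort::MemoryInfo::CreateCpu(OrtArenaAllocator, OrtMemTypeDefault);
  Ort::Value input = Ort::Value::CreateTensor<float>(
      mem_info, hits.data(), hits.size(), shape.data(), shape.size());

  // Assumed tensor names; a real exported graph may use different ones
  // (they can be queried via session.GetInputNameAllocated and
  // session.GetOutputNameAllocated).
  const char* input_names[] = {"features"};
  const char* output_names[] = {"embedding"};

  std::vector<Ort::Value> outputs = session.Run(
      Ort::RunOptions{nullptr}, input_names, &input, 1, output_names, 1);

  const float* data = outputs.front().GetTensorData<float>();
  std::cout << "first output value: " << data[0] << "\n";
  return 0;
}

In the actual pipeline, each trained model (embedding, filter, GNN) would be exported to its own ONNX graph, and the tensor shapes would be set per event rather than hard-coded as above.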
