Posted: 2024/02/16 16:25:39 · Last reply: 福州司马懿, 2024/02/27 10:02:38 · Forum: Image Recognition
His reply:
Found the problem. In the original `fairmot_postprocess.py`, change:

```python
hm_eval = torch.from_numpy(np.fromfile(dataloader[i + 3], dtype='float32').reshape(1, 1, 152, 272))
wh_eval = torch.from_numpy(np.fromfile(dataloader[i + 2], dtype='float32').reshape(1, 4, 152, 272))
id_eval = torch.from_numpy(np.fromfile(dataloader[i + 1], dtype='float32').reshape(1, 128, 152, 272))
reg_eval = torch.from_numpy(np.fromfile(dataloader[i], dtype='float32').reshape(1, 2, 152, 272))
```

to:

```python
hm_eval = torch.from_numpy(np.fromfile(dataloader[i + 0], dtype='float32').reshape(1, 1, 152, 272))
wh_eval = torch.from_numpy(np.fromfile(dataloader[i + 1], dtype='float32').reshape(1, 4, 152, 272))
id_eval = torch.from_numpy(np.fromfile(dataloader[i + 2], dtype='float32').reshape(1, 128, 152, 272))
reg_eval = torch.from_numpy(np.fromfile(dataloader[i + 3], dtype='float32').reshape(1, 2, 152, 272))
```

That is all it takes: the file order in `dataloader` was simply reversed. After the change, the output is:

```
Fix size testing.
training chunk_sizes: [6, 6]
The output will be saved to  ./FairMOT/src/lib/../../exp/mot/default
heads {'hm': 1, 'wh': 4, 'id': 128, 'reg': 2}
2024-02-16 17:19:08 [INFO]: start seq: MOT17-02-SDP
2024-02-16 17:19:08 [INFO]: start seq: MOT17-02-SDP
/pytorch/aten/src/ATen/native/BinaryOps.cpp:81: UserWarning: Integer division of tensors using div or / is deprecated, and in a future release div will perform true division as in Python 3. Use true_divide or floor_divide (// in Python) instead.
2024-02-16 17:19:09 [INFO]: save results to ./dataset/MOT17/images/train/../results/MOT17_test_public_dla34/MOT17-02-SDP.txt
2024-02-16 17:19:09 [INFO]: save results to ./dataset/MOT17/images/train/../results/MOT17_test_public_dla34/MOT17-02-SDP.txt
2024-02-16 17:19:09 [INFO]: Evaluate seq: MOT17-02-SDP
2024-02-16 17:19:09 [INFO]: Evaluate seq: MOT17-02-SDP
              IDF1   IDP   IDR  Rcll  Prcn GT MT PT ML FP FN IDs  FM  MOTA  MOTP IDt IDa IDm
MOT17-02-SDP 91.3% 96.9% 86.4% 86.4% 96.9% 22 19  0  3  3 15   0   0 83.6% 0.163   0   0   0
OVERALL      91.3% 96.9% 86.4% 86.4% 96.9% 22 19  0  3  3 15   0   0 83.6% 0.163   0   0   0
```
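The fix comes down to matching each dumped blob file to the head with the right channel count (hm: 1, wh: 4, id: 128, reg: 2). A minimal, self-contained sketch (using made-up dummy files, not the actual model output dumps) of why the reversed indexing fails:

```python
# Sketch only: dummy .bin files stand in for the real post-processing dumps.
import numpy as np
import os
import tempfile

H, W = 152, 272
# Channel counts of the four FairMOT heads, matching the post.
heads = {"hm": 1, "wh": 4, "id": 128, "reg": 2}

tmpdir = tempfile.mkdtemp()
dataloader = []
for name, c in heads.items():  # files written in head order: hm, wh, id, reg
    path = os.path.join(tmpdir, f"{name}.bin")
    np.zeros((1, c, H, W), dtype=np.float32).tofile(path)
    dataloader.append(path)

i = 0
# Correct order (the fix): the k-th file is reshaped to the k-th head's shape.
hm  = np.fromfile(dataloader[i + 0], dtype="float32").reshape(1, 1,   H, W)
wh  = np.fromfile(dataloader[i + 1], dtype="float32").reshape(1, 4,   H, W)
ide = np.fromfile(dataloader[i + 2], dtype="float32").reshape(1, 128, H, W)
reg = np.fromfile(dataloader[i + 3], dtype="float32").reshape(1, 2,   H, W)

# Reversed order (the original bug): the hm blob has 1*152*272 floats,
# which cannot be reshaped into (1, 2, 152, 272), so numpy raises ValueError.
try:
    np.fromfile(dataloader[i + 0], dtype="float32").reshape(1, 2, H, W)
    reversed_order_ok = True
except ValueError:
    reversed_order_ok = False
```

Because every head has a distinct element count, a wrong index either crashes on the reshape (as above) or, where counts happen to line up, silently feeds one head's data into another, which would wreck the tracking metrics instead of raising an error.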