Ranking | Team Name | PACC (%) | ASR (%) | ACC (%) |
---|---|---|---|---|
1 | HZZQ | 62.04 | 5.86 | 75.57 |
2 | mmbd | 57.19 | 21.62 | 83.48 |
3 | Avenger | 54.43 | 30.85 | 84.54 |
4 | duola | 51.46 | 29.31 | 87.59 |
5 | CRISES | 42.61 | 45.43 | 80.98 |
6 | apple | 42.14 | 43.92 | 89.87 |
7 | I-BAU (Baseline) | 42.08 | 37.49 | 82.73 |
8 | Cherry | 41.35 | 44.02 | 87.95 |
9 | BJTU-THETA-LAB | 41.16 | 41.93 | 87.6 |
10 | tom | 40.96 | 46.15 | 90.18 |
11 | banana | 40.92 | 47.29 | 90.21 |
12 | 123 | 39.5 | 46.57 | 80.41 |
13 | dongdong4fei | 17.96 | 68.57 | 84.5 |
14 | No Defense (Baseline) | 3.03 | 96.1 | 91.73 |
* Three groups voluntarily withdrew from the final evaluation. Another three groups' submissions were identical to the No Defense baseline and are therefore omitted from this list.
For the detailed evaluation results and attack settings, please check this link. We will contact the award winners as soon as possible. Thank you to everyone who participated!
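All metrics above are reported in percent. The sketch below illustrates one common way such metrics are computed in backdoor/trojan-removal evaluations (PACC as accuracy on poisoned inputs with respect to their original labels, ASR as the attack success rate on the attacker's target class, ACC as clean accuracy). It is a minimal, illustrative example, not the competition's official scoring code; the function and variable names are assumptions.

```python
import numpy as np

def evaluate_defense(model, clean_x, clean_y, poison_x, poison_y, target):
    """Illustrative scoring sketch (not the official TRC'22 scorer).

    model     -- defended classifier exposing a predict() method (assumed)
    clean_x/y -- clean test inputs and labels
    poison_x/y-- trigger-stamped inputs and their ORIGINAL (true) labels
    target    -- the attacker's target class
    """
    clean_pred = np.asarray(model.predict(clean_x))
    poison_pred = np.asarray(model.predict(poison_x))

    # ACC: accuracy on clean test data.
    acc = np.mean(clean_pred == clean_y)
    # PACC: accuracy on poisoned inputs, measured against the true labels.
    pacc = np.mean(poison_pred == poison_y)
    # ASR: fraction of poisoned inputs (excluding samples already belonging
    # to the target class) that the model classifies as the target label.
    mask = poison_y != target
    asr = np.mean(poison_pred[mask] == target)

    return {"PACC": 100 * pacc, "ASR": 100 * asr, "ACC": 100 * acc}
```

Under these (assumed) definitions, a strong defense pushes PACC toward the clean ACC while driving ASR toward zero, which matches the ordering used in the leaderboards above.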
Ranking | Team Name | PACC (%) | ASR (%) | ACC (%) |
---|---|---|---|---|
1 | duola | 71.9 | 9.5 | 89.7 |
2 | banana | 71.3 | 9.9 | 89.4 |
3 | BJTU-THETA-LAB | 70.7 | 11.0 | 90.0 |
4 | apple | 70.0 | 12.4 | 89.2 |
5 | Avenger | 69.5 | 12.5 | 90.5 |
6 | tom | 69.1 | 13.7 | 90.5 |
7 | Cherry | 68.0 | 13.9 | 89.5 |
8 | HZZQ | 67.2 | 9.3 | 81.1 |
9 | CRISES | 66.0 | 4.5 | 76.6 |
10 | Frontdoor | 64.5 | 2.7 | 89.4 |
11 | Highlander | 63.6 | 4.0 | 89.1 |
12 | mmbd | 59.8 | 20.7 | 86.8 |
13 | 123 | 57.3 | 10.4 | 79.0 |
14 | dongdong4fei | 22.5 | 54.0 | 86.6 |
15 | Lowlanders | 22.3 | 73.8 | 88.6 |
16 | No Defense (Baseline) | 3.9 | 95.8 | 91.9 |
Ranking | Team Name | PACC (%) | ASR (%) | ACC (%) |
---|---|---|---|---|
1 | Avenger | 61.5 | 11.8 | 84.7 |
2 | HZZQ | 61.4 | 5.3 | 75.7 |
3 | Frontdoor | 60.1 | 1.0 | 76.8 |
4 | CRISES | 59.7 | 15.8 | 82.0 |
5 | Highlander | 59.4 | 1.4 | 75.9 |
6 | mmbd | 58.5 | 20.9 | 77.5 |
7 | I-BAU (Baseline) | 56.7 | 31.8 | 82.3 |
8 | 123 | 53.2 | 17.3 | 69.5 |
9 | banana | 53.0 | 23.3 | 80.1 |
10 | BJTU-THETA-LAB | 52.8 | 24.0 | 80.7 |
11 | tom | 51.9 | 6.2 | 75.5 |
12 | Cherry | 51.4 | 24.9 | 79.6 |
13 | duola | 40.5 | 3.2 | 39.3 |
14 | apple | 38.7 | 9.5 | 38.6 |
15 | JMN | 38.0 | 36.0 | 62.2 |
16 | dongdong4fei | 28.2 | 58.3 | 83.2 |
17 | 1111 | 3.9 | 95.8 | 91.9 |
18 | No Defense (Baseline) | 2.4 | 96.3 | 91.5 |
19 | Seawolf | 2.4 | 96.3 | 91.5 |
20 | Lowlanders | 2.4 | 96.3 | 91.5 |
IEEE TRC’22 is supported by funding granted to the IEEE Smart Computing STC (awarded by the IEEE Computer Society Planning Committee for Emerging Techniques 2022, Dakota State University #845360).
Please contact Yi Zeng or Ruoxi Jia if you have any questions.