IEEE TRC'22
Leaderboards

Final Evaluation Ranking 🥇 🥈 🥉

| Ranking | Team Name | PACC | ASR | ACC |
| --- | --- | --- | --- | --- |
| 1 | HZZQ | 62.04 | 5.86 | 75.57 |
| 2 | mmbd | 57.19 | 21.62 | 83.48 |
| 3 | Avenger | 54.43 | 30.85 | 84.54 |
| 4 | duola | 51.46 | 29.31 | 87.59 |
| 5 | CRISES | 42.61 | 45.43 | 80.98 |
| 6 | apple | 42.14 | 43.92 | 89.87 |
| 7 | I-BAU (Baseline) | 42.08 | 37.49 | 82.73 |
| 8 | Cherry | 41.35 | 44.02 | 87.95 |
| 9 | BJTU-THETA-LAB | 41.16 | 41.93 | 87.6 |
| 10 | tom | 40.96 | 46.15 | 90.18 |
| 11 | banana | 40.92 | 47.29 | 90.21 |
| 12 | 123 | 39.5 | 46.57 | 80.41 |
| 13 | dongdong4fei | 17.96 | 68.57 | 84.5 |
| 14 | No Defense (Baseline) | 3.03 | 96.1 | 91.73 |

* Three groups voluntarily withdrew from the final evaluation. Another three groups' submissions were identical to the No Defense baseline and were therefore dropped from this list.

For the detailed evaluation results and attack settings, please check this link. We will contact the award winners as soon as possible. Thanks to everyone who participated!
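The page does not define the three column metrics, but in backdoor-defense evaluations they conventionally denote clean accuracy (ACC), attack success rate (ASR), and poisoned accuracy (PACC, accuracy on triggered inputs with respect to their true labels). The sketch below is an illustration under that assumption, not the competition's official scoring code; all names and toy values are hypothetical.

```python
# Hypothetical sketch of the three leaderboard metrics, as commonly
# defined in backdoor-defense evaluations (assumed, not taken from this page):
#   ACC  - accuracy on clean inputs
#   ASR  - fraction of triggered inputs classified as the attacker's target label
#   PACC - accuracy on triggered inputs w.r.t. their true labels

def accuracy(preds, labels):
    """Percentage of predictions matching the ground-truth labels."""
    return 100.0 * sum(p == y for p, y in zip(preds, labels)) / len(labels)

def attack_success_rate(preds_on_triggered, target_label):
    """Percentage of triggered inputs predicted as the attacker's target label."""
    return 100.0 * sum(p == target_label for p in preds_on_triggered) / len(preds_on_triggered)

# Toy example: 4 clean inputs, 4 triggered inputs, attacker target label = 7.
clean_preds, clean_labels = [0, 1, 2, 1], [0, 1, 2, 2]
trig_preds, trig_labels = [7, 7, 1, 2], [0, 1, 1, 2]

print(accuracy(clean_preds, clean_labels))        # ACC  -> 75.0
print(attack_success_rate(trig_preds, 7))         # ASR  -> 50.0
print(accuracy(trig_preds, trig_labels))          # PACC -> 50.0
```

A strong defense pushes ASR down and PACC up while keeping ACC close to the undefended model, which matches the ordering pattern visible in the tables above.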

Second Held-Out Model Set Ranking

| Ranking | Team Name | PACC | ASR | ACC |
| --- | --- | --- | --- | --- |
| 1 | duola | 71.9 | 9.5 | 89.7 |
| 2 | banana | 71.3 | 9.9 | 89.4 |
| 3 | BJTU-THETA-LAB | 70.7 | 11.0 | 90.0 |
| 4 | apple | 70.0 | 12.4 | 89.2 |
| 5 | Avenger | 69.5 | 12.5 | 90.5 |
| 6 | tom | 69.1 | 13.7 | 90.5 |
| 7 | Cherry | 68.0 | 13.9 | 89.5 |
| 8 | HZZQ | 67.2 | 9.3 | 81.1 |
| 9 | CRISES | 66.0 | 4.5 | 76.6 |
| 10 | Frontdoor | 64.5 | 2.7 | 89.4 |
| 11 | Highlander | 63.6 | 4.0 | 89.1 |
| 12 | mmbd | 59.8 | 20.7 | 86.8 |
| 13 | 123 | 57.3 | 10.4 | 79.0 |
| 14 | dongdong4fei | 22.5 | 54.0 | 86.6 |
| 15 | Lowlanders | 22.3 | 73.8 | 88.6 |
| 16 | No_Defense (Baseline) | 3.9 | 95.8 | 91.9 |

First Held-Out Model Set Ranking

| Ranking | Team Name | PACC | ASR | ACC |
| --- | --- | --- | --- | --- |
| 1 | Avenger | 61.5 | 11.8 | 84.7 |
| 2 | HZZQ | 61.4 | 5.3 | 75.7 |
| 3 | Frontdoor | 60.1 | 1.0 | 76.8 |
| 4 | CRISES | 59.7 | 15.8 | 82.0 |
| 5 | Highlander | 59.4 | 1.4 | 75.9 |
| 6 | mmbd | 58.5 | 20.9 | 77.5 |
| 7 | I_BAU (Baseline) | 56.7 | 31.8 | 82.3 |
| 8 | 123 | 53.2 | 17.3 | 69.5 |
| 9 | banana | 53.0 | 23.3 | 80.1 |
| 10 | BJTU-THETA-LAB | 52.8 | 24.0 | 80.7 |
| 11 | tom | 51.9 | 6.2 | 75.5 |
| 12 | Cherry | 51.4 | 24.9 | 79.6 |
| 13 | duola | 40.5 | 3.2 | 39.3 |
| 14 | apple | 38.7 | 9.5 | 38.6 |
| 15 | JMN | 38.0 | 36.0 | 62.2 |
| 16 | dongdong4fei | 28.2 | 58.3 | 83.2 |
| 17 | 111 | 3.9 | 95.8 | 91.9 |
| 18 | No_Defense (Baseline) | 2.4 | 96.3 | 91.5 |
| 19 | Seawolf | 2.4 | 96.3 | 91.5 |
| 20 | Lowlanders | 2.4 | 96.3 | 91.5 |

IEEE TRC’22 is supported by funding granted to the IEEE Smart Computing STC (awarded by the IEEE Computer Society Planning Committee for Emerging Techniques 2022, Dakota State University #845360).

Please contact Yi Zeng or Ruoxi Jia if you have any questions.