# Full SC20 List

This is the full list released at SC20 (Supercomputing 2020). It shows all submissions, including repeat runs of the same system.

| # | BoF | Institution | System | Storage Vendor | File System Type | Client Nodes | Total Client Proc. | IO500 Score | BW (GiB/s) | MD (kIOP/s) |
|---|-----|-------------|--------|----------------|------------------|--------------|--------------------|-------------|------------|-------------|
| 101 | ISC20 | JINR | Govorun | RSC Group | Lustre | 50 | 800 | 88.51 | 36.28 | 215.97 |
| 102 | ISC20 | SPbPU | Polytechnic RSC Tornado | RSC Group | Lustre | 59 | 944 | 64.29 | 21.56 | 191.73 |
| 103 | ISC20 | SPbPU | Polytechnic RSC Tornado | RSC Group | Lustre | 59 | 944 | 64.11 | 21.00 | 195.66 |
| 104 | ISC20 | Red Hat | EC2-10xi3en.metal | Red Hat | CephFS | 10 | 320 | 57.17 | 26.29 | 124.30 |
| 105 | ISC20 | Red Hat | EC2-10xi3en.metal | Red Hat | CephFS | 10 | 320 | 55.13 | 25.67 | 118.42 |
| 106 | ISC20 | | Officinalis | Red Hat, Intel, QCT | CephFS | 8 | 256 | 66.88 | 28.58 | 156.48 |
| 107 | ISC20 | | Officinalis | Red Hat, Intel, QCT | CephFS | 8 | 256 | 59.42 | 28.44 | 124.13 |
| 108 | ISC20 | | Officinalis | Red Hat, Intel, QCT | CephFS | 8 | 256 | 56.28 | 27.67 | 114.50 |
| 109 | ISC20 | | Officinalis | Red Hat, Intel, QCT | CephFS | 8 | 256 | 53.44 | 28.10 | 101.63 |
| 110 | ISC20 | | Officinalis (1 Client) | Red Hat, Intel, QCT | CephFS | 1 | 112 | 27.43 | 8.81 | 85.48 |
| 111 | ISC20 | | Officinalis (1 Client) | Red Hat, Intel, QCT | CephFS | 1 | 112 | 21.24 | 7.87 | 57.32 |
| 112 | SC19 | SUSE | TigerShark | SUSE, Intel, Lenovo | SUSE Enterprise Storage - CephFS | 15 | 120 | 14.91 | 5.74 | 38.72 |
| 113 | SC19 | SUSE | TigerShark | SUSE, Intel, Lenovo | SUSE Enterprise Storage - CephFS | 10 | 80 | 12.43 | 4.64 | 33.34 |
| 114 | ISC20 | SUSE | HammerHead | SUSE/Ampere/Arm | CephFS | 10 | 220 | 15.61 | 7.34 | 33.22 |
| 115 | ISC20 | SUSE | HammerHead | SUSE/Ampere/Arm | CephFS | 10 | 220 | 15.38 | 7.04 | 33.57 |
| 116 | ISC20 | SUSE | HammerHead | SUSE/Ampere/Arm | CephFS | 10 | 200 | 15.07 | 7.05 | 32.21 |
| 117 | ISC20 | SUSE | HammerHead | SUSE/Ampere/Arm | CephFS | 10 | 200 | 14.55 | 6.95 | 30.46 |
| 118 | SC19 | Janelia Research Campus, HHMI | Vast | Vast Data | Scale-out NAS | 88 | 2,816 | 12.50 | 8.31 | 18.79 |
| 119 | SC20 | Janelia Research Campus, HHMI | Vast | Vast Data | Scale-out NAS | 10 | 260 | 10.64 | 7.44 | 15.22 |
| 120 | ISC20 | Janelia Research Campus, HHMI | Vast | Vast Data | Scale-out NAS | 10 | 620 | 9.87 | 6.36 | 15.32 |
| 121 | SC19 | Janelia Research Campus, HHMI | Vast | Vast Data | Scale-out NAS | 10 | 820 | 8.01 | 6.31 | 10.17 |
| 122 | ISC20 | Janelia Research Campus, HHMI | Vast | Vast Data | Scale-out NAS | 10 | 620 | 7.27 | 5.96 | 8.86 |
| 123 | ISC20 | Janelia Research Campus, HHMI | VastTest | Vast Data | Scale-out NAS | 10 | 620 | 6.20 | 3.51 | 10.95 |
| 124 | ISC20 | Janelia Research Campus, HHMI | VastTest | Vast Data | Scale-out NAS | 10 | 620 | 5.68 | 3.67 | 8.80 |
| 125 | ISC20 | Janelia Research Campus, HHMI | VastTest | Vast Data | Scale-out NAS | 10 | 620 | 4.98 | 3.56 | 6.96 |
| 126 | SC19 | WekaIO | WekaIO on AWS | WekaIO | WekaIO Matrix | 345 | 8,625 | 938.95 | 174.74 | 5,045.33 |
| 127 | SC19 | WekaIO | WekaIO | WekaIO | WekaIO Matrix | 10 | 2,610 | 156.51 | 56.22 | 435.76 |
| 128 | SC20 | Janelia Research Campus, HHMI | SharedScratch | WekaIO | Parallel Filesystem | 48 | 1,248 | 134.09 | 48.03 | 374.40 |
| 129 | SC20 | Janelia Research Campus, HHMI | SharedScratch | WekaIO | Parallel Filesystem | 40 | 1,040 | 123.47 | 43.63 | 349.44 |
| 130 | SC20 | Janelia Research Campus, HHMI | SharedScratch | WekaIO | Parallel Filesystem | 10 | 320 | 57.99 | 20.80 | 161.71 |
| 131 | SC19 | Janelia Research Campus, HHMI | SharedScratch | WekaIO | wekaio | 18 | 1,368 | 48.75 | 26.22 | 90.62 |
| 132 | SC19 | Janelia Research Campus, HHMI | SharedScratch | WekaIO | Parallel Filesystem | 10 | 1,440 | 36.55 | 18.58 | 71.90 |
| 133 | ISC20 | Janelia Research Campus, HHMI | SharedScratch | WekaIO | Parallel Filesystem | 10 | 600 | 31.39 | 12.34 | 79.83 |
| 134 | ISC20 | Janelia Research Campus, HHMI | SharedScratch | WekaIO | Parallel Filesystem | 10 | 600 | 30.83 | 13.35 | 71.22 |
| 135 | SC19 | Janelia Research Campus, HHMI | SharedScratch | WekaIO | Parallel Filesystem | 10 | 400 | 24.70 | 16.32 | 37.37 |
| 136 | ISC20 | Janelia Research Campus, HHMI | SharedScratch | WekaIO | Parallel Filesystem | 10 | 600 | 24.19 | 11.85 | 49.39 |
| 137 | ISC20 | Janelia Research Campus, HHMI | SharedScratch | WekaIO | Parallel Filesystem | 10 | 600 | 21.32 | 11.27 | 40.32 |
| 138 | SC19 | iFLYTEK | iFLYTEK | Yanrong | YRCloudFile | 10 | 200 | 78.54 | 13.55 | 455.18 |
| 139 | SC19 | iFLYTEK | iFLYTEK | Yanrong | YRCloudFile | 10 | 200 | 63.98 | 11.41 | 358.87 |
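
For readers cross-checking the columns: the IO500 score is the geometric mean of the bandwidth score (GiB/s) and the metadata score (kIOP/s). Below is a minimal sketch verifying this relation against two rows sampled from the table above; small differences from the published values are rounding in the listed sub-scores.

```python
import math

# Sample rows from the table above: (BW in GiB/s, MD in kIOP/s, reported score).
rows = {
    "101 JINR Govorun":  (36.28, 215.97, 88.51),
    "126 WekaIO on AWS": (174.74, 5045.33, 938.95),
}

for name, (bw, md, reported) in rows.items():
    score = math.sqrt(bw * md)  # geometric mean of the two sub-scores
    print(f"{name}: computed {score:.2f}, reported {reported:.2f}")
```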