Historical SC20 List

This is the historical list released at SC20 (Supercomputing 2020). Since the IO500 benchmark has been improved to reduce the opportunity for caching, some results from earlier lists cannot be directly compared with newer results. The list shows all submissions to date.

# BoF Institution System Storage Vendor File System Type Client Nodes Total Client Proc. Score BW (GiB/s) MD (kIOP/s) Repro.
101 SC20 EPCC NextGENIO BSC & JGU GekkoFS 10 3,800 239.37 45.79 1,251.32 -
102 SC20 EPCC NextGENIO BSC & JGU GekkoFS 10 360 184.99 27.21 1,257.57 -
103 ISC20 EPCC NextGENIO NextGenIO (BSC & JGU) GekkoFS 10 280 152.30 24.83 934.12 -
104 ISC20 EPCC NextGENIO NextGenIO (BSC & JGU) GekkoFS 10 280 149.92 24.91 902.12 -
105 SC19 EPCC NextGENIO NextGenIO (BSC & JGU) GekkoFS 10 240 125.08 21.47 728.68 -
106 ISC19 JCAHPC Oakforest-PACS DDN IME 2,048 2,048 275.65 492.06 154.41 -
107 SC20 JCAHPC Oakforest-PACS DDN IME 2,048 4,096 253.57 697.20 92.22 -
108 ISC20 JCAHPC Oakforest-PACS DDN IME 512 4,096 175.85 348.95 88.62 -
109 ISC20 JCAHPC Oakforest-PACS DDN IME 2,048 16,384 168.09 581.43 48.60 -
110 ISC20 JCAHPC Oakforest-PACS DDN IME 512 4,096 146.33 341.29 62.74 -
111 ISC18 JCAHPC Oakforest-PACS DDN IME 2,048 16,384 137.78 560.10 33.89 -
112 ISC18 JCAHPC Oakforest-PACS DDN IME 2,048 16,384 101.48 471.25 21.85 -
113 SC18 JCAHPC Oakforest-PACS DDN Lustre 256 8,192 44.24 21.42 91.40 -
114 SC18 JCAHPC Oakforest-PACS DDN Lustre 256 8,192 42.18 20.04 88.78 -
115 ISC20 Officinalis Red Hat, Intel, QCT CephFS 8 256 66.88 28.58 156.48 -
116 ISC20 Officinalis Red Hat, Intel, QCT CephFS 8 256 59.42 28.44 124.13 -
117 ISC20 Officinalis Red Hat, Intel, QCT CephFS 8 256 56.28 27.67 114.50 -
118 ISC20 Officinalis Red Hat, Intel, QCT CephFS 8 256 53.44 28.10 101.63 -
119 ISC20 Officinalis (1 Client) Red Hat, Intel, QCT CephFS 1 112 27.43 8.81 85.48 -
120 ISC20 Officinalis (1 Client) Red Hat, Intel, QCT CephFS 1 112 21.24 7.87 57.32 -
121 SC19 Oracle Cloud Infrastructure Oracle Cloud Infrastructure with Block Volume Service running Spectrum Scale Oracle Cloud Infrastructure Block Volumes Service Spectrum Scale 30 480 47.55 20.38 110.93 -
122 SC18 Clemson University Palmetto Dell BeeGFS 32 32 9.42 2.94 30.19 -
123 SC18 Clemson University Palmetto Dell BeeGFS 16 16 9.07 2.84 28.95 -
124 SC18 Clemson University Palmetto Dell BeeGFS 48 48 7.64 2.93 19.88 -
125 SC18 Clemson University Palmetto Dell BeeGFS 10 10 7.14 2.32 22.05 -
126 SC18 Clemson University Palmetto Dell BeeGFS 32 32 6.67 2.95 15.10 -
127 SC18 Clemson University Palmetto Dell BeeGFS 32 128 6.26 1.25 31.42 -
128 SC18 Clemson University Palmetto Dell BeeGFS 32 64 6.16 1.25 30.28 -
129 SC18 Clemson University Palmetto Dell BeeGFS 32 256 5.79 1.05 31.92 -
130 SC18 Clemson University Palmetto Dell BeeGFS 16 16 5.08 2.24 11.49 -
131 SC18 Clemson University Palmetto Dell OrangeFS 32 32 2.31 1.93 2.77 -
132 SC18 Clemson University Palmetto Dell OrangeFS 48 48 2.15 1.80 2.57 -
133 SC18 Clemson University Palmetto Dell OrangeFS 16 16 1.47 1.21 1.79 -
134 SC19 Clemson University Palmetto Burstbuffer AdvancedHPC BeeGFS 10 120 21.55 12.97 35.80 -
135 SC20 Pengcheng Laboratory Pengcheng Cloudbrain-II on Atlas 900 Pengcheng Laboratory MadFS 255 18,360 7,043.99 1,475.75 33,622.19 -
136 SC20 Pengcheng Laboratory Pengcheng Cloudbrain-II on Atlas 900 Pengcheng Laboratory MadFS 256 18,432 6,831.55 1,524.11 30,621.10 -
137 SC20 Pengcheng Laboratory Pengcheng Cloudbrain-II on Atlas 900 Pengcheng Laboratory MadFS 10 1,440 1,129.75 168.43 7,578.06 -
138 SC19 Penguin Computing Benchmarking and Innovation Lab Penguin-ASG-NVBeeOne Penguin Computing/Excelero BeeGFS 10 320 39.65 17.71 88.76 -
139 ISC20 SPbPU Polytechnic RSC Tornado RSC Group Lustre 59 944 64.29 21.56 191.73 -
140 ISC20 SPbPU Polytechnic RSC Tornado RSC Group Lustre 59 944 64.11 21.00 195.66 -
141 ISC20 Argonne National Laboratory Presque Argonne National Laboratory DAOS 16 544 537.31 108.19 2,668.57 -
142 ISC20 Argonne National Laboratory Presque Argonne National Laboratory DAOS 16 544 483.52 102.54 2,280.05 -
143 ISC20 Argonne National Laboratory Presque Argonne National Laboratory DAOS 10 380 440.64 95.80 2,026.80 -
144 ISC20 Argonne National Laboratory Presque Argonne National Laboratory DAOS 10 380 398.25 89.36 1,774.79 -
145 ISC19 National Computational Infrastructure Australia Raijin NetApp Lustre 15 180 40.92 12.89 129.89 -
146 ISC19 National Computational Infrastructure Australia Raijin NetApp Lustre 10 150 37.62 12.82 110.37 -
147 ISC20 LNCC SDumont Cray/HPE Lustre 10 240 3.06 2.32 4.02 -
148 ISC18 Fraunhofer Seislab ThinkparQ BeeGFS 24 24 16.96 5.13 56.14 -
149 ISC18 SNL Serrano IBM Spectrum Scale 16 160 4.25 0.65 27.98 -
150 SC18 KAUST Shaheen II Cray DataWarp 10 80 13.99 14.45 13.53 -
151 ISC18 KAUST ShaheenII Cray DataWarp 1,024 8,192 77.37 496.81 12.05 -
152 ISC18 KAUST ShaheenII Cray DataWarp 300 2,400 70.90 151.53 33.17 -
153 ISC18 KAUST ShaheenII Cray Lustre 1,000 16,000 41.00 54.17 31.03 -
154 SC20 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 48 1,248 134.09 48.03 374.40 -
155 SC20 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 40 1,040 123.47 43.63 349.44 -
156 SC20 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 10 320 57.99 20.80 161.71 -
157 SC19 Janelia Research Campus, HHMI SharedScratch WekaIO wekaio 18 1,368 48.75 26.22 90.62 -
158 SC19 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 10 1,440 36.55 18.58 71.90 -
159 ISC20 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 10 600 31.39 12.34 79.83 -
160 ISC20 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 10 600 30.83 13.35 71.22 -
161 SC19 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 10 400 24.70 16.32 37.37 -
162 ISC20 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 10 600 24.19 11.85 49.39 -
163 ISC20 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 10 600 21.32 11.27 40.32 -
164 SC19 Stanford Sherlock/Fir Dell, QCT, HGST Lustre 10 240 18.25 5.23 63.74 -
165 SC19 Stanford Sherlock/Oak Dell, QCT, HGST Lustre 10 240 9.31 1.89 45.87 -
166 SC19 Stanford Sherlock/Regal Dell Lustre 10 160 7.05 1.64 30.28 -
167 ISC18 IBM Sonasad IBM Spectrum Scale 10 10 24.24 4.57 128.61 -
168 ISC20 Oracle Cloud Infrastructure SpectrumScale on Oracle Cloud Oracle Cloud Infrastructure Spectrum Scale 48 768 70.91 34.91 144.06 -
169 ISC20 Oracle Cloud Infrastructure SpectrumScale on Oracle Cloud Oracle Cloud Infrastructure Spectrum Scale 48 768 66.69 31.78 139.96 -
170 SC18 Oak Ridge National Laboratory Summit IBM Spectrum Scale 504 1,008 330.56 88.20 1,238.93 -
171 SC18 Oak Ridge National Laboratory Summit IBM Spectrum Scale 10 160 44.30 9.84 199.48 -
172 SC19 State Key Laboratory of High-end Server & Storage Technology (HSS) TStor3000 INSPUR BeeGFS 10 300 82.57 41.14 165.71 -
173 ISC19 Inspur TStor3000 Inspur BeeGFS 10 400 14.67 4.48 48.06 -
174 SC19 CEA Tera-1000 DDN Lustre 128 4,096 210.26 81.01 545.74 -
175 SC20 University of Reading ThunderX2 Local file system 1 16 6.23 0.37 104.54 -
176 SC19 National Supercomputing Center in Changsha Tianhe-2E National University of Defense Technology Lustre 480 5,280 453.68 209.43 982.78 -
177 SC19 National Supercomputing Center in Changsha Tianhe-2E National University of Defense Technology Lustre 10 160 52.58 35.06 78.86 -
178 SC19 National Supercomputing Center in Changsha Tianhe-2E National University of Defense Technology Lustre 10 160 50.08 32.71 76.67 -
179 SC19 SUSE TigerShark SUSE, Intel, Lenovo SUSE Enterprise Storage - CephFS 15 120 14.91 5.74 38.72 -
180 SC19 SUSE TigerShark SUSE, Intel, Lenovo SUSE Enterprise Storage - CephFS 10 80 12.43 4.64 33.34 -
181 ISC19 SUSE TigerShark SUSE, Intel, Lenovo CephFS 14 98 8.38 3.58 19.60 -
182 ISC19 SUSE TigerShark SUSE, Intel, Lenovo SUSE Enterprise Storage 10 70 7.32 2.48 21.60 -
183 ISC19 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 96 3,072 17.18 8.15 36.22 -
184 SC19 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 88 2,816 12.50 8.31 18.79 -
185 ISC19 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 10 480 11.96 5.64 25.37 -
186 SC20 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 10 260 10.64 7.44 15.22 -
187 ISC20 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 10 620 9.87 6.36 15.32 -
188 SC19 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 10 480 8.45 6.16 11.60 -
189 SC19 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 10 820 8.01 6.31 10.17 -
190 SC19 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 10 820 8.01 6.31 10.17 -
191 SC19 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 10 480 7.77 5.39 11.18 -
192 ISC20 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 10 620 7.27 5.96 8.86 -
193 ISC20 Janelia Research Campus, HHMI VastTest Vast Data Scale-out NAS 10 620 6.20 3.51 10.95 -
194 ISC20 Janelia Research Campus, HHMI VastTest Vast Data Scale-out NAS 10 620 5.68 3.67 8.80 -
195 ISC20 Janelia Research Campus, HHMI VastTest Vast Data Scale-out NAS 10 620 4.98 3.56 6.96 -
196 ISC20 National Supercomputer Center in Guangzhou Venus2 National Supercomputer Center in Guangzhou kapok 10 260 148.08 30.62 716.11 -
197 ISC20 National Supercomputer Center in Guangzhou Venus2 National Supercomputer Center in Guangzhou kapok 10 260 146.42 28.99 739.49 -
198 ISC19 Janelia Research Campus, HHMI Weka WekaIO 10 3,200 66.43 27.74 159.12 -
199 SC19 WekaIO WekaIO WekaIO WekaIO Matrix 10 2,610 156.51 56.22 435.76 -
200 SC18 WekaIO WekaIO WekaIO 17 935 78.37 37.39 164.26 -
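
The Score column is the geometric mean of the bandwidth score BW (GiB/s) and the metadata score MD (kIOP/s). A minimal sketch in Python to verify a published score against its BW and MD values (the example uses entry #101 from the list above):

```python
import math

def io500_score(bw_gibs: float, md_kiops: float) -> float:
    """Composite IO500 score: geometric mean of bandwidth and metadata scores."""
    return math.sqrt(bw_gibs * md_kiops)

# Entry #101 (SC20, EPCC NextGENIO, GekkoFS):
# BW = 45.79 GiB/s, MD = 1,251.32 kIOP/s, published Score = 239.37.
print(round(io500_score(45.79, 1251.32), 2))  # -> 239.37
```

The same check reproduces the other rows to within rounding, e.g. entry #113: sqrt(21.42 × 91.40) ≈ 44.25 against a published score of 44.24.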