
Historical ISC21 List

| # | BoF | Institution | System | Storage Vendor | File System Type | Client Nodes | Total Client Proc. | Score | BW (GiB/s) | MD (kIOP/s) | Repro. |
|---|-----|-------------|--------|----------------|------------------|--------------|--------------------|-------|------------|-------------|--------|
| 201 | ISC20 | Oracle Cloud Infrastructure | BeeGFS on Oracle Cloud | Oracle Cloud Infrastructure | BeeGFS | 10 | 160 | 32.79 | 14.02 | 76.67 | - |
| 202 | ISC20 | Oracle Cloud Infrastructure | BeeGFS on Oracle Cloud | Oracle Cloud Infrastructure | BeeGFS | 10 | 160 | 27.56 | 10.38 | 73.20 | - |
| 203 | SC19 | Oracle Cloud Infrastructure | Oracle Cloud Infrastructure with Block Volume Service running Spectrum Scale | Oracle Cloud Infrastructure Block Volumes Service | Spectrum Scale | 30 | 480 | 47.55 | 20.38 | 110.93 | - |
| 204 | SC19 | Oracle Cloud Infrastructure | Oracle Cloud Infrastructure with Block Volume Service running Spectrum Scale | Oracle Cloud Infrastructure Block Volumes Service | Spectrum Scale | 30 | 480 | 47.55 | 20.38 | 110.93 | - |
| 205 | ISC21 | Pengcheng Laboratory | Pengcheng Cloudbrain-II on Atlas 900 | Pengcheng | MadFS | 512 | 36,864 | 36,850.37 | 3,421.62 | 396,872.82 | - |
| 206 | SC20 | Pengcheng Laboratory | Pengcheng Cloudbrain-II on Atlas 900 | Pengcheng Laboratory | MadFS | 255 | 18,360 | 7,043.99 | 1,475.75 | 33,622.19 | - |
| 207 | SC20 | Pengcheng Laboratory | Pengcheng Cloudbrain-II on Atlas 900 | Pengcheng Laboratory | MadFS | 256 | 18,432 | 6,831.55 | 1,524.11 | 30,621.10 | - |
| 208 | SC20 | Pengcheng Laboratory | Pengcheng Cloudbrain-II on Atlas 900 | Pengcheng Laboratory | MadFS | 10 | 1,440 | 1,129.75 | 168.43 | 7,578.06 | - |
| 209 | SC20 | Raytheon | WekaIO | Penguin Computing | wekafs | 42 | 1,596 | 50.58 | 16.39 | 156.10 | - |
| 210 | SC20 | Raytheon | WekaIO | Penguin Computing | wekafs | 42 | 1,596 | 50.58 | 16.39 | 156.10 | - |
| 211 | SC20 | Raytheon | WekaIO | Penguin Computing | wekafs | 42 | 1,596 | 50.58 | 16.39 | 156.10 | - |
| 212 | SC20 | Raytheon | WekaIO | Penguin Computing | wekafs | 10 | 440 | 32.20 | 14.83 | 69.88 | - |
| 213 | SC19 | Penguin Computing Benchmarking and Innovation Lab | Penguin-ASG-NVBeeOne | Penguin Computing/Excelero | BeeGFS | 10 | 320 | 39.65 | 17.71 | 88.76 | - |
| 214 | SC19 | Penguin Computing Benchmarking and Innovation Lab | Penguin-ASG-NVBeeOne | Penguin Computing/Excelero | BeeGFS | 10 | 320 | 39.65 | 17.71 | 88.76 | - |
| 215 | ISC18 | STFC | Jasmin/Lotus | Purestorage | NFS | 64 | 128 | 2.33 | 0.26 | 20.93 | - |
| 216 | ISC18 | STFC | Jasmin/Lotus | Purestorage | NFS | 16 | 64 | 1.98 | 0.21 | 18.77 | - |
| 217 | ISC19 | Janelia Research Campus, HHMI | nrs | Qumulo | Scale out NFS | 10 | 320 | 5.95 | 2.59 | 13.69 | - |
| 218 | ISC19 | Janelia Research Campus, HHMI | qflash | Qumulo | Scale out NFS | 96 | 960 | 5.32 | 2.80 | 10.11 | - |
| 219 | ISC19 | Janelia Research Campus, HHMI | qflash | Qumulo | Scale out NFS | 10 | 120 | 4.46 | 2.12 | 9.35 | - |
| 220 | SC19 | Janelia Research Campus, HHMI | nrs | Qumulo | Scale-out NAS | 10 | 760 | 4.19 | 2.88 | 6.11 | - |
| 221 | SC19 | Janelia Research Campus, HHMI | nrs | Qumulo | Scale-out NAS | 10 | 760 | 4.19 | 2.88 | 6.11 | - |
| 222 | SC19 | Janelia Research Campus, HHMI | nearline4 | Qumulo | Scale-out NAS | 10 | 760 | 3.67 | 2.45 | 5.49 | - |
| 223 | SC20 | The Science and Technology Facilities Council | JASMIN 4 | Quobyte Inc. | Quobyte | 48 | 336 | 17.55 | 19.76 | 15.60 | - |
| 224 | SC20 | The Science and Technology Facilities Council | JASMIN 4 | Quobyte Inc. | Quobyte | 48 | 336 | 17.55 | 19.76 | 15.60 | - |
| 225 | SC20 | The Science and Technology Facilities Council | JASMIN 4 | Quobyte Inc. | Quobyte | 10 | 240 | 11.34 | 12.34 | 10.43 | - |
| 226 | ISC21 | Joint Institute for Nuclear Research | Govorun | RSC | DAOS | 10 | 160 | 132.06 | 20.19 | 863.69 | - |
| 227 | ISC18 | Joint Institute for Nuclear Research | Govorun | RSC | Lustre | 24 | 192 | 12.08 | 3.34 | 43.65 | - |
| 228 | SC20 | Joint Supercomputer Center of the Russian Academy of Sciences | MVS10POP2 | RSC Group | Lustre | 54 | 864 | 125.50 | 45.31 | 347.61 | - |
| 229 | SC20 | Joint Supercomputer Center of the Russian Academy of Sciences | MVS10POP2 | RSC Group | Lustre | 54 | 864 | 125.50 | 45.31 | 347.61 | - |
| 230 | ISC20 | JINR | Govorun | RSC Group | Lustre | 50 | 800 | 90.87 | 35.61 | 231.88 | - |
| 231 | ISC20 | JINR | Govorun | RSC Group | Lustre | 50 | 800 | 90.87 | 35.61 | 231.88 | - |
| 232 | ISC20 | JINR | Govorun | RSC Group | Lustre | 50 | 800 | 88.51 | 36.28 | 215.97 | - |
| 233 | ISC20 | SPbPU | Polytechnic RSC Tornado | RSC Group | Lustre | 59 | 944 | 64.29 | 21.56 | 191.73 | - |
| 234 | ISC20 | SPbPU | Polytechnic RSC Tornado | RSC Group | Lustre | 59 | 944 | 64.29 | 21.56 | 191.73 | - |
| 235 | ISC20 | SPbPU | Polytechnic RSC Tornado | RSC Group | Lustre | 59 | 944 | 64.11 | 21.00 | 195.66 | - |
| 236 | ISC20 | Red Hat | EC2-10xi3en.metal | Red Hat | CephFS | 10 | 320 | 57.17 | 26.29 | 124.30 | - |
| 237 | ISC20 | Red Hat | EC2-10xi3en.metal | Red Hat | CephFS | 10 | 320 | 57.17 | 26.29 | 124.30 | - |
| 238 | ISC20 | Red Hat | EC2-10xi3en.metal | Red Hat | CephFS | 10 | 320 | 55.13 | 25.67 | 118.42 | - |
| 239 | ISC20 | | Officinalis | Red Hat, Intel, QCT | CephFS | 8 | 256 | 66.88 | 28.58 | 156.48 | - |
| 240 | ISC20 | | Officinalis | Red Hat, Intel, QCT | CephFS | 8 | 256 | 59.42 | 28.44 | 124.13 | - |
| 241 | ISC20 | | Officinalis | Red Hat, Intel, QCT | CephFS | 8 | 256 | 56.28 | 27.67 | 114.50 | - |
| 242 | ISC20 | | Officinalis | Red Hat, Intel, QCT | CephFS | 8 | 256 | 53.44 | 28.10 | 101.63 | - |
| 243 | ISC20 | | Officinalis (1 Client) | Red Hat, Intel, QCT | CephFS | 1 | 112 | 27.43 | 8.81 | 85.48 | - |
| 244 | ISC20 | | Officinalis (1 Client) | Red Hat, Intel, QCT | CephFS | 1 | 112 | 21.24 | 7.87 | 57.32 | - |
| 245 | SC19 | SUSE | TigerShark | SUSE, Intel, Lenovo | SUSE Enterprise Storage - CephFS | 15 | 120 | 14.91 | 5.74 | 38.72 | - |
| 246 | SC19 | SUSE | TigerShark | SUSE, Intel, Lenovo | SUSE Enterprise Storage - CephFS | 15 | 120 | 14.91 | 5.74 | 38.72 | - |
| 247 | SC19 | SUSE | TigerShark | SUSE, Intel, Lenovo | SUSE Enterprise Storage - CephFS | 10 | 80 | 12.43 | 4.64 | 33.34 | - |
| 248 | ISC19 | SUSE | TigerShark | SUSE, Intel, Lenovo | CephFS | 14 | 98 | 8.38 | 3.58 | 19.60 | - |
| 249 | ISC19 | SUSE | TigerShark | SUSE, Intel, Lenovo | SUSE Enterprise Storage | 10 | 70 | 7.32 | 2.48 | 21.60 | - |
| 250 | ISC20 | SUSE | HammerHead | SUSE/Ampere/Arm | CephFS | 10 | 220 | 15.61 | 7.34 | 33.22 | - |
| 251 | ISC20 | SUSE | HammerHead | SUSE/Ampere/Arm | CephFS | 10 | 220 | 15.38 | 7.04 | 33.57 | - |
| 252 | ISC20 | SUSE | HammerHead | SUSE/Ampere/Arm | CephFS | 10 | 200 | 15.07 | 7.05 | 32.21 | - |
| 253 | ISC20 | SUSE | HammerHead | SUSE/Ampere/Arm | CephFS | 10 | 200 | 14.55 | 6.95 | 30.46 | - |
| 254 | ISC18 | DKRZ | Mistral | Seagate | Lustre | 100 | 1,000 | 32.15 | 22.77 | 45.39 | - |
| 255 | ISC18 | DKRZ | Mistral | Seagate | Lustre | 100 | 1,200 | 29.33 | 20.35 | 42.29 | - |
| 256 | ISC18 | DKRZ | Mistral | Seagate | Lustre1 | 100 | 1,200 | 15.47 | 12.68 | 18.88 | - |
| 257 | SC18 | DKRZ | Mistral | Seagate | Lustre2 | 10 | 80 | 5.32 | 1.05 | 27.06 | - |
| 258 | SC18 | EPCC | Archer | Seagate | Lustre | 10 | 80 | 3.70 | 0.77 | 17.84 | - |
| 259 | ISC21 | DKRZ | Mistral | Seagate/Cray | Lustre | 10 | 80 | 5.22 | 1.16 | 23.50 | - |
| 260 | ISC21 | Supermicro | | Supermicro | DAOS | 10 | 1,120 | 415.04 | 112.17 | 1,535.63 | - |
| 261 | ISC18 | JSC | JURON | ThinkparQ | BeeGFS | 8 | 64 | 35.77 | 14.24 | 89.81 | - |
| 262 | ISC18 | Fraunhofer | Seislab | ThinkparQ | BeeGFS | 24 | 24 | 16.96 | 5.13 | 56.14 | - |
| 263 | ISC21 | Howard Hughes Medical Institute Janelia Research Campus | | Vast | nrsv | 10 | 300 | 15.50 | 7.54 | 31.84 | - |
| 264 | ISC19 | Janelia Research Campus, HHMI | Vast | Vast Data | Scale-out NAS | 96 | 3,072 | 17.18 | 8.15 | 36.22 | - |
| 265 | SC19 | Janelia Research Campus, HHMI | Vast | Vast Data | Scale-out NAS | 88 | 2,816 | 12.50 | 8.31 | 18.79 | - |
| 266 | SC19 | Janelia Research Campus, HHMI | Vast | Vast Data | Scale-out NAS | 88 | 2,816 | 12.50 | 8.31 | 18.79 | - |
| 267 | ISC19 | Janelia Research Campus, HHMI | Vast | Vast Data | Scale-out NAS | 10 | 480 | 11.96 | 5.64 | 25.37 | - |
| 268 | SC20 | Janelia Research Campus, HHMI | Vast | Vast Data | Scale-out NAS | 10 | 260 | 10.64 | 7.44 | 15.22 | - |
| 269 | ISC20 | Janelia Research Campus, HHMI | Vast | Vast Data | Scale-out NAS | 10 | 620 | 9.87 | 6.36 | 15.32 | - |
| 270 | SC19 | Janelia Research Campus, HHMI | Vast | Vast Data | Scale-out NAS | 10 | 480 | 8.45 | 6.16 | 11.60 | - |
| 271 | SC19 | Janelia Research Campus, HHMI | Vast | Vast Data | Scale-out NAS | 10 | 820 | 8.01 | 6.31 | 10.17 | - |
| 272 | SC19 | Janelia Research Campus, HHMI | Vast | Vast Data | Scale-out NAS | 10 | 820 | 8.01 | 6.31 | 10.17 | - |
| 273 | SC19 | Janelia Research Campus, HHMI | Vast | Vast Data | Scale-out NAS | 10 | 480 | 7.77 | 5.39 | 11.18 | - |
| 274 | ISC20 | Janelia Research Campus, HHMI | Vast | Vast Data | Scale-out NAS | 10 | 620 | 7.27 | 5.96 | 8.86 | - |
| 275 | ISC20 | Janelia Research Campus, HHMI | VastTest | Vast Data | Scale-out NAS | 10 | 620 | 6.20 | 3.51 | 10.95 | - |
| 276 | ISC20 | Janelia Research Campus, HHMI | VastTest | Vast Data | Scale-out NAS | 10 | 620 | 5.68 | 3.67 | 8.80 | - |
| 277 | ISC20 | Janelia Research Campus, HHMI | VastTest | Vast Data | Scale-out NAS | 10 | 620 | 4.98 | 3.56 | 6.96 | - |
| 278 | ISC21 | Howard Hughes Medical Institute Janelia Research Campus | Janelia | Weka.io | Weka.io | 10 | 300 | 62.68 | 18.17 | 216.22 | - |
| 279 | SC19 | WekaIO | WekaIO on AWS | WekaIO | WekaIO Matrix | 345 | 8,625 | 938.95 | 174.74 | 5,045.33 | - |
| 280 | SC19 | WekaIO | WekaIO on AWS | WekaIO | WekaIO Matrix | 345 | 8,625 | 938.95 | 174.74 | 5,045.33 | - |
| 281 | SC19 | WekaIO | WekaIO | WekaIO | WekaIO Matrix | 10 | 2,610 | 156.51 | 56.22 | 435.76 | - |
| 282 | SC20 | Janelia Research Campus, HHMI | SharedScratch | WekaIO | Parallel Filesystem | 48 | 1,248 | 134.09 | 48.03 | 374.40 | - |
| 283 | SC20 | Janelia Research Campus, HHMI | SharedScratch | WekaIO | Parallel Filesystem | 48 | 1,248 | 134.09 | 48.03 | 374.40 | - |
| 284 | SC20 | Janelia Research Campus, HHMI | SharedScratch | WekaIO | Parallel Filesystem | 40 | 1,040 | 123.47 | 43.63 | 349.44 | - |
| 285 | SC18 | WekaIO | WekaIO | WekaIO | | 17 | 935 | 78.37 | 37.39 | 164.26 | - |
| 286 | ISC19 | Janelia Research Campus, HHMI | Weka | WekaIO | | 10 | 3,200 | 66.43 | 27.74 | 159.12 | - |
| 287 | SC18 | WekaIO | WekaIO | | WekaIO Matrix | 10 | 700 | 58.25 | 27.05 | 125.43 | - |
| 288 | SC20 | Janelia Research Campus, HHMI | SharedScratch | WekaIO | Parallel Filesystem | 10 | 320 | 57.99 | 20.80 | 161.71 | - |
| 289 | SC19 | Janelia Research Campus, HHMI | SharedScratch | WekaIO | wekaio | 18 | 1,368 | 48.75 | 26.22 | 90.62 | - |
| 290 | SC19 | Janelia Research Campus, HHMI | SharedScratch | WekaIO | Parallel Filesystem | 10 | 1,440 | 36.55 | 18.58 | 71.90 | - |
| 291 | SC19 | Genomics England | GELous | WekaIO | | 10 | 1,400 | 34.38 | 15.17 | 77.89 | - |
| 292 | SC19 | Genomics England | GELous | WekaIO | | 10 | 1,400 | 34.38 | 15.17 | 77.89 | - |
| 293 | ISC20 | Janelia Research Campus, HHMI | SharedScratch | WekaIO | Parallel Filesystem | 10 | 600 | 31.39 | 12.34 | 79.83 | - |
| 294 | ISC20 | Janelia Research Campus, HHMI | SharedScratch | WekaIO | Parallel Filesystem | 10 | 600 | 30.83 | 13.35 | 71.22 | - |
| 295 | SC19 | Janelia Research Campus, HHMI | sharedscratch | WekaIO | Parallel Filesystem | 10 | 480 | 25.86 | 7.75 | 86.29 | - |
| 296 | SC19 | Janelia Research Campus, HHMI | sharedscratch | WekaIO | Parallel Filesystem | 10 | 480 | 25.46 | 7.65 | 84.69 | - |
| 297 | SC19 | Janelia Research Campus, HHMI | SharedScratch | WekaIO | Parallel Filesystem | 10 | 400 | 24.70 | 16.32 | 37.37 | - |
| 298 | ISC20 | Janelia Research Campus, HHMI | SharedScratch | WekaIO | Parallel Filesystem | 10 | 600 | 24.19 | 11.85 | 49.39 | - |
| 299 | ISC20 | Janelia Research Campus, HHMI | SharedScratch | WekaIO | Parallel Filesystem | 10 | 600 | 21.32 | 11.27 | 40.32 | - |
| 300 | SC19 | Janelia Research Campus, HHMI | sharedscratch | WekaIO | Parallel Filesystem | 10 | 480 | 18.07 | 7.75 | 42.12 | - |
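The Score column appears to be the geometric mean of the BW (GiB/s) and MD (kIOP/s) columns, which matches the published IO500 scoring rule. A minimal sketch in Python, spot-checked against two rows above:

```python
import math

def io500_score(bw_gibs: float, md_kiops: float) -> float:
    """IO500 score as the geometric mean of the bandwidth and metadata scores."""
    return math.sqrt(bw_gibs * md_kiops)

# Spot-check against list entries (rank 201 and rank 226).
assert round(io500_score(14.02, 76.67), 1) == 32.8    # listed Score: 32.79
assert round(io500_score(20.19, 863.69), 1) == 132.1  # listed Score: 132.06
```

Small rounding differences against the listed Score values are expected, since the list reports only two decimals for each sub-score.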