Historical ISC20 List

This is the historical data list released at ISC-HPC 2020. Since the IO500 benchmark has been improved to reduce the opportunity for caching, some results on the former lists cannot be directly compared with newer results. The list shows all submissions received up to that release.
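For reading the Score column: the overall IO500 score is the geometric mean of the bandwidth score (BW, in GiB/s) and the metadata score (MD, in kIOP/s). A minimal sketch of that relationship, checked against the first entry below (the helper name is illustrative, not part of the IO500 tooling):

    import math

    def io500_score(bw_gibs: float, md_kiops: float) -> float:
        """Overall score as the geometric mean of bandwidth (GiB/s) and metadata (kIOP/s)."""
        return math.sqrt(bw_gibs * md_kiops)

    # Entry 1 below reports BW = 13.55 GiB/s and MD = 455.18 kIOP/s.
    print(round(io500_score(13.55, 455.18), 2))  # 78.53, matching the listed score of 78.54 up to rounding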

#  BoF  Institution  System  Storage Vendor  File System Type  Client Nodes  Total Client Proc.  Score  BW (GiB/s)  MD (kIOP/s)  Repro.
1 SC19 iFLYTEK iFLYTEK Yanrong YRCloudFile 10 200 78.54 13.55 455.18 -
2 SC19 iFLYTEK iFLYTEK Yanrong YRCloudFile 10 200 63.98 11.41 358.87 -
3 ISC19 Yanrong YRCloudFile Yanrong YRCloudFile 10 160 19.14 4.22 86.86 -
4 SC19 WekaIO WekaIO on AWS WekaIO WekaIO Matrix 345 8,625 938.95 174.74 5,045.33 -
5 SC19 WekaIO WekaIO WekaIO WekaIO Matrix 10 2,610 156.51 56.22 435.76 -
6 SC18 WekaIO WekaIO WekaIO 17 935 78.37 37.39 164.26 -
7 ISC19 Janelia Research Campus, HHMI Weka WekaIO 10 3,200 66.43 27.74 159.12 -
8 SC18 WekaIO WekaIO WekaIO Matrix 10 700 58.25 27.05 125.43 -
9 SC19 Janelia Research Campus, HHMI SharedScratch WekaIO wekaio 18 1,368 48.75 26.22 90.62 -
10 SC19 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 10 1,440 36.55 18.58 71.90 -
11 SC19 Genomics England GELous WekaIO 10 1,400 34.38 15.17 77.89 -
12 SC19 Genomics England GELous WekaIO 10 1,400 34.38 15.17 77.89 -
13 ISC20 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 10 600 31.39 12.34 79.83 -
14 ISC20 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 10 600 30.83 13.35 71.22 -
15 SC19 Janelia Research Campus, HHMI sharedscratch WekaIO Parallel Filesystem 10 480 25.86 7.75 86.29 -
16 SC19 Janelia Research Campus, HHMI sharedscratch WekaIO Parallel Filesystem 10 480 25.46 7.65 84.69 -
17 SC19 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 10 400 24.70 16.32 37.37 -
18 ISC20 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 10 600 24.19 11.85 49.39 -
19 ISC20 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 10 600 21.32 11.27 40.32 -
20 SC19 Janelia Research Campus, HHMI sharedscratch WekaIO Parallel Filesystem 10 480 18.07 7.75 42.12 -
21 ISC19 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 96 3,072 17.18 8.15 36.22 -
22 SC19 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 88 2,816 12.50 8.31 18.79 -
23 ISC19 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 10 480 11.96 5.64 25.37 -
24 ISC20 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 10 620 9.87 6.36 15.32 -
25 SC19 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 10 480 8.45 6.16 11.60 -
26 SC19 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 10 820 8.01 6.31 10.17 -
27 SC19 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 10 820 8.01 6.31 10.17 -
28 SC19 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 10 480 7.77 5.39 11.18 -
29 ISC20 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 10 620 7.27 5.96 8.86 -
30 ISC20 Janelia Research Campus, HHMI VastTest Vast Data Scale-out NAS 10 620 6.20 3.51 10.95 -
31 ISC20 Janelia Research Campus, HHMI VastTest Vast Data Scale-out NAS 10 620 5.68 3.67 8.80 -
32 ISC20 Janelia Research Campus, HHMI VastTest Vast Data Scale-out NAS 10 620 4.98 3.56 6.96 -
33 ISC18 JSC JURON ThinkParQ BeeGFS 8 64 35.77 14.24 89.81 -
34 ISC18 Fraunhofer Seislab ThinkParQ BeeGFS 24 24 16.96 5.13 56.14 -
35 ISC18 DKRZ Mistral Seagate Lustre 100 1,000 32.15 22.77 45.39 -
36 ISC18 DKRZ Mistral Seagate Lustre 100 1,200 29.33 20.35 42.29 -
37 ISC18 DKRZ Mistral Seagate Lustre1 100 1,200 15.47 12.68 18.88 -
38 SC18 DKRZ Mistral Seagate Lustre2 10 80 5.32 1.05 27.06 -
39 SC18 EPCC Archer Seagate Lustre 10 80 3.70 0.77 17.84 -
40 ISC20 SUSE HammerHead SUSE/Ampere/Arm CephFS 10 220 15.61 7.34 33.22 -
41 ISC20 SUSE HammerHead SUSE/Ampere/Arm CephFS 10 220 15.38 7.04 33.57 -
42 ISC20 SUSE HammerHead SUSE/Ampere/Arm CephFS 10 200 15.07 7.05 32.21 -
43 ISC20 SUSE HammerHead SUSE/Ampere/Arm CephFS 10 200 14.55 6.95 30.46 -
44 SC19 SUSE TigerShark SUSE, Intel, Lenovo SUSE Enterprise Storage - CephFS 15 120 14.91 5.74 38.72 -
45 SC19 SUSE TigerShark SUSE, Intel, Lenovo SUSE Enterprise Storage - CephFS 10 80 12.43 4.64 33.34 -
46 ISC19 SUSE TigerShark SUSE, Intel, Lenovo CephFS 14 98 8.38 3.58 19.60 -
47 ISC19 SUSE TigerShark SUSE, Intel, Lenovo SUSE Enterprise Storage 10 70 7.32 2.48 21.60 -
48 ISC20 Officinalis Red Hat, Intel, QCT CephFS 8 256 66.88 28.58 156.48 -
49 ISC20 Officinalis Red Hat, Intel, QCT CephFS 8 256 59.42 28.44 124.13 -
50 ISC20 Officinalis Red Hat, Intel, QCT CephFS 8 256 56.28 27.67 114.50 -
51 ISC20 Officinalis Red Hat, Intel, QCT CephFS 8 256 53.44 28.10 101.63 -
52 ISC20 Officinalis (1 Client) Red Hat, Intel, QCT CephFS 1 112 27.43 8.81 85.48 -
53 ISC20 Officinalis (1 Client) Red Hat, Intel, QCT CephFS 1 112 21.24 7.87 57.32 -
54 ISC20 Red Hat EC2-10xi3en.metal Red Hat CephFS 10 320 57.17 26.29 124.30 -
55 ISC20 Red Hat EC2-10xi3en.metal Red Hat CephFS 10 320 55.13 25.67 118.42 -
56 ISC20 JINR Govorun RSC Group Lustre 50 800 90.87 35.61 231.88 -
57 ISC20 JINR Govorun RSC Group Lustre 50 800 88.51 36.28 215.97 -
58 ISC20 SPbPU Polytechnic RSC Tornado RSC Group Lustre 59 944 64.29 21.56 191.73 -
59 ISC20 SPbPU Polytechnic RSC Tornado RSC Group Lustre 59 944 64.11 21.00 195.66 -
60 ISC18 Joint Institute for Nuclear Research Govorun RSC Lustre 24 192 12.08 3.34 43.65 -
61 ISC19 Janelia Research Campus, HHMI nrs Qumulo Scale out NFS 10 320 5.95 2.59 13.69 -
62 ISC19 Janelia Research Campus, HHMI qflash Qumulo Scale out NFS 96 960 5.32 2.80 10.11 -
63 ISC19 Janelia Research Campus, HHMI qflash Qumulo Scale out NFS 10 120 4.46 2.12 9.35 -
64 SC19 Janelia Research Campus, HHMI nrs Qumulo Scale-out NAS 10 760 4.19 2.88 6.11 -
65 SC19 Janelia Research Campus, HHMI nearline4 Qumulo Scale-out NAS 10 760 3.67 2.45 5.49 -
66 ISC18 STFC Jasmin/Lotus Purestorage NFS 64 128 2.33 0.26 20.93 -
67 ISC18 STFC Jasmin/Lotus Purestorage NFS 16 64 1.98 0.21 18.77 -
68 SC19 Penguin Computing Benchmarking and Innovation Lab Penguin-ASG-NVBeeOne Penguin Computing/Excelero BeeGFS 10 320 39.65 17.71 88.76 -
69 SC19 Oracle Cloud Infrastructure Oracle Cloud Infrastructure with Block Volume Service running Spectrum Scale Oracle Cloud Infrastructure Block Volumes Service Spectrum Scale 30 480 47.55 20.38 110.93 -
70 ISC20 Oracle Cloud Infrastructure BeeGFS on Oracle Cloud Oracle Cloud Infrastructure BeeGFS 270 3,240 267.25 293.05 243.73 -
71 ISC20 Oracle Cloud Infrastructure BeeGFS on Oracle Cloud Oracle Cloud Infrastructure BeeGFS 170 2,040 191.10 175.24 208.39 -
72 ISC20 Oracle Cloud Infrastructure BeeGFS on Oracle Cloud Oracle Cloud Infrastructure BeeGFS 270 3,240 104.30 47.44 229.33 -
73 ISC20 Oracle Cloud Infrastructure BeeGFS on Oracle Cloud Oracle Cloud Infrastructure BeeGFS 170 2,040 91.48 42.65 196.20 -
74 ISC20 Oracle Cloud Infrastructure SpectrumScale on Oracle Cloud Oracle Cloud Infrastructure Spectrum Scale 48 768 70.91 34.91 144.06 -
75 ISC20 Oracle Cloud Infrastructure SpectrumScale on Oracle Cloud Oracle Cloud Infrastructure Spectrum Scale 48 768 66.69 31.78 139.96 -
76 ISC20 Oracle Cloud Infrastructure BeeGFS on Oracle Cloud Oracle Cloud Infrastructure BeeGFS 10 160 32.79 14.02 76.67 -
77 ISC20 Oracle Cloud Infrastructure BeeGFS on Oracle Cloud Oracle Cloud Infrastructure BeeGFS 10 160 27.56 10.38 73.20 -
78 ISC20 EPCC NextGENIO NextGenIO (BSC & JGU) GekkoFS 10 280 152.30 24.83 934.12 -
79 ISC20 EPCC NextGENIO NextGenIO (BSC & JGU) GekkoFS 10 280 149.92 24.91 902.12 -
80 SC19 EPCC NextGENIO NextGenIO (BSC & JGU) GekkoFS 10 240 125.08 21.47 728.68 -
81 ISC19 National Computational Infrastructure Australia Raijin NetApp Lustre 15 180 40.92 12.89 129.89 -
82 ISC19 National Computational Infrastructure Australia Raijin NetApp Lustre 10 150 37.62 12.82 110.37 -
83 SC19 National Supercomputing Center in Changsha Tianhe-2E National University of Defense Technology Lustre 480 5,280 453.68 209.43 982.78 -
84 SC19 National Supercomputing Center in Changsha Tianhe-2E National University of Defense Technology Lustre 10 160 52.58 35.06 78.86 -
85 SC19 National Supercomputing Center in Changsha Tianhe-2E National University of Defense Technology Lustre 10 160 50.08 32.71 76.67 -
86 ISC20 National Supercomputer Center in Guangzhou Venus2 National Supercomputer Center in Guangzhou kapok 10 260 148.08 30.62 716.11 -
87 ISC20 National Supercomputer Center in Guangzhou Venus2 National Supercomputer Center in Guangzhou kapok 10 260 146.42 28.99 739.49 -
88 ISC20 Johannes Gutenberg University Mainz MOGON II JGU (ADA-FS)& BSC (NEXTGenIO) GekkoFS 10 160 96.46 11.61 801.60 -
89 ISC20 Johannes Gutenberg University Mainz MOGON II JGU (ADA-FS)& BSC (NEXTGenIO) GekkoFS 10 160 92.91 10.96 787.71 -
90 ISC20 Intel Wolf Intel DAOS 52 1,664 1,792.98 371.67 8,649.57 -
91 ISC20 Intel Wolf Intel DAOS 52 1,664 1,614.02 368.44 7,070.51 -
92 SC19 Intel Wolf Intel DAOS 26 728 933.64 183.36 4,753.79 -
93 ISC20 TACC Frontera Intel DAOS 60 1,440 763.80 78.31 7,449.56 -
94 ISC20 Intel Wolf Intel DAOS 10 420 758.71 164.77 3,493.56 -
95 ISC20 TACC Frontera Intel DAOS 60 1,440 756.01 75.40 7,580.23 -
96 ISC20 Intel Wolf Intel DAOS 10 420 736.33 163.19 3,322.40 -
97 SC19 Intel Wolf Intel DAOS 10 310 516.41 123.89 2,152.46 -
98 ISC20 TACC Frontera Intel DAOS 10 420 508.88 79.16 3,271.49 -
99 ISC20 TACC Frontera Intel DAOS 10 420 507.13 78.47 3,277.36 -
100 ISC19 Inspur Inspur BeeGFS 10 400 20.17 8.61 47.26 -