Historical ISC22 List

This is the historical IO500 list from ISC22.
# BoF Institution System Storage Vendor File System Type Client Nodes Total Client Proc. Score BW (GiB/s) MD (kIOP/s) Repro.
201 ISC20 Oracle Cloud Infrastructure SpectrumScale on Oracle Cloud Oracle Cloud Infrastructure Spectrum Scale 48 768 70.91 34.91 144.06 -
202 ISC20 Oracle Cloud Infrastructure SpectrumScale on Oracle Cloud Oracle Cloud Infrastructure Spectrum Scale 48 768 66.69 31.78 139.96 -
203 ISC20 Oracle Cloud Infrastructure BeeGFS on Oracle Cloud Oracle Cloud Infrastructure BeeGFS 10 160 32.79 14.02 76.67 -
204 ISC20 Oracle Cloud Infrastructure BeeGFS on Oracle Cloud Oracle Cloud Infrastructure BeeGFS 10 160 27.56 10.38 73.20 -
205 SC19 Oracle Cloud Infrastructure Oracle Cloud Infrastructure with Block Volume Service running Spectrum Scale Oracle Cloud Infrastructure Block Volumes Service Spectrum Scale 30 480 47.55 20.38 110.93 -
206 ISC22 National Supercomputing Center in Jinan Shanhe PDSL flashfs 10 2,560 3,534.42 207.79 60,119.50 -
207 SC21 Huawei Cloud PDSL Flashfs 15 1,560 2,016.70 109.82 37,034.00 -
208 ISC21 Pengcheng Laboratory Pengcheng Cloudbrain-II on Atlas 900 Pengcheng MadFS 512 36,864 36,850.40 3,421.62 396,872.82 -
209 SC20 Pengcheng Laboratory Pengcheng Cloudbrain-II on Atlas 900 Pengcheng Laboratory MadFS 255 18,360 7,043.99 1,475.75 33,622.19 -
210 SC20 Pengcheng Laboratory Pengcheng Cloudbrain-II on Atlas 900 Pengcheng Laboratory MadFS 256 18,432 6,831.55 1,524.11 30,621.10 -
211 SC20 Pengcheng Laboratory Pengcheng Cloudbrain-II on Atlas 900 Pengcheng Laboratory MadFS 10 1,440 1,129.75 168.43 7,578.06 -
212 SC21 Raytheon MCK-HPC Penguin Computing WekaFS 44 1,584 86.24 27.64 269.10 -
213 SC21 Raytheon MCK-HPC Penguin Computing WekaFS 10 360 62.55 25.96 150.70 -
214 SC20 Raytheon WekaIO Penguin Computing wekafs 42 1,596 50.58 16.39 156.10 -
215 SC20 Raytheon WekaIO Penguin Computing wekafs 10 440 32.20 14.83 69.88 -
216 SC19 Penguin Computing Benchmarking and Innovation Lab Penguin-ASG-NVBeeOne Penguin Computing/Excelero BeeGFS 10 320 39.65 17.71 88.76 -
217 ISC18 STFC Jasmin/Lotus Purestorage NFS 64 128 2.33 0.26 20.93 -
218 ISC18 STFC Jasmin/Lotus Purestorage NFS 16 64 1.98 0.21 18.77 -
219 SC21 QCT DevCloud QCT DevCloud QCT DAOS 10 560 427.27 102.85 1,774.97 -
220 ISC19 Janelia Research Campus, HHMI nrs Qumulo Scale out NFS 10 320 5.95 2.59 13.69 -
221 ISC19 Janelia Research Campus, HHMI qflash Qumulo Scale out NFS 96 960 5.32 2.80 10.11 -
222 ISC19 Janelia Research Campus, HHMI qflash Qumulo Scale out NFS 10 120 4.46 2.12 9.35 -
223 SC19 Janelia Research Campus, HHMI nrs Qumulo Scale-out NAS 10 760 4.19 2.88 6.11 -
224 SC19 Janelia Research Campus, HHMI nearline4 Qumulo Scale-out NAS 10 760 3.67 2.45 5.49 -
225 SC20 The Science and Technology Facilities Council JASMIN 4 Quobyte Inc. Quobyte 48 336 17.55 19.76 15.60 -
226 SC20 The Science and Technology Facilities Council JASMIN 4 Quobyte Inc. Quobyte 10 240 11.34 12.34 10.43 -
227 ISC21 Joint Institute of Nuclear Research Govorun RSC DAOS 10 160 132.07 20.19 863.69 -
228 ISC18 Joint Institute for Nuclear Research Govorun RSC Lustre 24 192 12.08 3.34 43.65 -
229 SC20 Joint Supercomputer Center of the Russian Academy of Sciences MVS10POP2 RSC Group Lustre 54 864 125.50 45.31 347.61 -
230 ISC20 JINR Govorun RSC Group Lustre 50 800 90.87 35.61 231.88 -
231 ISC20 JINR Govorun RSC Group Lustre 50 800 88.51 36.28 215.97 -
232 ISC20 SPbPU Polytechnic RSC Tornado RSC Group Lustre 59 944 64.29 21.56 191.73 -
233 ISC20 SPbPU Polytechnic RSC Tornado RSC Group Lustre 59 944 64.11 21.00 195.66 -
234 ISC20 Red Hat EC2-10xi3en.metal Red Hat CephFS 10 320 57.17 26.29 124.30 -
235 ISC20 Red Hat EC2-10xi3en.metal Red Hat CephFS 10 320 55.13 25.67 118.42 -
236 ISC20 Officinalis Red Hat, Intel, QCT CephFS 8 256 66.88 28.58 156.48 -
237 ISC20 Officinalis Red Hat, Intel, QCT CephFS 8 256 59.42 28.44 124.13 -
238 ISC20 Officinalis Red Hat, Intel, QCT CephFS 8 256 56.28 27.67 114.50 -
239 ISC20 Officinalis Red Hat, Intel, QCT CephFS 8 256 53.44 28.10 101.63 -
240 ISC20 Officinalis (1 Client) Red Hat, Intel, QCT CephFS 1 112 27.43 8.81 85.48 -
241 ISC20 Officinalis (1 Client) Red Hat, Intel, QCT CephFS 1 112 21.24 7.87 57.32 -
242 SC19 SUSE TigerShark SUSE, Intel, Lenovo SUSE Enterprise Storage - CephFS 15 120 14.91 5.74 38.72 -
243 SC19 SUSE TigerShark SUSE, Intel, Lenovo SUSE Enterprise Storage - CephFS 10 80 12.43 4.64 33.34 -
244 ISC19 SUSE TigerShark SUSE, Intel, Lenovo CephFS 14 98 8.38 3.58 19.60 -
245 ISC19 SUSE TigerShark SUSE, Intel, Lenovo SUSE Enterprise Storage 10 70 7.32 2.48 21.60 -
246 ISC20 SUSE HammerHead SUSE/Ampere/Arm CephFS 10 220 15.61 7.34 33.22 -
247 ISC20 SUSE HammerHead SUSE/Ampere/Arm CephFS 10 220 15.38 7.04 33.57 -
248 ISC20 SUSE HammerHead SUSE/Ampere/Arm CephFS 10 200 15.07 7.05 32.21 -
249 ISC20 SUSE HammerHead SUSE/Ampere/Arm CephFS 10 200 14.55 6.95 30.46 -
250 ISC18 DKRZ Mistral Seagate Lustre 100 1,000 32.15 22.77 45.39 -
251 ISC18 DKRZ Mistral Seagate Lustre 100 1,200 29.33 20.35 42.29 -
252 ISC18 DKRZ Mistral Seagate Lustre 100 1,200 15.47 12.68 18.88 -
253 SC18 DKRZ Mistral Seagate Lustre 10 80 5.32 1.05 27.06 -
254 SC18 EPCC Archer Seagate Lustre 10 80 3.70 0.77 17.84 -
255 ISC21 DKRZ Mistral Seagate/Cray Lustre 10 80 5.22 1.16 23.50 -
256 ISC21 Supermicro Supermicro DAOS 10 1,120 415.04 112.17 1,535.63 -
257 ISC18 JSC JURON ThinkparQ BeeGFS 8 64 35.77 14.24 89.81 -
258 ISC18 Fraunhofer Seislab ThinkparQ BeeGFS 24 24 16.96 5.13 56.14 -
259 ISC21 Howard Hughes Medical Institute Janelia Research Campus Vast nrsv 10 300 15.50 7.54 31.84 -
260 SC21 HHMIJRC Janelia Compute Cluster Vast Data Vast 80 2,880 21.72 14.11 33.42 -
261 ISC19 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 96 3,072 17.18 8.15 36.22 -
262 SC19 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 88 2,816 12.50 8.31 18.79 -
263 ISC19 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 10 480 11.96 5.64 25.37 -
264 SC20 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 10 260 10.64 7.44 15.22 -
265 ISC20 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 10 620 9.87 6.36 15.32 -
266 SC19 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 10 480 8.45 6.16 11.60 -
267 SC19 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 10 820 8.01 6.31 10.17 -
268 SC19 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 10 480 7.77 5.39 11.18 -
269 ISC20 Janelia Research Campus, HHMI Vast Vast Data Scale-out NAS 10 620 7.27 5.96 8.86 -
270 ISC20 Janelia Research Campus, HHMI VastTest Vast Data Scale-out NAS 10 620 6.20 3.51 10.95 -
271 ISC20 Janelia Research Campus, HHMI VastTest Vast Data Scale-out NAS 10 620 5.68 3.67 8.80 -
272 ISC20 Janelia Research Campus, HHMI VastTest Vast Data Scale-out NAS 10 620 4.98 3.56 6.96 -
273 ISC21 Howard Hughes Medical Institute Janelia Research Campus Janelia Weka.io Weka.io 10 300 62.68 18.17 216.22 -
274 SC19 WekaIO WekaIO on AWS WekaIO WekaIO Matrix 345 8,625 938.95 174.74 5,045.33 -
275 ISC22 Oracle Cloud Infrastructure Oracle Cloud with WEKA on RDMA WekaIO WEKA 373 7,460 625.95 233.17 1,680.38 -
276 ISC22 Oracle Cloud Infrastructure Oracle Cloud running WEKA on non-RDMA network WekaIO WEKA 151 6,040 323.52 97.61 1,072.32 -
277 SC19 WekaIO WekaIO WekaIO WekaIO Matrix 10 2,610 156.51 56.22 435.76 -
278 SC20 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 48 1,248 134.09 48.03 374.40 -
279 SC20 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 40 1,040 123.47 43.63 349.44 -
280 SC18 WekaIO WekaIO WekaIO 17 935 78.37 37.39 164.26 -
281 ISC19 Janelia Research Campus, HHMI Weka WekaIO 10 3,200 66.43 27.74 159.12 -
282 SC18 WekaIO WekaIO WekaIO Matrix 10 700 58.25 27.05 125.43 -
283 SC20 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 10 320 57.99 20.80 161.71 -
284 SC19 Janelia Research Campus, HHMI SharedScratch WekaIO wekaio 18 1,368 48.75 26.22 90.62 -
285 SC19 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 10 1,440 36.55 18.58 71.90 -
286 SC19 Genomics England GELous WekaIO 10 1,400 34.38 15.17 77.89 -
287 ISC20 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 10 600 31.39 12.34 79.83 -
288 ISC20 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 10 600 30.83 13.35 71.22 -
289 SC19 Janelia Research Campus, HHMI sharedscratch WekaIO Parallel Filesystem 10 480 25.86 7.75 86.29 -
290 SC19 Janelia Research Campus, HHMI sharedscratch WekaIO Parallel Filesystem 10 480 25.46 7.65 84.69 -
291 SC19 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 10 400 24.70 16.32 37.37 -
292 ISC20 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 10 600 24.19 11.85 49.39 -
293 ISC20 Janelia Research Campus, HHMI SharedScratch WekaIO Parallel Filesystem 10 600 21.32 11.27 40.32 -
294 SC19 Janelia Research Campus, HHMI sharedscratch WekaIO Parallel Filesystem 10 480 18.07 7.75 42.12 -
295 ISC22 Wu Dong Laboratory Wu Dong Compute Cluster Wu Dong Laboratory OPENFS 10 640 218.81 43.86 1,091.49 -
296 SC19 iFLYTEK iFLYTEK Yanrong YRCloudFile 10 200 78.54 13.55 455.18 -
297 SC19 iFLYTEK iFLYTEK Yanrong YRCloudFile 10 200 63.98 11.41 358.87 -
298 ISC19 Yanrong YRCloudFile Yanrong YRCloudFile 10 160 19.14 4.22 86.86 -
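For reference, the Score column in each row is the geometric mean of that entry's bandwidth result (BW, in GiB/s) and metadata result (MD, in kIOP/s), as defined by the IO500 benchmark. A minimal sketch checking this against one row from the list above (entry 208, Pengcheng Cloudbrain-II at ISC21):

```python
import math

def io500_score(bw_gibs: float, md_kiops: float) -> float:
    """IO500 score: geometric mean of bandwidth (GiB/s) and metadata (kIOP/s)."""
    return math.sqrt(bw_gibs * md_kiops)

# Entry 208: BW = 3,421.62 GiB/s, MD = 396,872.82 kIOP/s
score = io500_score(3421.62, 396872.82)
print(f"{score:,.2f}")  # close to the listed score of 36,850.40
```

The small residual difference from the listed value comes from the rounding already applied to the published BW and MD figures.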