The IO500 Foundation Steering Committee Rules - Version 2.0
Submission Rules for the Research and Production Lists
The following rules should ensure a fair comparison of IO500 results across systems and configurations. They serve to reduce mistakes and improve accuracy.
- Submissions must be made using the latest version of the IO500 application on GitHub, and all binaries should be built according to the instructions in Running (a build-and-run sketch follows this list).
- Read-after-write semantics: The system must be able to correctly read freshly written data from a different client node after the writer has completed its close operation.
- The stonewall flag must be set to 300 to ensure that all create/write phases run for at least 300 seconds (the configuration snippet after this list shows the setting).
- The predefined workloads for all benchmarks are sized to satisfy this requirement, but you may need to set higher values.
- No edits may be made to the source code, including bundled codes such as IOR, beyond changing the allowed variables and adding commands to configure the storage system in io500.sh (e.g. setting striping parameters; an illustrative snippet follows this list).
- Submitters with a legitimate reason may request an exception to this rule from the committee via committee@io500.org.
- The output files for mdtest and IOR may not be pre-created.
- You must run all phases of the benchmark on a single storage system without interruption.
- There is no limit on the number of storage nodes; the storage servers may optionally be co-located on the client nodes.
- All data must be written to persistent storage within the measured time for the individual benchmark; for example, if a file system caches data, it must ensure the data is persistently stored before acknowledging the close.
- Data and metadata must be written in their entirety and not reduced based on their contents. The goal of the IO500 is to provide dataset-independent performance results; techniques such as deduplication, compression, and other lossless and lossy reductions would bias performance because the IO500 benchmark uses partially predictable content that is not representative of real-world workloads.
- Submitting the results must be done in accordance with the instructions on our submission page. Please verify the correctness of your submission before you submit it.
- If a tool other than the included pfind is used for the find phase, it must follow the same input and output behavior as the included pfind, and its source code must be included in the submission (see the invocation sketch after this list).
- It is not required to capture the list of matched files.
- Please also refer to the README documents in the GitHub repo.
- Please read the CHANGELOG.md file for recent changes to the IO500 benchmark.
- Only submissions using at least 10 physical client nodes are eligible to win IO500 awards, and at least one benchmark process must run on each client.
- We accept results on fewer nodes for documentation purposes, but they are not eligible for awards.
- Virtual machines may be used, but the above rule must still be followed. More than one virtual machine may run on each physical node.
- For the 10 Node Challenge, there must be exactly 10 physical client nodes, and at least one benchmark process must run on each node.
- The only exception to this rule is the find benchmark, which may optionally use fewer nodes/processes.
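As a convenience, here is a minimal sketch of fetching, building, and running the benchmark. The repository URL, the prepare.sh script, and config-minimal.ini reflect the current IO500 GitHub layout, and the MPI process count is illustrative only; always follow the instructions in Running.

```
# Fetch the latest IO500 release and build it together with its
# bundled benchmarks (IOR, mdtest, pfind).
git clone https://github.com/IO500/io500.git
cd io500
./prepare.sh

# Run all phases via MPI; the process count (here 320) is an example
# and should match your client configuration.
mpiexec -np 320 ./io500 config-minimal.ini
```

For a valid submission the stonewall timer must be at least 300 seconds. The key and section below follow the bundled example configs; confirm them against your checkout:

```
[debug]
stonewall-time = 300
```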
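The following hypothetical snippet illustrates the kind of storage-system configuration that the source-code rule above permits inside io500.sh. The function and variable names (setup_directories, $io500_workdir, $io500_resultdir) follow older io500.sh layouts and may differ in your copy; the Lustre lfs setstripe commands are examples of allowed tuning, not required settings.

```
# Inside io500.sh: adding storage-tuning commands here is allowed;
# editing benchmark source code is not.
function setup_directories {
  mkdir -p $io500_workdir $io500_resultdir
  # Example (Lustre): stripe the large-file directory across all OSTs
  # and keep the metadata-heavy directory on a single stripe.
  lfs setstripe -c -1 $io500_workdir/ior_easy
  lfs setstripe -c 1  $io500_workdir/mdtest_easy
}
```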
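For the find phase, a replacement tool must accept the same GNU-find-style arguments and produce the same output behavior as the bundled pfind. The invocation below is a sketch based on older io500.sh defaults; the work directory, timestamp file, size, and name pattern are illustrative:

```
# How the find phase invokes the tool; a drop-in replacement must
# accept the same arguments and report the number of matched files
# the same way. It does not need to capture the list of matches.
mpiexec -np 320 ./pfind $io500_workdir \
        -newer $io500_workdir/timestampfile \
        -size 3901c -name "*01*"
```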
Please send any requests for changes to these rules or clarifying questions to our mailing list.
The IO500 committee will assign a reproducibility score per submission.
This will impact a submission as follows:
- Undefined/Limited Score - Lowest levels of reproducibility. The IO500 committee will work with submitters to try to gather more information and raise the score, but if additional information is not received the submission may be rejected.
- Proprietary/Fully Reproducible Score - Eligible for IO500. Entries with a 'Proprietary' score are eligible for the Research List and entries with a 'Fully Reproducible' score are eligible for the Production List (if they also meet the other requirements for the Production List).
Additional Eligibility Rules for the Production List
Each submission will appear on only one of the Research and Production lists. The following additional requirements must be satisfied for a submission to be eligible for the Production List.
- Submission meets the definition of a "Production System" as defined in the Production List description.
- Submission achieves the highest 'reproducibility score' as defined in the Reproducibility description.
- The system has no single point of failure.
The IO500 steering committee has final say on whether a submission meets the above requirements.
All information, including fault tolerance mechanisms, will be listed prominently on the IO500 list so it is clear to everyone what tradeoffs are employed to achieve the published score. Further, multiple submissions with different fault-tolerance/reliability mechanisms may be submitted and published in order to demonstrate the capabilities of a submission along different dimensions (although we may limit the total number that can be on a list).
For further details, please see both the IO500 Reproducibility Proposal and List Split Proposal.