# Contributing
## Contributor License Agreement

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.
When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
## How to Contribute

### Contribute New Feature

SuperBenchmark is an open-source project. Your participation and contribution are highly appreciated. There are several important things you need to know before contributing a new feature to this project:
#### What content can be added to SuperBenchmark

- Bug fixes for existing features.
- New features for the benchmark module (micro-benchmark, model-benchmark, etc.).
If you would like to contribute a new feature to SuperBenchmark, please submit your proposal first. In the GitHub Issues module, choose `Enhancement Request` to finish the submission. If the proposal is accepted, you can submit pull requests to the origin `main` branch.
#### Contribution steps

If you would like to contribute to the project, please follow the steps of joint development on GitHub below (a sketch of the corresponding git commands follows the list).

- `Fork` the repo first to your personal GitHub account.
- Checkout from the `main` branch for feature development.
- When you finish the feature, please fetch the latest code from the origin repo, merge it into your branch, and resolve any conflicts.
- Submit pull requests to the origin `main` branch.
- Please note that there might be comments or questions from reviewers; your help will be needed to update the pull request.
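As a rough sketch, the steps above map to the following git commands. The fork URL and branch name are placeholders, not part of the project's documented workflow.

```bash
# Clone your fork and track the upstream repo (the fork URL is a placeholder).
git clone https://github.com/<your-account>/superbenchmark.git
cd superbenchmark
git remote add upstream https://github.com/microsoft/superbenchmark.git

# Checkout a feature branch from main.
git checkout main
git checkout -b my-feature-branch

# ...develop and commit the feature...

# Fetch the latest code from the origin repo, merge, and resolve conflicts.
git fetch upstream
git merge upstream/main

# Push the branch to your fork and open a pull request against main.
git push origin my-feature-branch
```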
### Contribute Benchmark Results

If you want to contribute benchmark results run by a specified SuperBench version, please follow the guidelines below.
#### Where to submit

All the results are stored under the superbench-results repository. The directory structure is as follows. Please create `<your-benchmark-folder>` to submit results.
```
superbench-results
├── v0.2
│   └── your-benchmark-foldername
│       ├── LICENSE.md
│       ├── README.md
│       ├── configs
│       │   ├── config1.yaml
│       │   └── config2.yaml
│       ├── results
│       │   ├── result1.json
│       │   └── result2.json
│       └── systems
│           ├── system1.json
│           └── system2.json
└── v0.3
    └── your-benchmark-foldername
        ├── LICENSE.md
        ├── README.md
        ├── configs
        │   ├── config1.yaml
        │   └── config2.yaml
        ├── results
        │   ├── result1.json
        │   └── result2.json
        └── systems
            ├── system1.json
            └── system2.json
```
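If it helps, the layout for a new submission can be created with commands along these lines. The version folder `v0.3` and `your-benchmark-foldername` are placeholders for your actual SuperBench version and folder name.

```bash
# Create the expected submission layout inside the superbench-results repo
# (folder names are placeholders).
cd superbench-results/v0.3
mkdir -p your-benchmark-foldername/{configs,results,systems}
touch your-benchmark-foldername/LICENSE.md your-benchmark-foldername/README.md
```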
#### Files to provide

Besides the `README` and `LICENSE` files, you should provide at least three benchmarking-related files:

- `system.json`: This file lists all the system configurations in JSON format. You can get the system info automatically by executing `system_info.py` (located under the `superbench/tools` folder) using the command below:

  ```bash
  python system_info.py
  ```

- `config.yaml`: This file is the config file used to run the benchmarks; see the configuration documentation for details.
- `result.json`: This file contains the results run by SuperBench with the system configurations listed in the `system.json` file.
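Once the three files are generated, copying them into the layout shown above might look like the following sketch; every path here is illustrative.

```bash
# Copy the generated files into the submission folder (all paths illustrative).
SUBMISSION=superbench-results/v0.3/your-benchmark-foldername
cp path/to/system1.json "$SUBMISSION/systems/"
cp path/to/config1.yaml "$SUBMISSION/configs/"
cp path/to/result1.json "$SUBMISSION/results/"
```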