
Test Coverage / Quality Umbrella #3

Open · 5 of 18 tasks
HalosGhost opened this issue Jan 26, 2022 · 8 comments
Labels: difficulty/01-good-first-issue (Very approachable for new contributors), enhancement/testing (Increases test coverage or quality)

Comments

@HalosGhost (Collaborator) commented Jan 26, 2022

There are a great many tests that should be added, and many additional branches that should be covered. To that end, this issue is meant to be a centralized place to track tests that have been identified as good opportunities for improvement. We will try to keep the task list below up to date; if you think another item should be added, feel free to comment below and we will add it to the list as makes sense!

Note: task lists allow each item to be easily converted into a stand-alone issue; if a particular test or improvement merits deeper discussion as someone seeks to implement it, we can always make that change!

@HalosGhost transferred this issue from another repository Jan 31, 2022
@HalosGhost added the difficulty/01-good-first-issue and enhancement/testing labels Jan 31, 2022
@mszulcz-mitre (Contributor)

I'd like to contribute to the project, and it seems like writing a few tests is a good entry point. I've started looking into the second task on the list, 'Add unit test for cbdc::from_buffer(nuraft::buffer&)'. Is anyone working on this yet?
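For concreteness, here's the kind of round-trip test I have in mind. This is only a sketch under a few assumptions on my part: that the test suite uses GoogleTest, that cbdc::from_buffer is templated on the value type and returns a std::optional, that cbdc::nuraft_serializer and the project's operator<< overloads can be used for the write side, and that the include paths below are right.

```cpp
// Sketch only: round-trip test for cbdc::from_buffer(nuraft::buffer&).
// Assumes from_buffer<T> returns std::optional<T> and that the project's
// nuraft_serializer / operator<< can be used to produce the input bytes.
#include <gtest/gtest.h>

#include <libnuraft/nuraft.hxx>

#include "util/raft/serialization.hpp"   // assumed include path
#include "util/serialization/format.hpp" // assumed include path

#include <cstdint>

TEST(nuraft_serialization_test, from_buffer_round_trip) {
    // Serialize a known value into a nuraft::buffer using the project's
    // serializer so the byte layout matches what from_buffer expects.
    auto buf = nuraft::buffer::alloc(sizeof(uint64_t));
    auto ser = cbdc::nuraft_serializer(*buf);
    static constexpr uint64_t original = 0xDEADBEEFCAFEF00D;
    ser << original;

    // Deserialize with the overload under test and check the value survives.
    auto result = cbdc::from_buffer<uint64_t>(*buf);
    ASSERT_TRUE(result.has_value());
    ASSERT_EQ(result.value(), original);
}
```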

@HalosGhost (Collaborator, Author)

@mszulcz-mitre Not that I'm aware of; feel free to hit the "Convert to issue" button and dig in!

@mszulcz-mitre (Contributor)

@HalosGhost Thanks for the quick reply! Ready to dig in! I'm having some trouble converting the task to an issue, though. According to the GitHub page about task lists, you're supposed to be able to hover over a task and then click a circle with a dot in it. I've opened this page in three different browsers, but nothing pops up when I hover over a task. If I manually open an issue, could I ask you to link it to the task? I don't think I have permission to edit the original issue post.

@HalosGhost (Collaborator, Author)

@mszulcz-mitre happily!

I wondered about that; I think you're not the first person who hasn't been able to do that, but GitHub's documentation is unclear about who is allowed to hit that button. Please feel free to open the new issue; I'll edit the original task list appropriately.

@ykaravas

Hello, my name is Yiannis Karavas (with MITRE) and I'd like to start contributing to the project. I have begun work on "Improve test for cbdc::nuraft_serializer::read() to cover non-error cases". I cannot click the "Convert to issue" button either, so should I also just create a new issue?
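For context, the non-error path I'm thinking of covering might look something like the sketch below. It assumes nuraft_serializer exposes the cbdc::serializer interface (write/read/reset and an explicit bool conversion) over a nuraft::buffer, and the include path is a guess on my part.

```cpp
// Sketch only: exercises the success path of cbdc::nuraft_serializer::read().
#include <gtest/gtest.h>

#include <libnuraft/nuraft.hxx>

#include "util/raft/serialization.hpp" // assumed include path

#include <array>
#include <cstddef>

TEST(nuraft_serializer_test, read_non_error_path) {
    // Allocate a buffer exactly large enough for the payload.
    static constexpr std::size_t payload_size = 8;
    auto buf = nuraft::buffer::alloc(payload_size);
    auto ser = cbdc::nuraft_serializer(*buf);

    std::array<unsigned char, payload_size> written{{0, 1, 2, 3, 4, 5, 6, 7}};
    ASSERT_TRUE(ser.write(written.data(), written.size()));

    // Rewind and read the same bytes back: read() should succeed,
    // the data should match, and the serializer should stay valid.
    ser.reset();
    std::array<unsigned char, payload_size> read_back{};
    ASSERT_TRUE(ser.read(read_back.data(), read_back.size()));
    ASSERT_EQ(written, read_back);
    ASSERT_TRUE(static_cast<bool>(ser));
}
```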

@HalosGhost (Collaborator, Author)

@ykaravas indeed; feel free to just reference this issue in the one you create, and I'll update the list or add to it as needed. :)

@ykaravas commented Jun 15, 2022

@HalosGhost I would like to take a crack at the task that reads "Craft a config that can be used as a basis for most tests". I think this will be helpful in furthering my understanding of the code and will obviously be useful for testing. I was wondering if you could elaborate a bit on what is meant by "as a basis for most tests"?

  • Are we looking for larger and more complex config files?
  • Are we looking for config files that will test specific scenarios?
  • Should these new config file(s) target specific integration tests?
  • Also, what do you think about creating a simple tool (maybe in the tools or tests folder), either a small C++ program or a bash script, that could autogenerate a large config file from a few parameters? For example, one could specify the number of wallets, shards, atomizers, sentinels, and so on. Eventually, we could have parameters like average transactions per second, and maybe even a probability of failure for each type of component, so that an integration test ingesting the config could simulate those failures and transaction volumes. I am not very familiar with the existing testing framework or the benchmarking tools you have used, but let me know if you think this sounds useful. I am mainly thinking about this in a non-Docker context, but I can envision an integration test that parses the autogenerated configuration file, uses the Docker CLI to spin up all the required containers in the proper network, and issues commands such as transactions, minting, and simulated outages based on the parameters in the file. (A rough sketch of such a generator appears below.)
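To make the idea concrete, here is a rough sketch of the C++ variant of that generator: it emits a key=value config from a few component counts. The key names (shard_count, shard0_endpoint, and so on) are illustrative only and would need to match whatever keys the real config parser expects; the port ranges are arbitrary.

```cpp
// Rough sketch of a config generator. The key names below (e.g.
// shard_count, shard0_endpoint) are illustrative only and would need to
// match whatever keys the real config parser expects.
#include <cstddef>
#include <fstream>
#include <iostream>
#include <string>

namespace {
    // Emit "<prefix>_count" plus one "<prefix><i>_endpoint" line per
    // instance, assigning sequential localhost ports from base_port.
    void emit_component(std::ostream& out,
                        const std::string& prefix,
                        std::size_t count,
                        unsigned int base_port) {
        out << prefix << "_count=" << count << "\n";
        for(std::size_t i = 0; i < count; i++) {
            out << prefix << i << "_endpoint=\"127.0.0.1:" << (base_port + i)
                << "\"\n";
        }
    }
}

auto main(int argc, char** argv) -> int {
    if(argc != 5) {
        std::cerr << "Usage: " << argv[0]
                  << " <shards> <atomizers> <sentinels> <output-file>\n";
        return 1;
    }

    auto out = std::ofstream(argv[4]);
    emit_component(out, "shard", std::stoul(argv[1]), 6000);
    emit_component(out, "atomizer", std::stoul(argv[2]), 7000);
    emit_component(out, "sentinel", std::stoul(argv[3]), 8000);
    return 0;
}
```

For example, running it as `./config-gen 4 2 3 generated.cfg` would produce a config with four shards, two atomizers, and three sentinels on sequential local ports; failure-probability or load parameters could be layered on top later.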

@HalosGhost (Collaborator, Author)

@ykaravas I think it probably makes sense for that discussion to happen under that issue (feel free to open it and repost your questions), and I'll answer there.

However, in broad strokes, I can tell you my original motivation for adding that task: I thought (and still think) that having a more complex and varied configuration would be helpful. Namely, I think it might serve as a solid test bed for a system-test suite, as well as potentially being useful for many of our integration and unit tests (depending on what's being tested). Further, I think it might serve as a nice diving-in point for people who want to experiment with the system on a larger scale.

Additionally, I love the idea of a configuration generator (though I wonder if it might start to overlap with #81, and I'm unsure which would end up being the right path to explore).
