How would you recommend addressing a use case for integration testing within a shell executor where a very large amount of input data and golden-file comparison data is necessary?
Currently we're relying on ~68 GB of data for these tests.
If there were a way to mount an external volume, or to make some directory available to Docker, I could set up a OneDev pipeline step where I manually trigger this comprehensive integration-testing step on successful candidate integration builds.
Using the run options of the docker executor is the suggested approach in this case.
PS: The command step does not have an option to mount external volumes, as OneDev tries to keep builds reproducible, and external dependencies are not encouraged.
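As a sketch of what such run options might look like, the snippet below builds a bind-mount option string of the kind that could be pasted into a Docker executor's Run Options field. The host path /test-data and container path /opt/test-data are assumptions for illustration, not paths mandated by OneDev:

```shell
# Sketch of Docker run options that bind-mount a large, read-only
# test-data directory into the build container.
# Assumed paths: /test-data on the host, /opt/test-data in the container.
RUN_OPTIONS='--mount type=bind,source=/test-data,destination=/opt/test-data,readonly'

# In practice this string would go into the executor's Run Options field;
# here it is just printed so the format can be inspected.
echo "$RUN_OPTIONS"
```

The readonly flag keeps the build from modifying the golden data, which helps preserve reproducibility even though the data lives outside the workspace.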
Jerome St-Louis commented 10 months ago
Right, thank you.
These large input and golden files are dependencies for verifying a successful build (integration testing), not for producing it.
Thank you!
Could this be done with a particular Job Executor that gets --mount Run Options ( https://docs.docker.com/storage/volumes/ ) for a Server Docker Executor? For example:

--mount type=bind,source=/test-data/,destination=/path/in/container,readonly
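With the test data mounted read-only into the container, the integration-test step itself could be as simple as comparing produced outputs against the golden files. A minimal sketch follows; the directory names are assumptions, and the sample files are created here only to make the script self-contained (in a real job, GOLDEN_DIR would point into the mounted data and OUTPUT_DIR at the build's output):

```shell
# Hypothetical golden-file comparison step. In the real pipeline,
# GOLDEN_DIR would be e.g. /opt/test-data/golden (the read-only mount)
# and OUTPUT_DIR the directory the build just produced.
GOLDEN_DIR=$(mktemp -d)
OUTPUT_DIR=$(mktemp -d)

# Stand-in data so this sketch runs on its own.
echo "expected result" > "$GOLDEN_DIR/case1.txt"
echo "expected result" > "$OUTPUT_DIR/case1.txt"

# Compare every golden file against the corresponding output;
# any mismatch marks the step (and therefore the build) as failed.
status=0
for golden in "$GOLDEN_DIR"/*; do
    name=$(basename "$golden")
    if ! cmp -s "$golden" "$OUTPUT_DIR/$name"; then
        echo "MISMATCH: $name"
        status=1
    fi
done

echo "comparison status: $status"
```

A nonzero status would be returned (e.g. via exit "$status") to make OneDev mark the step as failed.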