We would like to store build artifacts in a remote location such as S3. We can easily upload files to that location from within the build script.
However, we would also like to link to these files from the build's artifacts page. Is there a way to do this? If not, could it be added as an option?
Andrew Lalis commented 2 years ago
I think it's a good idea to add official support for this at some point, but an intermediate solution could be to simply write the remote locations to a plaintext file and publish that as the internal build artifact.
Robin Shen commented 2 years ago
Alternatively, show the S3 artifact links in the build description using the "Set Build Description" step.
Nathan Clayton commented 2 years ago
It may be worth looking into something like Apache jclouds as a storage backend: it can handle filesystem storage in addition to S3, B2, GCS, and Azure, all through the same API. That would be especially nice given some of the other tickets asking about Docker repositories. It would also be helpful if other repository types (NPM, Maven, pip, etc.) were supported, as the volume of data stored can get quite sizeable.
Robin Shen commented 2 years ago
Thanks for the info. Will definitely look into this.