Unknown opened 1 year ago
Robin Shen changed fields 1 year ago
OneDev changed state to 'Closed' 1 year ago
State changed as code fixing the issue is committed
@robin I assume that this means that every new build will also include a .tar.xz or .tar.gz compressed binary?
Only .tar.gz and .zip
zip is definitely more widespread than tar, so it makes sense to use zip as the only default. Applied to the OneDev build, the difference in compression ratio is probably not relevant, even on what you call low bandwidth...
@bufferunderrun I completely disagree. The reason zip is so widespread is the adoption of Windows as the primary operating system, which has zip support built in, so that is what everyone uses. tar.gz has better compression ratios, and nearly every Linux distribution comes with tar and gzip preinstalled; they are normally part of the base system packages. Those of us who run Linux (me included) would greatly benefit from gzip-compressed tars for installation. And yes, the compression ratio matters.
The only downside compared to zip is that it takes longer (and more resources) to compress and decompress, but that is the trade-off with compression; xz-compressed tars are even slower!
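As a rough illustration of why tar.gz often wins on ratio, here is a hypothetical in-memory comparison using only the Python standard library (the file names and payload are made up, not OneDev's actual artifact layout). The structural point it shows: gzip compresses the whole tar stream at once, while zip deflates each entry separately and adds per-entry headers, so archives with many small, similar files favour tar.gz:

```python
import io
import tarfile
import zipfile

# Hypothetical payload: many small, similar files (not the real OneDev artifact).
files = {f"file_{i:03d}.txt": b"repeated small payload line\n" * 4 for i in range(200)}

# zip: each entry is deflated independently and carries its own headers.
zbuf = io.BytesIO()
with zipfile.ZipFile(zbuf, "w", zipfile.ZIP_DEFLATED) as zf:
    for name, data in files.items():
        zf.writestr(name, data)

# tar.gz: one tar stream, gzip-compressed as a whole ("solid" compression),
# so repetition across files is exploited.
tbuf = io.BytesIO()
with tarfile.open(fileobj=tbuf, mode="w:gz") as tf:
    for name, data in files.items():
        info = tarfile.TarInfo(name)
        info.size = len(data)
        tf.addfile(info, io.BytesIO(data))

zip_size, tgz_size = len(zbuf.getvalue()), len(tbuf.getvalue())
print(f"zip: {zip_size} bytes, tar.gz: {tgz_size} bytes")
```

On this kind of input tar.gz comes out far smaller; for one big, already-compressed file the gap mostly disappears, since zip and gzip both use DEFLATE internally.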
OneDev changed state to 'Released' 1 year ago
State changed as build #3271 is successful
@polarian Don't confuse general Linux packaging (tar.*) with the OneDev artifact. All the advantages you list are why Linux distros adopted tar as the de facto standard. I suspect you are really only asking for .tar.gz to allow easier AUR packaging, which is a complete waste of time. The unzip package is installed on all my Linux servers, that's it, and I'm not the only one doing this 🙃
Seriously...
Maybe it is because most things in the archive are various jars, which are already compressed. I do notice that the Ubuntu image on EC2 does not have zip installed. I did not provide .tar.gz because the old GNU tar utility could not handle long path names, but I think nowadays this should no longer be a problem.
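The point about jars can be checked directly: DEFLATE output looks statistically random, so compressing it again gains nothing. A small sketch, using random bytes as a stand-in for already-deflated jar contents:

```python
import gzip
import os

# Stand-in for already-deflated jar entries: random bytes are incompressible.
raw = os.urandom(50_000)
once = gzip.compress(raw)
twice = gzip.compress(once)  # recompression cannot shrink it further

# gzip only adds header/trailer overhead on incompressible input.
print(len(raw), len(once), len(twice))
```

This is why the compression ratio of the OneDev archive barely changes with the container format: most of its bytes are already compressed.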
How did you know I needed a tar for AUR packaging :P Yeah, but seriously, it makes packaging a whole lot easier, and no, it is not a complete waste of time! Also, that 1 MB difference makes a HUGE difference: it's an entire 1 MB less data to download, a 0.69% reduction compared to .zip xD
I would like to emphasize that it makes my life easier, so of course I am going to ask :P
Maybe I need to clarify some of my statements. Yes, unlike tar, unzip is not included by default in most distros. When I said it's installed on all my servers, I should have said "I installed it". It's part of a common base environment I set up on new servers: htop, git, wget, unzip, prometheus... quite classic in sysadmin work.

@polarian You wrote about the AUR in another discussion. I follow almost all OneDev threads, as it's my primary tool; I take care of it like my children :p Nothing personal, and I have nothing against packaging. I just think packaging a large, sensitive server application that is already released as a container image is a waste of time.

I've been in IT for more than two decades, and I've seen a lot of change and used many proprietary/open-source products and OSes, including Arch. For sysadmins and developers, the container ecosystem is THE major breakthrough of recent years. It has completely changed the way we run and deliver applications: faster, easier, stronger. The price is insignificant: a little CPU overhead, and images that are sometimes heavyweight on first download. Server distros tend toward a minimal environment (package manager, base system, libs) just to run OCI containers. Of course you can package for the AUR, but I don't think that route will be chosen by real production users. Apologies if I'm wrong.
@bufferunderrun To prove my point, I will highlight that I do not have prometheus or wget installed on my servers, and most of the time not even htop. Every sysadmin is different, but tar and gzip are always included in base distributions; gzip is even required by the Linux kernel, which is gzip-compressed by default if I remember correctly.

It shows how much of a typical sysadmin you are if you believe Docker containers are the only thing that should be used. May I highlight my disgust for Docker: I would much rather use an LXC container and configure things myself. And there are a few people who want to install on bare metal; a package would make that extremely easy.

Sysadmins have been saying containerisation is the biggest thing that has ever happened, and yet it is 2023, many years later, and containers still have not taken over completely. Why?

My entire network, including my email server, XMPP server and web servers, uses neither Docker nor Kubernetes. Why? Because in my opinion they are not as robust, and they add far too much overhead. Yes, a massive enterprise server can surely spare the resources, but what is the point? LXC containers can be used to separate things out, and an LXC container is only slightly more advanced than a chroot: there is little to no overhead, and packages are pulled just like on a normal system. In my opinion that is the best way to host anything: full control over the filesystem and software, rather than having a daemon do it all for you.

https://onedev.polarian.dev/ aims to be a community production server for OneDev, and it is currently hosted using the AUR package. Of course this is bad at the moment, as the AUR package is unstable and I am working on patching a major issue, but after that patch it should be smooth sailing.
Run your network how you like, but don't dictate how others should run their networks!
Absolutely.
But... whatever tools you use, managing servers implies some basic things: metrics, alerts, backups... and sometimes forensics, you know, to find out why a service is not working. This is the sort of base environment you will have all along your sysadmin life...
I don't know what a typical sysadmin is, but I'm definitely not one; that's not my day job 😅
Never say this.
Well, containers have been a game changer for both sysadmins and developers, that's just a fact. It seems you don't like docker/podman, so they are bad and that's it? It's more about simplicity, efficiency, availability, interoperability... You took my remarks very personally; I just said packaging OneDev is a waste of time because all the specific logic is already done by Robin in the released image.
Never say this. This is my last message in this thread, as this debate is sterile.
Hello, I do not have metrics on my servers; I try, to the best of my ability, not to track the users who connect, so the only thing I keep is error logs. I can check storage and resource utilisation without metrics; all they do is add overhead. I have backups and such, I just manage my servers very minimally. I am not going to reply to your other comments, though I have read them, because I would just be repeating my points. Let's stop this debate now because it's useless: we all manage our servers separately, and if I want to manage mine this way, that is for me to decide. I don't use zips, I use tars, so the option should stay!
Type: Improvement
Priority: Normal
Assignee:
Currently OneDev only outputs a .zip for builds, which is useful for Windows, but the zip format does not offer a high compression ratio, and Linux users need to install additional dependencies to be able to unzip. A .tar.xz or .tar.gz (xz preferred personally, if the build server can handle the extra compression) would allow higher compression ratios, making downloads easier over low-bandwidth connections, and also easier to extract on Linux, as most distributions have xz and gzip installed by default (along with tar).
I am not sure how many extra compute resources are needed to release two different compressed archives, but it would be useful.
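To get a feel for the gzip-vs-xz trade-off mentioned above, the Python standard library can compress the same sample both ways. The sample here is synthetic, text-like data, not the actual OneDev distribution, so treat the numbers as a sketch: the ratio depends entirely on the input, and xz generally trades much more CPU time for a somewhat smaller file.

```python
import gzip
import lzma

# Synthetic text-like sample (an assumption, not the real OneDev artifact).
sample = "".join(f"OneDev server distribution sample line {i}\n" for i in range(20000)).encode()

gz = gzip.compress(sample, compresslevel=9)  # fast, tools universally available
xz = lzma.compress(sample, preset=9)         # usually smaller output, much slower

print(f"raw: {len(sample)}, gzip: {len(gz)}, xz: {len(xz)}")
```

For an artifact made mostly of already-compressed jars, neither format gains much over the other, which matches the observation in the thread that the size difference between .zip and .tar.gz is under 1%.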