Allow use of system-provided pugixml #2863

Closed
PureTryOut wants to merge 1 commit from support-system-pugixml into master
PureTryOut commented 2022-06-29 07:11:03 +00:00 (Migrated from github.com)

Linux distributions don't like vendoring of libraries by applications and prefer to ship their own versions.
This makes the build look for a system-provided pugixml first and, if it isn't found, fall back to the vendored one as before.

rtsisyk reviewed 2022-06-29 07:11:03 +00:00
biodranik commented 2022-06-29 07:43:19 +00:00 (Migrated from github.com)

Can you please clarify that "don't like" statement? We manually update all our dependencies, usually to the latest versions. There is a high risk that it won't compile on some non-recent Linux distro.

pugixml consists of 3 source files; what is the strong reason to make the code more complex by rewriting it and creating this PR?

PureTryOut commented 2022-06-29 07:54:48 +00:00 (Migrated from github.com)

Distributions want full control of the software in their repositories. They fix things locally where needed and might have custom settings and options enabled that upstream (read: you) doesn't account for. They might also ship an older version of a dependency than upstream does (you actually use an old expat version), but with various bugfixes applied to it.

Now, being compatible with older and custom versions of a library is not something you as an upstream developer should care much about; let the distribution developers handle that, since they are the ones doing custom things. Being able to build at all with system-provided versions, however, is something you should allow/support.

It doesn't really matter how many source files a dependency has, I would make such a PR even if the source only had a single file. My distribution (Alpine Linux) is quite tolerant when it comes to this stuff, but e.g. Debian would never allow packages to vendor their deps, and they put a lot of effort into making sure everything works even if they ship an older dependency.

I'm planning on making equivalent PRs for other dependencies like expat. I hope I explained the need well enough?

biodranik commented 2022-06-29 08:35:32 +00:00 (Migrated from github.com)

I still don't get your point. "Because distros do it like that" is not an argument for OM.

We build and link libraries statically for several reasons. One of them is to avoid any issues when OS has a different version. Another reason is that we sometimes use custom defines/flags, and even patches or use non-released branches of dependencies. One more reason is that it saves us a lot of time for maintenance, and eases the way for contributors to start working on the project. One more reason is that we don't want to spend a lot of time debugging an issue caused by some outdated/not compatible/wrongly configured system dependency.

Our approach is to update the outdated library and fix the code if necessary, instead of linking something dynamically from the system and hoping that it will build and work well.

biodranik commented 2022-06-29 08:37:35 +00:00 (Migrated from github.com)

What are the PROS of following your path, compared to the arguments mentioned above? Being able to add OM into Debian?

PureTryOut commented 2022-06-29 08:53:35 +00:00 (Migrated from github.com)

The reasons you list for doing it like you do are things the distribution maintainers should take care of. If somebody has a problem with, say, the Debian package because they ship some patched or older version of a dependency, that's not for you to solve but for Debian (Debian is just an example here, it could be any distribution). You don't have to spend time debugging that issue, that maintainer will. They might ask for help but you can just answer "that's not a use-case we support, either ship <version you want> or fix it yourself" if you don't feel like helping out.

When applications manage and ship their own versions of libraries, as a user and distribution you have to trust the developer to do the right thing. You have to trust they update to a newer version in time, you have to trust they actively follow and fix any security issues that might pop up, you have to trust they don't do weird things that might screw up your system in some way. I'd rather not trust all individual application developers to do this; they just focus on their application and not on all the individual dependencies they might use. When using a system-provided version and some new security issue comes up, just that one library has to be fixed by the distribution and all packages using it will automatically be secure. You don't have to wait and check if all applications using that one library are updated to pull in a newer version.

And yes, I would consider being able to add OM to Debian a pro. It's a big distribution with quite a lot of users and thus a lot of potential OM users. Also Mobian is based on it and probably wants a nice mapping/navigation application for their users. If applications like OM don't support dynamically linking to system-provided libraries then either the application won't be shipped on that distribution, or they will forever carry a downstream patch to make it possible.

biodranik commented 2022-06-29 09:08:27 +00:00 (Migrated from github.com)

I still don't understand your points. We do not ship any dependencies with our statically built binary. A statically built OM uses only a small subset of any static lib, removing everything that is not used at the linking stage.

And we never planned/wanted to add some work for maintainers. We are responsible for our binary, and it consists of our code + all 3party deps we're using, so it's our responsibility to make it work and to support it. And there are many benefits from this approach that I listed above.

There is no security risk to the system in this approach because OM does not expose the interfaces of libraries it uses. See it from that point: we ship a single binary, and we are responsible for its functionality.

PureTryOut commented 2022-06-29 10:01:16 +00:00 (Migrated from github.com)

I'm not sure I'll be able to convince you. You do not ship the libraries as separate files, no, but the parts you link to still "suffer" from the things I mentioned. Also there is the portability argument: if a bug gets found in a library when porting to a new platform, not every application linking to/using that library has to be fixed.

Yes there is a security risk, as your application might be using some vulnerable part of the library. Like I said, as an end-user I do not want to have to trust every developer of every application I use to do it right. OM is just one of the applications in use, now consider e.g. protobuf getting a CVE in the parts OM uses and possibly 30 other applications I might have installed. I now have to verify for each of those applications if they started shipping an updated version of protobuf yet (and if they even use the functionality in question) rather than just making sure once I have the newest and fixed version of protobuf.

That's just not workable and distributions have solved this problem for years and years already.
I'm hoping that even if you do not really want to support the use-cases I'm talking about you would accept the PR on a basis of "fine but we don't support this so make sure you fix problems that come from this yourself and don't come to us".

Anyway, if you don't want to accept this then that's fine, you of course have the right to; it just means I'll have to ship this patch downstream forever and deal with the maintenance burden that comes with it (rebasing mainly), or not ship OM at all. Alpine Linux and postmarketOS (for which I'm packaging this) are small and you probably won't miss their users, but it'd be a shame to miss out on any potential users from Debian, Fedora, openSUSE, <insert other distribution here>.

skarnet commented 2022-06-29 10:33:05 +00:00 (Migrated from github.com)

It's really a question of ownership and where responsibilities belong.

Developers like to have ownership of their code. And for small, statically linked projects, that makes sense, even if they have to embed a copy of some project - as long as they can maintain it, it's fine. No dilution of responsibility; authors know exactly what's happening with their code, and if a problem is reported they can fix it without having to test N+1 configurations. As a dev, I absolutely understand this, and I've had several prickly exchanges with distributions that were packaging my software all wrong - I hate that.

Distributors, however, are used to packaging bigger projects with more complex dependencies, and what they hate is when authors vendor these complex dependencies - because it's a lot of code duplication, build burden, and maintenance burden. Vendored code is usually pretty poorly maintained by developers who ship it, and from the distributor's side, using vendored libraries instead of maintained system-provided libraries is a waste of machine and support resources, a security risk, etc.

No one's wrong in this, it's just a question of balance and choices. How big is pugixml? Can the organicmaps team fully support a vendored version without burdening distribution maintainers with bug-reports they cannot fix? Would it be a better factorization of resources to delegate pugixml maintainership to distributions? There is no inherently correct answer, and you'll have to agree on where the line should be drawn.

eli-schwartz commented 2022-06-29 13:21:12 +00:00 (Migrated from github.com)

> We build and link libraries statically for several reasons. One of them is to avoid any issues when OS has a different version.

This problem is trivially solved. You can have the build system check for a minimum version of the dependency, and refuse to support anything older.

It is not your problem if Debian oldstable doesn't have a good enough version. They can conform to their mandatory policies w.r.t. building software against system dependencies by upgrading the dependency, or only package the software in Debian testing, which has the new version of the dependency.
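For illustration (not part of the original comment), such a guard is a single CMake call; the version number below is only an example, not the project's actual pin:

```
# Hypothetical minimum-version guard: configuration fails on anything older.
find_package(pugixml 1.13 REQUIRED)
```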

> Another reason is that we sometimes use custom defines/flags, and even patches or use non-released branches of dependencies.

In general, it's not a good idea to fork and maintain a diverging version of your dependencies. So the answer is probably "don't do that". Is there some kind of compelling reason you need to do this, or is it the sunk cost fallacy -- you've already soft-forked it for the sake of vendoring, and while you're there you might as well patch it arbitrarily? Because this is still going to make merging updates from upstream harder.

> One more reason is that we don't want to spend a lot of time debugging an issue caused by some outdated/not compatible/wrongly configured system dependency.

There is zero (0%) debugging involved in responding to user support tickets in which the user says "help, I tried to tell the software to build using the system $dependency, but it told me that my version wasn't new enough".

> I still don't understand your points. We do not ship any dependencies with our statically built binary. A statically built OM uses only a small subset of any static lib, removing everything that is not used at the linking stage.

The fact that you statically link the dependencies you ship doesn't mean you aren't shipping it. :)

> There is no security risk to the system in this approach because OM does not expose the interfaces of libraries it uses. See it from that point: we ship a single binary, and we are responsible for its functionality.

In the generic sense this is provably wrong, but depending on the software and on the dependency it may be right in specific circumstances; you'd probably want a security expert to vet it for that.

Security vulnerabilities aren't made manifest by the fact that there's a shared library with that vulnerability in a function exposed by its symbol table. Security vulnerabilities are made manifest by the fact that some application does something wrong or dangerous when given certain input (files, command line arguments, network requests...).

Obviously, the application uses the dependency internally in some manner or it wouldn't depend on it. If the dependency has a vulnerability, that vulnerability may be in a function that the application uses, or it may not be. If it is, then the application can be assumed by default to be vulnerable, unless you can guarantee that the application's input as given to that function is always sourced from a trusted location.

If the application is vulnerable, it does not matter that it is only the application that is vulnerable! The application is part of the system, so by extension the system is vulnerable.

...

The easiest way to solve all these issues is to allow ISVs the chance to, if they so wish, use shared linkage to the dependency (bounded by the minimum version pins in the build system). This gives them the opportunity to update the dependency with new security releases, so that regardless of whether the application used to be vulnerable, it isn't any more.

They don't usually like to make exceptions, because everyone thinks they should be the exception and usually turn out not to be, and having this discussion 15k times isn't a great use of time. Also their policies may not actually permit making exceptions. The distros most likely to make a fuss about it are also the ones most likely to not do exceptions.

eli-schwartz commented 2022-06-29 13:23:01 +00:00 (Migrated from github.com)

Again, if you're going to use released versions of your dependencies (you should!) there's no real downside to permitting at least those exact versions (but ideally that version and any newer version) to be found as a system dependency.

You can even default to using the vendored copy. :)

vng (Migrated from github.com) requested changes 2022-06-29 22:03:18 +00:00
vng (Migrated from github.com) commented 2022-06-29 22:02:21 +00:00

This can be done in a more "natural" way:
Add to 3party/pugixml/CMakeLists.txt

target_include_directories(pugixml PUBLIC pugixml/src)

add_library(pugixml::pugixml ALIAS pugixml)

Then each dependent library can declare in its CMakeLists.txt just:

target_link_libraries(${PROJECT_NAME} pugixml::pugixml)

and in source files:

#include <pugixml.hpp>
vng commented 2022-06-29 22:17:01 +00:00 (Migrated from github.com)

Did you check that it actually works with system pugi? Because I have added this function into pugi

bool set_value(const char_t* rhs, size_t sz)

to use in our code a few months ago.

PureTryOut commented 2022-06-30 04:07:42 +00:00 (Migrated from github.com)

My system pugixml for now has the patch that adds that, so yes, it works. 😉 Until that's merged upstream, however, it won't work out of the box.

biodranik commented 2022-06-30 05:41:42 +00:00 (Migrated from github.com)
  1. Pugi patch should be propagated upstream, otherwise there is no way for our code to work. Manual patching by maintainers is prone to errors and crashes if we change something again.
  2. Our goal is to keep 3party deps up-to-date. So if you see that something is outdated, or has a security issue, please update its version, or create an issue.
  3. I still don't see any benefit either for us or for users from the proposed approach. With our current static binary, it will just work everywhere. With external deps, it won't work, plus it requires a lot of support effort.
PureTryOut commented 2022-06-30 06:18:55 +00:00 (Migrated from github.com)
  1. Don't worry, @vng has already submitted it upstream https://github.com/zeux/pugixml/pull/490. This PR is meant to be used with a release of pugixml that has that merged, this is just preparation.
  2. OrganicMaps works just fine with the latest expat (assuming you're referring to that). I can update it in this tree and will look into that, but my local package already uses the latest of what's available using the system dep
  3. Don't you see the benefits at all, or do you think your arguments are more important? There is a big difference between the two, and since we listed several benefits for end users I'm not sure how it's possible to still not see any benefit there. As for support effort, like I said, that is not really true: it's up to the distributions shipping the package to make sure it works with their versions of things, not up to you. As upstream you just keep supporting your vendored versions of dependencies.
biodranik commented 2022-06-30 06:51:13 +00:00 (Migrated from github.com)

For now, I see a more religious-like approach: "let's remove all statically linked libs and make them dynamically linked ones, ignoring any reasons behind it, because Linux distro maintainers do it like that".

Can you please make a full and honest list of PROS and CONS for us and our users in the short and long term, with your proposed approach? So we can confirm that 1) you fully understand the concerns I mentioned above and 2) everyone will see what we get and what we lose?

biodranik (Migrated from github.com) reviewed 2022-06-30 06:53:44 +00:00
biodranik (Migrated from github.com) commented 2022-06-30 06:53:44 +00:00

There is no such file in the current directory; it leads to unnecessary `stat` calls by the compiler. `<>` should be used instead.

PureTryOut commented 2022-06-30 07:24:34 +00:00 (Migrated from github.com)

I uh, am not sure what to respond. I gave an honest list of PROS, and you gave an honest list of CONS. I'm not sure why you think it's a religious-like approach when all I try to do is explain my reasoning.

I do understand your concerns and I tried to address them but it seems you don't agree with them. That's fine but please don't try to make it appear like I (or others) didn't try to give any reasoning for doing this. Let's try it again I suppose... I'll try to counter the cons you mentioned in-line.

PROS mentioned so far:

  1. Better security on Linux systems due to moving the chain of trust back to the distro maintainers rather than individual application developers. Remember that OrganicMaps is just one application out of thousands a distribution ships, if every application starts shipping their own deps it's impossible to keep track of and fix security issues
  2. Distributions can make OrganicMaps work with any exotic or patched dependencies they have. Note that it's their responsibility to make the application work with that, not yours

CONS mentioned so far:

  1. Possibility of distros using outdated deps breaking the application. Although this isn't really the case because the changes as proposed make sure a minimum version is requested (check the top-level CMakeLists.txt). You can just increase this version every time you update the vendored version and force distributions to use a minimum of that version
  2. Unable to rely on custom changes you might have made to libraries. This is not something you should do anyway, any change you make to libraries you use should be upstreamed so the maintenance cost for you is lower and everybody can benefit from the change. The 1 patch that pugixml has right now for OM is a good example, it's in the process of being upstreamed. Every change you make to your dependencies should be temporary
  3. There is no security risk as you only ship the small subset of the libraries you use. This isn't true, that part of the library can still be vulnerable and with some wrong input from your application or the user it can still be at risk. It's still required to check your dependencies for security issues and fix them where relevant, distributions can take this work out of your hand, at least on Linux

If you want I can put use of system dependencies behind a CMake flag (something like -DUSE_SYSTEM_PUGIXML=True or -DUSE_SYSTEM_PUGIXML_YES_I_KNOW_THIS_CAN_BREAK_THINGS_AND_IM_RESPONSIBLE_FOR_IT_MYSELF=True) so unless users explicitly set this it'll use the vendored deps.

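As a rough sketch of what such an opt-in switch could look like in the top-level CMakeLists.txt (the option name follows the flag suggested above; the default and the version number are illustrative, not necessarily what the PR implements):

```
# Off by default, so a plain checkout keeps building the vendored copy.
option(SYSTEM_PUGIXML "Use the system-provided pugixml instead of the vendored one" OFF)

if (SYSTEM_PUGIXML)
  # Illustrative minimum version; it should track the vendored copy.
  find_package(pugixml 1.13 REQUIRED)
else()
  add_subdirectory(3party/pugixml)
endif()
```

A packager then opts in explicitly with something like `cmake -DSYSTEM_PUGIXML=ON ...`, while contributors who just run CMake keep the vendored build.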
vng (Migrated from github.com) reviewed 2022-06-30 09:36:09 +00:00
vng (Migrated from github.com) commented 2022-06-30 09:36:09 +00:00

Ok, but right now the latest pugixml 1.12 doesn't have the needed set_value function.
If we merge this PR now as is, we will break compilation for devs who have system pugixml 1.12, no?

PureTryOut (Migrated from github.com) reviewed 2022-06-30 09:40:32 +00:00
PureTryOut (Migrated from github.com) commented 2022-06-30 09:40:32 +00:00

Yeah, I would recommend waiting to merge until either a new release of pugixml is out, or I can set this to 1.13 so it'll fail automatically until pugixml 1.13 is available.

gerion0 commented 2022-07-10 23:34:57 +00:00 (Migrated from github.com)

Maybe let me give a few examples regarding advantages for security:

In 2014, the [Heartbleed](https://en.wikipedia.org/wiki/OpenSSL#Heartbleed) security issue in OpenSSL was revealed. Since OpenSSL was provided by the distribution directly, it was enough for them to update this one package (since OpenSSL usually is also dynamically linked, they didn't even need to (re)compile any other package that depended on OpenSSL).
My system currently contains 102 packages that depend on OpenSSL. All of these were secured at once back then.

Rust applications (only as an example) do not support shared linking and also don't support precompiled static libraries, so essentially everything is "bundled" within a Rust distribution package. Given that, Gentoo for example has 51 Rust ebuilds (programs) that depend on the "regex" crate (their form of libraries). If there is a bug in "regex", it is not enough for Gentoo to just update a "regex" ebuild (actually, no such ebuild exists). They need to update the "regex" dependency of every one of the 51 packages or wait for every upstream to do that.

Other than that, there is a good summary of pros and cons in the [Gentoo Wiki](https://wiki.gentoo.org/wiki/Why_not_bundle_dependencies) and in [this blog post](https://blogs.gentoo.org/mgorny/2021/02/19/the-modern-packagers-security-nightmare/) (by a very active Gentoo developer).

biodranik commented 2022-07-11 05:36:46 +00:00 (Migrated from github.com)

We do understand all implications for you as maintainers.

The problem is that you either do not understand, or ignore, the implications for us, developers.

We also know the attack surface better, and can estimate the security risks better. You're just trying to blindly follow the pattern.

PureTryOut commented 2022-07-11 06:38:32 +00:00 (Migrated from github.com)

That is just not true though. I tried to counter your arguments several times, indicating I very much understand the implications for you as developers. It's fine to disagree with my arguments, but you seem to suggest I/we ignore your arguments while to me it feels the other way around. No we do not "blindly" follow a pattern, we have good reasons for doing so and I mentioned them several times. Yes I read and understood why you do not want to follow that "pattern", and I do not agree with it and tried to tell you why. But you just keep saying we are ignoring your arguments while we are not.

At this point please just close the PR if you are not going to change your mind, rather than continuing this back and forth discussion. Distributions will either just carry this PR downstream or not ship OM at all, and that'll be it.

biodranik commented 2022-07-11 06:48:07 +00:00 (Migrated from github.com)

You still do not hear what I'm trying to say. I'll try to repeat it again:

By following your approach, any contributor who wants to help us will hit a build error on his Linux system immediately after checking out the repo. And chances are high that this contributor won't continue. Because people expect that the build should just work, instead of googling and finding "your OS is too outdated, and you should do 1, 2, 3 to update or patch that pugixml dependency before you can compile it and contribute your fix which is completely not relevant to that pugixml build error".

PureTryOut commented 2022-07-11 06:55:49 +00:00 (Migrated from github.com)

See, that's just new information to me. I fail to see where you said that but let's just keep that at a miscommunication. I did suggest earlier I can change it so it'll favour the vendored dependencies, using something like a -DSYSTEM_PUGIXML switch. I have no problem making that change, will it do?

biodranik commented 2022-07-11 08:52:43 +00:00 (Migrated from github.com)

I mentioned it in my first and second comments to this thread. Let's call it miscommunication.

I'm ok with some custom approach if it:

  1. Does not introduce additional code changes that should be supported by us or contributors
  2. Won't fail by default (because of that custom approach) on any Linux system.
PureTryOut commented 2022-07-11 10:18:45 +00:00 (Migrated from github.com)

Ok done. It'll now always use the vendored pugixml unless -DSYSTEM_PUGIXML=ON is set. No code changes are necessary besides the ones I made in this PR.

PureTryOut (Migrated from github.com) reviewed 2022-07-11 12:24:25 +00:00
PureTryOut (Migrated from github.com) commented 2022-07-11 12:24:25 +00:00

Now it uses the vendored pugixml unless the user explicitly adds -DSYSTEM_PUGIXML=ON to the cmake call, this is safe to merge. If you pass that argument you are presumed to know what you are doing and should realize any problems will come from that switch.

PureTryOut commented 2022-08-14 10:41:04 +00:00 (Migrated from github.com)

@biodranik anything blocking this still? I have some more changes like this ready to go but I'd like this in first.

biodranik commented 2022-08-14 11:41:38 +00:00 (Migrated from github.com)

There are some issues with code changes (formatting, not all includes are properly replaced, hard-coded version, etc.).

But as our patch is still not merged into the upstream, I don't see any sense in merging it now.

It would be great if you can help with updating our 3party libs in the repo, or at least point to some most important updates that we should include.

PureTryOut commented 2022-08-14 13:57:09 +00:00 (Migrated from github.com)

Sorry, hardcoded version? Do you mean find_package(pugixml 1.12) or something else? It's a specific version on purpose, as one of your earlier arguments was that you can not guarantee the application working with for example a way older dependency than you use yourself. The 1.12 call here makes sure it searches for at least (so this and anything higher) 1.12. Only if I add EXACT it will look for the specific version.

I think I caught all includes though, could you point to the specific ones I missed? And same for formatting.

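For reference, a short sketch of the version semantics being discussed (how strictly the version is matched also depends on the compatibility policy declared by the package's own config-version file):

```
find_package(pugixml 1.12)        # by default accepts 1.12 or newer (a minimum, not a pin)
find_package(pugixml 1.12 EXACT)  # accepts only exactly 1.12
```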
biodranik commented 2022-08-14 19:23:45 +00:00 (Migrated from github.com)

Version 1.12 does not contain the required changes.

Also, another option could be to try using the specific min system version (when our changes are merged), and fall back to the repo's one.

But again, that could be important only for production releases, not for app development. We don't distribute/support the Linux app officially (it is a dev version!), it's based solely on someone else's enthusiasm.

Ferenc- commented 2022-09-29 16:15:58 +00:00 (Migrated from github.com)

I have sent an extra commit upstream to pugixml hoping to help the upstreaming effort.
Does it make sense in the meantime to create further organicmaps PRs, for enabling other system-provided dependencies?
Strictly behind a -DSYSTEM_.... option?

biodranik commented 2022-09-29 18:57:08 +00:00 (Migrated from github.com)

I have a better idea. It should be possible to specify a min required version for a 3party lib in CMake and make a macro/function that checks whether at least this version is available; if so, the system one should be used, and if not, the embedded 3party version. In this way, maintainers will be happy, and contributors won't get any compilation errors / won't waste their time on system setup.

gerion0 commented 2022-09-29 20:46:16 +00:00 (Migrated from github.com)

Maybe something like this works:

find_package(pugixml 1.12)
if (NOT pugixml_FOUND)
    add_subdirectory(3party/pugixml)
endif()

Interestingly, Meson has [exactly that built-in](https://mesonbuild.com/Subprojects.html#toggling-between-system-libraries-and-embedded-sources).

biodranik commented 2022-09-29 22:03:08 +00:00 (Migrated from github.com)

Meson guys are smart ;-)

Ferenc- commented 2022-10-09 16:45:20 +00:00 (Migrated from github.com)

So the patches have been upstreamed, and [organicmaps builds with upstream pugixml](https://git.omaps.dev/organicmaps/organicmaps/pulls/3561) now.
@PureTryOut would this PR branch benefit from a rebase?

PureTryOut commented 2022-10-09 16:59:48 +00:00 (Migrated from github.com)

Done. Do note that pugixml hasn't had a release yet and according to the author won't any time soon either. "Some time later this year". Annoyingly vague but we'll have to deal with it.

Ferenc- commented 2022-10-16 18:13:00 +00:00 (Migrated from github.com)

@biodranik What could one do to move this PR forward?

biodranik commented 2022-10-16 20:10:21 +00:00 (Migrated from github.com)

As we discussed before, there are several pre-requisites:

  1. The code should be able to compile properly with both the system version and the embedded version.
  2. The system version should be used only if it's greater than or equal to the embedded version.
  3. A similar approach can be reused for other 3party deps.
  4. 3party deps should be upgraded to the latest versions before applying that system/embedded choice.
PureTryOut commented 2022-11-04 20:45:22 +00:00 (Migrated from github.com)
  1. Done
  2. Done
  3. That's the case
  4. Not sure what you mean but I suppose done after organicmaps/organicmaps#3811 is merged?

Note that the one patch OrganicMaps depended on that wasn't in upstream pugixml has now been merged, and because of that I made it depend on a minimum of 1.13. I don't think anything is holding this back anymore.

Ferenc- commented 2022-11-05 20:06:35 +00:00 (Migrated from github.com)

> 3. That's the case

I think the idea here was to make use of the find_package_or_fallback_to_3party macro.

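For readers following along, a rough sketch of what a find-or-fall-back helper along those lines could look like; the macro name matches the one mentioned above, but the body is purely illustrative and may differ from what the repository actually implements:

```
# Illustrative only: prefer a new-enough system package, otherwise build the bundled copy.
macro(find_package_or_fallback_to_3party package version path)
  find_package(${package} ${version} QUIET)
  if (NOT ${package}_FOUND)
    message(STATUS "System ${package} >= ${version} not found, using ${path}")
    add_subdirectory(${path})
  endif()
endmacro()

find_package_or_fallback_to_3party(pugixml 1.13 3party/pugixml)
```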
biodranik commented 2022-11-06 09:25:02 +00:00 (Migrated from github.com)

As #3811 has been merged, please, rebase, and make sure that your code works in all cases: no pugi in the system (our 3party is used), old pugi in the system (our 3party is used), new pugi in the system.

PureTryOut commented 2022-11-06 11:48:14 +00:00 (Migrated from github.com)

Rebased and done. I've tested all the scenarios you mentioned when I made this PR (of course) and have multiple times since, but also did just now to be sure. 🤷

Ferenc- commented 2022-11-06 12:05:42 +00:00 (Migrated from github.com)

@PureTryOut have you found a fault in the macro, or is it not possible to use it?

biodranik (Migrated from github.com) requested changes 2022-11-06 12:14:06 +00:00
biodranik (Migrated from github.com) left a comment

As mentioned, please use the already implemented universal macro. And clean up unnecessary CMakeLists.txt changes.

PureTryOut commented 2022-11-06 13:30:19 +00:00 (Migrated from github.com)

I did not realize a macro existed for that nowadays. I thought @Ferenc- said I should make one. I'll change the PR to use it.

PureTryOut commented 2022-11-06 13:42:06 +00:00 (Migrated from github.com)

Done, works as expected.

biodranik (Migrated from github.com) reviewed 2022-11-06 20:44:35 +00:00
biodranik (Migrated from github.com) commented 2022-11-06 20:44:18 +00:00

Is this alias really necessary now?

@ -314,6 +314,9 @@ if (NOT WITH_SYSTEM_PROVIDED_3PARTY)
set(JANSSON_WITHOUT_TESTS ON)
biodranik (Migrated from github.com) commented 2022-11-06 20:43:16 +00:00

Just make it a one-liner; the indentation looks strange and inconsistent.

PureTryOut (Migrated from github.com) reviewed 2022-11-07 05:37:22 +00:00
PureTryOut (Migrated from github.com) commented 2022-11-07 05:37:22 +00:00

Yes, it is. The target name is different when using the 3party copy vs. the system dependency, so this alias prevents conditional statements everywhere we link against it.

biodranik (Migrated from github.com) requested changes 2022-11-07 09:22:25 +00:00
biodranik (Migrated from github.com) commented 2022-11-07 09:22:21 +00:00

I meant that you didn't remove `pugixml::pugixml` in the other CMakeLists.txt files.

PureTryOut (Migrated from github.com) reviewed 2022-11-07 09:57:16 +00:00
PureTryOut (Migrated from github.com) commented 2022-11-07 09:57:16 +00:00

I'm not sure I understand. The system library provides `pugixml::pugixml`, while the 3party version provides `pugixml` instead. This alias is used so that it works everywhere with both the system and the 3party version.

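(For context: the standard CMake technique for papering over the two target names is an `ALIAS` target created only when the vendored copy is in use. This is a generic sketch, not necessarily the exact lines in the PR.)

```cmake
# Sketch: when falling back to the vendored copy, expose it under the same
# namespaced name that the system package config exports, so consumers can
# always link against pugixml::pugixml.
if (NOT TARGET pugixml::pugixml)
  add_library(pugixml::pugixml ALIAS pugixml)
endif()

# Consumers then need no conditionals:
# target_link_libraries(some_target PRIVATE pugixml::pugixml)
```

Upstream pugixml's own CMakeLists.txt also appears to define this alias in recent versions, which would explain why the extra alias turns out to be unnecessary later in this thread.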
biodranik (Migrated from github.com) requested changes 2022-11-07 10:03:31 +00:00
biodranik (Migrated from github.com) left a comment

I highlighted the required changes.

biodranik (Migrated from github.com) commented 2022-11-07 10:03:11 +00:00

Please remove this change, it is unnecessary.

biodranik (Migrated from github.com) commented 2022-11-07 10:02:58 +00:00

Please remove this change, it is unnecessary.

biodranik (Migrated from github.com) commented 2022-11-07 10:02:54 +00:00

Please remove this change, it is unnecessary.

biodranik (Migrated from github.com) commented 2022-11-07 10:02:50 +00:00

Please remove this change, it is unnecessary.

biodranik (Migrated from github.com) commented 2022-11-07 10:02:40 +00:00

Please remove this change, it is unnecessary.

biodranik (Migrated from github.com) commented 2022-11-07 10:02:37 +00:00

Please remove this change, it is unnecessary.

PureTryOut (Migrated from github.com) reviewed 2022-11-07 10:12:40 +00:00
PureTryOut (Migrated from github.com) commented 2022-11-07 10:12:40 +00:00

Either you or I am misunderstanding something. I can't remove those changes, as then it won't work with the system-provided pugixml. The system version doesn't provide `pugixml` as a target; it provides `pugixml::pugixml`.

biodranik (Migrated from github.com) reviewed 2022-11-07 10:30:26 +00:00
biodranik (Migrated from github.com) commented 2022-11-07 10:30:26 +00:00

I removed this line. I also removed the other mentioned lines in CMakeLists.txt files, and it works on Mac both with and without a system pugixml installed.

Doesn't it work for you? What kind of error do you get?

PureTryOut (Migrated from github.com) reviewed 2023-01-11 15:29:09 +00:00
PureTryOut (Migrated from github.com) commented 2023-01-11 15:29:08 +00:00

Honestly, I believe you. 🤷 It didn't work for me at the time, but this PR has been open for so long now that I don't really remember the details. I removed the alias; it should be fine now.

Ferenc- (Migrated from github.com) requested changes 2023-01-12 18:18:50 +00:00
Ferenc- (Migrated from github.com) left a comment

Let me know if you need help with fixing the iOS build.

@ -2,3 +2,3 @@
#include "pugixml/src/pugixml.hpp"
#include <pugixml.hpp>
Ferenc- (Migrated from github.com) commented 2023-01-12 18:18:21 +00:00

This change currently breaks the iOS build, as the iOS build doesn't use CMake and does not have `pugixml/src/` among the include paths.
That should be fixed. If you need help with that, let me know.

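(For illustration: on the CMake side, `#include <pugixml.hpp>` resolves because the pugixml target exports its header directory; upstream pugixml's CMakeLists.txt already does the equivalent of the sketch below. A non-CMake build such as the Xcode project would instead need something like `3party/pugixml/src` added to its header search paths.)

```cmake
# Sketch, as it would appear in pugixml's own CMakeLists.txt: exporting the
# header directory on the target is what lets consumers write
# #include <pugixml.hpp> whether the system or the vendored copy is used.
target_include_directories(pugixml PUBLIC
  $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/src>
  $<INSTALL_INTERFACE:include>)
```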
PureTryOut (Migrated from github.com) reviewed 2023-01-12 21:26:59 +00:00
@ -2,3 +2,3 @@
#include "pugixml/src/pugixml.hpp"
#include <pugixml.hpp>
PureTryOut (Migrated from github.com) commented 2023-01-12 21:26:59 +00:00

I have no clue about iOS (and don't really care personally), so yes, I'd like some help there 🤔

Ferenc- (Migrated from github.com) reviewed 2023-01-13 07:21:58 +00:00
@ -2,3 +2,3 @@
#include "pugixml/src/pugixml.hpp"
#include <pugixml.hpp>
Ferenc- (Migrated from github.com) commented 2023-01-13 07:21:58 +00:00

OK, I have added changes to get the iOS build to compile, in [this commit](https://github.com/Ferenc-/organicmaps/commit/2d2fffd2c6f0530bbdbfdbee6c6f57632c90a5a8) based on your current PR branch. You can `cherry-pick` that hash or `rebase` onto my [support-system-pugixml](https://github.com/Ferenc-/organicmaps/tree/support-system-pugixml) branch.

Ferenc- commented 2023-01-20 22:39:07 +00:00 (Migrated from github.com)

@PureTryOut your changes in this PR have been merged to `master` with organicmaps/organicmaps#4268, so I will close this now.

PureTryOut commented 2023-01-21 08:42:27 +00:00 (Migrated from github.com)

Interesting. Thanks!
