Mac OSX binaries and Windows binaries that use GDAL/GEOS/PROJ #40
@edzer Thanks for starting this. As for macOS: the recipes are how a) you can replicate exactly what the macOS CRAN setup is, and b) by issuing PRs you can update any libraries that your package needs, which will be reflected on the macOS build server. Alternatively, you can just let me know what your package needs and I can make those changes myself. My main problem here is that I don't know which features your packages need and which dependencies are required - you can build GDAL with many different options, and I can't tell if something is useful or not. Recipes are not intended for end users, as users will simply download the package from CRAN, which uses the libraries from the recipes. If developers want to re-build a package, they can download the library binaries from https://mac.r-project.org/libs-4/ instead of building them themselves. The macOS recipes setup pre-dates the Windows one by many years. The Windows system was put in place only very recently, for R 4.0.0, and I agree that it would be nice to have some consistency, but those are very different operating systems, so it may sound easier than it is. I was playing with a Jenkins setup, but the R support in Jenkins is terrible, so it pretty much has to be done from scratch. I am still hopeful that I can have a script that replicates the CRAN setup in Jenkins. As for GitHub Actions, those are, unfortunately, hopeless, as they don't support the necessary macOS versions, so Jenkins is the only viable path I'm aware of.
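For concreteness, a minimal sketch of that download route. The exact tarball name below is an assumption (check the listing at https://mac.r-project.org/libs-4/ for the current files), as is the detail that the tarballs are rooted at usr/local:

```sh
# Hedged sketch: fetch a prebuilt static library from the CRAN macOS
# repository and unpack it into /usr/local.
# NOTE: the tarball name is an assumption; see https://mac.r-project.org/libs-4/
curl -LO https://mac.r-project.org/libs-4/gdal-3.1.1-darwin.17-x86_64.tar.gz
# Assuming the archive paths are rooted at usr/local, unpack relative to /:
sudo tar -xzf gdal-3.1.1-darwin.17-x86_64.tar.gz -C /
```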
I spent most of today trying to work around all the bugs in the spatial libraries that have to do with static linking. All of them are unrelated to macOS, and I'm documenting my progress on the recipes wiki page on known issues in libraries. Especially for GDAL the list is pretty long. That said, the recipes now include GDAL 3.1.1, GEOS 3.8.1, PROJ 6.3.1, NetCDF 4.7.4, HDF5 1.12.0 and HDF4 4.2.15. I only tested …
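A quick sanity check of those versions after unpacking, assuming the libraries' *-config helper scripts landed in /usr/local/bin as usual:

```sh
# Verify the installed spatial stack matches the recipe versions above.
gdal-config --version   # expect 3.1.1
geos-config --version   # expect 3.8.1
proj 2>&1 | head -1     # the usage banner includes the PROJ release (6.3.1)
```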
wow, thank you! @s-u I'm a little reluctant to ask as I'm out of my depth, but - is the workflow in the first post here (to unpack the 10.13 SDK) not viable for emulating at least High Sierra on CRAN?
@mdsumner no, it is not viable. We tried it before. It's not as good as it sounds, because SDKs don't work for …
ok, thanks very much
BTW, to clarify: the reason the SDK use was added is sort of the inverse - to avoid using the most recent SDKs, which are partially broken. So it keeps recent Xcode from breaking, but the resulting binaries are not guaranteed to work on High Sierra (Xcode has always been funky - Apple likes to supply newer SDKs than the OS, which is really odd). So it helps in some cases, but it doesn't solve the task of replicating the CRAN setup - for that you need a 10.13 VM.
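For reference, a sketch of the usual SDK-pinning mechanism being discussed. This is illustrative only, not the exact CRAN flags, and the SDK path assumes you unpacked MacOSX10.13.sdk at that location:

```sh
# Point the compiler at an older SDK and set the deployment target.
# The SDK path below is an assumption; use wherever you unpacked it.
export SDKROOT=/opt/sdks/MacOSX10.13.sdk
export CFLAGS="-isysroot $SDKROOT -mmacosx-version-min=10.13"
export CXXFLAGS="$CFLAGS"
# The compiler now uses the 10.13 headers/libs and the binaries declare a
# 10.13 deployment target -- but, as noted above, this does not guarantee
# they actually run on High Sierra.
```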
But, fwiw, it's not completely pointless, right? I feel pretty good that I can get those CRAN binaries and have a full test pass on 10.15: https://github.com/hypertidy/vapour/runs/825320263?check_suite_focus=true At least it's teaching me a lot. I wasn't sure about all your configure notes, but all the patches and configure details are now built into those binaries - is that accurate? I have no idea about Jenkins yet, but it's not impossible that we have access to these VMs at my work, so I will ask around.
@mdsumner no, that's great! I like that the workflow allows fetching the latest library binaries, so, yes, for all intents and purposes it's as close as you can get with Actions. Don't get me wrong, I think it's perfect for testing. What I was referring to was replicating the CRAN setup exactly, to trace any issues that may come from the checks, which is possible only if the CI system allows the use of 10.13 VMs.
Nice, understood!
@s-u thanks for all this effort! This is highly appreciated from the R-spatial community. Anyway, these are just a few thoughts from a package maintainer who does not really have experience with any of the steps involved in getting things set up in a usable way (I leave all this up to @edzer via sf, basically). I really appreciate all the work that is being done to make this as painless as possible for all parties involved.
Just one comment to add to @tim-salabim - I totally agree about alignment of library versions, but I'm personally less concerned about alignment of available GDAL drivers on Windows and macOS. I think it's more important to have as many drivers as possible on each, even if one is missing some that the other has. I know that might be controversial, because it implies different capabilities and testing requirements. In my experience over the years, the ability to access a new format was always instructive and helpful for me, and more important than system consistency - but I can see that might be a topic for discussion ;)
@tim-salabim Pain is essential to find pinch-points, like OSGeo/PROJ#2084 and OSGeo/gdal#2672. We also feed back up the component tree when necessary. For both GDAL and PROJ, @rouault has been very helpful and responded as quickly as possible, so subsequent releases of PROJ and GDAL will have changes suggested by CRAN.
@rsbivand of course. I am thinking more about catching these pain points early and having a system to avoid them later, hence my "shift" in brackets. Admittedly, I don't really have an idea about the whole process that is necessary to ensure stable, useful builds on CRAN. I just thought my amateur views might be helpful (a different angle) and wanted to express my gratitude to all the people involved. @mdsumner I agree that having a GDAL suite that is as complete as possible would be great. In the end, that's where the power of GDAL lies, right?
Oh, and as a side note and from a very selfish standpoint, I'd love to see GDAL 3.1.x in both the macOS and Windows binaries, as it will make the mapview experience significantly nicer :-)
I would like to draw attention to the following points which IMO usually do not get discussed:
Most of this thread talks about 1). This is fine as long as people use binaries. However, installation issues from source will not be solved by referring to the CRAN 10.13 setup, as nobody will (be able to) replicate this setup. Installing snapshots of system libs is a pain, and one cannot simply go back to macOS 10.13. What I am missing is a robust check system against the most recent macOS version. The default when checking against macOS, whether locally or via CI, is to use the latest stable macOS version. When giving R courses or consulting on R: many people have Macs, and I always have to explain to them why R packages are so unstable when it comes to packages that use more than just plain R code. Building universal binaries across major versions of macOS is hard. CRAN could also make use of GitHub Actions and ensure a proper build process against the most recent macOS system and the latest Homebrew libraries. Usually the r-lib/tidyverse folks or other ambitious package maintainers do the heavy lifting of supporting the most recent macOS versions, because they usually have a CI test system set up.
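For concreteness, a minimal sketch of the "latest macOS + Homebrew" check described above. The Homebrew formulae gdal/geos/proj are the standard ones; sf is used only as an example package:

```sh
# Install the spatial libraries from Homebrew, then force a source build
# of an R package against them.
brew install gdal geos proj
Rscript -e 'install.packages("sf", type = "source")'
```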
@pat-s I agree. To make it very clear, those are also two very different tasks. The role of CRAN is to provide a "just works" setup for users. The macOS version requirement is there simply to make sure we cover a large enough fraction of useRs, and it is really just a minor detail in a sense. It doesn't mean the libraries have to stay behind; in fact, the static builds allow us to move ahead at any time, so if package developers care, they can push the boundary where they think it's best for the users - all they have to do is talk to us. That part is about making your package available to the users (= release); it is not about development. The whole point of this setup is that it is very stable.
The other question is about CI for package development. I am more than happy to provide my expertise if needed, but that is independent of CRAN operations and is just good practice. It is no replacement for the CRAN incoming checks - you would still need to pass those on the CRAN setup to submit your release - but it would allow you as a developer to find out what's needed and what you have to communicate to us for your release (if needed). I think there is certainly hope for that setup given the work so far. The main issue with those things is just that it's hard to find an active maintainer. As we have seen with Jenkins, you can't take a "set and forget" approach, as things will break upstream and someone has to deal with them. For CRAN this is well defined, but outside of R Foundation involvement it's not.
Yes, that is how it would happen in an ideal world. However, I do not see this happening in practice. There needs to be more manpower to maintain a whole CI system that aims to cover most CRAN packages. But I guess we are already going off-topic here and losing the r-spatial connection (even though all of the above would also apply to r-spatial problems).
Sure, but "good practice" would help everyone here, users and devs. Maybe we should outsource this discussion to the linked GitHub org and discuss there. Regarding the initial topic of this thread: I still stay with the opinion to link and test against homebrew-core on macOS as the default since this is what probably 90% of all macOS users will do every day (if they cannot/won't use binaries). |
Well, but that's exactly how it is set up - see the recipes repo. And anyone using R should be familiar with the format, as it's the same as R packages.
It is completely open, there is nothing hidden, and fortunately there are people who are contributing. There are also community efforts such as the support for Jenkins and GH Actions.
But, again, this has been ongoing for quite some time - just look on GitHub: many R packages have been using common CI platforms for quite a while, and those are de-facto standards.
I am aware of these, since I am developing/maintaining a platform-agnostic CI DSL for R within rOpenSci. What prevents CRAN from having at least one runner in the build matrix that checks against the latest R version on the latest OS version using Homebrew? Creating runners that use historic OS versions is much more complicated than the task just outlined. Having such a runner could serve as a nice guideline for all the CI approaches out there that want to really mirror the CRAN check system (rather than just copying parts while relying on common CI solutions, which always diverge a bit from the CRAN standard).
I am aware of it and it is a good start, thanks for this.
Last time I checked it was the opposite - the first step was to remove Homebrew, specifically to not mess up … But we're running in circles here - I think you're still confusing CRAN and CI. CRAN is providing binaries for R users on macOS, so that is our goal. The main worry of developers was replicating the setup so that they can troubleshoot cases where a binary was not available due to a failure; I think we have a solution for that now. CRAN is not a CI service, nor a service for Homebrew. If Homebrew wants to provide a CI service, that's great, but it has nothing to do with CRAN, nor with R and package macOS releases on CRAN. If someone wants to maintain it, it would be perfectly fine to have R and packages in Homebrew - they already have R there. So, as I have said all along, it's perfectly fine if you want to install R from Homebrew along with all the dependent libraries and packages - you could easily build a CI on that if that's what you want.
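For illustration, one way to keep Homebrew copies of the spatial libraries from leaking into a static CRAN-style build. brew unlink/link are standard Homebrew commands; the formula names are the usual ones:

```sh
# Take Homebrew's spatial libraries out of /usr/local so a static
# CRAN-style build cannot accidentally pick them up.
brew unlink gdal geos proj
# ... build and test against the CRAN static libraries ...
brew link gdal geos proj    # restore afterwards
```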
The CRAN build is maintained from that repo, so I test the PRs on a development CRAN setup, and if a PR passes it is merged and installed on the production VMs.
This setup has been around for 5 years and it has been announced for a long time, so maintainers that care about macOS (which is not a large fraction as I can say from experience) should know about this.
He typically provides help to package authors by sharing his extensive expertise in fixing packages, often providing patches. He also maintains separate check setups that perform tests on additional platforms and with extra tools/instrumentation.
@s-u there's been some indication that Prof Ripley has artefacts in another repo, but it seems to me that we should treat https://mac.r-project.org/libs-4/ as the only place for macOS binary builds, both current and development. He mentioned PROJ 6.3.2 to me in particular, for a specific libsqlite issue, but it seems that I can use the 6.3.1 from libs-4/ (or later, if/when it becomes available) as the proper current CRAN dev target for static builds. Is that accurate?
I'm trying to bring together several discussions here:
I agree that syncing the binary builds on macOS and Windows in terms of versions and drivers would be marvelous. I have experience both with statically building macOS binaries of sf (the way @s-u does for CRAN) and with using Homebrew, and bad experiences with trying the former while having Homebrew libraries installed. I'm hesitant to advise dynamic linking to macOS users who have no clue what "compilation" or "dynamic linking" means, because they will need to recompile/reinstall the R packages when the GDAL etc. libraries get updated, but probably won't understand why things stop working when this happens. For those users, statically linked binary R packages seem the best advice (see the verification sketch below).
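A quick way to check whether an installed macOS package really is statically linked. The library path below is an assumption; adjust it for your R installation:

```sh
# List the dynamic dependencies of the package's shared object.
otool -L /Library/Frameworks/R.framework/Resources/library/sf/libs/sf.so
# A statically linked build should not list libgdal/libgeos/libproj
# dylibs -- only system libraries such as libc++ and libSystem.
```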
@jeroen's rwinlib has made it easy for pretty much any user to install dev versions: install Rtools and there you go, regardless of whether you understand what is going on or not. (It may have led to the large number of CRAN packages linking to GDAL etc.) It would be perfect if a static build system with similar simplicity existed for macOS (I don't know if one does). Jeroen pointed me to recipes, but I'm not sure what to do there. Can I use that to locally create static builds with the library versions I want? (A sketch of the Windows route follows below.)
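For reference, the Windows route in its entirety, under the assumption that the package's build scripts fetch the rwinlib static libraries themselves during configure (sf is again just an example):

```sh
# On Windows, with Rtools 4.0 installed and on the PATH, a source install
# of an rwinlib-using package downloads the prebuilt static libraries
# as part of the build.
Rscript -e "install.packages('sf', type = 'source')"
```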
I am confused about: