Native compilation for openshift and knative consumes 4 times more RAM and takes 4 times longer #43360
/cc @Karm (mandrel), @galderz (mandrel), @geoand (knative,openshift), @iocanel (knative,openshift), @matejvasek (funqy), @patriot1burke (funqy), @zakkak (mandrel,native-image)
My wild guess is that it's because of the Kubernetes Client and not really related to Quarkus. Could you compare the size of the binaries? Now I'm not entirely sure it will improve things, but the first thing to do would be to depend on the …
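One quick way to act on the binary-size suggestion above is to measure the two runners directly. This is a hedged sketch: the `build-3.5.2`/`build-3.14.3` paths and the 1 MiB / 4 MiB placeholder files are illustrative stand-ins, not taken from the actual reproducer; point `old_runner` and `new_runner` at your real `*-runner` files instead.

```shell
# Sketch: compare the sizes of the native binaries produced by the two
# Quarkus versions. Paths are hypothetical; substitute your own.
old_runner=build-3.5.2/app-runner
new_runner=build-3.14.3/app-runner

# Placeholder files so the sketch runs as-is; delete these three lines
# when pointing at real binaries.
mkdir -p build-3.5.2 build-3.14.3
head -c 1048576 /dev/zero > "$old_runner"   # 1 MiB stand-in
head -c 4194304 /dev/zero > "$new_runner"   # 4 MiB stand-in

old_size=$(wc -c < "$old_runner")
new_size=$(wc -c < "$new_runner")
echo "old: $old_size bytes, new: $new_size bytes"
# prints "growth: 4.0x" for the stand-in sizes above
awk -v o="$old_size" -v n="$new_size" 'BEGIN { printf "growth: %.1fx\n", n / o }'
```

A large jump in binary size would point at extra classes being pulled into the native image, which also drives up build-time memory.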
Sorry if I misunderstood your comment, but I'll try to react:
IIUC yes, exactly. However, Quarkus is managing this client and using it in the extension we use. I didn't open this issue, but my understanding was that this is the reason it was opened here. I think the developers behind the Fabric8 client are also integrating it into the Quarkus Kubernetes/OpenShift extensions together with @iocanel. @manusa right? So we can directly improve things only by bumping to Fabric8 versions that don't cause this. Maybe we could add a test, considering this is a recurring issue?
It seems that using …
I think that's fantastic, and I'd really like to understand how that is possible, just out of personal interest. Let's see if anyone can give me a hint. Thank you!
Following the recommendation from quarkusio/quarkus#43360
Part of the work of an extension is to optimize the behavior in native mode, and that's what is done in these extensions. IIRC this particular extension was highly problematic in the past and some work was done precisely to solve this issue. You can probably find some pointers by looking through the issues/PRs or the history of the Kubernetes Client extension (and friends, as there are also a few internal extensions).
Understood, thank you. @fedinskiy is there anything else to this issue or are you going to close it?
Following the recommendation from quarkusio/quarkus#43360
The Fabric8 clients should be used through the Quarkus extensions for the reasons Guillaume pointed out, especially when performing a native image build. For JVM mode, I don't think there should be a problem, though.
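Concretely, "using the client through the extension" means depending on the Quarkus extension artifact rather than on `io.fabric8:kubernetes-client` directly, so the extension's native-image processing applies. A minimal `pom.xml` sketch (the version is inherited from the Quarkus BOM; shown as an illustration, not as the thread's exact dependency set):

```xml
<!-- Sketch: depend on the Quarkus extension instead of the bare Fabric8
     client so the extension's native-image optimizations take effect.
     No <version>: it is managed by the Quarkus BOM. -->
<dependency>
  <groupId>io.quarkus</groupId>
  <artifactId>quarkus-kubernetes-client</artifactId>
</dependency>
```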
Alright, thank you for your time.
Following the recommendation from quarkusio/quarkus#43360 (cherry picked from commit 64f9986)
Describe the bug
I have an application which uses openshift, knative and funqy. When I compile the application in native mode using Quarkus 3.5.2, it requires 4 gigabytes of RAM (set via the quarkus.native.native-image-xmx property) and the compilation takes around 2 minutes. When I compile the same application using Quarkus 3.14.3, the compilation requires at least 16 gigabytes (the build hangs at 8) and takes somewhere between eight and ten minutes.
Expected behavior
The performance should not degrade that much.
Actual behavior
The resource consumption became much worse.
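For reference, the heap cap mentioned above can be set either on the command line (as in the reproducer) or in application.properties. A minimal sketch with the 4 GiB value that sufficed on 3.5.2:

```properties
# Cap the heap available to the native-image build process.
# 4g was enough on Quarkus 3.5.2; this report needed 16g on 3.14.3.
quarkus.native.native-image-xmx=4g
```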
How to Reproduce?
git clone https://github.com/fedinskiy/quarkus-test-suite -b reproducer/funqy-consumption
cd quarkus-test-suite/funqy/knative-events
mvn clean verify -P root-modules -Dnative -Dquarkus.platform.version=3.14.3 -Dquarkus.native.native-image-xmx=16g
or
mvn clean verify -P root-modules -Dnative -Dquarkus.platform.version=3.5.2
for comparison.
Output of uname -a or ver
6.10.4-200.fc40.x86_64
Output of
java -version
Java version: 21.0.1, vendor: Eclipse Adoptium
Mandrel or GraalVM version (if different from Java)
No response
Quarkus version or git rev
3.14.3
Build tool (i.e. output of mvnw --version or gradlew --version)
Apache Maven 3.9.6 (bc0240f3c744dd6b6ec2920b3cd08dcc295161ae)
Additional information
This issue was previously reported as #37142 and #38683. Maybe one of those should be reopened, since the same problem affects 3.8.6 as well.