
Add support for Python 3.13 #2724

Open · wants to merge 35 commits into main

Conversation

@antonpirker (Contributor) commented Jul 22, 2024

Description

Add support for Python 3.13 to all instrumentations where the instrumented library already supports Python 3.13.

Updated tox.ini and the workflow generation script so that, for every instrumentation that already supports Python 3.13, the tests also run in CI.

To make the tests pass I had to update some test requirements, but only helper libraries, not the libraries under test themselves.
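A minimal sketch of the kind of tox.ini change this involves (the environment names below are illustrative excerpts, not the PR's exact diff):

```ini
[tox]
envlist =
    ; add the py313 factor where the instrumented library already supports it
    py3{8,9,10,11,12,13}-test-instrumentation-flask-0
    py3{8,9,10,11,12,13}-test-instrumentation-httpx-0
    ; keep the old upper bound where 3.13 support is still missing
    py3{8,9,10,11,12}-test-instrumentation-celery
```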

Type of change

Please delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

How Has This Been Tested?

I ran all the instrumentation tests under Python 3.13 locally; the libraries below do not yet support Python 3.13, and their CI workflows have not been updated to run under Python 3.13:

py313-test-instrumentation-aiopg: FAIL code 1 (39.86=setup[0.02]+cmd[6.38,6.33,5.05,6.13,15.95] seconds)
py313-test-instrumentation-asyncpg: FAIL code 1 (22.23=setup[0.02]+cmd[4.22,4.03,3.63,3.52,6.81] seconds)
py313-test-instrumentation-celery: FAIL code 1 (36.26=setup[0.02]+cmd[3.83,4.71,4.54,4.21,4.12,14.82] seconds)
py313-test-instrumentation-confluent-kafka: FAIL code 1 (21.60=setup[0.02]+cmd[4.11,4.00,4.11,3.55,5.81] seconds)
py313-test-instrumentation-django-1: FAIL code 2 (34.17=setup[0.02]+cmd[5.97,5.85,8.85,4.00,9.17,0.30] seconds)
py313-test-instrumentation-falcon-1: FAIL code 2 (26.33=setup[0.03]+cmd[5.14,4.49,4.64,4.71,7.04,0.29] seconds)
py313-test-instrumentation-falcon-2: FAIL code 1 (26.67=setup[0.02]+cmd[4.80,4.51,3.93,4.47,8.95] seconds)	
py313-test-instrumentation-grpc-0: FAIL code 1 (145.68=setup[0.02]+cmd[6.33,3.66,3.89,4.21,127.58] seconds)
py313-test-instrumentation-grpc-1: FAIL code 1 (149.32=setup[0.02]+cmd[5.35,4.05,3.97,4.06,131.86] seconds)
py313-test-instrumentation-httpx-0: FAIL code 1 (22.01=setup[0.02]+cmd[4.44,4.08,4.09,3.84,5.27,0.26] seconds)
py313-test-instrumentation-psycopg2: FAIL code 1 (29.28=setup[0.02]+cmd[4.66,4.23,3.92,4.89,11.56] seconds)
py313-test-instrumentation-pyramid: FAIL code 2 (22.67=setup[0.02]+cmd[3.76,3.68,3.73,4.36,6.79,0.31] seconds)
py313-test-instrumentation-sqlalchemy-1: FAIL code 1 (22.50=setup[0.02]+cmd[3.91,3.75,3.97,4.03,6.82] seconds)
py313-test-instrumentation-system-metrics: FAIL code 1 (27.84=setup[0.02]+cmd[4.25,8.01,5.68,5.47,3.67,0.73] seconds) Note: ONLY tests failed

Note on py313-test-instrumentation-system-metrics: I think this one can be made compatible with Python 3.13, because only the tests fail (the tox environment builds and the tests do run), but I do not yet know enough to fix those two failing tests.

Does This PR Require a Core Repo Change?

Checklist:

See contributing.md for the style guide, changelog guidelines, and more.

  • Followed the style guidelines of this project
  • Changelogs have been updated
  • Unit tests have been added
  • Documentation has been updated

@antonpirker antonpirker requested a review from a team July 22, 2024 08:34
@antonpirker antonpirker marked this pull request as draft July 22, 2024 08:34
@xrmx (Contributor) commented Jul 22, 2024

Do they work locally? I think it's premature to add them to CI, since this depends more on other packages' support and wheels (some C-API symbols were removed) than on anything on our side.

@antonpirker (Contributor, Author) commented

Makes sense.

@xrmx Would it be desirable to have a separate, non-mandatory job that runs the tests only against the latest Python pre-release, so we get a heads-up if we will break on upcoming Python releases?
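A sketch of what such an opt-in job could look like (the job name is made up; `continue-on-error` and setup-python's `allow-prereleases` input are existing features):

```yaml
# Hypothetical extra workflow job: test against the newest pre-release,
# but never gate the PR on the outcome.
jobs:
  prerelease-test:
    runs-on: ubuntu-latest
    continue-on-error: true  # a red run is a heads-up, not a blocker
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.13"
          allow-prereleases: true  # resolves to the newest 3.13 pre-release
      - run: pip install tox
      - run: tox -f py313
```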

@xrmx (Contributor) commented Jul 22, 2024

> Makes sense.
>
> @xrmx Would it be desirable to have a separate, non-mandatory job that runs the tests only against the latest Python pre-release, so we get a heads-up if we will break on upcoming Python releases?

I think CI is already slow enough :)

@sentrivana commented

> I think CI is already slow enough :)

I mean, the tests will have to be added eventually, so we're probably not getting around that. :)

Is there anything in particular that is slow? Is it a concurrency issue, i.e., not enough runners? Would clustering tests somehow help? I can take a look if there's anything I see that could be improved (I'd appreciate any pointers about the current pain points!).

I think getting tests to run against the 3.13 RCs would be great for uncovering issues early. I of course get that that's not always possible, especially where there are dependencies. Could we start by testing some core part of the codebase that isn't blocked, i.e., not the instrumentations?

@xrmx (Contributor) commented Jul 25, 2024

> > I think CI is already slow enough :)
>
> I mean, the tests will have to be added eventually, so we're probably not getting around that. :)
>
> Is there anything in particular that is slow? Is it a concurrency issue, i.e., not enough runners? Would clustering tests somehow help? I can take a look if there's anything I see that could be improved (I'd appreciate any pointers about the current pain points!).

The current pain point, I think, is that the checkout of the core libraries from git is slow. There is a PR switching to uv that should help: #2667

> I think getting tests to run against the 3.13 RCs would be great for uncovering issues early. I of course get that that's not always possible, especially where there are dependencies. Could we start by testing some core part of the codebase that isn't blocked, i.e., not the instrumentations?

I'm not doubting it's useful. It's less useful if our instrumented libraries don't yet work with Python 3.13, though. A few comments ago I asked whether you had run them locally; if everything is fine (modulo needing the same exclusions we have for 3.12 in the workflows), then it's fine to add them. If we need to add temporary workarounds because packages don't have wheels yet, I would prefer to wait at least for a final 3.13.

@xrmx xrmx closed this Jul 25, 2024
@xrmx xrmx reopened this Jul 25, 2024
@xrmx (Contributor) commented Jul 26, 2024

Looks like we are reaching some kind of limit, and some tests won't run? Anyway, there is at least pydantic to bump.

@antonpirker (Contributor, Author) commented

I do not know about the limits, sorry.
I have added the exclusions for boto and kafka-python.

@antonpirker (Contributor, Author) commented

I ran all the test suites using Python 3.13.0b1, and these are the ones that are failing:

py313-test-instrumentation-aiopg: FAIL code 1 (63.75=setup[0.08]+cmd[13.91,12.72,7.29,9.46,20.29] seconds)
py313-test-instrumentation-botocore: FAIL code 1 (112.76=setup[1.06]+cmd[17.77,11.54,8.03,9.44,64.91] seconds)
py313-test-instrumentation-django-1: FAIL code 1 (28.75=setup[0.98]+cmd[10.57,17.20] seconds)
py313-test-instrumentation-falcon-1: FAIL code 2 (71.02=setup[0.26]+cmd[18.80,6.74,7.39,18.26,19.13,0.44] seconds)
py313-test-instrumentation-falcon-2: FAIL code 1 (78.57=setup[0.28]+cmd[20.51,9.25,17.03,10.84,20.66] seconds)
py313-test-instrumentation-fastapi: FAIL code 1 (111.77=setup[0.22]+cmd[11.47,7.51,17.36,15.19,60.01] seconds)
py313-test-instrumentation-fastapi-slim: FAIL code 1 (102.10=setup[0.53]+cmd[13.85,17.04,30.62,9.05,31.01] seconds)
py313-test-instrumentation-flask-0: FAIL code 1 (46.91=setup[0.02]+cmd[25.25,21.65] seconds)
py313-test-instrumentation-urllib3-1: FAIL code 1 (24.29=setup[0.65]+cmd[23.64] seconds)
py313-test-instrumentation-psycopg2: FAIL code 1 (77.35=setup[1.23]+cmd[9.88,5.79,7.86,7.42,45.17] seconds)
py313-test-instrumentation-pyramid: FAIL code 2 (66.54=setup[1.03]+cmd[10.71,6.82,6.29,12.09,28.69,0.91] seconds)
py313-test-instrumentation-asyncpg: FAIL code 1 (52.69=setup[0.03]+cmd[7.12,7.29,12.24,8.79,17.23] seconds)
py313-test-instrumentation-grpc: FAIL code 1 (280.81=setup[1.06]+cmd[9.51,10.82,6.56,19.25,233.61] seconds)
py313-test-instrumentation-sqlalchemy-1: FAIL code 1 (87.16=setup[0.62]+cmd[9.28,12.58,14.33,8.82,41.54] seconds)
py313-test-instrumentation-remoulade: FAIL code 2 (78.88=setup[0.25]+cmd[12.77,16.87,7.68,16.37,23.28,1.66] seconds)
py313-test-instrumentation-celery: FAIL code 1 (121.81=setup[1.14]+cmd[17.80,8.39,15.38,42.73,22.45,13.93] seconds)
py313-test-instrumentation-system-metrics: FAIL code 1 (114.00=setup[0.64]+cmd[21.56,13.99,12.82,44.14,17.84,3.01] seconds)
py313-test-instrumentation-tortoiseorm: FAIL code 1 (149.58=setup[0.91]+cmd[11.70,10.05,33.60,41.12,52.20] seconds)
py313-test-instrumentation-httpx-0: FAIL code 1 (141.74=setup[1.12]+cmd[15.88,25.46,54.34,7.77,35.94,1.24] seconds)
py313-test-instrumentation-confluent-kafka: FAIL code 1 (59.27=setup[1.12]+cmd[11.14,17.07,13.74,9.01,7.19] seconds)
py313-test-instrumentation-cassandra: FAIL code 2 (67.08=setup[1.17]+cmd[11.27,17.76,18.86,4.79,12.56,0.66] seconds)
py313-test-processor-baggage: FAIL code 1 (49.19 seconds)

@antonpirker (Contributor, Author) commented

So a question, @xrmx: will otel only support Python 3.13 once all the instrumentations work with 3.13? (It could take a while for all those libraries to add 3.13 support.)

@antonpirker (Contributor, Author) commented

I know there will be a lot of work needed to make opentelemetry-python-contrib compatible with Python 3.13.
I am also aware that this will probably not happen before Python 3.13 final is out.
But at least we now have something in place to run the test suites against Python 3.13.

@xrmx You can close this PR if it is cluttering the PRs; we can reopen it at a later time.

I guess the way to approach this would be:

  • Make everything in opentelemetry-python compatible with Python 3.13
  • Enable 3.13 tests for all instrumentations in this repo that "just work" out of the box with 3.13
  • Migrate the rest of the instrumentations one by one and make them compatible. (This could take some time; Celery, for example, is not the fastest to adapt to new Python versions.)

@xrmx (Contributor) commented Jul 30, 2024

> So a question, @xrmx: will otel only support Python 3.13 once all the instrumentations work with 3.13? (It could take a while for all those libraries to add 3.13 support.)

I think this is the wrong question to ask :) Also, it's not like I decide here; I'll try to help :)

> I know there will be a lot of work needed to make opentelemetry-python-contrib compatible with Python 3.13. I am also aware that this will probably not happen before Python 3.13 final is out. But at least we now have something in place to run the test suites against Python 3.13.
>
> @xrmx You can close this PR if it is cluttering the PRs; we can reopen it at a later time.
>
> I guess the way to approach this would be:
>
> * Make everything in `opentelemetry-python` compatible with Python 3.13
> * Enable 3.13 tests for all instrumentations in this repo that "just work" out of the box with 3.13
> * Migrate the rest of the instrumentations one by one and make them compatible. (This could take some time; Celery, for example, is not the fastest to adapt to new Python versions.)

The correct question, in my opinion, would be: how can I help get 3.13 supported when it is released? Your list looks fine to me; the actual timeline depends on people doing the work. So the first thing would be to understand why things are failing. For 3.12 we did one big "add support for 3.12" change, but there were a few things that could be fixed separately, and we did so. For -contrib, at least three people opened PRs to add 3.12 support. The list of things to do should probably also include being able to run the 3.13 tests in CI 😅 Maybe #2687 will do the trick.

To elaborate a bit on the possible test failures: for 3.12, the Python interpreter became more picky about things like wrong assert methods, and those can be fixed right now. Another cause could be missing wheels or language compatibility issues; as I said already, I would avoid adding temporary workarounds for these and instead wait for updated packages.
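An example of the first category: Python 3.12 removed the long-deprecated unittest method aliases, so a test like the hypothetical one below fails with an AttributeError until the spelling is updated, and that fix is safe on every supported version:

```python
import unittest


def compute_status():
    # Hypothetical function under test, stands in for real library code.
    return "ok"


class StatusTest(unittest.TestCase):
    def test_status(self):
        # Removed in Python 3.12, raises AttributeError there:
        #     self.assertEquals(compute_status(), "ok")
        # The long-standing spelling works everywhere:
        self.assertEqual(compute_status(), "ok")


if __name__ == "__main__":
    unittest.main()
```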

@antonpirker (Contributor, Author) commented

Updated the branch to use the new workflow generation script.

When running the tests locally with `tox -f py313`, everything is green. It should be the same in CI now.
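(For readers unfamiliar with the flag: `-f` selects tox environments by factor, so one command covers every 3.13 environment.)

```shell
# Run all environments whose name contains the py313 factor:
tox -f py313

# Or run a single environment explicitly:
tox -e py313-test-instrumentation-httpx-0
```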

@xrmx could you trigger the CI run so we can check?

@antonpirker antonpirker marked this pull request as ready for review August 9, 2024 08:37
@antonpirker antonpirker changed the title from "[WIP] Run tests in Python 3.13" to "Run tests in Python 3.13" on Aug 9, 2024
@antonpirker antonpirker changed the title from "Run tests in Python 3.13" to "Add support for Python 3.13" on Aug 9, 2024
@antonpirker (Contributor, Author) commented

All GH checks (except the changelog one) succeeded; I fixed that one in the meantime.

So I guess this is good to review!

@lzchen (Contributor) commented Aug 12, 2024

Shouldn't we support Python 3.13 in the API before we add support for the instrumentations?

@emdneto (Member) left a comment:

I like the idea of testing new releases and fixing errors early, but I'm afraid users will think we support py3.13 with this, and at some point we would need workarounds to keep things working. Maybe we can just move forward with the idea of using this PR as a tracking point to see which instrumentations break, and map out what we can do to help otel-python support Python 3.13 when it gets released?

@xrmx (Contributor) commented Aug 19, 2024

@antonpirker please clean up your base branch so you don't have other people's commits, thanks!

Then it would be nice to open a new PR with the following commits so we can reduce this PR to just the enablement:

@ocelotl ocelotl removed their assignment Sep 3, 2024
@antonpirker (Contributor, Author) commented

@xrmx moved those commits into a new PR: #2887

@antonpirker antonpirker requested a review from a team as a code owner September 25, 2024 10:52
```
@@ -28,6 +28,9 @@ jobs:
      uses: actions/setup-python@v5
      with:
        python-version: "{{ job_data.python_version }}"
{%- if job_data.python_version == "3.13" %}
```
@xrmx (Contributor) commented Oct 3, 2024:

It looks like we can drop this?

@emdneto (Member) replied Oct 10, 2024

```
@@ -39,6 +39,8 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.

### Fixed

- Add Python 3.13 support
```
@emdneto (Member) commented Oct 10, 2024:

Update here too to the correct section.

@codeboten codeboten removed their assignment Oct 18, 2024