
Commit 21c6c9c

Add Python 3.13, drop Python 3.8, update tool versions (#211)

1 parent 147d24b

28 files changed: +51 -95 lines

.github/workflows/publish.yml (+1 -1)

@@ -17,7 +17,7 @@ jobs:
       - name: Set up Python
         uses: actions/setup-python@v5
         with:
-          python-version: '3.12'
+          python-version: '3.13'
       - name: Install dependencies
         run: |
           python -m pip install --upgrade pip

.github/workflows/test.yml (+7 -7)

@@ -16,20 +16,20 @@ jobs:
       fail-fast: false
       matrix:
         include:
-          - python-version: "3.8"
+          - python-version: "3.9"
             toxenv: "min"
-          - python-version: "3.8"
+          - python-version: "3.9"
             toxenv: "pinned-scrapy-2x7"
-          - python-version: "3.8"
+          - python-version: "3.9"
             toxenv: "pinned-scrapy-2x8"
-          - python-version: "3.8"
+          - python-version: "3.9"
             toxenv: "asyncio-min"
-          - python-version: "3.8"
           - python-version: "3.9"
           - python-version: "3.10"
           - python-version: "3.11"
           - python-version: "3.12"
-          - python-version: "3.12"
+          - python-version: "3.13"
+          - python-version: "3.13"
             toxenv: "asyncio"
 
     steps:
@@ -54,7 +54,7 @@ jobs:
     strategy:
       fail-fast: false
       matrix:
-        python-version: ['3.12']
+        python-version: ['3.12']  # Keep in sync with .readthedocs.yml
         tox-job: ["mypy", "docs", "linters", "twinecheck"]
 
     steps:

.pre-commit-config.yaml (+3 -3)

@@ -3,12 +3,12 @@ repos:
   - id: black
     language_version: python3
   repo: https://github.com/ambv/black
-  rev: 22.12.0
+  rev: 24.10.0
 - hooks:
   - id: isort
     language_version: python3
   repo: https://github.com/PyCQA/isort
-  rev: 5.11.5
+  rev: 5.13.2
 - hooks:
   - id: flake8
     language_version: python3
@@ -19,4 +19,4 @@ repos:
     - flake8-docstrings
     - flake8-string-format
   repo: https://github.com/pycqa/flake8
-  rev: 6.1.0
+  rev: 7.1.1

.readthedocs.yml (+1 -1)

@@ -6,7 +6,7 @@ sphinx:
 build:
   os: ubuntu-22.04
   tools:
-    python: "3.12"  # Keep in sync with .github/workflows/tests.yml
+    python: "3.12"  # Keep in sync with .github/workflows/test.yml
 
 python:
   install:

MANIFEST.in (+1 -1)

@@ -1,4 +1,4 @@
-include CHANGES.rst
+include CHANGELOG.rst
 include LICENSE
 include README.rst

README.rst (+1 -1)

@@ -48,7 +48,7 @@ Installation
     pip install scrapy-poet
 
-Requires **Python 3.8+** and **Scrapy >= 2.6.0**.
+Requires **Python 3.9+** and **Scrapy >= 2.6.0**.
 
 Usage in a Scrapy Project
 =========================

docs/intro/install.rst (+1 -1)

@@ -7,7 +7,7 @@ Installation
 Installing scrapy-poet
 ======================
 
-``scrapy-poet`` is a Scrapy extension that runs on Python 3.8 and above.
+``scrapy-poet`` is a Scrapy extension that runs on Python 3.9 and above.
 
 If you’re already familiar with installation of Python packages, you can install
 ``scrapy-poet`` and its dependencies from PyPI with:

docs/providers.rst (-2)

@@ -327,8 +327,6 @@ you could implement those limits in the library itself.
 Attaching metadata to dependencies
 ==================================
 
-.. note:: This feature requires Python 3.9+.
-
 Providers can support dependencies with arbitrary metadata attached and use
 that metadata when creating them. Attaching the metadata is done by wrapping
 the dependency class in :data:`typing.Annotated`:
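The note is now redundant because `typing.Annotated` exists on every supported Python version. The wrapping pattern this section describes can be sketched with plain stdlib typing (a minimal illustration; `BookPage` and the metadata string are hypothetical, not taken from this commit):

```python
from typing import Annotated, get_args, get_origin


class BookPage:
    """Hypothetical page-object dependency, standing in for a web-poet class."""


# Wrap the dependency class in Annotated to attach arbitrary metadata.
dep = Annotated[BookPage, "browser-rendered"]

# A provider can unwrap both the class and its attached metadata:
assert get_origin(dep) is Annotated
cls, *metadata = get_args(dep)
print(cls.__name__, metadata)  # BookPage ['browser-rendered']
```

The wrapped alias still refers to the same class, so callbacks annotated this way receive an ordinary `BookPage` instance; only providers see the extra metadata.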

example/example/autoextract.py (+1)

@@ -2,6 +2,7 @@
 Example of how to create a PageObject with a very different input data,
 which even requires an API request.
 """
+
 from typing import Any, Dict
 
 import attr

example/example/spiders/books_01.py (+1)

@@ -1,6 +1,7 @@
 """
 Baseline: regular Scrapy spider, sweet & easy.
 """
+
 import scrapy
 
 

example/example/spiders/books_02.py (+1)

@@ -2,6 +2,7 @@
 Scrapy spider which uses Page Objects to make extraction code more reusable.
 BookPage is now independent of Scrapy.
 """
+
 import scrapy
 from web_poet import WebPage

example/example/spiders/books_02_1.py (+1)

@@ -3,6 +3,7 @@
 BookPage is now independent of Scrapy. callback_for is used to reduce
 boilerplate.
 """
+
 import scrapy
 from web_poet import WebPage

example/example/spiders/books_02_2.py (+1)

@@ -10,6 +10,7 @@
 has problems now, it is used in the latter examples, because as an API
 it is better than defining callback explicitly.
 """
+
 import scrapy
 from web_poet import WebPage

example/example/spiders/books_02_3.py (+1)

@@ -7,6 +7,7 @@
 Page object is used instead of callback below. It doesn't work now,
 but it can be implemented, with Scrapy support.
 """
+
 import scrapy
 from web_poet import WebPage

example/example/spiders/books_03.py (+1)

@@ -1,6 +1,7 @@
 """
 Scrapy spider which uses AutoExtract API, to extract books as products.
 """
+
 import scrapy
 from example.autoextract import ProductPage

example/example/spiders/books_04.py (+1)

@@ -1,6 +1,7 @@
 """
 Scrapy spider which uses Page Objects both for crawling and extraction.
 """
+
 import scrapy
 from web_poet import WebPage

example/example/spiders/books_04_overrides_01.py (+1)

@@ -5,6 +5,7 @@
 
 The default configured PO logic contains the logic for books.toscrape.com
 """
+
 import scrapy
 from web_poet import ApplyRule, WebPage

example/example/spiders/books_04_overrides_02.py (+1)

@@ -6,6 +6,7 @@
 No configured default logic: if used for an unregistered domain, no logic
 at all is applied.
 """
+
 import scrapy
 from web_poet import WebPage
 from web_poet.rules import ApplyRule

example/example/spiders/books_04_overrides_03.py (+1)

@@ -10,6 +10,7 @@
 difference is that this example is using the ``@handle_urls`` decorator to
 store the rules in web-poet's registry.
 """
+
 import scrapy
 from web_poet import WebPage, default_registry, handle_urls

example/example/spiders/books_05.py (+1)

@@ -2,6 +2,7 @@
 Scrapy spider which uses Page Objects both for crawling and extraction.
 You can mix various page types freely.
 """
+
 import scrapy
 from example.autoextract import ProductPage
 from web_poet import WebPage

scrapy_poet/_request_fingerprinter.py (+1 -5)

@@ -10,7 +10,7 @@
 import json
 from functools import cached_property
 from logging import getLogger
-from typing import Callable, Dict, List, Optional, get_args, get_origin
+from typing import Annotated, Callable, Dict, List, Optional, get_args, get_origin
 from weakref import WeakKeyDictionary
 
 from andi import CustomBuilder
@@ -37,10 +37,6 @@
 def _serialize_dep(cls):
     if isinstance(cls, CustomBuilder):
         cls = cls.result_class_or_fn
-    try:
-        from typing import Annotated
-    except ImportError:
-        pass
     else:
         if get_origin(cls) is Annotated:
             annotated, *annotations = get_args(cls)
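With Python 3.8 gone, `Annotated` can be imported unconditionally at module level, which is why the try/except guard disappears. A rough sketch of the unwrapping idiom this hunk keeps (simplified; `serialize_dep` and `HtmlResponse` here are illustrative stand-ins, not the real `_serialize_dep` implementation):

```python
from typing import Annotated, get_args, get_origin


class HtmlResponse:
    """Hypothetical dependency class, used only for this sketch."""


def serialize_dep(cls) -> str:
    # Annotated is always importable on Python 3.9+, so no ImportError guard.
    if get_origin(cls) is Annotated:
        # Split the wrapped class from its attached metadata.
        annotated, *annotations = get_args(cls)
        return f"{annotated.__name__}{annotations!r}"
    return cls.__name__


print(serialize_dep(HtmlResponse))                     # HtmlResponse
print(serialize_dep(Annotated[HtmlResponse, "meta"]))  # HtmlResponse['meta']
```

Including the annotations in the serialized name keeps differently-annotated dependencies from colliding in the request fingerprint.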

scrapy_poet/downloadermiddlewares.py (+1)

@@ -2,6 +2,7 @@
 responsible for injecting Page Input dependencies before the request callbacks
 are executed.
 """
+
 import inspect
 import logging
 import warnings

scrapy_poet/page_input_providers.py (+1)

@@ -8,6 +8,7 @@
 different providers in order to acquire data from multiple external sources,
 for example, from scrapy-playwright or from an API for automatic extraction.
 """
+
 from typing import Any, Callable, ClassVar, FrozenSet, List, Set, Union
 from warnings import warn

setup.py (+2 -2)

@@ -19,7 +19,7 @@
         "scrapy.commands": ["savefixture = scrapy_poet.commands:SaveFixtureCommand"]
     },
     package_data={"scrapy_poet": ["VERSION"]},
-    python_requires=">=3.8",
+    python_requires=">=3.9",
     install_requires=[
         "andi >= 0.6.0",
         "attrs >= 21.3.0",
@@ -39,10 +39,10 @@
         "Operating System :: OS Independent",
         "Framework :: Scrapy",
         "Programming Language :: Python :: 3",
-        "Programming Language :: Python :: 3.8",
         "Programming Language :: Python :: 3.9",
         "Programming Language :: Python :: 3.10",
         "Programming Language :: Python :: 3.11",
         "Programming Language :: Python :: 3.12",
+        "Programming Language :: Python :: 3.13",
     ],
 )

tests/test_commands.py (-4)

@@ -6,7 +6,6 @@
 import tempfile
 from pathlib import Path
 
-import pytest
 from twisted.web.resource import Resource
 from web_poet.testing import Fixture
 
@@ -246,9 +245,6 @@ class CustomItemAdapter(ItemAdapter):
     result.assert_outcomes(passed=3)
 
 
-@pytest.mark.skipif(
-    sys.version_info < (3, 9), reason="No Annotated support in Python < 3.9"
-)
 def test_savefixture_annotated(pytester) -> None:
     project_name = "foo"
     cwd = Path(pytester.path)
