This repository has been archived by the owner on Jul 7, 2021. It is now read-only.

Support streaming uploads #30

Open
mrcnski opened this issue Aug 3, 2020 · 2 comments

@mrcnski
Contributor

mrcnski commented Aug 3, 2020

We should support uploading generic data using streams.

We will need to use the requests-toolbelt library (https://toolbelt.readthedocs.io/) for this, as requests does not support it.

@xloem
Contributor

xloem commented Oct 2, 2020

For this, I use upload_file_request_with_chunks with a function that yields bytes. The downside is that if the upload is interrupted on the network, the server doesn't seem to notice, and I sometimes get skylinks back for truncated content.

EDIT: it doesn't look like this works any more in the new version.

@xloem
Contributor

xloem commented Oct 2, 2020

Requests does support this: https://requests.readthedocs.io/en/master/user/advanced/#streaming-uploads
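A minimal sketch of that approach (the upload URL and file name below are placeholders): per the requests docs linked above, passing a generator as `data` streams the body with chunked transfer encoding, since the total length is unknown up front.

```python
def iter_file(path, chunk_size=64 * 1024):
    """Yield a file's contents in fixed-size chunks for a streaming upload."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                return
            yield chunk

# Passing a generator as `data` makes requests send a chunked request body:
#   requests.post("https://example.com/upload", data=iter_file("big.bin"))
```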

EDIT: I also checked the requests source for streaming with a known length. The request handler in models.py calls a super_len() helper in utils.py; this very flexible helper tries everything it can on the data passed and uses whatever works. Notably, it tries calling .__len__() and also reading the .len attribute, so in theory defining either of these should work fine.
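For example (a hypothetical wrapper, not part of any library), an iterable body that exposes a `.len` attribute should let super_len() discover the total size, so requests can set Content-Length instead of falling back to chunked encoding:

```python
class KnownLengthStream:
    """Iterable of byte chunks that advertises its total size via `.len`,
    one of the attributes the super_len() helper probes for."""

    def __init__(self, chunks):
        self._chunks = list(chunks)
        self.len = sum(len(c) for c in self._chunks)

    def __iter__(self):
        return iter(self._chunks)

# body = KnownLengthStream([b"part one, ", b"part two"])
# requests.post("https://example.com/upload", data=body)  # placeholder URL
```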

EDIT: Just for completeness, a file-like object can be passed just like a chunked iterator. It worked for me to implement streaming progress by inheriting from io.FileIO or io.BufferedReader and overloading read.
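A sketch of that progress technique, subclassing io.BufferedReader and overloading read (the callback wiring and file names are assumptions for illustration):

```python
import io


class ProgressReader(io.BufferedReader):
    """File-like reader that reports cumulative bytes read to a callback."""

    def __init__(self, raw, callback):
        super().__init__(raw)
        self._callback = callback
        self.bytes_read = 0

    def read(self, size=-1):
        chunk = super().read(size)
        if chunk:
            self.bytes_read += len(chunk)
            self._callback(self.bytes_read)
        return chunk

# progress = []
# with open("big.bin", "rb", buffering=0) as f:  # buffering=0 gives raw FileIO
#     requests.post("https://example.com/upload",
#                   data=ProgressReader(f, progress.append))
```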
