
Upload large file using multiple connections/threads to an SFTP server with Python Paramiko

I am trying to SFTP a file to a remote server in chunks using threads and the python paramiko library.

It opens a local file and uploads chunks to the remote server over SFTP in separate threads.

I am basically following this solution, which uses the same approach to download a large file over SFTP: Downloading solution. I would like to send large files instead.

However, in write_chunks(), on the line for chunk in infile.readv(chunks):, I am getting this error:

AttributeError: '_io.BufferedReader' object has no attribute 'readv'

Could anybody assist with this error, please? I thought that infile was a file descriptor; I don't understand why it is an _io.BufferedReader object.
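For what it's worth, readv() is a method of Paramiko's SFTPFile (the remote file object used in the download solution), not of Python's built-in file objects, which is why a locally opened file (an io.BufferedReader) lacks it. For a local file, a seek()/read() loop gives the same effect; a minimal sketch, assuming chunks is a list of (offset, length) pairs as in the download code:

```python
import io

# readv() belongs to paramiko.SFTPFile, not to local file objects.
# For a local file (io.BufferedReader), emulate it with seek()/read().
def readv_local(infile, chunks):
    """Yield the data for each (offset, length) pair, like SFTPFile.readv()."""
    for offset, length in chunks:
        infile.seek(offset)
        yield infile.read(length)

# Demonstration with an in-memory file standing in for open(path, "rb")
f = io.BufferedReader(io.BytesIO(b"0123456789abcdef"))
print(hasattr(f, "readv"))                     # False -> the AttributeError
print(list(readv_local(f, [(0, 4), (8, 4)])))  # [b'0123', b'89ab']
```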




Answer

The following example shows how to do a parallel multi-part upload of one large file.

Note that most SFTP servers (including OpenSSH) do not support merging files remotely, so you have to resort to a shell command for that.
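A minimal sketch of this approach (host, credentials, and the .partN naming are illustrative): each thread uploads one slice of the local file to its own temporary remote file, and a final shell command joins the pieces.

```python
import os
import threading

def split_offsets(file_size, num_parts):
    """Return an (offset, length) pair per part; the last part takes the remainder."""
    step = file_size // num_parts
    return [(i * step, step if i < num_parts - 1 else file_size - i * step)
            for i in range(num_parts)]

def connect(host, port, user, password):
    import paramiko  # imported lazily so the offset math above is usable without it
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(host, port, user, password)
    return ssh

def upload_part(creds, local_path, remote_part, offset, length):
    # One SSH connection per thread: channels of a single transport share
    # one socket, so separate connections parallelize the transfer better.
    ssh = connect(*creds)
    sftp = ssh.open_sftp()
    with open(local_path, "rb") as src, sftp.open(remote_part, "wb") as dst:
        src.seek(offset)
        dst.write(src.read(length))
    ssh.close()

def upload_in_parts(creds, local_path, remote_path, num_parts=4):
    size = os.path.getsize(local_path)
    threads = [threading.Thread(target=upload_part,
                                args=(creds, local_path,
                                      f"{remote_path}.part{i}", offset, length))
               for i, (offset, length) in enumerate(split_offsets(size, num_parts))]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # SFTP itself cannot concatenate remote files, so run a shell command.
    ssh = connect(*creds)
    parts = " ".join(f"{remote_path}.part{i}" for i in range(num_parts))
    ssh.exec_command(f"cat {parts} > {remote_path} && rm {parts}")
    ssh.close()
```

Reading each slice in one read() call keeps the sketch short; for very large parts you would loop in smaller blocks instead.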


I'm not sure to what extent this is backed by the SFTP specification, but many SFTP servers, including OpenSSH, allow writing to the same file from multiple connections in parallel. So you can do without merging the files at all, by uploading each part directly to its respective offset in the target file:
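Under that assumption, a sketch of uploading directly into the target file (host, credentials, and names are illustrative): the file is created once, then every connection opens it without truncation and writes only its own range.

```python
import os
import threading

def part_ranges(file_size, num_parts):
    """(offset, length) per worker; the last worker absorbs the remainder."""
    step = file_size // num_parts
    return [(i * step, step if i < num_parts - 1 else file_size - i * step)
            for i in range(num_parts)]

def _connect(host, port, user, password):
    import paramiko  # imported lazily: only needed for the transfer itself
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(host, port, user, password)
    return ssh

def write_piece(creds, local_path, remote_path, offset, length):
    ssh = _connect(*creds)
    sftp = ssh.open_sftp()
    # Open without truncation ("r+") and seek to this piece's offset,
    # so all connections write into the same target file.
    with open(local_path, "rb") as src, sftp.open(remote_path, "r+") as dst:
        src.seek(offset)
        dst.seek(offset)
        dst.write(src.read(length))
    ssh.close()

def parallel_upload(creds, local_path, remote_path, num_parts=4):
    size = os.path.getsize(local_path)
    # Create (or truncate) the target once so the "r+" opens succeed.
    ssh = _connect(*creds)
    ssh.open_sftp().open(remote_path, "w").close()
    ssh.close()
    threads = [threading.Thread(target=write_piece,
                                args=(creds, local_path, remote_path, off, ln))
               for off, ln in part_ranges(size, num_parts)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```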

User contributions licensed under: CC BY-SA