
Upstream prematurely closed connection while reading upstream #5706

Closed
codesmaker opened this issue Apr 16, 2017 · 4 comments

Comments

@codesmaker

Expected behaviour

The ownCloud client should successfully sync large files.

Actual behaviour

Cannot sync large files. Syncing stops as soon as a file reaches 1 GB.

Server configuration

Operating system: Debian Jessie 8

Web server: Apache/2.4.10

Database: MySQL 5.5.54

PHP version: PHP 7.0.17-1~dotdeb+8.1

ownCloud version: 9.1.4

Storage backend (external storage):

Client configuration

Client version: 2.3.1

Operating system: Windows 10

OS language: English

Qt version used by client package (Linux only, see also Settings dialog):

Client package (From ownCloud or distro) (Linux only):

Installation path of client:

Logs

  1. Web server error log:
    2017/04/16 11:00:59 [error] 28461#28461: *2072 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 192.168.1.1, server: owncloud.example.com, request: "GET /remote.php/webdav/NewArrivals/file HTTP/1.1", upstream: "http://192.168.1.204:80/remote.php/webdav/NewArrivals/file", host: "owncloud.example.com"

  2. Server logfile: ownCloud log (data/owncloud.log):
    {"reqId":"cYxtvnjVwfi7BJd9ks5q","remoteAddr":"192.168.1.1","app":"PHP","message":"Invalid argument supplied for foreach() at /var/www/owncloud/apps/activity/lib/Formatter/CloudIDFormatter.php#89","level":3,"time":"2017-04-16T12:50:50+02:00","method":"GET","url":"/ocs/v1.php/cloud/activity?page=0&pagesize=100&format=json","user":"user"}
    {"reqId":"WcFGzfKmU0dyKXkVKyIY","remoteAddr":"192.168.1.1","app":"PHP","message":"fseek(): stream does not support seeking at /var/www/owncloud/apps/files_external/3rdparty/icewind/streams/src/Wrapper.php#74","level":3,"time":"2017-04-16T12:51:03+02:00","method":"GET","url":"/remote.php/webdav/NewArrivals/file","user":"user"}

Hi,

I have two ownCloud instances installed in two different countries. Both setups are identical. They are behind a reverse proxy (nginx) that forwards connections to the backend server. The ownCloud installation is almost standard (Apache + MySQL + PHP 7), as mentioned above. When I try to sync a large file (10 GB+), the sync stops with a "connection closed" error. I get the error message quoted above from the nginx server, so it is clear that the connection times out. I tried increasing the timeout values in PHP (max_execution_time) and in the reverse proxy, but nothing helped. This is what I have in the reverse proxy:
location / {
    proxy_connect_timeout 159s;
    proxy_send_timeout 600;
    proxy_read_timeout 600;
    proxy_buffer_size 64k;
    proxy_buffers 16 32k;
    proxy_busy_buffers_size 64k;
    proxy_temp_file_write_size 64k;
    proxy_pass http://192.168.1.204;
}
It's also worth noting that I'm using federated shares. How can I track what exactly is timing out?
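For reference, here is a rough annotation of the timeout directives in that block, based on the nginx documentation rather than on anything specific to this setup; the "timed out while reading response header from upstream" entry in the error log above would correspond to proxy_read_timeout expiring:

# Timeout for establishing the TCP connection to the backend;
# per the nginx docs this usually cannot exceed 75 seconds.
proxy_connect_timeout 159s;

# Timeout between two successive write operations while sending the request
# to the backend, not a limit on transmitting the whole request.
proxy_send_timeout 600;

# Timeout between two successive read operations while reading the backend
# response, including the wait for the response header, not a limit on the
# whole response.
proxy_read_timeout 600;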

@ogoffart
Contributor

Hi,

I don't think this is a client bug; most likely it is a configuration problem on the server side.
Please ask the question on the forum or on the user mailing list: https://owncloud.org/support/
Make sure to also check https://github.com/owncloud/documentation/wiki/Uploading-files-up-to-16GB and https://doc.owncloud.org/server/10.0/admin_manual/configuration_files/big_file_upload_configuration.html
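One general nginx point that is easy to miss when a reverse proxy sits in front (not something confirmed for this particular setup): the proxy enforces its own request body limit, separate from any PHP or Apache limits described on those pages. A minimal sketch, with an illustrative value:

# On the reverse proxy: allow request bodies larger than the nginx default of 1m.
# Setting it to 0 disables the size check entirely.
client_max_body_size 0;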

@codesmaker
Author

Hi Ogoffart,

Indeed, it was a server configuration issue. I just had to set the following on the nginx frontend reverse proxy:
proxy_request_buffering off;
proxy_buffering off;

Thanks for the help and for the links.
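
For anyone landing here later, this is roughly what the location block from the original post looks like with those two directives folded in (same values and upstream address as quoted above; a sketch, not a verified configuration):

location / {
    proxy_connect_timeout 159s;
    proxy_send_timeout 600;
    proxy_read_timeout 600;
    proxy_buffer_size 64k;
    proxy_buffers 16 32k;
    proxy_busy_buffers_size 64k;
    proxy_temp_file_write_size 64k;
    # Stream the request body and the response through nginx instead of buffering them.
    proxy_request_buffering off;
    proxy_buffering off;
    proxy_pass http://192.168.1.204;
}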

@soichih

soichih commented Feb 15, 2018

@codesmaker Thanks. I added "proxy_buffering off;", which fixed a similar issue. Without this option, nginx throws the error message "upstream prematurely closed connection while reading upstream" and the connection gets terminated.

But I don't quite understand why adding this option fixes it. According to the nginx documentation, buffering should stop automatically once the maximum buffer size is reached (which should be quite small by default). Can someone explain what is going on inside nginx?
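
Not an authoritative answer, but the defaults below (taken from the nginx documentation, not from this setup) may be relevant: when proxy_buffering is on and a response does not fit into the in-memory buffers, nginx does not stop buffering; it spills the rest of the response to a temporary file on disk, capped by proxy_max_temp_file_size.

# nginx defaults relevant to response buffering (per the nginx docs):
proxy_buffering          on;       # responses from the upstream are buffered by default
proxy_buffer_size        4k;       # 4k or 8k, depending on the platform's page size
proxy_buffers            8 4k;     # likewise 4k or 8k per buffer
proxy_max_temp_file_size 1024m;    # overflow beyond the buffers is spooled to disk up to this size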

@mpwsh

mpwsh commented Mar 2, 2018

Had a similar problem with my reverse proxy.
This is the configuration that finally did it for me:
location / {
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_set_header Host $http_host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto http;
    proxy_set_header X-Nginx-Proxy true;
    proxy_temp_file_write_size 64k;
    proxy_connect_timeout 10080s;
    proxy_send_timeout 10080;
    proxy_read_timeout 10080;
    proxy_buffer_size 64k;
    proxy_buffers 16 32k;
    proxy_busy_buffers_size 64k;
    proxy_redirect off;
    proxy_request_buffering off;
    proxy_buffering off;
    proxy_pass http://192.168.1.201:9100/;
}
