C++ / C#
Mikhail, 2019-10-02 23:27:40

Why does the data transfer speed from the server drop during streaming?

There is an ASP.NET application running on a VPS (channel width 200 Mb/s). It works as follows: data is streamed from a client to the server and immediately relayed to the browser (about 1.5 Mb/s). When I run it on IIS Express, everything works as expected, but when it is deployed to the remote server the sending speed drops, while the receiving speed stays at 1.5 Mb/s.
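
For context, a relay like the one described could look roughly like the following sketch, assuming ASP.NET Core SignalR; the hub, method, and group names (RelayHub, UploadChunk, ReceiveChunk, "viewers") are hypothetical and not taken from the question:

```csharp
// Minimal sketch of the client -> server -> browser relay (assumption: ASP.NET Core SignalR).
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

public class RelayHub : Hub
{
    // The uploading client calls this with chunks of data; each chunk is
    // immediately forwarded to every browser subscribed to the "viewers" group.
    public async Task UploadChunk(byte[] chunk)
    {
        await Clients.Group("viewers").SendAsync("ReceiveChunk", chunk);
    }

    // Browsers call this once to start receiving the relayed stream.
    public Task Subscribe()
    {
        return Groups.AddToGroupAsync(Context.ConnectionId, "viewers");
    }
}
```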

The only explanation left is that the problem is in the VPS itself, yet as soon as I stop sending data, the sending speed rises to the desired 1.5 Mb/s. I have been struggling with this problem for over a month now. I changed the hosting provider, but it did not help. What is going on?

I tried different VPS configurations, up to 5 cores, 10 GB of RAM, and a 1 Gb/s channel. Even on the most powerful configuration the problem is the same.

SignalR is used for data transfer (with the WebSocket transport enforced). The site is hosted on IIS.
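
For reference, "WebSocket enforcement" on the client side might look like the sketch below, assuming the ASP.NET Core SignalR .NET client; the hub URL is hypothetical:

```csharp
// Sketch: restrict the SignalR client to the WebSocket transport only
// (assumption: ASP.NET Core SignalR .NET client; the URL is a placeholder).
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http.Connections;
using Microsoft.AspNetCore.SignalR.Client;

public static class WebSocketOnlyClient
{
    public static async Task<HubConnection> ConnectAsync()
    {
        var connection = new HubConnectionBuilder()
            .WithUrl("https://example.com/relay", options =>
            {
                // Allow only WebSockets; the negotiate step can be skipped
                // when WebSockets is the sole transport.
                options.Transports = HttpTransportType.WebSockets;
                options.SkipNegotiation = true;
            })
            .Build();

        await connection.StartAsync();
        return connection;
    }
}
```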


1 answer(s)
Mikhail, 2019-10-07
@mikhail_matyasov

After some searching, I came to the conclusion that something other than WebSockets should be used, because WebSockets run over the TCP protocol, which apparently causes the delay on a remote server. Since there is zero packet loss when sending data to the local IIS, it is clear why everything works for me locally.
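
As a hedged illustration of "using something instead of web sockets", the ASP.NET Core SignalR client can be limited to its other built-in transports (Server-Sent Events and Long Polling); the hub URL below is a placeholder:

```csharp
// Sketch: exclude WebSockets and let SignalR negotiate SSE or long polling
// (assumption: ASP.NET Core SignalR .NET client; the URL is hypothetical).
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http.Connections;
using Microsoft.AspNetCore.SignalR.Client;

public static class NonWebSocketClient
{
    public static async Task<HubConnection> ConnectAsync()
    {
        var connection = new HubConnectionBuilder()
            .WithUrl("https://example.com/relay", options =>
            {
                // WebSockets is omitted; one of these transports is used instead.
                options.Transports = HttpTransportType.ServerSentEvents
                                   | HttpTransportType.LongPolling;
            })
            .Build();

        await connection.StartAsync();
        return connection;
    }
}
```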
