#1 2013-12-29 19:34:26

mash
Member
Registered: 2012-12-02
Posts: 15

TCP Connection Problems

Hi everybody!
I've been facing a strange network issue lately that is driving me nuts. It looks like I can easily saturate my computer's network link (100 MBit/s down / 2 MBit/s up) just by using a browser.
At first it looked like a DNS issue, but after enabling the nscd daemon that was ruled out; now the connection itself gets stuck (see the timing sketch right after this paragraph). I can provoke the problem with the script further down, which opens a number of connections to hosts on the web.
The same thing works fine against a local web service on the same machine, with the same number of concurrent connections.
20 processes are not a problem; it starts getting ugly at 50 and more.
Other machines on the same network are unaffected; their web performance is not degraded while the script is running.
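
To tell the DNS lookup and the connect step apart, a minimal sketch like this times name resolution and the TCP connect separately (standard library only, IPv4 only; the host is just an example):

import socket
import time

def time_lookup_and_connect(host, port=80):
    t0 = time.time()
    # DNS step only: resolve the name to an IPv4 address.
    addr = socket.getaddrinfo(host, port, socket.AF_INET, socket.SOCK_STREAM)[0][4]
    t1 = time.time()
    # TCP step only: connect to the already-resolved address.
    s = socket.create_connection(addr, timeout=10)
    t2 = time.time()
    s.close()
    print('%s: DNS %.3fs, connect %.3fs' % (host, t1 - t0, t2 - t1))

time_lookup_and_connect('heise.de')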

I tried setting the following kernel parameters, but with no luck:

net.ipv4.ip_local_port_range = 15000 61000
net.ipv4.tcp_fin_timeout = 30
net.ipv4.tcp_tw_recycle = 1
net.ipv4.tcp_tw_reuse = 1
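
Since tcp_tw_recycle/tcp_tw_reuse are about TIME_WAIT, here is a quick sketch that counts socket states straight out of /proc/net/tcp (Linux only, and only the IPv4 table; the state codes come from the kernel's TCP state enum):

from collections import Counter

# Hex state codes from the kernel's include/net/tcp_states.h.
TCP_STATES = {'01': 'ESTABLISHED', '02': 'SYN_SENT', '03': 'SYN_RECV',
              '04': 'FIN_WAIT1', '05': 'FIN_WAIT2', '06': 'TIME_WAIT',
              '07': 'CLOSE', '08': 'CLOSE_WAIT', '09': 'LAST_ACK',
              '0A': 'LISTEN', '0B': 'CLOSING'}

counts = Counter()
with open('/proc/net/tcp') as f:
    next(f)  # skip the header line
    for line in f:
        state_code = line.split()[3]  # 4th column is the socket state
        counts[TCP_STATES.get(state_code, state_code)] += 1

for state, n in counts.most_common():
    print('%s: %d' % (state, n))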

Any ideas?

import urllib.request as request
from multiprocessing import Process
from random import choice

NUMBER_OF_PROCESSES = 50
TIMES = 1

URL_LIST = ['heise.de', 'spiegel.de', 'google.de', 'google.com', 'arstechnica.com',
            'golem.de', 'theverge.com', 'faz.de', 'bmw.com', 'python.org', 'zdf.de',
            'heute.de', 'kabelbw.de', 'amazon.de', 'zalando.de']

URL_PREFIX = 'http://'


def test_function(url):
    # Open the URL and read the first 100 bytes of the response.
    print('Opening... %s' % url)
    f = request.urlopen(url)
    result = f.read(100)
    f.close()
    print('Done... %s' % url)


def spawn():
    # Start NUMBER_OF_PROCESSES fetches against randomly chosen hosts.
    processes = []
    for i in range(NUMBER_OF_PROCESSES):
        processes.append(Process(target=test_function,
                                 args=(URL_PREFIX + choice(URL_LIST),)))
    for process in processes:
        process.start()
    # Reap the children so they don't linger as zombies between rounds.
    for process in processes:
        process.join()


if __name__ == '__main__':
    for i in range(TIMES):
        spawn()
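
If it turns out to be plain connection-count pressure, a bounded variant like this (a sketch reusing test_function, URL_PREFIX and URL_LIST from the script above; the worker counts are just examples) would let me dial the concurrency up step by step:

from multiprocessing import Pool
from random import choice

def run_with_pool(workers, total_requests):
    # At most `workers` fetches run at once, so concurrency stays bounded.
    urls = [URL_PREFIX + choice(URL_LIST) for _ in range(total_requests)]
    with Pool(processes=workers) as pool:
        pool.map(test_function, urls)

if __name__ == '__main__':
    for workers in (20, 35, 50, 65):
        print('--- %d concurrent workers ---' % workers)
        run_with_pool(workers, 2 * workers)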
