Bug #21560 (open)
RUBY_MN_THREADS=1 causes large performance regression in Puma 7
Description
Expected
Running a web server with RUBY_MN_THREADS=1 should perform at least as well as running it without the flag.
Actual
- Before (default): 22919.85 requests/sec
- After (RUBY_MN_THREADS=1): 2287.13 requests/sec
Reproduction
Tracking this at https://github.com/puma/puma/issues/3720. I would like to get a smaller reproduction, but until then, here is how we are able to induce the failure:
Prepare puma:
$ git clone https://github.com/puma/puma
$ cd puma
$ git checkout v7.0.0.pre1
$ rake compile
Boot the server to deliver a "hello world" response:
$ bundle exec ruby -Ilib bin/puma -w 1 -t1 --preload test/rackup/hello.ru
Exercise the server with wrk (installable on macOS with brew install wrk):
$ wrk -H 'Host: tfb-server' -H 'Accept: text/plain,text/html;q=0.9,application/xhtml+xml;q=0.9,application/xml;q=0.8,*/*;q=0.7' -H 'Connection: keep-alive' --latency -d 15 -c 16 --timeout 8 -t 12 http://localhost:9292
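For reference, the rackup file used above is a minimal Rack app. A sketch of roughly what test/rackup/hello.ru contains (the actual file in the Puma repo may differ):

```ruby
# A Rack application is any object responding to #call that returns
# a [status, headers, body] triple. A minimal "hello world" app:
app = lambda do |env|
  [200, { "content-type" => "text/plain" }, ["Hello World"]]
end

status, headers, body = app.call({})
puts status     # 200
puts body.join  # Hello World
```

In an actual .ru file this would end with `run app` so Puma can pick it up; the point is that the handler does almost no work, so the benchmark measures server and scheduler overhead rather than application time.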
Results
Before:
$ bundle exec ruby -Ilib bin/puma -w 1 -t1 --preload test/rackup/hello.ru
$ wrk -H 'Host: tfb-server' -H 'Accept: text/plain,text/html;q=0.9,application/xhtml+xml;q=0.9,application/xml;q=0.8,*/*;q=0.7' -H 'Connection: keep-alive' --latency -d 15 -c 16 --timeout 8 -t 12 http://localhost:9292
Running 15s test @ http://localhost:9292
  12 threads and 16 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   786.74us    1.45ms  45.57ms   90.62%
    Req/Sec     1.92k   188.55     4.20k    83.22%
  Latency Distribution
     50%  386.00us
     75%  523.00us
     90%    2.11ms
     99%    4.63ms
  344638 requests in 15.04s, 25.23MB read
Requests/sec:  22919.85
Transfer/sec:      1.68MB
After:
$ env RUBY_MN_THREADS=1 bundle exec ruby -Ilib bin/puma -w 1 -t1 --preload test/rackup/hello.ru
$ wrk -H 'Host: tfb-server' -H 'Accept: text/plain,text/html;q=0.9,application/xhtml+xml;q=0.9,application/xml;q=0.8,*/*;q=0.7' -H 'Connection: keep-alive' --latency -d 15 -c 16 --timeout 8 -t 12 http://localhost:9292
Running 15s test @ http://localhost:9292
  12 threads and 16 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    28.89ms   40.82ms 129.37ms   78.94%
    Req/Sec   207.34     64.09   252.00     70.70%
  Latency Distribution
     50%  212.00us
     75%   56.77ms
     90%   99.48ms
     99%  127.01ms
  34365 requests in 15.03s, 2.52MB read
Requests/sec:   2287.13
Transfer/sec:    171.45KB
(Note the roughly 10x drop in requests per second.)
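The two runs above amount to roughly a 10x throughput regression, which a quick calculation from the reported numbers confirms:

```ruby
# Throughput figures from the two wrk runs above.
before = 22919.85  # requests/sec, default run
after  = 2287.13   # requests/sec, RUBY_MN_THREADS=1

factor = before / after
puts format("%.1fx slower with RUBY_MN_THREADS=1", factor)
# => 10.0x slower with RUBY_MN_THREADS=1
```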