Threads to Serve HTTP Requests

Java web containers use HttpServlet to serve HTTP requests. There are three ways
to handle requests:

- Create one thread that serves all HTTP requests. This approach obviously does
  not scale: it cannot leverage the OS's concurrency capabilities or multiple
  CPUs.
- Create one thread per request. If there are too many concurrent requests, too
  many threads will be created, and a lot of resources will be wasted on
  context switches.
- Create a pool with a fixed number of threads. This approach degrades
  gracefully under very heavy load.
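The third approach can be sketched with the JDK's own `ExecutorService`. This is not Tomcat's internal code, just a minimal illustration of the fixed-pool idea: when more tasks arrive than there are worker threads, the extra tasks wait in the pool's queue instead of spawning new threads.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class FixedPoolDemo {
    public static void main(String[] args) throws InterruptedException {
        // 4 worker threads, no matter how many "requests" are submitted.
        ExecutorService pool = Executors.newFixedThreadPool(4);

        // Submit 10 simulated requests; only 4 run at a time,
        // the rest queue up inside the executor.
        for (int i = 0; i < 10; i++) {
            final int requestId = i;
            pool.submit(() ->
                System.out.println("request " + requestId
                    + " served by " + Thread.currentThread().getName()));
        }

        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```

The output shows the same small set of worker names (e.g. `pool-1-thread-1` ... `pool-1-thread-4`) reused across all ten requests, which is exactly the reuse a servlet container relies on.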

Almost all non-trivial web containers use the third approach. I wanted to verify
this assumption, so I ran some experiments with Tomcat 6, and the results
confirm it. The toString() of a thread executing an HttpServlet in Tomcat looks
like Thread[http-8080-2,5,main]. 8080 is Tomcat's port number and 2 is the
worker thread's number. Note that 5 is not the size of the thread pool:
Thread.toString() prints the thread's name, priority, and thread group, so 5 is
the thread's priority and main is its thread group.
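The toString() format is easy to check outside Tomcat. The snippet below builds a thread with the same name pattern seen in the experiment (the name itself is just copied from the observation above, not anything Tomcat requires):

```java
public class ThreadNameDemo {
    public static void main(String[] args) {
        // Thread.toString() prints the name, priority, and thread group.
        // A thread created from main inherits main's priority (5 by default).
        Thread t = new Thread(() -> {}, "http-8080-2");
        System.out.println(t);
        System.out.println("priority = " + t.getPriority());
    }
}
```

On JDK 6 through 17 the first line prints `Thread[http-8080-2,5,main]`, matching what Tomcat shows; newer JDKs add a thread ID to the format but keep the same fields.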

It also does not hold that one thread always serves all the requests of a given
session. A thread is often used to serve requests from multiple sessions.
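The lack of thread-session affinity can also be illustrated with a small pool. Here a single worker handles "requests" tagged with made-up session IDs, so one thread necessarily serves several sessions (the session IDs are hypothetical, for illustration only):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class SessionAffinityDemo {
    public static void main(String[] args) throws InterruptedException {
        // One worker thread is enough to show the point.
        ExecutorService pool = Executors.newFixedThreadPool(1);

        // Requests arriving from three different (hypothetical) sessions.
        String[] sessionIds = {"A", "B", "A", "C", "B"};
        for (String sessionId : sessionIds) {
            pool.submit(() ->
                System.out.println("session " + sessionId + " handled by "
                    + Thread.currentThread().getName()));
        }

        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```

Every line of output names the same worker thread, even though the requests belong to sessions A, B, and C. This is why servlet code must not keep per-session state in thread-local or instance fields without care.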
