As Geert says, Jetty's continuations are fairly minimalist.
Because Java NIO doesn't need to allocate a thread for each socket, you can take advantage of that to avoid needing to have a thread allocated to each HTTP request as it comes in.
There are two main use cases for this.
#1: AJAX-style processing, where the client is polling the server for updates. There are three design strategies you can employ here:
a) Block at the server. The request hits the servlet (or some other HTTP artefact) and the servlet blocks until there is an update to return. That takes relatively little CPU, but you need a thread for each HTTP client that is currently connected. Given the relatively low number of threads that most JVMs can handle, your server's scalability tends to be constrained not by CPU or memory, but by thread count.
b) Make frequent HTTP requests. The client requests an update, the server returns "no change", and the client requests again a second or so later. You end up with a lot of network traffic, and you still have quite a number of concurrent HTTP requests (and, therefore, threads), though not as many as in (a).
c) Decouple the HTTP connection from the thread. If you can maintain the HTTP state without having a thread permanently allocated to it, you keep network traffic to a minimum, and you can also scale out to accept more connections than you have threads. Jetty uses a continuation mechanism to support this model.
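Jetty's actual Continuation API lives inside the server, so the following is just a plain-Java sketch of the idea behind (c): when there is nothing to return yet, the handler parks a small "how to finish this response" record instead of parking a whole thread. The class and method names here (`ParkedRequests`, `park`, `publishUpdate`) are invented for illustration and are not Jetty API.

```java
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

// Illustrative sketch: decouple waiting HTTP requests from threads.
public class ParkedRequests {
    // Requests waiting for an update are parked here; no thread is
    // blocked on their behalf while they wait.
    private final Queue<Runnable> parked = new ConcurrentLinkedQueue<>();

    // Called by a handler thread: instead of blocking, record how to
    // finish the response later, then return the thread to the pool.
    public void park(Runnable resumeAction) {
        parked.add(resumeAction);
    }

    // Called when an update arrives: complete every parked request.
    // Returns the number of requests resumed.
    public int publishUpdate() {
        int resumed = 0;
        Runnable r;
        while ((r = parked.poll()) != null) {
            r.run();
            resumed++;
        }
        return resumed;
    }

    public static void main(String[] args) {
        ParkedRequests server = new ParkedRequests();
        // Three "clients" wait for an update; no threads are held.
        for (int i = 0; i < 3; i++) {
            final int id = i;
            server.park(() -> System.out.println("resumed client " + id));
        }
        System.out.println(server.publishUpdate() + " requests resumed");
    }
}
```

The key property is that the number of waiting clients is bounded by memory (one small queue entry each), not by the thread pool size.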
The other case for separating HTTP connections from threads is when another resource is involved. In reality this is just a more general form of #1: you want to minimise the extent to which a deficiency in one resource can cause a ripple effect in another.
For example, if your HTTP request needs a database connection and the connection pool is empty, the request will often block for some period of time. The problem is that the blocking request will (normally) consume a thread while it is blocking. So now you have two problems: a database pool with high contention, and a thread pool being locked up with waiting threads.
If you can free up threads when the DB pool blocks, you can keep that problem from expanding to become a bigger problem.
Jetty implements this using exceptions. When a handler suspends its continuation, an exception is thrown out of the handler; the server catches it, re-queues the HTTP request, and returns the thread to the thread pool. After a defined period of time (or when the continuation is resumed), the HTTP request will be re-issued to the handler.
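That control flow can be sketched in plain Java. `RetrySignal` and `RetryDispatcher` are invented names; Jetty's real exception type and scheduling are internal to the server. The point is that throwing unwinds the handler's stack, so its thread can immediately service other requests while the parked request waits to be re-run.

```java
import java.util.ArrayDeque;
import java.util.Queue;
import java.util.function.Consumer;

// Illustrative sketch of the retry-via-exception control flow.
public class RetryDispatcher {
    // Thrown by a handler to mean "park me and call me again later".
    static class RetrySignal extends RuntimeException {}

    private final Queue<String> queue = new ArrayDeque<>();

    public void submit(String request) {
        queue.add(request);
    }

    // Drain the queue for up to maxPasses rounds. A handler that throws
    // RetrySignal is re-queued instead of completing, and the thread
    // moves straight on to the next request. Returns requests completed.
    public int drain(Consumer<String> handler, int maxPasses) {
        int handled = 0;
        for (int pass = 0; pass < maxPasses && !queue.isEmpty(); pass++) {
            int n = queue.size();
            for (int i = 0; i < n; i++) {
                String req = queue.poll();
                try {
                    handler.accept(req);
                    handled++;
                } catch (RetrySignal e) {
                    queue.add(req);   // re-queue; the thread is not blocked
                }
            }
        }
        return handled;
    }
}
```

One consequence of this design worth noting: because the request is re-issued from the top, any work the handler does before the suspend point runs again on retry, so that work needs to be cheap and idempotent.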
Greg and I have each written blog entries on it at:
http://blogs.webtide.com/gregw/2006/10/18/1161127500000.html
http://blogs.webtide.com/gregw/2006/12/07/1165517549286.html