I was trying my hand at Akka (Java API). What I tried was to compare Akka's actor-based concurrency model with the plain Java concurrency model (java.util.concurrent classes).
The use case was a simple, canonical map-reduce implementation of character counting: the dataset was a collection of randomly generated strings (400 characters in length), and the task was to count the number of vowels in each of them.
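Roughly, the per-string work unit shared by both implementations looked like this (a simplified sketch; the class name `VowelCounter` is just for illustration, not the actual code):

```java
// Hypothetical work unit: count the vowels in a single 400-char string.
public class VowelCounter {

    private static final String VOWELS = "aeiouAEIOU";

    public static int countVowels(String s) {
        int count = 0;
        for (int i = 0; i < s.length(); i++) {
            if (VOWELS.indexOf(s.charAt(i)) >= 0) {
                count++;
            }
        }
        return count;
    }
}
```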
For Akka I used a BalancingDispatcher (for load balancing amongst threads) and a RoundRobinRouter (to keep a limit on the number of my function actors). For Java, I used a simple fork/join technique (implemented without any work-stealing algorithm) that forks the map/reduce executions and joins the results. Intermediate results were held in blocking queues so that even the joining could proceed in parallel. If I am not wrong, that somewhat mimics the "mailbox" concept of Akka actors, where they receive messages.
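For the Akka side, the setup was along these lines (a minimal sketch against the Akka 2.x-era Java API; the actor and class names are illustrative, and in the real code the balancing dispatcher is declared in application.conf and attached via `.withDispatcher(...)`):

```java
import akka.actor.ActorRef;
import akka.actor.ActorSystem;
import akka.actor.Props;
import akka.actor.UntypedActor;
import akka.routing.RoundRobinRouter;

// Hypothetical mapper actor: receives one string, replies with its vowel count.
class VowelCountActor extends UntypedActor {
    @Override
    public void onReceive(Object message) {
        if (message instanceof String) {
            getSender().tell(VowelCounter.countVowels((String) message), getSelf());
        } else {
            unhandled(message);
        }
    }
}

public class AkkaVowelCount {
    public static void main(String[] args) {
        ActorSystem system = ActorSystem.create("vowel-count");
        // The RoundRobinRouter caps the number of mapper actors; the balancing
        // dispatcher would be configured in application.conf and attached here.
        ActorRef mappers = system.actorOf(
                new Props(VowelCountActor.class).withRouter(new RoundRobinRouter(20)),
                "mappers");
        // A real run would stream every generated string through the router
        // and aggregate the replies in a reducer actor.
        mappers.tell("a randomly generated 400-char string", null);
    }
}
```

And the plain Java side roughly did this (again a simplified sketch of the fork-then-join-via-blocking-queue idea, reusing the hypothetical `VowelCounter` above):

```java
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.atomic.AtomicLong;

public class QueueVowelCount {

    // Forks one map task per string and joins the results through a blocking queue.
    public static long countVowels(final List<String> lines, int mapThreads)
            throws InterruptedException {
        // Mappers push per-string counts here; the reducer drains it concurrently.
        // This queue is the part loosely resembling an actor mailbox.
        final BlockingQueue<Integer> partials = new LinkedBlockingQueue<Integer>();
        final AtomicLong total = new AtomicLong();

        ExecutorService mappers = Executors.newFixedThreadPool(mapThreads);
        for (final String line : lines) {
            mappers.execute(new Runnable() {
                public void run() {
                    partials.add(VowelCounter.countVowels(line)); // map step
                }
            });
        }

        // Reducer joins results while map tasks are still running.
        Thread reducer = new Thread(new Runnable() {
            public void run() {
                try {
                    for (int i = 0; i < lines.size(); i++) {
                        total.addAndGet(partials.take());
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        });
        reducer.start();
        reducer.join();
        mappers.shutdown();
        return total.get();
    }
}
```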
Observation: up to medium loads (~50,000 input strings) the results were comparable, varying slightly across iterations. However, as I increased the load to ~100,000, the Java solution would hang. I configured the Java solution with 20-30 threads under this condition and it failed in every iteration.
Increasing the load to 1,000,000 was fatal for Akka as well. I can share the code with anyone interested in cross-checking it.
So, to me, it seems Akka scales better than a traditional Java multithreaded solution, and probably the reason is the under-the-hood magic of Scala.
If I can model a problem domain as an event-driven, message-passing one, I think Akka is a good choice for the JVM.
Test performed on: Java 1.6, Eclipse 3.7, Windows Vista 32-bit, 3 GB RAM, Intel Core i5 processor at 2.5 GHz.
Please note, the problem domain used for the test can be debated, and I tried to be as fair as my Java knowledge allowed :-)