I've heard of Java programs with strict latency requirements where "new" instructions are never - or very rarely - used (because no new => no objects => no GC => improved latency).
No, I have never heard of such programs, or even of the technique.
It seems like a bad idea, because then you are effectively limited to what you can do in C or similar low-level languages. And if you want that, it's easier to write in C. Plus there are many ways to avoid long GC pauses, and in practice most low-latency requirements can be met by them.
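For illustration (the flags below are standard HotSpot options; `MyApp` is a placeholder class name), choosing a low-pause collector is usually the first of those ways to try, since it requires no code changes:

```shell
# ZGC (production-ready since JDK 15) aims for very short pauses
# largely independent of heap size:
java -XX:+UseZGC MyApp

# Alternatively, give G1 a pause-time goal - a soft target,
# not a hard guarantee:
java -XX:+UseG1GC -XX:MaxGCPauseMillis=10 MyApp
```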
Plus, in order to do anything useful, you'll have to use the Java platform APIs or other third-party libraries, which will probably allocate plenty of objects behind your back. So avoiding all object instantiation is probably not even practical in a non-trivial program.
So I am fairly certain that this is an urban legend, or at most a niche idea.
Edit:
This technique is used to obtain real-time or low-latency behaviour. Nowadays it may be obsolete because of better GC algorithms, but that will of course depend on circumstances. So it is probably something to consider at least for the hotspots of an algorithm.
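A minimal sketch of what this looks like in a hotspot (class and method names here are illustrative, not from any particular library): the buffer is allocated once up front, and the hot path only mutates existing memory, so it never triggers the allocator or creates garbage.

```java
// Sketch: avoiding per-call allocation in a hot path by reusing a
// preallocated buffer instead of creating objects on each call.
final class Histogram {
    private final long[] buckets;   // allocated once, reused forever

    Histogram(int size) {
        this.buckets = new long[size];
    }

    // Hot path: no "new", no boxing - only mutation of existing memory.
    void record(int bucket) {
        buckets[bucket]++;
    }

    long count(int bucket) {
        return buckets[bucket];
    }
}
```

Only the hot path needs this discipline; code that runs outside the latency-critical window can allocate freely.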
As an example:
Many real-time Java environments place some restrictions on object creation. This does not mean that they can only use primitives: use of complex objects is still possible, but, for example, Safety Critical Java (http://www.aicas.com/papers/scj.pdf) requires all object instantiations to happen during an initialization phase. Once the application is running (the "mission phase"), instantiation is no longer allowed, so you have to work with the object instances you have.
This avoids the unpredictability introduced by dynamic allocation and garbage collection, but still allows the use of dynamic memory (in a limited fashion).
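A sketch of that two-phase pattern in plain Java (the class names are illustrative, not the SCJ API): every instance the mission will ever touch is created in the constructor, and afterwards the pool only hands out existing instances.

```java
// Sketch of an SCJ-style two-phase pattern: all instances are created
// during an initialization phase; the mission phase only reuses them.
final class Sample {
    long timestamp;
    double value;
}

final class SamplePool {
    private final Sample[] slots;
    private int next;

    // Initialization phase: every object the mission phase will ever
    // use is allocated up front.
    SamplePool(int capacity) {
        slots = new Sample[capacity];
        for (int i = 0; i < capacity; i++) {
            slots[i] = new Sample();
        }
    }

    // Mission phase: hand out existing instances round-robin;
    // no "new" ever runs after construction.
    Sample acquire() {
        Sample s = slots[next];
        next = (next + 1) % slots.length;
        return s;
    }
}
```

The pool's size fixes the memory footprint at startup, which is exactly what makes the runtime behaviour predictable: no allocation, hence no collection, during the mission phase.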
Thanks to andersoj & mikera for explaining this to me.