It all comes down to shared data/shared state. If you share no data or state then you have no concurrency problems.
Most people, when they think of concurrency, think of multi-threading in a single process.
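To make the shared-state problem concrete, here is a minimal Go sketch (my own illustration, not part of the original question) of exactly that case: two threads of execution in one process racing on a shared counter.

```go
// Two goroutines updating the same counter with no coordination:
// the shared state is the whole problem.
package main

import (
	"fmt"
	"sync"
)

func main() {
	var wg sync.WaitGroup
	counter := 0 // shared state: the source of the trouble

	for i := 0; i < 2; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for j := 0; j < 100000; j++ {
				counter++ // unsynchronized read-modify-write: a data race
			}
		}()
	}
	wg.Wait()

	// Rarely prints 200000; interleaved increments lose updates.
	fmt.Println("counter =", counter)
}
```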
One way to think about this is to ask what would happen if you split your process into multiple processes: where would they have to communicate with each other? If you can be clear about where the processes have to communicate, then you have a good idea of the data they share.
Now, as a mental test, move those processes onto individual machines. Are your communication patterns still correct? Can you still see how to make it work? If not, you might want to reconsider using multiple threads.
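Here is a minimal sketch of what passing that test tends to look like: the worker owns its own state, and the only sharing is the messages on the channels. The request/reply names are mine, not anything from the question. Swapping the channels for a network connection would not change the shape of the program, which is the point of the mental test above.

```go
// Communication points made explicit: the worker owns its state and
// everything it needs arrives in the message itself.
package main

import "fmt"

type request struct {
	value int
}

type reply struct {
	doubled int
}

func worker(in <-chan request, out chan<- reply) {
	for req := range in {
		out <- reply{doubled: req.value * 2}
	}
	close(out)
}

func main() {
	in := make(chan request)
	out := make(chan reply)
	go worker(in, out)

	go func() {
		for i := 1; i <= 3; i++ {
			in <- request{value: i}
		}
		close(in)
	}()

	for r := range out {
		fmt.Println("got", r.doubled)
	}
}
```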
(The rest of this doesn't apply to Java threading, which I don't use and therefore know little about.)
The other place where you can get caught is locking: if you use locks to protect shared data, you should write a lock monitor that can find deadlocks for you. Then your program(s) need to deal with the possibility of a deadlock. When you get a deadlock error, you have to release all of your locks, back up, and try again.
Without that, you are unlikely to make multiple locks work well; it takes a level of care that is quite rare in real systems.
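For what it's worth, here is a minimal Go sketch of that recovery pattern. It is my own illustration, not a real lock monitor: a failed TryLock (Go 1.18+) stands in for the deadlock detection, and the response is the one described above, release everything you hold, back off briefly, and try again.

```go
// "Release everything, back off, try again", sketched with TryLock.
package main

import (
	"math/rand"
	"sync"
	"time"
)

// acquireAll tries to take every lock; if any one is unavailable it releases
// the locks it already holds and reports failure so the caller can retry.
func acquireAll(locks []*sync.Mutex) bool {
	for i, l := range locks {
		if !l.TryLock() {
			for j := 0; j < i; j++ {
				locks[j].Unlock()
			}
			return false
		}
	}
	return true
}

func withAllLocks(locks []*sync.Mutex, critical func()) {
	for !acquireAll(locks) {
		// Back off for a short, randomized interval so two contending
		// goroutines don't simply collide again immediately.
		time.Sleep(time.Duration(rand.Intn(5)+1) * time.Millisecond)
	}
	defer func() {
		for _, l := range locks {
			l.Unlock()
		}
	}()
	critical()
}

func main() {
	a, b := &sync.Mutex{}, &sync.Mutex{}
	var wg sync.WaitGroup
	shared := 0

	for i := 0; i < 4; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			withAllLocks([]*sync.Mutex{a, b}, func() { shared++ })
		}()
	}
	wg.Wait()
}
```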
Good luck!