Question
I'm building integration tests for my Spark API. Sometimes I want to stop and start the Spark instance. When I do that, I sometimes run into the problem that I create a new Spark instance while the old one is still shutting down on a separate thread. It would be helpful to know when the Spark instance has actually shut down.
First I start my Spark instance like this:
Spark.init();
Spark.awaitInitialization();
Then I stop it like this:
Spark.stop();
Now after I call stop(), the Spark service hasn't actually stopped yet!
Is there functionality similar to awaitInitialization(), or some other way of knowing when the Spark service has actually stopped?
Answer 1:
Spark 2.8.0 introduced an awaitStop() method: https://github.com/perwendel/spark/pull/730
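With that version available, the teardown from the question can simply block until the server is gone. A minimal sketch (assuming Spark 2.8.0 or later is on the classpath; restartSpark is just an illustrative helper name):
public static void restartSpark() {
    Spark.stop();
    Spark.awaitStop();           // blocks until the embedded server has fully shut down
    Spark.init();                // now it is safe to bring up a fresh instance
    Spark.awaitInitialization();
}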
If you are stuck on an earlier version (e.g. using spark-kotlin, which depends on Spark 2.6.0), you could use some reflection to check the current state of Spark:
import spark.Service
import spark.Spark

fun awaitShutdown() {
    Spark.stop()
    // Poll the private "initialized" flag until Spark has actually finished shutting down.
    while (isSparkInitialized()) {
        Thread.sleep(100)
    }
}

/**
 * Access the internals of Spark to check if the "initialized" flag is already set to false.
 */
private fun isSparkInitialized(): Boolean {
    val sparkClass = Spark::class.java
    val getInstanceMethod = sparkClass.getDeclaredMethod("getInstance")
    getInstanceMethod.isAccessible = true
    val service = getInstanceMethod.invoke(null) as Service
    val serviceClass = service::class.java
    val initializedField = serviceClass.getDeclaredField("initialized")
    initializedField.isAccessible = true
    val initialized = initializedField.getBoolean(service)
    return initialized
}
(extracted from https://github.com/debuglevel/sparkmicroserviceutils/blob/ec6b9692d808ecc448f1828f5487739101a2f62e/src/main/kotlin/de/debuglevel/microservices/utils/spark/SparkTestUtils.kt)
Answer 2:
I read this solution in https://github.com/perwendel/spark/issues/731 and it works for me:
public static void stopServer() {
    try {
        Spark.stop();
        // Spark.port() throws an IllegalStateException once the service is no longer
        // initialized, so keep polling until that exception signals shutdown is complete.
        while (true) {
            try {
                Spark.port();
                Thread.sleep(500);
            } catch (final IllegalStateException ignored) {
                break;
            }
        }
    } catch (final Exception ex) {
        // Ignore
    }
}
Answer 3:
I use spark-java to build mock services for integration/functional tests.
My test teardown code:
public FakeServer shutdown() {
    service.stop();
    // Remove when https://github.com/perwendel/spark/issues/705 is fixed.
    try {
        Thread.sleep(100);
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
    return this;
}
This works seamlessly for me: each test sets up the FakeServer in @Before and tears it down on test completion in @After.
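For reference, a minimal sketch of that JUnit 4 scaffolding (assuming a hypothetical start() method on FakeServer that initializes the Spark Service and awaits initialization):
private FakeServer fakeServer;

@Before
public void setUp() {
    fakeServer = new FakeServer().start();   // hypothetical start(): service.init() + awaitInitialization()
}

@After
public void tearDown() {
    fakeServer.shutdown();                   // stop() plus the short sleep shown above
}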
Give it a shot.
Source: https://stackoverflow.com/questions/46451309/how-to-wait-until-spark-service-is-stopped