“Stream Processing with Apache Flink” how to run book code from IntelliJ?

Submitted by 醉酒当歌 on 2020-04-18 05:48:58

Question


As described in this post, I have been unable to successfully run any code from the book "Stream Processing with Apache Flink", including the precompiled jar.

It is not my practice to use an IDE, but I thought I would try IntelliJ, since Chapter 3, "Run and Debug Flink Applications in an IDE", describes how to do exactly that for this book's code.

The book describes a project import process that I have not been able to follow. It says to set options during the import, for example selecting Maven, but I have not found any way to set options on import.

I am able to import the project and run code. The code fails, apparently because of a missing dependency. Maybe because it isn't being treated as a Maven project?
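(For what it's worth, IntelliJ treats a folder as a Maven project if it finds a pom.xml at its root; with "Open or Import" it then imports the Maven model automatically, without showing the option dialogs that older import wizards had. A minimal root pom.xml, shown here only as a sketch of what IntelliJ looks for - the coordinates are placeholders, not the book's actual values - would be:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <!-- the presence of this file at the repository root is what marks it as a Maven project -->
    <groupId>com.example</groupId>
    <artifactId>examples-scala</artifactId>
    <version>1.0</version>
</project>)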

Here are the steps to reproduce:

git clone https://github.com/streaming-with-flink/examples-scala.git

Start IntelliJ

Choose "Open or Import" and Select the "examples-scala/" folder. The project imports with no chance to select options.

Now I have a project

Browse to the AverageSensorReadings class, open it, and run it.

It fails with:

java.lang.NoClassDefFoundError: org/apache/flink/api/common/typeinfo/TypeInformation

How can I run this code in IntelliJ?


Answer 1:


I see that the Flink dependencies have provided scope in Maven - this means that they are not included in the classpath when you run the application. Most likely the application is meant to be run in an environment where these dependencies already exist (e.g. Hadoop). To be able to run it from the IDE, enable the Include dependencies with "Provided" scope option in the Run Configuration.
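For reference, a provided-scope dependency in the project's pom.xml looks roughly like the sketch below (the artifact id and version are illustrative, not copied from the book's pom):

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-scala_2.12</artifactId>
    <version>1.7.1</version>
    <!-- "provided" keeps this jar off the runtime classpath; a Flink cluster is expected to supply it -->
    <scope>provided</scope>
</dependency>

Because IntelliJ honors that scope when launching a main class, flink-core never reaches the classpath, which is exactly why the JVM reports NoClassDefFoundError for org.apache.flink.api.common.typeinfo.TypeInformation. Besides ticking the Include dependencies with "Provided" scope checkbox, another common workaround is a Maven profile, activated only inside the IDE, that re-declares the same Flink dependencies with compile scope, as some Flink quickstart poms do.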



Source: https://stackoverflow.com/questions/61232605/stream-processing-with-apache-flink-how-to-run-book-code-from-intellij
