How to use the Hadoop MapReduce framework for an OpenCL application?

谎友^ 2021-01-26 04:52

I am developing an application in OpenCL whose basic objective is to implement a data mining algorithm on a GPU platform. I want to use the Hadoop Distributed File System and want to

2 Answers
  • 2021-01-26 05:29

    You could use Hadoop Streaming; with it you can write mappers and reducers in any language you want, as long as your code can read from stdin and write to stdout. For inspiration, you can take a look at examples of how R is used with Hadoop Streaming.
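
    To make the stdin/stdout contract concrete, here is a minimal sketch of a streaming mapper in Python. The word-count logic is only an illustrative placeholder, not part of the answer: in your case the per-line work would be whatever your OpenCL kernel computes.

    ```python
    #!/usr/bin/env python3
    # Minimal Hadoop Streaming mapper: read lines from stdin and
    # emit tab-separated "key<TAB>value" pairs on stdout.
    # Word count is a placeholder for the real per-record work
    # (e.g. dispatching data to an OpenCL kernel).
    import sys

    def map_line(line):
        """Turn one input line into (key, value) pairs."""
        for word in line.strip().split():
            yield word, 1

    def main(stdin=sys.stdin, stdout=sys.stdout):
        for line in stdin:
            for key, value in map_line(line):
                stdout.write(f"{key}\t{value}\n")

    if __name__ == "__main__":
        main()
    ```

    Hadoop Streaming launches this script once per input split, feeds the split's records on stdin, then sorts the emitted pairs by key before handing them to the reducer.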

  • 2021-01-26 05:41

    HDFS is a file system; you can use it from any language.

    HDFS data is distributed across multiple machines and is highly available, which makes it well suited to feeding data into GPU computing.

    For more information, see Hadoop Streaming.
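
    As a concrete illustration of the streaming model both answers point to, here is a minimal reducer sketch in Python. The summing-of-counts logic is illustrative only; the real point is the contract: Hadoop delivers the mapper output on stdin as `key<TAB>value` lines already sorted by key, so a reducer can aggregate each run of identical keys.

    ```python
    #!/usr/bin/env python3
    # Minimal Hadoop Streaming reducer: stdin carries "key<TAB>value"
    # lines sorted by key; aggregate the values for each key run.
    # (Summing integer counts is a placeholder for any per-key
    # aggregation your data mining algorithm needs.)
    import sys
    from itertools import groupby

    def reduce_lines(lines):
        """Yield (key, total) for each run of identical keys."""
        pairs = (line.rstrip("\n").split("\t", 1) for line in lines)
        for key, group in groupby(pairs, key=lambda kv: kv[0]):
            yield key, sum(int(value) for _, value in group)

    def main(stdin=sys.stdin, stdout=sys.stdout):
        for key, total in reduce_lines(stdin):
            stdout.write(f"{key}\t{total}\n")

    if __name__ == "__main__":
        main()
    ```

    Because the input is pre-sorted, `itertools.groupby` sees each key exactly once as a contiguous run, so the reducer never has to hold the whole dataset in memory.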
