Apache Pig permissions issue

鱼传尺愫 2020-12-18 01:22

I'm attempting to get Apache Pig up and running on my Hadoop cluster, and am encountering a permissions problem. Pig itself is launching and connecting to the cluster just fine.

2 answers
  • 2020-12-18 02:12

    This is probably your pig.temp.dir setting. It defaults to /tmp on HDFS, and Pig writes its temporary results there. If you don't have write permission to /tmp, Pig will complain. Try overriding it with -Dpig.temp.dir.
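    For example, a minimal sketch (the HDFS path and the script name `myscript.pig` are placeholders; adjust for your cluster):

    ```shell
    # Create an HDFS scratch directory you own, then point Pig's temp dir at it.
    hdfs dfs -mkdir -p /user/$USER/pig_tmp
    pig -Dpig.temp.dir=/user/$USER/pig_tmp myscript.pig
    ```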

  • 2020-12-18 02:16

    The problem might be that hadoop.tmp.dir points to a directory on your local filesystem, not HDFS. Try setting that property to a local directory you know you have write access to. I've run into the same error with regular MapReduce jobs in Hadoop.
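    As a sketch, assuming your Pig launcher forwards -D properties into the Hadoop configuration (the local path and script name are placeholders):

    ```shell
    # Use a local directory you can write to for Hadoop's scratch space.
    mkdir -p /home/$USER/hadoop-tmp
    pig -Dhadoop.tmp.dir=/home/$USER/hadoop-tmp myscript.pig
    ```

    The property is normally set once in core-site.xml rather than per invocation, so a cluster-wide fix would edit that file instead.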
