I have a Pig script and need to load files from the local Hadoop cluster. I can list the files using the hadoop command: hadoop fs -ls /repo/mydata, but when I tried to load the files in the script, it failed.
Get rid of the space on either side of "=": in=LOAD '/repo/mydata/2012/02' USING PigStorage() AS (event:chararray, user:chararray);
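For reference, a quick sanity check from the grunt shell (a sketch; events is just an alias name picked here, and the schema simply mirrors the statement above):
grunt> fs -ls /repo/mydata/2012/02
grunt> events=LOAD '/repo/mydata/2012/02' USING PigStorage() AS (event:chararray, user:chararray);
grunt> DUMP events;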
I faced the same issue. Please find my suggestions below:
To start working with Pig in local mode, type: [root@localhost training]# pig -x local
Now type the load statement, as in the example below: grunt> a = LOAD '/home/training/pig/TempFile.txt' USING PigStorage(',') AS (c1:chararray, c2:chararray, c3:chararray);
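To confirm the load worked (assuming TempFile.txt is comma-separated, as PigStorage(',') implies), you can describe and dump the alias:
grunt> DESCRIBE a;
grunt> DUMP a;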
My suggestion:
Create a folder in HDFS: hadoop fs -mkdir /pigdata
Copy the file into the newly created HDFS folder: hadoop fs -put /opt/pig/tutorial/data/excite-small.log /pigdata
(or you can do it from the grunt shell: grunt> copyFromLocal /opt/pig/tutorial/data/excite-small.log /pigdata)
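Before running the script, you can verify that the file actually landed in HDFS:
hadoop fs -ls /pigdata
(or, from grunt: grunt> fs -ls /pigdata)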
Execute the Pig Latin script:
grunt> set debug on
grunt> set job.name 'first-p2-job'
grunt> log = LOAD 'hdfs://hostname:54310/pigdata/excite-small.log' AS (user:chararray, time:long, query:chararray);
grunt> grpd = GROUP log BY user;
grunt> cntd = FOREACH grpd GENERATE group, COUNT(log);
grunt> STORE cntd INTO 'output';
The output will be stored in hdfs://hostname:54310/pigdata/output. (Note that a relative path like 'output' in STORE is resolved against the grunt shell's current working directory in HDFS, which defaults to your HDFS home directory, so cd /pigdata in grunt first if you want the output under /pigdata.)
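To inspect the result, concatenate the part files the job produced (the exact part file names depend on the job):
hadoop fs -cat hdfs://hostname:54310/pigdata/output/part*
or, from the grunt shell:
grunt> cat output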