Question
I am trying to compute row similarity between Wikipedia documents. I have the tf-idf vectors as a SequenceFile with key class org.apache.hadoop.io.Text and value class org.apache.mahout.math.VectorWritable.
I am following the quick tour of text analysis from here: https://cwiki.apache.org/confluence/display/MAHOUT/Quick+tour+of+text+analysis+using+the+Mahout+command+line
I created a Mahout matrix as follows:
mahout rowid \
  -i wikipedia-vectors/tfidf-vectors/part-r-00000 \
  -o wikipedia-matrix
It reported the number of generated rows and columns:
vectors.RowIdJob: Wrote out matrix with 4587604 rows and 14121544 columns to wikipedia-matrix/matrix
The matrix has key class org.apache.hadoop.io.IntWritable and value class org.apache.mahout.math.VectorWritable.
I also have a docIndex file with key class org.apache.hadoop.io.IntWritable and value class org.apache.hadoop.io.Text.
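(For reference, the key/value classes of these SequenceFiles can be double-checked by opening them with Hadoop's SequenceFile.Reader; a minimal sketch, with the input path left as a placeholder to point at matrix or docIndex:)

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.SequenceFile;

public class InspectSeqFile {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    // Placeholder: pass the HDFS path of the SequenceFile to inspect.
    Path path = new Path(args[0]);
    SequenceFile.Reader reader = new SequenceFile.Reader(fs, path, conf);
    try {
      // Print the key and value classes recorded in the file header.
      System.out.println("Key class:   " + reader.getKeyClassName());
      System.out.println("Value class: " + reader.getValueClassName());
    } finally {
      reader.close();
    }
  }
}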
Then I run the rowsimilarity job:
mahout rowsimilarity \
  -i wikipedia-matrix/matrix \
  -o wikipedia-similarity \
  -r 4587604 \
  --similarityClassname SIMILARITY_COSINE \
  -m 50 \
  -ess
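(For reference, SIMILARITY_COSINE scores each pair of rows by the cosine of the angle between their tf-idf vectors; a minimal plain-Java sketch of that formula, not Mahout's actual distributed implementation:)

public class CosineSketch {
  // Cosine similarity of two rows: dot(a, b) / (||a|| * ||b||).
  static double cosine(double[] a, double[] b) {
    double dot = 0, normA = 0, normB = 0;
    for (int i = 0; i < a.length; i++) {
      dot += a[i] * b[i];
      normA += a[i] * a[i];
      normB += b[i] * b[i];
    }
    return dot / (Math.sqrt(normA) * Math.sqrt(normB));
  }

  public static void main(String[] args) {
    // Two toy tf-idf rows pointing in the same direction give similarity 1.0.
    System.out.println(cosine(new double[] {1, 2, 0}, new double[] {2, 4, 0}));
  }
}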
With this command I am getting the following error:
13/08/25 15:18:18 INFO mapred.JobClient: Task Id : attempt_201308161435_0364_m_000001_1, Status : FAILED
java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.mahout.math.VectorWritable
at org.apache.mahout.math.hadoop.similarity.cooccurrence.RowSimilarityJob$VectorNormMapper.map(RowSimilarityJob.java:183)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:648)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:322)
at org.apache.hadoop.mapred.Child$4.run(Child.java:266)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1278)
at org.apache.hadoop.mapred.Child.main(Child.java:260)
Could someone please help me with this error? I am not sure where this org.apache.hadoop.io.Text comes from, since the input matrix has key class org.apache.hadoop.io.IntWritable and value class org.apache.mahout.math.VectorWritable.
Thank you very much.
Best, Dragan
Answer 1:
I solved it using the following command:
hadoop jar mahout-examples-0.9-SNAPSHOT.jar \
  org.apache.mahout.math.hadoop.similarity.cooccurrence.RowSimilarityJob \
  -i /user/dmilchev/wikipedia-matrix/matrix \
  -o /user/dmilchev/wikipedia-similarity \
  -r 4587604 \
  --similarityClassname SIMILARITY_COSINE \
  -m 50 \
  -ess
and I did not get any error.
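(To inspect the result: the output should again be a SequenceFile keyed by IntWritable row id with a VectorWritable of similarities as the value, so it can be read back with the same SequenceFile API. A minimal sketch follows; the part file name is a placeholder, and the row ids can be mapped back to document names via docIndex:)

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.mahout.math.VectorWritable;

public class DumpSimilarities {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    // Placeholder: one part file of the rowsimilarity output.
    Path path = new Path("/user/dmilchev/wikipedia-similarity/part-r-00000");
    SequenceFile.Reader reader = new SequenceFile.Reader(fs, path, conf);
    IntWritable rowId = new IntWritable();
    VectorWritable similarities = new VectorWritable();
    try {
      while (reader.next(rowId, similarities)) {
        // Each value is a sparse vector: index = other row id, weight = similarity score.
        System.out.println(rowId.get() + " -> " + similarities.get());
      }
    } finally {
      reader.close();
    }
  }
}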
Source: https://stackoverflow.com/questions/18429571/mahout-rowsimilarity