Hi,
I am trying to write my first MapReduce program, which doubles the integers from 1 to 100:
library(rmr2)
ints <- to.dfs(1:100)
calc <- mapreduce(input = ints,
                  map = function(k, v) cbind(v, 2 * v))
from.dfs(calc)
However, I get an error message:
13/11/19 16:02:36 INFO streaming.StreamJob: killJob…
Streaming Command Failed!
Error in mr(map = map, reduce = reduce, combine = combine, in.folder = if (is.list(input)) { :
hadoop streaming failed with error code 1
I am using CentOS on a 2-node cluster, with RStudio running on the master node.
After reassigning the environment variables (HADOOP_CMD, HADOOP_CONF, HADOOP_STREAMING, etc.) in R over PuTTY, I am able to run the program from the R command line. However, when I run the same program in RStudio, I get the error message above.
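For reference, this is roughly how I set the variables inside the R session before loading rmr2 (the paths below are examples from my setup, not necessarily yours):

```r
# Example paths only -- adjust to where Hadoop is installed on your cluster
Sys.setenv(HADOOP_CMD = "/usr/bin/hadoop")
Sys.setenv(HADOOP_STREAMING = "/usr/lib/hadoop-mapreduce/hadoop-streaming.jar")
Sys.setenv(HADOOP_CONF = "/etc/hadoop/conf")

library(rmr2)  # load rmr2 only after the environment variables are set
```

I suspect the difference is that the RStudio session does not inherit the shell environment the way my PuTTY login session does, so the variables may need to be set within R (or in a startup file such as .Renviron) for RStudio as well.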
Please help.
Thanks,
Nitin