scala - Spark RDD write to HBase -


I am able to read messages from Kafka using the code below:

val ssc = new StreamingContext(sc, Seconds(50))
val topicMap = Map("test" -> 1)
val lines = KafkaUtils.createStream(ssc, "127.0.0.1:2181", "test-consumer-group", topicMap)

Can anyone please tell me how I can write each message read from Kafka to HBase? I am trying to write but with no success.

lines.foreachRDD(rdd => {
  rdd.foreach(record => {
    val i = +1
    val hconf = new HBaseConfiguration()
    val hTable = new HTable(hconf, "test")
    val thePut = new Put(Bytes.toBytes(i))
    thePut.add(Bytes.toBytes("cf"), Bytes.toBytes("a"), Bytes.toBytes(record))
  })
})

Well, you are not executing the put; you are merely creating a Put request and adding data to it. You are missing:

hTable.put(thePut);
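Putting it together, a minimal sketch of the corrected loop might look like the following. It uses the same pre-1.0 HBase client API (`HTable`, `Put`) as the question; creating the configuration and table once per partition (rather than once per record) and the choice of row key are assumptions, not part of the original code:

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.{HTable, Put}
import org.apache.hadoop.hbase.util.Bytes

lines.foreachRDD(rdd => {
  rdd.foreachPartition(partition => {
    // Create the connection once per partition, not once per record,
    // to avoid paying the setup cost for every message.
    val hconf = HBaseConfiguration.create()
    val hTable = new HTable(hconf, "test")
    partition.foreach(record => {
      // Row key derived from the record itself is an assumption;
      // pick whatever key scheme fits your table design.
      val thePut = new Put(Bytes.toBytes(record.toString))
      thePut.add(Bytes.toBytes("cf"), Bytes.toBytes("a"), Bytes.toBytes(record.toString))
      // The call the original code was missing: actually send the Put.
      hTable.put(thePut)
    })
    hTable.close()
  })
})

Note that the `HTable` is built inside `foreachPartition` because HBase connections are not serializable and cannot be created on the driver and shipped to executors.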
