Spark Scala expert
$8-15 USD / hour
Hi there!
I'm struggling with one issue; there are two ways to fix it.
Basically I have a list of strings in a DataFrame, and I want to iterate over all the strings and check each one against Neo4j. The query seems to be working now, but I don't know why all the RDD results come back empty.
val query = "MATCH (n:Term) where [login to view URL] = {term} RETURN [login to view URL] as term,[login to view URL] as tf ";
val nRdd = [login to view URL] { term =>
val neo = Neo4j([login to view URL]()).cypher(query)
val resultRDD = [login to view URL]("term", Seq(term)).loadRowRdd
}
[login to view URL](20).foreach(println) <- Results are empty
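For what it's worth, a likely cause of the empty results is a Spark restriction: RDDs cannot be created or used inside a transformation such as map, because the closure runs on executors where the SparkContext is not available, so each inner loadRowRdd call can't do anything useful (and the map body ends with a val definition, so it returns Unit anyway). A minimal sketch of a driver-side workaround, assuming the DataFrame is called termsDf with a string column "term", and that the redacted properties are n.name / n.tf (all of these names are stand-ins, not taken from your code); the Neo4j(...).cypher(...).param(...).loadRowRdd chain mirrors the connector API your snippet already uses:

```scala
import org.apache.spark.sql.SparkSession
import org.neo4j.spark.Neo4j

// Sketch only: termsDf, the "term" column, and the n.name / n.tf
// properties are hypothetical placeholders for the redacted names.
val spark = SparkSession.builder().appName("neo4j-terms").getOrCreate()
val sc = spark.sparkContext

// Pull the terms to the driver first -- RDD operations must not be
// invoked inside map(), which runs on the executors.
val terms: Array[String] =
  termsDf.select("term").collect().map(_.getString(0))

val query =
  "MATCH (n:Term) WHERE n.name = {term} RETURN n.name AS term, n.tf AS tf"

// One Cypher call per term, issued from the driver; union the results.
val perTermRdds = terms.map { t =>
  Neo4j(sc).cypher(query).param("term", t).loadRowRdd
}
val result = sc.union(perTermRdds)
result.take(20).foreach(println)
```

This still makes one round trip to Neo4j per term, so for a large term list the single WITH-list query below is the cheaper option.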
I am using the Neo4j Spark connector; this issue is under discussion here: [login to view URL]
The other solution I'm considering is building a single CSV-style string out of the DataFrame rows, so I can run one Neo4j query.
That will definitely give me results.
The query will look like this: WITH ["growth hacker","Popego"] as terms MATCH (t:Term) where [login to view URL] in terms return t
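If you go this route, the term list for the WITH clause can be built on the driver with plain Scala before sending one query. A minimal sketch of just the string-building step (no Spark or Neo4j needed; t.name in the WHERE clause is a stand-in for the redacted property name):

```scala
// Build a single Cypher query of the form:
//   WITH ["a","b"] as terms MATCH (t:Term) where t.name in terms return t
// Backslashes and double quotes inside terms are escaped so the
// generated list literal stays valid Cypher.
def buildTermsQuery(terms: Seq[String]): String = {
  val list = terms
    .map(t => "\"" + t.replace("\\", "\\\\").replace("\"", "\\\"") + "\"")
    .mkString("[", ",", "]")
  s"WITH $list as terms MATCH (t:Term) where t.name in terms return t"
}

println(buildTermsQuery(Seq("growth hacker", "Popego")))
// -> WITH ["growth hacker","Popego"] as terms MATCH (t:Term) where t.name in terms return t
```

The resulting string can then be passed once to the connector (e.g. Neo4j(sc).cypher(...).loadRowRdd), turning N round trips into one.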
So, are you confident you can fix it either way?
If yes, can you start immediately? I will give you the Scala file and sbt settings so you can set up your local environment quickly.
Project ID: #17712080
About the project
Awarded to:
Hello! I work in Big Data/Hadoop technologies. I have worked with Spark, Kafka and Cassandra using Scala, Java and Python. Can we talk further? Thank you!