Spark Scala expert

Completed Posted 5 years ago Paid on delivery

Hi there!

I'm struggling with one issue; there are two possible ways to fix it.

Basically, I have a list of strings in a DataFrame. I want to iterate over all the strings and check each one against Neo4j. The query seems to work now, but I don't know why all the RDD results are coming back empty.

val query = "MATCH (n:Term) where [login to view URL] = {term} RETURN [login to view URL] as term, [login to view URL] as tf"

val nRdd = [login to view URL] { term =>
  val neo = Neo4j([login to view URL]()).cypher(query)
  val resultRDD = [login to view URL]("term", Seq(term)).loadRowRdd
  // note: the block ends with a val definition, so this closure returns Unit
}

[login to view URL](20).foreach(println) // <- results are empty
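The empty results are consistent with a known limitation: the `map` closure runs on the executors, where no `SparkContext` is available, so an RDD such as the one from `loadRowRdd` cannot be created there (and, as written, the closure ends with a `val` definition and therefore returns `Unit` anyway). A minimal sketch of the driver-side alternative, where the hypothetical `runTermQuery` stands in for the Neo4j connector call (`cypher(query).param("term", t)` collected back on the driver):

```scala
// Hypothetical sketch: drive the per-term lookups from the driver, not from
// inside a map() on the executors. runTermQuery is a stub standing in for the
// Neo4j connector call; the returned data here is purely illustrative.
def runTermQuery(term: String): Seq[(String, Long)] =
  if (term == "growth hacker") Seq(("growth hacker", 3L)) else Seq.empty

val terms = Seq("growth hacker", "Popego")

// Flatten the per-term results into one collection on the driver;
// with the real connector you could instead union the per-term RDDs.
val results = terms.flatMap(runTermQuery)
```

This is only a sketch of the control flow under those assumptions; for many terms, the single batched query below avoids the per-term round trips entirely.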

I am using the Spark connector; this issue is being discussed here: [login to view URL]

The other solution I am considering is building a CSV string out of the DataFrame rows, so that I can run a single Neo4j query.

That will definitely give me results.

The query will look like this: WITH ["growth hacker","Popego"] as terms MATCH (t:Term) where [login to view URL] in terms return t
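That batched approach can be sketched as a plain query-string builder. Note this is only an assumption-laden illustration: the node property name `t.term` is a guess, since the actual property is redacted in the posting above.

```scala
// Hypothetical helper: build one batched Cypher query from all the terms.
// Assumes the Term nodes expose a property named `term` (redacted above).
def escape(s: String): String =
  s.replace("\\", "\\\\").replace("\"", "\\\"")

def batchedQuery(terms: Seq[String]): String = {
  // Render the terms as a Cypher list literal, quoting each string
  val list = terms.map(t => "\"" + escape(t) + "\"").mkString("[", ",", "]")
  s"WITH $list AS terms MATCH (t:Term) WHERE t.term IN terms RETURN t"
}

val q = batchedQuery(Seq("growth hacker", "Popego"))
// q: WITH ["growth hacker","Popego"] AS terms MATCH (t:Term) WHERE t.term IN terms RETURN t
```

In practice, passing the list as a Cypher parameter (e.g. `{terms}` via the connector's `param`) is safer than interpolating strings into the query; the builder above is only to show the query shape.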

So, are you confident you can fix it one of these ways?

If yes, can you start immediately? I will give you the Scala file and sbt settings so you can set up your local environment quickly.

Scala Spark

Project ID: #17712080

About the project

2 proposals Remote project Active 5 years ago

Awarded to:

deytps86

Hello, I work in Big Data/Hadoop technologies. I have worked with Spark, Kafka, and Cassandra using Scala, Java, and Python as well. Can we talk further? Thank you!

$15 USD / hour
(6 Reviews)
3.8

2 freelancers are bidding on average $22/hour for this job

kuhu106

BIG DATA EXPERT! Hello, hope you are doing well. Specialized skills: Kafka / Kafka Streams / Kafka Connect / Spark Streaming. I noticed your issue about Spark. We can discuss more. We are a team of Big Data ex…

$28 USD / hour
(0 Reviews)
0.0