❖ Expertise with tools in the Hadoop ecosystem, including Hive, HDFS, MapReduce, Sqoop, Spark, and
YARN.
❖ Excellent knowledge of Hadoop ecosystem components such as HDFS, and of Spark programming using
RDDs, DataFrames, and Spark SQL.
❖ Experience in manipulating and analyzing large datasets and finding patterns and insights within
structured and unstructured data.
❖ Strong experience with the Cloudera Hadoop distribution.
❖ Experienced in writing Spark Scala programs that work with different file formats such as Text,
SequenceFile, JSON, Parquet, and Avro.
❖ Experience in migrating data between HDFS and relational database systems using Sqoop, according
to client requirements.
❖ Experience using IntelliJ IDEA and version control systems such as SVN and Git.
❖ Experience in C#.NET, ETL with SSIS, and SQL Server 2008/2012.
❖ Good knowledge of object-oriented programming.