Hadoop Application Developer and other related Employment listings at Geebo

Hadoop Application Developer
Location: North Chicago, IL 60064
Duration: 6 months

Job Summary:
Responsible for solution design and development of a next generation enterprise data hub, integration with other applications and source systems to provide reporting and analytic solutions.
Strong understanding of data lake approaches, industry standards, and industry best practices, with experience of enterprise-scale implementations following the Software Development Lifecycle (SDLC) process.
Responsibilities/Qualifications:
  • Define data solutions for reporting and analytics, and integrate data assets from transactional applications
  • Proficiency in AWS Cloud, API Gateway management, and microservices modeling using Big Data technologies
  • Innovate and create a standardized framework for ingesting, managing, transforming, and validating data, with reusable code/strategies that provide agile data integration solutions
  • Identify, analyze, and interpret trends or patterns in complex data sets
  • Establish and enforce guidelines to ensure consistency, quality, and completeness of data assets
  • Good understanding of data cataloging for self-service use
  • Experience with code/build/deployment tools such as Git and SVN
  • Experience with statistical analyses and predictive modeling techniques a huge plus
  • Ability to work on multiple fast-paced projects, with exposure to managing and providing architectural guidance to both offshore and onshore development teams
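As a rough illustration of the "reusable ingestion/validation framework" responsibility, here is a minimal sketch in plain Python (all names and rules are hypothetical; a production version would typically run on Spark or another engine from the listed stack):

```python
# Hypothetical sketch of a reusable ingestion/validation step: each rule is a
# plain function, so a new data source only needs to supply its own rule list.
from typing import Callable, Iterable

Record = dict
Rule = Callable[[Record], bool]

def not_null(field: str) -> Rule:
    """Rule factory: the field must be present and non-empty."""
    return lambda rec: rec.get(field) not in (None, "")

def in_range(field: str, lo: float, hi: float) -> Rule:
    """Rule factory: the numeric field must fall inside [lo, hi]."""
    return lambda rec: lo <= float(rec.get(field, "nan")) <= hi

def ingest(records: Iterable[Record], rules: list[Rule]):
    """Split records into (valid, rejected) according to the rule list."""
    valid, rejected = [], []
    for rec in records:
        (valid if all(rule(rec) for rule in rules) else rejected).append(rec)
    return valid, rejected

raw = [
    {"id": "1", "amount": "40.0"},
    {"id": "",  "amount": "55.0"},   # rejected: empty id
    {"id": "3", "amount": "999.0"},  # rejected: amount out of range
]
good, bad = ingest(raw, [not_null("id"), in_range("amount", 0, 100)])
```

The point of the design is that ingestion logic stays generic while per-source quality rules are pluggable, which is what makes the code reusable across sources.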
  • 3 years of hands-on experience with Hadoop-ecosystem technologies such as Hadoop, HDFS, Spark, MapReduce, Pig, Hive, Flume, Sqoop, Cloudera Impala, ZooKeeper, Oozie, Hue, Python, Scala, and shell scripting
  • Experience with data analytic tools including Hive, Spark SQL, HBase, MongoDB, and other NoSQL databases
  • Experience with search systems (Solr, Elasticsearch)
  • Good understanding of relational databases and solid SQL skills
  • Exposure to high-availability configurations and application performance tuning
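To illustrate the kind of SQL the role calls for, here is a hypothetical aggregation query, with in-memory SQLite standing in for Hive or Spark SQL (the table and column names are invented):

```python
# Hypothetical example: an aggregate reporting query run against an in-memory
# SQLite database standing in for Hive or Spark SQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("north", "2024-01", 100.0), ("north", "2024-02", 150.0),
     ("south", "2024-01", 80.0),  ("south", "2024-02", 60.0)],
)
# Total revenue per region, highest first.
rows = conn.execute(
    """
    SELECT region, SUM(revenue) AS total
    FROM sales
    GROUP BY region
    ORDER BY total DESC
    """
).fetchall()
```

The same GROUP BY/ORDER BY pattern carries over directly to Hive and Spark SQL at data-lake scale.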
What are the top skills/requirements this person is required to have?
  • Ability to work on multiple fast-paced projects, with exposure to managing and providing architectural guidance to both offshore and onshore development teams
  • 3 years of hands-on experience with Hadoop-ecosystem technologies such as Hadoop, HDFS, Spark, MapReduce, Pig, Hive, Flume, Sqoop, Cloudera Impala, ZooKeeper, Oozie, Hue, Python, Scala, and shell scripting
  • Experience with data analytic tools including Hive, Spark SQL, HBase, and other NoSQL databases
Estimated Salary: $20 to $28 per hour based on qualifications.
