Lead Hadoop Engineer

The Lead Hadoop Engineer is the subject matter expert (SME) for Cloudera CDP and Kafka cluster system administration, responsible for establishing, installing, maintaining, monitoring, and troubleshooting these technologies and for training others in them. This person will also learn and become proficient in cloud, EDB Postgres, and MongoDB technologies within our cross-functional team. The role focuses on driving standards and automation so that developers can become self-sufficient where possible, and on quality-improvement practices that keep our technology performing optimally, staying flexible for business needs, and scaling for growth. Finally, the Lead Hadoop Engineer facilitates cross-training for this team and for others who want to learn how to use these technologies. Note that this role and team operate in an administration/engineering capacity with these technologies rather than a development capacity.

Location Note

This position can be performed remotely, in a work-from-home arrangement within the United States.

Essential Functions

  • Provide technical leadership for our Cloudera CDP and Confluent Kafka technologies, with a willingness to learn and support open-source DBMSs
  • Understand the functional components of CDP and Kafka and give developers the guidance they need to meet business needs
  • Work closely with architects and developers to ensure optimal availability, performance, and stability for the technology managed by this team
  • Provide technical standards and ensure compliance with them
  • Apply strong technical problem-solving skills and teach them to others
  • Continuously improve the team's technical processes and procedures
  • Provide guidance and set standards for teams writing data pipelines and building data repositories with current technologies
  • Monitor and optimize the performance of the team's technologies
  • Research and sustain competency in current technology to maintain and improve the Engineering organization's Hadoop applications
  • Plan and coordinate ongoing maintenance and enhancements to the Hadoop, Kafka, EDB, and MongoDB technologies

Required Experience

  • Bachelor's degree in Engineering, Computer Science, Information Technology, or a related discipline, or equivalent work or military experience
  • Minimum of 5 years of experience in data management, partitioning, topic creation, and publishing using Hadoop and Kafka technologies (a brief sketch of this kind of work follows the qualifications below)
  • 5 years of experience with Hadoop and Kafka data-related principles, practices, and procedures
  • Understanding of data management (SQL, NoSQL, ETL, and metadata management)

Preferred Qualifications

  • Experience working with application developers to understand their needs and help them access Big Data platforms
  • Experience with Agile methodologies, including Scrum
  • Experience managing Big Data platforms (Hadoop: HDFS, Impala, Hive, HBase, Spark, YARN, Hue, KTS, Sentry, ZooKeeper, etc.)
  • Experience with partitioning within Hadoop
  • Experience creating, managing, and supporting event-driven architecture (Kafka)
  • Experience with RESTful API services
  • Experience with Linux systems and scripting languages
  • Experience with functional programming (Scala or Python)
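As a rough illustration of the topic creation, partitioning, and publishing experience called for above, here is a minimal sketch. It assumes the confluent-kafka Python client and a development broker at localhost:9092; the topic name, partition count, and message contents are hypothetical.

from confluent_kafka import Producer
from confluent_kafka.admin import AdminClient, NewTopic

BROKER = "localhost:9092"  # assumption: a local development broker

# Create a partitioned topic (illustrative name and settings).
admin = AdminClient({"bootstrap.servers": BROKER})
futures = admin.create_topics(
    [NewTopic("orders", num_partitions=6, replication_factor=1)]
)
for topic, future in futures.items():
    try:
        future.result()  # block until the broker confirms creation
        print(f"Created topic {topic}")
    except Exception as err:
        print(f"Topic creation failed for {topic}: {err}")

# Publish a keyed message; the key determines partition assignment.
producer = Producer({"bootstrap.servers": BROKER})
producer.produce("orders", key="order-123", value='{"amount": 42}')
producer.flush()  # wait for delivery before exiting

Keying messages by a stable identifier keeps related events on the same partition, which preserves per-key ordering; that is one reason partitioning strategy features in the requirements above.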
Salary Range:
$100K -- $150K
Minimum Qualification
Systems Architecture & Engineering
