DevOps Engineer - Hadoop job in Leeds at The Stars Group


The Stars Group Leeds, West Yorkshire Full Time

We are looking for a DevOps Engineer – Hadoop to join our busy and dynamic team based in our Leeds service office.

Are you our next star player?

As part of our data platform team, you will drive the design and development of the core platform frameworks that support the delivery and build processes of the Data Management, Data Discovery and Analytics group, using emerging big data technologies. You will apply your depth of knowledge and expertise across the whole software development lifecycle, and partner daily with your business stakeholders to stay focused on common goals.

Why we need you:

You will efficiently translate architecture and low-level requirements into designs, optimize Big Data jobs and investigate job bottlenecks.

You will be responsible for the documentation, design and development of Hadoop applications, and will handle the installation, configuration and support of Hadoop cluster nodes.

You will maintain and support backend MapReduce, Hive, Storm and Flink applications and the Hadoop cluster, and convert complex technical and functional requirements into detailed designs.

As a dedicated team member, you will propose best practices and standards and hand them over to operations. You will test software prototypes and transfer them to the operations team. You will maintain data security and privacy across the data warehouse.

Who we are looking for:

As our perfect candidate, you have:

  • Familiarity with the Hadoop ecosystem and its components, such as HDFS, Kafka, Flink, Hive, YARN and HBase, with exceptional knowledge of Hadoop administration.
  • Hands-on experience in installing, configuring, supporting and managing Hadoop clusters using Apache, Cloudera or Hortonworks distributions.
  • Experience in Development Operations (DevOps), software configuration management, and build and release management.
  • Knowledge of ZooKeeper, JournalNodes, Hadoop HA, Hadoop HA Federation, Hue, MapReduce, HBase, Hive, Apache Ranger, Apache Sentry, Kerberos and Apache Knox.
  • Good experience in setting up high-availability Hadoop clusters, focusing on fault tolerance and auto-scaling.
  • Essential: knowledge of and experience with DevOps automation tools, and excellent knowledge of CI using Maven, Nexus or Jenkins.
  • Good knowledge of Windows and Linux systems, ideally RedHat/CentOS.
  • Experience deploying data science artefacts, implementing metrics-gathering processes, managing the data science technology stack and more.
  • Proficiency in at least one scripting language such as Bash, PowerShell or Python.
  • Knowledge of Docker and Kubernetes is a plus.
  • Knowledge of Ansible or SaltStack for automating configuration management and application deployment is a plus.
  • Good understanding of networking, firewalls, storage systems (DAS, NAS, SAN, FC etc.) and file systems.
  • Good understanding of the SDLC and distributed data, systems and architectures.
  • Good hands-on knowledge of job automation and monitoring tools such as Grafana, Ganglia, Kibana and Nagios.
  • Analytical and problem-solving skills, and the ability to apply them in the Big Data domain.
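To give a flavour of the scripting and Hadoop administration work described above, here is a minimal, hedged sketch in Python: a helper that parses the summary of a `hdfs dfsadmin -report` run (a standard Hadoop admin command) and flags dead datanodes. The sample report text below is invented for illustration; in practice the script would consume the command's real output.

```python
import re

# Hypothetical excerpt of `hdfs dfsadmin -report` output. The section
# headers mirror the real report format; the values are invented.
SAMPLE_REPORT = """\
Live datanodes (2):

Name: 10.0.0.11:9866
Name: 10.0.0.12:9866

Dead datanodes (1):

Name: 10.0.0.13:9866
"""

def count_datanodes(report: str) -> dict:
    """Parse live/dead datanode counts from a dfsadmin report summary."""
    counts = {"live": 0, "dead": 0}
    for state in counts:
        # Match headers like "Live datanodes (2):"
        match = re.search(rf"{state} datanodes \((\d+)\)", report, re.IGNORECASE)
        if match:
            counts[state] = int(match.group(1))
    return counts

if __name__ == "__main__":
    counts = count_datanodes(SAMPLE_REPORT)
    if counts["dead"] > 0:
        print(f"WARNING: {counts['dead']} dead datanode(s) detected")
```

A script like this would typically run from cron or a monitoring agent, feeding its result into a tool such as Nagios or Grafana.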
What’s in it for you?

Our experience-based salaries are competitive. Plus, there’s a discretionary annual performance bonus. And we provide advice and dedicated assistance to those moving to Leeds.

Your package will include:

  • health and dental insurance for you, your partner and your children (if you all live at the same address)
  • the option to join our company pension scheme
  • £1,000 as a congratulations if you have a baby whilst you work for us
  • discounted gym membership
  • in-house training and development to develop your skills, progressing your career
  • free fresh fruit, snacks and drinks in the office
  • social events, including our sensational summer and Christmas parties.


    Job ID: bdfcd6c8cc53