Hadoop Administrator - Senior at AdventHealth

Date Posted: 3/12/2020

Job Snapshot

  • Job Schedule
    Full-Time
  • Date Posted:
    3/12/2020
  • Job ID:
    20003169
  • Job Family
    Information Systems
  • Travel
    No
  • Shift
    1 - Day
  • Application Zone
    1-Shared Services
  • Organization
    AdventHealth Information Technology

Job Description



Hadoop Administrator - Senior

AdventHealth Information Technology

Location Address: Inspiration Avenue, Altamonte Springs, FL

Top Reasons To Work At AdventHealth Corporate

• Great benefits
• Immediate health insurance coverage
• Career growth and advancement potential
• Award-winning IT department

Work Hours/Shift:

Full-Time, Monday – Friday

 

You Will Be Responsible For:

• Implement and support Hadoop infrastructure on an Oracle Big Data Appliance
• Implement and support our enterprise security standards on a Hadoop cluster
• Propose and deploy new software environments required for Hadoop and expand existing environments
• Set up new Hadoop users, including setting up and testing HDFS, Hive, Pig and MapReduce access for the new users
• Tune performance of Hadoop clusters and Hadoop MapReduce or Spark jobs
• Monitor Hadoop cluster job performance (illustrated in the sketch following this list)
• Plan Hadoop cluster capacity
• Manage and review Hadoop log files
• Manage and monitor the cluster file system
• Work with the infrastructure, network, database, application and business intelligence teams to guarantee high availability
• Install operating system and Hadoop updates, patches and version upgrades when required
• Work with developers to evaluate their Hadoop use cases and provide feedback and guidance
• Escalate support issues to vendors
• Perform development activities as required by Big Data project specifications, working with Python, Apache Spark and/or Java/Scala
• Manage system backups and DR plans
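
As a purely illustrative sketch of the monitoring duties above, the Python snippet below polls the YARN ResourceManager REST API for cluster-wide metrics and running applications. The ResourceManager address (rm.example.com:8088) is a placeholder, and a Kerberos-secured cluster would additionally require SPNEGO authentication.

    # Illustrative sketch: poll the YARN ResourceManager REST API.
    # rm.example.com:8088 is a placeholder address, not a real endpoint.
    import json
    import urllib.request

    RM = "http://rm.example.com:8088"

    def cluster_metrics():
        """Cluster-wide metrics: running apps, allocated memory, node counts."""
        with urllib.request.urlopen(f"{RM}/ws/v1/cluster/metrics") as resp:
            return json.load(resp)["clusterMetrics"]

    def running_apps():
        """Applications currently in the RUNNING state, with their resource usage."""
        with urllib.request.urlopen(f"{RM}/ws/v1/cluster/apps?states=RUNNING") as resp:
            apps = json.load(resp)["apps"]
            return apps["app"] if apps else []

    if __name__ == "__main__":
        m = cluster_metrics()
        print(f"apps running: {m['appsRunning']}, allocated memory: {m['allocatedMB']} MB")
        for app in running_apps():
            print(f"{app['id']}  {app['name']}  vcores={app['allocatedVCores']}")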


Qualifications

KNOWLEDGE AND SKILLS REQUIRED:

 

• Ability to proactively identify, troubleshoot and resolve live systems issues
• Understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage and networks
• Ability to install operating system and Hadoop updates, patches and version upgrades when required
• Hadoop development skills including HBase, Hive, Pig, Mahout, etc.
• Ability to deploy a Hadoop infrastructure, add/remove cluster nodes and install software
• Ability to schedule, configure and keep track of jobs, monitor critical parts of the cluster, and configure NameNode high availability
• Ability to manage system backups and DR plans
• Strong analytical and problem-solving skills with the ability to clearly articulate solution alternatives
• Exceptional interpersonal skills to communicate both internally and externally; a team player
• Strong understanding of Hadoop design principles, cluster connectivity, security and the factors that affect distributed system performance
• Strong knowledge of and experience in supporting Linux environments
• Flexible, open to suggestions, and eager to learn or share knowledge
• Orientation toward self-motivation, organization and attention to detail
• Ability to prioritize and work on multiple projects

KNOWLEDGE AND SKILLS PREFERRED:

• Administration activities on an Oracle Big Data Appliance
• Troubleshooting core Java applications
• Oracle databases
• Impala, Python, Apache Spark and/or Java/Scala

EDUCATION AND EXPERIENCE REQUIRED:

• BS degree in Computer Science or a related field
• Minimum of 3 years' experience with Hadoop and the related technology stack
• Experience maintaining, troubleshooting and setting up large clusters
• Experience supporting systems with 24x7 availability and monitoring

 

EDUCATION AND EXPERIENCE PREFERRED:

• Implementing security on Hadoop (HDFS encryption, Kerberos and LDAP integration, and Apache Ranger/Knox)
• Performance tuning in Big Data environments (see the sketch following this list)
• Cloudera distribution of Hadoop
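
As an illustrative sketch of the Spark performance-tuning work referenced above, the PySpark snippet below shows the kind of executor and shuffle settings typically adjusted on a YARN-managed Hadoop cluster; the application name and all values are examples, not prescribed settings.

    # Illustrative sketch: common Spark-on-YARN tuning knobs (example values only).
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("example-tuning-job")                    # placeholder job name
        .master("yarn")                                   # run on the Hadoop cluster via YARN
        .config("spark.executor.instances", "8")          # parallelism across the cluster
        .config("spark.executor.memory", "4g")            # heap per executor
        .config("spark.executor.cores", "2")              # concurrent tasks per executor
        .config("spark.sql.shuffle.partitions", "200")    # shuffle width for joins/aggregations
        .getOrCreate()
    )

    # A trivial job just to exercise the configuration.
    df = spark.range(1_000_000)
    print(df.selectExpr("sum(id)").collect())
    spark.stop()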

 

Summary:

Responsible for the implementation and ongoing administration of a Hadoop infrastructure on an Oracle Big Data Appliance; working with the infrastructure, network, database, application and business intelligence teams to guarantee high availability; implementing and supporting enterprise security standards on a Hadoop cluster; and tuning the performance of Hadoop clusters and Hadoop MapReduce or Spark routines.



This facility is an equal opportunity employer and complies with federal, state and local anti-discrimination laws, regulations and ordinances.
