
  • Posted: Feb 1, 2022
    Deadline: Feb 7, 2022

    Integrated Staffing and Training Limited was formed in response to demand for more flexible, cost-effective, and tailored recruitment services. The company seeks to bridge the gap between employers and job seekers in various industries.

     

    Big Data Analyst

    Our client in the IT industry is seeking to hire a Big Data/Kubernetes/Hadoop Administrator (DevOps) to design and implement data solutions.

    Responsibilities

    1. Designing and implementing Big Data solutions that leverage a Kubernetes cluster.
    2. Configuring hardware, peripherals, and services; managing settings and storage; deploying cloud-native applications; and monitoring and supporting a Kubernetes environment.
    3. Deploying and maintaining a Hadoop cluster, adding and removing nodes using cluster monitoring tools such as Kubernetes Cluster Manager, Ambari, and Apache Airflow, configuring NameNode high availability, and keeping track of all running Big Data jobs.
    4. Implementing, managing, and administering the overall Hadoop infrastructure.
    5. Taking care of the day-to-day running of Hadoop clusters.
    6. Working closely with the database, network, BI, and application teams to ensure that all Big Data applications are highly available and performing as expected.
    7. Capacity planning and estimating the requirements for lowering or increasing the capacity of the Hadoop cluster.
    8. Deciding the size of the Hadoop cluster based on the data to be stored in HDFS.
    9. Ensuring that the Hadoop cluster is up and running at all times.
    10. Monitoring cluster connectivity and performance.
    11. Managing and reviewing Hadoop log files.
    12. Performing backup and recovery tasks.
    13. Handling resource and security management.
    14. Troubleshooting application errors and ensuring that they do not recur.
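    The capacity-planning and cluster-sizing duties (items 7 and 8) come down to a simple calculation: raw HDFS capacity is the expected data volume times the replication factor, plus headroom. A minimal sketch with assumed figures (none of these numbers come from the posting):

    ```shell
    # HDFS sizing sketch: raw capacity = data volume * replication * (1 + overhead).
    # All figures below are assumptions for illustration only.
    data_tb=100          # expected data volume in TB (assumed)
    replication=3        # HDFS default block replication factor
    overhead_pct=25      # headroom for temporary data, logs, OS (assumed)
    raw_tb=$(( data_tb * replication * (100 + overhead_pct) / 100 ))
    echo "Raw HDFS capacity needed: ${raw_tb} TB"
    ```

    With 3x replication and 25% headroom, 100 TB of data calls for roughly 375 TB of raw disk, which is then divided by per-node storage to get a node count.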

     Skills

    1. Excellent knowledge of UNIX/Linux operating systems.
    2. Excellent knowledge of cloud technology and microservices.
    3. Knowledge of cluster monitoring tools such as Kubernetes (K8s), Ambari, Ganglia, or Nagios.
    4. Knowledge of troubleshooting core Java applications is a plus.
    5. Good understanding of OS concepts, process management, and resource scheduling.
    6. Basics of networking, CPU, memory, and storage.
    7. Strong shell-scripting skills.
    8. Working knowledge of the components in the Hadoop ecosystem, such as HDFS, Apache Hive, Apache HBase, Apache Airflow, Apache NiFi, Apache Kafka, and Apache Spark.
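    The shell-scripting skill (item 7) and the log-review duty typically meet in small scripts like the sketch below. The log lines are simplified, made-up samples, not real Hadoop output:

    ```shell
    # Simplified log-review sketch: count ERROR entries in a sample log file.
    # Real Hadoop logs live under $HADOOP_LOG_DIR; these lines are fabricated.
    cat > sample.log <<'EOF'
    ERROR datanode.DataNode: disk failure on /data/3
    INFO  namenode.NameNode: checkpoint complete
    ERROR datanode.DataNode: disk failure on /data/3
    EOF
    errors=$(grep -c '^ERROR' sample.log)
    echo "ERROR lines found: ${errors}"
    rm -f sample.log
    ```

    The same pattern extends to cron-driven checks that page an operator when the error count crosses a threshold.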

    Method of Application

    Interested and qualified candidates should forward their CV to vacancies@integratedstaffingtl.com, using the position as the subject of the email.

