Job Details

Hadoop Administration

  2025-10-31     SonSoft     San Mateo, CA
Description:

Sonsoft, Inc. is a U.S.-based corporation duly organized under the laws of the State of Georgia. Sonsoft Inc. is growing at a steady pace, specializing in Software Development, Software Consultancy, and Information Technology Enabled Services.

Job Description

  • At least 4 years of experience in the implementation and administration of Hadoop infrastructure
  • At least 2 years of experience architecting, designing, implementing, and administering Hadoop infrastructure
  • At least 2 years of experience in Project life cycle activities on development and maintenance projects.
  • Should be able to provide consultancy to client and internal teams on which product/flavor best fits a given situation and setup
  • Operational expertise in troubleshooting; understanding of systems capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networks
  • Hadoop, MapReduce, HBase, Hive, Pig, Mahout
  • Hadoop Administration skills: experience with Cloudera Manager or Ambari, plus tools like Ganglia and Nagios
  • Experience in using Hadoop Schedulers - FIFO, Fair Scheduler, Capacity Scheduler
  • Experience in Job Schedule Management - Oozie or Enterprise Schedulers like Control-M, Tivoli
  • Good knowledge of Linux (RHEL, Centos, Ubuntu)
  • Experience in setting up AD/LDAP/Kerberos authentication models
  • Experience in data encryption techniques

Responsibilities:-

  • Upgrades and Data Migrations
  • Hadoop Ecosystem and Clusters maintenance as well as creation and removal of nodes
  • Perform administrative activities with Cloudera Manager/Ambari and tools like Ganglia, Nagios
  • Setting up and maintaining Infrastructure and configuration for Hive, Pig and MapReduce
  • Monitor Hadoop Cluster Availability, Connectivity and Security
  • Setting up Linux users, groups, Kerberos principals and keys
  • Aligning with the Systems engineering team in maintaining hardware and software environments required for Hadoop
  • Software installation, configuration, patches and upgrades
  • Working with data delivery teams to setup Hadoop application development environments
  • Performance tuning of Hadoop clusters and Hadoop MapReduce routines
  • Screen Hadoop cluster job performances and capacity planning
  • Data modelling, Database backup and recovery
  • Manage and review Hadoop log files
  • File system management, disk space management and monitoring (Nagios, Splunk, etc.)
  • HDFS support and maintenance
  • Planning of Back-up, High Availability and Disaster Recovery Infrastructure
  • Diligently teaming with Infrastructure, Network, Database, Application and Business Intelligence teams to guarantee high data quality and availability
  • Collaborating with application teams to install operating system and Hadoop updates, patches and version upgrades
  • Implementation of Strategic Operating model in line with best practices
  • Point of Contact for Vendor escalations
  • Ability to work in team in diverse/ multiple stakeholder environment
  • Analytical skills

Qualifications

  • Bachelor's degree or foreign equivalent required from an accredited institution. Three years of progressive experience in the specialty will also be considered in lieu of every year of education.
  • At least 7 years of experience in Information Technology.

Additional Information

U.S. citizens and those authorized to work in the U.S. are encouraged to apply. We are unable to sponsor at this time.

Note:-

  • This is a Full-Time, Permanent job opportunity for you.
  • Only US Citizens, Green Card Holders, and GC-EAD, H4-EAD & L2-EAD holders can apply.
  • No OPT-EAD, TN Visa & H1B consultants, please.
  • Please mention your Visa Status in your email or resume.



Apply for this Job

Please use the APPLY HERE link below to view additional details and application instructions.

Apply Here
