Cloudera Hadoop Administrator - Core Technology Infrastructure
Company
Bank of America
Location
Charlotte, NC
Type
Full Time
Job Description
- Defines an architectural vision and solution architecture aligned with the enterprise architecture strategy and with technology and platform choices.
- Describes the solution intent and its operating environment, determines the primary systems/subsystems and their interfaces, and defines non-functional requirements and the architectural runway needed to support new epics/features.
- Ensures the solution is fit for purpose and use by working with stakeholders and vendors/service providers and by evaluating the impact of strategic design decisions.
- Works across business and technology to create the solution intent and architectural vision and evolves it into an emerging backlog.
- Consults with the business to clearly understand the business problems, and with technology teams to understand the technical challenges of the solution, finding creative solutions through practical experiments and POCs.
- Leads rapid shaping of a high-level architecture, filling in details as business requirements emerge; ensures the architecture is flexible, modular, and designed to adapt easily.
- Uses defined best practices, templates, and documentation to create architectural designs; suggests improvements to those best practices and templates based on practical knowledge.
- Works with the Product Manager/Owner to plan and prioritize technology-focused backlog items for the architecture runway that enable business epics/features.
- Clarifies the architecture for development teams to support implementation, and provides solution options to resolve any architectural impediments.
- Performs design and code reviews to ensure all non-functional requirements for a solution (e.g., security, performance, maintainability, scalability, usability, and reliability) are sufficiently met.
- Educates team members on technology practices, standardization strategies, and best practices to create innovative solutions.
- Individual contributor role.
Position Summary:
The Cloudera Hadoop Administrator is responsible for administering the full Hadoop stack, including application integration, performance management, security implementation, configuration management, and problem management across an array of services and functions at the platform and host level.
Required Skills:
- The ideal candidate holds a Cloudera certification
- Degree in Computer Science, Electronics, Communication, or Information Technology
- Minimum of 5 years of practical experience on enterprise platforms
- Experience with multiple large scale Enterprise Hadoop environment builds and operations including design, capacity planning, cluster set up, security, performance tuning and monitoring
- Experience with the full Cloudera CDP distribution to install, configure and monitor all services in the CDP stack
- Strong understanding of core Cloudera Hadoop services such as HDFS, MapReduce, Kafka, Spark Streaming, Hive on Tez, Impala, HBase, Sqoop, Ranger, Phoenix, Solr, and Oozie
- Experience importing and exporting terabytes of data between HDFS/Hive/HBase and relational database systems, using Sqoop and other ETL tools such as Informatica and DataStage
- Experience administering and supporting RHEL Linux operating systems, databases, and hardware in an enterprise environment
- Expertise in typical system administration and programming tasks such as storage capacity management, debugging, and performance tuning
- Proficiency in shell scripting (e.g., Bash, ksh)
- Experience in the setup, configuration, and management of security for Hadoop clusters using Kerberos integrated with LDAP/AD at an enterprise level
- Experience scaling enterprise data into the Hadoop ecosystem
- Working knowledge in configuring Apache NiFi
- Experience with the design, build, backup, recovery, high availability, and fine-tuning of the SingleStore (MemSQL) database
Desired Skills:
- Expertise in writing Python scripts and debugging existing scripts
- Enterprise database administration platform experience
- Experience with Hadoop distributions in the cloud, such as Microsoft Azure and IBM Cloud
- Experience with large-scale analytic tools, including SAS, search, machine learning, and log aggregation
- Experience with WANdisco
- Experience with Apache Knox
Shift:
1st shift (United States of America)
Hours Per Week:
40
Date Posted
11/22/2022