Dallas, United States
As a Senior DevOps/Big Data Engineer, you will be responsible for driving the development of applications and solutions that maximize the effective use of available data by users across the business. This requires both a strategic and a technical mindset to shape the future of the solutions teams.
Location: Dallas, Texas
Hadoop/Big Data Experience
- In-depth knowledge of capacity planning, performance management, and troubleshooting of various Hadoop components.
- In-depth understanding of Hadoop storage frameworks (HDFS, HBase, etc.) and compute frameworks (Spark, MapReduce, etc.)
- Familiarity and experience with tools such as Sqoop, Hive, Impala, and Solr
- Understanding of various aspects of Hadoop security, such as Kerberos, Sentry, LDAP integration tools (SSSD, Centrify, QAS, etc.), and encryption (TLS/SSL).
- Cloudera Certified Hadoop Administrator preferred.
- Cloudera Certified Hadoop Developer is a nice-to-have
- Experience with at least one of the following programming languages: Java, Scala, C++.
- Experience with at least one of the following scripting languages: Python, Bash, Perl
- Familiarity with RDBMS platforms (MySQL, Oracle, etc.) and SQL is preferred
- Experience using build tools such as Ant or Maven
- Experience with software dependency management and open source libraries
- Familiarity with version control, job scheduling, and configuration management tools such as JIRA, Ansible, GitHub, Jenkins, etc.
- Strong understanding and experience with Linux internals and administration
- Strong understanding of Linux filesystem management, networking, and resource management.
- Experience with Linux security integration solutions such as SSSD, Centrify, etc.
- Experience with software package management tools (yum, apt-get, rpm, etc.)
- Experience with RHEL is preferred
- Understanding of master data management solutions
- Strong customer-facing communication and careful listening skills. Proven success in and genuine enthusiasm for working directly with customer technical teams.