- Work directly with the customer’s technical resources to devise and recommend solutions based on their understood requirements
- Analyse complex distributed production deployments and make recommendations to optimise performance
- Document and present complex architectures to the customer’s technical teams
- Work closely with Hortonworks’ teams at all levels to help ensure the success of project consulting engagements with customer
- Deploy, augment, upgrade and operate large Hadoop clusters
- Write and produce technical documentation and knowledge-base articles
- Keep current with the Hadoop Big Data ecosystem technologies
- Strong understanding of enterprise security practices and solutions such as LDAP and/or Kerberos
- Strong understanding of network configuration, devices, protocols, speeds and optimisations
- Experience using configuration management tools such as Ansible, Puppet or Chef
- 5+ years designing and deploying 3-tier architectures or large-scale Hadoop solutions
- Familiarity with scripting tools such as bash shell scripts, Python and/or Perl
- Significant previous work writing to network-based APIs, preferably REST/JSON or XML/SOAP
- Understanding of the Java ecosystem and enterprise offerings, including debugging and profiling tools (e.g. jstack, jmap, jconsole), logging and monitoring tools (log4j, JMX)
- Experience implementing data transformation and processing solutions using Apache Pig
- Experience designing data queries against data in the HDFS environment using tools such as Apache Hive
- Experience implementing MapReduce jobs
- Experience setting up multi-node Hadoop clusters
- Ability to understand and translate customer requirements into technical requirements
- Excellent verbal and written communication skills
Nice to have, but not required:
- Site Reliability Engineering concepts and practices
- Ability to understand big data use-cases, and recommend standard design patterns commonly used in Hadoop-based deployments.
- Knowledge of the data management eco-system including: Concepts of data warehousing, ETL, data integration, etc.
All employees are required to adhere to all Hortonworks employment policies, including without limitation information security policies.
ABOUT HORTONWORKS (NASDAQ: HDP)
Having gone public with its IPO in December 2014, Hortonworks is experiencing extraordinary growth as we deliver essential support to the burgeoning big data and IoT communities. Hortonworks is an industry-leading innovator that creates, distributes and supports enterprise-ready open data platforms and modern data applications that deliver actionable intelligence from all data: data-in-motion and data-at-rest. Hortonworks is focused on driving innovation in open source communities such as Apache Hadoop, Apache NiFi and Apache Spark. Along with our 2,100+ partners, Hortonworks provides the expertise, training and services that allow customers to unlock transformational value for their organizations across any line of business.
For more information, visit www.hortonworks.com.
Hortonworks, Powering the Future of Data, HDP and HDF are registered trademarks or trademarks of Hortonworks, Inc. and its subsidiaries in the United States and other jurisdictions.