The Senior Consultant forms part of the Professional Services organisation. This role works predominantly with government customers on complex projects that enable our customers on their Big Data journey, engaging from Proof of Concept (POC) stages through to the implementation of complex distributed production environments. You will work collaboratively with customers to optimize performance, develop reference architectures, and form part of a team that fosters a long-standing collaborative relationship with our customer group.
- Drive POCs with customers to successful completion
- Analyze complex distributed production deployments, and make recommendations to optimize performance
- Help develop reference Hadoop architectures and configurations
- Write and produce technical documentation and knowledge base articles
- Work directly with prospective customers' technical resources to devise and recommend solutions based on the understood requirements
- Participate in the pre- and post-sales process, helping both the sales and product teams to interpret customers' requirements
- Work closely with Hortonworks teams at all levels to ensure rapid response to customer questions and projects
- Play an active role within the Open Source Community
- More than five years of Professional Services (customer-facing) experience architecting large-scale storage, data center, and/or globally distributed solutions
- SC Clearance Preferred
- 2+ years designing and deploying 3-tier architectures or large-scale Hadoop solutions
- Experience working with Apache Hadoop, including knowledge of how to create and debug Hadoop jobs
- Ability to understand big data use-cases, and recommend standard design patterns commonly used in Hadoop-based deployments
- Ability to understand and translate customer requirements into technical requirements
- Experience implementing data transformation and processing solutions using Apache Pig
- Experience designing queries against data in HDFS using tools such as Apache Hive
- Experience implementing MapReduce jobs
- Experience setting up multi-node Hadoop clusters
- Strong experience implementing software and/or solutions in enterprise Linux or Unix environments
- Strong understanding of the Java ecosystem and enterprise offerings, including debugging and profiling tools (e.g. JConsole), logging and monitoring tools (log4j, JMX), and security offerings (Kerberos/SPNEGO)
- Familiarity with scripting tools such as Bash shell scripts, Python, and/or Perl
- Hortonworks Certifications are an advantage but not essential
- Experience in systems administration or DevOps on one or more open-source operating systems
- Experience using configuration management tools such as Ansible, Puppet or Chef
All employees are required to adhere to all Hortonworks employment policies, including without limitation information security policies.
ABOUT HORTONWORKS (NASDAQ: HDP)
Having gone public with its IPO in December 2014, Hortonworks is experiencing extraordinary growth as we deliver essential support to the burgeoning big data and IoT communities. Hortonworks is an industry leading innovator that creates, distributes and supports enterprise-ready open data platforms and modern data applications that deliver actionable intelligence from all data: data-in-motion and data-at-rest. Hortonworks is focused on driving innovation in open source communities such as Apache Hadoop, Apache NiFi and Apache Spark. Along with our 2,100+ partners, Hortonworks provides the expertise, training, and services that allow customers to unlock transformational value for their organizations across any line of business.
For more information, visit www.hortonworks.com.
Hortonworks, Powering the Future of Data, HDP and HDF are registered trademarks or trademarks of Hortonworks, Inc. and its subsidiaries in the United States and other jurisdictions.