Hortonworks is seeking an experienced Sr. Consultant to join our team in the Central region. This key role has two major responsibilities: first, to work directly with our customers and partners to optimize their plans and objectives for architecting, designing, and deploying Apache Hadoop environments; and second, to assist in building and designing reference configurations that enable our customers and influence our product.
The Sr. Consultant will facilitate the communication flow between Hortonworks teams and the customer. For this strategically important role, we are seeking outstanding talent to join our team.
- Work directly with customer’s technical resources to devise and recommend solutions based on the understood requirements
- Analyze complex distributed production deployments, and make recommendations to optimize performance
- Document and present complex architectures for customers' technical teams
- Work closely with Hortonworks’ teams at all levels to help ensure the success of project consulting engagements with customers
- Help design and implement Hadoop architectures and configurations for customers
- Drive projects with customers to successful completion
- Write and produce technical documentation and knowledge base articles
- Participate in the pre- and post-sales process, helping both the sales and product teams interpret customers’ requirements
- Keep current with the Hadoop Big Data ecosystem technologies
- Attend speaking engagements when needed
- Travel up to 75%
- More than two years of Professional Services (customer-facing) experience architecting large-scale storage, data center, and/or globally distributed solutions
- Experience designing and deploying large-scale production Hadoop solutions
- Ability to understand and translate customer requirements into technical requirements
- Experience designing data queries against data in a Hadoop environment using tools such as Apache Hive, Apache Druid, Apache Phoenix or others.
- Experience installing and administering multi-node Hadoop clusters
- Strong experience implementing software and/or solutions in the enterprise Linux or Unix environment
- Strong understanding of enterprise security solutions such as LDAP and/or Kerberos
- Strong understanding of network configuration, devices, protocols, speeds and optimizations
- Strong understanding of Java development, debugging, and profiling
- Significant previous work writing to network-based APIs, preferably REST/JSON or XML/SOAP
- Solid background in database administration and design, along with data modeling using star schemas, slowly changing dimensions, and/or data capture
- Experience in architecting data center solutions – properly selecting server and storage hardware based on performance, availability and ROI requirements
- Demonstrated experience implementing big data use cases and an understanding of standard design patterns commonly used in Hadoop-based deployments
- Excellent verbal and written communications
Nice-to-have, but not required, experience:
- Knowledge of the data management ecosystem, including concepts of data warehousing, ETL, data integration, etc.
- Hortonworks Certified - Admin and/or Developer or Data Science experience
- Familiarity with Data Science notebooks such as Apache Zeppelin, Jupyter or IBM DSX
- Demonstrable experience implementing machine learning algorithms using R or TensorFlow
- Automation experience with Chef, Puppet, Jenkins or Ansible
- Familiarity with scripting tools such as bash shell scripts, Python and/or Perl
- Experience with Cloud Platforms & deployment automation
All employees are required to adhere to all Hortonworks employment policies, including without limitation information security policies.
ABOUT HORTONWORKS (NASDAQ: HDP)
Since its IPO in December 2014, Hortonworks has experienced extraordinary growth as we deliver essential support to the burgeoning big data and IoT communities. Hortonworks is an industry-leading innovator that creates, distributes and supports enterprise-ready open data platforms and modern data applications that deliver actionable intelligence from all data: data-in-motion and data-at-rest. Hortonworks is focused on driving innovation in open source communities such as Apache Hadoop, Apache NiFi and Apache Spark. Along with our 2,100+ partners, Hortonworks provides the expertise, training and services that allow customers to unlock transformational value for their organizations across any line of business.
For more information, visit www.hortonworks.com.
Hortonworks, Powering the Future of Data, HDP and HDF are registered trademarks or trademarks of Hortonworks, Inc. and its subsidiaries in the United States and other jurisdictions.
EOE/M/F/Vet/Disability. Hortonworks is an Equal Opportunity Employer in accordance with VEVRAA and Section 503 regulations. http://www1.eeoc.gov/employers/upload/eeoc_self_print_poster.pdf