System Architect - Professional Services APAC & Middle East
Hortonworks is seeking a System Architect to join its APAC & Middle East Professional Services team. In this role you'll develop massively scalable solutions to solve complex data problems using Hadoop, NiFi, Spark and related Big Data technology. This is a client-facing role that combines consulting skills with deep technical design and development in the Big Data space. The successful candidate will have the opportunity to travel across Asia Pacific and the Middle East, working with large customer organizations in multiple industries.
- Work directly with customers to implement Big Data solutions at scale using the Hortonworks Data Platform and Hortonworks Dataflow
- Design and implement Hadoop and NiFi platform architectures and configurations for customers
- Perform platform installation and upgrades for advanced secured cluster configurations
- Analyze complex distributed production deployments, and make recommendations to optimize performance
- Document and present complex architectures to customers' technical teams
- Work closely with Hortonworks' teams at all levels to help ensure the success of project consulting engagements with customers
- Drive projects with customers to successful completion
- Write and produce technical documentation, blogs and knowledgebase articles
- Participate in the pre- and post-sales process, helping both the sales and product teams to interpret customers' requirements
- Keep current with the Hadoop Big Data ecosystem technologies
- Attend speaking engagements when needed
- Travel up to 75%
- 10+ years in Information Technology and System Architecture experience
- 5+ years of Professional Services (customer-facing) experience architecting large-scale storage, data center, and/or globally distributed solutions
- 5+ years designing and deploying three-tier architectures or large-scale Hadoop solutions
- Ability to understand big data use-cases and recommend standard design patterns commonly used in Hadoop-based and streaming data deployments.
- Knowledge of the data management ecosystem, including concepts of data warehousing, ETL, and data integration
- Ability to understand and translate customer requirements into technical requirements
- Experience implementing data transformation and processing solutions
- Experience designing data queries against data in the HDFS environment using tools such as Apache Hive
- Experience setting up multi-node Hadoop clusters
- Experience configuring cluster security (LDAP/AD, Kerberos/SPNEGO)
- Experience with Hortonworks software and/or HDP certification (HDPCA/HDPCD) is a plus
- Strong experience implementing software and/or solutions in the enterprise Linux environment
- Strong understanding of various enterprise security solutions such as LDAP and/or Kerberos
- Strong understanding of network configuration, devices, protocols, speeds and optimizations
- Strong understanding of the Java ecosystem including debugging, logging, monitoring and profiling tools
- Familiarity with scripting and automation tools such as Bash shell scripts, Python and/or Perl, Ansible, Chef, and Puppet
- Solid background in Database administration or design
- Excellent verbal and written communications
- Experience architecting data center solutions, including properly selecting server and storage hardware based on performance, availability, and ROI requirements
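As an illustration of the Hive and HDFS query experience described above, the following minimal sketch exposes raw delimited files already sitting in HDFS as a Hive external table and runs a typical analytical query against them in place. The table name, schema, and HDFS path are hypothetical, chosen only for the example:

```sql
-- Hypothetical example: expose raw CSV files in HDFS as a Hive external table.
-- EXTERNAL means Hive reads the files where they are; dropping the table
-- does not delete the underlying data.
CREATE EXTERNAL TABLE web_logs (
  event_time STRING,
  user_id    STRING,
  url        STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/raw/web_logs';

-- A typical analytical query against the data in place:
SELECT url, COUNT(*) AS hits
FROM web_logs
GROUP BY url
ORDER BY hits DESC
LIMIT 10;
```

In a secured HDP deployment, queries like this would run only after Kerberos authentication and subject to the cluster's authorization policies, which ties this requirement to the security experience listed above.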
Security roles and responsibilities shall include the following requirements:
- implement and act in accordance with the organization's information security policies;
- protect assets from unauthorized access, disclosure, modification, destruction or interference;
- execute particular security processes or activities;
- ensure responsibility is assigned to the individual for actions taken; and
- report security events or potential events or other security risks to the organization.
All employees are required to adhere to all Hortonworks employment policies, including without limitation information security policies.
ABOUT HORTONWORKS (NASDAQ: HDP)
Having gone public with its IPO in December 2014, Hortonworks is experiencing extraordinary growth as we deliver essential support to the burgeoning big data and IoT communities. Hortonworks is an industry leading innovator that creates, distributes and supports enterprise-ready open data platforms and modern data applications that deliver actionable intelligence from all data: data-in-motion and data-at-rest. Hortonworks is focused on driving innovation in open source communities such as Apache Hadoop, Apache NiFi and Apache Spark. Along with our 2,100+ partners, Hortonworks provides the expertise, training, and services that allow customers to unlock transformational value for their organizations across any line of business.
For more information, visit www.hortonworks.com.
Hortonworks, Powering the Future of Data, HDP and HDF are registered trademarks or trademarks of Hortonworks, Inc. and its subsidiaries in the United States and other jurisdictions.