Hadoop Evangelist
=================

Description

A consultant with extensive prior experience designing and building applications in procedural languages, most recently in the Hadoop space. This person must be comfortable explaining design concepts to customers (the art of what is possible) and capable of managing a team of developers.

Work place: one of the countries within Eastern Europe (Austria, Czech Republic, Poland, Hungary, Russia)

Responsibilities:
- Hands-on experience designing, developing, and maintaining complex software solutions in the Hadoop space
- Strong understanding of algorithms, software engineering, and design patterns
- Ability to build large-scale, high-performance Hadoop systems
- Participate in the pre- and post-sales process, helping both the sales and product teams interpret customers' requirements
- Work directly with prospective customers' technical resources to devise and recommend solutions based on the understood requirements
- Analyze complex distributed production deployments and make recommendations to optimize performance
- Ability to quickly learn and adapt in a demanding and rapidly changing environment
- Work closely with Hortonworks' teams at all levels to ensure rapid response to customer questions and project blockers
- Assist in POCs and pilot projects with customers
- As a design authority, lead a team of Hadoop developers

Qualifications:
- More than three years of Professional Services (customer-facing) experience architecting large-scale storage, data center, and/or globally distributed solutions, as well as 2+ years designing and deploying three-tier architectures or large-scale Hadoop solutions
- 5+ years of Java product development experience, some of which should be in the capacity of managing a development team
- Ability to understand and translate customer requirements into technical requirements
- Strong experience implementing software and/or solutions in an enterprise Linux or Unix environment
- Ability to compile and install Linux applications from source, including the Linux kernel and kernel modules
- Experience integrating solutions such as LDAP or system/installation management tools into the overall solution
- Strong understanding of network configuration, devices, protocols, speeds, and optimizations
- Python, Perl, or another scripting language required
- Familiarity with the Java ecosystem and enterprise offerings, including debugging and profiling tools (jconsole), logging and monitoring tools (log4j, JMX), and security offerings (Kerberos/SPNEGO)
- Significant previous work writing to network-based APIs, preferably REST/JSON or XML/SOAP

Nice to have, but not required:
- Experience working with Apache Hadoop, including:
  - Knowledge of how to create and debug Hadoop jobs
  - Ability to understand big data use cases and recommend standard design patterns commonly used in Hadoop-based deployments
- Knowledge of the data management ecosystem, including concepts of data warehousing, ETL, data integration, etc.

Job: Services/Consulting
Schedule: Full-time
Primary Location: Europe, Middle East & Africa-Czech Republic-Czech Republic-Prague
Organization: EMEA Sales & Services
Posted on: Tue, 16 Dec 2014 07:54:04 +0000
