The Hadoop Developer Lead will provide technical leadership and design support to the US architect, interface with BI business stakeholders to understand business requirements, and partner with them to build and deploy the product.
7+ years’ experience with data warehousing, data modeling, and enterprise BI ETL development (Informatica, IBM DataStage, etc.).
3+ years of solid experience working with Big Data technologies such as Apache Hadoop, Spark, Kafka, Hive, and Impala. Experience with Cloudera's Hadoop platform is preferred.
3+ years' experience with enterprise BI reporting solutions, preferably Cognos reporting tools (Report Studio, Framework Manager), Tableau, and Power BI.
3+ years' experience with Agile methodologies in a business intelligence or data modeling/warehousing context (e.g., the Kimball methodology).
Basic understanding of Java programming language is a plus.
Knowledge of Python and R, along with machine learning concepts, is preferred.
Strong experience using SQL in a DB2, Oracle, or SQL Server environment; the specific platform is not important.
Foundational knowledge of UNIX- and Windows-based BI platforms is a big plus.
Understanding of application architecture and technology infrastructure preferred.
Able to learn technical concepts quickly and apply them effectively in the workplace.
Able to adapt to changing business requirements and react quickly.
Strong customer focus and a results-oriented attitude.
Self-motivated individual, able to work independently and manage numerous deliverables in parallel.
Bachelor's degree in Computer Science / Engineering or equivalent.