- Experience:
+ Experience designing and building dimensional data models and ETL processes, applying data warehouse concepts and methodologies, optimizing data pipelines, and assisting the architect as needed.
+ Experience monitoring complex systems and resolving data and systems issues with a consistent, algorithmic approach.
+ Experience working in Agile teams to support digital transformation projects, with a clear understanding of Agile principles, practices and Scrum methodologies.
- Request:
+ Bachelor's or Master's degree in Statistics, Mathematics, Quantitative Analysis, Computer Science, Software Engineering or Information Technology.
+ 1 to 2 years of relevant experience developing, debugging and scripting with big data technologies (e.g. Hadoop, Spark, Flink, Kafka, Arrow, Tableau), database technologies (e.g. SQL, NoSQL, graph databases), and programming languages (e.g. Python, R, Scala, Java, Rust, Kotlin), with a preference for functional/trait-oriented languages.
+ Deep understanding of Information Security principles to ensure compliant handling and management of all data.
+ Some scripting and coding experience to set up, configure and maintain a machine learning model development environment.
+ Some experience architecting, coding and delivering high-performance microservices and/or recommender systems serving recommendations to a large user base.