Solid SQL/PL-SQL skills and a strong understanding of data quality and cleansing strategies.
Proficiency in areas such as metadata management, master data management, and data governance.
Design and set up streaming data pipeline infrastructure.
Responsible for the performance, speed, scalability, and extensibility of any application that uses the pipeline.
Determine database structural requirements by analyzing data-access patterns, client operations, and applications; reviewing objectives with clients; and evaluating current systems.
Understand and communicate how data models can support, interact with, and integrate with other information systems.
Lead the analysis of current data management technologies and platforms to detect critical deficiencies and recommend improvements. In addition, lead the impact analysis of new technologies and market trends on the current data architecture.
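The streaming-pipeline responsibilities above can be illustrated with a minimal sketch. This is a hypothetical example only: a real deployment would read from a broker such as Kafka, and the `source`, `cleanse`, and `sink` stages here are stand-ins, not a prescribed design.

```python
# Hypothetical sketch of a streaming pipeline with a data-quality stage.

def source():
    """Stand-in for a streaming source; yields raw events."""
    raw = [
        {"user": "alice", "amount": "10"},
        {"user": "", "amount": "5"},        # missing user -> rejected below
        {"user": "bob", "amount": "7"},
    ]
    for event in raw:
        yield event

def cleanse(events):
    """Drop records that fail basic quality rules; cast types."""
    for e in events:
        if e["user"]:                        # data-quality rule: user required
            yield {"user": e["user"], "amount": int(e["amount"])}

def sink(events):
    """Aggregate per user; a real sink would write to a datastore."""
    totals = {}
    for e in events:
        totals[e["user"]] = totals.get(e["user"], 0) + e["amount"]
    return totals

if __name__ == "__main__":
    print(sink(cleanse(source())))   # {'alice': 10, 'bob': 7}
```

Chaining generators keeps each stage independently testable and extensible, which is the property the role is accountable for.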
Education and Experience
Bachelor's Degree or higher in Statistics, Computer Science, Information Technology or related fields.
At least one year of hands-on experience in a data expert role such as Data Scientist, Data Engineer, or DBA.
Knowledge, Skills, and Training Requirements
Able to build large-scale distributed products.
Experience with ETL processes using tools such as SAS DI, Oracle Warehouse Builder, or Microsoft SSIS.
Experience with Big Data technologies and their ecosystem.
Experience with various database technologies (e.g. time series/metrics databases, column-oriented datastores, key-value datastores).
Innovative problem-solving skills with the ability to identify and resolve complex architectural issues.
Experience with Scala and/or other functional programming languages, or at least a desire to learn.