Job no: 493162
Work type: Permanent
Categories: Infocomm Tech
You will work in the Data Engineering team in the Infocomm Technology & Data Division to:
• Design, develop, test, deploy and maintain data pipelines (ETL) on the Enterprise Data Warehouse and Big Data Platform
• Design and develop the API/Web Services framework for curating new datasets, whether internal or external (Internet), and for interfacing with other systems (both internal and external)
• Explore and source new datasets to address emerging business use case needs
• Maintain data quality and continually improve data processing efficiency
Requirements:
• Possess a good Bachelor’s degree in Computer Science or Computer Engineering; a specialisation in Software Engineering will be advantageous.
• Those with 2-3 years of related work experience will be preferred.
• Good grasp of Software Engineering principles such as requirements gathering (both functional and non-functional) and modular, reusable design.
• Proficient in ETL using programming languages/tools such as Python, SSIS and/or Informatica PowerCenter.
• Able to develop data applications, including integration with ICT systems, and to build APIs and web applications using Java and/or Python.
• Familiarity with MS SQL, PostgreSQL or Oracle is preferred.
• There will be an added advantage for any of the following:
o Proficient in Data Modelling and Data Mining.
o Experience in designing and building scalable database schema for applications.
o Understanding of Object-Oriented Design.
o Knowledge of, or prior work experience on, Big Data platforms such as Hadoop or Spark.
o Experience in cloud environment setup using Microsoft Azure.