
ING
Key Responsibilities
- Implement and maintain data workflows and ETL/ELT pipelines that move data into target platforms for consumption and integration, as required by product owners.
- Design and implement data processes and storage, including databases, data warehouses, data marts, and data lakes, with data security in mind.
- Build the tooling and procedures needed for data movement and monitoring.
- Automate data movement processes wherever possible.
- Test delivered workflows and processes to ensure they meet expected quality standards.
- Document the specifications and implementations of the deliverables above.
- Work with operations teams on the maintenance and continuous improvement of workflows and pipelines.
- Work on distributed data processing and databases, including distribution and partitioning.
- Convert logical data models (LDM) to physical data models (PDM) in a Data Vault data warehouse.
- Perform tasks across the end-to-end data journey, including reporting.
- Ensure that integrated data pipelines are validated.
Key Capabilities/Experience
- Expertise in one or more ETL tools, such as Azure Data Factory (ADF) or Airflow.
- Expertise in SQL and relational databases, and in handling structured data.
- Extensive experience designing and implementing data handling processes (ingestion, transformation, modelling, and storage).
- Experience designing and building big data warehouses using various implementation methods.
- Flexibility and willingness to learn various ETL and data integration tools and products.
- Strong problem-solving and solutions engineering mindset.
- Experience working and coordinating remotely with teams and stakeholders.
- Experience with data integration and ETL technologies.
- Experience with software version control tools such as Git.
- Experience with CI/CD and DevOps is a plus.
- Experience with Agile ways of working is a plus.
Minimum Qualifications
- Expertise in one or more ETL tools, such as Azure Data Factory (ADF) or Airflow.
- Expertise in SQL and relational databases, and in handling structured data.
- Extensive experience designing and implementing data handling processes (ingestion, transformation, modelling, and storage).
- Experience designing and building big data warehouses using various implementation methods.
Nice to have
- Proficiency and experience in programming languages such as Python and shell scripting.
- Experience with cloud platforms and technologies (Azure, AWS, GCP).