84 ETL Engineer jobs in the Philippines
ETL Engineer
Posted today
Job Description
Discover your 100% YOU with MicroSourcing
Position: ETL Analyst
Location: Eastwood Libis, Quezon City
Work setup & shift: Hybrid | Nightshift (8pm-5am PH)
**Why join MicroSourcing? You'll have:**
- Competitive Rewards: Enjoy above-market compensation, healthcare coverage from day one for you plus one or more dependents, paid time off with cash conversion, group life insurance, and performance bonuses.
- A Collaborative Spirit: Contribute to a positive and engaging work environment by participating in company-sponsored events and activities.
- Work-Life Harmony: Enjoy the balance between work and life that suits you with flexible work arrangements.
- Career Growth: Take advantage of opportunities for continuous learning and career advancement
- Inclusive Teamwork: Be part of a team that celebrates diversity and fosters an inclusive culture.
Your Role:
Design, build, and maintain scalable ETL/ELT data pipelines that extract, transform, and load data from multiple sources into internal systems for use by client-facing applications and operational reporting. This role focuses entirely on backend data engineering and integration—not on dashboards or data visualization.
Data Pipeline Implementation and Support
Data Transformation
- Create data transformations according to business rules and client-specific requirements.
- Support data enrichment workflows.
Automation & Refresh Process
- Implement automation for recurring data loads and transformations across customers.
- Monitor and facilitate data refresh cycles, including scheduling, orchestration, and validation.
- Own the client refresh lifecycle, ensuring timely, complete, and accurate data updates.
Data Optimization
- Continuously improve pipeline efficiency, reducing refresh times and resource usage.
- Identify bottlenecks and refactor logic or infrastructure to scale with growing data volumes.
Refresh Data QA & Validation
- Validate data post-ingestion and transformation to ensure schema alignment, completeness, and business logic integrity.
- Collaborate with the Data Quality and Delivery teams to resolve ingestion-related data quality issues.
- Build checks into ETL workflows to proactively detect and flag anomalies.
Documentation & SOPs
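The QA responsibilities above (schema alignment, completeness, anomaly flags) can be sketched in a few lines of Python. The field names and the 5% null threshold below are illustrative assumptions, not part of the role's actual stack:

```python
# Minimal post-load validation sketch: checks completeness and flags
# anomalies in a batch of ingested rows. Field names and the 5% null
# threshold are hypothetical examples.

EXPECTED_FIELDS = {"client_id", "record_date", "amount"}

def validate_batch(rows):
    """Return a list of human-readable issues found in the batch."""
    issues = []
    if not rows:
        issues.append("empty batch: refresh may have failed upstream")
        return issues
    for i, row in enumerate(rows):
        missing = EXPECTED_FIELDS - row.keys()
        if missing:
            issues.append(f"row {i}: missing fields {sorted(missing)}")
    null_amounts = sum(1 for r in rows if r.get("amount") is None)
    if null_amounts / len(rows) > 0.05:  # flag >5% null rate as an anomaly
        issues.append(f"amount null rate {null_amounts}/{len(rows)} exceeds 5%")
    return issues

rows = [
    {"client_id": 1, "record_date": "2024-01-01", "amount": 10.0},
    {"client_id": 2, "record_date": "2024-01-01", "amount": None},
]
print(validate_batch(rows))
```

Checks like these are typically wired into the workflow itself so a bad refresh is flagged before it reaches client-facing systems.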
What You Need:
A technical/quantitative BS degree, combined with modelling work experience or training, plus working knowledge of modelling (regression, machine learning, feature selection, validation), data ETL (extracting, preparing, validating), and building analytics workflows.
Essential Skills - Analytics
- 2+ years of hands-on experience developing automated data workflows in Alteryx or similar ETL tools (e.g., Talend, SSIS, Informatica).
- Strong SQL experience, especially in writing complex joins, data transformations, and quality checks.
- Candidates should understand data structures and logic but are not expected to create visualizations or dashboards.
- Working knowledge of programming languages such as SQL (R and Python are beneficial)
- Develop custom workflows in Alteryx to access a Snowflake database, creating repeatable processes for other team members to execute.
- Data Collection & Structured Analysis: develops a logical approach to data collection and ensures it meets the timeline.
- Identifies gaps and/or anomalies in data and seeks clarification, surfacing concerns or issues immediately with recommendations.
This is a backend ETL role. Candidates focused on reporting, dashboarding, or visualization tools (e.g., Power BI, Tableau) are not a fit for this position.
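The SQL quality checks called out in the skills list are typically plain queries; here is a minimal sketch using Python's built-in sqlite3 module, with made-up table and column names:

```python
import sqlite3

# In-memory database standing in for a warehouse table; names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, client TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "acme", 10.0), (1, "acme", 10.0), (2, "globex", None)],
)

# Check 1: duplicate primary keys.
dupes = conn.execute(
    "SELECT order_id, COUNT(*) FROM orders GROUP BY order_id HAVING COUNT(*) > 1"
).fetchall()

# Check 2: NULLs in a required column.
nulls = conn.execute("SELECT COUNT(*) FROM orders WHERE total IS NULL").fetchone()[0]

print(dupes)  # duplicated order_id 1 appears twice
print(nulls)  # one NULL total
```

The same GROUP BY / HAVING and IS NULL patterns carry over directly to Snowflake or any other SQL engine.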
About MicroSourcing
With over 9,000 professionals across 13 delivery centers, MicroSourcing is the pioneer and largest offshore provider of managed services in the Philippines.
Our commitment to 100% YOU
MicroSourcing firmly believes that our company's strength lies in our people's diversity and talent. We are proud to foster an inclusive culture that embraces individuals of all races, genders, ethnicities, abilities, and backgrounds. We provide space for everyone, embracing different perspectives, and making room for opportunities for each individual to thrive.
At MicroSourcing, equality is not merely a slogan – it's our commitment. Our way of life. Here, we don't just accept your unique authentic self - we celebrate it, valuing every individual's contribution to our collective success and growth. Join us in celebrating YOU and your 100%
For more information, visit
*Terms & conditions apply
Data Engineer, ETL Engineer
Posted today
Job Description
We are looking for a Data, ETL & Connector Engineer with strong experience in data engineering or data science, including metadata ETL and APIs. This role will focus on data integration, transformation, and supporting search engine capabilities, especially in e-commerce applications.
Core Skills:
- Java, Spring Batch, JavaScript, Python, AWS
Preferred Skills:
- Search Engine (Solr, Elasticsearch, or others)
- Azure DevOps
Key Responsibilities:
- Develop, modify, and test connectors for acquiring and transforming catalog data and metadata feeds into formats required by the search engine.
- Support incremental processes, mappings, and security requirements.
- Perform performance benchmarking and improve data ingestion quality.
- Collaborate with data owners, API SMEs, and Technical Architects to understand source system schema.
- Support API enhancements and back up full-stack engineers as needed.
- Analyze bugs, document findings, and implement/test fixes with other teams.
- Participate in sprint activities with Technical Leads, Delivery Leads, Scrum Masters, and Product Owners.
- Use Azure DevOps for ticketing and task tracking, updating status daily.
- Apply CI/CD processes, resolve bugs, and perform unit testing.
- Maintain up-to-date documentation on solution changes.
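The connector work described above, transforming catalog metadata into the format a search engine requires, often reduces to a mapping step. The posting's core stack is Java/Spring Batch; this Python sketch, with invented field names, only illustrates the shape of such a transformation:

```python
# Sketch: map a raw catalog feed record into a search-engine-ready document.
# Source and target field names are hypothetical, not from any real schema.

def to_search_doc(record):
    """Flatten a catalog record into an index-ready document."""
    return {
        "id": record["sku"],
        "title": record["name"].strip(),
        "price": float(record["price"]),
        # Normalize free-form category paths into a flat keyword list.
        "categories": [c.strip().lower() for c in record["category_path"].split(">")],
        "in_stock": record.get("stock", 0) > 0,
    }

raw = {"sku": "A-100", "name": " Blue Mug ", "price": "9.99",
       "category_path": "Home > Kitchen", "stock": 3}
print(to_search_doc(raw))
```

In Solr or Elasticsearch, documents of this shape would then be posted to the index in bulk as part of the incremental feed process.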
Job Type: Full-time
Pay: Php120,000.00 - Php130,000.00 per month
Benefits:
- Health insurance
- Life insurance
- Pay raise
Work Location: In person
ETL Data Engineer
Posted today
Job Description
Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Summary:
As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with team members to address challenges and contribute to the overall success of data initiatives.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Monitor and optimize data pipelines for performance and reliability.
- Document data processes and workflows to ensure clarity and knowledge sharing.
- Stay updated with industry trends and best practices in data engineering.
- Assist in troubleshooting and resolving data-related issues as they arise.
Professional & Technical Skills:
- Required Skill: Expert proficiency in Microsoft Azure Data Services.
- Additional Good to Have Skills: Experience with Data Engineering.
- Strong understanding of data modeling and database design principles.
- Experience with ETL tools and data integration techniques.
- Familiarity with cloud computing concepts and services.
- Proficient in programming languages such as Python or SQL for data manipulation.
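The Python-for-data-manipulation skill above can be illustrated with a minimal in-memory extract-transform-load pass; the record shapes are invented for illustration:

```python
# Minimal ETL sketch: extract rows, transform (cast types, drop duplicates),
# and load into a target list. All field names are illustrative.

def etl(source_rows):
    seen = set()
    target = []
    for row in source_rows:
        key = row["id"]
        if key in seen:        # drop duplicate records by key
            continue
        seen.add(key)
        # Cast string fields from the source into typed target columns.
        target.append({"id": int(key), "value": float(row["value"])})
    return target

src = [{"id": "1", "value": "3.5"}, {"id": "1", "value": "3.5"}, {"id": "2", "value": "4"}]
print(etl(src))
```

In an Azure Data Factory or Databricks pipeline, the same dedupe-and-cast logic would typically run at scale inside a transformation activity or notebook.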
Additional Information:
- The candidate should have minimum 2 years of experience in Microsoft Azure Data Services.
- This position is based at our Manila office.
ETL Data Engineer/Analyst
Posted today
Job Description
Discover your 100% YOU with MicroSourcing
Position: ETL Analyst
Location: Eastwood Libis, Quezon City
Work setup & shift: Hybrid | Nightshift (8pm-5am PH); work from home every Monday & Friday, plus 1 fully remote week scheduled each month (9 days in office each month, subject to change in months with 5 weeks)
**Why join MicroSourcing? You'll have:**
- Competitive Rewards: Enjoy above-market compensation, healthcare coverage from day one for you plus one or more dependents, paid time off with cash conversion, group life insurance, and performance bonuses.
- A Collaborative Spirit: Contribute to a positive and engaging work environment by participating in company-sponsored events and activities.
- Work-Life Harmony: Enjoy the balance between work and life that suits you with flexible work arrangements.
- Career Growth: Take advantage of opportunities for continuous learning and career advancement
- Inclusive Teamwork: Be part of a team that celebrates diversity and fosters an inclusive culture.
Your Role:
Design, build, and maintain scalable ETL/ELT data pipelines that extract, transform, and load data from multiple sources into internal systems for use by client-facing applications and operational reporting. This role focuses entirely on backend data engineering and integration—not on dashboards or data visualization.
Data Pipeline Implementation and Support
Data Transformation
- Create data transformations according to business rules and client-specific requirements.
- Support data enrichment workflows.
Automation & Refresh Process
- Implement automation for recurring data loads and transformations across customers.
- Monitor and facilitate data refresh cycles, including scheduling, orchestration, and validation.
- Own the client refresh lifecycle, ensuring timely, complete, and accurate data updates.
Data Optimization
- Continuously improve pipeline efficiency, reducing refresh times and resource usage.
- Identify bottlenecks and refactor logic or infrastructure to scale with growing data volumes.
Refresh Data QA & Validation
- Validate data post-ingestion and transformation to ensure schema alignment, completeness, and business logic integrity.
- Collaborate with the Data Quality and Delivery teams to resolve ingestion-related data quality issues.
- Build checks into ETL workflows to proactively detect and flag anomalies.
Documentation & SOPs
What You Need:
A technical/quantitative BS degree, combined with modelling work experience or training, plus working knowledge of modelling (regression, machine learning, feature selection, validation), data ETL (extracting, preparing, validating), and building analytics workflows.
Essential Skills - Analytics
- 2+ years of hands-on experience developing automated data workflows in Alteryx or similar ETL tools (e.g., Talend, SSIS, Informatica).
- Strong SQL experience, especially in writing complex joins, data transformations, and quality checks.
- Candidates should understand data structures and logic but are not expected to create visualizations or dashboards.
- Working knowledge of programming languages such as SQL (R and Python are beneficial)
- Develop custom workflows in Alteryx to access a Snowflake database, creating repeatable processes for other team members to execute.
- Data Collection & Structured Analysis: develops a logical approach to data collection and ensures it meets the timeline.
- Identifies gaps and/or anomalies in data and seeks clarification, surfacing concerns or issues immediately with recommendations.
This is a backend ETL role. Candidates focused on reporting, dashboarding, or visualization tools (e.g., Power BI, Tableau) are not a fit for this position.
About MicroSourcing
With over 9,000 professionals across 13 delivery centers, MicroSourcing is the pioneer and largest offshore provider of managed services in the Philippines.
Our commitment to 100% YOU
MicroSourcing firmly believes that our company's strength lies in our people's diversity and talent. We are proud to foster an inclusive culture that embraces individuals of all races, genders, ethnicities, abilities, and backgrounds. We provide space for everyone, embracing different perspectives, and making room for opportunities for each individual to thrive.
At MicroSourcing, equality is not merely a slogan – it's our commitment. Our way of life. Here, we don't just accept your unique authentic self - we celebrate it, valuing every individual's contribution to our collective success and growth. Join us in celebrating YOU and your 100%
For more information, visit
*Terms & conditions apply
ETL Data Engineer – AWS
Posted today
Job Description
About The Role:
We are looking for a detail-oriented and proactive ETL Data Engineer with strong experience in data integration, transformation, and pipeline development. The ideal candidate should be proficient in AWS, Python, and SQL, with hands-on experience working with diverse data sources and formats. A solid understanding of both RDBMS and NoSQL databases, along with familiarity in API development and data ingestion, is essential. Candidates who stay current with emerging technologies and thrive in Agile environments will be a great fit for our dynamic and fast-paced team.
Position Requirements:
- Candidate must possess at least a Bachelor's Degree in Computer Science / IT
- At least 5 years of working experience in the related field (Python, ETL).
- Hands-on experience with the AWS Suite for cloud-based data and analytics solutions.
- Skilled in ingesting data from various sources including APIs, logs, flat files, and databases.
- Experienced in API client-server development (REST), with a focus on API authentication and data ingestion.
- Proficient in administering both RDBMS and NoSQL databases within a unified data and analytics environment.
- Capable of converting, processing, and transforming various file formats (e.g., CSV, JSON, Parquet, XML) using data tools.
- Comfortable using Python to address data and analytics requirements.
- Experienced in deploying and maintaining scripts using Git repositories.
- Keeps up to date with emerging technologies in the data and analytics space.
- Adept at working within SCRUM/Agile methodologies and environments.
- Open to working remotely and available for mid-shift or rotating schedules.
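The file-format conversion requirement above (CSV, JSON, Parquet, XML) can be illustrated for the CSV-to-JSON case with Python's standard library alone; the column names are made up:

```python
import csv
import io
import json

# Convert a CSV payload into typed JSON records using only the standard
# library. Column names (id, name, qty) are invented for illustration.
csv_text = "id,name,qty\n1,widget,5\n2,gadget,3\n"
rows = list(csv.DictReader(io.StringIO(csv_text)))
records = [{"id": int(r["id"]), "name": r["name"], "qty": int(r["qty"])} for r in rows]
print(json.dumps(records))
```

For Parquet the same idea would typically use a library such as pyarrow or pandas rather than the standard library.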
ETL Data Engineer – AWS
Posted today
Job Description
We're Hiring: ETL Data Engineer – AWS & Python Specialist
We are seeking an experienced ETL Data Engineer to design, develop, and maintain robust data pipelines using AWS cloud services and Python. The ideal candidate will have expertise in extracting, transforming, and loading large datasets while ensuring data quality, performance, and scalability across our data infrastructure.
About The Role
We are looking for a detail-oriented and proactive ETL Data Engineer with strong experience in data integration, transformation, and pipeline development. The ideal candidate should be proficient in AWS, Python, and SQL, with hands-on experience working with diverse data sources and formats. A solid understanding of both RDBMS and NoSQL databases, along with familiarity in API development and data ingestion, is essential. Candidates who stay current with emerging technologies and thrive in Agile environments will be a great fit for our dynamic and fast-paced team.
Position Requirements
- Candidate must possess at least a Bachelor's Degree in Computer Science / IT
- At least 5 years of working experience in the related field (Python, ETL).
- Hands-on experience with the AWS Suite for cloud-based data and analytics solutions.
- Skilled in ingesting data from various sources including APIs, logs, flat files, and databases.
- Experienced in API client-server development (REST), with a focus on API authentication and data ingestion.
- Proficient in administering both RDBMS and NoSQL databases within a unified data and analytics environment.
- Capable of converting, processing, and transforming various file formats (e.g., CSV, JSON, Parquet, XML) using data tools.
- Comfortable using Python to address data and analytics requirements.
- Experienced in deploying and maintaining scripts using Git repositories.
- Keeps up to date with emerging technologies in the data and analytics space.
- Adept at working within SCRUM/Agile methodologies and environments.
- Open to working remotely and available for mid-shift or rotating schedules.
Ready to transform data into insights? Apply now and let's build the future together.
Data Engineering
Posted today
Job Description
- Designs and develops Data and Analytics solutions
- Performs data analysis, data architecture, data engineering activities
- Designs and builds data ingestion pipelines; develops data transformation, cleansing and preparation processes
- Designs and builds data storage solutions including database objects and data structures
- Takes ownership of complex data-level or database incidents and problems and provides resolutions in a timely manner
- Provides advisory and consultation services to determine the right solution
- Utilizes best practices, work plans, checklists and defined processes
- Proactively supports application teams
- Performs optimization
- Participates in project technical reviews
- Leads or participates in work product reviews
- Engages SMEs or vendors to provide assistance to resolve more complex issues as required
Technology And Functional Skills
- Azure data platform and components: Azure Data Factory, Databricks, Blob Storage, Synapse, SQL DB/DW, Event Hubs, etc
- AWS data platform and components: S3, Redshift, Spectrum, Athena, Aurora, CLI, etc
- GCP data platform and components: BigQuery, Dataflow, Dataprep, Pub/Sub, etc
- Non-cloud native: Informatica, Talend, DataStage, SAP BODS, etc
- Big Data: Hadoop, Hadoop ecosystem, Spark, Scala, etc
- Language and Toolset: SQL, Python, PySpark, ELK (Elasticsearch, Logstash, Kibana)
- Data Integration, Extract-Transform-Load (ETL), Data Ingestion, Data Cleansing, Data Preparation, Database
- Data Warehouse, Data Lake, and Business Intelligence (BI) concepts
At DXC Technology, we believe strong connections and community are key to our success. Our work model prioritizes in-person collaboration while offering flexibility to support wellbeing, productivity, individual work styles, and life circumstances. We're committed to fostering an inclusive environment where everyone can thrive.
Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers, typically through online services such as false websites, or through unsolicited emails claiming to be from the company. These emails may request recipients to provide personal information or to make payments as part of the illegitimate recruiting process. DXC does not make offers of employment via social media networks, and DXC never asks for money or payments from applicants at any point in the recruitment process, nor asks a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here.
Data Engineering
Posted today
Job Description
About Control Flow Labs
Control Flow Labs is a fast-growing pre-seed startup building an AI-powered ecommerce analytics platform. We aggregate data from major platforms to deliver comprehensive analytics and insights for medium to enterprise-level businesses.
With our MVP live and beta clients onboard, we're scaling rapidly and preparing for our next funding round. We're looking for a skilled Data & Integrations Manager to perfect our current data systems and build new capabilities as we grow.
The Role
We're seeking a Data Engineering & Integrations Manager to own the complete data lifecycle of our analytics platform. You'll be responsible for ensuring data accuracy, building robust integrations with ecommerce platforms, writing ETL pipelines, and creating meaningful analytics that our enterprise clients rely on for business decisions.
This is a hands-on technical role where you'll work closely with our current project lead and report to our incoming CTO. You'll be instrumental in scaling our data capabilities from beta to 100+ enterprise clients.
What You'll Do
Data Engineering & Pipeline Management
● Build and maintain integrations with ecommerce platforms using REST and GraphQL APIs
● Implement OAuth flows and manage platform authentication for secure data access
● Own the complete data pipeline using Dagster for orchestration, ensuring reliable data ingestion and transformation
● Develop and optimize ETL/ELT processes that can scale with rapid client growth
● Implement data validation and quality checks to guarantee accuracy across all client dashboards
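The REST ingestion and authentication responsibilities above reduce to building authenticated, paginated requests. A sketch using only Python's standard library; the endpoint, token, and cursor parameter are placeholders, not any real platform's API:

```python
import urllib.request

def build_page_request(base_url, token, cursor=None):
    """Construct an authenticated request for one page of a hypothetical API.

    The cursor-based pagination and bearer-token auth shown here are common
    patterns for ecommerce platform APIs, but the names are invented.
    """
    url = base_url + (f"?cursor={cursor}" if cursor else "")
    req = urllib.request.Request(url)
    req.add_header("Authorization", f"Bearer {token}")  # OAuth 2.0 bearer token
    req.add_header("Accept", "application/json")
    return req

req = build_page_request("https://api.example.com/orders", "TOKEN123", cursor="abc")
print(req.full_url, req.get_header("Authorization"))
```

In practice the ingestion loop would follow each response's next-cursor value until exhausted, with an orchestrator such as Dagster scheduling and retrying the runs.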
Analytics & Visualization Development
● Create analytics endpoints that power our frontend dashboards and client interfaces
● Transform raw platform data into actionable ecommerce insights and KPIs
● Build data models that support both current reporting and future AI-powered features
● Collaborate with our dedicated frontend developer to ensure smooth data delivery to client dashboards
System Optimization & Scaling
● Perfect existing data systems to ensure 100% accuracy and reliability
● Monitor and prevent data drift through automated validation and anomaly detection
● Optimize performance for real-time analytics as we scale to more clients and data volume
● Document data flows and processes for team knowledge sharing and compliance
Platform Integration Expertise
● Design integration strategies for new ecommerce platforms as we expand
● Troubleshoot platform API changes and maintain integration reliability
● Communicate infrastructure requirements to our dedicated DevOps team for optimal system performance
Note: You'll focus purely on data engineering and backend analytics - we have dedicated team members handling frontend development and infrastructure management.
What We're Looking For
Technical Skills (Required)
● Strong Python development experience, particularly with FastAPI and Dagster
● Advanced SQL skills and experience with PostgreSQL
● Cloud platform experience with AWS services (RDS, S3, etc.)
● Data engineering expertise with tools like Dagster, dbt, or similar orchestration platforms
● API integration experience with REST and GraphQL methodologies
● Data modeling and warehouse design experience
Domain Knowledge (Preferred)
● Ecommerce platform experience - understanding of Shopify, TikTok, Amazon, or similar platforms
● Analytics platform background - experience building customer-facing dashboards and reports
● Business intelligence tools knowledge for visualization and reporting
● Data validation and quality assurance methodologies
Experience Level
● 3-5+ years in data engineering, analytics engineering, or similar technical roles
● Proven track record building and maintaining data pipelines at scale
● Experience with customer-facing analytics or BI platforms
● Background in fast-paced startup or growth environments
Soft Skills
● Obsessive about data accuracy - you treat every dashboard as a promise to users
● Strong communication skills - can translate technical requirements to infrastructure team
● Problem-solving mindset - you proactively identify and fix data issues before they impact clients
● Collaborative approach - comfortable working closely with product, engineering, and business teams
What We Offer
Compensation & Benefits
● Competitive salary: ₱80,000 - ₱100,000/month based on experience
● Flexible work arrangement - primarily onsite in BGC, Taguig with WFH flexibility as needed
● Full benefits package
● Growth opportunity - join a fast-scaling startup with clear advancement paths
Technical Environment
● Modern data stack: Python FastAPI, PostgreSQL, AWS, Dagster
● Interesting challenges: Multi-platform integrations, real-time analytics, enterprise-scale data processing
● Direct impact: Your work directly enables client business decisions and platform growth
● Learning opportunities: Exposure to AI/ML integration and enterprise-scale data challenges
Team & Culture
● Collaborative team of 13 across all departments
● Fast-paced environment - we move quickly and don't let blockers persist
● Data-driven culture - decisions backed by metrics and evidence
● Direct mentorship from experienced project lead and incoming CTO
Success Metrics (First 6 Months)
● 100% data accuracy across all client integrations and dashboards
● Zero data downtime or missing data incidents
● Successful integration of at least 2 new platform connections
● Improved data pipeline performance supporting 3x client growth
● Clear documentation of all data processes and integration methods
Requirements
● Reliable commute to BGC, Taguig office
● Immediate availability preferred
Ready to Join Us?
If you're passionate about building reliable, scalable data systems that power real business decisions, we'd love to hear from you.
To apply, please include:
● Your resume highlighting relevant data engineering and integration experience
● Brief examples of data pipelines or integrations you've built
● Any experience with ecommerce platforms or analytics systems
Control Flow Labs is an equal opportunity employer committed to building a diverse and inclusive team.
Job Type: Full-time
Pay: Php80,000.00 - Php100,000.00 per month
Benefits:
- Additional leave
- Health insurance
Work Location: In person
Data Engineering Director
Posted today
Job Description
As Director of Data Engineering, You Will…
- Develop and execute a data strategy that aligns with the company's objectives, leading the data engineering team to build scalable and robust data solutions
- Mentor, and lead a team of data engineers and architects
- Foster a collaborative, innovative, and results-oriented environment within the team
- Design and implement effective database solutions and models to store and retrieve company data
- Examine and identify database structural necessities by evaluating client operations, applications, and programming
- Manage timelines, resources, and deliverables for multiple projects simultaneously
- Ensure projects are completed on time and within budget while meeting business needs
- Ensure compliance with data governance and security requirements
- Establish policies and procedures for data handling and sharing
As Director of Data Engineering, You Have…
- A Degree or Diploma in Computer Science, Information Systems or equivalent experience
- Minimum of 8 years of experience in data engineering, with at least 3 years in a leadership role
- Strong knowledge of database management, warehousing solutions, and ETL processes
- Proficiency in SQL and familiarity with programming languages such as Python or Java
- Experience with cloud platforms (AWS, Azure, Google Cloud) and understanding of SaaS environments
- Knowledge of real-time analytics and BI tools like Looker, Tableau, PowerBI, or similar
- Certifications in data management, big data solutions, or cloud architecture
- Experience in a contact center or customer service environment
- Demonstrated ability to build and maintain high-performing data engineering teams
- Excellent communication and interpersonal skills, with the ability to engage and influence stakeholders at all levels
Job Type: Full-time
Benefits:
- Additional leave
- Free parking
- Health insurance
- Life insurance
- On-site parking
- Work from home
Work Location: Remote
Data Engineering Consultant
Posted today
Job Description
- Develop innovative solutions, create actionable insights and drive better decisions and performance to help the business achieve its business objectives
- Coordinate with all stakeholders and ensure the project meets requirements and objectives
- Establish a deep understanding of the business and collaborate with Business SMEs to shape the reporting solutions provided
- Resolve issues as they arise across areas of the project and where they impact on other activities, systems and projects
- Influences senior leadership to adopt new ideas, projects and / or approaches
- Develops software solutions by studying information needs, conferring with users, studying systems flow, data usage, and work processes, investigating problem areas, collaborating on solutions to identified problem areas, and following the software development lifecycle
- Performs all tasks in the development life cycle including requirements analysis, design, development, and testing
- Utilizes available programming methodologies and languages and adhere to coding standards, procedures and techniques while contributing to the technical code documentation
- Creates and helps maintain technical and end-user documentation for new and existing applications
- Performs all other related duties as assigned
- Identify and help mitigate risks to the data and organizational health
Must have skills:
7 years in MS SQL Server, ETL (SSIS), Snowflake, Databricks
Nice to have:
2 years in Microsoft .NET Framework / .NET Core, Microservices, REST APIs, Python
2 years with telephony data experience
Job Types: Full-time, Permanent
Pay: Php100,000.00 - Php150,000.00 per month
Education:
- Bachelor's (Required)
Experience:
- Snowflake: 7 years (Required)
- Databricks: 7 years (Required)
- MS SQL: 7 years (Required)
- ETL (SSIS): 7 years (Required)
Work Location: In person