What Jobs are available for Data Pipelines in the Philippines?
Showing 94 Data Pipelines jobs in the Philippines
Senior Data Engineer – Data Pipelines
Posted today
Job Description
As part of this maternity leave replacement, you will develop and operate supply chain solutions for specific data-consuming applications or digital services. In addition, business consulting will become part of the job once onboarding and training are complete, and the work is done in agile teams.
Key Accountabilities / Responsibilities
- Design and development of data pipelines
- Support data scientists with data availability
- Shape and implement (data) best practices
- Snowflake, Azure, and lakehouse implementations in general
- Data Vault 2.0 and Vaultspeed
- DBT
Key Success Factors
- Onboarding of data sets according to plan
- Data Quality measures
- User Adoption KPIs
Required Minimum Qualifications
- Degree in Computer Science or another relevant field
- Experience with Microsoft Azure or similar cloud platforms is a plus.
- Experience with Continuous Integration/Continuous Delivery (CI/CD) pipelines is a plus.
- Knowledge of Python is a plus.
- Knowledge of SAP IBP is a plus.
- Knowledge of data modeling and documentation is a plus.
Competencies
- Eager to learn more
- Fluent in English
- High drive
- Ability to work in a complex work environment, bridging solutions and approaches
Start date:
01/10/2025 
End date:
31/12/2025 
Location:
Manila, Philippines 
Required skills
Data Vault
CI/CD pipelines
Lakehouse
Vaultspeed
Continuous Integration
Data Pipelines
Azure
Data Best Practices
Snowflake
Continuous Delivery
dbt
Preferred skills
SAP IBP
Data Modeling
Documentation
Python
Data Architecture Head
Posted today
Job Description
Job brief
We are seeking a visionary Head of Data Architecture for a global company building AI platforms from scratch, to lead the design and execution of our enterprise-wide data backbone. This leader will unify data across CAD, Finance, Operations, and GTM functions, ensuring our organization has a trusted, scalable foundation for analytics, AI, and business decision-making.
Responsibilities
- Design, implement, and own the enterprise-wide data architecture that connects critical business domains (CAD, Finance, Operations, GTM).
- Lead the development and maintenance of data models, pipelines, lineage, quality frameworks, and governance standards.
- Ensure the delivery of reliable, well-documented datasets to enable AI, ML, and advanced analytics use cases.
- Establish and enforce best practices for data management, scalability, and cloud-native architectures.
- Collaborate with cross-functional stakeholders to translate business needs into data solutions.
- Build, lead, and mentor a high-performing team of data engineers and architects.
Requirements
- Proven track record of designing and scaling enterprise-level, cloud-native data platforms beyond reporting and BI.
- Hands-on expertise with Snowflake or BigQuery, along with modern orchestration and transformation tools (e.g., dbt, Airflow) in AWS, GCP, or Azure environments.
- Demonstrated impact through enabling AI/ML and advanced analytics initiatives.
- Strong leadership presence: ability to recruit, develop, and inspire data engineering talent.
- Excellent communication and collaboration skills, with experience working across executive, technical, and operational teams.
Data Architecture Head
Posted 2 days ago
Job Description
We are seeking a visionary Head of Data Architecture for a global company building AI platforms from scratch to lead the design and execution of our enterprise-wide data backbone. This leader will unify data across CAD, Finance, Operations, and GTM functions, ensuring our organization has a trusted, scalable foundation for analytics, AI, and business decision-making.
Responsibilities
- Design, implement, and own the enterprise-wide data architecture that connects critical business domains (CAD, Finance, Operations, GTM).
- Lead the development and maintenance of data models, pipelines, lineage, quality frameworks, and governance standards.
- Ensure the delivery of reliable, well-documented datasets to enable AI, ML, and advanced analytics use cases.
- Establish and enforce best practices for data management, scalability, and cloud-native architectures.
- Collaborate with cross-functional stakeholders to translate business needs into data solutions.
- Build, lead, and mentor a high-performing team of data engineers and architects.
Requirements
- Proven track record of designing and scaling enterprise-level, cloud-native data platforms beyond reporting and BI.
- Hands-on expertise with Snowflake or BigQuery, along with modern orchestration and transformation tools (e.g., dbt, Airflow) in AWS, GCP, or Azure environments.
- Demonstrated impact through enabling AI/ML and advanced analytics initiatives.
- Strong leadership presence: ability to recruit, develop, and inspire data engineering talent.
- Excellent communication and collaboration skills, with experience working across executive, technical, and operational teams.
Director of Data Architecture
Posted today
Job Description
We are seeking an experienced and strategic Director of Data Architecture to lead the design and implementation of an enterprise-wide data backbone that seamlessly integrates data across CAD, finance, operations, and go-to-market functions. This role will be responsible for building and maintaining robust, scalable data infrastructure that supports our business intelligence, analytics, and AI capabilities.
KEY RESPONSIBILITIES
- Design and implement enterprise‑wide data backbone that unifies CAD, finance, ops, and GTM data.
- Own data models, pipelines, quality, lineage, and governance.
- Enable AI/analytics by delivering reliable, well‑documented datasets.
QUALIFICATIONS
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or related field.
- 10+ years of experience in data architecture, data engineering, or related fields, with at least 5 years in leadership roles.
- Proven track record of designing and implementing enterprise-scale, cloud-native data platforms beyond traditional reporting.
- Hands-on expertise with: 
- Cloud platforms: AWS, GCP, or Azure. 
- Data warehousing: Snowflake, BigQuery, or similar.
- Data pipeline/orchestration: dbt, Airflow, or equivalent. 
- Experience enabling AI/ML or advanced analytics use cases through robust data foundations. 
- Strong background in data governance, quality, and security frameworks.
- Demonstrated ability to recruit, mentor, and lead data engineering teams.
- Excellent communication and collaboration skills, with ability to influence technical and business stakeholders.
Technology Consultant – Data Architecture Principles
Posted today
Job Description
Advises, leads and works on high impact activities within the systems development lifecycle, and provides advisory work for the IT function itself.
As a Technology Consulting Practitioner, you will engage in high-impact activities throughout the systems development lifecycle. Your typical day will involve advising teams, leading initiatives, and collaborating with various stakeholders to enhance IT functions. You will be instrumental in providing strategic insights and solutions that drive efficiency and innovation within the organization.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge sharing and best practices among team members.
- Monitor project progress and ensure alignment with strategic goals.
- Support the development of training materials and conduct workshops to enhance team capabilities. 
Job Qualifications
Professional & Technical Skills:
- Required Skill: Expert proficiency in Data Architecture Principles. 
- Must Have: Experience in Technical Pre-sales / Client Selling / Client Acquisition
- Additional Good To Have Skills: Experience with Generative AI.
- Strong understanding of data modeling techniques and database design.
- Proficiency in data integration and data governance frameworks.
- Experience with cloud-based data architecture solutions.
- Familiarity with big data technologies and analytics tools.
Additional Information:
- The candidate should have a minimum of 7 years of experience in Data Architecture Principles.
- This position is based at our Manila office. 
Lead Data Architecture & Engineering (GDC) Consultant - Remote
Posted 8 days ago
Job Description
Work Arrangement: Fully Remote
Schedule: Midshift
At least 5 years of relevant experience and at least 4 years of relevant leadership experience are needed.
Job Overview:
**Leadership Development** This role offers opportunities to enhance your technical leadership skills, including making technical decisions, mentoring junior team members, and building relationships with project, client, and company leadership, all while remaining actively involved in your technical domain.
**Technology Focus & Growth** Our primary focus is on Azure, providing numerous chances to develop your expertise within the Microsoft and Azure ecosystems in an environment dedicated to technical excellence and exceptional client experiences. Just six months with our team can equate to years of experience elsewhere.
**Partnerships & Opportunities** Our strong global partnership with Microsoft ensures access to innovative Azure projects and direct collaboration with the platform's creators, an advantage not commonly found elsewhere.
**Compensation & Culture** We offer a competitive salary, allowances, standard benefits, and performance-based bonuses (quarterly and annually). Our company fosters a positive working environment and culture, with flexible options for remote work.
**Core Technical Skills Required:**
- Proven experience leading small to medium technical teams, including task delegation, technical escalation, support, and representing the team in Agile ceremonies and client meetings
- Expertise in designing and implementing logical and physical data models for cloud and hybrid data warehouse environments
- Experience in developing data architectures supporting diverse data formats: structured, semi-structured, and unstructured data
- Hands-on involvement in full lifecycle data warehouse projects
- Knowledge of data architecture necessary for data integration processes
- Proficiency with data modeling tools such as ER/Studio, ER/Win, or similar
- Familiarity with Microsoft Azure Data Platform services like Azure Data Lake Storage, Azure Blob Storage, Azure Synapse, Azure Data Factory, Azure SQL Database, Logic Apps, and APIs
- Demonstrated ability to quickly learn, adapt, and apply new technologies
- Experience in data profiling and mapping source-to-target data transformations
- Ability to provision and configure Azure data services
**Detailed Technical Skills:**
- Python and SQL scripting
- SQL and PySpark
- General cloud architecture skills, with the ability to translate requirements into data pipeline solutions
- API development knowledge; candidates with API creation skills are preferred
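To picture the data profiling and source-to-target mapping work called for above, here is a minimal, hypothetical Python sketch; the column names, casts, and sample rows are invented for illustration and are not taken from the posting:

```python
# Hypothetical source-to-target mapping: rename columns and cast types
# while moving records from a source extract into a target schema.
from datetime import date

# Invented mapping for illustration: source column -> (target column, cast)
MAPPING = {
    "cust_id": ("customer_id", int),
    "ord_dt": ("order_date", date.fromisoformat),
    "amt": ("amount", float),
}

def transform(row: dict) -> dict:
    """Apply the source-to-target mapping to one source record."""
    return {tgt: cast(row[src]) for src, (tgt, cast) in MAPPING.items()}

source_rows = [
    {"cust_id": "101", "ord_dt": "2025-01-15", "amt": "49.90"},
    {"cust_id": "102", "ord_dt": "2025-01-16", "amt": "120.00"},
]

target_rows = [transform(r) for r in source_rows]
print(target_rows[0])
# {'customer_id': 101, 'order_date': datetime.date(2025, 1, 15), 'amount': 49.9}
```

In practice this kind of mapping is usually expressed in a modeling or orchestration tool (dbt, Azure Data Factory) rather than hand-written code, but the underlying column-level mapping document looks much the same.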
**Preferred Skills, Experience, & Certifications:**
- Microsoft Fabric
- Microsoft Azure Cosmos DB, Data Flows, ExpressRoute, Azure Active Directory
- Experience designing migration strategies from on-premise environments to Azure
- Power BI and semantic data modeling
- AWS Glue and Azure Data Factory
- AWS S3 and Azure Blob Storage
- AWS Athena and Azure Databricks
- AWS Redshift and Azure Synapse Analytics
- AWS ECS and Azure AKS
Lead Consultant Data Architecture & Engineering (GDC) - Remote
Posted 8 days ago
Job Description
Work Arrangement: Fully Remote
Schedule: Midshift
At least 5 years of relevant experience and at least 4 years of relevant leadership experience are needed.
Job Overview:
**Leadership Development** This role offers opportunities to enhance your technical leadership skills, including making technical decisions, mentoring junior team members, and building relationships with project, client, and company leadership, all while remaining actively involved in your technical domain.
**Technology Focus & Growth** Our primary focus is on Azure, providing numerous chances to develop your expertise within the Microsoft and Azure ecosystems in an environment dedicated to technical excellence and exceptional client experiences. Just six months with our team can equate to years of experience elsewhere.
**Partnerships & Opportunities** Our strong global partnership with Microsoft ensures access to innovative Azure projects and direct collaboration with the platform's creators, an advantage not commonly found elsewhere.
**Compensation & Culture** We offer a competitive salary, allowances, standard benefits, and performance-based bonuses (quarterly and annually). Our company fosters a positive working environment and culture, with flexible options for remote work.
**Core Technical Skills Required:**
- Proven experience leading small to medium technical teams, including task delegation, technical escalation, support, and representing the team in Agile ceremonies and client meetings
- Expertise in designing and implementing logical and physical data models for cloud and hybrid data warehouse environments
- Experience in developing data architectures supporting diverse data formats: structured, semi-structured, and unstructured data
- Hands-on involvement in full lifecycle data warehouse projects
- Knowledge of data architecture necessary for data integration processes
- Proficiency with data modeling tools such as ER/Studio, ER/Win, or similar
- Familiarity with Microsoft Azure Data Platform services like Azure Data Lake Storage, Azure Blob Storage, Azure Synapse, Azure Data Factory, Azure SQL Database, Logic Apps, and APIs
- Demonstrated ability to quickly learn, adapt, and apply new technologies
- Experience in data profiling and mapping source-to-target data transformations
- Ability to provision and configure Azure data services
**Detailed Technical Skills:**
- Python and SQL scripting
- SQL and PySpark
- General cloud architecture skills, with the ability to translate requirements into data pipeline solutions
- API development knowledge; candidates with API creation skills are preferred
**Preferred Skills, Experience, & Certifications:**
- Microsoft Fabric
- Microsoft Azure Cosmos DB, Data Flows, ExpressRoute, Azure Active Directory
- Experience designing migration strategies from on-premise environments to Azure
- Power BI and semantic data modeling
- AWS Glue and Azure Data Factory
- AWS S3 and Azure Blob Storage
- AWS Athena and Azure Databricks
- AWS Redshift and Azure Synapse Analytics
- AWS ECS and Azure AKS
Data Integration Specialist
Posted today
Job Description
Trax Inc. provides cloud-based solutions and services to automate and manage freight invoice auditing and payment processing for buyers and sellers of logistics services. The Trax platform enables companies to optimize supply chain performance through greater visibility into their logistics ecosystem and predictive analytics based on over one billion logistics transactions from all industries, modes, and countries. Trax operates on a global scale with offices in the U.S., Latin America, Asia and Europe.
Duties/Responsibilities:
- Analyzes and defines EDI requirements for clients, especially for slightly unusual needs and circumstances
- Provides advice and education to clients concerning EDI guidelines and capabilities
- Creates, develops, and produces mapping documents
- Collaborates with programmers and internal staff to develop and test new or modified systems
- Develops and maintains communication links with trading partners and oversees integration of information
- Periodically reviews system to ensure client needs are met; identifies possible modifications as required
- Remains current regarding EDI and internet field technological developments
- Testing and debugging the system
- Gathering user feedback on the accuracy, efficiency, and functionality of the system
- Adjusting and customizing the system based on user feedback
- Deploying completed systems and providing maintenance support
- Training end-users on the proper use of the system
Required Skills/Abilities:
- Fluent in oral and written Mandarin
- Excellent knowledge of software design and architecture principles
- Basic understanding and working knowledge of EDI formats and Internet functions
- Creative problem-solving required to design a system that meets clients' individual needs
- Organizational skills and attention to detail
- Ability to work independently and as part of a team
- Ability to project manage and work with large teams
- Experience with end-user training
- Excellent verbal/written communication and interpersonal skills
- Proficient in Microsoft Office Suite or related software
Education, Experience & Other Requirements:
- Bachelor's degree in computer science, computer engineering, information technology, or a similar field
- At least three years of experience in an EDI service center required
- Knowledge of XML, JSON, and other data formats is a plus
- Intermediate knowledge of SQL or other query languages
- Experience with Informatica is a plus
Additional Information
- If a foreign applicant: must have all the necessary documents to work in the Philippines. Certificate and permit processing, including fees, will be shouldered by the applicant.
- Our preference is a Filipino-born applicant already living in Cebu City.
Data Integration Engineer
Posted today
Job Description
Role Description
This position will be responsible for designing, building, and maintaining systems that move and transform data between applications, databases, and software platforms. The role ensures the quality of the data infrastructure (e.g., processes, data storage, data pipelines), manages the integration of various data sources, ensures seamless data flow from source systems to the Data Warehouse, and maintains the security of the databases.
Key Responsibilities:
- Data Integration Design: Design and implement data integration solutions, including ETL processes, data pipelines, and data mappings, to facilitate the movement of data between systems and platforms 
- Data Transformation: Transform and cleanse data as needed to ensure data quality and consistency across different data sources and destinations 
- Integration Testing: Perform thorough testing and validation of data integration processes to ensure accuracy, reliability, and data integrity 
- SQL Optimization: Help write and optimize SQL statements to ensure efficient data retrieval and manipulation within our applications 
- Create and maintain documentation for data integration processes and configurations. 
- Work closely with cross-functional teams, including Data Engineers, Application Developers, and Business Process Owners/Stakeholders, to understand data requirements and deliver integrated solutions that meet business needs 
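The extract-cleanse-load cycle described in the responsibilities above can be sketched in miniature. This is a rough, self-contained Python illustration using in-memory SQLite; the table names, columns, and rejection rules are invented for the example and are not part of the posting:

```python
import sqlite3

# Minimal ETL sketch: extract rows from a source table, cleanse them,
# and load the survivors into a target table.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE raw_orders (id INTEGER, email TEXT, amount TEXT)")
src.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "  A@X.COM ", "10.50"), (2, None, "7.25"), (3, "b@y.com", "oops")],
)

tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, email TEXT, amount REAL)")

def cleanse(row):
    """Normalize the email and parse the amount; return None to reject the row."""
    oid, email, amount = row
    if email is None:
        return None                      # reject rows with a missing email
    try:
        amt = float(amount)
    except ValueError:
        return None                      # reject rows with an unparseable amount
    return (oid, email.strip().lower(), amt)

loaded = 0
for row in src.execute("SELECT id, email, amount FROM raw_orders"):
    clean = cleanse(row)
    if clean is not None:
        tgt.execute("INSERT INTO orders VALUES (?, ?, ?)", clean)
        loaded += 1
tgt.commit()
print(loaded)  # 1 row survives cleansing
```

Real integrations would use a tool such as SSIS or another ETL platform rather than hand-rolled loops, but the extract, transform/cleanse, and load stages shown here are the same ones the role tests and documents.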
Experience & Qualifications:
- 3+ years of proven experience as a Data Integration Developer or ETL Developer, with a strong track record of designing and implementing data integration solutions
- Proficiency in data integration tools and technologies such as SSIS (SQL Server Integration Services), ETL tools, data integration platforms, and data warehousing solutions 
- Strong SQL skills, including query optimization, stored procedures, and database design. 
- Experience with database management systems (e.g., SQL Server, MySQL). 
- Experience with data modeling and data warehouse architecture. 
- Knowledge of API development and integration (REST, JSON, XML) 
- Understanding of data governance and data security best practices. 
- Experience in Power Apps and Power Platform is a plus. 
- Knowledge of Agile development methodologies 
- Experience working in a global setting 
- Excellent communication skills (both written and verbal) 
- Strong analytical and problem-solving skills 
- Ability to quickly learn new concepts and functionality 
Data Integration Specialist
Posted today
Job Description
HIRING - Data Integration Specialist (Talend)
Salary Range:
25, ,000 gross plus variable commission 
Work Setup:
 Hybrid 
Company Description
Micropinnacle Technology Corporation (MTC) is a premier IT solutions provider focused on delivering innovative data management solutions. Founded in May 2009, MTC has developed into a recognized leader in the field of data management, committed to enhancing operational efficiency and driving business success for its clients across various industries. MTC is dedicated to providing top-tier service and solutions that meet the evolving needs of its clients.
Role Description
This is a full-time hybrid role for a Data Integration Specialist, located in Makati with some work-from-home flexibility. The Data Integration Specialist will be responsible for integrating data from different sources, modeling data, and implementing Extract Transform Load (ETL) processes. The role involves ensuring the accuracy and efficiency of data processes, maintaining data warehouse systems, performing data analysis, and troubleshooting data issues. The role also includes collaborating with other departments to understand data requirements and ensuring data integration solutions meet business needs.
Qualifications
- Proficiency in Data Integration and Data Modeling
- Experience with Extract Transform Load (ETL) processes
- Strong Analytical Skills
- Knowledge of Data Warehousing
- Excellent problem-solving skills and attention to detail
- Ability to work independently and in a team environment
- Bachelor's degree in Computer Science, Information Technology, or related field
- Experience in the IT industry is a plus