3,381 Data Team jobs in the Philippines
Data Engineering Lead, Data Science
Posted today
Job Description
We are seeking a seasoned and strategic Data Engineering Lead to spearhead the development and management of our cutting-edge Data Science and AI platform. This role is central to our mission, enabling our data scientists and AI engineers to build and deploy innovative solutions at scale. The ideal candidate will possess deep expertise in modern data architecture, with a specific focus on leveraging AWS services, including AWS Bedrock, to build robust, scalable, and secure data pipelines.
Key Responsibilities:
- Technical Leadership: Lead the design, development, and maintenance of scalable and reliable data pipelines and infrastructure to support our Data Science and AI initiatives.
- Platform Development: Architect and build a comprehensive platform for data ingestion, transformation, and serving, with a particular emphasis on integrating with and optimizing for AWS Bedrock and other generative AI tools.
- Team Management: Mentor, guide, and manage a team of talented data engineers, fostering a culture of technical excellence, collaboration, and continuous improvement.
- Collaboration: Partner closely with data scientists, machine learning engineers, and product managers to understand their data needs and translate them into robust and efficient data solutions.
- AWS Expertise: Utilize a wide range of AWS services, including but not limited to S3, Glue, Lambda, EMR, Redshift, and Athena, to create end-to-end data workflows.
- Data Governance: Establish and enforce best practices for data quality, security, privacy, and governance to ensure the integrity and compliance of our data assets.
- Performance Optimization: Monitor and optimize data pipelines and infrastructure for cost-efficiency, performance, and reliability.
- Innovation: Stay current with emerging technologies and trends in data engineering, cloud computing, and AI/ML to drive innovation within the platform.
Required Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
- (Number) years of experience in data engineering, with at least (Number) years in a leadership or senior role.
- Proven experience designing and building scalable data platforms on AWS.
- Deep practical knowledge of core AWS services, including AWS Bedrock, S3, Lambda, and Glue.
- Expertise in a programming language, with a strong preference for Python.
- Proficiency in SQL and a solid understanding of data warehousing concepts.
- Experience with pipeline orchestration tools such as Airflow, Prefect, or AWS Step Functions.
- Excellent communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences.
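The orchestration tools named in the qualifications above (Airflow, Prefect, AWS Step Functions) all model a pipeline as a directed acyclic graph of tasks. As a rough illustration of the underlying idea (not any one tool's API), a dependency graph can be ordered with Python's standard-library graphlib; the task names below are hypothetical:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical pipeline steps; each task maps to the set of tasks it
# depends on, exactly as an Airflow or Step Functions DAG would express it.
deps = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_join": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_join"},
}

# static_order() yields every task after all of its dependencies.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

Real orchestrators layer scheduling, retries, and observability on top of this ordering, but the dependency graph is the core abstraction.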
Preferred Qualifications:
- Experience with MLOps and building production-grade machine learning pipelines.
- AWS Certified Data Analytics or AWS Certified Machine Learning certification.
- Familiarity with containerization technologies like Docker and Kubernetes.
- Experience with other cloud platforms (Azure, GCP) is a plus.
Data Engineering
Posted today
Job Description
- Designs and develops Data and Analytics solutions
- Performs data analysis, data architecture, data engineering activities
- Designs and builds data ingestion pipelines; develops data transformation, cleansing and preparation processes
- Designs and builds data storage solutions including database objects and data structures
- Takes ownership of complex data-level or database incidents and problems and provides resolutions in a timely manner
- Provides advisory and consultation services to determine the right solution
- Utilizes best practices, work plans, checklists and defined processes
- Proactively supports application teams
- Performs optimization
- Participates in project technical reviews
- Leads or participates in work product reviews
- Engages SMEs or vendors to provide assistance to resolve more complex issues as required
Technology And Functional Skills
- Azure data platform and components: Azure Data Factory, Databricks, Blob Storage, Synapse, SQL DB/DW, Event Hubs, etc
- AWS data platform and components: S3, Redshift, Spectrum, Athena, Aurora, CLI, etc
- GCP data platform and components: BigQuery, Dataflow, Dataprep, Pub/Sub, etc
- Non-cloud native: Informatica, Talend, DataStage, SAP BODS, etc
- Big Data: Hadoop, Hadoop ecosystem, Spark, Scala, etc
- Language and Toolset: SQL, Python, PySpark, ELK (Elasticsearch, Logstash, Kibana)
- Data Integration, Extract-Transform-Load (ETL), Data Ingestion, Data Cleansing, Data Preparation, Database
- Data Warehouse, Data Lake, and Business Intelligence (BI) concepts
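The ETL and data-cleansing concepts listed above reduce to extracting raw records, transforming and validating them, and loading the survivors into a target store. A minimal sketch in plain Python, with illustrative records and an in-memory "warehouse" standing in for a real target:

```python
# Minimal ETL sketch: extract raw rows, cleanse/transform, load to a target.
# The record fields and the in-memory "warehouse" are illustrative assumptions.
raw_rows = [
    {"id": "1", "amount": " 100.50 ", "country": "ph"},
    {"id": "2", "amount": "not-a-number", "country": "PH"},  # dirty row
    {"id": "3", "amount": "75", "country": "Ph"},
]

def transform(row):
    """Cleanse one row; return None if it fails validation."""
    try:
        amount = float(row["amount"].strip())
    except ValueError:
        return None  # reject rows that fail type checks
    return {"id": int(row["id"]), "amount": amount, "country": row["country"].upper()}

# "Load" step: keep only rows that survive the transform.
warehouse = [t for r in raw_rows if (t := transform(r)) is not None]
print(warehouse)  # two clean rows; the dirty row is rejected
```

Production pipelines do the same thing with Informatica, DataStage, Spark, or SQL, but the extract-transform-load shape is identical.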
At DXC Technology, we believe strong connections and community are key to our success. Our work model prioritizes in-person collaboration while offering flexibility to support wellbeing, productivity, individual work styles, and life circumstances. We're committed to fostering an inclusive environment where everyone can thrive.
Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers, typically through online services such as false websites, or through unsolicited emails claiming to be from the company. These emails may ask recipients to provide personal information or to make payments as part of an illegitimate recruiting process. DXC does not make offers of employment via social media networks, and DXC never asks applicants for money or payments at any point in the recruitment process, nor does it ask a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here.
Data Engineering
Posted today
Job Description
About Control Flow Labs
Control Flow Labs is a fast-growing pre-seed startup building an AI-powered ecommerce analytics platform. We aggregate data from major platforms to deliver comprehensive analytics and insights for medium to enterprise-level businesses.
With our MVP live and beta clients onboard, we're scaling rapidly and preparing for our next funding round. We're looking for a skilled Data & Integrations Manager to perfect our current data systems and build new capabilities as we grow.
The Role
We're seeking a Data Engineering & Integrations Manager to own the complete data lifecycle of our analytics platform. You'll be responsible for ensuring data accuracy, building robust integrations with ecommerce platforms, writing ETL pipelines, and creating meaningful analytics that our enterprise clients rely on for business decisions.
This is a hands-on technical role where you'll work closely with our current project lead and report to our incoming CTO. You'll be instrumental in scaling our data capabilities from beta to 100+ enterprise clients.
What You'll Do
Data Engineering & Pipeline Management
● Build and maintain integrations with ecommerce platforms using REST and GraphQL APIs
● Implement OAuth flows and manage platform authentication for secure data access
● Own the complete data pipeline using Dagster for orchestration, ensuring reliable data ingestion and transformation
● Develop and optimize ETL/ELT processes that can scale with rapid client growth
● Implement data validation and quality checks to guarantee accuracy across all client dashboards
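Data validation and quality checks of the kind described above are often expressed as small rule functions run against each batch before it reaches a dashboard. A sketch in plain Python; the field names and rules are assumptions for illustration, not this platform's actual schema:

```python
# Illustrative data-quality checks a pipeline might run before publishing
# a batch to client dashboards; field names and rules are assumptions.
def quality_report(rows, required_fields=("order_id", "revenue")):
    """Return a list of human-readable issues; empty list means the batch passes."""
    issues = []
    if not rows:
        issues.append("empty batch")
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) is None:
                issues.append(f"row {i}: missing {field}")
        # Simple anomaly rule: revenue should never be negative.
        if isinstance(row.get("revenue"), (int, float)) and row["revenue"] < 0:
            issues.append(f"row {i}: negative revenue")
    return issues

batch = [
    {"order_id": "A1", "revenue": 250.0},
    {"order_id": "A2", "revenue": None},   # missing metric
    {"order_id": "A3", "revenue": -5.0},   # anomaly
]
print(quality_report(batch))
```

An orchestrator such as Dagster can run checks like these as a gating step, failing the pipeline run instead of shipping bad numbers to a client.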
Analytics & Visualization Development
● Create analytics endpoints that power our frontend dashboards and client interfaces
● Transform raw platform data into actionable ecommerce insights and KPIs
● Build data models that support both current reporting and future AI-powered features
● Collaborate with our dedicated frontend developer to ensure smooth data delivery to client dashboards
System Optimization & Scaling
● Perfect existing data systems to ensure 100% accuracy and reliability
● Monitor and prevent data drift through automated validation and anomaly detection
● Optimize performance for real-time analytics as we scale to more clients and data volume
● Document data flows and processes for team knowledge sharing and compliance
Platform Integration Expertise
● Design integration strategies for new ecommerce platforms as we expand
● Troubleshoot platform API changes and maintain integration reliability
● Communicate infrastructure requirements to our dedicated DevOps team for optimal system performance
Note: You'll focus purely on data engineering and backend analytics - we have dedicated team members handling frontend development and infrastructure management.
What We're Looking For
Technical Skills (Required)
● Strong Python development experience, particularly with FastAPI and Dagster
● Advanced SQL skills and experience with PostgreSQL
● Cloud platform experience with AWS services (RDS, S3, etc.)
● Data engineering expertise with tools like Dagster, dbt, or similar orchestration platforms
● API integration experience with REST and GraphQL methodologies
● Data modeling and warehouse design experience
Domain Knowledge (Preferred)
● Ecommerce platform experience - understanding of Shopify, TikTok, Amazon, or similar platforms
● Analytics platform background - experience building customer-facing dashboards and reports
● Business intelligence tools knowledge for visualization and reporting
● Data validation and quality assurance methodologies
Experience Level
● 3-5+ years in data engineering, analytics engineering, or similar technical roles
● Proven track record building and maintaining data pipelines at scale
● Experience with customer-facing analytics or BI platforms
● Background in fast-paced startup or growth environments
Soft Skills
● Obsessive about data accuracy - you treat every dashboard as a promise to users
● Strong communication skills - can translate technical requirements to infrastructure team
● Problem-solving mindset - you proactively identify and fix data issues before they impact clients
● Collaborative approach - comfortable working closely with product, engineering, and business teams
What We Offer
Compensation & Benefits
● Competitive salary: ₱80,000 - ₱100,000/month based on experience
● Flexible work arrangement - primarily onsite in BGC, Taguig with WFH flexibility as needed
● Full benefits package
● Growth opportunity - join a fast-scaling startup with clear advancement paths
Technical Environment
● Modern data stack: Python FastAPI, PostgreSQL, AWS, Dagster
● Interesting challenges: Multi-platform integrations, real-time analytics, enterprise-scale data processing
● Direct impact: Your work directly enables client business decisions and platform growth
● Learning opportunities: Exposure to AI/ML integration and enterprise-scale data challenges
Team & Culture
● Collaborative team of 13 across all departments
● Fast-paced environment - we move quickly and don't let blockers persist
● Data-driven culture - decisions backed by metrics and evidence
● Direct mentorship from experienced project lead and incoming CTO
Success Metrics (First 6 Months)
● 100% data accuracy across all client integrations and dashboards
● Zero data downtime or missing data incidents
● Successful integration of at least 2 new platform connections
● Improved data pipeline performance supporting 3x client growth
● Clear documentation of all data processes and integration methods
Requirements
● Reliable commute to BGC, Taguig office
● Immediate availability preferred
Ready to Join Us?
If you're passionate about building reliable, scalable data systems that power real business decisions, we'd love to hear from you.
To apply, please include:
● Your resume highlighting relevant data engineering and integration experience
● Brief examples of data pipelines or integrations you've built
● Any experience with ecommerce platforms or analytics systems
Control Flow Labs is an equal opportunity employer committed to building a diverse and inclusive team.
Job Type: Full-time
Pay: Php80,000.00 - Php100,000.00 per month
Benefits:
- Additional leave
- Health insurance
Work Location: In person
Data Engineering Director
Posted today
Job Description
As Director of Data Engineering, You Will…
- Develop and execute a data strategy that aligns with the company's objectives, leading the data engineering team to build scalable and robust data solutions
- Mentor and lead a team of data engineers and architects
- Foster a collaborative, innovative, and results-oriented environment within the team
- Design and implement effective database solutions and models to store and retrieve company data
- Examine and identify database structural necessities by evaluating client operations, applications, and programming
- Manage timelines, resources, and deliverables for multiple projects simultaneously
- Ensure projects are completed on time and within budget while meeting business needs
- Ensure compliance with data governance and security requirements
- Establish policies and procedures for data handling and sharing
As Director of Data Engineering, You Have…
- A Degree or Diploma in Computer Science, Information Systems or equivalent experience
- Minimum of 8 years of experience in data engineering, with at least 3 years in a leadership role
- Strong knowledge of database management, warehousing solutions, and ETL processes
- Proficiency in SQL and familiarity with programming languages such as Python or Java
- Experience with cloud platforms (AWS, Azure, Google Cloud) and understanding of SaaS environments
- Knowledge of real-time analytics and BI tools like Looker, Tableau, PowerBI, or similar
- Certifications in data management, big data solutions, or cloud architecture
- Experience in a contact center or customer service environment
- Demonstrated ability to build and maintain high-performing data engineering teams
- Excellent communication and interpersonal skills, with the ability to engage and influence stakeholders at all levels
Job Type: Full-time
Benefits:
- Additional leave
- Free parking
- Health insurance
- Life insurance
- On-site parking
- Work from home
Work Location: Remote
Data Engineering Consultant
Posted today
Job Description
- Develop innovative solutions, create actionable insights, and drive better decisions and performance to help the business achieve its objectives
- Coordinate with all stakeholders and ensure the project meets requirements and objectives
- Establish a deep understanding of the business and collaborate with Business SMEs to shape the reporting solutions provided
- Resolve issues as they arise across areas of the project and where they impact on other activities, systems and projects
- Influences senior leadership to adopt new ideas, projects, and/or approaches
- Develops software solutions by studying information needs; conferring with users; studying systems flow, data usage, and work processes; investigating problem areas; collaborating on solutions to those problem areas; and following the software development lifecycle
- Performs all tasks in the development life cycle including requirements analysis, design, development, and testing
- Utilizes available programming methodologies and languages and adhere to coding standards, procedures and techniques while contributing to the technical code documentation
- Creates and helps maintain technical and end-user documentation for new and existing applications
- Performs all other related duties as assigned
- Identify and help mitigate risks to the data and organizational health
Must have skills:
7 years in MS SQL Server, ETL (SSIS), Snowflake, Databricks
Nice to have:
2 years in Microsoft .NET Framework / .Net Core, Microservices, Rest APIs, Python
2 years with telephony data experience
Job Types: Full-time, Permanent
Pay: Php100,000.00 - Php150,000.00 per month
Education:
- Bachelor's (Required)
Experience:
- Snowflake: 7 years (Required)
- Databricks: 7 years (Required)
- MS SQL: 7 years (Required)
- ETL (SSIS): 7 years (Required)
Work Location: In person
Senior Data Engineering
Posted today
Job Description
ABOUT DXC
DXC Technology is a Fortune 500 Global IT Services Leader and is ranked at 152. Our more than 130,000 people in 70-plus countries are entrusted by our customers to deliver what matters most. We use the power of technology to deliver mission critical IT services that transform global businesses. We deliver excellence for our customers, colleagues and communities around the world.
Accelerate your career and reimagine the possibilities with DXC
We inspire and take care of our people. Work in a culture that encourages innovation and where brilliant people embrace change and seize opportunities to advance their careers and amplify customer success. Leverage technology skills and deep industry knowledge to help clients. Work on transformation programs that modernize operations and drive innovation across our customer's entire IT estate using the latest technologies in cloud, applications, security, IT Outsourcing, business process outsourcing and modern workplace.
"At DXC Technology, we believe strong connections and community are key to our success. Our work model prioritizes in-person collaboration while offering flexibility to support wellbeing, productivity, individual work styles, and life circumstances. We're committed to fostering an inclusive environment where everyone can thrive." #DXCSMARTFirst
JOB DESCRIPTION
Roles and Responsibilities:
Bachelor's Degree in Computer Science or Engineering courses
3+ years of experience
Knime, Power BI, Databricks, Power Apps
Language:
English: C1 Advanced
What awaits you in DXC:
- Health Insurance (HMO) for you and dependents upon hiring
- Life Insurance coverage from day 1 of employment
- Days' Vacation and 15 Days Sick Leave
- Expanded maternity leave up to 120 days and Maternity Benefits
- Expanded paternity leave up to 30 days
- Non-Taxable Allowance (De-minimis)
- Company-sponsored trainings, upskilling, and certifications
- SMART First Working Arrangements
- Healthy and Encouraging Work Environment
- Recognition and Pay for Performance Culture
- Supplemental Pay (Standby/Shift)
- Retirement Program
- Employee Assistance Program
Data Engineering Consultant
Posted today
Job Description
Qualifications:
- College Graduate
- 7 years in MS SQL Server, ETL (SSIS), Snowflake, Databricks
Nice to have:
- 2 years in Microsoft .NET Framework/.NET Core, Microservices, REST APIs, Python; 2 years with telephony data experience
Work Schedule and Work Setup: Mid shift (5:30 PM - 2:30 MN); Hybrid
Location: Makati City
Job Type: Full-time
Pay: Php100,000.00 - Php150,000.00 per month
Work Location: In person
Data Engineering Manager
Posted today
Job Description
Objectives of this role
- Work with data to solve business problems, building and maintaining the infrastructure to answer questions and improve processes
- Help streamline our data science workflows, adding value to our product offerings and building out the customer lifecycle and retention models
- Work closely with the data science and business intelligence teams to develop data models and pipelines for research, reporting, and machine learning
- Be an advocate for best practices and continued learning
Responsibilities
- Work closely with our data science team to help build complex algorithms that provide unique insights into our data
- Use agile development processes to make iterative improvements to our back-end systems
- Model front-end and back-end data sources to help draw a more comprehensive picture of user flows throughout the system and to enable powerful data analysis
- Build data pipelines that clean, transform, and aggregate data from disparate sources
- Develop models that can be used to make predictions and answer questions for the overall business
Skills and qualifications
- Three or more years of experience with Python, SQL, and data visualization/exploration tools
- Familiarity with the Azure ecosystem, specifically Azure Data Lake Storage (ADLS), Azure Data Factory (ADF), Azure Analysis Services (AAS), and Azure Databricks
- Communication skills, especially for explaining technical concepts to nontechnical business leaders
- Ability to work on a dynamic, research-oriented team that has concurrent projects
Preferred qualifications
- Bachelor's degree (or equivalent) in computer science, information technology, engineering, or related discipline
- Familiarity with the DAMA-DMBOK framework
- Experience in building or maintaining ETL/ELT processes
- Professional certification
Job Type: Full-time
Benefits:
- Flexible schedule
- Health insurance
- Life insurance
Ability to commute/relocate:
- Makati: Reliably commute or planning to relocate before starting work (Required)
Work Location: In person
Director Data Engineering
Posted today
Job Description
About the Job
We are seeking a seasoned Director of Data Engineering to lead our data engineering team. As the Director, you will oversee the development and management of our data infrastructure, ensuring that it supports our business goals of enhancing customer experiences and operational efficiency. The ideal candidate will have a strong background in data architecture and engineering with proven experience in leading data teams in a fast-paced environment.
As Director of Data Engineering, You Will…
- Develop and execute a data strategy that aligns with the company's objectives, leading the data engineering team to build scalable and robust data solutions
- Mentor and lead a team of data engineers and architects
- Foster a collaborative, innovative, and results-oriented environment within the team
- Design and implement effective database solutions and models to store and retrieve company data
- Examine and identify database structural necessities by evaluating client operations, applications, and programming
- Manage timelines, resources, and deliverables for multiple projects simultaneously
- Ensure projects are completed on time and within budget while meeting business needs
- Ensure compliance with data governance and security requirements
- Establish policies and procedures for data handling and sharing
As Director of Data Engineering, You Have…
- A Degree or Diploma in Computer Science, Information Systems or equivalent experience
- Minimum of 8 years of experience in data engineering, with at least 3 years in a leadership role
- Strong knowledge of database management, warehousing solutions, and ETL processes
- Proficiency in SQL and familiarity with programming languages such as Python or Java
- Experience with cloud platforms (AWS, Azure, Google Cloud) and understanding of SaaS environments
- Knowledge of real-time analytics and BI tools like Looker, Tableau, PowerBI, or similar
- Certifications in data management, big data solutions, or cloud architecture
- Experience in a contact center or customer service environment
- Demonstrated ability to build and maintain high-performing data engineering teams
- Excellent communication and interpersonal skills, with the ability to engage and influence stakeholders at all levels
Data Engineering Manager
Posted today
Job Description
About Netskope
Today, there's more data and users outside the enterprise than inside, causing the network perimeter as we know it to dissolve. We realized a new perimeter was needed, one that is built in the cloud and follows and protects data wherever it goes, so we started Netskope to redefine Cloud, Network and Data Security.
Since 2012, we have built the market-leading cloud security company and an award-winning culture powered by hundreds of employees spread across offices in Santa Clara, St. Louis, Bangalore, London, Paris, Melbourne, Taipei, and Tokyo. Our core values are openness, honesty, and transparency, and we purposely developed our open desk layouts and large meeting spaces to support and promote partnerships, collaboration, and teamwork. From catered lunches and office celebrations to employee recognition events and social professional groups such as the Awesome Women of Netskope (AWON), we strive to keep work fun, supportive and interactive. Visit us at Netskope Careers. Please follow us on LinkedIn.
Position Summary
We are seeking a highly skilled and experienced
Data Engineering Manager
to be the Data Operations Chapter leader. In this role you will build, lead, and mentor a team of data engineers, devops engineers, data analysts, BI analysts and other associated roles to design, build, and maintain robust data pipelines, infrastructure, and visualizations that power our analytics and operational systems.
This is a hands-on role and you will be an integral part of the development team building data architecture and workflows, optimizing data pipelines, and implementing best practices for data management. This role requires strong expertise in modern data engineering technologies, cloud platforms, and a deep understanding of data governance, security, and compliance.
Key Responsibilities
- Lead, mentor, and inspire a team of data engineers, fostering professional growth and collaboration.
- Work closely with Product Owners, stakeholders, and engineering teams to define requirements and set priorities.
- Actively engage in Agile ceremonies to ensure team alignment and successful project execution.
- Drive process, tool, and workflow enhancements to optimize team performance.
- Design, build, and maintain scalable data pipelines for batch and real-time processing.
- Architect and implement solutions like data lakes, warehouses, and real-time streaming systems.
- Optimize cloud-based data infrastructure (e.g., AWS, Azure, GCP) for performance, reliability, cost-efficiency, and security.
- Ensure data quality, consistency, and compliance through governance, automated testing, and monitoring.
- Refactor legacy pipelines and integrate CI/CD frameworks to automate workflows.
- Partner with stakeholders, data scientists, and analysts to align data solutions with business goals.
- Troubleshoot, debug, and resolve complex data issues to ensure operational excellence.
- Evaluate and integrate new technologies to stay current with industry trends and enhance capabilities.
Qualifications
Required:
- Bachelor's degree in Computer Science, Data Engineering, or a related field (or equivalent experience).
- 7+ years of experience in data engineering or related roles, focusing on building and optimizing data pipelines and architectures.
- 2+ years of experience in a team lead or manager position.
- Proficiency in Python, Java, or Scala, along with strong SQL skills.
- Hands-on experience with AWS, Azure, or GCP and data services like S3, Redshift, BigQuery, Databricks, or Snowflake.
- Strong knowledge of data integration platforms and ETL/ELT frameworks such as Apache Airflow, dbt, Fivetran or Stitch.
- Experience with distributed data processing technologies like Apache Spark, Kafka, or Hadoop.
- Solid understanding of data modeling techniques for structured and unstructured data.
- Familiarity with data governance and compliance standards, including GDPR and CCPA.
- Excellent problem-solving abilities to troubleshoot and resolve complex data issues.
- Strong communication and collaboration skills to work effectively with cross-functional teams.
- Proven track record of leading or managing engineering teams and delivering complex projects.
- Deep understanding of CI/CD frameworks and processes for data pipelines.
- Fluent in English
Preferred:
- Master's degree in Computer Science, Data Engineering, or a related field.
- Experience with data versioning, orchestration, and CI/CD for data pipelines.
- Knowledge of machine learning workflows and integrating data pipelines with ML models.
- Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker).
- Certifications in relevant technologies, such as AWS Certified Data Analytics or Google Cloud Professional Data Engineer.
About You
You are a detail-oriented, innovative, and self-driven professional with a passion for building scalable data solutions. You thrive in a collaborative environment, leveraging your expertise to solve complex data challenges and drive impactful outcomes. With a deep understanding of modern data technologies, you are committed to delivering high-quality, reliable, and efficient data infrastructure that supports business goals.
Netskope is committed to implementing equal employment opportunities for all employees and applicants for employment. Netskope does not discriminate in employment opportunities or practices based on religion, race, color, sex, marital or veteran status, age, national origin, ancestry, physical or mental disability, medical condition, sexual orientation, gender identity/expression, genetic information, pregnancy (including childbirth, lactation and related medical conditions), or any other characteristic protected by the laws or regulations of any jurisdiction in which we operate.
Netskope respects your privacy and is committed to protecting the personal information you share with us, please refer to Netskope's Privacy Policy for more details.