182 Cloud Data jobs in the Philippines
Cloud Data Engineer
Posted today
Job Viewed
Job Description
- Design, build, and maintain cloud-based data pipelines and workflows that support analytics and operational systems.
- Integrate data from various sources using APIs and cloud services.
- Develop clean, efficient, and test-driven code in Python for data ingestion and processing (see the sketch after this list).
- Optimize data storage and retrieval using big data formats like Apache Parquet and ORC.
- Implement robust data models, including relational, dimensional, and NoSQL models.
- Collaborate with cross-functional teams to gather and refine requirements and deliver high-quality solutions.
- Deploy infrastructure using Infrastructure as Code (IaC) tools such as AWS CloudFormation or CDK.
- Monitor and orchestrate workflows using Apache Airflow or Dagster.
- Follow best practices in data governance, quality, and security.
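The sketch below illustrates the kind of ingestion work described above: pulling records from a REST API with Python and landing them as Parquet. It is a minimal sketch only; the endpoint, payload layout, and output path are hypothetical assumptions, not part of this role's actual systems.

```python
# Minimal sketch: fetch records from a hypothetical REST API and write them as Parquet.
# Requires: requests, pandas, pyarrow. Writing straight to an s3:// path would also need s3fs.
import requests
import pandas as pd

API_URL = "https://api.example.com/v1/orders"  # hypothetical source endpoint
OUTPUT_PATH = "orders.parquet"                 # local path for illustration

def ingest_orders() -> None:
    # Pull one page of records from the source API.
    response = requests.get(API_URL, params={"limit": 1000}, timeout=30)
    response.raise_for_status()
    records = response.json()

    # Flatten the JSON payload into a DataFrame and write columnar Parquet.
    df = pd.json_normalize(records)
    df.to_parquet(OUTPUT_PATH, engine="pyarrow", compression="snappy", index=False)

if __name__ == "__main__":
    ingest_orders()
```

A test-driven version of this would keep the HTTP call behind a small interface so the flattening and validation logic can be unit-tested against fixture payloads.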
Cloud Data Engineer
Posted today
Job Viewed
Job Description
Job Title & Qualification:
Cloud Data Engineer; 3 years of solid experience as a Data Engineer with expertise in cloud technology
Salary Range: Php 60,000 to Php 80,000 (depending on experience)
Nature of Engagement: Project-based for 6 months with a possibility of extension
Work Schedule: Hybrid (3 to 4 days a week onsite; 1-2 days work from home)
Work Location: UP Ayala Technopark, QC
Others: US Holidays will be observed instead of PH holidays
Cloud Data Engineer
Posted today
Job Viewed
Job Description
As a Cloud Data Engineer at Pointwest, you will design, build, and maintain scalable, secure cloud data pipelines and systems to support the Insurance Business Unit. You'll work with AWS technologies, Python, and modern data tools to ensure data quality, optimize data storage, and drive insights that delight our stakeholders.
This role is ideal for someone passionate about data engineering, cloud technologies, and continuous learning, and who thrives in a culture of Agility, Accountability, Innovation, Collaboration, and Customer Centricity—all while upholding our values of Leadership, Excellence, and Innovation.
What You'll Do
- Deliver impactful data solutions that meet stakeholder needs using AWS Glue, Lambda, S3, and other services (see the sketch after this list).
- Develop, maintain, and optimize data pipelines and implement ETL workflows.
- Collaborate with cross-functional teams to define requirements and build scalable data architectures.
- Ensure data quality and security, applying best practices and compliance standards.
- Support business growth by enabling advanced analytics and reporting capabilities.
- Continuously improve processes through automation and optimization.
- Mentor peers and grow expertise, sharing best practices and new learnings within the team.
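As a rough illustration of the Lambda/S3 responsibilities above, here is a minimal, hypothetical S3-triggered Lambda step; the event wiring, the assumption that objects contain JSON arrays, and the bucket contents are illustrative only, not Pointwest's actual pipeline.

```python
# Minimal sketch of an S3-triggered AWS Lambda step in an ETL flow.
# Assumes an S3 event notification invokes this handler and objects hold JSON arrays.
import json
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # Each record in the event describes one newly arrived object.
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Read the raw object; a real pipeline would validate and transform here
        # before writing the result to a curated location.
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        rows = json.loads(body)
        print(f"Processed {len(rows)} rows from s3://{bucket}/{key}")

    return {"status": "ok"}
```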
What Success Looks Like
- High stakeholder satisfaction through timely and high-quality delivery of data solutions
- Reduced data processing times and optimized resource usage
- Increased reuse of data pipeline components and automation
- Positive peer feedback and active contribution to team learning
What You'll Bring
- At least 5 years of experience in data engineering
- Strong Python skills for data handling and API integration
- Proficiency in SQL and big data formats (Parquet, ORC)
- Hands-on experience with AWS services like Glue, Lambda, Step Functions, IAM, S3
- Knowledge of data modeling (Relational, Dimensional, NoSQL)
- Experience with orchestration tools (Airflow, Dagster)
- Familiarity with IaC using AWS CloudFormation or CDK
- Excellent communication skills and a collaborative mindset
Nice-to-Haves
- Experience with Apache Spark, data streaming tools (Kinesis, Kafka), or working in multicultural client-facing teams
Grow with Pointwest
Here, you don't just build systems—you build careers, capabilities, and communities. Join us in creating cloud data solutions that power the insurance industry forward.
Cloud Data Engineer
Posted today
Job Viewed
Job Description
We have immediate opportunities for an experienced Cloud Data Engineer. Your primary responsibility will be to design, deploy, and maintain cloud-based solutions primarily on the Microsoft Azure platform. You will work closely with our internal leadership, on- and offshore development teams, and the data & analytics team to ensure the successful implementation and operation of cloud services. Your role will involve designing and implementing scalable, secure, and highly available cloud architecture, as well as troubleshooting and resolving any issues that arise.
- Azure Solution Design: Collaborate with development teams and architects to design cloud-based solutions on the Azure platform. Evaluate requirements, propose design options, and recommend best practices for scalability, performance, security, and cost optimization.
- Design, implement, and maintain infrastructure solutions based on Azure services, e.g., configuring and managing virtual networks, subnets, routing tables, network security groups, and load balancers within the Azure environment; configuring IPsec VPN tunnels; and troubleshooting network-related issues and providing timely resolution to minimize downtime.
- Creating a robust data engineering culture with defined frameworks for development, code review, code standards, deployment practices, testing, security, privacy, and documentation.
- Working with an Agile Project Manager to manage team backlog and deliver projects using an Agile Framework.
- Technical oversight of BI/Data Lake Infrastructure and other enterprise data assets. You will work closely with architects and engineers to develop a comprehensive infrastructure plan that can handle the organization's current and future data needs.
- Implementing quality control measures to ensure data integrity, including data validation, cleansing, and execution of defined data governance practices.
- Collaborating with other enterprise stakeholders, but most notably with data science, business intelligence, and IT organizations, to understand and document requirements for the company's data roadmap.
- Researching new tools and technologies which will enable the company's data vision while monitoring and evaluating the effectiveness of current solutions to ensure they deliver the expected value.
- Managing and optimizing data engineering budgets to ensure cost-effective utilization of resources, including identifying cost-saving opportunities, evaluating vendor contracts, and allocating budget for data infrastructure upgrades.
Qualifications:
- Bachelor's degree in computer science, or a related field (or equivalent work experience).
- 5+ years' experience in Power BI and Data Lake required
- Strong experience in designing, implementing, and managing cloud solutions
- Proficiency in Azure services, including compute, storage, networking, and identity and access management
- CI/CD, GCP/Azure Pipelines, Azure DevOps, Lambda
- Proficient in Java, Python
- Experience with React is a plus
- Proven experience in designing and implementing network infrastructure solutions in Microsoft Azure.
- Strong knowledge of Azure networking services, including virtual networks, VPN gateways, ExpressRoute, Azure Firewall, and Azure Load Balancer.
- Knowledge of security best practices and compliance standards related to cloud environments.
- Solid understanding of Data Storage and retrieval.
- Excellent communication and collaboration skills to work effectively with teams and stakeholders.
Cloud Data Engineer
Posted today
Job Viewed
Job Description
Work Arrangements: Hybrid (UP Techno Hub, Quezon City Philippines)
Work Schedule: Mid-shift (3pm - 12mn) on office days; 4pm to 1am on WFH days; 1 month of night shift at the start of the project.
Job Summary:
We are looking for a skilled Cloud Data Engineer with a strong foundation in data engineering, cloud-based architectures, and open-source technologies. The ideal candidate will have hands-on experience in designing, building, and optimizing data pipelines and data integrations using AWS services and Python. You will play a critical role in supporting our data-driven decision-making by ensuring seamless and reliable data flow across our systems.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines for data ingestion, transformation, and integration.
- Implement and manage big data storage formats such as Apache Parquet and ORC.
- Write high-quality Python code for data processing and API integration with a focus on test-driven development.
- Develop and optimize SQL queries for various analytical and operational needs.
- Work with various data modeling techniques: relational, dimensional, and NoSQL.
- Implement infrastructure as code (IaC) using AWS CloudFormation or CDK.
- Leverage AWS services such as Glue, Lambda, IAM, DynamoDB, S3, and Step Functions for building data solutions.
- Use orchestration tools like Apache Airflow or Dagster to manage workflows (see the sketch after this list).
- Collaborate with cross-functional and multicultural teams and, where applicable, engage with clients.
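As a small, hypothetical example of the orchestration mentioned above, the sketch below defines a daily Airflow DAG with the TaskFlow API; the DAG id, schedule, and task bodies are placeholders rather than the team's actual workflow.

```python
# Minimal Apache Airflow (2.x TaskFlow) sketch of a daily ingest-then-transform DAG.
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def example_ingest_pipeline():
    @task
    def extract() -> list:
        # Pull raw records from a source system (stubbed here).
        return [{"id": 1, "amount": 10.0}]

    @task
    def transform(records: list) -> int:
        # Apply cleaning/business rules; return a simple row count for illustration.
        return len(records)

    transform(extract())

# Instantiating the decorated function registers the DAG with the scheduler.
example_ingest_pipeline()
```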
Required Experience:
- At least 5 years in a data engineering role, focusing on data integration, processing, and transformation.
- Strong programming skills in Python, with emphasis on API integration and data libraries, following test-driven development.
- Proficiency with big data storage formats (Apache Parquet, ORC) including optimization strategies.
- Proficiency with SQL for analytical and operational queries.
- Experience with data modeling: relational (required), dimensional (nice-to-have), and NoSQL (required).
- Infrastructure as Code (IaC): working knowledge of AWS CloudFormation or CDK.
- AWS services (required): Glue, IAM, Lambda, DynamoDB, Step Functions, S3, CloudFormation or CDK.
- AWS services (nice-to-have): Athena, Kinesis, MSK, MWAA, SQS.
- Data Orchestration: Hands-on with EventBridge, Step Functions, Lambda; experience with Apache Airflow (nice-to-have).
- Data Streaming: Experience with Kinesis or Kafka (nice-to-have).
- Big Data Processing: Exposure to Apache Spark (nice-to-have).
- Client-facing experience, multicultural collaboration, and technical leadership (nice-to-have).
Application Process:
If you meet the qualifications and are passionate about driving impactful cloud solutions, we encourage you to apply. Please note that only qualified candidates will be contacted. JWay Group is proud to be an Equal Opportunity Employer. We value diversity and are committed to fostering an inclusive and supportive workplace for all.
Cloud Data Engineer
Posted today
Job Viewed
Job Description
Background on what ING is about (Generic ING background):
ING Hubs Philippines (ING Hubs PH) is an international part of the ING organization delivering services to many Business Units across the world for both Wholesale Banking and Retail Banking activities. Working for ING Hubs PH means working with a highly diverse workforce where no challenge is the same.
At ING our purpose is to empower people to stay a step ahead in life and business. We believe that sustainable progress is driven by people with the imagination and determination to make a better future for themselves and those around them.
ING is changing what banking is. For you, that means plenty of opportunities for personal growth in a continuously evolving environment. If this is the environment you thrive in, then apply and join us in changing the future of banking.
Job Overview
The Cloud Data Engineer is a leader in data design, analysis and development, providing insights and identifying opportunities for improvement. Business and systems analysis, development and testing, combined with business risk assessments and data quality monitoring are also components of the role.
The role involves working with large volumes of data, both structured and unstructured, and implementing data solutions on the Azure platform. You will be responsible for the design and build of complex data pipelines and for extending platform capabilities to enable real-time data and advanced analytics services using MS Azure native data technologies.
The Cloud Data Engineer will also be responsible for coaching and mentoring less experienced team members to build their capability. Working collaboratively with product owners and strong stakeholder relationship management skills will be key to driving the right outcomes.
Key Responsibilities
Collaborate with the Business to fully understand business requirements.
Provide ETL solutions that support Business Information requirements in Informatica.
Design and develop modern data processes using Informatica to ING's standards.
Analyze source systems to identify appropriate sources of data.
Identify and resolve data quality issues.
Identify opportunities to value-add to source system data.
Estimate effort involved in work tasks.
Implement and enhance automation processes.
Design solutions that are in keeping with ING's data mesh architecture.
Improve system performance by tuning and optimizing the platform and platform workloads.
Use components of the Project Life Cycle to enhance quality in areas of Documentation, Standards, Formal and informal code reviews – as either reviewer or author.
Work to project plans and schedules.
Re-use components whenever possible.
Promoting 'Best Practice' ethos.
Key Capabilities/Experience
7+ years' experience in data warehousing or a related function.
5+ years' experience as a Data Integration Expert.
Experience delivering 3+ end-to-end BI projects.
Minimum Qualifications
Big Data experience as engineer or admin in a modern big data stack (Kafka, Apache Beam/Flink/Spark, Hadoop 3.0, etc.).
Experience with open source and application-oriented languages within a data-focused environment, with preference for Python, Java, Scala, R, and Spark.
Experience with delivering end-to-end data integration solutions in a cloud environment, with preference for Azure.
Advanced knowledge of a modern data application.
Strong knowledge in Informatica Power Center and related products.
Experience in Logical and Physical Data Modelling, including dimensional, relational, and flat-file data structures.
Fluent in Kimball or Inmon data warehouse modelling techniques.
Fluent in SQL.
Relational database experience (preferably MS SQL Server).
Cloud Data Engineer
Posted today
Job Viewed
Job Description
A7 Recruitment Corporation is hiring for a full-time Cloud Data Engineer role in Salcedo Village, NCR. Apply now to be part of our team.
Job summary:
- Looking for candidates available to work:
- Monday: Afternoon
- Tuesday: Afternoon
- Wednesday: Afternoon
- Thursday: Afternoon
- Friday: Afternoon
- Expected salary: ₱150,000 - ₱200,000 per month
Work Details:
• Job Title: Senior Snowflake Data Engineer
• Work Setup: Hybrid (3x/week onsite in BGC, Taguig)
• Shift: Mid Shift
Key Responsibilities:
• Apply strong understanding of ETL, data warehousing, and BI (Qlik) concepts to support advanced analytics initiatives
• Design and implement scalable data pipelines using ETL tools such as AWS Glue
• Work with Snowflake, PostgreSQL, and MS SQL for data storage and processing (see the sketch after this list)
• Develop and maintain cloud-enabled solutions using AWS services (e.g., RDS, Fargate)
• Write clean, efficient code using Java and/or Python for data engineering tasks
• Collaborate with cross-functional teams in an Agile environment (Kanban, SAFe, SCRUM)
• Ensure data quality, performance, and security across platforms
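For context on the Snowflake work listed above, a minimal Python sketch using the official snowflake-connector-python package is shown below; the account, warehouse, database, and table names are placeholder assumptions only.

```python
# Minimal sketch: run an analytical query against Snowflake from Python.
# Requires: snowflake-connector-python. All connection values are placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",   # hypothetical warehouse
    database="ANALYTICS_DB",    # hypothetical database
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Aggregate a hypothetical orders table as an example analytical query.
    cur.execute(
        "SELECT order_date, SUM(amount) AS total_amount "
        "FROM orders GROUP BY order_date ORDER BY order_date"
    )
    for order_date, total_amount in cur.fetchall():
        print(order_date, total_amount)
finally:
    conn.close()
```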
Qualifications:
• Minimum 7+ years of experience in data engineering or related roles
• Strong practical experience with Snowflake, PostgreSQL, and MS SQL
• Proficient in Java and/or Python programming
• Hands-on experience with AWS services (RDS, Fargate, etc.)
• Experience with ETL tools such as AWS Glue
• Familiarity with BI tools like Qlik is a plus
• Strong understanding of Agile frameworks (Kanban, SAFe, SCRUM); certification is a plus
• Excellent problem-solving and communication skills
Cloud Data Engineer
Posted today
Job Viewed
Job Description
The Cloud Data Engineer will build scalable, cloud-native data pipelines. The ideal candidate must have a strong foundation in Python and AWS technologies.
The role follows a hybrid work arrangement, requiring regular onsite work at UP Ayala Technohub in Quezon City. Candidates must be willing to report onsite, work on mid-shift and graveyard schedules, and observe U.S. holidays instead of Philippine holidays.
Key Responsibilities:
- Design, build, and maintain cloud-based data pipelines and workflows that support analytics and operational systems.
- Integrate data from various sources using APIs and cloud services.
- Develop clean, efficient, and test-driven code in Python for data ingestion and processing.
- Optimize data storage and retrieval using big data formats like Apache Parquet and ORC.
- Implement robust data models, including relational, dimensional, and NoSQL models.
- Collaborate with cross-functional teams to gather and refine requirements and deliver high-quality solutions.
- Deploy infrastructure using Infrastructure as Code (IaC) tools like AWS CloudFormation or CDK (see the sketch after this list).
- Monitor and orchestrate workflows using Apache Airflow or Dagster.
- Follow best practices in data governance, quality, and security.
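To make the IaC responsibility above concrete, here is a minimal AWS CDK (v2, Python) sketch that provisions a landing bucket and an ingestion Lambda; the stack name, construct names, and the lambda_src/ asset directory are hypothetical, not this project's actual infrastructure.

```python
# Minimal AWS CDK v2 (Python) sketch: an S3 landing bucket plus an ingestion Lambda.
# Requires: aws-cdk-lib and constructs. Names and asset paths are illustrative only.
from aws_cdk import App, Stack, Duration, aws_s3 as s3, aws_lambda as _lambda
from constructs import Construct

class DataPipelineStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Landing bucket for raw files.
        raw_bucket = s3.Bucket(self, "RawDataBucket", versioned=True)

        # Ingestion function; 'lambda_src/' is a hypothetical local asset directory.
        ingest_fn = _lambda.Function(
            self,
            "IngestFunction",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="ingest.handler",
            code=_lambda.Code.from_asset("lambda_src"),
            timeout=Duration.minutes(5),
        )

        # Let the function read and write the landing bucket.
        raw_bucket.grant_read_write(ingest_fn)

app = App()
DataPipelineStack(app, "DataPipelineStack")
app.synth()
```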
Core Expertise:
- Experience: At least 3 years in a data engineering role working on data integration, processing, and transformation use cases with open-source languages (e.g., Python) and cloud technologies.
- Strong programming skills in Python specifically for API integration and data libraries, with emphasis on quality and test-driven development.
- Demonstrated proficiency with big data storage formats (Apache Parquet, ORC) and practical knowledge of pitfalls and optimization strategies.
- Demonstrated proficiency with SQL.
- Experience with data modeling: relational, dimensional, and NoSQL.
- Working knowledge of IaC on AWS (CloudFormation or CDK).
- Working knowledge of AWS services (required): Glue, IAM, Lambda, DynamoDB, Step Functions, S3, CloudFormation or CDK.
- Working knowledge of AWS services (nice-to-have): Athena, Kinesis, MSK, MWAA, SQS.
- Experience with orchestration of data flows/pipelines: Apache Airflow or Dagster.
Nice-to-have:
- Experience with data streaming (Kinesis, Kafka)
- Experience with Apache Spark
- Client-facing experience, multi-cultural team experience, technical leadership, team leadership
Key Competencies & Abilities:
- Ability to work independently and collaboratively in a team environment.
- High level of attention to detail and commitment to delivering quality work.
- Strong analytical and critical thinking skills.
- Effective time management and organizational skills.
- Strong customer focus and the ability to communicate technical concepts clearly to stakeholders.
- Excellent written and verbal communication skills.
Additional Information:
- Must be willing to work on a hybrid setup, with onsite reporting to UP Ayala Technohub, Quezon City.
- Must be available to start by October 1, 2025.
- Engagement is project-based for 6 months, with a possibility of extension.
- Work schedule is on a mid-shift and graveyard rotation.
- The role observes U.S. holidays instead of Philippine holidays.
Cloud Data Engineer
Posted today
Job Viewed
Job Description
About IRely
In 2008, iRely began selling its Commodity Management system to producers and processors. Since then, we have continued to expand and implement new software designed to deliver business management solutions for the petroleum distribution, retail, agriculture, and commodity industries.
Today, iRely remains privately owned and self-funded, with a long-term ownership plan that ensures private ownership for decades to come. Our commitment to customer success has made us a global leader in digital transformation.
Headquartered in Dallas, Texas, with offices in Bangalore (India) and Makati City (Philippines), iRely has nearly 40 years of experience providing end-to-end ERP and CTRM solutions to over 500 customers in more than 25 countries.
Here at iRely, we understand that your Business isn't simple, but our Innovative Software Solutions are.
Job Description
As a Cloud Data Engineer at iRely, your primary responsibility will be to design, deploy, and maintain cloud-based solutions primarily on the Microsoft Azure platform. You will work closely with our internal leadership, on and offshore development teams, and data & analytics team to ensure the successful implementation and operation of cloud services. Your role will involve designing and implementing scalable, secure, and highly available cloud architecture, as well as troubleshooting and resolving any issues that arise.
- Azure Solution Design: Collaborate with development teams and architects to design cloud-based solutions on the Azure platform. Evaluate requirements, propose design options, and recommend best practices for scalability, performance, security, and cost optimization.
- Design, implement, and maintain infrastructure solutions based on Azure services, e.g., configuring and managing virtual networks, subnets, routing tables, network security groups, and load balancers within the Azure environment; configuring IPsec VPN tunnels; and troubleshooting network-related issues and providing timely resolution to minimize downtime.
- Creating a robust data engineering culture with defined frameworks for development, code review, code standards, deployment practices, testing, security, privacy, and documentation.
- Working with an Agile Project Manager to manage team backlog and deliver projects using an Agile Framework.
- Technical oversight of BI/Data Lake Infrastructure and other enterprise data assets. You will work closely with architects and engineers to develop a comprehensive infrastructure plan that can handle the organization's current and future data needs.
- Implementing quality control measures to ensure data integrity, including data validation, cleansing, and execution of defined data governance practices.
- Collaborating with other enterprise stakeholders, but most notably with data science, business intelligence, and IT organizations, to understand and document requirements for the company's data roadmap.
- Researching new tools and technologies which will enable the company's data vision while monitoring and evaluating the effectiveness of current solutions to ensure they deliver the expected value.
- Managing and optimizing data engineering budgets to ensure cost-effective utilization of resources, including identifying cost-saving opportunities, evaluating vendor contracts, and allocating budget for data infrastructure upgrades.
Qualifications
- Bachelor's degree in computer science, or a related field (or equivalent work experience).
- 5+ years' experience in Power BI and Data Lake required
- Strong experience in designing, implementing, and managing cloud solutions
- Proficiency in Azure services, including compute, storage, networking, and identity and access management
- CI/CD, GCP/Azure Pipelines, Azure DevOps, Lambda
- Proficient in Java, Python
- Experience with React is a plus
- Proven experience in designing and implementing network infrastructure solutions in Microsoft Azure.
- Strong knowledge of Azure networking services, including virtual networks, VPN gateways, ExpressRoute, Azure Firewall, and Azure Load Balancer.
- Knowledge of security best practices and compliance standards related to cloud environments.
- Solid understanding of Data Storage and retrieval.
- Excellent communication and collaboration skills to work effectively with teams and stakeholders.
Why Choose Us?
At iRely, we empower our team members to lead with innovation and consistently exceed customer expectations. As a hands-on leader passionate about ERP Implementation and team success, you'll have the chance to make a meaningful impact in this role. We provide competitive compensation, comprehensive benefits, and clear pathways for career growth, all within a supportive, collaborative environment that values your contributions.
Diversity and Inclusion
We believe that different perspectives and backgrounds are what make a company flourish. All qualified applicants will receive equal consideration for employment regardless of race, color, religion, sex, sexual orientation, gender identity, national origin, economic status, disability, age, or any other legally protected characteristics. We are proud to be an inclusive company with values grounded in equality and ethics, where we celebrate, support, and embrace diversity.
Cloud Data Engineer
Posted today
Job Viewed
Job Description
- REQ
- 18/09/2025
- IT Engineering
- Makati City, Philippines
- ING Hubs
Job details
Background on what ING is about (Generic ING background):
ING Hubs Philippines (ING Hubs PH) is an international part of the ING organization delivering services to many Business Units across the world for both Wholesale Banking and Retail Banking activities. Working for ING Hubs PH means working with a highly diverse workforce where no challenge is the same.
At ING our purpose is to empower people to stay a step ahead in life and business. We believe that sustainable progress is driven by people with the imagination and determination to make a better future for themselves and those around them.
ING is changing what banking is. For you, that means plenty of opportunities for personal growth in a continuously evolving environment. If this is the environment you thrive in, then apply and join us in changing the future of banking.
Job Overview
The Cloud Data Engineer is a leader in data design, analysis and development, providing insights and identifying opportunities for improvement. Business and systems analysis, development and testing, combined with business risk assessments and data quality monitoring are also components of the role.
The role involves working with large volumes of data, both structured and unstructured, and implementing data solutions on the Azure platform. You will be responsible for the design and build of complex data pipelines and for extending platform capabilities to enable real-time data and advanced analytics services using MS Azure native data technologies.
The Cloud Data Engineer will also be responsible for coaching and mentoring less experienced team members to build their capability. Working collaboratively with product owners and strong stakeholder relationship management skills will be key to driving the right outcomes.
Key Responsibilities
- Collaborate with the Business to fully understand business requirements.
- Provide ETL solutions that support Business Information requirements in Informatica.
- Design and develop modern data processes using Informatica to ING's standards.
- Analyze source systems to identify appropriate sources of data.
- Identify and resolve data quality issues.
- Identify opportunities to value-add to source system data.
- Estimate effort involved in work tasks.
- Implement and enhance automation processes.
- Design solutions that are in keeping with ING's data mesh architecture.
- Improve system performance by tuning and optimizing the platform and platform workloads.
- Use components of the Project Life Cycle to enhance quality in areas of Documentation, Standards, Formal and informal code reviews – as either reviewer or author.
- Work to project plans and schedules.
- Re-use components whenever possible.
- Promoting 'Best Practice' ethos.
Key Capabilities/Experience
- 7+ years' experience in data warehousing or a related function.
- 5+ years' experience as a Data Integration Expert.
- Experience delivering 3+ end-to-end BI projects.
Minimum Qualifications
- Big Data experience as engineer or admin in a modern big data stack (Kafka, Apache Beam/Flink/Spark, Hadoop 3.0, etc.).
- Experience with open source and application-oriented languages within a data-focused environment, with preference for Python, Java, Scala, R, and Spark.
- Experience with delivering end-to-end data integration solutions in a cloud environment, with preference for Azure.
- Advanced knowledge of a modern data application.
- Strong knowledge in Informatica Power Center and related products.
- Experience in Logical and Physical Data Modelling, including dimensional, relational, and flat-file data structures.
- Fluent in Kimball or Inmon data warehouse modelling techniques.
- Fluent in SQL.
- Relational database experience (preferably MS SQL Server).
At ING, we want to bring out the best in people. That is why we have an inclusive culture in which everyone gets the opportunity to grow and make a difference for our customers and society. Diversity, equity, and inclusion always come first for us. We treat everyone fairly, regardless of age, gender, gender identity, cultural background, experience, religion, race, ethnicity, disability, family situation, sexual orientation, social background, or anything else. Do you need help, or can we do something for you during your application or interview? Please contact the recruiter listed with the vacancy. We will gladly work with you to make the process fair and accessible. Read more here about how we stand for diversity, inclusion, and belonging.