749 AWS Data jobs in the Philippines

AWS Data Engineer

₱900,000 - ₱1,200,000 per year · Unison Group

Posted today


Job Description

  • Design and build scalable ETL pipelines and data integration workflows using AWS services and Python.
  • Develop and optimize data lake and data warehouse solutions for structured and unstructured data.
  • Leverage Apache Spark for large-scale data processing and transformation tasks.
  • Collaborate with cross-functional teams to gather requirements and deliver clean, usable datasets for analytics and reporting.
  • Ensure high data quality, security, and compliance in all stages of the data lifecycle.
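At its core, the pipeline work described in these bullets reduces to extract-transform-load steps with data-quality gates. The sketch below shows one such transform step in plain Python; the field names and validation rules are illustrative only, not taken from the posting.

```python
# Minimal sketch of one ETL transform step: normalize raw records and apply a
# data-quality gate before loading. Field names here are hypothetical.

from datetime import datetime, timezone

def transform(records):
    """Drop incomplete rows and normalize types, as a warehouse load step might."""
    cleaned = []
    for r in records:
        if not r.get("user_id") or r.get("amount") is None:
            continue  # data-quality gate: skip incomplete records
        cleaned.append({
            "user_id": str(r["user_id"]).strip(),
            "amount": round(float(r["amount"]), 2),
            "loaded_at": datetime.now(timezone.utc).isoformat(),
        })
    return cleaned

raw = [
    {"user_id": " 42 ", "amount": "19.99"},
    {"user_id": None, "amount": 5},      # rejected: missing user_id
    {"user_id": "7", "amount": None},    # rejected: missing amount
]
rows = transform(raw)
print(len(rows))  # 1
```

In a real AWS pipeline this logic would typically live inside a Glue job or Lambda function, with S3 as source and sink.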
Requirements
  • 5+ years of hands-on experience in data engineering roles.
  • Proficiency in Python for building and automating data pipelines.
  • Strong experience with AWS services (S3, Glue, Redshift, Lambda, etc.).
  • Solid understanding of ETL processes and modern data warehousing concepts.
  • Experience with big data tools, especially Apache Spark (PySpark preferred).
  • Familiarity with DevOps and CI/CD practices for data pipeline deployment.
  • Knowledge of data governance and cataloging tools.
  • Strong problem-solving, communication, and collaboration skills.
This advertiser has chosen not to accept applicants from your region.

AWS Data Engineer

₱1,200,000 - ₱2,400,000 per year · Deloitte

Posted today


Job Description

Title: Cloud (Data Engineer) Senior Consultant based in Deloitte Consulting Philippines Delivery Center

Are you ready to unleash your potential?
At Deloitte, our purpose is to make an impact that matters for our clients, our people, and the communities we serve.

We believe we have a responsibility to be a force for good, and WorldImpact is our portfolio of initiatives focused on making a tangible impact on society's biggest challenges and creating a better future. We strive to advise clients on how to deliver purpose-led growth and embed more equitable, inclusive, and sustainable business practices.

Hence, we seek talented individuals driven to excel and innovate, working together to achieve our shared goals.

We are committed to creating positive work experiences that foster a culture of respect and inclusion, where diverse perspectives are celebrated, and everyone is recognized for their contributions.

Ready to unleash your potential with us? Join the winning team now.
Work you will do
Deloitte's Engineering practice helps enable an organization's end-to-end journey from on-premises legacy systems to the cloud, from design through deployment, leading to the ultimate destination: a transformed organization primed for growth.

As a Cloud Engineer, you will help build and connect sustainable cloud-native systems that thrive in the cloud ecosystem. Services include Strategy & Architecture; Assets & Industry Solutions; and Engineering & Integration.

Key Responsibilities Include

  • Design and build robust data pipelines, preferably with Databricks, ensuring efficient handling of large-scale data operations.
  • Manage and optimize SQL databases for data ingestion, transformation, and storage, ensuring high performance and reliability.
  • Utilize Python to develop and enhance data engineering solutions, including advanced data manipulation and automation.
  • Implement and integrate Machine Learning (ML) and AI solutions, with a focus on Generative AI to drive data insights and innovation.
  • Work with AWS cloud data services to leverage cloud-based tools for data processing, storage, and analytics.
  • Collaborate with stakeholders to translate business requirements into technical solutions, ensuring alignment with project goals.
  • Stay current with emerging technologies and best practices in data engineering, maintaining a forward-thinking approach.
  • Promote an inclusive environment within the Illuminate Data team and across the broader Illuminate Product team.
  • Mentor and guide junior data engineers, fostering their growth and achieving team objectives.
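The SQL-focused bullets above imply idempotent ingestion: loads that can be re-run without creating duplicates. A sketch of that pattern, shown on SQLite purely so the example is self-contained; a production pipeline would target Databricks or a cloud warehouse, and all table and column names here are invented.

```python
# Illustrative upsert (INSERT ... ON CONFLICT) as an idempotent ingestion step.
# SQLite stands in for the real warehouse; names are hypothetical.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT, updated_at TEXT)"
)

def upsert(rows):
    conn.executemany(
        """INSERT INTO customers (id, email, updated_at) VALUES (?, ?, ?)
           ON CONFLICT(id) DO UPDATE SET email = excluded.email,
                                         updated_at = excluded.updated_at""",
        rows,
    )
    conn.commit()

upsert([(1, "a@example.com", "2024-01-01"), (2, "b@example.com", "2024-01-01")])
upsert([(1, "a+new@example.com", "2024-02-01")])  # re-run: updates, no duplicate

count, email = conn.execute(
    "SELECT COUNT(*), MAX(email) FROM customers WHERE id = 1"
).fetchone()
print(count, email)  # 1 a+new@example.com
```

Because re-running the load updates rather than duplicates, the pipeline can safely retry after failures.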

Your role as a leader
At Deloitte, we believe in the importance of empowering our people to be leaders at all levels. We connect our purpose and shared values to identify issues and to make an impact that matters to our clients, our people, and the communities we serve. Additionally, Senior Consultants across our firm are expected to:

  • Actively seek out developmental opportunities for growth, act as strong brand ambassadors for the firm as well as share their knowledge and experience with others.
  • Respect the needs of their colleagues and build up cooperative relationships.
  • Understand the goals of our internal and external stakeholders to set personal priorities and align their teams' work to achieve the objectives.
  • Constantly challenge themselves, collaborate with others to deliver on tasks and take accountability for the results.
  • Build productive relationships and communicate effectively in order to positively influence teams and other stakeholders.
  • Offer insights based on a solid understanding of what makes Deloitte successful.
  • Project integrity and confidence while motivating others through team collaboration as well as recognising individual strengths, differences, and contributions.
  • Understand disruptive trends and promote potential opportunities for improvement.

Enough About Us, Let's Talk About You

  • Bachelor's degree in Software Engineering, Information Technology, or equivalent
  • 7+ years of pipeline design and build experience on Databricks for handling large data volumes and ensuring seamless data flow.
  • Strong data engineering skills with SQL databases, including experience with complex data management and transformation tasks.
  • Proficiency in Python for developing data engineering solutions, from data ingestion to advanced processing.
  • Experience with Machine Learning (ML) and AI, particularly with Generative AI technologies, to enhance data-driven decision-making.
  • Hands-on experience with AWS cloud data services, including tools for data storage, compute, and processing.
  • Solid understanding of data modeling, database design, and ETL processes.
  • Strong problem-solving skills and the ability to resolve data engineering challenges effectively.
  • A proven ability to design and integrate complex data pipelines within business systems.

What is in store for you?

  • Embrace the dynamic nature of our work environment with the opportunity to work on a hybrid set-up and a shifting schedule.
  • Rewards platform – your hard work won't go unnoticed at Deloitte.
  • Training and development – at Deloitte, we believe in investing in our best assets, our people. You will have access to world-class training and funding towards industry and other professional certifications.
  • Receive support and mentoring to progress your career. You will have access to mentors and coaches who will help you pave a path for career progression.
  • Benefits effective upon hiring, including paid time off and holidays, and health and life insurance.

Next Steps
Sound like the sort of role for you? Apply now.

Due to the volume of applications, we regret that only shortlisted candidates will be notified.

Candidates will be contacted only by authorized Deloitte recruiters via the firm's business contact number or business email address.

© 2025 DCPDC Inc.


AWS Data Engineer

₱900,000 - ₱1,200,000 per year · Peregrine

Posted today


Job Description

Job Summary

The Data Engineer will be responsible for designing, developing, and managing data pipelines and architectures. S/he will ensure that data flows seamlessly from multiple sources into databases, data lakes, or warehouses and is processed in a way that supports business analysis and decision-making. The ideal candidate will have strong software engineering skills, experience with data processing frameworks, and the ability to optimize and scale large data systems.

Qualifications

  • Degree in Computer Engineering, Data Science, Statistics, Physics, or another related IT field.
  • Minimum 2-3 years of relevant experience in Data Engineering; more experience is preferred.
  • Knowledge of Agile methodology (e.g., Scrum and Kanban).
  • Experience with cloud computing platforms; AWS is a MUST.
  • Coding languages and commands: SQL/T-SQL; Python; JavaScript; XML; JSON; Git; Linux.
  • Systems/software/platforms: Databricks experience is a MUST. Any of the following is an advantage: Dynamics 365 Business Central; Solver; Microsoft SQL Server; MS Office 365 applications.
Note:

  • Must be willing to work onsite at the head office near Eastwood, Libis, Quezon City on a usual office day shift.
  • A competitive compensation package awaits.


AWS Data Engineer

IBM

Posted 10 days ago


Job Description

**Introduction**
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe.
You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.
Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. You will be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.
**Your role and responsibilities**
A Data Engineer with expertise in Data Platforms specializes in developing applications on Big Data technologies, such as API development. This role requires a strong foundation in traditional Application Development, along with knowledge of Analytics libraries, open-source Natural Language Processing, and statistical and big data computing libraries. The ideal candidate will possess exceptional technical abilities to understand, design, write, and debug complex code.
**Required technical and professional expertise**
  • Basic monitoring and initial troubleshooting of AWS services (Amazon Redshift, EMR, Glue, EC2, CloudWatch)
  • Incident response and escalation procedures
  • Basic Talend pipeline status checks and restart procedures
  • ServiceNow incident management and comprehensive documentation
  • Initial triage and appropriate escalation of user access issues
  • Verification and reporting of AWS systems health status
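As one concrete illustration of the Redshift monitoring work above, the snippet below assembles the parameters for a CloudWatch GetMetricStatistics request. The cluster identifier is hypothetical; in practice an operator would pass this dict to boto3's `cloudwatch.get_metric_statistics(**params)`.

```python
# Build a CloudWatch GetMetricStatistics request for a Redshift cluster's CPU.
# "analytics-cluster" is an invented cluster name for illustration.

from datetime import datetime, timedelta, timezone

def redshift_cpu_params(cluster_id, minutes=60):
    end = datetime.now(timezone.utc)
    return {
        "Namespace": "AWS/Redshift",          # Redshift's CloudWatch namespace
        "MetricName": "CPUUtilization",
        "Dimensions": [{"Name": "ClusterIdentifier", "Value": cluster_id}],
        "StartTime": end - timedelta(minutes=minutes),
        "EndTime": end,
        "Period": 300,                        # 5-minute datapoints
        "Statistics": ["Average", "Maximum"],
    }

params = redshift_cpu_params("analytics-cluster")
print(params["Namespace"])  # AWS/Redshift
```

A health check would compare the returned datapoints against a threshold and open a ServiceNow incident on breach, per the escalation procedures above.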
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.


Senior Data Engineer (AWS & Confluent Data/AI Projects) - Remote

National Capital Region · TASQ Staffing Solutions

Posted 4 days ago


Job Description

About the job Senior Data Engineer (AWS & Confluent Data/AI Projects) | Remote

Work Set-up: Remote

Schedule: 10am-6pm SGT

For Filipinos only

Responsibilities:

  • Architect and Design Data Solutions: Lead the design and architecture of scalable, secure, and efficient data pipelines for both batch and real-time data processing on AWS. This includes data ingestion, transformation, storage, and consumption layers.
  • Confluent Kafka Expertise: Design, implement, and optimize highly performant and reliable data streaming solutions using Confluent Platform (Kafka, ksqlDB, Kafka Connect, Schema Registry). Ensure efficient data flow for real-time analytics and AI applications.
  • AWS Cloud Native Development: Develop and deploy data solutions leveraging a wide range of AWS services, including but not limited to:
    ○ Data Storage: S3 (Data Lake), RDS, DynamoDB, Redshift, Lake Formation.
    ○ Data Processing: Glue, EMR (Spark), Lambda, Kinesis, MSK (for Kafka integration).
    ○ Orchestration: AWS Step Functions, Airflow (on EC2 or MWAA).
    ○ Analytics & ML: Athena, QuickSight, SageMaker (for MLOps integration).
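The windowed aggregations at the heart of such streaming designs can be sketched in plain Python. The tumbling-window count below stands in for what ksqlDB or Kafka Streams would compute over a live topic; the event shape (timestamp, key) is assumed for illustration.

```python
# Tumbling-window count per key: the core of a streaming aggregation, shown
# without a real Kafka cluster. Events are (epoch_seconds, key) tuples.

from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Count events per (key, window_start), like a ksqlDB WINDOW TUMBLING query."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs  # align to window edge
        counts[(key, window_start)] += 1
    return dict(counts)

events = [(5, "page_view"), (30, "page_view"), (65, "page_view"), (70, "click")]
print(tumbling_window_counts(events))
# {('page_view', 0): 2, ('page_view', 60): 1, ('click', 60): 1}
```

In a real deployment the same grouping logic runs continuously over a Kafka topic, with the Schema Registry enforcing the event shape.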



Required Skills and Qualifications:

  • Bachelor's or Master's degree in Computer Science, Software Engineering, or a related quantitative field.
  • 3 to 5 years of experience in data engineering, with a significant focus on cloud-based solutions.
  • Strong expertise in AWS data services (S3, Glue, EMR, Redshift, Kinesis, Lambda, etc.).
  • Extensive hands-on experience with Confluent Platform/Apache Kafka for building real-time data streaming applications.
  • Proficiency in programming languages such as Python, PySpark, Scala, or Java.
  • Expertise in SQL and experience with various database systems (relational and NoSQL).
  • Solid understanding of data warehousing, data lakes, and data modeling concepts (star schema, snowflake schema, etc.).
  • Experience with CI/CD pipelines and DevOps practices (Git, Terraform, Jenkins, Azure DevOps, or similar).
  • AWS Certifications (e.g., AWS Certified Data Analytics - Specialty, AWS Certified Solutions Architect - Associate/Professional).

Preferred Qualifications (Nice to Have):

  • Experience with other streaming technologies (e.g., Flink).
  • Knowledge of containerization technologies (Docker, Kubernetes).
  • Familiarity with Data Mesh or Data Fabric concepts.
  • Experience with data visualization tools (e.g., Tableau, Power BI, QuickSight).
  • Understanding of MLOps principles and tools.

Candidate must have a working laptop.

Data Solutions Manager

₱800,000 - ₱1,200,000 per year · Channel Factory

Posted today


Job Description

As a Data Solutions Manager, you will be responsible for collecting, analyzing, and interpreting large datasets to provide actionable insights and support data-driven decision-making within the organization. Your role involves employing statistical methods, data visualization techniques, and programming skills to extract meaningful patterns and trends from various sources of data.

You are brilliant with data and reports and can handle multiple deadlines simultaneously. You are motivated to go above and beyond for clients, and have a strong understanding of the AdTech landscape, with expertise in video and social platforms. You build collaborative relationships with Sales and Campaign Management teams, and capitalize on knowledge sharing amongst the global CSM cohort to ensure we uphold best-practice client servicing standards.

Core Responsibilities

  • Data Licensing Support:
    ○ Put together and deliver YouTube channel lists based on specific guidelines and parameters.
    ○ Perform quality audits on YouTube channel data and lists.
    ○ Work closely with and support the Product Manager on any tasks as needed.
  • Content Strategy Analysis Support:
    ○ Perform quality audits on YouTube channel data and lists.
    ○ Manage several projects at the same time, delivering results in a timely manner.
    ○ Provide insights and/or recommendations as needed to help improve the content review process and tool.
    ○ Work closely with and support the Product Manager on any tasks as needed.

Qualifications
Experience required:

  • A Bachelor's degree in any field you are passionate about.
  • At least 1-2 years of experience in an analyst, quality control, or project management role.
  • Strong Google Sheets or MS Excel fundamentals.
  • Fluency in English.
  • Outstanding analytical skills.
  • Ability to detect patterns, data inconsistencies, etc.
  • Strong attention to detail.
  • Ability to thoroughly and accurately complete tasks/projects in a timely manner.
  • Able to multitask and prioritize.
  • Technology-focused mindset: ability to pick up and learn new technology as needed.
  • Dependable: ability to work independently to complete a set of assigned tasks in a timely manner.

Must have skills

  • data analyst
  • content strategy
  • campaign management
  • data visualization

Data Solutions Manager

Taguig, National Capital Region · ₱1,200,000 - ₱2,400,000 per year · GCash (MYNT - Globe Fintech Innovations, Inc.)

Posted today


Job Description

Do you want to take the first step in making Filipinos' lives better every day? Here at GCash, we want to stay at the forefront of the FinTech industry by creating innovative, meaningful, and convenient financial solutions for the nation. G ka ba? Join the G Nation today.

You will be responsible for the following:

Does (The tasks / responsibilities that the role performs to address requirements in Key Result Areas)

  • Design and build dimensional data models in accordance with industry standards and best practices
  • Work closely with stakeholders to gather and understand requirements
  • Translate business requirements into data solutions and specification
  • Collaborate with delivery teams (data engineering, reports and dashboard developers) to ensure data is accurately processed and loaded
  • Create and maintain comprehensive documentation (entity relationship diagram, mappings, data dictionary, etc.)
  • Implement data quality checks and validation processes
  • Perform periodic analysis of query and reports performance, and recommend optimization solutions
  • Ensure data models support fast query execution, even with complex joins or large datasets
  • Provide technical guidance and mentorship to junior members of the team
  • Build complex queries on large volumes of data using SQL
  • Create and update documentation, and help maintain the team document portal
  • Standardize different queries and optimize them to speed up query times
  • Ensure data security practices in the deployed pipelines
  • Identify automation opportunities and implement automation projects
  • Maintain uptime for different automation tasks
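The dimensional-modeling work above centers on fact tables joined to dimension tables. Below is a minimal star-schema rollup, shown on SQLite so it runs anywhere; the table and column names are invented for illustration.

```python
# A two-table star schema (one fact, one dimension) and a typical rollup query:
# aggregate the fact table, grouped by a dimension attribute. Names are invented.

import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (product_key INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'wallet'), (2, 'loans');
    INSERT INTO fact_sales  VALUES (1, 100.0), (1, 50.0), (2, 200.0);
""")

rows = db.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.category ORDER BY p.category
""").fetchall()
print(rows)  # [('loans', 200.0), ('wallet', 150.0)]
```

The same shape scales up directly: the fact table grows with events, dimensions stay small, and queries stay simple joins plus a GROUP BY.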

Displays (The Knowledge, Skills, and Behaviors indicating how tasks / responsibilities will be performed )

  • Strong understanding of dimensional data modeling concepts, data warehousing principles and architectures
  • Possesses technical, analytical, and soft skills to allow the design of efficient and scalable data models, reports and dashboards
  • Deep understanding of SQL for querying, reporting, and optimizing performance
  • Familiarity with traditional and modern cloud-based databases
  • Comprehension to understand and convert business questions and reporting needs into efficient data solutions
  • Appetite to stay updated with evolving data practices, tools, and technologies in data modeling and warehousing
  • Willingness to work with different stakeholders across the organization
  • Ability to work closely with data architects, data engineers, and developers to identify data related problems and solutions
  • Leadership skills to help establish the team's ways-of-working, processes, deliverables, and documentation
  • Strong collaboration skills
  • Agile mindset with a 'fail fast, learn fast' attitude
  • Detail-oriented to verify the integrity of gathered data and reports
  • Experience in analytics and data warehousing concepts, terminologies, and architecture
  • Good oral and presentation skills are a must
  • Coordinates and supports the Data Engineering team on the updates of the data sources

Delivers (The specific outputs / tangible results produced by the role; resources responsible for )

  • Dimensional data models
  • Entity relationship diagram
  • Data dictionary and metadata documentation
  • Data mapping documents
  • Performance tuning artifacts
  • ETL specifications
  • Prototypes and sample reports
  • Validation and test scenarios
  • Data change management documentation
  • Governance and security artifacts

We are looking for:

  • Experience working with SQL (Oracle, Sybase, etc.) and cloud-based (Google BigQuery, Alibaba Cloud, etc.) databases required
  • Experience in Google Looker Studio, Google Sheets, Excel required
  • Experience in Google Suite (GSheet, GDocs, etc.) required
  • Experience in diagramming tools (ERWin, Lucid, etc.) required
  • Experience in JIRA and Confluence preferred
This advertiser has chosen not to accept applicants from your region.
 
