749 AWS Data jobs in the Philippines
AWS Data Engineer
Posted today
Job Description
- Design and build scalable ETL pipelines and data integration workflows using AWS services and Python.
- Develop and optimize data lake and data warehouse solutions for structured and unstructured data.
- Leverage Apache Spark for large-scale data processing and transformation tasks.
- Collaborate with cross-functional teams to gather requirements and deliver clean, usable datasets for analytics and reporting.
- Ensure high data quality, security, and compliance in all stages of the data lifecycle.
- 5+ years of hands-on experience in data engineering roles.
- Proficiency in Python for building and automating data pipelines.
- Strong experience with AWS services (S3, Glue, Redshift, Lambda, etc.).
- Solid understanding of ETL processes and modern data warehousing concepts.
- Experience with big data tools, especially Apache Spark (PySpark preferred).
- Familiarity with DevOps and CI/CD practices for data pipeline deployment.
- Knowledge of data governance and cataloging tools.
- Strong problem-solving, communication, and collaboration skills.
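As a rough sketch of the extract-transform-load pattern this role centers on (field names, rules, and sample records are invented for illustration; a real pipeline would read from S3 and write to Redshift or a data lake):

```python
import json

def extract(raw_lines):
    """Parse raw JSON-lines records (e.g. as read from an S3 object)."""
    return [json.loads(line) for line in raw_lines if line.strip()]

def transform(records):
    """Drop incomplete rows and normalize field names and types."""
    cleaned = []
    for r in records:
        if r.get("user_id") is None or r.get("amount") is None:
            continue  # enforce basic data quality before loading
        cleaned.append({"user_id": str(r["user_id"]),
                        "amount": round(float(r["amount"]), 2)})
    return cleaned

def load(records, warehouse):
    """Append cleaned rows to a destination table (a list stands in here)."""
    warehouse.extend(records)
    return len(records)

raw = ['{"user_id": 1, "amount": "19.993"}', '{"user_id": null, "amount": 5}']
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
```

In practice each stage would be a separate Glue job or Lambda step, but the shape of the logic is the same.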
AWS Data Engineer
Posted today
Job Description
Title: Cloud (Data Engineer) Senior Consultant based in Deloitte Consulting Philippines Delivery Center
Are you ready to unleash your potential?
At Deloitte, our purpose is to make an impact that matters for our clients, our people, and the communities we serve.
We believe we have a responsibility to be a force for good, and WorldImpact is our portfolio of initiatives focused on making a tangible impact on society's biggest challenges and creating a better future. We strive to advise clients on how to deliver purpose-led growth and embed more equitable, inclusive as well as sustainable business practices.
Hence, we seek talented individuals driven to excel and innovate, working together to achieve our shared goals.
We are committed to creating positive work experiences that foster a culture of respect and inclusion, where diverse perspectives are celebrated, and everyone is recognized for their contributions.
Ready to unleash your potential with us? Join the winning team now.
Work you will do
Deloitte's Engineering offering helps enable an organization's end-to-end journey from on-premises legacy systems to the cloud, from design through deployment, leading to the ultimate destination: a transformed organization primed for growth.
As a Cloud Engineer, you will help build and connect sustainable cloud-native systems to thrive in the cloud ecosystem. Services include Strategy & Architecture; Assets & Industry Solutions; and Engineering & Integration.
Key Responsibilities Include
- Design and build robust data pipelines, preferably with Databricks, ensuring efficient handling of large-scale data operations.
- Manage and optimize SQL databases for data ingestion, transformation, and storage, ensuring high performance and reliability.
- Utilize Python to develop and enhance data engineering solutions, including advanced data manipulation and automation.
- Implement and integrate Machine Learning (ML) and AI solutions, with a focus on Generative AI to drive data insights and innovation.
- Work with AWS cloud data services to leverage cloud-based tools for data processing, storage, and analytics.
- Collaborate with stakeholders to translate business requirements into technical solutions, ensuring alignment with project goals.
- Stay current with emerging technologies and best practices in data engineering, maintaining a forward-thinking approach.
- Promote an inclusive environment within the Illuminate Data team and across the broader Illuminate Product team.
- Mentor and guide junior data engineers, fostering their growth and achieving team objectives.
Your role as a leader
At Deloitte, we believe in the importance of empowering our people to be leaders at all levels. We connect our purpose and shared values to identify issues and to make an impact that matters to our clients, people, and communities. Additionally, Senior Consultants across our firm are expected to:
- Actively seek out developmental opportunities for growth, act as strong brand ambassadors for the firm as well as share their knowledge and experience with others.
- Respect the needs of their colleagues and build up cooperative relationships.
- Understand the goals of our internal and external stakeholders to set personal priorities and align their teams' work to achieve the objectives.
- Constantly challenge themselves, collaborate with others to deliver on tasks and take accountability for the results.
- Build productive relationships and communicate effectively in order to positively influence teams and other stakeholders.
- Offer insights based on a solid understanding of what makes Deloitte successful.
- Project integrity and confidence while motivating others through team collaboration as well as recognising individual strengths, differences, and contributions.
- Understand disruptive trends and promote potential opportunities for improvement.
Enough About Us, Let's Talk About You
- Bachelor's degree in Software Engineering, Information Technology, or equivalent
- 7+ years of pipeline design and build experience on Databricks for handling large data volumes and ensuring seamless data flow.
- Strong data engineering skills with SQL databases, including experience with complex data management and transformation tasks.
- Proficiency in Python for developing data engineering solutions, from data ingestion to advanced processing.
- Experience with Machine Learning (ML) and AI, particularly with Generative AI technologies, to enhance data-driven decision-making.
- Hands-on experience with AWS cloud data services, including tools for data storage, compute, and processing.
- Solid understanding of data modeling, database design, and ETL processes.
- Strong problem-solving skills and the ability to resolve data engineering challenges effectively.
- A proven ability to design and integrate complex data pipelines within business systems.
What is in store for you?
- Embrace the dynamic nature of our work environment with the opportunity to work on a hybrid set-up and a shifting schedule.
- Rewards platform – your hard work won't go unnoticed at Deloitte
- Training and development - at Deloitte, we believe in investing in our best assets, our people. You will have access to world-class training and funding towards industry and other professional certifications.
- Receive support and mentoring to progress your career. You will have access to mentors and coaches who will help you pave a path for career progression.
- Benefits effective upon hiring, including paid time off and holidays, and health and life insurance
Next Steps
Sound like the sort of role for you? Apply now.
Due to the volume of applications, we regret that only shortlisted candidates will be notified.
Candidates will only be contacted by authorized Deloitte Recruiters via the firm's business contact number or business email address.
© 2025 DCPDC Inc.
AWS Data Engineer
Posted today
Job Description
Job Summary
The Data Engineer will be responsible for designing, developing, and managing data pipelines and architectures. S/he will ensure that data flows seamlessly from multiple sources into databases, data lakes, or warehouses and is processed in a way that supports business analysis and decision-making. The ideal candidate will have strong software engineering skills, experience with data processing frameworks, and the ability to optimize and scale large data systems.
Qualifications
- Degree in Computer Engineering / Data Science / Statistics / Physics or any other related field in IT.
- Minimum 2-3 years of relevant experience in Data Engineering; more experience is preferred.
- Knowledge of Agile methodology (e.g., Scrum and Kanban)
- Experienced in using cloud computing platforms - AWS is a MUST.
Coding Languages and Commands
SQL / T-SQL; Python; JavaScript; XML; JSON
Git; Linux
Systems / Software / Platforms
Databricks experience is a MUST.
- Any of the following is an advantage: Dynamics 365 Business Central; Solver; Microsoft SQL Server; MS Office 365 applications
Note:
- Must be willing to work onsite at the head office near Eastwood, Libis, Quezon City on a usual office day shift
- A competitive compensation package awaits
AWS Data Engineer
Posted 10 days ago
Job Description
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe.
You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.
Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. You will be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.
**Your role and responsibilities**
A Data Engineer with expertise in Data Platforms specializes in developing applications on Big Data technologies, such as API development. This role requires a strong foundation in traditional Application Development, along with knowledge of Analytics libraries, open-source Natural Language Processing, and statistical and big data computing libraries. The ideal candidate will possess exceptional technical abilities to understand, design, write, and debug complex code.
**Required technical and professional expertise**
- Basic monitoring and initial troubleshooting of AWS services (Amazon Redshift, EMR, Glue, EC2, CloudWatch)
- Incident response and escalation procedures
- Basic Talend pipeline status checks and restart procedures
- ServiceNow incident management and comprehensive documentation
- Initial triage and appropriate escalation of user access issues
- AWS systems health status verification and reporting
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
Senior Data Engineer (AWS & Confluent Data/AI Projects) - Remote
Posted 4 days ago
Job Description
Work Set-up: Remote
Schedule: 10am-6pm SGT
For Filipinos only
Responsibilities:
- Architect and Design Data Solutions: Lead the design and architecture of scalable, secure, and efficient data pipelines for both batch and real-time data processing on AWS. This includes data ingestion, transformation, storage, and consumption layers.
- Confluent Kafka Expertise: Design, implement, and optimize highly performant and reliable data streaming solutions using Confluent Platform (Kafka, ksqlDB, Kafka Connect, Schema Registry). Ensure efficient data flow for real-time analytics and AI applications.
- AWS Cloud Native Development: Develop and deploy data solutions leveraging a wide range of AWS services, including but not limited to:
○ Data Storage: S3 (Data Lake), RDS, DynamoDB, Redshift, Lake Formation.
○ Data Processing: Glue, EMR (Spark), Lambda, Kinesis, MSK (for Kafka integration).
○ Orchestration: AWS Step Functions, Airflow (on EC2 or MWAA).
○ Analytics & ML: Athena, QuickSight, SageMaker (for MLOps integration).
Required Skills and Qualifications:
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related quantitative field.
- 3 to 5 years of experience in data engineering, with a significant focus on cloud-based solutions.
- Strong expertise in AWS data services (S3, Glue, EMR, Redshift, Kinesis, Lambda, etc.).
- Extensive hands-on experience with Confluent Platform/Apache Kafka for building real-time data streaming applications.
- Proficiency in programming languages such as Python, PySpark, Scala, or Java.
- Expertise in SQL and experience with various database systems (relational and NoSQL).
- Solid understanding of data warehousing, data lakes, and data modeling concepts (star schema, snowflake schema, etc.).
- Experience with CI/CD pipelines and DevOps practices (Git, Terraform, Jenkins, Azure DevOps, or similar).
- AWS Certifications (e.g., AWS Certified Data Analytics - Specialty, AWS Certified Solutions Architect - Associate/Professional).
Preferred Qualifications (Nice to Have):
- Experience with other streaming technologies (e.g., Flink).
- Knowledge of containerization technologies (Docker, Kubernetes).
- Familiarity with Data Mesh or Data Fabric concepts.
- Experience with data visualization tools (e.g., Tableau, Power BI, QuickSight).
- Understanding of MLOps principles and tools.
Candidate must have a working laptop.
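The star schema named among the data modeling concepts above can be sketched with an in-memory SQLite database (table names, columns, and data here are invented purely to illustrate the modeling idea, not any employer's actual schema):

```python
import sqlite3

# Minimal star-schema sketch: one fact table keyed to one dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales  (product_key INTEGER, amount REAL);
INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
INSERT INTO fact_sales  VALUES (1, 10.0), (1, 5.0), (2, 7.5);
""")

# The typical analytical query joins the fact table to a dimension and
# aggregates by a dimension attribute.
rows = conn.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
```

A snowflake schema differs only in that the dimension tables are themselves normalized into further sub-dimensions.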
Data Solutions Manager
Posted today
Job Description
As a Data Solutions Manager, you will be responsible for collecting, analyzing, and interpreting large datasets to provide actionable insights and support data-driven decision-making within the organization. Your role involves employing statistical methods, data visualization techniques, and programming skills to extract meaningful patterns and trends from various sources of data.
You are brilliant with data and reports and can handle multiple deadlines simultaneously. You are motivated to go above and beyond for clients, and have a strong understanding of the AdTech landscape with expertise in video and social platforms. You build collaborative relationships with Sales and Campaign Management teams, and capitalize on knowledge sharing amongst the global CSM cohort to ensure we uphold best-practice client servicing standards.
Core Responsibilities
- Data Licensing Support:
○ Put together and deliver YouTube channel lists based on specific guidelines and parameters.
○ Perform quality audits on YouTube channel data and lists.
○ Work closely with and support the Product Manager on any tasks as needed.
- Content Strategy Analysis Support:
○ Perform quality audits on YouTube channel data and lists.
○ Manage several projects at the same time, delivering results in a timely manner.
○ Provide insights and/or recommendations as needed to help improve the content review process and tool.
○ Work closely with and support the Product Manager on any tasks as needed.
Qualifications
Experience required:
- A Bachelor's degree in any field you are passionate about.
- At least 1-2 years of experience in an analyst, quality control, or project management role
- Strong Google Sheets or MS Excel fundamentals.
- Fluency in English.
- Outstanding analytical skills
- Ability to detect patterns, data inconsistencies, etc.
- Strong attention to detail
- Ability to thoroughly and accurately complete tasks/projects in a timely manner.
- Able to multitask and prioritize.
- Technology-focused mindset - ability to pick up and learn new technology as needed.
- Dependable - ability to work independently to complete a set of assigned tasks in a timely manner.
Must have skills
- data analyst
- content strategy
- campaign management
- data visualization
Data Solutions Manager
Posted today
Job Description
Do you want to take the first step in making Filipinos' lives better every day? Here in GCash, we want to stay at the forefront of the FinTech industry by creating innovative, meaningful, and convenient financial solutions for the nation. G ka ba? Join the G Nation today.
You will be responsible for the following:
Does (The tasks / responsibilities that the role performs to address requirements in Key Result Areas)
- Design and build dimensional data models in accordance with industry standards and best practices
- Work closely with stakeholders to gather and understand requirements
- Translate business requirements into data solutions and specifications
- Collaborate with delivery teams (data engineering, reports and dashboard developers) to ensure data is accurately processed and loaded
- Create and maintain comprehensive documentation (entity relationship diagram, mappings, data dictionary, etc.)
- Implement data quality checks and validation processes
- Perform periodic analysis of query and reports performance, and recommend optimization solutions
- Ensure data models support fast query execution, even with complex joins or large datasets
- Provide technical guidance and mentorship to junior members of the team
- Build complex queries on large volumes of data using SQL
- Create and update documentation, and help maintain the team document portal
- Standardize different queries and optimize them to speed up query times
- Ensure data security practices in the deployed pipelines
- Identify automation opportunities and implement automation projects
- Maintain uptime for different automation tasks
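The data quality checks and validation processes listed above can be sketched as a small rule-based validator (the rule set, field names, and sample rows are hypothetical, invented only to illustrate the pattern):

```python
def run_quality_checks(rows):
    """Return a dict mapping check name -> list of offending row indexes."""
    failures = {"missing_id": [], "negative_amount": [], "duplicate_id": []}
    seen = set()
    for i, row in enumerate(rows):
        rid = row.get("id")
        if rid is None:
            failures["missing_id"].append(i)      # completeness check
        elif rid in seen:
            failures["duplicate_id"].append(i)    # uniqueness check
        else:
            seen.add(rid)
        if (row.get("amount") or 0) < 0:
            failures["negative_amount"].append(i) # validity check
    return failures

report = run_quality_checks([
    {"id": 1, "amount": 100},
    {"id": 1, "amount": -5},   # duplicate id and negative amount
    {"id": None, "amount": 3},
])
```

In a production pipeline the same rules would typically run as SQL assertions against staging tables before data is promoted, with failures routed to monitoring rather than returned in-process.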
Displays (The Knowledge, Skills, and Behaviors indicating how tasks / responsibilities will be performed)
- Strong understanding of dimensional data modeling concepts, data warehousing principles and architectures
- Possesses technical, analytical, and soft skills to allow the design of efficient and scalable data models, reports and dashboards
- Deep understanding of SQL for querying, reporting, and optimizing performance
- Familiarity with traditional and modern cloud-based databases
- Ability to understand and convert business questions and reporting needs into efficient data solutions
- Appetite to stay updated with evolving data practices, tools, and technologies in data modeling and warehousing
- Willingness to work with different stakeholders across the organization
- Ability to work closely with data architects, data engineers, and developers to identify data related problems and solutions
- Leadership skills to help establish the team's ways-of-working, processes, deliverables, and documentation
- Strong collaboration skills
- Agile mindset with 'fail fast, learn fast' attitude
- Detail-oriented to verify the integrity of gathered data and reports
- Experience in analytics and data warehousing concepts, terminologies, and architecture
- Good oral and presentation skills are a must
- Coordinates and supports the Data Engineering team on the updates of the data sources
Delivers (The specific outputs / tangible results produced by the role; resources responsible for)
- Dimensional data models
- Entity relationship diagram
- Data dictionary and metadata documentation
- Data mapping documents
- Performance tuning artifacts
- ETL specifications
- Prototypes and sample reports
- Validation and test scenarios
- Data change management documentation
- Governance and security artifacts
We are looking for:
- Experience working with SQL (Oracle, Sybase, etc.) and cloud-based (Google BigQuery, Alibaba Cloud, etc.) databases required
- Experience in Google Looker Studio, Google Sheets, Excel required
- Experience in Google Suite (GSheet, GDocs, etc.) required
- Experience in diagramming tools (ERWin, Lucid, etc.) required
- Experience in JIRA and Confluence preferred