84 ETL Engineer jobs in the Philippines

ETL Engineer

₱1,200,000 - ₱2,400,000 Y MicroSourcing

Posted today

Job Description

Discover your 100% YOU with MicroSourcing

Position: ETL Analyst

Location: Eastwood Libis, Quezon City

Work setup & shift: Hybrid | Nightshift (8pm-5am PH)

**Why join MicroSourcing? You'll have:**

  • Competitive Rewards: Enjoy above-market compensation, healthcare coverage from day one for you plus one or more dependents, paid time off with cash conversion, group life insurance, and performance bonuses.
  • A Collaborative Spirit: Contribute to a positive and engaging work environment by participating in company-sponsored events and activities.
  • Work-Life Harmony: Enjoy the balance between work and life that suits you with flexible work arrangements.
  • Career Growth: Take advantage of opportunities for continuous learning and career advancement.
  • Inclusive Teamwork: Be part of a team that celebrates diversity and fosters an inclusive culture.

Your Role:

Design, build, and maintain scalable ETL/ELT data pipelines that extract, transform, and load data from multiple sources into internal systems for use by client-facing applications and operational reporting. This role focuses entirely on backend data engineering and integration—not on dashboards or data visualization.
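
For illustration only, here is a minimal sketch of the kind of extract-transform-load flow described above, assuming a hypothetical CSV extract and a local SQLite target; the file name, table, and business rule are placeholders rather than the team's actual sources or internal systems.

```python
# Minimal ETL sketch: extract rows from a CSV source, apply a simple
# business-rule transformation, and load them into a SQLite table.
# All names here (file, table, columns, rule) are illustrative only.
import csv
import sqlite3

def extract(path):
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

def transform(rows):
    # Example rule: keep only active records and normalise amounts.
    return [
        {"client_id": r["client_id"], "amount": round(float(r["amount"]), 2)}
        for r in rows
        if r.get("status") == "active"
    ]

def load(rows, db_path="warehouse.db"):
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS facts (client_id TEXT, amount REAL)")
    con.executemany("INSERT INTO facts VALUES (:client_id, :amount)", rows)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("source_extract.csv")))
```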

  1. Data Pipeline Implementation and Support
  2. Data Transformation
     • Create data transformations according to business rules and client-specific requirements.
     • Support data enrichment workflows.
  3. Automation & Refresh Process
     • Implement automation for recurring data loads and transformations across customers.
     • Monitor and facilitate data refresh cycles, including scheduling, orchestration, and validation.
     • Own the client refresh lifecycle, ensuring timely, complete, and accurate data updates.
  4. Data Optimization
     • Continuously improve pipeline efficiency, reducing refresh times and resource usage.
     • Identify bottlenecks and refactor logic or infrastructure to scale with growing data volumes.
  5. Refresh Data QA & Validation
     • Validate data post-ingestion and transformation to ensure schema alignment, completeness, and business logic integrity.
     • Collaborate with the Data Quality and Delivery teams to resolve ingestion-related data quality issues.
     • Build checks into the ETL workflows to proactively detect and flag anomalies (a minimal sketch follows this list).
  6. Documentation & SOPs
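
To illustrate the QA and anomaly-flagging item above, here is a minimal sketch of post-refresh checks; the 20% threshold, key column, and sample data are illustrative assumptions, not the team's actual rules.

```python
# Sketch of post-load validation checks: flag empty loads, unexpected
# row-count swings against the previous refresh, and missing key values.
def validate_refresh(rows, previous_count, key_column="client_id"):
    issues = []
    if not rows:
        issues.append("no rows loaded")
    if previous_count and abs(len(rows) - previous_count) / previous_count > 0.2:
        issues.append(f"row count changed by more than 20% ({previous_count} -> {len(rows)})")
    missing_keys = sum(1 for r in rows if not r.get(key_column))
    if missing_keys:
        issues.append(f"{missing_keys} rows missing {key_column}")
    return issues

# Example: surface issues before publishing the refresh to clients.
problems = validate_refresh(rows=[{"client_id": "A1"}], previous_count=100)
if problems:
    print("Refresh flagged:", problems)
```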

What You Need:

A technical or quantitative BS degree, combined with modelling work experience or training, and a working knowledge of modelling (regression, machine learning, feature selection, validation), data ETL (extracting, preparing, validating), and building analytics workflows.

Essential Skills - Analytics

  • 2+ years of hands-on experience developing automated data workflows in Alteryx or similar ETL tools (e.g., Talend, SSIS, Informatica).
  • Strong SQL experience, especially in writing complex joins, data transformations, and quality checks (a minimal sketch follows this list).
  • Candidates should understand data structures and logic but are not expected to create visualizations or dashboards.
  • Working knowledge of programming languages such as SQL (R and Python are beneficial).
  • Ability to develop custom Alteryx workflows that access a Snowflake database and create repeatable processes for other team members to execute.
  • Data Collection & Structured Analysis: develops a logical approach to data collection and ensures data collection meets the timeline.
  • Identifies gaps and/or anomalies in the data and seeks clarification, surfacing concerns or issues immediately with recommendations.
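
As a rough illustration of the SQL quality checks called out above, the sketch below uses a left anti-join to surface referential-integrity gaps after a load; the tables and columns are invented, and an in-memory SQLite database stands in for the actual warehouse.

```python
# Illustrative SQL quality check: find transactions whose client is
# missing from the client dimension after a load (orphan records).
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE clients (client_id TEXT PRIMARY KEY);
    CREATE TABLE transactions (txn_id TEXT, client_id TEXT);
    INSERT INTO clients VALUES ('A1');
    INSERT INTO transactions VALUES ('T1', 'A1'), ('T2', 'B9');
""")
orphans = con.execute("""
    SELECT t.txn_id, t.client_id
    FROM transactions AS t
    LEFT JOIN clients AS c ON c.client_id = t.client_id
    WHERE c.client_id IS NULL
""").fetchall()
print("Orphan transactions:", orphans)   # [('T2', 'B9')]
```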

This is a backend ETL role. Candidates focused on reporting, dashboarding, or visualization tools (e.g., Power BI, Tableau) are not a fit for this position.



About MicroSourcing

With over 9,000 professionals across 13 delivery centers, MicroSourcing is the pioneer and largest offshore provider of managed services in the Philippines.

Our commitment to 100% YOU

MicroSourcing firmly believes that our company's strength lies in our people's diversity and talent. We are proud to foster an inclusive culture that embraces individuals of all races, genders, ethnicities, abilities, and backgrounds. We provide space for everyone, embracing different perspectives, and making room for opportunities for each individual to thrive.

At MicroSourcing, equality is not merely a slogan – it's our commitment. Our way of life. Here, we don't just accept your unique authentic self - we celebrate it, valuing every individual's contribution to our collective success and growth. Join us in celebrating YOU and your 100%

For more information, visit

*Terms & conditions apply

Data Engineer, ETL Engineer

₱1,440,000 - ₱1,560,000 Y Bravissimo Resourcing

Posted today

Job Description

We are looking for a Data, ETL & Connector Engineer with strong experience in data engineering or data science, including metadata ETL and APIs. This role will focus on data integration, transformation, and supporting search engine capabilities, especially in e-commerce applications.
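
For context, here is a minimal sketch of the kind of connector transformation this role involves, assuming an Elasticsearch-style newline-delimited bulk format; the index name and field mapping are illustrative, not a client's actual catalog schema.

```python
# Sketch of a connector step: transform a catalog metadata feed (a list
# of product dicts) into the newline-delimited bulk format that
# Elasticsearch-style engines ingest. Index name and fields are assumed.
import json

def to_bulk_actions(products, index_name="catalog"):
    lines = []
    for p in products:
        lines.append(json.dumps({"index": {"_index": index_name, "_id": p["sku"]}}))
        lines.append(json.dumps({
            "title": p.get("title", ""),
            "brand": p.get("brand", ""),
            "price": float(p.get("price", 0)),
        }))
    return "\n".join(lines) + "\n"

feed = [{"sku": "SKU-1", "title": "Desk Lamp", "brand": "Lumo", "price": "29.90"}]
print(to_bulk_actions(feed))
```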

Core Skills:

  • Java, Spring Batch, JavaScript, Python, AWS

Preferred Skills:

  • Search Engine (Solr, Elasticsearch, or others)
  • Azure DevOps

Key Responsibilities:

  • Develop, modify, and test connectors for acquiring and transforming catalog data and metadata feeds into formats required by the search engine.
  • Support incremental processes, mappings, and security requirements.
  • Perform performance benchmarking and improve data ingestion quality.
  • Collaborate with data owners, API SMEs, and Technical Architects to understand source system schema.
  • Support API enhancements and back up full-stack engineers as needed.
  • Analyze bugs, document findings, and implement/test fixes with other teams.
  • Participate in sprint activities with Technical Leads, Delivery Leads, Scrum Masters, and Product Owners.
  • Use Azure DevOps for ticketing and task tracking, updating status daily.
  • Apply CI/CD processes, resolve bugs, and perform unit testing.
  • Maintain up-to-date documentation on solution changes.
#DataEngineer #ETLEngineer #ConnectorEngineer #DataScience #DataEngineering #APIs #Java #SpringBatch #Python #AWS #Elasticsearch #Solr #AzureDevOps #EcommerceTech #TechJobsPH #AccentureCareers

Job Type: Full-time

Pay: Php120,000.00 - Php130,000.00 per month

Benefits:

  • Health insurance
  • Life insurance
  • Pay raise

Work Location: In person

ETL Data Engineer

₱104,000 - ₱130,878 Y Accenture

Posted today

Job Description

Job Description:

Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.

Summary:

As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with team members to address challenges and contribute to the overall success of data initiatives.
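
As a rough sketch of what an idempotent migration step can look like, the example below upserts a batch so that re-running the same load does not duplicate rows; SQLite stands in for the actual source and target systems, and the table is invented for illustration.

```python
# Idempotent load sketch: the insert upserts on the primary key, so
# replaying a batch during a migration does not create duplicates.
# Requires SQLite 3.24+ (bundled with modern Python).
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE customers (customer_id TEXT PRIMARY KEY, email TEXT)")

upsert = """
    INSERT INTO customers (customer_id, email) VALUES (?, ?)
    ON CONFLICT(customer_id) DO UPDATE SET email = excluded.email
"""
batch = [("C1", "a@example.com"), ("C2", "b@example.com")]
con.executemany(upsert, batch)   # first load
con.executemany(upsert, batch)   # re-running the same batch stays safe
print(con.execute("SELECT COUNT(*) FROM customers").fetchone())  # (2,)
```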

Roles & Responsibilities:

  • Expected to perform independently and become an SME.
  • Required active participation/contribution in team discussions.
  • Contribute to providing solutions for work-related problems.
  • Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
  • Monitor and optimize data pipelines for performance and reliability.
  • Document data processes and workflows to ensure clarity and knowledge sharing.
  • Stay updated with industry trends and best practices in data engineering.
  • Assist in troubleshooting and resolving data-related issues as they arise.

Professional & Technical Skills:

  • Required Skill: Expert proficiency in Microsoft Azure Data Services.
  • Additional Good to Have Skills: Experience with Data Engineering.
  • Strong understanding of data modeling and database design principles.
  • Experience with ETL tools and data integration techniques.
  • Familiarity with cloud computing concepts and services.
  • Proficient in programming languages such as Python or SQL for data manipulation.

Additional Information:

- The candidate should have a minimum of 2 years of experience in Microsoft Azure Data Services.

- This position is based at our Manila office.

Minimum 2 year(s) of experience is required

ETL Data Engineer/Analyst

₱1,200,000 - ₱2,400,000 Y MicroSourcing

Posted today

Job Description

Discover your 100% YOU with MicroSourcing

Position: ETL Analyst

Location: Eastwood Libis, Quezon City

Work setup & shift: Hybrid | Nightshift (8pm-5am PH); work from home every Monday & Friday, plus 1 fully remote week scheduled each month (9 days in office per month, subject to change in months with 5 weeks)

**Why join MicroSourcing? You'll have:**

  • Competitive Rewards: Enjoy above-market compensation, healthcare coverage from day one for you plus one or more dependents, paid time off with cash conversion, group life insurance, and performance bonuses.
  • A Collaborative Spirit: Contribute to a positive and engaging work environment by participating in company-sponsored events and activities.
  • Work-Life Harmony: Enjoy the balance between work and life that suits you with flexible work arrangements.
  • Career Growth: Take advantage of opportunities for continuous learning and career advancement.
  • Inclusive Teamwork: Be part of a team that celebrates diversity and fosters an inclusive culture.

Your Role:

Design, build, and maintain scalable ETL/ELT data pipelines that extract, transform, and load data from multiple sources into internal systems for use by client-facing applications and operational reporting. This role focuses entirely on backend data engineering and integration—not on dashboards or data visualization.

  1. Data Pipeline Implementation and Support
  2. Data Transformation
     • Create data transformations according to business rules and client-specific requirements.
     • Support data enrichment workflows.
  3. Automation & Refresh Process
     • Implement automation for recurring data loads and transformations across customers.
     • Monitor and facilitate data refresh cycles, including scheduling, orchestration, and validation.
     • Own the client refresh lifecycle, ensuring timely, complete, and accurate data updates.
  4. Data Optimization
     • Continuously improve pipeline efficiency, reducing refresh times and resource usage.
     • Identify bottlenecks and refactor logic or infrastructure to scale with growing data volumes.
  5. Refresh Data QA & Validation
     • Validate data post-ingestion and transformation to ensure schema alignment, completeness, and business logic integrity.
     • Collaborate with the Data Quality and Delivery teams to resolve ingestion-related data quality issues.
     • Build checks into the ETL workflows to proactively detect and flag anomalies.
  6. Documentation & SOPs

What You Need:

A technical or quantitative BS degree, combined with modelling work experience or training, and a working knowledge of modelling (regression, machine learning, feature selection, validation), data ETL (extracting, preparing, validating), and building analytics workflows.

Essential Skills - Analytics

  • 2+ years of hands-on experience developing automated data workflows in Alteryx or similar ETL tools (e.g., Talend, SSIS, Informatica).
  • Strong SQL experience, especially in writing complex joins, data transformations, and quality checks.
  • Candidates should understand data structures and logic but are not expected to create visualizations or dashboards.
  • Working knowledge of programming languages such as SQL (R and Python are beneficial).
  • Ability to develop custom Alteryx workflows that access a Snowflake database and create repeatable processes for other team members to execute.
  • Data Collection & Structured Analysis: develops a logical approach to data collection and ensures data collection meets the timeline.
  • Identifies gaps and/or anomalies in the data and seeks clarification, surfacing concerns or issues immediately with recommendations.

This is a backend ETL role. Candidates focused on reporting, dashboarding, or visualization tools (e.g., Power BI, Tableau) are not a fit for this position.



About MicroSourcing

With over 9,000 professionals across 13 delivery centers, MicroSourcing is the pioneer and largest offshore provider of managed services in the Philippines.

Our commitment to 100% YOU

MicroSourcing firmly believes that our company's strength lies in our people's diversity and talent. We are proud to foster an inclusive culture that embraces individuals of all races, genders, ethnicities, abilities, and backgrounds. We provide space for everyone, embracing different perspectives, and making room for opportunities for each individual to thrive.

At MicroSourcing, equality is not merely a slogan – it's our commitment. Our way of life. Here, we don't just accept your unique authentic self - we celebrate it, valuing every individual's contribution to our collective success and growth. Join us in celebrating YOU and your 100%

For more information, visit

*Terms & conditions apply

ETL Data Engineer – AWS

Taguig, National Capital Region ₱120,000 - ₱150,000 Y Eastvantage Business Solutions Inc.

Posted today

Job Description

About The Role:

We are looking for a detail-oriented and proactive ETL Data Engineer with strong experience in data integration, transformation, and pipeline development. The ideal candidate should be proficient in AWS, Python, and SQL, with hands-on experience working with diverse data sources and formats. A solid understanding of both RDBMS and NoSQL databases, along with familiarity in API development and data ingestion, is essential. Candidates who stay current with emerging technologies and thrive in Agile environments will be a great fit for our dynamic and fast-paced team.
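
For illustration, here is a minimal sketch of REST data ingestion with bearer-token authentication and simple pagination, as listed in the requirements; the endpoint path, query parameters, and pagination convention are assumptions rather than a specific client API.

```python
# Sketch of paginated REST ingestion with bearer-token auth.
# Assumes the API returns a JSON array per page and an empty page
# signals the end of the data; adjust to the real API's contract.
import requests

def fetch_records(base_url, token, page_size=100):
    records, page = [], 1
    while True:
        resp = requests.get(
            f"{base_url}/records",
            headers={"Authorization": f"Bearer {token}"},
            params={"page": page, "per_page": page_size},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json()
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records
```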

Position Requirements:

  • Candidate must possess at least a Bachelor's/College degree in Computer Science or IT.
  • At least 5 years of working experience in the related field (Python, ETL).
  • Hands-on experience with the AWS Suite for cloud-based data and analytics solutions.
  • Skilled in ingesting data from various sources including APIs, logs, flat files, and databases.
  • Experienced in API client-server development (REST), with a focus on API authentication and data ingestion.
  • Proficient in administering both RDBMS and NoSQL databases within a unified data and analytics environment.
  • Capable of converting, processing, and transforming various file formats (e.g., CSV, JSON, Parquet, XML) using data tools.
  • Comfortable using Python to address data and analytics requirements.
  • Experienced in deploying and maintaining scripts using Git repositories.
  • Keeps up to date with emerging technologies in the data and analytics space.
  • Adept at working within SCRUM/Agile methodologies and environments.
  • Open to working remotely and available for mid-shift or rotating schedules.
#ETLDevelopment #DataEngineering #AWSCloud #PythonProgramming #SQL #NoSQL #APIIntegration #RemoteWork #MidShift #AgileTeam #FieldServiceTech #TechJobs #DigitalTransformation

ETL Data Engineer – AWS

Taguig, National Capital Region ₱900,000 - ₱1,200,000 Y Eastvantage

Posted today

Job Description

We're Hiring: ETL Data Engineer – AWS & Python Specialist
We are seeking an experienced ETL Data Engineer to design, develop, and maintain robust data pipelines using AWS cloud services and Python. The ideal candidate will have expertise in extracting, transforming, and loading large datasets while ensuring data quality, performance, and scalability across our data infrastructure.

About The Role
We are looking for a detail-oriented and proactive ETL Data Engineer with strong experience in data integration, transformation, and pipeline development. The ideal candidate should be proficient in AWS, Python, and SQL, with hands-on experience working with diverse data sources and formats. A solid understanding of both RDBMS and NoSQL databases, along with familiarity in API development and data ingestion, is essential. Candidates who stay current with emerging technologies and thrive in Agile environments will be a great fit for our dynamic and fast-paced team.
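
As a small illustration of working across the file formats mentioned above, the sketch below reads CSV and JSON Lines sources with pandas and writes a Parquet output; the file names are placeholders, and to_parquet requires the optional pyarrow (or fastparquet) engine to be installed.

```python
# File-format conversion sketch: combine a flat-file and a JSON Lines
# source, then write a columnar Parquet output for downstream analytics.
import pandas as pd

orders_csv = pd.read_csv("orders.csv")                  # flat file source
orders_json = pd.read_json("orders.json", lines=True)   # JSON Lines source

combined = pd.concat([orders_csv, orders_json], ignore_index=True)
combined.to_parquet("orders.parquet", index=False)      # needs pyarrow/fastparquet
```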

Position Requirements

  • Candidate must possess at least a Bachelor's/College degree in Computer Science or IT.
  • At least 5 years of working experience in the related field (Python, ETL).
  • Hands-on experience with the AWS Suite for cloud-based data and analytics solutions.
  • Skilled in ingesting data from various sources including APIs, logs, flat files, and databases.
  • Experienced in API client-server development (REST), with a focus on API authentication and data ingestion.
  • Proficient in administering both RDBMS and NoSQL databases within a unified data and analytics environment.
  • Capable of converting, processing, and transforming various file formats (e.g., CSV, JSON, Parquet, XML) using data tools.
  • Comfortable using Python to address data and analytics requirements.
  • Experienced in deploying and maintaining scripts using Git repositories.
  • Keeps up to date with emerging technologies in the data and analytics space.
  • Adept at working within SCRUM/Agile methodologies and environments.
  • Open to working remotely and available for mid-shift or rotating schedules.
#ETLDevelopment #DataEngineering #AWSCloud #PythonProgramming #SQL #NoSQL #APIIntegration #RemoteWork #MidShift #AgileTeam #FieldServiceTech #TechJobs #DigitalTransformation

Ready to transform data into insights? Apply now and let's build the future together

Data Engineering

Taguig, National Capital Region ₱1,500,000 - ₱2,500,000 Y DXC Technology

Posted today

Job Description

Job Description:

  • Designs and develops Data and Analytics solutions
  • Performs data analysis, data architecture, data engineering activities
  • Designs and builds data ingestion pipelines; develops data transformation, cleansing and preparation processes
  • Designs and builds data storage solutions including database objects and data structures
  • Takes ownership of complex data-level or database incidents and problems and provides resolutions in a timely manner
  • Provides advisory and consultation services to determine the right solution
  • Utilizes best practices, work plans, checklists and defined processes
  • Proactively supports application teams
  • Performs optimization
  • Participates in project technical reviews
  • Leads or participates in work product reviews
  • Engages SMEs or vendors to provide assistance to resolve more complex issues as required

Technology And Functional Skills

  • Azure data platform and components: Azure Data Factory, Databricks, Blob Storage, Synapse, SQL DB/DW, Event Hubs, etc
  • AWS data platform and components: S3, Redshift, Spectrum, Athena, Aurora, CLI, etc
  • GCP data platform and components: BigQuery, Dataflow, Dataprep, Pub/Sub, etc
  • Non-cloud native: Informatica, Talend, DataStage, SAP BODS, etc
  • Big Data: Hadoop, Hadoop ecosystem, Spark, Scala, etc
  • Language and Toolset: SQL, Python, PySpark, ELK (Elasticsearch, Logstash, Kibana)
  • Data Integration, Extract-Transform-Load (ETL), Data Ingestion, Data Cleansing, Data Preparation, Database
  • Data Warehouse, Data Lake, and Business Intelligence (BI) concepts

At DXC Technology, we believe strong connections and community are key to our success. Our work model prioritizes in-person collaboration while offering flexibility to support wellbeing, productivity, individual work styles, and life circumstances. We're committed to fostering an inclusive environment where everyone can thrive.

Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers, typically through online services such as false websites, or through unsolicited emails claiming to be from the company. These emails may request recipients to provide personal information or to make payments as part of the illegitimate recruiting process. DXC does not make offers of employment via social media networks, and DXC never asks for any money or payments from applicants at any point in the recruitment process, nor asks a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here.


Data Engineering

Taguig, National Capital Region ₱80,000 - ₱100,000 Y Control Flow Labs

Posted today

Job Description

About Control Flow Labs

Control Flow Labs is a fast-growing pre-seed startup building an AI-powered ecommerce analytics platform. We aggregate data from major platforms to deliver comprehensive analytics and insights for medium to enterprise-level businesses.

With our MVP live and beta clients onboard, we're scaling rapidly and preparing for our next funding round. We're looking for a skilled Data & Integrations Manager to perfect our current data systems and build new capabilities as we grow.

The Role

We're seeking a Data Engineering & Integrations Manager to own the complete data lifecycle of our analytics platform. You'll be responsible for ensuring data accuracy, building robust integrations with ecommerce platforms, writing ETL pipelines, and creating meaningful analytics that our enterprise clients rely on for business decisions.

This is a hands-on technical role where you'll work closely with our current project lead and report to our incoming CTO. You'll be instrumental in scaling our data capabilities from beta to 100+ enterprise clients.

What You'll Do

Data Engineering & Pipeline Management

● Build and maintain integrations with ecommerce platforms using REST and GraphQL APIs

● Implement OAuth flows and manage platform authentication for secure data access

● Own the complete data pipeline using Dagster for orchestration, ensuring reliable data ingestion and transformation (a minimal sketch follows this list of responsibilities)

● Develop and optimize ETL/ELT processes that can scale with rapid client growth

● Implement data validation and quality checks to guarantee accuracy across all client dashboards
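
For illustration, here is a minimal Dagster sketch of an orchestrated pipeline like the one referenced in the bullets above: one asset stands in for a platform pull and a downstream asset derives metrics from it; the asset names and hard-coded data are invented for the example.

```python
# Minimal Dagster sketch: two software-defined assets, where the
# downstream asset depends on the upstream one by parameter name.
from dagster import asset, materialize

@asset
def raw_orders():
    # Placeholder for an ecommerce-platform API pull.
    return [{"order_id": 1, "total": "10.50"}, {"order_id": 2, "total": "4.00"}]

@asset
def order_metrics(raw_orders):
    # Downstream transformation; Dagster wires the dependency by name.
    return {"order_count": len(raw_orders),
            "revenue": sum(float(o["total"]) for o in raw_orders)}

if __name__ == "__main__":
    materialize([raw_orders, order_metrics])  # in-process run for illustration
```

In a real deployment the assets would typically be registered in a Definitions object and run on a schedule by the Dagster daemon rather than materialized in-process.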

Analytics & Visualization Development

● Create analytics endpoints that power our frontend dashboards and client interfaces (see the sketch after this group of bullets)

● Transform raw platform data into actionable ecommerce insights and KPIs

● Build data models that support both current reporting and future AI-powered features

● Collaborate with our dedicated frontend developer to ensure smooth data delivery to client dashboards
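
As a rough sketch of an analytics endpoint of the kind described above, the example below uses FastAPI with placeholder figures; the route, query parameter, and numbers are illustrative, not the platform's actual API.

```python
# Minimal FastAPI analytics endpoint sketch with static placeholder data.
from fastapi import FastAPI

app = FastAPI()

@app.get("/analytics/revenue")
def revenue(days: int = 30):
    # In the real service this would query PostgreSQL; static data here.
    return {"window_days": days, "revenue": 12345.67, "orders": 321}

# Run locally with: uvicorn analytics_api:app --reload
# (assumes this file is saved as analytics_api.py)
```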

System Optimization & Scaling

● Perfect existing data systems to ensure 100% accuracy and reliability

● Monitor and prevent data drift through automated validation and anomaly detection

● Optimize performance for real-time analytics as we scale to more clients and data volume

● Document data flows and processes for team knowledge sharing and compliance

Platform Integration Expertise

● Design integration strategies for new ecommerce platforms as we expand

● Troubleshoot platform API changes and maintain integration reliability

● Communicate infrastructure requirements to our dedicated DevOps team for optimal system performance

Note: You'll focus purely on data engineering and backend analytics - we have dedicated team members handling frontend development and infrastructure management.

What We're Looking For

Technical Skills (Required)

● Strong Python development experience, particularly with FastAPI and Dagster

● Advanced SQL skills and experience with PostgreSQL

● Cloud platform experience with AWS services (RDS, S3, etc.)

● Data engineering expertise with tools like Dagster, dbt, or similar orchestration platforms

● API integration experience with REST and GraphQL methodologies

● Data modeling and warehouse design experience

Domain Knowledge (Preferred)

● Ecommerce platform experience - understanding of Shopify, TikTok, Amazon, or similar platforms

● Analytics platform background - experience building customer-facing dashboards and reports

● Business intelligence tools knowledge for visualization and reporting

● Data validation and quality assurance methodologies

Experience Level

● 3-5+ years in data engineering, analytics engineering, or similar technical roles

● Proven track record building and maintaining data pipelines at scale

● Experience with customer-facing analytics or BI platforms

● Background in fast-paced startup or growth environments

Soft Skills

● Obsessive about data accuracy - you treat every dashboard as a promise to users

● Strong communication skills - can translate technical requirements to infrastructure team

● Problem-solving mindset - you proactively identify and fix data issues before they impact clients

● Collaborative approach - comfortable working closely with product, engineering, and business teams

What We Offer

Compensation & Benefits

● Competitive salary: ₱80,000 - ₱100,000/month based on experience

● Flexible work arrangement - primarily onsite in BGC, Taguig with WFH flexibility as needed

● Full benefits package

● Growth opportunity - join a fast-scaling startup with clear advancement paths

Technical Environment

● Modern data stack: Python FastAPI, PostgreSQL, AWS, Dagster

● Interesting challenges: Multi-platform integrations, real-time analytics, enterprise-scale data processing

● Direct impact: Your work directly enables client business decisions and platform growth

● Learning opportunities: Exposure to AI/ML integration and enterprise-scale data challenges

Team & Culture

● Collaborative team of 13 across all departments

● Fast-paced environment - we move quickly and don't let blockers persist

● Data-driven culture - decisions backed by metrics and evidence

● Direct mentorship from experienced project lead and incoming CTO

Success Metrics (First 6 Months)

● 100% data accuracy across all client integrations and dashboards

● Zero data downtime or missing data incidents

● Successful integration of at least 2 new platform connections

● Improved data pipeline performance supporting 3x client growth

● Clear documentation of all data processes and integration methods

Requirements

● Reliable commute to BGC, Taguig office

● Immediate availability preferred

Ready to Join Us?

If you're passionate about building reliable, scalable data systems that power real business decisions, we'd love to hear from you.

To apply, please include:

● Your resume highlighting relevant data engineering and integration experience

● Brief examples of data pipelines or integrations you've built

● Any experience with ecommerce platforms or analytics systems

Control Flow Labs is an equal opportunity employer committed to building a diverse and inclusive team.

Job Type: Full-time

Pay: Php80,000.00 - Php100,000.00 per month

Benefits:

  • Additional leave
  • Health insurance

Work Location: In person

Data Engineering Director

₱2,500,000 - ₱6,000,000 Y IntouchCX

Posted today

Job Description

As Director of Data Engineering, You Will…

  • Develop and execute a data strategy that aligns with the company's objectives, leading the data engineering team to build scalable and robust data solutions
  • Mentor, and lead a team of data engineers and architects
  • Foster a collaborative, innovative, and results-oriented environment within the team
  • Design and implement effective database solutions and models to store and retrieve company data
  • Examine and identify database structural necessities by evaluating client operations, applications, and programming
  • Manage timelines, resources, and deliverables for multiple projects simultaneously
  • Ensure projects are completed on time and within budget while meeting business needs
  • Ensure compliance with data governance and security requirements
  • Establish policies and procedures for data handling and sharing

As Director of Data Engineering, You Have…

  • A Degree or Diploma in Computer Science, Information Systems or equivalent experience
  • Minimum of 8 years of experience in data engineering, with at least 3 years in a leadership role
  • Strong knowledge of database management, warehousing solutions, and ETL processes
  • Proficiency in SQL and familiarity with programming languages such as Python or Java
  • Experience with cloud platforms (AWS, Azure, Google Cloud) and understanding of SaaS environments
  • Knowledge of real-time analytics and BI tools like Looker, Tableau, PowerBI, or similar
  • Certifications in data management, big data solutions, or cloud architecture
  • Experience in a contact center or customer service environment
  • Demonstrated ability to build and maintain high-performing data engineering teams
  • Excellent communication and interpersonal skills, with the ability to engage and influence stakeholders at all levels

Job Type: Full-time

Benefits:

  • Additional leave
  • Free parking
  • Health insurance
  • Life insurance
  • On-site parking
  • Work from home

Work Location: Remote

Data Engineering Consultant

Makati City, National Capital Region ₱1,200,000 - ₱1,800,000 Y Cobden & Carter International

Posted today

Job Description

  • Develop innovative solutions, create actionable insights, and drive better decisions and performance to help the business achieve its objectives
  • Coordinate with all stakeholders and ensure the project meets requirements and objectives
  • Establish a deep understanding of the business and collaborate with Business SMEs to shape the reporting solutions provided
  • Resolve issues as they arise across areas of the project and where they impact other activities, systems, and projects
  • Influence senior leadership to adopt new ideas, projects, and/or approaches
  • Develop software solutions by studying information needs, conferring with users, studying systems flow, data usage, and work processes, investigating problem areas, collaborating on solutions to identified problem areas, and following the software development lifecycle
  • Perform all tasks in the development life cycle, including requirements analysis, design, development, and testing
  • Utilize available programming methodologies and languages and adhere to coding standards, procedures, and techniques while contributing to the technical code documentation
  • Create and help maintain technical and end-user documentation for new and existing applications
  • Perform all other related duties as assigned
  • Identify and help mitigate risks to data and organizational health

Must have skills:

7 years in MS SQL Server, ETL (SSIS), Snowflake, Databricks

Nice to have:

2 years in Microsoft .NET Framework / .NET Core, Microservices, REST APIs, Python

2 years with telephony data experience

Job Types: Full-time, Permanent

Pay: Php100,000.00 - Php150,000.00 per month

Education:

  • Bachelor's (Required)

Experience:

  • Snowflake: 7 years (Required)
  • Databricks: 7 years (Required)
  • MS SQL: 7 years (Required)
  • ETL (SSIS): 7 years (Required)

Work Location: In person
