5 Data Engineering jobs in the Philippines
Data Engineering Specialist
Posted 416 days ago
Job Description
The Data Engineer helps manage data sources and environments, utilizing large data sets and maintaining their integrity to create models and apps that deliver insights to the organization.
You will:
- Build and manage data models that bring together data from different sources.
- Help consolidate and cleanse data for use by the modeling and development teams.
- Structure data for use in analytics applications.
- Lead a team of Data Engineers effectively.
Requirements
Must-Have:
- A minimum of 2 years of advanced SQL knowledge and experience working with relational databases
- A minimum of 2 years of familiarity and hands-on experience with different SQL objects such as stored procedures, functions, views, etc.
- A minimum of 2 years building data flow components and processing systems to extract, transform, load, and integrate data from various sources
- College graduate - Bachelor's degree or equivalent
Good to have:
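As a small illustration of the SQL objects the must-have list names, here is a sketch using Python's built-in sqlite3 module (the table, view, and column names are hypothetical; note that SQLite supports views but not stored procedures, which would live in an engine such as SQL Server or PostgreSQL):

```python
import sqlite3

# In-memory database standing in for a relational source (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("PH", 100.0), ("PH", 250.0), ("SG", 75.0)])

# A view is one of the SQL objects the listing mentions: a saved query that
# downstream consumers can treat like a table.
conn.execute("""
    CREATE VIEW region_totals AS
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
""")

rows = dict(conn.execute("SELECT region, total FROM region_totals"))
print(rows)  # {'PH': 350.0, 'SG': 75.0}
```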
- A minimum of 2 years of strong cloud technology experience
- A minimum of 2 years of hands-on experience with advanced Excel topics such as cube functions, VBA automation, Power Pivot, etc.
- Understanding of sales processes and systems
- Master's degree in a technical field
- Experience with Python
- Experience with quality assurance processes
- Experience in project management
Lead Consultant - Data Architecture & Engineering (GDC) - Remote
Posted 4 days ago
Job Description
The prescreening interview can be done over the phone or through Teams.
Job Highlights
Leadership Opportunities - This role will help you extend and expand your knowledge and experience in technical leadership positions. These include technical decision-making; guidance and mentorship of more junior team members; and developing relationships with project, client, and company leadership - all while remaining hands-on in your areas of expertise.
Our technology focus is on Azure—you'll have plenty of exciting opportunities to grow your skills in Microsoft and Azure in an environment committed to technical excellence and client experience in a very specific, defined space. Six months of experience on our team is worth years somewhere else.
Microsoft Partnerships - Our great global relationship with Microsoft ensures that we have a pipeline of cutting-edge Azure work opportunities and access to the teams that have built the platform. You won't find this at other places.
Competitive compensation package: salary, allowance, and standard benefits, including quarterly and annual performance-based cash bonuses and other remuneration. Great working environment and company culture with flexible work location.
General Required Technical Skills:
Experience leading small to medium-sized technical teams (this includes work allocation/distribution to team members, technical escalation and support, and representing the team in Agile ceremonies and client meetings).
Expertise in designing and implementing logical and physical data models for cloud and hybrid data warehouse environments
Implementing data architectures to support a variety of data formats and structures, including structured, semi-structured and unstructured data
Experience with multiple full life-cycle data warehouse implementations
Understanding of data architectures required to support data integration processing
Experience with data modeling technologies such as ER/Studio, ER/Win, or similar
Experience with Microsoft Azure Data Platform services including Azure Data Lake Store, Azure Storage, Azure Synapse, Azure Data Factory, Azure SQL Database, Logic Apps, and APIs
Demonstrated ability to quickly learn, adopt and apply new technologies
Data profiling and creation of source to target mappings
Ability to provision and configure Azure data service resources
Detailed Required Skills:
Python & SQL Scripting
SQL, PySpark
General Cloud Architecture competency skillset: capable of taking requirements and building out data pipelines
API Knowledge is required, but preference given to candidates who can create APIs
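Since the detailed skills call out not just consuming but creating APIs, here is a minimal sketch of serving one with Python's standard library (the endpoint path and payload are hypothetical; a production service would more likely use a framework):

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Minimal read-only HTTP API sketch (endpoint and payload are hypothetical).
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0: OS picks a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/health"
with urllib.request.urlopen(url) as resp:
    payload = json.load(resp)
server.shutdown()
print(payload)  # {'status': 'ok'}
```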
Preferred Experience/Skills/Certifications:
Microsoft Fabric
Microsoft Azure Cosmos DB, Data Flows, ExpressRoute, Azure Active Directory
Experience creating strategies to migrate customers from on-premises environments to Azure
Power BI and semantic modeling
AWS Glue & Azure Data Factory
AWS S3 & Azure Blob
AWS Athena & Azure Databricks
AWS Redshift & Azure Synapse
AWS ECS & Azure AKS
Lead Data Architecture & Engineering Consultant (GDC) - WFH
Posted 24 days ago
Job Description
Job Title: Lead Data Architecture & Engineering Consultant (GDC) | REMOTE
Job Overview:
This role offers leadership opportunities, allowing you to expand your expertise in technical leadership responsibilities. You will be involved in making technical decisions, mentoring junior team members, and building relationships with project, client, and company leadership, all while remaining actively hands-on in your technical domain. The primary technology focus is on Azure, providing numerous opportunities to develop skills within the Microsoft and Azure ecosystems in an environment dedicated to technical excellence and exceptional client service. Six months of experience on our team can equate to years of experience elsewhere.
Partnerships:
Our strong global relationship with Microsoft ensures access to innovative Azure projects and direct engagement with the teams that develop the platform, offering unique professional growth opportunities unavailable elsewhere.
Compensation & Culture:
We offer a competitive salary package, allowances, standard benefits, and performance-based bonuses quarterly and annually. Our company culture emphasizes a positive work environment, with flexible work arrangements to support work-life balance.
Key Technical Skills:
- Proven experience leading small to medium technical teams, including task allocation, technical escalation, support, and representing the team in Agile ceremonies and client meetings.
- Expertise in designing and deploying logical and physical data models for cloud and hybrid data warehouse environments.
- Experience in developing data architectures that support various data formats, including structured, semi-structured, and unstructured data.
- Hands-on experience with full lifecycle data warehouse projects.
- Knowledge of data architecture principles for data integration processes.
- Familiarity with data modeling tools such as ER/Studio, ER/Win, or similar.
- Proficiency with Microsoft Azure Data Platform services, including Azure Data Lake Storage, Azure Storage, Azure Synapse Analytics, Azure Data Factory, Azure SQL Database, Logic Apps, and APIs.
- Ability to quickly learn, adopt, and apply new technologies.
- Experience in data profiling and creating source-to-target data mappings.
- Skilled in provisioning and configuring Azure data services.
Detailed Skill Requirements:
- Python and SQL scripting expertise.
- Experience with SQL and PySpark.
- General competency in cloud architecture, capable of translating requirements into data pipeline solutions.
- API development knowledge; candidates who can create APIs will be preferred.
Preferred Experience, Skills, and Certifications:
- Microsoft Fabric
- Azure Cosmos DB, Data Flows, ExpressRoute, Azure Active Directory
- Experience planning and executing migration strategies from on-premises environments to Azure
- Power BI and semantic data modeling
- AWS Glue and Azure Data Factory
- AWS S3 and Azure Blob Storage
- AWS Athena and Azure Databricks
- AWS Redshift and Azure Synapse Analytics
- AWS ECS and Azure AKS
Lead Cloud Consultant - Data Architecture & Engineering (Remote)
Posted 1 day ago
Job Description
We are currently hiring for Lead Consultant - Data Architecture & Engineering.
We look forward to reviewing your application and potentially working with you.
General Required Technical Skills:
- Experience leading small to medium sized technical teams (this includes work allocation/distribution to team members, technical escalation and support, representing the team in Agile ceremonies and client meetings).
- Expertise in designing and implementing logical and physical data models for cloud and hybrid data warehouse environments
- Implementing data architectures to support a variety of data formats and structures including structured, semi-structured and unstructured data
- Experience with multiple full life-cycle data warehouse implementations
- Understanding of data architectures required to support data integration processing
- Experience with data modeling technologies such as ER/Studio, ER/Win or similar
- Experience with Microsoft Azure Data Platform services including Azure Data Lake Store, Azure Storage, Azure Synapse, Azure Data Factory, Azure SQL Database, Logic Apps, APIs
- Demonstrated ability to quickly learn, adopt and apply new technologies
- Data profiling and creation of source to target mappings
- Ability to provision and configure Azure data service resources
Detailed Required Skills:
- Python & SQL Scripting
- SQL, PySpark
- General Cloud Architecture competency skillset
- Capable of taking requirements and building out data pipelines
- API Knowledge is required, but preference given to candidates who can create APIs
Preferred Experience/Skills/Certifications
- Microsoft Fabric
- Microsoft Azure Cosmos DB, Data Flows, ExpressRoute, Azure Active Directory
- Experience creating strategies to migrate customers from on-premises environments to Azure
- Power BI and semantic modeling
- AWS Glue & Azure Data Factory
- AWS S3 & Azure Blob
- AWS Athena & Azure Databricks
- AWS Redshift & Azure Synapse
- AWS ECS & Azure AKS
Don't miss this opportunity to take the next step in your cloud engineering career. We'd love to hear from you!
Data Platform Engineer Big Data Cloud ETL
Posted 22 days ago
Job Description
We’re seeking a Data Platform Engineer to help architect, develop, and optimize large-scale data ingestion, transformation, and analytics solutions across cloud and on-prem environments. This role is ideal for professionals experienced in handling diverse data formats and streaming workloads, who thrive in fast-moving environments and enjoy building data-driven systems from the ground up.
Key Responsibilities
Implement solutions for batch and streaming data ingestion from APIs, flat files, and cloud sources
Develop data ingestion and transformation pipelines using tools such as Spark, Scala, and Kafka
Set up data staging, ETL processes, and quality checks to prepare datasets for analytics
Build scalable data frameworks and contribute to architecture decisions for cloud and on-prem systems
Optimize SQL and PL/SQL queries to ensure efficient data retrieval from structured and unstructured sources
Support data profiling, source-to-target mappings, and validation of business requirements
Work closely with Data Architects and Delivery Leads to shape data solutions that meet stakeholder goals
Deliver solutions that integrate well with BI/reporting tools (Looker, Tableau, or similar)
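The batch side of the responsibilities above can be sketched in plain Python (tool-agnostic; in practice the extract/transform/load steps would run on Spark, Kafka, or similar, and the field names here are hypothetical):

```python
import csv
import io
import json

def extract(flat_file: io.StringIO) -> list[dict]:
    """Extract: read raw rows from a flat-file source."""
    return list(csv.DictReader(flat_file))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: type-cast fields and apply a basic quality check."""
    clean = []
    for row in rows:
        try:
            clean.append({"user_id": int(row["user_id"]),
                          "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue  # quality check: skip malformed records
    return clean

def load(rows: list[dict]) -> str:
    """Load: serialize the staged dataset for an analytics consumer."""
    return json.dumps(rows)

# One malformed record ("not_a_number") is dropped by the quality check.
raw = io.StringIO("user_id,amount\n1,9.99\n2,not_a_number\n3,5.00\n")
staged = load(transform(extract(raw)))
print(staged)
```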
Job Qualifications
Preferred Skills & Experience
Bachelor’s degree in Computer Science, Statistics, Information Management, Finance, Economics, or related field
Minimum 3 years' experience in integrating data for analytics using formats like JSON, XML, flat files, Hadoop, or cloud-native formats
Proficient in Spark/Scala and other JVM-based programming languages
Hands-on experience with ingestion tools such as NiFi, Sqoop, or Flume
Strong background in working with HDFS, Hive, HBase, and large-scale data processing systems
Familiar with data pipeline development using cloud-native tools (e.g., AWS Glue, Azure Data Factory, GCP Dataflow)
Knowledge of CI/CD tooling including Jenkins, Bitbucket, SonarQube, and Nexus
Proficient in SQL and PL/SQL for data access, transformation, and optimization
Familiar with BI tools like Looker, Tableau, or similar for data visualization
Knowledgeable in core concepts such as data lakes, warehousing, and real-time data architecture
Strong understanding of streaming platforms like Apache Kafka, Spark Streaming, or Storm
Capable of handling structured and unstructured data, profiling source systems, and delivering fit-for-purpose datasets
Working knowledge of Java is a plus
Why Join Us?
Work with modern big data stacks and real-time analytics platforms
Collaborative and technically driven team
Exposure to both on-prem and cloud-native technologies
Opportunity to shape the future of enterprise data architecture