579 Reporting Engineer jobs in the Philippines

Data and Reporting Engineer

₱900,000 - ₱1,200,000 yearly · QBE Group Shared Services Centre

Posted today


Job Description

Primary Details

Time Type: Full time

Worker Type: Employee

A Data and Reporting Engineer is responsible for collecting, transforming, and storing data from various sources, ensuring data quality and integrity, and creating meaningful reports and dashboards for stakeholders. They bridge the gap between data sources and end users, enabling data-driven decision-making within QBE. This role is also responsible for producing complex reporting and analytics, automation, dashboard creation, and report consolidation across the function. It involves the design and development of project-driven reporting tools and serves as the direct contact for onshore stakeholders on assigned projects.

RESPONSIBILITIES:

Data Extraction and Transformation:
• Gather data from diverse sources, such as databases, APIs, and flat files.
• Clean, transform, and preprocess data to ensure accuracy and consistency.
• Develop and maintain ETL (Extract, Transform, Load) processes (a minimal illustrative sketch follows this list).

Data Warehousing:
• Design and maintain data warehouses or data lakes to store and organize data efficiently.
• Implement data modeling and schema design to optimize data retrieval.
• Manage big data assets.

Database Management:
• Administer and optimize databases for performance and scalability.
• Ensure data security and access control measures are in place.

Reporting and Visualization:
• Create interactive dashboards and reports using tools like Tableau, Power BI, or custom-built solutions.
• Collaborate with business analysts and stakeholders to define reporting requirements.

Data Quality and Governance:
• Implement data quality checks and validation processes.
• Establish data governance policies and procedures to maintain data integrity.
• Perform root cause analysis on data queries from stakeholders.

Automation and Scalability:
• Develop automated solutions for data processing and reporting tasks.
• Scale data pipelines and reporting systems to handle growing volumes of data.

Performance Monitoring:
• Monitor system performance, troubleshoot issues, and optimize query performance.
• Implement alerting mechanisms for system failures or anomalies.

Documentation and Knowledge Sharing:
• Maintain documentation for data pipelines, data models, and reporting processes.
• Collaborate with team members and stakeholders to share knowledge and best practices.

Technical Expertise:
• Stay up-to-date with emerging technologies and best practices in data engineering and reporting.
• Continuously enhance skills in SQL, data modeling, and reporting tools.

Project Management:
• Lead and manage onshore reporting project requirements and act as the direct contact for specific AO business units.
• Implement Agile methodology on different reporting projects.

Quality Assurance:
• Implement data quality checks to ensure data integrity.
• Identify and address data quality issues as they arise.
• Audit reports produced by the team, ensuring the DTP is adhered to.
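The ETL duties above (extract from databases, APIs, or flat files; clean and transform; load into a reporting store) can be pictured with the minimal Python sketch below. It is an illustration only, not QBE's actual pipeline: the file name, column names, and SQLite target are invented assumptions.

    # Minimal illustrative ETL sketch (hypothetical file, columns, and target database).
    import sqlite3

    import pandas as pd

    def extract(csv_path: str) -> pd.DataFrame:
        # Extract: read a flat-file source into a DataFrame.
        return pd.read_csv(csv_path)

    def transform(df: pd.DataFrame) -> pd.DataFrame:
        # Transform: basic cleaning for accuracy and consistency.
        df = df.drop_duplicates()
        df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
        df = df.dropna(subset=["policy_id"])                           # assumed key column
        df["premium"] = pd.to_numeric(df["premium"], errors="coerce")  # assumed numeric column
        return df

    def load(df: pd.DataFrame, db_path: str, table: str) -> None:
        # Load: write the cleaned data into a reporting table.
        with sqlite3.connect(db_path) as conn:
            df.to_sql(table, conn, if_exists="replace", index=False)

    if __name__ == "__main__":
        load(transform(extract("claims_extract.csv")), "reporting.db", "claims_clean")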

QUALIFICATIONS:

Knowledge
• General knowledge of the insurance business and related market conditions preferred
• Fundamental knowledge of data analysis, extraction, and management techniques for insurance portfolios
• Strong mathematical and statistical knowledge
• Fundamental knowledge of report template creation and design
• Agile methodologies, such as Scrum, Kanban, and Lean

Skills
• Proficiency in data modeling, SQL, and database management systems (e.g., SQL Server, Oracle, PostgreSQL)
• Experience with ETL tools and processes (e.g., SAS Base, SAS EG, Apache NiFi, Apache Spark, Talend)
• Knowledge of data warehousing concepts and technologies (e.g., Snowflake, Redshift, BigQuery)
• Familiarity with reporting and visualization tools (e.g., Tableau, Power BI, QlikView)
• Programming skills in languages such as Python, Java, or Scala for data processing and automation
• Strong analytical and problem-solving skills
• Understanding of data governance and data security principles
• Excellent communication skills to work effectively with cross-functional teams and business stakeholders
• Ability to adapt to evolving technologies and trends in data engineering and reporting

Experience
• Relevant years of experience in SAS Base, SAS EG, SQL, Excel, Tableau, and Power BI; has been involved in multiple automation initiatives that delivered tangible benefits
• Relevant years of experience in the financial services industry, preferably in insurance
• Working experience in a consulting or shared services environment preferred

Bachelor's Degree in Computer Science, Information Technology, or any related field, or equivalent work experience.

Happy to talk flexible working arrangements.

Skills:

Business Intelligence Applications, Business Management, Communication, Critical Thinking, Customer Service, Detail-Oriented, Financial Products, Intentional collaboration, Managing performance, QlikView, Regulatory Compliance, Reporting and Analysis, Risk Management, Sound Judgment, Stakeholder Management

How to Apply:

To submit your application, click "Apply" and follow the step-by-step process.

Equal Employment Opportunity:

QBE is an equal opportunity employer and is required to comply with equal employment opportunity legislation in each jurisdiction in which it operates.


Data and Reporting Engineer

₱1,500,000 - ₱3,000,000 yearly · QBE GROUP SHARED SERVICES LIMITED - PHILIPPINE BRANCH

Posted today


Job Description

A Data and Reporting Engineer is responsible for collecting, transforming, and storing data from various sources, ensuring data quality and integrity, and creating meaningful reports and dashboards for stakeholders. They bridge the gap between data sources and end users, enabling data-driven decision-making within QBE. This role is also responsible for producing complex reporting and analytics, automation, dashboard creation, and report consolidation across the function. It involves the design and development of project-driven reporting tools and serves as the direct contact for onshore stakeholders on assigned projects.

RESPONSIBILITIES:

Data Extraction and Transformation:
• Gather data from diverse sources, such as databases, APIs, and flat files.
• Clean, transform, and preprocess data to ensure accuracy and consistency.
• Develop and maintain ETL (Extract, Transform, Load) processes.

Data Warehousing:
• Design and maintain data warehouses or data lakes to store and organize data efficiently.
• Implement data modeling and schema design to optimize data retrieval.
• Manage big data assets.

Database Management:
• Administer and optimize databases for performance and scalability.
• Ensure data security and access control measures are in place.

Reporting and Visualization:
• Create interactive dashboards and reports using tools like Tableau, Power BI, or custom-built solutions.
• Collaborate with business analysts and stakeholders to define reporting requirements.

Data Quality and Governance:
• Implement data quality checks and validation processes.
• Establish data governance policies and procedures to maintain data integrity.
• Perform root cause analysis on data queries from stakeholders.

Automation and Scalability:
• Develop automated solutions for data processing and reporting tasks.
• Scale data pipelines and reporting systems to handle growing volumes of data.

Performance Monitoring:
• Monitor system performance, troubleshoot issues, and optimize query performance.
• Implement alerting mechanisms for system failures or anomalies.

Documentation and Knowledge Sharing:
• Maintain documentation for data pipelines, data models, and reporting processes.
• Collaborate with team members and stakeholders to share knowledge and best practices.

Technical Expertise:
• Stay up-to-date with emerging technologies and best practices in data engineering and reporting.
• Continuously enhance skills in SQL, data modeling, and reporting tools.

Project Management:
• Lead and manage onshore reporting project requirements and act as the direct contact for specific AO business units.
• Implement Agile methodology on different reporting projects.

Quality Assurance:
• Implement data quality checks to ensure data integrity.
• Identify and address data quality issues as they arise.
• Audit reports produced by the team, ensuring the DTP is adhered to.


QUALIFICATIONS:

Knowledge
• General knowledge of the insurance business and related market conditions preferred
• Fundamental knowledge of data analysis, extraction, and management techniques for insurance portfolios
• Strong mathematical and statistical knowledge
• Fundamental knowledge of report template creation and design
• Agile methodologies, such as Scrum, Kanban, and Lean

Skills
• Proficiency in data modeling, SQL, and database management systems (e.g., SQL Server, Oracle, PostgreSQL)
• Experience with ETL tools and processes (e.g., SAS Base, SAS EG, Apache NiFi, Apache Spark, Talend)
• Knowledge of data warehousing concepts and technologies (e.g., Snowflake, Redshift, BigQuery)
• Familiarity with reporting and visualization tools (e.g., Tableau, Power BI, QlikView)
• Programming skills in languages such as Python, Java, or Scala for data processing and automation
• Strong analytical and problem-solving skills
• Understanding of data governance and data security principles
• Excellent communication skills to work effectively with cross-functional teams and business stakeholders
• Ability to adapt to evolving technologies and trends in data engineering and reporting

Experience
• At least 7 years of experience in SAS Base, SAS EG, SQL, Excel, Tableau, and Power BI; has been involved in multiple automation initiatives that delivered tangible benefits
• At least 1 year of experience in the financial services industry, preferably in insurance
• Working experience in a consulting or shared services environment preferred
• Bachelor's degree in Computer Science, Information Technology, or any related field, or equivalent work experience


Regulatory Reporting Platform Engineer

Makati City, National Capital Region · ₱80,000 - ₱120,000 yearly · ING

Posted today


Job Description

Responsibilities of / Expectations from the Role

• Design and develop the respective platform code, configurations, and unit testing
• Ensure deliverables meet business requirements
• Document and communicate technical designs
• Ensure all support and project activities comply with quality assurance, audit specifications, internal controls, and standards
• Provide up-to-date and accurate documentation of changes, fixes, upgrades, and enhancements of supported applications, and of implementation procedures, in accordance with established standards

Required Technical Skillset

• MSBI stack developer (SQL, T-SQL, SSIS, SSAS, SSRS)
• Power BI developer
• Good working exposure to Agile methodology
• Strong expertise in SSAS multidimensional cube development with the MDX language
• Strong understanding of DSVs, dimension-measure group relationship setup, attribute key processing, and partition processing
• Performance improvement experience through MDX
• SQL Profiler usage, leveraged for optimizations
• TFS (Team Foundation Server), GitHub, Azure DevOps
• Build end-to-end ETL flows for DWH applications
• Analyze data models
• Data quality analysis
• Perform complex database programming by writing SQL procedures on large-scale databases
• Proven ability to perform technical troubleshooting to diagnose, isolate, and correct data and database issues


Data Engineer/ Senior Data Engineer

Taguig, National Capital Region · ₱1,500,000 - ₱3,000,000 yearly · MoneyHero Group (Nasdaq: MNY)

Posted today


Job Description

About Us

MoneyHero Group (Nasdaq: MNY) is a market leading financial products platform in Greater Southeast Asia, reaching 9.8m monthly unique users and working with more than 270 commercial partners across five markets including Singapore, Hong Kong S.A.R., Philippines, Taiwan, and Malaysia.

About The Job

The Senior Data Engineer is responsible for overseeing the design, development, implementation, and maintenance of all data-related systems, infrastructure, and processes within an organization. They will lead a team of data engineers and work closely with data scientists, business analysts, and other stakeholders to ensure that data is properly collected, processed, stored, and made available for analysis and decision-making purposes.

The Data Engineer Will

  • Develop and implement technical data architecture and engineering strategies to support the organisation's data-driven initiatives, including data warehousing, ETL processes, and data governance.
  • Build and maintain scalable and reliable data pipelines, ensuring data quality and consistency across all systems.
  • Create and maintain data models that align with business requirements, ensuring data integrity and optimal performance.
  • Manage, guide, and support junior data engineers in the development and implementation of data-related solutions.
  • Collaborate with data scientists, business analysts, and other stakeholders to identify data-related needs and requirements, and develop and implement solutions.
  • Monitor and troubleshoot data-related issues, providing timely resolution and support to minimize downtime and disruption.

In This Role, We Are Looking For Someone With

  • At least 4 years of hands-on experience in data engineering building scalable pipelines and infrastructure
  • Proven ability to create and maintain data models that align with business requirements, ensuring data integrity and optimal performance
  • Strong technical skills in data warehousing, ETL processes, and database management, with tools such as SQL, Python, and dbt
  • Experience with cloud-based data technologies such as AWS, Azure, or Google Cloud (BigQuery preferred)
  • Knowledge of data security and privacy regulations, such as GDPR and HIPAA
  • Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders
  • Analytical and problem-solving skills, with the ability to identify and resolve data-related issues and implement effective solutions
  • Bachelor's degree in computer science, engineering, or a related field

What can you expect from us?

Impact: We are actively empowering and connecting people to a better financial future. Join us if you want to help us achieve our mission.

Work: We have a team of over 350 talented individuals in 4 markets who are hyper passionate about building innovative financial solutions and making an impact on people's lives.

Culture: We take our work seriously but don't hesitate to keep things light. We can only create magic when we have a little bit of fun.

Thrive: We launched in 2014, and now help over 10 million monthly users make the best financial decisions. Accelerate your career and become a pioneer in your field with a leading fintech company that seeks to push the boundaries of your imagination and is committed to growing your career.

Reputation: We are backed by world-class organizations and companies and have raised over US$110 million from investors including Experian, Pacific Century Group, and IFC, a member of the World Bank Group.

EEO Statement

MoneyHero Group is an equal opportunity employer. We value, support, and respect all individuals and are committed to maintaining an inclusive and diverse working environment. Decisions in hiring are based on business needs, requirements of the job, and individual qualifications and shall not be influenced by any consideration of race, ethnic or national origin, religion, sex (including gender identity and/or expression), age, sexual orientation, marital status, parental status, disability, genetic information, political affiliation, or other applicable legally protected characteristics.


Data Engineer

₱1,500,000 - ₱2,500,000 yearly · Sysgen RPO, Inc.

Posted today


Job Description

KEY RESPONSIBILITIES

• Lead the design, development, and maintenance of robust data pipelines and workflows.
• Write and optimize complex SQL queries, stored procedures, and views for performance and scalability.
• Oversee and manage the operational aspects of data stores.
• Drive data optimization, performance tuning, and integrity checks.
• Manage cloud-based SQL Server (Azure Managed Instance) environments.
• Lead the migration of data and systems from SQL Server to Snowflake (an illustrative sketch follows this list).
• Work with data lake environments and support data ingestion and transformation processes.
• Collaborate with cross-functional teams including data analysts, developers, and business stakeholders.
• Provide technical leadership, code reviews, and mentorship to junior team members.
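To make the SQL Server to Snowflake migration item above concrete, here is a rough, hypothetical sketch that copies a single table via pandas. It assumes the pyodbc and snowflake-connector-python packages (a recent version, for write_pandas with auto_create_table) and placeholder connection details and table names; a real migration would normally stage files and use bulk COPY rather than per-table DataFrame loads.

    # Illustrative single-table copy from Azure SQL Managed Instance to Snowflake.
    # All connection details and object names are placeholders, not a real environment.
    import pandas as pd
    import pyodbc
    import snowflake.connector
    from snowflake.connector.pandas_tools import write_pandas

    # Read the source table from SQL Server (Azure Managed Instance).
    mssql = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=example-mi.database.windows.net;DATABASE=SalesDB;UID=etl_user;PWD=***"
    )
    df = pd.read_sql("SELECT * FROM dbo.Orders", mssql)

    # Load the DataFrame into Snowflake in one batch.
    sf = snowflake.connector.connect(
        account="example_account", user="etl_user", password="***",
        warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
    )
    success, _, nrows, _ = write_pandas(sf, df, table_name="ORDERS", auto_create_table=True)
    print(f"Copied {nrows} rows, success={success}")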

QUALIFICATIONS

• Strong SQL expertise including stored procedures, views, and performance tuning.
• Experience in Azure cloud, particularly with SQL Server Managed Instances.
• Hands-on experience with Snowflake and cloud-based data migration.
• Familiarity with data lake architectures and concepts.
• Experience with SSIS and/or Crystal Reports is a plus.
• Strong problem-solving skills and attention to detail.
• Excellent communication and leadership skills.


Data Engineer

₱1,200,000 - ₱2,400,000 yearly · IBM

Posted today


Job Description

Introduction
A career in IBM Consulting is rooted by long-term relationships and close collaboration with clients across the globe.

You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio; including Software and Red Hat.

Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in ground breaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.

Your Role And Responsibilities
As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating source-to-target pipelines/workflows and implementing solutions that address the client's needs.

Your Primary Responsibilities Include

  • Strategic Data Model Design and ETL Optimization: Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
  • Robust Data Infrastructure Management: Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization.
  • Seamless Data Accessibility and Security Coordination: Coordinate data access and security so that data scientists and analysts can easily access data whenever they need to.

Required Technical And Professional Expertise

  • Experience in ETL, Snowflake, and/or dbt.
  • Experience in SQL, Unix scripting, and/or shell scripting.
  • Experience with databases such as DB2 and Netezza.
  • Experience with scheduling tools such as Control-M.

Preferred Technical And Professional Experience

  • Technical development experience
  • Demonstrated client interaction and excellent communication skills, both written and verbal
  • Amenable to working on a client-dictated schedule (day, mid, and night shifts) and location

Data Engineer

Mandaluyong, National Capital Region · ₱1,500,000 - ₱3,000,000 yearly · Maya

Posted today


Job Description

The Data Engineer is responsible for the creation, maintenance, and continuous improvement of data pipelines. Part of his/her responsibilities is to implement best practices in data management (i.e., cleaning, validation, and transformation of data) and to turn data into usable datasets that can easily be consumed by other teams. This role will also work closely with software engineers, data analysts, data scientists, and data governance to understand how the data behaves in its respective domain, to clarify business and technical requirements on different data use cases, and to design and create efficient and reliable data pipelines. Within Data Engineering, this person will learn and adopt best practices on data management, data architecture design, and DataOps principles. Whether in Central DE or Distributed DE, a Data Engineer is crucial in creating value for downstream teams that use data.

About the Role

Key Responsibilities:

  • Develop, maintain, and optimize data pipelines, data models, and data management solutions across data warehouses, data/delta lakes, or lakehouse environments.
  • Collaborate with upstream teams (e.g., Mesh Teams) to integrate data sources and with downstream teams to ensure data usability and accessibility.
  • Understand and adhere to existing technology standards and Data Engineering (DE) best practices.

Responsibilities

Central DE:

  • Maintain and enhance the overall data architecture, ensuring scalability, high availability, and timely data ingestion.
  • Build and optimize data pipelines for new data sources, applying DataOps principles to ensure seamless operations and minimal disruptions.

Distributed DE:

  • Acquire and maintain deep domain knowledge of assigned data areas to inform data modeling and pipeline development.
  • Design and develop data models for Zone 2 (silver layer) and Zone 3 (gold layer), ensuring business datasets are accurate, reliable, and ready for downstream consumption.

Qualifications

  • Good working knowledge of shell scripting (e.g., bash, zsh)
  • Good working knowledge of data manipulation (SQL statements, JSON, NoSQL queries, etc.)
  • Good working knowledge of AWS services (EC2, S3, Glue Crawlers, Jobs, Batch, Athena, Lambda, etc.) or equivalent cloud offerings is a big plus
  • Good working knowledge of Apache Spark using SQL/Python (a minimal illustrative sketch follows this list)
  • Good understanding of the concepts of data warehouse, data lake/delta lake, and/or lakehouse
  • Ability to work with other Leads to foster a culture of collaboration and teamwork
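As a small, generic illustration of the Apache Spark (SQL/Python) item above, the sketch below reads raw JSON, cleans it into a silver-layer table, queries it with Spark SQL, and writes Parquet. The S3 paths and column names are invented for the example.

    # Minimal PySpark sketch: raw JSON -> cleaned Parquet (illustrative paths and columns).
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("example-silver-build").getOrCreate()

    # Zone 1 (raw) -> Zone 2 (silver): deduplicate and standardize types.
    raw = spark.read.json("s3://example-bucket/raw/transactions/")
    silver = (
        raw.dropDuplicates(["transaction_id"])
           .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
           .withColumn("event_date", F.to_date("event_ts"))
    )

    # Expose the result to Spark SQL consumers and persist it as partitioned Parquet.
    silver.createOrReplaceTempView("transactions_silver")
    spark.sql(
        "SELECT event_date, SUM(amount) AS total FROM transactions_silver GROUP BY event_date"
    ).show()
    silver.write.mode("overwrite").partitionBy("event_date").parquet(
        "s3://example-bucket/silver/transactions/"
    )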

Required Skills

Central DE:

  • Good knowledge of Linux/Unix administration
  • CI/CD experience using Terraform is a big plus

Distributed DE:

  • Good working knowledge of data modeling

Data Engineer

Taguig, National Capital Region · ₱1,200,000 - ₱2,400,000 yearly · Nasdaq

Posted today


Job Description

Why Nasdaq
When you work at Nasdaq, you're working for more open and transparent markets so that more people can access opportunities. Connections can be made, jobs can be created, and communities can thrive. We want all our employees to have access to opportunity, too. That means planning for career growth, ensuring you have the tools you need, and promoting an inclusive culture where we're all valued for our unique perspective.

Here, you will work for a global tech leader committed to breaking down barriers to inclusive prosperity. We see technology as a means to free people up to work together more productively and effectively by centralizing data, analytics, and market intelligence.

Here, we're committed to building a more diverse and inclusive workforce. Not only is it our responsibility to do better, but we also need representative voices to power the fresh thinking that is vital for our business and our clients.

What We Offer
This is a permanent full-time role based in Bonifacio Global City, Taguig following a hybrid work model setup (at least 2 in office days per week).

You can expect an autonomous but fast-paced work environment where you are recognized for your results and ability to drive things forward. Every day brings many opportunities to learn & grow and rewards with a global impact we create.

In return, you will receive HMO coverage for you and your dependents, employee stock purchase plan, equity grant, retirement plan, annual bonus, free counseling sessions, subscription to e-learning platforms, fitness, wellness and more.

What You Will Do
The Data Operations and Engineering (DOPE) division of Nasdaq Data Link is responsible for the datasets that our customers buy and consume every day. We are seeking a Data Operations Analyst to join our growing team.

As part of the Data Operations Team, you will contribute by monitoring and maintaining hundreds of datasets on Nasdaq Data Link. This includes monitoring various tools to ensure dataset timeliness and accuracy, communicating with stakeholders, prioritizing and resolving data issues, and handling various maintenance tasks related to data products. Specifically, you will:

  • Continuously monitor a shared email inbox, various Slack channels, and PagerDuty to respond to alerts related to data delays (a minimal freshness-check sketch follows this list).
  • Troubleshoot alerts and determine the appropriate course of action.
  • Communicate with partners, Customer Success and other stakeholders to keep them aware of ongoing issues.
  • Assist the various support teams to resolve outstanding queries from customers and partners.
  • Develop your programming skills in Python, Spark and SQL.
  • Learn to use new tools such as Databricks and Monte Carlo.
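To make the dataset-timeliness monitoring above concrete, here is a small, hypothetical freshness check in Python; the dataset inventory, expected update intervals, and the print-based alert are assumptions for illustration, not Nasdaq's actual tooling.

    # Hypothetical freshness check: flag datasets whose latest update is overdue.
    from datetime import datetime, timedelta, timezone

    # Assumed inventory: dataset code -> (last observed update, expected update interval).
    DATASETS = {
        "EOD_PRICES": (datetime(2024, 5, 1, 22, 0, tzinfo=timezone.utc), timedelta(hours=24)),
        "FX_RATES": (datetime(2024, 5, 2, 6, 0, tzinfo=timezone.utc), timedelta(hours=6)),
    }

    def overdue(now: datetime) -> list:
        # A dataset is overdue when the time since its last update exceeds its expected interval.
        return [
            code
            for code, (last_update, interval) in DATASETS.items()
            if now - last_update > interval
        ]

    if __name__ == "__main__":
        for code in overdue(datetime.now(timezone.utc)):
            # In practice this is where a PagerDuty or Slack alert would be raised.
            print(f"ALERT: dataset {code} has missed its expected update window")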

What We Expect

  • Have 1-3 years of professional experience in understanding data products and data management. Recent graduates are welcome to apply.
  • Amenable to work on a rotational shift schedule and accommodate weekend work as the team operates 24/7 depending on business needs.
  • Have a Bachelor of Computer Science/Engineering or equivalent degree or have equivalent qualifications/experience.
  • Able to communicate professionally with internal and external stakeholders.
  • Be eager to learn new technologies and best practices in data engineering by working with a world-class data team.
  • Have a basic knowledge of Python/SQL/Git and be willing to further develop these skills
  • Be eager to learn about parallel processing frameworks, orchestration/scheduling tools, and distributed data systems such as Databricks and Airflow.
  • Be entrepreneurial and enthusiastic about working in a fast-paced environment.

What Would Be Helpful

  • Familiarity with Databricks and Airflow

Does It Sound Like You?
Please follow through by clicking the "Apply" link and submitting your application. If your skills and experience are a match, we will be in touch soon. In the meantime, please visit our website and social media channels to learn more about our innovative business and inclusive culture, and discover why Nasdaq Manila is Great Place To Work certified.

Come as You Are
Nasdaq is an equal opportunity employer. We positively encourage applications from suitably qualified and eligible candidates regardless of age, color, disability, national origin, ancestry, race, religion, gender, sexual orientation, gender identity and/or expression, veteran status, genetic information, or any other status protected by applicable law.
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.


Data Engineer

Makati City, National Capital Region · ₱900,000 - ₱1,200,000 yearly · Nezda Technologies Inc

Posted today


Job Description

Key Responsibilities:
  • Design and build robust, scalable, and efficient data pipelines using Spark (PySpark, Scala, SQL, etc.).
  • Manage and optimize ETL/ELT processes, including data ingestion, transformation, and integration from multiple sources into data lakes and warehouses.
  • Utilize Azure services such as Azure Data Factory, Azure Databricks, Azure Synapse, Azure Data Lake, and Azure SQL Database for data engineering tasks.
  • Ensure data quality, reliability, security, and performance across all pipelines and systems.
  • Collaborate with data analysts, BI developers, and stakeholders to understand data needs and deliver accurate solutions.
  • Perform data profiling, validation, and cleansing to maintain integrity and consistency (a minimal validation sketch follows this list).
  • Monitor, troubleshoot, and optimize existing pipelines and workflows.
  • Document processes, data flows, and technical solutions.
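As a minimal illustration of the data profiling and validation duty above, the sketch below runs a few generic checks with pandas. The column names and rules (unique order_id, non-negative amount) are invented for the example; real checks would come from the business rules of the pipeline at hand.

    # Illustrative data-quality checks on a pandas DataFrame (invented columns and rules).
    import pandas as pd

    def profile_and_validate(df: pd.DataFrame) -> dict:
        issues = {}
        # Completeness: columns that contain null values.
        null_counts = df.isna().sum()
        issues["nulls"] = null_counts[null_counts > 0].to_dict()
        # Uniqueness: duplicate business keys (assumed key column "order_id").
        if "order_id" in df.columns:
            issues["duplicate_order_ids"] = int(df["order_id"].duplicated().sum())
        # Validity: negative amounts should not occur (assumed rule).
        if "amount" in df.columns:
            issues["negative_amounts"] = int((df["amount"] < 0).sum())
        return issues

    if __name__ == "__main__":
        sample = pd.DataFrame({"order_id": [1, 2, 2, 4], "amount": [10.0, -5.0, 7.5, None]})
        print(profile_and_validate(sample))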


Qualifications & Skills:
  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • 3+ years of experience in data engineering or a similar role.
  • Strong hands-on experience with Azure Data Factory, Databricks, Synapse, Data Lake, and SQL Database.
  • Proficiency in ETL/ELT design and implementation.
  • Expertise in Spark (PySpark, Scala, SQL) for large-scale data processing.
  • Strong knowledge of SQL (querying, optimization, stored procedures).
  • Familiarity with Python for scripting and automation.
  • Understanding of data modeling, warehousing, and schema design.
  • Knowledge of data governance, security, and compliance best practices.
  • Strong analytical, problem-solving, and collaboration skills.

Additional Requirements:

  • Open to project-based employment (12 months, with the possibility of absorption depending on performance)

Data Engineer

₱1,200,000 - ₱2,400,000 yearly · Optum

Posted today


Job Description

Job Responsibilities:

  • Collaborate with business teams to understand requirements and deliver effective reporting and data solutions.
  • Design, develop, test, and maintain software applications and data pipelines.
  • Analyze and resolve issues across systems, projects, and processes.
  • Influence leadership by presenting innovative ideas and technical solutions.
  • Create and maintain technical documentation and user guides.
  • Ensure data integrity, security, and compliance across systems.
  • Support CI/CD pipelines and DevOps practices for efficient deployments.
  • Participate in Agile development processes and project planning.

Job Qualifications:

Must Have:

  • 8+ years of experience working with tools like MS SQL Server and ETL (SSIS)

Nice to Have:

  • Experience (2+ months) with Snowflake or Databricks
  • Expertise in creating/debugging stored procedures, handling transactions, and error management (a minimal transaction-handling sketch follows this list)
  • 2+ years of hands-on experience in .NET
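As a generic illustration of the stored-procedure, transaction, and error-management item above, the sketch below wraps a procedure call in an explicit transaction with pyodbc. The connection string, procedure name, and parameter are placeholders, not an actual Optum system.

    # Illustrative transaction handling around a stored-procedure call (placeholder names).
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=example-server;DATABASE=ExampleDB;"
        "UID=etl_user;PWD=***",
        autocommit=False,  # manage the transaction explicitly
    )
    cursor = conn.cursor()
    try:
        # Assumed stored procedure that loads a batch of staged rows.
        cursor.execute("EXEC dbo.usp_LoadStagedClaims @BatchId = ?", 42)
        conn.commit()      # persist the work only if the procedure succeeded
    except pyodbc.Error as exc:
        conn.rollback()    # undo partial work on any database error
        print(f"Load failed and was rolled back: {exc}")
    finally:
        cursor.close()
        conn.close()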

What We Offer:

  • Competitive Total Rewards Package
  • Retirement Plan
  • HMO Coverage from Day 1
  • Dental, Medical, and Optical Reimbursements
  • Life and Disability Insurance
  • Paid Time-Off Benefits
  • Sick Leave Conversion
  • Tuition Fee Reimbursement
  • Employee Assistance Program (EAP)
  • Annual Performance-Based Merit Increases
  • Employee Recognition Programs
  • Training and Staff Development Opportunities
  • Employee Referral Program
  • Volunteerism Opportunities
  • All Mandatory Statutory Benefits

Who We Are:

· Optum is the health care technology and innovation arm of UnitedHealth Group, working alongside UnitedHealthcare.

· As part of a Fortune 5 enterprise, we are committed to helping people live healthier lives and making the health system work better for everyone.

· With operations across North America, South America, Europe, Asia Pacific, and the Middle East—including over 25,000 employees in the Philippines—we are a global leader in health solutions and care delivery.

Join us and be part of a team that's Caring. Connecting. Growing together.

