Data Platform Engineer



Rich Data Co | Sydney, New South Wales, Australia | Full time

About RDC

Rich Data Co (RDC): Delivering the Future of Credit, Today

We believe credit should be accessible, fair, inclusive, and sustainable. We are passionate about AI and about developing new techniques that leverage traditional and non-traditional data to reach the right decision in a clear and explainable way. Leading global financial institutions use RDC's AI decisioning platform to offer credit in a way that aligns with customers' needs and expectations. RDC uses explainable AI to give banks deeper insight into borrower behaviour, enabling more accurate and efficient lending decisions for businesses.

Purpose of Role

The Data Platform Engineer is responsible for developing, operating, and supporting scalable and reliable data pipelines, dashboards, and reporting tools that enable data-driven decision-making across the organisation. This hybrid role combines core data engineering skills (such as pipeline development with Apache Airflow, data transformation in Python, and query optimisation in SQL) with operational responsibilities (such as troubleshooting, validation, and ensuring the uptime and integrity of production data workflows).

You will work across cloud-based platforms, relational and NoSQL databases, and modern visualisation tools to ensure high data quality, availability, and performance. As part of a collaborative, fast-moving data team, you will support both project-based data initiatives and the day-to-day stability of the data platform, ensuring that business users have access to timely, accurate, and actionable insights.

This is a hands-on role requiring strong technical skills and a proactive approach to both building and supporting the data infrastructure that powers RDC's analytics and operational reporting environment.

Accountability & Outcomes

  1. Design, develop, and maintain ETL pipelines using Apache Airflow, ensuring scalable and efficient data processing.
  2. Build and troubleshoot Python-based data processing scripts for transformation, ingestion, and automation.
  3. Support the daily operation of data systems, including pipeline health monitoring, incident response, and root cause analysis.
  4. Collaborate with internal teams to define data requirements, integrate multiple sources, and ensure end-to-end data accuracy.
  5. Perform data validation, profiling, and QA checks across pipelines and environments.
  6. Develop and maintain dashboards and reports using JavaScript, SQL, and BI tools such as Power BI or Tableau.
  7. Document data workflows, standards, and pipeline logic for operational consistency.
  8. Contribute to the continuous improvement of reliability, observability, and performance in data systems.
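To make the transformation and validation work in items 2 and 5 concrete, here is a minimal, library-free sketch of the kind of ingestion step this role owns. The schema (`id`, `amount`) and the rejection rule are illustrative assumptions, not RDC's actual pipeline logic:

```python
from datetime import datetime, timezone

def transform(records):
    """Normalise raw ingestion records: drop rows missing a primary key,
    coerce amounts to float, and stamp a load time (hypothetical schema)."""
    clean = []
    for row in records:
        if not row.get("id"):
            continue  # validation: reject rows without a key
        clean.append({
            "id": row["id"],
            "amount": float(row.get("amount", 0)),
            "loaded_at": datetime.now(timezone.utc).isoformat(),
        })
    return clean

def qa_check(rows, required=("id", "amount")):
    """Completeness check of the kind run between pipeline stages."""
    return all(all(r.get(k) is not None for k in required) for r in rows)

raw = [{"id": "a1", "amount": "10.5"}, {"amount": "3"}]  # second row is invalid
out = transform(raw)
```

In an Airflow deployment, `transform` and `qa_check` would typically run as separate tasks so that a failed QA check halts downstream loads.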

Capabilities

Experience

Essential

  1. 5+ years of hands-on experience in data engineering, data operations, or data platform roles.
  2. Proven experience with Apache Airflow for building, scheduling, and monitoring ETL pipelines.
  3. Strong Python programming skills for data processing, automation, and pipeline support.
  4. Experience supporting and debugging production-grade data pipelines in cloud environments (preferably AWS).
  5. Solid SQL skills with the ability to write and optimise complex queries for reporting and validation.
  6. Familiarity with monitoring tools and logs (e.g., CloudWatch, log parsers, pipeline alerting frameworks).
  7. Experience working with RDBMS (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., DynamoDB).
  8. Demonstrated ability to respond to operational issues, analyse data quality anomalies, and implement remediation.
  9. Experience with agile project delivery, managing both sprint-based tasks and day-to-day support incidents.

Desirable

  1. Experience in cloud environments such as AWS, GCP, or Azure.
  2. Exposure to DevOps practices, including CI/CD pipelines for deploying data workflows.
  3. Experience supporting dashboard reliability and data refresh monitoring.
  4. Familiarity with data warehouse solutions and dimensional modelling concepts.
  5. Background supporting data governance, audit logs, and regulatory compliance for operational data.

Knowledge and Skill

Essential

  1. A deep understanding of ETL/ELT processes, including pipeline dependencies, failure points, and logging.
  2. Strong proficiency in SQL and Python, including pandas and, where applicable, PySpark.
  3. Familiarity with JavaScript and frontend dashboard development frameworks (e.g., React or Angular).
  4. Understanding of data quality checks, schema evolution, and operational alerting.
  5. Experience using BI/reporting tools such as Power BI, Tableau, or similar for visualisation.
  6. Comfortable working with APIs and integrating real-time or batch data feeds.
  7. Ability to document data architecture, dependencies, and support runbooks clearly.
  8. Excellent communication and collaboration skills, especially across engineering, product, and analytics teams.
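The operational alerting and data-freshness skills listed above can be sketched as a simple SLA rule. The 60-minute threshold and field names are illustrative assumptions, not RDC's actual monitoring configuration:

```python
from datetime import datetime, timedelta, timezone

def freshness_alert(last_loaded_at, max_lag_minutes=60, now=None):
    """Return True when a table's last successful load is older than the
    SLA window -- the kind of operational alerting rule described above.
    In practice this would feed a CloudWatch alarm or pipeline alert."""
    now = now or datetime.now(timezone.utc)
    return (now - last_loaded_at) > timedelta(minutes=max_lag_minutes)

# Illustrative check against a fixed reference time
now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
stale = freshness_alert(datetime(2024, 1, 1, 10, 0, tzinfo=timezone.utc), 60, now)
fresh = freshness_alert(datetime(2024, 1, 1, 11, 30, tzinfo=timezone.utc), 60, now)
```

The same pattern extends naturally to completeness (row counts vs. an expected baseline) and latency (load duration vs. a budget).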

Desirable

  1. Understanding of data platform observability (e.g., data freshness, completeness, latency).
  2. Experience with metadata management, data lineage tools, or documentation platforms.
  3. Familiarity with infrastructure-as-code for data services (e.g., Terraform for Airflow deployment).
  4. Awareness of data privacy frameworks.
  5. Ability to contribute to the architecture of a scalable, reliable data platform in a fast-paced environment.

Join the Future of Credit

  • Work at a 5-Star Employer of Choice 2023 - RDC was named one of HRD Australia's "best companies to work for in Australia".
  • Join a fast-growing global AI company - Grow your skills, capabilities and gain AI and global experience.
  • High performance team - Work alongside some of the best product teams, data scientists and credit experts in the industry.
  • Vibrant team culture - Join an innovative and agile team who celebrates wins and solves problems together.
  • Work-life balance - Highly flexible working arrangements; work in the way that's right for you.
  • Financial inclusion - Be part of a company that is striving for global financial inclusion and driving the future of credit.
