Data Engineer
Rich Data Co
Sydney, Australia
Full-time
Hybrid and flexible working arrangements
About RDC
Rich Data Co (RDC): Delivering the Future of Credit, Today! We believe credit should be accessible, fair, inclusive, and sustainable. We are passionate about AI and about developing new techniques that leverage traditional and non-traditional data to reach the right decision in a clear and explainable way. Leading global financial institutions use RDC's AI decisioning platform to offer credit in a way that aligns with customers' needs and expectations. RDC uses explainable AI to give banks deeper insight into borrower behaviour, enabling more accurate and efficient lending decisions for businesses.
Join the Future of Credit!
About the Role
As a Data Engineer at RDC, you will be responsible for building and maintaining data pipelines, managing data integration, and developing dashboards and reports that deliver critical business insights. The role blends data engineering expertise with report development, ensuring efficient data flow and visualization. Drawing on 2+ years of experience with Airflow, Python, and JavaScript, you will:
Design, develop, and maintain ETL pipelines using Apache Airflow to ensure efficient and scalable data processing (see the sketch after this list).
Use strong Python programming skills to write clean, maintainable code for data transformation and automation.
Build dynamic and interactive dashboards using JavaScript and modern frameworks like React or Angular, ensuring an intuitive user experience.
Manage data storage, retrieval, and optimization in both RDBMS (e.g., MySQL, PostgreSQL) and NoSQL (e.g., DynamoDB) databases.
Collaborate with cross-functional teams to gather requirements, integrating data from various sources to create comprehensive reports and dashboards.
Ensure scalability, performance, and security in data pipelines and visualizations while working in cloud-based environments.
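To give a concrete flavour of the Airflow work described above, the sketch below shows a minimal daily extract-transform-load DAG. It is purely illustrative: the DAG id, task names, and data are hypothetical placeholders, not RDC's actual pipelines or schemas.

# Minimal, illustrative Airflow DAG. All names and data are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # Hypothetical extract step; in practice this might read from an
    # RDBMS (e.g. PostgreSQL) or a NoSQL store (e.g. DynamoDB).
    return [{"customer_id": 1, "balance": 1500.0}]

def transform(ti, **context):
    # Pull the upstream result via XCom and apply a simple transformation.
    rows = ti.xcom_pull(task_ids="extract")
    return [{**row, "balance_rounded": round(row["balance"], 2)} for row in rows]

def load(ti, **context):
    # Placeholder load step; a real pipeline would write to the warehouse
    # tables that feed the dashboards and reports.
    rows = ti.xcom_pull(task_ids="transform")
    print(f"loaded {len(rows)} rows")

with DAG(
    dag_id="example_reporting_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 1, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task

The >> operators declare task ordering, while the retry settings and daily schedule mean transient failures recover without manual intervention.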
Here's what a typical day might look like:
Team Stand-Up:
Start your day by participating in an agile stand-up, collaborating with product, sales, and customer support teams to discuss ongoing data engineering and reporting projects.
Design & Build:
Develop ETL pipelines in Apache Airflow and design real-time, interactive dashboards in JavaScript and Python to meet business needs. Create data integration solutions that ensure timely and accurate reporting.
Data Integration:
Work closely with internal teams to integrate data seamlessly from various sources, optimizing pipelines so data flows correctly into your dashboards and reports.
Collaboration & Refinement:
Collaborate with stakeholders to gather data requirements and refine report outputs based on feedback. Participate in agile ceremonies, helping clarify user stories and project timelines.
Learning & Continuous Improvement:
Stay updated on the latest trends in data engineering and dashboard development, focusing on optimizing performance and enhancing the user experience.
Your Skills & Experience
You must have:
2+ years of proven experience in Apache Airflow for building and managing data pipelines.
Strong proficiency in Python, with experience in writing efficient and scalable data processing code.
Strong JavaScript skills for dashboard development.
Experience with both RDBMS (e.g., MySQL, PostgreSQL) and NoSQL (e.g., DynamoDB) databases.
Experience with data pipelines or ETL processes, ensuring data is available for dashboards and reports.
Strong SQL skills, with the ability to write complex queries and optimize them for performance (see the example after this list).
Experience with data visualization tools such as Power BI, Tableau, or similar platforms.
Excellent communication skills to collaborate with cross-functional teams and translate business requirements into technical designs.
Strong problem-solving skills, with the ability to troubleshoot technical issues and optimize data pipelines and dashboards.
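As a rough indication of the SQL level implied above, the snippet below runs a ranked aggregation of the kind that often backs a dashboard tile. The table and column names are invented for illustration, and SQLite is used only so the sketch runs end to end; production queries would target MySQL or PostgreSQL.

# Illustrative only: the schema is hypothetical, not an RDC table.
import sqlite3

QUERY = """
SELECT customer_id,
       SUM(amount) AS total_spend,
       RANK() OVER (ORDER BY SUM(amount) DESC) AS spend_rank
FROM transactions
WHERE txn_date >= DATE('now', '-30 day')
GROUP BY customer_id
ORDER BY spend_rank
LIMIT 10;
"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (customer_id INT, amount REAL, txn_date TEXT)")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?, DATE('now'))",
    [(1, 120.0), (1, 80.0), (2, 45.5)],
)
for row in conn.execute(QUERY):
    print(row)  # e.g. (1, 200.0, 1)

Indexing the filter and grouping columns (here txn_date and customer_id) is the usual first step when optimizing a query like this for performance.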
You ideally have:
Experience working in cloud-based environments such as AWS, Azure, or GCP.
Familiarity with API integration for real-time data fetching and interaction.
Experience with agile development methodologies, working within fast-paced, iterative environments.
Knowledge of DevOps practices and CI/CD pipelines.
What's Next?
If this sounds like you, get in touch or apply now. Our Head of Talent and Capability would love to speak with you! Get ready for the #futureofcredit
Please note candidates must have Australian working rights.