45 Big Data jobs in Australia

Data Engineer

Sydney, New South Wales | Cognizant

Posted 1 day ago


Job Description

What makes Cognizant a unique place to work? The combination of rapid growth and an international and innovative environment! This is creating many opportunities for people like YOU - people with an entrepreneurial spirit who want to make a difference in this world.
At Cognizant, together with your colleagues from all around the world, you will collaborate on creating solutions for the world's leading companies and help them become more flexible, more innovative, and successful. Moreover, this is your chance to be part of the success story.
**Position Summary**
We are seeking an experienced Data Engineer with a strong focus on ELT (Extract, Load, Transform) and ETL processes. You will be skilled in designing and optimizing complex queries, stored procedures, and data models in SQL Server, and responsible for building and managing scalable data pipelines with Azure Data Factory and automating deployments through Azure DevOps. You should be well-versed in implementing Data Vault 2.0 architecture, including Hubs, Links, and Satellites, to support scalable and auditable enterprise data solutions, and bring a solid understanding of data integrity, performance tuning, and maintainable data architecture. You will collaborate effectively with cross-functional teams to deliver clean, governed, and business-ready data products.
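To make the Data Vault 2.0 terminology concrete, here is a minimal Python sketch of the hash-key convention commonly used when loading a Hub. The table and column names (hub_customer_hk, customer_id) are hypothetical, and MD5 over a normalised business key is one common convention, not necessarily the one this team uses.

```python
# Sketch only: deriving a Data Vault 2.0 Hub row from a business key.
# Names are hypothetical; MD5 over a normalised key is a common DV 2.0 convention.
import hashlib
from datetime import datetime, timezone

def hash_key(*business_keys: str) -> str:
    """Deterministic hash key from one or more business key parts."""
    normalised = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

def build_hub_row(customer_id: str, record_source: str) -> dict:
    """Assemble a Hub row: hash key, business key, load date, record source."""
    return {
        "hub_customer_hk": hash_key(customer_id),
        "customer_id": customer_id,
        "load_date": datetime.now(timezone.utc),
        "record_source": record_source,
    }

print(build_hub_row("CUST-001", "CRM"))
```

The same hash function is then typically reused for Link keys (hashing the combined business keys of the Hubs a Link joins) and for Satellite hash diffs.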
**Mandatory Skills**
+ Proficiency in SQL Server including advanced T-SQL, stored procedures, query optimization, and indexing.
+ Experience in building and managing ETL/ELT pipelines using Azure Data Factory.
+ Hands-on experience with Azure DevOps for CI/CD automation and release management.
+ Practical knowledge of Data Vault 2.0 including implementation of Hubs, Links, and Satellites.
+ Understanding of data modeling and data warehousing concepts.
+ Strong skills in performance tuning and troubleshooting SQL workloads.
+ Familiarity with version control systems such as Git.
+ Awareness of data governance, data quality, and metadata management practices.
**Roles And Responsibilities**
+ Design, develop, and maintain scalable ETL/ELT data pipelines using Azure Data Factory.
+ Write efficient T-SQL scripts and stored procedures for data transformation and loading in SQL Server (see the sketch after this list).
+ Implement Data Vault 2.0 models including Hubs, Links, and Satellites to support scalable data warehousing.
+ Collaborate with architects, analysts, and business users to understand data requirements and deliver solutions.
+ Manage source-to-target mappings and ensure accurate data lineage and documentation.
+ Set up CI/CD pipelines for data projects using Azure DevOps and Git.
+ Monitor and optimize pipeline performance and resource utilization.
+ Troubleshoot data issues, resolve pipeline failures, and ensure data accuracy and completeness.
+ Participate in code reviews and enforce data engineering best practices.
+ Ensure compliance with data governance, security, and privacy standards.
+ Support production deployments and incident management for data workflows.
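As a rough illustration of the watermark-driven incremental loads implied above, here is a minimal Python sketch using pyodbc to call a SQL Server stored procedure. The connection string, schema, and procedure names (etl.watermarks, etl.load_orders) are hypothetical.

```python
# Sketch only: a watermark-driven incremental load against SQL Server.
# Server, database, table, and procedure names are hypothetical.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=dw;"
    "Trusted_Connection=yes;"
)

def run_incremental_load() -> None:
    with pyodbc.connect(CONN_STR) as conn:
        cur = conn.cursor()
        # Read the last successful watermark, then load only newer rows.
        cur.execute(
            "SELECT last_loaded_at FROM etl.watermarks WHERE table_name = ?",
            "orders",
        )
        (watermark,) = cur.fetchone()
        cur.execute("{CALL etl.load_orders (?)}", watermark)
        conn.commit()

if __name__ == "__main__":
    run_incremental_load()
```

In Azure Data Factory the same pattern is usually expressed as a Lookup activity (fetch the watermark) feeding a parameterised Copy or Stored Procedure activity.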
**Qualifications/Certifications (Optional):**
+ Bachelor's degree in Computer Science, Information Technology, Data Engineering, or a related field.
+ 4+ years of experience in data engineering or ETL development.
+ Strong working knowledge of SQL Server and relational database design.
+ Experience with cloud-based data integration tools, especially Azure Data Factory.
+ Familiarity with DevOps practices and CI/CD pipelines in Azure environments.
+ Solid understanding of data warehousing principles and Data Vault 2.0 methodology.
+ Good problem-solving and analytical skills, with attention to data quality and performance.
+ Effective communication and teamwork skills, with experience in agile or scrum teams.
**Salary Range:** $85,000-$95,000
**Date of Posting:** 01-Jul-25
**Next Steps:** If you feel this opportunity suits you, or Cognizant is the type of organization you would like to join, we want to have a conversation with you! Please apply directly with us.
For a complete list of open opportunities, visit the Cognizant careers site. Cognizant is committed to providing Equal Employment Opportunities. Successful candidates will be required to undergo a background check.
Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.

Data Engineer

2000 Sydney, New South Wales | Wisr

Posted today


Job Description

This is a Data Engineer role with Wisr based in Sydney, NSW, AU
== Wisr ==

Role Seniority - mid level

More about the Data Engineer role at Wisr

About the job

This role will be supporting Wisr’s ongoing effort to modernise and evolve the data platform as we continue to scale and grow. As a key member of the Data Engineering team, you will bring strong data infrastructure and modelling capability, coupled with hands-on experience building data pipelines in a Lambda architecture.


You'll play a crucial role in supporting the development of our data models and pipelines, helping us to build a world-class data platform that serves the entire organisation.


By collaborating closely with various internal stakeholders including your fellow data engineers, Product Managers, Developers and cross-functional teams including Operations and Marketing, you will play a pivotal role in ensuring seamless data integration, processing, and analytics.

What You’ll Do:


  • Design and build scalable pipelines and data models that enable insight generation, experimentation and rapid iteration within the Data Squad


  • Own and evolve the data stack supporting product development (e.g. dbt, SQL, Python, orchestration, observability)


  • Ensure high standards of data quality, reliability, and performance, especially in experimental environments

  • Develop and maintain robust documentation, schema management, and lineage tracking to ensure transparency and traceability


  • Form a deep understanding of our data processes and help improve them by implementing best practices


  • Stay on top of any failures and issues in our data systems: troubleshooting, proposing improvements and implementing fixes


About You


With hands-on experience across modern data architectures, you bring deep expertise in building robust data pipelines, developing scalable data models, and managing data workflows end-to-end. You’re passionate about data and thrive on the challenge of ingesting, transforming, and integrating complex datasets from multiple sources. You take pride in seeing your work translate into real business outcomes.

You are confident working with technologies such as Python, SQL, dbt, and orchestration tools like Airflow. Experience with cloud platforms (AWS, GCP, or Azure), modern data warehouses (e.g. Snowflake, BigQuery, Redshift), and event-driven or streaming data systems (e.g. Kafka, Kinesis) is highly desirable.
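As a small illustration of the orchestration layer mentioned above, here is a minimal Airflow DAG sketch that runs dbt models and then tests them. The DAG id, schedule, and project path are hypothetical, and BashOperator is just one simple way to invoke dbt.

```python
# Sketch only: orchestrating a dbt build with Airflow.
# DAG id, schedule, and the dbt project path are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_build",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )
    dbt_run >> dbt_test  # tests run only after the models build successfully
```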

You’ll also have:

  • 3+ years in data engineering or full-stack data roles

  • Tertiary qualifications in Computer Science, Engineering, or a related technical field

  • Demonstrated ability to take ownership of projects, collaborate cross-functionally, and work independently in dynamic environments

Technical Skills

  • Good understanding of SQL and Python for data transformation, scripting, and orchestration

  • Experience with dbt, CI/CD for data, version control, and software engineering best practices

  • Good understanding of data modelling frameworks, including dimensional modelling, data mesh, data vault, and star/snowflake schemas

  • Good understanding of data orchestration / ETL tools such as Apache Airflow, Azure Data Factory, etc.

  • Experience with a major cloud data warehouse or lakehouse platform (e.g. Snowflake, Databricks, BigQuery)

  • Familiarity with Docker, Kubernetes (especially AKS), and deploying data services in containerised environments

  • A good grasp of data quality, governance, and observability principles

  • Experience enabling real-time analytics and streaming data architectures

Desirable / Nice to Have

  • Experience in or strong interest in the FinTech or financial services domain

  • Experience building ML / MLOps pipelines would be a plus

  • Certifications such as Azure Data Engineer Associate

  • Exposure to domain-driven data architecture or data mesh implementations

  • Experience with real-time data pipelines using tools like Kafka, Flink, or Azure Stream Analytics




Before we jump into the responsibilities of the role: no matter what you come in knowing, you'll be learning new things all the time, and the Wisr team will be there to support your growth.

Please consider applying even if you don't meet 100% of what's outlined.

Key Responsibilities
  • Designing and building data pipelines
  • Owning and evolving the data stack
  • Troubleshooting data systems


Key Strengths
  • Data pipeline development
  • Data modeling
  • Collaboration
  • Cloud platforms
  • Machine Learning pipelines
  • Real-time data processing


Why Wisr is partnering with Hatch on this role: Hatch exists to level the playing field for people as they discover a career that's right for them, so when you apply you have the chance to show more than just your resume.

A Final Note: This is a role with Wisr, not with Hatch.

Data Engineer

2000 Sydney, New South Wales | NSW Department of Customer Service

Posted today


Job Description

Employment type: Contract
This is a Data Engineer role with NSW Department of Customer Service based in Sydney, NSW, AU
== NSW Department of Customer Service ==

Role Seniority - mid level

More about the Data Engineer role at NSW Department of Customer Service

Data Engineer

SNSW Grade: 7/8 | Employment Type: Temporary full-time up to September 2027 | Location: Haymarket (McKell), Parramatta or Gosford; in-office presence required as per DCS directive | Salary range: $102,899 - $121,317 plus 11.5% super

We are seeking a talented Data Engineer to join our dynamic team at Service NSW. As a passionate data professional, you will play a pivotal role in shaping the future of government services.

About the team

The Operational Data & Insights team at Service NSW is at the forefront of leveraging data to enhance business and customer experience outcomes. Our team provides expert advice to leadership utilising statistical and information analysis. We partner with stakeholders to design, develop and maintain innovative data products and services to support frontline service delivery. By driving process improvements through analytical insights, we support the agency to deliver efficient, customer-centric services.

In your new role you will:

  • Collaborate with stakeholders to design, build, test and maintain complex data management systems to ensure they meet business requirements and user needs.

  • Work to deliver automated, reliable and secure services that provide valuable insights to inform decision making.

  • Integrate disparate data from various sources, such as operational databases, third-party systems, APIs, and cloud services, into a centralised repository (see the sketch after this list).

  • Create a single source of truth that enables consistent, reliable, and efficient access to data for analytics and decision-making.

  • Provide expert advice to stakeholders, recommending and implementing ways to improve data efficiency and reliability.
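As a rough sketch of the integration work described above, here is a minimal Python example that pulls a third-party API and lands the payload in S3, date-partitioned for incremental downstream loads. The API URL and bucket name are hypothetical.

```python
# Sketch only: ingesting a third-party API into a central S3 repository.
# The API endpoint and bucket name are hypothetical.
import json
from datetime import date

import boto3
import requests

def ingest_to_s3(run_date: date) -> None:
    resp = requests.get(
        "https://api.example.gov.au/v1/transactions",
        params={"date": run_date.isoformat()},
        timeout=30,
    )
    resp.raise_for_status()
    # Partition raw landings by date so downstream jobs can load incrementally.
    key = f"raw/transactions/dt={run_date.isoformat()}/payload.json"
    boto3.client("s3").put_object(
        Bucket="central-data-lake",
        Key=key,
        Body=json.dumps(resp.json()),
    )

if __name__ == "__main__":
    ingest_to_s3(date.today())
```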

To be successful in this role you will demonstrate:

  • You will work with cutting-edge AWS technologies to design, develop, and implement innovative data solutions that improve decision-making and enhance the customer experience.

  • You thrive in a fast-paced environment, are passionate about data, and have a strong analytical mindset.

  • You have relevant tertiary qualifications in Data Engineering, Computer Science or equivalent experience.

To Apply

Please submit your resume (4 pages max) and cover letter (2 pages max) outlining your suitability for the role.

Salary Grade 7/8, with the base salary for this role starting at $102,899 plus superannuation.

Click Here to access the Role Description. For enquiries relating to recruitment, please contact Jyostna Channamadhvuni.

Visit the Capability Application Tool to prepare for the recruitment process by accessing practice application and interview questions based on the focus capabilities listed in the role description.

Closing Date: Tuesday, 8th July 2025 at 9:59 am

Careers at Department of Customer Service

A career at the Department of Customer Service (DCS) gives you the opportunity to help improve government services and be part of reform that benefits people across NSW. We are focused on delivering excellent customer service, digital transformation, and regulatory reform. Come join us and influence the future of our great state.

Belong in our diverse and inclusive workplace

The strength of our workforce lies in its diversity and embracing difference, while the key to our success is leveraging the contributions of employees with different backgrounds and perspectives.

You can view our full diversity and inclusion statement here.

We want you to bring your best self to this application process. If you have any support or access needs that may require adjustments to allow you to fully participate in this selection process (including an alternate format of the application form), please contact us on 02 9494 8351.





Before we jump into the responsibilities of the role: no matter what you come in knowing, you'll be learning new things all the time, and the NSW Department of Customer Service team will be there to support your growth.

Please consider applying even if you don't meet 100% of what's outlined.

Key Responsibilities
  • Collaborating with stakeholders
  • Delivering automated services
  • Creating a single source of truth


Key Strengths
  • Data management systems
  • AWS technologies
  • Analytical mindset
  • Stakeholder engagement
  • Data integration
  • Process improvement


Why NSW Department of Customer Service is partnering with Hatch on this role: Hatch exists to level the playing field for people as they discover a career that's right for them, so when you apply you have the chance to show more than just your resume.

A Final Note: This is a role with NSW Department of Customer Service, not with Hatch.

Senior Data Engineer

Sendle

Posted today


Job Description

This is a Senior Data Engineer role with Sendle based in AU
== Sendle ==

Role Seniority - senior

More about the Senior Data Engineer role at Sendle

Sendle builds shipping that is good for the world. We help small businesses thrive by making parcel delivery simple, reliable, and affordable. We’re a B Corp and the first 100% carbon neutral delivery service in Australia, Canada, and the United States, where we harness major courier networks to create a delivery service that levels the playing field for small businesses.

We envision a world where small businesses can compete on a level playing field with the big guys. Sendle is a fast-growing business with bold ambitions and big dreams.

In the last few years, we've made huge strides towards our goal of becoming the largest SMB eCommerce courier in the world, moving from a single-country operation in Australia to successful launches and operations in the US and Canada. We've also launched major partnerships with Vestiaire Collective, eBay, Shopify, and Etsy!

But most importantly, we’re a bunch of good people doing good work. Wanna join us?

A bit about the role

We are looking for a Senior Data Engineer who is passionate about building scalable data systems that will enable our vision of data democratization to drive value for the business.

As a company, data is at the center of every critical business decision we make. With this role, you will work across many different areas of the business, learning about everything from marketing and sales to courier logistics and network performance. Additionally, you will have the opportunity to work directly with stakeholders, acting as a technical thought partner and collaborating to design and build solutions that address key business questions.

What you’ll do
  • Develop, deploy, and maintain data models to support the data needs of various teams across the company

  • Build data models with DBT, utilizing git for source control

  • Ingest data from different sources (via Fivetran, APIs, etc.) into Snowflake for use by the DBT models (see the sketch after this list)

  • Collaborate with the Data Engineering team to brainstorm, scope, and implement process improvements

  • Work with the entire Data and Analytics team to enhance data observability and monitoring

  • Act as a thought partner for stakeholders and peers across the company on ad hoc data requests and identify the best approach and design for our near-term and long-term growth objectives

  • Understand the tradeoffs between technical possibilities and stakeholder needs and strive for balanced solutions

  • Hold self and others accountable to meet commitments and act with a clear sense of ownership

  • Demonstrate persistence in the face of obstacles, resolve them effectively, and involve others as needed

  • Contribute to our data literacy efforts by improving the accessibility, discoverability, and interpretability of our data

  • Research industry trends and introduce new methodologies and processes to the team
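For the ingestion bullet above, here is a minimal sketch of landing staged files into Snowflake with the Python connector so downstream DBT models can pick them up. Account, stage, and table names are hypothetical, and in practice much of this data arrives via Fivetran rather than hand-written loaders.

```python
# Sketch only: loading staged files into Snowflake for DBT models to consume.
# Account, credentials, stage, and table names are hypothetical.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="loader",
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="LOAD_WH",
    database="RAW",
    schema="SHIPPING",
)
try:
    # COPY INTO is idempotent over a stage: files already loaded are skipped.
    conn.cursor().execute("""
        COPY INTO raw.shipping.parcel_events
        FROM @raw.shipping.parcel_stage
        FILE_FORMAT = (TYPE = 'JSON')
    """)
finally:
    conn.close()
```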

What you’ll need
  • Experience with data modeling, data warehousing, and building ETL pipelines (Dagster, DBT, and Snowflake experience a plus)

  • Advanced SQL knowledge

  • Experience with source control technologies such as git

  • Experience implementing scalable and sustainable data infrastructure (familiarity with AWS EC2, Terraform, CI/CD, etc.)

  • Experience with AI projects and exposure to leveraging LLMs/Machine Learning a plus

  • DevOps knowledge a plus

  • Strong communication skills and the ability to partner with business stakeholders to translate business requirements into technical solutions

  • Ability to effectively communicate technical approach with teammates and leaders

  • Ability to thrive in a remote environment through effective async communication and collaboration

  • Ability to manage multiple projects simultaneously

  • A can-do attitude and flexibility, readily taking on new opportunities and assisting others

  • The 5Hs (our core values) in your approach to work and to building partnerships with stakeholders and teammates

What we’re offering
  • The chance to work with a creative team in a supportive environment

  • A personal development budget

  • The freedom to create your own work environment, connecting to a remote team from anywhere in Australia

  • EAP access for you and your immediate family, because we care about your wellbeing

  • Options through participation in Sendle’s ESOP

What matters to us

We believe that our culture is one of our most important assets. We have 5 key values that we look for in every member of our team.

  • Humble - We put others first. We embrace and seek feedback from others.

  • Honest - We speak gently but frankly. We take ownership of our mistakes and speak the truth.

  • Happy - We enjoy the journey. We are optimistic and find opportunities in all things.

  • Hungry - We aspire to make a difference. We aim high, step out of our comfort zones, and tackle the hard problems.

  • High-Performing - We relentlessly deliver. We know the goal and work fearlessly towards it.

Legally, we need you to know this:

We are an equal-opportunity employer and value diversity. We do not discriminate on the basis of race, religion, colour, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

If you require accommodations due to a disability to participate in the application or interview process, please get in touch with our team to discuss your needs.

But it’s important to us that you know this:

We strongly believe that diversity of experience contributes to a broader collective perspective that will consistently lead to a better company and better outcomes. We are working hard to increase the diversity of our team wherever we can and we actively encourage everyone to consider becoming a part of it.

If you want to be a part of something remarkable then we’re excited to hear from you.

Interested in knowing more? Check us out on our Careers Page, Being Carbon Neutral and LinkedIn.





Before we jump into the responsibilities of the role: no matter what you come in knowing, you'll be learning new things all the time, and the Sendle team will be there to support your growth.

Please consider applying even if you don't meet 100% of what's outlined.

Key Responsibilities
  • Developing, deploying, and maintaining data models
  • Building data models with DBT
  • Ingesting data from different sources


Key Strengths
  • Data modeling and ETL pipelines
  • Advanced SQL knowledge
  • Source control technologies
  • Strong communication skills
  • Remote collaboration skills
  • Project management skills


Why Sendle is partnering with Hatch on this role: Hatch exists to level the playing field for people as they discover a career that's right for them, so when you apply you have the chance to show more than just your resume.

A Final Note: This is a role with Sendle, not with Hatch.

Senior Data Engineer

2060 Waverton, New South Wales | Shift

Posted today


Job Description

This is a Senior Data Engineer role with Shift based in North Sydney, NSW, AU
== Shift ==

Role Seniority - senior

More about the Senior Data Engineer role at Shift

Company Description

At Shift, we’re business specialists dedicated to helping Australian SMEs take control of their cashflow, streamline trade terms and choose the right financial products.

We believe Australian businesses are the driving force behind our economy and are core to our communities. That's why our business expertise, focus on relationships, and market-leading technology are at the core of everything we do.

Our unique approach to product innovation combined with our collaborative culture means you can build your career in a supportive environment. You’ll be joining a diverse team of over 250 people who are always looking to deliver better outcomes for Australian businesses.

Job Description

We're looking for a Senior Data Engineer who’s passionate about creating data solutions that power thoughtful decision-making. In this role, you’ll build and maintain modern data pipelines on Azure using tools like Databricks, Azure Data Factory, and Delta Lake, while contributing to a team that values collaboration, inclusion, and continuous improvement.

This is a great opportunity for someone who enjoys partnering with diverse teams, bringing ideas to life, and building scalable solutions that have real business impact.

What you’ll do:

  • Design, develop, and maintain secure, scalable ELT data pipelines using Azure tools (see the sketch after this list)

  • Ensure data is accurate, consistent, and ready for analysis through strong governance practices

  • Collaborate closely with analysts, engineers, and business stakeholders to understand goals and deliver trusted data products

  • Build data models that consolidate complex sources into actionable insights

  • Identify opportunities to improve performance, reliability, and cost-efficiency across systems

  • Support the development of others through knowledge sharing, mentoring, and inclusive design practices

  • Contribute to architecture, documentation, and automation to help the team scale sustainably

  • Use tools like Unity Catalog, Azure Functions, and DevOps pipelines to manage metadata, schema evolution, and deployment
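As a minimal sketch of the Delta Lake side of those pipelines, here is an upsert (merge) in PySpark, assuming a Databricks-style environment where a Delta-enabled SparkSession is available. The paths and join key are hypothetical.

```python
# Sketch only: an ELT upsert into Delta Lake with PySpark.
# Paths and column names are hypothetical; on Databricks a Delta-enabled
# `spark` session already exists.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

updates = spark.read.format("parquet").load("/mnt/landing/invoices/")

target = DeltaTable.forPath(spark, "/mnt/curated/invoices")
(
    target.alias("t")
    .merge(updates.alias("s"), "t.invoice_id = s.invoice_id")
    .whenMatchedUpdateAll()      # refresh invoices that changed
    .whenNotMatchedInsertAll()   # append invoices seen for the first time
    .execute()
)
```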

Qualifications

What you'll have:

  • 5+ years of experience in data engineering or a related field

  • Proficiency in Python and SQL, with a focus on clean, maintainable code

  • Experience with Azure Databricks, Delta Lake, Azure SQL, and Azure Data Lake Gen2

  • Familiarity with CI/CD practices and data orchestration using tools like Azure Functions and Delta Live Tables

  • A collaborative mindset and strong communication skills - able to partner across technical and non-technical teams

  • A passion for solving complex problems and improving data systems over time

  • Commitment to continuous learning and staying current with evolving technologies

  • Relevant tertiary qualifications or equivalent experience

Additional Information

Key benefits:

  • Collaborative teams – a flat structure means everyone can learn from colleagues and senior leaders around the business.

  • Be involved – come together with all of your colleagues every 100 days to share the product and technology roadmap and business strategy.

  • Flexible working environment – we're headquartered in North Sydney with state-based workplaces and offer a flexible work policy.

  • Family support – industry leading 26 weeks paid parental leave.

  • Varied workspaces – our office enables areas for collaboration, brainstorming and socialising as well as focus zones.

  • Range of benefits – supporting your physical, psychological and financial wellbeing.





Before we jump into the responsibilities of the role: no matter what you come in knowing, you'll be learning new things all the time, and the Shift team will be there to support your growth.

Please consider applying even if you don't meet 100% of what's outlined.

Key Responsibilities
  • Designing and developing data pipelines
  • Ensuring data accuracy
  • Collaborating with stakeholders


Key Strengths
  • Data engineering
  • Python and SQL
  • Azure tools
  • CI/CD practices
  • Collaboration
  • Continuous learning


Why Shift is partnering with Hatch on this role: Hatch exists to level the playing field for people as they discover a career that's right for them, so when you apply you have the chance to show more than just your resume.

A Final Note: This is a role with Shift, not with Hatch.

Senior Data Engineer

3000 Melbourne, Victoria | Prezzee

Posted today


Job Description

This is a Senior Data Engineer role with Prezzee based in Melbourne, VIC, AU
== Prezzee ==

Role Seniority - senior

More about the Senior Data Engineer role at Prezzee

Who We Are: At Prezzee, we’re more than just a digital gifting platform – we’re building human connections through unforgettable gifting moments. With over 1,000 brand partners globally, we’re transforming how people give and receive gifts. From our beginnings as a small Australian startup to becoming a global leader, we thrive on innovation, collaboration, and a deep commitment to making a real impact.

The Role: As a Senior Data Engineer at Prezzee, you’ll take the lead in designing, building, and maintaining the data pipelines that drive critical decision-making across the business. Reporting to the Head of Infrastructure, this hands-on role will give you the opportunity to work on a multinational data platform, drawing and integrating data from multiple regions around the world.

You’ll play a key role in shaping Prezzee’s data capabilities as we pursue bold strategic ambitions and continue to disrupt the global gift card and payments market. Your work will ensure our teams have access to trusted, timely data—empowering smarter decisions and enabling scalable growth at pace.

What You'll Be Doing:

  • Own the end-to-end development of scalable, sustainable data pipelines in AWS and Databricks—designing, building, testing, and maintaining systems that keep our data flowing seamlessly.

  • Optimise and fine-tune our data infrastructure to ensure performance, reliability, and efficiency at scale.

  • Manage and maintain the critical software systems that support our growing data platform.

  • Champion innovation by fostering a culture of continuous improvement, always looking for smarter, faster ways to work.

  • Collaborate cross-functionally with infrastructure and engineering teams to deliver, operate, and monitor reliable data solutions.

  • Partner with stakeholders to deeply understand business needs and translate them into robust, future-ready data pipelines and products.

  • Define and enhance technical standards, including clear, well-maintained documentation that supports consistency and knowledge sharing.

  • Promote and uphold best practices across all aspects of data engineering to ensure quality, scalability, and long-term success.

What We’re Looking For:

Core Skills:

  • Hands-on experience working with Terraform

  • Extensive experience building and maintaining data pipelines in AWS and Databricks (see the sketch after this list)

  • Exceptional problem solving and troubleshooting skills

  • A strong understanding of architectural concepts and cloud-based solution design

  • Demonstrable ability to motivate self and others

  • Experience working with technical and non-technical partners, with the capability to influence

  • Experience with Redshift, Postgres, DynamoDB, Looker and Workday Prism Analytics would be a bonus (but not essential)
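As a small sketch of gluing this stack together, here is one way to trigger a Databricks job from Python via the Jobs REST API. The workspace URL, token variable, and job ID are hypothetical; in a Terraform-managed setup the job itself would typically be defined as a databricks_job resource.

```python
# Sketch only: triggering a Databricks pipeline job via the Jobs 2.1 REST API.
# Workspace host, token, and job ID are hypothetical.
import os

import requests

HOST = "https://my-workspace.cloud.databricks.com"
TOKEN = os.environ["DATABRICKS_TOKEN"]

def run_pipeline(job_id: int) -> int:
    resp = requests.post(
        f"{HOST}/api/2.1/jobs/run-now",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"job_id": job_id},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["run_id"]  # handle for polling the run's status

print(run_pipeline(job_id=123))
```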

Why Join Us? Join us at Prezzee, where we don’t just work – we innovate, collaborate, and shape the future of digital gifting. As part of our dynamic team, you’ll contribute to a global company’s growth while enjoying a flexible, supportive, and inclusive work environment. We offer:

  • Incentive schemes based on company-wide targets and individual performance.

  • Employee referral program and staff discounts.

  • Flexible hours, Culture Swap Days, and 30-day work-from-anywhere benefit.

  • Opportunities for professional growth through self-led learning and leadership development programs.

  • Fortnightly FIRE Fridays, where teams come together to collaborate on Formative Ideas, Research and Experiments in tech improvements

  • Supportive wellbeing platform (Telus) for your mental, social, financial, and physical wellbeing.

Prezzee has been recognized as Foundry's Computerworld 2024 & 2025 Best Places to Work in IT. We value diversity, collaboration, and innovation in everything we do. Join a global team united by the core values: Give openness, Give greatness, Give magic, and Give a damn.

Prezzee is an Equal Opportunity employer. We believe that diversity is the key to building the best products for our customers, team culture and growing our global business. Our diversity mission is for our people to be their most authentic selves, to inspire, innovate and celebrate within a culture of belonging. We do not discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status.

We are looking for people to help create human connections, make magic and shape the future of gifting, so even if you don't think you quite meet all of the skills listed or tick all the boxes, we'd still love to hear from you! Please let us know if you require any adjustments as part of the application and recruitment process. We also encourage you to let us know your pronouns at any point in the process.




Before we jump into the responsibilities of the role: no matter what you come in knowing, you'll be learning new things all the time, and the Prezzee team will be there to support your growth.

Please consider applying even if you don't meet 100% of what's outlined.

Key Responsibilities
  • Owning the end-to-end development of data pipelines
  • Optimising data infrastructure
  • Championing innovation


Key Strengths
  • Hands-on experience with Terraform
  • Building and maintaining data pipelines in AWS and Databricks
  • Exceptional problem-solving skills
  • Experience with Redshift, Postgres, DynamoDB, Looker and Workday Prism Analytics


Why Prezzee is partnering with Hatch on this role: Hatch exists to level the playing field for people as they discover a career that's right for them, so when you apply you have the chance to show more than just your resume.

A Final Note: This is a role with Prezzee, not with Hatch.

Principal Data Engineer

2000 Sydney, New South Wales | Commonwealth Bank

Posted today


Job Description

This is a Principal Data Engineer role with Commonwealth Bank based in Sydney, NSW, AU
== Commonwealth Bank ==

Role Seniority - senior

More about the Principal Data Engineer role at Commonwealth Bank

Principal Data Engineer

  • You have knowledge and experience that spans both development and architecture, including data engineering, modelling and cloud architecture

  • We are embarking on an exciting Data Transformation program and are ready to push the boundaries and deliver engineering best practices to elevate the data quality and availability in our domain

  • Together we will build tomorrow’s bank today, using world-leading engineering, technology, and innovation

Do Work That Matters

We're building tomorrow's bank today, which means we need creative and diverse engineers to help us redefine what customers expect from a bank: envisioning new technologies that are still waiting to be invented and reimagining products that support our customers and help build Australia's future economy. Here, you'll have the chance to bring your passion to life by working with the latest technology on groundbreaking projects that deliver a seamless and unmatched customer experience.

See Yourself in Our Team

This role is part of the Reporting and Insights Crew within the Financial Crimes domain, where we design and deliver cutting-edge technology solutions that serve data and analytical use-cases.

We are seeking an outstanding Principal Engineer to join our team and help to shape the direction of our data and analytics cloud platforms that we are building.

We’re interested in hearing from people who:

  • Will shape the future of our data and analytics platforms while focusing on designing, developing, and deploying innovative technology solutions that will enable us to build tomorrow's bank today.

  • Are comfortable leading our engineers in implementing complex technical solutions in a SaaS and cloud-based environment. You'll possess a strong risk mindset, ensuring that appropriate cyber security controls are implemented, and play a significant mentoring role in providing technical assistance to the engineers in the team.

  • Enjoy and have demonstrable experience building enterprise software

  • Have a broad technology background with engineering as a core competency

  • Have experience building and leading teams through complex multi-year builds ideally gained in a similar role within an enterprise environment

  • Have identified, gained support and ensured execution of technical direction and strategy

  • Can design and implement solutions to complex problems, including onboarding and migrating to new tooling as well as removing obsolete systems and code.

  • Constructively challenge the status quo while influencing stakeholders and building diverse and inclusive teams.

Tech Skills

We use a broad range of tools, languages, and frameworks. We don't expect you to know them all, but experience or exposure to some of these (or equivalents) will set you up for success in this team:

  • Cloud-scale relational and NoSQL databases (e.g. Redshift, Snowflake, MongoDB)

  • Data Integration & Cataloguing services (e.g. Alation, AWS Glue)

  • Data processing frameworks such as EMR and Spark (see the sketch after this list)

  • Fluent in all aspects of the modern Software Development Lifecycle

  • Experienced with at least one programming language (functional or OOP)

  • Comfortable creating documentation and design artefacts for Software Developers, Cyber Security review and other stakeholders

  • Wide range of knowledge and experience across AWS Analytics services, Integration services, database services and Machine learning services
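As a rough illustration of the EMR/Spark item above, here is a minimal PySpark job of the kind that might run on EMR: read raw events from S3, aggregate, and write a partitioned curated table. Bucket names and columns are hypothetical.

```python
# Sketch only: a PySpark batch job of the sort that runs on EMR.
# S3 paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("txn-daily-rollup").getOrCreate()

txns = spark.read.parquet("s3://fincrime-raw/transactions/")

daily = (
    txns.groupBy("account_id", F.to_date("event_ts").alias("event_date"))
        .agg(
            F.count("*").alias("txn_count"),
            F.sum("amount").alias("total_amount"),
        )
)

# Partitioning by date keeps downstream reads cheap and restatements targeted.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://fincrime-curated/daily_rollup/"
)
```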

Working with us

Whether you’re passionate about customer service, driven by data, or called by creativity, a career with CommBank is for you.

We support our people with the flexibility to balance where work is done, with at least half your time each month connecting in the Sydney office.

You’ll be supported when faced with challenges and empowered to tackle new opportunities.

We really love working here, and we think you will too.

If you're already part of the Commonwealth Bank Group (including Bankwest, x15ventures), you'll need to apply through Sidekick to submit a valid application. We’re keen to support you with the next step in your career.

We're aware of some accessibility issues on this site, particularly for screen reader users. We want to make finding your dream job as easy as possible, so if you require additional support please contact HR Direct on 1800 989 696.

Advertising End Date: 27/07/2025




Before we jump into the responsibilities of the role: no matter what you come in knowing, you'll be learning new things all the time, and the Commonwealth Bank team will be there to support your growth.

Please consider applying even if you don't meet 100% of what's outlined.

Key Responsibilities
  • Shaping data platforms
  • Leading engineering teams
  • Building enterprise software


Key Strengths
  • Data engineering
  • Cloud architecture
  • Software development lifecycle
  • Data integration services
  • Programming languages
  • Machine learning services


Why Commonwealth Bank is partnering with Hatch on this role: Hatch exists to level the playing field for people as they discover a career that's right for them, so when you apply you have the chance to show more than just your resume.

A Final Note: This is a role with Commonwealth Bank, not with Hatch.

Data Engineer Analyst

3000 Melbourne, Victoria | IAG

Posted today

Job Viewed

Tap Again To Close

Job Description

Employment type: Permanent
This is a Data Engineer Analyst role with IAG based in Melbourne, VIC, AU
== IAG ==

Role Seniority - mid level

More about the Data Engineer Analyst role at IAG

Help shape the future as a Data Engineer Analyst.

Join the largest general insurance group in Australia and Aotearoa New Zealand. We’re a top-tier ASX-listed company helping to make the world a safer place through creating a stronger and more resilient business and enabling our portfolio of iconic insurance brands.

Your Role

Are you ready to build data products that help make the world a safer place? Join our Single View of Assets – Data Engineering team and play a key role in shaping how we understand every property, vehicle, and asset across Australia and New Zealand.

We’re looking for a curious, proactive data engineer who’s passionate about creating smart, scalable solutions. In this role, you’ll develop powerful data pipelines, APIs, geospatial tools, and visualisations that put high-quality, reliable insights back into the business to help us achieve our purpose of making our world a safer place.

Your work won’t just sit in a dashboard — it will help improve customer experience, reduce operational overheads, improve data quality, and directly support critical decisions that impact millions of customers and their communities.

Key Responsibilities

  • Collaborate with stakeholders to gather and document complex business requirements, ensuring data products are strategically aligned and deliver measurable value.

  • Develop, maintain, and support production-grade data pipelines, APIs, and interfaces that reflect evolving business needs.

  • Write efficient, high-quality code in GCP, follow best practices, conduct peer reviews, and manage tasks using JIRA for clear documentation and prioritization.

  • Ensure data quality and compliance by creating detailed test cases, performing rigorous testing, and working closely with the data product team to uphold security and integrity standards.

  • Contribute across the product lifecycle, from planning to post-release support, while staying current with industry trends and continuously improving technical skills.

This is a permanent position open to candidates based anywhere within Australia.

About You

  • 5 years in data engineering or analytics engineering, with a strong background in software development.

  • Strong skills in SQL, Python, or other comparable languages. Experience with Google Cloud Platform is essential (see the sketch after this list); experience with DBT is desirable.

  • Demonstrated experience in implementing AI-driven development, testing, optimisation, and governance tools.

  • Experience designing and implementing RESTful APIs.

  • Well-versed in Agile methodologies and DevOps practices, including experience with CI/CD pipelines and version control systems like Git.

  • Strong analytical and problem-solving abilities with an experimental mindset and a passion for creating innovative data solutions.

  • Experience in building cloud native geospatial data pipelines, mapping services and interfaces is desirable.

  • Experience in developing or maintaining addressing and geocoding data and APIs is desirable.

  • Bachelor’s degree (or higher) in Business Analysis, Analytics/Data Science, Management Information System or in a specific insurance domain; or equivalent industry experience.
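To make the GCP requirement above concrete, here is a minimal sketch of querying BigQuery from Python with the official client library. The project, dataset, and table names are hypothetical.

```python
# Sketch only: a BigQuery query via the google-cloud-bigquery client.
# Project, dataset, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="assets-demo-project")

query = """
    SELECT property_id, COUNT(*) AS policy_count
    FROM `assets-demo-project.assets.policies`
    GROUP BY property_id
    ORDER BY policy_count DESC
    LIMIT 10
"""

for row in client.query(query).result():  # result() blocks until the job finishes
    print(row.property_id, row.policy_count)
```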

Applications close Wednesday 9 July 2025 at 11:59 pm AEST.

Joining IAG, you’ll have access to a raft of benefits from across the group:

  • Boosted superannuation with 13% as standard

  • Up to 50% off personal insurance, including home and motor insurance

  • Work from home and many more flexibility options with myFlex

  • Dedicated career growth programs, including the award-winning IAG Academy

  • Discounts on every day and special occasion items

  • A certified Family Inclusive Workplace™

  • Community volunteer days and team volunteer activities

  • Employment type eligibility criteria apply.

About Us

As part of IAG you'll enjoy a world of career opportunities, a purpose-led place focused on creating connection and belonging, and where you can create meaningful impact every day and grow your career beyond the expected. That’s not just words. It’s our people promise. We're ready for you with unexpected opportunities for your career, your work-life and your ability to make a difference. We celebrate all viewpoints shaped by life experiences and culture, and are guided by the knowledge and voice of Aboriginal and Torres Strait Islander peoples, businesses, and communities. We collaborate on Indigenous-led solutions that enable growth and create meaningful change for our customers and employees.

We’re ready for you. Apply today.




Before we jump into the responsibilities of the role: no matter what you come in knowing, you'll be learning new things all the time, and the IAG team will be there to support your growth.

Please consider applying even if you don't meet 100% of what's outlined.

Key Responsibilities
  • Collaborating with stakeholders
  • Developing and maintaining data pipelines
  • Writing efficient code


Key Strengths
  • Data engineering
  • Programming skills
  • API design
  • AI-driven development
  • Geospatial data pipelines
  • Addressing and geocoding


Why IAG is partnering with Hatch on this role: Hatch exists to level the playing field for people as they discover a career that's right for them, so when you apply you have the chance to show more than just your resume.

A Final Note: This is a role with IAG, not with Hatch.

Senior Data Engineer

2000 Sydney, New South Wales | Prezzee

Posted today

Job Viewed

Tap Again To Close

Job Description

This is a Senior Data Engineer role with Prezzee based in Sydney, NSW, AU
== Prezzee ==

Role Seniority - senior

More about the Senior Data Engineer role at Prezzee

Who We Are: At Prezzee, we’re more than just a digital gifting platform – we’re building human connections through unforgettable gifting moments. With over 1,000 brand partners globally, we’re transforming how people give and receive gifts. From our beginnings as a small Australian startup to becoming a global leader, we thrive on innovation, collaboration, and a deep commitment to making a real impact.

The Role: As a Senior Data Engineer at Prezzee, you’ll take the lead in designing, building, and maintaining the data pipelines that drive critical decision-making across the business. Reporting to the Head of Infrastructure, this hands-on role will give you the opportunity to work on a multinational data platform, drawing and integrating data from multiple regions around the world.

You’ll play a key role in shaping Prezzee’s data capabilities as we pursue bold strategic ambitions and continue to disrupt the global gift card and payments market. Your work will ensure our teams have access to trusted, timely data—empowering smarter decisions and enabling scalable growth at pace.

What You'll Be Doing:

  • Own the end-to-end development of scalable, sustainable data pipelines in AWS and Databricks—designing, building, testing, and maintaining systems that keep our data flowing seamlessly.

  • Optimise and fine-tune our data infrastructure to ensure performance, reliability, and efficiency at scale.

  • Manage and maintain the critical software systems that support our growing data platform.

  • Champion innovation by fostering a culture of continuous improvement, always looking for smarter, faster ways to work.

  • Collaborate cross-functionally with infrastructure and engineering teams to deliver, operate, and monitor reliable data solutions.

  • Partner with stakeholders to deeply understand business needs and translate them into robust, future-ready data pipelines and products.

  • Define and enhance technical standards, including clear, well-maintained documentation that supports consistency and knowledge sharing.

  • Promote and uphold best practices across all aspects of data engineering to ensure quality, scalability, and long-term success.

What We’re Looking For:

Core Skills:

  • Hands-on experience working with Terraform

  • Extensive experience building and maintaining data pipelines in AWS and Databricks

  • Exceptional problem solving and troubleshooting skills

  • A strong understanding of architectural concepts and cloud-based solution design

  • Demonstrable ability to motivate self and others

  • Experience working with technical and non-technical partners, with the capability to influence

  • Experience with Redshift, Postgres, DynamoDB, Looker and Workday Prism Analytics would be a bonus (but not essential)

Why Join Us? Join us at Prezzee, where we don’t just work – we innovate, collaborate, and shape the future of digital gifting. As part of our dynamic team, you’ll contribute to a global company’s growth while enjoying a flexible, supportive, and inclusive work environment. We offer:

  • Incentive schemes based on company-wide targets and individual performance.

  • Employee referral program and staff discounts.

  • Flexible hours, Culture Swap Days, and 30-day work-from-anywhere benefit.

  • Opportunities for professional growth through self-led learning and leadership development programs.

  • Fortnightly FIRE Fridays, where teams come together to collaborate on Formative Ideas, Research and Experiments in tech improvements

  • Supportive wellbeing platform (Telus) for your mental, social, financial, and physical wellbeing.

Prezzee has been recognized as Foundry's Computerworld 2024 & 2025 Best Places to Work in IT. We value diversity, collaboration, and innovation in everything we do. Join a global team united by the core values: Give openness, Give greatness, Give magic, and Give a damn.

Prezzee is an Equal Opportunity employer. We believe that diversity is the key to building the best products for our customers, team culture and growing our global business. Our diversity mission is for our people to be their most authentic selves, to inspire, innovate and celebrate within a culture of belonging. We do not discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status.

We are looking for people to help create human connections, make magic and shape the future of gifting, so even if you don't think you quite meet all of the skills listed or tick all the boxes, we'd still love to hear from you! Please let us know if you require any adjustments as part of the application and recruitment process. We also encourage you to let us know your pronouns at any point in the process.




Before we jump into the responsibilities of the role: no matter what you come in knowing, you'll be learning new things all the time, and the Prezzee team will be there to support your growth.

Please consider applying even if you don't meet 100% of what's outlined.

Key Responsibilities
  • Owning the end-to-end development of data pipelines
  • Optimising data infrastructure
  • Collaborating cross-functionally


Key Strengths
  • Hands-on experience with Terraform
  • Building and maintaining data pipelines in AWS and Databricks
  • Exceptional problem-solving skills
  • Experience with Redshift, Postgres, DynamoDB, Looker, and Workday Prism Analytics


Why Prezzee is partnering with Hatch on this role: Hatch exists to level the playing field for people as they discover a career that's right for them, so when you apply you have the chance to show more than just your resume.

A Final Note: This is a role with Prezzee, not with Hatch.

Data Engineer - Amazon FinTech

Sydney, New South Wales | Amazon

Posted 1 day ago


Job Description

Description
Are you a highly skilled data engineer and project leader? Do you think big, enjoy complexity and building solutions that scale? Are you curious to know what you could achieve in a company that pushes the boundaries of modern technology? If you answered yes and you have a background in FinTech you'll love this role and Amazon's data obsessed culture.
Amazon Devices and Services Fintech is the global team that designs and builds the financial planning and analysis tools for a wide variety of Amazon's new and established organizations. From Kindle to Ring, and even new and exciting ventures like Kuiper (our new satellite constellation play), this team enjoys a wide variety of complex and interesting problem spaces. They are almost like FinTech consultants embedded in Amazon.
As a Data Engineer in this team you will build and enhance the business's global financial systems org-wide. The primary product used is TM1, but unlike other FinTech teams you will have the freedom to use Amazon's full arsenal of AWS technology to build the innovative solutions required to deliver at the scale a trillion-dollar company demands. You will manage all aspects from requirements gathering, technical design, development, deployment, and integration to solve budgeting, planning, performance management, and reporting challenges.
Key job responsibilities
- Design and implement next generation financial solutions assisted by almost unlimited access to AWS resources including EC2, RDS, Redshift, Stepfunctions, EMR, Lambda and 3rd party software TM1.
- Build and deliver high quality data pipelines capable of scaling from a single month of data during month-end close to 150 or more months when doing restatements (see the sketch after this list).
- Continually improve ongoing reporting and analysis processes and infrastructure, automating or simplifying self-service capabilities for customers.
- Dive deep to resolve problems at their root, looking for failure patterns and suggesting and implementing fixes or enhancements.
- Prepare runbooks, methods of procedure, tutorials, and training videos on best practices for global delivery.
- Solve unique challenges presented by the massive data volume and diverse data sets working for one of the largest companies in the world.
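As a sketch of the scaling pattern in the second bullet above, the trick is to parameterise the pipeline over month partitions so the same job handles one month at close or a 150-month restatement. The workgroup, database, and procedure names below are hypothetical, and the Redshift Data API is just one way to drive this.

```python
# Sketch only: running the same monthly load for 1 month (close) or 150+
# months (restatement). Workgroup, database, and procedure names are
# hypothetical.
import boto3

client = boto3.client("redshift-data")

def run_for_months(months: list[str]) -> None:
    for month in months:  # e.g. "2025-06"
        client.execute_statement(
            WorkgroupName="fintech-wg",
            Database="finance",
            Sql="CALL finance.load_month(:month)",
            Parameters=[{"name": "month", "value": month}],
        )

run_for_months(["2025-06"])  # month-end close: a single partition
# For a restatement, pass the full list of months instead, e.g. every month
# from "2013-01" through "2025-06".
```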
Basic Qualifications
- Substantive experience building data engineering solutions in a large enterprise
- Basic knowledge of TM1/Planning Analytics or other financial planning and reporting systems of scale.
- Extensive experience writing SQL queries and stored procedures.
- Knowledge of distributed systems as it pertains to data storage and computing
- FinTech experience, exhibiting knowledge of financial reporting, budgeting and forecasting functions and processes.
- Bachelor's degree.
Preferred Qualifications
- Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases)
- Experience with big data technologies such as: Hadoop, Hive, Spark, EMR
- Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, FireHose, Lambda, and IAM roles and permissions.
- Experience with programming languages such as Python, Java, or shell scripts.
- Experience with IBM Planning Analytics/TM1 both scripting processes and writing rules.
- Experience with design & delivery of formal training curriculum and programs.
- Project management, scoping, reporting, and scheduling experience.
Acknowledgement of country:
In the spirit of reconciliation Amazon acknowledges the Traditional Custodians of country throughout Australia and their connections to land, sea and community. We pay our respect to their elders past and present and extend that respect to all Aboriginal and Torres Strait Islander peoples today.
IDE statement:
Amazon is an equal opportunity employer and does not discriminate on the basis of protected veteran status, disability, or other legally protected status.
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please reach out for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
 
