Big Data Developer

Sydney, New South Wales, Australia · Cognizant · Full time
Position Summary

We are seeking a highly skilled and experienced Big Data Developer to join our engineering team. The ideal candidate will have strong expertise in building and maintaining robust, scalable, and high-performance big data solutions using modern big data tools and technologies. You will be responsible for developing data pipelines, transforming large volumes of data, and optimizing data infrastructure to enable real-time and batch processing for business insights and analytics.

Mandatory Skills
  • Design, develop, and maintain scalable and secure big data processing systems using tools such as Apache Spark, Hadoop, Hive, and Kafka.
  • Build and optimize data pipelines and architectures for data ingestion, processing, and storage from diverse sources (structured, semi-structured, unstructured).
  • Develop ETL/ELT workflows and data transformation logic using custom code in Python, Scala, or Java and job scheduling tools such as Control-M (an illustrative sketch follows this list).
  • Translate business needs and requirements into technical specifications, and develop application strategy and high-level design documentation.
  • Participate in solution design discussions, competitive displacement efforts, proof-of-concept engagements, solution demonstrations, and technical workshops.
  • Manage development sprints effectively and efficiently, from planning through execution to review, in an agile development environment.
  • Convert business knowledge into technical knowledge and vice versa, and translate those insights into effective frontline action.
  • Build and manage end-to-end big data streaming and batch pipelines as per business requirements.
  • Report issues and blockers to management and the customer in a timely manner to mitigate risk.
  • Collaborate with other teams, analysts, and stakeholders to understand data requirements and deliver efficient data solutions.
  • Ensure data quality, consistency, and security by implementing appropriate data validation, logging, monitoring, and governance standards.
  • Tune and monitor the performance of data systems, jobs and queries.
  • Integrate and manage data platforms in Azure cloud environment.
  • Mentor junior developers and support team best practices in code review, version control, and testing.
  • Document system designs, data flows, and technical specifications.
  • Proactively identify and eliminate impediments and facilitate flow.
  • Experience with project management software (e.g., JIRA, Manuscript) to support task estimates and execution.
  • Participate in engineering development projects and facilitate sprint releases.
  • Create reusable assets and supporting artifacts based on the industry landscape.
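
For illustration only: the ETL/ELT and ingestion responsibilities above are the kind of work typically demonstrated in PySpark. The sketch below is a minimal example under assumed details (the storage account, container paths, schema, and column names are hypothetical and not taken from this posting); a real pipeline would follow the team's own frameworks and standards.

    # Minimal, illustrative PySpark batch ETL sketch (hypothetical paths and columns).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("customer-batch-etl").getOrCreate()

    # Ingest semi-structured source data from a hypothetical Azure Data Lake path.
    raw = spark.read.json("abfss://landing@examplelake.dfs.core.windows.net/customers/")

    # Basic validation and cleansing: drop records missing the key, derive a
    # partition column, and remove duplicates.
    clean = (
        raw.filter(F.col("customer_id").isNotNull())
           .withColumn("event_date", F.to_date("event_ts"))
           .dropDuplicates(["customer_id", "event_ts"])
    )

    # Write curated output as partitioned Parquet for downstream batch consumers.
    (clean.write
          .mode("overwrite")
          .partitionBy("event_date")
          .parquet("abfss://curated@examplelake.dfs.core.windows.net/customers/"))

    spark.stop()

In practice, a job of this kind would be scheduled and monitored through an orchestrator such as Control-M, as described in the responsibilities above.
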
Mandatory Knowledge & Qualifications
  • Bachelor's or master's degree in computer science, data engineering, information technology, or a related field.
  • 4+ years of experience in Big Data technologies.
  • Experience with template-oriented data validation and cleansing frameworks.
  • Strong programming skills in Python, Scala, shell scripting, and Java.
  • Extensive experience in data transformation and building ETL pipelines using PySpark.
  • Strong knowledge of performance tuning for existing big data streaming and batch pipelines.
  • Proficiency in big data tools and frameworks such as Apache Spark, Hadoop, Kafka, and Hive.
  • Strong proficiency in designing and developing generic ingestion and streaming frameworks that can be leveraged by new initiatives and projects (see the illustrative streaming sketch after this list).
  • Experience with NoSQL databases such as HBase and Azure Cosmos DB.
  • Experience optimizing high-volume data loads in Spark.
  • Deep knowledge of the Azure cloud platform.
  • Experience tracing complex data lineage using Apache Atlas.
  • Strong knowledge of creating, scheduling, and monitoring jobs using Control-M.
  • Strong knowledge of data analysis, Natural Language Processing (NLP), Machine Learning (ML) techniques, and data visualization.
  • Deep understanding of the Alex data governance tool.
  • Strong understanding of the Banking and Financial Services (BFS) domain (mandatory).
  • Ability to build predictive models and machine learning algorithms.
  • Ability to present information using data visualization techniques.
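
For illustration only: the Kafka and streaming-framework requirements above are commonly demonstrated with Spark Structured Streaming. The sketch below is a minimal, hypothetical example; the broker address, topic name, message schema, and sink paths are assumptions rather than details from this posting, and the Kafka source additionally requires the spark-sql-kafka connector on the classpath.

    # Minimal, illustrative Spark Structured Streaming sketch reading from Kafka.
    # Broker, topic, schema, and paths are hypothetical assumptions.
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType

    spark = SparkSession.builder.appName("payments-streaming").getOrCreate()

    payment_schema = StructType([
        StructField("payment_id", StringType()),
        StructField("amount", DoubleType()),
        StructField("currency", StringType()),
    ])

    # Read the raw Kafka stream; the message value arrives as bytes and is parsed as JSON.
    events = (
        spark.readStream
             .format("kafka")
             .option("kafka.bootstrap.servers", "broker1:9092")
             .option("subscribe", "payments")
             .load()
             .select(F.from_json(F.col("value").cast("string"), payment_schema).alias("p"))
             .select("p.*")
    )

    # Write to Parquet with a checkpoint location so the query can recover after failure.
    query = (
        events.writeStream
              .format("parquet")
              .option("path", "abfss://stream@examplelake.dfs.core.windows.net/payments/")
              .option("checkpointLocation", "abfss://checkpoints@examplelake.dfs.core.windows.net/payments/")
              .start()
    )
    query.awaitTermination()

The checkpoint location is what allows the query to resume from where it left off, which underpins the reliability expected of the streaming pipelines described above.
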
Required Skills & Technology Stack
  • Primary Skills:
    • Hadoop – Expert
    • Apache Spark – Expert
    • Apache Atlas – Expert
    • Python – Expert
    • Scala – Expert
    • Shell Script – Expert
    • PySpark – Expert
    • Scikit-learn – Expert
    • Matplotlib – Expert
    • Azure Storage Explorer – Expert
  • Database Skills:
    • HBase – Expert
    • Hive – Expert
    • Oracle – Expert
    • SQL Server – Expert
    • MySQL – Expert
    • Cosmos DB – Expert
  • Other Skills:
    • NLP – Expert
    • ML – Expert
    • Data Science – Expert
    • Data Visualization – Expert
  • Tools:
    • PyCharm/VSCode – Advanced
    • Control-M – Advanced
    • Code Versioning Tool – Advanced
    • Project Management tools (JIRA, Confluence, ServiceNow) – Advanced
    • MS Visio – Expert
    • HP ALM/QC – Expert
Salary Range

$75,000 - $85,000

Date of Posting

25 August 2025

Next Steps

If you feel this opportunity suits you, or Cognizant is the type of organization you would like to join, we want to have a conversation with you. Please apply directly with us.

For a complete list of open opportunities with Cognizant, visit http://www.cognizant.com/careers. Cognizant is committed to providing Equal Employment Opportunities. Successful candidates will be required to undergo a background check.

#LI-CTSAPAC


    Expert Big Data Developer WantedWe are seeking a seasoned Big Data Developer to spearhead the design and implementation of robust, scalable big data solutions. The ideal candidate will have extensive expertise in building and maintaining high-performance big data systems using modern tools and technologies.Key ResponsibilitiesDevelop and optimize complex...