Job Description: Data Engineer
Role: Data Engineer
Location: Newcastle Upon Tyne
Salary: TBC - Depending on experience
Levels: Senior Analyst, Specialist
Hybrid Working: 3 days per week in our Newcastle, Cobalt business park office
Please Note: Any offer of employment is subject to satisfactory BPSS and SC security clearance, which requires five years' continuous UK address history (typically with no periods of 30 consecutive days or more spent outside the UK). At the point of application, candidates must hold a British or EU passport or Indefinite Leave to Remain in the UK.
Note: The above information relates to a specific client requirement.
About the Team
Our Advanced Technology Centre is a hub of innovation where we deliver high‑quality data and technology services to clients across both the public and private sectors. You’ll join a collaborative culture that values diverse thinking, continuous learning, and opportunities for career growth within a global network of experts.
If you're looking for a dynamic role that offers hands‑on experience with modern data technologies and the chance to shape large‑scale data solutions, this position offers you the opportunity to develop and progress rapidly.
Role Overview
As a Data Engineer, you will design, build, and maintain scalable data solutions that enable analytics, AI, and operational insights. You’ll work alongside client and internal teams to create robust data pipelines, ensure data reliability, and support cloud‑based architectures that power intelligent decision‑making.
Key Responsibilities
Data Pipeline Development
- Build, optimize, and maintain scalable data pipelines using Java (primary), plus exposure to Python, Flink, Kafka, or Spark.
- Develop and support real‑time streaming pipelines and event‑driven integrations.
- Integrate data from multiple sources (streaming, batch, APIs) using AWS managed services (e.g., Kinesis, MSK, Lambda, Glue).
Data Architecture & Standards
- Contribute to data modelling, data architecture best practices, and modern patterns (e.g., medallion architecture).
- Ensure data quality, lineage, governance, and security controls are applied consistently.
DevOps & Deployment
- Deploy and maintain data applications using CI/CD tooling (Azure DevOps, GitHub Actions, Jenkins).
- Use Infrastructure as Code (e.g., Terraform, CloudFormation) to manage cloud environments.
- Work with container technologies such as Docker and Kubernetes‑based workloads.
Collaboration
- Work closely with analytics, ML/AI, and product teams to deliver clean, well‑structured datasets.
- Participate in code reviews and internal knowledge‑sharing sessions.
- Provide guidance to junior engineers where needed.
Qualifications
Core Data Engineering
- Strong programming proficiency in Java (preferred) or Python.
- Hands-on experience with at least one of: Kafka, Flink, Spark (Flink/Kafka preferred for streaming).
- Solid understanding of stream processing concepts (e.g., event time, state, backpressure).
- Understanding of software engineering best practices: testing, design patterns, CI/CD, Git.
- Experience building ETL/ELT or streaming data pipelines.
- Exposure to microservices and distributed system concepts.
- Experience working with cloud platforms, ideally AWS, but Azure/GCP also acceptable.
- Understanding of distributed compute, large-scale data systems, and performance considerations.
DevOps & Engineering Practices
- Experience with CI/CD tools (e.g., Azure DevOps, GitHub Actions, Jenkins).
- Infrastructure‑as‑Code (Terraform preferred).
- Experience with containerisation (Docker) and orchestration platforms (Kubernetes/EKS).
Certifications & Tools
- Exposure to enterprise data platforms (Databricks, Snowflake, BigQuery, or similar).
- Cloud certifications (AWS, Azure, GCP) are beneficial but not required.
Other Requirements
- Minimum 3 years’ experience working on data engineering or large-scale data solutions.
- Comfortable working in Agile delivery teams.
- Strong communication skills and ability to collaborate with technical and non‑technical stakeholders.
Desirable
- Experience in client-facing or consulting environments.
- Professional cloud or data engineering certifications.
- Experience mentoring or supporting junior engineers.
- Background in designing or operating real‑time, low-latency systems.
Locations
Newcastle
Additional Information
Equal Employment Opportunity Statement
All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
Accenture is committed to providing veteran employment opportunities to our service men and women.
Please read Accenture’s Recruiting and Hiring Statement for more information on how we process your data during the Recruiting and Hiring process.
About Accenture
We work with one shared purpose: to deliver on the promise of technology and human ingenuity. Every day, more than 775,000 of us help our stakeholders continuously reinvent. Together, we drive positive change and deliver value to our clients, partners, shareholders, communities, and each other.
We believe that delivering value requires innovation, and innovation thrives in an inclusive and diverse environment. We actively foster a workplace free from bias, where everyone feels a sense of belonging and is respected and empowered to do their best work.
At Accenture, we see well-being holistically, supporting our people’s physical, mental, and financial health. We also provide opportunities to keep skills relevant through certifications, learning, and diverse work experiences. We’re proud to be consistently recognized as one of the World’s Best Workplaces™.
Join Accenture to work at the heart of change. Visit us at www.accenture.com.