Software Developer (SFIA 4) SC / Newcastle - CEA

Peregrine
Greenhill, West Midlands
£127,185
Posted 1 day ago

About the role

Software Developer

Read on to fully understand what this job requires in terms of skills and experience. If you are a good match, make an application.
Permanent | Hybrid (willing to travel to Newcastle) | Python | Apache | Data | SC cleared

Hybrid work arrangement. Office attendance is up to 60%. Location is flexible: London, Leeds, Newcastle, Sheffield, Blackpool, Manchester and Birmingham. You must be willing to travel to Newcastle when required.

The Role:
We are looking for Software Developers with strong Python and Apache Spark data-processing skills to lead the design, development, and operation of data-driven applications and pipelines in a collaborative Dev and DevOps environment.
The role focuses on writing and improving application code while working closely with DevOps engineers to support automated deployments, infrastructure management, system monitoring, reliability, and scaling. The post holder collaborates with Product Owners, Business Analysts, and users to translate business needs into robust technical solutions, operates and improves production services, analyses data and system issues to identify root causes, and champions engineering best practices. They also provide technical leadership through coaching and mentoring junior colleagues and contributing to continuous improvement across development, delivery, and operational processes.

Responsibilities:
Engineers will contribute to research, development and delivery across:
Design, build, and operation of data ingest and publishing pipelines
Workflow orchestration and task scheduling using managed services
Collaboration with Product Owners, Business Analysts, and users to shape technical solutions
Production support, monitoring, and continuous improvement of system resilience, stability, and performance
Data analysis and investigation to identify root causes of defects and operational issues
DevOps collaboration, including supporting automated deployments, infrastructure management, monitoring, and scaling
Coaching and mentoring junior engineers and promoting engineering best practices

Skills & Experience:
Understanding of data processing using Apache Spark
Use of Python, SQL, and familiarity with PySpark
Experience using Apache Airflow for task orchestration
Understanding of EMR and reviewing output logs
Use of Jupyter notebooks and/or Amazon Athena to query and validate data
Data analysis to identify root cause of issues
Understanding of dimensional data models and slowly changing dimensions/historic data capture
Use of the AWS console and services such as (but not limited to) CloudWatch, IAM, S3, Glue, ECR, EC2, EMR, DynamoDB, and Lake Formation
Familiarity with Amazon Textract and Comprehend
Understanding of both server-side and client-side encryption
Use of GitLab for source code management and CI/CD pipelines
Use of GitLab Tags for component versioning in shared repositories
Understanding of Docker and containerization of solutions
IaC using Terraform
Experience translating customer expectations into applied functionality
Familiarity with, and implementation of, engineering best practices
Use of GitLab for release tagging and deployments
Familiarity with basic data structures for constructing a solution
Active BPSS or SC clearance, or eligibility for clearance

Desirable skills:
Experience supporting AI or data-driven platforms
Knowledge of cyber security or fraud prevention domains
Experience working within government or critical national infrastructure environments

Find out more: peregrine.global or check out our LinkedIn page: peregrine-resourcing

About this listing

Screened by Joboru

This role passed our automated spam and quality filters and was active in our feed when last checked. Joboru is an aggregator — here is how we screen listings. If anything looks off, tell us.