Essential skills:
– Significant commercial experience in a Senior Data Engineering role
– Strong experience with tracking technologies such as Snowplow/Rudderstack/Segment
– Previous experience with real-time data streaming platforms such as Kafka, Confluent or Google Cloud Pub/Sub
– Experience with stream processing frameworks such as Faust/Flink/Kafka Streams or similar
– Great Python skills
– Experience mentoring more junior team members
– Comfortable with ELT pipelines and the full data lifecycle
– Comfortable evaluating both business and technical requirements
– A good understanding of techniques to deal with large datasets
– Experience handling real-time data
Desirable skills:
– Comfortable with database technologies such as Snowflake/PostgreSQL and NoSQL technologies such as Elasticsearch/Redis or similar
– An understanding of JavaScript/TypeScript
– An understanding of Docker
– Experience managing data pipelines over time and evolving them to meet new business requirements
Diversity is incredibly important to us. Research shows that people from marginalised groups are less likely to apply for a job unless they meet every requirement. However, these criteria are a guide, and if you feel this role could be for you but you don’t meet all of them, please do apply. We’d love to hear from you.
Benefits we offer
– Employee Assistance Programme (confidential counselling)
– Medicash healthcare scheme (reclaim costs for dental, physiotherapy, osteopathy and optical care)
– easitBrighton travel scheme (discounted public transport options)
– Cycle to work scheme
– Life Insurance scheme
– 25 days annual leave + bank holidays + your birthday off (rising to 28 after 3 consecutive years with the business & 30 after 5 years)
– Contributory pension scheme