Are you an experienced Data Engineer who wants to design and build Data pipelines within a scaling-up digital entertainment app that’s at the forefront of the new “online” way of socialising?
Are you looking to work somewhere you’re part of a team who take pride in working together?
Does working within a straightforward team that constantly challenges itself, is always learning, and genuinely loves what it's creating appeal to you?
We’re looking for someone energetic and proactive who is fascinated by modern data stacks and enjoys digging into schemas and file formats.
We’re looking for a data engineer to help us create data pipelines and shape the data for our fascinating and unique chat and entertainment app for iOS, Android and the web – WOLF, The World’s Online Festival.
The ideal candidate will have a background in software engineering and experience in an analytics field, working with multiple data sets to support data-driven decisions.
Ideally, you will be familiar with the modern data stack, data warehouse solutions, and the world of mobile and web apps.
This position is based full-time in our offices in Cramlington and can be partly remote.
- Implement new technology to leverage data across the business
- Partner with the BI, marketing/community, and finance teams to solve problems and design data processes
- Design, manage, and support the data infrastructure, implementing principles and processes to keep the platform highly automated, self-service, and able to scale as we continue to grow
You’ll work across the following four key areas:
1. Data integration
- Connecting various data sources to our data warehouse solution in Google BigQuery
- Developing and implementing procedures for secure and effective data management
- Implementing data definitions, data mappings and providing support across ongoing integration activities
2. Data warehousing
- Creating data transformation pipelines and generating report tables for analytics
- Writing SQL unit tests for data transformation scripts
- Running regular quality checks on data
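To give a flavour of the SQL unit testing mentioned above, here is a minimal sketch in Python. The table and column names (`events`, `daily_active_users`) are illustrative rather than WOLF's actual schema, and sqlite3 stands in for BigQuery purely so the example is self-contained:

```python
import sqlite3


def test_daily_active_users_transform():
    """Unit-test a SQL transformation against a small in-memory fixture.

    Illustrative only: table/column names are invented, and sqlite3
    is a stand-in for a real warehouse such as BigQuery.
    """
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE events (user_id INTEGER, event_date TEXT);
        INSERT INTO events VALUES
            (1, '2024-01-01'), (1, '2024-01-01'),  -- duplicate event
            (2, '2024-01-01'), (3, '2024-01-02');
    """)
    # The transformation under test: distinct active users per day.
    rows = conn.execute("""
        SELECT event_date, COUNT(DISTINCT user_id) AS dau
        FROM events
        GROUP BY event_date
        ORDER BY event_date
    """).fetchall()
    conn.close()
    # Pin down the expected report table, including de-duplication.
    assert rows == [('2024-01-01', 2), ('2024-01-02', 1)]
    return rows


test_daily_active_users_transform()
```

The same pattern — load a small fixture, run the transformation SQL, assert on the result — carries over to warehouse-native tooling.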
3. Business intelligence
- Working with our BI team to deliver data engineering support for existing BI reports
- Working collaboratively with software engineering teams to understand and define application and platform data requirements
4. Data infrastructure
- Managing our data ingestion pipelines with infrastructure as code
- Managing data transformation pipelines with SQL and Git
- Providing recommendations on data architecture and process improvements on an ongoing basis
The successful candidate will need to be highly technical, follow a hands-on approach, and have extensive experience managing and analysing data.
- Experience with the cloud, ideally AWS, with a desire to learn Google Cloud
- Previous experience working with products that have regular releases through automated CI/CD processes
- An Agile background, with experience using tools such as Jira and Confluence
- Solid experience working with ELT/ETL tools
- An in-depth understanding of database structures and principles
- Experience troubleshooting application and data platform related incidents through to resolution
- Experience working with and developing continuous integration (CI) and continuous delivery (CD) environments
Required Technical Knowledge
- Good working knowledge of AWS services (ECS, EC2, Kinesis, etc.)
- Familiar with serverless, event-driven technology such as AWS Lambda
- Hands-on experience working with databases, e.g. MySQL on AWS and DynamoDB
- Experience working with a Snowflake or BigQuery data warehouse
- Experience with Node.js
- Experience with scripting languages such as Python is highly desirable
- An understanding of cloud Infrastructure as Code (AWS CloudFormation, Terraform)
- The ability and appetite to learn and use a wide variety of open-source technologies and tools