Data Engineer

Data Engineer Job Description Template

Our company is looking for a Data Engineer to join our team.

Responsibilities:

  • Convert algorithms, models and features into production solutions;
  • Design a coherent database structure to accommodate player behavioural data for data science and analytics requirements;
  • Write elegant, simple, tested and reusable code;
  • Engage with the Data Science and Machine Learning teams;
  • Build efficient and reliable data pipelines to ingest and transform data sets using SQL, Python and AWS (an illustrative sketch follows this list);
  • Investigate, troubleshoot and resolve data errors and discrepancies;
  • Perform data mapping between source and target systems;
  • Identify and investigate potential improvements to our current methodologies; plan and implement agreed changes;
  • Assemble large, complex data sets that meet functional and non-functional business requirements;
  • Investigate, test and introduce new ways of working and technologies where appropriate;
  • Initial 6-week contract, with potential for further contracts;
  • Help create the data pipeline for analysing large datasets (Spark, BigQuery);
  • Liaise with internal managers and represent the data team at meetings;
  • Develop database solutions to store and retrieve company information;
  • Assist data scientists with productionised implementations of complex computational jobs, e.g. ML model training and evaluation.
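
As a purely illustrative sketch of the pipeline work described above (the bucket paths, table names and column names below are hypothetical assumptions, not details from this posting), a typical PySpark job might ingest raw player-event data, clean it and write a daily aggregate:

    # Minimal PySpark sketch: ingest raw player events, clean them and write a
    # daily aggregate. All paths and column names are illustrative assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("player-events-daily").getOrCreate()

    # Ingest: read raw JSON events landed in object storage (hypothetical path).
    raw = spark.read.json("s3://example-bucket/raw/player_events/")

    # Transform: drop malformed rows, normalise types, derive an event_date column.
    events = (
        raw.dropna(subset=["player_id", "event_type", "event_ts"])
           .withColumn("event_ts", F.to_timestamp("event_ts"))
           .withColumn("event_date", F.to_date("event_ts"))
    )

    # Aggregate: daily event counts per player and event type.
    daily = (
        events.groupBy("event_date", "player_id", "event_type")
              .agg(F.count("*").alias("event_count"))
    )

    # Load: write partitioned Parquet for downstream analytics and data science use.
    daily.write.mode("overwrite").partitionBy("event_date").parquet(
        "s3://example-bucket/curated/player_events_daily/"
    )

    spark.stop()

In practice a job like this might be scheduled and monitored (for example with an orchestrator on AWS), with data-quality checks added around the transform step.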

Requirements:

  • Proficiency in Python;
  • Strong experience in Software and Data Engineering;
  • Strong background in SQL Server;
  • Good level of Unix skills and bash scripting;
  • Ability to write easily understandable and maintainable code in many languages;
  • Fibre optic light meter / OTDR testing;
  • Experience with Data Warehousing / Data Lakes;
  • Strong performance in a relevant Master’s degree;
  • Knowledge of data analysis, visualization techniques, and frameworks;
  • Understanding of web analytics, UI, and UX;
  • Single Mode and Multimode Fibre Optic installation and spliced termination into patch panels;
  • Experience with wider persistence technologies (e.g. Elastic, Mongo, Accumulo);
  • Extensive experience with Python;
  • A relevant Bachelor’s degree with a strong academic record;
  • Experience with Spark or PySpark.