Some of our clients are looking for Data Engineers to join their teams.
Key Responsibilities:
• Create and run complex queries and automation scripts for operational data processing, reporting and pipelines
• Work closely with the Development teams to help plan and design required database structures, resolve problems, and optimize performance on key projects and systems
• Identify and correct corrupt data, and help migrate and normalize database structures
• Research, plan, design and develop specifications for future database requirements including enhancements, upgrades, and capacity projections; evaluate alternatives; and make appropriate recommendations
• Define and implement end-to-end data architecture, including data pipelines and data models
• Help clients transition to modern cloud-based infrastructures (AWS, Azure, GCP) and leverage related architecture patterns (e.g., APIs, events)
• Work with clients in more traditional areas of data engineering, such as data warehousing, building operational ETL/ELT data pipelines across several sources, and constructing relational and dimensional data models
• Help implement or maintain data backup and recovery procedures
• Mentor and train colleagues where necessary, helping them learn and improve their SQL skills, and innovate and iterate on best practices
• Prepare and maintain documentation such as database configuration and content, security and user authorizations
Required skills and abilities:
• At least 3 years of experience working with data
• Experience with PostgreSQL or MySQL
• Comprehensive knowledge of the principles of relational database design and operation
• Knowledge of or experience with Linux, SSH, and scripting/automation with Bash
• Experience working in an Agile environment
• You must have experience in hands-on data engineering, solution design, and architecture
• Understanding of core concepts such as distributed computing, batch and stream processing, functional and object-oriented programming, how pipelines are built and deployed in the cloud, pipeline schedules, and SLAs
• Experience designing and building at least one modern data analytics solution using cloud technologies (Azure, AWS, GCP)
• You have knowledge of different technology stacks, including common legacy and modern stacks, and working knowledge of some of the leading data tools/technologies (vendor or open-source) such as Databricks, Snowflake, Airflow, dbt, etc.
If you are interested, please send us your CV!
All applications will be treated as strictly confidential and only short-listed candidates will be contacted.
License for recruitment for Bulgaria: № 2399 / 15.11.2017
License for administration and protection of personal data: № 432025 / 23.10.2017