Full job posting:
Job Summary:
We are seeking an experienced and versatile Lead Data Engineer to join our Business Intelligence team. This role involves designing, maintaining, and optimizing scalable data infrastructures, overseeing data pipelines, and leveraging both relational and non-relational databases. The ideal candidate will also have expertise in stream processing technologies and modern engineering practices. As a leader, you will guide a team of talented engineers, ensuring our data systems meet the evolving needs of the organization.
Key Responsibilities:
Design, develop, and maintain robust, high-performance data infrastructures using relational databases (MSSQL).
Architect and optimize ETL workflows using Microsoft SSIS for efficient data integration and transformation.
Manage Python-based data workflows using orchestration tools like Apache Airflow.
Develop and maintain multidimensional and tabular models for advanced reporting and analytics using SSAS.
Work with non-relational databases such as MongoDB to support flexible and scalable data storage solutions.
Implement and manage real-time stream processing systems using Kafka for low-latency, high-throughput data pipelines.
Monitor and maintain the health and performance of data servers, including data storage, reporting, and related infrastructure.
Apply modern engineering practices such as version control (Git), containerization (Docker), and CI/CD pipelines to streamline development and deployment.
Collaborate with DevOps teams to ensure scalable and reliable infrastructure integration.
Troubleshoot and resolve data-related issues, ensuring data accuracy and integrity across systems.
Lead and mentor a team of two data engineers, fostering professional growth and alignment with team objectives.
Stay informed about emerging trends and technologies in data engineering to drive continuous improvement.
Essential Qualifications:
Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
6+ years of experience in data engineering with a focus on managing complex data infrastructures and pipelines.
Expertise in designing, managing, and optimizing relational databases.
Proficiency in ETL tools and orchestration frameworks like Apache Airflow.
Hands-on experience with non-relational databases such as MongoDB.
Strong knowledge of stream processing technologies such as Kafka for real-time data workflows.
Experience with server monitoring, scaling, and performance optimization.
Proficiency in Git, Docker, and CI/CD tools for modern development practices.
Familiarity with DevOps tools and practices, such as Kubernetes, Jenkins, or Terraform.
Proven leadership and mentoring abilities, with a track record of managing and developing team members.
Excellent problem-solving skills and the ability to thrive in a fast-paced, collaborative environment.
Preferred Qualifications:
Knowledge of big data technologies such as Hadoop, Spark, or Snowflake.
Exposure to machine learning workflows and tools.