Job
- Level: Experienced
- Job Field: Data
- Employment Type: Full Time
- Contract Type: Permanent employment
- Location: Berlin
- Working Model: Onsite
Your role in the team
- **Advanced Data Architecture Design:** Design and implement scalable, reliable data architectures, leading initiatives in data modeling, data warehousing, and data lake development. Build on modern data stacks such as AWS, Databricks, and Snowflake, ensuring the architecture supports current and future business analytics, AI, and machine learning initiatives
- **Advanced Data Engineering:** Apply expertise in big data technologies, real-time data processing, and cloud-based systems to enhance our data capabilities. Implement data pipelines, ETL processes, and data storage solutions
- **Expertise in ML/AI Tooling:** Utilize tools such as AWS SageMaker, Databricks MLflow, and other advanced ML/AI technologies to facilitate data processing and analysis. Implement and oversee machine learning pipelines and data science workflows
- **Team Mentorship and Collaboration:** Act as a mentor to junior data engineers, fostering a culture of technical excellence. Collaborate with cross-functional teams, including data scientists, analysts, and IT professionals, to align data engineering efforts with organizational goals
- **Innovative Solutions and Best Practices:** Stay at the forefront of data engineering trends. Introduce innovative solutions and best practices in data processing, storage, and analytics. Evaluate and recommend new technologies to enhance our data capabilities
- **Data Quality and Governance:** Establish and maintain high standards for data quality and integrity. Implement data governance frameworks and ensure compliance with data privacy and security regulations
- **Performance Optimization:** Monitor system performance, identify bottlenecks, and implement solutions to optimize data flow and storage
Our expectations of you
Qualifications
- **Technical Expertise:** Proficiency in big data technologies (e.g., Hadoop, Spark), database management systems (SQL and NoSQL), cloud services (e.g., AWS, Azure, GCP), and programming languages (e.g., Python, Scala, Java). Advanced skills in AWS, Databricks, Snowflake, and other modern data tools, along with experience in real-time data processing and cloud-based systems
- **Leadership Skills:** Strong leadership and team-building capabilities. Ability to mentor and develop technical teams
- **Problem-Solving:** Exceptional problem-solving skills and the ability to work on complex systems and challenges
- **Communication:** Excellent communication and interpersonal skills, with the ability to engage effectively with technical and non-technical stakeholders
- Strong programming skills in Python and SQL
- Knowledge of distributed systems and the Spark distributed data processing engine
- Understanding of data security and privacy regulations and how to ensure data quality, consistency, and accessibility
- Proficiency in designing, building, and maintaining data warehousing solutions based on Snowflake and the Databricks Lakehouse Platform
- Expert understanding of dimensional data modeling techniques
- Data governance and management skills, such as defining and enforcing data quality standards, data contracts, data lineage, and data access policies, as well as ensuring data security and compliance
- Expert knowledge of the AWS platform and cloud computing principles
- Strong infrastructure management skills, such as provisioning, configuring, and maintaining data servers, clusters, and networks, as well as automating and optimizing data workflows and processes
- Ability to design, plan, drive, and document major architectural changes and propose innovative solutions for data engineering problems
Experience
- **Experience:** 8+ years of experience in data engineering with a demonstrated track record in designing and managing large-scale data systems. Experience in leading data engineering teams is essential
- Experience with ETL tools and pipelines, such as Airflow, that support data ingestion, processing, storage, and delivery
Benefits
- Work-Life Integration
- Food & Drink
This is your employer
Babbel
Founded in 2007, Babbel is the most used and most effective language learning app in the world. No small achievement, and no small challenge. At a time when walls are being talked about, we build bridges, making the language learning journey as exciting and empowering as possible and helping people make new connections and participate in worlds bigger than their own.
Description
- Company Type: Established Company
- Working Model: Full Remote, Hybrid, Onsite
- Industry: Education System