Are you seeking new challenges and a place where you can enjoy a close-knit environment while constantly learning?
Welcome to a team with a clear purpose: "TRANSFORM people's lives by being the most reliable technology ally"!
Get ready and join this adventure!
What Will You Find?
Technical and personal challenges that will keep you constantly growing.
A connected team focused on your physical and mental well-being.
A fresh, collaborative culture of continuous improvement, with learning opportunities and people willing to support you.
KaizenHub, a programme designed to enhance your talents, with feedback, mentoring, and coaching through Sofka U. It'll be both a challenge and a game!
Programmes like Happy Kaizen and WeSofka that look after your physical and emotional well-being.
What Are We Looking For?
We are seeking a skilled Azure Databricks Engineer with expertise in designing, implementing, and optimising big data solutions using Azure Databricks.
This professional will be responsible for leveraging data to drive insights and innovation, ensuring efficient data processing and integration across cloud environments.
Key Responsibilities:
- Design and deploy scalable data pipelines using Azure Databricks (a minimal sketch follows this list).
- Integrate data from a variety of sources, ensuring seamless data flow.
- Perform data analysis and transform raw data into business-ready formats.
- Optimise data processing workflows for better performance and cost-efficiency.
- Ensure high data quality and implement data governance best practices.
- Collaborate with cross-functional teams to align data strategies with business objectives.
- Troubleshoot and resolve issues related to data ingestion and processing.
- Document processes, architecture, and system configurations.
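To make the pipeline responsibility concrete, here is a minimal PySpark sketch of the kind of work involved: ingest raw JSON from Azure Data Lake Storage, clean it, and persist it as a Delta table. The storage account, container, paths, and table names are hypothetical placeholders, not details of any real project.

```python
# Illustrative sketch only: the storage account, container, and table
# names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# In a Databricks notebook a SparkSession already exists as `spark`;
# building one here keeps the sketch self-contained.
spark = SparkSession.builder.getOrCreate()

# Ingest raw events from Azure Data Lake Storage Gen2
raw = spark.read.json("abfss://raw@examplestorage.dfs.core.windows.net/events/")

# Transform: parse timestamps and drop malformed rows
clean = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))
       .filter(F.col("event_ts").isNotNull())
)

# Persist as a Delta table for downstream consumers (e.g. Synapse, BI tools)
clean.write.format("delta").mode("overwrite").saveAsTable("analytics.events_clean")
```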
Technical Requirements:
- Proven experience with Azure Databricks, PySpark, and Big Data technologies.
- Strong understanding of Azure services such as Data Lake Storage and Synapse Analytics.
- Proficiency in programming languages like Python or Scala.
- Familiarity with data modelling, ETL processes, and data warehousing concepts.
- Experience with CI/CD pipelines for automated deployments.
- Knowledge of data security protocols and practices.
- Ability to optimise and troubleshoot complex data systems (see the optimisation sketch after this list).
- Exposure to Machine Learning models and their integration is a plus.
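As a hedged illustration of the optimisation skills listed above, the sketch below broadcasts a small lookup table to avoid a shuffle-heavy join, then partitions the output so downstream reads can prune files. Every table and column name (events_clean, regions, region_id, event_date) is a hypothetical placeholder.

```python
# Illustrative sketch only: all table and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

events = spark.read.table("analytics.events_clean")  # large fact table
regions = spark.read.table("reference.regions")      # small lookup table

# Broadcast the small side so each executor joins locally and skips the shuffle
enriched = events.join(F.broadcast(regions), on="region_id", how="left")

# Partition output by date so downstream queries read only the files they need
(enriched.write.format("delta")
         .mode("overwrite")
         .partitionBy("event_date")
         .saveAsTable("analytics.events_enriched"))
```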
Apply and Be Part of Our Story!
Conditions
Permanent contract - We aim for long-term relationships and for you to be part of our family for a long time!
Looking for professional growth? You can design your own career plan in line with your aspirations!