At Schwarz IT Barcelona, we provide high-value IT services for the entire Schwarz Group, which includes Lidl, Kaufland, Schwarz Produktion, PreZero, Schwarz Digits, STACKIT, and XMCyber.
As part of a top 5 global retail company, we serve 6 billion customers through 13,700 stores in 32 countries, supported by over 575,000 employees.
We are looking for open-minded colleagues with a passion for technology who want to pursue diverse and exciting career opportunities in a dynamic work environment.
Elevate your career with us, where development and progress are at the heart of everything we do.
YOUR TASKS
- Collaborate with other teams to build data services for the ingestion, processing, and visualization of data product insights.
- Integrate cloud providers and third-party tools to give teams a holistic overview of their cloud costs, code quality, and software security.
- Provide essential platform services such as billing data ingestion, end-user configuration and management portals, data contract management, and data pipelines between services.
- Design, develop and implement data integration solutions supporting batch/ETL and API-led integrations that deliver tangible business value.
- Proactively assess the current state of our technology, identify gaps and overlaps, and capture the future-state technology vision in actionable, context-specific roadmaps.
- Develop policies for data quality, data security, data retention, and data stewardship, and proactively identify and address their impact on projects.
- Serve as an expert-level technical resource across multiple initiatives.
- Work in a team-based environment that includes a global workforce, vendors, and third-party contractors.
- Translate high-level business requirements into detailed technical specifications.
- Collaborate closely with peers, offering mentorship and fostering a knowledge-sharing environment.
- Continuously evaluate and advocate for advanced tools, technologies, and processes that drive industry best practices.
- Actively participate in our Agile development processes, contributing to team meetings and delivering incremental improvements.
- Collaborate with cross-functional teams to understand data requirements and deliver reliable data solutions.
- Monitor data pipeline performance, troubleshoot issues, and implement optimizations to improve efficiency.
- Assist in the design and development of APIs for seamless data integration across platforms.
YOUR PROFILE
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 5+ years of hands-on data engineering experience designing and building complex data pipelines supporting big data workloads (streaming, analytics, operational, data warehousing, etc.).
- 4+ years working with data and relevant computation frameworks and systems.
- 4+ years using Python or Java.
- 5+ years of experience writing SQL and optimizing queries.
- 3+ years of experience with cloud technologies, preferably GCP and Azure.
- 3+ years of experience working with Data Lake, Data Warehouse and Data Lakehouse architectures.
- Outstanding communication skills, coupled with strong problem-solving, organizational, and analytical abilities.
- Nice to have: Master’s degree in Computer Science or Engineering.
- Nice to have: Proven experience in a cloud computing environment, preferably GCP, Azure, AWS or similar.
- Nice to have: Practical development and data analysis experience using Pandas and/or PySpark.
- Nice to have: Experience with data platform technologies such as Databricks, Unity Catalog, and the Delta ecosystem.
- Nice to have: Advanced data stream processing: knowledge of technologies such as Apache Spark, Flink, or Kafka.
- Nice to have: Familiarity with orchestration and scheduling tools like Apache Airflow.
- Nice to have: Experience with Infrastructure as Code tools like Terraform, Pulumi or similar IaC languages.
- Nice to have: Experience with Agile data engineering principles and methodologies.
- Nice to have: Strong understanding of ELT/ETL methodologies and tools.
- Nice to have: Experience in data warehousing and familiarity with its concepts and terminologies.
- Nice to have: Capable of troubleshooting and conducting root cause analysis to address and resolve data issues effectively.
- Nice to have: Ability to analyze and develop physical database designs, data models, and metadata.
We look forward to receiving your application.
Schwarz IT Barcelona SL · Laura Hernandez Costa · Reference no. 47070
C/Bergara 13, floor 5, 08002 Barcelona
es.it.schwarz