(Senior) Data/DataOps Engineer (m/f/d)

Uniper SE

Job Description

At Uniper, we proactively transform the world of energy while ensuring the security of energy supply. As an internationally operating company, we work in very diverse teams with the greatest possible working time flexibility for our employees. Our corporate culture is characterized by equal opportunities, mutual appreciation, and respect. With us, you will be able to develop new business models, work on technological solutions for a modern, sustainable, and future-oriented energy supply, as well as proactively help shape changes. Interested? We look forward to meeting you!


Our Commercial Technology Team in Düsseldorf is looking for you!

We are Uniper

Join our high-impact, results-driven Platform Engineering team where your work directly influences our trading operations. You'll have the opportunity to work with cutting-edge tools and platforms, tackle complex challenges in energy trading, and see the tangible outcomes of your contributions. Our commitment to leveraging the latest technologies ensures you'll continually expand your skillset while driving significant business results.

Position Summary

As the (Senior) Data/DataOps Engineer, you will be responsible for designing, developing, and maintaining scalable data pipelines, managing data integration, and ensuring robust data quality and governance to enhance and support our systematic trading activities. This role is critical to ensuring the efficiency, reliability, and scalability of our core data infrastructure, directly supporting our strategic trading initiatives. The ideal candidate has a passion for creating high-quality software, loves working with data in all its shapes and forms, enjoys solving complex problems, brings expertise in time series data, and is adept at working with modern data platforms and tools.

Your Responsibilities

  • Gain a deep understanding of data requirements related to Uniper's energy trading business and translate them into technical designs
  • Design, develop, and maintain scalable data pipelines for both batch and streaming data, ensuring data quality and consistency to support data analytics initiatives
  • Develop and maintain robust data models that optimally support workloads of Uniper's analysts and traders, ensuring data is well structured and easily accessible
  • Seamlessly integrate market and other data from various sources, including internal and external data feeds, APIs, databases, and streaming services
  • Implement data quality checks, validation processes, and monitoring solutions; maintain data governance and security standards across all data domains
  • Manage and maintain an up-to-date data catalog using tools like Collibra to ensure metadata is accurately documented and accessible
  • Develop and implement automation scripts and workflows to enhance efficiency and reduce manual intervention in data processing
  • Monitor and optimize the performance of data pipelines, ensuring efficient data processing and minimizing disruptions
  • Collaborate cross-functionally with traders, analysts, software engineers, and other stakeholders to understand data requirements and ensure that solutions are aligned with business needs
  • Leverage tools and platforms including Azure, Databricks (utilizing Unity Catalog & Delta Lake), Snowflake, PostgreSQL, TimescaleDB, Kafka, and Flink with a strong focus on time series data
  • Stay updated on and experiment with emerging technologies like Delta Lake and Apache Iceberg to continuously enhance our lakehouse architecture

Essential Qualifications:

  • Proven expertise in Data/DataOps Engineering, Data Management, Data Architecture, or related roles, demonstrated by a strong track record of building and maintaining scalable data pipelines, integrating diverse data sources, and ensuring high data quality and governance
  • Strong knowledge of software engineering best practices, object-oriented concepts, and the ins and outs of data-focused development
  • Proficiency in Python, SQL, and data processing frameworks like Apache Spark
  • Experience with cloud platforms (e.g., Azure) and tools like Databricks, Snowflake, PostgreSQL, and Kafka
  • Expertise in handling both event-based and aggregated time series data
  • Strong understanding of modern data governance frameworks, data modeling, data architecture, and OLAP and OLTP systems, as well as their application in dynamic environments

Preferred Qualifications:

  • Previous experience in an operational data team, preferably in energy trading or a related field
  • Experience with DevOps techniques, including CI/CD and infrastructure-as-code
  • Experience with Delta Live Tables (DLT) and/or dbt is a plus
  • Familiarity with data cataloging and quality monitoring solutions (e.g., Collibra)

Soft Skills and Cultural Fit:

  • Excellent communication and collaboration skills to work effectively with technical teams and business stakeholders
  • Strong analytical thinking and problem-solving abilities
  • A passion for continuous learning, innovation, knowledge sharing, and driving excellence in data engineering
  • Ability to work effectively in a cross-functional, fast-paced environment

Additional Requirements:

  • Fluency in English (verbal and written)
  • Willingness to participate in on-call duties in the future, as the team evolves to support 24/7 operational needs
