Data Warehouse Architect (DBN)

We Want You:

Are you a data visionary with a passion for building high-performance systems? BET Software is looking for a dynamic and forward-thinking Data Warehouse Architect to join our Team. In this role, you’ll be responsible for designing and implementing scalable, cutting-edge data architectures that power our business decisions and drive growth. Your expertise in modern data warehousing, real-time data streaming, and distributed computing will help shape the future of our data infrastructure. Collaborating with cross-functional Teams, you’ll ensure data integration, governance, and security are top-notch. Ready to take your skills to the next level and make a lasting impact? Apply now and help us unlock the potential of data at BET Software!

You Bring:

  • At least 8 years in a technical role with experience in data architecture, data warehousing, and data engineering.
  • 3-5 years’ experience working with Apache Kafka and real-time data streaming.
  • Strong experience with data warehousing solutions (e.g., Amazon Redshift, Snowflake, Google BigQuery).
  • Expertise in SQL performance tuning, database optimisation, and complex query development.
  • Experience with big data technologies such as Hadoop, Spark, Hive, and Presto.
  • Deep understanding of distributed data processing frameworks and parallel computing techniques.
  • Strong experience with ETL/ELT processes, data ingestion frameworks, and transformation logic.
  • Strong understanding of data governance, security, and compliance best practices.
  • Proficiency in programming languages such as Python, Java, or Scala for data processing.
  • Experience in designing and implementing high-throughput, low-latency data architectures.
  • Strong problem-solving and analytical skills with attention to detail.
  • Experience working in high-volume, complex data environments.

What You’ll Do:

Strategy, Objectives and Execution

  • Design and implement scalable, high-performance data warehouse architectures that support analytical and operational workloads.
  • Develop and implement long-term technical roadmaps for data management, integration, and processing.
  • Lead evaluations and recommend best-fit technologies for real-time and batch data processing.
  • Ensure that data solutions are optimised for performance, security, and scalability.
  • Identify and resolve bottlenecks in data design and system performance.
  • Develop and maintain data models, schemas, and architecture blueprints for relational and big data environments.
  • Ensure seamless data integration from multiple sources, leveraging Kafka for real-time streaming and event-driven architecture.
  • Facilitate system design and review, ensuring compatibility with existing and future systems.
  • Optimise data workflows, ETL/ELT pipelines, and distributed storage strategies.


Technical Expertise

  • Architect, build, and maintain Kafka-based streaming platforms for real-time data ingestion, processing, and analytics.
  • Design and implement data lakes and data warehouses.
  • Develop and enforce data governance policies, ensuring high-quality and secure data management.
  • Ensure compliance with data security, privacy, and regulatory standards.
  • Utilise advanced SQL query optimisation techniques, indexing strategies, partitioning, and materialised views to enhance performance.
  • Work extensively with relational databases (PostgreSQL, MySQL, SQL Server) and big data technologies (Hadoop, Spark).
  • Design and implement data architectures that efficiently handle structured and unstructured data at scale.
  • Optimise data pipelines and ETL/ELT processes using tools such as Apache Kafka and Spark Streaming.
  • Develop automated workflows for data extraction, transformation, and loading (ETL/ELT) across disparate data sources.
  • Leverage distributed computing technologies to process and analyse large datasets efficiently.
  • Implement best practices for microservices-based data architecture and containerised solutions.


Stakeholder Management

  • Collaborate with key internal and external stakeholders to manage expectations and resolve technical issues.
  • Ensure operational communication is effectively documented and shared across relevant Teams.
  • Present architectural strategies, progress, and recommendations to senior leadership and stakeholders.
  • Translate complex technical concepts into understandable terms for non-technical stakeholders.


Continuous Improvement and Innovation

  • Drive adoption of new technologies and methodologies in data warehousing, big data, and streaming analytics.
  • Challenge the status quo to find new efficiencies and innovations in large-scale data processing.
  • Conduct performance tuning, capacity planning, and scalability assessments for data systems handling petabyte-scale datasets.
  • Research and implement emerging technologies in distributed computing, data warehouses, and real-time analytics.

Living the Spirit:

  • Foster an inclusive and collaborative engineering culture.
  • Encourage open communication and knowledge-sharing within the team.
  • Promote continuous learning and innovation within the data architecture space.
  • Lead by example and demonstrate integrity in all professional interactions.


The Company We Keep:

At BET Software, we don’t just recruit talent, we cultivate it. Our learning and development programmes, our various opportunities for growth, and our well-deserved incentives are what keep our All-Star Team the best amongst the rest.


Please note that only Team members who meet the stipulated minimum requirements will be considered.

Should you not be contacted within 30 days, kindly consider your application unsuccessful.
