To apply, click on the link at the end of each post. All the best with your applications.
Senior Data Engineer (Pretoria)
Reference Number
E100001
Description
ADVERTISEMENT
Senior Data Engineer
Job Grade: P6
Ref No.: E100001
Location: Pretoria
Job Type: 5-year fixed-term contract
Job Purpose: The primary purpose of this role is to take a leading role in designing, implementing, and optimising CIPC’s data infrastructure to support strategic goals. This position is central to building scalable, high-performance data pipelines, driving data engineering best practices, and ensuring the delivery of robust data solutions that empower data scientists and analysts to generate critical business insights.
Key performance areas
1. Data Pipeline & System Development
● Pipeline Design and Implementation: Lead the design, build, and maintenance of scalable and secure data pipelines (batch and real-time) and databases to process complex, high-volume structured and unstructured data.
● ETL/ELT Development: Develop and optimize ETL (Extract, Transform, Load) processes to efficiently transform raw data into usable, high-quality formats for analysis and consumption.
● Infrastructure Optimization: Manage and optimize data warehousing solutions (e.g., Databricks, Snowflake, Redshift, BigQuery, Synapse) and implement and maintain data storage solutions, including SQL and NoSQL databases.
● Automation: Automate data processing tasks using frameworks like Apache Airflow and optimize deployment and orchestration to improve efficiency and reduce manual effort.
2. Data Quality, Performance & Compliance
● Performance Monitoring: Monitor data pipeline performance, troubleshoot issues promptly, and optimize data processing frameworks to handle increasing data volumes with low latency.
● Quality and Standards: Develop and enforce standards and best practices for data quality, security, documentation, and compliance across all data systems and processes.
● Architecture Contribution: Ensure data pipelines and databases are optimized for performance, security, availability, and scalability, and contribute actively to overall data architecture decisions.
3. Collaboration, Mentorship & Strategy
● Stakeholder Collaboration: Work closely with data scientists, data analysts, and business teams to understand their data needs, ensure data accessibility, and deliver solutions tailored for their analysis requirements.
● Technical Leadership: Serve as a technical leader, coach, and mentor for junior data engineers and adjacent data and engineering teams.
● Project Leadership: Lead end-to-end data engineering projects, including requirements gathering, technical deliverable planning, output quality control, and stakeholder management.
● Strategy Contribution: Contribute technical expertise to the development and evolution of the CIPC data strategy.
Minimum Functional Requirements (Technical Skills & Knowledge)
● Core Programming: Expertise in programming languages such as Python, Java, or Scala.
● SQL Mastery: Advanced proficiency with SQL and deep experience in database optimization techniques for high-volume data.
● Big Data & Distributed Systems: Strong hands-on experience with distributed systems and big data technologies, including Apache Spark, Hadoop, or Flink.
● Cloud Data Platforms: Strong knowledge of cloud-based data platforms and their services across major providers (AWS, Azure, and GCP).
● ETL/Orchestration Tools: Proven experience with ETL tools and orchestration frameworks (e.g., Apache Kafka, Apache Airflow, Apache Spark).
● Architecture Design: Experience in designing and implementing data architectures that specifically support large-scale data processing and machine learning initiatives.
● Soft Skills: Strong problem-solving and critical thinking skills, excellent interpersonal skills, and the ability to work effectively with cross-functional teams.
● Mentorship: Proven experience leading and mentoring junior data engineers.
Applicants may, as a step in the recruitment process, be subjected to competency assessment. In addition, the successful candidate must be prepared to undergo a process of security clearance prior to appointment.
Qualifications and SA citizenship checks will be conducted on the successful candidate. It is the applicant’s responsibility to have foreign qualifications evaluated by the South African Qualifications Authority (SAQA).
It will be expected of candidates to be available for selection interviews on a date, time and place as determined by CIPC.
CIPC is an equal opportunity, affirmative action employer. Preference will be given to candidates whose appointment will enhance representation in accordance with the approved employment equity plan.
Feedback will only be given to shortlisted candidates.
CIPC reserves the right not to fill an advertised position.
For further details regarding these positions, please click on the link: https://cipc.mcidirecthire.com/default/External/CurrentOpportunities or visit the CIPC website at www.cipc.co.za
Kindly note that faxed, emailed, posted and/or hand-delivered applications will not be considered.
Should you experience any difficulty in applying please contact the CIPC Recruitment Office by dialing: 087 743 7074, 7075, 7197 or 087 260 1554
Closing date: October 31, 2025
Requirements
Candidates must meet one of the following requirements:
Required Minimum Education / Training
Formal Education Pathway (Preferred):
- Education: Bachelor’s Degree / Advanced Diploma in Computer Science, Engineering, or a related technical field. Advanced degrees are a plus.
- Added Advantage Certifications: Holding one or more of the following advanced professional certifications is a strong advantage:
- Specialized Tools:
- Databricks Certified Data Engineer (Professional or Associate)
- Confluent Certified Developer for Apache Kafka
- Certified Kubernetes Administrator (CKA)
- Cloud Platforms:
- Google Cloud Certified Professional Data Engineer
- Microsoft Certified: Azure Data Engineer Associate
- AWS Certified Data Analytics – Specialty
- Data Management: Certified Data Management Professional (CDMP) (Practitioner or Professional level)
Required Minimum Experience
- Experience: At least 5 years (8 or more preferred) of proven experience as a Data Engineer or in a similar technical role, with a strong track record of building scalable data solutions.
Specialized Certification and Experience Pathway (Alternative):
- Education: Senior Certificate (NQF 4) and relevant technical certifications
- Mandatory Certifications: The candidate must hold a combination of at least two advanced professional certifications, which must include one specialized tool certification and one cloud platform certification:
- Specialized Tool Certification (MUST include one):
- Databricks Certified Data Engineer (Professional or Associate)
- Confluent Certified Developer for Apache Kafka
- Certified Kubernetes Administrator (CKA) or Certified Kubernetes Application Developer (CKAD)
- Cloud Platform Certification (MUST include one):
- Google Cloud Certified Professional Data Engineer
- Microsoft Certified: Azure Data Engineer Associate
- AWS Certified Data Analytics – Specialty
- Added Advantage Certification (Choose one more from any category):
- DAMA Certified Data Management Professional (CDMP) (Practitioner or Professional level)
- Any other certification listed above not yet counted.
Required Minimum Experience
- Experience: At least 10 years of proven experience as a Data Engineer or in a similar technical role, with a strong track record of building scalable data solutions.
Minimum Functional Requirements (Technical Skills & Knowledge)
- Core Programming: Expertise in programming languages such as Python, Java, or Scala.
- SQL Mastery: Advanced proficiency with SQL and deep experience in database optimization techniques for high-volume data.
- Big Data & Distributed Systems: Strong hands-on experience with distributed systems and big data technologies, including Apache Spark, Hadoop, or Flink.
- Cloud Data Platforms: Strong knowledge of cloud-based data platforms and their services across major providers (AWS, Azure, and GCP).
- ETL/Orchestration Tools: Proven experience with ETL tools and orchestration frameworks (e.g., Apache Kafka, Apache Airflow, Apache Spark).
- Architecture Design: Experience in designing and implementing data architectures that specifically support large-scale data processing and machine learning initiatives.
- Soft Skills: Strong problem-solving and critical thinking skills, excellent interpersonal skills, and the ability to work effectively with cross-functional teams.
- Mentorship: Proven experience leading and mentoring junior data engineers.
Click here to apply
Senior Data Architect (Pretoria)
Reference Number
D130001
Description
ADVERTISEMENT
Senior Data Architect
Job Grade: P4
Ref No.: D130001
Location: Pretoria
Job Type: 5-year fixed-term contract
Job Purpose: The primary purpose of this role is to lead and implement a comprehensive Data Governance Program and establish a robust, future-proof Data Architecture across multi-cloud platforms. This role is crucial to revolutionizing CIPC’s data management, enhancing data quality, streamlining analytics processes, and creating a centralized data hub, including a Data Marketplace, to foster a data-driven culture and enable advanced analytics across the organization.
Key performance areas
- Data Architecture Strategy & Design:
- Architectural Blueprint: Design and maintain the conceptual, logical, and physical data models, data dictionaries, and comprehensive data flow diagrams across the enterprise.
- Strategic Alignment: Develop forward-looking data strategies for data warehousing, data mining, and data integration that align with CIPC’s business objectives and technological roadmap.
- Tool Evaluation & Selection: Evaluate, recommend, and select appropriate data management tools, technologies, and platforms to ensure optimal performance, scalability, and cost-efficiency.
- Data Lakehouse Architecture:
- Cloud Architecture: Design and implement a secure, scalable, and highly available Data Lakehouse architecture across multi-cloud platforms (including Azure, AWS, and GCP services like Azure Databricks, AWS Redshift, Google BigQuery, and respective storage solutions).
- Data Pipeline Development: Establish robust, end-to-end data ingestion (batch and streaming) and ELT/ETL pipelines to incorporate diverse data sources into the Data Lakehouse effectively.
- Analytics Enablement: Develop optimized data transformation, processing, and storage solutions tailored for advanced analytics, reporting, and data science workloads.
- Data Governance & Management
- Governance Frameworks: Define and establish comprehensive data management frameworks for Data Ingestion, Data Lake, Data Warehouse, Data Sharing, Data Analytics, and Data Quality.
- Data Marketplace & Sharing: Design and implement a Data Marketplace architecture to facilitate governed data sharing and collaboration among internal stakeholders, fostering a data-driven culture.
- Quality & Observability: Define and implement data quality monitoring, validation, and improvement processes, including developing data observability solutions to track data health, usage, lineage, and compliance across the entire data ecosystem.
- Metadata Management: Utilize and champion Data Catalog tools for effective data discovery, metadata management, and lineage tracking.
- Collaboration, Guidance & Compliance
- Stakeholder Engagement: Collaborate proactively with IT, business leaders, and security teams to prioritize data requirements, define service level agreements (SLAs), and develop data governance policies and procedures.
- Mentorship & Support: Provide technical guidance and support to Data Analysts, Data Engineers, and Data Scientists, ensuring the effective and responsible leveraging of data assets.
- Regulatory Compliance: Ensure all data systems and processes comply with relevant regulations, standards, and security policies.
Applicants may, as a step in the recruitment process, be subjected to competency assessment. In addition, the successful candidate must be prepared to undergo a process of security clearance prior to appointment.
Qualifications and SA citizenship checks will be conducted on the successful candidate. It is the applicant’s responsibility to have foreign qualifications evaluated by the South African Qualifications Authority (SAQA).
It will be expected of candidates to be available for selection interviews on a date, time and place as determined by CIPC.
CIPC is an equal opportunity, affirmative action employer. Preference will be given to candidates whose appointment will enhance representation in accordance with the approved employment equity plan.
Feedback will only be given to shortlisted candidates.
CIPC reserves the right not to fill an advertised position.
For further details regarding these positions, please click on the link: https://cipc.mcidirecthire.com/default/External/CurrentOpportunities or visit the CIPC website at www.cipc.co.za
Kindly note that faxed, emailed, posted and/or hand-delivered applications will not be considered.
Should you experience any difficulty in applying please contact the CIPC Recruitment Office by dialing: 087 743 7074, 7075, 7197 or 087 260 1554
Closing date: October 31, 2025
Requirements
Required Minimum Education / Training
Candidates must meet one of the following requirements:
Formal Education Pathway (Preferred):
- Education: Bachelor’s Degree / Advanced Diploma in Computer Science, Computer Engineering, Data Analytics, Data Engineering, Statistics/Mathematical Science/Economic Sciences, or a relevant quantitative field.
Required Minimum Experience
- Experience: Minimum 10–15 years of proven experience as a Data Architect, Solutions Architect, or equivalent senior data leadership role, with a strong focus on cloud-native data architecture and governance.
- Cloud Platforms: Deep understanding of cloud services and technologies relevant to data management and analytics, including Azure, AWS, and Google Cloud.
- Technical Skills: Strong knowledge of data modeling, ETL/ELT processes, data warehousing, and data governance best practices. Proficiency in SQL, Python, Spark, and other relevant programming languages and tools.
Specialized Certification and Experience Pathway (Alternative):
Required Minimum Education / Training
- Education: National Diploma in Information Technology or a related technical field.
Required Minimum Experience
- Experience: Minimum 12–17 years of progressive experience in Data Governance, data cataloguing tools, Data Lakehouse implementation, Data Engineering, and Data Architecture, demonstrating a career trajectory that has built the equivalent strategic and leadership capabilities of a Bachelor’s degree holder.
- Mandatory Certifications: The candidate must hold a combination of at least two advanced professional certifications, which must include one strategic framework and one platform-specific architect certification:
- Strategic Architecture/Governance (MUST include one):
- TOGAF Certification (e.g., TOGAF 9 Certified or equivalent)
- Certified Data Management Professional (CDMP) (at Professional or Master level)
- Cloud Platform/Engineering (MUST include one):
- Databricks Certified Data Engineer (Associate or Professional)
- Microsoft Certified: Azure Solutions Architect Expert / Azure Data Engineer Associate
- Google Cloud Certified Professional Data Engineer / Professional Cloud Architect
- AWS Certified Solutions Architect – Professional / Data Analytics – Specialty
Minimum Functional Requirements (Technical Skills & Knowledge)
● Deep Data Modeling Expertise: Expert-level proficiency in designing and implementing dimensional (Star/Snowflake), Data Vault, and Relational data models, with a clear understanding of the trade-offs between them.
● Cloud Data Platform Mastery: In-depth knowledge and proven experience designing and deploying scalable data solutions using at least two major cloud platforms (Azure, AWS, or GCP), including their respective data warehousing, data lake, and compute services.
● Big Data Ecosystem: Strong hands-on experience with Apache Spark (PySpark/Scala) for large-scale data processing and experience with modern messaging/streaming technologies (e.g., Kafka).
● Data Governance: Proven ability to implement technical solutions for Metadata Management, Data Lineage, Data Quality (DQ), and Data Observability.
● Advanced SQL & Programming: Expert proficiency in writing complex, optimized SQL queries and extensive experience in a programming language like Python or Scala for pipeline development.
● Architecture Frameworks: Practical experience applying Enterprise Architecture methodologies and frameworks, such as TOGAF, to data initiatives.
● Security & Access Control: Expertise in designing and enforcing fine-grained data access controls, encryption, and data masking techniques across cloud data stores.
● Technical Skills: Strong knowledge of data modeling, ETL/ELT processes, data warehousing, and data governance best practices. Proficiency in SQL, Python, Spark, and other relevant programming languages and tools.
● Soft Skills: Excellent communication, collaboration, and problem-solving skills. Ability to work independently and lead a team of technical resources.
Click here to apply
ICT Service Manager (JG P7) (Pretoria)
Reference Number
S20001
Description
ADVERTISEMENT
ICT Service Manager (JG P7) Ref No: S20001
Employment type: Permanent
Job Purpose: Proactively manage ICT service calls and act as a link between ICT and the business.
Required Minimum education/training
- 3-year ICT National Diploma or degree (NQF 6 or higher)
- ITIL training/certification
- Relevant technical certifications will be an added advantage.
Required Minimum work experience:
- 5 years working experience in ICT support services.
- One year experience as ICT desktop support/services team leader or project manager
- Experience in other ICT services and infrastructure management areas
- Exposure to frameworks such as ITIL, ITSM or COBIT
- Experience with business service management solutions
- Experience in leading ICT desktop support projects.
- Experience with desktop support services reporting.
- Manage the provision of desktop support services in terms of agreed service levels
Key performance areas:
- Ensure ICT governance over the desktop environment.
- Provide reports and escalations to Senior Manager
- Act as a link between ICT and business
- Proactive monitoring and management of service levels with the business
Background verification, including criminal record and citizenship checks, will be conducted on the successful candidate. It is the applicant’s responsibility to have foreign qualifications evaluated by the South African Qualifications Authority (SAQA).
CIPC is an equal opportunity, affirmative action employer. Preference will be given to candidates whose appointment will enhance representivity in accordance with the approved employment equity plan.
Correspondence will be limited to shortlisted candidates only. It will be expected of candidates to be available for selection interviews on a date, time and place as determined by CIPC. If you do not hear from us within two months of the closing date of this advertisement, please accept that your application was unsuccessful.
CIPC reserves the right not to fill an advertised position.
Kindly note that faxed, emailed, posted and/or hand-delivered applications will not be considered.
To apply, click this link: https://cipc.mcidirecthire.com/default/External/CurrentOpportunities or visit the CIPC website at www.cipc.co.za
Should you experience any difficulty in applying, please contact the CIPC Recruitment Office at 087 743 7197 / 7075 / 087 260 1554
Closing date: 31 October 2025
Requirements
Required Minimum education/training
- 3-year ICT National Diploma or degree (NQF 6 or higher)
- ITIL training/certification
- Relevant technical certifications will be an added advantage.
Required Minimum work experience:
- 5 years working experience in ICT support services.
- One year experience as ICT desktop support/services team leader or project manager
- Experience in other ICT services and infrastructure management areas
- Exposure to frameworks such as ITIL, ITSM or COBIT
- Experience with business service management solutions
- Experience in leading ICT desktop support projects.
- Experience with desktop support services reporting.
- Manage the provision of desktop support services in terms of agreed service levels
Click here to apply
We wish you all the best with your applications