Lead Data Engineer (offshore) (Job Code : J47768)  

 Job Summary
Experience: 4.00 - 8.00 Years
Role: Lead Data Engineer (offshore)
Educational Level / Stream of Study: BE-Comp/IT, BE-Other, BTech-Comp/IT, BTech-Other
Industrial Type: IT-Software/Software Services
Functional Area: IT Software - Application Programming / Maintenance
Key Skills: SQL AND Python AND Spark AND (Dagster OR Airflow)
Job Post Date: 2024-05-09 13:10:12

 Company Description
Our client offers a cloud-based CDR (Consumer Data Right) solution for data holders. They also apply their deep knowledge of Australian energy market technology to build data extraction and transformation logic directly from the customer's CRM or billing system – saving hundreds of hours of technical effort and leaving customers free to focus on value-adding projects instead of CDR compliance.

 Job Description
- The client is looking for a technical delivery leader for platform and project builds: building and maintaining optimised, highly available data pipelines that facilitate data management and data analytics solutions.
- The role will drive the development of data processing frameworks and architectural approaches that scale to handle the business's growing demands.
- This hands-on role requires technical expertise to deliver data and analytical services, including data ingestion, transformation, storage and reporting, and strives to continuously develop new and improved data engineering capabilities.

Key accountabilities:
- Design data solutions to meet business, technical and user requirements, including building modern data pipelines that meet functional/non-functional business requirements and provide end-to-end data solutions
- Participate in proofs-of-concept on platform innovation, and effectively transition and scale those concepts into production through engineering, deployment and commercialisation
- Contribute to ongoing management of the platform, including platform enhancements, capacity management, performance monitoring, troubleshooting and resolution of technical issues
- Identify, design and implement process improvements: automating manual processes, optimising data delivery, and re-designing infrastructure for greater scalability
- Take technical ownership of data and data pipelines to ensure compliance with data standards, architectural standards, and achievement of business requirements
- Build a strong understanding of the platform architecture and the best way to leverage it to achieve effective outcomes for a given project
- Support compliance with data, information and security management requirements

Technical qualifications & skills:
- Experience in building and maintaining real-time data pipelines.
- Expected experience using the following tech stack (preferred but not mandatory):
  - Kubernetes and Terraform
  - CI/CD - GitHub Actions, CircleCI
  - Logging - Datadog / CloudWatch
- Prior work with stream-processing systems (e.g., Kafka, Kinesis).
- Exposure to Trino, Dagster, Cube.js, Metabase/Looker will be advantageous.