Big Data Developer (CPT)
Recruiter on August 20, 2019
Job Type: Full Time
Job Category: Information Technology
Contract Type: Permanent
Cape Town, Western Cape, South Africa
Salary: Competitive Salary
Our client develops and supports software and data solutions across a variety of industries.
They want you to get ahead of the market and stay there. They offer a combination of plug-and-play products that can be integrated with existing systems and processes, and that can also be customised to client needs.
Their capabilities extend to big data engineering and bespoke software development; solutions are available both cloud-based and hosted.
• Must be able to pick up a new technology quickly and deliver features in a highly agile manner.
• Experience writing functional Scala in a production-grade system. You are not a Java developer writing OO code in Scala.
• Previous experience using Apache Spark SQL in a production system using Scala with YARN as the resource manager.
• Clear, practical understanding of how Hive works, including running Hive on Tez.
• Practical experience using Debian-flavoured Linux distributions.
• Familiarity with event-driven development and architectures.
• Previous experience using Docker containers to deploy your systems.
• Ability to easily navigate the administration of an HDP cluster on AWS.
• Must be able to index millions of documents from Hadoop into Elasticsearch.
• Ability to work with various messaging systems, such as Kafka and RabbitMQ.
• Must be able to aggregate data using Apache Kylin Cube.
• Must be able to easily pick up Python if you have not used it previously.
• Ability to operate and deploy to a Kubernetes cluster on AWS.
• Must understand basic concepts about mortgage-backed securities.
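To illustrate the functional-Scala requirement above (as opposed to Java-style OO written in Scala), here is a minimal, hedged sketch. The domain names (`Trade`, `netExposure`) are illustrative assumptions, not taken from the client's codebase: the point is immutable data, pure functions, and expression-oriented collection transforms rather than mutable state and loops.

```scala
// Hypothetical example of the functional style the role asks for.
// Immutable case class instead of a mutable bean:
final case class Trade(desk: String, notional: BigDecimal)

object Exposure {
  // A pure function: no vars, no side effects, just a pipeline of
  // collection transformations returning a new immutable Map.
  def netExposure(trades: List[Trade]): Map[String, BigDecimal] =
    trades
      .groupBy(_.desk)                        // group trades by desk
      .view
      .mapValues(_.map(_.notional).sum)       // sum notionals per desk
      .toMap
}
```

A Java-OO transliteration of the same logic would typically use a mutable `HashMap` and an imperative loop; the functional version expresses the aggregation as a single referentially transparent expression.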
• Building and operating a content management platform for a high-profile big data project that aims to revolutionise an area of finance by providing timely, unprecedented market insight.
• Working closely within a team of developers distributed across London, Johannesburg, Cape Town and New Zealand.
• Building and operating an ingestion and analytics platform that collects frequently changing data and exposes the normalised, aggregated data via APIs to our client's customers. This will include data cleansing, aggregation, and financial computations.