Our client is looking to build a data, analytics and data science platform that enables them to deliver an exceptional user experience. You must be experienced in AWS cloud services, DevOps, large datasets, SQL/NoSQL environments and the Hadoop ecosystem.
What you'll be doing:
Promote AWS and cloud best practices to maximise compute performance while minimising infrastructure costs.
Design and develop an ‘infrastructure as code’ platform.
Keep up to date with AWS and other cloud providers.
Manage and run an enterprise-level DevOps team.
Work with different data types, e.g. streaming, real-time, file-based, RDBMS, unstructured data, etc.
Work in an agile environment, organising stand-ups, retrospectives, planning sessions, showcases and other agile ceremonies.
Work closely with scrum masters, asset designers and the data consulting team to determine platform requirements and priorities.
Resolve technical barriers and roadblocks impacting delivery teams.
Track delivery outcomes against key performance indicators and provide reporting to the Head of Advanced Analytics Enablement.
Work with analysts and data scientists on bespoke problems requiring engineering support.
Showcase work to stakeholders.
What you'll need:
Experience with ‘infrastructure as code’, e.g. Terraform, Ansible.
Familiarity with AWS services.
Experience with shell scripting.
Exposure to DevOps and Agile.
Management of analytical or data-focused projects/teams.
Tertiary qualification in a relevant technical subject.
Excellent communication and interpersonal skills, both oral and written.
Ability to assess and implement new technologies and processes.
An open mindset and proven ability to innovate and influence.
Experience with data ingestion technologies, and with capturing metadata and data lineage.
Exposure to Hadoop ecosystem technologies, e.g. Spark, PySpark, Kafka, Hive, Flume, Hue, Sqoop, etc.
Experience with configuring secure environments with AD groups, SSO, SAML, etc.
If the above sounds like you, click APPLY NOW or phone Richard on (03) 8637 7314 for more information.