You will play a lead role in driving the next evolution of the Enterprise's Big Data and Analytics platform in a hybrid AWS and GCP environment.
- Globally Recognised Financial Services Firm
- Opportunities for both contract and permanent roles
- Three-year minimum Big Data re-platform project
You will be working for a globally recognised enterprise investment bank and financial services firm based in Sydney CBD.
You will join an experienced engineering team as a Lead Data Engineer within a DevOps environment, driving the next evolution of the Enterprise's Big Data and Analytics platform across hybrid AWS and GCP infrastructure. You will bring significant hands-on experience in building, implementing, and enhancing enterprise-scale data platforms. The position carries end-to-end accountability for developing, deploying, and supporting data assets, as well as creating templates and implementation methods and setting standards.
You will have:
- Experience with ETL tools (such as DataStage or Informatica), data warehouses, data lakes, and reporting tools
- Strong Big Data and Hadoop experience, with a main focus on Spark, Scala, Hive, Presto (or other query engines), and big data storage formats such as Parquet, ORC, and Avro
- Experience in solution architecture and high-level design, with the ability to present options, recommendations, estimates, and technical plans
- Experience working in a DevOps model within an Agile environment
- Prior working experience with AWS, including any or all of EC2, S3, RDS, EMR, and Athena
- Financial services domain experience is advantageous