Big Data Architect/Engineer
AIA (Hong Kong)
- Develop and maintain the Big Data Architecture for current, transitional and future states of conceptual, logical and physical deployments.
- Define and maintain the Enterprise Ingestion, Storage and Usage strategy, standards and policies (including data security and retention policies) for Big Data solutions.
- Work with the business to understand their requirements and define the Big Data solution that meets them.
- Oversee the design and implementation of the big data architecture within programs and projects as necessary while ensuring adherence to AIA architecture vision, standards and policies.
- Ensure consistency of the Architecture with the Information, Technical and Security architecture frameworks as part of the over-all Enterprise Architecture vision.
- Define standards, critical metrics and KPIs (key performance indicators) for monitoring, tuning and scaling data processing pipelines, along with the corresponding dashboards/alerts.
- Lead or participate in the due diligence process for new solution proposals or purchases by reviewing all design aspects.
- Share industry standards and best practices with the wider EA team and BUs; identify opportunities and provide guidance on emerging next-generation technologies and techniques, and on how they could impact, augment or support AIA's strategic vision.
- Research new technologies and drive quick POCs (proofs of concept) to assess feasibility.
- Perform other responsibilities and duties periodically assigned by the reporting manager to meet operational and/or other requirements.
- Minimum graduate degree in Computer Engineering, Computer Science or related field
- 6 to 12+ years of overall experience mandatory; a minimum of 3 to 5 years of experience in the field of Big Data desirable.
- Should have extensive hands-on experience in the design, deployment and integration of major Big Data components such as HDFS, MapReduce, Kafka, Spark, Pig, Hive, Flume, Sqoop, ZooKeeper, HBase, Cassandra, NoSQL stores, streaming pipelines, etc.
- Should have hands-on experience with at least one commercial Big Data distribution such as Cloudera, MapR or Hortonworks.
- Production-level hands-on experience with HDFS, MapReduce, Hive, Apache Spark, Java and other tools in the Big Data stack.
- Should have hands-on experience with distributed cluster administration and performance tuning.
- Should have hands-on experience with one or more databases such as Cassandra, Elasticsearch, Redis, MongoDB, etc.
- Solid knowledge of high availability design, concurrency and distributed systems.
- Should have an understanding of Big Data security and governance.
- Experience working with SaaS (Software as a Service) products in the cloud is a plus.
- Knowledge of software development methodologies including Agile, TDD and CI/CD.
- Knowledge of Unix scripting languages such as Bash, Perl, PHP or Python.
- Certifications in Big Data technologies and applications desired.
- Critical thinking skills to assess how Big Data capabilities can best be applied to complex business situations.
- Good communication skills and fluent English are mandatory; knowledge of other Asian languages is desirable.
- Willing to travel