Cyber Security - Big Data Engineer
We are looking for a Big Data Engineer (m/f)
What are my responsibilities?
- Analyse complex Cyber Security requirements and transform them into Data Science use cases
- Implement the above Data Science use cases on top of an AWS-based Big Data platform
- Assess and translate functional, non-functional, and operational requirements into detailed architecture
- Develop and extend an AWS-based streaming platform with in-stream Machine Learning and scoring; integrate advanced forensic tools as well as Data Science tools
- Interface with external entities in an operational environment
- Display a high level of critical thinking and bring successful resolution to high-impact, complex, and/or cross-functional problems
What do I need to qualify for this job?
- BS/BA degree in a related discipline (MS is an advantage), or an equivalent combination of education and experience. Disciplines may include Data Science, Computer Science, Computer Engineering, Mathematics, Statistics, and Physics. Certification may be required in some areas
- Typically 2-5 years of work experience in a related field, with demonstrated ability (or clear potential) to perform the key responsibilities presented above. An advanced degree may be substituted for experience, where applicable
- AWS Certified Solutions Architect or Developer (Associate level or higher) is an advantage
- A background in information security is an advantage
- Demonstrated ability to learn in a fast-paced environment
- Strong English written and verbal communication skills
- Deep knowledge of AWS Big Data technologies
- Knowledge of Data Science, preferably with experience in log file analysis and NLP
- Knowledge of some or all of the following Data Science methods and frameworks: Deep Learning, TensorFlow, MXNet, XGBoost, Random Forests, K-Means, DBSCAN, PCA, etc.
- Knowledge of some or all of the following AWS technologies: SageMaker, Amazon ML, Kinesis, Firehose, Glue, Lambda, Athena, S3, Glacier, VPC, Load Balancing, API Gateway, CodeCommit, CodeBuild, CodeDeploy, CodePipeline, IAM, Direct Connect, KMS, EMR, CloudWatch, CloudFormation
- Experience with ETL using Spark and Glue
- Knowledge of some or all of the following technologies: Spark, Kafka, Flink, Git
- Experience with all or some of the following programming languages: Python, Java, Scala
- Experience with all or some of the following operating systems: Linux (Ubuntu, CentOS, Amazon Linux, Red Hat), Windows
Job ID: 104883
Organization: Corporate Technology
Experience Level: Experienced Professional
Job Type: Full-time