Role: Big Data Architect
Rate/Salary: EUR 350/day
Location: Brussels, Belgium
Job type: Contract – 6 months
Total Yrs. of Experience*: 10+ years
Relevant Yrs. of Experience*: 4+ years
Detailed JD* (Roles and Responsibilities)
· Ability to architect and design solutions
· Ability to evaluate requirements in the context of the various Big Data components and articulate a recommendation
· Previous experience working with Data Warehouse, Big Data, and Cloud technologies
· Strong Agile background
· Experience with Continuous Integration and Continuous Delivery (CI/CD)
· Must be able to design and implement DevOps practices for version control, compliance, configuration management, build, release, and testing using Big Data technologies
· Excellent written communication skills
· Excellent analytical and problem-solving skills
· Experience in architecting and designing Data Lake solutions
· Design Data Lake deployments based on customer requirements, best practices, and the patterns and practices developed by Agile IT
· Implement the relevant deployment pattern and scale a release pipeline to deploy to multiple endpoints
· Experience with Big Data / Hadoop components: Spark (Batch and Streaming), Kafka, Storm, Hive, HDFS, MapReduce, Oozie, Pig, Flume
Architecture: Distributed, Client/Server, multi-tier, SOA, and Object-Oriented
If you’re interested in this role, forward an up-to-date copy of your CV to firstname.lastname@example.org or apply online