Data Architect - Riyadh, Saudi Arabia

Job description
You will be responsible for designing and optimizing big data and data warehouse architecture, as well as optimizing data flows and pipelines for cross-functional teams. You are a technical guru when it comes to selecting the right tools for implementing data ingestion, processing, and storage. Security, performance, scalability, availability, accessibility, and maintainability are your top priorities when designing data solutions. You have deep, broad, hands-on experience with the various technologies of the Hadoop ecosystem, NoSQL, RDBMS, data ingestion, and data processing.

Responsibilities:
- Provide thought leadership and drive the architecture and design of big data and data warehousing solutions.
- Clearly articulate the pros and cons of various technologies and platforms.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Design the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
- Document use cases, solutions, and recommendations.
- Work with various stakeholders, from program and project managers to solution and enterprise architects, on the design, planning, and governance of data project implementations.
- Design a strategy to keep data separated and secure across national boundaries through multiple data centers and regions.
- Identify ways to improve data reliability, efficiency, and quality.
- Design a strategy to ensure data availability, accessibility, scalability, and maintainability.
- Perform detailed analysis of business problems and technical environments and use it in designing solutions.
- Explore new opportunities introduced by new technologies and initiatives.
- Benchmark implemented solutions against solutions from other vendors.
- Build capabilities, guide the technical team, and push them to progress in their careers.

Qualifications:
- 9-12 years of experience in data warehousing and big data projects.
- Deep and broad experience with the Hadoop ecosystem, including HDFS, MapReduce, Hive, HBase, Impala, Kudu, Solr, etc.
- Hands-on experience with multiple NoSQL databases such as Cassandra, MongoDB, Neo4j, Elasticsearch, and the ELK stack.
- Experience with stream-processing systems: Storm, Spark Streaming, Flink, etc.
- Experience with real-time messaging platforms such as Kafka, Kinesis, etc.
- Advanced working SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with a variety of databases, including distributed relational databases such as SingleStore and Vitess.
- Experience building and optimizing big data pipelines.
- Strong analytic skills related to working with unstructured datasets.
- Experience with object stores such as MinIO and Ceph.
- Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Proven record of building highly available, always-on data platforms.
- Linux shell scripting; programming languages: Python, Java, Scala, etc.
- Fluent in English and Arabic.


Devoteam
Riyadh, Saudi Arabia
2022-01-14
2022-03-15
Not disclosed (AED)
Permanent Job, Full time
Job ID: 942680