We are looking for a skilled Data Architect to join our team, where you will take charge of designing and implementing our data architecture and solutions. In this position, you will develop data models, oversee data pipelines, and manage the architecture to guarantee data accessibility, reliability, and scalability. Your role will be pivotal in advancing our data strategy, empowering data-driven insights, and supporting strategic decision-making across the organization.
Responsibilities:
- Architect and Design: Develop and implement robust, scalable data architecture solutions in AWS to support business intelligence, analytics, and data science initiatives.
- Data Management: Define and maintain data architecture frameworks, standards, and principles, including data modeling, metadata management, data security, and compliance.
- ETL/ELT Pipelines: Design, implement, and optimize data pipelines for efficient data ingestion and transformation using AWS services such as Glue, Lambda, Step Functions, and EMR.
- Data Storage and Processing: Architect data lakes and data warehouses using AWS technologies such as S3, Redshift, Athena, and DynamoDB, ensuring high performance and reliability.
- Data Governance: Establish data governance practices and ensure compliance with data privacy regulations and security standards.
- Performance Tuning: Optimize queries, storage, and data processing performance while managing costs in a cloud environment.
- Collaboration: Work closely with data engineers, data scientists, and business analysts to understand data requirements and ensure seamless integration and accessibility.
- Innovation: Stay current on emerging data architecture trends and AWS advancements to continuously improve our data platforms.

Requirements:
- Experience: 5+ years as a Data Architect or in a similar role, with extensive experience in AWS cloud services.
- Technical Expertise: Strong proficiency with AWS services such as S3, Redshift, Glue, Kinesis, RDS, Lambda, IAM, and VPCs.
- Data Modeling: In-depth knowledge of data modeling, data integration, and data warehousing principles.
- Big Data Technologies: Familiarity with big data processing frameworks such as Apache Spark or Hadoop.
- Database Experience: Hands-on experience with both relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., DynamoDB, MongoDB).
- Programming Skills: Proficiency in SQL and one or more programming languages such as Python, Java, or Scala.
- Data Security: Understanding of data security practices and compliance requirements (e.g., GDPR, CCPA, or regional regulations such as Saudi Arabia's data protection requirements overseen by SDAIA).
- DevOps Practices: Knowledge of CI/CD, infrastructure as code (e.g., AWS CloudFormation, Terraform), and containerization technologies such as Docker and Kubernetes.
- Analytical Skills: Strong analytical and problem-solving skills with a keen eye for detail.
- Communication: Excellent communication and documentation skills to articulate data strategies and solutions to both technical and non-technical stakeholders.