Sr. Data Architect, AWS Data Lake and Advanced Analytics

  • Edwards Lifesciences
  • Irvine, CA, USA
  • Jul 20, 2020
Full time Health Care

Job Description

Edwards Lifesciences is hiring a Sr. Data Architect, AWS Data Lake and Advanced Analytics, highly skilled in AWS cloud computing, to support our Enterprise Data Office vision. The Enterprise Data Office (EDO) is committed to enabling business outcomes through advanced analytics and algorithm governance. The Sr. Data Architect will establish and maintain enterprise-level data architecture through the stages of planning, design, execution, and operation. In this role you will work on Data Analytics, Data Warehousing, and/or Hadoop/Data Lake projects and be responsible for solving complex, high-scale (billions of records) data challenges. You will partner with data scientists, data engineers, solution architects, business systems analysts, and developers. In addition, you will provide expert-level consultation on data management and technology decisions, while developing and promulgating the data lake architecture standards and practices necessary to realize and sustain systems.

Responsibilities:

  • Lead efforts to introduce, re-engineer and optimize AWS cloud data platform processes and systems by assessing business needs and architecting, proposing and implementing data management solutions.
  • Develop and design models for complex analytical and data warehouse systems, including performing tasks related to database design, data analysis, data quality, metadata management, and support.
  • Collaborate with business process leaders to introduce and maintain data platforms, algorithm governance and data security capabilities and roadmap.
  • Collaborate with data scientists, data engineers, and solution architects to implement and support various AWS services such as Elastic Compute Cloud (EC2), AWS Data Pipeline, S3, DynamoDB, Relational Database Service (RDS), Athena, Aurora, Redshift, Redshift Spectrum, Elastic MapReduce (EMR), SageMaker Studio, and commercial/open-source IDEs.
  • Ensure the data lake design follows the prescribed reference architecture and framework; ensure the design reflects appropriate business rules and facilitates data integration, data conformity, and data integrity.
  • Work on project teams to translate business requirements into system qualities that lead to repeatable and executable design strategies and patterns.
  • Implement large cloud-based data warehouse solutions utilizing cluster and parallel RDBMS, Hadoop, and NoSQL architectures.
  • Develop architecture supporting end-to-end data integration processes using ETL and ELT across structured, semi-structured, and unstructured data.
  • Develop proof-of-concept prototypes for next-generation data lake solutions.

Basic Qualifications:

  • Bachelor’s degree in Computer Science, Information Systems, Mathematics or a related discipline.
  • 12+ years of experience in Information Technology within a complex, matrixed, and global business environment, including experience within a manufacturing organization.
  • 8+ years of experience in a Data Architect role working with cross-functional teams (within IT and/or within the business) on enterprise level or complex system implementations.
  • 4+ years of experience of full cycle AWS Data Lake platform implementation and performance tuning.
  • Expertise in AWS data lake services such as Amazon Elastic Compute Cloud (EC2), AWS Data Pipeline, Glue, Amazon S3, Amazon DynamoDB, Amazon Relational Database Service (RDS), Amazon Elastic MapReduce (EMR), Amazon Kinesis, and Amazon Redshift.
  • Understanding of Apache Hadoop and Hadoop ecosystem. Experience with one or more relevant tools such as Sqoop, Flume, Kafka, Oozie, Zookeeper, HCatalog, or Solr.
  • Understanding of database and analytical technologies in the industry including MPP and NoSQL databases, Data Warehouse design, ETL, BI reporting and Dashboard development.
  • Familiarity with one or more SQL-on-Hadoop technologies (Hive, Impala, Spark SQL, Presto).
  • Experience developing software in one or more programming languages (Java, JavaScript, Python, Ruby, etc.) and data formats such as JSON.
  • Knowledge of best practices related to data lakes, data lake governance, data security (e.g., HITRUST), and data integration and interoperability.
  • Experience with Agile framework and DevOps.

Preferred Qualifications:

  • Professional certifications, e.g., TOGAF, Data Management, COBIT, ITIL, AWS Certified Solutions Architect, etc.
  • Excellent documentation skills with the ability to drive achievement of objectives.
  • Strong interpersonal and leadership skills.
  • Strong written and verbal communication skills including the ability to communicate at various levels within an organization and to explain complex or technical matters in a manner suitable for a non-technical audience.

Edwards is an Equal Opportunity/Affirmative Action employer including protected Veterans and individuals with disabilities.