BIG DATA ARCHITECT SKILLS, EXPERIENCE, AND JOB REQUIREMENTS

Updated: Sep 21, 2024 - The Big Data Architect specializes in creating strategic solution architectures that facilitate the delivery of data products for BI/Analytics/Reporting, Personalization, and AI/ML use cases. This role includes expertise in developing web-scale solutions across platforms like Google Cloud, AWS, or Azure, emphasizing systems that support high concurrency, availability, and resilience. Additionally, the architect has substantial experience in designing machine learning pipelines, implementing data governance tools, and integrating with key marketing and analytics platforms.

Essential Hard and Soft Skills for a Standout Big Data Architect Resume
  • Data Modeling
  • Big Data Technologies
  • Database Management
  • Programming
  • ETL Tools
  • Machine Learning Algorithms
  • Data Warehousing Solutions
  • Cloud Platforms
  • SQL and NoSQL Databases
  • Data Visualization Tools
  • Analytical Thinking
  • Problem-Solving
  • Communication
  • Project Management
  • Team Collaboration
  • Adaptability
  • Attention to Detail
  • Leadership
  • Time Management
  • Creativity

Summary of Big Data Architect Knowledge and Qualifications on Resume

1. BS in Computer Science with 7 years of experience

  • Excellent communicator with good analytical skills.
  • Hands-on experience with leading commercial Cloud platforms like AWS and Azure
  • Strong experience with Apache Spark (Databricks or similar), Kafka and NiFi
  • Strong experience with relational SQL and NoSQL databases like PostgreSQL, Oracle, Cassandra, MongoDB
  • Hands-on experience with Snowflake or similar cloud data warehouse technologies
  • Hands-on experience with Hadoop distributions (Cloudera, Hortonworks) and Hive
  • Experience with CI/CD pipeline implementation into a DataOps environment
  • Strong understanding of data analytics and visualization
  • Experience with Elasticsearch and Kibana is a big plus
  • Experience with Alation and/or other data catalog tools (Azure, Informatica) is a big plus
  • Experience with Agile development methodologies
  • Strong presentation, facilitation, verbal, and written communication skills, including interpersonal skills

2. BS in Software Engineering with 2 years of experience

  • Experience working directly with business clients to design a solution that meets business requirements
  • Ability to clearly articulate pros and cons of various technologies and platforms and architectural options
  • Ability to document use cases, solutions and recommendations.
  • Experience with tools and concepts related to data and analytics, such as dimensional modeling, ETL, reporting tools, data governance, data warehousing, and structured and unstructured data.
  • Database development experience using Oracle, SQL Server, SAP BW or SAP HANA
  • Big Data Development experience using Hive and/or Spark
  • Effective verbal and written communication and influencing skills.
  • Effective analytical and technical skills.
  • Ability to work in a team environment
  • Ability to research, plan, organize, lead, and implement new processes or technology
  • Experience in Google Cloud Services
  • Python, Scala or Java development experience

3. BS in Information Systems with 3 years of experience

  • Experience defining solution architecture and strategic direction to deliver data products and capabilities that support BI/Analytics/Reporting, Personalization, or Advanced Analytics (AI/ML) use cases
  • Experience architecting web-scale solutions on Google Cloud, or comparable experience on AWS or Azure
  • Understanding and articulation of architectural principles for building systems that support very high concurrency, are highly available, and remain resilient in the face of dependent-component failures
  • Experience designing microservices, and with containerization and orchestration (Docker, Kubernetes, GKE or equivalent)
  • Experience working with data warehousing, data marts and data lake technologies
  • Experience leveraging Big Data offerings on Google Cloud Platform (GCP) like BigQuery, Dataflow, Dataproc, Dataprep, Pub/Sub, DLP etc. or their functional equivalents on AWS, Azure or on-premise
  • Demonstrable experience defining and delivering integration solutions to/from marketing, advertising, and analytical COTS products (Adobe, Braze, Google Ads, Facebook, Twitter, etc.)
  • Significant experience designing machine learning pipelines for intelligent insights and predictions
  • Proven experience working with Data Governance tools and frameworks (Collibra, Alation etc.)
  • Working understanding of Identity and Access Management (IAM) principles
  • Strong CS fundamentals including data structures, algorithms, networking and security

4. BA in Statistics with 4 years of experience

  • Expertise in Big Data technologies like HDFS, Spark, Scala, Impala, Hive, Hue, etc.
  • Strong understanding and hands-on experience with Oracle, UNIX, Informatica and Control-M
  • Knowledge of and hands-on experience with BI reporting tools like OBIEE and regulatory reporting tools like AXIOM
  • Experience in the banking and financial IT sector
  • Strong shell scripting and automation skills
  • Exposure to database/UNIX server upgrades
  • Experience in design, planning, preparation, and implementation of solutions in complex environments
  • Knowledge of IntelliJ, Tectia, Toad, SVN, Jenkins, Maven, and Gradle would be a plus
  • Fully conversant with the Basel III regulatory reporting domain; FINMA knowledge would be advantageous
  • Strong understanding of Basel III norms
  • Experience implementing FINMA regulatory guidelines in areas such as Large Exposure rules (LER), Leverage Ratio (LRE), the Standardised Approach for Counterparty Credit Risk (SA-CCR) model approach to counterparty exposures, Single Client Identifier (SCI), Total Loss-Absorbing Capacity (TLAC), etc.
  • Good understanding of the reconciliation of risk-weighted assets (RWA), exposure at default (EAD), and similar calculations per business needs
  • Well-versed in Agile methodologies and DevOps, with proven technical expertise across all tiers of application development using cutting-edge technologies
  • Experience with cloud-based data and analytics platform solutions