LEAD DATA ENGINEER SKILLS, EXPERIENCES, AND JOB REQUIREMENTS

Published: Jan 14, 2026 - The Lead Data Engineer designs and delivers scalable data solutions, applying deep expertise in data warehousing and cloud-based architectures built on AWS technologies such as RDS, Lambda, S3, Snowflake, and Databricks. The role requires strong experience migrating ETL processes from relational data warehouses to cloud environments, along with proficiency in Spark, Python, Java, and Agile methodologies, plus prior exposure to the Finance domain. The lead also collaborates effectively with cross-functional teams, demonstrating professional English proficiency, strong communication skills, and the ability to build and maintain reliable end-to-end data pipelines.

Essential Hard and Soft Skills for a Lead Data Engineer Resume

Hard Skills:
  • Data Warehousing
  • AWS Cloud Services
  • ETL Development
  • Data Pipeline Architecture
  • Apache Spark
  • Python Programming
  • SQL
  • Databricks
  • Data Lake Architecture
  • Snowflake

Soft Skills:
  • Leadership
  • Communication
  • Problem Solving
  • Stakeholder Management
  • Collaboration
  • Critical Thinking
  • Decision Making
  • Time Management
  • Adaptability
  • Mentoring

Summary of Lead Data Engineer Knowledge and Qualifications on Resume

1. BS in Computer Science with 5 years of Experience

  • Experience installing and managing Linux servers, Active Directory, and Data Center infrastructure solutions
  • Ability to manage and monitor vendor hardware and software network acquisitions
  • Ability to work independently to resolve highly technical system issues
  • Thorough knowledge of the principles of preventative maintenance techniques on the storage and server environment to ensure systems remain highly reliable
  • Comprehensive knowledge of the acquisition and installation of network security software, including conducting intrusion testing
  • Comprehensive knowledge of the major components of storage and server infrastructure: hardware, software, and cabling systems
  • Ability to utilize and manage software systems of the Microsoft Office suite, Microsoft Server/Active Directory, and firewalls
  • Ability to evaluate and manage software and database systems of CommVault and VMware ESXi
  • Ability to communicate with users to define system requirements and resolve problems
  • Effective collaboration with peers and vendors on the efficiency of the systems supported

2. BS in Software Engineering with 7 years of Experience

  • Proven hands-on business experience in AWS, Azure, or GCP
  • Experience with established tools like Hadoop, Spark and Airflow, as well as knowledge of newer developments such as Arrow, DBT, and Great Expectations
  • Proficiency in Python, SQL and/or Java 
  • Hands-on experience building ETL pipelines
  • Solid experience in data modelling and resilient engineering
  • Experience with data science and ML libraries
  • First-hand experience in deploying, maintaining, and updating models in production 
  • Able to collaborate in diverse and visionary teams and settings
  • Must have strong communication skills
  • Business-fluent English language skills, plus German speaking and writing skills
  • Passion to share knowledge and onboard new team members
  • Previous experience in large-scale projects with multiple stakeholder groups based on varying data sources and technologies
  • Understanding of technologies used to collect and transform consumer data
  • Familiarity with complex software systems, microservice architectures, and real-time processing

3. BS in Data Science with 8 years of Experience

  • Experience in developing and shipping an end-to-end production-grade data engineering platform
  • Experience in building and managing data engineering teams
  • Familiarity with AWS data stack (EMR, Kinesis, S3, etc.)
  • Experience in building batch processing tools (EMR/Hadoop/etc)
  • Experience in building streaming data platforms (Kafka/Kinesis/Storm/etc)
  • Experience in workflow orchestration tools such as Airflow/Luigi
  • Experience in building and managing a data quality/auditing/monitoring system
  • Hands-on experience with “big data” platforms, including Hadoop (on Azure or GCP) and Spark
  • Proficiency in “big data” technologies including Spark, Airflow, Kafka, Hbase, Pig, NoSQL databases, etc.
  • Experience and background on traditional relational data warehouse technologies like Oracle, Teradata, and DB2
  • Proficiency in one of the following: PySpark, Scala, or Java
  • Good knowledge and experience with SQL, including Analytical SQL functions
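
The "Analytical SQL functions" mentioned above are window functions such as RANK() OVER (...). As a minimal sketch, using the Python standard library's sqlite3 module (window functions require SQLite 3.25+; the table and column names here are illustrative, not from any specific job posting):

```python
import sqlite3

# Build a tiny in-memory table to demonstrate an analytical (window) function.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 100), ("east", 300), ("west", 200), ("west", 50)],
)

# RANK() OVER (...) ranks rows within each region without collapsing them,
# which a plain GROUP BY aggregate cannot do.
rows = conn.execute(
    """
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
    """
).fetchall()
print(rows)
```

The same pattern applies in warehouse dialects such as Oracle, Teradata, and DB2, which all support the standard OVER clause.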

4. BS in Information Technology with 9 years of Experience

  • Working experience in database management/data model/data warehouse 
  • Strong understanding of financial processes
  • Working experience in SQL
  • Strong technical systems background (Tableau, Alteryx, SAP HANA/BW, MDM)
  • Experience documenting best practices and enforcing strong governance in a team
  • Knowledge of Tableau and Alteryx development
  • Strong project management skills or experience leading implementations
  • Experience supporting business partners within the Finance organization 
  • Experience building data models and performing complex queries using SQL
  • Experience in performance tuning large datasets
  • Experience building large data pipelines and/or web services
  • Intermediate programming skills with Python / R and other scripting languages
  • Experience in building integration with upstream and downstream systems with REST APIs

5. BS in Computer Engineering with 7 years of Experience

  • Experience in data platform administration/engineering, or a related field
  • Experience in migrating ETL processes (not just data) from relational warehouse databases to AWS-based solutions
  • Hands-on experience with Amazon Web Services (AWS)-based solutions such as Lambda, DynamoDB, Snowflake and S3
  • Knowledge and experience using query languages (SQL, Cypher) for relational and graph databases 
  • Capability to collaborate with stakeholders and project leaders to understand requirements, deliverables, and set expectations on tasks
  • Ability to work in a fast-paced, rapidly changing environment
  • Experience working in an agile and collaborative team environment 
  • Excellent written and verbal communication, presentation and professional speaking skills
  • Passion for learning and interest in pursuing classroom training and self-discovery on a variety of emerging technologies
  • Experience within the financial services industry 

6. BS in Information Systems with 6 years of Experience

  • Experience designing, developing, and deploying data solutions using Power BI and the Azure platform
  • Experience in designing data pipelines (ETL/ELT), data warehouses, and data marts
  • Hands-on expert with real-time data processing and analytics, data ingestion (batched and streamed), and data storage solutions
  • Hands-on experience with Azure Analysis Services and Power BI; experience with other tools is a plus
  • Hands-on experience with Data Factory, Data Lake Storage, Databricks, Data Explorer, Machine Learning, and Azure Synapse Analytics
  • Expert at creating data dissemination diagrams, data flow diagrams, data lifecycle diagrams, data migration diagrams, and data security diagrams, etc.
  • Demonstrated strength in data management, orchestration, access control, etc
  • Proven expert in writing optimized SQL to deal with large data volumes
  • Hands-on knowledge of Python along with its main data libraries, such as Pandas, NumPy, and Beautiful Soup

7. BS in Artificial Intelligence with 5 years of Experience

  • Experience with distributed software development
  • Hands-on experience with all aspects of software development such as data, server-side, UI, and open-source software
  • Experience in handling large data volumes and having strong knowledge in Data wrangling and data pipeline 
  • Experience with Linux, Open source, C++ or Java, client-server apps
  • Expertise in at least two of the following: Java / Go / Scala / Python
  • Familiar with container strategies and ecosystems such as Docker, Kubernetes
  • Expertise in analyzing input Data elements across multiple sources and building and validating Data lineage
  • Experience managing and building a Data streaming automation framework
  • Knowledge of cloud architecture and scalable solutions including orchestration

8. BS in Cybersecurity with 7 years of Experience

  • Knowledge of Python running on AWS (EC2, S3, EMR, ELB, OpenSearch, etc.), with Redis, Apache Spark, and MongoDB
  • Familiarity with API servers
  • Excellent English communication and collaboration skills
  • Detail-oriented and capable of multitasking
  • Ability to interact with a diverse group of stakeholders and proactively provide actionable data
  • Familiarity with DevSecOps tools and methodologies, including CI/CD (e.g., Jenkins, Bitbucket, GitHub, Azure DevOps) and best practices for code deployment
  • Working knowledge of job Orchestration (Control-M, Airflow)
  • Able to work with Architects, Technical Leads and Business Teams and contribute to the development of technical designs
  • Strong knowledge of various data structures and the ability to extract data from various data sources
  • Able to provide technical database consultation on application development, global infrastructure, and other database administration efforts related to specific DBMS
  • Experience in writing complex SQL queries 
  • Ability to conduct data profiling, cataloguing, and mapping for technical design and construction of technical data flows

9. BS in Applied Mathematics with 8 years of Experience

  • Experience in building real-time data architectures using technologies like Spark, Kafka, and Flink
  • Experience in building systems on cloud platforms such as AWS, using Infrastructure as Code technologies like Terraform
  • Understanding of different data capabilities, whether that be stream processing, event sourcing, machine learning, data science, big data, BI, or analytics
  • Able to be a technical innovator with a curious mindset and a thirst for knowledge and be able to demonstrate a drive for continual improvement
  • Advanced working knowledge in large-scale ERP analytics, optimization, business intelligence, and statistics for IT organizations
  • Ability to identify key insights and build large-scale data products that enhance the customer experience and improve operational processes
  • Strong ability to provide strategic guidance on emerging technologies to innovate and ensure SJSU is well-positioned in delivering analytic solutions
  • Strong ability to deliver on high-impact analytics projects
  • Skilled in recruiting, developing, inspiring, and leading a growing team of data analysts
  • Advanced SQL knowledge, including query authoring and work with relational databases like Oracle, as well as working familiarity with a variety of other databases
  • Expert knowledge in high-level programming languages like Python, SQL, Java, and JavaScript
  • Advanced knowledge in application architectures on public cloud platforms such as Amazon Web Services (AWS), Google Cloud Platform (GCP), or IBM Cloud Pak
  • Advanced knowledge of enterprise data architecture, ETL integration, data warehousing techniques, analytics/end-user reporting toolsets

10. BS in Statistics with 12 years of Experience

  • Overall experience in building ETL/ELT, data warehousing and big data solutions
  • Experience in building data models and data pipelines to process different types of large datasets
  • Experience with Python, Spark, Hive, Hadoop, Kinesis, Kafka
  • Proven expertise in relational and dimensional data modeling
  • Understand PII standards, processes, and security protocols
  • Experience in building a data warehouse using cloud technologies such as AWS or GCP services and cloud data warehouses like Google BigQuery
  • Able to confidently express the benefits and constraints of technology solutions to technology partners, stakeholders, team members, and senior levels of management
  • Experience implementing and supporting operational data stores, data warehouses, data marts, and data integration applications
  • In-depth knowledge of Big Data solutions and the Hadoop ecosystem
  • Ability to effectively share technical information, communicate technical issues and solutions to all levels of business
  • Able to juggle multiple projects - can identify primary and secondary objectives, prioritize time and communicate timeline to team members
  • Passionate about designing and developing elegant ETL/ELT pipelines and Frameworks
  • Ability and desire to take product/project ownership
  • Ability to think creatively, strategically and technically
  • Ability to work a flexible schedule based on department and Company needs
  • Must have Cloud Architect (AWS or GCP or Azure) Certification

11. BS in Electrical Engineering with 6 years of Experience

  • Strong knowledge in a programming language such as Java, Python, or Scala
  • Hands-on cloud computing experience on AWS, Microsoft Azure, or GCP
  • Experience with data lake technologies such as Kafka, Avro, Parquet and Spark
  • Experience with RDBMS, Columnar, and NoSQL databases such as MySQL, PostgreSQL, and Elasticsearch
  • Good knowledge of data orchestration frameworks such as Airflow or Luigi
  • Understanding of data modelling and different data structures, adapting to particular use cases
  • Continuous integration and delivery principles
  • Problem-solving skills and analytical thinking/innovation
  • Good command of both written and oral English
  • Strong communication skills to collaborate with various stakeholders, whilst still being able to work autonomously
  • Experience in the gambling industry 

12. BS in Systems Engineering with 5 years of Experience

  • Experience with leading projects and leading a team of technical people
  • Hands-on object-oriented programming experience (Python/Java)
  • Experience in designing and building data applications/pipelines
  • Excellent coding skills (clear coding, unit testing, CI/CD, gitflow, etc.)
  • Understand the diverse ways of storing and processing data (file vs. SQL vs. NoSQL, batch vs. streaming, stateful vs. stateless)
  • Strong interpersonal skills, customer-centric attitude, proven team player and team builder
  • Exceptional communication and presentation skills to include technical and business concepts
  • Experience with 2 or more Azure cloud technologies and eagerness to learn more
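
The batch-vs-streaming distinction above can be sketched in plain Python; this is an illustrative toy (function names are my own, not from the article), where the batch version materializes everything in memory while the streaming version is a generator that handles one record at a time:

```python
# Batch style: load the whole dataset up front, then transform it.
def batch_double(records):
    data = list(records)          # the entire batch is held in memory
    return [x * 2 for x in data]

# Streaming style: a generator processes one record at a time,
# using constant memory regardless of input size.
def stream_double(records):
    for x in records:
        yield x * 2

print(batch_double(range(3)))         # [0, 2, 4]
print(list(stream_double(range(3))))  # [0, 2, 4]
```

Real pipelines make the same trade-off at a larger scale, e.g. a nightly Spark batch job versus a Kafka consumer.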

13. BS in Cloud Computing with 7 years of Experience

  • Experience as a Data Engineer
  • Experience working with orchestration tools (e.g., Airflow)
  • Experience working with big data and ETL development
  • Programming skills in SQL and in Spark with Scala
  • Experience working with cloud computing services (e.g., GCP, AWS, Azure)
  • Experience with Data Science workflows
  • Excellent problem-solving and communication skills both with peers and experts from other areas
  • Self-motivated and have a proven ability to take the initiative to solve problems
  • Must have deep expertise in one of the programming languages for data processes (Python, Scala) and must have worked on building data lakes
  • Deep understanding of ETL/ELT design methodologies, architecture, strategy, and tactics for complex ETL solutions
  • Good knowledge of Data formats (Parquet, ORC, etc.) and working knowledge of Data Lakes
  • Proactively identify, design, and implement process improvements like automating manual processes, optimizing data flow, increasing scalability, etc.
  • Must have worked with relational databases

14. BS in Big Data Analytics with 9 years of Experience

  • Professional experience in the data analytics field with increasing responsibility
  • Demonstrated history of designing and building enterprise-level architecture for data warehouses
  • Expert team leadership, including growing, managing, and coaching a team of direct reports
  • Expert-level database knowledge such as SQL Server, Oracle, Postgres, MySQL, MongoDB, Amazon Aurora, etc.
  • Experience in cloud data warehouse technologies like Azure, AWS, Teradata, Snowflake, etc.
  • Experience in middleware and data
  • Experience in Azure Data Factory (ADF)
  • Business and analytical thinker, with an ability to create solutions based on data analytics
  • Ability to understand and tell the story embedded in the data at the core of business
  • Ability to communicate complex analytic ideas and insights to a non-technical business stakeholder
  • Ability to design, execute, and interpret the results of experiments on large and complex data sets
  • Strong knowledge of industry best/next practices in the application of data analytics in marketing

15. BS in Computational Science with 5 years of Experience

  • Experience leading a technical team (team- or project management experience)
  • Deep knowledge of SQL and relational databases, column-oriented databases (Snowflake, Redshift, BigQuery)
  • Proficient in both software development and Python
  • Experience with orchestration of data pipelines using Apache Airflow
  • Knowledge of Data modelling and ETL/ELT practices
  • Familiarity with messaging technologies such as Kafka and RabbitMQ, as well as with batch and streaming data processing frameworks (dbt, Apache Spark, Apache Beam)
  • Experience with cloud platforms such as AWS and GCP
  • Previous experience in designing and building large-scale data platforms with reliability and maintainability in mind
  • Excellent communication skills in English

16. BA in Computer Science with 10 years of Experience

  • Extensive professional experience working in analytics, data science, data engineering, or software development
  • Extensive experience in designing, building, and maintaining data pipeline systems
  • Strong coding experience in server-side programming languages (Python, Scala, Go, Java, or R) as well as database languages (SQL)
  • Strong knowledge in software engineering principles
  • Experience with data technologies and concepts such as NiFi, Spark, Hadoop, Airflow, Kafka, RDBMS, and Columnar databases
  • Working knowledge of Docker and GIT
  • Agile, fast prototyping skills, including feature integration during all cycles of development
  • Exceptional verbal, written, and interpersonal communication skills
  • Customer-obsessed mindset, as well as an innovation-minded and critical mindset (‘good enough’) 
  • Proven experience in translating the company strategy and the Tribe strategy into a concrete trajectory for the squads and Chapter members
  • Good stakeholder management upholding the Data Team’s brand and values
  • Proven track record of people management and coaching and mentoring skills
  • Ability to motivate and engage people and to bring them to the next level
  • Proficient communication skills (written, oral) in English as well as a good level of French and/or Dutch

17. BA in Information Technology with 7 years of Experience

  • Experience designing and building complete, robust ETL/SSIS processes
  • Experience with one or more BI reporting tools (Qlik-based)
  • Experience leading a technical team, direct management of the team
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
  • Experience supporting and working with cross-functional teams in a dynamic environment
  • Possess strong quantitative and problem-solving skills
  • Able to be solutions-oriented, a self-starter and have an eye for detail
  • Must have exceptional organisational and stakeholder management skills
  • Able to and enjoy working in a fast-paced, dynamic and international team environment
  • Excellent communication skills in English (written and verbal) 
  • Able to effectively communicate technical subject matter to technical and business stakeholders

18. BA in Data Analytics with 8 years of Experience

  • Working experience in designing ETL processes, designing and managing data warehouses and database backends for web applications, and building reporting dashboards
  • Working experience in managing a team of ETL Developers, Data Engineers, and Business Intelligence Analysts
  • Expert command of modern ETL development, data warehousing, and reporting methodologies
  • Expert command of SQL, Tableau, SSIS, and SSRS
  • Strong experience developing scalable, secure, and efficient backends for web applications
  • Experience designing robust and secure APIs for databases
  • Experience in the Marketing/Analytics industry 
  • Experience leading and building a team of data engineers
  • Experience with SQL development, demonstrating mastery of use
  • Experience working with data in a variety of capacities 
  • Hands-on experience with SQL, Snowflake
  • Experience with ETL/ELT tools such as FiveTran and dbt
  • Dev-ops experience using GIT, developing, and deploying code to production
  • Proficient in working in Unix/Linux as well as a Windows server environment

19. BA in Management Information Systems with 4 years of Experience

  • Proven track record of working with RDBMS in the role of data designer and developer
  • Highly experienced in databases, data marts, and data warehousing
  • Excellent Python scripting skills, with R as an added advantage
  • Experience working with data scientists and exposure to the model-building process
  • Exposure to Dataiku to help automate data models or data integrations
  • Good experience as an ETL and Kimball Dimensional database designer and developer on large multi-million-row data warehouse applications
  • Exposure to BI technologies such as Business Objects, MicroStrategy and Tableau 
  • Ability to interpret a set of business requirements and turn them into simple, sensible solutions
  • Experience in identifying and addressing risks and issues impacting application delivery

20. BA in Applied Computer Science with 10 years of Experience

  • Advanced understanding of Relational Database Management Systems
  • Understanding of Object-Oriented Programming, including languages such as Python and R
  • Working knowledge of one or more (ideally several) data platforms, including MS SQL Server, Azure SQL, Azure Synapse, Oracle, Redshift, Couchbase
  • Understanding of extraction tools such as Azure Data Factory and SSIS to supply the Decision Support database platforms with data from any of the corporation’s OLTP or analytical databases
  • Hands-on experience using a data visualization tool, such as Power BI and Tableau
  • Demonstrated aptitude for problem-solving and problem identification
  • Attention to detail 
  • Ability to manage multiple projects and responsibilities at once
  • Strong written and verbal communication skills and ability to work in a global environment
  • Ability to operate in a team environment, with excellent interpersonal skills
  • Hands-on experience with Enterprise Database Development such as MS SQL Server and Oracle 10g+
  • Experience with Azure, AWS or other cloud solutions
  • Experience using Stored Procedures
  • Experience creating and modifying SSIS packages
  • Automation experience with business and technology processes

21. BA in Mathematics with 9 years of Experience

  • Experience in a Data Engineer role, with a graduate degree in Computer Science, Informatics, Information Systems, or another quantitative field
  • Experience with big data tools such as Hadoop, Spark, Hive, etc.
  • Experience with object-oriented/object function scripting languages such as Python, Scala, Shell, etc.
  • Experience with relational SQL and NoSQL databases, including Postgres and Oracle
  • Experience (nice to have) with tools like Dataiku, Mosaic, Alteryx, etc.
  • Experience building RESTful APIs to provide programmatic access to data, ideally for customers
  • Confident in working with a range of cloud services on AWS and/or Azure
  • Experience in handling data pipelines, data warehouses and data lakes
  • Understand large data sets and their application to business
  • Comfortable as the data expert at a company and explaining the business significance throughout the schema
  • Ability to ingest large amounts of SQL and NoSQL data and ensure that the data is structured and formatted in the data warehouse in a way that is consistent, easy to use, and easy to understand
  • Experience as a team lead, setting strategy and mentoring others
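
Structuring ingested NoSQL data consistently, as described above, typically means flattening nested records onto a fixed schema. A minimal sketch in plain Python (the field names and flattening rules are illustrative assumptions, not from the article):

```python
# Normalize semi-structured (NoSQL-style) records into a consistent,
# warehouse-friendly tabular shape: flatten nested dicts with dotted
# column names and force every record onto the same set of columns.
def normalize(records, columns, default=None):
    def flatten(rec, prefix=""):
        flat = {}
        for key, value in rec.items():
            name = f"{prefix}{key}"
            if isinstance(value, dict):
                flat.update(flatten(value, name + "."))
            else:
                flat[name] = value
        return flat

    return [
        {col: flatten(rec).get(col, default) for col in columns}
        for rec in records
    ]

raw = [
    {"id": 1, "user": {"name": "Ada"}, "amount": 10},
    {"id": 2, "user": {"name": "Lin", "tier": "gold"}},  # missing amount
]
rows = normalize(raw, ["id", "user.name", "user.tier", "amount"])
print(rows)
```

Every output row has the same keys, with explicit defaults for missing fields, which is what makes downstream queries consistent and predictable.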

22. BA in Statistics with 8 years of Experience

  • Significant technical and specialized expertise in several data-oriented fields such as integration, databases, modelling, governance and processes
  • Excellent knowledge of the concepts, technologies and solutions of modern digital platforms (microservices, APIs, real-time data streams, orchestration, straight-through processing, etc.) and the data architecture needed to support them
  • Hands-on experience in large-scale global data science, warehousing and analytics projects using Redshift, Snowflake, Oracle and/or SQL Server databases
  • Knowledge of data technical architecture, infrastructure components, ETL/ELT, reporting/analytic tools and microservices
  • Advanced skills in reporting tools such as PowerBI, Tableau and/or Looker
  • Experience with AWS cloud using Kinesis, Lambda, DynamoDB, Athena, ECS, EKS, etc.
  • Experience using R/Python in data-focused full-stack environments (from data wrangling to analysis to visualizations)
  • Experience working with consumer-based clickstream, web behavioral data and content data
  • Demonstrated experience as a software engineer
  • Expertise in the Data Engineering tech stack, including Python or Scala, data warehouses, SQL and relational databases
  • Experience with Snowflake
  • Experience working with data lakes and data processing systems that operate across a large organization and that make use of technologies such as Spark, Hive, or Presto

23. BA in Economics with 7 years of Experience

  • Experience building ETL pipelines into a data warehouse and event layer
  • Experience with using Python in a production environment
  • Experience with AWS technologies
  • Able to gather data from a variety of sources and interfaces, including REST APIs
  • Experience working in a high-volume data environment, such as e-commerce or SaaS
  • Data management, data engineer or analytics engineer experience 
  • Experience in people management (or proven ability and passion to manage people)
  • Able to make decisions based on data and evidence
  • Takes great pleasure in writing quality, highly maintainable code
  • Able to thrive in environments supporting growth
  • Excited about new technologies and spend time staying up to date in the industry
  • Experience and passion for fitting pragmatic solutions to problems (building, open-source, vendor)

24. BA in Finance with 4 years of Experience

  • Strong analytical and data modeling skills
  • Solid understanding of database technology
  • Experience with AWS or similar
  • Familiar with distributed computing
  • Solid understanding of analytic platform data use
  • Strong programming skills in SQL and/or Python
  • Experience using the AWS big data technology stack
  • Must have moderate experience with Alteryx or similar ETL platforms
  • Experience implementing data governance principles

25. BA in Business Analytics with 6 years of Experience

  • Strong data engineering (ETL) experience in the cloud, in AWS
  • Must have AWS Certification (Developer, DevOps, or Solutions Architect)
  • Excellent understanding of the distributed computing paradigm
  • Should have excellent experience in data warehouse and data lake implementation
  • Experience in Relational databases, ETL design patterns and ETL development
  • Excellent experience in CI/CD frameworks and container-based deployments
  • Should have excellent programming and SQL skills
  • Should have good exposure to No-SQL and Big Data technologies
  • Should have strong, broad implementation experience across these technology areas, along with deep technical expertise
  • Demonstrate strong analytical and problem-solving capability
  • Good understanding of the data ecosystem, both current and future data trends

26. BA in Operations Research with 7 years of Experience

  • Experience in leading complex projects
  • Must be able to solve complex business problems and present recommendations to senior management effectively
  • Must have good organizational and decision-making skills
  • Proven ability to lead and influence others at all levels within the organization
  • Good facilitation skills and ability to work across organizations and unite diverse individuals with disparate goals and enable productive discussion and outcomes
  • Strong influencer with the ability to successfully navigate change and drive success across teams
  • Good organizational, planning and prioritization skills
  • Good oral and written communication skills
  • Experience in Data Engineering, Software Engineering, Security Engineering or related fields
  • Fluency in SQL as well as Python or Java
  • Experience with Event-driven/streaming architectures
  • Experience in custom ETL design, implementation and maintenance
  • Experience working with Big Data/MPP analytics platforms (e.g., Spark, Flink, Amazon Redshift, Google BigQuery, or similar)

27. BA in Information Management with 5 years of Experience

  • Hands-on experience with pipelines involving Big Data (Hadoop) and NoSQL (Cassandra)
  • Experience in Kafka, Solr, or Elasticsearch
  • Able to build workflows and related automation for sourcing requirements from logical modeling to physical implementation
  • Able to utilize technical expertise and leadership skills to resolve issues, ensure product goals are met, and serve as a mentor to coach and support junior engineers to continue to raise the engineering bar across the organization
  • Able to work to drive both high-level and detailed technical designs, conduct technical reviews and define solution requirements and technical implementation approaches
  • Able to define the team's roadmap in line with the company and technology vision
  • Proficient understanding of algorithms, data structures, architectural design patterns and best practices
  • Adept with agile software development lifecycle and DevOps principles
  • Excellent communication, presentation, leadership, problem-solving and analytical skills
  • Proven collaboration and influencing skills

28. BA in Technology Management with 8 years of Experience

  • Experience in data technologies and applications development
  • Experience in Hadoop using core Java programming, Spark, Kafka, Hive, Pig scripts, and Sqoop in a Linux/Unix environment
  • Expertise in Cassandra, ScyllaDB, NiFi 
  • Working experience in Agile development
  • Expertise in Object-Oriented Programming Language - Java/ Python
  • Strong algorithms and data structures
  • Ability to generate creative and innovative solutions for QA challenges and constraints
  • Strong knowledge of database concepts and UNIX/LINUX
  • Expert-level SQL skills for data manipulation (DML) and validation (DB2)
  • Experience using version control and bug tracking tools
  • Result-oriented with strong analytical and problem-solving skills
  • Able to work independently with minimal supervision
  • Comfortable learning new technologies quickly
  • Experience leading a small team 
  • Strong interpersonal and communication skills

29. BA in Quantitative Analysis with 9 years of Experience

  • Experience working with data pipeline and workflow management tools
  • Experience working with cloud data technologies such as Google Cloud Platform or Microsoft Azure
  • Experience working with both relational SQL and NoSQL databases
  • Experience working with message queuing, stream processing, and highly scalable big data stores
  • Experience in applying TDD and CI/CD approaches when building data solutions
  • Experience leading small teams of up to 3 data engineers
  • Should have working proficiency in SQL and Python
  • Should have working proficiency in building ETL pipelines
  • Should have experience managing a Data Warehouse
  • Practical knowledge of a workflow orchestration platform (e.g., Airflow)
  • Knowledge of software engineering methodologies, e.g., unit testing, CI/CD
  • Should be able to work independently with minimal supervision
  • Possess critical thinking, problem-solving, stakeholder management, and effective presentation skills
  • Should be a good team player with inclusiveness and constructive thinking
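The ETL and orchestration bullets above fit together: orchestrators such as Airflow schedule small, unit-testable task functions. A minimal pure-Python sketch of that shape, with hypothetical stage names and sample records:

```python
# Minimal ETL sketch: each stage is a small, unit-testable function, the kind
# of unit an orchestration platform schedules as a task. All names below
# (extract, transform, load, the sample records) are hypothetical.

def extract():
    # Stand-in for reading from a source system or message queue.
    return [{"user": "a", "amount": "10"}, {"user": "b", "amount": "not-a-number"}]

def transform(rows):
    # Cast amounts to float; route unparseable rows to a reject list.
    clean, rejected = [], []
    for row in rows:
        try:
            clean.append({**row, "amount": float(row["amount"])})
        except ValueError:
            rejected.append(row)
    return clean, rejected

def load(rows, sink):
    # Stand-in for a warehouse write; appends to an in-memory sink.
    sink.extend(rows)

sink = []
clean, rejected = transform(extract())
load(clean, sink)
```

Keeping each stage a pure function is what makes the unit-testing and CI/CD bullets practical: `transform` can be exercised in isolation with fixture rows.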

30. BA in Digital Systems with 7 years of Experience

  • Experience in systems/data integration projects
  • Excellent troubleshooting and problem-solving skills
  • Experience implementing batch and real-time integration use cases involving a variety of endpoints such as databases, web services, REST APIs, text files, message queues/streams, COTS products, and SaaS platforms
  • Must have solid hands-on technical experience in Talend, AWS Glue, Athena, Python, AWS Lambda and writing complex SQL queries
  • Experience with common web connectivity and integration standards/protocols such as XML, SOAP, REST, JSON, JDBC, OAuth, JWT
  • Must have experience in implementing integration processes that are fault-tolerant and scalable
  • Highly proficient with data analysis and data modeling
  • Should have excellent communication skills in both written and verbal English
  • Ability to work as a Technical Lead and provide guidance to junior members on the team
  • Experience with AWS RedShift, IAM, Roles/Policies, Security Groups
  • Experience with visualization/BI Technologies such as Power BI, AWS QuickSight, or others
  • Knowledge/experience with Splunk
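The fault-tolerance bullet above usually means, in practice, retrying transient endpoint failures with backoff. A minimal sketch under stated assumptions: the retry wrapper and the flaky endpoint below are hypothetical, and a simulated `ConnectionError` stands in for a real REST call failing.

```python
import time

def call_with_retries(fn, attempts=3, base_delay=0.01):
    # Fault-tolerant integration pattern: retry transient failures with
    # exponential backoff before propagating the error.
    for attempt in range(attempts):
        try:
            return fn()
        except ConnectionError:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Hypothetical flaky endpoint: fails twice, then succeeds.
calls = {"n": 0}
def flaky_endpoint():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return {"status": "ok"}

result = call_with_retries(flaky_endpoint)
```

Retrying only transient error types (here `ConnectionError`, not all exceptions) is the design choice that keeps the wrapper safe to use around non-idempotent operations.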

31. BS in Computer Science with 4 years of Experience

  • Fluent in Python and comfortable with packages like NumPy and Pandas
  • Expertise in Apache Airflow
  • Advanced working knowledge of SQL, relational databases, and query tuning/optimization
  • Familiarity with a variety of non-relational data stores including Data Warehouses, NoSQL Systems, and Data Lakes
  • Passionate about designing elegant ETL pipelines for big data systems
  • Expert analysis and troubleshooting skills
  • Experience performing root cause analysis on internal and external data and processes
  • Experience in refining non-uniform or unstructured data into actionable data products
  • Experience with automated testing platforms and continuous integration
  • Comfortable with cloud computing services and concepts (AWS)
  • Working knowledge of message queuing, stream processing, and highly scalable data stores
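The bullet on refining non-uniform data into actionable products can be illustrated with a small sketch. The field names, aliases, and sample records are hypothetical; the same normalization is typically done at scale with Pandas, as the list above suggests.

```python
# Sketch of refining non-uniform records into a consistent shape: mixed-case
# and aliased keys are mapped to canonical names, and string values trimmed.

RAW = [
    {"Name": "Ada", "signup": "2024-01-05"},
    {"name": "grace ", "SignupDate": "2024-02-10"},
]

# Lowercased source key -> canonical field name (hypothetical mapping).
ALIASES = {"name": "name", "signupdate": "signup_date", "signup": "signup_date"}

def normalize(record):
    out = {}
    for key, value in record.items():
        canonical = ALIASES.get(key.lower())
        if canonical:
            out[canonical] = value.strip() if isinstance(value, str) else value
    return out

clean = [normalize(r) for r in RAW]
```

Unknown keys are dropped rather than passed through, so downstream consumers see a closed, documented schema.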

32. BS in Software Engineering with 6 years of Experience

  • Working knowledge of application and data security concepts, best practices, and common vulnerabilities
  • Previous experience working with offshore teams 
  • Working experience in the Financial industry 
  • Knowledge of big data languages such as R, Python/PySpark, and familiarity with cloud architecture
  • Advanced knowledge in data mining, forecasting, simulation, and/or predictive modeling
  • Advanced knowledge building BI environments with significant scale and scope
  • Advanced technical knowledge in IT systems and emerging technology trends and issues
  • Ability to synthesize data from multiple sources to address complex business questions
  • Advanced knowledge with analytics tools, “big data” technologies, cloud computing environments, relational databases
  • Ability to partner with key stakeholders to understand what data and analytics support is needed to help drive positive outcomes

33. BS in Data Science with 5 years of Experience

  • Solid experience with Big Data tools, including but not limited to Scala, Hadoop, Spark, PySpark, Hive, and cloud platforms
  • Strong understanding of Agile Principles (Scrum)
  • Proficient with relational data modeling
  • Proven ability to develop in cloud data warehouses
  • Proven ability to develop with Hadoop/HDFS
  • Full understanding of ETL concepts
  • Full understanding of data warehousing concepts
  • Exposure to VCS (Git, SVN)
  • Solid experience developing with either Java, Scala, or Python
  • Must have strong technical aptitude
  • Ability to work both as part of a team and independently; a versatile self-starter who takes risks responsibly
  • Ability to maintain confidentiality of work records
  • Highly developed communication skills including facilitation, presentation, and documentation
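One concrete data-warehousing concept behind the ETL and warehousing bullets above is surrogate-key assignment when loading a dimension table. A minimal sketch; the `Dimension` class and the customer records are hypothetical:

```python
# Data-warehousing sketch: assign surrogate keys while loading a dimension,
# so fact rows reference a stable integer key instead of the natural key.

class Dimension:
    def __init__(self):
        self.rows = {}       # natural key -> surrogate key
        self.next_key = 1

    def get_or_add(self, natural_key):
        # Return the existing surrogate key, or mint a new one.
        if natural_key not in self.rows:
            self.rows[natural_key] = self.next_key
            self.next_key += 1
        return self.rows[natural_key]

dim_customer = Dimension()
fact_rows = [
    {"customer_sk": dim_customer.get_or_add(cust), "amount": amt}
    for cust, amt in [("C100", 50.0), ("C200", 75.0), ("C100", 20.0)]
]
```

Repeated natural keys ("C100" appears twice) resolve to the same surrogate key, which is what lets the fact table join cleanly back to the dimension.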

34. BS in Information Technology with 8 years of Experience

  • Must have previously built large data products from scratch and maintained those products through the entire software development lifecycle at a growth-stage startup
  • Experience leveraging Spark (DataFrames, Query/Job Optimization, Building ETL Pipelines) or an analogous framework
  • Experience working in a fast-paced, highly collaborative, and ambitious startup environment
  • Proficient in the design and build of data pipelines using Python to extract, transform and load data from multiple sources
  • Skills and expertise in cloud technology, S3, Presto, Hive, and NoSQL databases
  • Knowledge of and experience with Kinesis Firehose, DynamoDB, and Kubernetes
  • Technically competent web application developer specializing in Microsoft technologies and cloud solutions
  • Highly experienced in database design and development
  • Experience in data and imaging compression technology
  • Proven ability to convert functional requirements and user stories into database designs and structures that mitigate potential issues, ensure data integrity, and maintain security
  • Expert in the full Software Development Life Cycle (SDLC) with the right Agile mindset
  • Strong sense of commitment, excellent execution, and strong critical thinking and analytical skills

35. BS in Computer Engineering with 7 years of Experience

  • Proven Data Analysis experience in an enterprise-level big data environment
  • Experience working with statistical/regulatory models and data environments
  • Excellent hands-on design and programming experience in multi-tiered applications
  • Previous people management experience
  • Advanced proficiency in Python
  • Practical experience working with data
  • Deep understanding of Spark, Hadoop, Airflow, Kafka, Terraform, Databricks, SageMaker, PostgreSQL, AWS Cloud
  • Experience with NLP models
  • Extensive Software Engineering experience with Java or Python
  • Data warehousing know-how (with MS SQL or similar RDBMS)
  • Demonstrable test automation experience
  • Exposure to BI development and BI reporting tools (e.g., Power BI, Tableau)
  • Experience building robust CI pipelines

36. BS in Information Systems with 6 years of Experience

  • Experience in computer science, information technology, or a related field
  • Hands-on experience with Amazon Web Services (AWS) based solutions such as RDS, Lambda, Snowflake, and S3 or other cloud technologies
  • Experience in migrating ETL processes (not just data) from relational warehouse Databases to AWS-based solutions
  • Experience with Spark and/or Python
  • Experienced in Agile methodologies
  • Prior experience in the Finance domain
  • Experience as a data engineer/software engineer or similar
  • Engineering experience in R, Python, or Java
  • Deep knowledge of and experience with data warehousing
  • Deep knowledge of and experience with setting up pipelines in a cloud environment (e.g., Data Lake, Databricks, Data Factory)
  • Professional proficiency in verbal and written English
  • Great communication and collaborative skills