DATA QUALITY LEAD SKILLS, EXPERIENCE, AND JOB REQUIREMENTS

Published: October 2, 2024 - The Data Quality Lead champions the enhancement of data quality management for master, transactional, and third-party data, focusing on financial and asset management sectors. This role is pivotal in integrating sophisticated data quality frameworks, utilizing financial data sources like Factset, Reuters, and Bloomberg, and promoting an automation-first approach with tools such as Python. Expertise in data warehousing, data lakes, and data modeling enriches the capacity to develop robust data quality strategies across a matrixed organization.

Essential Hard and Soft Skills for a Standout Data Quality Lead Resume
  • Data Management
  • Data Analysis
  • Database Skills
  • Data Modeling
  • Programming
  • Governance Knowledge
  • Warehousing Skills
  • Data Mining
  • Data Visualization
  • Machine Learning
  • Communication
  • Problem Solving
  • Attention to Detail
  • Leadership
  • Adaptability
  • Project Management
  • Collaboration
  • Influencing
  • Decision-Making
  • Strategic Thinking

Summary of Data Quality Lead Knowledge and Qualifications on Resume

1. BS in Data Science with 10 years of Experience

  • Work experience with an emphasis in a quantitative field such as business, finance, statistics, mathematics, computer science, or analytics
  • Experience in implementing data quality solutions for multiple clients
  • Informatica experience with IDQ
  • Experience in using FSS products
  • Project management and multi-tasking skills
  • Strong problem-solving and critical thinking skills
  • Proven ability to deliver in high pressure environments
  • Exceptional written and verbal communication skills
  • Ability to work in a fast-paced, action-oriented environment
  • Working knowledge of relational databases and SQL 
  • Experience in PowerBI, with additional experience in Tableau and similar BI tools
  • Must have the ability to work with new data sources and create relevant documentation and processes
  • Experience with Microsoft Azure and Agile methodology 
  • Must have a strong work ethic and commitment to getting things done

2. BS in Business Analytics with 4 years of Experience

  • Ability to work in mixed teams that may include not only EPAM colleagues but also client’s engineers and managers as well as third-party vendors
  • Strong knowledge of SQL
  • Strong understanding of DWH principles and best practices
  • Experience with test automation processes in data projects, with exposure to tools like Selenium, JMeter, Cucumber, SoapUI, etc.
  • Experience in manual functional testing (new feature testing/regression testing/smoke testing/integration testing)
  • Strong experience in Layer-to-Layer testing in Data project (ETL testing)
  • Experience in defect reporting (Jira), strong experience in defect assessment and QA investigation (UAT support, L2 support)
  • Experience in QA team management (task planning, distribution and tracking, reporting)
  • Experience in mentoring/coaching of new team members
  • Ability to communicate with different stakeholders from client and third-party sides
  • Experience with Waterfall and Scrum project management methodologies
  • Proficient working in AWS cloud environment
  • Solid experience working on Agile projects
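The layer-to-layer (ETL) testing mentioned above typically reconciles each warehouse layer against its source. A minimal sketch in Python, using an in-memory sqlite3 database to stand in for real staging and warehouse connections; the table and column names (stg_trades, dwh_trades, trade_id) are illustrative assumptions, not a prescribed schema:

```python
# Sketch of a layer-to-layer reconciliation test: compare row counts and
# key sets between a staging table and its warehouse target.
import sqlite3

def reconcile(conn, source_table, target_table, key_col):
    """Return counts for both layers plus any keys missing from the target."""
    cur = conn.cursor()
    src_count = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt_count = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    missing = cur.execute(
        f"SELECT {key_col} FROM {source_table} "
        f"EXCEPT SELECT {key_col} FROM {target_table}"
    ).fetchall()
    return {"source_rows": src_count, "target_rows": tgt_count,
            "missing_keys": [row[0] for row in missing]}

# Illustrative data: one trade loaded into staging never reached the warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_trades (trade_id INTEGER, amount REAL);
    CREATE TABLE dwh_trades (trade_id INTEGER, amount REAL);
    INSERT INTO stg_trades VALUES (1, 100.0), (2, 250.5), (3, 75.0);
    INSERT INTO dwh_trades VALUES (1, 100.0), (2, 250.5);
""")
report = reconcile(conn, "stg_trades", "dwh_trades", "trade_id")
print(report)  # -> {'source_rows': 3, 'target_rows': 2, 'missing_keys': [3]}
```

In practice the same comparison would run against the actual staging and warehouse connections, and checksums or column-level comparisons would be added on top of the key reconciliation.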

3. BS in Statistics with 8 years of Experience

  • IT experience including Data Integration, Data Quality, Data Migration, Data Governance projects.
  • Experience in Informatica Administration (PowerCenter and/or IDQ).
  • Experience in using IDQ in Data Governance projects.
  • Experience in troubleshooting Informatica (IDQ) administration/operating issues.
  • Strong debugging, problem-solving and investigative skills. 
  • Ability to assimilate information (log files, error messages etc.) and pursue leads to find root cause problems.
  • Understanding of how to integrate the Informatica DQ tool with Microsoft Azure, Collibra, and additional systems for monitoring enterprise data quality.
  • Excellent stakeholder-facing and internal communication skills.
  • Strong written and verbal communication skills.
  • Proven ability to execute multiple tasks efficiently and effectively.
  • Self-motivated, independent and possesses the ability to learn quickly.
  • Analytical mindset.
  • Ability to effectively conduct meetings, both formal and informal. 

4. BS in Computer Science with 5 years of Experience

  • Strong experience with data quality management principles around master data as well as transactional and 3rd party data, especially in financial/asset management context.
  • Demonstrated passion and experience in building a strong data quality organization
  • Experience with asset management or capital markets data such as financial securities data, risk data, trading data, Account, Product, and Client data.
  • Excellent problem solving and analytical skills with an automation mindset
  • Experience with data quality tools (like EDQ, Ataccama, IBM Information Analyzer etc.)
  • Strong proficiency with SQL (data definition and data manipulation)
  • Ability to automate data quality tasks with basic programming languages such as Python or similar
  • Experience using financial data sources such as Factset, Reuters, Bloomberg.
  • Understanding of data warehouses, data lakes, and data modeling helpful.
  • Experience in data quality related roles
  • Ability to build consensus across teams, with staff and leadership.
  • Ability to achieve results through influence without authority in a highly matrixed organization.
  • Ability to successfully negotiate and collaborate with others of different skill sets, backgrounds and levels within the organization.
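The Python automation expected in this profile often amounts to scripted completeness and uniqueness checks over incoming records. A minimal stdlib-only sketch; the record layout (security_id, price) and the specific checks are illustrative assumptions:

```python
# Sketch of automated data quality checks: completeness (required fields
# present) and uniqueness (no duplicate keys) over a batch of records.
from collections import Counter

def run_checks(records, key="security_id", required=("security_id", "price")):
    issues = []
    # Completeness: flag rows missing any required field.
    for i, row in enumerate(records):
        for field in required:
            if row.get(field) in (None, ""):
                issues.append(f"row {i}: missing {field}")
    # Uniqueness: flag keys that appear more than once.
    counts = Counter(row.get(key) for row in records)
    for k, n in counts.items():
        if k is not None and n > 1:
            issues.append(f"duplicate {key}: {k} ({n} rows)")
    return issues

sample = [
    {"security_id": "US0378331005", "price": 189.30},
    {"security_id": "US0378331005", "price": 189.30},  # duplicate key
    {"security_id": "US5949181045", "price": None},    # missing price
]
issues = run_checks(sample)
for issue in issues:
    print(issue)
```

A production version would typically pull thresholds and rule definitions from configuration and write results to a monitoring dashboard rather than stdout.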

5. BS in Information Systems with 7 years of Experience

  • DevOps with Gitlab/Jenkins CI/CD experience
  • Experience with the X-Ray test management tool preferred, along with similar tools like ALM and Azure Test Manager
  • Ability to influence cross-functional teams (Automation, Performance and Restaurant Functionality QA)
  • Experience creating test cases and test scripts for data warehouse testing, including data completeness testing, boundary testing, data validation based on data models and using source to target mapping documents, validation of transformations, and slowly changing dimensional data
  • Extensive SQL skills to support the testing listed above, including the ability to write simple and complex SQL scripts and stored procedures, and to create and execute on-demand and scheduled SQL jobs to support the testing effort
  • Clear understanding of data warehouse concepts based on experience, demonstrated by the ability to infer what types of tests need to be performed based on the business needs, business rules, data model, source to target mapping documents, and down-stream artifacts
  • Extensive API Testing knowledge using Postman and Swagger
  • Working knowledge of JSON and XML messaging
  • Able to perform in a fast-paced Agile environment (two-week sprints), attending sprint planning at the start of each sprint, a retrospective at the end, and a mid-sprint review/product backlog review (PBR) meeting to go over the backlog, prioritized user stories, and estimated points
  • Ability to work independently and to coordinate with offshore teams
  • Testing experience with Snowflake, Databricks, MS SQL Server, Apache Airflow, Tibco MDM, Talend, ADLS
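One of the warehouse tests named above, validating slowly changing dimension (Type 2) history, can be sketched in a few lines. The column names (customer_id, valid_from, valid_to, is_current) are illustrative assumptions; the check itself is the standard one: exactly one current row per key, and no gaps or overlaps in the effective-date ranges:

```python
# Sketch of an SCD Type 2 validation: one current row per business key,
# and contiguous valid_from/valid_to ranges within each key's history.
from itertools import groupby
from operator import itemgetter

def validate_scd2(rows):
    errors = []
    rows = sorted(rows, key=itemgetter("customer_id", "valid_from"))
    for key, group in groupby(rows, key=itemgetter("customer_id")):
        history = list(group)
        current = [r for r in history if r["is_current"]]
        if len(current) != 1:
            errors.append(f"{key}: {len(current)} current rows")
        for prev, nxt in zip(history, history[1:]):
            if prev["valid_to"] != nxt["valid_from"]:
                errors.append(f"{key}: gap/overlap at {nxt['valid_from']}")
    return errors

history = [
    {"customer_id": 7, "valid_from": "2023-01-01", "valid_to": "2023-06-01", "is_current": False},
    {"customer_id": 7, "valid_from": "2023-06-01", "valid_to": "9999-12-31", "is_current": True},
    {"customer_id": 8, "valid_from": "2023-01-01", "valid_to": "2023-03-01", "is_current": False},
    {"customer_id": 8, "valid_from": "2023-04-01", "valid_to": "9999-12-31", "is_current": True},  # gap
]
errors = validate_scd2(history)
for err in errors:
    print(err)  # -> 8: gap/overlap at 2023-04-01
```

Against a real warehouse (Snowflake, Databricks, MS SQL Server) the same rule would usually be expressed directly in SQL, but the logic under test is identical.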