WHAT DOES AN ETL DEVELOPER DO?
Updated: Jun 11, 2025 - The Extract, Transform, Load (ETL) Developer ensures adherence to Martin Marietta’s shared values and Code of Ethics while driving the strategic direction of IT enterprise architecture. Builds strong partnerships with business and technology teams to align Customer Information Architecture with company objectives and technical requirements. Develops processes, ensures data security, and provides ongoing architecture support to maintain consistency and compliance.


A Review of Professional Skills and Functions for the Extract, Transform, Load Developer
1. Extract, Transform, Load Developer Duties
- Data Engineering: Design and build datasets, analyze raw data, and maintain existing data assets.
- Data Integrity: Ensure data integrity is maintained.
- Data Provisioning: Provide data for system requirements.
- Data Quality: Improve data quality and efficiency.
- Data Organization: Improve data organization and accuracy.
- Documentation Support: Support documentation, management of source code, implementation, and other tasks.
- Requirements Translation: Transform functional requirements into technical specifications.
- Data Modelling: Create and update data models.
- Data Profiling: Perform data profiling and data quality analysis.
- ETL Development: Develop, tune, and maintain ETL objects and scripts using Informatica PowerCenter.
- ETL Programming: Design, program, test, and tune ETL code to move data from multiple data sources (databases and files) into target databases.
- Data Transformation: Ensure transformed/converted data maintains integrity and accuracy and meets performance expectations.
- Informatica Development: Develop Informatica ETL programs with Oracle and PostgreSQL databases as targets.
- Business Prioritization: Work with team leads to prioritize business and information needs.
- ETL Specifications: Prepare high-level ETL mapping specifications.
- SQL Development: Develop complex data scripts (primarily SQL) for ETL.
- Data Troubleshooting: Troubleshoot and determine the best resolution for data issues and anomalies.
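The duties above center on the extract-transform-load pattern itself. The sketch below illustrates that pattern end to end under stated assumptions: the document names Informatica PowerCenter and enterprise databases, but SQLite via Python's `sqlite3` stands in here so the example is self-contained, and the `src_orders`/`tgt_orders` tables, columns, and data-quality rule are hypothetical.

```python
import sqlite3

# Hypothetical source and target schemas, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_orders (id INTEGER, amount TEXT, region TEXT)")
conn.executemany("INSERT INTO src_orders VALUES (?, ?, ?)",
                 [(1, " 19.99", "east"), (2, "5.00", "WEST"), (3, None, "east")])
conn.execute("CREATE TABLE tgt_orders (id INTEGER PRIMARY KEY, amount REAL, region TEXT)")

# Extract: pull raw rows from the source system.
rows = conn.execute("SELECT id, amount, region FROM src_orders").fetchall()

# Transform: trim and cast amounts, standardize region codes,
# and reject rows that fail a simple data-quality rule.
clean = []
for oid, amount, region in rows:
    if amount is None:
        continue  # data-quality rule: amount is required
    clean.append((oid, float(amount.strip()), region.strip().upper()))

# Load: write only the validated rows into the target table.
conn.executemany("INSERT INTO tgt_orders VALUES (?, ?, ?)", clean)
loaded = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
print(loaded)  # rows that passed validation
```

In a production tool the transform step would be a mapping or mapplet rather than a Python loop, but the shape of the work — extract, validate, standardize, load — is the same.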
2. ETL Developer Details
- Requirements Analysis: Work with customers, managers, and Application Development Group departmental management to understand requirements for simple-to-complex business systems required by the group.
- Technical Specifications: Work with other IT staff to develop technical specifications based on the functional specifications as they are created.
- SDLC Process: Follow typical systems development life cycle processes in defining projects and gaining approval.
- Business Alignment: Work with the business unit to ensure that the system is being built to defined expectations and perform change management functions.
- ETL Development: Develop or participate in the development of ETL systems using Visual Studio, TFS, MS-SQL Server, and SQL Server Integration Services (SSIS) packages as requested by the manager.
- Problem Analysis: Analyze and assist in the correction of problems that may occur in system operations.
- Data Quality: Understand and interpret upstream and downstream data flow, data mapping, and check data quality from source to target systems.
- Impact Analysis: Perform impact analysis (IA) of new applications, new interfaces, and integration.
- Data Mapping: Develop mapping from the source system to the target system for all change requests.
- Solution Design: Responsible for implementing high-quality solutions by translating business requirements into functional specifications and technical specifications.
- ETL Architecture: Responsible for developing ETL architectural and design patterns, considerations, solution options, and impact.
- ETL Implementation: Design, develop, and implement ETL solutions using IBM InfoSphere CDC (Change Data Capture).
- Stakeholder Communication: Meet with users to clarify requirements and communicate with offshore/remote teams; both are core requirements of the role.
- Post-Production Support: Provide post-production batch support.
- Testing Support: Participate in unit testing, system integration testing, and UAT support.
- Batch Support: Provide night batch support for critical system uptime.
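Several of these duties revolve around source-to-target data mapping. One way to make such a mapping concrete is to express it as a specification that names, for each target column, its source field and the transformation applied in transit; the sketch below does this in Python with entirely hypothetical field names and rules.

```python
# Hypothetical source-to-target mapping specification: each target column
# records which source field feeds it and the rule applied in transit.
MAPPING = {
    "customer_id": {"source": "CUST_NO",   "rule": lambda v: int(v)},
    "full_name":   {"source": "CUST_NAME", "rule": lambda v: v.strip().title()},
    "signup_date": {"source": "OPEN_DT",   "rule": lambda v: v[:10]},  # keep ISO date part
}

def apply_mapping(source_row: dict) -> dict:
    """Project one source record onto the target schema per the mapping spec."""
    return {target: spec["rule"](source_row[spec["source"]])
            for target, spec in MAPPING.items()}

record = {"CUST_NO": "42", "CUST_NAME": "  ada LOVELACE ", "OPEN_DT": "2025-06-11T09:30:00"}
print(apply_mapping(record))
# {'customer_id': 42, 'full_name': 'Ada Lovelace', 'signup_date': '2025-06-11'}
```

Keeping the mapping as data rather than scattered code is also what makes the impact analysis mentioned above tractable: a changed source field can be traced to every target column it feeds.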
3. Extract, Transform, Load Developer Responsibilities
- Ethics & Compliance: Adherence to and compliance with Martin Marietta’s shared values and Code of Ethics.
- IT Strategy: Share in developing the long-term strategic direction for the company's IT enterprise architecture.
- Business Alignment: Understand the strategic objectives of the company and ensure that Customer Information Architecture is capable of supporting them.
- Stakeholder Management: Establish partnerships with key business constituents that help drive customer information-related requirements and ensure that the Customer Information Architecture vision is in alignment with these requirements.
- Technology Partnership: Partner with technology teams to ensure that the Customer Information Architecture supports the technical requirements of applications and that application designs leverage the Customer Information Architecture.
- Process Development: Develop processes and artifacts to ensure that new services and changes to existing services are aligned with the target architecture.
- Architecture Review: Provide ongoing architecture review and design support to ensure all areas of IT are implementing solutions consistent with the target architecture.
- Business Relationship Management: Liaise with management and subject matter experts across the industry and within the organization to accomplish objectives and build strong business relationships.
- Leadership & Initiative: Display initiative, inspire others, and deliver high-quality work while meeting important deadlines for internal customers.
- Data Requirements: Work with stakeholders to understand data needs and evaluate requirements for data ingestion.
- Project Tracking: Estimate, track, and communicate the status of assigned items to the Enterprise Data and Analytics team.
- Data Security: Ensure data security in all areas within the ADAP environment.
- Code Review: Perform code reviews and approve Pull Requests to ensure adherence to good software design practices and Architecture strategy intent.
- Collaboration: Participate in cross-team meetings.
- Issue Resolution: Participate in production issue investigation.
4. ETL Developer Accountabilities
- Data Modelling: Develop logical and physical data flow models for ETL applications. Lead the design and development of ETL processes for the data warehouse lifecycle (staging of data, ODS data integration, EDW, and data marts).
- ETL Best Practices: Lead solution design workshops, championing and promoting the application of ETL best practices, document ETL data mappings, data dictionaries, processes, programs, and solutions as per established standards for data governance.
- Cloud Data Engineering: Lead the design, build, testing, and maintenance of cloud-based data structures such as data marts, data warehouses, data lakes, and data pipelines to acquire, profile, cleanse, consolidate, integrate, and persist structured, semi-structured, and unstructured data.
- Database Design: Work with the team to design the logical data model, implement the physical database structure, and construct/implement operational data stores and data marts.
- ETL Development: Develop transformation queries and views for ETL processes and process automation; design and develop the stored procedures, functions, and triggers used during the ETL process.
- System Integration: Integrate ETL development with existing projects to maximize object reuse.
- Business Data Modelling: Design and develop business data models required by analytics applications such as interactive dashboards and ad-hoc reports.
- ETL Support: Provide support, maintenance, troubleshooting, and performance tuning to HRC ETL applications.
- Project Management: Manage multiple projects at various stages simultaneously.
- ETL Strategy: Develop ETL (Extract-Transform-Load) strategies to support HRC business customers.
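The staging-to-mart flow described in this section is typically implemented with set-based SQL: a transformation exposed as a view (or wrapped in a stored procedure on SQL Server or Oracle) and an `INSERT ... SELECT` load. A minimal sketch of that flow, using SQLite through Python's `sqlite3` so it runs standalone, with hypothetical `stg_sales` and `mart_daily_sales` tables:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Hypothetical staging table, fed by the extract step.
conn.execute("CREATE TABLE stg_sales (sale_dt TEXT, product TEXT, qty INTEGER, unit_price REAL)")
conn.executemany("INSERT INTO stg_sales VALUES (?, ?, ?, ?)", [
    ("2025-06-01", "bolt", 10, 0.50),
    ("2025-06-01", "bolt", 5, 0.50),
    ("2025-06-02", "nut", 20, 0.25),
])

# Transformation query exposed as a reusable view; on SQL Server or Oracle
# a stored procedure would typically play this role.
conn.execute("""
    CREATE VIEW v_daily_sales AS
    SELECT sale_dt, product, SUM(qty * unit_price) AS revenue
    FROM stg_sales
    GROUP BY sale_dt, product
""")

# Set-based load into the data-mart table.
conn.execute("CREATE TABLE mart_daily_sales (sale_dt TEXT, product TEXT, revenue REAL)")
conn.execute("INSERT INTO mart_daily_sales SELECT * FROM v_daily_sales")

for row in conn.execute("SELECT * FROM mart_daily_sales ORDER BY sale_dt"):
    print(row)
```

Pushing the aggregation into a view keeps the transformation reusable across loads and ad hoc queries, which is the object-reuse point made under System Integration above.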
5. Extract, Transform, Load (ETL) Developer Functions
- Data Engineering: Responsible for the full life cycle development, implementation, support, and self-serve of data processing within the Digital Data Platform.
- Data Architecture: Create, test, and maintain the optimum data processing pipeline design and architecture within the Digital Data Platform.
- Data Processing: Gather and process raw data at scale from different data sources and consolidate it into proper data layering techniques within the Digital Data Platform.
- Stakeholder Management: Work with stakeholders across divisions, including IT, Digital, DevOps, Operations, and other required teams, to assist with data processing and platform manageability and maintainability.
- Issue Resolution: Identify and resolve Digital Data Platform-related issues on the infrastructure level and data layers level.
- ETL Development: Contribute to the design and implementation of data pipeline solutions, ETL system enhancements, and Digital Data Platform enhancements.
- Technical Documentation: Provide technical documentation of data processing, ETL solutions, and Digital Data Platform changes and deployment.
- System Support: Perform system-level support of data processing, ETL solutions, and Digital Data Platform solutions, ensuring high quality and accuracy of work and service.
- Data Marts Development: Delivery and maintenance of data marts using WhereScape Red / SQL Server.
- Data Modelling: Lead data modeling, including discovery workshops, star schema design, and dimension conformance.
- Software Development: Lead and contribute across the development life cycle, e.g., testing and release.
- Collaboration: Work within the Data Analytics team alongside BI Analysts, Modelling Analysts, and Actuaries.
- Dimensional Modelling: Be the expert in dimensional modeling practice, including liaising with architecture, establishing standards, and coaching users.
- Data Visualization: Contribute to the development of data systems by building software for retrieving, analyzing, and visualizing data.
- Data Analysis: Analyze material flow data and extract useful statistics about failures to drive meaningful improvements to production control and customer experience.
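The star schema design and dimension conformance work described above can be sketched at its smallest: one dimension table carrying a surrogate key and a natural business key, and one fact table keyed to it. The schema below is a hypothetical illustration (again using SQLite via `sqlite3` so it runs standalone), not a schema from the source.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Minimal star schema: one conformed dimension plus a fact table keyed to it.
conn.executescript("""
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,  -- surrogate key
        product_code TEXT UNIQUE,          -- natural/business key
        category     TEXT
    );
    CREATE TABLE fact_sales (
        product_key INTEGER REFERENCES dim_product(product_key),
        sale_dt     TEXT,
        qty         INTEGER,
        revenue     REAL
    );
""")
conn.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                 [(1, "B-100", "fasteners"), (2, "N-200", "fasteners")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                 [(1, "2025-06-01", 15, 7.50), (2, "2025-06-02", 20, 5.00)])

# Typical dashboard query: facts rolled up through the dimension.
row = conn.execute("""
    SELECT d.category, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category
""").fetchone()
print(row)  # ('fasteners', 12.5)
```

Separating the surrogate key from the business key is what lets the dimension stay conformed across data marts even when source systems renumber or merge their product codes.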