BIG DATA DEVELOPER SKILLS, EXPERIENCE, AND JOB REQUIREMENTS

Updated: Sep 21, 2024 - The Big Data Developer combines a profound grasp of the Core Java development environment with extensive hands-on experience in Spark Architecture to excel at managing high-volume data. The role includes developing Hadoop-based BI solutions that enhance financial, sales, and marketing reporting, alongside a solid command of SQL, databases, and Data Warehousing principles. Skilled in sourcing data from diverse systems, this developer is adept at implementing complete projects on the Hadoop platform and demonstrates strong analytical and communication capabilities.

Essential Hard and Soft Skills for a Standout Big Data Developer Resume
Hard Skills:
  • Programming Languages
  • Big Data Frameworks
  • Database Management
  • Data Warehousing Solutions
  • Data Mining and Predictive Analytics
  • Machine Learning Algorithms
  • Data Visualization
  • Cloud Platforms
  • ETL Tools
  • Data Security and Privacy Knowledge

Soft Skills:
  • Problem-Solving
  • Communication
  • Attention to Detail
  • Time Management
  • Adaptability
  • Teamwork
  • Creativity
  • Critical Thinking
  • Leadership
  • Continuous Learning

Summary of Big Data Developer Knowledge and Qualifications on Resume

1. BS in Computer Science with 2 years of Experience

  • Experience with developing solutions on the AWS platform using services such as RDS, S3, IAM, Lambda, API Gateway
  • Hands-on experience migrating customers to the cloud and designing DevOps operational processes, deployment checklists, etc.
  • Experience with Scrum/Agile methodology
  • Experience working in cloud migration services
  • Keen interest in using any and all appropriate tools, especially cloud-based tools, to solve the problem at hand
  • Expert-level, demonstrated experience in developing code, implementing solutions, and adopting a cloud strategy
  • Experience working in Cloud environments, AWS, and Big data environments
  • Experience writing code in a high-level language such as Java and a scripting language such as Python
  • Experience building integrations between applications using REST APIs
  • Ability to configure and implement AWS tools such as CloudWatch and CloudTrail, and to direct system logs to them for monitoring (a minimal sketch follows this list).
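
Monitoring requirements like the item above are usually met programmatically. The following is a minimal sketch using boto3, assuming AWS credentials are already configured; the log group name, metric namespace, and alarm threshold are illustrative placeholders, not values taken from the listing.

    import boto3

    # Clients for CloudWatch metrics/alarms and CloudWatch Logs (region assumed).
    cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")
    logs = boto3.client("logs", region_name="us-east-1")

    # Create a log group so application and system logs can be directed to it.
    try:
        logs.create_log_group(logGroupName="/bigdata/etl-jobs")  # hypothetical name
    except logs.exceptions.ResourceAlreadyExistsException:
        pass

    # Publish a custom metric, e.g. records processed by a nightly batch job.
    cloudwatch.put_metric_data(
        Namespace="BigDataPipeline",  # hypothetical namespace
        MetricData=[{
            "MetricName": "RecordsProcessed",
            "Value": 1250000,
            "Unit": "Count",
        }],
    )

    # Raise an alarm when the job processes suspiciously few records.
    cloudwatch.put_metric_alarm(
        AlarmName="low-record-count",
        Namespace="BigDataPipeline",
        MetricName="RecordsProcessed",
        Statistic="Sum",
        Period=3600,
        EvaluationPeriods=1,
        Threshold=100000,
        ComparisonOperator="LessThanThreshold",
    )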

2. BS in Data Science with 3 years of Experience

  • Creation and maintenance of CI/CD pipelines
  • Test-driven development and/or behavior-driven development
  • Ability to stay current with the evolving platform and understand how new AWS services can enrich products
  • Familiarity with GIT and managing branching strategies
  • Strong skills in creating and using complex SQL queries, Stored Procedures, and validating reports/back-end data.
  • Strong Data Analysis skills with Database knowledge and experience with database query tools and languages (TOAD/DB Visualizer/SQL Developer).
  • Ability to work on multiple projects and flexibility to change priorities when needed.
  • Leverage industry best practices to design, test, implement, and support a solution.
  • Very hands-on coding experience using a modern programming language (Python or Scala)
  • Understanding of data ingestion mechanics
  • Experience with Hortonworks HDP or Cloudera CDP

3. BS in Statistics with 5 years of Experience

  • Working experience in writing and understanding complex HQL (Hive) queries and SQL queries
  • Hands-on experience with writing complex Spark programs in Scala/Python to process huge volumes of data (see the sketch after this list)
  • Experience with developing systems that can scale to large amounts of data
  • Experience in Unix Shell scripting for automation activities
  • Experience in development with any one of the traditional relational databases, such as Teradata, Oracle, Netezza, or SQL Server
  • Experience in working with Scrum teams
  • Experience with real-time streaming tools like Apache Kafka and NiFi
  • Experience in using ETL tools like Informatica or IBM
  • Knowledge of the Healthcare and Insurance Domain
  • Experience in Data Warehouse and BI Analytics
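
The Hive and Spark items above describe a common batch pattern: query a Hive table with HQL, transform it with Spark, and write the result back for reporting. A minimal PySpark sketch follows, assuming a Spark session with Hive support; the database, table, and column names are hypothetical.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Spark session with Hive support so HQL-managed tables are visible (assumed setup).
    spark = (SparkSession.builder
             .appName("claims-aggregation")
             .enableHiveSupport()
             .getOrCreate())

    # HQL query against a hypothetical Hive table of healthcare claims.
    claims = spark.sql("""
        SELECT member_id, claim_amount, service_date
        FROM healthcare.claims
        WHERE service_date >= '2024-01-01'
    """)

    # Aggregate the high-volume data with DataFrame transformations.
    monthly = (claims
               .withColumn("month", F.date_format("service_date", "yyyy-MM"))
               .groupBy("member_id", "month")
               .agg(F.sum("claim_amount").alias("total_claims")))

    # Write the result to a partitioned Hive table for downstream BI reporting.
    (monthly.write
            .mode("overwrite")
            .partitionBy("month")
            .saveAsTable("analytics.monthly_claims"))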

4. BS in Information Technology with 4 years of Experience

  • Good understanding of the project lifecycle process, with experience working in an agile scrum operating model
  • Good understanding of the software development lifecycle, including continuous build and test, peer code review, design review, info security review, production, UAT, and QA release cycles
  • Experience with NoSQL databases, such as HBase, Cassandra
  • Authorization to work in the US is a precondition of employment
  • Experience in a senior Applications Developer role
  • Must have strong hands-on experience in Big Data technologies (Hadoop, Spark, Scala, Elastic).
  • Must have good knowledge of NoSQL (HBase) and MongoDB (RDBMS knowledge is a plus)
  • Excellent written/oral communication and presentation skills

5. BS in Applied Mathematics with 1 year of Experience

  • Strong understanding of Core Java development environment and hands-on experience.
  • Has strong knowledge of Spark Architecture and insight into code behavior when high volume data is involved.
  • Have hands-on experience in the design, development and/or support of solutions based on the Hadoop platform
  • Experience in developing/architecting Hadoop-based BI solutions to support financial, sales, and marketing performance reporting
  • Must possess strong knowledge of SQL and databases
  • Must possess a good grasp of Business Intelligence and Data Warehousing concepts
  • Experience in sourcing data from disparate systems with a good understanding of Data Models and ETL procedures
  • Well-versed with deployment methodologies
  • Must have good analytical and communication skills
  • Experience in end-to-end implementation of projects on the Hadoop platform.

6. BA in Computer Information Systems with 2 years of Experience

  • Experience in Spark Development.
  • Knowledge of Spark-based code enhancements for performance improvement.
  • Experience in Core Java development
  • Experience in software development life cycle.
  • Experience in Project life cycle activities on development and maintenance projects.
  • Experience in Design and architecture review.
  • Ability to work in a team in a diverse, multi-stakeholder environment
  • Experience in the Banking domain
  • Experience and desire to work in a Global delivery environment

7. BS in Software Engineering with 6 years of Experience

  • Working with Big Data technologies, such as Hadoop, ElasticSearch, Hue, Spark, and Kafka.
  • Proven experience in developing MapReduce / HQL processes.
  • Proven experience in developing in Scala / Java or Python.
  • Experience in working with Oracle / MySQL databases preferred.
  • Excellent analytical skills and a talent for providing technological solutions to business problems.
  • Strong independent learning skills and the ability to tackle complex technological challenges.
  • Ability to thrive when working either independently or within a team of engineers.
  • Excellent communication and interpersonal skills.
  • Basic knowledge of industry practices and standards
  • Exposure to Volcker regulation
  • Ability to showcase a dynamic leadership presence in various forums.
  • Ability to work with a geographically distributed and diversified team.

8. BA in Data Analytics with 3 years of Experience

  • A solid track record of data management showing flawless execution and attention to detail.
  • Strong knowledge of and experience with statistics.
  • Programming experience, ideally in Python, Spark, Kafka, Elasticsearch, and a willingness to learn new programming languages to meet goals and objectives.
  • Must be very good with Python frameworks like Pandas, NumPy, SciPy, etc. (a brief sketch follows this list)
  • Experience in C, Java or other programming languages
  • Knowledge of data cleaning, wrangling, visualization, and reporting, with an understanding of the best, most efficient use of associated tools and applications to complete these tasks.
  • Deep knowledge of data mining, machine learning, natural language processing, or information retrieval.
  • Experience processing large amounts of structured and unstructured data, including integrating data from multiple sources.
  • A willingness to explore new alternatives or options to solve data mining issues
  • Experience in production support and troubleshooting.
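
A brief sketch of the Pandas/NumPy work implied by the list above, assuming a hypothetical events.csv file; the column names are illustrative.

    import numpy as np
    import pandas as pd

    # Load raw data (hypothetical file and columns).
    df = pd.read_csv("events.csv", parse_dates=["event_time"])

    # Cleaning: drop duplicates, normalize text, fill missing numeric values.
    df = df.drop_duplicates()
    df["channel"] = df["channel"].str.strip().str.lower()
    df["amount"] = df["amount"].fillna(df["amount"].median())

    # Wrangling: flag values more than three standard deviations from the mean.
    zscores = np.abs((df["amount"] - df["amount"].mean()) / df["amount"].std())
    df["is_outlier"] = zscores > 3

    # Reporting: daily totals per channel.
    summary = (df.groupby([df["event_time"].dt.date, "channel"])["amount"]
                 .sum()
                 .reset_index(name="daily_amount"))
    print(summary.head())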

9. BS in Artificial Intelligence with 4 years of Experience

  • Experience in financial services industry products and regulatory development.
  • Solid working experience with various forms of data infrastructure, including RDBMS, such as SQL, Hadoop, Spark, Java, Unix, Oracle, and OBIEE
  • Relevant professional experience in Data Engineering and Business Intelligence
  • Experience in Advanced SQL, RDBMS, ETL, and Data warehousing.
  • Strong knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ ELT
  • Experience in reporting/analytic tools and environments, data structures, data modeling and performance tuning.
  • Ability to effectively communicate with both business and technical teams
  • Experience in processing streaming data, with an understanding of the difference between streaming and real-time processing (see the sketch after this list)
  • Knowledge of databases, software algorithms, and design patterns
  • Experience in Data Visualization & Reporting application development, such as Tableau and QlikView
  • Experience in Data Quality Management application development
  • Experience with BPM-related products and projects
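
For the streaming item above, a minimal Spark Structured Streaming sketch is shown below, assuming a Spark build with the Kafka connector and a local broker; the topic name and checkpoint path are placeholders. Structured Streaming processes data in micro-batches, which is near-real-time rather than strictly real-time.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

    # Read a continuous stream of events from Kafka (hypothetical topic).
    events = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers", "localhost:9092")
              .option("subscribe", "transactions")
              .load())

    # Count events per one-minute window using the Kafka message timestamp.
    counts = (events
              .selectExpr("CAST(value AS STRING) AS value", "timestamp")
              .groupBy(F.window("timestamp", "1 minute"))
              .count())

    # Stream results to the console; a production job would write to a sink
    # such as HBase, Elasticsearch, or a Hive table instead.
    query = (counts.writeStream
             .outputMode("complete")
             .format("console")
             .option("checkpointLocation", "/tmp/stream-checkpoint")
             .start())
    query.awaitTermination()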

10. BA in Information Systems with 5 years of Experience

  • Must have experience with design, development, testing, and deployment on Hadoop platforms
  • Must have experience with Spark and Scala/Python.
  • Must have working experience developing programs running on Java/Python.
  • Must have strong experience with UNIX shell scripting
  • Must have experience with one of the IDE tools such as Eclipse.
  • Must have experience with an SDLC methodology (Agile/Scrum/Iterative Development).
  • Must have experience with NoSQL databases like HBase, MongoDB, or Cassandra.
  • Must have experience with developing Pig scripts, Hive QL, and Sqoop
  • Must have experience with Jenkins and Jira tools
  • Ability to work both independently and as part of a team.
  • Ability to solve problems and have analytical thinking capabilities
  • Experience with developing MapReduce programs running on the Hadoop cluster using Java/Python (a minimal sketch follows this list).
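
A minimal MapReduce sketch in Python via Hadoop Streaming, which runs an ordinary script as mapper and reducer on the cluster; the script and file names are illustrative, and the word-count logic is only an example of the pattern.

    #!/usr/bin/env python3
    # wordcount.py - run as mapper ("map") or reducer ("reduce") under Hadoop Streaming.
    import sys

    def mapper():
        # Emit "<word>\t1" for every word read from stdin.
        for line in sys.stdin:
            for word in line.strip().split():
                print(f"{word}\t1")

    def reducer():
        # Input arrives sorted by key, so counts can be accumulated per word.
        current, count = None, 0
        for line in sys.stdin:
            word, value = line.rstrip("\n").split("\t")
            if word != current:
                if current is not None:
                    print(f"{current}\t{count}")
                current, count = word, 0
            count += int(value)
        if current is not None:
            print(f"{current}\t{count}")

    if __name__ == "__main__":
        mapper() if sys.argv[1] == "map" else reducer()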

11. BS in Computational Mathematics with 2 years of Experience

  • Experience delivering business intelligence solutions
  • Able to scope large efforts, conduct meetings, and write technically oriented documents
  • Able to work with functional and technical teams
  • Proven ability to deliver complex and time-sensitive projects
  • Strong communication (written and oral) and interpersonal skills
  • Architect / Lead with a strong passion for turning large data into actionable insights that drive improved business performance.
  • Good analytical and problem-solving skills
  • Fluent in relational database concepts
  • Must be knowledgeable in software development lifecycles/methodologies, e.g., Agile
  • Has strong presentation and collaboration skills and can communicate all aspects of the job requirements, including the creation of formal documentation
  • Strong problem-solving, time management, and organizational skills
  • Good knowledge and understanding of the manufacturing industry
  • Experience working with multicultural teams across multiple geographies, with good collaboration skills
  • Fluent in English verbal/written communication.