3003 Summit Boulevard Atlanta, GA 30319 | Direct Hire
- Gather and process raw, structured, semi-structured, and unstructured data using batch and real-time data processing frameworks.
- Implement and optimize data solutions in enterprise data warehouses and big data repositories, focusing primarily on migration to the cloud.
- Understand and enforce appropriate master data management techniques.
- Ensure data quality and implement tools and frameworks for automating the identification of data quality issues.
- Work with internal and external data providers on data validation, provide feedback, and make customized changes to data feeds and data mappings for analytical and operational use.
- Understand the challenges that the analytics organization faces in their day-to-day work, and partner with them to design viable data solutions.
- Recommend improvements to processes, technology, and interfaces that improve the effectiveness of the team and reduce technical debt.
- Provide ongoing support, monitoring, and maintenance of deployed products.
- Work directly with stakeholders to understand real world problems in our domain.
- Shape actionable items for the data services engineering team to solve those real-world problems.
- Deliver new and enhanced capabilities with our Enterprise Data Platform partners to meet product, engineering, and business needs.
- Demonstrate ownership of initiatives and drive them through to completion.
- Extensive experience engineering server-side code for data processing applications.
- Working experience with Hadoop ecosystem technologies such as Hive or Pig.
- Working experience with distributed, scalable data stores such as HBase, Accumulo, or Bigtable.
- Working experience with MapReduce-based frameworks.
- Experience with data concepts (SQL, NoSQL, Normalization) and working with complex data structures.
- Exposure to Agile/SAFe methodologies.
- Git or general version control experience.
- Knowledge of functional programming.
- Ability to understand and work with product teams to engineer requirements.
- Analytical skills and the ability to pay careful attention to detail.
- Technical writing skills, including high- and low-level diagramming techniques.
- MS/BS degree in Computer Science, related field, or equivalent work experience.
- Experience with data modeling.
- Experience with event frameworks (Kafka, Kinesis, RabbitMQ, etc.).
- Working knowledge of statistics and how it applies to operational metrics.