Big Data Architect

Duluth, GA | Scout

Post Date: 07/24/2017 Job ID: 46334 Industry: Big Data

We work more like a small startup. Our development teams are small and embrace agile principles to quickly deliver value to our stakeholders. We combine the latest open-source technologies with traditional enterprise software products.

Our office culture is casual, fun, and social, with an emphasis on education and innovation. We have the freedom to try new ideas and experiment, and we are expected to be constantly learning and growing. There is also a strong emphasis on mentoring others in the group, enabling them to grow and learn.

What You’ll Do:
  • Lead the Big Data engineering team, specializing in data ingestion (from 100+ source systems in batch and near real-time), egestion, and governance.
  • Participate in collaborative software and system design and development of the new Data Lake on Hortonworks HDP and HDF distributions.
  • Manage own learning and contribute to technical skill building of the team.
  • Inspire and cultivate the engineering mindset and systems thinking.
  • Gain deep technical expertise in data movement patterns, practices, and tools.
  • Play an active role in Big Data Communities of Practice.
  • Put the minimal system needed into production.

Qualifications:

Required Qualifications:
  • Bachelor’s degree or higher in Computer Science or a related field.
  • Good understanding of distributed computing and big data architectures.
  • Passion for software engineering and craftsman-like coding prowess.
  • Proven experience developing Big Data solutions in the Hadoop ecosystem using Apache NiFi, Kafka, Flume, Sqoop, Apache Atlas, Hive, HDFS, HBase, and Spark (Hortonworks HDP and HDF preferred).
  • Experience with at least one leading Change Data Capture (CDC) tool, such as Informatica PowerCenter.
  • Development experience with at least one NoSQL database. HBase or Cassandra preferred.
  • Polyglot development (4–5+ years): Capable of developing in Java and Scala, with a good understanding of functional programming, SOLID principles, concurrency models, and modularization.
  • DevOps: Appreciates the CI/CD model and always builds to ease consumption and monitoring of the system. Experience with Maven (or Gradle or SBT) and Git preferred.
  • Experience in Agile development, including Scrum and other lean techniques.
  • Believes in the “you build it, you ship it, you run it” philosophy.
  • Personal qualities such as creativity, tenacity, curiosity, and passion for deep technical excellence.

Desired Qualifications:
  • Experience with Big Data migration/transformation programs in the Data Warehousing and/or Business Intelligence areas.
  • Experience with ETL tools such as Talend, Pentaho, and Attunity.
  • Knowledge of Teradata, Netezza, or similar platforms.
  • Good grounding in NoSQL data stores such as Cassandra and Neo4j.
  • Strong knowledge of computer algorithms.
  • Experience with workload orchestration and automation tools such as Oozie and Control-M.
  • Experience building self-contained applications using Docker, Vagrant, and Chef.
Apply Online
