Position summary
A Data Engineer at Figg is a software engineer with deep proficiency in data. The data engineer will build and maintain the Figg data warehouse, which is used for both reporting and analytics across the company. The individual works closely with the analytics, account management, and sales teams to identify and develop high-quality data pipelines. The data comes from a variety of sources, and it is the responsibility of the data engineer to make sense of it using cloud-based systems (AWS) and deliver it in a reliable, structured format that serves the different business needs at Figg.
Duties & responsibilities
- Advise, consult, and coach other data professionals on standards and practices
- Adapt and learn new technologies in a quickly changing field
- Recommend and implement best tools to ensure optimized data performance
- Design, develop, optimize, and maintain data architecture and pipelines that adhere to ETL principles and business goals
- Teach and learn continuously, recognizing that ongoing learning is the cornerstone of every successful engineer
- Foster a culture of sharing, re-use, and design for the scale, stability, and operational efficiency of data and analytical solutions
- Partner with business analysts and solutions architects to develop technical architectures for strategic enterprise projects and initiatives
- Maintain a solid understanding of AWS tools for ETL processing, including their pros and cons, and communicate those trade-offs clearly
The ideal candidate
- Has 7+ years of experience in data analytics or engineering roles
- Has 5+ years of experience in a data engineering position
- Has advanced working SQL knowledge, including query authoring and experience with a variety of RDBMSs
- Has experience with NoSQL datastores
- Has experience with the design and development of data pipelines
- Has experience with ETL, specifically AWS Glue
- Has experience with programming languages, specifically Python
- Has a solid understanding of data streaming engines such as Spark
- Is knowledgeable in shell scripting in a Unix environment
- Has experience with AWS cloud services such as EMR, RDS, and Athena
- Bonus: Cloud certification
- Bonus: Experience with Map/Reduce programming and Hadoop
- Bonus: Experience with infrastructure tooling
- Bonus: Experience working with Pentaho
A Bachelor’s degree in Computer Science or a related field, or equivalent work experience, is required. Candidates must have the legal right to work in the USA.