There are LOTS of MUSTS for this position - top 3 priorities are AWS, ETL, and Data Warehouse. MUST have Redshift or Snowflake implementation experience - NOT just knowledge of it. Agile work environment with 2-week sprints. PLUSSES - programming experience (C#, Java, Python)
The Senior Data Engineer relishes working with large volumes of data, enjoys the challenge of highly complex technical contexts, and, above all, is passionate about data and analytics. This person is an expert in modern data architectures, data modeling, ETL design, and business intelligence tools, and partners closely with the business to identify strategic opportunities where improvements in data infrastructure create significant business impact. The role demands exceptional technical expertise in large-scale data warehouse and BI systems.
Major responsibilities of the position are listed below. To perform the job successfully, the individual must be able to execute each essential duty satisfactorily.
· Responsible for designing, building & managing the advanced analytics platform that supports the data science and business intelligence teams, with a focus on accuracy, scalability and high performance.
· Work with the product owner and data stakeholders to understand the business requirements and implement optimal data solutions.
· Architect, implement, and operate stable, scalable and highly performant data pipelines that cleanse, structure and integrate disparate big data sets into a readable and accessible format for end user analyses and reporting.
· Build & operate the TaxAct Data Lake on AWS technologies to optimize our data lifecycle and enable next generation data analytics.
· Provide architectural, functional and process guidance on system capabilities and solution approaches.
· Provide technical guidance and thought leadership to other members of the team.
Education & Experience
· Bachelor’s Degree in Computer Science or equivalent experience
· 5+ years hands-on experience with Data Warehousing design & implementation (preferably with AWS Redshift or Snowflake Data Warehouse)
· 5+ years of strong experience with SQL and ability to write efficient code for high volume data processing
· 5+ years hands-on ETL experience supporting high-performing ETL processes, including data quality and testing processes
· 2+ years hands-on experience implementing data lake solutions (preferably on AWS technologies)
· Experience with AWS Big Data technologies (Kinesis, EMR, AWS Glue, etc.)
· Experience with NoSQL technologies (DynamoDB, Redis, ElasticSearch, etc.)
· 2+ years programming experience with an advanced language (C#, Python, Java, etc.)
· Experience ensuring data quality across multiple datasets used for analytical purposes
· Experience with data visualization tools such as Tableau, QlikView, Power BI, etc.
· Experience in agile development methodologies
· Knowledge of statistical concepts and their application in reporting and data mart applications
· Good communication (oral and written) and interpersonal skills
· Enthusiastic attention to detail
· Ability to acquire new skills quickly and thrive in a collaborative team environment
The requirements listed below are representative of the knowledge, skills, and abilities required.
· English Language — Knowledge of the structure and content of the English language, including the meaning and spelling of words, rules of composition, and grammar.
· Mathematics — Knowledge of arithmetic, algebra, statistics, and their applications.
· Computers — Knowledge of computer hardware and software, including applications and programming.