In A Nutshell
Build cloud-based data pipelines and architecture while maintaining and improving back-end Python applications.
Responsibilities
- Supporting the creation and rollout of our cloud-based data ingestion and translation architecture (tech stack includes DBT and Airflow).
- Maintaining and building upon our relational databases.
- Ensuring smooth functioning of our development, quality assurance, and production environments.
- Developing and maintaining (debugging, improving, testing, and deploying) Python scripts.
- Building integration applications and solving integration issues.
- Creating and executing unit test plans based on system and validation requirements.
- Documenting changes in software for end users.
- Serving as a thought partner to colleagues and internal stakeholders on the feasibility and time requirements of proposed projects.
Skillset
- Experience with Python and SQL.
- 3–5 years' experience as a software developer, data analyst, engineer, or similar, in a full-time role.
- Understanding of cloud architectures and their applications within a data-driven environment.
- Experience integrating different applications through APIs.
- Experience using Python to work with datasets.
- Experience with SQL and related tools for building data transformation models.
- Experience with code management and review tools such as GitHub.
- A degree in computer programming or a related field, or equivalent practical experience.
- Experience with cloud-deployed applications on AWS a plus.
- Experience with data modeling and implementation and/or Ruby on Rails a plus but not required.