
Staff Data Engineer

Cowbell Cyber

Software Engineering, Data Science
Pune, Maharashtra, India
Posted on Wednesday, May 24, 2023

Job Location: Balewadi, Amar Tech Park, Pune

Interview Process: HR Screening Call / Online Technical Assessment via Codility / L1 Technical Round / L2 Hiring Manager or Leadership Round.

Senior Data Engineer (Data Acquisition)

What you will do:

  • Partner with data scientists and business stakeholders to understand data needs and help build data products that scale across the company
  • Drive the collection of new data from first-party and third-party sources via APIs, and refine existing data sources (see the ingestion sketch after this list).
  • Enhance the Cowbell Connector platform by adding new functionality and improving its usability and observability.
  • Implement tools for monitoring and ensuring data quality and consistency.
  • Collaborate with architects and lead engineers to define and maintain standards and best practices for data pipelines.
  • Use continuous integration and delivery/deployment (CI/CD) tooling to support, enhance, and grow our CI/CD capabilities.
  • Work with containerization technologies such as Docker and Kubernetes.
  • Develop and maintain effective working relationships with data suppliers.
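
To give a flavor of the data-acquisition work above, here is a minimal sketch of paginated ingestion from a third-party REST API. The endpoint, field names, and the fetch_pages helper are all hypothetical, not Cowbell's actual connector code.

```python
"""Illustrative only: paginated ingestion from a hypothetical third-party API."""
import requests

ENDPOINT = "https://api.example-vendor.com/v1/records"  # hypothetical data supplier

def fetch_pages(api_key: str, page_size: int = 100):
    """Yield successive pages of records until the API returns an empty page."""
    page = 1
    while True:
        resp = requests.get(
            ENDPOINT,
            params={"page": page, "per_page": page_size},
            headers={"Authorization": f"Bearer {api_key}"},
            timeout=30,
        )
        resp.raise_for_status()  # surface supplier-side errors early
        records = resp.json().get("records", [])
        if not records:
            break
        yield records
        page += 1
```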

What Cowbell needs from you:

  • Solid experience in data ingestion and integration by consuming third-party APIs.
  • Proficiency in one or more programming languages such as Java or Python, and rock-solid SQL skills (a small SQL-based data-quality check follows this list).
  • Experience implementing a data lake architecture and/or enterprise data solutions.
  • Experience with data warehousing solutions such as Snowflake, Redshift, or similar.
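
To make the SQL and data-quality expectations concrete, here is a tiny, self-contained sketch of SQL-based quality checks. It uses sqlite3 so it runs anywhere; in practice such checks would target a warehouse like Snowflake or Redshift, and the policies table is made up.

```python
"""Illustrative SQL data-quality checks; table and columns are hypothetical."""
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE policies (policy_id TEXT, premium REAL, effective_date TEXT);
    INSERT INTO policies VALUES ('P-1', 1200.0, '2023-01-01');
    INSERT INTO policies VALUES ('P-1', 1200.0, '2023-01-01');  -- duplicate row
    INSERT INTO policies VALUES ('P-2', NULL,   '2023-02-01');  -- missing premium
""")

checks = {
    "duplicate policy_ids":
        "SELECT COUNT(*) FROM (SELECT policy_id FROM policies "
        "GROUP BY policy_id HAVING COUNT(*) > 1)",
    "null premiums":
        "SELECT COUNT(*) FROM policies WHERE premium IS NULL",
}
for name, sql in checks.items():
    (count,) = conn.execute(sql).fetchone()
    print(f"{name}: {count} offending row(s)")
```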

Preferred qualifications:

  • 7+ years of experience in data engineering, including 4 years working with Python
  • Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
  • Experience building large-scale distributed data systems
  • Experience working with AWS/GCP/Azure big data technologies
  • Experience with distributed computing technologies such as Apache Kafka and Spark (see the PySpark sketch after this list)
  • Knowledge of data modeling, ETL development, and data warehousing
  • Knowledge of database and software engineering best practices across the development lifecycle
  • Strong communication skills and experience gathering requirements
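
For the distributed-computing item above, a minimal PySpark sketch of the kind of batch aggregation such work might involve; the S3 paths and the event_ts column are hypothetical, and this assumes a local PySpark installation.

```python
"""Illustrative batch aggregation with PySpark; paths and columns are made up."""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("vendor-feed-rollup").getOrCreate()

# Read a raw vendor feed (hypothetical location), count events per day,
# and write the result back out as Parquet.
events = spark.read.json("s3://example-bucket/raw/vendor_feed/")
daily = events.groupBy(F.to_date("event_ts").alias("day")).count()
daily.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_event_counts/")
```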

This is a hybrid role requiring 3 days per week in the office.