Senior Data Engineer

Job Description

Cloudleaf is transforming supply chain visibility and optimization. We leverage the ubiquity of low-cost sensors and mobile devices, together with the power of cloud computing and machine learning, to deliver real-time, data-driven operational intelligence across global supply chains. Cloudleaf is an early-revenue growth company currently working with top Fortune 500 companies.

Cloudleaf’s patented technology includes the Sensor Fabric™, a mesh of intelligent IoT sensors, gateways and cloud engines. Our Sensor Fabric™ captures location, sensor fusion (such as temperature, shock and vibration) and path data in real time. Our location, workflow and business rules engines in the cloud transform streaming digital data into actionable insights. Dashboards provide alerts, notifications, metrics, trend analytics and KPIs to solve audit/compliance, quality and production problems. Cloudleaf’s library of APIs/SDKs enables seamless integration with existing enterprise systems. We deliver a 50X+ cost improvement over existing solutions, enabling mass-scale deployments that were previously cost-prohibitive.

If you’re seeking an intellectual challenge, a stellar team, and limitless opportunity, Cloudleaf is the place for you. We’re changing the game in how sensor networks are deployed and delivered. We are looking for inspired, passionate thinkers and doers to help us with our mission.

As a senior data engineer, you will establish, advocate for, and execute integration strategy and frameworks, leveraging the latest integration technologies to help our integrations, data, and support organizations build the unique, industry-leading Cloudleaf Digital Visibility Platform. You will develop data processing pipelines, including both inbound and outbound integration solutions, with third-party device vendors, shipment carriers, suppliers, and large enterprise ERP systems. You will work closely with the Data and Application teams to ensure data is seamlessly integrated, transformed, processed, and surfaced in the application, and with the Device and Customer Success teams to run PoCs with third-party device vendors and prospective customers.

To be successful in this position, you must be a self-starter who takes accountability, speaks up, works at a quick pace, and handles multiple tasks simultaneously while maintaining a good sense of humor. You must possess strong oral and written communication skills and collaborate effectively with other team members. You will be expected to contribute ideas and provide feedback on how to continually improve the way we innovate solutions.

Our backend is written primarily in Java, built on the Spring Framework. Our backend stack includes technologies such as Kafka, RabbitMQ, Spark, and multiple NoSQL/SQL data stores. We deploy our applications to the AWS and Azure clouds. Experience with these technologies and frameworks is desirable.

Responsibilities

  • Architect, implement and maintain performant data processing pipelines, microservices and backend systems
  • Work with cross-functional teams located in different geographies (India)
  • Document detailed data processing pipeline architecture, integration flows/processes
  • Participate in quick PoCs with third-party device vendors, carriers, and prospective customers
  • Communicate effectively with internal teams, business partners and collaborate well within a team environment to drive results
  • Embrace new technologies and work with various tools to achieve desired functionality
  • Develop automated test routines and perform unit, system and functional testing
  • Help, motivate, and coach junior members of the team

Qualifications

  • 5+ years of software development experience, including designing, building and maintaining distributed systems, data pipelines, RESTful web applications and other backend systems
  • 1+ years of experience designing and implementing API-led connectivity solutions and webhooks
  • Experience with microservices architecture, NoSQL data stores and message brokers such as Kafka and RabbitMQ is a plus
  • Integration experience with cloud SaaS platforms such as Salesforce and Oracle Cloud, and with cloud-based ETL services such as Azure Data Factory and AWS Glue, is welcome
  • Ability to translate product requirements into a clean, generalized architecture, document it, and explain it to other stakeholders
  • Experience working with geographically distributed teams across time zones is desirable
  • Strong written and verbal communication skills

Compensation & Benefits

  • Comprehensive health, vision, and dental coverage, plus 401(k)
  • Equity stake in our future success
  • Flexible vacation policy
  • Team-building activities
  • Kitchen stocked with snacks and drinks
  • Work on exciting new technologies and applications

Our Core Values

  • Innovation, Agility, Action and Results
  • Customers and solutions come first
  • We innovate and constantly improve
  • Have fun and thrive in a fast-paced environment