[EY813] - WOBOT.AI - COMPUTER VISION ENGINEER - EDGE DEVICES

Wobot.ai


**Key Responsibilities**:

- Develop and implement computer vision algorithms for edge devices, including object detection, recognition, tracking, and segmentation.
- Optimize computer vision algorithms to run efficiently on edge devices with limited computational resources.
- Design and implement software architecture for computer vision systems on edge devices.
- Collaborate with hardware engineers to optimize performance and minimize power consumption.
- Test and evaluate computer vision algorithms on various edge devices to ensure functionality and performance.
- Stay up to date with the latest research and advancements in computer vision and machine learning.

**Requirements**:

- Bachelor's or Master's degree in Computer Science, Electrical Engineering, or a related field.
- Strong proficiency with computer vision libraries and frameworks such as OpenCV, TensorFlow, and PyTorch.
- Experience with low-level programming languages such as C++ and CUDA.
- Familiarity with hardware design and optimization for edge devices.
- Ability to work in a team environment and communicate effectively with cross-functional teams.

**How we work**:

- We use Microsoft Teams for daily communication and conduct daily standups and team meetings over Teams.
- We value open discussion, ownership, and a founder mindset.
- We prioritize design, amazing UI/UX, documentation, to-do lists, and data-based decision-making.
- We encourage team bonding through bi-weekly town halls, destressing sessions with a certified healer, and fun company retreats twice a year.
- We offer a 100% remote workplace model, health insurance, attractive equity options for top performers, mental health consultations, company-sponsored upskilling courses, growth hours, 40 hours to give back to community causes, and access to a financial advisor.

**Shift Timings**: 8 AM to 5 PM EST

**Salary**: $5,732,850 - $143,338,613 per year

**Speak with the employer**: +91 9000000000
