
Job Description:
- Design, implement and deliver custom solutions using the existing robotics framework.
- Debug issues, do root-cause analysis and apply fixes.
- Design and implement tools to facilitate application development and testing.
- Participate in architectural improvements.
- Work with team members in deployment and field testing.
Requirements:
- Bachelor's or Master's degree in Engineering (ECE or CSE preferred)
- Work experience of 3+ years in software programming.
- Proficiency in Python programming for Linux based systems.
- Solid understanding of software engineering principles.
- Basic knowledge of the Robot Operating System (ROS) is a plus.
- Good understanding of algorithms and control loops.
- Working knowledge of Git: creating, merging branches, cherry-picking commits, examining the diff between two hashes. Advanced Git usage is a plus.
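The Git bullet above names concrete operations; as a rough sketch of what that working knowledge covers (repository, branch, and file names here are invented purely for illustration), the workflow might look like:

```shell
# Hedged sketch of the Git operations listed above; repo, branch, and
# file names are invented for illustration.
set -e
repo=$(mktemp -d); cd "$repo"
git init -q
git config user.email "dev@example.com"
git config user.name "Dev"

echo "base" > app.txt
git add app.txt
git commit -qm "initial commit"

git checkout -qb feature            # create and switch to a new branch
echo "feature work" >> app.txt
git commit -qam "feature change"

git checkout -q -                   # back to the default branch
git merge -q --no-edit feature      # merge the branch

hash_a=$(git rev-parse HEAD~1)      # the initial commit
hash_b=$(git rev-parse HEAD)        # the feature commit
git diff "$hash_a" "$hash_b"        # examine the diff between two hashes

git checkout -qb hotfix "$hash_a"   # branch off the initial commit
git cherry-pick "$hash_b"           # cherry-pick the feature commit onto it
```

Advanced usage (interactive rebase, bisect, reflog recovery) builds on the same primitives.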
- Knowledge of video streaming from edge devices is a plus.
- Thrive in a fast-paced environment and own project tasks end-to-end with minimal hand-holding.
- Ability to learn and adopt new technologies and skills; work on projects independently with timely, defect-free delivery.

Mandatory (Experience 1) – Must have 5+ years of hands-on experience in Information Security, with a primary focus on cloud security across AWS, Azure, and GCP environments.
Mandatory (Experience 2) – Must have strong practical experience working with Cloud Security Posture Management (CSPM) tools such as Prisma Cloud, Wiz, or Orca along with SIEM / IDS / IPS platforms
Mandatory (Experience 3) – Must have proven experience in securing Kubernetes and containerized environments including image security, runtime protection, RBAC, and network policies.
Mandatory (Experience 4) – Must have hands-on experience integrating security within CI/CD pipelines using tools such as Snyk, GitHub Advanced Security, or equivalent security scanning solutions.
Mandatory (Experience 5) – Must have a solid understanding of core security domains including network security, encryption, identity and access management, key management, and security governance, along with cloud-native security services such as GuardDuty and Azure Security Center
Mandatory (Experience 6) – Must have practical experience with Application Security Testing tools including SAST, DAST, and SCA in real production environments
Mandatory (Experience 7) – Must have hands-on experience with security monitoring, incident response, alert investigation, root-cause analysis (RCA), and managing VAPT / penetration testing activities
Mandatory (Experience 8) – Must have experience securing infrastructure-as-code and cloud deployments using Terraform, CloudFormation, ARM, Docker, and Kubernetes
Mandatory (Core Skill) – Must have working knowledge of globally recognized security frameworks and standards such as ISO 27001, NIST, and CIS with exposure to SOC2, GDPR, or HIPAA compliance environments
Job Title: Senior AIML Engineer – Immediate Joiner (AdTech)
Location: Pune – Onsite
About Us:
We are a cutting-edge technology company at the forefront of digital transformation, building innovative AI and machine learning solutions for the digital advertising industry. Join us in shaping the future of AdTech!
Role Overview:
We are looking for a highly skilled Senior AIML Engineer with AdTech experience to develop intelligent algorithms and predictive models that optimize digital advertising performance. Immediate joiners preferred.
Key Responsibilities:
- Design and implement AIML models for real-time ad optimization, audience targeting, and campaign performance analysis.
- Collaborate with data scientists and engineers to build scalable AI-driven solutions.
- Analyze large volumes of data to extract meaningful insights and improve ad performance.
- Develop and deploy machine learning pipelines for automated decision-making.
- Stay updated on the latest AI/ML trends and technologies to drive continuous innovation.
- Optimize existing models for speed, scalability, and accuracy.
- Work closely with product managers to align AI solutions with business goals.
Requirements:
- 4-6 years of experience in AI/ML, with a focus on AdTech (mandatory).
- Strong programming skills in Python, R, or similar languages.
- Hands-on experience with machine learning frameworks like TensorFlow, PyTorch, or Scikit-learn.
- Expertise in data processing and real-time analytics.
- Strong understanding of digital advertising, programmatic platforms, and ad server technology.
- Excellent problem-solving and analytical skills.
- Immediate joiners preferred.
Preferred Skills:
- Knowledge of big data technologies like Spark, Hadoop, or Kafka.
- Experience with cloud platforms like AWS, GCP, or Azure.
- Familiarity with MLOps practices and tools.
How to Apply:
If you are a passionate AIML engineer with AdTech experience and can join immediately, we want to hear from you. Share your resume and a brief note on your relevant experience.
Join us in building the future of AI-driven digital advertising!
Position Responsibilities
• Design, develop and test responsive and modular web applications providing an optimal user experience on desktop and mobile devices
• Coordinate with other developers and teams in a fast-paced, collaborative development environment
• Research, build and coordinate the conversion and/or integration of new features
• Troubleshoot and analyse the root cause of pre-prod or production problems, and resolve issues
• Address problems with systems integration and compatibility
• Demonstrate the impact of design on scalability, performance, and reliability
• Follow established coding and software tool standards in adherence to established security and quality control standards for software development
• Provide technical guidance to junior team members
Requirements and Qualifications
- Bachelor’s degree in Computer Science or related field
- 8+ years of experience as a frontend engineer building large, cross-platform applications
- SME-level experience in Angular and/or React
- Excellent experience with GraphQL, WebRTC, WebSockets, REST, PWAs, and Service Workers
- Excellent understanding of the DOM, component rendering, and client-side performance issues
- Deep knowledge of bundling/build tools such as Webpack and of optimizing builds
Good-to-have Qualifications
- Experience with building maps, reporting and analytics solutions
- Solid understanding of creating cross-platform mobile applications and publishing them on various channels
- Experience with native Android, Swift, or reactive interfaces using RxJS
- Experience with Cloud Technologies
The ideal candidate is self-motivated, a multi-tasker, and a demonstrated team player. You will be a lead developer responsible for the development of new software products and enhancements to existing products. You should excel at working with large-scale applications and frameworks and have outstanding communication and leadership skills.
Qualifications
- 2-4 years of relevant experience in developing complex enterprise applications using Java, Spring Boot, message queues (e.g. Apache Kafka, RabbitMQ), multithreading, and SQL
- Ability to write high-quality code with speed
- Should have experience in building various end-to-end features
- Product and startup background is preferred
- Have a good understanding of SDLC, Agile, and design patterns
- Have working experience with GitHub and Jira
- Should be able to mentor and lead the team in fast-paced environments.
- High-growth/startup mindset
- Open-source contributions/an active Git profile are a big plus
- Interest in cloud security is a big plus
- Experience developing data center applications in cloud security, backup, and disaster recovery is a big plus
- Knowledge of AWS/Azure is a big plus
Roles & Responsibilities
- Writing good quality code.
- Be able to design, implement, and deploy scalable backend applications on the cloud (AWS/Azure).
- Be able to take ownership of projects and deliver them on time.
- Be able to lead a team and mentor junior developers.
Qualifications:
- BE/BTech degree in CS/IT or similar related field.
- 4+ years experience as backend engineer.
- Good knowledge of Python and at least one framework such as Django, Flask, or FastAPI.
- Hands-on experience building microservices applications on AWS/GCP/Azure.
- Good understanding of platforms (Docker, AWS/Azure).
- Must have a good understanding of data structures, algorithms, databases, and other CS concepts.
- Experience working with multiple databases and data modelling.
- Experience writing APIs and related technologies like REST, JSON, WebSockets, gRPC, etc.
- Should be inquisitive enough to pick up any language or framework based on need.
- Should be able to design, implement, and deploy end-to-end systems.
- Good knowledge of deploying cloud applications with Docker/Kubernetes is a plus.
- Experience building end-to-end analytics systems is a plus.
Do you have excellent tech knowledge, some work experience, and a wish to take your career in a new direction? Then this role is for you.
Founded in 2014, this company aims to enhance the quality of life and make it easier for women across the country. Recommended by best of doctors, their products are available across wellness stores and e-commerce.
- Understanding the business goals, what the product offers, the platform for communication and accordingly creating copy that can help drive maximum conversions.
- Measuring performance of the copy at every step to ensure business objectives are met.
- Researching about the brand, competition, and category and creating a communication plan for the brand in line with the brand values.
- Ensuring consistent brand voice across all brand channels and pieces of communication
- Understanding the platform for communication and creating copy that works for the platform.
- Measuring content for performance and conversion.
- Monitoring web traffic and engagement (e.g. conversions and bounce rates) based on the content created.
- Coordinating and collaborating with marketing and design teams to ensure high quality output.
- Follow industry-related news and generate ideas around trending topics.
- Reviewing and updating published content as needed, identifying gaps and making recommendations accordingly
What you need to have:
- Any Graduation
- Being highly creative and imaginative
- Having good written and interpersonal skills
- Working well in a team
- Ability to work under pressure and manage deadlines
- Having an eye for detail
- Having an interest in new advertising trends and techniques
They are backed by some of the world's top marquee funds in their journey to bring transparency and standardization to an otherwise opaque industry.
Your responsibilities:
● Understand the business context and build high-quality code using proven design and architectural patterns
● Develop, test and deploy integrations required to meet business requirements
● Carry out unit tests and other quality control mechanisms to inform and validate the code and design
● Utilize and monitor cloud infrastructure resources (such as AWS, Azure) efficiently
● Participate in a highly fluid environment applying agile software development principles
● Ensure the coding standards are on par with the best in the industry
Educational Qualifications:
● Bachelor's or Master’s degree in a quantitative field (e.g. Mathematics, Engineering, Computer Science)
Must have skills:
● 3+ years of experience with React.js, React Native, and Node.js/Java/Go
● Experience in building and deploying a mobile application using React Native.
● Strong proficiency in JavaScript/TypeScript, including DOM manipulation and the JavaScript object model
● A sound understanding of Redux/Flux, Webpack, ES6, and Jest.
● Must have hands-on experience with RESTful APIs
● Using logic and reasoning to identify the strengths and weaknesses of alternative solutions, conclusions, or approaches to problems
● Ability to ship features on a weekly basis: should be good at time management and prioritization
● Hands-on experience in CI/CD principles and TDD
● Strong written and verbal English communication skills
Good to have skills:
● Experience in event-driven & asynchronous I/O frameworks
● Exposure to business process and workflow automation
● Working experience in process-driven and data-intensive business applications
Be Part Of Building The Future
Dremio is the Data Lake Engine company. Our mission is to reshape the world of analytics to deliver on the promise of data with a fundamentally new architecture, purpose-built for the exploding trend towards cloud data lake storage such as AWS S3 and Microsoft ADLS. We dramatically reduce and even eliminate the need for the complex and expensive workarounds that have been in use for decades, such as data warehouses (whether on-premise or cloud-native), structural data prep, ETL, cubes, and extracts. We do this by enabling lightning-fast queries directly against data lake storage, combined with full self-service for data users and full governance and control for IT. The results for enterprises are extremely compelling: 100X faster time to insight; 10X greater efficiency; zero data copies; and game-changing simplicity. And equally compelling is the market opportunity for Dremio, as we are well on our way to disrupting a $25BN+ market.
About the Role
The Dremio India team owns the DataLake Engine along with the cloud infrastructure and services that power it. With a focus on next-generation data analytics supporting modern table formats like Iceberg and Delta Lake, open source initiatives such as Apache Arrow and Project Nessie, and hybrid-cloud infrastructure, this team provides many opportunities to learn, deliver, and grow in your career. We are looking for innovative minds with experience leading and building high-quality distributed systems at massive scale and solving complex problems.
Responsibilities & ownership
- Lead, build, deliver and ensure customer success of next-generation features related to scalability, reliability, robustness, usability, security, and performance of the product.
- Work on distributed systems for data processing with efficient protocols and communication, locking and consensus, schedulers, resource management, low latency access to distributed storage, auto scaling, and self healing.
- Understand and reason about concurrency and parallelization to deliver scalability and performance in a multithreaded and distributed environment.
- Lead the team to solve complex and unknown problems
- Solve technical problems and customer issues with technical expertise
- Design and deliver architectures that run optimally on public clouds like GCP, AWS, and Azure
- Mentor other team members for high quality and design
- Collaborate with Product Management to deliver on customer requirements and innovation
- Collaborate with Support and field teams to ensure that customers are successful with Dremio
Requirements
- B.S./M.S./equivalent in Computer Science or a related technical field, or equivalent experience
- Fluency in Java/C++ with 8+ years of experience developing production-level software
- Strong foundation in data structures, algorithms, multi-threaded and asynchronous programming models, and their use in developing distributed and scalable systems
- 5+ years experience in developing complex and scalable distributed systems and delivering, deploying, and managing microservices successfully
- Hands-on experience in query processing or optimization, distributed systems, concurrency control, data replication, code generation, networking, and storage systems
- Passion for quality, zero downtime upgrades, availability, resiliency, and uptime of the platform
- Passion for learning and delivering using latest technologies
- Ability to solve ambiguous, unexplored, and cross-team problems effectively
- Hands-on experience working on projects on AWS, Azure, and Google Cloud Platform
- Experience with containers and Kubernetes for orchestration and container management in private and public clouds (AWS, Azure, and Google Cloud)
- Understanding of distributed file systems such as S3, ADLS, or HDFS
- Excellent communication skills and affinity for collaboration and teamwork
- Ability to work individually and collaboratively with other team members
- Ability to scope and plan solutions for big problems and mentor others on the same
- Interested and motivated to be part of a fast-moving startup with a fun and accomplished team
Your mission is to help lead the team toward creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing and application programming will help your team raise their game, meeting your standards as well as satisfying both business and functional requirements. Your expertise in various technology domains will be counted on to set strategic direction and solve complex, mission-critical problems, internally and externally. Your quest to embrace leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties :
- As a Data Engineer you will be responsible for the development of data pipelines for numerous applications handling all kinds of data: structured, semi-structured, and unstructured. Big data knowledge, especially Spark and Hive, is highly preferred.
- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions
Education level :
- Bachelor's degree in Computer Science or equivalent
Experience :
- Minimum of 5 years of relevant experience on production-grade projects, with hands-on, end-to-end software development
- Expertise in application, data, and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design, and business processes
Proficiency in :
- Modern programming languages like Java, Python, and Scala
- Big data technologies: Hadoop, Spark, Hive, Kafka
- Writing well-optimized SQL queries
- Orchestration and deployment tools like Airflow and Jenkins for CI/CD (optional)
- Responsible for design and development of integration solutions with Hadoop/HDFS, real-time systems, data warehouses, and analytics solutions
- Knowledge of system development lifecycle methodologies, such as waterfall and Agile.
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Experience generating physical data models and the associated DDL from logical data models.
- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping, and data rationalization artifacts.
- Experience enforcing data modeling standards and procedures.
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development, and big data solutions.
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
Skills :
Must Know :
- Core big-data concepts
- Spark - PySpark/Scala
- Data integration tools like Pentaho, NiFi, SSIS, etc. (at least one)
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow
