50+ Python Jobs in Hyderabad | Python Job openings in Hyderabad
Apply to 50+ Python Jobs in Hyderabad on CutShort.io. Explore the latest Python Job opportunities across top companies like Google, Amazon & Adobe.



What We’re Looking For
As a Senior AI/ML Engineer at Meltwater, you’ll play a vital role in building cutting-edge social solutions for our global client base within the Explore mission. We’re seeking a proactive, quick-learning engineer who thrives in a collaborative environment.
Our culture values continuous learning, team autonomy, and a DevOps mindset. Meltwater development teams take full ownership of their subsystems and infrastructure, including running on-call rotations.
With a heavy reliance on Software Engineering in AI/ML and Data Science, we seek individuals with experience in:
- Cloud infrastructure and containerization (Docker, Azure or AWS – Azure preferred)
- Data preparation
- Model lifecycle (training, serving, registries)
- Natural Language Processing (NLP) and Large Language Models (LLMs)
In this role, you’ll have the opportunity to:
- Push the boundaries of our technology stack
- Modify open-source libraries
- Innovate with existing technologies
- Work on distributed systems at scale
- Extract insights from vast amounts of data
What You’ll Do
- Lead and mentor a small team while doing hands-on coding.
- Demonstrate excellent communication and collaboration skills.
What You’ll Bring
- Bachelor’s or Master’s degree in Computer Science (or equivalent) OR demonstrable experience.
- Proven experience as a Lead Software Engineer in AI/ML and Data Science.
- 8+ years of working experience.
- 2+ years of leadership experience as Tech Lead or Team Lead.
- 5+ years of strong knowledge of Python and software engineering principles.
- 5+ years of strong knowledge of cloud infrastructure and containerization.
- Docker (required).
- Azure or AWS (required, Azure preferred).
- 5+ years of strong working knowledge of TensorFlow / PyTorch.
- 3+ years of good working knowledge of MLOps principles.
- Data preparation.
- Model lifecycle (training, serving, registries).
- Theoretical knowledge of AI / Data Science in one or more of:
- Natural Language Processing (NLP) and LLMs
- Neural Networks
- Topic modelling and clustering
- Time Series Analysis (TSA): anomaly detection, trend analysis, forecasting
- Retrieval Augmented Generation
- Speech to Text
- Excellent communication and collaboration skills.
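The model-lifecycle requirement above (training, serving, registries) can be pictured with a minimal, framework-free sketch. The registry class and model names below are illustrative stand-ins, not the actual stack; a real system would use something like MLflow or SageMaker Model Registry.

```python
# Minimal model-lifecycle sketch: train -> register -> serve.
# Pure Python; names and the "model" itself are invented for illustration.

class ModelRegistry:
    """In-memory stand-in for a real model registry."""
    def __init__(self):
        self._models = {}  # (name, version) -> model

    def register(self, name, version, model):
        self._models[(name, version)] = model

    def load(self, name, version):
        return self._models[(name, version)]

def train(data):
    """'Train' a trivial mean-predictor on numeric data."""
    mean = sum(data) / len(data)
    return lambda _x: mean  # the 'model' always predicts the training mean

# Lifecycle: train, register a version, then serve predictions from it.
registry = ModelRegistry()
model = train([1.0, 2.0, 3.0])
registry.register("mean-predictor", "v1", model)

served = registry.load("mean-predictor", "v1")
print(served(10))  # 2.0
```

The point of the registry indirection is that serving code asks for a (name, version) pair rather than holding a direct reference, which is what makes rollbacks and A/B comparisons tractable.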
What We Offer
- Flexible paid time off options for enhanced work-life balance.
- Comprehensive health insurance tailored for you.
- Employee assistance programs covering mental health, legal, financial, wellness, and behavioural support.
- Complimentary Calm App subscription for you and your loved ones.
- Energetic work environment with a hybrid work style.
- Family leave program that grows with your tenure.
- Inclusive community with professional development opportunities.
Our Story
At Meltwater, we believe that when you have the right people in the right environment, great things happen.
Our best-in-class technology empowers 27,000 customers worldwide to make better business decisions through data. But we can’t do that without our global team of developers, innovators, problem-solvers, and high-performers who embrace challenges and find new solutions.
Our award-winning global culture drives everything we do. Employees can make an impact, learn every day, feel a sense of belonging, and celebrate successes together.
We are innovators at the core who see potential in people, ideas, and technologies. Together, we challenge ourselves to go big, be bold, and build best-in-class solutions.
- 2,200+ employees
- 50 locations across 25 countries
We are Meltwater. We love working here, and we think you will too.
"Inspired by innovation, powered by people."


What You’ll Do:
As an AI/ML Engineer at Meltwater, you’ll play a vital role in building cutting-edge social solutions for our global client base within the Explore mission. We’re seeking a proactive, quick-learning engineer who thrives in a collaborative environment. Our culture values continuous learning, team autonomy, and a DevOps mindset.
Meltwater development teams take full ownership of their subsystems and infrastructure, including running on-call rotations. With a heavy reliance on Software Engineering in AI/ML and Data Science, we seek individuals with experience in:
- Cloud infrastructure and containerization (Docker, Azure or AWS is required; Azure is preferred)
- Data Preparation
- Model Lifecycle (training, serving, and registries)
- Natural Language Processing (NLP) and LLMs
In this role, you’ll have the opportunity to push the boundaries of our technology stack, from modifying open-source libraries to innovating with existing technologies. If you’re passionate about distributed systems at scale and finding new ways to extract insights from vast amounts of data, we invite you to join us in this exciting journey.
What You’ll Bring:
- Bachelor’s or Master’s degree in Computer Science (or equivalent), or demonstrable experience.
- Proven experience as a Software Engineer in AI/ML and Data Science.
- 2–4 years of working experience.
- Strong working experience in Python and software engineering principles (2+ Years).
- Experience with cloud infrastructure and containerization (1+ Years).
- Docker is required.
- Experience with TensorFlow / PyTorch (2+ Years).
- Experience with ML-Ops Principles (1+ Years).
- Data Preparation
- Model Lifecycle (training, serving, and registries)
- Sound knowledge of at least one cloud platform (AWS/Azure).
- Good theoretical knowledge of AI / Data Science in one or more of the following areas:
- Natural Language Processing (NLP) and LLMs
- Neural Networks
- Topic Modelling and Clustering
- Time Series Analysis (TSA), including anomaly detection, trend analysis, and forecasting
- Retrieval Augmented Generation
- Speech to Text
- Excellent communication and collaboration skills
What We Offer:
- Enjoy comprehensive paid time off options for enhanced work-life balance.
- Comprehensive health insurance tailored for you.
- Employee assistance programs covering mental health, legal, financial, wellness, and behaviour areas to ensure your overall well-being.
- Energetic work environment with a hybrid work style, providing the balance you need.
- Benefit from our family leave program, which grows with your tenure at Meltwater.
- Thrive within our inclusive community and seize ongoing professional development opportunities to elevate your career.
Where You’ll Work:
HITEC City, Hyderabad.
Our Story:
The sky is the limit at Meltwater.
At Meltwater, we believe that when you have the right people in the right working environment, great things happen. Our best-in-class technology empowers our 27,000 customers around the world to analyse over a billion pieces of data each day and make better business decisions.
Our award-winning culture is our north star and drives everything we do – from striving to create an environment where all employees do their best work, to delivering customer value by continuously innovating our products — and making sure to celebrate our successes and have fun along the way.
We’re proud of our diverse team of 2,300+ employees in 50 locations across 25 countries around the world. No matter where you are, you’ll work with people who care about your success and get the support you need to reach your goals.
So, in a nutshell, that’s Meltwater. We love working here, and we think you will too.

Role: Mobile Automation Engineer (SDET) — On-site, India
Role & Responsibilities
- Design, build and maintain scalable mobile test automation frameworks for Android and iOS using Appium, Espresso, XCUITest or equivalent tools to support continuous delivery.
- Create and own automated test suites (functional, regression, UI, and smoke) that run reliably in CI/CD pipelines (Jenkins/GitHub Actions) and on cloud device farms (BrowserStack/Sauce Labs).
- Collaborate with Developers and Product Owners to translate requirements into test strategies, write robust test cases, and automate end-to-end and integration scenarios (including API tests).
- Investigate, triage, and debug failures — use device logs, ADB, Xcode traces, and performance tools to isolate flakiness and reliability issues and drive fixes.
- Integrate automated tests into build pipelines, enforce quality gates, and provide actionable reporting and metrics for release readiness.
- Advocate and implement test automation best practices: code quality, modular frameworks, reusability, CI parallelization, and maintainable test data strategies.
Skills & Qualifications
- Must-Have
- 3+ years in mobile QA/automation with hands-on experience in Appium or native frameworks (Espresso/XCUITest) across Android and iOS.
- Strong programming skills in Java/Kotlin or Swift and working knowledge of Python or JavaScript for scripting and test tooling.
- Experience integrating automated suites into CI/CD (Jenkins/GitHub Actions) and executing on real & virtual device clouds (BrowserStack/Sauce Labs).
- Practical experience with API testing (REST), test frameworks (TestNG/JUnit/Mocha), and source control (Git).
- Solid debugging skills using ADB, Xcode, Android SDK, and familiarity with mobile performance profiling.
- Preferred
- Experience building custom automation frameworks, parallel test execution, and reliability/flakiness reduction strategies.
- Knowledge of CI orchestration, containerized test runners, and mobile security or accessibility testing.
- ISTQB or equivalent QA certification, prior experience in Agile/Scrum teams, and exposure to device lab management.
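One of the flakiness-reduction strategies mentioned in the preferred qualifications is bounding retries around known-flaky cases so transient failures don't block a pipeline. A minimal sketch, with a hypothetical decorator and test:

```python
import functools

def retry(times=3, allowed=(AssertionError,)):
    """Re-run a flaky test up to `times` times before failing for real."""
    def deco(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            last = None
            for _ in range(times):
                try:
                    return fn(*args, **kwargs)
                except allowed as exc:
                    last = exc  # remember the failure, try again
            raise last
        return wrapper
    return deco

# Simulate a test that fails twice, then passes on the third attempt.
calls = {"n": 0}

@retry(times=3)
def flaky_check():
    calls["n"] += 1
    assert calls["n"] >= 3, "transient failure"
    return "passed"

print(flaky_check())  # passed
```

Retries are a mitigation, not a fix: pairing them with failure-rate metrics (e.g. how often the retry was needed) is what lets a team actually drive flakiness down.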
We are seeking a highly skilled Fabric Data Engineer with strong expertise in Azure ecosystem to design, build, and maintain scalable data solutions. The ideal candidate will have hands-on experience with Microsoft Fabric, Databricks, Azure Data Factory, PySpark, SQL, and other Azure services to support advanced analytics and data-driven decision-making.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines using Microsoft Fabric and Azure data services.
- Implement data integration, transformation, and orchestration workflows with Azure Data Factory, Databricks, and PySpark.
- Work with stakeholders to understand business requirements and translate them into robust data solutions.
- Optimize performance and ensure data quality, reliability, and security across all layers.
- Develop and maintain data models, metadata, and documentation to support analytics and reporting.
- Collaborate with data scientists, analysts, and business teams to deliver insights-driven solutions.
- Stay updated with emerging Azure and Fabric technologies to recommend best practices and innovations.
Required Skills & Experience
- Proven experience as a Data Engineer with strong expertise in the Azure cloud ecosystem.
Hands-on experience with:
- Microsoft Fabric
- Azure Databricks
- Azure Data Factory (ADF)
- PySpark & Python
- SQL (T-SQL/PL-SQL)
- Solid understanding of data warehousing, ETL/ELT processes, and big data architectures.
- Knowledge of data governance, security, and compliance within Azure.
- Strong problem-solving, debugging, and performance tuning skills.
- Excellent communication and collaboration abilities.
Preferred Qualifications
- Microsoft Certified: Fabric Analytics Engineer Associate / Azure Data Engineer Associate.
- Experience with Power BI, Delta Lake, and Lakehouse architecture.
- Exposure to DevOps, CI/CD pipelines, and Git-based version control.
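The ETL/ELT flow this role centres on can be shown in miniature with the standard library. The table and column names are invented, and a real pipeline would run on ADF, Databricks, or PySpark rather than sqlite3; the shape of the work is the same.

```python
import sqlite3

# Miniature ETL: extract raw rows, transform (clean + cast), load into a
# curated table, then aggregate. All names are illustrative.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_sales (region TEXT, amount TEXT)")
conn.executemany("INSERT INTO raw_sales VALUES (?, ?)",
                 [("south ", "100"), ("NORTH", "250"), ("south", "50")])

# Extract
rows = conn.execute("SELECT region, amount FROM raw_sales").fetchall()

# Transform: normalize region names, cast string amounts to numbers
clean = [(r.strip().lower(), float(a)) for r, a in rows]

# Load into the curated table and aggregate for reporting
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", clean)
totals = dict(conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region"))
print(totals)  # {'north': 250.0, 'south': 150.0}
```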


Job Description
We are looking for a talented Java Developer to work abroad. You will be responsible for developing high-quality software solutions, working on both server-side components and integrations, and ensuring optimal performance and scalability.
Preferred Qualifications
- Experience with microservices architecture.
- Knowledge of cloud platforms (AWS, Azure).
- Familiarity with Agile/Scrum methodologies.
- Understanding of front-end technologies (HTML, CSS, JavaScript) is a plus.
Requirement Details
Bachelor’s degree in Computer Science, Information Technology, or a related field (or equivalent experience).
Proven experience as a Java Developer or similar role.
Strong knowledge of Java programming language and its frameworks (Spring, Hibernate).
Experience with relational databases (e.g., MySQL, PostgreSQL) and ORM tools.
Familiarity with RESTful APIs and web services.
Understanding of version control systems (e.g., Git).
Solid understanding of object-oriented programming (OOP) principles.
Strong problem-solving skills and attention to detail.

We are hiring a QA Automation Engineer with strong expertise in automation frameworks and hands-on manual testing. The role requires designing and executing test strategies, building automation scripts, identifying bugs, and ensuring the delivery of high-quality, secure, and scalable applications.
Key Responsibilities
- Design, develop, and maintain automation test scripts using Selenium, Appium, Playwright, or Cypress.
- Perform manual testing (functional, regression, smoke, UAT) for web and mobile apps.
- Execute test cases, track bugs, and document results using JIRA or similar tools.
- Conduct API testing using Postman / Rest Assured.
- Perform cross-browser and cross-platform testing to ensure compatibility.
- Collaborate with developers and product teams to reproduce and resolve defects.
- Support CI/CD pipelines with automated test integration (Jenkins, GitLab CI, GitHub Actions).
- Conduct database testing (SQL queries) for data validation.
- Contribute to performance testing using JMeter or LoadRunner.
- Ensure test documentation (test plans, test cases, defect reports, execution results) is accurate and updated.
- Work in Agile/Scrum teams, participate in sprint planning, and provide QA estimates.
- Apply security testing basics to detect vulnerabilities.
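The database-testing responsibility above often reduces to comparing source rows against target rows after a load. A small illustrative helper (all names and rows hypothetical):

```python
# Minimal data-validation check of the kind used in database testing:
# compare source rows against target rows and report discrepancies.

def validate(source_rows, target_rows):
    """Return rows missing from the target and unexpected extras."""
    src, tgt = set(source_rows), set(target_rows)
    return {"missing": sorted(src - tgt), "extra": sorted(tgt - src)}

source = [(1, "alice"), (2, "bob"), (3, "carol")]
target = [(1, "alice"), (3, "carol"), (4, "dave")]

report = validate(source, target)
print(report)
# {'missing': [(2, 'bob')], 'extra': [(4, 'dave')]}
```

In practice the two row lists would come from SQL queries against the source and target systems; the set comparison is the part that turns "run some queries" into a pass/fail test.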

We are seeking a talented Full Stack Developer to design, build, and maintain scalable web and mobile applications. The ideal candidate should have hands-on experience in frontend (React.js, Flutter), backend (Node.js, Express), databases (PostgreSQL, MongoDB), and Python for AI/ML integration. You will work closely with the engineering team to deliver secure, high-performance, and user-friendly products.
Key Responsibilities
- Develop responsive and dynamic web applications using React.js and modern UI frameworks.
- Build and optimize REST APIs and backend services with Node.js and Express.js.
- Design and manage PostgreSQL and MongoDB databases, ensuring optimized queries and data modeling.
- Implement state management using Redux/Context API.
- Ensure API security with JWT, OAuth2, Helmet.js, and rate-limiting.
- Integrate Google Cloud services (GCP) for hosting, storage, and serverless functions.
- Deploy and maintain applications using CI/CD pipelines, Docker, and Kubernetes.
- Use Redis for caching, sessions, and job queues.
- Optimize frontend performance (lazy loading, code splitting, caching strategies).
- Collaborate with design, QA, and product teams to deliver high-quality features.
- Maintain clear documentation and follow coding standards.
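The Redis caching-and-sessions bullet above boils down to key/value storage with expiry. Here is an in-process sketch of Redis-style SETEX/GET semantics; it is an illustration of the idea, not a substitute for Redis itself:

```python
import time

class TTLCache:
    """Tiny in-process cache with Redis-style expiry (SETEX/GET semantics)."""
    def __init__(self):
        self._store = {}  # key -> (value, expires_at)

    def setex(self, key, ttl_seconds, value):
        self._store[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # lazy expiry on read, like Redis
            return None
        return value

cache = TTLCache()
cache.setex("session:42", 0.05, {"user": "demo"})
print(cache.get("session:42"))   # {'user': 'demo'}
time.sleep(0.06)
print(cache.get("session:42"))   # None (expired)
```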

We are looking for a highly skilled Senior Full Stack Developer / Tech Lead to drive end-to-end development of scalable, secure, and high-performance applications. The ideal candidate will have strong expertise in React.js, Node.js, PostgreSQL, MongoDB, Python, AI/ML, and Google Cloud platforms (GCP). You will play a key role in architecture design, mentoring developers, ensuring best coding practices, and integrating AI/ML solutions into our products.
This role requires a balance of hands-on coding, system design, cloud deployment, and leadership.
Key Responsibilities
- Design, develop, and deploy scalable full-stack applications using React.js, Node.js, PostgreSQL, and MongoDB.
- Build, consume, and optimize REST APIs and GraphQL services.
- Develop AI/ML models with Python and integrate them into production systems.
- Implement CI/CD pipelines, containerization (Docker, Kubernetes), and cloud deployments (GCP/AWS).
- Manage security, authentication (JWT, OAuth2), and performance optimization.
- Use Redis for caching, session management, and queue handling.
- Lead and mentor junior developers, conduct code reviews, and enforce coding standards.
- Collaborate with cross-functional teams (product, design, QA) for feature delivery.
- Monitor and optimize system performance, scalability, and cost-efficiency.
- Own technical decisions and contribute to long-term architecture strategy.

Job Title: Engineering Lead
Role Overview:
We are looking for an Engineering Lead to take end-to-end ownership of technical delivery, design, architecture, and quality for our multi-customer SaaS product. You will lead and mentor the engineering team, drive scalable design and high-quality delivery, manage releases across customer environments, and ensure the stability and performance of the product in production.
Key Responsibilities:
· Delivery & Release Management: Plan and deliver product features and customer-specific releases on time with high quality, ensuring operational readiness and stability across environments.
· Technical Design & Architecture: Lead technical design and high-scale architecture for new and existing modules, ensuring scalability, performance, and maintainability.
· Team Management: Mentor and guide engineers, ensure clarity in priorities, unblock challenges, and foster a culture of ownership and quality within the team.
· Requirement to Delivery: Work with product and customer teams to understand requirements, translate them into designs and implementation plans, and track them through to delivery.
· Product Quality: Establish and maintain engineering best practices, code reviews, automated testing, and CI/CD pipelines to ensure high product quality and reliability.
· Troubleshooting & Support: Lead the team in debugging complex issues in development and production, ensuring minimal downtime and strong customer satisfaction.
· Hands-on Contribution: Actively contribute technically where needed, providing architectural guidance and coding support aligned with the team’s stack.
Requirements:
· Experience: 8–12 years in software engineering with at least 3+ years in a lead role.
· Proven experience in designing scalable, high-performance architectures and technical solutions.
· Experience delivering multi-customer SaaS product releases, including phased and customer-specific configurations.
· Strong track record of ensuring product quality and stability through structured processes, testing, and monitoring.
· Ability to troubleshoot complex issues and guide teams towards resolution.
· Experience in mentoring and managing engineering teams to drive aligned delivery and high performance.
· Hands-on experience with your relevant tech stack (e.g., Python, Django, Angular, AWS, Docker, Redis, RabbitMQ).
· Excellent communication and collaboration skills with Product, QA, and Customer Support teams.
· Bachelor’s or Master’s degree in Engineering or related field.

A strong proficiency in at least one scripting language (e.g., Python, Bash, PowerShell) is required.
Candidates must possess an in-depth ability to design, write, and implement complex automation logic, not just basic scripts.
Proven experience in automating DevOps processes, environment provisioning, and configuration management is essential.
Cloud Platform (AWS Preferred):
- Extensive hands-on experience with Amazon Web Services (AWS) is highly preferred.
- Candidates must be able to demonstrate expert-level knowledge of core AWS services and articulate their use cases.
- Excellent debugging and problem-solving skills within the AWS ecosystem are mandatory; the ability to diagnose and resolve issues efficiently is a key requirement.
Infrastructure as Code (IaC, Terraform Preferred):
- Expert-level knowledge and practical experience with Terraform are required.
- Candidates must have a deep understanding of how to write scalable, modular, and reusable Terraform code.
Containerization and Orchestration (Kubernetes Preferred):
- Advanced, hands-on experience with Kubernetes is mandatory.
- Candidates must be proficient in solving complex, production-level issues related to deployments, networking, and cluster management.
- A solid foundational knowledge of Docker is required.
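The "complex automation logic" this posting asks for usually hinges on idempotency: compute a plan from desired vs. current state and apply only the difference, which is what Terraform does under the hood. A toy illustration in Python (the resource model is invented):

```python
# Idempotent "ensure" pattern at the heart of IaC tools like Terraform:
# diff desired state against current state and emit only the needed actions.

def plan(current, desired):
    """Compute create/update/delete actions, Terraform-plan style."""
    actions = []
    for name, spec in desired.items():
        if name not in current:
            actions.append(("create", name))
        elif current[name] != spec:
            actions.append(("update", name))
    for name in current:
        if name not in desired:
            actions.append(("delete", name))
    return actions

current = {"vm-a": {"size": "small"}, "vm-b": {"size": "large"}}
desired = {"vm-a": {"size": "medium"}, "vm-c": {"size": "small"}}

print(plan(current, desired))
# [('update', 'vm-a'), ('create', 'vm-c'), ('delete', 'vm-b')]
```

Running the same plan twice against an already-converged state yields no actions, which is the property that makes provisioning scripts safe to re-run.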


Job Title: Python Developer (FastAPI)
Experience Required: 4+ years
Location: Pune, Bangalore, Hyderabad, Mumbai, Panchkula, Mohali
Shift: Night shift, 6:30 PM to 3:30 AM IST
About the Role
We are seeking an experienced Python Developer with strong expertise in FastAPI to join our engineering team. The ideal candidate should have a solid background in backend development, RESTful API design, and scalable application development.
Required Skills & Qualifications
· 4+ years of professional experience in backend development with Python.
· Strong hands-on experience with FastAPI (or Flask/Django with migration experience).
· Familiarity with asynchronous programming in Python.
· Working knowledge of version control systems (Git).
· Good problem-solving and debugging skills.
· Strong communication and collaboration abilities.
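The asynchronous-programming requirement is the core of why FastAPI scales: while one request awaits I/O, others proceed. A minimal asyncio sketch of that concurrency (function names and delays are illustrative):

```python
import asyncio

# Two simulated I/O calls run concurrently instead of back-to-back:
# total wait is roughly max(delay), not the sum.

async def fetch(name, delay):
    await asyncio.sleep(delay)   # stands in for a DB or HTTP call
    return f"{name}:done"

async def main():
    # gather() schedules both coroutines at once and collects the results
    results = await asyncio.gather(fetch("users", 0.05), fetch("orders", 0.05))
    return results

results = asyncio.run(main())
print(results)  # ['users:done', 'orders:done']
```

Inside a FastAPI app the same pattern appears as `async def` path functions that `await` database or HTTP clients rather than blocking a worker thread.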

🚀 We’re Hiring: Senior Cloud & ML Infrastructure Engineer 🚀
We’re looking for an experienced engineer to lead the design, scaling, and optimization of cloud-native ML infrastructure on AWS.
If you’re passionate about platform engineering, automation, and running ML systems at scale, this role is for you.
What you’ll do:
🔹 Architect and manage ML infrastructure with AWS (SageMaker, Step Functions, Lambda, ECR)
🔹 Build highly available, multi-region solutions for real-time & batch inference
🔹 Automate with IaC (AWS CDK, Terraform) and CI/CD pipelines
🔹 Ensure security, compliance, and cost efficiency
🔹 Collaborate across DevOps, ML, and backend teams
What we’re looking for:
✔️ 6+ years AWS cloud infrastructure experience
✔️ Strong ML pipeline experience (SageMaker, ECS/EKS, Docker)
✔️ Proficiency in Python/Go/Bash scripting
✔️ Knowledge of networking, IAM, and security best practices
✔️ Experience with observability tools (CloudWatch, Prometheus, Grafana)
✨ Nice to have: Robotics/IoT background (ROS2, Greengrass, Edge Inference)
📍 Location: Bengaluru, Hyderabad, Mumbai, Pune, Mohali, Delhi
5-day work week, work from office
Night shifts: 9 PM to 6 AM IST
👉 If this sounds like you (or someone you know), let’s connect!
Apply here:

An American bank holding company: a community-focused financial institution that provides accessible banking services to its members, operating on a not-for-profit basis.



Position: AI/ML Python Engineer
Location: Kothapet, Hyderabad (Hybrid; 4 days a week onsite)
Contract-to-hire (full-time with the client).
- 5+ years of Python experience scripting ML workflows and deploying ML pipelines for real-time, batch, event-triggered, and edge inference.
- 4+ years of experience deploying ML pipelines and models with AWS SageMaker, using SageMaker Pipelines, SageMaker MLflow, SageMaker Feature Store, etc.
- 3+ years of experience developing APIs using FastAPI, Flask, or Django.
- 3+ years of experience with ML frameworks and tools such as scikit-learn, PyTorch, XGBoost, LightGBM, and MLflow.
- Solid understanding of the ML lifecycle: model development, training, validation, deployment, and monitoring.
- Solid understanding of CI/CD pipelines for ML workflows using Bitbucket, Jenkins, Nexus, and AUTOSYS for scheduling.
- Experience with ETL processes for ML pipelines using PySpark, Kafka, and AWS EMR Serverless.
- Good to have: experience with H2O.ai.
- Good to have: experience with containerization using Docker and orchestration using Kubernetes.
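The real-time, batch, and event-triggered deployment modes above all wrap the same scoring chain. A miniature preprocess, predict, postprocess pipeline illustrates it; the weights and names are invented, and production would package this via SageMaker Pipelines rather than plain functions:

```python
# A scoring pipeline in miniature: the same chain gets wrapped as a
# real-time endpoint, a batch job, or an event-triggered function.

def preprocess(record):
    """Cast raw string fields to numeric features."""
    return [float(v) for v in record]

def predict(features):
    # Stand-in model: a fixed linear score (weights are illustrative)
    weights = [0.5, 1.5]
    return sum(w * x for w, x in zip(weights, features))

def postprocess(score, threshold=2.0):
    """Turn a raw score into a labeled result."""
    return {"score": score, "label": "high" if score >= threshold else "low"}

def run_pipeline(record):
    return postprocess(predict(preprocess(record)))

print(run_pipeline(["1", "2"]))  # {'score': 3.5, 'label': 'high'}
```

Keeping the three stages as separate functions is what lets the same code serve online (one record per call) and in batch (map over a dataset) without divergence.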




Job Title: Python Developer (Full Time)
Location: Hyderabad (Onsite)
Interview: Virtual rounds, with a face-to-face final round
Experience Required: 4 + Years
Working Days: 5 Days
About the Role
We are seeking a highly skilled Lead Python Developer with a strong background in building scalable and secure applications. The ideal candidate will have hands-on expertise in Python frameworks, API integrations, and modern application architectures. This role requires a tech leader who can balance innovation, performance, and compliance while driving successful project delivery.
Key Responsibilities
- Application Development: Architect and develop robust, high-performance applications using Django, Flask, and FastAPI.
- API Integration: Design and implement seamless integration with third-party APIs (including travel-related APIs, payment gateways, and external service providers).
- Data Management: Develop and optimize ETL pipelines for structured and unstructured data using data lakes and distributed storage solutions.
- Microservices Architecture: Build modular, scalable applications using microservices principles for independent deployment and high availability.
- Performance Optimization: Enhance application performance through load balancing, caching, and query optimization to deliver superior user experiences.
- Security & Compliance: Apply secure coding practices, implement data encryption, and ensure compliance with industry security and privacy standards (e.g., PCI DSS, GDPR).
- Automation & Deployment: Utilize CI/CD pipelines, Docker/Kubernetes, and monitoring tools for automated testing, deployment, and production monitoring.
- Collaboration: Partner with front-end developers, product managers, and stakeholders to deliver user-centric, business-aligned solutions.
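The caching lever named under Performance Optimization can be previewed in-process with `functools.lru_cache`; a production system would typically put Redis or a CDN in front of the query instead. The query function and key names here are invented:

```python
import functools

# Memoize an expensive lookup so repeated queries are served from memory.

calls = {"n": 0}

@functools.lru_cache(maxsize=128)
def expensive_query(key):
    calls["n"] += 1          # count real (non-cached) executions
    return key.upper()       # stand-in for a slow DB or API query

assert expensive_query("fare:blr-hyd") == "FARE:BLR-HYD"
assert expensive_query("fare:blr-hyd") == "FARE:BLR-HYD"  # served from cache
print(calls["n"])  # 1 -- the second call never hit the 'database'
```

The same trade-off applies at every caching layer: correctness depends on the cached value being safe to reuse, which is why cache keys and invalidation policy matter as much as the cache itself.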
Requirements
Education
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
Technical Expertise
- 4+ years of hands-on experience with Python frameworks (Django, Flask, FastAPI).
- Proficiency in RESTful APIs, GraphQL, and asynchronous programming.
- Strong knowledge of SQL/NoSQL databases (PostgreSQL, MongoDB) and big data tools (Spark, Kafka).
- Familiarity with Kibana, Grafana, Prometheus for monitoring and visualization.
- Experience with AWS, Azure, or Google Cloud, containerization (Docker, Kubernetes), and CI/CD tools (Jenkins, GitLab CI).
- Working knowledge of testing tools: PyTest, Selenium, SonarQube.
- Experience with API integrations, booking flows, and payment gateway integrations (travel domain knowledge is a plus, but not mandatory).
Soft Skills
- Strong problem-solving and analytical skills.
- Excellent communication, presentation, and teamwork abilities.
- Proactive, ownership-driven mindset with the ability to perform under pressure.


We are seeking a highly skilled Qt/QML Engineer to design and develop advanced GUIs for aerospace applications. The role requires working closely with system architects, avionics software engineers, and mission systems experts to create reliable, intuitive, and real-time UIs for mission-critical systems such as UAV ground control stations and cockpit displays.
Key Responsibilities
- Design, develop, and maintain high-performance UI applications using Qt/QML (Qt Quick, QML, C++).
- Translate system requirements into responsive, interactive, and user-friendly interfaces.
- Integrate UI components with real-time data streams from avionics systems, UAVs, or mission control software.
- Collaborate with aerospace engineers to ensure compliance with DO-178C, or MIL-STD guidelines where applicable.
- Optimise application performance for low-latency visualisation in mission-critical environments.
- Implement data visualisation (raster and vector maps, telemetry, flight parameters, mission planning overlays).
- Write clean, testable, and maintainable code while adhering to aerospace software standards.
- Work with cross-functional teams (system engineers, hardware engineers, test teams) to validate UI against operational requirements.
- Support debugging, simulation, and testing activities, including hardware-in-the-loop (HIL) setups.
Required Qualifications
- Bachelor’s / Master’s degree in Computer Science, Software Engineering, or related field.
- 1-3 years of experience in developing Qt/QML-based applications (Qt Quick, QML, Qt Widgets).
- Strong proficiency in C++ (11/14/17) and object-oriented programming.
- Experience integrating UI with real-time data sources (TCP/IP, UDP, serial, CAN, DDS, etc.).
- Knowledge of multithreading, performance optimisation, and memory management.
- Familiarity with aerospace/automotive domain software practices or mission-critical systems.
- Good understanding of UX principles for operator consoles and mission planning systems.
- Strong problem-solving, debugging, and communication skills.
Desirable Skills
- Experience with GIS/Mapping libraries (OpenSceneGraph, Cesium, Marble, etc.).
- Knowledge of OpenGL, Vulkan, or 3D visualisation frameworks.
- Exposure to DO-178C or aerospace software compliance.
- Familiarity with UAV ground control software (QGroundControl, Mission Planner, etc.) or similar mission systems.
- Experience with Linux and cross-platform development (Windows/Linux).
- Scripting knowledge in Python for tooling and automation.
- Background in defence, aerospace, automotive or embedded systems domain.
What We Offer
- Opportunity to work on cutting-edge aerospace and defence technologies.
- Collaborative and innovation-driven work culture.
- Exposure to real-world avionics and mission systems.
- Growth opportunities in autonomy, AI/ML for aerospace, and avionics UI systems.

CTC: up to 20 LPA
Exp: 4 to 7 Years
Required Qualifications
- Bachelor's degree in Computer Science, Information Technology, or related field
- 4+ years of experience in software development
- Strong proficiency in Java with deep understanding of web technology stack
- Hands-on experience developing applications with Spring Boot framework
- Solid understanding of Python programming language with practical Flask framework experience
- Working knowledge of NATS server for messaging and streaming data
- Experience deploying and managing applications in Kubernetes
- Understanding of microservices architecture and RESTful API design
- Familiarity with containerization technologies (Docker)
- Experience with version control systems (Git)
Skills & Competencies
- Java (Spring Boot, Spring Cloud, Spring Security)
- Python (Flask, SQLAlchemy, REST APIs)
- NATS messaging patterns (pub/sub, request/reply, queue groups)
- Kubernetes (deployments, services, ingress, ConfigMaps, Secrets)
- Web technologies (HTTP, REST, WebSocket, gRPC)
- Container orchestration and management
- Soft skills: problem-solving and analytical thinking
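The pub/sub pattern in the NATS requirement can be sketched with an in-memory broker. A real service would use the nats-py or Java client against a NATS server; the subjects and broker class below are invented for illustration:

```python
from collections import defaultdict

# NATS-style publish/subscribe in miniature: subject-based routing where
# publishers and subscribers never reference each other directly.

class Broker:
    def __init__(self):
        self._subs = defaultdict(list)  # subject -> list of callbacks

    def subscribe(self, subject, callback):
        self._subs[subject].append(callback)

    def publish(self, subject, message):
        # Deliver to every subscriber of this subject; none means dropped.
        for callback in self._subs[subject]:
            callback(message)

broker = Broker()
received = []
broker.subscribe("orders.created", received.append)
broker.publish("orders.created", {"id": 1})
broker.publish("orders.cancelled", {"id": 2})  # no subscriber; dropped

print(received)  # [{'id': 1}]
```

The decoupling shown here is what makes event-driven services independently deployable: adding a new consumer is a new `subscribe`, with no change to the publisher.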

Responsibilities :
- Design and develop user-friendly web interfaces using HTML, CSS, and JavaScript.
- Utilize modern frontend frameworks and libraries such as React, Angular, or Vue.js to build dynamic and responsive web applications.
- Develop and maintain server-side logic using programming languages such as Java, Python, Ruby, Node.js, or PHP.
- Build and manage APIs for seamless communication between the frontend and backend systems.
- Integrate third-party services and APIs to enhance application functionality.
- Implement CI/CD pipelines to automate testing, integration, and deployment processes.
- Monitor and optimize the performance of web applications to ensure a high-quality user experience.
- Stay up-to-date with emerging technologies and industry trends to continuously improve development processes and application performance.
Qualifications :
- Bachelor’s or Master’s in Computer Science or related subjects, or hands-on experience demonstrating a working understanding of software applications.
- Knowledge of building applications that can be deployed in a cloud environment or are cloud native applications.
- Strong expertise in building backend applications using Java/C#/Python with demonstrable experience in using frameworks such as Spring/Vertx/.Net/FastAPI.
- Deep understanding of enterprise design patterns, API development and integration, and Test-Driven Development (TDD).
- Working knowledge in building applications that leverage databases such as PostgreSQL, MySQL, MongoDB, Neo4J or storage technologies such as AWS S3, Azure Blob Storage.
- Hands-on experience in building enterprise applications adhering to their needs of security and reliability.
- Hands-on experience building applications using one of the major cloud providers (AWS, Azure, GCP).
- Working knowledge of CI/CD tools for application integration and deployment.
- Working knowledge of using reliability tools to monitor the performance of the application.

Role: Senior Backend Developer
Exp: 4 - 7 Years
CTC: up to 22 LPA
Key Responsibilities
- Design, develop, and maintain scalable applications using Java (Spring Boot) and Python (Flask).
- Build RESTful APIs and microservices following best practices.
- Implement event-driven architecture leveraging NATS messaging server.
- Deploy, manage, and optimize applications in Kubernetes and containerized environments.
- Develop and manage CI/CD pipelines, ensuring smooth deployment and delivery.
- Collaborate with cross-functional teams to deliver high-quality solutions.
- Write clean, maintainable, and well-documented code.
- Participate in code reviews and contribute to architectural decisions.
- Troubleshoot, debug, and optimize application performance.
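The posting asks for RESTful APIs built with Flask; as a dependency-free illustration of the same request/response shape, here is a minimal WSGI JSON endpoint using only the standard library (the route and in-memory store are made up for the example):

```python
# Framework-free WSGI sketch of a JSON REST endpoint. In the role itself this
# would be a Flask view; the HTTP contract is the same.
import json

ITEMS = {"1": {"id": "1", "name": "widget"}}   # illustrative in-memory store

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if environ.get("REQUEST_METHOD") == "GET" and path.startswith("/items/"):
        item = ITEMS.get(path.rsplit("/", 1)[-1])
        if item:
            start_response("200 OK", [("Content-Type", "application/json")])
            return [json.dumps(item).encode()]
    # Anything else: JSON 404, the usual REST convention
    start_response("404 Not Found", [("Content-Type", "application/json")])
    return [json.dumps({"error": "not found"}).encode()]
```

Because WSGI apps are plain callables, the endpoint can be exercised directly in tests with a fake `environ`, no server required.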

Development and Customization:
Build and customize Frappe modules to meet business requirements.
Develop new functionalities and troubleshoot issues in ERPNext applications.
Integrate third-party APIs for seamless interoperability.
Technical Support:
Provide technical support to end-users and resolve system issues.
Maintain technical documentation for implementations.
Collaboration:
Work with teams to gather requirements and recommend solutions.
Participate in code reviews for quality standards.
Continuous Improvement:
Stay updated with Frappe developments and optimize application performance.
Skills Required:
Proficiency in Python, JavaScript, and relational databases.
Knowledge of Frappe/ERPNext framework and object-oriented programming.
Experience with Git for version control.
Strong analytical skills.
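Third-party API integration of the kind described above usually needs retry handling for transient failures. A minimal stdlib sketch (function and parameter names are illustrative, not a specific library):

```python
# Generic retry-with-exponential-backoff wrapper for flaky external calls.
import time

def call_with_retry(fn, attempts=3, base_delay=0.01, retry_on=(ConnectionError,)):
    """Call fn(); on a retryable error, back off exponentially and retry."""
    for attempt in range(attempts):
        try:
            return fn()
        except retry_on:
            if attempt == attempts - 1:
                raise                              # out of retries: surface the error
            time.sleep(base_delay * (2 ** attempt))  # 1x, 2x, 4x... the base delay
```

In production code you would typically also cap the total delay and add jitter so many clients do not retry in lockstep.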

Python Developer
Location: Hyderabad (Apple Office)
Experience: 8+ years (Retail / E-commerce preferred)
Budget: 1.9 LPM + GST
Contract: 1 Year + Extendable
Job Responsibilities / Requirements:
- 8+ years of proven experience, preferably in retail or e-commerce environments.
- Strong expertise in Python development.
- Excellent communication skills with the ability to collaborate across multiple teams.
- Hands-on experience with Container & Orchestration: Kubernetes, Docker.
- Expertise in Infrastructure Automation via Kubernetes YAML configurations.
- Strong skills in Scripting & Automation: Python, Shell Scripts (Bash).
- Familiarity with CI/CD Pipelines: GitHub Actions, Jenkins.
- Experience with Monitoring & Logging: Splunk, Grafana.
- Immediate Joiners Preferred – Urgent Support Required.
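The "Infrastructure Automation via Kubernetes YAML configurations" requirement often amounts to generating manifests from code. A hedged sketch (field names follow the Kubernetes apps/v1 Deployment schema; the helper function itself is hypothetical):

```python
# Build a Kubernetes Deployment manifest as a Python dict, ready to be
# serialized to YAML (e.g., with PyYAML's safe_dump) and applied.
def deployment_manifest(name, image, replicas=2):
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            # selector labels must match the pod template labels
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {"containers": [{"name": name, "image": image}]},
            },
        },
    }
```

Generating manifests this way keeps the selector and template labels in sync by construction, a common source of apply-time errors when YAML is edited by hand.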

About the role
Meltwater’s collaborative Security Team needs a passionate Security Engineer to continue to advance Meltwater’s security. Working with a group of fun-loving people who are genuinely excited and passionate about security, there will be more laughs than facepalms! If you believe that improving security is about constantly moving technology forward to be more secure, and shifting security tools and checks earlier in the development lifecycle, then you’ll feel at home on Meltwater’s Security Team!
At Meltwater we want to ensure that we can have autonomous, empowered and highly efficient teams. Our Security Team charges head-on into the challenge of ensuring our teams can maintain their autonomy without compromising the security of our systems, services and data. Through enablement and collaboration with teams, Security Engineers ensure that our development and infrastructure practices have security defined, integrated and implemented in a common-sense manner that reduces risk for our business. Security Engineers define best practices, build tools, implement security checks and controls together with the broader Engineering and IT teams to ensure that our employees and our customers' data stays safe.
As part of this, we leverage AWS as a key component of our cloud infrastructure. Security Engineers play a critical role in securing and optimizing AWS environments by implementing best practices, automating security controls, and collaborating with teams to ensure scalability, resilience, and compliance with industry standards.
Responsibilities
- In this role, you will be designing and implementing security functions ranging from checks on IaC (Infrastructure as Code) to SAST/DAST scanners in our CI/CD pipelines.
- You will be collaborating closely with almost every part of the Meltwater organization and help create security impact across all teams with strong support from the business.
- Collaborate closely with teams to help identify and implement frictionless security controls throughout the software development lifecycle
- Propose and implement solutions to enhance the overall cloud infrastructure and toolset.
- Perform ongoing security testing, including static (SAST), dynamic (DAST), and penetration testing, along with code reviews, vulnerability assessments, and regular security audits to identify risks, improve security, and develop mitigation strategies.
- Educate and share knowledge around secure coding practices
- Identify applicable industry best practices and consult with development teams on methods to continuously improve the risk posture.
- Build applications that improve our security posture and monitoring/alerting capabilities
- Implement and manage security technologies including firewalls, intrusion detection/prevention systems (IDS/IPS), endpoint protection, and security information and event management (SIEM) tools.
- Conduct vulnerability assessments, penetration testing, and regular security audits to identify risks and develop mitigation strategies.
- Monitor and respond to security incidents and alerts, performing root cause analysis and incident handling.
- Participate in incident response and disaster recovery planning, testing, and documentation.
- Manage identity and access management (IAM) solutions to enforce least privilege and role-based access controls (RBAC).
- Assist in the development of automated security workflows using scripting (Python, Bash, or similar).
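The IAM and automation duties above often combine into small policy-linting scripts. A minimal least-privilege check, assuming the AWS JSON policy document shape (the helper name and findings format are illustrative):

```python
# Flag IAM-style "Allow" statements that grant wildcard actions or resources,
# a basic least-privilege / RBAC hygiene check.
def find_wildcards(policy):
    findings = []
    stmts = policy.get("Statement", [])
    stmts = stmts if isinstance(stmts, list) else [stmts]
    for i, s in enumerate(stmts):
        if s.get("Effect") != "Allow":
            continue
        actions = s.get("Action", [])
        actions = actions if isinstance(actions, list) else [actions]
        if any(a == "*" or a.endswith(":*") for a in actions):
            findings.append((i, "wildcard action"))      # e.g. "s3:*"
        resources = s.get("Resource", [])
        resources = resources if isinstance(resources, list) else [resources]
        if "*" in resources:
            findings.append((i, "wildcard resource"))
    return findings
```

A check like this slots naturally into a CI pipeline alongside IaC scanning, failing the build before an over-broad policy reaches production.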
Skills and background
- Strong collaboration skills with experience working cross functionally with a diverse group of stakeholders
- Strong communication skills with the ability to provide technical guidance to both technical and non-technical audiences
- Experience in implementing security controls early in the software development life cycle
- Knowledge of industry-accepted security best practices/standards/policies such as NIST, OWASP, CIS, MITRE ATT&CK
- Software developer experience in one or more of the following languages: JavaScript, Java, Kotlin or Python
- Experience in at least one public cloud provider, preferably AWS, with experience in security, infrastructure, and automation.
- Hands-on experience with SIEM platforms such as Splunk, QRadar, or similar.
- Proficiency in Linux operating system, network security, including firewalls, VPNs, IDS/IPS, and monitoring tools.
- Experience with vulnerability management tools (Snyk, Nessus, Dependabot) and penetration testing tools (Kali Linux, Metasploit).
- Experience in forensics and malware analysis.
- Self-motivated learner who continuously shares knowledge to help others improve
- The ideal candidate is someone from a Software Development background with a passion for security.
If you’re someone who understands the value of introducing security early in the software development lifecycle, and want to do so by enabling and empowering teams by building tools they WANT to use, we want to hear from you!

Role: AI/ ML Engineering
Experience: 3 to 5 yrs
Work location: Hyderabad
Interview mode: Virtual
Notice period: Immediate Joiner
Key Responsibilities:
· Design and implement RAG pipelines and AI agentic systems using cutting-edge LLM frameworks.
· Fine-tune open-source LLMs and develop narrow, domain-specific models.
· Build and maintain ML pipelines using MLFlow and ensure reproducibility, auditability, and version control.
· Collaborate with cross-functional teams to deploy ML systems into scalable, secure, and production-ready environments.
· Containerize and serve models using Docker, Kubernetes, and FastAPI.
· Automate CI/CD workflows using Azure DevOps, with integrated monitoring and alerts.
· Integrate authentication and authorization flows using Azure AD and Microsoft Graph API.
· Optimize deployed models for latency, cost-efficiency, and operational maintainability.
Required Skills & Experience:
· Strong foundation in Computer Science, software architecture, and distributed systems.
· Proficiency in Python, including both object-oriented and functional programming paradigms.
· Hands-on experience with open-source LLMs, embedding models, and vector databases.
· Practical implementation of RAG pipelines and LLM agentic systems.
· Strong working knowledge of MLOps tooling (e.g., MLFlow), model versioning, and reproducible experiments.
· Experience deploying ML systems using Docker, Kubernetes, and FastAPI or equivalent frameworks.
· Proven experience working in Azure cloud ecosystem:
· Azure DevOps for build/release automation.
· Microsoft Graph API for accessing organizational data.
· Secure identity flows using Azure AD.
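The retrieval step of the RAG pipelines this role builds can be sketched in a few lines. The bag-of-words "embedding" below is purely illustrative; a real pipeline would use an embedding model and a vector database, but the rank-by-cosine-similarity, take-top-k shape is the same:

```python
# Toy RAG retrieval: embed query and passages, rank by cosine similarity,
# return the top-k passages to use as LLM prompt context.
import math
from collections import Counter

def embed(text):
    return Counter(text.lower().split())       # stand-in for a real embedding model

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, passages, k=2):
    q = embed(query)
    ranked = sorted(passages, key=lambda p: cosine(q, embed(p)), reverse=True)
    return ranked[:k]                          # top-k context for the prompt
```

Swapping `embed` for a real model call and `retrieve` for a vector-store query turns this outline into a production retriever without changing the surrounding code.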

Position Title: Data Engineer with Snowflake Lead
Experience: 7+ Yrs
Shift Schedule: Rotational Shifts
Mode of work: Hybrid (Need to come office)
Location: Hyderabad
**Role Overview:**
Join our Snowflake Managed Services team as a Software Engineer to work on data platform development, enhancements, and production support. You will support Snowflake environments across multiple clients, ensuring stability, performance, and continuous improvement.
**Key Responsibilities:**
Design and develop Snowflake pipelines, data models, and transformations
Provide L2/L3 production support for Snowflake jobs, queries, and integrations
Troubleshoot failed jobs, resolve incidents, and conduct RCA
Tune queries, monitor warehouses, and help optimize Snowflake usage and cost
Handle service requests like user provisioning, access changes, and role management
Document issues, enhancements, and standard procedures (runbooks)
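The warehouse-monitoring duties above often start as a script over query history. A hedged sketch: the column names mirror Snowflake's QUERY_HISTORY view, but the thresholds and helper function are made up for the example:

```python
# Flag slow or spilling queries from query-history rows. Spilling to local
# storage usually indicates an undersized warehouse for that workload.
def flag_queries(history, max_elapsed_ms=60_000, max_spilled_bytes=0):
    flagged = []
    for row in history:
        reasons = []
        if row.get("TOTAL_ELAPSED_TIME", 0) > max_elapsed_ms:
            reasons.append("slow")
        if row.get("BYTES_SPILLED_TO_LOCAL_STORAGE", 0) > max_spilled_bytes:
            reasons.append("spilling")
        if reasons:
            flagged.append((row["QUERY_ID"], reasons))
    return flagged
```

In practice the rows would come from querying `SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY`, and flagged query IDs would feed tuning or right-sizing tickets.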
Required Skills & Experience:
4+ years of hands-on experience in Snowflake development and support
Strong SQL, data modeling, and performance tuning experience
Exposure to CI/CD pipelines and scripting languages (e.g., Python OR Pyspark)
Experience with ETL or ELT processes
Experience with data pipelines and orchestration tools (e.g., ADF)
Preferred:
SnowPro Core Certification
Experience with ticketing systems (ServiceNow, Jira)
Cloud experience with Azure
Basic understanding of ITIL processes

Key Responsibilities
- Design and implement ETL/ELT pipelines using Databricks, PySpark, and AWS Glue
- Develop and maintain scalable data architectures on AWS (S3, EMR, Lambda, Redshift, RDS)
- Perform data wrangling, cleansing, and transformation using Python and SQL
- Collaborate with data scientists to integrate Generative AI models into analytics workflows
- Build dashboards and reports to visualize insights using tools like Power BI or Tableau
- Ensure data quality, governance, and security across all data assets
- Optimize performance of data pipelines and troubleshoot bottlenecks
- Work closely with stakeholders to understand data requirements and deliver actionable insights
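The wrangling and cleansing responsibilities above reduce to a repeatable transform. In the role this would typically be a PySpark or Databricks job; this pure-Python sketch (field names illustrative) shows the same normalise-validate-coerce logic:

```python
# Cleanse raw records: normalise keys, strip whitespace, drop rows missing
# required fields, and enforce a numeric type on the amount column.
def cleanse(rows, required=("id", "amount")):
    out = []
    for row in rows:
        r = {k.strip().lower(): (v.strip() if isinstance(v, str) else v)
             for k, v in row.items()}
        if any(r.get(f) in (None, "") for f in required):
            continue                            # drop incomplete records
        r["amount"] = float(r["amount"])        # coerce to numeric
        out.append(r)
    return out
```

Keeping the transform a pure function of its input rows makes it straightforward to unit test before porting the same logic into a Spark `map`/`filter` pipeline.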
🧪 Required Skills
- Cloud Platforms: AWS (S3, Lambda, Glue, EMR, Redshift)
- Big Data: Databricks, Apache Spark, PySpark
- Programming: Python, SQL
- Data Engineering: ETL/ELT, Data Lakes, Data Warehousing
- Analytics: Data Modeling, Visualization, BI Reporting
- Gen AI Integration: OpenAI, Hugging Face, LangChain (preferred)
- DevOps (Bonus): Git, Jenkins, Terraform, Docker
📚 Qualifications
- Bachelor's or Master’s degree in Computer Science, Data Science, or related field
- 3+ years of experience in data engineering or data analytics
- Hands-on experience with Databricks, PySpark, and AWS
- Familiarity with Generative AI tools and frameworks is a strong plus
- Strong problem-solving and communication skills
🌟 Preferred Traits
- Analytical mindset with attention to detail
- Passion for data and emerging technologies
- Ability to work independently and in cross-functional teams
- Eagerness to learn and adapt in a fast-paced environment
EDI Developer / Map Conversion Specialist
Role Summary:
Responsible for converting 441 existing EDI maps into the PortPro-compatible format and testing them for 147 customer configurations.
Key Responsibilities:
- Analyze existing EDI maps in Profit Tools.
- Convert, reconfigure, or rebuild maps for PortPro.
- Ensure accuracy in mapping and transformation logic.
- Unit test and debug EDI transactions.
- Support system integration and UAT phases.
Skills Required:
- Proficiency in EDI standards (X12, EDIFACT) and transaction sets.
- Hands-on experience in EDI mapping tools.
- Familiarity with both Profit Tools and PortPro data structures.
- SQL and XML/JSON data handling skills.
- Experience with scripting for automation (Python, Shell scripting preferred).
- Strong troubleshooting and debugging skills.
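Map conversion work of this kind starts with tokenising the interchange. A minimal X12 parsing sketch; real EDI declares its delimiters in the ISA header, and the `~` segment terminator and `*` element separator assumed here are only the common defaults:

```python
# Split a raw X12 interchange into segments, and each segment into its
# segment ID plus data elements.
def parse_x12(raw, seg_term="~", elem_sep="*"):
    segments = []
    for seg in raw.strip().split(seg_term):
        seg = seg.strip()
        if not seg:
            continue                       # skip trailing empty split
        parts = seg.split(elem_sep)
        segments.append({"id": parts[0], "elements": parts[1:]})
    return segments
```

Once segments are structured like this, converting a map means re-emitting the same elements in the target system's expected segment order and qualifiers, which is where the per-customer configuration testing comes in.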

The Opportunity
We’re looking for a Senior Data Engineer to join our growing Data Platform team. This role is a hybrid of data engineering and business intelligence, ideal for someone who enjoys solving complex data challenges while also building intuitive and actionable reporting solutions.
You’ll play a key role in designing and scaling the infrastructure and pipelines that power analytics, dashboards, machine learning, and decision-making across Sonatype. You’ll also be responsible for delivering clear, compelling, and insightful business intelligence through tools like Looker Studio and advanced SQL queries.
What You’ll Do
- Design, build, and maintain scalable data pipelines and ETL/ELT processes.
- Architect and optimize data models and storage solutions for analytics and operational use.
- Create and manage business intelligence reports and dashboards using tools like Looker Studio, Power BI, or similar.
- Collaborate with data scientists, analysts, and stakeholders to ensure datasets are reliable, meaningful, and actionable.
- Own and evolve parts of our data platform (e.g., Airflow, dbt, Spark, Redshift, or Snowflake).
- Write complex, high-performance SQL queries to support reporting and analytics needs.
- Implement observability, alerting, and data quality monitoring for critical pipelines.
- Drive best practices in data engineering and business intelligence, including documentation, testing, and CI/CD.
- Contribute to the evolution of our next-generation data lakehouse and BI architecture.
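The observability and data-quality monitoring mentioned above usually begins with simple batch-level assertions. A hedged sketch; the thresholds, field names, and helper are illustrative, not a specific framework:

```python
# Fail fast when a batch's null rate or row count drifts outside agreed bounds,
# the kind of check wired into a pipeline before data is published downstream.
def check_batch(rows, column, max_null_rate=0.05, min_rows=1):
    if len(rows) < min_rows:
        return (False, "too few rows")
    nulls = sum(1 for r in rows if r.get(column) is None)
    rate = nulls / len(rows)
    if rate > max_null_rate:
        return (False, f"null rate {rate:.0%} exceeds {max_null_rate:.0%}")
    return (True, "ok")
```

Tools like dbt tests or Great Expectations express the same idea declaratively; the value is the same either way: a failing check blocks the pipeline and pages someone instead of silently corrupting dashboards.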
What We’re Looking For
Minimum Qualifications
- 5+ years of experience as a Data Engineer or in a hybrid data/reporting role.
- Strong programming skills in Python, Java, or Scala.
- Proficiency with data tools such as Databricks, data modeling techniques (e.g., star schema, dimensional modeling), and data warehousing solutions like Snowflake or Redshift.
- Hands-on experience with modern data platforms and orchestration tools (e.g., Spark, Kafka, Airflow).
- Proficient in SQL with experience in writing and optimizing complex queries for BI and analytics.
- Experience with BI tools such as Looker Studio, Power BI, or Tableau.
- Experience in building and maintaining robust ETL/ELT pipelines in production.
- Understanding of data quality, observability, and governance best practices.
Bonus Points
- Experience with dbt, Terraform, or Kubernetes.
- Familiarity with real-time data processing or streaming architectures.
- Understanding of data privacy, compliance, and security best practices in analytics and reporting.
Why You’ll Love Working Here
- Data with purpose: Work on problems that directly impact how the world builds secure software.
- Full-spectrum impact: Use both engineering and analytical skills to shape product, strategy, and operations.
- Modern tooling: Leverage the best of open-source and cloud-native technologies.
- Collaborative culture: Join a passionate team that values learning, autonomy, and real-world impact.

About the Role
We’re hiring a Data Engineer to join our Data Platform team. You’ll help build and scale the systems that power analytics, reporting, and data-driven features across the company. This role works with engineers, analysts, and product teams to make sure our data is accurate, available, and usable.
What You’ll Do
- Build and maintain reliable data pipelines and ETL/ELT workflows.
- Develop and optimize data models for analytics and internal tools.
- Work with team members to deliver clean, trusted datasets.
- Support core data platform tools like Airflow, dbt, Spark, Redshift, or Snowflake.
- Monitor data pipelines for quality, performance, and reliability.
- Write clear documentation and contribute to test coverage and CI/CD processes.
- Help shape our data lakehouse architecture and platform roadmap.
What You Need
- 2–4 years of experience in data engineering or a backend data-related role.
- Strong skills in Python or another backend programming language.
- Experience working with SQL and distributed data systems (e.g., Spark, Kafka).
- Familiarity with NoSQL stores like HBase or similar.
- Comfortable writing efficient queries and building data workflows.
- Understanding of data modeling for analytics and reporting.
- Exposure to tools like Airflow or other workflow schedulers.
Bonus Points
- Experience with DBT, Databricks, or real-time data pipelines.
- Familiarity with cloud infrastructure tools like Terraform or Kubernetes.
- Interest in data governance, ML pipelines, or compliance standards.
Why Join Us?
- Work on data that supports meaningful software security outcomes.
- Use modern tools in a cloud-first, open-source-friendly environment.
- Join a team that values clarity, learning, and autonomy.
If you're excited about building impactful software and helping others do the same, this is an opportunity to grow as a technical leader and make a meaningful impact.
We are seeking a Software Engineer in Test to join our Quality Engineering team. In this role, you will be responsible for designing, developing, and maintaining automation frameworks to enhance our test coverage and ensure the delivery of high-quality software. You will collaborate closely with developers, product managers, and other stakeholders to drive test automation strategies and improve software reliability.
Key Responsibilities
● Design, develop, and maintain robust test automation frameworks for web, API, and backend services.
● Implement automated test cases to improve software quality and test coverage.
● Develop and execute performance and load tests to ensure the application behaves reliably in self-hosted environments.
● Integrate automated tests into CI/CD pipelines to enable continuous testing.
● Collaborate with software engineers to define test strategies, acceptance criteria, and quality standards.
● Conduct performance, security, and regression testing to ensure application stability.
● Investigate test failures, debug issues, and work with development teams to resolve defects.
● Advocate for best practices in test automation, code quality, and software reliability.
● Stay updated with industry trends and emerging technologies in software testing.
Qualifications & Experience
● Bachelor's or Master’s degree in Computer Science, Engineering, or a related field.
● 3+ years of experience in software test automation.
● Proficiency in programming languages such as Java, Python, or JavaScript.
● Hands-on experience with test automation tools like Selenium, Cypress, Playwright, or similar.
● Strong knowledge of API testing using tools such as Postman, RestAssured, or Karate.
● Experience with CI/CD tools such as Jenkins, GitHub Actions, or GitLab CI/CD.
● Understanding of containerization and cloud technologies (Docker, Kubernetes, AWS, or similar).
● Familiarity with performance testing tools like JMeter or Gatling is a plus.
● Excellent problem-solving skills and attention to detail.
● Strong communication and collaboration skills.
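API test automation of the kind described above often centres on asserting a response body matches an expected shape. A small framework-agnostic helper (names are illustrative; tools like RestAssured or Postman tests express the same checks):

```python
# Validate a JSON payload against an expected shape: schema maps field names
# to a Python type or a nested dict. Raises AssertionError with a JSON-path
# style location on the first mismatch.
def assert_shape(payload, schema, path="$"):
    for field, expected in schema.items():
        assert field in payload, f"{path}.{field} missing"
        value = payload[field]
        if isinstance(expected, dict):
            assert isinstance(value, dict), f"{path}.{field} not an object"
            assert_shape(value, expected, f"{path}.{field}")   # recurse into nested objects
        else:
            assert isinstance(value, expected), f"{path}.{field} not {expected.__name__}"
```

In a test suite this would run against the parsed body of a live response; failures point at the exact path (`$.user.active`, say), which keeps CI logs actionable.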
Inncircles Technologies is a problem-solving company. With powerful data management capabilities and AI-driven algorithms, we have developed a construction management platform named Inncircles Arena, a one-stop solution for managing any construction project.
Inncircles Arena can help construction industry owners, builders, general contractors, and specialist contractors to improve construction management operations efficiency and project management. The application runs on a cloud-based platform and offers a complete range of tools to gather field data through a user-friendly interface and mobile applications.
Due to the software's modern, user-friendly design, users can access project information from any location through mobile and web applications. Collaboration tools are integrated into each feature to facilitate effective coordination and ensure all teams are on the same page.
With highly configurable features, products, solutions, and services, we aim to make digital transformation easier and more simplified for construction companies.
Why should you join our team?
- 100% growth with diverse experience working with international clients
- Exposure across media & digital channels
- Dynamic learning curve across Global Landscape
- A part of a young team, ready to experiment together
About the Role
We are looking for a Quality Analyst with strong skills in Manual Testing and Web Automation to join our growing team. The ideal candidate will be passionate about delivering high-quality software products, adept at identifying bugs, and ensuring seamless functionality across applications.
Key Responsibilities
● Design, develop, and execute manual test cases for web applications and APIs.
● Create, maintain, and enhance web automation test scripts using industry-standard tools and frameworks.
● Collaborate with cross-functional teams (Developers, Product Managers) to ensure quality at every stage of development.
● Perform regression testing, smoke testing, sanity testing and end-to-end testing for new releases.
● Use Jira for defect tracking and reporting, ensuring clear communication of bugs and their statuses.
● Work with Git for version control and participate in code reviews related to test scripts.
● Integrate and maintain test execution pipelines using Jenkins (CI/CD).
● Conduct performance and load testing using JMeter, identifying bottlenecks and providing actionable insights.
● Perform basic database testing with MongoDB, validating backend data integrity.
Requirements
● BE/B.Tech/BCA degree in Computer science, Engineering, or a related field.
● 1–3 years of experience in Manual Testing and Web Automation Testing.
● Strong analytical and problem-solving skills with keen attention to detail.
● Good understanding of SDLC, STLC, and Agile methodologies.
● Excellent communication and collaboration skills.
● Hands-on experience with Selenium/Playwright or similar web automation tools.
● Knowledge of Programming languages (Python, Java, JavaScript and TypeScript).
● Proficiency in Jira for bug tracking and project management.
● Basic knowledge of Git for version control.
● Familiarity with Jenkins for CI/CD pipelines.
● Understanding of MongoDB for basic data validation.
Good to Have
● Exposure to API testing tools (e.g., Postman, Rest Assured).
● Experience in Performance Testing using JMeter.
● Familiarity with cross-browser and cross-platform testing.

Job Description:
Title: Python AWS Developer with API
Tech Stack: AWS API Gateway, Lambda, Oracle RDS, SQL & database management, object-oriented programming (OOP) principles, JavaScript, Object-Relational Mappers (ORM), Git, Docker, Java dependency management, CI/CD, AWS Cloud & S3, Secrets Manager, Python, API frameworks; well-versed in front-end and back-end programming (Python).
Responsibilities:
· Build high-performance APIs using AWS services and Python; write and debug Python code and integrate the application with third-party web services.
· Troubleshoot and debug non-prod defects across back-end development and APIs, with a primary focus on coding and monitoring applications.
· Design core application logic.
· Support dependent teams in UAT and perform functional application testing, including Postman testing.
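The API Gateway plus Lambda stack named above typically means handlers that accept a proxy-integration event and return a status/body dict. A minimal sketch (the `/health` route and payloads are made up for the example):

```python
# API Gateway -> Lambda handler (proxy integration event shape).
import json

def handler(event, context=None):
    """Return an API Gateway proxy response for a simple GET route."""
    if event.get("httpMethod") == "GET" and event.get("path") == "/health":
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"status": "ok"}),
        }
    return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
```

Because the handler is a plain function of its event dict, it can be unit tested locally (and in CI) without deploying, which is also how Postman-style contract checks are mirrored in code.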

🔍 Job Description:
We are looking for an experienced and highly skilled Technical Lead to guide the development and enhancement of a large-scale Data Observability solution built on AWS. This platform is pivotal in delivering monitoring, reporting, and actionable insights across the client's data landscape.
The Technical Lead will drive end-to-end feature delivery, mentor junior engineers, and uphold engineering best practices. The position reports to the Programme Technical Lead / Architect and involves close collaboration to align on platform vision, technical priorities, and success KPIs.
🎯 Key Responsibilities:
- Lead the design, development, and delivery of features for the data observability solution.
- Mentor and guide junior engineers, promoting technical growth and engineering excellence.
- Collaborate with the architect to align on platform roadmap, vision, and success metrics.
- Ensure high quality, scalability, and performance in data engineering solutions.
- Contribute to code reviews, architecture discussions, and operational readiness.
🔧 Primary Must-Have Skills (Non-Negotiable):
- 5+ years in Data Engineering or Software Engineering roles.
- 3+ years in a technical team or squad leadership capacity.
- Deep expertise in AWS Data Services: Glue, EMR, Kinesis, Lambda, Athena, S3.
- Advanced programming experience with PySpark, Python, and SQL.
- Proven experience in building scalable, production-grade data pipelines on cloud platforms.

Position Overview
We are seeking a skilled Developer to join our engineering team. The ideal candidate will have strong expertise in Java and Python ecosystems, with hands-on experience in modern web technologies, messaging systems, and cloud-native development using Kubernetes.
Key Responsibilities
- Design, develop, and maintain scalable applications using Java and Spring Boot framework
- Build robust web services and APIs using Python and Flask framework
- Implement event-driven architectures using NATS messaging server
- Deploy, manage, and optimize applications in Kubernetes environments
- Develop microservices following best practices and design patterns
- Collaborate with cross-functional teams to deliver high-quality software solutions
- Write clean, maintainable code with comprehensive documentation
- Participate in code reviews and contribute to technical architecture decisions
- Troubleshoot and optimize application performance in containerized environments
- Implement CI/CD pipelines and follow DevOps best practices
Required Qualifications
- Bachelor's degree in Computer Science, Information Technology, or related field
- 4+ years of experience in software development
- Strong proficiency in Java with deep understanding of web technology stack
- Hands-on experience developing applications with Spring Boot framework
- Solid understanding of Python programming language with practical Flask framework experience
- Working knowledge of NATS server for messaging and streaming data
- Experience deploying and managing applications in Kubernetes
- Understanding of microservices architecture and RESTful API design
- Familiarity with containerization technologies (Docker)
- Experience with version control systems (Git)
Skills & Competencies
Technical Skills:
- Java (Spring Boot, Spring Cloud, Spring Security)
- Python (Flask, SQLAlchemy, REST APIs)
- NATS messaging patterns (pub/sub, request/reply, queue groups)
- Kubernetes (deployments, services, ingress, ConfigMaps, Secrets)
- Web technologies (HTTP, REST, WebSocket, gRPC)
- Container orchestration and management
Soft Skills:
- Problem-solving and analytical thinking
- Strong communication and collaboration
- Self-motivated with ability to work independently
- Attention to detail and code quality
- Continuous learning mindset
- Team player with mentoring capabilities

Role: Python Developer
Location: Hyderabad, Bangalore
Experience: 6+ Years
Skills needed:
Python developer with 5+ years of experience designing, developing, and maintaining scalable applications, with a strong focus on API integration. Must demonstrate proficiency in RESTful API consumption, third-party service integration, and troubleshooting API-related issues.

Tableau Server Administrator (10+ Yrs Exp.) 📊🔒
📍Location: Remote
🗓️ Experience: 10+ years
Mandatory Skills & Qualifications:
1. Proven expertise in Tableau architecture, clustering, scalability, and high availability.
2. Proficiency in PowerShell, Python, or Shell scripting.
3. Experience with cloud platforms (AWS, Azure, GCP) and Tableau Cloud.
4. Familiarity with database systems (SQL Server, Oracle, Snowflake).
5. Any relevant certification is a plus.



About NxtWave:
NxtWave is one of India’s fastest-growing edtech startups, transforming the way students learn and build careers in tech. With a strong community of learners across the country, we’re building cutting-edge products that make industry-ready skills accessible and effective at scale.
What will you do:
- Build and ship full-stack features end-to-end (frontend, backend, data).
- Own your code – from design to deployment with CI/CD pipelines.
- Make key architectural decisions and implement scalable systems.
- Lead code reviews, enforce clean code practices, and mentor SDE-1s.
- Optimize performance across frontend (Lighthouse) and backend (tracing, metrics)
- Ensure secure, accessible, and SEO-friendly applications.
- Collaborate with Product, Design, and Ops to deliver fast and effectively.
- Work in a fast-paced, high-impact environment with rapid release cycles.
What we are expecting:
- 3–5 years of experience building production-grade full-stack applications.
- Proficiency in React (or Angular/Vue), TypeScript, Node.js / NestJS / Django / Spring Boot.
- Strong understanding of REST/GraphQL APIs, relational & NoSQL databases.
- Experience with Docker, AWS (Lambda, EC2, S3, API Gateway), Redis, Elasticsearch.
- Solid testing experience – unit, integration, and E2E (Jest, Cypress, Playwright).
- Strong problem-solving, communication, and team collaboration skills.
- Passion for learning, ownership, and building great software.
Location: Hyderabad (In-office)
Apply here:- https://forms.gle/QeoNC8LmWY6pwckX9



Why NxtWave
As a Fullstack SDE1 at NxtWave, you
- Get first-hand experience of building applications and seeing them released quickly to NxtWave learners (within weeks)
- Get to take ownership of the features you build and work closely with the product team
- Work in a great culture that continuously empowers you to grow in your career
- Enjoy freedom to experiment & learn from mistakes (Fail Fast, Learn Faster)
- NxtWave is one of the fastest-growing edtech startups. Get first-hand experience in scaling the features you build as the company grows rapidly
- Build in a world-class developer environment by applying clean coding principles, code architecture, etc.
Responsibilities
- Design, implement, and ship user-centric features spanning frontend, backend, and database systems under guidance.
- Define and implement RESTful/GraphQL APIs and efficient, scalable database schemas.
- Build reusable, maintainable frontend components using modern state management practices.
- Develop backend services in Node.js or Python, adhering to clean-architecture principles.
- Write and maintain unit, integration, and end-to-end tests to ensure code quality and reliability.
- Containerize applications and configure CI/CD pipelines for automated builds and deployments.
- Enforce secure coding practices, accessibility standards (WCAG), and SEO fundamentals.
- Collaborate effectively with Product, Design, and engineering teams to understand and implement feature requirements.
- Own feature delivery from planning through production, and mentor interns or junior developers.
Qualifications & Skills
- 1+ years of experience building full-stack web applications.
- Proficiency in JavaScript (ES6+), TypeScript, HTML5, and CSS3 (Flexbox/Grid).
- Advanced experience with React (Hooks, Context, Router) or equivalent modern UI framework.
- Hands-on with state management patterns (Redux, MobX, or custom solutions).
- Strong backend skills in Node.js (Express/Fastify) or Python (Django/Flask/FastAPI).
- Expertise in designing REST and/or GraphQL APIs and integrating with backend services.
- Solid knowledge of MySQL/PostgreSQL and familiarity with NoSQL stores (Elasticsearch, Redis).
- Experience using build tools (Webpack, Vite), package managers (npm/Yarn), and Git workflows.
- Skilled in writing and maintaining tests with Jest, React Testing Library, Pytest, and Cypress.
- Familiar with Docker, CI/CD tools (GitHub Actions, Jenkins), and basic cloud deployments.
- Product-first thinker with strong problem-solving, debugging, and communication skills.
Qualities we'd love to find in you:
- The attitude to always strive for the best outcomes and an enthusiasm to deliver high quality software
- Strong collaboration abilities and a flexible & friendly approach to working with teams
- Strong determination with a constant eye on solutions
- Creative ideas and a problem-solving mindset
- Openness to objective criticism and a willingness to improve upon it
- Eagerness to learn and zeal to grow
- Strong communication skills are a huge plus
Work Location: Hyderabad

· Design, develop, and implement AI/ML models and algorithms.
· Focus on building Proof of Concept (POC) applications to demonstrate the feasibility and value of AI solutions.
· Write clean, efficient, and well-documented code.
· Collaborate with data engineers to ensure data quality and availability for model training and evaluation.
· Work closely with senior team members to understand project requirements and contribute to technical solutions.
· Troubleshoot and debug AI/ML models and applications.
· Stay up-to-date with the latest advancements in AI/ML.
· Utilize machine learning frameworks (e.g., TensorFlow, PyTorch, Scikit-learn) to develop and deploy models.
· Develop and deploy AI solutions on Google Cloud Platform (GCP).
· Implement data preprocessing and feature engineering techniques using libraries like Pandas and NumPy.
· Utilize Vertex AI for model training, deployment, and management.
· Integrate and leverage Google Gemini for specific AI functionalities.
Qualifications:
· Bachelor’s degree in Computer Science, Artificial Intelligence, or a related field.
· 3+ years of experience in developing and implementing AI/ML models.
· Strong programming skills in Python.
· Experience with machine learning frameworks such as TensorFlow, PyTorch, or Scikit-learn.
· Good understanding of machine learning concepts and techniques.
· Ability to work independently and as part of a team.
· Strong problem-solving skills.
· Good communication skills.
· Experience with Google Cloud Platform (GCP) is preferred.
· Familiarity with Vertex AI is a plus.
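To give a flavour of the "data preprocessing and feature engineering" work described above, here is a minimal, dependency-free sketch of z-score standardization. In practice this would be done with Pandas/NumPy or Scikit-learn's StandardScaler; the feature values below are invented for illustration:

```python
from statistics import mean, stdev

def standardize(column: list) -> list:
    """Z-score standardization: subtract the mean, divide by the std dev."""
    mu = mean(column)
    sigma = stdev(column)
    if sigma == 0:
        return [0.0] * len(column)  # a constant feature carries no signal
    return [(x - mu) / sigma for x in column]

# Hypothetical numeric feature, e.g. customer age
ages = [25.0, 35.0, 45.0, 55.0]
scaled = standardize(ages)
```

The same transformation is what `StandardScaler().fit_transform()` applies column-wise before feeding features to most ML algorithms.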

Job Summary:
We are looking for a skilled and motivated Python AWS Engineer to join our team. The ideal candidate will have strong experience in backend development using Python, cloud infrastructure on AWS, and building serverless or microservices-based architectures. You will work closely with cross-functional teams to design, develop, deploy, and maintain scalable and secure applications in the cloud.
Key Responsibilities:
- Develop and maintain backend applications using Python and frameworks like Django or Flask
- Design and implement serverless solutions using AWS Lambda, API Gateway, and other AWS services
- Develop data processing pipelines using services such as AWS Glue, Step Functions, S3, DynamoDB, and RDS
- Write clean, efficient, and testable code following best practices
- Implement CI/CD pipelines using tools like CodePipeline, GitHub Actions, or Jenkins
- Monitor and optimize system performance and troubleshoot production issues
- Collaborate with DevOps and front-end teams to integrate APIs and cloud-native services
- Maintain and improve application security and compliance with industry standards
Required Skills:
- Strong programming skills in Python
- Solid understanding of AWS cloud services (Lambda, S3, EC2, DynamoDB, RDS, IAM, API Gateway, CloudWatch, etc.)
- Experience with infrastructure as code (e.g., CloudFormation, Terraform, or AWS CDK)
- Good understanding of RESTful API design and microservices architecture
- Hands-on experience with CI/CD, Git, and version control systems
- Familiarity with containerization (Docker, ECS, or EKS) is a plus
- Strong problem-solving and communication skills
Preferred Qualifications:
- Experience with PySpark, Pandas, or data engineering tools
- Working knowledge of Django, Flask, or other Python frameworks
- AWS Certification (e.g., AWS Certified Developer – Associate) is a plus
Educational Qualification:
- Bachelor's or Master’s degree in Computer Science, Engineering, or related field
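The serverless pattern referenced above (Lambda behind API Gateway) boils down to a plain Python handler that receives an event dict and returns a response dict. A minimal sketch — the payload shape is hypothetical, and a real function would persist data via boto3 rather than just echoing it:

```python
import json

def handler(event: dict, context=None) -> dict:
    """AWS Lambda entry point behind an API Gateway proxy integration."""
    try:
        body = json.loads(event.get("body") or "{}")
        name = body["name"]  # required field in this hypothetical payload
    except (json.JSONDecodeError, KeyError):
        return {"statusCode": 400,
                "body": json.dumps({"error": "missing or invalid 'name'"})}
    # Real code would write to DynamoDB/S3 via boto3 here.
    return {"statusCode": 200,
            "body": json.dumps({"greeting": f"Hello, {name}"})}
```

Because the handler is just a function of a dict, it can be unit-tested locally without any AWS infrastructure.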

Role Overview:
We are seeking a Senior Software Engineer (SSE) with strong expertise in Kafka, Python, and Azure Databricks to lead and contribute to our healthcare data engineering initiatives. This role is pivotal in building scalable, real-time data pipelines and processing large-scale healthcare datasets in a secure and compliant cloud environment.
The ideal candidate will have a solid background in real-time streaming, big data processing, and cloud platforms, along with strong leadership and stakeholder engagement capabilities.
Key Responsibilities:
- Design and develop scalable real-time data streaming solutions using Apache Kafka and Python.
- Architect and implement ETL/ELT pipelines using Azure Databricks for both structured and unstructured healthcare data.
- Optimize and maintain Kafka applications, Python scripts, and Databricks workflows to ensure performance and reliability.
- Ensure data integrity, security, and compliance with healthcare standards such as HIPAA and HITRUST.
- Collaborate with data scientists, analysts, and business stakeholders to gather requirements and translate them into robust data solutions.
- Mentor junior engineers, perform code reviews, and promote engineering best practices.
- Stay current with evolving technologies in cloud, big data, and healthcare data standards.
- Contribute to the development of CI/CD pipelines and containerized environments (Docker, Kubernetes).
Required Skills & Qualifications:
- 4+ years of hands-on experience in data engineering roles.
- Strong proficiency in Kafka (including Kafka Streams, Kafka Connect, Schema Registry).
- Proficient in Python for data processing and automation.
- Experience with Azure Databricks (or readiness to ramp up quickly).
- Solid understanding of cloud platforms, with a preference for Azure (AWS/GCP is a plus).
- Strong knowledge of SQL and NoSQL databases; data modeling for large-scale systems.
- Familiarity with containerization tools like Docker and orchestration using Kubernetes.
- Exposure to CI/CD pipelines for data applications.
- Prior experience with healthcare datasets (EHR, HL7, FHIR, claims data) is highly desirable.
- Excellent problem-solving abilities and a proactive mindset.
- Strong communication and interpersonal skills to work in cross-functional teams.
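Streaming work like the above usually pairs a Kafka client (e.g., confluent-kafka or kafka-python) with a pure transformation step that can be unit-tested without a broker. A hedged sketch of that step — the record fields (`patient_id`, `event_type`) are invented for illustration, not a real HL7/FHIR schema:

```python
from typing import Optional

def transform_record(record: dict) -> Optional[dict]:
    """Validate and normalise one consumed event; return None to drop it."""
    if "patient_id" not in record or "event_type" not in record:
        return None  # malformed records are dropped (or routed to a DLQ)
    return {
        "patient_id": str(record["patient_id"]),
        "event_type": record["event_type"].lower().strip(),
    }

def process_batch(records: list) -> list:
    """The loop a Kafka consumer would run over a batch of polled messages."""
    out = []
    for r in records:
        cleaned = transform_record(r)
        if cleaned is not None:
            out.append(cleaned)
    return out
```

Keeping the transform free of Kafka client calls is what makes the pipeline testable and lets the same logic run inside a Databricks job.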


Job Title : Senior Python Developer
Experience : 7+ Years
Location : Remote or Hybrid (Gurgaon / Coimbatore / Hyderabad)
Job Summary :
We are looking for a highly skilled and motivated Senior Python Developer to join our dynamic engineering team.
The ideal candidate will have a strong foundation in web application development using Python and related frameworks. A passion for writing clean, scalable code and solving complex technical challenges is essential for success in this role.
Mandatory Skills : Python (3.x), FastAPI or Flask, PostgreSQL or Oracle, ORM, API Microservices, Agile Methodologies, Clean Code Practices.
Required Skills and Qualifications :
- 7+ Years of hands-on experience in Python (3.x) development.
- Strong proficiency in FastAPI or Flask frameworks.
- Experience with relational databases like PostgreSQL, Oracle, or similar, along with ORM tools.
- Demonstrated experience in building and maintaining API-based microservices.
- Solid grasp of Agile development methodologies and version control practices.
- Strong analytical and problem-solving skills.
- Ability to write clean, maintainable, and well-documented code.
Nice to Have :
- Experience with Google Cloud Platform (GCP) or other cloud providers.
- Exposure to Kubernetes and container orchestration tools.

We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.
Key Responsibilities:
- Design, develop, test, and maintain scalable ETL data pipelines using Python.
- Work extensively on Google Cloud Platform (GCP) services such as:
- Dataflow for real-time and batch data processing
- Cloud Functions for lightweight serverless compute
- BigQuery for data warehousing and analytics
- Cloud Composer for orchestration of data workflows (based on Apache Airflow)
- Google Cloud Storage (GCS) for managing data at scale
- IAM for access control and security
- Cloud Run for containerized applications
- Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
- Implement and enforce data quality checks, validation rules, and monitoring.
- Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
- Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
- Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
- Document pipeline designs, data flow diagrams, and operational support procedures.
Required Skills:
- 4–8 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience in working with version control systems like GitHub and knowledge of CI/CD practices.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
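The "transformation and cleansing logic" and "data quality checks" mentioned above can be sketched as a single row-level transform. The column names are hypothetical; in a real pipeline, a function like this would run inside a Dataflow step or a Cloud Composer task:

```python
from datetime import datetime

REQUIRED = {"id", "amount", "created_at"}

def clean_row(row: dict) -> dict:
    """One ETL transform step: type coercion plus basic quality checks."""
    missing = REQUIRED - row.keys()
    if missing:
        raise ValueError(f"missing columns: {sorted(missing)}")
    amount = float(row["amount"])
    if amount < 0:
        raise ValueError("amount must be non-negative")
    return {
        "id": int(row["id"]),
        "amount": round(amount, 2),
        # Normalise timestamps to an ISO date for the warehouse
        "created_at": datetime.fromisoformat(row["created_at"]).date().isoformat(),
    }
```

Raising on bad rows (rather than silently passing them through) is what lets downstream quality monitoring count and quarantine failures.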

Job Title: AI Solutioning Architect – Healthcare IT
Role Summary:
The AI Solutioning Architect leads the design and implementation of AI-driven solutions across the organization, ensuring alignment with business goals and healthcare IT standards. This role defines the AI/ML architecture, guides technical execution, and fosters innovation using platforms like Google Cloud (GCP).
Key Responsibilities:
- Architect scalable AI solutions from data ingestion to deployment.
- Align AI initiatives with business objectives and regulatory requirements (HIPAA).
- Collaborate with cross-functional teams to deliver AI projects.
- Lead POCs, evaluate AI tools/platforms, and promote GCP adoption.
- Mentor technical teams and ensure best practices in MLOps.
- Communicate complex concepts to diverse stakeholders.
Qualifications:
- Bachelor’s/Master’s in Computer Science or related field.
- 12+ years in software development/architecture with strong AI/ML focus.
- Experience in healthcare IT and compliance (HIPAA).
- Proficient in Python/Java and ML frameworks (TensorFlow, PyTorch).
- Hands-on with GCP (preferred) or other cloud platforms.
- Strong leadership, problem-solving, and communication skills.


About NxtWave
NxtWave is one of India’s fastest-growing ed-tech startups, reshaping the tech education landscape by bridging the gap between industry needs and student readiness. With prestigious recognitions such as Technology Pioneer 2024 by the World Economic Forum and Forbes India 30 Under 30, NxtWave’s impact continues to grow rapidly across India.
Our flagship on-campus initiative, NxtWave Institute of Advanced Technologies (NIAT), offers a cutting-edge 4-year Computer Science program designed to groom the next generation of tech leaders, located in Hyderabad’s global tech corridor.
Know more:
🌐 NxtWave | NIAT
About the Role
As a PhD-level Software Development Instructor, you will play a critical role in building India’s most advanced undergraduate tech education ecosystem. You’ll be mentoring bright young minds through a curriculum that fuses rigorous academic principles with real-world software engineering practices. This is a high-impact leadership role that combines teaching, mentorship, research alignment, and curriculum innovation.
Key Responsibilities
- Deliver high-quality classroom instruction in programming, software engineering, and emerging technologies.
- Integrate research-backed pedagogy and industry-relevant practices into classroom delivery.
- Mentor students in academic, career, and project development goals.
- Take ownership of curriculum planning, enhancement, and delivery aligned with academic and industry excellence.
- Drive research-led content development, and contribute to innovation in teaching methodologies.
- Support capstone projects, hackathons, and collaborative research opportunities with industry.
- Foster a high-performance learning environment in classes of 70–100 students.
- Collaborate with cross-functional teams for continuous student development and program quality.
- Actively participate in faculty training, peer reviews, and academic audits.
Eligibility & Requirements
- Ph.D. in Computer Science, IT, or a closely related field from a recognized university.
- Strong academic and research orientation, preferably with publications or project contributions.
- Prior experience in teaching/training/mentoring at the undergraduate/postgraduate level is preferred.
- A deep commitment to education, student success, and continuous improvement.
Must-Have Skills
- Expertise in Python, Java, JavaScript, and advanced programming paradigms.
- Strong foundation in Data Structures, Algorithms, OOP, and Software Engineering principles.
- Excellent communication, classroom delivery, and presentation skills.
- Familiarity with academic content tools like Google Slides, Sheets, Docs.
- Passion for educating, mentoring, and shaping future developers.
Good to Have
- Industry experience or consulting background in software development or research-based roles.
- Proficiency in version control systems (e.g., Git) and agile methodologies.
- Understanding of AI/ML, Cloud Computing, DevOps, Web or Mobile Development.
- A drive to innovate in teaching, curriculum design, and student engagement.
Why Join Us?
- Be at the forefront of shaping India’s tech education revolution.
- Work alongside IIT/IISc alumni, ex-Amazon engineers, and passionate educators.
- Competitive compensation with strong growth potential.
- Create impact at scale by mentoring hundreds of future-ready tech leaders.


About Cognida.ai:
Our Purpose is to boost your competitive advantage using AI and Analytics.
We Deliver tangible business impact with data-driven insights powered by AI. Drive revenue growth, increase profitability and improve operational efficiencies.
We Are technologists with keen business acumen - Forever curious, always on the front lines of technological advancements. Applying our latest learnings, and tools to solve your everyday business challenges.
We Believe the power of AI should not be the exclusive preserve of the few. Every business, regardless of its size or sector deserves the opportunity to harness the power of AI to make better decisions and drive business value.
We See a world where our AI and Analytics solutions democratise decision intelligence for all businesses. With Cognida.ai, our motto is ‘No enterprise left behind’.
Position: Python Fullstack Architect
Location: Hyderabad
Job Summary
We’re seeking a seasoned Python Fullstack Architect with 15+ years of experience to lead solution design, mentor teams, and drive technical excellence across projects. You'll work closely with stakeholders, contribute to architecture governance, and integrate modern technologies across the stack.
Key Responsibilities
- Design and review Python-based fullstack solution architectures.
- Guide development teams on best practices, modern frameworks, and cloud-native patterns.
- Engage with clients to translate business needs into scalable technical solutions.
- Stay current with tech trends and contribute to internal innovation initiatives.
Required Skills
- Strong expertise in Python (Django/Flask/FastAPI) and frontend frameworks (React, Angular, etc.).
- Cloud experience (AWS, Azure, or GCP) and DevOps/CI-CD setup.
- Familiarity with enterprise tools: RabbitMQ, Kafka, OAuth2, PostgreSQL, MongoDB.
- Solid understanding of microservices, API design, batch/stream processing.
- Strong leadership, mentoring, and architectural problem-solving skills.

Product company for financial operations automation platform

Mandatory Criteria :
- Candidate must have strong hands-on experience with Kubernetes (at least 2 years in production environments).
- Candidate should have expertise in at least one public cloud platform (GCP preferred; AWS, Azure, or OCI).
- Proficient in backend programming with Python, Java, or Kotlin (at least one is required).
- Candidate should have strong Backend experience.
- Hands-on experience with BigQuery or Snowflake for data analytics and integration.
About the Role
We are looking for a highly skilled and motivated Cloud Backend Engineer with 4–7 years of experience, who has worked extensively on at least one major cloud platform (GCP, AWS, Azure, or OCI). Experience with multiple cloud providers is a strong plus. As a Senior Development Engineer, you will play a key role in designing, building, and scaling backend services and infrastructure on cloud-native platforms.
Note: Experience with Kubernetes is mandatory.
Key Responsibilities
- Design and develop scalable, reliable backend services and cloud-native applications.
- Build and manage RESTful APIs, microservices, and asynchronous data processing systems.
- Deploy and operate workloads on Kubernetes with best practices in availability, monitoring, and cost-efficiency.
- Implement and manage CI/CD pipelines and infrastructure automation.
- Collaborate with frontend, DevOps, and product teams in an agile environment.
- Ensure high code quality through testing, reviews, and documentation.
Required Skills
- Strong hands-on experience with Kubernetes, with at least 2 years in production environments (mandatory).
- Expertise in at least one public cloud platform [GCP (Preferred), AWS, Azure, or OCI].
- Proficient in backend programming with Python, Java, or Kotlin (at least one is required).
- Solid understanding of distributed systems, microservices, and cloud-native architecture.
- Experience with containerization using Docker and Kubernetes-native deployment workflows.
- Working knowledge of SQL and relational databases.
Preferred Qualifications
- Experience working across multiple cloud platforms.
- Familiarity with infrastructure-as-code tools like Terraform or CloudFormation.
- Exposure to monitoring, logging, and observability stacks (e.g., Prometheus, Grafana, Cloud Monitoring).
- Hands-on experience with BigQuery or Snowflake for data analytics and integration.
Nice to Have
- Knowledge of NoSQL databases or event-driven/message-based architectures.
- Experience with serverless services, managed data pipelines, or data lake platforms.
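Cloud-native backends like the one described above lean heavily on retries with exponential backoff for transient failures. A minimal, dependency-free sketch (real services would typically use the cloud client library's built-in retry policy or a library such as tenacity):

```python
import time

def with_retries(fn, attempts: int = 3, base_delay: float = 0.01):
    """Call fn(); on failure, wait base_delay * 2^n and retry, then re-raise."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the error to the caller
            time.sleep(base_delay * (2 ** attempt))

# A stand-in for a flaky downstream call that succeeds on the third try
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"
```

Production versions would also add jitter to the delay and retry only on error classes known to be transient.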

Job Summary:
We are seeking a skilled Python Developer with a strong foundation in Artificial Intelligence and Machine Learning. You will be responsible for designing, developing, and deploying intelligent systems that leverage large datasets and cutting-edge ML algorithms to solve real-world problems.
Key Responsibilities:
- Design and implement machine learning models using Python and libraries like TensorFlow, PyTorch, or Scikit-learn.
- Perform data preprocessing, feature engineering, and exploratory data analysis.
- Develop APIs and integrate ML models into production systems using frameworks like Flask or FastAPI.
- Collaborate with data scientists, DevOps engineers, and backend teams to deliver scalable AI solutions.
- Optimize model performance and ensure robustness in real-time environments.
- Maintain clear documentation of code, models, and processes.
Required Skills:
- Proficiency in Python and ML libraries (NumPy, Pandas, Scikit-learn, TensorFlow, PyTorch).
- Strong understanding of ML algorithms (classification, regression, clustering, deep learning).
- Experience with data pipeline tools (e.g., Airflow, Spark) and cloud platforms (AWS, Azure, or GCP).
- Familiarity with containerization (Docker, Kubernetes) and CI/CD practices.
- Solid grasp of RESTful API development and integration.
Preferred Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Data Science, or related field.
- 2–5 years of experience in Python development with a focus on AI/ML.
- Exposure to MLOps practices and model monitoring tools.
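As an illustration of "integrating ML models into production systems" as described above, the core of a Flask/FastAPI predict route is: validate the request, call the model, shape the response. In this sketch a weights dict stands in for a trained TensorFlow/PyTorch model, and the payload shape is hypothetical:

```python
def predict_endpoint(payload: dict, weights: dict) -> dict:
    """The logic inside a predict route: validate, score, respond."""
    features = payload.get("features")
    if not isinstance(features, dict):
        return {"status": 400, "error": "expected a 'features' object"}
    # Stand-in for model.predict(): a linear score over known features.
    score = sum(weights.get(k, 0.0) * v for k, v in features.items())
    return {"status": 200, "prediction": round(score, 4)}
```

Keeping this logic in a plain function (with the web framework as a thin wrapper) makes the model integration unit-testable without running a server.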


POSITION / TITLE: Data Science Lead
Location: Offshore – Hyderabad/Bangalore/Pune
Who are we looking for?
Individuals with 8+ years of experience implementing and managing data science projects. Excellent working knowledge of traditional machine learning and LLM techniques.
The candidate must demonstrate the ability to navigate and advise on complex ML ecosystems from a model building and evaluation perspective. Experience in NLP and chatbots domains is preferred.
We acknowledge that the job market is blurring the lines between data roles: while software skills are necessary, the emphasis of this position is on data science, not on data, ML, or software engineering.
Responsibilities:
· Lead data science and machine learning projects, contributing to model development, optimization and evaluation.
· Perform data cleaning, feature engineering, and exploratory data analysis.
· Translate business requirements into technical solutions, document and communicate project progress, manage non-technical stakeholders.
· Collaborate with other DS and engineers to deliver projects.
Technical Skills – Must have:
· Experience in and understanding of the natural language processing (NLP) and large language model (LLM) landscape.
· Proficiency with Python for data analysis, supervised & unsupervised learning ML tasks.
· Ability to translate complex machine learning problem statements into specific deliverables and requirements.
· Should have worked with major cloud platforms such as AWS, Azure or GCP.
· Working knowledge of SQL and no-SQL databases.
· Ability to create data and ML pipelines for more efficient and repeatable data science projects using MLOps principles.
· Keep abreast of new tools, algorithms, and techniques in machine learning, and work to implement them in the organization.
· Strong understanding of evaluation and monitoring metrics for machine learning projects.
Technical Skills – Good to have:
· Track record of getting ML models into production
· Experience building chatbots.
· Experience with closed and open source LLMs.
· Experience with frameworks and technologies such as scikit-learn, BERT, LangChain, AutoGen, etc.
· Certifications or courses in data science.
Education:
· Bachelor’s, Master’s, or PhD degree in Computer Science, Engineering, Data Science, or a related field.
Process Skills:
· Understanding of Agile and Scrum methodologies.
· Ability to follow SDLC processes and contribute to technical documentation.
Behavioral Skills :
· Self-motivated and capable of working independently with minimal management supervision.
· Well-developed design, analytical & problem-solving skills
· Excellent communication and interpersonal skills.
· Excellent team player, able to work with virtual teams in several time zones.
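The "evaluation and monitoring metrics" requirement above can be illustrated with a dependency-free computation of precision, recall, and F1 for binary labels (Scikit-learn's `classification_report` is the usual tool in practice):

```python
def precision_recall_f1(y_true: list, y_pred: list):
    """Binary classification metrics from true/predicted 0-1 labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```

Tracking these per deployment window (rather than accuracy alone) is a common starting point for the model-monitoring work mentioned in the MLOps bullet.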

Job Role : DevOps Engineer (Python + DevOps)
Experience : 4 to 10 Years
Location : Hyderabad
Work Mode : Hybrid
Mandatory Skills : Python, Ansible, Docker, Kubernetes, CI/CD, Cloud (AWS/Azure/GCP)
Job Description :
We are looking for a skilled DevOps Engineer with expertise in Python, Ansible, Docker, and Kubernetes.
The ideal candidate will have hands-on experience automating deployments, managing containerized applications, and ensuring infrastructure reliability.
Key Responsibilities :
- Design and manage containerization and orchestration using Docker & Kubernetes.
- Automate deployments and infrastructure tasks using Ansible & Python.
- Build and maintain CI/CD pipelines for streamlined software delivery.
- Collaborate with development teams to integrate DevOps best practices.
- Monitor, troubleshoot, and optimize system performance.
- Enforce security best practices in containerized environments.
- Provide operational support and contribute to continuous improvements.
Required Qualifications :
- Bachelor’s in Computer Science/IT or related field.
- 4+ years of DevOps experience.
- Proficiency in Python and Ansible.
- Expertise in Docker and Kubernetes.
- Hands-on experience with CI/CD tools and pipelines.
- Experience with at least one cloud provider (AWS, Azure, or GCP).
- Strong analytical, communication, and collaboration skills.
Preferred Qualifications :
- Experience with Infrastructure-as-Code tools like Terraform.
- Familiarity with monitoring/logging tools like Prometheus, Grafana, or ELK.
- Understanding of Agile/Scrum practices.
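The "DevOps best practices" and CI/CD work described above often include small Python guardrail scripts wired into the pipeline. A hedged sketch: linting a simplified deployment spec before it ships (the field names here are a made-up, flattened stand-in for a real Kubernetes manifest):

```python
def lint_deployment(spec: dict) -> list:
    """CI guardrail: flag risky settings in a (simplified) deployment spec."""
    problems = []
    image = spec.get("image", "")
    if ":" not in image or image.endswith(":latest"):
        problems.append("image must be pinned to an explicit, non-latest tag")
    if spec.get("replicas", 0) < 2:
        problems.append("run at least 2 replicas for availability")
    if not spec.get("readiness_probe"):
        problems.append("define a readiness probe")
    return problems
```

A script like this would run as a pipeline step and fail the build if the returned list is non-empty, catching misconfigurations before they reach the cluster.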