Python Jobs in Hyderabad


Apply to 50+ Python Jobs in Hyderabad on CutShort.io. Explore the latest Python Job opportunities across top companies like Google, Amazon & Adobe.

Inferigence Quotient
Posted by Neeta Trivedi
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad
1 - 2 yrs
₹6L - ₹12L / yr
QML
Qt
C++
Python

We are seeking a highly skilled Qt/QML Engineer to design and develop advanced GUIs for aerospace applications. The role requires working closely with system architects, avionics software engineers, and mission systems experts to create reliable, intuitive, real-time UIs for mission-critical systems such as UAV ground control stations and cockpit displays.

Key Responsibilities

  • Design, develop, and maintain high-performance UI applications using Qt/QML (Qt Quick, QML, C++).
  • Translate system requirements into responsive, interactive, and user-friendly interfaces.
  • Integrate UI components with real-time data streams from avionics systems, UAVs, or mission control software.
  • Collaborate with aerospace engineers to ensure compliance with DO-178C or MIL-STD guidelines where applicable.
  • Optimise application performance for low-latency visualisation in mission-critical environments.
  • Implement data visualisation (raster and vector maps, telemetry, flight parameters, mission planning overlays).
  • Write clean, testable, and maintainable code while adhering to aerospace software standards.
  • Work with cross-functional teams (system engineers, hardware engineers, test teams) to validate UI against operational requirements.
  • Support debugging, simulation, and testing activities, including hardware-in-the-loop (HIL) setups.
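The data-stream integration described above is Qt/C++ work in practice, but the decode step can be sketched in Python. The packet layout below is purely hypothetical, not taken from the posting:

```python
import struct

# Hypothetical telemetry packet layout (an assumption for illustration):
# uint32 sequence, float latitude, float longitude, float altitude_m
PACKET_FORMAT = "<Ifff"  # little-endian, fixed size

def decode_telemetry(payload: bytes) -> dict:
    """Decode one fixed-size telemetry datagram into a dict for the UI layer."""
    seq, lat, lon, alt = struct.unpack(PACKET_FORMAT, payload)
    return {"seq": seq, "lat": lat, "lon": lon, "alt_m": alt}

# Round-trip example: encode a sample reading, then decode it.
sample = struct.pack(PACKET_FORMAT, 42, 17.385, 78.4867, 545.0)
reading = decode_telemetry(sample)
```

In a real ground-control station the decoded dict would feed a QML view model; the binary framing itself would typically live in the C++ layer.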

Required Qualifications

  • Bachelor’s / Master’s degree in Computer Science, Software Engineering, or related field.
  • 1-3 years of experience in developing Qt/QML-based applications (Qt Quick, QML, Qt Widgets).
  • Strong proficiency in C++ (11/14/17) and object-oriented programming.
  • Experience integrating UI with real-time data sources (TCP/IP, UDP, serial, CAN, DDS, etc.).
  • Knowledge of multithreading, performance optimisation, and memory management.
  • Familiarity with aerospace/automotive domain software practices or mission-critical systems.
  • Good understanding of UX principles for operator consoles and mission planning systems.
  • Strong problem-solving, debugging, and communication skills.

Desirable Skills

  • Experience with GIS/Mapping libraries (OpenSceneGraph, Cesium, Marble, etc.).
  • Knowledge of OpenGL, Vulkan, or 3D visualisation frameworks.
  • Exposure to DO-178C or aerospace software compliance.
  • Familiarity with UAV ground control software (QGroundControl, Mission Planner, etc.) or similar mission systems.
  • Experience with Linux and cross-platform development (Windows/Linux).
  • Scripting knowledge in Python for tooling and automation.
  • Background in defence, aerospace, automotive or embedded systems domain.

What We Offer

  • Opportunity to work on cutting-edge aerospace and defence technologies.
  • Collaborative and innovation-driven work culture.
  • Exposure to real-world avionics and mission systems.
  • Growth opportunities in autonomy, AI/ML for aerospace, and avionics UI systems.
Hyderabad
4 - 8 yrs
₹20L - ₹30L / yr
Generative AI
Artificial Intelligence (AI)
Machine Learning (ML)
Large Language Models (LLM)
Retrieval Augmented Generation (RAG)

We are seeking a talented AI/ML Engineer with strong hands-on experience in Generative AI and Large Language Models (LLMs) to join our Business Intelligence team. The role involves designing, developing, and deploying advanced AI/ML and GenAI-driven solutions to unlock business insights and enhance data-driven decision-making.


Key Responsibilities:

• Collaborate with business analysts and stakeholders to identify AI/ML and Generative AI use cases.

• Design and implement ML models for predictive analytics, segmentation, anomaly detection, and forecasting.

• Develop and deploy Generative AI solutions using LLMs (GPT, LLaMA, Mistral, etc.).

• Build and maintain Retrieval-Augmented Generation (RAG) pipelines and semantic search systems.

• Work with vector databases (FAISS, Pinecone, ChromaDB) for embedding storage and retrieval.

• Develop end-to-end AI/ML pipelines from data preprocessing to deployment.

• Integrate AI/ML and GenAI solutions into BI dashboards and reporting tools.

• Optimize models for performance, scalability, and reliability.

• Maintain documentation and promote knowledge sharing within the team.
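As a rough sketch of the retrieval step in the RAG pipelines mentioned above: a production system would use an embedding model plus a vector database such as FAISS or Pinecone, but the core idea is nearest-neighbour search over embeddings. The toy 3-dimensional vectors below are invented for illustration:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def retrieve(query_vec, store, k=2):
    """Return the k document ids whose embeddings are most similar to the query."""
    ranked = sorted(store, key=lambda d: cosine(query_vec, store[d]), reverse=True)
    return ranked[:k]

# Toy "embeddings" standing in for a real embedding model's output.
store = {
    "doc_refunds": [0.9, 0.1, 0.0],
    "doc_shipping": [0.1, 0.8, 0.2],
    "doc_pricing": [0.0, 0.2, 0.9],
}
hits = retrieve([0.85, 0.15, 0.05], store, k=1)
# The retrieved chunks would then be placed into the LLM prompt as context.
```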


Mandatory Requirements:

• 4+ years of relevant experience as an AI/ML Engineer.

• Hands-on experience in Generative AI and Large Language Models (LLMs) – Mandatory.

• Experience implementing RAG pipelines and prompt engineering techniques.

• Strong programming skills in Python.

• Experience with ML frameworks (TensorFlow, PyTorch, scikit-learn).

• Experience with vector databases (FAISS, Pinecone, ChromaDB).

• Strong understanding of SQL and database systems.

• Experience integrating AI solutions into BI tools (Power BI, Tableau).

• Strong analytical, problem-solving, and communication skills.

Good to Have:

• Experience with cloud platforms (AWS, Azure, GCP).

• Experience with Docker or Kubernetes.

• Exposure to NLP, computer vision, or deep learning use cases.

• Experience in MLOps and CI/CD pipelines

MARS Telecom Systems
Posted by Bisman Gill
Hyderabad
3+ yrs
Up to ₹15L / yr (varies)
Python
TensorFlow
PyTorch
OpenCV
Computer Vision

Computer Vision Engineer

Experience Range: 3–6 years

About the Role:

We are seeking a Computer Vision Engineer to design and implement vision-based solutions that power intelligent systems. You will work on algorithms and models that enable real-time image and video analysis for transportation and automation applications.


Key Responsibilities:

• Develop and optimize computer vision algorithms for object detection, tracking, and recognition

• Implement deep learning models using Python, OpenCV, PyTorch, TensorFlow

• Collaborate with data engineers to prepare and process large-scale image datasets

• Deploy vision models in production environments with cloud and edge computing support

• Research emerging techniques in computer vision and apply them to product development
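One small, standard piece of the object-detection work listed above is Intersection-over-Union (IoU), used to score how well a predicted box matches ground truth. A minimal pure-Python version (the example boxes are invented):

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# A detection overlapping half of the ground-truth box:
score = iou((0, 0, 10, 10), (5, 0, 15, 10))
```

The same metric underpins evaluation and non-maximum suppression in most detection pipelines, whichever framework (OpenCV, PyTorch, TensorFlow) does the heavy lifting.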


Required Skills:

• Strong proficiency in Python and computer vision libraries (OpenCV, PyTorch, TensorFlow)

• Experience with image preprocessing, feature extraction, and model training

• Knowledge of deep learning architectures (CNNs, RNNs, Transformers)

• Familiarity with cloud deployment and MLOps practices


Preferred Qualifications:

• Bachelor’s/Master’s degree in Computer Science, AI, or related field

• Prior experience in Intelligent Transportation Systems (ITS) or automotive vision applications

Global Digital Transformation Solutions Provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Hyderabad
5 - 7 yrs
₹15L - ₹21L / yr
Python
Terraform
PySpark
Amazon Web Services (AWS)

Job Details

Job Title: Lead I - Data Engineering (Python, AWS Glue, PySpark, Terraform)

Industry: Global digital transformation solutions provider

Domain - Information technology (IT)

Experience Required: 5-7 years

Employment Type: Full Time

Job Location: Hyderabad

CTC Range: Best in Industry

 

Job Description

Data Engineer with AWS, Python, Glue, Terraform, Step function and Spark

 

Skills: Python, AWS Glue, PySpark, Terraform (all mandatory)

 

******

Notice period - 0 to 15 days only

Job stability is mandatory

Location: Hyderabad 

Service Co

Agency job
via Vikash Technologies by Rishika Teja
Hyderabad
6 - 14 yrs
₹30L - ₹35L / yr
Artificial Intelligence (AI)
Machine Learning (ML)
Core AI
Windows Azure
MCP

Responsibilities:


• End-to-end design, development, and deployment of enterprise-grade AI solutions leveraging Azure AI, Google Vertex AI, or comparable cloud platforms.


• Architect and implement advanced AI systems, including agentic workflows, LLM integrations, MCP-based solutions, RAG pipelines, and scalable microservices.


• Oversee the development of Python-based applications, RESTful APIs, data processing pipelines, and complex system integrations.


• Define and uphold engineering best practices, including CI/CD automation, testing frameworks, model evaluation procedures, observability, and operational monitoring.


• Partner closely with product owners and business stakeholders to translate requirements into actionable technical designs, delivery plans, and execution roadmaps.


• Provide hands-on technical leadership, conducting code reviews, offering architectural guidance, and ensuring adherence to security, governance, and compliance standards.



• Communicate technical decisions, delivery risks, and mitigation strategies effectively to senior leadership and cross-functional teams.

 

Remote, Hyderabad
3 - 5 yrs
₹15L - ₹25L / yr
Natural Language Processing (NLP)
Large Language Models (LLM) tuning
Data Structures
Algorithms
Python

In this role, you'll be responsible for building machine-learning-based systems and conducting data analysis that improves the quality of our large geospatial data. You'll develop NLP models to extract information, use outlier detection to identify anomalies, and apply data science methods to quantify the quality of our data. You will take part in the development, integration, productionisation, and deployment of the models at scale, which requires a good combination of data science and software development skills.
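The outlier-detection step mentioned above can be sketched with a simple z-score rule using the standard library; real pipelines would likely use more robust methods, and the sample values below are invented:

```python
import statistics

def zscore_outliers(values, threshold=3.0):
    """Flag values more than `threshold` standard deviations from the mean."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []
    return [v for v in values if abs(v - mean) / stdev > threshold]

# Hypothetical road-segment lengths (km) with one corrupted record:
lengths = [1.2, 0.9, 1.1, 1.3, 1.0, 1.2, 0.8, 250.0]
bad = zscore_outliers(lengths, threshold=2.0)
```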


Responsibilities


  • Development of machine learning models
  • Building and maintaining software development solutions
  • Provide insights by applying data science methods
  • Take ownership of delivering features and improvements on time


Must-have Qualifications


  • 4 years' experience
  • Senior data scientist, preferably with knowledge of NLP
  • Strong programming skills and extensive experience with Python
  • Professional experience working with LLMs, transformers and open-source models from HuggingFace
  • Professional experience working with machine learning and data science, such as classification, feature engineering, clustering, anomaly detection and neural networks
  • Knowledgeable in classic machine learning algorithms (SVM, Random Forest, Naive Bayes, KNN etc.).
  • Experience using deep learning libraries and platforms, such as PyTorch
  • Experience with frameworks such as Sklearn, Numpy, Pandas, Polars
  • Excellent analytical and problem-solving skills
  • Excellent oral and written communication skills


Extra Merit Qualifications


  • Knowledge in at least one of the following: NLP, information retrieval, data mining
  • Ability to do statistical modeling and building predictive models
  • Programming skills and experience with Scala and/or Java
CAW.Tech
Posted by Ranjana Singh
Hyderabad
8 - 12 yrs
Best in industry
Python
Java
Microservices
Distributed Systems
PostgreSQL

Role Overview

We are hiring a Principal Datacenter Backend Developer to architect and build highly scalable, reliable backend platforms for modern data centers. This role owns control-plane and data-plane services powering orchestration, monitoring, automation, and operational intelligence across large-scale on-prem, hybrid, and cloud data center environments.

This is a hands-on principal IC role with strong architectural ownership and technical leadership responsibilities.


Key Responsibilities

  • Own end-to-end backend architecture for datacenter platforms (orchestration, monitoring, DCIM, automation).
  • Design and build high-availability distributed systems at scale.
  • Develop backend services using Java (Spring Boot / Micronaut / Quarkus) and/or Python (FastAPI / Flask / Django).
  • Build microservices for resource orchestration, telemetry ingestion, capacity and asset management.
  • Design REST/gRPC APIs and event-driven systems.
  • Drive performance optimization, scalability, and reliability best practices.
  • Embed SRE principles, observability, and security-by-design.
  • Mentor senior engineers and influence technical roadmap decisions.
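A minimal sketch of the event-driven pattern named above, using an in-process bus; production systems of this kind would use Kafka, Pulsar, or RabbitMQ as listed under required skills, and the telemetry payloads below are invented:

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process pub/sub; a stand-in for a real broker like Kafka."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Fan out each event to every handler registered on the topic.
        for handler in self._subscribers[topic]:
            handler(event)

# Telemetry ingestion wired to two independent consumers:
bus = EventBus()
readings, alerts = [], []
bus.subscribe("telemetry", readings.append)
bus.subscribe("telemetry", lambda e: alerts.append(e) if e["temp_c"] > 40 else None)
bus.publish("telemetry", {"rack": "r1", "temp_c": 45})
bus.publish("telemetry", {"rack": "r2", "temp_c": 22})
```

Decoupling producers from consumers this way is what lets monitoring, capacity, and alerting services scale independently.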


Required Skills

  • Strong hands-on experience in Java and/or Python.
  • Deep understanding of distributed systems and microservices.
  • Experience with Kubernetes, Docker, CI/CD, and cloud-native deployments.
  • Databases: PostgreSQL/MySQL, NoSQL, time-series data.
  • Messaging systems: Kafka / Pulsar / RabbitMQ.
  • Observability tools: Prometheus, Grafana, ELK/OpenSearch.
  • Secure backend design (OAuth2, RBAC, audit logging).


Nice to Have

  • Experience with DCIM, NMS, or infrastructure automation platforms.
  • Exposure to hyperscale or colocation data centers.
  • AI/ML-based monitoring or capacity planning experience.


Why Join

  • Architect mission-critical platforms for large-scale data centers.
  • High-impact principal role with deep technical ownership.
  • Work on complex, real-world distributed systems problems.


VirtuesTech
Posted by Budime Haripriya
Hyderabad
8 - 10 yrs
₹10L - ₹30L / yr
.NET Compact Framework
Python
Software design
agile methodology

Title: Team Lead – Software Development

(Lead a team of developers to deliver applications in line with product strategy and growth)

Experience: 8 – 10 years

Department: Information Technology

Classification: Full-Time

Location: Hybrid in Hyderabad, India (3 days onsite and 2 days remote)


Job Description:

Looking for a full-time Software Development Team Lead to lead our high-performing Information Technology team. This person will play a key role in Clarity’s business by overseeing a development team, focusing on existing systems and long-term growth. This person will serve as the technical leader, able to discuss data structures, new technologies, and methods of achieving system goals. This person will be crucial in facilitating collaboration among team members and providing mentoring.

Reporting to the Director, Software Development, this person will be responsible for the day-to-day operations of their team and be the first point of escalation and technical contact for the team.


Job Responsibilities:

  • Manages all activities of their software development team and sets goals for each team member to ensure timely project delivery.
  • Performs code reviews and writes code if needed.
  • Collaborates with the Information Technology department and business management team to establish priorities for the team’s plan and manage team performance.
  • Provides guidance on project requirements, developer processes, and end-user documentation.
  • Supports an excellent customer experience by being proactive in assessing escalations and working with the team to respond appropriately.
  • Uses technical expertise to contribute towards building best-in-class products. Analyzes business needs and develops a mix of internal and external software systems that work well together.
  • Using Clarity platforms, writes, reviews, and revises product requirements and specifications. Analyzes software requirements, implements design plans, and reviews unit tests. Participates in other areas of the software development process.


Required Skills:

  • A Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related discipline.
  • Excellent written and verbal communication skills.
  • Experience with .NET Framework, web applications, Windows applications, and web services.
  • Experience in developing and maintaining applications using C# .NET Core, ASP.NET MVC, and Entity Framework.
  • Experience in building responsive front ends using React.js, Angular.js, HTML5, CSS3, and JavaScript.
  • Experience in creating and managing databases, stored procedures, and complex queries with SQL Server.
  • Experience with Azure cloud infrastructure.
  • 8+ years of experience in designing and coding software in the above technology stack.
  • 3+ years of managing a team within a development organization.
  • 3+ years of experience in Agile methodologies.


Preferred Skills:

  • Experience in Python, WordPress, PHP.
  • Experience in using Azure DevOps.
  • Experience working with Salesforce or any other comparable ticketing system.
  • Experience in insurance/consumer benefits/file processing (EDI).

Global Digital Transformation Solutions Provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Chennai, Hyderabad, Kochi (Cochin), Noida, Pune, Thiruvananthapuram
7 - 10 yrs
₹21L - ₹30L / yr
Perforce
DevOps
Git
GitHub
Python

JOB DETAILS:

* Job Title: Specialist I - DevOps Engineering

* Industry: Global Digital Transformation Solutions Provider

* Salary: Best in Industry

* Experience: 7-10 years

* Location: Bengaluru (Bangalore), Chennai, Hyderabad, Kochi (Cochin), Noida, Pune, Thiruvananthapuram

 

Job Description

Job Summary:

As a DevOps Engineer focused on Perforce to GitHub migration, you will be responsible for executing seamless and large-scale source control migrations. You must be proficient with GitHub Enterprise and Perforce, possess strong scripting skills (Python/Shell), and have a deep understanding of version control concepts.

The ideal candidate is a self-starter, a problem-solver, and thrives on challenges while ensuring smooth transitions with minimal disruption to development workflows.

 

Key Responsibilities:

  • Analyze and prepare Perforce repositories — clean workspaces, merge streams, and remove unnecessary files.
  • Handle large files efficiently using Git Large File Storage (LFS) for files exceeding GitHub’s 100MB size limit.
  • Use git-p4 fusion (Python-based tool) to clone and migrate Perforce repositories incrementally, ensuring data integrity.
  • Define migration scope — determine how much history to migrate and plan the repository structure.
  • Manage branch renaming and repository organization for optimized post-migration workflows.
  • Collaborate with development teams to determine migration points and finalize migration strategies.
  • Troubleshoot issues related to file sizes, Python compatibility, network connectivity, or permissions during migration.

 

Required Qualifications:

  • Strong knowledge of Git/GitHub and preferably Perforce (Helix Core) — understanding of differences, workflows, and integrations.
  • Hands-on experience with P4-Fusion.
  • Familiarity with cloud platforms (AWS, Azure) and containerization technologies (Docker, Kubernetes).
  • Proficiency in migration tools such as git-p4 fusion — installation, configuration, and troubleshooting.
  • Ability to identify and manage large files using Git LFS to meet GitHub repository size limits.
  • Strong scripting skills in Python and Shell for automating migration and restructuring tasks.
  • Experience in planning and executing source control migrations — defining scope, branch mapping, history retention, and permission translation.
  • Familiarity with CI/CD pipeline integration to validate workflows post-migration.
  • Understanding of source code management (SCM) best practices, including version history and repository organization in GitHub.
  • Excellent communication and collaboration skills for cross-team coordination and migration planning.
  • Proven practical experience in repository migration, large file management, and history preservation during Perforce to GitHub transitions.

 

Skills: GitHub, Kubernetes, Perforce (Helix Core), DevOps tools

 

Must-Haves

Git/GitHub (advanced), Perforce (Helix Core) (advanced), Python/Shell scripting (strong), P4-Fusion (hands-on experience), Git LFS (proficient)

CAW.Tech
Posted by Ranjana Singh
Hyderabad
5 - 8 yrs
Best in industry
Python
Django
PostgreSQL
MySQL
FastAPI

We are looking for a Staff Engineer - Python to join one of our engineering teams at our office in Hyderabad.


What would you do?

  • Own end-to-end delivery of backend projects from requirements and LLDs to production.
  • Lead technical design and execution, ensuring scalability, reliability, and code quality.
  • Build and integrate chatbot and AI-driven workflows with third-party systems.
  • Diagnose and resolve complex performance and production issues.
  • Drive testing, documentation, and engineering best practices.
  • Mentor engineers and act as the primary technical point of contact for the project/client.


Who Should Apply?

  • 5+ years of hands-on experience building backend systems in Python.
  • Proficiency in building web-based applications using Django or similar frameworks.
  • In-depth knowledge of the Python stack and API-first system design.
  • Experience working with SQL and NoSQL databases including PostgreSQL/MySQL, MongoDB, ElasticSearch, or key-value stores.
  • Strong experience owning design, delivery, and technical decision-making.
  • Proven ability to lead and mentor engineers through reviews and execution.
  • Clear communicator with a high-ownership, delivery-focused mindset.


Nice to Have

  • Experience contributing to system-level design discussions.
  • Prior exposure to AI/LLM-based systems or conversational platforms.
  • Experience working directly with clients or external stakeholders.
  • Background in fast-paced product or service environments.
Auxo AI
Posted by Kritika Dhingra
Bengaluru (Bangalore), Mumbai, Hyderabad, Gurugram
2 - 8 yrs
₹10L - ₹30L / yr
Amazon Web Services (AWS)
Data Transformation Tool (DBT)
SQL
Python
Spark

AuxoAI is seeking a skilled and experienced Data Engineer to join our dynamic team. The ideal candidate will have 3-7 years of prior experience in data engineering, with a strong background in working on modern data platforms. This role offers an exciting opportunity to work on diverse projects, collaborating with cross-functional teams to design, build, and optimize data pipelines and infrastructure.


Location: Bangalore, Hyderabad, Mumbai, and Gurgaon


Responsibilities:

· Designing, building, and operating scalable on-premises or cloud data architecture

· Analyzing business requirements and translating them into technical specifications

· Design, develop, and implement data engineering solutions using DBT on cloud platforms (Snowflake, Databricks)

· Design, develop, and maintain scalable data pipelines and ETL processes

· Collaborate with data scientists and analysts to understand data requirements and implement solutions that support analytics and machine learning initiatives.

· Optimize data storage and retrieval mechanisms to ensure performance, reliability, and cost-effectiveness

· Implement data governance and security best practices to ensure compliance and data integrity

· Troubleshoot and debug data pipeline issues, providing timely resolution and proactive monitoring

· Stay abreast of emerging technologies and industry trends, recommending innovative solutions to enhance data engineering capabilities.
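A toy illustration of the pipeline pattern described above, with SQLite standing in for Snowflake/Databricks and a SQL transform standing in for a dbt model; the table and column names are invented:

```python
import sqlite3

def run_pipeline(rows):
    """Toy ETL: load raw order rows, transform to per-customer revenue, store the result."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE raw_orders (customer TEXT, amount REAL)")
    con.executemany("INSERT INTO raw_orders VALUES (?, ?)", rows)  # extract + load
    # Transform step, analogous to a dbt model materialised as a table:
    con.execute("""
        CREATE TABLE customer_revenue AS
        SELECT customer, SUM(amount) AS revenue
        FROM raw_orders
        GROUP BY customer
        ORDER BY customer
    """)
    return con.execute("SELECT * FROM customer_revenue").fetchall()

result = run_pipeline([("acme", 120.0), ("acme", 80.0), ("globex", 50.0)])
```

In a real DBT project the transform would live in a versioned `.sql` model file and the warehouse, tests, and orchestration would be configured rather than hand-coded.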


Requirements


· Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

· Overall 3+ years of prior experience in data engineering, with a focus on designing and building data pipelines

· Experience of working with DBT to implement end-to-end data engineering processes on Snowflake and Databricks

· Comprehensive understanding of the Snowflake and Databricks ecosystem

· Strong programming skills in languages like SQL and Python or PySpark.

· Experience with data modeling, ETL processes, and data warehousing concepts.

· Familiarity with implementing CI/CD processes or other orchestration tools is a plus.


Navitas Business Consulting
Posted by Solomon Yericherla
Hyderabad
5 - 10 yrs
₹15L - ₹22L / yr
Java
Python
Amazon Web Services (AWS)
JavaScript
RESTful APIs

5–10 years of experience in backend or full-stack development (Java, C#, Python, or Node.js preferred).

• Design, develop, and deploy full-stack web applications (front-end, back-end, APIs, and databases).

• Build responsive, user-friendly UIs using modern JavaScript frameworks (React, Vue, or Angular).

• Develop robust backend services and RESTful or GraphQL APIs using Node.js, Python, Java, or similar technologies.

• Manage and optimize databases (SQL and NoSQL).

• Collaborate with UX/UI designers, product managers, and QA engineers to refine requirements and deliver solutions.

• Implement CI/CD pipelines and support cloud deployments (AWS, Azure, or GCP).

• Write clean, testable, and maintainable code with appropriate documentation.

• Monitor performance, identify bottlenecks, and troubleshoot production issues.

• Stay up to date with emerging technologies and recommend improvements to tools, processes, and architecture.

• Proficiency in front-end technologies: HTML5, CSS3, JavaScript/TypeScript, and frameworks like React, Vue.js, or Angular.

• Strong experience with server-side programming (Node.js, Python/Django, Java/Spring Boot, or .NET).

• Experience with databases: PostgreSQL, MySQL, MongoDB, or similar.

• Familiarity with API design, microservices architecture, and REST/GraphQL best practices.

• Working knowledge of version control (Git/GitHub) and DevOps pipelines.

• Understanding of cloud platforms (AWS, Azure, GCP) and containerization (Docker, Kubernetes).

Global digital transformation solutions provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Hyderabad
4 - 10 yrs
₹8L - ₹20L / yr
Automated testing
Amazon Web Services (AWS)
Python
Test Automation (QA)
AWS CloudFormation

JOB DETAILS:

* Job Title: Tester III - Software Testing (Automation testing + Python + AWS)

* Industry: Global digital transformation solutions provider

* Salary: Best in Industry

* Experience: 4 -10 years

* Location: Hyderabad

Job Description

Responsibilities:

  • Develop, maintain, and execute automation test scripts using Python.
  • Build reliable and reusable test automation frameworks for web and cloud-based applications.
  • Work with AWS cloud services for test execution, environment management, and integration needs.
  • Perform functional, regression, and integration testing as part of the QA lifecycle.
  • Analyze test failures, identify root causes, raise defects, and collaborate with development teams.
  • Participate in requirement review, test planning, and strategy discussions.
  • Contribute to CI/CD setup and integration of automation suites.
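A minimal sketch of the kind of automated functional check described above, using the standard-library unittest runner (real suites here would likely use PyTest or Selenium); the helper under test is hypothetical:

```python
import io
import unittest

def normalise_username(raw: str) -> str:
    """Hypothetical application helper standing in for real product code."""
    return raw.strip().lower()

class TestNormaliseUsername(unittest.TestCase):
    """Functional checks of the kind a nightly regression suite would automate."""
    def test_strips_whitespace_and_case(self):
        self.assertEqual(normalise_username("  Alice "), "alice")

    def test_clean_input_unchanged(self):
        self.assertEqual(normalise_username("bob"), "bob")

# Run the suite programmatically, as a CI pipeline step might:
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestNormaliseUsername)
result = unittest.TextTestRunner(stream=io.StringIO()).run(suite)
```

In CI, the same suite would be invoked by the pipeline and its exit status used to gate the build.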

 

Required Experience:

  • Strong hands-on experience in Automation Testing.
  • Proficiency in Python for automation scripting and framework development.
  • Understanding and practical exposure to AWS services (Lambda, EC2, S3, CloudWatch, or similar).
  • Good knowledge of QA methodologies, SDLC/STLC, and defect management.
  • Familiarity with automation tools/frameworks (e.g., Selenium, PyTest).
  • Experience with Git or other version control systems.

 

Good to Have:

  • API testing experience (REST, Postman, REST Assured).
  • Knowledge of Docker/Kubernetes.
  • Exposure to Agile/Scrum environment.

 

Skills: Automation testing, Python, Java, ETL, AWS

 

Global digital transformation solutions provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Hyderabad
4 - 10 yrs
₹8L - ₹20L / yr
Automated testing
Python
Web applications
Software Testing (QA)
Systems Development Life Cycle (SDLC)

JOB DETAILS:

* Job Title: Tester III - Software Testing (Automation Testing + Python + Azure)

* Industry: Global digital transformation solutions provide

* Salary: Best in Industry

* Experience: 4 -10 years

* Location: Hyderabad

Job Description

Responsibilities:

  • Design, develop, and execute automation test scripts using Python.
  • Build and maintain scalable test automation frameworks.
  • Work with Azure DevOps for CI/CD, pipeline automation, and test management.
  • Perform functional, regression, and integration testing for web and cloud‑based applications.
  • Analyze test results, log defects, and collaborate with developers for timely closure.
  • Participate in requirement analysis, test planning, and strategy discussions.
  • Ensure test coverage, maintain script quality, and optimize automation suites.


Required Experience:

  • Strong hands-on expertise in automation testing for web/cloud applications.
  • Solid proficiency in Python for creating automation scripts and frameworks.
  • Experience working with Azure services and Azure DevOps pipelines.
  • Good understanding of QA methodologies, SDLC/STLC, and defect lifecycle.
  • Experience with tools like Selenium, PyTest, or similar frameworks (good to have).
  • Familiarity with Git or other version control tools.

 

Good to Have:

  • Experience with API testing (REST, Postman, or similar tools)
  • Knowledge of Docker/Kubernetes
  • Exposure to Agile/Scrum environments

 

Skills: Automation testing, Python, Java, Azure

Global digital transformation solutions provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Hyderabad
4 - 10 yrs
₹8L - ₹20L / yr
Automated testing
Software Testing (QA)
Mobile App Testing (QA)
Web applications
JavaScript

JOB DETAILS:

* Job Title: Tester III - Software Testing- Playwright + API testing

* Industry: Global digital transformation solutions provider

* Salary: Best in Industry

* Experience: 4 -10 years

* Location: Hyderabad

Job Description

Responsibilities:

  • Design, develop, and maintain automated test scripts for web applications using Playwright.
  • Perform API testing using industry-standard tools and frameworks.
  • Collaborate with developers, product owners, and QA teams to ensure high-quality releases.
  • Analyze test results, identify defects, and track them to closure.
  • Participate in requirement reviews, test planning, and test strategy discussions.
  • Ensure automation coverage, maintain reusable test frameworks, and optimize execution pipelines.

 

Required Experience:

  • Strong hands-on experience in Automation Testing for web-based applications.
  • Proven expertise in Playwright (JavaScript, TypeScript, or Python-based scripting).
  • Solid experience in API testing (Postman, REST Assured, or similar tools).
  • Good understanding of software QA methodologies, tools, and processes.
  • Ability to write clear, concise test cases and automation scripts.
  • Experience with CI/CD pipelines (Jenkins, GitHub Actions, Azure DevOps) is an added advantage.

 

Good to Have:

  • Knowledge of cloud environments (AWS/Azure)
  • Experience with version control tools like Git
  • Familiarity with Agile/Scrum methodologies

 

Skills: Automation testing, SQL, API testing, SoapUI testing, Playwright

IXG Inc
Hyderabad
0 - 1 yrs
₹20000 - ₹40000 / mo
Design thinking
AI Agents
Python
MongoDB
PostgreSQL

AI-Native Software Developer Intern


Build real AI agents used daily across the company

We’re looking for a high-agency, AI-native software developer intern to help us build internal AI agents that improve productivity across our entire company (80–100 people using them daily).


You will ship real systems, used by real teams, with real impact.

If you’ve never built anything outside coursework, this role is probably not a fit.


What You’ll Work On

You will work directly on designing, building, deploying, and iterating AI agents that power internal workflows.

Examples of problems you may tackle:


Internal AI agents for:

  • Knowledge retrieval across Notion / docs / Slack
  • Automated report generation
  • Customer support assistance
  • Process automation (ops, hiring, onboarding, etc.)
  • Decision-support copilots
  • Prompt engineering + structured outputs + tool-using agents

Building workflows using:

  • LLM APIs
  • Vector databases
  • Agent frameworks
  • Internal dashboards

You will also focus on:

  • Improving reliability, latency, cost, and usability of AI systems
  • Designing real UX around AI tools (not just scripts)

You will own features end-to-end:

  • Problem understanding
  • Solution design
  • Implementation
  • Testing
  • Deployment
  • Iteration based on user feedback
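As a hedged sketch of the knowledge-retrieval agents mentioned above: a production system would use LLM APIs and a vector database, but the core retrieve-then-answer loop can be illustrated with plain keyword overlap. All document names and contents below are made up.

```python
# Minimal keyword-overlap retrieval sketch; real systems would use
# embeddings and a vector database instead of token overlap.
def tokenize(text):
    return set(text.lower().split())

def retrieve(query, docs, top_k=1):
    """Rank documents by token overlap with the query."""
    scored = sorted(
        docs.items(),
        key=lambda kv: len(tokenize(query) & tokenize(kv[1])),
        reverse=True,
    )
    return [name for name, _ in scored[:top_k]]

def answer(query, docs):
    """Ground a response in the best-matching document."""
    best = retrieve(query, docs)[0]
    return f"Based on '{best}': {docs[best]}"

# Hypothetical internal documents, for illustration only.
docs = {
    "onboarding.md": "new hires request laptop access via the IT portal",
    "expenses.md": "submit expense reports before the fifth of each month",
}
print(answer("how do new hires get laptop access", docs))
```

In a real agent, `answer` would pass the retrieved document to an LLM as context rather than returning it verbatim.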


What We Expect From You

You must:

  • Be AI-native: you actively use tools like:
      • ChatGPT / Claude / Cursor / Copilot
      • AI for debugging, scaffolding, refactoring
      • Prompt iteration
      • Rapid prototyping
  • Be comfortable with at least one programming language (Python, TypeScript, JS, etc.)
  • Have strong critical thinking:
      • You question requirements
      • You think about edge cases
      • You optimize systems, not just make them “work”
  • Be high agency:
      • You don’t wait for step-by-step instructions
      • You proactively propose solutions
      • You take ownership of outcomes
  • Be able to learn fast on the job

Help will be provided, but you will not be spoon-fed.


Absolute Requirement (Non-Negotiable)

If you have not built any side projects with a visible output, you will most likely be rejected.

We expect at least one of:

  • A deployed web app
  • A GitHub repo with meaningful commits
  • A working AI tool
  • A live demo link
  • A product you built and shipped
  • An agent, automation, bot, or workflow you created


Bonus Points (Strong Signals)

These are not required but will strongly differentiate you:

  • Built projects using:
      • LLM APIs (OpenAI, Anthropic, etc.)
      • LangChain / LlamaIndex / custom agent frameworks
      • Vector DBs like Pinecone, Weaviate, FAISS
      • RAG systems
  • Experience deploying to Vercel, Fly.io, Render, AWS, etc.
  • Built internal tools for a team before
  • Strong product intuition (you care about UX, not just code)
  • Experience automating your own workflows using scripts or AI


What You’ll Gain

You will get:

  • Real experience building AI agents used daily
  • Ownership over production systems
  • Deep exposure to:
      • AI architecture
      • Product thinking
      • Iterative engineering
      • Tradeoffs (cost vs latency vs accuracy)
  • A portfolio that actually means something in 2026
  • A strong shot at long-term roles based on performance

If you perform well, you won’t just leave with a certificate; you'll leave with real-world building experience.


Who This Is Perfect For

  • People who already build things for fun
  • People who automate their own life with scripts/tools
  • People who learn by shipping
  • People who prefer responsibility over structure
  • People who are excited by ambiguity

Who This Is Not For

Be honest with yourself:

  • If you need step-by-step instructions
  • If you avoid open-ended problems
  • If you’ve never built anything outside assignments
  • If you dislike using AI tools while coding

This will be frustrating for you.


How To Apply

Send:

  • Your GitHub
  • Links to projects (deployed preferred)
  • A short note explaining:
      • What you built
      • Why you built it
      • What you’d improve if you had more time

Strong portfolios beat strong resumes.

Read more
Kanerika Software

at Kanerika Software

3 candid answers
2 recruiters
Ariba Khan
Posted by Ariba Khan
Hyderabad, Indore, Ahmedabad
3 - 5 yrs
Up to ₹20L / yr (varies)
SQL
Snowflake
Airflow
Python

About Kanerika:

Kanerika Inc. is a premier global software products and services firm that specializes in providing innovative solutions and services for data-driven enterprises. Our focus is to empower businesses to achieve their digital transformation goals and maximize their business impact through the effective use of data and AI.


We leverage cutting-edge technologies in data analytics, data governance, AI-ML, GenAI/ LLM and industry best practices to deliver custom solutions that help organizations optimize their operations, enhance customer experiences, and drive growth.


Awards and Recognitions:

Kanerika has won several awards over the years, including:

1. Best Place to Work 2023 by Great Place to Work®

2. Top 10 Most Recommended RPA Start-Ups in 2022 by RPA Today

3. NASSCOM Emerge 50 Award in 2014

4. Frost & Sullivan India 2021 Technology Innovation Award for its Kompass composable solution architecture

5. Kanerika has also been recognized for its commitment to customer privacy and data security, having achieved ISO 27701, SOC2, and GDPR compliances.


Working for us:

Kanerika is rated 4.6/5 on Glassdoor, for many good reasons. We truly value our employees' growth, well-being, and diversity, and people’s experiences bear this out. At Kanerika, we offer a host of enticing benefits that create an environment where you can thrive both personally and professionally. From our inclusive hiring practices and mandatory training on creating a safe work environment to our flexible working hours and generous parental leave, we prioritize the well-being and success of our employees.


Our commitment to professional development is evident through our mentorship programs, job training initiatives, and support for professional certifications. Additionally, our company-sponsored outings and various time-off benefits ensure a healthy work-life balance. Join us at Kanerika and become part of a vibrant and diverse community where your talents are recognized, your growth is nurtured, and your contributions make a real impact. See the benefits section below for the perks you’ll get while working for Kanerika.


Role Responsibilities: 

The following are high-level responsibilities you will take on, though the role is not limited to these:

  • Design, development, and implementation of modern data pipelines, data models, and ETL/ELT processes.
  • Architect and optimize data lake and warehouse solutions using Microsoft Fabric, Databricks, or Snowflake.
  • Enable business analytics and self-service reporting through Power BI and other visualization tools.
  • Collaborate with data scientists, analysts, and business users to deliver reliable and high-performance data solutions.
  • Implement and enforce best practices for data governance, data quality, and security.
  • Mentor and guide junior data engineers; establish coding and design standards.
  • Evaluate emerging technologies and tools to continuously improve the data ecosystem.
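To make the pipeline responsibility above concrete, here is a minimal, hedged ETL sketch using only the standard library: extract raw CSV, apply a quality check and type casts, and load into an in-memory SQLite table standing in for a warehouse. The data and column names are invented; production pipelines would run in Airflow against Snowflake or a similar platform.

```python
import csv
import io
import sqlite3

# Toy raw extract; real data would come from a source system or file drop.
RAW = """order_id,amount,country
1, 19.99 ,in
2,5.00,us
3,,in
"""

def extract(raw):
    """Parse the raw CSV into dict rows."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Apply a data-quality filter and cast types."""
    out = []
    for r in rows:
        if not r["amount"].strip():        # drop rows failing the quality check
            continue
        out.append((int(r["order_id"]), float(r["amount"]), r["country"].upper()))
    return out

def load(rows):
    """Load cleaned rows into a warehouse-style table (SQLite stand-in)."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (order_id INT, amount REAL, country TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    return conn

conn = load(transform(extract(RAW)))
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)
```

In an orchestrated setup, each function would be a separate task so failures can be retried independently.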


Required Qualifications:

  • Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
  • 3-5 years of experience in data engineering or data platform development.
  • Strong hands-on experience in SQL, Snowflake, Python, and Airflow.
  • Solid understanding of data modeling, data governance, security, and CI/CD practices.

Preferred Qualifications:

  • Familiarity with data modeling techniques and practices for Power BI.
  • Knowledge of Azure Databricks or other data processing frameworks.
  • Knowledge of Microsoft Fabric or other Cloud Platforms.


What we need?

· B.Tech in Computer Science or equivalent.


Why join us?

  • Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
  • Gain hands-on experience in data engineering with exposure to real-world projects.
  • Opportunity to learn from experienced professionals and enhance your technical skills.
  • Contribute to exciting initiatives and make an impact from day one.
  • Competitive compensation and potential for growth within the company.
  • Recognized for excellence in data and AI solutions with industry awards and accolades.


Employee Benefits:

1. Culture:

  • Open Door Policy: Encourages open communication and accessibility to management.
  • Open Office Floor Plan: Fosters a collaborative and interactive work environment.
  • Flexible Working Hours: Allows employees to have flexibility in their work schedules.
  • Employee Referral Bonus: Rewards employees for referring qualified candidates.
  • Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.


2. Inclusivity and Diversity:

  • Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
  • Mandatory POSH training: Promotes a safe and respectful work environment.


3. Health Insurance and Wellness Benefits:

  • GMC and Term Insurance: Offers medical coverage and financial protection.
  • Health Insurance: Provides coverage for medical expenses.
  • Disability Insurance: Offers financial support in case of disability.


4. Child Care & Parental Leave Benefits:

  • Company-sponsored family events: Creates opportunities for employees and their families to bond.
  • Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
  • Family Medical Leave: Offers leave for employees to take care of family members' medical needs.


5. Perks and Time-Off Benefits:

  • Company-sponsored outings: Organizes recreational activities for employees.
  • Gratuity: Provides a monetary benefit as a token of appreciation.
  • Provident Fund: Helps employees save for retirement.
  • Generous PTO: Offers more than the industry standard for paid time off.
  • Paid sick days: Allows employees to take paid time off when they are unwell.
  • Paid holidays: Gives employees paid time off for designated holidays.
  • Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.


6. Professional Development Benefits:

  • L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development.
  • Mentorship Program: Offers guidance and support from experienced professionals.
  • Job Training: Provides training to enhance job-related skills.
  • Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
  • Promote from Within: Encourages internal growth and advancement opportunities.
Read more
Hyderabad
10 - 14 yrs
₹30L - ₹38L / yr
Python
Django
RESTful APIs
NoSQL Databases
Communication Skills

Role Summary

We are looking for a seasoned Python/Django expert with 10–14 years of real-world development experience and a strong background in leading engineering teams. The selected candidate will be responsible for managing complex technical initiatives, mentoring team members, ensuring best coding practices, and partnering closely with cross-functional teams. This position demands deep technical proficiency, strong leadership capability, and exceptional communication skills.

Primary Responsibilities

· Lead, guide, and mentor a team of Python/Django engineers, offering hands-on technical support and direction.

· Architect, design, and deliver secure, scalable, and high-performing web applications.

· Manage the complete software development lifecycle including requirements gathering, system design, development, testing, deployment, and post-launch maintenance.

· Ensure compliance with coding standards, architectural patterns, and established development best practices.

· Collaborate with product teams, QA, UI/UX, and other stakeholders to ensure timely and high-quality product releases.

· Perform detailed code reviews, optimize system performance, and resolve production-level issues.

· Drive engineering improvements such as automation, CI/CD implementation, and modernization of outdated systems.

· Create and maintain technical documentation while providing regular updates to leadership and stakeholders.

Required Skills & Qualifications

· 10–14 years of professional experience in software development with strong expertise in Python and Django.

· Solid understanding of key web technologies, including REST APIs, HTML, CSS, and JavaScript.

· Hands-on experience working with relational and NoSQL databases (such as PostgreSQL, MySQL, or MongoDB).

· Familiarity with major cloud platforms (AWS, Azure, or GCP) and container tools like Docker and Kubernetes is a plus.

· Proficient in Git workflows, CI/CD pipelines, and automated testing tools.

· Strong analytical and problem-solving skills, especially in designing scalable and high-availability systems.

· Excellent communication skills—both written and verbal.

· Demonstrated leadership experience in mentoring teams and managing technical deliverables.

· Must be available to work on-site in the Hyderabad office; remote work is not allowed.

Preferred Qualifications

· Experience with microservices, asynchronous frameworks (such as FastAPI or Celery), or event-driven architectures.

· Familiarity with Agile/Scrum methodologies.

· Previous background as a technical lead or engineering manager.

Read more
IT Services & Staffing Solutions Industry

IT Services & Staffing Solutions Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Hyderabad
12 - 14 yrs
₹29L - ₹38L / yr
Amazon Web Services (AWS)
DevOps
Terraform
Troubleshooting
Amazon VPC

REVIEW CRITERIA:

MANDATORY:

  • Strong Hands-On AWS Cloud Engineering / DevOps Profile
  • Mandatory (Experience 1): Must have 12+ years of experience in AWS Cloud Engineering / Cloud Operations / Application Support
  • Mandatory (Experience 2): Must have strong hands-on experience supporting AWS production environments (EC2, VPC, IAM, S3, ALB, CloudWatch)
  • Mandatory (Infrastructure as a code): Must have hands-on Infrastructure as Code experience using Terraform in production environments
  • Mandatory (AWS Networking): Strong understanding of AWS networking and connectivity (VPC design, routing, NAT, load balancers, hybrid connectivity basics)
  • Mandatory (Cost Optimization): Exposure to cost optimization and usage tracking in AWS environments
  • Mandatory (Core Skills): Experience handling monitoring, alerts, incident management, and root cause analysis
  • Mandatory (Soft Skills): Strong communication skills and stakeholder coordination skills


ROLE & RESPONSIBILITIES:

We are looking for a hands-on AWS Cloud Engineer to support day-to-day cloud operations, automation, and reliability of AWS environments. This role works closely with the Cloud Operations Lead, DevOps, Security, and Application teams to ensure stable, secure, and cost-effective cloud platforms.


KEY RESPONSIBILITIES:

  • Operate and support AWS production environments across multiple accounts
  • Manage infrastructure using Terraform and support CI/CD pipelines
  • Support Amazon EKS clusters, upgrades, scaling, and troubleshooting
  • Build and manage Docker images and push to Amazon ECR
  • Monitor systems using CloudWatch and third-party tools; respond to incidents
  • Support AWS networking (VPCs, NAT, Transit Gateway, VPN/DX)
  • Assist with cost optimization, tagging, and governance standards
  • Automate operational tasks using Python, Lambda, and Systems Manager


IDEAL CANDIDATE:

  • Strong hands-on AWS experience (EC2, VPC, IAM, S3, ALB, CloudWatch)
  • Experience with Terraform and Git-based workflows
  • Hands-on experience with Kubernetes / EKS
  • Experience with CI/CD tools (GitHub Actions, Jenkins, etc.)
  • Scripting experience in Python or Bash
  • Understanding of monitoring, incident management, and cloud security basics


NICE TO HAVE:

  • AWS Associate-level certifications
  • Experience with Karpenter, Prometheus, New Relic
  • Exposure to FinOps and cost optimization practices
Read more
Semi-Conductor Industry

Semi-Conductor Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Hyderabad
10 - 12 yrs
₹30L - ₹35L / yr
Signal integrity
Systems analysis and design
Schematic
Python
Perl

MANDATORY CRITERIA:

  • Education: B.Tech / M.Tech in ECE / CSE / IT
  • Experience: 10–12 years in hardware board design, system hardware engineering, and full product deployment cycles
  • Proven expertise in digital, analog, and power electronic circuit analysis & design
  • Strong hands-on experience designing boards with SoCs, FPGAs, CPLDs, and MPSoC architectures
  • Deep understanding of signal integrity, EMI/EMC, and high-speed design considerations
  • Must have successfully completed at least two hardware product development cycles from high-level design to final deployment
  • Ability to independently handle schematic design, design analysis (DC drop, SI), and cross-team design reviews
  • Experience in sourcing & procurement of electronic components, PCBs, and mechanical parts for embedded/IoT/industrial hardware
  • Strong experience in board bring-up, debugging, issue investigation, and cross-functional triage with firmware/software teams
  • Expertise in hardware validation, test planning, test execution, equipment selection, debugging, and report preparation
  • Proficiency in Cadence Allegro or Altium EDA tools (mandatory)
  • Experience coordinating with layout, mechanical, SI, EMC, manufacturing, and supply chain teams
  • Strong understanding of manufacturing services, production pricing models, supply chain, and logistics for electronics/electromechanical components


DESCRIPTION:

COMPANY OVERVIEW:

The company is a semiconductor and embedded system design company with a focus on Embedded, Turnkey ASICs, Mixed Signal IP, Semiconductor & Product Engineering and IoT solutions catering to Aerospace & Defence, Consumer Electronics, Automotive, Medical and Networking & Telecommunications.


REQUIRED SKILLS:

  • Extensive experience in hardware board design across multiple product field-deployment cycles.
  • Strong foundation and expertise in analyzing digital, analog, and power electronic circuits.
  • Proficient with SoC, FPGAs, CPLD and MPSOC architecture-based board designs.
  • Knowledgeable in signal integrity, EMI/EMC concepts for digital and power electronics.
  • Completed at least two projects from high-level design to final product-level deployment.
  • Capable of independently managing a product’s schematics and design analyses (DC drop, signal integrity), and coordinating reviews with peers from the layout, mechanical, SI, and EMC teams.
  • Sourcing and procurement of electronic components, PCBs, and mechanical parts for cutting-edge IoT, embedded, and industrial product development.
  • Experienced in board bring-up, issue investigation, and triage in collaboration with firmware and software teams.
  • Skilled in preparing hardware design documentation, validation test planning, identifying necessary test equipment, test development, execution, debugging, and report preparation.
  • Effective communication and interpersonal skills for collaborative work with cross-functional teams, including post-silicon bench validation, BIOS, and driver development/QA.
  • Hands-on experience with Cadence Allegro/Altium EDA tools is essential.
  • Familiarity with programming and scripting languages like Python and Perl, and experience in test automation is advantageous.
  • Excellent exposure coordinating manufacturing services, production pricing models, supply chain, and logistics in the electronics and electromechanical components domain.
Read more
Semi-Conductor Industry

Semi-Conductor Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Hyderabad, Bengaluru (Bangalore)
10 - 12 yrs
₹30L - ₹35L / yr
Python
Perl
EDA
Test Automation (QA)
Supply Chain Management (SCM)

MANDATORY CRITERIA:

  • Education: B.Tech / M.Tech in ECE / CSE / IT
  • Experience: 10–12 years in hardware board design, system hardware engineering, and full product deployment cycles
  • Proven expertise in digital, analog, and power electronic circuit analysis & design
  • Strong hands-on experience designing boards with SoCs, FPGAs, CPLDs, and MPSoC architectures
  • Deep understanding of signal integrity, EMI/EMC, and high-speed design considerations
  • Must have successfully completed at least two hardware product development cycles from high-level design to final deployment
  • Ability to independently handle schematic design, design analysis (DC drop, SI), and cross-team design reviews
  • Experience in sourcing & procurement of electronic components, PCBs, and mechanical parts for embedded/IoT/industrial hardware
  • Strong experience in board bring-up, debugging, issue investigation, and cross-functional triage with firmware/software teams
  • Expertise in hardware validation, test planning, test execution, equipment selection, debugging, and report preparation
  • Proficiency in Cadence Allegro or Altium EDA tools (mandatory)
  • Experience coordinating with layout, mechanical, SI, EMC, manufacturing, and supply chain teams
  • Strong understanding of manufacturing services, production pricing models, supply chain, and logistics for electronics/electromechanical components


DESCRIPTION:

COMPANY OVERVIEW:

The company is a semiconductor and embedded system design company with a focus on Embedded, Turnkey ASICs, Mixed Signal IP, Semiconductor & Product Engineering and IoT solutions catering to Aerospace & Defence, Consumer Electronics, Automotive, Medical and Networking & Telecommunications.


REQUIRED SKILLS:

  • Extensive experience in hardware board design across multiple product field-deployment cycles.
  • Strong foundation and expertise in analyzing digital, analog, and power electronic circuits.
  • Proficient with SoC, FPGAs, CPLD and MPSOC architecture-based board designs.
  • Knowledgeable in signal integrity, EMI/EMC concepts for digital and power electronics.
  • Completed at least two projects from high-level design to final product-level deployment.
  • Capable of independently managing a product’s schematics and design analyses (DC drop, signal integrity), and coordinating reviews with peers from the layout, mechanical, SI, and EMC teams.
  • Sourcing and procurement of electronic components, PCBs, and mechanical parts for cutting-edge IoT, embedded, and industrial product development.
  • Experienced in board bring-up, issue investigation, and triage in collaboration with firmware and software teams.
  • Skilled in preparing hardware design documentation, validation test planning, identifying necessary test equipment, test development, execution, debugging, and report preparation.
  • Effective communication and interpersonal skills for collaborative work with cross-functional teams, including post-silicon bench validation, BIOS, and driver development/QA.
  • Hands-on experience with Cadence Allegro/Altium EDA tools is essential.
  • Familiarity with programming and scripting languages like Python and Perl, and experience in test automation is advantageous.
  • Excellent exposure coordinating manufacturing services, production pricing models, supply chain, and logistics in the electronics and electromechanical components domain.
Read more
Moolya Software Testing Private Limited
Durga Anand
Posted by Durga Anand
Hyderabad
5 - 8 yrs
₹5L - ₹23L / yr
Java
Selenium
BDD
Cucumber
Rest Assured

Job Title: QA Automation Engineer

Key Responsibilities & Skills:

  • Strong hands-on experience in Java automation testing
  • Expertise in Selenium for web application automation
  • Experience with BDD frameworks using Cucumber (feature files and step definitions)
  • Hands-on experience in API automation using Rest Assured
  • Working knowledge of Python automation scripting
  • Experience with Robot Framework for test automation
  • Ability to design, develop, and maintain scalable automation frameworks
  • Experience in test execution, reporting, and defect tracking
  • Strong analytical, problem-solving, and communication skills
Read more
Highfly Sourcing

at Highfly Sourcing

2 candid answers
Highfly Hr
Posted by Highfly Hr
Singapore, Switzerland, New Zealand, Dubai, Dublin, Ireland, Augsburg, Germany, Manchester (United Kingdom), Qatar, Kuwait, Malaysia, Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Goa
3 - 5 yrs
₹15L - ₹25L / yr
SQL
PHP
Python
Data Visualization
Data Structures

We are seeking a motivated Data Analyst to support business operations by analyzing data, preparing reports, and delivering meaningful insights. The ideal candidate should be comfortable working with data, identifying patterns, and presenting findings in a clear and actionable way.

Key Responsibilities:

  • Collect, clean, and organize data from internal and external sources
  • Analyze large datasets to identify trends, patterns, and opportunities
  • Prepare regular and ad-hoc reports for business stakeholders
  • Create dashboards and visualizations using tools like Power BI or Tableau
  • Work closely with cross-functional teams to understand data requirements
  • Ensure data accuracy, consistency, and quality across reports
  • Document data processes and analysis methods
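As a small, hedged example of the trend-analysis work described above, using invented sales records and only the standard library:

```python
from collections import defaultdict

# Hypothetical sales records; a real analysis would read these from a
# database or file rather than a literal list.
records = [
    {"month": "2024-01", "region": "south", "sales": 120},
    {"month": "2024-01", "region": "north", "sales": 80},
    {"month": "2024-02", "region": "south", "sales": 150},
    {"month": "2024-02", "region": "north", "sales": 90},
]

def monthly_totals(rows):
    """Aggregate sales per month."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["month"]] += row["sales"]
    return dict(totals)

def month_over_month_growth(totals):
    """Percentage growth between consecutive months."""
    months = sorted(totals)
    return {
        m2: round((totals[m2] - totals[m1]) / totals[m1] * 100, 1)
        for m1, m2 in zip(months, months[1:])
    }

totals = monthly_totals(records)
print(month_over_month_growth(totals))
```

The same aggregation would typically feed a Power BI or Tableau dashboard rather than a printed dictionary.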


Read more
Hashone Careers
Madhavan I
Posted by Madhavan I
Hyderabad
6 - 10 yrs
₹15L - ₹28L / yr
Data Analytics
Python
SQL
Data Warehouse (DWH)
Data modeling

Job Description

Role: Data Analyst

Experience: 6 - 9 Years

Location: Hyderabad

WorkMode: Work from Office (5 Days)


Overview

We are seeking a highly skilled Data Analyst with 6+ years of experience in analytics, data modeling, and advanced SQL. The ideal candidate has strong expertise in building scalable data models using dbt, writing efficient Python scripts, and delivering high-quality insights that support data-driven decision-making.


Key Responsibilities

Design, develop, and maintain data models using dbt (Core and dbt Cloud).

Build and optimize complex SQL queries to support reporting, analytics, and data pipelines.

Write Python scripts for data transformation, automation, and analytics workflows.

Ensure data quality, integrity, and consistency across multiple data sources.

Collaborate with cross-functional teams (Engineering, Product, Business) to understand data needs.

Develop dashboards and reports to visualize insights (using tools such as Tableau, Looker, or Power BI).

Perform deep-dive exploratory analysis to identify trends, patterns, and business opportunities.

Document data models, pipelines, and processes.

Contribute to scaling the analytics stack and improving data architecture.


Required Qualifications

6 - 9 years of hands-on experience in data analytics or data engineering.

Expert-level skills in SQL (complex joins, window functions, performance tuning).

Strong experience building and maintaining dbt data models.

Proficiency in Python for data manipulation, scripting, and automation.

Solid understanding of data warehousing concepts (e.g., dimensional modeling, ELT/ETL pipelines).

Experience with cloud data platforms (Snowflake, BigQuery, Redshift, etc.).

Strong analytical thinking and problem-solving skills.

Excellent communication skills with the ability to present insights to stakeholders.

Trino and lakehouse architecture experience is good to have.
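As a concrete illustration of the window-function skills listed above, the sketch below computes a per-customer running total in SQLite (3.25+). The same SQL pattern applies on Snowflake, BigQuery, or Redshift; the table and data are invented.

```python
import sqlite3

# In-memory table standing in for a warehouse fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
INSERT INTO orders VALUES
  ('alice', '2024-01-05', 100),
  ('alice', '2024-02-10', 150),
  ('bob',   '2024-01-20', 80);
""")

# Running total per customer, ordered by date, via a window function.
rows = conn.execute("""
SELECT customer, order_date, amount,
       SUM(amount) OVER (
         PARTITION BY customer ORDER BY order_date
       ) AS running_total
FROM orders
ORDER BY customer, order_date
""").fetchall()
for row in rows:
    print(row)
```

In a dbt project, the same SELECT would live in a model file, with the source table referenced via `ref()` instead of a literal name.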


Read more
Financial Services Industry

Financial Services Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Hyderabad
4 - 5 yrs
₹10L - ₹20L / yr
Python
CI/CD
SQL
Kubernetes
Stakeholder management

Required Skills: CI/CD Pipeline, Kubernetes, SQL Database, Excellent Communication & Stakeholder Management, Python

 

Criteria:

Looking for candidates with a notice period of 15 to 30 days.

Looking for candidates from the Hyderabad location only.

Looking for candidates from EPAM only.

1. 4+ years of software development experience

2. Strong experience with Kubernetes, Docker, and CI/CD pipelines in cloud-native environments.

3. Hands-on with NATS for event-driven architecture and streaming.

4. Skilled in microservices, RESTful APIs, and containerized app performance optimization.

5. Strong in problem-solving, team collaboration, clean code practices, and continuous learning.

6.  Proficient in Python (Flask) for building scalable applications and APIs.

7. Focus: Java, Python, Kubernetes, Cloud-native development

8. SQL database 

 

Description

Position Overview

We are seeking a skilled Developer to join our engineering team. The ideal candidate will have strong expertise in Java and Python ecosystems, with hands-on experience in modern web technologies, messaging systems, and cloud-native development using Kubernetes.


Key Responsibilities

  • Design, develop, and maintain scalable applications using Java and Spring Boot framework
  • Build robust web services and APIs using Python and Flask framework
  • Implement event-driven architectures using NATS messaging server
  • Deploy, manage, and optimize applications in Kubernetes environments
  • Develop microservices following best practices and design patterns
  • Collaborate with cross-functional teams to deliver high-quality software solutions
  • Write clean, maintainable code with comprehensive documentation
  • Participate in code reviews and contribute to technical architecture decisions
  • Troubleshoot and optimize application performance in containerized environments
  • Implement CI/CD pipelines and follow DevOps best practices
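To illustrate the NATS messaging patterns (pub/sub and queue groups) referenced in this role, here is a hedged, pure-Python sketch of the delivery semantics. A real service would use the nats-py client against a NATS server; the subjects, handlers, and broker class below are invented for the example.

```python
import itertools
from collections import defaultdict

class MiniBroker:
    """Toy broker mimicking NATS delivery: plain subscribers receive every
    message (fan-out); queue-group members share messages round-robin."""
    def __init__(self):
        self.subscribers = defaultdict(list)   # subject -> callbacks
        self.queue_groups = defaultdict(dict)  # subject -> {group: [callbacks]}
        self._rr = {}                          # round-robin state per group

    def subscribe(self, subject, callback, queue=None):
        if queue is None:
            self.subscribers[subject].append(callback)
        else:
            self.queue_groups[subject].setdefault(queue, []).append(callback)

    def publish(self, subject, msg):
        for cb in self.subscribers[subject]:   # fan-out to all plain subs
            cb(msg)
        for group, members in self.queue_groups[subject].items():
            cycle = self._rr.setdefault(
                (subject, group), itertools.cycle(range(len(members)))
            )
            members[next(cycle)](msg)          # exactly one member per message

received = []
broker = MiniBroker()
broker.subscribe("orders", lambda m: received.append(("audit", m)))
broker.subscribe("orders", lambda m: received.append(("w1", m)), queue="workers")
broker.subscribe("orders", lambda m: received.append(("w2", m)), queue="workers")
broker.publish("orders", "o-1")
broker.publish("orders", "o-2")
print(received)
```

The key property shown: the audit subscriber sees every order, while the two workers split the load, which is how queue groups give horizontal scaling.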

Required Qualifications

  • Bachelor's degree in Computer Science, Information Technology, or related field
  • 4+ years of experience in software development
  • Strong proficiency in Java with deep understanding of web technology stack
  • Hands-on experience developing applications with Spring Boot framework
  • Solid understanding of Python programming language with practical Flask framework experience
  • Working knowledge of NATS server for messaging and streaming data
  • Experience deploying and managing applications in Kubernetes
  • Understanding of microservices architecture and RESTful API design
  • Familiarity with containerization technologies (Docker)
  • Experience with version control systems (Git)


Skills & Competencies

  • Technical skills: Java (Spring Boot, Spring Cloud, Spring Security) 
  • Python (Flask, SQLAlchemy, REST APIs)
  • NATS messaging patterns (pub/sub, request/reply, queue groups)
  • Kubernetes (deployments, services, ingress, ConfigMaps, Secrets)
  • Web technologies (HTTP, REST, WebSocket, gRPC)
  • Container orchestration and management
  • Soft skills: problem-solving and analytical thinking
  • Strong communication and collaboration
  • Self-motivated with ability to work independently
  • Attention to detail and code quality
  • Continuous learning mindset
  • Team player with mentoring capabilities


Read more
Digital Convergence Technologies
Pune, Bengaluru (Bangalore), Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Mumbai, Hyderabad
5 - 8 yrs
₹40L - ₹45L / yr
Artificial Intelligence (AI)
Data-flow analysis
Microsoft SharePoint
API
Python

AI Agent Builder – Internal Functions and Data Platform Development Tools


About the Role:

We are seeking a forward-thinking AI Agent Builder to lead the design, development, deployment, and usage reporting of Microsoft Copilot and other AI-powered agents across our data platform development tools and internal business functions. This role will be instrumental in driving automation, improving onboarding, and enhancing operational efficiency through intelligent, context-aware assistants.

This role is central to our GenAI transformation strategy. You will help shape the future of how our teams interact with data, reduce administrative burden, and unlock new efficiencies across the organization. Your work will directly contribute to our “Art of the Possible” initiative—demonstrating tangible business value through AI.

You Will:

•                 Copilot Agent Development: Use Microsoft Copilot Studio and Agent Builder to create, test, and deploy AI agents that automate workflows, answer queries, and support internal teams.

•                 Data Engineering Enablement: Build agents that assist with data connector scaffolding, pipeline generation, and onboarding support for engineers.

•                 Knowledge Base Integration: Curate and integrate documentation (e.g., ERDs, connector specs) into Copilot-accessible repositories (SharePoint, Confluence) to support contextual AI responses.

•                 Prompt Engineering: Design reusable prompt templates and conversational flows to streamline repeated tasks and improve agent usability.
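
As a hedged illustration of the reusable-template idea above, the sketch below uses Python's `string.Template`; the connector-onboarding wording and placeholder names are hypothetical, not taken from any specific Copilot configuration:

```python
from string import Template

# A reusable prompt template: placeholders are filled per task, keeping the
# tone and output-format instructions consistent across agent invocations.
CONNECTOR_ONBOARDING_PROMPT = Template(
    "You are an assistant helping a data engineer onboard a new connector.\n"
    "Source system: $source\n"
    "Target dataset: $target\n"
    "Summarize the required scaffolding steps as a numbered list."
)

def render_prompt(source: str, target: str) -> str:
    # substitute() raises KeyError if a placeholder is left unfilled,
    # which catches broken templates early.
    return CONNECTOR_ONBOARDING_PROMPT.substitute(source=source, target=target)
```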

•                 Tool Evaluation & Integration: Assess and integrate complementary AI tools (e.g., GitLab Duo, Databricks AI, Notebook LM) to extend Copilot capabilities.

•                 Cross-Functional Collaboration: Partner with product, delivery, PMO, and security teams to identify high-value use cases and scale successful agent implementations.

•                 Governance & Monitoring: Ensure agents align with Responsible AI principles, monitor performance, and iterate based on feedback and evolving business needs.

•                 Adoption and Usage Reporting: Use Microsoft Viva Insights and other tools to report on user adoption, usage and business value delivered.

What We're Looking For:

•                 Proven experience with Microsoft 365 Copilot, Copilot Studio, or similar AI platforms (e.g., ChatGPT, Claude).

•                 Strong understanding of data engineering workflows, tools (e.g., Git, Databricks, Unity Catalog), and documentation practices.

•                 Familiarity with SharePoint, Confluence, and Microsoft Graph connectors.

•                 Experience in prompt engineering and conversational UX design.

•                 Ability to translate business needs into scalable AI solutions.

•                 Excellent communication and collaboration skills across technical and non-technical teams.

Bonus Points:

•                 Experience with GitLab Duo, Notebook LM, or other AI developer tools.

•                 Background in enterprise data platforms, ETL pipelines, or internal business systems.

•                 Exposure to AI governance, security, and compliance frameworks.

•                 Prior work in a regulated industry (e.g., healthcare, finance) is a plus.

Read more
IT Services Industry

IT Services Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Hyderabad
10 - 12 yrs
₹20L - ₹35L / yr
Signal integrity
Systems analysis and design
Schematic
Hardware
EMI
+16 more

Required Skills: Advanced Hardware Board Design Expertise, Signal Integrity, EMI/EMC & Design Analysis, Board Bring-Up & Troubleshooting, EDA Tools & Technical Documentation, Cross-Functional & Supply Chain Coordination

 

Criteria:

  • Education: B.Tech / M.Tech in ECE / CSE / IT
  • Experience: 10–12 years in hardware board design, system hardware engineering, and full product deployment cycles
  • Proven expertise in digital, analog, and power electronic circuit analysis & design
  • Strong hands-on experience designing boards with SoCs, FPGAs, CPLDs, and MPSoC architectures
  • Deep understanding of signal integrity, EMI/EMC, and high-speed design considerations
  • Must have successfully completed at least two hardware product development cycles from high-level design to final deployment
  • Ability to independently handle schematic design, design analysis (DC drop, SI), and cross-team design reviews
  • Experience in sourcing & procurement of electronic components, PCBs, and mechanical parts for embedded/IoT/industrial hardware
  • Strong experience in board bring-up, debugging, issue investigation, and cross-functional triage with firmware/software teams
  • Expertise in hardware validation, test planning, test execution, equipment selection, debugging, and report preparation
  • Proficiency in Cadence Allegro or Altium EDA tools (mandatory)
  • Experience coordinating with layout, mechanical, SI, EMC, manufacturing, and supply chain teams
  • Strong understanding of manufacturing services, production pricing models, supply chain, and logistics for electronics/electromechanical components

 

Description

REQUIRED SKILLS:

• Extensive experience in hardware board design across multiple product field-deployment cycles.

• Strong foundation and expertise in analyzing digital, analog, and power electronic circuits.

• Proficient with SoC-, FPGA-, CPLD-, and MPSoC-architecture-based board designs.

• Knowledgeable in signal integrity, EMI/EMC concepts for digital and power electronics.

• Completed at least two projects from high-level design to final product-level deployment.

• Capable of independently managing a product’s schematics and design analyses (DC drop, signal integrity), and coordinating reviews with peers on the layout, mechanical, SI, and EMC teams.

• Sourcing and procurement of electronic components, PCBs, and mechanical parts for cutting-edge IoT, embedded, and industrial product development.

• Experienced in board bring-up, issue investigation, and triage in collaboration with firmware and software teams.

• Skilled in preparing hardware design documentation, validation test planning, identifying necessary test equipment, test development, execution, debugging, and report preparation.

• Effective communication and interpersonal skills for collaborative work with cross-functional teams, including post-silicon bench validation, BIOS, and driver development/QA.

• Hands-on experience with Cadence Allegro/Altium EDA tools is essential.

• Familiarity with programming and scripting languages like Python and Perl, and experience in test automation is advantageous.

• Should have excellent exposure to coordinating manufacturing services, production pricing models, supply chain, and logistics in the electronics and electromechanical components domain.

 


Education Requirements: 

B. Tech / M. Tech (ECE/ CSE/ IT)

Experience - 10 to 12 Years


 

Read more
Deqode

at Deqode

1 recruiter
Samiksha Agrawal
Posted by Samiksha Agrawal
Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Mumbai, Pune, Bengaluru (Bangalore), Hyderabad, Jaipur, Bhopal
5 - 8 yrs
₹5L - ₹13L / yr
skill iconPython
Azure
Artificial Intelligence (AI)
FastAPI
skill iconFlask
+3 more

Job Description: Python-Azure AI Developer

Experience: 5+ years

Locations: Bangalore | Pune | Chennai | Jaipur | Hyderabad | Gurgaon | Bhopal

Mandatory Skills:

  • Python: Expert-level proficiency with FastAPI/Flask
  • Azure Services: Hands-on experience integrating Azure cloud services
  • Databases: PostgreSQL, Redis
  • AI Expertise: Exposure to Agentic AI technologies, frameworks, or SDKs with strong conceptual understanding

Good to Have:

  • Workflow automation tools (n8n or similar)
  • Experience with LangChain, AutoGen, or other AI agent frameworks
  • Azure OpenAI Service knowledge

Key Responsibilities:

  • Develop AI-powered applications using Python and Azure
  • Build RESTful APIs with FastAPI/Flask
  • Integrate Azure services for AI/ML workloads
  • Implement agentic AI solutions
  • Database optimization and management
  • Workflow automation implementation
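
The routing idea behind frameworks like FastAPI or Flask (mapping an HTTP method and path to a handler) can be sketched in a few lines of plain Python. This is a toy dispatcher for illustration only; the endpoint and response below are hypothetical, not a real framework API:

```python
# Minimal routing table illustrating RESTful dispatch: a framework like
# FastAPI maps (method, path template) to a handler; this toy does the
# same matching by hand for single-parameter paths.
ROUTES = {}

def route(method, path):
    def register(handler):
        ROUTES[(method, path)] = handler
        return handler
    return register

@route("GET", "/agents/{id}")
def get_agent(agent_id):
    # Hypothetical lookup; a real handler would query the database.
    return {"id": agent_id, "status": "active"}

def dispatch(method, path):
    """Match a request against registered templates, e.g. /agents/42."""
    for (m, template), handler in ROUTES.items():
        if m != method:
            continue
        parts = path.strip("/").split("/")
        tparts = template.strip("/").split("/")
        if len(parts) == len(tparts) and all(
            t.startswith("{") or t == p for t, p in zip(tparts, parts)
        ):
            params = [p for t, p in zip(tparts, parts) if t.startswith("{")]
            return 200, handler(*params)
    return 404, {"error": "not found"}
```

A real FastAPI app replaces all of this with decorated path operations plus automatic validation, but the underlying dispatch is the same shape.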


Read more
Global digital transformation solutions provider.

Global digital transformation solutions provider.

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Chennai, Kochi (Cochin), Trivandrum, Hyderabad, Thiruvananthapuram
8 - 10 yrs
₹10L - ₹25L / yr
Business Analysis
Data Visualization
PowerBI
SQL
Tableau
+18 more

Job Description – Senior Technical Business Analyst

Location: Trivandrum (Preferred) | Open to any location in India

Shift Timings: an 8-hour window between 7:30 PM IST and 4:30 AM IST

 

About the Role

We are seeking highly motivated and analytically strong Senior Technical Business Analysts who can work seamlessly with business and technology stakeholders to convert a one-line problem statement into a well-defined project or opportunity. This role is ideal for professionals who have a strong foundation in data analytics, data engineering, data visualization, and data science, along with a strong drive to learn, collaborate, and grow in a dynamic, fast-paced environment.

As a Technical Business Analyst, you will be responsible for translating complex business challenges into actionable user stories, analytical models, and executable tasks in Jira. You will work across the entire data lifecycle—from understanding business context to delivering insights, solutions, and measurable outcomes.

 

Key Responsibilities

Business & Analytical Responsibilities

  • Partner with business teams to understand one-line problem statements and translate them into detailed business requirements, opportunities, and project scope.
  • Conduct exploratory data analysis (EDA) to uncover trends, patterns, and business insights.
  • Create documentation including Business Requirement Documents (BRDs), user stories, process flows, and analytical models.
  • Break down business needs into concise, actionable, and development-ready user stories in Jira.

Data & Technical Responsibilities

  • Collaborate with data engineering teams to design, review, and validate data pipelines, data models, and ETL/ELT workflows.
  • Build dashboards, reports, and data visualizations using leading BI tools to communicate insights effectively.
  • Apply foundational data science concepts such as statistical analysis, predictive modeling, and machine learning fundamentals.
  • Validate and ensure data quality, consistency, and accuracy across datasets and systems.
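
An exploratory data analysis pass like the one described above typically starts with a per-column descriptive summary. A minimal stdlib sketch (the sample values are made up):

```python
import statistics

def describe(values):
    """Basic exploratory summary of a numeric column: count, missing
    values, central tendency, spread, and range - the first pass of EDA."""
    clean = [v for v in values if v is not None]  # drop missing values
    return {
        "count": len(clean),
        "missing": len(values) - len(clean),
        "mean": round(statistics.mean(clean), 2),
        "median": statistics.median(clean),
        "stdev": round(statistics.stdev(clean), 2) if len(clean) > 1 else 0.0,
        "min": min(clean),
        "max": max(clean),
    }
```

In practice this is what `df.describe()` in pandas gives you for free; the point is that every EDA starts by quantifying completeness and distribution before any modeling.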

Collaboration & Execution

  • Work closely with product, engineering, BI, and operations teams to support the end-to-end delivery of analytical solutions.
  • Assist in development, testing, and rollout of data-driven solutions.
  • Present findings, insights, and recommendations clearly and confidently to both technical and non-technical stakeholders.

 

Required Skillsets

Core Technical Skills

  • 6+ years of Technical Business Analyst experience within an overall professional experience of 8+ years
  • Data Analytics: SQL, descriptive analytics, business problem framing.
  • Data Engineering (Foundational): Understanding of data warehousing, ETL/ELT processes, cloud data platforms (AWS/GCP/Azure preferred).
  • Data Visualization: Experience with Power BI, Tableau, or equivalent tools.
  • Data Science (Basic/Intermediate): Python/R, statistical methods, fundamentals of ML algorithms.

 

Soft Skills

  • Strong analytical thinking and structured problem-solving capability.
  • Ability to convert business problems into clear technical requirements.
  • Excellent communication, documentation, and presentation skills.
  • High curiosity, adaptability, and eagerness to learn new tools and techniques.

 

Educational Qualifications

  • BE/B.Tech or equivalent in:
  • Computer Science / IT
  • Data Science

 

What We Look For

  • Demonstrated passion for data and analytics through projects and certifications.
  • Strong commitment to continuous learning and innovation.
  • Ability to work both independently and in collaborative team environments.
  • Passion for solving business problems using data-driven approaches.
  • Proven ability (or aptitude) to convert a one-line business problem into a structured project or opportunity.

 

Why Join Us?

  • Exposure to modern data platforms, analytics tools, and AI technologies.
  • A culture that promotes innovation, ownership, and continuous learning.
  • Supportive environment to build a strong career in data and analytics.

 

Skills: Data Analytics, Business Analysis, Sql


Must-Haves

Technical Business Analyst (6+ years), SQL, Data Visualization (Power BI, Tableau), Data Engineering (ETL/ELT, cloud platforms), Python/R

Read more
Hashone Careers
Bengaluru (Bangalore), Pune, Hyderabad
5 - 10 yrs
₹12L - ₹25L / yr
DevOps
skill iconPython
cicd
skill iconKubernetes
skill iconDocker
+1 more

Job Description

Experience: 5 - 9 years

Location: Bangalore/Pune/Hyderabad

Work Mode: Hybrid(3 Days WFO)


Senior Cloud Infrastructure Engineer for Data Platform 


The ideal candidate will play a critical role in designing, implementing, and maintaining cloud infrastructure and CI/CD pipelines to support scalable, secure, and efficient data and analytics solutions. This role requires a strong understanding of cloud-native technologies, DevOps best practices, and hands-on experience with Azure and Databricks.


Key Responsibilities:


Cloud Infrastructure Design & Management

Architect, deploy, and manage scalable and secure cloud infrastructure on Microsoft Azure.

Implement best practices for Azure Resource Management, including resource groups, virtual networks, and storage accounts.

Optimize cloud costs and ensure high availability and disaster recovery for critical systems


Databricks Platform Management

Set up, configure, and maintain Databricks workspaces for data engineering, machine learning, and analytics workloads.

Automate cluster management, job scheduling, and monitoring within Databricks.

Collaborate with data teams to optimize Databricks performance and ensure seamless integration with Azure services.


CI/CD Pipeline Development

Design and implement CI/CD pipelines for deploying infrastructure, applications, and data workflows using tools like Azure DevOps, GitHub Actions, or similar.

Automate testing, deployment, and monitoring processes to ensure rapid and reliable delivery of updates.


Monitoring & Incident Management

Implement monitoring and alerting solutions using tools like Dynatrace, Azure Monitor, Log Analytics, and Databricks metrics.

Troubleshoot and resolve infrastructure and application issues, ensuring minimal downtime.


Security & Compliance

Enforce security best practices, including identity and access management (IAM), encryption, and network security.

Ensure compliance with organizational and regulatory standards for data protection and cloud operations.


Collaboration & Documentation

Work closely with cross-functional teams, including data engineers, software developers, and business stakeholders, to align infrastructure with business needs.

Maintain comprehensive documentation for infrastructure, processes, and configurations.


Required Qualifications

Education: Bachelor’s degree in Computer Science, Engineering, or a related field.


Must Have Experience:

6+ years of experience in DevOps or Cloud Engineering roles.

Proven expertise in Microsoft Azure services, including Azure Data Lake, Azure Databricks, Azure Data Factory (ADF), Azure Functions, Azure Kubernetes Service (AKS), and Azure Active Directory.

Hands-on experience with Databricks for data engineering and analytics.


Technical Skills:

Proficiency in Infrastructure as Code (IaC) tools like Terraform, ARM templates, or Bicep.

Strong scripting skills in Python, or Bash.

Experience with containerization and orchestration tools like Docker and Kubernetes.

Familiarity with version control systems (e.g., Git) and CI/CD tools (e.g., Azure DevOps, GitHub Actions).


Soft Skills:

Strong problem-solving and analytical skills.

Excellent communication and collaboration abilities.

Read more
AI Industry

AI Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Mumbai, Bengaluru (Bangalore), Hyderabad, Gurugram
5 - 12 yrs
₹20L - ₹46L / yr
skill iconData Science
Artificial Intelligence (AI)
skill iconMachine Learning (ML)
Generative AI
skill iconDeep Learning
+14 more

Review Criteria

  • Strong Senior Data Scientist (AI/ML/GenAI) Profile
  • 5+ years of experience in designing, developing, and deploying Machine Learning / Deep Learning (ML/DL) systems in production
  • Must have strong hands-on experience in Python and deep learning frameworks such as PyTorch, TensorFlow, or JAX.
  • 1+ years of experience in fine-tuning Large Language Models (LLMs) using techniques like LoRA/QLoRA, and building RAG (Retrieval-Augmented Generation) pipelines.
  • Must have experience with MLOps and production-grade systems including Docker, Kubernetes, Spark, model registries, and CI/CD workflows

 

Preferred

  • Prior experience in open-source GenAI contributions, applied LLM/GenAI research, or large-scale production AI systems
  • Preferred (Education) – B.S./M.S./Ph.D. in Computer Science, Data Science, Machine Learning, or a related field.

 

Job Specific Criteria

  • CV Attachment is mandatory
  • Which is your preferred job location (Mumbai / Bengaluru / Hyderabad / Gurgaon)?
  • Are you okay with 3 Days WFO?
  • Virtual Interview requires video to be on, are you okay with it?


Role & Responsibilities

Company is hiring a Senior Data Scientist with strong expertise in AI, machine learning engineering (MLE), and generative AI. You will play a leading role in designing, deploying, and scaling production-grade ML systems — including large language model (LLM)-based pipelines, AI copilots, and agentic workflows. This role is ideal for someone who thrives on balancing cutting-edge research with production rigor and loves mentoring while building impact-first AI applications.

 

Responsibilities:

  • Own the full ML lifecycle: model design, training, evaluation, deployment
  • Design production-ready ML pipelines with CI/CD, testing, monitoring, and drift detection
  • Fine-tune LLMs and implement retrieval-augmented generation (RAG) pipelines
  • Build agentic workflows for reasoning, planning, and decision-making
  • Develop both real-time and batch inference systems using Docker, Kubernetes, and Spark
  • Leverage state-of-the-art architectures: transformers, diffusion models, RLHF, and multimodal pipelines
  • Collaborate with product and engineering teams to integrate AI models into business applications
  • Mentor junior team members and promote MLOps, scalable architecture, and responsible AI best practices
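
The retrieval step of the RAG pipelines mentioned above can be sketched with toy word-overlap scoring. This is an illustrative stand-in only: a production system would use dense embeddings and a vector store such as Weaviate or PGVector, and the sample documents are hypothetical:

```python
def retrieve(query, documents, k=2):
    """Toy retrieval step of a RAG pipeline: rank documents by word
    overlap with the query (a real system would use embeddings)."""
    q = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query, documents):
    """Augment the generation prompt with the retrieved context, so the
    LLM answers from grounded passages rather than from memory alone."""
    context = "\n".join(f"- {d}" for d in retrieve(query, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```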


Ideal Candidate

  • 5+ years of experience in designing, deploying, and scaling ML/DL systems in production
  • Proficient in Python and deep learning frameworks such as PyTorch, TensorFlow, or JAX
  • Experience with LLM fine-tuning, LoRA/QLoRA, vector search (Weaviate/PGVector), and RAG pipelines
  • Familiarity with agent-based development (e.g., ReAct agents, function-calling, orchestration)
  • Solid understanding of MLOps: Docker, Kubernetes, Spark, model registries, and deployment workflows
  • Strong software engineering background with experience in testing, version control, and APIs
  • Proven ability to balance innovation with scalable deployment
  • B.S./M.S./Ph.D. in Computer Science, Data Science, or a related field
  • Bonus: Open-source contributions, GenAI research, or applied systems at scale


Read more
Auxo AI
kusuma Gullamajji
Posted by kusuma Gullamajji
Bengaluru (Bangalore), Hyderabad, Mumbai, Gurugram
5 - 10 yrs
₹10L - ₹40L / yr
skill iconPython
SQL
Google Cloud Platform (GCP)
Dataform

Responsibilities:

  • Build and optimize batch and streaming data pipelines using Apache Beam (Dataflow)
  • Design and maintain BigQuery datasets using best practices in partitioning, clustering, and materialized views
  • Develop and manage Airflow DAGs in Cloud Composer for workflow orchestration
  • Implement SQL-based transformations using Dataform (or dbt)
  • Leverage Pub/Sub for event-driven ingestion and Cloud Storage for raw/lake layer data architecture
  • Drive engineering best practices across CI/CD, testing, monitoring, and pipeline observability
  • Partner with solution architects and product teams to translate data requirements into technical designs
  • Mentor junior data engineers and support knowledge-sharing across the team
  • Contribute to documentation, code reviews, sprint planning, and agile ceremonies
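
At its core, the workflow orchestration these responsibilities describe is running tasks in dependency order. A minimal stdlib stand-in for what an Airflow DAG in Cloud Composer expresses (the task names are hypothetical; real DAGs use Airflow operators and a scheduler):

```python
from graphlib import TopologicalSorter

def run_dag(tasks, dependencies):
    """Execute callables in dependency order - the core idea behind an
    Airflow DAG. `dependencies` maps task name -> set of upstream tasks."""
    order = list(TopologicalSorter(dependencies).static_order())
    results = {}
    for name in order:
        results[name] = tasks[name]()
    return order, results
```

Airflow adds scheduling, retries, and observability on top, but a DAG run is exactly this: a topological order over task dependencies.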



Requirements


  • 5+ years of hands-on experience in data engineering, with at least 2 years on GCP
  • Proven expertise in BigQuery, Dataflow (Apache Beam), and Cloud Composer (Airflow)
  • Strong programming skills in Python and/or Java
  • Experience with SQL optimization, data modeling, and pipeline orchestration
  • Familiarity with Git, CI/CD pipelines, and data quality monitoring frameworks
  • Exposure to Dataform, dbt, or similar tools for ELT workflows
  • Solid understanding of data architecture, schema design, and performance tuning
  • Excellent problem-solving and collaboration skills

Bonus Skills:

  • GCP Professional Data Engineer certification
  • Experience with Vertex AI, Cloud Functions, Dataproc, or real-time streaming architectures
  • Familiarity with data governance tools (e.g., Atlan, Collibra, Dataplex)
  • Exposure to Docker/Kubernetes, API integration, and infrastructure-as-code (Terraform)


Read more
Versatile Commerce LLP

at Versatile Commerce LLP

2 candid answers
Burugupally Shailaja
Posted by Burugupally Shailaja
Hyderabad
3 - 6 yrs
₹4L - ₹6L / yr
Selenium
skill iconJava
skill iconPython
skill iconJenkins
TestNG
+6 more

We’re Hiring – Automation Test Engineer!

We at Versatile Commerce are looking for passionate Automation Testing Professionals to join our growing team!

📍 Location: Gachibowli, Hyderabad (Work from Office)

💼 Experience: 3 – 5 Years

Notice Period: Immediate Joiners Preferred

What we’re looking for:

✅ Strong experience in Selenium / Cypress / Playwright

✅ Proficient in Java / Python / JavaScript

✅ Hands-on with TestNG / JUnit / Maven / Jenkins

✅ Experience in API Automation (Postman / REST Assured)

✅ Good understanding of Agile Testing & Defect Management Tools (JIRA, Zephyr)

Read more
CADFEM India
Agency job
via hirezyai by Aardra Suresh
Hyderabad
4 - 8 yrs
₹12L - ₹15L / yr
skill iconPython
skill iconReact.js
TypeScript
skill iconPostgreSQL
skill iconAngular (2+)
+2 more

Role Summary

We are seeking a Full-Stack Developer to build and secure features for our Therapy Planning Software (TPS), which integrates with RMS/RIS, EMR systems, devices (DICOM, Bluetooth, VR, robotics, FES), and supports ICD–ICF–ICHI coding. The role involves ~40% frontend and 60% backend development, with end-to-end responsibility for security across application layers.

Responsibilities

Frontend (40%)

  1. Build responsive, accessible UI in React + TypeScript (or Angular/Vue).
  2. Implement multilingual (i18n/l10n) and WCAG 2.1 accessibility standards.
  3. Develop offline-capable PWAs for home programs.
  4. Integrate REST/FHIR APIs for patient workflows, scheduling, and reporting.
  5. Support features like voice-to-text, video capture, and compression.

Backend (60%)

  1. Design and scale REST APIs using Python (FastAPI/Django).
  2. Build modules for EMR storage, assessments, therapy plans, and data logging.
  3. Implement HL7/FHIR endpoints and secure integrations with external EMRs.
  4. Handle file uploads (virus scanning, HD video compression, secure storage).
  5. Optimize PostgreSQL schemas and queries for performance.
  6. Implement RBAC, MFA, PDPA compliance, edit locks, and audit trails.

Security Layer (Ownership)

  1. Identity & Access: OAuth2/OIDC, JWT, MFA, SSO.
  2. Data Protection: TLS, AES-256 at rest, field-level encryption, immutable audit logs.
  3. Compliance: PDPA, HIPAA principles, MDA requirements.
  4. DevSecOps: Secure coding (OWASP ASVS), dependency scanning, secrets management.
  5. Monitoring: Logging/metrics (ELK/Prometheus), anomaly detection, DR/BCP preparedness.
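
As a rough sketch of the token-based identity flow above, the snippet below signs and verifies a JWT-like token with stdlib HMAC. A real deployment would use a vetted OAuth2/OIDC library and key management; the secret, claims, and roles here are hypothetical:

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"demo-secret"  # hypothetical; load from a secrets manager in practice

def _b64(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def issue_token(user: str, role: str, ttl: int = 3600) -> str:
    """Sign a JWT-like token: base64url payload + HMAC-SHA256 signature."""
    payload = _b64(json.dumps(
        {"sub": user, "role": role, "exp": int(time.time()) + ttl}
    ).encode())
    sig = _b64(hmac.new(SECRET, payload.encode(), hashlib.sha256).digest())
    return f"{payload}.{sig}"

def verify_token(token: str) -> dict:
    """Reject tampered or expired tokens; return the claims otherwise."""
    payload, sig = token.rsplit(".", 1)
    expected = _b64(hmac.new(SECRET, payload.encode(), hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    claims = json.loads(
        base64.urlsafe_b64decode(payload + "=" * (-len(payload) % 4))
    )
    if claims["exp"] < time.time():
        raise ValueError("expired")
    return claims
```

The `role` claim is what an RBAC check would consume downstream; `compare_digest` avoids timing side channels when comparing signatures.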

Requirements

  • Strong skills in Python (FastAPI/Django) and React + TypeScript.
  • Experience with HL7/FHIR, EMR data, and REST APIs.
  • Knowledge of OAuth2/JWT authentication, RBAC, audit logging.
  • Proficiency with PostgreSQL and database optimization.
  • Cloud deployment (AWS/Azure) and containerization (Docker/K8s) a plus.

Added Advantage

  • Familiarity with ICD, ICF, ICHI coding systems or medical diagnosis workflows.

Success Metrics

  • Deliver secure end-to-end features with clinical workflow integration.
  • Pass OWASP/ASVS L2 security baseline.
  • Establish full audit trail and role-based access across at least one clinical workflow.


Read more
Loyalty Juggernaut Inc

at Loyalty Juggernaut Inc

2 recruiters
Shraddha Dhavle
Posted by Shraddha Dhavle
Hyderabad
3 - 5 yrs
₹5L - ₹15L / yr
ETL
ETL architecture
skill iconPython
Data engineering

At Loyalty Juggernaut, we’re on a mission to revolutionize customer loyalty through AI-driven SaaS solutions. We are THE JUGGERNAUTS, driving innovation and impact in the loyalty ecosystem with GRAVTY®, our SaaS Product that empowers multinational enterprises to build deeper customer connections. Designed for scalability and personalization, GRAVTY® delivers cutting-edge loyalty solutions that transform customer engagement across diverse industries including Airlines, Airport, Retail, Hospitality, Banking, F&B, Telecom, Insurance and Ecosystem.


Our Impact:

  • 400+ million members connected through our platform.
  • Trusted by 100+ global brands/partners, driving loyalty and brand devotion worldwide.


Proud to be a Three-Time Champion for Best Technology Innovation in Loyalty!!


Explore more about us at www.lji.io.


What you will OWN:

  • Build the infrastructure required for optimal extraction, transformation, and loading of data from various sources using SQL and AWS ‘big data’ technologies.
  • Create and maintain optimal data pipeline architecture.
  • Identify, design, and implement internal process improvements, automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Work with stakeholders, including the Technical Architects, Developers, Product Owners, and Executives, to assist with data-related technical issues and support their data infrastructure needs.
  • Create tools for data management and data analytics that can assist them in building and optimizing our product to become an innovative industry leader.


You would make a GREAT FIT if you have:

  • Have 2 to 5 years of relevant backend development experience, with solid expertise in Python.
  • Possess strong skills in Data Structures and Algorithms, and can write optimized, maintainable code.
  • Are familiar with database systems, and can comfortably work with PostgreSQL, as well as NoSQL solutions like MongoDB or DynamoDB.
  • Hands-on experience using cloud data warehouses like AWS Redshift, GBQ, etc.
  • Experience with AWS cloud services: EC2, EMR, RDS, Redshift, and AWS Batch would be an added advantage.
  • Have a solid understanding of ETL processes and tools and can build or modify ETL pipelines effectively.
  • Have experience managing or building data pipelines and architectures at scale.
  • Understand the nuances of data ingestion, transformation, storage, and analytics workflows.
  • Communicate clearly and work collaboratively across engineering and product teams.
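
The ETL work described above reduces to a repeatable extract-transform-load pass. The sketch below uses stdlib sqlite3 as a stand-in warehouse; the table and field names are hypothetical:

```python
import sqlite3

def run_etl(raw_rows, conn):
    """Tiny ETL pass: extract raw member records, transform them
    (normalize email, drop incomplete rows), and load into a target table."""
    # Transform: skip rows missing an email, trim and lowercase the rest.
    cleaned = [
        (r["member_id"], r["email"].strip().lower())
        for r in raw_rows
        if r.get("email")
    ]
    # Load: idempotent upsert so reruns of the pipeline are safe.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS members "
        "(member_id INTEGER PRIMARY KEY, email TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO members VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)
```

At platform scale the same shape runs on Redshift or another warehouse with AWS Batch/EMR doing the heavy lifting, but idempotent loads and explicit transform rules are the part that makes pipelines maintainable.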


Why Choose US?

  • This opportunity offers a dynamic and supportive work environment where you'll have the chance to not just collaborate with talented technocrats but also work with globally recognized brands, gain exposure, and carve your own career path.
  • You will get to innovate and dabble in the future of technology -Enterprise Cloud Computing, Blockchain, Machine Learning, AI, Mobile, Digital Wallets, and much more.


Read more
Versatile Commerce LLP

at Versatile Commerce LLP

2 candid answers
Burugupally Shailaja
Posted by Burugupally Shailaja
Hyderabad
3 - 9 yrs
₹3L - ₹8L / yr
Retrieval Augmented Generation (RAG)
skill iconMachine Learning (ML)
Generative AI
Open-source LLMs
skill iconPython
+2 more

📍Company: Versatile Commerce

 📍 Position: Data Scientists

 📍 Experience: 3-9 yrs

 📍 Location: Hyderabad (WFO)

 📅 Notice Period: 0- 15 Days

Read more
Pipaltree AI

at Pipaltree AI

2 candid answers
Mudit Tanwani
Posted by Mudit Tanwani
Remote, Hyderabad
3 - 7 yrs
₹24L - ₹60L / yr
Artificial Intelligence (AI)
skill iconPython
LLMs

At Pipaltree, we’re building an AI-enabled platform that helps brands understand how they’re truly perceived — not through surveys or static dashboards, but through real conversations happening across the world.

We’re a small team solving deep technical and product challenges: orchestrating large-scale conversation data, applying reasoning and summarization models, and turning this into insights that businesses can trust.


Requirements:

  • Deep understanding of distributed systems and asynchronous programming in Python
  • Experience with building scalable applications using LLMs or traditional ML techniques
  • Experience with Databases, Cache, and Micro services
  • Experience with DevOps is a huge plus
Read more
Talent Pro
Mumbai, Bengaluru (Bangalore), Hyderabad, Gurugram
5 - 8 yrs
₹25L - ₹35L / yr
skill iconPython
skill iconReact.js

Strong Full stack developer Profile

Mandatory (Experience 1) - Must Have Minimum 5+ YOE in Software Development,

Mandatory (Experience 2) - Must have 4+ YOE in backend using Python.

Mandatory (Experience 3) - Must have good experience in frontend using React JS with knowledge of HTML, CSS, and JavaScript.

Mandatory (Experience 4) - Must have experience in any database - MySQL / PostgreSQL / Oracle / SQL Server

Read more
One of the reputed Client in India

One of the reputed Client in India

Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Hyderabad, Pune
6 - 8 yrs
₹12L - ₹13L / yr
skill iconAmazon Web Services (AWS)
skill iconPython
PySpark

Our client is looking to hire a Databricks Admin immediately.


This is PAN-INDIA Bulk hiring


Minimum of 6-8+ years with Databricks, PySpark/Python, and AWS.

AWS experience is a must.


Notice 15-30 days is preferred.


Share profiles at hr at etpspl dot com

Please refer/share our email to your friends/colleagues who are looking for job.

Read more
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Chennai
7 - 10 yrs
₹10L - ₹18L / yr
full stack
skill iconReact.js
skill iconPython
skill iconGo Programming (Golang)
CI/CD
+9 more

Full-Stack Developer

Exp: 5+ years required

Night shift: 8 PM–5 AM / 9 PM–6 AM

Only immediate joiners can apply.


We are seeking a mid-to-senior level Full-Stack Developer with a foundational understanding of software development, cloud services, and database management. In this role, you will contribute to both the front-end and back-end of our application, focusing on creating a seamless user experience, supported by robust and scalable cloud infrastructure.

Key Responsibilities

● Develop and maintain user-facing features using React.js and TypeScript.

● Write clean, efficient, and well-documented JavaScript/TypeScript code.

● Assist in managing and provisioning cloud infrastructure on AWS using Infrastructure as Code (IaC) principles.

● Contribute to the design, implementation, and maintenance of our databases.

● Collaborate with senior developers and product managers to deliver high-quality software.

● Troubleshoot and debug issues across the full stack.

● Participate in code reviews to maintain code quality and share knowledge.

Qualifications

● Bachelor's degree in Computer Science, a related technical field, or equivalent practical experience.

● 5+ years of professional experience in web development.

● Proficiency in JavaScript and/or TypeScript.

● Proficiency in Golang and Python.

● Hands-on experience with the React.js library for building user interfaces.

● Familiarity with Infrastructure as Code (IaC) tools and concepts (e.g., AWS CDK, Terraform, or CloudFormation).

● Basic understanding of AWS and its core services (e.g., S3, EC2, Lambda, DynamoDB).

● Experience with database management, including relational (e.g., PostgreSQL) or NoSQL (e.g., DynamoDB, MongoDB) databases.

● Strong problem-solving skills and a willingness to learn.

● Familiarity with modern front-end build pipelines and tools like Vite and Tailwind CSS.

● Knowledge of CI/CD pipelines and automated testing.


Read more
Hunarstreet technologies pvt ltd

Hunarstreet technologies pvt ltd

Agency job
Chennai, Hyderabad, Bengaluru (Bangalore), Mumbai, Pune, Gurugram, Mohali, Panchkula
5 - 15 yrs
₹10L - ₹15L / yr
Fullstack Developer
Web Development
skill iconJavascript
TypeScript
skill iconGo Programming (Golang)
+5 more

We are seeking a mid-to-senior level Full-Stack Developer with a solid understanding of software development, cloud services, and database management. In this role, you will contribute to both the front-end and back-end of our application, focusing on creating a seamless user experience supported by robust and scalable cloud infrastructure.


Key Responsibilities

● Develop and maintain user-facing features using React.js and TypeScript.

● Write clean, efficient, and well-documented JavaScript/TypeScript code.

● Assist in managing and provisioning cloud infrastructure on AWS using Infrastructure as Code (IaC) principles.

● Contribute to the design, implementation, and maintenance of our databases.

● Collaborate with senior developers and product managers to deliver high-quality software.

● Troubleshoot and debug issues across the full stack.

● Participate in code reviews to maintain code quality and share knowledge.


Qualifications

● Bachelor's degree in Computer Science, a related technical field, or equivalent practical experience.

● 5+ years of professional experience in web development.

● Proficiency in JavaScript and/or TypeScript.

● Proficiency in Golang and Python.

● Hands-on experience with the React.js library for building user interfaces.

● Familiarity with Infrastructure as Code (IaC) tools and concepts (e.g., AWS CDK, Terraform, or CloudFormation).

● Basic understanding of AWS and its core services (e.g., S3, EC2, Lambda, DynamoDB).

● Experience with database management, including relational (e.g., PostgreSQL) or NoSQL (e.g., DynamoDB, MongoDB) databases.

● Strong problem-solving skills and a willingness to learn.

● Familiarity with modern front-end build pipelines and tools like Vite and Tailwind CSS.

● Knowledge of CI/CD pipelines and automated testing.

Read more
Estuate Software

at Estuate Software

1 candid answer
Deekshith K Naidu
Posted by Deekshith K Naidu
Hyderabad
5 - 12 yrs
₹5L - ₹35L / yr
Google Cloud Platform (GCP)
Apache Airflow
ETL
skill iconPython
Big query
+1 more

Job Title: Data Engineer / Integration Engineer

 

Job Summary:

We are seeking a highly skilled Data Engineer / Integration Engineer to join our team. The ideal candidate will have expertise in Python, workflow orchestration, cloud platforms (GCP/Google BigQuery), big data frameworks (Apache Spark or similar), API integration, and Oracle EBS. The role involves designing, developing, and maintaining scalable data pipelines, integrating various systems, and ensuring data quality and consistency across platforms. Knowledge of Ascend.io is a plus.

Key Responsibilities:

  • Design, build, and maintain scalable data pipelines and workflows.
  • Develop and optimize ETL/ELT processes using Python and workflow automation tools.
  • Implement and manage data integration between various systems, including APIs and Oracle EBS.
  • Work with Google Cloud Platform (GCP) or Google BigQuery (GBQ) for data storage, processing, and analytics.
  • Utilize Apache Spark or similar big data frameworks for efficient data processing.
  • Develop robust API integrations for seamless data exchange between applications.
  • Ensure data accuracy, consistency, and security across all systems.
  • Monitor and troubleshoot data pipelines, identifying and resolving performance issues.
  • Collaborate with data analysts, engineers, and business teams to align data solutions with business goals.
  • Document data workflows, processes, and best practices for future reference.
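The extract-transform-load flow described in the responsibilities above can be sketched in a few lines of plain Python. The sources and sink here are in-memory stand-ins; in this role they would be APIs, Oracle EBS extracts, and BigQuery tables:

```python
# Minimal ETL sketch: extract -> transform (with a data-quality rule) -> load.
# All names and the sample data are illustrative stand-ins.
def extract(rows):
    """Pretend source: yield raw records (here, a list of dicts)."""
    yield from rows

def transform(records):
    """Normalize field types and drop records that fail a basic quality rule."""
    for rec in records:
        if rec.get("amount") is None:
            continue  # data-quality rule: amount is required
        yield {"id": rec["id"], "amount": round(float(rec["amount"]), 2)}

def load(records, sink):
    """Pretend sink: append to an in-memory list and return the row count."""
    count = 0
    for rec in records:
        sink.append(rec)
        count += 1
    return count

raw = [{"id": 1, "amount": "10.25"}, {"id": 2, "amount": None}]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
```

Because each stage is a generator, the same shape scales from a list to a streaming source without buffering the whole dataset.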

Required Skills & Qualifications:

  • Strong proficiency in Python for data engineering and workflow automation.
  • Experience with workflow orchestration tools (e.g., Apache Airflow, Prefect, or similar).
  • Hands-on experience with Google Cloud Platform (GCP) or Google BigQuery (GBQ).
  • Expertise in big data processing frameworks, such as Apache Spark.
  • Experience with API integrations (REST, SOAP, GraphQL) and handling structured/unstructured data.
  • Strong problem-solving skills and ability to optimize data pipelines for performance.
  • Experience working in an agile environment with CI/CD processes.
  • Strong communication and collaboration skills.

Preferred Skills & Nice-to-Have:

  • Experience with Ascend.io platform for data pipeline automation.
  • Knowledge of SQL and NoSQL databases.
  • Familiarity with Docker and Kubernetes for containerized workloads.
  • Exposure to machine learning workflows is a plus.

Why Join Us?

  • Opportunity to work on cutting-edge data engineering projects.
  • Collaborative and dynamic work environment.
  • Competitive compensation and benefits.
  • Professional growth opportunities with exposure to the latest technologies.

How to Apply:

Interested candidates can apply by sending their resume to [your email/contact].

 

Read more
VyTCDC
Gobinath Sundaram
Posted by Gobinath Sundaram
Bengaluru (Bangalore), Pune, Hyderabad
6 - 12 yrs
₹5L - ₹28L / yr
skill iconData Science
skill iconPython
Large Language Models (LLM)

Job Description:

 

Role: Data Scientist

 

Responsibilities:

• Lead data science and machine learning projects, contributing to model development, optimization, and evaluation.

• Perform data cleaning, feature engineering, and exploratory data analysis.

• Translate business requirements into technical solutions; document and communicate project progress; manage non-technical stakeholders.

• Collaborate with other data scientists and engineers to deliver projects.

 

Technical Skills – Must have:

• Experience in and understanding of the natural language processing (NLP) and large language model (LLM) landscape.

• Proficiency with Python for data analysis and for supervised and unsupervised ML tasks.

• Ability to translate complex machine learning problem statements into specific deliverables and requirements.

• Experience with major cloud platforms such as AWS, Azure, or GCP.

• Working knowledge of SQL and NoSQL databases.

• Ability to create data and ML pipelines for more efficient and repeatable data science projects using MLOps principles.

• Keeps abreast of new tools, algorithms, and techniques in machine learning and works to implement them in the organization.

• Strong understanding of evaluation and monitoring metrics for machine learning projects.
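To make the evaluation-metrics point concrete, here is a dependency-free sketch of precision, recall, and F1 for binary classification. Real projects would typically reach for scikit-learn; this only spells out the definitions:

```python
# Classification evaluation metrics from first principles (illustrative).
def precision_recall_f1(y_true, y_pred, positive=1):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1
```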

Read more
Pune, Bengaluru (Bangalore), Hyderabad
8 - 12 yrs
₹14L - ₹15L / yr
skill iconR Programming
skill iconPython
Scikit-Learn
TensorFlow
PyTorch
+8 more

Role: Data Scientist (Python + R Expertise)

Exp: 8-12 Years

CTC: up to 30 LPA


Required Skills & Qualifications:

  • 8–12 years of hands-on experience as a Data Scientist or in a similar analytical role.
  • Strong expertise in Python and R for data analysis, modeling, and visualization.
  • Proficiency in machine learning frameworks (scikit-learn, TensorFlow, PyTorch, caret, etc.).
  • Strong understanding of statistical modeling, hypothesis testing, regression, and classification techniques.
  • Experience with SQL and working with large-scale structured and unstructured data.
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and deployment practices (Docker, MLflow).
  • Excellent analytical, problem-solving, and communication skills.


Preferred Skills:

  • Experience with NLP, time series forecasting, or deep learning projects.
  • Exposure to data visualization tools (Tableau, Power BI, or R Shiny).
  • Experience working in product or data-driven organizations.
  • Knowledge of MLOps and model lifecycle management is a plus.


If interested, kindly share your updated resume on 82008 31681


Read more
FloData
Mahesh J
Posted by Mahesh J
Hyderabad
3 - 5 yrs
₹20L - ₹40L / yr
Generative AI
Retrieval Augmented Generation (RAG)
Prompt engineering
AI Agents
Langgraph
+5 more

Join us to reimagine how businesses integrate data and automate processes – with AI at the core.


About FloData

FloData is re-imagining the iPaaS and Business Process Automation (BPA) space for a new era - one where business teams, not just IT, can integrate data, run automations, and solve ops bottlenecks using intuitive, AI-driven interfaces. We're a small, hands-on team with a deep technical foundation and strong industry connections. Backed by real-world learnings from our earlier platform version, we're now going all-in on building a generative AI-first experience.


The Opportunity

We’re looking for a GenAI Engineer to help build the intelligence layer of our new platform. From designing LLM-powered orchestration flows with LangGraph to building frameworks for evaluation and monitoring with LangSmith, you’ll shape how AI powers real-world enterprise workflows.


If you thrive on working at the frontier of LLM systems engineering, enjoy scaling prototypes into production-grade systems, and want to make AI reliable, explainable, and enterprise-ready - this is your chance to define a category-defining product.


What You'll Do

  • Spend ~70% of your time architecting, prototyping, and productionizing AI systems (LLM orchestration, agents, evaluation, observability)
  • Develop AI frameworks: orchestration (LangGraph), evaluation/monitoring (LangSmith), vector/graph DBs, and other GenAI infra
  • Work with product engineers to seamlessly integrate AI services into frontend and backend workflows
  • Build systems for AI evaluation, monitoring, and reliability to ensure trustworthy performance at scale
  • Translate product needs into AI-first solutions, balancing rapid prototyping with enterprise-grade robustness
  • Stay ahead of the curve by exploring emerging GenAI frameworks, tools, and research for practical application


Must Have

  • 3–5 years of engineering experience, with at least 1-2 years in GenAI systems
  • Hands-on experience with LangGraph, LangSmith, LangChain, or similar frameworks for orchestration/evaluation
  • Deep understanding of LLM workflows: prompt engineering, fine-tuning, RAG, evaluation, monitoring, and observability
  • A strong product mindset—comfortable bridging research-level concepts with production-ready business use cases
  • Startup mindset: resourceful, pragmatic, and outcome-driven


Good To Have

  • Experience integrating AI pipelines with enterprise applications and hybrid infra setups (AWS, on-prem, VPCs)
  • Experience building AI-native user experiences (assistants, copilots, intelligent automation flows)
  • Familiarity with enterprise SaaS/IT ecosystems (Salesforce, Oracle ERP, Netsuite, etc.)


Why Join Us

  • Own the AI backbone of a generational product at the intersection of AI, automation, and enterprise data
  • Work closely with founders and leadership with no layers of bureaucracy
  • End-to-end ownership of AI systems you design and ship
  • Be a thought partner in setting AI-first principles for both tech and culture
  • Onsite in Hyderabad, with flexibility when needed


Sounds like you?

We'd love to talk. Apply now or reach out directly to explore this opportunity.

Read more
US Base Company

US Base Company

Agency job
Hyderabad, Gurugram
10 - 18 yrs
₹20L - ₹35L / yr
skill iconPython
skill iconDjango
skill iconReact.js
Angular
skill iconJavascript
+3 more

Key Responsibilities

  • Design, develop, and maintain scalable microservices and RESTful APIs using Python (Flask, FastAPI, or Django).
  • Architect data models for SQL and NoSQL databases (PostgreSQL, ClickHouse, MongoDB, DynamoDB) to optimize performance and reliability.
  • Implement efficient and secure data access layers, caching, and indexing strategies.
  • Collaborate closely with product and frontend teams to deliver seamless user experiences.
  • Build responsive UI components using HTML, CSS, JavaScript, and frameworks like React or Angular.
  • Ensure system reliability, observability, and fault tolerance across services.
  • Lead code reviews, mentor junior engineers, and promote engineering best practices.
  • Contribute to DevOps and CI/CD workflows for smooth deployments and testing automation.

Required Skills & Experience

  • 10+ years of professional software development experience.
  • Strong proficiency in Python, with deep understanding of OOP, asynchronous programming, and performance optimization.
  • Proven expertise in building FastAPI-based microservices architectures.
  • Solid understanding of SQL and NoSQL data modeling, query optimization, and schema design.
  • Excellent hands-on frontend proficiency with HTML, CSS, JavaScript, and a modern framework (React, Angular, or Vue).
  • Experience working with cloud platforms (AWS, GCP, or Azure) and containerized deployments (Docker, Kubernetes).
  • Familiarity with distributed systems, event-driven architectures, and messaging queues (Kafka, RabbitMQ).
  • Excellent problem-solving, communication, and system design skills.
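The asynchronous-programming requirement above boils down to fanning out I/O-bound calls concurrently instead of serially. A minimal `asyncio` sketch, with made-up service names and simulated latencies:

```python
import asyncio

# Fan out several simulated I/O calls concurrently and gather the results.
# Service names and delays are illustrative.
async def fetch(name, delay):
    await asyncio.sleep(delay)  # stands in for an HTTP or DB round trip
    return f"{name}: ok"

async def health_check():
    tasks = [fetch("users-svc", 0.01), fetch("orders-svc", 0.02), fetch("search-svc", 0.01)]
    # gather preserves task order regardless of completion order
    return await asyncio.gather(*tasks)

results = asyncio.run(health_check())
```

Total wall time is roughly the slowest call, not the sum, which is the core win of async data access layers.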


Read more
Technoidentity
Hyderabad
6 - 12 yrs
₹20L - ₹35L / yr
skill iconPython
FastAPI
PySpark

Supercharge Your Career as a Technical Lead - Python at Technoidentity!

Are you ready to solve challenges that fuel business growth? At Technoidentity, we're a Data + AI product engineering company that has been building cutting-edge solutions in the FinTech domain for over 13 years, and we're expanding globally. It's the perfect time to join our team of tech innovators and leave your mark!

Trusted to deliver scalable, modern enterprise solutions, we're looking for a Senior Python Developer and Technical Lead who will guide high-performing engineering teams, design complex systems, and deliver clean, scalable backend solutions using Python and modern data technologies. Your leadership will directly shape the architecture and execution of enterprise projects, with added strength in database logic including PL/SQL and PostgreSQL/AlloyDB.

What’s in it for You?

• Modern Python Stack – Python 3.x, FastAPI, Pandas, NumPy, SQLAlchemy, PostgreSQL/AlloyDB, PL/pgSQL.

• Tech Leadership – Drive technical decision-making, mentor developers, and ensure code quality and scalability.

• Scalable Projects – Architect and optimize data-intensive backend services for high-throughput and distributed systems.

• Engineering Best Practices – Enforce clean architecture, code reviews, testing strategies, and SDLC alignment.

• Cross-Functional Collaboration – Lead conversations across engineering, QA, product, and DevOps to ensure delivery excellence.

What Will You Be Doing?

Technical Leadership

• Lead a team of developers through design, code reviews, and technical mentorship.

• Set architectural direction and ensure scalability, modularity, and code quality.

• Work with stakeholders to translate business goals into robust technical solutions.

Backend Development & Data Engineering

• Design and build clean, high-performance backend services using FastAPI and Python best practices.

• Handle row- and column-level data transformation using Pandas and NumPy.

• Apply data wrangling, cleansing, and preprocessing techniques across microservices and pipelines.

Database & Performance Optimization

• Write performant queries, procedures, and triggers using PostgreSQL and PL/pgSQL.

• Understand legacy logic in PL/SQL and participate in rewriting or modernizing it for PostgreSQL-based systems.

• Tune both backend and database performance, including memory, indexing, and query optimization.

Parallelism & Communication

• Implement multithreading, multiprocessing, and parallel data flows in Python.

• Integrate Kafka, RabbitMQ, or Pub/Sub systems for real-time and async message processing.
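The parallel-data-flow point above can be sketched with `concurrent.futures`: apply a row-level transformation across a batch using a worker pool. The data and transformation are illustrative; for CPU-bound work, `ProcessPoolExecutor` (multiprocessing) is the usual swap-in:

```python
from concurrent.futures import ThreadPoolExecutor

# Row-level transformation applied in parallel across a batch (illustrative).
def double(row):
    return {"id": row["id"], "value": row["value"] * 2}

rows = [{"id": i, "value": i * 10} for i in range(5)]

# pool.map preserves input order, so results line up with rows.
with ThreadPoolExecutor(max_workers=4) as pool:
    doubled = list(pool.map(double, rows))
```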

Engineering Excellence

• Drive adherence to Agile, Git-based workflows, CI/CD, and DevOps pipelines.

• Promote testing (unit/integration), monitoring, and observability for all backend systems.

• Stay current with Python ecosystem evolution and introduce tools that improve productivity and performance.

What Makes You the Perfect Fit?

• 6–10 years of proven experience in Python development, with strong expertise in designing and delivering scalable backend solutions

Read more
Deqode

at Deqode

1 recruiter
Apoorva Jain
Posted by Apoorva Jain
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Nagpur, Ahmedabad, Jaipur, Kochi (Cochin)
3.6 - 8 yrs
₹4L - ₹18L / yr
skill iconPython
skill iconDjango
skill iconFlask
skill iconAmazon Web Services (AWS)
AWS Lambda
+3 more

Job Summary:

Deqode is looking for a highly motivated and experienced Python + AWS Developer to join our growing technology team. This role demands hands-on experience in backend development, cloud infrastructure (AWS), containerization, automation, and client communication. The ideal candidate should be a self-starter with a strong technical foundation and a passion for delivering high-quality, scalable solutions in a client-facing environment.


Key Responsibilities:

  • Design, develop, and deploy backend services and APIs using Python.
  • Build and maintain scalable infrastructure on AWS (EC2, S3, Lambda, RDS, etc.).
  • Automate deployments and infrastructure with Terraform and Jenkins/GitHub Actions.
  • Implement containerized environments using Docker and manage orchestration via Kubernetes.
  • Write automation and scripting solutions in Bash/Shell to streamline operations.
  • Work with relational databases like MySQL and SQL, including query optimization.
  • Collaborate directly with clients to understand requirements and provide technical solutions.
  • Ensure system reliability, performance, and scalability across environments.
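In the spirit of the automation and reliability responsibilities above, a common building block is retrying a flaky operation (e.g., a throttled AWS API call) with exponential backoff. A small sketch; the attempt count and delays are illustrative defaults, not project policy:

```python
import time

# Retry decorator with exponential backoff (illustrative defaults).
def retry(times=3, base_delay=0.01):
    def decorator(fn):
        def wrapper(*args, **kwargs):
            for attempt in range(times):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == times - 1:
                        raise  # out of attempts: surface the error
                    time.sleep(base_delay * (2 ** attempt))
        return wrapper
    return decorator

calls = {"n": 0}

@retry(times=3)
def flaky():
    """Simulated operation that fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"
```

Production versions usually also add jitter and retry only on known-transient error types.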


Required Skills:

  • 3.5+ years of hands-on experience in Python development.
  • Strong expertise in AWS services such as EC2, Lambda, S3, RDS, IAM, CloudWatch.
  • Good understanding of Terraform or other Infrastructure as Code tools.
  • Proficient with Docker and container orchestration using Kubernetes.
  • Experience with CI/CD tools like Jenkins or GitHub Actions.
  • Strong command of SQL/MySQL and scripting with Bash/Shell.
  • Experience working with external clients or in client-facing roles.


Preferred Qualifications:

  • AWS Certification (e.g., AWS Certified Developer or DevOps Engineer).
  • Familiarity with Agile/Scrum methodologies.
  • Strong analytical and problem-solving skills.
  • Excellent communication and stakeholder management abilities.


Read more
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Mohali, Dehradun, Panchkula, Chennai
6 - 14 yrs
₹12L - ₹28L / yr
Test Automation (QA)
skill iconKubernetes
helm
skill iconDocker
skill iconAmazon Web Services (AWS)
+13 more

Job Title : Senior QA Automation Architect (Cloud & Kubernetes)

Experience : 6+ Years

Location : India (Multiple Offices)

Shift Timings : 12 PM to 9 PM (Noon Shift)

Working Days : 5 Days WFO (NO Hybrid)


About the Role :

We’re looking for a Senior QA Automation Architect with deep expertise in cloud-native systems, Kubernetes, and automation frameworks.

You’ll design scalable test architectures, enhance automation coverage, and ensure product reliability across hybrid-cloud and distributed environments.


Key Responsibilities :

  • Architect and maintain test automation frameworks for microservices.
  • Integrate automated tests into CI/CD pipelines (Jenkins, GitHub Actions).
  • Ensure reliability, scalability, and observability of test systems.
  • Work closely with DevOps and Cloud teams to streamline automation infrastructure.
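A concrete flavor of the automated checks described above: pytest-style tests that reduce a service health payload to a pass/fail verdict in CI/CD. The payload shape is a stand-in; a real suite would call the service over HTTP and also assert on latency budgets:

```python
# Function under test: collapse a health payload to a single verdict.
def parse_health(payload):
    checks = payload.get("checks", {})
    return bool(checks) and all(status == "up" for status in checks.values())

# pytest-style checks (runnable directly as plain functions too).
def test_all_dependencies_up():
    assert parse_health({"checks": {"db": "up", "cache": "up"}}) is True

def test_degraded_dependency_fails():
    assert parse_health({"checks": {"db": "up", "cache": "down"}}) is False
```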

Mandatory Skills :

  • Kubernetes, Helm, Docker, Linux
  • Cloud Platforms : AWS / Azure / GCP
  • CI/CD Tools : Jenkins, GitHub Actions
  • Scripting : Python, Pytest, Bash
  • Monitoring & Performance : Prometheus, Grafana, Jaeger, K6
  • IaC Practices : Terraform / Ansible

Good to Have :

  • Experience with Service Mesh (Istio/Linkerd).
  • Container Security or DevSecOps exposure.
Read more
Global Digital Transformation Solutions Provider

Global Digital Transformation Solutions Provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Hyderabad, Noida, Mumbai, Navi Mumbai, Ahmedabad, Chennai, Coimbatore, Gurugram, Kochi (Cochin), Kolkata, Calcutta, Pune, Thiruvananthapuram, Trivandrum
7 - 15 yrs
₹15L - ₹30L / yr
skill iconAmazon Web Services (AWS)
skill iconPython
Data Lake

SENIOR DATA ENGINEER:

ROLE SUMMARY:

Own the design and delivery of petabyte-scale data platforms and pipelines across AWS and modern Lakehouse stacks. You’ll architect, code, test, optimize, and operate ingestion, transformation, storage, and serving layers. This role requires autonomy, strong engineering judgment, and partnership with project managers, infrastructure teams, testers, and customer architects to land secure, cost-efficient, and high-performing solutions.



RESPONSIBILITIES:

  • Architecture and design: Create HLD/LLD/SAD, source–target mappings, data contracts, and optimal designs aligned to requirements.
  • Pipeline development: Build and test robust ETL/ELT for batch, micro-batch, and streaming across RDBMS, flat files, APIs, and event sources.
  • Performance and cost tuning: Profile and optimize jobs, right-size infrastructure, and model license/compute/storage costs.
  • Data modeling and storage: Design schemas and SCD strategies; manage relational, NoSQL, data lakes, Delta Lakes, and Lakehouse tables.
  • DevOps and release: Establish coding standards, templates, CI/CD, configuration management, and monitored release processes.
  • Quality and reliability: Define DQ rules and lineage; implement SLA tracking, failure detection, RCA, and proactive defect mitigation.
  • Security and governance: Enforce IAM best practices, retention, audit/compliance; implement PII detection and masking.
  • Orchestration: Schedule and govern pipelines with Airflow and serverless event-driven patterns.
  • Stakeholder collaboration: Clarify requirements, present design options, conduct demos, and finalize architectures with customer teams.
  • Leadership: Mentor engineers, set FAST goals, drive upskilling and certifications, and support module delivery and sprint planning.
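The PII masking responsibility above can be sketched with two regular expressions: redact email addresses and long digit runs (phone or account numbers) from free text. These patterns are simplified assumptions for illustration, not a compliance-grade rule set:

```python
import re

# Simplified PII masking: emails and 10+ digit runs (illustrative patterns).
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
DIGITS = re.compile(r"\b\d{10,}\b")

def mask_pii(text):
    text = EMAIL.sub("<EMAIL>", text)
    return DIGITS.sub("<NUMBER>", text)
```

At scale, this kind of rule set is typically complemented by ML-based entity detection (e.g., the AWS Comprehend PII features mentioned above).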



REQUIRED QUALIFICATIONS:

  • Experience: 15+ years designing distributed systems at petabyte scale; 10+ years building data lakes and multi-source ingestion.
  • Cloud (AWS): IAM, VPC, EC2, EKS/ECS, S3, RDS, DMS, Lambda, CloudWatch, CloudFormation, CloudTrail.
  • Programming: Python (preferred), PySpark, SQL for analytics, window functions, and performance tuning.
  • ETL tools: AWS Glue, Informatica, Databricks, GCP DataProc; orchestration with Airflow.
  • Lakehouse/warehousing: Snowflake, BigQuery, Delta Lake/Lakehouse; schema design, partitioning, clustering, performance optimization.
  • DevOps/IaC: Terraform with 15+ years of practice; CI/CD (GitHub Actions, Jenkins) with 10+ years; config governance and release management.
  • Serverless and events: Design event-driven distributed systems on AWS.
  • NoSQL: 2–3 years with DocumentDB including data modeling and performance considerations.
  • AI services: AWS Entity Resolution, AWS Comprehend; run custom LLMs on Amazon SageMaker; use LLMs for PII classification.



NICE-TO-HAVE QUALIFICATIONS:

  • Data governance automation: 10+ years defining audit, compliance, retention standards and automating governance workflows.
  • Table and file formats: Apache Parquet; Apache Iceberg as analytical table format.
  • Advanced LLM workflows: RAG and agentic patterns over proprietary data; re-ranking with index/vector store results.
  • Multi-cloud exposure: Azure ADF/ADLS, GCP Dataflow/DataProc; FinOps practices for cross-cloud cost control.



OUTCOMES AND MEASURES:

  • Engineering excellence: Adherence to processes, standards, and SLAs; reduced defects and non-compliance; fewer recurring issues.
  • Efficiency: Faster run times and lower resource consumption with documented cost models and performance baselines.
  • Operational reliability: Faster detection, response, and resolution of failures; quick turnaround on production bugs; strong release success.
  • Data quality and security: High DQ pass rates, robust lineage, minimal security incidents, and audit readiness.
  • Team and customer impact: On-time milestones, clear communication, effective demos, improved satisfaction, and completed certifications/training.



LOCATION AND SCHEDULE:

• Location: Outside US (OUS).

• Schedule: Minimum 6 hours of overlap with US time zones.

Read more