
50+ Python Jobs in Hyderabad | Python Job openings in Hyderabad

Apply to 50+ Python Jobs in Hyderabad on CutShort.io. Explore the latest Python Job opportunities across top companies like Google, Amazon & Adobe.

Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Chennai
7 - 10 yrs
₹10L - ₹18L / yr
Full Stack
React.js
Python
Go Programming (Golang)
CI/CD

Full-Stack Developer

Exp: 5+ years required

Night shift: 8 PM to 5 AM or 9 PM to 6 AM

Only immediate joiners can apply.


We are seeking a mid-to-senior level Full-Stack Developer with a foundational understanding of software development, cloud services, and database management. In this role, you will contribute to both the front-end and back-end of our application, focusing on creating a seamless user experience, supported by robust and scalable cloud infrastructure.

Key Responsibilities

● Develop and maintain user-facing features using React.js and TypeScript.

● Write clean, efficient, and well-documented JavaScript/TypeScript code.

● Assist in managing and provisioning cloud infrastructure on AWS using Infrastructure as Code (IaC) principles.

● Contribute to the design, implementation, and maintenance of our databases.

● Collaborate with senior developers and product managers to deliver high-quality software.

● Troubleshoot and debug issues across the full stack.

● Participate in code reviews to maintain code quality and share knowledge.

Qualifications

● Bachelor's degree in Computer Science, a related technical field, or equivalent practical experience.

● 5+ years of professional experience in web development.

● Proficiency in JavaScript and/or TypeScript.

● Proficiency in Golang and Python.

● Hands-on experience with the React.js library for building user interfaces.

● Familiarity with Infrastructure as Code (IaC) tools and concepts (e.g., AWS CDK, Terraform, or CloudFormation).

● Basic understanding of AWS and its core services (e.g., S3, EC2, Lambda, DynamoDB).

● Experience with database management, including relational (e.g., PostgreSQL) or NoSQL (e.g., DynamoDB, MongoDB) databases.

● Strong problem-solving skills and a willingness to learn.

● Familiarity with modern front-end build pipelines and tools like Vite and Tailwind CSS.

● Knowledge of CI/CD pipelines and automated testing.


Hyderabad
6 - 10 yrs
₹20L - ₹30L / yr
Java
Python
HTML/CSS
JavaScript
Spring Boot

Senior Software Developer – Java Full Stack | AI-Powered Innovation

Experience: 6–10 years

Department: Engineering & Innovation


🌟 About the Role

We’re searching for a Senior Software Developer who thrives on solving complex challenges and building world-class products that redefine technology boundaries. You’ll be part of a dynamic team that brings Java full-stack excellence together with Python and AI-driven innovations, crafting scalable, intelligent, and high-performance solutions.

If you love clean code, intelligent systems, and pushing the limits of what’s possible, this is your playground.


💡 What You’ll Do

  • Design, develop, and deploy robust Java-based full-stack applications with a focus on performance, scalability, and reliability.
  • Collaborate with cross-functional teams to integrate AI and Python-driven components into enterprise-grade systems.
  • Architect and maintain microservices, RESTful APIs, and modular components for high-availability platforms.
  • Engage in end-to-end product development — from ideation to deployment — using modern frameworks and tools.
  • Champion best coding practices, conduct code reviews, and mentor junior engineers.
  • Explore, experiment, and implement new technologies in AI, automation, and intelligent analytics.
  • Troubleshoot complex issues, debug performance bottlenecks, and deliver elegant solutions.


🧠 What Makes You Stand Out

  • Strong expertise in Java, Spring Boot, Hibernate, and modern JavaScript frameworks (React, Angular, or Vue).
  • Hands-on exposure to Python programming — especially for automation or AI/ML integration.
  • Solid understanding of AI/ML frameworks (TensorFlow, PyTorch, or OpenAI APIs) is a big plus.
  • Experience with cloud technologies (AWS, Azure, or GCP) and containerization tools (Docker, Kubernetes).
  • Proven record of building scalable microservices and RESTful APIs.
  • Passion for problem-solving, algorithmic thinking, and clean architecture.
  • Excellent communication and collaboration skills — you turn complex problems into creative solutions.


⚙️ Tech Stack Snapshot

Languages: Java, Python, JavaScript

Frameworks: Spring Boot, React/Angular/Vue, Flask (optional)

Tools: Git, Jenkins, Docker, Kubernetes

Databases: MongoDB, MySQL, PostgreSQL

Bonus: AI/ML frameworks, Generative AI, or NLP experience


🌈 Why You’ll Love Working Here

  • Work on cutting-edge AI-integrated applications that make a real-world impact.
  • Join a culture that values innovation, autonomy, and technical excellence.
  • Collaborate with brilliant minds who inspire and challenge you daily.
  • Enjoy flexibility, learning opportunities, and a growth-oriented environment.


💬 Ready to code the future?

Apply now and let’s build something extraordinary together!

Hunarstreet technologies pvt ltd

Agency job
Chennai, Hyderabad, Bengaluru (Bangalore), Mumbai, Pune, Gurugram, Mohali, Panchkula
5 - 15 yrs
₹10L - ₹15L / yr
Fullstack Developer
Web Development
JavaScript
TypeScript
Go Programming (Golang)

We are seeking a mid-to-senior level Full-Stack Developer with a foundational understanding of software development, cloud services, and database management. In this role, you will contribute to both the front-end and back-end of our application, focusing on creating a seamless user experience, supported by robust and scalable cloud infrastructure.


Key Responsibilities

● Develop and maintain user-facing features using React.js and TypeScript.

● Write clean, efficient, and well-documented JavaScript/TypeScript code.

● Assist in managing and provisioning cloud infrastructure on AWS using Infrastructure as Code (IaC) principles.

● Contribute to the design, implementation, and maintenance of our databases.

● Collaborate with senior developers and product managers to deliver high-quality software.

● Troubleshoot and debug issues across the full stack.

● Participate in code reviews to maintain code quality and share knowledge.


Qualifications

● Bachelor's degree in Computer Science, a related technical field, or equivalent practical experience.

● 5+ years of professional experience in web development.

● Proficiency in JavaScript and/or TypeScript.

● Proficiency in Golang and Python.

● Hands-on experience with the React.js library for building user interfaces.

● Familiarity with Infrastructure as Code (IaC) tools and concepts (e.g., AWS CDK, Terraform, or CloudFormation).

● Basic understanding of AWS and its core services (e.g., S3, EC2, Lambda, DynamoDB).

● Experience with database management, including relational (e.g., PostgreSQL) or NoSQL (e.g., DynamoDB, MongoDB) databases.

● Strong problem-solving skills and a willingness to learn.

● Familiarity with modern front-end build pipelines and tools like Vite and Tailwind CSS.

● Knowledge of CI/CD pipelines and automated testing.

Estuate Software
Posted by Deekshith K Naidu
Hyderabad
5 - 12 yrs
₹5L - ₹35L / yr
Google Cloud Platform (GCP)
Apache Airflow
ETL
Python
BigQuery

Job Title: Data Engineer / Integration Engineer

 

Job Summary:

We are seeking a highly skilled Data Engineer / Integration Engineer to join our team. The ideal candidate will have expertise in Python, workflow orchestration, cloud platforms (GCP/Google BigQuery), big data frameworks (Apache Spark or similar), API integration, and Oracle EBS. The role involves designing, developing, and maintaining scalable data pipelines, integrating various systems, and ensuring data quality and consistency across platforms. Knowledge of Ascend.io is a plus.

Key Responsibilities:

  • Design, build, and maintain scalable data pipelines and workflows.
  • Develop and optimize ETL/ELT processes using Python and workflow automation tools.
  • Implement and manage data integration between various systems, including APIs and Oracle EBS.
  • Work with Google Cloud Platform (GCP) or Google BigQuery (GBQ) for data storage, processing, and analytics.
  • Utilize Apache Spark or similar big data frameworks for efficient data processing.
  • Develop robust API integrations for seamless data exchange between applications.
  • Ensure data accuracy, consistency, and security across all systems.
  • Monitor and troubleshoot data pipelines, identifying and resolving performance issues.
  • Collaborate with data analysts, engineers, and business teams to align data solutions with business goals.
  • Document data workflows, processes, and best practices for future reference.
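The ETL/ELT work in the responsibilities above can be sketched end to end in plain Python. This is a minimal, hedged illustration only: the record fields (`id`, `amt`), the normalized schema, and the in-memory `sink` standing in for a warehouse table such as BigQuery are all invented for the example.

```python
import json
from datetime import datetime, timezone

def extract(raw_lines):
    """Parse newline-delimited JSON records, skipping malformed lines."""
    records = []
    for line in raw_lines:
        try:
            records.append(json.loads(line))
        except json.JSONDecodeError:
            continue  # in production, route bad lines to a dead-letter store
    return records

def transform(records):
    """Normalize field names, coerce types, and stamp a load timestamp."""
    now = datetime.now(timezone.utc).isoformat()
    return [
        {"order_id": r["id"], "amount_usd": round(float(r["amt"]), 2), "loaded_at": now}
        for r in records
        if "id" in r and "amt" in r
    ]

def load(rows, sink):
    """Append transformed rows to the sink (an in-memory list here)."""
    sink.extend(rows)
    return len(rows)

raw = ['{"id": 1, "amt": "19.994"}', 'not json', '{"id": 2, "amt": "5"}']
sink = []
count = load(transform(extract(raw)), sink)
print(count)  # 2 rows loaded; the malformed line was skipped
```

In a real pipeline each stage would be a task in an orchestrator such as Airflow, with the same extract/transform/load separation making retries and monitoring per stage possible.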

Required Skills & Qualifications:

  • Strong proficiency in Python for data engineering and workflow automation.
  • Experience with workflow orchestration tools (e.g., Apache Airflow, Prefect, or similar).
  • Hands-on experience with Google Cloud Platform (GCP) or Google BigQuery (GBQ).
  • Expertise in big data processing frameworks, such as Apache Spark.
  • Experience with API integrations (REST, SOAP, GraphQL) and handling structured/unstructured data.
  • Strong problem-solving skills and ability to optimize data pipelines for performance.
  • Experience working in an agile environment with CI/CD processes.
  • Strong communication and collaboration skills.

Preferred Skills & Nice-to-Have:

  • Experience with Ascend.io platform for data pipeline automation.
  • Knowledge of SQL and NoSQL databases.
  • Familiarity with Docker and Kubernetes for containerized workloads.
  • Exposure to machine learning workflows is a plus.

Why Join Us?

  • Opportunity to work on cutting-edge data engineering projects.
  • Collaborative and dynamic work environment.
  • Competitive compensation and benefits.
  • Professional growth opportunities with exposure to the latest technologies.

How to Apply:

Interested candidates can apply by sending their resume to [your email/contact].

 

VyTCDC
Posted by Gobinath Sundaram
Bengaluru (Bangalore), Pune, Hyderabad
6 - 12 yrs
₹5L - ₹28L / yr
Data Science
Python
Large Language Models (LLM)

Job Description:

Role: Data Scientist

Responsibilities:

  • Lead data science and machine learning projects, contributing to model development, optimization, and evaluation.
  • Perform data cleaning, feature engineering, and exploratory data analysis.
  • Translate business requirements into technical solutions; document and communicate project progress; manage non-technical stakeholders.
  • Collaborate with other data scientists and engineers to deliver projects.

Technical Skills – Must have:

  • Experience in and understanding of the natural language processing (NLP) and large language model (LLM) landscape.
  • Proficiency with Python for data analysis and for supervised and unsupervised machine learning tasks.
  • Ability to translate complex machine learning problem statements into specific deliverables and requirements.
  • Hands-on experience with major cloud platforms such as AWS, Azure, or GCP.
  • Working knowledge of SQL and NoSQL databases.
  • Ability to create data and ML pipelines for more efficient and repeatable data science projects using MLOps principles.
  • Keep abreast of new tools, algorithms, and techniques in machine learning and work to implement them in the organization.
  • Strong understanding of evaluation and monitoring metrics for machine learning projects.
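To make the evaluation-metrics requirement concrete, here is a minimal sketch of three standard classification metrics computed from scratch; the label vectors are illustrative only, and in practice a library such as scikit-learn would provide these.

```python
def precision_recall_f1(y_true, y_pred, positive=1):
    """Compute binary precision, recall, and F1 without external libraries."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# 5 examples: 2 true positives, 1 false positive, 1 false negative
p, r, f = precision_recall_f1([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
print(round(p, 2), round(r, 2), round(f, 2))  # 0.67 0.67 0.67
```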

Inncircles
Posted by Sharat Chandra Manchi Sarapu
Hyderabad
2 - 6 yrs
₹8L - ₹20L / yr
Python
Flask
FastAPI
Django
Databases

About Us:

We are a cutting-edge startup reshaping the construction management landscape with AI-driven solutions that simplify complex processes and maximize efficiency. Our platform leverages the latest web and mobile technologies to solve real-world challenges in the construction industry, blending innovation with usability. If you're passionate about building scalable systems and love solving problems, we want you on board!

Who You Are:

You are a tech enthusiast with a passion for clean, scalable backend systems built in Python. You have a knack for solving challenging problems and enjoy working in a fast-paced startup environment. You’re comfortable diving into code, debugging complex issues, and collaborating with cross-functional teams. While deep expertise in Python frameworks is a must, you’re also excited about emerging technologies like generative AI, machine learning, and deep learning.

What You’ll Do:

  • Develop & Maintain: Build robust, secure, and scalable backend services using Python frameworks like Flask, FastAPI, or Django.
  • API Design: Create and maintain RESTful APIs and microservices that power our platform.
  • Database Management: Design and optimize database schemas; ideally with MongoDB, though experience with other databases is also valued.
  • Integration: Collaborate with front-end and mobile teams to integrate seamless data flows and user experiences.
  • Innovate: Explore and integrate new technologies, including LLMs, generative AI, machine learning, and deep learning, to enhance our product offerings.
  • Cloud & DevOps: Work with cloud computing platforms (AWS or similar) to deploy, scale, and maintain backend systems.

Tech Stack:

  • Backend: Python (Flask, FastAPI, or Django)
  • Database: MongoDB (preferred) or other relational/NoSQL databases
  • Cloud: AWS or other cloud platforms
  • Additional Tools: Git, Docker, CI/CD pipelines

What You Bring:

  • Experience: 2+ years of experience building scalable backend systems in Python.
  • Framework Proficiency: Solid hands-on experience with Flask, FastAPI, or Django.
  • Database Knowledge: Strong understanding of database design, indexing, and query optimization, preferably with MongoDB.
  • API Expertise: Experience designing and consuming RESTful APIs.
  • Version Control: Proficiency with Git and agile development practices.
  • Problem Solver: A keen eye for detail and a passion for writing clean, maintainable code.
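The database-indexing point above is worth a tiny illustration: an equality lookup over an unindexed collection is a full O(n) scan, while a hash index answers it in O(1). The collection shape below is invented, and a Python dict stands in for what MongoDB or PostgreSQL would maintain as a B-tree or hash index.

```python
# A toy collection of 10,000 documents.
records = [{"user_id": i, "name": f"user{i}"} for i in range(10_000)]

def find_scan(uid):
    """Full collection scan, as a database does without an index."""
    return next((r for r in records if r["user_id"] == uid), None)

# Build the index once; subsequent lookups are constant time.
index = {r["user_id"]: r for r in records}

def find_indexed(uid):
    return index.get(uid)

# Both return the same document; only the cost differs.
assert find_scan(9_999) == find_indexed(9_999)
```

The trade-off mirrors real databases: the index costs memory and write-time maintenance, which is why indexes are added selectively for the queries that matter.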

Bonus Points For:

  • Exposure to and working experience with LLMs, generative AI, machine learning, deep learning, or fine-tuning models.
  • Familiarity with containerization (Docker) and modern CI/CD practices.
  • Experience working in a fast-paced startup environment.

Why Work With Us:

  • Impact: Join a mission-driven startup solving real-world problems in a trillion-dollar industry.
  • Innovation: Be part of a forward-thinking team that builds AI-powered, scalable tools from the ground up.
  • Growth: Enjoy rapid career advancement as our company scales, with ample space for your ideas to thrive.
  • Culture: Experience a collaborative, tech-driven, and fun work environment that values creativity, ownership, and continuous learning.


Inncircles
Posted by Gangadhar M
Hyderabad
4 - 8 yrs
Best in industry
NumPy
Python
pandas
Machine Learning (ML)
Deep Learning

Job Title: Senior AI/ML/DL Engineer

Location: Hyderabad

Department: Artificial Intelligence/Machine Learning


Job Summary:

We are seeking a highly skilled and motivated Senior AI/ML/DL Engineer to contribute to the development and implementation of advanced artificial intelligence, machine learning, and deep learning solutions. The ideal candidate will have a strong technical background in AI/ML/DL, hands-on experience in building scalable models, and a passion for solving complex problems using data-driven approaches. This role involves working closely with cross-functional teams to deliver innovative AI/ML solutions aligned with business objectives.

Key Responsibilities:

Technical Execution:

● Design, develop, and deploy AI/ML/DL models and algorithms to solve business challenges.

● Stay up-to-date with the latest advancements in AI/ML/DL technologies and integrate them into solutions.

● Implement best practices for model development, validation, and deployment.

Project Development:

● Collaborate with stakeholders to identify business opportunities and translate them into AI/ML projects.

● Work on the end-to-end lifecycle of AI/ML projects, including data collection, preprocessing, model training, evaluation, and deployment.

● Ensure the scalability, reliability, and performance of AI/ML solutions in production environments.

Cross-Functional Collaboration:

● Work closely with product managers, software engineers, and domain experts to integrate AI/ML capabilities into products and services.

● Communicate complex technical concepts to non-technical stakeholders effectively.

Research and Innovation:

● Explore new AI/ML techniques and methodologies to enhance solution capabilities.

● Prototype and experiment with novel approaches to solve challenging problems.

● Contribute to internal knowledge-sharing initiatives and documentation.

Quality Assurance & MLOps:

● Ensure the accuracy, robustness, and ethical use of AI/ML models.

● Implement monitoring and maintenance processes for deployed models to ensure long-term performance.

● Follow MLOps practices for efficient deployment and monitoring of AI/ML solutions.
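The model-monitoring responsibility above can be illustrated with a deliberately simple drift check: compare the mean of a live feature window against the training baseline. The numbers and the three-standard-error threshold are invented for the sketch; production MLOps stacks would use tests such as Kolmogorov–Smirnov or PSI.

```python
import statistics

def drift_alert(baseline, live, threshold=3.0):
    """Flag drift when the live window's mean departs from the training
    baseline by more than `threshold` standard errors (a toy heuristic)."""
    base_mean = statistics.mean(baseline)
    base_sd = statistics.stdev(baseline)
    se = base_sd / (len(live) ** 0.5)
    z = abs(statistics.mean(live) - base_mean) / se
    return z > threshold

baseline = [10.0, 10.5, 9.8, 10.2, 10.1, 9.9, 10.3, 10.0]
print(drift_alert(baseline, [10.1, 9.9, 10.2, 10.0]))   # False: stable window
print(drift_alert(baseline, [13.0, 12.8, 13.1, 12.9]))  # True: shifted window
```

Wired into a scheduled job, a check like this is the trigger for the "maintenance processes" the bullet describes: alerting, investigation, and possibly retraining.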


Qualifications:

Education:

● Bachelor's, Master's, or Ph.D. in Computer Science, Data Science, Artificial Intelligence, Machine Learning, or a related field.

Experience:

● 5+ years of experience in AI/ML/DL, with a proven track record of delivering AI/ML solutions in production environments.

● Strong experience with programming languages such as Python, R, or Java.

● Proficiency in AI/ML frameworks and tools (e.g., TensorFlow, PyTorch, Scikit-learn, Keras).

● Experience with cloud platforms (e.g., AWS, Azure, GCP) and big data technologies (e.g., Hadoop, Spark).

● Familiarity with MLOps practices and tools for model deployment and monitoring.

Skills:

● Strong understanding of machine learning algorithms, deep learning architectures, and statistical modeling.

● Excellent problem-solving and analytical skills.

● Strong communication and interpersonal skills.

● Ability to manage multiple projects and prioritize effectively.

Preferred Qualifications:

● Experience in natural language processing (NLP), computer vision, or reinforcement learning.

● Knowledge of ethical AI practices and regulatory compliance.

● Publications or contributions to the AI/ML community (e.g., research papers, open-source projects).

What We Offer:

● Competitive salary and benefits package.

● Opportunities for professional development and career growth.

● A collaborative and innovative work environment.

● The chance to work on impactful projects that leverage cutting-edge AI/ML technologies.

Inncircles
Posted by Gangadhar M
Hyderabad
3 - 5 yrs
Best in industry
PySpark
Spark
Python
ETL
Amazon EMR


We are looking for a highly skilled Sr. Big Data Engineer with 3–5 years of experience in building large-scale data pipelines, real-time streaming solutions, and batch/stream processing systems. The ideal candidate should be proficient in Spark, Kafka, Python, and AWS Big Data services, with hands-on experience in implementing CDC (Change Data Capture) pipelines and integrating multiple data sources and sinks.


Responsibilities

  • Design, develop, and optimize batch and streaming data pipelines using Apache Spark and Python.
  • Build and maintain real-time data ingestion pipelines leveraging Kafka and AWS Kinesis.
  • Implement CDC (Change Data Capture) pipelines using Kafka Connect, Debezium or similar frameworks.
  • Integrate data from multiple sources and sinks (databases, APIs, message queues, file systems, cloud storage).
  • Work with AWS Big Data ecosystem: Glue, EMR, Kinesis, Athena, S3, Lambda, Step Functions.
  • Ensure pipeline scalability, reliability, and performance tuning of Spark jobs and EMR clusters.
  • Develop data transformation and ETL workflows in AWS Glue and manage schema evolution.
  • Collaborate with data scientists, analysts, and product teams to deliver reliable and high-quality data solutions.
  • Implement monitoring, logging, and alerting for critical data pipelines.
  • Follow best practices for data security, compliance, and cost optimization in cloud environments.
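The CDC responsibility above boils down to replaying change events against a target. The sketch below applies simplified Debezium-style events to an in-memory table: the `op` codes `c`/`u`/`d`/`r` do appear in Debezium's envelope, but the rest of the event shape here is trimmed and invented for illustration (real envelopes carry schema, source, and timestamp metadata, and the target would be a warehouse or lakehouse table).

```python
def apply_cdc_event(table, event):
    """Apply one simplified change event to a table keyed by primary key."""
    op, key = event["op"], event["key"]
    if op in ("c", "r"):          # create / snapshot read
        table[key] = event["after"]
    elif op == "u":               # update: merge changed fields
        table[key] = {**table.get(key, {}), **event["after"]}
    elif op == "d":               # delete
        table.pop(key, None)
    return table

events = [
    {"op": "c", "key": 1, "after": {"name": "Ada", "city": "Hyderabad"}},
    {"op": "u", "key": 1, "after": {"city": "Pune"}},
    {"op": "c", "key": 2, "after": {"name": "Raj"}},
    {"op": "d", "key": 2, "after": None},
]

table = {}
for e in events:
    apply_cdc_event(table, e)
print(table)  # {1: {'name': 'Ada', 'city': 'Pune'}}
```

Ordering per key is what makes this correct, which is why CDC consumers typically partition the Kafka topic by primary key.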


Required Skills & Experience

  • Programming: Strong proficiency in Python (PySpark, data frameworks, automation).
  • Big Data Processing: Hands-on experience with Apache Spark (batch & streaming).
  • Messaging & Streaming: Proficient in Kafka (brokers, topics, partitions, consumer groups) and AWS Kinesis.
  • CDC Pipelines: Experience with Debezium / Kafka Connect / custom CDC frameworks.
  • AWS Services: AWS Glue, EMR, S3, Athena, Lambda, IAM, CloudWatch.
  • ETL/ELT Workflows: Strong knowledge of data ingestion, transformation, partitioning, schema management.
  • Databases: Experience with relational databases (MySQL, Postgres, Oracle) and NoSQL (MongoDB, DynamoDB, Cassandra).
  • Data Formats: JSON, Parquet, Avro, ORC, Delta/Iceberg/Hudi.
  • Version Control & CI/CD: Git, GitHub/GitLab, Jenkins, or CodePipeline.
  • Monitoring/Logging: CloudWatch, Prometheus, ELK/Opensearch.
  • Containers & Orchestration (nice-to-have): Docker, Kubernetes, Airflow/Step Functions for workflow orchestration.


Preferred Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
  • Experience in large-scale data lake / lake house architectures.
  • Knowledge of data warehousing concepts and query optimisation.
  • Familiarity with data governance, lineage, and cataloging tools (Glue Data Catalog, Apache Atlas).
  • Exposure to ML/AI data pipelines is a plus.


Tools & Technologies (must-have exposure)

  • Big Data & Processing: Apache Spark, PySpark, AWS EMR, AWS Glue
  • Streaming & Messaging: Apache Kafka, Kafka Connect, Debezium, AWS Kinesis
  • Cloud & Storage: AWS (S3, Athena, Lambda, IAM, CloudWatch)
  • Programming & Scripting: Python, SQL, Bash
  • Orchestration: Airflow / Step Functions
  • Version Control & CI/CD: Git, Jenkins/CodePipeline
  • Data Formats: Parquet, Avro, ORC, JSON, Delta, Iceberg, Hudi
Pune, Bengaluru (Bangalore), Hyderabad
8 - 12 yrs
₹14L - ₹15L / yr
R Programming
Python
Scikit-Learn
TensorFlow
PyTorch

Role: Data Scientist (Python + R Expertise)

Exp: 8 -12 Years

CTC: up to 30 LPA


Required Skills & Qualifications:

  • 8–12 years of hands-on experience as a Data Scientist or in a similar analytical role.
  • Strong expertise in Python and R for data analysis, modeling, and visualization.
  • Proficiency in machine learning frameworks (scikit-learn, TensorFlow, PyTorch, caret, etc.).
  • Strong understanding of statistical modeling, hypothesis testing, regression, and classification techniques.
  • Experience with SQL and working with large-scale structured and unstructured data.
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and deployment practices (Docker, MLflow).
  • Excellent analytical, problem-solving, and communication skills.


Preferred Skills:

  • Experience with NLP, time series forecasting, or deep learning projects.
  • Exposure to data visualization tools (Tableau, Power BI, or R Shiny).
  • Experience working in product or data-driven organizations.
  • Knowledge of MLOps and model lifecycle management is a plus.


If interested, kindly share your updated resume on 82008 31681.


ZestFindz Private Limited

Posted by ZestFindz Info Desk
Hyderabad
1 - 3 yrs
₹2L - ₹6L / yr
React.js
NodeJS (Node.js)
Express
JavaScript
TypeScript

We are seeking a talented Full Stack Developer to design, build, and maintain scalable web and mobile applications. The ideal candidate should have hands-on experience in frontend (React.js, Flutter), backend (Node.js, Express), databases (PostgreSQL, MongoDB), and Python for AI/ML integration. You will work closely with the engineering team to deliver secure, high-performance, and user-friendly products.


Key Responsibilities

  • Develop responsive and dynamic web applications using React.js and modern UI frameworks.
  • Build and optimize REST APIs and backend services with Node.js and Express.js.
  • Design and manage PostgreSQL and MongoDB databases, ensuring optimized queries and data modeling.
  • Implement state management using Redux/Context API.
  • Ensure API security with JWT, OAuth2, Helmet.js, and rate-limiting.
  • Integrate Google Cloud services (GCP) for hosting, storage, and serverless functions.
  • Deploy and maintain applications using CI/CD pipelines, Docker, and Kubernetes.
  • Use Redis for caching, sessions, and job queues.
  • Optimize frontend performance (lazy loading, code splitting, caching strategies).
  • Collaborate with design, QA, and product teams to deliver high-quality features.
  • Maintain clear documentation and follow coding standards.
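The JWT-based API security mentioned above rests on a simple mechanism: an HMAC signature over the base64url-encoded header and payload. The stdlib sketch below shows those mechanics only; in production a vetted library such as PyJWT is the right choice, and the secret and claims here are invented for the example.

```python
import base64
import hashlib
import hmac
import json

def _b64(data: bytes) -> str:
    """base64url-encode without padding, per the JWT convention."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: str) -> str:
    """Produce an HS256 JWT: header.payload.signature."""
    header = _b64(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64(json.dumps(payload).encode())
    sig = hmac.new(secret.encode(), f"{header}.{body}".encode(), hashlib.sha256).digest()
    return f"{header}.{body}.{_b64(sig)}"

def verify_jwt(token: str, secret: str):
    """Return the payload if the signature checks out, else None."""
    header, body, sig = token.split(".")
    expected = _b64(hmac.new(secret.encode(), f"{header}.{body}".encode(), hashlib.sha256).digest())
    if not hmac.compare_digest(expected, sig):  # timing-safe comparison
        return None
    padded = body + "=" * (-len(body) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))

token = sign_jwt({"sub": "user-42", "role": "admin"}, "demo-secret")
print(verify_jwt(token, "demo-secret"))   # payload round-trips
print(verify_jwt(token, "wrong-secret"))  # None: signature mismatch
```

Expiry (`exp`) and audience checks, which real middleware also enforces, are omitted here for brevity.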
ZestFindz Private Limited

Posted by ZestFindz Info Desk
Hyderabad
3 - 7 yrs
₹6L - ₹16L / yr
React.js
NodeJS (Node.js)
Express
JavaScript
TypeScript

We are looking for a highly skilled Senior Full Stack Developer / Tech Lead to drive end-to-end development of scalable, secure, and high-performance applications. The ideal candidate will have strong expertise in React.js, Node.js, PostgreSQL, MongoDB, Python, AI/ML, and Google Cloud platforms (GCP). You will play a key role in architecture design, mentoring developers, ensuring best coding practices, and integrating AI/ML solutions into our products.

This role requires a balance of hands-on coding, system design, cloud deployment, and leadership.


Key Responsibilities

  • Design, develop, and deploy scalable full-stack applications using React.js, Node.js, PostgreSQL, and MongoDB.
  • Build, consume, and optimize REST APIs and GraphQL services.
  • Develop AI/ML models with Python and integrate them into production systems.
  • Implement CI/CD pipelines, containerization (Docker, Kubernetes), and cloud deployments (GCP/AWS).
  • Manage security, authentication (JWT, OAuth2), and performance optimization.
  • Use Redis for caching, session management, and queue handling.
  • Lead and mentor junior developers, conduct code reviews, and enforce coding standards.
  • Collaborate with cross-functional teams (product, design, QA) for feature delivery.
  • Monitor and optimize system performance, scalability, and cost-efficiency.
  • Own technical decisions and contribute to long-term architecture strategy.
FloData
Posted by Mahesh J
Hyderabad
3 - 5 yrs
₹20L - ₹40L / yr
Generative AI
Retrieval Augmented Generation (RAG)
Prompt Engineering
AI Agents
LangGraph

Join us to reimagine how businesses integrate data and automate processes – with AI at the core.


About FloData

FloData is re-imagining the iPaaS and Business Process Automation (BPA) space for a new era - one where business teams, not just IT, can integrate data, run automations, and solve ops bottlenecks using intuitive, AI-driven interfaces. We're a small, hands-on team with a deep technical foundation and strong industry connections. Backed by real-world learnings from our earlier platform version, we're now going all-in on building a generative AI-first experience.


The Opportunity

We’re looking for a GenAI Engineer to help build the intelligence layer of our new platform. From designing LLM-powered orchestration flows with LangGraph to building frameworks for evaluation and monitoring with LangSmith, you’ll shape how AI powers real-world enterprise workflows.


If you thrive on working at the frontier of LLM systems engineering, enjoy scaling prototypes into production-grade systems, and want to make AI reliable, explainable, and enterprise-ready - this is your chance to define a category-defining product.


What You'll Do

  • Spend ~70% of your time architecting, prototyping, and productionizing AI systems (LLM orchestration, agents, evaluation, observability)
  • Develop AI frameworks: orchestration (LangGraph), evaluation/monitoring (LangSmith), vector/graph DBs, and other GenAI infra
  • Work with product engineers to seamlessly integrate AI services into frontend and backend workflows
  • Build systems for AI evaluation, monitoring, and reliability to ensure trustworthy performance at scale
  • Translate product needs into AI-first solutions, balancing rapid prototyping with enterprise-grade robustness
  • Stay ahead of the curve by exploring emerging GenAI frameworks, tools, and research for practical application
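The retrieval step at the heart of the RAG work described above can be shown in miniature. This sketch ranks documents by bag-of-words cosine similarity; it is a toy stand-in for what a production stack would do with dense embeddings and a vector database, and the document snippets are invented.

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list, k: int = 1) -> list:
    """Return the k documents most similar to the query."""
    qv = Counter(query.lower().split())
    scored = sorted(docs, key=lambda d: cosine(qv, Counter(d.lower().split())),
                    reverse=True)
    return scored[:k]

docs = [
    "invoices sync nightly from the ERP system",
    "the mobile app supports offline mode",
    "salesforce contacts sync to the data warehouse",
]
print(retrieve("how do invoices sync from the ERP", docs))
```

In a full RAG flow the retrieved snippets would be injected into the LLM prompt as context, and an evaluation harness (e.g., via LangSmith) would score whether the right snippet was retrieved.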


Must Have

  • 3–5 years of engineering experience, with at least 1-2 years in GenAI systems
  • Hands-on experience with LangGraph, LangSmith, LangChain, or similar frameworks for orchestration/evaluation
  • Deep understanding of LLM workflows: prompt engineering, fine-tuning, RAG, evaluation, monitoring, and observability
  • A strong product mindset—comfortable bridging research-level concepts with production-ready business use cases
  • Startup mindset: resourceful, pragmatic, and outcome-driven


Good To Have

  • Experience integrating AI pipelines with enterprise applications and hybrid infra setups (AWS, on-prem, VPCs)
  • Experience building AI-native user experiences (assistants, copilots, intelligent automation flows)
  • Familiarity with enterprise SaaS/IT ecosystems (Salesforce, Oracle ERP, Netsuite, etc.)


Why Join Us

  • Own the AI backbone of a generational product at the intersection of AI, automation, and enterprise data
  • Work closely with founders and leadership with no layers of bureaucracy
  • End-to-end ownership of AI systems you design and ship
  • Be a thought partner in setting AI-first principles for both tech and culture
  • Onsite in Hyderabad, with flexibility when needed


Sounds like you?

We'd love to talk. Apply now or reach out directly to explore this opportunity.

US Base Company

Agency job
Hyderabad, Gurugram
10 - 18 yrs
₹20L - ₹35L / yr
Python
Django
React.js
Angular
JavaScript

Key Responsibilities

  • Design, develop, and maintain scalable microservices and RESTful APIs using Python (Flask, FastAPI, or Django).
  • Architect data models for SQL and NoSQL databases (PostgreSQL, ClickHouse, MongoDB, DynamoDB) to optimize performance and reliability.
  • Implement efficient and secure data access layers, caching, and indexing strategies.
  • Collaborate closely with product and frontend teams to deliver seamless user experiences.
  • Build responsive UI components using HTML, CSS, JavaScript, and frameworks like React or Angular.
  • Ensure system reliability, observability, and fault tolerance across services.
  • Lead code reviews, mentor junior engineers, and promote engineering best practices.
  • Contribute to DevOps and CI/CD workflows for smooth deployments and testing automation.
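The caching strategy named in the responsibilities above usually takes the shape of a read-through cache with a time-to-live. In this hedged sketch an in-process dict stands in for Redis, and the profile-loading function and its fields are invented; the eviction logic is the part being illustrated.

```python
import time
from functools import wraps

def ttl_cache(ttl_seconds: float):
    """Memoize a function's results for ttl_seconds (Redis stand-in)."""
    def decorator(fn):
        store = {}  # key -> (expires_at, value)
        @wraps(fn)
        def wrapper(*args):
            now = time.monotonic()
            hit = store.get(args)
            if hit and hit[0] > now:
                return hit[1]          # cache hit, still fresh
            value = fn(*args)
            store[args] = (now + ttl_seconds, value)
            return value
        return wrapper
    return decorator

calls = 0

@ttl_cache(ttl_seconds=60)
def load_profile(user_id: int) -> dict:
    global calls
    calls += 1  # stands in for an expensive database query
    return {"user_id": user_id, "plan": "pro"}

load_profile(7)
load_profile(7)
print(calls)  # 1: the second call was served from cache
```

With Redis the dict becomes `SETEX`/`GET` calls, which adds what this sketch lacks: sharing across processes and memory-bounded eviction.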

Required Skills & Experience

  • 10+ years of professional software development experience.
  • Strong proficiency in Python, with deep understanding of OOP, asynchronous programming, and performance optimization.
  • Proven expertise in building FastAPI-based microservices architectures.
  • Solid understanding of SQL and NoSQL data modeling, query optimization, and schema design.
  • Excellent hands-on frontend proficiency with HTML, CSS, JavaScript, and a modern framework (React, Angular, or Vue).
  • Experience working with cloud platforms (AWS, GCP, or Azure) and containerized deployments (Docker, Kubernetes).
  • Familiarity with distributed systems, event-driven architectures, and messaging queues (Kafka, RabbitMQ).
  • Excellent problem-solving, communication, and system design skills.
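The asynchronous-programming skill listed above can be sketched with Python's standard asyncio library; `fetch_record` and the simulated 10 ms sleep are hypothetical stand-ins for real database or HTTP calls, not part of any stated stack:

```python
import asyncio

async def fetch_record(record_id: int) -> dict:
    # Simulate a non-blocking I/O call (e.g., a database or HTTP request).
    await asyncio.sleep(0.01)
    return {"id": record_id, "status": "ok"}

async def fetch_all(ids: list) -> list:
    # Run all requests concurrently instead of one after another.
    return await asyncio.gather(*(fetch_record(i) for i in ids))

results = asyncio.run(fetch_all([1, 2, 3]))
```

Because the coroutines run concurrently, total latency stays close to one call's latency rather than the sum of all three.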


Read more
Clink
Posted by Hari Krishna
Hyderabad, Bengaluru (Bangalore)
2 - 4 yrs
₹8L - ₹12L / yr
Database Design
Systems design
Web Development
Relational Database (RDBMS)
skill iconPython
+4 more

Role Overview:

We’re looking for an exceptionally passionate, logical, and smart Backend Developer to join our core tech team. This role goes beyond writing code: you’ll help shape the architecture, lead the entire backend team if needed, and be deeply involved in designing scalable systems almost from scratch.


This is a high-impact opportunity to work directly with the founders and play a pivotal role in building the core product. If you’re looking to grow alongside a fast-growing startup, take complete ownership, and influence the direction of the technology and product, this role is made for you.


Why Clink?

Clink is a fast-growing product startup building innovative solutions in the food-tech space. We’re on a mission to revolutionize how restaurants connect with customers and manage offers seamlessly. Our platform is a customer-facing app that needs to scale rapidly as we grow. We also aim to leverage Generative AI to enhance user experiences and drive personalization at scale.


Key Responsibilities:

  • Design, develop, and completely own high-performance backend systems.
  • Architect scalable, secure, and efficient system designs.
  • Own database schema design and optimization for performance and reliability.
  • Collaborate closely with frontend engineers, product managers, and designers.
  • Guide and mentor junior team members.
  • Explore and experiment with Generative AI capabilities for product innovation.
  • Participate in code reviews and ensure high engineering standards.

Must-Have Skills:

  • 2–5 years of experience in backend development at a product-based company.
  • Strong expertise in database design and system architecture.
  • Hands-on experience building multiple production-grade applications.
  • Solid programming fundamentals and logical problem-solving skills.
  • Experience with Python or Ruby on Rails (one is mandatory).
  • Experience integrating third-party APIs and services.
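Integrating third-party APIs, as required above, usually means handling transient failures gracefully. A minimal retry-with-backoff sketch, where `with_retries` and `flaky_api` are illustrative names rather than any real library:

```python
import time

def with_retries(call, attempts=3, base_delay=0.01):
    """Call `call()`, retrying on failure with exponential backoff."""
    for attempt in range(attempts):
        try:
            return call()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the error
            time.sleep(base_delay * (2 ** attempt))

# A stand-in for a flaky third-party API: fails twice, then succeeds.
calls = {"count": 0}
def flaky_api():
    calls["count"] += 1
    if calls["count"] < 3:
        raise ConnectionError("temporary outage")
    return {"status": 200}

response = with_retries(flaky_api)
```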

Good-to-Have Skills:

  • Familiarity with Generative AI tools, APIs, or projects.
  • Contributions to open-source projects or personal side projects.
  • Exposure to frontend basics (React, Vue, or similar) is a plus.
  • Exposure to containerization, cloud deployment, or CI/CD pipelines.

What We’re Looking For:

  • Extremely high aptitude and ability to solve tough technical problems.
  • Passion for building products from scratch and shipping fast.
  • A hacker mindset: someone who builds cool stuff even in their spare time.
  • Team player who can lead when required and work independently when needed.


Read more
Technoidentity
Hyderabad
6 - 12 yrs
₹20L - ₹35L / yr
skill iconPython
FastAPI
PySpark

Supercharge Your Career as a Technical Lead - Python at Technoidentity!

Are you ready to solve challenges that fuel business growth? At Technoidentity, we’re a Data + AI product engineering company trusted to deliver scalable, modern enterprise solutions, and we’ve been building cutting-edge products in the FinTech domain for over 13 years. We’re expanding globally, and it’s the perfect time to join our team of tech innovators and leave your mark!

Join us as a Senior Python Developer and Technical Lead, where you'll guide high-performing engineering teams, design complex systems, and deliver clean, scalable backend solutions using Python and modern data technologies. Your leadership will directly shape the architecture and execution of enterprise projects, with added strength in understanding database logic, including PL/SQL and PostgreSQL/AlloyDB.

What’s in it for You?

• Modern Python Stack – Python 3.x, FastAPI, Pandas, NumPy, SQLAlchemy, PostgreSQL/AlloyDB, PL/pgSQL.

• Tech Leadership – Drive technical decision-making, mentor developers, and ensure code quality and scalability.

• Scalable Projects – Architect and optimize data-intensive backend services for high-throughput and distributed systems.

• Engineering Best Practices – Enforce clean architecture, code reviews, testing strategies, and SDLC alignment.

• Cross-Functional Collaboration – Lead conversations across engineering, QA, product, and DevOps to ensure delivery excellence.

What Will You Be Doing?

Technical Leadership

• Lead a team of developers through design, code reviews, and technical mentorship.

• Set architectural direction and ensure scalability, modularity, and code quality.

• Work with stakeholders to translate business goals into robust technical solutions.

Backend Development & Data Engineering

• Design and build clean, high-performance backend services using FastAPI and Python best practices.

• Handle row- and column-level data transformation using Pandas and NumPy.

• Apply data wrangling, cleansing, and preprocessing techniques across microservices and pipelines.

Database & Performance Optimization

• Write performant queries, procedures, and triggers using PostgreSQL and PL/pgSQL.

• Understand legacy logic in PL/SQL and participate in rewriting or modernizing it for PostgreSQL-based systems.

• Tune both backend and database performance, including memory, indexing, and query optimization.

Parallelism & Communication

• Implement multithreading, multiprocessing, and parallel data flows in Python.

• Integrate Kafka, RabbitMQ, or Pub/Sub systems for real-time and async message processing.
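A parallel data flow of the kind described above can be sketched with only the standard library; `transform` and the sample rows are hypothetical, and `ThreadPoolExecutor` can be swapped for `ProcessPoolExecutor` when the work is CPU-bound and needs to sidestep the GIL:

```python
from concurrent.futures import ThreadPoolExecutor

def transform(row: dict) -> dict:
    # A stand-in for a per-row transformation step in a pipeline.
    return {**row, "total": row["qty"] * row["price"]}

rows = [{"qty": q, "price": 10.0} for q in range(1, 5)]

# Fan the rows out across worker threads and collect results in order.
with ThreadPoolExecutor(max_workers=4) as pool:
    transformed = list(pool.map(transform, rows))
```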

Engineering Excellence

• Drive adherence to Agile, Git-based workflows, CI/CD, and DevOps pipelines.

• Promote testing (unit/integration), monitoring, and observability for all backend systems.

• Stay current with Python ecosystem evolution and introduce tools that improve productivity and performance.

What Makes You the Perfect Fit?

• 6–10 years of proven experience in Python development, with strong expertise in designing and delivering scalable backend solutions.

Read more
Deqode
Posted by Apoorva Jain
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Nagpur, Ahmedabad, Jaipur, Kochi (Cochin)
3.6 - 8 yrs
₹4L - ₹18L / yr
skill iconPython
skill iconDjango
skill iconFlask
skill iconAmazon Web Services (AWS)
AWS Lambda
+3 more

Job Summary:

Deqode is looking for a highly motivated and experienced Python + AWS Developer to join our growing technology team. This role demands hands-on experience in backend development, cloud infrastructure (AWS), containerization, automation, and client communication. The ideal candidate should be a self-starter with a strong technical foundation and a passion for delivering high-quality, scalable solutions in a client-facing environment.


Key Responsibilities:

  • Design, develop, and deploy backend services and APIs using Python.
  • Build and maintain scalable infrastructure on AWS (EC2, S3, Lambda, RDS, etc.).
  • Automate deployments and infrastructure with Terraform and Jenkins/GitHub Actions.
  • Implement containerized environments using Docker and manage orchestration via Kubernetes.
  • Write automation and scripting solutions in Bash/Shell to streamline operations.
  • Work with relational databases like MySQL, including SQL query optimization.
  • Collaborate directly with clients to understand requirements and provide technical solutions.
  • Ensure system reliability, performance, and scalability across environments.
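The query-optimization work mentioned above can be demonstrated with the standard library's sqlite3 (standing in for MySQL to keep the sketch self-contained); the `orders` table and index name are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

# Without an index, filtering by customer_id scans the whole table.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# EXPLAIN QUERY PLAN confirms the optimizer now uses the index.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = ?", (42,)
).fetchall()
plan_text = " ".join(str(row) for row in plan)
```

The same discipline (inspect the plan, then add or adjust indexes) carries over to MySQL's `EXPLAIN`.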


Required Skills:

  • 3.5+ years of hands-on experience in Python development.
  • Strong expertise in AWS services such as EC2, Lambda, S3, RDS, IAM, CloudWatch.
  • Good understanding of Terraform or other Infrastructure as Code tools.
  • Proficient with Docker and container orchestration using Kubernetes.
  • Experience with CI/CD tools like Jenkins or GitHub Actions.
  • Strong command of SQL/MySQL and scripting with Bash/Shell.
  • Experience working with external clients or in client-facing roles.


Preferred Qualifications:

  • AWS Certification (e.g., AWS Certified Developer or DevOps Engineer).
  • Familiarity with Agile/Scrum methodologies.
  • Strong analytical and problem-solving skills.
  • Excellent communication and stakeholder management abilities.


Read more
CoffeeBeans
Posted by Nikita Sinha
Bengaluru (Bangalore), Pune, Hyderabad
5 - 8 yrs
Upto ₹28L / yr (Varies)
Apache Spark
skill iconScala
skill iconPython

Focus Areas:

  • Build applications and solutions that process and analyze large-scale data.
  • Develop data-driven applications and analytical tools.
  • Implement business logic, algorithms, and backend services.
  • Design and build APIs for secure and efficient data exchange.

Key Responsibilities:

  • Develop and maintain data processing applications using Apache Spark and Hadoop.
  • Write MapReduce jobs and complex data transformation logic.
  • Implement machine learning models and analytics solutions for business use cases.
  • Optimize code for performance and scalability; perform debugging and troubleshooting.
  • Work hands-on with Databricks for data engineering and analysis.
  • Design and manage Airflow DAGs for orchestration and automation.
  • Integrate and maintain CI/CD pipelines (preferably using Jenkins).
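The MapReduce model behind those jobs can be sketched in plain Python; the sample lines are made up, and a real job would distribute the map and reduce phases across a cluster rather than run them in one process:

```python
from collections import Counter
from functools import reduce

lines = ["spark makes big data simple", "big data big insights"]

# Map phase: emit (word, 1) pairs per line, analogous to a mapper task.
mapped = [(word, 1) for line in lines for word in line.split()]

# Reduce phase: merge per-word counts, analogous to the reducer.
def merge(acc: Counter, pair: tuple) -> Counter:
    acc[pair[0]] += pair[1]
    return acc

counts = reduce(merge, mapped, Counter())
```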

Primary Skills & Qualifications:

  • Strong programming skills in Scala and Python.
  • Expertise in Apache Spark for large-scale data processing.
  • Solid understanding of data structures and algorithms.
  • Proven experience in application development and software engineering best practices.
  • Experience working in agile and collaborative environments.


Read more
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Mohali, Dehradun, Panchkula, Chennai
6 - 14 yrs
₹12L - ₹28L / yr
Test Automation (QA)
skill iconKubernetes
helm
skill iconDocker
skill iconAmazon Web Services (AWS)
+13 more

Job Title : Senior QA Automation Architect (Cloud & Kubernetes)

Experience : 6+ Years

Location : India (Multiple Offices)

Shift Timings : 12 PM to 9 PM (Noon Shift)

Working Days : 5 Days WFO (NO Hybrid)


About the Role :

We’re looking for a Senior QA Automation Architect with deep expertise in cloud-native systems, Kubernetes, and automation frameworks.

You’ll design scalable test architectures, enhance automation coverage, and ensure product reliability across hybrid-cloud and distributed environments.


Key Responsibilities :

  • Architect and maintain test automation frameworks for microservices.
  • Integrate automated tests into CI/CD pipelines (Jenkins, GitHub Actions).
  • Ensure reliability, scalability, and observability of test systems.
  • Work closely with DevOps and Cloud teams to streamline automation infrastructure.

Mandatory Skills :

  • Kubernetes, Helm, Docker, Linux
  • Cloud Platforms : AWS / Azure / GCP
  • CI/CD Tools : Jenkins, GitHub Actions
  • Scripting : Python, Pytest, Bash
  • Monitoring & Performance : Prometheus, Grafana, Jaeger, K6
  • IaC Practices : Terraform / Ansible

Good to Have :

  • Experience with Service Mesh (Istio/Linkerd).
  • Container Security or DevSecOps exposure.
Read more
Global Digital Transformation Solutions Provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Hyderabad, Noida, Mumbai, Navi Mumbai, Ahmedabad, Chennai, Coimbatore, Gurugram, Kochi (Cochin), Kolkata, Calcutta, Pune, Thiruvananthapuram, Trivandrum
7 - 15 yrs
₹15L - ₹30L / yr
skill iconAmazon Web Services (AWS)
skill iconPython
Data Lake

SENIOR DATA ENGINEER:

ROLE SUMMARY:

Own the design and delivery of petabyte-scale data platforms and pipelines across AWS and modern Lakehouse stacks. You’ll architect, code, test, optimize, and operate ingestion, transformation, storage, and serving layers. This role requires autonomy, strong engineering judgment, and partnership with project managers, infrastructure teams, testers, and customer architects to land secure, cost-efficient, and high-performing solutions.



RESPONSIBILITIES:

  • Architecture and design: Create HLD/LLD/SAD, source–target mappings, data contracts, and optimal designs aligned to requirements.
  • Pipeline development: Build and test robust ETL/ELT for batch, micro-batch, and streaming across RDBMS, flat files, APIs, and event sources.
  • Performance and cost tuning: Profile and optimize jobs, right-size infrastructure, and model license/compute/storage costs.
  • Data modeling and storage: Design schemas and SCD strategies; manage relational, NoSQL, data lakes, Delta Lakes, and Lakehouse tables.
  • DevOps and release: Establish coding standards, templates, CI/CD, configuration management, and monitored release processes.
  • Quality and reliability: Define DQ rules and lineage; implement SLA tracking, failure detection, RCA, and proactive defect mitigation.
  • Security and governance: Enforce IAM best practices, retention, audit/compliance; implement PII detection and masking.
  • Orchestration: Schedule and govern pipelines with Airflow and serverless event-driven patterns.
  • Stakeholder collaboration: Clarify requirements, present design options, conduct demos, and finalize architectures with customer teams.
  • Leadership: Mentor engineers, set FAST goals, drive upskilling and certifications, and support module delivery and sprint planning.
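The SCD strategies mentioned in the responsibilities above can be illustrated with a pure-Python Type 2 sketch; `scd2_upsert`, the field names, and the sample history are assumptions for illustration, not a reference implementation:

```python
from datetime import date

def scd2_upsert(history: list, key: str, new_row: dict, today: date) -> list:
    """Close the current version of `key` and append the new one (SCD Type 2)."""
    for row in history:
        if row["key"] == key and row["end_date"] is None:
            if {k: row.get(k) for k in new_row} == new_row:
                return history  # no change: keep the current version open
            row["end_date"] = today  # close the old version
    history.append({"key": key, **new_row, "start_date": today, "end_date": None})
    return history

history = [{"key": "cust-1", "city": "Pune",
            "start_date": date(2023, 1, 1), "end_date": None}]
history = scd2_upsert(history, "cust-1", {"city": "Hyderabad"}, date(2024, 6, 1))
```

The same close-and-append pattern maps onto Delta Lake or warehouse MERGE statements at scale.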



REQUIRED QUALIFICATIONS:

  • Experience: 15+ years designing distributed systems at petabyte scale; 10+ years building data lakes and multi-source ingestion.
  • Cloud (AWS): IAM, VPC, EC2, EKS/ECS, S3, RDS, DMS, Lambda, CloudWatch, CloudFormation, CloudTrail.
  • Programming: Python (preferred), PySpark, SQL for analytics, window functions, and performance tuning.
  • ETL tools: AWS Glue, Informatica, Databricks, GCP DataProc; orchestration with Airflow.
  • Lakehouse/warehousing: Snowflake, BigQuery, Delta Lake/Lakehouse; schema design, partitioning, clustering, performance optimization.
  • DevOps/IaC: Terraform with 15+ years of practice; CI/CD (GitHub Actions, Jenkins) with 10+ years; config governance and release management.
  • Serverless and events: Design event-driven distributed systems on AWS.
  • NoSQL: 2–3 years with DocumentDB including data modeling and performance considerations.
  • AI services: AWS Entity Resolution, AWS Comprehend; run custom LLMs on Amazon SageMaker; use LLMs for PII classification.



NICE-TO-HAVE QUALIFICATIONS:

  • Data governance automation: 10+ years defining audit, compliance, retention standards and automating governance workflows.
  • Table and file formats: Apache Parquet; Apache Iceberg as analytical table format.
  • Advanced LLM workflows: RAG and agentic patterns over proprietary data; re-ranking with index/vector store results.
  • Multi-cloud exposure: Azure ADF/ADLS, GCP Dataflow/DataProc; FinOps practices for cross-cloud cost control.



OUTCOMES AND MEASURES:

  • Engineering excellence: Adherence to processes, standards, and SLAs; reduced defects and non-compliance; fewer recurring issues.
  • Efficiency: Faster run times and lower resource consumption with documented cost models and performance baselines.
  • Operational reliability: Faster detection, response, and resolution of failures; quick turnaround on production bugs; strong release success.
  • Data quality and security: High DQ pass rates, robust lineage, minimal security incidents, and audit readiness.
  • Team and customer impact: On-time milestones, clear communication, effective demos, improved satisfaction, and completed certifications/training.



LOCATION AND SCHEDULE:

●      Location: Outside US (OUS).

●      Schedule: Minimum 6 hours of overlap with US time zones.

Read more
Sonatype
Posted by Reshika Mendiratta
Hyderabad
4 - 8 yrs
Upto ₹18L / yr (Varies)
skill iconJava
Selenium
cypress
playwright
Appium
+1 more

We are seeking a Software Engineer in Test to join our Quality Engineering team. In this role, you will be responsible for designing, developing, and maintaining automation frameworks to enhance our test coverage and ensure the delivery of high-quality software. You will collaborate closely with developers, product managers, and other stakeholders to drive test automation strategies and improve software reliability.


Key Responsibilities

● Design, develop, and maintain robust test automation frameworks for web, API, and backend services.

● Implement automated test cases to improve software quality and test coverage.

● Develop and execute performance and load tests to ensure the application behaves reliably in self-hosted environments.

● Integrate automated tests into CI/CD pipelines to enable continuous testing.

● Collaborate with software engineers to define test strategies, acceptance criteria, and quality standards.

● Conduct performance, security, and regression testing to ensure application stability.

● Investigate test failures, debug issues, and work with development teams to resolve defects.

● Advocate for best practices in test automation, code quality, and software reliability.

● Stay updated with industry trends and emerging technologies in software testing.
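An in-process API test of the kind described above can be built with only the standard library; `HealthHandler` and the `/health` endpoint are hypothetical stand-ins for a real service under test:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"status": "ok"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep test output quiet

# Bind to port 0 so the OS assigns a free port (avoids collisions in CI).
server = HTTPServer(("127.0.0.1", 0), HealthHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/health"
with urllib.request.urlopen(url) as resp:
    status_code = resp.status
    payload = json.loads(resp.read())

server.shutdown()
```

Tools like RestAssured or Postman follow the same request/assert cycle against a deployed service.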


Qualifications & Experience

● Bachelor's or Master’s degree in Computer Science, Engineering, or a related field.

● 3+ years of experience in software test automation.

● Proficiency in programming languages such as Java, Python, or JavaScript.

● Hands-on experience with test automation tools like Selenium, Cypress, Playwright, or similar.

● Strong knowledge of API testing using tools such as Postman, RestAssured, or Karate.

● Experience with CI/CD tools such as Jenkins, GitHub Actions, or GitLab CI/CD.

● Understanding of containerization and cloud technologies (Docker, Kubernetes, AWS, or similar).

● Familiarity with performance testing tools like JMeter or Gatling is a plus.

● Excellent problem-solving skills and attention to detail.

● Strong communication and collaboration skills.

Read more
Financial Services

Agency job
via Jobdost by Saida Jabbar
Bengaluru (Bangalore), Hyderabad
8 - 12 yrs
₹40L - ₹45L / yr
Systems architecture
SOW
System integration
Solution architecture
skill iconPython
+3 more

Position Overview


We are seeking an experienced Solutions Architect to lead the technical design and implementation strategy for our finance automation platform. This role sits at the intersection of business requirements, technical architecture, and implementation excellence. You will be responsible for translating complex Statement of Work (SOW) requirements into comprehensive technical designs while mentoring implementation engineers and driving platform evolution.

 

Key Responsibilities

Solution Design & Architecture

1. Translate SOW requirements into detailed C4 architecture models and Business Process Canvas documentation

2. Design end-to-end solutions for complex finance automation workflows including reconciliations, book closure, and financial reporting

3. Create comprehensive technical specifications for custom development initiatives

4. Establish architectural standards and best practices for finance domain solutions

Technical Leadership & Mentorship

1. Mentor Implementation Engineers on solution design, technical approaches, and best practices

2. Lead technical reviews and ensure solution quality across all implementations

3. Provide guidance on complex technical challenges and architectural decisions

4. Foster knowledge sharing and technical excellence within the solutions team

Platform Strategy & Development

1. Make strategic decisions on when to push feature development to the Platform Team vs. custom implementation

2. Interface with Implementation Support team to assess platform gaps and enhancement opportunities

3. Collaborate with Program Managers to track and prioritize new platform feature development

4. Contribute to product roadmap discussions based on client requirements and market trends

Client Engagement & Delivery

1. Lead technical discussions with enterprise clients during pre-sales and implementation phases

2. Design scalable solutions that align with client's existing technology stack and future roadmap

3. Ensure solutions comply with financial regulations (Ind AS/IFRS/GAAP) and industry standards

4. Drive technical aspects of complex implementations from design through go-live

 

Required Qualifications

Technical Expertise

● 8+ years of experience in solution architecture, preferably in fintech or enterprise software

● Strong expertise in system integration, API design, and microservices architecture

● Proficiency in C4 modeling and architectural documentation standards

● Experience with Business Process Management (BPM) and workflow design

● Advanced knowledge of data architecture, ETL pipelines, and real-time data processing

● Strong programming skills in Python, Java, or similar languages

● Experience with cloud platforms (AWS, Azure, GCP) and containerization technologies.


Financial Domain Knowledge

● Deep understanding of finance and accounting principles (Ind AS/IFRS/GAAP)

● Experience with financial systems integration (ERP, GL, AP/AR systems)

● Knowledge of financial reconciliation processes and automation strategies

● Understanding of regulatory compliance requirements in financial services
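The reconciliation processes noted above reduce, at their core, to matching entries across two sources; a toy sketch where `reconcile`, the `(ref, amount)` key, and the sample entries are all illustrative:

```python
def reconcile(ledger: list, bank: list) -> dict:
    """Match ledger entries to bank lines on (reference, amount)."""
    bank_keys = {(b["ref"], b["amount"]) for b in bank}
    ledger_keys = {(l["ref"], l["amount"]) for l in ledger}
    return {
        "matched": sorted(ledger_keys & bank_keys),
        "ledger_only": sorted(ledger_keys - bank_keys),  # needs investigation
        "bank_only": sorted(bank_keys - ledger_keys),    # unrecorded receipts
    }

ledger = [{"ref": "INV-1", "amount": 100.0}, {"ref": "INV-2", "amount": 250.0}]
bank = [{"ref": "INV-1", "amount": 100.0}, {"ref": "INV-3", "amount": 75.0}]
result = reconcile(ledger, bank)
```

Production platforms layer tolerance windows, many-to-one matching, and audit trails on top of this core set logic.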

Leadership & Communication

● Proven experience mentoring technical teams and driving technical excellence

● Strong stakeholder management skills with ability to communicate with C-level executives

● Experience working in agile environments with cross-functional teams

● Excellent technical documentation and presentation skills

 

Preferred Qualifications

● Master's degree in Computer Science, Engineering, or related technical field

● Experience with finance automation platforms (Blackline, Trintech, Anaplan, etc.)

● Certification in enterprise architecture frameworks (TOGAF, Zachman)

● Experience with data visualization tools (Power BI, Tableau, Looker)

● Background in SaaS platform development and multi-tenant architectures

● Experience with DevOps practices and CI/CD pipeline design

● Knowledge of machine learning applications in finance automation.

 

Skills & Competencies


Technical Skills

● Solution architecture and system design

● C4 modeling and architectural documentation

● API design and integration patterns

● Cloud-native architecture and microservices

● Data architecture and pipeline design

● Programming and scripting languages

Financial & Business Skills

● Financial process automation

● Business process modeling and optimization

● Regulatory compliance and risk management

● Enterprise software implementation

● Change management and digital transformation

Leadership Skills

● Technical mentorship and team development

● Strategic thinking and decision making

● Cross-functional collaboration

● Client relationship management

● Project and program management

Soft Skills

● Critical thinking and problem-solving

● Cross-functional collaboration

● Task and project management

● Stakeholder management

● Team leadership

● Technical documentation

● Communication with technical and non-technical stakeholders


Mandatory Criteria:  

● Looking for candidates who are Solution Architects in Finance from Product Companies.

● The candidate should have worked in Fintech for at least 4–5 years.

● Candidate should have Strong Technical and Architecture skills with Finance Exposure.

● Candidate should be from Product companies.

● Candidate should have 8+ years’ experience in solution architecture, preferably in fintech or enterprise software.

● Candidate should have Proficiency in Python, Java (or similar languages) and hands-on with cloud platforms (AWS/Azure/GCP) & containerization (Docker/Kubernetes).

● Candidate should have Deep knowledge of finance & accounting principles (Ind AS/IFRS/GAAP) and financial system integrations (ERP, GL, AP/AR).

● Candidate should have Expertise in system integration, API design, microservices, and C4 modeling.

● Candidate should have Experience in financial reconciliations, automation strategies, and regulatory compliance.

● Candidate should be Strong in problem-solving, cross-functional collaboration, project management, documentation, and communication.

● Candidate should have Proven experience in mentoring technical teams and driving excellence.

Read more
Tata Consultancy Services
Bengaluru (Bangalore), Hyderabad, Pune, Delhi, Kolkata, Chennai
5 - 8 yrs
₹7L - ₹30L / yr
skill iconScala
skill iconPython
PySpark
Apache Hive
Spark
+3 more

Skills and competencies:

Required:

  • Strong analytical skills in conducting sophisticated statistical analysis using bureau/vendor data, customer performance data, and macro-economic data to solve business problems.
  • Working experience in PySpark and Scala to develop, validate, and implement models and code in Credit Risk/Banking.
  • Experience with distributed systems such as Hadoop/MapReduce, Spark, streaming data processing, and cloud architecture.
  • Familiarity with machine learning frameworks and libraries (e.g., scikit-learn, SparkML, TensorFlow, PyTorch).
  • Experience in systems integration, web services, and batch processing.
  • Experience migrating code to PySpark/Scala is a big plus.
  • Ability to act as a liaison, conveying the information needs of the business to IT and data constraints to the business, with equal fluency in business strategy and IT strategy, business processes, and workflow.
  • Flexibility in approach and thought process.
  • Attitude to learn and comprehend periodic changes in regulatory requirements as per the FED.

Read more
Auxo AI
kusuma Gullamajji
Posted by kusuma Gullamajji
Bengaluru (Bangalore), Gurugram, Hyderabad, Mumbai
8 - 12 yrs
₹20L - ₹35L / yr
skill iconPython
SQL
skill iconPostgreSQL
Apache Kafka

Responsibilities

  • Design, develop, and maintain backend systems and RESTful APIs using Python (Django, FastAPI, or Flask)
  • Build real-time communication features using WebSockets, SSE, and async IO
  • Implement event-driven architectures using messaging systems like Kafka, RabbitMQ, Redis Streams, or NATS
  • Develop and maintain microservices that interact over messaging and streaming protocols
  • Ensure high scalability and availability of backend services
  • Collaborate with frontend developers, DevOps engineers, and product managers to deliver end-to-end solutions
  • Write clean, maintainable code with unit/integration tests
  • Lead technical discussions, review code, and mentor junior engineers
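The event-driven architecture mentioned above can be sketched with an in-memory bus; `EventBus` is an illustrative stand-in for the role Kafka, RabbitMQ, or Redis Streams play in production, not a client for any of them:

```python
from collections import defaultdict

class EventBus:
    """Minimal in-memory pub/sub: publishers and subscribers are decoupled."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict):
        # Deliver the event to every handler registered for the topic.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
received = []
bus.subscribe("orders.created", received.append)
bus.publish("orders.created", {"order_id": 7})
bus.publish("payments.settled", {"payment_id": 1})  # no subscriber: dropped
```

A real broker adds what this sketch omits: persistence, delivery guarantees, consumer groups, and backpressure.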


Requirements

  • 8+ years of backend development experience, with at least 8 years in Python
  • Strong experience with asynchronous programming in Python (e.g., asyncio, aiohttp, FastAPI)
  • Production experience with WebSockets and Server-Sent Events
  • Hands-on experience with at least one messaging system: Kafka, RabbitMQ, Redis Pub/Sub, or similar
  • Proficient in RESTful API design and microservices architecture
  • Solid experience with relational and NoSQL databases
  • Familiarity with Docker and container-based deployment
  • Strong understanding of API security, authentication, and performance optimization


Nice to Have

  • Experience with GraphQL or gRPC
  • Familiarity with stream processing frameworks (e.g., Apache Flink, Spark Streaming)
  • Cloud experience (AWS, GCP, Azure), particularly with managed messaging or pub/sub services
  • Knowledge of CI/CD and infrastructure as code
  • Exposure to AI engineering workflows and tools
  • Interest or experience in building Agentic AI systems or integrating backends with AI agents

Read more
NeoGenCode Technologies Pvt Ltd
Bengaluru (Bangalore), Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Mumbai, Pune, Hyderabad
4 - 8 yrs
₹18L - ₹30L / yr
skill iconJava
skill iconSpring Boot
skill iconAmazon Web Services (AWS)
RESTful APIs
CI/CD
+3 more

Job Overview:

We are looking for a skilled Senior Backend Engineer to join our team. The ideal candidate will have a strong foundation in Java and Spring, with proven experience in building scalable microservices and backend systems. This role also requires familiarity with automation tools, Python development, and working knowledge of AI technologies.


Responsibilities:


  • Design, develop, and maintain backend services and microservices.
  • Build and integrate RESTful APIs across distributed systems.
  • Ensure performance, scalability, and reliability of backend systems.
  • Collaborate with cross-functional teams and participate in agile development.
  • Deploy and maintain applications on AWS cloud infrastructure.
  • Contribute to automation initiatives and AI/ML feature integration.
  • Write clean, testable, and maintainable code following best practices.
  • Participate in code reviews and technical discussions.


Required Skills:

  • 4+ years of backend development experience.
  • Strong proficiency in Java and Spring/Spring Boot frameworks.
  • Solid understanding of microservices architecture.
  • Experience with REST APIs, CI/CD, and debugging complex systems.
  • Proficient in AWS services such as EC2, Lambda, S3.
  • Strong analytical and problem-solving skills.
  • Excellent communication in English (written and verbal).


Good to Have:

  • Experience with automation tools like Workato or similar.
  • Hands-on experience with Python development.
  • Familiarity with AI/ML features or API integrations.
  • Comfortable working with US-based teams (flexible hours).


Read more
Meltwater
Posted by Reshika Mendiratta
Hyderabad
8yrs+
Upto ₹65L / yr (Varies)
skill iconPython
Artificial Intelligence (AI)
skill iconMachine Learning (ML)
PyTorch
TensorFlow
+6 more

What We’re Looking For

As a Senior AI/ML Engineer at Meltwater, you’ll play a vital role in building cutting-edge social solutions for our global client base within the Explore mission. We’re seeking a proactive, quick-learning engineer who thrives in a collaborative environment.

Our culture values continuous learning, team autonomy, and a DevOps mindset. Meltwater development teams take full ownership of their subsystems and infrastructure, including running on-call rotations.

With a heavy reliance on Software Engineering in AI/ML and Data Science, we seek individuals with experience in:

  • Cloud infrastructure and containerisation (Docker, Azure or AWS – Azure preferred)
  • Data preparation
  • Model lifecycle (training, serving, registries)
  • Natural Language Processing (NLP) and Large Language Models (LLMs)

In this role, you’ll have the opportunity to:

  • Push the boundaries of our technology stack
  • Modify open-source libraries
  • Innovate with existing technologies
  • Work on distributed systems at scale
  • Extract insights from vast amounts of data

What You’ll Do

  • Lead and mentor a small team while doing hands-on coding.
  • Demonstrate excellent communication and collaboration skills.

What You’ll Bring

  • Bachelor’s or Master’s degree in Computer Science (or equivalent) OR demonstrable experience.
  • Proven experience as a Lead Software Engineer in AI/ML and Data Science.
  • 8+ years of working experience.
  • 2+ years of leadership experience as Tech Lead or Team Lead.
  • 5+ years strong knowledge of Python and software engineering principles.
  • 5+ years strong knowledge of cloud infrastructure and containerization.
  • Docker (required).
  • Azure or AWS (required, Azure preferred).
  • 5+ years strong working knowledge of TensorFlow / PyTorch.
  • 3+ years good working knowledge of ML-Ops principles.
  • Data preparation.
  • Model lifecycle (training, serving, registries).
  • Theoretical knowledge of AI / Data Science in one or more of:
  • Natural Language Processing (NLP) and LLMs
  • Neural Networks
  • Topic modelling and clustering
  • Time Series Analysis (TSA): anomaly detection, trend analysis, forecasting
  • Retrieval Augmented Generation
  • Speech to Text
  • Excellent communication and collaboration skills.
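
One of the topic areas listed above, time-series anomaly detection, can be illustrated with a minimal rolling z-score sketch. This is a toy illustration only, not Meltwater's actual approach; the window size and threshold are arbitrary assumptions.

```python
import statistics

def rolling_zscore_anomalies(series, window=5, threshold=3.0):
    """Flag indices whose z-score against the preceding window exceeds threshold."""
    anomalies = []
    for i in range(window, len(series)):
        prior = series[i - window:i]
        mean = statistics.mean(prior)
        stdev = statistics.stdev(prior)
        if stdev == 0:
            continue  # flat window: z-score is undefined
        z = (series[i] - mean) / stdev
        if abs(z) > threshold:
            anomalies.append(i)
    return anomalies

# A flat-ish series with one obvious spike at index 7
data = [10, 11, 10, 12, 11, 10, 11, 40, 11, 10]
print(rolling_zscore_anomalies(data))  # → [7]
```

Real systems would typically use seasonal decomposition or learned models rather than a fixed threshold, but the windowed-baseline idea is the same.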

What We Offer

  • Flexible paid time off options for enhanced work-life balance.
  • Comprehensive health insurance tailored for you.
  • Employee assistance programs covering mental health, legal, financial, wellness, and behavioural support.
  • Complimentary Calm App subscription for you and your loved ones.
  • Energetic work environment with a hybrid work style.
  • Family leave program that grows with your tenure.
  • Inclusive community with professional development opportunities.

Our Story

At Meltwater, we believe that when you have the right people in the right environment, great things happen.

Our best-in-class technology empowers 27,000 customers worldwide to make better business decisions through data. But we can’t do that without our global team of developers, innovators, problem-solvers, and high-performers who embrace challenges and find new solutions.

Our award-winning global culture drives everything we do. Employees can make an impact, learn every day, feel a sense of belonging, and celebrate successes together.

We are innovators at the core who see potential in people, ideas, and technologies. Together, we challenge ourselves to go big, be bold, and build best-in-class solutions.

  • 2,200+ employees
  • 50 locations across 25 countries

We are Meltwater. We love working here, and we think you will too.

"Inspired by innovation, powered by people."

Read more
Meltwater

at Meltwater

2 candid answers
1 video
Reshika Mendiratta
Posted by Reshika Mendiratta
Hyderabad
2yrs+
Best in industry
skill iconPython
Artificial Intelligence (AI)
skill iconMachine Learning (ML)
PyTorch
TensorFlow
+6 more

What You’ll Do:

As an AI/ML Engineer at Meltwater, you’ll play a vital role in building cutting-edge social solutions for our global client base within the Explore mission. We’re seeking a proactive, quick-learning engineer who thrives in a collaborative environment. Our culture values continuous learning, team autonomy, and a DevOps mindset.

Meltwater development teams take full ownership of their subsystems and infrastructure, including running on-call rotations. With a heavy reliance on Software Engineering in AI/ML and Data Science, we seek individuals with experience in:

  • Cloud infrastructure and containerization (Docker, Azure or AWS is required; Azure is preferred)
  • Data Preparation
  • Model Lifecycle (training, serving, and registries)
  • Natural Language Processing (NLP) and LLMs

In this role, you’ll have the opportunity to push the boundaries of our technology stack, from modifying open-source libraries to innovating with existing technologies. If you’re passionate about distributed systems at scale and finding new ways to extract insights from vast amounts of data, we invite you to join us in this exciting journey.


What You’ll Bring:

  • Bachelor’s or Master’s degree in Computer Science (or equivalent), or demonstrable experience.
  • Proven experience as a Software Engineer in AI/ML and Data Science.
  • 2-4 years of working experience.
  • Strong working experience in Python and software engineering principles (2+ Years).
  • Experience with cloud infrastructure and containerization (1+ Years).
  • Docker is required.
  • Experience with TensorFlow / PyTorch (2+ Years).
  • Experience with ML-Ops Principles (1+ Years).
  • Data Preparation
  • Model Lifecycle (training, serving, and registries)
  • Sound knowledge on any cloud (AWS/Azure).
  • Good theoretical knowledge of AI / Data Science in one or more of the following areas:
  • Natural Language Processing (NLP) and LLMs
  • Neural Networks
  • Topic Modelling and Clustering
  • Time Series Analysis (TSA), including anomaly detection, trend analysis, and forecasting
  • Retrieval Augmented Generation
  • Speech to Text
  • Excellent communication and collaboration skills


What We Offer:

  • Enjoy comprehensive paid time off options for enhanced work-life balance.
  • Comprehensive health insurance tailored for you.
  • Employee assistance programs covering mental health, legal, financial, wellness, and behavioural areas to ensure your overall well-being.
  • Energetic work environment with a hybrid work style, providing the balance you need.
  • Benefit from our family leave program, which grows with your tenure at Meltwater.
  • Thrive within our inclusive community and seize ongoing professional development opportunities to elevate your career.


Where You’ll Work:

HITEC City, Hyderabad.


Our Story:

The sky is the limit at Meltwater.

At Meltwater, we believe that when you have the right people in the right working environment, great things happen. Our best-in-class technology empowers our 27,000 customers around the world to analyse over a billion pieces of data each day and make better business decisions.

Our award-winning culture is our north star and drives everything we do – from striving to create an environment where all employees do their best work, to delivering customer value by continuously innovating our products — and making sure to celebrate our successes and have fun along the way.

We’re proud of our diverse team of 2,300+ employees in 50 locations across 25 countries around the world. No matter where you are, you’ll work with people who care about your success and get the support you need to reach your goals.

So, in a nutshell, that’s Meltwater. We love working here, and we think you will too.

Read more
Hyderabad
10 - 15 yrs
₹15L - ₹18L / yr
Test Automation (QA)
skill iconJava
skill iconPython
skill iconJavascript
CI/CD
+3 more

Role: Mobile Automation Engineer (SDET) — On-site, India

Role & Responsibilities

  • Design, build and maintain scalable mobile test automation frameworks for Android and iOS using Appium, Espresso, XCUITest or equivalent tools to support continuous delivery.
  • Create and own automated test suites (functional, regression, UI, and smoke) that run reliably in CI/CD pipelines (Jenkins/GitHub Actions) and on cloud device farms (BrowserStack/Sauce Labs).
  • Collaborate with Developers and Product Owners to translate requirements into test strategies, write robust test cases, and automate end-to-end and integration scenarios (including API tests).
  • Investigate, triage, and debug failures — use device logs, ADB, Xcode traces, and performance tools to isolate flakiness and reliability issues and drive fixes.
  • Integrate automated tests into build pipelines, enforce quality gates, and provide actionable reporting and metrics for release readiness.
  • Advocate and implement test automation best practices: code quality, modular frameworks, reusability, CI parallelization, and maintainable test data strategies.

Skills & Qualifications

  • Must-Have
  • 3+ years in mobile QA/automation with hands-on experience in Appium or native frameworks (Espresso/XCUITest) across Android and iOS.
  • Strong programming skills in Java/Kotlin or Swift and working knowledge of Python or JavaScript for scripting and test tooling.
  • Experience integrating automated suites into CI/CD (Jenkins/GitHub Actions) and executing on real & virtual device clouds (BrowserStack/Sauce Labs).
  • Practical experience with API testing (REST), test frameworks (TestNG/JUnit/Mocha), and source control (Git).
  • Solid debugging skills using ADB, Xcode, Android SDK, and familiarity with mobile performance profiling.
  • Preferred
  • Experience building custom automation frameworks, parallel test execution, and reliability/flakiness reduction strategies.
  • Knowledge of CI orchestration, containerized test runners, and mobile security or accessibility testing.
  • ISTQB or equivalent QA certification, prior experience in Agile/Scrum teams, and exposure to device lab management.


Read more
Syrencloud

at Syrencloud

3 recruiters
Sudheer Kumar
Posted by Sudheer Kumar
Remote, Hyderabad
3 - 10 yrs
₹10L - ₹30L / yr
Microsoft Fabric
ADF
Synapse
databricks
Microsoft Windows Azure
+5 more

We are seeking a highly skilled Fabric Data Engineer with strong expertise in the Azure ecosystem to design, build, and maintain scalable data solutions. The ideal candidate will have hands-on experience with Microsoft Fabric, Databricks, Azure Data Factory, PySpark, SQL, and other Azure services to support advanced analytics and data-driven decision-making.


Key Responsibilities

  • Design, develop, and maintain scalable data pipelines using Microsoft Fabric and Azure data services.
  • Implement data integration, transformation, and orchestration workflows with Azure Data Factory, Databricks, and PySpark.
  • Work with stakeholders to understand business requirements and translate them into robust data solutions.
  • Optimize performance and ensure data quality, reliability, and security across all layers.
  • Develop and maintain data models, metadata, and documentation to support analytics and reporting.
  • Collaborate with data scientists, analysts, and business teams to deliver insights-driven solutions.
  • Stay updated with emerging Azure and Fabric technologies to recommend best practices and innovations.

Required Skills & Experience

  • Proven experience as a Data Engineer with strong expertise in the Azure cloud ecosystem.

Hands-on experience with:

  • Microsoft Fabric
  • Azure Databricks
  • Azure Data Factory (ADF)
  • PySpark & Python
  • SQL (T-SQL/PL-SQL)
  • Solid understanding of data warehousing, ETL/ELT processes, and big data architectures.
  • Knowledge of data governance, security, and compliance within Azure.
  • Strong problem-solving, debugging, and performance tuning skills.
  • Excellent communication and collaboration abilities.
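
The ETL/ELT processes mentioned above can be sketched, independent of any Azure service, as a three-stage pipeline in plain Python. This is a simplified illustration; the record fields and cleaning rules are invented for the example, and real pipelines would run on ADF, Databricks, or Fabric.

```python
def extract():
    # Stand-in for reading from a source system (file, API, data lake)
    return [
        {"id": 1, "amount": "120.5", "region": "south"},
        {"id": 2, "amount": "bad", "region": "north"},   # malformed row
        {"id": 3, "amount": "89.0", "region": "south"},
    ]

def transform(rows):
    # Cast types, drop rows that fail validation, normalize casing
    clean = []
    for row in rows:
        try:
            clean.append({"id": row["id"],
                          "amount": float(row["amount"]),
                          "region": row["region"].upper()})
        except ValueError:
            continue  # quarantine/skip malformed records
    return clean

def load(rows, sink):
    # Stand-in for writing to a warehouse table
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded, warehouse[0]["region"])  # → 2 SOUTH
```

The same extract/transform/load separation is what ADF pipelines and Databricks notebooks encode at scale, with data-quality rules replacing the toy `ValueError` check here.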

 

Preferred Qualifications

  • Microsoft Certified: Fabric Analytics Engineer Associate / Azure Data Engineer Associate.
  • Experience with Power BI, Delta Lake, and Lakehouse architecture.
  • Exposure to DevOps, CI/CD pipelines, and Git-based version control.
Read more
Highfly Sourcing

at Highfly Sourcing

2 candid answers
Highfly Hr
Posted by Highfly Hr
Dubai, Augsburg, Germany, Zaragoza (Spain), Qatar, Salalah (Oman), Kuwait, Lebanon, Marseille (France), Genova (Italy), Winnipeg (Canada), Denmark, Poznan (Poland), Bengaluru (Bangalore), Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Mumbai, Hyderabad, Pune
3 - 10 yrs
₹25L - ₹30L / yr
skill iconVue.js
skill iconAngularJS (1.x)
skill iconAngular (2+)
skill iconReact.js
skill iconJavascript
+14 more

Job Description

We are looking for a talented Java Developer for roles abroad. You will be responsible for developing high-quality software solutions, working on server-side components and integrations, and ensuring optimal performance and scalability.


Preferred Qualifications

  • Experience with microservices architecture.
  • Knowledge of cloud platforms (AWS, Azure).
  • Familiarity with Agile/Scrum methodologies.
  • Understanding of front-end technologies (HTML, CSS, JavaScript) is a plus.


Requirement Details

Bachelor’s degree in Computer Science, Information Technology, or a related field (or equivalent experience).

Proven experience as a Java Developer or similar role.

Strong knowledge of Java programming language and its frameworks (Spring, Hibernate).

Experience with relational databases (e.g., MySQL, PostgreSQL) and ORM tools.

Familiarity with RESTful APIs and web services.

Understanding of version control systems (e.g., Git).

Solid understanding of object-oriented programming (OOP) principles.

Strong problem-solving skills and attention to detail.

Read more
kbstechsolutions
Lalitha KBS
Posted by Lalitha KBS
Hyderabad
8 - 12 yrs
₹12L - ₹30L / yr
skill iconPython
skill iconDjango
skill iconAmazon Web Services (AWS)
Angular
skill iconAngular (2+)
+3 more

Job Title: Engineering Lead


Role Overview:

We are looking for an Engineering Lead to take end-to-end ownership of technical delivery, design, architecture, and quality for our multi-customer SaaS product. You will lead and mentor the engineering team, drive scalable design and high-quality delivery, manage releases across customer environments, and ensure the stability and performance of the product in production.


Key Responsibilities:

·        Delivery & Release Management: Plan and deliver product features and customer-specific releases on time with high quality, ensuring operational readiness and stability across environments.

·        Technical Design & Architecture: Lead technical design and high-scale architecture for new and existing modules, ensuring scalability, performance, and maintainability.

·        Team Management: Mentor and guide engineers, ensure clarity in priorities, unblock challenges, and foster a culture of ownership and quality within the team.

·        Requirement to Delivery: Work with product and customer teams to understand requirements, translate them into designs and implementation plans, and track them through to delivery.

·        Product Quality: Establish and maintain engineering best practices, code reviews, automated testing, and CI/CD pipelines to ensure high product quality and reliability.

·        Troubleshooting & Support: Lead the team in debugging complex issues in development and production, ensuring minimal downtime and strong customer satisfaction.

·        Hands-on Contribution: Actively contribute technically where needed, providing architectural guidance and coding support aligned with the team’s stack.


Requirements:

·        Experience: 8–12 years in software engineering with at least 3+ years in a lead role.

·        Proven experience in designing scalable, high-performance architectures and technical solutions.

·        Experience delivering multi-customer SaaS product releases, including phased and customer-specific configurations.

·        Strong track record of ensuring product quality and stability through structured processes, testing, and monitoring.

·        Ability to troubleshoot complex issues and guide teams towards resolution.

·        Experience in mentoring and managing engineering teams to drive aligned delivery and high performance.

·        Hands-on experience with your relevant tech stack (e.g., Python, Django, Angular, AWS, Docker, Redis, RabbitMQ).

·        Excellent communication and collaboration skills with Product, QA, and Customer Support teams.

·        Bachelor’s or Master’s degree in Engineering or related field.

Read more
Service Co

Agency job
via Vikash Technologies by Rishika Teja
Hyderabad
6 - 10 yrs
₹10L - ₹25L / yr
skill iconAmazon Web Services (AWS)
Terraform
skill iconKubernetes
skill iconPython
Shell Scripting
+1 more

A strong proficiency in at least one scripting language (e.g., Python, Bash, PowerShell) is required.


Candidates must possess an in-depth ability to design, write, and implement complex automation logic, not just basic scripts.


Proven experience in automating DevOps processes, environment provisioning, and configuration management is essential.


Cloud Platform (AWS Preferred):

  • Extensive hands-on experience with Amazon Web Services (AWS) is highly preferred.


Candidates must be able to demonstrate expert-level knowledge of core AWS services and articulate their use cases.


Excellent debugging and problem-solving skills within the AWS ecosystem are mandatory. The ability to diagnose and resolve issues efficiently is a key requirement.


Infrastructure as Code (IaC - Terraform Preferred):

  • Expert-level knowledge and practical experience with Terraform are required.


Candidates must have a deep understanding of how to write scalable, modular, and reusable Terraform code.


Containerization and Orchestration (Kubernetes Preferred):

  • Advanced, hands-on experience with Kubernetes is mandatory.
  • Candidates must be proficient in solving complex, production-level issues related to deployments, networking, and cluster management.
  • A solid foundational knowledge of Docker is required.
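
The posting's distinction between "complex automation logic" and "basic scripts" can be illustrated with a retry-with-backoff helper, a staple of environment provisioning and configuration scripts. This is a minimal sketch; the simulated API call and the timing values are invented for the example.

```python
import time

def retry(operation, attempts=4, base_delay=0.01):
    """Retry a callable with exponential backoff; re-raise on final failure."""
    for attempt in range(attempts):
        try:
            return operation()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))  # 0.01, 0.02, 0.04, ...

# Simulated flaky provisioning check that succeeds on the third call
calls = {"n": 0}
def flaky_describe_instance():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient API error")
    return "running"

print(retry(flaky_describe_instance))  # → running
```

Wrapping cloud API calls this way keeps provisioning scripts idempotent in the face of transient failures; production code would narrow the caught exception types and add jitter.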

Read more
Mumbai, Pune, Hyderabad, Bengaluru (Bangalore), Panchkula, Mohali
5 - 8 yrs
₹10L - ₹20L / yr
skill iconPython
FastAPI
skill iconFlask
skill iconDjango
skill iconGit

Job Title: Python Developer (FastAPI)

Experience Required: 4+ years

Location: Pune, Bangalore, Hyderabad, Mumbai, Panchkula, Mohali 

Shift: Night Shift 6:30 pm to 3:30 AM IST

About the Role

We are seeking an experienced Python Developer with strong expertise in FastAPI to join our engineering team. The ideal candidate should have a solid background in backend development, RESTful API design, and scalable application development.


Required Skills & Qualifications

· 4+ years of professional experience in backend development with Python.

· Strong hands-on experience with FastAPI (or Flask/Django with migration experience).

· Familiarity with asynchronous programming in Python.

· Working knowledge of version control systems (Git).

· Good problem-solving and debugging skills.

· Strong communication and collaboration abilities.
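
The asynchronous-programming requirement above is the core of how FastAPI handles concurrent requests; the idea can be shown with plain asyncio. This is a standalone sketch, not a FastAPI app, and the simulated I/O delays and user data are invented.

```python
import asyncio

async def fetch_user(uid):
    await asyncio.sleep(0.01)   # stands in for a DB or HTTP call
    return {"id": uid, "name": f"user-{uid}"}

async def handler():
    # Concurrent awaits: total time is roughly one delay, not three
    users = await asyncio.gather(*(fetch_user(i) for i in (1, 2, 3)))
    return [u["name"] for u in users]

print(asyncio.run(handler()))  # → ['user-1', 'user-2', 'user-3']
```

In FastAPI, `async def` endpoints run on this same event loop, which is why blocking calls inside them degrade throughput for every request.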

Read more
IT Industry - Night Shifts

Agency job
Bengaluru (Bangalore), Hyderabad, Mumbai, Navi Mumbai, Pune, Mohali, Delhi
5 - 10 yrs
₹20L - ₹30L / yr
skill iconAmazon Web Services (AWS)
IT infrastructure
skill iconMachine Learning (ML)
DevOps
Automation
+1 more

🚀 We’re Hiring: Senior Cloud & ML Infrastructure Engineer 🚀


We’re looking for an experienced engineer to lead the design, scaling, and optimization of cloud-native ML infrastructure on AWS.

If you’re passionate about platform engineering, automation, and running ML systems at scale, this role is for you.


What you’ll do:

🔹 Architect and manage ML infrastructure with AWS (SageMaker, Step Functions, Lambda, ECR)

🔹 Build highly available, multi-region solutions for real-time & batch inference

🔹 Automate with IaC (AWS CDK, Terraform) and CI/CD pipelines

🔹 Ensure security, compliance, and cost efficiency

🔹 Collaborate across DevOps, ML, and backend teams


What we’re looking for:

✔️ 6+ years AWS cloud infrastructure experience

✔️ Strong ML pipeline experience (SageMaker, ECS/EKS, Docker)

✔️ Proficiency in Python/Go/Bash scripting

✔️ Knowledge of networking, IAM, and security best practices

✔️ Experience with observability tools (CloudWatch, Prometheus, Grafana)


✨ Nice to have: Robotics/IoT background (ROS2, Greengrass, Edge Inference)


📍 Location: Bengaluru, Hyderabad, Mumbai, Pune, Mohali, Delhi

5 days working, Work from Office

Night shifts: 9pm to 6am IST

👉 If this sounds like you (or someone you know), let’s connect!


Apply here:

Read more
An American bank holding company; a community-focused financial institution that provides accessible banking services to its members, operating on a not-for-profit basis.

Agency job
via HyrHub by Shwetha Naik
Hyderabad
6 - 10 yrs
₹12L - ₹13L / yr
skill iconPython
skill iconMachine Learning (ML)
FastAPI
skill iconFlask

Position: AI/ML Python Engineer

Kothapet, Hyderabad (Hybrid; 4 days a week onsite)

Contract-to-hire (full-time with the client).


5+ years of Python experience scripting ML workflows and deploying ML pipelines as real-time, batch, event-triggered, and edge deployments.

4+ years of experience using AWS SageMaker to deploy ML pipelines and ML models, including SageMaker Pipelines, SageMaker MLflow, and SageMaker Feature Store.

3+ years developing APIs using FastAPI, Flask, or Django.

3+ years of experience with ML frameworks and tools such as scikit-learn, PyTorch, XGBoost, LightGBM, and MLflow.

Solid understanding of the ML lifecycle: model development, training, validation, deployment, and monitoring.

Solid understanding of CI/CD pipelines for ML workflows using Bitbucket, Jenkins, Nexus, and AUTOSYS for scheduling.

Experience with ETL processes for ML pipelines using PySpark, Kafka, and AWS EMR Serverless.

Good to have experience in H2O.ai

Good to have experience in containerization using Docker and orchestration using Kubernetes.
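
The ML lifecycle stages named above (train, validate, deploy, monitor) can be sketched with a toy least-squares model in pure Python. This illustrates only the lifecycle structure; real work would use scikit-learn or PyTorch with deployment on SageMaker, and the data here is invented.

```python
def train(xs, ys):
    # Ordinary least squares fit of y = a*x + b
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return {"a": a, "b": my - a * mx}          # the "model artifact"

def validate(model, xs, ys):
    # Mean absolute error on a held-out split
    preds = [model["a"] * x + model["b"] for x in xs]
    return sum(abs(p - y) for p, y in zip(preds, ys)) / len(ys)

# Train on one split, validate on a held-out split
model = train([1, 2, 3, 4], [2, 4, 6, 8])      # learns y = 2x
mae = validate(model, [5, 6], [10, 12])
print(round(model["a"], 3), round(mae, 6))  # → 2.0 0.0
```

In a SageMaker pipeline, `train` and `validate` become pipeline steps, the returned dict becomes a versioned model artifact in the registry, and the validation metric gates deployment.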

 

Read more
Hunarstreet Technologies

Agency job
via Hunarstreet Technologies pvt ltd by Priyanka Londhe
Mumbai, Pune, Bengaluru (Bangalore), Hyderabad, Panchkula, Mohali
5 - 8 yrs
₹15L - ₹22L / yr
skill iconPython
FastAPI
skill iconDjango
skill iconFlask
backend development
+2 more

Required Skills & Qualifications

  • 4+ years of professional experience in backend development with Python.
  • Strong hands-on experience with FastAPI (or Flask/Django with migration experience).
  • Familiarity with asynchronous programming in Python.
  • Working knowledge of version control systems (Git).
  • Good problem-solving and debugging skills.
  • Strong communication and collaboration abilities.
  • Solid background in backend development, RESTful API design, and scalable application development.


Shift: Night Shift 6:30 pm to 3:30 AM IST

Read more
NeoGenCode Technologies Pvt Ltd
Ritika Verma
Posted by Ritika Verma
Hyderabad
6 - 8 yrs
₹20L - ₹30L / yr
skill iconPython
skill iconDjango
skill iconFlask
FastAi
Visualization
+10 more

Job Title: Python Developer (Full Time)

Location: Hyderabad (Onsite)

Interview: Virtual and Face to Face Interview (Last round)

Experience Required: 4+ Years

Working Days: 5 Days

About the Role

We are seeking a highly skilled Lead Python Developer with a strong background in building scalable and secure applications. The ideal candidate will have hands-on expertise in Python frameworks, API integrations, and modern application architectures. This role requires a tech leader who can balance innovation, performance, and compliance while driving successful project delivery.

Key Responsibilities

  1. Application Development
  • Architect and develop robust, high-performance applications using Django, Flask, and FastAPI.
  2. API Integration
  • Design and implement seamless integration with third-party APIs (including travel-related APIs, payment gateways, and external service providers).
  3. Data Management
  • Develop and optimize ETL pipelines for structured and unstructured data using data lakes and distributed storage solutions.
  4. Microservices Architecture
  • Build modular, scalable applications using microservices principles for independent deployment and high availability.
  5. Performance Optimization
  • Enhance application performance through load balancing, caching, and query optimization to deliver superior user experiences.
  6. Security & Compliance
  • Apply secure coding practices, implement data encryption, and ensure compliance with industry security and privacy standards (e.g., PCI DSS, GDPR).
  7. Automation & Deployment
  • Utilize CI/CD pipelines, Docker/Kubernetes, and monitoring tools for automated testing, deployment, and production monitoring.
  8. Collaboration
  • Partner with front-end developers, product managers, and stakeholders to deliver user-centric, business-aligned solutions.
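
The secure-coding responsibility above, never storing secrets in plaintext, can be illustrated with stdlib key-derivation hashing. This is a minimal sketch: the iteration count is an illustrative assumption, and production systems should follow current OWASP password-storage guidance.

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None, iterations=200_000):
    """Derive a slow, salted hash suitable for storage (PBKDF2-HMAC-SHA256)."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password, salt, expected, iterations=200_000):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, expected)  # constant-time comparison

salt, stored = hash_password("s3cret")
print(verify_password("s3cret", salt, stored), verify_password("wrong", salt, stored))
# → True False
```

The per-user random salt defeats rainbow tables, and the iteration count makes brute force expensive; `hmac.compare_digest` avoids leaking information through timing.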

Requirements

Education

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.

Technical Expertise

  • 4+ years of hands-on experience with Python frameworks (Django, Flask, FastAPI).
  • Proficiency in RESTful APIs, GraphQL, and asynchronous programming.
  • Strong knowledge of SQL/NoSQL databases (PostgreSQL, MongoDB) and big data tools (Spark, Kafka).
  • Familiarity with Kibana, Grafana, Prometheus for monitoring and visualization.
  • Experience with AWS, Azure, or Google Cloud, containerization (Docker, Kubernetes), and CI/CD tools (Jenkins, GitLab CI).
  • Working knowledge of testing tools: PyTest, Selenium, SonarQube.
  • Experience with API integrations, booking flows, and payment gateway integrations (travel domain knowledge is a plus, but not mandatory).

Soft Skills

  • Strong problem-solving and analytical skills.
  • Excellent communication, presentation, and teamwork abilities.
  • Proactive, ownership-driven mindset with the ability to perform under pressure.
Read more
Inferigence Quotient

at Inferigence Quotient

1 recruiter
Neeta Trivedi
Posted by Neeta Trivedi
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad
1 - 2 yrs
₹6L - ₹12L / yr
QML
Qt
skill iconC++
skill iconPython

We are seeking a highly skilled Qt/QML Engineer to design and develop advanced GUIs for aerospace applications. The role requires working closely with system architects, avionics software engineers, and mission systems experts to create reliable, intuitive, and real-time UIs for mission-critical systems such as UAV ground control stations and cockpit displays.

Key Responsibilities

  • Design, develop, and maintain high-performance UI applications using Qt/QML (Qt Quick, QML, C++).
  • Translate system requirements into responsive, interactive, and user-friendly interfaces.
  • Integrate UI components with real-time data streams from avionics systems, UAVs, or mission control software.
  • Collaborate with aerospace engineers to ensure compliance with DO-178C, or MIL-STD guidelines where applicable.
  • Optimise application performance for low-latency visualisation in mission-critical environments.
  • Implement data visualisation (raster and vector maps, telemetry, flight parameters, mission planning overlays).
  • Write clean, testable, and maintainable code while adhering to aerospace software standards.
  • Work with cross-functional teams (system engineers, hardware engineers, test teams) to validate UI against operational requirements.
  • Support debugging, simulation, and testing activities, including hardware-in-the-loop (HIL) setups.

Required Qualifications

  • Bachelor’s / Master’s degree in Computer Science, Software Engineering, or related field.
  • 1-3 years of experience in developing Qt/QML-based applications (Qt Quick, QML, Qt Widgets).
  • Strong proficiency in C++ (11/14/17) and object-oriented programming.
  • Experience integrating UI with real-time data sources (TCP/IP, UDP, serial, CAN, DDS, etc.).
  • Knowledge of multithreading, performance optimisation, and memory management.
  • Familiarity with aerospace/automotive domain software practices or mission-critical systems.
  • Good understanding of UX principles for operator consoles and mission planning systems.
  • Strong problem-solving, debugging, and communication skills.
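
Integrating UI with real-time data sources, as required above, usually starts with parsing a binary wire format. A minimal Python sketch of a hypothetical telemetry packet follows (Python is listed as a desirable scripting skill); the field layout here is invented for illustration, and a real avionics link would follow the project's interface control document.

```python
import struct

# Hypothetical layout: uint32 sequence, float lat, float lon, float alt_m
PACKET = struct.Struct("!Ifff")  # network byte order, 16 bytes

def encode(seq, lat, lon, alt):
    return PACKET.pack(seq, lat, lon, alt)

def decode(payload):
    seq, lat, lon, alt = PACKET.unpack(payload)
    return {"seq": seq, "lat": lat, "lon": lon, "alt_m": alt}

frame = encode(42, 17.385, 78.4867, 542.0)
msg = decode(frame)
print(msg["seq"], round(msg["alt_m"], 1))  # → 42 542.0
```

In the Qt application the decoded dict would feed a QML model; monitoring the `seq` counter for gaps is a simple way to surface dropped telemetry frames in the UI.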

Desirable Skills

  • Experience with GIS/Mapping libraries (OpenSceneGraph, Cesium, Marble, etc.).
  • Knowledge of OpenGL, Vulkan, or 3D visualisation frameworks.
  • Exposure to DO-178C or aerospace software compliance.
  • Familiarity with UAV ground control software (QGroundControl, Mission Planner, etc.) or similar mission systems.
  • Experience with Linux and cross-platform development (Windows/Linux).
  • Scripting knowledge in Python for tooling and automation.
  • Background in defence, aerospace, automotive or embedded systems domain.

What We Offer

  • Opportunity to work on cutting-edge aerospace and defence technologies.
  • Collaborative and innovation-driven work culture.
  • Exposure to real-world avionics and mission systems.
  • Growth opportunities in autonomy, AI/ML for aerospace, and avionics UI systems.
Read more
Bluecopa

Agency job
Hyderabad
4 - 5 yrs
₹13L - ₹15L / yr
skill iconJava
skill iconPython
CI/CD
skill iconSpring Boot
skill iconKubernetes
+5 more

CTC: up to 20 LPA

Exp: 4 to 7 Years


Required Qualifications

  • Bachelor's degree in Computer Science, Information Technology, or related field
  • 4+ years of experience in software development
  • Strong proficiency in Java with deep understanding of web technology stack
  • Hands-on experience developing applications with Spring Boot framework
  • Solid understanding of Python programming language with practical Flask framework experience
  • Working knowledge of NATS server for messaging and streaming data
  • Experience deploying and managing applications in Kubernetes
  • Understanding of microservices architecture and RESTful API design
  • Familiarity with containerization technologies (Docker)
  • Experience with version control systems (Git)


Skills & Competencies

  • Java (Spring Boot, Spring Cloud, Spring Security)
  • Python (Flask, SQLAlchemy, REST APIs)
  • NATS messaging patterns (pub/sub, request/reply, queue groups)
  • Kubernetes (deployments, services, ingress, ConfigMaps, Secrets)
  • Web technologies (HTTP, REST, WebSocket, gRPC)
  • Container orchestration and management
  • Soft skills: problem-solving and analytical thinking
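
The pub/sub pattern listed above can be illustrated with a tiny in-process broker. This is a sketch of the pattern only; it is not the NATS client API, and the subject name and payload are invented for the example.

```python
from collections import defaultdict

class TinyBroker:
    """In-process publish/subscribe: every subscriber to a subject gets each message."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, subject, handler):
        self.subscribers[subject].append(handler)

    def publish(self, subject, message):
        for handler in self.subscribers[subject]:
            handler(message)

broker = TinyBroker()
received = []
broker.subscribe("orders.created", lambda m: received.append(("audit", m)))
broker.subscribe("orders.created", lambda m: received.append(("billing", m)))
broker.publish("orders.created", {"id": 7})
print(received)  # → [('audit', {'id': 7}), ('billing', {'id': 7})]
```

A real NATS deployment adds what this sketch omits: network transport, wildcard subjects, and queue groups, where only one member of a group receives each message for load balancing.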


Read more
Auxo AI
kusuma Gullamajji
Posted by kusuma Gullamajji
Hyderabad, Bengaluru (Bangalore), Mumbai, Gurugram
4 - 7 yrs
₹15L - ₹35L / yr
skill iconHTML/CSS
skill iconJavascript
skill iconPython
skill iconNodeJS (Node.js)
CI/CD

Responsibilities :

  • Design and develop user-friendly web interfaces using HTML, CSS, and JavaScript.
  • Utilize modern frontend frameworks and libraries such as React, Angular, or Vue.js to build dynamic and responsive web applications.
  • Develop and maintain server-side logic using programming languages such as Java, Python, Ruby, Node.js, or PHP.
  • Build and manage APIs for seamless communication between the frontend and backend systems.
  • Integrate third-party services and APIs to enhance application functionality.
  • Implement CI/CD pipelines to automate testing, integration, and deployment processes.
  • Monitor and optimize the performance of web applications to ensure a high-quality user experience.
  • Stay up-to-date with emerging technologies and industry trends to continuously improve development processes and application performance.

Qualifications :

  • Bachelor’s/Master’s degree in Computer Science or related subjects, or hands-on experience demonstrating a working understanding of software applications.
  • Knowledge of building applications that can be deployed in a cloud environment or are cloud native applications.
  • Strong expertise in building backend applications using Java/C#/Python with demonstrable experience in using frameworks such as Spring/Vertx/.Net/FastAPI.
  • Deep understanding of enterprise design patterns, API development and integration and Test-Driven Development (TDD)
  • Working knowledge in building applications that leverage databases such as PostgreSQL, MySQL, MongoDB, Neo4J or storage technologies such as AWS S3, Azure Blob Storage.
  • Hands-on experience in building enterprise applications adhering to their needs of security and reliability.
  • Hands-on experience building applications using one of the major cloud providers (AWS, Azure, GCP).
  • Working knowledge of CI/CD tools for application integration and deployment.
  • Working knowledge of using reliability tools to monitor the performance of the application.


Read more
Bluecopa

Agency job
via TIGI HR Solution Pvt. Ltd. by Vaidehi Sarkar
Hyderabad
4 - 8 yrs
₹10L - ₹15L / yr
skill iconPython
skill iconJava
skill iconKubernetes
CI/CD
skill iconSpring Boot
+4 more

Role: Senior Backend Developer

Exp: 4 - 7 Years

CTC: up to 22 LPA


Key Responsibilities

  • Design, develop, and maintain scalable applications using Java (Spring Boot) and Python (Flask).
  • Build RESTful APIs and microservices following best practices.
  • Implement event-driven architecture leveraging NATS messaging server.
  • Deploy, manage, and optimize applications in Kubernetes and containerized environments.
  • Develop and manage CI/CD pipelines, ensuring smooth deployment and delivery.
  • Collaborate with cross-functional teams to deliver high-quality solutions.
  • Write clean, maintainable, and well-documented code.
  • Participate in code reviews and contribute to architectural decisions.
  • Troubleshoot, debug, and optimize application performance.
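The event-driven architecture bullet above can be sketched with a toy in-memory bus that mimics NATS-style dot-separated subjects and the `*` single-token wildcard. This is an illustrative stand-in only, not the real NATS client API:

```python
from collections import defaultdict
from typing import Callable

class MessageBus:
    """Minimal in-memory stand-in for a NATS-style pub/sub bus."""

    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, pattern: str, handler: Callable[[str, dict], None]):
        self._subs[pattern].append(handler)

    def _matches(self, pattern: str, subject: str) -> bool:
        # '*' matches exactly one dot-separated token, as in NATS subjects.
        p, s = pattern.split("."), subject.split(".")
        return len(p) == len(s) and all(a == "*" or a == b for a, b in zip(p, s))

    def publish(self, subject: str, payload: dict):
        for pattern, handlers in self._subs.items():
            if self._matches(pattern, subject):
                for handler in handlers:
                    handler(subject, payload)

received = []
bus = MessageBus()
bus.subscribe("orders.*", lambda subj, msg: received.append((subj, msg)))
bus.publish("orders.created", {"id": 42})
bus.publish("billing.paid", {"id": 7})   # no matching subscription
```
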


Read more
IT MNC

Agency job
via FIRST CAREER CENTRE by Aisha Fcc
Bengaluru (Bangalore), Noida, Hyderabad, Pune, Chennai
4 - 8 yrs
₹15L - ₹30L / yr
Python
JavaScript
Frappe

Development and Customization:


Build and customize Frappe modules to meet business requirements.


Develop new functionalities and troubleshoot issues in ERPNext applications.


Integrate third-party APIs for seamless interoperability.


Technical Support:


Provide technical support to end-users and resolve system issues.


Maintain technical documentation for implementations.


Collaboration:


Work with teams to gather requirements and recommend solutions.


Participate in code reviews for quality standards.


Continuous Improvement:


Stay updated with Frappe developments and optimize application performance.


Skills Required:

Proficiency in Python, JavaScript, and relational databases.


Knowledge of Frappe/ERPNext framework and object-oriented programming.


Experience with Git for version control.


Strong analytical skills

Read more
MindCrew Technologies

at MindCrew Technologies

3 recruiters
Agency job
via TIGI HR Solution Pvt. Ltd. by Vaidehi Sarkar
Hyderabad
8 - 12 yrs
₹10L - ₹15L / yr
Python
Bash
Shell Scripting
CI/CD
Kubernetes

Python Developer 

Location: Hyderabad (Apple Office)

Experience: 8+ years (Retail / E-commerce preferred)

Budget: 1.9 LPM + GST

Contract: 1 Year + Extendable


Job Responsibilities / Requirements:


  • 8+ years of proven experience, preferably in retail or e-commerce environments.
  • Strong expertise in Python development.
  • Excellent communication skills with the ability to collaborate across multiple teams.
  • Hands-on experience with Container & Orchestration: Kubernetes, Docker.
  • Expertise in Infrastructure Automation via Kubernetes YAML configurations.
  • Strong skills in Scripting & Automation: Python, Shell Scripts (Bash).
  • Familiarity with CI/CD Pipelines: GitHub Actions, Jenkins.
  • Experience with Monitoring & Logging: Splunk, Grafana.
  • Immediate Joiners Preferred – Urgent Support Required.
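As a rough illustration of the infrastructure-automation requirement above, the sketch below renders a Kubernetes Deployment manifest from Python; `kubectl apply` accepts JSON as well as YAML, so the output can be applied directly. The app name, image, and replica count are hypothetical:

```python
import json

def deployment_manifest(name: str, image: str, replicas: int = 2) -> dict:
    """Build an apps/v1 Deployment manifest as a plain dict."""
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {"containers": [{"name": name, "image": image}]},
            },
        },
    }

# Hypothetical service and registry, for illustration only.
manifest = deployment_manifest("orders-api", "registry.example.com/orders-api:1.0")
print(json.dumps(manifest, indent=2))   # pipe into: kubectl apply -f -
```
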


Read more
Versatile Commerce LLP

at Versatile Commerce LLP

2 candid answers
Burugupally Shailaja
Posted by Burugupally Shailaja
Hyderabad
3 - 5 yrs
₹3L - ₹5L / yr
Open-source LLMs
Retrieval Augmented Generation (RAG)
MLOps
Windows Azure
Python


Role: AI/ ML Engineering

Experience: 3 to 5 yrs

Work location: Hyderabad

Interview mode: Virtual

Notice period: Immediate Joiner


Key Responsibilities:

·        Design and implement RAG pipelines and AI agentic systems using cutting-edge LLM frameworks.

·        Fine-tune open-source LLMs and develop narrow, domain-specific models.

·        Build and maintain ML pipelines using MLFlow and ensure reproducibility, auditability, and version control.

·        Collaborate with cross-functional teams to deploy ML systems into scalable, secure, and production-ready environments.

·        Containerize and serve models using Docker, Kubernetes, and FastAPI.

·        Automate CI/CD workflows using Azure DevOps, with integrated monitoring and alerts.

·        Integrate authentication and authorization flows using Azure AD and Microsoft Graph API.

·        Optimize deployed models for latency, cost-efficiency, and operational maintainability.
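The RAG responsibilities above can be illustrated with a toy sketch of the retrieval step, where bag-of-words cosine similarity stands in for a real embedding model and vector database. The documents and query are fabricated for illustration:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Stand-in for an embedding model: a simple bag-of-words vector.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    # Stand-in for a vector-database lookup: rank all docs by similarity.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "invoices are processed nightly by the billing service",
    "the mobile app supports offline mode",
]
context = retrieve("how are invoices processed", docs)
# In a real RAG pipeline, `context` would be prepended to the LLM prompt.
```
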

Required Skills & Experience:

·        Strong foundation in Computer Science, software architecture, and distributed systems.

·        Proficiency in Python, including both object-oriented and functional programming paradigms.

·        Hands-on experience with open-source LLMs, embedding models, and vector databases.

·        Practical implementation of RAG pipelines and LLM agentic systems.

·        Strong working knowledge of MLOps tooling (e.g., MLFlow), model versioning, and reproducible experiments.

·        Experience deploying ML systems using Docker, Kubernetes, and FastAPI or equivalent frameworks.

·        Proven experience working in Azure cloud ecosystem:

·        Azure DevOps for build/release automation.

·        Azure GraphAPI for accessing organizational data.

·        Secure identity flows using Azure AD.

Read more
Hashone Careers

at Hashone Careers

2 candid answers
Madhavan I
Posted by Madhavan I
Hyderabad
7 - 11 yrs
₹10L - ₹15L / yr
Snowflake
Python
SQL


Position Title: Data Engineer with Snowflake Lead   

Experience: 7+ Yrs  

Shift Schedule: Rotational Shifts  

Mode of work: Hybrid (need to come to office)  

Location: Hyderabad  


**Role Overview:**

Join the Snowflake Managed Services team as a Software Engineer to work on data platform development, enhancements, and production support. You will support Snowflake environments across multiple clients, ensuring stability, performance, and continuous improvement.


**Key Responsibilities:**

Design and develop Snowflake pipelines, data models, and transformations  

Provide L2/L3 production support for Snowflake jobs, queries, and integrations  

Troubleshoot failed jobs, resolve incidents, and conduct RCA  

Tune queries, monitor warehouses, and help optimize Snowflake usage and cost  

Handle service requests like user provisioning, access changes, and role management  

Document issues, enhancements, and standard procedures (runbooks)  
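The pipeline-development responsibilities above often boil down to incremental upserts. As an illustrative sketch (not a production template), the helper below builds a MERGE statement of the kind Snowflake supports; the table and column names are hypothetical:

```python
def build_merge_sql(target: str, staging: str, key: str, cols: list[str]) -> str:
    """Build a MERGE statement that upserts staged rows into a target table."""
    non_keys = [c for c in cols if c != key]
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in non_keys)
    insert_cols = ", ".join(cols)
    insert_vals = ", ".join(f"s.{c}" for c in cols)
    return (
        f"MERGE INTO {target} t USING {staging} s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals})"
    )

# Hypothetical target and staging tables.
sql = build_merge_sql("dw.orders", "stg.orders", "order_id",
                      ["order_id", "status", "amount"])
```

In practice the column list would come from the table's metadata rather than being hard-coded, and identifiers would be validated before interpolation.
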


Required Skills & Experience:  

4+ years of hands-on experience in Snowflake development and support  

Strong SQL, data modeling, and performance tuning experience  

Exposure to CI/CD pipelines and scripting languages (e.g., Python OR Pyspark)  

Experience with ETL or ELT processes  

Experience with data pipelines and orchestration tools (e.g., ADF)  


Preferred:  

SnowPro Core Certification  

Experience with ticketing systems (ServiceNow, Jira)  

Cloud experience with Azure  

Basic understanding of ITIL processes

Read more
VyTCDC
Gobinath Sundaram
Posted by Gobinath Sundaram
Chennai, Bengaluru (Bangalore), Hyderabad, Mumbai, Pune, Noida
4 - 6 yrs
₹3L - ₹21L / yr
AWS Data Engineer
Amazon Web Services (AWS)
Python
PySpark
Databricks
+1 more

 Key Responsibilities

  • Design and implement ETL/ELT pipelines using Databricks, PySpark, and AWS Glue
  • Develop and maintain scalable data architectures on AWS (S3, EMR, Lambda, Redshift, RDS)
  • Perform data wrangling, cleansing, and transformation using Python and SQL
  • Collaborate with data scientists to integrate Generative AI models into analytics workflows
  • Build dashboards and reports to visualize insights using tools like Power BI or Tableau
  • Ensure data quality, governance, and security across all data assets
  • Optimize performance of data pipelines and troubleshoot bottlenecks
  • Work closely with stakeholders to understand data requirements and deliver actionable insights
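The wrangling and cleansing bullet above can be sketched with the standard library standing in for PySpark: drop incomplete records, cast types, and normalize values. The sample data and rules are fabricated for illustration:

```python
import csv
import io

# Fabricated raw extract with a missing amount and inconsistent currency case.
RAW = """order_id,amount,currency
1,100.5,usd
2,,usd
3,80,EUR
"""

def cleanse(raw: str) -> list[dict]:
    """Drop rows with missing amounts, cast types, normalize currency codes."""
    rows = []
    for row in csv.DictReader(io.StringIO(raw)):
        if not row["amount"]:          # drop incomplete records
            continue
        rows.append({
            "order_id": int(row["order_id"]),
            "amount": float(row["amount"]),
            "currency": row["currency"].upper(),
        })
    return rows

clean = cleanse(RAW)
```

In a Spark job the same logic would be expressed as DataFrame filters and column expressions, but the transformation rules are identical.
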

🧪 Required Skills

  • Cloud Platforms: AWS (S3, Lambda, Glue, EMR, Redshift)
  • Big Data: Databricks, Apache Spark, PySpark
  • Programming: Python, SQL
  • Data Engineering: ETL/ELT, Data Lakes, Data Warehousing
  • Analytics: Data Modeling, Visualization, BI Reporting
  • Gen AI Integration: OpenAI, Hugging Face, LangChain (preferred)
  • DevOps (Bonus): Git, Jenkins, Terraform, Docker

📚 Qualifications

  • Bachelor's or Master’s degree in Computer Science, Data Science, or related field
  • 3+ years of experience in data engineering or data analytics
  • Hands-on experience with Databricks, PySpark, and AWS
  • Familiarity with Generative AI tools and frameworks is a strong plus
  • Strong problem-solving and communication skills

🌟 Preferred Traits

  • Analytical mindset with attention to detail
  • Passion for data and emerging technologies
  • Ability to work independently and in cross-functional teams
  • Eagerness to learn and adapt in a fast-paced environment


Read more
VDart

Agency job
via VDart by Don Blessing
Bengaluru (Bangalore), Hyderabad, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
5 - 8 yrs
₹10L - ₹15L / yr
EDI
Developer
Mapping
Map Conversion
X12
+2 more

EDI Developer / Map Conversion Specialist

Role Summary:

Responsible for converting 441 existing EDI maps into the PortPro-compatible format and testing them for 147 customer configurations.

Key Responsibilities:

  • Analyze existing EDI maps in Profit Tools.
  • Convert, reconfigure, or rebuild maps for PortPro.
  • Ensure accuracy in mapping and transformation logic.
  • Unit test and debug EDI transactions.
  • Support system integration and UAT phases.

Skills Required:

  • Proficiency in EDI standards (X12, EDIFACT) and transaction sets.
  • Hands-on experience in EDI mapping tools.
  • Familiarity with both Profit Tools and PortPro data structures.
  • SQL and XML/JSON data handling skills.
  • Experience with scripting for automation (Python, Shell scripting preferred).
  • Strong troubleshooting and debugging skills.
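The X12 skills above rest on a simple wire format: segments terminated by `~` and elements separated by `*`. A minimal parsing sketch follows; the sample snippet is fabricated for illustration (real interchanges carry full ISA/GS envelopes and repetition/component separators):

```python
def parse_x12(raw: str) -> list[list[str]]:
    """Split an X12 stream into segments ('~') and elements ('*')."""
    segments = [s.strip() for s in raw.split("~") if s.strip()]
    return [seg.split("*") for seg in segments]

# Fabricated fragment of a transaction set, for illustration only.
sample = "ST*214*0001~B10*INV123*SHIP42*SCAC~SE*3*0001~"
segments = parse_x12(sample)
```

A map-conversion task then becomes a matter of walking these segment lists and emitting the target system's element layout.
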


Read more
Sonatype

at Sonatype

5 candid answers
Reshika Mendiratta
Posted by Reshika Mendiratta
Hyderabad
6 - 10 yrs
₹15L - ₹33L / yr
ETL
Spark
Apache Kafka
Python
Java
+11 more

The Opportunity

We’re looking for a Senior Data Engineer to join our growing Data Platform team. This role is a hybrid of data engineering and business intelligence, ideal for someone who enjoys solving complex data challenges while also building intuitive and actionable reporting solutions.


You’ll play a key role in designing and scaling the infrastructure and pipelines that power analytics, dashboards, machine learning, and decision-making across Sonatype. You’ll also be responsible for delivering clear, compelling, and insightful business intelligence through tools like Looker Studio and advanced SQL queries.


What You’ll Do

  • Design, build, and maintain scalable data pipelines and ETL/ELT processes.
  • Architect and optimize data models and storage solutions for analytics and operational use.
  • Create and manage business intelligence reports and dashboards using tools like Looker Studio, Power BI, or similar.
  • Collaborate with data scientists, analysts, and stakeholders to ensure datasets are reliable, meaningful, and actionable.
  • Own and evolve parts of our data platform (e.g., Airflow, dbt, Spark, Redshift, or Snowflake).
  • Write complex, high-performance SQL queries to support reporting and analytics needs.
  • Implement observability, alerting, and data quality monitoring for critical pipelines.
  • Drive best practices in data engineering and business intelligence, including documentation, testing, and CI/CD.
  • Contribute to the evolution of our next-generation data lakehouse and BI architecture.
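The data-quality monitoring bullet above can be sketched as a simple check that could gate a pipeline run: compute the null rate per column and flag columns above a threshold. The column names and threshold are illustrative:

```python
def null_rates(rows: list[dict]) -> dict:
    """Fraction of None values per column, keyed by column name."""
    if not rows:
        return {}
    cols = rows[0].keys()
    return {c: sum(1 for r in rows if r.get(c) is None) / len(rows) for c in cols}

def failing_columns(rows: list[dict], threshold: float = 0.1) -> list[str]:
    """Columns whose null rate exceeds the allowed threshold."""
    return [c for c, rate in null_rates(rows).items() if rate > threshold]

# Fabricated sample batch: 'email' is null in 2 of 3 rows.
rows = [
    {"user_id": 1, "email": "a@example.com"},
    {"user_id": 2, "email": None},
    {"user_id": 3, "email": None},
]
bad = failing_columns(rows)
```

In production the same rule would typically live in a framework like dbt tests or Great Expectations and raise an alert rather than return a list.
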


What We’re Looking For


Minimum Qualifications

  • 5+ years of experience as a Data Engineer or in a hybrid data/reporting role.
  • Strong programming skills in Python, Java, or Scala.
  • Proficiency with data tools such as Databricks, data modeling techniques (e.g., star schema, dimensional modeling), and data warehousing solutions like Snowflake or Redshift.
  • Hands-on experience with modern data platforms and orchestration tools (e.g., Spark, Kafka, Airflow).
  • Proficient in SQL with experience in writing and optimizing complex queries for BI and analytics.
  • Experience with BI tools such as Looker Studio, Power BI, or Tableau.
  • Experience in building and maintaining robust ETL/ELT pipelines in production.
  • Understanding of data quality, observability, and governance best practices.


Bonus Points

  • Experience with dbt, Terraform, or Kubernetes.
  • Familiarity with real-time data processing or streaming architectures.
  • Understanding of data privacy, compliance, and security best practices in analytics and reporting.


Why You’ll Love Working Here

  • Data with purpose: Work on problems that directly impact how the world builds secure software.
  • Full-spectrum impact: Use both engineering and analytical skills to shape product, strategy, and operations.
  • Modern tooling: Leverage the best of open-source and cloud-native technologies.
  • Collaborative culture: Join a passionate team that values learning, autonomy, and real-world impact.
Read more
Sonatype

at Sonatype

5 candid answers
Reshika Mendiratta
Posted by Reshika Mendiratta
Hyderabad
2 - 5 yrs
Up to ₹20L / yr (varies)
Python
ETL
Spark
Apache Kafka
Databricks
+12 more

About the Role

We’re hiring a Data Engineer to join our Data Platform team. You’ll help build and scale the systems that power analytics, reporting, and data-driven features across the company. This role works with engineers, analysts, and product teams to make sure our data is accurate, available, and usable.


What You’ll Do

  • Build and maintain reliable data pipelines and ETL/ELT workflows.
  • Develop and optimize data models for analytics and internal tools.
  • Work with team members to deliver clean, trusted datasets.
  • Support core data platform tools like Airflow, dbt, Spark, Redshift, or Snowflake.
  • Monitor data pipelines for quality, performance, and reliability.
  • Write clear documentation and contribute to test coverage and CI/CD processes.
  • Help shape our data lakehouse architecture and platform roadmap.


What You Need

  • 2–4 years of experience in data engineering or a backend data-related role.
  • Strong skills in Python or another backend programming language.
  • Experience working with SQL and distributed data systems (e.g., Spark, Kafka).
  • Familiarity with NoSQL stores like HBase or similar.
  • Comfortable writing efficient queries and building data workflows.
  • Understanding of data modeling for analytics and reporting.
  • Exposure to tools like Airflow or other workflow schedulers.
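The workflow-scheduler exposure mentioned above comes down to one core idea: tasks form a DAG and run in dependency order. The stdlib `graphlib` module (Python 3.9+) can sketch how an orchestrator like Airflow resolves that order; the task names are illustrative:

```python
from graphlib import TopologicalSorter

# Each key maps a task to the set of tasks it depends on, as an Airflow DAG
# would via upstream/downstream relationships.
dag = {
    "extract": set(),
    "load": {"extract"},
    "transform": {"load"},
    "report": {"transform"},
}
order = list(TopologicalSorter(dag).static_order())
# A scheduler would now run the tasks in this order, parallelizing any
# tasks whose dependencies are all complete.
```
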


Bonus Points

  • Experience with DBT, Databricks, or real-time data pipelines.
  • Familiarity with cloud infrastructure tools like Terraform or Kubernetes.
  • Interest in data governance, ML pipelines, or compliance standards.


Why Join Us?

  • Work on data that supports meaningful software security outcomes.
  • Use modern tools in a cloud-first, open-source-friendly environment.
  • Join a team that values clarity, learning, and autonomy.


If you're excited about building impactful software and helping others do the same, this is an opportunity to grow as a technical leader and make a meaningful impact.

Read more
Inncircles
Tatikonda Geetha
Posted by Tatikonda Geetha
Hyderabad
1 - 3 yrs
Best in industry
Java
JavaScript
Selenium
Playwright
Mobile App Testing (QA)
+5 more

Inncircles Technologies is a problem-solving company. With powerful data management capabilities and AI-driven algorithms, we have developed a construction management platform named Inncircles Arena, a one-stop solution for managing any construction project.


Inncircles Arena can help construction industry owners, builders, general contractors, and specialist contractors to improve construction management operations efficiency and project management. The application runs on a cloud-based platform and offers a complete range of tools to gather field data through a user-friendly interface and mobile applications.


Due to the software's modern, user-friendly design, users can access project information from any location through mobile and web applications. Collaboration tools are integrated into each feature to facilitate effective coordination and ensure all teams are on the same page.


With highly configurable features, products, solutions, and services, we aim to make digital transformation easier and more simplified for construction companies.


Why should you join our team?

  • 100% growth with diverse experience working with international clients
  • Exposure across media & digital channels
  • Dynamic learning curve across Global Landscape
  • A part of a young team, ready to experiment together


About the Role


We are looking for a Quality Analyst with strong skills in Manual Testing and Web Automation to join our growing team. The ideal candidate will be passionate about delivering high-quality software products, adept at identifying bugs, and ensuring seamless functionality across applications.


Key Responsibilities


●      Design, develop, and execute manual test cases for web applications and APIs.

●      Create, maintain, and enhance web automation test scripts using industry-standard tools and frameworks.

●      Collaborate with cross-functional teams (Developers, Product Managers) to ensure quality at every stage of development.

●      Perform regression, smoke, sanity, and end-to-end testing for new releases.

●      Use Jira for defect tracking and reporting, ensuring clear communication of bugs and their statuses.

●      Work with Git for version control and participate in code reviews related to test scripts.

●      Integrate and maintain test execution pipelines using Jenkins (CI/CD).

●      Conduct performance and load testing using JMeter, identifying bottlenecks and providing actionable insights.

●      Perform basic database testing with MongoDB, validating backend data integrity.
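The regression-testing responsibility above is often implemented as a data-driven suite: a fixed table of input/expected pairs re-run on every release. A minimal `unittest` sketch follows; `apply_discount` and its codes are hypothetical, standing in for the feature under test:

```python
import unittest

# Hypothetical function under test.
def apply_discount(price: float, code: str) -> float:
    rates = {"WELCOME10": 0.10, "VIP20": 0.20}
    return round(price * (1 - rates.get(code, 0.0)), 2)

class DiscountRegressionTest(unittest.TestCase):
    # The same cases run every release; a failure signals a regression.
    CASES = [
        (100.0, "WELCOME10", 90.0),
        (100.0, "VIP20", 80.0),
        (100.0, "UNKNOWN", 100.0),   # invalid codes must not discount
    ]

    def test_known_cases(self):
        for price, code, expected in self.CASES:
            with self.subTest(code=code):
                self.assertEqual(apply_discount(price, code), expected)
```

`subTest` keeps the remaining cases running after one fails, so a single release report shows every regression at once.
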


Requirements


●      BE/B.Tech/BCA degree in Computer Science, Engineering, or a related field.

●      1–3 years of experience in Manual Testing and Web Automation Testing.

●      Strong analytical and problem-solving skills with keen attention to detail.

●      Good understanding of SDLC, STLC, and Agile methodologies.

●      Excellent communication and collaboration skills.

●      Hands-on experience with Selenium/Playwright or similar web automation tools.

●      Knowledge of programming languages (Python, Java, JavaScript, and TypeScript).

●      Proficiency in Jira for bug tracking and project management.

●      Basic knowledge of Git for version control.

●      Familiarity with Jenkins for CI/CD pipelines.

●      Understanding of MongoDB for basic data validation.


Good to Have


●      Exposure to API testing tools (e.g., Postman, Rest Assured).

●      Experience in Performance Testing using JMeter.

●      Familiarity with cross-browser and cross-platform testing.

Read more