50+ Python Jobs in Pune | Python Job openings in Pune

Apply to 50+ Python Jobs in Pune on CutShort.io. Explore the latest Python Job opportunities across top companies like Google, Amazon & Adobe.

Leading drive specialist for machine and plant engineering


Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
7 - 9 yrs
₹12L - ₹20L / yr
Object Oriented Programming (OOPs)
Python
Automation
Java
C++

Job Details

Job Title: Sr. Python Automation Developer

Industry: Engineering

Domain: Information Technology (IT)

Experience Required: 7-9 years

Employment Type: Full Time

Job Location: Pune

CTC Range: Best in Industry

 

Job Description

• Designation – Python Automation Developer (Sr./ Advanced Sr.)

• Experience: 7 to 9 years.

• Qualifications: B.E./MCA/M.Sc./B.Sc.

• Location: Pune (near Sangamwadi)

 

Skills & Technologies:

Mandatory:

• Experience using OOPs in Python/Java/C++/C# (if the candidate has experience in Java/C++/C#, they must be willing to learn Python and work in it)

• Good analytical, design, coding, and debugging skills

• Good analytical and requirement-understanding skills.

• Good design-pattern, framework, and coding skills – able to translate requirements into design, and design into fully functional, efficient code.

• English communication skills.
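As an illustration of the OOP expectation above, here is a minimal Python sketch (all class and task names are invented for illustration): an abstract base class fixes the task interface, a template method centralizes reporting, and concrete tasks override `run()`.

```python
import os
from abc import ABC, abstractmethod


class AutomationTask(ABC):
    """Hypothetical base class: every automation task shares one interface."""

    def __init__(self, name: str):
        self.name = name

    @abstractmethod
    def run(self) -> bool:
        """Execute the task; return True on success."""

    def execute(self) -> str:
        # Template method: shared reporting wrapped around the subclass's run()
        return f"{self.name}: {'PASS' if self.run() else 'FAIL'}"


class FileCheckTask(AutomationTask):
    """Concrete task: verify that a required path exists."""

    def __init__(self, name: str, path: str):
        super().__init__(name)
        self.path = path

    def run(self) -> bool:
        return os.path.exists(self.path)
```

A Java or C# developer will recognize the same pattern (abstract classes, template methods) expressed in Python's idiom.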

 

Desirable:

• Working experience on any defect management tool

• Working experience on GIT/SVN or any code repository management tool

• Development Using Eclipse IDE or equivalent IDE

 

Behaviors:

• Good team player

• Openness to learn new technologies.

• Self-motivated and proactive

• Should work with minimum supervision.

• Should be able to supervise juniors.

• Take ownership

 

Must-Haves

• 5.9+ years of relevant experience using OOPs in Python/Java/C++/C#

(If the candidate has experience in Java/C++/C#, they must be willing to learn Python and work in it.)

• Good analytical, design, coding, and debugging skills

• Good analytical and requirement-understanding skills.

• Good design-pattern, framework, and coding skills – able to translate requirements into design, and design into fully functional, efficient code.

• English communication skills.

Byline Learning Solutions
Baner, Pune
4 - 12 yrs
₹4L - ₹12L / yr
JavaScript
Python
React.js
Node.js
Java

We are looking for a highly skilled Senior Full Stack Developer with experience in building secure, scalable, and high-performing web applications for eLearning or fintech platforms. The ideal candidate will have a strong background in multi-tenant dashboards, role-based access control, and API integrations, with a proactive, go-getter mindset.

Key Responsibilities:

  • Develop and maintain multi-tenant dashboards with role-based access control (RBAC) and hierarchical permission models.
  • Implement authentication, authorization, and session management across different user tiers, ensuring secure handling of credentials and sensitive data.
  • Integrate LMS systems, third-party APIs, and fintech services while maintaining data integrity and security.
  • Ensure secure hosting, multi-layered application protection, and adherence to IT standards and best practices.
  • Optimize applications for performance, scalability, and reliability across platforms.
  • Collaborate with cross-functional teams, including designers, content developers, and project managers, to deliver end-to-end solutions.
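The multi-tenant RBAC responsibilities above can be sketched in a few lines of Python; the roles, permission strings, and user shape below are invented for illustration, not a description of the actual platform:

```python
# Invented roles and permissions; a real system would load these per tenant
# from a database rather than hard-coding them.
ROLE_PERMISSIONS = {
    "viewer": {"dashboard:read"},
    "editor": {"dashboard:read", "content:write"},
    "tenant_admin": {"dashboard:read", "content:write", "users:manage"},
}


def can_access(user: dict, tenant_id: str, permission: str) -> bool:
    """Deny across tenant boundaries first, then check the role's permission set."""
    if user["tenant"] != tenant_id:
        return False
    return permission in ROLE_PERMISSIONS.get(user["role"], set())
```

The tenant check before the role check is the essential ordering in a multi-tenant dashboard: a valid role must never grant access to another tenant's data.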

Required Skills & Experience:

  • Proven full-stack development experience with technologies such as JavaScript, React, Node.js, Python, Java, or similar.
  • Hands-on experience with multi-level user management, RBAC, and dashboard architectures.
  • Strong knowledge of APIs, hosting, cloud services, and secure deployment practices.
  • Familiarity with data security protocols, fintech integrations, and enterprise IT standards.
  • Excellent problem-solving skills, proactive approach, and ability to work independently.

Preferred:

  • Experience in eLearning platforms (SCORM, xAPI) or fintech solutions.
  • Knowledge of database optimization, caching, and performance tuning.
  • Experience with international eLearning projects or multi-location deployments.


Pendo

Posted by Eman Khan
Pune
3 - 7 yrs
Up to ₹45L / yr (varies)
Vue.js
React.js
Google Analytics
Java
Python

About the Role

Pendo is looking for a Software Engineer to help build and scale the platform that powers our integrations with enterprise systems such as Salesforce, Slack, Segment, and other partner tools. This team develops the services, APIs, data pipelines, and user interfaces that enable customers to seamlessly connect Pendo into their product and data ecosystems.


In this role, you will primarily focus on building scalable backend systems while also contributing to the frontend experiences that allow customers to configure, manage, and monitor integrations. You’ll collaborate closely with product managers, designers, and infrastructure teams to deliver reliable, high-performance capabilities used by millions of users.


What You'll Do

  • Design and build scalable backend services and APIs that power Pendo’s integrations platform.
  • Develop and maintain distributed, event-driven data pipelines that process and sync high volumes of behavioral and product analytics data.
  • Contribute to frontend applications that allow customers to configure, manage, and monitor integrations and data workflows.
  • Lead technical initiatives from design through implementation, testing, and production rollout.
  • Integrate with third-party APIs and enterprise platforms using technologies such as REST, webhooks, and OAuth.
  • Collaborate with product, design, infrastructure, and partner teams to translate business needs into high-quality technical solutions.
  • Use modern development workflows and AI-powered tools to improve developer productivity and streamline engineering processes.
  • Participate in design reviews and promote best practices in testing, observability, performance, and system reliability.
  • Contribute to improving platform scalability, availability, and operational excellence.
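One recurring building block of the third-party integrations and webhooks mentioned above is signature verification on inbound payloads. A minimal sketch using only Python's standard library (the secret and payload are placeholders; each real provider defines its own signing scheme):

```python
import hashlib
import hmac


def verify_webhook(secret: bytes, payload: bytes, signature_hex: str) -> bool:
    """Recompute the HMAC-SHA256 of the raw payload and compare in constant time."""
    expected = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    # compare_digest avoids timing side channels when rejecting bad signatures
    return hmac.compare_digest(expected, signature_hex)
```

Verifying against the raw request body (before any JSON parsing) is the detail that most often trips up integration code.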


What We're Looking For

  • Experience building backend services, APIs, or distributed systems.
  • Experience developing modern web applications using frameworks such as Vue, React, or Angular.
  • Strong proficiency in at least one backend language such as Go, Java, Python, or C++.
  • Experience working with cloud infrastructure such as AWS or GCP.
  • Familiarity with distributed systems, event-driven architectures, or high-throughput data pipelines.
  • Experience writing and maintaining unit, integration, and end-to-end tests.
  • Strong collaboration and communication skills.


Nice to Have

  • Experience building integration platforms or working with third-party APIs.
  • Familiarity with authentication models such as OAuth and enterprise SaaS integrations.
  • Experience working with analytics or behavioral event data.
  • Experience leveraging AI-assisted development tools or working with modern AI workflows.


Technologies We Use

  • Frontend: Vue, Vuex, React, Angular, Highcharts, Jest, Cypress
  • Backend: Go, Java, Python, C++
  • Cloud & Data: AWS, GCP, Redis, Pub/Sub, SQL/NoSQL
  • AI / ML: GenAI, LLMs, LangChain, MLOps
NonStop io Technologies Pvt Ltd
Kalyani Wadnere
Posted by Kalyani Wadnere
Pune
2 - 3 yrs
Best in industry
Whole-genome sequencing
RNA-seq analysis
R Programming
Python
GATK

About NonStop:

NonStop is a software services company at the intersection of bioinformatics, genomics, and healthcare technology. We partner with biotech firms, pharma organizations, genomics labs, and clinical institutions to design and deliver production-grade bioinformatics software, AI-powered analytical platforms, and end-to-end genomic data pipelines.

We work on problems that matter: from accelerating variant interpretation workflows and building HIPAA-compliant AI platforms, to orchestrating large-scale multi-omics pipelines for disease diagnostics and pharmacogenomics. Our team blends deep domain expertise with engineering rigor, and we're growing to meet the increasing demand from the life sciences industry for smart, scalable, and compliant bioinformatics solutions.


We work this way:

  • We dig deep into biological problems, not just the code. Domain knowledge is valued as much as engineering craft.
  • Bioinformatics is a team sport. You'll work alongside software engineers, clinicians, and research scientists.
  • You own your work end-to-end, from design to delivery. We trust you to make good decisions and learn fast.
  • Life sciences move fast. We encourage continuous learning, conference participation, and staying ahead of the field.


Your role:

As a Bioinformatics Engineer at NonStop, you will be a key contributor in designing, building, and maintaining bioinformatics software solutions and analytical pipelines for our clients across genomics, clinical diagnostics, and precision medicine. You'll bring both biological insight and engineering excellence to every project, collaborating with product, engineering, and scientific teams to deliver solutions that are scalable, reproducible, and compliant.


In this role, you will:

  • Build scalable bioinformatics applications and pipelines for efficient processing of genomic, transcriptomic, and multi-omics data.
  • Produce high-quality, detailed documentation for all projects, pipelines, tools, APIs, and analytical methods.
  • Provide technical consultation and solutions across cross-functional bioinformatics projects.
  • Coach and mentor team members through knowledge sharing, code reviews, and pairing on domain-specific challenges.
  • Ensure compliance with our SDLC process throughout the product development lifecycle.
  • Stay current with evolving bioinformatics technologies and evangelize technical excellence within the team.


We’re looking for:

  • Minimum 2 to 3 years of experience in designing, developing, and maintaining bioinformatics solutions.
  • Master's degree in Bioinformatics, Computational Biology, or a closely related technical discipline.
  • Strong understanding of genomic data analysis, variant calling, targeted sequencing, whole-exome (WES), and whole-genome sequencing (WGS) workflows.
  • Hands-on experience with RNA-seq analysis, including differential expression and transcriptomic workflows.
  • Proficiency in variant interpretation and ACMG/AMP classification workflows is a plus.
  • Knowledge of algorithms and computational model development applied to biological data.
  • Strong foundation in statistics and data analysis as applied to genomics and bioinformatics.
  • Experience developing and debugging bioinformatics pipelines using Nextflow, Snakemake, WDL, or CWL.
  • Proficiency in shell scripting and Linux/Unix environments for NGS data analysis.
  • Familiarity with workflow automation and best practices in reproducible pipeline design.
  • Excellent programming skills in Python (primary); familiarity with R or Java is a plus.
  • Proficiency with standard bioinformatics tools (GATK, DeepVariant, VEP, ANNOVAR, MultiQC, FastQC, etc.).
  • Experience with relational databases (PostgreSQL, MySQL, or Oracle) and NoSQL databases (MongoDB).
  • Confident use of Git and GitHub for version control and collaborative development.
  • Experience with cloud computing platforms (AWS or GCP, or Azure) for bioinformatics workloads.
  • Familiarity with high-performance computing (HPC) environments is a plus.
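As a toy illustration of the variant-calling workflows listed above, here is a pure-Python sketch that parses the fixed columns of a single VCF data line and applies a simple QUAL/FILTER threshold. It is a stand-in for real tooling like GATK or VEP; the column layout follows the VCF specification, and the quality threshold is arbitrary:

```python
def parse_vcf_record(line: str) -> dict:
    """Parse the fixed columns (CHROM POS ID REF ALT QUAL FILTER ...) of a VCF data line."""
    fields = line.rstrip("\n").split("\t")
    return {
        "chrom": fields[0],
        "pos": int(fields[1]),
        "ref": fields[3],
        "alt": fields[4],
        # QUAL may be "." (missing) per the VCF spec
        "qual": float(fields[5]) if fields[5] != "." else None,
        "filter": fields[6],
    }


def passes_quality(record: dict, min_qual: float = 30.0) -> bool:
    """Keep variants that passed upstream filters with QUAL at or above the threshold."""
    return (record["filter"] == "PASS"
            and record["qual"] is not None
            and record["qual"] >= min_qual)
```

In practice this filtering step would sit inside a Nextflow or Snakemake process; the sketch only shows the per-record logic.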


Why join NonStop:

  • Work on real-world genomics and clinical bioinformatics problems that directly impact patient care and scientific discovery.
  • Collaborate with life sciences clients at the cutting edge, from rare disease diagnostics to AI-assisted bioinformatics platforms.
Cambridge Wealth (Baker Street Fintech)
Sangeeta Bhagwat
Posted by Sangeeta Bhagwat
Pune
0 - 2 yrs
₹1.9L - ₹4L / yr
SQL
Python
Amazon Web Services (AWS)
Spotfire
Qlikview

Who are we aka "About Us":

 

We are an early-stage fintech startup, working on exciting fintech products for some of the top 5 global banks while building our own. If you are looking for a place where you can make a mark and not just be a cog in the wheel, Baker Street Fintech Pvt Ltd (parent company) might be the place for you. We have a flat, ownership-oriented culture and deliver world-class quality. You will be working with a founding team that has delivered over 26 industry-leading product experiences and won Webby Awards for digital strategy. In short, a bleeding-edge team.

 

As Cambridge Wealth, we are well-established in the wealth and mutual fund distribution segment, having won awards from BSE Star as well as Mutual Fund houses. Our UHNI/HNI/NRI clients include renowned professionals from various industries. 

 

What we are looking for, a.k.a. “The JD”:

 

We are seeking a skilled and detail-oriented Data Analyst to join our product team. As a Data Analyst, you will play a crucial role in extracting, analysing, and interpreting complex financial data to drive strategic decision-making and optimize our data solutions. The ideal candidate should possess a strong foundation in SQL / NoSQL databases, Python programming, and proficiency in tools like PostgreSQL and Excel. A deep understanding of financial concepts is also a plus. Additionally, having an interest in business intelligence tools and machine learning will be valuable for this role.

 

Responsibilities:

  • Write complex SQL queries proficiently.
  • Use Python for data manipulation, analysis, and visualisation, with libraries such as pandas, matplotlib, and psycopg.
  • Perform database optimization, indexing, and query tuning to ensure high performance.
  • Monitor and maintain data quality, troubleshoot data-related issues, and implement solutions to optimize data integrity and performance.
  • Design, configure, and maintain PostgreSQL databases
  • Set up and manage database clusters, replication, and backups for disaster recovery
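A minimal sketch of the SQL side of the role, using Python's built-in sqlite3 as a stand-in for PostgreSQL (the table, data, and index are invented): adding an index on the filtered column is the kind of change query tuning typically starts with.

```python
import sqlite3

# In-memory database standing in for PostgreSQL; schema and rows are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, fund TEXT, amount REAL)")
conn.executemany("INSERT INTO trades (fund, amount) VALUES (?, ?)",
                 [("alpha", 100.0), ("alpha", 250.0), ("beta", 75.0)])

# An index on the filtered column lets the planner avoid a full table scan.
conn.execute("CREATE INDEX idx_trades_fund ON trades (fund)")

# Parameterized aggregate query of the kind a data analyst would tune.
total = conn.execute(
    "SELECT SUM(amount) FROM trades WHERE fund = ?", ("alpha",)
).fetchone()[0]
print(total)  # 350.0
```

On PostgreSQL the same investigation would use EXPLAIN ANALYZE to confirm the index is actually chosen.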

 

Preferred Qualifications:

  • Intermediate-level Excel skills for data analysis and reporting.
  • Strong communication skills to present findings and recommendations effectively to both technical and non-technical stakeholders.
  • Detail-oriented mindset with a commitment to data accuracy and quality.

 

(Only applicants who have completed their educational commitments should apply.)

 

Not sure whether you should apply? Here's a quick checklist to make things easier. You are someone who:

  • Has worked with (0–1.5 years preferably), or is looking to work specifically with, an early-stage startup.
  • Is ready to be part of a zero-to-one journey, which means being involved in building fintech products and processes from the ground up.
  • Is comfortable working in an unstructured environment with a small team, where you decide what your day looks like, take the initiative to pick up the right piece of work, own it, and work with the founding team on it.
  • Doesn't need someone checking in every few hours: it is up to you to schedule check-ins whenever you find the need to; otherwise we assume you are progressing well with your tasks. You will be expected to find solutions to problems and suggest improvements.
  • Wants complete ownership of your role and to be able to drive it the way you think is right.
  • Can be a self-starter, take ownership of deliverables, develop a consensus with the team on approach and methods, and deliver on them.
  • Is looking to stick around for the long term and grow with the company.

 

Automotive, Manufacturing


Agency job
Pune
5 - 10 yrs
₹18L - ₹18L / yr
Object Oriented Programming (OOPs)
Java
Python
C
C++

We have an urgent opening for a highly skilled and passionate professional for the role below:

Quick Role Overview:

  • Role: Python Automation Developer
  • Location: Pune (Near Sangamwadi – Metro Connectivity)
  • Working Model: Hybrid (4 Days Work from Office)
  • Experience: 6 – 9 Years (Minimum 5.9+ Years in OOPs Development)
  • Qualification: B.E. / MCA / M.Sc. / B.Sc.
  • Notice Period: Early Joiners Preferred (15–30 Days Max)


Job Description

We are looking for a strong Python Automation Developer with solid Object-Oriented Programming expertise. This role is ideal for professionals who are strong in Java / C++ / C# and are willing to transition into Python (if not already working in Python).

You will be responsible for designing, developing, and maintaining high-quality automation solutions while translating business requirements into scalable and efficient technical implementations.

This is an excellent opportunity to work in a German-based product company offering strong work-life balance and a global work culture.


Key Responsibilities

  • Design and develop automation solutions using Python (preferred) or other OOP-based languages.
  • Translate functional requirements into scalable technical designs.
  • Apply strong design patterns and coding best practices.
  • Write clean, efficient, maintainable, and well-documented code.
  • Debug, troubleshoot, and optimize performance issues.
  • Work closely with cross-functional teams in a global environment.
  • Supervise and mentor junior team members when required.
  • Take complete ownership of assigned modules.


Desired Skills & Competencies

Must-Have Skills:

  • 5.9+ years of relevant experience in OOPs (Python / Java / C++ / C#)
  • Strong analytical, coding, debugging, and design skills
  • Excellent understanding of design patterns and frameworks
  • Ability to convert requirements → design → fully functional implementation
  • Strong problem-solving mindset
  • Good English communication skills
  • Ability to work independently with minimum supervision

(Candidates with Java/C++/C# background must be willing to work in Python.)


Good to Have:

  • Experience with defect management tools
  • Experience with GIT / SVN or any code repository management tools
  • Experience with Eclipse IDE or equivalent IDE


Hashone Career
Madhavan I
Posted by Madhavan I
Bengaluru (Bangalore), Pune, Hyderabad, Chennai, Noida
7 - 10 yrs
₹20L - ₹35L / yr
Python
Django
SQL

ROLE SUMMARY

The Senior Python Developer designs, builds, and improves Python and Django applications. The role includes developing end‑to‑end integrations using REST and SOAP services and delivering reliable, scalable solutions through hands‑on coding and data transformation work. The developer works closely with Business Analysts, architects, and other teams to ensure technical solutions support business needs. Key responsibilities also include improving SQL performance, taking part in code reviews, supporting DevOps workflows with Git and Azure DevOps, and helping integrate GenAI features—such as GPT models, embeddings, and agent‑based tools—into enterprise applications.

ROLE RESPONSIBILITIES

  • Design and develop Python and Django applications that are scalable, secure, and maintainable.
  • Implement UI components using CSS, Bootstrap, jQuery, or similar technologies as needed.
  • Develop integrations with internal and external systems using REST, SOAP, and WSDL‑based services.
  • Create and optimize SQL queries, database structures, and data access logic to support application features.
  • Work with Business Analysts and stakeholders to translate functional requirements into technical specifications and solutions.
  • Implement accurate data mappings and transformations in accordance with business and technical requirements.
  • Contribute to code reviews, follow established coding standards, and ensure high‑quality deliverables.
  • Support the implementation and maintenance of DevOps pipelines using Git and Azure DevOps.
  • Contribute to the integration of GenAI capabilities—including GPT models, embeddings, and agent‑based components—into enterprise applications.
  • Troubleshoot issues across the application stack and collaborate closely with peers to resolve technical challenges.
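The GenAI integration work above often reduces to operations over embedding vectors, such as ranking documents by similarity to a query. A dependency-free sketch (the vectors are made up; a real system would obtain them from an embedding model):

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors; 0.0 if either is zero-length."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def rank_by_similarity(query_vec: list[float], doc_vecs: dict) -> list:
    """Return document ids ordered most-similar-first to the query vector."""
    return sorted(doc_vecs,
                  key=lambda d: cosine_similarity(query_vec, doc_vecs[d]),
                  reverse=True)
```

In production the vectors would come from the embedding API and live in a vector store; only the ranking logic is shown here.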

TECHNICAL QUALIFICATIONS

  • 7+ years of hands‑on experience with Python and Django, including complex application development.
  • 5+ years of experience with SQL development, optimization, and database design.
  • At least 1-2 years of applied experience with GenAI technologies (GPT models, embeddings, agents, etc.).
  • Deep expertise in application architecture, system integration, and service‑oriented design.
  • Strong experience with DevOps tools and practices, including Git, Azure DevOps, CI/CD pipelines, and automated deployments.
  • Advanced understanding of REST, SOAP, WSDL, and large‑scale service integrations.

GENERAL QUALIFICATIONS

  • Exceptional verbal and written communication skills.
  • Strong analytical, problem‑solving, and architectural reasoning abilities.
  • Demonstrated leadership experience with the ability to guide and mentor technical teams.
  • Proven ability to work effectively in fast‑paced, collaborative environments.

EDUCATION REQUIREMENTS

  • Bachelor’s degree in Computer Science, MIS, or a related field.
  • Advanced certifications in Python, cloud technologies, or GenAI are preferred but not required.

 

Dansk Scanning IT Know-How Private Limited
Posted by Dansk Scanning IT Know-How Private Limited
Pune
5 - 8 yrs
₹12L - ₹24L / yr
Python
Java
C#
Angular (2+)
React.js

Tech Lead (India) — Help Build WebLager’s Next Engineering Hub

Location: India

Team: Product & Development

Reporting to: Head of Product & Development (Denmark)


Why this role exists

WebLager is scaling fast, and 2026 will be a breakout year. We’re building an Indian IT office that’s not an outsourced extension of Denmark.

This is a real “build it right from day one” leadership role.


You’ll be our right hand in India — shaping the team, culture, and delivery. If you want to build something meaningful that’s expected to grow a lot next year, keep reading.


What you’ll do

You’re not here to babysit Jira. You’re here to ship, lead, and raise the bar.


●    Build and lead our India engineering team from early stage into real scale in 2026.


●    Set standards for quality and delivery — clean code, stable systems, smart execution.


●    Coach and grow people across levels: students, juniors, mid-levels, seniors.


●    Create a local WebLager community that feels like one company, not two offices.


●    Work tightly with Denmark on product, architecture, and delivery — as a partner, not a follower.


●    Stay hands-on: design, code, review, refactor, deploy.


●    Scale enterprise systems: performance, reliability, maintainability, observability.


●    Improve how we work: CI/CD, engineering rituals, docs that matter, fewer surprises.


●    Be the technical anchor when things are complex, messy, or moving fast.


What you bring

We don’t care about buzzwords. We care about proof you can build and lead.


Must-haves:

 

●    5+ years as a developer, with real production systems behind you.


●    Strong backend skills — ideally Python or another scripting language, plus Java/C# or similar, and extensive database knowledge of both relational and non-relational databases.


●    Frontend experience with a reactive framework like Angular, React, Vue, etc.


●    Experience scaling enterprise-grade systems and making architecture tradeoffs that hold up.


●    You’ve led people before (formally or naturally) and enjoy helping others grow.


●    Excellent problem-solving skills — you don’t freeze when things are unclear; you untangle them.


●    Near-perfect English (spoken and written). This is non-negotiable — you’ll work daily across countries and levels.


●    You take ownership by default and don’t need a map for every step.



Nice-to-haves:

 

●    You’ve helped build or grow a team from scratch.


●    Cloud + DevOps experience.


●    Product-minded engineering: you care about outcomes, not just tasks.



The kind of person who’ll thrive here

Let’s be direct:


●    You’re driven to create real results, not just “do your part.”


●    You want to build something from the ground up and shape the future of a company.


●    You lead with calm, clarity, and high standards.


●    You’re motivated beyond the norm — you don’t settle for “good enough.”


●    You know a Tech Lead is someone who steps up, helps others win, and keeps shipping.


●    You’re hungry to learn, and confident enough to challenge weak solutions.



The kind of person who won’t

Also direct:


●    If you expect everything to be built around you, look for another job.


●    If you want Denmark to hand you tasks, this isn’t it.


●    If you avoid responsibility or hard conversations, this will hurt.


●    If “average and comfortable” is your goal, don’t apply.



We’re building an exceptional team. Mediocre doesn’t survive here.


What you get

●    A rare chance to build an office, a culture, and a high-performing team in India from scratch.


●    Direct partnership with Danish leadership and product org.


●    Real influence over architecture, standards, and execution.


●    A company that values ownership and speed over politics.


●    Massive growth opportunity as the India office scales in 2026.


●    Competitive salary + benefits.


How to apply

Only reach out if you genuinely believe you’re the right fit and you’re motivated to build something one-of-a-kind.

Send: (This is mandatory)


●    A short page about you and what you’ve built.


●    CV/LinkedIn/GitHub/portfolio.


●    2–3 projects you’re proud of, and why.










Deqode

Posted by Purvisha Bhavsar
Pune
5 - 6 yrs
₹4L - ₹10L / yr
Windows Azure
Python
PySpark
ADF
Databricks

🚀 Hiring: Data Engineer ( Azure ) at Deqode

⭐ Experience: 5+ Years

📍 Location: Pune, Bhopal, Jaipur, Gurgaon, Delhi, Bangalore

⭐ Work Mode: Hybrid

⏱️ Notice Period: Immediate Joiners

(Only immediate joiners & candidates serving notice period)


⭐ Hiring: Databricks Data Engineer – Lakeflow | Streaming | DBSQL | Data Intelligence

We are looking for a Databricks Data Engineer ( Azure ) to build reliable, scalable, and governed data pipelines powering analytics, operational reporting, and the Data Intelligence Layer.


🔹 Key Responsibilities

✅ Build optimized batch pipelines using Delta Lake (partitioning, OPTIMIZE, Z-ORDER, VACUUM)

✅ Implement incremental ingestion using Databricks Autoloader with schema evolution & checkpointing

✅ Develop Structured Streaming pipelines with watermarking, late data handling & restart safety

✅ Implement declarative pipelines using Lakeflow

✅ Design idempotent, replayable pipelines with safe backfills

✅ Optimize Spark workloads (AQE, skew handling, shuffle & join tuning)

✅ Build curated datasets for Databricks SQL (DBSQL), dashboards & downstream applications

✅ Package and deploy using Databricks Repos & Asset Bundles (CI/CD)

✅ Ensure governance using Unity Catalog and embedded data quality checks
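The watermarking and late-data handling mentioned above would normally use Spark Structured Streaming's `withWatermark`; as a concept-only toy in plain Python (event times and the delay are invented), a watermark trails the maximum event time seen and drops records older than it:

```python
def process_events(events, watermark_delay: int):
    """Toy event-time watermark: drop events older than
    (max event time seen so far) - watermark_delay."""
    max_event_time = 0
    kept, dropped = [], []
    for ts, payload in events:
        max_event_time = max(max_event_time, ts)
        watermark = max_event_time - watermark_delay
        if ts >= watermark:
            kept.append((ts, payload))
        else:
            # Too late: in Spark this record would be excluded from the window
            dropped.append((ts, payload))
    return kept, dropped
```

Real pipelines keep windowed state between micro-batches and checkpoint the watermark; the sketch only shows the accept/drop decision.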


✅ Mandatory Skills (Must Have)

👉 Databricks & Delta Lake (Advanced Optimization & Performance Tuning)

👉 Structured Streaming & Autoloader Implementation

👉 Databricks SQL (DBSQL) & Data Modeling for Analytics

Wissen Technology

Posted by Shikha Nagar
Pune
4 - 6 yrs
₹4L - ₹18L / yr
AWS Lambda
Python
pandas
SQL


We are hiring for a Python Developer at Wissen Technology!


📍 Location: Pune (Hybrid)

💼 Experience: 3–6 Years

⏱️ Notice Period: Immediate / 15 days preferred

🔧 Key Skills:

• Strong experience in Python

• Hands-on with Pandas & NumPy

• Experience with AWS (S3, Lambda preferred)

• Good understanding of data processing & APIs

• SQL knowledge


🏢 About Wissen Technology:

Wissen Technology, part of the Wissen Group (est. 2000), is a fast-growing technology company specializing in high-end consulting across Banking, Finance, Telecom, and Healthcare domains.

✔️ Global presence – US, India, UK, Australia, Mexico & Canada

✔️ Certified Great Place to Work®

✔️ Trusted by Fortune 500 clients like Morgan Stanley, Goldman Sachs, and more


✔️ Strong growth with 400% revenue increase in recent years

🌐 Website: www.wissen.com

🔗 LinkedIn: https://www.linkedin.com/company/wissen-technology/

If you’re interested or have relevant candidates, please share your resume at [your email].

#Hiring #PythonDeveloper #PuneJobs #AWS #ImmediateJoiner


While you may already know about Wissen and the company history, here is a quick rundown for you.

 

About Wissen Technology:


·        The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015.

·        Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains. We help clients build world class products.

·        Our workforce has highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world.

·        Wissen Technology has grown its revenues by 400% in these five years without any external funding or investments.

·        Globally present with offices in the US, India, the UK, Australia, Mexico, and Canada.

·        We offer an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.

·        Wissen Technology has been certified as a Great Place to Work®.

·        Wissen Technology has been voted as the Top 20 AI/ML vendor by CIO Insider in 2020.

·        Over the years, Wissen Group has successfully delivered $650 million worth of projects for more than 20 of the Fortune 500 companies.

·        We have served clients across sectors like Banking, Telecom, Healthcare, Manufacturing, and Energy, including the likes of Morgan Stanley, Goldman Sachs, MSCI, State Street, Flipkart, Swiggy, Trafigura, and GE, to name a few.




Job Title: Application Development Engineer (Python – Backtesting & Index Platforms)

Role Overview

We are seeking a strong Python Application Engineer to help build a next-generation Index Backtesting and Rebalance Platform. In this role, you will design and develop deterministic, scalable calculation engines that convert financial methodologies into production-grade software. You will work on portfolio construction, rebalancing logic, and historical simulations while consuming reference and market data from Snowflake, using Python as your core processing tool.

Key Responsibilities

Engine Development: Design and implement modular, reusable Python components for index construction, rebalancing, and backtesting.

Large-Scale Simulation: Use Pandas, NumPy, and PySpark to run historical calculations across long time horizons and multiple index variants.

Workflow Integration: Integrate engines with orchestrators such as Airflow or Temporal using parameterized, config-driven execution.

Reference Data Consumption: Query and utilize pricing, security master, and corporate action data from Snowflake.

Quality & Reconciliation: Build automated test harnesses to validate outputs, compare against benchmarks, and guarantee reproducibility.

Performance Optimization: Improve runtime efficiency through vectorization, caching, and distributed computing patterns.

Cross-Team Collaboration: Partner with Business, Index Ops, and Platform teams to accelerate research-to-production onboarding.
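The large-scale simulation work above can be sketched as a small vectorized NumPy backtest; the equal-weight rebalance rule here is an illustrative assumption, not an actual index methodology:

```python
import numpy as np

def backtest_equal_weight(prices: np.ndarray) -> np.ndarray:
    """Simulate an index that rebalances to equal weights each period.

    prices: (T, N) array of constituent prices over T periods.
    Returns the cumulative index level, base 100.
    """
    returns = prices[1:] / prices[:-1] - 1.0      # per-period simple returns
    portfolio_returns = returns.mean(axis=1)      # equal weight = plain mean
    levels = 100.0 * np.cumprod(1.0 + portfolio_returns)
    return np.concatenate(([100.0], levels))

prices = np.array([[10.0, 20.0],
                   [11.0, 20.0],
                   [11.0, 22.0]])
levels = backtest_equal_weight(prices)
```

Because every step is a whole-array operation, the same code scales to decades of history and many index variants without Python-level loops.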

Required Technical Capabilities

Python Expertise: Strong proficiency in Python application development with emphasis on clean architecture and maintainable design.

Data & Numerical Libraries: Deep experience with Pandas and NumPy; working knowledge of PySpark for distributed workloads.

Financial Computation: Ability to implement portfolio mathematics, weighting algorithms, and time-series transformations.

Config-Driven Systems: Experience building rule-based or metadata-driven processing frameworks.

Database Skills: Strong SQL and experience consuming structured data from Snowflake.

Testing Discipline: Expertise in unit testing, regression testing, and deterministic replay of calculations.

Orchestration Integration: Familiarity with Airflow, Temporal, or similar workflow engines.

Cloud Infrastructure: Solid understanding of AWS ecosystem services (S3, Lambda, IAM) and how they integrate with the Snowflake Data Cloud.



Read more
Cambridge Wealth (Baker Street Fintech)
Sangeeta Bhagwat
Posted by Sangeeta Bhagwat
Pune
3 - 5 yrs
₹10L - ₹12L / yr
skill iconPython
SQL
skill iconAmazon Web Services (AWS)
skill iconPostgreSQL
pandas
+9 more


Department: Product & Technology

Location: On-site | Prabhat Road, Pune

Experience: 3–5 Years in a Data Engineering or Analytics Role

Domain: Fintech / Wealth Management (non-negotiable)

Compensation: 11–12 LPA Fixed + Performance Bonus

Growth: Title upgrade + salary revision at 12–18 months for strong performers


Why this role is different from most Data Engineer postings

You will work directly with the founding team on a live wealth management platform used by HNI and NRI clients. You will not spend years in a queue waiting to matter: your work ships to production, your analysis influences product decisions, and you will guide junior teammates from day one. If you perform, a raise and title upgrade are on the table within 12–18 months. This is the kind of early-team role that defines careers.


About Cambridge Wealth

Cambridge Wealth is a fast-growing, award-winning Financial Services and Fintech firm obsessed with quality and exceptional client service. We serve a high-profile clientele of NRI, Mass Affluent, HNI, and ultra-HNI professionals, and have received multiple awards from major Mutual Fund houses and the BSE. We are past the zero-to-one stage and now focused on scaling our features and intelligence layer. You will be joining at exactly the right time.


What You Will Be Doing

This is a central, hands-on data engineering role at the intersection of financial analytics and applied ML. You will own the data pipelines and analytical models that power investment insights for wealth management clients, transforming transaction data and portfolio information into measurable, actionable intelligence.

We are not looking for someone who just keeps the lights on. We want someone who looks at a working system and immediately sees how to make it 10x faster, cleaner, and smarter using AI and automation wherever possible.


Key Responsibilities:


Data Engineering & Pipelines

  • Build and optimize PostgreSQL-based pipelines to process large volumes of investment transaction data.
  • Design and maintain database schemas, foreign tables, and analytical structures for performance at scale.
  • Write advanced SQL — window functions, stored procedures, query optimization, index design.
  • Build Python automation scripts for data ingestion, transformation, and scheduled pipeline runs.
  • Monitor AWS RDS workloads and troubleshoot performance issues proactively.
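A sketch of the advanced SQL listed above, using a window function to compute a running portfolio total. It uses Python's built-in sqlite3 so it is self-contained; the table and column names are hypothetical, and production work would run against PostgreSQL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txn (client TEXT, ts INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO txn VALUES (?, ?, ?)",
    [("A", 1, 100.0), ("A", 2, 50.0), ("B", 1, 200.0)],
)
# Running total per client, ordered by transaction time.
rows = conn.execute("""
    SELECT client, ts, amount,
           SUM(amount) OVER (PARTITION BY client ORDER BY ts) AS running_total
    FROM txn
    ORDER BY client, ts
""").fetchall()
```

The same `PARTITION BY ... ORDER BY` pattern carries over directly to PostgreSQL window functions.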


Financial Analytics & Modelling

  • Develop analytical frameworks to evaluate client portfolios against benchmarks and category averages.
  • Build data models covering mutual fund schemes, SIPs, redemptions, switches, and transfer lifecycles.
  • Create materialized views and derived tables optimized for dashboards and internal reporting tools.
  • Analyse client transaction history to surface patterns in investment behaviour and financial discipline.


Applied ML & AI-Driven Development

  • Use Python (Pandas, NumPy, Scikit-learn) for trend analysis, forecasting, and predictive modelling.
  • Implement classification or regression models to support financial pattern detection.
  • Use AI tools — LLMs, Copilots — to accelerate ETL development, code quality, and data cleaning.
  • Identify opportunities to automate repetitive data tasks and advocate for smarter tooling.
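A minimal sketch of the classification work described above, using Scikit-learn on synthetic data. The features and labels are invented for illustration (monthly SIP amount in thousands, missed installments); this is not the production model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features: [monthly_sip_amount_thousands, missed_installments]
X = np.array([[5, 0], [8, 1], [1, 5], [0.5, 6], [7, 0], [0.8, 4]])
y = np.array([1, 1, 0, 0, 1, 0])  # 1 = disciplined investor, 0 = at risk

# Fit a simple linear classifier that can score new client profiles.
model = LogisticRegression().fit(X, y)
```

In practice the features would come from the transaction history pipelines above, with proper train/test splits and evaluation.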


Data Quality & Governance

  • Own data integrity end-to-end in a live, high-stakes financial environment.
  • Build and maintain validation and cleaning protocols across all financial datasets.
  • Maintain Excel models, Power Query workflows, and structured reporting outputs.


Collaboration & Junior Mentorship

  • Work directly with Product, Investment Research, and Wealth Advisory teams.
  • Translate open-ended business questions into structured queries and measurable outputs.
  • Guide 1–2 junior trainees — review their work, set code quality standards, and help them grow.
  • Present findings clearly to non-technical stakeholders — no jargon, just clarity.


Skills — What We Need vs. What Helps


Must-Haves:

SQL & PostgreSQL (window functions, stored procedures, optimization)

Python — Pandas, NumPy for data processing and automation

ML fundamentals — classification or regression (Scikit-learn)

AWS RDS or equivalent cloud database experience

Financial domain knowledge — mutual funds, SIPs, portfolio concepts

Python data visualization — Matplotlib, Seaborn, or Plotly

Strong Advantage

Excel — Power Query, advanced modelling

Materialized views, query planning, index optimization

Experience with BI/dashboard tools

Good to Have

NoSQL databases

Prior fintech or wealth management startup experience


Financial Domain — Non-Negotiable

This is a wealth management platform. You must come in with a working understanding of:

  • Mutual fund structures, scheme types, and NAV-based transactions
  • Investment lifecycle — SIPs, Lump Sum, Redemptions, Switches, and STPs
  • Portfolio allocation and benchmarking against indices (e.g. Nifty 50, category averages)
  • How HNI/NRI clients interact with financial products differently from retail investors

You do not need to be a CFA. But if mutual funds and portfolio analytics are completely new territory, this role is not the right fit right now.
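As a concrete illustration of the NAV-based mechanics above: a SIP converts each installment into units at that day's NAV. The numbers below are invented:

```python
def sip_units(installments):
    """Total units accumulated by a SIP.

    installments: iterable of (amount, nav) pairs, one per installment.
    Each installment buys amount / nav units at that day's NAV.
    """
    return sum(amount / nav for amount, nav in installments)

# Three monthly installments of 10,000 at varying NAVs (hypothetical).
units = sip_units([(10000, 100.0), (10000, 125.0), (10000, 80.0)])
current_value = units * 110.0  # portfolio value at a later NAV of 110
```

Note how the low-NAV month buys the most units, which is the rupee-cost-averaging effect portfolio analytics often has to account for.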


The Culture Fit — Read This Carefully

We are a small, fast-moving team. This is not a place where you wait for a ticket to arrive in your queue. The right person for this role:

  • Has worked at a small startup before and is used to wearing multiple hats
  • Finds broken or slow data systems genuinely irritating and fixes them without being asked
  • Reaches for Python or an LLM when there is a repetitive task — automating is instinctive
  • Is comfortable saying 'I don't know but I'll find out' and follows through independently
  • Wants visibility and ownership, not just a well-defined job description
  • Is looking for a role where strong performance is directly visible and rewarded


Growth Path — What Happens If You Perform

This is not a vague 'growth opportunity' pitch.

If you hit the bar in your first 12–18 months, you will receive a salary revision and a title upgrade to Senior Data Engineer or Lead Data Engineer depending on team expansion. As we scale our Data and AI team, this role is the natural stepping stone to a team lead position. You will also gain direct exposure to founding-team decision-making — the kind of access that is hard to get at larger companies.


Preferred Background

  • 2–4 years in a data engineering or analytics role at a startup or small Fintech
  • Experience in a live product environment where data errors have real consequences
  • Exposure to portfolio analytics, investment research, or wealth management platforms
  • Has mentored or reviewed code for at least one junior team member


Hiring Process

We respect your time. The process is direct and moves fast.

  • Screening Questions — 5 minutes online
  • Online Challenge — MCQs (Data, SQL, AWS, etc.), one applied ML or analytics problem, and a communication and personality assessment (focused, not trick questions)
  • People Round — 30-minute video call, culture and communication
  • Technical Deep-Dive — 1 hour in person, live financial data problems and your past work
  • Founder's Interview — 1 hour in person, growth conversation and mutual fit
  • Offer & Background Verification


Read more
The Blue Owls Solutions

at The Blue Owls Solutions

2 candid answers
Apoorvo Chakraborty
Posted by Apoorvo Chakraborty
Pune
2 - 5 yrs
₹10L - ₹18L / yr
PySpark
SQL
skill iconPython
Data engineering
ETL

Blue Owls Solutions is looking for a mid-level Azure Data Engineer with approximately 4 years of hands-on experience to join our growing data team. In this role, you will design, build, and maintain scalable data pipelines and architectures that power business-critical analytics and reporting. You'll work closely with cross-functional teams to transform raw data into reliable, high-quality datasets that drive decision-making across the organization.

Required Skills

  • 4+ years of professional experience as a Data Engineer or in a similar data-focused role
  • Strong proficiency in SQL for data manipulation, querying, and performance optimization
  • Hands-on experience with PySpark for large-scale data processing and transformation
  • Solid working knowledge of the Microsoft Azure ecosystem (Azure Data Factory, Azure Data Lake, Azure Synapse, etc.)
  • Experience with Microsoft Fabric for end-to-end data analytics workflows
  • Ability to design and implement robust data architectures including data warehouses, lakehouses, and ETL/ELT frameworks
  • Strong coding and scripting skills with Python
  • Proven problem-solving ability with a knack for debugging complex data issues and optimizing pipeline performance
  • Understanding of data modeling concepts, dimensional modeling, and data governance best practices


Interview Process

  • Take-Home Assessment
  • 60-Minute Technical Interview
  • Culture Fit Round


Preferred Skills & Certifications

  • Microsoft Certified: Fabric Analytics Engineer Associate (DP-600)
  • Microsoft Certified: Fabric Data Engineer Associate (DP-700)
  • Experience with CI/CD practices for data pipelines
  • Familiarity with version control systems such as Git
  • Exposure to real-time streaming data solutions
  • Experience working in Agile or Scrum environments
  • Strong communication skills with the ability to translate technical concepts for non-technical stakeholders

What We Offer

  • Competitive salary and performance-based bonuses
  • Flexible hybrid options
  • Opportunities for professional development, training, and certification sponsorship
  • A collaborative, innovation-driven team culture
  • Paid time off and company holidays
Read more
Wissen Technology

at Wissen Technology

4 recruiters
Meghana Shinde
Posted by Meghana Shinde
Pune, Bengaluru (Bangalore)
4 - 9 yrs
Best in industry
skill iconC++
skill iconPython




JOB DESCRIPTION: C++ Developer

Experience: 4–7 Years

Location: Pune

No. of Positions: 1

We are seeking an experienced C++ Developer with 4–7 years of experience to work on financial systems. The role involves working on mission-critical applications such as trading platforms, market data systems, risk engines, or payment processing systems, where performance, stability, and correctness are paramount.

 

 

1. General Requirements

  • 4–7 years of professional C++ experience in performance-critical systems
  • Expert knowledge of modern C++ (C++11/14/17)
  • Strong understanding of data structures, algorithms, and memory models
  • Deep experience with multithreading, atomics, lock-free programming, and CPU cache behavior
  • Excellent knowledge of Linux internals and system-level programming
  • Experience with low-level debugging and profiling (gdb, perf, valgrind, flamegraphs)
  • Proficiency with CMake/Make and Git

2. Trading Systems Experience (Highly Preferred)

  • Hands-on experience with order management systems (OMS) and execution engines
  • Knowledge of exchange protocols: FIX, ITCH, OUCH, FAST
  • Experience handling market data feeds (L1/L2, multicast, UDP)
  • Understanding of latency measurement, clock synchronization, and time stamping




Read more
TVARIT GmbH

at TVARIT GmbH

2 candid answers
DrSoumya Sahadevan
Posted by DrSoumya Sahadevan
Pune
7 - 15 yrs
₹20L - ₹30L / yr
skill iconAmazon Web Services (AWS)
Windows Azure
Google Cloud Platform (GCP)
PySpark
databricks
+2 more

About TVARIT

TVARIT GmbH specializes in developing and delivering cutting-edge artificial intelligence (AI) solutions for the metal industry, including steel, aluminum, copper, cast iron, and more. Our software products empower customers to make intelligent, data-driven decisions, driving advancements in Predictive Quality (PsQ), Predictive Maintenance (PdM), and Energy Consumption Reduction (PsE), etc. With a strong portfolio of renowned reference customers, state-of-the-art technology, a talented research team from prestigious universities, and recognition through esteemed awards such as the EU Horizon 2020 AI Prize, TVARIT is recognized as one of the most innovative AI companies in Germany and Europe. We are seeking a self-motivated individual with a positive "can-do" attitude and excellent oral and written communication skills in English to join our team.


Job Description: We are looking for a Senior Data Engineer with strong expertise in Azure Databricks, PySpark, and distributed computing to develop and optimize scalable ETL pipelines for manufacturing analytics. The role involves working with high-frequency industrial data to enable real-time and batch data processing.


Key Responsibilities

  • Build scalable real-time and batch processing workflows using Azure Databricks, PySpark, and Apache Spark.
  • Perform data pre-processing, including cleaning, transformation, deduplication, normalization, encoding, and scaling to ensure high-quality input for downstream analytics.
  • Design and maintain cloud-based data architectures, including data lakes, lakehouses, and warehouses, following the Medallion Architecture.
  • Deploy and optimize data solutions on Azure (preferred), AWS, or GCP with a focus on performance, security, and scalability.
  • Develop and optimize ETL/ELT pipelines for structured and unstructured data from IoT, MES, SCADA, LIMS, and ERP systems.
  • Automate data workflows using CI/CD and DevOps best practices, ensuring security and compliance with industry standards.
  • Monitor, troubleshoot, and enhance data pipelines for high availability and reliability.
  • Utilize Docker and Kubernetes for scalable data processing.
  • Collaborate with the automation team, data scientists, and engineers to provide clean, structured data for AI/ML models.
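The pre-processing steps above (cleaning, deduplication, scaling) can be sketched with Pandas; the column names and the min-max scaling choice are illustrative assumptions:

```python
import pandas as pd

def preprocess(df: pd.DataFrame) -> pd.DataFrame:
    """Clean a hypothetical sensor table: drop null readings and
    duplicate (sensor, timestamp) rows, then min-max scale to [0, 1]."""
    df = df.dropna(subset=["reading"]).drop_duplicates(subset=["sensor_id", "ts"])
    lo, hi = df["reading"].min(), df["reading"].max()
    return df.assign(reading_scaled=(df["reading"] - lo) / (hi - lo))

raw = pd.DataFrame({
    "sensor_id": [1, 1, 1, 2],
    "ts": [0, 0, 1, 0],                # first two rows are duplicates
    "reading": [10.0, 10.0, 30.0, None],
})
clean = preprocess(raw)
```

The same transformations translate almost one-for-one into PySpark DataFrame operations when the data outgrows a single machine.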


Desired Skills and Qualifications

  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • 7+ years of experience in core data engineering, with a strong focus on cloud platforms such as Azure (preferred), AWS, or GCP.
  • Proficiency in PySpark, Azure Databricks, Python, and Apache Spark.
  • 2+ years of team-handling experience.
  • Expertise in relational databases (e.g., SQL Server, PostgreSQL), time-series databases (e.g., InfluxDB), and NoSQL databases (e.g., MongoDB, Cassandra).
  • Experience with containerization (Docker, Kubernetes).
  • Strong analytical and problem-solving skills with attention to detail.
  • Good to have: MLOps and DevOps experience, including model lifecycle management.
  • Excellent communication and collaboration skills, with a proven ability to work effectively as a team player.
  • Comfortable working in a dynamic, fast-paced startup environment, adapting quickly to changing priorities and responsibilities.

Read more
Hashone Career
Pune
4 - 7 yrs
₹8L - ₹14L / yr
skill iconReact.js
skill iconNodeJS (Node.js)
skill iconPython

About Us

Wednesday is a technology consulting and engineering firm based in Pune. We specialise in helping digital-first businesses solve complex engineering problems. Our expertise lies in data engineering, applied AI, and app development. We offer our expertise through our services: Launch, Catalyse, Amplify, and Control.


We're a passionate bunch of people who take their work seriously. We deeply care about each other and are united by the cause of building teams that deliver great digital products & experiences.

Job Description

We are seeking Senior Software Engineers who can architect and ship fullstack digital products at a high bar — using AI-assisted development tools to move faster without cutting corners. This role spans platform, product, and go-to-market — you'll own backend systems, shape frontend experiences, make infrastructure decisions, and set a higher engineering standard for the team around you. The ideal candidate has designed systems they can defend, shipped products at scale, and knows what it takes to get there.

Requirements

Product & Client Ownership Be the day-to-day technical owner on engagements — understand the client's business deeply, shape the product roadmap, and translate ambiguous problems into clear engineering direction. Show up to demos and reviews with the confidence to defend tradeoffs and flag risks early.

Architecture & Judgment Make architectural decisions that hold up at scale. AI can generate code — your job is to decide what gets built, how it fits together, and when to push back. Evaluate tradeoffs, review TRDs, and set the technical direction the rest of the team executes against.

Fullstack Execution Ship backend services, APIs, database schemas, and user-facing features end-to-end. Use AI-assisted tools (Cursor, Claude Code, Antigravity) to move at the speed of a small team without cutting corners on quality.

Platform & Reliability Own cloud infrastructure, CI/CD, and production systems. Define how the team monitors, debugs, and responds to incidents. If something breaks at 2am, you've already thought about it.

AI & Automation Drive AI adoption in products — LLM APIs, RAG pipelines, agentic workflows. Push for automation across client and internal workflows. Know what these tools are good at and, more importantly, where they fail.

Raising the Bar Be the judgment layer for junior engineers who are moving fast with AI tools. Review code for what matters — not style, but correctness, scalability, and whether the author actually understood what they shipped. Run knowledge-sharing sessions. Onboard people well.
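As a flavor of the RAG pipelines mentioned above, the retrieval step can be sketched with no external dependencies: a toy bag-of-words ranker standing in for a real embedding store (all strings are invented):

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-frequency vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str]) -> str:
    # Return the document most similar to the query.
    q = Counter(query.lower().split())
    return max(docs, key=lambda d: cosine(q, Counter(d.lower().split())))

docs = ["reset your password from settings",
        "invoices are emailed monthly",
        "contact support for refunds"]
best = retrieve("how do I reset my password", docs)
```

A production pipeline would swap the word counts for dense embeddings and a vector index, then feed the retrieved context into the LLM prompt.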

Must Haves


3–5 years of professional engineering experience with production systems you've owned end-to-end.

Active user of AI IDEs (Cursor, Claude Code, Antigravity, or similar).

Demonstrated system design ability — you've made architectural decisions and can evaluate trade-offs.

Good exposure to cloud platforms and deployments.

Familiarity with observability and monitoring tools — you can track down issues and identify bottlenecks.

Deep backend proficiency: API design, databases, microservices, distributed systems, event-driven architecture, and message brokers.

Worked with at least two of REST, GraphQL, or gRPC in production.

Eye for design — you care about the experiences you build for users.

High rate of learning — you figure things out fast.


Nice to Have


Cloud architecture experience (AWS, GCP, Azure) with containerisation and orchestration.

Familiarity with AI/ML: prompt engineering, embeddings, agent frameworks (LangChain, CrewAI, LangGraph).

Experience with automation and workflow tools (n8n, Make, Zapier).




Benefits

Mentorship: Work next to some of the best engineers and designers — and be one for others.

Freedom: An environment where you get to practice your craft. No micromanagement.

Comprehensive healthcare: Healthcare for you and your family.

Growth: A tailor-made program to help you achieve your career goals.

A voice that is heard: We don't claim to know the best way of doing things. We like to listen to ideas from our team.


Read more
Metron Security Private Limited
Prathamesh Shinde
Posted by Prathamesh Shinde
Pune
2 - 7 yrs
₹5L - ₹12L / yr
skill iconPython
skill iconGo Programming (Golang)
skill icon.NET
skill iconNodeJS (Node.js)

Job Description:


We are looking for a skilled Backend Developer with 2–5 years of experience in software development, specializing in Python and/or Golang. If you have strong programming skills, enjoy solving problems, and want to work on secure and scalable systems, we'd love to hear from you!


Location - Pune, Baner.

Interview Rounds - In Office


Key Responsibilities:

Design, build, and maintain efficient, reusable, and reliable backend services using Python and/or Golang

Develop and maintain clean and scalable code following best practices

Apply Object-Oriented Programming (OOP) concepts in real-world development

Collaborate with front-end developers, QA, and other team members to deliver high-quality features

Debug, optimize, and improve existing systems and codebase

Participate in code reviews and team discussions

Work in an Agile/Scrum development environment


Required Skills:

Strong experience in Python or Golang (working knowledge of both is a plus)


Good understanding of OOP principles

Familiarity with RESTful APIs and back-end frameworks

Experience with databases (SQL or NoSQL)

Excellent problem-solving and debugging skills

Strong communication and teamwork abilities


Good to Have:

Prior experience in the security industry

Familiarity with cloud platforms like AWS, Azure, or GCP

Knowledge of Docker, Kubernetes, or CI/CD tools

Read more
TVARIT GmbH

at TVARIT GmbH

2 candid answers
DrSoumya Sahadevan
Posted by DrSoumya Sahadevan
Pune
5 - 15 yrs
₹20L - ₹38L / yr
skill iconReact.js
API
AWS CloudFormation
skill iconDjango
skill iconNodeJS (Node.js)
+7 more

Availability: Full time 

Location: Pune, India 

Experience: 5–6 years

 

Tvarit Solutions Private Limited is a wholly owned subsidiary of TVARIT GmbH, Germany. TVARIT provides software to reduce manufacturing waste like scrap, energy, and machine downtime using its patented technology. With its software products and a highly competent team from renowned universities, TVARIT has gained customer trust across 4 continents within a short span of 3 years. TVARIT is awarded among the top 8 out of 490 AI companies by the European Data Incubator, apart from many more awards by the German government and industrial organizations, making TVARIT one of the most innovative AI companies in Germany and Europe.

 

We are looking for a passionate Full Stack Developer (Level 2) to join our technology team at our Pune centre. You will be responsible for architecture, design, development, and testing, leading the software development team, and working toward infrastructure development that will support the company's solutions. You will get an opportunity to work closely on projects that involve the automation of the manufacturing process.

 

Key Responsibilities 

· Full Stack Development: Design, develop, and maintain scalable web applications using React with TypeScript for the frontend and Node.js/Python for the backend.

· AI Integration: Collaborate with data scientists and ML engineers to integrate AI/ML models into the SaaS platform, ensuring seamless performance and usability.

· API Development & Optimization: Build and optimize high-performance REST APIs in Node.js and Python (Django, Flask, or FastAPI) to support real-time data processing and analytics.

· Database Engineering: Design, manage, and optimize data storage using relational (PostgreSQL), NoSQL (MongoDB/DynamoDB), graph, and vector databases for handling complex industrial data.

· Cloud-Native Deployment: Deploy, monitor, and manage services in containerized environments using Docker and Kubernetes on Linux-based systems (Ubuntu/Debian).

· System Architecture & Design: Contribute to architectural decisions, leveraging OOPs, microservices, domain-driven design, and design patterns to ensure scalability, security, and maintainability.

· Data Handling & Processing: Work with large-scale manufacturing datasets using Python (pandas) to enable predictive analytics and AI-driven insights.

· Collaboration & Agile Delivery: Partner with cross-functional teams—including product managers, manufacturing domain experts, and AI researchers—to translate business needs into technical solutions.

· Performance & Security: Ensure robust, secure, and high-performance software by implementing best practices in algorithms, data structures, and system design.

· Continuous Improvement: Stay updated on emerging technologies in AI, SaaS, and manufacturing systems to propose innovative solutions that enhance product capability.

 

Must have worked on these technologies.

· 5+ years of experience working with React/TypeScript and Node.js at a production level

· Python, pandas, high-performance REST APIs in Node.js and Python (Django, Flask, or FastAPI)

· Databases: relational DBs like PostgreSQL, NoSQL DBs like MongoDB or DynamoDB, vector databases, graph DBs

· OS: Linux flavors like Ubuntu, Debian

· Source control and CI/CD

· Software fundamentals: excellent command of algorithms and data structures

· Software design and architecture: OOPs, design patterns, microservices, monolithic architectures, domain-driven design

· Containers: Docker and Kubernetes

· Cloud: fundamentals of AWS like S3 buckets, EC2, IAM, security groups


Benefits and Perks:

· Be part of the product which is transforming the manufacturing landscape with AI

· Culture of innovation, creativity, learning, and even failure, we believe in bringing out the best in you.

· Progressive leave policy for effective work-life balance.

· Get mentored by highly qualified internal resource groups and opportunities to avail industry-driven mentorship programs.

· Multicultural peer groups and supportive workplace policies. 

· Work from beaches, hills, mountains, and many more with the yearly workcation program; we believe in mixing elements of vacation and work.

 

 

 

How it's like to work for a Startup?

Working for TVARIT (deep-tech German IT Startup) can offer you a unique blend of innovation, collaboration, and growth opportunities. But it's essential to approach it with a willingness to adapt and thrive in a dynamic environment.

 

If this position sparked your interest, do apply today!

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Poonam Behere
Posted by Poonam Behere
Pune
4 - 8 yrs
₹1L - ₹20L / yr
skill iconPython

Company Name – Wissen Technology

Group of companies in India – Wissen Technology & Wissen Infotech

Work Location – Pune


While you may already know about Wissen and the company history, here is a quick rundown for you.

 

About Wissen Technology:


·        The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015.

·        Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains. We help clients build world class products.

·        Our workforce has highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world.

·        Wissen Technology has grown its revenues by 400% in these five years without any external funding or investments.

·        Globally present with offices in the US, India, the UK, Australia, Mexico, and Canada.

·        We offer an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.

·        Wissen Technology has been certified as a Great Place to Work®.

·        Wissen Technology has been voted as the Top 20 AI/ML vendor by CIO Insider in 2020.

·        Over the years, Wissen Group has successfully delivered $650 million worth of projects for more than 20 of the Fortune 500 companies.

·        We have served clients across sectors like Banking, Telecom, Healthcare, Manufacturing, and Energy, including the likes of Morgan Stanley, Goldman Sachs, MSCI, State Street, Flipkart, Swiggy, Trafigura, and GE, to name a few.


Job Description:


Experience Required: 3–5 years


Job Title: Application Development Engineer (Python – Backtesting & Index Platforms)

Role Overview

We are seeking a strong Python Application Engineer to help build a next-generation Index Backtesting and Rebalance Platform.

In this role, you will design and develop deterministic, scalable calculation engines that convert financial methodologies into production-grade software.

You will work on portfolio construction, rebalancing logic, and historical simulations while consuming reference and market data from Snowflake, using Python as your core processing tool.


Key Responsibilities

Engine Development: Design and implement modular, reusable Python components for index construction, rebalancing, and backtesting.

Large-Scale Simulation: Use Pandas, NumPy, and PySpark to run historical calculations across long time horizons and multiple index variants.

Workflow Integration: Integrate engines with orchestrators such as Airflow or Temporal using parameterized, config-driven execution.

Reference Data Consumption: Query and utilize pricing, security master, and corporate action data from Snowflake.

Quality & Reconciliation: Build automated test harnesses to validate outputs, compare against benchmarks, and guarantee reproducibility.

Performance Optimization: Improve runtime efficiency through vectorization, caching, and distributed computing patterns.

Cross-Team Collaboration: Partner with Business, Index Ops, and Platform teams to accelerate research-to-production onboarding.

Required Technical Capabilities

Python Expertise: Strong proficiency in Python application development with emphasis on clean architecture and maintainable design.

Data & Numerical Libraries: Deep experience with Pandas and NumPy; working knowledge of PySpark for distributed workloads.

Financial Computation: Ability to implement portfolio mathematics, weighting algorithms, and time-series transformations.

Config-Driven Systems: Experience building rule-based or metadata-driven processing frameworks.

Database Skills: Strong SQL and experience consuming structured data from Snowflake.

Testing Discipline: Expertise in unit testing, regression testing, and deterministic replay of calculations.

Orchestration Integration: Familiarity with Airflow, Temporal, or similar workflow engines.

Cloud Infrastructure: Solid understanding of AWS ecosystem services (S3, Lambda, IAM).
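The portfolio mathematics and weighting algorithms called out above often reduce to vectorized Pandas/NumPy operations. A minimal sketch of one such algorithm — capped market-cap weighting — is below; the function name, the cap value, and the sample data are illustrative, not part of the platform's actual API:

```python
import numpy as np
import pandas as pd

def capped_weights(market_caps: pd.Series, cap: float = 0.10) -> pd.Series:
    """Market-cap weights with a per-constituent cap.

    Excess weight above the cap is redistributed proportionally
    among the uncapped names until the constraint holds.
    """
    w = market_caps / market_caps.sum()
    for _ in range(len(w)):  # converges in at most n passes
        over = w > cap
        if not over.any():
            break
        excess = (w[over] - cap).sum()
        w[over] = cap
        # Redistribute pro rata across the names still under the cap
        w[~over] += excess * w[~over] / w[~over].sum()
    return w

# Illustrative constituents and market caps
caps = pd.Series({"AAA": 500.0, "BBB": 300.0, "CCC": 120.0, "DDD": 80.0})
w = capped_weights(caps, cap=0.40)
```

A deterministic, pure function like this is what makes the reconciliation and replay requirements above tractable: the same inputs always produce the same weights.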


 

Technical Skills:

  • Python, PySpark, Pandas, NumPy, AWS (S3, Lambda), Apache Airflow, SQL, Financial Services Domain Knowledge, Strong Communication Skills.





Read more
Wissen Technology

at Wissen Technology

4 recruiters
Shikha Nagar
Posted by Shikha Nagar
Remote, Pune
4 - 8 yrs
Best in industry
skill iconC++
skill iconPython
Multithreading
Algorithms
  • 4–7 years of professional C++ experience in performance-critical systems
  • Expert knowledge of modern C++ (C++11/14/17)
  • Strong understanding of data structures, algorithms, and memory models
  • Deep experience with multithreading, atomics, lock-free programming, and CPU cache behaviour
  • Excellent knowledge of Linux internals and system-level programming
  • Experience with low-level debugging and profiling (gdb, perf, valgrind, flamegraphs)
  • Proficiency with CMake/Make and Git

2. Trading Systems Experience (Highly Preferred)

  • Hands-on experience with order management systems (OMS) and execution engines
  • Knowledge of exchange protocols: FIX, ITCH, OUCH, FAST
  • Experience handling market data feeds (L1/L2, multicast, UDP)
  • Understanding of latency measurement, clock synchronization, and time stamping
  • Experience with network tuning (kernel bypass, socket tuning, CPU pinning)
  • Familiarity with trading lifecycle, risk checks, and throttling mechanisms

3. Education

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or related discipline

4. Soft Skills (Important for Trading Firms)

  • Ability to work under extreme time and accuracy pressure
  • Strong ownership of production systems
  • Clear and direct communication with traders and quants
  • Bias toward simple, fast, and reliable designs

5. Key Responsibilities

  • Design, develop, and optimize ultra-low-latency C++ trading applications
  • Build and maintain exchange connectivity and order execution systems
  • Develop real-time market data pipelines with strict latency requirements
  • Optimize systems at CPU, memory, and network levels
  • Implement lock-free or low-lock concurrent designs
  • Analyze latency using profiling tools and improve tail latency
  • Ensure high availability, fault tolerance, and rapid recovery
  • Work closely with Traders and Quant Researchers to implement strategies
  • Participate in architecture and performance design reviews
  • Review code, enforce best practices, and mentor junior engineers
  • Support production systems and handle time-critical issues when needed
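The latency-analysis portion of the responsibilities above (measuring and improving tail latency) is often prototyped offline before touching the C++ hot path. A small sketch of computing p50/p99/p99.9 from timestamped samples using only the Python standard library; the sample distribution is simulated, purely for illustration:

```python
import random
from statistics import quantiles

# Simulated one-way latencies in microseconds (illustrative data only)
random.seed(7)
samples = [random.lognormvariate(2.0, 0.4) for _ in range(100_000)]

def tail_latency(latencies, percents=(50, 99, 99.9)):
    """Return {percentile: value}. With n=1000, qs[i] is the
    (i + 1)/1000 quantile, so qs[989] is p99 and qs[998] is p99.9."""
    qs = quantiles(latencies, n=1000, method="inclusive")
    return {p: qs[int(p * 10) - 1] for p in percents}

stats = tail_latency(samples)
```

Reporting p99/p99.9 rather than the mean is what surfaces the tail behaviour that matters in trading systems, since a fast median can hide rare slow paths.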


Read more
Generative AI Persona platform

Generative AI Persona platform

Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
6 - 7 yrs
₹15L - ₹20L / yr
skill iconMachine Learning (ML)
skill iconPython
ETL
skill iconData Science
ELT
+6 more

Description

We are currently hiring for the position of Data Scientist/ Senior Machine Learning Engineer (6–7 years’ experience).

 

Please find the detailed Job Description attached for your reference. We are looking for candidates with strong experience in:

  • Machine Learning model development
  • Scalable data pipeline development (ETL/ELT)
  • Python and SQL
  • Cloud platforms such as Azure/AWS/Databricks
  • ML deployment environments (SageMaker, Azure ML, etc.)

 

Kindly note:

  • Location: Pune (Work From Office)
  • Immediate joiners preferred

 

While sharing profiles, please ensure the following details are included:

  • Current CTC
  • Expected CTC
  • Notice Period
  • Current Location
  • Confirmation on Pune WFO comfort

 

Must have skills

Machine Learning - 6 years

Python - 6 years

ETL(Extract, Transform, Load) - 6 years

SQL - 6 years

Azure - 6 years

 

Read more
Digital solutions and services company

Digital solutions and services company

Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
6 - 7 yrs
₹17L - ₹23L / yr
skill iconMachine Learning (ML)
skill iconPython
ETL
skill iconData Science
SQL
+5 more

Data Scientist or Senior Machine Learning Engineer


We are currently hiring for the position of Data Scientist/ Senior Machine Learning Engineer (6–7 years' experience).


Please find the detailed Job Description attached for your reference.

We are looking for candidates with strong experience in:

  • Machine Learning model development
  • Scalable data pipeline development (ETL/ELT)
  • Python and SQL
  • Cloud platforms such as Azure/AWS/Databricks
  • ML deployment environments (SageMaker, Azure ML, etc.)


Kindly note:

  • Location: Pune (Work from Office)
  • Immediate joiners preferred


While sharing profiles, please ensure the following details are included:

  • Current CTC
  • Expected CTC
  • Notice Period
  • Current Location
  • Confirmation on Pune WFO comfort


Must have Skills

  • Machine Learning - 6 Years
  • Python - 6 Years
  • ETL (Extract, Transform, Load) - 6 Years
  • SQL - 6 Years
  • Azure - 6 Years


Request you to share relevant profiles at the earliest. Looking forward to your support.

Read more
BestQ
Sudha S
Posted by Sudha S
Jaipur, Faridabad, West Bengal, Odisha, Rajasthan, Chandigarh, Nashik, Pune
1 - 5 yrs
₹1L - ₹8L / yr
skill iconJava
skill iconJavascript
skill iconPython
TestNG
Selenium
+18 more

Job Title: QA Engineer – Manual & Automation Testing

We are looking for a detail-oriented QA Engineer (Manual & Automation) to join a fast-paced product team. In this role, you will be responsible for ensuring the quality, reliability, and performance of a modern SaaS platform through structured manual and automation testing. You will work closely with developers, product managers, and stakeholders to deliver high-quality software releases on time.

What You’ll Be Doing

● Design, review, and execute detailed manual and automated test cases for web-based applications

● Perform functional, regression, smoke, sanity, and end-to-end testing across multiple modules

● Identify, log, and track defects clearly, ensuring proper follow-up and closure

● Validate bug fixes and feature enhancements before production releases

● Collaborate closely with developers to understand requirements and resolve issues efficiently

● Participate in requirement and design reviews to provide early QA feedback

● Maintain and update test cases, test scenarios, and automation scripts based on product changes

● Contribute to the continuous improvement of QA processes, test coverage, and release quality
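The manual test cases described above are typically codified as table-driven automated checks so each case is logged and tracked by ID. A minimal sketch against a hypothetical form validator — the function, field names, and cases are all illustrative:

```python
def validate_signup(email: str, password: str) -> list[str]:
    """Hypothetical function under test: returns a list of error codes."""
    errors = []
    if "@" not in email or email.startswith("@") or email.endswith("@"):
        errors.append("bad_email")
    if len(password) < 8:
        errors.append("short_password")
    return errors

# Table-driven cases: (test id, email, password, expected errors)
CASES = [
    ("valid",      "a@b.com", "s3cretpass", []),
    ("missing-at", "ab.com",  "s3cretpass", ["bad_email"]),
    ("short-pw",   "a@b.com", "pw",         ["short_password"]),
    ("both-bad",   "@b.com",  "pw",         ["bad_email", "short_password"]),
]

# Map each case id to pass/fail, mirroring how a defect tracker keys on ids
results = {tid: validate_signup(e, p) == exp for tid, e, p, exp in CASES}
```

The same table shape drops directly into `pytest.mark.parametrize`, which keeps manual test scenarios and automation scripts in sync as the product changes.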

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Janane Mohanasankaran
Posted by Janane Mohanasankaran
Pune, Mumbai, Bengaluru (Bangalore)
3 - 12 yrs
Best in industry
skill iconPython
pandas
Object Oriented Programming (OOPs)
SQL

JOB DESCRIPTION:


Location: Pune, Mumbai, Bangalore

Mode of Work: 3 days from Office


* Python : Strong expertise in data workflows and automation

* Pandas: For detailed data analysis and validation

* SQL: Querying and performing operations on Delta tables

* AWS Cloud: Compute and storage services

* OOPS concepts
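The Pandas validation work listed above usually amounts to rule-based checks on a frame before results are written back to Delta tables. A small illustrative sketch; the column names and thresholds are assumptions, not a known schema:

```python
import pandas as pd

def validate_trades(df: pd.DataFrame) -> pd.DataFrame:
    """Flag rows failing basic data-quality rules; return the clean subset."""
    checks = {
        "null_symbol": df["symbol"].isna(),
        "nonpositive_qty": df["quantity"] <= 0,
        "bad_price": ~df["price"].between(0, 1e6),
    }
    bad = pd.concat(checks, axis=1).any(axis=1)
    print({name: int(mask.sum()) for name, mask in checks.items()})
    return df[~bad].reset_index(drop=True)

# Illustrative input with one failure per rule
df = pd.DataFrame({
    "symbol": ["AAPL", None, "MSFT", "GOOG"],
    "quantity": [100, 50, -5, 200],
    "price": [189.5, 10.0, 330.0, 2e7],
})
clean = validate_trades(df)
```

Keeping each rule as a named boolean mask makes the per-rule failure counts cheap to report, which is what turns ad hoc cleaning into an auditable workflow.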

Read more
NonStop io Technologies Pvt Ltd
Kalyani Wadnere
Posted by Kalyani Wadnere
Pune
8 - 15 yrs
Best in industry
skill iconJavascript
skill iconReact.js
skill iconNodeJS (Node.js)
TypeScript
skill iconAmazon Web Services (AWS)
+6 more

About NonStop io Technologies:

NonStop io Technologies is a value-driven company with a strong focus on process-oriented software engineering. We specialize in Product Development and have a decade's worth of experience in building web and mobile applications across various domains. NonStop io Technologies follows core principles that guide its operations and believes in staying invested in a product's vision for the long term. We are a small but proud group of individuals who believe in the 'givers gain' philosophy and strive to provide value in order to seek value. We are committed to and specialize in building cutting-edge technology products and serving as trusted technology partners for startups and enterprises. We pride ourselves on fostering innovation, learning, and community engagement. Join us to work on impactful projects in a collaborative and vibrant environment.


Brief Description:

We are looking for an Engineering Manager who combines technical depth with leadership strength. This role involves leading one or more product engineering pods, driving architecture decisions, ensuring delivery excellence, and working closely with stakeholders to build scalable web and mobile technology solutions. As a key part of our leadership team, you’ll play a pivotal role in mentoring engineers, improving processes, and fostering a culture of ownership, innovation, and continuous learning.


Roles and Responsibilities:

● Team Management: Lead, coach, and grow a team of 15-20 software engineers, tech leads, and QA engineers

● Technical Leadership: Guide the team in building scalable, high-performance web and mobile applications using modern frameworks and technologies

● Architecture Ownership: Architect robust, secure, and maintainable technology solutions aligned with product goals

● Project Execution: Ensure timely and high-quality delivery of projects by driving engineering best practices, agile processes, and cross-functional collaboration

● Stakeholder Collaboration: Act as a bridge between business stakeholders, product managers, and engineering teams to translate requirements into technology plans

● Culture & Growth: Help build and nurture a culture of technical excellence, accountability, and continuous improvement

● Hiring & Onboarding: Contribute to recruitment efforts, onboarding, and learning & development of team members.


Requirements:

● 8+ years of software development experience, with 2+ years in a technical leadership or engineering manager role

● Proven experience in architecting and building web and mobile applications at scale

● Hands-on knowledge of technologies such as JavaScript/TypeScript, Angular, React, Node.js, .NET, Java, Python, or similar stacks

● Solid understanding of cloud platforms (AWS/Azure/GCP) and DevOps practices

● Strong interpersonal skills with a proven ability to manage stakeholders and lead diverse teams

● Excellent problem-solving, communication, and organizational skills

● Nice to have:

  • Prior experience in working with startups or product-based companies
  • Experience mentoring tech leads and helping shape engineering culture
  • Exposure to AI/ML, data engineering, or platform thinking


Why Join Us?

● Opportunity to work on a cutting-edge healthcare product

● A collaborative and learning-driven environment

● Exposure to AI and software engineering innovations

● Excellent work ethic and culture.



If you're passionate about technology and want to work on impactful projects, we'd love to hear from you!

Read more
Metron Security Private Limited
Chanchal Kale
Posted by Chanchal Kale
Pune
2 - 6 yrs
₹3L - ₹10L / yr
skill iconPython
skill iconGo Programming (Golang)
skill iconNodeJS (Node.js)
MERN Stack
Data Structures
+1 more

Job Summary:


  • We are looking for a highly motivated and skilled Software Engineer to join our team.
  • This role requires a strong understanding of the software development lifecycle, proficiency in coding, and excellent communication skills.
  • The ideal candidate will be responsible for production monitoring, resolving minor technical issues, collecting client information, providing effective client interactions, and supporting our development team in resolving challenges.


Key Responsibilities:


  • Client Interaction: Serve as the primary point of contact for client queries, provide excellent communication, and ensure timely issue resolution.
  • Issue Resolution: Troubleshoot and resolve minor issues related to software applications in a timely manner.
  • Information Collection: Gather detailed technical information from clients, understand the problem context, and relay the information to the development leads for further action.
  • Collaboration: Work closely with development leads and cross-functional teams to provide timely support and resolution for customer issues.
  • Documentation: Document client issues, actions taken, and resolutions for future reference and continuous improvement.
  • Software Development Lifecycle: Be involved in maintaining, supporting, and optimizing software through its lifecycle, including bug fixes and enhancements.
  • Automating Redundant Support Tasks (good to have): Should be able to automate redundant, repetitive tasks.


Required Skills and Qualifications:


Mandatory Skills:


  • Expertise in at least one Object Oriented Programming language (Python, Java, C#, C++, Reactjs, Nodejs).
  • Good knowledge on Data Structure and their correct usage.
  • Open to learn any new software development skill if needed for the project.
  • Alignment and utilization of the core enterprise technology stacks and integration capabilities throughout the transition states.
  • Participate in planning, definition, and high-level design of the solution and exploration of solution alternatives.
  • Define, explore, and support the implementation of enablers to evolve solution intent, working directly with Agile teams to implement them.
  • Good knowledge on the implications.
  • Experience architecting & estimating deep technical custom solutions & integrations.


Added advantage:


  • You have developed software using web technologies.
  • You have handled a project from start to end.
  • You have worked in an Agile Development project and have experience of writing and estimating User Stories
  • Communication Skills: Excellent verbal and written communication skills, with the ability to clearly explain technical issues to non-technical clients.
  • Client-Facing Experience: Strong ability to interact with clients, gather necessary information, and ensure a high level of customer satisfaction.
  • Problem-Solving: Quick-thinking and proactive in resolving minor issues, with a focus on providing excellent user experience.
  • Team Collaboration: Ability to collaborate with development leads, engineering teams, and other stakeholders to escalate complex issues or gather additional technical support when required.


Preferred Skills:


  • Familiarity with Cloud Platforms and Cyber Security tools: Knowledge of cloud computing platforms and services (AWS, Azure, Google Cloud) and Cortex XSOAR, SIEM, SOAR, XDR tools is a plus.
  • Automation and Scripting: Experience with automating processes or writing scripts to support issue resolution is an advantage.


Work Environment:

  • This is a rotational shift position
  • During the evening shift, the timings will be 5 PM to 2 AM, and you will be expected to work independently and efficiently during these hours.
  • The position may require occasional weekend shifts depending on the project requirements.
  • Additional benefit of night allowance.


Read more
Global Digital Transformation Solutions Provider

Global Digital Transformation Solutions Provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune, Trivandrum, Thiruvananthapuram
8 - 10 yrs
₹20L - ₹24L / yr
skill iconJava
skill iconPython
API
Google Cloud Platform (GCP)
skill iconAmazon Web Services (AWS)
+13 more

Job Details

Job Title: Lead Software Engineer - Java, Python, API Development

Industry: Global digital transformation solutions provider

Domain - Information technology (IT)

Experience Required: 8-10 years

Employment Type: Full Time

Job Location: Pune & Trivandrum/ Thiruvananthapuram

CTC Range: Best in Industry

 

Job Description

Job Summary

We are seeking a Lead Software Engineer with strong hands-on expertise in Java and Python to design, build, and optimize scalable backend applications and APIs. The ideal candidate will bring deep experience in cloud technologies, large-scale data processing, and leading the design of high-performance, reliable backend systems.

 

Key Responsibilities

  • Design, develop, and maintain backend services and APIs using Java and Python
  • Build and optimize Java-based APIs for large-scale data processing
  • Ensure high performance, scalability, and reliability of backend systems
  • Architect and manage backend services on cloud platforms (AWS, GCP, or Azure)
  • Collaborate with cross-functional teams to deliver production-ready solutions
  • Lead technical design discussions and guide best practices

 

Requirements

  • 8+ years of experience in backend software development
  • Strong proficiency in Java and Python
  • Proven experience building scalable APIs and data-driven applications
  • Hands-on experience with cloud services and distributed systems
  • Solid understanding of databases, microservices, and API performance optimization

 

Nice to Have

  • Experience with Spring Boot, Flask, or FastAPI
  • Familiarity with Docker, Kubernetes, and CI/CD pipelines
  • Exposure to Kafka, Spark, or other big data tools

 

Skills

Java, Python, API Development, Data Processing, AWS Backend

 


 

Must-Haves

Java (8+ years), Python (8+ years), API Development (8+ years), Cloud Services (AWS/GCP/Azure), Database & Microservices

8+ years of experience in backend software development

Strong proficiency in Java and Python

Proven experience building scalable APIs and data-driven applications

Hands-on experience with cloud services and distributed systems

Solid understanding of databases, microservices, and API performance optimization

Mandatory Skills: Java API AND AWS

 

******

Notice period - 0 to 15 days only

Job stability is mandatory

Location: Pune, Trivandrum

Read more
NonStop io Technologies Pvt Ltd
Kalyani Wadnere
Posted by Kalyani Wadnere
Pune
3 - 5 yrs
Best in industry
skill iconReact.js
skill iconAngular (2+)
skill iconVue.js
skill iconPython
skill iconJava
+11 more

About NonStop io Technologies:

NonStop io Technologies is a value-driven company with a strong focus on process-oriented software engineering. We specialize in Product Development and have a decade's worth of experience in building web and mobile applications across various domains. NonStop io Technologies follows core principles that guide its operations and believes in staying invested in a product's vision for the long term. We are a small but proud group of individuals who believe in the 'givers gain' philosophy and strive to provide value in order to seek value. We are committed to and specialize in building cutting-edge technology products and serving as trusted technology partners for startups and enterprises. We pride ourselves on fostering innovation, learning, and community engagement. Join us to work on impactful projects in a collaborative and vibrant environment.


Brief Description:

We are looking for a passionate and experienced Full Stack Engineer to join our engineering team. The ideal candidate will have strong experience in both frontend and backend development, with the ability to design, build, and scale high-quality applications. You will collaborate with cross-functional teams to deliver robust and user-centric solutions.


Roles and Responsibilities:

● Design, develop, and maintain scalable web applications

● Build responsive and high-performance user interfaces

● Develop secure and efficient backend services and APIs

● Collaborate with product managers, designers, and QA teams to deliver features

● Write clean, maintainable, and testable code

● Participate in code reviews and contribute to engineering best practices

● Optimize applications for speed, performance, and scalability

● Troubleshoot and resolve production issues

● Contribute to architectural decisions and technical improvements.


Requirements:

● 3 to 5 years of experience in full-stack development

● Strong proficiency in frontend technologies such as React, Angular, or Vue

● Solid experience with backend technologies such as Node.js, .NET, Java, or Python

● Experience in building RESTful APIs and microservices

● Strong understanding of databases such as PostgreSQL, MySQL, MongoDB, or SQL Server

● Experience with version control systems like Git

● Familiarity with CI CD pipelines

● Good understanding of cloud platforms such as AWS, Azure, or GCP

● Strong understanding of software design principles and data structures

● Experience with containerization tools such as Docker

● Knowledge of automated testing frameworks

● Experience working in Agile environments


Why Join Us?

● Opportunity to work on a cutting-edge healthcare product

● A collaborative and learning-driven environment

● Exposure to AI and software engineering innovations

● Excellent work ethic and culture


If you're passionate about technology and want to work on impactful projects, we'd love to hear from you!

Read more
NeoGenCode Technologies Pvt Ltd
Ritika Verma
Posted by Ritika Verma
Pune
3 - 5 yrs
₹8L - ₹15L / yr
skill iconPython
skill iconMongoDB
RESTful APIs
skill iconFlask
skill iconDjango

Python Software Engineer (3–5 Years Experience)

Location: [Pune]



Role Overview

We are seeking skilled Python engineers to join our core product team. You will work on backend services, API development, and system integrations, contributing to a codebase of over 250,000 Python lines and collaborating with frontend, DevOps, and native code teams.


Key Responsibilities

·    Design, develop, and maintain scalable Python backend services and APIs

·    Optimize performance and reliability of large, distributed systems

·    Collaborate with frontend (JS/HTML/CSS) and native (C/C++/C#) teams

·    Write unit/integration tests and participate in code reviews

·    Troubleshoot production issues and implement robust solutions


Required Skills

·    3–5 years of professional Python development experience

·    Strong understanding of OOP, design patterns, and modular code structure

·    Experience with MongoDB (PyMongo), Mako, RESTful APIs, and asynchronous programming

·    Familiarity with code quality tools (flake8, pylint) and test frameworks (pytest, unittest)

·    Experience with Git and collaborative development workflows

·    Ability to read and refactor large, multi-module codebases


Nice to Have

·    Experience with web frameworks (web.py, Flask, Django)

·    Knowledge of C/C++ or C# for cross-platform integrations

·    Familiarity with CI/CD, Docker, and cloud deployment

·    Exposure to security, encryption, or enterprise SaaS products


What We Offer

·    Opportunity to work on a mission-critical, enterprise-scale product

·    Collaborative, growth-oriented engineering culture

·    Flexible work arrangements (remote/hybrid)

·    Competitive compensation and benefits

Read more
Resources Valley

at Resources Valley

1 recruiter
Manind Gupta
Posted by Manind Gupta
Pune
8 - 16 yrs
₹25L - ₹38L / yr
skill iconPython
FastAPI
API

Role Overview:

We are seeking a backend-focused Software Engineer with deep expertise in REST APIs, real-time integrations, and cloud-based application services. The ideal candidate will build scalable backend systems, integrate real-time data flows, and contribute to system design and documentation. This is a hands-on role working with global teams in a fast-paced, Agile environment.

Key Responsibilities:

• Design, develop, and maintain REST APIs and backend services using Python, FastAPI, and SQLAlchemy.

• Build and support real-time integrations using AWS Lambda, API Gateway, and EventBridge.

• Develop and maintain Operational Data Stores (ODS) for real-time data access.

• Write performant SQL queries and work with dimensional data models in PostgreSQL.

• Contribute to cloud-based application logic and data orchestration.

• Containerize services using Docker and deploy via CI/CD pipelines.

• Implement automated testing using pytest, pydantic, and related tools.

• Collaborate with cross-functional Agile teams using tools like Jira.

• Document technical workflows, APIs, and system integrations with clarity and consistency.

• Should have experience in team management
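The REST API work described above pairs request validation with a thin handler. A framework-agnostic sketch of that shape — in FastAPI the dataclass would be a pydantic model and the handler a route function; all names and the in-memory store here are illustrative:

```python
from dataclasses import dataclass

@dataclass
class CreateOrder:
    """Request body; in FastAPI this would be a pydantic model."""
    symbol: str
    quantity: int

    def validate(self) -> None:
        if not self.symbol:
            raise ValueError("symbol is required")
        if self.quantity <= 0:
            raise ValueError("quantity must be positive")

_DB: dict[int, CreateOrder] = {}  # stand-in for the real data store

def create_order(body: CreateOrder) -> dict:
    """POST /orders handler: validate, persist, return a JSON-able dict."""
    body.validate()
    order_id = len(_DB) + 1
    _DB[order_id] = body
    return {"id": order_id, "symbol": body.symbol, "quantity": body.quantity}

resp = create_order(CreateOrder(symbol="AAPL", quantity=10))
```

Separating validation from persistence this way is also what makes the pytest/pydantic testing responsibility above straightforward: each layer can be asserted against in isolation.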

Required Skills & Experience:

• 8+ years of backend or integrations engineering experience.

• Expert-level knowledge of REST API development and real-time system design.

• Strong experience with: Python (FastAPI preferred), SQLAlchemy.

• PostgreSQL and advanced SQL.

• AWS Lambda, API Gateway, EventBridge.

• Operational Data Stores (ODS) and distributed system integration.

• Experience with Docker, Git, CI/CD tools, and automated testing frameworks.

• Experience working in Agile environments and collaborating with cross-functional teams.


• Comfortable producing and maintaining clear technical documentation.

• Working knowledge of React is acceptable but not a focus.

• Hands-on experience working with Databricks or similar data platforms.

Education & Certifications:

• Bachelor’s degree in Computer Science, Engineering, or a related field (required).

• Master’s degree is a plus.

• Certifications in AWS (e.g., Developer Associate, Solutions Architect) or Python frameworks are highly preferred.

Read more
Global Digital Transformation Solutions Provider

Global Digital Transformation Solutions Provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Chennai, Hyderabad, Kochi (Cochin), Noida, Pune, Thiruvananthapuram
7 - 10 yrs
₹21L - ₹30L / yr
Perforce
DevOps
skill iconGit
skill iconGitHub
skill iconPython
+7 more

JOB DETAILS:

* Job Title: Specialist I - DevOps Engineering

* Industry: Global Digital Transformation Solutions Provider

* Salary: Best in Industry

* Experience: 7-10 years

* Location: Bengaluru (Bangalore), Chennai, Hyderabad, Kochi (Cochin), Noida, Pune, Thiruvananthapuram

 

Job Description

Job Summary:

As a DevOps Engineer focused on Perforce to GitHub migration, you will be responsible for executing seamless and large-scale source control migrations. You must be proficient with GitHub Enterprise and Perforce, possess strong scripting skills (Python/Shell), and have a deep understanding of version control concepts.

The ideal candidate is a self-starter, a problem-solver, and thrives on challenges while ensuring smooth transitions with minimal disruption to development workflows.

 

Key Responsibilities:

  • Analyze and prepare Perforce repositories — clean workspaces, merge streams, and remove unnecessary files.
  • Handle large files efficiently using Git Large File Storage (LFS) for files exceeding GitHub’s 100MB size limit.
  • Use git-p4 fusion (Python-based tool) to clone and migrate Perforce repositories incrementally, ensuring data integrity.
  • Define migration scope — determine how much history to migrate and plan the repository structure.
  • Manage branch renaming and repository organization for optimized post-migration workflows.
  • Collaborate with development teams to determine migration points and finalize migration strategies.
  • Troubleshoot issues related to file sizes, Python compatibility, network connectivity, or permissions during migration.
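The large-file step above (routing files over GitHub's 100 MB limit to Git LFS) is easiest to pre-plan from a size inventory of the Perforce depot. A hedged sketch of that planning step; the threshold reflects GitHub's documented limit, while the inventory and helper name are illustrative:

```python
GITHUB_HARD_LIMIT = 100 * 1024 * 1024  # GitHub rejects blobs >= 100 MB

def plan_lfs_tracking(inventory: dict[str, int]) -> list[str]:
    """Given {path: size_in_bytes} from a depot scan, return the
    `git lfs track` patterns needed before the git-p4 import."""
    extensions = {
        path.rsplit(".", 1)[-1]
        for path, size in inventory.items()
        if size >= GITHUB_HARD_LIMIT and "." in path
    }
    return sorted(f"*.{ext}" for ext in extensions)

# Illustrative depot inventory
inventory = {
    "assets/video/intro.mp4": 750_000_000,
    "build/firmware.bin": 180_000_000,
    "src/main.py": 12_000,
}
patterns = plan_lfs_tracking(inventory)
# Each pattern then becomes e.g.: git lfs track "*.mp4" (run before the import)
```

Running this scan before the incremental git-p4 migration avoids failed pushes mid-transition, since LFS tracking must be in place before the offending blobs enter Git history.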

 

Required Qualifications:

  • Strong knowledge of Git/GitHub and preferably Perforce (Helix Core) — understanding of differences, workflows, and integrations.
  • Hands-on experience with P4-Fusion.
  • Familiarity with cloud platforms (AWS, Azure) and containerization technologies (Docker, Kubernetes).
  • Proficiency in migration tools such as git-p4 fusion — installation, configuration, and troubleshooting.
  • Ability to identify and manage large files using Git LFS to meet GitHub repository size limits.
  • Strong scripting skills in Python and Shell for automating migration and restructuring tasks.
  • Experience in planning and executing source control migrations — defining scope, branch mapping, history retention, and permission translation.
  • Familiarity with CI/CD pipeline integration to validate workflows post-migration.
  • Understanding of source code management (SCM) best practices, including version history and repository organization in GitHub.
  • Excellent communication and collaboration skills for cross-team coordination and migration planning.
  • Proven practical experience in repository migration, large file management, and history preservation during Perforce to GitHub transitions.

 

Skills: Github, Kubernetes, Perforce, Perforce (Helix Core), Devops Tools

 

Must-Haves

Git/GitHub (advanced), Perforce (Helix Core) (advanced), Python/Shell scripting (strong), P4-Fusion (hands-on experience), Git LFS (proficient)

Read more
Global Digital Transformation Solutions Provider

Global Digital Transformation Solutions Provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Trivandrum, Thiruvananthapuram, Pune
3 - 5 yrs
₹15L - ₹25L / yr
Terraform
Splunk
DevOps
Windows Azure
SQL Azure
+12 more

JOB DETAILS:

* Job Title: Lead I - Azure, Terraform, GitLab CI 

* Industry: Global Digital Transformation Solutions Provider

* Salary: Best in Industry

* Experience: 3-5 years

* Location: Trivandrum/Pune

 

Job Description

Job Title: DevOps Engineer

Experience: 4–8 Years 

Location: Trivandrum & Pune 

Job Type: Full-Time

Mandatory skills: Azure, Terraform, GitLab CI, Splunk

 

Job Description

We are looking for an experienced and driven DevOps Engineer with 4 to 8 years of experience to join our team in Trivandrum or Pune. The ideal candidate will take ownership of automating cloud infrastructure, maintaining CI/CD pipelines, and implementing monitoring solutions to support scalable and reliable software delivery in a cloud-first environment.

 

Key Responsibilities

  • Design, manage, and automate Azure cloud infrastructure using Terraform.
  • Develop scalable, reusable, and version-controlled Infrastructure as Code (IaC) modules.
  • Implement monitoring and logging solutions using Splunk, Azure Monitor, and Dynatrace.
  • Build and maintain secure and efficient CI/CD pipelines using GitLab CI or Harness.
  • Collaborate with cross-functional teams to enable smooth deployment workflows and infrastructure updates.
  • Analyze system logs, performance metrics, and alerts to troubleshoot and optimize performance.
  • Ensure infrastructure security, compliance, and scalability best practices are followed.

 

Mandatory Skills

Candidates must have hands-on experience with the following technologies:

  • Azure – Cloud infrastructure management and deployment
  • Terraform – Infrastructure as Code for scalable provisioning
  • GitLab CI – Pipeline development, automation, and integration
  • Splunk – Monitoring, logging, and troubleshooting production systems

 

Preferred Skills

  • Experience with Harness (for CI/CD)
  • Familiarity with Azure Monitor and Dynatrace
  • Scripting proficiency in Python, Bash, or PowerShell
  • Understanding of DevOps best practices, containerization, and microservices architecture
  • Exposure to Agile and collaborative development environments

 

Skills Summary

Mandatory: Azure, Terraform, GitLab CI, Splunk. Additional: Harness, Azure Monitor, Dynatrace, Python, Bash, PowerShell.

 

Skills: Azure, Splunk, Terraform, GitLab CI

 

******

Notice period - 0 to 15 days only

Job stability is mandatory

Location: Trivandrum/Pune

Wissen Technology

Posted by Janane Mohanasankaran
Mumbai, Pune
3 - 6 yrs
Best in industry
Python
PySpark
pandas
SQL
ADF

* Python (3 to 6 years): Strong expertise in data workflows and automation

* Spark (PySpark): Hands-on experience with large-scale data processing

* Pandas: For detailed data analysis and validation

* Delta Lake: Managing structured and semi-structured datasets at scale

* SQL: Querying and performing operations on Delta tables

* Azure Cloud: Compute and storage services

* Orchestrator: Good experience with either ADF or Airflow
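As a toy, pure-Python stand-in for the kind of dropna-then-groupby step this stack handles at scale (the column names are invented for illustration; real work would use pandas or PySpark DataFrames over Delta tables):

```python
from collections import defaultdict

def validate_and_aggregate(rows):
    """Drop rows with a missing amount, then total amounts per account.

    Mimics a pandas `dropna()` + `groupby("account").sum()`; in
    production this would run on PySpark/Delta, not lists of dicts.
    """
    totals = defaultdict(float)
    dropped = 0
    for row in rows:
        if row.get("amount") is None:
            dropped += 1          # validation failure, excluded from totals
            continue
        totals[row["account"]] += row["amount"]
    return dict(totals), dropped

rows = [
    {"account": "A", "amount": 10.0},
    {"account": "A", "amount": 5.0},
    {"account": "B", "amount": None},   # fails validation
    {"account": "B", "amount": 7.5},
]
print(validate_and_aggregate(rows))  # ({'A': 15.0, 'B': 7.5}, 1)
```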

Virtana

Posted by Krutika Devadiga
Pune
5 - 10 yrs
Best in industry
Python
Kubernetes
Docker
Amazon Web Services (AWS)
Google Cloud Platform (GCP)

Role Overview:

Challenge convention and work on cutting edge technology that is transforming the way our customers manage their physical, virtual and cloud computing environments. Virtual Instruments seeks highly talented people to join our growing team, where your contributions will impact the development and delivery of our product roadmap. Our award-winning Virtana Platform provides the only real-time, system-wide, enterprise scale solution for providing visibility into performance, health and utilization metrics, translating into improved performance and availability while lowering the total cost of the infrastructure supporting mission-critical applications.


We are seeking an individual with knowledge in Systems Management and/or Systems Monitoring Software and/or Performance Management Software and Solutions with insight into integrated infrastructure platforms like Cisco UCS, infrastructure providers like Nutanix, VMware, EMC & NetApp and public cloud platforms like Google Cloud and AWS to expand the depth and breadth of Virtana Products.


Work Location: Pune/ Chennai


Job Type: Hybrid


Role Responsibilities:

  • The engineer will be primarily responsible for design and development of software solutions for the Virtana Platform
  • Partner and work closely with team leads, architects and engineering managers to design and implement new integrations and solutions for the Virtana Platform.
  • Communicate effectively with people having differing levels of technical knowledge.
  • Work closely with Quality Assurance and DevOps teams assisting with functional and system testing design and deployment
  • Provide customers with complex application support, problem diagnosis and problem resolution

 

Required Qualifications:

  • Minimum of 4 years of experience in a web-application-centric client/server development environment focused on Systems Management, Systems Monitoring, and Performance Management software.
  • Able to understand integrated infrastructure platforms, with experience working with one or more data collection technologies such as SNMP, REST, OTEL, WMI, or WBEM.
  • Minimum of 4 years of development experience in at least one high-level language such as Python, Java, or Go.
  • Bachelor's (B.E., B.Tech) or Master's degree (M.E., M.Tech, MCA) in Computer Science, Computer Engineering, or equivalent.
  • 2+ years of development experience in a public cloud environment (Google Cloud and/or AWS) using Kubernetes or similar technologies.
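Much of this monitoring work reduces to collecting metrics from sources such as REST endpoints and normalizing them. A hedged sketch that parses a sample payload (the JSON shape and the 80% warning threshold are made-up examples, not any vendor's actual API):

```python
import json

# Invented sample payload; a real collector would poll SNMP/REST/OTEL sources.
SAMPLE = ('{"host": "esx-01", "metrics": ['
          '{"name": "cpu_util", "value": 83.5}, '
          '{"name": "mem_util", "value": 61.2}]}')

def collect(payload, warn_at=80.0):
    """Flatten a metrics payload and flag values at or above a threshold."""
    doc = json.loads(payload)
    return [
        {"host": doc["host"], "metric": m["name"], "value": m["value"],
         "warning": m["value"] >= warn_at}
        for m in doc["metrics"]
    ]

for point in collect(SAMPLE):
    print(point)
```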

 

Desired Qualifications:

  • Prior experience with other virtualization platforms like OpenShift is a plus
  • Prior experience as a contributor to engineering and integration efforts with strong attention to detail and exposure to Open-Source software is a plus
  • Demonstrated ability as a strong technical engineer who can design and code with strong communication skills
  • Firsthand development experience with the development of Systems, Network and performance Management Software and/or Solutions is a plus
  • Ability to use a variety of debugging tools, simulators and test harnesses is a plus

 

About Virtana:

Virtana delivers the industry's broadest and deepest observability platform, allowing organizations to monitor infrastructure, de-risk cloud migrations, and reduce cloud costs by 25% or more.

Over 200 Global 2000 enterprise customers, such as AstraZeneca, Dell, Salesforce, Geico, Costco, Nasdaq, and Boeing, have valued Virtana's software solutions for over a decade.

Our modular platform for hybrid IT digital operations includes Infrastructure Performance Monitoring and Management (IPM), Artificial Intelligence for IT Operations (AIOps), Cloud Cost Management (FinOps), and Workload Placement Readiness Solutions. Virtana is simplifying the complexity of hybrid IT environments with a single cloud-agnostic platform across all the categories listed above. The $30B IT Operations Management (ITOM) software market is ripe for disruption, and Virtana is uniquely positioned for success.

Metron Security Private Limited
Prathamesh Shinde
Posted by Prathamesh Shinde
Pune
3 - 5 yrs
₹4L - ₹10L / yr
Python

Job Description:


We are looking for a skilled Backend Developer with 2–5 years of experience in software development, specializing in Python and/or Golang. If you have strong programming skills, enjoy solving problems, and want to work on secure and scalable systems, we'd love to hear from you!


Location - Pune, Baner.

Interview Rounds - In Office


Key Responsibilities:

Design, build, and maintain efficient, reusable, and reliable backend services using Python and/or Golang

Develop and maintain clean and scalable code following best practices

Apply Object-Oriented Programming (OOP) concepts in real-world development

Collaborate with front-end developers, QA, and other team members to deliver high-quality features

Debug, optimize, and improve existing systems and codebase

Participate in code reviews and team discussions

Work in an Agile/Scrum development environment


Required Skills: Strong experience in Python or Golang (working knowledge of both is a plus)


Good understanding of OOP principles

Familiarity with RESTful APIs and back-end frameworks

Experience with databases (SQL or NoSQL)

Excellent problem-solving and debugging skills

Strong communication and teamwork abilities


Good to Have:

Prior experience in the security industry

Familiarity with cloud platforms like AWS, Azure, or GCP

Knowledge of Docker, Kubernetes, or CI/CD tools
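As a rough illustration of the OOP principles this role calls for, here is a minimal service layer in Python (the class and method names are invented for the example; a real backend would expose this behind a REST framework):

```python
class UserStore:
    """In-memory repository; swappable for a SQL/NoSQL-backed one."""

    def __init__(self):
        self._users = {}

    def add(self, user_id, name):
        if user_id in self._users:
            raise ValueError(f"duplicate user id: {user_id}")
        self._users[user_id] = {"id": user_id, "name": name}
        return self._users[user_id]

    def get(self, user_id):
        return self._users.get(user_id)


class UserService:
    """Business logic depends on the store's interface, not its backend."""

    def __init__(self, store):
        self.store = store

    def register(self, user_id, name):
        if not name.strip():
            raise ValueError("name must not be empty")
        return self.store.add(user_id, name.strip())


svc = UserService(UserStore())
print(svc.register(1, "  Asha "))  # whitespace normalized before storage
```

The separation (service validates, store persists) is what makes each piece independently testable and replaceable.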

Numino Labs
Disha Kamat
Posted by Disha Kamat
Pune
8 - 18 yrs
₹40L - ₹70L / yr
Python
React.js
Next.js
Amazon Web Services (AWS)

Numino Labs 

Business: Software product engineering services: Pune, Goa.

Clients: Software product companies in the USA.

Business model: Exclusive teams for working on client products; direct and daily interactions with clients


Client 

Silicon Valley startup in genAI: 45m+ in funding.

Product: B2B SaaS. 

Core IP: Physics AI foundation model for hardware designers, with a specific focus on semiconductor chip design.

Customers: World's top chip manufacturers


Responsibilities

  • Team player: Delivers effectively with teams; interpersonal skills, communication skills, risk management skills
  • Technical Leadership: Works with ambiguous requirements, designs solutions, independently drives delivery to customers
  • Hands-on coder: Leverages AI to drive implementation across React.js, Python, DB, unit testing, test automation, cloud infra, and CI/CD automation.


Requirements

  • Strong computer science fundamentals: data structures & algorithms, networking, RDBMS, and distributed computing
  • 8-15 years of experience on Python Stack: Behave, PyTest, Python Generators & async operations, multithreading, context managers, decorators, descriptors
  • Python frameworks: FastAPI, Flask, Django, or SQLAlchemy
  • Expertise in Microservices, REST/gRPC APIs design, Authentication, Single Sign-on
  • Experience in high performance delivering solutions on Cloud
  • Some front-end experience: JavaScript & Next.js/React.js
  • Some experience in DevOps, Cloud Infra Automation, Test Automation
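Three of the Python features listed above (decorators, context managers, and generator expressions) can be illustrated in one compact, hedged sketch; the names here are invented for the example:

```python
from contextlib import contextmanager
from functools import wraps

def logged(fn):
    """Decorator: record each call's function name and result."""
    calls = []

    @wraps(fn)
    def wrapper(*args, **kwargs):
        result = fn(*args, **kwargs)
        calls.append((fn.__name__, result))
        return result

    wrapper.calls = calls
    return wrapper

@contextmanager
def resource(name):
    """Context manager: guaranteed setup/teardown around a block."""
    state = {"name": name, "open": True}
    try:
        yield state
    finally:
        state["open"] = False  # runs even if the block raises

@logged
def running_total(values):
    # Generator expression: filtered sum without building a list.
    return sum(v for v in values if v > 0)

with resource("db") as r:
    total = running_total([3, -1, 4])

print(total, running_total.calls, r["open"])
```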
Metron Security Private Limited
Prathamesh Shinde
Posted by Prathamesh Shinde
Pune, Bengaluru (Bangalore)
3 - 5 yrs
₹4L - ₹10L / yr
Python
Cyber Security

Job Description:


We are looking for a skilled Backend Developer with 2–5 years of experience in software development, specializing in Python and/or Golang. If you have strong programming skills, enjoy solving problems, and want to work on secure and scalable systems, we'd love to hear from you!


Location - Pune, Baner.

Interview Rounds - In Office

An understanding of cybersecurity is a must.


Key Responsibilities:

Design, build, and maintain efficient, reusable, and reliable backend services using Python and/or Golang

Develop and maintain clean and scalable code following best practices

Apply Object-Oriented Programming (OOP) concepts in real-world development

Collaborate with front-end developers, QA, and other team members to deliver high-quality features

Debug, optimize, and improve existing systems and codebase

Participate in code reviews and team discussions

Work in an Agile/Scrum development environment


Required Skills: Strong experience in Python or Golang (working knowledge of both is a plus)


Good understanding of OOP principles

Familiarity with RESTful APIs and back-end frameworks

Experience with databases (SQL or NoSQL)

Excellent problem-solving and debugging skills

Strong communication and teamwork abilities


Good to Have:

Prior experience in the security industry

Familiarity with cloud platforms like AWS, Azure, or GCP

Knowledge of Docker, Kubernetes, or CI/CD tools

DeepIntent

Posted by Amruta Mundale
Pune
8 - 10 yrs
Best in industry
Technical support
SQL
Apache
Google Cloud Platform (GCP)
Amazon Web Services (AWS)

What You’ll Do:

We are looking for a Staff Operations Engineer based in Pune, India who can master both DeepIntent’s data architectures and pharma research and analytics methodologies to make significant contributions to how health media is analyzed by our clients. This role requires an Engineer who not only understands DBA functions but also how they impact research objectives and can work with researchers and data scientists to achieve impactful results.  

This role will be in the Engineering Operations team and will require integration and partnership with the Engineering Organization. The ideal candidate is an inquisitive self-starter who is not afraid to take on and learn from challenges and will constantly seek to improve the facets of the business they manage. The ideal candidate will also need to demonstrate the ability to collaborate and partner with others.

  • Serve as the Engineering interface between Analytics and Engineering teams.
  • Develop and standardize all interface points for analysts to retrieve and analyze data with a focus on research methodologies and data-based decision-making.
  • Optimize queries and data access efficiencies, serve as an expert in how to most efficiently attain desired data points.
  • Build “mastered” versions of the data for Analytics-specific querying use cases.
  • Establish a formal data practice for the Analytics practice in conjunction with the rest of DeepIntent.
  • Interpret analytics methodology requirements and apply them to data architecture to create standardized queries and operations for use by analytics teams.
  • Implement DataOps practices.
  • Master existing and new Data Pipelines and develop appropriate queries to meet analytics-specific objectives.
  • Collaborate with various business stakeholders, software engineers, machine learning engineers, and analysts.
  • Operate between Engineers and Analysts to unify both practices for analytics insight creation.

Who You Are:

  • 8+ years of experience in technical support, specialising in monitoring and maintaining data pipelines.
  • Adept in market research methodologies and using data to deliver representative insights.
  • Inquisitive, curious, understands how to query complicated data sets, move and combine data between databases.
  • Deep SQL experience is a must.
  • Exceptional communication skills with the ability to collaborate and translate between technical and non-technical needs.
  • English Language Fluency and proven success working with teams in the U.S.
  • Experience in designing, developing and operating configurable Data pipelines serving high-volume and velocity data.
  • Experience working with public clouds like GCP/AWS.
  • Good understanding of software engineering, DataOps, and data architecture, Agile and DevOps methodologies.
  • Proficient with SQL, Python or JVM-based language, Bash.
  • Experience with any of Apache open-source projects such as Spark, Druid, Beam, Airflow etc. and big data databases like BigQuery, Clickhouse, etc. 
  • Ability to think big, take bets and innovate, dive deep, hire and develop the best talent, learn and be curious.
  • Experience in debugging UI and backend issues is an added advantage.
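"Moving and combining data between databases" with efficient queries can be sketched with the stdlib's sqlite3 module (the table names and schema are invented for illustration; production work would target BigQuery, ClickHouse, or similar):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE campaigns (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE impressions (campaign_id INTEGER, served INTEGER);
    -- An index on the join key keeps the lookup efficient as data grows.
    CREATE INDEX idx_imp_campaign ON impressions(campaign_id);
""")
con.executemany("INSERT INTO campaigns VALUES (?, ?)",
                [(1, "spring"), (2, "fall")])
con.executemany("INSERT INTO impressions VALUES (?, ?)",
                [(1, 100), (1, 250), (2, 40)])

# A "mastered" aggregate analysts can query directly, per campaign.
rows = con.execute("""
    SELECT c.name, SUM(i.served) AS total
    FROM campaigns c
    JOIN impressions i ON i.campaign_id = c.id
    GROUP BY c.name
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('spring', 350), ('fall', 40)]
```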


IntegriMart

Agency job
Pune
5 - 8 yrs
₹12L - ₹15L / yr
Manual testing
Automation
Python
Playwright
Amazon Web Services (AWS)

Hope you are doing great!

We have an Urgent opening for a Senior Automation QA professional to join a global life sciences data platform company. Immediate interview slots available.


🔹 Quick Role Overview

  • Role: Senior Automation QA
  • Location: Pune (Hybrid - 3 days work from office)
  • Employment Type: Full-Time
  • Experience Required: 5+ Years
  • Interview Process: 2–3 Rounds
  • Qualification: B.E / B.Tech
  • Notice Period : 0-30 Days


📌 Job Description

IntegriChain is the data and business process platform for life sciences manufacturers, delivering visibility into patient access, affordability, and adherence. The platform enables manufacturers to drive gross-to-net savings, ensure channel integrity, and improve patient outcomes.

We are expanding our Engineering team to strengthen our ability to process large volumes of healthcare and pharmaceutical data at enterprise scale.

The Senior Automation QA will be responsible for ensuring software quality by designing, developing, and maintaining automated test frameworks. This role involves close collaboration with engineering and product teams, ownership of test strategy, mentoring junior QA engineers, and driving best practices to improve product reliability and release efficiency.


🎯 Key Responsibilities

  • Hands-on QA across UI, API, and Database testing – both Automation & Manual
  • Analyze requirements, user stories, and technical documents to design detailed test cases and test data
  • Design, build, execute, and maintain automation scripts using BDD (Gherkin), Pytest, and Playwright
  • Own and maintain QA artifacts: Test Strategy, BRD, defect metrics, leakage reports, quality dashboards
  • Work with stakeholders to review and improve testing approaches using data-backed quality metrics
  • Ensure maximum feasible automation coverage in every sprint
  • Perform functional, integration, and regression testing in Agile & DevOps environments
  • Drive Shift-left testing, identifying defects early and ensuring faster closures
  • Contribute to enhancing automation frameworks with minimal guidance
  • Lead and mentor a QA team (up to 5 members)
  • Support continuous improvement initiatives and institutionalize QA best practices
  • Act as a problem-solver and strong team collaborator in a fast-paced environment


🧩 Desired Skills & Competencies

✅ Must-Have:

  • 5+ years of experience in test planning, test case design, test data preparation, automation & manual testing
  • 3+ years of strong UI & API automation experience using Playwright with Python
  • Solid experience in BDD frameworks (Gherkin, Pytest)
  • Strong database testing skills (Postgres / Snowflake / MySQL / RDS)
  • Hands-on experience with Git and Jenkins (DevOps exposure)
  • Working experience with JMeter
  • Experience in Agile methodologies (Scrum / Kanban)
  • Excellent problem-solving, analytical, communication, and stakeholder management skills

👍 Good to Have:

  • Experience testing AWS / Cloud-hosted applications
  • Exposure to ETL processes and BI reporting systems
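The BDD stack named above (Gherkin + Pytest) works by mapping plain-language steps to test code. A deliberately tiny, framework-free sketch of that mapping (real projects would use pytest-bdd or behave; the cart scenario is invented):

```python
import re

STEPS = {}

def step(pattern):
    """Register a step implementation for a Gherkin-style line."""
    def register(fn):
        STEPS[re.compile(pattern)] = fn
        return fn
    return register

@step(r"Given a cart with (\d+) items")
def given_cart(ctx, n):
    ctx["items"] = int(n)

@step(r"When I add (\d+) more")
def when_add(ctx, n):
    ctx["items"] += int(n)

@step(r"Then the cart has (\d+) items")
def then_check(ctx, n):
    assert ctx["items"] == int(n), ctx["items"]

def run_scenario(lines):
    """Match each scenario line against a registered step and run it."""
    ctx = {}
    for line in lines:
        for pattern, fn in STEPS.items():
            m = pattern.fullmatch(line.strip())
            if m:
                fn(ctx, *m.groups())
                break
        else:
            raise ValueError(f"no step matches: {line!r}")
    return ctx

ctx = run_scenario([
    "Given a cart with 2 items",
    "When I add 3 more",
    "Then the cart has 5 items",
])
print(ctx)  # {'items': 5}
```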


ExterData Technologies Pvt Ltd
Pune
3 - 6 yrs
₹3L - ₹5L / yr
Laravel
CodeIgniter
Symfony
MVC Framework
PHP

Seeking a skilled and experienced Full Stack PHP Developer to contribute to the development of scalable, secure, and high-performance digital platforms.

The role involves working with custom PHP frameworks similar to Laravel (PHP 8+, Laravel 11/12), MySQL, and front-end technologies such as Bootstrap, Blade, and React.js.

The ideal candidate will have strong expertise in full-stack development, RESTful API integration, and implementing complex workflows in a multi-stakeholder environment.


Key Responsibilities:


• Design, develop, and optimize backend modules using Laravel (PHP 8+, Laravel 11/12), ensuring modularity and scalability

• Develop and maintain RESTful APIs and integrate existing APIs.

• Build responsive UI components using Blade, Bootstrap 5, HTML5, CSS3, JavaScript, and jQuery

• Develop interactive dashboard components using React.js

• Manage user workflows including role-based access control (RBAC) and session handling

• Collaborate with UI/UX, DevOps, QA and Project Management teams to ensure timely feature delivery

• Optimize and manage MySQL schema and queries for large-scale datasets

• Use Git (GitHub/GitLab) and CI/CD pipelines for clean and versioned deployments

• Implement and manage Amazon S3 storage for secure file upload, retrieval, and deletion, including configuration of S3 buckets, IAM permissions, pre-signed URLs, and storage lifecycle policies

• Ensure adherence to GIGW 3.0, CERT-In, and accessibility/security standards

• Ensure code quality, performance, and security of the applications

• Participate in requirement analysis, system architecture, and technical design discussions

• Write clean, modular, and well-documented code, following modern development standards

• Troubleshoot and debug issues, implementing fixes and performance improvements across layers

• Adhere to software development best practices and agile methodologies


Required Skills and Qualifications:

• Bachelor’s degree in computer science, IT, or related field

• Minimum 3 years of hands-on experience in full stack development

• Expertise in Laravel (PHP 8+) and MySQL, including schema optimization

• Strong knowledge of HTML5, CSS3, JavaScript, jQuery, Bootstrap, and other modern front-end tools

• Experience using AI-powered tools for development, code generation, testing, documentation, or productivity enhancement (e.g., GitHub Copilot, ChatGPT, Postman AI)

• Proven experience in both developing and integrating RESTful APIs, including third-party or government APIs

• Hands-on experience with React.js for building dashboards or dynamic user interfaces

• Experience integrating Power BI dashboards or frontend visualization libraries

• Understanding of RBAC, secure authentication, and session/token management

• Working knowledge of Git-based version control systems, CI/CD workflows, and modern deployment tools (e.g., GitLab CI, Laravel Forge)

• Hands-on experience implementing and managing Amazon S3 storage for secure file upload, retrieval, and deletion, including configuration of S3 buckets, IAM permissions, pre-signed URLs, and storage lifecycle policies

• Exposure to Agile development and cross-functional team collaboration


DeepIntent

Posted by Amruta Mundale
Pune
5 - 8 yrs
Best in industry
Python
SQL
Spark
Airflow
pandas

What You’ll Do:

As a Sr. Data Scientist, you will work closely across DeepIntent Data Science teams located in New York, India, and Bosnia. The role will focus on building predictive models and implementing data-driven solutions to maximize ad effectiveness. You will also lead efforts in generating analyses and insights related to the measurement of campaign outcomes, Rx, patient journey, and supporting the evolution of the DeepIntent product suite. Activities in this position include developing and deploying models in production; reading campaign results; analyzing medical claims, clinical, demographic, and clickstream data; performing analysis and creating actionable insights; and summarizing and presenting results and recommended actions to internal stakeholders and external clients, as needed.

  • Explore ways to create better predictive models.
  • Analyze medical claims, clinical, demographic and clickstream data to produce and present actionable insights.
  • Explore ways of using inference, statistical, and machine learning techniques to improve the performance of existing algorithms and decision heuristics.
  • Design and deploy new iterations of production-level code.
  • Contribute posts to our upcoming technical blog.

Who You Are:

  • Bachelor’s degree in a STEM field, such as Statistics, Mathematics, Engineering, Biostatistics, Econometrics, Economics, Finance, or Data Science.
  • 5+ years of working experience as a Data Scientist or Researcher in digital marketing, consumer advertisement, telecom, or other areas requiring customer-level predictive analytics.
  • Advanced proficiency in performing statistical analysis in Python, including relevant libraries, is required.
  • Experience working with data processing, transformation and building model pipelines using tools such as Spark, Airflow, and Docker.
  • You have an understanding of the ad-tech ecosystem, digital marketing and advertising data and campaigns or familiarity with the US healthcare patient and provider systems (e.g. medical claims, medications).
  • You have varied and hands-on predictive machine learning experience (deep learning, boosting algorithms, inference…).
  • You are interested in translating complex quantitative results into meaningful findings and interpretable deliverables, and communicating with less technical audiences orally and in writing.
  • You can write production level code, work with Git repositories.
  • Active Kaggle participant.
  • Working experience with SQL.
  • Familiar with medical and healthcare data (medical claims, Rx, preferred).
  • Conversant with cloud technologies such as AWS or Google Cloud.
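Reading campaign results often starts with comparing outcome rates between exposed and control groups. A stdlib-only sketch of that calculation (the counts below are fabricated for illustration, not real campaign data):

```python
def lift(exposed_conv, exposed_n, control_conv, control_n):
    """Relative lift of the exposed group's conversion rate over control."""
    exposed_rate = exposed_conv / exposed_n
    control_rate = control_conv / control_n
    return {
        "exposed_rate": exposed_rate,
        "control_rate": control_rate,
        # (exposed - control) / control: how much better exposure performed
        "lift": (exposed_rate - control_rate) / control_rate,
    }

result = lift(exposed_conv=120, exposed_n=1000,
              control_conv=80, control_n=1000)
print(result)  # lift = (0.12 - 0.08) / 0.08 = 0.5, i.e. +50%
```

A production read would add significance testing and confounder adjustment on top of this raw comparison.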
AdElement

Posted by Ritisha Nigam
Pune
2 - 5 yrs
₹3L - ₹7L / yr
adtech
SQL
Java
JavaScript
Python

Company Description


AdElement is a leading digital advertising technology company that has been helping app publishers increase their ad revenue and reach untapped demand since 2011. With our expertise in connecting brands to app audiences on evolving screens, such as VR headsets and vehicle consoles, we enable our clients to be first to market. We have been recognized as the Google Agency of the Year and have offices globally, with our headquarters located in New Brunswick, New Jersey.


Job Description


Work alongside a highly skilled engineering team to design, develop, and maintain large-scale, highly performant, real-time applications.

Own building features, driving directly with product and other engineering teams.

Demonstrate excellent communication skills in working with technical and non-technical audiences.

Be an evangelist for best practices across all functions - developers, QA, and infrastructure/ops.

Be an evangelist for platform innovation and reuse.


Requirements:


2+ years of experience building large-scale and low-latency distributed systems.

Command of Java or C++.

Solid understanding of algorithms, data structures, performance optimization techniques, object-oriented programming, multi-threading, and real-time programming.

Experience with distributed caching, SQL/NO SQL, and other databases is a plus.

Experience with Big Data and cloud services such as AWS/GCP is a plus.

Experience in the advertising domain is a big plus.

B.S. or M.S. degree in Computer Science, Engineering, or equivalent.


Location: Pune, Maharashtra.





Wissen Technology

Posted by Janane Mohanasankaran
Bengaluru (Bangalore), Mumbai, Pune
4 - 7 yrs
Best in industry
Python
pandas
NumPy
SQL
HTML/CSS

Specific Knowledge/Skills


  1. 4-6 years of experience
  2. Proficiency in Python programming
  3. Basic knowledge of front-end development
  4. Basic knowledge of data manipulation and analysis libraries
  5. Code versioning and collaboration (Git)
  6. Knowledge of libraries for extracting data from websites
  7. Knowledge of SQL and NoSQL databases
  8. Familiarity with RESTful APIs
  9. Familiarity with Cloud (Azure/AWS) technologies
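"Libraries for extracting data from websites" usually means BeautifulSoup or similar; a dependency-free sketch with the stdlib's html.parser shows the same idea (the HTML snippet is a made-up example):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every href attribute from <a> tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

page = ('<ul><li><a href="/jobs/1">Dev</a></li>'
        '<li><a href="/jobs/2">QA</a></li></ul>')
parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # ['/jobs/1', '/jobs/2']
```

In practice one would fetch `page` over HTTP and use a more forgiving parser, but the extract-by-tag-and-attribute pattern is the same.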
MangoApps

Posted by Anjali Ghadge
Pune
3 - 5 yrs
₹5L - ₹20L / yr
Python
FastAPI
Large Language Models (LLM) tuning
Flask


A modern work platform means a single source of truth for your desk and deskless employees alike, where everything they need is organized and easy to find.


MangoApps was designed to unify your employee experience by combining intranet, communication, collaboration, and training into one intuitive, mobile-accessible workspace.


We are looking for a highly capable machine learning engineer to optimize our machine learning systems. You will be evaluating existing machine learning (ML) processes, performing statistical analysis to resolve data set problems, and enhancing the accuracy of our AI software's predictive automation capabilities.


To ensure success as a machine learning engineer, you should demonstrate solid data science knowledge and experience in a related ML role. A machine learning engineer will be someone whose expertise translates into the enhanced performance of predictive automation software.


AI/ML Engineer Responsibilities


  • Designing machine learning systems and self-running artificial intelligence (AI) software to automate predictive models.
  • Transforming data science prototypes and applying appropriate ML algorithms and tools.
  • Ensuring that algorithms generate accurate user recommendations.
  • Turning unstructured data into useful information by auto-tagging images and text-to-speech conversions.
  • Solving complex problems with multi-layered data sets, as well as optimizing existing machine learning libraries and frameworks.
  • Applying ML algorithms to huge volumes of historical data to make predictions.
  • Running tests, performing statistical analysis, and interpreting test results.
  • Documenting machine learning processes.
  • Keeping abreast of developments in machine learning.
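Auto-tagging, at its simplest, maps text to labels. A deliberately tiny keyword-based sketch of the idea (real systems would use trained classifiers or LLMs; the tag vocabulary here is invented):

```python
# Invented tag vocabulary for illustration only.
TAG_KEYWORDS = {
    "billing": {"invoice", "payment", "refund"},
    "support": {"error", "crash", "bug"},
}

def auto_tag(text):
    """Return every tag whose keyword set intersects the text's words."""
    words = set(text.lower().split())
    return sorted(tag for tag, kws in TAG_KEYWORDS.items() if words & kws)

print(auto_tag("Refund my last invoice"))   # ['billing']
print(auto_tag("App crash after payment"))  # ['billing', 'support']
```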

AI/ML Engineer Requirements


  • Bachelor's degree in computer science, data science, mathematics, or a related field with at least 5+yrs of experience as an AI/ML Engineer
  • Advanced proficiency with Python and FastAPI framework along with good exposure to libraries like scikit-learn, Pandas, NumPy etc..
  • Experience in working on ChatGPT, LangChain (Must), Large Language Models (Good to have) & Knowledge Graphs
  • Extensive knowledge of ML frameworks, libraries, data structures, data modelling, and software architecture.
  • In-depth knowledge of mathematics, statistics, and algorithms.
  • Superb analytical and problem-solving abilities.
  • Great communication and collaboration skills.



Skills: FastAPI, Python, Large Language Models, Pandas, Artificial Intelligence, Mathematics, Machine Learning, Knowledge Graphs, Flask, Python for Data Analysis, scikit-learn, LangChain, Algorithms, Data Science, ChatGPT, NumPy, Statistics




Why work with us



  1. We take delight in what we do, and it shows in the products we offer and ratings of our products by leading industry analysts like IDC, Forrester and Gartner OR independent sites like Capterra.
  2. Be part of the team that has a great product-market fit, solving some of the most relevant communication and collaboration challenges faced by big and small organizations across the globe.
  3. MangoApps is a highly collaborative place, and careers at MangoApps come with a lot of growth and learning opportunities. If you’re looking to make an impact, MangoApps is the place for you.
  4. We focus on getting things done and know how to have fun while we do them. We have a team that brings creativity, energy, and excellence to every engagement.
  5. A workplace that was listed as one of the top 51 Dream Companies to work for by World HRD Congress in 2019.
  6. As a group, we are flat and treat everyone the same.

 

Benefits

 

We are a young organization and growing fast. Along with the fantastic workplace culture that helps you meet your career aspirations; we provide some comprehensive benefits.

 

1. Comprehensive Health Insurance for Family (Including Parents) with no riders attached.

2. Accident Insurance for each employee.

3. Sponsored Trainings, Courses and Nano Degrees.


About You


• Self-motivated: You can work with a minimum of supervision and be capable of strategically prioritizing multiple tasks in a proactive manner.

• Driven: You are a driven team player, collaborator, and relationship builder whose infectious can-do attitude inspires others and encourages great performance in a fast-moving environment.

• Entrepreneurial: You thrive in a fast-paced, changing environment and you’re excited by the chance to play a large role.

• Passionate: You must be passionate about online collaboration and ensuring our clients are successful; we love seeing hunger and ambition.

• Thrive in a start-up mentality with a “whatever it takes” attitude.

Innover Systems
Shivani Kawade
Posted by Shivani Kawade
Pune
2 - 5 yrs
₹4L - ₹10L / yr
Python
OpenCV
PyTorch
TensorFlow

Position: Machine Learning Engineer 

Job Type: Full Time, Permanent 

Location : Baner, Pune 

 

We are looking for a visionary Machine Learning Engineer to bridge the gap between cutting-edge research and real-world applications. In this role, you won’t just be training models; you will be architecting the visual intelligence that powers our next generation of products. You will join a high-impact team dedicated to solving complex spatial and visual problems at scale. 

What You’ll Do 

  • Architect & Innovate: Own the journey from research to production-ready computer vision pipelines. 
  • Optimize Performance: Build high-speed deep learning models for real-time detection and recognition. 
  • Lead with Data: Curate massive datasets and apply advanced engineering to maximize model accuracy. 
  • Collaborate & Integrate: Partner with cross-functional teams to embed AI insights into core products. 
  • Stay Ahead: Proactively prototype SOTA architectures to keep our solutions at the cutting edge. 

What We’re Looking For 

  • Expert in Python and OpenCV. You are framework-agnostic but a power user of PyTorch or TensorFlow.
  • 2–5 years of shipping ML products. You know how to move beyond research notebooks into scalable production code. 
  • A problem solver obsessed with robust, explainable AI and clean, rigorous validation. 
  • Strong mastery of the math behind CV—specifically geometry, linear algebra, and optimization. 
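That geometry and linear-algebra foundation shows up in operations as basic as rotating image points. A stdlib-only sketch of a 2D rotation about the origin (in practice OpenCV's `cv2.getRotationMatrix2D` and NumPy handle this):

```python
import math

def rotate_points(points, degrees):
    """Rotate 2D points counterclockwise about the origin.

    Applies the standard rotation matrix:
        x' = x*cos(t) - y*sin(t)
        y' = x*sin(t) + y*cos(t)
    Results are rounded to absorb floating-point noise.
    """
    t = math.radians(degrees)
    c, s = math.cos(t), math.sin(t)
    return [(round(c * x - s * y, 6), round(s * x + c * y, 6))
            for x, y in points]

print(rotate_points([(1, 0), (0, 1)], 90))  # [(0.0, 1.0), (-1.0, 0.0)]
```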


Global digital transformation solutions provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
6 - 9 yrs
₹15L - ₹25L / yr
Data engineering
Apache Kafka
Python
Amazon Web Services (AWS)
AWS Lambda

Job Details

- Job Title: Lead I - Data Engineering 

- Industry: Global digital transformation solutions provider

- Domain - Information technology (IT)

- Experience Required: 6-9 years

- Employment Type: Full Time

- Job Location: Pune

- CTC Range: Best in Industry


Job Description

Job Title: Senior Data Engineer (Kafka & AWS)

Responsibilities:

  • Develop and maintain real-time data pipelines using Apache Kafka (MSK or Confluent) and AWS services.
  • Configure and manage Kafka connectors, ensuring seamless data flow and integration across systems.
  • Demonstrate strong expertise in the Kafka ecosystem, including producers, consumers, brokers, topics, and schema registry.
  • Design and implement scalable ETL/ELT workflows to efficiently process large volumes of data.
  • Optimize data lake and data warehouse solutions using AWS services such as Lambda, S3, and Glue.
  • Implement robust monitoring, testing, and observability practices to ensure reliability and performance of data platforms.
  • Uphold data security, governance, and compliance standards across all data operations.
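A rough sketch of the producer/consumer mechanics listed above, using an in-memory Python list as a stand-in for a Kafka topic partition (real MSK/Confluent clients, broker offsets, and the schema registry replace all of this in practice; the JSON envelope format is invented for illustration):

```python
import json

# Stand-in for a Kafka topic partition: an append-only log of serialized records.
topic_log = []

def produce(log, key, value):
    """Serialize a record and append it to the log, returning its offset."""
    record = json.dumps({"key": key, "value": value})
    log.append(record)
    return len(log) - 1

def consume(log, committed_offset):
    """At-least-once consumption: process every record past the last committed offset."""
    processed = []
    for offset in range(committed_offset + 1, len(log)):
        record = json.loads(log[offset])
        processed.append(record["value"])
        committed_offset = offset  # a real consumer commits the offset after processing
    return processed, committed_offset

produce(topic_log, "order-1", {"amount": 10})
produce(topic_log, "order-2", {"amount": 25})
values, last_offset = consume(topic_log, committed_offset=-1)
```

Committing only after processing is what gives at-least-once semantics: a crash before the commit replays the record on restart rather than losing it.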

 

Requirements:

  • Minimum of 5 years of experience in Data Engineering or related roles.
  • Proven expertise with Apache Kafka and the AWS data stack (MSK, Glue, Lambda, S3, etc.).
  • Proficient in Python, SQL, and Java, with Java strongly preferred.
  • Experience with Infrastructure-as-Code (IaC) tools (e.g., CloudFormation) and CI/CD pipelines.
  • Excellent problem-solving, communication, and collaboration skills.
  • Flexibility to write production-quality code in both Python and Java as required.

 

Skills: AWS, Kafka, Python


Must-Haves

Minimum of 5 years of experience in Data Engineering or related roles.

Proven expertise with Apache Kafka and the AWS data stack (MSK, Glue, Lambda, S3, etc.).

Proficient in Python, SQL, and Java, with Java strongly preferred.

Experience with Infrastructure-as-Code (IaC) tools (e.g., CloudFormation) and CI/CD pipelines.

Excellent problem-solving, communication, and collaboration skills.

Flexibility to write production-quality code in both Python and Java as required.

Skills: AWS, Kafka, Python

Notice period: 0 to 15 days only

Highfly Sourcing

Posted by Highfly Hr
Singapore, Switzerland, New Zealand, Dubai, Dublin, Ireland, Augsburg, Germany, Manchester (United Kingdom), Qatar, Kuwait, Malaysia, Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Goa
3 - 5 yrs
₹15L - ₹25L / yr
SQL
PHP
Python
Data Visualization
Data Structures

We are seeking a motivated Data Analyst to support business operations by analyzing data, preparing reports, and delivering meaningful insights. The ideal candidate should be comfortable working with data, identifying patterns, and presenting findings in a clear and actionable way.

Key Responsibilities:

  • Collect, clean, and organize data from internal and external sources
  • Analyze large datasets to identify trends, patterns, and opportunities
  • Prepare regular and ad-hoc reports for business stakeholders
  • Create dashboards and visualizations using tools like Power BI or Tableau
  • Work closely with cross-functional teams to understand data requirements
  • Ensure data accuracy, consistency, and quality across reports
  • Document data processes and analysis methods
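A minimal stdlib-only sketch of the collect-clean-aggregate loop described above (the column names and sample rows are invented; in practice pandas, Power BI, or Tableau would sit on top of this step):

```python
import csv
import io
from collections import defaultdict

# Hypothetical raw export: stray whitespace and an incomplete row to clean up.
raw = """region,revenue
North , 100
South,200
East,
North,50
"""

totals = defaultdict(float)
for row in csv.DictReader(io.StringIO(raw)):
    region = (row["region"] or "").strip()
    revenue = (row["revenue"] or "").strip()
    if not region or not revenue:  # drop incomplete rows before aggregating
        continue
    totals[region] += float(revenue)
```

The cleaning pass (strip whitespace, skip incomplete rows) happens before aggregation, so a single bad export row cannot poison the report totals.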


Digital Convergence Technologies
Pune, Bengaluru (Bangalore), Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Mumbai, Hyderabad
5 - 8 yrs
₹40L - ₹45L / yr
Artificial Intelligence (AI)
Data-flow analysis
Microsoft SharePoint
API
Python

AI Agent Builder – Internal Functions and Data Platform Development Tools


About the Role:

We are seeking a forward-thinking AI Agent Builder to lead the design, development, deployment, and usage reporting of Microsoft Copilot and other AI-powered agents across our data platform development tools and internal business functions. This role will be instrumental in driving automation, improving onboarding, and enhancing operational efficiency through intelligent, context-aware assistants.

This role is central to our GenAI transformation strategy. You will help shape the future of how our teams interact with data, reduce administrative burden, and unlock new efficiencies across the organization. Your work will directly contribute to our “Art of the Possible” initiative, demonstrating tangible business value through AI.

You Will:

  • Copilot Agent Development: Use Microsoft Copilot Studio and Agent Builder to create, test, and deploy AI agents that automate workflows, answer queries, and support internal teams.
  • Data Engineering Enablement: Build agents that assist with data connector scaffolding, pipeline generation, and onboarding support for engineers.
  • Knowledge Base Integration: Curate and integrate documentation (e.g., ERDs, connector specs) into Copilot-accessible repositories (SharePoint, Confluence) to support contextual AI responses.
  • Prompt Engineering: Design reusable prompt templates and conversational flows to streamline repeated tasks and improve agent usability.
  • Tool Evaluation & Integration: Assess and integrate complementary AI tools (e.g., GitLab Duo, Databricks AI, Notebook LM) to extend Copilot capabilities.
  • Cross-Functional Collaboration: Partner with product, delivery, PMO, and security teams to identify high-value use cases and scale successful agent implementations.
  • Governance & Monitoring: Ensure agents align with Responsible AI principles, monitor performance, and iterate based on feedback and evolving business needs.
  • Adoption and Usage Reporting: Use Microsoft Viva Insights and other tools to report on user adoption, usage, and business value delivered.
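As a toy illustration of the reusable prompt templates mentioned above, here is a hand-rolled Python sketch (the template text and field names are invented and not tied to Copilot Studio's actual template format):

```python
from string import Template

# Hypothetical reusable prompt template for a connector-onboarding agent.
CONNECTOR_PROMPT = Template(
    "You are an onboarding assistant for data engineers.\n"
    "Using the connector spec for $connector, explain how to configure\n"
    "a pipeline that lands data in $destination. Cite the relevant ERD section."
)

def render_prompt(connector, destination):
    """Fill in the template; substitute() raises if a placeholder is left unfilled."""
    return CONNECTOR_PROMPT.substitute(connector=connector, destination=destination)

prompt = render_prompt("Salesforce", "Unity Catalog")
```

Keeping templates as data rather than inline strings is what makes them reusable across agents and easy to review for prompt changes.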

What We're Looking For:

  • Proven experience with Microsoft 365 Copilot, Copilot Studio, or similar AI platforms (e.g., ChatGPT, Claude).
  • Strong understanding of data engineering workflows, tools (e.g., Git, Databricks, Unity Catalog), and documentation practices.
  • Familiarity with SharePoint, Confluence, and Microsoft Graph connectors.
  • Experience in prompt engineering and conversational UX design.
  • Ability to translate business needs into scalable AI solutions.
  • Excellent communication and collaboration skills across technical and non-technical audiences.

Bonus Points:

  • Experience with GitLab Duo, Notebook LM, or other AI developer tools.
  • Background in enterprise data platforms, ETL pipelines, or internal business systems.
  • Exposure to AI governance, security, and compliance frameworks.
  • Prior work in a regulated industry (e.g., healthcare, finance) is a plus.

Global digital transformation solutions provider.

Agency job
via Peak Hire Solutions by Dhara Thakkar
Chennai, Kochi (Cochin), Pune, Trivandrum, Thiruvananthapuram
5 - 7 yrs
₹10L - ₹25L / yr
Google Cloud Platform (GCP)
Jenkins
CI/CD
Docker
Kubernetes

Job Description

We are seeking a highly skilled Site Reliability Engineer (SRE) with strong expertise in Google Cloud Platform (GCP) and CI/CD automation to lead cloud infrastructure initiatives. The ideal candidate will design and implement robust CI/CD pipelines, automate deployments, ensure platform reliability, and drive continuous improvement in cloud operations and DevOps practices.


Key Responsibilities:

  • Design, develop, and optimize end-to-end CI/CD pipelines using Jenkins, with a strong focus on Declarative Pipeline syntax.
  • Automate deployment, scaling, and management of applications across various GCP services including GKE, Cloud Run, Compute Engine, Cloud SQL, Cloud Storage, VPC, and Cloud Functions.
  • Collaborate closely with development and DevOps teams to ensure seamless integration of applications into the CI/CD pipeline and GCP environment.
  • Implement and manage monitoring, logging, and alerting solutions to maintain visibility, reliability, and performance of cloud infrastructure and applications.
  • Ensure compliance with security best practices and organizational policies across GCP environments.
  • Document processes, configurations, and architectural decisions to maintain operational transparency.
  • Stay updated with the latest GCP services, DevOps, and SRE best practices to enhance infrastructure efficiency and reliability.


Mandatory Skills:

  • Google Cloud Platform (GCP) – Hands-on experience with core GCP compute, networking, and storage services.
  • Jenkins – Expertise in Declarative Pipeline creation and optimization.
  • CI/CD – Strong understanding of automated build, test, and deployment workflows.
  • Solid understanding of SRE principles including automation, scalability, observability, and system reliability.
  • Familiarity with containerization and orchestration tools (Docker, Kubernetes – GKE).
  • Proficiency in scripting languages such as Shell, Python, or Groovy for automation tasks.
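Automation scripts of the kind listed above often wrap flaky deploy or health-check calls in retry-with-backoff logic; a small stdlib-only Python sketch (the helper name, delays, and simulated check are illustrative):

```python
import time

def retry_with_backoff(operation, max_attempts=4, base_delay=0.01):
    """Call `operation` until it succeeds, doubling the wait after each failure."""
    delay = base_delay
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except Exception:
            if attempt == max_attempts:
                raise  # out of attempts: surface the last error to the caller
            time.sleep(delay)
            delay *= 2  # exponential backoff between retries

# Simulated flaky health check: fails twice, then reports healthy.
calls = {"n": 0}
def flaky_health_check():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("endpoint not ready")
    return "healthy"

status = retry_with_backoff(flaky_health_check)
```

Production versions usually add jitter to the delay and retry only on transient error classes, but the shape of the loop is the same.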


Preferred Skills:

  • Experience with Terraform, Ansible, or Cloud Deployment Manager for Infrastructure as Code (IaC).
  • Exposure to monitoring and observability tools like Stackdriver, Prometheus, or Grafana.
  • Knowledge of multi-cloud or hybrid environments (AWS experience is a plus).
  • GCP certification (Professional Cloud DevOps Engineer / Cloud Architect) preferred.


Skills

GCP, Jenkins, CI/CD, AWS


Nice to Haves

Experience with Terraform, Ansible, or Cloud Deployment Manager for Infrastructure as Code (IaC).

Exposure to monitoring and observability tools like Stackdriver, Prometheus, or Grafana.

Knowledge of multi-cloud or hybrid environments (AWS experience is a plus).

GCP certification (Professional Cloud DevOps Engineer / Cloud Architect) preferred.

 


Notice period: 0 to 15 days only

Location – Pune, Trivandrum, Kochi, Chennai

Deqode

Posted by Samiksha Agrawal
Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Mumbai, Pune, Bengaluru (Bangalore), Hyderabad, Jaipur, Bhopal
5 - 8 yrs
₹5L - ₹13L / yr
Python
Azure
Artificial Intelligence (AI)
FastAPI
Flask

Job Description: Python-Azure AI Developer

Experience: 5+ years

Locations: Bangalore | Pune | Chennai | Jaipur | Hyderabad | Gurgaon | Bhopal

Mandatory Skills:

  • Python: Expert-level proficiency with FastAPI/Flask
  • Azure Services: Hands-on experience integrating Azure cloud services
  • Databases: PostgreSQL, Redis
  • AI Expertise: Exposure to Agentic AI technologies, frameworks, or SDKs with strong conceptual understanding

Good to Have:

  • Workflow automation tools (n8n or similar)
  • Experience with LangChain, AutoGen, or other AI agent frameworks
  • Azure OpenAI Service knowledge

Key Responsibilities:

  • Develop AI-powered applications using Python and Azure
  • Build RESTful APIs with FastAPI/Flask
  • Integrate Azure services for AI/ML workloads
  • Implement agentic AI solutions
  • Database optimization and management
  • Workflow automation implementation
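As a framework-free sketch of the API layer described above (FastAPI/Flask supply real routing, validation, and async handling; the route names and payloads here are invented):

```python
import json

# Minimal route table mapping (method, path) to handler functions,
# mimicking what FastAPI/Flask decorators set up under the hood.
ROUTES = {}

def route(method, path):
    def register(handler):
        ROUTES[(method, path)] = handler
        return handler
    return register

@route("GET", "/health")
def health():
    return {"status": "ok"}

@route("POST", "/agents")
def create_agent(payload):
    # Real code would validate the payload (FastAPI uses pydantic for this).
    return {"created": payload["name"]}

def dispatch(method, path, body=None):
    """Look up a handler and return (status_code, JSON response body)."""
    handler = ROUTES.get((method, path))
    if handler is None:
        return 404, json.dumps({"error": "not found"})
    result = handler(json.loads(body)) if body else handler()
    return 200, json.dumps(result)

status_code, response = dispatch("POST", "/agents", body='{"name": "copilot-helper"}')
```

The decorator-driven route table is the same pattern FastAPI's `@app.get`/`@app.post` decorators implement, which is why handlers stay plain testable functions.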

