
50+ Python Jobs in Pune | Python Job openings in Pune

Apply to 50+ Python Jobs in Pune on CutShort.io. Explore the latest Python Job opportunities across top companies like Google, Amazon & Adobe.

Nirmitee.io
4 recruiters
Posted by Gitashri K
Pune
5 - 9 yrs
₹5L - ₹15L / yr
SQL
Python
AML
Anti Money Laundering

Job Description :

  • Proficient in SQL and data analysis tools and techniques (SQL, Python, etc.)
  • Banking domain expertise
  • Strong understanding of AML (Anti-Money Laundering) and fraud regulations
  • Excellent data analysis and problem-solving skills
  • Attention to data detail and accuracy
  • Identifies, creates, and analyzes data, information, and reports to make recommendations and enhance organizational capability
  • Experience in using business analysis tools and techniques
  • Knowledge and understanding of various business analysis methodologies
  • Knowledge of data extraction, transformation, and mapping
  • Functional knowledge of AML/fraud
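The SQL-plus-Python analysis this role describes can be illustrated with a small, hypothetical sketch. The table, threshold, and "structuring" rule below (several cash deposits just under a reporting limit) are made up for illustration, not part of the listing; real AML monitoring uses far richer rules and case-management tooling.

```python
import sqlite3

# Hypothetical AML "structuring" rule: flag customers who make several
# cash deposits just under a 10,000 reporting threshold on the same day.
THRESHOLD = 10_000
MIN_COUNT = 3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (customer_id TEXT, tx_date TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?)",
    [
        ("C1", "2024-01-05", 9_500.0),   # three sub-threshold deposits -> suspicious
        ("C1", "2024-01-05", 9_800.0),
        ("C1", "2024-01-05", 9_900.0),
        ("C2", "2024-01-05", 12_000.0),  # single large deposit -> not this rule
        ("C3", "2024-01-05", 200.0),
    ],
)

def flag_structuring(conn):
    """Return (customer_id, tx_date) pairs matching the structuring rule."""
    return conn.execute(
        """
        SELECT customer_id, tx_date
        FROM transactions
        WHERE amount BETWEEN ? * 0.9 AND ? - 1
        GROUP BY customer_id, tx_date
        HAVING COUNT(*) >= ?
        """,
        (THRESHOLD, THRESHOLD, MIN_COUNT),
    ).fetchall()

print(flag_structuring(conn))  # [('C1', '2024-01-05')]
```

The point of the sketch is the division of labor: SQL does the grouping and filtering close to the data, and Python parameterizes and orchestrates the rule.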


Notice period: Immediate joiner to 30 days max


Client based in Bangalore.

Agency job
Bengaluru (Bangalore), Pune, Chennai
4 - 8 yrs
₹8L - ₹16L / yr
Ab Initio
ETL
Python
SQL

Ab Initio Developer

 

About the Role:

We are seeking a skilled Ab Initio Developer to join our dynamic team and contribute to the development and maintenance of critical data integration solutions. As an Ab Initio Developer, you will be responsible for designing, developing, and implementing robust and efficient data pipelines using Ab Initio's powerful ETL capabilities.

Key Responsibilities:

  • Design, develop, and implement complex data integration solutions using Ab Initio's graphical interface and command-line tools.
  • Analyze complex data requirements and translate them into effective Ab Initio designs.
  • Develop and maintain efficient data pipelines, including data extraction, transformation, and loading processes.
  • Troubleshoot and resolve technical issues related to Ab Initio jobs and data flows.
  • Optimize performance and scalability of Ab Initio jobs.
  • Collaborate with business analysts, data analysts, and other team members to understand data requirements and deliver solutions that meet business needs.
  • Stay up-to-date with the latest Ab Initio technologies and industry best practices.

Required Skills and Experience:

  • 2.5 to 8 years of hands-on experience in Ab Initio development.
  • Strong understanding of Ab Initio components, including Designer, Conductor, and Monitor.
  • Proficiency in Ab Initio's graphical interface and command-line tools.
  • Experience in data modeling, data warehousing, and ETL concepts.
  • Strong SQL skills and experience with relational databases.
  • Excellent problem-solving and analytical skills.
  • Ability to work independently and as part of a team.
  • Strong communication and documentation skills.

Preferred Skills:

  • Experience with cloud-based data integration platforms.
  • Knowledge of data quality and data governance concepts.
  • Experience with scripting languages (e.g., Python, shell scripting).
  • Certification in Ab Initio or related technologies.
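Ab Initio itself is a proprietary, graphical ETL platform, so its graphs can't be reproduced here; as a tool-agnostic sketch of the extract → transform → load stages the listing describes, here is a plain-Python analogue with made-up records and a simple data-quality rule.

```python
import sqlite3

# Tool-agnostic ETL sketch: mirrors the extract -> transform -> load stages
# conceptually. Records, validation rule, and target table are all made up.

def extract():
    """Extract: pretend these rows came from a source file or feed."""
    return [
        {"id": "1", "name": " Alice ", "amount": "120.50"},
        {"id": "2", "name": "Bob",     "amount": "80.00"},
        {"id": "3", "name": "",        "amount": "15.25"},  # bad record
    ]

def transform(rows):
    """Transform: trim names, cast types, reject rows failing validation."""
    clean = []
    for r in rows:
        name = r["name"].strip()
        if not name:  # simple data-quality rule: reject blank names
            continue
        clean.append((int(r["id"]), name, float(r["amount"])))
    return clean

def load(rows, conn):
    """Load: write validated rows to a target table, return row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS target (id INT, name TEXT, amount REAL)")
    conn.executemany("INSERT INTO target VALUES (?, ?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract()), conn)
print(loaded)  # 2 rows survive validation
```

In Ab Initio these stages would be components wired together in a graph; the separation of concerns (and the rejection path for bad records) is the transferable idea.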


Vijay Sales
Posted by Tech Recruiter
Pune
1 - 5 yrs
₹3L - ₹20L / yr
Artificial Intelligence (AI)
Machine Learning (ML)
Large Language Models (LLM)
Language models
Python

Job Title: AI/ML Engineer

Location: Pune

Experience Level: 1-5 Years

Job Type: Full-Time


About Us

Vijay Sales is one of India’s leading retail brands, offering a wide range of electronics and home appliances across multiple channels. As we continue to expand, we are building advanced technology solutions to optimise operations, improve customer experience, and drive growth. Join us in shaping the future of retail with innovative AI/ML-powered solutions.


Role Overview

We are looking for an AI/ML Engineer to join our technology team and help drive innovation across our business. In this role, you will design, develop, and implement machine learning models for applications like inventory forecasting, pricing automation, customer insights, and operational efficiencies. Collaborating with a cross-functional team, you’ll ensure our AI/ML solutions deliver measurable impact.


Key Responsibilities

  • Develop and deploy machine learning models to address business challenges such as inventory forecasting, dynamic pricing, demand prediction, and customer segmentation.
  • Preprocess and analyze large volumes of sales and customer data to uncover actionable insights.
  • Design algorithms for supervised, unsupervised, and reinforcement learning tailored to retail use cases.
  • Implement and manage pipelines to deploy and monitor models in production environments.
  • Continuously optimize model performance through retraining, fine-tuning, and feedback loops.
  • Work closely with business teams to identify requirements and translate them into AI/ML solutions.
  • Stay current with the latest AI/ML advancements and leverage them to enhance Vijay Sales’ technology stack.


Qualifications

Required:

  • Bachelor’s/Master’s degree in Computer Science, Data Science, Mathematics, or a related field.
  • Proficiency in Python and ML frameworks such as TensorFlow, PyTorch, or Scikit-learn.
  • Proven experience in developing, training, and deploying machine learning models.
  • Strong understanding of data processing, feature engineering, and data pipeline design.
  • Knowledge of algorithms for forecasting, classification, clustering, and optimization.
  • Experience working with large-scale datasets and databases (SQL/NoSQL).

Preferred:

  • Familiarity with retail industry challenges, such as inventory and pricing management.
  • Experience with cloud platforms (AWS, GCP, or Azure) for deploying ML solutions.
  • Knowledge of MLOps practices for scalable and efficient model management.
  • Hands-on experience with time-series analysis and demand forecasting models.
  • Understanding of customer analytics and personalization techniques.
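The time-series and demand-forecasting work mentioned above can be reduced to its simplest baseline for illustration. The moving-average forecaster and sales figures below are hypothetical, not from the listing; production systems would use richer models (ARIMA, gradient boosting, etc.) and real sales history.

```python
# Minimal demand-forecasting baseline: predict next period's demand as the
# moving average of the last k observations. Numbers are made up.

def moving_average_forecast(history, k=3):
    """Forecast the next value as the mean of the last k observations."""
    if len(history) < k:
        raise ValueError("need at least k observations")
    window = history[-k:]
    return sum(window) / k

weekly_units_sold = [120, 135, 128, 140, 150, 149]
print(moving_average_forecast(weekly_units_sold, k=3))  # (140 + 150 + 149) / 3
```

A baseline like this matters in practice: any fancier model has to beat it to justify its complexity.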


Why Join Vijay Sales?

  • Work with one of India’s most iconic retail brands as we innovate and grow.
  • Be part of a team building transformative AI/ML solutions for the retail industry.
  • A collaborative work environment that encourages creativity and learning.
  • Competitive salary and benefits package, along with exciting growth opportunities.


Ready to make an impact? Apply now and help shape the future of Vijay Sales with cutting-edge AI/ML technologies!

Wissen Technology
4 recruiters
Posted by Tony Tom
Bengaluru (Bangalore), Pune, Mumbai
4 - 10 yrs
Best in industry
Snowflake
Python
Snowpark

  • Designing and building data pipelines: Guiding customers on how to architect and build data engineering pipelines on Snowflake, leveraging Snowpark, UDFs, etc., for IoT streaming analytics.
  • Developing code: Writing code in familiar programming languages, such as Python, and executing it within the Snowflake Data Cloud.
  • Data science: Performing complex time-series analyses for predictive and prescriptive maintenance, then applying deep learning (ANN, RNN, etc.) to uncover additional algorithms and embed them into the MLOps pipeline.
  • Analyzing and testing software: Performing complex analysis, design, development, testing, and debugging of computer software.
  • Creating documentation: Creating repeatable processes and documentation as a result of customer engagements.
  • Developing best practices: Developing best practices, including ensuring knowledge transfer so that customers are properly enabled.
  • Working with stakeholders: Working with appropriate stakeholders to define system scope and objectives and establish baselines.
  • Querying and processing data: Querying and processing data with a DataFrame object.
  • Converting lambdas and functions: Converting custom lambdas and functions to user-defined functions (UDFs) that you can call to process data.
  • Writing stored procedures: Writing stored procedures that you can call to process data, or automate with a task to build a data pipeline.
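Actual Snowpark code needs a live Snowflake session, so it can't run standalone here; as a local, hypothetical analogue of the "convert lambdas to UDFs" workflow the listing names, the sketch below promotes an inline lambda to a named, documented function applied over rows of made-up sensor data.

```python
# Snowpark registers Python functions as UDFs and applies them to DataFrame
# columns server-side. Without a Snowflake session, this local sketch only
# illustrates the shape of that workflow: an inline lambda becomes a named,
# reusable, testable function. Data and conversion are made up.

rows = [
    {"device_id": "d1", "temp_c": 21.5},
    {"device_id": "d2", "temp_c": 38.0},
]

# Inline lambda version: hard to reuse, test, or document.
fahrenheit = [(lambda c: c * 9 / 5 + 32)(r["temp_c"]) for r in rows]

# "UDF" version: a named function, reusable across pipelines.
def celsius_to_fahrenheit(c: float) -> float:
    """Convert Celsius to Fahrenheit."""
    return c * 9 / 5 + 32

udf_result = [celsius_to_fahrenheit(r["temp_c"]) for r in rows]
print(udf_result)
```

In Snowpark the named function would be registered with a `udf` decorator or `register` call and pushed down to execute inside Snowflake, next to the data.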

Tech Prescient
2 candid answers
3 recruiters
Posted by Ashwini Kulkarni
Pune
8 - 10 yrs
Best in industry
Vue.js
AngularJS (1.x)
Angular (2+)
React.js
JavaScript
+6 more

Job Position: Senior Technical Lead

Desired Skills: Python, Flask/FastAPI, MySQL/PostgreSQL, NoSQL, AWS, JavaScript, Angular/React

Experience Range: 8 - 10 Years

Type: Full Time

Location: India (Pune)

Availability: Immediate to 30 Days


Job Description: Tech Prescient is looking for an experienced and proven Technical Lead (Python/Flask/FastAPI/React/AWS/Azure Cloud) who has worked on the modern full stack to deliver software products and solutions. He/She should have experience in leading from the front, handling customer situations and internal teams, anchoring project communications, and delivering an outstanding experience to our customers.

In specific, below are some of the must-have skills and experiences to fulfill the job requirements - 

  1. 8+ years of relevant software design and development experience building cloud native applications using Python and JavaScript stack.
  2. Thorough understanding of deploying to at least one of the Cloud platforms (AWS or Azure). Knowledge of Kubernetes is an added advantage.
  3. Experience with microservices architecture and serverless deployments.
  4. Well-versed with RESTful services and building scalable API architectures using any of the Python frameworks. 
  5. Hands-on with Frontend technologies using either Angular or React
  6. Experience managing distributed delivery teams, tech leadership, ideating with the customer leadership, design discussions and code reviews to deliver quality software products
  7. Good attitude and passion to learn new technologies on the job. 
  8. Good communication and leadership skills. Ability to lead the internal team as well as customer communication (email/calls) 
  9. Bachelor's degree in Computer Science, Engineering, or a related field; advanced degree preferred.
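The RESTful-services skill above is usually exercised through Flask or FastAPI; both require third-party installs, so as a dependency-free sketch, here is a single RESTful resource written as a raw WSGI application (the interface those frameworks build on). The route and payload are made up.

```python
import json

# Minimal RESTful resource as a raw WSGI app (PEP 3333). Flask builds on
# this same interface; the /users/<id> route and data are hypothetical.

USERS = {"1": {"id": "1", "name": "Asha"}}

def app(environ, start_response):
    path = environ["PATH_INFO"]
    if environ["REQUEST_METHOD"] == "GET" and path.startswith("/users/"):
        user = USERS.get(path.rsplit("/", 1)[-1])
        if user:
            start_response("200 OK", [("Content-Type", "application/json")])
            return [json.dumps(user).encode()]
    start_response("404 Not Found", [("Content-Type", "application/json")])
    return [b'{"error": "not found"}']

# Call the app directly with a minimal WSGI environ (no server needed).
captured = {}
def start_response(status, headers):
    captured["status"] = status

resp = app({"REQUEST_METHOD": "GET", "PATH_INFO": "/users/1"}, start_response)
print(captured["status"], resp[0])
```

Calling the app as a plain function, as done here, is also how WSGI apps are unit-tested without starting a server.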
Nirmitee.io
4 recruiters
Posted by Gitashri K
Pune
4 - 10 yrs
₹2L - ₹8L / yr
Python
Django
Flask

Overall 4+ years of IT experience

Minimum 4 years of hands-on experience in Python and backend development with exposure to cloud technologies

Key Skills & Responsibilities:


Python Proficiency:


Strong hands-on experience in Python with knowledge of frameworks like Django or Flask.

Ability to write clean, efficient, and maintainable code.

Cloud Exposure:


Familiarity with cloud platforms (AWS, Azure, or GCP).

Hands-on experience with basic services like virtual machines, serverless functions (e.g., AWS Lambda), or storage solutions.

Backend Development:


Proficiency in developing and consuming RESTful APIs.

Knowledge of additional communication protocols (e.g., GraphQL, WebSockets) is a plus.

Database Management:


Experience with relational databases (e.g., PostgreSQL, MySQL).

Basic knowledge of NoSQL databases like MongoDB or DynamoDB.

System Design & Problem-Solving:


Ability to understand and implement scalable backend solutions.

Basic exposure to distributed systems or event-driven architectures (e.g., Kafka, RabbitMQ) is a bonus.

Collaboration & Communication:


Good communication skills to work effectively in a team.

Willingness to learn and adapt to new technologies.

Security & Best Practices:


Awareness of secure coding practices and API security (e.g., OAuth, JWT).

Good-to-Have Skills:


Basic understanding of full-stack development, including front-end technologies (React, Angular, Vue.js).

Familiarity with containerization (Docker) and CI/CD pipelines.

Knowledge of caching strategies using Redis or Memcached.
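The API-security item above names OAuth and JWT; as an educational sketch of how an HS256 JWT signature is formed and verified, here is a stdlib-only version. The secret and claims are made up, and a real service should use a vetted library (e.g. PyJWT) rather than hand-rolling this.

```python
import base64, hashlib, hmac, json

# Educational JWT HS256 sketch: sign base64url(header).base64url(payload)
# with HMAC-SHA256. Secret and claims are made up; use a vetted library
# (e.g. PyJWT) in production.

def b64url(data: bytes) -> bytes:
    return base64.urlsafe_b64encode(data).rstrip(b"=")

def sign(payload: dict, secret: bytes) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = header + b"." + body
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return (signing_input + b"." + sig).decode()

def verify(token: str, secret: bytes) -> bool:
    signing_input, _, sig = token.encode().rpartition(b".")
    expected = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)  # constant-time comparison

token = sign({"sub": "user-42"}, b"demo-secret")
print(verify(token, b"demo-secret"), verify(token, b"wrong"))  # True False
```

The `compare_digest` call is the detail interviewers tend to probe: a naive `==` comparison can leak timing information about the signature.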

Nirmitee.io
4 recruiters
Posted by Disha Karia
Pune
5 - 10 yrs
₹8L - ₹15L / yr
Python
PySpark
Amazon Web Services (AWS)
CI/CD
GitHub

About the Role:

We are seeking a skilled Python Backend Developer to join our dynamic team. This role focuses on designing, building, and maintaining efficient, reusable, and reliable code that supports both monolithic and microservices architectures. The ideal candidate will have a strong understanding of backend frameworks and architectures, proficiency in asynchronous programming, and familiarity with deployment processes. Experience with AI model deployment is a plus.

Overall 5+ years of IT experience, with a minimum of 5 years of experience in Python and an open-source web framework (Django), plus AWS experience.


Key Responsibilities:

- Develop, optimize, and maintain backend systems using Python, PySpark, and FastAPI.

- Design and implement scalable architectures, including both monolithic and microservices.

- 3+ years of working experience in AWS (Lambda, Serverless, Step Functions, and EC2)

- Deep knowledge of the Python Flask/Django frameworks

- Good understanding of REST APIs

- Sound knowledge of databases

- Excellent problem-solving and analytical skills

- Leadership skills, good communication skills, and an interest in learning modern technologies

- Apply design patterns (MVC, Singleton, Observer, Factory) to solve complex problems effectively.

- Work with web servers (Nginx, Apache) and deploy web applications and services.

- Create and manage RESTful APIs; familiarity with GraphQL is a plus.

- Use asynchronous programming techniques (ASGI, WSGI, async/await) to enhance performance.

- Integrate background job processing with Celery and RabbitMQ, and manage caching mechanisms using Redis and Memcached.

- (Optional) Develop containerized applications using Docker and orchestrate deployments with Kubernetes.
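The asynchronous-programming responsibility above (ASGI, async/await) can be sketched with a minimal, self-contained example: three simulated I/O-bound calls (sleeps standing in for API or database requests) run concurrently, so total wall time tracks the slowest call rather than the sum of all three. The task names and delays are made up.

```python
import asyncio, time

# async/await sketch: three simulated I/O calls run concurrently on the
# event loop, so elapsed time ~= the slowest call (0.2s), not the 0.45s
# serial total. Names and delays are hypothetical.

async def fetch(name: str, delay: float) -> str:
    await asyncio.sleep(delay)  # stand-in for awaiting a real I/O call
    return f"{name}: done"

async def main() -> list[str]:
    # gather schedules all coroutines concurrently
    return await asyncio.gather(
        fetch("orders", 0.2), fetch("users", 0.1), fetch("stock", 0.15)
    )

start = time.perf_counter()
results = asyncio.run(main())
elapsed = time.perf_counter() - start
print(results)        # results come back in the order submitted
print(elapsed < 0.4)  # concurrent, so well under the serial total
```

This is the same mechanism ASGI servers exploit to serve many slow requests per worker, where a WSGI worker would block on each one.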


Required Skills:

- Languages & Frameworks: Python, Django, AWS

- Backend Architecture & Design: Strong knowledge of monolithic and microservices architectures, design patterns, and asynchronous programming.

- Web Servers & Deployment: Proficient in Nginx and Apache, with experience in RESTful API design and development. GraphQL experience is a plus.

- Background Jobs & Task Queues: Proficiency in Celery and RabbitMQ, with experience in caching (Redis, Memcached).

- Additional Qualifications: Knowledge of Docker and Kubernetes (optional), with any exposure to AI model deployment considered a bonus.


Qualifications:

- Bachelor’s degree in Computer Science, Engineering, or a related field.

- 5+ years of experience in backend development using Python, Django, and AWS.

- Demonstrated ability to design and implement scalable and robust architectures.

- Strong problem-solving skills, attention to detail, and a collaborative mindset.


Preferred:

- Experience with Docker/Kubernetes for containerization and orchestration.

- Exposure to AI model deployment processes.

Sarvaha Systems Private Limited
Posted by Eman Khan
Pune
5 - 15 yrs
₹20L - ₹40L / yr
Python
JavaScript
TypeScript
Playwright
Selenium
+8 more

Sarvaha would like to welcome a talented Software Development Engineer in Test (SDET) with a minimum of 5 years of experience to join our team. As an SDET, you will champion the quality of the product and will design, develop, and maintain modular, extensible, and reusable test cases/scripts. This is a hands-on role that requires you to work with automation test developers and application developers to enhance the quality of the products and development practices. Please visit our website at http://www.sarvaha.com to know more about us.


Key Responsibilities

  • Understand requirements through specification or exploratory testing, estimate QA efforts, design test strategy, develop optimal test cases, maintain RTM
  • Design, develop & maintain a scalable test automation framework
  • Build interfaces to seamlessly integrate testing with development environments.
  • Create & manage test setups that prioritize scalability, remote accessibility and reliability.
  • Automate test scripts, create and execute relevant test suites, analyze test results, and enhance existing scripts or build new ones for coverage. Communicate with stakeholders for requirements, troubleshooting, etc.; provide visibility into the work by sharing relevant reports and metrics
  • Stay up-to-date with industry best practices in testing methodologies and technologies to advise QA and integration teams.


Skills Required

  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field (Software Engineering preferred).
  • Minimum of 5 years of experience in testing enterprise-grade, highly scalable, distributed applications, products, and services.
  • Expertise in manual and automation testing, with an excellent understanding of test methodologies, test design techniques, and the test life cycle.
  • Strong programming skills in TypeScript and Python, with experience using Playwright for building hybrid/BDD frameworks for website and API automation
  • Very good problem-solving and analytical skills.
  • Experience with databases, both SQL and NoSQL
  • Practical experience in setting up CI/CD pipelines (ideally with Jenkins).
  • Exposure to Docker, Kubernetes and EKS is highly desired.
  • C# experience is an added advantage. 
  • A continuous learning mindset and a passion for exploring new technologies.
  • Excellent communication, collaboration, quick learning of needed language/scripting and influencing skills.
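The test-design techniques mentioned above can be illustrated with a small data-driven, boundary-value example. The function under test (a hypothetical discount rule) and its cases are made up; `subTest` keeps each case reported independently, which is the point of the data-driven style.

```python
import unittest

# Data-driven boundary-value test sketch. The discount rule is hypothetical;
# the cases probe just below, at, and just above the boundary.

def discount(order_total: float) -> float:
    """Hypothetical rule: 10% off orders of 100 or more, else no discount."""
    return round(order_total * 0.9, 2) if order_total >= 100 else order_total

class TestDiscountBoundaries(unittest.TestCase):
    CASES = [  # (input, expected) pairs around the boundary at 100
        (99.99, 99.99),   # just below threshold: unchanged
        (100.00, 90.00),  # exactly at threshold: discounted
        (100.01, 90.01),  # just above threshold: discounted
    ]

    def test_boundaries(self):
        for total, expected in self.CASES:
            with self.subTest(total=total):  # each case fails independently
                self.assertAlmostEqual(discount(total), expected, places=2)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestDiscountBoundaries)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

The same table-of-cases shape carries over directly to Playwright/pytest parameterized tests, which this role would use day to day.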


Position Benefits

  • Competitive salary and excellent growth opportunities within a dynamic team.
  • Positive and collaborative work environment with the opportunity to learn from talented colleagues.
  • Highly challenging and rewarding software development problems to solve.
  • Hybrid work model with established remote work options.
NonStop io Technologies Pvt Ltd
Posted by Kalyani Wadnere
Pune
2 - 3 yrs
Best in industry
NodeJS (Node.js)
MongoDB
Mongoose
Express
Go Programming (Golang)
+6 more

We're seeking an experienced Backend Software Engineer to join our team.

As a backend engineer, you will be responsible for designing, developing, and deploying scalable backends for the products we build at NonStop.

This includes APIs, databases, and server-side logic.


Responsibilities:

  • Design, develop, and deploy backend systems, including APIs, databases, and server-side logic
  • Write clean, efficient, and well-documented code that adheres to industry standards and best practices
  • Participate in code reviews and contribute to the improvement of the codebase
  • Debug and resolve issues in the existing codebase
  • Develop and execute unit tests to ensure high code quality
  • Work with DevOps engineers to ensure seamless deployment of software changes
  • Monitor application performance, identify bottlenecks, and optimize systems for better scalability and efficiency
  • Stay up-to-date with industry trends and emerging technologies; advocate for best practices and new ideas within the team
  • Collaborate with cross-functional teams to identify and prioritize project requirements

Requirements:

  • At least 2 years of experience building scalable and reliable backend systems
  • Strong proficiency in at least one programming language such as Python, Node.js, Golang, or RoR
  • Experience with at least one framework such as Django, Express, or gRPC
  • Knowledge of database systems such as MySQL, PostgreSQL, MongoDB, Cassandra, or Redis
  • Familiarity with containerization technologies such as Docker and Kubernetes
  • Understanding of software development methodologies such as Agile and Scrum
  • Ability to pick up a new technology stack and ramp up on it fairly quickly
  • Bachelor's/Master's degree in Computer Science or a related field
  • Strong problem-solving skills and ability to collaborate effectively with cross-functional teams
  • Good written and verbal communication skills in English
Nirmitee.io
4 recruiters
Posted by Disha Karia
Pune
8 - 12 yrs
₹18L - ₹22L / yr
React.js
SQL
NodeJS (Node.js)
NextJs (Next.js)
Java
+1 more

We’re looking for a Tech Lead with expertise in ReactJS (Next.js), backend technologies, and database management to join our dynamic team.

Key Responsibilities:

  • Lead and mentor a team of 4-6 developers.
  • Architect and deliver innovative, scalable solutions.
  • Ensure seamless performance while handling large volumes of data without system slowdowns.
  • Collaborate with cross-functional teams to meet business goals.

Required Expertise:

  • Frontend: ReactJS (Next.js is a must).
  • Backend: Experience in Node.js, Python, or Java.
  • Databases: SQL (mandatory), MongoDB (nice to have).
  • Caching & Messaging: Redis, Kafka, or Cassandra experience is a plus.
  • Proven experience in system design and architecture.
  • Cloud certification is a bonus.


Pune
4 - 7 yrs
₹18L - ₹30L / yr
Large Language Models (LLM)
Python
Docker
Retrieval Augmented Generation (RAG)
SQL
+7 more

Job Description

Phonologies is seeking a Senior Data Engineer to lead data engineering efforts for developing and deploying generative AI and large language models (LLMs). The ideal candidate will excel in building data pipelines, fine-tuning models, and optimizing infrastructure to support scalable AI systems for enterprise applications.


Role & Responsibilities

  • Data Pipeline Management: Design and manage pipelines for AI model training, ensuring efficient data ingestion, storage, and transformation for real-time deployment.
  • LLM Fine-Tuning & Model Lifecycle: Fine-tune LLMs on domain-specific data, and oversee the model lifecycle using tools like MLFlow and Weights & Biases.
  • Scalable Infrastructure: Optimize infrastructure for large-scale data processing and real-time LLM performance, leveraging containerization and orchestration in hybrid/cloud environments.
  • Data Management: Ensure data quality, security, and compliance, with workflows for handling sensitive and proprietary datasets.
  • Continuous Improvement & MLOps: Apply MLOps/LLMOps practices for automation, versioning, and lifecycle management, while refining tools and processes for scalability and performance.
  • Collaboration: Work with data scientists, engineers, and product teams to integrate AI solutions and communicate technical capabilities to business stakeholders.


Preferred Candidate Profile

  • Experience: 5+ years in data engineering, focusing on AI/ML infrastructure, LLM fine-tuning, and deployment.
  • Technical Skills: Advanced proficiency in Python, SQL, and distributed data tools.
  • Model Management: Hands-on experience with MLFlow, Weights & Biases, and model lifecycle management.
  • AI & NLP Expertise: Familiarity with LLMs (e.g., GPT, BERT) and NLP frameworks like Hugging Face Transformers.
  • Cloud & Infrastructure: Strong skills with AWS, Azure, Google Cloud, Docker, and Kubernetes.
  • MLOps/LLMOps: Expertise in versioning, CI/CD, and automating AI workflows.
  • Collaboration & Communication: Proven ability to work with cross-functional teams and explain technical concepts to non-technical stakeholders.
  • Education: Degree in Computer Science, Data Engineering, or related field.

Perks and Benefits

  • Competitive Compensation: INR 20L to 30L per year.
  • Innovative Work Environment for Personal Growth: Work with cutting-edge AI and data engineering tools in a collaborative setting, for continuous learning in data engineering and AI.


Fractal Analytics
5 recruiters
Posted by Eman Khan
Bengaluru (Bangalore), Hyderabad, Gurugram, Noida, Mumbai, Pune, Coimbatore, Chennai
3 - 5.5 yrs
₹18L - ₹25L / yr
MLOps
MLFlow
kubeflow
Machine Learning (ML)
Python
+6 more

Building the machine learning production system (or MLOps) is the biggest challenge most large companies currently have in making the transition to becoming an AI-driven organization. This position is an opportunity for an experienced, server-side developer to build expertise in this exciting new frontier. You will be part of a team deploying state-of-the-art AI solutions for Fractal clients.


Responsibilities

As MLOps Engineer, you will work collaboratively with Data Scientists and Data engineers to deploy and operate advanced analytics machine learning models. You’ll help automate and streamline Model development and Model operations. You’ll build and maintain tools for deployment, monitoring, and operations. You’ll also troubleshoot and resolve issues in development, testing, and production environments.

  • Enable Model tracking, model experimentation, Model automation
  • Develop ML pipelines to support
  • Develop MLOps components in Machine learning development life cycle using Model Repository (either of): MLFlow, Kubeflow Model Registry
  • Develop MLOps components in Machine learning development life cycle using Machine Learning Services (either of): Kubeflow, DataRobot, HopsWorks, Dataiku or any relevant ML E2E PaaS/SaaS
  • Work across all phases of Model development life cycle to build MLOPS components
  • Build the knowledge base required to deliver increasingly complex MLOPS projects on Azure
  • Be an integral part of client business development and delivery engagements across multiple domains
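The model-tracking responsibility above is normally handled by MLFlow or a Kubeflow Model Registry, which are client-server systems; as a hedged, stand-alone sketch of the core idea such a component records, here is a tiny in-memory run tracker. All run ids, parameters, and metrics are made up.

```python
# Minimal stand-in for what an experiment tracker (e.g. MLFlow) records:
# each training run gets an id, its parameters, and its metrics, so runs
# can be compared and the best one promoted. Values are hypothetical.

class RunTracker:
    def __init__(self):
        self.runs = []

    def log_run(self, params: dict, metrics: dict) -> str:
        run_id = f"run-{len(self.runs) + 1}"
        self.runs.append({"id": run_id, "params": params, "metrics": metrics})
        return run_id

    def best_run(self, metric: str, higher_is_better: bool = True) -> dict:
        pick = max if higher_is_better else min
        return pick(self.runs, key=lambda r: r["metrics"][metric])

tracker = RunTracker()
tracker.log_run({"lr": 0.1, "depth": 4}, {"auc": 0.81})
tracker.log_run({"lr": 0.05, "depth": 6}, {"auc": 0.86})
tracker.log_run({"lr": 0.01, "depth": 8}, {"auc": 0.84})
print(tracker.best_run("auc")["id"])  # run-2
```

Real trackers add the parts this sketch omits: persistent storage, artifact logging, and a registry stage ("staging"/"production") for promoting the winning model.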


Required Qualifications

  • 3-5 years experience building production-quality software.
  • B.E/B.Tech/M.Tech in Computer Science or related technical degree OR Equivalent
  • Strong experience in System Integration, Application Development or Data Warehouse projects across technologies used in the enterprise space
  • Knowledge of MLOps, machine learning, and Docker
  • Object-oriented languages (e.g. Python, PySpark, Java, C#, C++)
  • CI/CD experience (e.g. Jenkins, GitHub Actions)
  • Database programming using any flavors of SQL
  • Knowledge of Git for Source code management
  • Ability to collaborate effectively with highly technical resources in a fast-paced environment
  • Ability to solve complex challenges/problems and rapidly deliver innovative solutions
  • Foundational Knowledge of Cloud Computing on Azure
  • Hunger and passion for learning new skills
Fractal Analytics
5 recruiters
Posted by Eman Khan
Bengaluru (Bangalore), Gurugram, Mumbai, Hyderabad, Pune, Noida, Coimbatore, Chennai
6 - 9 yrs
₹25L - ₹38L / yr
MLOps
MLFlow
kubeflow
Machine Learning (ML)
Python
+6 more

Building the machine learning production system (or MLOps) is the biggest challenge most large companies currently have in making the transition to becoming an AI-driven organization. This position is an opportunity for an experienced, server-side developer to build expertise in this exciting new frontier. You will be part of a team deploying state-of-the-art AI solutions for Fractal clients.


Responsibilities

As MLOps Engineer, you will work collaboratively with Data Scientists and Data engineers to deploy and operate advanced analytics machine learning models. You’ll help automate and streamline Model development and Model operations. You’ll build and maintain tools for deployment, monitoring, and operations. You’ll also troubleshoot and resolve issues in development, testing, and production environments.

  • Enable Model tracking, model experimentation, Model automation
  • Develop scalable ML pipelines
  • Develop MLOps components in Machine learning development life cycle using Model Repository (either of): MLFlow, Kubeflow Model Registry
  • Machine Learning Services (either of): Kubeflow, DataRobot, HopsWorks, Dataiku or any relevant ML E2E PaaS/SaaS
  • Work across all phases of Model development life cycle to build MLOPS components
  • Build the knowledge base required to deliver increasingly complex MLOPS projects on Azure
  • Be an integral part of client business development and delivery engagements across multiple domains


Required Qualifications

  • 5.5-9 years' experience building production-quality software
  • B.E/B.Tech/M.Tech in Computer Science or related technical degree OR equivalent
  • Strong experience in System Integration, Application Development or Data Warehouse projects across technologies used in the enterprise space
  • Expertise in MLOps, machine learning, and Docker
  • Object-oriented languages (e.g. Python, PySpark, Java, C#, C++)
  • Experience developing CI/CD components for production-ready ML pipelines
  • Database programming using any flavor of SQL
  • Knowledge of Git for source code management
  • Ability to collaborate effectively with highly technical resources in a fast-paced environment
  • Ability to solve complex challenges/problems and rapidly deliver innovative solutions
  • Team handling, problem-solving, project management, communication skills, and creative thinking
  • Foundational knowledge of Cloud Computing on Azure
  • Hunger and passion for learning new skills
Fractal Analytics
5 recruiters
Posted by Eman Khan
Bengaluru (Bangalore), Hyderabad, Gurugram, Noida, Mumbai, Pune, Chennai, Coimbatore
5.5 - 9 yrs
₹25L - ₹36L / yr
LangChain
Large Language Models (LLM)
Retrieval Augmented Generation (RAG)
Artificial Intelligence (AI)
+8 more

Responsibilities

  • Design and implement advanced solutions utilizing Large Language Models (LLMs).
  • Demonstrate self-driven initiative by taking ownership and creating end-to-end solutions.
  • Conduct research and stay informed about the latest developments in generative AI and LLMs.
  • Develop and maintain code libraries, tools, and frameworks to support generative AI development.
  • Participate in code reviews and contribute to maintaining high code quality standards.
  • Engage in the entire software development lifecycle, from design and testing to deployment and maintenance.
  • Collaborate closely with cross-functional teams to align messaging, contribute to roadmaps, and integrate software into different repositories for core system compatibility.
  • Possess strong analytical and problem-solving skills.
  • Demonstrate excellent communication skills and the ability to work effectively in a team environment.


Primary Skills

  • Generative AI: Proficiency with SaaS LLMs, including LangChain, LlamaIndex, vector databases, and prompt engineering (CoT, ToT, ReAct, agents). Experience with Azure OpenAI, Google Vertex AI, and AWS Bedrock for text/audio/image/video modalities.
  • Familiarity with open-source LLMs, including tools like TensorFlow/PyTorch and Hugging Face. Techniques such as quantization, LLM fine-tuning using PEFT, RLHF, data annotation workflows, and GPU utilization.
  • Cloud: Hands-on experience with cloud platforms such as Azure, AWS, and GCP. Cloud certification is preferred.
  • Application Development: Proficiency in Python, Docker, FastAPI/Django/Flask, and Git.
  • Natural Language Processing (NLP): Hands-on experience in use case classification, topic modeling, Q&A and chatbots, search, Document AI, summarization, and content generation.
  • Computer Vision and Audio: Hands-on experience in image classification, object detection, segmentation, image generation, audio, and video analysis.
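The RAG skill named above can be reduced to its simplest form for illustration: score documents against the query, then prepend the best match to the prompt. Production systems use embedding vectors and a vector database, as the listing itself notes; the word-overlap scoring and documents below are only a hypothetical stand-in.

```python
# The retrieval step of a RAG pipeline, reduced to word-overlap scoring.
# Real systems embed query and documents and search a vector database;
# all text here is made up for illustration.

def overlap_score(query: str, doc: str) -> int:
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    return sorted(docs, key=lambda d: overlap_score(query, d), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "Refunds are processed within 5 business days.",
    "Our stores are open from 10am to 9pm.",
    "Warranty claims require the original invoice.",
]
print(build_prompt("How long do refunds take?", docs))
```

Swapping `overlap_score` for cosine similarity over embeddings turns this sketch into the standard retrieve-then-generate pattern the listing describes.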
Gyaan AI Private Limited
2 candid answers
Posted by Arwa Virpurwala
Pune
2 - 8 yrs
₹10L - ₹25L / yr
Python
Databases
Django
Relational Database (RDBMS)
Redis
+1 more

About Gyaan:

Gyaan empowers Go-To-Market teams to ascend to new heights in their sales performance, unlocking boundless opportunities for growth. We're passionate about helping sales teams excel beyond expectations. Our pride lies in assembling an unparalleled team and crafting a crucial solution that becomes an indispensable tool for our users. With Gyaan, sales excellence becomes an attainable reality.


About the Job:

Gyaan is seeking an experienced backend developer with expertise in Python, Django, AWS, and Redis to join our dynamic team! As a backend developer, you will be responsible for building responsive and scalable applications using Python, Django, and associated technologies.


Required Qualifications:

  • 2+ years of hands-on experience programming in Python, Django
  • Good understanding of CI/CD tools (Github Action, Gitlab CI) in a SaaS environment.
  • Experience in building and running modern full-stack cloud applications using public cloud technologies such as AWS.
  • Proficiency with at least one relational database system like MySQL, Oracle, or PostgreSQL.
  • Experience with unit and integration testing.
  • Effective communication skills, both written and verbal, to convey complex problems across different levels of the organization and to customers.
  • Familiarity with Agile methodologies, software design lifecycle, and design patterns.
  • Detail-oriented mindset to identify and rectify errors in code or product development workflow.
  • Willingness to learn new technologies and concepts quickly, as the "cloud-native" field evolves rapidly.


Must Have Skills:

  • Python
  • Django Framework
  • AWS
  • Redis
  • Database Management


Qualifications:

  • Bachelor’s degree in Computer Science or equivalent experience.


If you are passionate about solving problems and have the required qualifications, we want to hear from you! You must be an excellent verbal and written communicator, enjoy collaborating with others, and welcome discussing a plan upfront. We offer a competitive salary, flexible work hours, and a dynamic work environment.


Nirmitee.io

at Nirmitee.io

4 recruiters
Gitashri K
Posted by Gitashri K
Pune
2 - 5 yrs
₹2L - ₹11L / yr
DevOps
Python

Job Title: DevOps Engineer

Job Type: Full-Time

About Us: Nirmitee.io is a leading IT company dedicated to delivering innovative solutions and services. We are looking for a talented DevOps Engineer with strong Python skills to join our dynamic team.

Job Description:

Responsibilities:

  • Design, implement, and maintain CI/CD pipelines to ensure smooth deployment processes.
  • Automate infrastructure provisioning, configuration, and deployment using tools like Terraform, Ansible, or similar.
  • Develop and maintain scripts and tools for system management, monitoring, and automation using Python.
  • Collaborate with development teams to ensure seamless integration and deployment of applications.
  • Monitor system performance, troubleshoot issues, and ensure high availability and reliability of services.
  • Implement and manage containerization technologies such as Docker and orchestration tools like Kubernetes.
  • Ensure security best practices are followed in all aspects of the infrastructure and deployment processes.
  • Participate in on-call rotations and respond to incidents as needed.
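As a flavor of the "scripts and tools for system management, monitoring, and automation" bullet, here is a small monitoring helper of the kind such automation typically starts from; the paths and threshold values are examples only:

```python
import shutil

# Minimal disk-usage monitor: report mount points above a threshold.

def disk_usage_percent(path: str) -> float:
    usage = shutil.disk_usage(path)
    return 100.0 * usage.used / usage.total

def check_disks(paths, threshold: float = 90.0):
    """Return a list of (path, percent) tuples exceeding the threshold."""
    alerts = []
    for path in paths:
        pct = disk_usage_percent(path)
        if pct >= threshold:
            alerts.append((path, round(pct, 1)))
    return alerts
```

In practice such a check would be wired into cron or an exporter scraped by Prometheus, with alerts routed through the on-call rotation.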

Requirements:

  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • Proven experience as a DevOps Engineer or in a similar role.
  • Strong proficiency in Python for scripting and automation.
  • Experience with CI/CD tools such as Jenkins, GitLab CI, or CircleCI.
  • Hands-on experience with infrastructure as code tools like Terraform, Ansible, or Chef.
  • Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.
  • Knowledge of containerization and orchestration tools like Docker and Kubernetes.
  • Understanding of networking, security, and system administration.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and collaboration skills.

Preferred Qualifications:

  • Experience with monitoring and logging tools such as Prometheus, Grafana, or ELK stack.
  • Knowledge of database management and SQL.
  • Familiarity with Agile and DevOps methodologies.
  • Certification in AWS, Azure, or Google Cloud is a plus.
Lean Technologies

at Lean Technologies

2 candid answers
1 product
Reshika Mendiratta
Posted by Reshika Mendiratta
Pune
5 - 9 yrs
Up to ₹55L / yr (varies)
Test Automation (QA)
Python
JavaScript
TypeScript
Selenium
+3 more

About Lean Technologies

Lean Technologies is transforming the financial landscape by delivering open banking solutions across the MENA region. Our products have received overwhelmingly positive feedback from both developers and customers, and our recent $33 million Series A round, led by Sequoia, marks our commitment to further expansion and innovation in the GCC. We're dedicated to enabling the next generation of financial innovation and are constantly looking for driven, entrepreneurial individuals to join us on this exciting journey.


About the Role:

As a Senior SDET, you will play a critical role in ensuring the reliability and high performance of our open banking systems. This position involves extensive collaboration with development teams to guarantee the delivery of top-quality products. You will be responsible for creating and executing comprehensive test plans, contributing to the development and maintenance of our in-house automation framework, and analyzing test results to identify potential issues.


Key Responsibilities:

  • Implement, maintain, and adapt automation frameworks for open banking solutions.
  • Collaborate closely with all stakeholders to understand the full context of deliveries and translate complex functional and non-functional requirements into actionable tasks.
  • Ensure Quality Guidelines are met during the development cycle.
  • Monitor and report on test results, identifying potential performance issues.
  • Identify and recommend improvements to optimize the performance and scalability of open banking systems.
  • Participate in code reviews.


Requirements:

  • Bachelor’s degree in Computer Science, Software Engineering, or a related field.
  • Minimum 5 years of experience in the testing domain, with a focus on automation.
  • Proven experience as an Automation Engineer or SDET, with a track record of building and maintaining scalable, reliable test frameworks.
  • Strong programming skills in languages such as JavaScript, TypeScript, and Python.
  • Experience with automated testing tools and frameworks (e.g., WebdriverIO, Selenium).
  • A passion for testing, innovation, and a demonstrated ability to work independently.
  • Experience with CI/CD tools like Jenkins.
  • Excellent verbal, written, and interpersonal skills.
  • Knowledge of monitoring and observability tools such as Grafana and Kibana is a plus.
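As a flavor of the framework-building work described above, the Page Object pattern can be sketched with a stubbed driver, so the structure is visible without a browser. The StubDriver and selector names below are illustrative; a real suite would drive WebdriverIO or Selenium sessions instead:

```python
# Page Object pattern with a stubbed driver -- illustrative only.

class StubDriver:
    """Stands in for a WebDriver session; records actions for assertions."""
    def __init__(self):
        self.actions = []
    def fill(self, selector: str, value: str):
        self.actions.append(("fill", selector, value))
    def click(self, selector: str):
        self.actions.append(("click", selector))

class LoginPage:
    # Selectors live in one place, so UI changes touch one class, not every test.
    USERNAME, PASSWORD, SUBMIT = "#user", "#pass", "#submit"

    def __init__(self, driver):
        self.driver = driver

    def login(self, username: str, password: str):
        self.driver.fill(self.USERNAME, username)
        self.driver.fill(self.PASSWORD, password)
        self.driver.click(self.SUBMIT)
```

Tests then read as intent (`LoginPage(driver).login(...)`) rather than as raw selector manipulation, which is what keeps large automation frameworks maintainable.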


Why Join Us?

Lean is at the forefront of financial innovation, and this is just the beginning. We are committed to expanding our impact across the region and are always on the lookout for talented individuals passionate about solving complex problems. In addition to competitive compensation, we offer:

  • Private healthcare
  • Flexible office hours
  • Meaningful equity stakes
AbleCredit

at AbleCredit

2 candid answers
Utkarsh Apoorva
Posted by Utkarsh Apoorva
Bengaluru (Bangalore), Pune
2 - 5 yrs
₹15L - ₹30L / yr
Python
PyTorch
Shell Scripting

Salary: INR 15 to INR 30 lakhs per annum

Performance Bonus: Up to 10% of the base salary can be added

Location: Bangalore or Pune

Experience: 2-5 years


About AbleCredit:

AbleCredit is on a mission to solve the Credit Gap of emerging economies. In India alone, the Credit Gap is over USD 5 trillion. This is the single largest contributor to poverty, a poor Gini index, and lack of opportunities. Our vision is to deploy AI reliably and safely to solve some of the greatest problems of humanity.



Job Description:

This role is ideal for someone with a strong foundation in deep learning and hands-on experience with AI technologies.


  • You will be tasked with solving complex, real-world problems using advanced machine learning models in a privacy-sensitive domain, where your contributions will have a direct impact on business-critical processes.
  • As a Machine Learning Engineer at AbleCredit, you will collaborate closely with the founding team, who bring decades of industry expertise to the table.
  • You’ll work on deploying cutting-edge Generative AI solutions at scale, ensuring they align with strict privacy requirements and optimize business outcomes.


This is an opportunity for experienced engineers to bring creative AI solutions to one of the most challenging and evolving sectors, while making a significant difference to the company’s growth and success.



Requirements:

  • Experience: 2-4 years of hands-on experience in applying machine learning and deep learning techniques to solve complex business problems.
  • Technical Skills: Proficiency in standard ML tools and languages, including:
  • Python: Strong coding ability for building, training, and deploying machine learning models.
  • PyTorch (or MLX or Jax): Solid experience in one or more deep learning frameworks for developing and fine-tuning models.
  • Shell scripting: Familiarity with Unix/Linux shell scripting for automation and system-level tasks.
  • Mathematical Foundation: Good understanding of the mathematical principles behind machine learning and deep learning (linear algebra, calculus, probability, optimization).
  • Problem Solving: A passion for solving tough, ambiguous problems using AI, especially in data-sensitive, large-scale environments.
  • Privacy & Security: Awareness and understanding of working in privacy-sensitive domains, adhering to best practices in data security and compliance.
  • Collaboration: Ability to work closely with cross-functional teams, including engineers, product managers, and business stakeholders, and communicate technical ideas effectively.
  • Work Experience: This position is for experienced candidates only.
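The mathematical-foundation bullet (calculus, linear algebra, optimization) is exactly what frameworks like PyTorch automate via autograd. A hand-rolled illustration in plain Python, gradient descent on 1-D least squares, with illustrative data and hyperparameters:

```python
# Gradient descent on 1-D least squares, written out by hand -- the
# optimization loop that deep learning frameworks automate.

def fit_line(xs, ys, lr=0.01, steps=2000):
    """Fit y = w*x + b by minimizing mean squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # dL/dw and dL/db for L = (1/n) * sum((w*x + b - y)^2)
        grad_w = (2.0 / n) * sum((w * x + b - y) * x for x, y in zip(xs, ys))
        grad_b = (2.0 / n) * sum((w * x + b - y) for x, y in zip(xs, ys))
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b
```

On data generated from y = 2x + 1 this converges to w ≈ 2 and b ≈ 1; in PyTorch the two gradient lines would be replaced by `loss.backward()` and an optimizer step.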


Additional Information:

  • Location: Pune or Bangalore
  • Work Environment: Collaborative and entrepreneurial, with close interactions with the founders.
  • Growth Opportunities: Exposure to large-scale AI systems, GenAI, and working in a data-driven privacy-sensitive domain.
  • Compensation: Competitive salary and ESOPs, based on experience and performance
  • Industry Impact: You’ll be at the forefront of applying Generative AI to solve high-impact problems in the finance/credit space, helping shape the future of AI in the business world.
Pune, Hybrid
3 - 5 yrs
₹8L - ₹16L / yr
Python
React.js
AngularJS (1.x)
HTML/CSS
NodeJS (Node.js)

About the Company:

We are a dynamic and innovative company committed to delivering exceptional solutions that empower our clients to succeed. With our headquarters in the UK and a global footprint across the US, Noida, and Pune in India, we bring a decade of expertise to every endeavour, driving real results. We take a holistic approach to project delivery, providing end-to-end services that encompass everything from initial discovery and design to implementation, change management, and ongoing support. Our goal is to help clients leverage the full potential of the Salesforce platform to achieve their business objectives.

What Makes VE3 The Best For You: We think of your family as our family, no matter the shape or size. We offer maternity leave, PF fund contributions, and a 5-day working week, along with a generous paid time off program that helps balance your work and personal life.


Job Overview:

We are looking for a talented and experienced Senior Full Stack Web Developer who will be responsible for designing, developing, and implementing software solutions. As a part of our innovative team in Pune, you will work closely with global teams, transforming requirements into technical solutions while maintaining a strong focus on quality and efficiency.


Requirements


Key Responsibilities:

Software Design & Development:

Design software solutions based on requirements and within the constraints of architectural and design guidelines.

Derive software requirements, validate software specifications, and conduct feasibility analysis.

Accurately translate software architecture into design and code.

Technical Leadership:

Guide Scrum team members on design topics and ensure consistency against the design and architecture.

Lead the team in test automation design and implementation.

Identify opportunities for harmonization and reuse of components/technology.

Coding & Implementation:

Actively participate in coding features and bug-fixing, ensuring adherence to coding and quality guidelines.

Lead by example in delivering solutions for self-owned components.

Collaboration & Coordination:

Collaborate with globally distributed teams to develop scalable and high-quality software products.

Ensure seamless integration and communication across multiple locations.


Required Skills and Qualifications:

Education: Bachelor's degree in Engineering or a related technical field.

Experience: 4-5 years of experience in software design and development.

Technical Skills:


Backend Development:

Strong experience in microservices API development using Java, Python, Spring Cloud.

Proficiency in build tools like Maven.


Frontend Development:

Expertise in full stack web development using JavaScript, Angular, React JS, NodeJS, HTML5, and CSS3.


Database Knowledge:

Working knowledge of Oracle/PostgreSQL databases.

Cloud & DevOps:

Hands-on experience with AWS (Lambda, API Gateway, S3, EC2, EKS).

Exposure to CI/CD tools, code analysis, and test automation.

Operating Systems:

Proficiency in both Windows and Unix-based environments.
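For the AWS Lambda and API Gateway items above, the basic handler shape (API Gateway proxy-integration event format) looks like the following sketch; the event field names are the documented ones, but the business logic is purely illustrative:

```python
import json

# Shape of an AWS Lambda handler behind API Gateway (proxy integration).

def handler(event, context):
    # API Gateway delivers query parameters under this key; it can be None.
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello {name}"}),
    }
```

Because the handler is a plain function, it can be unit-tested by invoking it directly with a dict, with no AWS account involved.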

Nice to Have:

Experience with Terraform for infrastructure automation.

Soft Skills:

Individual Contributor: Ability to work independently while being a strong team player.

Problem-Solving: Strong analytical skills to identify issues and implement effective solutions.

Communication: Excellent verbal and written communication skills for collaboration with global teams.


Benefits

  • Competitive salary and benefits package.
  • Unlimited opportunities for professional growth and development.
  • Collaborative and supportive work environment.
  • Flexible working hours
  • Work-life balance
  • Onsite opportunities
  • Retirement Plans
  • Team Building activities
  • Visit us @ https://www.ve3.global


Nirmitee.io

at Nirmitee.io

4 recruiters
Gitashri K
Posted by Gitashri K
Pune
3 - 7 yrs
₹3L - ₹7L / yr
Vue.js
AngularJS (1.x)
Angular (2+)
React.js
JavaScript
+2 more

Job Purpose:


To develop and maintain robust, scalable Python-based applications at Nirmitee.io, contributing to our product engineering and healthcare technology solutions with a focus on efficiency, innovation, and best practices.

Key Responsibilites:


  • Python Development:Write clean, efficient, and maintainable Python code
  • Develop and maintain scalable applications using Python frameworks like Django or Flask
  • Implement RESTful APIs and integrate with front-end technologies
  • Database Management:Work with both SQL and NoSQL databases, optimizing queries and ensuring data integrity
  • Design and implement database structures to support application requirements
  • Cloud and DevOps:Deploy and maintain applications in cloud environments (e.g., AWS, GCP)
  • Implement CI/CD pipelines for automated testing and deployment
  • Quality Assurance:Write and maintain comprehensive unit tests and integration tests
  • Participate in code reviews to ensure high code quality and share knowledge
  • Collaboration and Innovation:Work closely with cross-functional teams to deliver integrated solutions
  • Stay updated with the latest Python ecosystem developments and suggest improvements
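For the database bullets above, parameterized queries plus an index are the starting point for "optimizing queries and ensuring data integrity". A runnable sketch using Python's built-in sqlite3 module; the schema is hypothetical:

```python
import sqlite3

# Parameterized queries, a uniqueness constraint, and an index -- the
# basics behind "optimize queries and ensure data integrity".

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE patients (
        id   INTEGER PRIMARY KEY,
        mrn  TEXT NOT NULL UNIQUE,   -- integrity enforced by the DB itself
        name TEXT NOT NULL
    )
""")
# Index supports fast lookups on the column we query by.
conn.execute("CREATE INDEX idx_patients_name ON patients(name)")

conn.executemany(
    "INSERT INTO patients (mrn, name) VALUES (?, ?)",  # placeholders, never f-strings
    [("MRN001", "Asha"), ("MRN002", "Ravi")],
)
conn.commit()

row = conn.execute(
    "SELECT mrn FROM patients WHERE name = ?", ("Asha",)
).fetchone()
```

The same placeholder discipline carries over directly to PostgreSQL drivers and to ORM query APIs in Django or SQLAlchemy.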


Required Skills And Qualification:


  • 3+ years of experience in Python development
  • Bachelor's degree in Computer Science, Engineering, or a related field
  • Proficiency in Python and its ecosystem (e.g., Django, Flask, FastAPI)
  • Experience with SQL and NoSQL databases
  • Familiarity with cloud platforms (preferably AWS) and containerization (Docker)
  • Understanding of software development best practices and design patterns

Soft Skills:


  • Strong problem-solving and analytical skills
  • Excellent communication and teamwork abilities
  • Self-motivated and able to work independently when required
  • Adaptable and eager to learn new technologies


Company Culture:


Nirmitee.io is an innovative IT services company, driven by a passion for technology and a commitment to delivering exceptional solutions in product engineering and healthcare technology. We foster a culture of creativity, collaboration, and continuous learning.

Tekdi Technologies Pvt. Ltd.
Tekdi Recruitment
Posted by Tekdi Recruitment
Pune
6 - 10 yrs
₹15L - ₹25L / yr
Python
Django
Flask
Artificial Intelligence (AI)
Machine Learning (ML)
+2 more

Job Overview:

Python Lead responsibilities include developing and maintaining AI pipelines, including data preprocessing, feature extraction, model training, and evaluation.


Responsibilities:

  • Designing, developing, and implementing generative AI models and algorithms utilizing state-of-the-art techniques such as GPT, VAE, and GANs.
  • Conducting research to stay up-to-date with the latest advancements in generative AI, machine learning, and deep learning techniques and identify opportunities to integrate them into our products and services.
  • 7+ years of experience creating REST APIs using popular Python web frameworks such as Django, Flask, or FastAPI.
  • Knowledge of databases such as PostgreSQL, Elasticsearch, MongoDB, etc.
  • Knowledge of working with external integrations such as Redis, Kafka, S3, EC2, etc.
  • Some experience with ML integrations will be a plus.
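The Kafka-style integrations mentioned above follow a produce/consume flow. An in-process sketch with a stdlib queue standing in for the broker, so the pattern is runnable without any external service; the event names are illustrative:

```python
import queue
import threading

# Produce/consume sketch: queue.Queue stands in for a Kafka topic, a
# thread stands in for the consumer group member.

broker: queue.Queue = queue.Queue()
SENTINEL = object()        # shutdown marker (real consumers poll forever)
processed = []

def consumer():
    while True:
        msg = broker.get()
        if msg is SENTINEL:
            break
        processed.append(msg.upper())   # pretend "processing"
        broker.task_done()

t = threading.Thread(target=consumer)
t.start()

for event in ["signup", "login", "logout"]:
    broker.put(event)                   # producer side
broker.put(SENTINEL)
t.join()
```

Swapping the queue for a Kafka client changes the transport, not the shape of the code: producers publish, consumers loop and acknowledge.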


Requirements:

  • Work experience as a Python Developer
  • Team spirit
  • Good problem-solving skills
  • Proficient in Python, with experience in machine learning libraries and frameworks such as TensorFlow, PyTorch, or Keras
  • Strong knowledge of data structures, algorithms, and software engineering principles
  • Nice to have: experience with natural language processing (NLP) techniques and tools, such as spaCy, NLTK, or Hugging Face


Highfly Sourcing

at Highfly Sourcing

2 candid answers
Highfly Hr
Posted by Highfly Hr
Dubai, Augsburg, Germany, Zaragoza (Spain), Qatar, Salalah (Oman), Kuwait, Lebanon, Marseille (France), Genova (Italy), Winnipeg (Canada), Denmark, Poznan (Poland), Bengaluru (Bangalore), Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Mumbai, Hyderabad, Pune
3 - 10 yrs
₹25L - ₹30L / yr
Vue.js
AngularJS (1.x)
Angular (2+)
React.js
JavaScript
+14 more

Job Description

We are looking for a talented Java Developer for positions abroad. You will be responsible for developing high-quality software solutions, working on server-side components and integrations, and ensuring optimal performance and scalability.


Preferred Qualifications

  • Experience with microservices architecture.
  • Knowledge of cloud platforms (AWS, Azure).
  • Familiarity with Agile/Scrum methodologies.
  • Understanding of front-end technologies (HTML, CSS, JavaScript) is a plus.


Requirment Details

Bachelor’s degree in Computer Science, Information Technology, or a related field (or equivalent experience).

Proven experience as a Java Developer or similar role.

Strong knowledge of Java programming language and its frameworks (Spring, Hibernate).

Experience with relational databases (e.g., MySQL, PostgreSQL) and ORM tools.

Familiarity with RESTful APIs and web services.

Understanding of version control systems (e.g., Git).

Solid understanding of object-oriented programming (OOP) principles.

Strong problem-solving skills and attention to detail.

Lean Technologies

at Lean Technologies

2 candid answers
1 product
Reshika Mendiratta
Posted by Reshika Mendiratta
Pune
10yrs+
Up to ₹60L / yr (varies)
Docker
Kubernetes
DevOps
Amazon Web Services (AWS)
Google Cloud Platform (GCP)
+8 more

About Lean Technologies

Lean is on a mission to revolutionize the fintech industry by providing developers with a universal API to access their customers' financial accounts across the Middle East. We’re breaking down infrastructure barriers and empowering the growth of the fintech industry. With Sequoia leading our $33 million Series A round, Lean is poised to expand its coverage across the region while continuing to deliver unparalleled value to developers and stakeholders.

Join us and be part of a journey to enable the next generation of financial innovation. We offer competitive salaries, private healthcare, flexible office hours, and meaningful equity stakes to ensure long-term alignment. At Lean, you'll work on solving complex problems, build a lasting legacy, and be part of a diverse, inclusive, and equal opportunity workplace.


About the role:

Are you a highly motivated and experienced software engineer looking to take your career to the next level? Our team at Lean is seeking a talented engineer to help us build the distributed systems that allow our engineering teams to deploy our platform in multiple geographies across various deployment solutions. You will work closely with functional heads across software, QA, and product teams to deliver scalable and customizable release pipelines.


Responsibilities

  • Distributed systems architecture – understand and manage the most complex systems
  • Continual reliability and performance optimization – enhancing observability stack to improve proactive detection and resolution of issues
  • Employing cutting-edge methods and technologies, continually refining existing tools to enhance performance and drive advancements
  • Problem-solving capabilities – troubleshooting complex issues and proactively reducing toil through automation
  • Experience in technical leadership and setting technical direction for engineering projects
  • Collaboration skills – working across teams to drive change and provide guidance
  • Technical expertise – in-depth skills and the ability to act as subject matter expert in one or more of: IaC, observability, coding, reliability, debugging, system design
  • Capacity planning – effectively forecasting demand and reacting to changes
  • Analyze and improve efficiency, scalability, and stability of various system resources
  • Incident response – rapidly detecting and resolving critical incidents. Minimizing customer impact through effective collaboration, escalation (including periodic on-call shifts) and postmortems


Requirements

  • 10+ years of experience in Systems Engineering, DevOps, or SRE roles running large-scale infrastructure, cloud, or web services
  • Strong background in Linux/Unix Administration and networking concepts
  • We work on OCI but would accept candidates with solid GCP/AWS or other cloud providers’ knowledge and experience
  • 3+ years of experience with managing Kubernetes clusters, Helm, Docker
  • Experience in operating CI/CD pipelines that build and deliver services on the cloud and on-premise
  • Work with CI/CD tools/services like Jenkins/GitHub-Actions/ArgoCD etc.
  • Experience with configuration management tools either Ansible, Chef, Puppet, or equivalent
  • Infrastructure as Code - Terraform
  • Experience in production environments with both relational and NoSQL databases
  • Coding with one or more of the following: Java, Python, and/or Go


Bonus

  • MultiCloud or Hybrid Cloud experience
  • OCI and GCP


Why Join Us?

At Lean, we value talent, drive, and entrepreneurial spirit. We are constantly on the lookout for individuals who identify with our mission and values, even if they don’t meet every requirement. If you're passionate about solving hard problems and building a legacy, Lean is the right place for you. We are committed to equal employment opportunities regardless of race, color, ancestry, religion, gender, sexual orientation, or disability.

TVARIT GmbH

at TVARIT GmbH

2 candid answers
Shivani Kawade
Posted by Shivani Kawade
Pune
6 - 7 yrs
₹30L - ₹40L / yr
React.js
Go Programming (Golang)
Grafana
Python
Docker
+1 more

Availability: Full time  

Location: Pune, India  

Experience: 4-6 years 


Tvarit Solutions Private Limited is a wholly owned subsidiary of TVARIT GmbH, Germany. TVARIT provides software to reduce manufacturing waste such as scrap, energy, and machine downtime using its patented technology. With its software products and a highly competent team from renowned universities, TVARIT has gained customer trust across 4 continents within a short span of 5 years. TVARIT was ranked among the top 8 of 490 AI companies by the European Data Incubator, apart from many more awards from the German government and industrial organizations, making TVARIT one of the most innovative AI companies in Germany and Europe.


We are looking for a passionate Full Stack Developer Level 2 to join our technology team in Pune. You will be responsible for operations, design, development, and testing, for leading the software development team, and for working on infrastructure that will support the company’s solutions. You will get an opportunity to work closely on projects involving the automation of the manufacturing process.


Key responsibilities 

  • Creating Plugins for Open-Source framework Grafana using React & Golang. 
  • Develop pixel-perfect implementation of the front end using React. 
  • Design efficient DB interaction to optimize performance. 
  • Interact with and build Python APIs. 
  • Collaborate across teams and lead/train the junior developers. 
  • Design and maintain functional requirement documents and guides. 
  • Get feedback from, and build solutions for, users and customers. 


Must have worked on these technologies.  

  • 2+ years of experience working with React-Typescript on a production level 
  • Experience with API creation using node.js or Python 
  • GitHub or any other version control system
  • Have worked with any Linux/Unix-based operating system (Ubuntu, Debian, MacOS, etc)  


Good to have experience: 

  • Python-based backend technologies, relational and non-relational databases, Python web frameworks (Django or Flask) 
  • Experience with the Go programming language 
  • Experience working with Grafana, or on any other micro frontend architecture framework 
  • Experience with Docker 
  • Leading a team for at least a year 

 

Benefits and perks:

  • Culture of innovation, creativity, learning, and even failure, we believe in bringing out the best in you. 
  • Progressive leave policy for effective work-life balance. 
  • Get mentored by highly qualified internal resource groups and opportunities to avail industry-driven mentorship programs.
  • Multicultural peer groups and supportive workplace policies.  
  • Work from beaches, hills, mountains, and many more with the yearly workcation program; we believe in mixing elements of vacation and work. 

What is it like to work at a startup?

Working for TVARIT (deep-tech German IT Startup) can offer you a unique blend of innovation, collaboration, and growth opportunities. But it's essential to approach it with a willingness to adapt and thrive in a dynamic environment.


If this position sparked your interest, do apply today!


By submitting my documents for the recruitment process, I agree that my data will be processed for the purpose of the recruitment process and stored for an additional 6 months after the process is completed. Without your consent, we unfortunately cannot consider your documents for the recruitment process. You can revoke your consent at any time. Further information on how we process your data can be found in our privacy policy at the following link: https://tvarit.com/privacy-policy/



Bengaluru (Bangalore), Pune, Mumbai
4 - 6 yrs
Best in industry
Python
Shell Scripting
Linux
Ruby
Network protocols
+2 more

Job Requirements:

Intermediate Linux Knowledge

  • Experience with shell scripting
  • Familiarity with Linux commands such as grep, awk, sed
  • Required

Advanced Python Scripting Knowledge

  • Strong expertise in Python
  • Required

Ruby

  • Nice to have

Basic Knowledge of Network Protocols

  • Understanding of TCP/UDP, Multicast/Unicast
  • Required
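The unicast datagram flow referenced above can be illustrated in a few lines of Python over loopback; the port is OS-assigned and the payload is arbitrary:

```python
import socket

# Loopback UDP send/receive: the simplest unicast datagram round trip.

receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
receiver.settimeout(2.0)                 # don't block forever if a packet is lost
addr = receiver.getsockname()

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"tick", addr)             # one datagram, no connection setup

data, _ = receiver.recvfrom(1024)        # read exactly one datagram
sender.close()
receiver.close()
```

Multicast adds group membership (IP_ADD_MEMBERSHIP socket options) on top of this same send/receive shape, which is why UDP fundamentals are listed as required.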

Packet Captures

  • Experience with tools like Wireshark, tcpdump, tshark
  • Nice to have

High-Performance Messaging Libraries

  • Familiarity with tools like Tibco, 29West, LBM, Aeron
  • Nice to have
Jeeva.ai

at Jeeva.ai

2 candid answers
Zainab Lokhandwala
Posted by Zainab Lokhandwala
Pune
3 - 5 yrs
Best in industry
Python
React.js
MongoDB
Amazon Web Services (AWS)

About Jeeva.ai

At Jeeva.ai, we're on a mission to revolutionize the future of work by building AI employees that automate all manual tasks—starting with AI Sales Reps. Our vision is simple: "Anything that doesn’t require deep human connection can be automated & done better, faster & cheaper with AI." We’ve created a fully automated SDR using AI that generates 3x more pipeline than traditional sales teams at a fraction of the cost.

As a dynamic startup, we are backed by Alt Capital (founded by Jack Altman & Sam Altman), Marc Benioff (CEO, Salesforce), Gokul (board, Coinbase), Bonfire (investors in ChowNow), Techstars (investors in Uber), Sapphire (investors in LinkedIn), and Microsoft. With $1M ARR just 3 months after launch, we're not just growing - we're thriving and making a significant impact in the world of artificial intelligence.

As we continue to scale, we're looking for mid-senior Full Stack Engineers who are passionate, ambitious, and eager to make an impact in the AI-driven future of work.


About You

  • Experience: 3+ years of experience as a Full Stack Engineer with a strong background in React, Python, MongoDB, and AWS.
  • Automated CI/CD: Experienced in implementing and managing automated CI/CD pipelines using GitHub Actions and AWS Cloudformation.
  • System Architecture: Skilled in architecting scalable solutions for systems at scale, leveraging caching strategies, messaging queues and async/await paradigms for highly performant systems
  • Cloud-Native Expertise: Proficient in deploying cloud-native apps using AWS (Lambda, API Gateway, S3, ECS), with a focus on serverless architectures to reduce overhead and boost agility.
  • Development Tooling: Proficient in a wide range of development tools such as FastAPI, React State Management, REST APIs, Websockets and robust version control using Git.
  • AI and GPTs: Competent in applying AI technologies, particularly in using GPT models for natural language processing, automation and creating intelligent systems.
  • Impact-Driven: You've built and shipped products that users love and have seen the impact of your work at scale.
  • Ownership: You take pride in owning projects from start to finish and are comfortable wearing multiple hats to get the job done.
  • Curious Learner: You stay ahead of the curve, eager to explore and implement the latest technologies, particularly in AI.
  • Collaborative Spirit: You thrive in a team environment and can work effectively with both technical and non-technical stakeholders.
  • Ambitious: You have a hunger for success and are eager to contribute to a fast-growing company with big goals.
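The async/await point above can be sketched in miniature. A hypothetical example (task names and delays are invented for illustration) of running several simulated I/O-bound calls concurrently with asyncio:

```python
import asyncio
import time

async def fetch_record(name: str, delay: float) -> str:
    # Simulate an I/O-bound call (e.g., a database or API request).
    await asyncio.sleep(delay)
    return f"{name}:done"

async def main() -> list[str]:
    # Run the three "requests" concurrently; total wall time is roughly
    # max(delay), not the sum - the core win of async I/O.
    return await asyncio.gather(
        fetch_record("users", 0.1),
        fetch_record("orders", 0.1),
        fetch_record("events", 0.1),
    )

if __name__ == "__main__":
    start = time.perf_counter()
    results = asyncio.run(main())
    print(results, round(time.perf_counter() - start, 2))
```

The same pattern extends to queue consumers and cached fan-out calls in a real service.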


What You’ll Be Doing

  • Build and Innovate: Develop and scale AI-driven products like Gigi (AI Outbound SDR) and Jim (AI Inbound SDR), automating across voice & video with AI.
  • Collaborate Across Teams: Work closely with our Product, GTM, and Engineering teams to deliver world-class AI solutions that drive massive value for our customers.
  • Integrate and Optimize: Create seamless integrations with popular platforms like Salesforce, LinkedIn, and HubSpot, enhancing our AI’s capabilities.
  • Problem Solving: Tackle challenging problems head-on, from data pipelines to user experience, ensuring that every solution is both functional and delightful.
  • Drive AI Adoption: Be a key player in transforming how businesses operate by automating workflows, lead generation, and more with AI.





TVARIT GmbH

Posted by Shivani Kawade
Pune
4 - 6 yrs
₹15L - ₹25L / yr
PyTorch
Python
Scikit-Learn
NumPy
pandas
+2 more

Who are we looking for?  


We are looking for a Senior Data Scientist who will design and develop data-driven solutions using state-of-the-art methods. You should have strong, proven experience in building data-driven solutions. If you're enthusiastic about transforming business requirements into insightful data-driven solutions, you are welcome to join our fast-growing team and unlock your best potential.  

 

Job Summary 

  • Supporting company mission by understanding complex business problems through data-driven solutions. 
  • Designing and developing machine learning pipelines in Python and deploying them in AWS/GCP, ... 
  • Developing end-to-end ML production-ready solutions and visualizations. 
  • Analyse large sets of time-series industrial data from various sources, such as production systems, sensors, and databases to draw actionable insights and present them via custom dashboards. 
  • Communicating complex technical concepts and findings to the projects' non-technical stakeholders. 
  • Implementing the prototypes using suitable statistical tools and artificial intelligence algorithms. 
  • Preparing high-quality research papers and participating in conferences to present and report experimental results and research findings. 
  • Carrying out research collaborating with internal and external teams and facilitating review of ML systems for innovative ideas to prototype new models. 
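The time-series analysis described above can be sketched minimally: a hypothetical rolling-mean anomaly check on sensor readings, using only the standard library (window size, threshold, and data are illustrative assumptions, not TVARIT's method):

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=5, z_thresh=3.0):
    """Flag indices where a reading deviates more than z_thresh
    standard deviations from the trailing window's mean."""
    anomalies = []
    for i in range(window, len(readings)):
        hist = readings[i - window:i]
        mu, sigma = mean(hist), stdev(hist)
        if sigma > 0 and abs(readings[i] - mu) > z_thresh * sigma:
            anomalies.append(i)
    return anomalies

# Steady sensor signal with one spike at index 8.
sensor = [10.0, 10.1, 9.9, 10.0, 10.2, 10.1, 9.8, 10.0, 25.0, 10.1]
print(flag_anomalies(sensor))  # the spike at index 8 is flagged
```

In production this kind of check would typically sit behind a dashboard alert rather than a print statement.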

 

Qualification and experience 

  • B.Tech/Masters/Ph.D. in computer science, electrical engineering, mathematics, data science, and related fields. 
  • 5+ years of professional experience in the field of machine learning, and data science. 
  • Experience with large-scale Time-series data-based production code development is a plus. 

 

Skills and competencies 

  • Familiarity with Docker, and ML Libraries like PyTorch, sklearn, pandas, SQL, and Git is a must. 
  • Ability to work on multiple projects. Must have strong design and implementation skills. 
  • Ability to conduct research based on complex business problems. 
  • Strong presentation skills and the ability to collaborate in a multi-disciplinary team. 
  • Must have programming experience in Python. 
  • Excellent English communication skills, both written and verbal. 


Benefits and Perks

  • Culture of innovation, creativity, learning, and even failure, we believe in bringing out the best in you. 
  • Progressive leave policy for effective work-life balance. 
  • Get mentored by highly qualified internal resource groups and opportunity to avail industry-driven mentorship program, as we believe in empowering people.  
  • Multicultural peer groups and supportive workplace policies.  
  • Work from beaches, hills, mountains, and many more with the yearly workcation program; we believe in mixing elements of vacation and work. 


 Hiring Process 

  • Call with Talent Acquisition Team: After application screening, a first-level screening with the talent acquisition team to understand the candidate's goals and alignment with the job requirements. 
  • First Round: Technical round 1 to gauge your domain knowledge and functional expertise. 
  • Second Round: In-depth technical round and discussion about the departmental goals, your role, and expectations.
  • Final HR Round: Culture fit round and compensation discussions.
  • Offer: Congratulations you made it!  


If this position sparked your interest, apply now to initiate the screening process.

TVARIT GmbH

Posted by Shivani Kawade
Remote, Pune
2 - 4 yrs
₹8L - ₹20L / yr
Python
PySpark
ETL
databricks
Azure
+6 more

TVARIT GmbH develops and delivers solutions in the field of artificial intelligence (AI) for the manufacturing, automotive, and process industries. With its software products, TVARIT makes it possible for its customers to make intelligent and well-founded decisions, e.g., in forward-looking maintenance, increasing OEE, and predictive quality. We have renowned reference customers, competent technology, a strong research team from renowned universities, and a renowned AI prize (e.g., EU Horizon 2020), which makes TVARIT one of the most innovative AI companies in Germany and Europe. 

 

 

We are looking for a self-motivated person with a positive "can-do" attitude and excellent oral and written communication skills in English. 

 

 

We are seeking a skilled and motivated Data Engineer from the manufacturing Industry with over two years of experience to join our team. As a data engineer, you will be responsible for designing, building, and maintaining the infrastructure required for the collection, storage, processing, and analysis of large and complex data sets. The ideal candidate will have a strong foundation in ETL pipelines and Python, with additional experience in Azure and Terraform being a plus. This role requires a proactive individual who can contribute to our data infrastructure and support our analytics and data science initiatives. 

 

 

Skills Required 

  • Experience in the manufacturing industry (metal industry is a plus)  
  • 2+ years of experience as a Data Engineer 
  • Experience in data cleaning & structuring and data manipulation 
  • ETL Pipelines: Proven experience in designing, building, and maintaining ETL pipelines. 
  • Python: Strong proficiency in Python programming for data manipulation, transformation, and automation. 
  • Experience in SQL and data structures  
  • Knowledge in big data technologies such as Spark, Flink, Hadoop, Apache and NoSQL databases. 
  • Knowledge of cloud technologies (at least one) such as AWS, Azure, and Google Cloud Platform. 
  • Proficient in data management and data governance  
  • Strong analytical and problem-solving skills. 
  • Excellent communication and teamwork abilities. 
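A minimal sketch of the extract-transform-load loop the role centers on, using hypothetical shop-floor records and only the Python standard library (sqlite3 stands in for the real target warehouse):

```python
import sqlite3

# Extract: raw records as they might arrive from a machine export.
raw = [
    {"machine": " press-01 ", "temp_c": "71.5"},
    {"machine": "press-02", "temp_c": "68.0"},
    {"machine": " press-01 ", "temp_c": "not-a-number"},  # bad row
]

def transform(rows):
    """Trim whitespace, cast types, and drop rows that fail validation."""
    clean = []
    for r in rows:
        try:
            clean.append((r["machine"].strip(), float(r["temp_c"])))
        except (KeyError, ValueError):
            continue  # a real pipeline would route this to a dead-letter store
    return clean

# Load: write the validated rows into the target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (machine TEXT, temp_c REAL)")
conn.executemany("INSERT INTO readings VALUES (?, ?)", transform(raw))
count = conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]
print(count)  # 2 valid rows loaded
```

The same three stages scale up to Spark or Airflow jobs; only the plumbing changes.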

 


Nice To Have 

  • Azure: Experience with Azure data services (e.g., Azure Data Factory, Azure Databricks, Azure SQL Database). 
  • Terraform: Knowledge of Terraform for infrastructure as code (IaC) to manage cloud. 


TVARIT GmbH

Posted by Shivani Kawade
Remote, Pune
2 - 6 yrs
₹8L - ₹25L / yr
SQL Azure
databricks
Python
SQL
ETL
+9 more

TVARIT GmbH develops and delivers solutions in the field of artificial intelligence (AI) for the Manufacturing, automotive, and process industries. With its software products, TVARIT makes it possible for its customers to make intelligent and well-founded decisions, e.g., in forward-looking Maintenance, increasing the OEE and predictive quality. We have renowned reference customers, competent technology, a good research team from renowned Universities, and the award of a renowned AI prize (e.g., EU Horizon 2020) which makes TVARIT one of the most innovative AI companies in Germany and Europe.


We are looking for a self-motivated person with a positive "can-do" attitude and excellent oral and written communication skills in English.


We are seeking a skilled and motivated senior Data Engineer from the manufacturing Industry with over four years of experience to join our team. The Senior Data Engineer will oversee the department’s data infrastructure, including developing a data model, integrating large amounts of data from different systems, building & enhancing a data lake-house & subsequent analytics environment, and writing scripts to facilitate data analysis. The ideal candidate will have a strong foundation in ETL pipelines and Python, with additional experience in Azure and Terraform being a plus. This role requires a proactive individual who can contribute to our data infrastructure and support our analytics and data science initiatives.


Skills Required:


  • Experience in the manufacturing industry (metal industry is a plus)
  • 4+ years of experience as a Data Engineer
  • Experience in data cleaning & structuring and data manipulation
  • Architect and optimize complex data pipelines, leading the design and implementation of scalable data infrastructure, and ensuring data quality and reliability at scale
  • ETL Pipelines: Proven experience in designing, building, and maintaining ETL pipelines.
  • Python: Strong proficiency in Python programming for data manipulation, transformation, and automation.
  • Experience in SQL and data structures
  • Knowledge in big data technologies such as Spark, Flink, Hadoop, Apache, and NoSQL databases.
  • Knowledge of cloud technologies (at least one) such as AWS, Azure, and Google Cloud Platform.
  • Proficient in data management and data governance
  • Strong analytical experience & skills that can extract actionable insights from raw data to help improve the business.
  • Strong analytical and problem-solving skills.
  • Excellent communication and teamwork abilities.
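One hedged sketch of the data-quality responsibility above: a hypothetical row validator (the schema, column names, and rules are illustrative assumptions, not TVARIT's actual checks):

```python
def validate_row(row, schema):
    """Return a list of data-quality violations for one record.
    schema maps column -> (expected type, required?)."""
    errors = []
    for col, (typ, required) in schema.items():
        if col not in row or row[col] is None:
            if required:
                errors.append(f"{col}: missing")
        elif not isinstance(row[col], typ):
            errors.append(f"{col}: expected {typ.__name__}")
    return errors

# Illustrative schema for a production-batch record.
schema = {"batch_id": (str, True), "weight_kg": (float, True), "operator": (str, False)}

good = {"batch_id": "B-1042", "weight_kg": 12.5}
bad = {"batch_id": "B-1043", "weight_kg": "12.5"}  # wrong type

print(validate_row(good, schema))  # []
print(validate_row(bad, schema))   # ['weight_kg: expected float']
```

Checks like this usually run at pipeline boundaries so bad rows are quarantined before they reach the lakehouse.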


Nice To Have:

  • Azure: Experience with Azure data services (e.g., Azure Data Factory, Azure Databricks, Azure SQL Database).
  • Terraform: Knowledge of Terraform for infrastructure as code (IaC) to manage cloud.
  • Bachelor’s degree in computer science, Information Technology, Engineering, or a related field from top-tier Indian Institutes of Information Technology (IIITs).

Benefits And Perks

  • A culture that fosters innovation, creativity, continuous learning, and resilience
  • Progressive leave policy promoting work-life balance
  • Mentorship opportunities with highly qualified internal resources and industry-driven programs
  • Multicultural peer groups and supportive workplace policies
  • Annual workcation program allowing you to work from various scenic locations
  • Experience the unique environment of a dynamic start-up


Why should you join TVARIT?


Working at TVARIT, a deep-tech German IT startup, offers a unique blend of innovation, collaboration, and growth opportunities. We seek individuals eager to adapt and thrive in a rapidly evolving environment.


If this opportunity excites you and aligns with your career aspirations, we encourage you to apply today!

DeepIntent

Posted by Indrajeet Deshmukh
Pune
3 - 6 yrs
Best in industry
Kubernetes
Git
MySQL
Amazon Web Services (AWS)
CI/CD
+3 more

With a core belief that advertising technology can measurably improve the lives of patients, DeepIntent is leading the healthcare advertising industry into the future. Built purposefully for the healthcare industry, the DeepIntent Healthcare Advertising Platform is proven to drive higher audience quality and script performance with patented technology and the industry’s most comprehensive health data. DeepIntent is trusted by 600+ pharmaceutical brands and all the leading healthcare agencies to reach the most relevant healthcare provider and patient audiences across all channels and devices. For more information, visit DeepIntent.com or find us on LinkedIn.


We are seeking a skilled and experienced Site Reliability Engineer (SRE) to join our dynamic team. The ideal candidate will have a minimum of 3 years of hands-on experience in managing and maintaining production systems, with a focus on reliability, scalability, and performance. As an SRE at DeepIntent, you will play a crucial role in ensuring the stability and efficiency of our infrastructure, as well as contributing to the development of automation and monitoring tools.


Responsibilities:

  • Deploy, configure, and maintain Kubernetes clusters for our microservices architecture.
  • Utilize Git and Helm for version control and deployment management.
  • Implement and manage monitoring solutions using Prometheus and Grafana.
  • Work on continuous integration and continuous deployment (CI/CD) pipelines.
  • Containerize applications using Docker and manage orchestration.
  • Manage and optimize AWS services, including but not limited to EC2, S3, RDS, and AWS CDN.
  • Maintain and optimize MySQL databases, Airflow, and Redis instances.
  • Write automation scripts in Bash or Python for system administration tasks.
  • Perform Linux administration tasks and troubleshoot system issues.
  • Utilize Ansible and Terraform for configuration management and infrastructure as code.
  • Demonstrate knowledge of networking and load-balancing principles.
  • Collaborate with development teams to ensure applications meet reliability and performance standards.


Additional Skills (Good to Know):

  • Familiarity with ClickHouse and Druid for data storage and analytics.
  • Experience with Jenkins for continuous integration.
  • Basic understanding of Google Cloud Platform (GCP) and data center operations.


Qualifications:

  • Minimum 3 years of experience in a Site Reliability Engineer role or similar.
  • Proven experience with Kubernetes, Git, Helm, Prometheus, Grafana, CI/CD, Docker, and microservices architecture.
  • Strong knowledge of AWS services, MySQL, Airflow, Redis, AWS CDN.
  • Proficient in scripting languages such as Bash or Python.
  • Hands-on experience with Linux administration.
  • Familiarity with Ansible and Terraform for infrastructure management.
  • Understanding of networking principles and load balancing.


Education:

Bachelor's degree in Computer Science, Information Technology, or a related field.


DeepIntent is committed to bringing together individuals from different backgrounds and perspectives. We strive to create an inclusive environment where everyone can thrive, feel a sense of belonging, and do great work together.

DeepIntent is an Equal Opportunity Employer, providing equal employment and advancement opportunities to all individuals. We recruit, hire and promote into all job levels the most qualified applicants without regard to race, color, creed, national origin, religion, sex (including pregnancy, childbirth and related medical conditions), parental status, age, disability, genetic information, citizenship status, veteran status, gender identity or expression, transgender status, sexual orientation, marital, family or partnership status, political affiliation or activities, military service, immigration status, or any other status protected under applicable federal, state and local laws. If you have a disability or special need that requires accommodation, please let us know in advance.

DeepIntent’s commitment to providing equal employment opportunities extends to all aspects of employment, including job assignment, compensation, discipline and access to benefits and training.

TVARIT GmbH

Posted by Shivani Kawade
Pune
8 - 15 yrs
₹20L - ₹25L / yr
Python
CI/CD
Systems Development Life Cycle (SDLC)
ETL
JIRA
+5 more

TVARIT GmbH develops and delivers solutions in the field of artificial intelligence (AI) for the Manufacturing, automotive, and process industries. With its software products, TVARIT makes it possible for its customers to make intelligent and well-founded decisions, e.g., in forward-looking Maintenance, increasing the OEE and predictive quality. We have renowned reference customers, competent technology, a good research team from renowned Universities, and the award of a renowned AI prize (e.g., EU Horizon 2020) which makes TVARIT one of the most innovative AI companies in Germany and Europe.



Requirements:

  • Python Experience: Minimum 3+ years.
  • Software Development Experience: Minimum 8+ years.
  • Data Engineering and ETL Workloads: Minimum 2+ years.
  • Familiarity with Software Development Life Cycle (SDLC).
  • CI/CD Pipeline Development: Experience in developing CI/CD pipelines for large projects.
  • Agile Framework & Sprint Methodology: Experience with Jira.
  • Source Version Control: Experience with GitHub or similar SVC.
  • Team Leadership: Experience leading a team of software developers/data scientists.

Good to Have:

  • Experience with Golang.
  • DevOps/Cloud Experience (preferably AWS).
  • Experience with React and TypeScript.

Responsibilities:

  • Mentor and train a team of data scientists and software developers.
  • Lead and guide the team in best practices for software development and data engineering.
  • Develop and implement CI/CD pipelines.
  • Ensure adherence to Agile methodologies and participate in sprint planning and execution.
  • Collaborate with the team to ensure the successful delivery of projects.
  • Provide on-site support and training in Pune.

Skills and Attributes:

  • Strong leadership and mentorship abilities.
  • Excellent problem-solving skills.
  • Effective communication and teamwork.
  • Ability to work in a fast-paced environment.
  • Passionate about technology and continuous learning.


Note: This is a part-time position paid on an hourly basis. The initial commitment is 4-8 hours per week, with potential fluctuations.


Join TVARIT and be a pivotal part of shaping the future of software development and data engineering.

Wissen Technology

Posted by Sukanya Mohan
Pune, Bengaluru (Bangalore)
5 - 10 yrs
Best in industry
Amazon Web Services (AWS)
EMR
Python
Glue
SQL
+1 more

Greetings! Wissen Technology is hiring for the position of Data Engineer.

Please find the job description below for your reference:


JD

  • Design, develop, and maintain data pipelines on AWS EMR (Elastic MapReduce) to support data processing and analytics.
  • Implement data ingestion processes from various sources including APIs, databases, and flat files.
  • Optimize and tune big data workflows for performance and scalability.
  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
  • Manage and monitor EMR clusters, ensuring high availability and reliability.
  • Develop ETL (Extract, Transform, Load) processes to cleanse, transform, and store data in data lakes and data warehouses.
  • Implement data security best practices to ensure data is protected and compliant with relevant regulations.
  • Create and maintain technical documentation related to data pipelines, workflows, and infrastructure.
  • Troubleshoot and resolve issues related to data processing and EMR cluster performance.
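EMR's processing model (map over partitions in parallel, then reduce) can be shown in miniature with plain Python; the "partitions" and event names below are invented for illustration:

```python
from collections import Counter
from functools import reduce

# Each "partition" stands in for a chunk of log data one EMR node processes.
partitions = [
    ["press-01", "press-02", "press-01"],
    ["press-02", "press-03"],
    ["press-01"],
]

# Map: count events within each partition independently (parallel on a cluster).
mapped = [Counter(part) for part in partitions]

# Reduce: merge the partial counts into a single global result.
totals = reduce(lambda a, b: a + b, mapped)
print(dict(totals))  # {'press-01': 3, 'press-02': 2, 'press-03': 1}
```

Spark and Hadoop apply exactly this shape at scale, with shuffles handling the merge step across machines.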

 

 

Qualifications:

 

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • 5+ years of experience in data engineering, with a focus on big data technologies.
  • Strong experience with AWS services, particularly EMR, S3, Redshift, Lambda, and Glue.
  • Proficiency in programming languages such as Python, Java, or Scala.
  • Experience with big data frameworks and tools such as Hadoop, Spark, Hive, and Pig.
  • Solid understanding of data modeling, ETL processes, and data warehousing concepts.
  • Experience with SQL and NoSQL databases.
  • Familiarity with CI/CD pipelines and version control systems (e.g., Git).
  • Strong problem-solving skills and the ability to work independently and collaboratively in a team environment
Wissen Technology

Posted by Vijayalakshmi Selvaraj
Pune
7 - 11 yrs
₹4L - ₹30L / yr
HTML/CSS
JavaScript
Angular (2+)
AngularJS (1.x)
ASP.NET
+3 more

Job Description: 

Ideal experience required – 6-8 years

  • Mandatory hands-on experience in .NET Core (8)
  • Mandatory hands-on experience in Angular (10+ version required)
  • Azure and Microservice Architecture experience is good to have.
  • No database or domain constraint

 

Skills:

  • 7 to 10 years of working experience in managing .net projects closely with internal and external clients in structured contexts in an international environment.
  • Strong knowledge of .NET Core, .NET MVC, C#, SQL Server & JavaScript
  • Working experience in Angular
  • Familiar with various design and architectural patterns
  • Should be familiar with Git source code management for code repository.
  • Should be able to write clean, readable, and easily maintainable code.
  • Understanding of fundamental design principles for building a scalable application
  • Experience in implementing automated testing platforms and unit tests. 

Nice to have:

  • AWS
  • Elastic Search
  • Mongo DB

Responsibilities:

  • Should be able to handle modules/project independently with minor supervision.
  • Should be good in troubleshooting and problem-solving skills.
  • Should be able to take complete ownership of modules and projects.
  • Should be able to communicate and coordinate with multiple teams.
  • Must have good verbal & written communication skill.

 

Monsoonfish

Agency job
via TIGI HR Solution Pvt. Ltd. by Vaidehi Sarkar
Pune
1 - 3 yrs
₹3L - ₹7L / yr
Python
XML
NOSQL Databases
JSON

Preferred Skills:

  • Experience with XML-based web services (SOAP, REST).
  • Knowledge of database technologies (SQL, NoSQL) for XML data storage.
  • Familiarity with version control systems (Git, SVN).
  • Understanding of JSON and other data interchange formats.
  • Certifications in XML technologies are a plus.
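A small sketch of the XML-to-JSON interchange the skills above point at, using only the standard library (the document structure and field names are made up for illustration):

```python
import json
import xml.etree.ElementTree as ET

xml_doc = """<order id="A17">
  <item sku="P-100" qty="2"/>
  <item sku="P-205" qty="1"/>
</order>"""

root = ET.fromstring(xml_doc)
# Flatten the XML into a plain dict so it can travel as JSON.
order = {
    "id": root.get("id"),
    "items": [
        {"sku": i.get("sku"), "qty": int(i.get("qty"))}
        for i in root.findall("item")
    ],
}
print(json.dumps(order))
```

The reverse direction (JSON back into XML) is equally mechanical with `ET.Element` and `ET.SubElement`.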
IntraEdge

Posted by Karishma Shingote
Pune
5 - 11 yrs
₹5L - ₹15L / yr
SQL
Snowflake
Enterprise Data Warehouse (EDW)
Python
PySpark

Sr. Data Engineer (Data Warehouse-Snowflake)

Experience: 5+ yrs

Location: Pune (Hybrid)


As a Senior Data Engineer with Snowflake expertise, you are a curious, innovative subject matter expert who mentors young professionals. You will be a key person in converting our vision and data strategy into delivered data solutions. With your knowledge you will help create data-driven thinking within the organization, not just within Data teams, but also in the wider stakeholder community.


Skills Preferred

  • Advanced written, verbal, and analytic skills, and demonstrated ability to influence and facilitate sustained change. Ability to convey information clearly and concisely to all levels of staff and management about programs, services, best practices, strategies, and organizational mission and values.
  • Proven ability to focus on priorities, strategies, and vision.
  • Very Good understanding in Data Foundation initiatives, like Data Modelling, Data Quality Management, Data Governance, Data Maturity Assessments and Data Strategy in support of the key business stakeholders.
  • Actively deliver the roll-out and embedding of Data Foundation initiatives in support of the key business programs advising on the technology and using leading market standard tools.
  • Coordinate the change management process, incident management and problem management process.
  • Ensure traceability of requirements from Data through testing and scope changes, to training and transition.
  • Drive implementation efficiency and effectiveness across the pilots and future projects to minimize cost, increase speed of implementation and maximize value delivery


Knowledge Preferred

  • Extensive knowledge and hands on experience with Snowflake and its different components like User/Group, Data Store/ Warehouse management, External Stage/table, working with semi structured data, Snowpipe etc.
  • Implement and manage CI/CD for migrating and deploying codes to higher environments with Snowflake codes.
  • Proven experience with Snowflake Access control and authentication, data security, data sharing, working with VS Code extension for snowflake, replication, and failover, optimizing SQL, analytical ability to troubleshoot and debug on development and production issues quickly is key for success in this role.
  • Proven technology champion in working with relational, Data warehouses databases, query authoring (SQL) as well as working familiarity with a variety of databases. 
  • Highly Experienced in building and optimizing complex queries. Good with manipulating, processing, and extracting value from large, disconnected datasets.
  • Your experience in handling big data sets and big data technologies will be an asset.
  • Proven champion with in-depth knowledge of any one of the scripting languages: Python, SQL, PySpark. 
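The query-optimization point above can be illustrated with sqlite3 from the standard library (the table, index, and data are hypothetical; Snowflake's optimizer works differently, but the index-seek-versus-full-scan idea carries over):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 10.0), ("south", 20.0), ("north", 5.0)],
)

# Without an index the WHERE filter is a full table scan; with one, a seek.
conn.execute("CREATE INDEX idx_sales_region ON sales (region)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM sales WHERE region = ?",
    ("north",),
).fetchone()[3]
print(plan)  # the plan text names the index rather than a plain scan

total = conn.execute(
    "SELECT SUM(amount) FROM sales WHERE region = ?", ("north",)
).fetchone()[0]
print(total)  # 15.0
```

Reading the query plan before and after adding an index is the everyday habit behind "optimizing complex queries".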


Primary responsibilities

  • You will be an asset in our team bringing deep technical skills and capabilities to become a key part of projects defining the data journey in our company, keen to engage, network and innovate in collaboration with company wide teams.
  • Collaborate with the data and analytics team to develop and maintain a data model and data governance infrastructure using a range of different storage technologies that enables optimal data storage and sharing using advanced methods.
  • Support the development of processes and standards for data mining, data modeling and data protection.
  • Design and implement continuous process improvements for automating manual processes and optimizing data delivery.
  • Assess and report on the unique data needs of key stakeholders and troubleshoot any data-related technical issues through to resolution.
  • Work to improve data models that support business intelligence tools, improve data accessibility and foster data-driven decision making.
  • Ensure traceability of requirements from Data through testing and scope changes, to training and transition.
  • Manage and lead technical design and development activities for implementation of large-scale data solutions in Snowflake to support multiple use cases (transformation, reporting and analytics, data monetization, etc.).
  • Translate advanced business data, integration and analytics problems into technical approaches that yield actionable recommendations, across multiple, diverse domains; communicate results and educate others through design and build of insightful presentations.
  • Exhibit strong knowledge of the Snowflake ecosystem and can clearly articulate the value proposition of cloud modernization/transformation to a wide range of stakeholders.


Relevant work experience

Bachelor's degree in a Science, Technology, Engineering, Mathematics or Computer Science discipline, or equivalent, with 7+ years of experience in enterprise-wide data warehousing, governance, policies, procedures, and implementation.

Aptitude for working with data, interpreting results, business intelligence and analytic best practices.


Business understanding

Good knowledge and understanding of Consumer and industrial products sector and IoT. 

Good functional understanding of solutions supporting business processes.


Skill Must have

  • Snowflake 5+ years
  • Overall different Data warehousing techs 5+ years
  • SQL 5+ years
  • Data warehouse designing experience 3+ years
  • Experience with cloud and on-prem hybrid models in data architecture
  • Knowledge of Data Governance and strong understanding of data lineage and data quality
  • Programming & Scripting: Python, PySpark
  • Database technologies such as Traditional RDBMS (MS SQL Server, Oracle, MySQL, PostgreSQL)


Nice to have

  • Demonstrated experience in modern enterprise data integration platforms such as Informatica
  • AWS cloud services: S3, Lambda, Glue and Kinesis and API Gateway, EC2, EMR, RDS, Redshift and Kinesis
  • Good understanding of Data Architecture approaches
  • Experience in designing and building streaming data ingestion, analysis and processing pipelines using Kafka, Kafka Streams, Spark Streaming, Stream sets and similar cloud native technologies.
  • Experience with implementation of operations concerns for a data platform such as monitoring, security, and scalability
  • Experience working in DevOps, Agile, Scrum, Continuous Delivery and/or Rapid Application Development environments
  • Building mock and proof-of-concepts across different capabilities/tool sets exposure
  • Experience working with structured, semi-structured, and unstructured data, extracting information, and identifying linkages across disparate data sets


Scremer
Posted by Sathish Dhawan
Pune, Mumbai
6 - 11 yrs
₹15L - ₹15L / yr
Amazon Web Services (AWS)
Python
Java
Spark


Primary Skills

DynamoDB, Java, Kafka, Spark, Amazon Redshift, AWS Lake Formation, AWS Glue, Python


Skills:

Good work experience showing growth as a Data Engineer.

Hands On programming experience

Implementation Experience on Kafka, Kinesis, Spark, AWS Glue, AWS Lake Formation.

Excellent knowledge in: Python, Scala/Java, Spark, AWS (Lambda, Step Functions, DynamoDB, EMR), Terraform, UI (Angular), Git, Maven

Experience of performance optimization in Batch and Real time processing applications

Expertise in Data Governance and Data Security Implementation

Good hands-on design and programming skills building reusable tools and products Experience developing in AWS or similar cloud platforms. Preferred:, ECS, EKS, S3, EMR, DynamoDB, Aurora, Redshift, Quick Sight or similar.

Familiarity with systems with very high volume of transactions, micro service design, or data processing pipelines (Spark).

Knowledge and hands-on experience with serverless technologies such as Lambda, MSK, MWAA, and Kinesis Analytics is a plus.

Expertise in practices like Agile, Peer reviews, Continuous Integration
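The serverless experience above can be sketched with a minimal AWS Lambda-shaped handler invoked locally against a fake API Gateway event (the event shape and field names are illustrative; no AWS services are called):

```python
import json

def handler(event, context):
    """Minimal Lambda-style handler: validate input, do the work,
    return an API Gateway-shaped response. Runs as plain Python."""
    try:
        body = json.loads(event.get("body") or "{}")
        n = int(body["n"])
    except (KeyError, ValueError, TypeError):
        return {"statusCode": 400, "body": json.dumps({"error": "n required"})}
    return {"statusCode": 200, "body": json.dumps({"square": n * n})}

# Local invocation with a fake event dict (no AWS needed).
resp = handler({"body": json.dumps({"n": 7})}, context=None)
print(resp)
```

Keeping the handler a pure function like this is what makes Lambda code easy to unit-test before deployment.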


Roles and responsibilities:

Determining project requirements and developing work schedules for the team.

Delegating tasks and achieving daily, weekly, and monthly goals.

Responsible for designing, building, testing, and deploying the software releases.


Salary: 25LPA-40LPA

Wissen Technology

Posted by Tony Tom
Bengaluru (Bangalore), Mumbai, Pune
4 - 10 yrs
Best in industry
Performance Testing
Python
Linux/Unix

Job Description:

· Proficient in Python.

· Good knowledge of Stress/Load Testing and Performance Testing.

· Knowledge of Linux.
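What a load test measures can be sketched minimally: hypothetical concurrent calls against a stand-in operation, reporting latency percentiles (the request count, concurrency, and target are illustrative; a real test would hit the actual system over the network):

```python
import time
import statistics
from concurrent.futures import ThreadPoolExecutor

def target_operation():
    """Stand-in for the system under test (e.g., an HTTP endpoint)."""
    time.sleep(0.01)

def load_test(requests=50, concurrency=10):
    latencies = []
    def timed_call():
        start = time.perf_counter()
        target_operation()
        latencies.append(time.perf_counter() - start)
    # Fire the calls from a thread pool to simulate concurrent users.
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        for _ in range(requests):
            pool.submit(timed_call)
    latencies.sort()
    return {
        "p50": statistics.median(latencies),
        "p95": latencies[int(len(latencies) * 0.95) - 1],
        "count": len(latencies),
    }

print(load_test())
```

Tools like Locust or JMeter do the same thing with ramp-up schedules and richer reporting.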

Sahaj AI Software

Posted by Soumya Tripathy
Pune
11 - 17 yrs
Best in industry
Vue.js
AngularJS (1.x)
Angular (2+)
React.js
JavaScript
+10 more

About the role

As a full-stack engineer, you'll feel at home if you are hands-on, grounded, opinionated and passionate about building things using technology. Our tech stack ranges widely, with language ecosystems like TypeScript, Java, Scala, Golang, Kotlin, Elixir, Python, .NET, Node.js, and even Rust.

This role is ideal for those looking to have a large impact and a huge scope for growth while still being hands-on with technology. We aim to allow growth without becoming "post-technical". We are extremely selective with our consultants and are able to run our teams with fewer levels of management. You won't find a BA or iteration manager here! We work in small pizza teams of 2-5 people where a well-founded argument holds more weight than years of experience. You will have the opportunity to work with clients across domains like retail, banking, publishing, education, ad tech and more, where you will take ownership of developing software solutions that are purpose-built to solve our clients' unique business and technical needs.

Responsibilities

  • Produce high-quality code that allows us to put solutions into production.
  • Utilize DevOps tools and practices to build and deploy software.
  • Collaborate with Data Scientists and Engineers to deliver production-quality AI and Machine Learning systems.
  • Build frameworks and supporting tooling for data ingestion from a complex variety of sources.
  • Work in short sprints to deliver working software with clear deliverables and client-led deadlines.
  • Willingness to be a polyglot developer and learn multiple technologies.

Skills you’ll need

  • A maker’s mindset. To be resourceful and have the ability to do things that have no instructions.
  • Extensive experience (at least 10 years) as a Software Engineer.
  • Deep understanding of programming fundamentals and expertise with at least one programming language (functional or object-oriented).
  • A nuanced and rich understanding of code quality, maintainability and practices like Test Driven Development.
  • Experience with one or more source control and build toolchains.
  • Working knowledge of CI/CD will be an added advantage.
  • Understanding of web APIs, contracts and communication protocols.
  • Understanding of Cloud platforms, infra-automation/DevOps, IaC/GitOps/Containers, design and development of large data platforms.

What will you experience in terms of culture at Sahaj?

  • A culture of trust, respect and transparency
  • Opportunity to collaborate with some of the finest minds in the industry
  • Work across multiple domains

What are the benefits of being at Sahaj?

  • Unlimited leaves
  • Life Insurance & Private Health insurance paid by Sahaj
  • Stock options
  • No hierarchy
  • Open Salaries
Read more
Wissen Technology

at Wissen Technology

4 recruiters
Tony Tom
Posted by Tony Tom
Pune, Bengaluru (Bangalore), Mumbai
4 - 14 yrs
Best in industry
skill iconPython
AWS Lambda
gremlin
monkey
chaos enginnering tool

We are looking for a QA engineer with experience in Python, AWS, and chaos engineering tools (Chaos Monkey, Gremlin).


  • Strong understanding of distributed systems, cloud computing (AWS), and networking principles.
  • Ability to understand complex trading systems and prepare and execute plans to induce failures.
  • Strong Python skills.
  • Experience with chaos engineering tooling such as Chaos Monkey, Gremlin, or similar.
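Tools like Chaos Monkey and Gremlin inject failures at the infrastructure level; the core idea can be shown in a few lines of Python with a decorator that randomly fails (or delays) a call path. This is a toy sketch for intuition only, not a substitute for the real tooling, and all names here are hypothetical:

```python
import random
import time
import functools

def chaos(failure_rate=0.2, max_delay=0.0, seed=None):
    """Decorator that randomly injects failures and optional latency into a
    call path -- a miniature version of what chaos engineering tools do."""
    rng = random.Random(seed)

    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            if rng.random() < failure_rate:
                raise RuntimeError("chaos: injected failure")
            if max_delay:
                time.sleep(rng.uniform(0, max_delay))
            return fn(*args, **kwargs)
        return wrapper
    return decorator

@chaos(failure_rate=0.5, seed=42)  # deterministic for the demo
def place_order(qty):
    """Hypothetical trading-system call we want to test resilience against."""
    return {"status": "filled", "qty": qty}

# Exercise the flaky dependency and count outcomes, as a resilience test would.
results = {"ok": 0, "failed": 0}
for _ in range(100):
    try:
        place_order(1)
        results["ok"] += 1
    except RuntimeError:
        results["failed"] += 1
```

A real chaos experiment would target infrastructure (terminating instances, severing network links) rather than wrapping functions, and would verify that the system's steady-state metrics survive the injected faults.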
Read more
Wissen Technology

at Wissen Technology

4 recruiters
Tony Tom
Posted by Tony Tom
Pune
3 - 12 yrs
₹9L - ₹32L / yr
skill iconPython
Automation
Algorithmic trading
Investment banking
Selenium

Domain: Investment Banking or Electronic Trading experience is mandatory


  • Develop (Python/pytest) automation tests across all components (e.g. API testing, client-server testing, E2E testing, etc.) to meet product requirements and customer usages
  • Hands-on experience in Python
  • Proficiency in test automation frameworks and tools such as Selenium and Cucumber
  • Experience working in Microsoft Windows and Linux environments
  • Experience using Postman and automated API testing
  • Experience designing & executing load/stress and performance testing
  • Experience using test case & test execution management tools, issue management tools (e.g. Jira), and development environments (like Visual Studio, IntelliJ, or Eclipse)
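To make the framework expectations concrete, here is a minimal pytest-style sketch of API test automation: a thin client takes its transport as a dependency, so tests can substitute a stub for the real HTTP layer (requests, httpx, etc.). The endpoint and payloads are invented for illustration:

```python
import json

class ApiClient:
    """Thin client around a transport callable, so tests can inject a stub."""
    def __init__(self, transport):
        self.transport = transport

    def get_order(self, order_id):
        status, body = self.transport("GET", f"/orders/{order_id}")
        if status != 200:
            raise LookupError(f"order {order_id} not found")
        return json.loads(body)

# Stub transport standing in for the real trading service.
def fake_transport(method, path):
    if path == "/orders/42":
        return 200, json.dumps({"id": 42, "state": "FILLED"})
    return 404, ""

# pytest-style tests: plain functions with bare asserts, collectable by pytest.
def test_get_order_ok():
    client = ApiClient(fake_transport)
    assert client.get_order(42)["state"] == "FILLED"

def test_get_order_missing():
    client = ApiClient(fake_transport)
    try:
        client.get_order(7)
        assert False, "expected LookupError"
    except LookupError:
        pass

test_get_order_ok()
test_get_order_missing()
```

With pytest installed, the two `test_*` functions would be discovered and run automatically; the explicit calls at the bottom just make the sketch self-running.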


Read more
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Pune, Hyderabad, Ahmedabad, Chennai
3 - 7 yrs
₹8L - ₹15L / yr
AWS Lambda
Amazon S3
Amazon VPC
Amazon EC2
Amazon Redshift
+3 more

Technical Skills:


  • Ability to understand and translate business requirements into design.
  • Proficient in AWS infrastructure components such as S3, IAM, VPC, EC2, and Redshift.
  • Experience in creating ETL jobs using Python/PySpark.
  • Proficiency in creating AWS Lambda functions for event-based jobs.
  • Knowledge of automating ETL processes using AWS Step Functions.
  • Competence in building data warehouses and loading data into them.


Responsibilities:


  • Understand business requirements and translate them into design.
  • Assess AWS infrastructure needs for development work.
  • Develop ETL jobs using Python/PySpark to meet requirements.
  • Implement AWS Lambda for event-based tasks.
  • Automate ETL processes using AWS Step Functions.
  • Build data warehouses and manage data loading.
  • Engage with customers and stakeholders to articulate the benefits of proposed solutions and frameworks.
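As a rough illustration of the Lambda-based ETL flow described above, the sketch below shows an event-driven handler that transforms records and reports what it loaded. A real handler would fetch the object referenced in the S3 event via boto3 and load the result into Redshift; here the payload is inlined so the transform logic stays locally testable, and the field names are hypothetical:

```python
import json

def transform(record):
    """Toy transformation step: normalise field names and types."""
    return {"id": int(record["ID"]), "amount": float(record["Amt"])}

def lambda_handler(event, context=None):
    """Sketch of an event-driven ETL Lambda. In AWS, `event` would be the
    S3/Step Functions payload and the rows would be read via boto3."""
    rows = event["records"]
    out = [transform(r) for r in rows]
    return {
        "statusCode": 200,
        "body": json.dumps({"loaded": len(out), "rows": out}),
    }

# Invoke locally with a sample event, the way a unit test would.
sample_event = {"records": [{"ID": "1", "Amt": "10.50"}, {"ID": "2", "Amt": "3.1"}]}
result = lambda_handler(sample_event)
```

Keeping the transform pure and the AWS I/O at the edges is what makes this style of Lambda easy to unit-test before wiring it into Step Functions.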
Read more
Publicis Sapient

at Publicis Sapient

10 recruiters
Mohit Singh
Posted by Mohit Singh
Bengaluru (Bangalore), Pune, Hyderabad, Gurugram, Noida
5 - 11 yrs
₹20L - ₹36L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+7 more

Publicis Sapient Overview:

As a Senior Associate in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and you will independently drive design discussions to ensure the overall health of the solution.

Job Summary:

As a Senior Associate L2 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and you will independently drive design discussions to ensure the overall health of the solution.

The role requires a hands-on technologist with a strong programming background in Java / Scala / Python, experience in data ingestion, integration and wrangling, computation, and analytics pipelines, and exposure to Hadoop ecosystem components. You are also required to have hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms.


Role & Responsibilities:

Your role is focused on Design, Development and delivery of solutions involving:

• Data Integration, Processing & Governance

• Data Storage and Computation Frameworks, Performance Optimizations

• Analytics & Visualizations

• Infrastructure & Cloud Computing

• Data Management Platforms

• Implement scalable architectural models for data processing and storage

• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time mode

• Build functionality for data analytics, search and aggregation
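The ingestion-and-aggregation responsibilities above can be sketched, in miniature, as a two-stage pipeline: a parsing stage that tolerates bad records and an aggregation stage that rolls up values per key. In production this shape maps onto Spark or Flink jobs; the pure-Python version below is only for illustration, and the record format is invented:

```python
from collections import defaultdict

def ingest(source):
    """Generator stage: parse raw lines into records, skipping bad ones."""
    for line in source:
        parts = line.strip().split(",")
        if len(parts) == 2:
            yield {"key": parts[0], "value": float(parts[1])}

def aggregate(records):
    """Aggregation stage: running sum per key, as a micro-batch job would do."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["key"]] += rec["value"]
    return dict(totals)

# Batch source stands in for files / Kafka messages; one line is malformed
# and is silently dropped by the ingest stage.
raw = ["a,1.0", "b,2.5", "a,3.0", "malformed"]
totals = aggregate(ingest(raw))
```

The generator-based ingest stage processes records lazily, so the same pipeline shape works for both batch files and a streaming source.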

Experience Guidelines:

Mandatory Experience and Competencies:

# Competency

1. Overall 5+ years of IT experience with 3+ years in data-related technologies

2. Minimum 2.5 years of experience in Big Data technologies and working exposure to related data services on at least one cloud platform (AWS / Azure / GCP)

3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow and other components required in building end-to-end data pipelines

4. Strong experience in at least one of the programming languages Java, Scala, or Python; Java preferred

5. Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.

6. Well-versed, working knowledge of data-platform-related services on at least one cloud platform, IAM, and data security


Preferred Experience and Knowledge (Good to Have):

# Competency

1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres) with hands-on experience

2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.

3. Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search & indexing, and microservices architectures

4.Performance tuning and optimization of data pipelines

5.CI/CD – Infra provisioning on cloud, auto build & deployment pipelines, code quality

6.Cloud data specialty and other related Big data technology certifications


Personal Attributes:

• Strong written and verbal communication skills

• Articulation skills

• Good team player

• Self-starter who requires minimal oversight

• Ability to prioritize and manage multiple tasks

• Process orientation and the ability to define and set up processes


Read more
MangoApps

at MangoApps

29 recruiters
Dhhruval Modi
Posted by Dhhruval Modi
Pune
3 - 5 yrs
₹5L - ₹15L / yr
skill iconPython
FastAPI
Large Language Models (LLM) tuning
skill iconFlask

A modern work platform means a single source of truth for your desk and deskless employees alike, where everything they need is organized and easy to find.


MangoApps was designed to unify your employee experience by combining intranet, communication, collaboration, and training into one intuitive, mobile-accessible workspace.


We are looking for a highly capable machine learning engineer to optimize our machine learning systems. You will be evaluating existing machine learning (ML) processes, performing statistical analysis to resolve data set problems, and enhancing the accuracy of our AI software's predictive automation capabilities.


To ensure success as a machine learning engineer, you should demonstrate solid data science knowledge and experience in a related ML role. A machine learning engineer will be someone whose expertise translates into the enhanced performance of predictive automation software.


AI/ML Engineer Responsibilities:

 

  • Designing machine learning systems and self-running artificial intelligence (AI) software to automate predictive models.
  • Transforming data science prototypes and applying appropriate ML algorithms and tools.
  • Ensuring that algorithms generate accurate user recommendations.
  • Turning unstructured data into useful information with auto-tagging of images and text-to-speech conversions.
  • Solving complex problems with multi-layered data sets, as well as optimizing existing machine learning libraries and frameworks.
  • Applying ML algorithms to huge volumes of historical data to make predictions.
  • Running tests, performing statistical analysis, and interpreting test results.
  • Documenting machine learning processes.
  • Keeping abreast of developments in machine learning.


AI/ML Engineer Requirements:

 

  • Bachelor's degree in computer science, data science, mathematics, or a related field, with at least 3+ years of experience as an AI/ML Engineer
  • Advanced proficiency in Python and the FastAPI framework, along with good exposure to libraries like scikit-learn, Pandas, and NumPy
  • Experience working with ChatGPT and LangChain (must-have), Large Language Models (good to have), and Knowledge Graphs
  • Extensive knowledge of ML frameworks, libraries, data structures, data modelling, and software architecture.
  • In-depth knowledge of mathematics, statistics, and algorithms.
  • Superb analytical and problem-solving abilities.
  • Great communication and collaboration skills.
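At its simplest, the "predictive model" at the heart of such work is a function fitted to historical data. The sketch below implements ordinary least squares for a single feature in plain Python; libraries like scikit-learn generalise the same idea to many features and far richer models:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b, the simplest predictive model."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope is covariance(x, y) / variance(x); intercept follows from the means.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Fit on perfectly linear 'historical data' (y = 2x + 1) and predict.
a, b = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
predict = lambda x: a * x + b
```

The scikit-learn equivalent is `LinearRegression().fit(X, y)`; the value of seeing the closed form is knowing what the library is computing under the hood.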



Why work with us



  1. We take delight in what we do, and it shows in the products we offer and ratings of our products by leading industry analysts like IDC, Forrester and Gartner OR independent sites like Capterra.
  2. Be part of the team that has a great product-market fit, solving some of the most relevant communication and collaboration challenges faced by big and small organizations across the globe.
  3. MangoApps is a highly collaborative place, and careers at MangoApps come with a lot of growth and learning opportunities. If you’re looking to make an impact, MangoApps is the place for you.
  4. We focus on getting things done and know how to have fun while we do them. We have a team that brings creativity, energy, and excellence to every engagement.
  5. A workplace that was listed as one of the top 51 Dream Companies to work for by World HRD Congress in 2019.
  6. As a group, we are flat and treat everyone the same.

 

Benefits

 

We are a young organization and growing fast. Along with the fantastic workplace culture that helps you meet your career aspirations; we provide some comprehensive benefits.

 

1.     Comprehensive Health Insurance for Family (Including Parents) with no riders attached.

2.     Accident Insurance for each employee.

3.     Sponsored Trainings, Courses and Nano Degrees.


About You


·       Self-motivated: You can work with a minimum of supervision and be capable of strategically prioritizing multiple tasks in a proactive manner.

·       Driven: You are a driven team player, collaborator, and relationship builder whose infectious can-do attitude inspires others and encourages great performance in a fast-moving environment.

·       Entrepreneurial: You thrive in a fast-paced, changing environment and you’re excited by the chance to play a large role.

·       Passionate: You must be passionate about online collaboration and ensuring our clients are successful; we love seeing hunger and ambition.

·       Thrive in a start-up mentality with a “whatever it takes” attitude.

Read more
Publicis Sapient

at Publicis Sapient

10 recruiters
Mohit Singh
Posted by Mohit Singh
Bengaluru (Bangalore), Gurugram, Pune, Hyderabad, Noida
4 - 10 yrs
Best in industry
PySpark
Data engineering
Big Data
Hadoop
Spark
+6 more

Publicis Sapient Overview:

As a Senior Associate in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and you will independently drive design discussions to ensure the overall health of the solution.

Job Summary:

As a Senior Associate L1 in Data Engineering, you will create technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and you will independently drive design discussions to ensure the overall health of the solution.

The role requires a hands-on technologist with a strong programming background in Java / Scala / Python, experience in data ingestion, integration and wrangling, computation, and analytics pipelines, and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is preferred.


Role & Responsibilities:

Job Title: Senior Associate L1 – Data Engineering

Your role is focused on Design, Development and delivery of solutions involving:

• Data Ingestion, Integration and Transformation

• Data Storage and Computation Frameworks, Performance Optimizations

• Analytics & Visualizations

• Infrastructure & Cloud Computing

• Data Management Platforms

• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time

• Build functionality for data analytics, search and aggregation


Experience Guidelines:

Mandatory Experience and Competencies:

# Competency

1. Overall 3.5+ years of IT experience with 1.5+ years in data-related technologies

2. Minimum 1.5 years of experience in Big Data technologies

3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow and other components required in building end-to-end data pipelines. Working knowledge of real-time data pipelines is an added advantage.

4. Strong experience in at least one of the programming languages Java, Scala, or Python; Java preferred

5. Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.


Preferred Experience and Knowledge (Good to Have):

# Competency

1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres) with hands-on experience

2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.

3. Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search & indexing, and microservices architectures

4.Performance tuning and optimization of data pipelines

5.CI/CD – Infra provisioning on cloud, auto build & deployment pipelines, code quality

6. Working knowledge of data-platform-related services on at least one cloud platform, IAM, and data security

7.Cloud data specialty and other related Big data technology certifications


Job Title: Senior Associate L1 – Data Engineering

Personal Attributes:

• Strong written and verbal communication skills

• Articulation skills

• Good team player

• Self-starter who requires minimal oversight

• Ability to prioritize and manage multiple tasks

• Process orientation and the ability to define and set up processes

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Lokesh Manikappa
Posted by Lokesh Manikappa
Pune, Mumbai
5 - 9 yrs
₹5L - ₹15L / yr
Automation
Test Automation (QA)
skill iconJava
skill iconPython
Selenium
+3 more

Dear Connections,


We are hiring! Join our dynamic team as a QA Automation Tester (Python, Java, Selenium, API, SQL, Git)! We're seeking a passionate professional to contribute to our innovative projects. If you thrive in a collaborative environment, possess expertise in Python, Java, Selenium, and Robot Framework, and are ready to make an impact, apply now! Wissen Technology is committed to fostering innovation, growth, and collaboration. Don't miss this chance to be part of something extraordinary.


Company Overview:

Wissen is the preferred technology partner for executing transformational projects and accelerating implementation through thought leadership and a solution mindset. It is a leading IT solutions and consultancy firm dedicated to providing innovative and customized solutions to global clients. We leverage cutting-edge technologies to empower businesses and drive digital transformation.

#jobopportunity #hiringnow #joinourteam #career #wissen #QA #automationtester #robot #apiautomation #sql #java #python #selenium

Read more
A messaging AI platform

A messaging AI platform

Agency job
via Merito by Jinita Sumaria
Pune
12 - 18 yrs
₹35L - ₹50L / yr
Software Development
Product Management
Team Management
Product development
skill iconPython
+2 more

About Company:


Our client is the industry-leading provider of CRM messaging solutions. As a forward-thinking global company, it continues to innovate and develop cutting-edge solutions that redefine how businesses digitally communicate with their customers. It works with 2500 customers across 190 countries with customers ranging from SMBs to large global enterprises.


About the role:


The Director of Product Management is responsible for overseeing and implementing product development policies, objectives, and initiatives as well as leading research for new products, product enhancements, and product design.


Roles & responsibilities:


- Become a product expert on all company's solutions


- Build and own the product roadmap and timeline.


- Develop and execute a go-to-market strategy that addresses product, pricing, messaging, competitive positioning, product launch and promotion.


- Work with Development leaders to oversee development resources, including managing ROI, timelines, and deliverables.


- Work with the leadership team on driving product strategy, in both new and existing products, to increase overall market share, revenue and customer loyalty.


- Implement and communicate the strategic and technical direction for the department.


- Engage directly with customers to understand market needs and product requirements.


- Develop/implement a suite of Key Performance Indicators (KPIs) to measure product performance, including profitability, customer satisfaction metrics, compliance, and delivery efficiency.


- Define and measure value of software solutions to establish and quantify customer ROI.


- Represent the company by visiting customers to solicit feedback on company products and services.


- Monitor and report progress of projects within agreed-upon timeframes.


- Write very high-quality BRDs, PRDs, Epics, and User Stories


- Create functional strategies and specific objectives, as well as develop budgets, policies, and procedures.


- Create and analyze financial proposals related to product development and provide supporting content showing allocation of funds to execute these plans.


- Write status updates, iteration delivery and release notes as necessary


- Display a high level of critical thinking in cross-functional process analysis and problem resolution for new and existing products.


- Develop & conduct specialized training on new products launched and raise awareness & application of relevant subject matter.


- Monitor internal processes for efficiency and validity pre & post product launch/changes.


Requirements:


- Excellent communication skills, both verbal and in writing.


- Strong customer focus paired with exceptional presentation skills.


- Skilled at data analytics focused on identifying opportunities, driving insights, and measuring value.


- Strong problem-solving skills.


- Ability to work effectively in a diverse team environment.


- Proven strategic and tactical leadership, motivation, and decision-making skills


Required Education & Experience:


- Bachelor's Degree in Technology related field.


- Experience in working with a geographically diverse development team.


- Strong technical background with the ability to understand and discuss technical concepts.


- Proven experience in Software Development and Product Management.


- 12+ years of experience leading product teams in a fast-paced business environment as Product Leader on Software Platform or SaaS solution.


- Proven ability to lead and influence cross-functional teams.


- Demonstrated success in delivering high-impact products.


Preferred Qualifications


- Transition from a software development role to product management.


- Experience building messaging solutions or marketing or support solutions.


- Experience with agile development methodologies.


- Familiarity with design thinking principles.


- Knowledge of relevant technologies and industry trends.


- Strong project management skills.

Read more
MNC service based company

MNC service based company

Agency job
via Tekfortune Inc by Ankit Uikey
Indore, Pune, Chennai, Vadodara
7 - 12 yrs
₹10L - ₹24L / yr
skill iconPython
skill iconDjango
skill iconReact.js
skill iconAngular (2+)
SQL
+14 more

Title/Role: Python Django Consultant

Experience: 8+ Years

Work Location: Indore / Pune /Chennai / Vadodara

 

Notice period: Immediate to 15 Days Max

 

Key Skills: Python, Django, Crispy Forms, Authentication, Bootstrap, jQuery, Server Side Rendered, SQL, Azure, React, Django DevOps

 

Job Description:

  • Should have experience creating forms using Django; Crispy Forms experience is a plus.
  • Must have leadership experience.
  • Should have a good understanding of function-based and class-based views.
  • Should have a good understanding of authentication (JWT and token authentication).
  • Django – at least one senior with deep Django experience; the other 1 or 2 can be mid-to-senior Python or Django.
  • FrontEnd – must have React/Angular and CSS experience.
  • Database – ideally SQL, but the most senior should have solid DB experience.
  • Cloud – Azure preferred but agnostic.
  • Consulting / client project background ideal.
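On the authentication point: a JWT is essentially a base64-encoded payload plus an HMAC signature. The stdlib sketch below shows that mechanism end to end (the JWT header is omitted for brevity). In a real Django project you would use a maintained library such as PyJWT or djangorestframework-simplejwt rather than rolling your own; the secret and claims here are placeholders:

```python
import base64
import hashlib
import hmac
import json

SECRET = b"demo-secret"  # in Django this would come from settings, never source

def _b64(data: bytes) -> str:
    """URL-safe base64 without padding, as JWTs use."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def issue_token(payload: dict) -> str:
    """Sign a payload the way a JWT's HMAC-SHA256 flavour does."""
    body = _b64(json.dumps(payload, sort_keys=True).encode())
    sig = _b64(hmac.new(SECRET, body.encode(), hashlib.sha256).digest())
    return f"{body}.{sig}"

def verify_token(token: str) -> dict:
    """Recompute the signature and reject tampered tokens."""
    body, sig = token.split(".")
    expected = _b64(hmac.new(SECRET, body.encode(), hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    pad = "=" * (-len(body) % 4)  # restore base64 padding for decoding
    return json.loads(base64.urlsafe_b64decode(body + pad))

token = issue_token({"user_id": 7})
claims = verify_token(token)
```

The point of the sketch is the interview-level intuition: the server never stores the token, it just re-derives and compares the signature, which is why a leaked signing secret compromises every token.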

 

Django Stack:

  • Django
  • Server Side Rendered HTML
  • Bootstrap
  • jQuery
  • Azure SQL
  • Azure Active Directory
  • Server-side rendered HTML with jQuery is older tech, but it is what we are comfortable with for internal tools. This is a good combination of a late-adopter, agile stack integrated within an enterprise. Potentially we can push to React for some discrete projects or pages that need more dynamism.

 

Django Devops:

  • Should have expertise with deploying and managing Django in Azure.
  • Django deployment to Azure via Docker.
  • Django connection to Azure SQL.
  • Django auth integration with Active Directory.
  • Terraform scripts to make this setup seamless.
  • Easy, proven deployment/setup on AWS and GCP.
  • Load balancing, more advanced services, task queues, etc.


Read more
Ignite Solutions

at Ignite Solutions

6 recruiters
Meghana Dhamale
Posted by Meghana Dhamale
Remote, Pune
5 - 7 yrs
₹15L - ₹20L / yr
skill iconPython
LinkedIn
skill iconDjango
skill iconFlask
skill iconAmazon Web Services (AWS)
+2 more

We are looking for a hands-on technical expert who has worked with multiple technology stacks and has experience architecting and building scalable cloud solutions with web and mobile frontends. 

 What will you work on?

  •  Interface with clients
  • Recommend tech stacks
  • Define end-to-end logical and cloud-native architectures
  •  Define APIs
  • Integrate with 3rd party systems
  • Create architectural solution prototypes
  • Hands-on coding, team lead, code reviews, and problem-solving

What Makes You A Great Fit?

  • 5+ years of software experience 
  • Experience architecting technology systems, with hands-on expertise in backend and web or mobile frontend development
  • Solid expertise and hands-on experience in Python with Flask or Django
  • Expertise on one or more cloud platforms (AWS, Azure, Google App Engine)
  • Expertise with SQL and NoSQL databases (MySQL, Mongo, ElasticSearch, Redis)
  • Knowledge of DevOps practices
  • Chatbot, Machine Learning, Data Science/Big Data experience will be a plus
  • Excellent communication skills, verbal and written

The job is for a full-time position at our Pune (Viman Nagar) office. 

(Note: We are working remotely at the moment. However, once the COVID situation improves, the candidate will be expected to work from our office.)

Read more
Fullness Web Solutions

at Fullness Web Solutions

2 candid answers
Vidhu Bajaj
Posted by Vidhu Bajaj
Pune
2 - 3 yrs
₹6L - ₹8L / yr
skill iconPython
AWS Lambda

Hiring alert 🚨


Calling all #PythonDevelopers looking for an #ExcitingJobOpportunity 🚀 with one of our #Insurtech clients.


Are you a Junior Python Developer eager to grow your skills in #BackEnd development?


Our company is looking for someone like you to join our dynamic team. If you're passionate about Python and ready to learn from seasoned developers, this role is for you!


📣 About the company


The client is a fast-growing consultancy firm, helping P&C Insurance companies on their digital journey. With offices in Mumbai and New York, they're at the forefront of insurance tech. Plus, they offer a hybrid work culture with flexible timings, typically between 9 to 5, to accommodate your work-life balance.


💡 What you’ll do


📌 Work with other developers.

📌 Implement Python code with assistance from senior developers.

📌 Write effective test cases, such as unit tests, to ensure the code meets the software design requirements.

📌 Ensure Python code when executed is efficient and well written.

📌 Refactor old Python code to ensure it follows modern principles.

📌 Liaise with stakeholders to understand the requirements.

📌 Ensure integration can take place with front end systems.

📌 Identify and fix code where bugs have been identified.


🔎 What you’ll need


📌 Minimum 3 years of experience writing AWS Lambda functions using Python

📌 Knowledge of other AWS services like CloudWatch and API Gateway

📌 Fundamental understanding of Python and its frameworks.

📌 Ability to write simple SQL queries

📌 Familiarity with AWS Lambda deployment

📌 The ability to problem-solve.

📌 Fast learner with an ability to adapt techniques based on requirements.

📌 Knowledge of how to effectively test Python code.

📌 Great communication and collaboration skills.
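For the "simple SQL queries" requirement, a filtered aggregate is a representative example. The sketch below uses an in-memory SQLite database with an invented `policies` table standing in for a real insurance data store; in AWS the same query would typically run against RDS or Redshift:

```python
import sqlite3

# In-memory database standing in for the real policy store (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE policies (id INTEGER PRIMARY KEY, premium REAL, state TEXT)"
)
conn.executemany(
    "INSERT INTO policies (premium, state) VALUES (?, ?)",
    [(120.0, "ACTIVE"), (80.0, "LAPSED"), (200.0, "ACTIVE")],
)

# The kind of simple query the role calls for: an aggregate with a filter,
# using parameter substitution rather than string formatting.
active_count, active_premium = conn.execute(
    "SELECT COUNT(*), SUM(premium) FROM policies WHERE state = ?", ("ACTIVE",)
).fetchone()
```

Using `?` placeholders instead of f-strings is the habit to demonstrate here: it prevents SQL injection regardless of which database sits behind the query.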

 

Read more
IGraft Global hair  Skin Services
Pune
1 - 2 yrs
₹4L - ₹5L / yr
skill iconVue.js
skill iconAngularJS (1.x)
skill iconAngular (2+)
skill iconReact.js
skill iconJavascript
+10 more

Full Stack Developer Job Description


Position: Full Stack Developer

Department: Technology/Engineering

Location: Pune

Type: Full Time


Job Overview:

As a Full Stack Developer at Invvy Consultancy & IT Solutions, you will be responsible for both front-end and back-end development, playing a crucial role in designing and implementing user-centric web applications. You will collaborate with cross-functional teams including designers, product managers, and other developers to create seamless, intuitive, and high-performance digital solutions.


Responsibilities:


Front-End Development:

Develop visually appealing and user-friendly front-end interfaces using modern web technologies such as C#, HTML5, CSS3, and JavaScript frameworks (e.g., React, Angular, Vue.js).

Collaborate with UX/UI designers to ensure the best user experience and responsive design across various devices and platforms.

Implement interactive features, animations, and dynamic content to enhance user engagement.

Optimize application performance for speed and scalability.


Back-End Development:

Design, develop, and maintain the back-end architecture using server-side technologies (e.g., Node.js, Python, Ruby on Rails, Java, .NET).

Create and manage databases, including data modeling, querying, and optimization.

Implement APIs and web services to facilitate seamless communication between front-end and back-end systems.

Ensure security and data protection by implementing proper authentication, authorization, and encryption measures.

Collaborate with DevOps teams to deploy and manage applications in cloud environments (e.g., AWS, Azure, Google Cloud).


Qualifications:

Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).

Proven experience as a Full Stack Developer or similar role.

Proficiency in front-end development technologies like HTML5, CSS3, JavaScript, and popular frameworks (React, Angular, Vue.js, etc.).

Strong experience with back-end programming languages and frameworks (Node.js, Python, Ruby on Rails, Java, .NET, etc.).

Familiarity with database systems (SQL and NoSQL) and their integration with web applications.

Knowledge of web security best practices and application performance optimization.

Read more
Get to hear about interesting companies hiring right now
Why apply via Cutshort?
Connect with actual hiring teams and get their fast response. No spam.
Find more jobs