
50+ Python Jobs in India

Apply to 50+ Python Jobs on CutShort.io. Find your next job, effortlessly. Browse Python Jobs and apply today!

Deqode

at Deqode

1 recruiter
Posted by Purvisha Bhavsar
Gurugram, Delhi, Noida, Ghaziabad, Faridabad
6 - 10 yrs
₹5L - ₹15L / yr
Google Cloud Platform (GCP)
Python
PySpark
.NET
Scala

🚀 Hiring: Data Engineer | GCP + Spark + Python + .NET | 6–10 Yrs | Gurugram (Hybrid)


We’re looking for a skilled Data Engineer with strong hands-on experience in GCP, Spark-Scala, Python, and .NET.


📍 Location: Suncity, Sector 54, Gurugram (Hybrid – 3 days onsite)

💼 Experience: 6–10 Years

⏱️ Notice Period: Immediate joiners only


Required Skills:

  • 5+ years of experience in distributed computing (Spark) and software development.
  • 3+ years of experience in Spark-Scala.
  • 5+ years of experience in data engineering.
  • 5+ years of experience in Python.
  • Fluency in working with databases (preferably Postgres).
  • Sound understanding of object-oriented programming and development principles.
  • Experience working in an Agile Scrum or Kanban development environment.
  • Experience working with version control software (preferably Git).
  • Experience with CI/CD pipelines.
  • Experience with automated testing, including integration/delta, load, and performance testing.
Read more
Hyderabad
10 - 15 yrs
₹20L - ₹50L / yr
Python
Legacy systems
CI/CD
Django
Microservices
+3 more

Position of the role

The Principal Engineer reports to the Domain Lead and plays a key role in driving the technical vision, strategy, and execution of development initiatives. The Principal Engineer collaborates with multiple teams to ensure that software solutions are scalable, cost-optimized, performant, and aligned with business objectives.

 

Result expectation in terms of result areas and core activities

The Principal Engineer is responsible for defining, designing, and overseeing the implementation of complex software solutions. This role involves deep technical expertise, mentorship, and architectural guidance across multiple teams. The Principal Engineer also acts as a thought leader, influencing technology choices, best practices, and innovation within the organization.

 

Main objectives of the role

✔ Leading the design and architecture of the software by following best practices that ensure scalability, maintainability, cost optimization, and high performance.

✔ Driving innovation and continuous improvements in software development practices.

✔ Providing technical mentorship and coaching to engineers across teams.

✔ Ensuring the successful delivery of high-quality software aligned with business requirements.

✔ Defining and maintaining coding standards, best practices, and governance frameworks.

✔ Collaborating with product management and other stakeholders to shape the technical roadmap.

✔ Identifying technical debt and implementing strategies to mitigate it effectively.

✔ Promoting a culture of continuous learning, knowledge sharing, and cross-team collaboration.

✔ Leading DevOps, CI/CD, and automation to improve software delivery processes and efficiency.

 

Specialisation

✔ Deep understanding of software architecture, system design, and performance optimization.

✔ Translating complex business requirements into scalable and efficient software solutions.

✔ Handling large data transformations and ensuring system efficiency under the required load conditions.

✔ Leading initiatives for modernizing the technology stack and implementing best practices.

✔ Ensuring that security, scalability, and maintainability are embedded into development processes.

✔ Driving research and development efforts to explore emerging technologies and their business impact.

✔ Enabling teams to develop and maintain high-quality software through code reviews, architecture guidance, and technical strategy.

✔ Collaborating with product owners and stakeholders to ensure that development aligns with business goals and user needs.

 

Key processes in the role

✔ Agile / Scrum / Kanban development methodologies.

✔ CI/CD and DevOps practices to streamline delivery.

✔ Cloud-native architecture, monolith and microservices-based development.

✔ Scalable and high-performance computing strategies.

✔ Secure software development lifecycle (SDLC).

✔ Data-driven decision-making.

✔ Performance optimization.

 

Key relationships (teams and/or position titles)

✔ Engineering Teams (Developers, QA, DevOps).

✔ Head of Engineering and Technology Leadership.

✔ Product Management and Business Stakeholders.

✔ Customer Success and Solution Architects.

✔ External Technical Partners and Vendors.

 

Requirements

✔ Bachelor's or Master’s degree in Computer Science, Software Engineering, or a related field.

✔ 10+ years of experience in software development, with a proven track record.

✔ Strong experience in Python and modern software engineering practices.

✔ Expertise in cloud computing platforms (AWS, Azure, or GCP).

✔ Experience in architecting and developing scalable, high-performance applications.

✔ Hands-on experience with CI/CD pipelines, DevOps tools, and automation.

✔ Deep understanding of microservices, monoliths, APIs, and distributed systems.

✔ Strong experience with database technologies, including SQL and NoSQL.

✔ Excellent communication and leadership skills, with the ability to influence technical decisions across teams.

✔ A passion for mentoring, coaching, and fostering a strong engineering culture.

✔ Experience in defining and implementing technical governance and best practices.

 

Competences

✔ Strategic Thinking - 4

✔ Analytical Problem Solving - 4

✔ Technical Leadership & Mentorship - 4

✔ Communication & Collaboration - 3

 

Measures of success

✔ Technical leadership impact - measurable improvements in code quality, architecture, and scalability.

✔ Delivery of high-quality software within agreed timelines and business requirements.

✔ Successful mentorship - improvement in team skill levels, problem-solving capabilities, and innovation.

✔ Reduction of technical debt through strategic refactoring and modernization.

✔ Engineering team satisfaction - based on feedback and collaboration effectiveness.

✔ Improvements in system performance, stability, cost-optimization, and security.

✔ Adoption of best practices and emerging technologies across teams.

✔ Contribution to company-wide strategic initiatives through technical innovation and leadership.

Read more
Indigrators solutions
Hyderabad
4 - 9 yrs
₹10L - ₹30L / yr
Large Language Models (LLM)
Open-source LLMs
SageMaker
Bedrock
Python

Job Title: Senior AI Engineer 

Job Summary: 

We are seeking experienced Senior AI Engineers to join our AI team and drive the design, development, and deployment of cutting-edge AI solutions. You will work on exciting projects, such as sentiment analysis for support tickets, automated data insights, conversational interfaces, and zero-touch planning using AI. This role requires close collaboration with cross-functional teams, including Product and Data Engineering, to deliver impactful AI-driven features that transform our platform. 

Key Responsibilities: 

  • Design, develop, deploy, and maintain ML models and AI infrastructure. 
  • Collaborate with cross-functional teams to integrate ML models into production workflows. 
  • Utilize AWS services, including SageMaker and Bedrock, for model deployment and real-time monitoring. 
  • Implement and manage CI/CD pipelines to ensure efficient and reliable model deployment. 
  • Stay updated with the latest advancements in machine learning and AI best practices. 
  • Monitor and optimise model performance, addressing issues related to scalability and efficiency. 
  • Troubleshoot and resolve problems related to ML models and infrastructure. 

Requirements: 

  • 5+ years of professional experience with Python programming. 
  • Hands-on experience with Machine Learning Operations (MLOps). 
  • Proven expertise in data engineering and ETL processes. 
  • Strong knowledge of AWS services, including SageMaker and Bedrock. 
  • Proficiency in setting up and managing Docker and CI/CD pipelines. 
  • Experience with large language models (LLMs) and prompt engineering. 
  • Familiarity with model performance monitoring and optimization techniques. 
  • Strong problem-solving skills and the ability to work in a fast-paced, collaborative environment. 
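As one hedged illustration of the monitoring responsibility above, a sliding-window accuracy check is a common, framework-free way to flag model degradation in production. The window size, threshold, and sample predictions below are invented for the sketch:

```python
from collections import deque

# Sliding-window accuracy monitor: flags degradation when recent accuracy
# drops below a threshold. Window size and threshold are illustrative.
class RollingAccuracyMonitor:
    def __init__(self, window=100, threshold=0.8):
        self.results = deque(maxlen=window)
        self.threshold = threshold

    def record(self, predicted, actual):
        self.results.append(predicted == actual)

    @property
    def accuracy(self):
        return sum(self.results) / len(self.results) if self.results else 1.0

    @property
    def degraded(self):
        return self.accuracy < self.threshold

mon = RollingAccuracyMonitor(window=10, threshold=0.8)
for predicted, actual in [(1, 1)] * 7 + [(1, 0)] * 3:  # 7 correct, 3 wrong
    mon.record(predicted, actual)
print(mon.accuracy, mon.degraded)  # 0.7 True
```

In a real MLOps setup the same check would typically feed an alerting system (e.g., CloudWatch) rather than a print statement.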
Read more
Data Havn

Agency job
via Infinium Associate by Toshi Srivastava
Noida
5 - 9 yrs
₹40L - ₹60L / yr
Python
SQL
Data engineering
Snowflake
ETL
+5 more

About the Role:

We are seeking a talented Lead Data Engineer to join our team and play a pivotal role in transforming raw data into valuable insights. As a Data Engineer, you will design, develop, and maintain robust data pipelines and infrastructure to support our organization's analytics and decision-making processes.

Responsibilities:

  • Data Pipeline Development: Build and maintain scalable data pipelines to extract, transform, and load (ETL) data from various sources (e.g., databases, APIs, files) into data warehouses or data lakes.
  • Data Infrastructure: Design, implement, and manage data infrastructure components, including data warehouses, data lakes, and data marts.
  • Data Quality: Ensure data quality by implementing data validation, cleansing, and standardization processes.
  • Team Management: Ability to lead and manage a team.
  • Performance Optimization: Optimize data pipelines and infrastructure for performance and efficiency.
  • Collaboration: Collaborate with data analysts, scientists, and business stakeholders to understand their data needs and translate them into technical requirements.
  • Tool and Technology Selection: Evaluate and select appropriate data engineering tools and technologies (e.g., SQL, Python, Spark, Hadoop, cloud platforms).
  • Documentation: Create and maintain clear and comprehensive documentation for data pipelines, infrastructure, and processes.
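The pipeline-development and data-quality responsibilities above can be sketched in miniature with the Python standard library. The sample data, cleansing rules, and "sales" table are illustrative, not part of the role:

```python
import csv
import io
import sqlite3

# Toy extract-transform-load run over inline CSV data.
RAW = """id,name,amount
1, Alice ,100
2,Bob,
3,Carol,250
"""

def extract(text):
    # Extract: read rows from a source (here, an in-memory CSV "file")
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform: validate, cleanse, and standardize
    clean = []
    for row in rows:
        if not row["amount"]:            # validation rule: drop rows missing amount
            continue
        clean.append({"id": int(row["id"]),
                      "name": row["name"].strip(),   # standardization
                      "amount": float(row["amount"])})
    return clean

def load(rows, conn):
    # Load: write the cleaned rows into the warehouse table
    conn.execute("CREATE TABLE IF NOT EXISTS sales (id INTEGER, name TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (:id, :name, :amount)", rows)
    return conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW)), conn)
print(loaded)  # 2 -- the row with a missing amount was rejected
```

At production scale the same three stages would run on Spark or a cloud warehouse such as Snowflake, but the shape of the pipeline is the same.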

Skills:

  • Strong proficiency in SQL and at least one programming language (e.g., Python, Java).
  • Experience with data warehousing and data lake technologies (e.g., Snowflake, AWS Redshift, Databricks).
  • Knowledge of cloud platforms (e.g., AWS, GCP, Azure) and cloud-based data services.
  • Understanding of data modeling and data architecture concepts.
  • Experience with ETL/ELT tools and frameworks.
  • Excellent problem-solving and analytical skills.
  • Ability to work independently and as part of a team.

Preferred Qualifications:

  • Experience with real-time data processing and streaming technologies (e.g., Kafka, Flink).
  • Knowledge of machine learning and artificial intelligence concepts.
  • Experience with data visualization tools (e.g., Tableau, Power BI).
  • Certification in cloud platforms or data engineering.


Read more
PGAGI
Javeriya Shaik
Posted by Javeriya Shaik
Remote only
0 - 0.6 yrs
₹2L - ₹2L / yr
Python
Large Language Models (LLM)
Natural Language Processing (NLP)
Deep Learning
FastAPI
+1 more

We're at the forefront of creating advanced AI systems, from fully autonomous agents that provide intelligent customer interaction to data analysis tools that offer insightful business solutions. We are seeking enthusiastic interns who are passionate about AI and ready to tackle real-world problems using the latest technologies.


Duration: 6 months


Perks:

- Hands-on experience with real AI projects.

- Mentoring from industry experts.

- A collaborative, innovative and flexible work environment

After completion of the internship period, there is a chance to get a full-time opportunity as an AI/ML Engineer (up to 12 LPA).


Compensation:

- Joining Bonus: A one-time bonus of INR 2,500 will be awarded upon joining.

- Stipend: Base stipend is INR 8,000/- and can increase up to INR 20,000/- depending on performance metrics.

Key Responsibilities

  • Experience working with Python, LLMs, deep learning, NLP, etc.
  • Utilize GitHub for version control, including pushing and pulling code updates.
  • Work with Hugging Face and OpenAI platforms for deploying models and exploring open-source AI models.
  • Engage in prompt engineering and the fine-tuning process of AI models.

Requirements

  • Proficiency in Python programming.
  • Experience with GitHub and version control workflows.
  • Familiarity with AI platforms such as Hugging Face and OpenAI.
  • Understanding of prompt engineering and model fine-tuning.
  • Excellent problem-solving abilities and a keen interest in AI technology.
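As a rough illustration of the prompt engineering mentioned above, here is a minimal few-shot prompt template in plain Python. The classification task, labels, and example tickets are invented for the sketch:

```python
# Few-shot prompt template: worked examples steer the model's output format.
SYSTEM = ("You are a support assistant. Classify the ticket sentiment "
          "as positive, neutral, or negative.")

FEW_SHOT = [
    ("The update fixed everything, thanks!", "positive"),
    ("Still waiting three days for a reply.", "negative"),
]

def build_prompt(ticket):
    lines = [SYSTEM, ""]
    for text, label in FEW_SHOT:                    # demonstration examples
        lines.append(f"Ticket: {text}\nSentiment: {label}\n")
    lines.append(f"Ticket: {ticket}\nSentiment:")   # model completes the label
    return "\n".join(lines)

prompt = build_prompt("The app crashes every time I log in.")
print(prompt.count("Ticket:"))  # 3
```

The assembled string would then be sent to an OpenAI or Hugging Face model; the template itself is model-agnostic.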


To apply, click the link below and submit the assignment:

https://pgagi.in/jobs/28df1e98-f0c3-4d58-9509-d5b1a4ea9754

Read more
Blitzy

at Blitzy

2 candid answers
1 product
Eman Khan
Posted by Eman Khan
Pune
6 - 10 yrs
₹40L - ₹70L / yr
Python
Django
Flask
FastAPI
Google Cloud Platform (GCP)
+1 more

Requirements

  • 7+ years of experience with Python
  • Strong expertise in Python frameworks (Django, Flask, or FastAPI)
  • Experience with GCP, Terraform, and Kubernetes
  • Deep understanding of REST API development and GraphQL
  • Strong knowledge of SQL and NoSQL databases
  • Experience with microservices architecture
  • Proficiency with CI/CD tools (Jenkins, CircleCI, GitLab)
  • Experience with container orchestration using Kubernetes
  • Understanding of cloud architecture and serverless computing
  • Experience with monitoring and logging solutions
  • Strong background in writing unit and integration tests
  • Familiarity with AI/ML concepts and integration points


Responsibilities

  • Design and develop scalable backend services for our AI platform
  • Architect and implement complex systems with high reliability
  • Build and maintain APIs for internal and external consumption
  • Work closely with AI engineers to integrate ML functionality
  • Optimize application performance and resource utilization
  • Make architectural decisions that balance immediate needs with long-term scalability
  • Mentor junior engineers and promote best practices
  • Contribute to the evolution of our technical standards and processes
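The API-building responsibility above can be sketched with a minimal JSON endpoint. This is written against the stdlib WSGI interface (PEP 3333) rather than an actual FastAPI/Flask route, and the "models" resource and its data are invented; the callable is exercised directly, so no server is needed:

```python
import json

# Minimal REST-style JSON endpoint as a WSGI callable.
MODELS = {"1": {"id": "1", "name": "sentiment-v2"}}

def app(environ, start_response):
    path = environ["PATH_INFO"]
    # Look up /models/<id>; anything else is a 404
    model = MODELS.get(path.rsplit("/", 1)[-1]) if path.startswith("/models/") else None
    status = "200 OK" if model else "404 Not Found"
    payload = json.dumps(model if model else {"error": "not found"}).encode()
    start_response(status, [("Content-Type", "application/json"),
                            ("Content-Length", str(len(payload)))])
    return [payload]

# Exercise the endpoint by calling the WSGI callable directly
captured = {}
def start_response(status, headers):
    captured["status"] = status

resp = b"".join(app({"PATH_INFO": "/models/1"}, start_response))
print(captured["status"])  # 200 OK
```

A framework route would replace the manual path parsing and header bookkeeping, but the request-in, JSON-out contract is the same.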
Read more
Cognida

at Cognida

2 candid answers
Swathi P
Posted by Swathi P
Remote only
4 - 10 yrs
₹15L - ₹40L / yr
Python
Django

Position: Python Backend Developer


Location: Hyderabad/ Hybrid


Job Description:


We are seeking a skilled and motivated Python Backend Developer to join our team. In this role, you will be responsible for designing, building and maintaining high-performance, scalable and secure backend systems. The ideal candidate will have a strong understanding of Python programming and web development frameworks.


Responsibilities:


Design, build and maintain efficient, reusable, and reliable Python code

Integrate user-facing elements developed by front-end developers with server-side logic

Write clean, efficient and well-documented code, following best practices and industry standards

Develop and implement RESTful APIs and microservices

Ensure the performance, scalability and security of the backend systems

Collaborate with cross-functional teams to develop, test, and deploy new features

Troubleshoot and debug issues in the backend systems

Keep up-to-date with the latest industry trends and technologies

Requirements:


Strong experience in Python programming and web development frameworks (such as Flask, Django, etc.)

Experience with RESTful API development and microservices architecture

Knowledge of SQL and NoSQL databases (such as PostgreSQL, MongoDB, etc.)

Experience with cloud computing platforms (such as AWS, Google Cloud, etc.)

Strong understanding of security principles and how to implement them in a web environment

Ability to write clean, maintainable, and efficient code

Excellent problem-solving and communication skills

B.Tech/M.Tech/MS in Computer Science or related field, or equivalent experience.


Read more
SDS softwares
Tanavee Sharma
Posted by Tanavee Sharma
Remote only
0 - 1.5 yrs
₹1L - ₹2.2L / yr
HTML/CSS
JavaScript
React.js
Node.js
Tailwind CSS
+12 more

💼 Job Title: Full Stack Developer (*Fresher/experienced*)

🏢 Company: SDS Softwares

💻 Location: Work from Home

💸 Salary range: ₹7,000 - ₹18,000 per month (based on knowledge and interview)

🕛 Shift Timings: 12 PM to 9 PM


About the role: As a Full Stack Developer, you will work on both the front-end and back-end of web applications. You will be responsible for developing user-friendly interfaces and maintaining the overall functionality of our projects.


⚜️ Key Responsibilities:

- Collaborate with cross-functional teams to define, design, and ship new features.

- Develop and maintain high-quality web applications (frontend + backend).

- Troubleshoot and debug applications to ensure peak performance.

- Participate in code reviews and contribute to the team’s knowledge base.


⚜️ Required Skills:

- Proficiency in HTML, CSS, JavaScript, React.js for front-end development. ✅

- Understanding of server-side languages such as Node.js, Python, or PHP. ✅

- Familiarity with database technologies such as MySQL, MongoDB, or PostgreSQL. ✅

- Basic knowledge of version control systems, particularly Git.

- Strong problem-solving skills and attention to detail.

- Excellent communication skills and a team-oriented mindset.


💠 Qualifications:

- Recent graduates or individuals with internship experience (6 months to 1.5 years) in software development.

- Must have a personal laptop and stable internet connection.

- Ability to join immediately is preferred.


If you are passionate about coding and eager to learn, we would love to hear from you. 👍


Read more
Cambridge Wealth (Baker Street Fintech)
Sangeeta Bhagwat
Posted by Sangeeta Bhagwat
Pune
2 - 4 yrs
₹4.2L - ₹7.5L / yr
Flutter
React Native
RESTful APIs
Python

Who are we a.k.a “About Cambridge Wealth” :


We are an early stage Fintech Startup - working on exciting Fintech Products for some of the Top 5 Global Banks and building our own. If you are looking for a place where you can make a mark and not just be a cog in the wheel, Bakerstreet Fintech might be the place for you. We have a flat, ownership-oriented culture, and deliver world class quality. You will be working with a founding team that has delivered over 26 industry leading product experiences and won the Webby awards for Digital Strategy. In short, a bleeding edge team.


What are we looking for a.k.a “The JD” :

We are looking for a motivated and energetic Flutter Intern who will be designing and building product application features across various cross-platform devices. Just like Lego bricks that fit on top of one another, we are looking for someone who has experience using Flutter widgets that can be plugged together, customised, and deployed anywhere.


What will you be doing at CW a.k.a “Your Responsibilities” :

  • Create multi-platform apps for iOS / Android using Flutter Development Framework.
  • Participate in the analysis, design, implementation, and testing of new apps.
  • Apply industry standards during the development process to ensure high quality.
  • Translate designs and wireframes into high quality code.
  • Ensure the best possible performance, quality, and responsiveness of the application.
  • Help maintain code quality, organisation, and automation.
  • Work on bug fixing and improving application performance

What should our ideal candidate have a.k.a “Your Requirements”:

  • Knowledge of mobile app development.
  • Worked at any stage startup or have developed projects of their own ideas.
  • Good knowledge of Flutter and interest in developing mobile applications.
  • Available for full time (in-office) internship.

Not sure whether you should apply? Here's a quick checklist to make things easier. You are someone who:

  • You are ready to be a part of a Zero To One Journey which implies that you shall be involved in building fintech products and processes from the ground up.
  • You are comfortable to work in an unstructured environment with a small team where you decide what your day looks like and take initiative to take up the right piece of work, own it and work with the founding team on it.
  • This is not an environment where someone will be checking up on you every few hours. It is up to you to schedule check-ins whenever you find the need to; otherwise we assume you are progressing well with your tasks. You will be expected to find solutions to problems and suggest improvements.
  • You want complete ownership for your role & be able to drive it the way you think is right. You are looking to stick around for the long term and grow with the company.
  • You have the ability to be a self-starter and take ownership of deliverables to develop a consensus with the team on approach and methods and deliver to them.


Speed-track your application process by completing the 40 min test at the link below:

https://app.testgorilla.com/s/itrlc3m2


On successfully clearing the above, there is a 20-30 min video interview followed by a technical interview and a meeting with the founder at the office. You may be requested to complete a brief in-person exercise at that point.


Please note that this is an on-site/work-from-office opportunity at our headquarters at Prabhat Road, Pune.

Read more
Cognida

at Cognida

2 candid answers
Keshav Kumar
Posted by Keshav Kumar
Remote, Hyderabad
5 - 10 yrs
₹20L - ₹40L / yr
Python
SQL
Django
FastAPI
Flask
+2 more

About Cognida.ai


Our Purpose is to boost your competitive advantage using AI and Analytics.

We Deliver tangible business impact with data-driven insights powered by AI. Drive revenue growth, increase profitability and improve operational efficiencies.

We Are technologists with keen business acumen - forever curious, always on the front lines of technological advancements, applying our latest learnings and tools to solve your everyday business challenges.

We Believe the power of AI should not be the exclusive preserve of the few. Every business, regardless of its size or sector, deserves the opportunity to harness the power of AI to make better decisions and drive business value.

We See a world where our AI and Analytics solutions democratize decision intelligence for all businesses. With Cognida.ai, our motto is ‘No enterprise left behind’.


Qualifications


Bachelor's degree (BE/B.Tech) or MCA


Required Experience: 5 years


Position: Python Backend Developer


Location: Hyderabad/ Hybrid/ Remote


If you are interested, please reach me on keshav.kumar (at) cognida.ai


Job Description:


We are seeking a skilled and motivated Python Backend Developer to join our team. In this role, you will be responsible for designing, building and maintaining high-performance, scalable and secure backend systems. The ideal candidate will have a strong understanding of Python programming and web development frameworks.


Responsibilities:


Design, build and maintain efficient, reusable, and reliable Python code

Integrate user-facing elements developed by front-end developers with server-side logic.

Write clean, efficient and well-documented code, following best practices and industry standards.

Develop and implement RESTful APIs and microservices.

Ensure the performance, scalability and security of the backend systems.

Collaborate with cross-functional teams to develop, test, and deploy new features.

Troubleshoot and debug issues in the backend systems.

Keep up-to-date with the latest industry trends and technologies.


Requirements:


Strong experience in Python programming and web development frameworks (such as Flask, Django, etc.)

Experience with RESTful API development and microservices architecture.

Knowledge of SQL and NoSQL databases (such as PostgreSQL, MongoDB, etc.)

Experience with cloud computing platforms (such as AWS, Google Cloud, etc.)

Strong understanding of security principles and how to implement them in a web environment

Ability to write clean, maintainable, and efficient code

Excellent problem-solving and communication skills

B.Tech/M.Tech/MS in Computer Science or related field, or equivalent experience.

Read more
NeoGenCode Technologies Pvt Ltd
Pune
8 - 15 yrs
₹5L - ₹24L / yr
Data engineering
Snowflake schema
SQL
ETL
ELT
+5 more

Job Title : Data Engineer – Snowflake Expert

Location : Pune (Onsite)

Experience : 10+ Years

Employment Type : Contractual

Mandatory Skills : Snowflake, Advanced SQL, ETL/ELT (Snowpipe, Tasks, Streams), Data Modeling, Performance Tuning, Python, Cloud (preferably Azure), Security & Data Governance.


Job Summary :

We are seeking a seasoned Data Engineer with deep expertise in Snowflake to design, build, and maintain scalable data solutions.

The ideal candidate will have a strong background in data modeling, ETL/ELT, SQL optimization, and cloud data warehousing principles, with a passion for leveraging Snowflake to drive business insights.

Responsibilities :

  • Collaborate with data teams to optimize and enhance data pipelines and models on Snowflake.
  • Design and implement scalable ELT pipelines with performance and cost-efficiency in mind.
  • Ensure high data quality, security, and adherence to governance frameworks.
  • Conduct code reviews and align development with best practices.
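The incremental, cost-efficient ELT pattern that Snowflake's Streams and Tasks provide can be sketched generically: process only rows past a stored high-water mark. This sqlite3 mock is an illustration only; the table names and pass-through transform are invented:

```python
import sqlite3

# Incremental (stream-style) ELT: only rows newer than the watermark are
# transformed into the target table, so reruns do no duplicate work.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE raw_events (id INTEGER PRIMARY KEY, amount REAL);
CREATE TABLE fact_events (id INTEGER PRIMARY KEY, amount_usd REAL);
CREATE TABLE watermark (last_id INTEGER);
INSERT INTO watermark VALUES (0);
""")

def ingest(amounts):
    conn.executemany("INSERT INTO raw_events (amount) VALUES (?)",
                     [(a,) for a in amounts])

def incremental_load():
    (last_id,) = conn.execute("SELECT last_id FROM watermark").fetchone()
    new = conn.execute("SELECT id, amount FROM raw_events WHERE id > ?",
                       (last_id,)).fetchall()
    # Transform is a pass-through here; a real pipeline would convert/enrich
    conn.executemany("INSERT INTO fact_events VALUES (?, ?)",
                     [(i, a * 1.0) for i, a in new])
    if new:
        conn.execute("UPDATE watermark SET last_id = ?", (new[-1][0],))
    return len(new)

ingest([10, 20])
first = incremental_load()
ingest([30])
second = incremental_load()
print(first, second)  # 2 1
```

In Snowflake itself, a Stream on the raw table plays the role of the watermark and a scheduled Task runs the load, but the change-data-capture idea is the same.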

Qualifications :

  • Bachelor’s in Computer Science, Data Science, IT, or related field.
  • Snowflake certifications (Pro/Architect) preferred.
Read more
Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by Jhansi Padiy
Hyderabad, Bengaluru (Bangalore)
3 - 8 yrs
₹5L - ₹20L / yr
Artificial Intelligence (AI)
Machine Learning (ML)
AI
ML
Python

Desired Competencies (Technical/Behavioral Competency)

Must-Have

  1. Experience working with various ML libraries and packages such as scikit-learn, NumPy, pandas, TensorFlow, Matplotlib, Caffe, etc.
  2. Deep Learning Frameworks: PyTorch, spaCy, Keras
  3. Deep Learning Architectures: LSTM, CNN, Self-Attention and Transformers
  4. Experience in working with image processing and computer vision is a must.
  5. Designing data science applications: Large Language Models (LLMs), Generative Pre-trained Transformers (GPT), generative AI techniques, Natural Language Processing (NLP), machine learning techniques, Python, Jupyter Notebook, common data science packages (TensorFlow, scikit-learn, Keras, etc.), LangChain, Flask, FastAPI, and prompt engineering.
  6. Programming experience in Python
  7. Strong written and verbal communications
  8. Excellent interpersonal and collaboration skills.

Role descriptions / Expectations from the Role

Design and implement scalable and efficient data architectures to support generative AI workflows.

Fine-tune and optimize large language models (LLMs) for generative AI; conduct performance evaluation and benchmarking for LLMs and machine learning models

Apply prompt engineering techniques as required by the use case

Collaborate with research and development teams to build large language models for generative AI use cases; plan and break down larger data science tasks into lower-level tasks

Lead junior data engineers on tasks such as designing data pipelines, dataset creation, and deployment; use data visualization tools, machine learning techniques, natural language processing, feature engineering, deep learning, and statistical modelling as required by the use case.
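The benchmarking expectation above can be illustrated with a minimal, framework-free evaluation loop. Exact-match and token-overlap F1 are common QA-style metrics; the predictions and references here are invented:

```python
# Score model outputs against references: exact match plus token-level F1.
def f1(pred, ref):
    p, r = pred.lower().split(), ref.lower().split()
    common = sum(min(p.count(t), r.count(t)) for t in set(p))
    if not common:
        return 0.0
    prec, rec = common / len(p), common / len(r)
    return 2 * prec * rec / (prec + rec)

preds = ["Paris", "the Nile river"]
refs  = ["Paris", "Nile"]

exact = sum(p.lower() == r.lower() for p, r in zip(preds, refs)) / len(refs)
mean_f1 = sum(f1(p, r) for p, r in zip(preds, refs)) / len(refs)
print(exact, mean_f1)
```

Real LLM benchmarking adds dataset sampling, latency and cost tracking, and often an LLM-as-judge stage, but it reduces to loops like this over (prediction, reference) pairs.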


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Vishakha Walunj
Posted by Vishakha Walunj
Bengaluru (Bangalore), Pune, Mumbai
7 - 12 yrs
Best in industry
PySpark
Databricks
SQL
Python

Required Skills:

  • Hands-on experience with Databricks and PySpark.
  • Proficiency in SQL, Python, and Spark.
  • Understanding of data warehousing concepts and data modeling.
  • Experience with CI/CD pipelines and version control (e.g., Git).
  • Fundamental knowledge of any cloud services, preferably Azure or GCP.


Good to Have:

  • Bigquery
  • Experience with performance tuning and data governance.


Read more
Third Rock Techkno
Pune
5 - 7 yrs
₹10L - ₹15L / yr
ASP.NET
Entity Framework
C#
SQL Server
Amazon Web Services (AWS)
+4 more

Required Qualifications:

  • 5+ years of professional software development experience.
  • Post-secondary degree in computer science, software engineering or related discipline, or equivalent working experience.
  • Development of distributed applications with Microsoft technologies: C# .NET/Core, SQL Server, Entity Framework.
  • Deep expertise with microservices architectures and design patterns.
  • Cloud Native AWS experience with services such as Lambda, SQS, RDS/Aurora, S3, Lex, and Polly.
  • Mastery of both Windows and Linux environments and their use in the development and management of complex distributed systems architectures.
  • Git source code repository and continuous integration tools.


Read more
Quanteon Solutions
DurgaPrasad Sannamuri
Posted by DurgaPrasad Sannamuri
Hyderabad
1 - 4 yrs
₹5L - ₹15L / yr
Generative AI
Artificial Intelligence (AI)
Retrieval Augmented Generation (RAG)
Large Language Models (LLM) tuning
Deep Learning
+3 more

What You'll Do:

Design, develop, and deploy machine learning models for real-world applications

Work with large datasets, perform data preprocessing, feature engineering, and model evaluation

Collaborate with cross-functional teams to integrate ML solutions into production

Stay up to date with the latest ML/AI research and technologies

What We’re Looking For:

Solid experience with Python and popular ML libraries (e.g., scikit-learn, TensorFlow, PyTorch)

Experience with data pipelines, model deployment, and performance tuning

Familiarity with cloud platforms (AWS/GCP/Azure) is a plus

Strong problem-solving and analytical skills

Excellent communication and teamwork abilities

Read more
Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by Jhansi Padiy
Bengaluru (Bangalore), Hyderabad
3 - 10 yrs
₹6L - ₹25L / yr
Gen AI
NLP
Python
TensorFlow
Machine Learning (ML)
+4 more

Desired Competencies (Technical/Behavioral Competency)

Must-Have

  1. Hands-on knowledge in machine learning, deep learning, TensorFlow, Python, NLP
  2. Stay up to date on the latest AI emergences relevant to the business domain.
  3. Conduct research and development processes for AI strategies.
  4. Experience developing and implementing generative AI models, with a strong understanding of deep learning techniques such as GPT, VAE, and GANs.
  5. Experience with transformer models such as BERT, GPT, RoBERTa, etc., and a solid understanding of their underlying principles is a plus

Good-to-Have

  1. Have knowledge of software development methodologies, such as Agile or Scrum
  2. Have strong communication skills, with the ability to effectively convey complex technical concepts to a diverse audience.
  3. Have experience with natural language processing (NLP) techniques and tools, such as SpaCy, NLTK, or Hugging Face
  4. Ensure the quality of code and applications through testing, peer review, and code analysis.
  5. Root cause analysis and bugs correction
  6. Familiarity with version control systems, preferably Git.
  7. Experience with building or maintaining cloud-native applications.
  8. Familiarity with cloud platforms such as AWS, Azure, or Google Cloud is a plus
Read more
Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by susmitha o
Hyderabad, Bengaluru (Bangalore)
3 - 8 yrs
₹7L - ₹24L / yr
Artificial Intelligence (AI)
Machine Learning (ML)
Python
Natural Language Processing (NLP)
NumPy
+1 more

1. Design and implement scalable and efficient data architectures to support generative AI workflows.

2. Fine-tune and optimize large language models (LLMs) for generative AI; conduct performance evaluation and benchmarking for LLMs and machine learning models.

3. Apply prompt engineering techniques as required by the use case.

4. Collaborate with research and development teams to build large language models for generative AI use cases; plan and break down larger data science tasks into lower-level tasks.

5. Lead junior data engineers on tasks such as designing data pipelines, dataset creation, and deployment; use data visualization tools, machine learning techniques, natural language processing, feature engineering, deep learning, and statistical modelling as required by the use case.
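A hedged illustration of the prompt-engineering responsibility above — assembling a few-shot prompt from a template. The task, examples, and wording are illustrative, not from any specific use case:

```python
def build_prompt(task, query, examples):
    """Assemble a few-shot prompt: instruction, demonstrations, then the query."""
    lines = [f"You are an assistant that performs: {task}", ""]
    for question, answer in examples:  # few-shot demonstrations
        lines.append(f"Q: {question}")
        lines.append(f"A: {answer}")
        lines.append("")
    lines.append(f"Q: {query}")
    lines.append("A:")  # the model completes from here
    return "\n".join(lines)

prompt = build_prompt(
    "sentiment classification",
    "The service was slow but the food was great.",
    [("I loved it!", "positive"), ("Terrible experience.", "negative")],
)
```

In practice the template text is iterated on against an evaluation set, which is where the benchmarking work above comes in.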

Read more
F22Labs Global

at F22Labs Global

1 video
2 recruiters
F Twenty Two
Posted by F Twenty Two
Chennai
0 - 1 yrs
₹10000 - ₹10000 / mo
Manual testing
Test Automation (QA)
Systems Development Life Cycle (SDLC)
Selenium
Bug tracking
+7 more

Job Title: QA Intern

Location: Chennai (Work From Office – Let’s ensure quality together!)

Duration: 6 months (High-performing interns may be offered a full-time role) 

Stipend: INR 10,000/- per month


About the Company:

F22 Labs Global is a startup software studio based out of Chennai. We are the rocket fuel for other startups across the world, powering them with extremely high-quality software. We help entrepreneurs build their vision into beautiful software products (web/mobile). If you're into creating beautiful software and solving real problems, you’ll fit right in with us. Let’s make cool things happen!


Position Overview:

Are you detail-oriented, curious, and excited about breaking things (in a good way)? As a QA Intern at F22 Labs, you’ll learn the ropes of software testing and contribute to ensuring that our web and mobile applications meet the highest standards. You'll work alongside experienced QA professionals, developers, and designers to help us release top-quality products. If you’re looking to launch your career in quality assurance and make a real impact — we want you on board!


Key Responsibilities:

  • Test Case Design: Learn to write clear and concise test cases for web and mobile applications (we’ll guide you every step of the way!).
  • Manual Testing: Execute manual functional, regression, and exploratory tests to uncover bugs and ensure everything works smoothly.
  • Bug Reporting: Log bugs using tools like JIRA or ClickUp and collaborate with developers to verify fixes.
  • Requirement Understanding: Participate in team discussions to understand product features and translate them into test scenarios.
  • API Testing Exposure: Assist in testing APIs using tools like Postman (we’ll teach you how to validate backend responses).
  • Cross-Browser & Device Testing: Help ensure our applications work seamlessly across browsers and mobile devices.
  • Test Documentation: Maintain basic test reports and update defect logs to keep things organized.
  • Team Collaboration: Work closely with QA mentors, developers, and designers in a fun, fast-paced environment.


Skills Required:

  • Basic knowledge of manual testing concepts (functional, regression, exploratory).
  • Familiarity with bug tracking tools (JIRA, ClickUp, or similar).
  • Understanding of SDLC and STLC.
  • Understanding of web and mobile application workflows.
  • Awareness of cross-browser/device testing.
  • Strong attention to detail and willingness to learn quickly.
  • Strong communication and analytical thinking skills.
  • Good to have:
  • Basic knowledge of test automation tools such as Selenium.
  • Exposure to any programming language such as Java or Python for writing simple automation scripts.
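As a taste of the API-testing exposure listed above: validating a backend response mostly reduces to checks like these. The response body here is a hard-coded stand-in for what Postman or a `requests` call would return, and the field names are illustrative:

```python
import json

# Stand-in for a real HTTP response body from the API under test
raw_body = '{"status": "ok", "user": {"id": 7, "email": "a@example.com"}}'

def validate_user_response(body):
    """Return a list of failed checks; an empty list means the response passed."""
    failures = []
    data = json.loads(body)
    if data.get("status") != "ok":
        failures.append("status is not 'ok'")
    user = data.get("user", {})
    if not isinstance(user.get("id"), int):
        failures.append("user.id missing or not an integer")
    if "@" not in user.get("email", ""):
        failures.append("user.email missing or malformed")
    return failures
```

Writing validations as small functions like this is also the first step toward the automation scripts mentioned above.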


Why Join Us (Perks & Benefits):

  • Mentorship and hands-on training (learn from experienced QA professionals).
  • Opportunity to work on real projects from day one.
  • Supportive and collaborative team environment.
  • Exposure to fast-paced startup culture.
  • Possibility of full-time placement post internship based on performance.
  • Paid internship.


If you're ready to kickstart your QA career in a dynamic, high-growth environment — apply today and help us build software that works beautifully!

Read more
NeoGenCode Technologies Pvt Ltd
Gurugram
4 - 6 yrs
₹2L - ₹12L / yr
skill iconPython
skill iconDjango
skill iconPostgreSQL
RabbitMQ
skill iconRedis
+5 more

Role : Python Django Developer (Immediate Joiner)

Location : Gurugram, India (Onsite)

Experience : 4+ Years

Working Days : 6 Days WFO (Monday to Saturday)


Job Summary :

We are looking for an experienced Python Developer with strong expertise in Django to join our team. The ideal candidate will have 4+ years of hands-on experience in building robust, scalable, and efficient web applications.

Proficiency in RabbitMQ, Redis, Celery, and PostgreSQL is essential for managing background tasks, caching, and database performance.


Mandatory Skills : Python, Django, RabbitMQ, Redis, Celery, PostgreSQL, RESTful APIs, Docker.


Key Responsibilities :

  • Develop, maintain, and enhance Django-based web applications and RESTful APIs.
  • Design and implement message broker systems using RabbitMQ for asynchronous communication.
  • Integrate Redis for caching and session management to improve application performance.
  • Utilize Celery for managing distributed task queues and background processing.
  • Work with PostgreSQL for schema design, optimization, and query tuning.
  • Collaborate with cross-functional teams including front-end developers and DevOps engineers.
  • Write clean, maintainable, and well-documented code aligned with industry best practices.
  • Debug and troubleshoot issues across the application stack.
  • Participate in system architecture discussions, code reviews, and agile ceremonies.
  • Ensure performance, scalability, and security of applications.
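A hedged sketch of the cache-aside pattern behind the Redis caching responsibility above. A plain dict stands in for the Redis client, and `load_user_from_db` is a hypothetical query, not a real API:

```python
import time

CACHE = {}          # stand-in for a Redis client (get/set with TTL)
TTL_SECONDS = 300

def load_user_from_db(user_id):
    # Hypothetical expensive query (e.g., Django ORM against PostgreSQL)
    return {"id": user_id, "name": f"user-{user_id}"}

def get_user(user_id):
    """Cache-aside: try the cache first, fall back to the DB, then populate."""
    entry = CACHE.get(user_id)
    if entry and time.time() - entry[0] < TTL_SECONDS:
        return entry[1]                       # cache hit
    user = load_user_from_db(user_id)         # cache miss: query the source
    CACHE[user_id] = (time.time(), user)      # write back with a timestamp
    return user
```

With a real Redis client the dict operations become `GET`/`SETEX` calls, but the control flow is the same.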


Technical Skills Required :

  • Minimum 4+ years of hands-on experience with Python and Django.
  • Proficiency with RabbitMQ for message brokering.
  • Experience with Redis for caching and session storage.
  • Practical knowledge of Celery for asynchronous task processing.
  • Strong command over PostgreSQL including complex queries and optimization techniques.
  • Experience developing and consuming RESTful APIs.
  • Familiarity with Docker and containerization concepts.

Preferred Skills :

  • Exposure to CI/CD tools and processes.
  • Experience with cloud platforms such as AWS or GCP.
  • Understanding of Django ORM performance tuning.
  • Basic knowledge of front-end technologies (HTML, CSS, JavaScript).
Read more
Remote only
0 - 1 yrs
₹5000 - ₹5500 / mo
skill iconPython
RESTful APIs

Job Title: IT and Cybersecurity Network Backend Engineer (Remote)

Job Summary:

Join Springer Capital’s elite tech team to architect and fortify our digital infrastructure, ensuring robust, secure, and scalable backend systems that power cutting‑edge investment solutions.

Job Description:

Founded in 2015, Springer Capital is a technology-forward asset management and investment firm that redefines financial strategies through innovative digital solutions. We identify high-potential opportunities and leverage advanced technology to drive value, transforming traditional investment paradigms. Our culture is built on agility, creative problem-solving, and a relentless pursuit of excellence.

Job Highlights:

As an IT and Cybersecurity Network Backend Engineer, you will play a central role in designing, developing, and securing our backend systems. You’ll be responsible for creating bulletproof server architectures and integrating sophisticated cybersecurity measures to ensure our digital assets remain secure, reliable, and scalable—all while working fully remotely.

Responsibilities:

  • Backend Architecture & Security:
  • Design, develop, and maintain high-performance backend systems and RESTful APIs using technologies such as Python, Node.js, or Java.
  • Implement advanced cybersecurity protocols including encryption, multi-factor authentication, and anomaly detection to safeguard our infrastructure.
  • Network Infrastructure Management:
  • Architect secure cloud and hybrid network solutions to protect sensitive data and ensure uninterrupted service.
  • Develop robust logging, monitoring, and compliance mechanisms.
  • Collaborative Innovation:
  • Partner with cross-functional teams (DevOps, frontend, and product managers) to integrate security seamlessly into every layer of our technology stack.
  • Participate in regular security audits, agile sprints, and technical reviews.
  • Continuous Improvement:
  • Keep abreast of emerging technologies and cybersecurity threats, proposing and implementing innovative solutions to maintain system integrity.

What We Offer:

  • Advanced Learning & Mentorship: Work side-by-side with industry experts who will empower you to push the boundaries of cybersecurity and backend engineering.
  • Impactful Work: Engage in projects that directly influence the security and scalability of our revolutionary digital investment strategies.
  • Dynamic, Remote Culture: Thrive in a flexible, remote-first environment that champions creativity, collaboration, and work-life balance.
  • Career Growth: Unlock long-term career advancement opportunities in a forward-thinking organization that values innovation and initiative.

Requirements:

  • Degree (or current enrollment) in Computer Science, Cybersecurity, or a related field.
  • Proficiency in at least one backend programming language (Python, Node.js, or Java) and hands-on experience with RESTful API design.
  • Solid understanding of network security principles and experience implementing cybersecurity best practices.
  • Passionate about designing secure systems, solving complex technical challenges, and staying ahead of industry trends.
  • Strong analytical and communication skills, with the ability to work effectively in a collaborative, fast-paced environment.
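As a hedged taste of the security best practices this role covers — storing salted password hashes rather than passwords — using only the Python standard library. The iteration count is illustrative; production values should follow current guidance:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a PBKDF2-HMAC-SHA256 digest; store (salt, digest), never the password."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, expected):
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, expected)  # constant-time comparison
```

The constant-time comparison matters: a naive `==` on digests can leak timing information, one of the anomaly classes the role's detection work targets.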

About Springer Capital:

At Springer Capital, we blend financial expertise with digital innovation to shape tomorrow’s investment landscape. Our relentless drive to merge technology and asset management has positioned us as leaders in transforming traditional finance into dynamic, tech-enabled ventures.

Location: Global (Remote)

Job Type: Full-time

Pay: $50 USD per month

Work Location: Remote

Embark on your next challenge with Springer Capital—where your technical prowess and dedication to security help safeguard the future of digital investments.

Read more
Top tier global IT consulting company


Agency job
via AccioJob by AccioJobHiring Board
Hyderabad, Pune, Noida
0 - 0 yrs
₹11L - ₹11L / yr
Computer Networking
Linux administration
skill iconPython
Bash
Object Oriented Programming (OOPs)
+2 more

AccioJob is conducting a Walk-In Hiring Drive with a reputed global IT consulting company at AccioJob Skill Centres for the position of Infrastructure Engineer, specifically for female candidates.


To Apply, Register and select your Slot here: https://go.acciojob.com/kcYTAp


We will not consider your application if you do not register and select a slot via the above link.


Required Skills: Linux, Networking, One scripting language among Python, Bash, and PowerShell, OOPs, Cloud Platforms (AWS, Azure)


Eligibility:


  • Degree: B.Tech/BE
  • Branch: CSE Core With Cloud Certification
  • Graduation Year: 2024 & 2025


Note: Only Female Candidates can apply for this job opportunity


Work Details:


  • Work Mode: Work From Office
  • Work Location: Bangalore & Coimbatore
  • CTC: 11.1 LPA


Evaluation Process:


  • Round 1: Offline Assessment at AccioJob Skill Centre in Noida, Pune, Hyderabad.


  • Further Rounds (for Shortlisted Candidates only)

 

  1. HackerRank Online Assessment
  2. Coding Pairing Interview
  3. Technical Interview
  4. Cultural Alignment Interview


Important Note: Please bring your laptop and earphones for the test.


Register here: https://go.acciojob.com/kcYTAp

Read more
Top tier global IT consulting company


Agency job
via AccioJob by AccioJobHiring Board
Hyderabad, Pune, Noida
0 - 0 yrs
₹11L - ₹11L / yr
skill iconPython
MySQL
Big Data

AccioJob is conducting a Walk-In Hiring Drive with a reputed global IT consulting company at AccioJob Skill Centres for the position of Data Engineer, specifically for female candidates.


To Apply, Register and select your Slot here: https://go.acciojob.com/8p9ZXN


We will not consider your application if you do not register and select a slot via the above link.


Required Skills: Python, Databases (MySQL), Big Data (Spark, Kafka)


Eligibility:


  • Degree: B.Tech/BE
  • Branch: CSE – AI & DS / AI & ML
  • Graduation Year: 2024 & 2025


Note: Only Female Candidates can apply for this job opportunity


Work Details:


  • Work Mode: Work From Office
  • Work Location: Bangalore & Coimbatore
  • CTC: 11.1 LPA


Evaluation Process:


  • Round 1: Offline Assessment at AccioJob Skill Centre in Noida, Pune, Hyderabad.


  • Further Rounds (for Shortlisted Candidates only)

 

  1. HackerRank Online Assessment
  2. Coding Pairing Interview
  3. Technical Interview
  4. Cultural Alignment Interview


Important Note: Please bring your laptop and earphones for the test.


Register here: https://go.acciojob.com/8p9ZXN

Read more
NeoGenCode Technologies Pvt Ltd
Shivank Bhardwaj
Posted by Shivank Bhardwaj
Remote, Thiruvananthapuram, Kochi (Cochin)
11 - 12 yrs
₹15L - ₹30L / yr
skill iconJava
skill iconAngular (2+)
skill iconAmazon Web Services (AWS)
skill iconSpring Boot
skill iconPython
+8 more

About the Role

NeoGenCode Technologies is looking for a Senior Technical Architect with strong expertise in enterprise architecture, cloud, data engineering, and microservices. This is a critical role demanding leadership, client engagement, and architectural ownership in designing scalable, secure enterprise systems.


Key Responsibilities

  • Design scalable, secure, and high-performance enterprise software architectures.
  • Architect distributed, fault-tolerant systems using microservices and event-driven patterns.
  • Provide technical leadership and hands-on guidance to engineering teams.
  • Collaborate with clients, understand business needs, and translate them into architectural designs.
  • Evaluate, recommend, and implement modern tools, technologies, and processes.
  • Drive DevOps, CI/CD best practices, and application security.
  • Mentor engineers and participate in architecture reviews.


Must-Have Skills

  • Architecture: Enterprise Solutions, EAI, Design Patterns, Microservices (API & Event-driven)
  • Tech Stack: Java, Spring Boot, Python, Angular (recent 2+ years experience), MVC
  • Cloud Platforms: AWS, Azure, or Google Cloud
  • Client Handling: Strong experience with client-facing roles and delivery
  • Data: Data Modeling, RDBMS & NoSQL, Data Migration/Retention Strategies
  • Security: Familiarity with OWASP, PCI DSS, InfoSec principles


Good to Have

  • Experience with Mobile Technologies (native, hybrid, cross-platform)
  • Knowledge of tools like Enterprise Architect, TOGAF frameworks
  • DevOps tools, containerization (Docker), CI/CD
  • Experience in financial services / payments domain
  • Familiarity with BI/Analytics, AI/ML, Predictive Analytics
  • 3rd-party integrations (e.g., MuleSoft, BizTalk)
Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Remote, Kochi (Cochin), Trivandrum
8 - 15 yrs
₹10L - ₹24L / yr
skill iconJava
skill iconSpring Boot
skill iconPython
skill iconAngular (2+)
skill iconAmazon Web Services (AWS)
+7 more

Job Title : Technical Architect

Experience : 8 to 12+ Years

Location : Trivandrum / Kochi / Remote

Work Mode : Remote flexibility available

Notice Period : Immediate to max 15 days (30 days with negotiation possible)


Summary :

We are looking for a highly skilled Technical Architect with expertise in Java Full Stack development, cloud architecture, and modern frontend frameworks (Angular). This is a client-facing, hands-on leadership role, ideal for technologists who enjoy designing scalable, high-performance, cloud-native enterprise solutions.


🛠 Key Responsibilities :

  • Architect scalable and high-performance enterprise applications.
  • Hands-on involvement in system design, development, and deployment.
  • Guide and mentor development teams in architecture and best practices.
  • Collaborate with stakeholders and clients to gather and refine requirements.
  • Evaluate tools, processes, and drive strategic technical decisions.
  • Design microservices-based solutions deployed over cloud platforms (AWS/Azure/GCP).

Mandatory Skills :

  • Backend : Java, Spring Boot, Python
  • Frontend : Angular (at least 2 years of recent hands-on experience)
  • Cloud : AWS / Azure / GCP
  • Architecture : Microservices, EAI, MVC, Enterprise Design Patterns
  • Data : SQL / NoSQL, Data Modeling
  • Other : Client handling, team mentoring, strong communication skills

Nice to Have Skills :

  • Mobile technologies (Native / Hybrid / Cross-platform)
  • DevOps & Docker-based deployment
  • Application Security (OWASP, PCI DSS)
  • TOGAF familiarity
  • Test-Driven Development (TDD)
  • Analytics / BI / ML / AI exposure
  • Domain knowledge in Financial Services or Payments
  • 3rd-party integration tools (e.g., MuleSoft, BizTalk)

⚠️ Important Notes :

  • Only candidates from outside Hyderabad/Telangana and non-JNTU graduates will be considered.
  • Candidates must be serving notice or joinable within 30 days.
  • Client-facing experience is mandatory.
  • Java Full Stack candidates are highly preferred.

🧭 Interview Process :

  1. Technical Assessment
  2. Two Rounds – Technical Interviews
  3. Final Round
Read more
Trellissoft Inc.

at Trellissoft Inc.

3 candid answers
Nikita Sinha
Posted by Nikita Sinha
Bengaluru (Bangalore)
5yrs+
Upto ₹18L / yr (Varies)
skill iconPython
skill iconFlask
Microservices
SQL
NOSQL Databases

Job Responsibilities:

  • Design, develop, test, and maintain high-performance web applications and backend services using Python.
  • Build scalable, secure, and reliable backend systems and APIs.
  • Optimize and debug existing codebases to enhance performance and maintainability.
  • Collaborate closely with cross-functional teams to gather requirements and deliver high-quality solutions.
  • Mentor junior developers, conduct code reviews, and uphold best coding practices.
  • Write clear, comprehensive technical documentation for internal and external use.
  • Stay current with emerging technologies, tools, and industry trends to continually improve development processes.

Qualifications:

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 5+ years of hands-on experience in Python development.
  • Strong expertise in Flask.
  • In-depth understanding of software design principles, architecture, and design patterns.
  • Proven experience working with both SQL and NoSQL databases.
  • Solid debugging and problem-solving capabilities.
  • Effective communication and collaboration skills, with a team-first mindset.

Technical Skills:

  • Programming: Python (Advanced)
  • Web Frameworks: Flask
  • Databases: PostgreSQL, MySQL, MongoDB, Redis
  • Version Control: Git
  • API Development: RESTful APIs
  • Containerization & Orchestration: Docker, Kubernetes
  • Cloud Platforms: AWS or Azure (hands-on experience preferred)
  • DevOps: CI/CD pipelines (e.g., Jenkins, GitHub Actions)


Read more
Hunarstreet Technologies pvt ltd

Hunarstreet Technologies pvt ltd

Agency job
via Hunarstreet Technologies pvt ltd by Sakshi Patankar
Remote only
10 - 20 yrs
₹15L - ₹30L / yr
Data engineering
databricks
skill iconPython
skill iconScala
Spark
+14 more

What You’ll Be Doing:

● Design and build parts of our data pipeline architecture for extraction, transformation, and loading of data from a wide variety of data sources using the latest Big Data technologies.

● Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

● Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.

● Work with machine learning, data, and analytics experts to drive innovation, accuracy and greater functionality in our data system.

Qualifications:

● Bachelor's degree in Engineering, Computer Science, or relevant field.

● 10+ years of relevant and recent experience in a Data Engineer role.

● 5+ years recent experience with Apache Spark and solid understanding of the fundamentals.

● Deep understanding of Big Data concepts and distributed systems.

● Strong coding skills with Scala, Python, Java and/or other languages and the ability to quickly switch between them with ease.

● Advanced working SQL knowledge and experience working with a variety of relational databases such as Postgres and/or MySQL.

● Cloud experience with Databricks.

● Experience working with data stored in many formats including Delta Tables, Parquet, CSV and JSON.

● Comfortable working in a linux shell environment and writing scripts as needed.

● Comfortable working in an Agile environment

● Machine Learning knowledge is a plus.

● Must be capable of working independently and delivering stable, efficient and reliable software.

● Excellent written and verbal communication skills in English.

● Experience supporting and working with cross-functional teams in a dynamic environment
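The pipeline work described above has the same extract-transform-load shape at any scale. A hedged standard-library sketch of that shape — Spark and Databricks themselves are not shown, and the data is illustrative:

```python
import csv
import io
import json

# Extract: CSV as it might arrive from an upstream source
raw = "user_id,amount\n1,10.50\n2,3.25\n1,4.00\n"

def etl(csv_text):
    """Aggregate amounts per user and emit JSON records."""
    rows = csv.DictReader(io.StringIO(csv_text))
    totals = {}
    for row in rows:  # Transform: group-and-sum, the same shape as a Spark groupBy/agg
        totals[row["user_id"]] = totals.get(row["user_id"], 0.0) + float(row["amount"])
    # Load: one JSON record per user, ready for a downstream sink
    return json.dumps([{"user_id": u, "total": t} for u, t in sorted(totals.items())])
```

In Spark the same step would be a `groupBy("user_id").sum("amount")` over a distributed DataFrame; the stdlib version just makes the shape visible.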


EMPLOYMENT TYPE: Full-Time, Permanent

LOCATION: Remote (Pan India)

SHIFT TIMINGS: 2.00 pm-11:00pm IST 

Read more
Data Havn

Data Havn

Agency job
via Infinium Associate by Toshi Srivastava
Noida
5 - 8 yrs
₹25L - ₹40L / yr
Data engineering
skill iconPython
SQL
Data Warehouse (DWH)
ETL
+6 more

About the Role:

We are seeking a talented Lead Data Engineer to join our team and play a pivotal role in transforming raw data into valuable insights. As a Data Engineer, you will design, develop, and maintain robust data pipelines and infrastructure to support our organization's analytics and decision-making processes.

Responsibilities:

  • Data Pipeline Development: Build and maintain scalable data pipelines to extract, transform, and load (ETL) data from various sources (e.g., databases, APIs, files) into data warehouses or data lakes.
  • Data Infrastructure: Design, implement, and manage data infrastructure components, including data warehouses, data lakes, and data marts.
  • Data Quality: Ensure data quality by implementing data validation, cleansing, and standardization processes.
  • Team Management: Ability to lead and manage a team.
  • Performance Optimization: Optimize data pipelines and infrastructure for performance and efficiency.
  • Collaboration: Collaborate with data analysts, scientists, and business stakeholders to understand their data needs and translate them into technical requirements.
  • Tool and Technology Selection: Evaluate and select appropriate data engineering tools and technologies (e.g., SQL, Python, Spark, Hadoop, cloud platforms).
  • Documentation: Create and maintain clear and comprehensive documentation for data pipelines, infrastructure, and processes.
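The data-quality responsibility above typically means codifying validation and cleansing rules as code. A hedged sketch — the field names and rules are illustrative, not from any specific schema:

```python
def validate_record(record):
    """Apply simple data-quality rules; return the names of failed checks."""
    errors = []
    if not record.get("id"):
        errors.append("missing_id")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        errors.append("bad_amount")
    country = record.get("country", "")
    if country.upper() != country:
        errors.append("country_not_standardized")  # cleansing rule: uppercase ISO codes
    return errors

clean = {"id": "a1", "amount": 9.99, "country": "IN"}
dirty = {"amount": -1, "country": "in"}
```

Pipelines usually route records with a non-empty error list to a quarantine table rather than dropping them silently.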

 

 Skills:

  • Strong proficiency in SQL and at least one programming language (e.g., Python, Java).
  • Experience with data warehousing and data lake technologies (e.g., Snowflake, AWS Redshift, Databricks).
  • Knowledge of cloud platforms (e.g., AWS, GCP, Azure) and cloud-based data services.
  • Understanding of data modeling and data architecture concepts.
  • Experience with ETL/ELT tools and frameworks.
  • Excellent problem-solving and analytical skills.
  • Ability to work independently and as part of a team.

Preferred Qualifications:

  • Experience with real-time data processing and streaming technologies (e.g., Kafka, Flink).
  • Knowledge of machine learning and artificial intelligence concepts.
  • Experience with data visualization tools (e.g., Tableau, Power BI).
  • Certification in cloud platforms or data engineering.


Read more
Mumbai, Kolkata
4 - 10 yrs
₹7L - ₹25L / yr
skill iconPython
skill iconMachine Learning (ML)
skill iconFlask
Artificial Intelligence (AI)

  1. 3+ years’ experience as a Python Developer / Designer, with machine learning experience
  2. Understanding of performance improvement; able to write effective, scalable code
  3. Security and data protection solutions
  4. Expertise in at least one popular Python framework (like Django, Flask or Pyramid)
  5. Knowledge of object-relational mapping (ORM)
  6. Familiarity with front-end technologies (like JavaScript and HTML5)

Read more
Remote only
0 - 1 yrs
₹5000 - ₹5500 / mo
skill iconPython
Artificial Intelligence (AI)
skill iconMachine Learning (ML)
skill iconData Science

Job description

Job Title: AI-Driven Data Science Automation Intern – Machine Learning Research Specialist

Location: Remote (Global)

Compensation: $50 USD per month

Company: Meta2 Labs

www.meta2labs.com

About Meta2 Labs:

Meta2 Labs is a next-gen innovation studio building products, platforms, and experiences at the convergence of AI, Web3, and immersive technologies. We are a lean, mission-driven collective of creators, engineers, designers, and futurists working to shape the internet of tomorrow. We believe the next wave of value will come from decentralized, intelligent, and user-owned digital ecosystems—and we’re building toward that vision.

As we scale our roadmap and ecosystem, we're looking for a driven, aligned, and entrepreneurial AI-Driven Data Science Automation Intern – Machine Learning Research Specialist to join us on this journey.

The Opportunity:

We’re seeking a part-time AI-Driven Data Science Automation Intern – Machine Learning Research Specialist to join Meta2 Labs at a critical early stage. This is a high-impact role designed for someone who shares our vision and wants to actively shape the future of tech. You’ll be an equal voice at the table and help drive the direction of our ventures, partnerships, and product strategies.

Responsibilities:

  • Collaborate on the vision, strategy, and execution across Meta2 Labs' portfolio and initiatives.
  • Drive innovation in areas such as AI applications, Web3 infrastructure, and experiential product design.
  • Contribute to go-to-market strategies, business development, and partnership opportunities.
  • Help shape company culture, structure, and team expansion.
  • Be a thought partner and problem-solver in all key strategic discussions.
  • Lead or support verticals based on your domain expertise (e.g., product, technology, growth, design, etc.).
  • Act as a representative and evangelist for Meta2 Labs in public or partner-facing contexts.

Ideal Profile:

  • Passion for emerging technologies (AI, Web3, XR, etc.).
  • Comfortable operating in ambiguity and working lean.
  • Strong strategic thinking, communication, and collaboration skills.
  • Open to wearing multiple hats and learning as you build.
  • Driven by purpose and eager to gain experience in a cutting-edge tech environment.

Commitment:

  • Flexible, part-time involvement.
  • Remote-first and async-friendly culture.

Why Join Meta2 Labs:

  • Join a purpose-led studio at the frontier of tech innovation.
  • Help build impactful ventures with real-world value and long-term potential.
  • Shape your own role, focus, and future within a decentralized, founder-friendly structure.
  • Be part of a collaborative, intellectually curious, and builder-centric culture.

Job Types: Part-time, Internship

Pay: $50 USD per month

Work Location: Remote

Job Types: Full-time, Part-time, Internship

Contract length: 3 months

Pay: Up to ₹5,000.00 per month

Benefits:

  • Flexible schedule
  • Health insurance
  • Work from home

Work Location: Remote


Read more
Automate Accounts
Namrata Das
Posted by Namrata Das
Remote only
2 - 5 yrs
₹6L - ₹15L / yr
skill iconPython
SQL
Artificial Intelligence (AI)
skill iconMachine Learning (ML)
skill iconGitHub
+2 more

Responsibilities

  • Develop and maintain web and backend components using Python, Node.js, and Zoho tools
  • Design and implement custom workflows and automations in Zoho
  • Perform code reviews to maintain quality standards and best practices
  • Debug and resolve technical issues promptly
  • Collaborate with teams to gather and analyze requirements for effective solutions
  • Write clean, maintainable, and well-documented code
  • Manage and optimize databases to support changing business needs
  • Contribute individually while mentoring and supporting team members
  • Adapt quickly to a fast-paced environment and meet expectations within the first month



Leadership Opportunities

  • Lead and mentor junior developers in the team
  • Drive projects independently while collaborating with the broader team
  • Act as a technical liaison between the team and stakeholders to deliver effective solutions



Selection Process

  1. HR Screening: Review of qualifications and experience
  2. Online Technical Assessment: Test coding and problem-solving skills
  3. Technical Interview: Assess expertise in web development, Python, Node.js, APIs, and Zoho
  4. Leadership Evaluation: Evaluate team collaboration and leadership abilities
  5. Management Interview: Discuss cultural fit and career opportunities
  6. Offer Discussion: Finalize compensation and role specifics



Experience Required

  • 2–5 years of relevant experience as a Software Developer
  • Proven ability to work as a self-starter and contribute individually
  • Strong technical and interpersonal skills to support team members effectively


Read more
TCS


Agency job
via Risk Resources LLP hyd by Jhansi Padiy
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, PAn india
5 - 10 yrs
₹10L - ₹25L / yr
Test Automation
Selenium
skill iconJava
skill iconPython
skill iconJavascript

Test Automation Engineer Job Description

A Test Automation Engineer is responsible for designing, developing, and implementing automated testing solutions to ensure the quality and reliability of software applications. Here's a breakdown of the job:


Key Responsibilities

- Test Automation Framework: Design and develop test automation frameworks using tools like Selenium, Appium, or Cucumber.

- Automated Test Scripts: Create and maintain automated test scripts to validate software functionality, performance, and security.

- Test Data Management: Develop and manage test data, including data generation, masking, and provisioning.

- Test Environment: Set up and maintain test environments, including configuration and troubleshooting.

- Collaboration: Work with cross-functional teams, including development, QA, and DevOps to ensure seamless integration of automated testing.


Essential Skills

- Programming Languages: Proficiency in programming languages like Java, Python, or C#.

- Test Automation Tools: Experience with test automation tools like Selenium.

- Testing Frameworks: Knowledge of testing frameworks like TestNG, JUnit, or PyUnit.

- Agile Methodologies: Familiarity with Agile development methodologies and CI/CD pipelines.
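Frameworks like TestNG, JUnit, and PyUnit share the same shape; in Python's built-in `unittest` (PyUnit), a minimal automated check looks like the sketch below. The function under test is illustrative:

```python
import unittest

def normalize_username(raw):
    """Function under test: trim whitespace and lowercase."""
    return raw.strip().lower()

class TestNormalizeUsername(unittest.TestCase):
    def test_strips_and_lowercases(self):
        self.assertEqual(normalize_username("  Alice "), "alice")

    def test_already_normalized_is_unchanged(self):
        self.assertEqual(normalize_username("bob"), "bob")

# Run the suite programmatically, as a CI pipeline would
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(TestNormalizeUsername)
)
```

In a CI/CD pipeline, a non-successful result would fail the build — which is the integration point with the Agile/CI practices listed above.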

Read more
TCS


Agency job
via Risk Resources LLP hyd by Jhansi Padiy
AnyWhere India
5 - 10 yrs
₹8L - ₹30L / yr
skill iconPython
skill iconFlask
NumPy
pandas
SQL

Requirement Summary

•       4 to 10 years of experience as a Python Developer

•       Have good communication skills

•       High energy and self-motivated professional with the ability to make things happen and adhere to strict deadlines

Mandatory Technical Skills

·        Should have strong software development experience, not necessarily all in Python. Candidates whose experience is mostly in technologies like C# or Java are acceptable, provided they have a minimum of 3 years' experience as a Python developer.

·        Resources should have exposure to core Python, design principles, OOP concepts (classes, methods, decorators), data structure concepts and uses of relevant packages (NumPy, pandas, etc.), testing frameworks (Pytest, TDD/BDD), cloud concepts (CI/CD), database usage (mainly SQL), application security awareness, code quality checks, etc.
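As a quick illustration of the decorator concept called out above, a sketch using only the standard library (all names hypothetical):

```python
import functools
import time

def timed(func):
    """A simple decorator: wraps a function and records how long
    the most recent call took, without changing its behavior."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        wrapper.last_elapsed = time.perf_counter() - start
        return result
    wrapper.last_elapsed = 0.0
    return wrapper

@timed
def normalize(values):
    """Hypothetical helper: scale a list of numbers to sum to 1."""
    total = sum(values)
    return [v / total for v in values]

print(normalize([1, 1, 2]))
```

`functools.wraps` keeps the wrapped function's name and docstring intact, which matters for debugging and for test discovery.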

Responsibility of / Expectations from the Role

·        Develop efficient, reusable code for model implementation

·        Participate in Agile ceremonies

·        Present analysis outcomes and performance to the business and support ad hoc analysis as required

Read more
TCS

TCS

Agency job
via Risk Resources LLP hyd by susmitha o
Bengaluru (Bangalore), Chennai, Kochi (Cochin)
6 - 9 yrs
₹7L - ₹15L / yr
skill iconAmazon Web Services (AWS)
sagemaker
skill iconMachine Learning (ML)
skill iconDocker
skill iconPython
  • Design, develop, and maintain data pipelines and ETL workflows on AWS platform
  • Work with AWS services like S3, Glue, Lambda, Redshift, EMR, and Athena for data ingestion, transformation, and analytics
  • Collaborate with Data Scientists, Analysts, and Business teams to understand data requirements
  • Optimize data workflows for performance, scalability, and reliability
  • Troubleshoot data issues, monitor jobs, and ensure data quality and integrity
  • Write efficient SQL queries and automate data processing tasks
  • Implement data security and compliance best practices
  • Maintain technical documentation and data pipeline monitoring dashboards
Read more
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Pune, Bengaluru (Bangalore), Gurugram, Chennai, Mumbai
5 - 7 yrs
₹6L - ₹20L / yr
skill iconAmazon Web Services (AWS)
Amazon Redshift
AWS Glue
skill iconPython
PySpark

Position: AWS Data Engineer

Experience: 5 to 7 Years

Location: Bengaluru, Pune, Chennai, Mumbai, Gurugram

Work Mode: Hybrid (3 days work from office per week)

Employment Type: Full-time

About the Role:

We are seeking a highly skilled and motivated AWS Data Engineer with 5–7 years of experience in building and optimizing data pipelines, architectures, and data sets. The ideal candidate will have strong experience with AWS services including Glue, Athena, Redshift, Lambda, DMS, RDS, and CloudFormation. You will be responsible for managing the full data lifecycle from ingestion to transformation and storage, ensuring efficiency and performance.

Key Responsibilities:

  • Design, develop, and optimize scalable ETL pipelines using AWS Glue, Python/PySpark, and SQL.
  • Work extensively with AWS services such as Glue, Athena, Lambda, DMS, RDS, Redshift, CloudFormation, and other serverless technologies.
  • Implement and manage data lake and warehouse solutions using AWS Redshift and S3.
  • Optimize data models and storage for cost-efficiency and performance.
  • Write advanced SQL queries to support complex data analysis and reporting requirements.
  • Collaborate with stakeholders to understand data requirements and translate them into scalable solutions.
  • Ensure high data quality and integrity across platforms and processes.
  • Implement CI/CD pipelines and best practices for infrastructure as code using CloudFormation or similar tools.
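The pipeline responsibilities above follow a classic extract/transform/load shape. A hedged, plain-Python sketch of that flow (field names hypothetical; a real job would read from S3 via Glue and write to Redshift):

```python
from datetime import datetime, timezone

def extract(rows):
    """Extract step: in a real pipeline these rows would come from S3 via Glue."""
    return list(rows)

def transform(rows):
    """Transform step: drop incomplete records, normalize types,
    and stamp each record with a load timestamp."""
    out = []
    for r in rows:
        if not r.get("order_id") or r.get("amount") is None:
            continue  # basic data-quality filter
        out.append({
            "order_id": str(r["order_id"]),
            "amount": round(float(r["amount"]), 2),
            "loaded_at": datetime.now(timezone.utc).isoformat(),
        })
    return out

def load(rows, target):
    """Load step: append to an in-memory target standing in for Redshift/S3."""
    target.extend(rows)
    return len(rows)

raw = [{"order_id": 1, "amount": "19.996"}, {"order_id": None, "amount": 5}]
warehouse = []
load(transform(extract(raw)), warehouse)
```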

Required Skills & Experience:

  • Strong hands-on experience with Python or PySpark for data processing.
  • Deep knowledge of AWS Glue, Athena, Lambda, Redshift, RDS, DMS, and CloudFormation.
  • Proficiency in writing complex SQL queries and optimizing them for performance.
  • Familiarity with serverless architectures and AWS best practices.
  • Experience in designing and maintaining robust data architectures and data lakes.
  • Ability to troubleshoot and resolve data pipeline issues efficiently.
  • Strong communication and stakeholder management skills.


Read more
LearnTube.ai

at LearnTube.ai

2 candid answers
Misbaah Shaik
Posted by Misbaah Shaik
Remote, Mumbai
1 - 3 yrs
₹8L - ₹18L / yr
Generative AI
skill iconPython
skill iconMachine Learning (ML)
skill iconChatGPT
AI Agents

Apply only if:

  1. You are an AI agent.
  2. OR you know how to build an AI agent that can do this job.


What You’ll Do: At LearnTube, we’re pushing the boundaries of Generative AI to revolutionize how the world learns. As an Agentic AI Engineer, you’ll:

  • Develop intelligent, multimodal AI solutions across text, image, audio, and video to power personalized learning experiences and deep assessments for millions of users.
  • Drive the future of live learning by building real-time interaction systems with capabilities like instant feedback, assistance, and personalized tutoring.
  • Conduct proactive research and integrate the latest advancements in AI & agents into scalable, production-ready solutions that set industry benchmarks.
  • Build and maintain robust, efficient data pipelines that leverage insights from millions of user interactions to create high-impact, generalizable solutions.
  • Collaborate with a close-knit team of engineers, agents, founders, and key stakeholders to align AI strategies with LearnTube's mission.


About Us: At LearnTube, we’re on a mission to make learning accessible, affordable, and engaging for millions of learners globally. Using Generative AI, we transform scattered internet content into dynamic, goal-driven courses with:

  • AI-powered tutors that teach live, solve doubts in real time, and provide instant feedback.
  • Seamless delivery through WhatsApp, mobile apps, and the web, with over 1.4 million learners across 64 countries.


Meet the Founders: LearnTube was founded by Shronit Ladhani and Gargi Ruparelia, who bring deep expertise in product development and ed-tech innovation. Shronit, a TEDx speaker, is an advocate for disrupting traditional learning, while Gargi’s focus on scalable AI solutions drives our mission to build an AI-first company that empowers learners to achieve career outcomes.


We’re proud to be recognized by Google as a Top 20 AI Startup and are part of their 2024 Startups Accelerator: AI First Program, giving us access to cutting-edge technology, credits, and mentorship from industry leaders.


Why Work With Us? At LearnTube, we believe in creating a work environment that’s as transformative as the products we build. Here’s why this role is an incredible opportunity:

  • Cutting-Edge Technology: You’ll work on state-of-the-art generative AI applications, leveraging the latest advancements in LLMs, multimodal AI, and real-time systems.
  • Autonomy and Ownership: Experience unparalleled flexibility and independence in a role where you’ll own high-impact projects from ideation to deployment.
  • Rapid Growth: Accelerate your career by working on impactful projects that pack three years of learning and growth into one.
  • Founder and Advisor Access: Collaborate directly with founders and industry experts, including the CTO of Inflection AI, to build transformative solutions.
  • Team Culture: Join a close-knit team of high-performing engineers and innovators, where every voice matters, and Monday morning meetings are something to look forward to.
  • Mission-Driven Impact: Be part of a company that’s redefining education for millions of learners and making AI accessible to everyone.
Read more
FiftyFive Technologies Pvt Ltd
Nishant Gandhi
Posted by Nishant Gandhi
Remote, Indore, Jaipur, Gurugram
4 - 15 yrs
₹10L - ₹30L / yr
PowerBI
Google Cloud Platform (GCP)
skill iconPython
SQL

Senior Data Analyst – Power BI, GCP, Python & SQL

 

Job Summary

 

We are looking for a Senior Data Analyst with strong expertise in Power BI, Google Cloud Platform (GCP), Python, and SQL to design data models, automate analytics workflows, and deliver business intelligence that drives strategic decisions. The ideal candidate is a problem-solver who can work with complex datasets in the cloud, build intuitive dashboards, and code custom analytics using Python and SQL.

 

Key Responsibilities

 

* Develop advanced Power BI dashboards and reports based on structured and semi-structured data from BigQuery and other GCP sources.

* Write and optimize complex SQL queries (BigQuery SQL) for reporting and data modeling.

* Use Python to automate data preparation tasks, build reusable analytics scripts, and support ad hoc data requests.

* Partner with data engineers and stakeholders to define metrics, build ETL pipelines, and create scalable data models.

* Design and implement star/snowflake schema models and DAX measures in Power BI.

* Maintain data integrity, monitor performance, and ensure security best practices across all reporting systems.

* Drive initiatives around data quality, governance, and cost optimization on GCP.

* Mentor junior analysts and actively contribute to analytics strategy and roadmap.

 

Must-Have Skills

 

* Expert-level SQL : Hands-on experience writing complex queries in BigQuery , optimizing joins, window functions, CTEs.

* Proficiency in Python : Data wrangling, Pandas, NumPy, automation scripts, API consumption, etc.

* Power BI expertise : Building dashboards, using DAX, Power Query (M), custom visuals, report performance tuning.

* GCP hands-on experience : Especially with BigQuery, Cloud Storage, and optionally Cloud Composer or Dataflow.

* Strong understanding of data modeling, ETL pipelines, and analytics workflows.

* Excellent communication skills and the ability to explain data insights to non-technical audiences.
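The CTE and window-function style mentioned above can be sketched as follows (hypothetical schema and data; shown against SQLite for portability, though the SQL is close to BigQuery's):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, rep TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('North', 'Asha', 120), ('North', 'Ravi', 90),
        ('South', 'Meena', 200), ('South', 'Karan', 150);
""")

# CTE + ROW_NUMBER() window function: top rep per region by amount.
top_reps = conn.execute("""
    WITH ranked AS (
        SELECT region, rep, amount,
               ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rn
        FROM sales
    )
    SELECT region, rep, amount FROM ranked WHERE rn = 1 ORDER BY region
""").fetchall()
```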

 

Preferred Qualifications

 

* Experience in version control (Git) and working in CI/CD environments.

* Google Professional Data Engineer

* PL-300: Microsoft Power BI Data Analyst Associate


Read more
Alpha

at Alpha

2 candid answers
Yash Makhecha
Posted by Yash Makhecha
Remote, Bengaluru (Bangalore)
1 - 6 yrs
₹4L - ₹12L / yr
skill iconPython
skill iconNodeJS (Node.js)
skill iconReact.js
TypeScript
skill iconDocker
+10 more

Full Stack Engineer

Location: Remote (India preferred) · Type: Full-time · Comp: Competitive salary + early-stage stock



About Alpha

Alpha is building the simplest way for anyone to create AI agents that actually get work done. Our platform turns messy prompt chaining, data schemas, and multi-tool logic into a clean, no-code experience. We’re backed, funded, and racing toward our v1 launch. Join us on the ground floor and shape the architecture, the product, and the culture.



The Role

We’re hiring two versatile full-stack engineers. One will lean infra/back-end, the other front-end/LLM integration, but both will ship vertical slices end-to-end.


You will:

  • Design and build the agent-execution runtime (LLMs, tools, schemas).
  • Stand up secure VPC deployments with Docker, Terraform, and AWS or GCP.
  • Build REST/GraphQL APIs, queues, Postgres/Redis layers, and observability.
  • Create a React/Next.js visual workflow editor with drag-and-drop blocks.
  • Build the Prompt Composer UI, live testing mode, and cost dashboard.
  • Integrate native tools: search, browser, CRM, payments, messaging, and more.
  • Ship fast—design, code, test, launch—and own quality (no separate QA team).
  • Talk to early users and fold feedback into weekly releases.



What We’re Looking For


  • 3–6 years building production web apps at startup pace.
  • Strong TypeScript + Node.js or Python.
  • Solid React/Next.js and modern state management.
  • Comfort with AWS or GCP, Docker, and CI/CD.
  • Bias for ownership from design to deploy.


Nice but not required: Terraform or CDK, IAM/VPC networking, vector DBs or RAG pipelines, LLM API experience, React-Flow or other canvas libs, GraphQL or event streaming, prior dev-platform work.


We don’t expect every box ticked—show us you learn fast and ship.



What You’ll Get


• Meaningful equity at the earliest stage.

• A green-field codebase you can architect the right way.

• Direct access to the founder—instant decisions, no red tape.

• Real customers from day one; your code goes live, not to backlog.

• Stipend for hardware, LLM credits, and professional growth.



Come build the future of work—where AI agents handle the busywork and people do the thinking.

Read more
NeoGenCode Technologies Pvt Ltd
Gurugram
3 - 6 yrs
₹2L - ₹12L / yr
skill iconPython
skill iconDjango
skill iconPostgreSQL
Payment gateways
skill iconRedis
+16 more

Job Title : Python Django Developer

Experience : 3+ Years

Location : Gurgaon Sector - 48

Working Days : 6 Days WFO (Monday to Saturday)


Job Summary :

We are looking for a skilled Python Django Developer with strong foundational knowledge in backend development, data structures, and operating system concepts.

The ideal candidate should have experience in Django and PostgreSQL, along with excellent logical thinking and multithreading knowledge.


Main Technical Skills : Python, Django (or Flask), PostgreSQL/MySQL, SQL & NoSQL ORM, Microservice Architecture, Third-party API integrations (e.g., payment gateways, SMS/email APIs), REST API development, JSON/XML, strong knowledge of data structures, multithreading, and OS concepts.


Key Responsibilities :

  • Write efficient, reusable, testable, and scalable code using the Django framework
  • Develop backend components, server-side logic, and statistical models
  • Design and implement high-availability, low-latency applications with robust data protection and security
  • Contribute to the development of highly responsive web applications
  • Collaborate with cross-functional teams on system design and integration

Mandatory Skills :

  • Strong programming skills in Python and Django (or similar frameworks like Flask).
  • Proficiency with PostgreSQL / MySQL and experience in writing complex queries.
  • Strong understanding of SQL and NoSQL ORM.
  • Solid grasp of data structures, multithreading, and operating system concepts.
  • Experience with RESTful API development and implementation of API security.
  • Knowledge of JSON/XML and their use in data exchange.
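The multithreading knowledge listed above can be illustrated with a small producer/consumer sketch using only the standard library (all names hypothetical); `queue.Queue` handles the locking between threads, and a `Lock` guards the shared results list:

```python
import threading
import queue

tasks = queue.Queue()
results = []
results_lock = threading.Lock()

def worker():
    while True:
        item = tasks.get()
        if item is None:          # sentinel: no more work
            tasks.task_done()
            break
        with results_lock:        # guard the shared list
            results.append(item * item)
        tasks.task_done()

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for n in range(10):
    tasks.put(n)
for _ in threads:
    tasks.put(None)               # one sentinel per worker
tasks.join()
for t in threads:
    t.join()
```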

Good-to-Have Skills :

  • Experience with Redis, MQTT, and message queues like RabbitMQ or Kafka.
  • Understanding of microservice architecture and third-party API integrations (e.g., payment gateways, SMS/email APIs).
  • Familiarity with MongoDB and other NoSQL databases.
  • Exposure to data science libraries such as Pandas, NumPy, Scikit-learn.
  • Knowledge in building and integrating statistical learning models.
Read more
NeoGenCode Technologies Pvt Ltd
Gurugram
3 - 6 yrs
₹2L - ₹12L / yr
skill iconPython
skill iconDjango
skill iconPostgreSQL
MySQL
SQL
+17 more

Job Title : Python Django Developer

Experience : 3+ Years

Location : Gurgaon

Working Days : 6 Days (Monday to Saturday)


Job Summary :

We are looking for a skilled Python Django Developer with strong foundational knowledge in backend development, data structures, and operating system concepts.

The ideal candidate should have experience in Django and PostgreSQL, along with excellent logical thinking and multithreading knowledge.


Technical Skills : Python, Django (or Flask), PostgreSQL/MySQL, SQL & NoSQL ORM, REST API development, JSON/XML, strong knowledge of data structures, multithreading, and OS concepts.


Key Responsibilities :

  • Write efficient, reusable, testable, and scalable code using the Django framework.
  • Develop backend components, server-side logic, and statistical models.
  • Design and implement high-availability, low-latency applications with robust data protection and security.
  • Contribute to the development of highly responsive web applications.
  • Collaborate with cross-functional teams on system design and integration.

Mandatory Skills :

  • Strong programming skills in Python and Django (or similar frameworks like Flask).
  • Proficiency with PostgreSQL / MySQL and experience in writing complex queries.
  • Strong understanding of SQL and NoSQL ORM.
  • Solid grasp of data structures, multithreading, and operating system concepts.
  • Experience with RESTful API development and implementation of API security.
  • Knowledge of JSON/XML and their use in data exchange.

Good-to-Have Skills :

  • Experience with Redis, MQTT, and message queues like RabbitMQ or Kafka
  • Understanding of microservice architecture and third-party API integrations (e.g., payment gateways, SMS/email APIs)
  • Familiarity with MongoDB and other NoSQL databases
  • Exposure to data science libraries such as Pandas, NumPy, Scikit-learn
  • Knowledge in building and integrating statistical learning models.
Read more
Poshmark

at Poshmark

3 candid answers
1 recruiter
Eman Khan
Posted by Eman Khan
Chennai
5 - 10 yrs
₹25L - ₹50L / yr
skill iconMachine Learning (ML)
skill iconPython
Scikit-Learn
NumPy
pandas
+9 more

Are you passionate about the power of data and excited to leverage cutting-edge AI/ML to drive business impact? At Poshmark, we tackle complex challenges in personalization, trust & safety, marketing optimization, product experience, and more.


Why Poshmark?

As a leader in Social Commerce, Poshmark offers an unparalleled opportunity to work with extensive multi-platform social and commerce data. With over 130 million users generating billions of daily events and petabytes of rapidly growing data, you’ll be at the forefront of data science innovation. If building impactful, data-driven AI solutions for millions excites you, this is your place.


What You’ll Do

  • Drive end-to-end data science initiatives, from ideation to deployment, delivering measurable business impact through projects such as feed personalization, product recommendation systems, and attribute extraction using computer vision.
  • Collaborate with cross-functional teams, including ML engineers, product managers, and business stakeholders, to design and deploy high-impact models.
  • Develop scalable solutions for key areas like product, marketing, operations, and community functions.
  • Own the entire ML Development lifecycle: data exploration, model development, deployment, and performance optimization.
  • Apply best practices for managing and maintaining machine learning models in production environments.
  • Explore and experiment with emerging AI trends, technologies, and methodologies to keep Poshmark at the cutting edge.


Your Experience & Skills

  • Ideal Experience: 6-9 years of building scalable data science solutions in a big data environment. Experience with personalization algorithms, recommendation systems, or user behavior modeling is a big plus.
  • Machine Learning Knowledge: Hands-on experience with key ML algorithms, including CNNs, Transformers, and Vision Transformers. Familiarity with Large Language Models (LLMs) and techniques like RAG or PEFT is a bonus.
  • Technical Expertise: Proficiency in Python, SQL, and Spark (Scala or PySpark), with hands-on experience in deep learning frameworks like PyTorch or TensorFlow. Familiarity with ML engineering tools like Flask, Docker, and MLOps practices.
  • Mathematical Foundations: Solid grasp of linear algebra, statistics, probability, calculus, and A/B testing concepts.
  • Collaboration & Communication: Strong problem-solving skills and ability to communicate complex technical ideas to diverse audiences, including executives and engineers.
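One of the A/B-testing fundamentals mentioned above, the two-proportion z-test, can be sketched as follows (the conversion numbers are made up for illustration):

```python
import math

def ab_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: z statistic for the difference in
    conversion rates between variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

z = ab_z_test(conv_a=200, n_a=4000, conv_b=260, n_b=4000)
# |z| > 1.96 corresponds to significance at the 5% level (two-sided)
```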
Read more
MNC in B2B Insurance Domain

MNC in B2B Insurance Domain

Agency job
via Bean HR Consulting by Sachin Bhandari
Noida, Gurugram
13 - 20 yrs
₹30L - ₹40L / yr
skill iconPython
Architecture
skill iconDjango
skill iconFlask

Job Description:

Position: Python Technical Architect

 

Major Responsibilities:

 

●           Develop and customize solutions, including workflows, Workviews, and application integrations.

●           Integrate with other enterprise applications and systems.

●           Perform system upgrades and migrations to ensure optimal performance.

●           Troubleshoot and resolve issues related to applications and workflows using Diagnostic console.

●           Ensure data integrity and security within the system.

●           Maintain documentation for system configurations, workflows, and processes.

●           Stay updated on best practices, new features and industry trends.

●           Hands-on in Waterfall & Agile Scrum methodology.

●           Working on software issues and specifications and performing Design/Code Review(s).

●           Engaging in the assignment of work to the development team resources, ensuring effective transition of knowledge, design assumptions and development expectations.

●           Ability to mentor developers and lead cross-functional technical teams.

●           Collaborate with stakeholders to gather requirements and translate them into technical specifications for effective workflow/Workview design.

●           Assist in the training of end-users and provide support as needed

●           Contributing to the organizational values by actively working with agile development teams, methodologies, and toolsets.

●           Driving concise, structured, and effective communication with peers and clients.

 

Key Capabilities and Competencies Knowledge

 

●           Proven experience as a Software Architect or Technical Project Manager with architectural responsibilities.

●           Strong proficiency in Python and relevant frameworks (Django, Flask, FastAPI).

●           Strong understanding of software development lifecycle (SDLC), agile methodologies (Scrum, Kanban) and DevOps practices.

●           Expertise in Azure cloud ecosystem and architecture design patterns.

●           Familiarity with Azure DevOps, CI/CD pipelines, monitoring and logging.

●           Experience with RESTful APIs, microservices architecture and asynchronous processing.

●           Deep understanding of insurance domain processes such as claims management, policy administration etc.

●           Experience in database design and data modelling with SQL (MySQL) and NoSQL (Azure Cosmos DB).

●           Knowledge of security best practices including data encryption, API security and compliance standards.

●           Knowledge of SAST and DAST security tools is a plus.

●           Strong documentation skill for articulating architecture decisions and technical concepts to stakeholders.

●           Experience with system integration using middleware or web services.

●           Server Load Balancing, Planning, configuration, maintenance and administration of the Server Systems.

●           Experience with developing reusable assets such as prototypes, solution designs, documentation and other materials that contribute to department efficiency.

●           Highly cognizant of the DevOps approach like ensuring basic security measures.

●           Technical writing skills, strong networking, and communication style with the ability to formulate professional emails, presentations, and documents.

●           Passion for technology trends in the insurance industry and emerging technology space.

 

 

Qualification and Experience

 

●           Recognized with a Bachelor’s degree in Computer Science, Information Technology, or equivalent.

●           Work experience: 10-12 years overall

●           Recognizable domain knowledge and awareness of basic insurance and regulatory frameworks.

●           Previous experience working in the insurance industry (AINS Certification is a plus).

Read more
Hyderabad
5 - 8 yrs
₹24L - ₹30L / yr
Apache Kafka
skill iconElastic Search
skill iconNodeJS (Node.js)
ETL
skill iconPython
+2 more

Company Overview

We are a dynamic startup dedicated to empowering small businesses through innovative technology solutions. Our mission is to level the playing field for small businesses by providing them with powerful tools to compete effectively in the digital marketplace. Join us as we revolutionize the way small businesses operate online, bringing innovation and growth to local communities.


Job Description

We are seeking a skilled and experienced Data Engineer to join our team. In this role, you will develop systems on cloud platforms capable of processing millions of interactions daily, leveraging the latest cloud computing and machine learning technologies while creating custom in-house data solutions. The ideal candidate should have hands-on experience with SQL, PL/SQL, and any standard ETL tools. You must be able to thrive in a fast-paced environment and possess a strong passion for coding and problem-solving.


Required Skills and Experience

  • Minimum 5 years of experience in software development.
  • 3+ years of experience in data management and SQL expertise – PL/SQL, Teradata, and Snowflake experience strongly preferred.
  • Expertise in big data technologies such as Hadoop, HiveQL, and Spark (Scala/Python).
  • Expertise in cloud technologies – AWS (S3, Glue, Terraform, Lambda, Aurora, Redshift, EMR).
  • Experience with queuing systems (e.g., SQS, Kafka) and caching systems (e.g., Ehcache, Memcached).
  • Experience with container management tools (e.g., Docker Swarm, Kubernetes).
  • Familiarity with data stores, including at least one of the following: Postgres, MongoDB, Cassandra, or Redis.
  • Ability to create advanced visualizations and dashboards to communicate complex findings (e.g., Looker Studio, Power BI, Tableau).
  • Strong skills in manipulating and transforming complex datasets for in-depth analysis.
  • Technical proficiency in writing code in Python and advanced SQL queries.
  • Knowledge of AI/ML infrastructure, best practices, and tools is a plus.
  • Experience in analyzing and resolving code issues.
  • Hands-on experience with software architecture concepts such as Separation of Concerns (SoC) and micro frontends with theme packages.
  • Proficiency with the Git version control system.
  • Experience with Agile development methodologies.
  • Strong problem-solving skills and the ability to learn quickly.
  • Exposure to Docker and Kubernetes.
  • Familiarity with AWS or other cloud platforms.
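The caching-systems bullet above refers to external caches like Memcached; the same memoization idea can be sketched in-process with the standard library (purely illustrative, all names hypothetical):

```python
import functools

call_count = {"n": 0}

@functools.lru_cache(maxsize=128)
def expensive_lookup(key: str) -> str:
    """Stand-in for a slow backend call that a Memcached/Ehcache-style
    cache would normally absorb; the counter shows how many calls
    actually reach the backend."""
    call_count["n"] += 1
    return key.upper()

for _ in range(3):
    expensive_lookup("user:42")   # only the first call reaches the "backend"
```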


Responsibilities

  • Develop and maintain our in-house search and reporting platform.
  • Create data solutions that complement core products to improve performance and data quality.
  • Collaborate with the development team to design, develop, and maintain our suite of products.
  • Write clean, efficient, and maintainable code, adhering to coding standards and best practices.
  • Participate in code reviews and testing to ensure high-quality code.
  • Troubleshoot and debug application issues as needed.
  • Stay up-to-date with emerging trends and technologies in the development community.


How to apply?

  • If you are passionate about designing user-centric products and want to be part of a forward-thinking company, we would love to hear from you. Please send your resume, a brief cover letter outlining your experience and your current CTC (Cost to Company) as a part of the application.


Join us in shaping the future of e-commerce!

Read more
logiquad solutions
Rahul Sharma
Posted by Rahul Sharma
Remote only
3 - 4 yrs
₹4L - ₹6L / yr
skill iconPython
skill iconReact.js
FastAPI

Job Title: Full Stack Developer

Experience- 3+ years

Location: Remote

Notice Period: Immediate Joiner Preferred


Job Description:


Responsibilities:

  • Design, develop, and maintain full-stack applications using Python (FastAPI), Node.js, and React.
  • Create and optimize RESTful APIs and backend services.
  • Work with PostgreSQL to design efficient data models and write complex SQL queries.
  • Develop responsive and interactive front-end components using React.
  • Collaborate with team members using Git for version control and code reviews.
  • Debug, troubleshoot, and enhance existing applications for better performance and user experience.

Requirements:

  • 3 years of experience in full-stack development or a similar role.
  • Strong proficiency in Python with hands-on experience in FastAPI.
  • Solid front-end development experience using React.
  • Experience developing backend services with Node.js.
  • Proficiency in PostgreSQL, including writing efficient SQL queries.
  • Familiarity with Git for version control and collaborative development.
  • Good communication and problem-solving skills.


Read more
HaystackAnalytics
Careers Hr
Posted by Careers Hr
Navi Mumbai
0 - 5 yrs
₹3L - ₹8L / yr
skill iconPython
Algorithms
skill iconFlask
skill iconDjango
skill iconMongoDB

Position – Python Developer

Location – Navi Mumbai


Who are we

Based out of IIT Bombay, HaystackAnalytics is a HealthTech company creating clinical genomics products, which enable diagnostic labs and hospitals to offer accurate and personalized diagnostics. Supported by India's most respected science agencies (DST, BIRAC, DBT), we created and launched a portfolio of products to offer genomics in infectious diseases. Our genomics-based diagnostic solution for Tuberculosis was recognized as one of the top innovations supported by BIRAC in the past 10 years, and was launched by the Prime Minister of India at the BIRAC Showcase event in Delhi in 2022.


Objectives of this Role:

  • Design and implement efficient, scalable backend services using Python.
  • Work closely with healthcare domain experts to create innovative and accurate diagnostics solutions.
  • Build APIs, services, and scripts to support data processing pipelines and front-end applications.
  • Automate recurring tasks and ensure robust integration with cloud services.
  • Maintain high standards of software quality and performance using clean coding principles and testing practices.
  • Collaborate within the team to upskill and unblock each other for faster and better outcomes.



Primary Skills – Python Development

  • Proficient in Python 3 and its ecosystem
  • Frameworks: Flask / Django / FastAPI
  • RESTful API development
  • Understanding of OOPs and SOLID design principles
  • Asynchronous programming (asyncio, aiohttp)
  • Experience with task queues (Celery, RQ)
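The asynchronous-programming item above can be illustrated with a small `asyncio` sketch (the fetch function is a hypothetical stand-in for an aiohttp request or database query):

```python
import asyncio

async def fetch_record(record_id: int) -> dict:
    """Stand-in for an I/O-bound call; the sleep simulates network latency."""
    await asyncio.sleep(0.01)
    return {"id": record_id, "status": "ok"}

async def fetch_all(ids):
    # gather() runs the coroutines concurrently instead of one after another
    return await asyncio.gather(*(fetch_record(i) for i in ids))

records = asyncio.run(fetch_all(range(5)))
```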

Database & Storage

  • Relational Databases: PostgreSQL / MySQL
  • NoSQL: MongoDB / Redis / Cassandra
  • ORM Tools: SQLAlchemy / Django ORM

Testing & Automation

  • Unit Testing: PyTest / unittest
  • Automation tools: Ansible / Terraform (good to have)
  • CI/CD pipelines

DevOps & Cloud

  • Docker, Kubernetes (basic knowledge expected)
  • Cloud platforms: AWS / Azure / GCP
  • GIT and GitOps workflows
  • Familiarity with containerized deployment & serverless architecture

Bonus Skills

  • Data handling libraries: Pandas / NumPy
  • Experience with scripting: Bash / PowerShell
  • Functional programming concepts
  • Familiarity with front-end integration (REST API usage, JSON handling)

 Other Skills

  • Innovation and thought leadership
  • Interest in learning new tools, languages, workflows
  • Strong communication and collaboration skills
  • Basic understanding of UI/UX principles


To know more about us: https://haystackanalytics.in


Read more
eblity

at eblity

1 recruiter
Rajendran Govindasamy
Posted by Rajendran Govindasamy
Remote only
3 - 5 yrs
₹9L - ₹12L / yr
skill iconReact.js
skill iconPython
skill iconPostgreSQL
skill iconAmazon Web Services (AWS)
Google Cloud Platform (GCP)
+3 more

Job Title: Full-Stack Developer

Location: Bangalore/Remote

Type: Full-time

About Eblity: 

Eblity’s mission is to empower educators and parents to help children facing challenges. 


Over 50% of children in mainstream schools face academic or behavioural challenges, most of which go unnoticed and underserved. By providing the right support at the right time, we could make a world of difference to these children.

We serve a community of over 200,000 educators and parents and over 3,000 schools.

If you are purpose-driven and want to use your skills in technology to create a positive impact for children facing challenges and their families, we encourage you to apply.

Join us in shaping the future of inclusive education and empowering learners of all abilities.


Role Overview: 

As a full-stack developer, you will lead the development of critical applications. 


These applications enable services for parents of children facing various challenges such as Autism, ADHD and Learning Disabilities, and for experts who can make a significant difference in these children’s lives. 

You will be part of a small, highly motivated team who are constantly working to improve outcomes for children facing challenges like Learning Disabilities, ADHD, Autism, Speech Disorders, etc. 

Job Description:

We are seeking a talented and proactive Full Stack Developer with hands-on experience in the React / Python / Postgres stack, leveraging Cursor and Replit for full-stack development. As part of our product development team, you will work on building responsive, scalable, and user-friendly web applications, utilizing both front-end and back-end technologies. Your expertise with Cursor as an AI agent-based development platform and Replit will be crucial for streamlining development processes and accelerating product timelines.

Responsibilities:

  • Design, develop, and maintain front-end web applications using React, ensuring a responsive, intuitive, and high-performance user experience.
  • Build and optimize the back-end using FastAPI or Flask and PostgreSQL, ensuring scalability, performance, and maintainability.
  • Leverage Replit for full-stack development, deploying applications, managing cloud resources, and streamlining collaboration across team members.
  • Utilize Cursor, an AI agent-based development platform, to enhance application development, automate processes, and optimize workflows through AI-driven code generation, data management, and integration.
  • Collaborate with cross-functional teams (back-end developers, designers, and product managers) to gather requirements, design solutions, and implement them seamlessly across the front-end and back-end.
  • Design and implement PostgreSQL database schemas, writing optimized queries to ensure efficient data retrieval and integrity.
  • Integrate RESTful APIs and third-party services across the React front-end and FastAPI/Flask/PostgreSQL back-end, ensuring smooth data flow.
  • Implement and optimize reusable React components and FastAPI/Flask functions to improve code maintainability and application performance.
  • Conduct thorough testing, including unit, integration, and UI testing, to ensure application stability and reliability.
  • Optimize both front-end and back-end applications for maximum speed and scalability, while resolving performance issues in both custom code and integrated services.
  • Stay up-to-date with emerging technologies to continuously improve the quality and efficiency of our solutions.
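To give a small flavor of the API-integration work described above, here is a hedged sketch of parsing a REST response into typed records. The payload shape, the field names (`id`, `name`, `needs`), and the `ChildProfile` type are illustrative assumptions for this example only, not Eblity's actual schema:

```python
import json
from dataclasses import dataclass

# Hypothetical record shape -- field names are invented for illustration.
@dataclass
class ChildProfile:
    child_id: int
    name: str
    needs: list

def parse_profiles(payload: str) -> list:
    """Parse a JSON API response into typed records, skipping malformed entries."""
    records = []
    for item in json.loads(payload).get("profiles", []):
        try:
            records.append(ChildProfile(
                child_id=int(item["id"]),
                name=item["name"],
                needs=item.get("needs", []),
            ))
        except (KeyError, TypeError, ValueError):
            continue  # a real pipeline would log and alert here
    return records
```

The log-and-skip pattern keeps one malformed entry from failing the whole sync, which matters when the front-end depends on the data being present.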

Qualifications:

  • Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
  • 2+ years of experience in React development, with strong knowledge of component-based architecture, state management, and front-end best practices.
  • Proven experience in Python development, with expertise in building web applications using frameworks like FastAPI or Flask.
  • Solid experience working with PostgreSQL, including designing database schemas, writing optimized queries, and ensuring efficient data retrieval.
  • Experience with Cursor, an AI agent-based development platform, to enhance full-stack development through AI-driven code generation, data management, and automation.
  • Experience with Replit for full-stack development, deploying applications, and collaborating within cloud-based environments.
  • Experience working with RESTful APIs, including their integration into both front-end and back-end systems.
  • Familiarity with development tools and frameworks such as Git, Node.js, and Nginx.
  • Strong problem-solving skills, a keen attention to detail, and the ability to work independently or within a collaborative team environment.
  • Excellent communication skills to effectively collaborate with team members and stakeholders.

Nice-to-Have:

  • Experience with other front-end frameworks (e.g., Vue, Angular).
  • Familiarity with Agile methodologies and project management tools like Jira.
  • Understanding of cloud technologies and experience deploying applications to platforms like AWS or Google Cloud.
  • Knowledge of additional back-end technologies or frameworks (e.g., FastAPI).

What We Offer:

  • A collaborative and inclusive work environment that values every team member’s input.
  • Opportunities to work on innovative projects using Cursor and Replit for full-stack development.
  • Competitive salary and comprehensive benefits package.
  • Flexible working hours and potential for remote work options.

Location: Remote

If you're passionate about full-stack development and leveraging AI-driven platforms like Cursor and Replit to build scalable solutions, apply today to join our forward-thinking team!

SCRUT Automation
Posted by Praveen K
Remote only
4 - 6 yrs
₹25L - ₹35L / yr
Python

Position: Software Development Engineer - 2

Location: Bangalore/Remote


Role Overview

We are looking for a Software Development Engineer - 2 with 4-6 years of experience who is passionate about writing clean, scalable code and enjoys solving complex backend challenges. As part of the engineering team, you’ll work on designing, developing, and maintaining backend services, primarily in Python, with exposure to other backend technologies like Node.js and Go. You'll contribute to our microservices architecture, APIs, and cloud-native solutions, ensuring security and performance at scale.


Responsibilities

  • Write, test, and maintain scalable and efficient backend code in Python (FastAPI or similar frameworks).
  • Collaborate with cross-functional teams to design and implement APIs and microservices.
  • Ensure code quality by writing and reviewing test cases, and conducting code reviews.
  • Handle bug fixing and troubleshooting for backend systems as needed.
  • Build and optimize backend systems to positively impact business outcomes.
  • Design and implement cloud-native solutions with a focus on performance and security.
  • Monitor system health and continuously improve performance and reliability.
  • Contribute to process and code improvements, focusing on best practices.
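To illustrate the event-driven pattern named in this role's requirements, here is a minimal in-process sketch. A production microservice setup would sit behind a broker such as Kafka or SQS; the topic name and handlers below are made up for the example:

```python
from collections import defaultdict

class EventBus:
    """Toy in-process event bus showing the publish/subscribe shape."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Fan out to every handler registered for the topic, in order.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
audit_log = []
bus.subscribe("user.created", lambda e: audit_log.append(("audit", e["id"])))
bus.subscribe("user.created", lambda e: audit_log.append(("email", e["id"])))
bus.publish("user.created", {"id": 42})
```

The point of the pattern is that the publisher never knows who consumes the event, which is what lets services be added or removed independently.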


Must-Have Technical Skills

  • 4-6 years of experience working with Python (preferably FastAPI or other frameworks).
  • Strong understanding of OOP principles and best coding practices.
  • Experience in designing and releasing production APIs.
  • Proficiency in RDBMS and NoSQL databases.
  • Familiarity with microservices and event-driven architecture.
  • Experience in cloud-native application development (SaaS).
  • Knowledge of cloud services such as GCS, AWS, or Azure.
  • Strong focus on security in design and coding practices.


Good-to-Have Skills

  • Experience in building and maintaining CI/CD pipelines.
  • Hands-on experience with NoSQL DBs.
  • Exposure to working in a cloud environment and familiarity with infrastructure management.
  • Aggressive problem diagnosis and creative problem-solving skills.
  • Excellent communication skills for collaborating with global teams.


About Us 

Scrut Automation is an information security and compliance monitoring platform, aimed at helping small and medium cloud-native enterprises develop and maintain a robust security posture, and comply with various infosec standards such as SOC 2, ISO 27001, GDPR, and the like with ease. With the help of the Scrut platform, customers reduce their manual effort for security and compliance tasks by 70%, and build real-time visibility of their security posture. 

Founded by IIT/ISB/McKinsey alumni, the founding team has over 15 years of combined Infosec experience. Scrut is built out of India for the world, with customers across India, APAC, North America, Europe and the Middle East. Scrut is backed by Lightspeed Ventures, MassMutual Ventures and Endiya Partners, along with prominent angels from the global SaaS community.


Why should this job excite you?


  • Flat-hierarchy, performance-driven culture 
  • Rapid growth and learning opportunities
  • Comprehensive medical insurance coverage 
  • A high-performing action-oriented team 
  • Competitive package, benefits and employee-friendly work culture


Note: Due to a high volume of applications, only shortlisted candidates will be contacted. Thank you for your understanding.


Gaian Solutions India
Agency job via AccioJob
Hyderabad
0 - 0 yrs
₹4.5L - ₹6L / yr
Python
SQL
Machine Learning (ML)
pandas
TensorFlow

AccioJob is conducting an offline hiring drive with Gaian Solutions India for the position of AI/ML Intern.


Required Skills - Python, SQL, and ML libraries (scikit-learn, pandas, TensorFlow, etc.)


Apply Here - https://go.acciojob.com/tUxTdV


Eligibility -

  • Degree: B.Tech/BE/BCA/MCA/M.Tech
  • Graduation Year: 2023, 2024, and 2025
  • Branch: All Branches
  • Work Location: Hyderabad


Compensation -

  • Internship stipend: 20-25k
  • Internship duration: 3 months
  • CTC: 4.5-6 LPA


Evaluation Process -


  • Assessment at the AccioJob Skill Centre in Pune
  • 2 Technical Interviews


Apply Here - https://go.acciojob.com/tUxTdV


Important: Please bring your laptop & earphones for the test.

Wissen Technology
Posted by Annie Varghese
Bengaluru (Bangalore)
1 - 3 yrs
Best in industry
Python
SQL
ETL
Data Visualization
Data Warehouse (DWH)

Job Summary:

We are looking for a motivated and detail-oriented Data Engineer with 1–2 years of experience to join our data engineering team. The ideal candidate should have solid foundational skills in SQL and Python, along with exposure to building or maintaining data pipelines. You’ll play a key role in helping to ingest, process, and transform data to support various business and analytical needs.

Key Responsibilities:

  • Assist in the design, development, and maintenance of scalable and efficient data pipelines.
  • Write clean, maintainable, and performance-optimized SQL queries.
  • Develop data transformation scripts and automation using Python.
  • Support data ingestion processes from various internal and external sources.
  • Monitor data pipeline performance and help troubleshoot issues.
  • Collaborate with data analysts, data scientists, and other engineers to ensure data quality and consistency.
  • Work with cloud-based data solutions and tools (e.g., AWS, Azure, GCP – as applicable).
  • Document technical processes and pipeline architecture.

Core Skills Required:

  • Proficiency in SQL (data querying, joins, aggregations, performance tuning).
  • Experience with Python, especially in the context of data manipulation (e.g., pandas, NumPy).
  • Exposure to ETL/ELT pipelines and data workflow orchestration tools (e.g., Airflow, Prefect, Luigi – preferred).
  • Understanding of relational databases and data warehouse concepts.
  • Familiarity with version control systems like Git.
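As a toy illustration of the ingest-transform flow these skills support, the sketch below uses the standard-library `sqlite3` module as a stand-in for a warehouse; the table and column names are invented for the example:

```python
import sqlite3

# In-memory database standing in for a warehouse staging area.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id INTEGER, amount REAL)")

# Ingest: load raw records from a hypothetical source.
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [(1, 10.0), (1, 5.5), (2, 3.0)],
)

# Transform: aggregate per user with a tuned SQL query.
rows = conn.execute(
    "SELECT user_id, SUM(amount) AS total "
    "FROM raw_events GROUP BY user_id ORDER BY user_id"
).fetchall()
```

In practice the same ingest/transform/load split is what orchestration tools like Airflow schedule as separate tasks, with SQL doing the heavy lifting inside the warehouse.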


Note: It is mandatory to attend one face-to-face interview round at the Bangalore office, and the candidate should be based in Bangalore.

DeepIntent
Posted by Indrajeet Deshmukh
Pune
4 - 10 yrs
Best in industry
Python
Spark
Apache Airflow
Docker
SQL

What You’ll Do:


As a Sr. Data Scientist, you will work closely across DeepIntent Data Science teams located in New York City, India, and Bosnia. The role will focus on building predictive models and implementing data-driven solutions to maximize ad effectiveness. You will also lead efforts in generating analyses and insights related to measurement of campaign outcomes, Rx, patient journey, and supporting the evolution of the DeepIntent product suite. Activities in this position include developing and deploying models in production, reading campaign results, analyzing medical claims, clinical, demographic, and clickstream data, performing analysis and creating actionable insights, and summarizing and presenting results and recommended actions to internal stakeholders and external clients, as needed.

  • Explore ways to create better predictive models 
  • Analyze medical claims, clinical, demographic and clickstream data to produce and present actionable insights 
  • Explore ways of using inference, statistical, machine learning techniques to improve the performance of existing algorithms and decision heuristics
  • Design and deploy new iterations of production-level code
  • Contribute posts to our upcoming technical blog  


Who You Are:


  • Bachelor’s degree in a STEM field, such as Statistics, Mathematics, Engineering, Biostatistics, Econometrics, Economics, Finance, Operations Research, or Data Science. A graduate degree is strongly preferred 
  • 5+ years of working experience as a Data Scientist or Researcher in digital marketing, consumer advertising, telecom, or other areas requiring customer-level predictive analytics
  • Advanced proficiency in performing statistical analysis in Python, including relevant libraries is required
  • Experience working with data processing, transformation, and building model pipelines using tools such as Spark, Airflow, and Docker
  • You have an understanding of the ad-tech ecosystem, digital marketing and advertising data and campaigns or familiarity with the US healthcare patient and provider systems (e.g. medical claims, medications)
  • You have varied and hands-on predictive machine learning experience (deep learning, boosting algorithms, inference…) 
  • You are interested in translating complex quantitative results into meaningful findings and interpretable deliverables, and communicating with less technical audiences orally and in writing
  • You can write production level code, work with Git repositories
  • Active Kaggle participant 
  • Working experience with SQL
  • Familiar with medical and healthcare data (medical claims, Rx, preferred)
  • Conversant with cloud technologies such as AWS or Google Cloud


Bengaluru (Bangalore)
3 - 6 yrs
₹10L - ₹30L / yr
CI/CD
C
Python
Jenkins
GitHub

Role Summary:

We are seeking a skilled and detail-oriented SRE Release Engineer to lead and streamline the CI/CD pipeline for our C and Python codebase. You will be responsible for coordinating, automating, and validating biweekly production releases, ensuring operational stability, high deployment velocity, and system reliability.


Key Responsibilities:

● Own the release process: Plan, coordinate, and execute biweekly software releases across multiple services.

● Automate release pipelines: Build and maintain CI/CD workflows using tools such as GitHub Actions, Jenkins, or GitLab CI.

● Version control: Manage and enforce Git best practices, branching strategies (e.g., Git Flow), tagging, and release versioning.

● Integrate testing frameworks: Ensure automated test coverage (unit, integration, regression) is enforced pre-release.

● Release validation: Develop pre-release verification tools/scripts to validate build integrity and backward compatibility.

● Deployment strategy: Implement and refine blue/green, rolling, or canary deployments in staging and production environments.

● Incident readiness: Partner with SREs to ensure rollback strategies, monitoring, and alerting are release-aware.

● Collaboration: Work closely with developers, QA, and product teams to align on release timelines and feature readiness. 
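To make the tagging and versioning responsibility above concrete, here is a small hedged sketch of picking the latest `vMAJOR.MINOR.PATCH` tag and computing the next patch tag. The tag format is an assumption; a team using a different scheme would adapt the regex:

```python
import re

# Assumed tag convention: vMAJOR.MINOR.PATCH (e.g. v1.10.0).
TAG_RE = re.compile(r"^v(\d+)\.(\d+)\.(\d+)$")

def latest_release(tags):
    """Pick the highest semantic-version tag, ignoring non-matching refs."""
    parsed = [tuple(map(int, m.groups()))
              for t in tags if (m := TAG_RE.match(t))]
    return "v{}.{}.{}".format(*max(parsed)) if parsed else None

def next_patch(tag):
    """Compute the next patch tag, e.g. for a hotfix cut from a release branch."""
    major, minor, patch = map(int, TAG_RE.match(tag).groups())
    return f"v{major}.{minor}.{patch + 1}"
```

Comparing parsed integer tuples rather than raw strings is the key detail: a plain string sort would rank `v1.9.9` above `v1.10.0`.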


Required Qualifications

● Bachelor’s degree in Computer Science, Engineering, or related field.

● 3+ years in SRE, DevOps, or release engineering roles.

● Proficiency in CI/CD tooling (e.g., GitHub Actions, Jenkins, GitLab).

● Experience automating deployments for C and Python applications.

● Strong understanding of Git version control, merge/rebase strategies, tagging, and submodules (if used).

● Familiarity with containerization (Docker) and deployment orchestration (e.g., Kubernetes, Ansible, or Terraform).

● Solid scripting experience (Python, Bash, or similar).

● Understanding of observability, monitoring, and incident response tooling (e.g., Prometheus, Grafana, ELK, Sentry).


Preferred Skills

● Experience with release coordination in data networking environments.

● Familiarity with build tools like Make, CMake, or Bazel.

● Exposure to artifact management systems (e.g., Artifactory, Nexus).

● Experience deploying to Linux production systems with service uptime guarantees.  
