50+ Python Jobs in Mumbai | Python Job openings in Mumbai
Apply to 50+ Python Jobs in Mumbai on CutShort.io. Explore the latest Python Job opportunities across top companies like Google, Amazon & Adobe.


🚀 We’re Hiring: Python Developer – Quant Strategies & Backtesting | Mumbai (Goregaon East)
Are you a skilled Python Developer passionate about financial markets and quantitative trading?
We’re looking for someone to join our growing Quant Research & Algo Trading team, where you’ll work on:
🔹 Developing & optimizing trading strategies in Python
🔹 Building backtesting frameworks across multiple asset classes
🔹 Processing and analyzing large market datasets
🔹 Collaborating with quant researchers & traders on real-world strategies
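As a rough illustration of the kind of work above, here is a minimal vectorized moving-average crossover backtest in Pandas/NumPy. The strategy, parameters, and synthetic price series are purely illustrative and not this team's actual framework:

```python
import numpy as np
import pandas as pd

def backtest_ma_crossover(prices: pd.Series, fast: int = 10, slow: int = 30) -> pd.Series:
    """Vectorized long-only moving-average crossover backtest (toy example)."""
    fast_ma = prices.rolling(fast).mean()
    slow_ma = prices.rolling(slow).mean()
    # Hold a position of 1 when the fast MA is above the slow MA, else 0.
    position = (fast_ma > slow_ma).astype(float)
    # Shift the signal by one bar so today's signal earns tomorrow's return
    # (avoids lookahead bias).
    returns = prices.pct_change().fillna(0.0)
    strategy_returns = position.shift(1).fillna(0.0) * returns
    return (1.0 + strategy_returns).cumprod()  # equity curve

# Synthetic geometric-random-walk prices, for demonstration only.
rng = np.random.default_rng(42)
prices = pd.Series(100 * np.exp(np.cumsum(rng.normal(0.0005, 0.01, 500))))
equity = backtest_ma_crossover(prices)
print(f"Final equity multiple: {equity.iloc[-1]:.3f}")
```

A production backtesting framework would add transaction costs, slippage, and multi-asset handling, but the vectorized Pandas style above is the usual starting point.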
What we’re looking for:
✔️ 3+ years of experience in Python development (preferably in fintech/trading/quant domains)
✔️ Strong knowledge of Pandas, NumPy, SciPy, SQL
✔️ Experience in backtesting, data handling & performance optimization
✔️ Familiarity with financial markets is a big plus
📍 Location: Goregaon East, Mumbai
💼 Competitive package + exposure to cutting-edge quant strategies

Wissen Technology is hiring for Data Engineer
About Wissen Technology:
At Wissen Technology, we deliver niche, custom-built products that solve complex business challenges across industries worldwide. Founded in 2015, our core philosophy is built around a strong product engineering mindset—ensuring every solution is architected and delivered right the first time. Today, Wissen Technology has a global footprint with 2000+ employees across offices in the US, UK, UAE, India, and Australia.
Our commitment to excellence translates into delivering 2X impact compared to traditional service providers. How do we achieve this? Through a combination of deep domain knowledge, cutting-edge technology expertise, and a relentless focus on quality. We don’t just meet expectations—we exceed them by ensuring faster time-to-market, reduced rework, and greater alignment with client objectives. We have a proven track record of building mission-critical systems across industries, including financial services, healthcare, retail, manufacturing, and more.
Wissen stands apart through its unique delivery models. Our outcome-based projects ensure predictable costs and timelines, while our agile pods provide clients the flexibility to adapt to their evolving business needs. Wissen leverages its thought leadership and technology prowess to drive superior business outcomes. Our success is powered by top-tier talent. Our mission is clear: to be the partner of choice for building world-class custom products that deliver exceptional impact—the first time, every time.
Job Summary: Wissen Technology is hiring a Data Engineer with a strong background in Python, data engineering, and workflow optimization. The ideal candidate will have experience with Delta Tables and Parquet, and be proficient in Pandas and PySpark.
Experience: 7+ years
Location: Pune, Mumbai, Bangalore
Mode of Work: Hybrid
Key Responsibilities:
- Develop and maintain data pipelines using Python (Pandas, PySpark).
- Optimize data workflows and ensure efficient data processing.
- Work with Delta Tables and Parquet for data storage and management.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Ensure data quality and integrity throughout the data lifecycle.
- Implement best practices for data engineering and workflow optimization.
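To make the data-quality responsibility above concrete, here is a minimal Pandas sketch of a validation gate; the column names and rules are hypothetical, and in a real pipeline the cleaned frame would then be written out as Parquet or to a Delta table via Spark:

```python
import pandas as pd

def validate_batch(df: pd.DataFrame, key: str, required: list[str]) -> pd.DataFrame:
    """Minimal data-quality gate: drop duplicate keys and rows missing required fields."""
    df = df.drop_duplicates(subset=key, keep="first")
    df = df.dropna(subset=required)
    return df.reset_index(drop=True)

# Hypothetical raw batch with one duplicate and one incomplete row.
raw = pd.DataFrame({
    "order_id": [1, 1, 2, 3],
    "amount": [10.0, 10.0, None, 7.5],
    "currency": ["INR", "INR", "INR", "INR"],
})
clean = validate_batch(raw, key="order_id", required=["amount", "currency"])
print(clean)
# A real pipeline would then persist `clean`, e.g. clean.to_parquet(...)
# or a Spark write to a Delta table — omitted here.
```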
Qualifications and Required Skills:
- Proficiency in Python, specifically with Pandas and PySpark.
- Strong experience in data engineering and workflow optimization.
- Knowledge of Delta Tables and Parquet.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment.
- Strong communication skills.
Good to Have Skills:
- Experience with Databricks.
- Knowledge of Apache Spark, DBT, and Airflow.
- Advanced Pandas optimizations.
- Familiarity with PyTest/DBT testing frameworks.
Wissen Sites:
- Website: http://www.wissen.com
- LinkedIn: https://www.linkedin.com/company/wissen-technology
- Wissen Leadership: https://www.wissen.com/company/leadership-team/
- Wissen Live: https://www.linkedin.com/company/wissen-technology/posts/feedView=All
- Wissen Thought Leadership: https://www.wissen.com/articles/
Wissen | Driving Digital Transformation
A technology consultancy that drives digital innovation by connecting strategy and execution, helping global clients to strengthen their core technology.


Job Title: Data Engineering Support Engineer / Manager
Experience range: 8+ years
Location: Mumbai
Knowledge, Skills and Abilities
- Python, SQL
- Familiarity with data engineering
- Experience with AWS data and analytics services or similar cloud vendor services
- Strong problem solving and communication skills
- Ability to organise and prioritise work effectively
Key Responsibilities
- Incident and user management for data and analytics platform
- Development and maintenance of a Data Quality framework (including anomaly detection)
- Implementation of Python & SQL hotfixes and working with data engineers on more complex issues
- Diagnostic tools implementation and automation of operational processes
Key Relationships
- Work closely with data scientists, data engineers, and platform engineers in a highly commercial environment
- Support research analysts and traders with issue resolution

Responsibilities
- Design, develop, and maintain backend systems and RESTful APIs using Python (Django, FastAPI, or Flask)
- Build real-time communication features using WebSockets, SSE, and async IO
- Implement event-driven architectures using messaging systems like Kafka, RabbitMQ, Redis Streams, or NATS
- Develop and maintain microservices that interact over messaging and streaming protocols
- Ensure high scalability and availability of backend services
- Collaborate with frontend developers, DevOps engineers, and product managers to deliver end-to-end solutions
- Write clean, maintainable code with unit/integration tests
- Lead technical discussions, review code, and mentor junior engineers
Requirements
- 8+ years of backend development experience, with at least 8 years in Python
- Strong experience with asynchronous programming in Python (e.g., asyncio, aiohttp, FastAPI)
- Production experience with WebSockets and Server-Sent Events
- Hands-on experience with at least one messaging system: Kafka, RabbitMQ, Redis Pub/Sub, or similar
- Proficient in RESTful API design and microservices architecture
- Solid experience with relational and NoSQL databases
- Familiarity with Docker and container-based deployment
- Strong understanding of API security, authentication, and performance optimization
Nice to Have
- Experience with GraphQL or gRPC
- Familiarity with stream processing frameworks (e.g., Apache Flink, Spark Streaming)
- Cloud experience (AWS, GCP, Azure), particularly with managed messaging or pub/sub services
- Knowledge of CI/CD and infrastructure as code
- Exposure to AI engineering workflows and tools
- Interest or experience in building Agentic AI systems or integrating backends with AI agents
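As a rough sketch of the event-driven, async-IO style this role calls for, here is a toy in-memory pub/sub broker using asyncio. It is a stand-in for a real system such as Kafka, RabbitMQ, or Redis Streams, and all names are illustrative:

```python
import asyncio

class InMemoryBroker:
    """Toy in-memory pub/sub broker illustrating the event-driven pattern.

    A production system would use Kafka, RabbitMQ, or Redis Streams instead."""

    def __init__(self) -> None:
        self.subscribers: list[asyncio.Queue] = []

    def subscribe(self) -> asyncio.Queue:
        q: asyncio.Queue = asyncio.Queue()
        self.subscribers.append(q)
        return q

    async def publish(self, message: dict) -> None:
        # Fan the message out to every subscriber queue.
        for q in self.subscribers:
            await q.put(message)

async def main() -> list[dict]:
    broker = InMemoryBroker()
    inbox = broker.subscribe()
    received: list[dict] = []

    async def consumer() -> None:
        while True:
            msg = await inbox.get()
            if msg.get("type") == "stop":  # sentinel to shut the consumer down
                break
            received.append(msg)

    task = asyncio.create_task(consumer())
    await broker.publish({"type": "order", "id": 1})
    await broker.publish({"type": "order", "id": 2})
    await broker.publish({"type": "stop"})
    await task
    return received

events = asyncio.run(main())
print(events)
```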
About the Role
We are looking for a hands-on and solution-oriented Senior Data Scientist – Generative AI to join our growing AI practice. This role is ideal for someone who thrives in designing and deploying Gen AI solutions on AWS, enjoys working with customers directly, and can lead end-to-end implementations. You will play a key role in architecting AI solutions, driving project delivery, and guiding junior team members.
Key Responsibilities
- Design and implement end-to-end Generative AI solutions for customers on AWS.
- Work closely with customers to understand business challenges and translate them into Gen AI use-cases.
- Own technical delivery, including data preparation, model integration, prompt engineering, deployment, and performance monitoring.
- Lead project execution – ensure timelines, manage stakeholder communications, and collaborate across internal teams.
- Provide technical guidance and mentorship to junior data scientists and engineers.
- Develop reusable components and reference architectures to accelerate delivery.
- Stay updated with latest developments in Gen AI, particularly AWS offerings like Bedrock, SageMaker, LangChain integrations, etc.
Required Skills & Experience
- 4–8 years of hands-on experience in Data Science/AI/ML, with at least 2–3 years in Generative AI projects.
- Proficient in building solutions using AWS AI/ML services (e.g., SageMaker, Amazon Bedrock, Lambda, API Gateway, S3, etc.).
- Experience with LLMs, prompt engineering, RAG pipelines, and deployment best practices.
- Solid programming experience in Python, with exposure to libraries such as Hugging Face, LangChain, etc.
- Strong problem-solving skills and ability to work independently in customer-facing roles.
- Experience in collaborating with Systems Integrators (SIs) or working with startups in India is a major plus.
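To make the RAG idea above concrete, here is a deliberately naive pure-Python sketch: rank documents by token overlap with the query, then assemble a grounded prompt. A real pipeline would use embeddings, a vector store, and an LLM call (e.g. via Bedrock or LangChain); everything here, including the sample documents, is illustrative:

```python
def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive token overlap with the query (toy retriever)."""
    q_tokens = set(query.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(q_tokens & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Assemble a context-grounded prompt from the top-k retrieved documents."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Amazon Bedrock provides managed access to foundation models.",
    "SageMaker supports training and hosting custom ML models.",
    "S3 is an object storage service.",
]
prompt = build_prompt("Which service hosts foundation models?", docs)
print(prompt)
```

The prompt string would then be sent to an LLM; swapping the toy retriever for an embedding search against a vector store turns this into a standard RAG pipeline.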
Soft Skills
- Strong verbal and written communication for effective customer engagement.
- Ability to lead discussions, manage project milestones, and coordinate across stakeholders.
- Team-oriented with a proactive attitude and strong ownership mindset.
What We Offer
- Opportunity to work on cutting-edge Generative AI projects across industries.
- Collaborative, startup-like work environment with flexibility and ownership.
- Exposure to full-stack AI/ML project lifecycle and client-facing roles.
- Competitive compensation and learning opportunities in the AWS AI ecosystem.
About Oneture Technologies
Founded in 2016, Oneture is a cloud-first, full-service digital solutions company, helping clients harness the power of Digital Technologies and Data to drive transformations and turning ideas into business realities. Our team is full of curious, full-stack, innovative thought leaders who are dedicated to providing outstanding customer experiences and building authentic relationships. We are compelled by our core values to drive transformational results from Ideas to Reality for clients across all company sizes, geographies, and industries. The Oneture team delivers full lifecycle solutions— from ideation, project inception, planning through deployment to ongoing support and maintenance.
Our core competencies and technical expertise include cloud-powered Product Engineering, Big Data, and AI/ML. Our deep commitment to value creation for our clients and partners and our “Startups-like agility with Enterprises-like maturity” philosophy have helped us establish long-term relationships with our clients and enabled us to build and manage mission-critical platforms for them.

Teknobuilt is an innovative construction technology company accelerating a Digital and AI platform that supports all aspects of program management and execution, bringing workflow automation to collaborative manual tasks and siloed systems. Our platform has received innovation awards and grants in Canada, the UK, and S. Korea, and we are at the frontiers of solving key challenges in the built environment and in digital health, safety, and quality.
Teknobuilt's vision is helping the world build better: safely, smartly, and sustainably. We are on a mission to modernize construction by bringing our Digitally Integrated Project Execution System (PACE) and expert services to midsize and large construction and infrastructure projects. PACE is an end-to-end digital solution that helps with real-time project execution, health and safety, quality, and field management for greater visibility and cost savings. PACE enables digital workflows, remote working, and AI-based analytics to bring speed, flow, and surety to project delivery. Our platform has received recognition globally for innovation, and we are experiencing a period of significant growth for our solutions.
Job Responsibilities
As a Quality Analyst Engineer, you will be expected to:
- Thoroughly analyze project requirements, design specifications, and user stories to understand the scope and objectives.
- Arrange, set up, and configure necessary test environments for effective test case execution.
- Participate in and conduct review meetings to discuss test plans, test cases, and defect statuses.
- Execute manual test cases with precision, analyze results, and identify deviations from expected behavior.
- Accurately track, log, prioritize, and manage defects through their lifecycle, ensuring clear communication with developers until resolution.
- Maintain continuous and clear communication with the Test Manager and development team regarding testing progress, roadblocks, and critical findings.
- Develop, maintain, and manage comprehensive test documentation, including:
  - Detailed Test Plans
  - Well-structured Test Cases for various testing processes
  - Concise Summary Reports on test execution and defect status
  - Thorough Test Data preparation for test cases
  - "Lessons Learned" documents based on testing inputs from previous projects
  - "Suggestion Documents" aimed at improving overall software quality
  - Clearly defined Test Scenarios
- Clearly report identified bugs to developers with precise steps to reproduce, expected results, and actual results, facilitating efficient defect resolution.
Job Overview:
We are looking for a skilled Senior Backend Engineer to join our team. The ideal candidate will have a strong foundation in Java and Spring, with proven experience in building scalable microservices and backend systems. This role also requires familiarity with automation tools, Python development, and working knowledge of AI technologies.
Responsibilities:
- Design, develop, and maintain backend services and microservices.
- Build and integrate RESTful APIs across distributed systems.
- Ensure performance, scalability, and reliability of backend systems.
- Collaborate with cross-functional teams and participate in agile development.
- Deploy and maintain applications on AWS cloud infrastructure.
- Contribute to automation initiatives and AI/ML feature integration.
- Write clean, testable, and maintainable code following best practices.
- Participate in code reviews and technical discussions.
Required Skills:
- 4+ years of backend development experience.
- Strong proficiency in Java and Spring/Spring Boot frameworks.
- Solid understanding of microservices architecture.
- Experience with REST APIs, CI/CD, and debugging complex systems.
- Proficient in AWS services such as EC2, Lambda, S3.
- Strong analytical and problem-solving skills.
- Excellent communication in English (written and verbal).
Good to Have:
- Experience with automation tools like Workato or similar.
- Hands-on experience with Python development.
- Familiarity with AI/ML features or API integrations.
- Comfortable working with US-based teams (flexible hours).

📢 DATA SOURCING & ANALYSIS EXPERT (L3 Support) – Mumbai 📢
Are you ready to supercharge your Data Engineering career in the financial domain?
We’re seeking a seasoned professional (5–7 years experience) to join our Mumbai team and lead in data sourcing, modelling, and analysis. If you’re passionate about solving complex challenges in Relational & Big Data ecosystems, this role is for you.
What You’ll Be Doing
- Translate business needs into robust data models, program specs, and solutions
- Perform advanced SQL optimization, query tuning, and L3-level issue resolution
- Work across the entire data stack: ETL, Python / Spark, Autosys, and related systems
- Debug, monitor, and improve data pipelines in production
- Collaborate with business, analytics, and engineering teams to deliver dependable data services
What You Should Bring
- 5+ years in financial / fintech / capital markets environment
- Proven expertise in relational databases and big data technologies
- Strong command over SQL tuning, query optimization, indexing, partitioning
- Hands-on experience with ETL pipelines, Spark / PySpark, Python scripting, job scheduling (e.g. Autosys)
- Ability to troubleshoot issues at the L3 level, perform root cause analysis, and tune performance
- Good communication skills — you’ll coordinate with business users, analytics, and tech teams

Designation: Python Developer
Experienced in AI/ML
Location: Turbhe, Navi Mumbai
CTC: 6-12 LPA
Years of Experience: 2-5 years
At Arcitech.ai, we’re redefining the future with AI-powered software solutions across education, recruitment, marketplaces, and beyond. We’re looking for a Python Developer passionate about AI/ML, who’s ready to work on scalable, cloud-native platforms and help build the next generation of intelligent, LLM-driven products.
💼 Your Responsibilities
AI/ML Engineering
- Develop, train, and optimize ML models using PyTorch/TensorFlow/Keras.
- Build end-to-end LLM and RAG (Retrieval-Augmented Generation) pipelines using LangChain.
- Collaborate with data scientists to convert prototypes into production-grade AI applications.
- Integrate NLP, Computer Vision, and Recommendation Systems into scalable products.
- Work with transformer-based architectures (BERT, GPT, LLaMA, etc.) for real-world AI use cases.
Backend & Systems Development
- Design, develop, and maintain robust Python microservices with REST/GraphQL APIs.
- Implement real-time communication with Django Channels/WebSockets.
- Containerize AI services with Docker and deploy on Kubernetes (EKS/GKE/AKS).
- Configure and manage AWS (EC2, S3, RDS, SageMaker, CloudWatch) for AI/ML workloads.
Reliability & Automation
- Develop background task queues with Celery, ensuring smart retries and monitoring.
- Implement CI/CD pipelines for automated model training, testing, and deployment.
- Write automated unit & integration tests (pytest/unittest) with ≥80% coverage.
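The "smart retries" point above can be sketched in pure Python as an exponential-backoff decorator. Celery tasks would normally use the framework's own retry support (e.g. `autoretry_for`); this just shows the underlying idea, and the flaky task is hypothetical:

```python
import functools
import time

def retry(max_attempts: int = 3, base_delay: float = 0.01):
    """Retry a flaky callable with exponential backoff between attempts."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == max_attempts:
                        raise  # give up after the final attempt
                    time.sleep(base_delay * 2 ** (attempt - 1))
        return wrapper
    return decorator

calls = {"n": 0}

@retry(max_attempts=3)
def flaky_task():
    """Hypothetical task that fails twice before succeeding."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

result = flaky_task()
print(result)
```

In production the retry policy would also cap total delay, add jitter, and emit monitoring events on each failure.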
Collaboration
- Contribute to MLOps best practices and mentor peers in LangChain/AI integration.
- Participate in tech talks, code reviews, and AI learning sessions within the team.
🎓 Required Qualifications
- Bachelor’s or Master’s degree in Computer Science, AI/ML, or related field.
- 2–5 years of experience in Python development with strong AI/ML exposure.
- Hands-on experience with LangChain for building LLM-powered workflows and RAG systems.
- Deep learning experience with PyTorch or TensorFlow.
- Experience deploying ML models and LLM apps into production systems.
- Familiarity with REST/GraphQL APIs and cloud platforms (AWS/Azure/GCP).
- Skilled in Git workflows, automated testing, and CI/CD practices.
🌟 Nice to Have
- Experience with vector databases (Pinecone, Weaviate, FAISS, Milvus) for retrieval pipelines.
- Knowledge of LLM fine-tuning, prompt engineering, and evaluation frameworks.
- Familiarity with Airflow/Prefect/Dagster for data and model pipelines.
- Background in statistics, optimization, or applied mathematics.
- Contributions to AI/ML or LangChain open-source projects.
- Experience with model monitoring and drift detection in production.
🎁 Why Join Us
- Competitive compensation and benefits 💰
- Work on cutting-edge LLM and AI/ML applications 🤖
- A collaborative, innovation-driven work culture 📚
- Opportunities to grow into AI/ML leadership roles 🚀

What we want to accomplish and why we need you?
Jio Haptik is an AI leader having pioneered AI-powered innovation since 2013. Reliance Jio Digital Services acquired Haptik in April 2019. Haptik currently leads India’s AI market having become the first to process 15 billion+ two-way conversations across 10+ channels and in 135 languages. Haptik is also a Category Leader across platforms including Gartner, G2, Opus Research & more. Recently Haptik won the award for “Tech Startup of the Year” in the AI category at Entrepreneur India Awards 2023, and gold medal for “Best Chat & Conversational Bot” at Martequity Awards 2023. Haptik has a headcount of 200+ employees with offices in Mumbai, Delhi, and Bangalore.
What will you do every day?
Our Implementation & Professional Services team plays a pivotal role in ensuring clients derive maximum value from our solutions. We are looking for an Engineering Manager to lead a delivery-focused team that bridges technology, client success, and execution excellence. You will ensure seamless solution implementation, timely project delivery, and a high-quality client experience.
- Manage and lead a team of solution engineers, developers, and QA professionals working on client implementations and custom delivery projects.
- Collaborate with project managers and client-facing teams to ensure smooth execution of implementation roadmaps.
- Translate client requirements into scalable, maintainable technical solutions while ensuring alignment with Haptik’s platform architecture.
- Own sprint planning, resource allocation, and delivery commitments across multiple concurrent client projects.
- Establish and enforce delivery best practices across coding, testing, deployment, and documentation.
- Act as the escalation point for technical challenges during delivery, unblocking teams and ensuring minimal disruption to project timelines.
- Drive continuous improvements in tools, processes, and automation to accelerate implementations.
- Coach and mentor engineers to develop their skills while keeping them aligned to client-centric delivery goals.
- Partner closely with cross-functional teams (Product, Solutions, Customer Success, and Sales Engineering) to ensure successful hand-offs and long-term customer satisfaction.
- Support recruitment and team scaling as we grow our services and delivery footprint.
Ok, you're sold, but what are we looking for in the perfect candidate?
- Strong technical foundation with the ability to guide teams on architecture, integrations, and scalable solution design.
- Demonstrated experience managing agile delivery teams that have implemented enterprise-grade software solutions.
- Experience in Professional Services, Client Implementations, or Delivery Engineering is highly preferred.
- Proven ability to balance customer requirements, delivery commitments, and engineering best practices.
- Strong exposure to technologies relevant to distributed systems: programming languages (Python preferred), APIs, cloud platforms, databases, CI/CD, containerization, queues, caches, Elasticsearch/SOLR, etc.
- Excellent stakeholder management and communication skills — comfortable working with both internal teams and enterprise customers.
- Ability to mentor, motivate, and inspire teams to deliver on ambitious goals while ensuring a culture of ownership and accountability.
- 5–8 years of professional experience, including 2–4 years in leading delivery or implementation-focused engineering teams.
Requirements*:
- B.E. / B.Tech. / MCA in Computer Science, Engineering or a related field is required.
- Overall 5-8 years of professional experience. 2-4 years of hands-on experience in managing and leading agile teams delivering tech products.
- Deep understanding of best practices in development and testing processes.
- Good exposure to various technical and architectural concepts of building distributed systems - not limited to but including at least one programming language/framework, version control, CI/CD, queues, caches, SOLR/Elasticsearch, databases, containerization, cloud platforms.
- Excellent written and verbal communication skills.
- Exceptional organizational skills and time management abilities.
- Prior experience of working with Python Programming language.
- Background of being part of a high-paced development team having delivered client-facing products with hands-on involvement.
* Requirements is such a strong word. We don’t necessarily expect to find a candidate who has done everything listed, but you should be able to make a credible case that you’ve done most of it and are ready for the challenge of adding some new things to your resume.
Tell me more about Haptik
- On a roll: Announced major strategic partnership in April 2019 with Jio in a $100 million deal.
- Great team: You will be working with great leaders who have been listed in Business World 40 Under 40, Forbes 30 Under 30 and MIT 35 Under 35 Innovators.
- Great culture: The freedom to think and innovate is something that defines the culture of Haptik. Every person is approachable. While we are working hard, it is also important to take breaks to not get too worked up.
- Huge market: Disrupting a massive, growing AI market. The global market is projected to attain a valuation of $9 billion by the end of 2024.
- Emerging technology: We are moving to a Gen AI first world, and Haptik is one of the largest Generative AI first companies globally, based out of India.
- Great customers: Some of the most notable brands in the world - Jio, Paytm, Adani, Paisabazaar, Puma & Whirlpool
- Impact: A fun and exciting start-up culture that empowers its people to make a huge impact.
Working hard for things that we don't care about is stress, but working hard for something we love is called passion! At Haptik we passionately solve problems in order to be able to move faster and each Haptikan imbibes our key values of honesty, ownership, perseverance, communication, impact, curiosity, courage, agility and selflessness.
About The Company:
Dolat is a dynamic team of traders, puzzle solvers, and coding enthusiasts focused on tackling complex challenges in the financial world. We specialize in trading in volatile markets and developing cutting-edge technologies and strategies. We're seeking a skilled Linux Support Engineer to manage over 400 servers and support 100+ users. Our engineers ensure a high-performance, low-latency environment while maintaining simplicity and control. If you're passionate about technology, trading, and problem-solving, this is the place to engineer your skills into a rewarding career.
Qualifications:
- B.E/ B.Tech
- Experience: 1-3 years.
- Job location: Andheri West, Mumbai.
Responsibilities:
- Troubleshoot network issues, kernel panics, system hangs, and performance bottlenecks.
- Fine-tune processes for minimal jitter in a low-latency environment.
- Support low-latency servers, lines, and networks, participating in on-call rotations.
- Install, configure, and deploy fully-distributed Red Hat Linux systems.
- Deploy, configure, and monitor complex trading applications.
- Provide hands-on support to trading, risk, and compliance teams (Linux & Windows platforms).
- Automate processes and analyze performance metrics to improve system efficiency.
- Collaborate with development teams to maintain a stable, high-performance trading environment.
- Drive continuous improvement, system reliability, and simplicity.
- Resolve issues in a fast-paced, results-driven IT team.
- Provide level one and two support for tools, systems testing, and production release.
Skills Required:
- Expertise in Linux kernel tuning and configuration management (CFEngine).
- Experience with hardware testing/integration and IT security.
- Proficient in maintaining Cisco, Windows, and PC hardware.
- Good knowledge in Perl, Python, Powershell & Bash.
- Hands-on knowledge of SSH, iptables, NFS, DNS, DHCP, and LDAP.
- Experience with Open Source tools (Nagios, SmokePing, MRTG) for enterprise-level systems.
- Knowledge of maintaining Cisco ports/VLANs/802.1X (dot1x).
- Solid understanding of OS and network architectures.
- REDHAT Certification and SQL/database knowledge.
- Ability to manage multiple tasks in a fast-paced environment.
- Excellent communication skills and fluency in English.
- Proven technical problem-solving capabilities.
- Strong documentation and knowledge management skills.
- Software development skills are a bonus.
- Preferred SQL and database administration skills.
Industry
- Financial Services

About The Role
As a Data Platform Lead, you will utilize your strong technical background and hands-on development skills to design, develop, and maintain data platforms.
Leading a team of skilled data engineers, you will create scalable and robust data solutions that enhance business intelligence and decision-making. You will ensure the reliability, efficiency, and scalability of data systems while mentoring your team to achieve excellence.
Collaborating closely with our client’s CXO-level stakeholders, you will oversee pre-sales activities, solution architecture, and project execution. Your ability to stay ahead of industry trends and integrate the latest technologies will be crucial in maintaining our competitive edge.
Key Responsibilities
- Client-Centric Approach: Understand client requirements deeply and translate them into robust technical specifications, ensuring solutions meet their business needs.
- Architect for Success: Design scalable, reliable, and high-performance systems that exceed client expectations and drive business success.
- Lead with Innovation: Provide technical guidance, support, and mentorship to the development team, driving the adoption of cutting-edge technologies and best practices.
- Champion Best Practices: Ensure excellence in software development and IT service delivery, constantly assessing and evaluating new technologies, tools, and platforms for project suitability.
- Be the Go-To Expert: Serve as the primary point of contact for clients throughout the project lifecycle, ensuring clear communication and high levels of satisfaction.
- Build Strong Relationships: Cultivate and manage relationships with CxO/VP level stakeholders, positioning yourself as a trusted advisor.
- Deliver Excellence: Manage end-to-end delivery of multiple projects, ensuring timely and high-quality outcomes that align with business goals.
- Report with Clarity: Prepare and present regular project status reports to stakeholders, ensuring transparency and alignment.
- Collaborate Seamlessly: Coordinate with cross-functional teams to ensure smooth and efficient project execution, breaking down silos and fostering collaboration.
- Grow the Team: Provide timely and constructive feedback to support the professional growth of team members, creating a high-performance culture.
Qualifications
- Master’s (M.Tech., M.S.) in Computer Science or equivalent from reputed institutes such as IIT or NIT preferred
- Overall 6-8 years of experience, with a minimum of 2 years of relevant experience and a strong technical background.
- Experience working in a mid-size IT services company is preferred
Location : Mumbai / Pune (Hybrid)
Technical Expertise:
- Advanced knowledge of distributed architectures and data modeling practices.
- Extensive experience with Data Lakehouse systems like Databricks and data warehousing solutions such as Redshift and Snowflake.
- Hands-on experience with data technologies such as Apache Spark, SQL, Airflow, Kafka, Jenkins, Hadoop, Flink, Hive, Pig, HBase, Presto, and Cassandra.
- Knowledge of BI tools including Power BI, Tableau, and QuickSight, as well as open-source equivalents like Superset and Metabase, is good to have.
- Strong knowledge of data storage formats including Iceberg, Hudi, and Delta.
- Proficient programming skills in Python, Scala, Go, or Java.
- Ability to architect end-to-end solutions from data ingestion to insights, including designing data integrations using ETL and other data integration patterns.
- Experience working with multi-cloud environments, particularly AWS and Azure.
- Excellent teamwork and communication skills, with the ability to thrive in a fast-paced, agile environment.

Job description:
Title: Python Developer
Location: Onsite – Mumbai, Maharashtra
Experience: 5-6 years in Python development
Joining: Immediate
About the Role
We are building cutting-edge AI products designed for enterprise-scale applications and are looking for a Senior Python Developer to join our core engineering team. You will be responsible for designing and delivering robust, scalable backend systems that power our advanced AI solutions.
Key Responsibilities
- Design, develop, and maintain scalable Python-based backend applications and services.
- Collaborate with AI/ML teams to integrate machine learning models into production environments.
- Optimize applications for performance, reliability, and security.
- Write clean, maintainable, and testable code following best practices.
- Work with cross-functional teams including Data Science, DevOps, and UI/UX to ensure seamless delivery.
- Participate in code reviews, architecture discussions, and technical decision-making.
- Troubleshoot, debug, and upgrade existing systems.
Required Skills & Experience
- Minimum 5 years of professional Python development experience.
- Strong expertise in Django / Flask / FastAPI.
- Hands-on experience with REST APIs, microservices, and event-driven architecture.
- Solid understanding of databases (PostgreSQL, MySQL, MongoDB, Redis).
- Familiarity with cloud platforms (AWS / Azure / GCP) and CI/CD pipelines.
- Experience with AI/ML pipeline integration is a strong plus.
- Strong problem-solving and debugging skills.
- Excellent communication skills and ability to work in a collaborative environment.
Good to Have
- Experience with Docker, Kubernetes.
- Exposure to message brokers (RabbitMQ, Kafka).
- Knowledge of data engineering tools (Airflow, Spark).
- Familiarity with Neo4j or other graph databases.
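The event-driven architecture mentioned above can be sketched with a toy in-process event bus (all names here are invented for illustration; a production system would use a broker such as the RabbitMQ or Kafka listed under Good to Have):

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Toy in-process pub/sub; real deployments would use RabbitMQ or Kafka."""
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Deliver the event to every handler registered for this topic.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
received = []
bus.subscribe("model.scored", received.append)
bus.publish("model.scored", {"request_id": 1, "score": 0.97})
```

The pattern decouples producers from consumers: the publisher never needs to know which services react to an event.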

About Us:
PluginLive is an all-in-one tech platform that bridges the gap between all its stakeholders - Corporates, Institutes, Students, and Assessment & Training Partners. This ecosystem helps Corporates with brand building and positioning among colleges and the student community to scale their human capital, while increasing student placements for Institutes and giving Students a real-time perspective of the corporate world to help them upskill into more desirable candidates.
Role Overview:
Entry-level Data Engineer position focused on building and maintaining data pipelines while developing visualization skills. You'll work alongside senior engineers to support our data infrastructure and create meaningful insights through data visualization.
Responsibilities:
- Assist in building and maintaining ETL/ELT pipelines for data processing
- Write SQL queries to extract and analyze data from various sources
- Support data quality checks and basic data validation processes
- Create simple dashboards and reports using visualization tools
- Learn and work with Oracle Cloud services under guidance
- Use Python for basic data manipulation and cleaning tasks
- Document data processes and maintain data dictionaries
- Collaborate with team members to understand data requirements
- Participate in troubleshooting data issues with senior support
- Contribute to data migration tasks as needed
Qualifications:
Required:
- Bachelor's degree in Computer Science, Information Systems, or related field
- Around 2 years of experience in data engineering or a related field
- Strong SQL knowledge and database concepts
- Comfortable with Python programming
- Understanding of data structures and ETL concepts
- Problem-solving mindset and attention to detail
- Good communication skills
- Willingness to learn cloud technologies
Preferred:
- Exposure to Oracle Cloud or any cloud platform (AWS/GCP)
- Basic knowledge of data visualization tools (Tableau, Power BI, or Python libraries like Matplotlib)
- Experience with Pandas for data manipulation
- Understanding of data warehousing concepts
- Familiarity with version control (Git)
- Academic projects or internships involving data processing
Nice-to-Have:
- Knowledge of dbt, BigQuery, or Snowflake
- Exposure to big data concepts
- Experience with Jupyter notebooks
- Comfort with AI-assisted coding tools (Copilot, GPTs)
- Personal projects showcasing data work
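The basic data manipulation and validation tasks described above can be sketched with Pandas (the table and column names are invented for illustration):

```python
import pandas as pd

# Hypothetical raw extract with the usual problems: a missing key,
# a duplicate record, and an unparseable date.
raw = pd.DataFrame({
    "user_id": [1, 2, 2, 3, None],
    "signup_date": ["2024-01-05", "2024-01-06", "2024-01-06", "bad-date", "2024-01-08"],
})

clean = (
    raw.dropna(subset=["user_id"])   # basic validation: drop rows missing the key
       .drop_duplicates()            # remove exact duplicate records
       .assign(signup_date=lambda d: pd.to_datetime(d["signup_date"], errors="coerce"))
       .dropna(subset=["signup_date"])  # discard rows whose date failed to parse
)
```

This leaves two valid rows (users 1 and 2); each step mirrors a routine quality check a pipeline would run before loading.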
What We Offer:
- Mentorship from senior data engineers
- Hands-on learning with modern data stack
- Access to paid AI tools and learning resources
- Clear growth path to mid-level engineer
- Direct impact on product and data strategy
- No unnecessary meetings — focused execution
- Strong engineering culture with continuous learning opportunities


Job Description
We are looking for a talented Java Developer to work abroad. You will be responsible for developing high-quality software solutions, working on both server-side components and integrations, and ensuring optimal performance and scalability.
Preferred Qualifications
- Experience with microservices architecture.
- Knowledge of cloud platforms (AWS, Azure).
- Familiarity with Agile/Scrum methodologies.
- Understanding of front-end technologies (HTML, CSS, JavaScript) is a plus.
Requirement Details
Bachelor’s degree in Computer Science, Information Technology, or a related field (or equivalent experience).
Proven experience as a Java Developer or similar role.
Strong knowledge of Java programming language and its frameworks (Spring, Hibernate).
Experience with relational databases (e.g., MySQL, PostgreSQL) and ORM tools.
Familiarity with RESTful APIs and web services.
Understanding of version control systems (e.g., Git).
Solid understanding of object-oriented programming (OOP) principles.
Strong problem-solving skills and attention to detail.


Job Title: Python Developer (FastAPI)
Experience Required: 4+ years
Location: Pune, Bangalore, Hyderabad, Mumbai, Panchkula, Mohali
Shift: Night Shift, 6:30 PM to 3:30 AM IST
About the Role
We are seeking an experienced Python Developer with strong expertise in FastAPI to join our engineering team. The ideal candidate should have a solid background in backend development, RESTful API design, and scalable application development.
Required Skills & Qualifications
· 4+ years of professional experience in backend development with Python.
· Strong hands-on experience with FastAPI (or Flask/Django with migration experience).
· Familiarity with asynchronous programming in Python.
· Working knowledge of version control systems (Git).
· Good problem-solving and debugging skills.
· Strong communication and collaboration abilities.
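The asynchronous-programming familiarity asked for above boils down to comfort with coroutines; here is a minimal asyncio sketch (function names are invented, and the sleep stands in for real I/O; FastAPI endpoints follow the same `async def` pattern):

```python
import asyncio

async def fetch_record(record_id: int) -> dict:
    """Stand-in for an async DB or HTTP call; the sleep simulates I/O latency."""
    await asyncio.sleep(0.01)
    return {"id": record_id, "status": "ok"}

async def fetch_all(ids: list[int]) -> list[dict]:
    # gather() runs the coroutines concurrently: each awaits its I/O
    # while the others make progress, instead of blocking in sequence.
    return await asyncio.gather(*(fetch_record(i) for i in ids))

results = asyncio.run(fetch_all([1, 2, 3]))
```

Results come back in submission order, so three 10 ms "requests" complete in roughly 10 ms rather than 30 ms.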



Lead Data Scientist
Location: Mumbai
Application Link: https://flpl.keka.com/careers/jobdetails/40052
What you’ll do
- Manage end-to-end data science projects from scoping to deployment, ensuring accuracy, reliability and measurable business impact
- Translate business needs into actionable DS tasks, lead data wrangling, feature engineering, and model optimization
- Communicate insights to non-technical stakeholders to guide decisions while mentoring a 14-member DS team.
- Implement scalable MLOps, automated pipelines, and reusable frameworks to accelerate delivery and experimentation
What we’re looking for
- 4-5 years of hands-on experience in Data Science/ML with strong foundations in statistics, Linear Algebra, and optimization
- Proficient in Python (NumPy, pandas, scikit-learn, XGBoost) and experienced with at least one cloud platform (AWS, GCP, or Azure)
- Skilled in building data pipelines (Airflow, Spark) and deploying models using Docker, FastAPI, etc
- Adept at communicating insights effectively to both technical and non-technical audiences
- Bachelor’s degree in any field
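The scikit-learn workflow named above can be sketched in a few lines (synthetic data and an invented toy target, purely illustrative):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic dataset: the label depends linearly on the first two features.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 3))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# A pipeline bundles preprocessing and the model so they are fit together,
# which avoids leaking scaling statistics between train and inference.
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)
accuracy = model.score(X, y)
```

Because the toy target is linearly determined by the features, the fitted pipeline separates it almost perfectly.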
You might have an edge over others if
- Experience with LLMs or GenAI apps
- Contributions to open-source or published research
- Exposure to real-time analytics and industrial datasets
You should not apply with us if
- You don’t want to work in agile environments
- The unpredictability and super iterative nature of startups scare you
- You hate working with people who are smarter than you
- You don’t thrive in self-driven, “owner mindset” environments- nothing wrong- just not our type!
About us
We’re Faclon Labs – a high-growth, deep-tech startup on a mission to make infrastructure and utilities smarter using IoT and SaaS. Sounds heavy? That’s because we do heavy lifting — in tech, in thinking, and in creating real-world impact.
We’re not your average startup. We don’t do corporate fluff. We do ownership, fast iterations, and big ideas. If you're looking for ping-pong tables, we're still saving up. But if you want to shape the soul of the company while it's being built- this is the place!

🚀 Hiring: Python Developer
⭐ Experience: 2+ Years
📍 Location: Mumbai
⭐ Work Mode: 5 Days Work From Office
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
Looking for a skilled Python Developer with experience in Django / FastAPI and MongoDB / PostgreSQL.
⭐ Must-Have Skills:
✅ 2+ years of professional experience as a Python Developer
✅ Proficient in Django or FastAPI
✅ Hands-on with MongoDB & PostgreSQL
✅ Strong understanding of REST APIs & Git

🚀 We’re Hiring: Senior Cloud & ML Infrastructure Engineer 🚀
We’re looking for an experienced engineer to lead the design, scaling, and optimization of cloud-native ML infrastructure on AWS.
If you’re passionate about platform engineering, automation, and running ML systems at scale, this role is for you.
What you’ll do:
🔹 Architect and manage ML infrastructure with AWS (SageMaker, Step Functions, Lambda, ECR)
🔹 Build highly available, multi-region solutions for real-time & batch inference
🔹 Automate with IaC (AWS CDK, Terraform) and CI/CD pipelines
🔹 Ensure security, compliance, and cost efficiency
🔹 Collaborate across DevOps, ML, and backend teams
What we’re looking for:
✔️ 6+ years AWS cloud infrastructure experience
✔️ Strong ML pipeline experience (SageMaker, ECS/EKS, Docker)
✔️ Proficiency in Python/Go/Bash scripting
✔️ Knowledge of networking, IAM, and security best practices
✔️ Experience with observability tools (CloudWatch, Prometheus, Grafana)
✨ Nice to have: Robotics/IoT background (ROS2, Greengrass, Edge Inference)
📍 Location: Bengaluru, Hyderabad, Mumbai, Pune, Mohali, Delhi
5 days working, Work from Office
Night shifts: 9pm to 6am IST
👉 If this sounds like you (or someone you know), let’s connect!
Apply here:

Job Title : Software Development Engineer (Python, Django & FastAPI + React.js)
Experience : 2+ Years
Location : Nagpur / Remote (India)
Job Type : Full Time
Collaboration Hours : 11:00 AM – 7:00 PM IST
About the Role :
We are seeking a Software Development Engineer to join our growing team. The ideal candidate will have strong expertise in backend development with Python, Django, and FastAPI, as well as working knowledge of AWS.
While backend development is the primary focus, you should also be comfortable contributing to frontend development using JavaScript, TypeScript, and React.
Mandatory Skills : Python, Django, FastAPI, AWS, JavaScript/TypeScript, React, REST APIs, SQL/NoSQL.
Key Responsibilities :
- Design, develop, and maintain backend services using Python (Django / FastAPI).
- Deploy, scale, and manage applications on AWS cloud services.
- Collaborate with frontend developers and contribute to React (JS/TS) development when required.
- Write clean, efficient, and maintainable code following best practices.
- Ensure system performance, scalability, and security.
- Participate in the full software development lifecycle : planning, design, development, testing, and deployment.
- Work collaboratively with cross-functional teams to deliver high-quality solutions.
Requirements :
- Bachelor’s degree in Computer Science, Computer Engineering, or related field.
- 2+ years of professional software development experience.
- Strong proficiency in Python, with hands-on experience in Django and FastAPI.
- Practical experience with AWS cloud services.
- Basic proficiency in JavaScript, TypeScript, and React for frontend development.
- Solid understanding of REST APIs, databases (SQL/NoSQL), and software design principles.
- Familiarity with Git and collaborative workflows.
- Strong problem-solving ability and adaptability in a fast-paced environment.
Good to Have :
- Experience with Docker for containerization.
- Knowledge of CI/CD pipelines and DevOps practices.



Job Summary:
We are looking for a highly skilled and experienced Data Engineer with deep expertise in Airflow, dbt, Python, and Snowflake. The ideal candidate will be responsible for designing, building, and managing scalable data pipelines and transformation frameworks to enable robust data workflows across the organization.
Key Responsibilities:
- Design and implement scalable ETL/ELT pipelines using Apache Airflow for orchestration.
- Develop modular and maintainable data transformation models using dbt.
- Write high-performance data processing scripts and automation using Python.
- Build and maintain data models and pipelines on Snowflake.
- Collaborate with data analysts, data scientists, and business teams to deliver clean, reliable, and timely data.
- Monitor and optimize pipeline performance and troubleshoot issues proactively.
- Follow best practices in version control, testing, and CI/CD for data projects.
Must-Have Skills:
- Strong hands-on experience with Apache Airflow for scheduling and orchestrating data workflows.
- Proficiency in dbt (data build tool) for building scalable and testable data models.
- Expert-level skills in Python for data processing and automation.
- Solid experience with Snowflake, including SQL performance tuning, data modeling, and warehouse management.
- Strong understanding of data engineering best practices including modularity, testing, and deployment.
Good to Have:
- Experience working with cloud platforms (AWS/GCP/Azure).
- Familiarity with CI/CD pipelines for data (e.g., GitHub Actions, GitLab CI).
- Exposure to modern data stack tools (e.g., Fivetran, Stitch, Looker).
- Knowledge of data security and governance best practices.
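Orchestrators like Airflow run pipeline tasks in dependency order; the toy scheduler below (pure Python standard library, not the Airflow API) illustrates the underlying DAG idea with invented task names:

```python
from graphlib import TopologicalSorter

# Toy pipeline: task name -> set of upstream dependencies.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# static_order() yields tasks so that every task appears after its dependencies,
# which is exactly the guarantee a scheduler needs before running each step.
run_order = list(TopologicalSorter(dag).static_order())
```

Airflow adds scheduling, retries, and monitoring on top, but the dependency-resolution core is the same topological ordering.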
Note : One face-to-face (F2F) round is mandatory, and as per the process, you will need to visit the office for this.

We are seeking a highly skilled React JS Developer with exceptional DOM manipulation expertise and real-time data handling experience to join our team. You'll be building and optimizing high-performance user interfaces for stock market trading applications where milliseconds matter and data flows continuously.
The ideal candidate thrives in fast-paced environments, understands the intricacies of browser performance, and has hands-on experience with WebSockets and real-time data streaming architectures.
Key Responsibilities
Core Development
- Advanced DOM Operations: Implement complex, performance-optimized DOM manipulations for real-time trading interfaces
- Real-time Data Management: Build robust WebSocket connections and handle high-frequency data streams with minimal latency
- Performance Engineering: Create lightning-fast, scalable front-end applications that process thousands of market updates per second
- Custom Component Architecture: Design and build reusable, high-performance React components optimized for trading workflows
Collaboration & Integration
- Work closely with traders, quants, and backend developers to translate complex trading requirements into intuitive interfaces
- Collaborate with UX/UI designers and product managers to create responsive, trader-focused experiences
- Integrate with real-time market data APIs and trading execution systems
Technical Excellence
- Implement sophisticated data visualizations and interactive charts using libraries like Chart.js, TradingView, or custom D3.js solutions
- Ensure cross-browser compatibility and responsiveness across multiple devices and screen sizes
- Debug and resolve complex performance issues, particularly in real-time data processing and rendering
- Maintain high-quality code through reviews, testing, and comprehensive documentation
Required Skills & Experience
React & JavaScript Mastery
- 5+ years of professional React.js development with deep understanding of React internals, hooks, and advanced patterns
- Expert-level JavaScript (ES6+) with strong proficiency in asynchronous programming, closures, and memory management
- Advanced HTML5 & CSS3 skills with focus on performance and cross-browser compatibility
Real-time & Performance Expertise
- Proven experience with WebSockets and real-time data streaming protocols
- Strong DOM manipulation skills - direct DOM access, virtual scrolling, efficient updates, and performance optimization
- RESTful API integration with experience in handling high-frequency data feeds
- Browser performance optimization - understanding of rendering pipeline, memory management, and profiling tools
Development Tools & Practices
- Proficiency with modern build tools: Webpack, Babel, Vite, or similar
- Experience with Git version control and collaborative development workflows
- Agile/Scrum development environment experience
- Understanding of testing frameworks (Jest, React Testing Library)
Financial Data Visualization
- Experience with financial charting libraries: Chart.js, TradingView, D3.js, or custom visualization solutions
- Understanding of market data structures, order books, and trading terminology
- Knowledge of data streaming optimization techniques for financial applications
Nice-to-Have Skills
Domain Expertise
- Prior experience in stock market, trading, or financial services - understanding of trading workflows, order management, risk systems
- Algorithmic trading knowledge or exposure to quantitative trading systems
- Financial market understanding - equities, derivatives, commodities
Technical Plus Points
- Backend development experience with GoLang, Python, or Node.js
- Database knowledge: SQL, NoSQL, time-series databases (InfluxDB, TimescaleDB)
- Cloud platform experience: AWS, Azure, GCP for deploying scalable applications
- Message queue systems: Redis, RabbitMQ, Kafka, NATS for real-time data processing
- Microservices architecture understanding and API design principles
Advanced Skills
- Service Worker implementation for offline-first applications
- Progressive Web App (PWA) development
- Mobile-first responsive design expertise
Qualifications
- Bachelor's degree in Computer Science, Engineering, or related field (or equivalent professional experience)
- 5+ years of professional React.js development with demonstrable experience in performance-critical applications
- Portfolio or examples of complex real-time applications you've built
- Financial services experience strongly preferred
Why You'll Love Working Here
We're a team that hustles—plain and simple. But we also believe life outside work matters. No cubicles, no suits—just great people doing great work in a space built for comfort and creativity.
What We Offer
💰 Competitive salary – Get paid what you're worth
🌴 Generous paid time off – Recharge and come back sharper
🌍 Work with the best – Collaborate with top-tier global talent
✈️ Adventure together – Annual offsites (mostly outside India) and regular team outings
🎯 Performance rewards – Multiple bonuses for those who go above and beyond
🏥 Health covered – Comprehensive insurance so you're always protected
⚡ Fun, not just work – On-site sports, games, and a lively workspace
🧠 Learn and lead – Regular knowledge-sharing sessions led by your peers
📚 Annual Education Stipend – Take any external course, bootcamp, or certification that makes you better at your craft
🏋️ Stay fit – Gym memberships with equal employer contribution to keep you at your best
🚚 Relocation support – Smooth move? We've got your back
🏆 Friendly competition – Work challenges and extracurricular contests to keep things exciting
We work hard, play hard, and grow together. Join us.

We are seeking an experienced Operations Lead to drive operational excellence and lead a dynamic team in our fast-paced environment. The ideal candidate will combine strong technical expertise in Python with proven leadership capabilities to optimize processes, ensure system reliability, and deliver results.
Key Responsibilities
- Team & stakeholder leadership - Lead 3-4 operations professionals and work cross-functionally with developers, system administrators, quants, and traders
- DevOps automation & deployment - Develop deployment pipelines, automate configuration management, and build Python-based tools for operational processes and system optimization
- Technical excellence & standards - Drive code reviews, establish development standards, ensure regional consistency with DevOps practices, and maintain technical documentation
- System operations & performance - Monitor and optimize system performance for high availability, scalability, and security while managing day-to-day operations
- Incident management & troubleshooting - Coordinate incident response, resolve infrastructure and deployment issues, and implement automated solutions to prevent recurring problems
- Strategic technical leadership - Make infrastructure decisions, identify operational requirements, design scalable architecture, and stay current with industry best practices
- Reporting & continuous improvement - Report on operational metrics and KPIs to senior leadership while actively contributing to DevOps process improvements
Qualifications and Experience
- Bachelor's degree in Computer Science, Engineering, or related technical field
- Proven experience of at least 5 years as a Software Engineer, including at least 2 years as a DevOps Engineer or similar role, working with complex software projects and environments.
- Excellent knowledge of cloud technologies, containers, and orchestration.
- Proficiency in scripting and programming languages such as Python and Bash.
- Experience with Linux operating systems and command-line tools.
- Proficient in using Git for version control.
Good to Have
- Experience with Nagios or similar monitoring and alerting systems
- Backend and/or frontend development experience for operational tooling
- Previous experience working in a trading firm or financial services environment
- Knowledge of database management and SQL
- Familiarity with cloud platforms (AWS, Azure, GCP)
- Experience with DevOps practices and CI/CD pipelines
- Understanding of network protocols and system administration
(P.S. We hire for talent, not pedigree—but if you’ve worked at a top tech co or fintech startup, we’d love to hear how you’ve shipped great products.)


- 4+ years of experience
- Proficiency in Python programming.
- Experience with Python service development (REST APIs using Flask or similar)
- Basic knowledge of front-end development.
- Basic knowledge of Data manipulation and analysis libraries
- Code versioning and collaboration (Git)
- Knowledge of libraries for extracting data from websites (web scraping)
- Knowledge of SQL and NoSQL databases
- Familiarity with Cloud (Azure /AWS) technologies
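The Python-plus-SQL combination listed above can be sketched with the standard library's SQLite driver (the table and data are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (name TEXT, price REAL)")
conn.executemany(
    "INSERT INTO products VALUES (?, ?)",
    [("widget", 9.99), ("gadget", 24.50), ("gizmo", 4.25)],
)

# Parameterised queries: never interpolate user input into SQL strings.
rows = conn.execute(
    "SELECT name FROM products WHERE price > ? ORDER BY name", (5,)
).fetchall()
cheap_count = conn.execute(
    "SELECT COUNT(*) FROM products WHERE price <= ?", (5,)
).fetchone()[0]
```

The same driver interface (connect, execute, fetch) carries over to PostgreSQL and MySQL clients via Python's DB-API.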


We are seeking a highly skilled Qt/QML Engineer to design and develop advanced GUIs for aerospace applications. The role requires working closely with system architects, avionics software engineers, and mission systems experts to create reliable, intuitive, and real-time UIs for mission-critical systems such as UAV ground control stations and cockpit displays.
Key Responsibilities
- Design, develop, and maintain high-performance UI applications using Qt/QML (Qt Quick, QML, C++).
- Translate system requirements into responsive, interactive, and user-friendly interfaces.
- Integrate UI components with real-time data streams from avionics systems, UAVs, or mission control software.
- Collaborate with aerospace engineers to ensure compliance with DO-178C, or MIL-STD guidelines where applicable.
- Optimise application performance for low-latency visualisation in mission-critical environments.
- Implement data visualisation (raster and vector maps, telemetry, flight parameters, mission planning overlays).
- Write clean, testable, and maintainable code while adhering to aerospace software standards.
- Work with cross-functional teams (system engineers, hardware engineers, test teams) to validate UI against operational requirements.
- Support debugging, simulation, and testing activities, including hardware-in-the-loop (HIL) setups.
Required Qualifications
- Bachelor’s / Master’s degree in Computer Science, Software Engineering, or related field.
- 1-3 years of experience in developing Qt/QML-based applications (Qt Quick, QML, Qt Widgets).
- Strong proficiency in C++ (11/14/17) and object-oriented programming.
- Experience integrating UI with real-time data sources (TCP/IP, UDP, serial, CAN, DDS, etc.).
- Knowledge of multithreading, performance optimisation, and memory management.
- Familiarity with aerospace/automotive domain software practices or mission-critical systems.
- Good understanding of UX principles for operator consoles and mission planning systems.
- Strong problem-solving, debugging, and communication skills.
Desirable Skills
- Experience with GIS/Mapping libraries (OpenSceneGraph, Cesium, Marble, etc.).
- Knowledge of OpenGL, Vulkan, or 3D visualisation frameworks.
- Exposure to DO-178C or aerospace software compliance.
- Familiarity with UAV ground control software (QGroundControl, Mission Planner, etc.) or similar mission systems.
- Experience with Linux and cross-platform development (Windows/Linux).
- Scripting knowledge in Python for tooling and automation.
- Background in defence, aerospace, automotive or embedded systems domain.
What We Offer
- Opportunity to work on cutting-edge aerospace and defence technologies.
- Collaborative and innovation-driven work culture.
- Exposure to real-world avionics and mission systems.
- Growth opportunities in autonomy, AI/ML for aerospace, and avionics UI systems.

Responsibilities :
- Design and develop user-friendly web interfaces using HTML, CSS, and JavaScript.
- Utilize modern frontend frameworks and libraries such as React, Angular, or Vue.js to build dynamic and responsive web applications.
- Develop and maintain server-side logic using programming languages such as Java, Python, Ruby, Node.js, or PHP.
- Build and manage APIs for seamless communication between the frontend and backend systems.
- Integrate third-party services and APIs to enhance application functionality.
- Implement CI/CD pipelines to automate testing, integration, and deployment processes.
- Monitor and optimize the performance of web applications to ensure a high-quality user experience.
- Stay up-to-date with emerging technologies and industry trends to continuously improve development processes and application performance.
Qualifications :
- Bachelor's/Master's in Computer Science or related subjects, or hands-on experience demonstrating a working understanding of software applications.
- Knowledge of building applications that can be deployed in a cloud environment or are cloud native applications.
- Strong expertise in building backend applications using Java/C#/Python, with demonstrable experience in frameworks such as Spring, Vert.x, .NET, or FastAPI.
- Deep understanding of enterprise design patterns, API development and integration, and Test-Driven Development (TDD).
- Working knowledge in building applications that leverage databases such as PostgreSQL, MySQL, MongoDB, Neo4J or storage technologies such as AWS S3, Azure Blob Storage.
- Hands-on experience in building enterprise applications adhering to their needs of security and reliability.
- Hands-on experience building applications using one of the major cloud providers (AWS, Azure, GCP).
- Working knowledge of CI/CD tools for application integration and deployment.
- Working knowledge of using reliability tools to monitor the performance of the application.


Role Overview:
We are seeking a talented and experienced Data Architect with strong data visualization capabilities to join our dynamic team in Mumbai. As a Data Architect, you will be responsible for designing, building, and managing our data infrastructure, ensuring its reliability, scalability, and performance. You will also play a crucial role in transforming complex data into insightful visualizations that drive business decisions. This role requires a deep understanding of data modeling, database technologies (particularly Oracle Cloud), data warehousing principles, and proficiency in data manipulation and visualization tools, including Python and SQL.
Responsibilities:
- Design and implement robust and scalable data architectures, including data warehouses, data lakes, and operational data stores, primarily leveraging Oracle Cloud services.
- Develop and maintain data models (conceptual, logical, and physical) that align with business requirements and ensure data integrity and consistency.
- Define data governance policies and procedures to ensure data quality, security, and compliance.
- Collaborate with data engineers to build and optimize ETL/ELT pipelines for efficient data ingestion, transformation, and loading.
- Develop and execute data migration strategies to Oracle Cloud.
- Utilize strong SQL skills to query, manipulate, and analyze large datasets from various sources.
- Leverage Python and relevant libraries (e.g., Pandas, NumPy) for data cleaning, transformation, and analysis.
- Design and develop interactive and insightful data visualizations using tools such as Tableau, Power BI, Matplotlib, Seaborn, or Plotly to communicate data-driven insights to both technical and non-technical stakeholders.
- Work closely with business analysts and stakeholders to understand their data needs and translate them into effective data models and visualizations.
- Ensure the performance and reliability of data visualization dashboards and reports.
- Stay up-to-date with the latest trends and technologies in data architecture, cloud computing (especially Oracle Cloud), and data visualization.
- Troubleshoot data-related issues and provide timely resolutions.
- Document data architectures, data flows, and data visualization solutions.
- Participate in the evaluation and selection of new data technologies and tools.
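To illustrate the Pandas-based cleaning and transformation step mentioned above, here is a small sketch; the column names, values, and cleaning rules are invented for the example.

```python
import pandas as pd

# Hypothetical raw feed with duplicates, missing values, and messy labels.
raw = pd.DataFrame({
    "customer_id": [1, 2, 2, 3],
    "amount": ["100", "250", "250", None],
    "region": ["West", "east", "east", "WEST"],
})

clean = (
    raw.drop_duplicates()              # remove exact duplicate rows
       .dropna(subset=["amount"])      # drop rows missing the measure
       .assign(
           amount=lambda d: d["amount"].astype(float),          # cast to numeric
           region=lambda d: d["region"].str.strip().str.title() # normalise labels
       )
)

# Aggregate for a downstream visualization or report.
summary = clean.groupby("region", as_index=False)["amount"].sum()
```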
Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field.
- Proven experience (typically 5+ years) as a Data Architect, Data Modeler, or similar role.
- Deep understanding of data warehousing concepts, dimensional modeling (e.g., star schema, snowflake schema), and ETL/ELT processes.
- Extensive experience working with relational databases, particularly Oracle, and proficiency in SQL.
- Hands-on experience with Oracle Cloud data services (e.g., Autonomous Data Warehouse, Object Storage, Data Integration).
- Strong programming skills in Python and experience with data manipulation and analysis libraries (e.g., Pandas, NumPy).
- Demonstrated ability to create compelling and effective data visualizations using industry-standard tools (e.g., Tableau, Power BI, Matplotlib, Seaborn, Plotly).
- Excellent analytical and problem-solving skills with the ability to interpret complex data and translate it into actionable insights.
- Strong communication and presentation skills, with the ability to effectively communicate technical concepts to non-technical audiences.
- Experience with data governance and data quality principles.
- Familiarity with agile development methodologies.
- Ability to work independently and collaboratively within a team environment.
Application Link- https://forms.gle/km7n2WipJhC2Lj2r5

We are seeking a highly skilled and motivated Full-Stack Developer with expertise in DVR, NVR, IP Camera integration, and IoT communication. The ideal candidate will have experience with the ONVIF protocol and be capable of developing a seamless, unified application that provides live streaming, NVR/DVR management, and smart home automation.
Responsibilities:
- Design and develop robust applications for integrating DVR, NVR, and IP Camera systems.
- Implement live video streaming using protocols such as RTSP.
- Develop and maintain RESTful APIs for managing and configuring NVR/DVR settings.
- Integrate ONVIF protocol to ensure compatibility with ONVIF-compliant devices.
- Establish and configure IoT communication using protocols like MQTT, CoAP, or HTTP/HTTPS.
- Integrate smart home automation systems (e.g., Google Home, Amazon Alexa, Apple HomeKit).
- Collaborate with cross-functional teams to define, design, and ship new features.
- Write clean, maintainable, and efficient code.
- Conduct thorough testing and debugging to ensure the reliability and security of the application.
- Stay updated with the latest industry trends and technologies.
Requirements:
- Bachelor's degree in computer science, Engineering, or related field, or equivalent practical experience.
- Proven experience as a Full-Stack Developer or similar role.
- Strong proficiency in Python and C++ back-end technologies
- Familiarity with IoT communication protocols (MQTT, CoAP, HTTP/HTTPS).
- Experience with smart home automation platforms (Google Home, Amazon Alexa, Apple HomeKit).
- Proficiency in database management (PostgreSQL, MySQL, MongoDB).
- Solid understanding of security best practices and secure communication channels (TLS/SSL).
- Hands-on with Linux/Debian
- Experience working with Raspberry Pi / Arduino Uno / Andbox
- Strong problem-solving skills and the ability to work independently or as part of a team.
Good to have:
- Knowledge of ONVIF protocol /ONVIF SDKs
- Working knowledge of IP Camera
- Excellent communication and collaboration skills.
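As a small illustration of the streaming and IoT pieces above, the sketch below builds an RTSP URL and an MQTT-style state message using only the standard library. The URL path convention and topic scheme are assumptions; real cameras and brokers vary by vendor.

```python
import json
from urllib.parse import quote

def rtsp_url(host: str, user: str, password: str, port: int = 554,
             stream: str = "stream1") -> str:
    """Build an RTSP URL for an IP camera.

    The /stream1 path is an assumed convention; consult the camera's
    (or ONVIF GetStreamUri's) documentation for the real path.
    """
    return f"rtsp://{quote(user)}:{quote(password)}@{host}:{port}/{stream}"

def device_state_message(device_id: str, state: str):
    """Serialise a smart-home state update for an MQTT-style topic.

    Topic layout is illustrative; an actual integration would follow the
    broker's or platform's topic schema.
    """
    topic = f"home/devices/{device_id}/state"
    payload = json.dumps({"device_id": device_id, "state": state})
    return topic, payload
```

In a real application the URL would be handed to a streaming pipeline (e.g. via RTSP client libraries) and the payload published through an MQTT client over TLS.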

Role Overview:
As a Backend Developer at LearnTube.ai, you will ship the backbone that powers 2.3 million learners in 64 countries—owning APIs that crunch 1 billion learning events & the AI that supports it with <200 ms latency.
Skip the wait and get noticed faster by completing our AI-powered screening. Click this link to start your quick interview. It only takes a few minutes and could be your shortcut to landing the job! -https://bit.ly/LT_Python
What You'll Do:
At LearnTube, we’re pushing the boundaries of Generative AI to revolutionize how the world learns. As a Backend Engineer, your roles and responsibilities will include:
- Ship Micro-services – Build FastAPI services that handle ≈ 800 req/s today and will triple within a year (sub-200 ms p95).
- Power Real-Time Learning – Drive the quiz-scoring & AI-tutor engines that crunch millions of events daily.
- Design for Scale & Safety – Model data (Postgres, Mongo, Redis, SQS) and craft modular, secure back-end components from scratch.
- Deploy Globally – Roll out Dockerised services behind NGINX on AWS (EC2, S3, SQS) and GCP (GKE) via Kubernetes.
- Automate Releases – GitLab CI/CD + blue-green / canary = multiple safe prod deploys each week.
- Own Reliability – Instrument with Prometheus / Grafana, chase 99.9 % uptime, trim infra spend.
- Expose Gen-AI at Scale – Publish LLM inference & vector-search endpoints in partnership with the AI team.
- Ship Fast, Learn Fast – Work with founders, PMs, and designers in weekly ship rooms; take a feature from Figma to prod in < 2 weeks.
What makes you a great fit?
Must-Haves:
- 2+ yrs Python back-end experience (FastAPI)
- Strong with Docker & container orchestration
- Hands-on with GitLab CI/CD, AWS (EC2, S3, SQS) or GCP (GKE / Compute) in production
- SQL/NoSQL (Postgres, MongoDB) + You’ve built systems from scratch & have solid system-design fundamentals
Nice-to-Haves
- k8s at scale, Terraform,
- Experience with AI/ML inference services (LLMs, vector DBs)
- Go / Rust for high-perf services
- Observability: Prometheus, Grafana, OpenTelemetry
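As a flavour of the quiz-scoring work described above, here is a pure-Python sketch of the core logic such an engine might run; the payload shape ({question_id: choice}) is an assumption for illustration, not LearnTube's actual event schema.

```python
def score_quiz(answers: dict, answer_key: dict) -> float:
    """Return the fraction of questions answered correctly.

    `answers` and `answer_key` map question IDs to choices; unanswered
    questions simply score zero. The shapes are illustrative assumptions.
    """
    if not answer_key:
        return 0.0
    correct = sum(
        1 for question, choice in answers.items()
        if answer_key.get(question) == choice
    )
    return correct / len(answer_key)
```

In the service described above, a function like this would sit behind a FastAPI endpoint and be exercised millions of times a day, so keeping it pure makes it trivial to unit-test and profile.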
About Us:
At LearnTube, we’re on a mission to make learning accessible, affordable, and engaging for millions of learners globally. Using Generative AI, we transform scattered internet content into dynamic, goal-driven courses with:
- AI-powered tutors that teach live, solve doubts in real time, and provide instant feedback.
- Seamless delivery through WhatsApp, mobile apps, and the web, with over 1.4 million learners across 64 countries.
Meet the Founders:
LearnTube was founded by Shronit Ladhani and Gargi Ruparelia, who bring deep expertise in product development and ed-tech innovation. Shronit, a TEDx speaker, is an advocate for disrupting traditional learning, while Gargi’s focus on scalable AI solutions drives our mission to build an AI-first company that empowers learners to achieve career outcomes. We’re proud to be recognised by Google as a Top 20 AI Startup and are part of their 2024 Startups Accelerator: AI First Program, giving us access to cutting-edge technology, credits, and mentorship from industry leaders.
Why Work With Us?
At LearnTube, we believe in creating a work environment that’s as transformative as the products we build. Here’s why this role is an incredible opportunity:
- Cutting-Edge Technology: You’ll work on state-of-the-art generative AI applications, leveraging the latest advancements in LLMs, multimodal AI, and real-time systems.
- Autonomy and Ownership: Experience unparalleled flexibility and independence in a role where you’ll own high-impact projects from ideation to deployment.
- Rapid Growth: Accelerate your career by working on impactful projects that pack three years of learning and growth into one.
- Founder and Advisor Access: Collaborate directly with founders and industry experts, including the CTO of Inflection AI, to build transformative solutions.
- Team Culture: Join a close-knit team of high-performing engineers and innovators, where every voice matters, and Monday morning meetings are something to look forward to.
- Mission-Driven Impact: Be part of a company that’s redefining education for millions of learners and making AI accessible to everyone.
Job Responsibilities:
- Managing and maintaining the efficient functioning of containerized applications and systems within an organization
- Design, implement, and manage scalable Kubernetes clusters in cloud or on-premise environments
- Develop and maintain CI/CD pipelines to automate infrastructure and application deployments, and track all automation processes
- Implement workload automation using configuration management tools, as well as infrastructure as code (IaC) approaches for resource provisioning
- Monitor, troubleshoot, and optimize the performance of Kubernetes clusters and underlying cloud infrastructure
- Ensure high availability, security, and scalability of infrastructure through automation and best practices
- Establish and enforce cloud security standards, policies, and procedures; work with agile methodologies
Primary Requirements:
- Kubernetes: Proven experience in managing Kubernetes clusters (min. 2-3 years)
- Linux/Unix: Proficiency in administering complex Linux infrastructures and services
- Infrastructure as Code: Hands-on experience with CM tools like Ansible, as well as knowledge of resource provisioning with Terraform or other cloud-based utilities
- CI/CD Pipelines: Expertise in building and monitoring complex CI/CD pipelines to manage the build, test, packaging, containerization, and release processes of software
- Scripting & Automation: Strong scripting and process automation skills in Bash, Python
- Monitoring Tools: Experience with monitoring and logging tools (Prometheus, Grafana)
- Version Control: Proficient with Git and familiar with GitOps workflows.
- Security: Strong understanding of security best practices in cloud and containerized environments.
Skills/Traits that would be an advantage:
- Kubernetes administration experience, including installation, configuration, and troubleshooting
- Kubernetes development experience
- Strong analytical and problem-solving skills
- Excellent communication and interpersonal skills
- Ability to work independently and as part of a team

Key Responsibilities
- Design and implement ETL/ELT pipelines using Databricks, PySpark, and AWS Glue
- Develop and maintain scalable data architectures on AWS (S3, EMR, Lambda, Redshift, RDS)
- Perform data wrangling, cleansing, and transformation using Python and SQL
- Collaborate with data scientists to integrate Generative AI models into analytics workflows
- Build dashboards and reports to visualize insights using tools like Power BI or Tableau
- Ensure data quality, governance, and security across all data assets
- Optimize performance of data pipelines and troubleshoot bottlenecks
- Work closely with stakeholders to understand data requirements and deliver actionable insights
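The ETL/ELT step above can be sketched with an in-memory SQLite database standing in for Redshift/RDS; table and column names are invented for illustration.

```python
import sqlite3

# In-memory SQLite stands in for a warehouse like Redshift in this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id TEXT, event TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?, ?)",
    [("u1", "purchase", 10.0), ("u1", "purchase", 5.0), ("u2", "refund", -3.0)],
)

# A typical ELT-style transform: filter and aggregate raw events into a
# reporting table that a dashboard (Power BI/Tableau) could read.
conn.execute("""
    CREATE TABLE user_totals AS
    SELECT user_id, SUM(amount) AS total
    FROM raw_events
    WHERE event = 'purchase'
    GROUP BY user_id
""")
rows = conn.execute(
    "SELECT user_id, total FROM user_totals ORDER BY user_id"
).fetchall()
```

In Databricks the same shape of transform would typically be expressed as a PySpark DataFrame pipeline or Spark SQL over tables in S3.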
🧪 Required Skills
- Cloud Platforms: AWS (S3, Lambda, Glue, EMR, Redshift)
- Big Data: Databricks, Apache Spark, PySpark
- Programming: Python, SQL
- Data Engineering: ETL/ELT, Data Lakes, Data Warehousing
- Analytics: Data Modeling, Visualization, BI Reporting
- Gen AI Integration: OpenAI, Hugging Face, LangChain (preferred)
- DevOps (Bonus): Git, Jenkins, Terraform, Docker
📚 Qualifications
- Bachelor's or Master’s degree in Computer Science, Data Science, or related field
- 3+ years of experience in data engineering or data analytics
- Hands-on experience with Databricks, PySpark, and AWS
- Familiarity with Generative AI tools and frameworks is a strong plus
- Strong problem-solving and communication skills
🌟 Preferred Traits
- Analytical mindset with attention to detail
- Passion for data and emerging technologies
- Ability to work independently and in cross-functional teams
- Eagerness to learn and adapt in a fast-paced environment

Position: Python Developer
Location: Andheri East, Mumbai
Work Mode: 5 Days WFO
Availability: Immediate joiners only (or notice period completed)
What We're Looking For:
✅ 2+ years of solid Python development experience
✅ Django framework expertise - must have!
✅ FastAPI framework knowledge - essential!
✅ Database skills in MongoDB OR PostgreSQL
✅ Ready to work from office 5 days a week
At WeAssemble, we connect global businesses with top-tier talent to build dedicated offshore teams. Our mission is to deliver exceptional services through innovation, collaboration, and transparency. We pride ourselves on a vibrant work culture and are constantly on the lookout for passionate professionals to join our journey.
Job Description:
We are looking for a highly skilled Automation Tester with 3 years of experience to join our dynamic team in Mumbai. The ideal candidate should be proactive, detail-oriented, and ready to hit the ground running. If you’re passionate about quality assurance and test automation, we’d love to meet you!
Key Responsibilities:
Design, develop, and execute automated test scripts using industry-standard tools and frameworks.
Collaborate with developers, business analysts, and other stakeholders to understand requirements and ensure quality.
Maintain and update automation test suites as per application changes.
Identify, record, document, and track bugs.
Ensure the highest quality of deliverables with minimal supervision.
Contribute to the continuous improvement of QA processes and automation strategies.
Skills & Qualifications:
Minimum 3 years of hands-on experience in automation testing.
Proficiency in automation tools such as Selenium, TestNG, JUnit, etc.
Solid knowledge of programming/scripting languages (Java, Python, etc.).
Familiarity with CI/CD tools like Jenkins, Git, etc.
Good understanding of software development lifecycle and agile methodologies.
Excellent analytical and problem-solving skills.
Strong communication and teamwork abilities.
Location: Mumbai (Work from Office)
Notice Period: Candidates who can join immediately or within 15 days will be preferred.

🚀 Hiring: Python Developer
⭐ Experience: 2+ Years
📍 Location: Mumbai (Andheri East)
⭐ Work Mode: 5 Days Work From Office
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
Looking for a skilled Python Developer with experience in Django / FastAPI and MongoDB / PostgreSQL.
⭐ Must-Have Skills:
✅ 2+ years of professional experience as a Python Developer
✅ Proficient in Django or FastAPI
✅ Hands-on with MongoDB or PostgreSQL
✅Strong understanding of REST APIs & Git

We are seeking a highly skilled and motivated Python Developer with hands-on experience in AWS cloud services (Lambda, API Gateway, EC2), microservices architecture, PostgreSQL, and Docker. The ideal candidate will be responsible for designing, developing, deploying, and maintaining scalable backend services and APIs, with a strong emphasis on cloud-native solutions and containerized environments.
Key Responsibilities:
- Develop and maintain scalable backend services using Python (Flask, FastAPI, or Django).
- Design and deploy serverless applications using AWS Lambda and API Gateway.
- Build and manage RESTful APIs and microservices.
- Implement CI/CD pipelines for efficient and secure deployments.
- Work with Docker to containerize applications and manage container lifecycles.
- Develop and manage infrastructure on AWS (including EC2, IAM, S3, and other related services).
- Design efficient database schemas and write optimized SQL queries for PostgreSQL.
- Collaborate with DevOps, front-end developers, and product managers for end-to-end delivery.
- Write unit, integration, and performance tests to ensure code reliability and robustness.
- Monitor, troubleshoot, and optimize application performance in production environments.
Required Skills:
- Strong proficiency in Python and Python-based web frameworks.
- Experience with AWS services: Lambda, API Gateway, EC2, S3, CloudWatch.
- Sound knowledge of microservices architecture and asynchronous programming.
- Proficiency with PostgreSQL, including schema design and query optimization.
- Hands-on experience with Docker and containerized deployments.
- Understanding of CI/CD practices and tools like GitHub Actions, Jenkins, or CodePipeline.
- Familiarity with API documentation tools (Swagger/OpenAPI).
- Version control with Git.
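A minimal sketch of the Lambda-behind-API-Gateway pattern described above, assuming the standard proxy-integration event/response contract; the business logic itself is a placeholder.

```python
import json

def lambda_handler(event, context):
    """Minimal AWS Lambda handler for an API Gateway proxy integration.

    The event/response shape follows the proxy-integration contract
    (statusCode/headers/body); the greeting logic is a placeholder.
    """
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

Because the handler is a plain function, it can be unit-tested locally by passing a dict-shaped event, then packaged and deployed behind API Gateway.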

Responsibilities:
• Help define and create backend architecture and deployment using Python/Django/AWS in an agile environment with lots of ownership and active mentoring
• Work with the Product and Design teams to build new features to solve business problems and fill business needs
• Participate in code reviews to create robust and maintainable code
• Work in an agile environment where quick iterations and good feedback are a way of life
• Interact with other stakeholders for requirements, design discussions, and the adoption of new features
• Communicate and coordinate with our support and professional services teams to solve customer issues
• Help scale our platform as we expand our product across various markets and verticals globally
As a young, fresh startup, we are hoping to be joined by self-starting, hardworking, passionate individuals who are committed to delivering their best, who can grow into future leaders of FactWise.


- 5+ years of experience
- Flask, REST API development experience
- Proficiency in Python programming
- Basic knowledge of front-end development
- Basic knowledge of data manipulation and analysis libraries
- Code versioning and collaboration using Git
- Knowledge of libraries for extracting data from websites (web scraping)
- Knowledge of SQL and NoSQL databases
- Familiarity with RESTful APIs
- Familiarity with cloud (Azure/AWS) technologies

Tableau Server Administrator (10+ Yrs Exp.) 📊🔒
📍Location: Remote
🗓️ Experience: 10+ years
Mandatory Skills & Qualifications:
1. Proven expertise in Tableau architecture, clustering, scalability, and high availability.
2. Proficiency in PowerShell, Python, or Shell scripting.
3. Experience with cloud platforms (AWS, Azure, GCP) and Tableau Cloud.
4. Familiarity with database systems (SQL Server, Oracle, Snowflake).
5. Any relevant certification is a plus.

Job Summary:
We are looking for a skilled and motivated Python AWS Engineer to join our team. The ideal candidate will have strong experience in backend development using Python, cloud infrastructure on AWS, and building serverless or microservices-based architectures. You will work closely with cross-functional teams to design, develop, deploy, and maintain scalable and secure applications in the cloud.
Key Responsibilities:
- Develop and maintain backend applications using Python and frameworks like Django or Flask
- Design and implement serverless solutions using AWS Lambda, API Gateway, and other AWS services
- Develop data processing pipelines using services such as AWS Glue, Step Functions, S3, DynamoDB, and RDS
- Write clean, efficient, and testable code following best practices
- Implement CI/CD pipelines using tools like CodePipeline, GitHub Actions, or Jenkins
- Monitor and optimize system performance and troubleshoot production issues
- Collaborate with DevOps and front-end teams to integrate APIs and cloud-native services
- Maintain and improve application security and compliance with industry standards
Required Skills:
- Strong programming skills in Python
- Solid understanding of AWS cloud services (Lambda, S3, EC2, DynamoDB, RDS, IAM, API Gateway, CloudWatch, etc.)
- Experience with infrastructure as code (e.g., CloudFormation, Terraform, or AWS CDK)
- Good understanding of RESTful API design and microservices architecture
- Hands-on experience with CI/CD, Git, and version control systems
- Familiarity with containerization (Docker, ECS, or EKS) is a plus
- Strong problem-solving and communication skills
Preferred Qualifications:
- Experience with PySpark, Pandas, or data engineering tools
- Working knowledge of Django, Flask, or other Python frameworks
- AWS Certification (e.g., AWS Certified Developer – Associate) is a plus
Educational Qualification:
- Bachelor's or Master’s degree in Computer Science, Engineering, or related field
Location: Malad, Mumbai (work from office)
6 days working; 1st & 3rd Saturdays off
AWS Expertise: Minimum 2 years of experience working with AWS services like RDS, S3, EC2, and Lambda.
Roles and Responsibilities
1. Backend Development: Develop scalable and high-performance APIs and backend systems using Node.js. Write clean, modular, and reusable code following best practices. Debug, test, and optimize backend services for performance and scalability.
2. Database Management: Design and maintain relational databases using MySQL, PostgreSQL, or AWS RDS. Optimize database queries and ensure data integrity. Implement data backup and recovery plans.
3. AWS Cloud Services: Deploy, manage, and monitor applications using AWS infrastructure. Work with AWS services including RDS, S3, EC2, Lambda, API Gateway, and CloudWatch. Implement security best practices for AWS environments (IAM policies, encryption, etc.).
4. Integration and Microservices:Integrate third-party APIs and services. Develop and manage microservices architecture for modular application development.
5. Version Control and Collaboration: Use Git for code versioning and maintain repositories. Collaborate with front-end developers and project managers for end-to-end project delivery.
6. Troubleshooting and Debugging: Analyze and resolve technical issues and bugs. Provide maintenance and support for existing backend systems.
7. DevOps and CI/CD: Set up and maintain CI/CD pipelines. Automate deployment processes and ensure zero-downtime releases.
8. Agile Development:
Participate in Agile/Scrum ceremonies such as daily stand-ups, sprint planning, and retrospectives.
Deliver tasks within defined timelines while maintaining high quality.
Required Skills
Strong proficiency in Node.js and JavaScript/TypeScript.
Expertise in working with relational databases like MySQL/PostgreSQL and AWS RDS.
Proficient with AWS services including Lambda, S3, EC2, and API Gateway.
Experience with RESTful API design and GraphQL (optional).
Knowledge of containerization using Docker is a plus.
Strong problem-solving and debugging skills.
Familiarity with tools like Git, Jenkins, and Jira.


Job Description:
- The candidate must possess a strong technology background with advanced knowledge of Java- and Python-based technology stacks.
- Java, JEE, Spring MVC, Python, JPA, Spring Boot, REST API, Database, Playwright, CI/CD pipelines
- At least 3 years of hands-on Java EE and Core Java experience with strong leadership qualities.
- Experience with web service development, REST, and Service-Oriented Architecture.
- Expertise in object-oriented design, design patterns, architecture, and application integration.
- Working knowledge of databases, including design and SQL proficiency.
- Strong experience with frameworks used for development and automated testing like Spring Boot, JUnit, BDD, etc.
- Experience with Unix/Linux operating systems and basic Linux commands.
- Strong development skills with the ability to understand technical designs and translate them into workable solutions.
- Basic knowledge of Python and hands-on experience with Python scripting.
- Build, deploy, and monitor applications using CI/CD pipelines.
- Experience with agile development methodology.
- Good to have: Elastic index database, MongoDB or another NoSQL database; Docker deployments; cloud deployments; any AI/ML or Snowflake experience.

Design, develop and maintain robust test automation frameworks for financial applications
Create detailed test plans, test cases, and test scripts based on business requirements and user stories
Execute functional, regression, integration, and API testing with a focus on financial data integrity
Validate complex financial calculations, transaction processing, and reporting functionalities
Collaborate with Business Analysts and development teams to understand requirements and ensure complete test coverage
Implement automated testing solutions within CI/CD pipelines for continuous delivery
Perform data validation testing against financial databases and data warehouses
Identify, document, and track defects through resolution using defect management tools
Verify compliance with financial regulations and industry standards
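A hedged sketch of the kind of automated financial-calculation check described above; `compute_interest` is a hypothetical function standing in for the application logic under test.

```python
from decimal import Decimal

def compute_interest(principal: Decimal, annual_rate: Decimal, years: int) -> Decimal:
    """Hypothetical simple-interest calculation standing in for the
    application logic being validated."""
    return principal * annual_rate * years

def test_interest_is_exact():
    # Financial data integrity: Decimal avoids binary floating-point drift,
    # so the expected value can be asserted exactly.
    assert compute_interest(Decimal("1000.00"), Decimal("0.05"), 2) == Decimal("100.00")
```

In practice such checks would live in a pytest suite wired into the CI/CD pipeline, alongside regression and API tests against the real services.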

The Assistant Professor in CSE will teach undergraduate and graduate courses, conduct independent and collaborative research, mentor students, and contribute to departmental and institutional service.


About NxtWave
NxtWave is one of India’s fastest-growing ed-tech startups, reshaping the tech education landscape by bridging the gap between industry needs and student readiness. With prestigious recognitions such as Technology Pioneer 2024 by the World Economic Forum and Forbes India 30 Under 30, NxtWave’s impact continues to grow rapidly across India.
Our flagship on-campus initiative, NxtWave Institute of Advanced Technologies (NIAT), offers a cutting-edge 4-year Computer Science program designed to groom the next generation of tech leaders, located in Hyderabad’s global tech corridor.
Know more:
🌐 NxtWave | NIAT
About the Role
As a PhD-level Software Development Instructor, you will play a critical role in building India’s most advanced undergraduate tech education ecosystem. You’ll be mentoring bright young minds through a curriculum that fuses rigorous academic principles with real-world software engineering practices. This is a high-impact leadership role that combines teaching, mentorship, research alignment, and curriculum innovation.
Key Responsibilities
- Deliver high-quality classroom instruction in programming, software engineering, and emerging technologies.
- Integrate research-backed pedagogy and industry-relevant practices into classroom delivery.
- Mentor students in academic, career, and project development goals.
- Take ownership of curriculum planning, enhancement, and delivery aligned with academic and industry excellence.
- Drive research-led content development, and contribute to innovation in teaching methodologies.
- Support capstone projects, hackathons, and collaborative research opportunities with industry.
- Foster a high-performance learning environment in classes of 70–100 students.
- Collaborate with cross-functional teams for continuous student development and program quality.
- Actively participate in faculty training, peer reviews, and academic audits.
Eligibility & Requirements
- Ph.D. in Computer Science, IT, or a closely related field from a recognized university.
- Strong academic and research orientation, preferably with publications or project contributions.
- Prior experience in teaching/training/mentoring at the undergraduate/postgraduate level is preferred.
- A deep commitment to education, student success, and continuous improvement.
Must-Have Skills
- Expertise in Python, Java, JavaScript, and advanced programming paradigms.
- Strong foundation in Data Structures, Algorithms, OOP, and Software Engineering principles.
- Excellent communication, classroom delivery, and presentation skills.
- Familiarity with academic content tools like Google Slides, Sheets, Docs.
- Passion for educating, mentoring, and shaping future developers.
Good to Have
- Industry experience or consulting background in software development or research-based roles.
- Proficiency in version control systems (e.g., Git) and agile methodologies.
- Understanding of AI/ML, Cloud Computing, DevOps, Web or Mobile Development.
- A drive to innovate in teaching, curriculum design, and student engagement.
Why Join Us?
- Be at the forefront of shaping India’s tech education revolution.
- Work alongside IIT/IISc alumni, ex-Amazon engineers, and passionate educators.
- Competitive compensation with strong growth potential.
- Create impact at scale by mentoring hundreds of future-ready tech leaders.


Job title – Python Developer
Experience – 4 to 6 years
Location – Pune / Mumbai / Bengaluru
Job description below:
Requirements:
- Proven experience as a Python Developer
- Strong knowledge of core Python and PySpark concepts
- Experience with web frameworks such as Django or Flask
- Good exposure to any cloud platform (GCP Preferred)
- CI/CD exposure required
- Solid understanding of RESTful APIs and how to build them
- Experience working with databases like Oracle DB and MySQL
- Ability to write efficient SQL queries and optimize database performance
- Strong problem-solving skills and attention to detail
- Strong SQL programming (stored procedures, functions)
- Excellent communication and interpersonal skills
Roles and Responsibilities
- Design, develop, and maintain data pipelines and ETL processes using PySpark.
- Work closely with data scientists and analysts to provide them with clean, structured data.
- Optimize data storage and retrieval for performance and scalability.
- Collaborate with cross-functional teams to gather data requirements.
- Ensure data quality and integrity through data validation and cleansing processes.
- Monitor and troubleshoot data-related issues to ensure data pipeline reliability.
- Stay up to date with industry best practices and emerging technologies in data engineering.
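As an illustration of the validation and cleansing step these responsibilities describe, here is a minimal pure-Python sketch (a real pipeline would do this in PySpark; the record schema, `REQUIRED_FIELDS`, and `clean_records` are hypothetical names invented for this example):

```python
# Illustrative sketch of a validation/cleansing stage: reject incomplete
# or unparseable records and normalize field types before loading.
REQUIRED_FIELDS = {"id", "symbol", "price"}

def clean_records(records):
    """Drop records missing required fields and normalize value types."""
    cleaned = []
    for rec in records:
        if not REQUIRED_FIELDS.issubset(rec):
            continue  # reject rows missing a required field
        try:
            cleaned.append({
                "id": int(rec["id"]),
                "symbol": str(rec["symbol"]).strip().upper(),
                "price": float(rec["price"]),
            })
        except (TypeError, ValueError):
            continue  # reject rows with unparseable values
    return cleaned

rows = [
    {"id": "1", "symbol": " infy ", "price": "1450.5"},
    {"id": "2", "symbol": "tcs"},                  # missing price: dropped
    {"id": "x", "symbol": "hdfc", "price": "1"},   # bad id: dropped
]
print(clean_records(rows))
```

The same filter-and-normalize pattern maps directly onto a PySpark `DataFrame` via `filter` and `withColumn` when the data no longer fits in memory.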



Position – Python Developer
Location – Navi Mumbai
Who are we
Based out of IIT Bombay, HaystackAnalytics is a HealthTech company creating clinical genomics products, which enable diagnostic labs and hospitals to offer accurate and personalized diagnostics. Supported by India's most respected science agencies (DST, BIRAC, DBT), we created and launched a portfolio of products to offer genomics in infectious diseases. Our genomics-based diagnostic solution for Tuberculosis was recognized as one of the top innovations supported by BIRAC in the past 10 years, and was launched by the Prime Minister of India in the BIRAC Showcase event in Delhi, 2022.
Objectives of this Role:
- Design and implement efficient, scalable backend services using Python.
- Work closely with healthcare domain experts to create innovative and accurate diagnostics solutions.
- Build APIs, services, and scripts to support data processing pipelines and front-end applications.
- Automate recurring tasks and ensure robust integration with cloud services.
- Maintain high standards of software quality and performance using clean coding principles and testing practices.
- Collaborate within the team to upskill and unblock each other for faster and better outcomes.
Primary Skills – Python Development
- Proficient in Python 3 and its ecosystem
- Frameworks: Flask / Django / FastAPI
- RESTful API development
- Understanding of OOP and SOLID design principles
- Asynchronous programming (asyncio, aiohttp)
- Experience with task queues (Celery, RQ)
- Rust programming experience for systems-level or performance-critical components
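The asynchronous-programming item above can be sketched with the standard library alone; `fetch` here is an invented stand-in for a real aiohttp request:

```python
import asyncio

async def fetch(name, delay):
    """Simulates an I/O-bound call (e.g. an HTTP request) with a sleep."""
    await asyncio.sleep(delay)
    return f"{name}:done"

async def main():
    # Both coroutines run concurrently via gather; total wall time is
    # roughly max(delays), not their sum. Results keep argument order.
    return await asyncio.gather(fetch("a", 0.02), fetch("b", 0.01))

print(asyncio.run(main()))  # ['a:done', 'b:done']
```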
Testing & Automation
- Unit Testing: PyTest / unittest
- Automation tools: Ansible / Terraform (good to have)
- CI/CD pipelines
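For the unit-testing expectation above, a minimal `unittest` sketch (PyTest runs the same assertions with less boilerplate; `slugify` is a hypothetical function under test):

```python
import unittest

def slugify(title):
    """Hypothetical helper under test: lowercase, hyphen-separated."""
    return "-".join(title.lower().split())

class SlugifyTest(unittest.TestCase):
    def test_basic(self):
        self.assertEqual(slugify("Python Developer"), "python-developer")

    def test_extra_spaces(self):
        self.assertEqual(slugify("  Data  Engineer "), "data-engineer")

# Run the suite programmatically; in CI this would be `pytest` or
# `python -m unittest` invoked from the pipeline.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(SlugifyTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```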
DevOps & Cloud
- Docker, Kubernetes (basic knowledge expected)
- Cloud platforms: AWS / Azure / GCP
- Git and GitOps workflows
- Familiarity with containerized deployment & serverless architecture
Bonus Skills
- Data handling libraries: Pandas / NumPy
- Experience with scripting: Bash / PowerShell
- Functional programming concepts
- Familiarity with front-end integration (REST API usage, JSON handling)
Other Skills
- Innovation and thought leadership
- Interest in learning new tools, languages, workflows
- Strong communication and collaboration skills
- Basic understanding of UI/UX principles
To know more about us – https://haystackanalytics.in

Experience in Python (backend only), data structures, OOP, algorithms, Django, NumPy, etc.
• Good understanding of writing unit tests using PyTest.
• Good understanding of parsing XML and handling files using Python.
• Good understanding of databases/SQL, stored procedures, and query tuning.
• Service design concepts; OO and functional development concepts.
• Agile development methodologies.
• Strong oral and written communication skills.
• Excellent interpersonal skills and a professional approach.
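The XML-parsing skill above can be done with the standard library's `xml.etree.ElementTree`; the document structure below is invented for illustration:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML document for the example.
XML_DOC = """
<orders>
    <order id="1"><qty>3</qty></order>
    <order id="2"><qty>5</qty></order>
</orders>
"""

def total_quantity(xml_text):
    """Parse the XML and sum the <qty> of every <order> element."""
    root = ET.fromstring(xml_text)
    return sum(int(order.findtext("qty")) for order in root.iter("order"))

print(total_quantity(XML_DOC))  # 8
```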