
Key Responsibilities
- Manage on-time, on-budget delivery at planned profitability, continually improving quality and margins
- Lead the onsite project teams and ensure they understand the client environment
- Grow the backlog: existing projects, renewals/extensions of current projects, and rate revisions
- Drive responses to RFPs and proactive bids
- Build senior, strategic relationships through delivery excellence
- Understand the client environment, issues, and priorities
- Serve as the day-to-day point of contact for clients


We are looking for an experienced Python Developer with 5–6 years of hands-on experience in designing, developing, and maintaining scalable backend applications and APIs. The ideal candidate has strong expertise in Python, backend frameworks, databases, and cloud/deployment practices, and can work in a fast-paced environment, collaborating with cross-functional teams to deliver high-quality software solutions.
Key Responsibilities
- Design, develop, test, and maintain robust and scalable Python-based applications.
- Build and integrate RESTful APIs and backend services.
- Work on server-side logic, database integration, and performance optimization.
- Collaborate with frontend developers, QA teams, DevOps, and product teams for end-to-end delivery.
- Write reusable, testable, and efficient code following best practices.
- Debug, troubleshoot, and resolve production issues.
- Participate in code reviews, technical design discussions, and architecture planning.
- Optimize applications for maximum speed, scalability, and reliability.
- Implement security and data protection measures.
- Work with CI/CD pipelines and deployment processes.
Required Skills
- Strong experience in Python development with 5–6 years of relevant experience.
- Hands-on experience with Python frameworks such as:
- Django
- Flask
- FastAPI
- Strong understanding of OOP, data structures, and algorithms.
- Experience in building and consuming REST APIs.
- Good knowledge of SQL and relational databases like:
- MySQL
- PostgreSQL
- Experience with NoSQL databases like:
- MongoDB
- Redis (preferred)
- Knowledge of ORM frameworks such as SQLAlchemy or Django ORM.
- Familiarity with Git/GitHub/GitLab version control.
- Understanding of unit testing, debugging, and code quality practices.
- Experience in working with Linux/Unix environments.
- Knowledge of Docker, containerization, and deployment concepts.
- Exposure to cloud platforms like AWS / Azure / GCP is preferred.
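The core of the stack above (Python, SQL, ORM-style data access, testable code) can be sketched with nothing but the standard library. `User` and `UserRepository` are illustrative names, not part of this posting; a real project would typically use SQLAlchemy or the Django ORM rather than raw `sqlite3`:

```python
import sqlite3
from dataclasses import dataclass
from typing import Optional

@dataclass
class User:
    id: int
    name: str

class UserRepository:
    """Tiny repository over an in-memory SQLite database."""

    def __init__(self) -> None:
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute(
            "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT NOT NULL)"
        )

    def add(self, name: str) -> User:
        # Parameterized query: never interpolate user input into SQL.
        cur = self.conn.execute("INSERT INTO users (name) VALUES (?)", (name,))
        self.conn.commit()
        return User(id=cur.lastrowid, name=name)

    def get(self, user_id: int) -> Optional[User]:
        row = self.conn.execute(
            "SELECT id, name FROM users WHERE id = ?", (user_id,)
        ).fetchone()
        return User(*row) if row else None

repo = UserRepository()
alice = repo.add("Alice")
print(repo.get(alice.id))
```

Isolating data access behind a small repository class like this is what makes the "reusable, testable code" bullet practical: the class can be unit-tested against an in-memory database with no external dependencies.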
Preferred / Good to Have Skills
- Experience in microservices architecture.
- Knowledge of Celery, asynchronous processing, and message queues like:
- RabbitMQ
- Kafka
- Familiarity with CI/CD pipelines.
- Experience in writing clean architecture and scalable backend systems.
- Exposure to DevOps practices is a plus.
- Experience in Agile/Scrum methodology.
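Celery-style asynchronous processing, mentioned in the preferred skills, reduces to a producer pushing jobs onto a queue and a pool of workers draining it. A minimal stdlib sketch using `asyncio` (the job payloads and "processing" are made up for illustration; Celery itself adds a broker such as RabbitMQ, retries, and result persistence):

```python
import asyncio

async def worker(name: str, queue: asyncio.Queue, results: list) -> None:
    # Pull jobs until a sentinel arrives; a stand-in for a Celery worker.
    while True:
        job = await queue.get()
        if job is None:                    # sentinel: shut this worker down
            queue.task_done()
            return
        results.append((name, job * 2))    # "process" the job
        queue.task_done()

async def main() -> list:
    queue: asyncio.Queue = asyncio.Queue()
    results: list = []
    workers = [
        asyncio.create_task(worker(f"w{i}", queue, results)) for i in range(2)
    ]
    for job in range(5):
        await queue.put(job)
    for _ in workers:
        await queue.put(None)              # one sentinel per worker
    await queue.join()                     # wait until every item is processed
    await asyncio.gather(*workers)
    return results

processed = asyncio.run(main())
print(processed)
```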
Job Title : Java Backend Developer
Experience : 3 – 8 Years
Location : Pune (Onsite) (Pune candidates only)
Notice Period : Immediate to 15 days (or candidates serving notice with a near-term last working day)
About the Role :
We are seeking an experienced Java Backend Developer with strong hands-on skills in backend microservices development, API design, cloud platforms, observability, and CI/CD.
The ideal candidate will contribute to building scalable, secure, and reliable applications while working closely with cross-functional teams.
Mandatory Skills : Java 8 / Java 17, Spring Boot 3.x, REST APIs, Hibernate / JPA, MySQL, MongoDB, Prometheus / Grafana / Spring Actuators, AWS, Docker, Jenkins / GitHub Actions, GitHub, Windows 7 / Linux.
Key Responsibilities :
- Design, develop, and maintain backend microservices and REST APIs
- Implement data persistence using relational and NoSQL databases
- Ensure performance, scalability, and security of backend systems
- Integrate observability and monitoring tools for production environments
- Work within CI/CD pipelines and containerized deployments
- Collaborate with DevOps, QA, and product teams for feature delivery
- Troubleshoot, optimize, and improve existing modules and services
Mandatory Skills :
- Languages & Frameworks : Java 8, Java 17, Spring Boot 3.x, REST APIs, Hibernate, JPA
- Databases : MySQL, MongoDB
- Observability : Prometheus, Grafana, Spring Actuators
- Cloud Technologies : AWS
- Containerization Tools : Docker
- CI/CD Tools : Jenkins, GitHub Actions
- Version Control : GitHub
- Operating Systems : Windows 7, Linux
Nice to Have :
- Strong analytical and debugging abilities
- Experience working in Agile/Scrum environments
- Good communication and collaborative skills
We are seeking an experienced AI Architect to design, build, and scale production-ready AI voice conversation agents deployed locally (on-prem / edge / private cloud) and optimized for GPU-accelerated, high-throughput environments.
You will own the end-to-end architecture of real-time voice systems, including speech recognition, LLM orchestration, dialog management, speech synthesis, and low-latency streaming pipelines—designed for reliability, scalability, and cost efficiency.
This role is highly hands-on and strategic, bridging research, engineering, and production infrastructure.
Key Responsibilities
Architecture & System Design
- Design low-latency, real-time voice agent architectures for local/on-prem deployment
- Define scalable architectures for ASR → LLM → TTS pipelines
- Optimize systems for GPU utilization, concurrency, and throughput
- Architect fault-tolerant, production-grade voice systems (HA, monitoring, recovery)
Voice & Conversational AI
- Design and integrate:
- Automatic Speech Recognition (ASR)
- Natural Language Understanding / LLMs
- Dialogue management & conversation state
- Text-to-Speech (TTS)
- Build streaming voice pipelines with sub-second response times
- Enable multi-turn, interruptible, natural conversations
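A streaming ASR → LLM → TTS pipeline like the one described above can be sketched with chained generators: each stage yields partial results as soon as they arrive, so downstream stages start before upstream ones finish. The three stage functions below are stubs standing in for real models, not actual ASR/LLM/TTS implementations:

```python
from typing import Iterator

def asr_stream(audio_chunks: list) -> Iterator[str]:
    # Stub ASR: each audio chunk becomes a partial transcript.
    for chunk in audio_chunks:
        yield f"transcript({chunk})"

def llm_stream(transcripts: Iterator[str]) -> Iterator[str]:
    # Stub LLM: respond to each partial transcript as it arrives,
    # instead of waiting for the full utterance.
    for text in transcripts:
        yield f"reply-to[{text}]"

def tts_stream(replies: Iterator[str]) -> Iterator[bytes]:
    # Stub TTS: synthesize each reply chunk immediately.
    for reply in replies:
        yield reply.encode()

# Chaining the stages keeps end-to-end latency at roughly one chunk,
# not one utterance: downstream stages start before upstream finishes.
audio = ["a0", "a1"]
out = list(tts_stream(llm_stream(asr_stream(audio))))
print(out)
```

In production the same shape is usually realized with async streams or gRPC/WebRTC streaming rather than plain generators, but the latency argument is identical.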
Model & Inference Engineering
- Deploy and optimize local LLMs and speech models (quantization, batching, caching)
- Select and fine-tune open-source models for voice use cases
- Implement efficient inference using TensorRT, ONNX, CUDA, vLLM, Triton, or similar
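Batching, one of the optimization levers listed above, groups pending requests so the model runs one forward pass per batch instead of per request. A toy sketch of static micro-batching (`batched_infer` is a dummy stand-in for a real GPU forward pass; serving systems like Triton or vLLM do this dynamically, forming batches from whatever requests are in flight):

```python
def micro_batch(requests: list, max_batch: int = 4) -> list:
    """Group pending requests into fixed-size batches and run them together."""

    def batched_infer(batch: list) -> list:
        # Dummy "model": one call handles the whole batch. On a real GPU,
        # one pass over 4 requests costs far less than 4 separate passes.
        return [x + 1 for x in batch]

    results: list = []
    for start in range(0, len(requests), max_batch):
        batch = requests[start:start + max_batch]
        results.extend(batched_infer(batch))
    return results

print(micro_batch(list(range(10))))   # processed as batches of 4, 4, 2
```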
Infrastructure & Production
- Design GPU-based inference clusters (bare metal or Kubernetes)
- Implement autoscaling, load balancing, and GPU scheduling
- Establish monitoring, logging, and performance metrics for voice agents
- Ensure security, privacy, and data isolation for local deployments
Leadership & Collaboration
- Set architectural standards and best practices
- Mentor ML and platform engineers
- Collaborate with product, infra, and applied research teams
- Drive decisions from prototype → production → scale
Required Qualifications
Technical Skills
- 7+ years in software / ML systems engineering
- 3+ years designing production AI systems
- Strong experience with real-time voice or conversational AI systems
- Deep understanding of LLMs, ASR, and TTS pipelines
- Hands-on experience with GPU inference optimization
- Strong Python and/or C++ background
- Experience with Linux, Docker, Kubernetes
AI & ML Expertise
- Experience deploying open-source LLMs locally
- Knowledge of model optimization:
- Quantization
- Batching
- Streaming inference
- Familiarity with voice models (e.g., Whisper-like ASR, neural TTS)
Systems & Scaling
- Experience with high-QPS, low-latency systems
- Knowledge of distributed systems and microservices
- Understanding of edge or on-prem AI deployments
Preferred Qualifications
- Experience building AI voice agents or call automation systems
- Background in speech processing or audio ML
- Experience with telephony, WebRTC, SIP, or streaming audio
- Familiarity with Triton Inference Server / vLLM
- Prior experience as Tech Lead or Principal Engineer
What We Offer
- Opportunity to architect state-of-the-art AI voice systems
- Work on real-world, high-scale production deployments
- Competitive compensation and equity (if applicable)
- High ownership and technical influence
- Collaboration with top-tier AI and infrastructure talent
Role: Lead Data Engineer Core

Responsibilities:
- Lead end-to-end design, development, and delivery of complex cloud-based data pipelines.
- Collaborate with architects and stakeholders to translate business requirements into technical data solutions.
- Ensure scalability, reliability, and performance of data systems across environments.
- Provide mentorship and technical leadership to data engineering teams.
- Define and enforce best practices for data modeling, transformation, and governance.
- Optimize data ingestion and transformation frameworks for efficiency and cost management.
- Contribute to data architecture design and review sessions across projects.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 8+ years of experience in data engineering with proven leadership in designing cloud-native data systems.
- Strong expertise in Python, SQL, Apache Spark, and at least one cloud platform (Azure, AWS, or GCP).
- Experience with Big Data, Data Lake, Delta Lake, and Lakehouse architectures.
- Proficiency in one or more database technologies (e.g., PostgreSQL, Redshift, Snowflake) and NoSQL databases.
- Ability to recommend and implement scalable data pipelines.

Preferred Qualifications:
- Cloud certification (AWS, Azure, or GCP).
- Experience with Databricks, Snowflake, or Terraform.
- Familiarity with data governance, lineage, and observability tools.
- Strong collaboration skills and ability to influence data-driven decisions across teams.
Must have:
- 8+ years of experience with a significant focus on developing, deploying & supporting AI solutions in production environments.
- Proven experience in building enterprise software products for B2B businesses, particularly in the supply chain domain.
- Good understanding of Generics, OOPs concepts & Design Patterns
- Solid engineering and coding skills. Ability to write high-performance production quality code in Python
- Proficiency with ML libraries and frameworks (e.g., Pandas, TensorFlow, PyTorch, scikit-learn).
- Strong expertise in time series forecasting using statistical, ML, DL, and foundation models
- Experience processing time series data with techniques such as decomposition, clustering, and outlier detection & treatment
- Exposure to generative AI models and agent architectures on platforms such as AWS Bedrock, Crew AI, Mosaic/Databricks, Azure
- Experience working with modern data architectures, including data lakes and data warehouses, leveraging one or more frameworks such as Airbyte, Airflow, Dagster, AWS Glue, Snowflake, DBT
- Hands-on experience with cloud platforms (e.g., AWS, Azure, GCP) and deploying ML models in cloud environments.
- Excellent problem-solving skills and the ability to work independently as well as in a collaborative team environment.
- Effective communication skills, with the ability to convey complex technical concepts to non-technical stakeholders
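Outlier detection on time series, one of the techniques named above, is often done with a rolling z-score: flag points that deviate too far from the mean of a recent window. A stdlib-only sketch (window size and threshold are illustrative defaults, and the sample series is invented):

```python
import statistics

def rolling_zscore_outliers(series: list, window: int = 5,
                            threshold: float = 3.0) -> list:
    """Return indices whose value lies more than `threshold` standard
    deviations from the mean of the preceding `window` points."""
    outliers = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu = statistics.mean(hist)
        sd = statistics.stdev(hist)
        # Guard against a flat window (sd == 0) before dividing.
        if sd > 0 and abs(series[i] - mu) / sd > threshold:
            outliers.append(i)
    return outliers

series = [10.0, 10.2, 9.9, 10.1, 10.0, 10.1, 55.0, 10.0]
print(rolling_zscore_outliers(series))
```

A detected outlier would then be "treated" (e.g., replaced by the window mean or an interpolated value) before feeding the series to a forecasting model.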
Good To Have:
- Experience with MLOps tools and practices for continuous integration and deployment of ML models.
- Familiarity with deploying applications on Kubernetes
- Knowledge of supply chain management principles and challenges.
- A Master's or Ph.D. in Computer Science, Machine Learning, Data Science, or a related field is preferred
Job Description:
- Experienced in developing .NET Core, ASP.NET MVC, web applications, and web/REST APIs with C#.
- Proficient in OOP and design.
- Skilled in the use of Structured Query Language (SQL) for retrieving, updating, inserting, deleting, manipulating, and managing data stored in enterprise databases.
- Strong knowledge of database concepts and relational database management systems such as Microsoft SQL Server.
- Work closely with other team members to meet project goals.
- Work with Quality Analysts to ensure changes are thoroughly tested before release.
- Communicate project status and timelines to business stakeholders and IT management.
Our client is a disruptive edtech company that provides management automation products to educational institutes. It brings a unified platform for all the stakeholders of the institution - for the parents, teachers, students and the management.
It was founded by a techno-entrepreneur with strong business and software development skills, delivering innovative solutions to their clients. They are transforming the way students, parents, teachers and school administrators communicate, collaborate and come together in building the future of students and creating successful Institutions.
As a Sales Manager, you will be responsible for actively seeking out new sales opportunities through cold calling, networking and social media.
What you will do:
- Conducting market research to identify selling possibilities and evaluating customer needs
- Setting up meetings with potential clients and listening to their wishes and concerns
- Preparing and delivering appropriate presentations on products and services
- Creating frequent reviews and reports with sales and financial data
- Participating on behalf of the company in exhibitions or conferences
- Negotiating/ closing deals and handling complaints or objections
- Collaborating with team members to achieve better results
- Gathering feedback from customers or prospects and sharing it with internal teams
Desired Candidate Profile
What you need to have:
- Experience in Edtech sales
- Proven experience as a Sales Manager or relevant role
- Proficiency in English, with fluent verbal and written communication
- Excellent knowledge of MS Office
- Hands-on experience with ERP solutions is a plus
- Thorough understanding of marketing and negotiating techniques
- Passion for sales
- A results-driven approach
- Aptitude in delivering attractive presentations
Experience: 3 – 6 years
Location: Bengaluru.
Requirements:
• 3–6 years of relevant experience in building web apps at scale
• You must have a strong understanding of semantic HTML/HTML5 and CSS/CSS3.
• You must have a good understanding of MVC architecture.
• Prior work experience in ReactJS is a must.
• You must have experience in setting up the full UI workflow, from development and testing through building and deployment.
- Leads a project end-to-end and collaborates across functions. Drives planning, estimation and execution.
- Understands requirements well and comes up with efficient design
- Develops complex, well-backed, and bug-free products. Estimates accurately.
- Takes well-reasoned tech decisions keeping in mind goals and trade-offs
- Becomes a go-to person in more than one area. Provide technical mentoring to team
- Communicates clearly, gets clarifications and establishes expectations for all parties
- Helps establish SDLC best practices and high standards of code quality
- Demonstrates excellent problem solving & debugging skills
- Proactively identifies and resolves issues in requirements, design and code
- Performs peer code reviews and helps the team improve.








