
Key Factors:
- Proven experience in sales and business development, preferably within the education sector, specifically in selling admission management solutions to CBSE, ICSE, IGCSE, IB, and boarding schools.
- Strong understanding of the admission processes and challenges faced by educational institutions.
- Excellent communication and presentation skills, with the ability to effectively engage with school stakeholders at all levels.
- A proactive and results-driven mindset with a track record of meeting or exceeding sales targets.
- Ability to build and nurture long-term relationships with customers.
- Strong negotiation and closing skills, with attention to detail in contract and agreement management.
- Exceptional organizational and time management skills, with the ability to prioritize tasks effectively.
- Willingness to travel within the assigned territory as required.
- A Bachelor's or Master's degree in business, marketing, education, or a related field is preferred.

Job Title: Lead Database Engineer
Location: Gurgaon Sector-43
Experience Required: 4+ Years
Employment Type: Full-Time
Summary:
We are seeking a highly skilled Lead Database Engineer with expertise in managing and optimizing database systems, primarily focusing on Amazon Aurora PostgreSQL, MySQL, and NoSQL databases. The ideal candidate will have in-depth knowledge of AWS services, database architecture, performance tuning, and security practices.
Key Responsibilities:
1. Database Administration:
- Manage and administer Amazon Aurora PostgreSQL, MySQL, and NoSQL database systems to ensure high availability, performance, and security.
- Implement robust backup and recovery procedures to maintain data integrity.
2. Optimization and Performance:
- Develop and execute optimization strategies at the database, query, collection, and table levels.
- Proactively monitor performance and fine-tune RDS parameter groups for optimal database operations.
- Conduct root cause analysis and resolve complex database performance issues.
3. AWS Services and Architecture:
- Leverage AWS services such as RDS, Aurora, and DMS to ensure seamless database operations.
- Perform database version upgrades for PostgreSQL and MySQL, integrating new features and performance enhancements.
4. Replication and Scalability:
- Implement and manage various replication strategies, including master-master and master-slave replication, ensuring data consistency and scalability.
5. Security and Access Control:
- Manage user permissions and roles, maintaining strict security protocols and access controls.
6. Collaboration:
- Work closely with development teams to optimize database design and queries, aligning database performance with application requirements.
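The query-level optimization work described above can be illustrated with a minimal sketch. It uses SQLite's EXPLAIN QUERY PLAN purely as a stand-in, since Aurora PostgreSQL and MySQL expose the same idea through their own EXPLAIN / EXPLAIN ANALYZE statements; the table and column names here are invented for the example:

```python
import sqlite3

# Toy illustration of index-driven query tuning; real work would run
# EXPLAIN / EXPLAIN ANALYZE against Aurora PostgreSQL or MySQL.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

def plan(sql):
    # Return the human-readable detail column of the query plan.
    return [row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

before = plan("SELECT * FROM orders WHERE customer_id = 42")
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan("SELECT * FROM orders WHERE customer_id = 42")

print(before)  # full table scan before the index exists
print(after)   # index lookup once idx_orders_customer is in place
```

The same before/after comparison is the day-to-day loop of query tuning: read the plan, add or adjust an index, and confirm the planner actually uses it.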
Required Skills:
- Strong Expertise: Amazon Aurora PostgreSQL, MySQL, and NoSQL databases.
- AWS Services: Experience with RDS, Aurora, and DMS.
- Optimization: Hands-on experience in query optimization, database tuning, and performance monitoring.
- Replication Strategies: Knowledge of master-master and master-slave replication setups.
- Problem Solving: Proven ability to troubleshoot and resolve complex database issues, including root cause analysis.
- Security: Strong understanding of data security and access control practices.
- Collaboration: Ability to work with cross-functional teams and provide database-related guidance.
Preferred Qualifications:
- Certification in AWS or database management tools.
- Experience with other NoSQL databases like MongoDB or Cassandra.
- Familiarity with Agile and DevOps methodologies.
Job Title: Backend Engineer – Python (AI Backend)
Location: Bangalore, India
Experience: 1–2 Years
Job Description
We are looking for a Backend Engineer with strong Python skills and hands-on exposure to AI-based applications. The candidate will be responsible for developing scalable backend services and supporting AI-powered systems such as LLM integrations, AI agents, and RAG pipelines.
Key Responsibilities
- Develop and maintain backend services using Python (FastAPI preferred)
- Build and manage RESTful APIs for frontend and AI integrations
- Support development of AI-driven features (LLMs, RAG systems, AI agents)
- Design and maintain both monolithic and microservices architectures
- Optimize database performance and backend scalability
- Work with DevOps for Docker-based deployments
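The RAG support mentioned above comes down to a retrieval step: rank stored passages by similarity to the query embedding and pass the best ones into the LLM prompt. A minimal sketch of that step, using hand-made toy vectors in place of real model embeddings (all document text and vectors are illustrative):

```python
import math

# Minimal sketch of the retrieval step in a RAG pipeline. Toy
# three-dimensional vectors stand in for real embeddings from a model.
DOCS = {
    "refunds are processed within 5 days": [0.9, 0.1, 0.0],
    "the api rate limit is 100 requests/min": [0.1, 0.9, 0.1],
    "contact support via the helpdesk portal": [0.0, 0.2, 0.9],
}

def cosine(a, b):
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec, k=1):
    # Rank documents by similarity to the query embedding and return
    # the top-k passages to feed into the LLM prompt as context.
    ranked = sorted(DOCS, key=lambda d: cosine(query_vec, DOCS[d]), reverse=True)
    return ranked[:k]

context = retrieve([0.95, 0.05, 0.0])  # query vector resembling the refunds doc
print(context)
```

In production the dict would be a vector store and the vectors would come from an embedding model, but the ranking logic is the same shape.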
Required Skills
- Strong experience in Python backend development
- Hands-on experience with FastAPI / Django / Flask
- Knowledge of REST APIs and microservices
- Experience with AI applications (LLM usage, prompt engineering basics)
- Database knowledge: MongoDB, PostgreSQL or MySQL
- Experience with Docker and basic cloud platforms (AWS/GCP/Azure)
- Hands-on experience with Redis for caching and in-memory storage
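The Redis caching requirement typically means the cache-aside pattern. A sketch of that pattern using a plain dict with expiry timestamps as a stand-in for Redis; in redis-py the equivalent calls would be `r.get(key)` and `r.set(key, value, ex=ttl)`, and all names here are illustrative:

```python
import time

# Cache-aside sketch. A dict of (value, expires_at) pairs stands in
# for Redis so the example stays self-contained.
_cache = {}

def cache_get(key):
    entry = _cache.get(key)
    if entry is None:
        return None
    value, expires_at = entry
    if time.monotonic() > expires_at:  # expired entry counts as a miss
        del _cache[key]
        return None
    return value

def cache_set(key, value, ttl=60):
    _cache[key] = (value, time.monotonic() + ttl)

def get_user(user_id, db_lookup):
    # Try the cache first; on a miss, hit the database and populate it.
    key = f"user:{user_id}"
    cached = cache_get(key)
    if cached is not None:
        return cached
    value = db_lookup(user_id)
    cache_set(key, value, ttl=30)
    return value

calls = []
def fake_db(user_id):
    calls.append(user_id)  # track how often the "database" is hit
    return {"id": user_id, "name": "demo"}

first = get_user(7, fake_db)
second = get_user(7, fake_db)  # served from cache, no second DB hit
print(len(calls))
```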
Good to Have
- Experience integrating payment gateways (Razorpay, Stripe, PayU, etc.)
- Exposure to event-driven architectures using RabbitMQ, Kafka, or Redis Streams
- Kubernetes
- Understanding of model fine-tuning concepts
Job Description: Data Engineer
Position Overview:
We are seeking a skilled Python Data Engineer with expertise in designing and implementing data solutions using the AWS cloud platform. The ideal candidate will be responsible for building and maintaining scalable, efficient, and secure data pipelines while leveraging Python and AWS services to enable robust data analytics and decision-making processes.
Key Responsibilities
- Design, develop, and optimize data pipelines using Python and AWS services such as Glue, Lambda, S3, EMR, Redshift, Athena, and Kinesis.
- Implement ETL/ELT processes to extract, transform, and load data from various sources into centralized repositories (e.g., data lakes or data warehouses).
- Collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions.
- Monitor, troubleshoot, and enhance data workflows for performance and cost optimization.
- Ensure data quality and consistency by implementing validation and governance practices.
- Apply data security best practices in compliance with organizational policies and regulations.
- Automate repetitive data engineering tasks using Python scripts and frameworks.
- Leverage CI/CD pipelines for deployment of data workflows on AWS.
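The ETL/ELT and validation responsibilities above can be sketched end-to-end with the standard library. In the AWS setup described, the same extract/transform/load shape would run inside a Glue job or Lambda reading from and writing to S3; the field names and the validation rule here are illustrative:

```python
import csv
import io
import json

# Minimal ETL sketch: extract rows from CSV, validate and transform
# them, then load the result as JSON lines.
RAW_CSV = """order_id,amount,currency
1001,49.90,USD
1002,,USD
1003,120.00,EUR
"""

def extract(text):
    # Parse CSV text into a list of dict rows.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Drop rows failing validation and normalize amount to a float.
    out = []
    for row in rows:
        if not row["amount"]:  # simple data-quality check: missing amount
            continue
        out.append({
            "order_id": int(row["order_id"]),
            "amount": float(row["amount"]),
            "currency": row["currency"],
        })
    return out

def load(records):
    # Serialize to JSON lines, the common landing format for data lakes.
    return "\n".join(json.dumps(r) for r in records)

records = transform(extract(RAW_CSV))
print(load(records))
```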
We are hiring Research Associates for an AI project based in Gurgaon. The Research Associate role focuses on prompt writing, annotation, and labeling, with the goal of improving an AI engine.
In addition to the ability to write clearly and concisely, successful Research Associates must be able to tailor their writing style to each assignment's requirements.
The ideal candidates will have a strong focus on efficiency and problem-solving, along with excellent writing and reading comprehension skills, including experience creating and composing text within a specified amount of time.
Key Responsibilities:
- Work on various client projects to train generative AI models, by creating prompts and responses based on the instructions provided and on using established best practices for quality prompts.
- Given examples, generate similar prompts and responses.
- Use a variety of communication channels such as Slack, Teams, and SharePoint, to learn about new projects, collaborate with your team, and ask questions.
- Learn new software programs on the job.
- Provide supporting documentation when the AI fails.
Requirements
- Experience – 0-2 years
- Excellent Communication skills (Oral and Written)
- Qualification: Bachelor's degree (any stream)
- Preferred: Good understanding of AI.
- Ability to gain new skills and knowledge through hands-on experience
- Keen eye for detail.
- Demonstrated ability to work independently
- Strong time management skills
- Exemplify a "get it done" attitude, including a high level of accountability, transparency, and teamwork first and foremost.
Job Overview:
You will work with engineering and development teams to integrate and develop cloud solutions and virtualized deployments of a software-as-a-service product. This requires understanding the software system architecture as well as its performance and security requirements. The DevOps Engineer is also expected to have expertise in available cloud solutions and services, administration of virtual machine clusters, performance tuning and configuration of cloud computing resources, configuration of security, and scripting and automation of monitoring functions. The position requires deploying and managing multiple virtual clusters and working with compliance organizations to support security audits. Designing and selecting cloud computing solutions that are reliable, robust, extensible, and easy to migrate is also important.
Experience:
- Experience working on billing and budgets for a GCP project - MUST
- Experience working on optimizations on GCP based on vendor recommendations - NICE TO HAVE
- Experience in implementing the recommendations on GCP
- Architect Certifications on GCP - MUST
- Excellent communication skills (both verbal & written) - MUST
- Excellent documentation skills on processes, steps, and instructions - MUST
- At least 2 years of experience on GCP.
Basic Qualifications:
- Bachelor's/Master's degree in Engineering or equivalent.
- Extensive scripting or programming experience (Shell Script, Python).
- Extensive experience working with CI/CD (e.g. Jenkins).
- Extensive experience working with GCP, Azure, or Cloud Foundry.
- Experience working with databases (PostgreSQL, Elasticsearch).
- Minimum of 2 years of experience with GCP, along with GCP certification.
Benefits :
- Competitive salary.
- Work from anywhere.
- Learning and gaining experience rapidly.
- Reimbursement for basic working set up at home.
- Insurance (including top-up insurance for COVID).
Location :
Remote - work from anywhere.
Ideal joining preferences:
Immediate or 15 days
Neo Jarvis: Vice President - Software Engineering
About the Role :
The VP of Software Engineering will be responsible for providing technical leadership to our product engineering team. We are looking for a Full-stack Developer (architecture scope) with the craftsmanship to understand, think, and learn from the past before building. You should also be ready to work on mistakes immediately, learning by trial and error and iterating toward perfection, rather than expecting perfection on the first attempt.
Expectations :
- Ability to learn new technology and be hands-on.
- Apply the fundamentals of data structures & algorithms to application development.
- Have explored serverless and the basics of Docker and Kubernetes.
- Should have delivered cloud-based, production-ready microservices designs using AWS or Google Cloud services, handling a scale of 1000+ concurrent users.
- Take ownership, and consider security and data privacy while building.
- Primary ownership lies in POCs for solutions requiring performance, scale, and cost optimization.
- Secondary ownership lies in POCs on AI initiatives.
- Brainstorm and prepare logical and physical solution architecture and code flow diagrams for internal and external consumption.
- Apply a design thought process and turn POC outcomes into production-ready code.
- Self-initiate improvements within the product to scale the solution architecture.
- Prioritize, optimize, and automate any manual activities.
- Own and facilitate code review.
- Own and facilitate unit and integration test case coverage across the product code lines.
- Understand SCRUM/Agile methodologies of working together.
Requirements :
- Total experience of 10 to 12 years.
- AngularJS or ReactJS, TypeScript or JavaScript, Node.js, MySQL, NoSQL (additional preference).
- Should have enabled cloud-based production-ready designs using AWS or Google Cloud services.
- Someone who is disciplined to document the artifacts and enables a design-based problem-solving approach.
- An ever-learning mindset and never-give-up attitude is more valuable to us than work experience.
Perks:
- Flexible work timings and no leave policy
- We don't chase timelines; we work toward a better product at flexible working hours.
- No hierarchy - Everyone has open access to anyone in the team.
- Medical insurance - We've got you covered: we provide insurance for you and your family members.
- 5 days a week - Take time off to revitalise.
Ruby On Rails Developer
Job Description:
- At least 2 years of relevant experience with Rails web applications
- Good experience with front-end technologies such as HTML, CSS, and JavaScript
- Strong experience with the Ruby on Rails framework, ActiveRecord, gems, CRON jobs, and schedulers
- Develop enhancements to Ruby-based web applications
- Experience working with GitHub integrations and Docker setups
- Experience maintaining and troubleshooting existing applications
- Develop and manage APIs and third-party API integrations
- Good knowledge of SQL Server
- Client acquisition & servicing.
- Understanding of currency / equity / banking markets
- Sales & marketing of financial products / services
- Experience with selling of financial products / services will be preferred.
Job Role:
- Creating a database of potential clients
- Fixing meetings with prospective clients through cold calling or references
- Holding meetings with prospective clients and informing them about products / services offered.
- Responsible for overall Marketing and achieving annual budgets.
- Analyzing financials of prospective clients to gauge best alternatives and cost reduction
