
Presenting lessons comprehensively, using visual and audio aids to facilitate learning.
Providing individualized instruction to each student by promoting interactive learning.
Creating and distributing educational content (notes, summaries, assignments, etc.).
Assessing and recording students' progress and providing grades and feedback.
Maintaining a tidy and orderly classroom.
Collaborating with other teachers and parents, and participating in meetings as and when required.
Observing and understanding students' behaviour and psyche.
Developing and enriching professional skills and knowledge by attending seminars, conferences, etc.
Explaining concepts to students in easy, understandable terms.
Using real-life examples to teach geometrical concepts.
Developing students' interest in subjects.
Planning and creating teaching materials.
You will also be responsible for assigning homework, grading assignments and quizzes, and documenting students' progress.

About JANAPRIYA NIRMAAN CONGLOMERATE PVT LTD
About Company
MyOperator is a Business AI Operator, a category leader that unifies WhatsApp, Calls, and AI-powered chat & voice bots into one intelligent business communication platform. Unlike fragmented communication tools, MyOperator combines automation, intelligence, and workflow integration to help businesses run WhatsApp campaigns, manage calls, deploy AI chatbots, and track performance, all from a single, no-code platform. Trusted by 12,000+ brands including Amazon, Domino's, Apollo, and Razorpay, MyOperator enables faster responses, higher resolution rates, and scalable customer engagement without fragmented tools or increased headcount.
Role Overview
We’re looking for a high-ownership Operations Intern who wants hands-on exposure to how backend operations work at a fast-growing AI SaaS company. This is a real ops role, not a shadow internship. High performers may be considered for a full-time opportunity post internship.
What You’ll Do
- Support coordination with telecom operators and data center partners
- Assist in vendor management and asset tracking
- Handle operational queries via the ticketing system
- Support basic troubleshooting of assets and infrastructure (with guidance)
- Assist with day-to-day admin and backend operations
Requirements
Who Should Apply
- Graduates (BBA preferred; B.Com / BA / BSc welcome)
- 0–1 year experience or strong internship exposure in operations
- Good communication and coordination skills
- Comfortable with MS Excel, Word, and PowerPoint
- Willing to learn technical and operational systems
Work Expectations
- 6-day workweek, roster-based (including occasional Sundays)
- Willingness to travel occasionally to data centers across India
- Fast-paced startup environment with real ownership
- Tenure: 6 Months
- Location: Noida, Sector 2 (Work-from-office)
Benefits
What You Get
- Hands-on exposure to AI SaaS, telecom, and infrastructure operations
- Steep learning curve with direct team interaction
- Laptop reimbursement
- High chance of full-time conversion based on performance

We are looking for a data-driven and proactive CRM & Performance Marketing Associate to strengthen customer engagement, retention, and full-funnel marketing for our premium lifestyle brands.
This role integrates CRM, email/WhatsApp marketing, automation, and performance marketing (Meta, Google, YouTube) to drive acquisition and loyalty.
You will report to the Marketing Manager and collaborate closely with the Operations and E-commerce teams.
Key Responsibilities:
1. CRM & Customer Retention
- Manage and segment customer data across Shopify, marketplaces (Amazon, Myntra, Tata CLiQ Luxury), and CRM platforms.
- Execute email and WhatsApp campaigns — automation, A/B testing, personalization, and performance tracking.
- Develop and manage loyalty, referral, and win-back programs to improve customer lifetime value (LTV).
- Integrate and optimize AI Chatbots for personalized engagement and lead nurturing.
2. Performance Marketing (Meta, Google, YouTube)
- Plan, execute, and optimize campaigns across Meta (Facebook/Instagram), Google Ads, and YouTube.
- Collaborate with internal and external teams for audience segmentation, creative testing, UTM setup, and reporting.
- Monitor and analyze KPIs including ROAS, CTR, CAC, and conversion rate, making data-backed optimization recommendations.
- Test audience clusters for premium segmentation.
- Align ad campaigns with CRM strategies to ensure consistent communication across channels.
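The UTM setup mentioned above is easy to standardize in code. As a minimal sketch (standard library only; the campaign values and the `tag_url` helper are hypothetical, the UTM parameter names are the standard ones):

```python
from urllib.parse import urlencode, urlparse

def tag_url(base_url: str, source: str, medium: str, campaign: str) -> str:
    """Append standard UTM parameters to a landing-page URL."""
    params = urlencode({
        "utm_source": source,      # e.g. "meta", "google"
        "utm_medium": medium,      # e.g. "paid_social", "cpc"
        "utm_campaign": campaign,  # hypothetical campaign slug
    })
    # Use "&" if the URL already carries a query string, "?" otherwise
    sep = "&" if urlparse(base_url).query else "?"
    return f"{base_url}{sep}{params}"

tagged = tag_url("https://example.com/collection", "meta", "paid_social", "diwali_launch")
print(tagged)
# https://example.com/collection?utm_source=meta&utm_medium=paid_social&utm_campaign=diwali_launch
```

Centralizing tagging like this keeps source/medium/campaign naming consistent across agencies and channels.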
3. Analytics & Reporting
- Create integrated dashboards covering CRM and paid media performance.
- Track key engagement metrics such as open rate, CTR, repeat purchase rate, and campaign ROI.
- Provide insights to enhance customer journey mapping, funnel efficiency, and conversion pathways.
- Present monthly performance summaries and improvement recommendations.
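The KPIs above reduce to simple ratios; a small sketch of the definitions (the figures are hypothetical, for illustration only):

```python
def roas(revenue: float, ad_spend: float) -> float:
    """Return on ad spend: revenue generated per unit of spend."""
    return revenue / ad_spend

def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate, as a fraction of impressions."""
    return clicks / impressions

def cac(ad_spend: float, new_customers: int) -> float:
    """Customer acquisition cost: spend per newly acquired customer."""
    return ad_spend / new_customers

# Hypothetical figures for one month of one campaign
print(roas(250_000, 50_000))   # 5.0
print(ctr(1_200, 80_000))      # 0.015
print(cac(50_000, 400))        # 125.0
```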
4. Cross-Functional Collaboration
- Work with marketing, operations, and tech teams to synchronize CRM, automation, and digital marketing efforts.
- Coordinate with agencies or vendors for campaign execution, data sync, and CRM tool integration.
- Ensure brand consistency and premium positioning across all digital touchpoints.
Qualifications
- Bachelor’s degree in Marketing, Business, or related field.
- 3–5 years of experience in CRM and/or Performance Marketing for premium or lifestyle brands.
- Proficiency in CRM and automation tools (e.g., HubSpot, Brevo, Mailchimp, Zoho, Klaviyo).
- Working knowledge of Meta Ads Manager, Google Ads, and YouTube Campaign Manager.
- Strong analytical mindset; ability to interpret campaign data and make optimization decisions.
- Experience with Shopify, marketplaces (Amazon, Myntra, Tata CLiQ Luxury), and email workflows and automations.
- Excellent communication and coordination skills, with a premium brand sensibility.
Mail an updated resume with salary details to:
email: etalenthire[at]gmail[dot]com
satish: 88O 27 49 743
Job Summary
We are seeking an experienced Databricks Developer with strong skills in PySpark, SQL, and Python, and hands-on experience deploying data solutions on AWS (preferred) or Azure. The role involves designing, developing, and optimizing scalable data pipelines and analytics workflows on the Databricks platform.
Key Responsibilities
- Develop and optimize ETL/ELT pipelines using Databricks and PySpark.
- Build scalable data workflows on AWS (EC2, S3, Glue, Lambda, IAM) or Azure (ADF, ADLS, Synapse).
- Implement and manage Delta Lake (ACID, schema evolution, time travel).
- Write efficient, complex SQL for transformation and analytics.
- Build and support batch and streaming ingestion (Kafka, Kinesis, EventHub).
- Optimize Databricks clusters, jobs, notebooks, and PySpark performance.
- Collaborate with cross-functional teams to deliver reliable data solutions.
- Ensure data governance, security, and compliance.
- Troubleshoot pipelines and support CI/CD deployments.
Required Skills & Experience
- 4–8 years in Data Engineering / Big Data development.
- Strong hands-on experience with Databricks (clusters, jobs, workflows).
- Advanced PySpark and strong Python skills.
- Expert-level SQL (complex queries, window functions).
- Practical experience with AWS (preferred) or Azure cloud services.
- Experience with Delta Lake, Parquet, and data lake architectures.
- Familiarity with CI/CD tools (GitHub Actions, Azure DevOps, Jenkins).
- Good understanding of data modeling, optimization, and distributed systems.
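The expert-level SQL requirement above highlights window functions. As a small, runnable illustration of a typical pattern (a running total per partition), using Python's bundled sqlite3 as a stand-in for Databricks SQL; the table and values are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('north', '2024-01', 100), ('north', '2024-02', 150),
        ('south', '2024-01', 200), ('south', '2024-02', 120);
""")

# Running total per region, ordered by month: a classic window-function pattern
rows = conn.execute("""
    SELECT region, month, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY month) AS running_total
    FROM sales
    ORDER BY region, month
""").fetchall()

for row in rows:
    print(row)
```

The same `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` syntax carries over to Spark SQL on Databricks.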
Trying to get in touch with you all for an exciting role at a startup firm in wealth management.
A short description of the company:
This company is building a platform to drive wealth management.
They own and operate an online investing platform that distributes mutual funds in India. The platform allows investors to buy and sell equity, debt, and tax-saving mutual funds. It is headquartered in Bengaluru, India.
Looking for great talent for a Backend Developer role with the below skills.
• Excellent knowledge of at least one ecosystem based on Elixir/Phoenix, Ruby/Rails, Python/Django, or Go/Scala/Clojure
• Good OO skills, including strong design patterns knowledge
• Familiar with datastores like MySQL, PostgreSQL, Redis, Redshift, etc.
• Familiarity with react.js/react-native, vue.js, etc.
• Knowledge of deploying software to AWS, GCP, or Azure
• Knowledge of software best practices, like Test-Driven Development (TDD) and Continuous Integration
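The TDD practice named above follows a test-first loop: write a failing test, then the implementation that makes it pass. A minimal sketch (the `slugify` helper and its cases are hypothetical examples, not part of the posting):

```python
import unittest

def slugify(title: str) -> str:
    """Tiny helper written to satisfy the tests below, illustrating
    the red-green TDD loop: the tests came first, then this code."""
    return "-".join(title.lower().split())

class TestSlugify(unittest.TestCase):
    def test_lowercases_and_hyphenates(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_collapses_whitespace(self):
        self.assertEqual(slugify("  Many   Spaces "), "many-spaces")

if __name__ == "__main__":
    unittest.main(exit=False, argv=["tdd-demo"])
```

In a CI pipeline, this same test suite runs on every push, which is the Continuous Integration half of the bullet.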
Job Description
Title:- Backend Developer
About us:-
If you think back to your school or college days, the best memories you have are those with your friends! Your first day in college, your best friend, your dorms, hanging out in the corridors, your school canteen, all-nighters before exams, and chilling for weeks after exams: our school lives are filled with so many awesome interactions, experiences, and memories. In each of them you're with a friend, never alone! Humans always have, and always will, learn from one another. We are a social learning platform that lets us learn the way we learn best: socially.
It is a network of after-school learning centres meant for inspiration, collaboration, and holistic growth. We encourage children to explore what works best for them. It is our belief that every child deserves the space and opportunity to explore and create. Learning happens best in person, and there is an opportunity to make this experience 10x better via modern centres, technology, and powerful communities! We're a mission-driven organisation building a network of such learning centres, with empowered teachers, in every neighbourhood of our country!
Responsibilities :-
We are looking for a Python (Django) web developer responsible for managing the interchange of data between the server and the users. You will write reusable, testable, and efficient code and design and implement low-latency, high-availability, high-performance applications. You should be an expert in Python, with good knowledge of the Django web framework, and understand the fundamental design principles behind a scalable application.
Skills:- Python, Django, Websockets, storage systems
Skills And Qualifications
- Expert in Python, with knowledge of at least one Python web framework (such as Django, Flask, etc., depending on your technology stack)
- Familiarity with some ORM (Object Relational Mapper) libraries
- Able to integrate multiple data sources and databases into one system
- Understanding of the threading limitations of Python, and multi-process architecture
- Good understanding of server-side templating languages (such as Jinja2, Mako, etc., depending on your technology stack)
- Basic understanding of front-end technologies, such as JavaScript, HTML5, and CSS3
- Understanding of accessibility and security compliance (depending on the specific project)
- Knowledge of user authentication and authorization between multiple systems, servers, and environments
- Understanding of fundamental design principles behind a scalable application
- Familiarity with event-driven programming in Python
- Understanding of the differences between multiple delivery platforms, such as mobile vs desktop, and optimizing output to match the specific platform
- Able to create database schemas that represent and support business processes
- Strong unit test and debugging skills
- Proficient understanding of code versioning tools (such as Git, Mercurial or SVN)
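One bullet above asks for an understanding of Python's threading limitations and multi-process architecture. The usual point is that CPython's GIL serializes CPU-bound threads, so CPU-heavy work is parallelized across processes instead. A minimal sketch (the task and worker count are hypothetical):

```python
from multiprocessing import Pool

def cpu_bound(n: int) -> int:
    """A CPU-bound task: sum of squares below n. Threads would serialize
    on the GIL here; separate processes each get their own interpreter."""
    return sum(i * i for i in range(n))

def run_parallel(inputs):
    # Two worker processes; Pool.map distributes inputs across them
    with Pool(processes=2) as pool:
        return pool.map(cpu_bound, inputs)

if __name__ == "__main__":
    print(run_parallel([1_000, 2_000]))
```

For I/O-bound work (sockets, databases), threads or async are still fine, since the GIL is released while waiting on I/O.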
Skills we are looking for :
- Experience in developing full-stack applications at scale in ReactJs/AngularJs & NodeJs.
- Strong knowledge and experience in HTML(5), CSS, SCSS, and Advanced JavaScript
- Good knowledge of coding RESTful APIs.
- Strong knowledge of Web Storage (Cookie, Local Storage, and Session Storage)
- Knowledge of modern authorization mechanisms, such as JWT
- Familiarity with modern front-end build pipelines and tools
- Ability to work in a lean-agile development environment.
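The JWT bullet above refers to the compact `header.payload.signature` token format. A minimal HS256 sketch using only the standard library, to show the structure (the secret and claims are hypothetical; production code should use a maintained JWT library rather than hand-rolling this):

```python
import base64, hashlib, hmac, json

def b64url(data: bytes) -> str:
    # JWT uses unpadded base64url encoding
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(claims: dict, secret: bytes) -> str:
    """Build a compact HS256 JWT: header.payload.signature."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    sig = b64url(hmac.new(secret, f"{header}.{payload}".encode(), hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"

def verify_jwt(token: str, secret: bytes) -> bool:
    header, payload, sig = token.split(".")
    expected = b64url(hmac.new(secret, f"{header}.{payload}".encode(), hashlib.sha256).digest())
    # Constant-time comparison to avoid timing attacks
    return hmac.compare_digest(sig, expected)

token = sign_jwt({"sub": "user-42", "role": "admin"}, b"demo-secret")
print(verify_jwt(token, b"demo-secret"))   # True
```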
Good to know :
- Using GIT
- Knowledge of AWS
- Familiarity with GraphQL will be a plus
· Primarily responsible for working on various service requests related to maintaining database security (roles, privileges, authentication), creating schema and copying data
· Work on installation, configuration, creation and administration of HP Vertica database clusters and AWS RDS MySQL database.
· Work with platform, product and other support teams to participate in release and change management processes to implement new features/functions as well support mandatory patches or upgrade activities on weekends.
· Responsible for database performance tuning, query tuning and database operations including migration and upgrade.
· Plan and implement periodic upgrades of database schema and perform testing and verification of released database schema upgrades.
· Configure and manage database backup and restore, and participate in disaster recovery.
· Create and maintain documentation, procedures and best practice guides for database deployment and maintenance.
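The backup-and-restore duty above typically means scripted backups that are verified after the copy. The posting's stack is Vertica and AWS RDS MySQL; purely as a stand-in illustration of the backup-then-verify pattern, here is a sketch using Python's sqlite3 online backup API (the table and data are hypothetical):

```python
import sqlite3

# Source database with some data (in-memory stand-in for a production DB)
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, name TEXT)")
src.execute("INSERT INTO accounts (name) VALUES ('alice'), ('bob')")
src.commit()

# Online backup into a second database while the source stays available
dst = sqlite3.connect(":memory:")
src.backup(dst)

# Verification step: row counts must match before the backup is trusted
src_count = src.execute("SELECT COUNT(*) FROM accounts").fetchone()[0]
dst_count = dst.execute("SELECT COUNT(*) FROM accounts").fetchone()[0]
print(src_count, dst_count)   # 2 2
```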
Mandatory Requirement/ Preference:
- 4 years of experience delivering enterprise solutions or products using the full SDLC
- 4 years of work experience with enterprise solutions on the Microsoft platform (SQL Server, stored procedures, WCF, C#.NET, Entity Framework)
- Extensive knowledge and experience creating and maintaining stored procedures in MS SQL Server
- Experience with Web API is an added advantage








