
Key Factors:
- Proven experience in sales and business development, preferably within the education sector, specifically in selling admission management solutions to CBSE, ICSE, IGCSE, IB, and boarding schools.
- Strong understanding of the admission processes and challenges faced by educational institutions.
- Excellent communication and presentation skills, with the ability to effectively engage with school stakeholders at all levels.
- A proactive, results-driven mindset with a track record of meeting or exceeding sales targets.
- Ability to build and nurture long-term relationships with customers.
- Strong negotiation and closing skills, with attention to detail in contract and agreement management.
- Exceptional organizational and time management skills, with the ability to prioritize tasks effectively.
- Willingness to travel within the assigned territory as required.
- Bachelor's or Master’s degree in business, marketing, education, or a related field is preferred.

About Ezyschooling

Key Responsibilities
• Identify and develop new business opportunities across target industries and
customer segments.
• Promote and sell lubrication system solutions including:
o Grease systems: Dual Line, Progressive, Multiline, Spray
o Oil systems: Pre-Jacking, HP-LP Systems, Circulating Systems
• Achieve or exceed monthly and annual sales targets across assigned territories.
• Develop and maintain strong relationships with OEMs, end-users, EPCs, and
government entities.
• Manage channel sales network by appointing, training, and supporting dealers and
sub-dealers across India.
• Participate in government tendering processes, including GeM portal-based
opportunities.
• Collaborate with internal teams (engineering, service, logistics) to ensure seamless
customer experience.
• Conduct product presentations, site visits, and customer meetings to demonstrate
value and drive engagement.
• Provide regular sales forecasts, market intelligence, and competitor analysis.
• Ensure timely collection of payments from direct customers and channel partners.
• Represent the company at industry events, expos, and exhibitions as required.
- Big data developer with 8+ years of professional IT experience with expertise in Hadoop ecosystem components in ingestion, Data modeling, querying, processing, storage, analysis, Data Integration and Implementing enterprise level systems spanning Big Data.
- A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components like Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, Zookeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
- Hands on experience in creating real - time data streaming solutions using Apache Spark core, Spark SQL & DataFrames, Kafka, Spark streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, which include the Name Node, Data Node, Resource Manager, Node Manager and Job History Server.
- Worked on both the Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in the installation, configuration and management of Big Data and the underlying infrastructure of Hadoop clusters.
- Hands-on experience in coding MapReduce/YARN programs using Java, Scala and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark using Scala on clusters for computation (analytics); installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster processing of data; handled importing data from different data sources into HDFS using Sqoop, performing transformations using Hive and MapReduce, and then loading the data into HDFS.
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence, customer segmentation and smooth maintenance in Spark Streaming.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Worked on creating data pipelines for different events: ingestion, aggregation, and loading consumer response data into Hive external tables in HDFS to serve as a feed for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience with the Microsoft cloud and with setting up clusters in Amazon EC2 & S3, including automating the setup and extension of clusters in the AWS cloud.
- Extensively worked on Spark using Python on clusters for computation (analytics); installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala, including Impala analytical functions.
- Experience in creating Data frames using PySpark and performing operation on the Data frames using Python.
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various kinds of knowledge reports using Power BI based on Business specification.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Experienced in projects using JIRA, testing, Maven and Jenkins build tools.
- Experienced in designing, building, deploying and utilizing almost the entire AWS stack (including EC2 and S3), focusing on high availability, fault tolerance, and auto-scaling.
- Good experience with use-case development, with Software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience importing data using Sqoop and SFTP from various sources like RDBMS, Teradata, Mainframes, Oracle and Netezza into HDFS, and performing transformations on it using Hive, Pig and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra and MongoDB, and their integration with Hadoop clusters.
- Hands on experience in Hadoop Big data technology working on MapReduce, Pig, Hive as Analysis tool, Sqoop and Flume data import/export tools.
Applications through email only.
Please visit the below URL to participate -
https://ukti.co.in/jd_writer.html
About Role
The Writer will play a key role in daily operations, eventually managing a growing team of content creators in the capacity of a Manager/Editor.
The role demands someone proactive (getting to work instead of waiting for instructions) with killer written communication skills. The candidate must hold a deep belief in the power of words and should have some understanding of the purpose of content creation for brands. The candidate should also be familiar with content marketing and the SaaS space.
The role entails a training period of 2 months, during which structured sessions are delivered to help writers excel in their role. Since this is a small and early-stage setup, opportunities to learn, grow and don multiple hats will be plentiful.
The Writer will be trained on different types and formats of written communication – blogs, articles, whitepapers, website copy, and various marketing collateral.
Roles and Responsibilities
The Writer will be responsible for the following:
- Creating well-researched and punchy content pieces
- Creating content in line with brand and editorial guidelines
- Developing an understanding of the brand and its audiences
- Staying up-to-date with industry developments in the content and marketing spaces
What is Ukti Looking For?
As a Writer at Ukti, you would need to be:
- Experienced, with a minimum of two years in B2B SaaS writing
- Detail-oriented
- Creative
- Empathetic
- A problem solver
- A team player
- Able to perform well in high-pressure situations
The Writer must possess the following skills:
- Critical thinking
- Time management
- Clarity of thought
- Leadership
- Strong interpersonal and business communication skills
- Proficiency in verbal and written English
- Familiarity with MS Word
- Ability to work independently and take ownership
Hiring Process
The hiring process consists of two written rounds that assess the candidate’s communication skills and linguistic proficiency, followed by an in-person/video interview where the details of the position are discussed.
To participate, please visit the Careers page at ukti.co.in
● Design overall architecture of the web application.
● Maintain quality and ensure responsiveness of applications.
● Collaborate with the rest of the engineering team to design and launch new features.
● Maintain code integrity and organization.
● Experience working with graphic designers and converting designs to visual elements.
● Understanding and implementation of security and data protection.
● Highly experienced with back-end programming languages (e.g., PHP, .NET, NodeJS)
● Proficient in using JavaScript libraries and ReactJS.
● Experience with SQL and NoSQL databases like MySQL and MongoDB.
● Experience with cloud message APIs and usage of push notifications.
● Knowledge of code versioning tools (such as Git, Mercurial or SVN).
● Should have good communication skills to communicate with clients or other stakeholders and understand the requirements.
Desired Skills and Experience:
hp data protector, graphic designers, communication skills, javascript libraries, programming languages, hp, PHP, SQL, git, net, rest, MySQL, cloud, design, backend, MongoDB, security, mercurial, databases, javascript
Job Type: Full-time

Experience : 3 to 7 Years
Number of Positions : 20
Job Location : Hyderabad
Notice : 30 Days
1. Expertise in building AWS data engineering pipelines with AWS Glue -> Athena -> QuickSight
2. Experience in developing lambda functions with AWS Lambda
3. Expertise with Spark/PySpark – Candidate should be hands on with PySpark code and should be able to do transformations with Spark
4. Should be able to code in Python and Scala.
5. Snowflake experience will be a plus
Hadoop and Hive are good to have; an understanding of them is enough.
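Point 2 above asks for experience developing AWS Lambda functions. Below is a minimal sketch of a Python Lambda handler, invoked locally with a fake event; the event shape and field names are assumptions for the illustration, not part of the posting.

```python
# Illustrative AWS Lambda handler (Python runtime). The "records"/"amount"
# event shape is an assumption for this sketch; a real function would match
# the shape of its actual trigger (S3, Glue, API Gateway, etc.).
import json


def lambda_handler(event, context):
    """Summarize the records attached to the incoming event."""
    records = event.get("records", [])
    total = sum(r.get("amount", 0) for r in records)
    return {
        "statusCode": 200,
        "body": json.dumps({"count": len(records), "total": total}),
    }


# Local invocation with a fake event — no AWS account needed for this sketch.
response = lambda_handler({"records": [{"amount": 3}, {"amount": 4}]}, None)
```

In a Glue -> Athena -> QuickSight pipeline, a function like this would typically sit on an event trigger (for example, an S3 object-created notification) rather than being called directly.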
Purpose: We are looking for React Native Application Developer who possesses a passion for mobile technologies.
Roles & Responsibilities:
- Working knowledge of React Native and React JS is a must.
- 3+ years of React Native experience with strong basics.
- At least 4-5 projects completed using React Native.
- Strong passion for programming in general, and for Android and iOS app development.
- Strong problem solving skills.
- Strong system design and architecture skills.
- Curiosity to tinker around, explore new paradigms and strong zest for continuous improvement.
- 3+ years of mobile development experience with strong basics.
- E2E App development and/or experience of developing SDKs is good to have.
Expected Start Date: 1/4/2021
Job Type: Full-time
ML ARCHITECT
Job Overview
We are looking for an ML Architect to help us discover the information hidden in vast amounts of data and make smarter decisions to deliver even better products. Your primary focus will be applying data mining techniques, doing statistical analysis, and building high-quality prediction systems integrated with our products. Candidates must have strong experience using a variety of data mining and data analysis methods, building and implementing models, using/creating algorithms, and creating/running simulations. They must be comfortable working with a wide range of stakeholders and functional teams. The right candidate will have a passion for discovering solutions hidden in large data sets and for working with stakeholders to improve business outcomes. The role includes automating the identification of textual data, along with its properties and structure, from various types of documents.
Responsibilities
- Selecting features, building and optimizing classifiers using machine learning techniques
- Data mining using state-of-the-art methods
- Enhancing data collection procedures to include information that is relevant for building analytic systems
- Processing, cleansing, and verifying the integrity of data used for analysis
- Creating automated anomaly detection systems and constantly tracking their performance
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Secure and manage when needed GPU cluster resources for events
- Write comprehensive internal feedback reports and find opportunities for improvements
- Manage GPU instances/machines to increase the performance and efficiency of the ML/DL model.
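The first responsibility above — selecting features and building and optimizing classifiers — can be sketched end to end in a few lines. This example uses scikit-learn and synthetic data purely as an illustration; the posting does not prescribe a specific library.

```python
# Minimal sketch of "selecting features, building and optimizing classifiers":
# a feature-selection step and a classifier in one pipeline, tuned with a
# grid search. Data is synthetic; all parameter choices are illustrative.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline

X, y = make_classification(
    n_samples=300, n_features=20, n_informative=5, random_state=0
)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipe = Pipeline([
    ("select", SelectKBest(f_classif)),          # feature selection
    ("clf", LogisticRegression(max_iter=1000)),  # classifier
])

# "Optimizing": search over how many features to keep and the regularization.
search = GridSearchCV(
    pipe, {"select__k": [5, 10], "clf__C": [0.1, 1.0]}, cv=3
)
search.fit(X_train, y_train)
accuracy = search.score(X_test, y_test)
```

Putting selection and classification in one pipeline keeps the feature-selection step inside cross-validation, avoiding leakage from the test folds.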
Skills and Qualifications
- Strong Hands-on experience in Python Programming
- Working experience with Computer Vision models - Object Detection Model, Image Classification
- Good experience in feature extraction, feature selection techniques and transfer learning
- Working experience in building deep learning NLP models for text classification and image analytics (CNN, RNN, LSTM).
- Working experience in any of the AWS/GCP cloud platforms; exposure to fetching data from various sources.
- Good experience in exploratory data analysis, data visualisation, and other data preprocessing techniques.
- Knowledge in any one of the DL frameworks like Tensorflow, Pytorch, Keras, Caffe
- Good knowledge of statistics, the distribution of data, and supervised and unsupervised machine learning algorithms.
- Exposure to OpenCV; familiarity with GPUs and CUDA
- Experience with NVIDIA software for cluster management and provisioning such as nvsm, dcgm and DeepOps.
- We are looking for a candidate with 14+ years of experience, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with AWS cloud services: EC2, RDS, AWS-Sagemaker(Added advantage)
- Experience with object-oriented/object function scripting languages in any: Python, Java, C++, Scala, etc.
What you will work on
Your primary focus will be to implement a complete user interface in the form of a responsive web app, with a focus on performance. Your main duties will include creating modules and components and coupling them together into a functional app. A thorough understanding of all the components of our platform and infrastructure is required, as you will be working closely with the back-end team to decide REST contracts.
Responsibilities
• Assess the technical feasibility of UI/UX designs
• Develop new user-facing features
• Build reusable code and libraries for future use
• Optimize application for maximum speed and scalability
• Assure that all edge cases are handled
• Collaborate with other team members and stakeholders
What can CasaOne promise you – An opportunity to:
• increase your rate of learning exponentially by defining hard problems and solving them
• partake in a high-growth journey and increase revenues 5x+ Y-o-Y
• be an early innovator in the shifting trend: ‘ownership economy’ -> ‘access economy’
• build a category-defining platform for FF&E (Furniture, Fixture, and Equipment) leasing
• build high-performance teams
The must-haves
• Good understanding of single-page web applications and Javascript libraries and frameworks, such as ReactJS, AngularJS, and jQuery.
• Good understanding of asynchronous request handling, partial page updates, and AJAX.
• Proficient understanding of web markup, including HTML5, CSS3.
• Basic understanding of CSS pre-processing platforms, such as LESS and SASS.
• Good understanding of responsive web development.
• Proficient understanding of cross-browser compatibility issues and ways to work around them.
• Basic understanding of SEO.
