
Job Title: Ad Sales Executive / Manager
Location: Mumbai, Pune, Nagpur
Experience: 2-8 years in advertising field
Salary: As per industry standards + Benefits
Responsibilities:
- Identify and prospect potential clients across market segments
- Build and maintain strong relationships with key decision-makers and stakeholders to understand their marketing needs and objectives
- Develop customized proposals and presentations that showcase Reach India's capabilities and solutions tailored to each client's requirements
- Collaborate with internal teams, including creative, strategy, and account management, to ensure seamless execution of client campaigns
- Stay abreast of industry trends, competitor activities, and market insights to identify opportunities for innovation and differentiation
- Understand media verticals (Radio, TV, Digital, OOH, etc.) and pitch proposals informed by the client's requirements and budget

Job Title : DevOps Engineer
Experience : 3+ Years
Location : Mumbai
Employment Type : Full-time
Job Overview :
We’re looking for an experienced DevOps Engineer to design, build, and manage Kubernetes-based deployments for a microservices data discovery platform.
The ideal candidate has strong hands-on expertise with Helm, Docker, CI/CD pipelines, and cloud networking — and can handle complex deployments across on-prem, cloud, and air-gapped environments.
Mandatory Skills :
✅ Helm, Kubernetes, Docker
✅ Jenkins, ArgoCD, GitOps
✅ Cloud Networking (VPCs, bare metal vs. VMs)
✅ Storage (MinIO, Ceph, NFS, S3/EBS)
✅ Air-gapped & multi-tenant deployments
Key Responsibilities :
- Build and customize Helm charts from scratch.
- Implement CI/CD pipelines using Jenkins, ArgoCD, GitOps.
- Manage containerized deployments across on-prem/cloud setups.
- Work on air-gapped and restricted environments.
- Optimize for scalability, monitoring, and security (Prometheus, Grafana, RBAC, HPA).
JOB DESCRIPTION
Job Title: Admission Counsellor
Experience: 1-4 Years
Location: Gurgaon
Type of Working: Work from Office, 5 Days
RESPONSIBILITIES
● Counsel prospective students and guide them on the different courses offered.
● Counsel prospective students through outbound data calling.
● Advise students on specific degree programs and admission procedures.
● Handle queries over the telephone.
● Convert leads into admissions.
ELIGIBILITY:
● Excellent written and verbal communication skills.
● Good knowledge of the courses and readiness to work as per requirements.
● Well-versed in the EdTech industry.
● Must possess thorough knowledge of the institutions and programs offered.
● Should maintain a professional work environment.
● Energetic, result-oriented, and takes initiative.
● Experience in the EdTech industry would be preferred.
Profile: Node.js Backend Developer
Location: Andheri East, Mumbai, Maharashtra
Experience: 2+ Years
Work Mode: Hybrid
Tech Stack:
✅ Node.js & Express.js
✅ MongoDB & Mongoose
✅ RESTful APIs & GraphQL
✅ JavaScript (ES6+) & TypeScript
✅ JWT Authentication
✅ Docker & Containerization
✅ Git Version Control
✅ Postman/Thunder Client
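To make the JWT authentication item above concrete: an HS256 token is just a signed pair of base64url-encoded JSON segments. The sketch below shows the mechanics using only standard-library primitives (written in Python as a language-neutral illustration; a real Node.js service would normally use an established library such as jsonwebtoken rather than hand-rolling this):

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    # base64url without padding, as used in JWTs
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: bytes) -> str:
    """Produce a compact HS256 JWT: header.payload.signature."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_jwt(token: str, secret: bytes) -> dict:
    """Check the HMAC signature and return the decoded payload."""
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("invalid signature")
    padded = body + "=" * (-len(body) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(padded))
```

Note this sketch omits the `exp`/`iat` claim checks a production verifier would also perform.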
Position: Student Counselor
Experience Required: 3–4 Years
Salary Range: ₹30,000 – ₹45,000 per month
Location: Kurukshetra, Haryana
Company: Digimaniac (Marketing Agency & Educational Institute)
About Us:
Digimaniac is a growing digital marketing agency and training institute in Kurukshetra. We provide marketing solutions to clients across industries and also train students in professional digital marketing skills.
We are looking for a Counselor who will play a vital role in guiding students, handling inquiries, and supporting admissions for our institute.
Key Responsibilities:
- Counsel students on courses, training programs, and career opportunities
- Guide students through the enrollment process and answer queries
- Maintain regular follow-up with students and parents
- Organize and participate in seminars, workshops, and counseling sessions
- Keep updated with the latest industry trends and course offerings
- Maintain records of student interactions and progress
Requirements:
- Strong communication and interpersonal skills
- Ability to listen actively and provide constructive guidance
- Prior experience in counseling or academic advisory preferred
- Good organizational and time-management skills
- Bachelor’s degree in any discipline (Education or Psychology preferred)
Why Join Us?
- Opportunity to make a real impact on students’ futures
- Supportive and collaborative work culture
- Professional growth opportunities
Role: Python Developer
Location: HYD, BLR
Experience: 6+ Years
Skills needed:
Python developer with 5+ years of experience in designing, developing, and maintaining scalable applications, with a strong focus on API integration. Must demonstrate proficiency in RESTful API consumption, third-party service integration, and troubleshooting API-related issues.
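The "RESTful API consumption and troubleshooting" requirement above amounts to calling an endpoint, handling transient failures, and surfacing clear errors. A minimal standard-library sketch (the endpoint, retry policy, and error wrapping are illustrative assumptions, not part of the posting):

```python
import json
import urllib.error
import urllib.request

def fetch_json(url: str, timeout: float = 10, retries: int = 3) -> dict:
    """Fetch JSON from a REST endpoint with basic retry and error handling.

    Illustrative sketch only: retries network-level failures, and raises a
    single wrapped error once the retry budget is exhausted.
    """
    last_err = None
    for _ in range(retries):
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                if resp.status != 200:
                    # non-OK responses are treated as fatal, not retried
                    raise RuntimeError(f"unexpected status {resp.status}")
                return json.loads(resp.read().decode("utf-8"))
        except (urllib.error.URLError, TimeoutError) as err:
            last_err = err  # transient network failure: try again
    raise RuntimeError(f"all {retries} attempts to reach {url} failed") from last_err
```

In practice a library such as requests with an explicit backoff policy would replace this, but the failure modes to handle are the same.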
The Sr AWS/Azure/GCP Databricks Data Engineer at Koantek will use comprehensive modern data engineering techniques and methods with Advanced Analytics to support business decisions for our clients. Your goal is to support the use of data-driven insights to help our clients achieve business outcomes and objectives. You will collect, aggregate, and analyze structured/unstructured data from multiple internal and external sources and present patterns, insights, and trends to decision-makers. You will help design and build data pipelines, data streams, reporting tools, information dashboards, data service APIs, data generators, and other end-user information portals and insight tools. You will be a critical part of the data supply chain, ensuring that stakeholders can access and manipulate data for routine and ad hoc analysis to drive business outcomes using Advanced Analytics. You are expected to function as a productive member of a team, working and communicating proactively with engineering peers, technical leads, project managers, product owners, and resource managers.
Requirements:
- Strong experience as an AWS/Azure/GCP Data Engineer; must have AWS/Azure/GCP Databricks experience
- Expert proficiency in Spark (Scala and Python)
- Must have data migration experience from on-prem to cloud
- Hands-on experience with Kinesis to process and analyze stream data, Event/IoT Hubs, and Cosmos DB
- In-depth understanding of Azure/AWS/GCP cloud, data lake, and analytics solutions
- Expert-level hands-on experience designing and developing applications on Databricks
- Extensive hands-on experience implementing data migration and data processing using AWS/Azure/GCP services
- In-depth understanding of Spark architecture, including Spark Streaming, Spark Core, Spark SQL, DataFrames, RDD caching, and Spark MLlib
- Hands-on experience with the industry technology stack for data management, data ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, Spark, Flume, Impala, etc.
- Hands-on knowledge of data frameworks, data lakes, and open-source projects such as Apache Spark, MLflow, and Delta Lake
- Good working knowledge of code versioning tools such as Git, Bitbucket, or SVN
- Hands-on experience using Spark SQL with various data sources such as JSON, Parquet, and key-value pairs
- Experience preparing data for data science and machine learning, with exposure to model selection, model lifecycle, hyperparameter tuning, model serving, deep learning, etc.
- Demonstrated experience preparing data and automating and building data pipelines for AI use cases (text, voice, image, IoT data, etc.)
- Good to have: programming experience with .NET or Spark/Scala
- Experience creating tables, partitioning, bucketing, and loading and aggregating data using Spark Scala, Spark SQL/PySpark
- Knowledge of AWS/Azure/GCP DevOps processes such as CI/CD, as well as Agile tools and processes including Git, Jenkins, Jira, and Confluence
- Working experience with Visual Studio, PowerShell scripting, and ARM templates; able to build ingestion to ADLS and enable a BI layer for analytics
- Strong understanding of data modeling and of defining conceptual, logical, and physical data models
- Big data/analytics/information analysis/database management in the cloud
- IoT/event-driven/microservices in the cloud; experience with private and public cloud architectures, their pros/cons, and migration considerations
- Ability to stay up to date with industry standards and technological advancements that will enhance data quality and reliability to advance strategic initiatives
- Working knowledge of RESTful APIs, the OAuth2 authorization framework, and security best practices for API gateways
- Guide customers in transforming big data projects, including development and deployment of big data and AI applications
- Guide customers on data engineering best practices, provide proofs of concept, architect solutions, and collaborate as needed
- 2+ years of hands-on experience designing and implementing multi-tenant solutions using AWS/Azure/GCP Databricks for data governance, data pipelines for near-real-time data warehouses, and machine learning solutions
- Overall 5+ years' experience in software development, data engineering, or data analytics using Python, PySpark, Scala, Spark, Java, or equivalent technologies, with hands-on expertise in Apache Spark (Scala or Python)
- 3+ years of experience in query tuning, performance tuning, troubleshooting, and debugging Spark and other big data solutions
- Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or a similar area of study, or equivalent work experience
- Ability to manage competing priorities in a fast-paced environment
- Ability to resolve issues
- Basic experience with or knowledge of Agile methodologies
- AWS Certified Solutions Architect – Professional
- Databricks Certified Associate Developer for Apache Spark
- Microsoft Certified: Azure Data Engineer Associate
- Google Cloud Certified – Professional
We are arranging a virtual walk-in on Saturday, 21st May 2022,
for Java Developers with 0 to 7 years of experience.
Kindly send in your CVs with the subject line: Virtual Walk-In 21st May
· Job Location: Work from Home
· Salary Package: As per Company Standards
Job Description:
Technologies/Frameworks –
· Core Java, J2EE
· Spring Core and Spring MVC, Spring Boot, Spring Security
· JDBC, Hibernate, RESTful APIs, SOAP web services
· Knowledge of JavaScript, jQuery, AJAX, HTML5, and CSS3; Angular is an added advantage
· JUnit or Mockito frameworks
· Maven, Git
· Knowledge of data structures
· SQL, MySQL
· Designing relational database schemas
· Basics of AWS, Cloud, Microservices
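As an illustration of the relational-schema item above: a well-designed schema encodes constraints (keys, uniqueness, referential integrity) in the database itself. A minimal sketch using SQLite, with hypothetical `users` and `orders` tables that are not part of the posting:

```python
import sqlite3

# In-memory database; the schema below is purely illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users (
    id    INTEGER PRIMARY KEY,
    email TEXT NOT NULL UNIQUE          -- uniqueness enforced by the schema
);
CREATE TABLE orders (
    id          INTEGER PRIMARY KEY,
    user_id     INTEGER NOT NULL REFERENCES users(id),  -- foreign key
    total_cents INTEGER NOT NULL CHECK (total_cents >= 0)
);
""")
conn.execute("INSERT INTO users (email) VALUES (?)", ("a@example.com",))
conn.execute("INSERT INTO orders (user_id, total_cents) VALUES (1, 1999)")

# A join recovers the relationship the foreign key models.
row = conn.execute(
    "SELECT u.email, o.total_cents FROM orders o JOIN users u ON u.id = o.user_id"
).fetchone()
```

The same schema translates directly to MySQL, with the caveat that foreign-key enforcement there depends on the storage engine (InnoDB).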
Must have strong knowledge of, and experience in, Agile software development methodologies, plus knowledge of an Agile tool such as Jira.
Strong knowledge of integrating third-party APIs and creating new APIs.
Must have good knowledge of creating flow diagrams, UML diagrams, and all required documentation.
Must have team lead exposure.
Domain Preference
This role is work from office.
Job Description
Roles & Responsibilities
- Work across the entire landscape that spans network, compute, storage, databases, applications, and business domain
- Use the Big Data and AI-driven features of vuSmartMaps to provide solutions that will enable customers to improve the end-user experience for their applications
- Create detailed designs and solutions, validate them with internal engineering and customer teams, and establish a strong network of relationships with customers and experts
- Understand the application architecture and transaction-level workflow to identify touchpoints and metrics to be monitored and analyzed
- Analyze data and provide insights and recommendations
- Communicate proactively with customers; manage planning and execution of platform implementation at customer sites
- Work with the product team on developing new features, identifying solution gaps, etc.
- Interest and aptitude in learning new technologies: Big Data, NoSQL databases, Elasticsearch, MongoDB, DevOps
Skills & Experience
- 2+ years of experience in IT infrastructure management
- Experience working with large-scale IT infrastructure, including applications, databases, and networks
- Experience working with monitoring and automation tools
- Hands-on experience with Linux and scripting
- Knowledge of or experience in the following technologies is an added plus: Elasticsearch, Kafka, Docker containers, MongoDB, Big Data, SQL databases, the ELK stack, REST APIs, web services, and JMX
Keep the manual test cases updated;
Come up with testing procedures to validate functional, system, and performance requirements for new features;
Ensure the quality of feature releases by running the test cases and reporting the results;
Work with the Customer Support team to reproduce customer problems and provide solutions;
Set up test environments, services, and databases as required;
Write and maintain automated test suites for functional and performance testing;
Participate in product feature specification with Product Managers, UX engineers, and developers to understand the features under test;
Design and build automated smoke, feature, and regression tests;
Prepare automation test plans for service-level tests, integration tests, and UI tests as needed;
Write feature use cases in BDD to drive the creation of automated test cases;
Participate in release/iteration planning and in smoke, feature, and regression test planning.
Requirements:
Hands-on experience with Java, Selenium, BDD tools such as Cucumber, and CI/build tools such as Jenkins, Maven, and GitHub;
Strong skills in automation test framework development using Selenium;
Highly motivated self-starter with a desire to help others and take action;
Experience with one or more programming languages: C#, Java, Node.js/JavaScript, Go, Python;
At least 2 years of experience in QA automation;
Excellent written and verbal communication skills;
Exposure to API testing and performance testing using JMeter will be a plus.
The Category Manager should have worked in a marketing role in the e-commerce domain or digital services (e.g., a doctor consultation platform).
Owning end-to-end responsibility for a category.
Assisting in getting brands on board.
Taking care of brand representation on the platform, mainly through content.
Analyzing data to understand the category's performance and benchmarking it against competitors.
Working with the sourcing team to ensure important SKUs are always in stock.
Working with brands to get the latest products/offers onto the platform.
