We are hiring Telecallers.
Objective
Reach out to existing clients and new prospects to generate sales and build relationships.
Prerequisites
Candidates should have excellent verbal and written communication skills, good interpersonal skills, and the ability to handle pressure. They should also be creative thinkers with good listening skills.
The broad job outline is as follows:
- Outbound calls (Make cold calls to potential customers to present products or services, explain their benefits, and encourage purchases or appointments)
- Inbound calls (Manage incoming calls, resolve customer issues, and collect customer information)
- Customer relationships (Build relationships with existing clients and foster rapport with potential clients)
- Documentation (Create reports and documentation based on conversations with customers)
- Sales pitches (Memorize and customize scripts for clients, and modify pitches based on customer demand)
- Customer data (Maintain contact information databases and collect customer data to improve the user experience)
- Communication (Interact with customers via various channels, including live chat, emails, social media, and direct calls)
- Follow up (Follow up with incoming leads and book appointments for the field sales team)
- Compliance (Adhere to organizational guidelines and methodology)
About Unique CompSol Private Limited
Experience
- 3-6 years of experience in an HR Generalist role in a startup
- Bachelor's degree (or equivalent) in human resources management or a similar field
- Knowledge of applicant tracking systems
Role and Responsibilities
- Should have worked on end-to-end talent management (the hire-to-retire model) and be able to walk us through the employee lifecycle process
- Should be good at interacting with people and should have done the following: Conduct review and appraisal cycle across teams
- Look after employees' growth: who is ready for promotion, who needs to do something different within the organization, and who the high potentials are
- Define how performance is measured
- Benchmarking compensation across companies
- Attrition planning
- Succession planning
- Driving Learning & Development across the company
- Ensuring cultural homogeneity
- Employee engagement
- Performance Management and its process
- Compensation benchmarking
Talent Acquisition
- Coordinate with hiring managers to identify staffing needs and candidate selection criteria
- Source applicants through online channels, such as LinkedIn and other professional networks
- Create job descriptions and interview questions that reflect the requirements for each position
- Compile lists of most-suitable candidates by assessing their CVs, portfolios, and references
- Organize and attend job fairs and recruitment events to build a strong candidate pipeline
- Maintain records of all materials used for recruitment, including interview notes and related paperwork, to share with key stakeholders.
Onboarding & Induction
Should have done the following:
- Plan and execute staff welfare programs.
- Present the company's HR policies to new joiners.
- Discuss the company's future planning and expansion with employees.
- Organize health awareness programs for employees.
- Maintain a good relationship between employees and the employer
Employee Grievance Handling:
Should have been involved in the below:
- Prepare and maintain employee retention policy.
- Handle and resolve grievances of employees.
- Resolve problems between employees and the employer on issues such as compensation, processes, and policies. Share instances of any difficult issues you have resolved.
Skills
- Proficiency in documenting processes and keeping up with industry trends.
- Excellent interpersonal and communication skills.
- Exceptional ability to screen candidates, compile shortlists and interview candidates.
- Experience in creating awareness of the company brand and establishing professional relationships with candidates.
Benefits
Too lazy to cook? Delicious lunch served at the office.
ESOPs, based on grade level
Comprehensive Health Insurance plans
- Big Data developer with 8+ years of professional IT experience and expertise in Hadoop ecosystem components for ingestion, data modeling, querying, processing, storage, analysis, and data integration, and in implementing enterprise-level Big Data systems.
- A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark core, Spark SQL & DataFrames, Kafka, Spark Streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters: NameNode, DataNode, ResourceManager, NodeManager and Job History Server.
- Worked on both the Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in the installation, configuration and management of Big Data and the underlying infrastructure of a Hadoop cluster.
- Hands on experience in coding MapReduce/Yarn Programs using Java, Scala and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark using Scala on a cluster for analytics; installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster processing of data; handled importing data from different data sources into HDFS using Sqoop, performing transformations using Hive and MapReduce, and then loading the data into HDFS.
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence, customer segmentation, and smooth maintenance in Spark Streaming.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Worked on creating data pipelines for ingestion and aggregation, loading consumer response data into Hive external tables in HDFS to serve as the feed for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience with the Microsoft cloud and with setting up clusters on Amazon EC2 & S3, including automating the setup and extension of clusters in the AWS cloud.
- Extensively worked on Spark using Python on a cluster for analytics; installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala; also used Impala analytical functions.
- Experience in creating DataFrames using PySpark and performing operations on them using Python.
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various kinds of knowledge reports using Power BI based on Business specification.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Experienced in projects using JIRA, testing, and the Maven and Jenkins build tools.
- Experienced in designing, building, deploying and utilizing almost all of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance, and auto-scaling.
- Good experience with use-case development and with software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience in importing data using Sqoop and SFTP from various sources such as RDBMS, Teradata, mainframes, Oracle and Netezza into HDFS, and performing transformations on it using Hive, Pig and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra and MongoDB, and their integration with a Hadoop cluster.
- Hands-on experience in Hadoop Big Data technology, working with MapReduce, Pig and Hive as analysis tools, and Sqoop and Flume as data import/export tools.
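The MapReduce programming model referenced throughout the bullets above can be sketched locally in plain Python, with no Hadoop cluster required. This is an illustrative simulation of the map, shuffle, and reduce phases on a word-count job, not how Hadoop itself is invoked:

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line
    for line in lines:
        for word in line.lower().split():
            yield (word, 1)

def shuffle(pairs):
    # Shuffle: group values by key, as Hadoop does between map and reduce
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts emitted for each word
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data big clusters", "data pipelines"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["big"])   # 2
print(counts["data"])  # 2
```

In a real Hadoop or Spark job the shuffle is handled by the framework across machines; only the map and reduce logic is user code.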
Key Responsibilities
• As a part of the DevOps team, you will be responsible for the configuration, optimization, documentation, and support of the CI/CD components.
• Creating and managing build and release pipelines with Azure DevOps and Jenkins.
• Assist in planning and reviewing application architecture and design to promote an efficient deployment process.
• Troubleshoot server performance issues & handle the continuous integration system.
• Automate infrastructure provisioning using ARM Templates and Terraform.
• Monitor and support deployments and cloud-based and on-premises infrastructure.
• Diagnose and develop root-cause solutions for failures and performance issues in the production environment.
• Deploy and manage Infrastructure for production applications
• Configure security best practices for application and infrastructure
Essential Requirements
• Good hands-on experience with cloud platforms like Azure, AWS & GCP (preferably Azure).
• Strong knowledge of CI/CD principles.
• Strong work experience with CI/CD implementation tools like Azure DevOps, Team City, Octopus Deploy, AWS Code Deploy, and Jenkins.
• Experience writing automation scripts with PowerShell, Bash, Python, etc.
• Experience with GitHub, JIRA, Confluence, and Continuous Integration (CI) systems.
• Understanding of secure DevOps practices
Good to Have -
• Knowledge of scripting languages such as PowerShell and Bash
• Experience with project management methodologies and workflow tools such as Agile, Scrum/Kanban, and Jira.
• Experience with Build technologies and cloud services. (Jenkins, TeamCity, Azure DevOps, Bamboo, AWS Code Deploy)
• Strong communication skills and the ability to explain protocols and processes to the team and management.
• Must be able to handle multiple tasks and adapt to a constantly changing environment.
• Must have a good understanding of SDLC.
• Knowledge of Linux, Windows server, Monitoring tools, and Shell scripting.
• Self-motivated, with a demonstrated ability to pick up new technologies with minimal supervision.
• Organized and flexible, with the analytical ability to solve problems creatively
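The automation scripts this role calls for (PowerShell, Bash, Python) often reduce to polling-and-retry logic around deployments. A minimal, hypothetical Python sketch of a post-deployment health-check loop (the function name and `check` callable are illustrative, not from any specific tool):

```python
import time

def wait_for_healthy(check, retries=5, delay=1.0):
    """Poll a health check until it passes or retries are exhausted.

    `check` is any zero-argument callable returning True when the
    service is healthy (e.g. a wrapped HTTP probe of a /health URL).
    Returns the number of attempts it took; raises on timeout.
    """
    for attempt in range(1, retries + 1):
        if check():
            return attempt
        if attempt < retries:
            time.sleep(delay)  # back off before probing again
    raise TimeoutError("service never became healthy")

# Simulated deployment that becomes healthy on the third probe
responses = iter([False, False, True])
print(wait_for_healthy(lambda: next(responses), delay=0))  # 3
```

In a real pipeline the same loop would wrap an HTTP request and run as a post-deployment gate in Azure DevOps or Jenkins.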
Who we are looking for
- A minimum of 5 years' total work experience in developing web-based solutions and services
- Gathering functional requirements, developing technical specifications, and project & test planning
- Designing/developing web, software, mobile apps, prototypes, or proofs of concepts (POCs)
- Apply technical expertise to challenging programming and design problems
- Resolve defects/bugs during QA testing, pre-production, production, and post-release patches
- Work cross-functionally with various Intuit teams: product management, QA/QE, different product lines, or business units to drive results forward
- Contribute to the design and architecture of the project
- Experience with Agile Development, SCRUM, or Extreme Programming methodologies
- Solid communication skills: demonstrated ability to explain complex technical issues to both technical and non-technical audiences
- Strong understanding of the Software design/architecture process
- Experience with unit testing & Test Driven Development (TDD)
- Should have knowledge of frameworks such as VueJS, Node JS, Express JS, Grunt, Gulp, Yeoman, AngularJS, Backbone JS, React JS, Bootstrap (or alternatives)
- Should have expertise in efficient, minimal JavaScript on both mobile and web.
Skills Needed
- Extensive hands-on Salesforce experience
- Experience in Lightning implementation, LWC
- Ability to Perform configuration, customization, integration and support of Salesforce.com
- Hands-on experience in customizing using Apex, Triggers, Batch Apex, Visualforce, etc.
- Understanding of the Salesforce.com data model
- Must have hands-on experience with deployments using Change Sets and with managing release dependencies across multiple projects
- Understanding of release management and the software development life cycle
Work with customers or their representatives
- Possess a professional, knowledgeable, positive and energetic attitude
- Use strong consultative sales skills and interpersonal skills (both oral and written)
- Attention to detail, strong follow-up skills and motivation
- Assist manager in marketing campaigns
- Maintain accurate and organized files
- Work with Outside Sales Staff and Counter Sales Staff to assist in following up with customers
- Develop long-term relationships with industry customers (i.e. designers)
- Maintain the Showroom as a professional place of business
- Follow company policies and procedures and other duties as assigned
- Contribute to all phases of the development lifecycle
- Write well-designed, testable, efficient code
- Ensure development is in compliance with specifications
- Prepare and produce releases of software components
- Support continuous improvement by investigating alternatives and technologies and presenting these for architectural review
Qualifications
- BSc/BE/BTech in Computer Science, Engineering, or a related field
- 3+ years of experience with the Spring Boot framework
- Experience with REST/JSON and SOAP/XML is mandatory
- Proven working experience in Java development
- Experienced in server-side Java, J2EE, Servlets, Spring/Spring Boot, JAXB, JAX-WS, MySQL/PostgreSQL, JUnit
- Knowledge of Relational Databases, SQL and ORM technologies (JPA2, Hibernate).
- Hands-on experience in designing and developing applications using Java EE platforms with open-source frameworks: Spring/Spring Boot, XML, integration with SOAP and RESTful web services, and WS-Security.
- Experience with test-driven development
- Experience with Git and Maven.
- Experience with Spring Security.
- Develop detailed test strategy and test plans for various features at the functional and system level.
- Be a gatekeeper for the quality of any product Rapido ships to the market, including customer experiences.
- Be an integral member of the SDLC, from the product requirements gathering phase through design reviews to product delivery.
- Maintain a staging environment to effectively verify and validate the products.
- Ensure comprehensive test coverage by preparing a detailed capability matrix to map features to the functional requirements of the test plan and for defining the acceptance criteria.
- Effectively develop automation to improve efficiency and productivity.
- Expertise in developing automation framework for mobile apps, web UI, APIs using appropriate open source tools like Appium, Selenium, SOAP UI, Rest Assured etc.
- Fluent in OOP concepts and at least one programming language such as Java or Python.
- Cover all dimensions of Rapido’s product which includes features/functionality, scale/load, usability, and performance.
- Hands on experience in CI/CD driven test processes is an added benefit.
- Analyze defects to assess the severity and prioritize them for development to fix.
- Work with software developers to tune code and track all problem reports to closure. Analyze test results to ensure functionality and recommend appropriate action.
- Mentor QAs within the team.
- Graduate or Master's degree in computer science
- 2-7 Years of relevant experience
- Round 1 – Technical Interview 1
- Round 2 – Technical Interview 2
- Round 3 – Managerial Round
- Round 4 – HR Round
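The capability-matrix idea described in the QA responsibilities above (mapping features to functional requirements to verify test coverage) can be sketched in a few lines of Python. All names and requirement IDs here are illustrative, not from any real tool:

```python
def coverage_gaps(requirements, capability_matrix):
    """Return the requirement IDs not exercised by any tested feature.

    capability_matrix maps feature name -> list of requirement IDs
    that the feature's tests cover.
    """
    covered = {req for reqs in capability_matrix.values() for req in reqs}
    return sorted(set(requirements) - covered)

# Hypothetical matrix for two tested features
matrix = {
    "login_flow": ["REQ-1", "REQ-2"],
    "ride_booking": ["REQ-3"],
}
print(coverage_gaps(["REQ-1", "REQ-2", "REQ-3", "REQ-4"], matrix))  # ['REQ-4']
```

Any requirement the function returns has no mapped test, so it either needs a new test case or an explicit exclusion in the test plan's acceptance criteria.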
Job description:
Job Title/Designation/Position: Dotnet Developer / Sr. Dotnet Developer
Total Experience: 4-8 years
Relevant Experience: 4-8 years
Job Location: Pune/Hyderabad/Bangalore
Skills/Qualifications:
Primary Skills (Must have):
- C# with a good understanding of OOP, design patterns and multi-threading
- ASP.Net MVC, WEBAPI, RESTful services, JavaScript, jQuery
- .Net Framework
- Advanced knowledge of Microsoft SQL Server
- Knowledge of unit testing and automated tests (NUnit)
- Knowledge of SQL Server/Oracle databases and the ability to write stored procedures
- Strong analytical and problem-solving skills
- Excellent written and verbal communication skills in English
- Bachelor’s Degree in Engineering /Technology / MCA
Secondary Skills (Nice to have):
- HTML5
- CSS3
- Node JS