
Must be a Graduate
➢ Exposure to the Sales domain is preferred.
➢ Excellent verbal & written communication skills
➢ Custodian of all qualitative parameters including Quality Scores, Compliance and CSAT
➢ Very good presentation, feedback and coaching skills
➢ Ability to observe, analyze and identify process improvement opportunities
➢ Must be comfortable working in 24/7 rotational shifts
➢ Ability to work under pressure; customer-service attitude with an analytical bent of mind
➢ Highly energetic & enthusiastic
➢ Should be able to work as an individual contributor and as a good team player
➢ Hands-on experience with MS Office, preferably MS Excel and PowerPoint
➢ Basic data handling and data interpretation skills
➢ Graduation mandatory
➢ Package: 4 - 6 Lac
➢ Location: Gurgaon
➢ Should be from neighbouring/accessible locations around Gurgaon.
➢ Should be comfortable with 24/7 rotational shifts & self-commuting.
Key Responsibilities:
➢ Audit call (voice) interactions on product, process, communication and compliance parameters
➢ Conduct one-on-one and group feedback sessions for the targeted population to improve their performance
➢ Custodian of all qualitative parameters including Quality Scores, Compliance and CSAT
➢ Data analysis and preparation of quality reports and review decks
➢ Conduct calibration sessions with partner BPO teams and act as master calibrator to ensure a consistent scoring approach
➢ Identify process improvements and make recommendations
➢ Using process knowledge, proactively identify areas of concern and highlight them to the Change team
➢ Identify training needs and work in close coordination with the Training team to help advisors come up the learning curve
➢ Conduct certification for new hires
➢ Conduct compliance audits to trace malpractices and share internal compliance feedback with Management

Job Description
Cateina Technologies is looking for IBM ACE Developers
Technical Skills
● IBM Message Broker
● IIB / ACE (App Connect Enterprise)
● Kubernetes
● Microservices
● Docker
● EAI and SOA architecture
● Java
● Web services (REST, SOAP)
● JSON, XML
● Database knowledge
● Understanding of Nodes
● Basic Unix knowledge
● Angular JS
● Jira
Required competencies
● Experience in designing, developing, implementing and supporting the full SDLC using IBM Integration Bus / IBM API Connect / IBM ACE
● Required: API knowledge in the Banking domain, Oracle PL/SQL, WebSphere Message Broker, IBM Integration Bus (IIB)
● Sound knowledge of REST, XML/JSON, WSDL, SOAP, JMS, HTTP, SSL, XSL/XSLT, and GatewayScript is required
● Clear understanding of the IBM API Connect architecture and its components.
● Should be clear on the concepts of different protocols (HTTP(s) etc) and RESTful services standards.
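As a rough illustration of the JSON and XML payload handling called out above, the snippet below round-trips one record through both formats. Python, the field names, and the sample data are used purely for illustration; they are not part of this role's stack.

```python
import json
import xml.etree.ElementTree as ET

# A hypothetical transaction record expressed in both payload formats.
record = {"txn_id": "T100", "amount": "2500.00", "currency": "INR"}

# JSON serialization and round-trip.
json_payload = json.dumps(record)
assert json.loads(json_payload) == record

# An equivalent XML document built with the standard library.
root = ET.Element("transaction")
for key, value in record.items():
    ET.SubElement(root, key).text = value
xml_payload = ET.tostring(root, encoding="unicode")

# Parsing the XML back recovers the same fields.
parsed = {child.tag: child.text for child in ET.fromstring(xml_payload)}
assert parsed == record
```

The same mapping exercise, field by field between JSON objects and XML elements, is what message-flow transformations in an integration bus routinely perform.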
Skills
- Strong communication skills, both written and verbal
- Any degree
Who can apply
Candidates who:
- are willing to travel to client sites
- are willing to relocate to Mumbai
Office Location
Cateina Technologies
Vikhroli (West),
Mumbai, Maharashtra 400083

Key Skills Expected:
- C#, ASP.NET MVC, Razor syntax
- .NET Core, WebAPI
- MS SQL, MySQL, PostgreSQL
- HTML / CSS / Bootstrap, JavaScript, jQuery, Angular (at least one resource should know Angular)
- Microsoft Azure / AWS (would be a big advantage)
Role & Responsibilities
- Looking for an enthusiastic .NET Developer with good hands-on project management and client handling experience.
- Excellent development / technical skills on C# / MVC / .NET / WebAPI / SQL
- Good understanding of version control systems: GitHub, GitLab, SVN
- Ready to explore and take up technically challenging tasks
- Ready to dive into new technologies and frameworks
- Knowledge of React.js or Angular would be an advantage
- Requirement Analysis / Gap Analysis; understand the client/user perspective
- Excellent communication skills; ability to express own views and understanding to client/team
Company’s perks:
- 5-day work week.
- Flexible working hours.
- Modern infrastructure and friendly environment.
- Paid leaves and other performance bonuses.
- Festivals, birthdays, work anniversary celebrations and company outings

- You're proficient in React.js and have a strong frontend JavaScript foundation
- Knowing Web3.js integration is a definite plus but not a requirement
- You have a passion for writing code as well as understanding and crafting the ways systems interact
- You have experience deploying to and implementing solutions in AWS
- You believe in the benefits of agile processes and shipping code often
- You are pragmatic and work to coalesce requirements into reasonable solutions that provide value
Responsibilities
- Deploy well-tested, maintainable and scalable software solutions
- Take end-to-end ownership of the technology stack and product
- Collaborate with other engineers to architect scalable technical solutions
- Embrace and improve our standards and processes to reduce friction and unlock efficiency
Current Ecosystem :
ShibaSwap : https://shibaswap.com/#/
Metaverse : https://shib.io/#/
NFTs : https://opensea.io/collection/theshiboshis
Game: Shiba Eternity on iOS and Android
Document, build, and deploy RESTful JSON APIs with continuous uptime and minimal latency.
Roles and Responsibilities:
- Building API services using NodeJS, Express, and related frameworks
- Expert-level understanding of the NodeJS asynchronous runtime
- Expert-level understanding of JavaScript callbacks and closures
- Experience with Postgres, NoSQL, Redis, and Firebase Realtime Database
- Experience with AWS services such as Elastic Beanstalk, CloudFront, S3, EC2, Lambda, API Gateway, SQS, etc.
- Understanding of patterns and techniques for building scalable back-end infrastructure, including caching, rate limiting, authentication, and authorization schemes
- Experience building highly scalable, high-throughput services with millisecond response times
- Experience working in a collaborative team environment
- Excellent communication & interpersonal skills
- Willingness to learn and pick up new technology, along with patience to mentor
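The rate-limiting pattern mentioned above is commonly implemented as a token bucket. The sketch below uses Python purely for illustration (the role itself is NodeJS); the class name, rate, and capacity are hypothetical, and the pattern itself is language-agnostic.

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: a common back-end pattern for
    capping per-client request rates while allowing short bursts."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate          # tokens added per second
        self.capacity = capacity  # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens according to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# A bucket allowing bursts of 3 requests, refilling 1 token per second.
bucket = TokenBucket(rate=1.0, capacity=3)
results = [bucket.allow() for _ in range(5)]  # first 3 pass, rest are throttled
```

In a real service the same check would typically sit in middleware keyed per client (API key or IP), often backed by Redis so the counters are shared across instances.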
NOTE: Candidates from PRODUCT-BASED companies will be preferred. The location is Mumbai, and it is work from office.
Participation in multiple Avaloq Core Banking Platform implementations in various business / technical streams
Ability to develop high level software designs and solutions
Excellent analytical skills and systematic approach to problem solving
Ability to articulate complex technical issues to business stakeholders
Good understanding of core business processes and products in the private banking industry
We are looking for a Java backend developer who would be working on the bleeding edge of technologies. We work primarily with a fully reactive stack powered by Spring WebFlux and Reactive MongoDB repositories in AWS. Our services follow both event-based and workflow-based approaches, depending on the use case.
Responsibilities
- Understand why a particular design was chosen and code accordingly
- Will have to deliver clean, bug-free, unit-tested code with minimal guidance
- Strive for continuous improvement by refactoring and applying best practices
- Learn and adapt to new technologies as necessary.
- Capable of working in both client side and server side technologies
Requirements
- Must have a Bachelor’s degree in computer science or equivalent
- Must have 2-3 years' experience as a software developer
- Must be proficient in Core Java and Spring/Spring Boot. Knowledge of WebFlux is not mandatory.
- Must have developed RESTful services
- Should understand git
Added Bonus
- Experience working with microservices
- Experience with a NoSQL solution like MongoDB
- Experience working with Python
[Job description template: the section headings Job Purpose, Managerial Responsibilities, Functional Responsibilities, and Behavioural Skills, and the interface groups (Sales Team, Research Team, Other LOB / SBU, Editorial / DTP Teams, Compliance / Risk; External: Clients, Media, Fund Managers, Corporates) appear in the source with no accompanying content.]

Your mission is to help lead the team towards creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing and application programming will help your team raise their game, meeting your standards as well as satisfying both business and functional requirements. Your expertise in various technology domains will be counted on to set strategic direction and solve complex and mission-critical problems, internally and externally. Your drive to embrace leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties :
- As a Data Engineer you will be responsible for the development of data pipelines for numerous applications handling all kinds of data: structured, semi-structured & unstructured. Big data knowledge, especially in Spark & Hive, is highly preferred.
- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions
Education level :
- Bachelor's degree in Computer Science or equivalent
Experience :
- Minimum 3+ years of relevant experience on production-grade projects, with hands-on, end-to-end software development
- Expertise in application, data and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes
Proficiency in :
- Modern programming languages like Java, Python, Scala
- Big Data technologies: Hadoop, Spark, Hive, Kafka
- Writing well-optimized SQL queries
- Orchestration and deployment tools like Airflow & Jenkins for CI/CD (Optional)
- Responsible for design and development of integration solutions with Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions
- Knowledge of system development lifecycle methodologies, such as Waterfall and Agile.
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Experience generating physical data models and the associated DDL from logical data models.
- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping, and data rationalization artifacts.
- Experience enforcing data modeling standards and procedures.
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
Skills :
Must Know :
- Core big-data concepts
- Spark - PySpark/Scala
- A data integration tool like Pentaho, NiFi, SSIS, etc. (at least one)
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow
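As a minimal sketch of the "handling of various file formats" point above, here is a pure-Python (non-Spark) example that normalizes CSV and JSON inputs into a single record list before downstream processing. The sample data and helper names are hypothetical; in a real pipeline the same normalization step would typically be a Spark or Pentaho/NiFi ingest stage.

```python
import csv
import io
import json

# Toy records in two common file formats a pipeline might ingest.
csv_text = "id,city\n1,Gurgaon\n2,Mumbai\n"
json_text = '[{"id": "3", "city": "Pune"}]'

def read_csv(text: str) -> list[dict]:
    # csv.DictReader turns each row into a dict keyed by the header row.
    return list(csv.DictReader(io.StringIO(text)))

def read_json(text: str) -> list[dict]:
    return json.loads(text)

# A unified ingest step: normalize both sources into one record list
# with a common schema, regardless of the original file format.
records = read_csv(csv_text) + read_json(json_text)
cities = [r["city"] for r in records]
```

The design point is that format-specific parsing stays at the edge of the pipeline, so every later stage (transformation, validation, loading) sees one uniform record shape.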



