
- Getting candidates registered on the tool.
- Releasing bulk offers within TAT (turnaround time).
- Closing salary discussions with selected candidates.
- Ensuring timely onboarding.
- Submission of back papers.
- Tracking BGC (background check) clearance and getting any insufficiencies cleared.
- Work from office.

About Chaitanya India Fin Credit
Roles and Responsibilities
- Actively prospect and acquire new clients through various sales and marketing techniques, including cold calling, networking, and referrals.
- Build and maintain strong relationships with clients to understand their trading needs and provide tailored investment solutions.
- Execute trades on behalf of clients and ensure timely and accurate order execution.
- Stay updated on market trends, economic indicators, and geopolitical events to provide clients with informed trading recommendations.
- Collaborate with the research and analysis team to develop market insights and trading strategies.
- Provide exceptional customer service and support to clients throughout the trading process.
- Meet and exceed sales targets and performance metrics.
Desired Candidate Profile
- Bachelor's degree in Finance, Business, Economics, or related field.
- Proven track record of success in client acquisition and sales within the financial services industry, preferably in FX trading.
- Strong understanding of financial markets, trading instruments, and risk management principles.
- Excellent communication, negotiation, and interpersonal skills.
- Ability to thrive in a fast-paced, competitive environment.
Perks and Benefits
- Desired Incentives.
- Overseas Trip by Company.
- Monthly Contest & Trophies.
- Quarterly and Annual Rewards and Recognition.
Power BI Developer
Senior visualization engineer with 5 years' experience in Power BI to develop and deliver solutions that deliver information to audiences in support of key business processes. In addition, hands-on experience with Azure data services such as ADF and Databricks is a must.
Ensure code and design quality through execution of test plans and assist in development of standards & guidelines working closely with internal and external design, business, and technical counterparts.
Candidates should have worked in agile development environments.
Desired Competencies:
- Should have a minimum of 3 years' project experience using Power BI on the Azure stack.
- Should have a good understanding and working knowledge of data warehousing and data modelling.
- Good hands-on experience with Power BI.
- Hands-on experience with T-SQL, DAX, MDX, and SSIS.
- Data warehousing on SQL Server (preferably 2016).
- Experience with Azure data services: ADF, Databricks, and PySpark.
- Able to manage own workload with minimum supervision.
- Takes responsibility for assigned projects and issues.
- Personable, flexible, and a team player.
- Good written and verbal communication skills.
- Confident and able to work directly with users.
- Experience building applications using Node.js and frameworks such as Express.
- Thorough understanding of React.js and Node.js, including their core principles.
- Ability to understand business requirements and translate them into technical requirements.
- Familiarity with code versioning tools (such as Git, SVN, and Mercurial).
- Understanding of the nature of asynchronous programming and its quirks and workarounds.
- Strong experience with MongoDB and Postgres.
- Highly proficient with the Vue.js framework and its core principles, such as components, reactivity, and the virtual DOM.
- Familiarity with the Vue.js ecosystem, including Vue CLI, Vuex, and Vue Router.
- Good understanding of HTML5, CSS3, and Sass.
- Understanding of server-side rendering and its benefits and use cases.
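The data-warehouse and data-modelling items above can be illustrated with a toy star-schema aggregation. This is a sketch in plain Python with invented table names and rows; in practice the model would live in SQL Server and be surfaced through Power BI:

```python
# Hypothetical dimension and fact tables, modelled as lists of dicts.
dim_product = [
    {"product_id": 1, "category": "Hardware"},
    {"product_id": 2, "category": "Software"},
]
fact_sales = [
    {"product_id": 1, "amount": 100.0},
    {"product_id": 1, "amount": 50.0},
    {"product_id": 2, "amount": 200.0},
]

def total_sales_by_category(facts, dims):
    """Join the fact table to the product dimension and sum amounts per category."""
    category_of = {d["product_id"]: d["category"] for d in dims}
    totals = {}
    for row in facts:
        cat = category_of[row["product_id"]]
        totals[cat] = totals.get(cat, 0.0) + row["amount"]
    return totals

print(total_sales_by_category(fact_sales, dim_product))
# {'Hardware': 150.0, 'Software': 200.0}
```

The same fact-to-dimension join is what a DAX measure over a star schema computes declaratively.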

Job Description:
We are looking for a Big Data Engineer who has worked across the entire ETL stack: someone who has ingested data in both batch and live-stream formats, transformed large volumes of data daily, built data warehouses to store the transformed data, and integrated different visualization dashboards and applications with the data stores. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them.
Responsibilities:
- Develop, test, and implement data solutions based on functional / non-functional business requirements.
- You would be required to code in Scala and PySpark daily, on cloud as well as on-prem infrastructure.
- Build data models to store the data in the most optimized manner.
- Identify, design, and implement process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Implementing the ETL process and optimal data pipeline architecture
- Monitoring performance and advising any necessary infrastructure changes.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Proactively identify potential production issues and recommend and implement solutions
- Must be able to write quality code and build secure, highly available systems.
- Create design documents that describe the functionality, capacity, architecture, and process.
- Review peers' code and pipelines before deploying to production, checking for optimization issues and adherence to code standards.
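The extract-transform-load flow these responsibilities describe can be sketched in a few lines. This is plain Python with invented record shapes purely for illustration; the actual work in this role would be done in Scala/PySpark:

```python
# Minimal batch ETL sketch: parse raw lines, validate/cast, append to a store.

def extract(raw_lines):
    """Parse raw CSV-like lines into records."""
    return [dict(zip(("user", "amount"), line.split(","))) for line in raw_lines]

def transform(records):
    """Cast types and drop malformed rows."""
    out = []
    for r in records:
        try:
            out.append({"user": r["user"].strip(), "amount": float(r["amount"])})
        except (KeyError, ValueError):
            continue  # skip rows that fail validation
    return out

def load(records, warehouse):
    """Append validated records to an in-memory 'warehouse' table."""
    warehouse.extend(records)
    return warehouse

warehouse = []
raw = ["alice,10.5", "bob,oops", "carol,3"]
load(transform(extract(raw)), warehouse)
print(warehouse)  # bob's malformed row is dropped during transform
```

Each stage is a pure function over records, which is the same shape a Spark job takes when the lists become distributed DataFrames.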
Skill Sets:
- Good understanding of optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and ‘big data’ technologies.
- Proficient understanding of distributed computing principles
- Experience working with batch-processing/real-time systems using various open-source technologies such as NoSQL stores, Spark, Pig, Hive, and Apache Airflow.
- Has implemented complex projects dealing with considerable data sizes (petabyte scale).
- Optimization techniques (performance, scalability, monitoring, etc.)
- Experience with integration of data from multiple data sources
- Experience with NoSQL databases, such as HBase, Cassandra, MongoDB, etc.
- Knowledge of various ETL techniques and frameworks, such as Flume
- Experience with various messaging systems, such as Kafka or RabbitMQ
- Creation of DAGs for data engineering
- Expert at Python/Scala programming, especially for data engineering/ETL purposes.
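The "creation of DAGs" item can be illustrated with a toy dependency graph and a topological ordering, which is essentially what schedulers like Airflow compute before running tasks. Task names here are hypothetical; the ordering uses Kahn's algorithm in plain Python:

```python
# Toy DAG of pipeline tasks: each task lists the tasks it depends on.
deps = {
    "extract": [],
    "transform": ["extract"],
    "load": ["transform"],
    "report": ["load", "transform"],
}

def topo_order(deps):
    """Return tasks in an order that respects dependencies (Kahn's algorithm)."""
    remaining = {t: set(d) for t, d in deps.items()}
    order = []
    while remaining:
        ready = [t for t, d in remaining.items() if not d]
        if not ready:
            raise ValueError("cycle detected")  # not a valid DAG
        for t in sorted(ready):  # sort for deterministic output
            order.append(t)
            del remaining[t]
        for d in remaining.values():
            d.difference_update(ready)
    return order

print(topo_order(deps))  # ['extract', 'transform', 'load', 'report']
```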
Urgent requirement for Java developer for a product-based company.
Apply only if you have experience in data structure and algorithms
Please note the notice period: we are looking for candidates who can join immediately or within 15 days only.
Experience: 3+ years
Location: Bangalore
Notice Period: Immediate to 15 days
Working: currently WFH
Skills: Core Java, Hibernate, Spring, data structures, Spring Boot/microservices

Responsibilities:
- Develop REST/JSON APIs; design code for high scale, availability, and resiliency.
- Develop responsive web apps and integrate APIs using NodeJS.
- Presenting chat-efficiency reports to senior management.
- Develop system flow diagrams to automate a business function and identify impacted systems; metrics to depict the cost benefit analysis of the solutions developed.
- Work closely with business operations to convert requirements into system solutions and collaborate with development teams to ensure delivery of highly scalable and available systems.
- Using tools to classify/categorize chats based on intents, and computing an F1 score for chat analysis.
- Experience analyzing real agent chat conversations to train the chatbot.
- Developing Conversational Flows in the chatbot
- Calculating Chat efficiency reports.
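The F1 score mentioned above is the harmonic mean of precision and recall. A minimal sketch with hypothetical chat-classification counts:

```python
def f1_score(true_positives, false_positives, false_negatives):
    """F1 = harmonic mean of precision and recall."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return 2 * precision * recall / (precision + recall)

# e.g. 80 chats correctly tagged with an intent, 10 wrongly tagged, 20 missed:
print(round(f1_score(80, 10, 20), 3))  # 0.842
```

In practice these counts come from comparing the classifier's intent labels against human-labelled chats.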
Good to Have:
- Monitors performance and quality-control plans to identify performance gaps.
- Works on problems of moderate and varied complexity where analysis of data may require adaptation of standardized practices.
- Works with management to prioritize business and information needs.
- Identifies, analyzes, and interprets trends or patterns in complex data sets.
- Ability to manage multiple assignments.
- Understanding of ChatBot Architecture.
- Experience of Chatbot training


- Must have working experience with a PHP framework (CodeIgniter / Laravel / Yii / Symfony / OpenCart / Joomla / WordPress).
- Should be strong in jQuery.
- Good knowledge of relational databases (MySQL, query optimization), version control tools, and developing web services.
- Hands-on experience in design, development, module handling, and architecture.
- Able to work independently.
Walk-in: Votive Technologies, Indore
308, Shrivardhan Complex,
4 RNT Marg, Indore (MP), India 452001.


