
Qualifications
- Proven experience in sales, preferably in the fintech or technology industry
- Strong knowledge of financial products, services, and industry trends
- Excellent communication and interpersonal skills
- Ability to build and maintain relationships with clients
- Demonstrated track record of meeting or exceeding sales targets
- Self-motivated and driven to achieve results
- Strong analytical and problem-solving skills
- Ability to work independently and as part of a team
- Bachelor's degree in business, finance, or a related field
- Proficiency in using CRM software and sales tools

Growisto - Creating a WIN-WIN ecosystem
At Growisto, we solve complex business problems with simplified solutions that enable digital transformation. As a team, we are obsessively passionate about technology, marketing and data, and see them as opportunities for digital growth. Nothing gives us a bigger kick than boosting our clients' sales and margins!
What do we stand for?
For Clients – Partner to deliver quality solutions to complex business problems through digital transformation.
For Teammates – Create an inclusive environment for growth and give them the opportunity to do their best work.
For Society – Good social impact through our work and policies.
Link to our website - https://www.growisto.com/
Responsibilities -
Project Execution
- Perform various marketing activities to achieve project goals, assisting the Project Owner in providing deliverables to clients
- Develop customised and targeted marketing solutions to maximise the effectiveness of campaigns
- Handle multiple marketing projects at a time
- Keep up to date with the latest marketing strategies and technologies, and help improve existing solution offerings
Client Assistance:
- Understand the client and their business from a holistic perspective
- Prioritize client requirements during project execution and deliver on time, to a high standard
Qualification requirements & preferences -
- Any Graduate with good academics
- Strong written/verbal communication and analytical skills
Why should you consider joining Growisto?
- A challenging role with complete ownership to solve complex business problems
- Exponential growth and continuous learning opportunities
- A collaborative and positive culture - your team will be as smart, helpful and driven as you are
- Mentorship and quick loops of feedback to reflect and improve constantly
- An opportunity to make an impact - Your work will contribute directly to our strategy & clients’ growth

- Big Data developer with 8+ years of professional IT experience, with expertise in Hadoop ecosystem components for ingestion, data modeling, querying, processing, storage, analysis and data integration, and in implementing enterprise-level Big Data systems.
- A skilled developer with strong problem-solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark Core, Spark SQL and DataFrames, Kafka, Spark Streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, including the NameNode, DataNode, ResourceManager, NodeManager and Job History Server.
- Worked on both the Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in the installation, configuration and management of Big Data and the underlying infrastructure of a Hadoop cluster.
- Hands-on experience in coding MapReduce/YARN programs in Java, Scala and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark with Scala on clusters for analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark jobs in Python using the DataFrame and Spark SQL APIs for faster data processing; imported data from different sources into HDFS using Sqoop, performed transformations using Hive and MapReduce, and loaded the results into HDFS (a PySpark sketch of this pattern follows this list).
- Used the Spark DataFrames API on the Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib for predictive analytics and customer segmentation, including within Spark Streaming applications (a segmentation sketch also follows this list).
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Worked on creating data pipelines for ingestion and aggregation, loading consumer response data into Hive external tables on HDFS to serve as a feed for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience with the Microsoft cloud and with setting up clusters on Amazon EC2 & S3, including automating the provisioning and scaling of clusters in the AWS cloud.
- Extensively worked on Spark with Python on clusters for analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real-time data analytics using Spark Streaming, Kafka and Flume (see the streaming sketch after this list).
- Knowledge of installing, configuring, supporting and managing Hadoop clusters using Apache and Cloudera (CDH3, CDH4) distributions and on Amazon Web Services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala, including Impala analytical functions.
- Experience in creating DataFrames using PySpark and performing operations on them in Python.
- In-depth understanding of Hadoop architecture and its components, such as HDFS, the MapReduce programming paradigm, High Availability and the YARN architecture.
- Established connections to multiple Redshift clusters (Bank Prod, Card Prod, SBBDA) and provided access for pulling the information needed for analysis.
- Generated various knowledge reports using Power BI based on business specifications.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Well experienced in projects using JIRA, testing, and the Maven and Jenkins build tools.
- Experienced in designing, building, deploying and utilizing almost all of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance and auto-scaling.
- Good experience with use-case development and with software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience in importing data using Sqoop and SFTP from various sources like RDBMS, Teradata, mainframes, Oracle and Netezza into HDFS, and performing transformations on it using Hive, Pig and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra and MongoDB, and their integration with a Hadoop cluster.
- Hands-on experience in Hadoop Big Data technology, working with MapReduce, Pig and Hive as analysis tools, and Sqoop and Flume as data import/export tools.
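
The Hive/Spark SQL bullets above describe a common batch pattern: read a Hive table as a DataFrame, aggregate it, and write the result to HDFS as a dashboard feed. A minimal PySpark sketch of that pattern; the table name, column names and output path are hypothetical stand-ins, not taken from the profile:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hive support lets spark.table() and spark.sql() see the Hive metastore.
spark = (SparkSession.builder
         .appName("consumer-response-feed")
         .enableHiveSupport()
         .getOrCreate())

# Read a Hive table as a DataFrame (hypothetical table name).
responses = spark.table("consumer_response")

# Equivalent of a Hive GROUP BY query, expressed with the DataFrame API.
daily = (responses
         .groupBy("event_date", "campaign_id")
         .agg(F.count("*").alias("events"),
              F.avg("score").alias("avg_score")))

# Write to an HDFS location that a Tableau feed or external table can read.
(daily.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("hdfs:///data/feeds/tableau/consumer_response_daily"))
```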
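For the real-time bullets (Kafka plus Spark Streaming), a second sketch using the Structured Streaming API rather than the older DStream API. The broker address, topic name and JSON schema are placeholders, and running it requires the spark-sql-kafka connector package on the classpath:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StringType, TimestampType

spark = SparkSession.builder.appName("clickstream-demo").getOrCreate()

# Schema of the JSON messages -- a hypothetical clickstream payload.
schema = (StructType()
          .add("user_id", StringType())
          .add("action", StringType())
          .add("ts", TimestampType()))

# Read from Kafka; broker address and topic name are placeholders.
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "clicks")
       .load())

# Kafka values arrive as bytes; cast and parse them into typed columns.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(F.from_json("json", schema).alias("e"))
          .select("e.*"))

# Count actions per 1-minute window and stream the results to the console.
counts = (events
          .withWatermark("ts", "5 minutes")
          .groupBy(F.window("ts", "1 minute"), "action")
          .count())

query = (counts.writeStream
         .outputMode("update")
         .format("console")
         .start())
query.awaitTermination()
```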
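And for the MLlib bullet, a small customer-segmentation sketch with KMeans; the feature columns and values are invented for illustration (a real job would read them from Hive or HDFS):

```python
from pyspark.ml.clustering import KMeans
from pyspark.ml.feature import VectorAssembler
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("segmentation-demo").getOrCreate()

# Hypothetical customer features, inlined so the sketch runs standalone.
customers = spark.createDataFrame(
    [(1, 120.0, 4), (2, 15.0, 1), (3, 300.0, 9), (4, 42.0, 2)],
    ["customer_id", "monthly_spend", "visits"])

# MLlib estimators expect a single vector column of features.
assembler = VectorAssembler(inputCols=["monthly_spend", "visits"],
                            outputCol="features")
features = assembler.transform(customers)

# Cluster customers into k=2 segments and attach the segment labels.
model = KMeans(k=2, seed=42, featuresCol="features").fit(features)
segments = model.transform(features).select("customer_id", "prediction")
segments.show()
```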
Accounting and Compliances
- Responsible for the day-to-day accounting function of the organization
- Responsible for monthly/ yearly book closing
- Responsible for all statutory compliances including TDS, RBI, MCA etc.
- Ensuring adequate controls are in place
- Managing the payroll process
Lender Management
- Accurately calculating the billing amounts
- Managing the invoicing & collection process
Financial Planning and Analysis
- Responsible for the weekly/monthly MIS Reporting – internal as well as investor reporting
- Analysing the variances from the forecasted numbers
Taxation
- Managing the Corporate taxes including the tax audit
- Managing the indirect taxes, including Goods and Services Tax (GST)
Treasury
- Monitoring the day-to-day cash and bank movements to calculate the free cash flows

Designation: Graphics and Simulation Engineer
Experience: 3-15 Yrs
Position Type: Full Time
Position Location: Hyderabad
Description:
We are looking for engineers to work on applied research problems related to computer graphics in the autonomous driving of electric tractors. The team works towards creating a universe of farm environments in which tractors can drive around for the purposes of simulation, synthetic data generation for deep-learning training, simulation of edge cases, and physics modelling.
Technical Skills:
● Background in OpenGL, OpenCL, graphics algorithms and optimization is necessary.
● Solid theoretical background in computational geometry and computer graphics is desired. Deep learning background is optional.
● Experience in two-view and multi-view geometry (see the sketch after this list).
● Necessary Skills: Python, C++, Boost, OpenGL, OpenCL, Unity3D/Unreal, WebGL, CUDA.
● For freshers, academic experience in graphics is also preferred.
● Experienced candidates in Computer Graphics with no prior Deep Learning experience willing to apply their knowledge to vision problems are also encouraged to apply.
● Software development experience on low-power embedded platforms is a plus.
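
Since the role asks for two-view geometry experience, here is a minimal sketch of the classic pipeline: estimate an essential matrix from point correspondences, then recover the relative camera pose. It assumes OpenCV and NumPy, which the posting does not name, and uses a synthetic scene with known ground-truth motion so it runs standalone:

```python
import numpy as np
import cv2

# Hypothetical pinhole intrinsics -- illustrative values only.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

# Synthetic scene: random 3D points in front of both cameras.
rng = np.random.default_rng(0)
pts3d = rng.uniform([-1, -1, 4], [1, 1, 8], size=(100, 3))

# Camera 1 at the origin; camera 2 translated along x and slightly rotated.
R_true, _ = cv2.Rodrigues(np.array([0.0, 0.1, 0.0]))
t_true = np.array([[1.0], [0.0], [0.0]])

def project(points, R, t):
    cam = points @ R.T + t.ravel()  # world -> camera coordinates
    uv = cam @ K.T                  # apply intrinsics
    return (uv[:, :2] / uv[:, 2:]).astype(np.float64)

pts1 = project(pts3d, np.eye(3), np.zeros(3))
pts2 = project(pts3d, R_true, t_true)

# Estimate the essential matrix with RANSAC and decompose it into R, t.
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                               prob=0.999, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

print("Recovered rotation:\n", R)
print("Recovered translation direction:\n", t.ravel())  # up to scale
```

In a real system the correspondences would come from a feature matcher rather than a synthetic projection, and the recovered translation is only defined up to scale.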
Responsibilities:
● Strong grasp of engineering principles and a clear understanding of data structures and algorithms.
● Ability to understand, optimize and debug imaging algorithms.
● Ability to drive a project from conception to completion, from research papers to code, with a disciplined approach to software development on the Linux platform.
● Demonstrate outstanding ability to perform innovative and significant research in the form of technical papers, thesis, or patents.
● Optimize runtime performance of designed models.
● Deploy models to production and monitor performance and debug inaccuracies and exceptions.
● Communicate and collaborate with team members in India and abroad for the fulfillment of your duties and organizational objectives.
● Thrive in a fast-paced environment and have the ability to own a project end to end with minimal hand-holding.
● Learn & adapt new technologies & skillsets
● Work on projects independently, with timely delivery and a defect-free approach.
● Candidates whose thesis focuses on the above skill set may be given preference.


👋🏼We're Nagarro.
We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale — across all devices and digital mediums, and our people exist everywhere in the world (19000+ experts across 33 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in.
REQUIREMENTS:
· Relevant expertise in C#, HTML5, CSS, JavaScript and Unit testing.
· Should be able to work with any one of the following databases: SQL Server, MySQL or Oracle.
· Should have worked with the .NET Framework, .NET Core, ASP.NET MVC and object-oriented programming (OOP).
· Should have worked on ASP.NET MVC or ASP.NET Web Forms.
· Should have good exposure to any three or more of: Web API, concurrent design and multithreading, REST, Entity Framework, OOP, cloud development, microservices architecture (MSA).
· Passionate about writing world class code.
· High level of commitment to client satisfaction and agility.
· Collaborate with others and build positive working relationships.
· Possess a strong work ethic.
RESPONSIBILITIES:
· Analyzing the project’s requirements and converting them into technical documents, designs and code.
· Implementing design methodologies and tool sets.
· Writing well-designed, defect free code which scales well and follows all best practices and guidelines.
· Executing the development of software with a strong focus on security, performance and robustness.
· Conducting deep level analysis to identify root cause to systematically resolve issues.
· Following all defined software configuration management best practices.
· Reviewing code to identify issues as well as deviations from best practices.
· Addressing issues promptly, responding positively to setbacks and challenges with a mindset of continuous improvement.

2. Control panel designing
3. Site commissioning and installation
4. FAT/SAT
5. Documentation (FDS, DDS, IO assignment, heat load calculation)

Also, co-existence interface between Oracle Apps HRMS and Fusion HCM.
Oracle Apps HRMS R12 and Oracle Fusion HCM Cloud.
Oracle Global Human Resources Cloud 2021 Implementation Essentials; Payroll and Compensation modules.
Desired Candidate Profile
1. Experience in online bidding on various portals like Elance, Guru, Freelancer, oDesk, Upwork, etc.
2. Expert in generating business from online bidding; writing proposals, costing and negotiation.
3. Prospect for potential new clients and convert them into increased business.
4. Candidate should be flexible in working hours.
5. Generate new business leads for Web design, graphic design, web development & internet marketing.




