
- Filter criteria:
- 7-8 years or more of experience in Corporate Banking or allied areas (Mid-market, Commercial, Trade)
- Corporate relationship and sales ownership experience
- Understanding of Corporate/Commercial/SME/Trade Finance
- The individual needs to be self-driven, capable of working with minimum supervision
- This will be an acquisition and execution role, hence proven competency in both will be essential
- Job Role:
- Responsible for bringing in new client acquisitions and driving closure on mandates for the region
- Understanding the key offerings of the company platform and applicability across industries
- Marketing SCF solutions to Corporates and their Supply Chains
- Ability to engage with a wide set of stakeholders on the Corporate side and identify opportunities
- Engaging with the FI partners to activate and grow financing on the Network
- Working closely with the India Corporate Sales - Head and the Network Sales teams to achieve the KRAs for the region

Similar jobs
Chartered Accountant (CA) with 5-6 years of post-qualification experience.
Prior experience in startups or high-growth environments preferred.
Strong command over Indian GAAP, financial reporting, and tax compliance.
Hands-on experience with accounting tools and ERP systems (e.g., Tally, Zoho, QuickBooks).
Proven ability to manage fundraising support processes (valuation models, investor decks, data rooms).
Job Title : Python Django Developer
Location : Gurgaon (On-site)
Work Mode : 6 Days a Week (Work from Office)
Experience Level : 3+ Years
About the Role :
We are seeking a highly skilled and motivated Python Django Developer to join our team in Gurgaon. This role requires a hands-on developer with expertise in building scalable web applications and APIs using Python and Django. The ideal candidate will have a strong background in relational databases, message brokers, and distributed systems.
Key Responsibilities :
- Design, develop, and maintain robust, scalable, and secure web applications using Python and Django.
- Build and optimize back-end services, RESTful APIs, and integrations with third-party tools.
- Implement and maintain asynchronous task processing using Celery and RabbitMQ.
- Work with PostgreSQL to design and optimize database schemas and queries.
- Utilize Redis and Kafka for caching, data streaming, and other distributed system needs.
- Debug and troubleshoot issues across the application stack.
- Collaborate with cross-functional teams to gather requirements and deliver solutions.
- Ensure code quality through comprehensive testing, code reviews, and adherence to best practices.
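For candidates unfamiliar with the asynchronous task processing mentioned above, the following is a stdlib-only sketch of the producer/worker shape that Celery and RabbitMQ provide. It is illustrative, not actual Celery code: real tasks would be declared with `@app.task` and dispatched with `.delay()` through a broker.

```python
import queue
import threading

# Toy stand-in for a broker-backed task queue: producers enqueue
# payloads, a background worker consumes and processes them.
task_queue = queue.Queue()
results = []

def worker():
    while True:
        payload = task_queue.get()
        if payload is None:          # sentinel: shut the worker down
            task_queue.task_done()
            break
        results.append(payload * 2)  # pretend "task": double the input
        task_queue.task_done()

t = threading.Thread(target=worker)
t.start()

for n in (1, 2, 3):
    task_queue.put(n)                # "publish" tasks to the queue
task_queue.put(None)                 # signal shutdown
task_queue.join()                    # block until every task is done
t.join()

print(results)  # → [2, 4, 6]
```

The key property this mirrors is that the caller does not block on each task; work is handed off and processed independently, which is what Celery workers do across processes and machines.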
Required Skills and Qualifications:
Technical Expertise:
- Proficiency in Python and strong experience with Django framework.
- Hands-on experience with PostgreSQL for database design and management.
- Familiarity with RabbitMQ, Celery, and Redis for asynchronous processing and caching.
- Experience with Kafka for building real-time data pipelines and event-driven architectures.
Other Skills:
- Strong understanding of software development best practices and design patterns.
- Proficiency in writing efficient, reusable, and testable code.
- Good knowledge of Linux/Unix environments.
- Familiarity with Docker and containerized deployments is a plus.
Soft Skills:
- Excellent problem-solving and analytical skills.
- Good communication and teamwork abilities.
- Ability to work independently and in a collaborative team environment.
Preferred Qualifications:
- Experience in microservices architecture.
- Exposure to DevOps tools and practices.
- Knowledge of front-end technologies like React or Angular is a bonus.
What You’ll Be Doing:
● Own the architecture and roadmap for scalable, secure, and high-quality data pipelines
and platforms.
● Lead and mentor a team of data engineers while establishing engineering best practices,
coding standards, and governance models.
● Design and implement high-performance ETL/ELT pipelines using modern Big Data
technologies for diverse internal and external data sources.
● Drive modernization initiatives including re-architecting legacy systems to support
next-generation data products, ML workloads, and analytics use cases.
● Partner with Product, Engineering, and Business teams to translate requirements into
robust technical solutions that align with organizational priorities.
● Champion data quality, monitoring, metadata management, and observability across the
ecosystem.
● Lead initiatives to improve cost efficiency, data delivery SLAs, automation, and
infrastructure scalability.
● Provide technical leadership on data modeling, orchestration, CI/CD for data workflows,
and cloud-based architecture improvements.
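As a concrete, stdlib-only illustration of one ETL step like those the responsibilities above describe: extract rows from a CSV source, transform (type conversion plus aggregation), and load to JSON. Column names here are invented for illustration; a production pipeline at this scale would use Spark/Databricks rather than pure Python.

```python
import csv
import io
import json

raw = """event,count
login,3
purchase,2
login,5
"""

# Extract: parse the raw CSV into dict rows
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast counts to int and aggregate per event type
totals = {}
for row in rows:
    totals[row["event"]] = totals.get(row["event"], 0) + int(row["count"])

# Load: serialize the aggregate for a downstream consumer
output = json.dumps(totals, sort_keys=True)
print(output)  # → {"login": 8, "purchase": 2}
```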
Qualifications:
● Bachelor's degree in Engineering, Computer Science, or relevant field.
● 8+ years of relevant and recent experience in a Data Engineer role.
● 5+ years recent experience with Apache Spark and solid understanding of the
fundamentals.
● Deep understanding of Big Data concepts and distributed systems.
● Demonstrated ability to design, review, and optimize scalable data architectures across
data ingestion.
● Strong coding skills in Scala and Python, with the ability to quickly switch between them
with ease.
● Advanced working SQL knowledge and experience working with a variety of relational
databases such as Postgres and/or MySQL.
● Cloud experience with Databricks.
● Strong understanding of Delta Lake architecture and working with Parquet, JSON, CSV,
and similar formats.
● Experience establishing and enforcing data engineering best practices, including CI/CD
for data, orchestration and automation, and metadata management.
● Comfortable working in an Agile environment.
● Machine Learning knowledge is a plus.
● Demonstrated ability to operate independently, take ownership of deliverables, and lead
technical decisions.
● Excellent written and verbal communication skills in English.
● Experience supporting and working with cross-functional teams in a dynamic
environment.
REPORTING: This position will report to Sr. Technical Manager or Director of Engineering as
assigned by Management.
EMPLOYMENT TYPE: Full-Time, Permanent
SHIFT TIMINGS: 10:00 AM - 07:00 PM IST
Candidates should have around 4-5 years of advertising sales
experience, of which around 2 years should be in digital,
with an understanding of concept selling. Experience of
working with publishers would be an added advantage.
What we Seek from You :-
Hands-on expertise with brands and digital agencies, and clarity of concept
selling. Someone who works smartly and hard and is focused on achieving results and
targets. Excellent communication skills, both written and verbal; should be
able to pitch and present effectively. Has a good network and database within the
industry. Should be a self-starter and a team player. Willing to work with a startup
and bring innovative strategies.
Job Responsibilities :-
1) The Business Development Manager will be responsible for growing new
revenues at an individual level and contributing to the broader business
goals.
2) This person will own their targets; possess an entrepreneurial attitude that
drives market understanding and growth through aggressive networking,
selling and alignment of customer needs and internal expertise to bring
about innovation and opportunity.
3) Responsible for pipeline/opportunity management and closing deals, from
bootstrap to execution to successful launch.
4) Developing new opportunities for promoting the organisation’s products and
services.
5) Create a very aggressive ‘outreach’ program that will pitch our network to
advertisers, brands, agencies and intermediaries for direct sales.
6) Analyze and report program performance; provide results and
recommendations for improvement and new programs.
1. Flink Sr. Developer
Location: Bangalore (WFO)
Mandatory Skills & Experience (10+ years): Must have hands-on experience with Flink, Kubernetes, Docker, Microservices, any one of Kafka/Pulsar, CI/CD, and Java.
Job Responsibilities:
As the Data Engineer lead, you are expected to engineer, develop, support, and deliver real-time
streaming applications that model real-world network entities, and to have a good understanding of
Telecom Network KPIs to improve the customer experience through automation of operational network
data. Real-time application development will include building stateful in-memory backends and real-time
streaming APIs, leveraging real-time databases such as Apache Druid.
- Architecting and creating the streaming data pipelines that will enrich the data and support the use cases for telecom networks.
- Collaborating closely with multiple stakeholders, gathering requirements and seeking iterative feedback on recently delivered application features.
- Participating in peer review sessions to provide teammates with code review as well as architectural and design feedback.
- Composing detailed low-level design documentation, call flows, and architecture diagrams for the solutions you build.
- Responding quickly whenever the Operations team needs help.
- Performing duties with minimum supervision and participating in cross-functional projects as scheduled.
Skills:
- Flink Sr. Developer who has implemented and dealt with failure scenarios when processing data through Flink.
- Experience with Java, K8s, Argo CD/Workflow, Prometheus, and Aether.
- Familiarity with object-oriented design patterns.
- Experience with application development DevOps tools.
- Experience with distributed cloud-native application design deployed on Kubernetes platforms.
- Experience with Postgres, Druid, and Oracle databases.
- Experience with a messaging bus - Kafka/Pulsar.
- Experience with AI/ML - Kubeflow, JupyterHub.
- Experience with building real-time applications which leverage streaming data.
- Experience with streaming message bus platforms, either Kafka or Pulsar.
- Experience with Apache Spark applications and Hadoop platforms.
- Strong problem-solving skills.
- Strong written and oral communication skills.
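For context on the stateful stream processing this role centers on, here is a stdlib-only Python sketch of the idea behind a Flink keyed tumbling window: events carry a key and a timestamp, and per-key counts are emitted each time a window closes. The field names and window size are illustrative; a real job would use Flink's DataStream API, typically in Java, with watermarks for out-of-order events.

```python
from collections import defaultdict

WINDOW = 10  # window length in seconds (illustrative)

def tumbling_counts(events):
    """Yield (window_start, key, count) as each window closes.

    `events` must arrive in timestamp order; this sketch has no
    watermark handling for late or out-of-order data.
    """
    state = defaultdict(int)   # per-key counts for the open window
    window_start = None
    for key, ts in events:
        start = (ts // WINDOW) * WINDOW
        if window_start is None:
            window_start = start
        if start != window_start:          # window closed: flush state
            for k, c in sorted(state.items()):
                yield (window_start, k, c)
            state.clear()
            window_start = start
        state[key] += 1
    for k, c in sorted(state.items()):     # flush the final open window
        yield (window_start, k, c)

stream = [("cell-A", 1), ("cell-B", 4), ("cell-A", 7), ("cell-A", 12)]
print(list(tumbling_counts(stream)))
# → [(0, 'cell-A', 2), (0, 'cell-B', 1), (10, 'cell-A', 1)]
```

The per-key `state` dict is the in-memory analogue of Flink's keyed state; Flink additionally checkpoints that state so it survives the failure scenarios the skills list mentions.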
Attention all Graphic Designing Enthusiasts!
Kenaxs IT Private Limited is hiring a Graphic Designer in Vadodara. No matter your experience level, if you know how to use CorelDRAW, Illustrator (AI), and Photoshop and are willing to work in an agency culture, we want you! Candidates should also be open to learning new design software.
Salary ranges from 15-25k, and immediate joining (0-15 days) is available.
Job description for Backend Developer:
If interested, you can forward your resume to (ambikadotjsemperfidotcodotin)
For more details, contact 8a3a1a0a8a4a4a6a8a2
• Hands-on experience in Spring Boot and good knowledge of MySQL and MongoDB
• Hands-on experience in building RESTful APIs
• Develop and manage well-functioning databases and applications
• Write effective APIs
• Integration of user-facing elements developed by front-end developers with server-side logic
• Building reusable code and libraries for future use
• Optimization of the application for maximum speed and scalability
• Implementation of security and data protection
