
Job Description
The ideal candidate should have a minimum of 8 years of experience in SAP ABAP.
Should be well versed in Reports, Interfaces, Conversions, Enhancements, Forms, and User Exits.
Should be good at stakeholder management.
Should have excellent verbal and written communication skills.
Should be good at project management.
Knowledge of SAP S/4HANA is required.

About the company
KPMG International Limited, commonly known as KPMG, is one of the largest professional services networks in the world, recognized as one of the "Big Four" accounting firms alongside Deloitte, PricewaterhouseCoopers (PwC), and Ernst & Young (EY). KPMG provides a comprehensive range of professional services primarily focused on three core areas: Audit and Assurance, Tax Services, and Advisory Services. Their Audit and Assurance services include financial statement audits, regulatory audits, and other assurance services. The Tax Services cover various aspects such as corporate tax, indirect tax, international tax, and transfer pricing. Meanwhile, their Advisory Services encompass management consulting, risk consulting, deal advisory, and other related services.
Apply through this link for a quicker response: https://forms.gle/aSyXcxVNzQptbWt9A
Job Description
Position: ML Engineer
Experience: 4+ years of relevant experience
Location: Pune – Kharadi (WFO, 3 days a week)
Employment Type: Contract for 3–5 months; can be extended based on performance and future requirements
Skills Required:
• Building and maintaining pipelines for model development, testing, deployment, and monitoring.
• Automating repetitive tasks such as model re-training, hyperparameter tuning, and data validation.
• Developing CI/CD pipelines for seamless code migration.
• Collaborating with cross-functional teams to ensure proper integration of models into production systems.
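The automation responsibilities above (e.g. data validation before re-training) can be sketched with a minimal example. This is an illustrative, stdlib-only sketch, not part of the job description: the schema, field names, and bounds are all assumed.

```python
"""Minimal sketch of an automated data-validation step in an ML pipeline.
Schema, field names, and bounds are illustrative assumptions."""

EXPECTED_SCHEMA = {"age": float, "income": float, "label": int}
BOUNDS = {"age": (0, 120), "income": (0, 10_000_000)}

def validate_record(record: dict) -> list:
    """Return a list of validation errors for one input record."""
    errors = []
    for field, expected_type in EXPECTED_SCHEMA.items():
        if field not in record:
            errors.append(f"missing field: {field}")
            continue
        if not isinstance(record[field], expected_type):
            errors.append(f"{field}: expected {expected_type.__name__}")
            continue
        if field in BOUNDS:
            lo, hi = BOUNDS[field]
            if not (lo <= record[field] <= hi):
                errors.append(f"{field}: {record[field]} outside [{lo}, {hi}]")
    return errors

def validate_batch(records):
    """Split a batch into clean rows and rejected rows with their reasons."""
    clean, rejected = [], []
    for rec in records:
        errs = validate_record(rec)
        (clean if not errs else rejected).append((rec, errs))
    return clean, rejected
```

In a real pipeline, a step like this would gate model re-training so malformed rows never reach the trainer.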
Key Skills
• 3+ years of experience in developing and deploying ML models in production.
• Strong programming skills in Python (with familiarity in Bash/Shell scripting).
• Hands-on experience with tools like Docker, Kubernetes, MLflow, or Airflow.
• Knowledge of cloud services such as AWS SageMaker or equivalent.
• Familiarity with DevOps principles and tools like Jenkins, Git, or Terraform.
• Understanding of versioning systems for data, models, and code.
• Solid understanding of MLflow, ML services, model monitoring, and enabling logging services for performance tracking.
Job Title: Credit Risk Analyst
Company: FatakPay FinTech
Location: Mumbai, India
Salary Range: INR 8 - 15 Lakhs per annum
Job Description:
FatakPay, a leading player in the fintech sector, is seeking a dynamic and skilled Credit Risk Analyst to join our team in Mumbai. This position is tailored for professionals who are passionate about leveraging technology to enhance financial services. If you have a strong background in engineering and a keen eye for risk management, we invite you to be a part of our innovative journey.
Key Responsibilities:
- Conduct thorough risk assessments by analyzing borrowers' financial data, including financial statements, credit scores, and income details.
- Develop and refine predictive models using advanced statistical methods to forecast loan defaults and assess creditworthiness.
- Collaborate in the formulation and execution of credit policies and risk management strategies, ensuring compliance with regulatory standards.
- Monitor and analyze the performance of loan portfolios, identifying trends, risks, and opportunities for improvement.
- Stay updated with financial regulations and standards, ensuring all risk assessment processes are in compliance.
- Prepare comprehensive reports on credit risk analyses and present findings to senior management.
- Work closely with underwriting, finance, and sales teams to provide critical input influencing lending decisions.
- Analyze market trends and economic conditions, adjusting risk assessment models and strategies accordingly.
- Utilize cutting-edge financial technologies for more efficient and accurate data analysis.
- Engage in continual learning to stay abreast of new tools, techniques, and best practices in credit risk management.
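The "predictive models to forecast loan defaults" responsibility above is commonly built on logistic scoring. The following is a toy sketch only: the coefficients and features are assumptions for illustration, not a fitted model.

```python
"""Toy sketch of logistic scoring for probability of default (PD).
Coefficients and feature names are illustrative assumptions; a real model
would be fitted to historical loan-performance data."""
import math

INTERCEPT = -2.0
COEFFS = {
    "debt_to_income": 3.0,        # higher leverage -> higher risk
    "credit_score_scaled": -4.0,  # credit score mapped to [0, 1]; higher -> lower risk
    "delinquencies": 0.8,         # count of past late payments
}

def probability_of_default(features: dict) -> float:
    """Apply the logistic function to a linear score of borrower features."""
    z = INTERCEPT + sum(COEFFS[name] * features[name] for name in COEFFS)
    return 1.0 / (1.0 + math.exp(-z))
```

A low-leverage borrower with a high credit score scores a much lower PD than a highly leveraged borrower with past delinquencies, which is the monotonic behaviour a credit policy would expect.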
Qualifications:
- Minimum qualification: B.Tech or Engineering degree from a reputed institution.
- 2-4 years of experience in credit risk analysis, preferably in a fintech environment.
- Proficiency in data analysis, statistical modeling, and machine learning techniques.
- Strong analytical and problem-solving skills.
- Excellent communication skills, with the ability to present complex data insights clearly.
- A proactive approach to work in a fast-paced, technology-driven environment.
- Up-to-date knowledge of financial regulations and compliance standards.
We look forward to discovering how your expertise and innovative ideas can contribute to the growth and success of FatakPay. Join us in redefining the future of fintech!
• You have 3 to 14 years of software engineering and product delivery experience with a strong background in data structures and algorithms
• You’ve proven software development credentials having successfully built complex
products
• You are experienced with one or more general-purpose programming languages (e.g. Java, C/C++, Go)
• You have a strong foundation in the fundamentals of computer science: familiarity with data structures and algorithms, and a strong command of object-oriented principles
• You have experience in one or more of the following areas: Server Backend, Distributed
and Parallel Systems, Full Stack Development (frontend and backend), Scalable Enterprise
Platforms and Applications, Application Security and Incident Management, Android,
iOS, and Machine Learning.
• You have a spark that separates you from the crowd, and the ability to think outside the box and on your feet
• You possess multi-dimensional skills that make you a valuable co-worker in a fast-changing and ambiguous environment
• You have the ability to pick up other coding languages quickly as needed
• You are comfortable in working with a team that deals with ambiguity every day
Job Description:
We are looking for a Big Data Engineer who has worked across the entire ETL stack: someone who has ingested data in both batch and live-stream formats, transformed large volumes of daily data, built data warehouses to store the transformed data, and integrated various visualization dashboards and applications with the data stores. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them.
Responsibilities:
- Develop, test, and implement data solutions based on functional / non-functional business requirements.
- You would be required to code in Scala and PySpark daily, on cloud as well as on-prem infrastructure
- Build data models to store the data in the most optimized manner
- Identify, design, and implement process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Implementing the ETL process and optimal data pipeline architecture
- Monitoring performance and advising on any necessary infrastructure changes.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Proactively identify potential production issues and recommend and implement solutions
- Must be able to write quality code and build secure, highly available systems.
- Create design documents that describe the functionality, capacity, architecture, and process.
- Review peers' code and pipelines before deploying to production, checking for optimization issues and adherence to code standards
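The extract-transform-load flow described in the responsibilities above can be sketched end to end with the standard library alone. This is a minimal illustration (table name, fields, and sample rows are all assumptions), using an in-memory SQLite database in place of a real warehouse:

```python
"""Minimal batch ETL sketch (extract -> transform -> load), stdlib only.
Table and field names are illustrative assumptions."""
import sqlite3

def extract():
    # In a real pipeline this would read from files, APIs, or a message queue.
    return [
        {"user_id": "u1", "amount": "19.99", "country": "in"},
        {"user_id": "u2", "amount": "5.00", "country": "us"},
        {"user_id": "u2", "amount": "bad", "country": "us"},  # malformed row
    ]

def transform(rows):
    """Cast types, normalise fields, and drop rows that fail to parse."""
    out = []
    for row in rows:
        try:
            out.append((row["user_id"], float(row["amount"]), row["country"].upper()))
        except (ValueError, KeyError):
            continue  # in production, route to a dead-letter store instead
    return out

def load(rows, conn):
    """Write transformed rows into the warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments (user_id TEXT, amount REAL, country TEXT)"
    )
    conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
```

In Spark or PySpark the same three stages map onto reads, DataFrame transformations, and writes; the structure, not the library, is the point here.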
Skill Sets:
- Good understanding of optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and ‘big data’ technologies.
- Proficient understanding of distributed computing principles
- Experience in working with batch-processing and real-time systems using various open-source technologies like NoSQL stores, Spark, Pig, Hive, and Apache Airflow.
- Implemented complex projects dealing with considerable data sizes (petabyte scale).
- Optimization techniques (performance, scalability, monitoring, etc.)
- Experience with integration of data from multiple data sources
- Experience with NoSQL databases such as HBase, Cassandra, MongoDB, etc.
- Knowledge of various ETL techniques and frameworks, such as Flume
- Experience with various messaging systems, such as Kafka or RabbitMQ
- Creation of DAGs for data engineering
- Expert at Python /Scala programming, especially for data engineering/ ETL purposes
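The "creation of DAGs" requirement above refers to expressing task dependencies as a directed acyclic graph, as Airflow does. A minimal sketch with the standard library's `graphlib` (task names are illustrative assumptions):

```python
"""Sketch of ordering data-engineering tasks as a DAG, stdlib only.
Task names are illustrative; Airflow DAGs encode the same dependency idea."""
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on.
dag = {
    "extract_orders": set(),
    "extract_users": set(),
    "transform_join": {"extract_orders", "extract_users"},
    "load_warehouse": {"transform_join"},
    "refresh_dashboard": {"load_warehouse"},
}

# A valid execution order: every task runs after all of its dependencies.
execution_order = list(TopologicalSorter(dag).static_order())
```

A scheduler like Airflow additionally runs independent tasks (here, the two extracts) in parallel and retries failures, but the dependency ordering is the core contract.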

- Develop the front-end user interface of our UKLB Experience Analysis (Biometrics) tool using SharePoint, PowerApps, Power Automate, and Logic Apps
- Connect the UI with the back-end SQL database using Logic Apps
- Advise on solution design as a Logic Apps expert (specifically around the UI)
- Responsible for managing technology in projects and providing technical guidance or solutions for work completion
Skills and Experience
- Experience with Azure services like Azure App Services, Azure Active Directory, Azure SQL, Azure PostgreSQL, Key Vault, Azure DevOps, Application Insights, Azure Storage, Redis Cache
- Microsoft Azure Developer Certification
- Experience with .Net SDK, integration tools, Application and Security frameworks
- C#, ASP.NET, Application development using .Net Framework
- Preferred: Insurance or BFSI domain experience
- .NET
- Azure
Location: Chennai
Salary: 15–20 LPA
We have an urgent requirement for the post of MuleSoft Architect for a reputed MNC company.
Notice Period: 15–30 days
Responsibilities
- Develop new user-facing features
- Build reusable code and libraries for future use
- Ensure the technical feasibility of UI/UX designs
- Optimize application for maximum speed and scalability
- Assure that all user input is validated before submitting to back-end
- Collaborate with other team members and stakeholders
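The "assure that all user input is validated before submitting to back-end" responsibility above is a general principle, and the same rules are typically enforced again server-side. A minimal sketch of such a rule set in Python (field names, length limits, and the email pattern are all illustrative assumptions):

```python
"""Sketch of input validation before submission to a back end.
Field names and rules are illustrative assumptions."""
import re

# Deliberately simple pattern for illustration; real email validation is looser.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_signup(form: dict) -> dict:
    """Return a dict of field -> error message; empty means input is acceptable."""
    errors = {}
    name = (form.get("name") or "").strip()
    if not (1 <= len(name) <= 100):
        errors["name"] = "name must be 1-100 characters"
    email = (form.get("email") or "").strip()
    if not EMAIL_RE.match(email):
        errors["email"] = "invalid email address"
    age = form.get("age")
    if not (isinstance(age, int) and 13 <= age <= 120):
        errors["age"] = "age must be an integer between 13 and 120"
    return errors
```

In a browser UI the equivalent checks would run in JavaScript before the request is sent; validating on both sides protects the back end from clients that bypass the UI.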
