11+ Certified Scrum Professional (CSP) Jobs in Pune
Apply to 11+ Certified Scrum Professional (CSP) Jobs in Pune on CutShort.io. Explore the latest Certified Scrum Professional (CSP) Job opportunities across top companies like Google, Amazon & Adobe.
EnKash is a leading corporate spend management and payments company. EnKash's platform helps businesses manage their payables, receivables, corporate cards, and expenses, and lets them control their spending in a completely DIY environment. EnKash serves both financial institutions and businesses, with the objective of bringing operational efficiency and savings to business payment flows.
EnKash is a leader in value creation for businesses, especially SMBs. Over the past four years, EnKash has delivered savings to more than 65,000 users by digitizing their processes. The best part is that EnKash works as a layer on top of businesses' existing software and banking relationships, providing a state-of-the-art experience and accessibility. Combined with easy access to credit, purpose-based cards, end-use monitoring, customisable approval workflows, and integrations with leading banks and accounting platforms, this eases the journey for our SMBs.
The management and founding team at EnKash bring 100+ years of combined experience in the payments and banking domain and have been instrumental in bringing many firsts to the ecosystem. EnKash is based in India, with offices in Mumbai, NCR, Pune, and Bengaluru.
While a leader in the Indian ecosystem, EnKash is often compared with global peers like Ramp, Spendesk, Pleo, Payhawk, Soldo, Lithic, Marqeta, and bill.com, combining elements of their compelling business models.
EnKash is a Series B-funded company with marquee investors on board.
Feathers in the cap:
· Winner: India Fintech Awards 2020 by NASSCOM
· Best B2B Solution Provider of the Year, 2020, at the Payments & Cards Summit.
Responsibilities:
- Manage each project's scope and timeline
- Coordinate sprints, retrospective meetings and daily stand-ups across teams
- Coach team members in Agile frameworks
- Facilitate internal communication and effective collaboration
- Be the point of contact for external communications (e.g. from customers or stakeholders)
- Work with product owners to handle backlogs and new requests
- Resolve conflicts and remove obstacles that occur
- Help teams implement changes effectively
- Ensure deliverables are up to quality standards at the end of each sprint
- Guide development teams to higher scrum maturity
- Help build a productive environment where team members 'own' the product and enjoy working on it
- Maintain a high-level understanding of various tech stacks and the basics of APIs and microservices
Requirements:
- Leadership and decision-making skills.
- A strong sense of responsibility.
- Effective communication.
- Attention to detail.
- Task delegation.
- Comfortable with participatory management.
We are looking for an experienced and strategic Chief Financial Officer (CFO) to oversee the company’s financial operations, planning, and risk management. The CFO will play a key leadership role in driving financial performance, supporting business growth, and ensuring financial stability.
Key Responsibilities:
- Lead and manage all financial functions including accounting, budgeting, forecasting, and financial reporting
- Develop and execute financial strategies aligned with business goals
- Monitor cash flow, profitability, and financial performance
- Prepare and present financial reports to the CEO, Board of Directors, and stakeholders
- Ensure compliance with financial regulations, tax laws, and company policies
- Manage financial risk, audits, and internal controls
- Support strategic initiatives such as investments, mergers, and acquisitions
- Oversee relationships with banks, auditors, and financial institutions
At least 5 years of experience in testing and developing automation tests.
A minimum of 3 years of experience writing tests in Python, with a preference for experience in designing automation frameworks.
Experience in developing automation for big data testing, including data ingestion, data processing, and data migration, is highly desirable.
Familiarity with Playwright or other browser application testing frameworks is a significant advantage.
Proficiency in object-oriented programming and principles is required.
Extensive knowledge of AWS services is essential.
Strong expertise in REST API testing and SQL is required (see the sketch after this list).
A solid understanding of testing and development life cycle methodologies is necessary.
Knowledge of the financial industry and trading systems is a plus.
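Below is a minimal sketch, in Python with pytest and requests, of the kind of REST API test the requirements above describe; the base URL, endpoint, and response fields are hypothetical placeholders, not part of any actual system:

# A minimal pytest sketch for REST API testing in Python.
# The base URL, endpoint, and response schema are hypothetical
# placeholders used purely for illustration.
import pytest
import requests

BASE_URL = "https://api.example.com"  # assumption: stand-in service URL


@pytest.fixture
def session():
    """Shared HTTP session so connection setup is not repeated per test."""
    with requests.Session() as s:
        s.headers.update({"Accept": "application/json"})
        yield s


def test_list_trades_returns_ok(session):
    """The listing endpoint should respond 200 and return a JSON array."""
    resp = session.get(f"{BASE_URL}/v1/trades", params={"limit": 10}, timeout=10)
    assert resp.status_code == 200
    body = resp.json()
    assert isinstance(body, list)
    assert len(body) <= 10


def test_trade_has_expected_fields(session):
    """Each trade record should expose the fields the consumer relies on."""
    resp = session.get(f"{BASE_URL}/v1/trades", params={"limit": 1}, timeout=10)
    resp.raise_for_status()
    for trade in resp.json():
        assert {"id", "symbol", "quantity", "price"}.issubset(trade)

In practice the same pattern extends to Playwright-based browser checks and SQL assertions against the backing database.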
Key Responsibilities:
1. Development, integration, and testing of embedded software on embedded Linux / RTOS platforms
2. Integrate applications based on the Adaptive AUTOSAR platform
3. Contribute to architecture, detailed design, and programming in C++ (11/14/17)
4. Perform unit and integration tests of the developed applications
5. Apply a detail-oriented, systematic problem-solving approach to embedded software
6. Debug embedded software on hardware platforms for issue identification and resolution
Minimum qualification criteria:
1. Bachelor's degree in Electronics, Computer Science, Electrical Engineering, or related field
2. 2+ years of experience in software development in C or C++
3. 2+ years of experience in designing and implementing embedded systems for high-performance, high-reliability, real-time embedded computing platforms
Location: Ahmedabad / Pune
Team: Technology
Company Profile
InFoCusp is a company working in the broad field of Computer Science, Software Engineering, and Artificial Intelligence (AI). It is headquartered in Ahmedabad, India, having a branch office in Pune.
We have worked on, and continue to work on, AI and algorithm-heavy projects with applications spanning the finance, healthcare, e-commerce, legal, HR/recruiting, pharmaceutical, leisure sports, and computer gaming domains. All of this is based on the core concepts of data science, computer vision, machine learning (with an emphasis on deep learning), cloud computing, biomedical signal processing, text and natural language processing, distributed systems, embedded systems, and the Internet of Things.
PRIMARY RESPONSIBILITIES:
● Applying machine learning, deep learning, and signal processing on large datasets (Audio, sensors, images, videos, text) to develop models.
● Architecting large scale data analytics/modeling systems.
● Designing and programming machine learning methods and integrating them into our ML framework/pipeline.
● Analyzing data collected from various sources.
● Evaluating and validating the analysis with statistical methods, and presenting the results in a lucid form to people not familiar with the domain of data science/computer science.
● Writing specifications for algorithms, reports on data analysis, and documentation of algorithms.
● Evaluating new machine learning methods and adapting them for our purposes.
● Feature engineering to add new features that improve model performance.
KNOWLEDGE AND SKILL REQUIREMENTS:
● Background and knowledge of recent advances in machine learning, deep learning, natural language processing, and/or image/signal/video processing with at least 3 years of professional work experience working on real-world data.
● Strong programming background, e.g. Python, C/C++, R, Java, and knowledge of software engineering concepts (OOP, design patterns).
● Knowledge of machine learning libraries: TensorFlow, JAX, Keras, scikit-learn, and PyTorch (a brief sketch follows after this list).
● Excellent mathematical skills and background, e.g. accuracy metrics, significance tests, visualization, and advanced probability concepts.
● Ability to perform both independent and collaborative research.
● Excellent written and spoken communication skills.
● A proven ability to work in a cross-discipline environment within defined time frames.
● Knowledge and experience of deploying large-scale systems using distributed and cloud-based systems (Hadoop, Spark, Amazon EC2, Dataflow) is a big plus.
● Knowledge of systems engineering is a big plus.
● Some experience in project management and mentoring is also a big plus.
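Below is a minimal sketch, using scikit-learn on a synthetic dataset, of the model-building and evaluation loop described in the responsibilities above; the dataset, model choice, and metrics are illustrative assumptions only:

# Minimal sketch of a train/evaluate cycle with scikit-learn.
# The synthetic dataset and logistic-regression model are illustrative
# assumptions; a real project would plug in domain data and features.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import cross_val_score, train_test_split

# Synthetic stand-in for "data collected from various sources".
X, y = make_classification(n_samples=2000, n_features=20, n_informative=8,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

model = LogisticRegression(max_iter=1000)

# Cross-validated accuracy gives a variance estimate, not just a point score.
scores = cross_val_score(model, X_train, y_train, cv=5, scoring="accuracy")
print(f"CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")

# Fit on the training split and report held-out metrics for stakeholders.
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))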
EDUCATION:
- B.E./B.Tech/B.S. candidates with significant prior experience in the aforementioned fields will be considered.
- M.E./M.S./M.Tech/PhD, preferably in a field related to Computer Science, with experience in machine learning, image and signal processing, or statistics, is preferred.
Datametica is Hiring for Datastage Developer
- Must have 3 to 8 years of experience in ETL Design and Development using IBM Datastage Components.
- Should have extensive knowledge in Unix shell scripting.
- Understanding of DW principles (Fact, Dimension tables, Dimensional Modelling and Data warehousing concepts).
- Research, develop, document, and modify ETL processes as per data architecture and modeling requirements.
- Ensure appropriate documentation for all new development and modifications of the ETL processes and jobs.
- Should be proficient in writing complex SQL queries (an illustrative example follows below).
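As an illustration of the kind of SQL involved in validating an ETL load, here is a small self-contained sketch; SQLite and the src_orders/tgt_orders tables are hypothetical stand-ins, not part of any real pipeline:

# Illustrative sketch only: a source-vs-target row-count reconciliation,
# the kind of check run after an ETL load. SQLite and the table names
# (src_orders, tgt_orders) are stand-ins for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, order_date TEXT);
    CREATE TABLE tgt_orders (order_id INTEGER, order_date TEXT);
    INSERT INTO src_orders VALUES (1, '2024-01-01'), (2, '2024-01-01'), (3, '2024-01-02');
    INSERT INTO tgt_orders VALUES (1, '2024-01-01'), (2, '2024-01-01');
""")

# Compare per-day counts between source and target to spot missed rows.
query = """
    SELECT s.order_date,
           s.src_count,
           COALESCE(t.tgt_count, 0) AS tgt_count,
           s.src_count - COALESCE(t.tgt_count, 0) AS diff
    FROM (SELECT order_date, COUNT(*) AS src_count FROM src_orders GROUP BY order_date) s
    LEFT JOIN (SELECT order_date, COUNT(*) AS tgt_count FROM tgt_orders GROUP BY order_date) t
      ON s.order_date = t.order_date
    ORDER BY s.order_date
"""
for row in conn.execute(query):
    print(row)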
About Us!
A global leader in data warehouse migration and modernization to the cloud, we empower businesses by migrating their data, workloads, ETL, and analytics to the cloud by leveraging automation.
We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, and Greenplum platforms, along with ETL tools like Informatica, DataStage, Ab Initio, and others, to cloud-based data warehousing, with further capabilities in data engineering, advanced analytics solutions, data management, data lakes, and cloud optimization.
Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.
We have our own products!
Eagle – Data warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.
Why join us!
Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years are the key factors in our success.
Benefits we Provide!
Working with highly technical, passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com
Experience: 2.5 to 4 years
Location: Pune
About Studily: Optimizing education through personalized learning, Studily is here to empower educators and engage learners across the globe. Studily uses the Flipped Learning Model to create a student-centric tool that applies artificial intelligence technology to prepare today's schooling system for tomorrow's education revolution.
Qualifications
- 2+ years of experience in web development using Node.js technologies.
- B.E., B.Tech., M.Sc. IT, MCA, etc., in Software Engineering / Information Technology.
Responsibilities
- Write reusable, testable, and efficient code following best practices (unit testing, source control, continuous integration, automation, design patterns, etc)
- Debug and refactor existing code, and troubleshoot problems
- Collaborate with other developers, testers, and leads to improve product quality
- Take full responsibility for the quality of the code and test cases that are developed.
- Integration of user-facing elements developed by front-end developers with server-side logic.
- Provide task estimations and deliver quality code on time.
- Participate in architectural, design, and product sessions.
- Interact with different stakeholders to gather feedback and clarification.
- Research and apply new technologies and best practices.
- Should enjoy the experience of mentoring new hires on technical and process areas.
- Must have hands-on experience in building microservices-based software architecture, along with Unix/Docker/Kubernetes/NoSQL experience
Requirements
- Knowledge of ReactJS is preferable.
- Extensive knowledge of JavaScript.
- Thorough understanding of databases such as MySQL, MongoDB or similar technologies
- In-depth knowledge of working with Git.
- Experience with Restful APIs, Postman etc.
- Understanding of AWS/EC2, or other cloud services.
- Object-oriented application building experience in a professional Agile/Scrum environment.
- Good to have Unix/Docker/Kubernetes/NoSQL experience.
Skills
- Node.js
- AWS services
- Deployment
- Unix/Docker/Kubernetes/NoSQL
- Hands-on experience in the following is a must: Unix, Python, and shell scripting.
- Hands-on experience in creating infrastructure on the AWS cloud platform is a must (see the sketch after this list).
- Must have experience with industry-standard CI/CD tools like Git/Bitbucket, Jenkins, Maven, Artifactory, and Chef.
- Must be good at these DevOps tools:
Version Control Tools: Git, CVS
Build Tools: Maven and Gradle
CI Tools: Jenkins
- Hands-on experience with analytics tools such as the ELK stack.
- Knowledge of Java will be an advantage.
- Experience designing and implementing an effective and efficient CI/CD flow that gets code from dev to prod with high quality and minimal manual effort.
- Ability to help debug and optimise code and automate routine tasks.
- Should have excellent communication skills.
- Experience in dealing with difficult situations and making decisions with a sense of urgency.
- Experience with Agile and Jira will be an added advantage.
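Below is a minimal sketch of the kind of Python-on-AWS automation the list above points to; boto3 usage is standard, but the region, the "Owner" tag policy, and the use case are assumptions made purely for illustration:

# Illustrative sketch: a small Python/boto3 automation of the sort a DevOps
# engineer might run from a Unix shell or a Jenkins job. It lists running
# EC2 instances and flags any missing an "Owner" tag. The region and the
# tag policy are assumptions for the example, not a real standard.
import boto3


def untagged_running_instances(region="ap-south-1", required_tag="Owner"):
    """Return IDs of running instances that lack the required tag."""
    ec2 = boto3.client("ec2", region_name=region)
    paginator = ec2.get_paginator("describe_instances")
    missing = []
    for page in paginator.paginate(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
    ):
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                tags = {t["Key"] for t in instance.get("Tags", [])}
                if required_tag not in tags:
                    missing.append(instance["InstanceId"])
    return missing


if __name__ == "__main__":
    for instance_id in untagged_running_instances():
        print(f"Missing tag: {instance_id}")

A script like this would typically be wired into a Jenkins job or cron entry so the check runs without manual effort.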





