
About Cashfree

· 12+ years of professional experience in front-end development.
· 7+ years of hands-on experience with React.js and its ecosystem (Redux, Context API, Hooks, etc.).
· Strong understanding of JavaScript (ES6+), HTML5, and CSS3.
· Experience with RESTful APIs, GraphQL, and WebSocket integration.
· Proficiency in modern build tools (Webpack, Vite, Babel, etc.).
· Expertise in state management libraries (Redux, MobX) and React Query.
· Solid understanding of UI/UX principles and responsive design.
· Experience with testing frameworks (Jest, React Testing Library, Cypress, etc.).
· Familiarity with modern CI/CD pipelines and version control tools (Git).
We are looking for a product manager for the Smart Ship Hub™ Digital platform, which enables teams to efficiently guide businesses to success across our family of maritime apps (Vessel Monitoring, Crew Management, Condition Monitoring, Voyage Performance, etc.).
We envision a central communication platform, “Digital Twins”, that matches and delivers the best remote vessel performance systems to businesses when and where they need them, helping them achieve their objectives as quickly as possible.
Our internal partners and customers include the Engineering, Analytics, Software Development, and Product teams launching new features, as well as teams working on business growth.
Work Experience
- 5–8 years of Product Management experience
Key Responsibilities:
- Need to have 4+ years of software engineering or technical product experience.
- Enable, scale, and optimize meaningful communications between Smart Ship Hub (SSH) and large businesses using SSH's world-class IoT capabilities across multiple dashboards and mobile apps.
- Collaborate with cross-functional technical and business partner teams who are equally passionate about helping maritime businesses grow the Smart Ship Hub product ecosystem.
- Champion the platform across SSH, identify significant opportunities, and drive product vision, strategies and roadmaps in the context of broader organizational strategies and goals.
- Drive execution with quantitative and qualitative data on user behavior, platform stats and experimentation.
- Identify and break down complex business problems into solvable pieces that can be shipped iteratively.
Skills and Experience:
- Experience in building and scaling world-class platforms
- Experience driving projects with cross-functional colleagues.
- Proven record of successful product outcomes.
- Experience working with technical management teams to develop strategy, systems, solutions, and products.
- Experience establishing work relationships across multi-disciplinary teams and multiple partners in different time zones.
- Past experience managing products in one of the following spaces: CRM, Marketing, e-Commerce, Logistics, Customer Support.
Education
- B.S. in Computer Science or a related technical discipline, or equivalent experience.
Smart Ship Hub’s mission is to give shipowners the power to build a strong fleet of remotely monitored vessels. Through our family of apps and services, we're building a different kind of company: one that connects thousands of vessels around the world, gives them ways to share vessel performance status, and helps deliver higher vessel performance. Our global teams are constantly iterating, solving problems, and working together to empower the maritime industry around the world to build a high-performing and eco-friendly space.

Role: Data Engineer
Company: PayU
Location: Bangalore/ Mumbai
Experience: 2-5 years
About Company:
PayU is the payments and fintech business of Prosus, a global consumer internet group and one of the largest technology investors in the world. Operating and investing globally in markets with long-term growth potential, Prosus builds leading consumer internet companies that empower people and enrich communities.
The leading online payment service provider in 36 countries, PayU is dedicated to creating a fast, simple and efficient payment process for merchants and buyers. Focused on empowering people through financial services and creating a world without financial borders where everyone can prosper, PayU is one of the biggest investors in the fintech space globally, with investments totalling $700 million to date. PayU also specializes in credit products and services for emerging markets across the globe. We are dedicated to removing risks to merchants, allowing consumers to use credit in ways that suit them and enabling a greater number of global citizens to access credit services.
Our local operations in Asia, Central and Eastern Europe, Latin America, the Middle East, Africa and South East Asia enable us to combine the expertise of high growth companies with our own unique local knowledge and technology to ensure that our customers have access to the best financial services.
India is the biggest market for PayU globally, and the company has already invested $400 million in this region in the last 4 years. PayU, in its next phase of growth, is developing a full regional fintech ecosystem providing multiple digital financial services in one integrated experience. We are going to do this through 3 mechanisms: build; co-build/partner; and select strategic investments.
PayU supports over 350,000+ merchants and millions of consumers making payments online with over 250 payment methods and 1,800+ payment specialists. The markets in which PayU operates represent a potential consumer base of nearly 2.3 billion people and a huge growth potential for merchants.
Job responsibilities:
- Design infrastructure for data, especially for but not limited to consumption in machine learning applications
- Define database architecture needed to combine and link data, and ensure integrity across different sources
- Ensure performance of data systems serving everything from machine learning and customer-facing web and mobile applications built on cutting-edge open-source frameworks, to highly available RESTful services and back-end Java-based systems
- Work with large, fast, complex data sets to solve difficult, non-routine analysis problems, applying advanced data handling techniques if needed
- Build data pipelines, including implementing, testing, and maintaining infrastructural components of the data engineering stack.
- Work closely with Data Engineers, ML Engineers and SREs to gather data engineering requirements to prototype, develop, validate and deploy data science and machine learning solutions
Requirements to be successful in this role:
- Strong knowledge and experience in Python, Pandas, data wrangling, ETL processes, statistics, data visualisation, data modelling, and Informatica.
- Strong experience with scalable compute solutions such as Kafka and Snowflake
- Strong experience with workflow management libraries and tools such as Airflow, AWS Step Functions etc.
- Strong experience with data engineering practices (i.e. data ingestion pipelines and ETL)
- A good understanding of machine learning methods, algorithms, pipelines, testing practices and frameworks
- (Preferred) MEng/MSc/PhD degree in computer science, engineering, mathematics, physics, or equivalent (preference: DS/AI)
- Experience with designing and implementing tools that support sharing of data, code, practices across organizations at scale

- Strong experience with one or more general-purpose programming languages, including but not limited to: Python, Java, C/C++, C#
- Demonstrated expertise working with at least one modern enterprise application framework, such as Spring Boot, Play Framework, or Django
- Demonstrated expertise in building scalable distributed applications in microservices architecture
- Expert knowledge of best practice software engineering methodologies and coding standards
- Strong and proven advocacy for Test Driven Development is preferred
- Experience with SQL (MySQL, Postgres, etc.) and NoSQL (MongoDB, DynamoDB, Aerospike, or Redis) databases
- Production experience in running cloud-based enterprise-grade systems at scale
- Natural ability to process requirements, work out multiple execution options and their complexity, and estimate the scope of work required to get tasks done
- DevOps experience
- Cloud experience (AWS required; Google Cloud Platform a bonus)
• 2+ years of experience in data engineering & strong understanding of data engineering principles using big data technologies
• Excellent programming skills in Python are mandatory
• Expertise in relational databases (MSSQL/MySQL/Postgres) and in SQL. Exposure to NoSQL databases such as Cassandra or MongoDB will be a plus.
• Exposure to deploying ETL pipelines using tools such as Airflow, Docker containers, and Lambda functions
• Experience with AWS cloud services such as the AWS CLI, Glue, Kinesis, etc.
• Experience using Tableau for data visualization is a plus
• Ability to demonstrate a portfolio of projects (GitHub, papers, etc.) is a plus
• Motivated, can-do attitude and desire to make a change is a must
• Excellent communication skills

- In-depth knowledge of C++
- Proven and demonstrable experience in developing software applications in C++
- Experience in large-scale applications
- Ability to work independently
- Experience designing software components using OOA/OOD methodologies
- Very strong problem-solving and analysis skills
- Image processing / Image analysis programming experience would be beneficial
- Experience with DICOM standards and Medical Imaging will be a plus
- Experience with MS Visual Studio and unit testing
- SCRUM / Agile development framework
- Distributed Source Control


