11+ Open Source Contribution Jobs in Pune | Open Source Contribution Job openings in Pune
ABOUT US.
Established in 2009, Ashnik is a leading open-source solutions and consulting company in Southeast Asia and India, headquartered in Singapore. We enable digital transformation for large enterprises through our design, architecture, and solution skills. More than 100 large enterprises in the region have acknowledged our expertise in delivering solutions built on key open-source technologies. Our offerings form a critical part of digital transformation, big-data platforms, cloud and web acceleration, and IT modernization. We represent EDB, Pentaho, Docker, Couchbase, MongoDB, Elastic, NGINX, Sysdig, Redis Labs, Confluent, and HashiCorp as their key partner in the region. Our team members bring decades of experience in giving enterprises the confidence to adopt open-source software and are known for their thought leadership.
As a team culture, Ashnik is a family for its members. Each member brings a different perspective, new ideas, and a diverse background, yet together we strive for one goal: to deliver the best solutions to our customers using open-source software. We passionately believe in the power of collaboration. Through an open platform for exchanging ideas, we create a vibrant environment for growth and excellence.
THE POSITION
Ashnik is looking for talented and passionate people to join the team for an upcoming project at a client location.
QUALIFICATION AND EXPERIENCE
- Preferably four or more years of working experience on production PostgreSQL databases
- Experience working in a production support environment
- Engineering degree or equivalent
- Passion for open-source technologies is desired
ADDITIONAL SKILLS
- Install and configure PostgreSQL and EnterpriseDB
- Technical proficiency with PostgreSQL 9.x, 10.x, and 11.x
- Server tuning
- Troubleshooting database issues
- Linux shell scripting
- Install, configure, and maintain failover mechanisms
- Backup and restoration, including point-in-time database recovery
- A demonstrable ability to articulate and sell the benefits of modern platforms, software and technologies.
- A real passion for being curious and a continuous learner. You are someone that invests in yourself as much as you invest in your professional relationships.
RESPONSIBILITIES
- Monitoring database performance
- Optimizing queries and handling escalations
- Analysing and assessing the impact and risk of low- to medium-risk changes on high-profile production databases
- Implementing security features
- DR implementation and switchover
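The backup, point-in-time-recovery, and failover duties above generally rest on WAL archiving. A minimal sketch of the relevant settings for the PostgreSQL 9.x–11.x versions named in the posting (the paths and the recovery target are illustrative, not part of the role):

```conf
# postgresql.conf on the primary: enable WAL archiving for PITR
wal_level = replica                      # 'hot_standby' on 9.x
archive_mode = on
archive_command = 'cp %p /archive/%f'    # illustrative archive location

# recovery.conf on the restored server (PostgreSQL <= 11):
# recover a base backup forward to a point in time
restore_command = 'cp /archive/%f %p'
recovery_target_time = '2024-01-15 10:00:00'   # illustrative target
```

In practice the base backup is taken with pg_basebackup and the archive lives on durable shared storage; from PostgreSQL 12 onward these recovery settings move into postgresql.conf alongside a recovery.signal file.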
LOCATION: Pune
Experience: 8+ years
Package: up to 17 LPA
🚀 We’re Hiring: PHP Developer at Deqode
📍 Location: Pune (Hybrid)
🕒Experience: 4–6 Years
⏱️ Notice Period: Immediate Joiner
We're looking for a skilled PHP Developer to join our team. If you have a strong grasp of secure coding practices, are experienced in PHP upgrades, and thrive in a fast-paced deployment environment, we’d love to connect with you!
🔧 Key Skills:
- PHP | MySQL | JavaScript | Jenkins | Nginx | AWS
🔐 Security-Focused Responsibilities Include:
- Remediation of PenTest findings
- XSS mitigation (input/output sanitization)
- API rate limiting
- 2FA integration
- PHP version upgrade
- Use of AWS Secrets Manager
- Secure session and password policies
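API rate limiting, one of the responsibilities above, is commonly implemented as a token bucket. A minimal, framework-agnostic sketch in Python (the rate, burst size, and in-memory store are illustrative; a production PHP/Nginx stack would typically enforce this at the gateway or in a shared store like Redis):

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: refills `rate` tokens/sec up to `capacity`."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)   # start full, allowing an initial burst
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=5, capacity=10)      # 5 req/sec, burst of 10
results = [bucket.allow() for _ in range(12)]  # immediate burst of 12 calls
print(results.count(True))                     # the burst capacity is admitted
```

Requests beyond the burst are rejected until the bucket refills, which smooths traffic without penalizing short spikes.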
Job Overview
We are seeking an experienced Coupa Implementation and Configuration Consultant to lead and support end-to-end implementations of the Coupa Business Spend Management (BSM) platform, including Supplier Information Management (SIM).
The ideal candidate will possess strong Procure-to-Pay (P2P), Source-to-Contract (S2C), and financial process expertise, along with hands-on Coupa configuration and ERP integration experience.
This role requires close collaboration with Finance, Procurement, IT teams, and executive stakeholders to deliver scalable, compliant, and optimized spend management solutions.
Key Responsibilities
Implementation & Roll-Out
- Lead full lifecycle implementation of Coupa BSM modules.
- Drive Business Process Design workshops and requirement gathering.
- Manage global or multi-entity roll-outs.
- Conduct SIT, UAT, and go-live support.
Configuration & Technical Expertise
- Configure Procurement, Sourcing, Contracts, Catalogues, Invoicing, and Expenses modules.
- Manage Supplier Information Management (SIM) and onboarding workflows.
- Configure PR, PO, Receipt, and Invoicing lifecycle.
- Implement approval workflows, compliance controls, and security configurations.
- Handle advanced system configurations and policy enforcement.
Integration & Technical
- Lead API-based integrations between Coupa and ERP systems (SAP / Oracle / Workday, etc.).
- Support data migration, reconciliation, and validation.
- Ensure system performance and compliance alignment.
Reporting & Governance
- Enable spend visibility through dashboards and analytics.
- Support audit controls and procurement governance frameworks.
Required Skills
- Strong hands-on experience in Coupa BSM implementation.
- Expertise in P2P and S2C processes.
- Experience in Supplier Information Management (SIM).
- ERP integration exposure (API-based preferred).
- Business process design and documentation capability.
- Experience in enterprise or multi-country roll-outs.
- Strong stakeholder management skills.
- Coupa certification is mandatory
Job Description
- Meeting with the client and internal team to review website and application requirements.
- Setting up project completion timelines with client and tracking it to closure.
- Configuring the company SharePoint systems to specified requirements.
- Developing new web components using XML, .NET, SQL, and C#.
- Designing, coding, and implementing scalable applications.
- Extending SharePoint functionality with forms, web parts, and application technologies.
- Testing and debugging code, troubleshooting software issues.
- Reviewing website interface and software stability.
- Maintaining and updating SharePoint applications.
- Preparing solution design documentation for projects.
- Providing systems training to staff and customers.
- Reporting: dashboards, growth reports, and all project reports on an MTD, QTD, and YTD basis
- Collecting data from the project team and analyzing it on a daily or monthly basis
- Expertise in MS Office and Report Generation
- Work very closely with teams across delivery locations and client
- SharePoint and VBA skills
- To be willing to work in US shifts
- Perform requisite MIS (Count sheets, internal & external reporting)
- Adhere to reasonable operational requests from the management
- Expert in managing new SharePoint sites with approval workflows and maintaining existing sites.
- High-level coding skills.
- Ability to solve complex software issues.
- Detail-oriented.
- Self-motivated.
Profile: AWS Data Engineer
Mandatory skills: AWS + Databricks + PySpark + SQL
Location: Bangalore/Pune/Hyderabad/Chennai/Gurgaon
Notice Period: Immediate
Key Requirements :
- Design, build, and maintain scalable data pipelines to collect, process, and store data from multiple datasets.
- Optimize data storage solutions for better performance, scalability, and cost-efficiency.
- Develop and manage ETL/ELT processes to transform data as per schema definitions, apply slicing and dicing, and make it available for downstream jobs and other teams.
- Collaborate closely with cross-functional teams to understand system and product functionalities, pace up feature development, and capture evolving data requirements.
- Engage with stakeholders to gather requirements and create curated datasets for downstream consumption and end-user reporting.
- Automate deployment and CI/CD processes using GitHub workflows, identifying areas to reduce manual, repetitive work.
- Ensure compliance with data governance policies, privacy regulations, and security protocols.
- Utilize cloud platforms like AWS and work on Databricks for data processing with S3 Storage.
- Work with distributed systems and big data technologies such as Spark, SQL, and Delta Lake.
- Integrate with SFTP to push data securely from Databricks to remote locations.
- Analyze and interpret Spark query execution plans to fine-tune queries for faster, more efficient processing.
- Strong problem-solving and troubleshooting skills in large-scale distributed systems.
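The transform step described above (apply a schema definition, drop non-conforming records, hand the result downstream) can be sketched in plain Python; in the stack this role names, it would be PySpark over S3 and Delta Lake, and the column names and sink here are illustrative:

```python
def extract(rows):
    """Stand-in source: a list of raw dicts (in practice, S3/Databricks reads)."""
    return rows

def transform(rows):
    """Apply a schema: keep known columns, cast types, drop malformed rows."""
    out = []
    for r in rows:
        try:
            out.append({"id": int(r["id"]), "amount": float(r["amount"])})
        except (KeyError, ValueError):
            continue  # a real pipeline would quarantine these for inspection
    return out

def load(rows, sink):
    """Stand-in for a Delta Lake / warehouse write."""
    sink.extend(rows)

sink = []
raw = [{"id": "1", "amount": "9.5"}, {"id": "x"}]  # one valid, one malformed
load(transform(extract(raw)), sink)
print(sink)  # only the schema-conforming record survives
```

The same extract/transform/load separation is what makes each stage independently testable and reusable across downstream jobs.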
Job Summary:
We are looking for a skilled and motivated Python AWS Engineer to join our team. The ideal candidate will have strong experience in backend development using Python, cloud infrastructure on AWS, and building serverless or microservices-based architectures. You will work closely with cross-functional teams to design, develop, deploy, and maintain scalable and secure applications in the cloud.
Key Responsibilities:
- Develop and maintain backend applications using Python and frameworks like Django or Flask
- Design and implement serverless solutions using AWS Lambda, API Gateway, and other AWS services
- Develop data processing pipelines using services such as AWS Glue, Step Functions, S3, DynamoDB, and RDS
- Write clean, efficient, and testable code following best practices
- Implement CI/CD pipelines using tools like CodePipeline, GitHub Actions, or Jenkins
- Monitor and optimize system performance and troubleshoot production issues
- Collaborate with DevOps and front-end teams to integrate APIs and cloud-native services
- Maintain and improve application security and compliance with industry standards
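The serverless responsibilities above center on the Lambda handler contract. A minimal sketch of a Python handler behind API Gateway's proxy integration (the greeting payload is illustrative, not part of this role; the event and response shapes follow the proxy-integration format):

```python
import json

def handler(event, context):
    """AWS Lambda handler for an API Gateway proxy integration.
    Reads an optional `name` query parameter and returns a JSON body."""
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local invocation with a hand-built event, as a unit test would do:
resp = handler({"queryStringParameters": {"name": "Pune"}}, None)
print(resp["statusCode"], resp["body"])
```

Because the handler is a plain function of `(event, context)`, it can be unit-tested without any AWS infrastructure, which fits the CI/CD pipelines mentioned above.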
Required Skills:
- Strong programming skills in Python
- Solid understanding of AWS cloud services (Lambda, S3, EC2, DynamoDB, RDS, IAM, API Gateway, CloudWatch, etc.)
- Experience with infrastructure as code (e.g., CloudFormation, Terraform, or AWS CDK)
- Good understanding of RESTful API design and microservices architecture
- Hands-on experience with CI/CD, Git, and version control systems
- Familiarity with containerization (Docker, ECS, or EKS) is a plus
- Strong problem-solving and communication skills
Preferred Qualifications:
- Experience with PySpark, Pandas, or data engineering tools
- Working knowledge of Django, Flask, or other Python frameworks
- AWS Certification (e.g., AWS Certified Developer – Associate) is a plus
Educational Qualification:
- Bachelor's or Master’s degree in Computer Science, Engineering, or related field
Role - Lead (Technology & Data Cell)
Experience - 6+ years
Job Location - Aundh, Pune, Maharashtra
About our Client :-
Our client is a Communities Foundation that works in the area of skilling and livelihoods for underserved youths. This is a pioneering program with a strong PPP model, an agency-led approach to livelihoods and a vision of socio-economic transformation.
The Lead of the Technology and Data Cell has the opportunity to create and implement the vision for enabling the organization to serve 1 million youth by 2030 using cutting-edge technology and data systems.
They will tech-enable organizational systems for effective operations and devise data solutions for effective decision-making and strategic direction. They will work closely with the program teams to fully understand the program landscape and implement technology solutions accordingly. Implementation includes being the single point of contact for the software service provider, providing end-to-end back-end support, and training the users.
- Design and implementation/upgrade of a tech platform for the livelihood program:
In collaboration with the software service provider, an ERP system is being developed and is close to go-live. The responsibilities include:
i) Understanding the business requirements with respect to the platform
ii) Data migration: migrating legacy data onto the platform in the required format while ensuring its accuracy
iii) End-user training across centers and the central team: hand-holding the team, along with the service provider, during go-live and implementation
iv) Troubleshooting wherever required, through constant updates and follow-up on system glitches, and ensuring resolution with the support of the software service provider
v) Monitoring the system application across centers, and identifying and suggesting required improvements
vi) Coordinating with the software service provider on changes and support required for the smooth running of the application
vii) Managing and maintaining SMS/email gateways, domains, servers, etc.
viii) Meaningful data extraction and reporting
ix) Establishing data systems: establishing protocols for data storage and sharing
x) Identifying technology requirements for donor management, HR management, and all other areas as required
- Data Analytics: Facilitate culture of data-driven decision making within the organization, including but not limited to, provision of relevant data analytics to the program team.
- Knowledge Management: Lead the overall knowledge management system for the organization and enable data to be available on cloud with a clear protocol for sharing and storage.
- Education: BE Computers
- Experience: Project management experience of 5+ years
- Data management skills: a proven understanding of the principles of data management and administration.
- IT and database skills: familiarity with modern databases and IT systems. Candidates with a fair understanding of PHP and SQL databases are preferred.
- Analytical skills
- Problem-solving skills
- Partnership management
- Excellent verbal and written communication skills.
You would lead post-sales management and coordinate closely with the sales team in the US and with customers.
You may also help us with lead generation via LinkedIn and other media.
4-6 years of total experience in data warehousing and business intelligence
3+ years of solid Power BI experience (Power Query, M-Query, DAX, Aggregates)
2 years’ experience building Power BI using cloud data (Snowflake, Azure Synapse, SQL DB, data lake)
Strong experience building visually appealing UI/UX in Power BI
Understand how to design Power BI solutions for performance (composite models, incremental refresh, analysis services)
Experience building Power BI using large data in direct query mode
Expert SQL background (query building, stored procedure, optimizing performance)
Specialism: Advanced Analytics, Data Science, regression, forecasting, analytics, SQL, R, Python, decision trees, random forests, SAS, clustering, classification
Senior Analytics Consultant- Responsibilities
- Understand the business problem and requirements by building domain knowledge, and translate them into a data science problem
- Conceptualize and design a cutting-edge data science solution to the problem, applying design-thinking concepts
- Identify the right algorithms, tech stack, and sample outputs required to efficiently address the end need
- Prototype and experiment with the solution to successfully demonstrate its value
- Independently, or with support from the team, execute the conceptualized solution as planned, following project management guidelines
- Present the results to internal and client stakeholders in an easy-to-understand manner with great storytelling, storyboarding, insights, and visualization
- Help build eClerx's overall data science capability through support in pilots, pre-sales pitches, product development, and practice development initiatives
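The regression work named in the specialism line reduces, in its simplest one-feature form, to ordinary least squares. A minimal closed-form sketch in Python (the data points are illustrative):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b with a single feature.
    Uses the closed-form solution: a = cov(x, y) / var(x), b = mean(y) - a*mean(x)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Points lying exactly on y = 2x + 1, so the fit recovers a=2, b=1:
a, b = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
print(a, b)
```

In practice this role would reach for R, Python (scikit-learn/statsmodels), or SAS for multivariate versions of the same idea; the closed form above is just the one-feature special case.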
Location: Hyderabad, India
Nisum is a leading global digital commerce firm headquartered in California, with services spanning digital strategy and transformation, insights and analytics, blockchain, business agility, and custom software development. Founded in 2000 with the customer-centric motto “Building Success Together®,” Nisum has grown to over 1,400 professionals across the United States, Chile, India, and Pakistan. A preferred advisor to leading Fortune 500 brands, Nisum enables clients to achieve direct business growth by building the advanced technology they need to reach end customers in today’s world, with immersive and seamless experiences across digital and physical channels.
What You’ll Do
Coding in Java 8, Spring, microservices, WebFlux/reactive programming, REST services, Kafka, PCF, Azure, Spring Cloud Config, and NoSQL technologies.
Solve technical problems using cutting-edge technologies and best practices.
Ensure code meets the required development standards and is optimized for performance.
Unit testing for each line of new code introduced (JUnit/Mockito).
Peer code review process using Git pull requests and Crucible (for SVN).
Propose multiple solutions to a problem and show how one option is better than another.
Ensure all aspects of the technical design are correctly incorporated.
Contribute to research and implement POCs as required.
Collaborate with the onsite team in scrum ceremonies.
Who you are
Senior developer with technical skills in Java 8, J2EE, Spring Boot (REST services), Web Services (REST & SOAP), WebFlux, Spring Cloud Config, Maven/Gradle, JUnit/TestNG, Mockito/JMock/EasyMock, JIRA, XML, JSON, and EhCache/Memcached/Redis, with skills in JMS and Kafka
Hands-on experience with at least one cloud platform, such as PCF or Azure.
Hands-on skills with NoSQL databases (Cassandra, MongoDB) and SQL databases (Oracle/DB2/MySQL).
UI development skills to the level of debugging and enhancements.
Expertise in code quality and coding standards.
Ability to apply different design patterns, especially GoF, J2EE, and integration design patterns.
Ensures unit tests are implemented for each line of new code introduced (JUnit/Mockito).
Ensures code meets the required development standards and is optimized for performance.
Education
Bachelor’s / Master’s degree in specific technical fields like computer science, math, statistics preferred; or equivalent practical experience.


