11+ Oracle NoSQL Database Jobs in Pune | Oracle NoSQL Database Job openings in Pune

Job Title : Oracle Analytics Cloud (OAC) / Fusion Data Intelligence (FDI) Specialist
Experience : 3 to 8 years
Location : All USI locations – Hyderabad, Bengaluru, Mumbai, Gurugram (preferred) and Pune, Chennai, Kolkata
Work Mode : Hybrid (2-3 days from office) or fully from office (5 days)
Mandatory Skills : Oracle Analytics Cloud (OAC), Fusion Data Intelligence (FDI), RPD, OAC Reports, Data Visualizations, SQL, PL/SQL, Oracle Databases, ODI, Oracle Cloud Infrastructure (OCI), DevOps tools, Agile methodology.
Key Responsibilities :
- Design, develop, and maintain solutions using Oracle Analytics Cloud (OAC).
- Build and optimize complex RPD models, OAC reports, and data visualizations.
- Utilize SQL and PL/SQL for data querying and performance optimization.
- Develop and manage applications hosted on Oracle Cloud Infrastructure (OCI).
- Support Oracle Cloud migrations, OBIEE upgrades, and integration projects.
- Collaborate with teams using the ODI (Oracle Data Integrator) tool for ETL processes.
- Implement cloud scripting using cURL for Oracle Cloud automation.
- Contribute to the design and implementation of Business Continuity and Disaster Recovery strategies for cloud applications.
- Participate in Agile development processes and DevOps practices including CI/CD and deployment orchestration.
Required Skills :
- Strong hands-on expertise in Oracle Analytics Cloud (OAC) and/or Fusion Data Intelligence (FDI).
- Deep understanding of data modeling, reporting, and visualization techniques.
- Proficiency in SQL, PL/SQL, and relational databases on Oracle.
- Familiarity with DevOps tools, version control, and deployment automation.
- Working knowledge of Oracle Cloud services, scripting, and monitoring.
Good to Have :
- Prior experience in OBIEE to OAC migrations.
- Exposure to data security models and cloud performance tuning.
- Certification in Oracle Cloud-related technologies.
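The SQL and PL/SQL performance work listed above usually starts with reading execution plans. A minimal sketch of the idea, using Python's bundled sqlite3 in place of an Oracle database (the table, column names, and data are invented for illustration):

```python
# Illustrative only: sqlite3 stands in for an Oracle database here.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("WEST", 10.0), ("EAST", 5.0), ("WEST", 2.5)])

# Without an index the planner must scan the whole table for this filter;
# an index on the filter column lets it seek directly to matching rows.
conn.execute("CREATE INDEX idx_sales_region ON sales (region)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM sales WHERE region = ?",
    ("WEST",)).fetchall()
total = conn.execute(
    "SELECT SUM(amount) FROM sales WHERE region = ?", ("WEST",)).fetchone()[0]
```

In Oracle the equivalent check would be `EXPLAIN PLAN FOR ...` followed by `DBMS_XPLAN.DISPLAY`; the principle is the same — index the columns you filter on, then verify the plan actually uses the index.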


Job Summary:
We are looking for a highly skilled and experienced .NET Full Stack Developer to join our growing engineering team. The ideal candidate will have strong experience in developing scalable web applications using .NET Core and Angular. You will be involved in designing, developing, testing, and maintaining robust solutions in a collaborative and fast-paced environment.
Key Responsibilities:
- Design, develop, and maintain web applications using .NET Core, ASP.NET, and C#
- Build responsive and interactive frontend interfaces using Angular
- Write clean, scalable, and maintainable code adhering to coding standards and best practices
- Collaborate with cross-functional teams including product managers, designers, and QA to define and deliver new features
- Participate in code reviews, system design discussions, and team planning activities
- Troubleshoot and debug applications to ensure high performance and responsiveness
- Ensure secure coding practices and optimize application for maximum speed and scalability
Must-Have Skills:
- 4+ years of hands-on experience with .NET, .NET Core, and C#
- Strong frontend development skills using Angular (v8 and above)
- Proficiency in writing RESTful APIs and integrating backend services
- Solid understanding of object-oriented programming and design patterns
- Experience with version control systems like Git
- Familiarity with SQL Server or any relational database
Preferred Skills:
- Experience with Agile/Scrum development methodologies
- Knowledge of CI/CD tools and processes
- Familiarity with cloud platforms like Azure or AWS
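The RESTful-API skill above comes down to mapping HTTP verbs and routes onto resources with the right status codes. A hedged sketch of that request/response shape, written with Python's standard library purely for brevity (in this stack it would be an ASP.NET Core controller in C#; the `ITEMS` store and `/items/<id>` route are invented for the example):

```python
# Illustrative sketch only: a GET /items/<id> endpoint written with Python's
# standard library. In the stack described above this would be an ASP.NET Core
# controller in C#; the resource name and route are invented for the example.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

ITEMS = {1: {"id": 1, "name": "sample"}}  # in-memory stand-in for SQL Server

class ItemHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Parse the trailing path segment as the resource id
        try:
            item_id = int(self.path.rstrip("/").rsplit("/", 1)[-1])
        except ValueError:
            item_id = None
        item = ITEMS.get(item_id)
        # RESTful convention: 200 with the resource, 404 when absent
        body = json.dumps(item if item is not None else {"error": "not found"}).encode()
        self.send_response(200 if item is not None else 404)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep the example quiet

# To serve: HTTPServer(("127.0.0.1", 8000), ItemHandler).serve_forever()
```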
SAP APO – PPDS Consultant
We are looking for a skilled SAP APO – PPDS Consultant with at least 6 years of strong experience in Production Planning and Detailed Scheduling (PPDS) using SAP APO.
✅ Key Responsibilities:
- Configure and implement PPDS solutions to meet business requirements
- Work with CIF for master data and transactional data integration
- Develop planning strategies using heuristics and optimization techniques
- Collaborate with cross-functional teams (e.g. SCM, production, IT)
- Analyze and resolve issues in planning and scheduling processes
✅ Must-have Skills:
- SAP APO – PPDS
- Heuristic Planning
- CIF Integration
- Planning Run Configuration
- Experience with SAP SCM modules
📍 Location Options:
- Pune
- Mumbai
- Gandhinagar
- Bangalore
👨💻 Experience Required:
- Minimum 6 years in SAP APO – PPDS


Position Overview:
We are looking for a skilled Software Developer to work abroad. The ideal candidate will have a strong background in software development, problem-solving abilities, and a passion for creating high-quality applications. We provide job assistance to candidates to support their career growth.
Responsibilities:
- Proven experience working as a Full Stack Developer.
- Manage End to End Solution Development.
- Write clean, scalable, and maintainable code using Next.js, React.js, Node.js, Express, and MongoDB.
- Debug and troubleshoot issues to enhance performance and user experience.
- Participate in code reviews, project planning, and discussions.
JOB DESCRIPTION | IFAS PUBLICATIONS | GRAPHIC DESIGNER INTERN
ABOUT COMPANY: IFAS Publications is India's No. 1 publisher of reference books for graduate and postgraduate college exams, as well as competitive examinations like CSIR NET, UGC NET, GATE, SET, PSC, CUET PG, IIT JAM, and more. Since our humble beginnings in 2002, we have grown to become the fastest-growing publishing house in India, with 200+ titles available at 2000+ bookshops and leading online stores. Our commitment to "Learning Made Simple" drives us to provide books that empower every student to achieve their academic goals. We have 4 branches in India: Kolkata, Pune (Someshwarwadi & Hinjewadi), Jodhpur, and Hyderabad.
Join the IFAS Publications family and unlock your full potential!
RESPONSIBILITIES:
• Study design briefs and determine requirements.
• Schedule projects and define budget constraints.
• Conceptualize visuals based on requirements.
• Prepare rough drafts and present ideas.
• Develop illustrations, logos and other designs using software or by hand.
• Use the appropriate colors and layouts for each graphic.
• Work with copywriters and creative director to produce final design.
• Test graphics across various media.
• Amend designs after feedback.
• Ensure final graphics and layouts are visually appealing and on-brand.
REQUIREMENTS:
• Qualification: Graduate (minimum).
• Familiarity with design software and technologies (such as InDesign, Illustrator, Dreamweaver, Photoshop).
• A strong portfolio of illustrations or other graphics.
• A keen eye for aesthetics and details.
• Excellent communication skills.
• Ability to work methodically and meet deadlines.
NOTE: This is a paid 3-month internship, with the potential to convert into a full-time role based on your internship performance. Stipend: ₹5,000 to ₹7,000 per month.
Working Days: Monday to Saturday
Working Time: 10 am – 7 pm
Office Location: Shivranjan Tower, 2nd Floor, Someshwar Wadi Rd, near Rajwada Hotel, In front of Pawana sahkari Bank, Someshwarwadi, Pashan, Pune, Maharashtra 411045.
Website: www.ifasonline.com
Enterprise Data Architect - Dataeconomy (25+ Years Experience)
About Dataeconomy:
Dataeconomy is a rapidly growing company at the forefront of Information Technology. We are driven by data and committed to using it to make better decisions, improve our products, and deliver exceptional value to our customers.
Job Summary:
Dataeconomy seeks a seasoned and strategic Enterprise Data Architect to lead the company's data transformation journey. With 25+ years of experience in data architecture and leadership, you will be pivotal in shaping our data infrastructure, governance, and culture. You will leverage your extensive expertise to build a foundation for future growth and innovation, ensuring our data assets are aligned with business objectives and drive measurable value.
Responsibilities:
Strategic Vision and Leadership:
Lead the creation and execution of a long-term data strategy aligned with the company's overall vision and goals.
Champion a data-driven culture across the organization, fostering cross-functional collaboration and data literacy.
Advise senior leadership on strategic data initiatives and their impact on business performance.
Architecture and Modernization:
Evaluate and modernize the existing data architecture, recommending and implementing innovative solutions.
Design and implement a scalable data lake/warehouse architecture for future growth.
Advocate for and adopt cutting-edge data technologies and best practices.
ETL Tool Experience (8+ years):
Extensive experience in designing, developing, and implementing ETL (Extract, Transform, Load) processes using industry-standard tools such as Informatica PowerCenter, IBM DataStage, Microsoft SSIS, or open-source options like Apache Airflow.
Proven ability to build and maintain complex data pipelines that integrate data from diverse sources, transform it into usable formats, and load it into target systems.
Deep understanding of data quality and cleansing techniques to ensure the accuracy and consistency of data across the organization.
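The three stages named above (extract, transform, load) plus a quality gate can be sketched in a few lines. This is plain standard-library Python for illustration only — real pipelines in this role would be built in Informatica, DataStage, SSIS, or Airflow, and the source data and quality rule here are invented:

```python
# Minimal extract-transform-load sketch using only the standard library.
import csv, io, sqlite3

RAW = """id,amount,currency
1, 10.50 ,usd
2,,usd
3, 7.25 ,eur
"""

def extract(text):
    # Extract: parse the source into records
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform: cleanse (trim whitespace, uppercase currency codes) and
    # reject rows failing the quality rule "amount must be present"
    clean = []
    for r in rows:
        amount = r["amount"].strip()
        if not amount:
            continue  # real tools would route this record to an error table
        clean.append((int(r["id"]), float(amount), r["currency"].strip().upper()))
    return clean

def load(rows, conn):
    # Load: write the conformed records into the target system
    conn.execute("CREATE TABLE IF NOT EXISTS fact_sales "
                 "(id INTEGER, amount REAL, currency TEXT)")
    conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
```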
Data Governance and Quality:
Establish and enforce a comprehensive data governance framework ensuring data integrity, consistency, and security.
Develop and implement data quality standards and processes for continuous data improvement.
Oversee the implementation of master data management and data lineage initiatives.
Collaboration and Mentorship:
Mentor and guide data teams, including architects, engineers, and analysts, on data architecture principles and best practices.
Foster a collaborative environment where data insights are readily shared and acted upon across the organization.
Build strong relationships with business stakeholders to understand and translate their data needs into actionable solutions.
Qualifications:
Education: Master's degree in Computer Science, Information Systems, or a related field; Ph.D. preferred.
Experience: 25+ years of experience in data architecture and design, with 10+ years in a leadership role.
Technical Skills:
Deep understanding of TOGAF, AWS, MDM, EDW, the Hadoop ecosystem (MapReduce, Hive, HBase, Pig, Flume, Sqoop), cloud data platforms (Azure Synapse, Google BigQuery), modern data pipelines, streaming analytics, and data governance frameworks.
Proficiency in programming languages (Java, Python, SQL), scripting languages (Bash, Python), data modelling tools (ER diagramming software), and BI tools.
Extensive expertise in ETL tools (Informatica PowerCenter, IBM DataStage, Microsoft SSIS, Apache Airflow).
Familiarity with emerging data technologies (AI/ML, blockchain), data security and compliance frameworks.
Soft Skills:
Outstanding communication, collaboration, and leadership skills.
Strategic thinking and problem-solving abilities with a focus on delivering impactful solutions.
Strong analytical and critical thinking skills.
Ability to influence and inspire teams to achieve goals.
Mid / Senior Big Data Engineer
Job Description:
Role: Big Data Engineer
Number of open positions: 5
Location: Pune
At Clairvoyant, we're building a thriving big data practice to help enterprises enable and accelerate the adoption of big data and cloud services. In the big data space, we lead and serve as innovators, troubleshooters, and enablers. The big data practice at Clairvoyant focuses on solving our customers' business problems by delivering products designed with best-in-class engineering practices and a commitment to keeping the total cost of ownership to a minimum.
Must Have:
- 4-10 years of experience in software development.
- At least 2 years of relevant work experience on large scale Data applications.
- Strong coding experience in Java is mandatory.
- Good aptitude, strong problem-solving abilities, analytical skills, and the ability to take ownership as appropriate.
- Able to code, debug, performance-tune, and deploy applications to production.
- Good working experience with:
  - Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet)
  - Kafka
  - J2EE frameworks (Spring/Hibernate/REST)
  - Spark Streaming or any other streaming technology
- Ability to work sprint stories to completion, including unit test coverage.
- Experience working in Agile methodology.
- Excellent communication and coordination skills.
- Knowledgeable in (and preferably hands-on with) UNIX environments and various continuous integration tools.
- Must be able to integrate quickly into the team and work independently towards team goals
- Take the complete responsibility of the sprint stories' execution
- Be accountable for the delivery of the tasks in the defined timelines with good quality.
- Follow the processes for project execution and delivery.
- Follow agile methodology
- Work with the team lead closely and contribute to the smooth delivery of the project.
- Understand/define the architecture and discuss its pros and cons with the team.
- Involve in the brainstorming sessions and suggest improvements in the architecture/design.
- Work with other team leads to get the architecture/design reviewed.
- Work with the clients and counterparts (in the US) on the project.
- Keep all the stakeholders updated about the project/task status/risks/issues if there are any.
Experience: 4 to 9 years
Keywords: java, scala, spark, software development, hadoop, hive
Locations: Pune
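The streaming requirements above (Kafka feeding Spark Streaming) revolve around one core pattern: accumulate per-key state over micro-batches. A plain-Python sketch of that pattern, with no Kafka or Spark APIs involved — the event names and batch size are invented for illustration:

```python
# Sketch of the micro-batch pattern behind Spark Streaming style processing.
from collections import Counter

def micro_batches(events, batch_size):
    # Group an unbounded event stream into fixed-size micro-batches
    batch = []
    for e in events:
        batch.append(e)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # flush the final partial batch

def running_counts(events, batch_size):
    # Maintain running per-key counts across batches, analogous to
    # stateful operators like updateStateByKey / mapWithState in Spark
    state = Counter()
    snapshots = []
    for batch in micro_batches(events, batch_size):
        state.update(batch)
        snapshots.append(dict(state))
    return snapshots

stream = ["click", "view", "click", "view", "click"]
history = running_counts(stream, batch_size=2)
```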
- Develop and execute influencer marketing strategies and creative campaigns
- Identify and build relationships with prominent influencers
- Build an influencer ecosystem for e-commerce/start-ups
- Research relevant influencers and contact them
- Build good relationships with influencers
- Negotiate with influencers
- Monitor influencer activity, analyse their performance, identify areas of improvement, and make recommendations
- Research relevant industry experts and competitors
- Brainstorm new, creative approaches to influencer/affiliate/collaboration campaigns
- Liaise with the marketing team to create and coordinate marketing strategy
Position: Windchill Developer
Profile of the Job Holder:
· Bachelor’s Degree in Mechanical or Computer Engineering stream;
· 2-5 Years of relevant work experience;
· Highly motivated & self-initiator with good communication skills with the ability to work effectively in a team environment;
· Ability to work within a fast-paced environment with changing priorities;
· Ease at dealing with ambiguity in work situations
· Empower themselves by seeking authority to make decisions within appropriate guidelines;
· Recover quickly and learn from setbacks;
Key Skill & Knowledge Profile:
· Understand Basic Concepts of PLM
· Experience in Windchill / PDMLink customization
· Thingworx Knowledge
· Programming Knowledge of Java, JSP, XML, and HTML
· Knowledge of Windchill Data Model
· Ability to create reports using query builder
· Basic knowledge of any database (Oracle, SQL servers)
· Working knowledge of any Bug tracking & End user support tool;
Key Responsibilities
Business Development & Support
· Analyze & understand existing customizations
· Design, Develop & enhance customizations based on requirements
· Test and resolve issues in Windchill
· Prepare and maintain all necessary documentation
· Responsible for meeting development schedules and ensuring the delivered solution meets the technical specifications and design requirements
· Adopt best practices in software development processes and ensure that quality processes are adhered to
· Understand the deployment process and execute accordingly
· Escalate issues in a timely fashion to the PLM Operations Team to ensure Business visibility
Data Steward:
The Data Steward will collaborate and work closely with the group's software engineering and business divisions, and has overall accountability for the group's/division's data and reporting posture by responsibly managing data assets, data lineage, and data access in support of sound data analysis. This role requires a focus on data strategy, execution, and support for projects, programs, application enhancements, and production data fixes. The Data Steward makes well-thought-out decisions on complex or ambiguous data issues, establishes the data stewardship and information management strategy and direction for the group, and communicates effectively with individuals at various levels of the technical and business communities. This individual will become part of the corporate Data Quality and Data Management/entity resolution team supporting various systems across the board.
Primary Responsibilities:
- Responsible for data quality and data accuracy across all group/division delivery initiatives.
- Responsible for data analysis, data profiling, data modeling, and data mapping capabilities.
- Responsible for reviewing and governing data queries and DML.
- Accountable for the assessment, delivery, quality, accuracy, and tracking of any production data fixes.
- Accountable for the performance, quality, and alignment to requirements for all data query design and development.
- Responsible for defining standards and best practices for data analysis, modeling, and queries.
- Responsible for understanding end-to-end data flows and identifying data dependencies in support of delivery, release, and change management.
- Responsible for the development and maintenance of an enterprise data dictionary that is aligned to data assets and the business glossary for the group responsible for the definition and maintenance of the group's data landscape including overlays with the technology landscape, end-to-end data flow/transformations, and data lineage.
- Responsible for rationalizing the group's reporting posture through the definition and maintenance of a reporting strategy and roadmap.
- Partners with the data governance team to ensure data solutions adhere to the organization’s data principles and guidelines.
- Owns group's data assets including reports, data warehouse, etc.
- Understand customer business use cases and be able to translate them to technical specifications and vision on how to implement a solution.
- Accountable for defining the performance tuning needs for all group data assets and managing the implementation of those requirements within the context of group initiatives as well as steady-state production.
- Partners with others in test data management and masking strategies and the creation of a reusable test data repository.
- Responsible for solving data-related issues and communicating resolutions with other solution domains.
- Actively and consistently support all efforts to simplify and enhance the Clinical Trial Prediction use cases.
- Apply knowledge in analytic and statistical algorithms to help customers explore methods to improve their business.
- Contribute toward analytical research projects through all stages including concept formulation, determination of appropriate statistical methodology, data manipulation, research evaluation, and final research report.
- Visualize and report data findings creatively in a variety of visual formats that appropriately provide insight to the stakeholders.
- Achieve defined project goals within customer deadlines; proactively communicate status and escalate issues as needed.
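The data-profiling and data-quality duties above are typically automated as repeatable checks: null counts, duplicate keys, and domain/range rules. A minimal plain-Python sketch of such a check — the column names, key, and rules are invented for illustration, not taken from any real system:

```python
# Minimal data-profiling sketch of the checks a data steward might automate.
records = [
    {"patient_id": "P1", "site": "PUN", "age": 54},
    {"patient_id": "P2", "site": "PUN", "age": None},
    {"patient_id": "P2", "site": "HYD", "age": 41},   # duplicate key
    {"patient_id": "P3", "site": "??",  "age": 230},  # bad site, implausible age
]

def profile(rows, key, valid_sites):
    # Null check: any record with a missing value in any column
    null_rows = sum(1 for r in rows if any(v is None for v in r.values()))
    # Uniqueness check on the business key
    seen, dup_keys = set(), 0
    for r in rows:
        dup_keys += r[key] in seen
        seen.add(r[key])
    # Domain check on 'site' and plausibility range check on 'age'
    invalid_site = sum(1 for r in rows if r["site"] not in valid_sites)
    implausible_age = sum(1 for r in rows
                          if r["age"] is not None and not 0 <= r["age"] <= 120)
    return {"rows": len(rows), "null_rows": null_rows, "dup_keys": dup_keys,
            "invalid_site": invalid_site, "implausible_age": implausible_age}

report = profile(records, key="patient_id", valid_sites={"PUN", "HYD"})
```

In practice these rules would live in an MDM or data-quality tool (the posting mentions SAS DM Studio, Tamr, Profisee, Reltio), but the shape of the checks is the same.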
Additional Responsibilities:
- Strong understanding of the Software Development Life Cycle (SDLC) with Agile Methodologies
- Knowledge and understanding of industry-standard/best practices requirements gathering methodologies.
- Knowledge and understanding of Information Technology systems and software development.
- Experience with data modeling and test data management tools.
- Experience in data integration projects
- Good problem-solving & decision-making skills
- Good communication skills within the team, site, and with the customer
Knowledge, Skills and Abilities
- Technical expertise in data architecture principles and design aspects of various DBMS and reporting concepts.
- Solid understanding of key DBMS platforms like SQL Server, Azure SQL
- Results-oriented, diligent, and works with a sense of urgency. Assertive, responsible for his/her own work (self-directed), have a strong affinity for defining work in deliverables, and be willing to commit to deadlines.
- Experience in MDM tools like MS DQ, SAS DM Studio, Tamr, Profisee, Reltio etc.
- Experience in Report and Dashboard development
- Statistical and Machine Learning models
- Python (sklearn, numpy, pandas, gensim)
- Nice to Have:
- 1 year of ETL experience
- Natural Language Processing
- Neural networks and deep learning
- Experience with the keras, tensorflow, spacy, nltk, and LightGBM Python libraries
Interaction : Frequently interacts with subordinate supervisors.
Education : Bachelor’s degree, preferably in Computer Science, B.E or other quantitative field related to the area of assignment. Professional certification related to the area of assignment may be required
Experience : 7 years of pharmaceutical/biotech/life sciences experience; 5 years of clinical trials experience and knowledge; excellent documentation, communication, and presentation skills, including PowerPoint.
