11+ 3D printing Jobs in Pune | 3D printing Job openings in Pune
- Sell Vacuum Casting and 3D Printing / Rapid Prototyping services
- Develop new customers and manage existing client relationships
- Understand technical requirements and coordinate with engineering teams
- Prepare quotations, negotiate deals, and close sales
- Achieve sales targets and support business growth
Requirements:
- Experience in Vacuum Casting / 3D Printing / Additive Manufacturing sales
- Strong communication and negotiation skills
- Engineering or technical background preferred
- Willingness to travel
Job Title: Senior SAP PPDS Consultant
Experience: 6+ Years
Location: Open to USI locations (Hyderabad / Bangalore / Mumbai / Pune / Chennai / Gurgaon)
Job Type: Full-Time
Start Date: Immediate Joiners Preferred
Job Description:
We are urgently seeking a Senior SAP PPDS (Production Planning and Detailed Scheduling) Consultant with strong implementation experience.
The ideal candidate will be responsible for leading and supporting end-to-end project delivery for SAP PPDS, contributing to solution design, configuration, testing, and deployment in both Greenfield and Brownfield environments.
Mandatory Skills: SAP PPDS, CIF Integration, Heuristics, Pegging Strategies, Production Scheduling, S/4HANA or ECC, Greenfield/Brownfield Implementation.
Key Responsibilities:
- Lead the implementation of SAP PPDS modules including system configuration and integration with SAP ECC/S4 HANA.
- Collaborate with stakeholders to gather requirements and define functional specifications.
- Design, configure, and test SAP PPDS solutions to meet business needs.
- Provide support for system upgrades, patches, and enhancements.
- Participate in workshops, training sessions, and knowledge transfers.
- Troubleshoot and resolve issues during implementation and post-go-live.
- Ensure documentation of functional specifications, configuration, and user manuals.
Required Skills:
- Minimum 6 years of SAP PPDS experience.
- At least one to two Greenfield or Brownfield implementation projects.
- Strong understanding of supply chain planning and production scheduling.
- Hands-on experience in CIF integration, heuristics, optimization, and pegging strategies.
- Excellent communication and client interaction skills.
Preferred Qualifications:
- Experience in an S/4HANA environment.
- SAP PPDS Certification is a plus.
- Experience working in large-scale global projects.
We are seeking an experienced and visionary Product Strategist to lead the strategic planning and positioning of our products in the market. The ideal candidate will be responsible for identifying market opportunities, aligning product development with business goals, and delivering customer-centric solutions that drive growth and innovation.
Key Responsibilities:
- Develop and communicate clear product strategies aligned with business objectives and market needs.
- Conduct market research, competitor analysis, and trend forecasting to identify growth opportunities.
- Collaborate with cross-functional teams including Product Management, Marketing, Sales, and Engineering to ensure alignment on strategy and execution.
- Define product vision, positioning, value propositions, and go-to-market plans.
- Analyze customer insights and user feedback to inform product direction and improvements.
- Create strategic roadmaps, product lifecycle plans, and innovation pipelines.
- Work with data analytics teams to assess product performance and recommend strategic pivots when needed.
- Advocate for user needs while balancing business and technical constraints.
Requirements:
- Bachelor’s or Master’s degree in Business, Marketing, Product Management, or a related field.
- 4+ years of experience in product strategy, product management, or business strategy, preferably in a tech-driven environment.
- Proven ability to build and execute successful product strategies and roadmaps.
- Strong analytical and problem-solving skills, with a data-driven mindset.
- Excellent communication and storytelling abilities to articulate product vision and strategy to stakeholders.
- Experience working with agile teams and frameworks.
- Familiarity with tools such as Jira, Confluence, Productboard, Aha!, or similar platforms.
Wissen Technology is hiring for Data Engineer
About Wissen Technology:
At Wissen Technology, we deliver niche, custom-built products that solve complex business challenges across industries worldwide. Founded in 2015, our core philosophy is built around a strong product engineering mindset, ensuring every solution is architected and delivered right the first time. Today, Wissen Technology has a global footprint with 2000+ employees across offices in the US, UK, UAE, India, and Australia.
Our commitment to excellence translates into delivering 2X impact compared to traditional service providers, achieved through a combination of deep domain knowledge, cutting-edge technology expertise, and a relentless focus on quality. We don't just meet expectations; we exceed them by ensuring faster time-to-market, reduced rework, and greater alignment with client objectives. We have a proven track record of building mission-critical systems across industries including financial services, healthcare, retail, manufacturing, and more.
Wissen stands apart through its unique delivery models: our outcome-based projects ensure predictable costs and timelines, while our agile pods give clients the flexibility to adapt to evolving business needs. Wissen leverages its thought leadership and technology prowess to drive superior business outcomes, and our success is powered by top-tier talent. Our mission is clear: to be the partner of choice for building world-class custom products that deliver exceptional impact, the first time, every time.
Job Summary: Wissen Technology is hiring a Data Engineer with expertise in Python, Pandas, Airflow, and Azure Cloud Services. The ideal candidate will have strong communication skills and experience with Kubernetes.
Experience: 4-7 years
Notice Period: Immediate to 15 days
Location: Pune, Mumbai, Bangalore
Mode of Work: Hybrid
Key Responsibilities:
- Develop and maintain data pipelines using Python and Pandas.
- Implement and manage workflows using Airflow.
- Utilize Azure Cloud Services for data storage and processing.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Ensure data quality and integrity throughout the data lifecycle.
- Optimize and scale data infrastructure to meet business needs.
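To give a flavor of the first responsibility, here is a minimal, hypothetical Pandas transform of the kind such a pipeline might run as one task (the column names and data are illustrative, not from any real system):

```python
import pandas as pd

def clean_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Drop duplicate orders, coerce amounts to numbers,
    and reject rows that fail basic quality checks."""
    df = raw.drop_duplicates(subset="order_id")
    df = df.assign(amount=pd.to_numeric(df["amount"], errors="coerce"))
    df = df.dropna(subset=["amount"])          # drop unparseable amounts
    return df[df["amount"] >= 0].reset_index(drop=True)

raw = pd.DataFrame({
    "order_id": [1, 1, 2, 3, 4],
    "amount": ["10.5", "10.5", "bad", "-3", "7"],
})
clean = clean_orders(raw)
```

In an Airflow deployment, a function like this would typically be wrapped as one task in a DAG, with the data-quality rules versioned alongside the pipeline code.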
Qualifications and Required Skills:
- Proficiency in Python (Must Have).
- Strong experience with Pandas (Must Have).
- Expertise in Airflow (Must Have).
- Experience with Azure Cloud Services.
- Good communication skills.
Good to Have Skills:
- Experience with Pyspark.
- Knowledge of Kubernetes.
Wissen Sites:
- Website: http://www.wissen.com
- LinkedIn: https://www.linkedin.com/company/wissen-technology
- Wissen Leadership: https://www.wissen.com/company/leadership-team/
- Wissen Live: https://www.linkedin.com/company/wissen-technology/posts/?feedView=All
- Wissen Thought Leadership: https://www.wissen.com/articles/
Key Responsibilities:
Purchase Order Processing:
Prepare and review purchase orders from various departments or individuals within the organization.
Verify the accuracy and completeness of purchase orders, ensuring they are properly authorized and comply with company policies.
Enter purchase order details into the accounting system (Tally Prime).
Invoice Verification and Processing:
Gather invoices and match them with corresponding purchase orders and delivery receipts.
Verify the accuracy of invoices, including quantities, prices, and applicable taxes.
Resolve discrepancies or issues with invoices by communicating with vendors, procurement, and other internal stakeholders.
Coordinate with vendors to resolve any payment-related inquiries or issues.
Engage in Warehousing activities related to invoices and inventory.
Vendor Relationship Management:
Develop and maintain positive relationships with vendors, responding to inquiries and resolving any issues promptly.
Ensure vendor accounts are accurately maintained in the accounting system, including contact information and payment terms.
Communicate payment schedules to vendors and provide remittance advice as required.
Reporting and Reconciliation:
Prepare periodic reports related to accounts payable, such as outstanding creditors reports, vendor payables ageing, and vendor statements.
Reconcile vendor statements with accounts payable records, investigating and resolving any discrepancies.
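The invoice-verification steps above amount to the classic three-way match between purchase order, goods receipt, and invoice. A minimal sketch of that check (the record fields and price tolerance are hypothetical, not taken from Tally Prime):

```python
def three_way_match(po, invoice, receipt, tolerance=0.01):
    """Return a list of discrepancies between a purchase order,
    its invoice, and the goods-receipt note (empty list = clean match)."""
    issues = []
    if invoice["po_number"] != po["po_number"]:
        issues.append("invoice references a different PO")
    if invoice["qty"] != receipt["qty"]:
        issues.append(f"billed qty {invoice['qty']} != received qty {receipt['qty']}")
    if abs(invoice["unit_price"] - po["unit_price"]) > tolerance:
        issues.append(f"price variance: {invoice['unit_price']} vs PO {po['unit_price']}")
    return issues

po      = {"po_number": "PO-1001", "qty": 10, "unit_price": 25.00}
receipt = {"po_number": "PO-1001", "qty": 9}
invoice = {"po_number": "PO-1001", "qty": 10, "unit_price": 25.00}
print(three_way_match(po, invoice, receipt))
```

Any non-empty result would be routed back to the vendor or procurement for resolution before payment is released.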
Enterprise Data Architect - Dataeconomy (25+ Years Experience)
About Dataeconomy:
Dataeconomy is a rapidly growing company at the forefront of Information Technology. We are driven by data and committed to using it to make better decisions, improve our products, and deliver exceptional value to our customers.
Job Summary:
Dataeconomy seeks a seasoned and strategic Enterprise Data Architect to lead the company's data transformation journey. With 25+ years of experience in data architecture and leadership, you will be pivotal in shaping our data infrastructure, governance, and culture. You will leverage your extensive expertise to build a foundation for future growth and innovation, ensuring our data assets are aligned with business objectives and drive measurable value.
Responsibilities:
Strategic Vision and Leadership:
Lead the creation and execution of a long-term data strategy aligned with the company's overall vision and goals.
Champion a data-driven culture across the organization, fostering cross-functional collaboration and data literacy.
Advise senior leadership on strategic data initiatives and their impact on business performance.
Architecture and Modernization:
Evaluate and modernize the existing data architecture, recommending and implementing innovative solutions.
Design and implement a scalable data lake/warehouse architecture for future growth.
Advocate for and adopt cutting-edge data technologies and best practices.
ETL Tool Experience (8+ years):
Extensive experience in designing, developing, and implementing ETL (Extract, Transform, Load) processes using industry-standard tools such as Informatica PowerCenter, IBM DataStage, Microsoft SSIS, or open-source options like Apache Airflow.
Proven ability to build and maintain complex data pipelines that integrate data from diverse sources, transform it into usable formats, and load it into target systems.
Deep understanding of data quality and cleansing techniques to ensure the accuracy and consistency of data across the organization.
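The Extract, Transform, Load cycle described above can be sketched end to end in a few lines. The toy pipeline below uses only the Python standard library; the source data, validation rules, and table name are invented for illustration:

```python
import csv, io, sqlite3

# Extract: read raw rows (an in-memory CSV stands in for a source file).
raw = io.StringIO("id,amount\n1,100\n2,not-a-number\n3,250\n")
rows = list(csv.DictReader(raw))

# Transform: coerce types and drop rows that fail validation.
def valid(row):
    try:
        return {"id": int(row["id"]), "amount": float(row["amount"])}
    except ValueError:
        return None

clean = [r for r in (valid(row) for row in rows) if r is not None]

# Load: insert the cleaned rows into the target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_sales (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO fact_sales VALUES (:id, :amount)", clean)
total = conn.execute("SELECT SUM(amount) FROM fact_sales").fetchone()[0]
print(total)  # 350.0
```

Tools such as Informatica, DataStage, or Airflow orchestrate the same three stages at scale, adding scheduling, retries, and lineage on top.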
Data Governance and Quality:
Establish and enforce a comprehensive data governance framework ensuring data integrity, consistency, and security.
Develop and implement data quality standards and processes for continuous data improvement.
Oversee the implementation of master data management and data lineage initiatives.
Collaboration and Mentorship:
Mentor and guide data teams, including architects, engineers, and analysts, on data architecture principles and best practices.
Foster a collaborative environment where data insights are readily shared and acted upon across the organization.
Build strong relationships with business stakeholders to understand and translate their data needs into actionable solutions.
Qualifications:
Education: Master's degree in Computer Science, Information Systems, or a related field; Ph.D. preferred.
Experience: 25+ years of experience in data architecture and design, with 10+ years in a leadership role.
Technical Skills:
Deep understanding of TOGAF, AWS, MDM, EDW, the Hadoop ecosystem (MapReduce, Hive, HBase, Pig, Flume, Sqoop), cloud data platforms (Azure Synapse, Google BigQuery), modern data pipelines, streaming analytics, and data governance frameworks.
Proficiency in programming languages (Java, Python, SQL), scripting languages (Bash, Python), data modelling tools (ER diagramming software), and BI tools.
Extensive expertise in ETL tools (Informatica PowerCenter, IBM DataStage, Microsoft SSIS, Apache Airflow).
Familiarity with emerging data technologies (AI/ML, blockchain), data security and compliance frameworks.
Soft Skills:
Outstanding communication, collaboration, and leadership skills.
Strategic thinking and problem-solving abilities with a focus on delivering impactful solutions.
Strong analytical and critical thinking skills.
Ability to influence and inspire teams to achieve goals.
As an Associate Manager - Senior Data Scientist, you will solve some of the most impactful business problems for our clients using a variety of AI and ML technologies. You will collaborate with business partners and domain experts to design and develop innovative data-driven solutions that achieve predefined outcomes.
• Engage with clients to understand current and future business goals and translate business problems into analytical frameworks
• Develop custom models based on an in-depth understanding of underlying data, data structures, and business problems to ensure deliverables meet client needs
• Create repeatable, interpretable and scalable models
• Effectively communicate the analytics approach and insights to a larger business audience
• Collaborate with team members, peers and leadership at Tredence and client companies
Qualification:
1. Bachelor's or Master's degree in a quantitative field (CS, machine learning, mathematics, statistics) or equivalent experience.
2. 5+ years of experience in data science, building hands-on ML models.
3. Experience leading the end-to-end design, development, and deployment of predictive modeling solutions.
4. Excellent programming skills in Python. Strong working knowledge of Python's numerical, data analysis, and AI frameworks such as NumPy, Pandas, Scikit-learn, Jupyter, etc.
5. Advanced SQL skills, with SQL Server and Spark experience.
6. Knowledge of predictive/prescriptive analytics, including Machine Learning algorithms (supervised and unsupervised), deep learning algorithms, and Artificial Neural Networks.
7. Experience with Natural Language Processing (e.g., NLTK) and text analytics for information extraction, parsing, and topic modeling.
8. Excellent verbal and written communication. Strong troubleshooting and problem-solving skills. Thrives in a fast-paced, innovative environment.
9. Experience with data visualization tools (Power BI, Tableau, R Shiny, etc.) preferred.
10. Experience with cloud platforms such as Azure and AWS preferred but not required.
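As a small illustration of the "hands-on ML models" expectation, the sketch below fits an ordinary least squares regression using NumPy's least-squares solver; the data is synthetic and the code is a teaching-sized stand-in for the Scikit-learn workflows the role actually involves:

```python
import numpy as np

def fit_ols(X, y):
    """Fit ordinary least squares with an intercept term."""
    X1 = np.column_stack([np.ones(len(X)), X])   # prepend intercept column
    w, *_ = np.linalg.lstsq(X1, y, rcond=None)   # numerically stable solve
    return w

X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([3.0, 5.0, 7.0, 9.0])               # exactly y = 2x + 1
w = fit_ols(X, y)
print(w)  # ≈ [intercept, slope] = [1.0, 2.0]
```

In practice the same fit is one line with `sklearn.linear_model.LinearRegression`, but understanding the underlying solve is what distinguishes "hands-on" modeling from library use.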
Datametica is Hiring for DataStage Developer
- Must have 3 to 8 years of experience in ETL Design and Development using IBM DataStage components.
- Should have extensive knowledge in Unix shell scripting.
- Understanding of DW principles (Fact, Dimension tables, Dimensional Modelling and Data warehousing concepts).
- Research, development, document and modification of ETL processes as per data architecture and modeling requirements.
- Ensure appropriate documentation for all new development and modifications of the ETL processes and jobs.
- Should be good in writing complex SQL queries.
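The DW principles and complex-SQL requirements above come together in star-schema queries such as the toy example below, run here through Python's built-in sqlite3 module (the schema and data are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales  (sale_id INTEGER PRIMARY KEY,
                          product_id INTEGER REFERENCES dim_product(product_id),
                          amount REAL);
INSERT INTO dim_product VALUES (1, 'widgets'), (2, 'gadgets');
INSERT INTO fact_sales  VALUES (10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0);
""")
# Aggregate the fact table by a dimension attribute.
rows = conn.execute("""
    SELECT d.category, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product d ON d.product_id = f.product_id
    GROUP BY d.category
    ORDER BY revenue DESC
""").fetchall()
print(rows)  # [('widgets', 150.0), ('gadgets', 75.0)]
```

The same join-then-aggregate shape, scaled up with more dimensions and conformed keys, is the bread and butter of the DataStage jobs this role maintains.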
About Us!
A global leader in Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation.
We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, and Greenplum platforms, along with ETL tools like Informatica, DataStage, and Ab Initio, to cloud-based data warehousing, with further capabilities in data engineering, advanced analytics solutions, data management, data lakes, and cloud optimization.
Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.
We have our own products!
Eagle – Data Warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican – Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.
Why join us!
Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years are the key factors in achieving our success.
Benefits we Provide!
Working with highly technical, passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com
- 2 to 8 years of industry experience in programming web applications, mobile and/or large-scale enterprise products
- Strong experience with Mendix and knowledge of Mendix Mobile Development using native features
- Design and develop integrations with external systems leveraging REST APIs
- Worked extensively in Mendix domain modelling, forms, and complex Microflow logic building
- Good knowledge of installation and configuration of applications in the Mendix public/private cloud
- Webpage design using HTML5, CSS3, Bootstrap, jQuery
- Working knowledge of PostgreSQL and Git/TFS is required
- Experience with Agile methodology
- Skilled at reviewing new feature impact on an application and recognizing potential risks
- Uses time effectively and efficiently
- Quickly learns new technologies
- Detail-oriented, professional, with a positive work attitude
- Communicates professionally both verbally and in writing
- Effective time management skills