11+ ENOVIA Jobs in Pune | ENOVIA Job openings in Pune
Role: ME Designer
Exp: 3-6 Years
CTC: up to 15 LPA
Location: Pune & Bangalore
Basic Qualifications
• Master of Science in Mechanical Engineering, Automotive Engineering, or a related field
• Minimum of 3 years of experience in embedded design and development
• Experience in Design for Assembly and Design for Manufacturing methodologies
• Familiarity with more than one leading CAD design tool
Should have experience in:
• UG NX or CATIA CAD tools (3-6 years minimum)
• CAD modelling and detailing (3-6 years minimum)
• PLM tools such as ENOVIA or Teamcenter (3-6 years minimum)
• CAE tools: ANSYS for mechanical analysis preferred
About the company:
Inteliment is a niche business analytics company with an almost two-decade proven track record of partnering with hundreds of Fortune 500 global companies. Inteliment operates an ISO-certified development centre in Pune, India, and has business operations in multiple countries through subsidiaries in Singapore and Europe, with its headquarters in India.
About the role:
As a Technical Project Manager, you will lead the planning, execution, and delivery of complex technical projects while ensuring alignment with business objectives and timelines. You will act as a bridge between technical teams and stakeholders, managing resources, risks, and communications to deliver high-quality solutions. This role demands strong leadership, project management expertise, and technical acumen to drive project success in a dynamic and collaborative environment.
Qualifications:
- Education Background: Any ME / M Tech / BE / B Tech
Key Competencies:
Technical Skills
1. Data & BI Technologies:
- Proficiency in SQL & PL/SQL for database querying and optimization.
- Understanding of data warehousing concepts, dimensional modeling, and data lake/lakehouse architectures.
- Experience with BI tools such as Power BI, Tableau, and Qlik Sense/QlikView.
- Familiarity with traditional platforms like Oracle, Informatica, SAP BO, BODS, BW.
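The SQL querying skills listed above can be illustrated with a short sketch. This uses Python's built-in sqlite3 module with a hypothetical `sales` table invented purely for demonstration; it is not tied to any of the platforms named in this posting.

```python
import sqlite3

# Hypothetical sales table, used purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("West", 100.0), ("West", 250.0), ("East", 75.0)],
)

# Aggregate revenue per region -- the kind of query a BI developer
# would feed into a Power BI or Tableau dashboard.
rows = conn.execute(
    "SELECT region, SUM(amount) AS revenue "
    "FROM sales GROUP BY region ORDER BY revenue DESC"
).fetchall()
print(rows)  # [('West', 350.0), ('East', 75.0)]
```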
2. Cloud & Data Engineering:
- Strong knowledge of AWS (EC2, S3, Lambda, Glue, Redshift), Azure (Data Factory, Synapse, Databricks, ADLS), Snowflake (warehouse architecture, performance tuning), and Databricks (Delta Lake, Spark).
- Experience with cloud-based ETL/ELT pipelines, data ingestion, orchestration, and workflow automation.
3. Programming
- Hands-on experience in Python or similar scripting languages for data processing and automation.
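A minimal sketch of the kind of scripting for data processing and automation mentioned above, using only the standard library. The CSV payload and metric are invented for illustration.

```python
import csv
import io
from statistics import mean

# A small CSV payload standing in for a report exported by a BI tool.
raw = """date,latency_ms
2024-01-01,120
2024-01-02,95
2024-01-03,143
"""

# Parse the rows and compute a summary metric -- the bread and butter
# of data-processing automation scripts.
reader = csv.DictReader(io.StringIO(raw))
latencies = [int(row["latency_ms"]) for row in reader]
avg = mean(latencies)
print(f"average latency: {avg:.1f} ms")  # average latency: 119.3 ms
```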
Soft Skills
- Strong leadership and team management skills.
- Excellent verbal and written communication for stakeholder alignment.
- Structured problem-solving and decision-making capability.
- Ability to manage ambiguity and handle multiple priorities.
Tools & Platforms
- Cloud: AWS, Azure
- Data Platforms: Snowflake, Databricks
- BI Tools: Power BI, Tableau, Qlik
- Data Management: Oracle, Informatica, SAP BO
- Project Tools: JIRA, MS Project, Confluence
Key Responsibilities:
- End-to-End Project Management: Lead the team through the full project lifecycle, delivering techno-functional solutions.
- Methodology Expertise: Apply Agile, PMP, and other frameworks to ensure effective project execution and resource management.
- Technology Integration: Oversee technology integration and ensure alignment with business goals.
- Stakeholder & Conflict Management: Manage relationships with customers, partners, and vendors, addressing expectations and conflicts proactively.
- Technical Guidance: Provide expertise in software design, architecture, and ensure project feasibility.
- Change Management: Analyse new requirements/change requests, ensuring alignment with project goals.
- Effort & Cost Estimation: Estimate project efforts and costs and identify potential risks early.
- Risk Mitigation: Proactively identify risks and develop mitigation strategies, escalating issues in advance.
- Hands-On Contribution: Participate in coding, code reviews, testing, and documentation as needed.
- Project Planning & Monitoring: Develop detailed project plans, track progress, and monitor task dependencies.
- Scope Management: Manage project scope, deliverables, and exclusions, ensuring technical feasibility.
- Effective Communication: Communicate with stakeholders to ensure agreement on scope, timelines, and objectives.
- Reporting: Provide status and RAG reports, proactively addressing risks and issues.
- Change Control: Manage changes in project scope, schedule, and costs using appropriate verification techniques.
- Performance Measurement: Measure project performance with tools and techniques to ensure progress.
- Operational Process Management: Oversee operational tasks like timesheet approvals, leave, appraisals, and invoicing.
Greetings! Wissen Technology is hiring for the position of Data Engineer.
Please find the job description below for your reference:
Job Description:
- Design, develop, and maintain data pipelines on AWS EMR (Elastic MapReduce) to support data processing and analytics.
- Implement data ingestion processes from various sources including APIs, databases, and flat files.
- Optimize and tune big data workflows for performance and scalability.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Manage and monitor EMR clusters, ensuring high availability and reliability.
- Develop ETL (Extract, Transform, Load) processes to cleanse, transform, and store data in data lakes and data warehouses.
- Implement data security best practices to ensure data is protected and compliant with relevant regulations.
- Create and maintain technical documentation related to data pipelines, workflows, and infrastructure.
- Troubleshoot and resolve issues related to data processing and EMR cluster performance.
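The extract/transform/load pattern described in the responsibilities above can be sketched in pure Python. On EMR this logic would typically run as a Spark job; the records, field names, and helper function here are illustrative assumptions, not a real API.

```python
# Extract: records as they might arrive from an upstream API.
RAW_EVENTS = [
    {"user": " alice ", "amount": "10.5"},
    {"user": "BOB", "amount": "not-a-number"},  # dirty row
    {"user": "carol", "amount": "3"},
]

def clean_record(rec):
    """Transform: normalise the user field and coerce amount to float."""
    try:
        return {"user": rec["user"].strip().lower(),
                "amount": float(rec["amount"])}
    except ValueError:
        return None  # drop rows that fail validation

# Load target; a stand-in for Redshift or a data-lake table.
warehouse = []

for rec in RAW_EVENTS:
    cleaned = clean_record(rec)
    if cleaned is not None:
        warehouse.append(cleaned)

print(warehouse)
# [{'user': 'alice', 'amount': 10.5}, {'user': 'carol', 'amount': 3.0}]
```

The cleansing step silently drops invalid rows here; a production pipeline would more likely route them to a quarantine table for inspection.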
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in data engineering, with a focus on big data technologies.
- Strong experience with AWS services, particularly EMR, S3, Redshift, Lambda, and Glue.
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with big data frameworks and tools such as Hadoop, Spark, Hive, and Pig.
- Solid understanding of data modeling, ETL processes, and data warehousing concepts.
- Experience with SQL and NoSQL databases.
- Familiarity with CI/CD pipelines and version control systems (e.g., Git).
- Strong problem-solving skills and the ability to work independently and collaboratively in a team environment.
Below is a list of preferred requirements. Bear in mind that we always favor talent, energy, and a
history of being a “top performer” over any specific skill set.
● BTech/MS in Computer Science or related fields, or extensive software development experience
● Should have 4-9 years of experience in TIBCO BusinessWorks
● Must be hands-on with TIBCO BusinessWorks 5.x and 6.x, along with TIBCO Administrator, TIBCO EMS, FTP/SFTP palettes, XML technologies (XML, WSDL, XSD), Web Services (SOAP over JMS), and REST services
● Should be able to work with business stakeholders, understand the business requirements, and create the necessary TIBCO integration interfaces
● Understanding of Agile workflow
Should have good hands-on knowledge of:
1- Product uploading, product listing, catalogue management, and product searching.
2- Candidate should have good knowledge of MS Excel.
3- Candidate should have good typing speed.
4- Candidate should be proactive in approach.
5- Basic knowledge of photo editing is required.
…a testament to our work and the efforts put in to provide reliable and affordable technology to our customers in just a few clicks. zillionsbuyer is known for its professionalism, integrity, transparency, and total commitment to timely deliveries.

