11+ Catia V5R19 Jobs in Chennai | Catia V5R19 Job openings in Chennai
Apply to 11+ Catia V5R19 Jobs in Chennai on CutShort.io. Explore the latest Catia V5R19 Job opportunities across top companies like Google, Amazon & Adobe.

Job Summary:
Welcome to AA MANPOWER SOLUTIONS.
Your resume makes you an excellent candidate for the following job. We would like to invite you to an interview with our organization head for immediate openings in auto parts manufacturing companies.
Job Duties:
- Develop and establish quality procedures and systems covering inspection plans, quality trends, statistical plans, price estimates, and technical quality proposals.
- Coordinate with suppliers and customers relating to quality trends, performance and corrective action.
- Verify conformance and productivity of the quality engineering system through supplier audits and surveys.
- Interact with the Quality Control Packaging team to conduct the Technician Training Program after evaluating quality inspection techniques.
Qualification:
B.E./B.Tech (Mech)
Function of Work:
Quality Control, Quality Assurance, Quality Engineering
SALARY:
10% to 15% hike on previous salary
Eligibility:
8 years and above
Walk-In Interview:
06 Nov to 30 Nov (Sunday holiday)
Contact: Pandiyan (HR)
Venue:
AA MANPOWER SOLUTIONS.
No.24, F1, First Floor,
Bajanai Kovil 2nd street,
Vadapalani,
Chennai-600026.
Landmark: behind SIMS Hospital (above the South Indian Movie Still Camera Man Association).
KEY RESPONSIBILITIES:
Sales function:
* Acquisition of New Clients
* Responsible for business development for General Insurance products including Mediclaim, Asset, Fire, Liability, Engineering, Marine, etc.
* Create and execute strategies to explore new potential markets and retain existing clients.
* Driving the sales team on initial contact for enquiries, RFQs, and mandate letters.
* Giving a clear understanding of the pricing and features of the product and their impact on profitability.
Relationship management
* Handling corporate customers and maintaining excellent relationships with them.
* Leveraging relationships with corporates and focusing on closures.
* Liaising with insurers and TPAs to provide better services and quotes to clients.
Other Functions:
* Coordinating with HR & Finance for Corporate Data.
* Administering documentation guidelines during policy issuance and claims servicing.
* Tracking competitor activities to understand market trends and take proactive action.
* Analyzing clients' risk portfolios and suggesting ways to mitigate losses and maintain a healthy bottom line.
Personal Skills
* At least 3-10 years of experience in B2B sales and customer-facing role
* Very strong written and verbal communication
* High customer empathy
* Strong problem solver
* Ability to work with cross-functional teams to resolve issues
* Experience in outbound prospecting, cold calling, and managing a sales pipeline
* Previous experience mainly in hunting, new client acquisition, and sales closures
* Experience having sold to Founders, CHROs & CFOs is a plus
* Consistently meeting assigned sales targets
* Strong spoken and written communication skills in English
Experience/skills required
Key Qualifications
- Developer Role:
- Preferably Java
- CI/CD - DevOps (exposure)
- Messaging middleware (exposure to Kafka or any other messaging middleware; a short sketch follows this list)
- DB: Oracle (preferred) or any other database platform (SQL/NoSQL)
- Server side: Java, Spring Boot microservices
- Exposure to any major cloud platform (AWS/Azure/GCP)
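As a hedged illustration of the messaging-middleware exposure mentioned above, here is a minimal Kafka producer/consumer round trip in Python using the kafka-python client. The broker address and topic name are assumptions for the example, not details from the posting.

```python
# Minimal Kafka round-trip sketch (assumes a broker at localhost:9092
# and a topic named "orders"; both are illustrative placeholders).
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(bootstrap_servers="localhost:9092")
producer.send("orders", b'{"order_id": 1, "status": "created"}')
producer.flush()  # ensure the message actually leaves the client buffer

consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",  # read from the beginning for the demo
    consumer_timeout_ms=5000,      # stop iterating once the topic is drained
)
for message in consumer:
    print(message.topic, message.offset, message.value)
```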
Roles and Responsibilities
§ Writing, editing and publishing engaging content for various social networks, including Instagram, LinkedIn, Twitter and Facebook
§ Tracking and reporting on social media responses by analyzing traffic to the site
§ Update our social media pages with compelling company news, events, celebrations or solutions and services
§ Respond to questions and comments on our social media pages in a timely and accurate manner
§ Maintaining a visible online presence on social media and adhering to the marketing goals
§ Increasing the company's awareness and readership by using relevant advertising techniques
§ Optimize social media content (language, message, tone) on the basis of the behaviour of our target audience
§ Familiarity with paid advertising and awareness of different websites and tools for creating graphics, images and video content
§ To create a range of engaging digital content such as marketing and promotional videos and Reels to be used across a variety of online platforms and social media channels.
§ To help us maximise our online profile and develop our ever-growing social media presence through the creation of video driven strategies and digital media content.
§ Pay attention to trends on Instagram Reels, creating and analyzing what works for our business.
§ Coordinate with the creative department to create advertising/engagement posts (e.g. for Company Events, Job posts, etc.)
§ Contribute to the creation of various content materials including: Blogs, Articles, Website Pages, Email templates, Social Media posts etc
§ Develop a deep understanding of MHFAI’s business, brand voice, and target audiences.
§ Develop posts and articles on a wide range of topics that resonate with our target audience and drive traffic to our website
§ Proofread and edit content to ensure high-quality standards, including grammar, spelling, and punctuation
§ Continuously discover and implement new editing technologies and industry best practices to maximize efficiency
Requirements and skills
§ 3+ years of solid experience in digital content marketing or related fields
§ Excellent written and oral Communication skills
§ The ability to create original web content with images or videos
§ Proven work experience as a Social Media content writer or a similar role
§ Hands-on experience using various social media platforms to advertise
§ Proven record of excellent writing demonstrated in a professional portfolio
§ Ability to work independently with little or no daily supervision
§ Working knowledge of SEO, analytics tools, and keyword research
§ Strong research skills and the ability to synthesize complex information into clear and concise content
§ Creative thinking and the ability to generate innovative ideas
👋🏼We're Nagarro.
We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (19000+ experts across 33 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!
REQUIREMENTS:
- Bachelor's/master’s degree or equivalent experience in computer science
- Overall 10-12 years of experience, with at least 4 years of experience with the Jitterbit Harmony platform and Jitterbit Cloud.
- Should have experience technically leading and grooming developers who might be geographically distributed.
- Knowledge of Change & Incident Management process (JIRA etc.)
RESPONSIBILITIES:
- Responsible for end-to-end implementation of integration use cases using the Jitterbit platform.
- Coordinate with all the stakeholders for successful project execution.
- Responsible for requirement gathering, integration strategy, design, implementation, etc.
- Should have strong hands-on experience in designing, building, and deploying integration solutions using the Jitterbit Harmony platform.
- Should have developed enterprise services using REST-based APIs, SOAP web services, and different Jitterbit connectors (Salesforce, DB, JMS, File connector, HTTP/HTTPS connectors, any TMS connector).
- Should have knowledge of Custom Jitterbit Plugins and Custom Connectors.
- Experience in Jitterbit implementations including security, logging, error handling, scalability and clustering.
- Strong experience in Jitterbit Script, XSLT and JavaScript.
- Install, configure and deploy solutions using Jitterbit.
- Provide test support for bug fixes during all stages of test cycle.
- Provide support for deployment and post go-live.
- Knowledge of professional software engineering practices and best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, and testing.
- Understand the requirements, create necessary documentation, give presentations to clients and get necessary approvals and create design doc for the release.
- Estimate the tasks and discuss with the clients on Risks/Issues.
- Work on the specific module independently and test the application; perform code reviews and suggest best practices to the team.
- Create necessary documentation, give presentations to clients and get necessary approvals.
- Broad knowledge of web standards relating to APIs (OAuth, SSL, CORS, JWT, etc.); a short JWT sketch follows this list.
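As a hedged illustration of the JWT part of those API standards, here is a minimal issue-and-verify sketch in Python using the PyJWT library. The secret key, claims, and expiry below are placeholders chosen for the example, not values from the posting.

```python
# Minimal JWT issue/verify sketch using PyJWT (pip install pyjwt).
# The secret key and claims are illustrative placeholders.
import datetime
import jwt

SECRET = "change-me"  # in practice, sourced from a secrets manager

# Issue a short-lived token for a hypothetical integration client.
token = jwt.encode(
    {
        "sub": "integration-client",
        "exp": datetime.datetime.now(datetime.timezone.utc)
               + datetime.timedelta(minutes=15),
    },
    SECRET,
    algorithm="HS256",
)

# Verify on the receiving side; raises if the token is expired or tampered with.
claims = jwt.decode(token, SECRET, algorithms=["HS256"])
print(claims["sub"])
```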
Responsibilities:
• Designing Hive/HCatalog data models, including table definitions, file formats, and compression techniques for structured and semi-structured data processing
• Implementing Spark-based ETL processing frameworks
• Implementing Big data pipeline for Data Ingestion, Storage, Processing & Consumption
• Modifying the Informatica-Teradata & Unix based data pipeline
• Enhancing the Talend-Hive/Spark & Unix based data pipelines
• Develop and deploy Scala/Python-based Spark jobs for ETL processing (see the sketch after this list)
• Strong SQL and DWH concepts.
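As a hedged sketch of the kind of Spark ETL job described above, the snippet below reads raw CSV, applies a simple cleansing step, and writes a partitioned, Snappy-compressed Parquet table into the Hive metastore. The input path, table name, and column names are assumptions made for illustration.

```python
# Minimal PySpark ETL sketch: raw CSV -> partitioned Parquet table in Hive.
# Input path, table name, and column names are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("sample-etl")
    .enableHiveSupport()   # lets saveAsTable register the table in the metastore
    .getOrCreate()
)

# Ingest: read raw data with a header row.
raw = spark.read.option("header", True).csv("/data/raw/orders/")

# Transform: basic cleansing and a derived partition column.
orders = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: columnar format with compression, partitioned for downstream consumption.
(
    orders.write
    .mode("overwrite")
    .format("parquet")
    .option("compression", "snappy")
    .partitionBy("order_date")
    .saveAsTable("analytics.orders")
)
```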
Preferred Background:
• Function as integrator between business needs and technology solutions, helping to create technology solutions to meet clients’ business needs
• Lead project efforts in defining scope, planning, executing, and reporting to stakeholders on strategic initiatives
• Understanding of the business's EDW systems and creating high-level design documents and low-level implementation documents
• Understanding of the business's Big Data Lake systems and creating high-level design documents and low-level implementation documents
• Designing Big data pipeline for Data Ingestion, Storage, Processing & Consumption
Counseling students on study-abroad options.
Handling enquiries and completing enrollments.
Achieving monthly sales targets, cold calling.
Work Schedule: 6 days a week (including weekends), any 1 fixed weekly off.
11am-7pm / 12pm - 8pm - Mon to Fri
10am to 7pm / 11am to 8pm - Sat and Sun
Job Description:
○ Develop best practices for the team; also responsible for architecture, solutions, and documentation operations in order to meet the engineering department's quality standards
○ Participate in production outages, handle complex issues, and work towards resolution
○ Develop custom tools and integrations with existing tools to increase engineering productivity
Required Experience and Expertise
○ Good knowledge of Terraform and experience working on large TF code bases.
○ Deep understanding of Terraform best practices and writing TF modules.
○ Hands-on experience with GCP and AWS, and knowledge of AWS services such as VPC and related services (route tables, VPC endpoints, PrivateLink), EKS, S3, and IAM. A cost-aware mindset towards cloud services.
○ Deep understanding of kernel, networking and OS fundamentals
NOTICE PERIOD: max 30 days

Location: Chennai- Guindy Industrial Estate
Duration: Full time role
Company: Mobile Programming (https://www.mobileprogramming.com/)
Client Name: Samsung
We are looking for a Data Engineer to join our growing team of analytics experts. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products.
Responsibilities for Data Engineer
Create and maintain optimal data pipeline architecture.
Assemble large, complex data sets that meet functional / non-functional business requirements.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS big data technologies.
Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
Work with data and analytics experts to strive for greater functionality in our data systems.
Qualifications for Data Engineer
Experience building and optimizing big data ETL pipelines, architectures and data sets.
Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
Strong analytic skills related to working with unstructured datasets.
Experience building processes supporting data transformation, data structures, metadata, dependency and workload management.
A successful history of manipulating, processing and extracting value from large disconnected datasets.
Working knowledge of message queuing, stream processing and highly scalable ‘big data’ data stores.
Strong project management and organizational skills.
Experience supporting and working with cross-functional teams in a dynamic environment.
We are looking for a candidate with 3-6 years of experience in a Data Engineer role, who has attained a graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
Experience with big data tools: Spark, Kafka, HBase, Hive, etc.
Experience with relational SQL and NoSQL databases
Experience with AWS cloud services: EC2, EMR, RDS, Redshift
Experience with stream-processing systems: Storm, Spark Streaming, etc. (see the sketch after this list)
Experience with object-oriented / functional scripting languages: Python, Java, Scala, etc.
Skills: Big Data, AWS, Hive, Spark, Python, SQL
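As a hedged illustration of the stream-processing experience asked for above, here is a minimal Spark Structured Streaming sketch in Python that reads events from a Kafka topic and appends them to a Parquet sink. The broker address, topic name, and paths are assumptions for illustration, and the job assumes the Spark Kafka connector package is available on the cluster.

```python
# Minimal Spark Structured Streaming sketch: Kafka topic -> Parquet sink.
# Broker address, topic name, and output paths are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Read a continuous stream of records from Kafka.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka delivers key/value as binary; cast the payload to strings for downstream parsing.
parsed = events.select(
    F.col("key").cast("string"),
    F.col("value").cast("string"),
    "timestamp",
)

# Append each micro-batch to a Parquet sink, with checkpointing for fault tolerance.
query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "/data/streams/events/")
    .option("checkpointLocation", "/data/checkpoints/events/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```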
