Warm Welcome to Production and Quality Engineering
Job Summary:
We are currently hiring for the position of Production and Quality Maintenance Engineer, based in Chennai. Ref: Pandiyan
Please refer to the job description below and walk in for this top MNC.
RESPONSIBILITIES:
Maximize the productivity of machinery and workers.
Design and implement cost-reduction changes.
Determine quality metrics for all manufacturing procedures.
Monitor the entire production cycle and report on malfunctions.
Designation: GAT, NEEM, GET, TRAINEE, QUALITY, PRODUCTION, CHEMICAL, MAINTENANCE, SUPERVISOR (ON-ROLE)
Qualification: Diploma, B.E./B.Tech – Mechanical Engineering, EEE, ECE, Automobile Engineering.
Work Location: Chennai
Interview Date: ongoing until 27th December
Last date to apply: 30th January
Experience: 0 to 3 years
Nature of Job: Production or Quality Maintenance
Job Type: Full-time
Benefits:
Food and Transport
ESI + PF Available
8 hours of duty
Similar jobs
Job Summary:
We are looking for a dynamic and detail-oriented HR Executive to support the human resources department in all facets of HR operations. This role involves managing employee relations, recruitment, performance management, training, and administrative duties to ensure smooth HR processes and a positive work environment.
Qualifications:
- Bachelor’s degree in Human Resources, Business Administration, or related field.
- 3+ years of experience in an HR role, preferably in a fast-paced environment or organization.
- Experience with HRMS/HRIS (Human Resource Management Systems) and MS Office Suite (Excel, Word, PowerPoint) is preferred.
Skills and Abilities:
- Strong knowledge of HR practices and labor laws.
- Excellent communication and interpersonal skills.
- Ability to maintain confidentiality and handle sensitive information.
- Strong organizational skills with attention to detail.
- Ability to handle multiple tasks and prioritize effectively.
Job Description:
As an Azure Data Engineer, your role will involve designing, developing, and maintaining data solutions on the Azure platform. You will be responsible for building and optimizing data pipelines, ensuring data quality and reliability, and implementing data processing and transformation logic. Your expertise in Azure Databricks, Python, SQL, Azure Data Factory (ADF), PySpark, and Scala will be essential for performing the following key responsibilities:
Designing and developing data pipelines: You will design and implement scalable and efficient data pipelines using Azure Databricks, PySpark, and Scala. This includes data ingestion, data transformation, and data loading processes.
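To make the pipeline work above concrete, here is a minimal PySpark sketch of an ingest/transform/load job; the paths, column names, and schema are hypothetical assumptions, not details from an actual project.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingest_orders").getOrCreate()

# Ingest: read raw CSV files landed in the lake (path is hypothetical)
raw = spark.read.option("header", True).csv("/mnt/landing/orders/*.csv")

# Transform: cast types, drop rows missing the key, derive a date column
orders = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("order_id").isNotNull())
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write partitioned Parquet for downstream consumers
orders.write.mode("overwrite").partitionBy("order_date").parquet("/mnt/curated/orders")
```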
Data modeling and database design: You will design and implement data models to support efficient data storage, retrieval, and analysis. This may involve working with relational databases, data lakes, or other storage solutions on the Azure platform.
Data integration and orchestration: You will leverage Azure Data Factory (ADF) to orchestrate data integration workflows and manage data movement across various data sources and targets. This includes scheduling and monitoring data pipelines.
Data quality and governance: You will implement data quality checks, validation rules, and data governance processes to ensure data accuracy, consistency, and compliance with relevant regulations and standards.
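As a hedged illustration of such checks, a small PySpark helper; the key column, amount column, and failure rules are assumptions made for the sketch.

```python
from pyspark.sql import functions as F

def run_quality_checks(df, key_col="order_id", amount_col="amount"):
    """Toy validation rules: nulls in the key, duplicate keys, negative amounts."""
    total = df.count()
    failures = {
        "null_keys": df.filter(F.col(key_col).isNull()).count(),
        "duplicate_keys": total - df.dropDuplicates([key_col]).count(),
        "negative_amounts": df.filter(F.col(amount_col) < 0).count(),
    }
    bad = {name: count for name, count in failures.items() if count > 0}
    if bad:
        raise ValueError(f"Data quality checks failed: {bad}")
    return True
```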
Performance optimization: You will optimize data pipelines and queries to improve overall system performance and reduce processing time. This may involve tuning SQL queries, optimizing data transformation logic, and leveraging caching techniques.
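A minimal sketch of two of those levers, caching a reused dimension table and repartitioning on the join key before a heavy join; all paths and column names are hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("optimize_join").getOrCreate()

# Cache a small dimension table that is reused across several joins
dim_customers = spark.read.parquet("/mnt/curated/customers").cache()

# Repartition the large fact table on the join key to reduce shuffle skew
orders = spark.read.parquet("/mnt/curated/orders").repartition(200, "customer_id")

enriched = orders.join(dim_customers, "customer_id", "left")
enriched.write.mode("overwrite").parquet("/mnt/curated/orders_enriched")
```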
Monitoring and troubleshooting: You will monitor data pipelines, identify performance bottlenecks, and troubleshoot issues related to data ingestion, processing, and transformation. You will work closely with cross-functional teams to resolve data-related problems.
Documentation and collaboration: You will document data pipelines, data flows, and data transformation processes. You will collaborate with data scientists, analysts, and other stakeholders to understand their data requirements and provide data engineering support.
Skills and Qualifications:
Strong experience with Azure Databricks, Python, SQL, ADF, PySpark, and Scala.
Proficiency in designing and developing data pipelines and ETL processes.
Solid understanding of data modeling concepts and database design principles.
Familiarity with data integration and orchestration using Azure Data Factory.
Knowledge of data quality management and data governance practices.
Experience with performance tuning and optimization of data pipelines.
Strong problem-solving and troubleshooting skills related to data engineering.
Excellent collaboration and communication skills to work effectively in cross-functional teams.
Understanding of cloud computing principles and experience with Azure services.
- 3+ years of experience applying AI/ML/NLP/deep learning and data-driven statistical analysis and modelling solutions.
- Programming skills in Python and knowledge of statistics.
- Hands-on experience developing supervised and unsupervised machine learning algorithms (regression, decision trees/random forest, neural networks, feature selection/reduction, clustering, parameter tuning, etc.); a minimal sketch follows after this list. Familiarity with reinforcement learning is highly desirable.
- Experience in the financial domain and familiarity with financial models are highly desirable.
- Experience in image processing and computer vision.
- Experience with building data pipelines.
- Good understanding of data preparation, model planning, model training, model validation, model deployment, and performance tuning.
- Should have hands-on experience with some of these methods: regression, decision trees, CART, random forest, boosting, evolutionary programming, neural networks, support vector machines, ensemble methods, association rules, principal component analysis, clustering, artificial intelligence.
- Should have experience working with large data sets in a Postgres database.
- C++ development experience in an IT context.
- Strong experience in CAD programming.
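As a minimal sketch of the hands-on methods listed above, here is a scikit-learn example of a random forest with cross-validated parameter tuning; the synthetic dataset stands in for a real problem.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic classification data in place of a real, domain-specific dataset
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Parameter tuning via cross-validated grid search over a small grid
grid = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10]},
    cv=5,
)
grid.fit(X_train, y_train)
print("best params:", grid.best_params_)
print("test accuracy:", grid.score(X_test, y_test))
```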
Required:
- Strong background in, and at least 3 years of experience working in, tooling or QA automation
- Thorough understanding of SDLC, specifically automated QA processes in agile development environments
- Experience in writing, executing and monitoring automated test suites using a variety of technologies
- Proficient with bug tracking and test management toolsets to support development processes
- Strong working knowledge of testing fundamentals such as TDD & BDD; a minimal pytest sketch follows after this list
- Proficient working with relational databases such as MySQL & PostgreSQL
- Some knowledge of Unix/Linux
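To illustrate the TDD-style testing fundamentals above, a minimal pytest sketch; the function under test, normalize_email, is hypothetical and exists only for the example.

```python
import pytest

def normalize_email(raw: str) -> str:
    """Hypothetical unit under test: trim whitespace and lowercase the address."""
    if not raw or not raw.strip():
        raise ValueError("empty email")
    return raw.strip().lower()

def test_normalize_email_lowercases_and_trims():
    assert normalize_email("  Alice@Example.COM ") == "alice@example.com"

def test_normalize_email_rejects_empty():
    with pytest.raises(ValueError):
        normalize_email("   ")
```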
Desired Skills:
- Building test infrastructures using containerization technologies such as Docker, and working within continuous delivery / continuous release pipeline processes
- Testing enterprise applications deployed to cloud environments such as Google Cloud Platform
- Experience mentoring QA staff and end users on quality objectives and testing processes
- Understanding of coding enterprise applications within Java, PHP, and other languages
- Understanding of NoSQL database technologies such as MongoDB or DynamoDB
- Degree-level qualifications in a technical subject
- Proactive 'self-starter' attitude
- Lifelong learner - thrives from developing and sharing knowledge
Job Summary
- BS/BE/BCA/MSC/MCA degree in Computer Science, Engineering or a related subject
- Hands-on experience designing and developing applications using Java EE platforms is preferable
- Object-oriented analysis and design using common design patterns
- Profound insight into Java and J2EE internals
- Excellent knowledge of Relational Databases and SQL
- Experience in developing web applications using at least one popular web framework (JSF, HTML5, MVC)
- Knowledge of microservices and containers/Docker would be an added advantage
- Knowledge of data science is preferred
- Exposure to building APIs, REST services, and web services
- Exposure to open-source tools like TensorFlow, NiFi, StreamPipes, etc.
- Experience with test-driven development
- Good communication skills and client-oriented attitude
- Organized and detail-oriented person
- Problem solving skills, analytical mind and positive attitude
- Results oriented and focused on meeting deliverable timelines
- Availability to travel, if needed
- Fluency in English is a must
Responsibilities and Duties
- Design and develop features and modules for mission-critical applications
- Build modules on MES products (SAP, Apriso, Rockwell, etc.)
- Contribute to all phases of the development lifecycle
- Write well-designed, testable, efficient code
- Ensure designs are in compliance with specifications
- Prepare and produce releases of software components
- Support continuous improvement by investigating alternatives and technologies
Required Experience, Skills and Qualifications
2-5 years of hands-on software development experience using the technologies mentioned below:
- Java / J2EE
- EJB, JSF, Servlets
- HTML, HTML5
- SQL server / Oracle
- JSON, web services, etc.
Benefits
- Candidates will be trained on SAP modules.
- Industry-best pay.
Company Profile:
"Founded in 2012, Pay1 empowers neighbourhood retail businesses & micro-entrepreneurs by adding revenues streams and providing them access to new products, credit and technology. With a network of 300k+ merchants spread across 350+ cities in India, Pay1 is at the forefront of combining retail, finance, travel and consumer technology to create an integrated business ecosystem unique to the Indian markets.
The platform has a wide array of distinguished services such as Digital Payments, Banking Services, Travel Booking Services, Financial services (MFs. Insurance, loans, gold investments), Remittances, Recharges, Bill Payments, B2B commerce and many more. The diverse selections of services provide our merchant network with immense growth opportunities that have been previously unexplored in the Indian retail scenario."
Job Description:
- Establish strategic tie-ups with large and medium-sized merchant aggregators to enable the Digital Lending services offered.
- Create value propositions and pitch presentations, and design SLAs for business partnerships
- Onboard new partners and act as an interface between internal teams and partners for system and process support
- Negotiate commercials and contracts with prospective business partners and structure financial proposals with the best value proposition
- Key Account Management with Merchant Aggregators for New Business Opportunities / Enhancements
- Suggest measures and solutions for tapping opportunities such as product customization and innovation.
- Keep abreast of the Digital Lending landscape, competitors and business needs.
- Resolve Digital Lending product-related issues with partners
Intro
Our data and risk team is the core pillar of our business, harnessing alternative data sources to guide the decisions we make at Rely. The team designs, architects, develops, and maintains the scalable data platform that powers our machine learning models. Be part of a team that will help millions of consumers across Asia be effortlessly in control of their spending and make better decisions.
What will you do
The data engineer is focused on making data correct and accessible, and building scalable systems to access/process it. Another major responsibility is helping AI/ML Engineers write better code.
- Optimize and automate ingestion processes for a variety of data sources, such as click-stream, transactional, and many others.
- Create and maintain optimal data pipeline architecture and ETL processes
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Develop data pipeline and infrastructure to support real-time decisions
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS 'big data' technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs.
What will you need
- 2+ years of hands-on experience building and implementing large-scale production pipelines and data warehouses
- Experience dealing with large-scale data
- Proficiency in writing and debugging complex SQL queries
- Experience working with AWS big data tools
- Ability to lead projects and implement best data practices and technologies
Data Pipelining
- Strong command of building and optimizing data pipelines, architectures, and data sets
- Strong command of relational SQL and NoSQL databases, including Postgres
- Data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. (a minimal Airflow sketch follows below)
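A minimal sketch of one of the workflow-management tools above (Airflow); the DAG id, schedule, and task bodies are placeholders, not a production pipeline.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from source")  # placeholder; real extraction logic would live here

def load():
    print("write to warehouse")  # placeholder load step

with DAG(
    dag_id="example_etl",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load  # extract runs before load
```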
Big Data: Strong experience in big data tools & applications
- Tools: Hadoop, Spark, HDFS, etc.
- AWS cloud services: EC2, EMR, RDS, Redshift
- Stream-processing systems: Storm, Spark Streaming, Flink, etc.
- Message queuing: RabbitMQ, etc.
Software Development & Debugging
- Strong experience in object-oriented programming and object-function scripting languages: Python, Java, C++, Scala, etc.
- Strong hold on data structures and algorithms
What would be a bonus
- Prior experience working in a fast-growth Startup
- Prior experience at payments, fraud, lending, or advertising companies dealing with large-scale data