Huge Openings for Production, Quality, and Maintenance Engineering
JOB SUMMARY...
Welcome to AA MANPOWER SOLUTIONS,
Your resume makes you an excellent candidate for the following job. We would like to invite you to an interview with our Organization Head for immediate openings in auto parts manufacturing companies.
Job Responsibilities:
- Provide engineering support for production and maintenance activities to ensure maximum production.
- Determine quality metrics for all manufacturing procedures.
- Identify the root cause of technical issues and recommend fixes.
- Plan and undertake scheduled maintenance.
- Manage stocks of supplies and equipment.
- Research and draft blueprints, engineering plans, and graphics.
Qualification:
B.E./B.Tech (Mech, EEE, ECE)
Function of Work:
Production, Quality, Maintenance, Design Engineering
Role:
GET, NEEM, On Role
SALARY:
₹15,000 to ₹20,000
Eligibility:
1 to 3 years
Walk In Interview
09 Nov to 15 Nov (Sunday Holiday)
Pandiyan(HR)
Venue:
AA MANPOWER SOLUTIONS.
No.24, F1, First Floor,
Bajanai Kovil 2nd street,
Vadapalani,
Chennai-600026.
Landmark: behind SIMS Hospital.
(Above the South Indian Movie Still Camera Man Association)

Profile: Big Data Engineer (System Design)
Experience: 5+ years
Location: Bangalore
Work Mode: Hybrid
About the Role
We're looking for an experienced Big Data Engineer with system design expertise to architect and build scalable data pipelines and optimize big data solutions.
Key Responsibilities
- Design, develop, and maintain data pipelines and ETL processes using Python, Hive, and Spark
- Architect scalable big data solutions with strong system design principles
- Build and optimize workflows using Apache Airflow
- Implement data modeling, integration, and warehousing solutions
- Collaborate with cross-functional teams to deliver data solutions
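The pipeline responsibilities above follow the classic extract-transform-load pattern. As a quick illustration (a plain-Python sketch rather than real Airflow or Spark code; all function names and sample data are made up):

```python
# Minimal sketch of the extract-transform-load pattern behind a data pipeline.
# All names and the sample data are illustrative, not from any real system.

def extract():
    """Pretend source: raw order events as dicts."""
    return [
        {"order_id": 1, "amount": "19.99", "country": "IN"},
        {"order_id": 2, "amount": "5.00", "country": "US"},
        {"order_id": 3, "amount": "bad", "country": "IN"},  # malformed row
    ]

def transform(rows):
    """Clean and type the rows, dropping records that fail validation."""
    cleaned = []
    for row in rows:
        try:
            cleaned.append({**row, "amount": float(row["amount"])})
        except ValueError:
            continue  # a real pipeline would route this to a dead-letter sink
    return cleaned

def load(rows, sink):
    """Append cleaned rows to the sink (a list standing in for a warehouse table)."""
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

In an orchestrator such as Airflow, each step would typically become its own task in a DAG, with dependencies declared so that extract runs before transform and transform before load.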
Must-Have Skills
- 5+ years as a Data Engineer with Python, Hive, and Spark
- Strong hands-on experience with Java
- Advanced SQL and Hadoop experience
- Expertise in Apache Airflow
- Strong understanding of data modeling, integration, and warehousing
- Experience with relational databases (PostgreSQL, MySQL)
- System design knowledge
- Excellent problem-solving and communication skills
Good to Have
- Docker and containerization experience
- Knowledge of Apache Beam, Apache Flink, or similar frameworks
- Cloud platform experience.
Experience: 4+ years.
Location: Vadodara & Pune
Skill Set: Snowflake, Power BI, ETL, SQL, Data Pipelines
What you'll be doing:
- Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
- Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion.
- Create and optimize ETL/ELT workflows using tools like DBT, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
- Tune query performance, warehouse sizing, and pipeline efficiency by utilizing Snowflake's Query Profiling, Resource Monitors, and other diagnostic tools.
- Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
- Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
- Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
- Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
- Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
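The streaming-ingestion duties above boil down to micro-batching: buffering events from a stream and flushing them to the warehouse once a threshold is reached, which is roughly what Snowpipe does with staged files. A stdlib-only Python stand-in (no real Kafka or Snowflake client is assumed; all names are illustrative):

```python
# Sketch of micro-batch ingestion in the spirit of Kafka -> Snowpipe:
# buffer incoming events and flush them as a batch once a size threshold
# is reached. The list of loaded batches stands in for COPY INTO loads.

class MicroBatchLoader:
    def __init__(self, flush_size=3):
        self.flush_size = flush_size
        self.buffer = []
        self.loaded_batches = []  # stands in for warehouse-side loads

    def ingest(self, event):
        """Buffer one event; flush automatically when the batch is full."""
        self.buffer.append(event)
        if len(self.buffer) >= self.flush_size:
            self.flush()

    def flush(self):
        """Drain the buffer into a new batch (no-op when empty)."""
        if self.buffer:
            self.loaded_batches.append(list(self.buffer))
            self.buffer.clear()

loader = MicroBatchLoader(flush_size=3)
for i in range(7):
    loader.ingest({"event_id": i})
loader.flush()  # drain the final partial batch
```

The trade-off a real implementation tunes is the same one sketched here: larger batches amortize per-load overhead, while smaller batches reduce end-to-end latency.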
What you need:
Basic Skills:
- 3+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.
- Strong experience with Apache Kafka for stream processing and real-time data integration.
- Proficiency in SQL and ETL/ELT processes.
- Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
- Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
- Knowledge of data governance, security, and compliance best practices.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
- Ability to work in a collaborative team environment and communicate effectively with cross-functional teams
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
- Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, DBT, or custom scripts, to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.
- Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
- Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
- Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
- Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.
- Pipeline Monitoring and Optimization: Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
- Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
- Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
- Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.
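The automated data-validation framework mentioned above can be sketched in a few lines: each check is a named predicate run over every row, and failures are collected rather than silently loaded. A stdlib-only sketch with illustrative names and sample data:

```python
# Minimal sketch of an automated data-validation pass: checks are
# (name, predicate) pairs applied to each row, and every failure is
# recorded as (row_index, check_name) instead of being loaded silently.

def not_null(field):
    """Check that a field is present and not None."""
    return lambda row: row.get(field) is not None

def in_range(field, lo, hi):
    """Check that a numeric field falls within [lo, hi]."""
    return lambda row: row.get(field) is not None and lo <= row[field] <= hi

def validate(rows, checks):
    failures = []
    for i, row in enumerate(rows):
        for name, predicate in checks:
            if not predicate(row):
                failures.append((i, name))
    return failures

rows = [
    {"user_id": 1, "age": 34},
    {"user_id": None, "age": 29},   # fails the not-null check
    {"user_id": 3, "age": 212},     # fails the range check
]
checks = [
    ("user_id not null", not_null("user_id")),
    ("age in 0-120", in_range("age", 0, 120)),
]
failures = validate(rows, checks)
```

In production the same idea usually runs as a pipeline stage (e.g. a dbt test or an Airflow task) that quarantines failing rows and alerts on failure rates.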
🎥 Job Title: Canva AI Video Editor (Remote)
Company: Memoneet
Location: Remote
Job Type: Contract / Part-time / Full-time
Apply here: https://forms.gle/txMLAGPGicJvqm6y6
🎯 Role Overview
As a Canva AI Video Editor at Memoneet, you’ll be responsible for transforming ideas, scripts, and brand messaging into captivating video content using Canva's AI-powered tools. From short-form social content to branded explainers, you'll create visually compelling videos that resonate with our audience and elevate our brand presence.
🛠️ Key Responsibilities
* Create engaging and on-brand videos using Canva’s AI-powered editing features.
* Interpret creative briefs and scripts to produce clear, visually impactful content.
* Leverage tools like AI-generated voiceovers, text-to-video, background removal, and dynamic captioning.
* Adapt content for various platforms (Instagram Reels, YouTube Shorts, LinkedIn, etc.).
* Collaborate with designers, writers, and project managers to deliver high-quality output.
* Maintain consistency with Memoneet’s brand guidelines and storytelling style.
* Stay up to date with Canva updates, video trends, and social media algorithms.
✅ Requirements
1. Proven experience editing videos using Canva, especially its AI features.
2. Strong sense of timing, pacing, and visual storytelling.
3. Portfolio showcasing short-form or branded video content.
4. Understanding of platform-specific video requirements and audience engagement strategies.
5. Creative mindset with the ability to work independently and meet deadlines.
6. Familiarity with Canva’s Brand Kit, templates, and cloud collaboration features.
Bonus: Experience with scriptwriting, motion graphics, or other video tools is a plus.
Company: SMSNiti
Location: Remote (India)
About Us:
SMSNiti is a growing marketing solutions company providing businesses with cutting-edge digital marketing services including bulk SMS, WhatsApp marketing, voice call marketing, and omnichannel engagement platforms. We are passionate about helping small businesses thrive by offering high-quality services at affordable rates.
Job Description:
We are seeking a motivated and dynamic Business Development Officer to join our remote team. The ideal candidate will focus on lead generation, driving sales, managing client accounts, and building strong relationships with customers. This is a work-from-home position with internet reimbursement provided.
Key Responsibilities:
- Lead Generation: Identify potential clients through online research, cold calling, LinkedIn outreach, and networking.
- Sales : Engage with prospects, understand their needs, and convert them into clients by presenting relevant SMSNiti services.
- Account Management: Maintain and nurture relationships with existing clients, addressing their queries and ensuring a positive experience.
- Relationship Building: Develop lasting partnerships with clients, understanding their business needs and providing tailored solutions.
- Market Research: Stay updated on industry trends and competitor offerings to ensure SMSNiti remains competitive.
- Reporting: Track and report on sales activities and progress toward targets.
Key Skills:
- Strong communication and interpersonal skills
- Proven experience in lead generation and sales
- Ability to manage multiple clients effectively
- Self-motivated and results-oriented
- Experience in digital marketing services is a plus
Qualifications:
- Bachelor's degree in Business, Marketing, or a related field
- 1-3 years of experience in sales or business development
- Familiarity with CRM tools, Microsoft Office, and LinkedIn
Perks & Benefits:
- Competitive salary and performance-based incentives
- Internet reimbursement
- Opportunity for career growth in a fast-growing company
- Flexible work-from-home arrangement
How to Apply:
Job Types: Full-time, Permanent
Pay: ₹8,000.00 - ₹16,000.00 per month
Benefits:
Cell phone reimbursement
Work from home
Schedule:
Day shift
Supplemental Pay:
Commission pay
Performance bonus
Yearly bonus
Education:
Bachelor's (Preferred)
Experience:
Marketing / Sales: 2 years (Preferred)
total work: 2 years (Preferred)
Language:
English (Preferred)
Work Location: Remote
LogiNext is looking for a technically savvy and passionate Principal Engineer - Data Science to analyze large amounts of raw information to find patterns that will help improve our company. We will rely on you to build data products that extract valuable business insights.
In this role, you should be highly analytical, with a knack for math and statistics. Critical thinking and problem-solving skills are essential for interpreting data. We also want to see a passion for machine learning and research.
Your goal will be to help our company analyze trends to make better decisions. Data scientists also need to understand how the software itself works: beyond experience developing in R and Python, they must know modern approaches to software development and their impact. DevOps, continuous integration and deployment, and cloud computing are everyday skills for managing and processing data.
Responsibilities :
- Adapt and enhance machine learning techniques based on physical intuition about the domain.
- Design sampling methodology and prepare data, including data cleaning, univariate analysis, and missing-value imputation; identify appropriate analytic and statistical methodology; develop predictive models; and document process and results.
- Lead projects both as principal investigator and project manager, responsible for meeting project requirements on schedule and on budget.
- Coordinate and lead efforts to innovate by deriving insights from heterogeneous sets of data generated by our suite of Aerospace products.
- Support and mentor data scientists.
- Maintain and work with our data pipeline, which transfers and processes several terabytes of data using Spark, Scala, Python, Apache Kafka, Pig/Hive, and Impala.
- Work directly with application teams/partners (internal clients such as Xbox, Skype, Office) to understand their offerings/domain and help them become successful with data so they can run controlled experiments (A/B testing).
- Understand the data generated by experiments and produce actionable, trustworthy conclusions from them.
- Apply data analysis, data mining, and data processing to present data clearly and develop experiments (A/B testing).
- Work with the development team to build tools for data logging and repeatable data tasks to accelerate and automate data scientist duties.
Requirements:
- Bachelor's or Master's degree in Computer Science, Math, Physics, Engineering, Statistics, or another technical field; PhD preferred.
- 8 to 10 years of experience in data mining, data modeling, and reporting.
- 5+ years of experience working with large data sets or doing large-scale quantitative analysis.
- Expert SQL scripting required.
- Development experience in one of the following: Scala, Java, Python, Perl, PHP, C++, or C#.
- Experience working with Hadoop, Pig/Hive, Spark, MapReduce.
- Ability to drive projects.
- Basic understanding of statistics: hypothesis testing, p-values, confidence intervals, regression, classification, and optimization are core lingo.
- Analysis: should be able to perform exploratory data analysis and get actionable insights from the data, with impressive visualization.
- Modeling: should be familiar with ML concepts and algorithms; understanding of the internals and pros/cons of models is required.
- Strong algorithmic problem-solving skills.
- Experience manipulating large data sets through statistical software (e.g., R, SAS) or other methods.
- Superior verbal, visual, and written communication skills to educate and work with cross-functional teams on controlled experiments.
- Experimentation design or A/B testing experience is preferred.
- Experience leading a team is required.
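The A/B testing and statistics requirements above (hypothesis testing, p-values) come down to analyses like a two-proportion z-test on experiment conversion rates. A stdlib-only sketch; the sample counts are invented for illustration:

```python
# Two-sided, two-proportion z-test for an A/B experiment, using only the
# standard library. H0: the two conversion rates are equal.
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return (z, p_value) comparing conversion rates conv_a/n_a and conv_b/n_b."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis of equal proportions
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal tail: erfc(|z|/sqrt(2))
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Illustrative experiment: variant B converts 150/1000 vs. A at 120/1000
z, p = two_proportion_z_test(conv_a=120, n_a=1000, conv_b=150, n_b=1000)
```

With these made-up numbers the test lands right around the conventional 0.05 threshold, which is exactly the borderline case where experiment design (sample size, pre-registered significance level) matters most.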
Opportunity for Unix Developer!
We at Datametica are looking for talented Unix engineers who will be trained and given the opportunity to work on Google Cloud Platform, DWH, and Big Data.
Experience - 2 to 7 years
Job location - Pune
Mandatory Skills:
Strong experience in Unix with Shell Scripting development.
What opportunities do we offer?
- Selected candidates will be offered training in one or more of the following: Google Cloud, AWS, DevOps tools, and Big Data technologies such as Hadoop, Pig, Hive, Spark, Sqoop, Flume, and Kafka.
- You will get the chance to be part of enterprise-grade implementations of Cloud and Big Data systems.
- You will play an active role in setting up a modern data platform based on Cloud and Big Data.
- You will be part of teams with rich experience in various aspects of distributed systems and computing.
We are looking to hire a talented PHP developer to manage our back-end services and ensure a seamless interchange of data between the server and our users. As a PHP developer, you will be responsible for developing and coding all server-side logic. You will also be required to maintain the central database and respond to requests from front-end developers.
To ensure success as a PHP developer, you should have in-depth knowledge of object-oriented PHP programming, an understanding of MVC design patterns, and working knowledge of front-end technologies including HTML5, JavaScript, and CSS3. Ultimately, a top-level PHP developer can design and build efficient PHP modules while seamlessly integrating front-end technologies.
PHP Developer Responsibilities:
- Conducting analysis of website and application requirements.
- Writing back-end code and building efficient PHP modules.
- Developing back-end portals with an optimized database.
- Troubleshooting application and code issues.
- Integrating data storage solutions.
- Responding to integration requests from front-end developers.
- Finalizing back-end features and testing web applications.
- Updating and altering application features to enhance performance.
PHP Developer Requirements:
- Bachelor’s degree in computer science or a similar field.
- Knowledge of PHP web frameworks including Yii, Laravel, and CodeIgniter.
- Knowledge of front-end technologies including CSS3, JavaScript, and HTML5.
- Understanding of object-oriented PHP programming.
- Previous experience creating scalable applications.
- Proficient with code versioning tools including Git, Mercurial, CVS, and SVN.
- Familiarity with SQL/NoSQL databases.
- Ability to manage projects.
- Good problem-solving skills.
Roles and Responsibilities
- 5-8 years of experience in infrastructure setup on Cloud, build/release engineering, continuous integration and delivery, and configuration/change management.
- Good experience with Linux/Unix administration and moderate to significant experience administering relational databases such as PostgreSQL, etc.
- Experience with Docker and related tools (Cassandra, Rancher, Kubernetes etc.)
- Experience working with config management tools (Ansible, Chef, Puppet, Terraform, etc.) is a plus.
- Experience with cloud technologies like Azure
- Experience with monitoring and alerting (TICK, ELK, Nagios, PagerDuty)
- Experience with distributed systems and related technologies (NSQ, RabbitMQ, SQS, etc.) is a plus
- Experience with scaling data store technologies (PostgreSQL, Scylla, Redis) is a plus.
- Experience with SSH Certificate Authorities and Identity Management (Netflix BLESS) is a plus
- Experience with multi-domain SSL certificates and provisioning (Let's Encrypt) is a plus.
- Experience with chaos engineering or similar methodologies is a plus.

Role
As part of this role, Developers/Senior Developers will be responsible for design, coding, and integration while meeting the quality targets set for the project. Tech Leads will additionally be responsible for estimation, architecture, design, technical reviews, customer interaction, and building and mentoring a talented team of engineers.
Knowledge & Skills
- Technical experience on the Physical Layer of 3GPP LTE
- Experience in DSP development
- Proficient in C
- Good understanding of Wireless Communications
- Experience on MAC-PHY interface
- Behavioral Interpersonal skills & Ability to work in team
- Good communication skills
- Good analytical skills
- Customer orientation
Job Title: LTE Developer with DSP Physical Layer (Engineer/Senior Engineer)