The brand is associated with major icons across categories, with tie-ups spanning fashion, sports, and music. The founders are marketing graduates with extensive experience in consumer lifestyle products and other major brands. Through their rigorous focus on quality and marketing, they have struck a chord with major e-commerce platforms as well as consumers.
- Coordinating with suppliers to discuss and resolve product quality issues raised by customers.
- Updating customers in a timely fashion regarding the status of quality issues or any requests.
- Participating in product quality planning and control process based on customer’s specifications and requirements.
- Managing escalations to suppliers on product quality, root-cause analysis, and corrective actions.
- Preparing and updating all necessary quality reports as required by customers.
Desired Candidate Profile
What you need to have:
- Experience in handling customer complaints
- Knowledge of DOA, FFR, RCA, supplier escalation, and Excel reporting
- Strong communication skills and a problem-solving attitude
- Client facing experience
- Product Quality management
- Escalation and complaint management
About us
Blitz is an instant-logistics company in Southeast Asia, founded in 2021. It delivers orders using EV bikes, and it also leases those EV bikes to its drivers, generating a second revenue stream from leasing in addition to delivery charges. Blitz is revolutionizing instant delivery with advanced technology-based solutions. It is a product-driven company that uses modern technologies to build products solving problems in EV-based logistics, drawing on IoT data from the EV bikes and smart engines to make technology-driven decisions and create a delightful experience for consumers.
About the Role
We are seeking an experienced Data Engineer to join our dynamic team. The Data Engineer will be responsible for designing, developing, and maintaining scalable data pipelines and infrastructure to support our data-driven initiatives. The ideal candidate will have a strong background in software engineering, database management, and data architecture, with a passion for building robust and efficient data systems.
What you will do
- Design, build, and maintain scalable data pipelines and infrastructure to ingest, process, and analyze large volumes of structured and unstructured data.
- Collaborate with cross-functional teams to understand data requirements and develop solutions to meet business needs.
- Optimise data processing and storage solutions for performance, reliability, and cost-effectiveness.
- Implement data quality and validation processes to ensure accuracy and consistency of data.
- Monitor and troubleshoot data pipelines to identify and resolve issues promptly.
- Stay updated on emerging technologies and best practices in data engineering and recommend innovations to enhance our data infrastructure.
- Document data pipelines, workflows, and infrastructure to facilitate knowledge sharing and ensure maintainability.
- Create data dashboards from the datasets to visualize different data requirements.
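The data-quality and validation responsibility above can be sketched as a small check over an incoming feed. This is a minimal, hedged illustration using only the standard library; the file layout, column names, and rules are assumptions, not Blitz's actual pipeline.

```python
import csv
import io

def validate_orders(csv_text, required=("order_id", "amount")):
    """Validate rows of an orders feed: required fields present, amount numeric.

    Returns (valid_rows, errors); column names are illustrative assumptions.
    """
    valid, errors = [], []
    reader = csv.DictReader(io.StringIO(csv_text))
    for line_no, row in enumerate(reader, start=2):  # line 1 is the header
        missing = [c for c in required if not (row.get(c) or "").strip()]
        if missing:
            errors.append((line_no, f"missing fields: {missing}"))
            continue
        try:
            row["amount"] = float(row["amount"])  # enforce a numeric type
        except ValueError:
            errors.append((line_no, "amount is not numeric"))
            continue
        valid.append(row)
    return valid, errors

feed = "order_id,amount\nA1,10.5\nA2,\nA3,oops\n"
valid, errors = validate_orders(feed)
print(len(valid), len(errors))  # 1 valid row, 2 rejected rows
```

In a real pipeline a check like this would run as a stage before loading, with the rejected rows routed to a quarantine table for review.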
What we need
- Bachelor's degree or higher in Computer Science, Engineering, or a related field.
- Proven experience as a Data Engineer or similar role, with expertise in building and maintaining data pipelines and infrastructure.
- Proficiency in programming languages such as Python, Java, or Scala.
- Strong knowledge of database systems (e.g., SQL, NoSQL, BigQuery) and data warehousing concepts.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud Platform.
- Familiarity with data processing frameworks and tools (e.g., Apache Spark, Hadoop, Kafka).
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
Preferred Qualifications
- Advanced degree in Computer Science, Engineering, or related field.
- Experience with containerization and orchestration technologies (e.g., Docker, Kubernetes).
- Knowledge of machine learning and data analytics concepts.
- Experience with DevOps practices and tools.
- Certifications in relevant technologies (e.g., AWS Certified Big Data Specialty, Google Professional Data Engineer).
Please refer to the Company’s website - https://rideblitz.com/
- Strong knowledge of Windows and Linux
- Experience working with version control systems like Git
- Hands-on experience with tools such as Docker, SonarQube, Ansible, Kubernetes, and ELK
- Basic understanding of SQL commands
- Experience working with Azure DevOps
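The "basic SQL commands" item above amounts to CREATE, INSERT, UPDATE, and SELECT. A minimal sketch using Python's built-in sqlite3 module; the table and column names are made up for illustration and imply nothing about the employer's schema.

```python
import sqlite3

def status_counts():
    """Run the basic SQL commands (CREATE, INSERT, UPDATE, SELECT)
    against an in-memory SQLite database and return grouped counts."""
    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()
    cur.execute("CREATE TABLE builds (id INTEGER PRIMARY KEY, status TEXT)")
    cur.executemany("INSERT INTO builds (status) VALUES (?)",
                    [("passed",), ("failed",), ("passed",)])
    cur.execute("UPDATE builds SET status = 'rerun' WHERE status = 'failed'")
    cur.execute("SELECT status, COUNT(*) FROM builds "
                "GROUP BY status ORDER BY status")
    rows = cur.fetchall()
    conn.close()
    return rows

print(status_counts())  # [('passed', 2), ('rerun', 1)]
```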
- Create, test, and maintain web-based applications using Laravel Framework.
- Work with members of the other teams to design, develop, and implement software solutions.
- Implement and manage the entire web application development lifecycle, from conception to delivery and post-launch maintenance.
- Write clean, efficient, and well-documented code.
- Use back-end data services and support the growth of the existing data-services API.
- Effectively communicate all project updates, evaluations, suggestions, schedules, and technical and procedural difficulties.
- Document the development process, architecture, and related information.
- Troubleshoot and debug software issues.
- Continuously improve software quality and performance.
- Keep up with the latest web development technologies and trends.
Why Cybernetyx?
- Work on groundbreaking machine vision projects that leverage the latest advancements in AI.
- Join a team of leading experts in machine vision, AI, and machine learning.
- Competitive salary, exceptional benefits, and flexible work options.
- Opportunity for continuous learning and development.
- Make a tangible impact in the tech industry and beyond.
About Our Team:
You will be a part of a cutting-edge team that specializes in the development of machine vision-based AI products. Our team is committed to pushing the boundaries of what's possible, delivering solutions that revolutionize various industries.
Your Day-to-Day Role:
- Develop and optimize vision algorithms in C/C++.
- Collaborate with cross-functional teams to integrate machine vision capabilities into larger systems and products.
- Implement real-time processing techniques for handling video and image data.
- Conduct research to improve existing technologies and contribute to product development.
- Utilize QT for application development, if applicable.
- Leverage OpenCV library for computer vision tasks.
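The role above is C/C++, but the kernel-convolution idea at the heart of many vision filters (blur, Sobel edge detection, and the like) is language-agnostic. A pure-Python sketch for brevity; this is an illustration of the general technique, not Cybernetyx's actual code, and real implementations would use OpenCV's optimized routines.

```python
def convolve2d(image, kernel):
    """Naive 2-D sliding-window filter (valid region only, no padding).

    This computes cross-correlation, which is what vision libraries
    usually call "convolution". Pure-Python illustration only.
    """
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for y in range(out_h):
        row = []
        for x in range(out_w):
            acc = 0
            for dy in range(kh):
                for dx in range(kw):
                    acc += image[y + dy][x + dx] * kernel[dy][dx]
            row.append(acc)
        out.append(row)
    return out

# A horizontal edge detector on a tiny image with a bright bottom row.
img = [[0, 0, 0],
       [0, 0, 0],
       [9, 9, 9]]
sobel_y = [[-1, -2, -1],
           [ 0,  0,  0],
           [ 1,  2,  1]]
print(convolve2d(img, sobel_y))  # [[36]] -> strong horizontal edge response
```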
What Are We Looking For:
- 4+ years of experience in C/C++ programming.
- Experience with QT is beneficial but not mandatory.
- Knowledge of OpenCV library is a plus but not required.
- Strong understanding of machine vision, image processing, and AI algorithms.
- Analytical mindset with strong problem-solving skills.
- Ability to work in a fast-paced, collaborative environment.
At SpringML, we are all about empowering the ‘doers’ in companies to make smarter decisions
with their data. Our predictive analytics products and solutions apply machine learning to
today’s most pressing business problems so customers get insights they can trust to drive
business growth.
We are a tight-knit, friendly team of passionate and driven people who are
dedicated to learning, excited to solve tough problems, and eager to see
results fast. Our core values include putting our customers first, empathy
and transparency, and innovation. We are a team with a focus on individual
responsibility, rapid personal growth, and execution. If you share similar
traits, we want you on our team.
What’s the opportunity?
SpringML is looking for a top-notch Salesforce Tableau CRM expert. You will play a critical role
in our client engagements using the Salesforce Analytics Cloud platform (Tableau CRM, Einstein Discovery).
You will design and implement highly customized solutions for our customer's business
problems, typically across multiple functions of a customer's organization through data
integration, visualization, and analysis.
Responsibilities:
- Translate business needs into technical specifications
- Design and deploy dataflows/recipes per the business requirements
- Maintain and support the existing analytics platforms built using Tableau CRM
- Conduct unit testing and troubleshoot issues
- Create visualizations using native built-in features and SAQL
- Implement security best practices
- Design dashboards/recipes using best practices
- Develop and update the technical documentation
Skills:
- Knowledge and experience of B2B Lead Generation in IT services
- Experience with cold calling
- Good knowledge and understanding of the technical aspects of the following:
- Microsoft Office and associated systems, including CRM
- Social Networking (LinkedIn)
- Excellent communication skills and an excellent telephone manner
- Ability to work independently and under pressure
- Good verbal and written skills
- Good organisational skills and attention to detail
Roles and Responsibilities:
- Generate new leads using cold calling, social media, and other relevant lead generation tools.
- Classify hot, warm, and cold leads based on their need, budget, and decision-making capabilities.
- Organise and keep the lead status updated in the CRM software
- Follow up on leads and conduct research to identify potential prospects.
- Promote the company’s products/services, addressing or anticipating clients’ objectives.
- Use company databases to source potential leads.
- Think strategically: see the bigger picture and set aims and objectives to develop and improve the business.
- Screen potential business deals by analysing market strategies, deal requirements, and financials.
- 4+ years' experience in JavaScript for scripting and UI automation.
- Strong experience working with Protractor, SuperTest, and Mocha.
- Experienced in SQL for DB scripting and DB operations, with good knowledge of SQL Server Management Studio.
- Hands-on experience with API automation testing using a standard tool.
- Experienced with GitHub, Jenkins, and TeamCity; knowledge of Jira.
- Ability to build automation frameworks from scratch and confidence in managing all aspects of the framework.
- Exposure to/working knowledge of Agile methodology (e.g., Kanban, Scrum).
Work Location: Hyderabad
J.D:
- Minimum of 3 years' professional experience with Angular 2+, React JS, and scripting languages.
- Excellence in modern JavaScript, HTML5 and design patterns.
- Thorough understanding of the responsibilities of the platform, database, API, caching layer, proxies, React Testing Library, server-side rendering, and TypeScript.
- Validating user input on the client side and implementing meaningful feedback.
- Skill in designing a modern build process that integrates testing and continuous delivery.
- Hands-on experience with creating configuration, build, and test scripts for continuous integration environments.
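The client-side validation item above is about checking user input and returning meaningful, field-level feedback. The role would do this in the browser (Angular/React); this sketch uses Python for brevity, and the field names and rules are illustrative assumptions.

```python
import re

def validate_signup(form):
    """Validate user input and return field-level feedback messages.

    Returns an empty dict when the form is valid; otherwise maps each
    offending field to a human-readable message. Rules are illustrative.
    """
    errors = {}
    # Loose email shape check: something@something.tld, no spaces.
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", form.get("email", "")):
        errors["email"] = "Enter a valid email address."
    if len(form.get("password", "")) < 8:
        errors["password"] = "Password must be at least 8 characters."
    return errors

print(validate_signup({"email": "a@b.co", "password": "secret"}))
# {'password': 'Password must be at least 8 characters.'}
```

The key design point, regardless of language, is returning specific per-field messages rather than a single pass/fail flag, so the UI can highlight exactly what to fix.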
JOB DESCRIPTION
(NOTE: we are looking for candidates who can join immediately or have a notice period within 15-20 days)
• Job Scope
o Conduct penetration testing on internal websites/systems owned by EC-Council
o Produce a report and presentation for the system owner explaining the security structure and the vulnerabilities of the system
o Conduct scoping for any new projects
o Research and recommend fixes for issues/vulnerabilities identified during penetration testing
o Create and update the security test plan regularly according to the nature of the website assigned
o Conduct research on new vulnerabilities and threats regularly to improve one's own capabilities
• Minimum Requirements
o At least 3 years' experience in conducting any three of the following:
▪ Network Penetration Testing
▪ Mobile Application Penetration Testing
▪ Web Application Penetration Testing
▪ Source Code Review
▪ Writing, extending, and modifying exploits and shellcode
▪ Reverse engineering malware, data obfuscation, and ciphers
o Bachelor’s degree in an IT-security-related field or equivalent
o Any two of the following certifications: OSCP, OSCE, OSEP, OSWE, CRT, LPT, or equivalent
o Proficiency in at least one programming language such as PHP, Ruby, Python, or Perl
o Strong understanding of encryption (SSL/TLS, PKI) and other authentication methods
o Good experience with penetration-testing tools such as Metasploit, Burp Suite, w3af, Kali Linux, SQLMap, and Skipfish
o Excellent written and verbal communication skills, especially when dealing with large reports and datasets, with a high standard of documentation
o Mastery of Linux/Unix operating systems and Bash/PowerShell
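Network penetration testing starts with service discovery: finding which TCP ports answer. A minimal, hedged sketch of that first step in Python; real engagements use dedicated tools (e.g., Nmap) and, above all, written authorization from the system owner.

```python
import socket

def tcp_port_open(host, port, timeout=1.0):
    """Return True if a TCP connection to (host, port) succeeds.

    connect_ex returns 0 on success instead of raising, which makes it
    convenient for sweeping many ports. Illustration only; scan nothing
    you are not explicitly authorized to test.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

# Demo against a listener we control: bind an ephemeral local port first.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))   # port 0 -> OS picks a free port
server.listen(1)
port = server.getsockname()[1]
print(tcp_port_open("127.0.0.1", port))  # True: our own listener is up
server.close()
```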