Know Your Customer Jobs in Pune
About Hamleys:
Hamleys is not just a toy store but an entire toy experience. There are currently more than 125 stores in India, and we are present in more than 20 countries. "365 days of play" is the brand philosophy of Hamleys. At Hamleys we believe in giving our customers an in-store experience that helps them build memories for a lifetime.
Job Title: Fun Consultant
Grade: A1
Role:
- Warmly welcome and approach each customer – a customer's inclination to shop starts with the first step and eye contact.
- Believe in and live the attitude of "sheer customer delight" – go above and beyond.
- Efficiently adhere to all company procedures and policies.
- Work with the team to deliver award-winning customer service.
Key Responsibilities:
- Contribute to achieving store KPIs.
- Use knowledge of trends to guide each customer on what suits them best, in line with the customer's needs, individuality, and the product range.
- Believe in and live the attitude of "sheer customer delight" – go above and beyond.
Desired Skills & Abilities
- Superior understanding of Indian retailing and global trends.
- Good communication and interpersonal skills – able to understand customer needs
- Good at relationship building – able to convert interest into sales
- Ability to follow brand and store guidelines
- Good operations experience – receiving stock, inventory keeping, cash handling, etc.
Job Title: Node.js Developer
Experience: 4+ Years
Salary: Up to ₹12 LPA
Joining: Immediate
Gender Preference: Male candidates only
Requirements:
- Minimum 4 years of hands-on experience in Node.js development.
- Strong understanding of JavaScript (ES6+), RESTful APIs, and asynchronous programming.
- Experience with databases like MongoDB, MySQL, or PostgreSQL.
- Familiarity with Git and version control workflows.
- Strong problem-solving skills and attention to detail.
- Excellent communication and interpersonal skills.
- Ability to work in a fast-paced, collaborative environment.
Job Title: IaC SRE Engineer
Location: Pune, Mumbai, Bangalore
Experience Required: 4 Years
Role Overview:
We are looking for experienced IaC engineers with a strong background in Akamai, Data Structures & Algorithms (DSA), Java, and DevSecOps. The ideal candidate will have hands-on development experience, be proficient in writing Infrastructure as Code with Terraform, and demonstrate strong problem-solving skills.
Core Skills:
- Akamai – Strong experience in CDN, caching, and performance optimization.
- Data Structures & Algorithms (DSA) – Strong problem-solving and coding abilities.
- Java – Solid programming background and experience in development.
- DevSecOps – Understanding of integrating security in CI/CD pipelines and infrastructure.
Good to Have:
- WAF (Web Application Firewall) – Knowledge of WAF is a plus, though not mandatory.
Additional Skills:
- Experience with SRE (Site Reliability Engineering) practices is beneficial.
- Strong hands-on experience with Terraform for managing cloud infrastructure.
Responsibilities:
• Collaborate closely with Product Management and Engineering leadership to devise and build the right solution.
• Participate in design discussions and brainstorming sessions to select, integrate, and maintain the Big Data tools and frameworks required to solve Big Data problems at scale.
• Design and implement systems to cleanse, process, and analyze large data sets using distributed processing tools like Akka and Spark (a minimal sketch follows this list).
• Understand and critically review existing data pipelines, and collaborate with Technical Leaders and Architects on ideas to improve current bottlenecks.
• Take initiative, show the drive to pick up new technologies proactively, and work as a senior individual contributor across the multiple products and features we have.
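The cleanse-and-process responsibility above is easiest to picture with a small example. The sketch below is only illustrative, not part of this posting: the input/output paths, column names, and CSV format are assumptions. It shows a minimal Spark batch job in Java that filters, de-duplicates, and type-casts a raw data set before writing it back out.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;
    import static org.apache.spark.sql.functions.col;

    public class CleanseEvents {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder().appName("cleanse-events").getOrCreate();

            // Hypothetical raw input; the real pipeline's source and schema are not specified here.
            Dataset<Row> raw = spark.read().option("header", "true").csv("s3a://example-bucket/raw/events/");

            Dataset<Row> cleaned = raw
                    .filter(col("event_id").isNotNull())                 // drop rows missing the key
                    .dropDuplicates("event_id")                          // de-duplicate on the key
                    .withColumn("amount", col("amount").cast("double")); // normalize the numeric column

            cleaned.write().mode("overwrite").parquet("s3a://example-bucket/curated/events/");
            spark.stop();
        }
    }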
To be successful in this role, you should possess:
• 3+ years of experience in developing highly scalable Big Data pipelines.
• In-depth understanding of the Big Data ecosystem, including processing frameworks like Spark, Akka, Storm, and Hadoop, and the file types they deal with.
• Experience with ETL and data pipeline tools like Apache NiFi, Airflow, etc.
• Excellent coding skills in Java or Scala, including the understanding to apply appropriate design patterns when required.
• Experience with Git and build tools like Gradle/Maven/SBT.
• Strong understanding of object-oriented design, data structures, algorithms, profiling, and optimization.
• An elegant, readable, maintainable, and extensible code style.
You are someone who would easily be able to
• Work closely with the US and India engineering teams to help build Java/Scala based data pipelines.
• Lead the India engineering team in technical excellence and ownership of critical modules; own the development of new modules and features.
• Troubleshoot live production server issues.
• Handle client coordination, work as part of a team, contribute independently, and drive the team to exceptional contributions with minimal supervision.
• Follow Agile methodology, using JIRA for work planning and issue management/tracking.
Additional Project/Soft Skills:
• Should be able to work independently with India- and US-based team members.
• Strong verbal and written communication, with the ability to articulate problems and solutions over phone and email.
• Strong sense of urgency, with a passion for accuracy and timeliness.
• Ability to work calmly in high-pressure situations and manage multiple projects/tasks.
• Ability to work independently, with superior skills in issue resolution.
• A passion for learning, implementing, analyzing, and troubleshooting issues.
Requirements:
- Undergraduate degree or equivalent experience.
- Working knowledge of AS/400 COBOL.
- Familiarity with CL and DB2
- Basic knowledge of theories, practices, and procedures in a function or skill.
- Willing to depend on others for instruction, guidance, or direction.
- Knowledge of the automation and robotics industry
- Knowledge of the bottling industry
- Material handling – lifts, cranes, conveyors, etc.
- Sourcing job work for VMC and CNC machines
- Hands-on experience in food technology
- Experience in IoT
- Lead generation
We are looking for a Digital marketing expert who is deeply passionate about marketing, loves to take initiative, and is a go-getter.
The responsibilities include:
1) Search Engine Optimization
2) Media outreach
3) Developing strategies for paid search engine marketing campaigns and outbound campaigns to generate leads
4) Coming up with an innovative social media content strategy and executing it
5) Assessing the results of marketing efforts, communicating with the management team, and refining the plan to get the best results.
To be successful in this role, you need to be self-driven and display outstanding performance. This is an excellent career-defining opportunity for a seasoned professional who enjoys a position filled with variety, challenges, depth, and autonomy.
- Java 8, Spring Boot, Java Microservices
- REST APIs
- Angular 11, HTML5, CSS3, Bootstrap
- SQL DB
- Azure/AWS
- Kafka
Summary
Our Kafka developer combines technical skills, communication skills, and business knowledge, and should be able to work on multiple medium-to-large projects. The successful candidate will have excellent technical skills in Apache/Confluent Kafka and an Enterprise Data Warehouse, preferably GCP BigQuery or an equivalent cloud EDW, and will be able to take oral and written business requirements and develop efficient code to meet set deliverables.
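As a rough sketch of the Kafka-to-EDW flow described above (the bootstrap server, topic name, and payload are hypothetical, and the downstream load into BigQuery would typically be handled by a sink connector), a minimal Java producer could look like this:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class OrderEventProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");            // placeholder broker address
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Publish one JSON record; a sink connector could stream this topic into the cloud EDW.
                producer.send(new ProducerRecord<>("orders", "order-1001", "{\"orderId\":1001,\"status\":\"NEW\"}"));
                producer.flush();
            }
        }
    }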
Must Have Skills
- Participate in the development, enhancement and maintenance of data applications both as an individual contributor and as a lead.
- Lead the identification, isolation, resolution, and communication of problems within the production environment.
- Act as a lead developer, applying technical skills in Apache/Confluent Kafka (preferred) or AWS Kinesis (optional), and in a cloud Enterprise Data Warehouse such as Google BigQuery (preferred), AWS Redshift, or Snowflake (optional).
- Design and recommend the best approach for moving data from different sources to the cloud EDW using Apache/Confluent Kafka.
- Perform independent functional and technical analysis for major projects supporting several corporate initiatives.
- Communicate and work with IT partners and the user community at various levels, from senior management to developers to business SMEs, for project definition.
- Work on multiple platforms and multiple projects concurrently.
- Perform code and unit testing for complex-scope modules and projects.
- Provide expertise and hands-on experience with Kafka Connect using Schema Registry in a very high-volume environment (~900 million messages).
- Provide expertise in Kafka brokers, ZooKeeper, KSQL, Kafka Streams, and Confluent Control Center.
- Provide expertise and hands-on experience working with AvroConverter, JsonConverter, and StringConverter.
- Provide expertise and hands-on experience working with Kafka connectors such as MQ, Elasticsearch, JDBC, FileStream, and JMS source connectors, as well as tasks, workers, converters, and transforms.
- Provide expertise and hands-on experience with custom connectors built on Kafka core concepts and APIs.
- Working knowledge of the Kafka REST Proxy.
- Ensure optimum performance, high availability and stability of solutions.
- Create topics, set up cluster redundancy, deploy monitoring tools and alerts, and apply best practices.
- Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms (a minimal consumer stub is sketched after this list). Leverage Hadoop ecosystem knowledge to design and develop capabilities that deliver solutions using Spark, Scala, Python, Hive, Kafka, and other tools in the Hadoop ecosystem.
- Use automation and provisioning tools like Jenkins, uDeploy, or relevant technologies.
- Ability to perform data-related benchmarking, performance analysis, and tuning.
- Strong skills in in-memory applications, database design, and data integration.
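One of the items above mentions creating stubs for producers and consumers to help onboard applications. Purely as an illustration (the broker address, topic, and consumer group id are made up, not taken from this posting), a minimal Java consumer stub in a consumer group might look like:

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class OrderEventConsumerStub {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");               // placeholder broker address
            props.put("group.id", "edw-loader");                            // hypothetical consumer group
            props.put("key.deserializer", StringDeserializer.class.getName());
            props.put("value.deserializer", StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("orders"));    // hypothetical topic
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        // A real stub would hand the record to the onboarding application's handler.
                        System.out.printf("offset=%d key=%s value=%s%n",
                                record.offset(), record.key(), record.value());
                    }
                }
            }
        }
    }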
About Us: Newbie Soft Solutions is an IT service provider focused on delivering solutions in niche areas to support and build future-ready, resilient solutions for medium-sized industries and growth-focused technology organizations.
The name NEWBIE signifies a new chapter, a new beginning in the field of staffing solutions. Founded in 2015, we have grown from strength to strength, with a strong presence across India, the United States, and Australia. Our offerings include staffing solutions, IT consulting, business intelligence, security solutions, and legacy application management and modernization. Consistency is our core principle in reaching the end goal of complete user satisfaction, and we constantly strive to outperform our competitors to become leaders in the digital revolution.
Job Requirement :
- Clear understanding of end-to-end communication of service calls via an API gateway, service mesh, and service registry.
- Experience with Spring Boot, Spring Cloud, and RESTful web services (a minimal sketch follows this list).
- Experience with containerisation (Docker) and Kubernetes: creating container images, writing manifest files/Helm charts, designing Pods, sidecar patterns, etc.
- Good design experience with web application backends; since we operate as a DevOps pod, we expect this person to be involved in production deployments/support.
- Exposure to CI/CD tools such as Git, Jenkins, Maven, Sonar, JUnit, Checkmarx, Netsparker, and Cucumber.
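To make the Spring Boot/REST expectation above concrete, here is a minimal, hedged sketch (the service name, endpoint, and payload are invented for illustration) of a single-class Spring Boot application exposing one REST endpoint. In the setup described above, it would be packaged as a container image and deployed as a Pod behind the API gateway/service mesh.

    import java.util.Map;
    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RestController;

    @SpringBootApplication
    @RestController
    public class OrderServiceApplication {

        public static void main(String[] args) {
            SpringApplication.run(OrderServiceApplication.class, args);
        }

        // Minimal REST endpoint returning a placeholder payload.
        @GetMapping("/orders/{id}")
        public Map<String, Object> getOrder(@PathVariable String id) {
            return Map.of("id", id, "status", "CONFIRMED");
        }
    }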



