11+ Warehouse Management System (WMS) Jobs in Hyderabad
Apply to 11+ Warehouse Management System (WMS) job openings in Hyderabad on CutShort.io.
REVIEW CRITERIA:
MANDATORY:
- Strong customer success and large-account management profile.
- Must have 5+ years of experience in customer success, account management, or client relationship management in a SaaS environment.
- Must have 3+ years of experience in team management.
- Must have expertise in managing enterprise-level customers and ensuring their long-term satisfaction and growth.
- Must be proficient in CRM and customer success tools
- Candidates for Hyderabad should know at least one South Indian language (Tamil, Kannada, or Malayalam) in addition to Telugu.
PREFERRED:
- Experience in the Retail, Manufacturing, or FMCG industries.
- Candidates for Hyderabad who know any other South Indian language apart from Telugu.
ROLE & RESPONSIBILITIES:
- Build and maintain strong relationships with key stakeholders within assigned high-ticket accounts.
- Serve as the primary point of contact for strategic customers, ensuring their needs are met and expectations are exceeded.
- Develop and execute customer success plans, including onboarding, training, and ongoing support.
- Proactively identify and address customer challenges, providing solutions and recommendations to drive customer success.
- Collaborate with cross-functional teams, including Sales, Product, and Support, to ensure a seamless customer experience.
- Conduct regular business reviews with customers to review performance, identify areas for improvement, and present new opportunities.
- Monitor customer health metrics and proactively address any red flags to prevent churn.
- Act as a customer advocate, providing feedback and insights to internal teams to drive product enhancements and improvements.
- Stay up-to-date with industry trends and best practices in customer success management.
IDEAL CANDIDATE:
- Bachelor's degree in Business Administration, Accounts, or a related field.
- Minimum of 5 years of experience in a customer success role, preferably in a SaaS company.
- Proven track record of managing and retaining high-value customers.
- Excellent communication and interpersonal skills, with the ability to build strong relationships with customers.
- Strong problem-solving and analytical skills, with the ability to identify and address customer challenges.
- Self-motivated and results-oriented, with the ability to work independently and as part of a team.
- Proficient in CRM software and other customer success tools.
- Ability to travel as needed to meet with customers.
We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.
Key Responsibilities:
- Design, develop, test, and maintain scalable ETL data pipelines using Python.
- Work extensively on Google Cloud Platform (GCP) services such as:
  - Dataflow for real-time and batch data processing
  - Cloud Functions for lightweight serverless compute
  - BigQuery for data warehousing and analytics
  - Cloud Composer for orchestration of data workflows (based on Apache Airflow)
  - Google Cloud Storage (GCS) for managing data at scale
  - IAM for access control and security
  - Cloud Run for containerized applications
- Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
- Implement and enforce data quality checks, validation rules, and monitoring.
- Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
- Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
- Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
- Document pipeline designs, data flow diagrams, and operational support procedures.
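To make the orchestration responsibility above concrete, here is a minimal, hypothetical sketch of an Apache Airflow DAG of the kind that could run on Cloud Composer. The DAG name, task names, and callables are illustrative placeholders, not details from this posting.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw data from a source system into GCS.
    pass


def transform_and_load():
    # Placeholder: cleanse the extracted data and load it into BigQuery.
    pass


with DAG(
    dag_id="daily_orders_pipeline",          # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 1, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="transform_and_load",
                               python_callable=transform_and_load)

    # Run the transform/load step only after extraction succeeds.
    extract_task >> load_task
```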
Required Skills:
- 4–8 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience in working with version control systems like GitHub and knowledge of CI/CD practices.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
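As an illustration of the ingestion and data-quality work described above, the following is a minimal sketch (not the employer's actual code) that loads a CSV file from Cloud Storage into BigQuery using the google-cloud-bigquery client and runs a simple row-count check. The bucket, project, and table names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Hypothetical source file and destination table.
uri = "gs://example-bucket/raw/orders.csv"
table_id = "example-project.analytics.orders"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,       # skip the header row
    autodetect=True,           # infer the schema from the file
    write_disposition="WRITE_APPEND",
)

load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
load_job.result()  # block until the load job finishes

# Simple data-quality check: confirm the table is not empty after the load.
rows = client.query(f"SELECT COUNT(*) AS n FROM `{table_id}`").result()
print(f"Row count after load: {next(iter(rows)).n}")
```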
Job Description:
As a Frontend Developer, your responsibilities will include:
- Developing new user-facing features using React.js and integrating with backend services.
- Building reusable components and front-end libraries for future use.
- Translating designs and wireframes into high-quality code.
- Optimizing components for maximum performance across a broad spectrum of web-capable devices and browsers.
Key Performance Indicators:
- Efficiency and quality of the code developed.
- Adherence to project timelines and delivery milestones.
- Positive stakeholder feedback on usability and design.
- Continuous improvement and adoption of best practices in frontend development.
Prior Experience Required:
- 3+ years of experience in frontend web development.
- Proficient in JavaScript, including DOM manipulation and the JavaScript object model.
- Extensive experience with React.js and its core principles.
- Strong proficiency in HTML5, CSS3, and modern frontend development tools.
- Experience with popular React.js workflows (such as Redux).
- Familiarity with newer specifications of ECMAScript.
- Experience with data structure libraries (e.g., Immutable.js).
- Good understanding of RESTful APIs and modern authorization mechanisms (e.g., JSON Web Token).
- Good to have: Experience with TypeScript and Next.js.
Employer:
RaptorX.ai
Location:
Hyderabad
Department:
IT Development
Collaboration:
The role involves working closely with backend developers, UI/UX designers, and project managers to deliver seamless, high-quality web applications.
Salary:
Competitive, based on experience and market standards.
Education:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
Language Skills:
- Strong command of Business English, both verbal and written, is required.
Other Skills:
- Excellent problem-solving skills.
- A drive for quality and polish in development work.
- Ownership and accountability in a fast-paced environment.
- Proficiency in code versioning tools, such as Git.
- Knowledge of modern frontend build pipelines and tools.
- Experience with responsive and adaptive design.
Additional Requirements:
- Portfolio of previous projects that demonstrates expertise in frontend development.
- Familiarity with agile methodologies.
- Strong teamwork skills, with the ability to collaborate effectively with colleagues and clients across diverse teams.
Tech stack: React.js, JavaScript, Node.js, HTML5, CSS3
Urgent requirement: candidates serving their notice period or immediately available can apply, and should be able to join within 7 days.
Analyzes/audits inbound/outbound calls, emails, and customer surveys to identify areas of service delivery that did not meet pre-established performance standards within the CS Support/Sales teams.
Provides structured and timely recommendations and verbal and/or written feedback to the advisors.
Skills and requirements
- Experience analyzing complex and varied data in a commercial or academic setting.
- Desire to solve new and complex problems every day.
- Excellent ability to communicate scientific results to both technical and non-technical team members.
Desirable
- A degree in a numerically focused discipline such as Maths, Physics, Chemistry, Engineering, or Biological Sciences.
- Hands-on experience with Python, PySpark, and SQL.
- Hands-on experience building end-to-end data pipelines.
- Hands-on experience with Azure Data Factory, Azure Databricks, and Data Lake is an added advantage.
- Experience with big data tools: Hadoop, Hive, Sqoop, Spark, and Spark SQL.
- Experience with SQL or NoSQL databases for the purposes of data retrieval and management.
- Experience in data warehousing and business intelligence tools, techniques and technology, as well as experience in diving deep on data analysis or technical issues to come up with effective solutions.
- BS degree in math, statistics, computer science or equivalent technical field.
- Experience in data mining structured and unstructured data (SQL, ETL, data warehouse, Machine Learning etc.) in a business environment with large-scale, complex data sets.
- Proven ability to look at solutions in unconventional ways. Sees opportunities to innovate and can lead the way.
- Willing to learn and work on Data Science, ML, AI.
- Minimum of 1 year of relevant experience in PySpark (mandatory).
- Hands-on experience developing, testing, deploying, maintaining, and improving data integration pipelines in an AWS cloud environment is an added plus.
- Ability to play a lead role and independently manage a 3-5 member PySpark development team.
- EMR, Python, and PySpark are mandatory (a minimal PySpark sketch follows this list).
- Knowledge of and experience working with AWS cloud technologies such as Apache Spark, Glue, Kafka, Kinesis, and Lambda, along with S3, Redshift, and RDS.
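Here is a minimal, hypothetical PySpark sketch of the kind of EMR/S3 pipeline implied by the requirements above; the bucket paths and column names are illustrative placeholders only.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Hypothetical S3 locations.
raw = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")

cleaned = (
    raw.dropDuplicates(["order_id"])                        # remove duplicate orders
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)                         # drop invalid rows
       .withColumn("order_date", F.to_date("order_date"))
)

# Write curated output partitioned by date for downstream consumers.
(cleaned.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-bucket/curated/orders/"))
```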

This company provides on-demand cloud computing platforms.
- 15+ years of hands-on technical application architecture experience and application build/modernization experience.
- 15+ years of experience as a technical specialist in customer-facing roles.
- Ability to travel to client locations as needed (25-50%)
- Extensive experience architecting, designing and programming applications in an AWS Cloud environment
- Experience with designing and building applications using AWS services such as EC2, AWS Elastic Beanstalk, AWS OpsWorks
- Experience architecting highly available systems that utilize load balancing, horizontal scalability and high availability
- Hands-on programming skills in any of the following: Python, Java, Node.js, Ruby, .NET or Scala
- Agile software development expert
- Experience with continuous integration tools (e.g. Jenkins)
- Hands-on familiarity with CloudFormation
- Experience with configuration management platforms (e.g. Chef, Puppet, Salt, or Ansible)
- Strong scripting skills (e.g. Powershell, Python, Bash, Ruby, Perl, etc.)
- Strong practical application development experience on Linux and Windows-based systems
- Extracurricular software development passion (e.g., active open-source contributor).
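To illustrate the hands-on AWS programming called for above, here is a small, hypothetical boto3 sketch that launches a tagged EC2 instance; the region, AMI ID, and tag values are placeholders, not details from the posting.

```python
import boto3

# Hypothetical region for illustration only.
ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder AMI ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "demo-app-server"}],
    }],
)

print("Launched:", response["Instances"][0]["InstanceId"])
```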
Job Description
We are looking for a great JavaScript developer who is proficient with React.js. Your primary focus will be on developing user interface components and implementing them following well-known React.js workflows (such as Flux or Redux). You will ensure that these components and the overall application are robust and easy to maintain. You will coordinate with the rest of the team working on different layers of the infrastructure. Therefore, a commitment to collaborative problem solving, sophisticated design, and a quality product is important.
Responsibilities
- Developing new user-facing features using React.js
- Building reusable components and front-end libraries for future use
- Translating designs and wireframes into high-quality code
- Optimizing components for maximum performance across a vast array of web-capable devices and browsers
Skills
- Strong proficiency in JavaScript, including DOM manipulation and the JavaScript object model
- Thorough understanding of React.js and its core principles
- Experience with popular React.js workflows (such as Flux or Redux)
- Familiarity with newer specifications of ECMAScript
- Experience with data structure libraries (e.g., Immutable.js)
- Knowledge of isomorphic React is a plus
- Familiarity with RESTful APIs
- Knowledge of modern authorization mechanisms, such as JSON Web Token
- Familiarity with modern front-end build pipelines and tools
- Experience with common front-end development tools such as Babel, Webpack, NPM, etc.
- Ability to understand business requirements and translate them into technical requirements
- A knack for benchmarking and optimization
- Familiarity with code versioning tools (such as Git, SVN, and Mercurial)
SpringML is looking to hire a top-notch Senior Data Engineer who is passionate about working with data and using the latest distributed frameworks to process large datasets. In this role, your primary responsibility will be to design and build data pipelines. You will be focused on helping client projects with data integration, data prep, and implementing machine learning on datasets. You will work on some of the latest technologies, collaborate with partners on early wins, take a consultative approach with clients, interact daily with executive leadership, and help build a great company. Chosen team members will be part of the core team and play a critical role in scaling up our emerging practice.
RESPONSIBILITIES:
- Ability to work as a member of a team assigned to design and implement data integration solutions.
- Build data pipelines using standard frameworks such as Hadoop, Apache Beam, and other open-source solutions (a minimal Beam sketch follows the skills list below).
- Learn quickly – ability to understand and rapidly comprehend new areas – functional and technical – and apply detailed and critical thinking to customer solutions.
- Propose design solutions and recommend best practices for large-scale data analysis.
SKILLS:
- B.Tech degree in Computer Science, Mathematics, or other relevant fields.
- 4+ years of experience in ETL, data warehousing, visualization, and building data pipelines.
- Strong programming skills – experience and expertise in one of the following: Java, Python, Scala, C.
- Proficient in big data/distributed computing frameworks such as Apache Spark and Kafka.
- Experience with Agile implementation methodologies
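As a sketch of the Apache Beam pipelines mentioned in the responsibilities above, here is a minimal, hypothetical word-count example; the input and output paths are placeholders, and the pipeline runs on the default (local) runner unless other options are supplied.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

with beam.Pipeline(options=PipelineOptions()) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.txt")
        | "Split" >> beam.FlatMap(lambda line: line.split())        # one element per word
        | "PairWithOne" >> beam.Map(lambda word: (word, 1))
        | "CountPerWord" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda word, count: f"{word},{count}")
        | "Write" >> beam.io.WriteToText("gs://example-bucket/output/word_counts")
    )
```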