
Customer Success Manager
You’re a problem-solver and amazing with customers.

•1-5 years of proven experience in sales or a similar role.
•Strong written and verbal communication skills.
•Strong technical knowledge of equipment and elevator systems.
•Excellent communication and presentation skills.
•Time management skills and the ability to meet deadlines.
•Strong problem-solving skills and a customer-focused approach.
•Proficiency in CRM software and Microsoft Office Suite (Excel, Word, PowerPoint).
•Valid driver’s license and willingness to travel as required.
•Ability to work independently and manage time effectively in a field-based role.
Key Responsibilities:
Regulatory Compliance & Governance
Ensure end-to-end compliance with the Companies Act, 2013, FEMA, Securities Laws, and all other applicable laws and regulations.
Prepare, file, and maintain all statutory records, registers, and returns in a timely and accurate manner.
Conduct and document compliance audits to assess adherence to statutory requirements.
Stay updated with amendments to laws, rules, and regulations, and proactively adapt company practices.
Board & Committee Support
Organize and manage Board Meetings, Committee Meetings (Audit, Nomination & Remuneration, CSR, etc.), and General Meetings (GMs).
Draft agendas, circulate notices, prepare board notes, liaise with directors, and record/document minutes accurately.
Advise the Board and committees on compliance, governance, and legal implications of key business decisions.
Assist the Board in implementing best corporate governance practices.
Shareholder & Investor Relations
Maintain the company’s statutory registers and records relating to shareholding.
Handle shareholder queries, grievances, and communications promptly.
Coordinate dividend distribution, rights/bonus issues, transmission and transfer of shares, and corporate action processes.
Legal & Corporate Documentation
Draft, vet, and review contracts, agreements, policies, and other corporate documents.
Safeguard the company’s legal interests by advising on matters such as mergers, acquisitions, restructuring, or capital raising (if applicable).
Maintain records of significant policies and ensure periodic reviews.
Liaison & External Relations
Act as the primary point of contact between the company and regulatory/government authorities or external agencies such as the ROC, RBI, MCA, and RTA.
Maintain professional networks with peers, regulators, and industry associations to ensure timely guidance and support in compliance-related queries.
Interface with auditors, legal advisors, and consultants for statutory and compliance matters.
HR Compliance & Coordination
Ensure compliance with applicable HR and labour laws including the Shops and Establishment Act, Payment of Wages Act, Employees’ Provident Funds & Miscellaneous Provisions Act, Employees’ State Insurance Act, and other relevant statutes.
Coordinate with the HR department to align statutory requirements with company policies and practices.
Monitor timely submission of statutory returns related to payroll, provident fund, gratuity, professional tax, etc.
Facilitate secretarial audits that encompass HR compliance aspects.
Provide guidance on governance-related HR matters to the Board and leadership.
Job Summary:
As an AWS Data Engineer, you will be responsible for designing, developing, and maintaining scalable, high-performance data pipelines using AWS services. With 6+ years of experience, you’ll collaborate closely with data architects, analysts, and business stakeholders to build reliable, secure, and cost-efficient data infrastructure across the organization.
Key Responsibilities:
- Design, develop, and manage scalable data pipelines using AWS Glue, Lambda, and other serverless technologies
- Implement ETL workflows and transformation logic using PySpark and Python on AWS Glue (a minimal job sketch follows this list)
- Leverage AWS Redshift for warehousing, performance tuning, and large-scale data queries
- Work with AWS DMS and RDS for database integration and migration
- Optimize data flows and system performance for speed and cost-effectiveness
- Deploy and manage infrastructure using AWS CloudFormation templates
- Collaborate with cross-functional teams to gather requirements and build robust data solutions
- Ensure data integrity, quality, and security across all systems and processes
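By way of illustration, here is a minimal sketch of the kind of Glue ETL job the role describes. The job parameters, the S3 paths, and the order_id/order_date/amount columns are illustrative assumptions, not details from the posting.

```python
# Hypothetical AWS Glue job: parameter names and the "orders" schema
# are illustrative assumptions, not part of the posting.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

# Resolve job arguments passed as --source_path / --target_path.
args = getResolvedOptions(sys.argv, ["JOB_NAME", "source_path", "target_path"])

sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw JSON from S3, apply simple transformations, write Parquet back.
raw = spark.read.json(args["source_path"])
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_date"))
       .filter(F.col("amount") > 0)
)
cleaned.write.mode("overwrite").partitionBy("order_date").parquet(args["target_path"])

job.commit()
```

A serverless job like this scales with the data and incurs cost only while running, which is the cost-efficiency angle the posting emphasizes.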
Required Skills & Experience:
- 6+ years of experience in Data Engineering with strong AWS expertise
- Proficient in Python and PySpark for data processing and ETL development
- Hands-on experience with AWS Glue, Lambda, DMS, RDS, and Redshift
- Strong SQL skills for building complex queries and performing data analysis
- Familiarity with AWS CloudFormation and infrastructure as code principles
- Good understanding of serverless architecture and cost-optimized design
- Ability to write clean, modular, and maintainable code
- Strong analytical thinking and problem-solving skills
• Run the production environment by monitoring availability and taking a holistic view of system health (see the availability-check sketch after the requirements below)
• Build software and systems to manage platform infrastructure and applications
• Improve reliability, quality, and time-to-market of our suite of software solutions
• Measure and optimize system performance, with an eye toward pushing our capabilities forward, getting ahead of customer needs, and innovating to continually improve
• Provide primary operational support and engineering for multiple large distributed software applications
• Drive cross-team alignment across development teams around reliability initiatives
The ideal candidate must have:
• A bachelor’s degree in computer science or another highly technical, scientific discipline
• The ability to program (structured and OO) in one or more high-level languages, such as Python, Java, C/C++, Ruby, and JavaScript
• Good experience with microservices architecture and serverless technologies
• Exposure to event-driven architecture and state machines
• A proactive approach to spotting problems, areas for improvement, and performance bottlenecks
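As a sketch of the availability monitoring mentioned above: the endpoint URLs, the polling interval, and the print-based alert are placeholders, assuming a simple HTTP health-check convention.

```python
# Minimal availability probe; the URLs and the alerting hook are
# illustrative placeholders, not systems named in the posting.
import time

import requests

ENDPOINTS = [
    "https://service-a.example.com/health",
    "https://service-b.example.com/health",
]

def check(url: str, timeout: float = 2.0) -> bool:
    """Return True if the endpoint answers 200 within the timeout."""
    try:
        return requests.get(url, timeout=timeout).status_code == 200
    except requests.RequestException:
        return False

while True:
    for url in ENDPOINTS:
        if not check(url):
            # A real system would page on-call or emit a metric here.
            print(f"ALERT: {url} is unhealthy")
    time.sleep(30)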
- We are looking for: Data Engineer
- Spark
- Scala
- Hadoop
Notice period: 15 to 30 days
Location: Bangalore / Noida
§ Experience developing web-based, enterprise-wide applications in .NET 4.5 (C#, ASP.NET), WCF, WPF, AJAX, etc.
§ Understanding of enterprise application development and deployment.
§ Good understanding of OOP and design patterns, and experience with tools like Enterprise Architect.
§ Good knowledge of RDBMS design, programming, and DBA concepts, and experience working with SQL Server and Oracle databases.
§ Good design, coding, and testing skills.
§ Knowledge of and exposure to Service-Oriented Architecture, Enterprise Service Bus, and application integration (using middleware).
§ Experience of working in the electricity utility domain, preferably in EA/AMR/AMI/EMS/IT System Integration/ERP (for Utilities) projects.
§ Mobility-based software development (GPRS, GSM).
§ Knowledge of BI tools implementation.
§ Experience in mobility-based applications will be an added advantage.
- 3+ years of industry experience in administering (including setting up, managing, and monitoring) data processing pipelines (both streaming and batch) using frameworks such as Kafka, the ELK Stack, and Fluentd, and streaming databases like Druid
- Strong industry expertise with containerization technologies, including Kubernetes and docker-compose
- 2+ years of industry experience in developing scalable data ingestion processes and ETLs
- Experience with cloud platform services such as AWS, Azure, or GCP, especially with EKS and Managed Kafka
- Experience with scripting languages; Python experience highly desirable
- 2+ years of industry experience in Python
- Experience with popular modern web frameworks such as Spring Boot, Play Framework, or Django
- Demonstrated expertise in building cloud-native applications
- Experience in administering (including setting up, managing, and monitoring) data processing pipelines (both streaming and batch) using frameworks such as Kafka, the ELK Stack, and Fluentd
- Experience in API development using Swagger
- Strong expertise with containerization technologies, including Kubernetes and docker-compose
- Experience with cloud platform services such as AWS, Azure, or GCP
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Familiarity with continuous integration (e.g., Jenkins)
- Design and implement large-scale data processing pipelines using Kafka, Fluentd, and Druid (a minimal Kafka sketch follows this list)
- Develop data ingestion processes and ETLs
- Design and implement APIs
- Assist in DevOps operations
- Identify performance bottlenecks and bugs, and devise solutions to these problems
- Help maintain code quality, organization, and documentation
- Communicate with stakeholders regarding various aspects of the solution
- Mentor team members on best practices
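A minimal sketch of the streaming-ingestion side of these pipelines, using the kafka-python client (an assumption; the posting names no client library). The topic name, broker address, and print sink are placeholders.

```python
# Hypothetical streaming ingestion loop with kafka-python; the
# "events" topic, broker address, and print sink are placeholders.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events",                           # assumed topic name
    bootstrap_servers="localhost:9092",
    group_id="ingestion-demo",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    record = message.value
    # A real pipeline would validate, enrich, and forward the record
    # (e.g., to Druid or Elasticsearch); here we just log it.
    print(f"partition={message.partition} offset={message.offset} {record}")
```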
+Work from Office
+Salary not a deterrent to the deserving applicants
+5-day work week (Sat-Sun off)
+Healthy and Conducive Work Environment
+Special care for work-life balance
+Medical Insurance
+To & fro Cab facility to Female employees
+Best in class incentives
+Timely Salary and incentive credits
+An HR team which cares for its people
+Fast-track growth
What you’ll do
- Deliver plugins for our Python-based ETL pipelines (a minimal plugin sketch follows this list).
- Deliver Python microservices for provisioning and managing cloud infrastructure.
- Implement algorithms to analyse large data sets.
- Draft design documents that translate requirements into code.
- Deal with challenges associated with handling large volumes of data.
- Assume responsibilities from technical design through technical client support.
- Manage expectations with internal stakeholders and context-switch in a fast paced environment.
- Thrive in an environment that uses AWS and Elasticsearch extensively.
- Keep abreast of technology and contribute to the engineering strategy.
- Champion best development practices and provide mentorship.
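As a sketch of what an ETL-pipeline plugin might look like with pandas: the transform() contract and the trade_id/executed_at columns are illustrative assumptions, not SteelEye's actual interface.

```python
# Hypothetical ETL plugin; the transform() contract and the trade
# column names are illustrative, not SteelEye's API.
import pandas as pd

class DeduplicateTrades:
    """Drop duplicate trade records and normalise timestamps."""

    def transform(self, frame: pd.DataFrame) -> pd.DataFrame:
        out = frame.drop_duplicates(subset=["trade_id"]).copy()
        out["executed_at"] = pd.to_datetime(out["executed_at"], utc=True)
        return out

# Usage: run a frame through a pipeline of plugin steps.
pipeline = [DeduplicateTrades()]
frame = pd.DataFrame(
    {"trade_id": [1, 1, 2], "executed_at": ["2024-01-01", "2024-01-01", "2024-01-02"]}
)
for step in pipeline:
    frame = step.transform(frame)
print(frame)
```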
What we’re looking for
- Experience in Python 3.
- Python libraries used for data (such as pandas, numpy).
- AWS.
- Elasticsearch.
- Performance tuning.
- Object Oriented Design and Modelling.
- Delivering complex software, ideally in a FinTech setting.
- CI/CD tools.
- Knowledge of design patterns.
- Sharp analytical and problem-solving skills.
- Strong sense of ownership.
- Demonstrable desire to learn and grow.
- Excellent written and oral communication skills.
- Mature collaboration and mentoring abilities.
About SteelEye Culture
- Work from home until you are vaccinated against COVID-19
- Top of the line health insurance
- Order discounted meals every day from a dedicated portal
- Fair and simple salary structure
- 30+ holidays in a year
- Fresh fruits every day
- Centrally located. 5 mins to the nearest metro station (MG Road)
- Measured on output and not input
About the Role
Dremio’s user experience is one of its key differentiators and makes all your data easily accessible and shareable by your data consumers. UI Engineers at Dremio are responsible for the development of the user interface and user experience on Dremio’s Data Lake Engine.
Responsibilities and ownership
- Own the full cycle of development of our modern single page web application from inception, design, development, testing, and production.
- Care deeply about modular design patterns and frameworks, delivering an architecture rooted in simplicity that is easy to iterate on and constantly evolve.
- Be passionate about the ease of use, experience, and quality of the product.
Requirements
- 5+ years of experience working with JavaScript frameworks such as React, AngularJS, Angular, or Vue.js.
- A minimum of 2 years of experience with React is highly preferred, ideally with React in use in your current role.
- Strong coding experience in JavaScript (or TypeScript), HTML, and CSS.
- Passion for UI development and UX design.
- Proven success in delivering high-quality front-end applications.
- Fluency with SQL and databases (relational or non-relational).
- B.S. or M.S. in Computer Science or a relevant technical field, or equivalent professional experience.









