
About ZipGrid - MyAashiana Management Services
We are seeking a highly skilled QA Salesforce Automation Engineer with 6+ years of experience to join our dynamic team in Bangalore. The ideal candidate will have a strong background in Salesforce testing, with hands-on experience in automation tools such as Selenium, Provar, or TestNG. You will be responsible for creating and maintaining automated test scripts, executing test cases, identifying bugs, and ensuring the quality and reliability of Salesforce applications. A solid understanding of Salesforce modules (Sales Cloud, Service Cloud, etc.) and APIs is essential. Experience with CI/CD tools like Jenkins and version control systems like Git is preferred. You will work closely with developers, business analysts, and stakeholders to define test strategies and improve the overall QA process. This is a hybrid role, offering flexibility and a collaborative environment focused on delivering high-quality enterprise solutions.
Responsibilities:
∙Develop and maintain code following predefined cost, company and security standards.
∙Work on bug fixes, supporting the maintenance and improvement of existing applications.
∙Build interfaces using the standards and design principles defined by the team.
∙Develop systems with high availability.
∙Attend and contribute to development meetings.
∙Be well versed in unit testing and PSR standards.
∙Master the software development lifecycle, standards and technologies used by the team.
∙Deliver on time with high quality.
∙Write automation tests for each API call before coding it, then test it.
∙Apply troubleshooting and debugging skills.
∙Produce technical documentation for the implemented tasks.
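The test-first responsibility above can be sketched in miniature: the test for an API handler is written before the handler itself, and the handler is then implemented only to satisfy that contract. The endpoint name and payload shape below are illustrative assumptions, not details from the posting.

```python
# A minimal test-first sketch: the test is written before the handler.
# The "create user" endpoint and its payload are hypothetical examples.
import json

def test_create_user_returns_id():
    # Written first: this defines the contract the handler must satisfy.
    response = handle_create_user(json.dumps({"name": "Asha"}))
    body = json.loads(response["body"])
    assert response["status"] == 201
    assert body["name"] == "Asha"
    assert "id" in body

def handle_create_user(raw_body: str) -> dict:
    # Implemented second, only to make the test above pass.
    payload = json.loads(raw_body)
    user = {"id": 1, "name": payload["name"]}
    return {"status": 201, "body": json.dumps(user)}

test_create_user_returns_id()
```

In practice the same flow is used with a test runner such as pytest and a real HTTP client, but the ordering (contract first, implementation second) is the point.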
About - The company is an online discovery platform that offers direct-to-home product trials for D2C brands.
Location - Bangalore
Funding - Series B
Responsibilities
- Lead simultaneous development for multiple business verticals.
- Design & develop highly scalable, reliable, secure, and fault-tolerant systems.
- Ensure that exceptional standards are maintained in all aspects of engineering.
- Collaborate with other engineering teams to learn and share best practices.
- Take ownership of technical performance metrics and strive actively to improve them.
- Mentor junior members of the team and contribute to code reviews.
Requirements
- A passion to solve tough engineering/data challenges.
- Well versed with cloud computing platforms (AWS/GCP)
- Experience with SQL technologies (MySQL, PostgreSQL)
- Experience working with NoSQL technologies (MongoDB, Elasticsearch)
- Excellent programming skills in Python/Java/GoLang
- Big data streaming services (Kinesis, Kafka, RabbitMQ)
- Distributed cache systems (Redis, Memcache)
- Advanced data solutions (BigQuery, Redshift, DynamoDB, Cassandra)
- Automated testing frameworks and CI/CD pipelines
- Infrastructure orchestration (Docker/Kubernetes/Nginx)
- Cloud-native tech like Lambda, ASG, CDN, ELB, SNS/SQS, S3, Route53, SES
- Big data developer with 8+ years of professional IT experience with expertise in Hadoop ecosystem components in ingestion, Data modeling, querying, processing, storage, analysis, Data Integration and Implementing enterprise level systems spanning Big Data.
- A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components like Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
- Hands on experience in creating real - time data streaming solutions using Apache Spark core, Spark SQL & DataFrames, Kafka, Spark streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, which include the NameNode, DataNode, ResourceManager, NodeManager and Job History Server.
- Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in the installation, configuration and management of Big Data workloads and the underlying infrastructure of a Hadoop cluster.
- Hands-on experience in coding MapReduce/YARN programs using Java, Scala and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark with Scala on-cluster for analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster processing of data; handled importing data from different data sources into HDFS using Sqoop and performed transformations using Hive and MapReduce before loading the data into HDFS.
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence, customer segmentation and the smooth maintenance of Spark Streaming jobs.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Worked on creating data pipelines for ingestion and aggregation events, loading consumer response data into Hive external tables at an HDFS location to serve as a feed for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience with the Microsoft cloud and setting up clusters on Amazon EC2 & S3, including automating the setup and extension of clusters in the AWS cloud.
- Extensively worked on Spark with Python on-cluster for analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala, including Impala analytical functions.
- Experience in creating DataFrames using PySpark and performing operations on them using Python.
- In-depth understanding of Hadoop architecture and its various components, such as HDFS, the MapReduce programming paradigm, High Availability and the YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided the access needed for pulling information for analysis.
- Generated various kinds of knowledge reports using Power BI based on Business specification.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Experienced in projects using JIRA, testing, and the Maven and Jenkins build tools.
- Experienced in designing, building, deploying and utilizing almost all of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance and auto-scaling.
- Good experience with use-case development and software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience in importing data using Sqoop and SFTP from various sources like RDBMS, Teradata, Mainframes, Oracle and Netezza into HDFS, and performing transformations on it using Hive, Pig and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra and MongoDB, and their integration with a Hadoop cluster.
- Hands on experience in Hadoop Big data technology working on MapReduce, Pig, Hive as Analysis tool, Sqoop and Flume data import/export tools.
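The MapReduce programming paradigm referenced throughout the points above can be illustrated in miniature: a map phase emits (key, 1) pairs, a shuffle groups them by key, and a reduce phase aggregates each group. Hadoop distributes these phases across a cluster; this single-process word-count sketch only demonstrates the programming model, not the framework.

```python
# A pure-Python sketch of the MapReduce paradigm (word count).
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Shuffle: group values by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts for each word.
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data big pipelines", "data pipelines"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts)  # {'big': 2, 'data': 2, 'pipelines': 2}
```

The same mapper and reducer logic, written against the Hadoop Streaming or Spark RDD APIs, is what runs at cluster scale.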
Job Description:
We are looking for an "Area Head" to expand our customer base and achieve sales quotas for a specific zone of our company.
Ultimately, you will ensure your area of responsibility meets and exceeds the expectations of our business objectives and contributes to our company’s success in the long run.
Responsibilities:
- Create area sales plans and quotas in alignment with business objectives
- Achieve business targets of the area for various quarters as well as the financial year
- Develop and manage an efficient distribution network to improve sales performance
- Build the AffordPlan brand within the defined hospital network zone
- Maintain strong relationships with doctors and customers.
- Implement innovative sales techniques to increase customer satisfaction.
- Identify hiring needs, and select and train new executives.
- Optimize and oversee operations to ensure efficiency
- Convert leads from various channels.
- Forecast daily MIS on product performance across all branches.
- Develop a constructive and cooperative working relationship with the employees of the mapped branch.
Requirement:
- Proven work experience as an Area Sales Manager, Territory Manager, Zone Manager or in a similar senior sales role
- Minimum 4 years of experience required in B2C sales (preferred industries - BFSI Insurance/Retail & Distribution/Telecom/Consumer Durables).
- Ability to measure and analyze key performance indicators (profit and other KPIs)
- Ability to lead and motivate a high-performance sales team.
- Strong organizational skills with a problem-solving attitude
- Exceptional communication and interpersonal abilities.
- Availability to travel as needed
SDE3 Solutions Architect (Mobile) (Remote)
Grip Invest (www.gripinvest.in)
Why should you look at this role?
Salary Bracket: 40-50 LPA
Healthy work-life Balance
Core values
Health insurance
Provident fund
Annual Bonus
What else?
- Great culture based on the following core values
- Courage
- Transparency
- Ownership
- Commitment
- Celebration
- Lean structure and no micromanaging. You get to own your work
- The company just turned one so you get a seat on a rocket ship that's just taking off!
- High focus on Learning & Development and monetary support for relevant up skilling
- Competitive compensation along with equity ownership for wealth creation
Company Size- less than 50
About the company
Grip is building a new category of investment options for the new age of Indians. Millennials don't communicate, shop, pay, entertain or work like the previous generation - then why should they invest the same way? Started in June '20, Grip has seen 35% month-on-month growth to become one of India's fastest-growing destinations for alternative investments. Today, Grip offers a unique investment option of leasing assets to some of India's most disruptive businesses like Udaan, Stanza Living, Furlenco, Bounce, BlueStone, FabAlley and LetsTransport. With a minimum investment size of INR 20,000, Grip is democratizing investment options that have only been available to the largest funds and family offices. Finance and technology (FinTech) is what we do, but people are at the core of our mission. From client-facing roles to technology, and everywhere in between, you'll work alongside a diverse team who loves to solve problems, think creatively, and fly the plane as we continue to build it.
Employer Reviews
“Work from home”
“Free work culture”
For more reviews, click here
https://www.glassdoor.co.in/Overview/Working-at-Grip-EI_IE4118155.11,15.htm
What will you do every day?
- S/he will build a clear technical vision based on business requirements and ensure alignment of all stakeholders.
- Lead and participate in project deliverables, including architecture, technical design, coding, and QA
- Monitor the performance of live apps and work on optimising them at the code level
- Identify and resolve bottlenecks, rectify bugs and enhance application performance
- Additionally, s/he will be the primary technical point of contact for the business team to help them understand the complexity of any new feature.
Your Superpowers
- Expertise in iOS, Android, HTML5, CSS3, and other mobile frameworks/accelerators.
- Strong understanding of, and experience with, design patterns and data structures.
- Demonstrated deployments of enterprise or consumer-facing mobile software systems using industry standard environments including iOS, Android and Hybrid platforms.
- Understanding of REST and JSON, and experience with utilising REST on mobile clients.
- Good understanding of Version Control principles, preferably using Git
- Experience in working on setting up DevOps practices
- Ability to lead a team of 3-5 engineers across Android/iOS.
- Experience shipping several native apps.
- Strong collaboration, communication, and creative thinking skills
- Ability to rapidly learn and take advantage of new concepts, business models, and technologies
Click Here to apply
https://forms.gle/vF3FYw2ucE1vYur96
Leena AI has been founded by IIT Delhi alumnus and senior industry veterans with the mantra to "Improve Employee Experience through Intelligent Conversations”.
Leena AI provides an end-to-end HR chatbot that improves the employee and candidate experience. The solution reduces query-handling time for HR and improves the employee experience. Apart from this, we also have out-of-the-box modules that provide real-time status updates for employee queries in Workday, improve the employee onboarding and exit processes, and simplify the employee-manager self-service process in the organization. We enable out-of-the-box integration with Workday, Microsoft AD, Kronos and SharePoint to provide a superior experience to employees over the chatbot.
At Leena AI, we are looking for a passionate Lead Front-End Developer who is proficient with React.js. You will be involved from conception to completion with projects that are technologically sound and aesthetically impressive. Therefore, a commitment to collaborative problem solving, sophisticated design, and quality products is important.
Responsibilities:
As a Lead Front-End Developer you would be responsible for:
- Leading a team and collaborating with it to deploy and continuously improve our solutions.
- Developing new user-facing features using React.js.
- Building reusable components and front-end libraries for future use.
- Translating designs and wireframes into high quality and pixel perfect code.
- Optimising components for maximum performance across a vast array of web-capable devices and browsers.
- And everything that a coder does.
- Strong proficiency in JavaScript, including DOM manipulation and the JavaScript object model.
- Thorough understanding of React.js and Redux and their core principles.
- Familiarity with newer specifications of ECMAScript is a plus.
- Familiarity with RESTful APIs.
- Knowledge of modern authorisation mechanisms, such as JSON Web Tokens.
- Familiarity with modern front-end build pipelines and tools.
- Experience with common front-end development tools such as Babel, Webpack, NPM, etc.
- Ability to understand business requirements and translate them into technical requirements.
- Lead a product (would be a plus)
- Have managed a team before
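The JSON Web Token mechanism mentioned in the requirements above rests on a simple scheme (HS256): an HMAC-SHA256 signature over the base64url-encoded header and payload. The sketch below illustrates only that signing scheme; real projects should use a maintained library such as PyJWT, and the secret and claims here are illustrative assumptions.

```python
# A minimal sketch of HS256 JWT signing and verification.
# Illustrative only; use a maintained JWT library in production.
import base64, hashlib, hmac, json

def b64url(data: bytes) -> str:
    # base64url encoding without padding, as the JWT format requires.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_token(payload: dict, secret: bytes) -> str:
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = ".".join(
        b64url(json.dumps(part, separators=(",", ":")).encode())
        for part in (header, payload)
    )
    signature = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{b64url(signature)}"

def verify_token(token: str, secret: bytes) -> bool:
    signing_input, _, signature = token.rpartition(".")
    expected = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    # Constant-time comparison to avoid timing attacks.
    return hmac.compare_digest(b64url(expected), signature)

token = sign_token({"sub": "user-1"}, b"demo-secret")
print(verify_token(token, b"demo-secret"))   # True
print(verify_token(token, b"wrong-secret"))  # False
```

A front end typically never signs tokens itself; it stores the server-issued token and attaches it to API requests in an `Authorization: Bearer` header.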
● Design overall architecture of the web application.
● Maintain quality and ensure responsiveness of applications.
● Collaborate with the rest of the engineering team to design and launch new features.
● Maintain code integrity and organization.
● Experience working with graphic designers and converting designs to visual elements.
● Understanding and implementation of security and data protection.
● Highly experienced with front-end programming languages - HTML, CSS, JavaScript, etc.
● Proficient experience using advanced JavaScript libraries and frameworks.
● Development experience for both mobile and desktop.
● Knowledge of code versioning tools such as Git
Requirement -
● Proven work experience as a Front-end developer
● Hands on experience with markup languages
● Experience with JavaScript, CSS and HTML
● Wordpress would be an added advantage.
● Preference for candidates from Tier One colleges (IIIT, NIT, IIT, NSUT, DTU, etc.)
● Familiarity with browser testing and debugging
● In-depth understanding of the entire web development process (design, development and deployment)
● Understanding of layout aesthetics
● An ability to perform well in a fast-paced environment
● Excellent analytical and multitasking skills
● B.Tech. in Computer Science preferred.