• Manage new feature releases while keeping the quality bar high in terms of engineering practices
• Manage, mentor, and coach a team of 5-10 frontend engineers while remaining a hands-on engineer yourself
• Understand the product's current set of UI components, extend them, and build more reusable components
• Define & document coding standards & best practices
• Continuously improve processes and technology by showing the team better ways of working and helping team members grow their skills
• Optimize solutions for performance and scalability
• Dive deep into coding patterns and contribute to curating a library of reusable components that can be used to build more complex, unified interfaces
• Estimate, plan, track, and handle multiple priorities in a fast-paced environment
• Write test cases and document UI components
• Ensure code quality by reviewing team members' code and mentoring others on the frontend team
• Interface with the Design, Backend, and DevOps teams
Requirements:
• 8+ years of product development experience with expertise in designing and implementing high-performing web applications
• 3+ years of experience leading a team of junior and senior React engineers
• Strong knowledge of React, ES6, TypeScript, and HTML/CSS3
• Experience in building React applications using Redux, Flux, webpack, NPM, create-react-app
• Deep understanding of frontend performance optimization, including resource loading strategy and CPU/memory profiling in the browser
• Experience with testing libraries such as Jest, Enzyme, or Mocha
• Experience building analytics UIs using D3, HighCharts, and amChart libraries
• Experience in building Progressive Web Apps (PWA)
• Knowledge of Node servers, socket connections, and mechanisms for handling real-time async data
• Good knowledge of content-serving technologies such as HTTP, CDNs, proxies, and caching

Similar jobs
About the Role:
We are seeking an experienced Data Engineer to lead and execute the migration of existing Databricks-based pipelines to Snowflake. The role requires strong expertise in PySpark/Spark, Snowflake, DBT, and Airflow with additional exposure to DevOps and CI/CD practices. The candidate will be responsible for re-architecting data pipelines, ensuring data consistency, scalability, and performance in Snowflake, and enabling robust automation and monitoring across environments.
Key Responsibilities
Databricks to Snowflake Migration
· Analyze and understand existing pipelines and frameworks in Databricks (PySpark/Spark).
· Re-architect pipelines for execution in Snowflake using efficient SQL-based processing.
· Translate Databricks notebooks/jobs into Snowflake/DBT equivalents (a brief illustration follows this list).
· Ensure a smooth transition with data consistency, performance, and scalability.
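A minimal sketch of the kind of translation involved, assuming hypothetical table and column names: a Databricks PySpark aggregation re-expressed as set-based SQL pushed down to Snowflake via the Snowflake Python connector.

# --- Existing Databricks/PySpark version (illustrative only) ---
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
daily_revenue = (
    spark.table("raw.orders")
    .where(F.col("status") == "COMPLETED")
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# --- Snowflake equivalent: push the same logic down as a SQL statement ---
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",  # assumed credentials
    warehouse="TRANSFORM_WH", database="ANALYTICS", schema="MARTS",
)
conn.cursor().execute("""
    CREATE OR REPLACE TABLE daily_revenue AS
    SELECT order_date, SUM(amount) AS revenue
    FROM raw.orders
    WHERE status = 'COMPLETED'
    GROUP BY order_date
""")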
Snowflake
· Work hands-on with storage integrations, staging (internal/external), Snowpipe, tables/views, COPY INTO, CREATE OR ALTER, and file formats (a staging/COPY INTO sketch follows below).
· Implement RBAC (role-based access control), data governance, and performance tuning.
· Design and optimize SQL queries for large-scale data processing.
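As a rough sketch of the staging and bulk-load pattern referenced above (stage, storage integration, file format, and table names are all assumed), ingestion with COPY INTO via the Python connector might look like:

import snowflake.connector

# Assumed connection details; in practice these would come from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",
    warehouse="LOAD_WH", database="RAW", schema="LANDING",
)
cur = conn.cursor()

# Target table, file format, and an external stage backed by object storage (names hypothetical).
cur.execute("CREATE TABLE IF NOT EXISTS raw_orders (order_id INT, order_date DATE, amount NUMBER, status STRING)")
cur.execute("CREATE FILE FORMAT IF NOT EXISTS csv_fmt TYPE = CSV SKIP_HEADER = 1")
cur.execute("""
    CREATE STAGE IF NOT EXISTS orders_stage
    URL = 's3://example-bucket/orders/'
    STORAGE_INTEGRATION = s3_int
    FILE_FORMAT = (FORMAT_NAME = 'csv_fmt')
""")

# Bulk-load any staged files into the target table.
cur.execute("""
    COPY INTO raw_orders
    FROM @orders_stage
    FILE_FORMAT = (FORMAT_NAME = 'csv_fmt')
    ON_ERROR = 'ABORT_STATEMENT'
""")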
DBT (with Snowflake)
· Implement and manage models, macros, materializations, and SQL execution within DBT.
· Use DBT for modular development, version control, and multi-environment deployments.
Airflow (Orchestration)
· Design and manage DAGs to automate workflows and ensure reliability (a minimal DAG sketch follows this list).
· Handle task dependencies, error recovery, monitoring, and integrations (Cosmos, Astronomer, Docker).
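A minimal Airflow DAG sketch of the orchestration described above; the DAG id, schedule, and task callables are assumptions, not an existing pipeline.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_dbt_models(**context):
    # Placeholder: in practice this might shell out to `dbt run` or use Cosmos.
    print("running dbt models")


def validate_loads(**context):
    print("validating row counts against source")


with DAG(
    dag_id="snowflake_daily_load",          # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",                   # daily at 02:00
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    transform = PythonOperator(task_id="run_dbt_models", python_callable=run_dbt_models)
    validate = PythonOperator(task_id="validate_loads", python_callable=validate_loads)

    # Task dependency: validation runs only after the transform succeeds.
    transform >> validate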
DevOps & CI/CD
· Develop and manage CI/CD pipelines for Snowflake and DBT using GitHub Actions, Azure DevOps, or equivalent.
· Manage version-controlled environments and ensure smooth promotion of changes across dev, test, and prod.
Monitoring & Observability
· Implement monitoring, alerting, and logging for data pipelines.
· Build self-healing or alert-driven mechanisms for detecting critical/severe issues (see the callback sketch below).
· Ensure system reliability and proactive issue resolution.
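One simple alert-driven pattern, sketched under the assumption of a generic incoming-webhook endpoint: an Airflow on_failure_callback that posts failing-task details to an alerting channel.

import json
import urllib.request

ALERT_WEBHOOK_URL = "https://hooks.example.com/alerts"  # assumed endpoint


def alert_on_failure(context):
    # Airflow passes the task context to on_failure_callback; extract the essentials.
    ti = context["task_instance"]
    payload = {
        "dag": ti.dag_id,
        "task": ti.task_id,
        "run": str(context.get("logical_date")),
        "log_url": ti.log_url,
    }
    req = urllib.request.Request(
        ALERT_WEBHOOK_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=10)

# Attach via default_args={"on_failure_callback": alert_on_failure} in the DAG sketch above.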
Required Skills & Qualifications
· 5+ years of experience in data engineering with focus on cloud data platforms.
· Strong expertise in:
· Databricks (PySpark/Spark) – analysis, transformations, dependencies.
· Snowflake – architecture, SQL, performance tuning, security (RBAC).
· DBT – modular model development, macros, deployments.
· Airflow – DAG design, orchestration, and error handling.
· Experience in CI/CD pipeline development (GitHub Actions, Azure DevOps).
· Solid understanding of data modeling, ETL/ELT processes, and best practices.
· Excellent problem-solving, communication, and stakeholder collaboration skills.
Good to Have
· Exposure to Docker/Kubernetes for orchestration.
· Knowledge of Azure Data Services (ADF, ADLS) or similar cloud tools.
· Experience with data governance, lineage, and metadata management.
Education
· Bachelor’s / Master’s degree in Computer Science, Engineering, or related field.
Job Summary: Our growing company needs an experienced and resourceful Business Development Executive (Online Bidder) to develop growth opportunities in existing and new IT markets. Expected Qualities:
1. Must have experience in online bidding.
2. Must be familiar with online portals like Upwork.
3. Experience in proposal writing.
4. Confident in international client communication via email, chat, and audio/video calls (e.g., Skype).
5. Strong sales expertise for IT services projects such as digital marketing, SEO, and online branding.
6. Good working experience in effort estimation, client follow-up, and proposal writing.
7. Excellent communication/presentation skills and ability to build relationships
8. Analytical and time-management skills
9. Upselling and cross-selling skills
10. A flexible work schedule
Roles & Responsibilities:
1. To identify profitable business opportunities based on the analysis of potential profit margins, timescales, and competition
2. The candidate is responsible for generating business from online bidding portals like Upwork.
3. Responsible for bidding on Digital Marketing, SEO, and online branding projects.
4. To develop and maintain a lead generation plan.
5. Will be responsible for generating new leads and contributing to revenue generation.
6. Creating and maintaining a database of prospective clients and their information.
7. Responsible for costing, negotiations, follow-up, and requirement gathering.
8. To identify and report on market trends, competitor activity, customer demand, buying process developments and other relevant market intelligence
- Candidate should have good platform experience on Azure with Terraform.
- The DevOps engineer needs to help developers create pipelines and K8s deployment manifests.
- Good to have experience migrating data from AWS to Azure.
- Manage/automate infrastructure using Terraform. Jenkins is the key CI/CD tool we use, and it will be used to run these Terraform configurations.
- VMs are to be provisioned and managed on Azure Cloud.
- Good hands-on experience with cloud networking is required.
- Ability to set up databases on VMs as well as managed DBs, and to properly configure cloud-hosted microservices to communicate with those database services.
- Kubernetes, Storage, Key Vault, Networking (load balancing and routing), and VMs are the key areas of infrastructure expertise that are essential.
- The requirement is to administer a Kubernetes cluster end to end (application deployment, managing namespaces, load balancing, policy setup, blue-green/canary deployment models, etc.).
- Experience in AWS is desirable.
- Python experience is optional; however, PowerShell is mandatory.
- Know-how in using GitHub.
Job Description:
Job Title: Accounts & Audit Executive
Experience: 4 to 5 years
Gender - Male
Qualification: Graduate / Post Graduate
Working days: 5
Location: Mumbai (Andheri West)
Roles and responsibilities:-
Understanding internal audits to ensure the company meets its financial, operational, and compliance objectives.
Initiating improvements to the Financial Control and Auditing process.
Acquire, analyze, and evaluate accounting documentation, data, and reports.
Prepare and present reports that reflect audit results and document the process.
Identify loopholes and suggest appropriate risk management activities.
Maintain open communication with management committee.
Conduct follow up audits.
Desired Skills:
Bachelor's degree in Accounting, Finance, or related field.
Experience in Accounting and Audit
Experience in finalization of accounts
Experience in payroll audit
Knowledge and expertise in Ind AS, IFRS, Accounting and Audit Standards
Minimum 4 or 5 years of experience is needed.
Strong interpersonal skills, critical thinking skills, and time management skills.
Proficient verbal and written communication skills.
- Identify and assess customers’ needs to achieve satisfaction
- Build sustainable relationships and trust with customer accounts through open and interactive communication
- Provide accurate, valid and complete information by using right methods/tools
- Follow communication procedures, guidelines and policies
- Take the extra mile to engage customers
- Keep records of customer interactions
- Making calls to declined users and understanding the reasons behind the decline
- Converting declined users into business
- Generating leads for the business via online or offline modes
- Making more people aware of the product
- Reaching out to customers, getting them to register, and helping them avail the loan
- Expanding the business through marketing and branding
- Making 200+ connected calls every day
- Produce detailed specifications
- Troubleshoot, test and maintain the core product software and databases to ensure strong optimization and functionality
- Contribute to all phases of the development lifecycle
- Follow industry best practices
- Develop and deploy new features to facilitate related procedures and tools if necessary
Requirements :
- Expert in Python with at least one Python web framework (such as Flask, Django, etc.)
- Hands-on experience building and leveraging third-party RESTful APIs (see the sketch after this list)
- Strong knowledge of algorithms and data structures
- Experience with at least one of the Python libraries such as Pandas, Scikit, NumPy, BeautifulSoup, and Scrapy
- Hands-on experience on the Linux platform is a must
- Understanding of design principles behind a scalable application
- Knowledge in relational and NoSQL databases
- Exposure to Big Data and Real-time analytics is a plus
- Knowledge of Machine learning or Natural language processing in Python is a plus
- Experience using a public cloud such as Amazon AWS
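A small sketch of the kind of work implied above; the endpoint path and upstream API URL are hypothetical. It shows a Flask service that wraps a third-party REST API.

import requests
from flask import Flask, jsonify

app = Flask(__name__)

UPSTREAM_API = "https://api.example.com/v1/users"  # hypothetical third-party API


@app.route("/users/<int:user_id>")
def get_user(user_id: int):
    # Fetch a user from the upstream REST API and return a trimmed response.
    resp = requests.get(f"{UPSTREAM_API}/{user_id}", timeout=5)
    if resp.status_code == 404:
        return jsonify({"error": "user not found"}), 404
    resp.raise_for_status()
    data = resp.json()
    return jsonify({"id": data.get("id"), "name": data.get("name")})


if __name__ == "__main__":
    app.run(debug=True)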
- Minimum 2 years of experience with the MEAN stack
- Extensive hands-on experience with Node JS/Express/Hapi JS, Moleculer, and NoSQL DBs (MongoDB and Redis are preferred)
- Strong coding and designing skills
- Working understanding of Continuous Integration and Continuous Deployment concepts and tools (GitLab CI/CD), development tools (Git), and application servers (nginx, Apache)
- Experience consuming and developing secure RESTful APIs/web services
- Comfortable using Unix/Linux machines from the command prompt
- As a full-stack architect on the team, you'll bring your ideas to life on a technology stack of Node JS and NoSQL databases, among others.
- We are looking for tech geeks who are hands-on and in love with building scalable, distributed and large web / mobile products.
- You must be an excellent problem solver with a passion for self-learning and implementing backend technologies
- You would be responsible for the architecture design, code review, and technology build and deployment activities of web/mobile applications.
- Own product development from scratch: architect scalable, distributed, and large-scale web and mobile solutions
- You would be responsible for writing SRS documents, which requires strong technical writing and communication skills
- You would be responsible for managing and mentoring your team members and helping them advance in their learning and career goals
- Ensure test-driven development (TDD) methodologies are used to execute projects
- Define and ensure the right coding practices
- Code for fresh development as well as to troubleshoot and resolve issues
- Lead web development efforts, including hiring, mentoring and advising peers









