SQL Jobs in Bangalore (Bengaluru)

50+ SQL Jobs in Bangalore (Bengaluru) | SQL Job openings in Bangalore (Bengaluru)

Apply to 50+ SQL Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest SQL Job opportunities across top companies like Google, Amazon & Adobe.

Deqode
Posted by Mokshada Solanki
Bengaluru (Bangalore), Mumbai, Pune, Gurugram
4 - 5 yrs
₹4L - ₹20L / yr
SQL
Amazon Web Services (AWS)
Migration
PySpark
ETL

Job Summary:

Seeking a seasoned SQL + ETL Developer with 4+ years of experience in managing large-scale datasets and cloud-based data pipelines. The ideal candidate is hands-on with MySQL, PySpark, AWS Glue, and ETL workflows, with proven expertise in AWS migration and performance optimization.
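For a sense of the hands-on work described above, here is a minimal PySpark sketch of one ETL step: reading a large MySQL table over JDBC, aggregating it, and writing partitioned Parquet to S3. The host, schema, table, column, and bucket names are illustrative placeholders, and the cluster is assumed to have the MySQL JDBC driver on its classpath.

```python
# Minimal PySpark ETL sketch: read from MySQL over JDBC, aggregate, write partitioned Parquet.
# Connection details, table names, and the S3 path are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Parallelise the JDBC read so a 100M+ row table is split across executors.
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:mysql://mysql-host:3306/sales")   # placeholder host/schema
    .option("dbtable", "orders")                            # placeholder table
    .option("user", "etl_user")
    .option("password", "***")
    .option("partitionColumn", "order_id")
    .option("lowerBound", "1")
    .option("upperBound", "100000000")
    .option("numPartitions", "16")
    .load()
)

# Example transformation: keep completed orders and roll them up per customer per day.
daily = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date", "customer_id")
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("order_count"))
)

# Partitioning by date lets downstream queries prune partitions instead of scanning everything.
daily.write.mode("overwrite").partitionBy("order_date").parquet("s3://example-bucket/curated/daily_orders/")
```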


Key Responsibilities:

  • Develop and optimize complex SQL queries and stored procedures to handle large datasets (100+ million records).
  • Build and maintain scalable ETL pipelines using AWS Glue and PySpark.
  • Work on data migration tasks in AWS environments.
  • Monitor and improve database performance; automate key performance indicators and reports.
  • Collaborate with cross-functional teams to support data integration and delivery requirements.
  • Write shell scripts for automation and manage ETL jobs efficiently.


Required Skills:

  • Strong experience with MySQL, complex SQL queries, and stored procedures.
  • Hands-on experience with AWS Glue, PySpark, and ETL processes.
  • Good understanding of AWS ecosystem and migration strategies.
  • Proficiency in shell scripting.
  • Strong communication and collaboration skills.


Nice to Have:

  • Working knowledge of Python.
  • Experience with AWS RDS.



Deqode
Posted by Shraddha Katare
Bengaluru (Bangalore), Pune, Chennai, Mumbai, Gurugram
5 - 7 yrs
₹5L - ₹15L / yr
Amazon Web Services (AWS)
Python
PySpark
SQL

Profile: AWS Data Engineer

Mode - Hybrid

Experience - 5 to 7 years

Locations - Bengaluru, Pune, Chennai, Mumbai, Gurugram


Roles and Responsibilities

  • Design and maintain ETL pipelines using AWS Glue and Python/PySpark
  • Optimize SQL queries for Redshift and Athena
  • Develop Lambda functions for serverless data processing (see the sketch after this list)
  • Configure AWS DMS for database migration and replication
  • Implement infrastructure as code with CloudFormation
  • Build optimized data models for performance
  • Manage RDS databases and AWS service integrations
  • Troubleshoot and improve data processing efficiency
  • Gather requirements from business stakeholders
  • Implement data quality checks and validation
  • Document data pipelines and architecture
  • Monitor workflows and implement alerting
  • Keep current with AWS services and best practices
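To illustrate the serverless item flagged above, here is a hedged sketch of an S3-triggered Lambda handler that does light validation on a CSV object and writes a cleaned copy. The destination bucket and the expected "id" column are assumptions made for the example.

```python
# Sketch of an AWS Lambda handler for lightweight serverless processing of S3 objects.
import csv
import io
import boto3

s3 = boto3.client("s3")
CLEAN_BUCKET = "example-clean-bucket"  # placeholder destination bucket

def handler(event, context):
    """Triggered by an S3 put event; drops rows missing the key column and re-uploads the file."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        rows = list(csv.DictReader(io.StringIO(body)))
        clean = [r for r in rows if r.get("id")]  # minimal data-quality rule

        if rows:
            out = io.StringIO()
            writer = csv.DictWriter(out, fieldnames=rows[0].keys())
            writer.writeheader()
            writer.writerows(clean)
            s3.put_object(Bucket=CLEAN_BUCKET, Key=f"clean/{key}", Body=out.getvalue())

    return {"processed": len(event.get("Records", []))}
```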


Required Technical Expertise:

  • Python/PySpark for data processing
  • AWS Glue for ETL operations
  • Redshift and Athena for data querying
  • AWS Lambda and serverless architecture
  • AWS DMS and RDS management
  • CloudFormation for infrastructure
  • SQL optimization and performance tuning


Gruve
Posted by Reshika Mendiratta
Bengaluru (Bangalore), Pune
5yrs+
Upto ₹50L / yr (Varies)
Python
SQL
Data engineering
Apache Spark
PySpark
+6 more

About the Company:

Gruve is an innovative software services startup dedicated to empowering enterprise customers in managing their data life cycle. We specialize in Cyber Security, Customer Experience, Infrastructure, and advanced technologies such as Machine Learning and Artificial Intelligence. Our mission is to help our customers use their data to make more intelligent business decisions. As a well-funded early-stage startup, Gruve offers a dynamic environment with strong customer and partner networks.

 

Why Gruve:

At Gruve, we foster a culture of innovation, collaboration, and continuous learning. We are committed to building a diverse and inclusive workplace where everyone can thrive and contribute their best work. If you’re passionate about technology and eager to make an impact, we’d love to hear from you.

Gruve is an equal opportunity employer. We welcome applicants from all backgrounds and thank all who apply; however, only those selected for an interview will be contacted.

 

Position summary:

We are seeking a Senior Software Development Engineer – Data Engineering with 5-8 years of experience to design, develop, and optimize data pipelines and analytics workflows using Snowflake, Databricks, and Apache Spark. The ideal candidate will have a strong background in big data processing, cloud data platforms, and performance optimization to enable scalable data-driven solutions. 
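As a flavour of the pipeline work described in this summary, below is a minimal PySpark sketch that ingests semi-structured JSON, deduplicates on a business key, and writes a date-partitioned table. Paths and column names are placeholders; on a Databricks/Lakehouse setup the output format would typically be Delta rather than plain Parquet.

```python
# Minimal PySpark batch sketch: ingest JSON, keep the latest record per key, write partitioned output.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-batch").getOrCreate()

events = spark.read.json("s3://example-bucket/raw/events/")   # placeholder landing path

# Deduplicate: keep only the most recent record per event_id.
latest = (
    events
    .withColumn("rn", F.row_number().over(
        Window.partitionBy("event_id").orderBy(F.col("ingested_at").desc())))
    .filter(F.col("rn") == 1)
    .drop("rn")
    .withColumn("event_date", F.to_date("event_time"))
)

# Date partitioning keeps incremental reprocessing and downstream queries cheap.
(latest.write
    .mode("overwrite")
    .partitionBy("event_date")
    .format("parquet")          # swap for "delta" on Databricks/Delta Lake
    .save("s3://example-bucket/curated/events/"))
```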

Key Roles & Responsibilities:

  • Design, develop, and optimize ETL/ELT pipelines using Apache Spark, PySpark, Databricks, and Snowflake.
  • Implement real-time and batch data processing workflows in cloud environments (AWS, Azure, GCP).
  • Develop high-performance, scalable data pipelines for structured, semi-structured, and unstructured data.
  • Work with Delta Lake and Lakehouse architectures to improve data reliability and efficiency.
  • Optimize Snowflake and Databricks performance, including query tuning, caching, partitioning, and cost optimization.
  • Implement data governance, security, and compliance best practices.
  • Build and maintain data models, transformations, and data marts for analytics and reporting.
  • Collaborate with data scientists, analysts, and business teams to define data engineering requirements.
  • Automate infrastructure and deployments using Terraform, Airflow, or dbt.
  • Monitor and troubleshoot data pipeline failures, performance issues, and bottlenecks.
  • Develop and enforce data quality and observability frameworks using Great Expectations, Monte Carlo, or similar tools.


Basic Qualifications:

  • Bachelor’s or Master’s Degree in Computer Science or Data Science.
  • 5–8 years of experience in data engineering, big data processing, and cloud-based data platforms.
  • Hands-on expertise in Apache Spark, PySpark, and distributed computing frameworks.
  • Strong experience with Snowflake (Warehouses, Streams, Tasks, Snowpipe, Query Optimization).
  • Experience in Databricks (Delta Lake, MLflow, SQL Analytics, Photon Engine).
  • Proficiency in SQL, Python, or Scala for data transformation and analytics.
  • Experience working with data lake architectures and storage formats (Parquet, Avro, ORC, Iceberg).
  • Hands-on experience with cloud data services (AWS Redshift, Azure Synapse, Google BigQuery).
  • Experience in workflow orchestration tools like Apache Airflow, Prefect, or Dagster.
  • Strong understanding of data governance, access control, and encryption strategies.
  • Experience with CI/CD for data pipelines using GitOps, Terraform, dbt, or similar technologies.


Preferred Qualifications:

  • Knowledge of streaming data processing (Apache Kafka, Flink, Kinesis, Pub/Sub).
  • Experience in BI and analytics tools (Tableau, Power BI, Looker).
  • Familiarity with data observability tools (Monte Carlo, Great Expectations).
  • Experience with machine learning feature engineering pipelines in Databricks.
  • Contributions to open-source data engineering projects.
Deqode
Posted by Alisha Das
Pune, Mumbai, Bengaluru (Bangalore), Chennai
4 - 7 yrs
₹5L - ₹27L / yr
Amazon Web Services (AWS)
Python
PySpark
AWS Glue
Amazon Redshift
+1 more

Job Overview:

We are seeking an experienced AWS Data Engineer to join our growing data team. The ideal candidate will have hands-on experience with AWS Glue, Redshift, PySpark, and other AWS services to build robust, scalable data pipelines. This role is perfect for someone passionate about data engineering, automation, and cloud-native development.
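One routine task in a role like this is staging curated S3 files into Redshift. The sketch below uses the redshift_connector package and a COPY statement; the cluster endpoint, credentials, IAM role ARN, and table names are placeholders, and psycopg2 or the Redshift Data API would work just as well.

```python
# Hedged sketch: bulk-load Parquet files from S3 into Redshift with COPY, then refresh statistics.
import redshift_connector

conn = redshift_connector.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",  # placeholder endpoint
    database="analytics",
    user="etl_user",
    password="***",
)
conn.autocommit = True

cur = conn.cursor()
cur.execute("""
    COPY analytics.daily_orders
    FROM 's3://example-bucket/curated/daily_orders/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
    FORMAT AS PARQUET;
""")
cur.execute("ANALYZE analytics.daily_orders;")  # keep the query planner's statistics current
cur.close()
conn.close()
```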

Key Responsibilities:

  • Design, build, and maintain scalable and efficient ETL pipelines using AWS Glue, PySpark, and related tools.
  • Integrate data from diverse sources and ensure its quality, consistency, and reliability.
  • Work with large datasets in structured and semi-structured formats across cloud-based data lakes and warehouses.
  • Optimize and maintain data infrastructure, including Amazon Redshift, for high performance.
  • Collaborate with data analysts, data scientists, and product teams to understand data requirements and deliver solutions.
  • Automate data validation, transformation, and loading processes to support real-time and batch data processing.
  • Monitor and troubleshoot data pipeline issues and ensure smooth operations in production environments.

Required Skills:

  • 5 to 7 years of hands-on experience in data engineering roles.
  • Strong proficiency in Python and PySpark for data transformation and scripting.
  • Deep understanding and practical experience with AWS Glue, AWS Redshift, S3, and other AWS data services.
  • Solid understanding of SQL and database optimization techniques.
  • Experience working with large-scale data pipelines and high-volume data environments.
  • Good knowledge of data modeling, warehousing, and performance tuning.

Preferred/Good to Have:

  • Experience with workflow orchestration tools like Airflow or Step Functions.
  • Familiarity with CI/CD for data pipelines.
  • Knowledge of data governance and security best practices on AWS.


Deqode
Posted by Shraddha Katare
Pune, Mumbai, Bengaluru (Bangalore), Gurugram
4 - 6 yrs
₹5L - ₹10L / yr
ETL
SQL
Amazon Web Services (AWS)
PySpark
KPI

Role - ETL Developer

Work Mode - Hybrid

Experience - 4+ years

Location - Pune, Gurgaon, Bengaluru, Mumbai

Required Skills - AWS, AWS Glue, Pyspark, ETL, SQL

Required Skills:

  • 4+ years of hands-on experience in MySQL, including SQL queries and procedure development
  • Experience in PySpark, AWS, and AWS Glue
  • Experience in AWS migration
  • Experience with automated scripting and tracking KPIs/metrics for database performance
  • Proficiency in shell scripting and ETL.
  • Strong communication skills and a collaborative team player
  • Knowledge of Python and AWS RDS is a plus


Wissen Technology
Posted by Shilpi Mishra
Bengaluru (Bangalore)
3 - 8 yrs
Best in industry
Java
Spring Boot
Microservices
SQL
Multithreading

Role : Java Developer

Location : Bangalore

Key responsibilities

  • 3 to 8 years of experience.
  • Experience in Core Java and Spring Boot.
  • Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
  • Should have the ability to analyze, design, develop and test complex, low-latency client facing applications.
  • Good development experience with RDBMS
  • Good knowledge of multi-threading and high-performance server-side development.
  • Basic working knowledge of Unix/Linux.
  • Excellent problem solving and coding skills.
  • Strong interpersonal, communication and analytical skills.
  • Should have the ability to express their design ideas and thoughts.

 

About Wissen Technology:

 

The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015. Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains. We help clients build world-class products.

 

Our workforce consists of highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League universities like Wharton, MIT, IITs, IIMs, and NITs and who bring rich work experience from some of the biggest companies in the world. Wissen Technology has grown its revenues by 400% in these five years without any external funding or investments. We are globally present with offices in the US, India, UK, Australia, Mexico, and Canada.

 

We offer an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.

 

We have been certified as a Great Place to Work® company for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider.

 

Over the years, Wissen Group has successfully delivered $650 million worth of projects for more than 20 of the Fortune 500 companies. Wissen Technology provides exceptional value in mission critical projects for its clients, through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’.

 

The technology and thought leadership that the company commands in the industry is the direct result of the kind of people Wissen has been able to attract. Wissen is committed to providing them the best possible opportunities and careers, which extends to providing the best possible experience and value to our clients. We have served clients across sectors like Banking, Ecommerce, Telecom, Healthcare, Manufacturing, and Energy.

 

Career Progression:

 

At Wissen Technology, your career growth is important for us. Therefore, we put in several efforts towards each employee’s career progression – to enable and empower them to grow within the company as well as to instill a sense of responsibility, loyalty, and trust.

There have been instances where a software engineer has grown from being an individual contributor to technical lead and now on the path to becoming a director responsible for growing revenues and profitability. We deeply value Ownership: taking responsibility, making it happen, not letting the ball drop, and being accountable.

Wissen Technology
Posted by Khushboo Kumari
Bengaluru (Bangalore), Mumbai
4 - 14 yrs
₹6L - ₹29L / yr
Java
Data Structures
Algorithms
Multithreading
SQL
+1 more

Job Title: Java Developer


Wissen Technology is hiring a Java Developer in Bangalore with hands-on experience in Core Java, algorithms, data structures, multithreading, and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission and be part of a global software development team. This is a brilliant opportunity to join a highly motivated and expert team that has made its mark in high-end technical consulting.

Required Skills:

  • 4 to 14 years of experience.
  • Experience in Core Java and Spring Boot.
  • Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
  • Ability to analyze, design, develop, and test complex, low-latency client-facing applications.
  • Good development experience with RDBMS.
  • Good knowledge of multi-threading and high-performance server-side development.
  • Basic working knowledge of Unix/Linux.
  • Excellent problem-solving and coding skills.
  • Strong interpersonal, communication, and analytical skills.
  • Ability to express design ideas and thoughts.

Wissen Technology
Posted by Khushboo Kumari
Bengaluru (Bangalore), Mumbai
3 - 12 yrs
₹6L - ₹29L / yr
Java
Data Structures
Algorithms
Multithreading
SQL

Hello Everyone,


Job Description: Wissen Technology is hiring a Java Developer in Bangalore with hands-on experience in Core Java, algorithms, data structures, multithreading, and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission and be part of a global software development team. This is a brilliant opportunity to join a highly motivated and expert team that has made its mark in high-end technical consulting.

Required Skills:

  • 5 to 14 years of experience.
  • Experience in Core Java and Spring Boot.
  • Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
  • Ability to analyze, design, develop, and test complex, low-latency client-facing applications.
  • Good development experience with RDBMS.
  • Good knowledge of multi-threading and high-performance server-side development.
  • Basic working knowledge of Unix/Linux.
  • Excellent problem-solving and coding skills.
  • Strong interpersonal, communication, and analytical skills.
  • Ability to express design ideas and thoughts.


About Wissen Technology:

Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom, and Startups. Wissen Technology is a part of Wissen Group and was established in the year 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 Billion for more than 25 of the Fortune 500 companies. The Wissen Group overall includes more than 4000 highly skilled professionals.

Wissen Technology provides exceptional value in mission-critical projects for its clients, through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’. Our team consists of 1200+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League universities like Wharton, MIT, IITs, IIMs, and NITs and who bring rich work experience from some of the biggest companies in the world.

Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, and Quality Assurance & Test Automation. We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider.



Wissen Technology
Posted by Chetna Jain
Bengaluru (Bangalore)
4 - 8 yrs
Best in industry
Automation Anywhere
Python
SQL
AA
A360
+1 more

Job Description: 


Wissen Technology is looking for a skilled Automation Anywhere Engineer to join our dynamic team in Bangalore. The ideal candidate will have hands-on experience in Automation Anywhere, Document Automation, SQL, and Python, with a strong background in designing and implementing automation solutions.


Key Responsibilities:   

- Design, develop, and deploy automation solutions using Automation Anywhere.   

- Work on  Document Automation  to extract, process, and validate structured/unstructured data.   

- Develop scripts and automation solutions using  Python for enhanced process efficiency.   

- Optimize data processing workflows and database queries using  SQL.   

- Collaborate with cross-functional teams to identify automation opportunities and enhance business processes.   

- Perform unit testing, debugging, and troubleshooting of automation scripts.   

- Ensure adherence to industry best practices and compliance standards in automation processes.   

- Provide support, maintenance, and enhancements to existing automation solutions.   

 

Required Skills & Qualifications:   

- 4 to 8 years  of experience in RPA development using Automation Anywhere.   

- Strong expertise in Automation Anywhere A360 (preferable).

- Hands-on experience with Document Automation tools and technologies.   

- Proficiency in Python  for scripting and automation.   

- Strong knowledge of SQL for data processing and querying.   

- Experience in troubleshooting, debugging, and optimizing automation workflows.   

- Ability to work in an Agile environment and collaborate with cross-functional teams.   

- Excellent problem-solving skills and attention to detail.   

Zenius IT Services Pvt Ltd
Posted by Sunita Pradhan
Hyderabad, Bengaluru (Bangalore), Chennai
1 - 4 yrs
₹7L - ₹12L / yr
HTML/CSS
JavaScript
React.js
Angular (2+)
SQL
+4 more

Role Description

This is a full-time on-site role for a Full Stack Developer at Zenius IT Services in Hyderabad. The Full Stack Developer will be responsible for back-end and front-end web development, software development, and utilizing CSS to enhance user interfaces.


Experience Required: 0-3 Years


Required Technical and Professional Expertise

  • Bachelor’s degree in computer science, Engineering, or a related field
  • Strong experience with front-end technologies such as HTML, CSS, JavaScript, and popular frameworks like React, Angular, or Vue.js
  • Proven experience in database modeling and design using SQL
  • Experience with back-end technologies such as Node.js, Python, .NET.
  • Experience working with RESTful APIs and building scalable web applications
  • Strong understanding of software development principles and best practices
  • Ability to work independently and as part of a team.
  • Passion for learning new technologies and solving challenging problems.
  • Strong Problem-Solving skills with curiosity to learn, step out of comfort zones and explore new areas.
  • Strong execution discipline.
  • Ability to deliver work with complete ownership.
  • Strong communication and collaboration skills.
Wissen Technology
Posted by Chetna Jain
Bengaluru (Bangalore)
5 - 8 yrs
Best in industry
Java
Microservices
React.js
Multithreading
+4 more

Responsibilities: 

  • Develop and maintain high-quality, efficient, and scalable backend applications. 
  • Participate in all phases of the software development lifecycle (SDLC) 
  • Write clean, well-documented, and testable code adhering to best practices. 
  • Collaborate with team members to ensure the successful delivery of projects. 
  • Debug and troubleshoot complex technical problems. 
  • Identify and implement performance optimizations. 
  • Participate in code reviews 
  • Hands-on experience with Springboot, Java 8 and above. 
  • 5-8 years of experience developing Java applications. 
  • Knowledge about at least one messaging system like Kafka, RabbitMQ etc. 
  • Required React developer requirements, qualifications & skills: 
  • Proficiency in React.js and its core principles 
  • Strong JavaScript, HTML5, and CSS3 skills 
  • Experience with popular React.js workflows (such as Redux) 
  • Strong understanding of object-oriented programming (OOP) principles. 
  • Experience with design patterns and best practices for Java development. 
  • Proficient in unit testing frameworks (e.g., JUnit). 
  • Experience with build automation tools (e.g., Maven, Gradle). 
  • Experience with version control systems (e.g., Git). 
  • Experience with one of these databases: Postgres, MongoDB, Cassandra
  • Knowledge on Retail or OMS is a plus. 
  • Experienced in containerized deployments using Docker, Kubernetes and DevOps mindset 
  • Ability to reverse engineer existing/legacy and document findings on confluence. 
  • Create automated tests for unit, integration, regression, performance, and functional testing, to meet established expectations and acceptance criteria. 
  • Document APIs using Lowe’s established tooling. 




The Alter Office
Posted by Harsha Ravindran
Bengaluru (Bangalore)
3 - 6 yrs
₹12L - ₹18L / yr
NodeJS (Node.js)
MySQL
NoSQL Databases
MongoDB
Google Cloud Platform (GCP)
+14 more

Role: Senior Software Engineer - Backend

Location: In-Office, Bangalore, Karnataka, India

 

Job Summary:

We are seeking a highly skilled and experienced Senior Backend Engineer with a minimum of 3 years of experience in product building to join our dynamic and innovative team. In this role, you will be responsible for designing, developing, and maintaining robust backend systems that power our applications. You will work closely with cross-functional teams to ensure seamless integration between frontend and backend components, leveraging your expertise to architect scalable, secure, and high-performance solutions. As a senior team member, you will mentor junior developers and lead technical initiatives to drive innovation and excellence.

 

Annual Compensation: 12-18 LPA


Responsibilities:

  • Lead the design, development, and maintenance of scalable and efficient backend systems and APIs.
  • Architect and implement complex backend solutions, ensuring high availability and performance.
  • Collaborate with product managers, frontend developers, and other stakeholders to deliver comprehensive end-to-end solutions.
  • Design and optimize data storage solutions using relational databases and NoSQL databases.
  • Mentor and guide junior developers, fostering a culture of knowledge sharing and continuous improvement.
  • Implement and enforce best practices for code quality, security, and performance optimization.
  • Develop and maintain CI/CD pipelines to automate build, test, and deployment processes.
  • Ensure comprehensive test coverage, including unit testing, and implement various testing methodologies and tools to validate application functionality.
  • Utilize cloud services (e.g., AWS, Azure, GCP) for infrastructure deployment, management, and optimization.
  • Conduct system design reviews and provide technical leadership in architectural discussions.
  • Stay updated with industry trends and emerging technologies to drive innovation within the team.
  • Implement secure authentication and authorization mechanisms and ensure data encryption for sensitive information.
  • Design and develop event-driven applications utilizing serverless computing principles to enhance scalability and efficiency.

Requirements:

  • Minimum of 3 years of proven experience as a Backend Engineer, with a strong portfolio of product-building projects.
  • Strong proficiency in backend development using Java, Python, and JavaScript, with experience in building scalable and high-performance applications.
  • Experience with popular backend frameworks and libraries for Java (e.g., Spring Boot) and Python (e.g., Django, Flask).
  • Strong expertise in SQL and NoSQL databases (e.g., MySQL, MongoDB) with a focus on data modeling and scalability.
  • Practical experience with caching mechanisms (e.g., Redis) to enhance application performance.
  • Proficient in RESTful API design and development, with a strong understanding of API security best practices.
  • In-depth knowledge of asynchronous programming and event-driven architecture.
  • Familiarity with the entire web stack, including protocols, web server optimization techniques, and performance tuning.
  • Experience with containerization and orchestration technologies (e.g., Docker, Kubernetes) is highly desirable.
  • Proven experience working with cloud technologies (AWS/GCP/Azure) and understanding of cloud architecture principles.
  • Strong understanding of fundamental design principles behind scalable applications and microservices architecture.
  • Excellent problem-solving, analytical, and communication skills.
  • Ability to work collaboratively in a fast-paced, agile environment and lead projects to successful completion.
The Alter Office
Posted by Harsha Ravindran
Bengaluru (Bangalore)
1 - 4 yrs
₹6L - ₹10L / yr
NodeJS (Node.js)
MySQL
SQL
MongoDB
Express
+9 more

Job Title: Backend Developer

Location: In-Office, Bangalore, Karnataka, India


Job Summary:

We are seeking a highly skilled and experienced Backend Developer with a minimum of 1 year of experience in product building to join our dynamic and innovative team. In this role, you will be responsible for designing, developing, and maintaining robust backend systems that drive our applications. You will collaborate with cross-functional teams to ensure seamless integration between frontend and backend components, and your expertise will be critical in architecting scalable, secure, and high-performance backend solutions.


Annual Compensation: 6-10 LPA


Responsibilities:

  • Design, develop, and maintain scalable and efficient backend systems and APIs using NodeJS.
  • Architect and implement complex backend solutions, ensuring high availability and performance.
  • Collaborate with product managers, frontend developers, and other stakeholders to deliver comprehensive end-to-end solutions.
  • Design and optimize data storage solutions using relational databases (e.g., MySQL) and NoSQL databases (e.g., MongoDB, Redis).
  • Promoting a culture of collaboration, knowledge sharing, and continuous improvement.
  • Implement and enforce best practices for code quality, security, and performance optimization.
  • Develop and maintain CI/CD pipelines to automate build, test, and deployment processes.
  • Ensure comprehensive test coverage, including unit testing, and implement various testing methodologies and tools to validate application functionality.
  • Utilize cloud services (e.g., AWS, Azure, GCP) for infrastructure deployment, management, and optimization.
  • Conduct system design reviews and contribute to architectural discussions.
  • Stay updated with industry trends and emerging technologies to drive innovation within the team.
  • Implement secure authentication and authorization mechanisms and ensure data encryption for sensitive information.
  • Design and develop event-driven applications utilizing serverless computing principles to enhance scalability and efficiency.


Requirements:

  • Minimum of 1 year of proven experience as a Backend Developer, with a strong portfolio of product-building projects.
  • Extensive experience with JavaScript backend frameworks (e.g., Express, Socket) and a deep understanding of their ecosystems.
  • Strong expertise in SQL and NoSQL databases (MySQL and MongoDB) with a focus on data modeling and scalability.
  • Practical experience with Redis and caching mechanisms to enhance application performance.
  • Proficient in RESTful API design and development, with a strong understanding of API security best practices.
  • In-depth knowledge of asynchronous programming and event-driven architecture.
  • Familiarity with the entire web stack, including protocols, web server optimization techniques, and performance tuning.
  • Experience with containerization and orchestration technologies (e.g., Docker, Kubernetes) is highly desirable.
  • Proven experience working with cloud technologies (AWS/GCP/Azure) and understanding of cloud architecture principles.
  • Strong understanding of fundamental design principles behind scalable applications and microservices architecture.
  • Excellent problem-solving, analytical, and communication skills.
  • Ability to work collaboratively in a fast-paced, agile environment and lead projects to successful completion.
TechMynd Consulting
Posted by Suraj N
Bengaluru (Bangalore), Gurugram, Mumbai
4 - 8 yrs
₹10L - ₹24L / yr
Data Science
PostgreSQL
Python
Apache
Amazon Web Services (AWS)
+5 more

Senior Data Engineer


Location: Bangalore, Gurugram (Hybrid)


Experience: 4-8 Years


Type: Full Time | Permanent


Job Summary:


We are looking for a results-driven Senior Data Engineer to join our engineering team. The ideal candidate will have hands-on expertise in data pipeline development, cloud infrastructure, and BI support, with a strong command of modern data stacks. You’ll be responsible for building scalable ETL/ELT workflows, managing data lakes and marts, and enabling seamless data delivery to analytics and business intelligence teams.


This role requires deep technical know-how in PostgreSQL, Python scripting, Apache Airflow, AWS or other cloud environments, and a working knowledge of modern data and BI tools.


Key Responsibilities:


PostgreSQL & Data Modeling
  • Design and optimize complex SQL queries, stored procedures, and indexes
  • Perform performance tuning and query plan analysis
  • Contribute to schema design and data normalization

Data Migration & Transformation
  • Migrate data from multiple sources to cloud or ODS platforms
  • Design schema mapping and implement transformation logic
  • Ensure consistency, integrity, and accuracy in migrated data

Python Scripting for Data Engineering
  • Build automation scripts for data ingestion, cleansing, and transformation (see the sketch after this list)
  • Handle file formats (JSON, CSV, XML), REST APIs, and cloud SDKs (e.g., Boto3)
  • Maintain reusable script modules for operational pipelines
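A small illustration of the reusable ingestion scripting mentioned above, assuming source files land in S3 as JSON or CSV (bucket names and key layout are placeholders):

```python
# Reusable ingestion helper sketch: fetch an S3 object and parse it into a list of records.
import csv
import io
import json
import boto3

s3 = boto3.client("s3")

def read_records(bucket: str, key: str) -> list[dict]:
    """Download an object and parse it based on its file extension."""
    raw = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    if key.endswith(".json"):
        data = json.loads(raw)
        return data if isinstance(data, list) else [data]
    if key.endswith(".csv"):
        return list(csv.DictReader(io.StringIO(raw)))
    raise ValueError(f"Unsupported format for {key}")

# Example usage inside a pipeline task:
# rows = read_records("example-raw-bucket", "vendor/2024-01-01/orders.csv")
```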


Data Orchestration with Apache Airflow
  • Develop and manage DAGs for batch/stream workflows (a minimal DAG sketch follows this list)
  • Implement retries, task dependencies, notifications, and failure handling
  • Integrate Airflow with cloud services, data lakes, and data warehouses
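For reference, a minimal Airflow 2.x DAG sketch showing retries, a failure callback, and explicit task dependencies; the task callables and the schedule are placeholders for real pipeline steps.

```python
# Minimal Airflow DAG sketch with retries, task dependencies, and a failure hook.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    print("pull data from the source system")

def transform(**_):
    print("clean and reshape the extracted data")

def load(**_):
    print("write the result to the warehouse")

def notify_failure(context):
    # Placeholder alerting hook (Slack/email would go here).
    print(f"task failed: {context['task_instance'].task_id}")

default_args = {
    "owner": "data-eng",
    "retries": 2,                          # automatic retries on transient failures
    "retry_delay": timedelta(minutes=5),
    "on_failure_callback": notify_failure,
}

with DAG(
    dag_id="daily_batch_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load   # run order: extract, then transform, then load
```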


Cloud Platforms (AWS / Azure / GCP)
  • Manage data storage (S3, GCS, Blob), compute services, and data pipelines
  • Set up permissions, IAM roles, encryption, and logging for security
  • Monitor and optimize cost and performance of cloud-based data operations

Data Marts & Analytics Layer
  • Design and manage data marts using dimensional models
  • Build star/snowflake schemas to support BI and self-serve analytics
  • Enable incremental load strategies and partitioning

Modern Data Stack Integration
  • Work with tools like DBT, Fivetran, Redshift, Snowflake, BigQuery, or Kafka
  • Support modular pipeline design and metadata-driven frameworks
  • Ensure high availability and scalability of the stack

BI & Reporting Tools (Power BI / Superset / Supertech)
  • Collaborate with BI teams to design datasets and optimize queries
  • Support development of dashboards and reporting layers
  • Manage access, data refreshes, and performance for BI tools


Required Skills & Qualifications:

  • 4–6 years of hands-on experience in data engineering roles
  • Strong SQL skills in PostgreSQL (tuning, complex joins, procedures)
  • Advanced Python scripting skills for automation and ETL
  • Proven experience with Apache Airflow (custom DAGs, error handling)
  • Solid understanding of cloud architecture (especially AWS)
  • Experience with data marts and dimensional data modeling
  • Exposure to modern data stack tools (DBT, Kafka, Snowflake, etc.)
  • Familiarity with BI tools like Power BI, Apache Superset, or Supertech BI
  • Version control (Git) and CI/CD pipeline knowledge is a plus
  • Excellent problem-solving and communication skills

NeoGenCode Technologies Pvt Ltd
Bengaluru (Bangalore)
8 - 15 yrs
₹5L - ₹20L / yr
Java
Spring Boot
Microservices
Kubernetes
Multithreading
+6 more

🔥 High Priority – Senior Lead Java Developer (10+ Years) | Bangalore – Onsite


Summary :

We are hiring Senior Lead Java Developers with 10+ years of experience for an onsite role in Bangalore.

If you're a hands-on expert with a strong background in Java, Spring Boot, Microservices, and Kubernetes, this is your opportunity to lead, mentor, and deliver high-quality solutions in a fast-paced environment.


🔹 Position : Senior Lead Java Developer

🔹 Experience : 10+ Years (12+ preferred)

🔹 Location : Bangalore (Onsite)

🔹 Openings : 6+

Must-Have Skills :

  • 8+ years of hands-on experience with Core Java & Spring Boot
  • Expertise in Multithreading, Dependency Injection, and AOP
  • Strong in Microservices Architecture and RESTful services
  • Good exposure to SQL & NoSQL databases
  • Proficient with Git (GitLab preferred)
  • Experience with Kubernetes deployments and APM tools (New Relic preferred)
  • Solid understanding of distributed tracing and log analysis
  • Proven debugging and performance optimization skills

💼 Responsibilities :

  • Design and develop high-quality, scalable microservices
  • Act as SME for multiple services or subsystems
  • Own service performance, SLAs, and incident resolutions
  • Mentor junior developers and conduct technical interviews
  • Participate in production war rooms and troubleshooting
  • Lead development efforts and drive code quality

🎓 Qualification :

  • BE/B.Tech or equivalent degree
Wissen Technology
Posted by Vijayalakshmi Selvaraj
Bengaluru (Bangalore)
8 - 12 yrs
₹12L - ₹25L / yr
ETL
SQL
Snowflake schema
  • 8-10 years of experience in ETL Testing, Snowflake, DWH Concepts.
  • Strong SQL knowledge & debugging skills are a must.
  • Experience in Azure and Snowflake testing is a plus
  • Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus
  • Strong data warehousing concepts; experience with ETL tools like Talend Cloud Data Integration and Pentaho/Kettle
  • Experience in JIRA, Xray defect management tool is good to have.
  • Exposure to the financial domain knowledge is considered a plus.
  • Testing data readiness (data quality) and addressing code or data issues
  • Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions
  • Demonstrate strong collaborative experience across regions (APAC, EMEA and NA) to effectively and efficiently identify root cause of code/data issues and come up with a permanent solution
  • Prior experience with State Street and Charles River Development (CRD) considered a plus
  • Experience in tools such as PowerPoint, Excel, SQL
  • Exposure to Third party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus


Leading HealthTech, a U.S.-based product company
Agency job via Recruiting Bond by Pavan Kumar
Bengaluru (Bangalore), Mumbai
9 - 13 yrs
₹35L - ₹45L / yr
Java
J2EE
WebLogic
Spring
Apache Camel
+18 more

🚀 We're Hiring: Technical Lead – Java Backend & Integration

📍 Bangalore | Hybrid | Full-Time

👨‍💻 9+ Years Experience | Enterprise Product Development

🏥 Healthcare Tech | U.S. Health Insurance Domain

Join Leading HealthTech, a U.S.-based product company driving innovation in the $1.1 trillion health insurance industry. We power over 81 million lives, with 130+ customers and 100+ third-party integrations. At our growing Bangalore tech hub, you’ll solve real-world, large-scale problems and help modernize one of the most stable and impactful industries in the world.


🔧 What You'll Work On:

  • Architect and build backend & integration solutions using Java, J2EE, WebLogic, Spring, Apache Camel
  • Transition monolith systems to microservices-based architecture
  • Lead design reviews, customer discussions, code quality, UAT & production readiness
  • Work with high-volume transactional systems processing millions of health claims daily
  • Coach & mentor engineers, contribute to platform modernization


🧠 What You Bring:

  • 9+ years in backend Java development and enterprise system integration
  • Hands-on with REST, SOAP, JMS, SQL, stored procedures, XML, ESBs
  • Solid understanding of SOA, data structures, system design, and performance tuning
  • Experience with Agile, CI/CD, unit testing, and code quality tools
  • Healthcare/payor domain experience is a huge plus!


💡 Why this opportunity?

  • Global product impact from our India technology center
  • Work on mission-critical systems in a stable and recession-resilient sector
  • Be part of a journey to modernize healthcare through tech
  • Solve complex challenges at scale that few companies offer

🎯 Ready to drive change at the intersection of tech and healthcare?

Trellissoft Inc.
Posted by Nikita Sinha
Bengaluru (Bangalore)
5yrs+
Upto ₹18L / yr (Varies)
Python
Flask
Microservices
SQL
NoSQL Databases

Job Responsibilities:

  • Design, develop, test, and maintain high-performance web applications and backend services using Python.
  • Build scalable, secure, and reliable backend systems and APIs.
  • Optimize and debug existing codebases to enhance performance and maintainability.
  • Collaborate closely with cross-functional teams to gather requirements and deliver high-quality solutions.
  • Mentor junior developers, conduct code reviews, and uphold best coding practices.
  • Write clear, comprehensive technical documentation for internal and external use.
  • Stay current with emerging technologies, tools, and industry trends to continually improve development processes.

Qualifications:

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 5+ years of hands-on experience in Python development.
  • Strong expertise in Flask (see the example sketch after the Technical Skills list below).
  • In-depth understanding of software design principles, architecture, and design patterns.
  • Proven experience working with both SQL and NoSQL databases.
  • Solid debugging and problem-solving capabilities.
  • Effective communication and collaboration skills, with a team-first mindset.

Technical Skills:

  • Programming: Python (Advanced)
  • Web Frameworks: Flask
  • Databases: PostgreSQL, MySQL, MongoDB, Redis
  • Version Control: Git
  • API Development: RESTful APIs
  • Containerization & Orchestration: Docker, Kubernetes
  • Cloud Platforms: AWS or Azure (hands-on experience preferred)
  • DevOps: CI/CD pipelines (e.g., Jenkins, GitHub Actions)
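For context on the Flask expectation above, here is a minimal sketch of a JSON REST endpoint with basic request validation; the in-memory dictionary stands in for a real PostgreSQL/MongoDB layer.

```python
# Minimal Flask sketch: a small REST resource with validation and JSON responses.
from flask import Flask, jsonify, request

app = Flask(__name__)
_users: dict[int, dict] = {}   # placeholder storage; a real service would use a database

@app.route("/users", methods=["POST"])
def create_user():
    payload = request.get_json(silent=True) or {}
    if "name" not in payload:
        return jsonify({"error": "name is required"}), 400
    user_id = len(_users) + 1
    _users[user_id] = {"id": user_id, "name": payload["name"]}
    return jsonify(_users[user_id]), 201

@app.route("/users/<int:user_id>", methods=["GET"])
def get_user(user_id: int):
    user = _users.get(user_id)
    if user is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(user)

if __name__ == "__main__":
    app.run(debug=True)
```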


builds holistic technology solutions for the entertainment and leisure industry. We help you automate key processes and manage them centrally. Our hardware and software solutions are designed to create delightful experiences for your customers, while also making your business more robust and your staff more productive.
Agency job via HyrHub by Shwetha Naik
Bengaluru (Bangalore), Mangalore
4 - 6 yrs
₹8L - ₹10L / yr
Windows Presentation Foundation (WPF)
User Experience (UX) Design
C#
.NET
SQL
+3 more

1. 4 - 7 years working as a professional WPF UI developer.

2. Proficient knowledge of WPF and .NET C#.

3. Proficient understanding of UX design principles and creating responsive layouts.

4. Good understanding of SQL and REST APIs

5. Excellent analytical and multitasking skills.

Wissen Technology
Posted by Vijayalakshmi Selvaraj
Bengaluru (Bangalore)
1 - 3 yrs
₹5L - ₹17L / yr
Python
SQL
ETL
Google Cloud Platform (GCP)
Amazon Web Services (AWS)

Job Summary:

We are looking for a motivated and detail-oriented Data Engineer with 1–2 years of experience to join our data engineering team. The ideal candidate should have solid foundational skills in SQL and Python, along with exposure to building or maintaining data pipelines. You’ll play a key role in helping to ingest, process, and transform data to support various business and analytical needs.
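As a flavour of the day-to-day work described above, here is a small pandas sketch that ingests a raw CSV extract, cleans it, and writes an aggregated Parquet output; file paths and column names are placeholders.

```python
# Small ingest-and-transform sketch with pandas (assumes pyarrow is available for Parquet output).
import pandas as pd

# Ingest a raw extract.
raw = pd.read_csv("data/raw/orders.csv", parse_dates=["created_at"])

# Basic cleaning plus a derived date column.
clean = (
    raw.dropna(subset=["order_id", "amount"])
       .assign(order_date=lambda df: df["created_at"].dt.date)
)

# Aggregate to the grain an analyst would query.
daily = (
    clean.groupby("order_date", as_index=False)
         .agg(total_amount=("amount", "sum"), order_count=("order_id", "count"))
)

daily.to_parquet("data/curated/daily_orders.parquet", index=False)
```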

Key Responsibilities:

  • Assist in the design, development, and maintenance of scalable and efficient data pipelines.
  • Write clean, maintainable, and performance-optimized SQL queries.
  • Develop data transformation scripts and automation using Python.
  • Support data ingestion processes from various internal and external sources.
  • Monitor data pipeline performance and help troubleshoot issues.
  • Collaborate with data analysts, data scientists, and other engineers to ensure data quality and consistency.
  • Work with cloud-based data solutions and tools (e.g., AWS, Azure, GCP – as applicable).
  • Document technical processes and pipeline architecture.

Core Skills Required:

  • Proficiency in SQL (data querying, joins, aggregations, performance tuning).
  • Experience with Python, especially in the context of data manipulation (e.g., pandas, NumPy).
  • Exposure to ETL/ELT pipelines and data workflow orchestration tools (e.g., Airflow, Prefect, Luigi – preferred).
  • Understanding of relational databases and data warehouse concepts.
  • Familiarity with version control systems like Git.

Preferred Qualifications:

  • Experience with cloud data services (AWS S3, Redshift, Azure Data Lake, etc.)
  • Familiarity with data modeling and data integration concepts.
  • Basic knowledge of CI/CD practices for data pipelines.
  • Bachelor’s degree in Computer Science, Engineering, or related field.


ZeMoSo Technologies
Agency job via TIGI HR Solution Pvt. Ltd. by Vaidehi Sarkar
Mumbai, Bengaluru (Bangalore), Hyderabad, Chennai, Pune
4 - 8 yrs
₹10L - ₹15L / yr
Data engineering
Python
SQL
Data Warehouse (DWH)
Amazon Web Services (AWS)
+3 more

Work Mode: Hybrid


Need B.Tech, BE, M.Tech, ME candidates - Mandatory



Must-Have Skills:

● Educational Qualification: B.Tech, BE, M.Tech, or ME in any field.

● Minimum of 3 years of proven experience as a Data Engineer.

● Strong proficiency in Python programming language and SQL.

● Experience in DataBricks and setting up and managing data pipelines, data warehouses/lakes.

● Good comprehension and critical thinking skills.


● Kindly note: the salary bracket will vary according to the candidate's experience:

- Experience from 4 yrs to 6 yrs - Salary upto 22 LPA

- Experience from 5 yrs to 8 yrs - Salary upto 30 LPA

- Experience more than 8 yrs - Salary upto 40 LPA

KNS Technologies
Bengaluru (Bangalore)
2 - 3 yrs
₹3.5L - ₹5L / yr
C#
ASP.NET
ADO.NET
ASP.NET MVC
SQL
+6 more

Job Summary: 

We are looking for a talented Full Stack Developer with experience in C# (ASP.NET Core Web API), React, and SQL Server. The successful candidate will be responsible for designing, developing, and maintaining robust web applications and APIs, ensuring seamless integration between the front-end and back-end systems. 

Key Responsibilities: 

Full Stack Development: Design, develop, and maintain web applications using C#, ASP.NET Core Web API, and React. 

API Development: Create and maintain RESTful APIs to support front-end applications and integrations. 

Database Management: Design, optimize, and manage SQL Server databases, including writing complex queries, stored procedures, and indexing. 

Front-End Development: Implement user interfaces using React, ensuring a smooth and responsive user experience. 

Code Quality: Write clean, scalable, and well-documented code following best practices in software development. 

Collaboration: Work closely with cross-functional teams, including UI/UX designers, back-end developers, and DevOps, to deliver high-quality software solutions. 

Testing & Debugging: Conduct unit testing, integration testing, and debugging to ensure the quality and reliability of applications. 

Continuous Improvement: Stay updated on the latest industry trends and technologies and integrate them into development processes where applicable. 

Required Qualifications: 

Experience: Proven experience as a Full Stack Developer with a strong focus on C#, ASP.NET Core Web API, React, and SQL Server. 

Technical Skills: 

Proficient in C# and ASP.NET Core Web API development. 

Strong experience with React and related front-end technologies (JavaScript, HTML, CSS). 

Expertise in SQL Server, including database design, query optimization, and performance tuning. 

Familiarity with version control systems like Git. 

Understanding of RESTful architecture and Web API design. 

Problem-Solving: Excellent analytical and problem-solving skills with the ability to troubleshoot complex issues. 

Communication: Strong verbal and written communication skills, with the ability to articulate technical concepts to non-technical stakeholders. 

Team Collaboration: Ability to work effectively in a team environment, collaborating with cross-functional teams to achieve project goals. 

Preferred Qualifications: 

Experience with ASP.NET Core MVC or Blazor. 

Knowledge of cloud platforms such as Azure or AWS. 

Experience with Agile/Scrum development methodologies. 

Education: 

Bachelor’s degree in Computer Science, Software Engineering, or a related field (or equivalent experience)

Peenak Business solutions
Posted by Gaurav Kaushik
Bengaluru (Bangalore)
4 - 6 yrs
₹25L - ₹32L / yr
Python
NodeJS (Node.js)
Go Programming (Golang)
SQL
NoSQL Databases

Exp: 4-6 years

Position: Backend Engineer

Job Location: Bangalore (office near Cubbon Park, opposite JW Marriott)

Work Mode: 5 days work from office


Requirements:

● Engineering graduate with 3-5 years of experience in software product development.

● Proficient in Python, Node.js, Go

● Good knowledge of SQL and NoSQL

● Strong Experience in designing and building APIs

● Experience with working on scalable interactive web applications

● A clear understanding of software design constructs and their implementation

● Understanding of the threading limitations of Python and multi-process architecture (see the sketch at the end of this listing)

● Experience implementing Unit and Integration testing

● Exposure to the Finance domain is preferred

● Strong written and oral communication skills
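On the threading-limitations requirement above: CPU-bound Python code does not speed up with threads because of the GIL, so such work is usually fanned out across processes instead. A minimal sketch:

```python
# CPU-bound work parallelised with processes rather than threads (each process has its own GIL).
from concurrent.futures import ProcessPoolExecutor
import math

def cpu_heavy(n: int) -> int:
    """Stand-in for a CPU-bound task such as scoring or parsing."""
    return sum(math.isqrt(i) for i in range(n))

if __name__ == "__main__":
    inputs = [2_000_000] * 8
    with ProcessPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(cpu_heavy, inputs))
    print(results)
```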

Trellissoft Inc.
Posted by Nikita Sinha
Bengaluru (Bangalore)
6 - 9 yrs
Upto ₹25L / yr (Varies)
Data Warehouse (DWH)
ETL
ELT
SQL
Amazon Web Services (AWS)
+4 more

We’re looking for an experienced Senior Data Engineer to lead the design and development of scalable data solutions at our company. The ideal candidate will have extensive hands-on experience in data warehousing, ETL/ELT architecture, and cloud platforms like AWS, Azure, or GCP. You will work closely with both technical and business teams, mentoring engineers while driving data quality, security, and performance optimization.


Responsibilities:

  • Lead the design of data warehouses, lakes, and ETL workflows.
  • Collaborate with teams to gather requirements and build scalable solutions.
  • Ensure data governance, security, and optimal performance of systems.
  • Mentor junior engineers and drive end-to-end project delivery.

Requirements:

  • 6+ years of experience in data engineering, including at least 2 full-cycle data warehouse projects.
  • Strong skills in SQL, ETL tools (e.g., Pentaho, dbt), and cloud platforms.
  • Expertise in big data tools (e.g., Apache Spark, Kafka).
  • Excellent communication skills and leadership abilities.

Preferred: Experience with workflow orchestration tools (e.g., Airflow), real-time data, and DataOps practices.

Tech Prescient
Posted by Ashwini Damle
Remote, Bengaluru (Bangalore)
8 - 10 yrs
₹20L - ₹35L / yr
Java
NodeJS (Node.js)
NoSQL Databases
SQL
Amazon Web Services (AWS)
+1 more

Job Title- Senior Full Stack Web Developer

Job location- Bangalore/Hybrid

Availability- Immediate Joiners

Experience Range- 5-8yrs

Desired skills - Java, AWS, SQL/NoSQL, JavaScript, Node.js (good to have)


We are looking for a Senior Full Stack Web Developer (Java) with 8-10 years of experience.



  1. Working on different aspects of the core product and associated tools (server-side or user interfaces, depending on the team you'll join)
  2. Expertise as a full stack software engineer on large-scale, complex software systems, with 8+ years of experience in technologies such as Java, relational and non-relational databases, Node.js, and AWS Cloud
  3. Assisting with in-life maintenance, testing, debugging and documentation of deployed services
  4. Coding & designing new features
  5. Creating the supporting functional and technical specifications
  6. Deep understanding of system architecture and distributed systems
  7. Stay updated with the latest services, tools, and trends, and implement innovative solutions that contribute to the company's growth


Tech Prescient
Posted by Ashwini Damle
Remote, Bengaluru (Bangalore)
8 - 10 yrs
₹15L - ₹35L / yr
Java
Amazon Web Services (AWS)
Kubernetes
SQL
Spring Boot

Job Title- Senior Java Developer

Exp Range- 8-10 yrs

Location- Bangalore/ Hybrid

Desired skill- Java 8, Microservices (Must), AWS, Kafka, Kubernetes


What you will bring:


● Strong core Java, concurrency and server-side experience

● 8+ years of experience with hands-on coding.

● Strong Java 8 and Microservices (must).

● Should have a good understanding of AWS/GCP

● Kafka, AWS stack/Kubernetes

● An understanding of Object Oriented Design and standard design patterns.

● Experience of multi-threaded, 3-tier architectures/Distributed architectures, web services and caching.

● A familiarity with SQL databases

● Ability and willingness to work in a global, fast-paced environment.


● Flexible with the ability to adapt working style to meet objectives.

● Excellent communication and analytical skills

● Ability to effectively communicate with team members

● Experience in the following technologies would be beneficial but not essential, SpringBoot, AWS, Kubernetes, Terraform, Redis

Quinnox
Agency job via hirezyai by HR Hirezyai
Bengaluru (Bangalore)
7 - 12 yrs
₹30L - ₹34L / yr
.NET
Technical Architecture
SQL
Microservices

Job Summary:

We are looking for a highly skilled senior .NET Full Stack Developer with 7+ years of experience to join our dynamic team. The ideal candidate should have strong expertise in .NET 8, Microservices Architecture, SQL, and various design patterns. Prior experience in the banking or financial services domain is highly preferred. The role requires excellent communication skills, client interaction abilities, and a strong understanding of agile methodologies. You will be responsible for designing, developing, and maintaining scalable applications while collaborating closely with clients and cross-functional teams.

Key Responsibilities:

  1. Design, develop, and maintain robust, scalable, and high-performance applications using .NET 8 and related technologies.
  2. Develop and implement Microservices Architecture to build modular and scalable solutions.
  3. Work on both frontend and backend development, ensuring seamless integration.
  4. Apply design patterns such as SOLID principles, Repository Pattern, CQRS, and DDD for optimized application development.
  5. Develop and optimize complex SQL queries, stored procedures, and database structures to ensure efficient data management.
  6. Collaborate with business stakeholders and clients to gather requirements and provide technical solutions.
  7. Ensure security, performance, and scalability of applications, particularly in banking and financial environments.
  8. Actively participate in Agile/Scrum development cycles, including sprint planning, daily stand-ups, and retrospectives.
  9. Communicate effectively with clients and internal teams, demonstrating strong problem-solving skills.
  10. Troubleshoot and debug technical issues, ensuring high availability and smooth performance of applications.
  11. Mentor and guide junior developers, conducting code reviews and knowledge-sharing sessions.
  12. Stay updated with the latest technologies, frameworks, and best practices in .NET development and Microservices.

Required Skills & Experience:

  1. .NET 8 (C#, ASP.NET Core, Web API) – Strong hands-on experience in enterprise-level application development.
  2. Microservices Architecture – Experience designing and implementing scalable, loosely coupled services.
  3. Frontend Technologies – Knowledge of Angular, React, or Blazor for UI development.
  4. Design Patterns & Best Practices – Strong understanding of SOLID principles, Repository Pattern, CQRS, Factory Pattern, etc.
  5. SQL & Database Management – Expertise in MS SQL Server, query optimization, and stored procedures.
  6. Agile Methodologies – Solid understanding of Scrum, Kanban, and Agile best practices.
  7. Banking/Financial Domain (Preferred) – Experience in core banking systems, payment gateways, or financial applications.
  8. Client Interaction & Communication Skills – Excellent verbal and written communication, with the ability to engage with clients effectively.
  9. Logical & Analytical Thinking – Strong problem-solving skills with the ability to design efficient solutions.
  10. Cloud & DevOps (Preferred) – Exposure to Azure MANDATORY /AWS, Docker, Kubernetes, and CI/CD pipelines.
  11. Version Control & Collaboration – Proficiency in Git, Azure DevOps, and Agile tools (JIRA, Confluence, etc.).

Preferred Qualifications:

  1. Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  2. Certifications in Microsoft .NET, Azure, or Agile methodologies are a plus.


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Vijayalakshmi Selvaraj
Posted by Vijayalakshmi Selvaraj
Bengaluru (Bangalore)
3 - 15 yrs
Best in industry
skill iconJava
Microservices
SQL

  Minimum 3+ years of core Java programming with the Collections framework, concurrent programming, and multi-threading (good knowledge of ExecutorService, the fork/join pool, and other threading concepts)

 

·       Good knowledge of the JVM with an understanding of performance and memory optimization.

 

·       Extensive, expert-level programming experience in the Java programming language (strong OO skills preferred).

 

·       Excellent knowledge of collections such as ArrayList, Vector, LinkedList, HashMap, Hashtable, and HashSet is mandatory.

 

·       Exercised exemplary development practices including design specification, coding standards, unit testing, and code-reviews.

 

·       Expert level understanding of Object-Oriented Concepts and Data Structures

 

·       Good experience with databases (Sybase, Oracle, or SQL Server): indexing (clustered and non-clustered), hashing, segmenting, data types such as CLOB/BLOB, views (including materialized views), replication, constraints, functions, triggers, stored procedures, etc.


Read more
Auxo AI
kusuma Gullamajji
Posted by kusuma Gullamajji
Hyderabad, Bengaluru (Bangalore), Mumbai, Delhi, Gurugram
2 - 6 yrs
₹15L - ₹35L / yr
Microsoft Windows Azure
ETL
Data Warehouse (DWH)
SQL

We are seeking skilled Data Engineers with prior experience on Azure data engineering services and SQL server to join our team. As a Data Engineer, you will be responsible for designing and implementing robust data infrastructure, building scalable data pipelines, and ensuring efficient data integration and storage.


Experience : 2+ years


Notice : Immediate to 30 days


Responsibilities :


- Design, develop, and maintain scalable data pipelines using Azure Data Factory and Azure Stream Analytics


- Collaborate with data scientists and analysts to understand data requirements and implement solutions that support analytics and machine learning initiatives.


- Optimize data storage and retrieval mechanisms to ensure performance, reliability, and cost-effectiveness.


- Implement data governance and security best practices to ensure compliance and data integrity.


- Troubleshoot and debug data pipeline issues, providing timely resolution and proactive monitoring.


- Stay abreast of emerging technologies and industry trends, recommending innovative solutions to enhance data engineering capabilities.


Qualifications :


- Proven experience as a Data Engineer or in a similar role.


- Experience in designing and hands-on development in cloud-based (AWS/Azure) analytics solutions.


- Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, Azure App Service, Azure Databricks, Azure IoT, Azure HDInsight + Spark, and Azure Stream Analytics


- Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential.


- Experience with SQL Server and stored procedures.


- Thorough understanding of Azure Infrastructure offerings.


- Strong experience in common data warehouse modelling principles including Kimball, Inmon.


- Familiarity with additional modern database technologies and terminology.


- Working knowledge of Python or Java or Scala is desirable


- Strong knowledge of data modelling, ETL processes, and database technologies


- Experience with big data processing frameworks (Hadoop, Spark) and data pipeline orchestration tools (Airflow).


- Solid understanding of data governance, data security, and data quality best practices.


- Strong analytical and problem-solving skills, with attention to detail.

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Vijayalakshmi Selvaraj
Posted by Vijayalakshmi Selvaraj
Bengaluru (Bangalore)
3 - 8 yrs
₹3L - ₹27L / yr
skill iconAmazon Web Services (AWS)
skill iconPython
snowflake
SQL

Key Responsibilities:

  • Design, develop, and maintain data pipelines on AWS.
  • Work with large-scale data processing using SQL, Python or PySpark.
  • Implement and optimize ETL processes for structured and unstructured data (a minimal PySpark sketch follows this list).
  • Develop and manage data models in Snowflake.
  • Ensure data security, integrity, and compliance on AWS cloud infrastructure.
  • Collaborate with cross-functional teams to support data-driven decision-making.
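
By way of illustration, here is a minimal PySpark sketch of the read-transform-write step such pipelines typically involve. The bucket paths, column names, and dataset are hypothetical placeholders, not part of the role description.

```python
# Minimal PySpark ETL sketch: read raw JSON from S3, clean it, write partitioned Parquet.
# Bucket paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: semi-structured events landed by an upstream process
raw = spark.read.json("s3://example-raw-bucket/orders/2024/")

# Transform: drop malformed rows, normalise types, derive a partition column
clean = (
    raw.dropna(subset=["order_id", "order_ts"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
)

# Load: columnar, partitioned output ready for Snowflake or Athena-style querying
(clean.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-curated-bucket/orders/"))

spark.stop()
```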

Required Skills:

  • Strong hands-on experience with AWS services 
  • Proficiency in SQL, Python, or PySpark for data processing and transformation.
  • Experience working with Snowflake for data warehousing.
  • Strong understanding of data modeling, data governance, and performance tuning.
  • Knowledge of CI/CD pipelines for data workflows is a plus.


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Vijayalakshmi Selvaraj
Posted by Vijayalakshmi Selvaraj
Bengaluru (Bangalore)
5 - 10 yrs
Best in industry
skill icon.NET
Windows Azure
SQL
skill iconC#

JD:

The Senior Software Engineer works closely with our development team, product manager, dev-ops and business analysts to build our SaaS platform to support efficient, end-to-end business processes across the industry using modern flexible technologies such as GraphQL, Kubernetes and React.


Technical Skills : C#, Angular, and Azure, preferably with .NET


Responsibilities

· Develops and maintains back-end and front-end applications and cloud services using C#, Angular, and Azure

· Accountable for delivering high quality results

· Mentors less experienced members of the team

· Thrives in a test-driven development organization with high quality standards

· Contributes to architecture discussions as needed

· Collaborates with Business Analyst to understand user stories and requirements to meet functional needs

· Supports product team’s efforts to produce product roadmap by providing estimates for enhancements

· Supports user acceptance testing and user story approval processes on development items

· Participates in sessions to resolve product issues

· Escalates high priority issues to appropriate internal stakeholders as necessary and appropriate

· Maintains a professional, friendly, open, approachable, positive attitude


Location : Bangalore

Ideal Work Experience and Skills

· 7 - 15 years’ experience working in a software development environment

· Prefer Bachelor’s degree in software development or related field

· Development experience with Angular and .NET is beneficial but not required

· Highly self-motivated and able to work effectively with virtual teams of diverse backgrounds

· Strong desire to learn and grow professionally

· A track record of following through on commitments; Excellent planning, organizational, and time management skills


Read more
Talent Pro
Mayank choudhary
Posted by Mayank choudhary
Bengaluru (Bangalore)
3 - 5 yrs
₹20L - ₹25L / yr
ETL
SQL
Apache Spark
Apache Kafka

Role & Responsibilities

About the Role:


We are seeking a highly skilled Senior Data Engineer with 5-7 years of experience to join our dynamic team. The ideal candidate will have a strong background in data engineering, with expertise in data warehouse architecture, data modeling, ETL processes, and building both batch and streaming pipelines. The candidate should also possess advanced proficiency in Spark, Databricks, Kafka, Python, SQL, and Change Data Capture (CDC) methodologies.

Key responsibilities:


Design, develop, and maintain robust data warehouse solutions to support the organization's analytical and reporting needs.

Implement efficient data modeling techniques to optimize performance and scalability of data systems.

Build and manage data lakehouse infrastructure, ensuring reliability, availability, and security of data assets.

Develop and maintain ETL pipelines to ingest, transform, and load data from various sources into the data warehouse and data lakehouse.

Utilize Spark and Databricks to process large-scale datasets efficiently and in real-time.

Implement Kafka for building real-time streaming pipelines and ensure data consistency and reliability.

Design and develop batch pipelines for scheduled data processing tasks.

Collaborate with cross-functional teams to gather requirements, understand data needs, and deliver effective data solutions.

Perform data analysis and troubleshooting to identify and resolve data quality issues and performance bottlenecks.

Stay updated with the latest technologies and industry trends in data engineering and contribute to continuous improvement initiatives.

Read more
Xebia IT Architects

at Xebia IT Architects

2 recruiters
Vijay S
Posted by Vijay S
Bengaluru (Bangalore), Gurugram, Pune, Hyderabad, Chennai, Bhopal, Jaipur
10 - 15 yrs
₹30L - ₹40L / yr
Spark
Google Cloud Platform (GCP)
skill iconPython
Apache Airflow
PySpark
+1 more

We are looking for a Senior Data Engineer with strong expertise in GCP, Databricks, and Airflow to design and implement a GCP Cloud Native Data Processing Framework. The ideal candidate will work on building scalable data pipelines and help migrate existing workloads to a modern framework.


  • Shift: 2 PM - 11 PM
  • Work Mode: Hybrid (3 days a week) across Xebia locations
  • Notice Period: Immediate joiners or those with a notice period of up to 30 days


Key Responsibilities:

  • Design and implement a GCP Native Data Processing Framework leveraging Spark and GCP Cloud Services.
  • Develop and maintain data pipelines using Databricks and Airflow for transforming Raw → Silver → Gold data layers (an orchestration sketch follows this list).
  • Ensure data integrity, consistency, and availability across all systems.
  • Collaborate with data engineers, analysts, and stakeholders to optimize performance.
  • Document standards and best practices for data engineering workflows.
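
For a sense of what that Raw → Silver → Gold orchestration can look like, here is a minimal Airflow sketch. The DAG id, schedule, and task bodies are placeholders; in practice each task would typically submit a Databricks or BigQuery job rather than run a local function.

```python
# Minimal Airflow sketch of a Raw -> Silver -> Gold pipeline.
# DAG id, schedule, and task bodies are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_raw(**context):
    print("land raw files into the raw layer")


def build_silver(**context):
    print("clean and conform raw data into the silver layer")


def build_gold(**context):
    print("aggregate silver data into analytics-ready gold tables")


with DAG(
    dag_id="raw_silver_gold_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    raw = PythonOperator(task_id="ingest_raw", python_callable=ingest_raw)
    silver = PythonOperator(task_id="build_silver", python_callable=build_silver)
    gold = PythonOperator(task_id="build_gold", python_callable=build_gold)

    raw >> silver >> gold
```

Each layer only runs once the previous one has succeeded, which keeps the medallion layers consistent without manual coordination.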

Required Experience:


  • 7-8 years of experience in data engineering, architecture, and pipeline development.
  • Strong knowledge of GCP, Databricks, PySpark, and BigQuery.
  • Experience with Orchestration tools like Airflow, Dagster, or GCP equivalents.
  • Understanding of Data Lake table formats (Delta, Iceberg, etc.).
  • Proficiency in Python for scripting and automation.
  • Strong problem-solving skills and collaborative mindset.


⚠️ Please apply only if you have not applied recently or are not currently in the interview process for any open roles at Xebia.


Looking forward to your response!


Best regards,

Vijay S

Assistant Manager - TAG

https://www.linkedin.com/in/vijay-selvarajan/

Read more
Real-Time Marketing Automation Built on Customer Data Platform (CDP) for Enterprises

Agency job
via HyrHub by Neha Koshy
Bengaluru (Bangalore)
2 - 4 yrs
₹15L - ₹25L / yr
skill iconJava
skill iconSpring Boot
Apache Kafka
SQL
Algorithms
+6 more

Mandatory Skills:

  • Java
  • Kafka
  • Spring Boot
  • SQL / MySQL
  • Algorithms
  • Data Structures

Key Responsibilities:

  • Design and Develop large scale sub-systems
  • To periodically explore latest technologies (esp Open Source) and prototype sub-systems
  • Be a part of the team that develops the next gen Customer Data Platform
  • Build components to make the customer data platform more efficient and scalable

Qualifications:

  • 2-4 years of relevant experience with Algorithms, Data Structures, & Optimizations in addition to Coding.
  • Education: B.E/B-Tech/M-Tech/M.S/MCA Computer Science or Equivalent from premier institutes only
  • Candidates with CGPA 9 or above will be preferred.

Skill Set:

  • Good Aptitude/Analytical skills (emphasis will be on Algorithms, Data Structures, & Optimizations in addition to Coding)
  • Good System design and Class design
  • Good knowledge of Databases (Both SQL/NOSQL)
  • Good knowledge of Kafka, Streaming Systems
  • Good Knowledge of Java, Unit Testing

Soft Skills:

  • Has an appreciation of technology and its ability to create value in the CDP domain
  • Excellent written and verbal communication skills
  • Active & contributing team member
  • Strong work ethic with demonstrated ability to meet and exceed commitments
  • Others: Experience of having worked in a start-up is a plus
Read more
Real-Time Marketing Automation Built on Customer Data Platform (CDP) for Enterprises

Agency job
via HyrHub by Neha Koshy
Bengaluru (Bangalore)
1 - 2 yrs
₹7L - ₹12L / yr
skill iconJava
Apache Kafka
SQL
skill iconSpring Boot
Algorithms
+6 more

Mandatory Skills:

  • Java
  • Kafka
  • Spring Boot
  • SQL / MySQL
  • Algorithms
  • Data Structures

Key Responsibilities:

  • Design and Develop large scale sub-systems.
  • To periodically explore latest technologies (esp. Open Source) and prototype sub-systems.
  • Be a part of the team that develops the next-gen Targeting platform.
  • Build components to make the customer data platform more efficient and scalable.

Qualifications:

  • 0-2 years of relevant experience with Java, Algorithms, Data Structures, & Optimizations in addition to Coding.
  • Education: B.E/B-Tech/M-Tech/M.S in Computer Science or IT from premier institutes.
  • Candidates with CGPA 9 or above will be preferred.

Skill Set:

  • Good Aptitude/Analytical skills (emphasis will be on Algorithms, Data Structures, & Optimizations in addition to Coding).
  • Good knowledge of Databases - SQL, NoSQL.
  • Knowledge of Unit Testing a plus.

Soft Skills:

  • Has an appreciation of technology and its ability to create value in the marketing domain.
  • Excellent written and verbal communication skills.
  • Active & contributing team member.
  • Strong work ethic with demonstrated ability to meet and exceed commitments.
  • Others: Experience of having worked in a start-up is a plus.
Read more
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Bengaluru (Bangalore), Pune, Gurugram, Chennai, Bhopal, Jaipur
5 - 10 yrs
₹15L - ₹24L / yr
Tableau
SQL

Job Description:

We are seeking a Tableau Developer with 5+ years of experience to join our Core Analytics team. The candidate will work on large-scale BI projects using Tableau and related tools.


Must Have:

  • Strong expertise in Tableau Desktop and Server, including add-ons like Data and Server Management.
  • Ability to interpret business requirements, build wireframes, and finalize KPIs, calculations, and designs.
  • Participate in design discussions to implement best practices for dashboards and reports.
  • Build scalable BI and Analytics products based on feedback while adhering to best practices.
  • Propose multiple solutions for a given problem, leveraging toolset functionality.
  • Optimize data sources and dashboards while ensuring business requirements are met.
  • Collaborate with product, platform, and program teams for timely delivery of dashboards and reports.
  • Provide suggestions and take feedback to deliver future-ready dashboards.
  • Peer review team members’ dashboards, offering constructive feedback to improve overall design.
  • Proficient in SQL, UI/UX practices, and Alation, with an understanding of good data models for reporting.
  • Mentor less experienced team members.


Read more
Zenius IT Services Pvt Ltd

at Zenius IT Services Pvt Ltd

2 candid answers
Sunita Pradhan
Posted by Sunita Pradhan
Bengaluru (Bangalore), Chennai
5 - 10 yrs
₹10L - ₹15L / yr
snowflake
SQL
data integration tools
ETL/ELT Pipelines
SQL Queries
+5 more

Job Summary


We are seeking a skilled Snowflake Developer to design, develop, migrate, and optimize Snowflake-based data solutions. The ideal candidate will have hands-on experience with Snowflake, SQL, and data integration tools to build scalable and high-performance data pipelines that support business analytics and decision-making.


Key Responsibilities:


Develop and implement Snowflake data warehouse solutions based on business and technical requirements.

Design, develop, and optimize ETL/ELT pipelines for efficient data ingestion, transformation, and processing.

Write and optimize complex SQL queries for data retrieval, performance enhancement, and storage optimization (a minimal connector sketch follows these responsibilities).

Collaborate with data architects and analysts to create and refine efficient data models.

Monitor and fine-tune Snowflake query performance and storage optimization strategies for large-scale data workloads.

Ensure data security, governance, and access control policies are implemented following best practices.

Integrate Snowflake with various cloud platforms (AWS, Azure, GCP) and third-party tools.

Troubleshoot and resolve performance issues within the Snowflake environment to ensure high availability and scalability.

Stay updated on Snowflake best practices, emerging technologies, and industry trends to drive continuous improvement.
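
As a rough illustration of working with Snowflake from Python, here is a minimal sketch using the official snowflake-connector-python package. The account details, warehouse, table, and column names are hypothetical placeholders.

```python
# Minimal sketch: run an aggregation query against Snowflake from Python.
# Connection parameters, table, and column names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",  # in practice, prefer key-pair auth or a secrets manager
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="SALES",
)

try:
    cur = conn.cursor()
    # Aggregate on the warehouse side rather than pulling raw rows to the client;
    # filtering on the date column lets Snowflake prune micro-partitions.
    cur.execute(
        """
        SELECT region,
               DATE_TRUNC('month', order_ts) AS order_month,
               SUM(amount)                   AS total_amount
        FROM   orders
        WHERE  order_ts >= DATEADD(month, -3, CURRENT_DATE)
        GROUP BY region, order_month
        ORDER BY order_month, region
        """
    )
    for region, order_month, total_amount in cur.fetchall():
        print(region, order_month, total_amount)
finally:
    conn.close()
```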


Qualifications:

Education: Bachelor’s or master’s degree in computer science, Information Systems, or a related field.


Experience:


6+ years of experience in data engineering, ETL development, or similar roles.

3+ years of hands-on experience in Snowflake development.


Technical Skills:


Strong proficiency in SQL, Snowflake Schema Design, and Performance Optimization.

Experience with ETL/ELT tools like dbt, Talend, Matillion, or Informatica.

Proficiency in Python, Java, or Scala for data processing.

Familiarity with cloud platforms (AWS, Azure, GCP) and integration with Snowflake.

Experience with data governance, security, and compliance best practices.

Strong analytical, troubleshooting, and problem-solving skills.

Communication: Excellent communication and teamwork abilities, with a focus on collaboration across teams.


Preferred Skills:


Snowflake Certification (e.g., SnowPro Core or Advanced).

Experience with real-time data streaming using tools like Kafka or Apache Spark.

Hands-on experience with CI/CD pipelines and DevOps practices in data environments.

Familiarity with BI tools like Tableau, Power BI, or Looker for data visualization and reporting.

Read more
Talent Pro
Mayank choudhary
Posted by Mayank choudhary
Bengaluru (Bangalore)
5 - 8 yrs
₹30L - ₹45L / yr
skill iconSpring Boot
Spring
Microservices
skill iconJava
skill iconAmazon Web Services (AWS)
+7 more

What we Require


We are recruiting technical experts with the following core skills and hands-on experience on


Mandatory skills : Core Java, Microservices, AWS/Azure/GCP, Spring, Spring Boot

Hands-on experience with : Kafka, Redis, SQL, Docker, Kubernetes

Expert proficiency in designing both producer and consumer types of REST services.

Expert proficiency in Unit testing and Code Quality tools.

Expert proficiency in ensuring code coverage.

Expert proficiency in understanding High-Level Design and translating that to Low-Level design

Hands-on experience working with NoSQL databases.

Experience working in an Agile development process - Scrum.

Experience working closely with engineers and software cultures.

Ability to think at a high level about product strategy and customer journeys.

Ability to produce low level design considering the paradigm that journeys will be extensible in the future and translate that into components that can be easily extended and reused.

Excellent communication skills to clearly articulate design decisions.

Read more
Jio Tesseract
TARUN MISHRA
Posted by TARUN MISHRA
Bengaluru (Bangalore), Pune, Hyderabad, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Mumbai, Navi Mumbai, Kolkata, Rajasthan
5 - 24 yrs
₹9L - ₹70L / yr
skill iconC
skill iconC++
Visual C++
Embedded C++
Artificial Intelligence (AI)
+32 more

JioTesseract, a digital arm of Reliance Industries, is India's leading and largest AR/VR organization with the mission to democratize mixed reality for India and the world. We make products at the intersection of hardware, software, content, and services, with a focus on making India the leader in spatial computing. We specialize in creating solutions in AR, VR and AI, with some of our notable products such as JioGlass, JioDive, 360 Streaming, Metaverse, AR/VR headsets for consumers and enterprise space.


Mon-fri role, In office, with excellent perks and benefits!


Position Overview

We are seeking a Software Architect to lead the design and development of high-performance robotics and AI software stacks utilizing NVIDIA technologies. This role will focus on defining scalable, modular, and efficient architectures for robot perception, planning, simulation, and embedded AI applications. You will collaborate with cross-functional teams to build next-generation autonomous systems.


Key Responsibilities:

1. System Architecture & Design

● Define scalable software architectures for robotics perception, navigation, and AI-driven decision-making.

● Design modular and reusable frameworks that leverage NVIDIA’s Jetson, Isaac ROS, Omniverse, and CUDA ecosystems.

● Establish best practices for real-time computing, GPU acceleration, and edge AI inference.


2. Perception & AI Integration

● Architect sensor fusion pipelines using LIDAR, cameras, IMUs, and radar with DeepStream, TensorRT, and ROS2.

● Optimize computer vision, SLAM, and deep learning models for edge deployment on Jetson Orin and Xavier.

● Ensure efficient GPU-accelerated AI inference for real-time robotics applications.


3. Embedded & Real-Time Systems

● Design high-performance embedded software stacks for real-time robotic control and autonomy.

● Utilize NVIDIA CUDA, cuDNN, and TensorRT to accelerate AI model execution on Jetson platforms.

● Develop robust middleware frameworks to support real-time robotics applications in ROS2 and Isaac SDK.


4. Robotics Simulation & Digital Twins

● Define architectures for robotic simulation environments using NVIDIA Isaac Sim & Omniverse.

● Leverage synthetic data generation (Omniverse Replicator) for training AI models.

● Optimize sim-to-real transfer learning for AI-driven robotic behaviors.


5. Navigation & Motion Planning

● Architect GPU-accelerated motion planning and SLAM pipelines for autonomous robots.

● Optimize path planning, localization, and multi-agent coordination using Isaac ROS Navigation.

● Implement reinforcement learning-based policies using Isaac Gym.


6. Performance Optimization & Scalability

● Ensure low-latency AI inference and real-time execution of robotics applications.

● Optimize CUDA kernels and parallel processing pipelines for NVIDIA hardware.

● Develop benchmarking and profiling tools to measure software performance on edge AI devices.


Required Qualifications:

● Master’s or Ph.D. in Computer Science, Robotics, AI, or Embedded Systems.

● Extensive experience (7+ years) in software development, with at least 3-5 years focused on architecture and system design, especially for robotics or embedded systems.

● Expertise in CUDA, TensorRT, DeepStream, PyTorch, TensorFlow, and ROS2.

● Experience in NVIDIA Jetson platforms, Isaac SDK, and GPU-accelerated AI.

● Proficiency in programming languages such as C++, Python, or similar, with deep understanding of low-level and high-level design principles.

● Strong background in robotic perception, planning, and real-time control.

● Experience with cloud-edge AI deployment and scalable architectures.


Preferred Qualifications

● Hands-on experience with NVIDIA DRIVE, NVIDIA Omniverse, and Isaac Gym

● Knowledge of robot kinematics, control systems, and reinforcement learning

● Expertise in distributed computing, containerization (Docker), and cloud robotics

● Familiarity with automotive, industrial automation, or warehouse robotics

● Experience designing architectures for autonomous systems or multi-robot systems.

● Familiarity with cloud-based solutions, edge computing, or distributed computing for robotics

● Experience with microservices or service-oriented architecture (SOA)

● Knowledge of machine learning and AI integration within robotic systems

● Knowledge of testing on edge devices with HIL and simulations (Isaac Sim, Gazebo, V-REP etc.)

Read more
Bengaluru (Bangalore)
5 - 9 yrs
₹15L - ₹40L / yr
ICM
Varicent
callidus
sap commission
SQL

The Opportunity:

Working within the Professional Services team as Senior Implementation Consultant you will be responsible for ensuring our customers are implemented efficiently and effectively. You will partner with our customers to take them on a journey from understanding their incentive compensation needs, through design and building within the product, testing and training, to the customer using Performio to pay their commissions. You will be able to work independently as well as part of small project teams to create and implement solutions for existing and new customers alike.

About Performio:

We are a small, but mighty company offering incentive compensation management software (ICM) that regularly beats the legacy incumbents in our industry. How? Our people and our product.

Our people are highly-motivated and engaged professionals with a clear set of values and behaviors. We prove these values matter to us by living them each day. This makes Performio both a great place to work and a great company to do business with.

But a great team alone is not sufficient to win. We also have a great product that balances the flexibility that large companies need in a sales commission solution with the great user experience buyers have come to expect from modern software. We are the only company in our industry that can make this claim and the market has responded favorably. We have a global customer base across Australia, Asia, Europe, and the US in 25+ industries that includes many well-known companies like News Corp, Johnson & Johnson and Vodafone.

What will you be doing:

  • Effectively organize, lead and facilitate discussions with Customers and internal Stakeholders
  • Elicit detailed requirements from Customers around their sales comp plans, data, and processes
  • Deliver thorough, well-documented, detailed system design that ties back to Customer-specific requirements
  • Implement system designs in our SaaS ICM Software product: set up data intake from the Customer, data integration / enrichment / transformation, build out the compensation plans in the software tool, and configure reporting & analytics according to customer requirements gleaned in the discovery process
  • Conduct thorough functional testing to ensure a high-quality solution is delivered to customers
  • Evaluate discrepancies in data and accurately identify root causes for any errors
  • Lead the Customer Testing Support & System Handover activities, working closely with Customer Admins
  • Train & support Customer Admins on the Performio product and the Customer’s specific implementation
  • Be actively involved and support Customers through the system go-live process
  • Support existing Customers by investigating and resolving issues
  • Provide detailed and accurate estimates for potential Customer Projects & Change Requests
  • Participate in activities, and provide feedback on internal processes and the standard solution
  • Document best practices, define/develop reusable components, and advocate leverage/reuse of these to improve the Delivery Efficiencies for Performio & the Time to Value for Customers
  • Participate in activities, and provide Customer-focused feedback on improving and stabilizing the product
  • Track work on projects using our PS automation software

What we’re looking for:

  • 5+ years of relevant working experience in professional services on implementation of ICM solutions using products like SAP Commissions/Callidus, Xactly, Varicent, etc.
  • Proficient in working with large datasets using Excel, relational database tables, SQL, ETL, or similar types of tools
  • Good understanding of Incentive Compensation Management concepts
  • Willing to take on ambiguous & complex challenges and solve them
  • Loves and is good at multitasking and juggling multiple workstreams
  • Effective and confident in communication - ability to interface with senior employees at our Customers
  • Ability to lead projects / initiatives, coach & guide junior consultants to help them be successful in their roles
  • Highly detail oriented - takes great notes, can document complex solutions in detail
  • Some understanding of accounting, finance, and/or sales comp concepts
  • Positive attitude - optimistic, cares deeply about Company and Customers
  • High emotional IQ - shows empathy, listens when appropriate, creates healthy conversation, dynamic, humble / no ego
  • Resourceful - has an "I'll figure it out" attitude if something they need doesn't exist
  • Smart and curious to learn
  • Programming experience using Python will be a plus

Why us:

We’re fast-growing, but still small enough for everyone to make a big impact (and have face time with the CEO). We have genuine care for our customers and the passion to transform our product into one where experience and ease of use is a true differentiator. Led by a strong set of company values, we play to win and are incentivized to do so through our employee equity plan.

We’ve adapted well to the work from home lifestyle, and take advantage of flexible working arrangements.

Our values speak strongly to who we really are. They mean a lot to us, and we use them every day to make decisions, and of course to hire great people!

  • Play to win - we focus on results, have a bias to action and finish what we start
  • Paint a clear picture - we’re clear, concise and communicate appropriately
  • Be curious - we surface alternative solutions and consistently expand our knowledge
  • Work as one - we all pitch in but also hold each other to account
  • Do the right thing - we put what’s right for our customers over our own ego
Read more
Wissen Technology

at Wissen Technology

4 recruiters
Vijayalakshmi Selvaraj
Posted by Vijayalakshmi Selvaraj
Bengaluru (Bangalore), Hyderabad
6 - 12 yrs
Best in industry
SQL
Datawarehousing
ETL QA
snowflake

Job Description for QA Engineer:

  • 6-10 years of experience in ETL Testing, Snowflake, DWH Concepts.
  • Strong SQL knowledge & debugging skills are a must.
  • Experience in Azure and Snowflake testing is a plus
  • Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus
  • Strong data warehousing concepts; experience with ETL tools such as Talend Cloud Data Integration and Pentaho/Kettle
  • Experience with JIRA and the Xray defect management tool is good to have.
  • Exposure to the financial domain is considered a plus.
  • Testing data readiness (data quality) and addressing code or data issues
  • Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions
  • Demonstrate strong collaborative experience across regions (APAC, EMEA and NA) to effectively and efficiently identify root cause of code/data issues and come up with a permanent solution
  • Prior experience with State Street and Charles River Development (CRD) considered a plus
  • Experience in tools such as PowerPoint, Excel, SQL
  • Exposure to Third party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus

Key Attributes include:

  • Team player with professional and positive approach
  • Creative, innovative and able to think outside of the box
  • Strong attention to detail during root cause analysis and defect issue resolution
  • Self-motivated & self-sufficient
  • Effective communicator both written and verbal
  • Brings a high level of energy with enthusiasm to generate excitement and motivate the team
  • Able to work under pressure with tight deadlines and/or multiple projects
  • Experience in negotiation and conflict resolution


Read more
Pixis AI

at Pixis AI

2 candid answers
Nikita Sinha
Posted by Nikita Sinha
Bengaluru (Bangalore)
4 - 6 yrs
Upto ₹45L / yr (Varies)
skill iconPython
skill iconGo Programming (Golang)
Apache Kafka
Apache Spark
Apache Airflow
+5 more

Key Responsibilities:

  • Design, build, and maintain scalable, real-time data pipelines using Apache Flink (or Apache Spark).
  • Work with Apache Kafka (mandatory) for real-time messaging and event-driven data flows (a minimal consumer sketch follows this list).
  • Build data infrastructure on Lakehouse architecture, integrating data lakes and data warehouses for efficient storage and processing.
  • Implement data versioning and cataloging using Apache Nessie, and optimize datasets for analytics with Apache Iceberg.
  • Apply advanced data modeling techniques and performance tuning using Apache Doris or similar OLAP systems.
  • Orchestrate complex data workflows using DAG-based tools like Prefect, Airflow, or Mage.
  • Collaborate with data scientists, analysts, and engineering teams to develop and deliver scalable data solutions.
  • Ensure data quality, consistency, performance, and security across all pipelines and systems.
  • Continuously research, evaluate, and adopt new tools and technologies to improve our data platform.
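
As an illustration of the stream-ingestion side, here is a minimal consumer sketch using the kafka-python client. The topic, brokers, and message schema are placeholders, and a production pipeline would hand these events to Flink or Spark jobs rather than a plain loop.

```python
# Minimal kafka-python sketch: consume a stream of JSON events.
# Topic, brokers, and the message schema are hypothetical placeholders.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders-events",
    bootstrap_servers=["localhost:9092"],
    group_id="orders-ingest",
    auto_offset_reset="earliest",
    enable_auto_commit=True,
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # A downstream step would validate, enrich, and write these events to the lakehouse.
    print(message.topic, message.partition, message.offset, event.get("order_id"))
```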

Skills & Qualifications:

  • 3–6 years of experience in data engineering, building scalable data pipelines and systems.
  • Strong programming skills in Python, Go, or Java.
  • Hands-on experience with stream processing frameworks – Apache Flink (preferred) or Apache Spark.
  • Mandatory experience with Apache Kafka for stream data ingestion and message brokering.
  • Proficiency with at least one DAG-based orchestration tool like Airflow, Prefect, or Mage.
  • Solid understanding and hands-on experience with SQL and NoSQL databases.
  • Deep understanding of data lakehouse architectures, including internal workings of data lakes and data warehouses, not just usage.
  • Experience working with at least one cloud platform, preferably AWS (GCP or Azure also acceptable).
  • Strong knowledge of distributed systems, data modeling, and performance optimization.

Nice to Have:

  • Experience with Apache Doris or other MPP/OLAP databases.
  • Familiarity with CI/CD pipelines, DevOps practices, and infrastructure-as-code in data workflows.
  • Exposure to modern data version control and cataloging tools like Apache Nessie.
Read more
Pixis AI

at Pixis AI

2 candid answers
Nikita Sinha
Posted by Nikita Sinha
Bengaluru (Bangalore)
4 - 6 yrs
Upto ₹40L / yr (Varies)
skill iconNodeJS (Node.js)
skill iconReact.js
SQL
skill iconRedis
Apache Kafka

Key Responsibilities:


• Design, develop, and maintain scalable and robust full-stack applications using cutting-edge technologies.

• Collaborate with product managers, designers, and other stakeholders to understand requirements and translate them into technical specifications.

• Implement front-end and back-end features with a focus on usability, performance, and security.

• Write clean, efficient, and maintainable code while following best practices and coding standards.

• Conduct code reviews, provide constructive feedback, and mentor junior developers to foster a culture of continuous improvement.

• Troubleshoot and debug issues, identify root causes, and implement solutions to ensure smooth application operation.

• Stay updated on emerging technologies and industry trends and apply them to enhance our software development process and capabilities.


Requirements & Skills:


• Bachelor’s degree in computer science, Engineering, or a related field.

• 4+ years of professional experience in software development, with a focus on full-stack development.

• Proficiency in programming languages such as JavaScript (Node.js), Python, or Java.

• Experience with front-end frameworks/libraries such as React, Angular, or Vue.js.

• Solid understanding of back-end frameworks/libraries such as Express.js, Django, or Spring Boot.

• Experience with database systems (SQL and NoSQL) and ORMs (e.g., Sequelize, SQLAlchemy, Hibernate).

Read more
OnActive
Mansi Gupta
Posted by Mansi Gupta
Gurugram, Pune, Bengaluru (Bangalore), Chennai, Bhopal, Hyderabad, Jaipur
5 - 8 yrs
₹6L - ₹12L / yr
skill iconPython
Spark
SQL
AWS CloudFormation
skill iconMachine Learning (ML)
+3 more

Level of skills and experience:


5 years of hands-on experience using Python, Spark, and SQL.

Experienced in AWS Cloud usage and management.

Experience with Databricks (Lakehouse, ML, Unity Catalog, MLflow).

Experience using various ML models and frameworks such as XGBoost, LightGBM, and Torch (a minimal training sketch follows this section).

Experience with orchestrators such as Airflow and Kubeflow.

Familiarity with containerization and orchestration technologies (e.g., Docker, Kubernetes).

Fundamental understanding of Parquet, Delta Lake and other data file formats.

Proficiency on an IaC tool such as Terraform, CDK or CloudFormation.

Strong written and verbal English communication skills, and proficiency in communicating with non-technical stakeholders.
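
As a small illustration of the modelling part of this stack, here is a minimal XGBoost training sketch on a synthetic dataset; the data, features, and hyperparameters are placeholders.

```python
# Minimal sketch: train and evaluate an XGBoost classifier on synthetic data.
# The dataset and hyperparameters are hypothetical placeholders.
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = make_classification(n_samples=5_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = XGBClassifier(n_estimators=200, max_depth=5, learning_rate=0.1)
model.fit(X_train, y_train)

# Evaluate on held-out data
probs = model.predict_proba(X_test)[:, 1]
print("test AUC:", roc_auc_score(y_test, probs))
```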

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Vijayalakshmi Selvaraj
Posted by Vijayalakshmi Selvaraj
Bengaluru (Bangalore)
5 - 10 yrs
Best in industry
skill iconC#
Selenium
SDET
SQL

Required Skills and Experience:

  • 5-7 years of experience in Software Development Engineering in Test (SDET).
  • Proven experience in developing and maintaining automated test suites using Selenium.
  • Strong proficiency in .NET programming languages (C#).
  • Solid understanding of software testing principles and methodologies.
  • Experience with SQL for database testing.
  • Ability to analyze and troubleshoot complex software issues.
  • Excellent communication and collaboration skills.


Nice-to-Have Skills:

  • Experience writing manual test cases and creating comprehensive test documentation.
  • Understanding of API testing and experience with API test automation.
  • Familiarity with Azure DevOps and GitHub for version control and CI/CD.
  • Experience in performance testing.
  • Experience with Agile/Scrum development methodologies.


Read more
Averlon

at Averlon

2 products
2 recruiters
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
7yrs+
Upto ₹90L / yr (Varies)
skill iconGo Programming (Golang)
Distributed Systems
Microservices
Architecture
SQL
+3 more

Join an innovative and groundbreaking cybersecurity startup focused on helping customers identify, mitigate, and protect against ever-evolving cyber threats. With the current geopolitical climate, organizations need to stay ahead of malicious threat actors as well as nation-state actors. Cybersecurity teams are getting overwhelmed, and they need intelligent systems to help them focus on addressing the biggest and current risks first.


We help organizations protect their assets and customer data by continuously evaluating the new threats and risks to their cloud environment. This will, in turn, help mitigate the high-priority threats quickly so that the engineers can spend more time innovating and providing value to their customers.


About the Engineering Team:

We have several decades of experience working in the security industry, having worked on some of the most cutting-edge security technology that helped protect millions of customers. We have built technologies from the ground up, partnered with the industry on innovation, and helped customers with some of the most stringent requirements. We leverage industry and academic experts and veterans for their unique insight. Security technology includes all facets of software engineering work from data analytics and visualization, AI/ML processing, highly distributed and available services with real-time monitoring, integration with various other services, including protocol-level work. You will be learning from some of the best engineering talent with multi-cloud expertise.


We are looking for a highly experienced Principal Software Engineer to lead the development and scaling of our backend systems. The ideal candidate will have extensive experience in distributed systems, database management, Kubernetes, and cloud technologies. As a key technical leader, you will design, implement, and optimize critical backend services, working closely with cross-functional teams to ensure system reliability, scalability, and performance.


Key Responsibilities:

  • Architect and Develop Distributed Systems: Design and implement scalable, distributed systems using microservices architecture. Expertise in both synchronous (REST/gRPC) and asynchronous communication patterns (message queues, Kafka), with a strong emphasis on building resilient services that can handle large data and maintain high throughput. Craft cloud solutions tailored to specific needs, choosing appropriate AWS services and optimizing resource utilization to ensure performance and high availability.
  • Database Architecture & Optimization: Lead efforts to design and manage databases with a focus on scaling, replication, query optimization, and managing large datasets.
  • Performance & Reliability: Engage in continuous learning and innovation to improve customer satisfaction. Embrace accountability and respond promptly to service issues to maintain and enhance system health. Ensure the backend systems meet high standards for performance, reliability, and scalability, identifying and solving bottlenecks and architectural challenges by leveraging various observability tools (such as Prometheus and Grafana).
  • Leadership & Mentorship: Provide technical leadership and mentorship to other engineers, guiding architecture decisions, reviewing code, and helping to build a strong engineering culture. Stay abreast of the latest industry trends in cloud technology, adopting best practices to continuously improve our services and security measures.


Key Qualifications:

  • Experience: 8+ years of experience in backend engineering, with at least 5 years of experience in building distributed systems.
  • Technical Expertise:
  • Distributed Systems: Extensive experience with microservices architecture, working with both synchronous (REST, gRPC) and asynchronous patterns (SNS, SQS). Strong understanding of service-to-service authentication and authorization, API rate limiting, and other critical aspects of scalable systems.
  • Database: Expertise in database technologies with experience working with large datasets, optimizing queries, handling replication, and creating views for performance. Hands-on experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., DynamoDB, Cassandra). Expertise in various database technologies and deep experience with creating data models that provide consistent data views to the customer while data is being morphed, handling data migrations, and ensuring data integrity and high availability.
  • Kubernetes: In-depth knowledge of Kubernetes, with experience deploying and managing services in Kubernetes clusters (EKS, AKS). Strong understanding of pods, services, networking, and scaling applications within Kubernetes environments.
  • Golang: Proven experience using Golang as the primary programming language for backend development. Deep understanding of concurrency, performance optimization, and scalability in Golang applications.
  • Cloud Technologies: Strong hands-on experience with AWS services (EC2, S3, DynamoDB, Lambda, RDS, EKS). Experience in designing and optimizing cloud-based architectures for large-scale distributed systems.
  • Problem Solver: Strong problem-solving and debugging skills, with a proven ability to design and optimize complex systems.
  • Leadership: Experience in leading engineering teams, guiding architectural decisions, and mentoring junior engineers.


Preferred Skills:

  • Experience with infrastructure as code (Terraform, CloudFormation).
  • Knowledge of GitHub-based CI/CD tools and best practices.
  • Experience with monitoring and logging tools (Prometheus, Grafana, ELK).
  • Cybersecurity experience.
Read more
Fork Technologies
Bengaluru (Bangalore)
5 - 8 yrs
Upto ₹24L / yr (Varies)
Tableau
skill iconPython
SQL
ETL

Job Summary:


We are seeking a skilled Senior Tableau Developer to join our data team. In this role, you will design and build interactive dashboards, collaborate with data teams to deliver impactful insights, and optimize data pipelines using Airflow. If you are passionate about data visualization, process automation, and driving business decisions through analytics, we want to hear from you.

Key Responsibilities:

  • Develop and maintain dynamic Tableau dashboards and visualizations to provide actionable business insights.
  • Partner with data teams to gather reporting requirements and translate them into effective data solutions.
  • Ensure data accuracy by integrating various data sources and optimizing data pipelines.
  • Utilize Airflow for task orchestration, workflow scheduling, and monitoring.
  • Enhance dashboard performance by streamlining data processing and improving query efficiency.

Requirements:

  • 5+ years of hands-on experience in Tableau development.
  • Proficiency in Airflow for building and automating data pipelines.
  • Strong skills in data transformation, ETL processes, and data modeling.
  • Solid understanding of SQL and database management.
  • Excellent problem-solving skills and the ability to work collaboratively across teams.

Nice to Have:

  • Experience with cloud platforms like AWS, GCP, or Azure.
  • Familiarity with programming languages such as Python or R.

Why Join Us?

  • Work on impactful data projects with a talented and collaborative team.
  • Opportunity to innovate and shape data visualization strategies.
  • Competitive compensation and professional growth opportunities
Read more
Wissen Technology

at Wissen Technology

4 recruiters
Vijayalakshmi Selvaraj
Posted by Vijayalakshmi Selvaraj
Bengaluru (Bangalore)
3 - 8 yrs
₹5L - ₹30L / yr
snowflake
skill iconPython
SQL
skill iconAmazon Web Services (AWS)

Required Skills and Qualifications:

  • Proficiency in AWS Cloud services
  • Strong experience with SQL for data manipulation and analysis.
  • Hands-on programming experience in Python or PySpark.
  • Working knowledge of Snowflake, including data modeling, performance tuning, and security configurations.
  • Familiarity with CI/CD pipelines and version control (e.g., Git) is a plus.
  • Excellent problem-solving and communication skills.

Note : one face-to-face (F2F) round is mandatory, and as per the process, you will need to visit the office for this.

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Tony Tom
Posted by Tony Tom
Bengaluru (Bangalore)
3 - 8 yrs
Best in industry
uipath
SQL
API
Scripting
skill iconPython
+2 more

Position: UiPath Developer

Experience: 3-7 years


Key Responsibilities:

1. Develop and implement automation solutions using UiPath.

2. Design, develop, test, and deploy RPA bots for process automation.

3. Write and optimize SQL queries, including joins, to manage and manipulate data effectively (see the sketch after this list).

4. Develop scripts using Python, VB, .NET, or JavaScript to enhance automation capabilities.

5. Work with business stakeholders to analyze and optimize automation workflows.

6. Troubleshoot and resolve issues in RPA processes and scripts.

7. Ensure adherence to best practices in automation development and deployment.
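
As an illustration of the SQL-plus-scripting combination this role calls for, here is a minimal sketch of a Python helper running a join query against SQL Server via pyodbc. The connection string, tables, and columns are hypothetical placeholders.

```python
# Minimal sketch: a Python helper running a SQL join, as a bot-supporting script might.
# Connection string, tables, and columns are hypothetical placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=example-server;DATABASE=Operations;Trusted_Connection=yes;"
)

query = """
    SELECT  i.invoice_id,
            v.vendor_name,
            i.amount,
            i.due_date
    FROM    invoices AS i
    INNER JOIN vendors AS v
            ON v.vendor_id = i.vendor_id
    WHERE   i.status = ?
      AND   i.due_date <= DATEADD(day, 7, GETDATE())
    ORDER BY i.due_date
"""

cursor = conn.cursor()
for row in cursor.execute(query, "PENDING"):
    # An RPA workflow (or a downstream step) would act on each upcoming invoice here.
    print(row.invoice_id, row.vendor_name, row.amount, row.due_date)

conn.close()
```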



Required Skills & Experience:


1. 3-7 years of experience in RPA development with UiPath.

2. Strong expertise in SQL, including writing complex queries and joins.

3. Hands-on experience with at least one scripting language: Python, VB, .NET, or JavaScript.

4. Understanding of RPA best practices, exception handling, and performance optimization.

5. Experience integrating UiPath with databases and other applications.

6. Strong problem-solving and analytical skills.



Preferred Qualifications:


1. UiPath certification is a plus.

2. Experience working in Agile environments.

3. Knowledge of APIs and web automation (AA).

Read more
Performio

Performio

Agency job
via maple green services by Nikita Sinha
Bengaluru (Bangalore)
4 - 10 yrs
Upto ₹65L / yr (Varies)
skill iconJava
skill iconReact.js
skill iconAmazon Web Services (AWS)
skill iconSpring Boot
SQL

HIRING FOR OUR CLIENT - PERFORMIO


https://www.performio.co

 

https://www.linkedin.com/company/performio/


About Performio


Headquartered in Irvine, California, and with offices in San Francisco and Melbourne, Performio continues to offer sales performance management software for businesses looking to automate their sales compensation calculations and provide increased transparency to their sales reps.


Used by large global enterprises such as Veeva, GrubHub, Johnson & Johnson, and Vodafone - as well as growing mid-market companies - Performio is a new breed of sales compensation software that combines the enterprise-grade functionality that you need with the ease of use you’ve come to expect from modern software applications.



What’s the opportunity?


As Senior/Lead Software Engineer, you will play a significant role in turning our product vision into reality, while working within the Product Engineering team to add value across our product. You’ll draw on your experience to help us establish contemporary, web application software that is highly scalable, durable, and based on current best practices.


Our company is built on good practice and high standards. Passion for our customers and a willingness to put their needs first is at the centre of everything we do. We have a long history of going the extra mile to make sure our customers are happy. We’re looking for someone to join our team, ensuring that our systems scale effectively with our growth, and considering not only where we want to be but how we will get there.


Our product is written mainly in Java (Spring, Hibernate) and ReactJS with our Design System, Electric. Our architecture is a combination of microservices, decoupled AWS service architectures, and a well-maintained monolith. The product is deployed on AWS across multiple regions. We use tools like Docker and Buildkite and deploy our systems and monitor our technology using CloudWatch, New Relic and SquadCast. We’re looking for someone to help us evolve how our systems operate together while we grow our team and capability.



What will I be doing?

  • Creating change in a complex system. Working within our product stream, making well-considered decisions around patterns, principals, frameworks, languages and tools, thinking through and mitigating for potential cascading impacts of those changes.
  • Designing and developing well-architected systems. Understand and contribute to our product source code and cloud infrastructure.
  • Designing holistically, delivering iteratively. Work with the team to break down system-wide architecture recommendations into small, intelligently planned increments for delivery. 
  • Advocate for technology needs. Translate technology risk into opportunity during product and technology roadmap discussions.
  • Coach and mentor. Assist with career development of less experienced staff on our teams.
  • Putting Customers First. A regular rotation on support for the systems we develop.

 


 What we’re looking for

  • Demonstrated experience as a software engineer, with 4-8 years experience in technology roles
  • Experience working on complex systems and cloud architectures
  • Significant experience across the full stack:
  • The Java programming language and frameworks such as Spring & Spring Boot
  • Front-end JavaScript frameworks such as ReactJS
  • Good knowledge of AWS services, design patterns and practices - ideally certified but if not, we’ll help you get there
  • Some knowledge of optimising databases and SQL queries for high performance
  • Experience and keen understanding of the value of working in agile teams
  • A “quality-first” mindset, with experience working in continuous integration environments and supporting the systems you contribute to
  • Highly effective at communicating, and comfortable whiteboarding design ideas with teams of engineers, product managers, and business analysts
  • Desire to challenge the status quo and maturity to know when to compromise
  • Respect for other team members and a highly collaborative approach to working and learning together
Read more