Python Jobs in Bangalore (Bengaluru)


Apply to 50+ Python Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest Python Job opportunities across top companies like Google, Amazon & Adobe.

HNM Solutions

Posted by Yogitha Rani
Chennai, Bangalore, Hyderabad, Kochi
3 - 5 yrs
₹6L - ₹12L / yr
Python
Django

Role: Python-Django Developer 

Location: Noida, India


Description:

  • Develop web applications using Python and Django.
  • Write clean and maintainable code following best practices and coding standards.
  • Collaborate with other developers and stakeholders to design and implement new features.
  • Participate in code reviews and maintain code quality.
  • Troubleshoot and debug issues as they arise.
  • Optimize applications for maximum speed and scalability.
  • Stay up-to-date with emerging trends and technologies in web development.

Requirements:

  • Bachelor's or Master's degree in Computer Science, Computer Engineering or a related field.
  • 4+ years of experience in web development using Python and Django.
  • Strong knowledge of object-oriented programming and design patterns.
  • Experience with front-end technologies such as HTML, CSS, and JavaScript.
  • Understanding of RESTful web services.
  • Familiarity with database technologies such as PostgreSQL or MySQL.
  • Experience with version control systems such as Git.
  • Ability to work in a team environment and communicate effectively with team members.
  • Strong problem-solving and analytical skills.
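The responsibilities above center on Django's request/response cycle. As an illustrative sketch only (Django itself is assumed, not shown), the snippet below uses the standard library's WSGI interface to demonstrate the route-to-view dispatch pattern that Django wraps with far richer machinery; the route path and view name are invented, not from this posting.

```python
# Illustrative sketch: a stdlib WSGI app showing the route -> view
# dispatch pattern that frameworks like Django build on.
# Route paths and view names are invented for illustration.
import json

def job_list_view(environ):
    # A "view" maps a request to a response, much like a Django view
    # returning an HttpResponse.
    body = json.dumps({"jobs": ["Python-Django Developer"]}).encode()
    return "200 OK", [("Content-Type", "application/json")], body

ROUTES = {"/jobs/": job_list_view}

def application(environ, start_response):
    # WSGI entry point: look up the view registered for the path.
    view = ROUTES.get(environ.get("PATH_INFO", ""))
    if view is None:
        start_response("404 Not Found", [("Content-Type", "text/plain")])
        return [b"not found"]
    status, headers, body = view(environ)
    start_response(status, headers)
    return [body]
```

Any WSGI server can host such a callable; in the real framework, Django's URLconf and view layer play these roles.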


Read more
DailyRounds/Marrow

Posted by Prabakharan SD
Bengaluru (Bangalore)
2 - 4 yrs
Best in industry
Python
MongoDB

Marrow is a learning platform for doctors, medical students, and other healthcare practitioners with topic-wise learning modules, tests, performance analytics, and high-quality recorded medical video classes. Marrow is currently used by over 5 lakh medical students in India to prepare for the country’s largest medical competitive exam - NEET PG.


USP of Marrow


1) Loved by more than 70% of aspiring doctors in India.

2) NEET-PG 2020, 2021, 2022, 2023 - Top 10 rankers were Marrow users.


DailyRounds is a healthcare startup focused on organizing “Knowledge of practice of Medicine” and building a community of Doctors (and healthcare professionals). We hold the largest IP (intellectual property) in clinical medicine in India. We hope to put this IP, network, and our best efforts to help Doctors improve how they diagnose and treat. We are a diverse team of 300 people based in Bangalore.


We are product-driven. We believe businesses should scale and be profitable. We avoid fads and focus on what makes business sense, what can scale, and what can make a positive impact (in that order).


In April 2019 M3 India, the Indian subsidiary of Japanese Healthtech company M3 (one of the largest healthcare networks globally, listed on the Tokyo Stock Exchange), picked up a majority stake in DailyRounds to foray into case-based problem-solving, community platform, and medical test preparation business in India.


What will you be doing here?


  • Work with the Engineering and Product team to build a scalable product while solving some of the core healthcare problems.
  • Design, build, and maintain efficient, reusable, and reliable Python code for backend systems.
  • Ensure the best possible performance, quality, and responsiveness of the applications
  • Will have exceptional learning opportunities.
  • Will be given a chance to own a product or a piece of technology.


Who are we looking for?


  • Minimum of 2 - 3.5 years of professional experience working with Python technologies.
  • Solid understanding of RESTful API principles and experience working with API frameworks (Flask, Django, or FastAPI).
  • Strong problem-solving skills and the ability to work independently or collaboratively as part of a team.
  • Proficiency in database concepts, ensuring efficient data management.
  • Preferred experience with MongoDB.
  • Passionate about application security, scalability, and stability.
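For candidates weighing the "RESTful API principles" requirement, the resource-oriented dispatch at the heart of REST can be sketched in a few lines of plain Python. The resource names and routes below are invented for illustration; Flask, Django, and FastAPI each provide far more complete versions of this idea.

```python
# Hedged sketch: the (method, path) -> handler dispatch that REST
# frameworks automate, reduced to plain functions over an in-memory
# store. All names and data are illustrative.
MODULES = {1: {"id": 1, "title": "Cardiology"}}

def list_modules():
    return 200, list(MODULES.values())

def get_module(module_id):
    module = MODULES.get(module_id)
    return (200, module) if module else (404, {"error": "not found"})

def dispatch(method, path):
    # e.g. "GET /modules/1" -> handler plus parsed path parameter
    parts = path.strip("/").split("/")
    if method == "GET" and parts == ["modules"]:
        return list_modules()
    if method == "GET" and len(parts) == 2 and parts[0] == "modules":
        return get_module(int(parts[1]))
    return 405, {"error": "method not allowed"}
```

The same shape scales up: frameworks add serialization, validation, and auth around exactly this lookup.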


Please note that only shortlisted candidates will be contacted.

Read more
Bengaluru (Bangalore)
6 - 12 yrs
₹25L - ₹35L / yr
Data Science
Machine Learning (ML)
Natural Language Processing (NLP)
Computer Vision
Deep Learning
+4 more


• 6+ years of data science experience.

• Demonstrated experience in leading programs.

• Prior experience in customer data platforms/finance domain is a plus.

• Demonstrated ability in developing and deploying data-driven products.

• Experience of working with large datasets and developing scalable algorithms.

• Hands-on experience of working with tech, product, and operation teams.


Technical Skills:

• Deep understanding and hands-on experience of Machine Learning and Deep Learning algorithms. Good understanding of NLP and LLM concepts and fair experience in developing NLU and NLG solutions.

• Experience with Keras/TensorFlow/PyTorch deep learning frameworks.

• Proficient in scripting languages (Python/Shell), SQL.

• Good knowledge of Statistics.

• Experience with big data, cloud, and MLOps.
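As a refresher on what hands-on deep learning experience means at the lowest level, here is the gradient-descent loop that frameworks like Keras, TensorFlow, and PyTorch automate, written in pure Python for a one-feature logistic-regression classifier on synthetic data. A sketch only, not production code.

```python
# Hedged sketch: binary cross-entropy gradient descent, the core loop
# that deep learning frameworks automate. Data is synthetic.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(xs, ys, lr=0.5, epochs=500):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        # Gradients of binary cross-entropy w.r.t. w and b
        dw = sum((sigmoid(w * x + b) - y) * x for x, y in zip(xs, ys)) / len(xs)
        db = sum(sigmoid(w * x + b) - y for x, y in zip(xs, ys)) / len(xs)
        w -= lr * dw
        b -= lr * db
    return w, b

# Linearly separable toy data: label is 1 when x > 2
xs, ys = [0.0, 1.0, 3.0, 4.0], [0, 0, 1, 1]
w, b = train(xs, ys)
```

In a real framework the same loop appears as `optimizer.step()` over autograd-computed gradients.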

Soft Skills:

• Strong analytical and problem-solving skills.

• Excellent presentation and communication skills.

• Ability to work independently and deal with ambiguity.

Continuous Learning:

• Stay up to date with emerging technologies.


Qualification:


A degree in Computer Science, Statistics, Applied Mathematics, Machine Learning, or any related field / B. Tech.



Read more
Nudge

Posted by Nikhil Mohite
Bengaluru (Bangalore)
4 - 8 yrs
Upto ₹45L / yr (varies)
Go Programming (Golang)
NextJs (Next.js)
Python
Mobile App Development

Lightning Job By Cutshort ⚡

 

As part of this feature, you can expect status updates about your application and replies within 72 hours (once the screening questions are answered).

 

About

Nudge is a user experience platform for consumer companies to help them activate, retain, and understand users. This enables product and growth teams to embed Onboarding, Stories, Streaks, and Surveys inside user journeys within minutes.

 

We’re backed by Antler, and marquee angels including Kunal Shah (Cred), Dhruv Bahl (BharatPe), Bharati Balakrishnan (Shopify), Prashant Pitti (Easemytrip), Pallav Nadhani (FusionCharts), and Ajinkya Kulkarni (WintWealth).

 

Website: https://www.nudgenow.com/

  

Skills Required:

1. Minimum 4 years of overall experience.

2. Strong background in backend development and DevOps.

  • Primary proficiency in Golang with a little focus on mobile app development.
  • Secondary skills in Python, with a willingness to work with Golang and a little focus on mobile app development.

3. Expertise in mobile technologies, preferably Flutter, or alternatively React Native, Swift (iOS), or Kotlin/Java (Android).

4. Hands-on experience with AWS.

5. Proficient in designing and scaling systems.

6. Basic understanding of Next.js/React.js for frontend development.

7. 6 months to 1 year of team-leading experience, with the ability to manage a team of 8 to 9 individuals in the future.


Key Responsibilities:

Technical Leadership:

- Set overall technical direction for projects, leading architectural decisions and design choices.

- Mentor and guide a team of software developers, fostering a culture of technical excellence.

- Oversee code quality, ensuring adherence to standards and best practices.

- Collaborate with product managers to translate business requirements into technical solutions.


System Design and Architecture:

- Apply sound system design principles and Clean Architecture to B2B and B2C infrastructures.

- Design scalable, performant, and resilient systems utilizing event-driven architectures.

- Identify and address potential bottlenecks, performance issues, and security vulnerabilities in system designs.


Development:

- Contribute directly to code development using TypeScript, Python, and Go.

- Experience in mobile development with at least one of the following: Swift (iOS), Java/Kotlin (Android), Flutter (Dart), React Native (JS).

- Utilize React and Redux for effective frontend development.


Cloud Infrastructure & DevOps:

- Deep understanding of AWS technologies (EC2, ELB, ELK, Route53).

- Working knowledge of Kubernetes, Terraform, HashiCorp Vault, Cloudflare, Jenkins, ArgoCD, Traefik, Ansible, Karpenter, Nginx, and Helm Charts.

- Proactively manage and optimize cloud-based infrastructure for performance and cost-efficiency.


Read more
DataGrokr

Posted by Reshika Mendiratta
Bengaluru (Bangalore)
5 - 8 yrs
Upto ₹30L / yr (varies)
Data engineering
Python
SQL
ETL
Data Warehouse (DWH)
+12 more

Lightning Job By Cutshort⚡

 

As part of this feature, you can expect status updates about your application and replies within 72 hours (once the screening questions are answered).


About DataGrokr

DataGrokr (https://www.datagrokr.com) is a cloud native technology consulting organization providing the next generation of big data analytics, cloud and enterprise solutions. We solve complex technology problems for our global clients who rely on us for our deep technical knowledge and delivery excellence.

If you are unafraid of technology, believe in your learning ability and are looking to work amongst smart, driven colleagues whom you can look up to and learn from, you might want to check us out. 


About the role

We are looking for a Senior Data Engineer to join our growing engineering team. As a member of the team,

• You will work on enterprise data platforms, architect and implement data lakes both on-prem and in the cloud.

• You will be responsible for evolving technical architecture, design and implementation of data solutions using a variety of big data technologies. You will work extensively on all major public cloud platforms - AWS, Azure and GCP.

• You will work with senior technical architects on our client side to evolve an effective technology architecture and development strategy to implement the solution.

• You will work with extremely talented peers and follow modern engineering practices using agile methodologies.

• You will coach, mentor and lead other engineers and provide guidance to ensure the quality of and consistency of the solution.


Must-have skills and attitudes:

• Passion for data engineering, in-depth knowledge of some of the following technologies – SQL (expert level), Python (expert level), Spark (intermediate level), Big data stack of one of AWS/GCP.

• Hands on experience in data wrangling, data munging and ETL. Should be able to source data from anywhere and transform data to any shape using SQL, Python or Spark.

• Hands-on experience working with variable data structures like XML/JSON/AVRO, etc.

• Ability to create data models and architect data warehouse components

• Experience with version control (Git/Bitbucket, etc.)

• Strong understanding of Agile, experience with CI/CD pipelines and processes

• Ability to communicate with technical as well as non-technical audience

• Collaborating with various stakeholders

• Have led scrum teams, participated in Sprint grooming and planning sessions, work / effort sizing and estimation
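The data wrangling and ETL skills listed above can be sketched end to end with only the standard library: raw records are normalised in Python, loaded into SQLite, and reshaped with SQL. Table names, columns, and data below are invented for illustration; in practice the same pattern runs on Spark or a cloud warehouse.

```python
# Hedged sketch of a tiny extract-transform-load step using only the
# stdlib. All names and data are illustrative.
import sqlite3

raw = [
    {"order_id": "1", "region": " south ", "amount": "100.0"},
    {"order_id": "2", "region": "South", "amount": "50.5"},
    {"order_id": "3", "region": "north", "amount": "75.0"},
]

def transform(row):
    # Normalise inconsistent source data before loading
    return (int(row["order_id"]),
            row["region"].strip().lower(),
            float(row["amount"]))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [transform(r) for r in raw])

# Reshape with SQL: total amount per cleaned region
totals = dict(conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region"
).fetchall())
```

The split of duties is the point: Python for cleaning irregular input, SQL for set-oriented aggregation.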


Desired Skills & Experience:

• At least 5 years of industry experience

• Working knowledge of any of the following - AWS Big Data Stack (S3, Redshift, Athena, Glue, etc.), GCP Big Data Stack (Cloud Storage, Workflow, Dataflow, Cloud Functions, Big Query, Pub Sub, etc.).

• Working knowledge of traditional enterprise data warehouse architectures and migrating them to the Cloud.

• Experience with data visualization tools (Tableau, Power BI, etc.)

• Experience with JIRA, Azure DevOps, etc.


How will DataGrokr support you in your growth:

• You will be groomed and mentored by senior leaders to take on leadership positions in the company

• You will be actively encouraged to attain certifications, lead technical workshops and conduct meetups to grow your own technology acumen and personal brand

• You will work in an open culture that promotes commitment over compliance, individual responsibility over rules and bringing out the best in everyone.

Read more
Silq

Posted by Akshara Jeeva
Bengaluru (Bangalore)
2 - 4 yrs
Upto ₹23L / yr (varies)
Python
SQL
PowerBI

Lightning job by Cutshort:


As part of this feature, you can expect status updates about your application and replies within 72 hours (once the screening questions are answered).


About Silq: 

At Silq, we aim to build the largest sustainable global shipping network with a 40% Gross Margin by leveraging people, data, & technology.


Silq’s people-powered technology makes moving shipments a breeze. Our proprietary technology allows our customers to design custom task workflows built to their specifications and their specifications alone. Silq’s people supplement our robust technology with information straight from the factory floor, minimizing exceptions and consolidating shipments more efficiently. This is how Silq’s people-powered technology helps our customers save money on their shipping spend from day 1.


About the role:

We are seeking a highly analytical and detail-oriented Data Analyst to join our dynamic team. The ideal candidate will possess a strong analytical mindset, excellent problem-solving skills, and a keen eye for interpreting and presenting data. The role involves collecting, processing, and analyzing large datasets to derive meaningful insights, contributing to informed business decisions.


In this role, you will


Build:

  • Develop and maintain databases, data systems, and dashboards for efficient data analysis and reporting, enabling stakeholders to manage the business and make informed decisions
  • Build reports and dashboards that give stakeholders timely, flexible, and structured access to their data
  • Develop test scripts to validate data accuracy and address data latency issues


Execute/Support:

  • Write scripts to collect, clean, and analyze large datasets
  • Prepare reports for management describing trends, patterns, and predictions using relevant data
  • Conduct thorough analyses of business processes, workflows, and systems to identify opportunities for improvement

Partner:

  • Collaborate with cross-functional teams (Product Managers, Operations, Client Services and Sales) to understand data requirements and provide actionable insights
  • Work closely with stakeholders to gather, document, and prioritize business requirements


In this role, you will need


  • Bachelor’s degree in Information Technology, Computer Science, Mathematics, Statistics, or a related field

  • Prior experience of 2-4 years
  • Proficient in using business analysis tools and methodologies.
  • Proficient in data analysis tools and languages (e.g., SQL, Python)
  • Experience with data visualization tools (e.g., Metabase, Tableau, Power BI)
  • Should have hands on experience with DBT or Airflow
  • Strong analytical and problem-solving skills
  • Excellent written and verbal communication skills
  • Familiarity with business process modeling and analysis techniques
  • Work both independently & collaboratively in a fast-paced and dynamic work environment
  • Knowledge of project management principles is a plus
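The "collect, clean, and analyze" part of the role can be sketched with only the standard library. The shipment fields, lane codes, and numbers below are made up for illustration; a real analyst script would read the same shape of data from SQL or a CSV export and would likely use pandas.

```python
# Hedged sketch of an analyst-style summary: group records and report
# an average per group. All field names and values are illustrative.
from collections import defaultdict
from statistics import mean

shipments = [
    {"lane": "IN-US", "transit_days": 21},
    {"lane": "IN-US", "transit_days": 25},
    {"lane": "IN-EU", "transit_days": 18},
]

def avg_transit_by_lane(rows):
    # Group transit times by lane, then average each group
    groups = defaultdict(list)
    for row in rows:
        groups[row["lane"]].append(row["transit_days"])
    return {lane: mean(days) for lane, days in groups.items()}

report = avg_transit_by_lane(shipments)
```

The output of a script like this is what feeds the Metabase/Tableau/Power BI dashboards the posting mentions.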


Read more
Publicis Sapient

Posted by Mohit Singh
Bengaluru (Bangalore), Pune, Hyderabad, Gurugram, Noida
5 - 11 yrs
₹20L - ₹36L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+7 more

Publicis Sapient Overview:

As a Senior Associate in Data Engineering at Publicis Sapient, you will translate client requirements into technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and will independently drive design discussions to ensure the health of the overall solution.

Job Summary:

As Senior Associate L2 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and will independently drive design discussions to ensure the health of the overall solution.

The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python; experience in data ingestion, integration, and wrangling; experience with computation and analytics pipelines; and exposure to Hadoop ecosystem components. You are also required to have hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms.


Role & Responsibilities:

Your role is focused on Design, Development and delivery of solutions involving:

• Data Integration, Processing & Governance

• Data Storage and Computation Frameworks, Performance Optimizations

• Analytics & Visualizations

• Infrastructure & Cloud Computing

• Data Management Platforms

• Implement scalable architectural models for data processing and storage

• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time mode

• Build functionality for data analytics, search and aggregation

Experience Guidelines:

Mandatory Experience and Competencies:

# Competency

1. Overall 5+ years of IT experience, with 3+ years in data-related technologies.

2. Minimum 2.5 years of experience in Big Data technologies and working exposure to related data services on at least one cloud platform (AWS / Azure / GCP).

3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required in building end-to-end data pipelines.

4. Strong experience in at least one of the programming languages Java, Scala, or Python (Java preferred).

5. Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.

6. Well-versed, working knowledge of data-platform-related services on at least one cloud platform, plus IAM and data security.
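The batch and real-time ingestion duties above share one core idea worth seeing concretely: micro-batching, as popularised by Spark Streaming, chops an unbounded source into fixed-size batches that a pipeline stage processes one at a time. The sketch below is a stdlib-only illustration under that assumption, with a trivial stand-in for the processing stage.

```python
# Hedged sketch of micro-batching: an unbounded event source is
# consumed in fixed-size batches. The source and the processing stage
# are both stand-ins for illustration.
from itertools import islice

def micro_batches(events, batch_size):
    it = iter(events)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch

def process(batch):
    # Stand-in for a real stage (parse, enrich, write to a sink)
    return sum(batch)

stream = range(10)  # stand-in for an unbounded source such as a Kafka topic
results = [process(b) for b in micro_batches(stream, 4)]
```

Engines like Spark Streaming and Flink add fault tolerance, windowing, and distribution around exactly this loop.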


Preferred Experience and Knowledge (Good to Have):

# Competency

1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres) with hands-on experience.

2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.

3. Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search and indexing, and microservices architectures.

4. Performance tuning and optimization of data pipelines.

5. CI/CD – infra provisioning on cloud, automated build and deployment pipelines, code quality.

6. Cloud data specialty and other related Big Data technology certifications.


Personal Attributes:

• Strong written and verbal communication skills

• Articulation skills

• Good team player

• Self-starter who requires minimal oversight

• Ability to prioritize and manage multiple tasks

• Process orientation and the ability to define and set up processes


Read more
Publicis Sapient

Posted by Mohit Singh
Bengaluru (Bangalore), Gurugram, Pune, Hyderabad, Noida
4 - 10 yrs
Best in industry
PySpark
Data engineering
Big Data
Hadoop
Spark
+6 more

Publicis Sapient Overview:

As a Senior Associate in Data Engineering at Publicis Sapient, you will translate client requirements into technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and will independently drive design discussions to ensure the health of the overall solution.

Job Summary:

As Senior Associate L1 in Data Engineering, you will create technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and will independently drive design discussions to ensure the health of the overall solution.

The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python; experience in data ingestion, integration, and wrangling; experience with computation and analytics pipelines; and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is preferable.


Role & Responsibilities:

Job Title: Senior Associate L1 – Data Engineering

Your role is focused on Design, Development and delivery of solutions involving:

• Data Ingestion, Integration and Transformation

• Data Storage and Computation Frameworks, Performance Optimizations

• Analytics & Visualizations

• Infrastructure & Cloud Computing

• Data Management Platforms

• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time

• Build functionality for data analytics, search and aggregation


Experience Guidelines:

Mandatory Experience and Competencies:

# Competency

1. Overall 3.5+ years of IT experience, with 1.5+ years in data-related technologies.

2. Minimum 1.5 years of experience in Big Data technologies.

3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required in building end-to-end data pipelines. Working knowledge of real-time data pipelines is an added advantage.

4. Strong experience in at least one of the programming languages Java, Scala, or Python (Java preferred).

5. Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.


Preferred Experience and Knowledge (Good to Have):

# Competency

1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres) with hands-on experience.

2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.

3. Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search and indexing, and microservices architectures.

4. Performance tuning and optimization of data pipelines.

5. CI/CD – infra provisioning on cloud, automated build and deployment pipelines, code quality.

6. Working knowledge of data-platform-related services on at least one cloud platform, plus IAM and data security.

7. Cloud data specialty and other related Big Data technology certifications.



Personal Attributes:

• Strong written and verbal communication skills

• Articulation skills

• Good team player

• Self-starter who requires minimal oversight

• Ability to prioritize and manage multiple tasks

• Process orientation and the ability to define and set up processes

Read more
Arting Digital
Posted by Pragati Bhardwaj
Bengaluru (Bangalore)
10 - 16 yrs
₹10L - ₹15L / yr
Databricks
Data modeling
SQL
Python
AWS Lambda
+2 more

Title:- Lead Data Engineer 


Experience: 10+ years

Budget: 32-36 LPA

Location: Bangalore 

Mode of Work: Work from office

Primary Skills: Databricks, Spark, PySpark, SQL, Python, AWS

Qualification: Any Engineering degree


Roles and Responsibilities:


• 8-10+ years’ experience in developing scalable Big Data applications or solutions on distributed platforms.

• Able to partner with others in solving complex problems by taking a broad perspective to identify innovative solutions.

• Strong skills building positive relationships across Product and Engineering.

• Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders.

• Able to quickly pick up new programming languages, technologies, and frameworks.

• Experience working in Agile and Scrum development processes.

• Experience working in a fast-paced, results-oriented environment.

• Experience in Amazon Web Services (AWS), mainly S3, Managed Airflow, EMR/EC2, IAM, etc.

• Experience working with data warehousing tools, including SQL databases, Presto, and Snowflake.

• Experience architecting data products in Streaming, Serverless, and Microservices architectures and platforms.

• Experience working with data platforms, including EMR, Airflow, Databricks (Data Engineering & Delta Lake components, and Lakehouse Medallion architecture), etc.

• Experience with creating/configuring Jenkins pipelines for a smooth CI/CD process for managed Spark jobs, building Docker images, etc.

• Experience working with distributed technology tools, including Spark, Python, and Scala.

• Working knowledge of data warehousing, data modelling, governance, and data architecture.

• Working knowledge of reporting and analytical tools such as Tableau, QuickSight, etc.

• Demonstrated experience in learning new technologies and skills.

• Bachelor’s degree in Computer Science, Information Systems, Business, or another relevant subject area.

Read more
Arting Digital
Posted by Pragati Bhardwaj
Bengaluru (Bangalore)
4 - 6 yrs
₹10L - ₹15L / yr
Databricks
Python
Spark
SQL
AWS Lambda

Title:- Senior Data Engineer 


Experience: 4-6 yrs

Budget: 24-28 lpa

Location: Bangalore 

Mode of Work: Work from office

Primary Skills: Databricks, Spark, PySpark, SQL, Python, AWS

Qualification: Any Engineering degree


Responsibilities:

• Design and build reusable components, frameworks, and libraries at scale to support analytics products.

• Design and implement product features in collaboration with business and technology stakeholders.

• Anticipate, identify, and solve issues concerning data management to improve data quality.

• Clean, prepare, and optimize data at scale for ingestion and consumption.

• Drive the implementation of new data management projects and restructuring of the current data architecture.

• Implement complex automated workflows and routines using workflow scheduling tools.

• Build continuous integration, test-driven development, and production deployment frameworks.

• Drive collaborative reviews of design, code, test plans, and dataset implementation performed by other data engineers in support of maintaining data engineering standards.

• Analyze and profile data for the purpose of designing scalable solutions.

• Troubleshoot complex data issues and perform root cause analysis to proactively resolve product and operational issues.

• Mentor and develop other data engineers in adopting best practices.

 

Qualifications:

 

Primary skillset:


• Experience working with distributed technology tools for developing batch and streaming pipelines using:

  o SQL, Spark, Python, PySpark [4+ years]
  o Airflow [3+ years]
  o Scala [2+ years]


• Able to write code which is optimized for performance.

• Experience in a cloud platform, e.g., AWS, GCP, Azure, etc.

• Able to quickly pick up new programming languages, technologies, and frameworks.

• Strong skills building positive relationships across Product and Engineering.

• Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders.

• Experience with creating/configuring Jenkins pipelines for a smooth CI/CD process for managed Spark jobs, building Docker images, etc.

• Working knowledge of data warehousing, data modelling, governance, and data architecture.
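The workflow-scheduling skill set above (Airflow in particular) rests on one simple idea: tasks form a directed acyclic graph, and the scheduler runs each task only after its upstream dependencies finish. The sketch below illustrates just that ordering with the stdlib `graphlib` module (Python 3.9+); the task names are invented, and real Airflow adds scheduling, retries, and distribution on top.

```python
# Hedged sketch of DAG-ordered task execution, the core idea behind
# workflow schedulers like Airflow. Task names are illustrative.
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

def run(dag, tasks):
    executed = []
    # static_order() yields tasks so that dependencies come first
    for name in TopologicalSorter(dag).static_order():
        tasks[name]()          # each "task" is just a callable here
        executed.append(name)
    return executed

order = run(dag, {name: (lambda: None) for name in dag})
```

An Airflow DAG file expresses the same graph declaratively with operators and `>>` dependencies.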

 

Good to have:


• Experience working with data platforms, including EMR, Airflow, Databricks (Data Engineering & Delta Lake components, and Lakehouse Medallion architecture), etc.

• Experience working in Agile and Scrum development processes.

• Experience in EMR/EC2, Databricks, etc.

• Experience working with data warehousing tools, including SQL databases, Presto, and Snowflake.

• Experience architecting data products in Streaming, Serverless, and Microservices architectures and platforms.

Read more
Bengaluru (Bangalore), Chennai
6 - 15 yrs
Best in industry
NodeJS (Node.js)
MongoDB
Mongoose
Express
Python

Backend - Software Development Engineer III



Experience - 7+ yrs


About Wekan Enterprise Solutions


Wekan Enterprise Solutions is a leading Technology Consulting company and a strategic investment partner of MongoDB. We help companies drive innovation in the cloud by adopting modern technology solutions that help them achieve their performance and availability requirements. With strong capabilities around Mobile, IoT, and Cloud environments, we have an extensive track record helping Fortune 500 companies modernize their most critical legacy and on-premise applications, migrating them to the cloud and leveraging the most cutting-edge technologies.


Job Description


We are looking for passionate software engineers eager to be a part of our growth journey. The right candidate needs to be interested in working in high-paced and challenging environments leading technical teams, designing system architecture and reviewing peer code. Interested in constantly upskilling, learning new technologies and expanding their domain knowledge to new industries. This candidate needs to be a team player and should be looking to help build a culture of excellence. Do you have what it takes?


You will be working on complex data migrations, modernizing legacy applications, and building new applications on the cloud for large enterprises and/or growth-stage startups. You will have the opportunity to contribute directly to mission-critical projects, interacting with business stakeholders, customers’ technical teams, and MongoDB Solutions Architects.


Location - Chennai or Bangalore

  • Relevant experience of 7+ years building high-performance back-end applications with at least 3 or more projects delivered using the required technologies
  • Good problem solving skills
  • Strong mentoring capabilities
  • Good understanding of software development life cycle
  • Strong experience in system design and architecture
  • Strong focus on quality of work delivered
  • Excellent verbal and written communication skills



Required Technical Skills


  • Extensive hands-on experience building high-performance web back-ends using Node.js and JavaScript/TypeScript
  • Strong experience with the Express.js framework
  • Working experience with Python web app development or Python scripting
  • Implementation experience in monolithic and microservices architecture
  • Hands-on experience with data modeling on MongoDB and any other Relational or NoSQL databases
  • Experience integrating with any 3rd party services such as cloud SDKs (AWS, Azure) authentication, etc.
  • Hands-on experience with Kafka, RabbitMQ or any similar technologies.
  • Exposure to unit testing with frameworks such as Mocha, Chai, Jest, or others
  • Strong experience writing and maintaining clear documentation



Good to have skills:


  • Experience working with common services in any of the major cloud providers - AWS or GCP or Azure

Technical certifications in AWS / Azure / GCP / MongoDB or other relevant technologies

Read more
GoComet

Posted by Abhimanyu Shrimal
Bengaluru (Bangalore)
5 - 8 yrs
₹20L - ₹35L / yr
NodeJS (Node.js)
MongoDB
Mongoose
Express
Ruby on Rails (ROR)
+6 more

As a Tech Lead I (Backend) at GoComet, you will be pivotal in driving backend development initiatives. Your role will involve leading the design and implementation of robust backend systems, ensuring high performance and responsiveness to requests from the front-end. You will work closely with other team leads and departments to define and meet technical and business requirements.


Roles and Responsibilities:


Backend vision & ownership: Architect and implement scalable and efficient backend solutions, contributing significantly to the design and overall architecture, keeping in mind a long term vision of the product & scale.

Optimize Performance: Focus on optimizing the application for maximum speed and scalability, including database optimization, effective caching strategies, improving code health, code design & refactoring, etc.

Technical Innovation: Suggest innovative solutions wherever required for problems related to new product requirements & scale related issues. Stay abreast of emerging backend technologies and trends, advocating and leading the adoption of technologies that will give a competitive edge to our products.

Mentor and Guide: Provide mentorship and guidance to backend development team members, ensuring the dissemination of best practices and efficient use of technology stacks. Take ownership of hiring & building a high quality backend team.

Quality Assurance: Uphold high standards in code quality, including rigorous code review processes, and ensure adherence to industry best practices in backend development.
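As a concrete instance of the caching strategies mentioned under performance optimization, here is a minimal Python sketch using memoization. The `freight_rate` function and its values are purely illustrative stand-ins for an expensive database or API lookup.

```python
from functools import lru_cache

calls = {"count": 0}

@lru_cache(maxsize=256)
def freight_rate(route):
    # Stand-in for an expensive DB or API lookup; the route name is hypothetical
    calls["count"] += 1
    return len(route) * 10

first = freight_rate("BOM-NYC")
second = freight_rate("BOM-NYC")   # cache hit: the function body does not run again
```

The same idea scales up to Redis-backed caches on hot endpoints; the design question is always cache-key choice and invalidation, not the lookup itself.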


Experience & Skills:


Backend Expertise: 4-6 years of experience in backend development, with a deep understanding of server-side logic, architecture design & handling scalability issues.

Cloud & DevOps exposure: Experience with Cloud (GCP or AWS) with a deep understanding of different services, resource utilisation, CI/CD pipelines, observability tools (Sentry, New Relic / Datadog, etc.) and an ability to debug infra-related issues.

Strong Technical Skills: Strong expertise in a backend development framework (preferably Ruby on Rails but not mandatory), SQL & NoSQL database technologies (like PostgreSQL and MongoDB), caching (Redis), async job frameworks (like Sidekiq), etc.

Problem-Solving Ability: Strong analytical and problem-solving skills, research ability with a proven track record of delivering high-quality software solutions.

Leadership Skills: Demonstrated experience in mentoring backend development teams and taking complete tech ownership.

Communication Skills: Excellent verbal and written communication skills, with the ability to clearly articulate technical challenges and solutions to both technical and non-technical team members.

Continuous Learning: A commitment to continuous learning and staying current with the latest backend development trends and technologies.


Benefits:

● Collaborative and innovative work environment.

● Dynamic & high-performing team that thrives in a fast-paced environment.

● Opportunities for professional growth and development.

● Cutting-edge technology stack and tools.

● Make a significant impact on our products and the user experience.

● Join a team of passionate, creative, and driven individuals.

● 5 Days working.

● Flexible Working Hours.

● Experience good work culture with regular fun activities.

● Health medical insurance coverage with Family, etc.


Why Gocomet?

GoComet is a dynamic SaaS start-up that provides AI-powered transportation visibility solutions to revolutionize the trillion-dollar logistics sector. At GoComet, we are revolutionizing the logistics sector one day at a time, and every team member is committed to making it a reality. We're seeking individuals who embody our core values, character, and attitude. While we recognize that skills can be developed with the right mindset and learnability, we prioritize those who share our philosophy. Our recruitment processes reflect this belief. Look no further if you're looking for a diverse, talented, and vibrant workplace that recognizes and rewards hard work. We're ambitious, fast-paced, and unafraid to experiment, fail, learn, and ultimately succeed. This is us! Join our team if you share our culture and values. We're an equal-opportunity employer. We welcome qualified applicants from all races, colors, religions, sexes, nationalities, sexual orientations, gender identities, and abilities.


Read more
Education
Agency job
via The Hub by Sridevi Viswanathan
Bengaluru (Bangalore)
1 - 4 yrs
₹1L - ₹10L / yr
Python
Django
Javascript
HTML/CSS
RESTful APIs

Job Title: Django Developer

Job Overview:

We are seeking a skilled Django developer to join our dynamic team. The ideal candidate will have hands-on experience with the Django framework, proficiency in Python, and a strong understanding of web development best practices. The Django developer will be responsible for designing, implementing, testing, and maintaining web applications that meet our clients' needs.

Responsibilities:

Design and develop robust, scalable, and secure web applications using the Django framework.

Collaborate with cross-functional teams to define, design, and ship new features.

Write clean, maintainable, and efficient code.

Integrate user-facing elements using HTML, CSS, and JavaScript. Implement and maintain RESTful APIs.

Collaborate with front-end developers to integrate user-facing elements with server-side logic.

Work with databases, including designing schemas, writing queries, and optimising performance.

Troubleshoot, debug, and resolve issues in the development and production environments.

Stay up-to-date with the latest industry trends, technologies, and best practices.

Participate in code reviews and provide constructive feedback to peers.

Requirements:

Proven experience as a Django developer or similar role.

Strong understanding of Python and the Django web framework.

Experience with front-end technologies, including HTML, CSS, and JavaScript.

Knowledge of relational databases, ORM (Object-Relational Mapping), and database design.

Familiarity with version control systems (e.g., Git).

Understanding of web security best practices.
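For candidates wondering what the RESTful API work looks like in practice, here is a hedged, framework-free sketch of a JSON endpoint using the WSGI request/response contract that Django itself is built on. The route, book data, and helper names are invented for illustration only.

```python
import json
from wsgiref.util import setup_testing_defaults

def app(environ, start_response):
    # Minimal JSON endpoint; a Django view follows the same request -> response contract
    if environ["PATH_INFO"] == "/api/books/" and environ["REQUEST_METHOD"] == "GET":
        body = json.dumps({"books": [{"id": 1, "title": "1984"}]}).encode()
        start_response("200 OK", [("Content-Type", "application/json")])
        return [body]
    start_response("404 Not Found", [("Content-Type", "application/json")])
    return [json.dumps({"error": "not found"}).encode()]

# Exercise the app in-process, without starting a server
environ = {}
setup_testing_defaults(environ)        # fills in REQUEST_METHOD="GET", etc.
environ["PATH_INFO"] = "/api/books/"
captured = {}
def start_response(status, headers):
    captured["status"] = status
payload = json.loads(b"".join(app(environ, start_response)))
```

In Django the routing and serialization would be handled by `urls.py` and `JsonResponse`, but the shape of the work — map a path and method to a JSON payload — is the same.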

Read more
Loyalytics

at Loyalytics

2 recruiters
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore), Delhi, Gurugram
2 - 5 yrs
Upto ₹20L / yr (Varies)
Artificial Intelligence (AI)
Machine Learning (ML)
Large Language Models (LLM) tuning
Large Language Models (LLM)
Deep Learning
+2 more

Lightning Job By Cutshort⚡

 

As part of this feature, you can expect status updates about your application and replies within 72 hours (once the screening questions are answered).

 

About the job


This role is for an experienced Artificial Intelligence Engineer on the Marketing Data Science team at HP. This individual will be responsible for helping the team to shape its AI strategy, specifically through the creation of AI models, such as Large Language Models (LLMs) and Natural Language Processing (NLP), for Marketing. Marketers are tasked with combing through tons of research and data to find appropriate insights; there is oftentimes too much information and too little time. This individual will provide Marketers with curated AI tools that help them to have the right information at the right time to make the best decisions. This is a fantastic opportunity to be part of cutting-edge AI work that enables Marketing to drive innovation and business growth.


Responsibilities

  • Design the architecture and environment to enable AI models to run
  • Assess which environments are most appropriate, whether current compute is sufficient, and whether we should choose an existing model or pretrain our own
  • Lead AI model construction to both evolve existing capabilities and create new ones
  • Apply AI to Marketing use cases to improve customer intelligence and data analytics
  • Teach Data Scientists how to use the created LLMs to analyze structured and unstructured data
  • Build AI applications and interfaces at scale for the bigger Marketing organization to use
  • Advise the team on best practices for AI models (i.e. parameters, data structure, etc.)
  • Evaluate LLM performance using different applications and file formats to identify the best solution
  • Debug and identify causality for issues in the AI results
  • Research new AI models and trends to understand how we can maximize the usage of our existing models and/or whether we should adopt new models



Requirements

  • Strong knowledge of ML and Deep Learning in the context of LLMs, NLP and chatbots
  • Experience working with AI models using real-life (“industry”) data
  • Expert Python skills (PyTorch/TensorFlow, Pandas, NumPy, Matplotlib, creating functions)
  • Mastery in tool usage (AWS, Jupyter notebook, Streamlit or similar UIs)
  • Expertise in building and fine-tuning LLMs; experience with PEFT methods and alignment with human feedback (RLHF)
  • Expertise in Prompt Engineering using various AI models
  • Understanding of the differences and advantages of Llama, Falcon, and other open-source LLMs
  • Experience utilizing structured and unstructured data in LLMs (CSV, JSON, TXT, PDF)
  • Mastery in database engineering (relational and vector databases/embeddings)
  • Proficiency in AI frameworks (LangChain, Agents, Transformer architecture, etc.)
  • Experience utilizing both Generative and Discriminative AI models for data querying and representation
  • In-depth knowledge of big data platforms, statistical software and machine learning algorithm packages
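As a small illustration of the prompt-engineering skill listed above, the sketch below composes a retrieval-augmented prompt using only Python's stdlib `Template`. The wording, field names, and survey figure are invented for this example, not an actual production prompt.

```python
from string import Template

# Retrieval-augmented prompt template: retrieved context is spliced in
# alongside the user question before the prompt is sent to an LLM.
PROMPT = Template(
    "You are a marketing research assistant.\n"
    "Context:\n$context\n\n"
    "Question: $question\n"
    "Answer concisely and cite the context."
)

prompt = PROMPT.substitute(
    context="Q3 survey: 62% of respondents preferred bundled offers.",
    question="Which offer format did most respondents prefer?",
)
```

Frameworks like LangChain wrap the same idea in `PromptTemplate` objects and chain the filled prompt into a model call; the craft is in what goes into `$context` and how the instructions constrain the answer.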


Benefits

  • Be a part of the dynamic and fun team, making an impact in the retail tech industry with clients like Lulu, Aster, GMG etc.
  • Flexible Work Hours.
  • Wellness & Family Benefits.
  • Access to various learning platforms.


Read more
Drivex Mobility Pvt Ltd
Roshan Bindra
Posted by Roshan Bindra
Bengaluru (Bangalore)
4 - 7 yrs
Best in industry
React.js
Django
API
API Design
Python

About Us:

DriveX is India’s largest used 2-wheeler platform founded by Narain Karthikeyan, India’s first Formula 1 racer. DriveX is a fully stacked platform having capabilities across the pre-owned vehicle value chain from procurement to refurbishment to retail and service of pre-owned two-wheelers.

Website- https://www.drivex.in

Location- Bangalore


Job Overview:

We are looking for a talented Python Developer to join our tech team. The successful candidate will be responsible for developing server-side logic, integrating front-end components, and supporting the development of scalable and high-performance applications. The Python Developer will collaborate with cross-functional teams to deliver high-quality software solutions.


Responsibilities:

1. Develop server-side logic using Python for seamless integration with front-end components.

2. Design and implement scalable and high-performance applications.

3. Collaborate with front-end developers to integrate user-facing elements with server-side logic.

4. Implement security and data protection features.

5. Work closely with the product management team to define and implement new features.

6. Optimize applications for maximum speed and scalability.

7. Collaborate with other team members and stakeholders to deliver software solutions that align with business requirements.

8. Stay up-to-date with industry trends and technologies, and apply them to the development process.


Requirements:

1. Proven 3+ years of experience as a Python Developer.

2. Strong proficiency in Python, with a good understanding of its ecosystem.

3. Knowledge of server-side templating languages such as Jinja2 or EJS.

4. Understanding of front-end technologies, such as HTML5, CSS3, and JavaScript.

5. Experience with data migration, transformation, and scripting.

6. Proficient understanding of code versioning tools, such as Git.

7. Knowledge of authentication and authorization mechanisms.

8. Familiarity with common front-end development tools, such as Babel, Webpack, etc.

9. Strong problem-solving skills and attention to detail.

10. Excellent communication and collaboration skills.

11. Ability to work well in a team-oriented, collaborative environment.

12. Experience with microservices architecture is a plus.
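The server-side templating requirement (Jinja2/EJS) can be illustrated in miniature with the stdlib `string.Template`, which captures the same substitution idea Jinja2 extends with loops and filters. The page content and variable names here are hypothetical.

```python
from string import Template

# Server-side templating sketch: a template plus a context dict produces HTML.
# Jinja2 (named in the posting) follows the same model with richer syntax.
page = Template("<h1>Hello, $name!</h1><p>You have $count new listings.</p>")
html = page.substitute(name="Rider", count=3)
```

In a Django or Flask app the framework renders templates like this per request, with the context coming from view logic and database queries.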



Read more
Merito

at Merito

1 video
2 recruiters
Jinita Sumaria
Posted by Jinita Sumaria
Bengaluru (Bangalore), Gurugram
3 - 5 yrs
₹15L - ₹20L / yr
Software Testing (QA)
SQL
Python

About Company :-


Our Client is the world’s largest media investment company and is a part of WPP. In fact, we are responsible for one in every three ads you see globally. We are currently looking for a Manager to join us. As part of the largest media agency in India, you’ll have the opportunity to leverage the scale that comes with the job. You will become an integral part of this growing team and will be working with both internal teams and external parties to ensure campaign delivery objectives are met.

This team is responsible for delivering international solutions, particularly in APAC & EMEA, with some global influence. You will enjoy working in a collaborative team environment and will hold a ‘can do’ attitude with the passion to learn and grow.

At APAC, our people are our strength, which is why fostering a culture of diversity and inclusion is important to us.




Key Responsibilities:


- Lead QA team for effective testing, validation, and verification.

- Implement QA processes and standards.

- Design and execute test plans, cases, and automated scripts.

- Utilize SQL, Excel, and Python for data analysis and testing strategies.

- Validate data across platforms and reports.

- Analyze data for system improvement.

- Monitor and report key quality metrics.

- Develop and maintain process documentation.
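A minimal sketch of the SQL-plus-Python validation work described above, using an in-memory SQLite table. The table name, campaign names, and validation rule are invented for illustration.

```python
import sqlite3

# Illustrative data-validation check against campaign-delivery data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE deliveries (campaign TEXT, impressions INTEGER)")
conn.executemany(
    "INSERT INTO deliveries VALUES (?, ?)",
    [("brand_a", 1200), ("brand_b", 0), ("brand_c", 540)],
)

# Validation rule: every campaign must report positive impressions
failures = [row[0] for row in conn.execute(
    "SELECT campaign FROM deliveries WHERE impressions <= 0")]
```

In practice the same pattern runs against production warehouses, with the failing rows feeding the quality metrics and reports mentioned in the responsibilities.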



Requirements

- Bachelor’s degree in CS, IT, Engineering, or related field.

- 3+ years of QA experience.

- Proficient in SQL, Excel, and Python.

- Strong knowledge of QA methodologies and tools.

- Experience in commerce or digital marketing preferred.

- Demonstrated leadership skills.

- Excellent problem-solving and communication.

- Proactive in task optimization.

- Systems approach to problem-solving.

- Strong attention to detail.


Read more
Intellectyx Data Science India Private Limited
Divya DharshiniV
Posted by Divya DharshiniV
Remote, Tamilnadu, kerala, Bengaluru (Bangalore)
6 - 13 yrs
₹1L - ₹15L / yr
Selenium WebDriver
Python
cloud platform
Java

Attaching the Job description for reference -

Role : Automation Tester

EXP : 5+ yrs

Mode : WFO

Location : Coimbatore

Role Description

As an Automation Cloud QA Engineer, you will be responsible for designing, implementing, and executing automated tests for cloud-based applications. You will work closely with the development and DevOps teams to ensure the quality and reliability of software releases in a cloud environment. The ideal candidate will have a strong background in test automation, cloud technologies, and a deep understanding of quality assurance best practices.

Automation QA Engineer Responsibilities:

Automated Testing:

· Design, develop, and maintain automated test scripts for cloud-based applications using industry-standard tools and frameworks.

· Implement end-to-end test automation to validate system functionality, performance, and scalability.

Cloud Testing:

· Perform testing on cloud platforms (e.g., AWS, Azure, Google Cloud) to ensure applications function seamlessly in a cloud environment.

· Collaborate with DevOps teams to ensure continuous integration and deployment pipelines are robust and reliable.

Test Strategy and Planning:

· Contribute to the development of test strategies, test plans, and test cases for cloud-based applications.

· Work with cross-functional teams to define and implement quality metrics and standards.

Defect Management:

· Identify, document, and track defects using established tools and processes.

· Collaborate with developers and product teams to ensure timely resolution of issues.

Performance Testing:

· Conduct performance testing to identify and address bottlenecks, ensuring optimal application performance in a cloud environment.

Collaboration

· Work closely with development teams to understand system architecture and functionality.

· Collaborate with cross-functional teams to promote a culture of quality throughout the development lifecycle.

Requirements And Skills:

· Bachelor's degree in Computer Science, Engineering, or a related field.

· Proven experience as a QA Engineer, with a focus on automation and cloud technologies.

· Strong programming skills in languages like Python, Java, or other scripting languages.

· Experience with cloud platforms such as AWS, Azure, or Google Cloud.

· Proficiency in using automation testing frameworks (e.g., Selenium, JUnit, TestNG), Selenium is preferred.

· Solid understanding of software development and testing methodologies.

· Excellent problem-solving and analytical skills.

· Strong communication and collaboration skills.

· Maintain documentation for test environment
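A hedged sketch of the page-object style commonly used in Selenium test suites. `FakeDriver` stands in for a real `selenium.webdriver` instance so the example runs without a browser; the page URL and locators are hypothetical.

```python
import unittest

class FakeElement:
    def __init__(self):
        self.typed = None
    def send_keys(self, text):
        self.typed = text

class FakeDriver:
    """Stub standing in for a real selenium.webdriver instance."""
    def __init__(self):
        self.url = None
        self._elements = {}
    def get(self, url):
        self.url = url
    def find_element(self, by, value):
        return self._elements.setdefault((by, value), FakeElement())

class LoginPage:
    """Page object: keeps locators out of the test bodies."""
    URL = "https://app.example.test/login"   # hypothetical URL
    def __init__(self, driver):
        self.driver = driver
    def open(self):
        self.driver.get(self.URL)
    def enter_username(self, name):
        self.driver.find_element("id", "username").send_keys(name)

class TestLogin(unittest.TestCase):
    def test_open_and_type(self):
        driver = FakeDriver()
        page = LoginPage(driver)
        page.open()
        page.enter_username("qa_user")
        self.assertEqual(driver.url, LoginPage.URL)
        self.assertEqual(driver.find_element("id", "username").typed, "qa_user")

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestLogin)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Swapping `FakeDriver` for `webdriver.Chrome()` and the string locators for `By.ID` constants gives the real browser-driven version; the page objects themselves do not change, which is the point of the pattern.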




Read more
TIFIN FINTECH

at TIFIN FINTECH

1 recruiter
Vrishali Mishra
Posted by Vrishali Mishra
Bengaluru (Bangalore)
8 - 12 yrs
Best in industry
Python
Go Programming (Golang)
React.js
NodeJS (Node.js)

Technical Lead TIFIN

Mumbai, India


WHO WE ARE:

TIFIN is a fintech platform backed by industry leaders including JP Morgan, Morningstar, Broadridge, Hamilton Lane, Franklin Templeton, Motive Partners and a who’s who of the financial service industry. We are creating engaging wealth experiences to better financial lives through AI and investment intelligence powered personalization. We are working to change the world of wealth in ways that personalization has changed the world of movies, music and more but with the added responsibility of delivering better wealth outcomes.

We use design and behavioral thinking to enable engaging experiences through software and application programming interfaces (APIs). We use investment science and intelligence to build algorithmic engines inside the software and APIs to enable better investor outcomes.

In a world where every individual is unique, we match them to financial advice and investments with a recognition of their distinct needs and goals across our investment marketplace and our advice and planning divisions.


OUR VALUES: Go with your GUT

●     Grow at the Edge. We are driven by personal growth. We get out of our comfort zone and keep egos aside to find our genius zones. With self-awareness and integrity we strive to be the best we can possibly be. No excuses.

●     Understanding through Listening and Speaking the Truth. We value transparency. We communicate with radical candor, authenticity and precision to create a shared understanding. We challenge, but once a decision is made, commit fully.

●     I Win for Teamwin. We believe in staying within our genius zones to succeed and we take full ownership of our work. We inspire each other with our energy and attitude. We fly in formation to win together.

 

 


WHAT YOU'LL BE DOING: 

                   

As part of TIFIN’s Technology division, you’ll be in charge of the Engineering, Architecture and tech delivery for one of our products. Working directly with the executive team across Product, Design and Technology, you’ll have the opportunity to craft the future of cutting-edge Financial Products using the very latest technology.

                   

We are seeking a world-class Technology leader who can provide executive leadership for the Division’s Engineering organisation, building personalisation capabilities for financial advisors with an emphasis on integration capabilities. The VP will play a critical role in the growth of the Division and will be an integral member of the Executive team. This leader must have the market savvy and technical acumen to effectively partner both internally and externally, driving both strategy and execution, to ensure success as the preeminent AI-powered personalisation toolbox for RIAs.

 

THE ROLE:

  • Reports to TIFIN India’s CTO
  • Lead Project management of technology projects
  • Liaise with stakeholders to gather & understand the functional requirement
  • Develop / implement software code, conduct thorough end-to-end system testing, deploy code to production, and provide post go-live support.

WHO YOU ARE: 

  • 10+ years of experience in software
  • Proficient in Python, Django, Node.js, React.js, Pandas, GitHub/GitLab and Jira
  • Understanding of React.js and other MERN stack technologies is essential
  • Ability to lead, design and implement backend services
  • Ability to grow and lead technology teams and drive best practices and coding standards.
  • Experience with working on scalable interactive web applications
  • Infrastructure knowledge including AWS, GCP, etc.
  • Clear understanding of software design constructs and their implementation.
  • Strong communication (written and oral) and analytical problem-solving skills.
  • Strong sense of attention to detail, accountability, pride in delivering high quality work and willingness to learn.
  • Strong analytical and problem-solving skills with hands-on experience and startup mindset.
  • An understanding of or exposure to financial capital markets, various financial instruments (such as stocks, ETFs, Mutual Funds, etc.), and financial tools (such as Factset, Reuters, etc.) would be beneficial
  • Ability to own the user experience for the customer, and work across the stack from database schema, designing APIs

 

 

Job Location : Mumbai

 

Work Experience: 8+ years

 

BENEFITS PACKAGE:

TIFIN offers a competitive benefits package that includes:

· Performance linked variable compensation, including equity

· Medical insurance

· Tax-saving benefits

· Flexible PTO policy and Company-paid holidays

· Parental Leave: 6 months paid maternity, 2 weeks paid paternity leave

· Access to our Wellness trainers, including 1:1 personal coaching for executives and rising stars

 

A note on location. While we have team centers in Boulder, New York City, San Francisco, Charlotte, and Bangalore, this role is based out of Mumbai.

 

TIFIN is proud to be an equal opportunity workplace and values the multitude of talents and perspectives that a diverse workforce brings. All qualified applicants will receive consideration for employment without regard to race, national origin, religion, age, color, sex, sexual orientation, gender identity, disability, or protected veteran status.


Read more
My Yoga Teacher

at My Yoga Teacher

1 video
3 recruiters
MYT HR
Posted by MYT HR
Bengaluru (Bangalore)
2 - 4 yrs
₹10L - ₹20L / yr
Data Analytics
Attention to detail
SQL
Python

PRODUCT MANAGER

MyYogaTeacher is a US-based marketplace for connecting yoga teachers in India with students in the US for 1:1 yoga sessions. The teachers in India earn a good income - sometimes more than top engineers and MBAs - whilst providing US-based students with personalized yogic and spiritual teaching. MyYogaTeacher is headquartered in the US with a technology office in Bangalore and is founded by repeat founders and IIT graduates with a track record of building unicorn-level, successful companies. Check out myyogateacher.com

You are responsible for:

  • Specify product requirements to address needs for US based yoga students
  • Analyze consumer analytics data to understand student behavior in order to develop ideas that improve student conversion, usage and retention
  • Assess market competition by comparing the company's product to competitors' products
  • Communicate requirements to engineering team, design team, data analytics team, support team and marketing team
  • Conduct training to educate different organizations on the new products
  • Develop marketing collateral - text based and audio/visual content to drive the launch and usage of new products
  • Conduct conversations and surveys with students to better understand their usage of products and improvements that they would like. Understand the neuroscience of building habits and how they can build products, game mechanics and triggers to drive engagement and user adoption - in order to improve students health and happiness.
  • Align product requirements with overall company strategy.

You are qualified:

  • If you have 2 years of experience working as a core product manager
  • Graduated with a Bachelor's Degree in Engineering (B.E./B.Tech)

About you:

  • Technology competence: Comfortable with different user interface designs for mobile and desktop applications
  • Deep curiosity about neuroscience and human behavior: Well read books about human behavior, neuroscience, human and machine interfaces, habit formation, addictions etc. 
  • Empathy: Listens to students' stories, connects with them and understands their needs
  • Honesty/Integrity: Does not cut corners. Earns trust and maintains confidence. Do what is right, not just what is expedient. Speaks plainly and truthfully. 
  • Persistence: Demonstrates tenacity and willingness to go the distance to get something done
  • Attention to detail: Does not let important details slip through the cracks, well organized
  • Growth mindset: Learn quickly and not be afraid to fail. Demonstrates ability to quickly and proficiently understand and absorb new information. 
  • Follow through on commitments: Lives up to verbal and written agreements, regardless of personal cost. 
  • Enthusiasm: Exhibit passion, excitement and positive energy over work.
  • Teamwork: Reaches out to peers and cooperates with supervisors to establish an overall collaborative working relationship. 
  • Yoga & Spiritual ideas: Ability to talk about yoga and spiritual matters with students.
  • Great communication skills: Good understanding of self and motivations of audience in order to communicate effectively with different stakeholders - engineering, design, executive team, students and teachers
  • Student centric: Ability to put student interest above everything else. Demonstrate respect for customers and ensure that customers get the best service possible
  • Professional: Cares deeply about his/her profession and invests in professional growth. 
  • Respect for the team: Respects everybody's right to have opinions. Affirms the positive in other people
  • Risk-taking: Embrace challenges, try new approaches and be ok with all failures except a failure to learn

Would be nice to have someone who understands Yoga Practice(Ideal but not mandatory).



Read more
Elocity Technologies India Private Limited
Bengaluru (Bangalore)
2 - 4 yrs
₹5L - ₹15L / yr
DevOps
Ruby
Python
Kubernetes
Jenkins
+3 more


Job Title: DevOps Engineer

Experience: 2+ Years

Location: Bangalore, India

Elocity is a cleantech start-up striving to make the world a better place through technology innovations. We are building a global infrastructure for making the transition to electric vehicles viable, affordable, and sustainable by working closely with the utilities, governments, and public.

Headquartered out of Canada, we are a team of highly specialized domain experts and problem solvers enabling utilities, public and private sector entities to successfully manage the demands of electric vehicle charging and its infrastructure needs to pave the way for electromobility in future.

To know more visit https://elocitytech.com/

Responsibilities:

• Building and setting up new development tools and infrastructure

• Working on ways to automate and improve development and release processes

• Ensuring that systems are safe and secure against cybersecurity threats

• Working with software developers and software engineers to ensure that development follows established processes and works as intended

• Planning out projects and being involved in project management decisions

• Providing direct and responsive support for urgent analytic needs

• Participating in architecture and software development activities

• Using open-source technologies and tools to accomplish specific use cases encountered within the project

• Using coding languages or scripting methodologies to solve a problem with a custom workflow

• Providing Level 2 technical support

• Performing root cause analysis for production errors

• Investigating and resolving technical issues

• Designing procedures for system troubleshooting and maintenance
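Root cause analysis for production errors is often supported by small custom scripts. Below is a hedged Python sketch that aggregates error lines from a log excerpt; the log format, service names, and messages are made up for illustration.

```python
import re
from collections import Counter

# Hypothetical production log excerpt (format and services are invented)
LOGS = """\
2024-01-05 10:01:12 ERROR payment-svc timeout connecting to db
2024-01-05 10:01:14 INFO  charge-svc session started
2024-01-05 10:02:02 ERROR payment-svc timeout connecting to db
2024-01-05 10:03:55 ERROR auth-svc invalid token
"""

# Capture the service name and message from each ERROR line
pattern = re.compile(r"ERROR\s+(\S+)\s+(.+)")
errors = Counter()
for line in LOGS.splitlines():
    match = pattern.search(line)
    if match:
        errors[(match.group(1), match.group(2))] += 1

# The most frequent error is the first lead for root cause analysis
top_error, count = errors.most_common(1)[0]
```

In production the same aggregation is usually done in CloudWatch Logs Insights or the ELK stack, but an ad hoc script like this is a common first step when investigating an incident.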

Qualification & Requirement:

• BSc in Computer Science, Engineering, or relevant field

• Work experience as a DevOps Engineer or similar software engineering role

• Good knowledge of Ruby or Python

• Working knowledge of databases and SQL

• Familiarity with container orchestration services, especially Kubernetes.

• Familiarity with agile software development in Go, C/C++, Java, JavaScript

• Experience administering and deploying development CI/CD tools such as Git, Jira, GitLab, or Jenkins

• Significant experience with Windows and Linux operating system environments

• Experience with infrastructure scripting solutions such as PowerShell or Python

Skillset:

AWS: IAM Roles

Datastore: Postgres, ElasticSearch, S3

Runtime: Elastic Kubernetes Service, Docker

CI/CD: GitLab, ArgoCD

Service Mesh: Kafka, Istio, Gateway, ELB (Load Balancer)

Monitoring: CloudWatch, CloudTrail, Prometheus, Grafana, ELK (ElasticSearch)

Read more
Elocity Technologies India Private Limited
Bengaluru (Bangalore)
5 - 8 yrs
₹25L - ₹45L / yr
NodeJS (Node.js)
MongoDB
Mongoose
Express
Javascript
+8 more

Elocity is a cleantech start-up striving to make the world a better place through technology innovations. We are building a global infrastructure for making the transition to electric vehicles viable, affordable, and sustainable by working closely with the utilities, governments, and public.

Headquartered out of Canada, we are a team of highly specialized domain experts and problem solvers enabling utilities, public and private sector entities to successfully manage the demands of electric vehicle charging and its infrastructure needs to pave the way for electromobility in future.

To know more visit https://elocitytech.com/

Responsibilities:

  • Determines technical feasibility of features or solutions by evaluating problem, customer requirements, possible solutions and technology requirements.
  • Exercises judgement in prioritizing tasks and selecting methods and techniques for obtaining solutions.
  • Create low-level design of modules of a software application through proper documentation and
  • diagrams.
  • Develops software solutions by studying requirements, clarifying customer/user needs, analysing data
  • and processes and following established software development practices and processes.
  • Develops proof of concepts for technical evaluation and early customer feedback
  • Updates and shares knowledge by studying state-of-the-art development tools, programming
  • techniques, and computing technology; reading professional publications
  • Networks with internal and external personnel in own area of expertise.
Skills:

  • Good command of JavaScript/TypeScript. Knowledge of Java/Python will be a plus.
  • Experience in debugging/troubleshooting TypeScript code.
  • Experience in API development (REST, GraphQL, etc.)
  • Experience in development of web and mobile (Android/iOS) applications
  • Exposure to parallel and asynchronous programming
  • Experience in writing Unit tests (Jest or any similar framework)
  • Should be proficient in relational Database concepts (Postgres etc.)
  • Knowledge of Non-relational Databases would be a plus.
  • Good Understanding of Object-Oriented Programming Concepts.
  • Good Understanding of Design Patterns.
  • Good command of Data structures, Algorithms and Complexity.
  • Good at problem solving and analytical skills.
  • Experience with Source Code Versioning systems (Git etc)
  • Understanding of microservices architecture would be a plus
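Since the skills above call out parallel and asynchronous programming, here is a minimal illustrative sketch in Python (listed above as a plus language) of fetching several charger statuses concurrently with asyncio. The charger IDs and `fetch_status` function are hypothetical, not Elocity's actual API:

```python
import asyncio

async def fetch_status(charger_id: str) -> dict:
    # Stands in for a real network call to a charging station
    await asyncio.sleep(0.01)
    return {"id": charger_id, "status": "available"}

async def fetch_all(charger_ids: list) -> list:
    # gather() runs the coroutines concurrently instead of one by one,
    # and returns results in the same order as the inputs
    return await asyncio.gather(*(fetch_status(c) for c in charger_ids))

statuses = asyncio.run(fetch_all(["CH-1", "CH-2", "CH-3"]))
```

The same pattern applies in TypeScript with `Promise.all`; the point is issuing the I/O-bound calls concurrently rather than sequentially.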


Read more
Sizzle

at Sizzle

1 recruiter
Vijay Koduri
Posted by Vijay Koduri
Bengaluru (Bangalore)
3 - 7 yrs
₹10L - ₹15L / yr
Python
API
FAST API
SQLAlchemy
PostgreSQL
+8 more

Sizzle is an exciting new startup that’s changing the world of gaming.  At Sizzle, we’re building AI to automate gaming highlights, directly from Twitch and YouTube streams. We’re looking for a superstar Python expert to help develop and deploy our AI pipeline. The main task will be deploying models and algorithms developed by our AI team, and keeping the daily production pipeline running. Our pipeline is centered around several microservices, all written in Python, that coordinate their actions through a database. We’re looking for developers with deep experience in Python including profiling and improving the performance of production code, multiprocessing / multithreading, and managing a pipeline that is constantly running. AI/ML experience is a plus, but not necessary. AWS / docker / CI/CD practices are also a plus. If you are a gamer or streamer, or enjoy watching video games and streams, that is also definitely a plus :-)
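As a purely illustrative sketch of the pattern described above (Python services coordinating through a shared database), not Sizzle's actual code: the `jobs` table schema and the `process_job` stand-in below are assumptions for the example.

```python
from multiprocessing.pool import ThreadPool
import os
import sqlite3
import tempfile

def process_job(payload: str) -> str:
    # Stand-in for an AI component, e.g. a highlight-detection model
    return payload.upper()

def run_pipeline(db_path: str) -> list:
    # Each service claims pending jobs from the shared database,
    # fans them out to a worker pool, and writes results back.
    conn = sqlite3.connect(db_path)
    jobs = conn.execute(
        "SELECT id, payload FROM jobs WHERE status = 'pending'"
    ).fetchall()
    # A thread pool suits I/O-bound stages; CPU-bound model inference
    # would typically use multiprocessing.Pool instead.
    with ThreadPool(processes=2) as pool:
        results = pool.map(process_job, [payload for _, payload in jobs])
    for (job_id, _), result in zip(jobs, results):
        conn.execute(
            "UPDATE jobs SET status = 'done', result = ? WHERE id = ?",
            (result, job_id),
        )
    conn.commit()
    conn.close()
    return results

# Demo: seed a shared database with two pending jobs and run the pipeline
db_path = os.path.join(tempfile.mkdtemp(), "jobs.db")
with sqlite3.connect(db_path) as conn:
    conn.execute(
        "CREATE TABLE jobs (id INTEGER PRIMARY KEY, status TEXT,"
        " payload TEXT, result TEXT)"
    )
    conn.executemany(
        "INSERT INTO jobs (status, payload) VALUES ('pending', ?)",
        [("clip_a",), ("clip_b",)],
    )
results = run_pipeline(db_path)
```

A production pipeline would also need row claiming or locking so that concurrent services don't pick up the same job twice.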


You will be responsible for:

  • Building Python scripts to deploy our AI components into pipeline and production
  • Developing logic to ensure multiple different AI components work together seamlessly through a microservices architecture
  • Managing our daily pipeline on both on-premise servers and AWS
  • Working closely with the AI engineering, backend and frontend teams


You should have the following qualities:

  • Deep expertise in Python including:
  • Multiprocessing / multithreaded applications
  • Class-based inheritance and modules
  • DB integration including pymongo and sqlalchemy (we have MongoDB and PostgreSQL databases on our backend)
  • Understanding Python performance bottlenecks, and how to profile and improve the performance of production code including:
  • Optimal multithreading / multiprocessing strategies
  • Memory bottlenecks and other bottlenecks encountered with large datasets and use of numpy / opencv / image processing
  • Experience in creating soft real-time processing tasks is a plus
  • Expertise in Docker-based virtualization including:
  • Creating & maintaining custom Docker images
  • Deployment of Docker images on cloud and on-premise services
  • Experience with maintaining cloud applications in AWS environments
  • Experience in deploying machine learning algorithms into production (e.g. PyTorch, TensorFlow, OpenCV) is a plus
  • Experience with image processing in Python is a plus (e.g. OpenCV, Pillow)
  • Experience with running Nvidia GPU / CUDA-based tasks is a plus (Nvidia Triton, MLFlow)
  • Knowledge of video file formats (mp4, mov, avi, etc.), encoding, compression, and using ffmpeg to perform common video processing tasks is a plus.
  • Excited about working in a fast-changing startup environment
  • Willingness to learn rapidly on the job, try different things, and deliver results
  • Ideally a gamer or someone interested in watching gaming content online


Seniority: We are looking for a mid to senior level engineer


Salary: Will be commensurate with experience. 


Who Should Apply:

If you have the right experience, regardless of your seniority, please apply.

Work Experience:  4 years to 8 years


About Sizzle

Sizzle is building AI to automate gaming highlights, directly from Twitch and YouTube videos. Sizzle works with thousands of gaming streamers to automatically create highlights and social content for them. Sizzle is available at www.sizzle.gg. 



Read more
Bengaluru (Bangalore)
3 - 5 yrs
Up to ₹25L / yr (varies)
React.js
AngularJS (1.x)
Java
Python
Go Programming (Golang)
+3 more

Lightning Job By Cutshort ⚡

 

As part of this feature, you can expect status updates about your application and replies within 72 hours (once the screening questions are answered)


About enParadigm:


enParadigm is one of the world’s leading enterprise SaaS gamification technology companies. We are recognized among the fastest growing tech companies as part of the Deloitte Tech Fast 500 APAC program and have won multiple global accolades for our technology platforms in the gamification space. Our proprietary platform helps organizations across industries map different roles in their organization and the skills required for success in the roles. Our proprietary recommendation engines help create hyper-personalized and immersive AI based skill-building experiences for improving role-fit and performance. We work with over 500 global corporations such as Google, Amazon, P&G, Daimler, Asian Paints, Infosys, Societe Generale etc., to help drive growth and performance. We are funded by SALT Partners and Cornerstone Venture Partners and looking to grow exponentially on the path to $100 million ARR in the next few years.


For more details visit website www.enparadigm.com


Role Overview:


We are looking for a Full Stack Developer to produce scalable software solutions. You’ll be part of a team that’s responsible for the full software development life cycle, from conception to deployment. As a Full Stack Developer, you should be comfortable around both front-end and back-end coding languages, development frameworks and third-party libraries. You should also be a team player with a knack for visual design and utility. If you’re also familiar with Agile methodologies, we’d like to meet you.


Responsibilities:


● Work with development teams and product managers to ideate software solutions

● Design client-side and server-side architecture

● Build the front-end of applications through appealing visual design

● Develop and manage well-functioning databases and applications

● Write effective APIs

● Test software to ensure responsiveness and efficiency

● Troubleshoot, debug and upgrade software

● Create security and data protection settings

● Build features and applications with a mobile responsive design

● Write technical documentation


Requirements:


● Education: B.Tech/ BE Degree in Computer Science, Statistics or relevant field.

● Experience: Minimum of 3 years of experience as full stack developer or similar role.


● Skill set (role-based):

o Experience developing desktop and mobile applications

o Familiarity with common stacks

o Knowledge of multiple front-end languages and libraries (e.g. HTML/ CSS, JavaScript, XML, jQuery)

o Knowledge of multiple back-end languages (e.g. Java, Python, PHP) and JavaScript frameworks (e.g. Angular, React, Svelte, Node.js)

o Familiarity with databases (e.g. PostgreSQL, MySQL, MongoDB), web servers (e.g. Apache) and UI/UX design


● Skill set (behaviour-based):

o Excellent communication and teamwork skills

o Great attention to detail

o Organizational skills

o An analytical mind


Current Technologies used in enParadigm:


● FastAPI (active), PHP (legacy), Java (legacy)

● Svelte, TS, JS

It's a plus if you have worked with Python and PHP already, but it's not mandatory.

Read more
Eightysix Media
Abhinav Nair
Posted by Abhinav Nair
Bengaluru (Bangalore)
2 - 4 yrs
₹10L - ₹13L / yr
HTML/CSS
Java
React.js
Vue.js
Ruby
+2 more

You will play a pivotal role in developing and maintaining both the front-end and back-end of web applications. You will work closely with cross-functional teams, including designers and product managers, to bring concepts to life. The ideal candidate is passionate about technology, thrives in a collaborative environment, and possesses expertise in both front-end and back-end development.

Key Responsibilities:

  • Web Application Development: Develop, test, and maintain web applications and digital solutions.
  • Front-End Development: Create responsive and user-friendly interfaces using HTML, CSS, and JavaScript frameworks (e.g., React, Angular, Vue.js).
  • Back-End Development: Develop server-side logic, APIs, and databases using programming languages (e.g., Node.js, Python, Ruby, Java).
  • Database Management: Design and manage databases, optimize query performance, and ensure data security.
  • Integration: Integrate third-party APIs, services, and tools to enhance application functionality.
  • Testing: Perform unit and integration testing to identify and resolve issues and ensure the reliability of applications.
  • Code Optimization: Optimize code for performance, scalability, and security.
  • Collaboration: Collaborate with cross-functional teams to translate business requirements into technical solutions.
  • Documentation: Create and maintain technical documentation for development projects.
  • Stay Current: Keep up-to-date with emerging technologies, trends, and best practices in web development.


Read more
Lufthansa Technik Services India
Bengaluru (Bangalore)
5 - 10 yrs
₹15L - ₹25L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+4 more

Responsibility :


- Analyze complex data to identify patterns, detect anomalies in data using statistical tools as well as machine learning algorithms if required.


- Drive product and process improvements across the broad spectrum of projects in the Item and Inventory organization


- Find workable solutions in case of data inconsistency and inconclusive data.


- Drive projects with minimal guidance. Provide thought leadership by researching best practices and conducting experiments.


- Evaluate various analytical/statistical methods and procedures and provide recommendations of relevance, applicability, and efficiency.


- Conduct data storytelling to stakeholders through rich visualization and presentation.


- Able to communicate internally and externally through publication, presentations, and other mediums on research progress, major breakthroughs, and product innovation.


- Effectively coach junior data analysts to work through ambiguous issues, integrate input from stakeholders across the company


Technical Skills:


1. Experience with large data sets/ data warehouses, including the ability to extract data from data sets using advanced scripting or SQL.


2. Expert Knowledge in Microsoft Excel, Access, and VBA.


3. Proficiency in Data Visualization using Tableau or Power BI, or similar tools.


4. Experience with Python to help facilitate predictive analytics.


5. Strong analytical mindset with excellent logical, reasoning, and problem-solving skills.


6. Background in data analytics, business intelligence and reporting.


Qualifications :


- Bachelor's / Master's degree or related discipline with 5 to 8+ years of experience (5+ relevant).


- Minimum of 5 years of relevant job experience (in SQL, Excel, Tableau/Power BI).


- Minimum of 2 years of relevant job experience in Python/ML/AI.


- Good to have hands-on experience with BI tools, ETL, data processing, database programming, and data analytics & reporting.


- Good to have Knowledge of RPA.


- Knowledge of working in a cloud environment (AWS) is a plus.


- Excellent communication skills in English both written and spoken.

Read more
Early stage startup
Agency job
via Qrata by Rayal Rajan
Bengaluru (Bangalore)
3 - 5 yrs
Best in industry
NodeJS (Node.js)
React.js
MERN Stack
MEAN stack
Python
+4 more

Job title = Node js developer


Experience = 3 - 5 years


Location = Bangalore



Looking only for candidates with startup experience.



Responsibilities:




Develop and maintain scalable, high-performance applications using Node.js

Collaborate with cross-functional teams to define, design, and ship new features

Optimize and troubleshoot applications for performance, scalability, and maintainability

Write well-documented, testable, and efficient code

Participate in code reviews to ensure code quality and adherence to coding standards

Work closely with front-end developers to integrate user-facing elements with server-side logic

Stay updated on industry best practices and emerging technologies to continuously improve development processes



Requirements:



Bachelor's degree in Computer Science, Engineering, or a related field

Proven experience as a Node.js Developer with 3-5 years of hands-on development experience

Strong proficiency in JavaScript and its ecosystem

Experience with server-side frameworks such as Express.js

Solid understanding of asynchronous, object-oriented, or scripting programming languages and event-driven architecture

Familiarity with databases such as MongoDB, MySQL, or PostgreSQL

Experience with version control systems (e.g., Git)

Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes)

Familiarity with front-end technologies (e.g., HTML5, CSS3, React) is a plus

Excellent communication and collaboration skills

Ability to work independently and take ownership of projects

Strong problem-solving and critical-thinking skills

Read more
Series A Funded product Startup
Agency job
via Qrata by Blessy Fernandes
Bengaluru (Bangalore)
3 - 5 yrs
₹10L - ₹22L / yr
Java
J2EE
Spring Boot
Python

Responsibilities:

• Collaborate with cross-functional teams, including front-end developers, product managers, and designers, to understand project requirements and translate them into technical specifications.

• Design and develop server-side logic, APIs, and database schema to support the functionality and performance requirements of our SaaS platform.

• Write clean, modular, and well-documented code using any relevant programming language, preferably Java with Spring Boot.

• Optimize the backend systems for maximum speed and scalability, ensuring high performance and responsiveness of the application.

• Implement data storage solutions using PostgreSQL or other relational databases, ensuring data integrity and security.

• Conduct thorough testing and debugging to identify and resolve any issues or bugs in the backend code.

• Stay up-to-date with emerging technologies, industry trends, and best practices in backend development and contribute to the continuous improvement of our development processes.


Requirements:

• Proven work experience as a Backend Developer or similar role, with a focus on server-side development.

• Proficiency in working with relational databases, particularly PostgreSQL, and writing efficient SQL queries.

• Familiarity with SaaS concepts and architecture.

• Experience with API design and development, including RESTful APIs.

• Solid understanding of software development principles, including object-oriented programming, design patterns, and data structures.

• Experience with version control systems, such as Git.

• Strong problem-solving and analytical skills, with keen attention to detail.

• Excellent communication and teamwork skills, with the ability to collaborate effectively with cross-functional teams.

• Bachelor's degree in Computer Science, Engineering, or a related field is preferred, but not mandatory.

Read more
Bengaluru (Bangalore)
1 - 6 yrs
₹2L - ₹8L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+9 more

ROLE AND RESPONSIBILITIES

Should be able to work as an individual contributor and maintain good relationships with stakeholders. Should be proactive in learning new skills per business requirements. Familiar with extracting relevant data, then cleansing and transforming it into insights that drive business value, through the use of data analytics, data visualization, and data modeling techniques.


QUALIFICATIONS AND EDUCATION REQUIREMENTS

Technical Bachelor’s Degree.

Non-Technical Degree holders should have 1+ years of relevant experience.

Read more
CodeCraft Technologies Private Limited
Chandana B
Posted by Chandana B
Bengaluru (Bangalore), Mangalore
6 - 15 yrs
Best in industry
Data Science
Machine Learning (ML)
Python
Artificial Intelligence (AI)

CodeCraft Technologies is an award-winning creative engineering company where highly skilled designers and engineers work closely and bring to life, user-focused solutions.


Proven design & development methodologies are leveraged, and the latest technologies are explored, to deliver best-in-class mobile and web solutions. Our success is built on a team of talented and motivated individuals who drive excellence in everything they do. We are seeking a highly skilled and experienced Lead Data Scientist to join our growing team.


Responsibilities:

● Work with stakeholders across the organization to identify opportunities for leveraging company data to drive business solutions.

● Develop custom data models and algorithms to apply to data sets.

● Use predictive modeling to improve and optimize business processes and solutions

● Research and development of AI algorithms and their applicability in business-related problems to build intelligent systems.

● Build a Solid Data Science Team: Provide strategic direction for the data science team. Lead, mentor, and inspire a team of data scientists, fostering a culture of collaboration and continuous learning.

● Explore the latest technologies in the Data science domain and develop POCs.

● Establish a Technology Partnership with the leading technology providers in the AI/ML space.

● MLOps – Deploy ML solutions to the cloud.

● Collaborate with the content team to produce tech blogs, case studies, etc.


Required Skill Set:

● Strong foundational knowledge of data science concepts, machine learning algorithms, and programming skills in Python (and/or R).

● Expertise in Generative AI (GenAI), Large Language Models (LLM), Natural Language Processing (NLP), image processing and/or video analytics

● Proven track record of supporting global clients or internal stakeholders in data science projects.

● Experience in data analytics, descriptive analytics and predictive analytics

● Experience using AI/ML tools available from cloud service providers like AWS/AZURE/GCP including TensorFlow, SageMaker, and Azure ML

● Experience in deploying solutions to the cloud [AWS/Azure/GCP]

● Experience with Data Visualization tools like PowerBI, Tableau

● Proficient in SQL and other database technologies.

● Good understanding of the latest research and technologies in AI.

● Experience working across multiple geographic borders and time zones

● Outstanding communication and presentation skills


Education:

● Graduation/Post-graduation in Computers/Engineering/Statistics from a reputed institute

Read more
Shipthis Inc

at Shipthis Inc

2 candid answers
Shariba Tasneem
Posted by Shariba Tasneem
Bengaluru (Bangalore)
3 - 5 yrs
₹10L - ₹15L / yr
Python
Angular (2+)

At Shipthis, we work to build a better future and make meaningful changes in the freight forwarding industry. Our team members aren't just employees. We are comprised of bright, skilled professionals with a single straightforward goal - Evolve Freight forwarders towards Digitalized operations and help them become more efficient.

 

As a company, we're just the right size for every person to take initiative and make things happen. Join us on this journey to make a difference in how Digitalization evolves the Freight Forwarding industry.

 

Visit us at https://www.shipthis.co


JOB DESCRIPTION


What You'll Be Doing

  • Manage the lifecycle of existing product modules, including, maintenance, feature addition, and deployment.
  •  Seeing through a project from conception to finished product.
  •  Designing and developing APIs.
  •  Developing full-stack solutions along with architecture.
  •  Server and Docker management.
  • Create strategies for enhancing the team's productivity through automation, process enhancements, and tool usage.
  •  Collaborate with cross-functional teams to analyze and understand business requirements related to accounting functions within the ERP system.
  •  Design and implement scalable, secure, and efficient accounting modules that integrate seamlessly into our ERP platform.
  • Be a champion of the Shipthis product and troubleshooting procedures. Collaborate, work alongside, and build mutually beneficial relationships with other teams (Customer Success, Sales, Product, Engineering)

 

Responsibilities include:

  • Stay up-to-date: Stay updated on industry trends, accounting standards, and technology advancements to enhance the ERP system continuously.
  • Security: Implementing security and data protection measures to safeguard sensitive data and ensure compliance with industry standards.
  • Code Maintenance: Participating in code reviews, refactoring, and optimizing existing codebase for maintainability and scalability.
  • Documentation: Creating and maintaining technical documentation for backend systems, APIs, and databases.
  •  Collaboration: Collaborating closely with front-end developers, UI/UX designers, and other team members to deliver end-to-end solutions.

 

Who are we looking for

  •  Strong organizational and project management skills.
  •  Proven experience in designing, developing, and maintaining accounting modules within ERP systems.
  • Knowledge of and proficiency in Python, Node.js, TypeScript, and Angular 12+.
  •  Familiarity with JavaScript frameworks such as Angular 12+ and Ionic.
  •  Proficiency with server-side languages such as Python and Node.js.
  • Familiarity with MongoDB database.
  • Good problem-solving skills.
  •  Self-motivated, self-learning, and organized person.
  • Ability to analyze, research, and solve highly technical and unique problems.
  • Excellent communication skills.


Who can apply

  • Computer Science or any other domain with a strong orientation toward computer programming as part of the coursework or projects.
  • Have relevant skills and interests.
  • Female candidates returning to work after a career break are strongly encouraged to apply
  • Can start the job immediately.

 

We are an equal opportunity employer and value diversity at our company. We do not discriminate based on race, religion, color, gender, sexual orientation, age, marital status, or disability status.


JOB SYNOPSIS

  • Job Role: Full Stack Developer
  • Location: Bangalore
  • Job type: Full-time, permanent
  • Experience: (3-5) years
  • Industry Type: Software Product
  • Functional Area: Software Development


Read more
Seed Funded startup in Bangalore
Bengaluru (Bangalore)
3 - 5 yrs
₹20L - ₹30L / yr
React.js
Python

We are looking for a front-end engineer to join the founding team to build out the product. You will work closely with the founders, product manager, designers, infrastructure engineers, and data scientists to build out the first version of the web and API servers that interact with the data infrastructure systems.


You can choose to work in Coimbatore or Bangalore office.

Responsibilities

● Work closely with our PM and design teams to define feature specifications and build products leveraging frameworks such as React & React Native

● Implement web or mobile interfaces using XHTML, CSS, and JavaScript

● Analyze and optimize UI and infrastructure application code for quality, efficiency, and performance

● Design & build the backend API servers that talk to the data infrastructure systems for fetching the data to be exposed via the Arcana UI.

● Track data quality and latency, and set up monitors and alerts to ensure smooth operation

● Analyze and improve efficiency, scalability, and stability of various system resources

● Effectively communicate complex features and systems in detail.

● Establish yourself as the owner of a large-scope component, feature, or system with expert end-to-end understanding.

● Successfully complete projects at large scope while maintaining a consistently high level of productivity


Qualifications

● Bachelor's degree in Computer Science, Computer Engineering, relevant technical field, or equivalent practical experience.

● 3+ years of experience with modern front-end and back-end systems.

● 3+ years of object-oriented software development experience

● 3+ years of experience with HTML, CSS, JS, and JS frameworks like React.

● Proficiency in Python.

● Curiosity about Web3 technologies, having tinkered in this space. A passion for building tools that help customers understand the space is a plus

● Interested in learning new technologies to solve customer needs with lots of creative freedom

● Have experience with building products and/or the curiosity to think through the holistic product picture


Benefits

● Competitive salary

● Ownership in the company: equity options

● Great medical insurance

● Unlimited paid time off vacation policy

● Complete office set up with latest technology: M1 MacBook Pro, 4K Monitors

● Fun weekly team building events 

Read more
Eloelo

at Eloelo

1 recruiter
Vikas Saini
Posted by Vikas Saini
Bengaluru (Bangalore)
4 - 7 yrs
₹15L - ₹30L / yr
Python
Scala
MS SQLServer
Amazon Web Services (AWS)

Design a multi-tier data pipeline to feed data into applications for building a full-featured analytics environment. Develop high-quality code to support the platform's technical architecture and design. Participate in and contribute to an effective software development lifecycle using Scrum and Agile. Collaborate with global teams and work as one team.

What you get to do:

You'll work on the design, implementation, and maintenance of data pipelines. Design and build database schemas to handle large-scale data migration & transformation. Capable of designing a high-performance, scalable, distributed product in the cloud (AWS, GCS). Review development frameworks and coding standards, conduct code reviews and walkthroughs, and conduct in-depth design reviews. Identify gaps in the existing infrastructure and advocate for the necessary changes to close them.

Who we are looking for:

4 to 7 years of industry experience working in Spark and Scala/Python. Working experience with big-data tech stacks like Spark, Kafka & Athena. Extensive experience in SQL query optimization/tuning and debugging SQL performance issues. Experience in ETL/ELT processes to move data through the data processing pipeline. Be a fearless leader in championing smart design.

Top 3 primary skills and expertise level requirements (1 to 5; 5 being expert):

Excellent programming experience in Scala or Python. Good experience in SQL queries and optimizations. 2 to 3 years of Spark experience. Nice to have experience in Airflow. Nice to have experience with AWS EMR, Lambda, and S3.
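The SQL query optimization/tuning skill called out above can be illustrated with a small, self-contained sketch: SQLite's `EXPLAIN QUERY PLAN` shows how adding an index turns a full table scan into an index search. The table and column names here are made up for the example, and a warehouse engine like Athena has its own planner, but the workflow of reading the plan before and after a change carries over:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, ts INTEGER)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)", [(i % 100, i) for i in range(1000)]
)

def plan(sql: str) -> str:
    # The fourth column of EXPLAIN QUERY PLAN output describes the strategy
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM events WHERE user_id = 7"
before = plan(query)  # full table scan
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
after = plan(query)   # the planner now uses the index
```

Comparing `before` and `after` makes the effect of the index visible without guessing from timings alone.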

Employment Type - FULLTIME

Industry Type - Media / Entertainment / Internet

Seniority Level - Mid-Senior-Level

Work Experience(in years) - 4 - 7 Years

Education - B.Tech/B.E.

Skills - Python, Scala, MS SQL Server, AWS

Read more
Shipthis Inc

at Shipthis Inc

2 candid answers
Shariba Tasneem
Posted by Shariba Tasneem
Bengaluru (Bangalore)
1 - 2 yrs
₹5.5L - ₹6L / yr
Javascript
HTML/CSS
JSON
Python
RESTful APIs

At Shipthis, we work to build a better future and make meaningful changes in the freight forwarding industry. Our team members aren't just employees. We are comprised of bright, skilled professionals with a single straightforward goal - Evolve Freight forwarders towards Digitalized operations and help them become more efficient.


As a company, we're just the right size for every person to take initiative and make things happen. Join us on this journey to make a difference in how Digitalization evolves the Freight Forwarding industry.

 

Job Description

 

Responsibilities:

  • Utilize programming expertise to troubleshoot and diagnose software issues, offering clear and actionable solutions.
  •  Perform regular maintenance and updates to the applications
  • Follow standard procedures for proper escalation of unresolved issues to the appropriate internal teams
  •  Reporting software bugs and supportability concerns along with customer suggestions to the product teams.
  • Proven ability to analyze, diagnose, and resolve complex technical issues efficiently.
  • Requirement gathering and analysis of customer requirements.
  • Development and configuration of client instances during onboarding
  • Document technical solutions, known issues, and best practices in a clear and concise manner.
  • Follow the SLA for issues with respect to the severity. 


Who are we looking for

  • Experience in debugging and interpreting code to identify and address software-related issues
  •   Ability to multi-task and work with minimal supervision
  • Knowledge of JavaScript, HTML, JSON, and Angular
  • Basic concepts of Python and REST APIs
  • Strong problem-solving skills
  • Excellent written and verbal communication skills


Who can apply

  • Computer Science or any other domain with a strong orientation toward computer programming as part of the coursework or projects.
  • Female candidates returning to work after a career break are strongly encouraged to apply
  • Have relevant skills and interests.


We are an equal-opportunity employer and value diversity at our company. We do not discriminate based on race, religion, color, gender, sexual orientation, age, marital status, or disability status. 

 

Job Synopsis

Location: Bangalore

Job Type: Full-time

Role: Junior Software Engineer

Industry Type: Software Product

Employment Type: Full-time, Permanent

Read more
Quotient
Bengaluru (Bangalore)
7 - 8 yrs
₹10L - ₹12L / yr
Python
Snowflake
SQL
DDL
DML
+1 more

Job Description

Title - Lead Snowflake Developer

Location - Chennai/Hyderabad/Bangalore

Role - Fulltime

Notice Period/Availability - Immediate

Years of Experience - 6+

 

Job Description:

  • Overall 6 years of experience in IT/Software development
  • Minimum 3 years of experience working with Snowflake.
  • Designing, implementing and testing cloud computing solutions using Snowflake technology.
  • Creating, monitoring and optimization of ETL/ELT processes.
  • Migrating solutions from on-premises to public cloud platforms.
  • Experience in SQL language and data warehousing concepts.
  • Experience in Cloud technologies: AWS, Azure or GCP.


Read more
Red.Health

at Red.Health

2 candid answers
Mayur Bellapu
Posted by Mayur Bellapu
Bengaluru (Bangalore)
3 - 6 yrs
₹15L - ₹30L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+5 more

Job Description: Data Engineer

We are looking for a curious Data Engineer to join our extremely fast-growing Tech Team at StanPlus

 

About RED.Health (Formerly Stanplus Technologies)

Get to know the team:

Join our team and help us build the world’s fastest and most reliable emergency response system using cutting-edge technology.

Because every second counts in an emergency, we are building systems and flows with 4 9s of reliability to ensure that our technology is always there when people need it the most. We are looking for distributed systems experts who can help us perfect the architecture behind our key design principles: scalability, reliability, programmability, and resiliency. Our system features a powerful dispatch engine that connects emergency service providers with patients in real-time

.

Key Responsibilities

●     Build Data ETL Pipelines

●     Develop data set processes

●     Strong analytic skills related to working with unstructured datasets

●     Evaluate business needs and objectives

●     Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery

●     Interpret trends and patterns

●     Work with data and analytics experts to strive for greater functionality in our data system

●     Build algorithms and prototypes

●     Explore ways to enhance data quality and reliability

●     Work with the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs.

●     Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.

 

Key Requirements

●     Proven experience of at least 3 years as a data engineer, software developer, or in a similar role.

●     Bachelor's / Master’s degree in data engineering, big data analytics, computer engineering, or related field.

●     Experience with big data tools: Hadoop, Spark, Kafka, etc.

●     Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.

●     Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.

●     Experience with Azure, AWS cloud services: EC2, EMR, RDS, Redshift

●     Experience with BigQuery

●     Experience with stream-processing systems: Storm, Spark-Streaming, etc.

●     Experience with languages: Python, Java, C++, Scala, SQL, R, etc.

●     Good hands-on with Hive, Presto.

 


Red.Health

Posted by Mayur Bellapu
Bengaluru (Bangalore)
3 - 8 yrs
₹15L - ₹40L / yr
NodeJS (Node.js)
Java
Go Programming (Golang)
Python

Role: SDE 2

Notice: 15 days

Location: Bangalore


Get to know the team:

Join our team and help us build the world’s fastest and most reliable emergency response system using cutting-edge technology.


Because every second counts in an emergency, we are building systems and flows with 4 9s of reliability to ensure that our technology is always there when people need it the most. We are looking for distributed systems experts who can help us perfect the architecture behind our key design principles: scalability, reliability, programmability, and resiliency. Our system features a powerful dispatch engine that connects emergency service providers with patients in real-time.


Get to know the role:

We are looking for an experienced coder with expertise in deep distributed systems and event-driven systems and experience working with streaming platforms. They should have a strong focus on scalability and reliability, and a deep understanding of collaboration and documentation.


Responsibilities:

  • Design, develop, and maintain scalable, reliable, and efficient backend systems.
  • Write high-quality, well-documented, and maintainable code.
  • Seek and incorporate feedback from code and design reviews to improve code quality and design effectiveness.
  • Translate business or technical requirements into effective design documents such as ERDs and RFCs to address clearly defined problems.
  • Develop tests to verify code functionality and stability; establish monitoring and alerting systems to ensure code reliability.
  • Collaborate with cross-functional teams to ensure timely delivery of high-quality software products.


Desired skill set:


  • At least 7 years of experience in backend development.
  • Strong proficiency in one or more of the following: NodeJS, Java, or Go.
  • Solid understanding of distributed and event-driven systems.
  • Production level experience with PostgreSQL/MySQL and NoSQL databases.
  • Strong emphasis on reliability and observability.
  • Experience with cloud-native architecture.



Come and be a part of a project that has the potential to make a real difference in people's lives.


Regards,

Mayur

Talent Acquisition



Intellikart Ventures LLP
Posted by ramandeep intellikart
Bengaluru (Bangalore)
5 - 10 yrs
₹5L - ₹30L / yr
Python
Amazon Web Services (AWS)
Microsoft Windows Azure
Google Cloud Platform (GCP)
Databases
+1 more

How You'll Contribute:

● Redefine Fintech architecture standards by building easy-to-use, highly scalable, robust, and flexible APIs

● In-depth analysis of the systems/architectures to predict potential future breakdowns and proactively bring solutions

● Partner with internal stakeholders to identify potential features whose implementation could cater to our growing business needs

● Drive the team towards writing high-quality code; tackle abstractions/flaws in system design to attain revved-up API performance and high code reusability and readability.

● Think through the complex Fintech infrastructure and propose an easy-to-deploy modular infrastructure that could adapt and adjust to the specific requirements of the growing client base

● Design and create for scale, optimized memory usage, and high-throughput performance.


Skills Required:

● 5+ years of experience in the development of complex distributed systems

● Prior experience in building sustainable, reliable and secure microservice-based scalable architecture using Python Programming Language

● In-depth understanding of Python associated libraries and frameworks

● Strong involvement in managing and maintaining production-level code with high-volume API hits and low-latency APIs

● Strong knowledge of data structures, algorithms, design patterns, multithreading concepts, etc.

● Ability to design and implement technical road maps for the system and components

● Bring in new software development practices, design/architecture innovations to make our Tech stack more robust

● Hands-on experience in cloud technologies like AWS/GCP/Azure as well as relational databases like MySQL/PostgreSQL or any NoSQL database like DynamoDB

Series A funded Startup in Bangalore
Bengaluru (Bangalore)
3 - 5 yrs
₹12L - ₹19L / yr
Java
Spring Boot
PostgreSQL
Python

We are seeking a highly skilled Backend Developer to join our team and contribute to the development and improvement of both our customer-centric SaaS platform and internal systems. As a Backend Developer, you will be responsible for designing, implementing, and maintaining the server-side logic of our application, ensuring high performance, scalability, and security.


As a Backend Developer with expertise in Java, and PostgreSQL, you will play a critical role in developing and maintaining the backend infrastructure of our SaaS and internal platforms, ensuring its reliability, scalability, and performance. Join our team and be a part of building cutting-edge software solutions that empower our customers and drive business growth.


Responsibilities:

  • Collaborate with cross-functional teams, including front-end developers, product managers, and designers, to understand project requirements and translate them into technical specifications.
  • Design and develop server-side logic, APIs, and database schema to support the functionality and performance requirements of our SaaS platform.
  • Write clean, modular, and well-documented code using any relevant programming language preferably Java with SpringBoot.
  • Optimize the backend systems for maximum speed and scalability, ensuring high performance and responsiveness of the application.
  • Implement data storage solutions using PostgreSQL or other relational databases, ensuring data integrity and security.
  • Conduct thorough testing and debugging to identify and resolve any issues or bugs in the backend code.
  • Stay up-to-date with emerging technologies, industry trends, and best practices in backend development and contribute to the continuous improvement of our development processes.


Requirements:

  • Proven work experience as a Backend Developer or similar role, with a focus on server-side development.
  • Proficiency in working with relational databases, particularly PostgreSQL, and writing efficient SQL queries.
  • Familiarity with SaaS concepts and architecture.
  • Experience with API design and development, including RESTful APIs.
  • Solid understanding of software development principles, including object-oriented programming, design patterns, and data structures.
  • Experience with version control systems, such as Git.
  • Strong problem-solving and analytical skills, with keen attention to detail.
  • Excellent communication and teamwork skills, with the ability to collaborate effectively with cross-functional teams.
  • Bachelor's degree in Computer Science, Engineering, or a related field is preferred, but not mandatory.


Thoughtworks

Posted by Sunidhi Thakur
Bengaluru (Bangalore)
10 - 13 yrs
Best in industry
Data modeling
PySpark
Data engineering
Big Data
Hadoop
+10 more

Lead Data Engineer

 

Data Engineers develop modern data architecture approaches to meet key business objectives and provide end-to-end data solutions. You might spend a few weeks with a new client on a deep technical review or a complete organizational review, helping them to understand the potential that data brings to solve their most pressing problems. On other projects, you might be acting as the architect, leading the design of technical solutions, or perhaps overseeing a program inception to build a new product. It could also be a software delivery project where you're equally happy coding and tech-leading the team to implement the solution.

 

Job responsibilities

 

·      You might spend a few weeks with a new client on a deep technical review or a complete organizational review, helping them to understand the potential that data brings to solve their most pressing problems

·      You will partner with teammates to create complex data processing pipelines in order to solve our clients' most ambitious challenges

·      You will collaborate with Data Scientists in order to design scalable implementations of their models

·      You will pair to write clean and iterative code based on TDD

·      Leverage various continuous delivery practices to deploy, support and operate data pipelines

·      Advise and educate clients on how to use different distributed storage and computing technologies from the plethora of options available

·      Develop and operate modern data architecture approaches to meet key business objectives and provide end-to-end data solutions

·      Create data models and speak to the tradeoffs of different modeling approaches

·      On other projects, you might be acting as the architect, leading the design of technical solutions, or perhaps overseeing a program inception to build a new product

·      Seamlessly incorporate data quality into your day-to-day work as well as into the delivery process

·      Assure effective collaboration between Thoughtworks' and the client's teams, encouraging open communication and advocating for shared outcomes

 

Job qualifications

Technical skills

·      You are equally happy coding and leading a team to implement a solution

·      You have a track record of innovation and expertise in Data Engineering

·      You're passionate about craftsmanship and have applied your expertise across a range of industries and organizations

·      You have a deep understanding of data modelling and experience with data engineering tools and platforms such as Kafka, Spark, and Hadoop

·      You have built large-scale data pipelines and data-centric applications using any of the distributed storage platforms such as HDFS, S3, NoSQL databases (Hbase, Cassandra, etc.) and any of the distributed processing platforms like Hadoop, Spark, Hive, Oozie, and Airflow in a production setting

·      Hands-on experience with MapR, Cloudera, Hortonworks, and/or cloud-based Hadoop distributions (AWS EMR, Azure HDInsight, Qubole, etc.)

·      You are comfortable taking data-driven approaches and applying data security strategy to solve business problems

·      You're genuinely excited about data infrastructure and operations with a familiarity working in cloud environments

·      Working with data excites you: you have created Big data architecture, you can build and operate data pipelines, and maintain data storage, all within distributed systems

 

Professional skills


·      Advocate your data engineering expertise to the broader tech community outside of Thoughtworks, speaking at conferences and acting as a mentor for more junior-level data engineers

·      You're resilient and flexible in ambiguous situations and enjoy solving problems from technical and business perspectives

·      An interest in coaching others, sharing your experience and knowledge with teammates

·      You enjoy influencing others and always advocate for technical excellence while being open to change when needed

Bengaluru (Bangalore)
5 - 9 yrs
₹18L - ₹25L / yr
C
Python

Role and Responsibilities:


We are seeking a skilled and experienced individual to join our team as a Post Silicon Validation Engineer. In this role, you will be responsible for validating and analyzing the performance of SOC level environments. Your primary duties will include functional and compliance testing, characterization across power, performance, and thermal parameters, as well as data collection and analysis for PVT analysis, SNR, harmonic distortions, dynamic range, and data rates. Additionally, you will be involved in debugging silicon failures to optimize the overall system performance.


Candidate Qualifications:


To be considered for this position, you must have the following qualifications:

  • Strong exposure to validation methodologies and analysis techniques
  • Proven experience in SOC level environment bring up
  • Expertise in functional and compliance testing
  • Experience with data collection and analysis for PVT analysis, SNR, harmonic distortions, dynamic range, and data rates
  • Ability to debug silicon failures and optimize system performance



Required Skills:


In order to succeed in this role, you should have the following skills:

  • Proficiency in C and Python programming languages
  • Familiarity with NI LabVIEW, TestStand, and Actor-framework
  • Experience in developing test plans and integrating serial communication protocols
  • Ability to automate lab instruments using SCPI commands with GPIB controllers
  • Experience in developing test cases and drivers
  • Knowledge of tools such as Infiniium test compliance applications, Lauterbach TRACE32, Temp C, Digital Multimeter (DMM), Function Generator (FG), Digital Storage Oscilloscope (DSO), and National Instruments PXI – DAQ
  • Proficiency in SPI, I2C, USB2, RS232, USB3 SS, JTAG, and AMBA protocols
  • Ability to analyze and debug protocol level failures
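The SCPI automation mentioned above can be sketched as follows. A real bench would talk to instruments over GPIB through a VISA layer (e.g., pyvisa); the FakeDMM below is a stand-in transport so the flow is runnable, and the command set shown is the usual SCPI subset:

```python
# Sketch of SCPI-based instrument automation (transport is simulated).

class FakeDMM:
    """Simulates a digital multimeter answering a few SCPI queries."""
    def query(self, cmd: str) -> str:
        responses = {
            "*IDN?": "FakeCo,DMM-1000,0001,1.0",   # identification query
            "MEAS:VOLT:DC?": "1.2345",             # one-shot DC voltage reading
        }
        return responses.get(cmd.strip(), "ERR")

def read_dc_volts(inst) -> float:
    """Issue the standard SCPI measure query and parse the reply."""
    return float(inst.query("MEAS:VOLT:DC?"))

dmm = FakeDMM()
print(dmm.query("*IDN?"))    # FakeCo,DMM-1000,0001,1.0
print(read_dc_volts(dmm))    # 1.2345
```

Swapping FakeDMM for a pyvisa resource (`rm.open_resource("GPIB0::22::INSTR")`) keeps `read_dc_volts` unchanged, which is the point of isolating the transport.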


Gyrus AI Private Limited
Posted by Nagaraj Krishnamurthy
Bengaluru (Bangalore)
1 - 4 yrs
₹8L - ₹20L / yr
Python
OpenCV
PyTorch
Deep Learning
Image Processing
+1 more

You will be part of the core engineering team that is working on developing AI/ML models, Algorithms, and Frameworks in the areas of Video Analytics, Business Intelligence, IoT Predictive Analytics. 

For more information visit www.gyrus.ai 


Candidate must have the following qualifications 

- Engineering or Master's degree in CS, EC, EE, or related domains

- Proficient in OpenCV

- Proficiency in Python programming

- Exposure to one of the AI platforms like Tensorflow, Caffe, PyTorch

- Must have trained and deployed at least one fairly big AI model

- Exposure to AI models for Audio/Image/Video Analytics

- Exposure to one of the Cloud Computing platforms AWS/GCP

- Strong mathematical background with special emphasis towards Linear Algebra and Statistics

Acuity Knowledge Partners

Posted by Gangadhar S
Bengaluru (Bangalore)
4 - 9 yrs
₹16L - ₹40L / yr
Python
Amazon Web Services (AWS)
CI/CD
MongoDB
MLOps
+1 more

Job Responsibilities:

1. Develop/debug applications using Python.

2. Improve code quality and code coverage for existing or new program.

3. Deploy and Integrate the Machine Learning models.

4. Test and validate the deployments.

5. Support the MLOps function.


Technical Skills

1. Graduate in Engineering or Technology with strong academic credentials

2. 4 to 8 years of experience as a Python developer.

3. Excellent understanding of SDLC processes

4. Strong knowledge of Unit testing, code quality improvement

5. Cloud based deployment and integration of applications/micro services.

6. Experience with NoSQL databases, such as MongoDB, Cassandra

7. Strong applied statistics skills

8. Knowledge of creating CI/CD pipelines and touchless deployment.

9. Knowledge about API, Data Engineering techniques.

10. Working knowledge of AWS.

11. Knowledge of Machine Learning and Large Language Models.


Nice to Have

1. Exposure to financial research domain

2. Experience with JIRA, Confluence

3. Understanding of scrum and Agile methodologies

4. Experience with data visualization tools, such as Grafana, GGplot, etc

A fast growing Big Data company
Noida, Bengaluru (Bangalore), Chennai, Hyderabad
6 - 8 yrs
₹10L - ₹15L / yr
AWS Glue
SQL
Python
PySpark
Data engineering
+6 more

AWS Glue Developer 

Work Experience: 6 to 8 Years

Work Location:  Noida, Bangalore, Chennai & Hyderabad

Must Have Skills: AWS Glue, DMS, SQL, Python, PySpark, Data Integrations and DataOps

Job Reference ID:BT/F21/IND


Job Description:

Design, build and configure applications to meet business process and application requirements.


Responsibilities:

7 years of work experience with ETL, Data Modelling, and Data Architecture. Proficient in ETL optimization, designing, coding, and tuning big data processes using PySpark. Extensive experience building data platforms on AWS using core AWS services (Step Functions, EMR, Lambda, Glue, Athena, Redshift, Postgres, RDS, etc.) and designing/developing data engineering solutions. Orchestration using Airflow.


Technical Experience:

Hands-on experience developing a data platform and its components: data lake, cloud data warehouse, APIs, batch and streaming data pipelines. Experience building data pipelines and applications to stream and process large datasets at low latencies.


➢ Enhancements, new development, defect resolution and production support of Big data ETL development using AWS native services.

➢ Create data pipeline architecture by designing and implementing data ingestion solutions.

➢ Integrate data sets using AWS services such as Glue, Lambda functions/ Airflow.

➢ Design and optimize data models on AWS Cloud using AWS data stores such as Redshift, RDS, S3, Athena.

➢ Author ETL processes using Python, Pyspark.

➢ Build Redshift Spectrum direct transformations and data modelling using data in S3.

➢ ETL process monitoring using CloudWatch events.

➢ You will be working in collaboration with other teams. Good communication is a must.

➢ Must have experience using AWS service APIs, the AWS CLI, and SDKs
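The ingest, transform, and load steps described above can be sketched in plain Python. A real implementation would be a Glue/PySpark job reading from S3 and writing to Redshift; the file, table, and column names below are hypothetical, and sqlite stands in for the warehouse:

```python
import csv, io, sqlite3

# Extract: a small CSV batch (in Glue this would come from S3).
raw = io.StringIO("order_id,amount\n1,10.5\n2,abc\n3,7.0\n")
rows = list(csv.DictReader(raw))

# Transform: drop records that fail a basic quality check.
def valid(r):
    try:
        float(r["amount"])
        return True
    except ValueError:
        return False

clean = [(int(r["order_id"]), float(r["amount"])) for r in rows if valid(r)]

# Load: write the curated batch to a warehouse table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?)", clean)
total = db.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 17.5 (the malformed row was dropped)
```

The same extract/validate/load shape carries over to a Glue job, just with S3 sources and Redshift targets in place of the in-memory stand-ins.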


Professional Attributes:

➢ Experience operating very large data warehouses or data lakes. Expert-level skills in writing and optimizing SQL. Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technology.

➢ Must have 6+ years of big data ETL experience using Python, S3, Lambda, Dynamo DB, Athena, Glue in AWS environment.

➢ Expertise in S3, RDS, Redshift, Kinesis, EC2 clusters highly desired.


Qualification:

➢ Degree in Computer Science, Computer Engineering or equivalent.


Salary: Commensurate with experience and demonstrated competence

hopscotch
Bengaluru (Bangalore)
5 - 8 yrs
₹6L - ₹15L / yr
Python
Amazon Redshift
Amazon Web Services (AWS)
PySpark
Data engineering
+3 more

About the role:

Hopscotch is looking for a passionate Data Engineer to join our team. You will work closely with other teams like data analytics, marketing, data science and individual product teams to specify, validate, prototype, scale, and deploy data pipeline features and data architecture.


Here’s what will be expected out of you:

➢ Ability to work in a fast-paced startup mindset. Should be able to manage all aspects of data extraction, transfer, and load activities.

➢ Develop data pipelines that make data available across platforms.

➢ Should be comfortable in executing ETL (Extract, Transform and Load) processes which include data ingestion, data cleaning and curation into a data warehouse, database, or data platform.

➢ Work on various aspects of the AI/ML ecosystem – data modeling, data and ML pipelines.

➢ Work closely with DevOps and senior architects to come up with scalable system and model architectures for enabling real-time and batch services.


What we want:

➢ 5+ years of experience as a data engineer or data scientist with a focus on data engineering and ETL jobs.

➢ Well versed with the concept of Data warehousing, Data Modelling and/or Data Analysis.

➢ Experience using & building pipelines and performing ETL with industry-standard best practices on Redshift (more than 2+ years).

➢ Ability to troubleshoot and solve performance issues with data ingestion, data processing & query execution on Redshift.

➢ Good understanding of orchestration tools like Airflow.

 ➢ Strong Python and SQL coding skills.

➢ Strong experience in distributed systems like Spark.

➢ Experience with AWS data and ML technologies (AWS Glue, MWAA, Data Pipeline, EMR, Athena, Redshift, Lambda, etc.).

➢ Solid hands-on experience with various data extraction techniques like CDC or time/batch-based, and the related tools (Debezium, AWS DMS, Kafka Connect, etc.) for near-real-time and batch data extraction.
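The time/batch-based extraction mentioned above can be sketched with a simple watermark column. Log-based CDC tools like Debezium work differently (they tail the database log); the table and column names here are illustrative:

```python
import sqlite3

# A source table with an update timestamp (sqlite stands in for the OLTP source).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE events (id INTEGER, updated_at INTEGER)")
db.executemany("INSERT INTO events VALUES (?, ?)",
               [(1, 100), (2, 150), (3, 200)])

def extract_incremental(conn, watermark):
    """Pull only rows changed since the last successful run."""
    rows = conn.execute(
        "SELECT id, updated_at FROM events WHERE updated_at > ? ORDER BY updated_at",
        (watermark,),
    ).fetchall()
    # Advance the watermark to the max timestamp seen in this batch.
    new_watermark = rows[-1][1] if rows else watermark
    return rows, new_watermark

batch, wm = extract_incremental(db, 120)
print(batch)  # [(2, 150), (3, 200)]
print(wm)     # 200
```

Persisting `wm` between runs (e.g., in a control table) is what makes each batch pick up exactly where the previous one stopped.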


Note:

Experience at product-based or e-commerce companies is an added advantage.

Codemonk

Posted by Manjunath S
Bengaluru (Bangalore)
5 - 8 yrs
Best in industry
NodeJS (Node.js)
TypeScript
Python
Go Programming (Golang)
Java

As a seasoned professional, your influence extends beyond development; you will be instrumental in orchestrating rollout plans and adoption blueprints, thereby shaping our forward-thinking roadmap. Envision yourself leading and contributing to groundbreaking projects such as:

  • Crafting a robust infrastructure adept at managing a surge of notifications with negligible latency.
  • Building an auto-scaled system from scratch, integrating elements like queuing and caching to address pressing business needs.
  • Engaging in profound dialogues to weigh the pros and cons of microservices versus monolithic architectures, steering the team towards the most beneficial approach.
  • Overseeing systems that cater to millions of requests per second, enhancing performance through scalable strategies.
  • Orchestrating a smooth database transition from self-hosted to managed setups, ensuring uninterrupted service.
  • Designing a multi-cloud framework that amalgamates the prowess of diverse cloud platforms, guaranteeing reliability and efficiency.
  • Leveraging AI to develop models capable of discerning patterns in creative content, including artworks and designs.
  • Developing a swift and efficient data pipeline that can handle the ingestion and processing of millions of records without compromising on speed and performance.
  • Excel in customer support by analyzing trends and proposing solutions, improving on-call processes, collaborating cross-functionally on bug fixes, and employing advanced debugging strategies for complex issues.

Who You Are

  • 4+ years of solid experience as a Backend Engineer, with expertise in handling large-scale projects using TypeScript/JavaScript, Python.
  • Comprehensive understanding of database systems and adeptness in query optimization.
  • Skilled in establishing and following best practices in API development and integration.
  • A penchant for crafting simple, sustainable, and scalable solutions, with a proven track record of developing enduring systems.
  • Agile learner with a proactive stance in adapting to the evolving SaaS landscape, prioritizing substantial business impact over mere code expansion.
  • Proven leadership in fostering collaboration and drawing actionable insights from various organizational sectors, steering teams with a visionary and inclusive approach.
  • Trusted figure in building and nurturing cross-functional relationships, commanding respect and trust across all organizational levels.
  • Source of inspiration and motivation for new team members, often being the catalyst for engineers opting to join your team.
  • Dedicated mentor with a deep-seated commitment to nurturing the growth of both engineers and managers, emphasizing personal development and mentorship.


BSEtec
Posted by Sushmitha Sri
Bengaluru (Bangalore), Madurai
0 - 2 yrs
₹1L - ₹3L / yr
Python
Solidity
Truffle (Ethereum framework)

About Us:

Join our team at BSEtec, a leading blockchain solutions provider. We're looking for a skilled Blockchain Developer to help us build the future of decentralized applications and solutions.


Job Description:


As a Blockchain Developer at BSEtec, you'll work on exciting projects that harness blockchain's potential. Your role will include:


Responsibilities:


  • Develop blockchain solutions, including smart contracts and dApps.
  • Collaborate on cross-functional teams to meet project requirements.
  • Stay updated on blockchain trends, protocols, and security.
  • Troubleshoot and resolve blockchain tech issues.
  • Integrate blockchain into existing systems.
  • Ensure security and scalability.
  • Document code and maintain project records.
  • Help deploy and maintain blockchain networks.
  • Contribute to coding standards.


Qualifications:


  • Degree in Computer Science or related field.
  • Blockchain development experience.
  • Proficiency in Ethereum, Solidity, Web3.js.
  • Strong blockchain fundamentals knowledge.
  • Smart contract experience.
  • Familiarity with blockchain tools.
  • Problem-solving skills.
  • Good communication and teamwork.
  • Blockchain certification is a plus.



globe teleservices
Posted by deepshikha thapar
Bengaluru (Bangalore)
4 - 8 yrs
₹10L - ₹15L / yr
Python
SQL

RESPONSIBILITIES:

 Requirement understanding and elicitation; analyze data/workflows; contribute to product, project, and proof of concept (POC) work

 Contribute to preparing design documents and effort estimations.

 Develop AI/ML solutions using best-in-class ML models.

 Building, testing, and deploying AI/ML solutions.

 Work with Business Analysts and Product Managers to assist with defining functional user stories.

 Ensure deliverables across teams are of high quality and clearly documented. 

 Recommend best ML practices/Industry standards for any ML use case.

 Proactively take up R&D and recommend solution options for any ML use case.

REQUIREMENTS:

Required Skills

 Overall experience of 4 to 7 Years working on AI/ML framework development

 Good programming knowledge in Python is a must.

 Good Knowledge of R and SAS is desired.

 Good hands-on and working knowledge of SQL, data models, and CRISP-DM.

 Proficiency with Uni/multivariate statistics, algorithm design, and predictive AI/ML modelling.

 Strong knowledge of machine learning algorithms: linear regression, logistic regression, KNN, Random Forest, Support Vector Machines, and Natural Language Processing.

 Experience with NLP and deep neural networks using synthetic and artificial data.

 Involved in different phases of the SDLC and have good working exposure to methodologies like Agile.

Seed Funded Product-based startup
Bengaluru (Bangalore)
4 - 6 yrs
₹10L - ₹14L / yr
Python
Django
Flask

We are seeking an experienced Senior Backend Engineer to join our passionate team. If you have a strong background in backend development, a track record of delivering scalable and reliable solutions, and are eager to contribute to complex projects, we would love to hear from you.


Responsibilities:

  • Design and develop robust, high-performance backend solutions using Python and related technologies.
  • Lead the architecture and design discussions for major backend components and services.
  • Collaborate closely with cross-functional teams to gather and analyze software requirements.
  • Mentor and guide junior and mid-level engineers, fostering their technical growth.
  • Review code and provide constructive feedback to ensure code quality and adherence to best practices.
  • Identify and address performance bottlenecks, scalability challenges, and technical issues.
  • Participate in sprint planning, task estimation, and agile development processes.
  • Keep up-to-date with industry trends, tools, and best practices to continuously improve our backend systems.
  • Drive the adoption of coding standards, design patterns, and engineering best practices.
  • Collaborate with frontend engineers to ensure seamless integration between frontend and backend components.


Requirements:

  • Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
  • Minimum of 4 years of professional experience in backend development.
  • Strong proficiency in Python and backend frameworks like Django and Flask.
  • In-depth knowledge of database systems, both relational (MySQL) and NoSQL (MongoDB, etc.).
  • Proven track record of designing and developing scalable and maintainable backend services.
  • Experience with RESTful API design and best practices.
  • Solid understanding of software architecture, design principles, and software development lifecycle.
  • Previous experience leading or mentoring engineers is a strong plus.
  • Strong problem-solving skills and a proactive attitude towards challenges.
  • Excellent communication skills, both verbal and written.
  • Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and containerization (Docker) is a plus.


HelloAR

Posted by Jobs H
Bengaluru (Bangalore)
1 - 2 yrs
Best in industry
Python
Amazon Web Services (AWS)
RESTful APIs
Flask
MongoDB
+2 more

Role Responsibilities:

  • Development and Maintenance of REST APIs: Lead the creation and management of our RESTful APIs, ensuring top-notch performance and alignment with evolving requirements.
  • Proficiency in Coding: We're in search of expertise in Python or equivalent programming languages. Your coding skills will play a pivotal role in delivering high-quality (efficient, reusable, testable, and scalable) solutions.
  • Unit and Integration Testing: Apply your expertise to craft unit and integration tests, upholding code quality and reliability.
  • Version Control Systems: Proficiency in Distributed Version Control Systems is vital for seamless collaboration during development.
  • Elasticsearch Expertise: Having valuable experience with Elasticsearch is a plus, given its critical role in data retrieval and search functionalities.
  • NOSQL Database Familiarity: Knowledge of NOSQL databases like Cassandra and MongoDB will be advantageous.
  • Message Broker Knowledge: Understanding message brokers, especially RabbitMQ, is beneficial for effective communication within our systems.
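A REST endpoint of the kind described can be sketched framework-free; in practice this would be a Flask route backed by MongoDB, and the `/products` resource below is hypothetical:

```python
import json

# In-memory "database" of products (stand-in for MongoDB/Cassandra).
PRODUCTS = {1: {"id": 1, "name": "sofa"}}

def handle(method, path):
    """Minimal dispatcher mimicking a REST route table."""
    if method == "GET" and path.startswith("/products/"):
        pid = int(path.rsplit("/", 1)[1])
        item = PRODUCTS.get(pid)
        if item is None:
            return 404, json.dumps({"error": "not found"})
        return 200, json.dumps(item)
    return 405, json.dumps({"error": "method not allowed"})

status, body = handle("GET", "/products/1")
print(status, body)                     # 200 {"id": 1, "name": "sofa"}
print(handle("GET", "/products/2")[0])  # 404
```

Keeping the handler a plain function like this is also what makes the unit and integration tests mentioned above cheap to write: assert on (status, body) pairs without spinning up a server.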

Desired Qualifications:

  • Experience: 1-2 years of hands-on experience as a Python developer.
  • AWS: Proficiency in AWS cloud management and architecting enterprise data solutions.
  • Pragmatic Problem-Solving: Recognize when a solution should be streamlined and when creating the right abstraction will lead to long-term efficiency gains.
  • Passion for Quality: Demonstrate dedication to producing work of the highest quality and following best practices.
  • Agile/Lean Process: Familiarity with Agile/Lean methodologies is a plus, reflecting your adaptability and collaborative spirit.
  • Startup Mindset: Embrace the challenges and opportunities of a startup environment, contributing your skills and insights to our growth.
  • Debugging and Optimization: Showcase excellent debugging and optimization capabilities to enhance system performance.
  • Tech Awareness: Stay updated on emerging technologies and possess a solid understanding of the full product development life cycle.
  • UX and Information Architecture: Exhibit excellent knowledge of mobile user experience, information architecture, and industry trends.
Read more
globe teleservices
deepshikha thapar
Posted by deepshikha thapar
Bengaluru (Bangalore)
5 - 10 yrs
₹20L - ₹25L / yr
ETL
Python
Informatica
Talend



  • Good experience in the Extraction, Transformation, and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager) and PowerConnect as the ETL tools on Oracle and SQL Server databases.
  • Knowledge of Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, covering project scoping, analysis, requirements gathering, data modeling, ETL design, development, system testing, implementation, and production support.
  • Strong experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions.
  • Used transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Developed mapping parameters and variables to support SQL overrides.
  • Created mapplets for reuse across different mappings.
  • Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
  • Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
  • Modified existing mappings to accommodate new business requirements.
  • Involved in performance tuning at the source, target, mapping, session, and system levels.
  • Prepared migration documents to move mappings from development to testing and then to production repositories.
  • Extensive experience in developing stored procedures, functions, views, triggers, and complex SQL queries using PL/SQL.
  • Experience in resolving ongoing maintenance issues and bug fixes; monitoring Informatica/Talend sessions as well as performance tuning of mappings and sessions.
  • Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation.
  • Extensive experience in writing UNIX shell scripts and automating ETL processes using UNIX shell scripting.
  • Experience in using automation and scheduling tools such as Control-M.
  • Hands-on experience across all stages of the Software Development Life Cycle (SDLC), including business requirement analysis, data mapping, build, unit testing, systems integration, and user acceptance testing.
  • Build, operate, monitor, and troubleshoot Hadoop infrastructure.
  • Develop tools and libraries, and maintain processes for other engineers to access data and write MapReduce programs.
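
The Type 2 Slowly Changing Dimension requirement above can be illustrated in plain Python: keep history by expiring the old row and appending a new current version instead of overwriting. In practice this logic would live in an Informatica mapping or SQL, so the structure below is a hypothetical sketch, not the tool's actual implementation.

```python
# Illustrative sketch of Type 2 SCD logic: changed attributes close out the
# old dimension row (valid_to set, current flag cleared) and append a new
# current row, preserving history. Field names are assumptions for the demo.
from datetime import date

def apply_scd2(dimension, incoming, today=None):
    """Apply Type 2 SCD updates.

    dimension: list of dicts with keys id, attr, valid_from, valid_to, current
    incoming:  list of dicts with keys id, attr (the latest source snapshot)
    """
    today = today or date.today()
    current = {row["id"]: row for row in dimension if row["current"]}
    for rec in incoming:
        old = current.get(rec["id"])
        if old is None or old["attr"] != rec["attr"]:
            if old is not None:
                # Expire the previous version rather than overwriting it,
                # which is the difference between Type 2 and Type 1 SCD.
                old["valid_to"] = today
                old["current"] = False
            dimension.append({"id": rec["id"], "attr": rec["attr"],
                              "valid_from": today, "valid_to": None,
                              "current": True})
    return dimension
```

An unchanged incoming row is skipped entirely, so the dimension grows only when an attribute actually changes; a Type 1 mapping would instead update `attr` in place and keep no history.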
