50+ Python Jobs in Bangalore (Bengaluru) | Python Job openings in Bangalore (Bengaluru)
About Us:
Optimo Capital is a newly established NBFC founded by Prashant Pitti, who is also a co-founder of EaseMyTrip (a billion-dollar listed startup that grew profitably without any funding).
Our mission is to serve the underserved MSME businesses with their credit needs in India. With less than 15% of MSMEs having access to formal credit, we aim to bridge this credit gap through a phygital model (physical branches + digital decision-making).
As a technology and data-first company, tech lovers and data enthusiasts play a crucial role in building the analytics & tech at Optimo that helps the company thrive.
What We Offer:
Join our dynamic startup team as a Senior Data Analyst and play a crucial role in core data analytics projects involving credit risk, lending strategy, credit underwriting features analytics, collections, and portfolio management. The analytics team at Optimo works closely with the Credit & Risk departments, helping them make data-backed decisions.
This is an exceptional opportunity to learn, grow, and make a significant impact in a fast-paced startup environment. We believe that the freedom and accountability to make decisions in analytics and technology bring out the best in you and help us build the best for the company. This environment offers you a steep learning curve and an opportunity to experience the direct impact of your analytics contributions. Along with this, we offer industry-standard compensation.
What We Look For:
We are looking for individuals with a strong analytical mindset and a fundamental understanding of the lending industry, primarily focused on credit risk. We value not only your skills but also your attitude and hunger to learn, grow, lead, and thrive, both individually and as part of a team. We encourage you to take on challenges, bring in new ideas, implement them, and build the best analytics systems. Your willingness to put in the extra hours to build the best will be recognized.
Skills/Requirements:
- Credit Risk & Underwriting: Fundamental knowledge of credit risk and underwriting processes is mandatory. Experience in any lending financial institution is a must. A thorough understanding of all the features evaluated in the underwriting process like credit report info, bank statements, GST data, demographics, etc., is essential.
- Analytics (Python): Excellent proficiency in Python, particularly Pandas and NumPy. A strong analytical mindset and the ability to extract actionable insights from any analysis are crucial. The ability to convert given problem statements into actionable analytics tasks and frame effective approaches to tackle them is highly desirable.
- Good to have, but not mandatory: REST APIs (a fundamental understanding of APIs and previous experience or projects involving API development or integrations) and Git (proficiency in version control systems, particularly Git; experience on collaborative projects using Git is highly valued).
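To illustrate the Pandas proficiency described above, here is a minimal sketch of a typical portfolio analysis: bucketing a loan book by days past due (DPD) and computing portfolio-at-risk per bucket. The loan data and column names are hypothetical, not Optimo's actual schema.

```python
import pandas as pd

# Hypothetical loan book; column names are illustrative only.
loans = pd.DataFrame({
    "branch": ["BLR-1", "BLR-1", "BLR-2", "BLR-2", "BLR-2"],
    "outstanding": [100000, 50000, 75000, 120000, 30000],
    "dpd": [0, 45, 10, 95, 0],  # days past due
})

# Bucket loans by DPD, then compute portfolio-at-risk (PAR) per bucket.
buckets = pd.cut(loans["dpd"], bins=[-1, 0, 30, 90, 10_000],
                 labels=["current", "1-30", "31-90", "90+"])
par = loans.groupby(buckets, observed=False)["outstanding"].sum()
par_pct = (par / loans["outstanding"].sum() * 100).round(2)
print(par_pct)
```

The same groupby shape extends naturally to branch-wise or vintage (static-pool) cuts.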
What You'll Be Working On:
- Analyze data from different data sources, extract information, and create action items to tackle the given open-ended problems.
- Build strong analytics systems and dashboards that provide easy access to data and insights, including the current status of the company, portfolio health, static pool, branch-wise performance, TAT (turnaround time) monitoring, and more.
- Assist the credit and risk team with insights and action items, helping them make data-backed decisions and fine-tune the credit policy (high involvement in the credit and underwriting process).
- Work on different rule engines that automate the underwriting process end-to-end.
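The rule engines mentioned above can be pictured as an ordered list of named predicates evaluated against an applicant record. This is a deliberately minimal sketch; the rule names and thresholds are invented for illustration.

```python
# Minimal underwriting rule engine: each rule is a (name, predicate) pair
# evaluated against an applicant dict. Rule names/thresholds are illustrative.
RULES = [
    ("min_credit_score", lambda a: a["credit_score"] >= 650),
    ("max_current_dpd", lambda a: a["current_dpd"] <= 30),
    ("min_monthly_banking", lambda a: a["avg_monthly_credits"] >= 50_000),
]

def underwrite(applicant: dict) -> dict:
    failed = [name for name, check in RULES if not check(applicant)]
    return {"approved": not failed, "failed_rules": failed}

decision = underwrite({"credit_score": 700, "current_dpd": 0,
                       "avg_monthly_credits": 80_000})
print(decision)  # {'approved': True, 'failed_rules': []}
```

Keeping rules as data (rather than branching code) makes the credit policy auditable and easy to fine-tune.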
Other Requirements:
- Availability for full-time work in Bangalore. Immediate joiners are preferred.
- Strong passion for analytics and problem-solving.
- At least 1 year of industry experience in an analytics role, specifically in a lending institution, is a must.
- Self-motivated and capable of working both independently and collaboratively.
If you are ready to embark on an exciting journey of growth, learning, and innovation, apply now to join our pioneering team in Bangalore.
Job Description:
We are seeking a skilled Quant Developer with strong programming expertise and an interest in financial markets. You will be responsible for designing, developing, and optimizing trading algorithms, analytics tools, and quantitative models. Prior experience with cryptocurrencies is a plus but not mandatory.
Key Responsibilities:
• Develop and implement high-performance trading algorithms and strategies.
• Collaborate with quantitative researchers to translate models into robust code.
• Optimize trading system performance and latency.
• Maintain and improve existing systems, ensuring reliability and scalability.
• Work with market data, conduct analysis, and support trading operations.
Required Skills:
• Strong proficiency in Python and/or C++.
• Solid understanding of data structures, algorithms, and object-oriented programming.
• Familiarity with financial markets or trading systems (experience in crypto is a bonus).
• Experience with distributed systems, databases, and performance optimization.
• Knowledge of numerical libraries (e.g., NumPy, pandas, Boost) and statistical analysis.
Preferred Qualifications:
• Experience in developing low-latency systems.
• Understanding of quantitative modeling and backtesting frameworks.
• Familiarity with trading protocols (FIX, WebSocket, REST APIs).
• Interest or experience in cryptocurrencies and blockchain technologies.
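As a flavor of the quantitative modeling and backtesting work above, here is a toy backtest of a moving-average crossover strategy on synthetic prices. It is a sketch of the pattern only (signal, position shifted by one bar to avoid look-ahead, equity curve), not a production framework.

```python
import numpy as np
import pandas as pd

# Toy backtest of a moving-average crossover strategy; prices are synthetic.
rng = np.random.default_rng(42)
prices = pd.Series(100 * np.exp(np.cumsum(rng.normal(0, 0.01, 500))))

fast = prices.rolling(10).mean()
slow = prices.rolling(50).mean()
# Long 1 unit when the fast MA is above the slow MA, flat otherwise;
# shift(1) avoids look-ahead: today's position uses yesterday's signal.
position = (fast > slow).astype(int).shift(1).fillna(0)
returns = prices.pct_change().fillna(0)
strategy_returns = position * returns
equity = (1 + strategy_returns).cumprod()
print(f"final equity multiple: {equity.iloc[-1]:.3f}")
```

A real system would add transaction costs, slippage, and out-of-sample evaluation before trusting any result.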
at Tech Prescient
Desired Skills: Java/Python/Go, Postman, SoapUI, Load Testing, Performance Testing and Scalability Testing.
Experience range: 5-6 Years
Availability: Immediate
We are looking for a qualified and experienced QA Automation Tester who has the following expertise:
What’s required for application:
● 5+ years of Test Automation experience.
● Good software development/scripting skills in common languages like Python, Go, Java, Perl, C++, JavaScript, Node.js, and/or Bash.
● Experience with cloud platforms, preferably AWS, and services such as EC2, EKS, Lambda, Kinesis, S3, and AWS CI/CD pipelines.
● Experience testing Relational and NoSQL database technologies
● Experience testing applications with multi-tiered and distributed architectures, preferably RESTful Web Services and other J2EE applications.
● API testing with knowledge of REST and/or SOAP APIs.
● API automation tools such as Postman, SoapUI, etc.
● Understanding of Real-Time Communication Systems and VoIP protocols such as SIP, RTP, WebRTC.
● Developing test scripts at a highly proficient level.
● Experience with load, performance and scalability testing.
● Documenting, tracking and escalating issues as appropriate with the ability to build effective relationships through partnering and collaboration.
● Handling complex software system infrastructure tasks – from requirements to production.
● Experience in designing, developing, building and running a Continuous Integration Test system.
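A typical automated API check, as described above, asserts on status code, content type, and payload schema. The sketch below stands up a throwaway stdlib HTTP server so the pattern is runnable end to end; in practice the base URL would point at the service under test (or the same checks would live in Postman/SoapUI collections).

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in REST endpoint so the test pattern is runnable end-to-end.
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"status": "ok", "version": "1.0"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base_url = f"http://127.0.0.1:{server.server_port}"

# A typical API check: status code plus payload schema.
with urllib.request.urlopen(f"{base_url}/health") as resp:
    assert resp.status == 200
    payload = json.loads(resp.read())
assert payload["status"] == "ok"
server.shutdown()
print("health check passed")
```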
Good to have:
● Experience with CI/CD deployment frameworks/tools like Jenkins, Github Actions, etc.
● Experience in testing Kafka or similar queuing services.
● Experience with Data Science and Big Data frameworks.
● Experience working with XML, XSLT, JSON.
● Exposure to API Security testing and vulnerability testing.
Key Responsibilities:
- Designing, developing, and maintaining AI/NLP-based software solutions.
- Collaborating with cross-functional teams to define requirements and implement new features.
- Optimizing performance and scalability of existing systems.
- Conducting code reviews and providing constructive feedback to team members.
- Staying up-to-date with the latest developments in AI and NLP technologies.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or related field.
- 6+ years of experience in software development, with a focus on Python programming.
- Strong understanding of artificial intelligence and natural language processing concepts.
- Hands-on experience with AI/NLP frameworks such as LlamaIndex/LangChain, OpenAI, etc.
- Experience in implementing Retrieval-Augmented Generation (RAG) systems for enhanced AI solutions.
- Proficiency in building and deploying machine learning models.
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills.
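The Retrieval-Augmented Generation requirement above boils down to: embed documents, retrieve the most relevant one for a query, and ground the prompt in it. This toy sketch uses bag-of-words counts in place of learned embeddings and omits the LLM call; a real system would use LlamaIndex/LangChain with a vector database.

```python
import math
from collections import Counter

# Toy RAG: retrieve the most relevant document, then build a grounded
# prompt. The documents and query are invented for illustration.
DOCS = [
    "Loans above 30 days past due are flagged for collections review.",
    "GST filings are used to verify declared business turnover.",
    "Branch managers approve disbursals under two lakh rupees.",
]

def embed(text: str) -> Counter:
    # Stand-in for a learned embedding model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(query: str) -> str:
    q = embed(query)
    return max(DOCS, key=lambda d: cosine(q, embed(d)))

question = "How is GST data used in underwriting?"
context = retrieve(question)
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(context)
```

The final `prompt` is what would be sent to the LLM, constraining it to answer from retrieved context rather than from memory.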
Preferred Qualifications:
- Master's degree or higher in Computer Science, Engineering, or related field.
- Experience with cloud computing platforms such as AWS, Azure, or Google Cloud.
- Familiarity with big data technologies such as Hadoop, Spark, etc.
- Contributions to open-source projects related to AI/NLP
Senior Backend Developer
Job Overview: We are looking for a highly skilled and experienced Backend Developer who excels in building robust, scalable backend systems using multiple frameworks and languages. The ideal candidate will have 4+ years of experience working with at least two backend frameworks and be proficient in at least two programming languages such as Python, Node.js, or Go. As a Senior Backend Developer, you will play a critical role in designing, developing, and maintaining backend services, ensuring seamless real-time communication with WebSockets, and optimizing system performance with tools like Redis, Celery, and Docker.
Key Responsibilities:
- Design, develop, and maintain backend systems using multiple frameworks and languages (Python, Node.js, Go).
- Build and integrate APIs, microservices, and other backend components.
- Implement real-time features using WebSockets and ensure efficient server-client communication.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Optimize backend systems for performance, scalability, and reliability.
- Troubleshoot and debug complex issues, providing efficient and scalable solutions.
- Work with caching systems like Redis to enhance performance and manage data.
- Utilize task queues and background job processing tools like Celery.
- Develop and deploy applications using containerization tools like Docker.
- Participate in code reviews and provide constructive feedback to ensure code quality.
- Mentor junior developers, sharing best practices and promoting a culture of continuous learning.
- Stay updated with the latest backend development trends and technologies to keep our solutions cutting-edge.
Required Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent work experience.
- 4+ years of professional experience as a Backend Developer.
- Proficiency in at least two programming languages: Python, Node.js, or Go.
- Experience working with multiple backend frameworks (e.g., Express, Flask, Gin, Fiber, FastAPI).
- Strong understanding of WebSockets and real-time communication.
- Hands-on experience with Redis for caching and data management.
- Familiarity with task queues like Celery for background job processing.
- Experience with Docker for containerizing applications and services.
- Strong knowledge of RESTful API design and implementation.
- Understanding of microservices architecture and distributed systems.
- Solid understanding of database technologies (SQL and NoSQL).
- Excellent problem-solving skills and attention to detail.
- Strong communication skills, both written and verbal.
Preferred Qualifications:
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Familiarity with CI/CD pipelines and DevOps practices.
- Experience with GraphQL and other modern API paradigms.
- Familiarity with task queues, caching, or message brokers (e.g., Celery, Redis, RabbitMQ).
- Understanding of security best practices in backend development.
- Knowledge of automated testing frameworks for backend services.
- Familiarity with version control systems, particularly Git.
Availability: Immediate joiners
Location: Bangalore, Hybrid
Must have skills:
● 4+ years of Software Development experience
● 4+ years of Go (Golang) programming; additional proficiency in Java or Python is preferred
● Knowledgeable in writing REST APIs
● Comfortable programming in production-grade systems
● Experience with building HTTP based services
● Strong background of optimizing performance
● Familiarity with event-driven systems
● Experience dealing with highly concurrent, distributed architectures/systems.
Good to have skills:
● Exposure to relational databases
● Experience with Cloud Providers such as AWS is an advantage
● Experience using Terraform to manage infrastructure as code would be an advantage
Building the machine learning production system (or MLOps) is the biggest challenge most large companies currently have in making the transition to becoming an AI-driven organization. This position is an opportunity for an experienced, server-side developer to build expertise in this exciting new frontier. You will be part of a team deploying state-of-the-art AI solutions for Fractal clients.
Responsibilities
As MLOps Engineer, you will work collaboratively with Data Scientists and Data engineers to deploy and operate advanced analytics machine learning models. You’ll help automate and streamline Model development and Model operations. You’ll build and maintain tools for deployment, monitoring, and operations. You’ll also troubleshoot and resolve issues in development, testing, and production environments.
- Enable Model tracking, model experimentation, Model automation
- Develop scalable ML pipelines
- Develop MLOps components in the machine learning development life cycle using a Model Repository (either of): MLflow, Kubeflow Model Registry
- Develop MLOps components in Machine learning development life cycle using Machine Learning Services (either of): Kubeflow, DataRobot, HopsWorks, Dataiku or any relevant ML E2E PaaS/SaaS
- Work across all phases of Model development life cycle to build MLOPS components
- Build the knowledge base required to deliver increasingly complex MLOPS projects on Azure
- Be an integral part of client business development and delivery engagements across multiple domains
Required Qualifications
- 3-5 years experience building production-quality software.
- B.E/B.Tech/M.Tech in Computer Science or related technical degree OR Equivalent
- Strong experience in System Integration, Application Development or Data Warehouse projects across technologies used in the enterprise space
- Knowledge of MLOps, machine learning and docker
- Object-oriented languages (e.g. Python, PySpark, Java, C#, C++)
- CI/CD experience (e.g., Jenkins, GitHub Actions)
- Database programming using any flavors of SQL
- Knowledge of Git for Source code management
- Ability to collaborate effectively with highly technical resources in a fast-paced environment
- Ability to solve complex challenges/problems and rapidly deliver innovative solutions
- Foundational Knowledge of Cloud Computing on Azure
- Hunger and passion for learning new skills
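The model-tracking component above (MLflow or Kubeflow Model Registry) centers on logging parameters and metrics per run. This pure-Python stand-in mirrors that contract for illustration only; the real MLflow calls would be `mlflow.start_run`, `mlflow.log_param`, and `mlflow.log_metric`.

```python
import json
import time
import uuid

# Minimal experiment tracker mirroring the MLflow-style contract
# (start_run / log_param / log_metric); a stand-in, not the real API.
class Tracker:
    def __init__(self):
        self.runs = []

    def start_run(self, name: str) -> dict:
        run = {"id": uuid.uuid4().hex, "name": name, "start": time.time(),
               "params": {}, "metrics": {}}
        self.runs.append(run)
        return run

    def log_param(self, run, key, value):
        run["params"][key] = value

    def log_metric(self, run, key, value):
        # Metrics keep their full history so training curves survive.
        run["metrics"].setdefault(key, []).append(value)

tracker = Tracker()
run = tracker.start_run("churn-model-v2")          # hypothetical model name
tracker.log_param(run, "learning_rate", 0.01)
for auc in [0.71, 0.78, 0.81]:
    tracker.log_metric(run, "val_auc", auc)
best = max(run["metrics"]["val_auc"])
print(json.dumps({"run": run["name"], "best_val_auc": best}))
```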
Building the machine learning production system (or MLOps) is the biggest challenge most large companies currently have in making the transition to becoming an AI-driven organization. This position is an opportunity for an experienced, server-side developer to build expertise in this exciting new frontier. You will be part of a team deploying state-of-the-art AI solutions for Fractal clients.
Responsibilities
As MLOps Engineer, you will work collaboratively with Data Scientists and Data engineers to deploy and operate advanced analytics machine learning models. You’ll help automate and streamline Model development and Model operations. You’ll build and maintain tools for deployment, monitoring, and operations. You’ll also troubleshoot and resolve issues in development, testing, and production environments.
- Enable Model tracking, model experimentation, Model automation
- Develop scalable ML pipelines
- Develop MLOps components in the machine learning development life cycle using a Model Repository (either of): MLflow, Kubeflow Model Registry
- Develop MLOps components in the machine learning development life cycle using Machine Learning Services (either of): Kubeflow, DataRobot, HopsWorks, Dataiku, or any relevant ML E2E PaaS/SaaS
- Work across all phases of Model development life cycle to build MLOPS components
- Build the knowledge base required to deliver increasingly complex MLOPS projects on Azure
- Be an integral part of client business development and delivery engagements across multiple domains
Required Qualifications
- 5.5-9 years experience building production-quality software
- B.E/B.Tech/M.Tech in Computer Science or related technical degree OR equivalent
- Strong experience in System Integration, Application Development or Data Warehouse projects across technologies used in the enterprise space
- Expertise in MLOps, machine learning and docker
- Object-oriented languages (e.g. Python, PySpark, Java, C#, C++)
- Experience developing CI/CD components for production ready ML pipeline.
- Database programming using any flavors of SQL
- Knowledge of Git for Source code management
- Ability to collaborate effectively with highly technical resources in a fast-paced environment
- Ability to solve complex challenges/problems and rapidly deliver innovative solutions
- Team handling, problem-solving, project management, communication skills, and creative thinking
- Foundational Knowledge of Cloud Computing on Azure
- Hunger and passion for learning new skills
Responsibilities
- Design and implement advanced solutions utilizing Large Language Models (LLMs).
- Demonstrate self-driven initiative by taking ownership and creating end-to-end solutions.
- Conduct research and stay informed about the latest developments in generative AI and LLMs.
- Develop and maintain code libraries, tools, and frameworks to support generative AI development.
- Participate in code reviews and contribute to maintaining high code quality standards.
- Engage in the entire software development lifecycle, from design and testing to deployment and maintenance.
- Collaborate closely with cross-functional teams to align messaging, contribute to roadmaps, and integrate software into different repositories for core system compatibility.
- Possess strong analytical and problem-solving skills.
- Demonstrate excellent communication skills and the ability to work effectively in a team environment.
Primary Skills
- Generative AI: Proficiency with SaaS LLMs, including LangChain, LlamaIndex, vector databases, and prompt engineering (CoT, ToT, ReAct, agents). Experience with Azure OpenAI, Google Vertex AI, and AWS Bedrock for text/audio/image/video modalities.
- Familiarity with open-source LLMs, including tools like TensorFlow/PyTorch and Hugging Face. Techniques such as quantization, LLM fine-tuning using PEFT, RLHF, data annotation workflows, and GPU utilization.
- Cloud: Hands-on experience with cloud platforms such as Azure, AWS, and GCP. Cloud certification is preferred.
- Application Development: Proficiency in Python, Docker, FastAPI/Django/Flask, and Git.
- Natural Language Processing (NLP): Hands-on experience in use case classification, topic modeling, Q&A and chatbots, search, Document AI, summarization, and content generation.
- Computer Vision and Audio: Hands-on experience in image classification, object detection, segmentation, image generation, audio, and video analysis.
A leading Data & Analytics intelligence technology solutions provider to companies that value insights from information as a competitive advantage. We are the partner of choice for enterprises on their digital transformation journey. Our teams offer solutions and services at the intersection of Advanced Data, Analytics, and AI.
Role: Lead Backend Developer
Notice Period: Immediate to 15 days
Location: Bangalore/Mangalore-Hybrid
Skills: Python, FastAPI, AWS, OOP Concepts
• 8+ Years of industry experience.
• Proficient in Object-Oriented Methodologies
• Experience in developing RESTful Web services using FastAPI framework.
• Understanding of Python's threading limitations and multi-process architecture.
• Familiarity with event-driven programming in Python.
• Proficient understanding of code versioning tools such as Git.
• Experience with Amazon Web Services (AWS).
• Knowledge of user authentication and authorization between multiple systems, servers, and environments.
• Implementing security and data protection
• Experience in TDD, continuous integration, and code review practices is strongly desired
• Proven experience developing test scripts, test cases, and test data.
• Use of CI/CD tools in deployment.
• Experience deploying applications to the cloud.
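For the authentication/authorization item above, one common service-to-service pattern is an HMAC-signed token, sketched here with the standard library. The shared secret and claim names are illustrative; production systems would typically use a vetted library such as PyJWT and rotate secrets.

```python
import base64
import hashlib
import hmac
import json

# Service-to-service auth sketch: HMAC-SHA256 signed tokens. The secret
# and claims are invented for illustration.
SECRET = b"demo-shared-secret"

def sign(claims: dict) -> str:
    payload = base64.urlsafe_b64encode(json.dumps(claims).encode())
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return f"{payload.decode()}.{sig}"

def verify(token: str):
    payload, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # tampered payload or wrong secret
    return json.loads(base64.urlsafe_b64decode(payload))

token = sign({"service": "billing", "scope": "read"})
tampered = token[:-1] + ("0" if token[-1] != "0" else "1")
print(verify(token))     # {'service': 'billing', 'scope': 'read'}
print(verify(tampered))  # None
```

`compare_digest` is used instead of `==` to avoid timing side channels when checking signatures.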
Key Responsibilities
- Design, develop, and optimize data pipelines using Apache Spark to process large volumes of structured and unstructured data.
- Write efficient and maintainable code in Scala and Python for data extraction, transformation, and loading (ETL) operations.
- Collaborate with cross-functional teams to define data engineering solutions to support analytics and machine learning initiatives.
- Implement and maintain data lake and warehouse solutions using cloud platforms (e.g., AWS, GCP, Azure).
- Ensure the performance, scalability, and reliability of data workflows and distributed systems.
- Perform data quality assessments, implement monitoring, and improve data governance practices.
- Assist in migrating and refactoring legacy data systems into modern distributed data processing platforms.
- Provide technical leadership and mentorship to junior engineers and contribute to best practices in coding, testing, and deployment.
Qualifications
- Bachelor's or Master’s degree in Computer Science, Engineering, or a related field.
- 6+ years of hands-on experience in data engineering, with strong skills in Apache Spark, Scala, and Python.
- Experience with distributed data processing frameworks and real-time data processing.
- Strong experience with big data technologies such as Hadoop, Hive, and Kafka.
- Proficient with relational databases (SQL, PostgreSQL, MySQL) and NoSQL databases (Cassandra, HBase, MongoDB).
- Knowledge of CI/CD pipelines and DevOps practices for deploying data workflows.
- Strong problem-solving skills and experience with optimizing large-scale data systems.
- Excellent communication and collaboration skills.
- Experience with orchestration tools like Airflow
- Experience with containerization and orchestration (e.g., Docker, Kubernetes)
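An ETL step like those above (filter, group, aggregate) has the same DataFrame shape in Spark (`df.groupBy(...).agg(...)`) as in pandas; this sketch runs the logic in pandas on toy event data, with hypothetical column names, so it can be tested locally.

```python
import pandas as pd

# ETL sketch: the same groupby/agg shape a Spark job would use, run here
# with pandas on invented event data so the logic is testable locally.
raw = pd.DataFrame({
    "event_type": ["click", "view", "click", "purchase", "view"],
    "value": [1.0, 0.5, 2.0, 30.0, 0.5],
    "user_id": ["u1", "u2", "u1", "u3", "u2"],
})

clean = raw.dropna(subset=["event_type"])          # transform: drop bad rows
summary = (clean.groupby("event_type")
                .agg(events=("event_type", "size"),
                     total_value=("value", "sum"),
                     users=("user_id", "nunique"))
                .reset_index())
print(summary)
```

In Spark the final `summary` would typically be written to a data lake table (`summary.write.parquet(...)`) rather than printed.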
Key Responsibilities
- Design, develop, and optimize data pipelines using Apache Spark to process large volumes of structured and unstructured data.
- Write efficient and maintainable code in Scala and Python for data extraction, transformation, and loading (ETL) operations.
- Collaborate with cross-functional teams to define data engineering solutions to support analytics and machine learning initiatives.
- Implement and maintain data lake and warehouse solutions using cloud platforms (e.g., AWS, GCP, Azure).
- Ensure the performance, scalability, and reliability of data workflows and distributed systems.
- Perform data quality assessments, implement monitoring, and improve data governance practices.
- Assist in migrating and refactoring legacy data systems into modern distributed data processing platforms.
- Provide technical leadership and mentorship to junior engineers and contribute to best practices in coding, testing, and deployment.
Qualifications
- Bachelor's or Master’s degree in Computer Science, Engineering, or a related field.
- 8+ years of hands-on experience in data engineering, with strong skills in Apache Spark, Scala, and Python.
- Experience with distributed data processing frameworks and real-time data processing.
- Strong experience with big data technologies such as Hadoop, Hive, and Kafka.
- Proficient with relational databases (SQL, PostgreSQL, MySQL) and NoSQL databases (Cassandra, HBase, MongoDB).
- Knowledge of CI/CD pipelines and DevOps practices for deploying data workflows.
- Strong problem-solving skills and experience with optimizing large-scale data systems.
- Excellent communication and collaboration skills.
- Experience with orchestration tools like Airflow
- Experience with containerization and orchestration (e.g., Docker, Kubernetes)
The Data Scientist is responsible for discovering the information hidden in vast amounts of data and helping us make smarter decisions to deliver better products. Your primary focus will be applying Machine Learning and Generative AI techniques for data mining and statistical analysis, text analytics using NLP/LLMs, and building high-quality prediction systems integrated with our products. The ideal candidate has a prior background in Generative AI, NLP (Natural Language Processing), and Computer Vision techniques, along with experience working with current state-of-the-art Large Language Models (LLMs) and Computer Vision algorithms.
Job Responsibilities:
» Building models using best-in-class AI/ML technology.
» Leveraging your expertise in Generative AI, Computer Vision, Python, Machine Learning, and Data Science to develop cutting-edge solutions for our products.
» Integrating NLP techniques and utilizing LLMs in our products.
» Training/fine-tuning models with new or modified training datasets.
» Selecting features, building and optimizing classifiers using machine learning techniques.
» Conducting data analysis, curation, preprocessing, modelling, and post-processing to drive data-driven decision-making.
» Enhancing data collection procedures to include information that is relevant for building analytic systems
» Working understanding of cloud platforms (AWS).
» Collaborating with cross-functional teams to design and implement advanced AI models and algorithms.
» Involving in R&D activities to explore the latest advancements in AI technologies, frameworks, and tools.
» Documenting project requirements, methodologies, and outcomes for stakeholders.
Technical skills
Mandatory
» Minimum of 5 years of experience as Machine Learning Researcher or Data Scientist.
» Master's degree or Ph.D. (preferable) in Computer Science, Data Science, or a related field.
» Should have knowledge of and experience with Deep Learning projects using CNNs, Transformers, and encoder-decoder architectures.
» Working experience with LLMs (Large Language Models) and their applications (e.g., tuning embedding models, data curation, prompt engineering, LoRA, etc.).
» Familiarity with LLM Agents and related frameworks.
» Good programming skills in Python and experience with relevant libraries and frameworks (e.g., PyTorch and TensorFlow).
» Good applied statistics skills, such as distributions, statistical testing, regression, etc.
» Excellent understanding of machine learning and computer vision based techniques and algorithms.
» Strong problem-solving abilities and a proactive attitude towards learning and adopting new technologies.
» Ability to work independently, manage multiple projects simultaneously, and collaborate effectively with diverse stakeholders.
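The applied-statistics expectation above (distributions, testing, regression) can be illustrated with an ordinary least-squares fit on synthetic data with a known slope and intercept:

```python
import numpy as np

# Applied-stats sketch: OLS fit plus a residual check, on synthetic data
# generated with a known slope (3.0) and intercept (2.0).
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
y = 3.0 * x + 2.0 + rng.normal(0, 0.5, size=x.size)

# Design matrix [x, 1] -> solve for [slope, intercept].
A = np.column_stack([x, np.ones_like(x)])
(slope, intercept), *_ = np.linalg.lstsq(A, y, rcond=None)
residuals = y - (slope * x + intercept)
print(f"slope={slope:.2f} intercept={intercept:.2f} "
      f"resid_std={residuals.std():.2f}")
```

Checking that the recovered coefficients match the generating parameters (and that residuals look like the injected noise) is the basic sanity test behind any regression-based feature analysis.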
Nice to have
» Exposure to financial research domain
» Experience with JIRA, Confluence
» Understanding of scrum and Agile methodologies
» Basic understanding of NoSQL databases, such as MongoDB, Cassandra
» Experience with data visualization tools, such as Grafana, ggplot, etc.
Job Description for Data Engineer Role:
Must have:
Experience working with programming languages; solid foundational and conceptual knowledge is expected.
Experience working with databases and SQL optimization.
Experience as a team lead or tech lead, able to independently drive tech decisions and execution and motivate the team in ambiguous problem spaces.
Problem-solving, judgement, and strategic decision-making skills to drive the team forward.
Role and Responsibilities:
- Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal and external technology communities, and mentoring other members of the engineering community; from time to time, you may be asked to write or evaluate code
- Collaborate with digital product managers and leaders from other teams to refine the strategic needs of the project
- Utilize programming languages like Java, Python, SQL, Node.js, Go, and Scala, along with open-source RDBMS and NoSQL databases
- Define best practices for data validation, automating as much as possible and aligning with enterprise standards
Qualifications -
- Experience with SQL and NoSQL databases.
- Experience with cloud platforms, preferably AWS.
- Strong experience with data warehousing and data lake technologies (Snowflake)
- Expertise in data modelling
- Experience with ETL/ELT tools and methodologies
- Experience working on real-time data streaming and streaming platforms
- 2+ years of experience in at least one of the following: Java, Scala, Python, Go, or Node.js
- 2+ years working with SQL and NoSQL databases, data modeling and data management
- 2+ years of experience with AWS, GCP, Azure, or another cloud service.
We are seeking a talented UiPath Developer with experience in Python, SQL, Pandas, and NumPy to join our dynamic team. The ideal candidate will have hands-on experience developing RPA workflows using UiPath, along with the ability to automate processes through scripting, data manipulation, and database queries.
This role offers the opportunity to collaborate with cross-functional teams to streamline operations and build innovative automation solutions.
Key Responsibilities:
- Design, develop, and implement RPA workflows using UiPath.
- Build and maintain Python scripts to enhance automation capabilities.
- Utilize Pandas and NumPy for data extraction, manipulation, and transformation within automation processes.
- Write optimized SQL queries to interact with databases and support automation workflows.
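A typical data step in such an automation workflow filters in SQL (cheap and indexable) and reshapes in pandas (flexible). The sketch below uses an in-memory SQLite table with invented invoice columns purely for illustration.

```python
import sqlite3

import pandas as pd

# RPA data-step sketch: pull rows with a filtered, indexed SQL query,
# then aggregate with pandas. Table/column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE invoices (id INTEGER, vendor TEXT, amount REAL, status TEXT)")
conn.executemany("INSERT INTO invoices VALUES (?, ?, ?, ?)", [
    (1, "Acme", 1200.0, "pending"),
    (2, "Acme", 300.0, "paid"),
    (3, "Globex", 4500.0, "pending"),
])
conn.execute("CREATE INDEX idx_status ON invoices(status)")

# Filter in SQL (cheap), aggregate in pandas (flexible).
pending = pd.read_sql_query(
    "SELECT vendor, amount FROM invoices WHERE status = 'pending'", conn)
by_vendor = pending.groupby("vendor")["amount"].sum()
print(by_vendor.to_dict())  # {'Acme': 1200.0, 'Globex': 4500.0}
```

The resulting summary is the kind of structured payload a UiPath workflow would hand to its next activity (e.g., filling a form or drafting an email).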
Skills and Qualifications:
- 2 to 5 years of experience in UiPath development.
- Strong proficiency in Python and working knowledge of Pandas and NumPy.
- Good experience with SQL for database interactions.
- Ability to design scalable and maintainable RPA solutions using UiPath.
Job Overview:
We are seeking a skilled Senior Python Full Stack Developer with a strong background in application architecture and design. The ideal candidate will be proficient in Python, with extensive experience in web frameworks such as Django or Flask, along with front-end technologies such as React and JavaScript. You'll play a key role in designing scalable applications, collaborating with cross-functional teams, and leveraging cloud technologies.
Key Responsibilities:
- Backend Development:
  - Architect, develop, and maintain high-performance backend systems using Python or Golang.
  - Build and optimize APIs and microservices that power innovative, user-focused features.
  - Implement security and data protection measures that are scalable from day one.
  - Collaborate closely with DevOps to deploy and manage applications seamlessly in dynamic cloud environments.
- Frontend Development:
  - Work hand-in-hand with front-end developers to integrate and harmonize backend systems with React-based applications.
  - Contribute to the UI/UX design process, ensuring an intuitive, frictionless user experience that aligns with the startup’s vision.
  - Continuously optimize web applications to ensure they are fast, responsive, and scalable as the user base grows.
Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 7+ years of experience in backend development, with proficiency in Python and/or Golang.
- Strong experience in front-end technologies, particularly React.
- Familiarity with cloud platforms (AWS, GCP, or Azure) and containerization tools like Docker and Kubernetes.
- Knowledge of Apache Spark is highly preferred.
- Solid understanding of database technologies, both relational (e.g., PostgreSQL, MySQL) and NoSQL (e.g., MongoDB, Cassandra).
- Experience with CI/CD pipelines and automated testing frameworks.
- Excellent problem-solving skills and a proactive attitude toward tackling challenges.
at Sim Gems Group
- At least 6 to 8 years of hands-on experience in Python or a comparable modern stack such as Ruby on Rails, Node.js, Java, C++, or C#;
- Any frontend JavaScript framework such as React, Vue, Angular, or React Native;
- Any modern SQL database such as MySQL, Oracle, MS SQL, or Postgres.
- A sound foundation in data structures, algorithms, and system design.
- Sound technical leadership skills to mentor junior developers.
- Knowledge of ERP or supply chain applications is a huge plus.
- Candidates from service companies who are actively into coding are also welcome.
Our team excels in providing top-notch business solutions to industries such as E-commerce, Marketing, Banking and Finance, Insurance, and Transport, among many others. For a generation driven by data, insights, and decision-making, we help businesses make the best possible use of data and thrive in this competitive space. Our expertise spans Data, Analytics, and Engineering, to name a few.
● Strong programming skills in languages such as Java, Python, or C#.
● Experience with test automation tools (e.g., Selenium, Appium, TestNG) and CI/CD tools (e.g., Jenkins, GitLab CI).
Technical & Behavioural Skills
● Strong understanding of software testing methodologies, including functional, regression, and performance testing.
● Proficiency in scripting languages and test automation frameworks.
● Strong problem-solving skills and attention to detail.
● Excellent communication and collaboration skills, with the ability to work effectively in a
fast-paced, agile development environment.
● Experience with API testing tools (e.g., Postman, REST Assured) and performance
testing tools (e.g., JMeter, LoadRunner) is a plus.
Key Responsibilities
Test Automation Development:
● Design, develop, and maintain automated test frameworks, test scripts, and test cases
for functional, regression, and performance testing.
● Integrate automated tests into the CI/CD pipeline to provide continuous feedback on
product quality.
● Collaborate with developers to create and maintain testable software components and
interfaces.
Test Planning and Execution:
● Analyze product requirements and technical specifications to develop comprehensive
test plans and test cases.
● Execute automated and manual tests, analyze test results, and report defects and issues
to the development team.
● Ensure that test environments are properly configured and maintained.
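The automated regression checks described above usually boil down to small, assert-based test functions that a CI/CD pipeline runs on every commit. Here is a minimal sketch of that shape, written so it runs under Pytest or as a plain script; the function under test (`slugify`) is a stand-in for real application code.

```python
import re

def slugify(title: str) -> str:
    """Lowercase the input and join alphanumeric runs with hyphens (illustrative)."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

def test_slugify_basic():
    assert slugify("Hello, World!") == "hello-world"

def test_slugify_collapses_runs():
    # Mixed whitespace and punctuation collapse to a single separator.
    assert slugify("  Fast --- Paced  ") == "fast-paced"

if __name__ == "__main__":
    test_slugify_basic()
    test_slugify_collapses_runs()
    print("ok")
```

Because the tests are plain functions with bare asserts, a pipeline step as simple as `pytest -q` picks them up and reports failures as defects back to the development team.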
at Indee
About Indee
Indee is among the leading providers of a proprietary platform for secure video distribution and streaming, used by some of the world’s largest media companies, including Netflix, Paramount Pictures, Disney, and over 1100 other companies, big and small. Indee has grown 5x in the last 3 years and is scaling up at a rapid rate.
About the role
We are seeking a highly skilled and experienced Automation Engineer to join our dynamic team. As an Automation Engineer, you will play a key role in designing, implementing, and maintaining our automation testing framework. The primary focus of this role will be on utilizing Selenium, Pytest, Allure reporting, Python Requests, and Boto3 for automation testing and infrastructure management.
Responsibilities:
- Develop and maintain automated test scripts using Selenium WebDriver and Pytest to ensure the quality of web applications.
- Implement and enhance the automation testing framework to support scalability, reliability, and efficiency.
- Generate comprehensive test reports using Allure reporting for test result visualization and analysis.
- Conduct API testing using Python Requests, ensuring the functionality and reliability of backend services.
- Utilize Boto3 for automation of AWS infrastructure provisioning, configuration, and management.
- Collaborate with cross-functional teams, including developers, QA engineers, and DevOps engineers, to understand project requirements and deliver high-quality solutions.
- Identify opportunities for process improvement and optimization within the automation testing process.
- Provide technical expertise and guidance to junior team members, fostering a culture of continuous learning and development.
- Stay updated on industry trends and emerging technologies, incorporating them into our automation testing practices as appropriate.
- Participate in code reviews, ensuring adherence to coding standards and best practices.
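For the API-testing bullet above, the usual Python Requests + Pytest pattern is a thin helper that asserts on a response contract. The sketch below fakes the transport so the assertion shape is visible without a live service: `fake_get` stands in for `requests.get(...).json()`, and the endpoint path and response keys are invented for illustration.

```python
def fake_get(path):
    """Stand-in for requests.get(base_url + path).json(); returns a canned body."""
    return {"status": "ok", "items": [{"id": 1}, {"id": 2}]}

def assert_list_response(body, key="items"):
    """Check a common response contract: a status flag plus a list payload.
    Returns the item count so tests can assert on it directly."""
    assert body.get("status") == "ok"
    assert isinstance(body.get(key), list)
    return len(body[key])

count = assert_list_response(fake_get("/v1/titles"))
print(count)  # 2
```

In a real suite the fake would be replaced by a Requests session fixture, and a report plugin such as Allure would capture each assertion as a test step.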
Requirements:
- Strong programming skills in Python, with proficiency in writing clean, maintainable code.
- Experience with cloud infrastructure management and automation using AWS services and Boto3.
- Solid understanding of software testing principles, methodologies, and best practices.
- Excellent problem-solving skills and attention to detail.
- Ability to work effectively both independently and collaboratively in a fast-paced environment.
- Strong communication and interpersonal skills, with the ability to interact with stakeholders at all levels.
- Passion for technology and a desire to continuously learn and improve.
- Prior experience in Agile development methodologies.
- Experience with performance testing using Locust is considered a plus.
Qualifications:
- Education: Bachelor's degree in Computer Science, Software Engineering, or related field; Master’s degree preferred.
- Experience: 3 - 5 years of proven experience in automation testing using Selenium WebDriver, Pytest, Appium, Allure reporting, Python Requests, and Boto3
Benefits:
- Competitive salary and comprehensive benefits package.
- Opportunity to work with cutting-edge technologies and industry-leading experts.
- Flexible work environment with the option for remote work (hybrid).
- Professional development opportunities and support for continued learning.
- Dynamic and collaborative company culture with opportunities for growth and advancement.
If you are a highly motivated and skilled Automation Engineer looking to take the next step in your career, we encourage you to apply for this exciting opportunity to join our team at Indee. Help us drive innovation and shape the future of technology!
A leading data & analytics intelligence technology solutions provider to companies that value insights from information as a competitive advantage.
What are we looking for?
- Bachelor’s degree in analytics related area (Data Science, Computer Engineering, Computer Science, Information Systems, Engineering, or a related discipline)
- 7+ years of work experience in data science, analytics, engineering, or product management for a diverse range of projects
- Hands-on experience with Python and deploying ML models
- Hands-on experience with time-wise tracking of model performance and diagnosis of data/model drift
- Familiarity with Dataiku or other data-science-enabling tools (SageMaker, etc.)
- Demonstrated familiarity with distributed computing frameworks (Snowpark, PySpark)
- Experience working with various types of data (structured / unstructured)
- Deep understanding of all data science phases (e.g., data engineering, EDA, machine learning, MLOps, serving)
- Highly self-motivated to deliver both independently and with strong team collaboration
- Ability to creatively take on new challenges and work outside comfort zone
- Strong English communication skills (written & verbal)
Roles & Responsibilities:
- Clearly articulates expectations, capabilities, and action plans; actively listens with others’ frame of reference in mind; appropriately shares information with team; favorably influences people without direct authority
- Clearly articulates scope and deliverables of projects; breaks complex initiatives into detailed component parts and sequences actions appropriately; develops action plans and monitors progress independently; designs success criteria and uses them to track outcomes; drives implementation of recommendations when appropriate, engages with stakeholders throughout to ensure buy-in
- Manages projects with and through others; shares responsibility and credit; develops self and others through teamwork; comfortable providing guidance and sharing expertise with others to help them develop their skills and perform at their best; helps others take appropriate risks; communicates frequently with team members earning respect and trust of the team
- Experience in translating business priorities and vision into product/platform thinking, set clear directives to a group of team members with diverse skillsets, while providing functional & technical guidance and SME support
- Demonstrated experience interfacing with internal and external teams to develop innovative data science solutions
- Strong business analysis, product design, and product management skills
- Ability to work in a collaborative environment: reviewing peers' code, contributing to problem-solving sessions, and communicating technical knowledge to a variety of audiences (such as management, brand teams, data engineering teams, etc.)
- Ability to articulate model performance to a non-technical crowd, and ability to select appropriate evaluation criteria to evaluate hidden confounders and biases within a model
- MLOps frameworks and their use in model tracking and deployment, and automating the model serving pipeline
- Work with all sizes of ML models, from linear/logistic regression and other sklearn-style models to deep learning
- Formulate training schemas for unbiased model training (e.g., K-fold cross-validation, leave-one-out cross-validation) for parameter searching and model tuning
- Ability to work across machine learning applications (e.g., recommender systems, the end-to-end ML lifecycle)
- Ability to manage ML on heavily imbalanced training sets (<5% positive rate)
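Two of the ideas above, K-fold splitting and handling a heavily imbalanced label set, can be sketched without any libraries; in practice one would reach for scikit-learn's `KFold` and `compute_class_weight` instead. This is a conceptual illustration only.

```python
from collections import Counter

def kfold_indices(n, k):
    """Yield (train_idx, val_idx) pairs so every sample is validated exactly once."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        val = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n))
        yield train, val
        start += size

def class_weights(labels):
    """Inverse-frequency weights: rare classes get proportionally larger weights."""
    counts = Counter(labels)
    n, c = len(labels), len(counts)
    return {cls: n / (c * cnt) for cls, cnt in counts.items()}

folds = list(kfold_indices(10, 5))
# A ~5% positive rate, as mentioned above: the positives get weight 10.
weights = class_weights([0] * 95 + [1] * 5)
print(len(folds), weights[1])  # 5 10.0
```

Feeding such weights into a model's loss (e.g., `class_weight` in scikit-learn estimators) is one standard way to keep a 5%-positive dataset from collapsing into an always-negative predictor.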
Client based in Bangalore.
Data Scientist - Healthcare AI
Location: Bangalore, India
Experience: 4+ years
Skills Required: Radiology, visual images, text, classical models, LLM multi-modal
Responsibilities:
· LLM Development and Fine-tuning: Fine-tune and adapt large language models (e.g., GPT, Llama2, Mistral) for specific healthcare applications, such as text classification, named entity recognition, and question answering.
· Data Engineering: Collaborate with data engineers to build robust data pipelines for large-scale text datasets used in LLM training and fine-tuning.
· Model Evaluation and Optimization: Develop rigorous experimentation frameworks to assess model performance, identify areas for improvement, and inform model selection.
· Production Deployment: Work closely with MLOps and Data Engineering teams to integrate models into scalable production systems.
· Predictive Model Design: Leverage machine learning/deep learning and LLM methods to design, build, and deploy predictive models in oncology (e.g., survival models).
· Cross-functional Collaboration: Partner with product managers, domain experts, and stakeholders to understand business needs and drive the successful implementation of data science solutions.
· Knowledge Sharing: Mentor junior team members and stay up-to-date with the latest advancements in machine learning and LLMs.
Qualifications:
· Doctoral or master's degree in Computer Science, Data Science, Artificial Intelligence, or a related field.
· 5+ years of hands-on experience in designing, implementing, and deploying machine learning and deep learning models.
· 12+ months of in-depth experience working with LLMs. Proficiency in Python and NLP-focused libraries (e.g., spaCy, NLTK, Transformers, TensorFlow/PyTorch).
· Experience working with cloud-based platforms (AWS, GCP, Azure).
Preferred Qualifications:
o Experience working in the healthcare domain, particularly oncology.
o Publications in relevant scientific journals or conferences.
o Degree from a prestigious university or research institution.
Salary: INR 15 to INR 30 lakhs per annum
Performance Bonus: Up to 10% of the base salary can be added
Location: Bangalore or Pune
Experience: 2-5 years
About AbleCredit:
AbleCredit is on a mission to solve the Credit Gap of emerging economies. In India alone, the Credit Gap is over USD 5T (Trillion!). This is the single largest contributor to poverty, a poor Gini index, and a lack of opportunities. Our Vision is to deploy AI reliably and safely to solve some of the greatest problems of humanity.
Job Description:
This role is ideal for someone with a strong foundation in deep learning and hands-on experience with AI technologies.
- You will be tasked with solving complex, real-world problems using advanced machine learning models in a privacy-sensitive domain, where your contributions will have a direct impact on business-critical processes.
- As a Machine Learning Engineer at AbleCredit, you will collaborate closely with the founding team, who bring decades of industry expertise to the table.
- You’ll work on deploying cutting-edge Generative AI solutions at scale, ensuring they align with strict privacy requirements and optimize business outcomes.
This is an opportunity for experienced engineers to bring creative AI solutions to one of the most challenging and evolving sectors, while making a significant difference to the company’s growth and success.
Requirements:
- Experience: 2-4 years of hands-on experience in applying machine learning and deep learning techniques to solve complex business problems.
- Technical Skills: Proficiency in standard ML tools and languages, including:
- Python: Strong coding ability for building, training, and deploying machine learning models.
- PyTorch (or MLX or Jax): Solid experience in one or more deep learning frameworks for developing and fine-tuning models.
- Shell scripting: Familiarity with Unix/Linux shell scripting for automation and system-level tasks.
- Mathematical Foundation: Good understanding of the mathematical principles behind machine learning and deep learning (linear algebra, calculus, probability, optimization).
- Problem Solving: A passion for solving tough, ambiguous problems using AI, especially in data-sensitive, large-scale environments.
- Privacy & Security: Awareness and understanding of working in privacy-sensitive domains, adhering to best practices in data security and compliance.
- Collaboration: Ability to work closely with cross-functional teams, including engineers, product managers, and business stakeholders, and communicate technical ideas effectively.
- Work Experience: This position is for experienced candidates only.
Additional Information:
- Location: Pune or Bangalore
- Work Environment: Collaborative and entrepreneurial, with close interactions with the founders.
- Growth Opportunities: Exposure to large-scale AI systems, GenAI, and working in a data-driven privacy-sensitive domain.
- Compensation: Competitive salary and ESOPs, based on experience and performance
- Industry Impact: You’ll be at the forefront of applying Generative AI to solve high-impact problems in the finance/credit space, helping shape the future of AI in the business world.
Overview
We're looking for a mid-to-senior-level data analyst who can create underwriting methodologies based on users' financial data, assist with regular evaluation and reporting, and help us continuously improve the product. The ideal candidate has previous experience at a strong product-based startup, is a go-getter, has helped launch different product functionalities, and has experience stabilizing them.
Roles and Responsibilities
- Analytical Skills, Data Analytics, and Statistics
- Strong communication skills
- Proficiency in SQL, Python, and Excel is mandatory.
- Attention to detail and ability to work with large datasets
- Degree in Mathematics, Statistics, Computer Science, or related field
- Relevant work experience in the finance industry is a plus
- Strong proficiency in MS Excel and knowledge of data visualization tools such as Redash.
- ML/Statistical model or algorithm is a bonus
- A good understanding of the product life cycle and product-based analytical experience is a must
- The ability to communicate with cross-functional teams and help the stakeholders with the right insights is a must
- Ability to thrive in a fast-paced environment of start-up is a must
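The SQL + Python workflow this role calls for often looks like the small, self-contained example below: aggregating repayment behaviour per borrower with SQLite (Python's stdlib), standing in for a production warehouse. The table and column names are invented for illustration; they are not from any real schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (borrower TEXT, amount REAL, repaid INTEGER)")
conn.executemany(
    "INSERT INTO loans VALUES (?, ?, ?)",
    [("a", 100.0, 1), ("a", 200.0, 0), ("b", 50.0, 1)],
)

# Repayment rate per borrower: a typical underwriting feature.
rows = conn.execute(
    "SELECT borrower, AVG(repaid) AS repay_rate "
    "FROM loans GROUP BY borrower ORDER BY borrower"
).fetchall()
print(rows)  # [('a', 0.5), ('b', 1.0)]
```

The same `GROUP BY` query runs unchanged against most warehouses, and the resulting feature table is what downstream underwriting models or Redash dashboards would consume.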
A leading Data & Analytics intelligence technology solutions provider to companies that value insights from information as a competitive advantage.
What are we looking for?
- Bachelor's degree in computer science, computer engineering, or related field.
- 4+ years of experience as a Python developer.
- Expert knowledge of Python and related frameworks including FastAPI/Flask/Django
- Experience in designing, developing, and managing cloud-based (AWS/GCP) infrastructure and applications.
- Good to have knowledge on Docker & CI-CD pipelines
- Good understanding of relational databases (e.g., MySQL, PostgreSQL)
- Ability to integrate multiple data sources into a single system.
- Ability to collaborate on projects and work independently when required.
- Excellent problem solving and communication abilities, to be able to solve complex problems that may arise during the development process.
Roles & Responsibilities:
- Developing applications using the Python programming language.
- Involvement in all aspects of the software development life cycle, from requirements gathering to testing and deployment.
- Writing clean, scalable & efficient code
- Integrating user-facing elements developed by front-end developers with server-side logic
- Building reusable code libraries for future use
- Working closely with other members of the development team, as well as customers or clients to ensure that applications are developed according to specifications.
- Testing applications thoroughly before deployment to ensure they are free of errors.
- Deploying applications and providing support after deployment, if necessary.
- Assisting senior developers in mentoring junior staff members
- Updating software programs as new versions become available.
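One concrete flavour of the "reusable code libraries" bullet above is a shared retry helper that any data-source integration can use. The sketch below is purely illustrative; production code would typically add jitter, logging, and a bounded backoff.

```python
import functools
import time

def retry(times=3, delay=0.0, exceptions=(Exception,)):
    """Decorator: re-invoke the wrapped function up to `times` attempts,
    re-raising the last exception if all attempts fail."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            for attempt in range(1, times + 1):
                try:
                    return fn(*args, **kwargs)
                except exceptions:
                    if attempt == times:
                        raise
                    time.sleep(delay)
        return inner
    return wrap

calls = {"n": 0}

@retry(times=3)
def flaky():
    """Simulates a data source that fails twice before succeeding."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

print(flaky(), calls["n"])  # ok 3
```

Wrapping each upstream call this way lets the integration code stay focused on mapping data, while transient-failure handling lives in one tested place.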
WHO WE ARE:
TIFIN is a fintech platform backed by industry leaders including JP Morgan, Morningstar, Broadridge, Hamilton Lane, Franklin Templeton, Motive Partners and a who’s who of the financial service industry. We are creating engaging wealth experiences to better financial lives through AI and investment intelligence powered personalization. We are working to change the world of wealth in ways that personalization has changed the world of movies, music and more but with the added responsibility of delivering better wealth outcomes.
We use design and behavioral thinking to enable engaging experiences through software and application programming interfaces (APIs). We use investment science and intelligence to build algorithmic engines inside the software and APIs to enable better investor outcomes.
In a world where every individual is unique, we match them to financial advice and investments with a recognition of their distinct needs and goals across our investment marketplace and our advice and planning divisions.
OUR VALUES: Go with your GUT
- Grow at the Edge. We are driven by personal growth. We get out of our comfort zone and keep egos aside to find our genius zones. With self-awareness and integrity we strive to be the best we can possibly be. No excuses.
- Understanding through Listening and Speaking the Truth. We value transparency. We communicate with radical candor, authenticity and precision to create a shared understanding. We challenge, but once a decision is made, commit fully.
- I Win for Teamwin. We believe in staying within our genius zones to succeed and we take full ownership of our work. We inspire each other with our energy and attitude. We fly in formation to win together.
Responsibilities:
- Develop user-facing features such as web apps and landing portals.
- Ensure the feasibility of UI/UX designs and implement them technically.
- Create reusable code and libraries for future use.
- Optimize applications for speed and scalability.
- Contribute to the entire implementation process, including defining improvements based on business needs and architectural enhancements.
- Promote coding, testing, and deployment of best practices through research and demonstration.
- Review frameworks and design principles for suitability in the project context.
- Demonstrate the ability to identify opportunities, lay out rational plans, and see them through to completion.
Requirements:
- Bachelor’s degree in Engineering with 10+ years of software product development experience.
- Proficiency in React, Django, Pandas, GitHub, AWS, JavaScript, and Python.
- Strong knowledge of PostgreSQL, MongoDB, and designing REST APIs.
- Experience with scalable interactive web applications.
- Understanding of software design constructs and implementation.
- Familiarity with ORM libraries and Test-Driven Development.
- Exposure to the Finance domain is preferred.
- Knowledge of HTML5, LESS/CSS3, jQuery, and Bootstrap.
- Expertise in JavaScript fundamentals and front-end/back-end technologies.
Nice to Have:
- Strong knowledge of website security and common vulnerabilities.
- Exposure to financial capital markets and instruments.
Compensation and Benefits Package:
- Competitive compensation with a discretionary annual bonus.
- Performance-linked variable compensation.
- Medical insurance.
A note on location: while we have team centers in Boulder, New York City, San Francisco, Charlotte, and Mumbai, this role is based out of Bangalore.
TIFIN is an equal-opportunity workplace, and we value diversity in our workforce. All qualified applicants will receive consideration for employment without regard to any discrimination.
Job Description:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 7+ years of experience in backend development, with proficiency in Python and/or Golang.
- Strong experience in front-end technologies, particularly React.
- Familiarity with cloud platforms (AWS, GCP, or Azure) and containerization tools like Docker and Kubernetes.
- Knowledge of Apache Spark is highly preferred.
- Solid understanding of database technologies, both relational (e.g., PostgreSQL, MySQL) and NoSQL (e.g., MongoDB, Cassandra).
- Experience with CI/CD pipelines and automated testing frameworks.
- Excellent problem-solving skills and a proactive attitude toward tackling challenges
🚀 We're Hiring: Python AWS Fullstack Developer at InfoGrowth! 🚀
Join InfoGrowth as a Python AWS Fullstack Developer and be a part of our dynamic team driving innovative cloud-based solutions!
Job Role: Python AWS Fullstack Developer
Location: Bangalore & Pune
Mandatory Skills:
- Proficiency in Python programming.
- Hands-on experience with AWS services and migration.
- Experience in developing cloud-based applications and pipelines.
- Familiarity with DynamoDB, OpenSearch, and Terraform (preferred).
- Solid understanding of front-end technologies: ReactJS, JavaScript, TypeScript, HTML, and CSS.
- Experience with Agile methodologies, Git, CI/CD, and Docker.
- Knowledge of Linux (preferred).
Preferred Skills:
- Understanding of ADAS (Advanced Driver Assistance Systems) and automotive technologies.
- AWS Certification is a plus.
Why Join InfoGrowth?
- Work on cutting-edge technology in a fast-paced environment.
- Collaborate with talented professionals passionate about driving change in the automotive and tech industries.
- Opportunities for professional growth and development through exciting projects.
🔗 Apply Now to elevate your career with InfoGrowth and make a difference in the automotive sector!
Job Role: Adaptive Autosar + Bootloader Developer
Mandatory Skills:
- Adaptive Autosar Development
- Bootloader Experience
- C++ Programming
- Hands-on experience with ISO 14229 (UDS Protocol)
- Experience in Flash Bootloader and Software Update topics
- Proficient in C++ and Python programming
- Application development experience in Service-Oriented Architectures
- Hands-on experience with QNX and Linux operating systems
- Familiarity with software development tools like CAN Analyzer, CANoe, and Debugger
- Strong problem-solving skills and the ability to work independently
- Exposure to the ASPICE Process is an advantage
- Excellent analytical and communication skills
Job Responsibilities:
- Engage in tasks related to the integration and development of Flash Bootloader (FBL) features and perform comprehensive testing activities.
- Collaborate continuously with counterparts in Germany to understand requirements and develop FBL features effectively.
- Create test specifications and meticulously document testing results.
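The ISO 14229 (UDS) convention that bootloader work builds on is compact enough to sketch: a positive response echoes the request's service ID plus 0x40, while a negative response is 0x7F followed by the original SID and a negative-response code. The snippet below shows only that request/response framing; the transport underneath (CAN-TP, DoIP) is out of scope, and the helper names are illustrative.

```python
POSITIVE_OFFSET = 0x40  # positive response SID = request SID + 0x40
NEGATIVE_SID = 0x7F     # negative responses start with 0x7F

def diagnostic_session_request(sub_function):
    """Build a DiagnosticSessionControl (0x10) request,
    e.g. sub-function 0x02 = programming session (used before flashing)."""
    return bytes([0x10, sub_function])

def classify_response(request, response):
    if response and response[0] == request[0] + POSITIVE_OFFSET:
        return "positive"
    if len(response) >= 3 and response[0] == NEGATIVE_SID and response[1] == request[0]:
        return f"negative (NRC 0x{response[2]:02X})"
    return "unexpected"

req = diagnostic_session_request(0x02)
print(classify_response(req, bytes([0x50, 0x02])))        # positive
print(classify_response(req, bytes([0x7F, 0x10, 0x22])))  # negative (NRC 0x22)
```

Test specifications for FBL features typically enumerate exactly these cases: the expected positive response per service, and the negative-response codes (e.g., 0x22 conditionsNotCorrect) the ECU must return when preconditions fail.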
Why Join InfoGrowth?
- Become part of an innovative team focused on transforming the automotive industry with cutting-edge technology.
- Work on exciting projects that challenge your skills and promote professional growth.
- Enjoy a collaborative environment that values teamwork and creativity.
🔗 Apply Now to shape the future of automotive technology with InfoGrowth!
Roles and Responsibilities:
- Python Full Stack Development experience is preferred.
- Able to create modern data pipelines and data processing using AWS PaaS components (Glue, SageMaker Studio, etc.) or open-source tools (Spark, HBase, Hive, etc.).
- A good understanding of ML/AI algorithms and statistical algorithms is mandatory.
- Critical thinking and problem-solving are essential.
- Ability to model and design modern data structures, SQL/NoSQL databases, data lakes, and cloud data warehouses (Snowflake preferred).
- Experience with data stores such as DynamoDB, Redis, Elasticsearch, MySQL, Oracle, and AWS RDS.
- Deploying software using CI/CD tools such as Azure DevOps, Jenkins, etc.
- Experience with API tools such as REST, Swagger, Postman.
- Working in Agile Framework.
- Previous work experience in a fintech/finance product-based industry is a bonus.
- Previous work on churn models is a bonus.
🚀 We're Hiring: Test Engineer at InfoGrowth 🚀
Are you an experienced Test Engineer with a passion for automotive embedded systems and automation testing? Join InfoGrowth, a leader in IT and cloud services, and become a part of a dynamic team focused on cutting-edge automotive technology solutions.
Key Responsibilities:
- Design and implement test automation solutions for embedded products in the automotive industry.
- Architect efficient test automation strategies from software to integrated system tests on virtual and physical embedded targets.
- Offer automation testing as a platform for development teams, enabling them to run CI/CD pipeline tests independently.
- Lead in the design and operation of test automation platforms, ensuring top-quality testing efficiency.
- Conduct both manual and automated testing at various system levels, including SW, BaseTech, ADAS, Brake, Suspension, Steering, and Connectivity.
Key Skills:
- Proven expertise in test automation for embedded products in the automotive domain.
- Proficient in Python programming, with experience in Robot Frameworks being a strong plus.
- Hands-on experience with Vector tools such as CANalyzer, CANoe, and CAPL.
- Solid understanding of communication protocols like CAN, LIN, FlexRay, and Ethernet.
- Strong knowledge of HIL (Hardware-in-the-loop) and SIL (Software-in-the-loop) technologies.
- Deep understanding of embedded software testing.
Preferred Experience Areas:
- Testing and implementation of automated/manual testing on Complete Vehicle Electronics System level (SWDL, network management, customer functions).
- Boxcar and HIL testing for critical automotive functions, including ADAS and Steering functionalities.
- Testing and implementing strategies for the Connectivity and system-level automotive technologies.
Job Description
We are looking for a talented Java Developer to work abroad. You will be responsible for developing high-quality software solutions, working on both server-side components and integrations, and ensuring optimal performance and scalability.
Preferred Qualifications
- Experience with microservices architecture.
- Knowledge of cloud platforms (AWS, Azure).
- Familiarity with Agile/Scrum methodologies.
- Understanding of front-end technologies (HTML, CSS, JavaScript) is a plus.
Requirement Details
Bachelor’s degree in Computer Science, Information Technology, or a related field (or equivalent experience).
Proven experience as a Java Developer or similar role.
Strong knowledge of Java programming language and its frameworks (Spring, Hibernate).
Experience with relational databases (e.g., MySQL, PostgreSQL) and ORM tools.
Familiarity with RESTful APIs and web services.
Understanding of version control systems (e.g., Git).
Solid understanding of object-oriented programming (OOP) principles.
Strong problem-solving skills and attention to detail.
Key Responsibilities
• Lead the automation testing effort of our cloud management platform.
• Create and maintain automation test cases and test suites.
• Work closely with the development team to ensure that the automation tests are integrated into the development process.
• Collaborate with other QA team members to identify and resolve defects.
• Implement automation testing best practices and continuously improve the automation testing framework.
• Develop and maintain automation test scripts using programming languages such as Python.
• Conduct performance testing using tools such as JMeter, Gatling, or Locust.
• Monitor and report on automation testing and performance testing progress and results.
• Ensure that the automation testing and performance testing strategy aligns with overall product quality goals and objectives.
• Manage and mentor a team of automation QA engineers.
Requirements
• Bachelor's degree in Computer Science or a related field.
• 8+ years of experience in automation testing and performance testing.
• Experience in leading and managing automation testing teams.
• Strong experience with automation testing frameworks including Robot Framework.
• Strong experience with programming languages, including Python.
• Strong understanding of software development lifecycle and agile methodologies.
• Experience with testing cloud-based applications.
• Good understanding of Cloud services & ecosystem, specifically AWS.
• Experience with performance testing tools such as JMeter, Gatling, or Locust.
• Excellent analytical and problem-solving skills.
• Excellent written and verbal communication skills.
• Ability to work independently and in a team environment.
• Passionate about automation testing and performance testing.
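Tools like JMeter, Gatling, and Locust all reduce to the same idea: fire requests concurrently and summarize latency percentiles. A minimal stdlib sketch of that idea follows; the function names are illustrative and not from any of those tools.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def measure_latencies(request_fn, num_requests=50, concurrency=10):
    """Call `request_fn` concurrently and report latency percentiles (seconds)."""
    def timed_call(_):
        start = time.perf_counter()
        request_fn()                     # e.g. an HTTP GET against the system under test
        return time.perf_counter() - start

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = sorted(pool.map(timed_call, range(num_requests)))

    return {
        "p50": latencies[len(latencies) // 2],
        "p95": latencies[int(len(latencies) * 0.95)],
        "max": latencies[-1],
    }
```

A real performance test would additionally ramp load, hold steady state, and assert the percentiles against an SLA, which is exactly what the dedicated tools automate.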
Our client is a leading Data & Analytics intelligence technology solutions provider to companies that value insights from information as a competitive advantage. We are the partner of choice for enterprises on their digital transformation journey. Our teams offer solutions and services at the intersection of Advanced Data, Analytics, and AI.
Skills: ITSM methodologies, Python, Snowflake, and AWS. Must also be open to 18x5 support.
Notice period: immediate to 30 days
• Bachelor’s degree in Computer Science, Software Engineering, or a related field.
• 5+ years of hands-on experience with ITSM methodologies.
• 3+ years of experience in SQL, Snowflake, and Python development.
• 2+ years of hands-on experience with Snowflake DW.
• Good communication and client/stakeholder management skills.
• Willing to work across multiple time zones and to manage offshore teams.
Our client is a leading Data & Analytics intelligence technology solutions provider to companies that value insights from information as a competitive advantage. We are the partner of choice for enterprises on their digital transformation journey. Our teams offer solutions and services at the intersection of Advanced Data, Analytics, and AI.
Skills: Python, FastAPI, AWS/GCP/Azure
Location - Bangalore / Mangalore (Hybrid)
Notice period: immediate to 20 days
• Experience in building Python-based, utility-scale enterprise APIs with QoS/SLA-based specs, building upon cloud APIs from GCP, AWS, and Azure.
• Exposure to multi-modal (text, audio, video) development in synchronous and batch mode in high-volume use cases, leveraging queuing, pooling, and enterprise scaling patterns.
• Solid understanding of the API life cycle, including versioning (e.g., parallel deployment of multiple versions) and exception management.
• Working experience (development and/or troubleshooting) with enterprise-scale AWS and CI/CD leveraging GitHub Actions-based workflows.
• Solid knowledge of developing/updating enterprise CloudFormation templates for Python-centric code assets, along with quality/security tooling.
• Design/support tracing/monitoring capability (X-Ray, AWS Distro for OpenTelemetry) for Fargate services.
• Responsible and able to communicate requirements.
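The versioning requirement above (parallel deployment of multiple API versions) can be sketched in a few lines: hypothetical v1 and v2 handlers stay deployed side by side while a dispatcher routes by version. This is a simplification of what FastAPI routers or API-gateway stage mappings do; all names here are invented for illustration.

```python
def make_versioned_api(handlers):
    """Route requests to version-specific handlers so multiple API versions
    can be served in parallel during a migration window."""
    def dispatch(version, payload):
        if version not in handlers:
            raise LookupError(f"unsupported API version: {version}")
        return handlers[version](payload)
    return dispatch

# Hypothetical handlers: v2 renames a response field, but v1 stays deployed
# until every client has migrated.
def greet_v1(payload):
    return {"msg": f"hello {payload['name']}"}

def greet_v2(payload):
    return {"message": f"hello {payload['name']}", "version": 2}

api = make_versioned_api({"v1": greet_v1, "v2": greet_v2})
```

Keeping both handlers live lets clients upgrade on their own schedule, which is the point of parallel version deployment.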
Our client is a leading Data & Analytics intelligence technology solutions provider to companies that value insights from information as a competitive advantage. We are the partner of choice for enterprises on their digital transformation journey.
Skills: Python / Node, JavaScript, Any bot platform with hands on experience in Chatbot / Voice Bot & IVR
Notice period - Immediate to 15 days
• Minimum 7+ years of solid product/application development experience; should have worked on open-source platforms.
• Hands-on experience working with the Kore.ai Platform.
• Excellent spoken and written communication skills.
• Strong expertise in Java, Python, or Node.js.
• Skill set should include JSON, Redis/MongoDB, and WebSockets.
• Proven experience as a Full Stack Developer or in a similar role.
• Knowledge of AI, NLP, ML, and chatbots is an added advantage.
• Experience with web services using REST and SOAP.
• Knowledge of different authentication and authorization techniques.
• Hands-on experience in server-side programming, independent of technology.
• Strong knowledge of code documentation and handover.
• Exposure to any of the cloud platforms: AWS/GCP/Azure.
• UI-side development experience in any one of JavaScript, Angular.js, React.js, or Vue.js.
• The initiative to scale up and develop sufficient expertise on our products with the internal training provided.
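To make the chatbot skill set concrete, here is a minimal, framework-free sketch of intent routing over a JSON chat message. Real platforms such as Kore.ai use trained NLU models rather than the naive substring matching below; all names and keywords are illustrative.

```python
import json

# Toy intent table; a production bot would learn these from labelled utterances.
INTENT_KEYWORDS = {
    "greeting": ("hi ", "hello", "hey"),
    "balance": ("balance", "account"),
    "agent": ("human", "agent", "representative"),
}

def route_message(raw: str) -> str:
    """Map an incoming JSON chat message to an intent via keyword matching.

    Naive substring matching is used here only to keep the sketch short;
    it misroutes easily (e.g. 'this' contains 'hi') and is not production logic.
    """
    text = json.loads(raw)["text"].lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(word in text for word in keywords):
            return intent
    return "fallback"
```

The same shape (parse message, classify intent, dispatch a handler) underlies chat, voice bot, and IVR flows alike.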
Our client is a leading Data & Analytics intelligence technology solutions provider to companies that value insights from information as a competitive advantage. We are the partner of choice for enterprises on their digital transformation journey. Our teams offer solutions and services at the intersection of Advanced Data, Analytics, and AI. These include Data and Analytics CoE roadmap and buildout, Realtime Data Insights, and Advisory services. We help companies monetize their data assets and generate real-time actionable insights that drive growth and optimize business operations.
Mandatory Skills: Python / Node, JavaScript, Any bot platform with hands on experience in Chatbot / Voice Bot & IVR
Work Mode - Hybrid
Notice Period - Immediate to 15 days
Role: Data Analyst (Apprentice - 1 Year contract)
Location: Bangalore, Hybrid Model
Job Summary: Join our team as an Apprentice for one year and take the next step in your career while supporting FOX’s unique corporate services and Business Intelligence (BI) platforms. This role offers a fantastic opportunity to leverage your communication skills, technical expertise, analytical and problem-solving competencies, and customer-focused experience.
Key Responsibilities:
· Assist in analyzing and solving data-related challenges.
· Support the development and maintenance of corporate service and BI platforms.
· Collaborate with cross-functional teams to enhance user experience and operational efficiency.
· Participate in training sessions to further develop technical and analytical skills.
· Conduct research and analysis to identify trends and insights in data.
· Prepare reports and presentations to communicate findings to stakeholders.
· Engage with employees to understand their needs and provide support.
· Contribute insights and suggestions during team meetings to drive continuous improvement.
Qualifications:
· Bachelor’s degree in Engineering (2024 pass-out).
· Strong analytical and technical skills with attention to detail.
· Excellent communication skills, both verbal and written.
· Ability to work collaboratively in a team-oriented environment.
· Proactive attitude and a strong willingness to learn.
· Familiarity with data analysis tools and software (e.g., Excel, SQL, Tableau) is a plus.
· Basic understanding of programming languages (e.g., Python, R) is an advantage.
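For a flavor of the day-to-day analysis work, here is a small stdlib sketch that summarizes one numeric column of a CSV, the kind of step usually done in Excel, SQL, or pandas. The column name and sample data are made up for illustration.

```python
import csv
import io
import statistics

def summarize_column(csv_text: str, column: str):
    """Compute basic summary statistics for one numeric CSV column."""
    rows = csv.DictReader(io.StringIO(csv_text))
    values = [float(row[column]) for row in rows]
    return {
        "count": len(values),
        "mean": statistics.mean(values),
        "median": statistics.median(values),
    }

# Hypothetical sales extract.
sample = "region,sales\nnorth,10\nsouth,20\neast,30\n"
```

The same summary would be one `GROUP BY` away in SQL or one `describe()` call in pandas; the point is the habit of profiling data before reporting on it.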
Additional Information:
- This position offers a hybrid work model, allowing flexibility between remote and in-office work.
- Opportunities for professional development and skill enhancement through buddy and mentorship programs.
- Exposure to real-world projects and the chance to contribute to impactful solutions.
- A supportive and inclusive team environment that values diverse perspectives.
at Juntrax Solutions
Software Development QA Engineer (Automation and Manual)
Juntrax Solutions is an SF/Bengaluru-based company developing and innovating SaaS-based products. We are currently developing a brand-new product, so this is a great opportunity to be part of the product lifecycle from the start and to join the core team.
Roles and Responsibilities
- Understand and update project requirements documents as necessary
- Create and execute test cases.
- Design, develop and execute automated test scripts for regression testing.
- Test, report, and manage defect pipeline and communicate with the team.
- Collaborate with the design and development team to provide input as it relates to the requirements
- Setup test automation frameworks for multiple application platforms like Web and Mobile. Manage and execute test scripts on these frameworks.
- Investigate customer problems referred by the technical support team.
- Build different test scenarios and validate acceptance test cases.
- Handle technical communications with stakeholders and team
- Bachelor's in Computer Science or Engineering, and a minimum of 1-2 years of industry experience as a Software Development Test Engineer.
- Must have excellent verbal and written communication skills.
- Testing materials created (test cases, plans, test strategies, bug reports) should be easy to read and comprehend.
- Must efficiently manage multiple tasks, with high productivity and strong time-management skills.
- Should be able to upgrade your technical skills with changing technologies.
- Able to work independently and take ownership of the product testing role.
- Should have a product mindset and the spirit to work in an early-stage start-up.
- Should have previously set up an automation framework.
Other
- Must be able to work in Bangalore.
- Salary based on qualifications and experience.
- Excellent growth opportunity
at Wissen Technology
Job Requirements:
Intermediate Linux Knowledge
- Experience with shell scripting
- Familiarity with Linux commands such as grep, awk, sed
- Required
Advanced Python Scripting Knowledge
- Strong expertise in Python
- Required
Ruby
- Nice to have
Basic Knowledge of Network Protocols
- Understanding of TCP/UDP, Multicast/Unicast
- Required
Packet Captures
- Experience with tools like Wireshark, tcpdump, tshark
- Nice to have
High-Performance Messaging Libraries
- Familiarity with tools like Tibco, 29West, LBM, Aeron
- Nice to have
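The TCP/UDP requirement can be illustrated with a short loopback example: UDP is connectionless, so a datagram is simply addressed and sent, then read on the receiving socket. A minimal stdlib sketch follows (multicast would additionally require joining a group via `setsockopt`; the function name is illustrative).

```python
import socket

def udp_echo_once(message: bytes) -> bytes:
    """Send one datagram to a loopback UDP socket and read it back,
    showing the connectionless send/receive cycle of UDP."""
    receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    receiver.bind(("127.0.0.1", 0))      # port 0: let the OS pick a free port
    receiver.settimeout(2.0)             # avoid hanging if the datagram is lost
    addr = receiver.getsockname()

    sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sender.sendto(message, addr)         # no connection handshake, unlike TCP

    data, _peer = receiver.recvfrom(4096)
    sender.close()
    receiver.close()
    return data
```

Tools like tcpdump or Wireshark would show this as a single UDP packet on the loopback interface, which is a handy way to cross-check socket code against captures.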
• Proven experience as a Linux Platform Engineer, with a focus on Red Hat Enterprise Linux (RHEL) preferred.
• Proficient in Python and Ansible scripting.
• Demonstrated expertise in Linux OS configuration management.
• Experience with RHEL LEAPP upgrades.
• Experience with data visualization tools such as Splunk and Grafana.
• Experience with ServiceNow.
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration skills for working in a team-oriented environment.
• Red Hat Certified Engineer (RHCE) certification for RHEL8 or equivalent
• Willingness to adapt to evolving technologies and embrace continuous learning.
• Knowledge of financial services, including regulations and security standards, is preferred.
• Stay current with industry trends and best practices in Linux, financial services, and cybersecurity.
- Responsible for designing, storing, processing, and maintaining large-scale data and related infrastructure.
- Can drive multiple projects from both operational and technical standpoints.
- Ideate and build PoVs or PoCs for new products that can help drive more business.
- Responsible for defining, designing, and implementing data engineering best practices, strategies, and solutions.
- Is an Architect who can guide the customers, team, and overall organization on tools, technologies, and best practices around data engineering.
- Lead architecture discussions, align with business needs, security, and best practices.
- Has strong conceptual understanding of Data Warehousing and ETL, Data Governance and Security, Cloud Computing, and Batch & Real Time data processing
- Has strong execution knowledge of Data Modeling, Databases in general (SQL and NoSQL), software development lifecycle and practices, unit testing, functional programming, etc.
- Understanding of Medallion architecture pattern
- Has worked on at least one cloud platform.
- Has worked as a data architect and executed multiple end-to-end data engineering projects.
- Has extensive knowledge of different data architecture designs and data modelling concepts.
- Manages conversation with the client stakeholders to understand the requirement and translate it into technical outcomes.
Required Tech Stack
- Strong proficiency in SQL
- Experience working on any of the three major cloud platforms i.e., AWS/Azure/GCP
- Working knowledge of an ETL and/or orchestration tools like IICS, Talend, Matillion, Airflow, Azure Data Factory, AWS Glue, GCP Composer, etc.
- Working knowledge of one or more OLTP databases (Postgres, MySQL, SQL Server, etc.)
- Working knowledge of one or more Data Warehouse like Snowflake, Redshift, Azure Synapse, Hive, Big Query, etc.
- Proficient in at least one programming language used in data engineering, such as Python (or Scala/Rust/Java)
- Has strong execution knowledge of Data Modeling (star schema, snowflake schema, fact vs dimension tables)
- Proficient in Spark and related applications like Databricks, GCP DataProc, AWS Glue, EMR, etc.
- Has worked on Kafka and real-time streaming.
- Has strong execution knowledge of data architecture design patterns (lambda vs kappa architecture, data harmonization, customer data platforms, etc.)
- Has worked on code and SQL query optimization.
- Strong knowledge of version control systems like Git to manage source code repositories and designing CI/CD pipelines for continuous delivery.
- Has worked on data and networking security (RBAC, secret management, key vaults, vnets, subnets, certificates)
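The star-schema point above (fact vs dimension tables) can be sketched in a few lines of SQL. Here sqlite3 stands in for a real warehouse such as Snowflake or BigQuery, and the tables and figures are invented for illustration: one fact table of sales events joined to product and date dimensions.

```python
import sqlite3

# In-memory toy warehouse: one fact table referencing two dimension tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, month TEXT);
    CREATE TABLE fact_sales (
        product_id INTEGER REFERENCES dim_product(product_id),
        date_id INTEGER REFERENCES dim_date(date_id),
        amount REAL
    );
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO dim_date VALUES (1, 'Jan'), (2, 'Feb');
    INSERT INTO fact_sales VALUES (1, 1, 100.0), (1, 2, 150.0), (2, 1, 90.0);
""")

def monthly_revenue(month: str) -> float:
    """Aggregate the fact table through the date dimension."""
    row = conn.execute("""
        SELECT SUM(f.amount)
        FROM fact_sales f
        JOIN dim_date d ON f.date_id = d.date_id
        WHERE d.month = ?
    """, (month,)).fetchone()
    return row[0]
```

The design choice a star schema makes is exactly this: keep facts narrow and numeric, push descriptive attributes into dimensions, and answer business questions with simple joins.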
Job Description: AI/ML Engineer
Location: Bangalore (On-site)
Experience: 2+ years of relevant experience
About the Role:
We are seeking a skilled and passionate AI/ML Engineer to join our team in Bangalore. The ideal candidate will have over two years of experience in developing, deploying, and maintaining AI and machine learning models. As an AI/ML Engineer, you will work closely with our data science team to build innovative solutions and deploy them in a production environment.
Key Responsibilities:
- Develop, implement, and optimize machine learning models.
- Perform data manipulation, exploration, and analysis to derive actionable insights.
- Use advanced computer vision techniques, including YOLO and other state-of-the-art methods, for image processing and analysis.
- Collaborate with software developers and data scientists to integrate AI/ML solutions into the company's applications and products.
- Design, test, and deploy scalable machine learning solutions using TensorFlow, OpenCV, and other related technologies.
- Ensure the efficient storage and retrieval of data using SQL and data manipulation libraries such as pandas and NumPy.
- Contribute to the development of backend services using Flask or Django for deploying AI models.
- Manage code using Git and containerize applications using Docker when necessary.
- Stay updated with the latest advancements in AI/ML and integrate them into existing projects.
Required Skills:
- Proficiency in Python and its associated libraries (NumPy, pandas).
- Hands-on experience with TensorFlow for building and training machine learning models.
- Strong knowledge of linear algebra and data augmentation techniques.
- Experience with computer vision libraries like OpenCV and frameworks like YOLO.
- Proficiency in SQL for database management and data extraction.
- Experience with Flask for backend development.
- Familiarity with version control using Git.
Optional Skills:
- Experience with PyTorch, Scikit-learn, and Docker.
- Familiarity with Django for web development.
- Knowledge of GPU programming using CuPy and CUDA.
- Understanding of parallel processing techniques.
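As a tiny illustration of the data-augmentation skill listed above, a horizontal flip is one of the simplest label-preserving transforms used to enlarge a vision training set. A stdlib sketch on nested-list "images" follows; real pipelines would use NumPy or TensorFlow ops, and the function names are illustrative.

```python
def horizontal_flip(image):
    """Flip a row-major image (nested lists of pixel values) left-to-right."""
    return [row[::-1] for row in image]

def augment(dataset):
    """Double a labelled dataset by adding a flipped copy of every image.

    The label is unchanged because a mirrored cat is still a cat; transforms
    that do change the label (e.g. flipping digit images) must be avoided.
    """
    return dataset + [(horizontal_flip(img), label) for img, label in dataset]
```

In training code this would typically be applied on the fly per batch rather than materialized, but the effect on model generalization is the same idea.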
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Demonstrated experience in AI/ML, with a portfolio of past projects.
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork skills.
Why Join Us?
- Opportunity to work on cutting-edge AI/ML projects.
- Collaborative and dynamic work environment.
- Competitive salary and benefits.
- Professional growth and development opportunities.
If you're excited about using AI/ML to solve real-world problems and have a strong technical background, we'd love to hear from you!
Apply now to join our growing team and make a significant impact!
About Us
CallHub provides cloud-based communication software for nonprofits, political parties, advocacy organizations, and businesses. It has delivered hundreds of millions of messages and calls for thousands of customers. It helps political candidates during their campaigns get their message across to voters, conduct surveys, manage event/town-hall invites, and recruit volunteers. We are profitable, with 8,000+ paying customers from North America, Australia, and Europe. Our customers include Uber, the Democratic Party, and major political parties in the US, Canada, UK, France, and Australia.
About the Role
As a Senior Quality engineer in CallHub, you will be responsible for providing technical leadership to the quality team and expanding our automation goals advocating high quality user experiences without compromising the engineering velocity. Your primary role will be to design and develop the right test strategy including test automation and test plans that are targeted to uncover defects early to help build quality upstream and ensure quality products are reaching our customers. This includes adopting best practices and tools, driving product automation, suggesting new processes and policies, investigating customer reported issues, deduce patterns of issues and come up with solutions to address the technical challenges and work towards customer delight. You will act as a subject matter expert on everything we test and build. You are expected to understand the entire product workflow, customer experience and backend (API) and how everything affects each other and will act as the gatekeeper for quality. As part of the Engineering team, you will work with a team of highly technical software engineers, product managers and operation engineers to deliver great products that delight customers with exceptional product experience to users, both in terms of usability and performance. Our teams are small, yet incredibly impactful and we're curious, highly-motivated, engaged, and empowered to make a difference.
We're looking for engineers with strong analytical skills, meticulous attention to detail, an attitude for perfection, and knowledge of quality engineering processes, who are inquisitive, eager to learn new technologies, and love working in a dynamic environment.
Your Responsibilities
- Drive the overall quality & testing strategy including performance & resiliency testing with right test design, automation of test cases and test data ensuring all areas of the product are thoroughly tested and delight the customers by delivering defect free product
- Design & Development of automation test scripts (UI & API) using modern low-code/no-code tools integrating with CI/CD pipeline to achieve 100% automation
- Mentor the team with writing effective test strategy & design, test plan and test cases in solving complex quality issues
- Find scalable ways to automate functional, usability, performance, API, database and security testing
- Review product requirements and specifications and provide product & design feedback
- Track quality assurance metrics by analyzing and categorizing all customer reported bugs
- Quickly triage and test bug fixes on an ongoing basis
- Work in an agile environment, follow process guidelines and deliver tasks
- Participate in software architecture, design discussions and code reviews
- Be proactive, take ownership and be accountable
- Stay up-to-date with new testing tools and test strategies
What we’re looking for
- 1-2 years' experience working as a Senior/Lead Quality Engineer
- 3+ years' experience in automation testing of web applications (UI & API) using Selenium, Robot Framework, Rest Assured, etc.
- 5+ years' experience in software quality, testing & automation
- Strong Knowledge of Jenkins, Git (Continuous Integration and Configuration Management)
- Knowledgeable with modern AI based low-code/no-code tools like Testsigma, Tosca, AccelQ, Katalon etc
- Experience in driving quality strategies & processes and guiding team to write clear and comprehensive test plans and test cases
- Attitude of breaking the system to make the system robust for users
- Detail oriented. Ability to empathize with customers
- The ability to work effectively in a fast-paced environment
- Team player with strong interpersonal skills, willing to ask for help and offer support to the rest of the team
- Good written and verbal communication skills
- BE/MS/MCA from reputed institutes in India or abroad
What you can look forward to
- You will get to see your work directly impacting users in a big way
- Freedom to contribute to multiple engineering disciplines (Development, Automation and DevOps)
- You will have the opportunity to work on the latest technologies as we are constantly innovating to provide reliable and scalable solutions for our customers.
- We value openness in the company and love delighting our customers.
Founded by IIT Delhi Alumni, Convin is a conversation intelligence platform that helps organisations improve sales/collections and elevate customer experience while automating the quality & coaching for reps, and backing it up with super deep business insights for leaders.
At Convin, we are leveraging AI/ML to achieve these larger business goals while focusing on bringing efficiency and reducing cost. We are already helping the leaders across Health-tech, Ed-tech, Fintech, E-commerce, and consumer services like Treebo, SOTC, Thomas Cook, Aakash, MediBuddy, PlanetSpark.
If you love AI, understand SaaS, love selling, and are looking to join a ship bound to fly, then Convin is the place for you!
We are seeking a talented and motivated Core Machine Learning Engineer with a passion for the audio domain. As a member of our dynamic team, you will play a crucial role in developing state-of-the-art solutions in speech-to-text, speaker separation, diarization, and related areas.
Responsibilities
- Collaborate with cross-functional teams to design, develop, and implement machine learning models and algorithms in the audio domain.
- Contribute to the research, prototyping, and deployment of speech-to-text, speaker separation, and diarization solutions.
- Explore and experiment with various techniques to improve the accuracy and efficiency of audio processing models.
- Work closely with senior engineers to optimize and integrate machine learning components into our products.
- Participate in code reviews, provide constructive feedback, and adhere to coding standards and best practices.
- Communicate effectively with team members, sharing insights and progress updates.
- Stay updated with the latest developments in machine learning, AI, NLP, and signal processing, and apply relevant advancements to our projects.
- Collaborate on the development of end-to-end systems that involve speech and language technologies.
- Assist in building and training large-scale language models like ChatGPT, LLaMA, Falcon, etc., leveraging their capabilities as required.
Requirements
- Bachelor's or Master's degree in Computer Science or a related field from a reputed institution.
- 5+ years of hands-on experience in Machine Learning, Artificial Intelligence, Natural Language Processing, or signal processing.
- Strong programming skills in languages such as Python, and familiarity with relevant libraries and frameworks (e.g., TensorFlow, PyTorch).
- Knowledge of speech-to-text, text-to-speech, speaker separation, and diarization techniques is a plus.
- Solid understanding of machine learning fundamentals and algorithms.
- Excellent problem-solving skills and the ability to learn quickly.
- Strong communication skills to collaborate effectively within a team environment.
- Enthusiasm for staying updated with the latest trends and technologies in the field.
- Familiarity with large language models like ChatGPT, LLaMA, Falcon, etc., is advantageous.
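A first step in many diarization pipelines is deciding which frames contain speech at all (voice activity detection). Here is a deliberately naive, stdlib-only sketch using per-frame energy against a fixed threshold; production systems use trained models, and the names, frame size, and threshold below are illustrative only.

```python
import math

def frame_energies(signal, frame_size=160):
    """Mean squared energy per non-overlapping frame of a mono signal."""
    return [
        sum(s * s for s in signal[i:i + frame_size]) / frame_size
        for i in range(0, len(signal) - frame_size + 1, frame_size)
    ]

def detect_speech_frames(signal, frame_size=160, threshold=0.01):
    """Naive energy-based VAD: frames above the threshold count as speech."""
    return [e > threshold for e in frame_energies(signal, frame_size)]

# Synthetic example: two frames of silence, then a 440 Hz tone at an 8 kHz
# sample rate standing in for speech.
silence = [0.0] * 320
tone = [0.5 * math.sin(2 * math.pi * 440 * t / 8000) for t in range(320)]
```

Downstream diarization would then cluster the speech frames by speaker embedding; energy thresholding only answers "is anyone talking", not "who".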
Founded by IIT Delhi Alumni, Convin is a conversation intelligence platform that helps organisations improve sales/collections and elevate customer experience while automating the quality & coaching for reps, and backing it up with super deep business insights for leaders.
At Convin, we are leveraging AI/ML to achieve these larger business goals while focusing on bringing efficiency and reducing cost. We are already helping the leaders across Health-tech, Ed-tech, Fintech, E-commerce, and consumer services like Treebo, SOTC, Thomas Cook, Aakash, MediBuddy, PlanetSpark.
If you love AI, understand SaaS, love selling, and are looking to join a ship bound to fly, then Convin is the place for you!
Responsibilities
- Designing and developing robust and scalable server-side applications using Python, Flask, Django, or other relevant frameworks and technologies.
- Collaborating with other developers, data scientists, and data engineers to design and implement RESTful APIs, web services, and microservices architectures.
- Writing clean, maintainable, and efficient code, and reviewing the code of other team members to ensure consistency and adherence to best practices.
- Participating in code reviews, testing, debugging, and troubleshooting to ensure the quality and reliability of applications.
- Optimising applications for performance, scalability, and security, and monitoring the production environment to ensure uptime and availability.
- Staying up-to-date with emerging trends and technologies in web development, and evaluating and recommending new tools and frameworks as needed.
- Mentoring and coaching junior developers to ensure they grow and develop their skills and knowledge in line with the needs of the team and the organisation.
- Communicating and collaborating effectively with other stakeholders, including product owners, project managers, and other development teams, to ensure projects are delivered on time and to specification.
You are a perfect match, if you have these qualifications -
- Strong experience in GoLang or Python (server-side development frameworks such as Flask or Django)
- Experience in building RESTful APIs, web services, and microservices architectures.
- Experience in using database technologies such as MySQL, PostgreSQL, or MongoDB.
- Familiarity with cloud-based platforms such as AWS, Azure, or Google Cloud Platform.
- Knowledge of software development best practices such as Agile methodologies, Test-Driven Development (TDD), and Continuous Integration/Continuous Deployment (CI/CD).
- Excellent problem-solving and debugging skills, and the ability to work independently as well as part of a team.
- Strong communication and collaboration skills, and the ability to work effectively with other stakeholders in a fast-paced environment.
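Since Flask and Django both sit on the same WSGI contract, the server-side fundamentals can be sketched without either framework: an application callable that routes a path and returns JSON. The endpoint and helper below are invented for this sketch; real services would add routing tables, middleware, and authentication on top.

```python
import json
from wsgiref.util import setup_testing_defaults

def app(environ, start_response):
    """Minimal WSGI application exposing one REST-style JSON endpoint."""
    if environ["PATH_INFO"] == "/health":
        body = json.dumps({"status": "ok"}).encode()
        start_response("200 OK", [("Content-Type", "application/json")])
        return [body]
    start_response("404 Not Found", [("Content-Type", "application/json")])
    return [json.dumps({"error": "not found"}).encode()]

def call(path):
    """Invoke the WSGI app in-process, the way framework test clients do."""
    environ = {}
    setup_testing_defaults(environ)      # fills in a baseline WSGI environ
    environ["PATH_INFO"] = path
    captured = {}
    def start_response(status, headers):
        captured["status"] = status
    body = b"".join(app(environ, start_response))
    return captured["status"], json.loads(body)
```

Flask's `@app.route` and Django's URLconf are, at bottom, nicer ways of writing the `PATH_INFO` branch above, which is why framework knowledge transfers so readily.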
at REConnect Energy
Work at the Intersection of Energy, Weather & Climate Sciences and Artificial Intelligence
About the company:
REConnect Energy is India's largest tech-enabled service provider in predictive analytics and demand-supply aggregation for the energy sector. We focus on digital intelligence for climate resilience, offering solutions for efficient asset and grid management, minimizing climate-induced risks, and providing real-time visibility of assets and resources.
Responsibilities:
- Design, develop, and maintain data engineering pipelines using Python.
- Implement and optimize database solutions with SQL and NoSQL databases (MySQL and MongoDB).
- Perform data analysis, profiling, and quality assurance to ensure high service quality standards.
- Troubleshoot and resolve data-pipeline related issues, ensuring optimal performance and reliability.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications.
- Participate in code reviews and contribute to the continuous improvement of the codebase.
- Utilize GitHub for version control and collaboration.
- Implement and manage containerization solutions using Docker.
- Implement tech solutions to new product development, ensuring scalability, performance, and security.
Requirements:
- Bachelor's or Master's degree in Computer Science, Software Engineering, Electrical Engineering, or equivalent.
- Proficient Python programming skills and expertise in data engineering.
- Experience in databases including MySQL and NoSQL.
- Experience in developing and maintaining critical and high availability systems will be given strong preference.
- Experience working with AWS cloud platform.
- Strong analytical and data-driven approach to problem solving.
Client based at Bangalore location.
Data Science:
• Expert-level Python, strong analytical skills, experience working with different model types, a solid grasp of fundamentals, and CPG domain knowledge.
• Statistical models and hypothesis testing
• Machine learning (important)
• Business understanding and visualization in Python
• Classification, clustering, and regression
Mandatory Skills
• Data Science, Python, Machine Learning, Statistical Models, Classification, clustering and regression
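The regression skill above can be grounded with the simplest possible example: an ordinary least-squares line fit in closed form, the kind of computation scikit-learn's `LinearRegression` performs under the hood. This is a stdlib-only sketch with an illustrative function name.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x via the closed-form solution:
    b = sum((x - mean_x)(y - mean_y)) / sum((x - mean_x)^2), a = mean_y - b*mean_x."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    b = (
        sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        / sum((x - mean_x) ** 2 for x in xs)
    )
    a = mean_y - b * mean_x
    return a, b
```

Classification and clustering have analogous "one idea in ten lines" cores (logistic loss, k-means updates); interviews for roles like this often probe exactly that level of understanding beneath the library calls.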
Looking for a Python developer with React experience.
Experience with Python frameworks like Django or Flask.
Ability to develop RESTful APIs or GraphQL endpoints.
About us
Fisdom is one of the largest wealthtech platforms that allows investors to manage their wealth in an intuitive and seamless manner. Fisdom has a suite of products and services that takes care of every wealth requirement an individual would have, including Mutual Funds, Stock Broking, Private Wealth, Tax Filing, and Pension Funds.
Fisdom has a B2C app and also an award-winning B2B2C distribution model where we have partnered with 15 of the largest banks in India such as Indian Bank and UCO Bank to provide wealth products to their customers. In our bank-led distribution model, our SDKs are integrated seamlessly into the bank’s mobile banking and internet banking application. Fisdom is the first wealthtech company in the country to launch a stock broking product for customers of a PSU bank.
The company is breaking down barriers by enabling access to wealth management for underserved customers. All our partners combined have a user base of more than 50 crore customers. This makes us uniquely placed to disrupt the wealthtech space, which we believe is in its infancy in India in terms of wider adoption.
Where we are now and where we are heading
Founded by veteran VC-turned-entrepreneur Subramanya SV (Subu) and former investment banker Anand Dalmia, Fisdom is backed by PayU (Naspers), Quona Capital, and Saama Capital, with $37 million of total funds raised so far. Fisdom is known for its revenue- and profitability-focused approach towards sustainable business.
Fisdom is the No. 1 company in India in the B2B2C wealthtech space and one of the most admired companies in the fintech ecosystem for our business model. We look forward to growing this leadership position by staying focused on product and technology innovation.
Our technology team
Today we are a 60-member-strong technology team. Everyone in the team is a hands-on engineer, including the team leads and managers. We take pride in being product engineers, and we believe engineers are fundamentally problem solvers first. Our culture binds us together as one cohesive unit. We stress engineering excellence and strive to become a high-talent-density team. Some values that we preach and practice include:
- Individual ownership and collective responsibility
- Focus on continuous learning and constant improvement in every aspect of engineering and product
- Cheer for openness, inclusivity and transparency
- Merit-based growth
What we are looking for
- Openness to working in a flat, non-hierarchical setup where the daily focus is on shipping features, not reporting to managers
- Experience designing highly interactive web applications with performance, scalability, accessibility, usability, design, and security in mind.
- Experience with distributed (multi-tiered) systems, algorithms, and relational and NoSQL databases.
- Ability to break down larger or fuzzier problems into smaller ones within the scope of the product
- Experience with architectural trade-offs, applying synchronous and asynchronous design patterns, and delivering with speed while maintaining quality.
- Raise the bar on sustainable engineering by improving best practices and producing best-in-class code, documentation, testing, and monitoring.
- Contribute code and actively take part in code reviews.
- Work with the product owner/managers to clearly define the scope of multiple sprints. Lead and guide the team through sprint scoping, resource allocation, and commitment - the execution plan.
- Drive feature development end-to-end, actively partnering with product, design, and peer engineering leads and managers.
- Familiarity with build, release, and deployment tools such as Ant, Maven, Gradle, Docker, Kubernetes, and Jenkins.
- Effective at influencing a culture of engineering craftsmanship and excellence
- Helps the team make the right choices. Drives adoption of engineering best practices and development processes within their team.
- Understanding of security and compliance.
- User authentication and authorisation across multiple systems, servers, and environments.
- Based on your experience, you may lead a small team of Engineers.
If you don't have all of these, that's ok. But be excited about learning the few you don't know.
Skills
Microservices, engineering management, quality management, technical architecture, technical leadership. Hands-on programming experience in one of the following languages: Python or Golang.
Additional perks
- Access to large repositories of online courses through Myacademy (includes Udemy, Coursera, Harvard ManageMentor, Udacity and many more). We strongly encourage learning something outside of work as a habit.
- Career planning, counseling, and coaching support, with both internal and external coaches.
- Relocation policy
You will not be a good fit for this role if
- you have only worked at services companies or have spent most of your career there
- you are not open to shifting to a new programming language or stack and are instead exploring a position aligned with your current technical experience
- you are not very hands-on, seek direction constantly, and need continuous supervision from a manager to finish tasks
- you prefer working alone, and mentoring junior engineers does not interest you
- you are looking to work in very large teams
Why join us and where?
We're a small but high-performing engineering team. We recognize that the work we do impacts the lives of hundreds of thousands of people, and your work will contribute significantly to our mission. We pay competitive compensation and performance bonuses, and we provide a high-energy work environment where you are encouraged to play around with new technology and self-learn. You will be based out of Bangalore.