

Requirements:
- Strong knowledge of cloud technologies (AWS/GCP/Azure).
- Strong in a backend language such as Java/Spring Boot (preferred), Python, Ruby on Rails, Node.js, or Go.
- Working experience with at least one frontend technology.
- Strong system design experience in cloud-native products.
- Understands and knows how to work with Git.
- Able to translate high-level design (HLD) into low-level design (LLD) and vice versa.
- Strong knowledge of SQL/NoSQL databases.
- Strong in data structures and algorithms.
- Has previously built and delivered end-to-end products, preferably with a FinTech startup.
- Highly motivated and loves taking ownership of building an end-to-end application.
Perks:
> Fully-remote
> Flexible work hours
> You could be our first Director/VP of technology
> Founding Team role
> ESOPs



Job Title : Python Developer – API Integration & AWS Deployment
Experience : 5+ Years
Location : Bangalore
Work Mode : Onsite
Job Overview :
We are seeking an experienced Python Developer with strong expertise in API development and AWS cloud deployment.
The ideal candidate will be responsible for building scalable RESTful APIs, automating power system simulations using PSS®E (psspy), and deploying automation workflows securely and efficiently on AWS.
Mandatory Skills : Python, FastAPI/Flask, PSS®E (psspy), RESTful API Development, AWS (EC2, Lambda, S3, EFS, API Gateway), AWS IAM, CloudWatch.
Key Responsibilities :
Python Development & API Integration :
- Design, build, and maintain RESTful APIs using FastAPI or Flask to interface with PSS®E.
- Automate simulations and workflows using the PSS®E Python API (psspy).
- Implement robust bulk case processing, result extraction, and automated reporting systems.
AWS Cloud Deployment :
- Deploy APIs and automation pipelines using AWS services such as EC2, Lambda, S3, EFS, and API Gateway.
- Apply cloud-native best practices to ensure reliability, scalability, and cost efficiency.
- Manage secure access control using AWS IAM, API keys, and implement monitoring using CloudWatch.
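A minimal sketch of the bulk case processing and result extraction described above, with the real psspy calls stubbed out behind a hypothetical `solve_case` (running actual simulations requires a PSS®E installation):

```python
from concurrent.futures import ThreadPoolExecutor

def solve_case(case_path: str) -> dict:
    # Hypothetical stand-in for loading a saved case and solving it via
    # the psspy API, then extracting a summary record for reporting.
    return {"case": case_path, "converged": True}

def process_cases(case_paths: list[str]) -> list[dict]:
    # Bulk processing: solve cases concurrently, preserving input order.
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(solve_case, case_paths))

results = process_cases(["caseA.sav", "caseB.sav"])
print(len(results))  # prints 2: one summary record per case
```

In a deployed workflow, `process_cases` would sit behind a FastAPI or Flask endpoint and write its report to S3 or EFS.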
Required Skills :
- 5+ Years of professional experience in Python development.
- Hands-on experience with RESTful API development (FastAPI/Flask).
- Solid experience working with PSS®E and its psspy Python API.
- Strong understanding of AWS services, deployment, and best practices.
- Proficiency in automation, scripting, and report generation.
- Knowledge of cloud security and monitoring tools like IAM and CloudWatch.
Good to Have :
- Experience in power system simulation and electrical engineering concepts.
- Familiarity with CI/CD tools for AWS deployments.
Experience: 6 - 10 years
Location: Bangalore

What would make you a good fit?
- You’re both relentless and kind, and don’t see these as being mutually exclusive
- You have a self-directed learning style, an insatiable curiosity, and a hands-on execution mindset
- You have deep experience working with product and engineering teams to launch backend services that power end-user applications
- You have deep experience in Python and the related tools and frameworks
- You have deep experience working with large datasets and relational databases, specifically PostgreSQL
- You have experience with microservices architecture, Docker, and Kubernetes
- You continuously raise the bar on development practices such as code quality tools, unit testing coverage, build tools, etc.
- You obsess about correctness, DRY development, reducing cognitive complexity, and performance
- You have excellent writing and speaking skills with a talent for applying technical solutions to customer problem statements
Must-Have Qualifications
- 5+ years of experience building RESTful services
- 2+ years of experience in Django Rest Framework, Flask, and/or FastAPI
- 3+ years of experience with SQL and Postgres to manage and analyze data
- Expert level skills using a debugging tool and developing unit tests
- Experience in Docker, Kubernetes, and microservices architecture
- Prior startup experience; enjoys taking on difficult challenges and broad responsibilities
- Track-record of delivering reliable and scalable RESTful services from requirements to production
- You’ve partnered with DevOps to deliver high-performance backend solutions to production
- You naturally think quantitatively about problems and work backward from a customer outcome
What’ll make you stand out (but not required)
- Terraform and AWS experience
- Experience creating technical design documents
- You have a strong connection to finance teams or closely related domains, the challenges they face, and deep appreciation for their aspirations
Role
Backend engineers at AssetSprout work on our products. They include software for Certified Financial Planners, their clients, and also internal admin tools. They work with the CTO, frontend engineers, and other backend engineers to deliver towards the company’s vision.
Responsibilities
- Develop and own product features end to end in a scalable, secure and maintainable way. The buck stops with you on whatever you own.
- Provide technical solutions through design, architecture and implementation. Wear multiple hats in delivering greenfield projects from concept to production.
- Establish and advocate coding styles and best practices, and bring your experience to bear in scaling the product from MVP to production.
- Iterate fast. Display maturity in prioritizing velocity while balancing quality. As a startup, we make or break based on how fast we deliver.
- Teach and mentor other backend engineers. Focus on providing technical expertise and solutions regardless of how long one has been working professionally.
Requirements
- We are language and framework agnostic as long as you can pick up new technologies.
- Expert-level coding skills in any programming language, preferably Java or Kotlin. Experience in Python, C++, Scala, etc. is welcome.
- Develop web applications and services using Spring Boot. Experience with Akka, Play, Flask, Django is welcome.
- Write automated tests with any of the frameworks. We measure success on how well your code is unit tested and integration tested.
- Advanced understanding of RDBMS systems, preferably Postgres. Working knowledge of non-relational databases such as DynamoDB or Cassandra is helpful.
- Able to use CI/CD tools such as CircleCI, GitLab, Jenkins etc. and create workflows and pipelines to release to production every other day.
- Expert level understanding of RESTful APIs, pagination, networking concepts around HTTP, thread pools, and other server-side concepts.
- Solid understanding of how AWS services work. Directly relevant services include Lambda, EC2, S3, DynamoDB, RDS, EventBridge, SQS, ElastiCache Redis, and Load Balancers.
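One of the server-side concepts listed above, pagination, can be sketched in a few lines (the envelope field names here are illustrative, not taken from any specific product):

```python
def paginate(items: list, page: int, page_size: int) -> dict:
    # Classic offset/limit pagination envelope for a REST response.
    start = (page - 1) * page_size
    chunk = items[start:start + page_size]
    total = len(items)
    return {
        "data": chunk,
        "page": page,
        "page_size": page_size,
        "total": total,
        "has_next": start + page_size < total,
    }

print(paginate(list(range(10)), page=2, page_size=4)["data"])  # prints [4, 5, 6, 7]
```

Real services usually move to cursor-based pagination at scale, since offsets degrade on large tables.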
Good-to-haves
- Early or mid-stage startup experience
- Eager to work in a flat organization with no corporate politics
- Positive energy with a get-it-done attitude.
- Worked in a remote environment and high trust and high responsibility role
- Working knowledge of build systems like Gradle, Maven, Bazel, Webpack, etc. We use Gradle.


non-metro and rural markets. DealShare has raised Series C funding of USD 21 million from key investors including WestBridge Capital, Falcon Edge Capital, Matrix Partners India, Omidyar Network, Z3 Partners and partners of DST Global, bringing total funding to USD 34 million. They have 2 million customers across Rajasthan, Gujarat, Maharashtra, Karnataka and Delhi NCR, with 1.2 million monthly transactions and an annual GMV of USD 100 million. Our aim is to expand operations to 100 cities across India and reach an annual GMV of USD 500 million by the end of 2021.
They started in September 2018 and had 5,000 active customers in the first three months. Today we have 25K transactions per day, 1 lakh DAU and 10 lakh MAU, with a monthly GMV of INR 100 crores and 50% growth MoM. We aim to hit 2 lakh transactions per day with an annual GMV of USD 500 million by 2021.
We are hiring for various teams in discovery (search, recommendation, merchandising, intelligent notifications), pricing (automated pricing, competition price awareness, balancing revenue with profits, etc.), user growth and retention (bargains, gamification), monetisation (ads), order fulfillment (cart/checkout, warehousing, last mile, delivery promise, demand forecasting), customer support, data infrastructure (warehousing, analytics), and ML infrastructure (data versioning, model repository, model training, model hosting, feature store, etc.). We are looking for passionate problem solvers to join us, solve really challenging problems, and scale DealShare systems.
You will:
● Implement the solution with minimal guidance after finalizing the approach with senior engineers.
● Write code that has good low-level design and is easy to understand, maintain, extend and test.
● Take end-to-end ownership of a product/feature from development to production, including fixing issues.
● Ensure high unit, functional and integration automated test coverage. Ensure releases are stable.
● Communicate with various stakeholders (product, QA, senior engineers) as necessary to ensure quality deliverables, smooth execution and launch.
● Participate in code reviews; improve development and testing processes.
● Participate in hiring great engineers.
Required:
● Bachelor’s degree (4 years) or higher in Computer Science or equivalent, and 1-3 years of experience in software development.
● Excellent at problem solving; an independent thinker.
● Good understanding of computer science fundamentals, data structures and algorithms, and object-oriented design.
● Good coding skills in any object-oriented language (C++, Java, Scala, etc.), preferably Java.
● Prior experience building one or more modules of a large-scale, highly available, low-latency, high-quality distributed system is preferred.
● Ability to multitask and thrive in a fast-paced, timeline-driven environment.
● Good team player with the ability to collaborate with others.
● Self-driven and motivated, with a very high sense of ownership.
Is a plus
● Prior experience of working in Java
● Prior experience using AWS offerings: EC2, S3, DynamoDB, Lambda, API Gateway, CloudFront, etc.
● Prior experience of working on big data technologies - Spark, Hadoop, etc
● Prior experience on asynchronous processing (queuing systems), workflow systems.
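The asynchronous processing mentioned above can be sketched with the standard library, with a single in-process worker standing in for a real queuing system such as SQS:

```python
import queue
import threading

def worker(tasks: "queue.Queue", results: list) -> None:
    # Pull tasks until the sentinel None arrives, mimicking a queue consumer.
    while True:
        item = tasks.get()
        if item is None:
            break
        results.append(item * 2)  # placeholder "processing" step
        tasks.task_done()

tasks: "queue.Queue" = queue.Queue()
results: list = []
t = threading.Thread(target=worker, args=(tasks, results))
t.start()
for i in range(3):
    tasks.put(i)       # producer enqueues work asynchronously
tasks.put(None)        # sentinel tells the worker to stop
t.join()
print(results)         # prints [0, 2, 4]
```

The same producer/consumer shape carries over to external queues; only the transport and acknowledgement semantics change.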



Job Responsibilities:
- Build enterprise-grade applications with strong knowledge of the .NET Core framework; experience with Node.js.
- Create libraries shared across different modules in the system that can be leveraged by multiple teams within the organization.
- Good understanding of data structures to identify the correct data structure for a specific use case.
- Build RESTful APIs for customer consumption. Prior experience building APIs is a requirement; understanding of HTTP error codes is a must.
- Work on code in a distributed development environment. Prior use of Git/GitHub is desirable.
- Work with a unit test platform.
- Strong knowledge of encryption standards, both asymmetric and symmetric.
- Create and manage system documentation.
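The HTTP error-code understanding called out above boils down to mapping failure modes to the right status; a hypothetical sketch (the failure-mode names are illustrative):

```python
import http

def status_for(error: str) -> int:
    # Map common API failure modes to HTTP status codes.
    mapping = {
        "validation": http.HTTPStatus.BAD_REQUEST,   # 400: malformed input
        "auth": http.HTTPStatus.UNAUTHORIZED,        # 401: missing/bad credentials
        "missing": http.HTTPStatus.NOT_FOUND,        # 404: resource does not exist
        "conflict": http.HTTPStatus.CONFLICT,        # 409: state clash, e.g. duplicate
    }
    # Anything unclassified is a server-side failure.
    return int(mapping.get(error, http.HTTPStatus.INTERNAL_SERVER_ERROR))

print(status_for("missing"))  # prints 404
```

Returning a precise status (rather than 200-with-error-body or blanket 500s) is what lets API consumers branch on failures reliably.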
Job Qualifications
- At least 4+ years of experience using the C# programming language to build enterprise software (.NET Core, ASP.NET Core, Web API) – for the .NET Developer role.
- At least 7+ years of experience using the C# programming language to build enterprise software (.NET Core, ASP.NET Core, Web API) – for the Lead role.
- Experience handling the ISO 8583 message standard; payments industry knowledge.
- Good understanding of version control (using Git/GitHub).
- Experience with a unit test framework; TDD approach.
- Knowledge and understanding of PCI requirements for card processing software.
- Understanding of SQL databases and database design.
- Strong understanding of hybrid cloud application basics based on Docker, Kubernetes, microservices and Postgres.
- Minimum of 2+ years of experience building enterprise-scale applications: coding, designing, developing, and analyzing data.
- Agile environment; AWS, Docker, Kubernetes, DevOps environment (Lead level only).
- Manage CI/CD pipelines.
- BE/B.Tech/B.Sc/MCA/M.Sc in Computer Science.
Company Overview
www.corecard.com & www.corecardindia.com
CoreCard Software is a product company providing financial service organizations with card management applications that help them compete both locally and globally. A leading provider of card management systems, CoreCard offers the market's most feature-rich platform for processing and managing accounts receivable and a full range of card products, including prepaid/stored-value, fleet, credit, debit, commercial, government, healthcare and private-label cards. CoreCard has offices in Atlanta, Bhopal and Navi Mumbai.


• 3+ years of experience as a Go Developer
• Experience in ReactJS (most preferred), AngularJS, or similar front-end frameworks
• Experience with Python and/or Golang (preferably both), SQL, and design/architectural patterns
• Experience in Java, .NET, or other open-source technologies is an added advantage
• Hands-on experience with SQL, query optimization, and DB server migration
• Preferably experience in PostgreSQL or MySQL
• Knowledge of NoSQL databases is an added advantage
• Experience with cloud platforms like AWS and Azure; knowledge of containerization and Kubernetes is an added advantage
• Knowledge of one or more programming languages along with HTML5/CSS3 and Bootstrap
• Familiarity with architecture styles/APIs (REST, RPC)
• Understanding of Agile methodologies
• Experience with threading, multithreading and pipelines
• Experience creating RESTful APIs with Golang, Python, or Java, in JSON and XML
• Experience with GitHub and TortoiseSVN version control
• Strong attention to detail
• Strong knowledge of asynchronous programming with the latest frameworks
• Excellent troubleshooting and communication skills
• Strong knowledge of unit testing frameworks
• Proven knowledge of ORM techniques
• Skill for writing reusable libraries
• Understanding of fundamental design principles for building a scalable application
SaveIN is India’s first ‘Buy now, Pay later’ platform for healthcare products and services. We offer flexible, low-cost repayment plans for a host of healthcare products and treatments, delivered through our network of healthcare providers.
We aim to create India’s largest integrated private healthcare ecosystem and build technology-first solutions to facilitate timely and quality care through enhanced affordability for millions of Indians.
We are backed by a strong set of global investors including Silicon Valley based Y-Combinator.
Our promise to you:
● We aim to hire the best of talent, passionate about the vision of SaveIN
● We aim to create an equal opportunity, open, challenging as well as rewarding environment to bring the best out of our people
● We are here to be a large, prosperous, profitable, and resilient organization so that we may serve our customers sustainably across economic cycles; we aim to achieve this in the most ethical and transparent way possible
● Being compliant is not only an obligation but a chosen way of life
● We would love to see you grow and are committed to doing our best to contribute towards your success
About the role:
SaveIN is looking for a Software Engineer | Java, who enjoys solving challenging problems and can develop and deploy APIs and web applications using Java MVC frameworks to power a variety of class-leading digital products. You will work with developers, product and the founding team, and would also be expected to lead a team of junior developers in the future.
Location: Gurugram
Key responsibilities:
● Work with business users to gather functional requirements
● Combine your technical expertise and passion for problem-solving to deliver end-to-end solutions
● Design and implement high-quality, test-driven code for various projects
● Unit Testing/Integration Testing
● Code configuration and release management
● Document technical design as per internal compliance standards
● Work with senior management and external stakeholders to ensure that deliverables are met
Skills and competencies:
● Education: BE/BTech/MTech/MCA
● Minimum 3 Years of experience in Web Application and API development, in Java 8 and above
● Working experience with MVC frameworks like Spring, Play, etc.
● Experience with Multi-threading, Collections, and concurrent API
● Working experience with web services and APIs (REST, SOAP)
● Experience in developing microservices in Spring Boot
● Experience working with tools like Git and Maven.
● Experience writing high-quality code with fully automated unit test coverage (Junit, Mockito, etc.)
● Experience in defining and applying design/coding standards, patterns and quality metrics
● Working experience with data platforms (relational and/or NoSQL) and messaging technologies
● Excellent OOPs, data structure, and algorithm knowledge
● Understanding & experience in API management, Swagger
● Working experience with LINUX/UNIX environment and shell scripts
● Experience in working on public cloud infrastructure- AWS (EC2, ECS, Cognito, CloudWatch, SQS, S3)
● Understanding/experience with 3rd party integrations like CRM, payment gateways, performance marketing tools
About the Role
The Dremio India team owns the DataLake Engine along with the cloud infrastructure and services that power it. With a focus on next-generation data analytics supporting modern table formats like Iceberg and Delta Lake, open-source initiatives such as Apache Arrow and Project Nessie, and hybrid-cloud infrastructure, this team provides many opportunities to learn, deliver, and grow in your career. We are looking for technical leaders with passion for and experience in architecting and delivering high-quality distributed systems at massive scale.
Responsibilities & ownership
- Lead end-to-end delivery and customer success of next-generation features related to scalability, reliability, robustness, usability, security, and performance of the product
- Lead and mentor others on concurrency and parallelization to deliver scalability, performance and resource optimization in a multithreaded and distributed environment
- Propose and promote strategic company-wide tech investments taking care of business goals, customer requirements, and industry standards
- Lead the team to solve complex, unknown and ambiguous problems, and customer issues cutting across team and module boundaries with technical expertise, and influence others
- Review and influence designs of other team members
- Design and deliver architectures that run optimally on public clouds like GCP, AWS, and Azure
- Partner with other leaders to nurture innovation and engineering excellence in the team
- Drive priorities with others to facilitate timely accomplishments of business objectives
- Perform RCA of customer issues and drive investments to avoid similar issues in the future
- Collaborate with Product Management, Support, and field teams to ensure that customers are successful with Dremio
- Proactively suggest learning opportunities about new technology and skills, and be a role model for constant learning and growth
Requirements
- B.S./M.S/Equivalent in Computer Science or a related technical field or equivalent experience
- Fluency in Java/C++ with 15+ years of experience developing production-level software
- Strong foundation in data structures, algorithms, multi-threaded and asynchronous programming models and their use in developing distributed and scalable systems
- 8+ years of experience developing complex and scalable distributed systems and delivering, deploying, and managing microservices successfully
- Subject matter expert in one or more of: query processing or optimization, distributed systems, concurrency, microservice-based architectures, data replication, networking, storage systems
- Experience in taking company-wide initiatives, convincing stakeholders, and delivering them
- Expert in solving complex, unknown and ambiguous problems spanning across teams and taking initiative in planning and delivering them with high quality
- Ability to anticipate and propose plan/design changes based on changing requirements
- Passion for quality, zero downtime upgrades, availability, resiliency, and uptime of the platform
- Passion for learning and delivering using latest technologies
- Hands-on experience working on projects on AWS, Azure, and GCP
- Experience with containers and Kubernetes for orchestration and container management in private and public clouds (AWS, Azure, and GCP)
- Understanding of distributed file systems such as S3, ADLS or HDFS
- Excellent communication skills and affinity for collaboration and teamwork

