
Senior Backend Engineer (Tier-1 college and fintech background only)
at Talent Pro
Strong Senior Backend Engineer profiles
Mandatory (Experience 1) – Must have 5+ years of hands-on Backend Engineering experience building scalable, production-grade systems
Mandatory (Experience 2) – Must have strong backend development experience using one or more frameworks such as FastAPI or Django (Python), Spring (Java), or Express (Node.js).
Mandatory (Experience 3) – Must have deep understanding of relevant libraries, tools, and best practices within the chosen backend framework
Mandatory (Experience 4) – Must have strong experience with databases, including SQL and NoSQL, along with efficient data modeling and performance optimization
Mandatory (Experience 5) – Must have experience designing, building, and maintaining APIs, services, and backend systems, including system design and clean code practices
Mandatory (Domain) – Must have experience with financial systems, billing platforms, or fintech applications (a fintech background is a strong plus)
Mandatory (Company) – Must have worked in product companies / startups, preferably Series A to Series D
Mandatory (Education) – Must be from a top engineering institute (IIT, BITS, or an equivalent Tier-1 college)

Similar jobs
Job Title : Backend / API Developer - Python (FastAPI) / Node.js (Express)
Location : Remote
Experience : 4+ Years
Job Description :
We are looking for a skilled Backend / API Developer - Python (FastAPI) / Node.js (Express) with strong expertise in building secure, scalable, and reliable backend systems. The ideal candidate should be proficient in Python (FastAPI preferred) or Node.js (Express) and have hands-on experience deploying applications to serverless environments.
Key Responsibilities :
- Design, develop, and maintain RESTful APIs and backend services.
- Deploy and manage serverless applications on Cloudflare Workers, Firebase Functions, and Google Cloud Functions.
- Work with Google Cloud services including Cloud Run, Cloud Functions, Secret Manager, and IAM roles.
- Implement secure API development practices (HTTPS, input validation, and secrets management).
- Ensure performance optimization, scalability, and reliability of backend systems.
- Collaborate with front-end developers, DevOps, and product teams to deliver high-quality solutions.
Mandatory Skills :
Python (FastAPI) / Node.js (Express), Serverless Deployment (Cloudflare Workers, Firebase, GCP Functions), Google Cloud Services (Cloud Run, IAM, Secret Manager), API Security (HTTPS, Input Validation, Secrets Management).
Required Skills :
- Proficiency in Python (FastAPI preferred) or Node.js (Express).
- Hands-on experience with serverless platforms (Cloudflare Workers, Firebase Functions, GCP Functions).
- Familiarity with Google Cloud services (Cloud Run, IAM, Secret Manager, Cloud Functions).
- Strong understanding of secure API development (HTTPS, input validation, API keys & secret management); a minimal sketch follows this list.
- Knowledge of API design principles and best practices.
- Ability to work with CI/CD pipelines and modern development workflows.
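A minimal sketch of the secure-API practices listed above, assuming FastAPI with Pydantic v2 for input validation and a shared secret injected via an environment variable (for example by GCP Secret Manager at deploy time). The endpoint, field names, and header are illustrative assumptions, not taken from this posting.

```python
# Minimal FastAPI sketch: input validation with Pydantic and a secret read
# from the environment (e.g., injected by GCP Secret Manager at deploy time).
# Endpoint and field names are illustrative assumptions, not from the posting.
import os

from fastapi import FastAPI, Header, HTTPException
from pydantic import BaseModel, Field

app = FastAPI()

# In production the secret is mounted/injected, never hard-coded.
API_KEY = os.environ.get("API_KEY", "")


class PaymentRequest(BaseModel):
    # Pydantic rejects malformed input before the handler runs.
    account_id: str = Field(min_length=1, max_length=64)
    amount_cents: int = Field(gt=0, le=10_000_000)
    currency: str = Field(pattern=r"^[A-Z]{3}$")


@app.post("/payments")
def create_payment(req: PaymentRequest, x_api_key: str = Header(default="")):
    # Simple shared-secret check; a real system would use OAuth/JWT, with HTTPS
    # terminated by the load balancer or serverless platform.
    if not API_KEY or x_api_key != API_KEY:
        raise HTTPException(status_code=401, detail="invalid API key")
    return {"status": "accepted", "account_id": req.account_id}
```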
Preferred Qualifications :
- Strong knowledge of microservices architecture.
- Experience with CI/CD pipelines.
- Knowledge of containerization (Docker, Kubernetes).
- Familiarity with monitoring and logging tools.
What You’ll Do
• Build and scale backend services using Java & Spring Boot
• Work on API integrations (REST, SOAP), caching & rate limiting
• Contribute across the full SDLC – design, development, testing & deployment
• Solve problems around performance, scalability & reliability
What We’re Looking For
• Strong knowledge of Data Structures & Algorithms
• Experience with Java, Spring Boot, REST/SOAP
• Hands-on with system & solution design
• Database experience: MongoDB / PostgreSQL / MySQL / Oracle
• Good debugging skills & unit testing
• Familiarity with Git and AI coding assistants (Copilot, Claude, etc.)
JOB DESCRIPTION:
Location: Pune, Mumbai
Mode of Work : 3 days from Office
Key skills: DSA (collections, hash maps, trees, linked lists, arrays, etc.), core Java and OOP concepts (multithreading, multiprocessing, polymorphism, inheritance, etc.), annotations in Spring and Spring Boot, key Java 8 features, database optimization, microservices, and REST APIs.
- Design, develop, and maintain low-latency, high-performance enterprise applications using Core Java (Java 5.0 and above).
- Implement and integrate APIs using Spring Framework and Apache CXF.
- Build microservices-based architecture for scalable and distributed systems.
- Collaborate with cross-functional teams for high/low-level design, development, and deployment of software solutions.
- Optimize performance through efficient multithreading, memory management, and algorithm design.
- Ensure best coding practices, conduct code reviews, and perform unit/integration testing.
- Work with RDBMS (preferably Sybase) for backend data integration.
- Analyze complex business problems and deliver innovative technology solutions in the financial/trading domain.
- Work in Unix/Linux environments for deployment and troubleshooting.
Position: Lead Python Developer
Location: Ahmedabad, Gujarat
The client company includes a team of experienced information services professionals who are passionate about growing and enhancing the value of information services businesses. They provide the talent, technology, tools, infrastructure, and expertise required to deliver across the data ecosystem.
Position Summary
We are seeking a skilled and experienced Backend Developer with strong expertise in TypeScript, Python, and web scraping. You will be responsible for designing, developing, and maintaining scalable backend services and APIs that power our data-driven products. Your role will involve collaborating with cross-functional teams, optimizing system performance, ensuring data integrity, and contributing to the design of efficient and secure architectures.
Job Responsibilities
● Design, develop, and maintain backend systems and services using Python and TypeScript.
● Develop and maintain web scraping solutions to extract, process, and manage large-scale data from multiple sources.
● Work with relational and non-relational databases, ensuring high availability, scalability, and performance.
● Implement authentication, authorization, and security best practices across services.
● Write clean, maintainable, and testable code following best practices and coding standards.
● Collaborate with frontend engineers, data engineers, and DevOps teams to deliver robust solutions and troubleshoot, debug, and upgrade existing applications.
● Stay updated with backend development trends, tools, and frameworks to continuously improve processes.
● Utilize core crawling experience to design efficient strategies for scraping data from different websites and applications.
● Collaborate with technology and data collection teams to build end-to-end technology-enabled ecosystems and partner on research projects to analyze massive data inputs.
● Own the design and development of web crawlers, independently solving the problems encountered during development.
● Stay updated with the latest web scraping techniques, tools, and industry trends to continuously improve the scraping processes.
Job Requirements
● 4+ years of professional experience in backend development with TypeScript and Python.
● Strong understanding of TypeScript-based server-side frameworks (e.g., Node.js, NestJS, Express) and Python frameworks (e.g., FastAPI, Django, Flask).
● Experience with tools and libraries for web scraping (e.g., Scrapy, BeautifulSoup, Selenium, Puppeteer).
● Hands-on experience with Temporal for creating and orchestrating workflows.
● Proven hands-on experience in web scraping, including crawling, data extraction, deduplication, and handling dynamic websites (a minimal sketch follows this list).
● Proficient in implementing proxy solutions and handling bot-detection challenges (e.g., Cloudflare).
● Experience working with Docker, containerized deployments, and cloud environments (GCP or Azure).
● Proficiency with database systems such as MongoDB and Elastic Search.
● Hands-on experience with designing and maintaining scalable APIs.
● Knowledge of software testing practices (unit, integration, end-to-end).
● Familiarity with CI/CD pipelines and version control systems (Git).
● Strong problem-solving skills, attention to detail, and ability to work in agile environments.
● Great communication skills and the ability to navigate ambiguous situations.
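A minimal sketch of the crawl-extract-deduplicate workflow named in the requirements above, assuming requests and BeautifulSoup for static HTML; the URL and CSS selector are hypothetical placeholders, and JavaScript-heavy sites would need Selenium or Puppeteer instead.

```python
# Minimal scraping sketch: fetch a page, extract items, and deduplicate by
# content hash. URL and selector are hypothetical; a real crawler also needs
# politeness (robots.txt, rate limits) and proxy/bot-detection handling.
import hashlib

import requests
from bs4 import BeautifulSoup

SEEN_HASHES: set[str] = set()


def scrape(url: str) -> list[dict]:
    resp = requests.get(url, headers={"User-Agent": "example-crawler/0.1"}, timeout=30)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")

    records = []
    for row in soup.select("div.record"):  # hypothetical selector
        text = row.get_text(" ", strip=True)
        digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
        if digest in SEEN_HASHES:  # skip duplicates across pages/runs
            continue
        SEEN_HASHES.add(digest)
        records.append({"hash": digest, "text": text})
    return records


if __name__ == "__main__":
    print(scrape("https://example.com/listings"))  # placeholder URL
```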
Job Exposure:
● Opportunity to apply creative methods to acquire and filter North American government and agency data from various websites and sources
● In-depth industry exposure to data harvesting techniques for building and scaling robust, sustainable models using open-source applications
● Effective collaboration with the IT team to design tailor-made solutions based on clients’ requirements
● Unique opportunity to research various agencies, vendors, products, and technology tools to compose a solution
We are seeking a highly motivated and skilled AI Engineer with strong fundamentals in applied machine learning and a passion for building and deploying production-grade AI solutions for enterprise clients. As a key technical expert and the face of our company, you will directly interface with customers to design, build, and deliver cutting-edge AI applications. This is a customer-facing role that requires a balance of deep technical expertise and excellent communication skills.
Roles & Responsibilities
Design & Deliver AI Solutions
- Interact directly with customers.
- Understand their business requirements.
- Translate them into robust, production-ready AI solutions.
- Manage AI projects with the customer's vision in mind.
- Build long-term, trusted relationships with clients.
Build & Integrate Agents
- Architect, build, and integrate intelligent agent systems.
- Automate IT functions and solve specific client problems.
- Use expertise in frameworks like LangChain or LangGraph to build multi-step tasks (a framework-agnostic sketch follows this list).
- Integrate these custom agents directly into the RapidCanvas platform.
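A framework-agnostic sketch of the agent pattern this block describes: the model picks a tool, the loop dispatches it, and the result is fed back until the task is done. The call_llm function and both tools are hypothetical stand-ins; in practice LangChain or LangGraph would manage this loop along with state, retries, and tracing.

```python
# Framework-agnostic sketch of a tool-calling agent loop. call_llm and both
# tools are hypothetical placeholders, not part of any named framework.
import json


def restart_service(name: str) -> str:
    return f"service {name} restarted"          # stub IT-automation tool


def fetch_logs(name: str) -> str:
    return f"last 10 log lines for {name}"      # stub observability tool


TOOLS = {"restart_service": restart_service, "fetch_logs": fetch_logs}


def call_llm(messages: list[dict]) -> dict:
    # Placeholder: a real implementation would call GPT/Gemini and ask the
    # model to reply with JSON like {"tool": "...", "args": {...}} or
    # {"final": "..."} once it has enough information.
    raise NotImplementedError


def run_agent(task: str, max_steps: int = 5) -> str:
    messages = [{"role": "user", "content": task}]
    for _ in range(max_steps):
        decision = call_llm(messages)
        if "final" in decision:                  # model is done
            return decision["final"]
        tool = TOOLS[decision["tool"]]           # dispatch the chosen tool
        result = tool(**decision["args"])
        messages.append({"role": "tool", "content": json.dumps({"result": result})})
    return "stopped: step limit reached"
```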
Implement LLM & RAG Pipelines
- Develop grounding pipelines with retrieval-augmented generation (RAG); a minimal sketch follows this list.
- Contextualize LLM behavior with client-specific knowledge.
- Build and integrate agents with infrastructure signals like logs and APIs.
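A minimal sketch of the grounding/RAG pattern described above: embed the query, retrieve the most similar client-knowledge chunks, and answer from a prompt built on them. embed() and generate() are hypothetical placeholders for an embedding model and a GPT/Gemini call, and a vector database (Pinecone, Weaviate, FAISS) would replace the in-memory list at scale.

```python
# Minimal RAG sketch: cosine-similarity retrieval over client documents, then
# an LLM prompt grounded in the retrieved chunks. embed() and generate() are
# hypothetical placeholders for real embedding/LLM calls.
import numpy as np


def embed(text: str) -> np.ndarray:
    raise NotImplementedError  # placeholder for an embedding model


def generate(prompt: str) -> str:
    raise NotImplementedError  # placeholder for a GPT/Gemini call


def retrieve(query: str, chunks: list[str], k: int = 3) -> list[str]:
    q = embed(query)
    scored = []
    for chunk in chunks:
        v = embed(chunk)
        score = float(np.dot(q, v) / (np.linalg.norm(q) * np.linalg.norm(v)))
        scored.append((score, chunk))
    return [c for _, c in sorted(scored, reverse=True)[:k]]


def answer(query: str, chunks: list[str]) -> str:
    context = "\n---\n".join(retrieve(query, chunks))
    prompt = (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n\nQuestion: {query}"
    )
    return generate(prompt)
```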
Collaborate & Enable
- Work with customer data science teams.
- Collaborate with other internal Solutions Architects, Engineering, and Product teams.
- Ensure seamless integration of AI solutions.
- Serve as an expert on the RapidCanvas platform.
- Enable and support customers in building their own applications.
- Act as a Product Champion, providing crucial feedback to the product team to drive innovation.
Data & Model Management
- Oversee the entire AI project lifecycle.
- Start from data preprocessing and model development.
- Finish with deployment, monitoring, and optimization.
Champion Best Practices
- Write clean, maintainable Python code.
- Champion engineering best practices.
- Ensure high performance, accuracy, and scalability.
Key Skills Required
Experience
- At least 5 years of hands-on experience in AI/ML engineering or backend systems.
- Recent exposure to LLMs or intelligent agents is a must.
Technical Expertise
- Proficiency in Python.
- Proven track record of building scalable backend services or APIs.
- Expertise in machine learning, deep learning, and Generative AI concepts.
- Hands-on experience with LLM platforms (e.g., GPT, Gemini).
- Deep understanding of and hands-on experience with agentic frameworks like LangChain, LangGraph, or CrewAI.
- Experience with vector databases (e.g., Pinecone, Weaviate, FAISS).
Customer & Communication Skills
- Proven ability to partner with enterprise stakeholders.
- Excellent presentation skills.
- Comfortable working independently.
- Manage multiple projects simultaneously.
Preferred Skills
- Experience with cloud platforms (e.g., AWS, Azure, Google Cloud).
- Knowledge of MLOps practices.
- Experience in the AI services industry or startup environments.
Why Join us
- High-impact opportunity: Play a pivotal role in building a new business vertical within a rapidly growing AI company.
- Strong leadership & funding: Backed by top-tier investors, our leadership team has deep experience scaling AI-driven businesses.
- Recognized as a top 5 Data Science and Machine Learning platform by independent research firm G2 for customer satisfaction.
1. Need to have an understanding of Elasticsearch, Kafka, MongoDB, etc.
2. Should have experience with Jupyter notebooks and Databricks.
3. Java, Python.
4. Senior level, 5-10 years of experience.
5. It is important that they have these skills so that they can take over current work. There is code written in both Java and Python (the Java code is legacy, but it is the main search-engine code), so it will be counter-productive if the engineers hired do not have experience in both.
6. Excellent communication, analytical, research, and comprehension skills.
- Developing and installing software solutions.
- Designing, implementing, and delivering high-quality software projects in Java, SQL/Oracle, J2EE, and other Java technologies.
- Participate at a detailed level in design, coding, code walk-throughs, peer code reviews, unit testing, system testing, UAT, demos, POCs, and installation and maintenance of software modules.
- Software prototype.
- System Architecture.
- Software Design Document.
- User interfaces developed as per UX guidelines.
- Code files ensuring coding guidelines are followed.
- Code review artifacts during peer code review.
- Test plan and Test Cases.
- Installation/Deployment document.
- Release document.
- Technical Documentation
• Must have good knowledge of Java and J2EE
• Experience working with Java frameworks (Spring, Struts)
• Experience working with RESTful web services (JSON, JWT)
• Experience working with front-end JavaScript apps and frameworks
• Experience with XML and JSON
• Experience with Vert.x is a plus











