
Mandatory (Experience 1) - Must have a minimum of 4 years of experience in backend software development.
Mandatory (Experience 2) - Must have 4+ years of experience in backend development using Python (highly preferred), Java, or Node.js.
Mandatory (Experience 3) - Must have experience with cloud platforms such as AWS (highly preferred), GCP, or Azure.
Mandatory (Experience 4) - Must have experience with at least one database - MySQL / PostgreSQL / Oracle / SQL Server / DB2 / MongoDB / Ne…

About the Role
The role is about thinking big and executing beyond what is expected. The challenges cut across algorithmic problem solving, systems engineering, machine learning, and infrastructure at massive scale.
Reason to Join
An opportunity for innovators, problem solvers, and learners. The work is innovative, empowering, rewarding, and fun, with an amazing office, competitive pay, and an excellent benefits package.
Requirements and Responsibilities (please read carefully before applying)
- Overall experience of 3-6 years with Java/Python frameworks and machine learning.
- Develop web services using REST, XSD, XML, Java, Python, AWS, and APIs.
- Experience with Elasticsearch, Solr, or Lucene: search engines, text mining, and indexing.
- Experience with highly scalable tools such as Kafka, Spark, Aerospike, etc.
- Hands-on experience in design, architecture, implementation, performance and scalability, and distributed systems.
- Design, implement, and deploy highly scalable and reliable systems.
- Troubleshoot the Solr indexing process and query engine (a minimal query sketch follows this list).
- Bachelor's or Master's in Computer Science from a Tier 1 institution.
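For context on the Solr/Elasticsearch requirement above, here is a minimal, hedged sketch of querying Solr's standard /select handler from Python. The host, collection name, and field names are illustrative assumptions, not part of the actual stack.

```python
import requests

# Hypothetical Solr host and collection, used purely for illustration.
SOLR_URL = "http://localhost:8983/solr/products"

def search(query: str, rows: int = 10) -> list[dict]:
    """Run a simple query against Solr's /select handler and return the matching documents."""
    params = {"q": query, "rows": rows, "wt": "json"}
    resp = requests.get(f"{SOLR_URL}/select", params=params, timeout=5)
    resp.raise_for_status()
    return resp.json()["response"]["docs"]

if __name__ == "__main__":
    # Example usage: the field names here are assumptions about the schema.
    for doc in search("title:backend"):
        print(doc.get("id"), doc.get("title"))
```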
Job Description:
Design and develop IoT/cloud-based TypeScript/JavaScript/Node.js applications using Amazon cloud computing services.
Work closely with onsite, offshore, and cross-functional teams (Product Management, UI/UX developers, web and mobile developers, and SQA) to use technology effectively and deliver high-quality IoT applications on time.
Resolve bugs and issues.
Proactively identify risks and failure modes early in the development lifecycle and develop POCs to mitigate those risks early in the program.
Assertive communication and teamwork skills.
Primary Skills:
Hands-on experience (3+ years) in an AWS cloud-native environment, including AWS Lambda, Kinesis, and DynamoDB (a minimal handler sketch follows this list)
3+ years of experience working with Node.js, Python, unit testing, and Git
3+ years of work experience with document, relational, or time-series databases
2+ years of work experience with TypeScript
1+ years with an infrastructure-as-code framework such as Serverless or CDK, with CloudFormation knowledge
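As a rough, hedged illustration of the AWS Lambda + Kinesis + DynamoDB combination listed above, here is a minimal Python handler sketch (the role itself also involves Node.js/TypeScript). The table name and event payload schema are assumptions made for the example.

```python
import base64
import json

import boto3

# Hypothetical table name; an actual deployment would inject this via configuration.
table = boto3.resource("dynamodb").Table("iot-telemetry")

def handler(event, context):
    """Lambda entry point for a Kinesis event source: decode each record and persist it."""
    for record in event["Records"]:
        # Kinesis delivers the payload base64-encoded.
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        table.put_item(Item={
            "deviceId": payload["deviceId"],      # assumed partition key
            "timestamp": payload["timestamp"],    # assumed sort key
            "reading": json.dumps(payload.get("reading", {})),
        })
    return {"processed": len(event["Records"])}
```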
Backend Developer (Python)
Company and Founders
Egregore Labs (www.egregorelabs.com) is a financial software company founded in 2017 by Prashant Vijay (ISB, Tulane) and Hari Balaji (IIM Ahmedabad, IIT Madras), both of whom have spent over a decade each in financial services, with the majority of their experience at Goldman Sachs across New York, Hong Kong, and Singapore in roles spanning trading, quant, and technology.
We operate at the intersection of unstructured data and finance. We run multiple products, including Romulus (www.romulus.co) and Robana (www.robana.ai); all our products work on the same underlying principles and set of technologies.
Ideal Background
- At least 2 years of experience in back-end development with Python in a fast-paced environment
- Deep understanding of technologies used in web-deployed SaaS products, including REST APIs
- Exposure to AWS, Azure, or other cloud providers
- Sound understanding of computer science principles
- Exposure to any of the following:
- Financial services
- Natural Language Processing
- Robotic Process Automation
- Intelligent Document Processing
- Document Management and Repositories
Opportunity
We will share our workload as a team, and we expect you to work on a broad range of tasks. Here are some of the things you might have to do on any given day:
- Develop APIs and endpoints for deployments of our product (a minimal endpoint sketch follows this list)
- Infrastructure development, such as building databases and creating and maintaining automated jobs
- Build out the back-end to deploy and scale our product
- Build POCs for client deployments
- Integrate our products with third-party products/tools/services
- Document your code, write test cases, etc.
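To give a concrete flavour of the "APIs and endpoints" item above, here is a minimal, hedged Python sketch of a REST endpoint using Flask. The route, resource name, and in-memory stub (standing in for Postgres/MongoDB) are illustrative assumptions, not the actual product API.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# In a real deployment this would query Postgres or MongoDB; a stub keeps the sketch self-contained.
_DOCUMENTS = {"42": {"id": "42", "status": "processed"}}

@app.route("/api/documents/<doc_id>", methods=["GET"])
def get_document(doc_id):
    """Return a single document's metadata, or a 404 if it is unknown."""
    doc = _DOCUMENTS.get(doc_id)
    if doc is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(doc)

if __name__ == "__main__":
    app.run(port=8000, debug=True)
```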
Skills
- Hands-on experience with Python (2+ years)
- Sound understanding of Postgres and NoSQL databases such as MongoDB
- Deep familiarity with UNIX, major cloud platforms (AWS, Azure), DevOps
- Understanding of databases and related tools and paradigms
- A computer science education would be great, but other engineering disciplines are fine as well
Desirables
We are looking for a person who has:
- Resourcefulness: we're looking for versatile developers who are good at figuring out what they need to use, learn, build, or repurpose to get the job done quickly and efficiently.
- Ownership: we like to be directive, not prescriptive, in our management. We'd love for you to take ownership of what you work on and tell us what to do, rather than the other way round.
- Work ethic: we've grown up on Wall Street. We work hard and have aggressive goals. We want our teammates to be focused, goal-oriented, and consistent high achievers.
- Execution focus: our business is about getting things done, and getting things done right. We want outcome-focused colleagues who can multitask and execute quickly and elegantly.
Work From Home
Package: 7-12 LPA
Job Description:
We are looking for a passionate backend developer with a focus on building maintainable and scalable systems. The developer will be responsible for the design and development of Jodo's backend platforms. You will work closely with Product Managers and Frontend developers to gather requirements and implement features. As a senior developer on the team, you will take ownership of services/systems and mentor other developers on the team.
Responsibilities:
● Own and drive the development of new features
● Lead design and development of the Jodo backend platform
● Troubleshoot production defects and performance issues
● Write reusable code/modules
● Optimize for speed of development/delivery
● Collaborate with frontend developers for integration
● Identify opportunities for automation
● Make cloud (AWS) infrastructure/services scalable and secure
Qualifications:
● 5+ years of proven experience as a Backend developer
● Experience in working with distributed systems
● Proficient in building Microservices/RESTful APIs with any modern tech stack
● Working knowledge of relational and nonrelational databases
● Prior working knowledge of Python/Django is a big plus
● Familiarity with modern CI/CD tools
● Knowledge of AWS or any other Cloud Platform services
● Champion code quality and drive best practices
● Ability to analyze and convert business requirements into technical requirements
● Self-starter with the ability to take ownership
● Prior experience working in a startup environment is great to have
Our client is a B2B SaaS product company in the HR technology space, helping organisations make informed decisions in areas such as hiring, training, and career succession. The company was formed in 2010 and has since become a market leader in HR technology. The founders are alumni of Stanford University, and their employees have experience working with PwC, McKinsey, and organisations of a similar league. With the founders' bright vision, the organisation is in expansion mode to capture niche markets and become a global leader in this domain.
- Experience in Back-End development using Ruby on Rails or NodeJS
- Experience in working on at least two of MongoDB / Postgres / MySQL & Redis
- Experience with MVC patterns using frameworks like Rails and ExpressJS
- Strong understanding of RESTful APIs and HTTP protocol
- Understanding Security aspects of the applications and can successfully implement OWASP compliant systems
- Strong understanding of Linux OS, file systems, firewalls, etc.
- 3 years of experience in Ruby on Rails
- Minimum 3 years in MongoDB / PostgreSQL
- Must be from product-based companies
About Us
Innvoage Technology Private Limited, incorporated in 2021, was founded with a dream to revolutionise the fintech industry through innovative solutions in online transactions. Our vision is to “be the most trusted and innovative service provider in the payment industry, with the smoothest and most convenient interface, and to offer unparalleled solutions to our valued customers in all our markets.”
We are looking for a core team to be the lead developers for our Payment Gateway Solution - UPayI.
Job Responsibilities
- Familiar with Laravel 8 and PHP 7.4
- Familiar with MySQL or PostgreSQL
- Familiar with WordPress plugin development
Required Skills
- Use Laravel to develop web system projects (EC and education).
- Develop WordPress plugins (this is very important).
- Modify WordPress themes or plugin templates.
- Other web backend tasks.
- Must have at least 3 years of experience in frontend web development.
Hi All,
We are hiring!!
Company: SpringML India Pvt Ltd.
Role: Lead Data Engineer
Location: Hyderabad
Website: https://springml.com/
About Company:
At SpringML, we are all about empowering the 'doers' in companies to make smarter decisions with their data. Our predictive analytics products and solutions apply machine learning to today's most pressing business problems so customers get insights they can trust to drive business growth.
We are a tight-knit, friendly team of passionate and driven people who are dedicated to learning, excited to solve tough problems, and keen to see results, fast. Our core values include placing our customers first, empathy and transparency, and innovation. We are a team with a focus on individual responsibility, rapid personal growth, and execution. If you share similar traits, we want you on our team.
What's the opportunity?
SpringML is looking to hire a top-notch Lead Data Engineer who is passionate about working with data and using the latest distributed frameworks to process large datasets.
As a Lead Data Engineer, your primary role will be to design and build data pipelines. You will focus on helping client projects with data integration, data preparation, and implementing machine learning on datasets.
In this role, you will work on some of the latest technologies, collaborate with partners on early wins, take a consultative approach with clients, interact daily with executive leadership, and help build a great company. Chosen team members will be part of the core team and play a critical role in scaling up our emerging practice.
Responsibilities:
- Ability to work as a member of a team assigned to design and implement data integration solutions.
- Build data pipelines using standard frameworks in Hadoop, Apache Beam, and other open-source solutions (a minimal pipeline sketch follows this list).
- Learn quickly – ability to understand and rapidly comprehend new areas – functional and technical – and apply detailed and critical thinking to customer solutions.
- Propose design solutions and recommend best practices for large-scale data analysis.
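As a hedged illustration of the Apache Beam pipelines mentioned above, the Python sketch below counts events per day from a CSV file using the Beam Python SDK. The file paths and the `event_id,event_date` column layout are assumptions made for the example, not details of any actual project.

```python
import apache_beam as beam

def run(input_path: str = "events.csv", output_path: str = "daily_counts") -> None:
    """Count events per day from rows of the form `event_id,event_date` (paths are placeholders)."""
    with beam.Pipeline() as pipeline:
        (
            pipeline
            | "Read" >> beam.io.ReadFromText(input_path, skip_header_lines=1)
            | "ExtractDate" >> beam.Map(lambda line: line.split(",")[1])
            | "PairWithOne" >> beam.Map(lambda date: (date, 1))
            | "CountPerDay" >> beam.CombinePerKey(sum)
            | "Format" >> beam.Map(lambda kv: f"{kv[0]},{kv[1]}")
            | "Write" >> beam.io.WriteToText(output_path)
        )

if __name__ == "__main__":
    run()
```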
Skills:
- B.Tech degree in computer science, mathematics, or another relevant field.
- 6+ years of experience in ETL, data warehousing, visualization, and building data pipelines.
- Strong programming skills – experience and expertise in one of the following: Java, Python, Scala, or C.
- Proficient in big data/distributed computing frameworks such as Apache Spark and Kafka.
- Experience with Agile implementation methodology
Knowlarity Communications is India's largest cloud-based solutions provider. Our virtual phone system and enterprise solutions help make your business reliable and intelligent. With the capability to process over a million calls an hour, Knowlarity is a trusted brand for more than 8,000 companies worldwide, SMBs as well as enterprises. We are funded by Sequoia Capital and Mayfield, headquartered in Singapore, and have offices in Gurgaon, Mumbai, Bangalore, Dubai, and the Philippines. Knowlarity solves business problems for enterprises by making telephony intelligent and reliable, in real time, over the cloud.
Must Have:
Languages: C, Python
Databases: MySQL, PostgreSQL
Tools: Git
Operating System: Linux
Protocols: SIP, RTP, WebRTC
Good to have:
Cloud services: AWS, GCP, Azure
Tools: FreeSWITCH, Asterisk, OpenSIP
We offer:
- A competitive salary and extensive social benefits
- Opportunity to be part of a team that invented and dominates the emerging market in the cloud telephony industry.
- Massive opportunities for growth.
- Work from a prime location - easy accessibility from both Gurgaon and Delhi.
- Work-life balance and support for career development.
- An amazing life inside Knowlarity! Want to know more about it?
Then let's stay connected!
https://www.facebook.com/Knowlarity/
https://twitter.com/knowlarity
https://www.linkedin.com/company-beta/410771/
About Us
DataWeave provides Retailers and Brands with “Competitive Intelligence as a Service” that enables them to take
key decisions that impact their revenue. Powered by AI, we provide easily consumable and actionable
competitive intelligence by aggregating and analyzing billions of publicly available data points on the Web to
help businesses develop data-driven strategies and make smarter decisions.
Products@DataWeave
We, the Products team at DataWeave, build data products that provide timely insights that are readily
consumable and actionable, at scale. Our underpinnings are: scale, impact, engagement, and visibility. We help
businesses take data-driven decisions every day. We also give them insights for long-term strategy. We are
focused on creating value for our customers and helping them succeed.
How we work
It's hard to tell what we love more, problems or solutions! Every day, we choose to address some of the hardest
data problems that there are. We are in the business of making sense of messy public data on the web. At
serious scale! Read more on Become a DataWeaver
What do we offer?
● Opportunity to work on some of the most compelling data products that we are building for online
retailers and brands.
● Ability to see the impact of your work and the value you are adding to our customers almost immediately.
● Opportunity to work on a variety of challenging problems and technologies to figure out what really
excites you.
● A culture of openness. Fun work environment. A flat hierarchy. Organization wide visibility. Flexible
working hours.
● Learning opportunities with courses, trainings, and tech conferences. Mentorship from seniors in the
team.
● Last but not least, competitive salary packages and fast-paced growth opportunities.
Role and Responsibilities
● Build a low-latency serving layer that powers DataWeave's Dashboards, Reports, and Analytics functionality (a minimal caching sketch follows this list)
● Build robust RESTful APIs that serve data and insights to DataWeave and other products
● Design user interaction workflows on our products and integrate them with data APIs
● Help stabilize and scale our existing systems. Help design the next generation systems.
● Scale our back end data and analytics pipeline to handle increasingly large amounts of data.
● Work closely with the Head of Products and UX designers to understand the product vision and design
philosophy
● Lead/be a part of all major tech decisions. Bring in best practices. Mentor younger team members and
interns.
● Constantly think scale, think automation. Measure everything. Optimize proactively.
● Be a tech thought leader. Add passion and vibrance to the team. Push the envelope.
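As a hedged sketch of what a low-latency serving layer can look like, the Python snippet below puts a Redis read-through cache in front of a stubbed analytics query. The key scheme, TTL, and `fetch_report_from_db` helper are illustrative assumptions, not DataWeave's actual design.

```python
import json

import redis

# Hypothetical connection; in practice the underlying data would live in MySQL/MongoDB/Cassandra.
cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

def fetch_report_from_db(report_id: str) -> dict:
    """Placeholder for the expensive analytics query behind a dashboard or report."""
    return {"report_id": report_id, "rows": []}

def get_report(report_id: str, ttl_seconds: int = 300) -> dict:
    """Read-through cache: serve from Redis when possible, otherwise query and cache the result."""
    key = f"report:{report_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)
    report = fetch_report_from_db(report_id)
    cache.setex(key, ttl_seconds, json.dumps(report))
    return report
```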
Skills and Requirements
● 4-7 years of experience building and scaling APIs and web applications.
● Experience building and managing large scale data/analytics systems.
● Have a strong grasp of CS fundamentals and excellent problem solving abilities. Have a good
understanding of software design principles and architectural best practices.
● Be passionate about writing code and have experience coding in multiple languages, including at least
one scripting language, preferably Python.
● Be able to argue convincingly why feature X of language Y rocks/sucks, or why a certain design decision
is right/wrong, and so on.
● Be a self-starter—someone who thrives in fast paced environments with minimal ‘management’.
● Have experience working with multiple storage and indexing technologies such as MySQL, Redis,
MongoDB, Cassandra, Elastic.
● Good knowledge (including internals) of messaging systems such as Kafka and RabbitMQ.
● Use the command line like a pro. Be proficient in Git and other essential software development tools.
● Working knowledge of large-scale computational models such as MapReduce and Spark is a bonus.
● Exposure to one or more centralized logging, monitoring, and instrumentation tools, such as Kibana,
Graylog, StatsD, Datadog etc.
● Working knowledge of building websites and apps. Good understanding of integration complexities and
dependencies.
● Working knowledge of Linux server administration as well as the AWS ecosystem is desirable.
● It's a huge bonus if you have some personal projects (including open source contributions) that you work
on in your spare time. Show off some of the projects you have hosted on GitHub.








