



About Nextalytics Software Services Pvt Ltd


- Develop and maintain scalable back-end applications using Python frameworks such as Flask/Django/FastAPI.
- Design, build, and optimize data pipelines for ETL processes using tools like PySpark, Airflow, and other similar technologies.
- Work with relational and NoSQL databases to manage and process large datasets efficiently.
- Collaborate with data scientists to clean, transform, and prepare data for analytics and machine learning models.
- Work in a dynamic environment at the intersection of software development and data engineering.
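The ETL pipeline work described above follows a standard extract-transform-load shape. As a framework-free sketch of that flow (function and field names here are illustrative, not from the posting; in practice a tool like Airflow or PySpark would orchestrate and scale these steps):

```python
import csv
import io
import json

def extract(csv_text):
    """Extract: parse raw CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: normalize fields, coerce types, drop incomplete records."""
    out = []
    for row in rows:
        if not row.get("amount"):
            continue  # skip records missing a required value
        out.append({"user": row["user"].strip().lower(),
                    "amount": float(row["amount"])})
    return out

def load(records):
    """Load: serialize to the destination format (JSON lines here)."""
    return "\n".join(json.dumps(r) for r in records)

raw = "user,amount\n Alice ,10.5\nBob,\ncarol,3\n"
print(load(transform(extract(raw))))
```

Each stage is a pure function, which keeps the pipeline easy to test in isolation before wiring it into a scheduler.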

A BIT ABOUT US
Appknox is one of the top mobile application security companies recognized by Gartner and G2. We are a profitable B2B SaaS startup headquartered in Singapore and working from Bengaluru.
The primary goal of Appknox is to help businesses and mobile developers secure their mobile applications with a focus on delivery speed and high-quality security audits.
Our customers include Fortune 500 companies and major brands across India, South-East Asia, the Middle East, Japan, and the US, and we are expanding rapidly.
The Opportunity:
We are seeking a highly skilled Senior Software Engineer (Backend) to join our dynamic software development team. In this role, you will contribute to key backend projects, collaborate across teams, and play a vital part in delivering robust, scalable, and high-performance software solutions. As a senior engineer, you will work independently, make impactful technical decisions, and help shape the backend architecture while collaborating with a passionate, high-performing team.
You will work hands-on with products primarily built in Python, with opportunities to contribute to Golang. These technologies are at the core of our development stack, and your focus will be on building, scaling, and maintaining distributed services. Distributed systems are integral to our architecture, providing a chance to gain hands-on experience with maintaining and optimizing them in a fast-paced environment.
As a Senior Engineer, you are expected to:
- Write clean, maintainable, and testable code while following best practices.
- Architect solutions, address complex problems, and deliver well-thought-out technical designs.
- Take ownership of assigned modules and features, delivering them with minimal supervision.
- Contribute to code reviews and technical discussions, ensuring high-quality deliverables.
We highly value open source contributions and encourage you to check out our work on GitHub at Appknox GitHub. While no prior experience in security is required, our experienced security professionals are available to support you in understanding the domain.
This role offers a unique opportunity to work on cutting-edge technology, drive impactful solutions, and grow within a collaborative environment that values autonomy, innovation, and technical excellence.
Responsibilities:
- Contribute to backend development for a cutting-edge product in the Security domain, with focus on performance, reliability, and maintainability.
- Implement software components and features based on high-level architectural guidance using Django and Django REST Framework (DRF).
- Collaborate with senior engineers to translate functional and technical requirements into robust backend implementations.
- Write clean, modular, and testable code following industry best practices and team standards.
- Participate in design discussions and code reviews to maintain code quality and continuously learn from peers.
- Work closely with frontend, QA, and security teams to deliver well-integrated, end-to-end solutions.
- Troubleshoot, debug, and resolve issues in existing systems, ensuring stability and efficiency.
- Contribute to the creation of technical documentation for components and modules you build or maintain.
- Follow defined verification plans and contribute to improving test coverage and automation.
- Participate in sprint planning, estimations, and agile ceremonies to support timely and effective project delivery.
- Proactively seek feedback and continuously improve your technical skills and understanding of system architecture.
- Support team success by collaborating effectively, sharing knowledge, and guiding junior team members when needed.
Requirements:
- 3-4 years of solid hands-on experience with Django and Django REST Framework.
- Good understanding of relational databases, SQL, and working with ORMs such as Django ORM.
- Ability to contribute to design discussions and implement well-structured, maintainable backend systems.
- Exposure to writing unit and integration tests; familiarity with CI/CD pipelines and version control systems.
- Proficiency in debugging, performance optimization, and addressing scalability concerns under guidance.
- Strong fundamentals in data structures, algorithms, and clean coding practices.
- Able to collaborate effectively within a team and take ownership of moderately complex modules.
- Good communication skills, with the ability to document and explain solutions to peers.
- Familiarity with cloud platforms and microservices architecture is a plus.
- Self-driven with a growth mindset and willingness to learn from peers and feedback.
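DRF serializers, central to the requirements above, validate incoming payloads against declared fields before anything touches the ORM. A framework-free sketch of that validate/validated-data pattern (field names are illustrative; real DRF serializers add far more, such as nested fields and model binding):

```python
def validate(payload, fields):
    """Check required fields and coerce types, mimicking the
    is_valid()/validated_data cycle of a DRF serializer."""
    errors = {}
    validated = {}
    for name, cast in fields.items():
        if name not in payload:
            errors[name] = "This field is required."
            continue
        try:
            validated[name] = cast(payload[name])
        except (TypeError, ValueError):
            errors[name] = "Invalid value."
    return validated, errors

# Hypothetical payload for a security-finding API
fields = {"title": str, "severity": int}
ok, errs = validate({"title": "SQL injection", "severity": "8"}, fields)
bad, bad_errs = validate({"title": "XSS"}, fields)
```

Collecting all errors instead of failing on the first one is the same design choice DRF makes: the client gets one response describing everything wrong with the request.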
Work Expectations:
Within 1 month-
- Attend KT sessions conducted by the engineering and product teams to gain a deep understanding of the product, its architecture, and workflows.
- Learn about the team's development processes, tools, and key challenges.
- Work closely with the product team to understand product requirements and contribute to the design and development of features.
- Dive deep into the existing backend architecture, including database structures, APIs, and integration points, to fully understand the technical landscape.
- Begin addressing minor technical challenges and bugs, while understanding the underlying architecture and tech stack.
- Begin to participate in creating action plans for new features, ensuring that design and implementation are aligned with product goals.
Within 3 months-
- Achieve full autonomy in working on the codebase, demonstrating the ability to independently deliver high-quality features from design to deployment.
- Take complete ownership of critical modules, ensuring they are optimized for performance and maintainability.
- Act as a technical resource for the team, offering support and guidance to peers on complex issues.
- Collaborate with DevOps to optimize deployment pipelines, debug production issues, and improve backend infrastructure.
- Lead discussions for technical solutions and provide recommendations for architectural improvements.
- Contribute to the design of new features by translating functional requirements into detailed technical specifications.
- Prepare regular updates on assigned tasks and communicate effectively with the engineering manager and other stakeholders.
Within 6 months-
- Be fully independent in their development tasks, contributing to key features and solving critical challenges.
- Demonstrate strong problem-solving skills and the ability to take ownership of technical modules.
- Actively participate in code reviews and technical discussions, ensuring high-quality deliverables.
- Collaborate seamlessly with cross-functional teams to align technical solutions with business requirements.
- Establish themselves as a reliable and proactive team member, contributing to the team’s growth and success.
Personality traits we really admire:
- Great attitude to ask questions, learn and suggest process improvements.
- Has attention to details and helps identify edge cases.
- Proactive mindset.
- Highly motivated and coming up with ideas and perspective to help us move towards our goals faster.
- Follows timelines and absolute commitment to deadlines.
Interview Process
- Round 1 Interview - Profile Evaluation (EM)
- Round 2 Interview - Assignment Evaluation & Technical Problem Solving discussion (Tech Team)
- Round 3 Interview - Engineering Team & Technical Founder (CTO)
- Round 4 Interview - HR
Compensation
- Best in industry
We prefer that every employee also holds equity in the company. In this role, you will be awarded equity after 12 months, based on the impact you have created.
Please be aware that all our customers are enterprises and Fortune 500 companies.
Why Join Us:
- Freedom & Responsibility: If you are a person who enjoys challenging work & pushing your boundaries, then this is the right place for you. We appreciate new ideas & ownership as well as flexibility with working hours.
- Great Salary & Equity: We keep up with the market standards & provide pay packages considering updated standards. Also as Appknox continues to grow, you’ll have a great opportunity to earn more & grow with us. Moreover, we also provide equity options for our top performers.
- Holistic Growth: We foster a culture of continuous learning and take a holistic approach to training and developing our greatest asset: our employees. We will support you throughout that journey.
- Transparency: Being part of a start-up is an amazing experience, in part because of the open communication and transparency at multiple levels. Working with Appknox will give you the opportunity to experience it all first-hand.
- Health insurance: We offer health insurance coverage of up to ₹5 lakh for you and your family, including parents.

💼 Job Title: Full Stack Developer (Fresher/Experienced)
🏢 Company: SDS Softwares
💻 Location: Work from Home
💸 Salary range: ₹7,000 - ₹18,000 per month (based on knowledge and interview)
🕛 Shift Timings: 12 PM to 9 PM
About the role: As a Full Stack Developer, you will work on both the front-end and back-end of web applications. You will be responsible for developing user-friendly interfaces and maintaining the overall functionality of our projects.
⚜️ Key Responsibilities:
- Collaborate with cross-functional teams to define, design, and ship new features.
- Develop and maintain high-quality web applications (frontend + backend).
- Troubleshoot and debug applications to ensure peak performance.
- Participate in code reviews and contribute to the team’s knowledge base.
⚜️ Required Skills:
- Proficiency in HTML, CSS, JavaScript, React.js for front-end development. ✅
- Understanding of server-side languages such as Node.js, Python, or PHP. ✅
- Familiarity with database technologies such as MySQL, MongoDB, or PostgreSQL. ✅
- Basic knowledge of version control systems, particularly Git.
- Strong problem-solving skills and attention to detail.
- Excellent communication skills and a team-oriented mindset.
💠 Qualifications:
- Recent graduates or individuals with internship experience (6 months to 1.5 years) in software development.
- Must have a personal laptop and stable internet connection.
- Ability to join immediately is preferred.
If you are passionate about coding and eager to learn, we would love to hear from you. 👍

Role: Data Engineer
Company: PayU
Location: Bangalore/ Mumbai
Experience : 2-5 yrs
About Company:
PayU is the payments and fintech business of Prosus, a global consumer internet group and one of the largest technology investors in the world. Operating and investing globally in markets with long-term growth potential, Prosus builds leading consumer internet companies that empower people and enrich communities.
The leading online payment service provider in 36 countries, PayU is dedicated to creating a fast, simple and efficient payment process for merchants and buyers. Focused on empowering people through financial services and creating a world without financial borders where everyone can prosper, PayU is one of the biggest investors in the fintech space globally, with investments totalling $700 million to date. PayU also specializes in credit products and services for emerging markets across the globe. We are dedicated to removing risks to merchants, allowing consumers to use credit in ways that suit them and enabling a greater number of global citizens to access credit services.
Our local operations in Asia, Central and Eastern Europe, Latin America, the Middle East, Africa and South East Asia enable us to combine the expertise of high growth companies with our own unique local knowledge and technology to ensure that our customers have access to the best financial services.
India is the biggest market for PayU globally, and the company has already invested $400 million in this region in the last 4 years. In its next phase of growth, PayU is developing a full regional fintech ecosystem providing multiple digital financial services in one integrated experience. We are going to do this through three mechanisms: build; co-build/partner; and select strategic investments.
PayU supports over 350,000+ merchants and millions of consumers making payments online with over 250 payment methods and 1,800+ payment specialists. The markets in which PayU operates represent a potential consumer base of nearly 2.3 billion people and a huge growth potential for merchants.
Job responsibilities:
- Design infrastructure for data, especially for but not limited to consumption in machine learning applications
- Define database architecture needed to combine and link data, and ensure integrity across different sources
- Ensure the performance of data systems powering machine learning, from customer-facing web and mobile applications built on cutting-edge open-source frameworks, to highly available RESTful services, to back-end Java-based systems
- Work with large, fast, complex data sets to solve difficult, non-routine analysis problems, applying advanced data handling techniques if needed
- Build data pipelines, including implementing, testing, and maintaining infrastructural components of the data engineering stack.
- Work closely with Data Engineers, ML Engineers and SREs to gather data engineering requirements to prototype, develop, validate and deploy data science and machine learning solutions
Requirements to be successful in this role:
- Strong knowledge of and experience with Python, Pandas, data wrangling, ETL processes, statistics, data visualisation, data modelling, and Informatica.
- Strong experience with scalable compute solutions such as in Kafka, Snowflake
- Strong experience with workflow management libraries and tools such as Airflow, AWS Step Functions etc.
- Strong experience with data engineering practices (i.e. data ingestion pipelines and ETL)
- A good understanding of machine learning methods, algorithms, pipelines, testing practices and frameworks
- (Preferred) MEng/MSc/PhD degree in computer science, engineering, mathematics, physics, or equivalent (preference: DS/AI)
- Experience with designing and implementing tools that support sharing of data, code, practices across organizations at scale
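Workflow tools like Airflow and AWS Step Functions, named in the requirements above, run tasks in dependency order. A minimal pure-Python sketch of that scheduling idea (task names are illustrative; Airflow adds retries, scheduling, and distributed execution on top of exactly this kind of dependency resolution):

```python
def run_dag(tasks, deps):
    """Execute tasks in topological order.

    tasks: name -> zero-arg callable; deps: name -> list of upstream names.
    Returns the execution order, which a scheduler like Airflow
    resolves for you from the declared dependencies.
    """
    order = []
    done = set()

    def visit(name, stack=()):
        if name in done:
            return
        if name in stack:
            raise ValueError("cycle at " + name)
        for up in deps.get(name, []):
            visit(up, stack + (name,))  # run upstream tasks first
        tasks[name]()
        done.add(name)
        order.append(name)

    for name in tasks:
        visit(name)
    return order

log = []
tasks = {
    "load": lambda: log.append("load"),
    "extract": lambda: log.append("extract"),
    "transform": lambda: log.append("transform"),
}
deps = {"transform": ["extract"], "load": ["transform"]}
execution = run_dag(tasks, deps)
print(execution)
```

Even though `load` is declared first, its dependencies force `extract` and `transform` to run before it, which is the core guarantee a workflow engine provides.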

Sizzle is an exciting new startup that’s changing the world of gaming. At Sizzle, we’re building AI to automate gaming highlights, directly from Twitch and YouTube streams.
For this role, we're looking for someone who ideally loves to watch video gaming content on Twitch and YouTube. Specifically, you will help generate training data for all the AI we are building. This will include gathering screenshots, clips, and other data from gaming videos on Twitch and YouTube. You will then be responsible for labeling and annotating them. You will work very closely with our AI engineers.
You will:
- Gather training data as specified by the management and engineering team
- Label and annotate all the training data
- Ensure all data is prepped and ready to feed into the AI models
- Revise the training data as specified by the engineering team
- Test the output of the AI models and update training data needs
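Labeling work like the above typically produces structured annotation records that feed model training. A sketch of one possible JSON shape (the fields are illustrative assumptions, not Sizzle's actual schema):

```python
import json

def make_annotation(clip_id, start_s, end_s, label, annotator):
    """One labeled segment of a gaming video, as a training-data record."""
    if end_s <= start_s:
        raise ValueError("end must come after start")
    return {
        "clip_id": clip_id,      # which video clip this came from
        "start_s": start_s,      # segment start, in seconds
        "end_s": end_s,          # segment end, in seconds
        "label": label,          # e.g. a highlight category
        "annotator": annotator,  # who labeled it, for quality review
    }

record = make_annotation("twitch-001", 12.0, 19.5, "multi-kill", "ann-07")
print(json.dumps(record))
```

Keeping the annotator ID on every record makes it possible to audit label quality later, which matters when training data is revised repeatedly.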
You should have the following qualities:
- Willingness to work hard and hit deadlines
- Work well with people
- Be able to work remotely (if not in Bangalore)
- Interested in learning about AI and computer vision
- Willingness to learn rapidly on the job
- Ideally a gamer or someone interested in watching gaming content online
Skills:
Data labeling, annotation, AI, computer vision, gaming
Work Experience: 0 years to 3 years
About Sizzle
Sizzle is building AI to automate gaming highlights, directly from Twitch and YouTube videos. Presently, there are over 700 million fans around the world that watch gaming videos on Twitch and YouTube. Sizzle is creating a new highlights experience for these fans, so they can catch up on their favorite streamers and esports leagues. Sizzle is available at www.sizzle.gg.

- Can write reliable, scalable, testable and maintainable code.
- Familiarity with Agile methodologies and clean code.
- Design and/or contribute to client-side and server-side architecture.
- Well versed with fundamentals of REST.
- Build the front-end of applications through appealing visual design.
- Knowledge of one or more front-end languages and libraries (e.g. HTML/CSS, JavaScript, XML, jQuery, TypeScript) and JavaScript frameworks (e.g. Angular, React, Redux, Vue.js)
- Knowledge of one or more back-end languages (e.g. C#, Java, Python, Go, Node.js and frameworks like SpringBoot, .NET Core)
- Well versed with fundamentals of database design.
- Familiarity with databases - RDBMS like MySQL, Postgres & NoSQL like MongoDB, DynamoDB.
- Well versed with one or more cloud platforms like - AWS, Azure, GCP.
- Familiar with Infrastructure as Code - CloudFormation & Terraform & deployment tools like Docker, Kubernetes.
- Familiarity with CI/CD tools like Jenkins, CircleCI, Github Actions.
- Unit testing tools like JUnit, Mockito, Chai, Mocha, Jest.
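The unit-testing tools listed above (JUnit, Mocha, Jest, and friends) all share the same arrange-act-assert pattern. A minimal sketch of that pattern using Python's stdlib `unittest` (the function under test is a made-up example):

```python
import unittest

def slugify(title):
    """Tiny function under test: lowercase and hyphenate a title."""
    return "-".join(title.lower().split())

class SlugifyTest(unittest.TestCase):
    def test_basic(self):
        # arrange/act/assert in one line for a pure function
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_extra_spaces(self):
        # edge case: leading, trailing, and repeated whitespace
        self.assertEqual(slugify("  A  B "), "a-b")

suite = unittest.defaultTestLoader.loadTestsFromTestCase(SlugifyTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The same two tests translate almost mechanically into JUnit's `@Test` methods or Jest's `test()` blocks; only the assertion syntax changes.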


Introduction
Synapsica is a growth-stage HealthTech startup founded by alumni from IIT Kharagpur, AIIMS New Delhi, and IIM Ahmedabad. We believe healthcare needs to be transparent, objective, and affordable. Every patient has the right to know exactly what is happening in their body, without having to rely on the cryptic two-liners given to them as a diagnosis. Towards this aim, we are building an artificial intelligence enabled, cloud-based platform to analyse medical images and create v2.0 of advanced radiology reporting. We are backed by Y Combinator and other investors from India, the US, and Japan. We are proud to have GE, AIIMS, and Spinal Kinetics as our partners.
Your Roles and Responsibilities
The role involves computer vision tasks including the development, customization, and training of Convolutional Neural Networks (CNNs); the application of ML techniques (SVM, regression, clustering, etc.); and traditional image processing (OpenCV, etc.). The role is research-focused and involves reading and implementing existing research papers, deep-diving into problem analysis, generating new ideas, and automating and optimizing key processes.
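The convolution at the heart of the CNN work described above can be sketched without any framework. A pure-Python valid-mode 2D convolution (strictly, cross-correlation, which is what deep-learning "conv" layers actually compute; PyTorch and TensorFlow do this vectorized on GPU):

```python
def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation: slide the kernel over the image
    and sum elementwise products at each position."""
    kh, kw = len(kernel), len(kernel[0])
    oh = len(image) - kh + 1      # output height shrinks by kh - 1
    ow = len(image[0]) - kw + 1   # output width shrinks by kw - 1
    out = []
    for i in range(oh):
        row = []
        for j in range(ow):
            s = 0.0
            for u in range(kh):
                for v in range(kw):
                    s += image[i + u][j + v] * kernel[u][v]
            row.append(s)
        out.append(row)
    return out

# A vertical-edge-style kernel applied to a 4x4 horizontal ramp
image = [[0, 1, 2, 3],
         [0, 1, 2, 3],
         [0, 1, 2, 3],
         [0, 1, 2, 3]]
kernel = [[1, 0, -1],
          [1, 0, -1],
          [1, 0, -1]]
print(conv2d(image, kernel))
```

A CNN layer learns many such kernels and applies them across input channels; the sliding-window arithmetic is exactly this loop.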
Requirements:
- Strong problem-solving ability
- Prior experience with Python, cuDNN, Tensorflow, PyTorch, Keras, Caffe (or similar Deep Learning frameworks).
- Extensive understanding of computer vision/image processing applications like object classification, segmentation, object detection, etc.
- Ability to write custom Convolutional Neural Network Architecture in Pytorch (or similar)
- Experience of GPU/DSP/other Multi-core architecture programming
- Effective communication with other project members and project stakeholders
- Detail-oriented, eager to learn, acquire new skills
- Prior Project Management and Team Leadership experience
- Ability to plan work and meet deadlines
- End-to-end deployment of deep learning models.


About the Company
Blue Sky Analytics is a Climate Tech startup that combines the power of AI & Satellite data to aid in the creation of a global environmental data stack. Our funders include Beenext and Rainmatter. Over the next 12 months, we aim to expand to 10 environmental data-sets spanning water, land, heat, and more!
We are looking for a Data Lead: someone who works at the intersection of data science, GIS, and engineering. We want a leader who not only understands environmental data but can also quickly assemble large-scale datasets that are crucial to the well-being of our planet. Come save the planet with us!
Your Role
Manage: As a leadership position, this role requires long-term strategic thinking. You will be in charge of the daily operations of the data team, including running team standups, planning the execution of data generation, and ensuring the algorithms are put into production. You will also be the person in charge of breaking the data science down for the rest of us who do not know what it means.
Love and Live Data: You will also be taking all the responsibility of ensuring that the data we generate is accurate, clean, and is ready to use for our clients. This would entail that you understand what the market needs, calculate feasibilities and build data pipelines. You should understand the algorithms that we use or need to use and take decisions on what would serve the needs of our clients well. We also want our Data Lead to be constantly probing for newer and optimized ways of generating datasets. It would help if they were abreast of all the latest developments in the data science and environmental worlds. The Data Lead also has to be able to work with our Platform team on integrating the data on our platform and API portal.
Collaboration: We use Clubhouse to track and manage our projects across the organization; this will require you to collaborate with the team and follow up with members on a regular basis. About 50% of the work requires staying on the pulse of the platform team. You'll collaborate closely with peers from other functions (Design, Product, Marketing, Sales, and Support, to name a few) on our overall product roadmap, on product launches, and on ongoing operations. You will find yourself working with the product management team to define and execute the feature roadmap. You will be expected to work closely with the CTO, reporting on daily operations and development. We don't believe in a top-down hierarchical approach and are transparent with everyone. This means honest and mutual feedback and the ability to adapt.
Teaching: Not exactly in the traditional sense. You'll recruit, coach, and develop engineers while ensuring that they are regularly receiving feedback and making rapid progress on personal and professional goals.
Humble and cool: Look we will be upfront with you about one thing - our team is fairly young and is always buzzing with work. In this fast-paced setting, we are looking for someone who can stay cool, is humble, and is willing to learn. You are adaptable, can skill up fast, and are fearless at trying new methods. After all, you're in the business of saving the planet!
Requirements
- A minimum of 5 years of industry experience.
- Hyper-curious!
- Exceptional at Remote Sensing Data, GIS, Data Science.
- Must have big data & data analytics experience
- Very good in documentation & speccing datasets
- Experience with AWS Cloud, Linux, Infra as Code & Docker (containers) is a must
- Coordinate with cross-functional teams (DevOPS, QA, Design etc.) on planning and execution
- Lead, mentor and manage deliverables of a team of talented and highly motivated team of developers
- Must have experience in building, managing, growing & hiring data teams. Has built large-scale datasets from scratch
- Manage work on the team's Clubhouse and follow up with the team; roughly 50% of the work requires staying on the pulse of the platform team
- Exceptional communication skills & ability to abstract away problems & build systems. Should be able to explain to the management anything & everything
- Quality control - you'll be responsible for maintaining a high quality bar for everything your team ships. This includes documentation and data quality
- Experience of having led smaller teams, would be a plus.
Benefits
- Work from anywhere: Work by the beach or from the mountains.
- Open source at heart: We are building a community whose work you can use, contribute to, and collaborate on.
- Own a slice of the pie: Possibility of becoming an owner by investing in ESOPs.
- Flexible timings: Fit your work around your lifestyle.
- Comprehensive health cover: Health cover for you and your dependents to keep you tension free.
- Work Machine of choice: Buy a device and own it after completing a year at BSA.
- Quarterly Retreats: Yes, there's work, but then there's all the non-work fun aspect, aka the retreat!
- Yearly vacations: Take time off to rest and get ready for the next big assignment by availing the paid leaves.





