


Role
You will develop and maintain key backend code and infrastructure across the company stack. You will implement AI solutions such as LLMs for tasks like voice-based interactive systems, chatbots, and AI web apps. You should be able to see projects through from start to finish, with strong organizational skills and attention to detail. This is a great role for someone who enjoys building state-of-the-art AI products and working with cutting-edge AI technologies such as GPT, LLaMA, etc.
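As a rough illustration of the LLM-backed chatbot work described above (a minimal sketch, not a prescribed implementation), a single chatbot turn could be wired up as follows, assuming the OpenAI Python SDK (v1.x); the model name and prompts are placeholder assumptions.

```python
# Minimal sketch of one chatbot turn against an LLM API, assuming the OpenAI
# Python SDK (v1.x). The model name and prompts are illustrative placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def chatbot_reply(user_message: str) -> str:
    """Send one user message to the model and return its reply text."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; any chat-capable model works
        messages=[
            {"role": "system", "content": "You are a helpful support assistant."},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    print(chatbot_reply("What are your business hours?"))
```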
Qualifications
- BS or MS in Computer Science or relevant field.
- 4+ years of experience in backend software development
- Ability to design high-throughput, scalable backend systems
- Eagerness to learn applied AI technologies such as LLMs, prompt engineering, etc.
- Proficiency in Python.
- Experience with cloud computing platforms (AWS, GCP) and technologies like Docker
- Knowledge of REST APIs and databases (MySQL, MongoDB, vector databases)

About Avyott
Building the future with AI.
Avyott is a stealth-mode AI company at the forefront of innovative technology. We offer advanced artificial intelligence solutions that revolutionize industries and empower businesses. Voice interactive systems, chatbots, and web-based AI applications are some of our product offerings. Luxury and sophistication are what we strive for.


💥 What will you do?
As a Lead Backend Engineer, you will
- Build out and help scale our Django (+ReactJS / VueJS) based web application as we add new features and customers
- Help us transition from a monolithic system architecture to microservices and serverless architectures to meet our future scaling requirements
- Collaborate on a daily basis with a small, nimble team of product managers, engineers, and UX designers to understand business requirements and user experience goals and pain points.
- Build out features to enable multi-channel customer acquisition, including partner channels, enterprise channels, and government channels
- Integrate third-party plugins to enhance the customer experience, and build internal tooling to improve internal team efficiency
- Actively participate in code reviews
- Work with our quality assurance team to improve coverage on our automated testing suites
🙋 What are we looking for?
While we do not have a strict list of requirements for candidates interested in this role, some indicators that you would fit this role and our engineering culture are:
- Prior experience of a few years (3+ years) with Python-based frameworks such as Django or Flask is essential for working on our application stack.
- You actively promote a culture of engineering excellence, whether by writing efficient code, using elegant design patterns, or styling your code through code-linting policies.
- You enjoy designing software architecture by collaborating with engineering managers, architects, and other lead engineers to explore existing systems and determine areas of complexity and potential risks to successful implementation.
- You enjoy coaching folks to achieve outcomes through nudges.
📢 Other information you may want to consider
- We will be flexible for the rest of the pandemic and work remotely; however, we are not a remote-first company, and the work location would be Bangalore when things settle.
- Our backend tech stack includes Django, FastAPI, Postgres, Redis, Clickhouse, and TigerGraph. Our environments are managed through Docker, Kubernetes, and Terraform.
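As a rough illustration only, the pieces of this stack might fit together as in the minimal sketch below: a FastAPI endpoint fronted by a Redis cache, with the Postgres access stubbed out. Endpoint, key, and function names are illustrative assumptions, not part of our codebase.

```python
# Minimal sketch: a FastAPI endpoint with a Redis read-through cache.
# The database call is a stub standing in for a real Postgres query.
import json

import redis
from fastapi import FastAPI

app = FastAPI()
cache = redis.Redis(host="localhost", port=6379, decode_responses=True)


def load_report_from_db(report_id: int) -> dict:
    # Stand-in for a real Postgres query (e.g. via the Django ORM or SQLAlchemy).
    return {"id": report_id, "status": "ok"}


@app.get("/reports/{report_id}")
def get_report(report_id: int) -> dict:
    key = f"report:{report_id}"
    cached = cache.get(key)
    if cached:
        return json.loads(cached)
    report = load_report_from_db(report_id)
    cache.set(key, json.dumps(report), ex=60)  # cache for 60 seconds
    return report
```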


Technical Proficiency:
Must have:
- Strong Python development experience in Unix/Linux/Ubuntu environments
- Strong practical knowledge of Python and its libraries.
- Current working experience with cloud deployment on AWS/Azure/GCP, microservice architecture, and Docker in Python projects.
- Good knowledge of CI/CD and DevOps practices
- Good experience with Python frameworks such as Django, Scrapy, or Flask.
- Good experience with Jupyter, Docker, Elasticsearch, etc.
- Solid understanding of software development principles and best practices.
- Strong analytical thinking and problem-solving skills.
- Proven ability to drive large-scale projects with a deep understanding of Agile SDLC, high collaboration, and leadership.
Good to have:
- Migration experience from one version to another is expected, as this project involves migrating to the latest version.
- Experience with the Open edX platform, or any other LMS platform, is preferred.


We are looking for an experienced Python developer who can apply their skill set to create dynamic software applications for our clients. In this role, you will be responsible for gathering requirements from clients, writing and testing scalable code accordingly, and developing back-end components.
Technologies worked on:
Python - Django/Flask/FastAPI, Pytest/Unittest, AWS services.
Database - PostgreSQL/MySQL, or NoSQL databases.
Requirement Description:
• Experience in the design, implementation, and testing of Python applications.
• Must have knowledge of at least one Python web framework (Django/Flask/FastAPI) and at least one unit test framework (Pytest/Unittest); a brief sketch follows this list.
• Should have a solid understanding of object-oriented programming (OOP).
• Well experienced in performing unit testing and integration testing.
• Good experience with an Agile-based development approach.
• Expertise in developing enterprise-level web applications and REST/GraphQL APIs using microservices, with demonstrable production-scale experience.
• Demonstrate strong design and programming skills, writing optimized code.
• Working knowledge of SQL (MySQL, PostgreSQL, etc.) is mandatory; knowledge of NoSQL databases is a plus.
• Understand Architecture Requirements and ensure effective design, development, validation, and support activities.
• Understanding of core AWS services, their uses, and basic AWS architecture best practices.
• Proficiency in developing, deploying, and debugging cloud-based applications using AWS.
• Ability to use the AWS service APIs, AWS CLI, and SDKs to write applications
• Ability to identify key features of AWS services while designing a solution.
• Identify bottlenecks and bugs, and recommend solutions by weighing the advantages and disadvantages of custom development.
• Contribute to team meetings and to troubleshooting development and production problems across multiple environments and operating platforms.
• Demonstrate strong collaboration and communication skills within distributed project teams.
• Responsible for quality and timely deliverables for each given task.
• Knowledge of frontend technologies is a plus.
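To illustrate the web-framework and unit-testing expectations above, here is a minimal, hedged sketch of a REST endpoint together with a Pytest test, assuming FastAPI; the route and assertions are illustrative only.

```python
# Minimal sketch: a FastAPI health-check endpoint and a Pytest unit test for it.
from fastapi import FastAPI
from fastapi.testclient import TestClient

app = FastAPI()


@app.get("/health")
def health() -> dict:
    return {"status": "ok"}


client = TestClient(app)


def test_health_returns_ok():
    response = client.get("/health")
    assert response.status_code == 200
    assert response.json() == {"status": "ok"}
```

Running `pytest` against this file exercises the endpoint in-process, without starting a server.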

Architects are responsible for driving technology and best practices in engineering. We are a rapidly growing and constantly improving organisation. We expect very high levels of ownership from all individuals, especially in leadership roles like this one: ownership of your team and your product. Going beyond your role and contributing to make the organisation and business better is an expectation.
Responsibilities:
- Actively participate in development along with team members for as much as 50% of your time, creating modules and systems that can then be treated as a working reflection of best practices.
- Participating in code reviews, design reviews, architecture discussions.
- Being responsible for scaling, performance, and quality for the team
- Setting up best practices to help the team achieve the above, and constantly thinking about improving how technology is used
- Driving the adoption of these best practices around coding, design, quality, and performance within your team, and influencing their adoption across the entire organisation
- Experiment with new and relevant technologies and tools, and drive adoption while measuring yourself on the impact you are able to create
- Collaborate with Product Management and Product Development leaders in developing product visions and strategies.
- Define and drive the implementation of a long-term technology vision for your product and team
- You will be the primary owner of the architecture of your product and will also be responsible for getting it reviewed, and making sure the system is built accordingly.
- Be an evangelist for technology and represent the organisation in external forums.
- Create architectures and designs for new solutions around existing and new problem spaces
- Drive technology and tool choices for your team and be responsible for them.
Requirements:
- Quick and excellent problem-solving skills for complex, large-scale problems
- Exposure to a wide variety of problem spaces and technologies
- Very strong system design and OO skills, with a nifty ability to craft clean interfaces and operate at the right levels of abstraction
- Solid coding skills, with the ability to drive teams through massive refactoring exercises and improve coding standards across large code bases
- Deep knowledge, understanding, and experience of working with a large variety of multi-tier architectures
- Awareness of pitfalls and use cases for a large variety of solutions
- Deep understanding and experience of high-performance, web-scale, and real-time response systems, with expertise in a variety of large-scale persistent systems, including large databases
- Exposure to complete product development cycles: from inception to production to scaling up, supporting new requirements, and re-architecting. Principal Architects should have seen it all, ideally across multiple cycles.
- Should have been part of scalable product development cycles, with exposure to either large-scale data handling or large-scale transaction processing, for at least 5 years
- Must have worked in a small setup (either a startup or a small and reasonably independent team)
- 8+ years of overall experience
- B Tech or higher in Computer Science or equivalent required


Python API Developer JD:
Experience: 4-6 Yrs
Notice Period: 10-20 days or within 1 month
>> Develop and maintain various security software products with queues, caching, and database management.
>> Hands-on Python coding experience is required, along with knowledge of data structures, algorithms, and object-oriented programming.
>> Extensive experience in developing asynchronous systems (a brief sketch follows this list).
>> Integration of user-facing elements developed by front-end developers with server-side logic
>> Implementation of security and data protection.
>> Performance tuning, improvement, balancing, usability, automation
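As a hedged illustration of the asynchronous, queue-based processing mentioned above, a minimal sketch using only the Python standard library might look like this; the task payloads and sentinel pattern are illustrative assumptions.

```python
# Minimal sketch: an asyncio producer/consumer pair sharing a queue.
import asyncio


async def producer(queue: asyncio.Queue) -> None:
    for i in range(5):
        await queue.put({"task_id": i})
    await queue.put(None)  # sentinel telling the consumer to stop


async def consumer(queue: asyncio.Queue) -> None:
    while True:
        item = await queue.get()
        if item is None:
            break
        # Stand-in for real work (e.g. a security scan or a database write).
        await asyncio.sleep(0.1)
        print(f"processed task {item['task_id']}")


async def main() -> None:
    queue: asyncio.Queue = asyncio.Queue()
    await asyncio.gather(producer(queue), consumer(queue))


if __name__ == "__main__":
    asyncio.run(main())
```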
Mandatory Skills:
- Python
- Flask/Django
- API

Hi All,
We are hiring!!
Company: SpringML India Pvt Ltd.
Role: Lead Data Engineer
Location: Hyderabad
Website: https://springml.com/
About Company:
At SpringML, we are all about empowering the 'doers' in companies to make smarter decisions with their data. Our predictive analytics products and solutions apply machine learning to today's most pressing business problems so customers get insights they can trust to drive business growth.
We are a tight-knit, friendly team of passionate and driven people who are dedicated to learning, get excited to solve tough problems and like seeing results, fast. Our core values include placing our customers first, empathy and transparency, and innovation. We are a team with a focus on individual responsibility, rapid personal growth, and execution. If you share similar traits, we want you on our team.
What's the opportunity?
SpringML is looking to hire a top-notch Lead Data Engineer who is passionate about working with data and using the latest distributed frameworks to process large datasets.
As a Lead Data Engineer, your primary role will be to design and build data pipelines. You will focus on helping client projects with data integration, data preparation, and implementing machine learning on datasets.
In this role, you will work on some of the latest technologies, collaborate with partners on early wins, take a consultative approach with clients, interact daily with executive leadership, and help build a great company. Chosen team members will be part of the core team and play a critical role in scaling up our emerging practice.
Responsibilities:
- Ability to work as a member of a team assigned to design and implement data integration solutions.
- Build data pipelines using standard frameworks such as Hadoop, Apache Beam, and other open-source solutions (a brief sketch follows this list).
- Learn quickly – ability to understand and rapidly comprehend new areas – functional and technical – and apply detailed and critical thinking to customer solutions.
- Propose design solutions and recommend best practices for large scale data analysis
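As a rough, non-authoritative illustration of the data pipelines mentioned above, a minimal Apache Beam batch job might look like the sketch below, runnable locally with the DirectRunner; the input file, record layout, and aggregation are illustrative assumptions.

```python
# Minimal sketch: an Apache Beam pipeline that sums an amount per user.
import apache_beam as beam


def parse_record(line: str) -> tuple:
    # Assume a simple "user_id,amount" CSV layout for illustration.
    user_id, amount = line.split(",")
    return user_id, float(amount)


with beam.Pipeline() as pipeline:
    (
        pipeline
        | "ReadInput" >> beam.io.ReadFromText("input.csv")
        | "Parse" >> beam.Map(parse_record)
        | "SumPerUser" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda user, total: f"{user},{total}")
        | "Write" >> beam.io.WriteToText("totals")
    )
```

The same transforms can be pointed at a distributed runner (e.g. Dataflow or Spark) by changing pipeline options rather than the pipeline code.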
Skills:
- B.Tech degree in computer science, mathematics, or other relevant fields.
- 6+ years of experience in ETL, data warehousing, visualization, and building data pipelines.
- Strong programming skills – experience and expertise in one of the following: Java, Python, Scala, C.
- Proficient in big data/distributed computing frameworks such as Apache Spark and Kafka.
- Experience with Agile implementation methodology



Responsibilities:
- Writing reusable, testable, and efficient code
- Design and implementation of low-latency, high-availability, and performant applications
- Integration of user-facing elements developed by front-end developers with server side logic
- Implementation of security and data protection
- Integration of data storage solutions (may include databases, key-value stores, blob stores, etc.)
- Expert in Python, with knowledge of at least one Python web framework (such as Django, Flask, etc., depending on your technology stack)
- Familiarity with some ORM (Object Relational Mapper) libraries
- Able to integrate multiple data sources and databases into one system
- Understanding of the threading limitations of Python and of multi-process architecture (a brief sketch follows this list)
- Good understanding of server-side templating languages (such as Jinja2, Mako, etc., depending on your technology stack)
- Basic understanding of front-end technologies, such as JavaScript, HTML5, and CSS3
- Understanding of accessibility and security compliance (depending on the specific project)
- Knowledge of user authentication and authorization between multiple systems, servers, and environments
- Understanding of fundamental design principles behind a scalable application
- Familiarity with event-driven programming in Python
- Understanding of the differences between multiple delivery platforms, such as mobile vs desktop, and optimizing output to match the specific platform
- Able to create database schemas that represent and support business processes
- Strong unit test and debugging skills
- Basic knowledge of machine learning algorithms and libraries such as Keras, TensorFlow, and scikit-learn.
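As a brief illustration of the threading-limitations point above, the following minimal sketch contrasts the GIL-bound threading model with a multi-process approach for CPU-bound work; the prime-counting workload is an illustrative stand-in.

```python
# Minimal sketch: running CPU-bound work in separate processes.
# Threads would serialize this work behind the GIL; processes run it in parallel.
import math
from concurrent.futures import ProcessPoolExecutor


def count_primes_below(n: int) -> int:
    # CPU-heavy stand-in for real work.
    return sum(
        1
        for i in range(2, n)
        if all(i % d for d in range(2, int(math.sqrt(i)) + 1))
    )


if __name__ == "__main__":
    inputs = [50_000, 60_000, 70_000, 80_000]
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(count_primes_below, inputs))
    print(dict(zip(inputs, results)))
```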

About Us
DataWeave provides Retailers and Brands with "Competitive Intelligence as a Service" that enables them to take key decisions that impact their revenue. Powered by AI, we provide easily consumable and actionable competitive intelligence by aggregating and analyzing billions of publicly available data points on the Web to help businesses develop data-driven strategies and make smarter decisions.
Products@DataWeave
We, the Products team at DataWeave, build data products that provide timely insights that are readily consumable and actionable, at scale. Our underpinnings are: scale, impact, engagement, and visibility. We help businesses take data-driven decisions every day. We also give them insights for long-term strategy. We are focussed on creating value for our customers and helping them succeed.
How we work
It's hard to tell what we love more, problems or solutions! Every day, we choose to address some of the hardest data problems that there are. We are in the business of making sense of messy public data on the web. At serious scale! Read more at Become a DataWeaver.
What do we offer?
● Opportunity to work on some of the most compelling data products that we are building for online retailers and brands.
● Ability to see the impact of your work and the value you are adding to our customers almost immediately.
● Opportunity to work on a variety of challenging problems and technologies to figure out what really excites you.
● A culture of openness. Fun work environment. A flat hierarchy. Organization-wide visibility. Flexible working hours.
● Learning opportunities with courses, trainings, and tech conferences. Mentorship from seniors in the team.
● Last but not least, competitive salary packages and fast-paced growth opportunities.
Role and Responsibilities
● Build a low-latency serving layer that powers DataWeave's Dashboards, Reports, and Analytics functionality
● Build robust RESTful APIs that serve data and insights to DataWeave and other products
● Design user interaction workflows on our products and integrate them with data APIs
● Help stabilize and scale our existing systems. Help design the next-generation systems.
● Scale our back-end data and analytics pipeline to handle increasingly large amounts of data.
● Work closely with the Head of Products and UX designers to understand the product vision and design philosophy
● Lead/be a part of all major tech decisions. Bring in best practices. Mentor younger team members and interns.
● Constantly think scale, think automation. Measure everything. Optimize proactively.
● Be a tech thought leader. Add passion and vibrance to the team. Push the envelope.
Skills and Requirements
● 4-7 years of experience building and scaling APIs and web applications.
● Experience building and managing large-scale data/analytics systems.
● Have a strong grasp of CS fundamentals and excellent problem-solving abilities. Have a good understanding of software design principles and architectural best practices.
● Be passionate about writing code and have experience coding in multiple languages, including at least one scripting language, preferably Python.
● Be able to argue convincingly why feature X of language Y rocks/sucks, or why a certain design decision is right/wrong, and so on.
● Be a self-starter who thrives in fast-paced environments with minimal 'management'.
● Have experience working with multiple storage and indexing technologies such as MySQL, Redis, MongoDB, Cassandra, and Elasticsearch.
● Good knowledge (including internals) of messaging systems such as Kafka and RabbitMQ (a brief sketch follows this list).
● Use the command line like a pro. Be proficient in Git and other essential software development tools.
● Working knowledge of large-scale computational models such as MapReduce and Spark is a bonus.
● Exposure to one or more centralized logging, monitoring, and instrumentation tools, such as Kibana, Graylog, StatsD, Datadog, etc.
● Working knowledge of building websites and apps. Good understanding of integration complexities and dependencies.
● Working knowledge of Linux server administration as well as the AWS ecosystem is desirable.
● It's a huge bonus if you have some personal projects (including open source contributions) that you work on during your spare time. Show us some of the projects you have hosted on GitHub.
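As a hedged illustration of the messaging-system experience mentioned above, a minimal produce-and-consume round trip with the kafka-python client might look like the sketch below; the broker address, topic name, and payloads are illustrative assumptions, and a running Kafka broker is required.

```python
# Minimal sketch: produce a few JSON messages to a topic, then read them back.
import json

from kafka import KafkaConsumer, KafkaProducer

BROKER = "localhost:9092"   # assumed local broker
TOPIC = "price-updates"     # illustrative topic name

producer = KafkaProducer(
    bootstrap_servers=BROKER,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
for sku in ("A100", "B200"):
    producer.send(TOPIC, {"sku": sku, "price": 9.99})
producer.flush()

consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKER,
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,  # stop polling after 5 seconds of silence
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for message in consumer:
    print(message.value)
```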





