
Job Description:
- Have intermediate/advanced knowledge of Python.
- Hands-on experience with OOP in Python; experience with the Flask/Django frameworks and ORM with MySQL or MongoDB is a plus.
- Must have experience writing shell scripts and configuration files. Should be proficient in Bash.
- Should have excellent Linux administration capabilities.
- Working experience with SCM; Git is preferred.
- Should have knowledge of the basics of networking in Linux, and computer networks in general.
- Experience with engineering practices such as code refactoring, design patterns, design-driven development, and Continuous Integration.
- Understanding of the architecture of OpenStack/Kubernetes and good knowledge of standard client interfaces is a plus.
- Code contributed to the OpenStack/Kubernetes community will be a plus.
- Understanding of the NFV and SDN domains will be a plus.

Responsibilities:
• Help define and create backend architecture and deployments using Python, Django, and AWS in an agile environment with lots of ownership and active mentoring
• Work with the Product and Design teams to build new features to solve business problems and fill business needs
• Participate in code reviews to create robust and maintainable code
• Work in an agile environment where quick iterations and good feedback are a way of life
• Interact with other stakeholders for requirements and design discussions and for adoption of new features
• Communicate and coordinate with our support and professional services teams to solve customer issues
• Help scale our platform as we expand our product across various markets and verticals globally
As a young, fresh startup, we hope to be joined by self-starting, hardworking, passionate individuals who are committed to delivering their best and who can grow into future leaders of FactWise.
Responsibilities:
● Designing and developing robust and scalable server-side applications using Python, Flask, Django, or other relevant frameworks and technologies.
● Collaborating with other developers, data scientists, and data engineers to design and implement RESTful APIs, web services, and microservices architectures.
● Writing clean, maintainable, and efficient code, and reviewing the code of other team members to ensure consistency and adherence to best practices.
● Participating in code reviews, testing, debugging, and troubleshooting to ensure the quality and reliability of applications.
● Optimising applications for performance, scalability, and security, and monitoring the production environment to ensure uptime and availability.
● Staying up-to-date with emerging trends and technologies in web development, and evaluating and recommending new tools and frameworks as needed.
● Mentoring and coaching junior developers to ensure they grow and develop their skills and knowledge in line with the needs of the team and the organisation.
● Communicating and collaborating effectively with other stakeholders, including product owners, project managers, and other development teams, to ensure projects are delivered on time and to specification.
You are a perfect match if you have these qualifications:
● Strong experience in Python and server-side development frameworks such as Flask or Django.
● Experience in building RESTful APIs, web services, and microservices architectures.
● Experience in using database technologies such as MySQL, PostgreSQL, or MongoDB.
● Familiarity with cloud-based platforms such as AWS, Azure, or Google Cloud Platform.
● Knowledge of software development best practices such as Agile methodologies, Test-Driven Development (TDD), and Continuous Integration/Continuous Deployment (CI/CD).
● Excellent problem-solving and debugging skills, and the ability to work independently as well as part of a team.
● Strong communication and collaboration skills, and the ability to work effectively with other stakeholders in a fast-paced environment.
Designation: Senior Software Developer (Python)
Department: Software
Location: Airoli, Navi Mumbai
Office Time: 11 am – 8 pm / 12 pm – 9 pm
Skills and Experience:
- Should have 5-8 years of relevant experience
- Python programming experience: 4+ years
- Django (Python web framework) experience: 2+ years
- Flask/FastAPI (API development) experience: 1+ years
- PostgreSQL database experience: 3+ years
- Machine learning (ML)/NLP experience: 1+ years
- Good to have knowledge of OCR/Tesseract
- Good to have knowledge of LLMs/BERT/GenAI
- Good to have knowledge of Celery for queuing
- Good to have knowledge of ML model/Python app deployment on Azure/AWS
About the role
Checking quality is one of the most important tasks at Anakin. Our clients price their products based on our data, and minor errors on our end can cost our clients millions of dollars. You would work with multiple tools and with people across various departments to ensure the accuracy of the data being crawled. You would set up manual and automated processes and make sure they run to ensure the highest possible data quality.
You are the engineer other engineers can count on. You embrace every problem with enthusiasm. You remove hurdles, are a self-starter, and are a team player. You have the hunger to venture into unknown areas and make the system work.
Your Responsibilities would be to:
- Understand customer web scraping and data requirements; translate these into test approaches that include exploratory manual/visual testing and any additional automated tests deemed appropriate
- Take ownership of the end-to-end QA process in newly-started projects
- Draw conclusions about data quality by producing basic descriptive statistics, summaries, and visualisations
- Proactively suggest and take ownership of improvements to QA processes and methodologies by employing other technologies and tools, including but not limited to: browser add-ons, Excel add-ons, UI-based test automation tools etc.
- Ensure that project requirements are testable; work with project managers and/or clients to clarify ambiguities before QA begins
- Drive innovation and advanced validation and analytics techniques to ensure data quality for Anakin's customers
- Optimize data quality codebases and systems to monitor the Anakin family of app crawlers
- Configure and optimize the automated and manual testing and deployment systems used to check the quality of billions of data points from 1,000+ crawlers across the company
- Analyze data and bugs that require in-depth investigations
- Interface directly with external customers including managing relationships and steering requirements
Basic Qualifications:
- 2+ years of experience as a backend or a full-stack software engineer
- Web scraping experience with Python or Node.js
- 2+ years of experience with AWS services such as EC2, S3, Lambda, etc.
- Should have managed a team of software engineers
- Should be paranoid about data quality
Preferred Skills and Experience:
- Deep experience with network debugging across all OSI layers (Wireshark)
- Knowledge of networks and/or cybersecurity
- Broad understanding of the landscape of software engineering design patterns and principles
- Ability to work quickly and accurately in a high-stress environment, removing bugs at run time within minutes
- Excellent communicator, both written and verbal
Additional Requirements:
- Must be available to work extended hours and weekends when needed to meet critical deadlines
- Must have an aversion to politics and BS; should let their work speak for them.
- Must be comfortable with uncertainty. In almost all cases, your job will be to figure it out.
- Must not be bound to a comfort zone. Often, you will need to challenge yourself to go above and beyond.
What You'll Do
You will be part of our data platform & data engineering team. As part of this agile team, you will work in our cloud-native environment and perform the following activities to support core product development and client-specific projects:
- You will develop the core engineering frameworks for an advanced self-service data analytics product.
- You will work with multiple types of data storage technologies such as relational, blobs, key-value stores, document databases and streaming data sources.
- You will work with the latest technologies for data federation with MPP (Massively Parallel Processing) capabilities
- Your work will entail backend architecture to enable data modeling, data queries and API development for both back-end and front-end data interfaces.
- You will support client-specific data processing needs using SQL and Python/PySpark
- You will integrate our product with other data products through Django APIs
- You will partner with other team members in understanding the functional / non-functional business requirements, and translate them into software development tasks
- You will follow software development best practices, ensuring that the code architecture and the quality of the code you write are of a high standard, as expected of enterprise software
- You will be a proactive contributor to team and project discussions
Who you are
- Strong education track record: Bachelor's or an advanced degree in Computer Science or a related engineering discipline from an Indian Institute of Technology or an equivalent premier institute.
- 2-3 years of experience in data queries, data processing and data modeling
- Excellent ANSI SQL skills to handle complex queries
- Excellent Python and Django programming skills.
- Strong knowledge of and experience with modern, distributed data stack components such as Spark, Hive, Airflow, Kubernetes, Docker, etc.
- Experience with cloud environments (AWS, Azure) and native cloud technologies for data storage and data processing
- Experience with relational SQL and NoSQL databases, including Postgres, Blobs, MongoDB etc.
- Familiarity with ML models is highly preferred
- Experience with Big Data processing and performance optimization
- Should know how to write modular, optimized, and documented code.
- Should have good knowledge of error handling.
- Experience with version control systems such as Git
- Strong problem solving and communication skills.
- Self-starter, continuous learner.
Good to have:
- Start-up experience (highly preferred)
- Exposure to Business Intelligence (BI) tools like Tableau, Dundas, Power BI, etc.
- Agile software development methodologies
- Experience working in multi-functional, multi-location teams
What You'll Love About Us – Do ask us about these!
- Be an integral part of the founding team. You will work directly with the founder
- Work Life Balance. You can't do a good job if your job is all you do!
- Prepare for the Future. Academy – we are all learners; we are all teachers!
- Diversity & Inclusion. HeForShe!
- Internal Mobility. Grow with us!
- Business knowledge of multiple sectors
We are looking for a passionate Python/Django Developer whose responsibilities include gathering user requirements, defining system functionality, and writing code in languages and frameworks such as Python, Django, JavaScript, HTML, and CSS. Ultimately, the role of the Django Developer is to build high-quality, innovative, and fully performing software that complies with coding standards and technical design.
Responsibilities:
You would be working on our core platform, improving the features of the product, testing and fixing bugs/issues, customizing it for clients, cloud and on-premise deployment, security testing, and configuration, etc. You will also get a chance to build new products from scratch.
Tools and technologies that you’d be working on include the following:
- Django, Python.
- Bootstrap, MaterializeCSS, HTML, JavaScript.
- Nginx, Gunicorn, MySQL/Postgres, API integrations (JSON, XML, SOAP).
- Shell commands, SSH, SSL certificates, HTTP/HTTPS.
Eligibility:
- Knowledge of developing web applications using at least one popular web framework (preferably Django).
- Excellent knowledge of relational databases like Postgres/MySQL, etc.
- Knowledge of designing interactive applications.
- Ability to develop software using Python (Django framework), JavaScript, HTML, and CSS.
- Ability to document requirements and specifications.
- Bachelor of Engineering Degree in Computer Science or Information Technology or Electrical Engineering.
https://docs.google.com/forms/d/e/1FAIpQLSfG91burhFb8nTk4xoU0O8i4Jyjt9W156yuJnjol1fPvlPcfg/viewform?usp=pp_url
We are looking for developers who are passionate about Python and understand its benefits well.
Responsibilities:
- Design & implement new software modules based on the product requirements
- Debug existing software components, fix issues, and avoid regressions
- Be proactive, take ownership, and be accountable
- Be familiar with Test-Driven Development.
- Participate in software architecture, design discussions, and code reviews
Technologies we’re hiring for:
- Experience developing HTTP-based REST APIs and implementing authentication and caching
- Experience with relational and NoSQL databases and awareness of concepts like ORMs and migrations
- Languages & Frameworks: Python Django/Flask
- Database: Postgres/ Mongo
- Must have worked in a Unix/Linux based environment
- Advanced middleware such as RabbitMQ and Celery Beat is definitely a plus.
A minimum of 3+ years of hands-on Python experience is needed.
About LatentBridge:
We are a global intelligent automation firm with a market-leading pay-as-you-go SaaS platform and proprietary automation accelerators that can optimise and scale enterprises' digital programs. We provide end-to-end automation solutions through advisory, implementation, and managed services. Our cloud platform and industry-focused AI products enable us to make automation accessible to every enterprise.
Python, JavaScript, HTML, CSS.
Experience Range: 3-7 years.
Roles and Responsibilities
- Pytest
- Worked on utility modules and libraries
- Unified server for API testing
- Building an automation framework for performance and functional testing
- Building mock services
- Self-driven
Desired Candidate Profile
- Python
- Automation
- Development
- Writing efficient, reusable, testable, and scalable code
- Understanding, analyzing, and implementing business needs and feature modification requests, and converting them into software components
- Integration of user-oriented elements into different applications and data storage solutions
- Developing backend components to enhance performance and responsiveness, server-side logic and platform, statistical learning models, and highly responsive web applications
- Designing and implementing high-availability, low-latency applications with data protection and security features
- Performance tuning and automation of applications
- Working with Python libraries like Pandas, NumPy, etc.
- Creating predictive models for AI and ML-based features
- Keeping abreast with the latest technology and trends
- Fine-tune and develop AI/ML-based algorithms based on results
Technical Skills-
Good proficiency in,
- Python frameworks like Django, etc.
- Web frameworks and RESTful APIs
- Core Python fundamentals and programming
- Code packaging, release, and deployment
- Database knowledge
- Loops, conditional, and control statements
- Object-relational mapping
- Code versioning tools like Git, Bitbucket
Fundamental understanding of,
- Front-end technologies like JS, CSS3 and HTML5
- AI, ML, Deep Learning, Version Control, Neural networking
- Data visualization, statistics, data analytics
- Design principles that are executable for a scalable app
- Creating predictive models
- Libraries like TensorFlow, scikit-learn, etc.
- Multi-process architecture
- Basic knowledge about Object Relational Mapper libraries
- Ability to integrate databases and various data sources into a unified system
Affinsys is a Cognitive CX platform that combines the human and AI realms to help businesses with customer experience management, hyper-personalization, and operations automation. With a laser-sharp focus on customer experience automation for financial services, we use deep learning, NLP, computer vision, big data, recommendation systems, and speech analytics, fused with our domain knowledge from working with hundreds of banks across 50 countries, to help financial institutions shift: from legacy to the latest channels (embedded financial services), from transactions to goal-based journeys (personalized customer experience), from reactive service provider to proactive advisor (robo/AI-based advisory), and from silos to ecosystems (open banking).








