We are looking for candidates for a Python intern/developer role. Candidates should be strong in core Python, able to work independently, and skilled at developing logic.

We are looking for a Technical Lead - GenAI with a strong foundation in Python, Data Analytics, Data Science or Data Engineering, system design, and practical experience in building and deploying Agentic Generative AI systems. The ideal candidate is passionate about solving complex problems using LLMs, understands the architecture of modern AI agent frameworks like LangChain/LangGraph, and can deliver scalable, cloud-native back-end services with a GenAI focus.
Key Responsibilities:
- Design and implement robust, scalable back-end systems for GenAI agent-based platforms.
- Work closely with AI researchers and front-end teams to integrate LLMs and agentic workflows into production services.
- Develop and maintain services using Python (FastAPI/Django/Flask), with best practices in modularity and performance.
- Leverage and extend frameworks like LangChain, LangGraph, and similar to orchestrate tool-augmented AI agents.
- Design and deploy systems in Azure Cloud, including usage of serverless functions, Kubernetes, and scalable data services.
- Build and maintain event-driven / streaming architectures using Kafka, Event Hubs, or other messaging frameworks.
- Implement inter-service communication using gRPC and REST.
- Contribute to architectural discussions, especially around distributed systems, data flow, and fault tolerance.
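The orchestration of tool-augmented agents mentioned above can be sketched in pure Python. This is a conceptual sketch only, not actual LangChain/LangGraph code: the `TOOLS` registry, `fake_llm` stand-in, and `run_agent` loop are all hypothetical names for illustration.

```python
# Minimal sketch of a tool-augmented agent loop (conceptual only --
# a real system would use LangChain/LangGraph and an actual LLM client).

from typing import Callable

# Hypothetical tool registry: tool name -> callable.
TOOLS: dict[str, Callable[[str], str]] = {
    "calculator": lambda expr: str(eval(expr, {"__builtins__": {}})),
    "echo": lambda text: text,
}

def fake_llm(prompt: str) -> str:
    """Stand-in for an LLM call that decides which tool to invoke.

    A real agent would send `prompt` to a model and parse its response.
    """
    if any(ch.isdigit() for ch in prompt):
        return f"CALL calculator {prompt}"
    return f"CALL echo {prompt}"

def run_agent(user_input: str) -> str:
    """One iteration of the agent loop: ask the 'model', dispatch the tool."""
    decision = fake_llm(user_input)
    _, tool_name, arg = decision.split(" ", 2)
    return TOOLS[tool_name](arg)

print(run_agent("2 + 3"))   # dispatches to the calculator tool -> "5"
print(run_agent("hello"))   # dispatches to the echo tool -> "hello"
```

Frameworks like LangGraph generalize this loop into a graph of nodes with state, retries, and streaming, but the core dispatch-and-return shape is the same.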
Required Skills & Qualifications :
- Strong hands-on back-end development experience in Python along with Data Analytics or Data Science.
- Strong track record on platforms like LeetCode or in real-world algorithmic/system problem-solving.
- Deep knowledge of at least one Python web framework (e.g., FastAPI, Flask, Django).
- Solid understanding of LangChain, LangGraph, or equivalent LLM agent orchestration tools.
- 2+ years of hands-on experience in Generative AI systems and LLM-based platforms.
- Proven experience with system architecture, distributed systems, and microservices.
- Strong familiarity with cloud infrastructure (Azure, AWS, or GCP) and deployment practices.
- Data Engineering or Analytics expertise (preferred), e.g., Azure Data Factory, Snowflake, Databricks, ETL tools (Talend, Informatica), BI tools (Power BI, Tableau), data modelling, and data warehouse development.
We are a tech venture which provides Product Engineering, QA Automation, Infrastructure, Data, and Market Research services.
Technical Proficiency:
Must have:
- Strong development experience in Python in Unix/Linux/Ubuntu environments.
- Strong practical knowledge of Python and its libraries.
- Current working experience with cloud deployment (AWS/Azure/GCP), microservice architecture, and Docker with Python.
- Good knowledge of CI/CD and DevOps practices.
- Good experience with Python frameworks such as Django, Scrapy, or Flask.
- Good experience with Jupyter, Docker, Elasticsearch, etc.
- Solid understanding of software development principles and best practices.
- Strong analytical thinking and problem-solving skills.
- Proven ability to drive large-scale projects with a deep understanding of the Agile SDLC, high collaboration, and leadership.
Good to have:
- Migration experience from one version to another, as this project involves migrating to the latest version.
- Experience with the Open edX platform or any other LMS platform.
Skills:
- Very strong in WCF, WPF, MVVM, RESTful web services, and XML
- Very good design/architecture understanding
- Knowledge of DevExpress, Entity Framework, RDBMS, and MS SQL Server would be a big plus
- Very strong knowledge of OOP and design patterns
- Working knowledge of design, code reviews/quality, unit testing with NUnit/xUnit, and continuous integration
- Analytical problem solver with good communication skills
- Ability to understand requirements, analyze them, and articulate them into a design
- Ability to identify and evaluate alternatives, check feasibility, and propose solutions
- Experience with Agile methodology and following best practices
- An R&D mindset and a thirst for exploring and learning new skills
Qualifications: BE/BTech/MCA
Experience: 6+ years
- Work experience as a Python Developer
- Expertise in at least one popular Python framework (like Django, Flask or Pyramid)
- Knowledge of object-relational mapping (ORM)
- Familiarity with front-end technologies (like JavaScript and HTML5)
- Team spirit
- Good problem-solving skills
- Write effective, scalable code
- Develop back-end components to improve responsiveness and overall performance
- Integrate user-facing elements into applications
- Test and debug programs
- Improve functionality of existing systems
- Implement security and data protection solutions
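The ORM knowledge listed above boils down to mapping database rows to language objects. A minimal sketch of that idea, using only the stdlib `sqlite3` module, is below; real projects would use the Django ORM or SQLAlchemy, and the `User` class and `get_user` helper here are hypothetical examples.

```python
# Sketch of the core idea behind object-relational mapping (ORM),
# stdlib only. Real back ends would use the Django ORM or SQLAlchemy.

import sqlite3
from dataclasses import dataclass

@dataclass
class User:
    id: int
    name: str

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES (?)", ("alice",))

def get_user(user_id: int) -> User:
    """Map a database row to a Python object -- the essence of an ORM."""
    row = conn.execute(
        "SELECT id, name FROM users WHERE id = ?", (user_id,)
    ).fetchone()
    return User(id=row[0], name=row[1])

print(get_user(1))   # User(id=1, name='alice')
```

A full ORM adds query building, relationships, and change tracking on top of this row-to-object mapping.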
Job Summary
- 5 to 8 years of experience with Python, well versed in RDBMS (SQL Server preferred).
- Should have good experience in Data Structures, Algorithms, NumPy, and Pandas.
- Familiar with JSON and REST APIs
- Strong knowledge of object-oriented and parallel programming techniques
- Experience with test-driven development (TDD)
- Excellent analytical and problem-solving skills
- Good interpersonal skills
- Good team player
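The test-driven development (TDD) requirement above can be illustrated with the stdlib `unittest` module: write the test first, then implement just enough code to make it pass. The `slugify` helper below is a hypothetical example, not from any specific codebase.

```python
# Sketch of the TDD cycle with stdlib unittest: the tests below were
# written first; slugify() (a hypothetical helper) was then implemented
# to make them pass.

import re
import unittest

def slugify(title: str) -> str:
    """Lowercase a title and replace runs of non-alphanumerics with '-'."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

class TestSlugify(unittest.TestCase):
    def test_basic(self):
        self.assertEqual(slugify("Hello, World!"), "hello-world")

    def test_collapses_separators(self):
        self.assertEqual(slugify("a  --  b"), "a-b")

if __name__ == "__main__":
    unittest.main()
```

In practice the first run fails (the function doesn't exist yet), which is the "red" step of the red-green-refactor loop.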
Skills:
Python Developer
Python
API
RDBMS
We are currently working on a big project for which we need an immediate Golang Developer for 6 months on a contract basis, with a strong understanding of quality design, problem-solving, and contributing ideas on using the latest technologies. This role involves collaboration with business partners, product managers, and other representatives. We're looking for a highly skilled Senior Software Engineer to work closely with the product team and the open-source community to build the Digirex vision. You'll join a highly collaborative team of talented engineers focused on being the best in the cloud-native ecosystem. You should also be willing to work in a start-up environment, keeping scalability, accessibility, usability, design, and security in mind. If you don't have all of these skills, that's OK, but be excited about learning the ones you don't know.
- Familiarity with building complex web applications using JavaScript/AJAX, XHTML, and CSS.
- Experience with or knowledge of jQuery, Java, Struts, and other website technologies.
- Strong object-oriented design and coding skills (C/C++ and/or Java, preferably on a UNIX or Linux platform).
- Solid software development background, including design patterns, data structures, and test-driven development.
- Knowledge of Perl or other scripting languages is a plus.
- Experience with distributed (multi-tiered) systems, algorithms, and relational databases.
- Software development experience building highly scalable applications.
- Master's degree in Computer Science, Computer Engineering, or a related technical discipline.
- Experience in eCommerce and deep hands-on technical expertise.
- Ability to handle multiple competing priorities in a fast-paced environment.
- Experience working with service-oriented architectures and web-based solutions.
We, the Products team at DataWeave, build data products that provide timely, readily consumable, and actionable insights at scale. Our underpinnings are: scale, impact, engagement, and visibility. We help businesses make data-driven decisions every day, and we give them insights for long-term strategy. We are focused on creating value for our customers and helping them succeed.
How we work
It's hard to tell what we love more, problems or solutions! Every day, we choose to address some of the hardest data problems there are. We are in the business of making sense of messy public data on the web, at serious scale!
What do we offer?
- Opportunity to work on some of the most compelling data products that we are building for online retailers and brands.
- Ability to see the impact of your work and the value you are adding to our customers almost immediately.
- Opportunity to work on a variety of challenging problems and technologies to figure out what really excites you.
- A culture of openness. Fun work environment. A flat hierarchy. Organization wide visibility. Flexible working hours.
- Learning opportunities with courses, trainings, and tech conferences. Mentorship from seniors in the team.
- Last but not least, competitive salary packages and fast-paced growth opportunities.
Roles and Responsibilities:
● Build a low-latency serving layer that powers DataWeave's Dashboards, Reports, and Analytics functionality.
● Build robust RESTful APIs that serve data and insights to DataWeave and other products.
● Design user interaction workflows on our products and integrate them with data APIs.
● Help stabilize and scale our existing systems; help design the next generation of systems.
● Scale our back-end data and analytics pipeline to handle increasingly large amounts of data.
● Work closely with the Head of Products and UX designers to understand the product vision and design philosophy.
● Lead or be a part of all major tech decisions. Bring in best practices. Mentor younger team members and interns.
● Constantly think scale, think automation. Measure everything. Optimize proactively.
● Be a tech thought leader. Add passion and vibrancy to the team. Push the envelope.
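The RESTful serving layer described above can be sketched with nothing but the stdlib: a WSGI callable that returns JSON. Production services would typically use Flask or FastAPI; the `/insights` route and its payload here are hypothetical stand-ins.

```python
# Minimal sketch of a JSON-serving REST endpoint as a bare WSGI
# callable (stdlib only). Production code would use Flask/FastAPI.

import json

INSIGHTS = {"top_product": "widget", "price_drop_pct": 12.5}  # stand-in data

def app(environ, start_response):
    """WSGI application: return JSON for GET /insights, else a 404."""
    if environ.get("PATH_INFO") == "/insights":
        body = json.dumps(INSIGHTS).encode("utf-8")
        start_response("200 OK", [("Content-Type", "application/json")])
    else:
        body = b'{"error": "not found"}'
        start_response("404 Not Found", [("Content-Type", "application/json")])
    return [body]

# Exercise the app without a network socket by calling it directly:
statuses = []
result = app({"PATH_INFO": "/insights"}, lambda s, h: statuses.append(s))
print(statuses[0], json.loads(result[0]))
```

Calling the WSGI app directly like this is also how lightweight unit tests avoid spinning up a real server.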
Skills and Requirements:
● 5-7 years of experience building and scaling APIs and web applications.
● Experience building and managing large scale data/analytics systems.
● Have a strong grasp of CS fundamentals and excellent problem solving abilities. Have a good understanding of software design principles and architectural best practices.
● Be passionate about writing code and have experience coding in multiple languages, including at least one scripting language, preferably Python.
● Be able to argue convincingly why feature X of language Y rocks/sucks, or why a certain design decision is right/wrong, and so on.
● Be a self-starter—someone who thrives in fast paced environments with minimal ‘management’.
● Have experience working with multiple storage and indexing technologies such as MySQL, Redis, MongoDB, Cassandra, Elastic.
● Good knowledge (including internals) of messaging systems such as Kafka and RabbitMQ.
● Use the command line like a pro. Be proficient in Git and other essential software development tools.
● Working knowledge of large-scale computational models such as MapReduce and Spark is a bonus.
● Exposure to one or more centralized logging, monitoring, and instrumentation tools, such as Kibana, Graylog, StatsD, Datadog etc.
● Working knowledge of building websites and apps. Good understanding of integration complexities and dependencies.
● Working knowledge of Linux server administration as well as the AWS ecosystem is desirable.
● It's a huge bonus if you have some personal projects (including open source contributions) that you work on during your spare time. Show off some of your projects you have hosted on GitHub.
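The MapReduce model mentioned in the requirements can be illustrated with a pure-Python word count. This only shows the map → shuffle → reduce shape of the computation; real deployments would run on Hadoop or Spark, and the function names here are illustrative.

```python
# Conceptual word count in the MapReduce style, pure Python.
# Illustrates the map -> shuffle -> reduce phases only.

from collections import defaultdict

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word in every line."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    """Shuffle: group all emitted counts by key (word)."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the grouped counts for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["the quick brown fox", "the lazy dog", "The fox"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts)   # 'the' appears 3 times, 'fox' twice
```

In a distributed setting each phase runs on many machines, and the shuffle moves data over the network so that all pairs with the same key land on the same reducer.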







