Knowlarity Communications is India's largest provider of cloud-based communication solutions. Our virtual phone system and enterprise solutions make your business reliable and intelligent. With the capability to process over a million calls an hour, Knowlarity is a trusted brand for more than 8,000 companies worldwide, from SMBs to large enterprises. We are funded by Sequoia Capital and Mayfield, headquartered in Singapore, and have offices in Gurgaon, Mumbai, Bangalore, Dubai and the Philippines. Knowlarity solves business problems for enterprises by making telephony intelligent and reliable, in real time, over the cloud.
Must Have:
Languages: C, Python
Databases: MySQL, PostgreSQL
Tools: Git
Operating System: Linux
Protocols: SIP, RTP, WebRTC
Good to have:
Cloud services: AWS, GCP, Azure
Tools: FreeSWITCH, Asterisk, OpenSIPS
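To illustrate the protocol side of this stack, here is a minimal, illustrative Python sketch that sends a SIP OPTIONS request over UDP; the target host, port and identifiers are placeholders, not part of this posting.

```python
import socket
import uuid

def sip_options_ping(host, port=5060, timeout=2.0):
    """Send a bare SIP OPTIONS request over UDP and return the raw reply (or None)."""
    call_id = uuid.uuid4().hex
    branch = "z9hG4bK" + uuid.uuid4().hex[:10]
    # Minimal RFC 3261 OPTIONS request; each header line ends with CRLF,
    # followed by an empty line and no body.
    request = (
        f"OPTIONS sip:{host} SIP/2.0\r\n"
        f"Via: SIP/2.0/UDP 0.0.0.0:{port};branch={branch}\r\n"
        "Max-Forwards: 70\r\n"
        f"From: <sip:probe@0.0.0.0>;tag={uuid.uuid4().hex[:8]}\r\n"
        f"To: <sip:{host}>\r\n"
        f"Call-ID: {call_id}@0.0.0.0\r\n"
        "CSeq: 1 OPTIONS\r\n"
        "Content-Length: 0\r\n"
        "\r\n"
    )
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.sendto(request.encode(), (host, port))
        try:
            reply, _ = sock.recvfrom(4096)
            return reply.decode(errors="replace")
        except socket.timeout:
            return None

# Example (hypothetical server): print(sip_options_ping("sip.example.com"))
```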
We offer:
- A competitive salary and extensive social benefits
- Opportunity to be part of a team that invented and dominates the emerging market in the cloud telephony industry.
- Massive opportunities for growth.
- Work from a prime location - easy accessibility from both Gurgaon and Delhi.
- Work-life balance and support for career development.
- An amazing life inside Knowlarity! Want to know more about it? Then let's stay connected!
https://www.facebook.com/Knowlarity/
https://twitter.com/knowlarity
https://www.linkedin.com/company-beta/410771/

Job Summary:
Deqode is looking for a highly motivated and experienced Python + AWS Developer to join our growing technology team. This role demands hands-on experience in backend development, cloud infrastructure (AWS), containerization, automation, and client communication. The ideal candidate should be a self-starter with a strong technical foundation and a passion for delivering high-quality, scalable solutions in a client-facing environment.
Key Responsibilities:
- Design, develop, and deploy backend services and APIs using Python.
- Build and maintain scalable infrastructure on AWS (EC2, S3, Lambda, RDS, etc.).
- Automate deployments and infrastructure with Terraform and Jenkins/GitHub Actions.
- Implement containerized environments using Docker and manage orchestration via Kubernetes.
- Write automation and scripting solutions in Bash/Shell to streamline operations.
- Work with relational databases such as MySQL, including writing and optimizing SQL queries.
- Collaborate directly with clients to understand requirements and provide technical solutions.
- Ensure system reliability, performance, and scalability across environments.
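By way of illustration only (the bucket, key and event shape below are hypothetical, not part of this description), a small Python Lambda handler using boto3 for the kind of backend/AWS work described above might look like this:

```python
import json
import boto3

s3 = boto3.client("s3")  # create once so the client is reused across invocations

def handler(event, context):
    """Read a JSON object from S3, add a field, and write the result back."""
    bucket = event["bucket"]   # hypothetical event shape
    key = event["key"]
    obj = s3.get_object(Bucket=bucket, Key=key)
    payload = json.loads(obj["Body"].read())
    payload["processed"] = True
    s3.put_object(
        Bucket=bucket,
        Key=f"processed/{key}",
        Body=json.dumps(payload).encode("utf-8"),
        ContentType="application/json",
    )
    return {"statusCode": 200, "body": json.dumps({"written": f"processed/{key}"})}
```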
Required Skills:
- 3.5+ years of hands-on experience in Python development.
- Strong expertise in AWS services such as EC2, Lambda, S3, RDS, IAM, CloudWatch.
- Good understanding of Terraform or other Infrastructure as Code tools.
- Proficient with Docker and container orchestration using Kubernetes.
- Experience with CI/CD tools like Jenkins or GitHub Actions.
- Strong command of SQL/MySQL and scripting with Bash/Shell.
- Experience working with external clients or in client-facing roles.
Preferred Qualifications:
- AWS Certification (e.g., AWS Certified Developer or DevOps Engineer).
- Familiarity with Agile/Scrum methodologies.
- Strong analytical and problem-solving skills.
- Excellent communication and stakeholder management abilities.
What We’re Looking For:
- Strong experience in Python (5+ years).
- Hands-on experience with any database (SQL or NoSQL).
- Experience with frameworks like Flask, FastAPI, or Django.
- Knowledge of ORMs, API development, and unit testing.
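As a rough illustration of the framework and testing skills listed above (Flask is used here, but FastAPI or Django would be equally relevant; the route and names are made up):

```python
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/health")
def health():
    # Trivial endpoint used to illustrate API development.
    return jsonify(status="ok")

def test_health():
    # Minimal unit test using Flask's built-in test client.
    client = app.test_client()
    resp = client.get("/health")
    assert resp.status_code == 200
    assert resp.get_json()["status"] == "ok"
```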
We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.
Key Responsibilities:
- Design, develop, test, and maintain scalable ETL data pipelines using Python.
- Work extensively on Google Cloud Platform (GCP) services such as:
  - Dataflow for real-time and batch data processing
  - Cloud Functions for lightweight serverless compute
  - BigQuery for data warehousing and analytics
  - Cloud Composer for orchestration of data workflows (based on Apache Airflow)
  - Google Cloud Storage (GCS) for managing data at scale
  - IAM for access control and security
  - Cloud Run for containerized applications
- Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
- Implement and enforce data quality checks, validation rules, and monitoring.
- Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
- Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
- Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
- Document pipeline designs, data flow diagrams, and operational support procedures.
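To make the ETL responsibilities above concrete, here is a minimal, illustrative Apache Beam pipeline in Python; the bucket paths and record schema are placeholders, and on GCP such a pipeline would typically be launched with the Dataflow runner.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_row(line):
    # Split a CSV line into named fields (hypothetical two-column schema).
    user_id, amount = line.split(",")
    return {"user_id": user_id, "amount": float(amount)}

def run():
    options = PipelineOptions()  # pass --runner=DataflowRunner, --project, etc. for GCP
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/input.csv")   # placeholder path
            | "Parse" >> beam.Map(parse_row)
            | "Filter" >> beam.Filter(lambda row: row["amount"] > 0)             # simple quality check
            | "Write" >> beam.io.WriteToText("gs://example-bucket/output")       # placeholder path
        )

if __name__ == "__main__":
    run()
```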
Required Skills:
- 4–8 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience in working with version control systems like GitHub and knowledge of CI/CD practices.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
With over 40 years of innovation, Quantum's end-to-end platform is uniquely equipped to orchestrate, protect, and enrich data across its lifecycle, providing enhanced intelligence and actionable insights. Leading organizations in cloud services, entertainment, government, research, education, transportation, and enterprise IT trust Quantum to bring their data to life, because data makes life better, safer, and smarter. Quantum is listed on Nasdaq (QMCO) and the Russell 2000® Index. For more information visit www.quantum.com.
As a Software Engineer, you will collaborate with engineers and product managers on the development and maintenance of Quantum’s DXi-Series of disk-based backup appliance software. Quantum’s DXi series protects our customers’ data on premises, in the cloud, or in a hybrid environment.
You Are A Part Of:
DXi is a uniquely powerful solution within the Quantum portfolio, allowing customers to meet and exceed their backup needs with one of the fastest products on the market. You’ll work on a product that allows customers to reduce costs, maximize production, scale with ease, and positively impact the environment by reducing power and cooling requirements.
Job Responsibilities:
Responsibilities include, but are not limited to:
• Write code primarily for Linux systems, with programming languages including Python, C, C++, and Perl.
• Design and build differentiating feature sets that continue to expand product capabilities, both on premises and in the cloud.
• Work with development, test, service, and support engineers to develop tactical solutions for customer issues.
• May design and develop automated test suites.
• May maintain lab equipment.
Required Skills and/or Experience:
• Bachelor’s degree in Computer Science, Information Technology, or related field of study required.
• 5-10 years of related industry experience required.
• 5+ years software development in C or C++ is required.
• 3-5 years’ experience working in a Linux environment is required.
• Experience in writing scripts: Perl, shell, bash, and/or other scripting tools is required.
• Experience with debugging tools such as GDB is required.
• Experience with source control and shared build environments is required.
- 3+ years of SDE work experience at product-based companies
- Experience in Java, Spring Boot, MySQL, Kafka, HBase, AWS
- Experience in multithreading, distributed systems, and best practices for coding and scaling
The ideal candidate is a self-motivated multi-tasker and a demonstrated team player. You will be a senior developer responsible for the development of new software products and enhancements to existing products. You should excel at working with large-scale applications and frameworks and have outstanding communication and leadership skills.
Responsibilities:
- Writing clean, high-quality, high-performance, maintainable code
- Develop and support software including applications, database integration, interfaces, and new functionality enhancements
- Coordinate cross-functionally to ensure the project meets business objectives and compliance standards
- Support test and deployment of new products and features
- Participate in code reviews
Requirements:
- B.Tech from a Tier 1 college.
- 3+ years of experience, with at least 2 years of experience in Python and Django.
- Expertise in Object Oriented Design, Database Design, and XML Schema
- Experience with Agile or Scrum software development methodologies
- Ability to multi-task, organise, and prioritise work
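For illustration of the Python/Django and database-design expectations above, here is a minimal sketch; the app and model names are hypothetical and assume an already-configured Django project.

```python
# models.py in a hypothetical "library" app (assumes a configured Django project).
from django.db import models

class Book(models.Model):
    title = models.CharField(max_length=200)
    published = models.DateField(null=True, blank=True)

    def __str__(self):
        return self.title

# Typical ORM usage (e.g., in a view or the Django shell):
#   Book.objects.create(title="Sample")
#   recent = Book.objects.filter(published__year=2024).order_by("-published")
```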









