11+ ISDN Jobs in Hyderabad | ISDN Job openings in Hyderabad
Apply to 11+ ISDN Jobs in Hyderabad on CutShort.io. Explore the latest ISDN Job opportunities across top companies like Google, Amazon & Adobe.
Required Skills: GSM, Linux, Networking, Infrastructure, VoIP, PHP, Perl, SIP, Troubleshooting, Python, HTML
- 5+ years of solid experience as an Asterisk Developer
- Must be proficient in developing software using a scripting language; experience with Python is a distinct advantage
- Sound knowledge of Asterisk Installation, Configuration, Dialplan, Call troubleshooting (SIP, ISDN)
- Solid experience working with Linux operating systems
- Good understanding of VoIP, SIP, SS7.
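As a small illustration of the SIP troubleshooting skills listed above, the sketch below hand-builds a minimal SIP OPTIONS request, the probe commonly used to check trunk reachability. All hosts, tags and identifiers are invented placeholders; a real Asterisk deployment would use its own tooling or a SIP client library rather than raw strings.

```python
# Hypothetical sketch: hand-building a minimal SIP OPTIONS request,
# the kind of probe used when troubleshooting SIP trunk reachability.
# Target host, source IP, branch and call id are invented placeholders.

def build_options_request(target: str, source_ip: str, branch: str, call_id: str) -> str:
    """Assemble a minimal RFC 3261-style OPTIONS request as one string."""
    lines = [
        f"OPTIONS sip:{target} SIP/2.0",
        f"Via: SIP/2.0/UDP {source_ip};branch=z9hG4bK{branch}",
        f"From: <sip:probe@{source_ip}>;tag=probe1",
        f"To: <sip:{target}>",
        f"Call-ID: {call_id}",
        "CSeq: 1 OPTIONS",
        "Max-Forwards: 70",
        "Content-Length: 0",
        "",  # blank line terminates the header section
        "",
    ]
    return "\r\n".join(lines)

req = build_options_request("pbx.example.com", "10.0.0.5", "abc123", "test-call-1")
print(req.splitlines()[0])  # OPTIONS sip:pbx.example.com SIP/2.0
```

In practice the request would be sent over UDP, and whether a 200 OK comes back (versus a timeout) tells you if the peer is reachable.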
Criteria
Mandatory
Strong Dremio / Lakehouse Data Architect profile
Mandatory (Experience 1) – 5+ years of experience in Data Architecture / Data Engineering, with a minimum of 3 years hands-on in Dremio
Mandatory (Experience 2) – Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
Mandatory (Technical Skills 1) – Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, and Iceberg, along with distributed query planning concepts
Mandatory (Technical Skills 2) – Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
Mandatory (Architecture) – Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
Mandatory (Governance) – Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
Mandatory (Stakeholder Management) – Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
Mandatory (Company) – Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies
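To illustrate the distributed query planning concepts mentioned above, here is a hedged pure-Python sketch of partition pruning, an optimisation lakehouse engines like Dremio apply when a filter matches a Hive-style partition layout. The file paths and partition scheme are invented for the example.

```python
# Illustrative pure-Python sketch of partition pruning, one of the query
# planning concepts behind lakehouse engines such as Dremio. The file
# paths and Hive-style partition layout below are invented examples.

def prune_partitions(files, column, value):
    """Keep only files whose Hive-style path encodes column=value."""
    token = f"{column}={value}"
    return [f for f in files if token in f.split("/")]

files = [
    "s3://lake/sales/year=2023/part-0.parquet",
    "s3://lake/sales/year=2024/part-0.parquet",
    "s3://lake/sales/year=2024/part-1.parquet",
]
# A query filtered on year = 2024 never has to open the 2023 file at all.
print(prune_partitions(files, "year", "2024"))
```

Real engines combine this with file-level statistics (min/max per column in Parquet footers) to skip even more data before any rows are read.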
Job Title: Data Engineer
Location: Hyderabad
About us:
Blurgs AI is a deep-tech startup focused on maritime and defence data-intelligence solutions, specialising in multi-modal sensor fusion and data correlation. Our flagship product, Trident, provides advanced domain awareness for maritime, defence, and commercial sectors by integrating data from various sensors like AIS, Radar, SAR, and EO/IR.
At Blurgs AI, we foster a collaborative, innovative, and growth-driven culture. Our team is passionate about solving real-world challenges, and we prioritise an open, inclusive work environment where creativity and problem-solving thrive. We encourage new hires to bring their ideas to the table, offering opportunities for personal growth, skill development, and the chance to work on cutting-edge technology that impacts global defence and maritime operations.
Join us to be part of a team that's shaping the future of technology in a fast-paced, dynamic industry.
Job Summary:
We are looking for a Senior Data Engineer to design, build, and maintain a robust, scalable on-premise data infrastructure. You will focus on real-time and batch data processing using platforms such as Apache Pulsar and Apache Flink, work with NoSQL databases like MongoDB and ClickHouse, and deploy services using containerization technologies like Docker and Kubernetes. This role is ideal for engineers with strong systems knowledge, deep backend data experience, and a passion for building efficient, low-latency data pipelines in a non-cloud, on-prem environment.
Key Responsibilities:
- Data Pipeline & Streaming Development
- Design and implement real-time data pipelines using Apache Pulsar and Apache Flink to support mission-critical systems.
- Develop high-throughput, low-latency data ingestion and processing workflows across streaming and batch workloads.
- Integrate internal systems and external data sources into a unified on-prem data platform.
- Data Storage & Modelling
- Design efficient data models for MongoDB, ClickHouse, and other on-prem databases to support analytical and operational workloads.
- Optimise storage formats, indexing strategies, and partitioning schemes for performance and scalability.
- Infrastructure & Containerization
- Deploy, manage, and monitor containerised data services using Docker and Kubernetes in on-prem environments.
- Performance, Monitoring & Reliability
- Monitor the performance of streaming jobs and database queries; fine-tune for efficiency and reliability.
- Implement robust logging, metrics, and alerting solutions to ensure data system availability and uptime.
- Identify bottlenecks in the pipeline and proactively implement optimisations.
Required Skills & Experience:
- Strong experience in data engineering with a focus on on-premise infrastructure.
- Strong expertise in streaming technologies like Apache Pulsar, Apache Flink, or similar.
- Deep experience with MongoDB, ClickHouse, and other NoSQL or columnar storage databases.
- Proficient in Python, Java, or Scala for data processing and backend development.
- Hands-on experience deploying and managing systems using Docker and Kubernetes.
- Familiarity with Linux-based systems, system tuning, and resource monitoring.
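As a toy illustration of the stateful stream processing this role involves, the sketch below implements a tumbling-window count in plain Python; engines like Apache Flink run the same logic with distributed state, watermarks and fault tolerance. Event timestamps, keys and the window size are illustrative assumptions.

```python
# Minimal pure-Python sketch of a tumbling-window aggregation, the kind of
# stateful stream operation Apache Flink or Pulsar Functions would run at
# scale. The event stream and window size below are invented examples.
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed windows and count per key."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = ts - (ts % window_seconds)  # align to window boundary
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Example: sensor messages keyed by source (keys are made up for illustration).
events = [(0, "ais"), (3, "radar"), (7, "ais"), (12, "ais")]
result = tumbling_window_counts(events, 10)
print(result)  # {0: {'ais': 2, 'radar': 1}, 10: {'ais': 1}}
```

A streaming engine would additionally handle out-of-order events and emit each window incrementally instead of materialising the whole stream in memory.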
Preferred Qualifications:
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field, or an equivalent combination of education and experience.
Additional Responsibilities for Senior Data Engineers:
For those hired as Senior Data Engineers, the role will come with added responsibilities, including:
- Leadership & Mentorship: Guide and mentor junior engineers, sharing expertise and best practices.
- System Architecture: Lead the design and optimization of complex real-time and batch data pipelines, ensuring scalability and performance.
- Sensor Data Expertise: Focus on building and optimizing sensor-based data pipelines and stateful stream processing for mission-critical applications in domains like maritime and defense.
- End-to-End Ownership: Take responsibility for the performance, reliability, and optimization of data systems.
Compensation:
- Data Engineer CTC: 4 - 8 LPA
- Senior Data Engineer CTC: 12 - 16 LPA
JOB PURPOSE
A Senior Associate Java WCS Technologist is considered a senior contributor for complex modules of work, bringing deep core technology expertise and relevant business subject matter expertise to the table.
JOB RESPONSIBILITIES
- A Sr. Associate, Technology plays a significant role during the design and implementation of the technological solution for our clients.
- A Sr. Associate, Technology is involved in ensuring a quality technical design that satisfies the business requirements of the client.
- A Sr. Associate, Technology is often involved in package evaluations and recommendations, communicating the technological details of the project to the business users and workshops with the clients. Sr. Associate, Technology also participates in gathering business requirements and assessing existing architectures and resources.
- Sr. Associate, Technology provides leadership to the team by taking responsibility for a specific component or track of the project architecture. By taking on this level of responsibility, a Sr. Associate, Technology spends more time overseeing the tasks required to implement a solution rather than performing the tasks directly. This includes planning, estimation, resource management, issue resolution and quality assurance. A Sr. Associate, Technology is also responsible for coordinating and communicating with the other tracks and disciplines involved in the project.
- In performing the essential functions of this role, the work is fast-paced, moderately noisy and team-based. Additionally, frequent overnight travel is required.
- Prepares the technical design of the more complex technology components within the module (one or more of client/web presentation tier, server tier, data access and data model, integration component, package function customization)
- Participates in and in some cases drives design reviews of other modules and provides insightful comments to improve the design quality and design conformance to standards
- Assists the architect in articulating the pros and cons of using a certain technology stack/package or component or design pattern versus another to the clients and project team and drives selection of technologies, designs to come up with the optimal architecture
- Implements a slice of the application (EAR) and proofs of concept (spike solutions) to prove new technologies or integration scenarios in the module
- For package implementations, aids the Architect of the project to perform the gap analysis between business requirements and the package features and design the configuration, customizations, extensions, interfaces required to meet the requirements
- Provides innovative solutions to project level technical issues
- Critiques a design created by another designer and helps identify design and performance improvements
- Interacts with a set of clients (client senior developers and architects) to create technology specifications from business requirements for one of the modules within the project
- In some scenarios, collaborates with client developers to design, build, test and deploy the module components and integrate with the rest of the modules
- Raises and drives to closure with the client any technical design and implementation issues in the module and also in the interfaces and interactions with other modules
- Estimates the implementation and deployment of the module based on design architecture, testing strategy and overall project plan
- Assists the architect in coming up with the overall estimates for the project along with any key risks and issues and their mitigation
- Reports progress and issues to the Manager in a timely manner, in particular relaying issues that might impact quality or the ability to deliver to timescales or estimates
- Works with team on the development of standards, processes and procedures related to application security, upgrade management, capacity planning, application deployment, performance monitoring/tuning, and failover and disaster recovery
- Mentors the team on the best techniques to debug and troubleshoot design and implementation defects and issues
- Shares point of view on technology stack, package and latest technology and business trends in one or more relevant areas (e.g. eCommerce, Content Management).
- Documents the technical design using UML, suitable design patterns in the form of technical design narrative, object models, sequence diagrams, collaboration diagrams
SKILL REQUIREMENTS
Experience in: J2EE - Application Servers, Java - ORM, Java - Spring Framework, Core Java, SQL Development Languages, Java - Web Presentation Frameworks, Java - Messaging Implementation, Java Web Services, Planning/Execution & Tracking, Scoping and Estimating, Data Modeling, High Availability and Failover Applications, High Throughput/Transaction Applications, Logical Architecture Design, OOAD and UML, Package/Vendor Selection, Performance/Capacity Planning, Application Security.
Ability to abstract detail into larger (repeatable) patterns; familiarity with user-centered analysis and evaluation techniques
Understanding of the project life cycle process to be able to effectively manage a sub-track of the project
Business Knowledge: Domain experience on Enterprise data warehouse would be a plus.
Personal Attributes:
- Strong and innovative approach to problem solving and finding solutions
- Excellent communicator (written and verbal, formal and informal)
- Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution
- Ability to multi-task under pressure and work independently with minimal supervision
- Ability to prioritize when under pressure
EXPERIENCE
2-8 years
EDUCATION
Full Time Bachelor’s / Master’s degree (Science or Engineering preferred)
Experience & Expertise:
- Expertise with NodeJS (extensively using ExpressJS), AngularJS and MySQL
- Rich user interface development experience with HTML5, jQuery, W3.CSS/Bootstrap, etc., and JavaScript frameworks like AngularJS and Angular
- Expertise in building real-time web applications using MQTT, Websocket, Socket.IO, and EventSource frameworks
- Comfortable with developing MVC architecture-based code
- Comfortable in using at least one of the Graph/Chart libraries
- Ability to generate reports from HTML content
- Skilled in implementing security features across the applications and components
- Experience with tools usage - Code quality, Memory/CPU profiling tools
- Skilled with code optimization techniques
- Experience with Docker container usage
- Experience with LDAP is a plus
- Strong troubleshooting techniques
- Strong defect-resolution process for minimizing latency
- Familiarity with Confluence and JIRA
- Good to have a basic understanding of PHP frameworks like Yii and CodeIgniter, and databases like MongoDB
- Good to have strong networking fundamentals, prior working experience in network technologies and protocols
- Ability to build supportability features to reduce the defects in software components
- Knowledge of GNU tools, revision control software (SVN, Git, etc.) and development lifecycle is a plus
It's a full-time position with our client
Date of Joining: Immediate Joiners (within 7-10 Days)
Work Location: Hyderabad
Experience Level: 6-10 Years
Mandatory Skills: Web API, Angular 4+, MVC and SQL
Job description:
•Expertise in Web API is highly preferred.
•Good experience needed in Angular 4+ implementation.
•Must have very good exposure and experience working with C#, ASP.NET, MVC, Entity Framework, Web Services, JavaScript, jQuery and SQL Server.
•Strong Knowledge of software implementation best practices.
•Strong experience in debugging and working with n-tier architecture (UI, business layer and data access layer), along with some experience with service-oriented architectures (SOA)
•Ability to design and optimize SQL server stored procedures.
•Solid understanding of object-oriented programming (OOP).
•Experience using version control (Git/Subversion)
•Experience using Jira and Confluence
•Develop and enhance new and existing software applications
•Mentor and train other team members
•Gain knowledge of the Energy Industry
•Provide documentation and training on the solution

VMULTIPLY Solutions is hiring for a Hyderabad-based client
Summary
Our Kafka developer has a combination of technical skills, communication skills and business knowledge, and should be able to work on multiple medium to large projects. The successful candidate will have excellent technical skills in Apache/Confluent Kafka and enterprise data warehousing (preferably GCP BigQuery or an equivalent cloud EDW), and will be able to take oral and written business requirements and develop efficient code to meet set deliverables.
Must Have Skills
- Participate in the development, enhancement and maintenance of data applications both as an individual contributor and as a lead.
- Leading in the identification, isolation, resolution and communication of problems within the production environment.
- Leading development and applying technical skills in Apache/Confluent Kafka (preferred) or AWS Kinesis (optional), and a cloud enterprise data warehouse: Google BigQuery (preferred), AWS Redshift or Snowflake (optional)
- Design and recommend the best approach for moving data from different sources to the cloud EDW using Apache/Confluent Kafka
- Performs independent functional and technical analysis for major projects supporting several corporate initiatives.
- Communicate and work with IT partners and the user community at various levels, from senior management to developers to business SMEs, for project definition.
- Works on multiple platforms and multiple projects concurrently.
- Performs code and unit testing for complex-scope modules and projects
- Provide expertise and hands on experience working on Kafka connect using schema registry in a very high volume environment (~900 Million messages)
- Provide expertise in Kafka brokers, ZooKeeper, KSQL, KStream and Kafka Control Center.
- Provide expertise and hands on experience working on AvroConverters, JsonConverters, and StringConverters.
- Provide expertise and hands on experience working on Kafka connectors such as MQ connectors, Elastic Search connectors, JDBC connectors, File stream connector, JMS source connectors, Tasks, Workers, converters, Transforms.
- Provide expertise and hands on experience on custom connectors using the Kafka core concepts and API.
- Working knowledge on Kafka Rest proxy.
- Ensure optimum performance, high availability and stability of solutions.
- Create topics, setup redundancy cluster, deploy monitoring tools, alerts and has good knowledge of best practices.
- Create stubs for producers, consumers and consumer groups to help onboard applications from different languages/platforms.
- Leverage Hadoop ecosystem knowledge to design and develop capabilities to deliver solutions using Spark, Scala, Python, Hive, Kafka and other Hadoop-ecosystem tools.
- Use automation tools for provisioning, such as Jenkins, uDeploy or relevant technologies
- Ability to perform data-related benchmarking, performance analysis and tuning.
- Strong skills in In-memory applications, Database Design, Data Integration.
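To make one of the Kafka concepts above concrete, here is a hedged plain-Python sketch of range-style partition assignment, the default way a consumer group divides a topic's partitions among its members. The real assignor lives inside the Kafka client libraries, and the consumer ids below are invented.

```python
# Hedged reimplementation of Kafka's range partition assignment for a
# consumer group, written in plain Python for illustration only. Consumer
# ids are made up; real Kafka clients negotiate this during a rebalance.

def range_assign(partitions: int, consumers: list[str]) -> dict[str, list[int]]:
    """Assign partition ids 0..partitions-1 across sorted consumers, range style."""
    members = sorted(consumers)
    per_member, extra = divmod(partitions, len(members))
    assignment, start = {}, 0
    for i, member in enumerate(members):
        # The first `extra` members each take one additional partition.
        count = per_member + (1 if i < extra else 0)
        assignment[member] = list(range(start, start + count))
        start += count
    return assignment

print(range_assign(7, ["c2", "c1", "c3"]))
# {'c1': [0, 1, 2], 'c2': [3, 4], 'c3': [5, 6]}
```

Because assignment is per-partition, adding consumers beyond the partition count leaves the extras idle, which is why topic partition counts matter when sizing a consumer group.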
MTX Group Inc. is seeking a motivated Technical Lead - AI to join our team. MTX Group Inc. is a global implementation partner enabling organizations to become fit enterprises. MTX provides expertise across various platforms and technologies, including Google Cloud, Salesforce, artificial intelligence/machine learning, data integration, data governance, data quality, analytics, visualization and mobile technology. MTX’s own Artificial Intelligence platform, Maverick, enables clients to accelerate processes and critical decisions by leveraging a Cognitive Decision Engine, a collection of purpose-built artificial neural networks designed to leverage the power of machine learning. The Maverick Platform includes Smart Asset Detection and Monitoring, Chatbot Services, and Document Verification, to name a few.
Responsibilities:
- Extensive research and development of new AI/ML techniques that enable learning the semantics of data (images, video, text, audio, speech, etc.)
- Improving the existing ML and DNN models and products through R&D on cutting edge technologies
- Collaborate with Machine Learning teams to drive innovation of complex and accurate cognitive systems
- Collaborate with Engineering and Core team to drive innovation of scalable ML and AI serving production platforms
- Create POCs to quickly test a new model architecture and create improvement over an existing methodology
- Introduce major innovations that can result in better product features and develop strategies and plans required to drive these
- Lead a team and collaborate with product managers, review complex technical implementations, and provide optimisation best practices
What you will bring:
- 4-6 years of experience
- Experience in neural networks, graphical models, reinforcement learning, and natural language processing
- Experience in Computer Vision techniques and image detection neural network models like semantic segmentation, instance segmentation, object detection, etc
- In-depth understanding of benchmarking, parallel computing, distributed computing, machine learning, and AI
- Programming experience in one or more of the following: Python, C, C++, C#, Java, R, and toolkits such as TensorFlow, Keras, PyTorch, Caffe, MXNet, SciPy, SciKit, etc
- Ability to perform research that is justified and guided by business opportunities
- Demonstrated successful implementation of industry-grade AI solutions in the past
- Ability to lead a team of AI engineers in an agile development environment
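As a concrete example of the object-detection evaluation work described above, below is a minimal pure-Python intersection-over-union (IoU) computation for axis-aligned boxes; the `(x1, y1, x2, y2)` box format is an assumption, and production code would typically use a vectorised library implementation.

```python
# Illustrative pure-Python IoU (intersection over union) for axis-aligned
# boxes, a core metric in object-detection evaluation. The box coordinate
# format (x1, y1, x2, y2) is an assumption made for this sketch.

def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])  # intersection top-left
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])  # intersection bottom-right
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175 ≈ 0.1429
```

Detection pipelines use this metric both to match predictions to ground truth (e.g. an IoU threshold of 0.5) and inside non-maximum suppression to discard overlapping predictions.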
What we offer:
- Group Medical Insurance (Family Floater Plan - Self + Spouse + 2 Dependent Children)
- Sum Insured: INR 5,00,000/-
- Maternity cover up to two children
- Inclusive of COVID-19 Coverage
- Cashless & Reimbursement facility
- Access to free online doctor consultation
- Personal Accident Policy (Disability Insurance)
- Sum Insured: INR 25,00,000/- per employee
- Accidental Death and Permanent Total Disability is covered up to 100% of Sum Insured
- Permanent Partial Disability is covered as per the scale of benefits decided by the Insurer
- Temporary Total Disability is covered
- An option of Paytm Food Wallet (up to Rs. 2500) as a tax saver benefit
- Monthly Internet Reimbursement of up to Rs. 1,000
- Opportunity to pursue Executive Programs/ courses at top universities globally
- Professional Development opportunities through various MTX sponsored certifications on multiple technology stacks including Salesforce, Google Cloud, Amazon & others



