
Provide 24x7x365 NOC support for voice-related issues
· Experience with SMSC, SMPP, HTTP, HTTPS, SIGTRAN, and SS7
· Knowledge of and relevant experience with VPNs and messaging nodes (SMSC, Bulk SMS, SMSR, SMS Hub)
· Manage day-to-day operations of the SMPP/SS7 SMS NOC: customer/vendor/route/rate configuration management, trace capturing, and troubleshooting
· New client interconnects and interconnection testing
· Monitor all alarms, alerts, and performance indicators from the system; maintain the system health checklist for the various nodes
· Monitor disk space, log files, dump/log purging, and application uptime
· Capacity management and reporting of any potential capacity breach
· Answer customer emails and calls, providing timely, high-quality service
· Use a variety of tools (Ethereal/Wireshark, ping, traceroute, browser, etc.) to quickly verify reported events (see the sketch after this list)
· Head-to-head testing and configuration with clients
· Management reporting as required
· Maintain a good understanding of all NOC processes and apply them appropriately
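The event-verification bullet above lends itself to light automation. Below is a minimal, non-authoritative sketch of the kind of quick reachability check an engineer might script before capturing traces in Wireshark; the hostnames are hypothetical and Linux `ping` flags are assumed.

```python
#!/usr/bin/env python3
"""Minimal sketch: quick reachability check for reported events (hosts are hypothetical)."""
import subprocess

# Hypothetical peers/nodes named in a ticket; replace with the hosts under investigation.
HOSTS = ["smsc-gw01.example.net", "smpp-peer.example.com"]

def ping(host: str, count: int = 3, timeout_s: int = 5) -> bool:
    """Return True if the host answers ICMP echo within the timeout (Linux 'ping' flags assumed)."""
    result = subprocess.run(
        ["ping", "-c", str(count), "-W", str(timeout_s), host],
        capture_output=True,
        text=True,
    )
    return result.returncode == 0

if __name__ == "__main__":
    for host in HOSTS:
        status = "reachable" if ping(host) else "UNREACHABLE - capture traces / escalate"
        print(f"{host}: {status}")
```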
Skills
● Telecom industry experience will be an added advantage.
● Understanding of all selection methods and techniques
● Average communicator
● Well-organized
--

Job Title: .NET Full Stack Developer
Experience: 3 to 6 Years
Work Mode: Hybrid (2-3 days from office)
Location: Gurgaon
Joiners: Immediate joiners or candidates who have completed their notice period preferred
Key Responsibilities
- Design, develop, and maintain web applications using .NET (C#) on the backend and Angular/React on the frontend.
- Develop RESTful APIs and integrate them with front-end components.
- Collaborate with UI/UX designers, backend developers, and product managers to deliver high-quality features.
- Write clean, maintainable, and efficient code following best practices.
- Participate in code reviews and contribute to continuous improvement of development processes.
- Troubleshoot and debug issues across the application stack.
- Work with DevOps teams to support CI/CD pipelines and deployment.
- Ensure application scalability, performance, and security.
- Contribute to documentation, unit testing, and version control.
Required Skills
- Strong proficiency in C# and .NET Core/.NET Framework.
- Experience with JavaScript and modern front-end frameworks like Angular or React (preference for Angular).
- Exposure to cloud platforms – Azure (preferred), AWS, or GCP.
- Good understanding of HTML5, CSS3, and TypeScript.
- Experience in RESTful API development.
- Familiarity with Entity Framework and SQL-based databases like SQL Server.
- Understanding of version control systems like Git.
- Basic knowledge of CI/CD practices and tools like Azure DevOps or Jenkins.
The Sr. AWS/Azure/GCP Databricks Data Engineer at Koantek will use comprehensive modern data engineering techniques and methods with Advanced Analytics to support business decisions for our clients. Your goal is to support the use of data-driven insights to help our clients achieve business outcomes and objectives. You will collect, aggregate, and analyze structured/unstructured data from multiple internal and external sources and communicate patterns, insights, and trends to decision-makers. You will help design and build data pipelines, data streams, reporting tools, information dashboards, data service APIs, data generators, and other end-user information portals and insight tools. You will be a critical part of the data supply chain, ensuring that stakeholders can access and manipulate data for routine and ad hoc analysis to drive business outcomes using Advanced Analytics. You are expected to function as a productive member of a team, working and communicating proactively with engineering peers, technical leads, project managers, product owners, and resource managers.
Requirements:
- Strong experience as an AWS/Azure/GCP Data Engineer; AWS/Azure/GCP Databricks experience is a must
- Expert proficiency in Spark, Scala, and Python
- Must have data migration experience from on-prem to cloud
- Hands-on experience with Kinesis to process and analyze stream data, Event/IoT Hubs, and Cosmos
- In-depth understanding of Azure/AWS/GCP cloud, and of data lake and analytics solutions on Azure
- Expert-level hands-on experience designing and developing applications on Databricks
- Extensive hands-on experience implementing data migration and data processing using AWS/Azure/GCP services
- In-depth understanding of Spark architecture, including Spark Streaming, Spark Core, Spark SQL, DataFrames, RDD caching, and Spark MLlib
- Hands-on experience with the technology stack available in the industry for data management, data ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, Spark, Flume, Impala, etc.
- Hands-on knowledge of data frameworks, data lakes, and open-source projects such as Apache Spark, MLflow, and Delta Lake
- Good working knowledge of code versioning tools such as Git, Bitbucket, or SVN
- Hands-on experience using Spark SQL with various data sources such as JSON, Parquet, and key-value pairs (illustrated in the sketch after this list)
- Experience preparing data for Data Science and Machine Learning, with exposure to model selection, model lifecycle, hyperparameter tuning, model serving, deep learning, etc.
- Demonstrated experience preparing data and automating and building data pipelines for AI use cases (text, voice, image, IoT data, etc.)
- Good to have programming experience with .NET or Spark/Scala
- Experience creating tables, partitioning, bucketing, loading, and aggregating data using Spark Scala and Spark SQL/PySpark (also covered in the sketch after this list)
- Knowledge of AWS/Azure/GCP DevOps processes such as CI/CD, as well as Agile tools and processes including Git, Jenkins, Jira, and Confluence
- Working experience with Visual Studio, PowerShell scripting, and ARM templates; able to build ingestion to ADLS and enable a BI layer for analytics
- Strong understanding of data modeling and of defining conceptual, logical, and physical data models
- Big data/analytics/information analysis/database management in the cloud
- IoT/event-driven/microservices in the cloud; experience with private and public cloud architectures, their pros/cons, and migration considerations
- Ability to remain up to date with industry standards and technological advancements that will enhance data quality and reliability to advance strategic initiatives
- Working knowledge of RESTful APIs, the OAuth2 authorization framework, and security best practices for API gateways
- Guide customers in transforming big data projects, including development and deployment of big data and AI applications
- Guide customers on data engineering best practices, provide proofs of concept, architect solutions, and collaborate when needed
- 2+ years of hands-on experience designing and implementing multi-tenant solutions using AWS/Azure/GCP Databricks for data governance, data pipelines for near-real-time data warehousing, and machine learning solutions
- Overall 5+ years' experience in a software development, data engineering, or data analytics field using Python, PySpark, Scala, Spark, Java, or equivalent technologies; hands-on expertise in Apache Spark (Scala or Python)
- 3+ years of experience in query tuning, performance tuning, troubleshooting, and debugging Spark and other big data solutions
- Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or a similar area of study, or equivalent work experience
- Ability to manage competing priorities in a fast-paced environment
- Ability to resolve issues
- Basic experience with or knowledge of agile methodologies
- AWS Certified: Solutions Architect Professional
- Databricks Certified Associate Developer for Apache Spark
- Microsoft Certified: Azure Data Engineer Associate
- Google Cloud Certified (GCP): Professional
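Two of the items above (querying JSON/Parquet sources with Spark SQL, and writing partitioned, aggregated tables) are concrete enough to sketch. The snippet below is a minimal, non-authoritative PySpark illustration; all paths, table names, and column names are hypothetical, and a Databricks-style SparkSession plus Delta Lake support are assumed.

```python
# Minimal PySpark sketch of the Spark SQL and partitioning items above.
# Paths, tables, and columns are hypothetical; a SparkSession (e.g. the one
# Databricks provides as `spark`) and Delta Lake support are assumed.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("requirements-sketch").getOrCreate()

# Read a semi-structured source (JSON) and a columnar source (Parquet).
events = spark.read.json("/mnt/raw/events/")               # hypothetical landing path
customers = spark.read.parquet("/mnt/curated/customers/")  # hypothetical curated path

# Query them with Spark SQL via temporary views.
events.createOrReplaceTempView("events")
customers.createOrReplaceTempView("customers")
daily = spark.sql("""
    SELECT c.region, e.event_date, COUNT(*) AS event_count
    FROM events e
    JOIN customers c ON e.customer_id = c.customer_id
    GROUP BY c.region, e.event_date
""")

# Write the aggregate as a Delta table partitioned by date.
(daily
    .withColumn("event_count", F.col("event_count").cast("long"))
    .write
    .mode("overwrite")
    .partitionBy("event_date")
    .format("delta")
    .saveAsTable("analytics.daily_events"))
```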
Job Description for Telecalling Executive
1. Contact potential customers to promote solar energy products and services, including solar panel installation, maintenance, and energy savings.
2. Identify and qualify potential leads through phone conversations.
3. Address customer inquiries, provide relevant information, and assist in setting up appointments for site assessments.
4. Regularly follow up with leads and existing customers to close sales or resolve any issues.

- Proficiency in Python, Django, and other allied frameworks;
- Expert in designing UI/UX interfaces;
- Expert in testing, troubleshooting, debugging and problem solving;
- Basic knowledge of SEO;
- Good communication;
- Team building and good acumen;
- Ability to perform;
- Continuous learning
Skills
- Strong proficiency in JavaScript, including DOM manipulation and the JavaScript object model.
- Expertise in backend programming with Node.js and MongoDB.
- Experience with React.js and redux.
- Material UI and 3rd party libraries.
- Experience with clean coding practices, such as avoiding callback hell by using promises and async/await.
- Thorough understanding of Node.js and its core principles.
- Experience with popular React.js workflows (such as Flux or Redux).
- Familiarity with newer specifications of ECMAScript.
- Experience with data structure libraries (e.g., Immutable.js).
- Familiarity with RESTful APIs.
- Knowledge of modern authorization mechanisms, such as JSON Web Token.
- Familiarity with modern front-end build pipelines and tools.
- A knack for benchmarking and optimization.
- Familiarity with code versioning tools (such as Git, SVN, and Mercurial).
Responsibilities:
- Build Node.js APIs using microservices.
- Rewriting backend code using a microservices architecture, with unit tests.
- Developing new user-facing features using React.js.
- Building reusable components and front-end libraries for future use.
- Translating designs and wireframes into high-quality code.
- Optimizing components for maximum performance across a vast array of web-capable devices and browsers.
· 3+ years of experience as a Software Engineer
· Deep understanding of server-side code, with experience developing in Node.js
· Must have good knowledge of Express, REST APIs, WebSocket, OAuth, OpenID, and Node.js best practices
· Must be able to create a separate microservice for each business domain
· Experienced in unit testing; should be able to achieve code coverage of 90% or higher
· Can write complex algorithms with multi-threading as part of a feature
· Experience in asynchronous programming
· Knowledge of cloud applications, such as on AWS
· Familiarity with code versioning tools such as Git, SVN, and Mercurial
· Practical experience delivering in an agile environment
· Practical experience developing real-world solutions and platforms
· Good understanding of security and performance considerations
· Understanding of architectural and design patterns
· Deep understanding of SQL and NoSQL databases
We are hiring DevOps Engineers for a luxury-commerce platform that is well-funded and now ready for its next level of growth. It is backed by reputed investors and is already a leader in its space. The focus for the coming years will be heavily on scaling the platform through technology. Market-driven, competitive salary for the right candidate.
Job Title : DevOps System Engineer
Responsibilities:
- Implementing, maintaining, monitoring, and supporting the IT infrastructure
- Writing scripts for service quality analysis, monitoring, and operation (see the sketch after this list)
- Designing procedures for system troubleshooting and maintenance
- Investigating and resolving technical issues by deploying updates/fixes
- Implementing automation tools and frameworks for automatic code deployment (CI/CD)
- Quality control and management of the codebase
- Ownership of infrastructure and deployments in various environments
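The scripting responsibility above can be made concrete with a small example. Below is a minimal, non-authoritative sketch of a service-quality probe; the endpoints and latency budget are hypothetical and would normally come from configuration or service discovery.

```python
#!/usr/bin/env python3
"""Minimal sketch of a service-quality monitoring probe (endpoints and thresholds are hypothetical)."""
import time
import urllib.error
import urllib.request

# Hypothetical health-check endpoints; in practice these would come from config or service discovery.
ENDPOINTS = {
    "web": "https://example.com/healthz",
    "api": "https://api.example.com/healthz",
}
LATENCY_BUDGET_S = 1.0  # flag probes slower than this

def probe(name: str, url: str) -> None:
    """Hit a health endpoint and report status code, latency, and failures."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            elapsed = time.monotonic() - start
            slow = " (SLOW)" if elapsed > LATENCY_BUDGET_S else ""
            print(f"{name}: HTTP {resp.status} in {elapsed:.2f}s{slow}")
    except (urllib.error.URLError, OSError) as exc:
        print(f"{name}: DOWN ({exc})")

if __name__ == "__main__":
    for name, url in ENDPOINTS.items():
        probe(name, url)
```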
Requirements:
- Degree in Computer Science, Engineering or a related field
- Prior experience as a DevOps engineer
- Good knowledge of various operating systems - Linux, Windows, Mac.
- Good knowledge of networking, virtualization, and containerization technologies.
- Familiarity with software release management and deployment (Git, CI/CD)
- Familiarity with one or more popular cloud platforms such as AWS, Azure, etc.
- Solid understanding of DevOps principles and practices
- Knowledge of systems and platforms security
- Good problem-solving skills and attention to detail
Skills: Linux, Networking, Docker, Kubernetes, AWS/Azure, Git/GitHub, Jenkins, Selenium, Puppet/Chef/Ansible, Nagios
Experience : 5+ years
Location: Prabhadevi, Mumbai
Interested candidates can apply with their updated profiles.
Regards,
HR Team
Aza Fashions
Requirements
- 3-5 years of experience working on Python Environment
- Knowledge of software development methodologies; able to work on projects individually or as part of a team, and comfortable working with the Agile methodology
- Knowledge of relational databases, version control tools and other development tools
- Understanding of the threading limitations of Python and of multi-process architecture (see the sketch after this list)
- Understanding of accessibility and security compliance
- Strong interest in learning new skills/technologies and curiosity to explore various Technologies.
- Should have strong analytical skills
- Good communication
- Able to integrate multiple data sources and databases into one system
- Understanding of the differences between multiple delivery platforms, such as mobile vs desktop, and optimizing output to match the specific platform
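The threading item in the list above is worth making concrete. The sketch below (with an illustrative workload and sizes) contrasts a thread pool, which the GIL keeps effectively serialized for CPU-bound work, with a process pool, which runs the same work in parallel.

```python
# Minimal sketch of Python's threading limitation for CPU-bound work and the
# multi-process alternative. Workload and sizes are illustrative only.
import time
from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor

def cpu_bound(n: int) -> int:
    """Deliberately CPU-heavy (no I/O), so the GIL serializes threaded execution."""
    return sum(i * i for i in range(n))

def timed(executor_cls, label: str) -> None:
    """Run the same four tasks in the given executor and report wall-clock time."""
    with executor_cls(max_workers=4) as pool:
        start = time.perf_counter()
        list(pool.map(cpu_bound, [2_000_000] * 4))
        print(f"{label}: {time.perf_counter() - start:.2f}s")

if __name__ == "__main__":
    timed(ThreadPoolExecutor, "threads (GIL-bound)")
    timed(ProcessPoolExecutor, "processes (parallel)")
```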
Must-Have Skills:
- Python 2/3
- Should have worked on one or more major Python web frameworks, such as Django
- Experience with API development and integration of third-party APIs
- Working knowledge of MySQL database integration with Python (see the sketch below)
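A minimal, non-authoritative sketch of the MySQL integration point follows. It uses the PyMySQL driver (one of several options, shown here as a plain-SQL alternative to an ORM), and the credentials, database, and table names are hypothetical.

```python
# Minimal sketch of MySQL integration from Python using the PyMySQL driver.
# Credentials, database, and table names are hypothetical.
import pymysql

connection = pymysql.connect(
    host="localhost",
    user="app_user",
    password="change-me",
    database="app_db",
    cursorclass=pymysql.cursors.DictCursor,
)

try:
    with connection.cursor() as cursor:
        # Parameterized query to avoid SQL injection.
        cursor.execute(
            "SELECT id, email FROM customers WHERE created_at >= %s",
            ("2024-01-01",),
        )
        for row in cursor.fetchall():
            print(row["id"], row["email"])
finally:
    connection.close()
```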
- Technical skills:
- Primary skills:
- Siebel CSW
Should have hands-on experience with Siebel configuration, scripting, and workflows
- Siebel Integration
Should have hands-on experience with Siebel integrations such as JMS/MQ, web services, VBC, EBC, REST services, API building, etc.
- Siebel OpenUI
Should have hands-on experience in Siebel OpenUI development.
- Database (SQL /Oracle)
Should have good experience in writing queries, performance management, query optimization, etc.
- Should be able to work independently and should have strong debugging skills
- Secondary skills (good to have):
- Siebel certification
- Exposure to Agile
- Exposure to HP ALM and JIRA
- Candidate's overall experience should be a minimum of 5 and a maximum of 8 years.









