11+ Wireshark Job Openings in Delhi, NCR and Gurgaon
Profile : Network Support Engineer
Experience - 3+ years
Timing - Thurs-Mon: 10:30 PM to 7:30 AM
Location: Noida, Sec-62 (Work from Office)
Expertise & Hands on experience with the following:
- Creating new infrastructure from scratch
- Excellent working experience with LAN/WAN management and network devices
- Linux server installation, driver installation, and configuration
- Network troubleshooting in both Windows and Linux environments.
- Troubleshooting of WiFi Networks
- Solid understanding of networking concepts and protocols, including TCP/IP, DNS, DHCP, BGP, OSPF, VLANs, and VPNs.
- Perform network diagnostics using tools such as Wireshark, tcpdump, and SNMP.
- Investigate security incidents and implement measures to mitigate risks.
- Participate in on-call rotation to provide support for network emergencies.
- Experience managing complex network infrastructure.
- Experience automating network device management with scripting.
- Strong analytical and problem-solving skills with the ability to quickly diagnose and resolve network issues.
- Cisco CCNA or equivalent certification is a plus.
- Scripting for automation: Python, Bash
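A typical small automation task for this role is sanity-checking device addressing. As an illustrative sketch (hypothetical subnet and device IPs, Python standard library only), a script that reports which hosts belong to a given VLAN subnet:

```python
import ipaddress

def hosts_in_subnet(hosts, subnet):
    """Return the host IPs that fall inside the given subnet."""
    net = ipaddress.ip_network(subnet)
    return [h for h in hosts if ipaddress.ip_address(h) in net]

# Hypothetical VLAN 20 subnet and a few device IPs.
vlan20 = "10.0.20.0/24"
devices = ["10.0.20.5", "10.0.21.7", "10.0.20.200"]
print(hosts_in_subnet(devices, vlan20))  # → ['10.0.20.5', '10.0.20.200']
```

The same pattern extends naturally to auditing DHCP scopes or validating VLAN assignments pulled from device configs.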
Non-Functional Requirements:
- Performs regular audits of server security and of websites exposed to the outside world.
- Documenting any processes which employees need to follow in order to successfully work within our computing system.
- Experience supporting technical teams (e.g., developers and/or IT teams)
- Strong attention to detail and ability to navigate complex issues in high-pressure situations
- The ability to effectively collaborate with various internal teams
- Excellent critical thinking and problem solving skills.
- Ensures all problems are resolved in a timely manner and in line with SLAs.
- Ability to prioritize a wide range of workloads with critical deadlines.
- Experience in a 24x7 environment.
- Good communication skills
About the Company:
MyOperator is a Business AI Operator, a category-leader that unifies WhatsApp, Calls, and AI-powered chat & voice bots into one intelligent business communication platform. Unlike fragmented communication tools, MyOperator combines automation, intelligence, and workflow integration to help businesses run WhatsApp campaigns, manage calls, deploy AI chatbots, and track performance all from a single, no-code platform. Trusted by 12,000+ brands including Amazon, Domino's, Apollo, and Razorpay, MyOperator enables faster responses, higher resolution rates, and scalable customer engagement without fragmented tools or increased headcount.
About the Role:
MyOperator is looking for a motivated and customer-focused Customer Support Executive to join our dynamic team. This role offers hands-on experience in handling customer queries, managing support tickets, and gaining exposure to the workings of a fast-growing SaaS company.
Key Responsibilities:
- Respond to customer queries via calls, emails, and support tickets.
- Assist in resolving issues related to panel access, call reports, and login problems.
- Update ticket statuses and maintain accurate customer records using Zoho Desk.
- Collaborate with the support team to escalate complex issues to relevant departments.
- Assist in creating and maintaining FAQs and troubleshooting documentation.
Requirements:
- Good communication skills in English (spoken and written).
- Basic computer proficiency and willingness to learn support tools and CRM systems.
- Strong interest in SaaS, tech support, or customer-facing roles.
- Availability for a full-time role (Monday to Saturday, rotational shifts).
What You’ll Gain:
- Practical exposure to customer success and support operations in a SaaS environment.
- Opportunity to work with experienced professionals and learn industry best practices
Experience – 1-2 yrs
Qualification- MCA /B. Tech.
Job Nature - Permanent
Skills - PHP, Laravel, CodeIgniter, MySQL, HTML5, CSS3, JavaScript (ES6), Ajax, WordPress theme & plugin development
Working days - 5
Job location- Lucknow
Responsibilities:
- Developing and maintaining high-performing, scalable, and robust themes and plugins.
- Implementing different aspects of coding standards for JavaScript, PHP, and WordPress.
- Understanding the project requirements, preparing the flow, and planning the solution to accomplish the job optimally.
- Designing and implementing new features and functionality.
- Establishing and guiding the website’s architecture and custom requirements.
- Ensuring high performance and availability, and managing all technical aspects of the CMS.
- Must be able to create themes and plugins from scratch by following WordPress theme and plugin development standards.
- Strong understanding of Backend technologies, including PHP, OOP concepts, Ajax, and API integrations.
- Strong understanding of front-end technologies, including HTML5, CSS3, JavaScript, and jQuery.
- In-depth experience with WordPress terminologies like Post, Page, Category, Tags, Template hierarchy, etc.
- Excellent problem-solving and learning skills.
- Experience in working with WooCommerce and its add-ons, developing payment gateways, shipping, tax solutions, and REST API integrations.
- Have a deep understanding of WordPress database structure, built-in queries, and security practices.
- Comfortable working with debugging tools like Firebug, Chrome Inspector, etc.
Good to Have :
- Experience working with WordPress Multisite development
- Proficient understanding of code versioning tools such as Git, BitBucket, SVN, and Mercurial.
- Have worked with 3rd party API integrations with WordPress via plugins and themes.
- Knowledge of how to interact with RESTful APIs and formats (JSON, XML)
- Knowledge of WPRest API and WP-CLI
- Package management tools like NPM and Composer
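Interacting with the WP REST API mentioned above usually means building the `/wp-json/wp/v2/...` endpoint URLs and parsing the JSON they return. A minimal sketch in Python (the site URL and sample payload are hypothetical; the endpoint path and `title.rendered` field follow the core WordPress REST API):

```python
import json

def wp_posts_endpoint(site, per_page=10):
    """Build the core WP REST API URL for listing posts."""
    return f"{site.rstrip('/')}/wp-json/wp/v2/posts?per_page={per_page}"

def post_titles(payload):
    """Extract rendered titles from a /wp/v2/posts JSON response."""
    return [p["title"]["rendered"] for p in json.loads(payload)]

# A trimmed sample of what /wp/v2/posts returns.
sample = '[{"id": 1, "title": {"rendered": "Hello World"}}]'
print(wp_posts_endpoint("https://example.com"))
print(post_titles(sample))  # → ['Hello World']
```

The same parsing applies whether the payload comes from `urllib`, `requests`, or a WP-CLI `wp rest` call.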
About Us :
CLOUDSUFI, a Google Cloud Premier Partner, is a Data Science and Product Engineering organization building Products and Solutions for the Technology and Enterprise industries. We firmly believe in the power of data to transform businesses and drive better decisions. We combine unmatched experience in business processes with cutting-edge infrastructure and cloud services, and we partner with our customers to monetize their data and make enterprise data dance.
Our Values :
We are a passionate and empathetic team that prioritizes human values. Our purpose is to elevate the quality of lives for our family, customers, partners and the community.
Equal Opportunity Statement :
CLOUDSUFI is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified candidates receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, and national origin status. We provide equal opportunities in employment, advancement, and all other areas of our workplace.
Role : Lead AI/Senior Engineer-AI
Location : Noida, Delhi/NCR
Experience : 5-12 years
Education : BTech / BE / MCA / MSc Computer Science
Must Haves :
Conversational AI & NLU :
- Advanced proficiency with Dialogflow CX
- Intent classification, entity extraction, conversation flow design
- Experience building structured dialogue flows with routing logic
- CCAI platform familiarity
Agentic AI & Multi-Step Reasoning :
- Production experience with Google ADK (or LangChain/LangGraph equivalent)
- Multi-step reasoning and tool orchestration capability
- Tool-use patterns and function calling implementation
RAG Systems & Knowledge Management :
- Hands-on Vertex AI RAG Engine experience (or equivalent)
- Semantic search, chunking strategies, retrieval optimization
- Document processing pipelines (PDF parsing, chunking)
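A common baseline for the chunking strategies listed above is fixed-size chunks with overlap, so that context at chunk boundaries is not lost to retrieval. An illustrative sketch (sizes are arbitrary; production pipelines typically chunk by tokens or semantic units rather than raw characters):

```python
def chunk_text(text, size=200, overlap=50):
    """Split text into fixed-size character chunks with overlap,
    a simple baseline chunking strategy for RAG corpora."""
    if overlap >= size:
        raise ValueError("overlap must be smaller than chunk size")
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap  # advance by the non-overlapping stride
    return chunks

doc = "a" * 450
pieces = chunk_text(doc, size=200, overlap=50)
print(len(pieces), [len(p) for p in pieces])  # → 3 [200, 200, 150]
```

Overlap size trades off index bloat against the risk of splitting an answer-bearing passage across two chunks.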
LLM/GenAI & Prompt Engineering :
- Production experience with Gemini models
- Advanced prompt engineering for customer support
- Langfuse experience for prompt management
Google Cloud Platform & Vertex AI :
- Advanced Vertex AI proficiency (Generative AI APIs, Agent Engine)
- Cloud Functions and Cloud Run deployment experience
- BigQuery for conversation analytics
API Integration :
- Genesys Cloud CX integration experience
- REST API design and webhook implementation
- Enterprise authentication patterns (OAuth 2.0)
Good To Have :
Conversational AI & NLU :
- Multi-language support implementation (Spanish/English)
- Telephony integration (speech recognition, TTS, DTMF)
- Barge-in handling and voice optimization
Agentic AI :
- Agent state management and session persistence
- Advanced fallback strategies and error recovery
- Dynamic tool selection and evaluation
RAG Systems :
- Re-ranking and advanced retrieval quality metrics
- Query expansion and context-aware retrieval
- Corpus organization strategies
LLM/GenAI :
- Prompt versioning, A/B testing, iterative refinement
- Prompt injection mitigation strategies
- In-context learning, few-shot, chain-of-thought techniques
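Few-shot prompting, as referenced above, is largely a matter of consistent prompt assembly: an instruction, worked examples, then the new query. A minimal sketch (the intent labels and example pairs are hypothetical):

```python
def build_few_shot_prompt(instruction, examples, query):
    """Assemble a few-shot prompt: instruction, worked examples, then the query."""
    parts = [instruction, ""]
    for q, a in examples:
        parts += [f"Q: {q}", f"A: {a}", ""]
    parts += [f"Q: {query}", "A:"]  # leave the final answer for the model
    return "\n".join(parts)

prompt = build_few_shot_prompt(
    "Classify the support intent as BILLING or TECHNICAL.",
    [("Why was I charged twice?", "BILLING"),
     ("My router keeps rebooting.", "TECHNICAL")],
    "I can't log in to my account.",
)
print(prompt)
```

Keeping assembly in one function also makes prompt versioning and A/B testing straightforward, since each variant is just a different set of inputs.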
LLMOps & Observability :
- Vertex AI Evaluation Service experience
- Groundedness, relevance, coherence, safety metrics
- Trace-level debugging with Cloud Trace
- Centralized logging strategies
Google Cloud :
- Application Integration connectors
- VPC Service Controls and enterprise security
- Cloud Pub/Sub for event-driven systems
Enterprise Integration :
- Third-party AI agent orchestration (SAP Joule, ServiceNow AI, Agentforce)
- Salesforce, SAP, ServiceNow integration patterns
- Context passage strategies for escalations
Architecture & System Design :
- Configuration-driven systems (Meta-Agent patterns)
- Microservices and containerization
- Scalable, multi-tenant system design
- Disaster recovery and failover strategies
Product Quality & KPIs :
- Customer support metrics expertise (CSAT, SSR, escalation rate)
- A/B testing and experimentation frameworks
- User feedback loop implementation
Deliverables :
- Architecture Design : End-to-end platform architecture, data flow diagrams, Dialogflow CX vs. ADK routing decisions
- Conversational Flows : 15+ dialogue flows covering billing, networking, appointments, troubleshooting, and escalations
- ADK Agent Implementation : Complex reasoning agents for technical support, account analysis, and context preparation
- RAG Pipeline : Document processing, chunking configuration, corpus organization (product docs, support articles, policies, promotions)
- Prompt Management : System prompts, Langfuse setup, playbook governance, version control
- Quality Framework : Evaluation pipeline, metrics dashboards, automated assessment, continuous improvement recommendations
- Integration Layer : Genesys handoff, webhook integrations, Application Integration setup, session management
- Testing & Validation : Conversation flow tests, performance testing (latency, throughput, 1000 concurrent users), security validation
- Response time <2 seconds (p95), 99.9% uptime, 1000 concurrent conversations
- Data encryption (TLS 1.2+, AES-256 at rest), PII redaction, 1-year data retention
- Graceful degradation and fallback mechanisms
DataHavn IT Solutions is a company that specializes in big data and cloud computing, artificial intelligence and machine learning, application development, and consulting services. We aim to be a frontrunner in everything to do with data, and we have the expertise to transform customer businesses by making the right use of data.
About the Role:
As a Data Scientist specializing in Google Cloud, you will play a pivotal role in driving data-driven decision-making and innovation within our organization. You will leverage the power of Google Cloud's robust data analytics and machine learning tools to extract valuable insights from large datasets, develop predictive models, and optimize business processes.
Key Responsibilities:
- Data Ingestion and Preparation:
- Design and implement efficient data pipelines for ingesting, cleaning, and transforming data from various sources (e.g., databases, APIs, cloud storage) into Google Cloud Platform (GCP) data warehouses (BigQuery) or data lakes, using services such as Dataflow.
- Perform data quality assessments, handle missing values, and address inconsistencies to ensure data integrity.
- Exploratory Data Analysis (EDA):
- Conduct in-depth EDA to uncover patterns, trends, and anomalies within the data.
- Utilize visualization techniques (e.g., Tableau, Looker) to communicate findings effectively.
- Feature Engineering:
- Create relevant features from raw data to enhance model performance and interpretability.
- Explore techniques like feature selection, normalization, and dimensionality reduction.
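The normalization step mentioned above is often a one-liner per feature; as a plain-Python sketch (the `ages` feature is hypothetical, and in practice libraries like scikit-learn's `MinMaxScaler` handle this):

```python
def min_max_scale(values):
    """Scale a numeric feature into [0, 1] (min-max normalization)."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]  # constant feature carries no signal
    return [(v - lo) / (hi - lo) for v in values]

ages = [18, 30, 45, 60]
print(min_max_scale(ages))
```

Scaling features into a shared range keeps distance-based models and gradient descent from being dominated by features with larger raw magnitudes.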
- Model Development and Training:
- Develop and train predictive models using machine learning algorithms (e.g., linear regression, logistic regression, decision trees, random forests, neural networks) on GCP platforms like Vertex AI.
- Evaluate model performance using appropriate metrics and iterate on the modeling process.
- Model Deployment and Monitoring:
- Deploy trained models into production environments using GCP's ML tools and infrastructure.
- Monitor model performance over time, identify drift, and retrain models as needed.
- Collaboration and Communication:
- Work closely with data engineers, analysts, and business stakeholders to understand their requirements and translate them into data-driven solutions.
- Communicate findings and insights in a clear and concise manner, using visualizations and storytelling techniques.
Required Skills and Qualifications:
- Strong proficiency in Python or R programming languages.
- Experience with Google Cloud Platform (GCP) services such as BigQuery, Dataflow, Cloud Dataproc, and Vertex AI.
- Familiarity with machine learning algorithms and techniques.
- Knowledge of data visualization tools (e.g., Tableau, Looker).
- Excellent problem-solving and analytical skills.
- Ability to work independently and as part of a team.
- Strong communication and interpersonal skills.
Preferred Qualifications:
- Experience with cloud-native data technologies (e.g., Apache Spark, Kubernetes).
- Knowledge of distributed systems and scalable data architectures.
- Experience with natural language processing (NLP) or computer vision applications.
- Certifications in Google Cloud Platform or relevant machine learning frameworks.
Position: Travel Sales Consultant
Location: Saket, New Delhi
Experience: 1 - 2 years (Freshers can also apply)
Minimum Educational Qualification: Graduation
Employment Type: Full-time
Responsibilities And Duties
Connect with our prospective clients over the phone, email, and chat.
Ask the right questions to ascertain the needs of each unique traveler
Recommend and sell the right trip to the client based on their needs including any extra services that may enhance their experience.
Ensure the use of correct booking processes and procedures to minimize risk and reduce error rates.
Act at all times with the purpose of providing a life-changing experience
Things you will need to bring to begin your adventure with us:
Sales skills - You'll have that edge when it comes to sales and understanding how to provide amazing customer service. You'll be target-driven, and up for any challenge.
Travel experience - You'll be a globe trotter who has an incurable case of the travel bug.
Academic achievements - You'll have been a high flyer with academic accomplishments.
Career ambition - You'll love the thought of a challenging career that can take you places.
Benefits
Unlimited Earnings - You'll work on a fixed base salary plus uncapped commission; the more you sell, the more you'll earn! First-year average earnings are around 1 Million with potential for year-upon-year growth as you build your client base.
Training and development at our own in-house Learning Centre - We will provide you with all the tools you need to get up and running, as well as ongoing training to further develop your skills and knowledge.
Career development and advancement opportunities.
Unbeatable company culture.
Skills: Inside Sales, Sales and Business Development
Key Responsibilities
• As a part of the DevOps team, you will be responsible for the configuration, optimization, documentation, and support of the CI/CD components.
• Creating and managing build and release pipelines with Azure DevOps and Jenkins.
• Assist in planning and reviewing application architecture and design to promote an efficient deployment process.
• Troubleshoot server performance issues & handle the continuous integration system.
• Automate infrastructure provisioning using ARM Templates and Terraform.
• Monitor and Support deployment, Cloud-based and On-premises Infrastructure.
• Diagnose and develop root-cause solutions for failures and performance issues in the production environment.
• Deploy and manage Infrastructure for production applications
• Configure security best practices for application and infrastructure
Essential Requirements
• Good hands-on experience with cloud platforms like Azure, AWS & GCP. (Preferably Azure)
• Strong knowledge of CI/CD principles.
• Strong work experience with CI/CD implementation tools like Azure DevOps, Team City, Octopus Deploy, AWS Code Deploy, and Jenkins.
• Experience writing automation scripts with PowerShell, Bash, Python, etc.
• Experience with GitHub, JIRA, Confluence, and Continuous Integration (CI) systems.
• Understanding of secure DevOps practices
Good to Have -
• Knowledge of scripting languages such as PowerShell, Bash
• Experience with project management methodologies and workflow tools such as Agile, Scrum/Kanban, and Jira.
• Experience with Build technologies and cloud services. (Jenkins, TeamCity, Azure DevOps, Bamboo, AWS Code Deploy)
• Strong communication skills and the ability to explain protocols and processes to the team and management.
• Must be able to handle multiple tasks and adapt to a constantly changing environment.
• Must have a good understanding of SDLC.
• Knowledge of Linux, Windows server, Monitoring tools, and Shell scripting.
• Self-motivated, demonstrating the ability to pick up new technologies with minimal supervision.
• Organized, flexible, and analytical ability to solve problems creatively
Expense Recording
Bank Reconciliation
Debtor Reconciliation
Vendor Payments
Vendor Reconciliation
GST & TDS Returns
We're DeepR Analytics (www.deepranalytics.com), a company based out of Toronto, Canada, that develops algorithmic trading solutions and products, primarily focused on merging innovative trading algorithms with recent advances in machine learning, such as deep learning and reinforcement learning, to automate trading and reap profits. We are continuously looking for disciplined, professional C++ software developers who are familiar with Python as well. The C++ software developers will work on the design, implementation, and enhancement of trading algorithms.
Responsibilities:
- Developing low latency trading algorithms for US Equities market in close coordination with the trade-strategy team
- Developing and maintaining large-scale, high-performance, robust software for management of high velocity and real-time market data
- Developing applications and software components to be reused across multiple products
- Designing, developing, and optimization of algorithms for quality & performance-optimized software development
- Contribute to architecture and technical design and implementation
- Developing and reviewing software requirements, designs, and test plans
- Documentation of existing and developing products
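Since the role expects the ability to read Python alongside C++, a toy example of the kind of signal logic the trade-strategy coordination above involves (a simple moving-average crossover; the prices and window sizes are hypothetical, and this is an illustration, not a production strategy):

```python
def sma(prices, window):
    """Simple moving average over the trailing window."""
    return sum(prices[-window:]) / window

def crossover_signal(prices, fast=3, slow=5):
    """Return 'BUY' when the fast SMA is above the slow SMA, else 'SELL'.
    Returns 'HOLD' until enough data has accumulated."""
    if len(prices) < slow:
        return "HOLD"
    return "BUY" if sma(prices, fast) > sma(prices, slow) else "SELL"

ticks = [100, 101, 102, 103, 105]
print(crossover_signal(ticks))  # → BUY (fast SMA 103.33 > slow SMA 102.2)
```

A production version of this logic in C++ would additionally worry about latency, tick-data ingestion, and thread-safe state, which is where the responsibilities above come in.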
Desirable Skills:
- Familiarity with Design Patterns
- Ability to read/understand Python
- Familiarity with understanding, refactoring, debugging large C++ code repositories
- Familiarity with Git, Visual Studio, JIRA
Requirements:
- A B.Tech in Computer Science, Computer Engineering, or closely related field
- 3+ years of professional software development experience in C++
- Experience with large-scale, multi-threaded application development
- Experience designing APIs for consistency, simplicity, and extensibility
- Strong knowledge of object-oriented concepts and design
- Ability to develop new approaches to complex problems
- Extensive experience with best practices in software development (code reviews, test automation/unit testing, version control)
- Well-developed written and verbal communication skills
- Familiar with working in both Linux and Windows environments.
- Willing to work in an afternoon shift (2:00 PM to 10:30 PM IST) Monday to Friday
Perks & Benefits:
- Cutting-edge work equipment.
- Two-way transportation
- Health Insurance for self & family
- Term Insurance for self
- Health and wellness benefits
Potential Benefits:
- Profit-Sharing Plan
- Employee Stock Ownership Plan