
Job Title: Field Network Engineer
Department: IT
Location: Siddipet, Medak, Sangareddy, Vikarabad, Rangareddy, Narayanpet, Mahabubnagar, Wanaparthy, Nagarkurnool, Jogulamba Gadwal, Medchal-Malkajgiri, and Hyderabad.
Reports to: Senior Network Engineer
Days/Hours of work: 9 Hrs (Monday to Saturday)
Salary Range: 2.52 LPA to 2.76 LPA
Requirement: Bike and driving license (compulsory)
Summary of Position:
1. Resolve, repair, and install fiber optic systems and ensure they work properly.
2. Examine and replace faulty or aging fiber optic cables, handle fiber splicing, and rectify fiber optic problem areas.
3. Install Cisco routers and switches.
4. Install OLT and ONT devices.
5. Run fiber optic cabling.
Primary Responsibilities
- Cisco (Router & Switch)
- Network Devices (OLT & ONT)
- LAN and WAN experience
- Fiber Optics Cabling
Personal Specification
Education: Any graduate
Experience: 0-1 year
Product Knowledge: Cisco routers, Cisco switches, fiber cable
Competencies
Language: Hindi and English (written and oral).

About Jetking Technologies
Job Title : Cognos BI Developer
Experience : 6+ Years
Location : Bangalore / Hyderabad (Hybrid)
Notice Period : Immediate Joiners Preferred (Candidates serving notice with 10–15 days left can be considered)
Interview Mode : Virtual
Job Description :
We are seeking an experienced Cognos BI Developer with strong data modeling, dashboarding, and reporting expertise to join our growing team. The ideal candidate should have a solid background in business intelligence, data visualization, and performance analysis, and be comfortable working in a hybrid setup from Bangalore or Hyderabad.
Mandatory Skills :
Cognos BI, Framework Manager, Cognos Dashboarding, SQL, Data Modeling, Report Development (charts, lists, cross tabs, maps), ETL Concepts, KPIs, Drill-through, Macros, Prompts, Filters, Calculations.
Key Responsibilities :
- Understand business requirements in the BI context and design data models using Framework Manager to transform raw data into meaningful insights.
- Develop interactive dashboards and reports using Cognos Dashboard.
- Identify and define KPIs and create reports to monitor them effectively.
- Analyze data and present actionable insights to support business decision-making.
- Translate business requirements into technical specifications and determine timelines for execution.
- Design and develop models in Framework Manager, publish packages, manage security, and create reports based on these packages.
- Develop various types of reports, including charts, lists, cross tabs, and maps, and design dashboards combining multiple reports.
- Implement reports using macros, prompts, filters, and calculations.
- Perform data warehouse development activities and ensure seamless data flow.
- Write and optimize SQL queries to investigate data and resolve performance issues.
- Utilize Cognos features such as master-detail reports, drill-throughs, bookmarks, and page sets.
- Analyze and improve ETL processes to enhance data integration.
- Apply technical enhancements to existing BI systems to improve their performance and usability.
- Possess solid understanding of database fundamentals, including relational and multidimensional database design.
- Hands-on experience with Cognos Data Modules (data modeling) and dashboarding.
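The SQL tuning side of this role can be sketched outside Cognos itself. A minimal, hypothetical illustration using Python's built-in sqlite3 of the kind of KPI aggregate a cross-tab or drill-through report pushes to the database (table and column names are invented for the sketch):

```python
import sqlite3

# Hypothetical sales table; names are illustrative, not from any real schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("North", "A", 120.0), ("North", "B", 80.0),
     ("South", "A", 200.0), ("South", "B", 50.0)],
)
# An index on the grouping/filter column helps when reports drill by region.
conn.execute("CREATE INDEX idx_sales_region ON sales (region)")

# KPI: revenue per region, the sort of aggregate behind a cross-tab cell.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('North', 200.0), ('South', 250.0)]
```

In a real engagement the same query would live behind a Framework Manager package rather than raw SQL, but the indexing and aggregation concerns are identical.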
Java Full Stack Developer
We are solving complex technical problems in the financial industry and need talented software engineers to join our mission as part of a global software development team. As a Java Full Stack Developer, you will design, develop, and maintain enterprise applications using Java and Angular. You will collaborate with cross-functional teams to define, design, and ship new features, and you will work closely with architects to ensure the technical feasibility of designs and implement them accordingly.
Hiring : SFCC Release Analyst
Experience : 5+ yrs
Location : Remote
We are looking for an SFCC Release Analyst with 5+ years of experience in SFCC release management and data loads.
Senior Machine Learning Engineer
📍 Location: Remote
💼 Type: Full-Time
💰 Salary: $800 - $1,000 USD / month
Apply at: https://forms.gle/Fwti67UeTEkx2Kkn6
About Us
At Momenta, we're committed to creating a safer digital world by protecting individuals and businesses from voice-based fraud and scams. Through innovative AI technology and community collaboration, we're building a future where communication is secure and trustworthy.
Position Overview
We’re hiring a Senior Machine Learning Engineer with deep expertise in audio signal processing and neural network-based detection. The selected engineer will be responsible for delivering a production-grade, real-time deepfake detection pipeline as part of a time-sensitive, high-stakes 3-month pilot deployment.
Key Responsibilities
📌 Design and Deliver Core Detection Pipeline
Lead the development of a robust, modular deepfake detection pipeline capable of ingesting, processing, and classifying real-time audio streams with high accuracy and low latency. Architect the system to operate under telecom-grade conditions with configurable interfaces and scalable deployment strategies.
📌 Model Strategy, Development, and Optimization
Own the experimentation and refinement of state-of-the-art deep learning models for voice fraud detection. Evaluate multiple model families, benchmark performance across datasets, and strategically select or ensemble models that balance precision, robustness, and compute efficiency for real-world deployment.
📌 Latency-Conscious Production Readiness
Ensure the entire detection stack meets strict performance targets, including sub-20ms inference latency. Apply industry best practices in model compression, preprocessing optimization, and system-level integration to support high-throughput inference on both CPU and GPU environments.
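A sub-20 ms budget like this is usually verified against tail latency (p95), not the mean. As a stdlib-only sketch of such a harness, where the `infer` stub stands in for the real model's forward pass:

```python
import time
import statistics

def infer(frame):
    # Stand-in for the real model; here just a cheap computation on the frame.
    return sum(frame) / len(frame)

def benchmark(fn, sample, runs=200):
    """Return p50/p95 latency in milliseconds over `runs` invocations."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        fn(sample)
        timings.append((time.perf_counter() - start) * 1000.0)
    timings.sort()
    return {
        "p50": statistics.median(timings),
        "p95": timings[int(0.95 * len(timings)) - 1],
    }

stats = benchmark(infer, sample=[0.0] * 1600)  # ~100 ms of 16 kHz audio
print(f"p50={stats['p50']:.3f} ms, p95={stats['p95']:.3f} ms")
```

In production the same harness would wrap the actual model call and run on the target CPU/GPU hardware, since latency figures only mean anything on the deployment environment.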
📌 Evaluation Framework and Continuous Testing
Design and implement a comprehensive evaluation suite to validate model accuracy, false positive rates, and environmental robustness. Conduct rigorous testing across domains, including cross-corpus validation, telephony channel effects, adversarial scenarios, and environmental noise conditions.
📌 Deployment Engineering and API Integration
Deliver a fully containerized, production-ready inference service with REST/gRPC endpoints. Build CI/CD pipelines, integration tests, and monitoring hooks to ensure system integrity, traceability, and ease of deployment across environments.
Required Skills & Qualifications
🎯 Technical Skills:
ML Frameworks: PyTorch, TensorFlow, ONNX, OpenVINO, TorchScript
Audio Libraries: Librosa, Torchaudio, FFmpeg
Model Development: CNNs, Transformers, Wav2Vec/WavLM, AASIST, RawNet
Signal Processing: VAD, noise reduction, band-pass filtering, codec simulation
Optimization: Quantization, pruning, GPU acceleration
DevOps: Git, Docker, CI/CD, FastAPI or Flask, REST/gRPC
🎯 Preferred Experience:
Prior work on audio deepfake detection or telephony speech processing
Experience with real-time ML model deployment
Understanding of adversarial robustness and domain adaptation
Familiarity with call center environments or telecom-grade constraints
Compensation & Career Path:
Competitive pay based on experience and capability. ($800 - $1,000 USD / month)
Full-time with potential for conversion to a core team role.
Opportunity to lead future research and production deployments as part of our AI division.
Why Join Momenta?
Solve a global security crisis with cutting-edge AI.
Own a deliverable that will ship into production at scale.
Join a fast-growing team with seasoned founders and engineers.
Fully remote, high-autonomy environment focused on deep work.
🚀 Apply now and help shape the future of voice security
Data Engineer
Mandatory Requirements
- Experience in AWS Glue
- Experience in Apache Parquet
- Proficient in AWS S3 and data lake
- Knowledge of Snowflake
- Understanding of file-based ingestion best practices.
- Scripting language - Python & pyspark
CORE RESPONSIBILITIES
- Create and manage cloud resources in AWS
- Ingest data from sources that expose it through different technologies (RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems), implementing ingestion and processing with Big Data technologies
- Process and transform data using technologies such as Spark and cloud services; understand your part of the business logic and implement it in the language supported by the base data platform
- Develop automated data quality checks to ensure the right data enters the platform and to verify calculation results
- Develop infrastructure to collect, transform, combine, and publish/distribute customer data
- Define process improvement opportunities to optimize data collection, insights, and displays
- Ensure data and results are accessible, scalable, efficient, accurate, complete, and flexible
- Identify and interpret trends and patterns in complex data sets
- Construct a framework using data visualization tools and techniques to present consolidated, actionable analytical results to relevant stakeholders
- Participate actively in regular Scrum ceremonies with the agile teams
- Develop queries, write reports, and present findings proficiently
- Mentor junior members and bring in industry best practices
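The automated quality checks mentioned above often reduce to row-level rules applied before data lands in the lake. A stdlib-only sketch (the column names and rules are invented for illustration; a real feed would wire this into the Glue/Spark ingestion job):

```python
from datetime import date

# Hypothetical rules for a consumer-finance ingestion feed; names are illustrative.
RULES = {
    "loan_id": lambda v: isinstance(v, str) and v != "",
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
    "origination_date": lambda v: isinstance(v, date),
}

def validate(rows):
    """Split rows into (valid, rejected); rejected rows carry their failed columns."""
    valid, rejected = [], []
    for row in rows:
        failures = [col for col, rule in RULES.items() if not rule(row.get(col))]
        if failures:
            rejected.append((row, failures))
        else:
            valid.append(row)
    return valid, rejected

good, bad = validate([
    {"loan_id": "L-1", "amount": 2500.0, "origination_date": date(2024, 1, 5)},
    {"loan_id": "", "amount": -10, "origination_date": "2024-01-05"},
])
print(len(good), len(bad))  # 1 1
```

Rejected rows would typically be quarantined to a separate S3 prefix with their failure reasons, so they can be inspected and replayed rather than silently dropped.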
QUALIFICATIONS
- 5-7+ years' experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales)
- Strong background in math, statistics, computer science, data science or related discipline
- Advanced knowledge of at least one language: Java, Scala, Python, C#
- Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake
- Proficient with:
  - Data mining/programming tools (e.g. SAS, SQL, R, Python)
  - Database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum)
  - Data visualization (e.g. Tableau, Looker, MicroStrategy)
- Comfortable learning about and deploying new technologies and tools.
- Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines.
- Good written and oral communication skills and ability to present results to non-technical audiences
- Knowledge of business intelligence and analytical tools, technologies and techniques.
Familiarity and experience in the following is a plus:
- AWS certification
- Spark Streaming
- Kafka Streaming / Kafka Connect
- ELK Stack
- Cassandra / MongoDB
- CI/CD: Jenkins, GitLab, Jira, Confluence, and other related tools
At Holofy (https://www.holofy.io/), we are building an incredible product, and to do that we need incredible people.
If you have ambition and drive and want to own your roadmap in a company that knows how to innovate and scale, we want to talk to you.
We work in small, self-managing, collaborative and agile/Kanban teams and are looking for great technology developers who can work in the same.
Location - Remote/WFH
Pre-requisites:
- 3-5 years of experience in building robust Android applications
- Have knowledge of publishing apps on the Play Store
- Excellent command over Data Structures and Algorithms.
- Strong knowledge of Android SDK, different versions of Android, and how to deal with different screen sizes
- Familiarity with RESTful APIs to connect Android applications to back-end services
- Strong knowledge of Android UI design principles, patterns, and best practices
- Experience with offline storage, threading, and performance tuning
- Knowledge of Kotlin, Retrofit, OkHttp, Glide, ExoPlayer, Android Jetpack, and Kotlin coroutines
- Have an understanding of OOP, different architectural patterns (esp. MVVM), and their testability
- Knowledge of multi-module architecture and product flavours
- Knowledge of the open-source Android ecosystem and the libraries available for common tasks
- Familiarity with in-app purchases and the Google Play Billing API
- Ability to understand business requirements and translate them into technical requirements
- Familiarity with cloud message APIs and push notifications
- A knack for benchmarking and optimisation
- Understanding of Google’s Android design principles and interface guidelines
- Proficient understanding of code versioning tools, such as Git
- Familiarity with continuous integration
Responsibilities:
- Translate designs and wireframes into high-quality code.
- Design, build, and maintain high-performance, reusable, and reliable Kotlin code.
- Ensure the best possible performance, quality, and responsiveness of the application.
- Identify and correct bottlenecks and fix bugs.
- Help maintain code quality, organisation, and automation.
- Take responsibility for a feature and ensure its completion.
What else we can offer:
- Limitless growth and encouragement to be innovative and challenge the status quo.
- Exceptional compensation & benefits and performance-based recognition & rewards.
- Open door policy and flexible working hours and Medical coverage
About ProjectDiscovery
ProjectDiscovery is an open-source powered cyber security company with a mission to democratize security. With one of the largest open-source security communities in the world, we host contributions from security researchers and engineers to our 20+ open-source projects, including tools like Nuclei and httpx, which have earned us over 100k GitHub stars and millions of downloads.
We’re a passionate, globally distributed team of ~35, driven by the shared mission of revolutionizing the application security landscape. Backed by $25M in funding, we’re looking for talented individuals to join us in our Jaipur office.
Learn more at:
🌐 ProjectDiscovery.io
📂 ProjectDiscovery GitHub
About the Role
As a Product Frontend Engineer at ProjectDiscovery, you'll craft snappy, responsive, and scalable user interfaces. Working closely with our design, engineering, and founders, you'll create intuitive tools that empower both enterprise users and the thousands of individual users at scale.
In this role, you will:
- Build new user-facing features with beautiful and scalable UI components
- Implement complex React components, such as virtualized rendering of large data tables with filters, real-time streaming, pagination, and grouping
- Create reusable and scalable React components (e.g. shadcn/ui) for dynamic interfaces
- Act like an owner, actively refining the front-end experience and performance
- Work closely with founders and design engineers to implement new features
- Improve application performance through profiling
Why Join Us?
- Competitive compensation package and stock options. (with extended exercise terms)
- Inclusive Healthcare Package.
- Learn and grow - we provide mentorship and send you to events that help you build your network and technical skills.
- Learn with intense innovation and software shipping cycles. We ship multiple times a week and push major releases a couple of times a month.
Our Interview Process
We value efficiency and technical excellence in our hiring process:
- Application Review: Your application is reviewed by a technical team member.
- Initial Screening: A short call to understand your background, goals, and fit.
- Coding Assessment: Solve challenges using our tech stack.
- Create a PR: Develop or enhance a feature related to one of our open-source tools.
- Final Round: Showcase your work, share your vision, and discuss how you can contribute to ProjectDiscovery at our office in Jaipur.
Apply Now to join ProjectDiscovery and help create exceptional user experiences while shaping the future of application security.
- Conducting advanced statistical analysis to provide actionable insights, identify trends, and measure performance
- Performing data exploration, cleaning, preparation, and feature engineering, in addition to executing tasks such as building POCs and validation/A-B testing
- Collaborating with data engineers and architects to implement and deploy scalable solutions
- Communicating results to diverse audiences with effective writing and visualizations
- Identifying and executing high-impact projects, triaging external requests, and ensuring timely completion so that results remain useful
- Providing thought leadership by researching best practices, conducting experiments, and collaborating with industry leaders
What you need to have:
- 2-4 years of experience with machine learning algorithms, predictive analytics, and demand forecasting in real-world projects
- Strong statistical background in descriptive and inferential statistics, regression, and forecasting techniques
- Strong programming background in Python (including packages like TensorFlow), R, D3.js, Tableau, Spark, SQL, and MongoDB
- Preferred: exposure to optimization and meta-heuristic algorithms and related applications
- Background in a highly quantitative field such as Data Science, Computer Science, Statistics, Applied Mathematics, Operations Research, Industrial Engineering, or similar
- 2-4 years of experience in data science algorithm design and implementation, and data analysis across different applied problems
- DS mandatory skills: Python, R, SQL, deep learning, predictive analytics, applied statistics
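As a toy illustration of the demand-forecasting side of the role, single exponential smoothing can be written in a few lines of plain Python (the series and smoothing factor are made up for the sketch):

```python
def exponential_smoothing(series, alpha=0.5):
    """Single exponential smoothing: each level blends the newest observation
    with the previous level; the final level is the one-step-ahead forecast."""
    level = series[0]
    for value in series[1:]:
        level = alpha * value + (1 - alpha) * level
    return level

demand = [100, 110, 105, 115, 120]  # illustrative demand history
forecast = exponential_smoothing(demand, alpha=0.5)
print(round(forecast, 2))  # 115.0
```

In practice one would tune `alpha` on a holdout window and compare against richer models (Holt-Winters, ARIMA, gradient-boosted regressors), but the smoothing recurrence is the common starting point.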
- Handling recruitment functions.
- Sourcing and hiring candidates through job portals such as Naukri and Monster.
- Shortlisting resumes from job portals and references.
- Handling joining formalities.
- Lining up candidates for interviews.
- Performing reference checks for selected employees.
- Coordinating with candidates and the technical panel to schedule interviews.
