
About Ixsight Technologies Pvt Ltd
Key Responsibilities
1. Kernel Lifecycle & Maintenance
Upstream Alignment: Lead the strategy for upgrading enterprise kernels (e.g., migrating from LTS 5.15 to 6.6) while maintaining binary compatibility where required.
Patch Porting: Expertly port functional and performance patches between disparate kernel versions, resolving complex code conflicts and API changes.
CVE Mitigation: Monitor the Linux Kernel Mailing List (LKML) and security advisories to identify and backport CVE patches from upstream to production environments.
2. Deep-Dive Debugging & Stability
Panic Analysis: Act as the final escalation point for Kernel Panics and "Oops" messages. Utilize kdump, crash, and gdb to perform post-mortem analysis of vmcores.
Boot-Time Resolution: Debug critical failures during the early boot process (UEFI handoff, initramfs, and early kernel init) where standard logging is unavailable.
Performance Tuning: Use eBPF, ftrace, and perf to identify bottlenecks in memory management, scheduler latency, or I/O throughput.
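As an illustrative toy for the panic-triage workflow above (the Oops text and driver name below are invented, not from a real crash), a small Python helper can pull the faulting function and call trace out of an Oops message before deeper vmcore analysis in crash/gdb:

```python
import re

# Hypothetical Oops excerpt for illustration only.
OOPS = """\
BUG: kernel NULL pointer dereference, address: 0000000000000008
RIP: 0010:my_driver_read+0x1a/0x90 [my_driver]
Call Trace:
 vfs_read+0xb5/0x1a0
 ksys_read+0x5f/0xe0
 do_syscall_64+0x3b/0x90
"""

def parse_oops(text):
    """Extract the faulting symbol and the call-trace function names."""
    rip = re.search(r"RIP: \S+:(\S+)", text)
    trace = re.findall(r"^\s+(\S+)\+0x", text, re.MULTILINE)
    return {"faulting": rip.group(1) if rip else None, "trace": trace}

report = parse_oops(OOPS)
print(report["faulting"])  # my_driver_read+0x1a/0x90
print(report["trace"])     # ['vfs_read', 'ksys_read', 'do_syscall_64']
```

A real triage script would read the log from `dmesg` or a serial console capture rather than an inline string.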
3. Driver Development & Hardware Integration
Driver Ownership: Design, develop, or maintain at least one Open Source or Proprietary Device Driver (Network, Storage, GPU, or Character devices).
Hardware Abstraction: Interface directly with hardware registers, managing DMA mappings, and optimizing interrupt handling (MSI-X, Threaded IRQs).
Out-of-Tree Management: Maintain driver compatibility across kernel updates using DKMS or similar frameworks.
4. Infrastructure & Automation
Registry Management: Oversee the distribution of custom kernel builds and modules via GitLab Container/Package Registries.
CI/CD for Kernel: Build automated testing pipelines (Hardware-in-the-loop) to validate kernel stability before enterprise-wide deployment.
Required Technical Skills:
Languages: Mastery of C/C++ Programming (C is preferred)
Kernel Internals: Deep understanding of VFS, Memory Management (MMU/Paging), Process Scheduling, and Networking Stacks.
Debugging Tools: Expert-level use of kprobes, trace-cmd, valgrind, and hardware-level debuggers (JTAG/Serial Console).
Build Systems: Proficiency with Kbuild, Makefiles, and building RPM/Debian packages for kernel distribution.
Security: Hands-on experience with SELinux/AppArmor policy development and kernel hardening (FIPS, KSPP).
Job Title: Data Engineer
Experience: 4–14 Years
Work Mode: Remote
Employment Type: Full-Time
Position Overview:
We are looking for highly experienced Senior Data Engineers to design, architect, and lead scalable, cloud-based data platforms on AWS. The role involves building enterprise-grade data pipelines, modernizing legacy systems, and developing high-performance scoring engines and analytics solutions, collaborating closely with architecture, analytics, risk, and business teams to deliver secure, reliable, and scalable data solutions.
Key Responsibilities:
· Design and build scalable data pipelines for financial and customer data
· Build and optimize scoring engines (credit, risk, fraud, customer scoring)
· Design, develop, and optimize complex ETL/ELT pipelines (batch & real-time)
· Ensure data quality, governance, reliability, and compliance standards
· Optimize large-scale data processing using SQL, Spark/PySpark, and cloud technologies
· Lead cloud data architecture, cost optimization, and performance tuning initiatives
· Collaborate with Data Science, Analytics, and Product teams to deliver business-ready datasets
· Mentor junior engineers and establish best practices for data engineering
Key Requirements:
· Strong programming skills in Python and advanced SQL
· Experience building scalable scoring or rule-based decision engines
· Hands-on experience with Big Data technologies (Spark/PySpark/Kafka)
· Strong expertise in designing ETL/ELT pipelines and data modeling
· Experience with cloud platforms (AWS/Azure) and modern data architectures
· Solid understanding of data warehousing, data lakes, and performance tuning
· Knowledge of CI/CD, version control (Git), and production support best practices
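The rule-based scoring-engine requirement above can be illustrated with a minimal Python sketch; the rule names, weights, and thresholds here are invented for the example, not taken from any real scoring model:

```python
# Minimal rule-based scoring engine: each rule inspects a record and,
# if it fires, contributes a weighted adjustment to a base score.
RULES = [
    ("high_utilization", lambda r: r["credit_utilization"] > 0.8, -50),
    ("recent_default",   lambda r: r["defaults_12m"] > 0,        -120),
    ("long_history",     lambda r: r["history_years"] >= 5,       +40),
]

def score(record, base=600):
    """Apply every matching rule and report which rules fired."""
    fired = [name for name, pred, _ in RULES if pred(record)]
    total = base + sum(w for name, pred, w in RULES if pred(record))
    return total, fired

applicant = {"credit_utilization": 0.9, "defaults_12m": 0, "history_years": 7}
print(score(applicant))  # (590, ['high_utilization', 'long_history'])
```

A production engine would load rules from versioned configuration and run them over Spark/PySpark rather than a Python list.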
We are looking for a Unity or Unity3D Developer to join our team! As a Unity or Unity3D Developer at our company, you will be responsible for implementing 3D Virtual Tours for Real Estate projects, helping translate design ideas, concepts, and requirements into a functional and engaging VR experience. Development platforms include Android, iOS, and Oculus Rift.
Roles and Responsibilities
- Plan and implement virtual tour functionality
- Transform design specification into functional tours
- Communicate with other team members to create a seamless experience.
- Design, build, and maintain efficient, reusable, and reliable code
- Ensure the best performance, quality, and responsiveness of applications and tours
- Identify process and application bottlenecks and bugs
Desired Candidate Profile
Key Skills: Unity3D, C#, C++, and JavaScript
- Extensive knowledge of Unity, as well as extensive experience as a game/real estate tour programmer, using C# on a major development platform such as Microsoft Visual Studio (or equivalent).
- Extensive knowledge of 3D computer graphics, including use of the DirectX SDK, OpenGL, and shader programming languages, and optimizing 3D performance in games/tours on both high-end and low-end hardware.
- Experience with UI design & development.
- Experience with scripting, textures, animation, GUI styles, and user session management
- Native iOS & Android app development experience a plus
Company Overview:
An 8-year-old IT services and consulting company based in Hyderabad that helps clients maximize product value while delivering rapid incremental innovation. The company has extensive SaaS M&A experience, including 20+ closed transactions on both the buy and sell sides, has over 100 employees, and is looking to grow the team.
Location: Hyderabad and Bangalore (Remote after 3 months WFO)
Budget: 25-45LPA
Position: Senior Full-Stack .NET Developer with React
Experience: 8+ years of commercial experience
Mandatory skills: C#, Asp.Net MVC and WebAPI, React, OOPs
Role: Individual Contributor
Interview Process:
Offline Tech Test
Customer Technical Round
CTO Discussion
Job Title:
Telephony Engineer
Job Description:
- Support the operations team with debugging issues related to the telephony platform
- Identify issues with calls using monitoring and analysis tools such as VoIPMonitor
- Write scripts to automate tasks, monitor the services and functionalities
- Contribute to the improvement of the system by providing ideas
- Help with capacity planning
- Perform the scheduled maintenance activities
- Modify the existing code to accommodate new features
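The script-based monitoring duties above can be sketched in Python. The snippet below only builds a SIP OPTIONS keepalive request (the standard lightweight way to health-check a SIP server); host names and users are placeholders, and the actual UDP send/receive is left to the real monitoring script:

```python
import uuid

def build_options(target_host, from_user="monitor", local_host="probe.local"):
    """Build a minimal SIP OPTIONS request for health-checking a SIP server.
    Hosts and the user name are illustrative placeholders."""
    call_id = uuid.uuid4().hex
    lines = [
        f"OPTIONS sip:{target_host} SIP/2.0",
        f"Via: SIP/2.0/UDP {local_host};branch=z9hG4bK{call_id[:8]}",
        f"From: <sip:{from_user}@{local_host}>;tag={call_id[:6]}",
        f"To: <sip:{target_host}>",
        f"Call-ID: {call_id}@{local_host}",
        "CSeq: 1 OPTIONS",
        "Max-Forwards: 70",
        "Content-Length: 0",
        "", "",  # blank line terminates the headers
    ]
    return "\r\n".join(lines)

msg = build_options("sip.example.com")
print(msg.splitlines()[0])  # OPTIONS sip:sip.example.com SIP/2.0
```

A monitoring script would send this over UDP and alert if no `200 OK` arrives within a timeout.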
Experience Range:
3–6 years
Educational Qualifications:
Any graduation
Requirements:
• Knowledge of open-source technologies: VoIP, SIP, WebRTC, etc.
• Shell scripting.
• Experience with Asterisk and Kamailio or OpenSIPS.
• C/C++ or JS programming on Linux.
• 5–8 years of experience.
• Good communication skills.
• Willingness to work at night, when required, to handle support-related issues.
Skills Required:
VoIP, OpenSIPS, SIP, Shell Scripting, Asterisk, Kamailio, C++, Linux, Development
You will be responsible for designing, building, and maintaining data pipelines that handle real-world data (RWD) at Compile. You will be handling both inbound and outbound data deliveries at Compile for datasets including Claims, Remittances, EHR, SDOH, etc.
You will
- Work on building and maintaining data pipelines (specifically RWD).
- Build, enhance, and maintain existing pipelines in PySpark and Python, and help build analytical insights and datasets.
- Schedule and maintain pipeline jobs for RWD.
- Develop, test, and implement data solutions based on the design.
- Design and implement quality checks on existing and new data pipelines.
- Ensure adherence to security and compliance that is required for the products.
- Maintain relationships with various data vendors and track changes and issues across vendors and deliveries.
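The quality-check responsibility above could look like the following minimal sketch; the column names and threshold are invented for illustration, and a real pipeline would typically run equivalent checks in Spark or Deequ:

```python
def run_quality_checks(rows, required_columns, max_null_ratio=0.05):
    """Run simple completeness checks on a batch of records.
    Returns a dict mapping check name -> pass (True) / fail (False)."""
    results = {"non_empty": len(rows) > 0}
    for col in required_columns:
        nulls = sum(1 for r in rows if r.get(col) is None)
        results[f"{col}_null_ratio"] = (nulls / max(len(rows), 1)) <= max_null_ratio
    return results

# Toy claims batch with one missing provider identifier.
claims = [
    {"claim_id": "c1", "npi": "123", "amount": 10.0},
    {"claim_id": "c2", "npi": None,  "amount": 25.5},
]
checks = run_quality_checks(claims, ["claim_id", "npi"])
print(checks)  # {'non_empty': True, 'claim_id_null_ratio': True, 'npi_null_ratio': False}
```

Failed checks would typically block the delivery and page the on-call engineer rather than just print.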
You have
- Hands-on experience with ETL process (min of 5 years).
- Excellent communication skills and ability to work with multiple vendors.
- High proficiency with Spark, SQL.
- Proficiency in Data modeling, validation, quality check, and data engineering concepts.
- Experience working with big-data processing technologies such as Databricks, dbt, S3, Delta Lake, Deequ, Griffin, Snowflake, and BigQuery.
- Familiarity with version control technologies, and CI/CD systems.
- Understanding of scheduling tools like Airflow/Prefect.
- Min of 3 years of experience managing data warehouses.
- Familiarity with healthcare datasets is a plus.
Compile embraces diversity and equal opportunity in a serious way. We are committed to building a team of people from many backgrounds, perspectives, and skills. We know the more inclusive we are, the better our work will be.
Responsibilities
- Join the product research team to develop the leading Geodata and GeoAI platform in Hong Kong
- Be part of the product team developing the web front-end interface, with a focus on data and AI applications
- Develop automated software for the Arical data and AI pipeline
- On-the-job training for a rapid product development cycle
- Deliver reliable and maintainable code
- Apply analytical skills effectively for hands-on problem solving
- Translate UI/UX design wireframes to actual code
- Work with UI/UX designer
Requirements
- Degree or above in Computer Science, Information Systems, or related fields
- Coursework in programming, data science, machine learning, AI and databases
- Strong organizational and project management skills
- Proficiency with client-side scripting and database technology
Technical skills
- Knowledge of one or more system programming languages (Python, C++, Java, etc)
- Knowledge of MongoDB and other NoSQL data stores such as Redis
- Proficiency with HTML, CSS, JavaScript, cross-browser optimization
- Proficiency with one or more front-end web development frameworks (Bootstrap, Angular, React)
- Knowledge of one or more web-scripting languages (PHP, Ruby, Node.js, etc.)
- Knowledge of web-scraping libraries (Beautiful Soup, WebDriver, etc.)
- Knowledge of API packaging and Docker deployment
- Knowledge of app development
- Knowledge of UI/UX design
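The web-scraping skill above is easy to sketch. This example uses only Python's standard-library html.parser (rather than Beautiful Soup, which the posting names) and runs on an inline HTML snippet instead of a live page:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every anchor tag in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

# Inline sample standing in for a fetched page.
html = '<ul><li><a href="/map">Map</a></li><li><a href="/data">Data</a></li></ul>'
parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # ['/map', '/data']
```

Beautiful Soup offers the same extraction with less code (`soup.find_all("a")`), at the cost of an extra dependency.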
Details
- Work from home
Responsibilities
- Build and mentor the computer vision team at TransPacks
- Drive the productionization of algorithms (to industrial grade) developed through hard-core research
- Own the design, development, testing, deployment, and craftsmanship of the team’s infrastructure and systems capable of handling massive amounts of requests with high reliability and scalability
- Leverage the deep and broad technical expertise to mentor engineers and provide leadership on resolving complex technology issues
- Entrepreneurial and out-of-box thinking essential for a technology startup
- Guide the team for unit-test code for robustness, including edge cases, usability, and general reliability
Eligibility
- B.Tech in Computer Science and Engineering/Electronics/Electrical Engineering, with demonstrated interest in Image Processing/Computer Vision (courses, projects, etc.) and 6-8 years of experience
- M.Tech in Computer Science and Engineering/Electronics/Electrical Engineering, with demonstrated interest in Image Processing/Computer Vision (thesis work) and 4-7 years of experience
- Ph.D. in Computer Science and Engineering/Electronics/Electrical Engineering, with demonstrated interest in Image Processing/Computer Vision (Ph.D. dissertation) and an inclination to work in industry to provide innovative solutions to practical problems
Requirements
- In-depth understanding of image processing algorithms, pattern recognition methods, and rule-based classifiers
- Experience in feature extraction, object recognition and tracking, image registration, noise reduction, image calibration, and correction
- Ability to understand, optimize and debug imaging algorithms
- Understanding of and experience with the OpenCV library
- Fundamental understanding of mathematical techniques involved in ML and DL schemas (Instance-based methods, Boosting methods, PGM, Neural Networks etc.)
- Thorough understanding of state-of-the-art DL concepts (Sequence modeling, Attention, Convolution etc.) along with knack to imagine new schemas that work for the given data.
- Understanding of engineering principles and a clear understanding of data structures and algorithms
- Experience in writing production level codes using either C++ or Java
- Experience with technologies/libraries such as Python pandas, NumPy, and SciPy
- Experience with TensorFlow and scikit.
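As a small illustration of the noise-reduction requirement above, here is a 3x3 mean (box) filter written in plain NumPy so it runs standalone; a production system would use OpenCV's equivalent (e.g. `cv2.blur`) instead:

```python
import numpy as np

def mean_filter3x3(img):
    """Denoise a 2-D grayscale image with a 3x3 box (mean) filter,
    using edge padding so the output keeps the input shape."""
    padded = np.pad(img.astype(float), 1, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += padded[1 + dy : 1 + dy + img.shape[0],
                          1 + dx : 1 + dx + img.shape[1]]
    return out / 9.0

# Single-pixel noise spike in an otherwise flat image.
noisy = np.array([[0, 0, 0],
                  [0, 9, 0],
                  [0, 0, 0]], dtype=float)
print(mean_filter3x3(noisy)[1, 1])  # 1.0 (the spike is averaged away)
```

Box filtering blurs edges along with noise, which is why practical pipelines often prefer Gaussian or median filters.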
We are looking for an experienced, hands-on Technical Architect to lead our Video Analytics & Surveillance product.
• The ideal candidate has worked on large-scale video platforms (YouTube, Netflix, Hotstar, etc.) or surveillance software
• As a Technical Architect, you are hands-on and a top contributor to product development
• You will lead teams on time-sensitive projects
Skills Required:
• Expert-level Python programming skills are a MUST
• Hands-on experience with Deep Learning & Machine Learning projects is a MUST
• Experience in the design and development of products
• Review code and mentor the team in improving the quality and efficiency of delivery
• Ability to troubleshoot and address complex technical problems
• A quick learner with the ability to adapt to increasing customer demands
• Hands-on experience designing and deploying large-scale Docker and Kubernetes environments
• Can lead a technically strong team in sharpening the product further
• Strong design capability with microservices-based architecture and awareness of its pitfalls
• Should have worked on large-scale data processing systems
• Good understanding of DevOps processes
• Familiarity with identity management, authorization & authentication frameworks
• Very strong software design, enterprise networking, and advanced problem-solving skills
• Experience writing technical architecture documents
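The identity-management bullet can be illustrated with a minimal HMAC-signed token, in the spirit of (but far simpler than) a real JWT library; the secret and claims are placeholders, and a production system would use a vetted framework instead:

```python
import base64, hashlib, hmac, json

SECRET = b"demo-secret"  # placeholder; real systems load this from a secret store

def sign(claims):
    """Serialize claims and append an HMAC-SHA256 signature."""
    body = base64.urlsafe_b64encode(json.dumps(claims, sort_keys=True).encode())
    sig = hmac.new(SECRET, body, hashlib.sha256).hexdigest().encode()
    return body + b"." + sig

def verify(token):
    """Return the claims if the signature checks out, else None."""
    body, _, sig = token.rpartition(b".")
    expected = hmac.new(SECRET, body, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(sig, expected):
        return None
    return json.loads(base64.urlsafe_b64decode(body))

token = sign({"sub": "camera-42", "role": "viewer"})
print(verify(token))  # {'role': 'viewer', 'sub': 'camera-42'}
```

Using `hmac.compare_digest` rather than `==` avoids timing side channels when comparing signatures.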