

TechSkillio
Jobs at TechSkillio
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
Job Description
The Java Software Engineer will design, develop, and maintain scalable, high-performance backend systems using Java and related technologies. This includes building APIs, integrating with databases and external services, and ensuring system reliability, security, and maintainability.
Must Have Skills:
- Programming Languages & Frameworks (Java 18+, Spring Boot)
- API Development (RESTful APIs, OpenAPI/Swagger)
- Any Database
- CI/CD Pipelines (Jenkins, GitLab CI/CD)
- Cloud Platforms (Any)
- Unix
- Testing Frameworks (JUnit, TestNG, Mockito, JBehave)
Good to have Skills:
- Messaging & Integration (Kafka, REST)
- Azure Open AI, RAG-based Architecture, AI Agents, Agentic AI, Spring Integration
- Monitoring & Logging (Prometheus, Splunk)
- Security & Authentication (OAuth2, JWT, Spring Security)
Qualification:
- Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related technical discipline.
- Ability to work independently and to adapt to a fast-changing environment.
- Creative, self-disciplined, and capable of identifying and completing critical tasks independently and with a sense of urgency.
Note: Prior experience on a banking project is required.
Job Description
Overview:
We are seeking an experienced Azure Data Engineer to join our team in a hybrid Developer/Support capacity. This role focuses on enhancing and supporting existing Data & Analytics solutions by leveraging Azure Data Engineering technologies. The engineer will work on developing, maintaining, and deploying IT products and solutions that serve various business users, with a strong emphasis on performance, scalability, and reliability.
Must-Have Skills:
Azure Databricks
PySpark
Azure Synapse Analytics
Key Responsibilities:
- Incident classification and prioritization
- Log analysis and trend identification
- Coordination with Subject Matter Experts (SMEs)
- Escalation of unresolved or complex issues
- Root cause analysis and permanent resolution implementation
- Stakeholder communication and status updates
- Resolution of complex and major incidents
- Code reviews (two per individual per week) to ensure adherence to standards and optimize performance
- Bug fixing of recurring or critical issues identified during operations
- Gold layer tasks, including enhancements and performance tuning.
- Design, develop, and support data pipelines and solutions using Azure data engineering services.
- Implement data flow and ETL techniques leveraging Azure Data Factory, Databricks, and Synapse.
- Cleanse, transform, and enrich datasets using Databricks notebooks and PySpark.
- Orchestrate and automate workflows across services and systems.
- Collaborate with business and technical teams to deliver robust and scalable data solutions.
- Work in a support role to resolve incidents, handle change/service requests, and monitor performance.
- Contribute to CI/CD pipeline implementation using Azure DevOps.
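The "log analysis and trend identification" responsibility above can be sketched in plain Python. The log format, source names, and error categories here are hypothetical; in practice this logic would run over exports from the platform's monitoring stack:

```python
import re
from collections import Counter

# Hypothetical log line format: "<date> <time> <LEVEL> <source>: <message>"
LOG_PATTERN = re.compile(r"^\S+ \S+ (?P<level>\w+) (?P<source>\w+): (?P<message>.*)$")

def error_trends(log_lines):
    """Count ERROR occurrences per source to surface recurring failures."""
    counts = Counter()
    for line in log_lines:
        m = LOG_PATTERN.match(line)
        if m and m.group("level") == "ERROR":
            counts[m.group("source")] += 1
    return counts.most_common()

sample = [
    "2024-01-15 02:13:44 ERROR IngestPipeline: timeout reading source",
    "2024-01-15 02:14:02 INFO IngestPipeline: retrying",
    "2024-01-15 02:15:10 ERROR IngestPipeline: timeout reading source",
    "2024-01-15 03:01:00 ERROR GoldLayerJob: schema mismatch",
]
print(error_trends(sample))  # → [('IngestPipeline', 2), ('GoldLayerJob', 1)]
```

Ranking sources by error count is one simple way to decide which recurring issues warrant root cause analysis rather than repeated incident fixes.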
Technical Requirements:
- 4 to 6 years of experience in IT and Azure data engineering technologies.
- Strong experience in Azure Databricks, Azure Synapse, and ADLS Gen2.
- Proficient in Python, PySpark, and SQL.
- Experience with file formats such as JSON and Parquet.
- Working knowledge of database systems, with a preference for Teradata and Snowflake.
- Hands-on experience with Azure DevOps and CI/CD pipeline deployments.
- Understanding of Data Warehousing concepts and data modeling best practices.
- Familiarity with SNOW (ServiceNow) for incident and change management.
Non-Technical Requirements:
- Ability to work independently and collaboratively in virtual teams across geographies.
- Strong analytical and problem-solving skills.
- Experience in Agile development practices, including estimation, testing, and deployment.
- Effective task and time management with the ability to prioritize under pressure.
- Clear communication and documentation skills for project updates and technical processes.
Technologies:
- Azure Data Factory
- Azure Databricks
- Azure Synapse Analytics
- PySpark / SQL
- Azure Data Lake Storage (ADLS), Blob Storage
- Azure DevOps (CI/CD pipelines)
Nice-to-Have:
- Experience with Business Intelligence tools, preferably Power BI
- DP-203 certification (Azure Data Engineer Associate)
Note:
Weekly rotational shifts:
- 11 am to 8 pm
- 2 pm to 11 pm
- 5 pm to 2 am
P.S. Engineers should be available on call on any one weekend, working only on issues that arise during that weekend; on-call support is required once a month.
Role Overview:
We are seeking skilled Backend Developers to design, build, and maintain efficient, scalable, and secure server-side logic and services. The ideal candidate will have strong expertise in Python, Flask, and Google Cloud Platform (GCP), with experience building APIs, handling databases, and integrating cloud services in production environments.
Required Experience: 4+ Years
Location: Chennai; open to remote for strong candidates
Key Responsibilities:
- Collaborate with project teams to understand business requirements and develop efficient, high-quality code.
- Design and implement low-latency, high-availability, performant applications using frameworks such as Flask or FastAPI.
- Integrate multiple data sources and databases into a unified system while ensuring seamless data storage and third-party library/package integration.
- Create scalable and optimized database schemas to support complex business logic and manage large volumes of data.
- Conduct thorough testing using pytest and unittest, debugging applications to ensure they run smoothly.
Required Skills & Qualifications:
- 3+ years of experience as a Python developer with strong communication skills.
- Bachelor's degree in Computer Science, Software Engineering or a related field.
- In-depth knowledge of Python frameworks such as Flask or FastAPI.
- Strong expertise in cloud technologies, GCP preferred.
- Deep understanding of microservices architecture, multi-tenant architecture, and best practices in Python development.
- Familiarity with serverless architecture and frameworks like GCP Cloud Functions.
- Experience with deployment using Docker, Nginx, Gunicorn.
- Hands-on experience with SQL and NoSQL databases such as MySQL and Firebase.
- Proficiency with Object Relational Mappers (ORMs) like SQLAlchemy.
- Demonstrated ability to handle multiple API integrations and write modular, reusable code.
- Strong knowledge of user authentication and authorization mechanisms across multiple systems and environments.
- Familiarity with scalable application design principles and event-driven programming in Python.
- Solid experience in unit testing, debugging, and code optimization.
- Hands-on experience with modern software development methodologies, including Agile and Scrum.
- Experience with CI/CD pipelines and automation tools like Jenkins, GitLab CI, or CircleCI.
- Experience with version control systems such as Git.
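The authentication and authorization item above can be illustrated with a minimal HMAC-signed token in the JWT style, using only the standard library. The secret and claims are made up, and a production system would use a vetted library rather than hand-rolling this:

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWT uses URL-safe base64 without padding
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_token(claims: dict, secret: bytes) -> str:
    """Build a compact JWS (header.payload.signature) with HS256."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = hmac.new(secret, signing_input, hashlib.sha256).digest()
    return f"{header}.{payload}.{b64url(sig)}"

def verify_token(token: str, secret: bytes) -> bool:
    """Recompute the signature and compare in constant time."""
    header, payload, sig = token.split(".")
    signing_input = f"{header}.{payload}".encode()
    expected = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return hmac.compare_digest(expected, sig)

token = sign_token({"sub": "user-42", "role": "admin"}, b"demo-secret")
print(verify_token(token, b"demo-secret"))   # True
print(verify_token(token, b"wrong-secret"))  # False
```

The constant-time comparison via `hmac.compare_digest` matters here: a naive `==` on signatures can leak timing information to an attacker.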
Driving Results:
- A strong individual contributor and a good team player.
- A flexible attitude towards work as needs change.
- Proactively identify & communicate issues and risks.
Other Personal Characteristics:
- Dynamic, engaging, self-reliant developer
- Ability to deal with ambiguity
- Maintain a collaborative and analytical approach
- Self-confident and humble
- Open to continuous learning
- Intelligent, rigorous thinker who can operate successfully amongst bright people
Job Description
- Minimum 5 years of experience with Test Data Management tools, including data de-identification and masking capabilities.
- Minimum 3 years of experience working with the Delphix tool on data de-identification TDM.
- Minimum 2 years of experience in synthetic data generation.
- Nice to have: experience in Python and .NET.
- Understand and align with the roadmaps of key consumers to deliver faster turnaround of test data generation.
- Optional: prior experience working with cloud-hosted platforms, CI/CD pipelines, and integration of test data into the pipeline.
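The de-identification capability described above can be sketched in plain Python: deterministic salted hashing hides original values while keeping referential integrity across tables, since the same input always maps to the same mask. The field names and salt are hypothetical; tools like Delphix implement this class of masking at scale:

```python
import hashlib

SALT = b"per-environment-salt"  # hypothetical; kept out of source control in practice

def mask_value(value: str) -> str:
    """Deterministically pseudonymize a value: same input -> same mask."""
    digest = hashlib.sha256(SALT + value.encode()).hexdigest()
    return digest[:12]

def mask_record(record: dict, pii_fields: set) -> dict:
    """Mask only the PII columns, leaving analytic columns intact."""
    return {k: mask_value(v) if k in pii_fields else v for k, v in record.items()}

customer = {"customer_id": "C123", "email": "jane@example.com", "balance": "250.00"}
masked = mask_record(customer, {"customer_id", "email"})
print(masked["balance"])                                   # unchanged: 250.00
print(masked["email"] == mask_value("jane@example.com"))   # True: deterministic
```

Because masking is deterministic, a `customer_id` masked in one table still joins correctly against the same masked `customer_id` in another, which is what makes the de-identified data usable for testing.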
Job Summary
We are seeking a highly skilled and motivated Linux Device Driver Engineer with strong C/C++ programming skills and hands-on experience in Linux driver development. The ideal candidate will have a proven track record of working with kernel modules and hardware interfaces, and be comfortable debugging and optimizing low-level system software.
Key Responsibilities
- Porting existing Linux device drivers to new platforms, SoCs, and kernel versions.
- New driver development for custom hardware components and peripherals.
- Debugging kernel and driver-level issues using industry-standard tools.
- Integration & bring-up of hardware with Linux-based systems.
- Collaborate with hardware teams to interpret specifications and enable device functionality.
- Optimize drivers for performance, reliability, and resource efficiency.
- Write clear technical documentation for driver APIs, design, and integration steps.
Required Skills & Qualifications
- Bachelor’s/Master’s in Computer Science, Electronics, or related field.
- 4 to 8 years of professional experience in software development.
- Strong proficiency in C/C++ programming and memory management.
- Hands-on experience with any Linux device driver (character, block, network, USB, PCIe, I2C, SPI, etc.).
- Good understanding of Linux kernel architecture, module programming, and build systems.
- Knowledge of interrupt handling, DMA, and device tree configuration.
- Familiarity with cross-compilation and embedded Linux toolchains.
- Experience with debugging tools (GDB, ftrace, perf, printk, etc.).
- Version control experience (Git).
Preferred Skills
- Exposure to multiple driver types (networking, storage, multimedia, etc.).
- Experience with Yocto, Buildroot, or similar embedded Linux environments.
- Knowledge of real-time Linux and RT patches.
- Scripting knowledge (Python, Bash) for testing and automation.
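The scripting item above, in a minimal form: a Python sketch that scans dmesg-style output for error messages from a hypothetical driver named `mydrv` (the driver name and log lines are invented; on a real target this would read `dmesg` output or the kernel log):

```python
import re

# Hypothetical kernel log lines emitted by a driver named "mydrv"
DRIVER_LINE = re.compile(r"^\[\s*(?P<ts>[\d.]+)\]\s+mydrv:\s+(?P<msg>.*)$")

def driver_errors(kernel_log: str):
    """Return (timestamp, message) pairs for driver messages mentioning errors."""
    hits = []
    for line in kernel_log.splitlines():
        m = DRIVER_LINE.match(line)
        if m and ("error" in m.group("msg").lower() or "fail" in m.group("msg").lower()):
            hits.append((float(m.group("ts")), m.group("msg")))
    return hits

sample_log = """\
[   12.345678] mydrv: probe successful, irq 42 registered
[  118.900012] mydrv: DMA transfer failed, resetting channel
[  119.000001] usb 1-1: new high-speed USB device
[  240.500000] mydrv: I/O error on read, status=0x1f
"""
for ts, msg in driver_errors(sample_log):
    print(f"{ts}: {msg}")
```

Scripts like this are commonly wrapped into bring-up test harnesses so driver regressions surface automatically after each kernel or platform change.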
Soft Skills
- Strong analytical and debugging skills.
- Good communication and collaboration abilities.
- Ability to work independently and take ownership of deliverables.
Similar companies
About the company
Appknox, a leading mobile app security solution headquartered in Singapore and Bangalore, was founded by Harshit Agarwal and Subho Halder.
Since its inception, Appknox has become one of the go-to security solutions with the most powerful plug-and-play security platform, enabling security researchers, developers, and enterprises to build safe and secure mobile ecosystems using a system-plus human approach.
Appknox offers VA+PT solutions (Vulnerability Assessment + Penetration Testing) that provide end-to-end mobile application security and testing strategies to Fortune 500, SMB, and large enterprises globally, helping businesses and mobile developers make their apps more secure and enhancing protection for their customers as well as their own brand.
During the course of 9 years, Appknox has scaled up to work with some major brands in India, South-East Asia, Middle-East, Japan, and the US and has also successfully enabled some of the top government agencies with its On-Premise deployments & compliance testing. Appknox helps 500+ Enterprises which includes 20+ Fortune 1000 and ministries/regulators across 10+ countries and some of the top banks across 20+ countries.
A champion of Value SaaS with its customer- and security-first approach, Appknox has won many awards and recognitions from G2 and Gartner, and was named one of the top mobile app security vendors in Gartner's 2021 Application Security Hype Cycle report.
Our forward-leaning, pioneering spirit is backed by SeedPlus, JFDI Asia, Microsoft Ventures, and Cisco Launchpad and a legacy of expertise that began at the dawn of 2014.
About the company
Oddr is the legal industry's only AI-powered invoice-to-cash platform. Oddr centralizes, streamlines, and accelerates every step of billing and collections, from bill preparation and delivery to collections and reconciliation, enabling new possibilities in analytics, forecasting, and client service that eliminate revenue leakage and increase profitability across the billing and collections lifecycle.
www.oddr.com
About the company
Quantiphi is an award-winning AI-first digital engineering company driven by the desire to reimagine and realize transformational opportunities at the heart of the business. Since its inception in 2013, Quantiphi has solved the toughest and most complex business problems by combining deep industry experience, disciplined cloud, and data-engineering practices, and cutting-edge artificial intelligence research to achieve accelerated and quantifiable business results.
About the company
Stairio is a digital infrastructure company building scalable online systems for modern businesses.
We help service-driven brands establish strong digital foundations through high-performance websites, booking systems, management dashboards, and integrated payment solutions. Our goal is to give businesses ownership, control, and long-term digital assets that generate measurable revenue.
About the company
Shopflo is an enterprise technology company providing a specialized checkout infrastructure platform designed to boost conversion rates for direct-to-consumer (D2C) e-commerce brands. Founded in 2021, it focuses on enhancing the online buying experience through fast, customizable, and secure checkout pages that reduce cart abandonment.
We aim to supercharge conversions for e-commerce websites at checkout by improving the user experience, helping build stronger intent and trust during the purchase.
Problem statement:
(1) There is ~70% drop-off at checkout for most independent e-commerce retailers (outside of large marketplaces).
(2) E-commerce cart platforms allow minimal flexibility at checkout, with an experience largely unchanged from the last decade.
(3) Meanwhile, user expectations are set by newer consumer platforms such as Swiggy and Amazon.
There is a fundamental unbundling of monolith shopping cart platforms globally for mid-market and enterprise customers, who are moving towards headless (read modular) architecture.
Shopflo aims to be the global default for checkout experiences.






