50+ Startup Jobs in Hyderabad | Startup Job openings in Hyderabad
Apply to 50+ Startup Jobs in Hyderabad on CutShort.io. Explore the latest Startup Job opportunities across top companies like Google, Amazon & Adobe.
• 1–3 years in content marketing, SEO, or digital marketing
• Strong English writing — you can write 2,000 words that don’t read like AI filler
• SEO fundamentals: keyword research, on-page, Google Search Console
• Basic paid ads experience (Google and/or LinkedIn)
• Self-directed — you manage your own execution once given direction
Bonus Points
• SaaS or e-commerce tech marketing experience
• Know what sellers on Amazon/eBay actually complain about
• Ahrefs, SEMrush, HubSpot experience
• Can make a decent social graphic in Canva
Sector: Hospitality + Commercial Projects
About the Role
We’re looking for a motivated Junior BIM Architect / Revit Specialist to join our Hyderabad team. This role is ideal for someone who enjoys working hands-on with Autodesk Revit, building accurate BIM models, and growing into coordination and leadership responsibilities. You’ll be part of exciting hospitality and commercial projects, with opportunities to learn from senior architects and expand your BIM expertise.
What You’ll Do
- Develop and maintain Revit families for furniture, fixtures, and building components.
- Prepare layouts, finish plans, elevations, and construction details.
- Collaborate with architects, designers, and engineers to deliver coordinated BIM models.
- Support BIM workflows including clash detection and model reviews.
- Keep Revit libraries organized and up to date.
- Stay current with BIM trends and Revit best practices.
What We’re Looking For
- Bachelor’s degree in Architecture, Engineering, or related field.
- 1–5 years of experience in architectural practice with BIM/Revit focus.
- Strong skills in Autodesk Revit, especially family creation.
- Good understanding of BIM standards and workflows.
- Strong communication and problem-solving skills.
- Team player with the ability to work independently.
- Familiarity with collaboration tools/version control is a plus.
Why Join Us
- Work on high-profile hospitality and commercial projects.
- Gain mentorship and training from senior BIM professionals.
- Opportunity to grow into BIM coordination and project leadership roles.
- Collaborative, supportive team environment.
Job Title: Data Entry Operator / Data Entry Clerk
Location: Hyderabad
5 Days working
Job Summary:
The company is seeking a talented and motivated Data Entry Operator who will be responsible for accurately entering, updating, and maintaining data in company databases and systems. This role ensures information is recorded efficiently, securely, and with attention to detail to support smooth business operations.
Job Responsibilities:
• Enter and update data into databases, spreadsheets, and systems with high accuracy.
• Verify and correct data to ensure consistency and eliminate errors.
• Review source documents for completeness and clarity before entry.
• Maintain records of activities and completed work.
• Retrieve, organize, and present data for internal reports as required.
• Identify and report discrepancies or data quality issues to supervisors.
Requirements:
• High school diploma or equivalent
• Proven experience in data entry, clerical, or administrative work.
• Strong typing skills with accuracy and speed.
• Proficiency with MS Office (Excel, Word) and database software.
• Good time management skills.
• Strong attention to detail.
• Ability to work independently and meet deadlines.
Required Skills & Experience
Education:
• Bachelor's degree in Computer Science, Information Technology, Cybersecurity, or a related area
Experience:
• 3 to 7 years of experience in security engineering and security operations
Skills:
• 3+ years of hands-on experience with Microsoft Sentinel, KQL, and Terraform
• Strong understanding of the Azure ecosystem and Azure infrastructure/platform services, including common security services (firewalls, WAF, IDPS, and RBAC)
• Experience building custom analytics rules, playbooks, and workbooks
• Understanding of MITRE ATT&CK, incident response, and security monitoring best practices
• Experience with scripting and query languages such as Python, Terraform, JSON, and KQL
Job Description
The Security Operations Engineer is responsible for designing, implementing, and optimizing Azure Sentinel-based security monitoring solutions across cloud and hybrid environments. The role focuses on building scalable analytics, automation, and threat detections, and on integrating and developing data sources from enterprise systems, including Azure, M365, network security tools, serverless applications, containerized resources, and IoMT environments where applicable. The role provides technical expertise for Azure Sentinel engineering, KQL query development, security automation (SOAR), threat detection improvements, log onboarding and ingestion optimization, and dashboard and reporting design to support a high-maturity SOC.
We are seeking a highly skilled Qt/QML Engineer to design and develop advanced GUIs for aerospace applications. The role requires working closely with system architects, avionics software engineers, and mission systems experts to create reliable, intuitive, real-time UIs for mission-critical systems such as UAV ground control stations and cockpit displays.
Key Responsibilities
- Design, develop, and maintain high-performance UI applications using Qt/QML (Qt Quick, QML, C++).
- Translate system requirements into responsive, interactive, and user-friendly interfaces.
- Integrate UI components with real-time data streams from avionics systems, UAVs, or mission control software.
- Collaborate with aerospace engineers to ensure compliance with DO-178C or MIL-STD guidelines where applicable.
- Optimise application performance for low-latency visualisation in mission-critical environments.
- Implement data visualisation (raster and vector maps, telemetry, flight parameters, mission planning overlays).
- Write clean, testable, and maintainable code while adhering to aerospace software standards.
- Work with cross-functional teams (system engineers, hardware engineers, test teams) to validate UI against operational requirements.
- Support debugging, simulation, and testing activities, including hardware-in-the-loop (HIL) setups.
Required Qualifications
- Bachelor’s / Master’s degree in Computer Science, Software Engineering, or related field.
- 1-3 years of experience in developing Qt/QML-based applications (Qt Quick, QML, Qt Widgets).
- Strong proficiency in C++ (11/14/17) and object-oriented programming.
- Experience integrating UI with real-time data sources (TCP/IP, UDP, serial, CAN, DDS, etc.).
- Knowledge of multithreading, performance optimisation, and memory management.
- Familiarity with aerospace/automotive domain software practices or mission-critical systems.
- Good understanding of UX principles for operator consoles and mission planning systems.
- Strong problem-solving, debugging, and communication skills.
Desirable Skills
- Experience with GIS/Mapping libraries (OpenSceneGraph, Cesium, Marble, etc.).
- Knowledge of OpenGL, Vulkan, or 3D visualisation frameworks.
- Exposure to DO-178C or aerospace software compliance.
- Familiarity with UAV ground control software (QGroundControl, Mission Planner, etc.) or similar mission systems.
- Experience with Linux and cross-platform development (Windows/Linux).
- Scripting knowledge in Python for tooling and automation.
- Background in defence, aerospace, automotive or embedded systems domain.
What We Offer
- Opportunity to work on cutting-edge aerospace and defence technologies.
- Collaborative and innovation-driven work culture.
- Exposure to real-world avionics and mission systems.
- Growth opportunities in autonomy, AI/ML for aerospace, and avionics UI systems.
Role Overview
We are looking for a skilled Generative AI Developer to design, develop, and deploy AI-powered applications using Large Language Models (LLMs) and multimodal AI systems. The role involves building intelligent automation, chatbots, copilots, and content generation solutions aligned with business use cases.
Key Responsibilities
- Design and develop applications using Generative AI models (LLMs, diffusion models, etc.)
- Build AI chatbots, virtual assistants, and knowledge copilots
- Integrate LLM APIs (OpenAI, Anthropic, Google, etc.) into web/mobile apps
- Develop prompt engineering strategies for optimized outputs
- Implement Retrieval-Augmented Generation (RAG) pipelines
- Fine-tune and customize foundation models where required
- Work with vector databases for semantic search
- Collaborate with product, data, and engineering teams
- Ensure AI solutions are scalable, secure, and cost-efficient
- Monitor model performance, hallucinations, and output quality
Required Skills
- Strong programming in Python
- Experience with LLM frameworks (LangChain, LlamaIndex, Haystack)
- Hands-on with OpenAI / Gemini / Claude APIs
- Knowledge of Prompt Engineering
- Experience with Vector Databases (Pinecone, Weaviate, FAISS, Chroma)
- Understanding of RAG architectures
- Familiarity with REST APIs and microservices
- Knowledge of Docker / Cloud (AWS, Azure, GCP)
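As a rough illustration of the RAG pipelines mentioned above, the sketch below shows the core retrieve-then-prompt flow. It is a toy: bag-of-words cosine similarity stands in for model embeddings, the sample documents are invented, and no LLM is actually called; a production system would use an embedding model, a vector database such as FAISS or Pinecone, and an LLM API.

```python
# Toy sketch of the retrieve-then-prompt core of a RAG pipeline.
# Bag-of-words cosine similarity stands in for model embeddings.
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words count vector."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents most similar to the query."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Augment the query with retrieved context before calling an LLM."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Invoices are processed within 30 days of receipt.",
    "The cafeteria is open from 8am to 6pm.",
    "Refunds require a signed approval form.",
]
print(build_prompt("How long are invoices processed?", docs))
```

A real pipeline would replace `embed` with model embeddings and send the built prompt to an LLM API.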
About our company:
We are an mSFA technology company that has evolved from the industry expertise we have gained over 25+ years. With over 600 success stories in mobility, digitization, and consultation, we are today the leaders in mSFA, with 75+ enterprises trusting WINIT mSFA across the globe.
Our state-of-the-art support center provides 24x7 support to our customers worldwide. We continuously strive to help organizations improve their efficiency, effectiveness, market cap, brand recognition, distribution and logistics, regulatory and planogram compliance, and many more through our cutting-edge WINIT mSFA application.
We are committed to enabling our customers to be autonomous with our continuous R&D and improvement in WINIT mSFA. Our application provides customers with machine learning capability so that they can innovate, attain sustainable growth, and become more resilient.
At WINIT, we value diversity, personal and professional growth, and celebrate our global team of passionate individuals who are continuously innovating our technology to help companies tackle real-world problems head-on.
We are seeking a talented AI/ML Engineer with strong hands-on experience in Generative AI and Large Language Models (LLMs) to join our Business Intelligence team. The role involves designing, developing, and deploying advanced AI/ML and GenAI-driven solutions to unlock business insights and enhance data-driven decision-making.
Key Responsibilities:
• Collaborate with business analysts and stakeholders to identify AI/ML and Generative AI use cases.
• Design and implement ML models for predictive analytics, segmentation, anomaly detection, and forecasting.
• Develop and deploy Generative AI solutions using LLMs (GPT, LLaMA, Mistral, etc.).
• Build and maintain Retrieval-Augmented Generation (RAG) pipelines and semantic search systems.
• Work with vector databases (FAISS, Pinecone, ChromaDB) for embedding storage and retrieval.
• Develop end-to-end AI/ML pipelines from data preprocessing to deployment.
• Integrate AI/ML and GenAI solutions into BI dashboards and reporting tools.
• Optimize models for performance, scalability, and reliability.
• Maintain documentation and promote knowledge sharing within the team.
Mandatory Requirements:
• 4+ years of relevant experience as an AI/ML Engineer.
• Hands-on experience in Generative AI and Large Language Models (LLMs) – Mandatory.
• Experience implementing RAG pipelines and prompt engineering techniques.
• Strong programming skills in Python.
• Experience with ML frameworks (TensorFlow, PyTorch, scikit-learn).
• Experience with vector databases (FAISS, Pinecone, ChromaDB).
• Strong understanding of SQL and database systems.
• Experience integrating AI solutions into BI tools (Power BI, Tableau).
• Strong analytical, problem-solving, and communication skills.
Good to Have:
• Experience with cloud platforms (AWS, Azure, GCP).
• Experience with Docker or Kubernetes.
• Exposure to NLP, computer vision, or deep learning use cases.
• Experience in MLOps and CI/CD pipelines.
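The anomaly-detection responsibility above can be illustrated with a minimal z-score detector over a metric series. This is a toy rule with invented sample data; production BI pipelines would use trained models (isolation forests, autoencoders) and proper data infrastructure.

```python
# Minimal z-score anomaly detector over a business metric series.
# Illustrative only: real pipelines would use trained models.
import statistics

def zscore_anomalies(values: list[float], threshold: float = 3.0) -> list[int]:
    """Indices of values more than `threshold` std deviations from the mean."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > threshold]

daily_orders = [102, 98, 105, 99, 101, 97, 480, 103]  # one obvious spike
print(zscore_anomalies(daily_orders, threshold=2.0))  # flags index 6 (the 480)
```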
Computer Vision Engineer
Experience Range: 3–6 years
About the Role:
We are seeking a Computer Vision Engineer to design and implement vision-based solutions that power intelligent systems. You will work on algorithms and models that enable real-time image and video analysis for transportation and automation applications.
Key Responsibilities:
• Develop and optimize computer vision algorithms for object detection, tracking, and recognition
• Implement deep learning models using Python, OpenCV, PyTorch, TensorFlow
• Collaborate with data engineers to prepare and process large-scale image datasets
• Deploy vision models in production environments with cloud and edge computing support
• Research emerging techniques in computer vision and apply them to product development
Required Skills:
• Strong proficiency in Python and computer vision libraries (OpenCV, PyTorch, TensorFlow)
• Experience with image preprocessing, feature extraction, and model training
• Knowledge of deep learning architectures (CNNs, RNNs, Transformers)
• Familiarity with cloud deployment and MLOps practices
Preferred Qualifications:
• Bachelor's/Master's degree in Computer Science, AI, or a related field
• Prior experience in Intelligent Transportation Systems (ITS) or automotive vision applications
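One concrete building block behind the detection and tracking work described above is Intersection-over-Union (IoU), used to match predicted bounding boxes to ground truth and in non-maximum suppression. A minimal sketch, assuming boxes are given as (x1, y1, x2, y2) corner coordinates:

```python
# Intersection-over-Union (IoU): the overlap metric used to match
# predicted bounding boxes to ground truth and in non-maximum
# suppression. Boxes are assumed to be (x1, y1, x2, y2) corners.

def iou(a: tuple, b: tuple) -> float:
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])  # intersection corners
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# Two 10x10 boxes overlapping in a 5x5 patch: 25 / (100 + 100 - 25)
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))
```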
Job Details
- Job Title: Lead I - Data Engineering (Python, AWS Glue, Pyspark, Terraform)
- Industry: Global digital transformation solutions provider
- Domain - Information technology (IT)
- Experience Required: 5-7 years
- Employment Type: Full Time
- Job Location: Hyderabad
- CTC Range: Best in Industry
Job Description
Data Engineer with AWS, Python, Glue, Terraform, Step Functions, and Spark
Skills: Python, AWS Glue, PySpark, Terraform - all are mandatory
Notice period - 0 to 15 days only
Job stability is mandatory
Location: Hyderabad
Job Details
- Job Title: Delivery Manager
- Industry: IT- Services
- Function - Information technology (IT)
- Experience Required: 15-18 years
- Employment Type: Full Time
- Job Location: Hyderabad
- CTC Range: Best in Industry
Preferred Skills: Excellent Communication & Stakeholder Management, Delivery Leadership, Scaled Agile, Program Governance, Cybersecurity Delivery, Executive Communication
Criteria:
1. 15+ years of experience in IT Services / System Integration / Cybersecurity services companies.
2. Must have handled enterprise client implementation projects (not internal product development only).
3. Proven ownership of end-to-end project delivery including transition to support/AMC.
4. Managed multi-stream technology implementation programs
5. Experience handling BFSI / Ecommerce / Retail / Enterprise clients.
6. Strong executive stakeholder handling and governance reporting
7. Strong hands-on exposure to SDLC delivery models
8. Prior experience delivering Cybersecurity / IAM / Cloud Security / Infrastructure / Enterprise IT projects.
9. Clear understanding of delivery governance, risk management, and milestone control.
10. Candidates must hold PMP and AWS certifications for this role.
Note- Only Male candidates will be considered for this role.
Job Description
Head – Project / Delivery Management
Role Overview
We are seeking a highly experienced Project / Delivery Leader responsible for end-to-end delivery of all organizational projects, ensuring quality, timeliness, cost efficiency, and customer satisfaction.
This role demands strong expertise in scaled Agile delivery, SDLC management, cybersecurity projects, stakeholder leadership, and large-scale program execution.
Key Roles & Responsibilities
- Responsible for delivery of all projects across the organization.
- Lead project management across all SDLC delivery methodologies.
- Ensure successful project completion, handover, and future opportunity enhancement.
- Ensure seamless transition of implementation projects to support.
- Manage large-scale programs and multi-team environments.
- Strong decision-making and problem-solving capability.
- Expert client stakeholder management and executive communication.
- Present roadmap status, risks, and issues to executive leadership and mitigate roadblocks efficiently.
- Keep teams aligned with process standards at every stage.
- Monitor project progress and drive performance improvements.
- Prepare and present status reports to stakeholders.
- Own Cost / Quality / Timelines / Cybersecurity deliverables for allocated projects.
- Maximize resource utilization and proactive upskilling based on future demand.
- Ensure Customer Satisfaction (CSAT ownership).
- Complete delivery team management.
- Attrition optimization and team stability management.
Mandatory Skills & Experience
- 15+ years of proven experience in Project/Delivery Management (with at least some exposure to Business Analysis).
- Strong expertise in scaling Lean & Agile practices across large development programs.
- Experience managing scaled Agile frameworks such as SAFe, DAD, Scrum, Kanban, or other iterative models at scale.
- Working knowledge of all SDLC delivery models.
- Excellent people and project management skills.
- Strong communication and executive presentation skills.
- Strong analytical and problem-solving ability.
- Experience working in small-scale organizations handling large enterprise clients.
- Proficiency in productivity tools – MS Excel, MS PowerPoint, MS Project.
- Prior experience handling Cybersecurity projects in BFSI, Ecommerce, Retail domains.
Educational Qualifications
- Engineering (CSE/ECE/EEE preferred) + MBA from reputable institutes.
- MBA specialization in Systems / Organizational Management / IT Business Management preferred.
- Management programs from reputed institutes such as IIMs are an added advantage.
- Entire education from English medium.
Additional Requirements:
- Male candidate only
- Clean shave and business formals (Grooming Policy)
- Work from Office only
Certifications
Mandatory:
- PMP
- AWS Certification
Good to Have:
- ITIL
- Certified Scrum Master
- PRINCE2
- CISSP
- CISA
Job Details
- Job Title: Project Manager
- Industry: IT- Services
- Function - Information technology (IT)
- Experience Required: 6-8 years
- Employment Type: Full Time
- Job Location: Hyderabad
- CTC Range: Best in Industry
Preferred Skills: Project Management, Excellent Communication & Stakeholder Management, Cybersecurity Delivery, Business Analysis, SDLC Expertise, IAM Exposure
Criteria:
1. 7+ years of experience in IT Services / System Integration / Cybersecurity consulting companies.
2. Proven experience managing client-facing enterprise implementation projects (not internal IT-only roles).
3. Hands-on experience in: Project initiation, AS-IS / TO-BE documentation, Requirement gathering & gap analysis.
4. Must have actively played both Project Manager and Business Analyst roles in cybersecurity/enterprise tech projects.
5. Prior experience in Identity & Access Management (IAM), Access Governance, Privileged Access, or Security implementations.
6. Experience managing projects across full SDLC lifecycle (Design → Development → Testing → Deployment → Go-live → Support transition).
7. Experience working with international enterprise clients.
Not Suitable If:
1. Pure Scrum Master without documentation ownership.
2. Only Business Analyst without delivery accountability.
3. Internal IT role without enterprise client exposure.
Job Description
Project Manager – Cybersecurity (PM + BA)
Role Overview
Key Responsibilities
Lead Project Management and Business Analysis for all cybersecurity projects.
Own project initiation, AS-IS / TO-BE documentation, and complete requirement analysis.
Coordinate with technology teams for design, development, infrastructure deployment, testing, and production go-live.
Mandatory client engagement and stakeholder management throughout the project lifecycle.
Act as liaison between clients and developers, ensuring business and technical requirements are identified, validated, and traceable.
Ensure adherence to OEM recommended best practices.
Facilitate Joint Application Design (JAD) sessions with clients and technical teams.
Capture and report project metrics, risks, issues, and actions to stakeholders.
Optimize milestone management (Fixed Bid), resource utilization (T&M), and support project efficiency.
Coordinate with internal finance to ensure accurate and timely billing.
Build strong working relationships with project teams, business leads, and clients.
Participate actively in all SDLC phases — planning, execution, communication, and post-project activities.
Conduct periodic SWOT analysis, proactive risk identification, and mitigation planning.
Prepare project documentation and presentations using MS Word, Excel, PowerPoint, and related tools.
Core Role Expectations
Strong Business Analysis capability.
Good experience in combined PM + BA activities.
Comprehensive understanding of all SDLC phases.
Understanding of enterprise software and organizational process alignment.
Experience in software delivery management.
Experience working with international/global customers.
Experience managing technical teams (as reporting manager).
Strong stakeholder management skills.
Proficiency in MS Word, Excel, MS Project, and other productivity tools.
Identity and Access Management (IAM) domain exposure.
Excellent verbal and written communication skills.
Educational Requirements
MBA in IT Business Management (ITBM) or MBA IT Systems only (Mandatory).
MBA HR / Marketing / Finance candidates are not eligible.
Engineering graduate from ECE / EEE / IT / CSE only.
Additional Requirements:
Male candidate only
Entire graduation from English medium
Clean shave and business formals (Grooming Policy)
Work from Office only
Skills & Qualifications
Must Have
Strong understanding of software systems enabling functional and technical requirements.
Solid experience in cybersecurity project environments.
Good to Have
Prior experience handling cybersecurity projects.
Understanding of COBIT standards.
Certifications
Preferred (any one):
PMP
CISA
CISM
Are you passionate about leveraging data to drive strategic decisions? Join us at OIP Insurtech, where you'll play a pivotal role in analyzing market trends, optimizing operational processes, and transforming customer experiences. Bring your analytical prowess and business acumen to collaborate with a dynamic team dedicated to shaping the future of insurance technology. Apply now and be part of revolutionizing the industry with cutting-edge insights and solutions.
Requirements:
- 3-5 years of experience
- Capable of independently conducting various aspects of business analysis, including requirements gathering, process analysis, and documentation.
- Proactive in identifying and prioritizing issues within their domain and taking the initiative to work with the team to resolve them.
- Strong background in accounting and/or financial controlling
- Demonstrates a high level of skill and independence
- Understands the full business analysis lifecycle and is proficient in handling various business problems and custom cases.
- Good knowledge of English
- Experience in accounting and business analysis in the IT industry is highly desirable
- Working knowledge of software development concepts, technologies, practices, and principles, and the ability to use them in different situations independently
- Good understanding of the software development lifecycle
Duties and responsibilities:
- Works closely with the development team and other stakeholders to investigate model business functions, business processes, information flows, and data structures
- Investigate operational issues, problems, and new opportunities; seek effective business solutions through improvements in aspects of business areas or systems of interest
- Assists in the analysis of underlying issues and their root causes, and identifies available options
- Communicates with clients and other stakeholders and assists in presenting issues and solutions both verbally and in writing
- Specifies data, data objects, and information flows that align with the needs of the business
- Produces business analysis deliverables using relevant documentation styles in line with company standards
- Facilitates stakeholder meetings and workshops, and presents findings and actions both verbally and in writing to the business
- Able to translate business requirements into a structured form
- Able to create structure within projects and project teams following Agile principles
- Understands the domain (Insurance Industry) and how it functions after spending 6 months in the company
- Understands the entire Policy Lifecycle after spending 6 months in the company
- Experience in using tools like Jira, Confluence, LucidChart, or other software for requirements management, documentation, and process modeling
- Experience with Agile frameworks
- Experience with testing methodologies, test planning, and quality assurance processes to ensure reliable and robust systems
Responsibilities:
• End-to-end design, development, and deployment of enterprise-grade AI solutions leveraging Azure AI, Google Vertex AI, or comparable cloud platforms.
• Architect and implement advanced AI systems, including agentic workflows, LLM integrations, MCP-based solutions, RAG pipelines, and scalable microservices.
• Oversee the development of Python-based applications, RESTful APIs, data processing pipelines, and complex system integrations.
• Define and uphold engineering best practices, including CI/CD automation, testing frameworks, model evaluation procedures, observability, and operational monitoring.
• Partner closely with product owners and business stakeholders to translate requirements into actionable technical designs, delivery plans, and execution roadmaps.
• Provide hands-on technical leadership, conducting code reviews, offering architectural guidance, and ensuring adherence to security, governance, and compliance standards.
• Communicate technical decisions, delivery risks, and mitigation strategies effectively to senior leadership and cross-functional teams.
Experience: 0-1 Year
Company: WINIT
Location: Hyderabad
Employment Type: Full-Time
Key Responsibilities:
- Design, develop, and maintain web applications using .NET Core and C#.
- Leverage AI technologies to enhance functionality and automate processes.
- Build and integrate RESTful APIs and web services.
- Optimize SQL queries for efficient data management.
- Collaborate within teams to define and deliver new features.
- Ensure UI/UX feasibility and improve user experience.
- Write clean, scalable code and troubleshoot performance issues.
- Participate in the full product lifecycle and utilize version control systems (Git, SVN).
Required Skills & Qualifications:
- 0–1 year experience with .NET Core, C#, and SQL.
- Proficient in JavaScript, HTML5, CSS3.
- Proficient with AI-assisted coding tools such as Cursor, Claude Code, and Copilot.
- Strong problem-solving skills and independent work ethic.
- Experience in product companies, ideally with startups or small enterprises.
- Passion for development with an interest in AI-driven solutions.
- Excellent communication skills and a strong team player.
Preferred Skills:
- Experience with cloud platforms (AWS, Azure), CI/CD pipelines, and containerization (Docker, Kubernetes).
- Familiar with unit testing (NUnit, MSTest) and Agile methodologies.
- Experience with microservices architecture and security best practices.
What We Offer:
- A dynamic, collaborative work environment with growth opportunities.
- Competitive salary and flexible working hours.
- Exposure to cutting-edge technologies, including AI.
Role Overview
We are seeking a technically strong Java Support Engineer who combines solid development knowledge with a passion for support and operational excellence. The ideal candidate should have hands-on experience in Java, Spring Boot, and Angular, along with a strong understanding of application engineering concepts, and must be comfortable working in a production support environment handling incidents, troubleshooting, monitoring, and system stability.
Key Responsibilities
- Provide L2/L3 production support for enterprise applications.
- Troubleshoot, debug, and resolve application issues within defined SLAs.
- Analyze logs, identify root causes, and implement fixes or workarounds.
- Collaborate with development teams for permanent issue resolution.
- Monitor application health, performance, and availability.
- Support deployments, releases, and environment validations.
- Perform minor code fixes and enhancements when required.
- Document issues, solutions, and support procedures.
- Participate in on-call rotations and handle incident management.
Required Skills & Qualifications
- Strong hands-on experience in Java and Spring Boot.
- Working knowledge of Angular for frontend understanding.
- Good understanding of application architecture, APIs, microservices, and debugging techniques.
- Experience with log analysis tools, monitoring tools, and ticketing systems.
- Knowledge of SQL databases and query troubleshooting.
- Familiarity with Linux/Unix environments.
- Understanding of CI/CD, release processes, and version control (Git).
- Strong analytical, problem-solving, and communication skills.
Engineering Head / Technology Leader
Experience: 14+ Years
Role Summary
Seeking a senior Engineering Leader to drive technical strategy, lead large engineering teams, and deliver scalable, high-quality software products. The role combines strong people leadership with deep hands-on and architectural expertise.
Key Responsibilities
- Lead and scale a 40+ member engineering team
- Own and execute the technical roadmap aligned with business goals
- Drive AI-driven solutions and modern engineering practices
- Ensure delivery of secure, scalable, high-performance systems
- Mentor leaders, manage hiring, performance, and career growth
- Oversee architecture, code quality, and non-functional standards
- Collaborate with Product, Design, and cross-functional teams
- Champion Agile and DevOps practices
Requirements
- 14+ years in software engineering, 5+ years in leadership
- Strong expertise in Java, Microservices, AWS, Angular, AI
- Experience with Spring Boot, Kafka, Kubernetes, Docker, Cassandra
- Strong system design, security, and distributed systems knowledge
- Excellent communication and stakeholder management skills
What We Offer
- Competitive compensation
- Leadership impact in a high-growth environment
- Collaborative, flexible work culture
Location: Hyderabad
Qualification: B.Arch / Interior Design
Working Days: Monday to Friday
Timings: 9:30 AM to 6:30 PM
Job Title: Senior Interior Designer
Job Description:
Graniti Vicentia India Pvt Ltd is seeking a highly skilled and experienced Senior Interior Designer to join
our dynamic team. The ideal candidate will have 3 to 4 years of relevant experience in the field and
possess a strong background in interior design, with a focus on creating aesthetically pleasing and
functional spaces. The Senior Interior Designer will be responsible for overseeing various aspects of
the design process, including conceptual design, development drawings, vendor coordination, shop
drawings, quotes, and the selection of furniture, fixtures, and equipment (FF&E).
Responsibilities:
Design Development:
• Lead the design development process, ensuring that concepts align with client preferences,
project goals, and industry standards.
• Develop comprehensive interior design schemes, including space planning, color schemes,
and material selections.
Drawings and Documentation:
• Create detailed interior drawings, plans, and elevations using industry-standard software.
• Collaborate with architects and other design professionals to integrate interior design
elements seamlessly into overall project documentation.
Vendor Coordination:
• Establish and maintain relationships with vendors, suppliers, and manufacturers.
• Coordinate with vendors to source and procure materials, furniture, and fixtures that meet
project specifications and budget constraints.
Shop Drawings and Specifications:
• Review and approve shop drawings to ensure they comply with design intent and project
requirements.
• Develop and communicate detailed specifications for construction and installation
processes.
Quotes and Budget Management:
• Prepare accurate and detailed project cost estimates and quotes.
• Work closely with the project management team to monitor and manage project budgets
effectively.
FF&E Selection:
• Lead the selection of furniture, fixtures, and equipment based on design concepts, client
preferences, and project requirements.
• Stay updated on industry trends and product innovations to enhance design offerings.
Qualifications:
• Bachelor's degree in Interior Design or a related field.
• 3 to 4 years of proven experience in interior design, with a focus on commercial or
residential projects.
• Proficient in AutoCAD, Adobe Creative Suite, and other relevant design software.
• Strong knowledge of materials, finishes, and furniture procurement.
• Excellent communication and interpersonal skills.
• Ability to lead and collaborate within a team environment.
• Project management skills, including the ability to manage timelines and budgets effectively.
If you meet the above qualifications and are passionate about creating innovative and functional interior spaces, we invite you to apply for the Senior Interior Designer position. Please submit your resume, portfolio, and a cover letter detailing your relevant experience and design philosophy.
Contact: info@goldnhire.in
A. Lead Analyst role
Job Title: Full Stack Developer
Position: Lead Analyst
Position Description: The Full Stack Developer will be responsible for designing, developing, and maintaining modern web applications using both front-end and back-end technologies. This role involves building responsive user interfaces with Vue.js or other modern frameworks (React/Angular), implementing APIs with JavaScript/TypeScript and Node.js, and ensuring application quality through automated testing (Playwright). The developer will also work on GraphQL integrations, contribute to CI/CD pipelines, and support deployments in Azure environments, collaborating closely with cross-functional teams to deliver secure, scalable, and high-performance solutions.
Responsibilities:
• Develop and maintain full-stack applications using Java, Vue.js, React, or Angular for front-end and JavaScript/TypeScript with Node.js for back-end.
• Implement GraphQL or RESTful APIs and contribute to microservice development under guidance from senior engineers.
• Write clean, efficient, and maintainable code following established coding standards and best practices.
• Build responsive, high-performance UI components and ensure cross-browser compatibility.
• Create and maintain automated tests (UI/API) using tools like Playwright to ensure software quality.
• Collaborate with cross-functional teams to understand requirements and deliver technical solutions.
• Support CI/CD pipelines and assist in deployments to cloud environments (Azure exposure is a plus).
• Troubleshoot and resolve application issues, contributing to continuous improvement efforts.
• Participate in code reviews and technical discussions, providing constructive feedback and learning from senior team members.
Must-Have Skills:
• Vue.js, or strong front-end experience with React/Angular
• TypeScript, Node.js, Java (full stack)
• Test automation experience, ideally Playwright
• GraphQL
• Azure/Azure Functions knowledge (optional)
• Azure DevOps experience (optional)
• Strong communication and problem-solving skills.
Contact: info@goldnhire.in
Role Summary
We are looking for a seasoned Full Stack Developer with strong backend expertise in PHP (Laravel/Symfony) and modern frontend experience using React (Vue.js or Angular exposure is a plus). The ideal candidate should possess a full-stack mindset with experience in building scalable REST APIs and interactive UI applications.
Key Responsibilities
- End-to-end feature development (Backend + Frontend)
- Design and develop scalable, secure, and maintainable applications
- Build and optimize REST APIs and reusable UI components
- Participate in technical design discussions and code reviews
- Mentor junior developers and ensure coding best practices
- Collaborate with QA, DevOps, and Architects
- Focus on performance tuning and security enhancements
Must-Have Skills
- Strong hands-on experience in PHP (Laravel/Symfony)
- Experience building and consuming REST APIs
- Expertise in React.js (Vue.js/Angular acceptable)
- Strong knowledge of JavaScript/TypeScript
- Solid understanding of HTML5, CSS3
- Experience with scalable and maintainable application architecture
Nice-to-Have Skills
- Full-stack exposure with Node.js
- Experience with MongoDB / NoSQL databases
- Familiarity with Docker, CI/CD pipelines, Cloud platforms
- Knowledge of HCM and Payroll domain
Our organization relies on its central engineering workforce to develop and maintain a product portfolio spanning several different startups and enterprises. The portfolio grows continuously as we incubate more startups, which means different products often use different technologies, architectures, and frameworks - a fun place for smart tech lovers!
About the Role
We’re offering a fantastic opportunity for a dynamic and entrepreneurial marketing professional to build our entire marketing function from the ground up. With around 4 to 6 years of experience, you’ll have the chance to lead a "zero-to-one" marketing journey in a growing company, helping us scale from $2 million to $12 million in revenue over the next three years.
Key Responsibilities
- Develop and Own the Marketing Strategy: Craft and execute a holistic marketing plan aligned with our growth ambitions and revenue targets.
- Website and SEO Management: Take charge of our website, driving SEO improvements through smart content strategies and ensuring a strong digital presence.
- Brand Building and Social Media: Drive our brand presence across LinkedIn and other social platforms, ensuring consistent, impactful messaging.
- Event and Community Engagement: Organize and host developer meetups and other events, helping to establish our footprint in the developer community.
- Marketing Collateral and Sales Support: Create marketing decks and collateral that empower our sales team, maintain brand guidelines, and ensure a cohesive and polished brand image.
What We’re Looking For
We’re seeking someone who’s not just a doer but a builder—someone excited to create a marketing function in a rapidly growing environment. Ideally, you have:
- 4-6 years of marketing experience, particularly in a technology company, consulting firm, or services-oriented business. Familiarity with marketing in a tech-driven environment is a big plus.
- Strong skills in digital marketing, SEO, content creation, and social media management.
- Proven experience with event organization and community engagement.
Why Join Us?
You’ll be stepping into a role where your impact is immediate and tangible. It’s a unique opportunity to shape the future of our marketing efforts and grow your career in a collaborative, high-growth environment.
About CAW
CAW is a Product Engineering Company of 90+ geeks.
Some of the products we have built - Ukti, CodeKnack, Hoichoi, Convoisight, OneHealthAssist, Dysko, Fhynix, Space, Interakt, CashFlo, FastBar, etc.
We are (or have been) part of the engineering teams for Haptik, RazorPay, Decklar, Flipspaces, Aerchain, Postman, TitanEmail, Acceldata, Bureau, etc.
We are obsessed with automation, DevOps, OOPS, and SOLID. We are not into one tech-stack - we are into solving problems.
Find us: https://goo.gl/maps/dvR6L26JUa42
Website: https://www.caw.tech/
Policies: Handbook
Overview
We seek a detail-oriented, collaborative Technical Enablement Lead to drive the implementation of enterprise systems and associated integrations, specifically with ERP (Enterprise Resource Planning) platforms and WMS (Warehouse Management Systems). This role is ideal for someone who thrives in cross-functional environments and excels at translating operational needs into scalable technical solutions.
Key Responsibilities
Cross-Functional Technical Integration
- Drive alignment across engineering, data, and warehouse operations teams
- Work with the teams to help translate business requirements into actionable integration tasks and system configurations
Issue Resolution & Continuous Improvement
- Identify and mitigate risks early, removing barriers to keep the team on track
- Work with the technical teams who monitor system performance and data flow integrity
- Lead the teams in root cause analysis and implement iterative enhancements to integration reliability
ETL Process Oversight
- Collaborate with the partner who leads the design and validation of ETL pipelines for syncing inventory, orders, and shipment data
- Ensure teams maintain data accuracy, schema compatibility, and robust error handling across platforms
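As a rough sketch of the schema-compatibility and error-handling oversight described above (field names and types are illustrative, not from this posting), an ETL sync step typically validates each inbound record and routes failures to an error queue instead of silently dropping them:

```python
# Illustrative schema for inventory records synced between an ERP and a WMS.
EXPECTED_SCHEMA = {"sku": str, "quantity": int, "warehouse": str}

def validate_batch(records):
    """Split records into (valid, errors); each error keeps the record plus its problems."""
    valid, errors = [], []
    for rec in records:
        problems = [
            f"{field}: expected {typ.__name__}"
            for field, typ in EXPECTED_SCHEMA.items()
            if not isinstance(rec.get(field), typ)
        ]
        if problems:
            errors.append((rec, problems))
        else:
            valid.append(rec)
    return valid, errors

good, bad = validate_batch([
    {"sku": "A-100", "quantity": 7, "warehouse": "HYD-1"},
    {"sku": "A-101", "quantity": "7", "warehouse": "HYD-1"},  # quantity arrived as a string
])
print(len(good), len(bad))  # one valid record, one routed to the error queue
```

Real pipelines would add retry logic and dead-letter storage, but the split-and-record pattern is the core of the "robust error handling" the role coordinates.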
Systems Mapping & Architecture
- Partner with brand and vendor IT teams to design and document integration pathways between ERP and WMS systems
- Work with teams to identify data dependencies, transformation logic, and system constraints
- Confirm alignment with operations leads on workflows and priorities
Access Control Oversight
- Partner with teams to define and coordinate role-based access controls across integrated systems
- Collaborate with infrastructure and security teams to maintain compliance and operational integrity
Qualifications:
- 3–5+ years of experience
- Strong understanding of ERP and WMS platforms and their integration patterns
- Experience working with teams that implement integration tools, e.g., ETL, data mapping
- Familiarity with system roles and access controls
- Excellent communication and coordination skills across technical and operational teams
- Ability to document and communicate systems in a clear, structured manner
Nice to Have:
- ERP: NetSuite or JDE
- WMS: Blue Yonder, Deposco
- Any project methodology certification: Agile, Waterfall, etc.
🚨 Hiring Alert 🚨
📌Job Role: Oracle Apps Technical Consultant (R12)
📍 Location: Hyderabad (Hybrid – 3 Days WFO)
💼 Experience: 8–15 Years
📝 Interview Process: 4 Rounds
📄 Employment Type: Permanent
We are looking for an experienced Oracle Apps Technical Consultant with strong expertise in Oracle EBS R12, Advanced PL/SQL & Oracle APEX.
If you have hands-on experience in Finance Modules and custom RICE components, this opportunity is for you.
🔹 Mandatory Skills
✅ Oracle EBS R12 (RICE Components – Reports, Interfaces, Conversions, Extensions, Workflows, ETL)
✅ Advanced SQL & PL/SQL Programming
✅ Oracle APEX (Mandatory)
✅ Finance Modules – AP, AR, GL, FA, INV, OM, Purchasing
✅ Performance Tuning & Optimization
✅ Strong understanding of APPS Data Model & Oracle Coding Standards
🔹 Key Responsibilities
✔ Develop customizations & enhancements in Oracle EBS R12
✔ Build PL/SQL packages, procedures, functions & triggers
✔ Design efficient program units using Oracle data model
✔ Develop Reports, Forms, Workflows (OAF is an added advantage)
✔ Perform performance tuning & query optimization
✔ Work closely with Finance & Business stakeholders
✔ Ensure adherence to Oracle development standards
🔹 Ideal Candidate
⭐ 7+ years in Oracle EBS Application Development
⭐ Strong understanding of underlying table structures
⭐ Advanced PL/SQL concepts expertise
⭐ Excellent communication skills
⭐ Experience in complex business transformation logic
📩 Interested candidates can share updated CV
📌 Subject Line: Oracle Apps Technical Consultant

Experience: 0–2 Years (Freshers Welcome)
Location: Hyderabad
Industry: Logistics Technology | Freight Management | Supply Chain Software
Employment Type: Full-Time
Languages: English & Hindi (Mandatory)
About Us
We are a logistics technology company offering innovative freight management software that helps shipping companies, freight forwarders, and logistics providers streamline their operations. Our solutions cover freight forwarding, customs compliance, shipment tracking, rate management, and end-to-end supply chain visibility. We are expanding rapidly and looking for energetic individuals to fuel our sales pipeline.
Role Overview
As a Lead Generation Executive for Freight Software Sales, you will reach out to logistics companies, freight forwarders, shipping lines, and supply chain businesses to introduce our software solutions. Your role is to generate interest, explain how our product solves their operational challenges, and book meetings for the sales team. This is a great entry point for freshers or early-career professionals passionate about logistics technology and B2B sales.
Key Responsibilities
Prospecting & Outreach
• Identify and research target companies in the logistics and freight industry – freight forwarders, 3PLs, shipping companies, customs brokers, exporters/importers.
• Build prospect lists using LinkedIn, industry directories, and databases.
• Conduct outbound cold calls and LinkedIn outreach to introduce our freight software solutions.
• Send personalized emails highlighting how our software can improve their freight operations.
Product/Solution Communication
• Learn and understand our freight software features – shipment management, documentation, customs filing, tracking, analytics, and integrations.
• Explain the product benefits clearly to prospects in English and Hindi, tailoring the conversation to their specific pain points (e.g., manual processes, visibility gaps, compliance challenges).
• Address initial questions and objections confidently.
Appointment Setting & Lead Nurturing
• Qualify leads by understanding their current systems, pain points, and interest level.
• Book product demos and sales meetings for senior sales executives.
• Maintain regular follow-up to keep leads warm and move them through the funnel.
CRM & Reporting
• Log all prospect interactions accurately in CRM (HubSpot/Zoho/Salesforce).
• Report daily/weekly metrics: number of calls, emails, LinkedIn messages, and meetings scheduled.
What You Bring
• 0–2 years of experience in lead generation, tele-sales, or business development (freshers with excellent communication are welcome).
• Fluency in English and Hindi – both spoken and written.
• Interest in logistics, supply chain, or freight industry (prior exposure is a plus but not mandatory).
• Ability to understand and articulate software features in simple, benefit-driven language.
• Self-motivated, persistent, and comfortable with cold outreach.
• Good organizational skills and attention to detail.
• Basic computer skills; willingness to learn CRM and sales tools.
Nice to Have
• Background in logistics, shipping, or freight forwarding operations.
• Experience using LinkedIn Sales Navigator or prospecting databases.
• Familiarity with freight industry terminology (Bill of Lading, customs clearance, FCL/LCL, etc.).
Why Join Us?
• Enter the fast-growing logistics technology space with strong industry demand.
• Comprehensive product training and on-the-job mentorship.
• Performance-based incentives and clear growth trajectory.
• Opportunity to grow into senior sales, key account management, or product roles.
• Collaborative work culture with exposure to global logistics markets.
In this role, you'll be responsible for building machine learning based systems and conducting data analysis that improves the quality of our large geospatial data. You'll develop NLP models to extract information, use outlier detection to identify anomalies, and apply data science methods to quantify the quality of our data. You will take part in the development, integration, productionisation, and deployment of these models at scale, which requires a good combination of data science and software development skills.
Responsibilities
- Development of machine learning models
- Building and maintaining software development solutions
- Provide insights by applying data science methods
- Take ownership of delivering features and improvements on time
Must-have Qualifications
- 4+ years' experience
- Senior data scientist, preferably with knowledge of NLP
- Strong programming skills and extensive experience with Python
- Professional experience working with LLMs, transformers and open-source models from HuggingFace
- Professional experience working with machine learning and data science, such as classification, feature engineering, clustering, anomaly detection and neural networks
- Knowledgeable in classic machine learning algorithms (SVM, Random Forest, Naive Bayes, KNN etc.).
- Experience using deep learning libraries and platforms, such as PyTorch
- Experience with frameworks such as Sklearn, Numpy, Pandas, Polars
- Excellent analytical and problem-solving skills
- Excellent oral and written communication skills
Extra Merit Qualifications
- Knowledge in at least one of the following: NLP, information retrieval, data mining
- Ability to do statistical modeling and building predictive models
- Programming skills and experience with Scala and/or Java
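To illustrate the anomaly-detection side of the role (this sketch is ours, not the employer's): the simplest outlier detector flags values far from the mean in standard-deviation units, a toy stand-in for the heavier models (isolation forests, autoencoders) a production system would use:

```python
import statistics

def zscore_outliers(values, threshold=3.0):
    """Flag values whose absolute z-score exceeds the threshold."""
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)  # sample standard deviation
    return [v for v in values if abs(v - mean) / stdev > threshold]

# Mostly-normal measurements with one corrupted reading
readings = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 55.0]
print(zscore_outliers(readings, threshold=2.0))  # flags 55.0
```

Note that a single extreme value inflates both the mean and the standard deviation, which is why robust statistics (median, MAD) or model-based detectors are preferred on real geospatial data.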
Required Qualifications:
• Bachelor’s degree in Computer Science, Engineering, or a related field.
• 5+ years of overall software development experience; minimum 4+ years in RPA development (UiPath, Blue Prism, or Automation Anywhere).
• Deep understanding of healthcare industry processes (EMR/EHR, claims processing, eligibility, compliance, etc.) is highly preferred.
• Proven ability to manage full RPA deployment including bots, orchestrators, and system integrations in enterprise environments.
• UiPath Advanced Developer or similar RPA certifications are a plus.
• Solid problem-solving, analytical, and communication skills.
Preferred Skills:
• Integration of RPA with AI/ML technologies or cloud platforms (Azure, AWS, GCP).
• Strong experience with scripting (Python, PowerShell, etc.), SQL, and integration via REST/SOAP APIs.
• Understanding of process improvement methodologies (Lean, Six Sigma).
• Track record of designing reusable components and frameworks, especially in a regulated healthcare environment.
• Strong documentation and stakeholder management abilities.
Soft Skills:
• Strong communication and interpersonal skills for cross-functional collaboration.
• Ability to mentor, lead, and motivate technical teams.
• Exceptional organizational and multitasking skills, especially in fast-paced and dynamic work environments.
Job Summary
We are seeking an experienced Java Full Stack Developer with 8+ years of hands-on experience in designing, developing, and maintaining scalable web applications. The ideal candidate should have strong expertise in Java (Spring Boot) on the backend and React.js on the frontend, along with experience in REST APIs, Microservices architecture, and cloud-based deployments.
Key Responsibilities
- Design, develop, and maintain scalable full-stack applications using Java, Spring Boot, and React.js
- Develop RESTful APIs and Microservices-based applications
- Collaborate with cross-functional teams including UI/UX, DevOps, QA, and Product teams
- Write clean, efficient, and reusable code following best practices
- Perform code reviews and mentor junior developers
- Optimize applications for performance and scalability
- Participate in architectural discussions and technical decision-making
- Ensure application security, data protection, and compliance standards
- Troubleshoot, debug, and upgrade existing systems
Required Skills:
- 7+ years of experience in Full Stack development
- Strong hands-on expertise in Java (8/11/17)
- Proficiency in Spring Boot, Spring MVC, Spring Security
- Experience in Microservices architecture and RESTful API development
- Strong knowledge of React.js, JavaScript (ES6+), HTML5, CSS3
- Experience with state management tools (Redux/Context API)
- Hands-on experience with Hibernate/JPA
- Good understanding of SQL databases (MySQL/PostgreSQL/Oracle)
- Experience with Git, Maven/Gradle, and CI/CD pipelines
- Working knowledge of Docker/Kubernetes
- Exposure to Cloud platforms (AWS/Azure/GCP) preferred
- Strong problem-solving and analytical skills
Summary:
As a Custom Software Engineer, you will engage in the design, construction, and configuration of applications tailored to meet specific business processes and application requirements. Your typical day will involve collaborating with cross-functional teams to gather requirements, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to deliver high-quality software solutions that align with organizational goals.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Collaborate with stakeholders to gather and analyze requirements for application development.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in Veeva Vault.
- Strong understanding of application development methodologies.
- Experience with software testing and debugging techniques.
- Familiarity with database management and data integration processes.
- Ability to work with version control systems and collaborative development tools.
Additional Information:
- The candidate should have minimum 3 years of experience in Veeva Vault.
- This position is based at our Bengaluru office.
🚨 Hiring Alert 🚨
📌 Job Role: System Integration Engineer (SW Integration)
📍 Location: Bangalore
💼 Experience: Minimum 6 Years
🎓 Qualification: Bachelor’s in Electrical / Computer Engineering
We are looking for an experienced System Integration Engineer to contribute to system integration and testing activities for next-generation vehicle controller & software platforms.
If you have strong expertise in vehicle networks, E/E architecture, and automotive system testing, this opportunity is for you.
🔹 Essential Skills
✅ Strong knowledge of E/E Architecture & Vehicle Networks
✅ Solid understanding of LIN (ISO 17987-1) & CAN (ISO 11898-1)
✅ Basic knowledge of Ethernet (OA TC8) & SOME/IP protocol
✅ Understanding of automotive testing standards (ISTQB preferred)
✅ Agile test tools: Doors Next Gen, IBM RTC, Jira
✅ Programming knowledge: Python, CAPL, C, C++
🔹 Desired Skills
⭐ Experience in CI/CD (GitHub, Artifactory, Conan, TeamCity)
⭐ Exposure to Linux, AUTOSAR, Android, QNX environments
⭐ Model-driven testing (Mockups, PoCs)
⭐ Cybersecurity knowledge – MACSec (IEEE 802.1AE)
⭐ Experience in electrical architecture industrialization
🔹 Experience Required
✔ Minimum 6 years in Systems Engineering / Vehicle Network Development
✔ Experience in sub-system or vehicle-level industrialization
📩 Interested candidates can share their updated CV
📌 Subject Line: Application for System Integration Engineer
Role Overview
We are hiring a Principal Datacenter Backend Developer to architect and build highly scalable, reliable backend platforms for modern data centers. This role owns control-plane and data-plane services powering orchestration, monitoring, automation, and operational intelligence across large-scale on-prem, hybrid, and cloud data center environments.
This is a hands-on principal IC role with strong architectural ownership and technical leadership responsibilities.
Key Responsibilities
- Own end-to-end backend architecture for datacenter platforms (orchestration, monitoring, DCIM, automation).
- Design and build high-availability distributed systems at scale.
- Develop backend services using Java (Spring Boot / Micronaut / Quarkus) and/or Python (FastAPI / Flask / Django).
- Build microservices for resource orchestration, telemetry ingestion, capacity and asset management.
- Design REST/gRPC APIs and event-driven systems.
- Drive performance optimization, scalability, and reliability best practices.
- Embed SRE principles, observability, and security-by-design.
- Mentor senior engineers and influence technical roadmap decisions.
Required Skills
- Strong hands-on experience in Java and/or Python.
- Deep understanding of distributed systems and microservices.
- Experience with Kubernetes, Docker, CI/CD, and cloud-native deployments.
- Databases: PostgreSQL/MySQL, NoSQL, time-series data.
- Messaging systems: Kafka / Pulsar / RabbitMQ.
- Observability tools: Prometheus, Grafana, ELK/OpenSearch.
- Secure backend design (OAuth2, RBAC, audit logging).
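The "secure backend design" bullet above (OAuth2, RBAC, audit logging) can be sketched in miniature. This is a hedged illustration with hypothetical role and permission names, not the employer's design: a role-to-permission map, an authorization check, and an audit record for every decision:

```python
import logging

logging.basicConfig(level=logging.INFO)
audit = logging.getLogger("audit")

# Hypothetical role-to-permission map for a datacenter control plane.
ROLE_PERMISSIONS = {
    "viewer": {"read:telemetry"},
    "operator": {"read:telemetry", "restart:service"},
    "admin": {"read:telemetry", "restart:service", "modify:capacity"},
}

def authorize(user, role, permission):
    """Return True if the role grants the permission; audit the decision either way."""
    allowed = permission in ROLE_PERMISSIONS.get(role, set())
    audit.info("user=%s role=%s perm=%s allowed=%s", user, role, permission, allowed)
    return allowed

print(authorize("alice", "operator", "restart:service"))  # True
print(authorize("bob", "viewer", "modify:capacity"))      # False
```

In a real service the role would come from a validated OAuth2 token claim and the audit log would go to durable storage, but the check-then-record shape is the essence of RBAC with audit logging.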
Nice to Have
- Experience with DCIM, NMS, or infrastructure automation platforms.
- Exposure to hyperscale or colocation data centers.
- AI/ML-based monitoring or capacity planning experience.
Why Join
- Architect mission-critical platforms for large-scale data centers.
- High-impact principal role with deep technical ownership.
- Work on complex, real-world distributed systems problems.
Title: Team Lead – Software Development
(Lead a team of developers to deliver applications in line with product strategy and growth)
Experience: 8 – 10 years
Department: Information Technology
Classification: Full-Time
Location: Hybrid in Hyderabad, India (3 days onsite and 2 days remote)
Job Description:
Looking for a full-time Software Development Team Lead to lead our high-performing Information Technology team. This person will play a key role in Clarity’s business by overseeing a development team, focusing on existing systems and long-term growth. This person will serve as the technical leader, able to discuss data structures, new technologies, and methods of achieving system goals. This person will be crucial in facilitating collaboration among team members and providing mentoring.
Reporting to the Director, Software Development, this person will be responsible for the day-to-day operations of their team and be the first point of escalation and technical contact for the team.
Job Responsibilities:
- Manages all activities of their software development team and sets goals for each team member to ensure timely project delivery.
- Performs code reviews and writes code if needed.
- Collaborates with the Information Technology department and business management team to establish priorities for the team’s plan and manage team performance.
- Provides guidance on project requirements, developer processes, and end-user documentation.
- Supports an excellent customer experience by being proactive in assessing escalations and working with the team to respond appropriately.
- Uses technical expertise to contribute towards building best-in-class products. Analyzes business needs and develops a mix of internal and external software systems that work well together.
- Using Clarity platforms, writes, reviews, and revises product requirements and specifications. Analyzes software requirements, implements design plans, and reviews unit tests. Participates in other areas of the software development process.
Required Skills:
- A Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related discipline.
- Excellent written and verbal communication skills.
- Experience with .NET Framework, Web Applications, Windows Applications, and Web Services.
- Experience in developing and maintaining applications using C# .NET Core, ASP.NET MVC, and Entity Framework.
- Experience in building responsive front-ends using React.js, Angular.js, HTML5, CSS3, and JavaScript.
- Experience in creating and managing databases, stored procedures, and complex queries with SQL Server.
- Experience with Azure Cloud Infrastructure.
- 8+ years of experience in designing and coding software in the above technology stack.
- 3+ years of managing a team within a development organization.
- 3+ years of experience in Agile methodologies.
Preferred Skills:
- Experience in Python, WordPress, PHP.
- Experience in using Azure DevOps.
- Experience working with Salesforce or any other comparable ticketing system.
- Experience in insurance/consumer benefits/file processing (EDI).
Title: Sr. Database Developer
Experience: 6 – 8 years
Department: Information Technology
Classification: Full-Time
Location: Hybrid in Hyderabad, India (3 days onsite and 2 days remote)
Job Description:
We are seeking a highly skilled Senior Database Developer to lead the design, development, and optimization of enterprise-grade databases using Microsoft SQL Server. The ideal candidate will also have hands-on experience with MS Excel and MS Access, particularly in building data models, reports, and automation tools that support business operations.
Job Responsibilities:
- Design, develop, and maintain complex SQL Server databases, stored procedures, views, and functions.
- Optimize database performance through indexing, query tuning, and capacity planning.
- Develop and maintain Excel-based tools and reports using advanced formulas, pivot tables, and VBA macros.
- Modernize legacy MS Access applications and migrate data to SQL Server where appropriate.
- Collaborate with application developers to support integrations and data pipelines.
- Implement data governance, security, and backup strategies.
- Document database architecture, workflows, and technical specifications.
- Provide mentorship to junior developers and contribute to code reviews and best practices.
Required Skills:
Bachelor’s degree in Computer Science, Information Technology, or a related field.
5–8 years of experience in database development using SQL Server.
Proficiency in T-SQL, SSIS, SSRS, and SQL Server Management Studio (SSMS).
Strong experience with MS Excel (including VBA) and MS Access.
Familiarity with data migration, ETL processes, and job scheduling.
Excellent analytical, problem-solving, and communication skills.
Preferred Skills:
Experience with cloud platforms like Azure SQL or AWS RDS.
Knowledge of Power BI, SharePoint, or other reporting tools.
Experience with MySQL.
Exposure to Agile methodologies and DevOps practices.
Exposure to insurance, consumer benefits, or COBRA domains.
Job Role: Teamcenter Admin
• Teamcenter and CAD (NX) Configuration Management
• Advanced debugging and root-cause analysis beyond L2
• Code fixes and minor defect remediation
• AWS knowledge, which is foundational to our Teamcenter architecture
• Experience supporting weekend and holiday code deployments
• Operational administration (break/fix, ticket escalations, problem management)
• Support for project activities
• Deployment and code release support
• Hypercare support following deployment, which is expected to onboard approximately 1,000+ additional users
Hands-on experience in leading the R2R function for a US-based company (restaurant accounting).
• Should have experience in R365, US GAAP accounting, and financial statement preparation.
• Should be able to lead client meetings and business review calls.
• Work with US counterparts in driving key process initiatives.
• Manage and publish daily, weekly, and monthly performance scorecards.
• Manage and own the process SLAs agreed with the client.
• Able to interpret financial statements to help the executive team make decisions.
• Conduct monthly, quarterly, and annual one-on-ones with the team and perform year-end appraisals and performance management.
• Should be an individual contributor.
• Managing client experience.
• Team management skills.
• Subject matter expert.
B.Com graduate with 7–8 years of work experience, including 3–5 years of team and client management experience.
• Strong analytical and problem-solving skills.
• Proactive, takes initiative, self-motivated, team player.
• Strong stakeholder management and interpersonal skills.
• Extensive understanding of financial trends both within the company and in general market patterns.
• Business acumen, analytical approach, and understanding of general business development and operations.
• Prior experience in a similar BPO/shared-service function of an MNC.
• Should have experience in US accounting.
• Should have excellent communication skills and client interaction experience.
• Maintain the general ledger and post transactions in R365.
• Strong willingness to learn and grow.
• Should be willing to work in US shifts.
• Can join immediately.
• Should work from office.
Hiring for Salesforce Project Manager
Exp: 9–13 yrs
Work Location: Hyderabad (Hybrid)
Edu: BE/B.Tech
Work Timings: 11 AM – 8 PM
Skills :
Proven experience managing Salesforce implementation or enhancement projects.
Strong understanding of the Salesforce platform, preferably Sales Cloud and/or Service Cloud.
Knowledge of Salesforce development lifecycle, integration concepts, APIs, security model, and release management.
Excellent communication, presentation, and stakeholder management skills.
Strong analytical, problem-solving, and decision-making abilities.
Experience leading cross-functional teams in complex project environments.
JOB DETAILS:
* Job Title: Specialist I - DevOps Engineering
* Industry: Global Digital Transformation Solutions Provider
* Salary: Best in Industry
* Experience: 7-10 years
* Location: Bengaluru (Bangalore), Chennai, Hyderabad, Kochi (Cochin), Noida, Pune, Thiruvananthapuram
Job Description
Job Summary:
As a DevOps Engineer focused on Perforce to GitHub migration, you will be responsible for executing seamless and large-scale source control migrations. You must be proficient with GitHub Enterprise and Perforce, possess strong scripting skills (Python/Shell), and have a deep understanding of version control concepts.
The ideal candidate is a self-starter, a problem-solver, and thrives on challenges while ensuring smooth transitions with minimal disruption to development workflows.
Key Responsibilities:
- Analyze and prepare Perforce repositories — clean workspaces, merge streams, and remove unnecessary files.
- Handle large files efficiently using Git Large File Storage (LFS) for files exceeding GitHub’s 100MB size limit.
- Use git-p4 fusion (Python-based tool) to clone and migrate Perforce repositories incrementally, ensuring data integrity.
- Define migration scope — determine how much history to migrate and plan the repository structure.
- Manage branch renaming and repository organization for optimized post-migration workflows.
- Collaborate with development teams to determine migration points and finalize migration strategies.
- Troubleshoot issues related to file sizes, Python compatibility, network connectivity, or permissions during migration.
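The large-file step above can be sketched in plain Python: a hedged example (function names are illustrative, not part of any migration tool) that scans a checked-out Perforce workspace for files over GitHub's 100 MB per-file limit and derives `git lfs track` patterns by extension.

```python
import os

# GitHub rejects individual files larger than 100 MB; Git LFS is the usual workaround.
LIMIT_BYTES = 100 * 1024 * 1024

def find_lfs_candidates(root, limit=LIMIT_BYTES):
    """Walk a checked-out workspace and list files whose size exceeds `limit`."""
    oversized = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getsize(path) > limit:
                oversized.append(path)
    return oversized

def lfs_track_patterns(paths):
    """Derive one `git lfs track` glob per file extension found among `paths`."""
    exts = {os.path.splitext(p)[1] for p in paths if os.path.splitext(p)[1]}
    return sorted(f"*{ext}" for ext in exts)
```

In a real migration the resulting patterns would feed `git lfs track` (and `.gitattributes`) before history is imported, so oversized revisions land in LFS rather than being rejected at push time.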
Required Qualifications:
- Strong knowledge of Git/GitHub and preferably Perforce (Helix Core) — understanding of differences, workflows, and integrations.
- Hands-on experience with P4-Fusion.
- Familiarity with cloud platforms (AWS, Azure) and containerization technologies (Docker, Kubernetes).
- Proficiency in migration tools such as git-p4 fusion — installation, configuration, and troubleshooting.
- Ability to identify and manage large files using Git LFS to meet GitHub repository size limits.
- Strong scripting skills in Python and Shell for automating migration and restructuring tasks.
- Experience in planning and executing source control migrations — defining scope, branch mapping, history retention, and permission translation.
- Familiarity with CI/CD pipeline integration to validate workflows post-migration.
- Understanding of source code management (SCM) best practices, including version history and repository organization in GitHub.
- Excellent communication and collaboration skills for cross-team coordination and migration planning.
- Proven practical experience in repository migration, large file management, and history preservation during Perforce to GitHub transitions.
Skills: Github, Kubernetes, Perforce, Perforce (Helix Core), Devops Tools
Must-Haves
Git/GitHub (advanced), Perforce (Helix Core) (advanced), Python/Shell scripting (strong), P4-Fusion (hands-on experience), Git LFS (proficient)
Role & Responsibilities
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
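For illustration, the reflection optimization mentioned above can be sketched in plain Python. This is not Dremio's API; it only shows the idea behind an aggregation reflection: a precomputed rollup that answers repeat GROUP BY queries without rescanning the raw rows.

```python
from collections import defaultdict

class AggregationReflection:
    """Toy model of a Dremio-style aggregation reflection: a rollup of one
    measure by one group key, refreshed periodically from the raw data."""

    def __init__(self, rows, group_key, measure):
        self.group_key = group_key
        self.measure = measure
        self._totals = defaultdict(float)
        self.refresh(rows)

    def refresh(self, rows):
        """Rebuild the rollup (Dremio refreshes reflections on a schedule)."""
        self._totals.clear()
        for row in rows:
            self._totals[row[self.group_key]] += row[self.measure]

    def query(self, key):
        """Serve an aggregate from the rollup instead of scanning raw rows."""
        return self._totals.get(key, 0.0)
```

The trade-off this sketch captures is the one an architect tunes in practice: reflections cost refresh time and storage in exchange for fast, repeatable reads by BI consumers.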
Ideal Candidate
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
Review Criteria:
- Strong Dremio / Lakehouse Data Architect profile
- 5+ years of experience in Data Architecture / Data Engineering, with minimum 3+ years hands-on in Dremio
- Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
- Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
- Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
- Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
- Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
- Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
- Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments
Role Summary
We are a growing technology services company working on product engineering and growth marketing projects. We are looking for an Operations Manager who can bring structure, accountability, and execution discipline across teams.
Key Responsibilities
- Drive day-to-day operational execution across tech and marketing teams
- Track project timelines, team bandwidth, and delivery commitments
- Identify bottlenecks and ensure timely resolution
- Improve internal processes for better predictability
- Coordinate between leadership, delivery teams, and clients
- Ensure consistent status reporting and follow-ups
Requirements
- 2–5 years of experience in operations / project coordination / delivery
- Experience in startup, agency, or services-based companies preferred
- Strong execution mindset
- Comfortable handling multiple parallel projects
- Familiar with tools like Jira, ClickUp, Asana, or similar
Location & Compensation
Hyderabad (On-site / Hybrid). Compensation aligned to experience.
Role Summary
We are hiring a hands-on IT Project Manager to manage end-to-end execution of software development projects.
Key Responsibilities
Own delivery timelines for web and product development projects
Break requirements into actionable tasks and milestones
Coordinate with developers, designers, and QA
Manage scope, risks, and dependencies
Conduct client status calls and ensure expectation alignment
Ensure projects are delivered on time and within scope
Requirements
2–5 years of IT project management experience
Experience handling client-facing software projects
Strong understanding of SDLC
Experience with tools like Jira, ClickUp, Asana, etc.
Strong communication and stakeholder management skills
Location & Compensation
Hyderabad (On-site preferred). Compensation as per experience
Job Title: AAA Implementation Engineer
Location: REMOTE
Duration: 12+ months
The AAA Implementation Engineer is responsible for delivering technical implementation services that support the evolution and ongoing maintenance of the AAA infrastructure. This includes involvement in a variety of projects, system upgrades, service and feature enhancements, as well as remediation and break-fix activities. All work must adhere to the organization’s current architectural standards, technology roadmaps, governance, and change management policies. While the primary focus is on implementation and validation engineering, the role also requires a strong understanding of design engineering, AAA policies, security posture, protocols, and cluster deployment and maintenance. The AAA Implementation Engineer will collaborate directly with both internal and external stakeholders—including Architecture, Product Engineering, Design/Implementation Engineering, Change Management, Service and Product Management, Finance, Business Management, and Operations teams—as well as various levels of senior management.
Key Responsibilities:
• Attends project meetings as needed.
• Creates change documents and change records.
• Schedules changes with the assigned engineer.
• Ensures change documents are peer reviewed and approved.
• Represents change records (CRQs) on various calls.
• Represents changes at the weekly regional pre-CAB.
• Socializes changes with regional peer teams so they can represent changes in their region.
• Coordinates with other teams for change knowledge transfer.
• Follows up with testers / testing coordination for all changes.
• Ensures peer reviews are attached to CRQs and chases approvals.
• Attends various change review calls, including the weekly internal AAA change call.
• Reviews test plans and results; able to assist in driving to root cause.
• Collaborate with other internal/external Bank teams such as Operations, Engineering, and requestors on core design requirements/standards and risk assessment.
• Leverage designated tools and resources to create NCDs that will drive implementation during a pre-approved change window as necessary.
• Ensure initiatives/changes are well defined with success criteria, ownership, and realistic but firm schedules.
• Rehearse changes in the lab.
• Works during weekends to implement changes. Low risk changes can be performed during the week.
• Identifies and mitigates risks associated with the change.
• Ensures changes are user acceptance tested and authentication logs are verified post-implementation.
• Builds, updates, and sends Change Communication templates for weekend changes.
• Works with release managers to create changes.
• Updates the schedule as changes are completed and new work orders are added.
• Coordinates with vendors during changes if devices need to be swapped or any type of datacenter local onsite support is needed.
• Creates work orders and other requests to engage Blackbox and to drive device, firewall, and IP services updates.
• Validates changes by working with users during user acceptance testing, creating and implementing test plans (automated and manual), and verifying logs and test results.
Preferred Experience and Attributes
• Strong subject matter expertise across various enterprise identity authentication technologies ranging from AAA (RADIUS/TACACS), 802.1X technologies (Wired/Wireless), RSA and token-based systems.
• Experience with Aruba ClearPass Policy Server or Cisco Identity Services Engine (ISE) is required.
• Experience with Network Access Control (NAC) 802.1X for Wired and Wireless networks is required.
• Experience working with SSL Certificate Authorities and certificate management.
• Strong experience and detailed technical knowledge in security engineering, system and network security, authentication and authorization protocols, cryptography, application security, load balancing.
• Experience with tools such as Splunk, Excel, ideally experience in automation.
• Expert understanding of network protocols: TCP/IP, HTTP, HTTPS, SSL, TLS, 802.1X, etc.
• Experience with testing and change validation, root cause analysis, risk mitigation, security assessments, and analysis of security threats, trends, and architecture preferred.
• Experience with Remote Access (VPN posture) is preferred.
• Experience with Secure Cloud Analytics (Stealthwatch) is preferred.
• Project Management, ITSM
• Experience with Change Management and CAB processes and procedures.
• Focused on execution, delivery, and commitment to dates. Ability to work in a high-paced environment. Can manage risk - is a good decision maker. Understands the big picture; ability to relate to the firm’s strategy and actions and how they support our business results.
• Leadership: be a self-starter, self-directed and show initiative.
• Demonstrates ownership: Is accountable and influential/can hold others accountable (professionally).
• Strong written and verbal communications skills. Ability to communicate and influence upward as well as laterally.
• Organized and detail oriented.
• Familiarity with working in regulated and/or large global enterprises is a plus
Requirements:
• Bachelor’s degree in engineering, computer science, business, finance or related field/technical training. Post Graduate Degree a plus
• Must have strong analytical skills.
• Minimum of 8–12 years’ experience required in a technical role supporting network projects/programs.
• Experience with: ClearPass, Stealthwatch, ISE, AAA, Splunk, load balancing, captive portals, NA3RC, automation, network configuration, certificates, cluster build, upgrade, and configuration.
• Working knowledge of Excel and MS Project
• Financial services (Insurance, Banking, Investment banking), is a plus.
• Ability to be nimble and flexible; prioritize workload, proactively react to issues, and adapt to shifting deadlines.
• Ability to work weekends (as needed) for migration work
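As one concrete example of the AAA protocol knowledge this role assumes, RFC 2865 defines the RADIUS Response Authenticator: an MD5 digest over the reply header, the Request Authenticator from the matching request, the reply attributes, and the shared secret, which lets a client verify the reply came from a server holding the secret. A minimal sketch:

```python
import hashlib
import struct

def response_authenticator(code, identifier, length, request_auth, attributes, secret):
    """Compute the RFC 2865 Response Authenticator.

    code/identifier/length: fields of the reply packet header.
    request_auth: 16-byte Request Authenticator from the matching request.
    attributes: the reply's attribute bytes; secret: the shared secret.
    """
    # Header is Code (1 byte), Identifier (1 byte), Length (2 bytes, network order).
    header = struct.pack("!BBH", code, identifier, length)
    return hashlib.md5(header + request_auth + attributes + secret).digest()
```

Validating this digest (and the analogous Message-Authenticator attribute) is exactly the kind of check that comes up when debugging failed authentications between ClearPass/ISE clusters and network devices.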
Role Summary
We are looking for a Brand Manager – Digital & Technology Accounts who will own client communication, ensure delivery visibility, and coordinate seamlessly across marketing and tech teams.
Key Responsibilities
- Act as the primary point of contact for assigned client accounts
- Conduct regular status calls and share structured progress updates
- Work closely with tech, design, and marketing teams
- Track tasks and ensure timely delivery
- Maintain documentation of requirements and changes
- Identify upsell opportunities in collaboration with leadership
Requirements
- 2+ years of experience in digital agency / marketing client servicing
- Strong communication and stakeholder management skills
- Experience handling multiple client accounts
- Ability to coordinate across creative, marketing, and tech teams
- Strong follow-up discipline and attention to detail
Location & Compensation
Hyderabad (On-site / Hybrid). Compensation based on experience and fit.
Experience with AWS RDS (PostgreSQL) and related services like ECS/Fargate, Lambda, S3, Route53.
Proven ability to execute zero or low-downtime schema changes and large-table migrations.
Familiarity with infrastructure-as-code tools (Terraform, CloudFormation) and CI/CD pipelines.
Solid knowledge of security practices — RLS, RBAC, secret management, and encryption standards
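The zero/low-downtime requirement above usually comes down to batched backfills: add the new column as nullable, then populate it in small transactions so no single statement locks the whole table. A minimal sketch, shown on SQLite for portability (the `users` table and `status` column are hypothetical; on RDS PostgreSQL the same loop would use keyset ranges and short transactions):

```python
import sqlite3

def backfill_in_batches(conn, batch_size=1000):
    """Populate a newly added nullable column in batches of `batch_size`,
    committing after each batch, until no NULL rows remain."""
    total = 0
    while True:
        cur = conn.execute(
            "UPDATE users SET status = 'active' "
            "WHERE id IN (SELECT id FROM users WHERE status IS NULL LIMIT ?)",
            (batch_size,),
        )
        conn.commit()
        if cur.rowcount == 0:
            break
        total += cur.rowcount
    return total
```

Keeping each batch small bounds lock duration and replication lag, which is what makes the schema change "low downtime" from the application's point of view.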

JOB DETAILS:
* Job Title: Lead II - Software Engineering - AWS, Apache Spark (PySpark/Scala), Apache Kafka
* Industry: Global digital transformation solutions provider
* Salary: Best in Industry
* Experience: 5-8 years
* Location: Hyderabad
Job Summary
We are seeking a skilled Data Engineer to design, build, and optimize scalable data pipelines and cloud-based data platforms. The role involves working with large-scale batch and real-time data processing systems, collaborating with cross-functional teams, and ensuring data reliability, security, and performance across the data lifecycle.
Key Responsibilities
ETL Pipeline Development & Optimization
- Design, develop, and maintain complex end-to-end ETL pipelines for large-scale data ingestion and processing.
- Optimize data pipelines for performance, scalability, fault tolerance, and reliability.
Big Data Processing
- Develop and optimize batch and real-time data processing solutions using Apache Spark (PySpark/Scala) and Apache Kafka.
- Ensure fault-tolerant, scalable, and high-performance data processing systems.
Cloud Infrastructure Development
- Build and manage scalable, cloud-native data infrastructure on AWS.
- Design resilient and cost-efficient data pipelines adaptable to varying data volume and formats.
Real-Time & Batch Data Integration
- Enable seamless ingestion and processing of real-time streaming and batch data sources (e.g., AWS MSK).
- Ensure consistency, data quality, and a unified view across multiple data sources and formats.
Data Analysis & Insights
- Partner with business teams and data scientists to understand data requirements.
- Perform in-depth data analysis to identify trends, patterns, and anomalies.
- Deliver high-quality datasets and present actionable insights to stakeholders.
CI/CD & Automation
- Implement and maintain CI/CD pipelines using Jenkins or similar tools.
- Automate testing, deployment, and monitoring to ensure smooth production releases.
Data Security & Compliance
- Collaborate with security teams to ensure compliance with organizational and regulatory standards (e.g., GDPR, HIPAA).
- Implement data governance practices ensuring data integrity, security, and traceability.
Troubleshooting & Performance Tuning
- Identify and resolve performance bottlenecks in data pipelines.
- Apply best practices for monitoring, tuning, and optimizing data ingestion and storage.
Collaboration & Cross-Functional Work
- Work closely with engineers, data scientists, product managers, and business stakeholders.
- Participate in agile ceremonies, sprint planning, and architectural discussions.
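The batch/stream unification responsibility above can be sketched in plain Python (a hypothetical dedup-by-latest-timestamp helper; a real pipeline would express the same logic in Spark, e.g. as a window over event time):

```python
def merge_latest(batch_rows, stream_rows):
    """Combine batch and streaming records into one consistent view, keeping
    only the latest version of each key. Rows are dicts carrying 'id', an
    event-time 'ts', and arbitrary payload fields."""
    latest = {}
    for row in list(batch_rows) + list(stream_rows):
        current = latest.get(row["id"])
        if current is None or row["ts"] > current["ts"]:
            latest[row["id"]] = row
    # Deterministic output order for downstream consumers.
    return sorted(latest.values(), key=lambda r: r["id"])
```

Running the dedup on event time rather than arrival order is what keeps the serving table consistent when late or replayed records arrive from Kafka.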
Skills & Qualifications
Mandatory (Must-Have) Skills
- AWS Expertise
- Hands-on experience with AWS Big Data services such as EMR, Managed Apache Airflow, Glue, S3, DMS, MSK, and EC2.
- Strong understanding of cloud-native data architectures.
- Big Data Technologies
- Proficiency in PySpark or Scala Spark and SQL for large-scale data transformation and analysis.
- Experience with Apache Spark and Apache Kafka in production environments.
- Data Frameworks
- Strong knowledge of Spark DataFrames and Datasets.
- ETL Pipeline Development
- Proven experience in building scalable and reliable ETL pipelines for both batch and real-time data processing.
- Database Modeling & Data Warehousing
- Expertise in designing scalable data models for OLAP and OLTP systems.
- Data Analysis & Insights
- Ability to perform complex data analysis and extract actionable business insights.
- Strong analytical and problem-solving skills with a data-driven mindset.
- CI/CD & Automation
- Basic to intermediate experience with CI/CD pipelines using Jenkins or similar tools.
- Familiarity with automated testing and deployment workflows.
Good-to-Have (Preferred) Skills
- Knowledge of Java for data processing applications.
- Experience with NoSQL databases (e.g., DynamoDB, Cassandra, MongoDB).
- Familiarity with data governance frameworks and compliance tooling.
- Experience with monitoring and observability tools such as AWS CloudWatch, Splunk, or Dynatrace.
- Exposure to cost optimization strategies for large-scale cloud data platforms.
Skills: Big Data, Scala Spark, Apache Spark, ETL pipeline development
Notice period - 0 to 15 days only
Job stability is mandatory
Location: Hyderabad
Note: If a candidate is a short joiner, based in Hyderabad, and fits within the approved budget, we will proceed with an offer
F2F Interview: 14th Feb 2026
Hybrid model: 3 days in office.
We are seeking a seasoned Senior Developer to join our team. The ideal candidate is a C# expert who doesn't just write code but understands how to orchestrate complex business processes using the Microsoft ecosystem. You will be responsible for building scalable backend services, optimizing SQL databases, and leveraging Azure and Power Automate to deliver end-to-end automation solutions.
Responsibilities:
- Design and maintain robust, high-performance applications using C# and .NET Core.
- Write complex SQL queries, stored procedures, and optimize database schemas for performance and security.
- Deploy and manage cloud resources within Azure (App Services, Functions, Logic Apps).
- Design enterprise-level automated workflows using Microsoft Power Automate, including custom connectors to bridge the gap between Power Platform and legacy APIs.
- Provide technical mentorship, conduct code reviews, and ensure best practices in the Software Development Life Cycle (SDLC).
Technical Skills:
- C# / .NET: 8+ years of deep expertise in ASP.NET MVC, Web API, and Entity Framework.
- Database: Advanced proficiency in SQL Server.
- Azure: Hands-on experience with Azure cloud architecture and integration services.
- Power Automate: Proven experience building complex flows, handling error logic, and integrating Power Automate with custom-coded environments.
- DevOps: Familiarity with CI/CD pipelines (Azure DevOps or GitHub Actions).
Company Description: Bits in Glass - India
- Industry Leader:
- Bits in Glass (BIG) has been in business for more than 20 years. In 2021, Bits in Glass joined hands with Crochet Technologies, forming a larger organization under the Bits in Glass brand to better serve customers across the globe.
- Offices across three locations in India: Pune, Hyderabad & Chandigarh.
- Specialized Pega partner since 2017, delivering Pega solutions with deep industry expertise and experience.
- Proudly ranked among the top 30 Pega partners, Bits In Glass has been one of the very few sponsors of the annual PegaWorld event.
- Elite Appian partner since 2008, delivering Appian solutions with deep industry expertise and experience.
- Operating in the United States, Canada, United Kingdom, and India.
- Dedicated global Pega CoE to support our customers and internal dev teams.
- Specializes in Databricks, AI, and cloud-based data engineering to help companies transition from manual to automated workflows.
- Employee Benefits:
- Career Growth: Opportunities for career advancement and professional development.
- Challenging Projects: Work on innovative, cutting-edge projects that make a global impact.
- Global Exposure: Collaborate with international teams and clients to broaden your professional network.
- Flexible Work Arrangements: Support for work-life balance through flexible working conditions.
- Comprehensive Benefits: Competitive compensation packages and comprehensive benefits, including health insurance and paid time off.
- Learning Opportunities: A great opportunity to upskill yourself and work on new technologies like AI-enabled Pega solutions, data engineering, integration, cloud migration, etc.
- Company Culture:
- Collaborative Environment: Emphasizes teamwork, innovation, and knowledge sharing.
- Inclusive Workplace: Values diversity and fosters an inclusive environment where all ideas are respected.
- Continuous Learning: Encourages professional development through ongoing learning opportunities and certifications.
- Core Values:
- Integrity: Commitment to ethical practices and transparency in all business dealings.
- Excellence: Strive for the highest standards in everything we do.
- Client-Centric Approach: Focus on delivering the best solutions tailored to client needs.
Role Overview
We are hiring on behalf of Humming Apps Technologies LLP, which is seeking a Senior Threat Modeler to join its security team and act as a strategic bridge between architecture and defense. This role focuses on proactively identifying vulnerabilities during the design phase to ensure applications, APIs, and cloud infrastructures are secure by design.
The position requires thinking from an attacker’s perspective to analyze trust boundaries, map attack paths, and influence the overall security posture of next-generation AI-driven and cloud-native systems. The goal is not only to detect issues but to prevent risks before implementation.
Key Responsibilities
Architectural Analysis
• Lead deep-dive threat modeling sessions across applications, APIs, microservices, and cloud-native environments
• Perform detailed reviews of system architecture, data flows, and trust boundaries
Threat Modeling Frameworks & Methodologies
• Apply industry-standard frameworks including STRIDE, PASTA, ATLAS, and MITRE ATT&CK
• Identify sophisticated attack vectors and model realistic threat scenarios
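STRIDE is a review framework rather than an algorithm, but many sessions are seeded by pairing every element of the data-flow diagram with each of its six threat categories. A minimal sketch (component names are hypothetical):

```python
# Illustrative only: generate STRIDE review prompts for a set of
# data-flow-diagram elements.

STRIDE = [
    "Spoofing", "Tampering", "Repudiation",
    "Information disclosure", "Denial of service",
    "Elevation of privilege",
]

def seed_threats(components):
    """Cross every DFD element with each STRIDE category as a prompt."""
    return [(c, s) for c in components for s in STRIDE]

prompts = seed_threats(["api-gateway", "orders-db"])
print(len(prompts))  # 12 prompts (2 components x 6 categories)
print(prompts[0])    # ('api-gateway', 'Spoofing')
```

Each (component, category) pair then becomes a question for the session: "Can this element be spoofed? Tampered with?" and so on, which is how the frameworks above turn architecture diagrams into concrete threat scenarios.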
Security Design & Risk Mitigation
• Detect weaknesses during the design stage
• Provide actionable and prioritized mitigation recommendations
• Strengthen security posture through secure-by-design principles
Collaborative Security Integration
• Work closely with architects and developers during design and build phases
• Embed security practices directly into the SDLC
• Ensure security is incorporated early rather than retrofitted
Communication & Enablement
• Facilitate threat modeling demonstrations and walkthroughs
• Present findings and risk assessments to stakeholders
• Translate complex technical risks into clear, business-relevant insights
• Educate teams on secure design practices and emerging threats
Required Qualifications
Experience
• 5–10 years of dedicated experience in threat modeling, product security, or application security
Technical Expertise
• Strong understanding of software architecture and distributed systems
• Experience designing and securing RESTful APIs
• Hands-on knowledge of cloud platforms such as AWS, Azure, or GCP
Modern Threat Knowledge
• Expertise in current attack vectors including OWASP Top 10
• Understanding of API-specific threats
• Awareness of emerging risks in AI/LLM-based applications
Tools & Practices
• Practical experience with threat modeling tools
• Proficiency in technical diagramming and system visualization
Communication
• Excellent written and verbal English communication skills
• Ability to collaborate across engineering teams and stakeholders in different time zones
Preferred Qualifications
• Experience in consulting or client-facing professional services roles
• Industry certifications such as CISSP, CSSLP, OSCP, or equivalent
Customer Support & Quality Assurance Executive
Location: Hyderabad (Onsite)
Experience: 1–3 years in QA, tech support, or similar role
Department: Product & Customer Success
The Opportunity
At WINIT, we don’t just build products — we deliver experiences our customers love. As a Customer Support & Quality Assurance Executive, you’ll play a dual role: ensuring our solutions meet the highest quality standards and being the friendly, capable voice that helps customers get the most out of our technology.
This is a perfect role if you enjoy solving problems, improving processes, and making customers feel supported — while also working hands-on with cutting-edge enterprise software. You’ll also have the opportunity to use AI tools like ChatGPT, AI-powered testing assistants, and automation platforms to work smarter, resolve queries faster, and improve efficiency across both QA and support.
What You’ll Do
Quality Assurance (QA)
● Review and analyze product specifications and user requirements to ensure complete understanding.
● Design, execute, and maintain test cases for web and mobile applications.
● Log, track, and manage bugs; work closely with developers to ensure timely fixes.
● Conduct regression, functional, and usability testing to ensure every release is rock-solid.
● Use AI-powered testing tools to generate test scenarios, identify edge cases, and speed up validation.
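Most regression suites boil down to table-driven test cases that probe boundaries and invalid inputs. A minimal sketch (`validate_quantity` and its 1-100 rule are hypothetical, just to show the shape):

```python
# Table-driven test-case sketch: boundary values, just-outside values,
# and invalid types, each paired with the expected result.

def validate_quantity(qty):
    """Hypothetical rule: order quantities are whole numbers from 1 to 100."""
    return isinstance(qty, int) and 1 <= qty <= 100

cases = [
    (1, True), (100, True),      # boundaries pass
    (0, False), (101, False),    # just outside fail
    (-5, False), ("3", False),   # negative and wrong-type inputs fail
]
failures = [(q, exp) for q, exp in cases if validate_quantity(q) != exp]
print(failures)  # [] -> every case behaves as expected
```

The same table can be fed into a test runner (e.g., a pytest parameterized test) so each row is reported as its own pass or fail.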
Customer Support
● Provide timely, professional assistance via email, chat, or phone to global customers.
● Manage and support multiple customers simultaneously, prioritizing effectively.
● Use AI-driven knowledge bases and tools to quickly resolve common queries.
● Document and escalate complex issues to the right teams for resolution.
● Help onboard new customers by guiding them through key features and best practices.
● Collect feedback, identify recurring pain points, and share insights with the product team.
What You Bring
● Any bachelor’s degree — we value skills and attitude over specific majors.
● 1–3 years of experience in QA, customer support, or a similar technical/customer-facing role (SaaS/B2B tech experience preferred).
● Excellent English communication skills (verbal & written).
● Ability to handle multiple customers and tickets simultaneously while staying organized.
● Strong understanding of QA processes and familiarity with bug tracking tools (JIRA, TestRail, etc.).
● Experience with support platforms like Zendesk, Freshdesk, or Intercom is a plus.
● Familiarity with AI productivity tools for testing, ticket triage, and customer communications.
● A proactive, problem-solving mindset and the ability to manage multiple priorities.
Why WINIT
● Be part of a global leader in AI-powered Sales & Distribution solutions.
● Work in a role that blends technical expertise with customer interaction — no two days are the same.
● Learn and apply the latest AI tools to improve your efficiency and impact.
● Collaborate with talented teams in a culture that values innovation and continuous improvement.
● Competitive salary + growth opportunities within QA, Customer Success, or Product teams.
If you’re ready to combine your eye for quality with your passion for helping multiple customers succeed — and do it in an AI-first environment — we’d love to meet you.