Microsoft Windows Azure Jobs in Bangalore (Bengaluru)
Like us, you'll be deeply committed to delivering impactful outcomes for customers.
- 7+ years of demonstrated ability to develop resilient, high-performance, and scalable code tailored to application usage demands.
- Ability to lead by example with hands-on development while managing project timelines and deliverables. Experience in agile methodologies and practices, including sprint planning and execution, to drive team performance and project success.
- Deep expertise in Node.js, with experience in building and maintaining complex, production-grade RESTful APIs and backend services.
- Experience writing batch/cron jobs using Python and shell scripting (see the sketch after this list).
- Experience in web application development using JavaScript and JavaScript libraries.
- Basic understanding of TypeScript, JavaScript, HTML, CSS, JSON, and REST-based applications.
- Experience/familiarity with RDBMS and NoSQL database technologies such as MySQL, MongoDB, Redis, ElasticSearch, and similar databases.
- Understanding of code versioning tools such as Git.
- Understanding of building applications deployed on the cloud using Google Cloud Platform (GCP) or Amazon Web Services (AWS).
- Experience with JS-based build/package tools like Grunt, Gulp, Bower, Webpack.
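A minimal sketch of the kind of Python batch job referenced above, written to be scheduled from cron; the SQLite database, `sessions` table, and paths are hypothetical stand-ins for the real datastore:

```python
#!/usr/bin/env python3
"""Hypothetical nightly cleanup job, e.g. scheduled via cron:
0 2 * * * /usr/bin/python3 /opt/jobs/purge_sessions.py >> /var/log/purge_sessions.log 2>&1
"""
import logging
import sqlite3  # stand-in for the real datastore
from datetime import datetime, timedelta, timezone

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")

def purge_expired_sessions(db_path: str, max_age_days: int = 30) -> int:
    """Delete session rows older than max_age_days; returns the number removed."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=max_age_days)
    with sqlite3.connect(db_path) as conn:  # commits on success, rolls back on error
        cur = conn.execute(
            "DELETE FROM sessions WHERE last_seen < ?", (cutoff.isoformat(),)
        )
        return cur.rowcount

if __name__ == "__main__":
    removed = purge_expired_sessions("/opt/jobs/app.db")
    logging.info("purged %d expired sessions", removed)
```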

Global digital transformation solutions provider.
Role Proficiency:
Leverage expertise in a technology area (e.g. Java, Microsoft technologies, or Mainframe/legacy) to design system architecture.
Knowledge Examples:
- Domain/ Industry Knowledge: Basic knowledge of standard business processes within the relevant industry vertical and customer business domain
- Technology Knowledge: Demonstrates working knowledge of more than one technology area related to own area of work (e.g. Java/JEE 5+, Microsoft technologies, or Mainframe/legacy), the customer technology landscape, and multiple frameworks (Struts, JSF, Hibernate, etc.) within one technology area and their applicability. Considers low-level details such as data structures, algorithms, APIs, and libraries; best practices for one technology stack; configuration parameters for successful deployment; and configuration parameters for high performance within one technology stack
- Technology Trends: Demonstrates working knowledge of technology trends related to one technology stack and awareness of technology trends related to at least two technologies
- Architecture Concepts and Principles: Demonstrates working knowledge of standard architectural principles, models, and patterns (e.g. SOA, N-Tier, EDA, etc.) and perspectives (e.g. TOGAF, Zachman, etc.); integration architecture, including input and output components, existing integration methodologies and topologies, and source and external systems; non-functional requirements; data architecture; deployment architecture; and architecture governance
- Design Patterns, Tools and Principles: Applies specialized knowledge of design patterns, design principles, practices, and design tools. Knowledge of documenting design using tools like EA
- Software Development Process, Tools & Techniques: Demonstrates thorough knowledge of the end-to-end SDLC process (Agile and Traditional), SDLC methodology, programming principles, tools, and best practices (refactoring, code packaging, etc.)
- Project Management Tools and Techniques: Demonstrates working knowledge of project management processes (such as project scoping, requirements management, change management, risk management, quality assurance, disaster management, etc.) and tools (MS Excel, MPP, client-specific time sheets, capacity planning tools, etc.)
- Project Management: Demonstrates working knowledge of the project governance framework and RACI matrix, and basic knowledge of project metrics such as utilization, onsite-to-offshore ratio, span of control, fresher ratio, SLAs, and quality metrics
- Estimation and Resource Planning: Working knowledge of estimation and resource planning techniques (e.g. the TCP estimation model) and company-specific estimation templates
- Knowledge Management: Working knowledge of industry knowledge management tools (such as portals and wikis) and company and customer knowledge management tools and techniques (such as workshops, classroom training, self-study, application walkthroughs, and reverse KT)
- Technical Standards, Documentation & Templates: Demonstrates working knowledge of various document templates and standards (such as business blueprints, design documents, and test specifications)
- Requirement Gathering and Analysis: Demonstrates working knowledge of gathering non-functional requirements; analysis of functional and non-functional requirements; analysis tools (such as functional flow diagrams, activity diagrams, blueprints, and storyboards) and techniques (business analysis, process mapping, etc.); requirements management tools (e.g. MS Excel); and basic knowledge of functional requirements gathering. Specifically, identifies architectural concerns and documents them as part of IT requirements, including NFRs
- Solution Structuring: Demonstrates working knowledge of service offerings and products
Additional Comments:
Looking for a Senior Java Architect with 12+ years of experience. Key responsibilities include:
• Excellent technical background and end-to-end architecture skills to design and implement scalable, maintainable, and high-performing systems integrating front-end technologies with back-end services.
• Collaborate with front-end teams to architect React-based user interfaces that are robust, responsive, and aligned with the overall technical architecture.
• Expertise in cloud-based applications on Azure, leveraging key Azure services.
• Lead the adoption of DevOps practices, including CI/CD pipelines, automation, monitoring and logging to ensure reliable and efficient deployment cycles.
• Provide technical leadership to development teams, guiding them in building solutions that adhere to best practices, industry standards and customer requirements.
• Conduct code reviews to maintain high-quality code and collaborate with the team to ensure code is optimized for performance, scalability, and security.
• Collaborate with stakeholders to define requirements and deliver technical solutions aligned with business goals.
• Excellent communication skills
• Mentor team members providing guidance on technical challenges and helping them grow their skill set.
• Good to have: experience in GCP and the retail domain.
Skills: DevOps, Azure, Java
Must-Haves
Java (12+ years), React, Azure, DevOps, Cloud Architecture
Strong Java architecture and design experience.
Expertise in Azure cloud services.
Hands-on experience with React and front-end integration.
Proven track record in DevOps practices (CI/CD, automation).
Notice period - 0 to 15 days only
Location: Hyderabad, Chennai, Kochi, Bangalore, Trivandrum
Excellent communication and leadership skills.

Global digital transformation solutions provider.
Job Description
We are seeking a skilled Microsoft Dynamics 365 Developer with 4–7 years of hands-on experience in designing, customizing, and developing solutions within the Dynamics 365 ecosystem. The ideal candidate should have strong technical expertise, solid understanding of CRM concepts, and experience integrating Dynamics 365 with external systems.
Key Responsibilities
- Design, develop, and customize solutions within Microsoft Dynamics 365 CE.
- Work on entity schema, relationships, form customizations, and business logic components.
- Develop custom plugins, workflow activities, and automation.
- Build and enhance integrations using APIs, Postman, and related tools.
- Implement and maintain security models across roles, privileges, and access levels.
- Troubleshoot issues, optimize performance, and support deployments.
- Collaborate with cross-functional teams and communicate effectively with stakeholders.
- Participate in version control practices using Git.
Must-Have Skills
Core Dynamics 365 Skills
- Dynamics Concepts (Schema, Relationships, Form Customization): Advanced
- Plugin Development: Advanced (writing and optimizing plugins, calling actions, updating related entities)
- Actions & Custom Workflows: Intermediate
- Security Model: Intermediate
- Integrations: Intermediate (API handling, Postman, error handling, authorization & authentication, DLL merging; see the sketch after this list)
Coding & Versioning
- C# Coding Skills: Intermediate (Able to write logic using if-else, switch, loops, error handling)
- Git: Basic
Communication
- Communication Skills: Intermediate (Ability to clearly explain technical concepts and work with business users)
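As a rough illustration of the integration skills above, a minimal Python sketch that queries the Dataverse Web API is shown below. The org URL, the pre-acquired OAuth bearer token (e.g. obtained via MSAL), and the choice of the accounts entity are assumptions for illustration, not details from this posting:

```python
import requests

# Hypothetical org URL and a pre-acquired OAuth bearer token (e.g. via MSAL).
ORG = "https://contoso.api.crm.dynamics.com"
TOKEN = "<bearer-token>"

def get_accounts(top: int = 5) -> list[dict]:
    """Query the Dataverse Web API (OData) for a few account records."""
    resp = requests.get(
        f"{ORG}/api/data/v9.2/accounts",
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "OData-MaxVersion": "4.0",
            "OData-Version": "4.0",
            "Accept": "application/json",
        },
        params={"$select": "name,accountid", "$top": str(top)},
        timeout=30,
    )
    resp.raise_for_status()  # surface auth/throttling errors early
    return resp.json()["value"]

if __name__ == "__main__":
    for account in get_accounts():
        print(account["accountid"], account["name"])
```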
Good-to-Have Skills (Any 3 or More)
Azure & Monitoring
- Azure Functions: Basic (development, debugging, deployment)
- Azure Application Insights: Intermediate (querying logs, pushing logs)
Reporting & Data
- Power BI: Basic (building basic reports)
- Data Migration: Basic (data import with lookups, awareness of migration tools)
Power Platform
- Canvas Apps: Basic (building basic apps using Power Automate connector)
- Power Automate: Intermediate (flows & automation)
- PCF (PowerApps Component Framework): Basic
Skills: Microsoft Dynamics, JavaScript, Plugins
Must-Haves
Microsoft Dynamics 365 (4-7 years), Plugin Development (Advanced), C# (Intermediate), Integrations (Intermediate), Git (Basic)
Notice period - Immediate to 15 days
Locations: Bangalore only
About Us
At Arka Energy, we're redefining how renewable energy is experienced and adopted in homes. Our focus is on developing next-generation residential solar energy solutions through a unique combination of custom product design, intuitive simulation software, and high-impact technology. With engineering teams in Bangalore and the Bay Area, we’re committed to building innovative products that transform rooftops into smart energy ecosystems.
Our flagship product is a 3D simulation platform that models rooftops and commercial sites, allowing users to design solar layouts and generate accurate energy estimates — streamlining the residential solar design process like never before.
What We're Looking For
We're seeking a Senior DevOps Engineer who will be responsible for managing and automating cloud infrastructure and services, ensuring seamless integration and deployment of applications, and maintaining high availability and reliability. You will work closely with development and operations teams to streamline processes and enhance productivity.
Key Responsibilities
- Design and implement CI/CD pipelines using Azure DevOps.
- Automate infrastructure provisioning and configuration in the Azure cloud environment.
- Monitor and manage system health, performance, and security.
- Collaborate with development teams to ensure smooth and secure deployment of applications.
- Troubleshoot and resolve issues related to deployment and operations.
- Implement best practices for configuration management and infrastructure as code.
- Maintain documentation of processes and solutions.
Requirements
- Total relevant experience of 4 to 5 years.
- Proven experience as a DevOps Engineer, specifically with Azure.
- Experience with CI/CD tools and practices.
- Strong understanding of infrastructure as code (IaC) using tools like Terraform or ARM templates.
- Knowledge of scripting languages such as PowerShell or Python.
- Familiarity with containerization technologies like Docker and Kubernetes.
- Good to have: knowledge of AWS, DigitalOcean, GCP
- Excellent troubleshooting and problem-solving skills
- High ownership, self-starter attitude, and ability to work independently
- Strong aptitude and reasoning ability with a growth mindset
Nice to Have
- Experience working in a SaaS or product-driven startup
- Familiarity with the solar industry (preferred but not required)
Required Skills: CI/CD Pipeline, Data Structures, Microservices, Determining overall architectural principles, frameworks and standards, Cloud expertise (AWS, GCP, or Azure), Distributed Systems
Criteria:
- Candidate must have 6+ years of backend engineering experience, with 1–2 years leading engineers or owning major systems.
- Must be strong in one core backend language: Node.js, Go, Java, or Python.
- Deep understanding of distributed systems, caching, high availability, and microservices architecture.
- Hands-on experience with AWS/GCP, Docker, Kubernetes, and CI/CD pipelines.
- Strong command over system design, data structures, performance tuning, and scalable architecture
- Ability to partner with Product, Data, Infrastructure, and lead end-to-end backend roadmap execution.
Description
What This Role Is All About
We’re looking for a Backend Tech Lead who’s equally obsessed with architecture decisions and clean code, someone who can zoom out to design systems and zoom in to fix that one weird memory leak. You’ll lead a small but sharp team, drive the backend roadmap, and make sure our systems stay fast, lean, and battle-tested.
What You’ll Own
● Architect backend systems that handle India-scale traffic without breaking a sweat.
● Build and evolve microservices, APIs, and internal platforms that our entire app depends on.
● Guide, mentor, and uplevel a team of backend engineers—be the go-to technical brain.
● Partner with Product, Data, and Infra to ship features that are reliable and delightful.
● Set high engineering standards—clean architecture, performance, automation, and testing.
● Lead discussions on system design, performance tuning, and infra choices.
● Keep an eye on production like a hawk: metrics, monitoring, logs, uptime.
● Identify gaps proactively and push for improvements instead of waiting for fires.
What Makes You a Great Fit
● 6+ years of backend experience; 1–2 years leading engineers or owning major systems.
● Strong in one core language (Node.js / Go / Java / Python) — pick your sword.
● Deep understanding of distributed systems, caching, high availability, and microservices.
● Hands-on with AWS/GCP, Docker, Kubernetes, CI/CD pipelines.
● You think data structures and system design are not interviews — they’re daily tools.
● You write code that future-you won’t hate.
● Strong communication and a "let's figure this out" attitude.
Bonus Points If You Have
● Built or scaled consumer apps with millions of DAUs.
● Experimented with event-driven architecture, streaming systems, or real-time pipelines.
● Love startups and don’t mind wearing multiple hats.
● Experience on logging/monitoring tools like Grafana, Prometheus, ELK, OpenTelemetry.
Why company Might Be Your Best Move
● Work on products used by real people every single day.
● Ownership from day one—your decisions will shape our core architecture.
● No unnecessary hierarchy; direct access to founders and senior leadership.
● A team that cares about quality, speed, and impact in equal measure.
● Build for Bharat — complex constraints, huge scale, real impact.

Global digital transformation solutions provider.
Job Description – Senior Technical Business Analyst
Location: Trivandrum (Preferred) | Open to any location in India
Shift Timings - an 8-hour window between 7:30 PM IST and 4:30 AM IST
About the Role
We are seeking highly motivated and analytically strong Senior Technical Business Analysts who can work seamlessly with business and technology stakeholders to convert a one-line problem statement into a well-defined project or opportunity. This role is ideal for analysts with a strong foundation in data analytics, data engineering, data visualization, and data science, along with a strong drive to learn, collaborate, and grow in a dynamic, fast-paced environment.
As a Technical Business Analyst, you will be responsible for translating complex business challenges into actionable user stories, analytical models, and executable tasks in Jira. You will work across the entire data lifecycle—from understanding business context to delivering insights, solutions, and measurable outcomes.
Key Responsibilities
Business & Analytical Responsibilities
- Partner with business teams to understand one-line problem statements and translate them into detailed business requirements, opportunities, and project scope.
- Conduct exploratory data analysis (EDA) to uncover trends, patterns, and business insights (see the sketch after this list).
- Create documentation including Business Requirement Documents (BRDs), user stories, process flows, and analytical models.
- Break down business needs into concise, actionable, and development-ready user stories in Jira.
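A minimal pandas sketch of the EDA step referenced above, assuming a hypothetical `orders.csv` extract with `order_id`, `order_date`, and `amount` columns:

```python
import pandas as pd

# Load a hypothetical orders extract.
df = pd.read_csv("orders.csv", parse_dates=["order_date"])

# Shape, types, and data-quality overview.
print(df.shape)
print(df.dtypes)
print(df.isna().mean().sort_values(ascending=False).head())  # null rate per column
print(df.describe(include="number"))  # distribution of numeric columns

# Trend: monthly order volume and revenue.
monthly = (
    df.assign(month=df["order_date"].dt.to_period("M"))
      .groupby("month")
      .agg(orders=("order_id", "count"), revenue=("amount", "sum"))
)
print(monthly.tail(12))
```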
Data & Technical Responsibilities
- Collaborate with data engineering teams to design, review, and validate data pipelines, data models, and ETL/ELT workflows.
- Build dashboards, reports, and data visualizations using leading BI tools to communicate insights effectively.
- Apply foundational data science concepts such as statistical analysis, predictive modeling, and machine learning fundamentals.
- Validate and ensure data quality, consistency, and accuracy across datasets and systems.
Collaboration & Execution
- Work closely with product, engineering, BI, and operations teams to support the end-to-end delivery of analytical solutions.
- Assist in development, testing, and rollout of data-driven solutions.
- Present findings, insights, and recommendations clearly and confidently to both technical and non-technical stakeholders.
Required Skillsets
Core Technical Skills
- 6+ years of Technical Business Analyst experience within an overall professional experience of 8+ years
- Data Analytics: SQL, descriptive analytics, business problem framing.
- Data Engineering (Foundational): Understanding of data warehousing, ETL/ELT processes, cloud data platforms (AWS/GCP/Azure preferred).
- Data Visualization: Experience with Power BI, Tableau, or equivalent tools.
- Data Science (Basic/Intermediate): Python/R, statistical methods, fundamentals of ML algorithms.
Soft Skills
- Strong analytical thinking and structured problem-solving capability.
- Ability to convert business problems into clear technical requirements.
- Excellent communication, documentation, and presentation skills.
- High curiosity, adaptability, and eagerness to learn new tools and techniques.
Educational Qualifications
- BE/B.Tech or equivalent in:
- Computer Science / IT
- Data Science
What We Look For
- Demonstrated passion for data and analytics through projects and certifications.
- Strong commitment to continuous learning and innovation.
- Ability to work both independently and in collaborative team environments.
- Passion for solving business problems using data-driven approaches.
- Proven ability (or aptitude) to convert a one-line business problem into a structured project or opportunity.
Why Join Us?
- Exposure to modern data platforms, analytics tools, and AI technologies.
- A culture that promotes innovation, ownership, and continuous learning.
- Supportive environment to build a strong career in data and analytics.
Skills: Data Analytics, Business Analysis, Sql
Must-Haves
Technical Business Analyst (6+ years), SQL, Data Visualization (Power BI, Tableau), Data Engineering (ETL/ELT, cloud platforms), Python/R
Notice period - 0 to 15 days (Max 30 Days)
Review Criteria
- Strong Dremio / Lakehouse Data Architect profile
- 5+ years of experience in Data Architecture / Data Engineering, with minimum 3+ years hands-on in Dremio
- Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
- Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
- Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
- Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
- Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
- Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
- Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies
Preferred
- Preferred (Nice-to-have) – Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments
Job Specific Criteria
- CV Attachment is mandatory
- How many years of experience do you have with Dremio?
- Which is your preferred job location (Mumbai / Bengaluru / Hyderabad / Gurgaon)?
- Are you okay with 3 Days WFO?
- Virtual Interview requires video to be on, are you okay with it?
Role & Responsibilities
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
Ideal Candidate
- Bachelor’s or Master’s in Computer Science, Information Systems, or a related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
Key Responsibilities:
- Application Development: Design and implement both client-side and server-side architecture using JavaScript frameworks and back-end technologies like Golang.
- Database Management: Develop and maintain relational and non-relational databases (MySQL, PostgreSQL, MongoDB) and optimize database queries and schema design.
- API Development: Build and maintain RESTful APIs and/or GraphQL services to integrate with front-end applications and third-party services.
- Code Quality & Performance: Write clean, maintainable code and implement best practices for scalability, performance, and security.
- Testing & Debugging: Perform testing and debugging to ensure the stability and reliability of applications across different environments and devices.
- Collaboration: Work closely with product managers, designers, and DevOps engineers to deliver features aligned with business goals.
- Documentation: Create and maintain documentation for code, systems, and application architecture to ensure knowledge transfer and team alignment.
Requirements:
- Experience: 1+ years in backend development in a microservices ecosystem, with proven experience in front-end and back-end frameworks.
- 1+ years of experience with Golang is mandatory.
- Problem-Solving & DSA: Strong analytical skills and attention to detail.
- Front-End Skills: Proficiency in JavaScript and modern front-end frameworks (React, Angular, Vue.js) and familiarity with HTML/CSS.
- Back-End Skills: Experience with server-side languages and frameworks like Node.js, Express, Python, or Golang.
- Database Knowledge: Strong knowledge of relational databases (MySQL, PostgreSQL) and NoSQL databases (MongoDB).
- API Development: Hands-on experience with RESTful API design and integration, with a plus for GraphQL.
- DevOps Understanding: Familiarity with cloud platforms (AWS, Azure, GCP) and containerization (Docker, Kubernetes) is a bonus.
- Soft Skills: Excellent problem-solving skills, teamwork, and strong communication abilities.
Nice-to-Have:
- UI/UX Sensibility: Understanding of responsive design and user experience principles.
- CI/CD Knowledge: Familiarity with CI/CD tools and workflows (Jenkins, GitLab CI).
- Security Awareness: Basic understanding of web security standards and best practices.
ROLES AND RESPONSIBILITIES:
Standardization and Governance:
- Establishing and maintaining project management standards, processes, and methodologies.
- Ensuring consistent application of project management policies and procedures.
- Implementing and managing project governance processes.
Resource Management:
- Facilitating the sharing of resources, tools, and methodologies across projects.
- Planning and allocating resources effectively.
- Managing resource capacity and forecasting future needs.
Communication and Reporting:
- Ensuring effective communication and information flow among project teams and stakeholders.
- Monitoring project progress and reporting on performance.
- Communicating strategic work progress, including risks and benefits.
Project Portfolio Management:
- Supporting strategic decision-making by aligning projects with organizational goals.
- Selecting and prioritizing projects based on business objectives.
- Managing project portfolios and ensuring efficient resource allocation across projects.
Process Improvement:
- Identifying and implementing industry best practices into workflows.
- Improving project management processes and methodologies.
- Optimizing project delivery and resource utilization.
Training and Support:
- Providing training and support to project managers and team members.
- Offering project management tools, best practices, and reporting templates.
Other Responsibilities:
- Managing documentation of project history for future reference.
- Coaching project teams on implementing project management steps.
- Analysing financial data and managing project costs.
- Interfacing with functional units (Domain, Delivery, Support, DevOps, HR, etc.).
- Advising and supporting senior management.
IDEAL CANDIDATE:
- 3+ years of proven experience in Project Management roles with strong exposure to PMO processes, standards, and governance frameworks.
- Demonstrated ability to manage project status tracking, risk assessments, budgeting, variance analysis, and defect tracking across multiple projects.
- Proficient in Project Planning and Scheduling using tools like MS Project and Advanced Excel (e.g., Gantt charts, pivot tables, macros).
- Experienced in developing project dashboards, reports, and executive summaries for senior management and stakeholders.
- Active participant in Agile environments, attending and contributing to Scrum calls, sprint planning, and retrospectives.
- Holds a Bachelor’s degree in a relevant field (e.g., Engineering, Business, IT, etc.).
- Preferably familiar with Jira, Azure DevOps, and Power BI for tracking and visualization of project data.
- Exposure to working in product-based companies or fast-paced, innovation-driven environments is a strong advantage.
Senior Software Engineer
Location: Hyderabad, India
Who We Are:
Since our inception back in 2006, Navitas has grown to be an industry leader in the digital transformation space, and we’ve served as trusted advisors supporting our client base within the commercial, federal, and state and local markets.
What We Do:
At our very core, we’re a group of problem solvers providing our award-winning technology solutions to drive digital acceleration for our customers! With proven solutions, award-winning technologies, and a team of expert problem solvers, Navitas has consistently empowered customers to use technology as a competitive advantage and deliver cutting-edge transformative solutions.
What You’ll Do:
Build, Innovate, and Own:
- Design, develop, and maintain high-performance microservices in a modern .NET/C# environment.
- Architect and optimize data pipelines and storage solutions that power our AI-driven products.
- Collaborate closely with AI and data teams to bring machine learning models into production systems.
- Build integrations with external services and APIs to enable scalable, interoperable solutions.
- Ensure robust security, scalability, and observability across distributed systems.
- Stay ahead of the curve — evaluating emerging technologies and contributing to architectural decisions for our next-gen platform.
Responsibilities will include but are not limited to:
- Provide technical guidance and code reviews that raise the bar for quality and performance.
- Help create a growth-minded engineering culture that encourages experimentation, learning, and accountability.
What You’ll Need:
- Bachelor’s degree in Computer Science or equivalent practical experience.
- 8+ years of professional experience, including 5+ years designing and maintaining scalable backend systems using C#/.NET and microservices architecture.
- Strong experience with SQL and NoSQL data stores.
- Solid hands-on knowledge of cloud platforms (AWS, GCP, or Azure).
- Proven ability to design for performance, reliability, and security in data-intensive systems.
- Excellent communication skills and ability to work effectively in a global, cross-functional environment.
Set Yourself Apart With:
- Startup experience, specifically in building a product from 0 to 1
- Exposure to AI/ML-powered systems, data engineering, or large-scale data processing.
- Experience in healthcare or fintech domains.
- Familiarity with modern DevOps practices, CI/CD pipelines, and containerization (Docker/Kubernetes).
Equal Employer/Veterans/Disabled
Navitas Business Consulting is an affirmative action and equal opportunity employer. If reasonable accommodation is needed to participate in the job application or interview process, to perform essential job functions, and/or to receive other benefits and privileges of employment, please contact Navitas Human Resources.
Navitas is an equal opportunity employer. We provide employment and opportunities for advancement, compensation, training, and growth according to individual merit, without regard to race, color, religion, sex (including pregnancy), national origin, sexual orientation, gender identity or expression, marital status, age, genetic information, disability, veteran or military status, or any other characteristic protected under applicable Federal, state, or local law. Our goal is for each staff member to have the opportunity to grow to the limits of their abilities and to achieve personal and organizational objectives. We will support positive programs for equal treatment of all staff and full utilization of all qualified employees at all levels within Navitas.
Type: Client-Facing Technical Architecture, Infrastructure Solutioning & Domain Consulting (India + International Markets)
Role Overview
Tradelab is seeking a senior Solution Architect who can interact with both Indian and international clients (Dubai, Singapore, London, US), helping them understand our trading systems, OMS/RMS/CMS stack, HFT platforms, feed systems, and Matching Engine. The architect will design scalable, secure, and ultra-low-latency deployments tailored to global forex markets, brokers, prop firms, liquidity providers, and market makers.
Key Responsibilities
1. Client Engagement (India + International Markets)
- Engage with brokers, prop trading firms, liquidity providers, and financial institutions across India, Dubai, Singapore, and global hubs.
- Explain Tradelab’s capabilities, architecture, and deployment options.
- Understand region-specific latency expectations, connectivity options, and regulatory constraints.
2. Requirement Gathering & Solutioning
- Capture client needs, throughput, order concurrency, tick volumes, and market data handling.
- Assess infra readiness (cloud/on-prem/colo).
- Propose architecture aligned with forex markets.
3. Global Architecture & Deployment Design
- Design multi-region infrastructure using AWS/Azure/GCP.
- Architect low-latency routing between India–Singapore–Dubai.
- Support deployments in DCs like Equinix SG1/DX1.
4. Networking & Security Architecture
- Architect multicast/unicast feeds, VPNs, IPSec tunnels, BGP routes.
- Implement network hardening, segmentation, WAF/firewall rules.
5. DevOps, Cloud Engineering & Scalability
- Build CI/CD pipelines, Kubernetes autoscaling, cost-optimized AWS multi-region deployments.
- Design global failover models.
6. BFSI & Trading Domain Expertise
- Indian broking, international forex, LP aggregation, HFT.
- OMS/RMS, risk engines, LP connectivity, and matching engines.
7. Latency, Performance & Capacity Planning
- Benchmark and optimize cross-region latency.
- Tune performance for high tick volumes and volatility bursts.
8. Documentation & Consulting
- Prepare HLDs, LLDs, SOWs, cost sheets, and deployment playbooks.
Required Skills
- AWS: EC2, VPC, EKS, NLB, MSK/Kafka, IAM, Global Accelerator.
- DevOps: Kubernetes, Docker, Helm, Terraform.
- Networking: IPSec, GRE, VPN, BGP, multicast (PIM/IGMP).
- Message buses: Kafka, RabbitMQ, Redis Streams.
Domain Skills
- Deep Broking Domain Understanding.
- Indian broking + global forex/CFD.
- FIX protocol, LP integration, market data feeds.
- Regulations: SEBI, DFSA, MAS, ESMA.
Soft Skills
- Excellent communication and client-facing ability.
- Strong presales and solutioning mindset.
Preferred Qualifications
- B.Tech/BE/M.Tech in CS or equivalent.
- AWS Architect Professional, CCNP, CKA.
- Experience in colocation/global trading infra.
Why Join Us?
- Work with a team that expects and delivers excellence.
- A culture where risk-taking is rewarded, and complacency is not.
- Limitless opportunities for growth—if you can handle the pace.
- A place where learning is currency, and outperformance is the only metric that matters.
- The opportunity to build systems that move markets, execute trades in microseconds, and redefine fintech.
This isn’t just a job—it’s a proving ground. Ready to take the leap? Apply now.
🔧 Key Skills
- Strong expertise in Python (3.x)
- Experience with Django / Flask / FastAPI
- Good understanding of Microservices & RESTful API development (see the sketch after this list)
- Proficiency in MySQL/PostgreSQL – queries, stored procedures, optimization
- Solid grip on Data Structures & Algorithms (DSA)
- Comfortable working with Linux & Windows environments
- Hands-on experience with Git, CI/CD (Jenkins/GitHub Actions)
- Familiarity with Docker / Kubernetes is a plus
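As a small illustration of the microservices/REST requirement above, here is a minimal FastAPI sketch with an in-memory store standing in for MySQL/PostgreSQL; the service name and `Order` model are hypothetical:

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="orders-service")  # hypothetical microservice

class Order(BaseModel):
    id: int
    item: str
    quantity: int

_DB: dict[int, Order] = {}  # in-memory stand-in for MySQL/PostgreSQL

@app.post("/orders", status_code=201)
def create_order(order: Order) -> Order:
    if order.id in _DB:
        raise HTTPException(status_code=409, detail="order already exists")
    _DB[order.id] = order
    return order

@app.get("/orders/{order_id}")
def get_order(order_id: int) -> Order:
    order = _DB.get(order_id)
    if order is None:
        raise HTTPException(status_code=404, detail="order not found")
    return order
```

Run locally with `uvicorn app:app --reload` (assuming the file is saved as `app.py`).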
Key Responsibilities
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
Qualifications
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 10+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
Job Position: Lead II - Software Engineering
Domain: Information technology (IT)
Location: India - Thiruvananthapuram
Salary: Best in Industry
Job Positions: 1
Experience: 8 - 12 Years
Skills: .NET, SQL Azure, REST API, Vue.js
Notice Period: Immediate – 30 Days
Job Summary:
We are looking for a highly skilled Senior .NET Developer with a minimum of 7 years of experience across the full software development lifecycle, including post-live support. The ideal candidate will have a strong background in .NET backend API development, Agile methodologies, and Cloud infrastructure (preferably Azure). You will play a key role in solution design, development, DevOps pipeline enhancement, and mentoring junior engineers.
Key Responsibilities:
- Design, develop, and maintain scalable and secure .NET backend APIs.
- Collaborate with product owners and stakeholders to understand requirements and translate them into technical solutions.
- Lead and contribute to Agile software delivery processes (Scrum, Kanban).
- Develop and improve CI/CD pipelines and support release cadence targets, using Infrastructure as Code tools (e.g., Terraform).
- Provide post-live support, troubleshooting, and issue resolution as part of full lifecycle responsibilities.
- Implement unit and integration testing to ensure code quality and system stability.
- Work closely with DevOps and cloud engineering teams to manage deployments on Azure (Web Apps, Container Apps, Functions, SQL).
- Contribute to front-end components when necessary, leveraging HTML, CSS, and JavaScript UI frameworks.
- Mentor and coach engineers within a co-located or distributed team environment.
- Maintain best practices in code versioning, testing, and documentation.
Mandatory Skills:
- 7+ years of .NET development experience, including API design and development
- Strong experience with Azure Cloud services, including:
- Web/Container Apps
- Azure Functions
- Azure SQL Server
- Solid understanding of Agile development methodologies (Scrum/Kanban)
- Experience in CI/CD pipeline design and implementation
- Proficient in Infrastructure as Code (IaC) – preferably Terraform
- Strong knowledge of RESTful services and JSON-based APIs
- Experience with unit and integration testing techniques
- Source control using Git
- Strong understanding of HTML, CSS, and cross-browser compatibility
Good-to-Have Skills:
- Experience with Kubernetes and Docker
- Knowledge of JavaScript UI frameworks, ideally Vue.js
- Familiarity with JIRA and Agile project tracking tools
- Exposure to Database as a Service (DBaaS) and Platform as a Service (PaaS) concepts
- Experience mentoring or coaching junior developers
- Strong problem-solving and communication skills
Position Responsibilities:
About the Role
We are seeking a skilled and motivated Senior Software Developer to join our team, responsible for developing and maintaining a robust ERP solution used by approximately 400 customers and over 30,000 users worldwide. The system is built using C# (.NET Core), leverages SQL Server for data management, and is hosted in the Microsoft Azure cloud.
This role offers the opportunity to work on a mission-critical product, contribute to architectural decisions, and help shape the future of our cloud-native ERP platform.
Key Responsibilities
- Design, develop, and maintain features and modules within the ERP system using C# (.NET Core)
- Optimise and manage SQL Server database interactions for performance and scalability
- Collaborate with cross-functional teams, including QA, DevOps, Product Management, and Support
- Participate in code reviews, architecture discussions, and technical planning
- Contribute to the adoption and improvement of CI/CD pipelines and cloud deployment practices
- Troubleshoot and resolve complex technical issues across the stack
- Ensure code quality, maintainability, and adherence to best practices
- Stay current with emerging technologies and recommend improvements where applicable
Qualifications
- Curiosity, passion, teamwork, and initiative
- Strong experience with C# and .NET Core in enterprise application development
- Solid understanding of SQL Server, including query optimization and schema design
- Experience with Azure cloud services (App Services, Azure SQL, Storage, etc.)
- Ability to utilize agentic AI as development support, with a critical-thinking attitude
- Familiarity with agile development methodologies and DevOps practices
- Ability to work independently and collaboratively in a fast-paced environment
- Excellent problem-solving and communication skills
- Master's degree in computer science or equivalent; 5+ years of relevant work experience
- Experience with ERP systems or other complex business applications is a plus
What We Offer
- A chance to work on a product that directly impacts thousands of users worldwide
- A collaborative and supportive engineering culture
- Opportunities for professional growth and technical leadership
- Competitive salary and benefits package
Job Summary:
We are looking for a skilled and motivated Backend Engineer with 2 to 4 years of professional experience to join our dynamic engineering team. You will play a key role in designing, building, and maintaining the backend systems that power our products. You’ll work closely with cross-functional teams to deliver scalable, secure, and high-performance solutions that align with business and user needs.
This role is ideal for engineers ready to take ownership of systems, contribute to architectural decisions, and solve complex backend challenges.
Website: https://www.thealteroffice.com/about
Key Responsibilities:
- Design, build, and maintain robust backend systems and APIs that are scalable and maintainable.
- Collaborate with product, frontend, and DevOps teams to deliver seamless, end-to-end solutions.
- Model and manage data using SQL (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Redis), incorporating caching where needed (see the caching sketch after this list).
- Implement and manage authentication, authorization, and data security practices.
- Write clean, well-documented, and well-tested code following best practices.
- Work with cloud platforms (AWS, GCP, or Azure) to deploy, monitor, and scale services effectively.
- Use tools like Docker (and optionally Kubernetes) for containerization and orchestration of backend services.
- Maintain and improve CI/CD pipelines for faster and safer deployments.
- Monitor and debug production issues, using observability tools (e.g., Prometheus, Grafana, ELK) for root cause analysis.
- Participate in code reviews, contribute to improving development standards, and provide support to less experienced engineers.
- Work with event-driven or microservices-based architecture, and optionally use technologies like GraphQL, WebSockets, or message brokers such as Kafka or RabbitMQ when suitable for the solution.
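A minimal cache-aside sketch for the caching point above, assuming a local Redis instance and a hypothetical `fetch_user_from_db` lookup standing in for the real SQL query:

```python
import json

import redis  # pip install redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def fetch_user_from_db(user_id: int) -> dict:
    # Stand-in for a real SELECT against MySQL/PostgreSQL.
    return {"id": user_id, "name": "example"}

def get_user(user_id: int) -> dict:
    """Cache-aside read: try Redis first, fall back to the primary database."""
    key = f"user:{user_id}"
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)
    user = fetch_user_from_db(user_id)
    r.setex(key, 300, json.dumps(user))  # 5-minute TTL bounds staleness
    return user
```

The TTL is a deliberate trade-off: a shorter value keeps cached reads fresher, while a longer one shields the database from more traffic.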
Requirements:
- 2 to 4 years of professional experience as a Backend Engineer or similar role.
- Proficiency in at least one backend programming language (e.g., Python, Java, Go, Ruby, etc.).
- Strong understanding of RESTful API design, asynchronous programming, and scalable architecture patterns.
- Solid experience with both relational and NoSQL databases, including designing and optimizing data models.
- Familiarity with Docker, Git, and modern CI/CD workflows.
- Hands-on experience with cloud infrastructure and deployment processes (AWS, GCP, or Azure).
- Exposure to monitoring, logging, and performance profiling practices in production environments.
- A good understanding of security best practices in backend systems.
- Strong problem-solving, debugging, and communication skills.
- Comfortable working in a fast-paced, agile environment with evolving priorities.
Company name: JPMorgan (JPMC)
Job Category: Predictive Science
Location: Parcel 9, Embassy Tech Village, Outer Ring Road, Deverabeesanhalli Village, Varthur Hobli, Bengaluru
Job Schedule: Full time
JOB DESCRIPTION
JPMC is hiring the best talent to join the growing Asset and Wealth Management AI team. We are executing like a startup and building next-generation technology that combines JPMC's unique data and full-service advantage to develop high-impact AI applications and platforms in the financial services industry. We are looking for a hands-on ML engineering leader and expert who is excited about the opportunity.
As a senior ML and GenAI engineer, you will play a lead role as a senior member of our global team. Your responsibilities will entail hands on development of high-impact business solutions through data analysis, developing cutting edge ML and LLM models, and deploying these models to production environments on AWS or Azure.
You'll combine your years of proven development expertise with a never-ending quest to create innovative technology through solid engineering practices. Your passion and experience in one or more technology domains will help solve complex business problems to serve our Private Bank clients. As a constant learner and early adopter, you’re already embracing leading-edge technologies and methodologies; your example encourages others to follow suit.
Job responsibilities
• Hands-on architecture and implementation of lighthouse ML and LLM-powered solutions
• Close partnership with peers in a geographically dispersed team and colleagues across organizational lines
• Collaborate across JPMorgan AWM’s lines of business and functions to accelerate adoption of common AI capabilities
• Design and implement highly scalable and reliable data processing pipelines and deploy model inference services (see the sketch after this list).
• Deploy solutions into public cloud infrastructure
• Experiment, develop and productionize high quality machine learning models, services, and platforms to make a huge technology and business impact
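A minimal sketch of a model inference service as described above, assuming a scikit-learn model serialized offline with joblib; the artifact path, endpoint, and feature shape are illustrative, not JPMC specifics:

```python
import joblib
import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="scoring-service")  # hypothetical inference service
model = joblib.load("model.joblib")     # assumed artifact from an offline training pipeline

class Features(BaseModel):
    values: list[float]

@app.post("/score")
def score(features: Features) -> dict:
    """Run a single prediction; batching and input validation omitted for brevity."""
    prediction = model.predict(np.array([features.values]))
    return {"prediction": float(prediction[0])}
```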
Required qualifications, capabilities, and skills
• Formal training or certification on software engineering concepts and 5+ years applied experience
• MS in Computer Science, Statistics, Mathematics or Machine Learning.
• Development experience, along with hands-on Machine Learning Engineering
• Proven leadership capacity, including new AI/ML idea generation and GenAI-based solutions
• Solid Python programming skills required; another high-performance language such as Go is a big plus
• Expert knowledge of one of the cloud computing platforms preferred: Amazon Web Services (AWS), Azure, Kubernetes.
• Experience in using LLMs (OpenAI, Claude or other models) to solve business problems, including full workflow toolset, such as tracing, evaluations and guardrails. Understanding of LLM fine-tuning and inference a plus
• Knowledge of data pipelines, both batch and real-time data processing on both SQL (such as Postgres) and NoSQL stores (such as OpenSearch and Redis)
• Expertise in application, data, and infrastructure architecture disciplines
• Deep knowledge in Data structures, Algorithms, Machine Learning, Data Mining, Information Retrieval, Statistics.
• Excellent communication skills and ability to communicate with senior technical and business partners
Preferred qualifications, capabilities, and skills
• Expert in at least one of the following areas: Natural Language Processing, Reinforcement Learning, Ranking and Recommendation, or Time Series Analysis.
• Knowledge of machine learning frameworks: PyTorch, Keras, MXNet, scikit-learn
• Understanding of finance or wealth management businesses is an added advantage
ABOUT US
JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.
ABOUT THE TEAM
J.P. Morgan Asset & Wealth Management delivers industry-leading investment management and private banking solutions. Asset Management provides individuals, advisors and institutions with strategies and expertise that span the full spectrum of asset classes through our global network of investment professionals. Wealth Management helps individuals, families and foundations take a more intentional approach to their wealth or finances to better define, focus and realize their goals.
Senior Associate Technology L1 – Java Microservices
Company Description
Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally-enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting and customer experience with agile engineering and problem-solving creativity. United by our core values and our purpose of helping people thrive in the brave pursuit of next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting and customer obsession to accelerate our clients’ businesses through designing the products and services their customers truly value.
Job Description
We are looking for a Senior Associate Technology Level 1 - Java Microservices Developer to join our team of bright thinkers and doers. You’ll use your problem-solving creativity to design, architect, and develop high-end technology solutions that solve our clients’ most complex and challenging problems across different industries.
We are on a mission to transform the world, and you will be instrumental in shaping how we do it with your ideas, thoughts, and solutions.
Your Impact:
• Drive the design, planning, and implementation of multifaceted applications, giving you breadth and depth of knowledge across the entire project lifecycle.
• Combine your technical expertise and problem-solving passion to work closely with clients, turning complex ideas into end-to-end solutions that transform our clients’ business.
• Constantly innovate and evaluate emerging technologies and methods to provide scalable and elegant solutions that help clients achieve their business goals.
Qualifications
➢ 5 to 7 years of software development experience
➢ Strong development skills in Java JDK 1.8 or above
➢ Java fundamentals such as exception handling, serialization/deserialization, and immutability concepts
➢ Good fundamental knowledge of Enums, Collections, Annotations, Generics, autoboxing, and data structures
➢ Database RDBMS/NoSQL (SQL, joins, indexing)
➢ Multithreading (Re-entrant Lock, Fork & Join, Sync, Executor Framework)
➢ Spring Core & Spring Boot, security, transactions
➢ Hands-on experience with JMS (ActiveMQ, RabbitMQ, Kafka, etc.)
➢ Memory management (JVM configuration, profiling, GC), performance tuning, and testing (JMeter or a similar tool)
➢ DevOps (CI/CD: Maven/Gradle, Jenkins, quality plugins, Docker, and containerization)
➢ Logical/analytical skills; thorough understanding of OOPS concepts, design principles, and implementation of different types of design patterns
➢ Hands-on experience with any of the logging frameworks (SLF4J/LogBack/Log4j)
➢ Experience writing JUnit test cases using Mockito/PowerMock frameworks
➢ Practical experience with Maven/Gradle and knowledge of version control systems like Git/SVN
➢ Good communication skills and ability to work with global teams to define and deliver on projects
➢ Sound understanding/experience in software development processes and test-driven development
➢ Cloud – AWS / Azure / GCP / PCF or any private cloud would also be fine
➢ Experience in Microservices
Job Description
We are looking for an experienced GCP Cloud Engineer to design, implement, and manage cloud-based solutions on Google Cloud Platform (GCP). The ideal candidate should have expertise in GKE (Google Kubernetes Engine), Cloud Run, Cloud Load Balancing, Cloud Functions, Azure DevOps, and Terraform, with a strong focus on automation, security, and scalability.
Work location: Pune/Mumbai/Bangalore
Experience: 4-7 Years
Joining: Mid-October
You will work closely with development, operations, and security teams to ensure robust cloud infrastructure and CI/CD pipelines while optimizing performance and cost.
Key Responsibilities:
1. Cloud Infrastructure Design & Management
· Architect, deploy, and maintain GCP cloud resources via Terraform or other automation.
· Implement Google Cloud Storage, Cloud SQL, and Filestore for data storage and processing needs.
· Manage and configure Cloud Load Balancers (HTTP(S), TCP/UDP, and SSL Proxy) for high availability and scalability.
· Optimize resource allocation, monitoring, and cost efficiency across GCP environments.
2. Kubernetes & Container Orchestration
· Deploy, manage, and optimize workloads on Google Kubernetes Engine (GKE).
· Work with Helm charts, Istio, and service meshes for microservices deployments.
· Automate scaling, rolling updates, and zero-downtime deployments.
3. Serverless & Compute Services
· Deploy and manage applications on Cloud Run and Cloud Functions for scalable, serverless workloads.
· Optimize containerized applications running on Cloud Run for cost efficiency and performance.
4. CI/CD & DevOps Automation
· Design, implement, and manage CI/CD pipelines using Azure DevOps.
· Automate infrastructure deployment using Terraform, Bash, and PowerShell scripting.
· Integrate security and compliance checks into the DevOps workflow (DevSecOps).
Required Skills & Qualifications:
✔ Experience: 4+ years in Cloud Engineering, with a focus on GCP.
✔ Cloud Expertise: Strong knowledge of GCP services (GKE, Compute Engine, IAM, VPC, Cloud Storage, Cloud SQL, Cloud Functions).
✔ Kubernetes & Containers: Experience with GKE, Docker, GKE Networking, Helm.
✔ DevOps Tools: Hands-on experience with Azure DevOps for CI/CD pipeline automation.
✔ Infrastructure-as-Code (IaC): Expertise in Terraform for provisioning cloud resources.
✔ Scripting & Automation: Proficiency in Python, Bash, or PowerShell for automation.
✔ Security & Compliance: Knowledge of cloud security principles, IAM, and compliance standards.
About Wissen Technology
Wissen Technology, established in 2015 and part of the Wissen Group (founded in 2000), is a specialized technology consulting company. We pride ourselves on delivering high-quality solutions for global organizations across Banking & Finance, Telecom, and Healthcare domains.
Here’s why Wissen Technology stands out:
Global Presence: Offices in US, India, UK, Australia, Mexico, and Canada.
Expert Team: Wissen Group comprises over 4000 highly skilled professionals worldwide, with Wissen Technology contributing 1400 of these experts. Our team includes graduates from prestigious institutions such as Wharton, MIT, IITs, IIMs, and NITs.
Recognitions: Great Place to Work® Certified.
Featured as a Top 20 AI/ML Vendor by CIO Insider (2020).
Impressive Growth: Achieved 400% revenue growth in 5 years without external funding.
Successful Projects: Delivered $650 million worth of projects to 20+ Fortune 500 companies.
For more details:
Website: www.wissen.com
Wissen Thought leadership : https://www.wissen.com/articles/
LinkedIn: Wissen Technology
Salary (Lacs): Up to 22 LPA
Required Qualifications
• 4–7 years of total experience, with a minimum of 4 years in a full-time DevOps role
• Hands-on experience with major cloud platforms (GCP, AWS, Azure, OCI); experience with more than one is a plus
• Proficient in Kubernetes administration and container technologies (Docker, containerd)
• Strong Linux fundamentals
• Scripting skills in Python and shell scripting
• Knowledge of infrastructure as code with hands-on experience in Terraform and/or Pulumi (mandatory)
• Experience in maintaining and troubleshooting production environments
• Solid understanding of CI/CD concepts with hands-on experience in tools like Jenkins, GitLab CI, GitHub Actions, ArgoCD, Devtron, GCP Cloud Build, or Bitbucket Pipelines
If interested, kindly share your updated resume at 82008 31681
Key Responsibilities
- Data Architecture & Pipeline Development
- Design, implement, and optimize ETL/ELT pipelines using Azure Data Factory, Databricks, and Synapse Analytics.
- Integrate structured, semi-structured, and unstructured data from multiple sources.
- Data Storage & Management
- Develop and maintain Azure SQL Database, Azure Synapse Analytics, and Azure Data Lake solutions.
- Ensure proper indexing, partitioning, and storage optimization for performance.
- Data Governance & Security
- Implement role-based access control, data encryption, and compliance with GDPR/CCPA.
- Ensure metadata management and data lineage tracking with Azure Purview or similar tools.
- Collaboration & Stakeholder Engagement
- Work closely with BI developers, analysts, and business teams to translate requirements into data solutions.
- Provide technical guidance and best practices for data integration and transformation.
- Monitoring & Optimization
- Set up monitoring and alerting for data pipelines.
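To make the monitoring item above concrete, a minimal sketch that triggers an Azure Data Factory pipeline run and polls its status via the azure-mgmt-datafactory Python SDK (the resource group, factory, and pipeline names are hypothetical):

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Kick off a pipeline run and read back its status for alerting.
run = adf.pipelines.create_run("rg-data", "adf-analytics", "daily_etl", parameters={})
status = adf.pipeline_runs.get("rg-data", "adf-analytics", run.run_id).status
print(f"daily_etl run {run.run_id}: {status}")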
Job Type : Contract
Location : Bangalore
Experience : 5+yrs
The role focuses on cloud security engineering with a strong emphasis on GCP, while also covering AWS and Azure.
Required Skills:
- 5+ years of experience in software and/or cloud platform engineering, particularly focused on GCP environment.
- Knowledge of the Shared Responsibility Model; keen understanding of the security risks inherent in hosting cloud-based applications and data.
- Experience developing across the security assurance lifecycle (including prevent, detect, respond, and remediate controls).
- Experience in configuring public cloud native security tooling and capabilities, with a focus on Google Cloud organizational policies/constraints, VPC SC, IAM policies, and GCP APIs.
- Experience with Cloud Security Posture Management (CSPM) 3rd Party tools such as Wiz, Prisma, Check Point CloudGuard, etc.
- Experience in Policy-as-code (Rego) and OPA platform.
- Experience solutioning and configuring event-driven serverless-based security controls in Azure, including but not limited to technologies such as Azure Function, Automation Runbook, AWS Lambda and Google Cloud Functions.
- Deep understanding of DevOps processes and workflows.
- Working knowledge of the Secure SDLC process
- Experience with Infrastructure as Code (IaC) tooling, preferably Terraform.
- Familiarity with Logging and data pipeline concepts and architectures in cloud.
- Strong in scripting languages such as PowerShell, Python, Bash, or Go.
- Knowledge of Agile best practices and methodologies
- Experience creating technical architecture documentation.
- Excellent communication, writing, and interpersonal skills.
- Practical experience designing and configuring CI/CD pipelines, including GitHub Actions and Jenkins.
- Experience in ITSM.
- Ability to articulate complex technical concepts to non-technical stakeholders.
- Experience with risk control frameworks and engagements with risk and regulatory functions
- Experience in the financial industry would be a plus.
Job Title : Senior System Administrator
Experience : 7 to 12 Years
Location : Bangalore (Whitefield/Domlur) or Coimbatore
Work Mode :
- First 3 Months : Work From Office (5 Days)
- Post-Probation : Hybrid (3 Days WFO)
- Shift : Rotational (Day & Night)
- Notice Period : Immediate to 30 Days
- Salary : Up to ₹24 LPA (including 8% variable), slightly negotiable
Role Overview :
Seeking a Senior System Administrator with strong experience in server administration, virtualization, automation, and hybrid infrastructure. The role involves managing Windows environments, scripting, cloud/on-prem operations, and ensuring 24x7 system availability.
Mandatory Skills :
Windows Server, Virtualization (Citrix/VMware/Nutanix/Hyper-V), Office 365, Intune, PowerShell, Terraform/Ansible, CI/CD, Hybrid Cloud (Azure), Monitoring, Backup, NOC, DCO.
Key Responsibilities :
- Manage physical/virtual Windows servers and core services (AD, DNS, DHCP).
- Automate infrastructure using Terraform/Ansible.
- Administer Office 365, Intune, and ensure compliance.
- Support hybrid on-prem + Azure environments.
- Handle monitoring, backups, disaster recovery, and incident response.
- Collaborate on DevOps pipelines and write automation scripts (PowerShell).
Nice to Have :
MCSA/MCSE/RHCE, Azure admin experience, team leadership background
Interview Rounds :
L1 – Technical (Platform)
L2 – Technical
L3 – Techno-Managerial
L4 – HR
Role Overview:
We are seeking a Senior Software Engineer (SSE) with strong expertise in Kafka, Python, and Azure Databricks to lead and contribute to our healthcare data engineering initiatives. This role is pivotal in building scalable, real-time data pipelines and processing large-scale healthcare datasets in a secure and compliant cloud environment.
The ideal candidate will have a solid background in real-time streaming, big data processing, and cloud platforms, along with strong leadership and stakeholder engagement capabilities.
Key Responsibilities:
- Design and develop scalable real-time data streaming solutions using Apache Kafka and Python (a minimal consumer sketch follows this list).
- Architect and implement ETL/ELT pipelines using Azure Databricks for both structured and unstructured healthcare data.
- Optimize and maintain Kafka applications, Python scripts, and Databricks workflows to ensure performance and reliability.
- Ensure data integrity, security, and compliance with healthcare standards such as HIPAA and HITRUST.
- Collaborate with data scientists, analysts, and business stakeholders to gather requirements and translate them into robust data solutions.
- Mentor junior engineers, perform code reviews, and promote engineering best practices.
- Stay current with evolving technologies in cloud, big data, and healthcare data standards.
- Contribute to the development of CI/CD pipelines and containerized environments (Docker, Kubernetes).
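A minimal sketch of the kind of Kafka-plus-Python consumer loop this role involves, using the kafka-python client (the topic, broker, and group names are hypothetical; real healthcare payloads would additionally be de-identified for HIPAA):

import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "patient-events",                      # hypothetical topic
    bootstrap_servers=["broker:9092"],
    group_id="healthcare-etl",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Validate, de-identify, and forward the record downstream
    # (e.g. to a Databricks/Delta landing zone).
    print(event.get("event_type"), message.offset)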
Required Skills & Qualifications:
- 4+ years of hands-on experience in data engineering roles.
- Strong proficiency in Kafka (including Kafka Streams, Kafka Connect, Schema Registry).
- Proficient in Python for data processing and automation.
- Experience with Azure Databricks (or readiness to ramp up quickly).
- Solid understanding of cloud platforms, with a preference for Azure (AWS/GCP is a plus).
- Strong knowledge of SQL and NoSQL databases; data modeling for large-scale systems.
- Familiarity with containerization tools like Docker and orchestration using Kubernetes.
- Exposure to CI/CD pipelines for data applications.
- Prior experience with healthcare datasets (EHR, HL7, FHIR, claims data) is highly desirable.
- Excellent problem-solving abilities and a proactive mindset.
- Strong communication and interpersonal skills to work in cross-functional teams.
Employment type- Contract basis
Key Responsibilities
- Design, develop, and maintain scalable data pipelines using PySpark and distributed computing frameworks (see the sketch after this list).
- Implement ETL processes and integrate data from structured and unstructured sources into cloud data warehouses.
- Work across Azure or AWS cloud ecosystems to deploy and manage big data workflows.
- Optimize performance of SQL queries and develop stored procedures for data transformation and analytics.
- Collaborate with Data Scientists, Analysts, and Business teams to ensure reliable data availability and quality.
- Maintain documentation and implement best practices for data architecture, governance, and security.
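As an illustration of the first two responsibilities, a minimal PySpark batch-ETL sketch (the paths and column names are hypothetical; the same code works against abfss:// or s3:// paths):

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Ingest raw CSV from the landing zone.
raw = spark.read.option("header", True).csv("/data/raw/orders/")

clean = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
)

# Land curated data partitioned for downstream analytics.
clean.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/orders/")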
⚙️ Required Skills
- Programming: Proficient in PySpark, Python, SQL, and MongoDB
- Cloud Platforms: Hands-on experience with Azure Data Factory, Databricks, or AWS Glue/Redshift.
- Data Engineering Tools: Familiarity with Apache Spark, Kafka, Airflow, or similar tools.
- Data Warehousing: Strong knowledge of designing and working with data warehouses like Snowflake, BigQuery, Synapse, or Redshift.
- Data Modeling: Experience in dimensional modeling, star/snowflake schema, and data lake architecture.
- CI/CD & Version Control: Exposure to Git, Terraform, or other DevOps tools is a plus.
🧰 Preferred Qualifications
- Bachelor's or Master's in Computer Science, Engineering, or related field.
- Certifications in Azure/AWS are highly desirable.
- Knowledge of business intelligence tools (Power BI, Tableau) is a bonus.
Requirements:
- 7+ years in enterprise application development
- Proven track record of delivering complex digital solutions
- Advanced knowledge of React hooks, context API, and component and global state management
- Experience with atomic design, component libraries, and TypeScript
- Proficiency in React performance optimization and modern features
- Strong experience with modern .NET (6+)
- Proven track record applying Clean Architecture, Clean Code, SOLID principles, and DDD patterns
- Expertise in building scalable REST APIs and microservices
- Experience with Azure Service Bus, Event Grid, and message-based architectures
- Proficiency in resources like App Insights, Function Apps, Key Vaults, and App Services
- Expertise in cloud development and deployment strategies
- Strong understanding of business needs, ability to meet them by creating cutting-edge solutions
- Brilliant communication abilities, knowing how to explain technical details and processes to a non-tech person
- Fluency in English
Nice to have:
- Test automation (Playwright/Cypress)
- Building CI/CD pipelines in Azure DevOps
- API-first development and OpenAPI specifications
- Experience with agile at scale (SAFe/Spotify)
- Proficiency with AI-powered development tools (GitHub Copilot, Cursor, etc.) to enhance productivity
- Bachelor's or Master's degree in Computer Science or a related field
Responsibilities:
- Front-end, API, and back-end development, ensuring the successful delivery of digital solutions
- Drive innovation by staying informed about emerging technologies, industry trends, and best practices
- Collaborate with the Product Manager, UX/UI Designers, and Engineering Manager to define agile technical requirements, provide technical estimation, and prioritize backlogs based on business needs and technical feasibility
- Participate in sprint planning, backlog grooming, and release planning meetings to ensure alignment between technical implementation and product roadmap
- Participate in hands-on development activities, including coding, debugging, and troubleshooting, to deliver high-quality applications
- Design, architect, and develop digital applications, ensuring adherence to best practices, coding standards, and architectural principles
- Design scalable architectures for multi-region deployment
- Implement test automation strategies and frameworks to automate testing processes and ensure the quality and reliability of applications
- Automate test cases and integrate testing into the CI/CD pipeline
- Conduct code reviews to ensure adherence to coding standards, best practices, and architectural guidelines
- Define and implement code best practices, development standards, and documentation processes to maintain code quality and readability
- Integrate digital applications with existing digital assets, backend systems, and third-party APIs, ensuring seamless data exchange and interoperability
- Collaborate with integration teams to design and implement integration solutions that meet business requirements and architectural standards
- Communicate technical concepts and solutions to non-technical stakeholders in a clear and understandable manner
Job Description:
We are looking for a skilled and motivated Full Stack Developer with 2–6 years of experience in designing and developing scalable web applications using .NET Core, C#, ReactJS, and MS SQL, with exposure to Microsoft Azure. The ideal candidate should be comfortable working across the full technology stack and demonstrate strong problem-solving abilities in a fast-paced, Agile environment.
Key Responsibilities:
- Design, develop, and maintain robust full stack applications using .NET Core, C#, ReactJS, and MS SQL.
- Build and consume RESTful APIs to support scalable frontend/backend integration.
- Collaborate with product owners, architects, and other developers to deliver high-quality software solutions.
- Participate in code reviews, ensure adherence to best coding practices, and contribute to continuous improvement.
- Write unit and integration tests to maintain code quality and ensure high test coverage.
- Deploy and manage applications in Microsoft Azure and contribute to improving CI/CD pipelines.
- Actively participate in Agile ceremonies such as sprint planning, stand-ups, and retrospectives.
- Troubleshoot and resolve technical issues and performance bottlenecks.
Required Skills:
- 2–6 years of hands-on experience with .NET Core and C# development.
- Proficient in ReactJS and modern front-end technologies (HTML5, CSS3, JavaScript/TypeScript).
- Experience in building and integrating RESTful APIs.
- Solid understanding of object-oriented programming, data structures, and software design principles.
- Experience working with MS SQL Server, including writing complex queries and stored procedures.
- Familiarity with Azure services such as App Services, Azure SQL, Azure Functions, etc.
Preferred Skills:
- Exposure to DevOps practices and tools such as Azure DevOps, Git, and CI/CD pipelines.
- Basic understanding of containerization (e.g., Docker) and orchestration tools like Kubernetes (K8s).
- Prior experience working in Agile/Scrum teams.
- Good communication skills and ability to work collaboratively in a cross-functional team.
- Develop and maintain Java applications using Core Java, the Spring framework, JDBC, and threading concepts.
- Strong understanding of the Spring framework and its various modules.
- Experience with JDBC for database connectivity and manipulation
- Utilize database management systems to store and retrieve data efficiently.
- Proficiency in Core Java 8 and a thorough understanding of threading concepts and concurrent programming.
- Experience working with relational and NoSQL databases.
- Basic understanding of cloud platforms such as Azure and GCP; experience with DevOps practices is an added advantage.
- Knowledge of containerization technologies (e.g., Docker, Kubernetes)
- Perform debugging and troubleshooting of applications using log analysis techniques.
- Understand multi-service flow and integration between components.
- Handle large-scale data processing tasks efficiently and effectively.
- Hands on experience using Spark is an added advantage.
- Good problem-solving and analytical abilities.
- Collaborate with cross-functional teams to identify and solve complex technical problems.
- Knowledge of Agile methodologies such as Scrum or Kanban
- Stay updated with the latest technologies and industry trends to continuously improve development processes and methodologies.
Job Title : Senior Consultant (Java / NodeJS + Temporal)
Experience : 5 to 12 Years
Location : Bengaluru, Chennai, Hyderabad, Pune, Mumbai, Gurugram, Coimbatore
Work Mode : Remote (Must be open to travel for occasional team meetups)
Notice Period : Immediate Joiners or Serving Notice
Interview Process :
- R1 : Tech Interview (60 mins)
- R2 : Technical Interview
- R3 : (Optional) Interview with Client
Job Summary :
We are seeking a Senior Backend Consultant with strong hands-on expertise in Temporal (BPM/Workflow Engine) and either Node.js or Java.
The ideal candidate will have experience in designing and developing microservices and process-driven applications, as well as orchestrating complex workflows using Temporal.io.
You will work on high-scale systems, collaborating closely with cross-functional teams.
Mandatory Skills :
Temporal.io, Node.js (or Java), React.js, Keycloak IAM, PostgreSQL, Terraform, Kubernetes, Azure, Jest, OpenAPI
Key Responsibilities :
- Design and implement scalable backend services using Node.js or Java.
- Build and manage complex workflow orchestrations using Temporal.io (a minimal sketch follows this list).
- Integrate with IAM solutions like Keycloak for role-based access control.
- Work with React (v17+), TypeScript, and component-driven frontend design.
- Use PostgreSQL for structured data persistence and optimized queries.
- Manage infrastructure using Terraform and orchestrate via Kubernetes.
- Leverage Azure Services like Blob Storage, API Gateway, and AKS.
- Write and maintain API documentation using Swagger/Postman/Insomnia.
- Conduct unit and integration testing using Jest.
- Participate in code reviews and contribute to architectural decisions.
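A minimal sketch of a Temporal workflow-plus-activity pair, shown with Temporal's Python SDK (temporalio) for brevity; the role's primary stacks are the Node.js and Java SDKs, which follow the same pattern, and all names here are hypothetical:

from datetime import timedelta
from temporalio import activity, workflow

@activity.defn
async def charge_card(order_id: str) -> str:
    # A real implementation would call a payment service; retries and
    # timeouts are handled by Temporal rather than hand-rolled here.
    return f"charged:{order_id}"

@workflow.defn
class OrderWorkflow:
    @workflow.run
    async def run(self, order_id: str) -> str:
        # Durable, replay-safe orchestration step.
        return await workflow.execute_activity(
            charge_card,
            order_id,
            start_to_close_timeout=timedelta(seconds=30),
        )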
Must-Have Skills :
- Workflow engines: Temporal.io, or Camunda 8 (BPMN modeling, external task workers, Operate, Tasklist)
- Node.js + TypeScript (preferred) or strong Java experience
- React.js (v17+) and component-driven UI development
- Keycloak IAM, PostgreSQL, and modern API design
- Infrastructure automation with Terraform, Kubernetes
- Experience in using GitFlow, OpenAPI, Jest for testing
Nice-to-Have Skills :
- Blockchain integration experience for secure KYC/identity flows
- Custom Camunda Connectors or exporter plugin development
- CI/CD experience using Azure DevOps or GitHub Actions
- Identity-based task completion authorization enforcement
Job Title : Automation Quality Engineer (Gen AI)
Experience : 3 to 5+ Years
Location : Bangalore / Chennai / Pune
Role Overview :
We’re hiring a Quality Engineer to lead QA efforts for AI models, applications, and infrastructure.
You'll collaborate with cross-functional teams to design test strategies, implement automation, ensure model accuracy, and maintain high product quality.
Key Responsibilities :
- Develop and maintain test strategies for AI models, APIs, and user interfaces (see the sketch after this list).
- Build automation frameworks and integrate into CI/CD pipelines.
- Validate model accuracy, robustness, and monitor model drift.
- Perform regression, performance, load, and security testing.
- Log and track issues; collaborate with developers to resolve them.
- Ensure compliance with data privacy and ethical AI standards.
- Document QA processes and testing outcomes.
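For instance, a minimal pytest-style sketch of an automated API check for a model endpoint, of the kind that would run inside a CI/CD pipeline (the endpoint, fields, and thresholds are hypothetical):

import requests

BASE_URL = "https://models.example.com"  # hypothetical model service

def test_sentiment_endpoint_schema_and_bounds():
    resp = requests.post(
        f"{BASE_URL}/v1/sentiment",
        json={"text": "great product"},
        timeout=10,
    )
    assert resp.status_code == 200
    body = resp.json()
    # Schema and sanity checks guard against silent model regressions.
    assert body["label"] in {"positive", "negative", "neutral"}
    assert 0.0 <= body["score"] <= 1.0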
Mandatory Skills :
- Test Automation : Selenium, Playwright, or DeepEval
- Programming/Scripting : Python, JavaScript
- API Testing : Postman, REST Assured
- Cloud & DevOps : Azure, Azure Kubernetes, CI/CD pipelines
- Performance Testing : JMeter
- Bug Tracking : Azure DevOps
- Methodologies : Agile delivery experience
- Soft Skills : Strong communication and problem-solving abilities
Job Title : Lead Java Developer (Backend)
Experience Required : 8 to 15 Years
Open Positions : 5
Location : Any major metro city (Bengaluru, Pune, Chennai, Kolkata, Hyderabad)
Work Mode : Open to Remote / Hybrid / Onsite
Notice Period : Immediate Joiner/30 Days or Less
About the Role :
- We are looking for experienced Lead Java Developers who bring not only strong backend development skills but also a product-oriented mindset and leadership capability.
- This is an opportunity to be part of high-impact digital transformation initiatives that go beyond writing code—you’ll help shape future-ready platforms and drive meaningful change.
- This role is embedded within a forward-thinking digital engineering team that thrives on co-innovation, lean delivery, and end-to-end ownership of platforms and products.
Key Responsibilities :
- Design, develop, and implement scalable backend systems using Java and Spring Boot.
- Collaborate with product managers, designers, and engineers to build intuitive and reliable digital products.
- Advocate and implement engineering best practices : SOLID principles, OOP, clean code, CI/CD, TDD/BDD.
- Lead Agile-based development cycles with a focus on speed, quality, and customer outcomes.
- Guide and mentor team members, fostering technical excellence and ownership.
- Utilize cloud platforms and DevOps tools to ensure performance and reliability of applications.
What We’re Looking For :
- Proven experience in Java backend development (Spring Boot, Microservices).
- 8+ Years of hands-on engineering experience with at least 2+ years in a Lead role.
- Familiarity with cloud platforms such as AWS, Azure, or GCP.
- Good understanding of containerization and orchestration tools like Docker and Kubernetes.
- Exposure to DevOps and Infrastructure as Code practices.
- Strong problem-solving skills and the ability to design solutions from first principles.
- Prior experience in product-based or startup environments is a big plus.
Ideal Candidate Profile :
- A tech enthusiast with a passion for clean code and scalable architecture.
- Someone who thrives in collaborative, transparent, and feedback-driven environments.
- A leader who takes ownership beyond individual deliverables to drive overall team and project success.
Interview Process
- Initial Technical Screening (via platform partner)
- Technical Interview with Engineering Team
- Client-facing Final Round
Additional Info :
- Targeting profiles from product/startup backgrounds.
- Strong preference for candidates with under 1 month of notice period.
- Interviews will be fast-tracked for qualified profiles.
About SAP Fioneer
Innovation is at the core of SAP Fioneer. We were spun out of SAP to drive agility, innovation, and delivery in financial services. With a foundation in cutting-edge technology and deep industry expertise, we elevate financial services through digital business innovation and cloud technology.
A rapidly growing global company with a lean and innovative team, SAP Fioneer offers an environment where you can accelerate your future.
Product Technology Stack
- Languages & tooling: PowerShell, Microsoft Graph (MgGraph), Git
- Storage & Databases: Azure Storage, Azure Databases
Role Overview
As a Senior Cloud Solutions Architect / DevOps Engineer, you will be part of our cross-functional IT team in Bangalore, designing, implementing, and managing sophisticated cloud solutions on Microsoft Azure.
Key Responsibilities
Architecture & Design
- Design and document architecture blueprints and solution patterns for Azure-based applications.
- Implement hierarchical organizational governance using Azure Management Groups.
- Architect modern authentication frameworks using Azure AD/EntraID, SAML, OpenID Connect, and Azure AD B2C.
Development & Implementation
- Build closed-loop, data-driven DevOps architectures using Azure Insights.
- Apply code-driven administration practices with PowerShell, MgGraph, and Git.
- Deliver solutions using Infrastructure as Code (IaC), CI/CD pipelines, GitHub Actions, and Azure DevOps.
- Develop IAM standards with RBAC and EntraID.
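The posting's stack is PowerShell/MgGraph; purely as an illustration of the same code-driven administration idea, a sketch with the Azure SDK for Python (the subscription ID and tag name are hypothetical):

from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

# DefaultAzureCredential picks up managed identity, CLI, or env credentials.
client = ResourceManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Governance-style sweep: flag resource groups missing a cost-center tag.
for rg in client.resource_groups.list():
    tags = rg.tags or {}
    if "costCenter" not in tags:
        print(f"{rg.name} ({rg.location}) is missing a costCenter tag")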
Leadership & Collaboration
- Provide technical guidance and mentorship to a cross-functional Scrum team operating in sprints with a managed backlog.
- Support the delivery of SaaS solutions on Azure.
Required Qualifications & Skills
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 8+ years of experience in cloud solutions architecture and DevOps engineering.
- Extensive expertise in Azure services, core web technologies, and security best practices.
- Hands-on experience with IaC, CI/CD, Git, and pipeline automation tools.
- Strong understanding of IAM, security best practices, and governance models in Azure.
- Experience working in Scrum-based environments with backlog management.
- Bonus: Experience with Jenkins, Terraform, Docker, or Kubernetes.
Benefits
- Work with some of the brightest minds in the industry on innovative projects shaping the financial sector.
- Flexible work environment encouraging creativity and innovation.
- Pension plans, private medical insurance, wellness cover, and additional perks like celebration rewards and a meal program.
Diversity & Inclusion
At SAP Fioneer, we believe in the power of innovation that every employee brings and are committed to fostering diversity in the workplace.
Job Description below :
Required Skills
BSc/B.E./B.Tech in Computer Science or an equivalent field.
7-10 years' solid commercial experience in software development using Java 8, Spring Boot, Hibernate, Spring Cloud, and related frameworks.
Experience with Angular 8+ versions, RxJS 6, JS/TypeScript
Knowledge of HTML/CSS
Good understanding of Design Patterns
Proficiency with SQL database development, including data modelling and DB performance tuning
Ability to work with customers, gather requirements and create solutions independently
Active participation within and among teams and colleagues distributed globally
Excellent problem-solving skills, in particular a methodical approach to dealing with problems across distributed systems.
Agile development experience
Desired Skills
Experience with angular forms
Experience with dynamic forms/ dynamic angular components
Experience with java Spring boot
Knowledge of Kafka Stream Processing
Understanding of secure software development concepts, especially in a cloud platform
Good communication skills.
Strong organisational skills.
Understanding of test management and automation software (e.g. ALM, Jira, JMeter).
Familiarity with Agile frameworks and Regression testing.
Previous experience within the Financial domain.
Job Title : Chief Technology Officer (CTO) – Blockchain & Web3
Location : Bangalore & Gurgaon
Job Type : Full-Time, On-Site
Working Days : 6 Days
About the Role :
- We are seeking an experienced and visionary Chief Technology Officer (CTO) to lead our Blockchain & Web3 initiatives.
- The ideal candidate will have a strong technical background in Blockchain, Distributed Ledger Technology (DLT), Smart Contracts, DeFi, and Web3 applications.
- As a CTO, you will be responsible for defining and implementing the technology roadmap, leading a high-performing tech team, and driving innovation in the Blockchain and Web3 space.
Key Responsibilities :
- Define and execute the technical strategy and roadmap for Blockchain & Web3 products and services.
- Lead the architecture, design, and development of scalable, secure, and efficient blockchain-based applications.
- Oversee Smart Contract development, Layer-1 & Layer-2 solutions, DeFi, NFTs, and decentralized applications (dApps).
- Manage and mentor a team of engineers, developers, and blockchain specialists to ensure high-quality product delivery.
- Drive R&D initiatives to stay ahead of emerging trends and advancements in Blockchain & Web3 technologies.
- Collaborate with cross-functional teams including Product, Marketing, and Business Development to align technology with business goals.
- Ensure regulatory compliance, security, and scalability of Blockchain solutions.
- Build and maintain relationships with industry partners, investors, and technology vendors to drive innovation.
Required Qualifications & Experience :
- 10+ Years of overall experience in software development with at least 5+ Years in Blockchain & Web3 technologies.
- Deep understanding of Blockchain protocols (Ethereum, Solana, Polkadot, Hyperledger, etc.), consensus mechanisms, cryptographic principles, and tokenomics.
- Hands-on experience with Solidity, Rust, Go, Node.js, Python, or other blockchain programming languages.
- Proven track record of building and scaling decentralized applications (dApps), DeFi platforms, or NFT marketplaces.
- Experience with cloud infrastructure (AWS, Azure, GCP) and DevOps best practices.
- Strong leadership and management skills with experience in building and leading high-performing teams.
- Excellent problem-solving skills with the ability to work in a fast-paced, high-growth environment.
- Strong understanding of Web3, DAOs, Metaverse, and the evolving regulatory landscape.
Preferred Qualifications :
- Prior experience in a CTO, VP Engineering, or similar leadership role.
- Experience in fundraising, investor relations, and strategic partnerships.
- Knowledge of cross-chain interoperability and Layer-2 scaling solutions.
- Understanding of data privacy, security, and compliance regulations related to Blockchain & Web3.
About AMAZECH SOLUTIONS
Amazech Solutions is a Consulting and Services company in the Information Technology Industry. Established in 2007, we are headquartered in Frisco, Texas, U.S.A. The leadership team at Amazech brings to the table expertise that stems from over 40 person-years of experience in developing software solutions in global organizations in various verticals including Healthcare, Banking Services, and Media & Entertainment.
We currently provide services to a wide spectrum of clients ranging from start-ups to Fortune 500 companies. We are actively engaged in Government projects, being an SBA approved company as well as being HUB certified by the State of Texas.
Our customer-centric approach comes from understanding that our clients need more than technology professionals. This is an exciting time to join Amazech as we look to grow our team in India, which comprises IT professionals with strong competence in both common and niche skill areas.
Job Description
Do you love building and pioneering in the technology space? Do you enjoy solving complex business problems in a fast-paced, collaborative, inclusive and iterative delivery environment? You will be part of a large group of makers, breakers, doers, and disruptors, who love to solve real problems and meet real customer needs.
We are seeking Software Engineers who are passionate about marrying data with emerging technologies to join our team. You will have the opportunity to be on the forefront of driving a major transformation and create various products that will disrupt and reimagine technology solutions by working with the best minds in the industry.
Location: Bangalore, Pune (Hybrid / Remote)
Experience: 5-13 years
Employment type: Full time.
Permanent website: www.amazech.com
Role Description
This is a full-time Java Backend Developer role, based in Bengaluru with flexible remote work option. The Java Backend Developer will be responsible for designing, developing, and delivering complex Java-based applications and providing end-to-end support throughout the software development lifecycle. The successful candidate will collaborate with cross-functional team members, gather and analyze requirements, identify and prioritize technical and functional requirements and provide innovative solutions to address business challenges.
Qualifications
- Bachelor's or Master's degree in Computer Science (or equivalent technical degree)
- 5+ years of hands-on software development experience in Java/J2EE
- Experience in defining software architecture, design patterns, and solution design
- Strong experience in microservices architecture, Angular, Spring Boot Framework, Hibernate, and Web Services (SOAP and REST)
- Experience in cloud infrastructure, ideally with Amazon Web Services
- Strong knowledge in database design, SQL query optimization, and performance tuning
- Demonstrated ability to lead technical teams and mentor team members
- Excellent communication, analytical, and problem-solving skills
- Experience and knowledge of Agile methodologies
- Experience with AWS, GCP, Microsoft Azure, or other cloud services
- Proven ability to work well under pressure and deliver high-quality work within tight deadlines
Other Requirements
- Bachelor's or Master's degree in Computer Science (or equivalent technical degree)
- Strong interpersonal and relationship-building skills.
- Ability to work independently and as part of a team.
- Excellent verbal and written communication skills and ability to communicate effectively with international clients.
Responsibilities:
1. Lead the strategic planning, development, and launch of Azure integration capabilities within our flagship product ensuring a seamless and efficient integration process.
2. Conduct market research and engage with customers to understand their needs and challenges related to Azure cloud services.
3. Collaborate with engineering, design, and sales teams to define product requirements, roadmaps, and go-to-market strategies for Azure Launch.
4. Deliver actionable product feature requirements and stories with wireframes (hands-on work is required here)
5. Drive feature delivery to customers and make improvements based on feedback loops.
6. Stay abreast of Azure services and cloud industry trends to ensure our product remains competitive and innovative.
Required Experience and Skills:
· Bachelor's degree in computer science, Engineering, Business, or a related field. A master's degree or an MBA would be a plus.
· 4+ years of experience in a product management role, ideally in a cloud-based product environment.
· Strong understanding of Azure services, cloud computing technologies, and SaaS platforms.
· Experience building features/ solutions in Cloud Management/Cloud Operations in FinOps, SecOps, Cloud IT Ops.
· Experience with Azure Migration, Azure CloudOps configuration, Managed Services is a plus
· Passion for Automation and simplification of complex IT processes
· Proven track record of managing all aspects of a successful product throughout its lifecycle.
· Excellent communication, leadership, and interpersonal skills.
· Ability to work in a fast-paced, dynamic environment and handle multiple tasks simultaneously.
· Strong problem-solving skills and the ability to thrive in a fast-paced, dynamic startup environment.
· Excellent written and verbal communication skills, with experience in creating customer-facing technical documentation such as user guides and release notes.
· Comfortable with ambiguity and able to prioritize tasks in a rapidly changing environment.
· Demonstrated success in defining and launching products would be a plus.
Success Factors:
· You bring curiosity and passion for continuous learning.
· You enjoy ideating, simplifying and solving problems that create value for customers.
· You have an entrepreneurial mindset.
· You bring a growth mindset and value collaborative approach to growing others and yourself.
· You have an ownership mindset to areas you take on, and you are a self-starter that can work independently when needed.
Data Integration Developer Role
Responsibilities:
§ As a Data Integration Developer/Sr. Developer, be hands-on with ETL/ELT data pipelines, the Snowflake DWH, CI/CD deployment pipelines, and data-readiness (data quality) design, development, and implementation, and address code or data issues (a minimal load sketch follows this section).
§ Experience in designing and implementing modern data pipelines for a variety of data sets which includes internal/external data sources, complex relationships, various data formats and high-volume.
§ Experience with and understanding of ETL job performance techniques, exception handling, query performance tuning/optimization, and data loads meeting runtime/schedule SLAs for both batch and real-time use cases.
§ Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions.
§ Demonstrate strong collaborative experience across regions (APAC, EMEA, and NA) to arrive at design standards, high-level design solution documents, cross-training, and resource onboarding activities.
§ Good understanding of SDLC processes, governance clearance, peer code reviews, unit test results, code deployments, code security scanning, and Confluence/Jira Kanban stories.
§ Strong attention to detail during root cause analysis, SQL query debugging and defect issue resolution by working with multiple business/IT stakeholders.
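A minimal sketch of the kind of data-readiness-aware load referenced above, using the snowflake-connector-python client (the account, stage, and table names are hypothetical):

import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",   # hypothetical account locator
    user="ETL_SVC",
    password="<secret>",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()
try:
    # Bulk-load files from an external stage; the per-file result rows
    # feed data-readiness checks and SLA monitoring.
    cur.execute(
        "COPY INTO staging.orders FROM @ext_stage/orders/ "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    for row in cur.fetchall():
        print(row)
finally:
    cur.close()
    conn.close()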
Educational Qualifications:
BTech in Computer Science or other technical course of study required.
Experience:
§ 4-10 years of experience in data integration/orchestration services, service architecture, and providing data-driven solutions for client requirements
§ Experience on Microsoft Azure cloud and Snowflake SQL, database query/performance tuning.
§ Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus
§ Strong data warehousing concepts; experience with an ETL tool such as Talend Cloud Data Integration is a must
§ Exposure to the financial domain knowledge is considered a plus.
§ Cloud managed services such as GitHub source control and MS Azure/DevOps are considered a plus.
§ Prior experience with State Street and Charles River Development ( CRD) considered a plus.
§ Experience in tools such as Visio, PowerPoint, Excel.
§ Exposure to Third party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus.
§ Strong SQL knowledge and debugging skills is a must.

NASDAQ listed, Service Provider IT Company
Job Summary:
As a Cloud Architect at our organization, you will play a pivotal role in designing, implementing, and maintaining our multi-cloud infrastructure. You will work closely with various teams to ensure our cloud solutions are scalable, secure, and efficient across different cloud providers. Your expertise in multi-cloud strategies, database management, and microservices architecture will be essential to our success.
Key Responsibilities:
- Design and implement scalable, secure, and high-performance cloud architectures across multiple cloud platforms (AWS, Azure, Google Cloud Platform).
- Lead and manage cloud migration projects, ensuring seamless transitions between on-premises and cloud environments.
- Develop and maintain cloud-native solutions leveraging services from various cloud providers.
- Architect and deploy microservices using REST, GraphQL to support our application development needs.
- Collaborate with DevOps and development teams to ensure best practices in continuous integration and deployment (CI/CD).
- Provide guidance on database architecture, including relational and NoSQL databases, ensuring optimal performance and security.
- Implement robust security practices and policies to protect cloud environments and data.
- Design and implement data management strategies, including data governance, data integration, and data security.
- Stay up-to-date with the latest industry trends and emerging technologies to drive continuous improvement and innovation.
- Troubleshoot and resolve cloud infrastructure issues, ensuring high availability and reliability.
- Optimize cost and performance across different cloud environments.
Qualifications/ Experience & Skills Required:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience: 10 - 15 Years
- Proven experience as a Cloud Architect or in a similar role, with a strong focus on multi-cloud environments.
- Expertise in cloud migration projects, both lift-and-shift and greenfield implementations.
- Strong knowledge of cloud-native solutions and microservices architecture.
- Proficiency in using GraphQL for designing and implementing APIs.
- Solid understanding of database technologies, including SQL, NoSQL, and cloud-based database solutions.
- Experience with DevOps practices and tools, including CI/CD pipelines.
- Excellent problem-solving skills and ability to troubleshoot complex issues.
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.
- Deep understanding of cloud security practices and data protection regulations (e.g., GDPR, HIPAA).
- Experience with data management, including data governance, data integration, and data security.
Preferred Skills:
- Certifications in multiple cloud platforms (e.g., AWS Certified Solutions Architect, Google Certified Professional Cloud Architect, Microsoft Certified: Azure Solutions Architect).
- Experience with containerization technologies (Docker, Kubernetes).
- Familiarity with cloud cost management and optimization tools.
Responsibilities:
- Design, develop, and implement robust and efficient backend services using microservices architecture principles.
- Write clean, maintainable, and well-documented code using C# and the .NET framework.
- Develop and implement data access layers using Entity Framework.
- Utilize Azure DevOps for version control, continuous integration, and continuous delivery (CI/CD) pipelines.
- Design and manage databases on Azure SQL.
- Perform code reviews and participate in pair programming to ensure code quality.
- Troubleshoot and debug complex backend issues.
- Optimize backend performance and scalability to ensure a smooth user experience.
- Stay up-to-date with the latest advancements in backend technologies and cloud platforms.
- Collaborate effectively with frontend developers, product managers, and other stakeholders.
- Clearly communicate technical concepts to both technical and non-technical audiences.
Qualifications:
- Strong understanding of microservices architecture principles and best practices.
- In-depth knowledge of C# programming language and the .NET framework (ASP.NET MVC/Core, Web API).
- Experience working with Entity Framework for data access.
- Proficiency with Azure DevOps for CI/CD pipelines and version control (Git).
- Experience with Azure SQL for database design and management.
- Experience with unit testing and integration testing methodologies.
- Excellent problem-solving and analytical skills.
- Ability to work independently and as part of a team.
- Strong written and verbal communication skills.
- A passion for building high-quality, scalable, and secure software applications.
Publicis Sapient Overview:
As a Senior Associate in Data Engineering, you will translate client requirements into technical design and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and will independently drive design discussions to ensure the necessary health of the overall solution.
Job Summary:
As a Senior Associate L2 in Data Engineering, you will translate client requirements into technical design and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and will independently drive design discussions to ensure the necessary health of the overall solution.
The role requires a hands-on technologist with a strong programming background in Java / Scala / Python, experience in data ingestion, integration, and wrangling, computation and analytics pipelines, and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is also required.
Role & Responsibilities:
Your role is focused on Design, Development and delivery of solutions involving:
• Data Integration, Processing & Governance
• Data Storage and Computation Frameworks, Performance Optimizations
• Analytics & Visualizations
• Infrastructure & Cloud Computing
• Data Management Platforms
• Implement scalable architectural models for data processing and storage
• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time mode
• Build functionality for data analytics, search and aggregation
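To ground the batch & real-time ingestion bullet above, a minimal Spark Structured Streaming sketch that lands Kafka events as Parquet (the topic, broker, and paths are hypothetical):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("events_ingest").getOrCreate()

# Real-time ingestion from Kafka.
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "events")
         .load()
)

# Land the raw payloads; the checkpoint gives exactly-once file output.
query = (
    events.selectExpr("CAST(value AS STRING) AS payload")
          .writeStream.format("parquet")
          .option("path", "/data/raw/events")
          .option("checkpointLocation", "/data/chk/events")
          .start()
)
query.awaitTermination()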
Experience Guidelines:
Mandatory Experience and Competencies:
# Competency
1. Overall 5+ years of IT experience with 3+ years in data-related technologies
2. Minimum 2.5 years of experience in Big Data technologies and working exposure to related data services on at least one cloud platform (AWS / Azure / GCP)
3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and the other components required in building end-to-end data pipelines
4. Strong experience in at least one of the programming languages Java, Scala, or Python; Java preferable
5. Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
6. Well-versed, working knowledge of data-platform-related services on at least one cloud platform, IAM, and data security
Preferred Experience and Knowledge (Good to Have):
# Competency
1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres) with hands-on experience
2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.
3. Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search & indexing, and microservices architectures
4. Performance tuning and optimization of data pipelines
5. CI/CD – infra provisioning on cloud, auto build & deployment pipelines, code quality
6. Cloud data specialty and other related Big Data technology certifications
Personal Attributes:
• Strong written and verbal communication skills
• Articulation skills
• Good team player
• Self-starter who requires minimal oversight
• Ability to prioritize and manage multiple tasks
• Process orientation and the ability to define and set up processes
Publicis Sapient Overview:
As a Senior Associate in Data Engineering, you will translate client requirements into technical design and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and will independently drive design discussions to ensure the necessary health of the overall solution.
Job Summary:
As a Senior Associate L1 in Data Engineering, you will produce technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and will independently drive design discussions to ensure the necessary health of the overall solution.
The role requires a hands-on technologist with a strong programming background in Java / Scala / Python, experience in data ingestion, integration, and wrangling, computation and analytics pipelines, and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is preferable.
Role & Responsibilities:
Job Title: Senior Associate L1 – Data Engineering
Your role is focused on Design, Development and delivery of solutions involving:
• Data Ingestion, Integration and Transformation
• Data Storage and Computation Frameworks, Performance Optimizations
• Analytics & Visualizations
• Infrastructure & Cloud Computing
• Data Management Platforms
• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time
• Build functionality for data analytics, search and aggregation
Experience Guidelines:
Mandatory Experience and Competencies:
# Competency
1. Overall 3.5+ years of IT experience with 1.5+ years in data-related technologies
2. Minimum 1.5 years of experience in Big Data technologies
3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and the other components required in building end-to-end data pipelines. Working knowledge of real-time data pipelines is an added advantage.
4. Strong experience in at least one of the programming languages Java, Scala, or Python; Java preferable
5. Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
Preferred Experience and Knowledge (Good to Have):
# Competency
1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres) with hands-on experience
2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.
3. Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search & indexing, and microservices architectures
4. Performance tuning and optimization of data pipelines
5. CI/CD – infra provisioning on cloud, auto build & deployment pipelines, code quality
6. Working knowledge of data-platform-related services on at least one cloud platform, IAM, and data security
7. Cloud data specialty and other related Big Data technology certifications
Personal Attributes:
• Strong written and verbal communication skills
• Articulation skills
• Good team player
• Self-starter who requires minimal oversight
• Ability to prioritize and manage multiple tasks
• Process orientation and the ability to define and set up processes
ROLE AND RESPONSIBILITIES
Should be able to work as an individual contributor and maintain good relationships with stakeholders. Should be proactive in learning new skills per business requirements. Familiar with extracting relevant data, and cleansing and transforming data into insights that drive business value through the use of data analytics, data visualization, and data modeling techniques.
QUALIFICATIONS AND EDUCATION REQUIREMENTS
Technical Bachelor’s Degree.
Non-Technical Degree holders should have 1+ years of relevant experience.
How You'll Contribute:
● Redefine Fintech architecture standards by building easy-to-use, highly scalable, robust, and flexible APIs (a minimal sketch follows this list)
● Analyze systems/architectures in depth, predict potential future breakdowns, and proactively bring solutions
● Partner with internal stakeholders to identify potential feature implementations that could cater to our growing business needs
● Drive the team towards writing high-quality code; tackle abstractions/flaws in system design to attain revved-up API performance, high code reusability, and readability
● Think through the complex Fintech infrastructure and propose an easy-to-deploy modular infrastructure that can adapt and adjust to the specific requirements of the growing client base
● Design and create for scale, optimized memory usage, and high-throughput performance
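A minimal sketch of the low-latency, modular API style described above, using FastAPI (the endpoint, model, and field names are hypothetical):

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="payments-api")

class Transfer(BaseModel):
    src_account: str
    dst_account: str
    amount_minor: int  # integer minor units avoid float rounding on money paths

@app.post("/v1/transfers", status_code=201)
async def create_transfer(t: Transfer) -> dict:
    if t.amount_minor <= 0:
        raise HTTPException(status_code=422, detail="amount must be positive")
    # A real implementation would enqueue to a ledger service and return
    # an idempotency reference.
    return {"status": "accepted", "ref": f"{t.src_account}->{t.dst_account}"}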
Skills Required:
● 5+ years of experience in the development of complex distributed systems
● Prior experience in building sustainable, reliable and secure microservice-based scalable architecture using Python Programming Language
● In-depth understanding of Python associated libraries and frameworks
● Strong involvement in managing and maintaining production-level code with high-volume API hits and low-latency APIs
● Strong knowledge of data structures, algorithms, design patterns, multithreading concepts, etc.
● Ability to design and implement technical road maps for the system and components
● Bring in new software development practices, design/architecture innovations to make our Tech stack more robust
● Hands-on experience in cloud technologies like AWS/GCP/Azure as well as relational databases like MySQL/PostgreSQL or any NoSQL database like DynamoDB
Responsibilities:
- Ensure the quality of architecture and design of systems.
- Functionally decompose complex problems into simple, straight-forward solutions.
- Analyze and improve data quality and metrics.
- Fully and completely understand system interdependencies and limitations.
- Leverage internal and industry knowledge in design decisions.
- Assist in the career development of others, mentoring on advanced technical issues and helping managers guide the career growth of their team members.
- Exert technical influence over multiple teams, increasing their productivity and effectiveness by sharing your deep knowledge and experience.
- Translate high-level, abstract business requirements into software designs, system specifications, standards, and programs
- Contribute to architectural blueprints and designs for software solutions
- Mentor the team on engineering best practices such as writing clean code and designing scalable, reliable, and performant software solutions, and set and enforce software quality standards.
- Be a role model for the team with innovative thinking, passion for continuous learning and contributions to the project.
Qualifications, Skills & Experiences
- BE / B.Tech /M.Tech in Computer Science or a related field
- Minimum 7+ years (5+ for M.Tech holders) of experience building large, scalable systems
- Minimum 2+ years of recent experience building products on the cloud is a plus
- Knowledge of asynchronous programming and Web API development is required (see the sketch after this list)
- Knowledge and awareness of cloud/application security is a must (OWASP at a minimum)
- Strong knowledge of OOP with C#/.NET (or Java) and SQL Server or any RDBMS
- Strong experience in architecting and building multi-threaded, distributed systems.
- Strong knowledge of data structures, algorithms, and designing for performance.
- Ability to achieve stretch goals in a highly innovative and fast-paced environment.
- Extensive experience mentoring junior engineers to success.
- Experience with microservices architecture is a plus
- Working knowledge of CI/CD pipelines and AWS/Azure cloud services is a plus
- Hands-on experience building products for Unix systems in addition to Windows is a plus
- Excellent communication skills
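The asynchronous-programming requirement above is language-agnostic; for consistency with the other sketches on this page it is shown here in Python with asyncio, though this posting itself targets C#/.NET:

# Minimal asynchronous-programming sketch with asyncio: two I/O-bound calls
# run concurrently instead of back to back.
import asyncio

async def fetch(name: str, delay: float) -> str:
    await asyncio.sleep(delay)  # stand-in for a non-blocking I/O call
    return f"{name} done"

async def main() -> None:
    results = await asyncio.gather(fetch("orders", 0.2), fetch("users", 0.1))
    print(results)  # completes in ~0.2s rather than 0.3s

asyncio.run(main())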
Hiring for Azure Data Engineers.
Location: Bangalore
Employment type: Full-time, permanent
website: www.amazech.com
Qualifications:
B.E./B.Tech/M.E./M.Tech in Computer Science, Information Technology, Electrical, or Electronics, with a good academic background.
Experience and Required Skill Sets:
• Minimum 5 years of hands-on experience with Azure Data Lake, Azure Data Factory, SQL Data Warehouse, Azure Blob, Azure Storage Explorer
• Experience with data warehouse/analytical systems using Azure Synapse.
• Proficient in creating Azure Data Factory pipelines for ETL processing: copy activity, custom Azure development, Synapse, etc. (see the sketch below)
• Knowledge of Azure Data Catalog, Event Grid, Service Bus, SQL, and Purview.
• Good technical knowledge of the Microsoft SQL Server BI suite (ETL, reporting, analytics, dashboards) using SSIS, SSAS, SSRS, and Power BI
• Design and develop batch and real-time streaming data loads to data warehouse systems
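As a hedged sketch of the Azure Data Factory pipeline work described above, the snippet below creates and triggers a simple copy pipeline with the azure-mgmt-datafactory Python SDK. It assumes the factory and both datasets already exist; every resource name is a placeholder:

# Minimal sketch: create and run an ADF copy pipeline via the Python SDK.
# Subscription, resource group, factory, and dataset names are placeholders,
# and the referenced datasets must already exist in the factory.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobSink, BlobSource, CopyActivity, DatasetReference, PipelineResource,
)

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

copy_step = CopyActivity(
    name="CopyBlobToBlob",
    inputs=[DatasetReference(type="DatasetReference", reference_name="InputDataset")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="OutputDataset")],
    source=BlobSource(),
    sink=BlobSink(),
)
adf.pipelines.create_or_update(
    "<resource-group>", "<factory-name>", "CopyPipeline",
    PipelineResource(activities=[copy_step]),
)
run = adf.pipelines.create_run("<resource-group>", "<factory-name>", "CopyPipeline")
print("pipeline run id:", run.run_id)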
Other Requirements:
A Bachelor's or Master's degree (Engineering or computer-related degree preferred)
Strong understanding of Software Development Life Cycles including Agile/Scrum
Responsibilities:
• Ability to create complex, enterprise-transforming applications that meet and exceed client expectations.
• Responsible for the bottom line, with strong project management abilities and the ability to keep the team on timeline.
Job Title: AWS-Azure Data Engineer with Snowflake
Location: Bangalore, India
Experience: 4+ years
Budget: 15 to 20 LPA
Notice Period: Immediate joiners or less than 15 days
Job Description:
We are seeking an experienced AWS-Azure Data Engineer with expertise in Snowflake to join our team in Bangalore. As a Data Engineer, you will be responsible for designing, implementing, and maintaining data infrastructure and systems using AWS, Azure, and Snowflake. Your primary focus will be on developing scalable and efficient data pipelines, optimizing data storage and processing, and ensuring the availability and reliability of data for analysis and reporting.
Responsibilities:
- Design, develop, and maintain data pipelines on AWS and Azure to ingest, process, and transform data from various sources.
- Optimize data storage and processing using cloud-native services and technologies such as AWS S3, AWS Glue, Azure Data Lake Storage, Azure Data Factory, etc.
- Implement and manage data warehouse solutions using Snowflake, including schema design, query optimization, and performance tuning (see the sketch after this list).
- Collaborate with cross-functional teams to understand data requirements and translate them into scalable and efficient technical solutions.
- Ensure data quality and integrity by implementing data validation, cleansing, and transformation processes.
- Develop and maintain ETL processes for data integration and migration between different data sources and platforms.
- Implement and enforce data governance and security practices, including access control, encryption, and compliance with regulations.
- Collaborate with data scientists and analysts to support their data needs and enable advanced analytics and machine learning initiatives.
- Monitor and troubleshoot data pipelines and systems to identify and resolve performance issues or data inconsistencies.
- Stay updated with the latest advancements in cloud technologies, data engineering best practices, and emerging trends in the industry.
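To make the Snowflake responsibilities concrete, here is a minimal sketch using the snowflake-connector-python package; the account, credentials, and table definition are placeholders, and the CLUSTER BY clause stands in for the kind of performance-tuning lever the role mentions:

# Minimal Snowflake sketch with snowflake-connector-python; all connection
# details and the table are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account-identifier>",
    user="<user>",
    password="<password>",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()
try:
    # Clustering a large table on a common filter column is one standard
    # Snowflake performance-tuning technique.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS orders (
            order_id NUMBER,
            customer_id NUMBER,
            order_date DATE,
            total NUMBER(12, 2)
        ) CLUSTER BY (order_date)
    """)
    cur.execute("SELECT COUNT(*) FROM orders")
    print("rows:", cur.fetchone()[0])
finally:
    cur.close()
    conn.close()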
Requirements:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Minimum of 4 years of experience as a Data Engineer, with a focus on AWS, Azure, and Snowflake.
- Strong proficiency in data modelling, ETL development, and data integration.
- Expertise in cloud platforms such as AWS and Azure, including hands-on experience with data storage and processing services.
- In-depth knowledge of Snowflake, including schema design, SQL optimization, and performance tuning.
- Experience with scripting languages such as Python or Java for data manipulation and automation tasks.
- Familiarity with data governance principles and security best practices.
- Strong problem-solving skills and ability to work independently in a fast-paced environment.
- Excellent communication and interpersonal skills to collaborate effectively with cross-functional teams and stakeholders.
- Immediate joiner or notice period less than 15 days preferred.
If you possess the required skills and are passionate about leveraging AWS, Azure, and Snowflake to build scalable data solutions, we invite you to apply. Please submit your resume and a cover letter highlighting your relevant experience and achievements in the AWS, Azure, and Snowflake domains.
Role: IoT Application Development (Java)
Skill Set:
- Proficiency in Java 11.
- Strong knowledge of Spring Boot framework.
- Experience with Kubernetes.
- Familiarity with Kafka.
- Understanding of Azure Cloud services.
Experience: 3 to 5 years | Location: Bangalore | Notice period: Immediate joiners
- Job Description: We are seeking an experienced IoT Application Developer with expertise in Java to join our team in Bangalore. As a Java Developer, you will be responsible for designing, developing, and deploying IoT applications. You should have a solid understanding of Java 11 and the Spring Boot framework. Experience with Kubernetes and Kafka is also required, and familiarity with Azure Cloud services is essential. Your role will involve collaborating with the development team to build scalable and efficient IoT solutions using Java and related technologies.
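Kafka integration is central to this role; the sketch below shows a minimal produce/consume round trip with the kafka-python client. It is written in Python for consistency with the other sketches on this page (the posting itself targets Java/Spring Boot), and the broker address and topic are placeholders:

# Minimal Kafka produce/consume sketch using kafka-python; broker address
# and topic name are placeholders.
from kafka import KafkaConsumer, KafkaProducer

producer = KafkaProducer(bootstrap_servers="localhost:9092")
producer.send("device-telemetry", b'{"device_id": "d-42", "temp_c": 21.5}')
producer.flush()  # block until the message is actually sent

consumer = KafkaConsumer(
    "device-telemetry",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,  # stop iterating if no message arrives in 5s
)
for message in consumer:
    print(message.value)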
RESPONSIBILITIES
- Become an expert in our technology and platforms
- Provide top-tier, highly skilled and attentive support to our consumer app teams.
- Perform production support, troubleshooting and maintenance tasks with a focus on quality and timeliness
- Work with the Implementations and Delivery Teams on defect resolution and solution delivery
- As an active member of the backend team you would utilize and promote best practices and standards
- You’ll develop new technical skills and gain industry knowledge
- You’ll provide occasional “after-hours” incident response and support
- Maintain software to integrate with internal back-end systems
- Build tools to reduce the occurrence of errors and improve the customer experience
- Manage system troubleshooting and maintenance.
Required skills and qualifications
• 4+ years' experience supporting web and backend application software and environments preferred
• Ability to prioritize effectively and handle shifting priorities professionally
• Experience with version management systems such as Git and Subversion
• Knowledge of payment gateways (Razorpay, Juspay); see the sketch after this list
• Experience with web application servers; basic familiarity with AWS and Azure
• Databases (SQL and NoSQL) and the SQL query language
• Good understanding of networking principles
• An understanding of and background in object-oriented programming languages
• Full-stack development knowledge required
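For the payment-gateway item above, here is a minimal sketch of creating an order with the official razorpay Python SDK; the API key pair, amount, and receipt id are placeholders:

# Minimal payment-gateway sketch using the razorpay Python SDK; the key pair,
# amount, and receipt id are placeholders.
import razorpay

client = razorpay.Client(auth=("<key_id>", "<key_secret>"))

# Create an order for INR 500.00 (Razorpay amounts are in paise).
order = client.order.create({
    "amount": 50000,
    "currency": "INR",
    "receipt": "rcpt-0001",
})
print("order id:", order["id"])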
Preferred skills and qualifications
• Bachelor of Science degree (or equivalent) in computer science, engineering, or a relevant field
· Core responsibilities include analyzing business requirements and designs for accuracy and completeness, and developing and maintaining the relevant product.
· BlueYonder is seeking a Senior/Principal Architect in the Data Services department (under the Luminate Platform) to act as one of the key technology leaders building and managing BlueYonder's technology assets in the Data Platform and Services.
· This individual will act as a trusted technical advisor and strategic thought leader to the Data Services department. The successful candidate will have the opportunity to lead, participate, guide, and mentor other people in the team on architecture and design in a hands-on manner, and will be responsible for the technical direction of the Data Platform. This position reports to the Global Head, Data Services and will be based in Bangalore, India.
· Core responsibilities include architecting and designing (along with counterparts and distinguished architects) a ground-up, cloud-native (we use Azure) SaaS product in order management and micro-fulfillment.
· The team currently comprises 60+ global associates across the US, India (COE), and the UK, and is expected to grow rapidly. The incumbent will need leadership qualities to mentor junior and mid-level software associates on the team, and will lead the Data Platform architecture: streaming and bulk, with Snowflake, Elasticsearch, and other tools.
Our current technical environment:
· Software: Java, Spring Boot, Gradle, Git, Hibernate, REST API, OAuth, Snowflake
· Application Architecture: scalable, resilient, event-driven, secure multi-tenant microservices architecture (see the sketch below)
· Cloud Architecture: MS Azure (ARM templates, AKS, HDInsight, Application Gateway, Virtual Networks, Event Hub, Azure AD)
· Frameworks/Others: Kubernetes, Kafka, Elasticsearch, Spark, NoSQL, RDBMS, Spring Boot, Gradle, Git, Ignite
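As a small illustration of the event-driven Azure stack listed above, the sketch below publishes one event to Azure Event Hubs with the azure-eventhub Python SDK; the connection string and hub name are placeholders:

# Minimal event-driven sketch: publish one event to Azure Event Hubs.
# Connection string and hub name are placeholders.
from azure.eventhub import EventData, EventHubProducerClient

producer = EventHubProducerClient.from_connection_string(
    conn_str="<event-hubs-connection-string>",
    eventhub_name="order-events",  # hypothetical hub name
)
with producer:
    batch = producer.create_batch()
    batch.add(EventData('{"order_id": "o-1001", "status": "CREATED"}'))
    producer.send_batch(batch)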