11+ IBM AIX Jobs in India
Apply to 11+ IBM AIX Jobs on CutShort.io. Find your next job, effortlessly. Browse IBM AIX Jobs and apply today!
Overall 5+ years of experience required in Finacle Development/Support

Global digital transformation solutions provider.
Role Proficiency:
This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools such as Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.
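The ingest/wrangle/transform/join stages described above can be sketched in a few lines of plain Python. This is only an illustration of the stages, not the Databricks/Glue APIs themselves; the records and field names are invented.

```python
# Minimal pure-Python sketch of the ingest -> wrangle -> transform/join stages
# described above. The records and field names are invented for illustration;
# a real pipeline would use PySpark/Glue/Databricks APIs.

def ingest():
    # Stand-in for reading from source systems (files, a queue, an API).
    orders = [
        {"order_id": 1, "cust_id": "A", "amount": "120.50"},
        {"order_id": 2, "cust_id": "B", "amount": None},       # dirty row
        {"order_id": 3, "cust_id": "A", "amount": "75.00"},
    ]
    customers = [{"cust_id": "A", "region": "EU"}, {"cust_id": "B", "region": "US"}]
    return orders, customers

def wrangle(orders):
    # Drop rows with missing amounts and cast the string amounts to floats.
    return [dict(o, amount=float(o["amount"])) for o in orders if o["amount"] is not None]

def join(orders, customers):
    # Enrich each order with the customer's region (an inner join on cust_id).
    regions = {c["cust_id"]: c["region"] for c in customers}
    return [dict(o, region=regions[o["cust_id"]]) for o in orders if o["cust_id"] in regions]

orders, customers = ingest()
enriched = join(wrangle(orders), customers)
total_by_region = {}
for row in enriched:                      # aggregate: total amount per region
    total_by_region[row["region"]] = total_by_region.get(row["region"], 0.0) + row["amount"]
```

The same shape carries over to PySpark, where each function becomes a DataFrame transformation and the join is a `DataFrame.join` on the key column.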
Skill Examples:
- Proficiency in SQL, Python, or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery).
- Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
- Experience in performance tuning.
- Experience in data warehouse design and cost improvements.
- Apply and optimize data models for efficient storage retrieval and processing of large datasets.
- Communicate and explain design/development aspects to customers.
- Estimate time and resource requirements for developing/debugging features/components.
- Participate in RFP responses and solutioning.
- Mentor team members and guide them in relevant upskilling and certification.
Knowledge Examples:
- Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF, and ADLF.
- Proficient in SQL for analytics and windowing functions.
- Understanding of data schemas and models.
- Familiarity with domain-related data.
- Knowledge of data warehouse optimization techniques.
- Understanding of data security concepts.
- Awareness of patterns, frameworks, and automation practices.
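The "SQL for analytics and windowing functions" item above can be made concrete with a small example, run here against an in-memory SQLite database. The table and column names are invented for illustration, and window-function support requires SQLite 3.25 or later (bundled with modern Python).

```python
# Sketch of SQL windowing functions using an in-memory SQLite database.
# Table/column names are invented; requires SQLite >= 3.25.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, revenue REAL);
    INSERT INTO sales VALUES
        ('EU', '2025-01', 100), ('EU', '2025-02', 150),
        ('US', '2025-01', 200), ('US', '2025-02', 120);
""")

# Two common windowing patterns: RANK within a partition, and a running
# total (SUM ... OVER an ordered partition).
rows = conn.execute("""
    SELECT region, month, revenue,
           RANK() OVER (PARTITION BY region ORDER BY revenue DESC) AS rnk,
           SUM(revenue) OVER (PARTITION BY region ORDER BY month)  AS running
    FROM sales
    ORDER BY region, month
""").fetchall()
```

The same `OVER (PARTITION BY ... ORDER BY ...)` syntax carries over to Snowflake, BigQuery, and Redshift.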
Additional Comments:
# of Resources: 22
Role(s): Technical Role
Location(s): India
Planned Start Date: 1/1/2026
Planned End Date: 6/30/2026
Project Overview:
Role Scope / Deliverables: We are seeking a highly skilled Data Engineer with strong experience in Databricks, PySpark, Python, SQL, and AWS to join our data engineering team on or before the first week of December 2025.
The candidate will be responsible for designing, developing, and optimizing large-scale data pipelines and analytics solutions that drive business insights and operational efficiency.
- Design, build, and maintain scalable data pipelines using Databricks and PySpark.
- Develop and optimize complex SQL queries for data extraction, transformation, and analysis.
- Implement data integration solutions across multiple AWS services (S3, Glue, Lambda, Redshift, EMR, etc.).
- Collaborate with analytics, data science, and business teams to deliver clean, reliable, and timely datasets.
- Ensure data quality, performance, and reliability across data workflows.
- Participate in code reviews, data architecture discussions, and performance optimization initiatives.
- Support migration and modernization efforts for legacy data systems to modern cloud-based solutions.
Key Skills:
Hands-on experience with Databricks, PySpark & Python for building ETL/ELT pipelines.
Proficiency in SQL (performance tuning, complex joins, CTEs, window functions).
Strong understanding of AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).
Experience with data modeling, schema design, and performance optimization.
Familiarity with CI/CD pipelines, version control (Git), and workflow orchestration (Airflow preferred).
Excellent problem-solving, communication, and collaboration skills.
Skills: Databricks, PySpark & Python, SQL, AWS Services
Must-Haves
Python/PySpark (5+ years), SQL (5+ years), Databricks (3+ years), AWS Services (3+ years), ETL tools (Informatica, Glue, DataProc) (3+ years)
Hands-on experience with Databricks, PySpark & Python for ETL/ELT pipelines.
Proficiency in SQL (performance tuning, complex joins, CTEs, window functions).
Strong understanding of AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).
Experience with data modeling, schema design, and performance optimization.
Familiarity with CI/CD pipelines, Git, and workflow orchestration (Airflow preferred).
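The workflow orchestration called out above (Airflow preferred) boils down to running tasks in dependency order over a DAG. The sketch below shows that core idea in plain Python; it is not the Airflow API, and the task names are invented for illustration.

```python
# Minimal sketch of DAG-style orchestration: run each task only after its
# upstream dependencies have finished. Not the Airflow API; task names are
# invented for illustration.
def run_dag(tasks, deps):
    """tasks: {name: callable}; deps: {name: [upstream names]}."""
    done, order = set(), []

    def run(name):
        if name in done:
            return
        for upstream in deps.get(name, []):   # run prerequisites first
            run(upstream)
        tasks[name]()
        done.add(name)
        order.append(name)

    for name in tasks:
        run(name)
    return order

log = []
tasks = {
    "extract":   lambda: log.append("extract"),
    "transform": lambda: log.append("transform"),
    "load":      lambda: log.append("load"),
}
deps = {"transform": ["extract"], "load": ["transform"]}
order = run_dag(tasks, deps)
```

In Airflow the same structure is declared with operators and the `>>` dependency syntax; the scheduler then performs this ordering (plus retries, backfills, and parallelism) for you.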
Notice period - Immediate to 15 days
Location: Bangalore
Springer Capital is a cross-border asset management firm specializing in real estate investment banking between China and the USA. We are offering a remote internship for aspiring data engineers interested in data pipeline development, data integration, and business intelligence.
The internship offers flexible start and end dates. A short quiz or technical task may be required as part of the selection process.
Responsibilities:
- Design, build, and maintain scalable data pipelines for structured and unstructured data sources
- Develop ETL processes to collect, clean, and transform data from internal and external systems
- Support integration of data into dashboards, analytics tools, and reporting systems
- Collaborate with data analysts and software developers to improve data accessibility and performance
- Document workflows and maintain data infrastructure best practices
- Assist in identifying opportunities to automate repetitive data tasks
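The collect/clean/transform responsibilities above can be sketched with the standard library alone. The CSV data and column names below are invented for illustration; a production pipeline would read from real sources and likely use pandas or Spark.

```python
# Small ETL sketch for the responsibilities above: collect raw CSV text,
# clean out malformed rows, transform the rest into typed records.
# Data and column names are invented for illustration.
import csv
import io

raw = """date,ticker,price
2025-01-02,ABC,10.5
2025-01-03,ABC,
2025-01-03,XYZ,20.0
"""

def extract(text):
    # Parse CSV rows into dicts keyed by the header.
    return list(csv.DictReader(io.StringIO(text)))

def clean(rows):
    # Drop rows with a missing price.
    return [r for r in rows if r["price"]]

def transform(rows):
    # Cast the string fields into typed records ready for loading.
    return [{"date": r["date"], "ticker": r["ticker"], "price": float(r["price"])}
            for r in rows]

records = transform(clean(extract(raw)))
```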
We are looking for a motivated Technical Support Engineer (0–3 years' experience) to provide technical assistance for software, hardware, and network-related issues. The candidate will troubleshoot problems, assist users, and ensure smooth system operations.
Key Responsibilities:
- Provide technical support via phone, email, or remote tools.
- Troubleshoot hardware, software, and networking issues.
- Install, configure, and maintain systems and applications.
- Diagnose technical problems and provide timely solutions.
- Document issues and resolutions in the support system.
- Escalate complex issues to senior technical teams when required.
- Assist users with system setup, updates, and configurations.
- Ensure high levels of customer satisfaction and service quality.
Required Skills:
- Basic knowledge of Windows/Linux operating systems.
- Understanding of computer hardware and software troubleshooting.
- Basic networking knowledge (TCP/IP, DNS, LAN).
- Strong problem-solving and analytical skills.
- Good communication and interpersonal skills.
- Ability to learn quickly and work in a team environment.
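One piece of the TCP/IP and LAN basics listed above is recognizing which addresses are private (LAN) versus public. Python's `ipaddress` module encodes the RFC 1918 private ranges directly, as this small sketch shows.

```python
# Classify addresses as LAN (private, per RFC 1918) vs public using the
# standard-library ipaddress module.
import ipaddress

def is_lan_address(ip: str) -> bool:
    # True for private-range addresses: 10.x.x.x, 172.16-31.x.x, 192.168.x.x
    return ipaddress.ip_address(ip).is_private

results = {ip: is_lan_address(ip) for ip in ["192.168.1.10", "10.0.0.5", "8.8.8.8"]}
```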
Education:
- Bachelor’s degree in Computer Science, Information Technology, Electronics, or related field (B.E / B.Tech / B.Sc / BCA / MCA).
Eligibility:
- 0–3 years of experience in Technical Support / IT Support / Helpdesk.
- Freshers with strong technical knowledge can also apply.
Apply here: https://connectsblue.com/jobs/743/technical-support-engineer-at-bluepms-software-solutions-pvt-ltd
Job Title: Backend Engineer – Python / Golang / Rust
Location: Bangalore, India
Experience Required: Minimum 2–3 years
About the Role
We are looking for a passionate Backend Engineer to join our growing engineering team. The ideal candidate should have hands-on experience in building enterprise-grade, scalable backend systems using microservices architecture. You will work closely with product, frontend, and DevOps teams to design, develop, and optimize robust backend solutions that can handle high traffic and ensure system reliability.
Key Responsibilities
• Design, develop, and maintain scalable backend services and APIs.
• Architect and implement microservices-based systems ensuring modularity and resilience.
• Optimize application performance, database queries, and service scalability.
• Collaborate with frontend engineers, product managers, and DevOps teams for seamless delivery.
• Implement security best practices and ensure data protection compliance.
• Write and maintain unit tests, integration tests, and documentation.
• Participate in code reviews, technical discussions, and architecture design sessions.
• Monitor, debug, and improve system performance in production environments.
Required Skills & Experience
• Programming Expertise:
• Advanced proficiency in Python (Django, FastAPI, or Flask), OR
• Strong experience in Golang or Rust for backend development.
• Microservices Architecture: Hands-on experience in designing and maintaining distributed systems.
• Database Management: Expertise in PostgreSQL, MySQL, MongoDB, including schema design and optimization.
• API Development: Strong experience in RESTful APIs and GraphQL.
• Cloud Platforms: Proficiency with AWS, GCP, or Azure for deployment and scaling.
• Containerization & Orchestration: Solid knowledge of Docker and Kubernetes.
• Messaging & Caching: Experience with Redis, RabbitMQ, Kafka, and caching strategies (Redis, Memcached).
• Version Control: Strong Git workflows and collaboration in team environments.
• Familiarity with CI/CD pipelines, DevOps practices, and cloud-native deployments.
• Proven experience working on production-grade, high-traffic applications.
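The caching strategies listed above (Redis, Memcached) share one core idea: store a computed value under a key with a time-to-live. The sketch below is a minimal in-process version of that idea, not a Redis client; the key and TTL are invented for illustration.

```python
# Minimal in-process sketch of key/value caching with a TTL, the core idea
# behind Redis/Memcached caching strategies. Not a Redis client.
import time

class TTLCache:
    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store = {}          # key -> (value, expiry timestamp)

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expiry = entry
        if time.monotonic() >= expiry:   # lazy eviction on read
            del self._store[key]
            return default
        return value

cache = TTLCache(ttl_seconds=60)
cache.set("user:42", {"name": "Ada"})
```

Redis adds what this sketch omits: shared state across processes, eviction policies under memory pressure, and atomic operations.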
Preferred Qualifications
• Understanding of software architecture patterns (event-driven, CQRS, hexagonal, etc.).
• Experience with Agile/Scrum methodologies.
• Contributions to open-source projects or strong personal backend projects.
• Experience with observability tools (Prometheus, Grafana, ELK, Jaeger).
Why Join Us?
• Work on cutting-edge backend systems that power enterprise-grade applications.
• Opportunity to learn and grow with a fast-paced engineering team.
• Exposure to cloud-native, microservices-based architectures.
• Collaborative culture that values innovation, ownership, and technical excellence.
Requirement:
▪ 4–10 years' experience in automation using TypeScript, JavaScript, and Java as programming languages
▪ Able to demonstrate good technical and problem-solving skills
▪ Relevant hands-on experience developing automation scripts using tools such as Protractor, Selenium, REST Assured, Cucumber, the Page Object Model (POM), etc.
▪ Should be ready to explore, learn, and develop solutions for problems.
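The Page Object Model mentioned in the requirements above keeps page structure and locators in one class so tests talk only to its methods. This sketch uses a stub in place of a real Selenium WebDriver so it runs standalone; the page, locators, and credentials are invented for illustration.

```python
# Sketch of the Page Object Model (POM): locators and page actions live in
# one class; tests call its methods. The driver is a stub standing in for a
# real Selenium WebDriver; locators/credentials are invented.
class LoginPage:
    USERNAME = "id=username"      # locators kept in one place
    PASSWORD = "id=password"
    SUBMIT   = "id=submit"

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.type(self.USERNAME, user)
        self.driver.type(self.PASSWORD, password)
        self.driver.click(self.SUBMIT)

class StubDriver:
    # Records actions instead of driving a browser, keeping the sketch runnable.
    def __init__(self):
        self.actions = []
    def type(self, locator, text):
        self.actions.append(("type", locator, text))
    def click(self, locator):
        self.actions.append(("click", locator))

driver = StubDriver()
LoginPage(driver).login("alice", "s3cret")
```

With a real WebDriver, only the `type`/`click` calls change (to `find_element(...).send_keys(...)` etc.); the tests and page classes stay the same, which is the point of the pattern.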
Our client focuses on providing solutions in terms of data, analytics, decisioning and automation. They focus on providing solutions to the lending lifecycle of financial institutions and their products are designed to focus on systemic fraud prevention, risk management, compliance etc.
Our client is a one stop solution provider, catering to the authentication, verification and diligence needs of various industries including but not limited to, banking, insurance, payments etc.
Headquartered in Mumbai, our client was founded in 2015 by a team of three veteran entrepreneurs, two of whom are chartered accountants and one is a graduate from IIT, Kharagpur. They have been funded by tier 1 investors and have raised $1.1M in funding.
What you will do:
- Interacting with clients to understand their requirements and translating them for the developers to take forward
- Acting as a liaison between end users/clients and internal teams, helping fulfil client requests and resolve queries with minimal time and effort
- Contributing to the implementation of solutions in clients' existing process flows using our products, and communicating concepts to the product team to shape future product requirements
Desired Candidate Profile
What you need to have:
- CA, CFA, or a related qualification
- Excellent communication and task management skills
- In depth knowledge of the various Income Tax Department websites, portals, and their workings, etc
- In depth knowledge of the Income Tax Department rules, regulations, guidelines, due-dates, services, facilities, etc
- In depth knowledge of Income Tax return filing processes using XML Upload, JSON Upload, Prefilled JSON, etc
- In depth knowledge of the Income Tax Filing Regimes
- In depth knowledge of Income Tax XML responses, JSON responses, and data points pertaining to the calculation of Financial Ratios, Balance Sheet, P&L, etc
- In depth knowledge of E-Return Intermediaries and their rules, regulations, guidelines, permissions, compliances, etc
- Passable knowledge of GST, GST Portal and GST filing processes
- A good working knowledge of the financial industry and regulatory environment
- Ability to quickly grasp the various data sources that the company covers and gaining a hold over them over a period of time
- Understanding and translating statistics to address client business problems, and liaising with the analytics team to build and deliver custom solutions
- Good understanding of data query languages like SQL, along with MS Office; R/Python and other statistical analysis tools are good to have
- Ability to be creative, analytical, and think outside the box to solve problems
C# Developer Responsibilities:
Debugging and maintaining written code.
Defining and organizing projects on an ongoing basis.
Reporting and resolving issues related to .NET projects.
Identifying and handling technical risks and issues.
Working in a project team alongside other developers.
Reporting on project statuses and developments to senior team members.
Participating in project meetings with management and other team members.
C# Developer Requirements:
C#, .NET 3.5 (or higher), and Microsoft Visual Studio certification and experience.
SQLite and MS Access Database Experience (Relational Database Experience)
Ability to write clean, easy to understand code.
Outstanding analytical and problem-solving capabilities.
Excellent written and verbal communication skills.
Ability to work independently and complete projects with minimal supervision.
Sound understanding of coding and development processes.
Desired candidates must have 3-7 years of experience as a Node.js developer.
If the candidate cannot relocate to Gurgaon, we can also offer permanent work from home for this position.
Roles and responsibilities:
- Responsible for understanding functional and business requirements and translate them into effective code
- Provide support till deployment of code into production.
- Ownership for ensuring code optimization, problem diagnosis, and on-time delivery
- Implement solutions as per the pre-defined framework /guidelines and adherence to processes
- Finding an optimal solution for the problem statement
- Conduct peer code review.
What candidate should know about:
- Excellent hands-on experience with Node.js, Express.js, and JavaScript
- Understanding of the nature of asynchronous programming and its quirks and workarounds
- Excellent hands-on experience with MongoDB, Mongo aggregation, and MySQL
- Ability to build REST services, authentication, and MVC applications
- Excellent object-oriented programming skills and the ability to write modular, secure, scalable, and maintainable code
- Experience with Elasticsearch and Redis.
- Knowledge of AWS components (S3, EC2, CloudFront, Redis clusters, etc.)
- Self-learning abilities are required
- Familiarity with upcoming new technologies is a strong plus
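The asynchronous-programming quirks mentioned above arise in Node.js, but the pattern (and its classic pitfall of forgetting to await) has a direct analogue in Python's asyncio, sketched here for illustration; the coroutine names and delays are invented.

```python
# Illustration of async/await behavior analogous to the Node.js quirks above:
# concurrent awaiting via gather, and the pitfall that calling an async
# function without await yields a coroutine object, not a result.
import asyncio

async def fetch(name, delay):
    await asyncio.sleep(delay)         # stands in for non-blocking I/O
    return name

async def main():
    # Concurrent: both "requests" sleep during the same window.
    results = await asyncio.gather(fetch("users", 0.01), fetch("orders", 0.01))
    # Quirk: without await, you get a coroutine object, not a value.
    pending = fetch("oops", 0)
    is_coroutine = asyncio.iscoroutine(pending)
    pending.close()                    # avoid a "never awaited" warning
    return results, is_coroutine

results, is_coroutine = asyncio.run(main())
```

The Node.js equivalent uses `Promise.all` and the matching pitfall of holding an unresolved `Promise` instead of its value.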
We are hiring account managers who will improve our current business partnerships and successfully build new business relationships at a local level. The ideal candidate will have 3+ years of IT & peripheral sales experience and must be comfortable with in-person, customer-facing meetings.
Responsibilities:
Develop a relationship with corporate clients
Build strategies and a sales funnel aimed at generating inquiries from existing customers
Secure contracts with corporations
Manage and improve our enterprise sales
Achieve sales targets and execute sales strategies as a member of a sales team.
Start and manage the whole sales cycle, and be the focal point in all relations with an existing client (repeat flow).
Develop sales and relationship with current clients and follow up on referrals.
Build long-term relationships and referrals with senior managers
Collaborate with A2Zonrent founding team member on new business improvements based on feedback from customers and observations
Qualification
Candidates with experience in similar industries will be given preference.
Ability to hold oneself accountable for achieving high levels of individual and organizational performance.
Outstanding professional and personal references.
Highly energetic, self-motivated, and goal-oriented


