11+ Software maintenance Jobs in India

Job Responsibilities:
This role requires you to work on Linux systems and their associated services which provide the capability for IG to run their trading platform. Team responsibilities include daily troubleshooting and resolution of incidents, operational maintenance, and support for proactive and preventative analysis of Production and Development systems.
- Managing the Linux infrastructure and web technologies.
- Patching and upgrades of the Red Hat Linux OS and server firmware.
- General Red Hat Linux system administration and networking.
- Troubleshooting and issue resolution of OS and network stack incidents.
- Configuration management using Puppet and version control.
- Systems monitoring and availability.
- Web applications and application routing.
- Website infrastructure, content delivery, and security.
Day-to-day responsibilities will include:
1. Completing service requests and responding to Incidents and Problems as they arise.
2. Providing day-to-day support and troubleshooting for Production and Development systems.
3. Create a run book of operational processes and follow a support matrix of products.
4. Ensure internal handovers are completed and all OS documentation is kept up to date.
5. Troubleshoot system issues, plan for future capacity, and monitor systems performance.
6. Proactively monitor the Linux platform and own the related tools and dashboards (a minimal monitoring sketch follows this list).
7. Work with the delivery and engineering teams to develop the platform and technologies, striving to automate where possible.
8. Continuously improve the team, tools, and processes; support regular agile releases of applications and architectural improvements.
9. Participate in a team rota to provide out-of-hours support.
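For illustration only, here is a minimal sketch of the kind of proactive health check referenced above, using only the Python standard library; the service names and the 90% disk threshold are assumptions, not part of the role description.

# Minimal monitoring sketch: disk usage and service health on a Linux host.
# Service names and the 90% threshold are illustrative assumptions.
import shutil
import subprocess

DISK_THRESHOLD = 0.90
SERVICES = ["httpd", "puppet"]  # hypothetical services to watch

def check_disk(path: str = "/") -> None:
    usage = shutil.disk_usage(path)
    used_fraction = usage.used / usage.total
    status = "ALERT" if used_fraction > DISK_THRESHOLD else "OK"
    print(f"[{status}] {path} is {used_fraction:.0%} full")

def check_service(name: str) -> None:
    # `systemctl is-active` returns 0 when the unit is running
    result = subprocess.run(["systemctl", "is-active", name],
                            capture_output=True, text=True)
    state = result.stdout.strip() or "unknown"
    print(f"[{'OK' if result.returncode == 0 else 'ALERT'}] {name}: {state}")

if __name__ == "__main__":
    check_disk("/")
    for svc in SERVICES:
        check_service(svc)

In practice a check like this would feed an existing monitoring dashboard rather than print to stdout.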
Person Specification:
Ability / Expertise
This position is suited to an engineer with at least 8 years of Red Hat Linux / CentOS systems administration experience who is looking to broaden their range of technologies and work using modern tools and techniques.
We are looking for someone with the right attitude:
• Eager to learn new technologies, tools, and techniques alongside applying their existing skills and judgment.
• Pragmatic approach to balancing different work priorities such as incidents, requests, and troubleshooting.
• A can-do, proactive attitude towards improving the environments around them.
• Sets the desired goal and the plans to achieve it.
• Proud of their achievements and keen to improve further.
This will be a busy role in a team, so the successful candidate’s behaviors will need to align strongly with our values:
• Champion the client: customer service is a passion, cultivates trust, has clarity and communicates well, works with pace and momentum
• Lead the way: innovative and resilient, strong learning agility and curiosity
• Love what we do: conscientious, with high self-discipline, carefulness, thoroughness, and organization; flexible and adaptable
The successful candidate will be able to relate to the statements above and give examples that back them up. We believe that previous achievements signpost a good fit at IG.
Qualifications
Essential:
• At least 4 years’ systems administration experience with Red Hat Enterprise Linux / CentOS 5/6/7 managed through a Satellite infrastructure.
• Managed an estate of 1000+ hosts and performed general system administration, networking, backup and restore, monitoring, and troubleshooting functions on that estate.
• At least 1 year of experience with scripting languages (bash/Perl/Ruby) and automating tasks with Puppet and Red Hat Satellite. Experience with custom RPM generation.
• Strong analytical and troubleshooting skills. You will have resolved complex systems issues in your last role and have a solid understanding of the tools needed to do so.
• Excellent communication skills (listening, speaking, and conveying concepts with or without examples).
• Calm under pressure and able to work to tight deadlines. You will have brought critical production systems back to life.
Job Description:
We are seeking a highly experienced Senior Data Modeler / Solution Architect to join the Data Architecture team at the Corporate Office in Bangalore. The ideal candidate will have 4 to 8 years of experience in data modeling and architecture, with deep expertise in the AWS cloud stack, data warehousing, and enterprise data modeling tools. This individual will be responsible for designing and creating enterprise-grade data models and driving the implementation of a Layered Scalable Architecture or Medallion Architecture to support robust, scalable, and high-quality data marts across multiple business units.
This role will involve managing complex datasets from systems like PoS, ERP, CRM, and external sources, while optimizing performance and cost. You will also provide strategic leadership on data modeling standards, governance, and best practices, ensuring the foundation for analytics and reporting is solid and future-ready.
Key Responsibilities:
· Design and deliver conceptual, logical, and physical data models using tools like ERWin.
· Implement Layered Scalable Architecture / Medallion Architecture for building scalable, standardized data marts.
· Optimize performance and cost of AWS-based data infrastructure (Redshift, S3, Glue, Lambda, etc.).
· Collaborate with cross-functional teams (IT, business, analysts) to gather data requirements and ensure model alignment with KPIs and business logic.
· Develop and optimize SQL code, materialized views, and stored procedures in AWS Redshift (an illustrative sketch follows this list).
· Ensure data governance, lineage, and quality mechanisms are established across systems.
· Lead and mentor technical teams in an Agile project delivery model.
· Manage data layer creation and documentation: data dictionary, ER diagrams, purpose mapping.
· Identify data gaps and availability issues with respect to source systems.
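As an illustration of the Redshift work mentioned above, the following is a minimal, hypothetical sketch that creates and refreshes a materialized view from Python; the cluster endpoint, credentials, the pos_transactions table, and the view name are all assumptions and not part of the actual environment.

# Hypothetical sketch: build and refresh a Redshift materialized view from Python.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.ap-south-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="etl_user", password="***",
)
conn.autocommit = True  # run each statement outside a transaction block

create_mv = """
CREATE MATERIALIZED VIEW mv_daily_store_sales AS
SELECT store_id,
       DATE_TRUNC('day', transaction_ts) AS sales_date,
       SUM(net_amount)                   AS total_sales
FROM pos_transactions
GROUP BY store_id, DATE_TRUNC('day', transaction_ts);
"""

with conn.cursor() as cur:
    cur.execute(create_mv)                                           # build the aggregate once
    cur.execute("REFRESH MATERIALIZED VIEW mv_daily_store_sales;")   # refresh on each run thereafter

conn.close()

A daily aggregate like this would typically back a Tableau, Power BI, or QuickSight dashboard rather than be queried directly.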
Required Skills & Qualifications:
· Bachelor’s or Master’s degree in Computer Science, IT, or related field (B.E./B.Tech/M.E./M.Tech/MCA).
· Minimum 4 years of experience in data modeling and architecture.
· Proficiency with data modeling tools such as ERWin, with strong knowledge of forward and reverse engineering.
· Deep expertise in SQL (including advanced SQL, stored procedures, performance tuning).
· Strong experience in data warehousing, RDBMS, and ETL tools like AWS Glue, IBM DataStage, or SAP Data Services.
· Hands-on experience with AWS services: Redshift, S3, Glue, RDS, Lambda, Bedrock, and Q.
· Good understanding of reporting tools such as Tableau, Power BI, or AWS QuickSight.
· Exposure to DevOps/CI-CD pipelines, AI/ML, Gen AI, NLP, and polyglot programming is a plus.
· Familiarity with data governance tools (e.g., ORION/EIIG).
· Domain knowledge in Retail, Manufacturing, HR, or Finance preferred.
· Excellent written and verbal communication skills.
Certifications (Preferred):
· AWS Certification (e.g., AWS Certified Solutions Architect or Data Analytics – Specialty)
· Data Governance or Data Modeling Certifications (e.g., CDMP, Databricks, or TOGAF)
Mandatory Skills
AWS, Technical Architecture, AI/ML, SQL, Data Warehousing, Data Modelling


Desired Competencies (Technical/Behavioral Competency)
Must-Have
- Experience in working with various ML libraries and packages like scikit-learn, NumPy, pandas, TensorFlow, Matplotlib, Caffe, etc.
- Deep Learning Frameworks: PyTorch, spaCy, Keras
- Deep Learning Architectures: LSTM, CNN, self-attention, and Transformers (a minimal LSTM sketch follows this list)
- Experience in working with image processing and computer vision is a must
- Designing data science applications, Large Language Models (LLMs), Generative Pre-trained Transformers (GPT), generative AI techniques, Natural Language Processing (NLP), machine learning techniques, Python, Jupyter Notebook, common data science packages (TensorFlow, scikit-learn, Keras, etc.), LangChain, Flask, FastAPI, prompt engineering.
- Programming experience in Python
- Strong written and verbal communications
- Excellent interpersonal and collaboration skills.
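To illustrate the LSTM architectures listed above, here is a minimal PyTorch sketch of a bidirectional LSTM text classifier; the vocabulary size, layer dimensions, and the two-class task are illustrative assumptions, not requirements from the posting.

# Minimal illustrative sketch of an LSTM-based text classifier in PyTorch.
import torch
import torch.nn as nn

class LSTMTextClassifier(nn.Module):
    def __init__(self, vocab_size: int, embed_dim: int = 128,
                 hidden_dim: int = 256, num_classes: int = 2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len) integer-encoded text
        embedded = self.embedding(token_ids)
        _, (hidden, _) = self.lstm(embedded)
        # concatenate the final forward and backward hidden states
        pooled = torch.cat([hidden[-2], hidden[-1]], dim=1)
        return self.classifier(pooled)

if __name__ == "__main__":
    model = LSTMTextClassifier(vocab_size=10_000)
    dummy_batch = torch.randint(1, 10_000, (4, 32))  # 4 sequences of 32 tokens
    print(model(dummy_batch).shape)  # torch.Size([4, 2])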
Role descriptions / Expectations from the Role
Design and implement scalable and efficient data architectures to support generative AI workflows.
Fine-tune and optimize large language models (LLMs) for generative AI; conduct performance evaluation and benchmarking for LLMs and machine learning models.
Apply prompt engineering techniques as required by the use case (a small prompt-building sketch follows this section).
Collaborate with research and development teams to build large language models for generative AI use cases; plan and break down larger data science tasks into lower-level tasks.
Lead junior data engineers on tasks such as data pipeline design, dataset creation, and deployment; use data visualization tools, machine learning techniques, natural language processing, feature engineering, deep learning, and statistical modelling as required by the use case.
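For illustration of the prompt engineering mentioned above, here is a minimal, hypothetical few-shot prompt builder; the sentiment task, examples, and template are assumptions and not drawn from the posting.

# Hypothetical few-shot prompt builder; the task and examples are illustrative.
FEW_SHOT_EXAMPLES = [
    ("The delivery was late and the box was damaged.", "negative"),
    ("Setup took two minutes and support was lovely.", "positive"),
]

def build_prompt(review: str) -> str:
    """Assemble a few-shot classification prompt to send to an LLM."""
    lines = ["Classify each customer review as positive or negative.", ""]
    for text, label in FEW_SHOT_EXAMPLES:
        lines.append(f"Review: {text}")
        lines.append(f"Sentiment: {label}")
        lines.append("")
    lines.append(f"Review: {review}")
    lines.append("Sentiment:")
    return "\n".join(lines)

if __name__ == "__main__":
    print(build_prompt("The app crashes every time I open my cart."))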
Purchase Manager Technical Profile JD
Male candidate with 10-15 years of experience in the engineering industry (manufacturing) who can handle a purchase team independently.
Qualification – Diploma or Mechanical Engineer
Good technical knowledge of sheets, rods, pipes, structural steel, castings, and outsourcing jobs
Knowledge of drawings and design
Good communication and negotiation skills
Expertise in vendor scouting / sourcing
Identify cost-reduction areas
Must have good knowledge of the metal market
Must have knowledge of the job-work process
Should be able to handle a team independently
Ready to travel anywhere to source material
Purchase MIS
Location: Koparkhairane, Navi Mumbai
Salary: up to 80k
About Company
Our client is one of India's finest luxury real estate brands. Renowned for leadership and excellence in real estate development, its prime focus is on planned development, creating value assets, and crafting lifestyle experiences through design and architecture.
Roles and Responsibilities
1. Develop visual concepts and ideas to conceptualize designs that meet client requirements and project objectives.
2. Create various design elements such as logos, illustrations, icons, typography, layouts, and other visual assets, using design software like Adobe Photoshop, Illustrator, or InDesign to produce high-quality designs.
3. Work on visual identities for the company: designing logos, selecting color schemes, creating brand guidelines, and ensuring brand consistency across different platforms.
4. Create designs for both print and digital media, such as brochures, business cards, posters, packaging, websites, social media graphics, and other digital assets.
5. Collaborate with clients, marketing teams, copywriters, and other stakeholders to understand project requirements, provide design recommendations, and incorporate feedback.
6. Prepare design files for print or digital production. The candidate should know file formats, color modes, and resolution requirements to ensure the quality and accuracy of the final output.
Competencies and Required Skills
1. Proficiency in design software, including Adobe Photoshop, Illustrator, InDesign, Sketch, or other design tools.
2. A good understanding of different typefaces, font pairings, and how to use typography to enhance the overall visual impact of designs.
3. The ability to work well in a team, listen to feedback, and incorporate input from others is important for successful project execution.
4. The ability to create well-balanced layouts and compositions is essential, along with accuracy and consistency in designs.
5. Strong communication skills are crucial for understanding project requirements, presenting design concepts, and incorporating feedback effectively.
6. Being able to manage time effectively, prioritize tasks, and work efficiently is essential for delivering projects on time. Candidates should be adaptable and willing to learn new design trends, techniques, and tools to stay relevant and provide fresh design solutions.
Key Skills: OpenText AppWorks, OpenText Process Suite (all sub-components), Java, JavaScript, XML, SQL, BIRT.
Required Skills:
- Ability to translate complex business requirements into functional technical requirements using the OpenText AppWorks (erstwhile Process Suite) low-code design and integration framework.
- Knowledge of implementing OpenText AppWorks integration services using REST, SOAP, email, etc.
- Good understanding of OpenText AppWorks case management features and BPM features.
- Experience in various rules and features like LifeCycle, Activities, User Interface (XForms and Low Code design), SLA, KPI, OTDS, CARS, AppWorks Gateway, Single-Sign-On, External Authentication, SAML, Listeners (File, MQ), SOAP Services, Email integration, HTTP Connector, AppWorks integration with Content/Document Management, etc.
- Experience in full OpenText AppWorks implementation including Entity modelling and Rulesets design.
- Expertise in Java, JavaScript, SQL Server Database.
- Expertise in building web services and REST service APIs.
- Nice to have: familiarity with designing and developing applications in cloud environments (preferably Azure).
- Authentication: SAML 2.0, OT.
- Knowledge of OTDS and integration with OTDS
- Exposure to HTML5, Angular, and integration of AppWorks with Angular will be a plus.
- Exposure to another OpenText platform (ECM, xECM, Captiva, iHUB).
- A strong passion for learning and adapting to new technologies.
- Keeping up to date with alterations to immigration law.
- Meeting with prospective and existing clients to gauge which services they require.
- Providing clients with all pertinent documentation.
- Assisting clients with the completion of paperwork, and ensuring that this is submitted on time.
- Verifying the authenticity of paperwork and supporting documents.
- Preparing and providing invoices for your services.

Job Description:
We are looking for a Big Data Engineer who has worked across the entire ETL stack: someone who has ingested data in batch and live-stream formats, transformed large volumes of data daily, built a data warehouse to store the transformed data, and integrated different visualization dashboards and applications with the data stores. The primary focus will be on choosing optimal solutions to use for these purposes, then maintaining, implementing, and monitoring them.
Responsibilities:
- Develop, test, and implement data solutions based on functional / non-functional business requirements.
- You would be required to code in Scala and PySpark daily, on cloud as well as on-prem infrastructure (a minimal PySpark sketch follows this list)
- Build data models to store the data in the most optimized manner
- Identify, design, and implement process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Implement the ETL process and an optimal data pipeline architecture
- Monitor performance and advise on any necessary infrastructure changes.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Proactively identify potential production issues and recommend and implement solutions
- Must be able to write quality code and build secure, highly available systems.
- Create design documents that describe the functionality, capacity, architecture, and process.
- Review peers' code and pipelines before deploying to Production, checking for optimization issues and code standards
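As a minimal illustration of the PySpark work mentioned in this list, here is a batch extract-transform-load sketch; the S3 paths, table layout, and column names (order_ts, order_amount, etc.) are assumptions, not details from the posting.

# Minimal PySpark batch-ETL sketch; paths and columns are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_orders_etl").getOrCreate()

# Extract: read the raw batch data landed in object storage
raw = spark.read.parquet("s3://example-bucket/raw/orders/")

# Transform: basic cleansing and a daily aggregate
daily = (
    raw.filter(F.col("order_status") == "COMPLETED")
       .withColumn("order_date", F.to_date("order_ts"))
       .groupBy("order_date", "store_id")
       .agg(F.sum("order_amount").alias("total_sales"),
            F.countDistinct("order_id").alias("order_count"))
)

# Load: write the curated layer, partitioned for downstream dashboards
(daily.write.mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-bucket/curated/daily_sales/"))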
Skill Sets:
- Good understanding of optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and ‘big data’ technologies.
- Proficient understanding of distributed computing principles
- Experience in working with batch processing/ real-time systems using various open-source technologies like NoSQL, Spark, Pig, Hive, Apache Airflow.
- Implemented complex projects dealing with considerable data sizes (petabyte scale).
- Optimization techniques (performance, scalability, monitoring, etc.)
- Experience with integration of data from multiple data sources
- Experience with NoSQL databases, such as HBase, Cassandra, MongoDB, etc.
- Knowledge of various ETL techniques and frameworks, such as Flume
- Experience with various messaging systems, such as Kafka or RabbitMQ
- Creation of DAGs for data engineering (see the Airflow sketch after this list)
- Expert at Python/Scala programming, especially for data engineering/ETL purposes
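To illustrate DAG creation for data engineering, here is a minimal Apache Airflow (2.x) sketch; the dag_id, daily schedule, and placeholder task callables are assumptions for illustration only.

# Minimal Airflow DAG sketch; schedule and task bodies are illustrative placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull yesterday's batch from the source system")

def transform():
    print("clean and aggregate the extracted data")

def load():
    print("write the result to the warehouse")

with DAG(
    dag_id="daily_etl_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # run extract, then transform, then load
    t_extract >> t_transform >> t_load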
Location: Kolkata
Process: Website
Essential Criteria:
We therefore welcome candidates who meet the below criteria:
· Must have website process experience (International Outbound Voice)
· Must have a minimum of 4 years of international BPO exposure
· Must currently be in the same role (Team Leader / Supervisor / Accounts Manager) in a web process
· Excellent written and verbal communication
· Good facilitation and leadership skills
Salary: Best in Industry
NOTE: CANDIDATES MUST BE FROM KOLKATA ONLY. OUTSTATION CANDIDATES ARE NOT ELIGIBLE.
For queries: please contact between 11 am and 6 pm only, Monday to Friday.
HR Priya
Looking forward to hearing from you.
Best Regards
Priya Shaw
Lead Human Resource

