
· 3+ years of experience in cybersecurity, with a focus on application and cloud security.
· Proficiency in security tools such as Burp Suite, Metasploit, Nessus, OWASP ZAP, and SonarQube.
· Familiarity with data privacy regulations (GDPR, CCPA) and best practices.
· Basic knowledge of AI/ML security frameworks and tools.

About IntraIntel.ai
At IntraIntel.ai, we are building a next-generation, multi-tenant AI platform that enables organizations across industries—healthcare, clinical research, manufacturing, and textiles—to harness the power of intelligent automation and Generative AI. Our platform seamlessly integrates AI agents, RAG pipelines, and LLM-based workflows into a unified, scalable, and secure ecosystem hosted on Google Cloud Platform (GCP).
We are looking for a Full Stack Developer with deep experience in AI-integrated applications, cloud-native architecture, and end-to-end platform development—someone passionate about building intelligent systems that push the boundaries of innovation.
Key Responsibilities
1. Full Stack Development
- Design, build, and maintain full-stack applications with Node.js, Express.js, and modern frontend frameworks such as React.js / Angular.
- Implement RESTful APIs, GraphQL endpoints, and real-time communication features supporting multi-tenant AI workloads.
- Optimize backend logic for scalability, modularity, and high availability on GCP.
- Integrate AI-driven features (RAG, chatbots, data pipelines) into user-facing experiences.
2. AI Integration & Agentic Architecture
- Work alongside AI engineers and architects to integrate LLMs, RAG pipelines, and AI agents (using frameworks like LangChain, CrewAI, or LlamaIndex) into the product stack.
- Develop APIs and connectors for prompt orchestration, vector storage (FAISS, Chroma, Pinecone), and model inference workflows.
- Implement context-aware AI features with secure data access boundaries and performance optimization.
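The vector-storage bullet above hinges on similarity search over embeddings. As a minimal sketch of that retrieval step (the document ids, toy 3-dimensional vectors, and `retrieve` helper below are illustrative assumptions; a production stack would use a real embedding model plus FAISS, Chroma, or Pinecone rather than brute-force cosine similarity):

```python
import math

# Toy "embeddings": in production these would come from an embedding model
# and live in a vector store such as FAISS, Chroma, or Pinecone.
DOCS = {
    "doc-invoices": [0.9, 0.1, 0.0],
    "doc-clinical": [0.1, 0.9, 0.2],
    "doc-textiles": [0.0, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, k=2):
    """Return the k document ids most similar to the query vector."""
    ranked = sorted(DOCS, key=lambda d: cosine(query_vec, DOCS[d]), reverse=True)
    return ranked[:k]

# A query vector close to the "clinical" embedding ranks that document first.
print(retrieve([0.2, 0.8, 0.1]))  # ['doc-clinical', 'doc-invoices']
```

In a RAG pipeline, the retrieved documents would then be injected into the LLM prompt as context.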
3. Cloud Infrastructure & CI/CD
- Deploy, manage, and optimize applications on Google Cloud Platform (GCP) using services such as Cloud Run, GKE, BigQuery, Cloud Storage, IAM, and Pub/Sub.
- Set up and maintain CI/CD pipelines using GitHub Actions, Cloud Build, or Terraform for automated testing, integration, and deployment.
- Manage infrastructure as code (IaC), automate containerized builds, and optimize deployment strategies for multi-environment scalability.
4. UI/UX Collaboration
- Collaborate with product and design teams to transform mockups into seamless user experiences using Figma and front-end frameworks.
- Contribute to UX optimization, ensuring that AI-driven features are intuitive, responsive, and visually engaging.
- Work with designers to ensure front-end consistency across multi-tenant environments.
5. Performance, Security & Monitoring
- Ensure data privacy, scalability, and compliance through role-based access control (RBAC), encryption, and secure API practices.
- Monitor system performance using Cloud Monitoring / OpenTelemetry, ensuring uptime and reliability.
- Participate in architectural discussions to enhance system observability and security posture.
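The role-based access control (RBAC) called out above can be sketched as a small decorator that gates handlers by role. The role names, `Forbidden` error, and `read_tenant_report` handler are illustrative assumptions, not the platform's actual model:

```python
from functools import wraps

class Forbidden(Exception):
    """Raised when a user's role does not permit the requested action."""

def require_role(*allowed):
    """Decorator: only let users whose role is in `allowed` call the handler."""
    def decorator(handler):
        @wraps(handler)
        def wrapper(user, *args, **kwargs):
            if user.get("role") not in allowed:
                raise Forbidden(
                    f"role {user.get('role')!r} may not call {handler.__name__}"
                )
            return handler(user, *args, **kwargs)
        return wrapper
    return decorator

@require_role("admin", "analyst")
def read_tenant_report(user, tenant_id):
    # A real multi-tenant service would also verify the user belongs to tenant_id.
    return f"report for {tenant_id}"

print(read_tenant_report({"role": "admin"}, "tenant-42"))  # report for tenant-42
```

A `{"role": "guest"}` caller would raise `Forbidden` instead of reaching the handler.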
Required Skills & Qualifications
Technical Proficiency
- Backend: Node.js, Express.js, Python (for AI integration), REST/GraphQL APIs
- Frontend: React.js / Angular / Vue.js, HTML5, CSS3, TypeScript, Next.js
- Database: PostgreSQL, MongoDB, Firestore, Redis
- Cloud: Google Cloud Platform (GCP) – Cloud Run, IAM, GKE, BigQuery, Cloud Storage
- AI Integration: LLM APIs (OpenAI, Gemini, Claude), LangChain, RAG, vector databases (FAISS, Pinecone, Chroma)
- DevOps: Docker, Kubernetes, Terraform, Cloud Build, GitHub Actions
- Version Control: Git, Bitbucket
- UI/UX Collaboration: Figma, Material UI, responsive design principles
Experience & Attributes
- 5+ years of experience in full-stack development, preferably on AI or SaaS platforms.
- Strong understanding of multi-tenant architectures and modular design principles.
- Proven experience in CI/CD pipeline automation and infrastructure management.
- Experience in integrating AI services, chatbots, or intelligent recommendation systems.
- Strong problem-solving skills and ability to collaborate in a fast-paced, cross-functional environment.
- Excellent communication skills and documentation habits.
Preferred Qualifications
- Prior experience working with AI-driven SaaS or agentic AI platforms.
- Familiarity with PromptOps / MLOps practices and versioning workflows for LLMs.
- Experience in data governance and security compliance (HIPAA, GDPR, or SOC2).
- Cloud certifications (GCP Professional Cloud Developer / Architect) are a plus.
Why Join IntraIntel.ai
- Work on cutting-edge AI agentic architectures with real-world enterprise impact.
- Join a fast-growing, innovation-driven team shaping the future of AI platforms.
- Build products at scale across diverse industries with a unified mission.
- Collaborative and flexible environment encouraging ownership and creativity.
Immediate joiners preferred. Notice period: immediate to 30 days.
Interested candidates are requested to email their resumes with the subject line "Application for [Job Title]".
Only applications received via email will be reviewed. Applications through other channels will not be considered.
About Us
adesso India is a dynamic and innovative IT Services and Consulting company based in Kochi. We are committed to delivering cutting-edge solutions that make a meaningful impact on our clients. As we continue to expand our development team, we are seeking a talented and motivated Backend Developer to join us in creating scalable and high-performance backend systems.
Job Description
We are looking for an experienced Backend and Data Developer with expertise in Java, SQL, and BigQuery development on public clouds, mainly GCP. As a Senior Data Developer, you will play a vital role in designing, building, and maintaining robust systems to support our data analytics. This position offers the opportunity to work on complex services, collaborating closely with cross-functional teams to drive successful project delivery.
Responsibilities
- Development and maintenance of data pipelines and automation scripts with Python
- Creation of data queries and optimization of database processes with SQL
- Use of bash scripts for system administration, automation and deployment processes
Database and cloud technologies
- Managing, optimizing, and querying large amounts of data in an Exasol database (prospectively Snowflake)
- Google Cloud Platform (GCP): operation and scaling of cloud-based BI solutions, in particular:
  - Composer (Airflow): orchestration of data pipelines for ETL processes
  - Cloud Functions: development of serverless functions for data processing and automation
  - Cloud Scheduler: planning and automation of recurring cloud jobs
  - Cloud Secret Manager: secure storage and management of sensitive access data and API keys
  - BigQuery: processing, analyzing, and querying large amounts of data in the cloud
  - Cloud Storage: storage and management of structured and unstructured data
  - Cloud Monitoring: monitoring the performance and stability of cloud-based applications
Data visualization and reporting
- Creation of interactive dashboards and reports for the analysis and visualization of business data with Power BI
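The pipeline responsibilities above follow the standard extract-transform-load pattern. As a minimal, self-contained sketch (an in-memory SQLite table stands in for Exasol/BigQuery, and the sample rows and column names are invented for illustration):

```python
import sqlite3

# Extract: rows as they might arrive from a source system (invented sample data).
raw_rows = [
    ("2024-01-01", "EUR", "1200.50"),
    ("2024-01-02", "EUR", "980.00"),
    ("2024-01-02", "USD", "500.25"),
]

# Transform: parse amounts and keep only EUR bookings.
clean_rows = [(day, float(amount)) for day, cur, amount in raw_rows if cur == "EUR"]

# Load: write into a warehouse table (SQLite stands in for Exasol/BigQuery here).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE revenue (day TEXT, amount REAL)")
conn.executemany("INSERT INTO revenue VALUES (?, ?)", clean_rows)

# The kind of aggregate query a Power BI dashboard would issue downstream.
total = conn.execute("SELECT SUM(amount) FROM revenue").fetchone()[0]
print(total)  # 2180.5
```

In production, Composer (Airflow) would schedule each of these phases as separate, retryable tasks rather than one script.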
Requirements
- Minimum of 4-6 years of experience in backend development, with strong expertise in BigQuery, Python, and MongoDB or SQL.
- Strong knowledge of database design, querying, and optimization with SQL and MongoDB, and of designing ETL and orchestrating data pipelines.
- Minimum of 2 years of experience with at least one hyperscaler, ideally GCP, combined with cloud storage technologies, cloud monitoring, and cloud secret management.
- Excellent communication skills to effectively collaborate with team members and stakeholders.
Nice-to-Have:
- Knowledge of agile methodologies and working in cross-functional, collaborative teams.
“I DESIGN MY LIFE” is an online business consulting and coaching company headed by renowned business coach Sumit Agarwal. We provide online consulting and training to business owners of SMEs and MSMEs across India.
You can find more about us here: https://idesignmylife.net/careers/
This is a hands-on position. The role will have the following aspects:
POSITION: Software Developer
LOCATION: Full-time (permanent) work-from-home opportunity
LANGUAGES: JavaScript, MySQL, Python, ERPNext, HTML, CSS, and Bootstrap
ROLE: We are looking for people who can
- Code well
- Have written complex software
- Work as self-starters who can read the docs and don't need hand-holding
- Experience in Python/JavaScript/jQuery/Vue/MySQL will be a plus
- Functional knowledge of ERP will be a plus
Basic Qualifications
- BE / B.Tech - IT/ CS
- 1 / 2+ years of professional experience
- Strong C# and SQL skills
- Strong skills in React and TypeScript
- Familiarity with AWS services or experience working in other cloud computing environments.
- Experience with SQL Server and PostgreSQL.
- Experience with automated unit testing frameworks.
- Experience in designing and implementing REST APIs and microservices-based solutions.
- Big data developer with 8+ years of professional IT experience, with expertise in Hadoop ecosystem components covering ingestion, data modeling, querying, processing, storage, analysis, and data integration, and in implementing enterprise-level Big Data systems.
- A skilled developer with strong problem-solving, debugging, and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala, and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark Core, Spark SQL and DataFrames, Kafka, Spark Streaming, and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, including the NameNode, DataNode, ResourceManager, NodeManager, and Job History Server.
- Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in the installation, configuration, and management of Big Data and the underlying infrastructure of a Hadoop cluster.
- Hands-on experience in coding MapReduce/YARN programs using Java, Scala, and Python for analyzing Big Data.
- Exposure to the Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark using Scala on a cluster for computation (analytics), installed it on top of Hadoop, and performed advanced analytics using Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster data processing; handled importing data from different sources into HDFS using Sqoop, performing transformations with Hive and MapReduce, and then loading the data into HDFS.
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence, customer segmentation, and smooth maintenance in Spark Streaming.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Worked on creating data pipelines for different events, handling ingestion and aggregation and loading consumer response data into Hive external tables in an HDFS location to serve as a feed for Tableau dashboards.
- Hands-on experience in using Sqoop to import data into HDFS from RDBMS and vice versa.
- In-depth understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands-on expertise in real-time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala, and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience in Microsoft cloud and in setting up clusters on Amazon EC2 and S3, including automating the setup and extension of clusters in the AWS cloud.
- Extensively worked on Spark using Python on a cluster for computation (analytics), installed it on top of Hadoop, and performed advanced analytics using Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge of the installation, configuration, support, and management of Hadoop clusters using Apache and Cloudera (CDH3, CDH4) distributions and on Amazon Web Services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala, including Impala analytical functions.
- Experience in creating DataFrames using PySpark and performing operations on them using Python.
- In-depth understanding of Hadoop architecture and its various components, such as HDFS, the MapReduce programming paradigm, high availability, and the YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various knowledge reports using Power BI based on business specifications.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Well experienced in projects using JIRA, testing, and the Maven and Jenkins build tools.
- Experienced in designing, building, deploying, and utilizing almost the entire AWS stack (including EC2 and S3), focusing on high availability, fault tolerance, and auto-scaling.
- Good experience with use-case development, with Software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience in importing data using Sqoop and SFTP from various sources such as RDBMS, Teradata, mainframes, Oracle, and Netezza into HDFS, and performing transformations on it using Hive, Pig, and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra, and MongoDB, and their integration with a Hadoop cluster.
- Hands-on experience in Hadoop Big Data technology, working with MapReduce, Pig, and Hive as analysis tools, and Sqoop and Flume as data import/export tools.
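The MapReduce paradigm running through the bullets above can be illustrated with the classic word count, sketched here in plain Python (a real job would run over HDFS via Hadoop or Spark; the two-line sample corpus is invented for illustration):

```python
from collections import defaultdict

def map_phase(lines):
    """Map: emit (word, 1) pairs for every word, like a Hadoop mapper."""
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def reduce_phase(pairs):
    """Shuffle + reduce: group pairs by key and sum counts, like a reducer."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["big data big pipelines", "data pipelines at scale"]
print(reduce_phase(map_phase(lines)))
# {'big': 2, 'data': 2, 'pipelines': 2, 'at': 1, 'scale': 1}
```

In Hadoop, the shuffle between the two phases is distributed across the cluster, which is what lets the same pattern scale to HDFS-sized inputs.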
Job Description:
We are looking for a talented UI/UX Designer to join our team. As a UI/UX Designer, you will be responsible for creating intuitive and visually appealing user interfaces and experiences for our digital products. You will work closely with product managers, developers, and other stakeholders to understand user needs, translate requirements into wireframes and prototypes, and design user-friendly interfaces that enhance user satisfaction and engagement. The ideal candidate should have a strong portfolio showcasing their design skills, creativity, and attention to detail.
Responsibilities:
- Collaborate with cross-functional teams to understand user requirements, business objectives, and technical constraints.
- Conduct user research, usability testing, and analysis to gain insights into user behaviors, preferences, and pain points.
- Create wireframes, user flows, and prototypes to visualize and communicate design concepts and interaction patterns.
- Design user interfaces and experiences that are intuitive, responsive, and aesthetically pleasing across various devices and platforms.
- Develop high-fidelity mockups and interactive prototypes using design tools such as Sketch, Adobe XD, Figma, or InVision.
- Iterate on designs based on feedback from stakeholders, usability testing, and data analysis to continuously improve the user experience.
- Collaborate with developers to ensure the feasibility and implementation of design solutions while maintaining design integrity.
- Create and maintain design systems, style guides, and UI components to ensure consistency and scalability across products.
- Stay updated on industry trends, best practices, and emerging technologies in UI/UX design.
- Advocate for user-centered design principles and contribute to a culture of design excellence within the organization.
Requirements:
- Bachelor's degree in Graphic Design, Interaction Design, HCI, or related field.
- Proven experience as a UI/UX Designer or similar role, with a strong portfolio showcasing design projects and process.
- Proficiency in design tools such as Sketch, Adobe XD, Figma, or InVision.
- Solid understanding of user-centered design principles, usability heuristics, and design thinking methodologies.
- Experience conducting user research, usability testing, and analysis to inform design decisions.
- Strong visual design skills, with an eye for typography, color, layout, and iconography.
- Knowledge of responsive design principles and mobile-first design approach.
- Excellent communication, collaboration, and presentation skills.
- Ability to work effectively in a fast-paced environment and manage multiple projects simultaneously.
- Familiarity with front-end development technologies (HTML, CSS, JavaScript) is a plus but not required.
Responsibilities:
- Contacting potential clients to establish rapport and arrange meetings.
- Planning and overseeing new marketing initiatives.
- Researching organizations and individuals to find new opportunities.
- Increasing the value of current customers while attracting new ones.
- Finding and developing new markets and improving sales.
- Attending conferences, meetings, and industry events.
- Developing quotes and proposals for clients.
- Developing goals for the development team and business growth and ensuring they are met.
- Training personnel and helping team members develop their skills.
Requirements:
- Bachelor’s degree in business, marketing, or related field.
- Experience in sales, marketing, or related field.
- Strong communication skills and IT fluency.
- Ability to manage complex projects and multi-task.
- Excellent organizational skills.
- Ability to flourish with minimal guidance, be proactive, and handle uncertainty.
- Proficient in Word, Excel, Outlook, and PowerPoint.
- Comfortable using a computer for various tasks.
We have multiple open positions for full-stack engineers to work with us on a cutting-edge eCommerce trade analysis platform.
Must have experience with:
- NodeJS
- MERN stack
- AWS
- Experience working on SaaS applications with a large codebase
Nice to have
- Tailwind
- DynamoDB
- Chart libraries such as Chart.js
We are looking for a Developer who is proficient with Node and has at least 1-3 years of relevant experience. Your primary role will be developing microservices while ensuring good coding practices and architecture. You will have end-to-end ownership of the services you develop, ensuring they are robust and easy to maintain. You will coordinate with the rest of the team working on different layers of the infrastructure; therefore, a commitment to collaborative problem solving and a quality product is important.
Roles and Responsibilities
- Integrating and maintaining Node microservices
- Proficient in RESTful APIs and API communication, especially consuming data from and/or pushing data into MongoDB for web and mobile applications
- Proficient in modeling applications on NoSQL databases (preferably MongoDB)
Desired Candidate Profile
- Strong coding experience in Node.js.
- Should have experience with microservice architecture
- Compile and analyze data, processes, and codes to troubleshoot problems and identify areas for improvement
Nice to have:
- Some experience with queues like Redis, SQS, etc.
- Experience with cloud solutions like Lambda, Docker, etc.
- Experience in DevOps and related practices to improve development lifecycle, continuous delivery with high quality is an advantage.
- Deploy server/related components to staging, live environments.
- Some exposure to React frontend implementation and deeper JS skills
The ideal candidate will be responsible for designing, developing, testing, and debugging responsive web and mobile applications for the company. Using React.js, MySQL, JavaScript, HTML, and CSS, this candidate will be able to translate user and business needs into functional frontend designs.
Responsibilities
- Must have
- Experience with React and Redux or similar frameworks
- Experience in REST API integration
- Passionate about user experience
- Android and iOS development
- Other requirements
- Good knowledge of JavaScript functions and operators
- Responsibility for major tasks and chores
- Willing to resolve technical debt early
- Good understanding of the sprint cycle in agile
- Can communicate with developers and non-developers
- Can work independently but is a great collaborator
- Serverless API experience is not required
- Good to have Git workflow knowledge
- Good to have Github collaboration experience
Qualifications
- Bachelor's degree or equivalent in Computer Science
- 2+ years of experience in frontend development
- Familiarity with Scrum/Agile development methodologies
- Experience building object-oriented web applications in React.js, MySQL, JavaScript, HTML5, and CSS3






